Context: one of the 7/10 .dat files (2200--2400) seems problematic. Using the default block size, it appears to have "worked", but it created lots of small epochs (with max_gap_size=300), which is unexpected for this file. When we used block_size=2**24, we got an error saying that the number of expected channel data packets to pack was 0 (which implies an empty new_ts), even though approximately 2 million samples were passed for packing.
What's going on?
Is jagular missing some special case? Is the data corrupt? Both?
...
import time

import jagular as jag

filename = '/media/jchu/DataHDD/data/install/long-recording/untethered/07-11-2017/merged/install_07-11-2017_2200_0000_sd08.rec'
jfm = jag.io.JagularFileMap(filename)
jfm.timestamps

# just look at channel 0 for the ripples
start = time.time()
jag.utils.extract_channels(jfm=jfm,
                           max_gap_size=300,
                           ts_out='/home/jchu/data/install/long-recording/three-day-analysis/temp/timestamps-test.raw',
                           ch_out_prefix='/home/jchu/data/install/long-recording/three-day-analysis/temp/test-',
                           subset=[0],
                           block_size=2**24,
                           verbose=False)
print("Took {} minutes to extract channel 0".format((time.time() - start) / 60))
update: this issue was related to datatype inconsistencies, particularly conversions from np.uint32 to np.int32. We have since migrated almost exclusively to np.int64, although there are still a few np.uint64s floating around, too.
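To illustrate the kind of failure a uint32-to-int32 inconsistency can cause (a hypothetical sketch, not jagular's actual code): once a timestamp exceeds 2**31 - 1, reinterpreting it as np.int32 yields a negative value, so gap and epoch computations on those timestamps become nonsensical. Promoting to np.int64 avoids the problem entirely.

```python
import numpy as np

# A timestamp just past the int32 range, stored as uint32.
ts = np.array([2**31 + 5], dtype=np.uint32)

# Reinterpreting the same bits as int32 wraps to a negative value.
as_int32 = ts.view(np.int32)
print(as_int32[0])  # -2147483643

# Casting (not viewing) to int64 preserves the original timestamp.
as_int64 = ts.astype(np.int64)
print(as_int64[0])  # 2147483653
```

Note the difference between `view` (reinterpret the raw bytes, which is where the wraparound comes from) and `astype` (value-preserving conversion).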
@jchutrue please post the code snippet below!