Filtering failure while trying to index #30
@jchutrue can you inspect
Could be related to #28? What do your timestamps look like?
They are floats. However, the timestamps are uint32.
Should be fixed now. Could you test again after pulling the latest, and close the issue if it's fixed?
`filtfilt_within_epochs_mmap()` is supposed to call `get_contiguous_segments()`, but for some reason the stack trace ends with an error from `_get_contiguous_segments_fast()`, so it still fails.
Ok, see error and stack trace below:
```
IndexError                                Traceback (most recent call last)
<ipython-input> in <module>()
      4                          fs=30000,
      5                          fl=600,
----> 6                          fh=6000)
      7 filt_data_ch0 = np.fromfile('/home/jchu/data/install/long-recording/experiments/07-09-2017/filt-ch.00.raw',
      8                             np.int16)

~/code/jagular/jagular/filtering.py in filtfilt_mmap(timestamps, finname, foutname, fs, fl, fh, gpass, gstop, dtype, ftype, buffer_len, overlap_len, max_len, **kwargs)
    109                 overlap_len=overlap_len,
    110                 max_len=max_len,
--> 111                 **kwargs)
    112     return y
    113

~/code/jagular/jagular/filtering.py in filtfilt_within_epochs_mmap(timestamps, finname, foutname, dtype, sos, buffer_len, overlap_len, max_len, filter_epochs, **kwargs)
    178             assume_sorted=assume_sorted,
    179             step=step,
--> 180             index=True)
    181
    182     for (start, stop) in filter_epochs:

~/code/jagular/jagular/utils.py in _get_contiguous_segments_fast(data, step, assume_sorted, index, inclusive)
    277     starts = np.insert(breaks+1, 0, 0)
    278     stops = np.append(breaks, len(data)-1)
--> 279     bdries = np.vstack((data[starts], data[stops] + step)).T
    280     if index:
    281         if inclusive:

IndexError: arrays used as indices must be of integer (or boolean) type
```
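For context, here is a hypothetical minimal reproduction of the failure mode discussed above (not jagular's actual code, and the array values are made up): when the timestamps are floats, the `starts`/`stops` index arrays built from them via `np.insert`/`np.append` stay float-typed, and NumPy rejects float arrays as indices, which is exactly the `IndexError` at line 279. Casting the index arrays to an integer dtype would be one way to avoid it:

```python
import numpy as np

# Float timestamps, as reported in this issue.
data = np.array([0.0, 1.0, 2.0, 5.0, 6.0])

# Suppose the break positions were (accidentally) derived as floats.
breaks = np.array([2.0])

# np.insert/np.append preserve the float dtype of `breaks`.
starts = np.insert(breaks + 1, 0, 0)      # array([0., 3.]) -- float!
stops = np.append(breaks, len(data) - 1)  # array([2., 4.]) -- float!

# Indexing with float arrays raises the IndexError seen in the traceback.
try:
    bdries = np.vstack((data[starts], data[stops] + 1)).T
except IndexError as err:
    print(err)  # arrays used as indices must be of integer (or boolean) type

# Casting the index arrays to an integer dtype fixes the indexing step.
starts = starts.astype(np.intp)
stops = stops.astype(np.intp)
bdries = np.vstack((data[starts], data[stops] + 1)).T
print(bdries)  # [[0. 3.]
               #  [5. 7.]]
```

This only sketches why a float timestamp array can surface as an indexing error deep inside `_get_contiguous_segments_fast()`; whether the proper fix is casting the indices or validating the timestamp dtype upstream is a design choice for the maintainers.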