Support for Neuronexus (.XDAT) files #1296
Hi! The spikeinterface and neo people overlap a bit :) Do you have a file specification? And a very small data file we could have a look at?
The specification looks like `*_data.xdat` is float32 and `*_timestamp.xdat` is int64, plus there's a .json metadata file. It seems the last 6 channels are reserved for aux, digital ins, and digital outs. We could provide a sample file once the system is up.
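Given that description, the non-neural block could be split off like this. This is only a sketch: the count of 6 comes from the comment above, and whether the channel axis is first or last in the array is an assumption, so it is exposed as a parameter.

```python
import numpy as np

N_NON_NEURAL = 6  # last 6 channels: aux + digital ins/outs (per the comment above)


def split_channels(data, channel_axis=0):
    """Split an XDAT signal array into neural and non-neural blocks.

    Assumes the last N_NON_NEURAL entries along `channel_axis` are the
    aux/digital channels; the axis convention is a guess, not from any
    official NeuroNexus documentation.
    """
    split_at = data.shape[channel_axis] - N_NON_NEURAL
    neural, aux = np.split(data, [split_at], axis=channel_axis)
    return neural, aux
```

For a 64-channel probe plus the reserved channels, a `(70, n_samples)` array would split into `(64, n_samples)` neural and `(6, n_samples)` aux blocks.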
More specifically, to read the timestamps:

```python
import os
from pathlib import Path

import numpy as np

with open(Path(fname_timestamps), 'rb') as fid:
    fid.seek(tstamp_offset * 8, os.SEEK_SET)  # int64 timestamps are 8 bytes each
    timestamps = np.fromfile(fid, dtype=np.int64, count=num_samples)
```

And the license is pretty open:

> Permission is hereby granted, free of charge, to any person obtaining a copy
> of this software file and associated documentation files (the "Software"), to deal
> in the Software without restriction, including without limitation the rights
> to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
> copies of the Software, and to permit persons to whom the Software is
> furnished to do so, subject to the following conditions:

I think the API is for using their own software for analysis and curation. Maybe this has been a recent change? I'm happy to play around with this, but I would need some data first (unless @samuelgarcia has a burning desire to work on this).
@smenp, any sample files for us to work on?
Sorry, we never got usable data and returned the system. I'm at FENS now but can upload some not-so-nice data next week, or you can wait (a month or more?) for our replacement. .xdat is just flat float32 (chans, samples) plus a metadata .json.
We accept junk data. We just like having a file that we can work off of for generating the reader.
Enjoy FENS.
Here's a sample XDAT recording (2 probes with 64 channels each). I had started working on XDAT support for Neo myself, but I didn't get very far: I couldn't figure out how certain NeuroNexus/XDAT concepts should be mapped onto the Neo object model. And now I don't have much time to work on it.
Thanks. We will work on this soon!
@dhmjhu, could you zip the folder instead? I don't have the tools to open .7z archives.
No problem.
Thanks :)
Hey @dhmjhu, two quick questions. First, are you okay with me uploading this data to our testing suite so we can run our tests against it? Second, if so, what attribution do you want to use, i.e. do you want your name/GitHub handle associated with the data? If you're willing to check the reader out, you'll need to install from this PR: #1509.
Perfect, thanks @dhmjhu! Yes, that's the repo where we keep small test files to run through the CI. You're more than welcome to set up an account if using those test files helps with any of your work. I'll upload the files you shared with me and then our CI tests should pass on the PR. No rush to test. Sometimes we start reverse-engineering on one dataset and then a user lets us know that something we thought was hardcoded is not, so it can be beneficial to have it pass both CI and the "real-user" test. But no rush :) Thanks for all your help so far!
Hi. I am new to analyzing ephys data, and our lab currently uses Allego NeuroNexus files (.xdat). Wanting to use SpikeInterface, we reached out to see whether there was support on their end for .xdat files.
I wanted to ask whether support for those files has already been developed, or whether it could be added.
According to the SpikeInterface people, when they reached out to NeuroNexus, the company did not seem interested in collaboration. I know there are features that allow one to include their own file format, and I can try that myself; however, if the SpikeInterface developers are unable to support .xdat files, I definitely don't think I will be able to make that happen.