Questions and clarifications #24
Comments
Thanks for the very informative answer. That is good news. One more question, then, if you don't mind: h) Does your mouse emulation (i.e. mouse-cursor movement as used for playing FPS games) do anything to compensate for the fact that the sensor data are, mathematically speaking, a derivative? The sensors deliver acceleration rather than velocity, but for the cursor to move naturally our brains expect it to follow velocity, not acceleration (this is also why all such devices have the worst possible reputation for cursor movement). Do you perform any integration or similar compensation of the sensor data? Plain integration alone probably would not suffice, because the data are noisy and integrating them would introduce an unacceptable delay, but that could be compensated in many ways (from basic ones like Kalman-filter estimation, through fuzzy-logic filtering, up to neural-network filtering).
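For reference, here is a minimal sketch of the kind of compensation question h) is asking about; it is not the Kai SDK's actual mouse-emulation algorithm. It leak-integrates noisy acceleration samples into a velocity estimate, smooths that estimate, and only then turns it into a cursor delta. The 100 Hz sample rate, the filter constants, and the `cursor_delta` helper are all assumptions for illustration.

```python
# Minimal sketch (not the Kai SDK's actual algorithm): turn noisy acceleration
# samples into a smoothed velocity that drives the cursor.

import random  # only used to fake sensor noise in the demo below

DT = 0.01      # sample period in seconds (assumed 100 Hz)
LEAK = 0.98    # leaky-integrator factor, bleeds off integration drift
SMOOTH = 0.2   # exponential-smoothing weight for the velocity estimate

velocity = 0.0  # integrated (and leaked) velocity estimate
smoothed = 0.0  # exponentially smoothed velocity actually used for the cursor

def cursor_delta(accel: float) -> float:
    """Integrate one acceleration sample and return a cursor displacement."""
    global velocity, smoothed
    velocity = LEAK * velocity + accel * DT                   # leaky integration: a -> v
    smoothed = (1 - SMOOTH) * smoothed + SMOOTH * velocity    # smooth the estimate
    return smoothed * DT                                      # v -> displacement this tick

if __name__ == "__main__":
    # Fake a short burst of motion followed by rest, with sensor noise.
    for step in range(200):
        true_accel = 1.0 if step < 50 else 0.0
        noisy = true_accel + random.gauss(0.0, 0.05)
        dx = cursor_delta(noisy)
        if step % 50 == 0:
            print(f"step {step:3d}  cursor dx per tick = {dx:+.5f}")
```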
a) When and how will haptics be supported in the SDKs?
b) How are the quaternions in Kai computed? What are they useful for? Are they just a combination of the three sensors (accelerometer, magnetometer, gyroscope)? Are they smoothed in some way (e.g. by a moving-average filter)?
c) How are the PYR (pitch/yaw/roll) data computed? Are they likewise just a combination of the three sensors, and are they smoothed? (An illustrative sketch of the kind of fusion meant in b) and c) appears after this list.)
d) How can one calibrate the sensors (especially the accelerometer, magnetometer, and gyroscope)? There seems to be no direct access to them, but because of their drift, calibration will be necessary (see inertial navigation systems for the issues involved, especially the Zero-Velocity Update trick, sketched after this list).
e) Can Kai provide proportional information about the fingers? From FingerPositionalData it seems it can, but without knowing how this data is acquired there is no way to tell whether it is linear, what the proper range is, how to calibrate it, and so on (a calibration sketch also follows this list).
f) What is the maximum frequency at which Kai can send accelerometer, magnetometer, and gyroscope data?
g) What is the latency when reading the accelerometer, magnetometer, and gyroscope data? How can that latency be decreased (it currently feels somewhat delayed)?
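Regarding b) and c), the following is a minimal sketch of the sort of sensor fusion those questions refer to, not Kai's actual pipeline: a complementary filter that blends gyro-integrated angles with accelerometer-derived pitch and roll. Full quaternion solutions such as the Madgwick or Mahony filters work on the same principle, with the magnetometer added to stabilize yaw. The 100 Hz rate, the blending constant, and the sample values are assumptions.

```python
# Minimal complementary-filter sketch (not Kai's actual fusion): blend
# gyro-integrated angles with accelerometer-derived pitch/roll.

import math

DT = 0.01     # sample period (assumed 100 Hz)
ALPHA = 0.98  # trust in the gyro over one step; (1 - ALPHA) trusts the accel

pitch = 0.0   # fused pitch estimate, radians
roll = 0.0    # fused roll estimate, radians

def fuse(gyro_x, gyro_y, acc_x, acc_y, acc_z):
    """One filter step: gyro rates in rad/s, accelerometer in any consistent unit."""
    global pitch, roll
    # The accelerometer gives an absolute (but noisy) gravity direction.
    acc_pitch = math.atan2(-acc_x, math.hypot(acc_y, acc_z))
    acc_roll = math.atan2(acc_y, acc_z)
    # The gyro gives a smooth but drifting rate; blend the two estimates.
    pitch = ALPHA * (pitch + gyro_y * DT) + (1 - ALPHA) * acc_pitch
    roll = ALPHA * (roll + gyro_x * DT) + (1 - ALPHA) * acc_roll
    return pitch, roll

if __name__ == "__main__":
    # Device roughly flat, with small constant gyro bias and accel noise.
    for _ in range(100):
        p, r = fuse(0.001, -0.002, 0.02, 0.01, 9.81)
    print(f"pitch={math.degrees(p):+.2f} deg  roll={math.degrees(r):+.2f} deg")
```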
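And a sketch of the Zero-Velocity Update (ZUPT) trick mentioned in d); again, this illustrates the technique, it is not something the current SDK exposes. When the raw readings look stationary (acceleration magnitude near g, gyro magnitude near zero), the integrated velocity is reset and the gyro bias re-estimated. The thresholds and the 100 Hz rate are illustrative assumptions.

```python
# Zero-Velocity Update sketch: reset drift whenever the device looks stationary.

import math

G = 9.81         # gravity magnitude, m/s^2
DT = 0.01        # sample period (assumed 100 Hz)
ACC_TOL = 0.15   # |accel| must be within this of g to count as "still"
GYRO_TOL = 0.02  # gyro magnitude (rad/s) must be below this to count as "still"

velocity = [0.0, 0.0, 0.0]
gyro_bias = [0.0, 0.0, 0.0]

def is_stationary(acc, gyro):
    acc_mag = math.sqrt(sum(a * a for a in acc))
    gyro_mag = math.sqrt(sum(w * w for w in gyro))
    return abs(acc_mag - G) < ACC_TOL and gyro_mag < GYRO_TOL

def step(acc, gyro, linear_acc):
    """acc/gyro are raw samples; linear_acc is gravity-compensated acceleration."""
    global velocity, gyro_bias
    if is_stationary(acc, gyro):
        velocity = [0.0, 0.0, 0.0]                                   # ZUPT: kill drift
        gyro_bias = [0.9 * b + 0.1 * w for b, w in zip(gyro_bias, gyro)]
    else:
        velocity = [v + a * DT for v, a in zip(velocity, linear_acc)]
    return velocity, gyro_bias

if __name__ == "__main__":
    # Device at rest: raw accel shows only gravity, gyro shows a small bias.
    v, bias = step(acc=[0.0, 0.0, 9.80], gyro=[0.004, -0.002, 0.001],
                   linear_acc=[0.0, 0.0, 0.0])
    print("velocity after ZUPT:", v, " estimated gyro bias:", bias)
```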
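Finally, on e): if the proportional finger values turn out to be uncalibrated raw readings, a per-finger min/max normalization like the one below would be one way to make them usable. This is a hypothetical client-side helper (FingerCalibration is not part of the SDK), and the raw values in the demo are made up.

```python
# Hypothetical client-side calibration: learn per-finger min/max during a
# "wiggle your fingers" phase, then map raw readings into a 0..1 range.

class FingerCalibration:
    def __init__(self, finger_count: int = 4):
        self.low = [float("inf")] * finger_count
        self.high = [float("-inf")] * finger_count

    def observe(self, raw):
        """Feed raw per-finger readings during the calibration phase."""
        for i, value in enumerate(raw):
            self.low[i] = min(self.low[i], value)
            self.high[i] = max(self.high[i], value)

    def normalize(self, raw):
        """Map raw readings to 0..1 using the observed range (clamped)."""
        out = []
        for i, value in enumerate(raw):
            span = self.high[i] - self.low[i]
            if span <= 0:
                out.append(0.0)
            else:
                out.append(min(1.0, max(0.0, (value - self.low[i]) / span)))
        return out

if __name__ == "__main__":
    cal = FingerCalibration()
    for sample in ([10, 12, 9, 11], [240, 250, 235, 255], [120, 130, 100, 140]):
        cal.observe(sample)
    print(cal.normalize([120, 130, 100, 140]))  # roughly mid-range values
```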