Basics
The task is to process 3D spatial data coming from a pair of sensors and determine how closely one sensor's movement mimics the other's. The movement is a horizontal forwards-and-backwards motion, so the analysis can be simplified by looking only at the movement in the Z direction.
Results
The analysis program will produce the following results:
- Duration of the movement for each hand
- Treating first one hand, and then the other, as the "dominant" one:
- For each cycle:
- Determine the high point, the low point and the midpoint, and times of transition between them for the dominant hand.
- Determine the phase/time difference between the dominant hand and the other one. (Also a figure that illustrates how in or out of phase they are)
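One way to get the phase/time difference (and a single figure for how in or out of phase the hands are) is to cross-correlate the two mean-removed Z traces. A minimal Python sketch, assuming evenly sampled data with interval dt; the function name and the synthetic signals here are illustrative, not part of the actual analysis code:

```python
import numpy as np

def phase_lag(z_dominant, z_other, dt):
    """Estimate how far z_other trails z_dominant (in seconds) from the
    peak of the cross-correlation of the mean-removed Z traces."""
    a = np.asarray(z_dominant, dtype=float)
    b = np.asarray(z_other, dtype=float)
    a = a - a.mean()                      # remove any midpoint offset
    b = b - b.mean()
    corr = np.correlate(a, b, mode="full")
    # In np.correlate's "full" output, lag k sits at index k + len(b) - 1,
    # so the peak index converts back to a lag in samples.
    return ((len(b) - 1) - int(np.argmax(corr))) * dt

# Two 1 Hz sine waves a quarter-cycle apart, sampled at 100 Hz.
t = np.arange(0.0, 2.0, 0.01)
lead = np.sin(2 * np.pi * t)
follow = np.sin(2 * np.pi * (t - 0.25))
print(phase_lag(lead, follow, dt=0.01))   # roughly 0.25 s
```

A positive result means the second trace trails the dominant one; the lag divided by the cycle period gives the in/out-of-phase figure.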
Algorithm Outline
This is a simple breakdown of what needs to be done from an analysis and coding point of view.
Red means that it hasn't yet been put into code.
Orange means that it is currently being worked on/debugged.
- Load the data in an easily usable format.
- Determine the start and the end points of the actual movement.
- To find the start:
- Working from the second data point, check the velocity between this point and the previous one.
- Find where the velocity is consistently more than v across n samples, allowing x exceptions. (empirically, v=4; n=6; x=2)
- 'Rewind' to the point before the one where the consistently high velocity was detected. This is the initial movement point.
- To find the end:
- Do the same as above, but in reverse, starting from the last item of data.
- Do both for the other data stream; the 'innermost' start and end points across the two streams delimit where the movement analysis will happen.
- For each of the two hand sensors, s:
- Calculate a moving mean across the data of s and use it as the midpoint (to account for the subject sliding their hands on the table).
- Using the midpoint information, divide s into a series of cycles ci.
- For each ci, determine its points (high, low, mid). Compare these to the values for the other sensor at the same point in time.
- Work out speech movement data
- Output resulting statistics as a CSV file.
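The start/end detection and the cycle segmentation above can be sketched in Python (not necessarily the language of the attached code). The thresholds v=4, n=6, x=2 come from the outline; the 25-sample moving-mean window and the sampling rate and amplitude of the synthetic trace are assumptions:

```python
import numpy as np

def movement_start(z, v=4.0, n=6, x=2):
    """Scan forward from the second sample and return the index just before
    the first run of n velocities that exceed v, allowing x exceptions
    (v=4, n=6, x=2 are the empirical values from the outline)."""
    vel = np.abs(np.diff(z))          # vel[k] = speed between samples k and k+1
    for i in range(len(vel) - n + 1):
        if np.count_nonzero(vel[i:i + n] > v) >= n - x:
            return i                  # 'rewind' to just before the fast run
    return None

def movement_end(z, v=4.0, n=6, x=2):
    """The same scan, run in reverse from the last item of data."""
    s = movement_start(z[::-1], v, n, x)
    return None if s is None else len(z) - 1 - s

def split_cycles(z, window=25):
    """Split a Z trace into cycles at upward crossings of a moving mean.
    The moving mean stands in for the midpoint, absorbing slow drift from
    the subject sliding their hands along the table; the 25-sample window
    is an assumed value."""
    z = np.asarray(z, dtype=float)
    mid = np.convolve(z, np.ones(window) / window, mode="same")
    above = z > mid
    ups = np.flatnonzero(~above[:-1] & above[1:]) + 1
    return [slice(a, b) for a, b in zip(ups[:-1], ups[1:])]

# Synthetic trace: 1 s of rest, 5 s of 1 Hz oscillation, 1 s of rest,
# sampled at 100 Hz with a 100-unit amplitude (both assumed).
t = np.arange(0.0, 5.0, 0.01)
z = np.concatenate([np.zeros(100), 100 * np.sin(2 * np.pi * t), np.zeros(100)])
start, end = movement_start(z), movement_end(z)   # near samples 100 and 600
cycles = split_cycles(z[start:end])
```

With two data streams, the later of the two starts and the earlier of the two ends would give the 'innermost' region on which the cycle analysis runs.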
Files
These are the useful files, kept here so that the analysis can be worked on when not at the lab.
Attach:wiggle1-vision.raw
Attach:wiggle1-vision-mirror.raw
Attach:wiggle1-vision-mirror2.raw
Attach:wiggle1-novision-outphase.raw
Attach:wiggle1-vision-outphase.raw
Attach:wiggle1-vision-mirror-outphase.raw
This is the most up-to-date revision of the code so far:
Attach:wiggle-pl