What I had in mind was getting a new MIDI controller without spending any money, since almost everyone has a smartphone.
Getting acceleration, orientation, and movement data from a phone to a computer is easy with the existing APIs.
It's the next step, from raw data (JSON) to MIDI output, that I wonder how to do.
And latency might be an issue too.
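To make the JSON-to-MIDI step concrete, here is a minimal sketch of one way it could work. Everything here is an assumption for illustration: I'm pretending the phone sends UDP packets like `{"pitch": 12.5}`, and I'm building the raw Control Change bytes by hand instead of using a MIDI library (in practice you'd hand the bytes to something like python-rtmidi or a virtual MIDI port).

```python
import json
import socket

def angle_to_cc(angle_deg, lo=-90.0, hi=90.0):
    """Map an orientation angle (degrees) to a clamped 0-127 MIDI CC value."""
    angle = max(lo, min(hi, angle_deg))
    return round((angle - lo) / (hi - lo) * 127)

def cc_message(channel, controller, value):
    """Raw 3-byte MIDI Control Change message: status, controller, value."""
    return bytes([0xB0 | (channel & 0x0F), controller & 0x7F, value & 0x7F])

def run(port=9000):
    """Listen for JSON sensor packets over UDP and turn pitch into CC 1."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    while True:
        data, _ = sock.recvfrom(1024)
        sensors = json.loads(data)
        msg = cc_message(0, 1, angle_to_cc(sensors["pitch"]))
        # here you'd write msg to a (virtual) MIDI out, e.g. via python-rtmidi
        print(msg.hex())
```

The mapping itself is trivial; the real work is getting a low-latency transport from the phone and a virtual MIDI port the DAW can see.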
That's why I asked here; I think somebody must have done it already, but maybe not shared it.
Basically, I am trying to do what's shown in this vid:
but with a smartphone instead, so everybody could use it.
Some iOS apps use the gyroscope for modulations, or even mimic it via the camera, but they mainly use it internally. Not sure a low-latency MIDI out conversion is possible.
However, MIDI over Bluetooth generally works well with low latency for me.
I'll take a look at the Reaper forum; I may have more success there.
I was sure my idea had already been implemented by someone; I'll share any work on it here.
Edit: OK, after some research it seems that 3D tracking of a smartphone (its position in space) is impossible due to the quality of the sensors.
Even Madgwick and Kalman filters have limits.
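A quick back-of-the-envelope sketch of why position tracking drifts: to get position you have to integrate acceleration twice, so even a tiny constant sensor bias grows quadratically with time. The bias value and sample rate below are made up for illustration.

```python
def integrate_position(bias=0.05, rate_hz=100, seconds=10):
    """Double-integrate a constant accelerometer bias (m/s^2) into position.

    The true acceleration is zero; everything we accumulate is pure error.
    """
    dt = 1.0 / rate_hz
    vel = 0.0
    pos = 0.0
    for _ in range(int(seconds * rate_hz)):
        vel += bias * dt  # first integration: bias leaks into velocity
        pos += vel * dt   # second integration: error grows ~0.5 * bias * t^2
    return pos

print(integrate_position())  # roughly 2.5 m of drift after only 10 s
```

Orientation filters like Madgwick can correct tilt drift against gravity, but there is no equivalent reference to correct position, which is why phone apps stick to orientation.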
So it's better to stick to something similar to the TouchOSC solution :/