Thank you for the interest. I don’t have much recorded material. One example of mimo in use in a scenic project is this one: https://www.youtube.com/watch?v=zg3qO4Vb0KM. All the video was mixed with mimo live: edited videos, a wired camera into a capture card, and two Raspberry Pi cameras (on the table and the ceiling) over web capture (which is actually why we preferred mimo for this project).
My own work is mainly with QuartzComposer (some examples here: http://www.hautkai.net/media.html), which is why I want to integrate QuartzComposer layers into mimo, but it is essential that I can control their parameters live.
Is X-keys a proprietary protocol of PI Engineering? I would prefer to use existing protocols like MIDI and OSC, which can connect to both physical and virtual equipment - equipment I already have - and there is a much wider range of options to adapt to my own needs. On the other hand, the products I have seen are just triggers, and I need gradual control. Let’s say I move a window from left to right to follow a person, or make it gradually appear from the back - just to mention the geometry controls inside mimo. The parameters I control live in QuartzComposer during a show are intensities, sensitivity, quantities, alpha, etc. - that is why I need so many faders and knobs; toggles and buttons are only about 20% of what I use. That is also why the remote-control HTTP interface of mimo - in its current form - is not useful for me (see the small OSC sketch below for what I mean by gradual control).
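Just to make the idea concrete, here is a minimal sketch of continuous control over OSC, written in Python with the python-osc library. The address /composition/alpha and port 9000 are only placeholders I made up for the example, not actual mimo or QuartzComposer endpoints; the point is simply that a fader sends a stream of 0.0-1.0 values rather than an on/off trigger.

    # Minimal sketch: send a continuous fader value over OSC.
    # "/composition/alpha" and port 9000 are placeholder examples,
    # not real mimo or QuartzComposer endpoints.
    from pythonosc.udp_client import SimpleUDPClient

    client = SimpleUDPClient("127.0.0.1", 9000)  # host/port of whatever listens for OSC

    def send_fader(cc_value: int) -> None:
        # Map a MIDI-style 0..127 controller value to a 0.0..1.0 float parameter
        normalized = max(0, min(127, cc_value)) / 127.0
        client.send_message("/composition/alpha", normalized)

    # e.g. a fader moved halfway up:
    send_fader(64)

With a setup like this, every small movement of a physical or virtual fader updates the parameter smoothly, which is exactly what buttons and triggers cannot give me.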
We are now working on a new scenic project, and I would love to use mimo again. This time I am considering replacing the Raspberry Pi web capture with NDI. It works very fast with the NewTek NDI app on a phone (less latency than mimo call), and I hope I can connect the NDI|HX stream of a Spark Connect directly (over WiFi). We are going to rely heavily on live reactivity to the live feed, done in Quartz, and it would be perfect if I manage to integrate my compositions into mimo as layers and control them from there.