How about adding an active camera module that can be set up in front of a video monitor? When I watch football on my 70" TV, I turn my Auroras to green. Doing so creates the illusion of a larger picture. A camera module that can view a monitor in widescreen mode would allow the Aurora to change colors according to the picture colors on the monitor.
You could probably use most of the same parts and logic to do this and then use the API to control the Aurora in the same way. There would be more delay, but it would be pretty amazing.
That would require a certain level of programming ability. I wonder if someone could produce an Aurora "Ambilight" system, where you map the TV and the device in relation to each other and use that information to drive the Aurora.
I may have a complete solution for all of this soon. Hyperion (which oculus42 linked to) supports OPC, which can be received by OLA ( https://www.openlighting.org/ola/ ). I'm one of the OLA devs, and I'm going to try to add Aurora support to OLA soon, so you'd be able to do it via those two pieces of software with just a bit of config.
Thought I would add to the conversation. I managed to get the Ambilight setup (Hyperion) working prior to seeing this post, or I would have posted here originally. https://forum.nanoleaf.me/forum/share-your-projects/video-scene-specific-ambient-room-lighting
I updated my Hyperion (Ambilight) configuration to output a basic UDP stream of the average screen RGB color. I then wrote a Python script that grabs this RGB stream, formats it for the Nanoleaf, and requests a color update whenever the stream changes. Since the UDP stream and the Nanoleaf RGB "server" are running on the same device, the latency is actually very acceptable.
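For anyone curious what that script looks like, here's a minimal sketch of the idea. The IP address, auth token, and UDP port are placeholders, and the assumed datagram format (three raw bytes: R, G, B) is just one way to configure Hyperion's UDP output; adjust both to your own setup. The Aurora endpoint used (`PUT /api/v1/<token>/state` on port 16021) is from the Nanoleaf OpenAPI.

```python
# Sketch only: forward an RGB UDP stream from Hyperion to an Aurora.
# IP, token, and UDP port are placeholders; the 3-byte RGB payload
# format is an assumption about the Hyperion UDP output config.
import colorsys
import json
import socket
import urllib.request

def rgb_to_hsb(r, g, b):
    """Convert 0-255 RGB to the hue/sat/brightness ranges the Aurora API expects."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return int(round(h * 360)), int(round(s * 100)), int(round(v * 100))

def set_aurora_color(ip, token, r, g, b):
    """PUT a single solid color to the Aurora's state endpoint."""
    hue, sat, bri = rgb_to_hsb(r, g, b)
    body = json.dumps({
        "hue": {"value": hue},
        "sat": {"value": sat},
        "brightness": {"value": bri},
    }).encode()
    req = urllib.request.Request(
        "http://%s:16021/api/v1/%s/state" % (ip, token),
        data=body, method="PUT")
    urllib.request.urlopen(req)

def listen_and_forward(ip, token, udp_port=19444):
    """Read RGB datagrams and push a color change whenever the value differs."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", udp_port))
    last = None
    while True:
        data, _ = sock.recvfrom(64)
        if len(data) >= 3 and tuple(data[:3]) != last:
            last = tuple(data[:3])
            set_aurora_color(ip, token, *last)

# Example (placeholder values): listen_and_forward("192.168.1.50", "YOUR_TOKEN")
```

Only pushing an update when the color actually changes keeps the request rate down, which matters since each change is a separate HTTP call.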
In Phase 2 of my project I am looking to update each panel individually based on the average color of the corresponding region of the TV picture.
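One way to approach the per-panel mapping is to use the layout the Aurora reports (the OpenAPI's `/api/v1/<token>/panelLayout/layout` endpoint returns a `positionData` list with each panel's `panelId` and x/y coordinates) and assign each panel to a horizontal screen region by its x position. The sketch below is a hypothetical helper, not my actual Phase 2 code; the example panel dicts only mimic the real layout fields.

```python
# Sketch: map each Aurora panel to a horizontal screen region by its
# x position. Panel dicts mimic the "positionData" entries from the
# Nanoleaf layout endpoint; this helper and its names are illustrative.

def panels_to_regions(panels, num_regions):
    """Map each panelId to the index of the screen region it should mirror."""
    xs = [p["x"] for p in panels]
    xmin, xmax = min(xs), max(xs)
    span = max(xmax - xmin, 1)  # avoid divide-by-zero for a single column
    mapping = {}
    for p in panels:
        idx = int((p["x"] - xmin) / span * num_regions)
        mapping[p["panelId"]] = min(idx, num_regions - 1)
    return mapping

# Example: three panels across, two screen regions.
layout = [
    {"panelId": 101, "x": 0, "y": 0},
    {"panelId": 102, "x": 150, "y": 0},
    {"panelId": 103, "x": 300, "y": 0},
]
print(panels_to_regions(layout, 2))  # -> {101: 0, 102: 1, 103: 1}
```

Each region's average color could then be pushed to its panel, e.g. via the Aurora's per-panel effects commands rather than the single whole-device state call.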
I will upload a video of the setup once I have an opportunity. If anyone is interested in how to accomplish this let me know.