Auto-match your ambient light to what you’re watching on your Mac

I’ve always been interested in making things talk to each other. For this Monday-afternoon project, I decided I wanted to watch a beautiful documentary while my lights follow along, changing their color to match what I’m actually seeing.


The process

The idea is simple: take continuous screenshots, find each one’s predominant color, and set the Philips Hue lights to that color.

I chose Ruby and wrote it for a Mac. I used the screencapture command to take screenshots, the RMagick gem (backed by ImageMagick) to find the predominant color on screen, the hue gem to talk to the lights, and finally the ColorMath gem to convert the hex color codes ImageMagick outputs into the weird sort-of-HSL format Philips Hue understands. (I’m fairly sure ImageMagick can convert to HSL itself, and I could even have written the conversion by hand — I studied it at university in an interesting subject called VIG and I think I even performed this exact conversion in a final exam — but I didn’t have the time.)
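As a rough sketch, the screenshot-and-analysis step can look something like this. It’s simplified, not the exact code I ended up with; the temp path and helper names here are just for illustration:

    require 'rmagick'

    SCREENSHOT_PATH = '/tmp/ambilight_frame.png' # illustrative temp path

    # Grab the current screen contents silently (-x disables the shutter sound).
    def capture_screen
      system('screencapture', '-x', '-t', 'png', SCREENSHOT_PATH)
    end

    # Collapse the screenshot to its single most representative color
    # and return it as a "#RRGGBB" hex string.
    def predominant_hex_color
      image = Magick::Image.read(SCREENSHOT_PATH).first
      pixel = image.quantize(1).color_histogram.keys.first
      pixel.to_color(Magick::AllCompliance, false, 8, true) # hex: true => "#RRGGBB"
    end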

The code

Writing this post is actually taking me 100 times longer than coming up with the idea and writing the code did, so don’t judge it too harshly; it’s just a proof of concept.
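Building on the helpers above, the glue between the screen colors and the lights looks more or less like this. Treat it as a hedged sketch: it assumes the Hue bridge is already discovered and paired, leans on the hue gem’s per-attribute setters, and does the hex-to-HSL conversion by hand instead of through the ColorMath gem the real script uses:

    require 'hue'

    # Convert "#RRGGBB" into the ranges the Hue API expects
    # (hue: 0..65535, sat: 0..254, bri: 1..254). Plain RGB -> HSL math.
    def hex_to_hue_state(hex)
      r, g, b = hex.delete('#').scan(/../).map { |c| c.to_i(16) / 255.0 }
      max, min = [r, g, b].max, [r, g, b].min
      delta = max - min
      l = (max + min) / 2.0

      h = if delta.zero? then 0.0
          elsif max == r then 60 * (((g - b) / delta) % 6)
          elsif max == g then 60 * (((b - r) / delta) + 2)
          else                60 * (((r - g) / delta) + 4)
          end
      s = delta.zero? ? 0.0 : delta / (1 - (2 * l - 1).abs)

      { hue:        (h / 360.0 * 65_535).round,
        saturation: (s * 254).round,
        brightness: [(l * 254).round, 1].max }
    end

    client = Hue::Client.new # assumes the bridge is already paired

    loop do
      capture_screen
      state = hex_to_hue_state(predominant_hex_color)

      client.lights.each do |light|
        # Each setter issues its own request to the bridge; fine for a PoC.
        light.hue        = state[:hue]
        light.saturation = state[:saturation]
        light.brightness = state[:brightness]
      end

      sleep 1 # crude refresh rate; the bridge rate-limits requests anyway
    end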

The result

For demo purposes I plugged my Mac into the TV via HDMI. The demo involves four Hue lights: one on top that isn’t visible, the long IKEA one on the left side, and two small ones behind the TV. It made me notice that the code actually crashes when the predominant color is black, because ImageMagick suddenly returns the name black instead of its hex value #000000.
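If you run into the same thing, a quick guard is to normalize ImageMagick’s output before converting it. A minimal sketch — black is the only color name I’ve actually seen it return, the rest of the map is just defensive:

    # ImageMagick sometimes reports a color name instead of a hex code
    # ("black" in my case, for an all-black frame). Normalize before converting.
    NAMED_COLORS = { 'black' => '#000000', 'white' => '#FFFFFF' }.freeze

    def normalize_hex_color(color)
      color.start_with?('#') ? color : NAMED_COLORS.fetch(color.downcase, '#000000')
    end

    # usage: hex_to_hue_state(normalize_hex_color(predominant_hex_color))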

Demo of the script

Next steps

It would be cool to take the source images from a webcam, so the lights match the color of wherever it’s pointed. It would also be interesting to extract more than one predominant color and assign each one to a different light; a rough sketch of that follows.
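RMagick can already hand back a small palette instead of a single color, so the multi-color version could start from something like this (again just a sketch, reusing the SCREENSHOT_PATH constant from above and pairing colors with lights in whatever order the bridge returns them):

    # Return the n most frequent colors of the current screenshot as hex strings,
    # most common first, so each one can go to a different light.
    def predominant_hex_colors(n)
      image = Magick::Image.read(SCREENSHOT_PATH).first
      image.quantize(n).color_histogram
           .sort_by { |_pixel, count| -count }
           .first(n)
           .map { |pixel, _| pixel.to_color(Magick::AllCompliance, false, 8, true) }
    end

    # e.g. client.lights.zip(predominant_hex_colors(client.lights.size))
    #        .each { |light, hex| ... }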

Thanks for reading!

P.S. I’m colorblind, so this whole project could be really messed up.