Friday, 29 December 2017

Camera based spherical touch surface for an interactive light show


For several years I have been fascinated with geodesic domes and LED lighting. To bring these two passions together, I, along with a few friends, plan to fit 5725 LEDs inside a 6 meter diameter geodesic dome. To make it interactive, there will be a touch based controller in the middle of the dome that lets people interact in real time with the lights around them.


The display will be made up of 20 triangles with around 300 LEDs in each, which calls for quite an interesting layout and control scheme, one we have spent the last few months working out. This project will be split across a few different posts. We will start with the touch input device and related software. Next I will discuss the software and hardware for controlling the LED array. After that come the labor intensive tasks of building the steel dome, assembling the LED panels and doing all the wiring between everything.

A spherical touch input device is not a novel idea and has been implemented many times before. I found inspiration in a Microsoft research paper found here (pdf) and decided to try a similar approach using cheap commercially available hardware and open source software. I commissioned a local plastics forming company to make a ~500mm diameter dome from translucent polycarbonate plastic using a pressure forming tool. This was the cheapest option available, but it has a downside: the opacity is not consistent. At the peak of the dome, where the plastic has been stretched the most, it is significantly thinner than around the lower edges. I was able to work around this in software, which I will explain later.


A wide angle monochrome USB camera from ebay is used for sensing; I specifically asked the vendor to supply the camera without an infra red cut filter. In front of the camera is an infra red longpass filter to block the visible light coming into the camera. Inside the lower edge of the dome I placed infra red LED strips (made by de-soldering a white LED strip and adding my own Digi-Key bought IR LEDs); these flood the inside of the dome with infra red light. When a finger comes into contact with the outside of the dome it reflects the infra red light, which is picked up by the camera. I used this method because there will be a lot of coloured lights around and I want to give the camera the best chance of picking up touches. The image below is what the camera sees.
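
To give a rough idea of how the camera is read on the software side, here is a minimal OpenCV capture sketch. The device index and resolution are assumptions for illustration, not values from the actual build.

```python
import cv2

# Open the USB camera; index 0 and 640x480 are assumptions for this sketch.
cap = cv2.VideoCapture(0)
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 640)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)

ok, frame = cap.read()
if not ok:
    raise RuntimeError("Could not read a frame from the camera")

# The sensor is monochrome, but the UVC driver typically hands back a
# 3-channel image, so collapse it to a single grayscale channel.
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
cap.release()
```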


The software for the touch input uses OpenCV and Python to manipulate and extract information from the images captured by the camera. The image processing involves the following steps (a rough code sketch follows the list):
  1. A calibration image is taken with no finger touches and is stored. This gives us our baseline to compare against. 
  2. Each subsequent camera image has the calibration image subtracted from it, so that only the bright reflections caused by finger touches remain. 
  3. A blob detection function in OpenCV is used to find the coordinates of the bright spots. These are our touch coordinates, which can then be used for anything. 
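For the curious, here is a minimal sketch of steps 2 and 3, assuming 8-bit grayscale frames. The threshold and blob parameters are placeholder values, not the ones tuned for the real dome.

```python
import cv2

def detect_touches(frame_gray, calibration_gray, threshold=40):
    """Return (x, y) pixel coordinates of bright spots that appear when
    the stored no-touch calibration image is subtracted from a frame."""
    # Saturating subtraction: anything darker than the baseline clamps to 0,
    # so only new bright reflections (finger touches) survive.
    diff = cv2.subtract(frame_gray, calibration_gray)

    # Throw away faint noise; this threshold is a guess and would need
    # tuning against the real dome and lighting.
    _, mask = cv2.threshold(diff, threshold, 255, cv2.THRESH_BINARY)

    # SimpleBlobDetector looks for dark blobs by default, so flip it to
    # bright blobs and ignore tiny specks.
    params = cv2.SimpleBlobDetector_Params()
    params.filterByColor = True
    params.blobColor = 255
    params.filterByArea = True
    params.minArea = 20
    detector = cv2.SimpleBlobDetector_create(params)

    keypoints = detector.detect(mask)
    return [kp.pt for kp in keypoints]
```

In practice this runs in a loop over live frames, with the calibration image captured once at startup while nobody is touching the dome.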
The video below shows the dome working as a mouse input for my computer. At this point I had not switched over to using IR LEDs and was relying on visible light.
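
I won't go through the exact glue code here, but mapping a detected touch to the cursor can be as simple as scaling the camera coordinate to the screen resolution. The snippet below uses pyautogui purely as an illustrative choice; it is not necessarily what the demo in the video ran on.

```python
import pyautogui

CAM_W, CAM_H = 640, 480                 # assumed camera resolution
SCREEN_W, SCREEN_H = pyautogui.size()   # desktop resolution

def move_cursor(touch_x, touch_y):
    # Scale the touch coordinate from camera pixels to screen pixels.
    pyautogui.moveTo(int(touch_x * SCREEN_W / CAM_W),
                     int(touch_y * SCREEN_H / CAM_H))
```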


In the next post I will talk about the software and hardware required for driving the 5725 LEDs. If you are interested in the software, all our source files are available on GitHub here.