A clone of this project from Stand-up Maths: https://www.youtube.com/watch?v=TvlpIojusBE.
It is a set of applications that help map out the 3D coordinates of an array of LEDs on your Christmas tree and then run 3D-aware animations on them, controlled from your mobile phone.
A Rust program meant to run on a Raspberry Pi that is connected to WS2812B LED lights. It listens for UDP messages on the local Wi-Fi network, where each message contains the full list of colors to display; as soon as such a message is received, the lights are updated with the new values. It also includes a few other utilities, such as the simple animations used for the LedPosHelper and an option to cap the total brightness of the LEDs in order to limit power usage if necessary.
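The exact payload layout is defined by the Rust listener; as a rough illustration only, here is a minimal Node.js/TypeScript sketch of a test sender, assuming three bytes (R, G, B) per LED packed in index order. The LED count, host, and port below are placeholders, not values taken from the repo.

```ts
import * as dgram from "node:dgram";

// Assumptions (not taken from the repo): 300 LEDs, 3 bytes (R, G, B) per LED,
// and the Pi listening on port 4210. Adjust to match RpiUdpListener's config.
const LED_COUNT = 300;
const PI_HOST = "192.168.1.50"; // hypothetical Raspberry Pi address
const PI_PORT = 4210;           // hypothetical port

const socket = dgram.createSocket("udp4");

// Build one full frame: every LED's color is included in every message.
function buildFrame(color: [number, number, number]): Buffer {
  const frame = Buffer.alloc(LED_COUNT * 3);
  for (let i = 0; i < LED_COUNT; i++) {
    frame[i * 3 + 0] = color[0]; // red
    frame[i * 3 + 1] = color[1]; // green
    frame[i * 3 + 2] = color[2]; // blue
  }
  return frame;
}

// Send a single dim-red test frame; the Pi-side listener may additionally cap
// the total brightness to stay within the power supply's budget.
socket.send(buildFrame([32, 0, 0]), PI_PORT, PI_HOST, (err) => {
  if (err) console.error("UDP send failed:", err);
  socket.close();
});
```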
A Python program that displays images one by one, lets you click a location in each image (or skip it), and records the results in a JSON file. I took 4 videos of the tree while it turned on the LEDs one by one, converted each video into a list of images (one per frame), and then ran markLocations.py once per video. The folder also includes a script, processResults.js, that uses the output of the Python program to convert the coordinates from the various tree views into a best-guess 3D coordinate for each LED.
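processResults.js contains the actual view-merging logic; the sketch below is only one plausible approach, not taken from that script. It assumes four views shot 90° apart at the same distance and camera height: the front/back views supply x, the left/right views supply y, every view in which the LED was clicked contributes to z, and hidden LEDs are simply skipped.

```ts
// One clicked point per LED per view; null when the LED wasn't visible.
type Click = { x: number; y: number } | null;

// views[0..3] are assumed to be taken at 0°, 90°, 180°, 270° around the tree
// (an assumption about the setup, not the repo's spec).
function estimatePositions(
  views: Click[][],
  imageWidth: number,
): Array<[number, number, number] | null> {
  const ledCount = views[0].length;
  const positions: Array<[number, number, number] | null> = [];

  for (let i = 0; i < ledCount; i++) {
    const [front, right, back, left] = views.map((v) => v[i]);

    // Horizontal pixel offset from the image center maps to a tree-space offset;
    // opposite views are mirrored, so their signs are flipped.
    const xs: number[] = [];
    const ys: number[] = [];
    const zs: number[] = [];
    if (front) { xs.push(front.x - imageWidth / 2); zs.push(front.y); }
    if (back)  { xs.push(imageWidth / 2 - back.x);  zs.push(back.y); }
    if (right) { ys.push(right.x - imageWidth / 2); zs.push(right.y); }
    if (left)  { ys.push(imageWidth / 2 - left.x);  zs.push(left.y); }

    if (zs.length === 0) {
      positions.push(null); // never seen in any view
      continue;
    }
    const avg = (a: number[]) =>
      a.length ? a.reduce((s, v) => s + v, 0) / a.length : 0;
    // Image y grows downward, so negate it to get height up the tree.
    positions.push([avg(xs), avg(ys), -avg(zs)]);
  }
  return positions;
}
```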
A React Native app meant to run on my Android phone (never tested on iOS) that talks to the RpiUdpListener in order to control the lights. Because the app itself computes every frame and streams it over UDP, animations only work while the phone screen is on; the app stops sending UDP messages as soon as the screen turns off. Some sample animations can be seen below.
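As a rough sketch of that structure (not the actual App.tsx code; sendFrame below is a hypothetical placeholder), each animation boils down to a JavaScript timer that recomputes a full frame of colors from the LEDs' 3D positions and sends it on every tick, which is why frames stop whenever the OS suspends the app.

```ts
type Vec3 = [number, number, number];
type RGB = [number, number, number];

// Placeholder: in the real app this would pack the colors into a UDP packet
// and send it to the Pi, as in the sender sketch above.
function sendFrame(colors: RGB[]): void {
  console.log(`frame with ${colors.length} colors`);
}

// Each animation is just a timer: recompute a full frame from the LEDs' 3D
// positions and stream it ~30 times per second while the JS thread is alive.
function runAnimation(
  ledPositions: Vec3[],
  frame: (pos: Vec3, timeMs: number) => RGB,
): () => void {
  const timer = setInterval(() => {
    const now = Date.now();
    sendFrame(ledPositions.map((pos) => frame(pos, now)));
  }, 33);
  return () => clearInterval(timer); // call the returned function to stop
}
```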
The 3D positions of the lights for my tree can be seen on this codepen.
mytree.mp4
This animation is similar to the one produced by the animateRandomCrossSections function in TreeControllerApp/App.tsx, except that it uses a hardcoded direction for the plane to move in as opposed to a random one.
plane_vid_3.mp4
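The idea behind these cross-section animations is to sweep a plane through the tree and light only the LEDs whose 3D positions lie within a small distance of it. A minimal TypeScript sketch of that idea (not the exact App.tsx implementation), assuming the LED positions are already known:

```ts
type Vec3 = [number, number, number];
type RGB = [number, number, number];

const dot = (a: Vec3, b: Vec3) => a[0] * b[0] + a[1] * b[1] + a[2] * b[2];

// Light the LEDs that lie within `thickness` of the plane defined by a unit
// normal and an offset along that normal; everything else stays dark.
function crossSectionFrame(
  positions: Vec3[],
  normal: Vec3,    // hardcoded in the video above, random in animateRandomCrossSections
  offset: number,  // animate this value to sweep the plane through the tree
  thickness: number,
  color: RGB,
): RGB[] {
  return positions.map((p) =>
    Math.abs(dot(p, normal) - offset) < thickness ? color : [0, 0, 0],
  );
}
```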
This animation, which is similar to the one produced by the orientationRainbow function in TreeControllerApp/App.tsx, uses the phone's "tilt sensor" to control the lights: changing the pitch of the phone controls the position of the rainbow along the vertical axis of the tree, and changing the roll controls the saturation of the rainbow.
tild_vid_2.mp4
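Roughly, a tilt-controlled rainbow like this can be computed as follows (a sketch, not the actual orientationRainbow code): each LED's hue comes from its height on the tree plus an offset driven by the phone's pitch, and the roll sets the saturation of every LED. The pitch/roll normalization below is an assumption about the sensor values, not the app's code.

```ts
type RGB = [number, number, number];

// Standard HSV -> RGB conversion (h in [0, 360), s and v in [0, 1]).
function hsvToRgb(h: number, s: number, v: number): RGB {
  const c = v * s;
  const x = c * (1 - Math.abs(((h / 60) % 2) - 1));
  const m = v - c;
  const [r, g, b] =
    h < 60 ? [c, x, 0] :
    h < 120 ? [x, c, 0] :
    h < 180 ? [0, c, x] :
    h < 240 ? [0, x, c] :
    h < 300 ? [x, 0, c] : [c, 0, x];
  return [
    Math.round((r + m) * 255),
    Math.round((g + m) * 255),
    Math.round((b + m) * 255),
  ];
}

// pitch and roll are assumed to be normalized to [0, 1] from the phone's
// orientation sensor before being passed in.
function orientationRainbowFrame(
  heights: number[], // each LED's height up the tree, normalized to [0, 1]
  pitch: number,     // shifts the rainbow along the tree's vertical axis
  roll: number,      // sets the saturation of the whole rainbow
): RGB[] {
  return heights.map((z) => hsvToRgb(((z + pitch) % 1) * 360, roll, 1));
}
```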
Here are the particular products I used to get this project up and running:
- 2 x Alitove 16.4ft WS2812B strips (150 LEDs each), for a total of 300 LEDs:
- Raspberry Pi 2 Model B with a Micro USB cable and AC adapter:
- JOVNO 5V 15A AC Adapter (theoretically underpowered for 300 LEDs):