I like cats, and I like pointing lasers at the floor for cats to pounce on. Unfortunately I’m allergic to most hairy animals, so no cats for me anytime soon. I do have a friend with cats, and that gave me an idea: wouldn’t it be fun to have a remote-controlled laser at your house (or at the house of someone with an actual cat) so you could tease the cats with a laser to chase, from anywhere you want?
So I decided to build a little robot with a laser pointer, connected to a webapp, so I could troll my friend’s cats from anywhere in his house (for privacy reasons it is not accessible from outside his house).
The main idea was to mount a laser pointer and camera on a head that can be moved by two servos. The camera streams to a webapp, where the user can watch the feed and control the servos. Everything runs on a Raspberry Pi 3B+, with Node.js and Python for the code magic.
Below is an overview of the results. I put all the code and documentation in a GitHub repository, which you can find here, so check that for the specifics on the code and hardware and stuff.
Controlling the camera and laser
First up, I connected a Raspberry Pi camera to a Raspberry Pi 3B+, together with two servos set up in a pan-tilt configuration. I made a small aluminium bracket, onto which I attached the RPi camera and taped the laser pointer, and screwed the bracket onto the servo. After connecting the servos to a 6V battery pack, the hardware was complete!
To control the servos from the webapp, the rpi-control script runs in the background, waiting for commands over a websocket. Check the GitHub repo for the code.
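The exact message format lives in the repo; as a sketch, assuming the webapp sends small JSON commands with relative pan/tilt steps and a laser toggle, the command handling on the Pi could look something like this (the servo ranges and field names here are illustrative, not the repo’s actual values):

```python
import json

# Assumed servo limits in degrees; the real values depend on the bracket.
PAN_RANGE = (-90, 90)
TILT_RANGE = (-45, 45)

def apply_command(state, raw):
    """Apply one JSON websocket message to the current servo state.

    `state` is a dict like {"pan": 0, "tilt": 0, "laser": False};
    `raw` is a message such as '{"pan": 10, "tilt": -5, "laser": true}'.
    Returns the new state, with angles clamped to the servo limits
    so a flood of clicks can never drive the servos past their stops.
    """
    cmd = json.loads(raw)
    new = dict(state)
    if "pan" in cmd:
        new["pan"] = max(PAN_RANGE[0], min(PAN_RANGE[1], state["pan"] + cmd["pan"]))
    if "tilt" in cmd:
        new["tilt"] = max(TILT_RANGE[0], min(TILT_RANGE[1], state["tilt"] + cmd["tilt"]))
    if "laser" in cmd:
        new["laser"] = bool(cmd["laser"])
    return new
```

In the real script the resulting angles would then be written to the servos (e.g. with gpiozero), but keeping the clamping logic separate like this makes it easy to test without hardware.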
Livestream Camera to Node.js
Now that the Raspberry Pi has a working camera, laser pointer and servos, the next step is to livestream the camera images to a webapp. For this I used JSMpeg. It provides an example livestream script for Node.js, which relays the stream over a websocket, making it viewable in a webapp. The input for JSMpeg is an FFmpeg livestream of the Raspberry Pi camera. Check the repo for the specifics.
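The repo has the exact FFmpeg invocation; as a rough sketch of the kind of pipeline involved, assuming the websocket-relay setup from the JSMpeg README (where FFmpeg pushes an MPEG-TS stream over HTTP to the relay), the command could be built up like this. The device path, relay URL and bitrate are placeholders:

```python
def build_ffmpeg_args(width=640, height=320, fps=30,
                      relay_url="http://localhost:8081/secret"):
    """Build an FFmpeg command line that feeds a JSMpeg relay.

    JSMpeg decodes MPEG1 video in the browser, so the camera feed is
    transcoded to MPEG1 in an MPEG-TS container and pushed over HTTP.
    """
    return [
        "ffmpeg",
        "-f", "v4l2", "-i", "/dev/video0",  # Pi camera exposed via V4L2
        "-f", "mpegts",                     # container the relay expects
        "-codec:v", "mpeg1video",           # codec JSMpeg can decode
        "-s", f"{width}x{height}",          # keep the resolution modest
        "-r", str(fps),
        "-b:v", "800k",                     # low bitrate, kinder to the Pi
        "-bf", "0",                         # no B-frames, lowers latency
        relay_url,
    ]
```

Keeping the resolution and bitrate low is what keeps the latency down on a Pi 3B+; the actual numbers are worth tuning per network.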
The final step is to serve the live camera images to the user in a webapp, through which the user can also control the movement of the camera and laser, and turn the laser pointer on and off.
For this I created a very simple Node.js webserver, which connects to the JSMpeg websocket and shows the camera livestream. As you can see in the picture below, I measured the delay at around 100-200ms, quite okay!
Unfortunately it only streams at a resolution of 640x320px at 30fps. The RPi wasn’t able to cope very well with higher-quality streams.
After each click or tap, a new command is sent via a websocket to the rpi-control script.
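In the real project this mapping happens in JavaScript in the browser, but the idea can be sketched in Python: translate where the user clicked on the video into a relative pan/tilt step, with clicks far from the centre moving the head more. The frame size and `max_step` here are illustrative assumptions, not the repo’s exact values:

```python
def click_to_step(x, y, width=640, height=320, max_step=15):
    """Translate a click at pixel (x, y) on the video into pan/tilt steps.

    A click near the edge of the frame produces a big step, a click near
    the centre a small one; max_step is the largest step in degrees.
    """
    # Normalise to [-1, 1] with (0, 0) at the centre of the frame.
    nx = (x - width / 2) / (width / 2)
    ny = (y - height / 2) / (height / 2)
    # Tilt is inverted: clicking above the centre should tilt the head up.
    return {"pan": round(nx * max_step), "tilt": round(-ny * max_step)}
```

The resulting dict is exactly the kind of relative command the rpi-control script can consume over the websocket.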
Finally, I styled the webpage a bit to fill the empty space on the page in both vertical and horizontal orientation.
The resulting webapp can be seen in the pictures below:
Any device on the same network as the RPi can connect to the webapp and play with the robot laser pointer. Even multiple people at a time, which leads to extra-chaotic laser behaviour and more cat entertainment.
Overall the project was a big success. It took some time to get the livestream working, but the result is a robust, low-latency webapp that works well together with the Raspberry Pi and the rest of the hardware. I also learned a lot about websockets, Node.js and livestreaming.
I hope to test the Cat Troller with some real cats soon, and will upload a video when I do! I do have some lizards, so it might be worth trying whether they are as enthusiastic about lasers as a cat (but probably not).
In the end, the only thing I might have improved is reducing the complexity of the whole thing a bit. But if it works, it works.