Pass me the butter

Tjalling Haije

Trolling Cats with a Camera-Guided Laser Pointer through a Webapp

For this project I built a little robot with a laser pointer, connected to a webapp, so I could troll a friend's cats from anywhere in his house.

The main idea was to mount a laser pointer and camera on a head that can be moved by two servos. The camera stream feeds into a webapp, where a user can watch the stream and control the servos. Everything is connected to and run by a Raspberry Pi 3B+, with Node.js and Python for the code magic.

Below is an overview of the results. I put all the code and documentation in a GitHub repository, which you can find here; check that for the specifics on the code and hardware.

Controlling the camera and laser

First up, I connected a Raspberry Pi camera to a Raspberry Pi 3B+, together with two servos set up in a pan-tilt configuration. I made a small bracket from some aluminium, attached the Pi camera to it, and taped on the laser pointer; the bracket in turn was screwed onto the servos. After connecting the servos to a 6V battery pack, the hardware was complete!

To control the servos from the webapp, the rpi-control script runs in the background, waiting for commands over a websocket. Check the GitHub repository for the code.
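The core of such a control script is turning each incoming websocket message into new servo angles. Here is a minimal sketch of that step in Python; the message format, angle range, and function names are my assumptions for illustration — the real script in the GitHub repository also drives the servos via GPIO PWM.

```python
# Sketch of command handling in an rpi-control-style script.
# The JSON message format and 0-180 degree servo range are assumptions;
# the actual script in the repository may differ.
import json

PAN_MIN, PAN_MAX = 0, 180    # assumed mechanical limits of the pan servo
TILT_MIN, TILT_MAX = 0, 180  # assumed limits of the tilt servo

def clamp(value, lo, hi):
    """Keep a servo angle inside its mechanical limits."""
    return max(lo, min(hi, value))

def apply_command(state, message):
    """Apply one websocket message (JSON text) to the pan/tilt/laser state."""
    cmd = json.loads(message)
    if "pan" in cmd:
        state["pan"] = clamp(state["pan"] + cmd["pan"], PAN_MIN, PAN_MAX)
    if "tilt" in cmd:
        state["tilt"] = clamp(state["tilt"] + cmd["tilt"], TILT_MIN, TILT_MAX)
    if "laser" in cmd:
        state["laser"] = bool(cmd["laser"])
    return state

state = {"pan": 90, "tilt": 90, "laser": False}
apply_command(state, '{"pan": 20, "laser": true}')
print(state)  # {'pan': 110, 'tilt': 90, 'laser': True}
```

In the real script, each applied state change would then be written out to the servo PWM pins and the laser GPIO pin.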

Livestream Camera to Node.js

Now that we have a working Raspberry Pi with a camera, laser pointer and servos, we need to livestream the camera images to the webapp. For this I used JSMpeg. It provides an example livestream script for Node.js, which sends the stream over a websocket, making it accessible from a webapp. The input for JSMpeg is an FFmpeg livestream of the Raspberry Pi camera; check the GitHub repository for the specifics.
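To give an idea of that pipeline: FFmpeg reads the camera, encodes to MPEG1 video in an MPEG-TS container (the format JSMpeg decodes in the browser), and pushes it to the Node.js relay over HTTP. The sketch below builds such a command in Python; the port, stream secret, device path and bitrate are assumptions, not the exact values from the repository.

```python
# Sketch of the FFmpeg invocation that feeds a JSMpeg-style relay.
# The URL, /dev/video0 device and bitrate are assumptions for illustration;
# the command actually used for this project is in the GitHub repository.
def ffmpeg_args(width=640, height=320, fps=30,
                url="http://localhost:8081/supersecret"):
    """Build an FFmpeg command that streams the Pi camera as
    MPEG1 video over MPEG-TS, which the JSMpeg relay expects."""
    return [
        "ffmpeg",
        "-f", "v4l2",                               # read the camera via V4L2
        "-framerate", str(fps),
        "-video_size", f"{width}x{height}",
        "-i", "/dev/video0",
        "-f", "mpegts",                             # JSMpeg wants MPEG-TS...
        "-codec:v", "mpeg1video",                   # ...carrying MPEG1 video
        "-b:v", "1000k",                            # assumed bitrate
        "-an",                                      # no audio needed
        url,                                        # push to the Node.js relay
    ]

print(" ".join(ffmpeg_args()))
```

The relay then broadcasts the incoming MPEG-TS data to every connected websocket client, where JSMpeg decodes and draws it to a canvas.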

The Webapp

The final step is to serve the live camera images to the user in a webapp, through which the user can also control the movement of the camera and laser, and turn the laser pointer on and off.

For this I created a very simple Node.js webserver, which connects to the JSMpeg websocket and shows the camera livestream. As you can see in the picture below, I timed the delay and it came out around 100-200ms, quite okay!

Latency of the camera stream from the Pi to the webapp, on a phone over WiFi: around 100-200ms.

Unfortunately, it only streams at a resolution of 640x320px at 30fps. The Pi wasn't able to cope very well with higher-quality streams.

In the next step, I added some JavaScript to catch where the user clicks on the webpage and move the servos in that direction. Clicking at the top moves the camera to the top; clicking bottom right points the camera to the bottom right. By clicking close to or far away from the center of the screen, the user chooses how much to move the camera and laser in that direction. The user can also toggle the laser by pressing a red button on the webpage.
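The click-to-movement mapping boils down to normalising the click position relative to the center of the video and scaling it to a servo step. Here it is sketched in Python (in the webapp this runs as browser JavaScript); the maximum step size and sign conventions are assumptions for illustration.

```python
# Sketch of the click-to-servo mapping: clicks far from the center of the
# video move the servos more, clicks near the center move them less.
# MAX_STEP and the sign conventions are assumed values, not the repo's.
MAX_STEP = 30  # assumed maximum servo step in degrees, at the screen edge

def click_to_step(x, y, width, height, max_step=MAX_STEP):
    """Map a click at pixel (x, y) on a width x height video
    to relative pan/tilt steps in degrees."""
    dx = (x - width / 2) / (width / 2)     # normalise to [-1, 1]
    dy = (y - height / 2) / (height / 2)   # normalise to [-1, 1]
    return {"pan": round(dx * max_step), "tilt": round(-dy * max_step)}

# A click in the top-right corner: full pan right, full tilt up.
print(click_to_step(640, 0, 640, 320))  # {'pan': 30, 'tilt': 30}
```

The resulting pan/tilt step is what gets packed into the websocket command for the rpi-control script.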

After each click or tap, a new command is sent via a websocket to the rpi-control script.

Finally, I styled the webpage a bit to fill the empty space on the page in both vertical and horizontal orientation.

The Result

The resulting webapp can be seen in the pictures below:

Any device on the same network as the Pi can connect to the webapp and play with the robot laser pointer, even multiple people at a time.

Overall, the project was a big success. It took some time to get the livestream working, but the result is a robust, low-latency webapp that works well together with the Raspberry Pi and the other hardware. I also learned a lot about websockets, Node.js and livestreaming.

I hope to test the Cat Troller with some real cats soon, and will upload a video at that time!


© 2020 Pass me the butter
