
My Time-Lapse Rig

I have done my fair share of time-lapse videos in the past: some lasting for months, some only for days. The most essential part of a time-lapse video is: DON’T TOUCH ANYTHING. (This is harder than it seems.)

My first one was just a webcam hanging next to my bed, with a red light at night (so it wouldn’t bother me so much), and it moves A LOT. That’s probably me kicking the camera cable, or bumping the flowers themselves, I don’t know. Either way, it does not look so good. See for yourself. It ran for only 5 days.

November, 2006. Wow, I was already doing this stuff 14 years ago!

As I’ve done many more, I’ve gotten better at it, and now I have my own time-lapse rig, which I can move around. Have a look at this one (this is a tiny version of a time-lapse which ran for a couple of months).

A more recent one. Notice the vase moving to the left on the first 2 seconds.

Since the idea for a new time-lapse comes up as soon as some plant gets my attention, I don’t want to keep dragging and fixing stuff around (not to mention destroying the Raspberry Pi camera cables again and again). So, I made my time-lapse rig:

Time-lapse rig. A Raspberry Pi with MotionEyeOS, one IR camera, and one GoPro clone for wide-angle and daylight pictures.

The Pi, the IR camera, and the USB hub are hot-glued onto the wood panel, so I can just grab the whole thing and position it wherever I want. The GoPro clone is fixed with Blu-Tack, so I can remove it when I need it for something else.

The rig in action with some box to keep it steady

The software I use is MotionEyeOS. It does what I need:

  • Takes pictures and uploads them to Dropbox (see the sketch after this list for turning those stills into the video).
  • Mounts the filesystem read-only, so unplugging it without shutting down doesn’t make the machine unusable.
  • Has a nice web interface for all of it.

MotionEyeOS on the phone.
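
Once the stills are sitting in Dropbox, something has to stitch them into the actual video. Here’s a rough sketch of how that could look from Python, assuming ffmpeg is installed; the Dropbox folder, the frame rate and the output name are just placeholders, not my actual setup:

# Rough sketch: stitch the uploaded stills into a time-lapse video by
# calling ffmpeg. Folder, frame rate and output name are placeholders.
import subprocess
from pathlib import Path

frames = Path.home() / "Dropbox" / "timelapse"  # wherever the pictures end up

subprocess.run(
    [
        "ffmpeg",
        "-framerate", "24",            # 24 stills become 1 second of video
        "-pattern_type", "glob",
        "-i", str(frames / "*.jpg"),   # frames picked up in filename order
        "-c:v", "libx264",
        "-pix_fmt", "yuv420p",         # keeps the result playable pretty much everywhere
        "timelapse.mp4",
    ],
    check=True,
)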

This rig has been working really well for a couple of years already. The Blu-Tack thing is a recent addition: I love that stuff. It’s so useful around the house, even more so than Sugru, as it’s reusable!

An example of a video from before I had my time-lapse rig: the camera was glued to the window, and as the glue gave way, you can see the image drifting. This ran for about two months:

Long-running video. It’s hard to keep the camera in place for months.

My next addition will probably be some kind of gorilla tripod with crocodile clips, so I have more flexibility in positioning, but for now, keeping the wooden plank on a chair seems to do the trick.


My bird camera with IR cut

I love birds, and I have always wanted to photograph them. So I thought, years ago: why not bring the birds to me? So I did.

Or tried.

First, I got one of those:

Transparent bird feeder. Birds see you and avoid food.

Looks like a nice idea, right? Put out some food, watch the birds. Everybody wins.

Then I got a ZeroView. It’s a VERY nice package for the Raspberry Pi Zero. It looks like this:

ZeroView with a Raspberry Pi Zero.

So, with those suckers, the idea was:

  • Bird feeder outside,
  • ZeroView inside = WIN!

Except the birds never came. The whole transparent thing scared them off. So I set up the ZeroView with some bird feeder about one meter from it, and now I have thousands of bird pictures. Mostly like this:

A Great Tit. That’s the bird’s name. Not this one, the species. This one is called Zé.
A Tit (the bird), and a red woodpecker.

But then I moved to a new office. I didn’t have the luxury of a concrete wall at a convenient distance from the camera. So, back to the transparent house again. What if I hide my side with some plants?

Look out of the window.

New office, with the transparent bird house on the other side of the window (outside, duh).

Now, look inside.

From outside. With ZeroView.

See the problem?

No?

The camera is touching the window. And so are the birds.

What do I get? Basically, a bunch of blurred pictures of birds’ butts. When the birds are in a good mood, this is what I get:

A Great Tit. Not Zé, this one is Eurípedes.

But, mostly, it’s out-of-focus butts.

Eurípedes’ butt.

Recently, I changed the seeds from sunflower to a seed mix, and I started getting visits from other birds:

Another Woodpecker. This one is called Woody Allen.

So, I thought I could improve my camera game.

A bunch of ideas online talk about changing the focus of the lens. That didn’t work. So I just got one of these:

Waveshare Raspberry Pi IR-Cut Camera, 5 MP OV5647 sensor. Notice, on the right side of the camera, the small pin between the screws; it can be connected to a GPIO.

It has:

  • A manual-focus lens,
  • 2 IR LEDs, for in-the-dark pictures,
  • and an IR cut, which means I can add the IR filter during daytime and remove it at night.

The problem

While the ZeroView was an integrated product, this is just a camera, and it sits too far from the glass. I had to build the mount myself.

I needed:

  • Something that held the Raspberry Pi and the camera together (the camera cable on the Pi Zero sucks!!!)
  • Some way to stick it to the window in a fixed position (with so many plants around…)
  • Something that let the IR LEDs pass through (so no wood panel)

After some experimentation, I made this:

Pi, IR-cut camera, transparent CD. Notice the black cable coming out of the GPIOs (compare with the picture of the camera alone).

I screwed the camera onto an old transparent CD, and screwed a Pi acrylic case onto the bottom of the CD. The two holes at the top are for the suckers, like this:

Rear view of the contraption stuck to the window. Obviously I’m missing power here. Note the cable between the Raspberry Pi’s GPIO and the camera.

The code

This thing runs MotionEye. It makes all the motion capture, saving to Dropbox, visualization, etc., trivial. I won’t bother writing code that already exists.

This camera has one distinguishing feature: the IR cut I mentioned before. What does that mean? It means you can remove the infrared filter (making it a “night vision” camera), or add it back, via software. What does it look like?

With the IR filter, it looks like a normal daylight camera. Sorry for the picture.
Same camera, without the IR filter. It sees infrared. Good to see in the dark, terrible at this time of the day.

One can enable/disable the IR filter via software. It’s explained on my GitHub repo, but the TL;DR is:

  • Connect the pin to some GPIO (that’s the black cable you can see in the pictures), and note which GPIO you are connecting it to.
  • Set this pin to HIGH or LOW.

Like this:

import RPi.GPIO as GPIO

MODE = "day"   # "day" = IR filter in; anything else = filter out
PORT = 16      # BCM number of the GPIO the camera's IR-cut pin is wired to

if MODE == "day":
    OUTPUT = GPIO.HIGH
    print("Day mode on port", PORT)
else:
    OUTPUT = GPIO.LOW
    print("Night mode on port", PORT)

GPIO.setmode(GPIO.BCM)
GPIO.setwarnings(False)
GPIO.setup(PORT, GPIO.OUT)
GPIO.output(PORT, OUTPUT)

Now, I need to write some code to decide WHEN to change from day to night. Any ideas?

I thought about using Machine Learning both to identify the birds (I have 12 thousand pictures from this already) and to check light levels and decide when to change the IR settings.
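
In the meantime, a dumb-but-simple option: look at the average brightness of the most recent snapshot and flip the pin based on that. Here’s a rough sketch; the snapshot path and the threshold are made up, and it assumes the same GPIO as the script above. Something like this could run from cron every few minutes:

# Rough sketch: pick day or night mode from the average brightness of the
# latest picture. Snapshot path and threshold are placeholders.
from PIL import Image, ImageStat
import RPi.GPIO as GPIO

PORT = 16                      # same GPIO as above
SNAPSHOT = "/data/latest.jpg"  # hypothetical path to the newest picture
THRESHOLD = 60                 # average luminance (0-255); needs tuning by eye

brightness = ImageStat.Stat(Image.open(SNAPSHOT).convert("L")).mean[0]

GPIO.setmode(GPIO.BCM)
GPIO.setwarnings(False)
GPIO.setup(PORT, GPIO.OUT)
# Bright scene: keep the IR filter in (day). Dark scene: take it out (night).
GPIO.output(PORT, GPIO.HIGH if brightness > THRESHOLD else GPIO.LOW)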

No one reads my posts, but I like writing.
