I have done my fair share of time-lapse videos in the past. Some lasted for months, some only for days. The most essential part of a time-lapse video is: DON’T TOUCH ANYTHING. (This is harder than it seems.)
My first one was just a webcam hanging next to my bed, with a red light for the night (so it wouldn’t bother me so much). The footage moves A LOT. That’s probably me kicking the camera cable, or pushing the flowers themselves, I don’t know. But it does not look good. See for yourself. It ran for only 5 days.
As I’ve done many more, I’ve gotten better at it, and I now have my own time-lapse rig, which I can move around. Have a look at this one (a tiny version of a time-lapse which ran for a couple of months).
Since the idea for a new time-lapse strikes as soon as some plant gets my attention, I don’t want to keep dragging and fixing stuff around (not to mention destroying the Raspberry Pi camera cables again and again). So, I made my time-lapse rig:
The Pi, the IR camera, and the USB hub are hot-glued onto the wood panel, so I just grab the whole thing and position it wherever I want. The GoPro clone is fixed with Blu-Tack, so I can remove it when I need it for something else.
The software I use is MotionEyeOS. It does what I need:
Takes pictures and uploads them to Dropbox.
Mounts the filesystem read-only, so pulling the power without a proper shutdown doesn’t make the machine unusable.
Has a nice web interface for all of it.
This rig has been working really well for a couple of years already. The Blu-Tack is a recent addition: I love that stuff. It’s so useful around the house, even more so than Sugru, as it’s reusable!
An example of a video from before I had my time-lapse rig: the camera was glued to the window, and as the glue gave way, you can see the image drifting. This one ran for about two months:
My next addition will probably be some kind of gorilla tripod with crocodile clips, for more flexibility in positioning, but, for now, keeping the wooden plank on a chair seems to do the trick.
I love birds, and I have always wanted to photograph them. So, I thought years ago: why not bring the birds to me? So I did.
Or tried.
First, I got one of those:
Looks like a nice idea, right? Put out some bird food, look at birds. Everybody wins.
Then I got a ZeroView. It’s a VERY nice package for the Raspberry Pi Zero. It looks like this:
So, with those suckers, the idea was:
Bird feeder outside,
Zeroview inside = WIN!
Except the birds never came. The whole transparent thing scared them off. So I set up the ZeroView with some bird food about one meter from it. I have thousands of bird pictures. Mostly like this:
But then I moved to a new office. I didn’t have the luxury of a concrete wall at a convenient distance from the camera. So, there we go with the transparent house again. What if I hid my side with some plants?
New office, with the transparent bird house on the other side of the window (outside, duh).
From outside. With ZeroView.
See the problem?
No?
The camera is touching the window. And so are the birds.
What do I get? Basically, a bunch of blurred pictures of birds’ butts. When the birds are in a good mood, this is what I get:
But, mostly, it’s out-of-focus butts.
Recently, I changed the seeds from sunflower to a seed mix, and I started getting visits from other birds:
So, I thought I could improve on my camera game.
A bunch of posts online talk about adjusting the focus on the lens. That didn’t work. So I just got one of those:
It has:
A manual focus lens
2 IR LEDs, for in-the-dark pictures,
and an IR CUT. This means I can switch the IR filter back in during the daytime.
The problem
While the ZeroView was an integrated product, this is just a camera, and it sits too far from the glass. I had to build the mount myself.
I needed:
Something to hold the Raspberry Pi and the camera together (the camera cable on the Pi Zero sucks!!!)
Some way to stick it to the window in a fixed position (with so many plants around…)
Something that lets the IR LEDs shine through (so no wood panel)
After some experimentation, I made this:
I screwed the camera into a transparent old CD, and screwed a pi acrylic case into the bottom of the CD. The 2 holes at the top go for the suckers, like this:
THE CODE
This thing runs MotionEye, which makes the motion capture, saving to Dropbox, visualization, etc., trivial. I won’t bother writing code that already exists.
This camera has a differentiator: the IR CUT I mentioned before. What does that mean? You can remove the infrared filter (turning it into a “night vision” camera), or add it back, via software. What does that look like?
You can enable/disable the IR filter via software. It’s explained in my GitHub repo, but the TL;DR is:
Connect the IR-CUT pin to some GPIO (you can see a black cable in the pictures), and note which GPIO you are connecting to.
Set that pin to HIGH or LOW.
Like this:
import RPi.GPIO as GPIO

MODE = "day"
PORT = 16  # BCM number of the GPIO the IR-CUT pin is wired to

if MODE == "day":
    OUTPUT = GPIO.HIGH
    print("Day mode on port", PORT)
else:
    OUTPUT = GPIO.LOW
    print("Night mode on port", PORT)

GPIO.setmode(GPIO.BCM)
GPIO.setwarnings(False)
GPIO.setup(PORT, GPIO.OUT)
GPIO.output(PORT, OUTPUT)
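For a dumb-but-reliable first version, the switch could simply run on a schedule. A sketch, assuming the snippet above is adapted to take the mode as a command-line argument and saved as /home/pi/ircut.py (both hypothetical names):

```shell
# Hypothetical crontab entries: assumes the GPIO snippet reads the mode
# from its first argument and lives at /home/pi/ircut.py.
0 7  * * * python3 /home/pi/ircut.py day    # switch the IR filter in at 07:00
0 19 * * * python3 /home/pi/ircut.py night  # switch it out at 19:00
```

Fixed times drift against the actual sunrise and sunset over the year, which is why something light-based is more appealing.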
Now, I need to write some code to decide WHEN to change from day to night. Any ideas?
I have thought about using machine learning both to identify the birds (I already have 12 thousand pictures for this) and to check light levels and decide when to change the IR setting.
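A simpler, non-ML option would be to decide based on the mean brightness of the most recent capture, with some hysteresis so the filter doesn’t flap back and forth at dusk. A minimal sketch (the thresholds are guesses, not calibrated values):

```python
def decide_mode(mean_brightness, current_mode,
                day_threshold=80, night_threshold=40):
    """Pick "day" or "night" from a 0-255 mean pixel brightness.

    Hysteresis: switch to day only above day_threshold, and to night
    only below night_threshold; anywhere in between, keep the current
    mode so the filter doesn't toggle constantly at dusk.
    """
    if mean_brightness >= day_threshold:
        return "day"
    if mean_brightness <= night_threshold:
        return "night"
    return current_mode


# The mean brightness of the latest frame could come from Pillow, e.g.:
#   from PIL import Image, ImageStat
#   mean = ImageStat.Stat(Image.open("latest.jpg").convert("L")).mean[0]
```

The result would then feed the GPIO snippet above instead of the hardcoded MODE.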