
Arducam 16MP Autofocus

Recently, Arducam came out with a new, awesome camera (a 16 MP module built around the IMX519 sensor). It’s much better than the Raspberry Pi “High Quality Camera”, and it has autofocus!

Even better, you can buy it together with a pan-tilt platform.

Seems like a much better proposition than the standard camera.

But obviously not everything is perfect. Getting this one to run with OpenCV and Python is a VERY involved process, which can be a bit daunting.

First: Download this version of Raspbian and DO NOT UPDATE THE KERNEL: https://downloads.raspberrypi.org/raspios_arm64/images/raspios_arm64-2022-01-28/2022-01-28-raspios-bullseye-arm64.zip

Install Raspberry Pi OS normally. Inside it:

sudo apt install libssl-dev \
     flex \
     bison \
     libgstreamer1.0-dev \
     libgstreamer-plugins-base1.0-dev \
     libgstreamer-plugins-bad1.0-dev \
     gstreamer1.0-plugins-ugly \
     gstreamer1.0-tools \
     gstreamer1.0-gl \
     gstreamer1.0-gtk3 

wget -O install_pivariety_pkgs.sh https://github.com/ArduCAM/Arducam-Pivariety-V4L2-Driver/releases/download/install_script/install_pivariety_pkgs.sh
chmod +x install_pivariety_pkgs.sh
./install_pivariety_pkgs.sh -p libcamera_dev
./install_pivariety_pkgs.sh -p libcamera_apps
./install_pivariety_pkgs.sh -p imx519_kernel_driver

Here it will reboot.

After reboot,

git clone https://github.com/umlaeute/v4l2loopback.git # I am using the tag 0.12.5
cd v4l2loopback
make clean && make
sudo make install
sudo depmod -a

cd

git clone git://linuxtv.org/libcamera.git
sudo modprobe v4l2loopback video_nr=3
cd libcamera  # not needed if libcamera was installed from the deb packages above
export GST_PLUGIN_PATH=$(pwd)/build/src/gstreamer  # not needed if libcamera was installed from the deb packages above
gst-launch-1.0 libcamerasrc ! 'video/x-raw,width=1920,height=1080' ! videoconvert ! tee ! v4l2sink device=/dev/video3

This command keeps running and blocks its terminal, so you need a second terminal to do anything else.

For Python:

cd
python3 -m venv --system-site-packages py3-cv2
source py3-cv2/bin/activate
pip install opencv-contrib-python

For every Python script, one should first activate that virtual environment with a source ~/py3-cv2/bin/activate, so that OpenCV is found. I prefer not to install random stuff into ~/.local.
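
To check that the whole chain works, here is a minimal OpenCV sketch reading from the loopback device (assuming the gst-launch pipeline above is still running in the first terminal):

import cv2

# /dev/video3 is the v4l2loopback device the gst-launch pipeline writes to
cap = cv2.VideoCapture(3)
if not cap.isOpened():
    raise RuntimeError("Could not open /dev/video3 -- is the gst pipeline running?")

ret, frame = cap.read()
if ret:
    print("Got a frame:", frame.shape)  # expect (1080, 1920, 3)
    cv2.imwrite("test.jpg", frame)
cap.release()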

I hope they release proper drivers for it instead of this.

Turns out that using libcamerify (a wrapper shipped with libcamera that preloads its V4L2 compatibility layer, e.g. libcamerify python3 my_script.py) is easier than all of this.


Upgraded Pan-Tilt Platform Kit for Raspberry Pi Camera

TLDR: The C code for interfacing with I2C devices bypasses the kernel module; the Python interface needs it enabled, though.

A while ago, Arducam ran a Kickstarter campaign for a much better camera than the ones from Raspberry Pi, with the added bonus of autofocus! Link.
During the campaign, they were also promoting their Upgraded Camera Pan Tilt Platform for Raspberry Pi and Nvidia Jetson. I decided to get one.

Upgraded Pan-Tilt Platform. The logo and the shiny molded plastic reminds me of Nintendo.

Sure enough, it has a git repo. And here one thing got my attention: the Raspberry Pi example is in C, while the one for Nvidia platforms is in Python.

The C code worked out of the box!

But something else stood out. The code is not only not Jetson-specific, its dependencies are not even for single-board computers, but for much simpler microcontrollers running CircuitPython from Adafruit!

Sure enough, I should just follow the library’s installation steps and this should work on the Raspberry Pi out of the box, right? Right?

Well, no.

I kept having this error:

ValueError: No Hardware I2C on (scl,sda)=(3, 2)
Valid I2C ports: ((1, 3, 2), (0, 1, 0), (10, 45, 44))

Seems like I2C is not there.

Installing i2c-tools didn’t help either. I kept running into weird errors, like this:

$ i2cdetect -y 1
Error: Could not open file `/dev/i2c-1' or `/dev/i2c/1': No such file or directory

Sure enough, I went to Adafruit’s page about I2C on the Raspberry Pi. It asks you to enable I2C support in the kernel.

Which OF COURSE WAS ENABLED, AS I HAVE BEEN RUNNING THE C CODE!

Well, not quite. It was not enabled. Once I enabled it, everything worked out of the box.

Kids, remember to enable I2C in raspi-config! I even submitted a pull request to the manufacturer’s git repo, to at least make sure people read about this.
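
For the record, once I2C is enabled (raspi-config → Interface Options → I2C), a quick sanity check from Python with the same Adafruit Blinka stack the pan-tilt code uses looks roughly like this:

import board
import busio

# This opens /dev/i2c-1 under the hood -- exactly the device that was
# missing before I2C was enabled in raspi-config
i2c = busio.I2C(board.SCL, board.SDA)
while not i2c.try_lock():
    pass
print("I2C addresses found:", [hex(addr) for addr in i2c.scan()])
i2c.unlock()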

Now I need to solve the mess that is Raspberry Pi OS 64 bits with cameras in Python and OpenCV.


Object-Oriented Development… Literally.

In my journey of learning how to make things, one skill I always missed was building my own objects. My skills with wood are pretty precarious, and I never got into creating with cardboard, for example. So I had been at the mercy of finding stuff I could drill and hack into shape, or of buying parts.

No more. I finally got myself a 3d printer.

Not that I didn’t want one before. In fact, I backed a failed Kickstarter years ago, and lost my money. And as one never learns, I got into ANOTHER Kickstarter, this one backed by the great maker Naomi Wu. I took the bait and helped fund the project. It worked perfectly, and my printer is right here, chugging along. But I digress.

What I want to talk about is how this changes things.

Of course I printed a cat.
Of course I printed a vase.

Sure, I printed a cat, and a vase.

But I don’t need more plastic trinkets in my life.

What I mean by change is the ability to develop, iterate, or even download ready-made tools and things which might be difficult to find, too niche, or impractical to buy.

I wanted a different cover for a raspberry pi, one which would accommodate a small screen. So I designed it.

Raspberry Pi cover with OLED screen, my first design. Also, 3d-printed screws

Then I realized I needed longer screws.

And then something clicked in my head.

I can download screws. I can print screws at home! I can even print a wrench to tighten such screws!

Granted, they suck. But they work. I can download a tool. This is revolutionary.

The first time I tried printing screws. Do try this at home.

Then I wanted to fix something somewhere, and realized that the ideal tool to do so would be something called “T hammer nuts”, which look like this:

t hammer nuts

I could buy those things on AliExpress and wait weeks for a couple hundred of them, or pay an absurd amount for the same thing on Amazon, but I just needed 4 of them. Why should I buy 200?

So, I just printed them. And as they are made of soft plastic, they are actually better than the metal ones.

my own t-nuts.

The madness continued. The handle of my Roomba vacuum robot had broken, and a new one costs as much as a dinner for two. The solution? Download one for free and print it!

my very own roomba handle. If it breaks, I just print another one.

Most of these designs were downloaded from the internet, except for the very simple flat Raspberry Pi cover, which I designed from scratch. But then I realized I can also simply modify existing designs, iterate over them, and even print fractions of them for testing.

And here I come to what is, in my opinion, the most revolutionary part of this weird enterprise so far:

Doesn’t look like much… And that’s precisely the point.

3D printing is in its infancy. Things are finicky, prone to error, and terribly slow.

The design I’m iterating over takes more than 2 days to print. But I needed to see if a change in the design would work with some magnets. So I separated that part of the design from everything else and printed that, in about 6 minutes.

The little magnet I needed to make sure it would fit.

And it works. After two iterations, I have it fitting perfectly, down to a tenth of a millimeter. And I can be sure that this part of the design will work as expected, without needing to print the whole thing again.

This is very close to the design principles of Object-oriented programming (OOP). You are able to design one object in isolation, change its internals without fear of affecting other parts of your code, and keep its interaction with other objects in a controlled way.

All of this is incredible when applied to the design of physical objects, and having just dipped into the possibilities of it, I’m still trying to understand how this will change design.

#mindblown


How to access ESP32, ESP8266, or MicroPython through the terminal (no need for Arduino Serial monitor) – Program in MicroPython

This is very useful when you install MicroPython on your ESP32 and want to play around with it directly: no Arduino Serial Monitor needed. The whole trick is a single screen command, shown below after the setup.

Installing MicroPython on it was simple. I downloaded the “Generic-Spiram” firmware from the MicroPython website, which in this case was called “esp32spiram-idf3-20201111-unstable-v1.13-157-gd7e152659.bin”.

I needed to install a library so Python 3 could access the serial port on my Mac, and then use the esptool I mentioned in a post from 2018 (nowadays a pip3 install esptool also works).

# Install python3 serial Library
pip3 install pyserial
# Clear the ESP32 firmware, so you can copy micropython to it
python3 esptool.py --port /dev/cu.SLAB_USBtoUART erase_flash 
# Install micropython
python3 esptool.py --port /dev/cu.SLAB_USBtoUART write_flash 0x1000 ~/Downloads/esp32spiram-idf3-20201111-unstable-v1.13-157-gd7e152659.bin

After your ESP32 has received MicroPython, it’s a good idea to reset it. I just remove the micro-USB cable and put it back.

To get a Python prompt, do:

screen /dev/cu.SLAB_USBtoUART 115200

That’s it. (To leave screen, press Ctrl-A and then K.)
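
Bonus: once you are at the REPL, you can also reboot the board in software instead of replugging the cable:

import machine
machine.reset()  # same effect as toggling power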

MicroPython running on a TTGO T-Beam ESP32 868Mhz with GPS NEO-6M, SMA LORA 32 18650 Battery Holder
LilyGo TTGO T-Beam ESP32

To program it in Python, you can either paste code into the terminal that writes files to the flash (not cool), or use PyMakr. PyMakr is an extension for Visual Studio Code or Atom – the latter being the one I use. It’s like a Python IDE + file manager. I created a new folder with a “boot.py” file in it, and opened it in Atom with “open project directory”. PyMakr simply uploads all the files from the project directory to the microcontroller and reboots it. So, I will add the LoRa and GPS libraries there later.

# boot.py -- runs automatically on every boot, before main.py

ssid_ = "MyWifi"
wpa2_pass = "Mypass"


def do_connect():
    import network
    # Station interface: connects the ESP32 to an existing access point
    sta_if = network.WLAN(network.STA_IF)
    if not sta_if.isconnected():
        print('connecting to network...')
        sta_if.active(True)
        sta_if.connect(ssid_, wpa2_pass)
        # Busy-wait until the network is up and we have an IP
        while not sta_if.isconnected():
            pass
    print('network config:', sta_if.ifconfig())


do_connect()


Update on my bird window camera

It’s been working for some days now, and I get about a thousand bird pictures a day.

After adjusting the focus and fixing the dynamic dns, these are some results.

When the birds feel photogenic, I have this:

I like the bird’s iris on this one.

But a good number of them are just like this:

I’ve seen more bird butt closeups this week than ever in my life.

Next steps:

  • Remove everything and wash the windows
  • Use some Sugru to shield the camera from the infrared reflection
  • Reduce the detection settings – too many pictures!

Some insight:

  • The only species visiting these days is the Great Tit (Parus major). There are 5 of them here. Sometimes I see them fighting for food.
  • Today one entered the office, but left.
  • They can hover like hummingbirds!
  • Nothing comes in the night. I was hoping for a squirrel or a night bird. Nothing.
  • The IR lights make a LOT of reflection. This is an early morning shot:

Extra bird pic:

I made a timelapse of today! Try with half speed!


My Time-Lapse Rig

I have done my fair share of time-lapse videos in the past. Some lasted for months, some only for days. The most essential part of a time-lapse video is: DON’T TOUCH ANYTHING. (This is harder than it seems.)

My first one was just a webcam hanging next to my bed, with a red light for the night (so it wouldn’t bother me so much), and it moves A LOT. That’s probably me kicking the camera cable, or pushing the flowers themselves, I don’t know. But it does not look so good. See for yourself. It ran for only 5 days.

November, 2006. Wow, I was already doing this stuff 14 years ago!

As I’ve done many more, I got better at it, and now I have my own time-lapse rig, which I can move around. Have a look at this one (this is a tiny version of a time-lapse which ran for a couple of months).

A more recent one. Notice the vase moving to the left on the first 2 seconds.

As the idea of a new time-lapse comes as soon as some plant gets my attention, I don’t want to keep dragging and fixing stuff around (not to mention destroying the raspberry pi camera cables again and again). So, I made my time-lapse rig:

Time-lapse rig. A Raspberry Pi with MotionEyeOS, one IR camera, and one GoPro clone for wide-angle and daylight pictures.

The Pi, the IR camera, and the USB hub are hot-glued onto the wooden panel, so I just grab the whole thing and position it wherever I want. The GoPro clone is fixed with Blu-Tack, so I can remove it when I need it for something else.

The rig in action with some box to keep it steady

The software I use is MotionEyeOS. It does what I need:

  • Takes pictures and uploads them to Dropbox.
  • Mounts the filesystem read-only, so unplugging it without shutting down doesn’t make the machine unusable.
  • Has a nice web interface for all of it.
MotionEyeOS on the phone.

This rig has been working really well for a couple of years already. The Blu-Tack is a recent addition: I love that stuff. It’s so useful around the house, even more so than Sugru, as it’s reusable!

An example of a video from before I had my time-lapse rig: the camera was glued to the window, and as the glue let go, you can see the image moving. This one ran for about two months:

Long standing video. It’s hard to keep the camera in place for months.

My next addition will probably be some kind of gorilla tripod with crocodile clips, so I have more flexibility in positioning, but, for now, keeping the wooden plank on a chair seems to do the trick.


My bird camera with IR cut

I love birds, and I always wanted to photograph them. So I thought, years ago: why not bring the birds to me? So I did.

Or tried.

First, I got one of those:

Transparent bird feeder. Birds see you and avoid food.

Looks like a nice idea, right? Put some feed in, look at birds. Everybody wins.

Then I got a ZeroView. It’s a VERY nice package for the Raspberry Pi Zero. It looks like this:

ZeroView with a Raspberry Pi Zero.

So, with those suckers, the idea was:

  • Bird feeder outside,
  • Zeroview inside = WIN!

Except the birds never came. The whole transparent thing scared them away. So I put up the ZeroView, with a bird feeder one meter from it. I have thousands of bird pictures. Mostly like this:

A Great Tit. That’s the bird’s name. Not this one, the species. This one is called Zé.
A Tit (the bird), and a red woodpecker.

But then I moved to a new office. I didn’t have the luxury of a concrete wall at a convenient distance from the camera anymore. So, there we go with the transparent house again. What if I hide my side with some plants?

Look outside of the window.

New office, with the transparent bird house on the other side of the window (outside, duh).

Now, look inside.

From outside. With ZeroView.

See the problem?

No?

The camera is touching the window. And so are the birds.

What do I get? Basically, a bunch of blurred pictures of birds’ butts. When the birds are in a good mood, this is what I get:

A Great Tit. Not Zé, this one is Eurípedes.

But, mostly, it’s out-of-focus butts.

Eurípedes’ butt.

Recently, I changed the seeds, from sunflower to a seed mix. And I started getting new visits from other birds:

Another Woodpecker. This one is called Woody Allen.

So, I thought I could improve my camera game.

A bunch of ideas online talk about adjusting the focus of the lens. That didn’t work. So I just got one of these:

Waveshare IR-Cut camera (5 MP OV5647 sensor). Notice, on the right side of the camera, the small pin between the screws. This can be connected to the GPIO.

It has:

  • A manual-focus lens,
  • 2 IR LEDs, for in-the-dark pictures,
  • and an IR-cut filter. This means I can put the IR filter back in during daytime.

The problem

While the ZeroView was an integrated product, this is just a camera, and it sits too far from the glass. I had to do it myself.

I needed:

  • Something to hold the Raspberry Pi and the camera together (the camera cable on the Pi Zero sucks!!!)
  • Some way to stick it to the window in a fixed position (with so many plants around…)
  • Something that would let the IR LEDs shine through (so no wooden panel)

After some experimentation, I made this:

Pi, IR-cut camera, transparent CD. Notice the black cable coming out of the GPIOs (compare with the picture of the camera alone).

I screwed the camera onto an old transparent CD, and screwed an acrylic Pi case onto the bottom of the CD. The two holes at the top take the suction cups, like this:

Rear view of the contraption stuck to the window. Obviously, power is still missing here. Note the cable between the Raspberry Pi’s GPIO and the camera.

THE CODE

This thing runs MotionEye. It makes all the motion capture, saving to Dropbox, visualization, etc., trivial. I won’t bother writing code that already exists.

This camera has a differentiator: the IR-cut filter I mentioned before. What does that mean? It means that you can remove the infrared filter (making it a “night vision” camera), or add it back, via software. What does that look like?

With the IR filter, it looks like a normal daylight camera. Sorry for the picture.
Same camera, without the IR filter. It sees infrared. Good to see in the dark, terrible at this time of the day.

One can enable/disable the IR filter via software. It’s explained in my GitHub repo, but the TLDR is:

  • Connect the pin to some GPIO (that’s the black cable mentioned in the pictures), and note which GPIO you connected it to;
  • Set this pin to HIGH or LOW.

Like this:

import RPi.GPIO as GPIO

MODE = "day"
PORT = 16  # BCM number of the GPIO wired to the camera's IR-cut pin

if MODE == "day":
    OUTPUT = GPIO.HIGH  # filter in: normal daylight colors
    print("Day mode on port", PORT)
else:
    OUTPUT = GPIO.LOW  # filter out: the camera sees infrared
    print("Night mode on port", PORT)

GPIO.setmode(GPIO.BCM)
GPIO.setwarnings(False)
GPIO.setup(PORT, GPIO.OUT)
GPIO.output(PORT, OUTPUT)

Now, I need to write some code to decide WHEN to change from day to night. Any ideas?

I thought about using Machine Learning both to identify the birds (I have 12 thousand pictures from this already) and to check light levels and decide when to change the IR settings.
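
A simpler, non-ML idea I might try first: grab the latest snapshot every few minutes, measure its average brightness, and flip the filter when it crosses a threshold. A rough sketch (the snapshot path and both thresholds are placeholders I would still have to tune):

import time

import RPi.GPIO as GPIO
from PIL import Image, ImageStat

PORT = 16             # same GPIO as above
DAY_THRESHOLD = 90    # mean brightness (0-255) above which it's "day"
NIGHT_THRESHOLD = 40  # lower threshold, so it doesn't flap at dusk

GPIO.setmode(GPIO.BCM)
GPIO.setwarnings(False)
GPIO.setup(PORT, GPIO.OUT)

while True:
    # Hypothetical path: point this at wherever your setup saves the latest frame
    img = Image.open("/tmp/latest-snapshot.jpg").convert("L")
    brightness = ImageStat.Stat(img).mean[0]
    if brightness > DAY_THRESHOLD:
        GPIO.output(PORT, GPIO.HIGH)  # day mode: IR filter in
    elif brightness < NIGHT_THRESHOLD:
        GPIO.output(PORT, GPIO.LOW)   # night mode: IR filter out
    time.sleep(300)                   # check every 5 minutes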

No one reads my posts, but I like writing.


Deep Sleep on ESP-01: Don’t bother (or do)!

Since the chip inside an ESP-01 is the same as in all the other ESP8266 boards around, it does support deep sleep.

Except it doesn’t.

One needs to explicitly solder a wire from GPIO16 on the chip to the reset pin to do so. Check this picture:

I haven’t done it, and since I have like 15 of those around, I predict I won’t. Ever.

This is a nice one, from the user iotamajig at Instructables. Have a look at https://www.instructables.com/id/Enable-DeepSleep-on-an-ESP8266-01/
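
For reference, if you do make the mod (GPIO16 wired to RST), a timed deep sleep from MicroPython looks like this (straight from the official ESP8266 docs; without the mod, the chip never wakes itself up):

import machine

# configure RTC.ALARM0 so it can wake the device
rtc = machine.RTC()
rtc.irq(trigger=rtc.ALARM0, wake=machine.DEEPSLEEP)

# set the alarm to fire in 10 seconds; with GPIO16 wired to RST,
# this pulls the reset line and the chip boots again
rtc.alarm(rtc.ALARM0, 10000)

machine.deepsleep()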


ESP-01 and Ds18b20: watch the voltage!

I got one of those ESP-01s with a DS18B20 (Dallas) temperature sensor board from AliExpress, for about 2 euros or so. The sellers say it can take anywhere from 3.3 V up to 12 V. The ESP-01 works perfectly, but the temperature was always off by a few degrees.

I use them with the incredible ESPHome on Home Assistant. I’ll write more about that in another post.

ESP-01 with DS18B20. Cheap, but I still don’t trust it.

Given it had been working for months, I always thought that the thermometer was just bad, and was looking for ways to offset the readings in software. But then I realized that the power supply of my breadboard was set to 5 V.

I changed it to 3.3 V, and this is what happened:

Temperature with 5V was constantly around 29ºC, but the place is always around 20. It’s closer to reality now.

As soon as I changed the power to 3.3 V, the temperature immediately fell 4ºC. The extra degree this morning seems to be my girlfriend closing the door of the lab (which makes it colder).

In conclusion: the ESP-01 will probably work at any of those voltages, but don’t trust the temperature readings unless it’s running at 3.3 V (the room is around 20º, so even at 3.3 V it’s still a bit off).
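
I drive these with ESPHome, but if you ever want to sanity-check a DS18B20 by hand, MicroPython’s bundled onewire/ds18x20 drivers make it a quick test (a sketch; GPIO2 is the usual free pin on an ESP-01):

import time
import machine
import onewire
import ds18x20

ow = onewire.OneWire(machine.Pin(2))  # DS18B20 data line on GPIO2
ds = ds18x20.DS18X20(ow)
roms = ds.scan()                      # find all sensors on the bus
ds.convert_temp()                     # start a temperature conversion
time.sleep_ms(750)                    # a 12-bit conversion takes up to 750 ms
for rom in roms:
    print(ds.read_temp(rom))          # degrees Celsius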


Time-Lapse Assembler on the Mac post-Mojave, 2019

Every time I had a sequence of images and I wanted to create a time-lapse video, I used an old tool called TimeLapse Assembler, version 1.5.3.

It worked fine, until Mojave.

Thing is, the developer abandoned the project around 2012, and macOS evolved and removed old frameworks. That means it stopped working.

The author left a command-line version of it on GitHub, and it was clear why it stopped working: The libraries were all outdated. This nice soul fixed it, and here it is: Time Lapse Assembler For the Command Line.

To clone, it’s the usual:

git clone https://github.com/wadetregaskis/cocoa-tlassemble.git
cd cocoa-tlassemble
make

Usage:

cocoa-tlassemble/tlassemble --sort name *.png outputFile.mov --codec "JPEG"

For some weird reason, the tool sorts the files by date by default, so you have to use the "--sort name" option if you have renders from a render farm, for example.
