3d print, Raspberry Pi

Object-Oriented Development… Literally.

In learning how to make things, one skill I always lacked was building my own objects. My skills with wood are pretty precarious, and I never got into creating with cardboard, for example. So I had been at the mercy of finding stuff I could drill and hack into shape, or of buying parts.

No more. I finally got myself a 3d printer.

Not that I didn’t want one before. In fact, I backed a failed Kickstarter years ago and lost my money. And since one never learns, I got into ANOTHER Kickstarter, this one backed by the great maker Naomi Wu. I took the bait and helped fund the project. It worked perfectly, and my printer is right here, chugging along. But I digress.

What I want to talk about is how this changes things.

Of course I printed a cat.
Of course I printed a vase.

But I don’t need more plastic trinkets in my life.

What I mean by change is the ability to develop and iterate on tools, or even download ready-made ones: things that might be hard to find, too niche, or impractical to buy.

I wanted a different cover for a raspberry pi, one which would accommodate a small screen. So I designed it.

Raspberry Pi cover with OLED screen, my first design. Also, 3d-printed screws

Then I realized I needed longer screws.

And then something clicked for me.

I can download screws. I can print screws at home! I can even print a wrench to tighten such screws!

Granted, they suck. But they work. I can download a tool. This is revolutionary.

The first time I tried printing screws. Do try this at home.

Then I wanted to fix something somewhere, and realized that the ideal part for the job would be something called “T hammer nuts”, which look like this:

t hammer nuts

I could buy them on AliExpress and wait weeks for a couple hundred of them, or pay an absurd amount for the same thing on Amazon. But I just needed 4. Why should I buy 200?

So, I just printed them. And as they are made of soft plastic, they are actually better than the metal ones.

my own t-nuts.

The madness continued. The handle of my Roomba vacuum robot broke. A new one costs as much as dinner for two. The solution? Download one for free and print it!

my very own roomba handle. If it breaks, I just print another one.

Most of these designs were downloaded from the internet, except for the very simple flat Raspberry Pi cover, which I designed from scratch. But then I realized I can also simply modify existing designs, iterate over them, and even print fractions of them for testing.

And here I come to what is, in my opinion, the most revolutionary part of this weird enterprise so far:

Doesn’t look like much… And that’s precisely the point.

3D printing is in its infancy. Things are finicky, prone to error, and terribly slow.

The design I’m iterating over takes more than 2 days to print. But I needed to see if a change in the design would work with some magnets. So I separated that part of the design from everything else and printed that, in about 6 minutes.

The little magnet I needed to make sure it would fit.

And it works. After two iterations, I have it fitting perfectly down to the tenth of the millimeter. And I can be sure that this part of the design will work as expected, without the need to print the whole thing again.

This is very close to the design principles of Object-oriented programming (OOP). You are able to design one object in isolation, change its internals without fear of affecting other parts of your code, and keep its interaction with other objects in a controlled way.
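To make the analogy concrete, here is a toy Python sketch (purely illustrative – the class and its dimensions are my own invention, not from any real CAD library): the magnet slot is an object with a fixed interface, and I can keep tweaking its internal tolerances without touching anything that depends on it.

```python
# Illustrative only: a "magnet slot" object. Its internals (the clearance)
# can change freely between iterations, while the interface the rest of
# the design relies on – hole_diameter() – stays fixed.

class MagnetSlot:
    def __init__(self, magnet_diameter_mm, clearance_mm=0.1):
        self._magnet_diameter = magnet_diameter_mm
        self._clearance = clearance_mm  # internal detail, free to iterate on

    def hole_diameter(self):
        """The only value the surrounding design needs to know."""
        return round(self._magnet_diameter + self._clearance, 3)

print(MagnetSlot(6.0).hole_diameter())                     # 6.1
print(MagnetSlot(6.0, clearance_mm=0.05).hole_diameter())  # 6.05
```

Tightening the fit is a one-line change to the clearance, and the rest of the “program” (the design) never notices.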

All of this is incredible when applied to the design of physical objects, and having just dipped into the possibilities of it, I’m still trying to understand how this will change design.



How to access ESP32, ESP8266, or MicroPython through the terminal (no need for Arduino Serial monitor) – Program in MicroPython

This is very useful when you install MicroPython on your ESP32 and want to play around with it directly, no Arduino Serial Monitor needed.

Installing MicroPython on it was simple. I downloaded the “Generic-Spiram” firmware from the MicroPython website, which in this case was called “esp32spiram-idf3-20201111-unstable-v1.13-157-gd7e152659.bin”.

I needed to install a library so Python 3 could access the serial port on my Mac, and then use the esptool I mentioned in a post from 2018.

# Install python3 serial Library
pip3 install pyserial
# Clear the ESP32 firmware, so you can copy micropython to it
python3 esptool.py --port /dev/cu.SLAB_USBtoUART erase_flash 
# Install micropython
python3 esptool.py --port /dev/cu.SLAB_USBtoUART write_flash 0x1000 ~/Downloads/esp32spiram-idf3-20201111-unstable-v1.13-157-gd7e152659.bin

After your ESP32 has received MicroPython, it’s a good idea to reset it. I just remove the microusb cable and put it back.

To get a Python prompt, run:

screen /dev/cu.SLAB_USBtoUART 115200

That’s it.

MicroPython running on a TTGO T-Beam ESP32 868Mhz with GPS NEO-6M, SMA LORA 32 18650 Battery Holder
LilyGo TTGO T-Beam ESP32

To program it in Python, you can either paste code on the terminal that saves a file to the device (not cool), or use PyMakr. PyMakr is an extension for Visual Studio Code or Atom – the latter is the one I use. It’s like a Python IDE plus file manager. I created a new folder with a “boot.py” file in it, and opened it in Atom with “open project directory”. PyMakr simply uploads all the files from the project directory to the microcontroller and reboots it. So I will add the LoRa and GPS libraries there later.

# boot.py

ssid_ = "MyWifi"
wpa2_pass = "Mypass"

def do_connect():
    import network
    sta_if = network.WLAN(network.STA_IF)
    sta_if.active(True)
    if not sta_if.isconnected():
        print('connecting to network...')
        sta_if.connect(ssid_, wpa2_pass)
        while not sta_if.isconnected():
            pass
    print('network config:', sta_if.ifconfig())

do_connect()



Update on my bird window camera

It’s been working for some days, so I have about a thousand bird pictures a day.

After adjusting the focus and fixing the dynamic dns, these are some results.

When the birds feel photogenic, I have this:

I like the bird’s iris on this one.

But a good number of them are just like this:

I’ve seen more bird butt closeups this week than ever in my life.

Next steps:

  • Remove everything and wash the windows
  • Use some sugru to isolate the camera from the infrared reflection
  • Reduce the detection settings – too many pictures!

Some insight:

  • The only species visiting these days is the Great Tit (Parus major). There are 5 of them here. Sometimes I see them fighting for food.
  • Today one entered the office, but left.
  • They can hover like hummingbirds!
  • Nothing comes in the night. I was hoping for a squirrel or a night bird. Nothing.
  • The IR lights make a LOT of reflection. This is an early morning shot:

Extra bird pic:

I made a timelapse of today! Try with half speed!

linux, Photography, Raspberry Pi, Time lapse, Unix

My Time-Lapse Rig

I have done a fair share of time-lapse videos in the past. Some lasting for months, some only for days. The most essential part of a time-lapse video is: DON’T TOUCH ANYTHING. (This is harder than it seems).

My first one was just a webcam hanging next to my bed, with a red light at night (so it wouldn’t bother me so much), and it moves A LOT. That’s probably me kicking the camera cable, or pushing the flowers themselves, I don’t know. Either way, it doesn’t look so good. See for yourself. It ran for only 5 days.

November, 2006. Wow, I was already doing this stuff 14 years ago!

As I’ve done many more, I got better at it, and now I have my own time-lapse rig, which I can move around. Have a look at this one (a tiny version of a time-lapse that ran for a couple of months).

A more recent one. Notice the vase moving to the left on the first 2 seconds.

As the idea for a new time-lapse comes as soon as some plant gets my attention, I don’t want to keep dragging and re-fixing stuff around (not to mention destroying the Raspberry Pi camera cables again and again). So I made my time-lapse rig:

Time-lapse rig. A raspberry Pi with MotionEyeOS, one IR camera, and one GoPro clone, for wide-angle and daylight pictures.

The Pi, the IR camera, and the USB hub are hot-glued to the wood panel, so I just grab the whole thing and position it wherever I want. The GoPro clone is fixed with Blu-Tack, so I can remove it when I need it for something else.

The rig in action with some box to keep it steady

The software I use is MotionEyeOS. It does what I need:

  • Takes pictures and uploads them to Dropbox.
  • Mounts the filesystem read-only, so unplugging it without shutting down doesn’t make the machine unusable.
  • Has a nice web interface for all of it.
MotionEyeOS on the phone.

This rig has been working really well for a couple of years already. The Blu-Tack is a recent addition: I love that stuff. It’s so useful around the house, even more so than Sugru, as it’s reusable!

An example of a video from before I had my time-lapse rig: the camera was glued to the window, and as the glue faded, you can see the image moving. This ran for about two months:

Long standing video. It’s hard to keep the camera in place for months.

My next addition will probably be some kind of gorilla tripod with crocodile clips, for more flexibility in positioning, but for now, keeping the wooden plank on a chair seems to do the trick.


My bird camera with IR cut

I love birds, and I always wanted to photograph them. So I thought, years ago: why not bring the birds to me? So I did.

Or tried.

First, I got one of those:

Transparent bird feeder. Birds see you and avoid food.

Looks like a nice idea, right? Put in some feed, watch the birds. Everybody wins.

Then I got a ZeroView. It’s a VERY nice mount for the Raspberry Pi Zero. It looks like this:

ZeroView with a Raspberry Pi Zero.

So, with those suckers, the idea was:

  • Bird feeder outside,
  • Zeroview inside = WIN!

Except the birds never came. The whole transparent thing scared them away. So I set up the ZeroView with a bird feeder one meter in front of it. Now I have thousands of bird pictures. Mostly like this:

A Great Tit. That’s the bird’s name. Not this one, the species. This one is called Zé.
A Tit (the bird), and a red woodpecker.

But then I moved to a new office. I no longer had the luxury of a concrete wall at a convenient distance from the camera. So, there we go with the transparent house again. What if I hide my side with some plants?

Look outside of the window.

New office, with the transparent bird house on the other side of the window (outside, duh).

Now, look inside.

From outside. With ZeroView.

See the problem?


The camera is touching the window. And so are the birds.

What do I get? Basically, a bunch of blurred pictures of birds’ butts. When the birds are in a good mood, this is what I get:

A Great Tit. Not Zé, this one is Eurípedes.

But, mostly, it’s out-of-focus butts.

Eurípedes’ butt.

Recently, I changed the seeds, from sunflower to a seed mix. And I started getting new visits from other birds:

Another Woodpecker. This one is called Woody Allen.

So, I thought I could improve on my camera game.

A bunch of ideas online talk about changing the focus of the lens. That didn’t work. So I just got one of these:

IR-Cut camera. Notice on the right side of the camera, a small pin between the screws. This can be connected to the GPIO

It has:

  • A manual-focus lens,
  • 2 IR LEDs, for in-the-dark pictures,
  • and an IR cut, meaning I can switch the IR filter in during the daytime.

The problem

While the ZeroView was an integrated product, this is just a camera, and it sits too far from the glass. I had to do it myself.

I needed:

  • Something to hold the Raspberry Pi and the camera together (the camera cable on the Pi Zero sucks!!!)
  • Some way to stick it firmly to the window (with so many plants around…)
  • It had to let the IR LEDs shine through (so no wood panel)

After some experimentation, I made this:

Pi, IR-cut camera, transparent CD. Notice the black cable coming out of the GPIOs (compare with the picture of the camera alone)

I screwed the camera into a transparent old CD, and screwed a pi acrylic case into the bottom of the CD. The 2 holes at the top go for the suckers, like this:

Rear view from the contraption glued to the window. Obviously I’m missing power here. Note the cable between raspberry pi’s GPIO and the camera.


This thing runs MotionEye. It makes all the motion capture, saving to Dropbox, visualization, etc., trivial. I won’t bother writing code that already exists.

This camera has a differentiator: the IR cut I mentioned before. What does that mean? It means you can remove the infrared filter (making it a “night vision” camera) or add it back, via software. What does that look like?

With the IR filter, it looks like a normal daylight camera. Sorry for the picture.
Same camera, without the IR filter. It sees infrared. Good to see in the dark, terrible at this time of the day.

One can enable/disable the IR filter via software. It’s explained in my GitHub repo. But the TL;DR is:

  • Connect the camera’s extra pin to some GPIO (that’s the black cable in the pictures), and note which GPIO you connected it to,
  • Set that pin to HIGH or LOW.

Like this:

import RPi.GPIO as GPIO
PORT = 16      # the GPIO connected to the camera's IR-cut pin
MODE = "day"   # or "night"
GPIO.setmode(GPIO.BCM)
GPIO.setup(PORT, GPIO.OUT)
if MODE == "day":
    print("Day mode on port", PORT)
    GPIO.output(PORT, GPIO.HIGH)  # filter in
else:
    print("Night mode on port", PORT)
    GPIO.output(PORT, GPIO.LOW)   # filter out

Now, I need to write some code to decide WHEN to change from day to night. Any ideas?

I thought about using Machine Learning both to identify the birds (I have 12 thousand pictures from this already) and to check light levels and decide when to change the IR settings.
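One cheap idea for the light-level part – a sketch I haven’t wired into anything yet, with thresholds and function names that are entirely my own assumptions: sample the average brightness of the latest capture and switch modes with some hysteresis, so the filter doesn’t flap back and forth at dusk.

```python
# Hypothetical sketch: decide day/night from average frame brightness,
# with hysteresis so the IR filter doesn't toggle repeatedly at dusk.

DAY_THRESHOLD = 120    # 0-255 grayscale; tune for your camera and window
NIGHT_THRESHOLD = 60

def next_mode(current_mode, pixels):
    """pixels: iterable of 0-255 grayscale values from the latest capture."""
    pixels = list(pixels)
    brightness = sum(pixels) / len(pixels)
    if current_mode == "night" and brightness > DAY_THRESHOLD:
        return "day"    # bright enough: put the IR filter back in
    if current_mode == "day" and brightness < NIGHT_THRESHOLD:
        return "night"  # dark enough: remove the filter
    return current_mode

print(next_mode("night", [200] * 100))  # day
print(next_mode("day", [90] * 100))     # day (inside the hysteresis band)
```

A cron job could feed this the latest snapshot’s pixels and then flip the GPIO accordingly.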

No one reads my posts, but I like writing.


Deep Sleep on ESP-01: Don’t bother (or do)!

Since the chip inside of an ESP-01 is the same as all the ESP8266s around, it does support deep sleep.

Except it doesn’t.

One needs to solder a wire directly to a pin on the processor to do so. Check this picture:

I haven’t done it, and since I have like 15 of those around, I predict I won’t. Ever.

This is nice, from the user iotamajig at Instructables. Have a look at https://www.instructables.com/id/Enable-DeepSleep-on-an-ESP8266-01/


ESP-01 and Ds18b20: watch the voltage!

I got one of those ESP-01s with a DS18B20 (Dallas) sensor from AliExpress for about 2 euros. The sellers say they accept anywhere from 3.3V up to 12V. The ESP-01 works perfectly, but the temperature was always off by a few degrees.

I use them with the incredible ESPHome on HomeAssistant. I’ll write more about them in another post.

ESP-01 with DS18B20. Cheap, but I still don’t trust it.

Given it has been working for months, I always thought that the thermometer was shit, and was looking at ways to offset it via software. But then I realized that the power supply of my breadboard was set to 5V.

I changed it to 3.3v, and this is what happened:

Temperature with 5V was constantly around 29ºC, but the place is always around 20. It’s closer to reality now.

As soon as I changed the power to 3.3V, the temperature immediately fell 4ºC. The extra degree this morning seems to be my girlfriend closing the door of the lab (which makes it colder).

In conclusion: the ESP-01 will probably work with any voltage, but don’t trust temperature readings from it (the room is around 20º, so even at 3.3V, it’s still off).


Time-Lapse Assembler on the Mac post-Mojave, 2019

Every time I had a sequence of images and I wanted to create a time-lapse video, I used an old tool called TimeLapse Assembler, version 1.5.3.

It worked fine, until Mojave.

Thing is, the developer abandoned the project around 2012, and Mac OS evolved and removed old frameworks. That means that it stopped working.

The author left a command-line version of it on GitHub, and it was clear why it stopped working: The libraries were all outdated. This nice soul fixed it, and here it is: Time Lapse Assembler For the Command Line.

To clone, it’s the usual:

git clone https://github.com/wadetregaskis/cocoa-tlassemble.git
cd cocoa-tlassemble


cocoa-tlassemble/tlassemble --sort name *.png outputFile.mov --codec "JPEG"

For some weird reason, the author sorts the files by date by default, so you have to use the “--sort name” setting if you have, for example, renders from a render farm.


ESP-01 with Thermometers/Voltage regulators: Fix for “Failed to read from DHT sensor!”

Adafruit’s library is broken.

Use it, but replace DHT.cpp with this one: https://raw.githubusercontent.com/adams13x13/DHT-sensor-library/761f4f4fb94c01afc283d0ffbfbfdeed5bb18a44/DHT.cpp

With that, I’m able to use this code to read from my devices:

#include <ESP8266WiFi.h>
#include <WiFiClient.h>
#include <ESP8266WebServer.h>
#include <DHT.h>
#define DHTTYPE DHT11
#define DHTPIN  2
// Replace with your network details
const char* ssid     = "Mywifi";
const char* password = "mypassword";

ESP8266WebServer server(80);
// Initialize DHT sensor
// NOTE: For working with a faster than ATmega328p 16 MHz Arduino chip, like an ESP8266,
// you need to increase the threshold for cycle counts considered a 1 or 0.
// You can do this by passing a 3rd parameter for this threshold.  It's a bit
// of fiddling to find the right value, but in general the faster the CPU the
// higher the value.  The default for a 16mhz AVR is a value of 6.  For an
// Arduino Due that runs at 84mhz a value of 30 works.
// This is for the ESP8266 processor on ESP-01
DHT dht(DHTPIN, DHTTYPE, 11); // 11 works fine for ESP8266
float humidity, temp;    // Values read from sensor
String webString = "";   // String to display
unsigned long previousMillis = 0;  // will store the last time temp was read
const long interval = 2000;        // interval at which to read the sensor

void gettemperature();   // forward declaration

void setup(void) {
  Serial.begin(115200);  // Serial connection from ESP-01 via 3.3v console cable
  dht.begin();           // initialize temperature sensor

  // Connect to WiFi network
  WiFi.begin(ssid, password);
  Serial.print("\n\r \n\rWorking to connect");

  // Wait for connection
  while (WiFi.status() != WL_CONNECTED) {
    delay(500);
    Serial.print(".");
  }
  Serial.println("");
  Serial.println("DHT Weather Reading Server");
  Serial.print("Connected to ");
  Serial.println(ssid);
  Serial.print("IP address: ");
  Serial.println(WiFi.localIP());

  server.on("/", [](){
    gettemperature();       // read sensor
    webString = "Temperature: " + String((int)temp) + " C Humidity: " + String((int)humidity) + "%";
    server.send(200, "text/plain", webString);
  });

  server.begin();
  Serial.println("HTTP server started");
}

void loop(void) {
  server.handleClient();
}

void gettemperature() {
  unsigned long currentMillis = millis();
  if (currentMillis - previousMillis >= interval) {
    // save the last time you read the sensor
    previousMillis = currentMillis;

    // Reading temperature or humidity takes about 250 milliseconds!
    // Sensor readings may also be up to 2 seconds 'old' (it's a very slow sensor)
    humidity = dht.readHumidity();      // Read humidity (percent)
    temp = dht.readTemperature(false);  // Read temperature as CELSIUS
    // Check if any reads failed and exit early (to try again).
    if (isnan(humidity) || isnan(temp)) {
      Serial.println("Failed to read from DHT sensor!");
      return;
    }
  }
}

After that, I was able to run it on both of my setups, including this ESP-01 with a DHT11 (running on 5V from the breadboard):

Problem is, those thermometers are terribly inaccurate! My lab is around 22 degrees. What do both sensors show?

So, yeah, that’s it for those things. I will search for something better.


Syncthing, Raspberry Pi and SD cards’ life.

I have been using the great Syncthing for my backups, after I finally gave up on Resilio Sync – mainly because its memory consumption was unbearable on the Raspberry Pi I use as a backup server, together with an external hard drive.

It works just as well – if not better – it’s WAY lighter on resources, and it’s open source.

Those synchronization services have a problem, though. They keep many files for indexing their data. Millions of them.

And those files are stored at your home directory.

Which is on the SD card.

Got it already? If not, I’ll tell you: this will kill your SD card in no time. To solve it, move your ~/.config/syncthing to the external hard disk and make a symlink from the original location.

Something like

mv ~/.config/syncthing /media/externalDisk/
ln -s /media/externalDisk/syncthing ~/.config/syncthing

The same applies to swap space. Raspberry Pis don’t usually have a separate swap partition; by default they use dphys-swapfile, which maps swap to a file instead.

Given that you’re using a LOT of RAM with Syncthing, your machine will swap. So move the swap to the external hard drive as well: edit /etc/dphys-swapfile and point CONF_SWAPFILE to a file on the external disk. It will make your SD card live WAY longer.
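For reference, a sketch of the steps, assuming the disk is mounted at /media/externalDisk (adjust the path to your own mount point):

```shell
# Move swap off the SD card. The mount point is an assumption: adjust to your setup.
sudo dphys-swapfile swapoff
# Point CONF_SWAPFILE in /etc/dphys-swapfile at the external disk
sudo sed -i 's|^#\?CONF_SWAPFILE=.*|CONF_SWAPFILE=/media/externalDisk/swapfile|' /etc/dphys-swapfile
sudo dphys-swapfile setup    # (re)creates the swap file at the new location
sudo dphys-swapfile swapon
```

A reboot afterwards doesn’t hurt, and `swapon --show` should then list the file on the external disk.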