TimeLapses

One of the possible uses of the Raspberry Pi Camera is creating timelapses, be it for fun, scientific research, area surveillance, souvenirs or any other purpose.

It's quite easy to set up an autonomous RasPi, equipped with a camera, that can run for more than 24h. For example, I attached a Raspberry Pi 3 to a 20000mAh Romoss Sense 6 battery pack (+/- 25€), which gives more than 24h of autonomy while running the timelapse capture code. I can place it anywhere I see fit, start it and leave it there for more than a day, then come back and collect a nice timelapse of the sight it was shooting at!

I agree it's a little puny for an “all terrain” ride, but it has the advantage of being incredibly easy to put together; in fact I cannot think of anything simpler…

Here is a detailed explanation of how to use this setup with the picamera library and ffmpeg in order to create an autonomous timelapse machine.

We'll go through all the steps: writing the required python script, configuring a timelapse service that automatically launches at boot time, saving as much power as possible since we're running on batteries, and optimizing the final video for web streaming. You'll end up with a configuration where you simply have to plug in your RasPi and it will capture timelapse images for as long as the power you supply it lasts.

PiCamera Timelapse Code

Before going further, be sure to have the camera activated and the picamera python library installed on your Raspberry Pi.

Python Script


First we'll create a file containing some python code, using the PiCamera library, that will let us program the Pi to take a snapshot every given number of seconds. It might look like this:

On the RasPi's console

> cd /path/to/my/raspi/code/
> mkdir images
> nano picam_timelapse.py

picam_timelapse.py

import logging
import time
import picamera

# FS CONFIGURATION
path = '/path/to/my/raspi/code/'
pic_dir = 'images/'

# FILE LOGGING
logger = logging.getLogger('timelapse')
hdlr = logging.FileHandler(path + 'timelapse.log')
formatter = logging.Formatter('%(asctime)s %(levelname)s %(message)s')
hdlr.setFormatter(formatter)
logger.addHandler(hdlr) 
logger.setLevel(logging.INFO) # WARNING/DEBUG

# CAMERA CONTROL
with picamera.PiCamera() as camera:
  # configuration
  camera.hflip = True
  camera.vflip = True
  #camera.rotation = 0
  camera.resolution = (1920, 1080)

  # pre-heat
  camera.start_preview()
  time.sleep(2)

  # timelapse sequencing
  for filename in camera.capture_continuous(path + pic_dir + 'pic-{timestamp:%Y%m%d-%H%M%S}.jpg'):
    logger.info('Captured %s' % filename)
    #print('Captured %s' % filename)
    time.sleep(40) # 40 gets us 24h in 1min 30s at 24 fps

Save this code using CTRL-x + y + enter. Then try to run it:

> python picam_timelapse.py

Use CTRL-c to interrupt the script.

You should see jpg files accumulating in the defined /path/to/my/raspi/code/images directory.
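
You can quickly check how many pictures have been captured so far, for example:

On the RasPi's console

> ls /path/to/my/raspi/code/images/ | wc -l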

Script Parameters


FS configuration

Set the location of your script. As the script is potentially going to be used as a service, we need to specify the full path of all file system locations.

File logger

This example logs details to a file. You could remove this code, along with the logger.info('Captured %s' % filename) line (line 31), or change the log level to WARNING (line 15).
It is included here as an example in case you need to log things to a file, since the Raspberry Pi might well run out of power and shut down.

The print statement (line 32) is there to give immediate feedback when the script is run from the console. It has been commented out here since we plan on running this script as a service; you might want to uncomment it during the first manual test runs of the script.

Camera configuration

The configuration options (starting at line 20) are to be set according to your specific situation; you can refer to the PiCamera API reference for a complete description of all available options.

In our example, we deliberately set the resolution to 1920 by 1080 in order to produce a full HD quality movie. You might be interested in learning more about the camera modes and their resolutions. For example, you should be aware that our chosen mode (1920×1080) uses a restricted FoV (field of view) area. This is interesting in that it reduces the size of every picture that is taken, but you might want a wider field of view in some cases.

Shots periodicity

You might want to change the final time.sleep(X) value to match your requirements: you will have to determine the pause to use between each shot. For this example, let's say we intend to have a final timelapse movie, playing at 24fps, that covers a 24h period in 1min30s (90s). The calculation is as follows:

Timelapse covered time (24h) in seconds:
24 * 60 * 60s = 86400s

Total number of images to be captured over this period to generate a 90s movie at 24fps:
90s * 24f/s = 2160f

Time required between shots:
86400s / 2160f = 40s/f

In our case having a 1min30s timelapse movie, at 24fps, covering a 24h period requires taking one picture every 40 sec.
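
The same arithmetic can be done directly in the shell if you want to try other targets (a trivial sketch; COVER, MOVIE and FPS are just placeholder names for the covered period in seconds, the movie duration in seconds and the frame rate):

> COVER=86400 ; MOVIE=90 ; FPS=24
> echo $(( COVER / (MOVIE * FPS) ))
40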

Script Launch at Startup

There are a few different ways to have scripts launched at boot time. It can be done using cron, using auto login and the user's profile file, or, as we'll do here, using systemd.

We already have our Python script from the previous steps; let's say its location is:
/path/to/my/raspi/code/picam_timelapse.py

Create a unit file


Let's create a unit file, which is a configuration file for systemd:

> sudo nano /lib/systemd/system/timelapse.service

/lib/systemd/system/timelapse.service

[Unit]
Description=Timelapse Camera Service
After=multi-user.target

[Service]
Type=idle
ExecStart=/usr/bin/python /path/to/my/raspi/code/picam_timelapse.py
[Install]
WantedBy=multi-user.target

Save using: CTRL-x + y + enter

We have defined a new service called “Timelapse Camera Service” and we are requesting that it be launched once the multi-user environment is available: After= (line 3).

The “Type” is set to idle (line 6), ensuring the ExecStart command is only run when everything else has loaded.

To have the output of the script dumped to a text file, one could change the ExecStart (line 7). Note that systemd does not interpret shell redirections by itself, so the command has to be wrapped in a shell:

/lib/systemd/system/timelapse.service

ExecStart=/bin/sh -c '/usr/bin/python /path/to/my/raspi/code/picam_timelapse.py > /home/pi/timelapse.log 2>&1'

This last option would, of course, be redundant with the Python file logging we already included in our code.
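
Note also that systemd captures the standard output and error of the service in its journal anyway; assuming the unit is named timelapse.service as above, its recent output can be inspected with:

> sudo journalctl -u timelapse.service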

In order to be used, our unit file must have 644 permissions:

> sudo chmod 644 /lib/systemd/system/timelapse.service

Configure systemd


We have the unit (configuration) file ready, now let's enable our new service with systemd:

> sudo systemctl daemon-reload
> sudo systemctl enable timelapse.service

Now, each time your Raspberry Pi boots, the timelapse.service will be launched. You can check the service status with:

> sudo systemctl status timelapse
or
> service timelapse status

and stop or start it manually with:

> sudo service timelapse stop
> sudo service timelapse start

Power Management

A timelapse system will often have to work as a standalone solution, running on batteries. Therefore it's probably worth considering reducing its power consumption. Some possible options to do this are described in the Power Management page of this wiki.

Accessing Recorded Timelapse Images

The above code will generate a number of jpg files that accumulate in the targeted directory. Once a sufficient number of images is available, we can convert them into a movie.

ffmpeg is the tool we'll use to combine all those pictures into a movie.
Unfortunately the ffmpeg package is not immediately available on Raspbian, so you basically have two choices: transfer the images to a workstation that already has ffmpeg installed, or install ffmpeg and process them directly on the Raspberry Pi.

1- Transfer the images to a workstation


In this scenario, all captured timelapse jpg images will first be transferred to a workstation which has ffmpeg installed. Then they'll be converted to mp4 on this workstation.

This is the easiest solution and has the advantage of not using the RasPi's CPU power, which can be of importance for an autonomous system.

There are at least 3 different ways in which the images can be transferred from the Pi to the workstation:


A- Using the SD card


Well, yes, taking the SD card out of the Pi's slot and putting it in a card reader connected to the workstation will allow you to transfer files, as long as the workstation OS is capable of reading ext4 formatted filesystems.

The main drawback of this method is that it obviously requires the Raspberry Pi to be turned off during the operation. It also requires physical access to the Pi to remove and re-insert the SD card.


B- Using SFTP


This second method is quite straightforward, and is probably optimal for manual operations. Although it could be automated, it is more suitable for occasional, one-off interactions with the system, surveillance and maintenance.

Here we consider that all timelapse images are stored on the RasPi's SD card, in /path/to/picamera/storage/images, and that we want to copy them onto our workstation's disk at /path/to/wks/storage/images, all images being grouped in a directory called images.

Getting files from the workstation's console:

Workstation's console

> mkdir /path/to/wks/storage/images
> cd /path/to/wks/storage/
> sftp pi@my.ras.pi.ip
sftp> cd /path/to/picamera/storage
sftp> get -r images

This will initiate the transfer of the whole content of the images directory from the Raspberry Pi to our workstation.

Please note: to use the get -r option, the destination directory must already exist; that is why it is created first in the command lines above.

Of course the operation could just as well be initiated from the RasPi's console, this time using the put -r command (once again, the destination directory must already exist).

Putting files from the RasPi's console:

RasPi's console

> cd /path/to/picamera/storage
> sftp user@my.work.station.ip
sftp> cd /path/to/wks/storage/
sftp> put -r images

C- Using rsync


Although it can be used manually from the command line, this last method is probably best suited for automated processing (see the File Transfer Automation section below). It has the undeniable advantage of reducing bandwidth usage, since rsync only performs a differential transfer of the files.

As with sftp, the rsync command can be issued from the workstation console or from the RasPi itself. Depending on the actual use case, one or the other might be useful.

A one-off rsync can be performed as follows:

Transferring files from the workstation console:

Workstation's console

> rsync -avh --stats --progress -e "ssh -p 22" user@my.ras.pi.ip:/path/to/picamera/storage/images /path/to/wks/storage/

Transferring files from the RasPi console:

RasPi's console

> rsync -avh --stats --progress -e "ssh -p 22" /path/to/picamera/storage/images user@my.work.station.ip:/path/to/wks/storage/

Notes about the rsync options used here:
the -avh option decomposes as follows:

  • -a: is equivalent to --archive and is a shorthand combining -rlptgoD, which amounts to a recursive copy preserving almost everything (permissions, ownership, timestamps…), with the exception of hard links, which would require the -H option.
  • -v: initiates a verbose session
  • -h: outputs byte values in a “human readable” form (KB/MB/GB…)
  • --stats: displays statistics about the file transfer operation
  • --progress: displays progress for each file transfer
  • -e: enables usage of a specific remote shell. Although rsync defaults to ssh, we use it here to illustrate usage of a specific communication port
  • Pay attention to the trailing slashes in the source and destination paths. The source ends without a trailing slash, while the destination ends with one. This means the source directory itself will be transferred inside the destination directory. Adding a trailing slash to the source would instead copy all items inside the source directory to the destination directory without their containing directory (see the example below).
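
To make the trailing slash behaviour concrete, here is the same pull sketched in both forms (using the same hypothetical paths as above):

Workstation's console

# without a trailing slash on the source: pictures end up in /path/to/wks/storage/images/
> rsync -avh -e "ssh -p 22" user@my.ras.pi.ip:/path/to/picamera/storage/images /path/to/wks/storage/
# with a trailing slash on the source: pictures end up directly in /path/to/wks/storage/
> rsync -avh -e "ssh -p 22" user@my.ras.pi.ip:/path/to/picamera/storage/images/ /path/to/wks/storage/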

For a complete reference of the rsync options, refer to the rsync Linux Command Manual.

2- File Transfer Automation
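
One simple way to automate this transfer, assuming the rsync command shown above and that SSH key-based authentication is set up (so rsync can run unattended), is a cron job on the workstation. The hourly schedule below is just an arbitrary example:

Workstation's console

> crontab -e

Then add a line such as:

0 * * * * rsync -a -e "ssh -p 22" user@my.ras.pi.ip:/path/to/picamera/storage/images /path/to/wks/storage/ >> /path/to/wks/storage/rsync.log 2>&1

This pulls the images directory every hour; thanks to rsync's differential transfers, only the new pictures are copied each time.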


3- Process files on the Raspberry Pi

Note: all recommendations hereafter are based on a Minibian install (2016-03-12-jessie); it is supposedly very close to Raspbian, but we haven't had the opportunity (yet) to test whether everything works the same on Raspbian.

A - View image files with FIM

There is a nice piece of software called FIM that allows you to view image files straight from the Linux console!

We'll follow the Download and build instructions from the FIM website:

Download and install FIM

> wget http://download.savannah.nongnu.org/releases/fbi-improved/fim-0.5-rc1.tar.gz
> wget http://download.savannah.nongnu.org/releases/fbi-improved/fim-0.5-rc1.tar.gz.sig
> gpg --search 'dezperado autistici org'
# import the key from a trusted keyserver by following on screen instructions
> gpg --verify fim-0.5-rc1.tar.gz.sig

We'll need a few additional packages to go on:

Install the required packages and build FIM:

> apt-get install gcc g++ build-essential flex bison libreadline-dev libexif-dev
> tar xzf fim-0.5-rc1.tar.gz
> cd fim-0.5-rc1/
> ./configure --disable-readline
> make -j 6
> make install

Notes:

  • although libreadline-dev was installed, we had to use the --disable-readline option to get the configure step to complete
  • we used make -j 6 to take advantage of the 4 cores in the Pi3 (see this)

B - Install ffmpeg on the RasPi

This will require us to install ffmpeg on the Raspberry Pi. The process for doing this is described on jeffreythompson.org's blog.

Once this is done you should be able to execute the ffmpeg commands right on the Raspberry Pi itself.
Be advised that this process uses a lot of CPU power, which is why the first option, transferring the files to a workstation and processing them there, is probably the most efficient one.

But, of course, producing the mp4 file drastically reduces the space usage: for example, a 942-picture timelapse occupying 382.1MB as jpg files is reduced to a 20.2MB mp4 file. So there are obviously use cases where one would want to process the files on the RasPi itself, removing the jpg files after the conversion in order to spare storage space.
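
For example (a sketch only, reusing the ffmpeg command detailed in the next section), the conversion can be run at a low CPU priority with nice, and the jpg files removed only if the conversion succeeded:

RasPi's console

> cd /path/to/my/raspi/code/
> nice -n 19 ffmpeg -f image2 -r 24 -i 'images/%*.jpg' -r 24 -s hd1080 -vcodec libx264 my_timelapse.mp4 && rm images/*.jpg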

Combining Images into a Movie

Now that we have our collection of a few hundred pictures, we naturally want to combine them all into a movie sequence. And, maybe, we'd also like to publish that movie on the Internet…

Generation of a HD Sequence (ffmpeg)


ffmpeg is a command line utility that will do the job of combining a bunch of jpg images into an mp4-encapsulated, h264-encoded movie. In case it is not yet available on your system, you'll need to install it:

> sudo apt-get install ffmpeg

The options used are described on thompsonng.blogspot.be.

So, here is the ffmpeg command line that can be used to perform the conversion:

> cd /path/to/wks/storage/
> ffmpeg -f image2 -r 24 -i 'images/%*.jpg' -r 24 -s hd1080 -vcodec libx264 my_timelapse.mp4

Note that one of the tricky parts in the preceding command line is the file selector parameter 'images/%*.jpg', because using a plain, unquoted * character would be expanded by the shell and ffmpeg would then try to overwrite existing jpg files.

We now have a my_timelapse.mp4 movie which is full HD (1920×1080).

Convert / Modify images before combining


Sometimes it may be suitable to adjust, convert or otherwise modify the images before they are combined into a movie.

ImageMagick offers a set of tools that can be used to manipulate images from the command line. In the following example, we're going to convert all captured images to grayscale (also called desaturating them). To do this, we'll use the ImageMagick convert command. To enhance the output, we'll also make use of the -normalize option.

Here is how to convert a single file:

Single file conversion

> convert my_originals/img01.jpg -colorspace Gray -normalize my_converted/img01.jpg

Here is how to batch convert multiple files at once:

Multiple files conversion

> mkdir -p my_converted
> cd my_originals/
> for f in *.jpg ; do convert "$f" -colorspace Gray -normalize "../my_converted/$f" ; done

Another interesting usage of ImageMagick here would be to “watermark” all images before integrating them into the movie; your timelapses can be “signed” using this method.
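
As a minimal sketch (the caption text, position and point size are arbitrary examples), the same kind of batch loop can stamp a caption in a corner of every image using convert's -annotate option:

Multiple files watermarking

> mkdir -p my_watermarked
> cd my_originals/
> for f in *.jpg ; do convert "$f" -gravity SouthEast -pointsize 36 -fill white -annotate +20+20 'My Timelapse' "../my_watermarked/$f" ; done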

MP4 Optimization for Web Streaming


At this stage, our my_timelapse.mp4 movie is a full HD (1920×1080) movie. Depending on its length, it can easily weigh tens of megabytes, and it is not yet optimized for web streaming. To learn more about optimizing mp4 files for web streaming, read this article written by Billy Hoffman.
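
One typical optimization, shown here only as a sketch (the article above covers this and more), is to move the MP4 metadata to the beginning of the file so that playback can start before the whole file has been downloaded; ffmpeg can do this without re-encoding:

> ffmpeg -i my_timelapse.mp4 -c copy -movflags +faststart my_timelapse_web.mp4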

Sample Output


Using all of the above, and sticking the camera-equipped Raspberry Pi to my bedroom window with the Romoss battery pack charged at 45%, here is what came out as the final timelapse video. Please note that this is the original video, only cut in length (showing a selected 50s sequence) and not optimized for web streaming; you can play it full screen to judge the effective quality (1920×1080):