Camera
You have several options when choosing how to capture images with a Raspberry Pi. As most models have USB 2.0 ports, you can simply plug in a USB webcam. See https://www.raspberrypi.org/documentation/usage/webcams/ for instructions on using the fswebcam package.
However, a more common choice is one of the dedicated Raspberry Pi Camera Modules that connect directly to the board via a ribbon cable. One advantage of these is that they are compact and fit neatly into many Pi cases, and there are plenty of cases that let you mount the module on the inside, with a lens hole so you can point the whole Pi as if it were a camera body.
We tried the Camera Module v1, which has 5 megapixel resolution.
After you have plugged in the camera module's ribbon cable, you need to make sure the hardware driver is loaded at boot.
Use raspi-config to enable the camera:
- Interfacing Options
- Enable Camera
Otherwise, if you are hand-configuring Raspbian, set the following values using sudo nano /boot/config.txt
# You need more memory for the GPU
# instead of the CPU if you use the camera module
# start_x switches which firmware to use to
# re-address things giving the camera that memory
gpu_mem=128
start_x=1
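If you are not sure whether those settings took effect, you can grep the config file for the two lines. Below is a minimal sketch; the sample file under /tmp is purely for demonstration, and on a real Pi you would point the function at /boot/config.txt instead:

```shell
# check a config file for the camera settings and print a verdict
check_camera_config() {
  if grep -q '^start_x=1' "$1" && grep -q '^gpu_mem=' "$1"; then
    echo "camera enabled"
  else
    echo "camera NOT enabled"
  fi
}

# demo against a throwaway sample file (use /boot/config.txt on a real Pi)
printf 'gpu_mem=128\nstart_x=1\n' > /tmp/sample_config.txt
check_camera_config /tmp/sample_config.txt
```

On a running Pi, `vcgencmd get_camera` is another quick check: it should report supported=1 detected=1 once the camera firmware is active.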
Remember, for webcams you must install the fswebcam package first (sudo apt install fswebcam)
If you are going to be capturing moving images rather than stills, then you are likely to find the Video4Linux2 (v4l2) subsystem much simpler. There is a driver that does the conversion automatically, so make sure that gets loaded.
# driver to convert picam into standard v4l2 camera
# load it manually to test now
sudo modprobe bcm2835-v4l2
# make sure it autoloads every time you boot...
echo "bcm2835-v4l2" | sudo tee -a /etc/modules
# credit http://raspberrypi.stackexchange.com/a/26386
# which /dev/videoX and sound card is which?
ls -laR /dev/v4l
ls -laR /dev/snd
# show input channel numbers
fswebcam -d /dev/video1 --list-inputs
# list available formats
v4l2-ctl --list-formats-ext
# look for audio devices ( -L for more detail)
arecord -l
# check for messages
dmesg
# capture a test image
fswebcam test.jpg
You might find that your microSD card does not have enough space for all the photos and videos you want to take, so you can use a USB flash drive instead.
You might want to use a separate flash drive to store your media anyhow: because you will be doing a lot of writing to the storage device, your system may be more reliable if your data goes onto a different drive from the one your system boots from.
That way, if there is any corruption, at least your system will still boot OK.
Check the details of your drive using sudo blkid,
then add an entry with sudo vi /etc/fstab. Don't forget to create the mount point first (sudo mkdir -p /mnt/myusb).
e.g.
UUID=12345678-abcd-efab-cdef-987654321098 /mnt/myusb ext4 defaults,noatime,nofail 0 0
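If the fstab line looks cryptic, this small sketch splits the example entry above into its fields (the values are the placeholder ones from the example, not a real drive):

```shell
# the six whitespace-separated fstab fields from the example entry
entry='UUID=12345678-abcd-efab-cdef-987654321098 /mnt/myusb ext4 defaults,noatime,nofail 0 0'
echo "$entry" | awk '{
  print "device:     " $1;   # drive identified by UUID from blkid
  print "mountpoint: " $2;   # must exist before mounting
  print "fstype:     " $3;
  print "options:    " $4;   # nofail lets the Pi boot even if the drive is absent
  print "dump/pass:  " $5, $6;
}'
```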
At the default settings, the v1 camera's 5MP images will each take just under 3MB of storage per shot.
If you are taking 7200 frames x 3MB, you will need around 22GB of space on your USB flash drive.
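The storage sum can be sketched in shell, using the frame count and per-shot size from the text above, so you can adjust it for your own capture plan:

```shell
frames=7200       # shots planned for the whole capture
mb_per_shot=3     # roughly 3MB per 5MP JPEG at default settings
total_mb=$((frames * mb_per_shot))
echo "${total_mb} MB needed"   # 21600 MB, i.e. roughly 22GB
```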
If you can ssh to the device to control it, you can use sshfs to mount a remote folder as a drive, e.g.
sshfs myPiServer:/srv/imagefolder /mnt/myPiServer -o defer_permissions
then umount /mnt/myPiServer afterwards to remove it.
A common use-case for the camera is a live stream via a local web server showing you what the camera can see: like a video surveillance camera, or one of those 'outdoors near me' webcams.
First you need to:
- set up the hardware
- enable the camera
- use the video conversion driver
See the beginning of this article for instructions on these steps.
If you have python3 installed you can simply use its built-in webserver via the script:
https://picamera.readthedocs.io/en/latest/recipes2.html#web-streaming
create a file called picamera_streaming.py
and paste in the code
sudo apt install python3-picamera
python3 picamera_streaming.py
now browse to http://mypi:8000/
- picamera python library [https://www.raspberrypi.org/blog/picamera-pure-python-interface-for-camera-module/]
  - example: pi-mation, also using pygame [https://www.gadgetdaily.xyz/raspberry-pi-stop-motion-animation/]
Some people use the motion package simply for streaming. Just install it, restart, and browse to http://picam:8081.
For more details see the section below on motion, or read about the project at https://motion-project.github.io/index.html
The picam project is commonly used for video recording, but you can also use the command-line ffmpeg.
Help - https://trac.ffmpeg.org/wiki/Capture/Webcam#Linux
Use arecord -L
to find device ids.
sudo apt install ffmpeg
# test the sound
ffmpeg -f alsa -i plughw:CARD=U0x46d0x825,DEV=0 -t 3 out.wav
# check the video formats
ffmpeg -f video4linux2 -list_formats all -i /dev/video0
# put them together
ffmpeg -f video4linux2 -input_format mjpeg -video_size 320x240 -i /dev/video0 -f alsa -i plughw:CARD=U0x46d0x825,DEV=0 -t 5 out.mp4
# to split video into chunks - credit https://stackoverflow.com/a/10718056
-f segment -segment_time 60 "xxx-%03d.ts"
# or
-f stream_segment -segment_format mpegts -segment_time 300 -segment_atclocktime 1 -reset_timestamps 1 -strftime 1 %H%M%S.mp4
Options to make things smaller or faster - you'd be surprised how efficient the defaults already are!
- reduce audio quality (bitrate) on output, e.g. aac: -codec:a aac -b:a 64k or -qscale:a 7
- set libx264 to work fast but sacrifice quality for small size (see https://superuser.com/q/490683): -codec:v libx264 -crf 27 -preset ultrafast
- reduce framerate: -framerate 5 (5FPS)
- skip transcoding in the output: -codec:v copy
- use a different input format: -f video4linux2 -input_format mjpeg - but beware, it might simply take longer to decode for subsequent re-encoding!
Examples with timestamp:
ffmpeg -i input -vf "drawtext=fontfile=/usr/share/fonts/TTF/Vera.ttf: text='%{localtime}': x=(w-tw)/2: y=h-(2*lh): fontcolor=white: box=1: boxcolor=0x00000000@1" image%03d.png
ffmpeg -f video4linux2 -i /dev/video0 -s 640x480 -r 30 -vf "drawtext=fontfile=/usr/share/fonts/truetype/ttf-dejavu/DejaVuSans-Bold.ttf: text='\%T': [email protected]: x=7: y=460" -vcodec libx264 -vb 2000k -preset ultrafast -f mp4 output.mp4
Example:
#!/bin/bash
ffmpeg \
-f video4linux2 -input_format mjpeg -video_size 320x240 -framerate 5 -i /dev/video0 \
-f alsa -i plughw:CARD=U0x46d0x825,DEV=0 \
-f stream_segment -segment_format mpegts -segment_time 300 -segment_atclocktime 1 -reset_timestamps 1 -strftime 1 \
-codec:a aac -b:a 64k \
/mnt/mydrive/%H%M%S.mp4
The basics are:
- a Raspberry Pi single board computer - we used the model 2
- a Camera Module or webcam - see hardware above
- power supply - we used a Micro USB cable, a phone charger and a mains extension lead
- USB wifi dongle
- USB flash drive to store the photos (see below)
Initially we plugged in a network cable to configure our wifi router authentication. Then, to avoid keeping a monitor, keyboard or network cable attached, we used a Remote Desktop connection to control the Pi.
Time-lapse photography means the camera must be kept completely still for hours. We used an old selfie-stick to hold the Pi at the angle and position we wanted, and a quick-clamp to keep the selfie stick firmly in place. This was much easier considering we only had the thin USB power cable attached to the Pi.
If you want to focus at half a metre away or less, you may find the single-focus camera unit lacking. A cheap and simple solution is to strap a lens from a cheap pair of reading glasses directly in front of the camera unit. The common +2.00 strength will allow you to capture at around 20cm away.
Essentially you need to work out how long you want the final movie to last, at 24 frames per second. Then, when you compare this to how long the capturing will last, you can decide how much time to wait between frames. If you're not sure, take frames more regularly ... you can always throw away every other frame, or even 4 out of 5, if it would be boring ... at least this way you miss nothing.
- If you want it to last for 1 minute, that's 24*60=1440 frames.
- Let's multiply that by 5 to make sure we capture enough =7200 frames.
- Supposing your overall duration is 24 hours, that's 86,400 seconds (86400000ms).
- 86400/7200 = 12 seconds between frames (12000ms)
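The timing sums above can be scripted, so you can change the target movie length or capture window without redoing the arithmetic by hand. A minimal sketch using the example figures:

```shell
fps=24            # playback frame rate of the final movie
movie_mins=1      # how long the final movie should run
safety=5          # over-capture factor, so you can throw frames away later
capture_hours=24  # how long the capture runs

frames=$((fps * 60 * movie_mins * safety))   # 7200 frames
total_ms=$((capture_hours * 3600 * 1000))    # 86400000 ms
interval_ms=$((total_ms / frames))           # 12000 ms between shots
echo "capture ${frames} frames, one every ${interval_ms}ms"
```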
If there is some kind of set-up that might be interesting to show at the beginning, you might want to capture this at a faster rate, up to a frame per second, before going back to a longer gap between images for the duration of what happens next.
Using the example timings above, in milliseconds of total duration and of gap between frames, your command would be something like:
raspistill -o a%04d.jpg -t 86400000 -tl 12000
See dropping frames below to know how to speed up bits that are too long.
For a simple to understand and easy to follow introduction to time lapse photography see [https://computers.tutsplus.com/tutorials/creating-time-lapse-photography-with-a-raspberry-pi--cms-20794]
It guides you through the steps to:
- plug in the camera module
- use the NOOBS distribution to get the Raspbian operating system
- use raspi-config to enable the camera
- use raspistill to take a test shot
- use screen to run a capture that is resilient to losing connectivity
- combine the captured stills into a movie
Instead of capturing an image and copying (or viewing) the still, you can use raspistill in preview mode with -p (or -f for fullscreen). Note however that the camera board will remain active, and consuming power, for as long as you stream the preview.
- help - [https://www.raspberrypi.org/documentation/usage/camera/raspicam/raspistill.md]
All the techniques shown below can be used on the Raspbian command line. If you prefer a graphical video editor to help you put together your final movie, consider using your PC: plug in the USB flash drive containing the stills, and try OpenShot or other video editing software.
If you want your video to run more quickly you can drop all but every Xth image (e.g. keep 1 out of every 5 to make it run five times as quickly):
ls -d MyStillsFolder/* | awk '{nr++; if (nr % 5 == 0) print $0}' > stills5x.txt
If you want to do a quick check of the overall video, do this at 100x. It will be quick to create and allow you to tweak the options below until they are right.
Then once you slow it back down to 5x you will get a feel for any sections you want to cut out, or even make slower. You can do all this with a stillsChosen.txt file that you put together from the stills5x or other files. Remove lines, or even repeat some to make sections last longer.
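As a quick sanity check of the every-Nth selection, you can try the idiom on some throwaway filenames first (the names and /tmp paths here are made up for the demo):

```shell
# generate ten dummy still names and keep every 5th one
printf 'a%04d.jpg\n' $(seq 1 10) > /tmp/stills.txt
awk 'NR % 5 == 0' /tmp/stills.txt > /tmp/stills5x.txt
cat /tmp/stills5x.txt
# a0005.jpg
# a0010.jpg
```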
When you use mencoder to build the video you can add -vf rotate=1 to turn clockwise, or rotate=2 for anticlockwise. Don't forget you need a comma to separate multiple -vf options.
For Rotate, Crop, Scale, and other -vf options see [http://www.mplayerhq.hu/DOCS/man/en/mplayer.1.html#VIDEO%20FILTERS]
If you want to crop the image, use a photo editor like GIMP to make a rectangular selection and work out the height and width of that selection, then the X and Y of its top-left corner. You should also set the scale to be your height and width. See the following example:
mencoder -nosound -ovc lavc -lavcopts vcodec=mpeg4:vbitrate=8000000 -vf crop=891:1135:694:570,mirror,flip,scale=891:1135 -o OurTimeLapseVideo.avi -mf type=jpeg:fps=24 mf://@stillsChosen.txt
Rather than taking a shot at constant time intervals, you can control when to take each image, and arrange small changes to the subject between each. Then when you put the shots together into a video the subject will appear to change magically all by itself.
Most of the process is as above, except you want manual control over when each shot is taken. Although you could use a remote terminal session to do this, it might be more convenient to have a simple push button. You could attach a button to your Pi's GPIO pins to do this, but leave it connected only by wires, not physically attached in case it moves the camera. For solutions see
- [http://www.makeuseof.com/tag/make-stop-motion-video-rig-raspberry-pi/]
- [https://www.raspberrypi.org/learning/push-button-stop-motion/]
Alternatively, how about some kind of remote control, for instance with your mobile phone browser.
This is simple to install direct from the Raspbian repos. It runs as a daemon (a service that starts automatically on boot). It includes a web interface for interactive use, and also an HTTP API for programmatic use - see [http://www.lavrsen.dk/foswiki/bin/view/Motion/MotionHttpAPI].
It is quite an old package: version 3.2 has been in the repos for years, although there has been recent development work on version 4.0. The current version works well out of the box with USB cameras, and there is a small tweak below to make the picam appear as a standard v4l2 device.
Although it is intended as a security-style program, for capturing shots when motion is detected, you can turn off motion-detection mode and use the snapshot feature to capture stills on demand.
sudo apt-get update
sudo apt-get upgrade
sudo apt-get install motion
# driver to convert picam into standard v4l2 camera
# load it manually to test now
sudo modprobe bcm2835-v4l2
# make sure it autoloads every time
echo "bcm2835-v4l2" | sudo tee -a /etc/modules
# credit http://raspberrypi.stackexchange.com/a/26386
# according to Linux FHS this is the most appropriate place for shots
# as they are continually changing files from an app package
sudo mkdir -p /var/opt/motion
sudo chmod 777 /var/opt/motion
See [https://scottontechnology.com/raspberry-pi-webcam-server-using-motion/] for the simple motion.conf tweaks needed to make it work, but use the folder above: target_dir /var/opt/motion
# credit https://scottontechnology.com/raspberry-pi-webcam-server-using-motion/
# credit http://superuser.com/a/759505
sudo sed -re 's/^(start_motion_daemon)(=)(no)/\1\2yes/' -i.`date -I` /etc/default/motion
sudo cp /etc/motion/motion.conf{,.`date -I`}
# threshold 1500 -> threshold 2147483647
sudo sed -re 's/^(threshold)([[:space:]]*)1500/\1\22147483647/' -i /etc/motion/motion.conf
# daemon off -> daemon on
sudo sed -re 's/^(daemon)([[:space:]]*)off/\1\2on/' -i /etc/motion/motion.conf
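Since sed edits like these are easy to get wrong, it is worth dry-running the substitutions against a sample fragment before touching the real motion.conf (the sample file here is just for demonstration):

```shell
# make a two-line sample with the stock values
printf 'threshold 1500\ndaemon off\n' > /tmp/motion_sample.conf
# apply the same substitutions as above, without sudo, on the sample
sed -re 's/^(threshold)([[:space:]]*)1500/\1\22147483647/' \
    -e 's/^(daemon)([[:space:]]*)off/\1\2on/' \
    -i /tmp/motion_sample.conf
cat /tmp/motion_sample.conf
# threshold 2147483647
# daemon on
```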
# stream_localhost on -> stream_localhost off
### need to recheck what the original defaults are
# to automatically set up config
# framerate 15
# width 640
# height 480
# target_dir /var/opt/motion
# /etc/motion/motion.conf
#
## allow remote web connections
#webcam_localhost off
#control_localhost off
#set threshold so high it will not trigger motion
#threshold 2147483647
#
## in new version
## webcontrol_localhost off
sudo service motion start
# or simply reboot!
To view the camera browse to [http://picam:8081] and to control it browse to [http://picam:8080] for instance [http://picam:8080/0/action/snapshot] will save an image.
If you want you can add a Samba share to your images folder to make them easier to copy from other devices across the network.
- RPi-Cam-Web-Interface - a custom-built interface to run as a service
  - uses a custom RaspiMJPEG executable forked from raspicam
  - based on either apache, nginx or php5 web servers; after setting up the webservice, installation is simple
  - see [http://elinux.org/RPi-Cam-Web-Interface]
- triggercamera_app.py
- python web server specifically for this purpose
- [http://blog.cudmore.io/triggercamera/]
Note: this is simply about an easy way to record frames when you want. If you really want to get serious about your animation, you might prefer a GUI interface that lets you review previous shots to get fluid-looking motion, e.g.:
- qstopmotion package - a C++ / Qt based GUI, cross-OS
  - forked from the original stopmotion package
- stopmotion package - dated but works
  - add packages uv4l uv4l-raspicam uv4l-raspicam-extras
  - credit [http://www.makeuseof.com/tag/make-stop-motion-video-rig-raspberry-pi/]