Looking for up-to-date packages for Ubuntu?

You mostly have two choices:
1. switch to ArchLinux…
2. jump into the future with something like the following (for example, to get gcc-7 on Ubuntu 17.04):

cd /etc/apt
sudo cp /etc/apt/sources.list /etc/apt/sources.list_bak
sudo sed -i -- 's/yakkety/artful/g' sources.list
sudo apt-get update
sudo apt-get install -qq gcc-7 g++-7
sudo cp /etc/apt/sources.list_bak /etc/apt/sources.list
sudo apt-get update

I’ll switch to option 1 soon, but I’m using option 2 in the meantime…

git update

I often forget to update submodules when I pull from a git repository.
This leads to hours of headaches…
So I decided to help myself with this handy alias:

git config --global alias.update '!git pull && git submodule update --init --recursive'

Now I just have to type git update to update the repository and all its submodules recursively.

How to receive an RTP stream with python-gst-1.0

A year ago, I explained how to send the Raspberry Pi camera stream over the network to feed Gem through a v4l2loopback device.
Today I wrote a small Python script to receive the same stream (to use it with pupil-labs).
It uses Python 3 (but should work with 2.7 too) and python-gst-1.0.

Here is the script:


#!/usr/bin/python3

# this example shows how to receive, decode and display a RTP h264 stream
# I'm using it to receive stream from Raspberry Pi
# This is the pipeline :
# gst-launch-1.0 -e -vvvv udpsrc port=5000 ! application/x-rtp, payload=96 ! rtpjitterbuffer ! rtph264depay ! avdec_h264 ! fpsdisplaysink sync=false text-overlay=false

import gi
gi.require_version('Gst', '1.0')
gi.require_version('Gtk', '3.0')
from gi.repository import GObject, Gst, Gtk

# Needed for window.get_xid() and xvimagesink.set_window_handle(), respectively:
from gi.repository import GdkX11, GstVideo

GObject.threads_init()
Gst.init(None)

class RTPStream:
    def __init__(self):
        self.window = Gtk.Window()
        self.window.connect('destroy', self.quit)
        self.window.set_default_size(800, 450)

        self.drawingarea = Gtk.DrawingArea()
        self.window.add(self.drawingarea)

        # Create GStreamer pipeline
        self.pipeline = Gst.Pipeline()

        # Create bus to get events from GStreamer pipeline
        self.bus = self.pipeline.get_bus()
        self.bus.add_signal_watch()
        self.bus.connect('message::error', self.on_error)

        # This is needed to make the video output in our DrawingArea:
        self.bus.enable_sync_message_emission()
        self.bus.connect('sync-message::element', self.on_sync_message)

        # Create GStreamer elements
        self.udpsrc = Gst.ElementFactory.make('udpsrc', None)
        self.udpsrc.set_property('port', 5000)
        self.buffer = Gst.ElementFactory.make('rtpjitterbuffer', None)
        self.depay = Gst.ElementFactory.make('rtph264depay', None)
        self.decoder = Gst.ElementFactory.make('avdec_h264', None)
        self.sink = Gst.ElementFactory.make('autovideosink', None)

        # Add elements to the pipeline
        self.pipeline.add(self.udpsrc)
        self.pipeline.add(self.buffer)
        self.pipeline.add(self.depay)
        self.pipeline.add(self.decoder)
        self.pipeline.add(self.sink)

        # Link them: udpsrc -> jitterbuffer (with RTP caps) -> depay -> decoder -> sink
        self.udpsrc.link_filtered(self.buffer, Gst.caps_from_string("application/x-rtp, payload=96"))
        self.buffer.link(self.depay)
        self.depay.link(self.decoder)
        self.decoder.link(self.sink)

    def run(self):
        self.window.show_all()
        # You need to get the XID after window.show_all(). You shouldn't get it
        # in the on_sync_message() handler because threading issues will cause
        # segfaults there.
        self.xid = self.drawingarea.get_property('window').get_xid()
        self.pipeline.set_state(Gst.State.PLAYING)
        Gtk.main()

    def quit(self, window):
        self.pipeline.set_state(Gst.State.NULL)
        Gtk.main_quit()

    def on_sync_message(self, bus, msg):
        if msg.get_structure().get_name() == 'prepare-window-handle':
            print('prepare-window-handle')
            msg.src.set_property('force-aspect-ratio', True)
            msg.src.set_window_handle(self.xid)

    def on_error(self, bus, msg):
        print('on_error():', msg.parse_error())

rtpstream = RTPStream()
rtpstream.run()

How to cross-compile for Raspberry Pi from Ubuntu

If you’re working on a fairly big development project that runs on the RPi, you might be interested in speeding up the build time.
You have several options. For projects that don’t have lots of dependencies, you can use the official Raspberry Pi toolchain.
If you have more dependencies, some good tutorials will lead you to a rootfs or chroot technique.
Basically, the first means having a copy of the RPi’s /lib and /usr folders somewhere locally on your build host.
The second means that you run commands in some kind of sandbox, where they can’t do much harm to your computer and believe that the folder containing the RPi’s usr and lib is the (fake) root.
Those two methods are tedious and can lead to very strange behavior, since some libs inside /usr are linked to binaries inside /lib, and those links break when copying (because the /lib on your system is not the Pi’s one).
Another solution is to emulate the Raspberry Pi inside QEMU, for example, then set up a build environment there. But this could take even longer than building on the Pi itself…
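
For the rootfs variant, the copy step boils down to something like this (a rough sketch; the hostname, destination path and --sysroot usage are assumptions, not my exact setup):

# rough sketch of the rootfs approach: mirror the Pi's libraries locally
mkdir -p ~/rpi-rootfs
rsync -avz --rsync-path="sudo rsync" pi@raspberrypi.local:/lib ~/rpi-rootfs/
rsync -avz --rsync-path="sudo rsync" pi@raspberrypi.local:/usr ~/rpi-rootfs/
# then point the cross compiler at it, e.g. with --sysroot=$HOME/rpi-rootfs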

For the JamomaPureData project, I need to build on Travis-CI to test each commit and detect regressions.
So having a light toolchain is a real need there.
I started with the official toolchain and added the libraries I need.
To do so, I installed all the libraries I need (libxml2-dev, libsndfile-dev and their deps) on my RPi, then I copied them one by one, include and lib, to the toolchain folder.
In the official toolchain, the root folder is arm-bcm2708/gcc-linaro-arm-linux-gnueabihf-raspbian-x64/arm-linux-gnueabihf/libc.
You can also download the packages, then dpkg -x *.deb in the folder; it can be faster, but it may install unneeded files such as manual pages or programs.
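
For example, pulling libxml2 into the sysroot with the dpkg -x shortcut might look like this (package names and the loop are illustrative, not the exact steps I ran):

# illustrative sketch: unpack Debian packages straight into the toolchain sysroot
SYSROOT=arm-bcm2708/gcc-linaro-arm-linux-gnueabihf-raspbian-x64/arm-linux-gnueabihf/libc
apt-get download libxml2 libxml2-dev   # fetch the armhf .deb files (run on the Pi)
for deb in libxml2*.deb; do
    dpkg -x "$deb" "$SYSROOT"          # extract include/ and lib/ into the sysroot
done
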
My toolchain is here: https://github.com/avilleret/tools/tree/Jamoma

Then I can build for Raspberry Pi on Travis-CI.org!

https://travis-ci.org/jamoma/JamomaPureData

How to send MIDI over the network?

For a project, someone asked me to plug a MIDI controller into the network, either to send its parameters to lots of computers or just to increase the distance between the controller and the computer.

To do so, I plugged the controller into a Raspberry Pi, then the RPi into the network. From here I have several options.
The first one is to make a Pd patch that sends all MIDI events over the network. This works and can be cross-platform. The receiver could run on Linux, OSX or even Windows, and then forward MIDI events to other programs if necessary with a platform-specific protocol (IAC Bus on OSX, for example).

But some controllers (like the Novation Launch Control XL I’m using) don’t work with OSS MIDI on Linux, only with ALSA. So you have to connect the device to Pd with aconnect or something else, which can be tedious.
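
For reference, wiring an ALSA client to Pd from the command line looks roughly like this (the client:port numbers are hypothetical; check yours with aconnect -l):

aconnect -l           # list ALSA sequencer clients and their ports
aconnect 20:0 128:0   # hypothetical IDs: controller out -> Pd's MIDI in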

I also found a small command-line program, multimidicast, that creates an ALSA client with several ports and multicasts MIDI events over the network. It works fine on Linux. It is said to work on Windows too, but I couldn’t test that. This is the solution I’m using.

Another solution, if all computers are running Linux, is to use aseqnet, the ALSA sequencer network client/server. It works, but you have to know the name or the IP of the server to connect to.
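
Its basic usage is along these lines (the hostname is a placeholder; the resulting ports still have to be wired up with aconnect):

aseqnet               # on the server: creates a network-backed ALSA client
aseqnet server.local  # on each client: connect to the server by name or IP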

And finally, it should be possible to do the same with Jack. Jack MIDI could be bridged to AppleMIDI (RTP-MIDI), but I can’t find any resources on the internet describing how to build such a setup.

RaspiCam stream to Gem

For a theater show I need to stream the RaspiCam to Gem.
I found some solutions based on VLC. Here is a good comparison of several of them: http://stephane.lavirotte.com/perso/rov/video_streaming.html
VLC is a good choice since there is a VLC backend to play video in Gem, so one can use it to display a network stream in Gem.
But the VLC-based solutions suffer from high latency, around 1 second, a bit too much for me.
So I dug a bit and found this very good article: http://antonsmindstorms.blogspot.nl/2014/12/realtime-video-stream-with-raspberry-pi.html.
And this one, pretty similar: http://blog.tkjelectronics.dk/2013/06/how-to-stream-video-and-audio-from-a-raspberry-pi-with-no-latency/.
But both use gstreamer-1.0, which doesn’t work with Puredata and Gem.
To feed GStreamer into Gem, there are mainly two solutions: v4l2loopback (https://github.com/umlaeute/v4l2loopback) or pdgst (https://github.com/umlaeute/pdgst).
But those are not (yet) working with gst-1.0.
So I found a way to make a gst-0.10 pipeline that decodes the stream and sends it to a v4l2loopback device.
First you need gstreamer-0.10 and v4l2loopback:
sudo apt-get install gstreamer0.10-tools v4l2loopback-dkms
Then enable v4l2loopback with:
sudo modprobe v4l2loopback
After that you should have a new /dev/video* device. For example, on my laptop with an integrated webcam (which is /dev/video0), I get a /dev/video1 device, which is the v4l2loopback device.
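
If you’re not sure which device is the loopback one, listing them helps (v4l2-ctl comes from the v4l-utils package):

v4l2-ctl --list-devices   # shows each driver with its /dev/video* node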

Then you’ll need some FFmpeg modules. FFmpeg is no longer available for Ubuntu since it has been replaced by avconv, and gst-1.0 supports avconv but gst-0.10 does not.
Here you can find some tips to install ffmpeg on Ubuntu 14.04+: https://groups.google.com/forum/#!topic/clementine-player/JnGgRyUEuc4
Note that there is no utopic (14.10) repository, but the trusty (14.04) one works for utopic.

Now here are the gstreamer pipelines I use. On the Pi:
raspivid -t 0 -b 2000000 -fps 60 -w 1280 -h 720 -o - | gst-launch-1.0 -e -vvv fdsrc ! h264parse ! rtph264pay pt=96 config-interval=5 ! udpsink host=10.42.0.1 port=5001
Don’t forget to change the IP address to match your computer’s IP.

And on my laptop:
gst-launch -v udpsrc port=5001 ! application/x-rtp, payload=96 ! rtph264depay ! ffdec_h264 ! ffmpegcolorspace ! v4l2sink device=/dev/video1

Then I can display the stream in Gem with 10-11 frames of latency at 60 Hz, around 166-183 ms. Which is great!

How to change password rules on Raspbian

On recent versions of Raspbian (I think since the January 7th, 2014 release) the password rules have changed and you can no longer use a simple password like `pi`.
To relax this requirement, just change line 25 of /etc/pam.d/common-password: remove the obscure keyword and add minlen=2 (or whatever you want).
The line should look like:
password [success=1 default=ignore] pam_unix.so sha512 minlen=2

Check man pam_unix for more options.

How to backup a Raspberry Pi and restore it on a smaller SD card

If you’re using a Raspberry Pi, you might know the famous command-line utility dd, useful for writing a Raspbian image to a blank SD card (cf. http://elinux.org/RPi_Easy_SD_Card_Setup).

You can also use this tool to back up the whole disk, but it has two drawbacks:

  1. when you copy the whole disk, the image is as big as the disk, even if there is lots of empty space on it.
  2. when you restore the backup, you need a disk at least as big as the original one.

Those two disadvantages led me to find a solution that makes backups smaller and more versatile. The solution I’ll describe here is an adaptation of Ubuntu’s documentation: https://help.ubuntu.com/community/BackupYourSystem/TAR and has been tested on Ubuntu 14.04.
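
The core of it is archiving the filesystem contents instead of imaging the raw disk, something like this (a sketch following that documentation; the mount points and excludes are illustrative):

# archive the filesystem contents instead of the raw disk
sudo tar -cvpzf /media/usb/rpi-backup.tar.gz \
    --exclude=/proc --exclude=/sys --exclude=/media/usb \
    --one-file-system /

# restore onto a freshly formatted (possibly smaller) card mounted at /mnt
sudo tar -xvpzf rpi-backup.tar.gz -C /mnt --numeric-owner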


How to make an RPi boot silent

Here are a few modifications that allow a very quiet boot: nothing will appear on the screen before the login prompt.
So if you start a visual application before that (from /etc/init.d for example), you will not see anything on the screen before your application starts.

First, modify /boot/cmdline.txt like this:
dwc_otg.lpm_enable=0 console=ttyAMA0,115200 kgdboc=ttyAMA0,115200 console=tty3 root=/dev/mmcblk0p2 rootfstype=ext4 elevator=noop rootwait loglevel=3 logo.nologo vt.global_cursor_default=0

Here are the details of the changes:
console=tty3 redirects all the boot messages to the third console (hit CTRL+ALT+F3 to see them after boot).
loglevel=3 makes the kernel less verbose; only errors are reported.
logo.nologo disables the Raspberry Pi logo on boot.
vt.global_cursor_default=0 disables the blinking cursor.

Moreover, you can add disable_splash=1 to /boot/config.txt in order to disable the rainbow splash screen on power-on.

Finally, you can completely disable the prompt on tty1 by editing /etc/inittab and commenting out the following line:
1:2345:respawn:/sbin/getty --noclear 38400 tty1
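
If you prefer a one-liner, something like this should do it (a sketch; double-check the pattern against your own inittab first):

sudo sed -i 's|^1:2345:respawn:/sbin/getty|#&|' /etc/inittab   # prepend # to that line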

That’s all!