Sunday, October 6, 2013

Video Streaming on the BeagleBone Black - Part 1?

Getting a good lightweight video stream off of the Beaglebone Black has been a struggle, but I think I've found a way to do it at practically no CPU cost. After all, what fun is a badass robotic platform if you can't get realtime video off of it?

More seriously, debugging sensor inputs and then getting a view augmented with a data overlay to an operator would probably speed things along. When I was building the first mobile platform I spent a lot of time watching it pan its Sharp IR sensor around, wondering what it saw. I had some output to a 'radar' style display on a 1.8" TFT, but it was hard to read such a small screen while standing behind the robot as it drove along.

The first thing I thought of was ffmpeg. What a gong show. I could get ffmpeg to write video from a USB webcam to a file, but I couldn't get ffserver to actually stream anything to a browser or VLC. Yuck. I also tried jsmpeg, which needed an updated node.js as well. A fun adventure with no big errors, but it never displayed video in the in-browser JS client. Fail.

My quest was to find a better way to do this; the goal being a container on a web page either showing a streamed video feed from the BBB or pseudo-streaming by showing a series of still jpegs.

Finally, I came across MJPEG-Streamer. Works like a charm.

Here is the method, generally a mix of two Raspberry Pi write-ups, so all credit to their authors for documenting the Pi-specific steps.

I figured that if it worked on the Pi, and didn't use any h.264 hardware acceleration that isn't on the Beaglebone, I'd be fine.

In the commands that are apt-centric I substituted opkg (I'm on Angstrom), and because I had just done a whole shwack of install/compile/curse/compile-again with ffmpeg, I think I already had the majority of the libraries installed or updated.

So, basically: prepare the environment by either running svn, or just grabbing the source code with wget. My source was mjpeg-streamer-code-181, which might be a fork of the original.

cd <somewhere handy, like your beaglebone sd card>
mkdir mjpg
cd mjpg

Oh, BTW, it won't compile yet; don't try until you trick it. This is a total hack, and will one day be known as a Bad Thing, but for now it works:

ln -s /usr/include/linux/videodev2.h /usr/include/linux/videodev.h
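If you want the hack to be safe to re-run, here's a minimal sketch; the `link_videodev` helper and the scratch-directory demo are mine, not from the build instructions (on the BBB the include dir would be /usr/include/linux):

```shell
#!/bin/sh
# Hypothetical helper: create the videodev.h -> videodev2.h symlink in a
# given include dir only if videodev.h is missing, so running it twice
# doesn't clobber anything.
link_videodev() {
    dir="$1"
    if [ ! -e "$dir/videodev.h" ]; then
        ln -s "$dir/videodev2.h" "$dir/videodev.h" && echo "linked"
    else
        echo "already present"
    fi
}

# Demo against a scratch directory so it's safe to run anywhere.
demo=$(mktemp -d)
touch "$demo/videodev2.h"
link_videodev "$demo"   # creates the link
link_videodev "$demo"   # second run is a no-op
```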

You might need to update the libs used in the compile; interestingly, I couldn't get libjpeg8-dev as per the recommended steps, but it still works... that could be down to all the prior updates done for ffmpeg. I did this, but it was already current:

opkg install imagemagick

Now do the make and install steps:

cd mjpg-streamer-code-181/mjpg-streamer
make mjpg_streamer
cp mjpg_streamer /usr/local/bin
cp *.so /usr/local/lib/
cp -R www /your/www
make DESTDIR=/usr install

That last step glosses over the fact that, in my prior adventures, I'd installed and run lighttpd and set its conf file to bind to port 81; that way I still get the default BBB web interface on port 80. My www root is something like /mnt/usb2/www/pages, so that's where the client files go. A web page with the img tag for the stream is all that's needed:

<img src="http://rover:8085/?action=stream" width="320">
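For completeness, the whole page can be as small as this (rover and port 8085 are my hostname and port; swap in yours):

```html
<!DOCTYPE html>
<html>
  <head><title>Rover cam</title></head>
  <body>
    <!-- MJPEG stream served by mjpg_streamer's HTTP output -->
    <img src="http://rover:8085/?action=stream" width="320">
  </body>
</html>
```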

Then fire up the process that outputs a stream from the /dev/video0 feed. This is like a combined ffserver/ffmpeg setup, where you'd have ffmpeg sending data to ffserver and clients connecting to ffserver. The only difference, besides taking practically zero CPU, is that it's just one command:

mjpg_streamer -i "/usr/lib/input_uvc.so -d /dev/video0 -r 320x240 -f 5" -o "/usr/lib/output_http.so -p 8085 -w /var/www/mjpg_streamer"

On startup it does complain a dozen times or so about 'Inappropriate ioctl for device' with references to hardware that isn't in this particular webcam (pan, tilt, focus, LED, etc). No big deal.

I added the recommended startup script as well, so it should always be running along with everything else.
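I didn't save the exact startup file, but a sketch of what such a systemd unit would look like; the service name and paths here are illustrative, matching the install locations above:

```
[Unit]
Description=MJPG-Streamer webcam stream
After=network.target

[Service]
ExecStart=/usr/local/bin/mjpg_streamer -i "/usr/lib/input_uvc.so -d /dev/video0 -r 320x240 -f 5" -o "/usr/lib/output_http.so -p 8085 -w /var/www/mjpg_streamer"
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

Drop it in /lib/systemd/system/ and enable it with systemctl, same as the other services below.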

An interesting thing about lag/latency is that it seems tail-ended... the faster the framerate is (I use 5 or 10) the less lag there is. When I first ran it with -f 1 the lag was at least 3-4 seconds... setting it to 10 made it lag only a fraction of a second. Nice.

Note that nowhere did I use the raspistill command, since this isn't an rPi; that's why I mentioned those two different articles at the top of this post. It's sort of mix-and-match-and-pray, and it works!

This isn't what I was intending to do, but for now it's fine. The missing bit is any kind of text overlay. I had installed the freetype library for Go, and was about to start down the path of OpenCV for Go, so I could grab an image, timestamp it, add any sensor graphic overlays, and then spit it back out to the stream. I might still do that, but for now there isn't a good reason to. I'm seeing 2.3%-2.6% CPU used by the mjpg_streamer process while serving video, so I'm happy.

Oh, one final thing. As I was watching the output of top in the console I couldn't believe how much crap-ola was running to support the desktop environment. Since I'm not plugging this thing into an HDMI display, but logging in via ssh, let's fix that:

systemctl disable gdm.service
systemctl stop gdm.service

Friday, October 4, 2013

Whoops. Fresh start!

For some reason, when I power cycled the Beaglebone it wasn't getting a DHCP address, and I couldn't connect to it. Worse, I never did get the 'networked USB' thing to work, ever, so I was locked out! Arrgh.

So I thought it would be good for my own reference to reinstall before I got too far along and record the steps this time.

Base Angstrom Install

First off, I loaded an SD card with the .img file and instructions from the Angstrom site.

Basically, put the .img file on the SD card, hit the reset button, then power up and wait. Go make a sandwich or something. On reboot it picked up its old IP address from my DHCP server, and I could log in via ssh right away. Nice.

Then, being clever, I decided to do this:

opkg update
opkg upgrade

Whoops again... this time I was getting all kinds of 'no space left on device' errors. Aggrrrh, again.

A quick check with Google found this page:

I'm not sure whether I had this issue the first time and just didn't notice, but the workaround for the bug is to point opkg at a temp directory on disk instead of RAM:

opkg -t ~ update
opkg -t ~ upgrade

Sit back, enjoy a coffee and light reading while opkg -t ~ upgrade chugs along. It takes a while. Then you get this:

Configuring florence.
Configuring e2fsprogs-mke2fs.
Configuring angstrom-packagegroup-gnome.
WARNING: could not open /lib/modules/3.8.13/modules.order: No such file or directory
WARNING: could not open /lib/modules/3.8.13/modules.builtin: No such file or directory
WARNING: could not open /lib/modules/3.8.13/modules.order: No such file or directory
WARNING: could not open /lib/modules/3.8.13/modules.builtin: No such file or directory
Collected errors:
 * pkg_run_script: package "bonescript" postinst script returned status 1.
 * opkg_configure: bonescript.postinst returned 1.

All indications I've been able to find are that these are indeed warnings, and can be ignored. Moving on...


I like to have useful timestamps on things, so set the time to pull from NTP. The BBB doesn't have an RTC, but I'll be adding at least one, possibly two, time sources. One is the Adafruit DS1307 RTC kit, which I'd built for a previous project and know works; it doesn't have a reference time source, but it's OK for most applications. The other is time decoded from GPS, again from Adafruit: the Ultimate GPS Logger Shield kit. It's an Arduino shield, but it's happy to put serial data to the BBB, which I tested at the console level via some jumper wires, so at least I know that works.

But while it's sitting on my desk I need a good time source, so let's configure NTP, the Network Time Protocol. Start with the basics. Since NTP is in the base Angstrom image we just loaded, we don't have to update it, but we do have to configure it. The instructions here are a good start and have additional info that might be useful:

Start with:

opkg install ntp

This also sets ntpd to start at boot.

There is some light config required, depending on your location. I'm in Canada, so we have a couple of time servers that are publicly available. Add those to /etc/ntp.conf
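For example, the Canadian zone of the public NTP pool works well here (any nearby pool zone would do); these lines go in /etc/ntp.conf:

```
# Canadian public NTP pool servers
server 0.ca.pool.ntp.org
server 1.ca.pool.ntp.org
server 2.ca.pool.ntp.org
```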

The timezone is a file; /etc/localtime, so delete that file and link to the correct file in /usr/share/zoneinfo (MST, in my case):

root@beaglebone:/etc# rm localtime
root@beaglebone:/etc# ln -s /usr/share/zoneinfo/MST /etc/localtime

There are a few tweaks to the service file in Derek Molloy's page, seems reasonable...

Remove this line from /lib/systemd/system/ntpdate.service:

ExecStart=/usr/bin/ntpdate-sync silent

and add these two lines:

ExecStart=/usr/bin/ntpd -q -g -x
ExecStart=/sbin/hwclock --systohc
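If I'm reading the stock unit right, the [Service] section ends up looking roughly like this; note that Type=oneshot is what permits multiple ExecStart lines in systemd, so check your actual unit header before trusting this sketch:

```
[Service]
Type=oneshot
ExecStart=/usr/bin/ntpd -q -g -x
ExecStart=/sbin/hwclock --systohc
```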

Before starting the services, issue the ntpdate command against your ntp server to test it, and set the initial time:

root@beaglebone:/etc# ntpdate <your ntp server>
 4 Oct 10:36:57 ntpdate[5558]: step time server offset 37881.192694 sec

You should be good to now enable those services:

root@beaglebone:/etc# systemctl enable ntpdate.service
root@beaglebone:/etc# systemctl enable ntpd.service

While you're poking around in /etc you might also want to edit resolv.conf and point it to a reasonable nameserver, or edit your network config if you don't want DHCP.

Now is probably a good time for a reboot, to see if it's all working...
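resolv.conf itself is just nameserver lines; for example (8.8.8.8 is Google's public resolver, purely as a stand-in for whatever suits your network):

```
# /etc/resolv.conf
nameserver 8.8.8.8
```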


I had audio working nicely via a generic 7-port powered USB hub to a Sound Blaster Play! (model SB1140). It worked first time, on the last install. Here is the detection info, first with nothing plugged in, then with just the hub (the SB USB audio device comes below):

root@beaglebone:/# lsusb
Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 002 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
root@beaglebone:/# lsusb
Bus 001 Device 002: ID 0409:005a NEC Corp. HighSpeed Hub
Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 002 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 001 Device 003: ID 0409:005a NEC Corp. HighSpeed Hub

And aplay reports nothing interesting; audio is going out the HDMI port:

root@beaglebone:/etc# aplay -L
null
  Discard all samples (playback) or generate zero samples (capture)
default:CARD=Black
  TI BeagleBone Black,
  Default Audio Device
sysdefault:CARD=Black
  TI BeagleBone Black,
  Default Audio Device

After plugging in the usb audio module (and rebooting, I'm not certain there is stable hotplug support in this kernel...) I get this in lsusb:

root@beaglebone:~# lsusb
Bus 001 Device 002: ID 0409:005a NEC Corp. HighSpeed Hub
Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 002 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 001 Device 003: ID 0409:005a NEC Corp. HighSpeed Hub
Bus 001 Device 004: ID 041e:30d3 Creative Technology, Ltd Sound Blaster Play!

Cool. Now let's check aplay...

root@beaglebone:~# aplay -L
null
  Discard all samples (playback) or generate zero samples (capture)
default:CARD=Black
  TI BeagleBone Black,
  Default Audio Device
sysdefault:CARD=Black
  TI BeagleBone Black,
  Default Audio Device
default:CARD=U0x41e0x30d3
  USB Device 0x41e:0x30d3, USB Audio
  Default Audio Device
sysdefault:CARD=U0x41e0x30d3
  USB Device 0x41e:0x30d3, USB Audio
  Default Audio Device

Yup, that's it. Should be working. Let's see if we get something out of ffmpeg...
First, let's mount a USB stick with an mp3 on it...

root@beaglebone:/dev# mount /dev/sda1 /mnt/usb
root@beaglebone:/dev# cd /mnt/usb
root@beaglebone:/mnt/usb# ls
wagner2.mp3
root@beaglebone:/mnt/usb#
And play the file...
ffmpeg -re -i /mnt/usb/wagner2.mp3 -vol 40 -f alsa "default:CARD=U0x41e0x30d3"

Whoo hoo! Smells like victory...
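Since that long card name is easy to fat-finger, picking it out of the aplay -L listing can be scripted. A sketch: the pick_usb_card helper is mine, and the demo feeds it a captured listing rather than live aplay output, so it's safe to run anywhere:

```shell
#!/bin/sh
# Hypothetical helper: pull the first USB "default:CARD=..." name out of
# an `aplay -L` listing so commands don't hard-code U0x41e0x30d3.
pick_usb_card() {
    grep '^default:CARD=U' | head -n 1
}

# Demo against a captured listing instead of live `aplay -L` output.
listing='null
default:CARD=Black
default:CARD=U0x41e0x30d3'
card=$(printf '%s\n' "$listing" | pick_usb_card)
echo "$card"   # prints default:CARD=U0x41e0x30d3
```

On the BBB you'd feed it the real listing (aplay -L | pick_usb_card) and splice $card into the ffmpeg command.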
Now let's grab some tars from Festvox, scp them to the Beaglebone, and compile...


The other thing I had tested on OSX was speech synthesis, from Festival (actually from Festvox, the North American mirror), and it worked great on my iMac.

I started with scp'ing the tar files over to the BBB and extracting them in /usr/src/festival. Note that I'm doing everything here on the BBB flash storage; it does have its limits for number of writes, but this shouldn't be an issue... yet.

There is one directory, speech_tools, that seems to want to compile separately, so I cd'd there and did a ./configure and a make on it first, before making festival. It seemed OK, then kinda crapped out at the end, as if the source code was broken.

I decided to go back to festival and make that, since there could be some kind of circular dependency going on... nope, very similar error, almost like a file was missing.

So no Festival, at least not now; 'make' just croaks. Maybe if I have time I'll see if I can cross-compile it from OSX. Worst case, I'll have to pre-generate the audio samples as mp3s and store those on the USB key. Or use the sound library from Portal... all the voices (GLaDOS, turrets, and spheres) sound similar enough to use together. Except maybe Rick.


Go has great documentation, but for whatever reason the path structure is kinda... odd. Well, not odd as in 'this is stupid' but odd as in 'this makes sense once you understand it', since Go libraries get their path structure from the repo they were fetched from, and this can vary per user.

First, I grabbed the latest version (Go 1.2) via wget, which worked.
I then found a suggestion here:

with a plausible install-from-source method... install a bunch of timezone files, and then run a bash script to bootstrap the tools. The tests run at the end of the script, and at 1 GHz they take forever...!

Tuesday, October 1, 2013

And the winner is...


By a long shot.

I wasn't sure if a total sandbox environment was the right thing to do, but when I stepped back and looked at what I needed, it was apparent there were actually two systems that needed developing: the low-level routines in the hardware abstraction layer, and the higher-order routines that call them. Rather than fight with yet another shim between them, I decided to just use Go.

After looking at the architecture a bit closer, it was also apparent that with just a few channels and a few support functions I could pretty much get what I wanted anyway.

I then started the job of coding a framework, and realized just how big a job this is... but if I stick to the 'keep it simple, stupid' mantra it shouldn't get out of hand.

Now to take a short break and get the USB audio working on the Beaglebone Black, so the rover can blare out 'Ride of the Valkyries' as it roves along...