How to build a BeagleBoard-based WiFi robot

As often happens with software developers, we were working hard on implementing various features but did not spend enough time documenting the result :-) . Consequently, there was not much documentation or detail about our project available. With this post we would like to announce the availability of more detailed project documentation and give a brief overview of some interesting aspects of the project. All the available documentation can be found on our website and on the project Wiki. As usual :) it is not complete yet, but it is hopefully good enough to understand how everything works and even to rebuild it yourself.

Here is a brief summary (details are available here) of what we have done so far. As a chassis we use a platform from Pololu. On top of it we mount the BeagleBoard with a WiFi adapter, a camera and a GPS receiver attached. The motors are driven by standard PWM-controlled speed regulators; the control PWM signals are generated directly by the BeagleBoard, both with its hardware PWM generators and with GPIO pins. In addition, a compass and an ultrasonic range finder are connected over the I2C bus available on the BeagleBoard.
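
As an illustration of the motor control side, here is a minimal Python sketch of driving one such speed regulator through the Linux sysfs PWM interface. This is not the project's actual code: the sysfs paths and the chip/channel numbers are assumptions that vary between boards and kernel versions.

```python
# Minimal sketch: drive an RC speed regulator via the Linux sysfs PWM
# interface. The pwmchip/channel numbers are placeholders and depend on
# the board and kernel version.

PWM_DIR = "/sys/class/pwm/pwmchip0"  # assumed chip; check your kernel

def write(path, value):
    with open(path, "w") as f:
        f.write(str(value))

def init_channel(channel=0, period_ns=20_000_000):
    """Export one PWM channel and set a 50 Hz period (20 ms), which is
    what typical RC speed regulators expect."""
    try:
        write(f"{PWM_DIR}/export", channel)
    except OSError:
        pass  # channel already exported
    write(f"{PWM_DIR}/pwm{channel}/period", period_ns)
    write(f"{PWM_DIR}/pwm{channel}/enable", 1)

def set_throttle(channel, fraction):
    """Map fraction in [0, 1] onto the usual 1.0-2.0 ms RC pulse width."""
    pulse_ns = int(1_000_000 + fraction * 1_000_000)
    write(f"{PWM_DIR}/pwm{channel}/duty_cycle", pulse_ns)

if __name__ == "__main__":
    init_channel(0)
    set_throttle(0, 0.5)  # mid position; meaning depends on the regulator
```

The I2C sensors can be read from userspace in a similar way (for example with Python's smbus module), without any kernel-side code.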

Here is how everything looks now:

[photos of the assembled robot]
Or like this, depending on which chassis is used:

[photos of the robot on an alternative chassis]
The video is compressed into an H.264 stream in real time using the OMAP's DSP; TI's DSP-optimized codecs, integrated with GStreamer, do the compression. The H.264 stream is then sent to the driver console over WiFi. We also solved a lot of problems related to firewall and NAT traversal.
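
For orientation, such a pipeline can be sketched with GStreamer's Python bindings roughly as below. Note that this is only a sketch: TIVidenc1 is the DSP-backed encoder element from TI's gstreamer-ti plugin, the caps and destination address are placeholders, and the project originally targeted GStreamer 0.10, where the details differ slightly.

```python
# Sketch of a DSP-accelerated H.264 streaming pipeline through GStreamer's
# Python bindings. TIVidenc1 comes from TI's gstreamer-ti plugin; caps and
# destination here are placeholders, not the project's exact pipeline
# (which targeted GStreamer 0.10).
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

pipeline = Gst.parse_launch(
    "v4l2src device=/dev/video0 "
    "! video/x-raw,width=640,height=480,framerate=25/1 "
    "! TIVidenc1 codecName=h264enc engineName=codecServer "
    "! rtph264pay ! udpsink host=192.168.1.10 port=5000"
)
pipeline.set_state(Gst.State.PLAYING)

# Block until an error or end-of-stream, then shut down cleanly.
bus = pipeline.get_bus()
bus.timed_pop_filtered(Gst.CLOCK_TIME_NONE,
                       Gst.MessageType.ERROR | Gst.MessageType.EOS)
pipeline.set_state(Gst.State.NULL)
```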

We found out experimentally that maintaining a constant video frame rate is very important for a comfortable driving experience. This is not a trivial goal when the video is transmitted over the Internet, because the available bandwidth changes all the time. To solve this problem we developed our own adaptive video streaming infrastructure, which we presented at the GStreamer Conference 2010; the presentation video and slides are available.
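
Purely to illustrate the idea (the actual algorithm is described in the talk), a toy version of such an adaptation loop could look like this; the headroom factor and bitrate limits are invented for the example.

```python
# Toy illustration of adaptive streaming: keep the frame rate fixed and
# vary the encoder bitrate to fit the throughput the receiver reports
# back. All numbers here are invented for the example; the real algorithm
# is described in the GStreamer Conference 2010 talk.

MIN_BITRATE, MAX_BITRATE = 100_000, 2_000_000  # bits/s

def adapt_bitrate(measured_throughput_bps):
    """Pick a new encoder bitrate from the measured network throughput,
    leaving ~20% headroom so queued frames drain instead of piling up
    and turning into latency."""
    usable = int(measured_throughput_bps * 0.8)
    return max(MIN_BITRATE, min(MAX_BITRATE, usable))

# A feedback channel from the receiver would report the throughput, and
# the result would be pushed into the running encoder, e.g. with
# encoder.set_property("bitrate", adapt_bitrate(throughput)).
```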

In addition to the video stream, the driver console receives the data collected from the on-board sensors. This makes it possible, for example, to display the robot's current location on a map downloaded from openstreetmap.org, also in real time.
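
Placing the robot on the map comes down to the standard OpenStreetMap "slippy map" tile arithmetic. The sketch below shows the conversion from a GPS fix to a tile; the formula is the one documented on the OSM wiki, while the example coordinates and tile-server URL are just for illustration.

```python
# Convert a GPS fix to OpenStreetMap "slippy map" tile coordinates.
# The formula is the standard one from the OSM wiki; the resulting tile
# can be fetched from a tile server and drawn under the robot marker.
import math

def latlon_to_tile(lat_deg, lon_deg, zoom):
    n = 2 ** zoom
    x = int((lon_deg + 180.0) / 360.0 * n)
    y = int((1.0 - math.asinh(math.tan(math.radians(lat_deg))) / math.pi)
            / 2.0 * n)
    return x, y

# Example: the tile under a fix in central Berlin at zoom level 16.
x, y = latlon_to_tile(52.5163, 13.3777, 16)
url = f"https://tile.openstreetmap.org/16/{x}/{y}.png"
```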

The following picture illustrates how the driver "cockpit" looks:

[screenshot of the driver console]
This is a real screenshot (not Photoshop :-) ). The driver console application is written using OpenGL and works on Linux and Windows (it should also work on Mac, but I did not test it). The whole 3D model was made in Blender and exported to the standard .DAE (COLLADA) format, so the project also contains a rather evolved COLLADA visualization library which supports animation and some other advanced features. Every time a new video frame is received and decoded into a raw RGB data block, the corresponding OpenGL texture in the 3D scene is updated; the middle panel is used as such a textured surface. The panels to the left and right visualize additional information, such as the map with the current location, or simply scroll unreadable but cool-looking logs :-) .
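
The per-frame texture update itself is the usual glTexSubImage2D path. Here is a sketch of that single step, in Python/PyOpenGL purely for illustration; the real console is a full OpenGL application and its internals may differ.

```python
# Sketch of the per-frame texture update: once a frame is decoded to raw
# RGB, the existing texture is overwritten in place with glTexSubImage2D,
# which is cheaper than re-creating the texture every frame. PyOpenGL is
# used here only for illustration.
from OpenGL.GL import (
    glBindTexture, glTexSubImage2D,
    GL_TEXTURE_2D, GL_RGB, GL_UNSIGNED_BYTE,
)

def upload_frame(texture_id, width, height, rgb_bytes):
    """Replace the contents of an existing RGB texture with a new frame."""
    glBindTexture(GL_TEXTURE_2D, texture_id)
    glTexSubImage2D(GL_TEXTURE_2D, 0,     # mipmap level 0
                    0, 0, width, height,  # update the whole texture
                    GL_RGB, GL_UNSIGNED_BYTE, rgb_bytes)
```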

I want to say special thanks to Sungreen (aka Nikolay :-) )! He kindly responded to my call for help on the blender-3d.ru site and created several different 3D cockpit models. If you are interested, you can take a look at this link. The site is in Russian, but on the 2nd page there are several examples of the alternative cockpit models.

All the software for this project (including recipes for OpenEmbedded/Angstrom to build the Linux image for the flash card) is available on github.com. We are currently in the process of migrating from gitorious.org to github, so if some pieces are missing on github, chances are they can still be found in the old repository.

Also, for additional information about the project, I would suggest checking our Blog.

I would really appreciate any comments about the project!
License: Content on this site is licensed under a Creative Commons Attribution 3.0 License.

Comments

  1. Great work! I'm working on a similar project, and this will be a real help. Don't worry that the documentation is incomplete. There is *some* documentation - that puts you ahead of 90% of the projects on the web :)

  2. Thanks, Romilly, for the positive feedback! :-)

  3. Andrey, where did you get the BeagleBoard case? Cool project!

  4. Thanks! Regarding the case - I just made it myself out of a standard plastic box (pictured here: https://github.com/veter-team/veter/wiki/Hardware-design-en) by cutting the required holes.

  5. OK, this is just plain cool, if bizarre. I built an RC-car system with an adaptive video compression scheme that adjusted the balance between frame quality and rate depending on the physical speed of the vehicle (actually the throttle, but...). This was back in about 1999, using an AT motherboard with a 166MHz processor and a wireless card predating WiFi in an ISA slot. The camera was a parallel port QuickCam, and we used a real serial port to drive the servos. Booting was done wirelessly from a temporarily-plugged-in floppy drive. The software was part of a research project, but was intentionally built with specific limitations (by others) in order to provide problems to solve...

    Significantly in response to the pain of using that software framework, I invented GStreamer ;-)

    Of course, the hardware is long recycled, but the RC car itself is now rebuilt as one of my nephew's favorite toys. Somewhere I have a set of pictures with the robot, myself, and Ray Bradbury, after I demo'd the thing after a speech of his...

  6. Wow, I admire your patience and creativity. Well done!
    Got here through a totally different search query, but who cares :)

    1. Thanks! It is always very motivating to hear positive feedback. I really appreciate it.

  7. hi veter,
    The project you are working on is really interesting. I am also working on a similar project right now. I am using a BeagleBoard-xM with Ubuntu on board. The problem I am facing is latency in the video streaming. The video is captured from a standard webcam, and we are using mjpg-streamer to provide an HTTP stream. I want to know whether using GStreamer/DSP can reduce the latency in the video stream.


    Thanks in advance

    Bharath

    1. Hi Bharath,

      > hi veter,

      :-) Actually "veter" is the name of the project and it is the Russian word for "wind".

      > The project you are working on is really interesting.

      Thanks!

      > I want to know whether using GStreamer/DSP can reduce the latency in the video stream.

      To answer this question I need to know what is actually causing the latency in your case. I can imagine three reasons:
      1. You are capturing raw video frames and compressing them to MJPEG on the CPU.
      2. You are capturing MJPEG directly from the camera (a lot of cameras can generate JPEG) and the MJPEG server is doing some sort of buffering before sending frames to the network.
      3. The receiving client is buffering before it starts playing the video.

      Depending on your answers to the points above, you can decide whether GStreamer/DSP will help. A DSP-based encoder works much faster than the corresponding CPU-based encoder, so it can reduce the time required for compression (if you are actually compressing on the CPU). Otherwise, I would suggest playing with the buffering parameters of your streaming client and server.

      In our case, we did not find any existing solution which satisfied all our needs, which is why we implemented our own very low-latency streaming mechanism. It also adapts to changing network conditions to maintain a constant frame rate; without a constant frame rate it is hard to control the vehicle remotely, even at low speed.

  8. Hi Andrey

    Congratulations on your success. I am trying to do something similar with a BeagleBoard-xM and a webcam. I need a live stream, but I couldn't find a good solution, because streaming uses a very high percentage of the CPU. I tried to use the DSP but I couldn't integrate it. Could you suggest anything for this problem? How could I integrate the DSP, or is there a simpler way to do it?
    Thank you
    Ufuk

    1. Hi Ufuk,

      We are using the GStreamer library [1] with encoder elements from TI (TIVidenc1) which use the DSP to accelerate video compression [2]. AFAIK, the latest versions of Angstrom already have all of this integrated.

      [1] http://gstreamer.freedesktop.org/
      [2] http://processors.wiki.ti.com/index.php/Example_GStreamer_Pipelines#OMAP35x

      Regards,
      Andrey.

  9. Hi there,
    I have a question about robotics with the BeagleBone Black. I have built a robot controlled from a smartphone, with a USB hub providing enough USB ports for the webcam, the Bluetooth adapter and the WiFi adapter. Bluetooth is used to control the robot and WiFi carries the webcam stream. The problem is that the robot runs really slowly. I am using a Debian image with Python. If anyone knows how to fix my problem, please let me know, because I am building this robot for my senior project. Sorry for my English. Thank you in advance.

    1. Could it be that the USB hub doesn't have enough power?

    2. How much power can I provide over USB to an external device? The board limits the USB host port power to 500 mA, while each of the three hub ports in use (webcam, WiFi adapter and Bluetooth adapter) may need up to 500 mA of its own, so there is not enough power to feed them all.

