SHORT DESCRIPTION

Video processing pipelines are a huge topic; there are many cameras, protocols, processors and processes. When building a video pipeline, it is hard to know which Open Source (OS) pieces can be used, and which closed-source pieces have to be used. This talk surveys the landscape and recommends when to use what. Particular attention is paid to OS components and to freely available protocols. This is far too large a topic for a 20-minute session. Readers who would like to learn more are referred to the "Open Source Video Pipelines Wiki" at wiki.PythonLinks.info

LONG DESCRIPTION

The talk starts with the OS video pipeline on the OS Frame glasses, and then describes a more typical mixed open- and closed-source drone application. For radio communication, drones can use the OS RunCam WifiLink; for more complex applications, such as frequency hopping, one can use the OS FPGA-based bladeRF.

For every step in a video processing pipeline there are a huge number of choices, ranging from simple to very complex. Lenses, image sensors, Raspberry Pi cameras, and the popular Waveshare OV5640 camera are presented. They can be connected to the $10-$15 Raspberry Pi Zero. Camera stacks allow one to mix and match different image sensors, processors and protocols. Real-time processes inside a camera can include defective pixel correction, pixel binning, debayering, color correction, video cropping and M-JPEG compression.
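As a rough sketch of what one of these steps does, here is a minimal debayering example in Python/NumPy. It assumes an RGGB mosaic and collapses each 2x2 Bayer cell into one RGB pixel, which also halves the resolution, much like 2x2 pixel binning; the function name and approach are illustrative, not what any particular camera ISP implements.

    import numpy as np

    def debayer_2x2(raw):
        """Collapse each 2x2 RGGB Bayer cell into one RGB pixel."""
        h, w = raw.shape                      # height and width must be even
        rgb = np.empty((h // 2, w // 2, 3), dtype=raw.dtype)
        rgb[..., 0] = raw[0::2, 0::2]         # R sits at even rows, even cols
        g = (raw[0::2, 1::2].astype(np.uint16) + raw[1::2, 0::2]) // 2
        rgb[..., 1] = g.astype(raw.dtype)     # average the two G sites
        rgb[..., 2] = raw[1::2, 1::2]         # B sits at odd rows, odd cols
        return rgb

    # A fake 8-bit Bayer frame, just to show the shapes involved
    frame = np.random.randint(0, 256, (480, 640), dtype=np.uint8)
    print(debayer_2x2(frame).shape)           # (240, 320, 3)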

There are a large number of display options, but usually DVI or HDMI is the output. If we want the video on the computer, HDMI-to-USB dongles cost less than 10 Euro. They use frame buffers, which are common in video pipelines but double the memory requirements. Microcontrollers usually do not have enough memory for a double frame buffer. Video processing on microcontrollers is best done with the libmpix library running on the Zephyr Operating System. It uses a ring buffer to minimize memory consumption during video processing.
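A quick back-of-the-envelope calculation shows why double buffering is out of reach on most microcontrollers; the resolution and pixel formats below are just example values:

    # Memory needed to double-buffer one video frame (illustrative numbers).
    width, height = 640, 480
    for fmt, bytes_per_pixel in [("RGB565", 2), ("RGB888", 3), ("YUYV", 2)]:
        frame = width * height * bytes_per_pixel
        print(f"{fmt}: one frame {frame / 1024:.0f} KiB, "
              f"double buffer {2 * frame / 1024:.0f} KiB")
    # RGB565: one frame 600 KiB, double buffer 1200 KiB -- more RAM than
    # most microcontrollers have in total.  A ring buffer that holds only
    # a few lines at a time sidesteps this.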

For higher-bandwidth processes, such as video dewarping, an FPGA is really needed. Any FPGA can receive parallel RGB data (called DVP) at up to 1.5 Gbps. The OS Lattice NX17/40 has a hard MIPI core which can receive 2.5 Gbps. The GateMate FPGA has a SerDes at 5 Gbps. Above that, one needs closed-source FPGAs.
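To get a feel for where those interface limits bite, here is a back-of-the-envelope in Python, counting active pixels only (real links also carry blanking and protocol overhead, so the true rates are higher):

    # Raw RGB video bandwidth for a few common modes, at 24 bits per pixel.
    for name, w, h, fps in [("VGA 640x480@60", 640, 480, 60),
                            ("720p60", 1280, 720, 60),
                            ("1080p60", 1920, 1080, 60)]:
        gbps = w * h * fps * 24 / 1e9
        print(f"{name}: {gbps:.2f} Gbps")
    # VGA 640x480@60: 0.44 Gbps -> fits parallel DVP (~1.5 Gbps)
    # 720p60:         1.33 Gbps -> close to the DVP limit
    # 1080p60:        2.99 Gbps -> needs multiple MIPI lanes or a SerDes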

If you want to build a video pipeline, the first step is to know what the technical options are. High-level recommendations are made, but there is really too much material for a 20-minute talk. A 10-page paper was submitted to the Chemnitz OSHOP conference, but there is too much material even for that. Wiki.PythonLinks.info covers far more material than can be covered here.

Supporting Material

You can see the slides at https://pythonlinks.info/presentations/VideoPipelines.pdf

You can watch the 17-minute version of the video at wiki.PythonLinks.info

