Overview:
- Baseline JPEG compliant (ITU-T T.81), Motion JPEG
- Up to 12 bit depth supported (default: 8 bit)
- Super low latency (less than 1/10 of the frame duration for rolling shutter cameras)
- Lossy compression by default
- Fully bit and cycle accurate co-simulation model available in Docker container (See Jupyter Notebook: JPEG_L2 cosimulation)
- Two-chip reference design: no external RAM; only an FPGA and an Ethernet PHY are required
- Low power consumption due to clock synchronous, distributed operation
The JPEG IP comes in the following variants:
- L2: dual-pipe simultaneous encoding for high-quality YUV422, for example 1280×720@60fps (100 MHz pixel clock for the default setup; custom platform optimizations are possible for higher clocks)
- L2x: off-standard optimizations for specific image properties (spectral imaging)
- L1: monochrome or YUV420 multiplexed pipeline (150 MHz pixel clock on Spartan-6). No longer maintained.
Example configurations / reference designs
The current default JPEG SoC encoder setup, built around the JPEG L2 encoder, is a fully functional MJPEG-over-RTP (RFC 2435) camera network streaming solution, including a stress-test option (deterministic pseudo-random pattern). Precompiled bit files are currently available for download (free, but registration required) for the following platforms:
- Lattice HDR60/ECP3 [ MT9M024 Network camera | MT9P031 (J7) on request ]
- Lattice VIP stereo kit: [ IMX214 camera head ]
Board support packages are no longer maintained as such; a core license, however, grants access to the full, unobfuscated source code.
To test this setup on Windows or Linux, a GStreamer installation and a 100 Mbit/s capable Ethernet interface are required. The scripts needed to run the MJPEG low-latency demo are listed in the dombert product brief in the documentation section below.
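For a quick receive-side test without the vendor scripts, a minimal pipeline can also be assembled with the GStreamer Python bindings. The sketch below is an illustration only: the UDP port (5000) and RTP caps are assumptions and should be replaced with the values used by the demo scripts of the actual reference design.

```python
# Minimal RTP/JPEG (RFC 2435) receiver sketch using the GStreamer Python bindings.
# Port 5000 and the caps below are assumptions; take the real values from the
# dombert demo scripts / product brief.
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst, GLib

Gst.init(None)

pipeline = Gst.parse_launch(
    "udpsrc port=5000 caps=\"application/x-rtp,media=video,"
    "encoding-name=JPEG,payload=26\" "
    "! rtpjpegdepay ! jpegdec ! videoconvert ! autovideosink sync=false"
)

pipeline.set_state(Gst.State.PLAYING)
try:
    GLib.MainLoop().run()          # run until interrupted (Ctrl+C)
except KeyboardInterrupt:
    pass
finally:
    pipeline.set_state(Gst.State.NULL)
```

The same pipeline string can also be passed to gst-launch-1.0 on the command line for a quick test.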
A few example configurations for MJPEG streaming are found in the table below.
| Resolution | Quality | fps | Bandwidth [MByte/s] |
|---|---|---|---|
| 768×512 | 90% | 33 | 3.2 |
| 768×512 | 70% | 63.33 | 4.7 |
| 1440×960 | 88% | 23.5 | 7.6 |
| 1920×1024 | 82% | 18 | 7.6 |
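As a rough sanity check on these figures (an estimate for illustration, not a vendor specification, assuming 1 MByte = 10^6 bytes), the bandwidth can be converted into bits per encoded pixel:

```python
# Convert the table's bandwidth figures into bits per encoded pixel
# (rough estimate only; assumes 1 MByte = 1e6 bytes).
rows = [
    ("768x512",   90, 33.0,  3.2),
    ("768x512",   70, 63.33, 4.7),
    ("1440x960",  88, 23.5,  7.6),
    ("1920x1024", 82, 18.0,  7.6),
]
for res, quality, fps, mbyte_per_s in rows:
    w, h = (int(v) for v in res.split("x"))
    bits_per_pixel = mbyte_per_s * 8e6 / (w * h * fps)
    print(f"{res} @ Q{quality}, {fps} fps -> {bits_per_pixel:.2f} bit/pixel")
```

All four configurations come out at roughly 1.5 to 2 bit/pixel.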
The streaming pipeline is optimized for minimum frame drops using the above GStreamer setup. With a GStreamer receiver and a network not subject to excessive load, thousands of hours of video have been verified without a single lost frame.
Simulation models and verification
The JPEG encoder IP L1 and L2 modules are extensively verified for standard compliance using the following techniques:
- Fully bit-accurate co-simulation and coverage to verify loopback software compliance (Encoding->Decoding->Validation). A simplified virtualized encoder hardware model for co-simulation is supplied; it allows you to simulate the encoder hardware design against your own test data and timing using Python, without having to install any software locally. See details in [ Co-Simulation models ‘2.0’ ] and the loopback sketch after this list.
- Compression bandwidth and FIFO stress tests: generation of worst-case-compressible image patterns and statistics collected on the FPGA SoC setup
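As a loose illustration of the loopback idea, the sketch below feeds a deterministic pseudo-random test frame through an encoder callable and validates the round trip. `encode_frame` is only a placeholder for the co-simulation model's actual entry point, whose real name and signature are documented in the vendor's Jupyter notebook.

```python
# Loopback validation sketch: encode a deterministic pseudo-random test frame,
# decode it with a standard JPEG decoder, and compare against the source.
# `encode_frame` is a placeholder for the co-simulation model's actual entry
# point; the real API is described in the vendor's Jupyter notebook.
import io
import numpy as np
from PIL import Image

def loopback_check(encode_frame, width=768, height=512, quality=90, seed=42):
    # Deterministic pseudo-random pattern, similar in spirit to the
    # worst-case stress patterns mentioned above.
    rng = np.random.default_rng(seed)
    frame = rng.integers(0, 256, size=(height, width), dtype=np.uint8)

    bitstream = encode_frame(frame, quality=quality)   # expected: JFIF byte stream
    decoded = np.asarray(Image.open(io.BytesIO(bitstream)).convert("L"))

    # Report PSNR of the round trip as a simple validation metric.
    mse = float(np.mean((frame.astype(np.float64) - decoded) ** 2))
    return float("inf") if mse == 0 else 10.0 * np.log10(255.0 ** 2 / mse)
```

For a software-only smoke test, any JPEG encoder that returns a byte string (for example Pillow's own encoder) can be passed as `encode_frame` before swapping in the co-simulation model.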
Streaming ‘System on chip’
The embedded system processor ‘dombert’ is programmable in C (GNU toolchain) and allows the encoder’s packet queue to be freely configured to support other UDP/IP-based transmission standards such as SRTP (in combination with a hardware AES encryption IP core).
The HDR60 development kit setup features a hardware debug port and GDB test scripts for convenient video hardware debugging.
Services
Various support packages are available. If you wish to integrate the JPEG IP into your design, you have the following options:
- Evaluate the MJPEG RTP streaming demo using a precompiled bit file (HDR60 or VIP kit required). The default MT9M024 sensor demo delivers up to 1280×960@35 fps; on the VIP kit, the IMX214 setup delivers up to 1920×1080@12 fps.
- Integrate the closed netlist for evaluation into your own image pipeline. Sources are available in VHDL and Verilog (bit-accurate verification against GHDL and Icarus Verilog).
- Only limited resources are available for support concerning:
- Interfacing [ Ethernet | isochronous USB transfers ]
- Software format [ RFC 2435 compliance | UVC video class support ]
- The full package: You can build your own sensor front end around the dombert MJPEG streaming SoC reference design and program it using the GCC SDK
Documentation and further resources
- dombert MJPEG streaming SoC product brief (outdated, on request only)
- Standalone JPEG IP core documentation on request
- Example video captures [ MJPEG AVI 1024×768 or 1280×960 ]