Lattice VIP MJPEG streaming

The Embedded Vision Development Kit (EVDK or VIP) from Lattice Semiconductor is a stereo camera reference design for developing machine vision or surveillance applications. It is based on an ECP5 FPGA as the main processing unit and a CrossLink (LIF-MD6000) for sensor interface conversion. As an add-on, a GigE- and USB-Vision-capable output board can be purchased; it is required for this demo.

The image acquisition board of the VIP is equipped with two Sony IMX214 rolling shutter sensors. Unfortunately, their register map is not publicly available. The on-board CrossLink translates the video data arriving over two MIPI interfaces into a parallel video stream, which is easier for the processor board to handle. There are two different default firmware images for the CrossLink:

  1. Stereo mode: both sensors’ images are merged, each at half the horizontal resolution (cropped)
  2. Mono mode: only image data coming from sensor CN2 is forwarded

The CrossLink bit files are available from Lattice Semiconductor after registration [ Link ]

The MJPEG streaming bit file is available for free [ MJPEG-Streaming-Bitfile-for-VIP ].

JPEG streaming

As a reference receiver for the RFC 2435 JPEG stream, we use the GStreamer pipeline framework. Create a script as follows:

caps="application/x-rtp, media=\(string\)video,"
caps="$caps clock-rate=\(int\)90000,"
caps="$caps encoding-name=\(string\)JPEG"

gst-launch-1.0 -v udpsrc \
caps="$caps" \
port=2020 \
! rtpjpegdepay \
! jpegdec \
! autovideosink

When this script is run under Linux (or likewise under Windows), GStreamer starts an RTP MJPEG decoding pipeline and displays the video stream as soon as it arrives. The stream must then be configured on the VIP.
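
Instead of displaying the video, the depayloaded JPEG frames can also be written to disk for inspection, one file per frame. A minimal sketch, reusing the caps definition from the script above and assuming the multifilesink element (gst-plugins-good) is installed:

gst-launch-1.0 -e udpsrc \
caps="$caps" \
port=2020 \
! rtpjpegdepay \
! multifilesink location=frame%05d.jpg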

Stream configuration

  1. Connect the USB programmer cable to the VIP processor board, then start a terminal program (such as minicom) with serial parameters 115200 bps, 8N1
  2. Connect to /dev/ttyUSB1 (Linux) or the corresponding virtual COM port on Windows
  3. Load the ECP5 MJPEG encoder reference design (bit file) onto the target using the Diamond Programmer
  4. On the console, configure the address of the receiver (see the note after this list for setting up the PC side):
    r 192.168.0.2
  5. Verify the ARP reply:
    # Got ARP reply: 44 ac de ad be ef
  6. Start the JPEG video, e.g. 1920×1080 @ 12 fps (low bit rate):
    j 2
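
The receiver address configured in step 4 must be assigned to the PC running the GStreamer script. On a Linux host, this could look as follows (a sketch, assuming the GigE board is connected to the network interface eth0):

sudo ip link set eth0 up
sudo ip addr add 192.168.0.2/24 dev eth0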

If the JPEG stream stops, the cause is usually a bottleneck in the encoder. Under certain circumstances, FIFO overflows can be provoked by holding a colorful cloth in front of the camera; likewise, a fully white, saturated image may fill up the FIFOs in this demo. The JPEG encoder is configured to tolerate up to five overflows before terminating the stream. For detailed error analysis, see the documentation of the MJPEG encoder.
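
If overflows keep terminating the stream at full resolution, the data rate can be lowered before restarting, for example by raising the quantization value (see also the troubleshooting section below):

q 30
j 2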

Sensor parameter configuration

The connected sensors are configured via the I2C bus:

# Scan the I2C bus (only while the JPEG stream is off):
i

# Query an I2C register (hexadecimal values):
i 100

# Set an I2C register:
i 100 1

Simplified sensor access (values are again in hexadecimal notation):

Command      Function
se [Value]   Exposure
sh [0|1]     HDR mode
sr [Gain]    Gain red
sg [Gain]    Gain green
sb [Gain]    Gain blue
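
For example, a manual exposure and white balance adjustment could look as follows (the values are hexadecimal and purely illustrative; suitable gains depend on the scene and lighting):

se 300
sr 40
sg 20
sb 48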

Examples

These are JPEG images captured from the encoded stream without further conversion, using the CrossLink firmware for the stereo sensor configuration. Image errors can occur, probably due to synchronization issues at a lower pixel clock.

Stereo test image (‘j 4’ command)
Broken stereo image (MIPI output frequency too low)

General Troubleshooting

  • Video does not start:
    1. Check error messages on the console. If ‘Frames received’ upon video stop (‘j’) shows 0, the sensor configuration may be incorrect or the CrossLink is not properly initialized.
    2. Check for network activity (orange LED) on the GigE board
    3. Use Wireshark to monitor network activity
  • Video starts, but terminates:
    1. Check error bits on the console: [DEMUX], [FIFO], …
    2. Increase the quantization value (stronger compression):
      q 30
    3. Check for lost packets with Wireshark (see the capture example after this list)
    4. Try a direct network connection between VIP and PC (no intermediate router)
  • Broken images:
    1. Check again for error bits on the console. It is also possible that the CrossLink reference design does not properly handle the clock coming from the sensor.
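
For a quick check without the Wireshark GUI, the incoming RTP packets can also be captured with tcpdump on the receiver (again assuming interface eth0; the corresponding Wireshark display filter is udp.port == 2020):

sudo tcpdump -i eth0 -n udp port 2020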

 

 


Lattice VIP IMX214/CrossLink issues

Overview

The IMX214 sensors are configured using the ‘default’ sequence from the reference design, but at a lower PLL frequency of around 54 MHz. Both sensors are started synchronously.

Single camera configuration

This setup uses the single camera bit file provided by Helion Vision.

  • Framing errors occur early in the video stream; afterwards, it runs stably for a very long time (recorded up to 150,000 frames)

Dual camera configuration

This setup uses the stereo camera reference design from the Lattice website (DualCSI2toRaw10_impl1.bit).

Color shifts
Bayer pattern offsets

Issues:

  • Framing is very unstable; the right image shows an interesting color shift
  • The offset changes from frame to frame, displaying as above

Further analysis

The reason for the DEMUX errors reported by the JPEG encoder is occasional invalid framing: frames are then dropped and the image goes out of sync.

Possibilities:

  1. The framing from the sensor is wrong (critical sensor configuration mode)
  2. The framing from the sensor is correct, but the translation hiccups inside the CrossLink
  3. Irregular timing (too short a horizontal blanking time) stresses the JPEG encoder FIFOs

(1) cannot be verified without a MIPI timing debugger. (2) cannot be simulated, because the CrossLink firmware is closed source.

For (3), the LINE_VALID (blue) and FRAME_VALID (yellow) signals, both routed to an external debug header, display as follows:

IMX214 sensor framing via CrossLink

The above behaviour, two subsequent pixel lines with a short blanking time in between, occurs in both the current stereo and single sensor CrossLink firmware configurations.

Potential remedies

Sorted by ascending complexity:

  • Find a magic setting for a more regular MIPI data transfer
  • Use another sensor (parallel interface)
  • Try to fix the irregular timing with a ‘sanity checker’ interface including a line buffer
  • Revisit the CrossLink firmware (consider fixes by a third party)

MaSoCist opensource 0.2 release

I’ve finally gotten around to releasing the open source tree of our SoC builder environment on GitHub:

https://github.com/hackfin/MaSoCist
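
To check out the tree:

git clone https://github.com/hackfin/MaSoCist.git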

Changes in this release:

  • Active support for the Papilio and the Breakout MachXO2 board has been dropped
  • Very basic support for the neo430 (msp430 compatible) has been added, see the Docker notes below
  • Includes a non-configurable, basic ‘eval edition’ (in VHDL only) of our pipelined ZPUng core
  • Basic virtual board support (using the ghdlex co-simulation extensions)
  • Docker files and recipes included

Docker environment

Docker containers are, in my opinion, the optimal solution for automated deployment and for testing different configurations. To stay close to current GHDL simulator development, the container is based on the ghdl/ghdl:buster-gcc-7.2.0 image.

Here’s a short howto for setting up a ready-to-play environment. You can try this online, for example, at https://labs.play-with-docker.com. Just register a Docker account, log in, and start playing in your online sandbox.

If you want to skip the build, you can use the precompiled Docker image by running

docker run -it -v/root:/usr/local/src hackfin/masocist

and skip step (3) below.

You’ll need to build and copy some files from contrib/docker to the remote Docker machine instance:

  1. Run ‘make dist’ inside contrib/docker; this creates the file masocist_sfx.sh
  2. Copy Dockerfile and init-pty.sh to the Docker playground by dragging the files onto the shell window
  3. Build the container and run it:
    docker build -t masocist .
    
    docker run -it -v/root:/usr/local/src masocist
  4. Copy masocist_sfx.sh to the Docker machine as well, then run it inside the running container’s home directory (/home/masocist):
    sudo sh /usr/local/src/masocist_sfx.sh
  5. Now pull and build all necessary packages:
    make all run
  6. If nothing went wrong, the simulation for the neo430 CPU will be built and started with a virtual UART and SPI simulation. A minicom terminal connects to that UART, and you’ll be able to talk to the neo430 ‘bare metal’ shell; for example, you can dump the contents of the virtual SPI flash with:
    s 0 1
    

    Note: This can be very slow. On a Docker playground virtual machine, it can take up to a minute until the prompt appears, depending on the server load.
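
For convenience, steps (3) to (5) can be collected into a small shell script on the Docker machine. This is only a sketch: it assumes the files from steps (1) and (2) are already in place, and that the image starts a plain shell so the chained command runs as expected:

#!/bin/sh
# Build the container image from the copied Dockerfile
docker build -t masocist .
# Unpack the masocist tree, then pull and build everything in one go
docker run -it -v/root:/usr/local/src masocist \
    sh -c "cd /home/masocist && sudo sh /usr/local/src/masocist_sfx.sh && make all run"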

Development details

The simulation is a cycle accurate model of your user program, which you can of course modify. During the build process, i.e. when you run ‘make sim’ in the masocist(-opensource) directory, the msp430-gcc compiler builds the C code from the sw/ directory and places it in memory according to the linker script in sw/ldscripts/neo430. This results in an ELF binary, which is in turn converted into a VHDL initialization file for the target. Then the simulation is built.
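
Schematically, the steps behind ‘make sim’ correspond to something like the following. The linker script name, the ‘elf2vhdl’ converter and the testbench unit are placeholders for illustration; the actual Makefile rules differ in detail:

# Compile the bare metal C code against the neo430 linker script
msp430-gcc -Os -T sw/ldscripts/neo430/neo430.ld sw/main.c -o main.elf
# Convert the ELF image into a VHDL memory initialization file
elf2vhdl main.elf > hdl/neo430_ram_init.vhd
# Analyze, elaborate and run the simulation with GHDL
ghdl -a hdl/*.vhd
ghdl -e neo430_tb
ghdl -r neo430_tb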

The linker script is, however, very basic. Since a somewhat different, automatically generated memory map is used at this experimental stage, all peripherals are configured in the XML device description at hdl/plat/minimal.xml; the data memory configuration (the ‘dmem’ entity), however, does not automatically adapt the linker script.

Turning this into a fully configurable solution remains to be done.