
Lattice VIP IMX214/CrossLink issues

Overview

The IMX214 sensors are configured using the ‘default’ sequence from the reference design, but at a lower PLL frequency of around 54 MHz. Both sensors are started synchronously.
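
For illustration, here is a minimal sketch of how such a configuration sequence is typically pushed to the sensor over I2C. The register addresses, values and the i2c_write16() helper below are placeholders, not the actual IMX214 ‘default’ sequence or the 54 MHz PLL settings:

#include <stdint.h>
#include <stddef.h>

/* One entry of a sensor configuration sequence: 16 bit register, 8 bit value. */
struct reg_write {
    uint16_t reg;
    uint8_t  val;
};

/* Placeholder sequence -- the real 'default' sequence and the lowered
 * PLL settings come from the reference design, not from this sketch. */
static const struct reg_write imx214_seq[] = {
    { 0x0100, 0x00 },   /* hypothetical: standby before reconfiguring */
    { 0x0301, 0x05 },   /* hypothetical: PLL pre-divider */
    { 0x0303, 0x02 },   /* hypothetical: PLL multiplier for the lower clock */
    { 0x0100, 0x01 },   /* hypothetical: streaming on */
};

/* i2c_write16() is assumed to be provided by the host/bridge platform. */
extern int i2c_write16(uint8_t addr, uint16_t reg, uint8_t val);

static int imx214_configure(uint8_t i2c_addr)
{
    for (size_t i = 0; i < sizeof(imx214_seq) / sizeof(imx214_seq[0]); i++) {
        if (i2c_write16(i2c_addr, imx214_seq[i].reg, imx214_seq[i].val) < 0)
            return -1;  /* abort on the first failed write */
    }
    return 0;
}

For the synchronous start, one would run this sequence for both sensor addresses and issue the final streaming-on write back to back.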

Single camera configuration

This setup uses the single camera bit file provided by Helion Vision.

  • Framing errors occur early in the video stream; after that it runs stably for a very long time (recorded up to 150’000 frames)

Dual camera configuration

This setup uses the stereo camera reference design from the Lattice website (DualCSI2toRaw10_impl1.bit).

Color shifts
Bayer pattern offsets

Issues:

  • Framing is very unstable; the right image shows an interesting color shift
  • The offset changes from frame to frame, displaying as shown above

Further analysis

The DEMUX errors reported by the JPEG encoder are caused by occasional invalid framing. Frames are then dropped and the image goes out of sync.

Possibilities:

  1. Framing from the sensor is wrong (critical sensor configuration mode)
  2. Framing from the sensor is correct, but translation hiccups occur inside the CrossLink
  3. Irregular timing (too short an Hblank time) stressing the JPEG encoder FIFOs

(1) cannot be verified without a MIPI timing debugger. (2) cannot be simulated because the CrossLink firmware is closed source.

For (3), the LINE_VALID (blue) and FRAME_VALID (yellow) signals, both routed to an external debug header, display as follows:

IMX214 sensor framing via CrossLink

The above behaviour, two subsequent pixel lines separated by only a short blanking time, occurs in both the current stereo and the single sensor CrossLink firmware configurations.
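
To quantify this, the LINE_VALID edge timestamps from a logic analyzer capture can be checked for too-short blanking intervals between consecutive lines. A minimal sketch; the capture format, the sample values and the 1.5 µs threshold are assumptions, not measured numbers:

#include <stdio.h>
#include <stdint.h>
#include <stddef.h>

/* LINE_VALID edge timestamps (ns) from a capture -- values made up for illustration. */
static const uint64_t lv_rise_ns[] = { 10000, 31000, 70000 };  /* line starts */
static const uint64_t lv_fall_ns[] = { 30000, 51000, 90000 };  /* line ends   */

#define MIN_HBLANK_NS 1500  /* hypothetical lower bound for a safe Hblank */

int main(void)
{
    /* Hblank = gap between the end of one line and the start of the next. */
    for (size_t i = 0; i + 1 < sizeof(lv_rise_ns) / sizeof(lv_rise_ns[0]); i++) {
        uint64_t hblank = lv_rise_ns[i + 1] - lv_fall_ns[i];
        if (hblank < MIN_HBLANK_NS)
            printf("line %zu -> %zu: Hblank only %llu ns\n",
                   i, i + 1, (unsigned long long)hblank);
    }
    return 0;
}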

Potential remedies

Sorted by ascending complexity:

  • Find a magic setting for more regular MIPI data transfer
  • Use another sensor (parallel interface)
  • Try to fix the irregular timing with a ‘sanity checker’ interface and a line buffer (see the sketch after this list)
  • Revisit the CrossLink firmware (consider fixes by a third party)
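
The ‘sanity checker’ from the list above would eventually live in the FPGA fabric; the following is only a rough behavioural model in C to illustrate the intent: buffer one line and hand it downstream only when it is complete, so the JPEG encoder never sees a runt line. The line width and interface names are placeholders, and a real implementation would also re-emit the line with regular pacing:

#include <stdint.h>

#define LINE_WIDTH 1920            /* hypothetical active pixels per line */

static uint16_t line_buf[LINE_WIDTH];
static unsigned fill;              /* pixels collected for the current line */

/* Called for every pixel while LINE_VALID is high. */
void pixel_in(uint16_t px)
{
    if (fill < LINE_WIDTH)
        line_buf[fill++] = px;
    /* extra pixels beyond the expected width are silently dropped */
}

/* Called on the falling edge of LINE_VALID. Returns 1 and hands out the
 * buffered line only if it is complete; incomplete lines are discarded. */
int line_end(const uint16_t **out)
{
    int ok = (fill == LINE_WIDTH);
    if (ok)
        *out = line_buf;
    fill = 0;
    return ok;
}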

JPEG robot camera

The PCB arrived three weeks ago, and I finally found the time to try out what should have been tried long ago.

Here’s the result. The tiny mobile-type camera is able to deliver JPEGs straight from the sensor, which means we can run our motion JPEG server on the popular SRV1 robot from http://www.surveyor.com.

The videos can just be watched in a browser or with mplayer. Still tweaking frame rates and PLL…

The board supports another VGA global shutter sensor with cheap webcam optics, as you can see from the unpopulated footprint. If things work as expected in theory, both sensors can be populated and selected via GPIOs, so they can be switched at runtime. The bigger problem is making the uClinux framework (which is about to be ported to the SRV1) acknowledge the dual-head device. The current driver model does not really support that, so all sensor property control happens in user space via the netpp (network property protocol) framework.
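
Just to sketch what runtime switching could look like from user space, assuming the sensor select line ends up on a GPIO exported via sysfs; the GPIO number, the sysfs path and the 0/1 mapping below are pure guesses, not taken from the schematics:

#include <stdio.h>

/* Hypothetical: the sensor select line exported as GPIO 10 via sysfs. */
#define SENSOR_SEL_GPIO "/sys/class/gpio/gpio10/value"

/* which = 0 selects the mobile-type JPEG sensor, 1 the VGA global shutter
 * sensor -- an assumed mapping, for illustration only. */
int select_sensor(int which)
{
    FILE *f = fopen(SENSOR_SEL_GPIO, "w");
    if (!f)
        return -1;
    fprintf(f, "%d\n", which ? 1 : 0);
    fclose(f);
    return 0;
}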

SRV1 JPEG camera

Update: Sadly, Howard Gordon, the brain behind the SRV1 and our development partner, has suddenly passed away: https://www.sanluisobispo.com/news/local/article39130170.html

It was great to work with you. You will be missed.

Therefore, further development on this has currently come to a halt. The future SRV1 development is maintained by Timothy Jump at https://engineering3.org


LeanXcam hacking

The leanXcam has been out for a while now; I finally got my hands on an OEM module.

The first impressions:

  • Interesting four-plane layout. The bypass caps are somewhat far away from the processor on the top side; not sure if they are really useful that way.
  • Cheap optics, but they do the job
  • Plugging in Ethernet and power, I was able to telnet into the beast at the default 192.168.1.10 within seconds and try the webserver. Nice!

Now, how to run our standalone netpp framework:

The JTAG header (which we definitely need for bare metal, i.e. non-uClinux development) wasn’t populated. The helpful folks at Supercomputing Systems told me the specs of the somewhat unusual SMD header:

Farnell, Order #1421678

Keep in mind that pin 3 must be left free for the key.

So, now set up with JTAG, I plugged in one of the new ICEbearPlus units and ran the flashloader:

flashload --info --driver=spi_flash.dxe

This is what we get:

Detected Device: BF-537(536) rev:3
Manufacturer        : Atmel
Device Type         : Dataflash 45DB321D
-------------------------------------------------------------------
 Driver title: SPI flash programmer (STM, Atmel)
 Description: AT45DB321D
 Manufacturer Code: 0x1f
 Device Code: 0x27
 Number of sectors: 0x41
 Number of regions: 0x3
 Bus width: 0x8
 Buffer size: 0x2000
 Flash size: 0x400000
-------------------------------------------------------------------

Cool, that worked on the spot. There is another flash on SPI select 5, i.e. found by the flashloader via the --unit=4 option. Before starting to hack, it might be a good idea to save the flash images:

flashload --info --driver=spi_flash.dxe --unit=0 --dump leanx_0.img --size=0x400000

Downloading code

So now: we have this standalone shell code to try out stuff, so let’s see how the code from the pretty similar STAMP BF537 board can be ported. After a few modifications, I seem to be able to talk to the shell via the bfpeek console (the bfpeek channel is a way to do stream I/O over JTAG without attaching a serial cable):

strubi@gmuhl:~/src/blackfin/shell/boards/LEANXCAM$ nc localhost 4000

/////////////////////////////////////////////////////////
// test shell // (c) 2004-2009, <hackfin-ät-section5.ch>  //
/////////////////////////////////////////////////////////

Board: LeanXcam BF537
>

Let’s see if we find something on the I2C bus:

> i
Detecting devices: 5c
done.

Right, that should be the I2C address of the MT9V032 sensor.

Let’s try grabbing a frame with the ‘v’ command. Oops, timeout! Could it be that the sensor is in standby mode? I guess we’ll have to check the schematics now. And here we should emphasize: the leanXcam does not make a secret of its internals, the schematics are openly available (why keep something a secret that isn’t really one?).

Nah, everything is ok, the sensor should run. After giving it a few tries and running into the usual obscure core faults, we remember: random core faults mostly have to do with bad SDRAM settings! After revisiting the settings and fixing them, we see:

> v 3
Initializing video with 640x480
Video start
Process frame [0], 0 jiffies
Data error in frame 0
Process frame [1], 21 jiffies
Data error in frame 1
Process frame [0], 21 jiffies
Data error in frame 0
Video stop (9 frames received)
 (6 overrun)

Bingo. We lost 6 frames because the output via the bfpeek channel burns many CPU cycles, but that was to be expected. We ignore the data errors, because we haven’t enabled the test pattern yet.
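
For the record, the SDRAM fix mentioned above boils down to getting the Blackfin EBIU SDRAM controller registers right for the memory that is actually fitted. A sketch with made-up values, assuming the usual BF537 MMR addresses; the real CAS latency, tRAS/tRP/tRCD and refresh settings have to come from the SDRAM datasheet and the board’s SCLK:

#include <stdint.h>

/* BF537 EBIU SDRAM controller registers (memory-mapped). */
#define EBIU_SDGCTL (*(volatile uint32_t *)0xFFC00A10)  /* global control */
#define EBIU_SDBCTL (*(volatile uint16_t *)0xFFC00A14)  /* bank control   */
#define EBIU_SDRRC  (*(volatile uint16_t *)0xFFC00A18)  /* refresh rate   */

void sdram_init(void)
{
    /* All values below are placeholders -- derive them from the datasheet. */
    EBIU_SDBCTL = 0x0025;       /* hypothetical: bank enable, size, column bits */
    EBIU_SDRRC  = 0x03A0;       /* hypothetical: refresh counter for this SCLK  */
    EBIU_SDGCTL = 0x0091998D;   /* hypothetical: timing parameters + enable     */
}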

Well, this is kinda amazing. It does not happen that often that you plug in hardware of this sort and it works that smoothly right away.

Thus, I can recommend the leanXcam to anyone who wants to get into serious image processing, be it uClinux or standalone.