Developing OV5647 camera Linux device driver for Jetson Nano - Selfie Time! [Part 4]

Now that I am able to capture test images from the Raspberry Pi v1 camera module connected to the Jetson Nano, it is time to get the sensor to capture real live images.

To begin with, I had to ensure that my driver integrated correctly with the V4L2 stack of the Linux kernel. Once that was working, I could install the v4l2-ctl tools on the Nvidia Jetson Nano OS and use them to capture images.

So, I read up a bit on the V4L2 stack and figured out how to register my camera sensor driver with it. Once the registration succeeded, I verified that it created a new device entry for my camera module under the /dev directory. The device registered itself as /dev/video0.

OV5647 Linux driver registered with V4L2 and /dev/video0 created

With that, I then went ahead and downloaded and installed the v4l2-ctl toolkit (part of the v4l-utils package).

With these things in place, I queried the capabilities of the camera sensor that I had exposed through my driver to the V4L2 stack using the following command:

OV5647 Linux driver exposing the capabilities of OV5647

As you can see, the driver for the OV5647 currently supports video capture in the RG10 pixel format, a 10-bit Bayer RGRG/GBGB pattern. A Bayer filter is a special arrangement of color filters over a sensor's pixels that captures 50% green, 25% red and 25% blue samples in every frame of a video or an image. You can read more about the Bayer filter on Wikipedia.
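To make that 50/25/25 split concrete, here is a quick sketch in plain Python (nothing driver-specific) that tiles the RGRG/GBGB pattern over a 640x480 frame and counts how many samples of each color the mosaic produces:

```python
# Sketch: count color samples in an RGRG/GBGB (RGGB) Bayer mosaic.
# The repeating 2x2 tile is:
#   R G
#   G B
WIDTH, HEIGHT = 640, 480

def bayer_color(x, y):
    """Return the color sampled at pixel (x, y) in an RGGB mosaic."""
    if y % 2 == 0:
        return "R" if x % 2 == 0 else "G"
    return "G" if x % 2 == 0 else "B"

counts = {"R": 0, "G": 0, "B": 0}
for y in range(HEIGHT):
    for x in range(WIDTH):
        counts[bayer_color(x, y)] += 1

total = WIDTH * HEIGHT
for color in "RGB":
    print(color, counts[color] / total)  # prints R 0.25, G 0.5, B 0.25
```

Green is sampled twice per 2x2 tile because the human eye is most sensitive to green, which is also why a raw Bayer frame cannot be viewed directly and needs demosaicing first.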

So, with this information in place and knowing that the OV5647 outputs its images in Bayer format, I went ahead and tried to capture my first raw image from the OV5647 sensor on the Jetson Nano through my Linux driver using the command:

v4l2-ctl -d /dev/video0 --set-fmt-video=width=640,height=480,pixelformat=RG10 --set-ctrl bypass_mode=0 --stream-mmap --stream-count=1 --stream-to=ov5647_image.raw --verbose

This command captures raw image data and stores it in the output file ov5647_image.raw. However, since this raw image is in Bayer format, I cannot open and view it using the default image viewer apps on my Ubuntu machine. So I had to find a way to convert the raw Bayer image file to a standard format. This involved two steps:

  1. Use an open source tool called bayer2rgb to convert the raw image file to an RGB TIFF file.
  2. Convert the TIFF image to a standard PNG file using the convert tool from ImageMagick.
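For the curious, the first step can be sketched in a few lines of plain Python. This is a simplified stand-in for what bayer2rgb does, not its actual code: it assumes the 10-bit samples arrive in 16-bit little-endian containers (so a 640x480 frame is 640 * 480 * 2 = 614,400 bytes) and collapses each 2x2 RGGB tile into one RGB pixel instead of doing bilinear interpolation:

```python
import struct

WIDTH, HEIGHT = 640, 480

def demosaic_rggb(raw_bytes, width, height):
    """Naive demosaic of a 10-bit RGGB Bayer frame stored as 16-bit
    little-endian words. Returns rows of (r, g, b) tuples at half
    resolution: one RGB pixel per 2x2 Bayer tile."""
    assert len(raw_bytes) == width * height * 2
    samples = struct.unpack("<%dH" % (width * height), raw_bytes)
    rows = []
    for ty in range(0, height, 2):
        row = []
        for tx in range(0, width, 2):
            r = samples[ty * width + tx]
            g1 = samples[ty * width + tx + 1]
            g2 = samples[(ty + 1) * width + tx]
            b = samples[(ty + 1) * width + tx + 1]
            # Average the two greens, then scale 10-bit (0..1023)
            # values down to 8-bit (0..255).
            row.append((r >> 2, (g1 + g2) >> 3, b >> 2))
        rows.append(row)
    return rows

# Usage with the capture from the previous step:
# with open("ov5647_image.raw", "rb") as f:
#     rgb = demosaic_rggb(f.read(), WIDTH, HEIGHT)
```

In practice bayer2rgb handles the interpolation, byte order and TIFF output for us, which is why I used it instead of rolling my own converter.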

So the next two commands I issued were:

bayer2rgb -i ov5647_image.raw -o ov5647_image.tiff -w 640 -v 480 -b 16 -f RGRG -m BILINEAR --tiff -s
convert ov5647_image.tiff ov5647_image.png

After issuing these two commands, I was finally able to get an image file that I could open using any image viewer program.

This is the result of that first image capture:

OV5647 Linux driver on Nvidia Jetson Nano - first image capture

Woah! What's that??

Is that a bird, is that a plane? No... It's Super... err I mean it's the first image capture of my room's window!

As you can see, we have successfully managed to capture an image from the OV5647 sensor on the Jetson Nano. However, the image is underexposed and oversaturated, so we need to go back to our driver code and set different values for the exposure, gain and saturation levels of the OV5647 sensor. Still, we can rejoice for a moment in the glory of knowing that our driver is working well with the OV5647 sensor on the Jetson Nano!

So, after celebrating this milestone a bit, I went ahead and played around with different values to get a much clearer image capture from my OV5647 Linux driver. After messing around for a few minutes, here is a better image I managed to capture:

OV5647 Linux driver on Nvidia Jetson Nano - improved image capture

This is way better than the first one, right? I know it is still not perfect, but I can now confidently claim that my OV5647 Linux device driver for the Jetson Nano is working well!

So, with image capture through V4L2 now working, it was time to update my driver code to support video streaming with GStreamer applications. That will be the focus of my next article.