A question from our “Submit a Question” section here on HotSpots:
For the last three earnings calls, Andy has commented on the progress of a design win for the camera market. How does the display path of a high resolution camera differ from a smartphone with a camera?
What is it exactly that needs to be bridged to the display in a camera – is it the camera AP or the camera image sensor controller? If an OEM wants to include Android in a future camera, is it the same Camera AP handling Android, an Android AP handling the camera, or two APs, one for the camera and one for the OS? Any “light” you can shed on this will be appreciated.
First, thanks for the question. Hopefully I can remember just a little bit of the 10 years I spent in Digital Imaging prior to QuickLogic…
(author’s note: I am going to use the term “DSC” to represent Digital Still Cameras, Digital Video Cameras, and Digital Single-Lens Reflex Cameras hereafter)
DSCs differ from smartphones in a number of ways; the biggest difference that changes the way the product is architected is the primary focus (if you can make a pun, so can I) of the equipment. Obviously, smartphones are designed to perform multiple functions, whereas DSCs are designed for the primary purpose of taking high-quality still images and videos. Accordingly, the internal architecture changes. Today’s smartphones use multi-core applications processors designed for many tasks, with image processing being only one of them. A typical DSC, however, doesn’t use an applications processor at all; it relies instead on a DSP (Digital Signal Processor). The primary job of the DSP is to interface to the image sensor, process the image, and then send the resulting file to (1) the embedded memory and (2) the on-board display.
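For the software-minded, the dataflow I just described can be sketched in a few lines. This is purely illustrative (the function names are my own invention, not any vendor's API): the DSP sits between the sensor and the two consumers of each processed frame.

```python
# Hypothetical sketch of the DSC dataflow: sensor -> DSP -> (memory, display).

def dsp_process(raw_frame):
    """Stand-in for the DSP's image pipe (demosaic, noise reduction, encode)."""
    return {"pixels": raw_frame, "format": "processed"}

def capture(sensor_read, storage, display):
    raw = sensor_read()          # 1. interface to the image sensor
    frame = dsp_process(raw)     # 2. process the image
    storage.append(frame)        # 3a. write the file to embedded memory
    display.append(frame)        # 3b. push the same frame to the on-board display

storage, display = [], []
capture(lambda: [0] * 16, storage, display)
print(len(storage), len(display))  # → 1 1 (each consumer received the frame)
```

The point of the sketch is simply that the display is one of two endpoints fed by the DSP, which is why the display interface can be changed without touching the capture side.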
Many of these DSPs use a legacy parallel RGB display interface, as that is what on-board displays have used since DSCs began appearing in the late 1990s. As astute readers know, the MIPI-DSI standard has become very prevalent in the smartphone display-size market (≤7″); so much so, in fact, that sourcing an RGB panel is becoming increasingly difficult for DSC OEMs. MIPI-DSI panels are cheaper and more readily available, and hence the need for an RGB-to-MIPI display bridge. So, we don’t touch the image sensor pipeline, only the display path after the DSP has done its job.
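To make "bridging" concrete, here is a toy model of the repacking such a bridge performs: it takes one scanline of parallel RGB pixels (clocked in with HSYNC/VSYNC timing on the DSP side) and wraps it as a MIPI-DSI long packet for the panel side. The data type 0x3E (packed pixel stream, 24-bit RGB 8-8-8) is from the MIPI-DSI specification; the ECC and checksum bytes are left as zero placeholders here, since computing them is beside the point.

```python
# Toy model of an RGB-to-MIPI display bridge's per-scanline repacking.
# ECC and checksum are zeroed placeholders; real hardware computes both.

def rgb_line_to_dsi_packet(pixels, virtual_channel=0):
    payload = bytearray()
    for r, g, b in pixels:
        payload += bytes([r, g, b])           # RGB888: one byte per component
    data_id = (virtual_channel << 6) | 0x3E   # VC in top 2 bits, data type below
    word_count = len(payload)                 # payload length in bytes
    header = bytes([data_id, word_count & 0xFF, word_count >> 8, 0x00])  # ECC=0
    footer = bytes([0x00, 0x00])              # 16-bit checksum placeholder
    return header + payload + footer

line = [(255, 0, 0)] * 4                      # four red pixels
pkt = rgb_line_to_dsi_packet(line)
print(len(pkt))  # → 18 (4-byte header + 12-byte payload + 2-byte footer)
```

Note that nothing here touches pixel values — the bridge reframes the same image data into a different electrical and packet format, which is exactly why the image sensor pipeline is left alone.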
Now, certainly, DSPs can be updated to include a MIPI display interface (and some already have). However, as with smartphones, these generational improvements don’t come cheaply or quickly.
The future of DSCs, especially Android-embedded ones, is a bit murky right now. Obviously, legacy DSCs generally weren’t designed to run Android. Camera OEMs will inevitably need to look to more generic applications processors (like those used in smartphones) for their Android needs. Image processing folks tend not to like those processors; while the image processing pipes contained within them are perfectly acceptable for ‘lower quality’ smartphone cameras, they pale in comparison to the capabilities of the image pipes found in DSC DSPs. Further complicating the Android-to-DSC move is that the two big players in the camera market, Canon and Nikon, have their own camera OSes, their own processing software, custom-developed DSPs with many years of image processing development behind them, etc…
Hopefully this answers some of your questions (while likely creating more)…
Thanks again for the question!