Saturday, June 21, 2014

Telirati Analysis #6: Samsung's Android Camera and the Future of Camera Software




Android is in a top-of-the-line camera now

A few days ago, Samsung introduced the Galaxy NX30. It runs Android: real Google-logo Android with the Google Play Store, not just an AOSP-based embedded Android used for the UI stack. And that means apps.

Unlocking the potential of high-end cameras to support new kinds of apps and new features in apps means getting access to RAW image data: the data directly from the sensor, before it is processed into a JPEG-compressed image.


Here comes a new kind of camera app

Unlike smartphone sensors, where a RAW image might be unusable before it goes through a lot of processing built into the camera, high-end camera sensors can be thought of as film with an adjustable ISO number. You can expect the sensor and the lens on a high-end digital camera to have the same fidelity as a sheet of film and a lens on a chemical-process camera. That unlocks serious photography and opens a market for apps for that kind of photography.


But there's a hitch

The hitch is that there appears to be no access to RAW image data. The raw parameter in the onPictureTaken callback dates back further (API 5) than I expected to find. However, it is also one of those things that no Android port I know of implements. It has become ignored by convention.
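For reference, here is a minimal sketch of how that raw parameter surfaces in the old android.hardware.Camera API. The class and method names (RawProbe, probeRaw) are mine, not Android's; on most devices the raw callback fires with null data, which is the convention I'm describing:

```java
import android.hardware.Camera;

public class RawProbe {
    // Request both RAW and JPEG callbacks from the pre-Camera2 API.
    // The raw callback has existed in the API for years, but most ports
    // invoke it with data == null rather than delivering sensor data.
    public static void probeRaw(Camera camera) {
        Camera.PictureCallback rawCallback = new Camera.PictureCallback() {
            @Override
            public void onPictureTaken(byte[] data, Camera cam) {
                if (data == null) {
                    // Typical: the port ignores RAW by convention.
                } else {
                    // Rare: actual sensor data, in a device-specific format.
                }
            }
        };
        Camera.PictureCallback jpegCallback = new Camera.PictureCallback() {
            @Override
            public void onPictureTaken(byte[] data, Camera cam) {
                // JPEG bytes are always delivered here.
            }
        };
        camera.takePicture(null, rawCallback, jpegCallback);
    }
}
```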

That's too bad, because in the case of the NX30, RAW access would enable a lot of camera software features and functionality: enough to keep camera app developers busy for at least one more product generation. It is possible I'm mistaken about this, but I don't have an NX30 to test on. Samsung, are you listening?


Android has lots of camera features now, but not for serious photography

The new-ish (API 14) camera API features are cool and easy to use in an app, but mostly point-and-shoot oriented. They enable programs to control what's in focus, among other useful features. Current-generation camera apps that offer effects are implemented by going from one JPEG image to another JPEG. That's OK for smartphone cameras and lo-fi results, e.g. sepia-toned lunch Instagrams, but falls very far short of what would be appropriate for a 20MP sensor on a top-of-the-line mirrorless camera.
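As an illustration of those point-and-shoot-oriented features, the API 14 focus-area controls let an app command what's in focus. This is a sketch, assuming a Camera instance that is already open and previewing; the TapToFocus class name is mine:

```java
import android.graphics.Rect;
import android.hardware.Camera;
import java.util.Arrays;

public class TapToFocus {
    // Ask the camera to focus on the center of the frame.
    // Per the API 14 documentation, focus-area coordinates run from
    // (-1000, -1000) at the top left to (1000, 1000) at the bottom right.
    public static void focusCenter(Camera camera) {
        Camera.Parameters params = camera.getParameters();
        if (params.getMaxNumFocusAreas() > 0) {
            Camera.Area center =
                    new Camera.Area(new Rect(-100, -100, 100, 100), 1000);
            params.setFocusAreas(Arrays.asList(center));
            camera.setParameters(params);
        }
        camera.autoFocus(new Camera.AutoFocusCallback() {
            @Override
            public void onAutoFocus(boolean success, Camera cam) {
                // success is true when focus locked on the requested area.
            }
        });
    }
}
```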


But you can get RAW on the memory card

Yes, you can get at RAW images on the memory card when you put the card in your PC, and that's essential for editing those images in digital darkroom applications. But that digital darkroom can move into the camera itself, spreading across various parts of the camera software: viewfinder image processing, post-processing, and alternative implementations of various camera features.
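To make the idea concrete, here is a minimal sketch of one darkroom step that could run in-camera: applying white-balance gains directly to RAW Bayer sensor values before any JPEG conversion. The class and method names are hypothetical, and an RGGB mosaic with 10-bit values is assumed:

```java
public class WhiteBalance {
    // values: row-major Bayer mosaic in RGGB tiling.
    // Returns a gain-corrected copy, clamped to the 10-bit range.
    public static short[] apply(short[] values, int width, int height,
                                float rGain, float gGain, float bGain) {
        short[] out = new short[values.length];
        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) {
                float gain;
                if (y % 2 == 0) {
                    gain = (x % 2 == 0) ? rGain : gGain; // R G row
                } else {
                    gain = (x % 2 == 0) ? gGain : bGain; // G B row
                }
                int v = Math.round(values[y * width + x] * gain);
                out[y * width + x] = (short) Math.min(v, 1023);
            }
        }
        return out;
    }
}
```

Running this kind of step on sensor data in the camera, before compression, preserves precision that a JPEG-to-JPEG effects pipeline throws away.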


RAW is just a start

There is a lot of potential in opening camera hardware to apps, in more-complete implementations of existing APIs, in technologies like RenderScript in the DSP-based camera processing chain, and in new app-level APIs:

  1. RAW image access is key to creating a new class of camera apps and processing software.
  2. Cameras have powerful DSPs. Apple recently bought a company that optimized image compression for high-frame-rate photography. There ought to be a whole category of DSP software for your camera that customizes high-performance image processing.
  3. Implementing DSP software on cameras could span multiple DSP platforms using RenderScript.
  4. New APIs could enable real-time access to the sensor to create novel viewfinder features, among other possibilities.
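As a sketch of the third point: a RenderScript intrinsic lets the same image-processing code target whatever processor the vendor's driver supports (CPU, GPU, or DSP). The SharpenStep class below is a hypothetical example using intrinsics that existed by API 17:

```java
import android.content.Context;
import android.graphics.Bitmap;
import android.renderscript.Allocation;
import android.renderscript.Element;
import android.renderscript.RenderScript;
import android.renderscript.ScriptIntrinsicConvolve3x3;

public class SharpenStep {
    // Apply a 3x3 sharpening convolution with a RenderScript intrinsic.
    // The script is dispatched by the vendor's RenderScript driver, which
    // is what would let one implementation span multiple DSP platforms.
    public static Bitmap sharpen(Context context, Bitmap input) {
        RenderScript rs = RenderScript.create(context);
        Allocation in = Allocation.createFromBitmap(rs, input);
        Allocation out = Allocation.createTyped(rs, in.getType());
        ScriptIntrinsicConvolve3x3 convolve =
                ScriptIntrinsicConvolve3x3.create(rs, Element.U8_4(rs));
        convolve.setCoefficients(new float[]{
                 0, -1,  0,
                -1,  5, -1,
                 0, -1,  0});
        convolve.setInput(in);
        convolve.forEach(out);
        Bitmap result = Bitmap.createBitmap(
                input.getWidth(), input.getHeight(), input.getConfig());
        out.copyTo(result);
        rs.destroy();
        return result;
    }
}
```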
Let's see Google, Samsung, Sony, and others who are bringing Android into professional and prosumer-grade photography use the potential of Android to open new possibilities for apps and new forms of photographic expression.