Quick question: how many sensors does your smartphone have?
More than you think: ambient light, proximity, camera(s), microphone(s), touch, position (not just GPS, but WiFi and Cellular), near field communications, Bluetooth, accelerometer, magnetometer, gyroscope, pressure, temperature, and humidity. There may be even more that I don't know about, brewing their way to the next cup of products. And yes, those are all technology sensors, as they're sensing things (sometimes amongst other duties).
So how many sensors does your camera have? Not as many, though again quite a few, and probably more than you realized. Indeed, some of the Android-based cameras that are starting to pop up may even match your smartphone in sensor count.
So why is it that we use those sensors so poorly? Yes, we use them poorly: in our not-so-smart devices, sensors tend to be single-purpose. Think about it: all your cameras now have video in them, so they have a microphone. Why can't you have a Clap-On-style shutter release?
The reason we're not getting greater than the sum of the parts from these sensors is that each sensor is being put in there for one specific feature: the microphone is there only to record the audio track for video.
So what needs to happen to bring all those in-camera sensors together and create something much more interesting? First, you need imagination. You need to think creatively about what happens if you use A, B, and C simultaneously: does that net you a possibility you hadn't considered before? Sure it does. Workflow is one of our banes, but what if we had a camera that allowed us to enter some parameters prior to shooting (EXIF fodder) and then used that when something was sensed? Gee:
WHEN (WIFI = 'BYTHOM')
    IF (EXIF(JPEG_Transfer) = TRUE) AND (PendingImages = TRUE)
        TRANSFER_IMAGES
    ENDIF
END WHEN
That's a grossly simple example that relies upon the WiFi system as a "sensor of network." Another simple example:
IF ACCELEROMETER = 0 THEN TAKE_PICTURE
Hmm, can we detect subject and camera motion? Sure. Can we detect subject motion, camera motion, and remote release (via Bluetooth)? You betcha. Can we detect subject position and motion, camera position and motion, and remote release? Still not a problem. Wait, how did we get subject position? From their smartphone ;~).
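To make the idea concrete, here's a minimal sketch of what that kind of sensor-rule scripting might look like. Everything in it is hypothetical: no current camera exposes an API like this, and the Camera class, its sensor stubs, and the rule logic are all invented for illustration.

```python
class Camera:
    """Stand-in for a scriptable camera; all sensor reads are stubbed."""
    def __init__(self):
        self.pending_images = ["DSC_0001.JPG", "DSC_0002.JPG"]
        self.jpeg_transfer = True      # the "transfer JPEGs" setup flag
        self.accelerometer = 0.0       # 0.0 = camera is perfectly still

    def visible_wifi(self):
        return {"BYTHOM"}              # pretend the home network is in range

    def transfer_images(self):
        sent, self.pending_images = self.pending_images, []
        return sent

    def take_picture(self):
        return "DSC_0003.JPG"

def run_rules(cam):
    """Evaluate the two example rules once and report what fired."""
    fired = []
    # Rule 1: WHEN (WIFI = 'BYTHOM'), IF transfer flag set AND images pending
    if "BYTHOM" in cam.visible_wifi():
        if cam.jpeg_transfer and cam.pending_images:
            fired.append(("transferred", cam.transfer_images()))
    # Rule 2: IF ACCELEROMETER = 0 THEN TAKE_PICTURE (camera is motionless)
    if cam.accelerometer == 0.0:
        fired.append(("shot", cam.take_picture()))
    return fired

print(run_rules(Camera()))
```

The point isn't the particular syntax; it's that a handful of sensor reads plus a few conditionals is all the "glue" these features would need.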
Beyond the fact that we don't have programming capabilities on our cameras (and camera firmware tends to be hard coded, too), there are other reasons why we're still far from stringing all those sensors together. First, many of the sensors themselves are hardwired to something. There's poor timing coordination and synchronization between sensors, the sensors use different time scales, and we have little sophisticated control over the sensors themselves (heck, we only just got 20-step audio level control for the microphones on some Nikon bodies).
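The "different time scales" problem deserves a quick illustration. Suppose one sensor stamps its readings in milliseconds since power-on and another counts ticks of a 32.768 kHz crystal that started half a second later; before you can correlate their events, both have to be mapped onto one reference clock. The sensor names, rates, and offsets below are invented for the sketch.

```python
def to_reference_seconds(stamp, tick_rate_hz, epoch_offset_s=0.0):
    """Convert a raw sensor timestamp to seconds on a shared reference clock."""
    return stamp / tick_rate_hz + epoch_offset_s

# Accelerometer: millisecond counter. Gyro: 32.768 kHz tick counter whose
# epoch started 0.5 s after the accelerometer's counter did.
accel_events = [(1000, "still"), (1250, "moving")]          # (ms, state)
gyro_events  = [(16384, "pan-left"), (24576, "pan-stop")]   # (ticks, state)

# Merge both streams onto one timeline, in reference-clock seconds.
timeline = sorted(
    [(to_reference_seconds(t, 1000.0), s) for t, s in accel_events] +
    [(to_reference_seconds(t, 32768.0, 0.5), s) for t, s in gyro_events]
)
print(timeline)
```

Until the camera maker does this bookkeeping for every sensor in the body, "use A, B, and C simultaneously" stays a slogan rather than a feature.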
Yet when I look at what's being done with smartphones and tablets at the moment, I see the opposite: more coordination of sensors and more recognition that you can combine them to create something more interesting (many of the augmented reality ideas require multiple sensors coordinated together). Heck, I just discovered that my quadcopter has orientation ability (it comes from combining GPS and compass): no matter which way the copter (and its camera) is pointed, I can have my controller joysticks programmed to always provide movement relative to my position (i.e., the chopper/camera is pointed left but I push the joystick forward: does the chopper go left or forward? Well, I can control that ;~).
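The math behind that copter trick is nothing exotic: the controller rotates the stick command by the copter's heading so "forward" on the stick always means "away from the pilot," whichever way the nose points. Here's a hedged sketch; the function name and sign conventions are my own, not any flight controller's actual API.

```python
import math

def pilot_relative_command(stick_x, stick_y, copter_heading_deg):
    """Rotate a stick vector (pilot's frame) into the copter's body frame.

    stick_y > 0 means "push forward"; heading 0 means the copter faces the
    pilot's forward direction, positive headings turn it counterclockwise.
    """
    theta = -math.radians(copter_heading_deg)  # frame change: rotate by -heading
    body_x = stick_x * math.cos(theta) - stick_y * math.sin(theta)
    body_y = stick_x * math.sin(theta) + stick_y * math.cos(theta)
    return body_x, body_y

# Copter pointed 90 degrees to the pilot's left, pilot pushes straight
# forward: the body-frame command comes out as "roll right," so the copter
# still travels away from the pilot.
bx, by = pilot_relative_command(0.0, 1.0, 90.0)
```

A GPS fix and a compass reading are both single-purpose sensors; one small rotation combines them into a genuinely new capability. That's exactly the kind of sensor fusion cameras aren't doing.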
There's a ton of technology in our cameras (did you know that the D4 has a Linux-based computer system in it just to run the Ethernet port?), but it's all being underutilized. Way underutilized. Want to start a camera revolution? Better use the technology that's already available.
I'll have more to say about future photography technology next week in my post CES/PMA commentary.