(news & commentary)
At today’s annual developer conference Apple gave a very small preview of where they’re going with photo handling, both for iOS and OS X (and, via the Web, for Windows users as well). Details are still sketchy, partly because the final iCloud and Macintosh sides of the product won’t be available until early next year, partly because the iOS components have just entered beta, and partly because Apple is holding a few cards close to the vest, keeping some components under NDA until they’re ready to launch. Still, what little Apple showed ought to worry the camera companies—and maybe a few software companies, as well.
Basically, Apple wants you to store your images in the cloud. To that end, Apple introduced new pricing for iCloud storage (20GB for US$1/month, 200GB for US$4/month). That’s still probably a bit pricey for the serious photographer, but it’s now practical for the iPhone camera user to be fully iCloudified. Cloud storage isn’t anything new, of course, and everyone wants to be your photo storage system up in the cloud, including Nikon with Nikon Picturetown (oh, wait, that’s now Nikon Image Space).
Apple’s approach is much more refined, though. As they demonstrated in other ways at the developer conference today, they are moving towards a world where everything just seamlessly moves between all your Apple devices and software products. Literally seamlessly, as in you can start a task on one device and finish it simply by moving to another. That means that phone calls can be initiated and received on your Macintosh as well as your phone. While they didn’t demonstrate this, how much do you want to bet that you’ll eventually be able to take pictures remotely with your iPhone from your Mac?
As I described in an article on gearophile over a year ago (and earlier still in articles about the concept I call “nests”), you can kind of do the cloud thing today using cobbled-together parts. Apple isn’t cobbling. They’re integrating. They’re even helping integrate some of the cobbles I wrote about. For the more casual photographer, that’s exactly what they want: one way of doing things, one place to find things.
The funny thing is that I doubt the Japanese camera companies are scrambling to be part of that system. Which means they’ll lose more of the casual photographers to Apple (and eventually to Google, Microsoft, and Amazon as they implement similar approaches). Hint: if the camera companies could integrate their cameras into Apple’s system seamlessly, they’d sell more cameras ;~).
Apple is pushing the new Photos editing capabilities they introduced today with iOS 8 through the entire Apple ecosystem, including OS X. That means that all your Apple devices (and again, Windows via a Web connection) have access to all your other devices’ photos and will work with them in pretty much the same way. In practice, you’ll take photos with your iPhone and then edit them on your iPad. Or your Mac. Using the same controls on all the devices, apparently (more on that in a bit). And vice versa: if you’ve got a photo you created and worked on with the desktop Photos app, you can still edit it on your iPhone or iPad, whichever is handy. That’s important, because the new system apparently no longer puts resolution limits on the images. And editing is non-destructive.
Photos for the desktop—which will arrive early next year—seems to be a new take that sits somewhere between iPhoto and Aperture. It has iPhoto’s simplicity, a bit of Final Cut Pro’s scrub-and-find technology, and some new higher-end editing capabilities that probably drifted down from Aperture. The demonstration I’ve seen is highly intriguing and suggests that much more will be built on this new UI. In fact, I wouldn’t be surprised if we eventually got a new Aperture that sits on top of Photos.
Apple also introduced an API for the cameras built into their iOS devices, which allows other apps to do a number of things, including controlling white balance and other camera parameters. Yep, we’re getting closer to programmable cameras every day; it’s just that the programmable cameras are the ones built into smartphones. Even more interesting is that non-destructive editing is now part of the iOS API scene, too, meaning that we’re going to get “plug-ins” for cameras and photo editors in the iOS and Mac environments that work across all the systems in an integrated way. Maybe. One question will be how far the third-party software companies really want to play in Apple’s playground. You really need a function that Apple hasn’t implemented yet if you want to stand out in this new world Apple is designing, because Apple is going to control the UI.
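To make the non-destructive editing idea concrete, here’s a minimal sketch of how such a system can work in principle: the original pixels are never touched, and edits are stored as a replayable list that any device can render or discard. Everything here (the class, the method names, the luminance-only “pixels”) is illustrative, not Apple’s actual API.

```python
# Hypothetical sketch of non-destructive editing, not Apple's actual API.
# The original pixel data is never modified; edits are recorded as a stack
# of named adjustments and applied only at render time, so any device that
# understands the edit list can re-render (or fully revert) the image.

class Photo:
    def __init__(self, pixels):
        self.original = list(pixels)  # stand-in for image data, values 0..1
        self.edits = []               # the non-destructive edit stack

    def add_edit(self, name, fn):
        self.edits.append((name, fn))

    def rendered(self):
        # Fold the edit stack over a copy; the original stays untouched.
        out = list(self.original)
        for _name, fn in self.edits:
            out = [fn(v) for v in out]
        return out

    def revert(self):
        # Reverting is just dropping the edit list.
        self.edits.clear()

photo = Photo([0.2, 0.5, 0.8])
photo.add_edit("brighten", lambda v: min(v + 0.1, 1.0))
photo.add_edit("contrast", lambda v: (v - 0.5) * 1.2 + 0.5)
result = photo.rendered()
photo.revert()  # the original pixels were never changed; edits are gone
```

The point of the design is that the edit list, not the rendered pixels, is what syncs between devices, which is why an image edited on a Mac can still be re-edited from scratch on an iPhone.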
Apple seems to have taken photo editing to a new level themselves. While their demo at the developer conference was relatively quick and simple, my takeaway from what I’ve seen is that Apple was looking for a way to get past the dozen sliders needed to do a complex shift of tonalities and colors, replacing them with single, user-understandable sliders. They’re building “intelligent” controls that know that if you want a “brighter” image, that doesn’t mean just increasing pixel values across the board. A bright (or dark) image tends to lose vibrance, for instance, so why wouldn’t a brightening tool also try to keep color punchy? As with many things Apple, the simple “does a lot with one widget” approach also has an under-the-covers mode where you can get at those individual parameters if you really feel you need to. The net effect, though, is that most people are going to use only a very few controls to do some very sophisticated, intelligently designed things. The rest of us can dig deeper and tweak to our heart’s content.
In summary, the changes to Photos and the new editing capabilities do something very similar: they take a lot of UI out of the UI. By that I mean a simpler, more direct UI without a lot of clutter, plus full integration into Apple’s cloud offering across all devices. Photos live in one place and can be quickly found and manipulated by any of your Apple devices with a UI that isn’t a hundred buttons and sliders spread across multiple windows.
When you compare this to the world that Sony keeps trying to build, you have to scratch your head. Where is that same level of integration and UI reduction across their cameras, smartphones, TVs, (former) personal computers, and their media? It isn’t about whether they can connect, but how they connect. Apple gets that, and today’s presentation shows that they’re working on it in the photography realm. Some pieces are in beta today; the rest will appear in six to eight months.