After my trip to the Netherlands and Sweden last week, I’ve been thinking about the casual photography workflow experiments I tried during the trip. I brought just the Leica Q2 and my iPad Pro. No laptop, no interchangeable lens camera, just the 28mm wide angle camera and the SD card reader for the iPad’s USB-C connector.
I’d like to think that I’m still here writing in that spirit of using imagination as the path to better decision making. I’m way less enamored with the technology of computer simulation, but have to recognize that 20 years on, AI and machine learning have brought a lot of that promise into the world, yet create algorithms that tend to diminish our active imagination rather than augment it. When my phone can make better images than I can, I have to admit a kind of defeat before ascendant computing.
I had hoped to create some longer form writing to mark the occasion, but instead I’ve simply renewed the writing and image posting habit, putting out a pretty steady stream of posts. Looking back at the initial months of the Edit This Page sites, I’m impressed at how much like Twitter it was. Pointing at content, making short remarks. Never thinking to create evergreen content to build a search audience.
Every smartphone is a GPS device. Every smartphone is a camera. So the images we save are all geotagged; the location is saved as metadata as part of the image file.
You might think that for competitive reasons, camera manufacturers would put one of those cheap little GPS chips in their cameras to enable that $3000 full-frame camera to geotag like a smartphone. You might think so, but you’d be wrong. Most of the high-end, full-frame cameras from Nikon, Sony, Canon and Leica depend on a smartphone connection to geotag images. Mostly you’ll find GPS chips in lower end compact cameras. Leica had GPS in their SL full-frame mirrorless, but removed it in the just-released SL2.
Why? As far as I can tell, GPS chips are just too power hungry to run continuously in cameras. Smartphones get GPS fixes at intervals and can use cell tower info to figure out where they are. So it makes sense for camera manufacturers to rely on a smartphone app to pass a GPS location for geotagging. Plus geotagging has never been a feature of these cameras, so unless you look for it, it’s not missed. Most users probably wouldn’t use it, in fact.
Since my casual iPhone images are all geotagged, I’ve looked at a few approaches for geotagging images from my current group of cameras from Leica and Nikon. For now I’m making do with inconsistent apps and manual input. But it’s clear that Bluetooth Low Energy (BLE) connections between camera and phone have become the favored solution.
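Under the hood, the geotag is just a handful of EXIF fields: the spec stores latitude and longitude as three rational numbers (degrees, minutes, seconds) plus a hemisphere reference tag. A minimal sketch of the conversion to the decimal degrees that mapping apps expect (the sample coordinate is illustrative, not from an actual image):

```python
from fractions import Fraction

def dms_to_decimal(degrees, minutes, seconds, ref):
    """Convert EXIF-style degrees/minutes/seconds plus a hemisphere
    reference ('N'/'S'/'E'/'W') to signed decimal degrees."""
    dec = Fraction(degrees) + Fraction(minutes) / 60 + Fraction(seconds) / 3600
    # Southern and western hemispheres are negative in decimal notation.
    if ref in ("S", "W"):
        dec = -dec
    return float(dec)

# Roughly Amsterdam's latitude, as a camera might record it:
print(dms_to_decimal(52, 22, Fraction(759, 100), "N"))  # ~52.368775
```

Using `Fraction` mirrors how EXIF actually encodes the values, as ratios of integers rather than floats.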
I’m now finishing my fourth year using the Hobonichi Techo as my daily journal. The Hobonichi is a one page per day, fountain pen friendly journal from Japan. As I wrote last year, I open it every morning to plan out the day, record a few notes and capture key actions for the day. Sometimes it’s a shopping list, sometimes phone calls or appointments that need to be made. If the daily pages are my roadmap for the day, the big monthly calendars at the front are my longer-range planning tool, knowing when trips, holidays and other blackout dates are going to be coming up so I don’t make mistakes about committing to being somewhere or taking on a project. As much as I’ve tried over the years to use digital systems for this, for me paper is a better way to see what I’m doing and when at a glance.
I’ve made two changes this year. One is adding in just a few photographs. And the other is becoming a bit more systematic by including some Bullet Journal conventions into my daily jottings.
Our phones opened a new era in photography. We’re all able to take quality pictures and transmit them instantly to friends and family through social media. As a lifelong serious image maker, I’ve let my image making drift away to my new casual iPhone photostream, reserving my dedicated camera equipment for deliberate, mindful creation of quality images that I mostly share on Flickr.
Now I’ve been writing about creating a cloud-based workflow: bringing images into the iCloud Photos database for sharing socially, using the iPad as a tool for image transfer from card to cloud, and mobile post-processing of images on the iPad. Is there a camera that can also provide the capabilities of the iPhone camera to feed images to the photostream?
An iPhone is about a 28mm equivalent, so it would make sense to use a 28mm lens on a full frame camera for the same casual photography. And so I’ve bought the Leica Q2 with its 28mm fixed lens as my casual photography camera.
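The “28mm equivalent” arithmetic is just the lens’s actual focal length scaled by the sensor’s crop factor relative to full frame. A quick sketch (the crop factor values here are the usual approximations, not measured figures):

```python
def equivalent_focal_length(actual_mm, crop_factor):
    """Full-frame-equivalent focal length: the actual focal length
    multiplied by the sensor's crop factor (full frame = 1.0)."""
    return actual_mm * crop_factor

# The Q2's full-frame sensor needs no scaling: 28mm is 28mm.
print(equivalent_focal_length(28, 1.0))

# An APS-C body (crop factor ~1.5) would need roughly a 19mm lens
# to frame the same 28mm-equivalent field of view:
print(round(28 / 1.5, 1))
```

This is why the Q2’s fixed 28mm lens lines up with the iPhone’s wide camera: both frame roughly the same field of view.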
I was impressed with the initial release of Photoshop for iPad as a mobile companion to the desktop workflow. I dismissed the criticism regarding lack of features as premature, as Adobe seemed to be planning a rollout of features in a measured fashion. I’m convinced that the iPad and cloud-based image storage are the future of our photographic workflow, and Adobe is working on being a player in the space. Lightroom and Photoshop will be available as mobile tools, adapted to the mobile environment.
In a blog post, the Adobe iPad Photoshop team has provided a bit of a preview of what they call “The Journey”. I’ll admit that I’ve been around long enough to appreciate just how long these journeys take. Way longer than anyone wants. In June 2007, the iPhone was introduced. There was no app store. Just a few native apps and the browser. The first iPad came three years later, and appeared to be a big iPhone. It’s the iteration over years that builds capacity, and I find it easiest to follow along on these journeys, building skills as the capabilities of the system improve. Photoshop on the Mac is my native language for image processing. I can translate it to several other workflows pretty well, knowing where I’m going and learning how a new tool may work to get me there.
The promised addition of curves will provide an essential part of the workflow because I use curves as my method of selectively changing tone and contrast. Pressure sensitivity for masking will also be a big step toward reproducing my Mac workflow. Other steps in my Photoshop workflow go through the Nik set of filters, unlikely to be integrated into an iPad workflow anytime soon, so the finished product will still come from the full digital studio, not the mobile setup on the iPad.
Nevertheless, this looks like the road ahead and should help get the pictures out and being seen here and on social sites.
My photographic workflows have long been split between serious and casual. Really a split between my camera workflow and my iPhone workflow. I’m discovering that the key to bringing them together is using Apple’s native Photos app as a unified image database.
Once images are in the Photos database they are available everywhere, not only backed up on Apple’s servers but also downloaded to my desktop Mac, which backs them up to a local hard drive plus a cloud repository. A few years ago, these services were unreliable, but now we rely on them to store the photographs that document our lives, taken with our phones and shared via social media.
It’s time we merge the images taken by our cameras with our phone photography in the cloud.
I see that my photographic workflow is going to be moving to the cloud. I suspect it may be through Apple’s Photos app or maybe Lightroom, but one way or another, I doubt my images will live on my Mac any more than my music does. I can now stream thousands of hi-res albums via Amazon Music anywhere and anytime. My photos are joining them there.
For years my iPhone has been my most used camera because of convenience. With each new release of the hardware, the image quality improves, but that only makes me feel a bit better about using the phone to capture scenes from travel and family events. No, the real draw is the ability to capture images and immediately share them, either via message or social media. I think of the images as living on the phone, but of course the truth is that they live on Apple’s servers and I’m just seeing previews on the phone. The problem I’ve had is that images from my cameras can’t get there easily.
I’ve tried photo editing on the iPad a few times over the years, but never with great success. I have a well established workflow of ingesting RAW images in CaptureOne (which replaced Apple’s Aperture) followed by conversion to TIFF. Then post-processing generally gets done in Photoshop, with assists from Nik programs. I generally just save the finished image back to CaptureOne, plus export a JPEG for use on Instagram and Flickr.
For critical images I follow Vincent Versace’s guidance to use the camera manufacturer’s RAW conversion utility. This advice has only become more important as computational photography has become a ubiquitous part of digital imaging, not only in smartphones, but in cameras as well.
For the iPhone, my most used camera, RAW conversion is done on the phone itself. At this point, the latest digital cameras have conversion computers built in as well, so their jpeg output is pretty close to the conversion by the PC software. As sensor dynamic range has improved and exposure is being calculated right off the sensor, it’s occurred to me that the preview on the back of the camera is pretty close to an optimal conversion anyway.
I’ve long wanted a way to combine the iPhone and camera workflow to avoid the problem of an “Art” database on my Mac and a “Life” database in Photos. I’ve got travel and family images made with a camera that don’t get into the Photos database for digital sharing. Sometimes I remember to export JPEGs and import into Photos, but it’s not a regular part of the workflow.
It’s hard to convey how small the online world felt back in 1999 when this blog started. There had been online places like bulletin boards (BBS’s), which were text only discussion sites, America Online, and then early web services like Gopher, all of which quickly disappeared once HTTP and the World Wide Web rose and swept them away. In those early days, discovery of sites and repositories was based on human networking, via links and pointers.
I remember so clearly the first time I tried out a new search service called “Google” which seemed magical in its ability to find anything that had been put up on the web. But as websites proliferated, there was still a vital part for people to play in finding and pointing to interesting content. Communities grew up, as did index sites and link blogs which were hubs to consult to find new contributions to the online world.
Looking back through 20 years of internet history, it seems to me that our early social networks have grown into a huge industry of “Social Media”. We are fed streams by Twitter, Instagram and Facebook by algorithm instead of relying solely on human curated writing and links. The thoughtful writing hasn’t disappeared, since thoughtful people haven’t disappeared. Sure, the distraction is way greater, and we’re beginning to learn how to regain control of our attention in a world online where we are all so connected. There are amazing improvements to the tools I have available to create content. I have publishing platforms available on computer, iPad and iPhone that address audiences large and small, public and private. I have image making devices I can carry in my pocket that take photos in near darkness.
To what end? I guess the same purpose that propelled the creation of this personal journal 20 years ago. Learning how to decide. Deciding who I want to be in this particular moment.