2004 doesn’t seem that long ago. Yes, someone born in that year would now be in Grade 3. But a lot of what existed then still exists today, or at least that’s the impression many of us would have.
And yet, in technology terms, 2004 is an achingly long way into the past. Yes, people had laptops and cell phones; on one hand, their functions seem very similar to those of current models, and on the other they are very, very different. While the web and email were ubiquitous, for example, social media did not exist as we know it today. Facebook and Flickr were only founded in 2004. YouTube would not exist for another year, and Twitter wouldn’t emerge for another two.
This reality was driven home for me (hard) when I recently picked up, of all things, a photography magazine published in March 2004. The photos could easily have been taken today. The newly released technology looks, in form, much like what we can buy now. The specifications, however, have changed enormously. A ‘first look’ at a cutting-edge camera showed a high-resolution screen and viewfinder paired with a four-megapixel sensor (the megapixel count being the number of ‘dots’ that make up the picture). The newly released iPhone 4S has an eight-megapixel sensor, while professional cameras currently range up to 24 megapixels, and will no doubt soon get much larger.
It’s on the professional end, however, that things got really, really interesting. Every professional photographer in the issue shot on film. Some used medium format, while others relied on 35mm, but all still operated in a world where pictures weren’t pictures until the exposed images were taken to a darkroom, processed in chemicals and subsequently scanned or printed. Part of the reason at the time was no doubt personal preference (I’m also on record as saying that I would never shoot digitally, and have long since eaten my words), but it was largely a matter of image quality. In 2004, the highest-quality images still required film. Since then, this has changed radically, and virtually every professional photographer (at least those who shoot for publication) now works in a digital environment. Photos are instantly viewable, flexibly editable and readily emailed, uploaded or shared.
This raises an interesting challenge. On one hand, it underscores the rate of technological change in the world and bears out Moore’s Law (which, in its popular form, holds that computing capacity doubles roughly every 18 months). Our perception of technological change, however, evolves much more slowly. We expect technology to be usable in much the same way as we always have (but it should be better, faster, more reliable, more flexible and in all ways cheaper!).
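For a rough sense of how quickly an 18-month doubling compounds, here is a back-of-the-envelope sketch in Python. Treating sensor megapixels as if they tracked Moore’s Law is purely an illustrative assumption (the law really describes transistor counts), and the seven-year span is simply the gap between that March 2004 magazine and today.

```python
# Back-of-the-envelope sketch: compounding an 18-month doubling period.
# Assumption (for illustration only): the 4-megapixel sensor from the 2004
# magazine is treated as if it grew at a Moore's Law pace.

DOUBLING_PERIOD_MONTHS = 18

def doublings(months_elapsed: float) -> float:
    """How many doubling periods fit into a span of time."""
    return months_elapsed / DOUBLING_PERIOD_MONTHS

def projected_capacity(start: float, months_elapsed: float) -> float:
    """Starting value compounded by one doubling per period."""
    return start * 2 ** doublings(months_elapsed)

months = 7 * 12  # roughly March 2004 to late 2011
print(f"Doublings in {months} months: {doublings(months):.1f}")                 # ~4.7
print(f"4 MP compounded at that pace: {projected_capacity(4, months):.0f} MP")  # ~102
```

The naive projection lands well past the 24-megapixel cameras mentioned above, which is itself a reminder that Moore’s Law describes chip capacity rather than every spec on a product sheet.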
As technology becomes more advanced, however, the changes it requires of us in using it become much more complex and demanding. For a professional photographer, converting from film to an all-digital workflow, for example, is a significant undertaking. Not only is there the expense of new equipment; everything from the picture-taking process (no matter how similar it feels) to the actual processing, printing and publishing of the results is different, and requires the development of new knowledge, skills and – most importantly – habits. This kind of radical shift requires (almost) abandoning what we knew before, and being willing to – quite literally – start over again. As we get older, this kind of shift becomes harder and harder to make.
Technological change isn’t going to slow down. The possibilities that technology in all its forms offers are enormous. Without knowing the exact circumstances of how it will occur, there is little question that technology will continue to shape our lives in ways that we cannot fully contemplate today. The big question is whether we’re up to changing ourselves sufficiently to keep up with the changes in technology. That will be the much harder journey.