On Friday afternoon I’m going to be presenting at the 2013 Silicon Beach event. Unless I have a massive change of heart between now and then, the following is roughly what I’m going to be saying. If you’re attending the event, really, really, this is what I’m going to be saying, so don’t read the following unless you intend to slope off for a beer in the sun on Friday afternoon. You can find the slides here.
seven stories – Silicon Beach 2013
1. the physicist
The chap on the left of the photo is my grandfather, Bertie. He was a physicist, born in July 1913. On the day that many in the country were celebrating the birth of the latest heir to the throne, my family were commemorating what would have been his 100th birthday in our typically low-key way. Bertie sadly wasn’t around to see it, having had an unfortunate death-related incident back in 2004.
After graduating in 1936 from Queen’s University in Belfast, Bertie joined the General Electric Company, and for the first part of his career spent his time developing analogue computing devices to help things explode at the right time. It wasn’t a particularly pleasant thing, but was a necessity in the time leading up to and during the Second World War.
After the war, his career moved into electronics, where he designed and developed the Marconi DET22 thermionic valve (a predecessor of the transistor, which in turn begat microprocessors), and then by the 1960s he was involved in telecommunications. Bertie designed circuits at Goonhilly Downs that carried the first couple of decades of transatlantic television content, and then in the early 1970s he travelled out to the central southern African state of Zambia to help project manage and run the introduction of their first satellite earth station. Most of my earliest memories are of the trip that my family took to visit Gran and Grandad in Lusaka in the summer of 1975, and his work there was commemorated in a number of ways by the Zambian state, such was the significance of their connection to the satellite communications network.
It’s sometimes very easy for us to get caught up in the idea that we are witnessing unprecedented technological change in our era. But think for a moment about the changes that Bertie saw in his lifetime: cars went from a plaything to mass transportation; the telephone, whilst invented in the late 1800s, went from niche business tool to mass communications medium; Marconi’s invention of radio became the first electronic mass medium in the 1920s; cinema went from circus side show to a primary source of art and news; television was invented and rose to become the dominant medium; and digital computing went from a handful of room-sized devices to ubiquity.
All of those inventions changed the world – although, as The Economist put it in an article about innovation earlier this year, did even that tranche have as big an impact on human society as the invention of the flushing lavatory? And then compare all of those to checking in on Foursquare.
It’s not that what we are doing today isn’t important or impactful. It’s just that sometimes we need to keep in mind that the changes Bertie saw in his lifetime were often far more profound than another service on the Web, or another app on a smartphone. He also lived in a time when pure science and engineering could put an invention into the market that would change the world around it; much of our innovation today is much, much more people-centric, and we ignore those people at our peril.
2. the programmer
We should also keep in mind that the birth of computing comes from that same logical, scientific, mechanistic tradition. The first programmers were either men in white coats, or men in uniforms, and systems were designed using approaches that stemmed back to the scientific management trends that were popular in the early days of Bertie’s life.
FW Taylor, who inspired Henry Ford amongst others, saw the world as a series of interconnected processes, where there would be one best way to perform each step. This thinking (a logical extension of the work of Adam Smith) led to production-line manufacturing. Much of computing in the past 40 years has been developed on the same approach: take a process, automate it, treat people as providers to the system. In the factory, that led to workers subservient to the production line; in the office, it led to workers subservient to the system; for the customer, it leads to disaster.
The supermarket self-checkout for me sums up everything that is wrong with the scientific, people-unfriendly and almost autistic approach to programming computer systems. Self-checkouts treat the customer as a mere machine to scan and pack items, and a data source for authorising the finances. The ones in branches of WHSmith reach a new low by trying, towards the end of your interaction, to get you to upsell yourself a bar of unwanted chocolate. Barked orders and empathy-less interaction do not make for a great user experience.
3. the web developer
Something interesting, however, happened in the late 1990s. And my hunch is it happened pretty much by accident.
With the first dot com boom we saw the emergence of the Web Developer. Often a fusion of design and light programming, the platforms on which early web applications were based had a very interesting feature that was pretty much unheard of in generations of computing before: the log file.
The nature of the way in which web servers and web browsers work is such that an awful lot of information about how someone is actually using a website can be captured – actions by the user generally request something of the web server (a file, an image, a page, a data query and so on), and every one of those requests and responses can be logged. Tools to analyse those log files soon became available, and whilst there were some distractions as people chased metrics simply because they were easy to measure, for the first time at scale we saw software development being driven by feedback from the real world use of that software.
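For the technically minded, the reason the log file was such a gift is that every line is trivially machine-readable. A minimal sketch in Python of the kind of analysis those early tools performed – the sample lines and field layout here assume the classic Apache Common Log Format, and the addresses are made up:

```python
import re
from collections import Counter

# Hypothetical sample lines in Apache Common Log Format.
LOG_LINES = [
    '192.0.2.1 - - [10/Jul/2013:13:55:36 +0100] "GET /index.html HTTP/1.1" 200 2326',
    '192.0.2.2 - - [10/Jul/2013:13:55:40 +0100] "GET /about.html HTTP/1.1" 200 1204',
    '192.0.2.1 - - [10/Jul/2013:13:56:02 +0100] "GET /index.html HTTP/1.1" 200 2326',
]

# Pull the requested path and the status code out of each line.
LOG_PATTERN = re.compile(r'"(?:GET|POST) (\S+) HTTP/[\d.]+" (\d{3})')

def page_counts(lines):
    """Count successful (2xx) requests per path."""
    counts = Counter()
    for line in lines:
        match = LOG_PATTERN.search(line)
        if match and match.group(2).startswith("2"):
            counts[match.group(1)] += 1
    return counts

print(page_counts(LOG_LINES))  # Counter({'/index.html': 2, '/about.html': 1})
```

A few dozen lines like this, run over millions of requests, is essentially what the first generation of web analytics tools did.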
A/B testing – a way of refining a design on the basis of how different versions of a service performed and the impact they had on the user population – rose to prominence, alongside more advanced techniques like eye-tracking. Listen to companies like Google and you will repeatedly hear the refrain that they are using data to define their software development.
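Mechanically, A/B testing is simple – the hard part is asking the right question of the data. A sketch of the core steps, assigning users to a variant and comparing outcomes (the bucketing scheme, names and numbers here are illustrative assumptions, not any particular vendor’s tool):

```python
import hashlib

def assign_variant(user_id: str, variants=("A", "B")) -> str:
    """Deterministically bucket a user: hashing the id means the
    same user always sees the same variant across visits."""
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

def conversion_rates(outcomes):
    """outcomes: list of (variant, converted) pairs logged by the site;
    converted is 1 if the user did the thing we hoped for, else 0."""
    rates = {}
    for variant in {v for v, _ in outcomes}:
        results = [c for v, c in outcomes if v == variant]
        rates[variant] = sum(results) / len(results)
    return rates

# The same user always lands in the same bucket.
assert assign_variant("user-42") == assign_variant("user-42")
```

In practice you would also want a significance test before declaring a winner – which is exactly where chasing metrics because they are easy to measure becomes a trap.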
It leads to better software – from some perspectives. But the challenge with this data-led approach is that it is no more empathetic by design than the scientific management of old; something (the data) still reigns over someone (the user). And that means you can spend your time refining stupid: Gmail, for example, has undoubtedly refined the user experience for how we manage email overload, but it’s not addressing the core issue – that email doesn’t really seem to work very well as more and more of us feel enslaved to it.
4. the designer
Within the Web revolution something else happened – the rise of graphic, product and industrial design into the world of computing technology.
It seemed that for a long while computers and their operating systems weren’t so much designed as just happened. Apple in Steve Jobs’ first tenure probably challenged that status quo as much as anyone with the introduction of the Lisa and then the Macintosh, but it was under his second tenure, under the design eye of Jony Ive, that we saw computing as a desirable, beautiful object really come of age. That impact is now seeping into the software that we use.
For many years there has been something of a discontinuity with Apple products – on the outside it’s all clean, Modernist, Bauhaus lines, but on the inside skeuomorphic faux leather whimsy. With software and hardware now unified under Ive’s command, iOS 7 will see the first realisation of his vision across both realms. Microsoft, in the meantime, have been launching themselves into a flatter, “digitally authentic” realm with Windows 8 and the design that used to be called Metro.
What concerns me is that the autism that is prevalent in the world of traditional computing is now being emulated in new ways with slavish adherence to fashions in the world of design. Bauhaus is nearly a century old – it is only “right” because current fashion dictates that it is right – and from a historical perspective we may as well be in a technology world where the Arts and Crafts movement was defining user experience and look and feel. Maybe that fashion will come…
But whilst flat, hard design might look cool, it’s not necessarily empathetic. Witness how many people take their modernist monoliths and place them in faux-leather phone and tablet cases. Skeuomorphic design, whilst maybe not down with the design kids, serves an important purpose in providing analogous reference points for end users. Strip that away and we run the risk of cold, impersonal, production-line computing in a new form, devoid of empathy.
5. the electrician
I learned a crucial lesson in empathetic design from an electrician called Dave.
Back in 2000 I bought a one-bedroom ground floor flat in Balham in South West London with my then wife. It was the sort of place where every bit of space was at a premium, and we managed to carve out a working office area in what had been a cupboard under the stairs. We had decorated, but just needed to get power sockets sorted out.
Dave was the partner of one of my ex-wife’s work colleagues. We invited him over to quote for the work and I, having spent so much of my career by that stage in IT, had been primed for him to ask me what I wanted. I had decided what I wanted was four electric sockets. It sounded about the right number.
When Dave arrived, I was somewhat thrown by his first question: rather than asking what I wanted, he looked at the desk under the stairs and asked what we were going to do when sitting there. I thought about it for a moment – I’d need to use my laptop, its external CD writer, the printer and a desk lamp. It would be nice to plug in my phone. And I’d thought about getting a scanner too.
“We’ll put in eight sockets,” Dave decided.
Asking what someone would like to do is a big improvement over asking them what they want. It was a big learning point for me, and, as you can see, a story I still recall thirteen years later.
6. the hurdler
Dave had a big advantage – the customer and the end user were one and the same. Within the world of Web and App design these days that is rarely the case – we are usually being commissioned by a client to develop something for a third party that we rarely meet: their clients or customers.
In my time at Microsoft in the past couple of years I was lucky to be able to meet some fascinating people. One of them was a chap called Boz. Boz had been a world-class athlete – a hurdler – but at the World Championships in Paris in 2003 he landed very badly and sustained an injury that put paid to his sporting career. He enrolled in a Multimedia Masters course at Teesside, and subsequently set up a digital agency with his cousin, called 13 Strides, working exclusively with sporting organisations.
Boz had been doing work with the organisers of big marathon and half marathon events. Together they wanted to build apps, and had loads of ideas, but it just didn’t seem to be heading anywhere. Both 13 Strides and the client were dominated by people who had been professional athletes, and it struck me that that was their main problem – they were thinking about it from their own perspectives, not from the perspectives of their customers, who (in the majority) were first-time, hobby runners.
Along with a few colleagues, we organised a day session to help make sense of it all, and to help steer them away from some of their ideas that appeared to put them into direct competition with big sports brands like Nike and RunKeeper.
To kick the morning off we started by telling some personal stories – great experiences that we had had in our lives that didn’t revolve around sport: family and friends reunions, things that had taken place at school. The crucial thing was to get everyone talking as people, not as sports professionals. That session left us with a few key factors that seemed to be common to everyone’s experiences – a sense of anticipation, the delight in sharing stories about the experience with others, and so on. These factors became the benchmark for the rest of the day.
The second session then looked to identify some key personae – stereotypes that we could then focus ideas around. The two that were chosen were the first time runner, and the families of the first time runner. The former was obvious – they made up something like two-thirds of participants – but the latter was a crucial group as the time and investment needed to take part in a half-marathon event is something that needs to be supported by a family.
We then split into two groups, and each plotted out a timeline of activities that each of the personae would go through from when first thinking about taking part in an event through to telling everyone about it after successfully completing a run. Keeping people thinking from the perspectives of these two groups helped to draw out a whole load of things that could make for a valuable, defining electronic experience to sit alongside the actual events.
The latter part of the day then became a more traditional, agile planning session, as the groups drew out user stories, and then started to prioritise using the planning poker technique. By the end of the day the feedback from Boz was that it had totally changed the way in which they were thinking about their client’s and their customers’ needs.
We’d designed the session to be empathetic from the ground up. The output from that day should be shipping in app form later this month.
7. the grandfather
I started this session with a photograph of my grandfather Bertie. It’s to that same picture we now return – but to talk about the boy to his left, my father Malcolm, who is now a grandfather himself, three times over.
Dad is now in his early seventies, and retired a few years ago. His retirement is partly spent being involved in The University of the Third Age – self-organised groups of retirees who want to continue to learn. He’s set up a Community Technology group – a group of people in their seventies and eighties learning to program using cheap devices like the Raspberry Pi.
What I wasn’t expecting was the way in which this has politically awakened him. Giving access to the means of digital production has made dad realise that almost all of the technology that is produced for older people is assistive technology – tools to help those with disability. Whilst this is no doubt of value, if older people are only ever regarded in terms of disability by the twenty-, thirty- and maybe forty-somethings who design technology, this growing demographic is going to get increasingly discontented. However, put the means of production into their hands and we’ll find the most empathetic designers in the world emerge – the end users themselves.
How much the maker revolution will take over is yet to be seen – but some pretty influential folk, from Wired’s Chris Anderson to the MIT Media Lab, seem to be throwing their hats in the ring. The Media Lab’s High-Low Tech group sums it up thus:
“We believe that the future of technology will be largely determined by end-users who will design, build and hack their own devices, and our goal is to inspire, shape, support and study these communities.”
This might sound like a flight of fancy today; but the world of design should take heed from the worlds of photography and journalism to see how quickly end-user production can dent the value propositions of traditional industries. If design is to have a future, my hunch is it needs to do so through empathy with those for whom the designing is happening.