At the training ground

I caught up yesterday with a former BBC and Microsoft colleague, Mark Bloodworth. Outside of his working life Mark does a lot of football coaching, and has been trained to train, as it were. Our conversations often bounce between his sports coaching and how it has changed his ways of thinking about work and working with others.

Yesterday our conversation got to an interesting point. For professional sports people and musicians, practice is often the bulk of their day-to-day working life: refining performances so that “on the day” things go as right as possible. Yet in the workplace more generally, we’re expected to be at match-day or concert-hall levels of performance all of the time. Whilst there is some “learning and development” on offer to many of us, the move in recent years has increasingly been to “on the job” learning (which often practically means “just get on with it”).

Why this discrepancy? Are all of the rest of our jobs so much easier than those of footballers or musicians? Or would our performance improve if we allowed ourselves more time and space to practise the skills and techniques that we need to deliver when it matters?

Huh? Nah! Mmm? Ahh!

In the summer I’ve been invited to take part in the Spark the Change conference being organised by Dan Rough and the team at Gamevy, an interesting software company that produces games and is organised on non-management principles. Dan’s one of the folk who follow this blog, and an article I wrote a few months back about people-led change inspired him to ask me to speak.

Seeing as how I like to make my event appearances bespoke bits of whimsy, I’ve started thinking about a new model to describe the ways in which we, as emotional creatures, react to changes that are imposed upon us, or those that we choose to undertake under our own volition. Here then is the world premiere of the Huh? Nah! Mmm? Ahh! cycle of technology change adoption. Underlying principles may well have been shamelessly stolen from far more reputable sources (particularly Elisabeth Kübler-Ross’ work), but the application hopefully takes it out of the theory book and into the nearest branch of Currys/PC World.

When we are confronted with something new, we tend to go through a series of emotional responses. Those responses might pass in the flick of an eyelid – they might take years. We may bounce from one response to another, and we may (over time) start to go backwards again.

The first stage is the Huh? stage. The “I don’t know what it is” phase. A bit like the phase that my three-year-old son is currently going through where he thinks that if he covers his eyes, other people can’t see him. The Denial phase. The “if I don’t acknowledge its existence it won’t be there” stage. You get the idea…

When I find out about a new technology, I have realised that I’m pretty judgemental and get through this stage quickly – but let’s take, for example, “beacons”. Some sort of tech that allows your smart device to know where it is in a place where GPS doesn’t work – a metal-roofed shopping centre, perhaps. Beacons are things that I have heard of, but I don’t really know very much about. I’m at the Huh? stage.

As soon as I find out about something like beacons in a bit more detail, I’ll get to the Nah! stage; I’m confronted with something that upsets my world view of how things are, so the easiest cognitive response is to dismiss it – to resist the idea. At the Nah! stage I have recognised that the thing exists, and I’m choosing to deny its importance. I’m very much at this stage with the concept of The Singularity. I’m just coming out of it in relation to the ideas behind cryptocurrencies like Bitcoin (although I’m very much in it regarding Bitcoin itself…)

Now of course, given my background and experience, I could be right in my assessment. But it’s my gut speaking here, not (necessarily) my logical brain – even though I could then reverse-engineer a bunch of logical reasons to support my intuitive response.

From Nah! we move into Mmm? Prompted by something I read, something I see, personal experience, or just bowing to the pressure of the inevitable, the Mmm? phase is where I am able to start to process how I or others might apply the technology to some end that is useful, creative or fun. Whilst I haven’t yet fully committed my life to The Internet of Things, an incident a year ago gave me first-hand experience of how it might be useful, and I’ve been looking for Internet-connected radiator thermostats ever since…

But it’s not enough to merely think about things. The Ahh! phase is where I start to fully adopt the change and it becomes embedded in my life. This happened five years ago for me with the idea of subscription music services (when I first started paying for Spotify and stopped buying music). At a similar point I bought into and adopted the idea of Cloud-based collaboration tools like Google Apps, Dropbox and so on.

Aligned to adopting something new, there might be the need for things to go the other way. My adoption of Cloud tools has meant that over recent years I’ve become increasingly questioning of the value of traditional collaboration software suites like Microsoft Office and OpenOffice. I also really don’t like the idea of files stored on devices these days – I get angst when I use standalone digital cameras because I know nothing is backed up until I’ve done something, unlike the camera on my phone, which automatically copies photos in the background. My use of SMS these days, similarly, is now negligible, alongside printing and using CDs and DVDs.

Getting these adoption/unadoption patterns to synchronise is one of the biggest challenges in getting others to change. It’s all very well getting people to explore the new, but if they don’t let go of the old then they’ll never properly commit to it. That often leads to a period of anxiety where the new is harder to do than the old, and reversion to old habits is the easiest path. Step forward the QWERTY keyboard if you want a great example.

It’s also important to bear in mind that much of this is emotional, gut, “fast” thinking, so piling on logic to support why things might be better runs the risk of just throwing up lots of counter-arguments.
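For the developers in the audience, here’s a tongue-in-cheek sketch of the model as a tiny state machine. It’s a minimal, illustrative toy – the stage names are the only thing taken from the model, and the forward-only transitions deliberately ignore the bouncing and backsliding described above:

```python
from enum import Enum

class Stage(Enum):
    HUH = "Huh?"  # denial: I don't know what it is
    NAH = "Nah!"  # resistance: I know of it, and I dismiss it
    MMM = "Mmm?"  # consideration: I can see how it might be applied
    AHH = "Ahh!"  # adoption: it's embedded in my life

# Forward transitions only; in reality we bounce around and drift backwards.
FORWARD = {Stage.HUH: Stage.NAH, Stage.NAH: Stage.MMM, Stage.MMM: Stage.AHH}

def next_stage(current: Stage, persuaded: bool) -> Stage:
    """Move on a stage if something has piqued us; otherwise stay put."""
    return FORWARD.get(current, current) if persuaded else current

# e.g. my position on beacons after reading up on them a bit:
stage = next_stage(Stage.HUH, persuaded=True)
print(stage.value)  # Nah!
```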

So there you go. Huh? Nah! Mmm? Ahh!

Weeknote 192: Easter Bunnies

Achievements this week included:

- further advances on the Brick Lane adventure
- publishing the 4th #socialCEO report
- once again postponing the most postponed meeting of my career
- but getting a few more in the diary for next week
- getting praise for my writing from a proper writer
- a catch-up with Jerry and his new gig
- and a good chinwag with @mbrit

Next week: a round of lunches.

What’s in a name?

I’ve heard from a few sources recently that there is a move afoot in Whitehall to replace CIOs (Chief Information Officers) in government departments with CTOs (Chief Technology Officers) and CDOs (Chief Digital Officers). I don’t know the validity of that story, but it strikes me as credible as an attempt to shift the technology agenda from facing internal systems to facing external customers (or taxpayers, or citizens).

People in tech, though, can get a bit sniffy about such branding changes. It’s easy to see it all as superficial.

What’s sometimes missed, though, is that if you want to do things differently, you need to change not only your actions but also your uniforms. If something looks like it used to, it’s much harder for others to accept that it has changed. If you want to get away from old-school IT, part of the change is to stop calling it IT and stop running it through a CIO. Changing names is only part of the puzzle – but it shouldn’t be overlooked.

I had one of my regular put-the-world-to-rights sessions with Matt Baxter-Reynolds last night, and we were talking about this topic. It dawned on me that these issues of nomenclature impact the supplier side of the industry too. Specifically, would Windows 8 have been more successful if it had been called, say, Microsoft Tiles?

Now put aside for the moment the question of whether “Tiles” is a good brand name or not – focus instead on whether the act of changing the brand (maybe strap-lined “with Windows inside”) would have made acceptance of the new product easier than identifying it as “merely” an incremental new version of a known entity. Was much of the cognitive dissonance caused by the user interface formerly known as Metro a result of it saying “Windows” on the box?

Certainly Apple didn’t have die-hard Mac users complaining, at the time the iPad came out, that it wasn’t Mac OS enough. If the iPad had been known as the MacSlate, I wonder if it would have been the success it has been?

The continuing adventures of the #socialCEO

This morning I published the fourth edition of the stamp #socialCEO report. Ten months or so after the first piece of analysis (little more than a blog post, if truth be told), it’s been enlightening to explore the ways in which our leading companies, and their Chief Executives, are engaging (or not) with the world of social networks.

Underlying the report are a few things.

Firstly, that in the business world social networks have become “social media” and therefore primarily the preserve of marketers. Despite endless chat about conversations, the marketing world really struggles to think in terms of anything other than traditional “broadcast” mass media.

Secondly, as a result, many people in business have lost sight of social networks as a two-way communication channel, and at a personal level therefore struggle to understand how they might add value.

Thirdly, that “networking” still has negative connotations for many, and as a result many people don’t know how they do it, or what it might mean to improve. They therefore bimble through it all, never really feeling in control. Social networking technologies then just exacerbate the problem (like the way that email has become an uncontrollable deluge for many).

Finally, that leaders in organisations role-model behaviours that get adopted elsewhere. If business leaders do things, others will ape them. If they don’t, they won’t (“I didn’t get to where I am today…”)

Four quarters in, and the FTSE100 CEO community has become distinctly less socially networked. Partly this is a result of Burberry’s Angela Ahrendts’ departure last month to Apple. She takes with her 64% of all of the followers that the entire FTSE100 CEO community had between them. Ironically, her social communication has pretty much stopped since the announcement of her role in Cupertino back in the autumn of last year. Apple is a very secretive place, it seems.
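To make the scale of that concentration concrete, here’s a minimal sketch of the share calculation – note the follower counts below are made up for illustration, not the report’s actual data:

```python
# Hypothetical follower counts for a handful of CEOs (illustrative only).
followers = {
    "Angela Ahrendts": 145_000,
    "CEO B": 40_000,
    "CEO C": 25_000,
    # ... and so on for the rest of the FTSE100 CEO community
}

total = sum(followers.values())
share = followers["Angela Ahrendts"] / total
print(f"{share:.0%} of all the community's followers")  # ~69% with these numbers
```

With one account holding nearly two-thirds of the followers, a single departure moves the whole community’s numbers.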

But every quarter some CEOs leave and some join, and overall the new boys (and girl – Moya Greene at Royal Mail) are a pretty unconnected bunch online.

Is this typical of fifty-something, mostly white men? I don’t know. There are plenty of that demographic in my own social feeds (particularly on LinkedIn) – but that’s of course a crap sample. I have to say, on a day-to-day basis, it’s rare now to meet someone who isn’t on LinkedIn. Then again, the most senior person I’ve met in the past few weeks (a middle-aged, white CIO with Europe-wide responsibilities) was absent.

Maybe they’re all just working too hard and are too busy to bother with this social stuff? Well, maybe. But if they aren’t engaging, I do worry about how they can make decisions in the social networking space, because without engaging I’m sure they can’t appreciate what it is. And for the CEOs of ITV, Vodafone, BT, Sky, Royal Mail and all of the major high-street banks, that strikes me as bad news.

Anyway, where there’s inactivity there’s opportunity. You can draw your own conclusions – the report is now available for download without registration from here.

Digital business models

For some time now I’ve been using the example of the recorded music distribution world as a metaphor for how organisations might change and adapt to the world in which we find ourselves. Initially I talked in terms of Spotify or iTunes, but this is the fuller, more nuanced version.

There’s no one right way in which an organisation should organise itself to adapt to our increasingly digital world. Looking at what has happened in the music industry can provide some ideas for different pathways…

The Our Price model – do nothing

To be fair to Our Price, a shop in which I spent many hours of my youth, they were no longer around by the time that electronic distribution of music was about. They’d been taken over by Virgin, which in turn mutated into Zavvi. Zavvi died a horrible digital-related death.

Doing nothing in the face of how the Internet, social networks and smart devices are changing the world is not an option. Our Price (and its successors) couldn’t continue to be merely mass-market providers of all sorts of music in shops on the high street. That market disappeared.

The Record Store model – develop a niche, extend to digital

Record stores – small, independent operations selling CDs, vinyl, even cassettes – are enjoying something of a renaissance. There aren’t many of them left, and those that are, whilst ostensibly doing much of what they used to, are now serving new markets. Vinyl is for the hipster or the collecting obsessive, willing to pay a premium for the physical object and the larger artwork. The store itself is an experience to be savoured, rather than merely a place to buy music. The store might specialise by genre. They probably have an online presence to extend their service beyond their physical one. They’re probably not “on the high street”.

This is the decluttering and repositioning of a form that is akin to how cinema shed newsreels and B-movies with the rise of television. And many record stores have closed along the way (as many cinemas did before them).

The Amazon model – move the physical online

These days, of course, Amazon sell digital as well as physical forms of content. But it was the moving of physical distribution sales from the high street to the Internet, warehouse and postal service that saw the first great decline in record shops from the late 1990s onwards. Amazon took the selling part, made it cheaper, slightly less immediate, but maybe more convenient, and cleaned up.

The iTunes model – remodel the physical, digitally

Cleaned up, it has to be said, until iTunes really cleaned up. With the rise in broadband combined with the improved fidelity and compression offered by MP3 and other formats, a pure digital model for music selling and distribution arose – typified by iTunes. You could still buy an album, or just an individual track, but this time you’d receive a file over the network rather than a box through the post.

The Napster model – everything is free

The same dynamics of cheaper, faster networks combined with improved audio compression technology meant that Napster and other filesharing platforms arose with no business model at all. Once the cost of distributing and replicating something becomes practically zero, there will tend to be pressure for the price to become zero. Illegal filesharing became rife (although Napster did later develop into a legitimate business).

The Spotify model – free at the point of delivery

If you are competing with “free”, then new models need to emerge to balance customers’ desire for low costs against IP owners’ desire to have a business of some sort. With low- or zero-cost distribution, the free-to-use (with limited service, advertising, or both) or subscription models that the likes of Spotify, Deezer and Rdio have implemented are a further step away from the record shops of old.

The innovative power of combination

There’s been an image of an early-1980s Byte magazine cover doing the rounds on Twitter, alongside a story by Harry McCracken on Time’s website about the trouble with futurology.

It’s all been irking me, and I’m trying to work out why.

The first reason it irks me is that lots of the tweeting seems to be missing the point. Byte’s covers were allegorical stories painted by Robert Tinney in a style much favoured at the time. They were metaphors, yet there seems to be an air of “look how silly those people in the past were, thinking that wearables would have teeny-tiny floppy disk drives and unusable keyboards”. It’s so easy to be smug in hindsight, as I was only yesterday.

McCracken’s article doesn’t miss the metaphorical nature of the artwork. However, he argues that:

… most of all, the Tinney watch is a wonderful visual explanation of why human beings–most of us, anyhow–aren’t very good at predicting the future of technology. We tend to think that new products will be a lot like the ones we know. We shoehorn existing concepts where they don’t belong. Oftentimes, we don’t dream big enough.

(One classic example: When it became clear that Apple was working on an “iPhone,” almost all the speculation involved something that was either a lot like an iPod, or a lot like other phones of the time. As far as I know, nobody expected anything remotely like the epoch-shifting device Apple released.)

Let’s deal with the second part first. The iPhone wasn’t such a big leap – it was an HP Jornada with telephone capability which used a finger instead of a stylus. I’m not saying it’s not a great product, just that it was (like most things) a repackaging and recombination of ideas that had come before. All wrapped up in a design language from the 1920s.

But, moreover, innovation is about new things that are a lot like the ones we know. Look at early cars. Look at the fact that, despite their touchscreens, smartphones still have QWERTY keyboards.

Thirty years after the Byte cover we have devices coming to market that look a bit like it. Just as, 35 years after Alan Kay came up with the Dynabook concept, we had the Amazon Kindle. The future generally looks a lot more like the present than futurologists would like us to believe. Most of us still aren’t wearing silver suits.
