For many years I’ve held true to a concept that goes as follows:
You can exponentially scale transactions.
You can change interactions into transactions and then exponentially scale those transactions, but you lose social and cultural meaning along the way.
You cannot exponentially scale interactions. They only scale in a linear fashion.
My go-to metaphor for this is based on buying jewellery:
You can go to Tiffany’s in New York to buy an engagement ring for your beloved. This is a high-value interaction. It is heavily loaded with cultural and social meaning.
You can buy an engagement ring from an Internet vendor. That is a transaction. The ring might look the same, but the cultural and social meaning is very different.
In recent years I’ve also been experiencing a dark pattern: when interactions have been converted into transactions, new interactions sometimes emerge in their place, and they aren’t nice. A case in point is in-store supermarket shopping.
I often use the scanning devices provided at my local Sainsbury’s. There was never any value to me in the interaction of queuing to be served by a cashier (although there is for some people – hence the emergence of Slow Lanes in some places).
Using the scanning device means I can pack my bags as I go. The final transaction to leave the store usually takes but a minute.
Unless I’ve been selected for a random bag check. In that instance I have an interaction with a member of the store team in which my role is not that of customer, but of suspected thief. A number of random items are scanned, and if any errors are found I am sanctioned. My neatly packed shopping is strewn over the checkout, and I’m left feeling that I’m not trusted, and that all of my work to save myself time and the supermarket money has gone to waste.
This is an interaction of epically shit proportions.
(I’ve previously suggested that if I were awarded a prize for getting my scanning correct, the whole thing would be much more wholesome – and would probably see “shrinkage” rates in stores drop too.)
So, to date, transactions scale exponentially; interactions do not.
I’ve been wondering recently, though, with the current hubbub around generative AI, whether this rule will hold true into the future.
My hunch is that it will, but with a caveat: it will become increasingly possible for transactions to be designed to simulate interactions. Generative AI tools will combine clever manipulation of language with anthropomorphism to give the sensation of an interaction, even though it is but a simulacrum.
We’ve long known that people can be fooled into thinking they are interacting rather than merely transacting. It was seen with ELIZA in the 1960s. It was seen with automata in the 18th and 19th centuries.
Interacting with generative AI can be pretty immersive. Until something breaks the illusion – a rude awakening like the supermarket “you are a thief” experience. Those awakenings will probably become less common as the tools are refined.
But ultimately an interaction is something between two parties, and whilst we humans might fall into the belief that there is something thinking behind the scenes, all that is really going on is a very large number of transactions in an increasingly short period of time.
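To make that concrete, here’s a minimal sketch of what a “conversation” with a generative AI tool amounts to under the hood. The `complete()` function is a hypothetical stand-in for any text-generation call, not any particular vendor’s API; the point is simply that each turn is a discrete, self-contained transaction.

```python
# A "conversation" reduced to what it really is: a loop of stateless
# transactions. `complete()` is a hypothetical placeholder for a
# text-generation API call: one prompt in, one block of text out.

def complete(prompt: str) -> str:
    """Placeholder: a single transaction, with no memory between calls."""
    return f"[model reply to {len(prompt)} chars of prompt]"

def chat() -> None:
    history: list[str] = []  # the only "memory" is text we choose to resend
    while True:
        user_turn = input("You: ")
        if not user_turn:
            break
        history.append(f"User: {user_turn}")
        # Each turn re-sends the accumulated history as a fresh, standalone
        # transaction. Nothing persists "behind the scenes" between calls.
        reply = complete("\n".join(history) + "\nAssistant:")
        history.append(f"Assistant: {reply}")
        print("AI:", reply)

if __name__ == "__main__":
    chat()
```

The sensation of a continuous interaction is produced entirely by the loop; the system itself only ever sees one transaction at a time.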
Fake interactions like that run the risk of being highly manipulative. For a long while the software industry has intentionally grabbed our attention and manipulated our feelings. Fake interactions are the acts of con artists.
But increasingly we are even allowing AI technologies to intermediate our person-to-person interactions. Do you think about that when you let Google or Microsoft choose how you will respond to an email with their auto-suggestions?
We are on the cusp of providers offering full automation of such interactions – something I satirically predicted some years ago. And at that point we’ll get into a very strange world indeed.
Which brings us back to the point about interactions and transactions. And to the question of “Why?”
Why would we want to exponentially scale interactions (or at least the simulation of interactions)? My hunch is that, in many if not most cases, we wouldn’t. We’d want to scale transactions; providing the opportunity for more meaningless emails or more pointless meetings, where avatars trained on our previous responses pretend to be us, feels like a bad, yet quite possible, outcome.
Let’s stick to scaling transactions, shall we?