Behavioural economics is a field that closely studies how people actually make choices, and how psychological quirks steer those choices away from the purely rational ideal. Algorithms, on the other hand, belong to the world of machine learning and the analysis of huge volumes of data.
On a surface level, it seems these concepts have as much in common as ice cream and chilli peppers do. Yet companies like Google and Facebook owe their success in large part to their ability to seamlessly incorporate elements of behavioural and algorithmic thinking into their platforms and answer users’ queries before they can even make them.
But just as a sailor can’t hope to navigate their ship without knowing where north is, your business is doomed to sink to the bottom of the ocean if you try to incorporate behavioural algorithms without first studying the bittersweet way in which they affect our society.
Behavioural economics or the theory of consumer fallibility
As a concept, behavioural economics is nothing new. It was pioneered by Richard Thaler in the mid-1980s and essentially sits at the intersection of three fields: economics, sociology and psychology.
Thaler’s school of thought argues that the choices people make on a day-to-day basis are influenced by their environment, which can often cloud their judgement. In other words, we are being “predictably irrational” by repeatedly allowing cognitive biases, emotions, social influences and other factors to interfere with our decision-making process.
Algorithms — the building blocks of the digital era
Algorithms are modern society’s perfect problem-solving tools — an optimised set of instructions that can execute a given task with incredible speed and surgical precision.
An algorithmic Big Bang occurs each time you power on your computer or mobile device. Algorithms run social media, determine job performance scores, steer driverless vehicles, control financial transactions and handle many other tasks you now take for granted.
The positive effects of merging human behaviour with algorithms
Everyone gets personalised services
One example of this is targeted advertising: you visit an online store, and an algorithm picks up what you browsed on the website, or reviews your purchase history, to serve you ads for similar products.
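The mechanics behind "similar products" can be sketched as a tiny item-similarity recommender. This is a toy illustration only, not any retailer's actual system; the catalogue, tags and scoring rule below are invented for the example:

```python
# Toy "similar products" recommender: score every unseen catalogue item
# by how many category tags it shares with what the shopper browsed.
# (Catalogue and tags are made up for illustration.)

CATALOGUE = {
    "running shoes": {"sport", "footwear"},
    "trail shoes":   {"sport", "footwear", "outdoor"},
    "tent":          {"outdoor", "camping"},
    "office chair":  {"furniture"},
}

def suggest(browsed, catalogue=CATALOGUE, top_n=2):
    """Rank products the user has not seen by tag overlap with their history."""
    seen_tags = set().union(*(catalogue[item] for item in browsed))
    candidates = [item for item in catalogue if item not in browsed]
    ranked = sorted(
        candidates,
        key=lambda item: len(catalogue[item] & seen_tags),
        reverse=True,
    )
    return ranked[:top_n]

print(suggest(["running shoes"]))  # "trail shoes" ranks first: two shared tags
```

Real ad platforms use far richer signals and models, but the principle is the same: past behaviour becomes a similarity score, and the score decides what you see next.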
Everyday choices are made simple
Uber is a prime example of a company that knows how to market its strengths to users. Unlike its competitors, Uber provides its clients with tons of valuable data, ranging from the driver’s estimated time of arrival to their first name and licence plate number. It is this transparency that can make you perceive the company as a trustworthy and obvious choice.
Human behaviour is altered for the better
Some car insurance companies now give you the option to install a telematics device that monitors aspects of your driving, such as how hard you accelerate and how often you use the brakes. At the end of each month, you receive a detailed performance report. Score high enough and you earn a discount on your next policy instalment.
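A usage-based score of this kind can be sketched in a few lines. The thresholds, weights and discount rate below are invented for illustration; real insurers use proprietary models:

```python
# Sketch of a usage-based insurance score: start from 100 and deduct
# points for harsh braking/acceleration events per 100 miles driven.
# (All numbers are illustrative, not any insurer's actual formula.)

def monthly_score(harsh_brakes, harsh_accels, miles_driven):
    """Return a 0-100 driving score for the month."""
    if miles_driven <= 0:
        return 0.0
    events_per_100mi = (harsh_brakes + harsh_accels) / miles_driven * 100
    return round(max(0.0, 100.0 - 5.0 * events_per_100mi), 1)

def discount(score, threshold=80.0, rate=0.10):
    """Score at or above the threshold earns a discount on the next instalment."""
    return rate if score >= threshold else 0.0

score = monthly_score(harsh_brakes=6, harsh_accels=4, miles_driven=500)
print(score, discount(score))  # 90.0 0.1
```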
The shortcomings of behavioural economics in a virtual environment
Convenience comes at the cost of privacy
To provide you with accurate results, behavioural algorithms often need to monitor your online activity pretty much around the clock. This translates to petabytes of information that are being transferred and stored on cloud services each day, making it increasingly difficult for anyone, much less the regular user, to reliably safeguard their personal data.
Recently, Facebook took a lot of heat for allowing Cambridge Analytica, a British data analytics company, to harvest the personal information of around 87 million Facebook users. About a week ago, it was also found that Amazon’s voice assistant, Alexa, could be hacked to switch your device into eavesdropping mode every time you wake it from its slumber.
Job disruption is introduced
In many cases, business disruption can actually be a good thing. Think of the decline of the highly polluting coal industry in favour of more efficient and eco-friendly energy sources. Sometimes, however, companies may employ smart algorithms to disrupt professions that are still relevant and useful to society — an automation done purely for monetary gain.
For example, about 20% of today’s financial reports and other business content are now written by algorithms, leaving many copywriters without a job. What’s worse, these “robot journalists” are rarely supervised by human editors, so they can potentially go rogue and generate a stream of erroneous content that can erode your trust in the content platform.
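Much of this automated business writing boils down to template filling: structured figures go in, readable sentences come out. The sketch below shows the idea; the company name, numbers and template are made up for the example:

```python
# Minimal sketch of template-driven "robot journalism": structured
# financial data is slotted into a sentence template. (Company and
# figures are invented for illustration.)

TEMPLATE = (
    "{company} reported revenue of ${revenue}m in {quarter}, "
    "{direction} {change:.1f}% year on year."
)

def write_blurb(company, revenue, prev_revenue, quarter):
    """Turn two revenue figures into a one-sentence earnings summary."""
    change = (revenue - prev_revenue) / prev_revenue * 100
    direction = "up" if change >= 0 else "down"
    return TEMPLATE.format(company=company, revenue=revenue,
                           quarter=quarter, direction=direction,
                           change=abs(change))

print(write_blurb("Acme Corp", 120, 100, "Q2"))
# Acme Corp reported revenue of $120m in Q2, up 20.0% year on year.
```

The fragility is easy to see: feed such a pipeline a wrong figure and it will confidently publish a wrong sentence, which is why unsupervised generation can erode trust in a platform.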
Biases have an even bigger impact on society
In a perfect world, algorithms would have a contextual understanding of the environment they are placed in, enabling them to always make the "right" decision, one that ultimately benefits our society. However, let's not forget that they are created by irrational humans, so it's not a perfect science, yet…
Recent research has shown that only about 40% of companies worldwide currently use machine learning, behaviour prediction software and other technology that relies heavily on behavioural economics to function. And while the upcoming GDPR regulation is a welcome step towards ensuring their ethical use, more will be needed to prevent people from gaming the system.
Behavioural economics is the basis of several exciting and rapidly developing fields. Startups and established businesses alike are realising that ignoring these new worlds of data, permissions and algorithms is no longer an option. The next few years will be critical in how we manage risks and discover new opportunities in this unfamiliar environment — one in which the boundaries between personal and profitable are increasingly blurred.