How Uber and other digital platforms could trick us using behavioral science – unless we act fast

Uber’s business model is incredibly simple: It’s a platform that facilitates exchanges between people. And Uber has been remarkably successful at it, almost eliminating the transaction costs of doing business in everything from shuttling people around town to delivering food.

This is one of the reasons Uber became one of the most valuable companies in the world when its shares began trading on the New York Stock Exchange on May 10.

Yet its expected US$82.4 billion market capitalization may pale in comparison to the wealth of user data it’s accumulating. If you use Uber – or perhaps even if you don’t – it knows a treasure trove of data about you, including your location, gender, spending history, contacts, phone battery level and even whether you’re on the way home from a one-night stand. It may soon know whether you’re drunk or not.

While that’s scary enough, combine all that data with Uber’s expertise in analyzing it through the lens of behavioral science and you have dangerous potential to exploit users for profit.

Uber’s hardly alone. Our research shows the biggest digital platforms – Airbnb, Facebook, eBay and others – are collecting so much data on how we live that they already have the capability to manipulate their users on a grand scale. They can predict behavior and influence our decisions on where to click, share and spend.

While most platforms aren’t using all these capabilities yet, manipulation through behavioral psychology techniques can occur quietly and leave little trace. If we don’t establish rules of the road now, it’ll be much harder to detect and stop later.

‘Choice architecture’

A platform can be any space that facilitates transactions between buyers and sellers. Traditional examples include flea markets and trading floors.

A digital platform serves the same purpose but gives the owner the ability to “mediate” its users while they’re using it – and often when they’re not. By that we mean it can observe and learn an incredible amount of information about user behavior in order to perfect what behavioral scientists call “choice architectures,” inconspicuous design elements intended to influence human behavior through how decisions are presented.

For example, Uber has experimented with its drivers to determine the most effective strategies for keeping them on the road as long as possible. These strategies include playing on cognitive biases such as loss aversion and the tendency to overestimate low-probability events, even when a driver is barely earning enough money to make it worth her while. Drivers end up like gamblers at a casino, urged to play just a little longer despite the odds.

Uber didn’t immediately respond to a request for comment.

Airbnb also experiments with its users. It has used behavioral science to get hosts to lower their rates and accept bookings without screening guests – which creates real risks for hosts, particularly when they are sharing their own apartment.

While these examples seem relatively benign, they demonstrate how digital platforms are able to quietly design systems to direct users’ actions in potentially manipulative ways.

And as platforms grow, they only become better choice architects. With its IPO’s huge influx of investor money to fund more data collection and behavioral science research, Uber could move into dangerously unethical territory – easy to imagine given its past practices.

For example, if the app recognizes that you are drunk or in a neighborhood you rarely travel to – and one that its data show is high in crime – it could charge you a higher rate, knowing you’re unlikely to refuse.

Legal challenges

And it’s not all speculation.