To surge or not to surge, the algorithm is the question

Surge pricing is something that anyone who takes a ride-share regularly has become used to. Try calling an Uber or Lyft on a rainy day during the dinner hour, or around school pick-up or drop-off time, and you’ll pay more than your usual rate — sometimes a lot more.
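To make the mechanism concrete, here is a toy sketch of how a demand-based surge multiplier can be computed. The function, thresholds and cap are invented for illustration; no ride-share company has published its actual formula, and this is not it.

```python
# A minimal, hypothetical sketch of demand-based surge pricing.
# The thresholds and the 3x cap are invented for illustration only.

def surge_multiplier(ride_requests: int, available_drivers: int) -> float:
    """Scale the fare up when requests outstrip available drivers."""
    if available_drivers == 0:
        return 3.0  # apply the cap rather than divide by zero
    ratio = ride_requests / available_drivers
    if ratio <= 1.0:
        return 1.0  # supply keeps up with demand: no surge
    # Multiplier grows with excess demand, capped at 3x the base fare
    return min(1.0 + 0.5 * (ratio - 1.0), 3.0)

base_fare = 12.00
print(f"Quiet afternoon:   ${base_fare * surge_multiplier(40, 60):.2f}")   # $12.00
print(f"Rainy dinner hour: ${base_fare * surge_multiplier(180, 60):.2f}")  # $24.00
```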

Yet when consumers are confronted with common online business models like “dynamic pricing” in the bricks-and-mortar world, they may revolt. Consider the recent consumer backlash after Wendy’s, the American fast-food chain, announced on an earnings call that it was considering surge pricing for burgers during peak demand — and had invested $20mn in new AI systems to do so.

The first tweets following the announcement were amusing, as customers joked about arbitraging their lunch. But within a couple of weeks the social media comments became ugly and politicians such as Senator Elizabeth Warren started attacking the company for “price gouging”. Wendy’s quickly backtracked on the idea. 

The same phenomenon has occurred at movie theatres that tried to raise the price of seats during high demand (though airlines and hotels do it online all the time and most entertainment venues have regular bargains on known slow days). What’s more, surge pricing isn’t the only algorithmic manoeuvre that’s come under fire when translated into non-digital businesses.

The Federal Trade Commission and Department of Justice, following numerous complaints from tenants’ associations, recently took joint action to fight algorithmic collusion in the residential housing market. Landlords of tens of millions of apartments around the country are increasingly using rent-maximising software to keep prices higher than they might be in normal market conditions.
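To illustrate the concern in the simplest possible terms, the sketch below shows how pricing software that pools competing landlords’ rents could pull each individual recommendation toward the top of that pooled range. The function and figures are hypothetical and not drawn from any actual product.

```python
# A deliberately simplified, hypothetical sketch of shared rent-recommendation
# software. It is not the logic of any real product; the point is only to show
# how pooling competitors' prices can nudge every recommendation upward.

from statistics import quantiles

def recommend_rent(current_rent: float, pooled_competitor_rents: list[float]) -> float:
    """Recommend a rent anchored to the upper end of competitors' pooled prices."""
    # 75th percentile of what other landlords in the data pool are charging
    upper_band = quantiles(pooled_competitor_rents, n=4)[2]
    # Never recommend dropping below the landlord's current price
    return max(current_rent, upper_band)

pool = [1850.0, 1900.0, 1950.0, 2000.0, 2100.0, 2200.0]
print(recommend_rent(1800.0, pool))  # pulled up toward the top of the pool
```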

As an FTC briefing on the action noted, “the housing industry isn’t alone in using potentially illegal collusive algorithms”. The DoJ has previously secured a guilty plea related to the use of pricing algorithms to fix prices in online resale of goods; it has an ongoing case against the use of algorithmic collusion by meat processors. Meanwhile, there are several private cases being brought against hotels and casinos for online price fixing.

Platform technology firms developed or perfected techniques like dynamic pricing, real-time auctions, data tracking, preferential advertising and all the other tricks of surveillance capitalism. But the behaviour we take for granted online somehow becomes more problematic when these methods are deployed in the real world. People are outraged about the price of burgers or their rent surging but don’t think twice when it happens to the cost of their commute — particularly when they are booking it on an app.

I suspect that some of this is down to our expectation that we will all be treated equally — or at least pay set prices in a fair market — when we walk into a physical business. Historically, that assumption has been fairly well policed by regulators. When you walk into a retail store in the real world, you can’t be charged a different price or shown different offerings or advertising because of your income or the colour of your skin.

In the online world, however, such discrimination is rife, not only by large platforms but by any number of companies. As data has become the oil of the digital economy, we’ve all become surveillance capitalists.

Regulators are beginning to take on the messy world of algorithmic pricing. The FTC, for example, has alleged in a recent case against Amazon that the online retailer earned $1bn from the use of a secret pricing algorithm that kept prices for various products artificially high. Amazon calls this a gross mischaracterisation and says it stopped using the tool years ago. Whoever is in the right, such efforts take years to litigate. And in some ways, I think we have entered a period of fatigue around tech regulation, reflecting years of incremental gains that have not really succeeded in bringing more transparency to digital markets as a whole.

Maybe Europe’s Digital Markets Act, which went into effect last week, will begin to change that. Certainly it has already led to some behavioural changes on the part of the platform giants, as they are forced to give users more control over their data and open up their platforms more to competitors.

But I suspect that even more change — and more demands for tougher, clearer-cut regulation — will come as online business models make their way into old-fashioned businesses where people are simply accustomed to much clearer rules. As consumers become more aware of how the tricks of surveillance capitalism are being deployed in businesses they first encountered in the physical world, attention may turn to the need for clear, straightforward rules — applying the existing laws of the physical world to online consumer protection.

I’d love to see the FTC, for example, use its rulemaking power to stipulate a “thou shalt not discriminate” statute that makes it illegal to charge people different prices for the same goods, no matter how and where they are buying them. What’s illegal in the physical world should also be illegal in the online world. This would put the onus on companies to prove that they are not causing harm, rather than forcing regulators to create a distinct and more complex system for a particular industry.

Online or offline, all businesses should be playing by the same rules.

rana.foroohar@ft.com
