How AI is being quietly used to pick your pocket

Sellers have learned to set prices based on your age, mood, and sexual orientation.

Personalized price discrimination not only perpetuates bias and can fuel inflation; it also creates a world where you never know when your apps are cheating you.

When I flew back from London a few weeks ago, I fell down a rabbit hole I still can’t climb out of. I knew how much I’d paid for my seat, and how many miles I’d used to upgrade. But I had no idea whether the woman across the aisle had spent just a few points, like me, or paid the $10,000-plus the airline might charge for the same flight. Booking a flight has long been a game only the airline knows the rules of, with countless booking codes, loyalty programs, and fare changes that use your data against your wallet. But once I boarded, I began to notice the same unfair play everywhere: in every Uber, every Amazon order, every trip to the supermarket. These companies now know so much about me that they can see a number flashing above my head: the exact price I’m willing to pay at this very moment. Right now, your own number is flashing above your head, too.

In the age of algorithms, prices across digital commerce increasingly shift in real time.

What’s far more troubling is the rise of personalized pricing: the practice of digital retailers using your own data to charge you exactly the price you’re willing to pay, which may differ from what the person next to you pays. Personalized pricing not only perpetuates bias and can fuel inflation; it creates a world in which you never know whether your apps are ripping you off.

Now, whenever I go to pay for something on my phone or laptop, I wonder if I would pay less if I used someone else's account.

I still remember the mild shock I felt a decade ago when I learned that price discrimination is often perfectly legal in the United States. In law school, my antitrust professor introduced us to the little-known Robinson-Patman Act, a Depression-era anti-discrimination law, and quickly pointed out that it does not live up to its name. Under the long-standing law, companies can face crippling fines for price discrimination only if they discriminate against other companies. If a wholesaler overcharges a store, the store can sue, but there was (and is) nothing stopping the store from doing the same to its own customers. That is, store owners have more price protections than their customers do. If a store habitually charges some customers more than others because of their gender, race, or another legally protected characteristic, that is certainly illegal. But when companies simply want to squeeze every customer for as much as they are willing to pay, they are free to engage in outright rip-offs.

Even in an age of polarization, AI pickpocketing may be one of those rare issues that can unite us in outrage.

I say mild shock because personalized price discrimination was far less common and damaging back then than it is today. Sure, coupon culture allowed companies to sell the same product at the same store at the same time at different prices, but it gave consumers a choice. Price-sensitive shoppers spent time hunting for coupons, while less price-sensitive shoppers paid full price. Coupons, loyalty cards, seasonal discounts: many traditional forms of price discrimination let shoppers choose which price bracket they fall into.

But algorithmic price discrimination takes that choice away. And the data-mining methods used to assign people to price groups are more invasive than you might imagine. Think back to your last Uber ride. When you booked a car, you probably knew that distance and time of day affect the price; we’ve grown accustomed to the cold, extractive efficiency of surge pricing. But did you think about plugging your phone in before you booked the ride? If you had, you might have saved a few dollars: battery level is allegedly one of the factors Uber uses to price a ride, though Uber vehemently denies this. If the allegations are true, the logic is clear: people with low batteries are more desperate, and those whose phones are about to die won’t hesitate to pay any price to get a car before they’re stranded.

As The American Prospect recently detailed, this type of customized pricing is spreading across virtually every sector of the economy (streaming, fast food, even dating apps), and it’s surprising what variables can drive up costs. In the 2010s, retailers relied on somewhat crude data to refine their pricing. Customers might pay more for a flight booked on a Mac (versus a PC), or pay a higher price for test prep in zip codes with large Asian communities. But in recent years, companies have shifted from neighborhood-level price discrimination to customized pricing.

Retailers like Amazon know a staggering amount about what you buy, both on and off their platforms, and you have no way of knowing when your choices affect how much you pay. In 2018, it was reported that Amazon adjusts prices 2.5 million times a day. Given Amazon’s growth and advances in AI, that number is likely an order of magnitude higher today. For retailers like Walmart, purchase history isn’t enough. In February, the retail giant agreed to buy smart-TV maker Vizio for more than $2 billion, potentially giving Walmart a wealth of intimate consumer data. Smart TVs not only track what we watch with Orwellian precision; they track other devices nearby using ultrasonic beacons and can even eavesdrop on our conversations in the privacy of our homes. Vizio, in particular, has been fined millions of dollars over allegations that it illegally spied on customers.

Retailers know not only what you bought and how much you earn, but often where you are, how your day is going, and what your mood is. All of this can be neatly synthesized by AI neural networks to calculate how much you are willing to pay for a particular item at any given moment.

Your age, gender, and sexual orientation serve as markers for the AI when determining how much you should pay for love.

Not even the most personal corners of commerce are off-limits. Dating apps collect data about our romantic lives, and some openly brag about using it to boost profits. Many of those that don’t disclose their use of personalized pricing are still doing it. Tinder rarely discusses its pricing technology, but Mozilla and Consumers International recently found that the dating app uses dozens of variables to radically adjust prices for users. Your age, gender, and sexual orientation serve as markers for the AI to determine how much you should pay for love.

Left unchecked, personalized pricing will have a corrosive impact on society. Nicholas Guggenberger, an associate professor at the University of Houston Law Center, argues that “hidden algorithmic price discrimination can undermine public trust in pricing mechanisms and thereby undermine the market.” AI pricing also means that the most desperate and vulnerable will often pay the most. Worse, people can be penalized for their race, age, or social class. Take the battery example. Older people are twice as likely as younger users to have a phone that is at least three years old. Because older phones hold less charge, older people may pay more than younger people for the same Uber ride.

“Algorithmic price discrimination can essentially automate usury,” says Guggenberger. “If your battery is about to die and you’re in a rural area, a ride-sharing app can dramatically increase your ‘personalized price.’”

Much AI pricing acts as a regressive tax, charging those who have the least the most. For people in disadvantaged areas, with fewer stores and fewer alternatives, there’s often no choice but to click “buy now,” even if it hurts. As law professor and consumer advocate Zephyr Teachout told The American Prospect, we shouldn’t view this practice as something as benign as personalized pricing; she calls it surveillance pricing.

We know how to prove human discrimination. If a store in a predominantly black neighborhood charges more than its counterpart in a predominantly white neighborhood, testers can visit each store, record the prices, and sue. This kind of testing has been at the heart of consumer protection for the better part of a century. But how do you prove discrimination by an algorithm? There are no stores to visit, no price tags to compare, only millions of screens isolated in people’s pockets. The result could be a catch-22: you can only get enough data to prove discrimination by suing a company, but you can’t sue a company without first having the data. We may be witnessing the emergence of a twisted legal world in which companies that use biased algorithms to secretly adjust prices face less legal scrutiny than brick-and-mortar stores.

The situation is so bleak, and the potential for abuse so clear, that I hope even our dysfunctional democracy will not tolerate it. Our legislators have been slow to limit the harms of new technologies, even when it becomes clear that those technologies are undermining our democracy. But even in these polarized times, AI pickpocketing may be one of those rare issues that can unite us in outrage.

Author: Albert Fox Cahn
