AI technology has spawned a cornucopia of tools that retail brands use to squeeze more value from the data they gather about their customers. But one AI-enabled tactic gaining traction lately is proving so controversial that a number of state legislatures are exploring how best to limit it, and brands caught abusing it should brace for blowback and a loss of trust.
In industry parlance, the tactic is known as surveillance pricing.
Some customers shopping for an item will see a different “personalized” price than others, based on the encyclopedia of information the retailer has collected about them. The problem is that retailers who use this technique without shoppers’ knowledge may do so in ways that, when discovered, leave their best customers feeling betrayed.
That’s what Washington Post technology columnist Geoffrey A. Fowler reported recently after he asked Starbucks to send him the data it had collected about him through its loyalty rewards program.
As required under California law, the company complied, producing an eight-page report that, Fowler wrote, “listed every drink and snack I had bought since 2022, every offer I had received, and every tap I had made in the app. … It also showed that Starbucks could share my information with 64 other companies … that specialize in using personal data to tailor offers and prices.”
Such apparently unfettered data sharing will strike most consumers as unsettling at the least, if not outright creepy. Worse still, Fowler learned that the more coffee he bought, the fewer discounts he received. In effect, Starbucks was running a reverse loyalty program: the more he bought, the more he paid.
Loyalty programs are a foundational piece of marketing in the consumer economy, but a recent study by researchers from UC Berkeley and Vanderbilt University concludes that abuse is widespread.
“The playbook is clear,” they write. “Companies mine loyalty data to gauge how much customers will tolerate and which tactics spur purchases. They then design opaque programs that mask whether consumers are saving anything at all.”
Study co-author Samuel A. A. Levine, a former director of the FTC’s Bureau of Consumer Protection, told the Post: “Loyalty programs sound like a win-win deal, but they’re really just wins for companies.”
The authors advocate heightened government regulation, and New York State has already acted. According to a recent report in The New York Times, the state’s new law is intended to keep online retailers from, for example, “jacking up the price of jeans for a shopper with a history of buying expensive pants.”
The law requires retailers that use algorithmic pricing to post a disclosure at checkout: “This price was set by an algorithm using your personal data.” Versions of the law are being debated in a handful of other states.
Most consumers will find such an admission troubling.
A recent study by First Insight found that nearly 70% of consumers said they are “very” or “extremely” protective of their personal data and would consider merchants that covertly use it less trustworthy.
As awareness spreads and AI-assisted shopping continues its rapid growth, the retail and AI industries collectively may soon face a day of reckoning similar to the backlash that followed the discovery that browser platforms had been secretly using third-party “cookies” to track users’ web activity.
Retailers and brands should be on notice: if they aren’t up front about what they are doing, the backlash could prove far more costly than the extra profit such tactics generate.