New York has become the first state to implement regulations requiring businesses to disclose when they use personalized pricing, a practice in which different customers are shown different prices for the same products or services based on their personal data. The new law, part of the state’s recently approved budget, took effect immediately upon the budget’s passage. It aims to increase transparency around algorithmic pricing and help consumers understand how their data influences the prices they see online.
The regulations mandate that businesses inform customers with a clear statement: “This price was set by an algorithm using your personal data.” The disclosure applies whenever a company uses a shopper’s browsing history, location, demographics, or other personal information to determine the price displayed. The legislation’s passage marks a significant step toward regulating data-driven pricing models, a previously opaque corner of e-commerce.
Understanding Personalized Pricing and the New Law
Personalized pricing builds on dynamic pricing, which isn’t a new concept: airlines and hotels have long adjusted prices based on demand and time of booking. But where dynamic pricing responds to market-wide conditions, personalized pricing targets the individual, and the practice has become increasingly sophisticated with the proliferation of consumer data and the use of complex algorithms. Businesses can now tailor prices to individual consumers in ways that were previously impossible.
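To make the mechanism concrete, here is a deliberately simplified TypeScript sketch of how a rule-based pricing function might fold personal signals into a quoted price. Every name, signal, and weight below is invented for illustration; real systems are proprietary and typically rely on machine-learned models rather than hand-written rules.

```typescript
// Toy illustration only; all signals and weights are hypothetical.
interface ShopperProfile {
  viewedProductTimes: number; // how often the shopper viewed this item
  inHighIncomeArea: boolean;  // inferred from location data
  usesPremiumDevice: boolean; // inferred from the browser's user agent
}

function personalizedPrice(basePrice: number, profile: ShopperProfile): number {
  let multiplier = 1.0;
  if (profile.viewedProductTimes > 3) multiplier += 0.05; // repeated interest
  if (profile.inHighIncomeArea) multiplier += 0.1;        // location signal
  if (profile.usesPremiumDevice) multiplier += 0.03;      // device signal
  return Math.round(basePrice * multiplier * 100) / 100;  // round to cents
}

// A shopper matching all three signals sees $118.00 on a $100.00 base price.
console.log(personalizedPrice(100, {
  viewedProductTimes: 4,
  inHighIncomeArea: true,
  usesPremiumDevice: true,
}));
```

The point of the sketch is the shape of the problem, not the details: two shoppers calling the same function with different profiles receive different prices for the same item.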
The concern driving the new law is that this personalization can lead to unfair or discriminatory pricing. While advantageous for businesses looking to maximize profits, it raises questions about consumer fairness and the potential for exploitation. The legislation specifically targets price discrimination based on an individual’s data profile, rather than broad market factors such as regional demand or time of purchase.
How the Law Works in Practice
The law requires a relatively simple disclosure, using the exact text quoted above, whenever an algorithm uses personal data to set a price. Businesses are expected to build this notification into their online shopping experiences. According to reports, Uber has begun displaying the message to New York users, although the company maintains that its pricing is based on geography and demand.
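For illustration, a minimal TypeScript sketch of how a storefront might attach the required notice follows. Only the disclosure sentence comes from the law; the PriceQuote shape, the usedPersonalData flag, and the rendering logic are assumptions made for this example.

```typescript
// The statutory disclosure text; everything else here is hypothetical.
const DISCLOSURE =
  "This price was set by an algorithm using your personal data.";

interface PriceQuote {
  amount: number;            // final price quoted to this shopper
  usedPersonalData: boolean; // set upstream if personal data shaped the price
}

function renderPriceLabel(quote: PriceQuote, shopperInNewYork: boolean): string {
  const price = `$${quote.amount.toFixed(2)}`;
  // Append the notice only when personal data influenced the price
  // and the shopper is covered by the New York law.
  return quote.usedPersonalData && shopperInNewYork
    ? `${price} (${DISCLOSURE})`
    : price;
}

console.log(renderPriceLabel({ amount: 24.99, usedPersonalData: true }, true));
// -> "$24.99 (This price was set by an algorithm using your personal data.)"
```

How broadly the usedPersonalData flag must be set is exactly the interpretive question discussed next.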
However, the implementation isn’t without ambiguity. The exact definition of “personal data,” and how extensively it must have influenced a pricing decision to trigger the disclosure, remain open to interpretation. This lack of clarity sparked the initial legal challenge.
The National Retail Federation (NRF) filed a lawsuit seeking to block the law, arguing it was overly broad and would create compliance difficulties. The NRF argued that the law would require retailers to disclose the use of pricing algorithms even when personal data played a minimal role in setting prices. A federal judge subsequently denied the NRF’s request for a preliminary injunction, allowing the law to proceed while the legal challenge unfolds. This means businesses must currently comply with the disclosure requirements.
The Broader Context: Algorithmic Accountability
New York’s move reflects a growing trend toward greater algorithmic accountability across various sectors. Policymakers are increasingly focused on the potential for bias and unfairness embedded within algorithms used in credit scoring, hiring practices, and other critical decision-making processes. Concerns over price discrimination and lack of transparency are fueling these efforts.
The Federal Trade Commission (FTC), under then-Chair Lina Khan, had previously signaled increased scrutiny of algorithmic pricing. Khan, now a co-chair of the New York City mayoral transition team, reportedly views the New York law as a crucial tool for government oversight. She has also said that further regulation will be necessary to address the complexities of this evolving landscape.
The legislation isn’t solely focused on immediate consumer impact. Advocates believe greater transparency will force businesses to scrutinize their own pricing algorithms and potentially rein in discriminatory practices.

