Rate Your Experience

Professor Keith Cunningham-Parmeter discusses the power customer rating systems hold over gig workers and how those ratings could trigger antidiscrimination protections in employment law.

“Please rate your experience from one to five stars.”

Those frequent rating requests led Professor Keith Cunningham-Parmeter to question the legality of what companies do with these ratings, particularly when they are fed into algorithms that influence hiring and firing decisions.

Ratings pose a troubling possibility: Some customers may express their bias—conscious or unconscious—regarding race, gender, appearance, or other protected characteristics. “The algorithms do not distinguish between legitimate customer motives and discriminatory feedback,” says Cunningham-Parmeter. “With the rapid expansion of rating systems, unchecked customer biases threaten to erode fundamental antidiscrimination protections in employment law.”

In his article published in UCLA Law Review, “Discrimination by Algorithm: Employer Accountability for Biased Customer Reviews,” Cunningham-Parmeter explores the legal regulation of customer-based, algorithmic discrimination in the workplace.

“Without a legal framework to address these changes, the expanding influence of consumer-sourced feedback threatens to undermine the fundamental antidiscrimination protections that lie at the core of American employment law.”

Why Are Customer Reviews So Common?

In highly competitive service sectors, businesses must keep customers happy in order to thrive. To help achieve this end, modern technology allows companies to continuously monitor the quality of customer interactions.

In the past, companies collected feedback through analog methods like comment cards, toll-free numbers, and focus groups; however, these channels were often slow and unreliable. Today, firms can ask for immediate feedback through apps, texts, or other electronic means. What was once a time-consuming process with low response rates has evolved into digital review systems that can assess workers on a large scale in real time.

This shift has made it easier for companies to gather valuable customer insights quickly and efficiently, driving the widespread adoption of customer-centric reviews across various industries.

The Effect on Gig Workers

Gig workers are especially vulnerable in these systems: a few bad ratings can cost them their jobs.

For example, Cunningham-Parmeter says, “Uber ‘deactivates’ drivers when their average rating falls below a predetermined level, approximately 4.6 out of five stars. A one-star review can cause drivers hovering above that line to lose their jobs, even if discrimination influenced the low rating. Similarly, businesses in retail and service settings fail to screen customer ratings for bias, even though low customer scores can lead directly or indirectly to discharge.”

Cunningham-Parmeter coined the term “algorithmic cliff” to describe how a handful of low customer ratings can push a gig worker below a threshold that triggers deactivation of their account at that company. Legal frameworks must evolve to hold companies accountable for these discriminatory practices and to recognize the role of customers as “algorithmic managers … giving rise to an entirely new class of managerial customers.”
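The mechanics of that cliff can be made concrete. The sketch below is purely illustrative — the 4.6 cutoff comes from the Uber example above, but the function names and sample ratings are hypothetical, not any platform's actual implementation:

```python
# Illustrative "algorithmic cliff": a hypothetical platform deactivates a
# worker when their average star rating falls below a fixed cutoff.
# The 4.6 threshold echoes the Uber example; the data is invented.

DEACTIVATION_THRESHOLD = 4.6

def average_rating(ratings):
    """Mean of all star ratings received so far."""
    return sum(ratings) / len(ratings)

def is_deactivated(ratings, threshold=DEACTIVATION_THRESHOLD):
    """True if the worker's average has fallen below the cutoff."""
    return average_rating(ratings) < threshold

# A driver hovering just above the line (average 4.8)...
ratings = [5, 5, 5, 4, 5, 5, 4, 5, 5, 5]
assert not is_deactivated(ratings)

# ...is pushed over the cliff by a single one-star review (average ~4.45),
# regardless of whether bias motivated that rating.
assert is_deactivated(ratings + [1])
```

Note how the rule is entirely mechanical: nothing in it asks *why* the one-star review was given, which is precisely the gap Cunningham-Parmeter identifies.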

In another paper, “From Amazon to Uber: Defining Employment in the Modern Economy,” published in the Boston University Law Review, Cunningham-Parmeter explains how independent contractor status affects legal protections for workers. “[B]usinesses use the contractor defense to disclaim responsibility for complying with basic workplace rights such as overtime and antidiscrimination protections.” As such, when faced with biased customer ratings, these workers encounter two distinct legal challenges: (1) establishing coverage under existing antidiscrimination frameworks; and (2) proving that they were actually harmed by discriminatory reviews.

“The traditional view of customers as clients assumes that customers have no direct power to discipline or discharge workers. Yet today, online review systems allow customers to rate workers and decide their fates.”

Biased Ratings

The rapid increase in the use of ratings amplifies the effects of customer biases. “Numerous studies demonstrate that, after controlling for objective criteria, customers assign lower satisfaction ratings to female and nonwhite service employees in offline markets,” writes Cunningham-Parmeter. “Customers exhibit similar patterns of bias in online transactions as well. For instance, the discriminatory practices of certain Airbnb and eBay users are well-documented.”

“Unfortunately, this increased solicitation of customer feedback creates a substantial risk that customer bias will creep into online reviews. Safely ensconced in algorithmic anonymity, customers can currently assign biased, poor reviews to workers without having to justify the negative rating.”

This bias extends to various professions, from instructors to freelance workers, showing that unregulated and anonymous review systems can perpetuate discrimination, significantly impacting workers’ livelihoods.

A More Equitable System

Critics argue that making companies responsible for biased reviews deters customer candor and unfairly holds firms accountable for the discriminatory reviews of third-party customers, potentially undermining the benefit of online review systems.

Cunningham-Parmeter suggests that, “As a tactic for preserving customer candor, firms could simply not inform reviewers that their accounts have been flagged for suspicious activity. Instead, businesses could silently ignore the feedback of customers who exhibit questionable review patterns.” This maintains customer candor without allowing algorithmic discrimination to harm workers.
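One way to picture this "silent discount" idea is a scoring function that simply excludes flagged customers' reviews from a worker's average without ever telling the reviewer. The sketch below is an assumption-laden illustration — the function, customer IDs, and data are all hypothetical:

```python
# Hedged sketch of silently discounting flagged reviewers: feedback from
# customers exhibiting suspicious rating patterns is excluded from the
# worker's score, while the reviewer is never notified. Hypothetical data.

def worker_score(reviews, flagged_customers):
    """Average star rating, silently ignoring flagged customers' reviews.

    reviews: list of (customer_id, stars) tuples.
    flagged_customers: set of customer_ids with questionable review patterns.
    """
    kept = [stars for customer, stars in reviews if customer not in flagged_customers]
    if not kept:
        return None  # no trustworthy reviews yet
    return sum(kept) / len(kept)

reviews = [("c1", 5), ("c2", 5), ("c3", 1), ("c4", 5)]

# Naive average counts every review, including the suspect one-star rating.
naive = worker_score(reviews, flagged_customers=set())        # 4.0

# If c3 has been flagged for a suspicious pattern, their rating is quietly
# dropped -- the worker's score reflects only unflagged feedback.
screened = worker_score(reviews, flagged_customers={"c3"})    # 5.0
```

The design point is that the customer-facing experience is unchanged — candor is preserved — while the downstream algorithm stops acting on the flagged input.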

He also contends that proving discriminatory intent in reviews is feasible through algorithms that detect consistent biased behavior among customers, similar to how intent is inferred in traditional employment discrimination cases.

Employer Accountability Strategies

To combat bias, Cunningham-Parmeter recommends:

1. Anonymizing interactions: Platforms like Airbnb and Fancy Hands successfully mitigate bias by sharing identifying information only after a booking, or withholding it altogether, thus removing verbal or visual cues that could trigger bias.

2. Cross-validating low ratings: By comparing low ratings with objective performance metrics and soliciting detailed feedback from customers, companies can differentiate between genuine performance issues and biased reviews.

3. Auditing customer reviews: Implementing natural-language processing to flag problematic content and examining numerical ratings for biased patterns can help identify and mitigate discriminatory reviews.
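The third strategy's numerical-pattern audit can be sketched in a few lines. This is a toy illustration under stated assumptions — the group labels, the one-star gap threshold, and the data are invented, and a real audit would need far more statistical care (sample sizes, controls for objective performance):

```python
# Illustrative audit of numerical ratings: flag customers whose average
# rating for one group of workers is markedly lower than for another.
# A flag is a pattern worth human review, not proof of bias.

from collections import defaultdict

def flag_biased_raters(ratings, gap_threshold=1.0):
    """ratings: list of (customer_id, worker_group, stars) tuples.

    Returns the set of customer_ids whose per-group average ratings
    differ by more than gap_threshold stars.
    """
    by_customer = defaultdict(lambda: defaultdict(list))
    for customer, group, stars in ratings:
        by_customer[customer][group].append(stars)

    flagged = set()
    for customer, groups in by_customer.items():
        averages = [sum(v) / len(v) for v in groups.values()]
        if len(averages) > 1 and max(averages) - min(averages) > gap_threshold:
            flagged.add(customer)
    return flagged

ratings = [
    ("c1", "A", 5), ("c1", "B", 2), ("c1", "B", 1),  # 3.5-star gap across groups
    ("c2", "A", 4), ("c2", "B", 4),                  # consistent rater
]
suspicious = flag_biased_raters(ratings)  # {"c1"}
```

A flagged customer's future reviews could then feed into the silent-discount approach described earlier, or be routed to human reviewers for the detailed feedback that strategy 2 recommends.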

“At first glance, a rule that holds firms strictly responsible for the biased decisions might seem unfair to companies that have no direct control over the individual ratings that customers assign. … but these objections minimize the role that companies play in designing and acting upon review systems that facilitate customer bias.” Despite challenges, Cunningham-Parmeter believes such measures would enhance fairness and support broader antidiscrimination norms in employment settings.

“While many companies outwardly support antiracist principles, they fail to implement measures that would prevent racist feedback from negatively impacting workers. Unlike the current superficial efforts, a liability regime that mandates thorough monitoring of online reviews would push firms to genuinely adhere to antidiscrimination standards.”

“If customers retain the power to push workers over algorithmic cliffs, then firms have delegated their firing authority to customers as action managers. Likewise, employers that uncritically embrace the biased feedback of advisory clients play a culpable role in discriminatory outcomes,” says Cunningham-Parmeter.

“No longer merely the clients of companies, customers now actively supervise workers and decide their fates. In light of this shift, antidiscrimination law should recognize the ascendance of managerial customers and hold firms accountable for discriminatory customer reviews.”