Gig Economy, Dark Patterns and Impersonation Scams: FTC Flags Its Priorities at Open Commission Meeting

On September 15, 2022, the Federal Trade Commission (FTC) held a public meeting of the Commission covering three agenda items: 1) a notice of proposed rulemaking on impersonation scams, 2) a policy statement on the application of the law to gig work, and 3) a staff report on dark patterns. While items (1) and (3) advanced with a 5-0 bipartisan vote, the policy statement on the gig economy passed by a 3-2 vote along party lines. This alert outlines the implications for the FTC's future activities in these areas.

Notice of proposed rulemaking on impersonation scams

During the meeting, the FTC voted 5-0 to issue a Notice of Proposed Rulemaking codifying the established principle that impersonation scams violate the FTC Act. The proposed rule would also allow the FTC to recover money from, or seek civil penalties against, scammers posing as companies or government agencies.

Analysis/Conclusion: This regulatory proposal is not particularly controversial. Although some have expressed skepticism about the FTC exercising its so-called Magnuson-Moss rulemaking authority in other contexts, there appears to be bipartisan support for using it on this kind of narrow issue. That contrasts sharply with the disagreements surrounding other recent FTC actions: the proposed privacy rulemaking, which advanced on a 3-2 vote, sought public input on 95 questions affecting the wider economy and generated concerns that the FTC may be exceeding its statutory authority.

Policy statement on the application of the law to gig work

The FTC has identified several enforcement priority areas with respect to the gig economy:

  • Income claims: The FTC stated that false, misleading, or unsubstantiated claims about worker earnings may be considered unfair or deceptive under Section 5 of the FTC Act. The FTC also said that, under the Business Opportunity and Franchise Rules, gig companies that require new entrants to make payments may be required to disclose their earnings claims and the documents supporting those claims.
  • Undisclosed costs or working conditions: As with income claims, the FTC said that misleading claims or non-disclosures regarding start-up costs, training costs, other expenses, or other material conditions of gig work may violate Section 5 of the FTC Act, the Franchise Rule, or the Business Opportunity Rule.
  • Algorithmic decision-making: The FTC noted that gig economy companies may violate Section 5 of the FTC Act when they use algorithms to make work-related decisions, such as hiring and firing, compensation, availability of work, and performance evaluation.
  • Unilateral contractual clauses: The FTC warned about unbalanced, non-negotiable contracts that include provisions such as prohibitions on posting negative reviews or on seeking other employment during or after someone's time with the company. The FTC said these one-sided terms could be considered unfair under Section 5 of the FTC Act.
  • Unfair competition: The FTC said it will investigate evidence of agreements among gig companies to illegally fix wages, benefits, or fees for gig workers. The FTC will also challenge mergers that substantially lessen competition and will investigate exclusionary or predatory conduct that could harm customers or result in reduced pay or poor working conditions for gig workers.

Commissioners Noah Phillips and Christine Wilson dissented. Both suggested that the FTC should focus on enforcement actions rather than policy statements. Commissioner Wilson also expressed concern that the FTC was overstepping its mission by addressing worker harms, as opposed to consumer harms.

Analysis/Conclusion: Discussions at the meeting confirmed what was already clear to FTC watchers: competition, consumer protection, and privacy issues in the gig economy will remain at the top of Chair Lina Khan's agenda.

Dark Patterns Staff Report

The FTC voted 5-0 to release a staff report on dark patterns, stemming from an April 2021 FTC workshop on the subject. The FTC defined dark patterns as "design practices that trick or manipulate users into making choices they otherwise would not have made that may cause harm," and said it would take enforcement action when companies use these patterns to mislead consumers. In its report, the FTC provided many examples of problematic dark patterns. Some are well established in law and case law, such as using misleading testimonials or endorsements, formatting advertisements to make them appear to be independent journalism or other content, and failing to inform consumers of recurring subscription fees or to allow easy cancellation of subscriptions. But the FTC also highlighted newer and more unexpected examples of dark patterns, such as the following:

  • In the area of sales tactics:
    • Create pressure to buy a product by falsely saying that demand is high (“20 other people are looking at this item”) or stock is low (“only one left!”)
    • Use baseless or fake countdown timers, fake limited-time offers (e.g., "offer ends in 00:59:48"), and even false claims of "discounts" or "sales"
    • Prevent buyers from easily comparing prices by bundling items together, using different metrics (e.g., unit price vs. price per ounce), or showing a per-payment price (e.g., $10 per week) without disclosing the total number of payments or the total cost
    • Add hidden fees or introduce fees very late in the purchase process without prior disclosure (e.g., unexpected "convenience fees" appearing just before checkout)
  • In the area of privacy:
    • Obfuscate or subvert privacy choices by using double negatives ("Uncheck the box if you prefer not to receive email updates"), confirmshaming (e.g., "No, I don't want to save money"), and pre-selected defaults that are "good for the business and not for the consumer"
    • Require users to create an account or share their information in order to complete a task
    • Repeatedly and disruptively ask whether a user wants to perform an action
  • Regarding advertising aimed at children:
    • Hide real costs by forcing consumers to buy things with virtual currency (e.g., “coins” or “acorns” in kids’ apps)
    • Autoplay another video after one ends, in unexpected or harmful ways (for example, after the first video ends, a less kid-friendly video or a sponsored ad camouflaged as a recommended video plays automatically)
    • Use cartoon characters to encourage children to make in-app purchases

Analysis/Conclusion: While it is debatable whether the FTC could prove that all of these specific examples rise to the level of deceptive or unfair practices, many of them mirror examples of dark patterns provided in the California Privacy Protection Agency's draft regulations published this summer. Given the regulatory scrutiny of these issues, companies should review their consumer-facing interfaces in light of these examples to ensure that their practices do not attract the attention of these regulators.

Wilson Sonsini Goodrich & Rosati routinely helps companies resolve complex privacy, data security, and consumer protection issues and respond to FTC and other regulatory inquiries. For more information on privacy issues, please contact Maneesha Mithal, Lydia Parnes, Roger Li, or another member of the firm's privacy and cybersecurity practice. For more information on antitrust matters, please contact Michelle Yost Hale or another member of the firm's antitrust and competition practice.
