FTC Takes on Fake AI-Generated Online Product Reviews
Businesses could be barred from buying or creating fake reviews or testimonials for online products and services
The Federal Trade Commission (FTC) has finalized a rule barring the posting of fake reviews and product testimonials online, including those generated by AI.
The final rule is designed to address AI-generated fake reviews, barring businesses from creating, buying or selling such reviews.
Businesses that sell testimonials to e-commerce brands also face a crackdown, with brands prohibited from buying fake reviews or generating their own.
The rules also prohibit the buying or selling of fake social media metrics, including likes, follows or views, created by bot accounts.
However, the social media provisions apply only where the buyer “knew or should have known” that the engagement was being faked by bot accounts.
“Fake reviews not only waste people’s time and money but also pollute the marketplace and divert business away from honest competitors,” said FTC chair Lina M. Khan. “By strengthening the FTC’s toolkit to fight deceptive advertising, the final rule will protect Americans from getting cheated, put businesses that unlawfully game the system on notice and promote markets that are fair, honest and competitive.”
The FTC has been working on rules to combat fake reviews since November 2022, introducing several refined versions before it settled on the one published this week.
Amid the rise of generative AI technologies over the past few years, the FTC expressed concern that such tools have made it “even easier” to create fake reviews.
The final rule will take effect 60 days after its publication in the Federal Register, following its unanimous approval by the commission.