FTC Chair Lina Khan, shown here speaking at the Economic Club of New York in July 2023, is looking to expand her agency's authority to target fraudsters who use impersonation in scams. Michael M. Santiago/Getty Images

FTC cracks down on AI impersonation scammers

The Federal Trade Commission is looking to extend its authority to target fraudsters who impersonate individuals in scams. The proposed rule covers the misuse of generative artificial intelligence and other technologies.

On Thursday, the Federal Trade Commission finalized a new rule prohibiting the impersonation of government entities and businesses to help curb scams, and put out a call for public input on a proposed rule covering artificial intelligence-generated scams that target individuals.

The agency’s actions follow growing complaints about generative AI software being used to create synthetic content that mimics individuals, from image and video deepfakes to cloned voices and other artificial audio.

The FTC is seeking feedback on how to assign liability to companies that provide the tools used to create and deploy this type of content.

“Fraudsters are using AI tools to impersonate individuals with eerie precision and at a much wider scale. With voice cloning and other AI-driven scams on the rise, protecting Americans from impersonator fraud is more critical than ever,” said FTC Chair Lina Khan. “Our proposed expansions to the final impersonation rule would do just that, strengthening the FTC’s toolkit to address AI-enabled scams impersonating individuals.”

Under the new rule, slated to take effect in the coming weeks, the FTC bans the impersonation of businesses and government entities. According to an FTC blog post published Thursday, the rule "outlaws some of fraudsters’ favorite means of impersonation," including the use of government seals, business logos, email address spoofing and more.

At the same time, the FTC wants to expand its authority to encompass content made by AI and machine learning software. 

Notably, this additional rule doesn't just apply to the scammers themselves; it also targets actors who provide the “means and instrumentalities” to commit an impersonation scam. In public comments, some tech trade associations were leery of imposing third-party liability without more clarity, but commenters broadly supported giving the agency some authority to go after suppliers of the tools used to generate scam content.

The commission voted to propose imposing liability on “those who provide goods or services with knowledge or reason to know that those goods or services will be used in impersonations of the kind that are themselves unlawful under the Rule.”

“Given the proliferation of AI-enabled fraud, this additional protection seems especially critical,” FTC officials wrote in a statement. “Ensuring that the upstream actors best positioned to halt unlawful use of their tools are not shielded from liability will help align responsibility with capability and control.”