
A Lawsuit Against Perplexity Calls Out Fake News AI Hallucinations

Perplexity did not respond to requests for comment.

In an emailed statement to WIRED, News Corp CEO Robert Thomson compared Perplexity unfavorably to OpenAI. “We applaud principled companies like OpenAI that understand that integrity and creativity are essential if we are to realize the potential of artificial intelligence,” the statement said. “Perplexity is not the only AI company abusing intellectual property, and it is not the only AI company that we will pursue vigorously and consistently. We have made it clear that we would rather woo than sue, but for the sake of our journalists, our writers and our company, we must challenge the content kleptocracy.”

OpenAI, however, faces its own accusations of brand dilution. In New York Times v. OpenAI, the Times claims that ChatGPT and Bing Chat attribute fabricated quotes to the Times, and accuses OpenAI and Microsoft of damaging its reputation through trademark dilution. In one example cited in the lawsuit, the Times alleges that Bing Chat claimed the Times had called red wine (in moderation) a “heart-healthy” food, when in fact it had not; the Times argues that its factual reporting has debunked claims about the healthfulness of moderate drinking.

“Copying news articles to power substitute commercial generative AI products is unlawful, as we have made clear in our letters to Perplexity and our litigation against Microsoft and OpenAI,” said Charlie Stadtlander, director of external communications at NYT. “We welcome this lawsuit from Dow Jones and the New York Post, which represents an important step in ensuring publisher content is protected from this type of misappropriation.”

Some legal experts are unsure whether the claims of false designation of origin and trademark dilution will bear fruit. Intellectual property lawyer Vincent Allen, a partner at Carstens, Allen & Gourley, believes the copyright infringement claims in the lawsuit are stronger, and says he would be “surprised” if the false designation of origin claim survives. Both Allen and James Grimmelmann, a professor of digital and internet law at Cornell University, believe the landmark trademark case Dastar v. Twentieth Century Fox Film Corp. could put an end to this line of attack. (In that ruling, a dispute over old World War II footage, the Supreme Court held that “origin” in trademark law does not refer to authorship; it covers tangible goods, like a counterfeit wallet, rather than creative works.) Grimmelmann is also skeptical that the trademark dilution claim will hold up: “Dilution means using a trademark in a way that erodes the distinctiveness of a famous mark. I don’t see that here,” he says.

If publishers prevail on their argument that hallucinations can violate trademark law, AI companies could face “immense difficulties,” according to Matthew Sag, a professor of law and artificial intelligence at Emory University.

“It is absolutely impossible to guarantee that a language model will not hallucinate,” says Sag. In his view, the way language models work, by predicting words that sound right in response to prompts, is always a kind of hallucination; sometimes the result just sounds more plausible than at other times.

“We only call it a hallucination when it doesn’t match our reality, but the process is exactly the same whether we like the result or not.”