Here’s How an Intellectual Property-Focused AI Bill Could Affect Brands and Retailers

A new bill wants to take on the way artificial intelligence models leverage intellectual property and consumer data. 

Senators Josh Hawley (R-Mo.) and Richard Blumenthal (D-Conn.) on Monday proposed the AI Accountability and Personal Data Protection Act, which could change the way Big Tech companies developing large language models (LLMs) are allowed to train their tools. 

The bill aims to prohibit AI companies from using copyrighted works to train models without the owner’s consent and to mandate transparency about how consumer data is handled, among other provisions.

Hawley said technology companies are “robbing the American people blind while leaving artists, writers and other creators with zero recourse,” which is what caused his interest in introducing this bill alongside Blumenthal. 

Blumenthal said the bill’s safeguards could ensure American creators and consumers see a path forward that leaves room for human-made creative work while also allowing AI to advance steadily.

“This bill embodies a bipartisan consensus that AI safeguards are urgent—because the technology is moving at accelerating speed, and so are dangers to privacy. Enforceable rules can put consumers back in control of their data, and help bar abuses,” Blumenthal said in a statement. “Tech companies must be held accountable—and liable legally—when they breach consumer privacy, collecting, monetizing or sharing personal information without express consent. Consumers must be given rights and remedies—and legal tools to make them real—not relying on government enforcement alone.”

And while Hawley and Blumenthal put an onus on helping working-class Americans in their respective announcements related to the bill, Kirk Sigmon, an intellectual property attorney at law firm Banner Witcoff, said his interpretation is that it also could be applied to companies’ copyrighted works. 

That means that, going forward, large technology companies could be barred from scraping brands’ and retailers’ copyrighted images and designs to train LLMs and other public-facing, general-purpose AI models.

“The way in which I look at it is to say that it is agnostic to small IP, big IP. It’s just talking about copyright, writ large,” he said. “It does seem to set up a system which would effectively kill the ability to train large language models.”

That, he contends, is because LLMs require a gargantuan amount of training data to function effectively. Without such data, the systems can serve adverse, false results or fail to understand the context of a user’s query, among other issues. 

“If you’re an AI company, this sucks for you, because the reality is, you need this volume of content to be able to do anything remotely approximating good training,” Sigmon told Sourcing Journal. “The fact that you’re telling them, ‘Any one of those pieces of content can invite you into a copyright lawsuit,’ [could] kill it.” 

But despite the threat the bill poses to large AI players, it could have the opposite effect for small creators, authors and artists—as well as fashion and apparel brands, which often rely on individuality in design to cut through the noise in a highly saturated market. 

“If you are creating copyrighted work—say, some sort of creative image—the last thing you want to do is allow others to use that content to train a model to generate permutations, because even if it might be sufficiently different from what you’re creating, it does invite the possibility that they’re creating something just close enough to compete with the market,” Sigmon said. 

The bill does not mention trademarks, which are a valuable intellectual property consideration for many fashion and apparel brands and retailers. Sigmon said that today’s trademark lawsuits typically center on counterfeit or fake items, and noted that current law could already be used against criminals who use AI models to develop dupes of iconic logos and trademarks.

The bill mixes intellectual property considerations with data privacy issues, a line that Sigmon said continues to blur as emerging technologies grow and expand. According to Hawley’s office, the bill, if passed, “allows individuals to sue any person or company that appropriates, uses, sells or exploits their personal data or copyrighted works without clear, affirmative consent.”

Sigmon said retailers wanting to continue using consumer data for personalized marketing or AI model training will need to double down on their data security measures, but noted that many companies—particularly those that have an arm transacting in the European Union—already have some safeguards in place to ensure consumers agree to their use of data. 

But for both issues—copyright and data misuse—the burden of proof is likely to be high; many of the largest players in Big Tech have already trained several iterations of the systems that power their LLMs, and much of that work goes on in what many think of as a “black box.”

“It’s very hard to reverse engineer these machine learning models to figure out how things were trained, so you really have to have some sort of external evidence of your content being used,” Sigmon explained. 

Despite the effects the bill could have on individual consumers and businesses alike, Sigmon said he harbors some doubt about its potential to make it through both chambers of the federal legislature. That’s not necessarily because of the difficulty of proving what the bill would prohibit, but rather because of Big Tech’s consistent influence over the political sphere. 

“I understand why this is being proposed. There are really significant concerns here, and I feel for copyright owners and trademark owners who are finding themselves dealing with this sort of issue. But do I think this is going to be passed? Very unlikely,” he said. “There’s too many people with too much money involved in this process to let this [bill] go through, because it really would decimate your ChatGPTs across the world—they just could not survive this, in my view.”