Delaware House Democrats

House Passes Bills to Increase Transparency and Reduce the Risk of Negative Outcomes During Interactions with AI

May 6, 2026

DOVER – Recognizing growing risks for consumers as artificial intelligence (AI) technology continues to rapidly advance, the House passed two bills on Tuesday that would protect Delawareans from fraud and deception in an increasingly digital world.

Taken together, the two measures would increase penalties for theft involving the impersonation of a family member, a tactic that has become more common as scammers use AI to mimic voices or identities to exploit victims, and require more transparency in consumer interactions with AI chatbots.

Sponsored by Rep. Eric Morrison and Sen. Kyra Hoffner, House Bill 326 with House Amendment 1 would expand Delaware law relating to theft, making theft by impersonation of a family member a unique crime that would constitute a class B, C, E, or F felony depending on the amount of monetary losses in each case. 

“Committing theft by impersonating a family member is especially manipulative and heinous. We care about our family members and want to help them financially when they need it, so this type of fraud has a higher success rate than others,” said Rep. Eric Morrison.

“Preying upon those who are kind enough to help those in need is disgusting, and turns a good deed into a scary and often lengthy legal battle. As we navigate an age where impersonation is becoming more and more common due to rapidly developing AI tech, I am proud to be taking this step now, so more victims can get the justice they deserve.”

Under HB 326, a person is guilty of theft by impersonation of a family member when they: 

  1. Take, exercise control over, or obtain property of another person intending to deprive that person of it or appropriate it.
  2. Are not an immediate family member of the victim and, in the course of committing or attempting the theft, use any oral, written, or electronic communication represented to be from an immediate family member of the victim, or from someone acting on behalf of or for the benefit of an immediate family member of the victim, with the intent that the communication facilitate the theft.

“While some fraudulent tactics target certain vulnerable populations like seniors or young students, the use of a family member as bait has the power to trick even the most vigilant person,” said Sen. Kyra Hoffner, Senate prime sponsor of the bill. 

“With the growing prevalence of AI and other sophisticated technologies, we as lawmakers have a responsibility to ensure that we constantly monitor and modernize our consumer protection laws to keep pace with evolving trends and techniques. That’s what House Bill 326 is about.” 

While impersonation scams are not new, the use of AI to carry them out has made them more convincing and difficult to detect. These scams are becoming increasingly common, with bad actors using video or audio recordings pulled from social media to create deepfakes. In many cases, scammers use these deepfakes to impersonate family members or loved ones, claiming to be under duress and urgently in need of money.

According to the Federal Bureau of Investigation’s 2025 Internet Crime Report, more than 22,000 instances of AI-related fraud were reported last year, with almost $893 million in reported losses. 

HA 1 to HB 326 removes the conviction penalty and sentencing requirements for this crime from the bill, leaving sentencing to the discretion of the courts.

Sponsored by Rep. Cyndie Romer and Sen. Bryan Townsend, House Bill 306 with House Amendments 1 and 3 strengthens AI language in the Delaware Code by expanding the definition of artificial intelligence and formally defining the terms “chatbot” and “avatar,” while also making deceptive computer communication an unlawful trade practice.

Under HB 306, business entities must disclose to consumers when they are interacting with computer technology, including a chatbot, artificial intelligence agent, or avatar, in circumstances where a reasonable person would believe they are engaging with an actual human.

“We’re all used to businesses stopping us from entering their websites until we verify that we are human, whether that’s by ‘selecting all squares with traffic lights’ or checking a box that says ‘I’m not a robot.’ But those same businesses are not required to disclose that the ‘person’ that we are interacting with on their site is not actually human,” said Rep. Cyndie Romer. 

“HB 306 is about transparency. We deserve the right to know if we are interacting with a human or a robot, especially when it comes to sensitive situations such as monetary transactions.”

If the business entity fails to notify the consumer in a clear and conspicuous manner that they are interacting with a chatbot, the consumer can initiate a private right of action. If the business entity is found liable in a civil action, they could face a maximum liability of any actual damages as well as statutory damages not exceeding $1,000, and, in the case of a class action, up to $10 million. 

Additionally, the Attorney General has the power to seek injunctive relief and a civil penalty of up to $5 million for violations.

“Delawareans shouldn’t be deceived when tending to their customer service or business needs. We should know when we’re talking to an actual person or an AI avatar or chatbot, because this technology lacks the proper understanding to assist with complex and nuanced inquiries,” said Senate Majority Leader Bryan Townsend.

“House Bill 306 will bolster transparency by modernizing our AI laws and making deceptive computer communications illegal.”

HA 1 to HB 306 provides a safe harbor, which makes it clear that a business entity is not in violation of HB 306 if, at the beginning of each interactive session, the business states, in a clear and conspicuous manner, “You are interacting with a computer, not a human.”

HA 3, introduced on the House floor Tuesday, clarifies the amount of damages that may be awarded in a cause of action under HB 306.

A recent survey found that 85% of retail and e-commerce businesses have implemented chatbots in their e-commerce operations. 

In addition to House Bills 326 and 306, the General Assembly also passed HB 191 in March, which clarifies that nonhuman entities, including agents powered by AI, may not be licensed as a professional nurse, APRN, practical nurse, physician, or physician assistant, or use any of those professional titles.

In the case of pharmaceutical and other medical-related sales, the requirement that chatbots must disclose that they are not human, along with the prohibition of AI agents from identifying themselves as medical professionals, adds an extra layer of protection for consumers. 

House Bills 326 and 306 now head to the Senate for consideration.

###
