With all the negative media articles surrounding AI companions (or more specifically, AI girlfriends), and one or two isolated and tragic examples of somebody being harmed by their relationship with an AI companion, it’s no surprise that lawmakers and regulators are starting to push for restrictions on them. California is getting in early, with a new bill (SB 243) that would require an AI companion to intrusively and repeatedly remind the human talking to it that it is only a machine. Talk about killing the romantic vibe! The bill would also mandate that AI companion platforms not use ‘addictive tricks’ or ‘unpredictable rewards’. More reasonably, warnings that AI companions are not suitable for minors would also be required.
Introduced by state Sen. Steve Padilla, the bill would require companies running companion chatbots to avoid using addictive tricks and unpredictable rewards. They’d be required to remind users at the start of the interaction and every three hours that they’re talking to a machine, not a person. And they’d also be required to clearly warn users that chatbots may not be suitable for minors.
If passed, it would be among the first laws in the country to regulate AI companions with clear safety standards and user protections.
“We can and need to put in place common-sense protections that help shield our children and other vulnerable users from predatory and addictive properties that we know chatbots have,” Padilla, a Democrat, told reporters at a press conference in Sacramento on Tuesday.
Source: https://statescoop.com/california-sb243-harmful-ai-companion-chatbots/
Another “save the children” initiative that will have hugely negative impacts on the ability of millions of adults to find sexual fulfillment in nascent sex tech. However, the bill is meeting some opposition. The Computer & Communications Industry Association (CCIA) testified before a committee, voicing concerns about legal risks and the stifling of innovation. Though it appears that their concerns aren’t really about AI girlfriend sites and apps.
Under SB 243, AI models that support everyday tasks like tutoring, mock interviews, or customer service could be classified as “companion chatbots,” even if they weren’t designed to simulate human companionship or meet users’ social needs. These tools would be subject to new rules, including repeated pop-up disclosures, mandatory audits, and detailed reporting requirements.
Featured image found on Wikipedia.