Massachusetts official warns AI systems subject to consumer protection, anti-bias laws

BOSTON (AP) — Developers, suppliers, and users of artificial intelligence must comply with existing state consumer protection, anti-discrimination, and data privacy laws, the Massachusetts attorney general cautioned Tuesday.

In an advisory, Attorney General Andrea Campbell pointed to what she described as the widespread increase in the use of AI and algorithmic decision-making systems by businesses, including technology focused on consumers.

The advisory is meant in part to emphasize that existing state consumer protection, anti-discrimination, and data security laws still apply to emerging technologies, including AI systems, despite the complexity of those systems, just as they would in any other context.

“There is no doubt that AI holds tremendous and exciting potential to benefit society and our commonwealth in many ways, including fostering innovation and boosting efficiencies and cost savings in the marketplace,” Campbell said in a statement.

“Yet, those benefits do not outweigh the real risk of harm that, for example, any bias and lack of transparency within AI systems can cause our residents,” she added.

Falsely advertising the usability of AI systems, supplying an AI system that is defective, and misrepresenting the reliability or safety of an AI system are just some of the actions that could be considered unfair and deceptive under the state’s consumer protection laws, Campbell said.

Misrepresenting audio or video content of a person for the purpose of deceiving another into engaging in a business transaction or supplying personal information as if to a trusted business partner, as in the case of deepfakes, voice cloning, or chatbots used to commit fraud, could also violate state law, she added.

The goal, in part, is to encourage companies to ensure that their AI products and services are free from bias before they enter the stream of commerce, rather than face consequences afterward.

Regulators also say that companies should disclose to consumers when they are interacting with algorithms. A lack of transparency could run afoul of consumer protection laws.

Elizabeth Mahoney of the Massachusetts High Technology Council, which advocates for the state’s technology economy, said that because there might be some confusion about how state and federal rules apply to the use of AI, it is important to spell out state law clearly.

“We think having ground rules is important, and protecting consumers and protecting data is a key component of that,” she said.

Campbell acknowledges in her advisory that AI holds the potential to bring great benefits to society even as it has also been shown to pose serious risks to consumers, including bias and a lack of transparency.

Developers and suppliers promise that their AI systems and technology are accurate, fair, and effective even as they also claim that AI is a “black box,” meaning they do not know exactly how AI performs or generates results, she said in her advisory.

The advisory also notes that the state’s anti-discrimination laws prohibit AI developers, suppliers, and users from deploying technology that discriminates against individuals based on a legally protected characteristic, such as technology that relies on discriminatory inputs or produces discriminatory results that would violate the state’s civil rights laws, Campbell said.

AI developers, suppliers, and users also must take steps to safeguard personal data used by AI systems and comply with the state’s data breach notification requirements, she added.
