New York City-endorsed AI chatbot advises users to break the law

April 1, 2024

  • A chatbot endorsed by the New York City mayor advised people to break the law
  • Developed in partnership with Microsoft, it's designed to help people understand government regulations
  • The chatbot has been widely criticized for giving inaccurate advice on housing and consumer laws

In October 2023, New York City Mayor Eric Adams announced an AI-powered chatbot, built in partnership with Microsoft, to help business owners understand government regulations.

The project soon veered off course, offering unlawful advice in response to sensitive questions about housing and consumer rights.

For example, when landlords asked about accepting tenants with Section 8 vouchers, the chatbot advised them to deny those tenants.

Under New York City law, discriminating against tenants based on their source of income is illegal, with very limited exceptions.

Upon examining the chatbot’s outputs, Rosalind Black, Citywide Housing Director at Legal Services NYC, found that it advised landlords they could lock out tenants. It also claimed, “There are no restrictions on the amount of rent that you can charge a residential tenant.”

The chatbot’s flawed advice extended beyond housing. “Yes, you can make your restaurant cash-free,” it advised, contradicting a 2020 city law that requires businesses to accept cash to prevent discrimination against customers without bank accounts.

Moreover, it wrongly suggested that employers could take a cut of their workers’ tips and gave incorrect information about the rules for notifying staff of scheduling changes.

Black warned, “If this chatbot is not being done in a way that is responsible and accurate, it should be taken down.”

Andrew Rigie, Executive Director of the NYC Hospitality Alliance, warned that anyone following the chatbot’s advice could incur hefty legal liabilities. “AI can be a powerful tool to support small business…but it can also be a massive liability if it’s providing the wrong legal information,” Rigie said.

In response to mounting criticism, Leslie Brown from the NYC Office of Technology and Innovation framed the chatbot as a work in progress. 

Brown asserted, “The city has been clear the chatbot is a pilot program and will improve, but has already provided thousands of people with timely, accurate answers.”

Was deploying a “work in progress” in this sensitive area a good plan in the first place?

AI legal liabilities hit companies

AI chatbots can do many things, but providing legal advice is not yet one of them.

In February, Air Canada found itself at the center of a legal dispute due to a misleading refund policy communicated by its AI chatbot. 

Jake Moffatt, seeking clarity on the airline’s bereavement fare policy during a personal crisis, was wrongly informed by the chatbot that he could secure a special discounted rate after booking. This contradicted the airline’s actual policy, which does not permit bereavement refunds after booking.

The dispute ended in legal proceedings, with a Canadian tribunal ordering Air Canada to honor the policy its chatbot had stated and refund Moffatt.

AI has also gotten lawyers themselves in trouble. Perhaps most notably, New York attorney Steven A. Schwartz used ChatGPT for legal research and inadvertently cited fabricated legal cases in a court brief.

With everything we know about AI hallucinations, relying on chatbots for legal advice is inadvisable, no matter how trivial the matter may seem.


Sam Jeans

Sam is a science and technology writer who has worked in various AI startups. When he’s not writing, he can be found reading medical journals or digging through boxes of vinyl records.
