AI Misunderstands Some People's Words More Than Others
https://spintaxi.com/ai-misunderstands-some-peoples-words-more-than-others/

A groundbreaking MIT study reveals that popular voice assistants misunderstand certain demographics up to 40% more often than others, with the worst performance affecting regional accents, speech impediments, and women over 60.

Researchers found that Siri interprets Southern U.S. accents as nonsense 28% of the time, Alexa transcribes stutters as random commands ("turn on...turn on...turn on lights" becomes "tuna tuna tuna lights"), and Google Assistant consistently hears elderly women's requests as Mandarin Chinese.

The bias appears rooted in training datasets overwhelmingly composed of young, coastal American voices: one system's training data was found to be 82% speech from male tech workers.

The most egregious errors include a Scottish man who asked for "the weather" and received directions to a leather shop, and a voice assistant that repeatedly ordered 112 pizzas when a stroke survivor asked to call his daughter.

While tech companies promise improvements, disability advocates note that the consequences are already severe: one ALS patient's smart home now blasts heavy metal whenever he requests classical music.

The study's lead researcher summarized: "We've built systems that understand tech bros perfectly but can't comprehend half the human experience."

Meanwhile, voice AI continues to excel at one task: flawlessly recognizing and serving ads, regardless of accent.