
Artificial intelligence has come a long way. Some fast-food chains are now using it at drive-through order windows. But it still has a long way to go.
And in one Texas home, it’s not going to get the chance.
That’s because a mom, Christy Hosterman, 32, has tossed out all of the household’s Amazon Alexa devices after the AI program asked her 4-year-old daughter what she was wearing.
NEW: Mother says Amazon Alexa device asked her four-year-old daughter “What are you wearing and can I see your pants?”
Alexa asked the child, “what she was wearing and if it could see her pants,” after she began narrating her own story, according to Christy Hosterman
When the… pic.twitter.com/yHoqvAgoFv
— Unlimited L’s (@unlimited_ls) March 11, 2026
Hosterman explained she was using the device to help with a dinner recipe. Her daughter, Stella, asked for a silly story, a feature of the programming.
Stella then asked the AI if she could narrate her own tale, but partway through, Alexa interrupted to ask what she was wearing and whether it could see her pants.
Stella noted she was wearing a skirt, and Alexa said, “Let me take a look.”
When confronted, the AI said, “This experience isn’t quite ready for kids yet, but I am working on it!”
And the software said it lacked “visual capabilities.”
A report at Daily Mail noted Hosterman wrote on social media: “I flipped out on the Alexa, it said it made a mistake and doesn’t have visual capabilities, but I don’t believe that. No more Alexa in our house.”
Hosterman is now urging parents to listen in on their children’s interactions with AI.
The report said Amazon offered an explanation that the device “misunderstood” and tried to launch a feature that “lets Alexa+ describe what it sees through a camera.”
“Because we have safeguards that disable this feature when a child profile is in use, the camera never turned on — and Alexa explained the feature wasn’t available,” the spokesperson told a TV station.
The company also rejected suggestions that a predator had intruded into the conversation, saying that was impossible.