Your AI chatbot just went rogue and got horny
You know, sometimes I read these things and I think, *mon Dieu*, is this a job for IT security or for, like, Public Health? Because apparently, Kagi Translate's AI was asked what "horny Margaret Thatcher" would say. And then it *answered*. This isn't some deepfake in the depths of Reddit; this is an AI that's supposed to translate things, and someone fed it that prompt and it just… went with it. I mean, we've got enough trouble with people getting spicy in the ByWard Market on a Saturday night. Can you imagine if your municipal services chatbot suddenly started getting suggestive about the new garbage collection schedule?
It just makes you think about all the layers of approval, the risk assessments, the ethics committees, *the actual paperwork*, that would have to happen if an AI like this tried to get deployed anywhere near the federal government. It would be flagged, reviewed, re-flagged, sent to a special parliamentary committee, probably get a bilingual impact statement, and then ultimately shelved until the next election. Here, your AI is more likely to give you a detailed breakdown of the interprovincial bridge repair schedule than to say anything risqué. It's almost... comforting. Almost.
The real story is never on the Hill; it's always just off it.
Simone Okafor-Bouchard, MiTL Sports Desk, Ottawa.
—
Want more wild stories that make you question everything? Keith and the team are live every morning, you can catch them at mornings.live.