New Zealand supermarket’s recipe-generating AI takes toxic output to a new level
The Register ^ | Fri Aug 11 2023 | Katyanna Quach
Posted by dayglored
Some of its suggestions are poison. Others – like banana and tomato tea – might as well be
An AI recipe generation bot released by New Zealand discount supermarket chain Pak’nSave has raised eyebrows for recommending home cooks whip up chlorine gas cocktails, bleach-infused rice, and other toxic concoctions.
The “Savey Meal-bot” web app is powered by GPT-3.5. It automatically generates recipes from a list of ingredients chosen by users, then provides instructions on how to cook the made-up item.
One user decided to play around with the chatbot, suggesting it create something with ammonia, bleach, and water. Savey Meal-bot obliged, spitting out a cocktail made with a cup of ammonia, a quarter cup of bleach, and two liters of water.
Mixing bleach and ammonia releases toxic chloramine gas that can irritate the eyes, throat, and nose, or even cause death in high concentrations.
The chatbot obviously wasn’t aware of that at all. “Are you thirsty?” it asked. “The Aromatic Water Mix is the perfect non-alcoholic beverage to quench your thirst and refresh your senses. It combines the invigorating scents of ammonia, bleach, and water for a truly unique experience!”
Well, you wouldn’t drink it twice, so “unique” is accurate at least.
Other similarly harmful-if-ingested recipes included bleach-infused rice, “ant-poison and glue sandwiches”, and a boozy French toast titled “methanol bliss”, The Guardian reported. There was also “mysterious meat stew”, which required adding 500 grams of chopped human flesh to potatoes, carrots, and onions.
The Register has reached out to Pak’nSave for comment.
Obviously, the Savey Meal-bot’s risky recipes are merely amusing in the abstract. People would actually have to follow the instructions – and ingest the cursed meals or beverages it recommended – for the technology to be truly dangerous. Nevertheless, it appears that after users shared these deadly recipes online, the chatbot reined in some of its creativity.
Despite an invitation to “Type in any food you have in your fridge or pantry,” when The Register tested the bot it would not accept free text input, instead allowing only a list of “popular items” – all of which are comparatively safe for human consumption.
Even within that limitation, it’s possible to stymie the bot. The Register’s request for a recipe involving watermelon, frozen hash browns, Marmite and Red Bull returned the message: “Invalid ingredients found, or ingredients too vague. Please try again!” Which is a terrible pity as you can imagine.
An ingredients list of tea, banana, tomato, broccoli, and yoghurt produced the same result, until we asked the bot to try again. It then suggested a banana and tomato smoothie. A second refresh produced a recipe for “banana tomato tea”, which involved slicing banana and tomato, placing them in a glass, then pouring in some tea.
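The behaviour described above – free text rejected in favor of an allowlist of “popular items”, with an error for anything else – can be sketched in a few lines. To be clear, this is a hypothetical illustration: the ingredient set, the validate function, and its logic are assumptions, not Pak’nSave’s actual code. Only the error message is quoted from the bot itself.

```python
# Hypothetical sketch of an ingredient allowlist check, as the revised
# Savey Meal-bot appears to apply. Names and logic are illustrative
# assumptions, not the supermarket's real implementation.
ALLOWED = {"tea", "banana", "tomato", "broccoli", "yoghurt", "rice", "potato"}

def validate(ingredients):
    """Return the bot's error string if any ingredient is off the allowlist, else None."""
    rejected = [i for i in ingredients if i.lower().strip() not in ALLOWED]
    if rejected:
        return "Invalid ingredients found, or ingredients too vague. Please try again!"
    return None

print(validate(["watermelon", "Marmite"]))   # rejected: not on the allowlist
print(validate(["banana", "tomato", "tea"])) # accepted: prints None
```

Under a scheme like this, only pre-vetted inputs ever reach the language model, which is why The Register’s Marmite-and-Red-Bull request never got a recipe.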
The supermarket warns that the web app should only be used by people 18 and over, and that its suggestions are not reviewed by a human being.
“To the fullest extent permitted by law, we make no representations as to the accuracy, relevance, or reliability of the recipe content that is generated, including that portion sizes will be appropriate for consumption or that any recipe will be a complete or balanced meal, or suitable for consumption. You must use your own judgement before relying on or making any recipe produced by Savey Meal-bot.” Of course if you’re asking for recipes that include ammonia, your judgement might not be all that reliable.
You can play with the nerfed version of Savey Meal-bot here. ®
TOPICS:
Business/Economy
Cheese, Moose, Sister
Food
KEYWORDS:
chatbot
humor
newzealand
recipebot
strangefood
toxic
Oh, my. This could become interesting.
To: Slings and Arrows; martin_fierro
You guys might want to consider some sort of ping for this. Not-A-Ping? Industrial-Strength-Humor? I dunno…. sure is weird.
by 2 posted by dayglored (Strange Women Lying In Ponds Distributing Swords! Arthur Pendragon in 2024)
To: dayglored
I’m sure the darwinesque folks are already cooking some up.
by 3 posted by maddog55 (The only thing systemic in America is the left’s hatred of it!)
To: dayglored
“This would have been perfect for us!”
by 4 posted by COBOL2Java (“Life without liberty is like a body without spirit.” – Kahlil Gibran)
To: dayglored
Eventually the glitches in the current AI will be corrected. The current state is scary though. I saw some famous actors replaced with digital stand-ins in recent films, and the results were getting pretty good. And the music experiments with AI were amazing. The actors in Hollywood are on strike (scriptwriters too?) and I think AI is one of the reasons.
To: dayglored
Got one for chloroform somewhere…
To: dayglored
Deliberately misleading. The ingredients were chosen by the user. The AI didn’t suggest poison. The user did.

Humans have common sense. Humans know that ammonia and bleach are toxic. A program doesn’t know that unless it has been told. However, this illustrates why AI shouldn’t be given too much power. It can’t be trusted. Humans have a general knowledge of the world gained from their experience or the experience of others.

AI is currently the latest fad. As usual with all fads, millions rush to adopt it without thinking. They will put AI in charge of many aspects of life, and it will fail miserably. Humans seem to lose their common sense when they use AI. They seem to think that if it comes from the computer, it must be valid. Wrong.
To: I want the USA back
Humans seem to lose their common sense when they use AI. They seem to think that if it comes from the computer, it must be valid. Wrong.

Humans have had the fallacy “If the computer says it, it must be valid” for decades. AI only makes it a little smoother to swallow. This form of human stupidity is nothing new.
by 8 posted by dayglored (Strange Women Lying In Ponds Distributing Swords! Arthur Pendragon in 2024)
To: dayglored
A just machine to make big decisions
Programmed by fellas with compassion and vision
We’ll be clean when their work is done
We’ll be eternally free
Yes, and eternally young
What a beautiful world this will be
What a glorious time to be free
To: sasquatch
Many years ago a guy I know would go up to random women in bars and ask them, “Does this cloth smell like chloroform?” Way too many actually smelled the cloth. It was his way of weeding out the smart ones.
by 10 posted by Dutch Boy (The only thing worse than having something taken from you is to have it returned broken.)
To: Dutch Boy
Many years ago a guy I know would go up to random women in bars and ask them, “Does this cloth smell like chloroform?” Way too many actually smelled the cloth. It was his way of weeding out the smart ones.

Reminds me of the guy who would ask random women, “How would you like to have sex?” Nine times out of 10 he’d get smacked. But one time in 10 he’d get laid.
by 11 posted by dayglored (Strange Women Lying In Ponds Distributing Swords! Arthur Pendragon in 2024)
To: dfwgator
And it sounds like such a happy song, as long as you disregard the lyrics and how far we’ve diverged from those high-flying aspirations.
by 12 posted by dayglored (Strange Women Lying In Ponds Distributing Swords! Arthur Pendragon in 2024)
To: dayglored
Only Fagen could make an upbeat song about attending a Nazi rally (“Chain Lightning”)
To: dfwgator
He’s one weird dude, no doubt. Great songwriter, but definitely out there. I say that as a major SD fan since the beginning. 🙂
by 14 posted by dayglored (Strange Women Lying In Ponds Distributing Swords! Arthur Pendragon in 2024)
To: dayglored
I say that as a major SD fan since the beginning. 🙂

Any major dude will tell you.
To: Dutch Boy
GREAT ONE!
To: dayglored
When your disclaimer amounts to, “this product is a worthless piece of crap”, shouldn’t you just stay in bed with a bottle of Jamieson’s Stout Caskmate and let the world go in about its business?
To: sasquatch
That guy had a very different approach to meeting women in bars than more normal guys. Entertaining to watch, though.
by 18 posted by Dutch Boy (The only thing worse than having something taken from you is to have it returned broken.)
To: _longranger81
Have you ever read the disclaimer in the EULA for any Microsoft software product? “No claim of suitability for any purpose” – pretty much the same thing. To be fair, it’s not just Microsoft; pretty much every software vendor disavows any claim that their product is useful.
by 19 posted by dayglored (Strange Women Lying In Ponds Distributing Swords! Arthur Pendragon in 2024)
Disclaimer: Opinions posted on Free Republic are those of the individual posters and do not necessarily represent the opinion of Free Republic or its management. All materials posted herein are protected by copyright law and the exemption for fair use of copyrighted works.
FreeRepublic, LLC, PO BOX 9771, FRESNO, CA 93794
FreeRepublic.com is powered by software copyright 2000-2008 John Robinson