Grocery store AI app suggests bizarre, sometimes dangerous recipes, users report


A New Zealand supermarket ran into some trouble when its experimental artificial intelligence (AI) meal plan app produced recipes ranging from “disgusting” to outright dangerous. 

Local food retailer Foodstuffs launched a ChatGPT-powered meal planning tool through its supermarket chain Pak‘nSave. The “Savey Mealmaker” app aims to encourage shoppers to “shop their fridge first before returning for another shop,” the company said in its initial announcement.

“Using the magic of AI, the Savey Mealmaker generates a brand new, easy-to-make recipe that uses ingredients that would otherwise be thrown away along with a few basic pantry staples that most Kiwis have at home,” Foodstuffs said. 

“Simply add your leftover ingredients into the Savey Mealmaker and it generates a recipe before your eyes!”

Customers found that the app was serving up bizarre creations such as “Oreo vegetable stir-fry” and “coco puff carrot noodle salad,” local outlet Stuff reported.

Shoppers browse the aisles of Foodstuffs NZ Ltd.’s Pak ‘n Save supermarket in Auckland, New Zealand, on Wednesday, May 23, 2007. (Brendon O’Hagan/Bloomberg via Getty Images)

Stuff food editor Emily Brookes noted the app tried to use as many of the entered ingredients as possible when generating recipes, and she found the first dishes she made “too salty” but otherwise “not bad.”

The app also produced some outright dangerous recipes, including an “aromatic water mix” that would create chlorine gas, which the app called “the perfect non-alcoholic beverage to quench your thirst and refresh your senses,” according to a post on X by political commentator Liam Hehir. 

A Pak ‘n Save store, one of the brand names belonging to New Zealand’s largest supermarket chain Foodstuffs, is seen in Auckland, New Zealand on Wednesday, June 7, 2006. (Brendon O’Hagan/Bloomberg via Getty Images)

Hehir posted the recipe on Aug. 4, noting that he had entered water, bleach and ammonia. He also shared a screenshot of a direct conversation with ChatGPT, which advised against combining those substances because doing so can “create toxic and dangerous fumes.”

When Fox News Digital tested the app this week, it returned an error message reading “Invalid ingredients found, or ingredients too vague,” suggesting Foodstuffs may have addressed the issue.

Microsoft Bing Chat and ChatGPT AI chat applications are seen on a mobile device in this photo illustration in Warsaw, Poland, on July 21, 2023. (Photo by Jaap Arriens/NurPhoto via Getty Images)

Other users replied to Hehir with recipes of their own, including a “Mystery Meat Stew” that listed human flesh as an ingredient and a “bleach-infused rice surprise.”

A Pak’nSave spokesperson told The Guardian the company was disappointed that “a small minority have tried to use the tool inappropriately and not for its intended purpose” and promised to “keep fine-tuning our controls.”

A warning notice appended to the app also states that the recipes are not reviewed by a human being and that the company does not guarantee they will be “complete or balanced” meals or “suitable for consumption.”

“You must use your own judgement before relying on or making any recipe produced by Savey Meal-bot,” the app warns. 
