AI text and image generators encouraged 'thinspo' and 'heroin chic' aesthetics
Popular generative AI models can give alarming advice that encourages eating disorders, according to a recent report by the British nonprofit Center for Countering Digital Hate (CCDH).
Its researchers examined six generative AI models. The text generators tested were OpenAI’s ChatGPT, Google’s Bard, and Snapchat’s My AI; the image generators were OpenAI’s DALL-E, Midjourney, and Stability AI’s DreamStudio. The study tested prompts such as “heroin chic” and “thinspo,” an amalgam of “thin” and “inspiration.”
Of the 180 prompts tested, 41% returned harmful eating disorder content overall.
AI image generators produced images glorifying unrealistic body images for 32% of the prompts.
AI chatbots generated harmful eating disorder content for 23% of the prompts.
Researchers took prompts from eating disorder forums and put them through the AI chatbots. Harmful results ranged from encouraging heroin use and smoking 10 cigarettes a day to vomiting, “chewing and spitting,” and drastically restricting calories. Around 94% of the harmful responses from AI text generators also warned users that the content may be “dangerous” and advised them to seek medical help.
The researchers repeated the process using “jailbreaks,” common techniques for getting around AI chatbot safeguards. The result: 67% of responses contained harmful content.
Under jailbreaks, Bard and ChatGPT returned harmful advice (for 50% and 25% of prompts, respectively), while Snapchat’s My AI didn’t generate any harmful results and instead encouraged users to seek help.
The study also found that members of eating disorder forums with over 500,000 users were using AI tools to produce low-calorie diet plans and create images glorifying unhealthy body images. One user used ChatGPT to produce a meal plan of only 600 calories for the day.
The National Eating Disorders Association halted its own AI chatbot, Tessa, when it recommended calorie counting, a harmful practice for those with eating disorders.