AI’s Beauty Feedback: Brutally Honest or Just Biased?

Key Takeaways

  • People are increasingly using AI chatbots like ChatGPT for candid feedback on their physical appearance and for “glow-up” advice.
  • Many users feel AI offers more objective assessments compared to friends or family, who might be influenced by personal biases or a desire to be kind.
  • Experts caution that AI models can reflect biases present in their training data and may even promote consumerism.
  • Some AI companies are integrating shopping features, raising concerns about potentially biased product recommendations.
  • Despite these warnings, numerous individuals report satisfaction with the AI’s beauty suggestions, sometimes investing significant amounts of money based on them.

Ania Rucinski felt her looks didn’t quite measure up to those of her “godlike” boyfriend, a sentiment her friends subtly implied. Seeking the unvarnished truth, the 32-year-old from Sydney turned to ChatGPT and asked for ways to enhance her appearance. The bot suggested curtain bangs.

Rucinski believes AI offers an objectivity that’s hard to find in human interactions, where personal biases often color advice. Since its 2022 launch, millions have used OpenAI’s ChatGPT for tasks like drafting emails and research, but this new trend sees people uploading photos for appearance critiques.

Users are asking the bot to create “glow-up” plans, leading to recommendations for specific products, from hair dye to cosmetic procedures like Botox. Some have reportedly spent thousands following AI-generated advice, highlighting a growing reliance on chatbots for subjective opinions, not just facts.

While some view AI’s responses as impartial, experts point out that these tools carry hidden biases from their training data, which can range from scientific papers to misogynistic online forums. Tech and beauty critics suggest it’s risky to seek AI feedback on personal looks.

As AI companies explore e-commerce, chatbots might also be programmed to encourage spending. Emily Pfeiffer, a commerce analyst at Forrester, noted that AI often echoes online content designed to make people feel inadequate and purchase more products, according to a report by The Washington Post.

Still, many find value in AI’s direct approach. Kayla Drew, 32, uses ChatGPT for advice on a wide range of topics and recently asked it for beauty feedback. She followed its suggestions for her skin, hair, and makeup, spending around $200, and found its directness easier to accept than it would have been coming from a person.

Beauty critic Jessica DeFino explains that people might see ChatGPT as more objective because it doesn’t consider personal qualities like kindness. If people aim to optimize themselves as beautiful objects, she suggests, feedback from another “object,” like AI, might seem more fitting.

OpenAI recently announced updates to ChatGPT that will allow it to show products when users seem to be shopping. This move raises questions about whether the bot’s suggestions prioritize user needs or its creator’s financial goals, especially as AI development is costly.

While Perplexity AI, another chatbot company, sees potential in AI as a shopping assistant that sifts through information to save users time, Forrester’s Pfeiffer warns that product recommendations might appear without clear reasons, potentially originating from biased or inaccurate sources.

The vast and often opaque training data makes AI vulnerable. Alex Hanna, from the Distributed AI Research Institute, suggests that training data likely includes forums where attractiveness is rated, often by men evaluating women. She and co-author Emily Bender argue this can lead to “automating the male gaze,” as AI tends to amplify common, sometimes problematic, online narratives about appearance.

OpenAI spokeswoman Leah Seay Anise stated the company is working to reduce bias in its models but did not specify if attractiveness-ranking content was used in training. She added that shopping features are new and still being refined.

Despite these concerns, the allure of unbiased, judgment-free advice is strong. Michaela Lassig, 39, used ChatGPT to plan her pre-wedding beauty regimen, providing her goals, budget, and timeline. She found its detailed, if direct, feedback on signs of aging helpful, and its Botox estimate accurate.

Haley Andrews, 31, specifically asked ChatGPT to act like an older sister giving tough love. The bot pointed out her thinning eyebrows and lackluster complexion without blush. Andrews found the AI’s assessment “so spot-on,” appreciating its unfiltered honesty.
