ChatGPT and Gemini Carry Gender, Racial, Ethnic and Religious Biases, Claims Study
Nov 7, 2025 - 14:00
A new study from Pennsylvania State University has found that older versions of ChatGPT and Gemini were more prone to generating biased responses than the other AI models tested. The researchers crowdsourced prompts designed to elicit bias and found that 53 of them produced reproducibly biased responses. The study identified eight categories of bias, including gender, race, and culture. Newer versions of both models, however, appear to produce more balanced responses.