Post by varan27813 on Jan 10, 2024 1:22:06 GMT -8
One of the things we advise schools to avoid is using AI content-creation tools to address sensitive topics. Because artificial intelligence tools lack cultural sensitivity and can, at times, reproduce stereotypes, schools should stay away from AI-automated content creation when delicate topics are involved. No one knows the values your school stands for better than your team, and relying on AI to generate content on these issues can result in messages that do not reflect those values.

Your school should also remember that AI doesn't necessarily consider diversity and equity when creating content, so it is essential to check that the output doesn't reproduce cultural, racial, or gender biases. Many studies have shown that, instead of addressing and correcting these biases, AI tends to amplify them. For instance, in a recent study, Bloomberg analyzed thousands of images generated by Stable Diffusion in response to prompts about job titles and crime. The authors found that most images generated for high-paying jobs featured subjects with lighter skin tones, while subjects with darker skin tones appeared mostly in images generated for lower-paying jobs.

Example: We asked the same platform to create a photo for an ad featuring business leaders. Like Bloomberg, we observed that the generated photo featured a white, middle-aged man, leaving out women and racialized minorities.