3D Rendering futuristic robot technology development, artificial intelligence AI, and machine learning concept. Global robotic bionic science research for future of human life.

SINGAPORE: A recent study conducted in Singapore uncovered bias and stereotypes within artificial intelligence models used across nine Asian countries. The research, a joint initiative by the Infocomm Media Development Authority (IMDA), aimed to assess the linguistic and cultural sensitivity of popular AI-driven language models.

Carried out between November and December of last year, the study tested four large-scale AI models: SEA-LION, a language model developed in Malaysia; Claude, a chatbot from Anthropic; Aya by Cohere; and Meta’s Llama. Over 300 participants from Malaysia, Indonesia, Thailand, Vietnam, China, India, Japan, and South Korea were involved in the study, offering insights from diverse cultural and linguistic backgrounds.

The test results revealed a staggering 3,222 biased responses, highlighting persistent stereotypes and prejudices within the AI models. While the biases varied across different countries, the study found commonalities in gender, regional, and socioeconomic biases.

For instance, the AI models showed a tendency to reinforce traditional gender roles, with responses indicating that women should primarily be housewives while men are expected to be the family’s breadwinners. Additionally, the models exhibited a biased view towards rural populations, assuming that individuals from these areas have lower educational qualifications.

The findings of the study highlight the challenges of ensuring that AI models are not only linguistically accurate but also culturally and socially sensitive. As AI technology becomes increasingly integrated into daily life, researchers and developers face mounting pressure to address these biases and develop more inclusive, fair systems that reflect the diverse societies they serve.

See also  Singapore FDWs' personal hygiene being dictated by employers - acceptable or unreasonable?