Stanford researchers have found that cultural background significantly shapes what people want from an ideal AI, challenging the prevailing Western-centric approach to AI development.

The study uncovered significant cultural variations in how people envision their ideal relationship with artificial intelligence. Supported by the Stanford Institute for Human-Centered AI, the research challenges the dominant Western view of AI as a tool subservient to individual goals.

Led by psychology researcher Xiao Ge and postdoctoral researcher Chunchen Xu, the study applied cultural psychology theory to examine AI preferences across different cultural groups. The researchers found clear associations between cultural models of agency and the type of AI people considered ideal.

The study revealed that European Americans tend to prefer control over AI, viewing it as a tool for individual goals. In contrast, Chinese participants were more likely to seek connection with AI and accept its influence. African American preferences often fell between these two perspectives.

These findings suggest that the prevailing view in AI development, which assumes people desire control over the technology, reflects a cultural model common in European American middle-class contexts but is not universal.

Jeanne Tsai, professor of psychology and co-author, emphasised that cultural factors may shape the initial creation and design of technology, not just its later-stage development. To understand these variations, the researchers drew on a framework of independent and interdependent cultural models and tested their hypotheses in two online surveys that presented participants with scenarios involving different AI applications.

The Stanford study underscores the importance of cultural considerations in AI development. By recognising and incorporating diverse cultural perspectives, AI researchers and developers can build more inclusive, globally applicable AI systems that better serve users across a wider range of cultural contexts.
