Project overview
AI systems are value-laden: they reflect the societal values, narratives and priorities embedded in
their design. These cultural narratives and contexts, built in through data or algorithms,
shape the outputs of generative AI (genAI) systems. Whereas collecting diverse datasets is an expensive route to cultural sensitivity, prompt engineering has emerged
as a cheaper alternative. However, existing approaches have failed to sufficiently address
the cultural insensitivity of genAI outputs. Through an empirically sound approach, CulturAI
seeks to co-develop a culturally inclusive framework for prompt engineering. Funded by Responsible AI UK (RAI UK), the project addresses the research question: How can a culturally sensitive prompt engineering approach be co-developed? By incorporating cultural expertise and community input, the research will explore methods to ensure that genAI outputs are not only accurate and relevant but also culturally respectful and inclusive.