ORIGINAL RESEARCH ARTICLE

Critical and creative pedagogies for artificial intelligence and data literacy: an epistemic data justice approach for academic practice

Javiera Atenasa*, Leo Havemannb,c and Chrissi Nerantzid

aSchool of Social Sciences and Humanities, University of Suffolk, Ipswich, UK; bInstitute of Educational Technology, The Open University, Milton Keynes, UK; cUniversity College London, London, UK; dCreative and Open Education, Faculty of Social Sciences, School of Education, University of Leeds, Leeds, UK

(Received: 11 April 2024; Revised: 23 September 2024; Accepted: 26 October 2024; Published: 16 January 2025)

This paper offers guidance on employing open and creative methods for co-designing critical data and artificial intelligence (AI) literacy spaces and learning activities, rooted in the principles of data justice. Through innovative approaches, we aim to enhance participation in learning, research and policymaking, fostering a comprehensive understanding of the impact of data and AI whilst promoting inclusivity in critical data and AI literacy. By reflecting on the Higher Education (HE) context, we advocate for active participation and co-creation within data ecosystems, amplifying the voices of educators and learners. Our methodology employs a triangulation model: initially, we conduct interpretative analyses of literature to gauge best practices for curriculum development in HE; then, we examine frameworks in data justice and ethics to identify principles and skills applicable to undergraduate, postgraduate and academic development programmes; finally, we explore proposals for critical, creative, ethical, open and innovative ideas for educators to integrate data and AI into their practice.

Keywords: critical data literacy; data justice; education; creative pedagogies

*Corresponding author. Email: j.atenas@uos.ac.uk

Research in Learning Technology 2024. © 2024 J. Atenas et al. Research in Learning Technology is the journal of the Association for Learning Technology (ALT), a UK-based professional and scholarly society and membership organisation. ALT is registered charity number 1063519. http://www.alt.ac.uk/. This is an Open Access article distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), allowing third parties to copy and redistribute the material in any medium or format and to remix, transform, and build upon the material for any purpose, even commercially, provided the original work is properly cited and states its license.

Citation: Research in Learning Technology 2024, 32: 3296 - http://dx.doi.org/10.25304/rlt.v32.3296

Introduction: a data justice lens on data and artificial intelligence literacy

In the 21st century, global higher education (HE) has increasingly come to be understood through an imaginary of market, consumers and metrics, rather than social mission, democracy and citizenship, impacting what and how students learn and raising concerns and debates about the effectiveness of traditional pedagogies (Stein & De Oliveira Andreotti, 2017), trends that have only accelerated in the wake of the COVID-19 pandemic and the consequent increase in digitalisation (Chaudhry & Kazim, 2022; Treve, 2021).

Couldry and Hepp (2018) argue that our reality is increasingly being built through data-based processes, automated decision-making and algorithms, fostering datafication as a business model that is transforming societies (and therefore, education). Symptomatically, the impact of datafication and artificial intelligence (AI) in HE is profound and tends to reinforce systemic inequalities, rather than representing (as usually claimed) a value-neutral ‘digital transformation’ of institutions to achieve promised innovations and efficiencies (Carmi et al., 2020; Swist & Gulson, 2023; Williamson et al., 2020).

Datafication, particularly through AI and predictive learning analytics, is reshaping education, becoming entangled with the (not just social, but socio-technical) construction of educational reality, thereby altering the organisation of learning and teaching. This transformation requires a critical examination of methodologies and approaches used in analysing educational data, as highlighted by Dalton et al. (2016), Thoutenhoofd (2017), Jarke and Breiter (2019), Couldry (2020), Prinsloo et al. (2022) and Komljenovic et al. (2023), who call for scrutiny of practices that shape perceptions of education across social, political, economic and cultural domains.

We argue that co-created and participatory approaches are needed, grounded in the principles of data justice, to support capacity building in data and AI literacies across learning, research and policymaking realms (Atenas, Havemann & Timmermann, 2020; Gonsales et al., 2021; Taylor, 2017). Data justice is defined by Dencik et al. (2019) as an approach ‘used to denote an analysis of data that pays particular attention to structural inequality, highlighting the unevenness of implications and experiences of data across different groups and communities in society’ (p. 875). Data justice has three pillars that are key to enhancing practices in HE: visibility, engagement with technology and non-discrimination. Visibility includes access to representation and informational privacy; engagement with technology embeds sharing in data’s benefits and autonomy in technology choices; and non-discrimination addresses the ability to challenge biases and counteract discriminatory practices (Taylor, 2017).

Our focus in this paper is on how educators in HE can advance a critical understanding of the impact of data and AI and foster inclusive critical data and AI literacy, defined by Brand and Sander (2020) as ‘the ability to critically engage with datafication by reflecting on the societal implications of data processing and implementing this understanding in practice’ (p. 2). Universities, like other organisations, are increasingly embracing data-driven approaches; however, the sudden emergence of generative AI, encompassing big data applications that create new content, has driven a sense of widespread sectoral ‘freaking out’. Holmes et al. (2019) note that whilst educational AI may have the potential to be beneficial where teachers are scarce, the notion that it is an effective replacement for teachers undermines the value of teachers’ skills and neglects learners’ social learning needs.

Advancing critical data and AI literacies through data justice includes acknowledging epistemic, representational and material harm (Hamilton & Sharma, 1996; Stein & De Oliveira Andreotti, 2017; Young, 2014) and recognising issues such as racial and social codification, exclusion by design and the reproduction of systemic oppressions (Park & Humphry, 2019; Williams & Clarke, 2016), as the opacity of algorithms perpetuates dehumanisation and undermines accountability (Brew et al., 2023; Henz, 2021; Lepri et al., 2018).

As outlined by Redecker and Punie (2020), individuals need a thorough understanding of data and AI, encompassing the ability to engage positively, critically and safely with these technologies whilst considering ethical dimensions such as environmental sustainability, data protection and discrimination. D’Ignazio and Bhargava (2015) highlight the unequal distribution of data literacy across social groups, emphasising that those lacking fluency in data-driven discourse are systematically excluded from participatory processes that rely on such discourse. In response, the concept of critical data and AI literacy, as proposed by Pangrazio and Selwyn (2019), Brand and Sander (2020) and others (Bozkurt et al., 2023), extends beyond basic proficiency. It encompasses the ability to critically evaluate data, algorithms and their societal impacts, thereby empowering individuals to engage meaningfully with questions of big data, machine learning and AI whilst being cognisant of risks to (and indeed, existing impacts upon) both human rights and the environment. Such an approach requires a socio-technical understanding of data, offering a broader and more critical perspective on its role in society, including issues of agency, data ethics, data justice and participation in data-driven decision-making, and advocating for the fair and just use of data in both education and society (Baack, 2015; Gilliard, 2017; Perrotta, 2022; Williamson, 2017).

Deepening educators’ understanding of data, datafication and AI addresses a critical gap, one that must be tackled not only through research but also by cultivating creative dialogic spaces. These spaces should foster capacity building and comprehension of how AI, platforms and data are transforming the educational landscape. In order to construct the kinds of dialogic spaces envisaged, we wish to emphasise the roles of curiosity, imagination and creativity as essential and transformative elements of human existence, crucial for addressing the pressing challenges facing our world. A recent report by the British Science Association (2022) amplifies the voices of young people, who call for creativity and interdisciplinary collaboration to confront these monumental challenges. HE curricula can play a pivotal role in this endeavour by promoting free-thinking, critical inquiry, creative experimentation and cross-boundary or radical collaboration (Bene & McNeilly, 2020; Manca et al., 2017; Means & Slater, 2022). These elements are vital for fulfilling the university’s social mission and embedding it within both local and global communities.

We aim to offer guidance and recommendations through open and creative approaches for the co-design and co-creation of critical data and AI literacy spaces and activities, to cultivate a deep, critical understanding of structural data justice (Mandinach & Gummer, 2013; Dencik & Sanchez-Monedero, 2022; D’Ignazio, 2022; Heeks & Swain, 2018). By encouraging reflective practices within educational contexts, we advocate for participatory engagement within data ecosystems, ensuring that the voices of both educators and learners are included, thereby safeguarding vulnerable groups from the indiscriminate and discriminatory uses of data and providing opportunities to challenge existing data-driven power dynamics (D’Ignazio & Klein, 2020).

Methodology

As discussed in the introduction, there is a social gap in critical data and AI literacy, which educators in HE (in partnership with students and others) are well positioned to address. We also recognise that such topics may feel outside the comfort zone of many educators, and that relevant pedagogic strategies might be welcomed. We therefore reviewed the literature with a view to identifying critical perspectives, good practices and innovative approaches for capacity-building activities in critical data and AI literacy within academic practice. By examining a diverse array of sources, including empirical studies, theoretical discussions and case studies, we were able to synthesise insights that highlight effective strategies for considering the ethical, social and technical dimensions of data and AI. We employed an interpretative analysis method (Dixon-Woods et al., 2006; Fereday & Muir-Cochrane, 2006; McDougall, 2015), which allowed us to go beyond a systematic descriptive analysis to uncover underlying themes, patterns and conceptual frameworks that are essential for fostering critical skills in data and AI literacy. In this way, we have sought to avoid reinforcing existing knowledge inequalities, as biases can be inherent when relying only upon searching indexes of ‘high quality’ sources; instead, we have been informed by feminist and decolonial approaches that seek to amplify the voices of women and minoritised communities (De Almeida & De Goulart, 2017; Kordzadeh & Ghasemaghaei, 2022; Leurs, 2017; Webb, 1993).

Our analysis is structured around our key interpretative lenses of critical pedagogy, socio-technical systems theory and data justice, which enabled us to critically engage with the literature and extract relevant practices that align with these frameworks. This method facilitated a comprehensive exploration of how educators can integrate critical data and AI literacy into their teaching practices, addressing not only the technical skills required but also the broader ethical and societal implications of data and AI in education.

Following our in-depth interpretative literature review, we undertook a second phase of analysis focused on evaluating innovative proposals for the use of data and AI in educational settings. To achieve this, we employed a crowdsourcing methodology, which serves as a dynamic strategy for staying at the forefront of developments through the collection of diverse and emerging ideas and fostering collective innovation by harnessing the collective intelligence of a broad community (Agarwal et al., 2021; Llorente & Morant, 2015; Solemon et al., 2013).

Specifically, we crowdsourced creative ideas from an open project coordinated by the international Creative HE community (Nerantzi et al., 2023; Abegglen et al., 2024). Between January and March 2023, the crowdsourcing effort yielded 101 creative ideas (comprising 100 single-author contributions and 3 multiple-author contributions) from 83 contributors across a diverse array of countries, including Australia, Canada, China, Egypt, Germany, Greece, India, Israel, Italy, Ireland, Jordan, Liberia, Mexico, South Africa, Spain, Thailand, Turkey, the United Kingdom and the United States. These ideas were then curated for publication as an openly licensed book, accessible to educators and students worldwide (Nerantzi et al., 2023; Abegglen et al., 2024). This globally sourced collection not only offers insights into innovative uses of AI for learning and teaching but also contributes to the discourse on open data and the development of critical AI literacy within the educational domain.

The analysis of the crowdsourced data was designed to identify good practices, in terms of creative and critical ideas, that educators can seamlessly integrate into their teaching environments. These practices were synthesised and clustered into groups mapped against the principles of data justice, with guidance for educators to apply them in practice through an actionable set of recommendations intended to enhance pedagogical approaches, encourage critical engagement with data and AI, and ultimately support the development of a more equitable and informed educational landscape.

Creative pedagogies for data and AI literacies

Insights from literature

The role of data in categorising and shaping governmental, corporate and social understanding and treatment of individuals and social groups underscores the need for data justice to uphold and defend rights and freedoms, particularly of the marginalised, in this digitised, datafied era (Selwyn, 2015, 2018; Schäfer & Van Es, 2017). Whilst conversations about governance and regulation of data and data-driven services continue at the government and supranational levels, we would suggest that citizens need to be informed and literate in these topics rather than assume that governance processes will ‘handle it’ and ensure adequate protections; after all, even in GDPR-protected Europe, one simply needs to tick a box to sign one’s data rights away.

As well as aiding in personal decision-making, the development of relevant literacies, if not in itself a yellow brick road to data justice, creates the possibility of meaningful citizen participation in policy debates. To achieve this, educational organisations, particularly in HE, will need to invest in enhancing skills and capabilities in data and AI literacies, through radical collaboration across various stakeholders, including educators, students, researchers, data experts, civil society and activists, with a focus on developing and disseminating practices that prioritise diversity, equity and inclusivity in curriculum development (Atenas, Havemann & Timmermann, 2023; Markauskaite et al., 2022; McGovern, 2018).

Creative pedagogies can help us embrace imaginative and resilient curricula, providing a much-needed social playground, time and space; bringing diverse ideas and perspectives together; and nurturing risk-taking, allowing participants to feel vulnerable and unsettled, make mistakes and learn from these with others in a learning community within accountable spaces (Ahenkorah, 2020; hooks, 1994; Jackson et al., 2006; Naidu, 2021; Nussbaum, 2013; Suoranta et al., 2021; Zembylas, 2023).

These challenges present opportunities for renewal and change (Freire, 2011; Jackson, 2018), and designing imaginative and dynamic curricula that foster critical data and AI literacy is paramount. People-centred design principles place diverse voices and perspectives at the heart of the co-creative process and enable rapid prototyping and implementation, thereby allowing creative ideas to be explored, considered and embedded into the design (Lockwood, 2010).

The development of data and AI literacies for educators in HE faces significant challenges, including limited access to training and professional development, a lack of spaces for dialogue, creativity and reflection, and unclear definitions of the essential literacies required for both educators and students. Thus, it becomes challenging to integrate these literacies into academic practices to empower HE communities to engage effectively in data-led debates. Additionally, there is a shortage of opportunities and resources to help educators and students critically discuss the ethical implications of data and navigate complex issues related to its impact on education, as highlighted by Floridi and Taddeo (2016) and Williamson (2017).

Creative pedagogies to foster data and AI literacies should create dynamic, reflective spaces that empower educators and students to engage with data in meaningful ways, facilitating a deeper understanding of data justice and its implications for education and society. This includes critically examining power dynamics within HE, such as the reliance on data-driven decision-making and the emphasis on performance metrics, which can reinforce existing power structures and marginalise vulnerable groups (Atenas, Havemann & Timmermann, 2023; Bhargava et al., 2016; D’Ignazio & Bhargava, 2020; Dubey et al., 2019). By promoting creative approaches to teaching and learning, we can better equip educators and students to navigate the complexities of data and AI, fostering a more just and equitable educational environment.

Creative pedagogies can provide a dynamic platform for educators to engage with AI technologies. By incorporating these pedagogies, as suggested by Goel and Joyner (2017), AI can be demystified, making it more accessible and empowering educators and students to understand its implications and applications. Such pedagogies encourage exploration, questioning and creativity, which are essential for fostering a deeper understanding of AI.

As highlighted by Beghetto and Kaufman (2014), Danylchenko-Cherniak (2023), Leonard (2021) and Long and Magerko (2020), creative pedagogies are crucial for promoting critical thinking, adaptability and innovation amongst students. These approaches emphasise active learning, collaboration and problem-solving, enabling students to engage with content through hands-on activities, real-world projects and collaborative efforts. In a society increasingly dominated by complex and opaque data ecosystems, educators must adopt the role of data activists, promoting data justice and challenging systemic inequalities. By integrating principles of data justice, as advocated by Milan and Van der Velden (2016) and Dencik et al. (2016), into curriculum design, educators can help ensure that AI and data-driven systems promote social justice and benefit all members of society.

Data justice involves addressing the ethical and social implications of how data is collected, stored, analysed and disseminated, with a focus on ensuring its fair and equitable use in the public interest (Dencik et al., 2019). Recognising that data are not neutral and can reinforce existing power structures and inequalities, it is crucial to confront biases, incompleteness and manipulation in data practices that may result in social harm (Atenas et al., 2023).

D’Ignazio and Klein (2020) argue that data justice is vital for correcting power imbalances in AI development and deployment, advocating for the active involvement of marginalised communities in shaping data-driven systems. Similarly, Milan and Van der Velden (2016), Noble (2020) and Dencik et al. (2016) highlight the need to address power disparities within the data ecosystem and ensure the participation of marginalised groups in decision-making processes. They call for a more democratic approach to data governance in HE to avoid perpetuating racial and gender inequalities. By embedding principles of data justice into creative and critical practices in learning, teaching and research, HE can contribute to mitigating biases in data collection and analysis, advancing the principles of redistribution and recognition, and addressing social inequalities related to race, gender and class.

The increasing reliance on digital tools and data-driven policies in HE has created a complex data ecosystem, where fostering participatory governance can lead to a more equitable and just system. Critical data and AI literacy are essential for empowering academics and students to engage meaningfully with data, particularly through participatory design processes that involve community engagement in reviewing and assessing educational technologies. Incorporating community-based participatory research and policymaking ensures that data collection and usage are ethical and socially responsible, prioritising the interests of vulnerable communities. As boyd and Crawford (2012) cautioned, data practices can have unintended consequences, making it crucial for academic programmes to raise awareness of the ethical implications of data use and, as Taylor and Mukiri-Smith (2021) advocate, to follow ethical frameworks that ensure data justice and respect the rights and dignity of individuals and communities.

Through the analysis of the literature, we identified a series of data justice principles that serve as a guide for clustering a wide range of critical activities that can be embedded in curricula. These are linked with the pillars of data justice to support educators in designing effective and sustainable practices with their learners across disciplines, as presented in Figure 1.

Figure 1. Data justice pillars and principles.

Therefore, grounded in these principles, we propose a series of activities to build critical data and AI literacy, to advance understanding of the ethical, social and political dimensions of data and promote interdisciplinary research-based learning activities, as shown in Table 1.

Table 1. Learning and teaching activities grounded on data justice concepts.
Data justice principle – Learning and teaching activity
Centre marginalised voices – This involves incorporating readings, discussions and guest speakers from marginalised communities to ensure their perspectives are highlighted and understood.
Promote data literacy – Teaching data visualisation and interpretation skills, whilst discussing the limitations of data sources, helps students become proficient in understanding and analysing data effectively.
Challenge power dynamics – By encouraging critical reflection on power structures, educators can help students gain insights into how data can reinforce or challenge existing hierarchies and inequalities.
Foster community-engaged research – Introducing principles of community-engaged research and collaborating with community partners in research activities help students understand the importance of involving diverse voices in research.
Promote socially just uses of data – Comparing and contrasting examples of potentially socially just and unjust data usage helps students understand the ethical implications of data and encourages them to use data in ways that promote social justice.
Promote data ethics – Exploring ethical frameworks and principles relevant to data ethics and discussing case studies of data ethics violations help students develop a strong ethical foundation in handling data.
Advocate for data decoloniality – By analysing historical data practices and their impact on indigenous communities and marginalised populations, as well as exploring indigenous approaches to knowledge curation, students can understand the role of data in perpetuating colonial perspectives and power imbalances.
Incorporate diverse perspectives – Encouraging diverse readings and inviting guest speakers from various backgrounds help students gain exposure to diverse perspectives and experiences, fostering a more inclusive learning environment.

Analysis of the crowdsourced ideas

Involving educators and students in creative participatory activities, both individually and collaboratively, can help to challenge preconceptions, reveal marginalised or unheard voices and consider alternative perspectives, ultimately leading to new insights and to the growth of critical data and AI literacy, as participants explore and experiment with AI in innovative ways to develop critical thinking and literacy in various educational contexts, both within and beyond HE (Nerantzi et al., 2023). Through the analysis of a series of crowdsourced ideas, shared openly by 83 contributors from across the globe for a project coordinated by the international Creative HE community, we have mapped a series of initiatives, proposals, ideas and approaches that can help educators to develop learning activities grounded in creative and arts-based inquiry and practices, in order to break down barriers to communication and to diversify and deepen participation (MacGregor et al., 2023).

The activities were first clustered by type and by the techniques employed, providing valuable strategies to engage educators and students in developing critical data literacies and helping them develop a range of skills in data analysis, interpretation and communication, working collaboratively and using real-life scenarios in a real-world learning approach (Abegglen et al., 2024; Kara et al., 2021). Some of these ideas are summarised in Table 2.

Table 2. Summary of creative ideas for the use of AI in HE.
Idea – Description (Suggested tools)
Fostering digital literacies for AI ethics – Using AI tools, educators can engage students in critical discussions on digital technology, exploring real-world AI applications like social media algorithms and analysing ethical implications and societal impacts whilst fostering awareness of privacy concerns. (Suggested tools: Midjourney)
Using AI to develop variety in scenario-based assessments – Academics can leverage technology to develop authentic, meaningful and adaptive assessments, using AI to adjust complex scenarios based on learners’ levels, mirroring real-world situations with varying parameters. (Suggested tools: ChatGPT)
Quizmaster or pub quiz – AI generates diverse questions prompting discussions on biases, stereotypes and marginalised perspectives, ensuring balanced representation using familiar formats such as TV or pub quizzes, enhancing engagement and facilitating meaningful conversations. (Suggested tools: ChatGPT)
Branching scenarios using AI-generated case studies – AI can streamline case study creation, offering comprehensive narratives with dialogue, feedback and branching options. AI-generated ethical case studies span diverse domains like healthcare, business and social justice, presenting complex scenarios for ethical exploration, featuring multiple decision points and simulating real-world contexts for students to analyse, whilst branching scenarios provide rich contextual information and relevant facts, enabling informed ethical decision-making, promoting engagement and providing students with a content-rich learning journey. (Suggested tools: ChatGPT, Midjourney, H5P)
Rewriting with AI image generators – Students can learn to write concise instructions for an AI image generator to explore ethical implications and responsible manipulation, reflecting on consequences like misrepresentation and privacy concerns, emphasising the importance of permissions and cultural sensitivity, streamlining discussions on cultural stereotypes and fostering awareness of contextual impact and mindful image use. (Suggested tools: DALL·E 2)
Understanding gender bias in AI: A critical reflection exercise – Students can use AI tools like ChatGPT and DALL·E to investigate how discriminatory data fuels gender bias in AI-generated outputs, prompting group discussions on bias identification and mitigation strategies and encouraging critical reflection on AI’s role in perpetuating gender stereotypes through real-world scenarios in employment, advertising and virtual assistants, in order to explore how recommendation systems amplify biases, reflect on social implications and consider strategies for mitigation. (Suggested tools: ChatGPT, DALL·E and other emerging AI technologies)
Distilling key ideas from OpenAI’s privacy policy/terms of use – Students can engage in a jigsaw reading activity, dissecting OpenAI’s (or another AI provider’s) Privacy Policy or Terms of Use in pairs to identify key points and create infographics or comic strips illustrating privacy, user consent and responsible AI use, with an emphasis on incorporating dialogue and visual storytelling techniques for effective communication. (Suggested tools: OpenAI, Texter)
Art and philosophers – Students can curate an exhibition focused on philosophical phrases or keywords related to various philosophers’ thoughts, exploring concepts such as moral responsibility, free will and consciousness in the context of AI through the exhibited works. (Suggested tools: Midjourney)
AI: artificial intelligence; HE: higher education.

Not only can the mapped activities help to develop and mature an understanding of data and AI literacy within the wider academic community, but we also present some creative ideas that can help start a dialogue between educators and students, as shown in Table 3.

Table 3. Creative and artistic ideas to develop critical data and AI literacies.
Creative/artistic idea – Description (Intended outcome)
Data debate role play – Using existing ed-tech policies, educators and students can debate data-related issues and lobby for changes or for maintaining the status quo using role-play techniques. (Intended outcome: Develop understanding of the processes and skills needed to participate in policymaking.)
Data comics – Create a comic strip that tells a story using data, through which educators and students can develop comics that can be translated into other contexts. (Intended outcome: Develop critical thinking and communication skills in data analysis.)
Data sculpture – Create a physical sculpture or model from data using a range of materials, so that educators and students can develop data visualisation skills. (Intended outcome: Develop spatial reasoning and creativity in data representation.)
Data poetry – Write poetry that incorporates data or data visualisations, so that educators and students can develop data interpretation and creative writing skills. (Intended outcome: Develop imaginative and expressive skills in data analysis.)
Data mapping – Use mapping software to create a visual representation of data through which educators and students visualise a social issue. (Intended outcome: Develop spatial reasoning and critical thinking in data analysis.)
Data storytelling – Use digital storytelling tools to craft multimodal narratives incorporating data, whilst also exploring non-digital methods for storytelling enhancement, to immerse educators and students and foster the development of data interpretation and storytelling skills. (Intended outcome: Develop critical thinking and communication skills in data analysis.)
Data game – Employ gamification techniques to collaboratively develop a board game that delves into topics like power dynamics and marginalised voices, facilitating reflection and learning about the importance of incorporating diverse perspectives in data work. (Intended outcome: Develop openness and critical understanding of diverse perspectives in data collection, interpretation and analysis.)
Data drama – Create an embodied experience using drama, role play or simulation techniques to enable educators and students to immerse themselves in a specific scenario or case and experience first-hand data unfairness, biases and exclusion. (Intended outcome: Develop critical awareness of the importance of data fairness and justice.)
Data drawing – Create a doodle or drawing that visualises a story using data, through which educators and students will develop data interpretation and storytelling skills using visual language. (Intended outcome: Develop critical thinking and communication skills in data analysis and interpretation.)
Data collage – Make a 2D or 3D collage or bricolage to visualise data in a unique way that may help identify data asymmetries, anomalies and misrepresentation, raise new questions and identify connections valuable for data analysis and interpretation. (Intended outcome: Develop critical awareness of noticing and novel connections and their importance for data interpretation and representation.)
AI: artificial intelligence.

Conclusions and recommendations

To develop and advance data and AI literacies, it is key to incorporate the pillars and principles of data justice into HE curricula, creatively integrating activities grounded in real social problems and challenges into existing programmes of study. To enhance academic practice in HE, particularly in the integration of data justice principles into curricula, we propose the following recommendations:

Finally, we call for increased research and scholarship attending to questions of data justice, by educators, students and other collaborators, to advance our understanding of data justice (and injustice) and its applications in HE learning, teaching, research, administration and knowledge exchange. Rather than simply talking, or even doing assignments, about data and AI, we call for capacity building in order to conduct and disseminate empirical studies, critical analyses and theoretical explorations that will form a growing knowledge base, supporting HE communities with the knowledge, skills and ethical awareness to navigate the complex landscapes of data and AI whilst centring equity, justice and social well-being.

References

Abegglen, S., Nerantzi, C., Martínez-Arboleda, A., Karatsiori, M., Atenas, J., & Rowell, C. (2024). Towards AI literacy: 101+ creative and critical practices, perspectives and purposes. #creativeHE.

Agarwal, V., Panicker, A., Sharma, A., Rammurthy, R., Ganesh, L., & Chaudhary, S. (2021). Crowdsourcing in higher education: Theory and best practices. In R. Lenart-Gansiniec & J. Chen (Eds.), Crowdfunding in the public sector (pp. 127–135). Springer.

Ahenkorah, E. (2020). Safe and brave spaces don’t work (and what you can do instead). Medium. Retrieved from https://medium.com/@elise.k.ahen/safe-and-brave-spaces-dont-work-and-what-you-can-do-instead-f265aa339aff

Atenas, J., Havemann, L., & Timmermann, C. (2020). Critical literacies for a datafied society: Academic development and curriculum design in higher education. Research in Learning Technology, 28, e2468. https://doi.org/10.25304/rlt.v28.2468

Atenas, J., Havemann, L., & Timmermann, C. (2023). Reframing data ethics in research methods education: A pathway to critical data literacy. International Journal of Educational Technology in Higher Education, 20(1), 11. https://doi.org/10.1186/s41239-023-00380-y

Baack, S. (2015). Datafication and empowerment: How the open data movement re-articulates notions of democracy, participation, and journalism. Big Data & Society, 2(2), 1–11. https://doi.org/10.1177/2053951715594634

Beghetto, R.A., & Kaufman, J.C. (2014). Classroom contexts for creativity. High Ability Studies, 25(1), 53–69. https://doi.org/10.1080/13598139.2014.905247

Bene, R., & McNeilly, E. (n.d.). Getting radical: Using design thinking to tackle collaboration issues. Papers on Postsecondary Learning and Teaching, 4, 50–57. https://doi.org/10.11575/pplt.v4i.68832

Bhargava, R., Kadouaki, R., Bhargava, E., Castro, G., & D’Ignazio, C. (2016). Data murals: Using the arts to build data literacy. The Journal of Community Informatics, 12(3). Retrieved from https://openjournals.uwaterloo.ca/index.php/JoCI/article/view/3285

boyd, d., & Crawford, K. (2012). Critical questions for Big Data: Provocations for a cultural, technological, and scholarly phenomenon. Information, Communication & Society, 15(5), 662–679. https://doi.org/10.1080/1369118X.2012.678878

Bozkurt, A., Xiao, J., Lambert, S., Pazurek, A., Crompton, H., Koseoglu, S., Farrow, R., Bond, M., Nerantzi, C., Honeychurch, S., Bali, M., Dron, J., Mir, K., Stewart, B., Costello, E., Mason, J., Stracke, C. M., Romero-Hall, E., Koutropoulos, A., … Jandrić, P. (2023). Speculative futures on ChatGPT and generative artificial intelligence (AI): A collective reflection from the educational landscape. Asian Journal of Distance Education, 18(1). Retrieved from https://asianjde.com/ojs/index.php/AsianJDE/article/view/709

Brand, J., & Sander, I. (2020). Critical data literacy tools for advancing data justice: A guidebook. Data Justice Lab. Retrieved from https://datajustice.files.wordpress.com/2020/06/djl-data-literacy-guidebook.pdf

Brew, M., Taylor, S., Lam, R., Havemann, L., & Nerantzi, C. (2023). Towards developing AI literacy: Three student provocations on AI in higher education. Asian Journal of Distance Education, 18(2), 1–11. https://doi.org/10.5281/zenodo.8032387

British Science Association. (2022). Future Forum: Creativity in STEM: Young people’s views on using collective collaboration to build a better future. British Science Association in collaboration with Unboxed Creativity in the UK. Retrieved from https://www.britishscienceassociation.org/News/future-forum-report-2022-published

Carmi, E., Yates, S. J., Lockley, E., & Pawluczuk, A. (2020). Data citizenship: Rethinking data literacy in the age of disinformation, misinformation, and malinformation. Internet Policy Review, 9(2), 1–22. https://doi.org/10.14763/2020.2.1481

Chaudhry, M.A., & Kazim, E. (2022). Artificial intelligence in education (AIEd): A high-level academic and industry note 2021. AI and Ethics, 2(1), 157–165. https://doi.org/10.1007/s43681-021-00074-z

Couldry, N. (2020). Recovering critique in an age of datafication. New Media & Society, 22(7), 1135–1151. https://doi.org/10.1177/1461444820912536

Couldry, N., & Hepp, A. (2018). The mediated construction of reality. John Wiley & Sons.

D’Ignazio, C., & Bhargava, R. (2015). Approaches to building big data literacy. Proceedings of the Bloomberg Data for Good Exchange Conference. Bloomberg data for good exchange conference. Retrieved from http://www.kanarinka.com/wp-content/uploads/2021/01/DIgnazio-and-Bhargava-Approaches-to-Building-Big-Data-Literacy.pdf

Dalton, C.M., Taylor, L., & Thatcher, J. (2016). Critical data studies: A dialog on data and space. Big Data & Society, 3(1), 1–9. https://doi.org/10.1177/2053951716648346

Danylchenko-Cherniak, O. (2023). Creative and collaborative learning during the Russian-Ukrainian war period: Philological aspects. Philological Treatises, 15(1), 51–61. Retrieved from https://tractatus.sumdu.edu.ua/index.php/journal/article/view/1069

De Almeida, C.P.B., & De Goulart, B.N.G. (2017). How to avoid bias in systematic reviews of observational studies. Revista CEFAC, 19(4), 551–555. https://doi.org/10.1590/1982-021620171941117

Dencik, L., & Sanchez-Monedero, J. (2022). Data justice. Internet Policy Review, 11(1), 1–16. https://doi.org/10.14763/2022.1.1615

Dencik, L., Hintz, A., & Cable, J. (2016). Towards data justice? The ambiguity of anti-surveillance resistance in political activism. Big Data & Society, 3(2). https://doi.org/10.1177/2053951716679678

Dencik, L., Hintz, A., Redden, J., & Treré, E. (2019). Exploring data justice: Conceptions, applications and directions. Information, Communication & Society, 22(7), 873–881. https://doi.org/10.1080/1369118X.2019.1606268

D’Ignazio, C. (2022). Creative data literacy: Bridging the gap between the data-haves and data-have nots. Information Design Journal, 23(1), 6–18. https://doi.org/10.1075/idj.23.1.03dig

D’Ignazio, C., & Klein, L. F. (2020). Data feminism. The MIT Press.

Dixon-Woods, M., Cavers, D., Agarwal, S., Annandale, E., Arthur, A., Harvey, J., Hsu, R., Katbamna, S., Olsen, R., Smith, L., Riley, R., & Sutton, A.J. (2006). Conducting a critical interpretive synthesis of the literature on access to healthcare by vulnerable groups. BMC Medical Research Methodology, 6(1), 35. https://doi.org/10.1186/1471-2288-6-35

Dubey, M., Otto, J., & Forbes, A. G. (2019). Data brushes: Interactive style transfer for data art (pp. 1–9). IEEE VIS Arts Program (VISAP). Retrieved from https://ieeexplore.ieee.org/document/8900858

Fereday, J., & Muir-Cochrane, E. (2006). Demonstrating rigor using thematic analysis: A hybrid approach of inductive and deductive coding and theme development. International Journal of Qualitative Methods, 5(1), 80–92. https://doi.org/10.1177/160940690600500107

Floridi, L., & Taddeo, M. (2016). What is data ethics? Philosophical Transactions of the Royal Society. Mathematical, Physical and Engineering Sciences, 374(2083), 20160360. https://doi.org/10.1098/rsta.2016.0360

Freire, P. (2011). Pedagogy of the oppressed. Continuum.

Gilliard, C. (2017). Pedagogy and the logic of platforms. Educause Review, 52(4). Retrieved from https://er.educause.edu/-/media/files/articles/2017/7/erm174111.pdf

Goel, A.K., & Joyner, D.A. (2017). Using AI to teach AI: Lessons from an online AI class. AI Magazine, 38(2), 48–58. https://doi.org/10.1609/aimag.v38i2.2732

Gonsales, P., Buzato, M., & King, E. (2021). Digital literacies and digital inclusion in contemporary Brazil. University of Campinas.

Hamilton, T., & Sharma, S. (1996). Power, power relations, and oppression: A perspective for balancing the power relations. Peace Research, 28(1), 21–41. Retrieved from http://www.jstor.org/stable/23607296

Heeks, R., & Swain, S. (n.d.). An applied data justice framework: Analysing datafication and marginalised communities in cities of the Global South. SSRN Electronic Journal, 74, 1–25. https://doi.org/10.2139/ssrn.3425885

Henz, P. (2021). Ethical and legal responsibility for Artificial Intelligence. Discover Artificial Intelligence, 1(2), 1–5. https://doi.org/10.1007/s44163-021-00002-4

Holmes, W., Bialik, M., & Fadel, C. (2019). Artificial intelligence in education: Promises and implications for teaching and learning. Center for Curriculum Redesign. Retrieved from https://circls.org/primers/artificial-intelligence-in-education-promises-and-implications-for-teaching-and-learning

hooks, b. (1994). Teaching to transgress: Education as the practice of freedom. Routledge.

Jackson, B. (2018). The changing research data landscape and the experiences of ethics review board chairs: Implications for library practice and partnerships. The Journal of Academic Librarianship, 44(5), 603–612. https://doi.org/10.1016/j.acalib.2018.07.001

Jackson, N., Oliver, M., Shaw, M., & Wisdom, J. (2006). Developing creativity in higher education: An imaginative curriculum. Routledge.

Jarke, J., & Breiter, A. (2019). The datafication of education. Learning, Media and Technology, 44(1), 1–6. https://doi.org/10.1080/17439884.2019.1573833

Kara, H., Lemon, N., Mannay, D., & McPherson, M. (2021). Creative research methods in education: Principles and practices. Policy Press.

Komljenovic, J., Williamson, B., Eynon, R., & Davies, H.C. (2023). When public policy ‘fails’ and venture capital ‘saves’ education: Edtech investors as economic and political actors. Globalisation, Societies and Education, 1–16. https://doi.org/10.1080/14767724.2023.2272134

Kordzadeh, N., & Ghasemaghaei, M. (2022). Algorithmic bias: Review, synthesis, and future research directions. European Journal of Information Systems, 31(3), 388–409. https://doi.org/10.1080/0960085X.2021.1927212

Leonard, N. (2021). Emerging artificial intelligence, art and pedagogy: Exploring discussions of creative algorithms and machines for art education. Digital Culture & Education, 13(1), 20–41. Retrieved from https://www.digitalcultureandeducation.com/s/Leonard-2021.pdf

Lepri, B., Oliver, N., Letouzé, E., Pentland, A., & Vinck, P. (2018). Fair, transparent, and accountable algorithmic decision-making processes. Philosophy and Technology, 31, 611–627. https://doi.org/10.1007/s13347-017-0279-x

Leurs, K. (2017). Feminist data studies: Using digital methods for ethical, reflexive and situated socio-cultural research. Feminist Review, 115(1), 130–154. https://doi.org/10.1057/s41305-017-0043-1

Llorente, R., & Morant, M. (2015). Crowdsourcing in higher education. In F. Garrigos-Simon, I. Gil-Pechuán, & S. Estelles-Miguel (Eds.), Advances in crowdsourcing (pp. 547–575). Springer. https://doi.org/10.1007/978-3-319-18341-1_7

Lockwood, T. (2010). The bridge between design and business. Design Management Review, 21(3), 5. https://doi.org/10.1111/j.1948-7169.2010.00072.x

Long, D., & Magerko, B. (2020). What is AI literacy? Competencies and design considerations. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, 20, 1–16. https://doi.org/10.1145/3313831.3376727

MacGregor, S., Cooper, A., Searle, M., & Kukkonen, T. (2023). Co-production and arts-informed inquiry as creative power for knowledge mobilisation. Evidence & Policy, 18(2), 206–235. https://doi.org/10.1332/174426421X16478737939339

Manca, A., Atenas, J., Ciociola, C., & Nascimbeni, F. (2017). Critical pedagogy and open data for educating towards social cohesion [Pedagogia critica e dati aperti come mezzi per educare alla coesione sociale]. Tecnologie Didattiche, 25(1), 111–115. https://doi.org/10.17471/2499-4324/9

Mandinach, E.B., & Gummer, E.S. (2013). A systemic view of implementing data literacy in educator preparation. Educational Researcher, 42(1), 30–37. https://doi.org/10.3102/0013189X12459803

Markauskaite, L., Marrone, R., Poquet, O., Knight, S., Martinez-Maldonado, R., Howard, S., Tondeur, J., De Laat, M., Buckingham Shum, S., Gašević, D., & Siemens, G. (2022). Rethinking the entwinement between artificial intelligence and human learning: What capabilities do learners need for a world with AI? Computers and Education: Artificial Intelligence, 3, 100056. https://doi.org/10.1016/j.caeai.2022.100056

McDougall, R. (2015). Reviewing literature in bioethics research: Increasing rigour in non‑systematic reviews. Bioethics, 29(7), 523–528. https://doi.org/10.1111/bioe.12149

McGovern, N. (2018). Radical collaboration and research data management: An introduction. Research Library Issues, 296, 6–22. https://doi.org/10.29242/rli.296.2

Means, A.J., & Slater, G.B. (2022). Future histories of education, pedagogy, and cultural studies: An editorial introduction. Review of Education, Pedagogy, and Cultural Studies, 44(1), 1–3. https://doi.org/10.1080/10714413.2022.2031750

Milan, S., & Van der Velden, L. (2016). The alternative epistemologies of data activism. Digital Culture & Society, 2(2), 57–74. https://doi.org/10.14361/dcs-2016-0205

Naidu, S. (2021). Building resilience in education systems post-COVID-19. Distance Education, 42(1), 1–4. https://doi.org/10.1080/01587919.2021.1885092

Nerantzi, C., Abegglen, S., Karatsiori, M., & Martínez-Arboleda, A. (Eds.). (2023). 101 creative ideas to use AI in education: A crowdsourced collection (Version 2023 1.2). Zenodo. https://doi.org/10.5281/ZENODO.8072949

Noble, S.U. (2020). Algorithms of oppression: How search engines reinforce racism. New York University Press.

Nussbaum, B. (2013). Creative Intelligence: Harnessing the power to create, connect, and inspire. Harper Business.

Pangrazio, L., & Selwyn, N. (2019). ‘Personal data literacies’: A critical literacies approach to enhancing understandings of personal digital data. New Media & Society, 21(2), 419–437. https://doi.org/10.1177/1461444818799523

Park, S., & Humphry, J. (2019). Exclusion by design: Intersections of social, digital and data exclusion. Information, Communication & Society, 22(7), 934–953. https://doi.org/10.1080/1369118X.2019.1606266

Perrotta, C. (2022). Advancing data justice in education: Some suggestions towards a deontological framework. Learning, Media and Technology, 48(2), 187–199. https://doi.org/10.1080/17439884.2022.2156536

Prinsloo, P., Slade, S., & Khalil, M. (2022). The answer is (not only) technological: Considering student data privacy in learning analytics. British Journal of Educational Technology, 53(2), 876–893. https://doi.org/10.1111/BJET.13216

Redecker, C., & Punie, Y. (2020). Digital education action plan 2021–2027 resetting education and training for the digital age. Office of the European Union. Retrieved from https://eur-lex.europa.eu/legal-content/EN/TXT/?qid=1602778451601&uri=CELEX:52020DC0624#footnote32

Schäfer, M.T., & Van Es, K. (Eds.). (2017). The Datafied Society: Studying culture through data. Amsterdam University Press. Retrieved from https://library.oapen.org/handle/20.500.12657/31843

Selwyn, N. (2015). Data entry: Towards the critical study of digital data and education. Learning, Media and Technology, 40(1), 64–82. https://doi.org/10.1080/17439884.2014.921628

Selwyn, N. (2018). Data points: Exploring data-driven reforms of education. British Journal of Sociology of Education, 56(92), 1–9. https://doi.org/10.1080/01425692.2018.1469255

Solemon, B., Ariffin, I., Md Din, M., & Md Anwar, R. (2013). A review of the uses of crowdsourcing in higher education. International Journal of Asian Social Science, 3(9), 2066–2073. Retrieved from https://archive.aessweb.com/index.php/5007/article/view/2564

Stein, S., & De Oliveira Andreotti, V. (2017). Higher education and the modern/colonial global imaginary. Cultural Studies ↔ Critical Methodologies, 17(3), 173–181. https://doi.org/10.1177/1532708616672673

Suoranta, J., Teräs, M., Teräs, H., Jandrić, P., Ledger, S., Macgilchrist, F., & Prinsloo, P. (2021). Speculative social science fiction of digitalization in higher education: From what is to what could be. Postdigital Science and Education, 4, 224–236. https://doi.org/10.1007/s42438-021-00260-6

Swist, T., & Gulson, K.N. (2023). Instituting socio-technical education futures: Encounters with/through technical democracy, data justice, and imaginaries. Learning, Media and Technology, 48(2), 181–186. https://doi.org/10.1080/17439884.2023.2205225

Taylor, L. (2017). What is data justice? The case for connecting digital rights and freedoms globally. Big Data & Society, December, 1–14. https://doi.org/10.1177/2053951717736335

Taylor, L., & Mukiri-Smith, H. (2021). Human rights, technology and poverty. In M.F. Davis, M. Kjaerum, & A. Lyons (Eds.), Research handbook on human rights and poverty. Edward Elgar Publishing. https://doi.org/10.4337/9781788977517.00049

Thoutenhoofd, E.D. (2017). The datafication of learning: Data technologies as reflection issue in the system of education. Studies in Philosophy and Education, 37, 433–449. https://doi.org/10.1007/s11217-017-9584-1

Treve, M. (2021). What COVID-19 has introduced into education: Challenges facing higher education institutions (HEIs). Higher Education Pedagogies, 6(1), 212–227. https://doi.org/10.1080/23752696.2021.1951616

Webb, C. (1993). Feminist research: Definitions, methodology, methods and evaluation. Journal of Advanced Nursing, 18(3), 416–423. https://doi.org/10.1046/j.1365-2648.1993.18030416.x

Williams, P., & Clarke, B. (2016). Dangerous associations: Joint enterprise, gangs and racism. Centre for Crime and Justice Studies. Retrieved from https://www.crimeandjustice.org.uk/sites/crimeandjustice.org.uk/files/Dangerous%20assocations%20Joint%20Enterprise%20gangs%20and%20racism.pdf

Williamson, B. (2017). Big data in education: The digital future of learning, policy and practice (1st ed.). SAGE Publications.

Williamson, B., Eynon, R., & Potter, J. (2020). Pandemic politics, pedagogies and practices: Digital technologies and distance education during the coronavirus emergency. Learning, Media and Technology, 45(2), 107–114. https://doi.org/10.1080/17439884.2020.1761641

Young, I.M. (2014). Five faces of oppression. Rethinking Power, 174–195. Retrieved from https://www.kenwoodacademy.org/ourpages/auto/2019/9/4/42714114/Five%20Faces%20of%20Oppression.pdf

Zembylas, M. (2023). The analytical potential of ‘affective imaginaries’ in higher education research. Education Inquiry, 1–16. https://doi.org/10.1080/20004508.2023.2223781