Human-Centred Learning Analytics: 2019-2024 (BJET)

Buckingham Shum, S., Martínez-Maldonado, R., Dimitriadis, Y., & Santos, P. (2024). Human-Centred Learning Analytics: 2019-24. British Journal of Educational Technology, 55(3), 755-768. (Editorial to a Special Section Human-Centred Learning Analytics)

In 2018, working with Rebecca Ferguson and Roberto Martínez-Maldonado, I co-edited a special section of the Journal of Learning Analytics introducing the thematic priority of Human-Centred Learning Analytics (HCLA), published in 2019. We called for LA to engage with the rich diversity of HCI theories, design processes and empirical methods, with explicit attention to meaningful engagement with educational stakeholders, to design LA tools that augment teaching practices and learner behaviour, and, through this, to illuminate the sociotechnical factors influencing the successful adoption (or rejection) of LA tools.

Since then, we have seen the HCLA community grow through annual international workshops linked to major conferences, and several literature reviews have now been published. So, five years on, it seemed timely to bring together a new collection of work as a snapshot of the current state of the art, and we were delighted that the British Journal of Educational Technology selected our proposal from its call for special sections. In our editorial we reflect on the papers, how the field has developed, and what the next five years might hold. Here’s the abstract, now published for early online access, with the special section formally published in May:

Human-Centred Learning Analytics (HCLA) has emerged in the last 5 years as an active sub-topic within Learning Analytics, drawing primarily on the theories and methods of Human-Computer Interaction (HCI). HCLA researchers and practitioners are adopting and adapting HCI theories/methods to meet the challenge of meaningfully engaging educational stakeholders in the LA design process, evaluating systems in use, and researching the sociotechnical factors influencing LA successes and failures. This editorial introduces the contributions of the papers in this special section, reflects more broadly on the field’s emergence over the last five years, considers known gaps, and indicates new opportunities that may open in the next five years.

I hope you find this a provocative collection that inspires you to bring the voices of stakeholders more strongly and meaningfully into the design process.

Special Section Human-Centred Learning Analytics – papers and abstracts:

Campos, F., Nguyen, H., Ahn, J., & Jackson, K. (2023). Leveraging cultural forms in human-centred learning analytics design

In this article, we offer theory-grounded narratives of a 4-year participatory design process of a Learning Analytics tool with K-12 educators. We describe how we design-in-partnership by leveraging educators’ routines, values and cultural representations into the designs of digital dashboards. We make our long-term reasoning visible by reflecting upon how design decisions were made, discussing key tensions and analysing to what extent the developed tools were taken up in practice. Through thick design narratives, we reflect upon how cultural forms—recognizable cultural constructs that might cue and facilitate specific activities—were identified among educators and informed the design of a dashboard. We then examined the extent to which the designed tool supported coaches and teachers to engage in Generative Uncertainty, an interpretive stance in which educators manifest productive inquiries towards data. Our analysis highlights that attuning to cultural forms is a valuable first step but not enough towards designing LA tools for systems in ways that fit institutionalized practices, challenge instrumental uses and spur productive inquiry. We conclude by offering two key criteria for making culturally-grounded design decisions in the context of long-term partnerships.

Hilliger, I., Miranda, C., Celis, S., & Pérez-Sanagustín, M. (2023). Curriculum analytics adoption in higher education: A multiple case study engaging stakeholders in different phases of design

Several studies have indicated that stakeholder engagement could ensure the successful adoption of learning analytics (LA). Considering that researchers and tech developers may not be aware of how LA tools can derive meaningful and actionable information for everyday use, these studies suggest that participatory approaches based on human-centred design can provide stakeholders with the opportunity to influence decision-making during tool development. So far, there is a growing consensus about the importance of identifying stakeholders’ needs and expectations in early stages, so researchers and developers can design systems that resonate with their users. However, human-centred LA is a growing sub-field, so further empirical work is needed to understand how stakeholders can contribute effectively to the design process and the adoption strategy of analytical tools. To illustrate mechanisms to engage various stakeholders throughout different phases of a design process, this paper presents a multiple case study conducted in different Latin American universities. A series of studies inform the development of an analytical tool to support continuous curriculum improvement, aiming to improve student learning and programme quality. Yet, these studies differ in scope and design stage, so they use different mechanisms to engage students, course instructors and institutional administrators. By cross-analysing the findings of these three cases, three conclusions emerged for each design phase of a curriculum analytics (CA) tool, presenting mechanisms to ensure stakeholder adoption after tool development. Further implications of this multiple case study are discussed from a theoretical and methodological perspective.

Hutchins, N. M., & Biswas, G. (2023). Co-designing teacher support technology for problem-based learning in middle school science

This paper provides an experience report on a co-design approach with teachers to co-create learning analytics-based technology to support problem-based learning in middle school science classrooms. We have mapped out a workflow for such applications and developed design narratives to investigate the implementation, modifications and temporal roles of the participants in the design process. Our results provide precedent knowledge on co-designing with experienced and novice teachers and co-constructing actionable insight that can help teachers engage more effectively with their students’ learning and problem-solving processes during classroom PBL implementations.

Lawrence, L., Echeverria, V., Yang, K., Aleven, V., & Rummel, N. (2023). How teachers conceptualise shared control with an AI co-orchestration tool: A multiyear teacher-centred design process

Artificial intelligence (AI) can enhance teachers’ capabilities by sharing control over different parts of learning activities. This is especially true for complex learning activities, such as dynamic learning transitions where students move between individual and collaborative learning in unplanned ways, as the need arises. Yet, few initiatives have emerged considering how shared responsibility between teachers and AI can support learning and how teachers’ voices might be included to inform design decisions. The goal of our article is twofold. First, we describe a secondary analysis of our co-design process comprising six design methods to understand how teachers conceptualise sharing control with an AI co-orchestration tool, called Pair-Up. We worked with 76 middle school math teachers, each taking part in one to three methods, to create a co-orchestration tool that supports dynamic combinations of individual and collaborative learning using two AI-based tutoring systems. We leveraged qualitative content analysis to examine teachers’ views about sharing control with Pair-Up, and we describe high-level insights about the human-AI interaction, including control, trust, responsibility, efficiency and accuracy. Second, we use our results as an example showcasing how human-centred learning analytics can be applied to the design of human-AI technologies, and share reflections for human-AI technology designers regarding the methods that might be fruitful to elicit teacher feedback and ideas. Our findings illustrate the design of a novel co-orchestration tool to facilitate the transitions between individual and collaborative learning, and highlight considerations and reflections for designers of similar systems.

Wiley, K., Dimitriadis, Y., & Linn, M. (2023). A human-centred learning analytics approach for developing contextually scalable K-12 teacher dashboards

This paper describes a Human-Centred Learning Analytics (HCLA) design approach for developing learning analytics (LA) dashboards for K-12 classrooms that maintain both contextual relevance and scalability—two goals that are often in competition. Using mixed methods, we collected observational and interview data from teacher partners and assessment data from their students’ engagement with the lesson materials. This DBR-based, human-centred design process resulted in a dashboard that supported teachers in addressing their students’ learning needs. To develop the dashboard features that could support teachers, we found that a design refinement process that drew on the insights of teachers with varying teaching experience, philosophies and teaching contexts strengthened the resulting outcome. The versatile nature of the approach, in terms of student learning outcomes, makes it useful for HCLA design efforts across diverse K-12 educational contexts.
