CfP: Learning Analytics for 21st Century Competencies
Special Issue Editors: Simon Buckingham Shum & Ruth Crick (University of Technology Sydney)
Full call for Special Issue submissions: http://bit.ly/jlac21
A strategic educational response to a world of constant change is to focus explicitly on nurturing the skills and dispositions that equip learners to cope with novel, complex situations, assessed under authentic conditions. Even if we do not know what the future holds, we can be better equipped for the one thing we can be sure of: change. The qualities that learners need have thus been dubbed "21st Century" in nature, not because they were of no use before (although they may take novel forms today), but because of their critical importance in work involving sensemaking and creativity.
This sets the challenging context for understanding the potential of Learning Analytics approaches for the formative (and possibly summative) assessment of 21st century competencies, which are important precisely because they must be displayed in interpersonally, societally and culturally valid contexts. By definition, assessing qualities that are lifelong, spanning the 'arc of life' within and beyond formal learning, demands new kinds of evidence. Computational support for tracking, feeding back, and reflecting learning processes holds the promise that these qualities can be evidenced, at scale, in ways that have been impractical until now.
Framed thus, the goal is to forge new links from the body of educational and learning sciences research, which typically clarifies the nature of the phenomena under study using representations and language aimed at researchers, to documentation of how data, algorithms, code and user interfaces come together through coherent design to automate such analyses, providing actionable insight for the educators, students and other stakeholders who constitute the learning system in question.
Quantifying these deeply personal qualities in order to feed back and strengthen them, without in the process reducing them to meaningless statistics, is the heart of the learning analytics challenge. How does one gather data from a diversity of life contexts as potential evidence of these new competencies? How do we translate theoretical constructs into algorithms with integrity? How can they be rendered for human interpretation, by whom, and with what training? Should such analytics be used primarily for formative assessment, or should we be aiming for summative grades? Who gets to design the analytics, and who gets to validate them? Do analytics of this sort raise new ethical dilemmas?
Contributions are invited to this special issue that document and advance theory, design methodology, technology implementation, or evidence of impact, including but not limited to: