Learning Analytics are Assessment Regimes

I have followed the thoughtful work of Gardner Campbell with great interest since I first encountered him in the Learning Analytics MOOC we ran earlier this year [launch webinar replay]. Here and elsewhere, he expresses concerns that I share about the ‘Wal-Martification’ of College through crude analytics that do not do justice to the true complexity of learning and teaching.

In a post last summer entitled “Analytics” interventions, he provides some historical perspective on efforts to formalize notions of learning in order to facilitate scalable assessment through the Educational Testing Service.

As I read them, his concerns about analytics resonate with how I’m trying to map analytics to assessment regimes (Melanie Booth makes this connection as well, and see the foundational work of the Assessment Reform Group), and with complex systems thinking.

The line of argument goes something like this.

  1. Learning analytics are intended to improve student success. Analytics are, consequently, always designed with a particular conception of ‘success’, which defines the patterns deemed to be evidence of progress, and hence the data that should be captured. The primary driver of mainstream teaching practice, and hence the learner’s experience, is the assessment regime. Learning analytics operating at the detailed process level of individual learning traces are in essence a new assessment technology, capable (at their best) of providing personalized, timely, specific, actionable feedback.
  2. Since assessment regimes are a hotly contested issue within educational research and policy, by extension, an intelligent approach to learning analytics must engage with this debate, making clear what assessment regimes and pedagogical commitments a given learning analytic promotes. Due to the complexity of implementing good assessment for learning [1], designing tools of this sort remains the primary challenge for learning analytics researchers [2]. The promise is that done well, analytics could be the key enabler for delivering formative assessment for learning at scale, placing new kinds of tools in the hands of learners [3].
  3. Information systems filter and categorise the world. When done well, simplified models help us grasp overwhelming complexity, but done badly they legitimise systematic analyst/organizational/societal amnesia [4]. A marker of the health of the learning analytics field will be the quality of debate around what the technology renders visible and leaves invisible.
  4. So the risk that Gardner Campbell sees, and which I share, is this: because learning is complex, and formative assessment is hard to do well (never mind formalize into computationally tractable form), analytics could freeze, or even turn back the clock on, all that we have learnt about deep, authentic learning, because crude performance indicators are set based on what can currently be measured.

I think I’m less pessimistic than Gardner, however. The evidence from the excellent July/August EDUCAUSE Review special issue (parts 2–3 to come) is that educational institutions can certainly benefit from the varied armoury of business intelligence tools. But we need to go beyond conventional measures of success. For me, the most exciting work in learning analytics will be grappling with the pedagogies and assessment regimes now emerging for learning in open, social contexts. How will analytics get a grip on the higher-order qualities that we seek to instill in learners, to equip them to see differently, think differently, seize learning opportunities when they present themselves, and thrive when confronted with turbulent, uncertain, messy social dilemmas?

Coupled with analytics, complex systems thinking, learning dispositions, assessment for learning in its richest sense, and leadership development seem like key elements for effecting systemic shifts [5].

[1] Assessment for Learning: 10 Principles. Assessment Reform Group (assessment-reform-group.org). 2002. http://assessmentreformgroup.files.wordpress.com/2012/01/10principles_english.pdf

[2] Booth, M., Learning Analytics: The New Black. EDUCAUSE Review Online, July/Aug., (2012). http://www.educause.edu/ero/article/learning-analytics-new-black

[3] Buckingham Shum, S. and Ferguson, R., Social Learning Analytics. Educational Technology & Society (Special Issue on Learning Analytics, Eds.: G. Siemens & D. Gašević), (In Press). Eprint: http://oro.open.ac.uk/34092

[4] Bowker, G. C. and Star, S. L. Sorting Things Out: Classification and Its Consequences. MIT Press, Cambridge, MA, 1999.

[5] Learning Emergence Network: LearningEmergence.net