Computational “assessment” of writing elicits strong reactions from many educators. For sceptics, handing over to a machine the task of giving feedback on, or even grading, writing crosses a boundary marking the limits of artificial intelligence (AI). It also raises the fears that any disruptive technology brings, around redefining roles and identities in a profession.
Effective writing is not only central to education and the workplace, but a lifelong citizenship competency for engaging in society. Many academic disciplines are concerned with building learners’ skills in critical review, conceptual synthesis, reasoning, and disciplinary/professional reflection. In these subjects, writing is arguably the primary window onto the mind of the learner. Huge effort is invested in literacy from the earliest schooling, extending for many into higher education. Yet educators and employers alike recognise the challenge of cultivating this ability in graduates, with poor written communication skills a common cause of complaint.
Extending beyond scholarly, academic writing, many educators also have a keen interest in disciplined, autobiographical reflective writing as a way for students to review and consolidate their learning, thus providing a means for assessing the deepest kinds of shifts that can occur in learner agency and epistemology. Such approaches are also common in the training and development of professional reflective practitioners.
Writing is, however, labour-intensive and slow to assess, demanding for students to learn, and not something that all educators can coach well, or even consider part of their job. It is in addressing these systemic limitations that natural language processing (NLP) is attracting significant educational interest, and commercial investment.
As NLP moves out of the labs and into mainstream products, and as it becomes a mainstream topic in the learning analytics community, we have the opportunity, and the challenge, of harnessing language technologies and delivering them in ways that genuinely enhance learning.
NLP is of course the key enabling capability, but it is just one piece of the puzzle for an effective learning analytics solution: it needs to be tuned by theories of how writing and learning shape each other, by the scholarship of teaching writing, by appropriate pedagogical practices and user interface design, and by evidence from empirical evaluation of the total system, not just algorithmic metrics.
The learning analytics community should be in a position to guide educators and students on the evidence of impact in this new space. What questions should be asked before buying a new product or trialling a new research prototype? What are the options for evaluating such tools? What staff competencies are required to ensure that such tools have the maximum chances of success? Do students need orientation or training? These are the often-ignored costs surrounding a potentially disruptive technology.
Promises and pitfalls
Ultimately, educators and students must trust these tools, and the effort of learning a new tool must pay back. The research question is whether the scepticism described above is justified.
Writing analytics share much the same potential and pitfalls as other learning analytics applications.
- The promise is 24/7, personalised feedback at scale, which exceeds what is possible with the limited resources normally available to students. Only a privileged minority of students have access to detailed, timely feedback as they draft texts.
- What counts as a pitfall depends on how one frames the design problem. I’d like to propose a critical, whole-systems perspective in which the definition of “the system” and of “success” is not restricted to information retrieval (IR) metrics such as precision and recall, but recognises the many wider issues that aid or obstruct analytics adoption in educational settings: theoretical and pedagogical grounding, usability, user experience, stakeholder engagement in design, practitioner development, organisational infrastructure, policy and ethics.
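For readers unfamiliar with those IR metrics: precision is the fraction of items a system flags that a human annotator agrees with, and recall is the fraction of human-flagged items the system finds. A minimal sketch, with invented sentence ids (nothing here comes from any real writing analytics system):

```python
def precision_recall(predicted, actual):
    """Compute precision and recall for a set of flagged items.

    predicted: set of item ids the system flagged
    actual:    set of item ids a human annotator flagged
    """
    true_positives = len(predicted & actual)
    precision = true_positives / len(predicted) if predicted else 0.0
    recall = true_positives / len(actual) if actual else 0.0
    return precision, recall

# Hypothetical example: sentence ids flagged as containing reflective moves
system_flags = {1, 2, 5, 7}
human_flags = {2, 3, 5, 7, 9}

p, r = precision_recall(system_flags, human_flags)
print(round(p, 2), round(r, 2))  # 0.75 0.6
```

The whole-systems point is that a tool could score well on both numbers and still fail in the classroom, for all the adoption reasons listed above.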
Towards critical, systemic perspectives
In such a critical perspective, writing analytics are problematised. Technical, educational and design thinking need to come together to address the range of issues opened up for inquiry:
- Pedagogically-grounded requirements for language technologies to support a specific genre of writing (even if these are extraordinarily challenging)
- Design and validation of analytics for different genres of academic writing (e.g. literature review; debate analysis; personal reflection)
- The relationship between assessment regime and choice of writing analytics (e.g. summative grading for high stakes tests; formative feedback on open ended reflection; individual versus collaborative peer review)
- Arguments for the potential benefits (or damage) of engaging with writing analytics (e.g. Might rapid feedback disrupt critical reflection processes? Is automated feedback perceived differently by students than feedback from a human?)
- Compelling (even fun?) user interfaces for engaging with automated writing feedback (e.g. annotations; visualisations of content and structure)
- Empirical evaluations of research prototypes and commercial products
- Principles for embedding software tools into practice (e.g. student and staff orientation; common misconceptions)
- Organizational adoption case studies
- Ethical issues specific to writing analytics (e.g. given the range of ideas and emotions that can be expressed)
Reflective writing for wholistic education
Last week Georgetown University convened their second Formation by Design (FxD) symposium, a follow-on to the first gathering last year (interim report). FxD is convened by Randy Bass (Vice-Provost, Education), as part of a university-wide initiative framing the future of higher education at their institution as a design problem.
“The core purpose of the Formation by Design Project is to move formational learning (whole person learning) to the center of higher education at a watershed moment. Our goal is to respond to the challenges that the current landscape poses to an integrative and holistic vision of education by creatively redesigning dimensions of the university to ensure that formational learning can both thrive into the future and be extended to an ever-expanding and diversifying population seeking higher education. In order to take advantage of this opportunity we have to reframe formation in the context of the new ecology of learning. In this context, the Project seeks to make an impact in three areas:
Valuing Formation: How do we define formation so that it accounts for the expanding skill-set and wider outcomes of a liberally-educated person in this century? How can we make formation visible as a core educational goal in ways that respond to the emerging learning ecosystem?
Designing for Formation: How can we develop strategies and identify models for integrating formation into the core practices of institutions of higher education? What are some promising learning designs and technologies that foster a broader sense of purpose and human capability appropriate to the new contexts of globalization, complexity, and social connection?
Measuring and Assessing Formation: How do we assess and measure the impact of formational education in reasonably systematic ways, both to demonstrate the value of learning designs and for continuous improvement of them? How might we develop an integrative approach to assessment and measurement tuned to the emerging digital environment that can make learning, and the data from learning processes, visible and usable in new ways?”
However, this is bigger than Georgetown: along the way, Randy has convened Reinvent University for the Whole Person — a stimulating series of video roundtable conversations reimagining education fit for our times.
FxD are particularly interested in reflective writing as a site for learning analytics that fits their wholistic vision of deeper learning, and so invited me to contribute a ‘provocation’ to the symposium, to spark discussion about the roles of analytics in a conception of university education which has a strong liberal arts tradition, and is relatively non-technical compared to other fields and traditions.
Towards reflective writing analytics?
A long-term and very fruitful collaboration with Ágnes Sándor in the Parsing and Semantics research group at the Xerox Research Centre Europe (XRCE) is enabling us to trial a range of parsers in educational settings. The collaboration began at the UK Open University with Anna De Liddo in our work on Contested Collective Intelligence (webinar/paper), and continues in Duygu Bektik’s PhD (latest news). These trials have focused to date on the relatively mature parser that XRCE has developed for analysing the formal, analytical academic writing found in peer-reviewed publications (they have analysed writing in fields including the social sciences, genomics and bioinformatics).
Following my move to UTS, we are continuing to test their parser, developing and evaluating prototype tools (i) to give students rapid formative feedback on their writing, (ii) to provide educators with clues to the quality of the writing, and (iii) to serve as new kinds of qualitative/quantitative analytical tools for researchers. I am greatly enjoying working with the Academic Language & Learning Group in the Institute for Interactive Media & Learning (IML), who collaborate with academics to embed academic literacies across the curriculum. The new reflective writing parser we are developing is grounded in work by Rosalie Goldsmith.
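To give a flavour of what such tools look for (though not how they actually work: a real parser uses far richer linguistic analysis than surface pattern matching), here is a deliberately naive sketch of cue-phrase tagging for reflective moves in a draft. The move categories and patterns are invented for illustration and are not XRCE’s, or ours:

```python
import re

# Invented cue phrases for illustration only.
CUES = {
    "self-reference": r"\bI (felt|realised|realized|noticed|assumed)\b",
    "change":         r"\b(changed my mind|now see|differently)\b",
    "future intent":  r"\b(next time|in future|I will try)\b",
}

def tag_sentences(text):
    """Return (sentence, [matched move labels]) pairs for a draft."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    tagged = []
    for s in sentences:
        labels = [label for label, pat in CUES.items()
                  if re.search(pat, s, re.IGNORECASE)]
        tagged.append((s, labels))
    return tagged

draft = ("At first I assumed the client brief was complete. "
         "Talking to my team, I realised I had missed key constraints. "
         "Next time I will try to check assumptions earlier.")

for sentence, labels in tag_sentences(draft):
    print(labels, "-", sentence)
```

Even this toy version hints at why formative feedback is plausible: sentence-level tags can be surfaced to a student as annotations, inviting them to notice which reflective moves their draft is missing.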
The PhD work of Thomas Ullmann and Andrew Gibson provides two further examples of initiatives in reflective writing analytics, and the fifth annual Workshop on Awareness and Reflection in Technology Enhanced Learning is coming up, linking others who share a broad interest in reflection for learning.
The Twitter dialogue from the FxD symposium with Gardner Campbell has helped me better articulate this, and I look forward to continuing such dialogue.
— SimonBuckinghamShum (@sbuckshum) June 16, 2015
UPDATE 27 FEB 2016: The argument above has since been developed, and motivates the workshop Critical Perspectives on Writing Analytics.
The work on analytics for reflective writing is now documented as:
Buckingham Shum, S., Sándor, Á., Goldsmith, R., Wang, X., Bass, R. and McWilliams, M. (2016, in press). Reflecting on Reflective Writing Analytics: Assessment Challenges and Iterative Evaluation of a Prototype Tool. Proceedings of the 6th International Learning Analytics & Knowledge Conference (LAK16), Edinburgh, UK. ACM Press. http://dx.doi.org/10.1145/2883851.2883955 Preprint: http://bit.ly/LAK16paper