PhD Scholarship: Writing Analytics for Deep Reflection

Supervisors

Ming Liu and Simon Buckingham Shum (UTS:CIC), Cherie Lucas (UTS:Pharmacy)
The supervision team for this PhD is a partnership between CIC and the School of Pharmacy, who together have pioneered reflective writing analytics.

Visit the CIC PhD Scholarships page for full details. Please email us to express interest, ask any questions, and if we can see a potential fit we’ll advise you on writing your proposal.

The Challenge

The societal challenge:

“We do not learn from experience… we learn from reflecting on experience.”
(paraphrasing John Dewey)

The problems now confronting society place an unprecedented urgency on learning from experience. Such is the pace of change that citizens and professionals in all sectors find themselves immersed in novel, complex problems before they can plan for them. Moreover, in education, the learning sciences tell us that crafting authentic experiences is a powerful trigger for learning.

“White water is the new normal”, as they say. But as Dewey noted, critical to this is our capacity to reflect on the experience of shooting those rapids. If we can’t learn how we could do better next time — individually and collectively — we are in deep trouble. From school age students, through higher education, and into professional leadership, we have to make sense of challenging experiences, recognise how we were challenged, how we are changing, and how we can improve.

At the heart of deep learning is our sense of identity. People rarely shift from entrenched positions by force of argument alone. However, when we undergo challenging experiences that force us to question assumptions and worldviews at the heart of our identity, this can indeed be transformational if we are assisted in making sense of it, and can emerge with our identity intact but now under reconstruction. Without such shifts, it’s hard to see how we will move beyond current polarisations around how we relate to each other, and the planet. Given our current political and cultural climate, applied research to help people reflect on how they adjust to threatening transitions is both timely and of first-order importance.

So, we need to get better at deep reflection, and clearly, there’s nothing as valuable as detailed feedback to provoke further reflection. But providing such feedback is a scarce skill and very labour-intensive. The practical consequence is that most students and leaders do not understand what good reflective writing is, and do not receive good feedback. For these reasons, there is interest across educational and professional sectors in the potential of automated techniques to deliver real-time, personalised coaching.

In sum, this PhD is fundamentally about harnessing computational intelligence to deepen human learning in contexts spanning formal education, professional practice, and community transformation.

The writing challenge:

Effective written communication is an essential skill which promotes educational success for university students. However, far too many students have never had the features of good rhetorical moves explained well to them, and most educators are subject matter experts, not skilled writing coaches (Lucas, Gibson & Buckingham Shum, 2018). CIC initiated its Academic Writing Analytics (AWA) project in 2015, as it became clear through consultations across faculties that student writing was a strategically important area for UTS teaching and learning (and indeed, for most other educational institutions). The goal is to more effectively teach the building blocks of good academic writing by providing instant, personalised, actionable feedback to students about their drafts (Knight, Buckingham Shum, Ryan, Sándor, & Wang, 2018).

To deliver on this vision requires integrated expertise including natural language processing, linguistics, academic language pedagogy, learning design, feedback design, user experience, and cloud computing. This is truly a transdisciplinary effort, which has been enormously stimulating. To date, we have worked on critical, argumentative, analytical writing of the sort typically found in literature reviews, persuasive essays and research articles, as well as reflective writing, in which learners make sense of their workplace experiences, try to integrate this with their academic understanding, and share their own uncertainties, emotions and sense of personal challenge/growth (Gibson, Aitken, Sándor, Buckingham Shum, Tsingos-Lucas, & Knight, 2017).

Learning Analytics tools are most effective when co-designed with effective Learning Designs: the features constructed by the analytics align with the assessment criteria, and the tool is coherently embedded in authentic student learning tasks. Our program has demonstrated how this can be accomplished (Knight, Shibani & Buckingham Shum, 2018; Shibani, Knight, Buckingham Shum & Ryan, 2017).

Depending on your interests and skillset, the critical advances this PhD could make range across technical and pedagogical contributions to educational technology:

Technical:

    • Integration between rule-based modelling and machine learning approaches
    • Accelerated customisation of the parsers to different disciplinary domains and genres of writing
    • Definition of new computational proxies that can serve as indicators of deep reflection
    • User experience and machine learning to enable user feedback that teaches the tool when it makes errors
    • Curation of text corpora to advance the field

Pedagogical:

    • Design and validation of analytics-augmented learning design patterns
    • Radical improvements in the user experience of automated feedback, e.g. through novel educator/student co-design processes, or user interfaces

Analytics Approaches

We currently implement the underlying concept matching model using a rule-based grammar and human-curated lexicons, which, for those not familiar with this kind of work, brings both pros and cons (Buckingham Shum et al., 2017; Ullmann, 2017). The rules are grounded in scholarly literature on the features of academic research writing, and have been tested on diverse texts by the team through close manual analysis. The lexicons can be edited to tune them to the language used in different disciplines and subjects. This relatively traditional AI approach provides familiar intellectual credentials when introducing the system to educators, and when we’re testing it, the underlying behaviour is easier to explain, and errors can be diagnosed very precisely. However, it brings the limitations associated with any rule-based approach: given the richness of open-ended reflective writing, there are exception cases to debug, and improvements to the system’s performance require manual edits to the rules and lexicon.
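
For readers unfamiliar with this style of system, the following minimal sketch illustrates the general idea of lexicon-plus-rule matching. It is not the AWA implementation: the two reflective “moves”, the lexicon entries and the first-person rule are purely illustrative assumptions.

```python
import re

# Hypothetical, hand-curated lexicons for two illustrative reflective "moves".
# A real system uses a far richer grammar and discipline-tuned vocabularies.
LEXICONS = {
    "challenge": {"struggled", "difficult", "confused", "uncertain", "challenging"},
    "change":    {"realised", "learned", "now understand", "will try", "differently"},
}

FIRST_PERSON = re.compile(r"\b(I|my|me|we|our)\b", re.IGNORECASE)

def tag_sentence(sentence: str) -> list:
    """Return the reflective moves whose rule fires for this sentence.

    Illustrative rule: a move is flagged when the sentence contains a
    first-person marker AND at least one phrase from that move's lexicon.
    """
    lowered = sentence.lower()
    tags = []
    for move, phrases in LEXICONS.items():
        if FIRST_PERSON.search(sentence) and any(p in lowered for p in phrases):
            tags.append(move)
    return tags

if __name__ == "__main__":
    text = "I struggled with the dosage calculations, but I now understand where I went wrong."
    print(tag_sentence(text))  # ['challenge', 'change']
```

The appeal and the limitation of the approach are both visible even in this toy: the behaviour is fully inspectable and any error can be traced to a specific rule or lexicon entry, but every exception case requires a manual edit.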

We are now beginning work to investigate whether a machine learning approach can augment the current infrastructure (Liu et al., 2019; Ullmann, 2019). In recent years, the availability of “big data”, such as large question-answering datasets (Rajpurkar, Zhang, Lopyrev, & Liang, 2016), together with effective machine learning algorithms such as deep neural networks (LeCun, Bengio, & Hinton, 2015), has seen data-driven approaches attract great attention in natural language processing tasks such as neural text summarisation (See, Liu, & Manning, 2017) and neural machine translation (Bahdanau, Cho, & Bengio, 2014), mainly because these approaches do not require human-defined rules and have good generalisation power. However, such data-driven approaches require large amounts of data, and some statistical learning models, such as deep neural networks, are not easy to interpret. Preliminary results are reported in Liu et al. (2019).
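
By way of contrast, the sketch below shows one generic way a data-driven classifier could be trained, assuming a labelled corpus of reflective statements is available. The toy sentences, the two labels and the TF-IDF plus logistic regression pipeline are illustrative placeholders, not the approach reported in Liu et al. (2019).

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Placeholder labelled sentences; a real corpus would contain thousands of
# annotated reflective statements coded against an agreed scheme.
sentences = [
    "I found the consultation confronting and doubted my own judgement.",
    "The lecture covered the pharmacokinetics of beta blockers.",
    "Next placement I will ask for feedback earlier instead of guessing.",
    "The assignment is due on Friday at 5pm.",
]
labels = ["reflective", "descriptive", "reflective", "descriptive"]

# Simple baseline: bag-of-words TF-IDF features feeding a linear classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(sentences, labels)

print(model.predict(["I was unsure whether I had communicated the risks clearly."]))
```

Performance here depends entirely on the size and quality of the annotated corpus, which is why corpus creation, and its ethics, matters so much for this line of work.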

Therefore, we invite your proposals as to which techniques might be best suited to this challenge. What’s more, the creation of a corpus of reflective writing raises ethical challenges, and we invite your thoughts on what these are and how we might address them.

You will work in close collaboration with one or more academics from other faculties/units at UTS, using co-design methods with academics and potentially with external partners, with opportunities for synergy with existing projects and tools as described on the CIC website. For more information about ongoing research in this area, please visit the Academic Writing Analytics homepage and the Writing Analytics blog.

Resources to help you understand the current state of the technology and its educational applications include the references cited below, plus the Academic Writing Analytics homepage and the Writing Analytics blog linked above.

Candidates

We’re looking for the broad skills and dispositions that we seek in all candidates (see CIC’s PhD homepage). In addition, we envisage that applicants will come either from strong technical backgrounds, passionate to see these skills make a difference in education, or from strong educational backgrounds, seeking to shape the design of analytics/AI.

Core strengths that we expect from applicants with technical backgrounds:

  • A Masters degree, Honours distinction or equivalent with above-average grades in computer science, mathematics, statistics, or a related field
  • Analytical, creative and innovative approach to solving problems
  • Strong interest in designing and conducting quantitative, qualitative or mixed-method studies
  • Knowledge and experience of natural language processing/text analytics
  • Strong programming skills in at least one relevant language (e.g. C++, .NET, Java, Python)
  • Experience with statistical, data mining, deep learning, or data science tools (e.g. R, Weka, TensorFlow, ProM, RapidMiner).

Core strengths that we expect from applicants with educational backgrounds:

  • A Masters degree, Honours distinction or equivalent with above-average grades in teaching, educational theory, instructional design, or the learning sciences (ideally with experience in the pedagogy and scholarship of writing)
  • Knowledge and experience in Design-Based Research, or a related methodology for authentic design and evaluation of educational technologies
  • Qualitative and quantitative data collection and analysis skills

It is advantageous if you can evidence:

  • Skill in working with non-technical clients to involve them in the design and testing of software tools
  • Peer-reviewed publications
  • Design and implementation of user-centred software

Interested candidates should contact the team to open a conversation: Ming.Liu@uts.edu.au; Simon.BuckinghamShum@uts.edu.au; Cherie.Lucas@uts.edu.au. We will discuss your ideas to help you sharpen your proposal, which will be competing with others for a scholarship. Please follow the application procedure for the submission of your proposal.

References

Bahdanau, D., Cho, K., and Bengio, Y. (2014). Neural Machine Translation by Jointly Learning to Align and Translate. In Proceedings of the 3rd International Conference on Learning Representations.

Buckingham Shum, S., Sándor, Á., Goldsmith, R., Bass, R., and McWilliams, M. (2017). Towards Reflective Writing Analytics: Rationale, Methodology and Preliminary Results. Journal of Learning Analytics, 4(1), 58–84.

Gibson, A., Aitken, A., Sándor, Á., Buckingham Shum, S., Tsingos-Lucas, C., and Knight, S. (2017). Reflective Writing Analytics for Actionable Feedback. In Proceedings of LAK17: 7th International Conference on Learning Analytics & Knowledge.

Knight, S., Shibani, A. and Buckingham Shum, S. (2018). Augmenting Formative Writing Assessment with Learning Analytics: A Design Abstraction Approach. London Festival of Learning (ICLS/AIED/L@S Tri-Conference Crossover Track), London (June 2018).

Knight, S., Buckingham Shum, S., Ryan, P., Sándor, Á., and Wang, X. (2018). Designing Academic Writing Analytics for Civil Law Student Self-Assessment. International Journal of Artificial Intelligence in Education, 28(1), 1–28.

Liu, M., Buckingham Shum, S., Mantzourani, E., and Lucas, C. (2019). Evaluating Machine Learning Approaches to Classify Pharmacy Students’ Reflective Statements. In Proceedings of AIED 2019: 20th International Conference on Artificial Intelligence in Education, June 25–29, 2019, Chicago, USA. Lecture Notes in Computer Science & Artificial Intelligence. Springer.

Lucas, C., Gibson, A. and Buckingham Shum, S. (2018). Utilization of a novel online reflective learning tool for immediate formative feedback to assist pharmacy students’ reflective writing skills. American Journal of Pharmaceutical Education.

LeCun, Y., Bengio, Y., and Hinton, G. (2015). Deep Learning. Nature, 521(7553), 436–444.

Rajpurkar, P., Zhang, J., Lopyrev, K., and Liang, P. (2016). SQuAD: 100,000+ Questions for Machine Comprehension of Text. In Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing.

See, A., Liu, P. J., and Manning, C. D. (2017). Get To The Point: Summarization with Pointer-Generator Networks. In Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics.

Shibani, A., Knight, S., Buckingham Shum, S., and Ryan, P. (2017). Design and Implementation of a Pedagogic Intervention Using Writing Analytics. In Proceedings of the 25th International Conference on Computers in Education. New Zealand: Asia-Pacific Society for Computers in Education.

Ullmann, T. D. (2019). Automated Analysis of Reflection in Writing: Validating Machine Learning Approaches. International Journal of Artificial Intelligence in Education, 1–41.
