Shibani Antonette is a Lecturer at the Faculty of Transdisciplinary Innovation, UTS. She obtained her PhD in Learning Analytics from the Connected Intelligence Centre, UTS.
With a background in computer science engineering, she researches applied areas of technology and data science, with a focus on text analytics. In her doctoral research in Learning Analytics, she worked on automated writing feedback and its application in higher education classrooms. Having established a strong research profile in Learning Analytics and artificial intelligence in education, Shibani has presented her work at various international conferences and delivered talks across institutions. She is an executive member of the Society for Learning Analytics Research (SoLAR) and co-hosts the podcast ‘SoLAR Spotlight: Conversations on Learning Analytics’.
Prior to joining UTS, Shibani worked as a Research Associate at Nanyang Technological University, Singapore, where she completed her Master's in Information Studies. She has also worked in industry as a Programmer Analyst in data warehousing and business intelligence.
- 2019 – Ongoing: Executive Member of the Society for Learning Analytics Research (SoLAR) and Chair of Special Interest Groups within SoLAR.
- 2018: Board of Studies Member (Student Representative) for the Connected Intelligence Centre, University of Technology Sydney, Australia.
- Program Committee member and reviewer for: 9th International Learning Analytics and Knowledge Conference (LAK 2019); Special Track on Big Data, Analytics & Machine Learning in Education at IEEE TALE 2018; 26th & 27th International Conference on Computers in Education (ICCE); 8th International Learning Analytics and Knowledge Conference (LAK 2018).
- Reviewer for journals and conferences including: Journal of Learning Analytics; IEEE Transactions on Learning Technologies; Journal of Computer Assisted Learning; Learning, Education and Families track at the ACM CHI Conference on Human Factors in Computing Systems (CHI '19); 13th International Conference of the Learning Sciences (ICLS 2018); Eye Tracking Enhanced Learning, a workshop on eye-tracking experiences in educational technology research, co-located with EC-TEL 2017.
- Organizing team member for: 12th International Conference of the Learning Sciences (ICLS 2016); Redesigning Pedagogy International Conference 2015, organized by the National Institute of Education, NTU, Singapore.
Scholarships and Awards received
- Future Women Leaders Conference Award (2019), for early-career female leaders actively involved in Engineering and IT academia.
- ACM-W scholarship from the Association for Computing Machinery, supporting Women in Computing to attend the International Learning Analytics and Knowledge 2019 conference.
- Doctoral Consortium Scholarship from the Society for Learning Analytics Research to attend the International Learning Analytics and Knowledge 2018 Conference.
- Vice Chancellor’s Conference Fund from the University of Technology Sydney to attend the International Conference on Computers in Education 2017.
- Merit Scholarship from the Asia-Pacific Society for Computers in Education, in recognition of outstanding performance as a doctoral student, at the International Conference on Computers in Education 2017.
Can supervise: YES
- Learning analytics
- Educational data science
- Artificial intelligence in education
- Data for social good
Shibani has taught both undergraduate and postgraduate subjects at UTS, including:
Master of Data Science and Innovation subjects
- 36100 Data Science for Innovation
- 36106 Data, Algorithms and Meaning
- 36102 iLab 1
- 36105 iLab 2
- 36201 Arguments, Evidence and Intuition
© 2020 Elsevier Inc. Failing to understand the perspectives of educators, and the constraints under which they work, is a hallmark of many educational technology innovations' failure to achieve usage in authentic contexts, and sustained adoption. Learning Analytics (LA) is no exception, and there are increasingly recognised policy and implementation challenges in higher education for educators to integrate LA into their teaching. This paper contributes a detailed analysis of interviews with educators who introduced an automated writing feedback tool in their classrooms (triangulated with student and tutor survey data), over the course of a three-year collaboration with researchers, spanning six semesters' teaching. It explains educators' motivations, implementation strategies, outcomes, and challenges when using LA in authentic practice. The paper foregrounds the views of educators to support cross-fertilization between LA research and practice, and discusses the importance of cultivating educators' and students' agency when introducing novel, student-facing LA tools.
Shibani, A, Koh, E, Lai, V & Shim, KJ 2017, 'Assessing the Language of Chat for Teamwork Dialogue', Educational Technology & Society, vol. 20, no. 2, pp. 224-237.
Buckingham Shum, S & Shibani, A 2019, 'Learning Analytics Growing Pains – Socio-technical Infrastructure Changes as LA Tools Mature', Australian Learning Analytics Summer Institute, Wollongong, Australia.
Shibani, A, Knight, S & Shum, SB 2019, 'Contextualizable learning analytics design: A generic model and writing analytics evaluations', Proceedings of the 9th International Conference on Learning Analytics & Knowledge (LAK'19), International Conference on Learning Analytics and Knowledge, ACM, USA, pp. 210-219.
© 2019 Copyright is held by the owner/author(s). Publication rights licensed to ACM. A major promise of learning analytics is that through the collection of large amounts of data we can derive insights from authentic learning environments, and impact many learners at scale. However, the context in which the learning occurs is important for educational innovations to impact student learning. In particular, for student-facing learning analytics systems like feedback tools to work effectively, they have to be integrated with pedagogical approaches and the learning design. This paper proposes a conceptual model to strike a balance between the concepts of generalizable scalable support and contextualized specific support by clarifying key elements that help to contextualize student-facing learning analytics tools. We demonstrate an implementation of the model using a writing analytics example, where the features, feedback and learning activities around the automated writing feedback tool are tuned for the pedagogical context and the assessment regime in hand, by co-designing them with the subject experts. The model can be employed for learning analytics to move from generalized support to meaningful contextualized support for enhancing learning.
Shibani, A, Liu, M, Rapp, C & Knight, S 2019, 'Advances in Writing Analytics: Mapping the state of the field', Companion Proceedings of the 9th International Conference on Learning Analytics & Knowledge, Arizona, USA.
Writing analytics as a field is growing in terms of the tools and technologies developed to support student writing, methods to collect and analyze writing data, and the embedding of tools in pedagogical contexts to make them relevant for learning. This workshop will facilitate discussion on recent writing analytics research by researchers, writing tool developers, theorists and practitioners to map the current state of the field, identify issues and develop future directions for advances in writing analytics.
Shibani, A 2018, 'AWA-Tutor: A Platform to Ground Automated Writing Feedback in Robust Learning Design', http://bit.ly/lak18-companion-proceedings, 8th International Learning Analytics and Knowledge Conference, Sydney, Australia.
Knight, S, Shibani, A & Buckingham Shum, S 2018, 'Augmenting Formative Writing Assessment with Learning Analytics: A Design Abstraction Approach', 13th International Conference of the Learning Sciences: Rethinking Learning in the Digital Age. Making the Learning Sciences Count, International Conference of the Learning Sciences, International Society of the Learning Sciences, London, UK, pp. 1783-1790.
Shibani, A, Knight, S & Shum, SB 2018, 'Understanding Revisions in Student Writing Through Revision Graphs', Artificial Intelligence in Education, Springer, pp. 332-336.
Text revision is regarded as an important process in improving written products. To study the process of revision activity from authentic classroom contexts, this paper introduces a novel visualization method called Revision Graph to aid detailed analysis of the writing process. This opens up the possibility of exploring the stages in students’ revision of drafts, which can lead to further automation of revision analysis for researchers, and formative feedback to students on their writing. The Revision Graph could also be applied to study the direct impact of automated feedback on students’ revisions and written outputs in stages of their revision, thus evaluating its effectiveness in pedagogic contexts.
Shibani, A 2017, 'Combining automated and peer feedback for effective learning design in writing practices', Technology and Innovation: Computer-Based Educational Systems for the 21st Century, International Conference on Computers in Education, Asia-Pacific Society for Computers in Education, New Zealand, pp. 21-24.
The provision of formative feedback has been shown to support self-regulated learning for improving students’ writing. Formative peer feedback is a promising approach, but requires scaffolding to be effective for all students. Automated tools making use of writing analytics techniques are another useful means to provide formative feedback on students’ writing. However, they should be applied through effective learning designs in pedagogic contexts for better uptake and sense-making by students. Such learning analytics applications open up the possibilities to combine different types of feedback for effective design of interventions in authentic contexts. A framework combining peer feedback and automated feedback is proposed to design effective interventions for improving student writing. Automated feedback is augmented by peer feedback for better contextual feedback and sense-making, and peer feedback is enhanced by automated feedback as scaffolding, thus complementing each other.
Shibani, A, Knight, S, Buckingham Shum, S & Ryan, P 2017, 'Design and Implementation of a Pedagogic Intervention Using Writing Analytics', Proceedings of the 25th International Conference on Computers in Education, International Conference on Computers in Education, Asia-Pacific Society for Computers in Education, Christchurch, New Zealand.
Academic writing is a key skill required for higher education students, which is often challenging to learn. A promising approach to help students develop this skill is the use of automated tools that provide formative feedback on writing. However, such tools are not widely adopted by students unless useful for their discipline-related writing, and embedded in the curriculum. This recognition motivates an increased emphasis in the field on aligning learning analytics applications with learning design, so that analytics-driven feedback is congruent with the pedagogy and assessment regime. This paper describes the design, implementation, and evaluation of a pedagogic intervention that was developed for law students to make use of an automated Academic Writing Analytics tool (AWA) for improving their academic writing. In exemplifying this pedagogically aligned learning analytic intervention, we describe the development of a learning analytics platform to support the pedagogic design, illustrating its potential through example analyses of data derived from the task.
Koh, E, Shibani, A & Hong, H 2016, 'Teamwork in the balance: Exploratory findings of Teamwork competency patterns in effective learning teams', Proceedings of International Conference of the Learning Sciences, ICLS, pp. 874-877.
© ISLS. Teamwork is an important life skill and competency for 21st century learners. It also contributes to effective learning teams. In this exploratory study, we examine the different levels of team effectiveness and describe the varying patterns of teamwork competency dimensions from students' online problem-solving chatlogs. We employed two types of measures for a holistic understanding of team effectiveness, namely, task performance scores and learners' perception of how effective they are as a team. Teams were categorized into four levels of team effectiveness based on the measures. A content analysis of teamwork competency dimensions was also performed. Our findings revealed the need for balance in teamwork competency behaviors for effective learning teams. Insights from the findings could lead to design principles for interventions to nurture teamwork competency in our 21st century learners.
Koh, E, Shibani, A, Tan, JP-L & Hong, H 2016, 'A Pedagogical Framework for Learning Analytics in Collaborative Inquiry Tasks: An Example from a Teamwork Competency Awareness Program', LAK '16 Conference Proceedings: The Sixth International Learning Analytics & Knowledge Conference, 6th International Conference on Learning Analytics and Knowledge (LAK), ACM, University of Edinburgh, Edinburgh, Scotland, pp. 74-83.
Koh, E, Tee, Y-H, Shibani, A, Tay, KL, Tan, JP-L, Hong, H, Hussin, M & Chan, HL 2016, 'Designing a web tool to support teamwork awareness and reflection: Evaluation of two trial cycles', 24th International Conference on Computers in Education (ICCE 2016) - Think Global Act Local, Asia-Pacific Society for Computers in Education, Indian Institute of Technology Bombay, Mumbai, India, pp. 139-144.
Shibani, A, Koh, E, Lai, V & Shim, KJ 2016, 'Analysis of Teamwork Dialogue: A Data Mining Approach', 2016 IEEE International Conference on Big Data (Big Data), 4th IEEE International Conference on Big Data (Big Data), IEEE, Washington, DC, pp. 4032-4034.
Shibani, A, Koh, E & Hong, H 2015, 'Text mining approach to automate teamwork assessment in group chats', ACM International Conference Proceeding Series, pp. 434-435.
© Copyright 2015 ACM. The increasing use of chat tools for learning and collaboration emphasizes the need for automating assessment. We propose a text mining approach to automate teamwork assessment in chat data. This supervised training approach can be extended to other domains for efficient assessment.
Shibani, A 2019, 'Augmenting Pedagogic Writing Practice with Contextualizable Learning Analytics'.
Academic writing is a key skill that contributes to essential learning outcomes for higher education students. Despite its importance, students often lack proficiency in writing and find it challenging to learn. While previous research suggests that students’ writing skills are enhanced through formative feedback, the time-consuming nature of providing formative feedback on individual student drafts, especially in large cohorts, makes it impractical for educators to provide detailed writing support in this way. A promising approach, therefore, is the use of writing analytics to provide automated formative feedback on writing. This particular form of learning analytics, using computational techniques and natural language processing, provides timely, immediate, and consistent automated feedback to help students improve their writing. However, for such tools to work effectively in pedagogic settings, and be adopted by practitioners, academics need to feel a sense of ownership over how the tool fits into their practice. This recognition motivates an increased emphasis on aligning learning analytics applications with learning design, so that analytics-driven feedback is congruent with the pedagogy and assessment regime.
The thesis investigates how writing practice can be augmented with a writing analytics tool called ‘AcaWriter’ by aligning it with learning design. The approach is evaluated across two disciplines in authentic higher educational settings using a design-based research approach. Mixed methods and multiple data sources are used to examine how students perceive and interact with automated feedback, and revise their writing. Based on this analysis, the thesis provides empirical evidence that students found the writing intervention and automated feedback from AcaWriter useful, and improved their subject-related writing skills, thus validating its applicability in writing contexts. It identifies varied levels of student engagement with automated feedback and...