Shibani Antonette is a Lecturer at the Faculty of Transdisciplinary Innovation, UTS. She completed her PhD in Learning Analytics at the Connected Intelligence Centre, UTS in 2019.
With a background in computer science engineering, she works on applied areas of technology and data science, with a focus on text analytics. In her doctoral research in Learning Analytics, she worked on automated writing feedback and its application in higher education classrooms. Having established a strong research profile in Learning Analytics and artificial intelligence in education, Shibani has presented her work at various international conferences and delivered talks across institutions. She co-hosts the podcast ‘SoLAR Spotlight: Conversations on Learning Analytics’ from the Society for Learning Analytics Research (SoLAR).
Prior to joining UTS, she worked at the National Institute of Education (Nanyang Technological University) in Singapore as a Research Associate.
- 2019 – Ongoing: Executive Member of the Society for Learning Analytics Research (SoLAR) and Chair of Special Interest Groups within SoLAR.
- 2018: Board of Studies Member (Student Representative) for the Connected Intelligence Centre, University of Technology Sydney, Australia.
- Program Committee member and reviewer for: 9th International Learning Analytics and Knowledge Conference (LAK 2019), Special Track on Big Data, Analytics & Machine Learning in Education in IEEE TALE 2018, 26th & 27th International Conference on Computers in Education (ICCE), 8th International Learning Analytics and Knowledge Conference (LAK 2018).
- Reviewer for journals and conferences including: Journal of Learning Analytics; IEEE Transactions on Learning Technologies; Journal of Computer Assisted Learning; the Learning, Education and Families track at the ACM CHI Conference on Human Factors in Computing Systems (CHI ’19); 13th International Conference of the Learning Sciences (ICLS 2018); and Eye Tracking Enhanced Learning, a workshop on eye-tracking experiences in educational technology research, co-located with EC-TEL 2017.
- Organizing team member for: 12th International Conference of the Learning Sciences 2016, Redesigning Pedagogy International Conference 2015 organized by National Institute of Education, NTU, Singapore.
Scholarships and Awards received
- Future Women Leaders Conference Award (2019), for early-career female leaders actively involved in Engineering and IT academia.
- ACM-W scholarship from the Association for Computing Machinery, supporting Women in Computing to attend the International Learning Analytics and Knowledge 2019 conference.
- Doctoral Consortium Scholarship from the Society for Learning Analytics Research to attend the International Learning Analytics and Knowledge 2018 Conference.
- Vice Chancellor’s Conference Fund from the University of Technology Sydney to attend the International Conference on Computers in Education 2017.
- Merit Scholarship from the Asia Pacific Society for Computers in Education, recognizing outstanding performance as a doctoral student, at the International Conference on Computers in Education 2017.
Can supervise: YES
- Learning analytics
- Educational data science
- Artificial intelligence in education
- Data for social good
Shibani has taught undergraduate and postgraduate subjects at UTS, including:
Master of Data Science and Innovation subjects
- 36100 Data Science for Innovation
- 36106 Data, Algorithms and Meaning
- 36102 iLab 1
- 36105 iLab 2
- 36201 Arguments, Evidence and Intuition
Shibani, A, Koh, E, Lai, V & Shim, KJ 2017, 'Assessing the Language of Chat for Teamwork Dialogue', Educational Technology & Society, vol. 20, no. 2, pp. 224-237.
Shibani, A, Knight, S & Shum, SB 2019, 'Contextualizable learning analytics design: A generic model and writing analytics evaluations', Proceedings of the 9th International Conference on Learning Analytics & Knowledge (LAK'19), International Conference on Learning Analytics and Knowledge, ACM, USA, pp. 210-219.
A major promise of learning analytics is that through the collection of large amounts of data we can derive insights from authentic learning environments, and impact many learners at scale. However, the context in which the learning occurs is important for educational innovations to impact student learning. In particular, for student-facing learning analytics systems like feedback tools to work effectively, they have to be integrated with pedagogical approaches and the learning design. This paper proposes a conceptual model to strike a balance between the concepts of generalizable scalable support and contextualized specific support by clarifying key elements that help to contextualize student-facing learning analytics tools. We demonstrate an implementation of the model using a writing analytics example, where the features, feedback and learning activities around the automated writing feedback tool are tuned for the pedagogical context and the assessment regime at hand, by co-designing them with the subject experts. The model can be employed for learning analytics to move from generalized support to meaningful contextualized support for enhancing learning.
Shibani, A, Knight, S & Shum, SB 2018, 'Understanding Revisions in Student Writing Through Revision Graphs', Artificial Intelligence in Education, Springer, pp. 332-336.
Text revision is regarded as an important process in improving written products. To study the process of revision activity from authentic classroom contexts, this paper introduces a novel visualization method called Revision Graph to aid detailed analysis of the writing process. This opens up the possibility of exploring the stages in students’ revision of drafts, which can lead to further automation of revision analysis for researchers, and formative feedback to students on their writing. The Revision Graph could also be applied to study the direct impact of automated feedback on students’ revisions and written outputs in stages of their revision, thus evaluating its effectiveness in pedagogic contexts.
Shibani, A 2017, 'Combining automated and peer feedback for effective learning design in writing practices', Technology and Innovation: Computer-Based Educational Systems for the 21st Century, International Conference on Computers in Education, Asia-Pacific Society for Computers in Education, New Zealand, pp. 21-24.
The provision of formative feedback has been shown to support self-regulated learning for improving students’ writing. Formative peer feedback is a promising approach, but requires scaffolding to be effective for all students. Automated tools making use of writing analytics techniques are another useful means to provide formative feedback on students’ writing. However, they should be applied through effective learning designs in pedagogic contexts for better uptake and sense-making by students. Such learning analytics applications open up the possibilities to combine different types of feedback for effective design of interventions in authentic contexts. A framework combining peer feedback and automated feedback is proposed to design effective interventions for improving student writing. Automated feedback is augmented by peer feedback for better contextual feedback and sense-making, and peer feedback is enhanced by automated feedback as scaffolding, thus complementing each other.
Koh, E, Shibani, A & Hong, H 2016, 'Teamwork in the balance: Exploratory findings of Teamwork competency patterns in effective learning teams', Proceedings of International Conference of the Learning Sciences, ICLS, pp. 874-877.
Teamwork is an important life skill and competency for 21st century learners. It also contributes to effective learning teams. In this exploratory study, we examine the different levels of team effectiveness and describe the varying patterns of teamwork competency dimensions from students' online problem-solving chatlogs. We employed two types of measures for a holistic understanding of team effectiveness, namely, task performance scores and learners' perception of how effective they are as a team. Teams were categorized into four levels of team effectiveness based on the measures. A content analysis of teamwork competency dimensions was also performed. Our findings revealed the need for balance in teamwork competency behaviors for effective learning teams. Insights from the findings could lead to design principles for interventions to nurture teamwork competency in our 21st century learners.
Koh, E, Shibani, A, Tan, JP-L & Hong, H 2016, 'A Pedagogical Framework for Learning Analytics in Collaborative Inquiry Tasks: An Example from a Teamwork Competency Awareness Program', Proceedings of the 6th International Conference on Learning Analytics and Knowledge (LAK '16), ACM, University of Edinburgh, Edinburgh, Scotland, pp. 74-83.
Koh, E, Tee, Y-H, Shibani, A, Tay, KL, Tan, JP-L, Hong, H, Hussin, M & Chan, HL 2016, 'Designing a web tool to support teamwork awareness and reflection: Evaluation of two trial cycles', 24th International Conference on Computers in Education (ICCE 2016), Asia Pacific Society for Computers in Education, Indian Institute of Technology Bombay, Mumbai, India, pp. 139-144.
Shibani, A, Koh, E, Lai, V & Shim, KJ 2016, 'Analysis of Teamwork Dialogue: A Data Mining Approach', 2016 IEEE International Conference on Big Data (Big Data), IEEE, Washington, DC, pp. 4032-4034.
Shibani, A, Koh, E & Hong, H 2015, 'Text mining approach to automate teamwork assessment in group chats', ACM International Conference Proceeding Series, pp. 434-435.
The increasing use of chat tools for learning and collaboration emphasizes the need for automating assessment. We propose a text mining approach to automate teamwork assessment in chat data. This supervised training approach can be extended to other domains for efficient assessment.