Dr Knight's research focuses on how people find, use, and evaluate evidence. His Masters in Philosophy of Education (UCL) analysed the epistemological side of assessment policy, particularly in the context of search engine use. His subsequent work takes an empirical approach to student epistemic cognition - thinking about what we know, and how - particularly in information seeking and writing contexts. Dr Knight has investigated this use of evidence, and epistemic cognition, by exploring the collaborative dialogue of small groups of children (MPhil, Cambridge) and undergraduates (PhD, Open) when using a collaborative browser add-on (Coagmento, developed at Rutgers University).
In 2019 Dr Knight was a Visiting Scholar at the Learning Analytics Research Network (LEARN), housed at the New York University Steinhardt School of Culture, Education, and Human Development, and at the University College London Institute of Education. He is the 2019 recipient of the UTS Early Career Academic Teaching and Learning Award, and the 2017 recipient of a team High Commendation for Teaching and Learning in the Masters of Data Science and Innovation.
Dr Knight is a member of the UTS STEM Education Futures Research Centre.
His full CV and publication list can be found via the personal website link below.
- Co-Editor-in-Chief of the Journal of Learning Analytics
- Editorial Board member for the International Journal of Computer Supported Collaborative Learning (ijCSCL)
- Chartered member of the British Psychological Society (CPsychol)
- Member of Society for Text and Discourse
- Member of Philosophy of Education Society of Great Britain
- Member of Society of Learning Analytics Research
- Qualified Secondary Teacher (I primarily taught A-level philosophy and psychology in the UK)
Can supervise: YES
I'm interested in how people find, use, and evaluate evidence. My current research explores student writing practices (including information seeking, reading, note taking, writing processes, and peer and self-assessment of writing). I'm particularly interested in the relationship of these practices to ways of thinking about knowledge and evaluation (epistemic cognition) and collaborative knowledge practices (including co-writing, formative feedback, collaborative information seeking, etc.). I take a systemic approach in considering the policy and practice context of writing and its analysis.
My teaching aims to develop student 'evidence literacy' - their capacity to find, create, use, and evaluate evidence in their practice.
I'm interested in supporting students very broadly in learning analytics, and particularly in topics around epistemic cognition and student writing.
I coordinate the 'Data Science for Innovation' subject in our Masters in Data Science and Innovation, and have taught/coordinated the undergraduate quantitative literacy subject 'Arguments, Evidence, and Intuition', developing a number of online modules (at UTS Open) in this area.
Knight, S 2012, ORBIT Coursebook, ORBIT, Cambridge, UK.
Mercer, N, Wegerif, R & Major, L, The Routledge International Handbook of Research on Dialogic Education, Routledge.
Knight, S & Thompson, K 2020, 'Developing a Text-Integration Task for Investigating and Teaching Interdisciplinarity in Science Teams', Research in Science Education.
© 2020, Springer Nature B.V. Integrating information from across multiple sources is an important science literacy skill that involves identifying intra- and intertextual ties, modeling relationships between sources and claims, and evaluating the claims made. Tasks that involve reading, interpreting and synthesizing multiple sources have been well explored, particularly in the epistemic cognition literature. Interdisciplinarity is a growing area of interest in science education, in terms of the ways we induct students into interdisciplinary ways of thinking and working, including the synthesis of knowledge from across scientific disciplines. While interdisciplinary contexts frequently involve connecting multiple sources from different disciplines, how students complete these text-integration tasks has not been well investigated. This paper develops a model of interdisciplinary text integration for science literacy, drawing on dimensions of epistemic cognition. We exemplify the application of this approach in a specific case of environmental science graduate students, drawing on student syntheses to illustrate how our approach can be used to differentiate between students' written syntheses.
© 2020 Australian and New Zealand Communication Association. This paper discusses the fragmented nature of media literacy and its relationship with technology. It highlights the need for standardised media literacy strategies, particularly the strands that deal with evaluation, which can help address the challenges of the current media landscape (e.g. the fake news phenomenon). Subsequently, we introduce early work towards developing a new evaluative media literacy tool that can empower media consumers to think strategically about the information they are exposed to. This tool, called Fallasigns, is based primarily on research that suggests news topics can attract specific logical flaws. Fallasigns cultivates the ability to anticipate the most likely logical and rhetorical pitfalls to emerge in a news story before being exposed to it. We argue that this strategy may work to effectively inoculate media consumers and provide a more systematic approach when evaluating information.
Knight, S, Shibani, A, Abel, S, Gibson, A, Ryan, P, Sutton, N, Wight, R, Lucas, C, Sándor, Á, Kitto, K, Liu, M, Mogarkar, RV & Shum, SB 2020, 'AcaWriter: A learning analytics tool for formative feedback on academic writing', Journal of Writing Research, vol. 12, no. 1, pp. 141-186.
© 2020, University of Antwerp. Written communication is an important skill across academia, the workplace, and civic participation. Effective writing incorporates instantiations of particular text structures-rhetorical moves-that communicate intent to the reader. These rhetorical moves are important across a range of academic styles of writing, including essays and research abstracts, as well as in forms of writing in which one reflects on learning gained through experience. However, learning how to effectively instantiate and use these rhetorical moves is a challenge. Moreover, educators often struggle to provide feedback supporting this learning, particularly at scale. Where effective support is provided, the techniques can be hard to share beyond single implementation sites. We address these challenges through the open-source AcaWriter tool, which provides feedback on rhetorical moves, with a design that allows feedback customization for specific contexts. We introduce three example implementations in which we have customized the tool and evaluated it with regard to user perceptions, and its impact on student writing. We discuss the tool's general theoretical background and provide a detailed technical account. We conclude with four recommendations that emphasize the potential of collaborative approaches in building, sharing and evaluating writing tools in research and practice.
© 2019 British Educational Research Association. Artificial intelligence and data analysis (AIDA) are increasingly entering the field of education. Within this context, the subfield of learning analytics (LA) has, since its inception, had a strong emphasis upon ethics, with numerous checklists and frameworks proposed to ensure that student privacy is respected and potential harms avoided. Here, we draw attention to some of the assumptions that underlie previous work in ethics for LA, which we frame as three tensions. These assumptions have the potential of leading either to the overcautious underuse of AIDA, as administrators seek to avoid risk, or to the unbridled misuse of AIDA, as practitioners fail to adhere to frameworks that provide them with little guidance on the problems they face in building LA for institutional adoption. We use three edge cases to draw attention to these tensions, highlighting places where existing ethical frameworks fail to inform those building LA solutions. We propose a pilot open database that lists edge cases faced by LA system builders as a method for guiding ethicists working in the field towards places where support is needed to inform their practice. This would provide a middle space where technical builders of systems could more deeply interface with those concerned with policy, law and ethics, and so work towards building LA that encourages human flourishing across a lifetime of learning.
Practitioner Notes. What is already known about this topic:
- Applied ethics has a number of well-established theoretical groundings that we can use to frame the actions of ethical agents, including deontology, consequentialism and virtue ethics.
- Learning analytics has developed a number of checklists, frameworks and evaluation methodologies for supporting trusted and ethical development, but these are often not adhered to by practitioners.
- Laws like the General Data Protection Regulation (GDPR) apply to fields like education, but the complexity...
Knight, S, Leigh, A, Davila, Y, Martin, L & Krix, D 2019, 'Calibrating Assessment Literacy Through Benchmarking Tasks', Assessment and Evaluation in Higher Education, vol. 44, no. 8, pp. 1121-1132.
In calibration tasks students assess exemplar texts using criteria against which their own work will be assessed. Typically, these tasks are used in the context of training for peer assessment. Little research has been conducted on the benefits of calibration tasks, such as benchmarking, as learning opportunities in their own right. This paper examines a dataset from a long-running benchmarking task (500 students per semester, for four semesters). We investigate the relationship of benchmarking performance to other student outcomes, including the ability to self-assess accurately. We show that students who complete the benchmarking perform better, that there is a relationship between benchmarking performance and self-assessment performance, and that students appreciate the support for learning that benchmarking tasks provide. We discuss implications for teaching and learning, flagging the potential of calibration tasks as an under-explored tool.
The importance of temporality in learning has been long established, but it is only recently that serious attention has begun to be paid to the precise identification, measurement, and analysis of the temporal features of learning. From 2009 to 2016, a series of temporality workshops explored temporal concepts and data types, analysis methods for exploiting temporal data, techniques for visualizing temporal information, and practical considerations for the use of temporal analyses in particular contexts of learning. Following from these efforts, this two-part Special Section serves to consolidate research working to progress conceptual, technical and practical tools for temporal analyses of learning data. In addition, in this second and final editorial we aim to make four contributions to the ongoing dialogue around temporal learning analytics to help us move towards a clearer mapping of the research space. First, the editorial presents an overview of the five papers in Part 2 of the Special Section on Temporal Analyses, highlighting the dimensions of data types, learning constructs, analysis approaches, and potential impact. Second, it draws on the fluid relationship between ‘analyzed time’ and ‘experienced time’ to highlight the need for caution and criticality in the purposes temporal analyses are mobilized to serve. Third, it offers a guide for future work in this area by outlining important questions that all temporal analyses should intentionally address. Finally, it proposes next steps learning analytics researchers and practitioners can take collectively to advance work on the use of temporal analyses to support learning.
© 2018, © 2018 Informa UK Limited, trading as Taylor & Francis Group. Extended and distributed cognition theories argue that human cognitive systems sometimes include non-biological objects. On these views, the physical supervenience base of cognitive systems is thus not the biological brain or even the embodied organism, but an organism-plus-artifacts. In this paper, we provide a novel account of the implications of these views for learning, education, and assessment. We start by conceptualizing how we learn to assemble extended cognitive systems by internalizing cultural norms and practices. Having a better grip on how extended cognitive systems are assembled, we focus on the question: If our cognition extends, how should we educate and assess such extended cognitive systems? We suggest various ways to minimize possible negative effects of extending one's cognition and to efficiently find and organize (online) information by adopting a virtue epistemology approach. Educational and assessment implications are foregrounded, particularly in the case of Danish students' use of the internet during exams.
© 2017 Elsevier Ltd A core concern in learning is coming to understand the ways in which claims of knowledge are made. The epistemic cognition literature typically characterises this learning in terms of how learners cognitively conceptualise the source and nature of knowledge. Recent work has offered alternative accounts of epistemic cognition that recognise the discursive nature of the construct. These accounts are derived from analysis of the ways that learners talk about knowledge in tasks such as evaluating scientific claims from sources of varying qualities. In this paper we draw on this recent work to advance a novel approach to the analysis of discourse data in epistemic contexts. This approach is exemplified through its application to an existing dataset, demonstrating both the application of the approach and the particular kinds of discourse that learners engaged in. This discursive approach has the potential for broad application in the learning sciences' treatment of epistemic cognition.
Knight, S, Buckingham Shum, S, Ryan, P, Sándor, Á & Wang, X 2018, 'Designing Academic Writing Analytics for Civil Law Student Self-Assessment', International Journal of Artificial Intelligence in Education, vol. 28, no. 1, pp. 1-28.
Research into the teaching and assessment of student writing shows that many students find academic writing a challenge to learn, with legal writing no exception. Improving the availability and quality of timely formative feedback is an important aim. However, the time-consuming nature of assessing writing makes it impractical for instructors to provide rapid, detailed feedback on hundreds of draft texts which might be improved prior to submission. This paper describes the design of a natural language processing (NLP) tool to provide such support. We report progress in the development of a web application called AWA (Academic Writing Analytics), which has been piloted in a Civil Law degree. We describe: the underlying NLP platform and the participatory design process through which the law academic and analytics team tested and refined an existing rhetorical parser for the discipline; the user interface design and evaluation process; and feedback from students, which was broadly positive, but also identifies important issues to address. We discuss how our approach is positioned in relation to concerns regarding automated essay grading, and ways in which AWA might provide more actionable feedback to students. We conclude by considering how this design process addresses the challenge of making explicit to learners and educators the underlying mode of action in analytic devices such as our rhetorical parser, which we term algorithmic accountability.
Hershkovitz, A, Knight, S, Jovanovic, J, Dawson, S & Gasevic, D 2017, 'Research with Simulated Data', Journal of Learning Analytics, vol. 4, pp. 1-2.
We draw on recent accounts of social epistemology to present a novel account of epistemic cognition that is 'socialised'. In developing this account we foreground: the normative and pragmatic nature of knowledge claims; the functional role that 'to know' plays when agents say they 'know x'; the social context in which such claims occur at a macro level, including disciplinary and cultural context; and the communicative context in which such claims occur, that is, the ways in which individuals and small groups express and construct (or co-construct) their knowledge claims. We frame prior research in terms of this new approach to provide an exemplification of its application. Practical implications for research and learning contexts are highlighted, suggesting a re-focussing of analysis on the collective level, and the ways knowledge-standards emerge from group activity, as a communicative property of that activity.
Knight, S & Mercer, N 2017, 'Collaborative, epistemic discourse in classroom information seeking tasks', Technology, Pedagogy and Education, vol. 26, no. 1, pp. 33-50.
We discuss the relationship between information seeking and epistemic beliefs – beliefs about the source, structure, complexity, and stability of knowledge – in the context of collaborative information seeking discourses. We further suggest that both the information seeking and epistemic cognition research agendas have suffered from a lack of attention to how information seeking as a collaborative activity is mediated by talk between partners – an area we seek to address in this paper. A small-scale observational study using sociocultural discourse analysis was conducted with eight eleven-year-old pupils who carried out search engine tasks in small groups. Qualitative and quantitative analyses were performed on their discussions using sociocultural discourse analytic techniques. Extracts of the dialogue are reported, informed by concordance analysis and quantitative coding of dialogue duration. We find that 1) discourse which could be characterised as 'epistemic' is identifiable in student talk, 2) it is possible to identify talk which is more or less productive, and 3) epistemic talk is associated with positive learning outcomes.
Knight, S, Rienties, B, Littleton, K, Mitsui, M, Tempelaar, D & Shah, C 2017, 'The relationship of (perceived) epistemic cognition to interaction with resources on the internet', Computers in Human Behavior, vol. 73, pp. 507-518.
© 2017 Elsevier Ltd Information seeking and processing are key literacy practices. However, they are activities that students, across a range of ages, struggle with. These information seeking processes can be viewed through the lens of epistemic cognition: beliefs regarding the source, justification, complexity, and certainty of knowledge. In the research reported in this article we build on established research in this area, which has typically used self-report psychometric and behavior data, and information seeking tasks involving closed-document sets. We take a novel approach in applying established self-report measures to a large-scale, naturalistic, study environment, pointing to the potential of analysis of dialogue, web-navigation – including sites visited – and other trace data, to support more traditional self-report mechanisms. Our analysis suggests that prior work demonstrating relationships between self-report indicators is not paralleled in investigation of the hypothesized relationships between self-report and trace-indicators. However, there are clear epistemic features of this trace data. The article thus demonstrates the potential of behavioral learning analytic data in understanding how epistemic cognition is brought to bear in rich information seeking and processing tasks.
Knight, S, Rienties, B, Littleton, K, Tempelaar, D, Mitsui, M & Shah, C 2017, 'The Orchestration of a Collaborative Information Seeking Learning Task', Information Retrieval, vol. 20, no. 5, pp. 480-505.
The paper describes our novel perspective on 'searching to learn' through collaborative information seeking (CIS). We describe this perspective, which motivated empirical work to 'orchestrate' a CIS searching to learn session. The work is described through the lens of orchestration, an approach which brings to the fore the ways in which: background context—including practical classroom constraints, and theoretical perspective; actors—including the educators, researchers, and technologies; and activities that are to be completed, are brought into alignment. The orchestration is exemplified through the description of research work designed to explore a pedagogically salient construct (epistemic cognition), in a particular institutional setting. Evaluation of the session indicated satisfaction with the orchestration from students, with written feedback indicating reflection from them on features of the orchestration. We foreground this approach to demonstrate the potential of orchestration as a design approach for researching and implementing CIS as a 'searching to learn' context.
Learning is a process that occurs over time: We build understanding, change perspectives, and develop skills over the course of extended experiences. As a field, learning analytics aims to generate understanding of, and support for, such processes of learning. Indeed, a core characteristic of learning analytics is the generation of high-resolution temporal data about various types of actions. Thus, we might expect study of the temporal nature of learning to be central in learning analytics research and applications. However, temporality has typically been underexplored in both basic and applied learning research. As Reimann (2009) notes, although "researchers have privileged access to process data, the theoretical constructs and methods employed in research practice frequently neglect to make full use of information relating to time and order" (p. 239). Typical approaches to analysis often aggregate across data due to a collection of conceptual, methodological, and operational challenges. As described below, insightful temporal analysis requires (1) conceptualising the temporal nature of learning constructs, (2) translating these theoretical propositions into specific methodological approaches for the capture and analysis of temporal data, and (3) practical methods for capturing temporal data features and using analyses to impact learning contexts. There is a pressing need to address these challenges if we are to realize the exciting possibilities for temporal learning analytics.
Knight, S, Wise, AF, Ochoa, X & Hershkovitz, A 2017, 'Learning Analytics: Looking to the Future', Journal of Learning Analytics, vol. 4, pp. 1-5.
In the seven years since the first LAK conference, Learning Analytics has grown rapidly as a field, from a small group of interested scholars and practitioners to one of the most scientifically successful and institutionally accepted areas of Learning and Educational Technologies. Learning Analytics is often referred to as a "Middle-Space" where experts from diverse fields (the Learning Sciences, Computer Science, Human-Computer Interaction, Psychology and the Behavioural Sciences, just to name a few) share their perspectives on how to better understand and optimize learning processes and environments using this new instrument called Data Science.
Deakin Crick, R, Knight, S & Barr, S 2017, 'Towards Analytics for Wholistic School Improvement: Hierarchical Process Modelling and Evidence Visualization', Journal of Learning Analytics, vol. 4, no. 2, pp. 160-188.
Central to the mission of most educational institutions is the task of preparing the next generation of citizens to contribute to society. Schools, colleges, and universities value a range of outcomes — e.g., problem solving, creativity, collaboration, citizenship, service to community — as well as academic outcomes in traditional subjects. Often referred to as "wider outcomes," these are hard to quantify. While new kinds of monitoring technologies and public datasets expand the possibilities for quantifying these indices, we need ways to bring that data together to support sense-making and decision-making. Taking a systems perspective, the hierarchical process modelling (HPM) approach and the "Perimeta" visual analytic provide a dashboard that informs leadership decision-making with heterogeneous, often incomplete evidence. We report a prototype of Perimeta modelling from education, aggregating wider outcomes data across a network of schools, and calculating their cumulative contribution to key performance indicators, using the visual analytic of the Italian flag to make explicit not only the supporting evidence, but also the challenging evidence, as well as areas of uncertainty. We discuss the nature of the modelling decisions and implicit values involved in quantifying these kinds of educational outcomes.
This issue of the Journal of Learning Analytics features three special sections that look into topics of learning analytics for 21st century skills, multimodal learning analytics, and sharing of datasets for learning analytics. The issue also features a paper that looks at models for early detection of students at risk in tertiary education. The editorial concludes with a summary of the changes in the editorial team of the journal.
Knight, S, Dawson, S, Gašević, D, Jovanović, J & Hershkovitz, A 2016, 'Learning Analytics: Richer Perspectives Across Stakeholders', Journal of Learning Analytics, vol. 3, pp. 1-4.
This issue of the Journal of Learning Analytics features seven research papers, complemented by a practitioner research paper (Dvorak & Jia). Papers by McCoy and Shih, and Knight, Brozina, and Novoselich discuss the important topic of educators working with educational data, alongside (in the latter paper) student perspectives on learning analytics. Douglas, Bermel, Alam, and Madhavan, and Waddington, Nam, Lonn, and Teasley offer empirical insight on developing a richer perspective on learning material interaction and engagement in online learning contexts (MOOCs and LMSs, respectively). Dvorak and Jia bring a practitioner perspective to the issue in their discussion of approaches to analyzing online work habits via timeliness, regularity, and intensity. Sutherland and White, and Vieira, Goldstein, Purzer, and Magana focus on specific subject-based learning activities (algebra learning, and student experimentation strategies in engineering design, respectively). Finally, Howley and Rosé discuss the complex interactions of theory and method in computational modeling of group learning processes. The issue also features a special section on learning analytics tutorials, edited by Gašević and Pechenizkiy. The editorial concludes with a report of the recent 'hot spots section' consultation from the editorial team of the journal.
Knight, S & Littleton, K 2015, 'Dialogue as Data in Learning Analytics for Productive Educational Dialogue', Journal of Learning Analytics, vol. 2, no. 3, pp. 111-143.
Accounts of the nature and role of productive dialogue in fostering educational outcomes are now well established in the learning sciences and are underpinned by bodies of strong empirical research and theorising. Allied to this there has been longstanding interest in fostering computer-supported collaborative learning (CSCL) in support of such dialogue. Learning analytic environments such as massive open online courses (MOOCs) and online learning environments (such as virtual learning environments, VLEs, and learning management systems, LMSs) provide ripe potential spaces for learning dialogue. In prior research, preliminary steps have been taken to detect occurrences of productive dialogue automatically through the use of automated analysis techniques. Such advances have the potential to foster effective dialogue through the use of learning analytic techniques that scaffold, give feedback on, and provide pedagogic contexts promoting, such dialogue. However, the translation of learning science research to the online context is complex, requiring the operationalization of constructs theorized in different contexts (often face to face), and based on different data-sets and structures (often spoken dialogue). In this paper we explore what could constitute the effective analysis of this kind of productive dialogue, arguing that it requires consideration of three key facets of the dialogue: features indicative of productive dialogue; the unit of segmentation; and the interplay of features and segmentation with the temporal underpinning of learning contexts. We begin by outlining what we mean by 'productive educational dialogue', before going on to discuss prior work that has been undertaken to date on its manual and automated analysis. We then highlight ongoing challenges for the development of computational analytic approaches to such data, discussing the representation of features, segments, and temporality in computational modelling. The paper thus foregrounds, to...
There is an increasing interest in developing learning analytic techniques for the analysis, and support, of high quality learning discourse. This paper maps the terrain of discourse-centric learning analytics (DCLA), outlining the distinctive contribution of DCLA and offering a definition for the field moving forwards. It is our claim that DCLA provides the opportunity to explore the ways in which: discourse of various forms both resources and evidences learning; small and large groups, and individuals, make and share meaning together through their language use; and particular types of language – from discipline specific, to argumentative and socio-emotional – are associated with positive learning outcomes. DCLA is thus not merely a computational aid to help detect or evidence 'good' and 'bad' performance (the focus of many kinds of analytic), but a tool to help investigate questions of interest to researchers, practitioners, and ultimately learners. The paper ends with three core issues for DCLA researchers – the challenge of context in relation to DCLA; the various systems required for DCLA to be effective; and the means through which DCLA might be delivered for maximum impact at the micro (e.g. learner), meso (e.g. school), and macro (e.g. governmental) levels.
While search engines are commonly used by children to find information, and in classroom-based activities, children are not adept in their information seeking or evaluation of information sources. Prior work has explored such activities in isolated, individual contexts, failing to account for the collaborative, discourse-mediated nature of search engine use which is common in classroom contexts. This small-scale study explored the established 'typology of talk', particularly 'exploratory talk', in a classroom search context. The authors found that the most successful pupils were those who engaged in the most exploratory talk. This finding has practical classroom implications: the collaborative nature of search and potential of collaboration and discourse should be exploited in search-based tasks. This study also indicates a rich area for future research.
Dent, H 2014, 'Wikimedia - chance to get involved', The Psychologist, vol. 27, no. 6, pp. 382-383.
Haßler, B, Hennessy, S, Knight, S & Connolly, T 2014, 'Developing an Open Resource Bank for Interactive Teaching of STEM: Perspectives of school teachers and teacher educators', Journal of Interactive Media in Education, pp. 1-24.
Much of the current literature related to Open Educational Resource (OER) development and practice concentrates on higher education, although a growing body of work is also emerging for the primary and secondary school sectors. This article examines the user perspectives of teachers and teacher educators, regarding: discovery of teaching resources; what they know about OER; their sharing practices; and their perspectives on resource quality and trust. The research was done in the context of the Open Resource Bank for Interactive Teaching (ORBIT), a JISC-funded Phase III OER project at the University of Cambridge. ORBIT is an OER Resource Bank containing more than 200 science, technology, engineering and mathematics (STEM) focused lesson ideas for primary and secondary teachers as well as serving as a resource bank of OER to be used by teacher educators in a variety of settings.
Knight, S 2014, 'The Psychologist needs you!', The Psychologist, vol. 27, no. 6, pp. 382-383.
Knight, S, Buckingham Shum, S & Littleton, K 2014, 'Epistemology, assessment, pedagogy: where learning meets analytics in the middle space', Journal of Learning Analytics, vol. 1, no. 2, pp. 23-47.
Learning Analytics is an emerging research field and design discipline which occupies the 'middle space' between the learning sciences/educational research, and the use of computational techniques to capture and analyse data (Suthers and Verbert, 2013). We propose that the literature examining the triadic relationships between epistemology (the nature of knowledge), pedagogy (the nature of learning and teaching) and assessment provide critical considerations for bounding this middle space. We provide examples to illustrate the ways in which the understandings of particular analytics are informed by this triad. As a detailed worked example of how one might design analytics to scaffold a specific form of higher order learning, we focus on the construct of epistemic beliefs: beliefs about the nature of knowledge. We argue that analytics grounded in a pragmatic, sociocultural perspective are well placed to explore this construct using discourse-centric technologies. The examples provided throughout this paper, through emphasising the consideration of intentional design issues in the middle space, underscore the "interpretative flexibility" (Hamilton & Feenberg, 2005) of new technologies, including analytics.
Knight, S 2012, 'Anger or Fairness in Ultimatum Game Rejections?', Journal of European Psychology Students, vol. 3, pp. 2-15.
Güth, Schmittberger and Schwarze's (1982) ultimatum game result is replicated with mean earnings of £59.98 (N = 51, S.D. = £11.45) from a possible £80, and a linear relationship between offer size and acceptance rate. Results indicate a significant interaction effect between offer size and response on response time, F(3, 31) = 3.69, p < 0.05. Our novel adjustment introduced the proposer's most 'common offer' to responders. Results were in accord with prior work (Knez & Camerer, 1995); social comparisons between the participant and a hypothesised responder – the receiver of the 'common offer' – were made only at mid-range offers (£2), which were accepted more from proposers making low common offers than from those making high common offers, t(45) = 3.28, p < 0.05.
Knight, S, Nguyen, HX, Falkner, N, Bowden, R & Roughan, M 2011, 'The Internet Topology Zoo', IEEE Journal on Selected Areas in Communications, vol. 29, no. 9, pp. 1765-1775.
Knight, S 2011, 'AfL for Inclusive Differentiated Learning', Practical Research in Education, vol. 44, pp. 57-63.
Knight, S 2011, 'So you want to do (free) research?', The Psychologist, vol. 24, no. 11, pp. 828-830.
Knight, S 2011, 'Using the ultimatum game to teach economic theories of relationship maintenance to A-level students', Psychology Teaching Review, vol. 17, no. 1, pp. 46-49.
Knight, S & Vainre, M 2011, 'Career Aspirations and Self-Efficacy of European Psychology Students', Psychology Teaching Review, vol. 17 (Themed Issue - Psychology around the world), no. 2, pp. 47-60.
Knight, S & Zupan, Z 2011, 'Besplatna tehnologija u psihološkim istraživanjima. (Free technology in psychological research)', Psihološka istraživanja, vol. XIV, no. 1, pp. 99-106.
Knight, S 2020, 'Augmenting Assessment with Learning Analytics' in Bearman, M, Dawson, P, Ajjawi, R, Tai, J & Boud, D (eds), Re-imagining University Assessment in a Digital World, Springer Nature, Singapore, pp. 129-145.
Learning analytics as currently deployed has tended to consist of large-scale analyses of available learning process data to provide descriptive or predictive insight into behaviours. What is sometimes missing in this analysis is a connection to human-interpretable, actionable, diagnostic information. To gain traction, learning analytics researchers should work within existing good practice, particularly in assessment, where high-quality assessments are designed to provide both student and educator with diagnostic or formative feedback. Such a model keeps the human in the analytics design and implementation loop, by supporting student, peer, tutor, and instructor sense-making of assessment data, while adding value from computational analyses.
Knight, S 2020, 'Section introduction: Dialogic education and digital technology' in Mercer, N, Wegerif, R & Major, L (eds), The Routledge International Handbook of Research on Dialogic Education, Routledge, UK, pp. 389-393.
The chapters in this section of the book focus specifically on dialogic education and digital technology. To frame this chapter, it is important to understand why there should be mutual interest among those who are interested in the role of dialogic approaches, and the role of digital technologies, in learning. At its weakest, such shared theorising is important simply because technology is increasingly available (indeed, pervasive) in our everyday lives and classrooms. On this view, technologies are more or less neutral actors to be leveraged as we wish; we should thus understand how to develop dialogic approaches in this emerging context.
However, while rapid technological change of course creates an imperative to understand the impact of that change, this narrow perspective is one that sociocultural researchers and those interested in dialogic approaches would reject. A somewhat stronger claim, then, and one made explicitly by Major and Warwick (this section), is that those who are interested in dialogic approaches to learning should be interested in digital technologies with respect to the affordances, or possibilities for action, that those technologies create for dialogue. A corollary, then, is that those interested in digital technologies should be interested in how they might develop and research tools that create or embody such affordances for dialogue and learning.
Within this context, digital tools can be seen as affording opportunities to, for example, make learning visible to students and teachers as an artefact for reflection and improvement, create a shared space in which to scrutinise ideas, and show how ideas evolve over time. Moreover, as Major and Warwick note, we care not only about the action possibilities, but also the enacted affordances for dialogue – i.e., the specific ways in which the action possibilities are implicated in the promotion of dialogic interaction for learning, and indeed, as Rasmussen et al. note, the ways that new tools provide ...
Thompson, K, Alhadad, S, Buckingham Shum, S, Howard, S, Knight, S, Martinez-Maldonado, R & Pardo, A 2019, 'Connecting expert knowledge in the design of classroom learning experiences' in Lodge, J, Cooney Horvath, J & Corrin, L (eds), Learning analytics in the classroom: Translating learning analytics research for teachers, Routledge, UK, pp. 111-128.
Learning Analytics technologies provide new ways to log, evaluate and provide feedback on learning activity. Consequently, it is clearly desirable that strong connections are forged with research and practice in established disciplines such as educational research (Gašević, Dawson & Siemens, 2015), learning theory (Friend Wise & Shaffer, 2015), learning designs (Lockyer & Dawson, 2011), tools design (Martinez-Maldonado, Pardo, Mirriahi, Yacef, Kay & Clayphan, 2016b), epistemology, pedagogy and assessment (Knight, Buckingham Shum, & Littleton, 2014). Learning, and in particular, design for learning, has been conceptualised in terms of complex networks of learners, instructors, designers, and researchers integrating physical and digital spaces (Carvalho, Goodyear & de Laat, 2017). In contemporary learning environments, learning designs must consider the role of tools in both physical and digital learning environments, and how these tools can connect to support learning and teaching. The learning design process now needs to take into account the affordances present when digital environments are capable of producing highly detailed digital traces of learner engagement.
In contemporary learning environments, there is also a pressing need to explore a tighter interdisciplinary approach in which learning sciences, learning analytics, and classroom experts develop learning designs to take full advantage of the potential benefits of this new data. Accessing data to inform teaching and learning is increasingly common during the design of learning experiences. However, how this data is interpreted will vary depending on the purpose for which it is being accessed (e.g. accountability, teaching-evaluation, or providing feedback to students, etc.) and the role of the accessor in developing or delivering the learning task (e.g. learning technologist, learning designer, tutor, lecturer, etc.). This chapter uses a case study to explore one such design scenario. In this chapter w...
Knight, S & Buckingham Shum, S 2016, 'Theory and Learning Analytics' in The Handbook of Learning Analytics and Educational Data Mining, SOCIETY for LEARNING ANALYTICS RESEARCH, UK, pp. 17-22.View/Download from: Publisher's site
The challenge of understanding how theory and analytics relate is to move "from clicks to constructs" in a principled way. Learning analytics are a specific incarnation of the bigger shift to an algorithmically pervaded society, and their wider impact on education needs careful consideration. In this chapter, we argue that by design — or else by accident — the use of a learning analytics tool is always aligned with assessment regimes, which are in turn grounded in epistemological assumptions and pedagogical practices. Fundamentally, then, we argue that deploying a given learning analytics tool expresses a commitment to a particular educational worldview, designed to nurture particular kinds of learners. We outline some key provocations in the development of learning analytic techniques: key questions to draw out the purpose and assumptions built into learning analytics. We suggest that using "claims analysis" — analysis of the implicit or explicit stances taken in the design and deployment of technologies — is a productive human-centred method to address these key questions, and we offer some examples of the method applied to those provocations.
Knight, S & Littleton, K 2015, 'Learning through Collaborative Information Seeking' in Hansen, P, Shah, C & Klas, C (eds), Collaborative Information Seeking: Best practices, New Domains, New Thoughts, Springer, Switzerland, pp. 101-116.
This chapter discusses Collaborative Information Seeking (CIS) from an educational perspective. Our core claim is that CIS has the potential to bring together rich collaborative, and multimodal, contexts in which important learning processes may take place. We thus see CIS as more than just an activity with potential to 'speed up' information seeking, or contribute to effective division of labour. This claim is independent of the particular classroom subject, or the form of technological mediation; rather, the chapter provides a focus on some key considerations in collaborative learning that should be of interest to both educators and those interested in the 'benefits' of CIS. This chapter first outlines our broad educational interest in elements of CIS, connecting that to the focal points of CIS research. We go on to highlight the importance of dialogue as a tool for learning, before discussing the complexities of understanding 'success' in CIS tasks, and then specifically the role that dialogue has played so far in CIS research. We conclude with a call to researchers in both CIS and education to explore the nature of learning in CIS contexts, making use of a rich understanding of the importance of dialogue to create meaning together.
Knight, S & Littleton, K 2015, 'Thinking, interthinking, and technological tools' in Wegerif, R, Li, L & Kaufman, JC (eds), The Routledge International Handbook of Research on Teaching Thinking, Routledge, Abingdon, UK, pp. 467-478.
Newman, K, Knight, S, Elbeshausen, S & Hansen, P 2015, 'Situating CIS – The importance of Context in Collaborative Information Seeking' in Hansen, P, Shah, C & Klas, C (eds), Collaborative Information Seeking: Best practices, New Domains, New Thoughts, Springer, Switzerland.
Collaborative Information Seeking (CIS) is common in many professional contexts. This chapter discusses CIS from four different perspectives – education, healthcare, science research and patent research. We first introduce the CIS context, focusing on Evans and Chi's proposed model of social search. We highlight the ways contextual factors relate to the search process, in particular noting the role of communication in CIS processes. The four example professional contexts are discussed with reference to the 'medium' of collaboration (the ways CIS is conducted, the tools used, and the physical setting of CIS) and the 'context' of CIS (the purposes for which an instance of CIS occurs in that discipline). We suggest that these contextual factors can be aligned with, but are additional to, the existing Evans and Chi model of social search, and that their addition in a 'pre- and post-model' extension could provide a shared framework for researching contextual features of CIS. In highlighting commonalities and contrasts across the disciplinary contexts, we suggest that a developed model, and further research, is needed to understand the relationship between motivations in these different disciplines and the evaluation of CIS episodes, and the role of processes, particularly communication, in those episodes. In order to evaluate CIS in different disciplines, future research should focus on the between- and within-discipline differences in the purposes of CIS. Characteristics of success in different disciplinary contexts may relate both to the consideration of the collaborative context and to the information need; developing a deeper understanding of this point is a priority for future work.
Knight, S 2014, 'Finding Knowledge: What Is It To 'Know' When We Search?' in Society of the Query Reader: Reflections on Web Search, Institute of Network Cultures, Amsterdam, The Netherlands, pp. 227-238.
The issue of the epistemological implications of our social and technical interactions with information is the subject of this essay. This will be specified by looking at the role of the search engine as an informant, offering testimonial knowledge on a query; at the question of how the receiver of testimony should be taken into account by those giving the information; and how we should deal with multiplicity of perspectives, or indeed gaps in our knowledge.
We should seek to understand the nature of 'knowledge', and how informants – including non-human informants – mediate our understanding of the world around us, and have always done so. This essay turns to these questions, discussing some issues with researching technological changes, and then what role search functions fulfill, and how such functions affect our own understanding of 'knowledge'.
Such an analysis has profound implications, for example in education. Under what circumstances do we accept that students 'know' something? How do we decide that they know (that is, how do educators make claims about their students' knowledge states)? And what sort of knowledge is important to know in such a situation? These are all important questions. Furthermore, how we think about the future of such technology, and the ways that technology might change what we know (for better or worse), is important.
Knight, S 2013, 'Appendix C7.1: Resources for Searching with the Internet' in Hennessy, S, Warwick, P, Brown, L, Rawlins, D & Neale, C (eds), Developing interactive teaching and learning using the IWB, Open University Press.
Knight, S 2013, 'Creating a supportive environment for classroom dialogue' in Hennessy, S, Warwick, P, Brown, L, Rawlins, D & Neale, C (eds), Developing interactive teaching and learning using the IWB, Open University Press, UK.
This chapter first considers the role of dialogue in classroom contexts, and the importance of open-ended dialogue in contrast to more traditional, closed questioning sequences. I briefly discuss the role of dialogue in individual psychological development, focussing on its importance for conceptual development in whole classes and small groups in the context of the classroom. A common – closed – sequence of classroom talk is first outlined, and then discussed in the context of 'dialogic talk' – talk which is more open, builds on prior knowledge, is supportive and collaborative in nature. The use of 'exploratory talk' – talk which focuses on the use of reasoning to build mutual understanding – is also outlined in this context.
1. What role does dialogue play in learning?
2. What form does dialogue typically take?
3. How can we make dialogue more effective?
The second part of the chapter discusses some ways to promote effective dialogue in classroom contexts. Some suggestions for creating and identifying an effective environment for classroom talk are discussed. I highlight the importance of 'ground rules' for talk, and some key words teachers might look for and emphasise in encouraging the use of 'exploratory talk'. I then discuss some ideas for ways to start effective talk in classrooms, including the use of Talking Points and effective questioning. This chapter aims to give some background on effective dialogue of relevance to subsequent chapters, which will consider particular features of the interactive whiteboard in the context of dialogue.
Arastoopour Irgens, G, Knight, S, Wise, A, Philip, T, Olivares, M, Van Wart, S, Vakil, S, Marshall, J, Parikh, T, Lopez, L, Wilkerson, M, Gutiérrez, K, Jiang, S & Kahn, J 2020, 'Data Literacies and Social Justice: Exploring Critical Data Literacies through Sociocultural Perspectives', International Society of the Learning Sciences, Nashville, Tennessee.
The ability to interpret, evaluate, and make data-based decisions is critical in the age of big data. Normative scripts around the use of data position them as a privileged epistemic form conferring authority through objectivity that can serve as a lever for effecting change. However, humans and materials shape how data are created and used which can reinscribe existing power relations in society at large (Van Wart, Lanouette & Parikh, in press). Thus, research is needed on how learners can be supported to engage in critical data literacies through sociocultural perspectives. As a field intimately concerned with data-based reasoning, social justice, and design, the learning sciences is well-positioned to contribute to such an effort. This symposium brings together scholars to present theoretical frameworks and empirical studies on the design of learning spaces for critical data literacies. This collection supports a larger discussion around existing tensions, additional design considerations, and new methodologies.
An engaged, informed citizenry is important for tackling many of the world's most pressing sustainability issues. Epistemic cognition research may play a key role in understanding and developing such capacities. Recent shifts in epistemic cognition research that draw on social epistemology are to be welcomed; however, there is further potential in drawing on a recent ethical turn in epistemology to make explicit the ethical assumptions underpinning the area of epistemic cognition. That is, epistemic cognition has ethical dimensions, including that we (1) care about consequential issues, epistemic issues that have stakes, (2) have epistemic obligations, and (3) should attend to concerns of epistemic injustice. I argue for scoping epistemic cognition to recognise these ethical turns, reflecting the significance of bringing these concerns – already present in much work – into focus for further inquiry.
Prestigiacomo, R, Hadgraft, R, Lockyer, L, Knight, S, van den Hoven, E, Martinez-Maldonado, R & Hunter, J 2020, 'Learning-centred Translucence: An Approach to Understand How Teachers Talk About Classroom Data', The 10th International Learning Analytics & Knowledge Conference, Frankfurt, Germany.
Teachers are increasingly being encouraged to embrace evidence-based practices to improve their teaching. Learning analytics (LA) offer great promise in supporting these practices by providing evidence for teachers and learners to make informed decisions and transform the educational experience. However, LA limitations and their uptake by educators are also coming under critical scrutiny, in part due to the lack of involvement of teachers and learners in the design of LA tools to understand existing educational practices. In this paper, we propose a human-centred approach to generate understanding of teachers' data needs through the lens of three key principles of translucence: visibility, awareness and accountability. We illustrate our approach through a participatory design sprint to identify how teachers talk about classroom data. We describe teachers' perspectives on the evidence they need for making better-informed decisions and discuss the implications of our approach for the design of human-centred LA in the coming years.
Deroover, K, Bucher, T & Knight, S 2019, 'A taxonomy of disagreements related to health and nutrition information', https://annualmeeting.isbnpa.org/wp-content/uploads/2019/05/ISBNPA-2019…, International Society of Behavioral Nutrition and Physical Activity, Prague, Czech Republic, pp. 826-826.
Shibani, A, Liu, M, Rapp, C & Knight, S 2019, 'Advances in Writing Analytics: Mapping the state of the field', Companion Proceedings of the 9th International Conference on Learning Analytics & Knowledge, Arizona, USA.
Writing analytics as a field is growing in terms of the tools and technologies developed to support student writing, methods to collect and analyze writing data, and the embedding of tools in pedagogical contexts to make them relevant for learning. This workshop will facilitate discussion on recent writing analytics research by researchers, writing tool developers, theorists and practitioners to map the current state of the field, identify issues and develop future directions for advances in writing analytics.
Shibani, A, Knight, S & Shum, SB 2019, 'Contextualizable learning analytics design: A generic model and writing analytics evaluations', Proceedings of the 9th International Conference on Learning Analytics & Knowledge (LAK'19), International Conference on Learning Analytics and Knowledge, ACM, USA, pp. 210-219.
A major promise of learning analytics is that through the collection of large amounts of data we can derive insights from authentic learning environments, and impact many learners at scale. However, the context in which the learning occurs is important for educational innovations to impact student learning. In particular, for student-facing learning analytics systems like feedback tools to work effectively, they have to be integrated with pedagogical approaches and the learning design. This paper proposes a conceptual model to strike a balance between the concepts of generalizable scalable support and contextualized specific support by clarifying key elements that help to contextualize student-facing learning analytics tools. We demonstrate an implementation of the model using a writing analytics example, where the features, feedback and learning activities around the automated writing feedback tool are tuned for the pedagogical context and the assessment regime in hand, by co-designing them with the subject experts. The model can be employed for learning analytics to move from generalized support to meaningful contextualized support for enhancing learning.
Alhadad, SSJ, Thompson, K, Knight, S, Lewis, M & Lodge, JM 2018, 'Analytics-Enabled teaching as design: Reconceptualisation and call for research', Proceedings of the 8th International Conference on Learning Analytics and Knowledge, International Conference on Learning Analytics and Knowledge, ACM, Sydney, New South Wales, Australia, pp. 427-435.
As a human-centred educational practice and field of research, learning analytics must account for key stakeholders in teaching and learning. The focus of this paper is on the role of institutions to support teachers to incorporate learning analytics into their practice by understanding the confluence of internal and external factors that influence what they do. In this paper, we reconceptualise 'teaching as design' for 'analytics-enabled teaching as design' to shape this discussion to allow for the consideration of external factors, such as professional learning or ethical considerations of student data, as well as personal considerations, such as data literacy and teacher beliefs and identities. In order to address the real-world challenges of progressing teachers' efficacy and capacity toward analytics-enabled teaching as design, we have placed the teacher – as a cognitive, social, and emotional being – at the center. In so doing, we discuss potential directions towards research for practice in elucidating underpinning factors of teacher inquiry in the process of authentic design.
Knight, S & Thompson, K 2018, 'Developing a Text-Integration Task for Investigating and Teaching Interdisciplinarity in Science Teams', 13th International Conference of the Learning Sciences: Rethinking Learning in the Digital Age. Making the Learning Sciences Count, International Society of the Learning Sciences, pp. 1453-1455.
Integrating information from multiple sources is an important literacy skill that involves: identifying intra- and inter-textual ties; modeling relationships between sources and claims; and evaluating the claims made. Tasks that involve reading, interpreting and synthesizing multiple sources have been explored particularly in the epistemic cognition literature. Interdisciplinarity is a growing area of interest in education, with commensurate interest in the learning sciences regarding the means by which we induct students into interdisciplinary ways of thinking and working. While interdisciplinary contexts frequently involve connecting multiple sources from different disciplines, these text-integration tasks have not been well investigated.
Abel, S, Kitto, K, Knight, S & Shum, SB 2018, 'Designing personalised, automated feedback to develop students' research writing skills', ASCILITE 2018 - Conference Proceedings - 35th International Conference of Innovation, Practice and Research in the use of Educational Technologies in Tertiary Education: Open Oceans: Learning Without Borders, pp. 15-24.
Constructive and formative feedback on writing is crucial to help Higher Degree Research (HDR) students develop effective writing skills and succeed, both in their degree and beyond. However, at the start students have a poor grasp of good academic writing, and HDR supervisors do not always have the time or the writing expertise to provide quality, constructive, formative feedback to students. One approach to address this problem is provided by Writing Analytics (WA), using text analytics to provide timely, formative feedback to students on their writing, in the process introducing a clear set of terms to describe important features of academic writing. This paper describes how Swales' (1990) Create A Research Space (CARS) model was used to extend a writing analytics tool such that it could be applied to HDR students' writing, and how good feedback practices were employed to design constructive automated feedback. This work summarises a process that can be used to develop theory driven writing analytics tools that should facilitate thesis writing.
Knight, S, Shibani, A & Buckingham Shum, S 2018, 'Augmenting Formative Writing Assessment with Learning Analytics: A Design Abstraction Approach', 13th International Conference of the Learning Sciences: Rethinking Learning in the Digital Age. Making the Learning Sciences Count, International Conference of the Learning Sciences, International Society of the Learning Sciences, London, UK, pp. 1783-1790.
Shibani, A, Knight, S & Shum, SB 2018, 'Understanding Revisions in Student Writing Through Revision Graphs', Artificial Intelligence in Education, Springer, pp. 332-336.
Text revision is regarded as an important process in improving written products. To study the process of revision activity from authentic classroom contexts, this paper introduces a novel visualization method called Revision Graph to aid detailed analysis of the writing process. This opens up the possibility of exploring the stages in students' revision of drafts, which can lead to further automation of revision analysis for researchers, and formative feedback to students on their writing. The Revision Graph could also be applied to study the direct impact of automated feedback on students' revisions and written outputs in stages of their revision, thus evaluating its effectiveness in pedagogic contexts.
Anderson, TK & Knight, S 2016, 'Learning analytic devices - co-forming, re-forming, in-forming', Information Research: an international electronic journal, International Conference on Conceptions of Library and Information Science, Information Research, Uppsala, Sweden, pp. 1-9.
Introduction. This work-in-progress paper explores the intersection of theorising in human-data-interaction, information studies and learning analytics as part of a discussion about the role informative artefacts play as agents of learning. Method. The artefacts crafted by learners through collaborative work in two different classroom contexts are considered both as representations of and representations about learning. Analysis. Framing the analytic devices crafted through collaborative work in these classroom examples as boundary objects draws attention to their value as carriers and constructors of ideas within and beyond the classroom. Results. The fluid, transient nature of the activities contributed to their value as informative artefacts in individual and collective sensemaking. Through the constant refreshment and reinvention of the material forms that students exchange with one another (and ultimately with their instructors), information is produced. Conclusion. By playfully allowing for multiple means of interaction, the artefactual agents in the two examples create a range of multimodal action possibilities as material and informative artefacts. The paper invites further conversation about these possibilities and the valuable "social life" (Brown & Duguid, 1996) of analytic devices that shape the ways that learning is understood and enacted as objects of assessment.
Knight, S, Anderson, T & Tall, K 2017, 'Dear learner: Participatory visualisation of learning data for sensemaking', ACM International Conference Proceeding Series, International Learning Analytics & Knowledge Conference, ACM, Vancouver, British Columbia, Canada, pp. 532-533.
We discuss the application of a hand-drawn self-visualization approach to learner-data, to draw attention to the space of representational possibilities, the power of representation interactions, and the performativity of information representation.
Thompson, K, Danielson, A, Gosselin, D, Knight, S, Martinez-Maldonado, R, Parnell, R, Pennington, D, Svoboda-Gouvea, J, Vincent, S & Wheeler, P 2017, 'Designing the EMBeRS Summer School: Connecting Stakeholders in Learning, Teaching and Research', Proceedings of the 25th International Conference on Computers in Education (ICCE 2017), International Conference on Computers in Education, Asia-Pacific Society for Computers in Education, Christchurch, New Zealand, pp. 210-215.
In this paper, we describe our research investigating design, teaching and learning aspects of the EMBeRS Summer School. In 2016, thirteen graduate Environmental Science students participated in a ten-day Summer School to learn about interdisciplinary approaches to researching socio-environmental systems. Using the Employing Model-Based Reasoning in Socio-Environmental Synthesis (EMBeRS) approach, students learned about wicked problems, team composition, systems thinking and modelling, stakeholder management, and communication. They applied this approach to their own research, as well as to a case study, in order to, ultimately, further the EMBeRS approach in their own institutions. Learning sciences researchers, environmental science instructors and learners collaborated in design, teaching, and learning during the 2016 Summer School in order to co-create and co-configure the tasks, social arrangements, and tools for learning, teaching and design. This paper identifies four examples of connections between the stakeholders (researchers, instructors and learners), the tools that facilitated the connection, and the implications for learning, teaching and design.
Knight, S, Allen, L, Gibson, A, McNamara, D & Shum, SB 2017, 'Writing analytics literacy - Bridging from research to practice', ACM International Conference Proceeding Series, pp. 496-497.
There is untapped potential in achieving the full impact of learning analytics through the integration of tools into practical pedagogic contexts. To meet this potential, more work must be conducted to support educators in developing learning analytics literacy. The proposed workshop addresses this need by building capacity in the learning analytics community and developing an approach to resourcing for building 'writing analytics literacy'.
Knight, S, Martinez-Maldonado, R, Gibson, A & Shum, SB 2017, 'Towards mining sequences and dispersion of rhetorical moves in student written texts', ACM International Conference Proceeding Series, International Learning Analytics & Knowledge Conference, ACM, Vancouver, British Columbia, Canada, pp. 228-232.
There is an increasing interest in the analysis of both students' writing and the temporal aspects of learning data. The analysis of higher-level learning features in writing contexts requires analyses of data that could be characterised in terms of the sequences and processes of textual features present. This paper (1) discusses the extant literature on sequential and process analyses of writing; and, based on this and our own first-hand experience of sequential analysis, (2) proposes a number of approaches to both pre-process and analyse sequences in whole texts. We illustrate how the approaches could be applied to examples drawn from our own datasets of 'rhetorical moves' in written texts, and the potential each approach holds for providing insight into that data. Work is in progress to apply this model to provide empirical insights. Although similar sequence or process mining techniques have not yet been applied to student writing, techniques applied to event data could readily be operationalised to uncover patterns in texts.
Shibani, A, Knight, S, Buckingham Shum, S & Ryan, P 2017, 'Design and Implementation of a Pedagogic Intervention Using Writing Analytics', Proceedings of the 25th International Conference on Computers in Education, International Conference on Computers in Education, Asia-Pacific Society for Computers in Education, Christchurch, New Zealand.
Academic writing is a key skill required for higher education students, which is often challenging to learn. A promising approach to help students develop this skill is the use of automated tools that provide formative feedback on writing. However, such tools are not widely adopted by students unless useful for their discipline-related writing, and embedded in the curriculum. This recognition motivates an increased emphasis in the field on aligning learning analytics applications with learning design, so that analytics-driven feedback is congruent with the pedagogy and assessment regime. This paper describes the design, implementation, and evaluation of a pedagogic intervention that was developed for law students to make use of an automated Academic Writing Analytics tool (AWA) for improving their academic writing. In exemplifying this pedagogically aligned learning analytic intervention, we describe the development of a learning analytics platform to support the pedagogic design, illustrating its potential through example analyses of data derived from the task.
Gibson, A, Shum, SB, Aitken, A, Tsingos-Lucas, C, Sándor, Á & Knight, S 2017, 'Reflective writing analytics for actionable feedback', Proceedings of the Seventh International Learning Analytics & Knowledge Conference, International Learning Analytics & Knowledge Conference, ACM, Vancouver, British Columbia, Canada, pp. 153-162.
Reflective writing can provide a powerful way for students to integrate professional experience and academic learning. However, writing reflectively requires high quality actionable feedback, which is time-consuming to provide at scale. This paper reports progress on the design, implementation, and validation of a Reflective Writing Analytics platform to provide actionable feedback within a tertiary authentic assessment context. The contributions are: (1) a new conceptual framework for reflective writing; (2) a computational approach to modelling reflective writing, deriving analytics, and providing feedback; (3) the pedagogical and user experience rationale for platform design decisions; and (4) a pilot in a student learning context, with preliminary data on educator and student acceptance, and the extent to which we can evidence that the software provided actionable feedback for reflective writing.
Chen, B, Wise, AF, Knight, S & Cheng, BH 2016, 'Putting Temporal Analytics into Practice: The 5th International Workshop on Temporality in Learning Data', LAK '16 Conference Proceedings: The Sixth International Learning Analytics & Knowledge Conference, 6th International Conference on Learning Analytics and Knowledge (LAK), ACM, University of Edinburgh, Edinburgh, Scotland, pp. 488-489.
Chen, B, Wise, AF, Knight, S & Cheng, L 2016, 'It's About Time: Putting Temporal Analytics into Practice: The 5th International Workshop on Temporality in Learning Data'.
Knight, S & Anderson, T 2016, 'Action-oriented, Accountable, and inter(Active) Learning Analytics for Learners', ACM Press.
Knight, S, Allen, L, Littleton, K, Rienties, B & Tempelaar, DT 2016, 'Writing Analytics for Epistemic Features of Student Writing', Transforming Learning, Empowering Learners Conference Proceedings, International Conference of the Learning Sciences, International Society of the Learning Sciences, Inc. [ISLS], Singapore, pp. 194-201.
Literacy, encompassing the ability to produce written outputs from the reading of multiple sources, is a key learning goal. Selecting information, and evaluating and integrating claims from potentially competing documents, is a complex literacy task. Prior research exploring differing behaviours and their association with constructs such as epistemic cognition has used 'multiple document processing' (MDP) tasks. Using this model, 270 paired participants wrote a review of a document. Reports were assessed using a rubric associated with features of complex literacy behaviours. This paper focuses on the conceptual and empirical associations between those rubric marks and textual features of the reports on a set of natural language processing (NLP) indicators. Findings indicate the potential of NLP indicators for providing feedback on the writing of such outputs, demonstrating clear relationships both across rubric facets and between rubric facets and specific NLP indicators.
Martinez-Maldonado, R, Anderson, T, Shum, SB & Knight, S 2016, 'Towards supporting awareness for content curation: The case of food literacy and behavioural change', CEUR Workshop Proceedings, Learning Analytics for Learners (LAK-LAL), Central Europe Workshop, Edinburgh, Scotland, pp. 42-46.
This paper presents a theoretical grounding and a conceptual proposal aimed at providing support in the initial stages of sustained behavioural change. We explore the role that learning analytics and/or open learner models can have in supporting life-long learners to enhance their food literacy through a more informed curation process of relevant content. This approach is grounded in a behavioural change perspective that identifies (i) knowledge, (ii) attitudes, and (iii) self-efficacy as key factors that will directly and indirectly affect the future decisions and agency of life-long learners concerning their own health. The paper offers some possible avenues for organising efforts towards the use of learning analytics to enhance awareness in terms of knowledge curation, knowledge sharing and knowledge certainty. The paper aims to trigger discussion about the types of data and presentation mechanisms that may help life-long learners set a stronger basis for behavioural change in the subsequent stages.
Knight, S 2015, 'Learning indicators in SCIS tasks', ECol 2015 - Proceedings of the 2015 Workshop on Evaluation on Collaborative Information Retrieval and Seeking, co-located with CIKM 2015, 2015 Workshop on Evaluation on Collaborative Information Retrieval and Seeking, ACM, Melbourne, Australia, pp. 11-13.
Evaluation of user success, and of systems to support that success, in information seeking tasks is complex. The addition of social and/or collaborative elements to task and system design adds a further layer of complexity to such evaluation. This short paper highlights a number of simple metrics that have been used by information and learning science researchers to explore social and collaborative information seeking (SCIS) in the context of searching to learn.
Knight, S & Littleton, K 2015, 'Developing a multiple-document-processing performance assessment for epistemic literacy', Proceedings of the Fifth International Conference on Learning Analytics And Knowledge, International Learning Analytics & Knowledge Conference, ACM, Poughkeepsie, USA, pp. 241-245.
The LAK15 theme "shifts the focus from data to impact", noting the potential for Learning Analytics based on existing technologies to have scalable impact on learning for people of all ages. For such demand and potential in scalability to be met, the challenges of addressing higher-order thinking skills should be addressed. This paper discusses one such approach – the creation of an analytic and task model to probe epistemic cognition in complex literacy tasks. The research uses existing technologies in novel ways to build a conceptually grounded model of trace-indicators for epistemic-commitments in information seeking behaviors. We argue that such an evidence-centered approach is fundamental to realizing the potential of analytics, which should maintain a strong association with learning theory.
Knight, S & Mitsui, M 2015, 'Temporal analysis in epistemic SCIS tasks', ECol 2015 - Proceedings of the 2015 Workshop on Evaluation on Collaborative Information Retrieval and Seeking, co-located with CIKM 2015, 2015 Workshop on Evaluation on Collaborative Information Retrieval and Seeking, ACM, Melbourne, Australia, pp. 15-16.
Temporal considerations are core to understanding both learning and information seeking processes and outcomes. This claim is true of both individual and collaborative work. However, limited analysis has been conducted in either the learning or information sciences to describe temporal features of SCIS. This paper discusses some of this work, which provides the background for ongoing analysis (to be presented at the workshop) of temporal factors in a designed epistemic SCIS task.
Knight, S, Wise, AF, Chen, B & Cheng, BH 2015, 'It's About Time: 4th International Workshop on Temporal Analyses of Learning Data', The 5th International Learning Analytics & Knowledge Conference (LAK15): Scaling Up: Big Data to Big Impact.
Interest in analyses that probe the temporal aspects of learning continues to grow. The study of common and consequential sequences of events (such as learners accessing resources, interacting with other learners and engaging in self-regulatory activities) and how these are associated with learning outcomes, as well as the ways in which knowledge and skills grow or evolve over time are both core areas of interest. Learning analytics datasets are replete with fine-grained temporal data: click streams; chat logs; document edit histories (e.g. wikis, etherpads); motion tracking (e.g. eye-tracking, Microsoft Kinect), and so on. However, the emerging area of temporal analysis presents both technical and theoretical challenges in appropriating suitable techniques and interpreting results in the context of learning. The learning analytics community offers a productive focal ground for exploring and furthering efforts to address these challenges as it is already positioned in the "'middle space' where learning and analytic concerns meet" (Suthers & Verbert, 2013, p 1). This workshop, the fourth in a series on temporal analysis of learning, provides a focal point for analytics researchers to consider issues around and approaches to temporality in learning analytics.
Knight, S, Wise, A, Arastoopour, G, Shaffer, DW, Buckingham Shum, S & Kirschner, PA 2014, 'Learning analytics for learning and becoming in practice', International Conference of the Learning Sciences.
Arastoopour, G, Shum, SB, Collier, W, Kirschner, PA, Wise, AF, Knight, S & Shaffer, DW 2014, 'Analytics for learning and becoming in practice', Proceedings of International Conference of the Learning Sciences, ICLS, International Conference of the Learning Sciences (ICLS) 2014, pp. 1680-1683.
Learning Analytics sits at the intersection of the learning sciences and computational data capture and analysis. Analytics should be grounded in the existing literature with a view to data 'geology' or 'archeology' over 'mining'. This workshop explores how analytics may extend the common notion of activity trace data from learning processes to encompass learning practices, with a working distinction for discussion as (1) process: a series of related actions engaged in as part of learning activities; and (2) practice: a repertoire of processes organised around particular foci recognised within a social group. The workshop intersperses attendee presentations and demonstrations with relevant theme-based discussions.
Knight, S, Arastoopour, G, Williamson Shaffer, D, Buckingham Shum, S & Littleton, K 2014, 'Epistemic Networks for Epistemic Commitments', Learning and Becoming in Practice The International Conference of the Learning Sciences (ICLS) 2014, International Conference of the Learning Sciences, International Society of the Learning Sciences, USA.
Knight, S, Buckingham Shum, S & Littleton, K 2013, 'Epistemology, Pedagogy, Assessment and Learning Analytics', Proc. 3rd International Conference on Learning Analytics & Knowledge, International Learning Analytics & Knowledge Conference, ACM, Leuven, Belgium, pp. 75-84.
Parsonage, E, Nguyen, HX, Bowden, R, Knight, S, Falkner, N & Roughan, M 2011, 'Generalized Graph Products for Network Design and Analysis', 2011 19th IEEE International Conference on Network Protocols (ICNP), 19th IEEE International Conference on Network Protocols (ICNP), IEEE, Vancouver, Canada.
Knight, S, Moschou, S & Sorell, M 2009, 'Analysis of Sensor Photo Response Non-Uniformity in RAW Images', Forensics in Telecommunications, Information and Multimedia, 2nd International Conference on Forensic Applications and Techniques in Telecommunications, Information and Multimedia, Springer, University of Adelaide, Australian National Wine Centre, Adelaide, Australia, pp. 130-141.
Buckingham Shum, S & Knight, S (Educational Technology Action Group) 2014, Educational Technology Action Group cluster 2a (students with sight & control of their own complex learning "big" data) consultation.
Knight, S, Maggs, M, Poulter, M, Matthews, C & Sant, T 2014, Wikimedia UK response to House of Lords Digital Skills Committee call for evidence.
Anderson, T & Knight, S 2017, 'Collaborating to Create: Sharing & Co-constructing Information for Learning'.
Knight, S 2017, 'Beyond bubble bursting', British Psychological Society.
Knight, S 2017, 'Bursting the News Filter Bubble', Control Publications (Australia).
After the US presidential election, Google searches for Breitbart News peaked; friends – not secret Trump supporters – took to the right-wing source to try to understand the views it was espousing. Since then there have been frequent calls for more of us to step out of our social media echo chambers and to 'burst our filter bubble'. These echo chambers, it's alleged, combine with a filter bubble effect: social media and search engine personalization that emphasizes content similar to content you have viewed or liked before. So your Facebook feed only exposes you to views you already agree with, and to information that supports those views, leading to a general deterioration in public and political debate as we seem unable or unwilling to engage across perspectives.
If we believe this argument, then your use of Facebook presents an information-access issue, insulating you from diverse perspectives, exposure to which would improve political discourse. Empirical research on this topic is hard: companies control their data, users typically don't state their politics explicitly, and the impact of proprietary algorithms can only be guessed at. Whether you're liberal or conservative, you're more likely to believe information that confirms your prior beliefs; the question is what role technology is playing in this. And does this cognitive bias mean that we should indiscriminately take a more even-handed approach to sources?
Knight, S 2017, 'Epistemic Cognition: A Lens Onto Fake News', The Psychologist.
Knight, S & Anderson, T 2017, 'Data Play: Participatory visualisation to make sense of data'.
Knight, S & Anderson, T 2017, 'Dear Data Learner: Participatory Visualisation of Learning Data for Sensemaking'.
Webinar for Transforming Assessment
Gibson, A, Knight, S, Aitken, A, Buckingham Shum, S, Ryan, P, Jarvis, W, Nikolova, N, Tsingos-Lucas, C, Parr, A, White, A & Sutton, N 2016, 'Using Writing Analytics For Formative Feedback'.
Knight, S 2012, 'Finding Knowledge - The Role of Dialogue in Collaborative Information Retrieval in the Classroom'.
Knight, S 2011, 'Knowing My Extensions: The Implications of Extended Mind for Our Conception of Knowledge and Its Assessment – Do I "Know" My Extensions?'.
Knight, S, 'Developing Learning Analytics for Epistemic Commitments in a Collaborative Information Seeking Environment'.
- In the AWA project we collaborate with Naver Research in Europe, and with the ATN network (see http://heta.io/)
- I collaborate with Melanie Peffer at the University of Northern Colorado around epistemic cognition research
Work from my PhD is in collaboration with:
- Chirag Shah and Matthew Mitsui at Rutgers University InfoSeeking Lab http://www.infoseeking.org/
- Dirk Tempelaar at Maastricht University
- Bart Rienties and Karen Littleton at the Open University, UK
- We have active collaborations with the SOLET (Science of Learning and Educational Technology) lab at Arizona State University
- We collaborate with a number of Australian universities