At a glance

Learning Analytics (LA) is a data-driven approach to help understand and “optimize learning and environments in which it occurs”1. The sections below outline what a learning analytics project involves.
Who to contact

Learning Innovation and Faculty Engagement – Gemma Henderson

Usage scenarios

The following scenarios are generated from implementations at various levels across the University of Miami.

Learning Platforms

While learning data is captured through the multiple learning platforms that faculty and students engage with, it is often difficult to bring this data together and take meaningful action on it. Some platforms provide limited dashboards to view engagement with course resources:
Advising

SSC Campus is the current advising platform at the University of Miami, developed by the Education Advisory Board (EAB). SSC Campus tracks student progress using historical data, research, and predictive analytics to facilitate the identification of students who may require further support to complete their degree. It is a university-wide platform used to enhance our coordinated approach to student success through the following resources:
SSC Campus is available to all schools and colleges at UM and is currently focused on supporting undergraduate students. Each school/college has designated an SSC Campus Specialist who is dedicated to improving the collaborative success effort. The data captured will be used to inform academic policy and improve strategies to support students.

Graduate Courses

Learning analytics are introduced within the following graduate programs and courses at the University of Miami.
Research Methods

Nam Ju Kim, Assistant Professor in the Department of Teaching and Learning at the School of Education and Human Development, previously engaged in learning analytics research at Utah State University. In one project, Dr. Kim:
Through his experience, Dr. Kim expressed that learning analytics can help us evaluate students’ pathways through variables such as GPA, course evaluations, class format, or online interactions in a virtual learning environment. Dr. Kim is enthusiastic about the prospect of pursuing learning analytics within his research and teaching practice at the University of Miami.

References
Research Team

Gemma Henderson: Senior Instructional Designer, Academic Technologies
Cameron Riopelle: Librarian Assistant Professor, Data Services
Grounded in educational theory, research, and practice, Learning Analytics involves studying data about key stakeholders in education (students, educators, researchers, and administrators) to better understand and enrich the learning process. LA projects can range from predicting the problems students may encounter along the way (financial aid, lack of prerequisites, low grades) to examining how they interact with learning systems (Blackboard Learn, Canvas, Moodle, SSC Campus).
There are multiple pathways to begin exploring learning analytics. Some questions to consider:
Where will you look for available data? Identify and collect multiple data sets (open, internal, cross-institutional), clean and aggregate the data, and, if possible, make the data open (while protecting privacy) for further analysis. For example, learning management systems collect data on grades, discussion content, and videos or resources accessed; using this data, or merging it with other data such as administrative and enrollment records, can help provide a more comprehensive picture of students at the university.
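As a minimal sketch of that merging step in Python (the student IDs, courses, majors, and counts below are invented for illustration, not data from any UM system):

```python
# Hypothetical LMS export: one record per student per course.
lms_rows = [
    {"student_id": 101, "course": "BIO101", "logins": 42, "grade": 88},
    {"student_id": 102, "course": "BIO101", "logins": 7, "grade": 61},
    {"student_id": 103, "course": "CHM110", "logins": 25, "grade": 75},
]

# Hypothetical administrative/enrollment records keyed by student ID.
enrollment = {
    101: {"major": "Biology", "credits": 15},
    102: {"major": "Biology", "credits": 12},
    103: {"major": "Chemistry", "credits": 16},
    104: {"major": "Physics", "credits": 14},
}

# Left-join the administrative context onto each LMS record, keeping
# every LMS row even when no enrollment record is found.
merged = [{**row, **enrollment.get(row["student_id"], {})} for row in lms_rows]

for record in merged:
    print(record["student_id"], record["major"], record["grade"])
```

In practice this joining is usually done with database queries or a library such as pandas, but the principle is the same: a shared identifier links engagement data to its administrative context.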
Who will collaborate with you? As experts in data science explore the potential of learning analytics, educators and educational researchers are vital collaborators in grounding the project in educational theory and analysis, and in helping to take meaningful action on learning data.
What will the data answer? Address the purpose or research question behind a project, the ethical impact of pursuing a LA project, and the principles that will govern the collection of data. Constructing a data analysis process that measures students in their own particular contexts, as well as in comparison across differing scales such as courses, majors/minors, and departments, may provide new insights.
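One way to sketch that kind of multi-scale comparison, using invented records (the students, departments, and GPA figures here are hypothetical):

```python
from collections import defaultdict
from statistics import mean

# Hypothetical course-grade records for a handful of students.
records = [
    {"student": "A", "dept": "BIO", "course": "BIO101", "gpa_points": 3.7},
    {"student": "B", "dept": "BIO", "course": "BIO101", "gpa_points": 2.9},
    {"student": "C", "dept": "CHM", "course": "CHM110", "gpa_points": 3.2},
    {"student": "D", "dept": "CHM", "course": "CHM210", "gpa_points": 3.8},
]

def average_by(records, key):
    """Average GPA points at a chosen scale (course, department, ...)."""
    groups = defaultdict(list)
    for r in records:
        groups[r[key]].append(r["gpa_points"])
    return {k: round(mean(v), 2) for k, v in groups.items()}

by_course = average_by(records, "course")
by_dept = average_by(records, "dept")
print(by_dept)  # prints {'BIO': 3.3, 'CHM': 3.5}
```

The same aggregation function serves every scale, which makes it easy to place an individual student's figures alongside course-level and department-level baselines.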
How will you share, communicate, and visualize results? Reports and summaries generated through learning analytics methodologies should tell a story about the learning process and motivate instrumental change. These stories should follow strict guidelines about the ethics of collecting, storing, and using data in which vulnerabilities and sensitive information might be identifiable.
Learning analytics projects are diverse in both scale and subject matter - at once exciting and daunting for individuals new to the field. In recent years, focus has shifted from student retention to establishing ethical frameworks and policies. Furthermore, as researchers experiment with learning analytics, projects are often temporary in nature, yet they may have lasting effects on the field3 - such as ‘Course Signals,’ the discontinued early-warning system to improve student retention at Purdue. To raise awareness of the practice of learning analytics, JISC (2016) shares eleven case studies of institutions deploying learning analytics, while Ebner et al. (2017) provides a detailed analysis of LA literature. Some initial themes and recent developments are summarized below.
Monitoring Student Progress: Educational institutions have been embracing learning analytics as a powerful methodology for understanding educational outcomes, with most interest still residing in “monitoring or measuring student progress”8 rather than in other components such as “predicting learning success or prescribing intervention strategies”8.
Actionable Data: At its best, learning analytics uses a variety of sophisticated approaches to data analysis, including statistics, machine-learning, and qualitative methods, to provide key information that can be used by administrators, researchers, and possibly even the students themselves, to better understand the educational process on the whole, from the basics of student enrollment to complex topics of success and remedial actions9.
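As a toy illustration of the statistical end of that spectrum, the sketch below flags students whose engagement sits far below the course average. The login counts and the 1.5-standard-deviation threshold are invented for illustration; this is a simple screen, not a validated intervention rule.

```python
from statistics import mean, stdev

# Hypothetical weekly login counts for five students in one course.
logins = {"A": 40, "B": 35, "C": 4, "D": 38, "E": 31}

values = list(logins.values())
mu, sigma = mean(values), stdev(values)

# Flag students whose engagement z-score falls well below the course
# average: a statistical screen, not a prediction of outcomes.
at_risk = [s for s, v in logins.items() if (v - mu) / sigma < -1.5]
print(at_risk)  # prints ['C']
```

A production system would combine many such signals, and, in the spirit of the point above, should surface them to the students themselves as well as to administrators.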
Alternative Evaluation Methods: With appropriately scaled and staffed learning analytics units that operate campus-wide and in keeping with the missions of higher education, it is possible to imagine a future in which teachers are aware of the effectiveness of their instruction, struggling students are provided with immediate and timely administrative intervention, and measures of success are approached holistically, taking into account not only “objective” measures such as GPA and job placement, but also more affective qualities such as student satisfaction and whether courses meet their learning goals.
Scalability: The landscape of learning analytics is still uneven, with different institutions approaching the topic with often vastly different methodologies and goals. A common problem is the question of scale: if only certain colleges or departments within a given university adopt learning analytics platforms while others do not, the endeavor as a whole can be weakened, and generalizations (both statistical and otherwise) are consequently limited.
Surveillance: Part of the difficulty in campus-wide adoption of learning analytics platforms may well be that more humanistic departments view with hostility the idea that students are to be surveilled through for-profit means and subsequently assessed. These concerns are both relevant and eye-opening; indeed, in any study of “success,” a key question is: when it comes to measuring outcomes, who gets to define success? The instructor, the student, or administrators?
Strategic Vision: In their 2017 review of learning analytics at universities, Yi-Shan Tsai and Dragan Gasevic discuss the challenges that learning analytics methodologies face in today’s educational climates. Challenges include shortages of leadership, unequal engagement with different stakeholders, shortages of pedagogy-based approaches to removing learning barriers, insufficient training opportunities, a limited number of empirical studies to validate impact, and a limited availability of policies to address issues of privacy and ethics10. The shortage of pedagogy-based approaches is especially important in that learning analytics platforms often try to improve themselves using technical and IT-based solutions, rather than taking into account the goals of instructors10. For the challenge of privacy and ethics, as well as the challenge of unequal engagement, policies need to be in place that present a strategic vision in which all users of learning analytics platforms are seen as not just “users” but also stakeholders.
Open communities: Due to the various skills needed, researchers and educators from diverse fields are continuing to work together to explore how analytics can inform teaching and learning. The collaborative nature of learning analytics projects has seen the development of multiple interdisciplinary networks, within and outside institutions, including Society for Learning Analytics Research (SoLAR), Learning Analytics and Knowledge Conference, LINK Research Lab, and Learning Analytics Research Network (NYU-LEARN).
Personalized learning: To provide further guidance in large classes, the OnTask Project aims to “improve the academic experience of students through the delivery of timely, personalised and actionable student feedback throughout their participation in a course.” Designed by a team of leading researchers in the field of learning analytics, OnTask aims to provide all stakeholders (learners, faculty, administrators) with greater access to data.
Libraries: Libraries are key partners in learning analytics initiatives - the learning data captured by the library (resources, spaces, services) relates to multiple stakeholders at the institution. In 2018, the Association of Research Libraries (ARL) delivered a SPEC Survey on Learning Analytics to ARL members to explore data management practices within existing learning analytics initiatives and their ethics commitments.
Enterprise solutions: As institutions research sustainable solutions to manage, analyze, and act upon the production of learning data, corporate entities are innovating quickly in this area. For example, in association with Cambridge English (part of the University of Cambridge), Write & Improve provides a free tool for learners of English, analyzing submissions and returning automated feedback.
Curriculum: As the discipline expands, higher education institutions are developing programs and courses in Learning Analytics. The University of Miami introduces learning analytics within its M.S.Ed in Applied Learning Sciences and Ph.D. in Teaching and Learning - Science, Technology, Engineering and Mathematics (STEM) Education, while Teachers College at Columbia University delivers a graduate program in Learning Analytics, and the University of Edinburgh’s MSc in Digital Education offers a fully online course co-taught by leading researchers in the field.
Ethics: The ethical implications of using learning data are broad and important to keep in mind. There is often a barrier to access for users of learning analytics platforms, often due to the differing ways the environments are used by researchers, instructors, and students. Students are often not given access to the data that is being collected and analyzed to describe them and, perhaps more importantly, to measure their outcomes. Another key ethical problem is the focus on student retention instead of thinking about different modes of the learning process: what exactly are we measuring? For example, an analytical process that seeks to measure the effectiveness of discussion groups might differ greatly from one that seeks to measure student retention. And when we use third-party learning tools such as Google Drive, we are often unable to access the data produced, while the terms of agreement may allow any data housed on those systems to be used for purposes such as product development and marketing.