IMS Europe Summit 2020 Agenda (Day 4)

Measuring Student Success using Learning Analytics and Digital Assessments

Monday, November 30


Please join us for this combined session on Learning Analytics and Digital Assessment. We are happy to announce two great keynote speakers for this session: Matt Richards, Principal Product Owner at Cambridge Assessment, and Bart Rienties, Professor of Learning Analytics at Open University UK. Invited guests will share their ideas with us, alongside a number of very interesting lightning talks from our members. You will have plenty of opportunity to listen and to contribute to discussions on several themes.




All times listed are in Central European Time

Morning Session


By Nynke de Boer, Program Director IMS Europe


By Steve Bailey, Product Marketing Manager at Blackboard

Learning analytics has always relied upon assessment as both a measurement of student performance and a leading indicator of future success, but we are now starting to see the process of assessment itself benefitting from analytical methods. In this introduction I will review the history of this relationship and share some examples of how it is changing for the benefit of both areas.


IMS Europe member Lightning Talk

Analytics in vocational assessment
By Paddy Craven, Director of Assessment Policy, Research and Compliance at City & Guilds


IMS Europe member Lightning Talk

Online Proctoring: When do you push the RED button?
By Erwin van Schaffelaar, Chief Commercial Officer at XQuiry

Monitoring a live proctored exam requires the utmost concentration from the live proctor. Under remote, high-stakes conditions, good and fair testing remains a basic exam regulation principle. There are many forms of irregularities, incidents and violations that can be observed as candidates navigate through their exam. Which ones do you capture and report on, given that a live proctor monitors at least six candidates at the same time? Nothing is what it seems when we assess multiple aberrant patterns. Is there a difference between what we observe and how we judge?


IMS Europe member Lightning Talk

Implementation of new platform for national tests and exams in Norway
By Johan Aamdal Bottheim, Project delivery manager, The Norwegian Directorate for Education and Training

This lightning talk will take you to Norway, where the government is developing digital exams and national tests for students nationwide.
The Norwegian Directorate for Education and Training will share their experiences with implementing a centrally managed system for this, particularly focusing on the administration of tests and assessments.


Smaller breakout groups for attendees to meet and discuss


Perspectives on Digital Assessment in a global context
By Matt Richards, Principal Product Owner at Cambridge Assessment

From enabling new approaches to engaging learners, supporting greater educational insights and ultimately ensuring more scalable and convenient products, the transition to Digital Assessment provides significant opportunities to learning and assessment organisations.
As we continue with this transformation, we must however ensure we bring the learners and educational organisations we support with us. 'Learners' are not a single entity: they have different experiences and confidence with technology. Educational organisations also have varying access to a mix of different digital platforms. All of this must be taken into account as we design our assessments.
Over the course of this session, Matt will outline some of these areas, whilst demonstrating how the rich data used to support learners can also help us maintain the quality of our assessments too.

12:30 End of the morning session


Afternoon Session


By Nynke de Boer, Program Director IMS Europe


Keynote presentation - Learning analytics and the effective use of assessments
By Bart Rienties, Professor of Learning Analytics, PFHEA at Open University UK

As assessments drive student learning there is a wealth of research and practical experience providing guidelines on how to effectively design different forms of assessment. In this presentation, I will discuss how the Open University UK is using online assessments for its 170K students, and how learning analytics can provide unique insights into how students make complex assessment decisions.


IMS Europe member Lightning Talk

Access Beyond the Assessment: Accessibility for all Aspects of Digital Testing
By Thomas Hoffmann, Director of Product Strategy & Solutions at Open Assessment Technologies

Accessibility has long been a challenge in the assessment industry. With the recent uptick in online learning and assessment this year, it’s crucial that we rise to meet this challenge. It’s taken longer than many of us hoped, but online assessment platforms are finally taking accessibility seriously, and many vendors are openly boasting about their accessibility compliance and features. However, accessibility for tests isn’t only applicable to the test itself. Assessment programs need to dig a little deeper and think through the entire chain of access to all of the aspects of taking a test.

In this lightning talk, we dive into providing access for the whole assessment process, including what this looks like for test-takers, agents and test creators. We also discuss the critical IMS learning standards that enable accessibility beyond the assessment for all aspects of digital testing.


IMS Europe member Lightning Talk

Authentic Assessment, what’s next?
By Saskia Wools, Director Research, Knowledge and Innovations at Cito

At Cito we develop authentic assessments to create an immersive testing experience. In this lightning talk, two examples will demonstrate our newest generation of assessments, which place regular assessment interaction types in an authentic context. Although known item types are used, these assessments no longer feel like an assessment. The result? Students enjoy taking these assessments, are better able to demonstrate their ability, and feel less stressed. The problem? Current standards do not seem to provide a way to make this content interoperable. How can we adapt these standards to support this use case?


Panel discussion - Bringing together Digital Assessment and Learning Analytics

Facilitated by Thomas Hoffmann, Director of Product Strategy & Solutions at Open Assessment Technologies

Panel members:
  • Bart Rienties, Professor of Learning Analytics, PFHEA at Open University UK
  • Mark Molenaar, founder at Apenutmize
  • Phil Richards, CTO Data at Jisc
  • Yeona Jang, Chief Community Engagement Officer at Explorance

Improving Student Success and Intervention Strategies with Real-time Data
By Rick Johnson, Co-Founder, Vice President of Product Strategy and Accessibility at VitalSource Technologies, LLC

We know engagement analytics have the power to change the learning process by encouraging students to access materials early and often, and by tracking student engagement and mastery of the content. Real-time data feeds also power research and intervention strategies, improving the accuracy of your predictive models by combining content engagement data with student outcomes and data from other learning tools.

This session is about the important role of interoperability within a digital ecosystem to empower institutions with robust learning data. Hear an update from last year’s discussion on our latest findings and how you can begin to incorporate data into your overall strategy.

14:55 Closing words
15:00 End of the afternoon session