Thursday 15 May 2014

The power of the MCQ exam: Optical Mark Reading (OMR) Service at UCS: Annual Report 2013-14

This blog post is the annual report for the OMR Service which is run by the Elevate Team for course teams at UCS Ipswich. For more information on how the OMR service can help your teaching and assessment model, please contact elevate@ucs.ac.uk

Executive Summary

During 2013/14
  • One additional course team has started using the OMR Service for a Level 5 summative exam.
  • The total number of course teams using the service was 5.
  • The total number of papers set was 17.
  • The total number of answer sheets scanned was 516.
The recommendations for 2014/15 are:
  • Be more proactive in promoting the service to course teams across UCS
  • Reduce potential points of failure within the service, in particular the current risk that the software is installed on a single desktop, which is off site, and only one member of the team knows how to use it
Background

The Elevate Team have been running an OMR Service since 2011-12. The aim of the service is to provide an opportunity for course teams to use Objective Testing (low- and high-stakes) in their assessment and feedback models at UCS. The service complements the use of the Quiz (Test) Engine in LearnUCS. Please note, we do not recommend or support the use of LearnUCS in UCS Computer Labs or alternative learning spaces for summative, high-stakes assessment.

The Elevate Team use the FormReturn Software / Service (http://www.formreturn.com/), with a local install on one of the Elevate Team’s iMacs.

Workflow

The workflow for the creation, distribution and return of OMR exams is outlined in Figure 1. This illustrates that the process spans multiple teams.

[Figure 1: the OMR exam workflow]

The workflow is also available from:
An illustration of an uncompleted answer sheet is included in Appendix 1. This follows current good practice within design, which has been discussed with the Registry / Examination Office to ensure it accommodates UCS requirements.
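As an aside, the marking step at the heart of any OMR workflow is simply comparing each candidate's detected responses against an answer key. FormReturn handles this internally; the sketch below is purely illustrative, and all names and data in it are hypothetical.

```python
# Illustrative sketch only: FormReturn performs mark detection and
# scoring internally. This shows the basic scoring idea -- comparing
# a candidate's detected responses against the answer key.
# All names and data here are hypothetical.

ANSWER_KEY = {1: "B", 2: "D", 3: "A", 4: "C"}  # question -> correct option

def score_sheet(responses):
    """Count correct answers; blank or unrecognised items score zero."""
    return sum(
        1 for q, correct in ANSWER_KEY.items()
        if responses.get(q) == correct
    )

# Example: a candidate's detected marks (question -> option, None = blank)
candidate = {1: "B", 2: "D", 3: "C", 4: None}
print(score_sheet(candidate))  # 2 correct out of 4
```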

Usage and feedback on the OMR Service: 2013-14

This section aims to answer three questions:
  • Is the service being used?
  • Are staff happy with the service?
  • How might the service be enhanced?
The methodology involves reviewing the log data to identify whether the service is being used, and undertaking a survey to capture staff views.

The log data indicates the service has been used. Between 1st September 2013 and 9th May 2014:
  • The total number of course teams using the service was 5
  • The total number of papers set was 17
  • The total number of answer sheets scanned was 516
The programmes using it for summative exams are:
  • BA Business Management
  • Foundations of Biological and Cognitive Psychology
  • Child and Social Policy
  • Radiation Physics
Admissions process

An ongoing commitment is to support the admissions process for the Radiography Team. They run a Literacy Test (20 questions) and a Numeracy Test (24 questions). During 2013-14 the Elevate Team administered 12 tests (6 instances), with 232 scans.

A survey was administered to gather staff opinions on the value of the service and the potential to enhance it. The response rate was 4 out of 5 (80%). All four respondents are planning to use the service in 2014/15, and two strongly agreed with the statement "I'd strongly recommend the use of OMR in teaching, learning and assessment to my work colleagues".

We use the OMR as the first assessment in a level 4 module. We have found that students engage very well with the specific learning materials in preparation for the exam. In contrast to the previous assessment, which was an essay, students focus on the provided revision material, which is directly linked to the learning outcomes for the module. They receive feedback within days of the exam, when previously they had to wait much longer for up to 120 essay scripts to be marked by two tutors.

Feedback from the students is overwhelmingly positive and summative results have improved.
With respect to potential issues, one response focussed on the initial setting up and refreshing of questions, given the potential for three exams a year for the cohort; course teams therefore need to develop a large bank of questions. Another response raised the need for a quicker turnaround, in particular, results being available within the working day of the assessment.

Recommendations
  • Be more proactive in promoting the service to course teams across UCS
  • Reduce potential points of failure within the service, in particular the current risk that the software is installed on a single desktop, which is off site, and only one member of the team knows how to use it
