Tuesday 13 January 2015

Evaluation of online courses: Dec 2014

Background

Learning Services released three online courses through the CourseSites platform at the end of December 2014. The courses were designed to give the online learner an experience equivalent to attending a one-hour, getting-started, face-to-face workshop.

The aim of this update is to begin drawing a few reflections together, so that we can revisit the existing courses and enhance the design of the next wave, due for release in March 2015. This self-reflection is a collaborative and transparent process.

I would like to thank the wider Learning Services Team for their work on these three courses, and for the progress we have made.

For more information on how to enrol on the courses, see
For more background, see

Data Analysis (Did they come?)

In the first month, 29 people enrolled on the SOOC, of whom 20 appeared to be UCS students and 9 UCS staff. Of the UCS staff, 4 were members of Learning Services. Four people completed the courses and achieved their Mozilla Open Badge. There did not appear to be any non-UCS registrations. An exploration of the gradebook suggested that if people started a course (i.e. completed the first quiz), they tended to complete all the quizzes on that course.

Criteria

The criteria used by the Elevate Team for reflecting on an online course, from the perspective of the online learner, are:

Motivation
  • What are my motivation levels through the learning activity? Do I feel motivated to complete all the tasks? Do I feel engaged? Am I challenged as I complete the task? Are the language and content appropriate for the level (getting started, advanced)?
Role of Feedback
  • How much feedback am I getting on the tasks? Do I feel I’m learning from the feedback I get? Does the feedback encourage me to complete more tasks?
On-screen support and signposting
  • Do I know what to do? Are there clear, appropriate and time relevant instructions?
Screen design and layout
  • Is the screen design and layout consistent? Is it usable? Does the design hinder my chance of completing the tasks?
Interactivity and learning design
  • Am I simply reading and watching information, or am I required to complete activities? Are the tasks and interactions of the same type throughout the activity? What things are they making me do?
Content
  • Is the content accurate? Are there areas which are not clear, or which contain typos? Do all the links work?
In addition, individuals within the curriculum development team were asked to provide feedback. To prompt feedback, and to triangulate with the e-Learning Developers, they were given a small range of stimulus questions:
  • How might we improve the learning design / flow?
  • Is the resource engaging?
  • Is the look and feel appealing?
  • Is it a good and effective learning experience?

Reflections

Role of feedback
  • The feedback is generic, irrespective of how you performed on the test / quiz. It would be good to use adaptive release to provide different feedback depending on performance.
  • There is a strong preference for multiple-choice questions, so it would be useful to raise authors’ awareness of the other question types available.
  • The feedback tends not to be developmental; it mostly affirms the correct answer. It will be worth exploring the potential role of feedback with the course team.
Screen design and layout
  • The use of “mark as reviewed” means the learner has to scroll further and further down the page. There is no obvious solution, but it is rather frustrating.
  • The embedded screencasts are inconsistent in terms of size and presenter.
  • The embedded talking-head videos lack engagement; they need supporting text on screen, and we could think through more effective use of props.
  • The screen layout around Task 4 (Information searching) makes it difficult to complete the task.
Content
  • There are a number of typos within the text. We should improve the content management and content sign-off processes.
Interactivity
  • There was a high level of interactivity within the resources: the use of video, quiz questions and “mark as reviewed” meant the learner was active throughout the process.

Where next for Learning Services?

We are expected to develop a number of online workshops to be rolled out in March (with the design phase in February). We'll follow a similar design process, starting with a kick-off sprint. However, the focus of the sprint will include the quiz engine and the role of feedback. It would also be very good to explore the potential of students submitting a piece of work or reflection to be assessed by members of Learning Services. We will also start to pull together and standardise the screencasts, videos and slides. The design and development phase will conclude with a set of student user tests, whose feedback will need to be applied before the courses are released.

Many of these lessons and ideas are transferable to other online course designs. For more information on how you might use these lessons, please contact the Elevate Team (elevate@ucs.ac.uk).
