Tuesday 22 October 2013

AR Developments - Engaging the Cognitive Processes

What is it?


Augmented Reality (AR) is the layering of digital information over a real time, live view of objects in the real world.

This digital information can be text, images, video or 3D models; any content that is available digitally can be overlaid. The technology has the potential to enhance users’ experiences by combining real and digital information on today’s smartphones and tablets.

What have we been doing?


Here at UCS we have been looking at AR and its uses since mid-2011, when the Elevate Team ran a development sprint on location-based (GPS) layering of digital material. We used the Layar tool to let visitors to the Ipswich Wet Dock find out more about UCS and its waterfront campus.

Visitors could download the app and subscribe to our channel, allowing them to view the surrounding area through the screen of their smartphone or tablet; the app would then overlay information about the buildings in view. Using the sensors built into the smartphone or tablet, the app knew where the user was and in which direction they were looking, so we could plot buildings by their GPS coordinates and display information about each one.
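Under the hood, a GPS-based layer like this comes down to bearing arithmetic: given the user's position and compass heading, work out the bearing to each plotted point of interest and show only those that fall within the camera's field of view. A minimal sketch of that idea (the coordinates and the 60° field-of-view value below are illustrative assumptions, not Layar's actual internals or UCS's real data):

```python
import math

def bearing_to(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing in degrees from point 1 to point 2."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return (math.degrees(math.atan2(y, x)) + 360) % 360

def in_view(user_lat, user_lon, heading, poi_lat, poi_lon, fov=60):
    """True if the point of interest lies within the camera's field of view."""
    b = bearing_to(user_lat, user_lon, poi_lat, poi_lon)
    # Smallest signed angle between the heading and the bearing.
    diff = (b - heading + 180) % 360 - 180
    return abs(diff) <= fov / 2
```

So a building due north of the user is overlaid when they face north (`in_view(52.0, 1.0, 0, 53.0, 1.0)` is true) but not when they turn to face south.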

For instance, when looking towards the Library building, the user would see that it was the Library, along with its opening times and links to the Library’s communications channels; staff and students could also search the catalogue.

The last day of this development sprint was set aside for further research. This is when we came across a beta release of a tool called “Aurasma”. Aurasma describes itself as a visual browser, using mobile devices’ cameras and image-recognition technology to overlay digital content.

Previous AR developments were based around “markers”, meaning you needed a very bold, high-contrast icon-style image for computer cameras to recognise. With the fast-paced development of mobile technologies and the processing power this brings, companies have been able to develop markerless tracking, known as “Natural Feature Tracking”. This means image-recognition technologies can identify everyday objects as markers and so trigger a reaction within the app.
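The core idea behind markerless tracking is feature matching: the app extracts compact descriptors from distinctive points in the camera frame and compares them against descriptors stored for the target image, declaring recognition once enough of them line up. The toy sketch below illustrates that principle with binary descriptors compared by Hamming distance; it is a simplified illustration, not Aurasma's actual algorithm, and the descriptor values and thresholds are invented for the example.

```python
def hamming(a, b):
    """Number of differing bits between two binary descriptors (ints)."""
    return bin(a ^ b).count("1")

def recognise(scene, marker, max_dist=2, min_matches=3):
    """Count marker descriptors that have a close match somewhere in the
    scene; the marker is 'recognised' once enough descriptors line up."""
    hits = sum(
        1 for m in marker
        if min(hamming(m, s) for s in scene) <= max_dist
    )
    return hits >= min_matches
```

In a real system the descriptors are extracted from image keypoints (e.g. by a detector such as ORB) and are hundreds of bits long, but the match-and-count logic is the same shape.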

UCS was one of the first institutions to launch a prospectus with embedded digital content. Prospective students could pick up a copy of the prospectus, download a free app and then hold their mobile device over certain pages of the prospectus (where indicated) and they would have videos play to give more information. An advertising video is available here.

The Elevate Team built on the work done with the Marketing Team on the prospectus and looked to engage more with the academic teams. This progressed into embedding videos in academic research posters; a number of these posters have been produced, and one won ‘best poster’ at an international radiography conference.


Where are we now?


We have been thinking about ways in which we can take this technology a little further and to really use it in a teaching and learning setting.

We want to enhance the student experience and engage with students’ cognitive processes, getting them to think. We have been developing small learning objects that can be used with mobile devices. Below is a video of a sample learning object that engages with students on a different level.



We no longer have a video that you simply watch and then move on from; we have included a multiple-choice question after the video.

As you can see in the video, an image on the poster triggers a video in which someone asks the viewer a question. The viewer makes a decision and answers, and the video responds: in this instance it tells you whether you are correct or incorrect and then plays a further video as feedback. A follow-up question then pops up, taking you to a form for text-based answers and feedback. This information is stored outside the app and is accessible to the quiz owner.

This example shows a single question-and-feedback loop, but it is just as easy to create multiple questions, building individual pathways through the material where required.
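A branching pathway like this is really just a small graph: each question names the step that follows each answer, so different answers lead down different routes. A minimal sketch of that structure (the question text, video filenames and node names are hypothetical examples, not content from our actual learning objects):

```python
# Each node is either a question (with per-answer destinations)
# or a feedback step (with a single next node).
quiz = {
    "start": {
        "question": "Is this the correct patient position?",
        "answers": {"Yes": "feedback_correct", "No": "feedback_wrong"},
    },
    "feedback_correct": {"video": "well_done.mp4", "next": "end"},
    "feedback_wrong": {"video": "try_again.mp4", "next": "start"},
}

def step(node, answer=None):
    """Return the id of the next node, given the current node and,
    for question nodes, the answer the viewer chose."""
    item = quiz[node]
    if "answers" in item:
        return item["answers"][answer]
    return item["next"]
```

Walking the graph with `step("start", "Yes")` reaches the correct-answer feedback, while a wrong answer loops the viewer back to the question after its feedback video.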

This is another tool in our armoury and again, learning design is king.
