Last week I ran a session on linking the physical to the digital with QR codes, augmented reality and NFC at a JISC RSC Northern event, ‘Emerging Technology for Learning’. While it could be considered technically and conceptually easy to link physical objects to digital resources, why and how you do so are important considerations in educational applications. The presentation led into a set of practical activities and a discussion around ensuring as good a user experience as possible when applying these technologies in education.

The slides are below, with slide-by-slide notes available on SlideShare; the notes are also copied as a transcript below.


Notes from the slides

This talk was part of an event on ‘Emerging Technology for Learning’. When making guesses about the future, we are in the territory of cliché (garlic bread is a reference to the Peter Kay line “Garlic bread, it’s the future, I’ve tasted it”). In the tech world, the brilliant but perhaps over-used vision of the future is the SixthSense TED talk, which shows one possible direction for linking physical to digital. So instead of emerging technology, this talk focuses on emerging practice. QR codes are not an emerging technology, but in education they are still emerging in practice.

One thing we can be certain of is that mobile will play a part in the future and will be an important conduit for linking physical to digital. While mobile learning is really about the mobility of the learner and allowing them to move seamlessly between contexts, mobile devices play an important role in this. Mobiles are personal devices that filter huge amounts of data from many different sources and present it in meaningful ways. At present, this data comes mainly from other people in an individual’s network. It can also come from objects that the user is interacting with via, for example, QR codes. In the future, it is likely that data will increasingly come from smart objects with sensors that broadcast information (the Internet of Things). By ‘knowing’ about people, networks, location and personal preferences, mobiles can filter ambient data and present it in real time at the right time and place, leading to augmented cognition.

QR codes

Until the Internet of Things arrives, interacting with physical objects requires user-initiated action, such as scanning a QR code. QR codes are an already established technology, but one mainly seen in the context of advertising and other non-educational uses: on cupcakes, cows and even gravestones. In education, we have a chance to reclaim this technology and do something more useful with it: to link physical resources to digital ones, to situate learning, to enhance an object, space or place and allow (self-)directed exploration, and to make learning more authentic, as in the examples given in the slides.
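As a concrete sketch of the mechanics (assuming Python and the third-party qrcode library, with a made-up resource URL), generating a code that links a physical label to a digital resource is only a few lines:

```python
# Minimal sketch: generate a QR code linking a physical label to a web resource.
# Assumes the third-party "qrcode" package (pip install qrcode[pil]); the URL is invented.
import qrcode

resource_url = "https://example.ac.uk/collections/exhibit-42"  # hypothetical digital resource

# Use a mid-level error-correction setting so the code still scans when printed
# small or slightly damaged on a physical label.
qr = qrcode.QRCode(error_correction=qrcode.constants.ERROR_CORRECT_M, box_size=10, border=4)
qr.add_data(resource_url)
qr.make(fit=True)

img = qr.make_image(fill_color="black", back_color="white")
img.save("exhibit-42-qr.png")  # print this image onto the label or sign
```

The technically easy part really is this easy; the harder questions, covered next, are about why and how people would actually scan it.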

When implementing QR codes, you are likely to meet with some apathy and reluctance, if not downright resistance. They are sometimes seen as a pointless technology because of the poor experience of using them. However, that is (mostly) a case of shooting the messenger: if we get the use right, they can be effective. How would you indicate that a physical resource is interactive? How would you provide guidance on how to interact with it? What is required to interact: connectivity, devices, software? Where and when can people interact? Why would people bother to scan it? What does the object link to? How would people know what to expect? What would the user experience be like for people who do scan it? Is it optimized for mobile devices? What issues are there for access and accessibility, authentication, data usage and cost, privacy…?

Augmented Reality

Augmented reality is often presented as Terminator vision – in this case, for rabbits to read email?! How can we use AR in education right now, and where might it go in the future? Aurasma allows for the creation of simple image-based AR from within a mobile device. Workshop participants tried this after the talk, as well as the Junaio creator.

Where I start to fall out with AR is navigation and POIs (points of interest). It is difficult to represent lots of information spatially in a useful way, and it takes careful thought not to overwhelm users. This has been done well at Exeter, where the AR campus map has filters that allow users to reduce the visual clutter to just what they are looking for – for example, available computers around campus. These are presented with live availability data overlaid on the camera view, and if users need directions, that is done through a traditional 2D map. An AR campus map has also been implemented at Sunderland (I’m playing to a local crowd here!).
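To make the filtering idea concrete, here is a rough sketch in Python (invented data and threshold; not how the Exeter app is actually built) of reducing a set of campus POIs to just the category a user has asked for, within range and with live availability:

```python
# Rough sketch: filter campus points of interest (POIs) before overlaying them in an
# AR view, so users only see the category they asked for (e.g. free computers nearby).
# The POI data and distance threshold are invented for illustration.
from math import radians, sin, cos, asin, sqrt

pois = [
    {"name": "Library cluster", "category": "computers", "free": 12, "lat": 50.7365, "lon": -3.5344},
    {"name": "Forum cafe",      "category": "catering",  "free": 0,  "lat": 50.7352, "lon": -3.5330},
    {"name": "Physics lab",     "category": "computers", "free": 3,  "lat": 50.7378, "lon": -3.5361},
]

def distance_m(lat1, lon1, lat2, lon2):
    """Approximate great-circle distance in metres (haversine formula)."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

def visible_pois(user_lat, user_lon, category, max_metres=500):
    """Keep only POIs in the chosen category, within range, with live availability."""
    return [
        p for p in pois
        if p["category"] == category
        and p["free"] > 0
        and distance_m(user_lat, user_lon, p["lat"], p["lon"]) <= max_metres
    ]

print(visible_pois(50.7360, -3.5340, "computers"))
```

The point is not the code but the principle: decide what the user actually needs before anything reaches the camera view.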

As with QR codes, implementing AR in practice is all about considering the user experience.

NFC and the Internet of Things

So far, NFC seems to be mainly about mobile wallets, and the Internet of Things has been described as being at the ‘Geocities of things’ stage. However, as with QR and AR, education has an opportunity to make wider use of these technologies.

Where AR relates to the Internet of Things is in providing a virtual interface to physical objects that don’t have one. For example, a lamp that you can remotely set an on/off timer for is unlikely to have a keyboard or other physical inputs, but AR can provide these as an interactive layer. AR also allows the huge amounts of data from such objects to be filtered and presented in context – for example, building schematics for a firefighter or, in education, learner analytics for staff and students.
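As a sketch of what that interactive layer might do behind the scenes, a tap on a virtual control could be translated into a simple web request to the object. Everything below (the lamp’s address, endpoint and payload) is hypothetical; a real device would define its own interface:

```python
# Hypothetical sketch: an AR overlay turns a tap on a virtual "set timer" control into
# an HTTP request to the lamp. The address and API here are invented for illustration.
import json
import urllib.request

LAMP_URL = "http://192.168.1.50/api/timer"  # hypothetical address of the lamp

def set_lamp_timer(on_at: str, off_at: str) -> None:
    """Ask the lamp to switch on and off at the given times (HH:MM, 24-hour)."""
    payload = json.dumps({"on_at": on_at, "off_at": off_at}).encode("utf-8")
    req = urllib.request.Request(
        LAMP_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=5) as resp:
        print("Lamp replied:", resp.status)

# Called when the user confirms the timer in the AR interface.
set_lamp_timer("18:30", "23:00")
```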

If, as educators, we don’t engage with this, then this could be the future: a rabbit that reads your email – as well as its own!
