Somehow the Heritage Jam run by the University of York has come around again and gone. As I outlined in this post, the Heritage Jam is an opportunity for people to get together and create new heritage visualisations relating to a specific theme. The theme this year was ‘Bones of Our Past’ – as I couldn’t be there in person, I decided to put something together for the online competition.
It turns out my entry won first place! I built something that I have wanted to experiment with for quite a while – an Augmented Reality application that allows you to take a real artefact (in this case a bone) and compare it to a virtual reference collection. Using your phone, you can augment a ‘virtual lab’ onto your kitchen table and then use the app to call up a number of different bones from different animals until you find one that matches.
The AR aspect adds something beyond the ‘normal’ online virtual reference collections, by allowing you to augment the models at the correct scale in front of you and then twist and turn each one side by side.
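For anyone curious how the model-swapping works under the hood: the prototype was put together in Unity, but the core idea is simple enough to sketch in native Swift/SceneKit instead – keep a list of reference models built at real-world (metre) scale and swap whichever one is attached to the table anchor. The class and file names below are purely illustrative, not the actual app code.

```swift
import SceneKit

// Illustrative sketch only – the real Jam prototype was built in Unity.
final class ReferenceCollection {
    // Hypothetical reference models bundled with the app, one per species,
    // each modelled so that 1 SceneKit unit = 1 metre (true scale).
    private let boneModels = ["deer_metacarpal", "sheep_metacarpal", "cow_metacarpal"]
    private var currentIndex = 0
    private var currentNode: SCNNode?

    /// Attach the current reference bone to the node created for the detected
    /// table-top plane, replacing whatever was shown before.
    func showCurrentBone(on tableAnchorNode: SCNNode) {
        currentNode?.removeFromParentNode()
        guard let scene = SCNScene(named: boneModels[currentIndex] + ".scn") else { return }
        let bone = scene.rootNode.clone()
        tableAnchorNode.addChildNode(bone)
        currentNode = bone
    }

    /// Cycle to the next species in the virtual reference collection.
    func showNextBone(on tableAnchorNode: SCNNode) {
        currentIndex = (currentIndex + 1) % boneModels.count
        showCurrentBone(on: tableAnchorNode)
    }
}
```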
In addition, as I am interested in multi-sensory things, I also added the sounds and smells of the animals, as well as a virtual portal into a 360-degree video of a deer herd in action.
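The smells need dedicated hardware (the Dead Man’s Nose, which crops up again further down this page), but the sounds are much easier: attach a positional audio source to the augmented model so the animal call appears to come from the bone itself. A rough SceneKit sketch, with a placeholder audio file name:

```swift
import SceneKit

/// Attach a looping, positional animal call to a node in the AR scene so the
/// sound is spatialised – it gets louder and pans as you walk around the model.
/// "deer_call.mp3" is a placeholder asset name, not the file used in the app.
func addAmbientCall(to node: SCNNode) {
    guard let source = SCNAudioSource(fileNamed: "deer_call.mp3") else { return }
    source.loops = true
    source.isPositional = true
    source.load()
    node.addAudioPlayer(SCNAudioPlayer(source: source))
}
```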
Finally, it has a link through to a set of Open Data from Open Context showing where else in the world similar types of bones have been found.
You can watch the visualisation and read the full explanation here: http://www.heritagejam.org/new-blog/2017/10/27/the-artefactkit-stu-eve
As with all of these ‘jam’ projects, the app is just a prototype and is quite messy in terms of overall look and feel – but I think it has potential to be quite useful. Now I just need some funding!
For the last three years I have had the absolute privilege of being one of the archaeological directors of the current excavations of the battlefield of Waterloo. As part of the incredible project Waterloo Uncovered (http://www.waterloouncovered.com), we have been taking wounded serving and veteran soldiers, students and professional archaeologists to the battlefield to conduct the first major systematic excavation of parts of the battlefield that shaped the history of Europe in 1815.
We only have two weeks in the field each year, which means there is not a lot of time to do anything but excavate, record and backfill (see the dig diaries and videos of all we got up to here). However, this year I managed to find the final afternoon to play with the new Apple ARKit and see what potential there is for archaeological sites.
The short answer is that there is a lot of potential! I have discussed Augmented Reality and archaeology to the nth degree on this blog and in other places (see here for a round-up) – but with the beta release of ARKit as an integrated part of iOS 11, Apple may have provided the key to making AR more accessible and easier to deploy. I tried out two experiments using some of the data we have accrued over the excavations. Sadly I didn’t have any time to finesse the apps – but hopefully they should give a hint of what could be done given more time and money (ahem, any prospective post-doc funders – my contact details are on the right).
Exploring the lost gardens of Hougoumont
The first video shows a very early experiment in visualising the lost gardens of Hougoumont. The farm and gardens at Hougoumont were famously defended by the Allied forces during the battle of Waterloo (18th June 1815). Hougoumont at the time was rather fancy, with a chateau building, large farm buildings and also a formal walled garden, laid out in the Flemish style. One of the participants this year, WO2 Rachel Willis, is currently in the process of leaving the army and studying horticulture at the Royal Horticultural Society. She was very excited to look at the garden and to see if it was possible to recreate the layout – and perhaps even at some point start replanting the garden. To that end she launched herself into the written accounts and contemporary drawings of Hougoumont, and we visited a local garden that was set out in a similar fashion. Rachel is in the process of colouring and drawing a series of Charlie Dimmock style renditions of the garden plans for us to work from – but more on that in the future.
Similar gardens at Gaasbeek Castle
Extract from Wm. Siborne’s survey of the gardens at Hougoumont
As a very first stab at seeing what we might be able to do in the future, I quickly loaded one of Rachel’s first sketches into Unity and put in a few bushes and a covered walkway. I then did some ARKit magic, mainly by following the tutorials here, here, and here. Bear in mind that at the time of writing, ARKit is in beta testing, which means you need to install the Xcode beta, sign up for and install the iOS 11 beta on the iPhone, and also run the latest beta version of Unity. It is firmly at the bleeding edge and not for the faint-hearted! However, those tutorial links should get you through fine, and we should only have to wait a few months until it is publicly released. The results of the garden experiment are below:
As can be seen, ARKit makes it very simple to place objects directly into the landscape OUTSIDE – something that has previously only really been possible reliably using a marker-based AR plugin (such as Vuforia). Being able to reliably place AR objects outside (in bright sunshine) has been something of a holy grail for archaeologists, as unsurprisingly we often work outside. I decided to use a ‘portal’ approach to display the AR content, as I think for the time being it gives the impression of looking through into the past – and gives an understandable frame to the AR content. More practically, it also means it is harder to see the fudged edges where the AR content doesn’t quite line up with the real world! It needs a lot of work to tidy it up and make it prettier, but it is not bad for a first attempt – and the potential for using this system for archaeological reconstructions goes without saying! Of course, as it is native to iOS and there is a Unity plugin, it will fit nicely with the smell and sound aspects of the embodied GIS – see the garden, hear the bees and smell the flowers!
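For those wondering what the ‘ARKit magic’ actually amounts to: the Unity plugin wraps the same native session API, and the bit that makes reliable outdoor placement possible is world tracking with plane detection, with the content parented to a detected anchor. A minimal native Swift sketch of that part is below – illustrative only, as the Hougoumont demo itself was built through the Unity plugin, and ‘garden.scn’ is a placeholder file name.

```swift
import UIKit
import ARKit
import SceneKit

// Minimal illustration of ARKit plane detection and anchoring (not the Unity-based demo code).
final class PortalViewController: UIViewController, ARSCNViewDelegate {
    private let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)

        // World tracking plus horizontal plane detection is what keeps the
        // content locked to the ground outside, with no printed marker needed.
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = .horizontal
        sceneView.session.run(config)
    }

    // Called when ARKit finds a new plane: hang the reconstruction off the anchor.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor,
              let garden = SCNScene(named: "garden.scn")?.rootNode.clone() else { return }
        node.addChildNode(garden)
    }
}
```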
Visualising Old Excavation Trenches
Another problem we archaeologists have is that it is very dangerous to leave big holes open all over the place, especially in places frequented by tourists and the public, like Hougoumont. However, ARKit might be able to help us out there too. This video shows this year’s backfilled trenches at Hougoumont (very neatly done, but you can still just see the slightly darker patches of the re-laid wood chip).
Using the same idea of the portal into the garden, I have overlaid the 3D model of one of our previous trenches in its correct geographic location and at the correct scale, allowing you to virtually re-excavate the trench and see the foundations of the buildings underneath, along with a culverted drain that we found in 2016. It lines up very well with the rest of the buildings in the courtyard and will certainly help with understanding the further foundation remains we uncovered in 2017. Again, it needs texturing, cleaning and a bit of lighting, but this has massive potential as a tool for archaeologists in the field, as we can now overlay any type of geolocated information onto the real world. This might be geophysical data, finds scatter plots or, as I have shown, 3D models of the trenches themselves.
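The ‘correct geographic location’ part is mostly coordinate arithmetic: convert the difference between the AR session’s starting position and the surveyed trench coordinates into a local east/north offset in metres. A hedged sketch of that conversion is below – a flat-earth approximation that is perfectly adequate over a courtyard, and it assumes the session is run with gravityAndHeading world alignment (so the axes line up with compass directions); the coordinates you feed in would come from the site survey.

```swift
import Foundation
import simd

/// Convert the offset from the session origin (deviceLat/Lon) to a surveyed
/// point (targetLat/Lon) into an ARKit position in metres. With
/// .gravityAndHeading alignment, +x points east and -z points north.
func localOffset(deviceLat: Double, deviceLon: Double,
                 targetLat: Double, targetLon: Double) -> SIMD3<Float> {
    let metresPerDegreeLat = 111_320.0
    let metresPerDegreeLon = 111_320.0 * cos(deviceLat * .pi / 180)
    let north = (targetLat - deviceLat) * metresPerDegreeLat
    let east  = (targetLon - deviceLon) * metresPerDegreeLon
    return SIMD3<Float>(Float(east), 0, Float(-north))
}
```

That offset can then be written straight into the position of the node holding the trench model.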
These are just very initial experiments, but I for one am looking forward to seeing where this all goes. Watch this space!
I recently attended the CAAUK 2016 meeting in Leicester, a great couple of days with a few really interesting papers.
As usual, the rather excellent Doug Rocks-Macqueen was on hand to record the talks. His videos can be found here – he records all sorts of diverse archaeological conferences, so it is well worth clicking the subscribe button on his account.
In case anyone is interested, I have embedded the video of my talk below – where I discuss the Embodied GIS, using examples from my previous research including Voices Recognition and the Dead Man’s Nose.
As an archaeologist, I’m used to reporting old news, and this is pretty old – however, it might be of interest.
In 2014 and 2015 I participated in the University of York’s Heritage Jam. The Heritage Jam is a really excellent initiative, bringing together an eclectic group of archaeologists, gamers, makers and heritage specialists to hack together a project within two days of intense work, locked in a small room. For those not able to travel, there is also the option to participate online.
Heritage Jam Logo
As well as the final prototype, each team is expected to produce a paradata document that outlines the motivations behind the project and also expands a little on the methods and technologies used. The intense session really pays dividends, and being locked in a room focuses the mind to get a lot done – without the constant distractions of the real world.
In 2014, my team won first prize with our ‘Voices Recognition’ project, which explored the auralisation of a cemetery in York, and in 2015 I was awarded Highly Commended for my individual entry, the Dead Man’s Nose – a device which I developed and built, and which I use to deliver smells in situ while investigating archaeological sites. I used it to explore the olfactory landscape of the Moesgaard Museum Archaeological Trail (Denmark) – a link to the video and paradata is here.
I have just submitted a guest blog post on the American Schools of Oriental Research (ASOR) blog for their ongoing special series on Archaeology in the Digital Age. It’s an introduction to Augmented Reality for Archaeology and also includes some sneak peeks of the results of some of my own AR fieldwork on Bodmin Moor. The original post can be found at http://asorblog.org/?p=4707.
Last month I entered a poster into the UCL Graduate School Poster Competition and was lucky enough to win first prize. I find conference posters a bit of a strange animal. The poster session always seems to take place over lunchtime or the coffee break, and more often than not the person who made the poster isn’t around to talk you through it. You are then usually left with a poster with masses of text that either has too much detail or not enough, and the whole thing quickly gets boring.
I wanted to challenge this a little bit, and as my poster subject was my work with AR, I was provided with the perfect opportunity. The poster was a pretty simple (but hopefully striking) design, a pair of old school binoculars looking at some rocks on Leskernick Hill, Bodmin Moor. The area within the binoculars shows some roundhouses – giving the impression that looking through the bins reveals the ancient landscape.
My winning poster
I tried to keep the text to a bare minimum so that the poster was dominated by the binoculars. However, this being an AR project, there was a bit of a twist. Using the Junaio API I augmented the poster with a video that overlaid the whole thing when viewed through a smartphone or tablet. The video showed the binoculars moving around the poster, revealing more of the roundhouses.
I am increasingly finding that the best way to explain AR is to give someone an example of it. It was a bit of a gamble, as in order to see the AR content the viewer needed to have a smartphone, have an app to scan the QR code on the poster and have good enough internet access to install and run the Junaio app. The main judge of the competition wasn’t at the prize-giving, so I didn’t get any feedback or a chance to ask if they had seen the AR content, but they awarded it first prize so I hope they did!
I am of course not the first person to use AR in a poster, but I am sure that it will become a lot more popular, as it really is an excellent way of adding content to a poster without being too intrusive. I guess at the moment it could be seen as being a little gimmicky, but that isn’t all bad when trying to attract people to your poster and your research. One of the important things to remember, though, is that the poster needs to be able to stand on its own without the AR content, as it is quite an ask at the moment to get people to download an app on their phone just to learn more about your research.
The process of adding the content via the Junaio app also wasn’t quite as easy as I had hoped, mainly because the video itself had to be made into a 3D object, be of very low quality and be in a special .3g2 format so that it could be delivered quickly to a mobile device. You immediately lose your audience if they have to wait two minutes for your content to download, and the .3g2 format was specifically designed to look OK on a smartphone screen while being small enough to download quickly. However, as you can see from the video above, the quality is pretty poor. I created the animation using 3D Studio Max and rendered it out to a series of TIFFs. I then used ffmpeg to turn the TIFFs into a video and encode it into the .3g2 format.

The Junaio developer website has instructions for how to do all of this, but it is not really for the faint of heart. Junaio provides a number of sample PHP scripts that can be run on your own server to deliver the content, and their troubleshooting process is really excellent. So if you have your own webserver and are happy tweaking some scripts, then you can do some really quite nice stuff. I should note that they also have a simple upload interface for creating simple AR ‘channels’, which is a great way of quickly getting things up there – but it doesn’t allow you total control or let you upload videos. But if you just want to pop a simple 3D model on your conference poster, then the Junaio Channel Creator is the tool for you! The other thing to remember if you want to augment your own conference poster is that channels can take up to a week to be approved by Junaio, so you can’t leave it all to the last minute!
I suspect we will be seeing many more AR-enabled conference posters, particularly as AR booklets, magazines and museum guides become more popular. One can envisage holographic-style projections of people standing beside their posters talking viewers through them, or interactive posters where the content changes depending on what and where you touch. As I keep coming back to on this blog, it is the melding of the paper with the digital that I find so fascinating about AR: the ability to re-purpose old ideas (such as the conference poster) and breathe new life into the concept, without losing the original purpose and feel of the thing itself. The design of the paper poster stands on its own (for better or worse!) and the AR content just gives the creator the chance to provide further information and give the viewer that extra dimension into their research.
The following video shows something that I have been working on as a prototype for a larger landscape AR project.
As you can see, by using the Qualcomm AR SDK and Unity3D it is possible to augment some quite complex virtual objects and information onto the model Roman fort. I really like this application, as all I have done is take a book that you can buy at any heritage site (in the UK at least) and simply change the baseboard design so that the extra content can be experienced. Obviously there was quite a lot of coding and 3D modelling behind the scenes, but from a user’s point of view the AR content is very easy to see – simply print out the new baseboard, stick it on and load up the app.
For me that is one of the beautiful things about AR, you still have the real world, you still have the real fort that you have made and can play with it whether or not you have an iPad or Android tablet or what-have-you. All the AR does is augment that experience and allow you to play around with virtual soldiers or peasants or horses instead of using static model ones. It also opens up all sorts of possibilities for adding explanations of building types, a view into the day-to-day activities in a fort, or even for telling stories and acting out historical scenarios.
The relative ease of the deployment of the system (now that I have the code for the app figured out!) means this type of approach could be rolled out in all sorts of different situations. Some of my favourite things in museums, for instance, are the old-school dioramas and scale-models. The skill and craftsmanship of the original model will remain, but it could be augmented by the use of the app – and made to come alive.
The model of Housesteads fort in the Housesteads museum
The same is true of modern-day prototyping models or architectural models. As humans we are used to looking at models of things, and we want to be able to touch them and move them around. Manipulating them on a computer screen just doesn’t somehow seem quite right. But the ability to combine the virtual data with the manipulation and movement of the real-life model gives us a unique and enhanced viewpoint, and can also allow us to visualise new buildings or existing buildings in new ways.
A particularly important consideration when creating AR content is to ensure that it looks as believable or ‘real’ as possible. The human eye is very good at noticing things that seem out of the ordinary or don’t feel quite right. One of the main ways to help create a believable AR experience is to ensure the real world occludes the virtual objects – that is, the virtual content can be seen to move behind the real-world objects (such as the soldiers walking through the model gateway). It should also be possible to interact with the real-world objects and have that affect the virtual content (such as touching one of the buildings and making the labels appear). This will become particularly important as I move on to rolling the system out into a landscape instead of just a scale model. As I augment the real world with virtual objects, those objects have to interact with the real world as if they are part of it – otherwise too many Breaks in Presence will occur and the value of the AR content is diminished. An accurate 3D model of the real world is quite a bit harder to create than one of a paper fort, but if I can pull it off, the results promise to be quite a bit more impressive…
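For what it is worth, the occlusion described above is usually achieved with an invisible ‘occluder’ copy of the real-world geometry: it still writes to the depth buffer, but draws nothing to the screen, so virtual soldiers vanish as they pass behind the real gateway. In Unity this is typically done with a depth-mask shader; the same idea in SceneKit, purely as an illustration, looks like this:

```swift
import SceneKit

/// Turn a node that duplicates a piece of real-world geometry (e.g. a model of
/// the fort's gateway) into an invisible occluder: it still occupies depth, so
/// virtual content behind it is hidden, but it renders no visible pixels itself.
func makeOccluder(_ node: SCNNode) {
    let mask = SCNMaterial()
    mask.colorBufferWriteMask = []      // write depth only, no colour
    node.enumerateHierarchy { child, _ in
        child.geometry?.materials = [mask]
    }
    node.renderingOrder = -1            // draw the occluder before the virtual content
}
```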
Recently I have been working away in the Unity gaming engine, using it to make some Augmented Reality applications for the iPhone and iPad. It is surprisingly successful, and with at least three different ways of getting 3D content to overlay on the iOS video feed (Qualcomm, StringAR and UART), the workflow is more open than ever. I have been attempting to load 3D content at runtime, so that dynamic situations can be created as a result of user interaction – rather than having to have all of the resources (3D models, etc.) pre-loaded into the app. This not only saves on the file size of the app, it also means that the app can pull in real-time information and data that can be changed by many people at once. However, in order to do that I needed some kind of back-end database…
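In Unity this kind of runtime loading is usually handled with asset bundles or downloads from a server, but the general pattern is easy to sketch: fetch the model data only when it is needed and add it to the scene once it arrives. An illustrative Swift/SceneKit version is below, with a placeholder URL standing in for the real back end.

```swift
import Foundation
import SceneKit

/// Fetch a 3D model from a remote server at runtime and attach it to the scene,
/// so the app ships without the model baked in. The URL is a placeholder – in
/// the real project the content comes from the ARK back end described below.
func loadRemoteModel(into parent: SCNNode) {
    guard let url = URL(string: "https://example.org/models/context_1234.scn") else { return }
    URLSession.shared.dataTask(with: url) { data, _, error in
        guard let data = data, error == nil,
              let scene = SCNSceneSource(data: data, options: nil)?.scene(options: nil)
        else { return }
        DispatchQueue.main.async {
            parent.addChildNode(scene.rootNode.clone())
        }
    }.resume()
}
```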
For those of you who know me, you will know that as well as doing my PhD I work on the development of the open-source archaeological database system known as the Archaeological Recording Kit (ARK). It seemed like a logical step to combine these two projects and use ARK as the back-end database. So that is what I did, and at the same time I created a rudimentary AR interface to ARK. The preliminary results can be seen in the video below:
This example uses the Qualcomm AR API and ARK v1.0. Obviously at the moment it is marker-based AR (or at least image-recognition based); the next task is to incorporate the iDevice’s gyroscope to enable the AR experience to continue even when the QR code is not visible.
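The gyroscope fallback would work roughly as sketched below: while the image target is visible the pose comes from the AR SDK, and when it drops out the device’s motion sensors keep the virtual camera approximately oriented until the marker is re-acquired. This is an illustrative CoreMotion snippet, not part of the Qualcomm API.

```swift
import CoreMotion

/// Stream device attitude (roll/pitch/yaw) so the virtual camera can keep
/// turning with the device while the QR/image target is out of view.
final class AttitudeFallback {
    private let motion = CMMotionManager()

    func start(_ handler: @escaping (CMAttitude) -> Void) {
        guard motion.isDeviceMotionAvailable else { return }
        motion.deviceMotionUpdateInterval = 1.0 / 60.0   // 60 Hz updates
        motion.startDeviceMotionUpdates(to: .main) { data, _ in
            if let attitude = data?.attitude { handler(attitude) }
        }
    }

    func stop() { motion.stopDeviceMotionUpdates() }
}
```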