About
Dead Men’s Eyes is a project that investigates the use of Augmented Reality within archaeological practice. The system is built using a smartphone, an Arduino microcontroller and a Unity3D application, and it brings the sights, sounds and smells of the past directly into the present – into the very place in which they happened. This site contains a number of how-tos, blog posts and other articles that detail the process and show where I am up to with the development.
The name is inspired by M.R. James’s short story ‘A View from a Hill’, in which a man uses a pair of old binoculars to view grisly episodes from ages past.
LATEST POST:
- The ARtefactKit – Heritage Jam 2017 Winner - Somehow the Heritage Jam run by the University of York has come around again and gone. As I outlined in this post, the Heritage Jam is an opportunity for people to get together and create new heritage visualisations relating to a specific theme. The theme this year was ‘Bones of Our Past’ – as I couldn’t […]
SOME INTERESTING POSTS:
- ARkit and Archaeology – Hougoumont Farm, Waterloo - For the last three years I have had the absolute privilege of being one of the archaeological directors of the current excavations on the battlefield of Waterloo. As part of the incredible Waterloo Uncovered project (http://www.waterloouncovered.com), we have been taking wounded serving and veteran soldiers, students and professional archaeologists to the battlefield to conduct […]
- Pokémon Go Home – Why Pokémon is not what the heritage sector really needs - Gently I edged toward the beast. It had 4 long semi-transparent wings, the same length as its tube-like body. The body was iridescent in the light, changing colour through blue to green. Its face was incredibly ugly… a mixture of large bug-like eyes on the side of its head and a gaping mouth filled with […]
- CAAUK 2016 – Embodied GIS and applied Multi-Sensory Archaeology - I recently attended the CAAUK 2016 meeting in Leicester, a great couple of days with a few really interesting papers. As usual, the rather excellent Doug Rocks-Macqueen was on hand to record the talks. His videos can be found here – he records all sorts of diverse archaeological conferences, so it is well worth clicking […]
- Heritage Jamming - As an archaeologist, I’m used to reporting old news, and this is pretty old – however, it might be of interest. In 2014 and 2015 I participated in the University of York’s Heritage Jam. The Heritage Jam is a really excellent initiative, bringing together an eclectic group of archaeologists, gamers, makers and heritage specialists to hack […]
- Dead Men’s Eyes Stories, Podcasts and Seminars - I have been presenting quite a bit lately about the Dead Men’s Eyes project and it has also been picked up by a couple of the mainstream press outlets – so I thought I should put together a page that has links to more information about the project. Hopefully this list will grow over the […]
- Surfing the Hypegeist - This post is written as part of the Call for Papers over at ThenDig, looking at Zeitgeist in archaeological research and how to follow it, keep up with it, or create it. As will be clear from the previous posts on my blog, I am interested in using Mixed and Augmented Reality to aid in […]
- Guest Blog on ASOR - I have just submitted a guest blog post on the American Schools of Oriental Research (ASOR) blog for their ongoing special series on Archaeology in the Digital Age. It’s an introduction to Augmented Reality for Archaeology and also includes some sneak peeks of the results of some of my own AR fieldwork on Bodmin Moor. […]
- Learning by Doing – Archaeometallurgy - This post will be a little off my normal topics, in that there will be no augmented reality and no computers (although I did make some nice 3D models that I’ll link to later). It is about technology, but mostly about prehistoric technology. I have spent the last four days on a prehistoric metallurgy weekend, run […]
- Archaeology, GIS and Smell (and Arduinos) - I have had quite a few requests for a continuation of my how-to series, for getting GIS data into an augmented reality environment and for creating an embodied GIS. I promise I will get back to the how-tos very soon, but first I wanted to share something else that I have been experimenting with. Most augmented […]
- Embodied GIS HowTo: Part 1a – Creating RTIs Using Blender (an aside) - This is a bit of an aside in the HowTo series, but nevertheless it should be a useful little tutorial and as I was given a lot of help during the process it is only right to give something back to the community! So this HowTo shows you how to take the 3D model you […]
Hi Stu – thanks for the heads-up. I’ve tweeted about your work! I look forward to seeing what you get up to – and am v. jealous that you’ll be working with the folks at CASA!
Shawn
Stu,
I was given the link to your blog by a friend of mine who came across it; she sent it my way as I have been working with some similar concepts, using game engines and archaeological data to bring sites to life. I presented a paper in January at the SHA conference about how to use this technology as a research tool, as well as its more obvious uses as a “people pleaser”. I have been using CryEngine 3 because of its more advanced capabilities; although you are using Unity, this program might be worth checking out, and it is free as well. I have had great success with this and other programs when reconstructing a POW camp located in Manitoba, Canada. Although far from the Bronze Age, I used many of the same concepts; I was not, however, able to have GIS data correspond accurately with the models, which is a big step. All in all I like your work and wish you the best of luck with your dissertation.
Josh Allen
@Josh Allen
Dear Josh,
Thanks for the comment, and sorry for taking so long to respond. CryEngine 3 looks great (I used CryEngine 2 to do some experimenting with Verulamium a few years back – https://vimeo.com/dataanarchist/videos). It’s a beautiful system… but unfortunately no one has made the move to enable an AR view of the engine. That is the main reason I am using Unity – because it has great AR integration.
In terms of getting the GIS data to overlay properly, I have been doing quite a bit of work on that recently (in Unity at least) and will be putting up a post on it very soon – so watch this space.
I would love to see some of your stuff – do you have any links to it anywhere?
Hi Stu!
Wow, so we seem to be doing almost the exact same dissertation work (integrated active AR in archaeology, rather than passive) half a world away (my site is in Mexico). Your “Augmented Reality, a New Horizon in Archaeology” article literally hits on all the major points I was shooting for, while using the same software! I guess great minds think alike and we both saw the digital data horizon/layer (I’m calling it the Digital Heritage Horizon or DH^2) as an integrative platform. It’s truly an amazing tool, with applications not only for live, active archaeologists – it also segues the content we generate from data collection into public, user-facing outputs almost seamlessly.
With that being said, I’m also looking at using CryEngine 3 (haha) in a custom heritage AR app, in addition to mixing in some kernels I’ve been lucky enough to use. I’m really excited about your solutions to GIS data overlay because I’ve had to use some roundabout methods to get my data to stick. Anyway, I’m glad I found your blog; I’ll stalk it frequently to see what’s happening on your side of the pond and help where I can. Who knows, maybe we could collaborate on something.
Technoarchaeologists unite.
Hi. A consultant I am using directed me to your video “Augmenting a Roman Fort” because I have just taken on the responsibility for running a Roman Fort in Coventry. The consultant is helping me with a Feasibility Study to develop the site and, in conjunction with their work, I am currently planning to hold a hackathon in September at the site.
Your work looks exactly the sort of thing I am looking for. Would you be interested in having a chat and maybe getting involved?
Gary
Hi Stuart
I am a researcher at the Interactive Institute Swedish ICT in Gothenburg, Sweden. We are currently developing an augmented reality tourist binocular station that will overlook the Vitlycke rock carving sites (and the surrounding World Heritage area). I am reading your posts and articles with great interest as we are using similar technology (such as Unity) in our early prototypes. The Oculus Rift is also proving very valuable for prototyping different kinds of visualisations.
Our development blog is currently not public, but I can notify you when we open it up. If you want to know more you can visit https://www.tii.se/projects/bronze-age-binoculars, but currently most of the press links are in Swedish. If you have any questions, don’t hesitate to ask as we aim to be as transparent as possible in our development process.
Nice work! In the image above, we see that real rocks are occluding a virtual hut behind (on the right side). Is that in real time or has it been photoshopped? If it’s in real-time, are you using a 3D model of the area (Lidar?).
Thanks a lot!
Vincent
Hi Vincent – that is just a mock-up, but I have had quite a bit of success using a similar approach in the field. I have been using a 3D model of the landscape to occlude the houses – if you check out this talk I show a few examples (https://www.youtube.com/watch?feature=player_detailpage&v=3OdVSGiU9lQ#t=1750). There are still lots of issues with dodgy GPS positions and also the need for a calibration stage (to ensure the 3D model is aligned properly with the real landscape), but I think as the technology improves (for example, speedier mobile processors using SLAM-based tracking) the need for calibration might diminish.
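(For anyone wanting to try the same trick in Unity: the occlusion can be done by putting a ‘depth mask’ material on the landscape mesh, so that it writes depth but no colour and the camera feed shows through it. The snippet below is only a minimal sketch of that standard approach – the shader name and queue offset are arbitrary, and it isn’t the exact shader I use.)

    // Minimal ShaderLab sketch of a "depth mask" occluder for AR in Unity.
    // Apply this via a material on the terrain/landscape model.
    Shader "Custom/TerrainOcclusionMask" {
        SubShader {
            // Draw before ordinary geometry so the depth buffer is filled first
            Tags { "Queue" = "Geometry-10" }
            Pass {
                ColorMask 0   // write nothing to the colour buffer (the video feed stays visible)
                ZWrite On     // but do write depth, so virtual buildings behind the terrain are hidden
            }
        }
    }

With the landscape rendered that way, anything placed behind a real ridge or outcrop simply disappears – provided, of course, that the model is aligned with the real terrain, which is exactly where the calibration and GPS issues come in.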
Are you working on something similar?
Hi Stuart,
You may be interested to know that a project to marry synchronously realised, animated virtual images with an archaeological site, live and in the field, was carried out by myself (the original idea was mine) and a technical team at IBM Research and Development, who were contracted, along with myself, by the Flemish Archaeological Department.
The project was funded and executed in the mid-late nineties.
The work was carried out in conjunction with the archaeological team at Ename, Belgium, under the guidance of archaeologist Dirk Callebaut and his then Head of Department, Guy de Boo. Ename is a small village adjacent to the town of Oudenarde on the banks of the River Scheldt. It was the site of an early medieval abbey.
My original concept prototype was named the ‘Time-Frame’. The idea first came to light at a conference in Israel in 1993, when I, along with Dr Peter Addyman, the then Director of York Archaeological Trust and also a speaker, was attending an international archaeological congress whose theme was ‘Ways of Interpretation of the Past’.
I recall being on a visit to a late Roman site along with the rest of our group, viewing the chaotic site laid low by earthquakes, and thinking, “there must be a way for visitors to view this as it was – here on the site, live and from various vantage points – not via interpretive graphics on boards, but live, here and now!”
It was a Eureka moment which put a huge and overactive bee in my bonnet, one which I was determined to realise. The opportunity to see the concept come to fruition came later in the decade, when I was contracted to design a museum and on-site interpretation at Ename.
I was employed as a freelance interpretive design consultant at Ename during the nineties and had been toying with the idea of synchronous live overlays on site since my Israel trip.
(FYI I was the original Project Designer of the Jorvik Viking Centre in York which opened in 1984, and thereafter around the world.)
The head archaeologist at Ename, Dirk Callebaut, had also attended the congress in Israel and had heard my ideas while there. Soon afterwards I was contracted by him and his department to work at Ename, and although it was outside my main brief, I suggested we might work together to develop the Time-Frame system. (All this and much more is documented.)
Once the system was developed and in daily use on site, opportunities to speak about it arose. In the early 2000s I addressed the World Archaeological Congress in Washington along with Dr. Douglas Comer (ICOMOS). We also spoke as guests at the National Geographic office in Washington about the potential of in-the-field virtual overlays for the public and educational interpretation of certain classes of sites.
In principle, what we set out to achieve at Ename (and did) was a permanent on-site virtual overlay interpretation system, which combined a streaming video signal, fed via computer to a monitor housed in an open shelter, with 3D reconstructive virtual modelling sequences overlaid exactly onto sections of the site. These live images of the site in front of the viewer were synchronously married with the reconstructions. The object was to present an animated reconstruction of the site as it had been in its heyday, along with many other associated on-command interpretive sequences.
The camera was installed directly above the shelter; the compensation required for parallax in the prescribed views was allowed for in the modelling of the reconstructed buildings and the site, so that the geometry married exactly. The effect of re-animating the past was magical and unforgettable, and the unit, once opened on the site in the late nineties, was an immediate big hit.
For members of the public, witnessing the abbey actually rising up from the still-ongoing dig was an extraordinary experience. It simply enabled everyone to understand what had once existed there.
The ‘Time-Frame’, housed in the on-site shelter, was under the command of the observer. There would usually be one person manipulating the touch-screen controls on the large monitor for sharing with the group that had gathered, and we had other slave monitors hung in the shelter.
Because the screen was lined up directly with the view of interest, it was a very close match to what you are achieving with the iPad app, from what little I have seen. The difference is that the Ename Time-Frame (later renamed the TimeScope by Ename) was fixed in one location – though of course installations doing the same, and more advanced, interpretive jobs could have been replicated at other sites.
What made the on-command sequences so powerful was that they appeared live, in real time and in the open. These two factors I found to be essential, much more powerful than using previously recorded images. I am sure that will be a key factor in your app.
Frustratingly, we did our work twenty years ago and of course didn’t have the technology you are using now; however, I did envisage the use of mobile systems and location locking via GPS, so it was obvious then that we were a little ahead of our time.
However, I knew then we were pioneering applications which would soon enough see the light of day.
One outcome of this work was the drafting of a proposal to ICOMOS concerning how the authenticity of virtual-world reconstructions might be ensured in the future (I can give details of this if you are interested). I drafted this, and it was submitted to ICOMOS through Ename.
I must say I am thrilled to see this wonderful idea re-appear, and I wish you all the very best with your endeavours.
John Sunderland
Hey Stu,
Just wanted to let you know that we think “Dead Men’s Eyes” is such a great project that we had to mention it in our latest blog post.
We know you are busy, so there’s no need to reply. But if you get a spare moment, check out the post. If you like it, feel free to share it on social media and mention it on your website.
Oops! Forgot to add my blog address (if anyone’s interested):
http://southdownhill.wordpress.com