This is something I posted on the Facebook group The Enterprise Shed back in August. It was originally written for the ‘shedders’ who took part in the Enterprise Shed: Making Ideas Happen MOOC, created by Newcastle University, which is why it talks about ideas so much. The original Facebook post is here.
It occurred to me this morning that augmented reality allows you to see and interact with things that are not there. This came about because I was looking out the window and saw the moon, low in the sky, lit by reflected sunlight. What I also saw though was ‘floaters’ and other artefacts of eyesight, as well as telegraph poles and some windows in need of a clean! I realised then that there was the reality of the moon, and the ‘added’ reality of the floaters, a squashed bug and some mucky windows – initially I had ‘seen through and past’ the windows to see the moon, but refocusing brought the windows and my own eyes into this expanded reality.
I think that has some relevance to augmented reality (AR), and it led me to a few ideas about what people might be able to do with AR:
- Simulate what someone with various vision ailments is seeing: colour blindness, astigmatism, macular degeneration, short and long sight, etc. This would be very helpful to health care professionals in appreciating what people with those conditions go through.
- If you’ve accurately mapped what that person is seeing, you could overlay the ‘inverse’ onto the AR device to cancel out the condition and possibly improve that person’s quality of life.
- Look below the horizon (i.e. move your head down and ‘look’ through the earth) to see planets, satellites, etc. that are outside your field of view or haven’t risen yet (if at all)
- Extending that idea somewhat, increase the virtual magnification of the AR device to zoom in on details of interest (for example, I would use this to look in more detail at planets) – there’s no reason the feed can’t come from real-time telescopes
- Instead of looking out, look inwards, through the earth. Go down through strata seeing soil, pipes, cables, sewers etc until you get to bedrock (but why stop there?)
- The medical uses are dramatic – today’s diagnostic devices are churning out giga- and terabytes of information, and being able to ‘navigate’ this torrent of data is essential. An HCP (health care professional – notice I don’t use ‘doctor’; this stuff is going to be used by everyone) wearing an AR device could look up and down your leg, see the fracture (in very great detail) and be able to do something about it. Nothing too dramatic there, but suppose they were able to ‘see’ blood flows, blockages etc. in the limbs?…
- One of the vital things HCPs are taught is communication skills – for example, don’t talk to the screen when you have a patient in front of you. Someone wearing an AR device could be seeing patient data in their peripheral vision while keeping eye contact with the patient. An ER team might get a flood of vital-signs data (blood pressure, heart rate, etc.) shown to them as they are helping the patient; other HCPs would see a patient in a bed and at the same time see their allergies, their ‘chart’ and so on. I could write a lot more here, but you may be getting the idea 🙂
- Getting a guided tour through a building that is going to be developed or renovated: “and here’s where the pot plants will be; here’s the sofa, chairs and carpet. Oh, you don’t like the colour scheme? No problem – how about this?” Extend this to anything you like: a new factory or hospital, induction of new employees or students, demoing your new product or service, etc.
- Go minimalistic. Why bother decorating a building in reality if you can have the walls decorated with anything you like, and have it change at a moment’s notice? (And of course each person looking at the wall could be seeing something different…) Why bother with an 80″ state-of-the-art screen on the wall when you can pop up a (higher-resolution) ‘screen’ on any surface you like? (Japan is aiming for 8K screens for the 2020 Olympics, but AR might beat them to the punch. 8K is symbolic, because it’s generally accepted that somewhere between 4K and 8K is the point where human vision can’t tell the difference between a screen and reality…)
- So many opportunities in training – e.g. car repair, in fact any kind of maintenance (DIY plumbing, for example). In fact, anything where an expert could be looking over your shoulder telling you what to do next…
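To make the first two ideas (simulating a colour-vision condition and computing its ‘inverse’) a bit more concrete, here is a minimal sketch of a protanopia (red-blindness) simulation. The RGB-to-LMS matrix and the protanope reduction matrix are the commonly cited Viénot et al. coefficients; the whole approach here is my illustrative choice, not something from the original post, and real daltonisation is more involved:

```python
import numpy as np

# Linear-RGB -> LMS (cone response) transform and protanopia reduction
# matrices, as commonly used in daltonisation code (Viénot et al., 1999).
RGB_TO_LMS = np.array([[17.8824,   43.5161,  4.11935],
                       [3.45565,   27.1554,  3.86714],
                       [0.0299566, 0.184309, 1.46709]])

# For a protanope the L response is reconstructed from M and S only.
PROTANOPIA = np.array([[0.0, 2.02344, -2.52581],
                       [0.0, 1.0,      0.0],
                       [0.0, 0.0,      1.0]])

def simulate_protanopia(rgb):
    """Approximate what a protanope sees for a linear-RGB colour in [0, 1]."""
    lms = RGB_TO_LMS @ rgb
    return np.linalg.inv(RGB_TO_LMS) @ (PROTANOPIA @ lms)

def lost_information(rgb):
    """The ‘inverse’ idea from the post: what the viewer can't see, which a
    correction step would redistribute into the channels they can see."""
    return rgb - simulate_protanopia(rgb)

grey = np.array([0.5, 0.5, 0.5])
red = np.array([1.0, 0.0, 0.0])
print(simulate_protanopia(grey))  # greys survive almost unchanged
print(lost_information(red))      # the part of pure red a protanope misses
```

An AR device running something like this per-pixel over its camera feed is the training use case; running `lost_information` and re-injecting it is the assistive one.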
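The 4K-versus-8K point above can be sanity-checked with some back-of-envelope arithmetic. The assumptions here are mine, not the post’s: visual acuity of roughly one arcminute (so about 60 pixels per degree is where individual pixels disappear), and a 2.5 m viewing distance for an 80-inch 16:9 panel:

```python
import math

def pixels_per_degree(screen_width_m, horizontal_pixels, distance_m):
    """How many pixels fit in one degree of the viewer's visual field."""
    pixel_m = screen_width_m / horizontal_pixels        # physical pixel width
    deg_per_pixel = math.degrees(pixel_m / distance_m)  # small-angle approx.
    return 1.0 / deg_per_pixel

diagonal_m = 80 * 0.0254                       # 80-inch diagonal in metres
width_m = diagonal_m * 16 / math.hypot(16, 9)  # 16:9 panel width, ~1.77 m

ppd_4k = pixels_per_degree(width_m, 3840, 2.5)
ppd_8k = pixels_per_degree(width_m, 7680, 2.5)
print(round(ppd_4k), round(ppd_8k))  # roughly 95 and 189
```

Under those assumptions both resolutions are already past the ~60 pixels-per-degree acuity threshold at that distance, which fits the claim that the screen-versus-reality boundary sits somewhere in the 4K-to-8K range; sit closer and the numbers drop proportionally.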
Ok, so a lot of these are technology based, and you may be wondering where you might fit in. I look at it this way: huge corporations are putting enormous amounts of money into getting the technology sorted out; pretty soon it’s going to happen, and it’s going to be available for very little (that’s what technology does). The use cases I came up with are quite literally the tip of a very large iceberg – you can do the same, in your own areas of expertise or enjoyment (you’ll see I’ve completely missed the arts – please fill in the gaps! 🙂). Much smarter people than me are convinced that this field (things that are not there) is enormous and very sparsely populated at the moment. Any technologist who tells you explicitly how a technology is going to be used is probably wrong (I guess that includes me!), and your ideas and thoughts are just as valuable.