Augmenting Reality

Opinions differ on what constitutes “Augmented Reality”. It can range from sophisticated 3D graphics overlaid on a stereoscopic view of the world all the way down to simply overlaying floating icons related to a location on a live camera view. In between, some also count it as AR when you hold a special marker in your hand which your webcam tracks, inserting a small 3D model on top of it.

On the iPhone it is currently not possible to do any sophisticated motion tracking or shape recognition, as you can only get live images via a backdoor which Apple recently opened just a bit so that useful applications like RedLaser or uStream can make use of it. Still, the consensus seems to be that you can only use images that have no UI elements over them, because the workaround captures the whole screen.

I figured it would be an interesting addition to my Dr. Touch’s Parts Store, so I set out to create DTAugmentedRealityController. The goal for V1.0 is to provide a view controller which you can feed icons and geo locations, and which overlays said icons over live video from the iPhone’s camera.

I’ve been working on this for three days now, and by popular demand I’ve documented each evolutionary step in one YouTube video per day.

Day 1

This was all about figuring out how to overlay my own views on top of UIImagePickerController. It turns out you have to do a bit of transform magic to scale the video to full screen, hide the built-in camera controls and provide your own view for the cameraOverlayView property. Then I found that UIImagePickerController does not auto-rotate, so I had to apply my own transform. Maybe there is a way to get this rotation by subclassing it, but I have not tried that. I was thrilled to have a turning compass rose on top of the live video.
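
The essence of that setup, as a minimal Swift sketch (not the component’s actual code; the scale factor is an assumption and depends on screen and preview dimensions):

```swift
import UIKit

// Minimal sketch of the Day 1 setup: full-screen live video with a custom overlay.
func makeCameraController(overlay: UIView) -> UIImagePickerController? {
    guard UIImagePickerController.isSourceTypeAvailable(.camera) else { return nil }

    let picker = UIImagePickerController()
    picker.sourceType = .camera
    picker.showsCameraControls = false          // hide the default capture UI

    // Stretch the camera preview so it fills the whole screen;
    // the exact factor here is illustrative only.
    picker.cameraViewTransform = CGAffineTransform(scaleX: 1.0, y: 1.33)

    // Our own content (compass rose, floating icons, ...) sits on top of the video.
    picker.cameraOverlayView = overlay
    return picker
}
```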

Day 2

Stuff started to come together. On a static horizontal plane I displayed my first floating icon. I chose Steve Jobs because, not to sound overly religious, his location in Cupertino should be every iPhone developer’s place of silent thankfulness. But Mr. Jobs was the best augmentation I was able to come up with for the reality of my office. Don’t you think so, too?

Day 3+4

For the third episode I improved the engine tremendously. The first change was to pack all sensor-related activities into the DTRealitySensor class, so there is one central location for all sensing. Then I added rotation to the previously fixed icon plane so that its horizon always stays parallel to the real one. Finally, I faced the challenge of implementing a truly attitude-independent compass.
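
Keeping the horizon level essentially means reading the roll angle out of the gravity vector and counter-rotating the icon plane by it. A minimal Swift sketch of that idea (assuming gravity comes in device coordinates in portrait orientation; the sign convention may need flipping in practice):

```swift
import UIKit

// Derive the roll angle from the gravity vector and counter-rotate the
// icon plane so its horizon stays level. Sketch only; signs depend on
// how the sensor reports its axes.
func horizonLevelingTransform(gravityX: Double, gravityY: Double) -> CGAffineTransform {
    // Angle between "straight down the screen" and the measured gravity.
    let roll = atan2(gravityX, -gravityY)
    // Rotate the overlay the opposite way so drawn content stays level.
    return CGAffineTransform(rotationAngle: CGFloat(-roll))
}
```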

The built-in compass has a major drawback: it only works properly if the iPhone is lying flat on its back, which is of little use for the serious component I am working on. Thankfully I discovered the paper “A New Solution Algorithm of Magnetic Azimuth”, which gave me the necessary insight to use both the gravity vector and the raw magnetic vector to calculate a compass bearing that works for any iPhone attitude.
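
The core of such a tilt compensation can be written as plain vector math (a simplified Swift sketch of the general technique, not the paper’s exact algorithm; axis and sign conventions depend on how the sensors report their data):

```swift
import simd

// Tilt-compensated heading from a gravity vector and a raw magnetic vector,
// both given in device coordinates. Returns the angle of the device's top
// axis measured from magnetic north toward east, in radians.
func headingRadians(gravity: SIMD3<Double>, magnetic: SIMD3<Double>) -> Double {
    let down  = simd_normalize(gravity)                   // points toward the ground
    let east  = simd_normalize(simd_cross(down, magnetic))
    let north = simd_cross(east, down)                    // horizontal north direction

    // Project the device's top axis (0, 1, 0) onto the horizontal plane
    // and measure its angle from north toward east.
    let top = SIMD3<Double>(0, 1, 0)
    return atan2(simd_dot(top, east), simd_dot(top, north))
}
```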

Still to Come

I already have several customers lining up to get this component into their hands. One of them has a deadline of January 12th. He claims “if I don’t get it by then, I’ll code it myself”. “Yeah, right!” would be my appropriate response now that I have seen how much work it is to get the basics lined up.

Version 1.0 of the component still has a couple of things to work out:

  • Make a protocol that allows any UIView to be used for geolocated floating. You would add such a view to the controller with a method, and the controller would continuously query your UIView via the protocol. This information would then be used to display the view on screen at the correct location (see the protocol sketch after this list).
  • Invent some sort of dampening so that icons don’t jump around while you are holding the iPhone still. Maybe have some detection in DTRealitySensor as to whether the iPhone is being moved or not, or apply a simple low-pass filter like the one sketched below. Still uncertain how to do this properly without losing the general responsiveness.
  • Allow the user to set a maximum distance to show and maybe try to show icons smaller and foggier in the distance.
  • There also has been a request to give the icons an altitude. Right now I am positioning them in a simplified fashion. Implementing altitude would require that I start to calculate a pitch for the looking vector.
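
To make the protocol idea a bit more concrete, it could look roughly like this (a hypothetical Swift sketch; DTGeolocatedFloating and addFloatingView are placeholder names, not the final API):

```swift
import UIKit
import CoreLocation

// Hypothetical sketch; names are placeholders, not the shipping API.
// Any UIView adopting this protocol could be floated by the controller.
protocol DTGeolocatedFloating {
    /// Queried continuously by the controller to place the view on screen.
    var floatingLocation: CLLocation { get }
}

// The controller would then expose something along these lines:
// func addFloatingView(_ view: UIView & DTGeolocatedFloating)
```

For the dampening, a simple exponential low-pass filter on the sensor values might already be a good start (again only a sketch; the alpha value is a tuning assumption, and a real implementation would also have to handle the 0°/360° wrap-around of compass headings):

```swift
// Exponential smoothing for a noisy sensor value.
// alpha near 0 = heavy dampening, alpha near 1 = very responsive.
struct LowPassFilter {
    var alpha: Double = 0.15
    private var value: Double?

    mutating func filter(_ newValue: Double) -> Double {
        guard let previous = value else {
            value = newValue
            return newValue
        }
        let smoothed = previous + alpha * (newValue - previous)
        value = smoothed
        return smoothed
    }
}
```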

Then there are already a couple of ideas for future versions, but I would need serious support if I were to tackle those:

  • Do some simple geometric recognition if possible and stabilize the view on detected lines.
  • It would be great if we could detect shapes in general.
  • It should be fairly easy to add an OpenGL layer and display 3D models overlaid on the live video.

The final question that I’ve been asked a lot is what I am planning to charge for DTAugmentedRealityController. Bear in mind that I have so far invested four full working days (1600 Euros) in this project. With this much interest I have to make sure that I don’t make it too cheap, and it has to be expensive enough that I can justify investing even more time. Finally, AR seems to be a hot topic right now, especially on the iPhone. There are only very few well-done AR applications, but I aim to be at the forefront in usability and ease of implementation.

So to get this settled, I chose 300 Euros, with a distinct upward trend. The first five customers will get it and keep it for that amount; after that I will reevaluate. The same terms and conditions apply as for all other components in the Dr. Touch’s Parts Store.

