tag:blogger.com,1999:blog-75242584275097383482024-03-13T05:43:31.058+01:00Mobile Augmented Reality ExperiencesThoughts on Mobile Augmented Reality, Android, 3D Technology, User Interaction, Multi-touch, Tactile, Unity3D, WebGL and so on.Anonymoushttp://www.blogger.com/profile/01292190093039898111noreply@blogger.comBlogger57125tag:blogger.com,1999:blog-7524258427509738348.post-73425904499161421972013-04-02T19:55:00.001+02:002013-04-02T19:55:40.055+02:00Selltic Renault Utilitaire<iframe src="http://player.vimeo.com/video/62625501" width="500" height="281" frameborder="0" webkitAllowFullScreen mozallowfullscreen allowFullScreen></iframe> <p><a href="http://vimeo.com/62625501">Selltic Renault Vehicule Utilitaire</a> from <a href="http://vimeo.com/diotasoft">diotasoft</a> on <a href="http://vimeo.com">Vimeo</a>.</p>Anonymoushttp://www.blogger.com/profile/01292190093039898111noreply@blogger.com0tag:blogger.com,1999:blog-7524258427509738348.post-708061357257361882012-11-02T21:51:00.003+01:002012-11-02T21:52:19.540+01:00Selltic Augmented Reality Renault Clio 4<div style="text-align: center;">
<iframe allowfullscreen="allowfullscreen" frameborder="0" height="281" mozallowfullscreen="mozallowfullscreen" src="http://player.vimeo.com/video/52560670?badge=0" webkitallowfullscreen="webkitallowfullscreen" width="500"></iframe> </div>
<a href="http://vimeo.com/52560670">Selltic Augmented Reality Renault Clio 4</a> from <a href="http://vimeo.com/diotasoft">diotasoft</a> on <a href="http://vimeo.com/">Vimeo</a>.<br />
<br />
<br />
<br />
Selltic is a set of technological solutions intended to support sales. Our augmented-reality-based systems enable vendors to showcase the full potential of a product, comprehensively and directly in its real-world context.<br />
<br />
For more information visit <a href="http://www.selltic.eu/">selltic.eu</a>Anonymoushttp://www.blogger.com/profile/01292190093039898111noreply@blogger.com0tag:blogger.com,1999:blog-7524258427509738348.post-75979281542005942262012-05-27T21:16:00.004+02:002012-05-27T21:17:27.869+02:00BlippAR<div dir="ltr" style="text-align: left;" trbidi="on">
<div dir="ltr" style="text-align: left;" trbidi="on">
If you still don't know what AR could be good for, BlippAR presents some nice ideas.<br />
<br />
<br /></div>
<div style="text-align: center;">
<iframe allowfullscreen="" frameborder="0" height="315" src="http://www.youtube.com/embed/hT-Z6yhiLD8" width="560"></iframe></div>
</div>Anonymoushttp://www.blogger.com/profile/01292190093039898111noreply@blogger.com0tag:blogger.com,1999:blog-7524258427509738348.post-2293889651751442932012-05-07T16:30:00.000+02:002012-05-07T16:47:26.096+02:00MUSTARD - Managing occlusions between real and virtual objects<div dir="ltr" style="text-align: left;" trbidi="on">
<div dir="ltr" style="text-align: left;" trbidi="on">
Interesting paper, presented at CHI 2012, on an augmented reality display that can handle occlusions between real and virtual objects.<br />
<br />
MUSTARD: Multi User See Through Augmented Reality Display<br />
<br />
<blockquote class="tr_bq">
"We present MUSTARD, a multi-user dynamic random hole see-through display, capable of delivering viewer dependent information for objects behind a glass cabinet. Multiple viewers are allowed to observe both the physical object(s) being augmented and their location dependent annotations at the same time. The system consists of two liquid-crystal (LC) panels within which physical objects can be placed. The back LC panel serves as a dynamic mask while the front panel serves as the data. We first describe the principle of MUSTARD and then examine various functions that can be used to minimize crosstalk between multiple viewer positions. We compare different conflict management strategies using PSNR and the quality mean opinion score of HDR-VDP2. Finally, through a user- study we show that users can clearly identify images and objects even when the images are shown with strong conflicting regions; demonstrating that our system works even in the most extreme of circumstances."</blockquote>
<br /></div>
<div style="text-align: center;">
<iframe allowfullscreen="" frameborder="0" height="315" src="http://www.youtube.com/embed/pOBRl0k2zpo" width="560"></iframe></div>
</div>Anonymoushttp://www.blogger.com/profile/01292190093039898111noreply@blogger.com0tag:blogger.com,1999:blog-7524258427509738348.post-52144179669485147652012-05-07T15:38:00.000+02:002012-05-07T15:38:26.497+02:00Touché - detecting complex gestures through multifrequency analysis<div dir="ltr" style="text-align: left;" trbidi="on">
<div>
Great work by Disney Research on detecting gestures by analysing a broad spectrum of frequencies from capacitive sensing.</div>
<div>
<br /></div>
Touché: Enhancing Touch Interaction on Humans, Screens, Liquids, and Everyday Objects<br /><br />"Touché proposes a novel Swept Frequency Capacitive Sensing technique that can not only detect a touch event, but also recognize complex configurations of the human hands and body. Such contextual information significantly enhances touch interaction in a broad range of applications, from conventional touchscreens to unique contexts and materials. For example, in our explorations we add touch and gesture sensitivity to the human body and liquids. We demonstrate the rich capabilities of Touché with five example setups from different application domains and conduct experimental studies that show gesture classification accuracies of 99% are achievable with our technology."<div>
<br /></div>
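The swept-frequency idea is that each hand or body configuration changes the capacitive response differently at different frequencies, so the whole sweep profile acts as a fingerprint for the gesture. The paper trains a proper classifier on these profiles; purely as a toy illustration, here is a hypothetical nearest-centroid classifier over synthetic sweep profiles (the gesture names and data are made up, not from the paper):

```python
import numpy as np

def train_centroids(profiles_by_gesture):
    """profiles_by_gesture: dict mapping gesture name -> list of 1-D
    response profiles (one amplitude per swept frequency bin).
    Returns the mean profile per gesture."""
    return {g: np.mean(np.stack(p), axis=0) for g, p in profiles_by_gesture.items()}

def classify(profile, centroids):
    """Return the gesture whose mean profile is closest (Euclidean)."""
    return min(centroids, key=lambda g: np.linalg.norm(profile - centroids[g]))
```

A real system would use many training sweeps per gesture and a stronger classifier, but the pipeline shape is the same: sweep, build a profile, compare against learned profiles.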
<div>
<br /><div style="text-align: center;">
<iframe allowfullscreen="" frameborder="0" height="315" src="http://www.youtube.com/embed/E4tYpXVTjxA" width="560"></iframe></div>
</div>
</div>Anonymoushttp://www.blogger.com/profile/01292190093039898111noreply@blogger.com0tag:blogger.com,1999:blog-7524258427509738348.post-64196561767940427712012-03-12T10:59:00.000+01:002012-03-12T10:59:21.374+01:00Sony UK presents its markerless tracking technology Magnat for PS Vita at GDC 2012<div dir="ltr" style="text-align: left;" trbidi="on">
<div dir="ltr" style="text-align: left;" trbidi="on">
At GDC 2012, Sony UK presented its new markerless tracking technology for the PS Vita, called <i>Magnat</i>. The technique can track essentially any textured 3D environment.<br />
<br />
The algorithm initializes itself without any markers or pre-learning phase. It can also detect planar surfaces in the cloud of reconstructed 3D points, which allows the user to place 3D objects interactively.<br />
<br />
Since those planar surfaces can easily be triangulated and the corresponding real video texture is known, the surfaces can be deformed and rendered as, for example, a liquid. See the second video for this nice effect.<br />
<br />
While I have no further information about the actual algorithm used by Magnat, my first guess would be a variation of the popular <i><a href="http://www.robots.ox.ac.uk/~gk/PTAM/">Parallel Tracking and Mapping</a></i> approach presented by Klein in 2007.<br />
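I don't know how Magnat finds its planes, but a common way to detect planar surfaces in a reconstructed point cloud is RANSAC: repeatedly fit a plane to three random points and keep the hypothesis with the most inliers. A minimal numpy sketch of that generic approach (not Sony's implementation):

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through >= 3 points. Returns (unit normal n,
    offset d) such that n . x = d for points x on the plane."""
    centroid = points.mean(axis=0)
    # The smallest singular vector of the centred points is the normal.
    _, _, vt = np.linalg.svd(points - centroid)
    n = vt[-1]
    return n, n @ centroid

def ransac_plane(points, iters=200, tol=0.01, seed=0):
    """Find the dominant plane in a point cloud by random sampling."""
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(points), dtype=bool)
    for _ in range(iters):
        sample = points[rng.choice(len(points), 3, replace=False)]
        n, d = fit_plane(sample)
        inliers = np.abs(points @ n - d) < tol
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # Refit on all inliers of the best hypothesis for a stable estimate.
    return fit_plane(points[best_inliers]) + (best_inliers,)
```

Run on the reconstructed map points, this yields the plane (and its supporting points) on which virtual objects can then be anchored.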
<br /></div>
<iframe allowfullscreen="" frameborder="0" height="315" src="http://www.youtube.com/embed/_2JyILlolXA" width="560"></iframe><br />
<br />
more videos after the break<br />
<br />
<a name='more'></a><br /><br />
<br />
<iframe allowfullscreen="" frameborder="0" height="315" src="http://www.youtube.com/embed/G3qbrQKz_Kk" width="560"></iframe><br />
<br />
<br />
<iframe allowfullscreen="" frameborder="0" height="315" src="http://www.youtube.com/embed/WPmxrSHe4lA" width="560"></iframe></div>Anonymoushttp://www.blogger.com/profile/01292190093039898111noreply@blogger.com0tag:blogger.com,1999:blog-7524258427509738348.post-4255461566826216522012-03-05T11:30:00.001+01:002012-03-05T11:30:37.078+01:00Qualcomm presenting mobile 3D tracking at MWC 2012<div dir="ltr" style="text-align: left;" trbidi="on">
"Sesame Street Augmented Reality Dolls Take AR to the Next Level"<div style="text-align: center;">
<br /></div>
<div style="text-align: center;">
<iframe allowfullscreen="" frameborder="0" height="315" src="http://www.youtube.com/embed/U2jSzmvm_WA" width="560"></iframe></div>
</div>Anonymoushttp://www.blogger.com/profile/01292190093039898111noreply@blogger.com0tag:blogger.com,1999:blog-7524258427509738348.post-58698762087839104162012-03-05T11:25:00.001+01:002012-03-05T11:25:56.710+01:00Board of Imagination by Chaotic Moon Labs<div dir="ltr" style="text-align: left;" trbidi="on">
<div dir="ltr" style="text-align: left;" trbidi="on">
<div class="separator" style="clear: both; text-align: center;">
<a href="http://chaoticmoon.zippykidcdn.com/wp-content/uploads/2012/02/skatehead2.png" imageanchor="1" style="clear: right; float: right; margin-bottom: 1em; margin-left: 1em;"><img border="0" height="133" src="http://chaoticmoon.zippykidcdn.com/wp-content/uploads/2012/02/skatehead2.png" width="200" /></a></div>
After the Board of Awesomeness, <a href="http://www.chaoticmoon.com/labs">Chaotic Moon Labs</a> have released their Board of Imagination. <a href="http://www.chaoticmoon.com/labs/chaotic-moon-labs-board-of-imagination/">The Board of Imagination</a> is a brain-controlled longboard. The rider can accelerate or decelerate the board using an Emotiv EPOC headset that measures brain activity.<br />
<br />
<br /></div>
<div style="text-align: center;">
<iframe allowfullscreen="" frameborder="0" height="315" src="http://www.youtube.com/embed/2KtMCX7FfZ0" width="560"></iframe></div>
</div>Anonymoushttp://www.blogger.com/profile/01292190093039898111noreply@blogger.com0tag:blogger.com,1999:blog-7524258427509738348.post-36779530247988029792012-03-05T11:16:00.000+01:002012-03-05T11:16:08.916+01:00Metaio Markerless 3D Tracking using Gravity Aware Algorithms<div dir="ltr" style="text-align: left;" trbidi="on">
Metaio has announced that it will release its new Junaio SDK in mid-March 2012, which will include a feature to track 3D objects and scenes without markers. Metaio won the latest tracking contest at ISMAR 2011 using this technology. The first video gives some insight into the gravity-aware algorithms that were implemented to allow stable mobile markerless 3D tracking.<br />
<br />
Markerless 3D Tracking Technology Video<br />
<br />
<iframe src="//www.viddler.com/embed/7f90a654/?f=1&offset=0&autoplay=0&secret=60762954&disablebranding=0" frameborder="0" height="349" id="viddler-7f90a654" width="545"></iframe>
<br />
<br />
<br /></div>Anonymoushttp://www.blogger.com/profile/01292190093039898111noreply@blogger.com0tag:blogger.com,1999:blog-7524258427509738348.post-90077210307974164832012-03-05T11:01:00.002+01:002012-03-05T11:03:21.768+01:00History of Mobile Augmented Reality<div dir="ltr" style="text-align: left;" trbidi="on">
<div class="separator" style="clear: both; text-align: center;">
<a href="https://www.icg.tugraz.at/~daniel/HistoryOfMobileAR/Sutherland_68.jpg" imageanchor="1" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img border="0" height="200" src="https://www.icg.tugraz.at/~daniel/HistoryOfMobileAR/Sutherland_68.jpg" width="167" /></a></div>
Very nice <a href="https://www.icg.tugraz.at/~daniel/HistoryOfMobileAR/">compilation</a> of the history of mobile augmented reality, from Sutherland's first head-mounted display in 1968 to mobile SLAM by Klein. This list was created by the members of the <a href="http://studierstube.org/handheld_ar/">Christian Doppler Laboratory for Handheld Augmented Reality</a>.<br />
<br />
<a href="https://www.icg.tugraz.at/~daniel/HistoryOfMobileAR/">Mobile Augmented Reality History</a></div>Anonymoushttp://www.blogger.com/profile/01292190093039898111noreply@blogger.com0tag:blogger.com,1999:blog-7524258427509738348.post-68707838101023386542012-02-18T19:50:00.001+01:002012-02-18T19:51:11.686+01:00Dense Tracking and Mapping in Real-Time<div dir="ltr" style="text-align: left;" trbidi="on">
<div dir="ltr" style="text-align: left;" trbidi="on">
Great ICCV paper by <a href="http://www.doc.ic.ac.uk/~rnewcomb/">Richard Newcombe</a> about tracking and mapping with a single camera that even computes a depth map in real time.<br />
<br />
<blockquote class="tr_bq">
"<span style="background-color: white; font-family: Calibri, Verdana, Arial, Helvetica, Sans; font-size: 15px; text-align: -webkit-auto;">DTAM is a system for real-time camera tracking and reconstruction which relies not on feature extraction but dense, every pixel methods. As a single hand-held RGB camera flies over a static scene, we estimate detailed textured depth maps at selected keyframes to produce a surface patchwork with millions of vertices. We use the hundreds of images available in a video stream to improve the quality of a simple photometric data term, and minimise a global spatially regularised energy functional in a novel non-convex optimisation framework. Interleaved, we track the camera 6DOF motion precisely by frame-rate whole image alignment against the entire dense model.</span>"</blockquote>
<br /></div>
<div style="text-align: center;">
<iframe allowfullscreen="" frameborder="0" height="315" src="http://www.youtube.com/embed/Df9WhgibCQA" width="560"></iframe></div>
</div>Anonymoushttp://www.blogger.com/profile/01292190093039898111noreply@blogger.com1tag:blogger.com,1999:blog-7524258427509738348.post-76771234154605180282012-02-18T19:41:00.001+01:002012-02-18T19:41:50.925+01:00TojiCode Blog about WebGL<div dir="ltr" style="text-align: left;" trbidi="on">
<a href="http://upload.wikimedia.org/wikipedia/commons/3/39/WebGL_logo.png" imageanchor="1" style="clear: right; float: right; margin-bottom: 1em; margin-left: 1em;"><img border="0" src="http://upload.wikimedia.org/wikipedia/commons/3/39/WebGL_logo.png" /></a>Very interesting <a href="http://blog.tojicode.com/">Blog</a> about WebGL made by Toji. He provides the reader with a lot of inside in WebGL and some cool demos. Check out the <a href="http://media.tojicode.com/q3bsp/">quake3 level</a> in WebGL.<br />
Many of the projects are available on github so you can have good look on how things are implemented and optimised.<br />
<br />
<br />
<br /></div>Anonymoushttp://www.blogger.com/profile/01292190093039898111noreply@blogger.com0tag:blogger.com,1999:blog-7524258427509738348.post-35203231539306155492012-02-18T19:26:00.000+01:002012-02-18T19:27:05.665+01:00A Day made of Glass<div dir="ltr" style="text-align: left;" trbidi="on">
<div dir="ltr" style="text-align: left;" trbidi="on">
It's an advertisement, but at the same time a really interesting look into the future of augmented reality.<br />
<br />
<br /></div>
<div style="text-align: center;">
<iframe allowfullscreen="" frameborder="0" height="315" src="http://www.youtube.com/embed/X-GXO_urMow" width="560"></iframe></div>
</div>Anonymoushttp://www.blogger.com/profile/01292190093039898111noreply@blogger.com0tag:blogger.com,1999:blog-7524258427509738348.post-42497604602632321652012-02-18T19:18:00.000+01:002012-02-18T19:20:25.944+01:00Mozilla's Boot2Gecko Mobile Platform<div dir="ltr" style="text-align: left;" trbidi="on">
<div class="separator" style="clear: both; text-align: center;">
<a href="http://www.webmonkey.com/wp-content/uploads/2012/02/b2glock-4f3cd57-intro.jpg" imageanchor="1" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img border="0" height="200" src="http://www.webmonkey.com/wp-content/uploads/2012/02/b2glock-4f3cd57-intro.jpg" width="115" /></a></div>
<a href="http://www.webmonkey.com/2012/02/first-look-mozillas-boot2gecko-mobile-platform-and-gaia-ui/">First Look: Mozilla's Boot2Gecko Mobile Platform and Gaia UI | Webmonkey | Wired.com</a>: <br />
<div>
<br /></div>
<div>
"<span style="background-color: white; color: #333333; font-family: Arial, Verdana, sans-serif; font-size: 13px; line-height: 18px;">Mozilla launched a new project last year called Boot2Gecko (B2G) with the aim of developing a mobile operating system. The platform’s user interface and application stack will be built entirely with standards-based web technologies and will run on top of Gecko, the HTML rendering engine used in the Firefox web browser. The B2G project has advanced at a rapid pace this year and the platform is beginning to take shape.</span><span style="font-size: 100%;">"</span></div>
<div>
<br />
<a href="https://chrome.google.com/webstore/detail/pengoopmcjnbflcjbmoeodbmoflcgjlk" style="font-size: 13px;">'via Blog this'</a></div>
</div>Anonymoushttp://www.blogger.com/profile/01292190093039898111noreply@blogger.com0tag:blogger.com,1999:blog-7524258427509738348.post-75137061966560085022012-02-08T15:51:00.003+01:002012-02-08T15:52:19.609+01:00Google Augmented Reality Glasses<div dir="ltr" style="text-align: left;" trbidi="on">
Apparently the "secret" Google X labs are working on semi transparent glasses equipped with different sensor ideal for interesting augmented reality applications. So lets hope thats a true rumor.<br />
<blockquote class="tr_bq">
<span style="background-color: white; color: #333333; font-family: Arial, Helvetica, sans-serif; font-size: 13px; line-height: 17px; text-align: -webkit-auto;">the search giant is developing a product called Google Glasses that will have a built-in heads-up display. The device, which supposedly resembles a pair of Oakley shades, is said to have an integrated transparent display for one eye and a built-in front-facing camera. The latter could be used for augmented reality applications. The device would use speech and head tilting for text input and control.</span></blockquote>
<br />
via <a href="http://arstechnica.com/gadgets/news/2012/02/google-reportedly-developing-android-powered-smart-glasses.ars">Arstechnica</a></div>Anonymoushttp://www.blogger.com/profile/01292190093039898111noreply@blogger.com0tag:blogger.com,1999:blog-7524258427509738348.post-24211856828748084402012-02-01T08:09:00.003+01:002012-02-01T08:09:44.454+01:00Kinect Russian Roulette<div dir="ltr" style="text-align: left;" trbidi="on">
<div dir="ltr" style="text-align: left;" trbidi="on">
<br />
<blockquote class="tr_bq">
Kinect Russian Roulette is a speed project by Theo Watson, which allows you to play Russian Roulette with just your hand and a Kinect.</blockquote>
<a href="http://fffff.at/kinect-russian-roulette-prototype/">kinect-russian-roulette-prototype website</a><br />
<br />
<br /></div>
<div style="text-align: center;">
<iframe allowfullscreen="" frameborder="0" height="330" mozallowfullscreen="" src="http://player.vimeo.com/video/35782736?title=0&byline=0&portrait=0" webkitallowfullscreen="" width="440"></iframe></div>
</div>Anonymoushttp://www.blogger.com/profile/01292190093039898111noreply@blogger.com0tag:blogger.com,1999:blog-7524258427509738348.post-16719799510531671742012-02-01T08:05:00.000+01:002012-02-01T08:05:37.479+01:00Metaio working on windshield projection<div dir="ltr" style="text-align: left;" trbidi="on">
<div dir="ltr" style="text-align: left;" trbidi="on">
<br />
Metaio just released a video of one of their current projects, in cooperation with Audi, Continental and other partners. The idea is to create a system that can analyse and understand the environment around a car and provide the driver with useful information via a projection onto the windshield.<br />
<br />
<blockquote class="tr_bq">
<span style="background-color: #ebebeb; color: #333333; font-family: arial, sans-serif; font-size: 13px; line-height: 18px;">Eleven partners from the German automotive industry join forces to explore innovative technologies and concepts for boosting energy efficiency in vehicles. Firstly, cars are to become "intelligent" and, for example, use knowledge of the planned route to develop proactive operating strategies designed to save energy and, based on this, initiate appropriate responses in good time on the part of the car or driver. Secondly, the vehicle power supply and associated components are to be specifically tailored to the possibilities of these intelligent operating strategies. The project is supported by the German Federal Ministry of Research and Technology (BMBF) and supervised by the VDI TZ.</span></blockquote>
<span style="background-color: rgba(255, 255, 255, 0.917969); color: #222222; font-family: arial, sans-serif; font-size: 13px; text-align: -webkit-auto;"><br /></span><br />
<span style="background-color: rgba(255, 255, 255, 0.917969); color: #222222; font-family: arial, sans-serif; font-size: 13px; text-align: -webkit-auto;"><br /></span></div>
<div style="text-align: center;">
<iframe allowfullscreen="" frameborder="0" height="315" src="http://www.youtube.com/embed/YbhUnhfoGYM" width="560"></iframe>
</div>
</div>Anonymoushttp://www.blogger.com/profile/01292190093039898111noreply@blogger.com0tag:blogger.com,1999:blog-7524258427509738348.post-49526477163228369492012-01-15T22:38:00.000+01:002012-01-15T22:39:51.250+01:00Controlling a Longboard via Kinect<div dir="ltr" style="text-align: left;" trbidi="on">
I like the Kinect and love longboards.<br />
Awesome project by <a href="http://www.chaoticmoon.com/labs/board-of-awesomeness/">CM.Labs</a><br />
<br />
<blockquote class="tr_bq">
Using a motorized longboard custom rigged with a Microsoft Xbox 360 Kinect device, Samsung Windows 8 enabled tablet with full voice control (yep, the one that’s not yet released), a phidget interface module, and all terrain tires, we took Project Sk8 to the streets to show how we’ve revolutionized Kinect by re-engineering it to not only respond to movement but use that movement to operate something other than a gaming avatar.</blockquote>
<br />
<div style="text-align: center;">
<iframe allowfullscreen="" frameborder="0" height="315" src="http://www.youtube.com/embed/n_xz7nX-Dzg" width="560"></iframe>
</div>
</div>Anonymoushttp://www.blogger.com/profile/01292190093039898111noreply@blogger.com0tag:blogger.com,1999:blog-7524258427509738348.post-9732227854111598552011-12-08T21:34:00.001+01:002011-12-08T21:43:12.129+01:00Metaio opens platform to chipset vendors and software developers<a href="http://www.metaio.com/fileadmin/upload/images/logo_metaio/metaio.gif" imageanchor="1" style="clear: right; float: right; margin-bottom: 1em; margin-left: 1em;"><img border="0" height="72" src="http://www.metaio.com/fileadmin/upload/images/logo_metaio/metaio.gif" width="320" /></a><span class="Apple-style-span" style="background-color: white; font-family: arial, helvetica, sans; font-size: 20px; line-height: 30px;"><a href="http://www.metaio.com/">Metaio</a> on Thursday opened its platform to chipset vendors and software developers for free, hoping to boost its position in the emerging sector. </span><span class="Apple-style-span" style="background-color: white; font-family: arial, helvetica, sans; font-size: 20px; line-height: 30px;">The cooperation will take place with Texas Instruments, one of the leading mobile hardware manufacturers.</span><br />
<span class="Apple-style-span" style="background-color: white; font-family: arial, helvetica, sans; font-size: 20px; line-height: 30px;"><br /></span><br />
<span class="Apple-style-span" style="background-color: white; font-family: arial, helvetica, sans; font-size: 20px; line-height: 30px;">With this step, Metaio follows similar attempts taken by <a href="https://developer.qualcomm.com/develop/mobile-technologies/augmented-reality">Qualcomm</a> and Intel. Intel announced recently a cooperation with Total Immersion.</span><br />
<span class="Apple-style-span" style="background-color: white; font-family: arial, helvetica, sans; font-size: 20px; line-height: 30px;"><br /></span><br />
<span class="Apple-style-span" style="background-color: white; font-family: arial, helvetica, sans; font-size: 20px; line-height: 30px;">via <a href="http://www.reuters.com/article/2011/12/08/us-metaio-ar-idUSTRE7B71HM20111208">Reuters</a></span>Anonymoushttp://www.blogger.com/profile/01292190093039898111noreply@blogger.com0tag:blogger.com,1999:blog-7524258427509738348.post-8986347629779899302011-11-07T19:08:00.000+01:002011-11-07T19:13:50.728+01:00ISMAR 2011 Paper Highlights<a href="http://4.bp.blogspot.com/-0DZmB9hCiFw/TrgeO5KjpGI/AAAAAAAAC98/zsziagqf0XA/s1600/Capture+d%25E2%2580%2599e%25CC%2581cran+2011-11-07+a%25CC%2580+19.06.17.png" imageanchor="1" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img border="0" height="150" src="http://4.bp.blogspot.com/-0DZmB9hCiFw/TrgeO5KjpGI/AAAAAAAAC98/zsziagqf0XA/s200/Capture+d%25E2%2580%2599e%25CC%2581cran+2011-11-07+a%25CC%2580+19.06.17.png" width="200" /></a>My favorite papers presented at ISMAR 2011.<br />
<br />
<br />
<b>Robust Planar Target Tracking and Pose Estimation from a Single Concavity</b><br />
<i>Michael Donoser, Peter Kontschieder, Horst Bischof</i><br />
Graz University of Technology<br />
<blockquote class="tr_bq">
"The basic idea is to adapt the classic tracking-by-detection approach, which seeks for the object to be tracked independently in each frame, for tracking non-textured objects. In order to robustly estimate the 3D pose of such objects in each frame, we have to tackle three demanding problems. First, we need to find a stable representation of the object which is discriminable against the background and highly repetitive. Second, we have to robustly relocate this representation in every frame, also during considerable viewpoint changes. Finally, we have to estimate the pose from a single,closed object contour"</blockquote>
<br />
<div style="text-align: center;">
<iframe allowfullscreen="" frameborder="0" height="315" src="http://www.youtube.com/embed/ynfxZLtgZWc" width="420"></iframe>
</div>
<br />
<b></b><br />
<a name='more'></a><b><br /></b><br />
<b>Homography-Based Planar Mapping and Tracking for Mobile Phones</b><br />
<i>Christian Pirchheim, Gerhard Reitmayr</i><br />
Graz University of Technology<br />
<blockquote class="tr_bq">
"We present a real-time camera pose tracking and mapping system which uses the assumption of a planar scene to implement a highly efficient mapping algorithm. Our light-weight mapping approach is based on keyframes and plane-induced homographies between them. We solve the planar reconstruction problem of estimating the keyframe poses with an efficient image rectification algorithm.Camera pose tracking uses continuously extended and refined planar point maps and delivers robustly estimated 6DOF poses. We compare system and method with bundle adjustment and monocular SLAM on synthetic and indoor image sequences. We demonstrate large savings in computational effort compared to the monocular SLAM system while the reduction in accuracy remains acceptable."</blockquote>
<br />
<div style="text-align: center;">
<iframe allowfullscreen="" frameborder="0" height="315" src="http://www.youtube.com/embed/F7hid9YNHs0" width="420"></iframe>
</div>
<b><br /></b><br />
<b><br /></b><br />
<b>RGB-D Camera-Based Parallel Tracking and Meshing</b><br />
<i>Sebastian Lieberknecht, Andrea Huber, Slobodan Ilic, Selim Benhimane</i><br />
Metaio GmbH<br />
<blockquote class="tr_bq">
"Compared to standard color cameras, RGB-D cameras are designed to additionally provide the depth of imaged pixels which in turn results in a dense colored 3D point cloud representing the environment from a certain viewpoint. We present a real-time tracking method that performs motion estimation of a consumer RGB-D camera with respect to an unknown environment while at the same time reconstructing this environment as a dense textured mesh."</blockquote>
<br />
<div style="text-align: center;">
<iframe allowfullscreen="" frameborder="0" height="315" src="http://www.youtube.com/embed/1Jfo640kpRo" width="560"></iframe>
</div>
<br />
<br />
<br />
<b>KinectFusion: Real-Time Dense Surface Mapping and Tracking</b><br />
<i>Richard A. Newcombe, Shahram Izadi, Otmar Hilliges, David Molyneaux, David Kim, Andrew J. Davison, Pushmeet Kohli, Jamie Shotton, Steve Hodges, Andrew Fitzgibbon</i><br />
Microsoft Research<br />
<br />
Best paper award!<br />
<blockquote class="tr_bq">
"We present a system for accurate real-time mapping of complex and arbitrary indoor scenes in variable lighting conditions, using only a moving low-cost depth camera and commodity graphics hardware.We fuse all of the depth data streamed from a Kinect sensor into a single global implicit surface model of the observed scene in real-time. The current sensor pose is simultaneously obtained by tracking the live depth frame relative to the global model using a coarse-to-fine iterative closest point (ICP) algorithm, which uses all of the observed depth data available."</blockquote>
<br />
<div style="text-align: center;">
<iframe allowfullscreen="" frameborder="0" height="315" src="http://www.youtube.com/embed/quGhaggn3cQ" width="560"></iframe>
</div>
<br />
<b><br /></b><br />
<br class="Apple-interchange-newline" />My personal favorite!<br />
<br />
<b>Light Factorization for Mixed-Frequency Shadows in Augmented Reality</b><br />
<i>Derek Nowrouzezahrai, Stefan Geiger, Kenny Mitchell, Robert Sumner, Wojciech Jarosz, Markus Gross</i><br />
Disney Research Zürich<br />
<blockquote class="tr_bq">
"Integrating animated virtual objects with their surroundings for high-quality augmented reality requires both geometric and radiometric consistency. We focus on the latter of these problems and present an approach that captures and factorizes external lighting in a manner that allows for realistic relighting of both animated and static virtual objects. Our factorization facilitates a combination of hard and soft shadows, with high-performance, in a manner that is consistent with the surrounding scene lighting."</blockquote>
<br />
<div style="text-align: center;">
<iframe allowfullscreen="" frameborder="0" height="315" src="http://www.youtube.com/embed/56LmddHhI2I" width="560"></iframe>
</div>
<br />
<b><br /></b><br />
<b>The Argon AR Web Browser and Standards-based AR Application Environment</b><br />
<i>Blair MacIntyre, Alex Hill, Hafez Rouzati, Maribeth Gandy, Brian Davidson</i><br />
<i><br /></i><br />
<div class="separator" style="clear: both; text-align: center;">
</div>
<blockquote class="tr_bq">
"In this paper, we present the design and implementation of theArgon AR Web Browser and describe our vision of an AR application environment that leverages the WWW ecosystem. We also describe KARML, our extension to KML (the spatial markup language for Google Earth and Maps), that supports the functionality required for mobile AR. We combine KARML with the full range of standard web technologies to create a standards-based web browser for mobile AR. KARML lets users develop 2D and3D content using existing web technologies and facilitates easy deployment from standard web servers. We highlight a number of projects that have used Argon and point out the ways in which our web-based architecture e has made previously impractical AR concepts possible."</blockquote>Anonymoushttp://www.blogger.com/profile/01292190093039898111noreply@blogger.com0tag:blogger.com,1999:blog-7524258427509738348.post-76754500272762219492011-10-30T01:11:00.002+02:002011-10-30T01:17:35.444+02:00ISMAR 2011 - Tracking Contest Impressions<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="http://3.bp.blogspot.com/-_PV374CO1S8/TqyJKerFPDI/AAAAAAAAC7k/b512OhlnJgI/s1600/IMG_20111027_145245.jpg" imageanchor="1" style="clear: right; float: right; margin-bottom: 1em; margin-left: 1em;"><img border="0" height="200" src="http://3.bp.blogspot.com/-_PV374CO1S8/TqyJKerFPDI/AAAAAAAAC7k/b512OhlnJgI/s200/IMG_20111027_145245.jpg" width="150" /></a></div>
Some impressions of the Metaio team during the tracking competition at <a href="http://www.ismar11.org/">ISMAR 2011</a> in Basel. In the end they achieved the best score and won the contest. The contestants had to complete a course with various challenges for their tracking systems.<br />
<br />
The goal was to determine exact positions in the real world from individual 3D world coordinates. The course contained relatively complex environments, like a car, as well as specular setups with mirrors.<br />
<br />
Metaio used their Junaio system on a Samsung Galaxy S running Android.<br />
<br />
<a href="http://junaio.wordpress.com/2011/10/28/junaio-3d-tracking-for-the-win-ismar/">via Junaio Blog</a><br />
<br />
<div style="text-align: center;">
<iframe allowfullscreen="" frameborder="0" height="315" src="http://www.youtube.com/embed/hL5z9Gq_85w" width="560"></iframe></div>Anonymoushttp://www.blogger.com/profile/01292190093039898111noreply@blogger.com0tag:blogger.com,1999:blog-7524258427509738348.post-55434025886234889382011-10-30T00:53:00.001+02:002011-10-30T00:54:32.092+02:00Javascript based Natural Feature Tracking enables markerless AR in the browser<div class="separator" style="clear: both; text-align: center;">
<a href="http://1.bp.blogspot.com/-m0BCY34x2KA/TqyBufPu6sI/AAAAAAAAC7U/JyFKwRgHyDo/s1600/IMG_20111027_142547.jpg" imageanchor="1" style="clear: right; float: right; margin-bottom: 1em; margin-left: 1em;"><img border="0" height="200" src="http://1.bp.blogspot.com/-m0BCY34x2KA/TqyBufPu6sI/AAAAAAAAC7U/JyFKwRgHyDo/s200/IMG_20111027_142547.jpg" width="150" /></a></div>
<br />
Coming back from <a href="http://www.ismar11.org/">ISMAR 2011</a> in Basel, I saw a very interesting project developed by Christoph Oberhofer, Jens Grubert and Gerhard Reitmayr (ICG, Graz University of Technology, Austria). It proves that reliable and performant natural feature based tracking can actually be implemented in JavaScript, HTML5 and WebGL.<br />
<br />
There are still a lot of issues, especially with access to live video, but the results are already impressive. One has to keep in mind that this runs entirely within a single web page.<br />
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="http://2.bp.blogspot.com/-pICl9FFnmd8/TqyBxz5UB0I/AAAAAAAAC7c/WOEcIiOMvw0/s1600/IMG_20111027_142540.jpg" imageanchor="1" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img border="0" height="200" src="http://2.bp.blogspot.com/-pICl9FFnmd8/TqyBxz5UB0I/AAAAAAAAC7c/WOEcIiOMvw0/s200/IMG_20111027_142540.jpg" width="150" /></a></div>
The demo also runs on an Android phone with Firefox Beta. Of course it is not as fluid as on a PC, but it still achieves reasonable interactive frame rates. Chrome and an Opera alpha were used for the desktop demo. Interestingly, it did not run in Firefox on the PC.<br />
<br />
This work is a very good example of the pace of innovation in cutting-edge augmented reality applications. There is no need for additional plugins such as Flash, which FLARToolkit depends on. <a href="http://mobilearexperiences.blogspot.com/2011/08/my-take-on-webgl.html">See my post on that</a>. It runs directly in your browser. It is clearly another way to bring augmented reality closer to a broader public.<br />
<br />
We can only imagine what will be possible once browser support for HTML5, WebGL and JavaScript improves significantly and technologies like WebCL come into play.<br />
<br />
Great work!<br />
<br />
<br />
<div style="text-align: center;">
<iframe allowfullscreen="" frameborder="0" height="315" src="http://www.youtube.com/embed/bi9NII-IB04" width="420"></iframe></div>Anonymoushttp://www.blogger.com/profile/01292190093039898111noreply@blogger.com0tag:blogger.com,1999:blog-7524258427509738348.post-81673385785516666492011-10-24T11:46:00.002+02:002011-10-24T12:03:57.820+02:00Day 2 DroidconUK London<div style="text-align: left;">
<a href="http://2.bp.blogspot.com/-F-Lbm7Mx0PI/TqU1zHjWn2I/AAAAAAAAC7A/uS1c7hnKG_A/s1600/droidcon.png" imageanchor="1" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img border="0" height="158" src="http://2.bp.blogspot.com/-F-Lbm7Mx0PI/TqU1zHjWn2I/AAAAAAAAC7A/uS1c7hnKG_A/s400/droidcon.png" width="400" /></a><br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
My personal <i>Best Of</i> day 2 at DroidconUK 2011.<br />
<br />
<br />
<b>Android Gaming</b><br />
CodeSurgeon: Mustafa Isik<br />
<div style="text-align: center;">
<iframe allowfullscreen="" frameborder="0" height="315" src="http://www.youtube.com/embed/6V9AIVpqNAY" width="420"></iframe></div>
<div style="text-align: center;">
</div>
<br />
<br />
<a name='more'></a><br /></div>
<b><br /></b><br />
<b>Designing UIs for Phones and Tablets</b><br />
Nick Butcher Google<br />
<ul>
<li>Honeycomb visual design -> simple and focused on content; consider changing the presentation of content for tablets</li>
<li>Tablet UI patterns</li>
<ul>
<li>Action Bar: navigation elements on the left, action elements on the right</li>
</ul>
<li>MultiPane Layouts</li>
<ul>
<li>use Fragments</li>
</ul>
<li>Stretch, Stack, Expand/Collapse, Show/Hide</li>
<li>Back & Up functions</li>
<li>Beyond Lists</li>
<ul>
<li>CarouselView 3D and ViewPager 2D</li>
</ul>
</ul>
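The multi-pane pattern from the notes above can be sketched as a tablet layout resource that embeds two fragments side by side. This is only a minimal illustration: the fragment class names (com.example.ItemListFragment, com.example.ItemDetailFragment) and IDs are hypothetical placeholders, not from the talk. A phone layout with the same file name would declare only the list pane, and the activity checks at runtime which variant was inflated.

```xml
<!-- res/layout-large/main.xml: two panes side by side on tablets -->
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:orientation="horizontal"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <!-- left pane: navigation list -->
    <fragment android:name="com.example.ItemListFragment"
        android:id="@+id/list_pane"
        android:layout_width="0dp"
        android:layout_weight="1"
        android:layout_height="match_parent" />

    <!-- right pane: content for the item selected in the list -->
    <fragment android:name="com.example.ItemDetailFragment"
        android:id="@+id/detail_pane"
        android:layout_width="0dp"
        android:layout_weight="2"
        android:layout_height="match_parent" />
</LinearLayout>
```

On a phone, where only the list pane exists, selecting an item would start a separate detail activity instead of updating the second pane.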
<div style="text-align: center;">
<iframe allowfullscreen="" frameborder="0" height="315" src="http://www.youtube.com/embed/bGdkIVMBWx8" width="420"></iframe>
</div>
<div style="text-align: center;">
<br /></div>
<div style="text-align: center;">
<div style="text-align: left;">
</div>
<br /></div>
<b>Android Design Patterns</b><br />
Closertag: Giorgio Venturi <br />
<br />
<blockquote>
Giorgio Venturi will talk about how Android apps have been heavily criticised in the past due to poor user experience. One of the reasons why this happened is lack of solid & consistent UI patterns. For example, how do you navigate between the different sections of the app? How do you provide visual and aural feedback avoiding interrupting the user? How do you allow background usage? The goal of this session is to look at some of the emerging best practices on the Android Market and analyse recent advancements in fluid navigation, interaction and progressive disclosure of information.</blockquote>
<div style="text-align: center;">
<iframe allowfullscreen="" frameborder="0" height="315" src="http://www.youtube.com/embed/20lHuObP2Wg" width="420"></iframe>
</div>
<b><br /></b><br />
<b><br /></b><br />
<b><br /></b><br />
<b><span class="Apple-style-span" style="font-weight: normal;"></span></b><br />
<b>Closing keynote: Android Market for Developers</b><br />
<b>Google: Richard Hyndman</b><br />
<b><br /></b><br />
<blockquote>
Android Market is undergoing continuous revisions to improve the experience for developers and end-users. This talk covers recent developments, and goes into depth on techniques for using Android Market to leverage new monetization models, deliver optimized content, minimize piracy, and maximize application visibility.</blockquote>
<div>
<br /><div style="text-align: center;">
<iframe allowfullscreen="" frameborder="0" height="315" src="http://www.youtube.com/embed/-nZNQRSoR4Y" width="560"></iframe></div>
<br />
<b><br /></b><br />
<b>Predictions Sure To Go Wrong! - Droidcon UK 2011 </b><br />
Mark Murphy<br />
<br />
<div style="text-align: center;">
<iframe allowfullscreen="" frameborder="0" height="315" src="http://www.youtube.com/embed/u9Wj2Q7zusA" width="560"></iframe>
</div>
<div style="text-align: right;">
</div>
<br />
<br />
<div>
<div style="text-align: left;">
<b><br /></b><br />
<b>Application Development using the hidden Android platform APIs </b></div>
<div style="text-align: left;">
Erik Hellman, Sony Ericsson</div>
<div class="separator" style="clear: both; text-align: center;">
</div>
<ul>
<li style="text-align: left;">public, hidden, protected APIs</li>
<li style="text-align: left;">look for "@hide" in Android source</li>
<li style="text-align: left;">read text messages, telephony.java</li>
<li style="text-align: left;">Tethering: need to be a signed/certified developer, WifiManager.java, setWifiApEnabled, Sony Ericsson only</li>
<li style="text-align: left;">injecting TouchEvents by Reflection</li>
<li style="text-align: left;">Api protected by Signature permission</li>
<li style="text-align: left;">Hidden USB</li>
<li style="text-align: left;">Code examples soon at the Sony developer blog</li>
</ul>
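The reflection technique mentioned in these notes can be sketched in plain Java. On a device the real target would be an @hide method such as WifiManager.setWifiApEnabled, which exists at runtime but is stripped from the public SDK (android.jar), so the compiler cannot see it. Since that requires the Android framework, the stand-in class below (a hypothetical FakeWifiManager) keeps the sketch self-contained; the lookup-and-invoke mechanics are the same.

```java
import java.lang.reflect.Method;

public class HiddenApiDemo {

    // Stand-in for a framework class: imagine setWifiApEnabled() is marked
    // @hide in the platform source, so it exists at runtime but does not
    // appear in the public SDK.
    public static class FakeWifiManager {
        private boolean apEnabled = false;

        public boolean setWifiApEnabled(Object config, boolean enabled) {
            apEnabled = enabled;
            return true;
        }

        public boolean isApEnabled() {
            return apEnabled;
        }
    }

    // Look the method up by name at runtime, exactly as you would for a
    // real @hide API that the compiler cannot resolve at build time.
    public static boolean enableApViaReflection(Object manager) throws Exception {
        Method m = manager.getClass()
                .getMethod("setWifiApEnabled", Object.class, boolean.class);
        m.setAccessible(true);
        return (Boolean) m.invoke(manager, null, true);
    }

    public static void main(String[] args) throws Exception {
        FakeWifiManager wm = new FakeWifiManager();
        System.out.println(enableApViaReflection(wm) && wm.isApEnabled()); // prints "true"
    }
}
```

Note that hidden APIs carry no compatibility guarantee across Android versions, and as the talk pointed out, methods protected by a signature-level permission will still fail at runtime unless the app is signed appropriately.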
</div>
</div>Anonymoushttp://www.blogger.com/profile/01292190093039898111noreply@blogger.com0tag:blogger.com,1999:blog-7524258427509738348.post-85187728560313378652011-10-23T22:12:00.000+02:002011-10-23T22:26:43.083+02:00Upcoming ISMAR 2011 - DemosHi there,<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="http://2.bp.blogspot.com/-tiZ16HvUAT0/TqR02LeYnnI/AAAAAAAAC64/K6XgxactglE/s1600/ismar2011.jpg" imageanchor="1" style="clear: right; float: right; margin-bottom: 1em; margin-left: 1em;"><img border="0" height="256" src="http://2.bp.blogspot.com/-tiZ16HvUAT0/TqR02LeYnnI/AAAAAAAAC64/K6XgxactglE/s320/ismar2011.jpg" width="320" /></a></div>
next week (26.10.-29.10.) the <a href="http://www.ismar11.org/">ISMAR</a> conference, the premier international conference on research, technology and applications in Mixed and Augmented Reality, will take place in Basel (Switzerland). It's becoming THE reference for research on (mobile) augmented reality.<br />
<br />
After a first look on the <a href="http://www.ismar11.org/index.php/program">program</a> I hope to see a lot of great AR technology, listen to amazing paper presentations and meet interesting people.<br />
<br />
Here is a small preview of the <a href="http://www.ismar11.org/index.php/program/demos">tech demos</a> that I think will be interesting and will hopefully point to new and innovative directions for future AR applications.<br />
<br />
<br />
<a name='more'></a><br />
<a href="http://www.argon.gatech.edu/index.html"><b>Argon AR web browser</b></a><br />
<i> Blair MacIntyre, Alex Hill, Hafez Rouzati, Maribeth Gandy, Brian Davidson (Georgia Institute of Technology) </i><br />
<br />
<blockquote>
Argon is the completely open standards augmented reality browser that allows rapid development and deployment of Web 2.0 style augmented reality content. This demo accompanies our paper “The Argon AR Web Browser and Standards-based AR Application Environment”. Argon renders a standards compliant combination of KML, HTML, CSS and JavaScript served via typical HTTP servers. Multiple simultaneous channels, analogous to browser tabs on the desktop, let authors create dynamic and interactive AR content using existing web development toolsets.</blockquote>
<br />
<div>
<br />
<br />
<br />
<b>Handheld AR Games at the <a href="http://www.argamestudio.org/">Qualcomm AR Game Studio</a> at Georgia Tech</b><br />
<i>Blair MacIntyre, Yan Xu, Maribeth Gandy (Georgia Institute of Technology)</i><br />
<br />
<blockquote>
In this demo, we will show a collection of games that have been produced over the past year at the Qualcomm Augmented Reality Game Studio, a partnership between the Augmented Environments Lab at the Georgia Institute of Technology (Georgia Tech), the Atlanta campus of the Savannah College of Art and Design (SCAD-Atlanta) and Qualcomm. </blockquote>
</div>
<div class="separator" style="clear: both; text-align: center;">
</div>
<div>
<br />
<br />
<br />
<b>Gravity-aware Handheld Augmented Reality</b><br />
<i>Daniel Kurz, Selim Benhimane (metaio GmbH)</i><br />
<blockquote>
<a href="http://www.ismar11.org/images/demo/thumb/GravityawareHandheld.jpg" imageanchor="1" style="clear: right; float: right; margin-bottom: 1em; margin-left: 1em;"><img border="0" src="http://www.ismar11.org/images/demo/thumb/GravityawareHandheld.jpg" /></a>This demo showcases how different stages in handheld Augmented Reality (AR) applications can benefit from knowing the direction of the gravity measured with inertial sensors. It presents approaches to improve the description and matching of feature points, detection and tracking of planar templates, and the visual quality of the rendering of virtual 3D objects by incorporating the gravity vector. All demonstrations are shown on mobile devices.</blockquote>
<br />
<br />
<b>RGB-D camera-based parallel tracking and meshing</b><br />
<i>Sebastian Lieberknecht, Andrea Huber (metaio GmbH), Slobodan Ilic (TUM), Selim Benhimane (metaio GmbH)</i><br />
<a href="http://www.ismar11.org/images/demo/thumb/RGBDcamerabased.jpg" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img border="0" src="http://www.ismar11.org/images/demo/thumb/RGBDcamerabased.jpg" /></a><br />
<blockquote>
This demonstration showcases an approach for RGB-D camera-based tracking and meshing. We investigated how a camera like the Microsoft Kinect could be used to simplify the SLAM problem based on the additional depth information. Besides for tracking the camera’s motion, the available per-pixel-depth is also used to create a meshed and textured reconstruction of the environment at the same time. The meshed version can be used for occlusion of virtual objects, as will be shown using augmented furniture. We further present a live demonstration of how the sparse and meshed map are built. More details on our approach can be found in our accompanying paper „RGB-D camera-based parallel tracking and meshing“ from this year’s ISMAR.</blockquote>
<br />
<br />
<b>Real-Time Accurate Localization in a Partially Known Environment: Application to Augmented Reality on 3D Objects </b><br />
<i>Mohamed Tamaazousti, Vincent Gay-Bellile, Sylvie Naudet Collette, Steve Bourgeois (CEA, List), Michel Dhome (LASMEA/ CNRS)</i><br />
<br />
<a href="http://www.ismar11.org/images/demo/thumb/RealTimeAccurate.jpg" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img border="0" src="http://www.ismar11.org/images/demo/thumb/RealTimeAccurate.jpg" /></a><br />
<blockquote>
This demo addresses the challenging issue of real-time camera localization in a partially known environment, i.e. for which a geometric 3D model of one static object in the scene is available. We proposed a constrained bundle adjustment framework for keyframe-based SLAM that includes simultaneously the geometric constraints provided by the 3D model, the multi-view constraints relative to the known part of the environment (i.e. the object observations) and the multi-view constraints relative to the unknown part of the environment. We use two different model based constraints to deal with both textured and textureless 3D objects. Consequently, our solution offers both the accuracy of model-based tracking solution and the robustness of SLAM (fast movements, robustness to partial/total occlusion, robustness to large viewpoint changes, etc.).</blockquote>
<br />
<br />
<br />
<b>JavaScript based Natural Feature Tracking </b><br />
<i>Christoph Oberhofer, Jens Grubert, Gerhard Reitmayr (ICG Graz University of Technology)</i><br />
<a href="http://www.ismar11.org/images/demo/thumb/JavaScriptbasedNatural.jpg" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img border="0" src="http://www.ismar11.org/images/demo/thumb/JavaScriptbasedNatural.jpg" /></a><br />
<blockquote>
We present a Natural Feature Tracking pipeline written completely in HTML5 and JavaScript. It runs in real-time on desktop computers on all major web browsers supporting WebGL and achieves interactive frame rates on modern smartphones with mobile web browsers. The tracking pipeline will be made available as Open Source to the community.</blockquote>
</div>Anonymoushttp://www.blogger.com/profile/01292190093039898111noreply@blogger.com0tag:blogger.com,1999:blog-7524258427509738348.post-34959260983592831862011-10-22T12:49:00.001+02:002011-10-22T17:50:09.384+02:00Volkswagen Golf Cabriolet Augmented Reality AppThe Volkswagen Golf Cabriolet with an augmented reality app, available for the iPad 2, iPhone and Android. Made by the Paris-based <a href="http://www.agence-v.fr/">Agency.V</a> using <a href="http://www.t-immersion.com/">Total Immersion</a> technology.<br />
<div style="text-align: left;">
<br /></div>
<div style="text-align: left;">
<br /></div>
<div style="text-align: center;">
<iframe allowfullscreen="" frameborder="0" height="315" src="http://www.youtube.com/embed/pFS6EHzBGVc" width="560"></iframe></div>Anonymoushttp://www.blogger.com/profile/01292190093039898111noreply@blogger.com0