Sunday, October 23, 2011

Upcoming ISMAR 2011 - Demos

Hi there,

Next week (26.10.–29.10.) the ISMAR conference will take place in Basel, Switzerland. It is the premier international conference on research, technology and applications in Mixed and Augmented Reality, and it is becoming THE reference for research on (mobile) augmented reality.

After a first look at the program I hope to see a lot of great AR technology, listen to amazing paper presentations and meet interesting people.

Here is a small preview of the tech demos that I think will be interesting and that will hopefully point to new and innovative directions for future AR applications.



Argon AR web browser
Blair MacIntyre, Alex Hill, Hafez Rouzati, Maribeth Gandy, Brian Davidson (Georgia Institute of Technology)

Argon is a completely open-standards augmented reality browser that allows rapid development and deployment of Web 2.0-style augmented reality content. This demo accompanies our paper "The Argon AR Web Browser and Standards-based AR Application Environment". Argon renders a standards-compliant combination of KML, HTML, CSS and JavaScript served via typical HTTP servers. Multiple simultaneous channels, analogous to browser tabs on the desktop, let authors create dynamic and interactive AR content using existing web development toolsets.
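To make the "just standard web content over HTTP" idea concrete, here is a small hypothetical sketch (my own illustration, not Argon's actual API or schema) of how a client could fetch a channel served as plain KML and pull out the geo-anchors that HTML/CSS/JS content would be attached to. Only standard Web APIs (fetch, DOMParser) are used, and the channel URL and return shape are assumptions.

```typescript
// Hypothetical sketch, not Argon's implementation: load a "channel" served as
// plain KML over HTTP and extract the geo-anchors for AR content.
async function loadChannel(url: string): Promise<{ lon: number; lat: number; alt: number }[]> {
  const response = await fetch(url);            // a channel is just an HTTP resource
  const kmlText = await response.text();
  const kml = new DOMParser().parseFromString(kmlText, "application/xml");

  const anchors: { lon: number; lat: number; alt: number }[] = [];
  // Each <Placemark><Point><coordinates> element holds "lon,lat[,alt]".
  for (const node of Array.from(kml.getElementsByTagName("coordinates"))) {
    const [lon, lat, alt = 0] = (node.textContent ?? "").trim().split(",").map(Number);
    anchors.push({ lon, lat, alt });
  }
  return anchors;
}
```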




Handheld AR Games at the Qualcomm AR Game Studio at Georgia Tech
Blair MacIntyre, Yan Xu, Maribeth Gandy (Georgia Institute of Technology)

In this demo, we will show a collection of games that have been produced over the past year at the Qualcomm Augmented Reality Game Studio, a partnership between the Augmented Environments Lab at the Georgia Institute of Technology (Georgia Tech), the Atlanta campus of the Savannah College of Art and Design (SCAD-Atlanta) and Qualcomm.



Gravity-aware Handheld Augmented Reality
Daniel Kurz, Selim Benhimane (metaio GmbH)

This demo showcases how different stages in handheld Augmented Reality (AR) applications can benefit from knowing the direction of gravity measured with inertial sensors. It presents approaches to improve the description and matching of feature points, the detection and tracking of planar templates, and the visual quality of the rendering of virtual 3D objects by incorporating the gravity vector. All demonstrations are shown on mobile devices.
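The core trick behind gravity-aligned feature description, as I understand it, is to replace the usual gradient-based keypoint orientation with the direction that gravity takes in the image at that keypoint. A minimal sketch, assuming known pinhole intrinsics and a gravity vector already rotated into camera coordinates (names and structure are mine, not metaio's implementation):

```typescript
type Vec3 = [number, number, number];

// Assumed known pinhole intrinsics.
interface Intrinsics { fx: number; fy: number; cx: number; cy: number; }

// Compute the 2D direction that gravity induces in the image at a keypoint:
// back-project the pixel to a viewing ray, displace it slightly along gravity,
// project back, and measure the resulting in-image angle.
function gravityDirectionAt(
  gravityCam: Vec3,                       // gravity in camera coordinates (from the IMU)
  keypoint: { x: number; y: number },
  K: Intrinsics
): number {
  const ray: Vec3 = [(keypoint.x - K.cx) / K.fx, (keypoint.y - K.cy) / K.fy, 1];
  const eps = 1e-3;                       // small step along gravity
  const displaced: Vec3 = [
    ray[0] + eps * gravityCam[0],
    ray[1] + eps * gravityCam[1],
    ray[2] + eps * gravityCam[2],
  ];
  const px = K.fx * (displaced[0] / displaced[2]) + K.cx;
  const py = K.fy * (displaced[1] / displaced[2]) + K.cy;
  return Math.atan2(py - keypoint.y, px - keypoint.x);
}

// The descriptor patch is then rotated by this angle instead of by a
// gradient-based orientation, so matching no longer has to be invariant to
// rotations that the accelerometer already disambiguates.
```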


RGB-D camera-based parallel tracking and meshing
Sebastian Lieberknecht, Andrea Huber (metaio GmbH), Slobodan Ilic (TUM), Selim Benhimane (metaio GmbH)

This demonstration showcases an approach for RGB-D camera-based tracking and meshing. We investigated how a camera like the Microsoft Kinect could be used to simplify the SLAM problem based on the additional depth information. Besides tracking the camera's motion, the available per-pixel depth is also used to create a meshed and textured reconstruction of the environment at the same time. The meshed version can be used for occlusion of virtual objects, as will be shown using augmented furniture. We further present a live demonstration of how the sparse and meshed maps are built. More details on our approach can be found in our accompanying paper "RGB-D camera-based parallel tracking and meshing" from this year's ISMAR.
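Why per-pixel depth simplifies SLAM is easy to see in code: every pixel with a valid depth reading can be back-projected to a metric 3D point directly, so map points (and later mesh vertices) do not have to be triangulated from multiple views. A minimal sketch under assumed pinhole intrinsics, not the authors' actual code:

```typescript
interface Intrinsics { fx: number; fy: number; cx: number; cy: number; }

// Back-project a depth image (meters, row-major) into a dense 3D point buffer.
function backProjectDepth(
  depth: Float32Array,
  width: number,
  height: number,
  K: Intrinsics
): Float32Array {
  const points = new Float32Array(width * height * 3);   // x, y, z per pixel
  for (let v = 0; v < height; v++) {
    for (let u = 0; u < width; u++) {
      const z = depth[v * width + u];
      if (!(z > 0)) continue;                             // invalid depth stays at (0,0,0)
      const i = (v * width + u) * 3;
      points[i]     = ((u - K.cx) / K.fx) * z;
      points[i + 1] = ((v - K.cy) / K.fy) * z;
      points[i + 2] = z;
    }
  }
  return points;
}
// Connecting neighbouring valid pixels into triangles then yields the textured
// mesh that is used to occlude the virtual furniture mentioned above.
```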


Real-Time Accurate Localization in a Partially Known Environment: Application to Augmented Reality on 3D Objects
Mohamed Tamaazousti, Vincent Gay-Bellile, Sylvie Naudet Collette, Steve Bourgeois (CEA, List), Michel Dhome (LASMEA/ CNRS)


This demo addresses the challenging issue of real-time camera localization in a partially known environment, i.e. one for which a geometric 3D model of one static object in the scene is available. We propose a constrained bundle adjustment framework for keyframe-based SLAM that simultaneously includes the geometric constraints provided by the 3D model, the multi-view constraints relative to the known part of the environment (i.e. the object observations) and the multi-view constraints relative to the unknown part of the environment. We use two different model-based constraints to deal with both textured and textureless 3D objects. Consequently, our solution offers both the accuracy of a model-based tracking solution and the robustness of SLAM (fast movements, robustness to partial/total occlusion, robustness to large viewpoint changes, etc.).
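Schematically, and in my own notation rather than the authors', such a constrained bundle adjustment can be thought of as one joint reprojection cost in which points associated with the known object are additionally forced to lie on the model:

\[
E(\{C_j\},\{Q_i\}) = \sum_{i \in \text{env}} \sum_{j} \big\| \pi(C_j, Q_i) - q_{ij} \big\|^2 \;+\; \sum_{i \in \text{obj}} \sum_{j} \big\| \pi(C_j, Q_i) - q_{ij} \big\|^2
\quad \text{s.t.} \quad Q_i \in \mathcal{M} \ \text{for } i \in \text{obj},
\]

where \(C_j\) are the keyframe poses, \(Q_i\) the reconstructed 3D points, \(q_{ij}\) their image observations, \(\pi\) the projection function and \(\mathcal{M}\) the surface of the known 3D model. The known-object term is what anchors the SLAM map to the model and gives the accuracy of model-based tracking, while the environment term keeps the robustness of SLAM.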



JavaScript based Natural Feature Tracking
Christoph Oberhofer, Jens Grubert, Gerhard Reitmayr (ICG Graz University of Technology)

We present a Natural Feature Tracking pipeline written completely in HTML5 and JavaScript. It runs in real-time on desktop computers on all major web browsers supporting WebGL and achieves interactive frame rates on modern smartphones with mobile web browsers. The tracking pipeline will be made available as Open Source to the community.
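For anyone wondering what "NFT entirely in the browser" involves on the plumbing side, here is a small sketch of the frame-grabbing loop such a pipeline needs, using standard Web APIs as exposed today. The actual feature detection, matching and pose estimation (the hard part of the authors' work) is only a placeholder stub here:

```typescript
const video = document.createElement("video");
const canvas = document.createElement("canvas");
const ctx = canvas.getContext("2d")!;

// Placeholder for the actual natural feature tracking step
// (detect/describe/match features, estimate pose, render with WebGL).
function trackFrame(gray: Uint8ClampedArray, width: number, height: number): void {
  // ...
}

navigator.mediaDevices.getUserMedia({ video: true }).then(stream => {
  video.srcObject = stream;
  video.play();

  const loop = () => {
    if (video.videoWidth > 0) {
      canvas.width = video.videoWidth;
      canvas.height = video.videoHeight;
      ctx.drawImage(video, 0, 0);
      const rgba = ctx.getImageData(0, 0, canvas.width, canvas.height).data;
      // Convert to grayscale, since natural feature detectors work on intensity.
      const gray = new Uint8ClampedArray(canvas.width * canvas.height);
      for (let i = 0; i < gray.length; i++) {
        gray[i] = 0.299 * rgba[4 * i] + 0.587 * rgba[4 * i + 1] + 0.114 * rgba[4 * i + 2];
      }
      trackFrame(gray, canvas.width, canvas.height);
    }
    requestAnimationFrame(loop);
  };
  requestAnimationFrame(loop);
});
```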
