Posts from October 2010.

Mobile AR Initial Test


Above is a rough test of AR running on the iPhone.
Constant marker tracking is heavy on the iPhone: with only three markers the frame rate hovers between 8 and 20 fps, and tracking becomes inconsistent with sudden movements or in poor lighting. Tracking the position and orientation of the phone with the gyroscope and accelerometer instead will resolve these issues, allowing fluid, high-frame-rate viewing of virtual objects in any environment.
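The gyroscope/accelerometer fusion mentioned above is commonly done with a complementary filter: the gyroscope gives a fast but drifting angular rate, while the accelerometer gives a noisy but drift-free gravity reference. A minimal sketch of the idea in C++ (illustrative only; the actual app would read these sensors through Apple's Core Motion API, and all names here are my own):

```cpp
#include <cassert>
#include <cmath>

// Minimal complementary filter for one axis (e.g. pitch).
// Blends fast-but-drifting gyro integration with the slow-but-stable
// angle derived from the accelerometer's gravity vector.
struct ComplementaryFilter {
    float angle = 0.0f;   // estimated angle in radians
    float alpha = 0.98f;  // weight given to the gyroscope integration

    void update(float gyroRate, float accelAngle, float dt) {
        // Integrate the gyro rate, then pull the estimate gently
        // toward the accelerometer's absolute reading.
        angle = alpha * (angle + gyroRate * dt)
              + (1.0f - alpha) * accelAngle;
    }
};
```

With no rotation reported by the gyroscope, the estimate converges to the accelerometer's gravity-derived angle, which is exactly the drift-correction behavior needed for marker-free tracking.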

Concept:

Using AR coupled with the latest iPhone technology, it is now possible to create virtual objects that exist in real space, allowing the user not only to observe them but to interact with and create them. It is thus possible to build an interface for a community of creators to generate rich 3D content that exists in and among everyday objects. Virtual reality can now finally be extracted from the solitary context of a personal computer and facilitate face-to-face interaction.

The project I am working on will culminate in an iPhone application that enables users to view and generate 3D Augmented Reality objects in any imaginable circumstances. At this point the full functionality of the application can only be achieved on iPhone 4; limited functionality will be available on earlier iOS devices and possibly Android.


Viewing Augmented Reality:


Objects created with the app, programmed, or imported from 3D software can be viewed, and possibly even interacted with, using the application. The steps to do so will be incredibly simple:
1. Enter the space where the virtual objects are located.
2. Lock into a fiducial (or similar) marker associated with the space.
3. Roam the area to view objects from all directions and angles.


One will also be able to create objects of one's own. Initially these will be simple polygon-based shapes. Although creating sophisticated objects may be difficult at first, simple shapes can be created with a few touches of the screen, following these steps:


1. Print out a fiducial marker and affix it to a stationary surface (not necessary if the marker is already present).
2. Lock into the marker.
3. Mark three points in space by moving the phone to a desired point and pressing a button.


These three points form a triangle. Using discrete triangles, triangle fans, or triangle strips, one can generate a wide array of 3D shapes.
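The point-marking flow above can be sketched as building a triangle strip: each point the user marks is appended to a vertex list, and every vertex after the second closes a new triangle with the two before it. A hypothetical sketch (the type and function names are my own, not the app's):

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

struct Vec3 { float x, y, z; };

// Collects points marked in space and interprets them as a triangle strip.
struct StripBuilder {
    std::vector<Vec3> vertices;

    // Called each time the user presses the button at a point in space.
    void markPoint(float x, float y, float z) {
        vertices.push_back({x, y, z});
    }

    // A strip of n vertices (n >= 3) describes n - 2 triangles.
    std::size_t triangleCount() const {
        return vertices.size() < 3 ? 0 : vertices.size() - 2;
    }
};
```

After the initial three points, each additional marked point costs only one button press but yields a whole new triangle, which is what makes strips attractive for quick freehand modeling.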
One will also be able to draw a line by holding a button and moving the iPhone through space.


All virtual object data will be linked to an online database and associated with geographical locations. This will enable anyone to view the objects and identify their rough location on a map.


Technical details:
The release of iPhone 4 offers some exciting new possibilities for Augmented Reality. Apple is finally allowing access to the video API, enabling the frame analysis that is integral to AR. On top of this, the addition of the gyroscope allows, in combination with the accelerometer, fluid tracking of the phone’s position and orientation in 3D space. This liberates us from dependence on resource-heavy, unreliable video tracking of fiducial markers to place AR objects. With the system I am developing, one will only need to lock into the marker once and afterwards be free to roam the space, observing various 3D AR objects.
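The "lock once, then roam" idea amounts to capturing the marker-derived model-view matrix a single time and composing it with sensor-derived rotations from then on. A minimal sketch of that matrix math, assuming column-major OpenGL-style 4x4 matrices (illustrative only; not the project's actual code):

```cpp
#include <array>
#include <cassert>
#include <cmath>

using Mat4 = std::array<float, 16>;  // column-major, as OpenGL expects

// Multiply two 4x4 column-major matrices: result = a * b.
Mat4 mul(const Mat4& a, const Mat4& b) {
    Mat4 c{};
    for (int col = 0; col < 4; ++col)
        for (int row = 0; row < 4; ++row)
            for (int k = 0; k < 4; ++k)
                c[col * 4 + row] += a[k * 4 + row] * b[col * 4 + k];
    return c;
}

// Rotation about the Y axis by `angle` radians (column-major layout).
// A gyro-derived yaw update would be composed onto the locked marker
// pose each frame: modelView = mul(markerPose, rotationY(yaw));
Mat4 rotationY(float angle) {
    float c = std::cos(angle), s = std::sin(angle);
    return { c, 0, -s, 0,
             0, 1,  0, 0,
             s, 0,  c, 0,
             0, 0,  0, 1 };
}
```

Since rotations compose cleanly, applying a rotation and then its inverse returns the identity, which is a quick sanity check that the matrix convention is consistent.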


Accomplished so far:
An AR framework running on the iPhone using openFrameworks.
I am using ARToolKitPlus.


There is a great example here that goes into detail on how to compile the library: http://github.com/benlodotcom/VRToolKit


Things to do:
Implement the gyroscope API (so far missing in openFrameworks).
Some OpenGL matrix transformations to calculate the position of the device relative to the virtual objects.
Database design and implementation, and finally the user-interface design.

Traceroute

Visualization of traceroutes with Google Maps.

http://momentsound.com/mobile/trace/locate.php

Mobile Media – test1

Text “pumper” to 41411.

MMS image collage

http://momentsound.com/mobile/mms.php