
Augmented Reality Lab holding Open House for virtual art and technology displays March 10

The Augmented Reality Lab in the Faculty of Fine Arts opens its doors to the York community March 10 for interactive demonstrations of augmented reality (AR) and GPS locative media research projects in development. From 11am to 3pm, visitors will have the opportunity to experience projections on FogScreen, immersive virtual environments and other innovative applications for AR technology.

Directed by film Professor Caitlin Fisher, Canada Research Chair in Digital Culture, York's Augmented Reality Lab is at the forefront of work with both established and emerging technologies. As part of the Future Cinema Lab, it is dedicated to producing innovative research methods, interfaces and content that challenge cinematic and literary conventions and aim to enhance how people interact with their physical environment and with each other.

Left: Caitlin Fisher

The lab offers artists and designers the opportunity to explore new screen technologies, approaches and techniques through production and theoretical study of this emerging medium. Lab participants work interactively across disciplinary boundaries, particularly between film and computer science.

A wide variety of projects will be on display at the open house.

Handheld City is an online streaming experience developed by the AR Lab for the City of Toronto’s virtual museum project, which launched March 6 (Toronto’s 176th birthday). Using AR as a storytelling device, the researchers organized and animated the digital objects in the museum collection, creating a novel way to interact with the objects and access the accompanying text.

Right: Handheld City was developed for Toronto's virtual museum project

The Amazing Cinemagician is an interactive RFID (radio frequency identification, similar to a barcode) video project for the FogScreen by Helen Papagiannis, a PhD student in Communication & Culture. Digitized film clips by cinematic special-effects pioneer Georges Méliès are tied to a series of RFID-tagged objects that the viewer can scan to access the video.
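The project's exact setup isn't detailed here, but the general RFID-to-video pattern it describes is simple to sketch. The following is a minimal, hypothetical Python version, assuming a serial RFID reader (read via the pyserial library) and an external video player such as VLC; the tag IDs, clip files and device path are placeholders, not the lab's actual configuration.

```python
# Hypothetical sketch: map RFID tag IDs to film clips and play the matching
# clip when a tag is scanned. Assumes a serial RFID reader (via pyserial)
# and an external video player; all IDs and paths are placeholders.
import serial      # pip install pyserial
import subprocess

# Placeholder mapping from tag IDs to digitized clips
CLIPS = {
    "04A1B2C3": "melies_clip_1.mp4",
    "04D4E5F6": "melies_clip_2.mp4",
}

def main():
    reader = serial.Serial("/dev/ttyUSB0", 9600)   # assumed port and baud rate
    while True:
        tag_id = reader.readline().decode("ascii", "ignore").strip()
        clip = CLIPS.get(tag_id)
        if clip:
            # Hand the clip to an external player (here VLC, full screen)
            subprocess.run(["vlc", "--fullscreen", "--play-and-exit", clip])

if __name__ == "__main__":
    main()
```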

Papagiannis made major waves in AR circles last fall with her presentation at the International Symposium on Mixed and Augmented Reality in Florida. A leading AR news blog, Games Alfresco, dubbed her “the new ARtist in charge,” awarded her its Most Beautiful Demo award and put her on its top 10 list of forces currently shaping the industry.

CommCult master's student Justin Stephenson showcases a new "procedural animation" (a form of computer animation generated in real time) using Quartz Composer.
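For readers unfamiliar with the term, here is a minimal, generic illustration of the procedural idea, not Stephenson's Quartz Composer piece: each frame's geometry is computed from rules and the current time rather than from stored keyframes.

```python
# Generic procedural-animation sketch (illustrative only): point positions
# are derived purely from the elapsed time, so no keyframes are stored.
import math
import time

def frame_positions(t, n=12):
    """Positions of n points orbiting a centre, computed from time t."""
    return [
        (math.cos(t + 2 * math.pi * i / n), math.sin(t + 2 * math.pi * i / n))
        for i in range(n)
    ]

if __name__ == "__main__":
    start = time.time()
    for _ in range(3):                      # render a few "frames" as text
        t = time.time() - start
        print([f"({x:+.2f}, {y:+.2f})" for x, y in frame_positions(t)[:3]])
        time.sleep(1 / 30)                  # ~30 fps frame pacing
```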

Master of Fine Arts film student Simone Rapisarda presents the ladybike test project: the first film to come out of the lab using the Ladybug camera. This spherical digital video unit comprises multiple cameras and records more than 80 per cent of the full sphere. Rapisarda’s video, filmed with the camera set in a bicycle basket, shows the scenery approaching, speeding by and receding simultaneously.

Above: An image from Simone Rapisarda's ladybike test project

Also experimenting with the Ladybug camera is graduate film student Cameron Woykin, who has created a time-based video installation using footage of himself shot inside the lab. Edited into a spherical image, the video shows multiple views of the researcher as he moves around the space.

Right: The Ladybug camera in action

Wormholes is another experiment in spherical storytelling, created by Fisher and Andrew Roth, the lab’s technology manager. Using the lab’s InterSense IS-900 inertial/sonic tracking "virtual reality" grid, participants wearing a virtual reality headset can step inside and explore simultaneous realities through spherical video clips shot by Fisher and Roth at various locations on campus.

Several projects use SnapdragonAR software, an innovative "drag and drop" AR interface developed in the lab in collaboration with computer vision researchers Andrei Rotenstein and Mikhail Sizintsev, PhD candidates in computer science, and Dr. Mark Fiala. Snapdragon allows people without computer programming skills to easily build AR experiences. This software is now available for sale through Future Stories, a spin-off company that York’s AR Lab established to give participants the option of commercializing their lab developments.

The Snapdragon projects created in the lab by graduate students include Papagiannis' sound toy wonder turner; Boaz Berri’s Neighbours, which fills an image of an apartment complex with videos of life inside the building; The Underground Cave by Carter Bruce, Anne Koizumi and Claudia Sicondolfo, which animates a model of an underground space; and a work-in-progress by Evelyn Tchakarov.

Fisher will also be showing an AR tabletop theatre piece called Circle, which was presented for the first time last December as part of the Digital Arts & Culture conference at the University of California, Irvine. Wallace Edwards, a Governor General's Literary Award-winning children’s book illustrator, will show some recent experiments with AR illustrations that come to life in your hands.

Above: A collection of images from the Snapdragon projects created in the lab by graduate students

Another computer program developed in-house is an iPhone GPS video-caching application created by Roth and Rotenstein. In an experience akin to a technological Easter egg hunt, the running application displays a digital surprise – in this case, a film clip – when the phone is physically located at a predetermined GPS coordinate. CommCult master's student Magda Olszanowski's Suivez Moi was built using the GPS software. An outdoor demo of her project is available now by appointment (call 416-736-2100 ext. 21077), but the lab hopes these locative film experiences will be available for download through the Apple App Store in the near future.
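The app's source isn't described here, but the trigger logic it relies on – playing a clip once the phone enters a small radius around a stored coordinate – is easy to sketch. Below is a minimal, hypothetical Python version using the standard haversine distance formula; the coordinates, trigger radius and clip names are placeholders, not values from the lab's application.

```python
# Hypothetical geofence sketch: return the clip to play when the device's
# position comes within a trigger radius of a predetermined coordinate.
# All cache sites and the radius below are placeholders.
import math

# Placeholder cache sites: (latitude, longitude, clip file)
CACHES = [
    (43.7735, -79.5019, "suivez_moi_clip1.mp4"),  # e.g. a spot on York's campus
]
TRIGGER_RADIUS_M = 25.0

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r = 6371000.0                                  # mean Earth radius, metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def check_position(lat, lon):
    """Return a clip filename if (lat, lon) is inside any cache's geofence."""
    for cache_lat, cache_lon, clip in CACHES:
        if haversine_m(lat, lon, cache_lat, cache_lon) <= TRIGGER_RADIUS_M:
            return clip
    return None
```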

Republished courtesy of YFile – York University’s daily e-bulletin.