

BioMedia Lab

School of Interactive Arts and Technology (SIAT), Simon Fraser University

Professor: Diane Gromala (Assoc. Director, SIAT)


The BioMedia Lab is dedicated to creatively exploring the relationships among art, technology, and the biosciences.

Our interests range across disciplines: biomedical phenomenology, open source DNA, visceral forensics, genome hacking, affective biometrics, and biomolecular networking.

Projects in the BioMedia Lab emphasize the role that aesthetics, design, and representation play in our broad understanding of biotechnology, biomedicine, and related fields. Many projects involve collaboration with SFU colleagues and students in computer science, biomedical engineering, bioinformatics, electrical engineering, and architecture.


Projects

Meditation Chamber


[[Image:MeditationC.jpg]]

Diane Gromala

The Meditation Chamber is an immersive VR environment that provides users with real-time feedback. Wearing a head-mounted display and a biofeedback device, users are guided through a series of relaxation and meditation techniques. The audio and visuals are synced in real time to the users' continually changing physiological states: respiration, pulse rate, and sweat gland activity (a measure of calmness). As users approach meditative states, the hypnotic visuals dissolve into moving mist and darkness. The project was exhibited at SIGGRAPH's Emerging Technologies and seen on CNN.
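
The description implies a simple control loop: physiological readings are turned into an estimate of how calm the user is, and that estimate drives how far the visuals have dissolved into mist and darkness. The sketch below illustrates that idea only; the sensor call, renderer call, and normalization ranges are invented for illustration and are not the lab's actual implementation.

<pre>
# Illustrative biofeedback loop (Python). read_biosignals() and render_scene()
# are invented stand-ins for the real sensor and VR rendering interfaces;
# the resting/active ranges below are assumptions used only for normalization.
import random
import time

def read_biosignals():
    # Simulated (respiration rate, pulse rate, skin conductance) readings.
    return random.uniform(8, 20), random.uniform(55, 90), random.uniform(2, 12)

def render_scene(mist_opacity, brightness):
    # Stand-in for the VR renderer: just report the current fade state.
    print(f"mist={mist_opacity:.2f} brightness={brightness:.2f}")

def calmness(respiration, pulse, conductance):
    # Crude 0..1 estimate: lower values of each signal read as calmer.
    r = max(0.0, min(1.0, (20.0 - respiration) / 12.0))   # breaths/min
    p = max(0.0, min(1.0, (90.0 - pulse) / 35.0))          # beats/min
    c = max(0.0, min(1.0, (12.0 - conductance) / 10.0))    # microsiemens
    return (r + p + c) / 3.0

fade = 0.0  # 0 = full hypnotic visuals, 1 = mist and darkness
for _ in range(300):                     # stand-in for the real-time loop
    target = calmness(*read_biosignals())
    fade += 0.05 * (target - fade)       # ease toward the current calmness
    render_scene(mist_opacity=fade, brightness=1.0 - fade)
    time.sleep(1.0 / 30)                 # roughly 30 updates per second
</pre>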




MeatBook


[[Image:Metbook.jpg]]

Diane Gromala

A project that provokes users' awareness of their own visceral responses. A time-based (that is, decaying) slab of meat, constructed as a book, is embedded with sensors that cause the meat to react and quiver as a viewer approaches it. The reanimated flesh also responds with other movements and sound when users touch it. The next stage of development includes artificial intelligence and explores notions of generative art.
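
As a rough illustration of the sensor-to-actuator logic this description implies (a proximity reading drives the quivering, and touch triggers stronger movement plus sound), here is a minimal sketch. Every function, threshold, and name in it is an invented stand-in; the MeatBook's actual hardware and software are not documented on this page.

<pre>
# Illustrative sensor loop (Python). All sensor and actuator functions are
# simulated placeholders, not the installation's actual code.
import random
import time

def read_proximity_cm():
    return random.uniform(5, 200)      # simulated distance to the nearest viewer

def read_touch():
    return random.random() < 0.05      # simulated occasional touch events

def drive_actuators(intensity):
    print(f"quiver intensity: {intensity:.2f}")

def play_sound(name):
    print(f"sound: {name}")

for _ in range(100):                   # stand-in for the installation's main loop
    if read_touch():
        drive_actuators(1.0)           # strongest movement when touched...
        play_sound("touch_response")   # ...accompanied by sound
    else:
        distance = read_proximity_cm()
        # Quiver harder the closer a viewer gets, within a 100 cm radius.
        drive_actuators(max(0.0, (100.0 - distance) / 100.0))
    time.sleep(0.1)
</pre>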




NeuroFloat


[[Image:Neurofloat.jpg]]

Steven Barnes, Meehae Song

The goal of the NeuroFloat project is to create an interactive system that uses electroencephalographic (EEG) or other biological signals obtained from the user as a primary input for navigation through an immersive, real-time 3-D visualization of various regions of the human brain. One of our primary goals is to create a system in which the brain-computer interface is intrinsically related to the content of the virtual-reality (VR) environment it interfaces with. Accordingly, we have chosen to develop multiple VR environments that collectively represent a tour through the early stages of the user’s very own visual system.
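
One plausible reading of "EEG as a primary input for navigation" is that power in a particular frequency band steers or paces movement through the visualization. The sketch below shows that idea with a simulated signal: alpha-band (8–12 Hz) power is estimated from one-second windows and used as forward speed. The sampling rate, band choice, and mapping are assumptions for illustration, not the project's documented design.

<pre>
# Illustrative EEG-to-navigation mapping (Python, NumPy). The signal is
# simulated and every parameter here is an assumption, not NeuroFloat's design.
import numpy as np

FS = 256  # assumed EEG sampling rate in Hz

def alpha_power(window):
    # Relative power in the 8-12 Hz (alpha) band of a 1-D sample window.
    spectrum = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(len(window), d=1.0 / FS)
    band = spectrum[(freqs >= 8.0) & (freqs <= 12.0)].sum()
    total = spectrum[freqs >= 1.0].sum()
    return float(band / total) if total > 0 else 0.0

position = 0.0
for second in range(10):
    # One second of fake EEG: broadband noise plus a 10 Hz alpha rhythm.
    t = np.arange(FS) / FS
    window = 0.5 * np.random.randn(FS) + np.sin(2.0 * np.pi * 10.0 * t)
    speed = alpha_power(window)   # more alpha -> faster drift forward
    position += speed             # advance along the tour of the visual system
    print(f"t={second}s  alpha={speed:.2f}  position={position:.2f}")
</pre>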