
Augmented reality

From Wikipedia, the free encyclopedia
This is an old revision of this page, as edited by 24.201.234.84 (talk) at 17:07, 18 October 2007 (Companies). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

Augmented reality (AR) is a field of computer research which deals with the combination of real world and computer generated data. At present, most AR research is concerned with the use of live video imagery which is digitally processed and "augmented" by the addition of computer generated graphics. Advanced research includes the use of motion tracking data, fiducial marker recognition using machine vision, and the construction of controlled environments containing any number of sensors and actuators.
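At its simplest, the digital "augmentation" described above is per-pixel compositing of computer-generated graphics onto a live video frame. The following minimal Python sketch illustrates the idea with plain lists of RGB tuples; the function name, image representation, and values are illustrative, not taken from any particular AR system:

```python
def composite(frame, overlay, alpha_mask):
    """Blend a computer-generated overlay onto a video frame.

    frame, overlay: 2-D lists of (r, g, b) tuples with values 0-255.
    alpha_mask: 2-D list of floats in [0, 1]; 0 keeps the camera
    pixel, 1 shows only the generated graphic.
    """
    out = []
    for row_f, row_o, row_a in zip(frame, overlay, alpha_mask):
        out_row = []
        for (fr, fg, fb), (vr, vg, vb), a in zip(row_f, row_o, row_a):
            out_row.append((
                round(a * vr + (1 - a) * fr),
                round(a * vg + (1 - a) * fg),
                round(a * vb + (1 - a) * fb),
            ))
        out.append(out_row)
    return out

# A 1x2 "frame": one black pixel, one white pixel.
frame = [[(0, 0, 0), (255, 255, 255)]]
overlay = [[(255, 0, 0), (255, 0, 0)]]   # solid red graphic
mask = [[1.0, 0.5]]                       # opaque, then half-blended
print(composite(frame, overlay, mask))
# → [[(255, 0, 0), (255, 128, 128)]]
```

A real system performs this blend (usually on graphics hardware) for every frame of the video stream, after the tracking and recognition steps have decided what to draw and where.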

History

The history of augmented reality is also the history of humanity's efforts to add to the natural world around it.

  • 15,000 BC – The Lascaux cave paintings presented “virtual” images in a darkened cave, an early example of enhancing the real world with imagery.
  • 1849 – Richard Wagner introduces the idea of immersive experiences using a darkened theatre and surrounding the audience in imagery and sound.
  • 1938 – Konrad Zuse builds the Z1, an early program-controlled digital computer.
  • 1948 – Norbert Wiener creates the science of cybernetics: transmitting messages between man and machine.
  • 1962 – Morton Heilig, a cinematographer, creates a motorcycle simulator called Sensorama with visuals, sound, vibration, and smell.
  • 1966 – Ivan Sutherland invents the head-mounted display, suggesting it offered a window into a virtual world.
  • 1975 – Myron Krueger creates Videoplace, which allows users to interact with virtual objects for the first time.
  • 1989 – Jaron Lanier coins the phrase Virtual Reality and creates the first commercial business around virtual worlds.
  • 1990 – Tom Caudell coins the phrase Augmented Reality while at Boeing, where head-mounted displays helped workers assemble cables in aircraft.

AR as a transformative technology

For many of those interested in AR, one of its most important characteristics is the way it transforms the focus of interaction. The interactive system is no longer a precise location but the whole environment; interaction is no longer simply a face-to-screen exchange but dissolves into the surrounding space and objects. Using an information system is no longer an exclusively conscious and intentional act.

Definition of Augmented Reality

Ronald Azuma's definition of AR is one of the more focused descriptions. It covers a subset of AR's original goal but has come to be understood as representing the whole domain: augmented reality is an environment that includes both virtual-reality and real-world elements. For instance, an AR user might wear translucent goggles through which they could see the real world as well as computer-generated images projected on top of it. Azuma defines an augmented reality system as one that

  • combines real and virtual
  • is interactive in real time
  • is registered in 3D

This definition is now widely used in the research literature (Azuma, 1997).
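The "registered in 3D" requirement means that virtual content is anchored to real-world coordinates and must be re-projected into the camera image on every frame. A minimal sketch of this projection step, using the standard pinhole camera model (the focal length and principal point below are illustrative placeholder values, not from any specific device):

```python
def project(point_cam, f=800.0, cx=320.0, cy=240.0):
    """Project a 3-D point given in camera coordinates (x right,
    y down, z forward) to pixel coordinates (u, v).

    f is the focal length in pixels; (cx, cy) is the principal
    point (image centre). Returns None for points behind the camera.
    """
    x, y, z = point_cam
    if z <= 0:
        return None          # behind the camera: not visible
    u = f * x / z + cx
    v = f * y / z + cy
    return (u, v)

# A virtual label 2 m in front of the camera, 0.5 m to the right:
print(project((0.5, 0.0, 2.0)))   # → (520.0, 240.0)
```

In a full AR system the tracking subsystem supplies the camera pose each frame, the virtual object's world coordinates are transformed into camera coordinates, and only then is this projection applied; errors anywhere in that chain appear to the user as virtual objects "swimming" relative to the real scene.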

Outdoor AR

A major area of current research is the use of AR outdoors. GPS receivers and orientation sensors enable backpack computing systems to take AR outside prepared environments.

Early systems were developed by Dr. Steven Feiner at Columbia University (the MARS system) and by Dr. Bruce H. Thomas and Dr. Wayne Piekarski in the Wearable Computer Lab at the University of South Australia [1]. ARQuake is an example of outdoor AR in use.

Trimble Navigation, a provider of advanced positioning solutions, has been researching outdoor AR in collaboration with the Human Interface Technology Laboratory at its New Zealand R&D site in Christchurch. Local network news has reviewed its progress.
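An outdoor AR system must convert a GPS fix into something it can draw: typically the bearing and distance from the user to a point of interest, which (together with the compass heading from the orientation sensor) determine where on screen a label belongs. A sketch of that geolocation step using the standard initial-bearing and haversine formulas (the function name and the 6371 km mean Earth radius are conventional approximations, not details from the systems named above):

```python
import math

def bearing_and_distance(lat1, lon1, lat2, lon2):
    """Initial compass bearing (degrees, 0 = north) and great-circle
    distance (metres) from the user's GPS fix to a point of interest."""
    R = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    # Initial bearing via the forward-azimuth formula
    y = math.sin(dlon) * math.cos(p2)
    x = (math.cos(p1) * math.sin(p2)
         - math.sin(p1) * math.cos(p2) * math.cos(dlon))
    bearing = math.degrees(math.atan2(y, x)) % 360.0
    # Great-circle distance via the haversine formula
    dlat = p2 - p1
    a = (math.sin(dlat / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2)
    dist = 2 * R * math.asin(math.sqrt(a))
    return bearing, dist

# A point of interest due east of the user on the equator:
b, d = bearing_and_distance(0.0, 0.0, 0.0, 0.001)
print(b, d)   # bearing 90.0, distance roughly 111 m
```

The system then subtracts the user's compass heading from the bearing to get the horizontal screen offset of the label, which is why sensor accuracy (GPS drift, compass error) directly limits how precisely outdoor AR annotations stay attached to landmarks.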

Ubiquitous computing

AR has clear connections with the domains of ubiquitous computing (abbreviated UC) and wearable computers. Mark Weiser stated that "embodied virtuality", the original term he used before coining "ubiquitous computing", was intended to express the exact opposite of the concept of virtual reality (Mark Weiser, personal communication, Boston, March 1993). The most salient distinction between AR and UC is that UC does not focus on the disappearance of conscious and intentional interaction with an information system as much as AR does: UC systems such as pervasive computing devices usually maintain the notion of explicit and intentional interaction, which often blurs in typical AR work such as Ronald Azuma's. The theory of Humanistic Intelligence (HI), however, also challenges this distinction. [1] HI is intelligence that arises from the human being in the feedback loop of a computational process in which the human is inextricably intertwined, and it does not typically require conscious thought or effort. In this way HI, which arises from wearable computer-mediated reality, has much in common with AR.

Current and potential uses

Commonly cited examples of AR are the yellow first-down line seen in television broadcasts of American football games and the colored trail showing the location and direction of the puck in TV broadcasts of hockey games. The real-world elements are the football field and players; the virtual element is the yellow line, which is drawn over the image by computers in real time. Note that this is not an augmented reality application according to the definition above, as objects are not inserted into the real environment and there is usually no interaction with the virtual elements.
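Although the first-down line does not meet the interactive definition above, the way it is drawn illustrates a core AR problem: occlusion. The line is painted only over pixels classified as field colour, so players appear to stand in front of it. A toy sketch of the idea, where the colour test is a deliberately crude stand-in for the chroma-keying a real broadcast system performs (all colours and thresholds are illustrative):

```python
FIELD_GREEN = (30, 120, 40)   # illustrative reference field colour
YELLOW = (255, 255, 0)

def looks_like_field(pixel, tol=60):
    """Crude chroma key: is this pixel close to the field colour?"""
    return all(abs(c - r) <= tol for c, r in zip(pixel, FIELD_GREEN))

def draw_first_down_line(row, line_cols):
    """Paint yellow over the given columns of one scanline, but only
    where the underlying pixel looks like grass, so that players
    (non-green pixels) occlude the virtual line."""
    return [YELLOW if i in line_cols and looks_like_field(p) else p
            for i, p in enumerate(row)]

# One scanline: grass, a player's white jersey, grass.
row = [(35, 110, 45), (250, 250, 250), (25, 130, 35)]
print(draw_first_down_line(row, {0, 1, 2}))
# → [(255, 255, 0), (250, 250, 250), (255, 255, 0)]
```

Broadcast systems combine this keying with calibrated camera tracking so the line stays fixed to the field as the camera pans and zooms; the same occlusion problem, in harder form, confronts every see-through AR display.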

Another type of augmented reality application uses projectors and screens to insert objects into the real environment, for example to enhance museum exhibitions. The difference from a simple TV screen is that these objects are related to the environment of the screen or display, and they are often interactive as well.

Most potential applications of AR, however, will require personal display glasses.

In some current applications, such as cars and airplanes, the display is usually a head-up display integrated into the windshield.

Leaders in Augmented Reality

Dr. Steven Feiner is often described as the father of augmented reality and is the author of one of the first papers on the subject.

Dr. Bruce H. Thomas is the current Director of the Wearable Computer Laboratory at the University of South Australia. He is a NICTA fellow, CTO of A-Rage Pty Ltd, a member of the HxI team, and a visiting scholar with the Human Interface Technology Laboratory, University of Washington. He is the inventor of ARQuake, the first outdoor augmented reality game. His current research interests include wearable computers, user interfaces, augmented reality, virtual reality, CSCW, and tabletop display interfaces.

Dr. Wayne Piekarski is the inventor of the Tinmith system.

Examples for current applications:

  • Support with complex tasks, in assembly, maintenance, surgery etc.:
    • by inserting additional information into the field of view (for example, a mechanic sees labels displayed on parts of a system along with operating instructions)
    • by visualization of hidden objects (during medical diagnostics or surgery as a virtual X-ray view, based on prior tomography or on real time images from ultrasound or open NMR devices, e.g., a doctor could "see" the fetus inside the mother's womb). See also Mixed Reality
  • Navigation devices
    • in buildings, e.g. maintenance of industrial plants
    • outdoors, e.g. military operations or disaster management
    • in cars (head-up displays or personal display glasses showing navigation hints and traffic information)
    • in airplanes (head-up displays in fighter jets were among the first AR applications; they are now fully interactive, including eye pointing)
  • Military and emergency services (wearable systems, showing instructions, maps, enemy locations, fire cells etc.)
  • Prospecting in hydrology, ecology, geology (display and interactive analysis of terrain characteristics, interactive three-dimensional maps that could be collaboratively modified and analyzed)
  • Visualization of architecture (virtual resurrection of destroyed historic buildings as well as simulation of planned construction projects)
  • Enhanced sightseeing: labels or other text related to the objects and places seen, and reconstructions of ruins, buildings, or even landscapes as they appeared in the past. Combined with a wireless network, the amount of data displayed is limitless (encyclopedic articles, news, etc.).
  • Simulation, e.g. flight and driving simulators
  • Collaboration of distributed teams
    • conferences with real and virtual participants. See also Mixed Reality
    • joint work at simulated 3D models
  • Entertainment and education

Future applications:

  • Expanding a PC screen into the real environment: program windows and icons appear as virtual devices in real space and are operated by gaze or gesture. A single personal display (glasses) could concurrently simulate a hundred conventional PC screens or application windows all around a user.
  • Virtual devices of all kinds, e.g. replacement of traditional screens, control panels, and entirely new applications impossible in 'real' hardware, like 3D objects interactively changing their shape and appearance based on the current task or need.
  • Enhanced media applications, like pseudo holographic virtual screens, virtual surround cinema, virtual 'holodecks' (allowing computer-generated imagery to interact with live entertainers and audience)
  • Virtual conferences in 'holodeck' style
  • Replacement of cellphone and car navigator screens: eye-dialing, insertion of information directly into the environment, e.g. guiding lines directly on the road, as well as enhancements like 'X-ray'-views
  • Virtual plants, wallpapers, panoramic views, artwork, decorations, illumination etc., enhancing everyday life. For example, a virtual window could be displayed on a regular wall showing a live feed of a camera placed on the exterior of the building, thus allowing the user to effectually toggle a wall's transparency
  • With AR systems getting into mass market, we may see virtual window dressings, posters, traffic signs, Christmas decorations, advertisement towers and more. These may be fully interactive even at a distance, by eye pointing for example.
  • Virtual gadgetry becomes possible. Any physical device currently produced to assist in data-oriented tasks (such as the clock, radio, PC, arrival/departure board at an airport, stock ticker, PDA, PMP, informational posters/fliers/billboards, in-car navigation systems, etc.) could be replaced by virtual devices that cost nothing to produce aside from the cost of writing the software. Examples might be a virtual wall clock, or a to-do list for the day docked by your bed for you to look at first thing in the morning.
  • Subscribable group-specific AR feeds. For example, a manager on a construction site could create and dock instructions, including diagrams, in specific locations on the site. The workers could refer to this feed of AR items as they work. Another example could be patrons at a public event subscribing to a feed of direction- and information-oriented AR items.

Further Examples

Specific applications

  • LifeClipper, a wearable AR system
  • Characteroke, a portable AR display costume, whereby the head and neck are concealed behind an active flat panel display.
  • MARISIL, a media phone user interface based on AR
[Image: AR in a scene from the Firefly episode "Ariel".]

Pop group Duran Duran incorporated interactive AR projections into their stage show during their 2000 Pop Trash concert tour.[2]

Anime

The current television series Dennō Coil depicts a near future where children use AR goggles to enhance their environment with games and virtual pets. Ghost in the Shell 2: Innocence gives several examples of augmented reality in use. Gundam, Gunbuster, Neon Genesis Evangelion, Hoshi no koe, and Martian Successor Nadesico, among others, depict 360° augmented reality cockpits used to display information. In Serial Experiments Lain, The Wired is overlaid onto the real world via electromagnetic radiation relaying information directly to people's brains, causing people to experience both The Wired and the real world.

Science Fiction

In the Star Trek universe, the Jem'Hadar use a form of augmented display that integrates with their starship's main sensors, allowing them to view what is outside the ship.

The television series Firefly depicts numerous AR applications, including a real-time medical scanner which allows a doctor to use his hands to manipulate a detailed and labeled projection of a patient's brain.

Notes

  1. ^ Mann, Steve. "Intelligence: WearComp as a new framework for Intelligent Signal Processing", Proceedings of the IEEE, Vol. 86, No. 11, November, 1998.
  2. ^ Pair, J., Wilson, J., Chastine, J., Gandy, M. "The Duran Duran Project: The Augmented Reality Toolkit in Live Performance." The First IEEE International Augmented Reality Toolkit Workshop, 2002. (photos and video)

References

Conferences

  • 1st International Workshop on Augmented Reality (IWAR'98), San Francisco, Nov. 1998.
  • 2nd International Workshop on Augmented Reality (IWAR'99), San Francisco, Oct. 1999.
  • 1st International Symposium on Mixed Reality (ISMR'99), Yokohama, Japan, March 1999.
  • 1st International Symposium on Augmented Reality (ISAR 2000), Munich, Oct. 2000.
  • 2nd International Symposium on Mixed Reality (ISMR'01), Yokohama, Japan, March 2001.
  • 2nd International Symposium on Augmented Reality (ISAR 2001), New York, Oct. 2001.
  • 1st International Symposium on Mixed and Augmented Reality (ISMAR 2002), Darmstadt, Oct. 2002.
  • 2nd International Symposium on Mixed and Augmented Reality (ISMAR 2003), Tokyo, Oct. 2003.
  • 3rd International Symposium on Mixed and Augmented Reality (ISMAR 2004), Arlington, VA, Nov. 2004.
  • 4th International Symposium on Mixed and Augmented Reality (ISMAR 2005), Vienna, Oct. 2005.
  • 5th International Symposium on Mixed and Augmented Reality (ISMAR 2006), Santa Barbara, Oct. 2006.
  • 6th International Symposium on Mixed and Augmented Reality (ISMAR 2007), Nara, Japan, Nov. 2007.
