Prof. Nadia Magnenat-Thalmann, George Papagiannakis, Marlène Arévalo, …


LifePlus

Virtual Life in Pompeii


LifePlus_scene 1_Octavia [link 01]


Summary

Short Description

An innovative revival of life in ancient frescoes and the creation of immersive narrative worlds, including real scenes with animated fauna and flora. The application is presented through a virtual heritage case study: historical Pompeian frescoes are "brought to life" through 3D animation of their content, which is superimposed on their real surroundings. The complete experience is offered to visitors on site by means of an immersive, mobile, Augmented Reality based guide with wearable computing equipment and multimodal interaction. The LifePlus mobile system is required to operate in two main modes: the "sight-seeing" operational mode is designed to support visitors with location-based multimedia information, facilitating exploration of the area by providing both practical and historical information in the form of text, images and short films overlaid on the head-mounted display. In the "AR" simulation mode, the visitor is exposed to the VR simulation scenario blended into the real imagery of the site.

Artists / Authors

  • Prof. Nadia Magnenat-Thalmann, Founder & Director, MIRALab, Switzerland
  • George Papagiannakis, Research assistant & PhD candidate, MIRALab, Switzerland
  • Marlène Arévalo, Research assistant & PhD candidate, MIRALab, Switzerland
  • Prof. Panos Trahanias, Supervisor of the Computer Vision & Robotics Lab, ICS-FORTH, Greece
  • Prof. Stelios C. Orphanoudakis, Director of ICS-FORTH, Leader of the Computer Vision & Robotics Division, ICS-FORTH, Greece
  • Dr. Antonis Argyros, Researcher & Coordinator of the Computer Vision & Robotics, ICS-FORTH, Greece
  • Stephan Müller, Head of the department Visualisation and Virtual Reality, Fraunhofer Institute for Computer Graphics (IGD), Germany
  • Didier Stricker, Head of the Augmented Reality group of the department Visualisation and Virtual Reality, Fraunhofer Institute for Computer Graphics (IGD), Germany
  • Nikolaos Ioannidis, Associate Manager of the Development Programmes Department, INTRACOM S.A., Greece
  • Ioannis Koufakis, Development Programmes Department, INTRACOM S.A., Greece
  • Sofia Tsekeridou, Development Programmes Department, INTRACOM S.A., Greece
  • Dr. Andrew Stoddart, Chief Scientist, VMSL Ltd., U.K.
  • Hélène de Fontenilles, Software Development Manager, ORCHESTRA Software Supervisor, Bionatics S.A., France
  • Pierre Dinouard, Research and Development Manager, Genesis Software Supervisor, Bionatics S.A., France
  • Frédéric Banegas, R&D Computer Graphics engineer, Bionatics S.A., France
  • Prof. Daniel Thalmann, Director of the LIG-Computer Graphics Lab, EPFL, Switzerland
  • Dr. Marcelo Kallmann, until 2002 Senior Researcher, Computer Graphics Lab, EPFL, Switzerland
  • Michael Ponder, Research assistant, Computer Graphics Lab, EPFL, Switzerland
  • Olaf Schirm, Founder, noDNA AG, Germany
  • Renzo Carlucci, Technical Director, A&C2000, Italy
  • Prof. Pietro Giovanni Guzzo, Archaeological Superintendent, Pompeii Superintendence, Italy
  • Prof. Eva Cantarella, Professor of Roman Law, Representative of the Minister of Culture in the board of The Italian National Institute for Ancient Drama, University of Milan, Italy

Origin

Switzerland, 2002-2004

Partners / Sponsors

The consortium is managed financially and administratively by FORTH (Foundation for Research and Technology – Hellas) and technically by MIRALab (University of Geneva); both share responsibility for the management and administration of the overall project. LIFEPLUS comprises 11 leading industrial and research partners, including FORTH and MIRALab, who combine their complementary expertise to advance the state of the art in Augmented Reality technology and its novel applications.

RESEARCH partners: FORTH (image vision, camera tracking); UNIGE, MIRALab (real-time virtual human hair, real-time clothes and facial emotion expression, realistic skin rendering, multi-resolution meshes); EPFL - Swiss Federal Institute of Technology, VR-Lab (artificial-life methods for behavioural animation of groups, middleware integration); IGD Fraunhofer Institute for Computer Graphics (AR authoring tools, camera tracking).

TECHNOLOGY providers: INTRACOM (mobile on-site AR guide); VMSL Ltd. (real-time match moving); Bionatics S.A. (real-time virtual plant simulation); noDna A.G. (3D character modelling, virtual character-based installations); A&C 2000 S.R.L., AEC (Geographical Information Systems techniques).

USERS GROUP: University of Milan, UNIMI (content providers of Pompeian lifestyle, evaluation); Archaeological Superintendence of Pompeii, SOPRIN (ancient Pompeii demonstrators, evaluators).

LifePlus is an ongoing IST project funded by the EU (No. IST-2001-34545).
Total cost: 279,257 euros.

Commentary

Although initially aimed at cultural heritage centres, the paradigm is not restricted to such themes; it encompasses all types of future location-based entertainment and e-visitor attractions, as well as on-set visualisation for the TV/film industry.

MIRALab is involved in another European project that has just begun: Erato. This project broadens the notion of architectural heritage to encompass both acoustic and visual components. In addition, the real-time audiovisual 3D integration into virtual environments, with virtual, interactive humans, creates a specific and interesting product that could be used in scientific research as well as in the media. To achieve the project's goals, several architectural monuments in Jordan and Turkey will be used for identification and conservation purposes. Similar heritage monuments, following the expansion of the Roman Empire, can be found in most Mediterranean countries, so monuments in France and Italy may also be used in the project.

Entry Submitted

10.03.2004

Category

  • Research project

Keywords

  • Topics:
    • Cultural mediation
    • Wearable Computing
    • Animation
    • Tracking
    • Augmented Reality
    • Real-time rendering
  • Formats:
    • 3D
    • Computer animation
  • Technology:
    • Motion Tracking
    • GPS

Content

Content Description

“Within the EU, addressing the issues posed by the preservation, re-use and access to our intellectual capital will form the cornerstone of future economic growth and development. Access to these assets is essential for the education and improved quality of life of its citizens.” (1)

LIFEPLUS proposes an innovative 3D reconstruction of ancient frescos-paintings through the real-time revival of their fauna and flora, featuring groups of virtual animated characters with artificial life dramaturgical behaviours, in an immersive AR environment. By its very nature, LIFEPLUS is a highly interdisciplinary project involving computer vision, computer graphics, user interfaces, human factors, wearable computing, mobile computing, computer networks, distributed computing, information access and information visualization.


PROBLEMS TO BE SOLVED

Since antiquity, images have been used both as records of events and lifestyles and as decoration. The possibility of reviving them will add a new dimension to the understanding of our past. However, the recreation of historic environments for serious study, education and entertainment is not new (2), although the methods for achieving these objectives have evolved considerably over time. Before the days of widespread books and printing, storytellers would conjure up visions of events and places, providing their listeners with an impression of realities (often augmented realities) elsewhere in time and space. Theatre, fine art and cinema have added to the richness of the explicit visual experience available to the viewer. They have made interpretations of history more accessible to the general public, but at the same time they have narrowed the individual's scope for a personalised, interactive experience and visualisation of the description.

Therefore, for the application of technology to heritage to become a viable historical recreation tool, a combination of technological, economic and creative challenges must be overcome. Potentially, a Virtual Reality-based heritage experience gives visitors the opportunity to feel they are present at significant places and times in the past, and to use a variety of senses to experience what it would have felt like to be there. However, a review of the range of projects on the internet described as Virtual Heritage (3) shows numerous examples of virtual environments built as reconstructions of historic sites, yet sterile and devoid of population. The engaging characters that are needed in an interactive experience are only now slowly coming into focus with recent EU-funded IST projects (Charismatic) (4). The main reasons for their slow adoption are a) the inability of current VR rendering technology to produce realistic, entertaining, interactive and engaging synthetic characters and b) the lack of interesting interaction paradigms for character-based installations. Historical frescos are a unique arrangement of “mise-en-scène” elements that enhance the user experience by creating a set of compelling narrative patterns, albeit in a static, two-dimensional way. The word "narrative" refers to a set of events happening during a certain period of time and providing aesthetic, dramaturgical and emotional elements, objects and attitudes (5). Mixing such aesthetic ambiences with virtual augmentations and adding dramatic tension can develop these narrative patterns into an exciting new edutainment medium.

Therefore, LIFEPLUS proposes new developments for the innovative revival of life in ancient frescos-paintings and the creation of narrative spaces. The revival is based on real scenes captured in live video sequences, augmented with real-time autonomous groups of 3D virtual fauna and flora. The metaphor inspiring the project approach is to make the "transportation into fictional and historical spaces", as depicted by frescos-paintings, as realistic, immersive and interactive as possible. For that purpose, LIFEPLUS aims to position itself between the extremes of real life and Virtual Reality, in the spectrum of "Mixed Reality" (6) and especially Augmented Reality (AR), in which views of the real world are combined in some proportion with specific graphic enhancements or augmentations. Apart from Virtual Heritage, LIFEPLUS aims to address the following emerging market needs:

- Tourism and Education-Entertainment (Edutainment).
Novel operational paradigms (immersive AR virtual life) for edutainment experiences are preconditions for the economic viability of all types of future cultural and memory institutions, location-based entertainment and e-visitor attractions.

- On set visualization & Virtual Studio:
Film studios currently shoot films expecting to add in computer generated (CG) effects such as backgrounds, dinosaurs or CG characters later. Directors would benefit from the ability to see in real time or very soon afterwards an overlay of real and planned CG elements to decide whether the composition is acceptable. Broadcasters are also currently seeking to expand the use of virtual life in live broadcasts (e.g. Ananova) (7).

The project is defining new models, tools and affordable solutions for 3D virtual life simulations in AR environments, with emphasis on two real-time commercialised end-products:
a) a mobile AR on-site guide based on immersive wearable computing
b) a middleware architecture of self-contained SDKs and APIs.

Innovative research will extend the state of the art of technologies developed in IST and other research projects for:
a) real-time camera tracking in unknown environments (ENREVI project)
b) immersive on-site guides based on mobile AR units (ARCHEOGUIDE, ARVIKA)
c) the introduction of virtual humans in mixed reality environments (STAR)
and will synthesise a new real-time framework that allows virtual fauna and flora to enhance real environments in AR, enabling a new breed of innovative edutainment experiences.

References:

1 Ross, S. 1997. ‘Consensus, communication, and collaboration: fostering multidisciplinary cooperation in electronic records’, in INSAR (Supplement II), Proceedings of the DLM-Forum on electronic records, p. 336

2 Computer Graphics and Archaeology: Realism and Symbiosis, David Arnold, Charismatic project,
http://www.charismatic-project.com/

3 Virtual Heritage Network, http://www.virtualheritage.net

4 CHARISMATIC IST project, http://www.charismatic-project.com/

5 Nandi A., Marichal X., "Transfiction", proceedings of Virtual Reality International Conference, Laval May 2000.

6 Milgram P., et al. "Augmented Reality: a class of displays on the reality-virtuality continuum", SPIE Volume 2351: Telemanipulator and Telepresence Technologies, 1994.

7 ANANOVA, http://www.ananova.com


netzspannung.org was provided with the information above by MIRALab. However, large parts have been published before:

George Papagiannakis, Michael Ponder, Tom Molet, Sumedha Kshirsagar, Frederic Cordier, Nadia Magnenat-Thalmann, Daniel Thalmann: LIFEPLUS: Revival of life in ancient Pompeii. Virtual Systems and Multimedia, VSMM2002 invited paper, October 2002, http://www.miralab.unige.ch/papers/128.pdf

Technology

  • › Fig._Hardware Overview [JPEG | 64 KB ] [link 02]
  • › Fig._AR Life_Distributed Architecture [JPEG | 73 KB ] [link 03]
  • › Fig._Transition from AR Guide to AR Life [JPEG | 43 KB ] [link 04]
  • › Questionnaire_LifePlus_Technical Details & Innovative Aspects [PDF | 137 KB ] [link 05]
  • › Assistance and Experience [JPEG | 39 KB ] [link 06]
  • › Assistance_AR Guide [JPEG | 41 KB ] [link 07]
  • › Fig._AR Life_Functional Elements [JPEG | 28 KB ] [link 08]

Technical Description

The goal of LIFEPLUS is to push the limits of current Augmented Reality (AR) technologies, exploring the processes of narrative design of fictional spaces (e.g. frescos-paintings) where users can experience a high degree of realistic interactive immersion. Based on captured/real-time video of a real scene, the project is oriented towards enhancing these scenes with realistic 3D simulations of virtual flora and fauna (humans, animals and plants) rendered in real time. Through its key mobile AR technology, visitors are provided with a see-through Head-Mounted Display (HMD), earphones and mobile computing equipment. A tracking system determines their location within the site, and audio-visual information is presented to them in the context of their exploration, superimposed on their current view of the site. LIFEPLUS will extend that system and provide key new technologies to render lively, real-time animations and simulations of ancient virtual life (3D human groups, animals and plants).


LIFEPLUS Real-Time Systems Architecture

HARDWARE REQUIREMENTS OVERVIEW
In this section we present an overview of the main functional requirements of the LIFEPLUS system proposal, followed by a short overview of the hardware components necessary to meet them.

Mobility:
- The overall hardware unit must be highly mobile
- Its weight should be limited
- It must be easy to use
- It must have low power consumption and run on batteries

Responsiveness:
- The overall tracking, rendering and AR image composition times must be kept under a certain limit (~200 ms) in order to assure the quality of the immersive experience

Performance:
- The system must be able to deliver a certain number of frames per second (not less than 10 fps)

Robustness:
- The AR-mode boot and run phases of the system should be easy to initialise and robust in operation, in order to allow largely unconstrained interaction of the visitor with the site

Camera:
- Head mounted
- Reasonable resolution (~800x600)
- FireWire for the best digital quality and high transmission bandwidth (limiting lag)
- Monoscopic video see-through AR, avoiding critical problems in composing synthetic and real images

DGPS and Digital Compass:
- On-site localization of the visitor
- Optional support for vision-based real camera tracking

HMD display:
- Light, compact and low cost

Two mobile workstations:
- Separation and parallelization of the two main heavyweight system tasks: real-time camera tracking and real-time 3D rendering (the ideal solution would be a dual-processor mobile workstation)
- The 3D image rendering workstation must feature a state-of-the-art Graphics Processing Unit (GPU), allowing for real-time generation of high-quality VR images

(see Fig. Hardware Overview)


SOFTWARE ARCHITECTURE AND CONTENT COMPONENTS OVERVIEW
The overall LIFEPLUS system architecture is designed on the basis of the VHD++ real-time development framework, a proprietary middleware solution of the MIRALab and VRlab laboratories. VHD++ is a highly flexible and extendible real-time framework supporting component-based development of interactive audio-visual simulation applications in the domain of AR and VR, with a particular focus on virtual character simulation technologies. C++ has been chosen as the main implementation language. The most important features and functionalities of the VHD++ framework are:
1) support for real-time audio-visual applications,
2) an extendible spectrum of technologies,
3) middleware portability, extendibility and scalability,
4) runtime flexibility: XML-based system and content configuration,
5) complexity curbing: multiple design patterns improve clarity, and abstraction levels simplify the description of problems,
6) fundamental and pluggable components that hide implementation-level details, allowing developers to work at the required abstraction level with simplified implementation constructs,
7) large-scale code reuse: fundamental components and ready-made components encapsulating heterogeneous technologies.

The LIFEPLUS system architecture is designed around two VHD++ runtime engines, active software elements running on two separate portable computers. Each runtime engine powers a set of pluggable VHD++ services that encapsulate the required application-level technologies. This kind of architecture allows the separation of the computationally heavy real-time tracking and synthetic image rendering tasks.
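The engine/service split described above can be sketched as follows. This is an illustrative C++ outline under assumed names (Service, RuntimeEngine, TrackingService, RenderingService); it is not the actual VHD++ API:

```cpp
#include <cassert>
#include <memory>
#include <string>
#include <vector>

// Hypothetical pluggable-service interface: each service encapsulates one
// application-level technology and is driven once per simulation frame.
class Service {
public:
    virtual ~Service() = default;
    virtual std::string name() const = 0;
    virtual void update(double dt) = 0;  // dt: elapsed time per frame, seconds
};

class TrackingService : public Service {
public:
    std::string name() const override { return "tracking"; }
    void update(double) override { /* compute camera matrix for this frame */ }
};

class RenderingService : public Service {
public:
    std::string name() const override { return "rendering"; }
    void update(double) override { /* render the synthetic 3D image */ }
};

// A runtime engine owns a set of pluggable services and ticks them each frame.
// LIFEPLUS runs two such engines (TRACK and VR/AR) on separate machines.
class RuntimeEngine {
    std::vector<std::unique_ptr<Service>> services_;
public:
    void plug(std::unique_ptr<Service> s) { services_.push_back(std::move(s)); }
    std::size_t serviceCount() const { return services_.size(); }
    void tick(double dt) {
        for (auto& s : services_) s->update(dt);
    }
};
```

In this pattern, separating the heavyweight tracking and rendering tasks amounts to plugging different service sets into each machine's engine.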


As shown in Fig. AR Life_Distributed Architecture, the TRACK runtime engine features relatively lightweight services that take care of DGPS-based on-site positioning of the visitor and provide her/him with additional multimedia information related to the current position at the site being visited. These services play the major role during the sight-seeing phase of the visit. Once the visitor reaches a point of interest where the AR simulation is possible, the DGPS services become secondary and play a supportive role for the computationally heavyweight, real-time, vision-based camera tracking service that calculates and sends subsequent camera matrices to the VR/AR runtime engine.
The VR/AR runtime engine features multiple services encapsulating the required VR simulation technologies, supporting real-time generation of realistic synthetic 3D images and sound effects according to the camera matrices obtained from the tracking side. It also hosts the services responsible for real-image buffering and composition with the synthetic image.

SYSTEM IN OPERATION
The LIFEPLUS mobile system is required to operate in two main modes. The first is designed to support the visitor of the site with location-based multimedia information, facilitating sight-seeing of the area by providing both practical and historical information in the form of text, images and short movies overlaid on the head-mounted display. In this "sight-seeing" operational mode, mainly DGPS technology is used to coarsely track the visitor's current position in the area. Once the visitor reaches a spot where the AR simulation is possible, (s)he is informed about it and allowed to enter AR simulation mode.
In AR simulation mode, the visitor is exposed to the VR simulation scenario blended into the real imagery of the site. The visitor is able to walk and look around a spatially limited space, usually naturally constrained by the walls of a particular site. In this mode DGPS technology plays a secondary role, supporting the real-time, vision-based, computationally heavyweight camera tracking module that is designed to deliver precise camera matrix values for each simulation frame, allowing the generation of corresponding 3D synthetic images and their blending with the real camera images.
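The two-mode operation described above can be pictured as a small state machine. The following C++ sketch is illustrative only: the names (Mode, GuideState) and the distance-threshold trigger are assumptions, not the actual LIFEPLUS implementation.

```cpp
#include <cassert>
#include <cmath>

// The two operational modes of the mobile guide.
enum class Mode { SightSeeing, ARSimulation };

struct GuideState {
    Mode mode = Mode::SightSeeing;

    // DGPS gives a coarse position. When the visitor is within `radius`
    // metres of a point of interest where AR simulation is possible (and
    // accepts the offer), the system switches to AR simulation mode; walking
    // away from the spot returns it to sight-seeing mode.
    void updatePosition(double x, double y, double poiX, double poiY,
                        double radius, bool visitorAccepts) {
        const double d = std::hypot(x - poiX, y - poiY);
        if (mode == Mode::SightSeeing && d <= radius && visitorAccepts)
            mode = Mode::ARSimulation;
        else if (mode == Mode::ARSimulation && d > radius)
            mode = Mode::SightSeeing;
    }
};
```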
The high-quality FireWire digital video signal, carrying both image and frame ID information, is split to the two portable computers running the TRACK and VR/AR runtime engines respectively. For each frame, the real-time, vision-based tracking module, optionally supported by directional data from the DGPS, calculates the real camera matrix and sends it, together with the received FireWire frame's ID, to the VR/AR side. As real camera tracking takes a certain amount of time, the VR/AR side needs to buffer the video images obtained from the FireWire together with their respective IDs. Once the real camera matrix is ready for a real image stored in the cyclic buffer, the VR simulation module generates the corresponding 3D synthetic image, which is then blended with the real image and sent to the HMD by the AR image blending module. It is important to note that the VR simulation module is responsible not only for 3D image generation but also for the generation of proper 3D sound effects accompanying the simulated scenario.
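The frame-ID matching between the two engines amounts to a cyclic buffer keyed by frame ID: frames wait in the buffer until the tracker delivers the camera matrix for their ID. The following C++ sketch is illustrative only; Frame, FrameBuffer and their members are assumptions, not LIFEPLUS code.

```cpp
#include <array>
#include <cassert>
#include <cstdint>
#include <optional>

// A buffered video frame, tagged with the ID carried in the FireWire signal.
struct Frame {
    std::uint64_t id = 0;
    bool valid = false;
    // ... pixel data would live here ...
};

// Cyclic buffer of N slots on the VR/AR side. Frames are stored as they
// arrive; when the tracker later delivers a camera matrix for frame `id`,
// the matching frame is fetched for blending with the synthetic image.
template <std::size_t N>
class FrameBuffer {
    std::array<Frame, N> slots_{};
public:
    // Store an incoming frame under its ID (older frames are overwritten).
    void push(const Frame& f) {
        slots_[f.id % N] = f;
        slots_[f.id % N].valid = true;
    }

    // Fetch the frame for which a camera matrix just arrived.
    std::optional<Frame> match(std::uint64_t id) const {
        const Frame& slot = slots_[id % N];
        if (slot.valid && slot.id == id) return slot;
        return std::nullopt;  // already overwritten: tracking fell too far behind
    }
};
```

The buffer size N bounds how far tracking may lag behind the video stream: if the camera matrix for a frame arrives more than N frames late, that frame has been overwritten and the match fails.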

(see Fig. Transition from AR Guide to AR Life)



netzspannung.org was provided with the information above by MIRALab. However, large parts have been published before:

George Papagiannakis, Michael Ponder, Tom Molet, Sumedha Kshirsagar, Frederic Cordier, Nadia Magnenat-Thalmann, Daniel Thalmann: LIFEPLUS: Revival of life in ancient Pompeii. Virtual Systems and Multimedia, VSMM2002 invited paper, October 2002, http://www.miralab.unige.ch/papers/128.pdf

Hardware / Software

LIFEPLUS software components

Automatic Real-time Camera Tracking S/W
Need addressed: Urgent need from the movie/TV industry for pre-visualisation of special effects, need for real-time AR environments with Virtual flora and fauna for the cultural industries, tourism and edutainment
Prime developer: VMSL
Partners that expressed commercialisation intentions: VMSL, FORTH

Alternative Real-time camera tracking Algorithms
Need addressed: Need from the academic community of alternative specialised application-based methodologies (robotics etc.)
Prime developer: IGD, FORTH
Partners that expressed commercialisation intentions: VMSL, IGD, FORTH

Real-time virtual fauna simulation SDK S/W
Need addressed: Need for an application independent realistic integrated solution for virtual human and animal simulation
Prime developer: UNIGE, EPFL
Partners that expressed commercialisation intentions: UNIGE, noDna

Real-time virtual flora simulation SDK S/W
Need addressed: Need for an integrated solution for virtual plant generation, representation and simulation
Prime developer: Bionatics
Partners that expressed commercialisation intentions: Bionatics

Behavioural animation of virtual characters Algorithms
Need addressed: Important component of the virtual fauna simulation SDK for autonomous virtual agents
Prime developer: EPFL
Partners that expressed commercialisation intentions: EPFL, Bionatics

Real-time Hair, Cloth, Facial simulation Algorithms
Need addressed: Optimised solution for pre-visualisation and behaviour simulation (cosmetic, apparel industry) as well as for heightened edutainment experiences
Prime developer: UNIGE
Partners that expressed commercialisation intentions: UNIGE, noDna

AR Authoring Tools Suite S/W
Need addressed: An integrated, extensible authoring solution with SDKs for Virtual life in AR situations
Prime developer: IGD
Partners that expressed commercialisation intentions: INTRACOM

Mobile On-Site AR Guide S/W
Need addressed: To enhance visitor’s site experience with AR audiovisual information for new edutainment interactions.
Prime developer: INTRACOM, IGD
Partners that expressed commercialisation intentions: INTRACOM

Context

Theory / Research

The partners of the LIFEPLUS project consortium contribute to the exploitation of the project results, according to their own status, aims and objectives thus receiving different benefits:

- The RESEARCH partners (FORTH, UNIGE, EPFL, IGD) undertake the responsibility of dissemination to the scientific community. The results of this work have been submitted for publication in journals and presentation at scientific meetings, such as the major computer graphics, virtual environment and Augmented Reality conferences (SIGGRAPH, IEEE Virtual Environments, Eurographics and the Eurographics Workshops on Virtual Environments and Rendering, ISAR, IWAR etc.), as well as smaller, focussed workshops that combine technology and the cultural field, such as the International Cultural Heritage Informatics Meeting (ICHIM), the series of Virtual Archaeology conferences, etc. This will be done throughout the development process to ensure peer review of the work as well as valuable feedback on the techniques being developed.
- The TECHNOLOGY providers (INTRACOM, VMSL, Bionatics, noDna and A&C 2000) use their contacts to decision-makers in the multimedia and telecommunication industries to promote and market the LIFEPLUS products. They will generate additional revenue by selling the software and hardware components, the technical services, and by administrating the LIFEPLUS services.
- The USERS GROUP of the consortium (University of Milan, Pompeii Superintendence) will support marketing efforts by providing access to reference installations and by disseminating information about the project’s results among potential users. They will benefit from using the LIFEPLUS system in-house, and they may generate additional revenues by increasing the number of visitors to the trial sites.

Exhibitions / Presentations

  • G. Papagiannakis, M. Arevalo-Poizat, N. Magnenat-Thalmann, Recreating daily life in Pompeii with VR and AR, AVIR 2003 Research Workshop on Augmented Virtual Reality, September 2003
  • M. Ponder, G. Papagiannakis, T. Molet, N. Magnenat-Thalmann, D. Thalmann, VHD++ Development Framework: Towards Extendible, Component Based VR/AR Simulation Engine Featuring Advanced Virtual Character Technologies, IEEE Computer Society Press, Proceedings of Computer Graphics International (CGI), 2003
  • N. Magnenat-Thalmann, Photorealistic Hair Modeling, Animation, and Rendering, Siggraph2003 – Courses
  • G. Papagiannakis, M. Ponder, T. Molet, S. Kshirsagar, F. Cordier, N. Magnenat-Thalmann, D. Thalmann, LIFEPLUS: Revival of life in ancient Pompeii, Virtual Systems and Multimedia, VSMM2002 invited paper, October 2002
  • MIRALab - CUI: Atelier 3 - Visitez la reconstitution virtuelle de Pompéï et ses habitants, Open Doors of CUI (Centre Universitaire d'informatique) - 2003

References

  • ENREVI (Enhanced Reality for the Video); The Enrevi project proposes new developments for the enhancement of a video sequence captured of a real scene with real-time rendered 3D objects. Emphasis is put on: a) fundamental research to provide tools for tracking in unknown environments in real time, b) a software real-time engine that can render 3D synthetic images on a hardware platform, c) analysis of MPEG, DVB and video standards to specify formats to generate, store and retrieve in accordance with existing and future standards, d) a new generation of chroma-key products and e) integration of all the different modules in a real-life environment.
    » http://a7www.igd.fhg…g.de/projects/enrevi/ [link 09]
  • STAR (Services and Training through Augmented Reality); STAR focuses on developing Mixed Reality techniques with a view to creating commercial products for training, documentation and planning purposes. To this end, the project emphasises the following issues: a) automated reconstruction of industrial installations by 3D reconstruction or mosaicing, b) interaction between human operators and mixed reality environments and c) the introduction of virtual humans into mixed reality environments. These tools will be combined into an integrated AR platform, which will be evaluated in planning and training situations.
    » http://cwisdb.cc.kul…1/project3E010567.htm [link 10]
  • ARVIKA; The research in this project is aimed at using AR technologies to create user-oriented, system-driven support for operational procedures. It focuses on the development, production and service of complex technical products and systems. The project ideas are realised in various application areas such as automobile manufacture and aircraft construction, mechanical engineering and system development. The project intends to support both high-end/power applications in the development process and the low-end activity of the skilled worker using belt-worn equipment in the real production and service environment. This is realised by an open platform that allows for different performance grades and especially for true wearability.
    » http://www.arvika.de/ [link 11]
  • TOURBOT (Interactive Museum Telepresence Through Robotic Avatars); This project aims at the development of an interactive tour-guide robot able to provide individual access to museum exhibits and cultural heritage over the Internet. TOURBOT operates as the user's avatar in the museum, accepting commands over the web that direct it to move in its workspace and visit specific exhibits. More specifically, the project objectives are to: a) develop a robotic avatar with advanced navigation capabilities, b) develop appropriate web interfaces that realise the distant user's telepresence, c) facilitate personalised and realistic observation of the museum exhibits, and d) enable on-site, interactive museum tour-guiding.
    » http://www.ics.forth.gr/tourbot/ [link 12]
  • CHARISMATIC (Cultural Heritage Attractions Featuring Real-time Interactive Scenes And Multi-functional Avatars As Theatrical Intelligent Agents); CHARISMATIC introduces drama, story-telling and live interactive dialogue capability into virtual environments: the acquisition of performing arts and the transition of the digitised results into smart, multi-functional, high-fidelity avatars. The approach is focused on VR-based simulations that acquire many of the current entertainment capabilities of theatre, film and television, but with the added values of interactivity and immersive experience. CHARISMATIC aims to develop essential technologies enabling theatrical (audience-group) enjoyment of high-fidelity virtual environments populated by virtual humans. Charismatic is a European Commission IST-funded project.
    » http://www.charismatic-project.com/ [link 13]
  • ARCHEOGUIDE (Augmented Reality-based Cultural Heritage On-site Guide); The focus of ARCHEOGUIDE is to provide users with the experience of a tour of a cultural site, with the ability to view the natural environment, to visualise 3D reconstructions of monuments and to be assisted during the visit by a multimedia guidance system. Cultural site visitors are provided with a see-through Head-Mounted Display (HMD), earphones and mobile computing equipment. A tracking system determines the visitor's location within the site. Based on the visitor's profile and position, audio and visual information is presented to guide him/her and to allow more insight into relevant aspects of the site.
    » http://Archeoguide.i…tranet.gr/project.htm [link 14]
  • › Medienkunst und Forschung [link 15]

» http://www.miralab.u…ifeplus/HTML/home.htm [link 16]

  • › Creation of virtual characters based on fresco paintings [JPEG | 526 KB ] [link 17]
  • › AR Life simulation_Bar [JPEG | 59 KB ] [link 18]
  • › Video_LifePlus_Szene1_2Min [Windows Media] [link 19]
  • › Video_LifePlus_Szene1_2Min [RealMedia] [link 20]
  • › Video_LifePlus_Szene2_2Min [Windows Media] [link 21]
  • › Video_LifePlus_Szene2_2Min [RealMedia] [link 22]
  • › Video_LifePlus_Szene3_2Min [Windows Media] [link 23]
  • › Video_LifePlus_Szene3_2Min [RealMedia] [link 24]
  • › Video_LifePlus_Credits [Windows Media] [link 25]
  • › Video_LifePlus_Credits [RealMedia] [link 26]
  • › Fig._Hardware Overview [JPEG | 64 KB ] [link 27]
  • › Fig._AR Life_Distributed Architecture [JPEG | 73 KB ] [link 28]
  • › Fig._Transition from AR Guide to AR Life [JPEG | 43 KB ] [link 29]
  • › Questionnaire_LifePlus_Technical Details & Innovative Aspects [PDF | 137 KB ] [link 30]
  • › Assistance and Experience [JPEG | 39 KB ] [link 31]
  • › Assistance_AR Guide [JPEG | 41 KB ] [link 32]
  • › Fig._AR Life_Functional Elements [JPEG | 28 KB ] [link 33]