ERCIM News No. 55, October 2003
SPECIAL THEME: Machine Perception
 


Real-time Tracking of Moving Objects for Virtual Studios in TV and Cinema Production Sets

by Florica Mindru, Wolfgang Vonolfen and Ulrich Nütten


Virtual studio technology can be used in several types of application, such as TV production, cinema production and virtual advertisement. Its purpose is to create realistic-looking images by integrating real and computer-generated audio/video elements.

Continuous gains in computing power and in the price and quality of sensors and hardware have led to significant improvements in this technology in recent years. A virtual studio used as a production set can now deliver increasingly high quality, and it is becoming an affordable production solution. Current developments focus on the real-time recovery of the camera and scene models, which allows all elements to be integrated automatically and correctly with respect to those models, and provides better interaction between the real and the virtual objects.
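As an illustration of what a recovered camera model is used for, the following sketch projects the 3D anchor point of a virtual object into the camera image with a standard pinhole model. The intrinsic and pose values are purely illustrative assumptions, not parameters of any FhG-IMK system.

# Minimal sketch: placing a virtual object consistently in the real image
# once the camera model is known. All numbers below are assumed values.
import numpy as np

def project_point(K, R, t, X_world):
    """Project a 3D world point into the image with a pinhole camera model."""
    X_cam = R @ X_world + t            # world -> camera coordinates
    x = K @ X_cam                      # camera -> homogeneous image coordinates
    return x[:2] / x[2]                # perspective division -> pixel coordinates

# Illustrative intrinsics (focal length, principal point) and camera pose.
K = np.array([[1000.0, 0.0, 960.0],
              [0.0, 1000.0, 540.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)                          # orientation, as delivered by a tracker
t = np.array([0.0, 0.0, 3.0])          # position, as delivered by a tracker
virtual_anchor = np.array([0.5, 0.0, 0.0])  # where a virtual prop should stand

print(project_point(K, R, t, virtual_anchor))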

The virtual studio technology available at the Fraunhofer Institute for Media Communication (FhG-IMK) in Germany is a competitive production tool. This is due to its state-of-the-art camera tracking system, its high-quality real-time rendering of 3D virtual scenes, and the efficient in-house-developed software system (3DK) at the core of the virtual studio, all of which have already been used successfully in several productions.

One of IMK's objectives in improving virtual studio applications is the design of a real-time tracking system that can monitor moving, non-rigid objects within the observed scene and that meets the requirements imposed by two particular applications of virtual studio technology.

The blue box studio for TV productions at Fraunhofer IMK.

One of these applications is related to virtual studios for TV productions, which are a further extension of the traditional blue-box, or blue-screen, technology. The purpose is to create the impression that the moderator is moving and speaking in a virtual world. Virtual sets can be generated graphically with the help of 3D modelling tools. One current limitation of state-of-the-art virtual studio techniques concerns the seamless integration of the synthetic and real worlds: the production and integration processes should run in real time, and the resulting images should meet the high picture-quality requirements of professional broadcasting. The integration process is still mainly based on a simple mixing technique that ignores the spatial and physical relationships between the elements. Virtual studios offer a large number of new options. For example, animation can be integrated in real time to create more dynamic situations, and connections can be created with interactive interfaces that directly influence the set. To integrate such features, keeping track of the position of the moderator(s) in the (real) studio would allow a more realistic integration of the moderator into the generated 3D virtual scene. The goal here is to set up a PC-based, low-cost virtual studio system that broadens the field of application by adding real-time tracking of the moderator's trajectory in the 3D scene.
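As a rough illustration of the blue-box principle, the following Python/OpenCV sketch keys out a blue background and locates the moderator's silhouette in the image plane. The HSV thresholds and the morphological clean-up are assumptions chosen for illustration only; the 3DK production system uses its own dedicated chroma keying and tracking components.

# Simplified blue-box keying: segment the non-blue foreground and return the
# centroid of the largest silhouette (the moderator's image-plane position).
# Assumes OpenCV 4 and an illustrative HSV range for studio blue.
import cv2
import numpy as np

def key_and_locate(frame_bgr):
    """Return a foreground mask and the centroid of the largest foreground blob."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    blue = cv2.inRange(hsv, np.array([100, 80, 80]), np.array([130, 255, 255]))
    fg_mask = cv2.bitwise_not(blue)                            # moderator = not blue
    fg_mask = cv2.morphologyEx(fg_mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(fg_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return fg_mask, None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return fg_mask, None
    centroid = (m["m10"] / m["m00"], m["m01"] / m["m00"])      # image-plane position
    return fg_mask, centroid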

The second application is related to the sound system based on wave-field synthesis recently developed at the Fraunhofer Institute for Integrated Circuits IIS in Ilmenau, Germany. The wave-field synthesis theory was developed at the Delft University of Technology (Netherlands). The new process developed by Fraunhofer IIS researchers records not only the sound, but also the sonic characteristics of the surrounding space and information about the spatial arrangement of the acoustic sources. This makes it possible to reproduce a more natural spatial sound that covers a wide area of the theatre. The cinema "Lindenlichtspiele" in Ilmenau is already equipped with the new system.
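To give an idea of how tracked source positions can feed into wave-field synthesis rendering, the sketch below computes per-loudspeaker delays and gains for a virtual point source using a deliberately simplified point-source model (delay proportional to distance, 1/r attenuation). The array geometry and source position are illustrative assumptions, not the Fraunhofer IIS implementation.

# Simplified point-source model: each loudspeaker replays the source signal
# with a delay and attenuation derived from its distance to the source.
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s

def wfs_delays_and_gains(source_pos, speaker_positions):
    """Per-speaker delay (s) and 1/r amplitude factor for a virtual point source."""
    distances = np.linalg.norm(speaker_positions - source_pos, axis=1)
    delays = distances / SPEED_OF_SOUND
    gains = 1.0 / np.maximum(distances, 1e-3)   # guard against division by zero
    return delays, gains

# Illustrative linear array of 8 loudspeakers spaced 0.5 m apart.
speakers = np.stack([np.arange(8) * 0.5, np.zeros(8)], axis=1)
delays, gains = wfs_delays_and_gains(np.array([1.5, 2.0]), speakers)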

In order to produce sound that takes into account the spatial arrangement of the acoustic sources, these sources must be identified and their positions tracked during movie recording. For the time being, most of the methods used to position the sound sources rely heavily on manual work. Automating this process as far as possible would reduce the time and costs of the post-production phase. Again, the goal is a PC-based, low-cost, real-time tracking system.

The applications we are aiming at impose several constraints on the tracker configuration. In the case of television stations, an important issue is the real-time capability of the hardware and software. Synchronization between the different recording devices is also vital. Movie productions typically involve recording scenes with various types of background. In the case of virtual studios, most existing recording studios are based on the traditional blue-box technology, and the moderators are segmented out of the studio images with the help of chroma keyers. There is a need, however, to port this technology to more general backgrounds as well.

The tracking system should therefore be able to cope with various types of background, as well as with clutter and partial occlusions. It should also be possible to detect and track various kinds of objects. Within the virtual studio the moderator is of prime importance, but other kinds of objects can also be included in the scenarios, and when tracing acoustic sources in movie settings, not only people (the characters) but also other types of object must be taken into account. The tracker we are developing should therefore be able to follow general types of object. Particular design issues need to be addressed for each of the main steps of the tracking system: moving-object detection, tracking and 3D localization. The system is currently under development and testing.
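A minimal sketch of these three stages, assuming off-the-shelf OpenCV components: background subtraction for moving-object detection, greedy nearest-neighbour association for frame-to-frame tracking, and stereo triangulation from two synchronized, calibrated cameras for 3D localization. It only illustrates the pipeline structure; the methods actually chosen for the IMK system may differ.

# Pipeline sketch: detection -> tracking -> 3D localization (OpenCV 4 assumed).
import cv2
import numpy as np

bg_subtractor = cv2.createBackgroundSubtractorMOG2()  # copes with varied backgrounds

def detect_moving_objects(frame, min_area=500):
    """Return centroids of sufficiently large moving regions in one camera frame."""
    mask = bg_subtractor.apply(frame)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    centroids = []
    for c in contours:
        if cv2.contourArea(c) < min_area:
            continue
        m = cv2.moments(c)
        centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return centroids

def associate(prev_tracks, detections, max_dist=50.0):
    """Greedy nearest-neighbour association of detections to existing tracks."""
    updated = {}
    for track_id, prev_pos in prev_tracks.items():
        if not detections:
            break
        d = min(detections, key=lambda p: np.hypot(p[0] - prev_pos[0], p[1] - prev_pos[1]))
        if np.hypot(d[0] - prev_pos[0], d[1] - prev_pos[1]) <= max_dist:
            updated[track_id] = d
            detections.remove(d)
    return updated

def localize_3d(P1, P2, point_cam1, point_cam2):
    """Triangulate a 3D position from two synchronized, calibrated cameras."""
    pts = cv2.triangulatePoints(P1, P2,
                                np.array(point_cam1, dtype=float).reshape(2, 1),
                                np.array(point_cam2, dtype=float).reshape(2, 1))
    return (pts[:3] / pts[3]).ravel()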

Please contact:
Wolfgang Vonolfen, Fraunhofer IMK
E-mail: wolfgang.vonolfen@imk.fraunhofer.de