Research on Networked Multimedia

Networked Multimedia

Networked multimedia applications such as multimedia conferencing, distance learning, collaborative work in networked virtual environments, and networked games attract a great deal of attention.
In such applications, haptic (touch), gustatory (taste), and olfactory (smell) media are handled together with computer data, video (visual media), and voice (audio media).
Ishibashi lab. focuses on audio, visual, haptic, and olfactory media communications.

When we transfer these types of media over QoS (Quality of Service) non-guaranteed networks such as the Internet, the QoS of the media may be seriously degraded owing to the network delay, delay jitter, packet loss, and so on. To achieve high quality in networked multimedia applications by avoiding this deterioration, we need to carry out QoS control.
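As a concrete illustration of one common form of QoS control, the sketch below shows fixed playout buffering (a basic media-synchronization technique): each media unit is output at its generation time plus a constant offset, which absorbs delay jitter up to that offset. The function names and the 100 ms offset are illustrative assumptions, not values from our systems.

```python
# Illustrative sketch of fixed playout buffering (an assumption for
# illustration, not the lab's actual implementation).

def playout_time(generation_time_ms: float, buffering_delay_ms: float = 100.0) -> float:
    """Scheduled output time of a media unit: generation time plus a
    constant buffering delay that absorbs network-delay jitter."""
    return generation_time_ms + buffering_delay_ms

def is_late(arrival_time_ms: float, generation_time_ms: float,
            buffering_delay_ms: float = 100.0) -> bool:
    """A media unit that arrives after its playout time cannot be
    output on schedule and is typically discarded or delays playout."""
    return arrival_time_ms > playout_time(generation_time_ms, buffering_delay_ms)
```

A larger buffering delay tolerates more jitter but increases the end-to-end delay, which is why interactive applications must keep it small.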

In Ishibashi lab., we deal with a networked haptic museum, collaborative work with haptic media, remote control systems with haptic media, a haptic media and video transfer system, an olfactory and haptic media transfer system, networked real-time games, and a remote robot system with force feedback as typical examples of networked multimedia applications. In what follows, we explain these applications.

Collaborative Work with Haptic Media

We do collaborative work such as remote design and remote surgery simulation by touching objects in a 3-D virtual space with a haptic interface device (SensAble Technologies, Inc.). We can expect that using haptic media as well as voice and video largely improves the efficiency of collaborative work.

In the following figure, two users move a rigid cube as a CG object by putting the cube between the two cursors of the haptic interface devices in a networked 3-D virtual space. The cursor of each haptic interface device moves in the space when the user manipulates the stylus of the haptic interface device with his/her hand. The two users raise and move the cube together so that the cube contains a target object (a sphere in the figure), which revolves along a circular orbit at a constant velocity.

Collaborative work using haptic interface device (demo video)

There are a number of studies which deal with collaborative work with haptic media. However, most of them handle only one object in a 3-D virtual space. This study handles play with building blocks (i.e., multiple objects) in which two users lift and move the blocks collaboratively to build a dollhouse in a 3-D virtual space (see the following figure). The dollhouse consists of 26 blocks. We deal with three cases of collaborative play. In the first case, the two users carry each block by holding it together. In the second case, the users carry the blocks alternately, holding them separately. In the third case, one user carries each block partway and hands it to the other user, who receives it and then carries it to its position. By subjective and objective assessment, we investigate the influences of the network latency and packet loss on the collaborative play. In the following figure, the two users are piling up building blocks collaboratively to build the dollhouse by manipulating haptic interface devices.

Collaborative haptic play with building blocks (demo video)

In the near future, we plan to handle a variety of applications.

Remote Control Systems with Haptic Media

We control a haptic interface device with another remote haptic interface device. As such an application, we deal with remote haptic instruction systems (for example, a remote calligraphy system and a remote drawing instruction system) and remote haptic control systems.

In the remote calligraphy system, as shown in the following figure, a teacher instructs a student at a remote location how to use a brush for calligraphy while conveying the sense of force through a network.

Remote calligraphy system (demo video)

The remote drawing instruction system guides brush strokes while the teacher and the student interactively feel the sense of force.

Remote drawing instruction system (demo video)

The following figure shows a remote haptic instruction system. A whiteboard marker is attached to the stylus of the haptic interface device at each of the teacher and student terminals. We can use the system to write characters and to draw figures while watching video.
We also deal with the case in which the video camera and the whiteboard marker are attached to only the student terminal.
We examine the influences of the network delay and the haptic transmission direction (one-way or two-way) on the ease of writing characters and drawing figures.

Remote haptic instruction system (demo video)

The following figure shows a remote haptic control system, in which a user controls a haptic interface device at the slave terminal with another haptic interface device at the master terminal while watching video.
We propose switching control, which dynamically switches the haptic transmission direction according to the network latency.
Under this control, the time taken to switch the haptic transmission direction is selected automatically according to the contents of the work. This is because there is an optimal value of the switching time, and the optimal value depends on the contents of the work.

Remote control system (demo video)
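The switching control above can be sketched as follows. The latency threshold and the switching (dwell) time are hypothetical values chosen for illustration; in the actual control, the switching time is selected according to the contents of the work.

```python
# Hypothetical sketch of switching control: the haptic transmission
# direction follows the measured network latency, but a new switch is
# allowed only after `switch_time_s` has elapsed, since switching too
# frequently degrades the quality of work.

class SwitchingControl:
    def __init__(self, threshold_ms: float = 50.0, switch_time_s: float = 2.0):
        self.threshold_ms = threshold_ms    # assumed latency threshold
        self.switch_time_s = switch_time_s  # work-dependent switching time
        self.direction = "two-way"
        self._last_switch = float("-inf")

    def update(self, latency_ms: float, now: float) -> str:
        """Return the transmission direction for the current latency sample."""
        desired = "two-way" if latency_ms < self.threshold_ms else "one-way"
        if desired != self.direction and now - self._last_switch >= self.switch_time_s:
            self.direction = desired
            self._last_switch = now
        return self.direction
```

The dwell time acts as hysteresis: without it, latency fluctuating around the threshold would make the direction oscillate.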

Haptic Media and Video Transfer System

The haptic media and video transfer system conveys the haptic sensation experienced by a user to a remote user. In the system, the user controls a haptic interface device with another remote haptic interface device while watching video (see the following figure). In the figure, the user is touching the Rubik's cube.

Haptic media and video transfer system (demo video)

Haptic media and video of a real object which a user is touching are transmitted to another user. We handle the case in which a user at the master terminal touches the object located at the master terminal by using a haptic interface device while watching video, and the case in which a user at the slave terminal touches the object located at the master terminal while watching video. In the former, the haptic media are transferred from the master terminal to the slave terminal; in the latter, the haptic media are transmitted in both directions between the master and slave terminals. Therefore, interactivity is more important in the latter than in the former.

By using the system, we examine the influence of inter-stream synchronization error between haptic media and video on the haptic media and video transmission. We clarify the imperceptible range and allowable range of synchronization error.
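A minimal sketch of how such an inter-stream synchronization error can be computed and classified. The imperceptible and allowable thresholds below are placeholders for illustration, not the measured ranges.

```python
# Sketch: the inter-stream synchronization error between haptic media and
# video is the difference in output time between media units generated at
# the same instant. Threshold values are hypothetical.

def sync_error_ms(haptic_output_ms: float, video_output_ms: float) -> float:
    """Positive when the haptic media unit is output later than the
    paired video media unit."""
    return haptic_output_ms - video_output_ms

def classify(error_ms: float, imperceptible_ms: float = 30.0,
             allowable_ms: float = 80.0) -> str:
    """Classify the error magnitude against (assumed) perceptual limits."""
    e = abs(error_ms)
    if e <= imperceptible_ms:
        return "imperceptible"
    if e <= allowable_ms:
        return "allowable"
    return "not allowable"
```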

By using a pair of the haptic media and video transfer systems, in each of which the haptic media are transmitted in both directions between the master and slave terminals, users at the slave terminals can do collaborative work (see the following figure). In the figure, the users manipulate their own haptic interface devices, and they lift and move the cube by holding it between the styluses of the two remotely controlled haptic interface devices at the master terminals. There are two kinds of usage of the system. In one usage, a single user manipulates the two slave terminals with both hands. In the other usage, two users separately manipulate the slave terminals one-handed. In such environments, we could perform a remote surgical operation by replacing the styluses of the remote haptic interface devices with surgical knives.

Collaborative work using haptic media and video transfer systems

In addition, by adding sound to the system, we construct a haptic media, sound, and video transfer system (see the following figure), and we investigate the influence of the network latency on the media output quality. In the figure, a user at the slave terminal is beating a tambourine.

Haptic media, sound and video transfer system

In the next step of our research, we plan to study higher-quality transmission methods for haptic sensation.

Olfactory and Haptic Media Transfer System

The olfactory and haptic media transfer system enables multiple users to share a work space via a network by using haptic interface devices and olfactory displays (SyP@D2). As such systems, we deal with an olfactory and haptic media display system and a remote ikebana system.

In the olfactory and haptic media display system, a user can enjoy fruit harvesting in a 3-D virtual space as shown in the following figure. The system presents the smell of a fruit to the user as the fruit approaches his/her nose when he/she picks the fruit from a tree in the virtual space. He/she can also perceive the reaction force at the same time.

Olfactory and haptic media display system (demo video)

When we transmit olfactory and haptic media through a network, inter-stream synchronization errors may occur owing to the network delay and jitter. We clarify the influence of inter-stream synchronization error between smell and reaction force (i.e., between olfactory and haptic media) by using the system. We also make a fruit harvesting game by enhancing the system and study QoS control to solve the problems caused by the network delay, delay jitter, and packet loss.

In our remote ikebana (i.e., Japanese flower arrangement) system, as shown in the following figure, a teacher can teach a student at a remote location how to arrange flowers by manipulating a haptic interface device. The teacher or student can hold a flower, adjust the length of the held flower's stem with a pair of scissors (see the figure, in which the student is about to cut the held rose's stem), and fix the flower on a pinholder in a 3-D virtual space.

Remote ikebana system (demo video)

The fragrance of a flower is assumed to reach locations within a constant distance from the corolla of the flower. That is, we can perceive the fragrance of the flower inside a sphere (called the smell space) as shown in the following figure. When the viewpoint of the teacher or student enters the smell space of a flower, he/she can perceive the fragrance of the flower.

Smell space of flower
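The smell-space test reduces to a point-in-sphere check, as in the minimal sketch below (coordinates and radius are arbitrary illustrative values).

```python
import math

# Sketch of the smell-space test: the fragrance is perceived when the
# viewpoint lies within a sphere of fixed radius centered at the corolla.

def in_smell_space(viewpoint, corolla_center, radius):
    """True if the viewpoint is inside the flower's smell space."""
    dx, dy, dz = (v - c for v, c in zip(viewpoint, corolla_center))
    return math.sqrt(dx * dx + dy * dy + dz * dz) <= radius
```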

We propose dynamic control of the output timing of fragrance as QoS control to achieve remote ikebana with a highly realistic sensation.

Networked Real-time Games

As networked games, we handle a racing game, a shooting game, and so on. In the networked racing game, players compete with each other by steering their own cars. In the networked shooting game, two players fight against each other. Each player fires shots at the other player's fighter while moving his/her own fighter to the right or left, and avoids the other player's shots by shielding his/her fighter behind buildings. When a shot hits a fighter, the fighter is blown up; when it hits a building, the building is damaged.

 
Racing game (demo video)   Shooting game

We handle a networked real-time game in which two players operate their own objects competitively by manipulating haptic interface devices as shown in the following figure. Each player lifts and moves his/her object (a rigid cube) so that the object contains the target (a sphere) in a 3-D virtual space. When the target is contained by either of the two objects, it disappears and then appears at a randomly-selected position in the space. We need to maintain the causality, consistency, and fairness between the players in the game.

Networked haptic game (demo video)
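The condition that an object (a cube) contains the target (a sphere) can be sketched as an axis-aligned fit test. This is an assumed simplification for illustration, not the game's actual collision handling.

```python
# Sketch of the target-containment test: the cube contains the sphere
# when, along every axis, the sphere fits inside the cube's half-extent.

def cube_contains_sphere(cube_center, cube_edge, sphere_center, sphere_radius):
    """True if the sphere lies entirely inside the axis-aligned cube."""
    half = cube_edge / 2.0
    return all(abs(s - c) + sphere_radius <= half
               for s, c in zip(sphere_center, cube_center))
```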

We also deal with a networked haptic game with collaborative work. As shown in the following figure, two groups (groups a and b), each of which consists of two players, play a networked haptic game in which the two players in each group move their object collaboratively by holding the object between the two cursors of their haptic interface devices in a 3-D virtual space. When the target is contained by either of the two objects, it disappears and then appears at a randomly-selected position in the space. Since we handle collaborative work and competitive work together, we need to maintain both the efficiency and the fairness of the game.

Networked haptic game with collaborative work

Furthermore, we handle a fruit harvesting game by enhancing an olfactory and haptic media display system.

Remote Robot System with Force Feedback

We study QoS control and stabilization control for a remote robot system with force feedback, in which a user operates a remote industrial robot having a force sensor by using a haptic interface device while watching video, in order to achieve a high-quality and highly stable system.

Configuration of remote robot system with force feedback

We deal with several types of work, such as writing small characters and pushing balls with different degrees of softness, and we study QoS control for accurate transfer of the pen pressure and of the softness of each pushed ball even when the network delay is large. We also try to achieve control precise enough to pass a thread through the eye of a needle.

Writing character and pushing ball (demo video of pushing ball)

We also handle cooperative work, such as carrying an object together and hand delivery of an object between two remote robot systems, to study spatiotemporal synchronization control (control which moves the robots at the same height, angle, and timing). We further suppose that we operate movable robots, and we achieve hand delivery while moving the robot arms. It is important to avoid applying a large force to an object so as not to break it.

Carrying object together and hand delivery (demo video of hand delivery)
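A minimal sketch of the spatiotemporal synchronization check between two robot arms: the arms should be at approximately the same height and angle at the same time. The pose representation and tolerances are assumptions for illustration.

```python
# Sketch of a spatiotemporal synchronization check between two robot arms.
# pose = (height_m, angle_deg); tolerances are hypothetical.

def synchronized(pose_a, pose_b, height_tol=0.005, angle_tol=1.0):
    """True if the two robot poses match within the given tolerances."""
    return (abs(pose_a[0] - pose_b[0]) <= height_tol
            and abs(pose_a[1] - pose_b[1]) <= angle_tol)
```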

As the next step of our research, we plan to operate movable robots remotely.

In addition, our laboratory has carried out the following research.

Networked Haptic Museum

In a networked haptic museum, which is a distributed virtual museum with touchable exhibits, an avatar explains exhibited objects (e.g., an Egyptian mummy sculpture, a painting of sunflowers by Vincent van Gogh, and a dinosaur skeleton and one of its fangs) with voice and video while manipulating the objects by using a haptic interface device as shown in the following figure. Multiple users touch the objects while watching/listening to the avatar's presentation and ask him/her questions. Each user can lift and move an exhibited object in order to feel the object's weight. The avatar explains the part of the object which a user is touching through a haptic interface device.

Configuration of the networked haptic museum (demo video)

Interconnection among Heterogeneous Haptic Interface Devices

When we interconnect haptic interface devices whose specifications, such as the workspace size, differ from each other, the following problem occurs: there exist regions of the virtual space which some of the haptic interface devices cannot reach. The problem can be solved by mapping each device's workspace to the virtual space. We examine several mapping methods for collaborative work and competitive work.
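One simple mapping is per-axis linear scaling of each device's workspace onto the shared virtual space, sketched below. This is an illustrative assumption, not necessarily one of the mappings we evaluate.

```python
# Sketch of workspace mapping: each device's workspace is linearly scaled
# per axis onto the virtual space so that every device can reach all of it.

def map_to_virtual(device_pos, device_min, device_max, virtual_min, virtual_max):
    """Map a device-workspace coordinate to a virtual-space coordinate."""
    return tuple(vmin + (p - dmin) * (vmax - vmin) / (dmax - dmin)
                 for p, dmin, dmax, vmin, vmax
                 in zip(device_pos, device_min, device_max, virtual_min, virtual_max))
```

Note that scaling a small workspace onto a large space amplifies hand motion (and position noise), which is one reason the choice of mapping matters for the quality of work.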

In the collaborative work shown in the following figure, two users operate SPIDAR-G AHS and Falcon, and the two users move a rigid cube as an object collaboratively by holding the cube between the two cursors of the devices in a 3-D virtual space. They lift and move the cube collaboratively so that the cube contains a target (a sphere in the figure), which revolves along a circular orbit at a constant velocity.

Collaborative work using SPIDAR-G AHS and Falcon

The following figure shows the competitive work in which four players operate their own objects competitively by manipulating haptic interface devices. Each player lifts and moves his/her object (a rigid cube) so that the object contains the target (a sphere) in a 3-D virtual space. When the target is contained by any of the four objects, it disappears and then appears at a randomly-selected position in the space.

Competitive work using four heterogeneous haptic interface devices

In the near future, we plan to handle a variety of applications.

In the above applications, multiple media streams are temporally related to each other. Also, the requirements for interactivity are very stringent; that is, the applications need small network delays.
Furthermore, since multiple users participate in the applications, it is important to transfer a single information unit to all the users at the same time. Therefore, we need multicast communication for the effective usage of network resources.
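A back-of-the-envelope sketch of why multicast uses network resources more effectively than repeated unicast: with unicast the sender transmits one copy of each information unit per receiver, while with multicast it transmits a single copy that the network replicates toward the receivers.

```python
# Illustrative comparison of sender-side bandwidth under unicast vs.
# multicast delivery of the same stream to n receivers.

def unicast_sender_bandwidth(stream_bps: float, n_receivers: int) -> float:
    """Unicast: the sender transmits one copy of the stream per receiver."""
    return stream_bps * n_receivers

def multicast_sender_bandwidth(stream_bps: float, n_receivers: int) -> float:
    """Multicast: one copy regardless of the number of receivers."""
    return stream_bps
```

For a 1 Mbit/s stream and 10 receivers, the sender needs 10 Mbit/s with unicast but only 1 Mbit/s with multicast.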

Ishibashi lab. studies fundamental technologies in order to achieve networked multimedia applications of high quality and those with advanced functions.
We mainly handle QoS control, media synchronization schemes, and multimedia communication protocols as fundamental technologies.