
December 1999, Volume 3, Number 4


IMPs Enhance Virtual Collaboration Environments

With the recent introduction and adoption of software tools to support collaboration, more work is being done cooperatively, with results stored in shared virtual environments. Traditionally, collaborators have benefited from support mechanisms such as physical containers, individual computers, and supporting staff. As collaborators move toward more computer-driven, geographically or temporally dispersed interactions, the lack of equivalent virtual-world support mechanisms becomes apparent. Responding to this need, the MITRE Technology Program has funded research into Intelligent Multimodal Participants (IMPs).

IMPs are a specialized class of software agents that exist in virtual environments to support collaborators. While IMPs currently support human collaborators, a longer-term goal is to have them support other agent collaborators. IMPs participate in virtual environments on an equal footing with human participants. IMPs have several important qualities that make them ideal candidates to support collaborators in virtual environments. They are persistent: they can remain in the virtual environment long after other collaborators (human or not) have come and gone, depending only on the persistence of the collaboration environment itself. Consequently, they can assist collaborators in ways that are not easy for humans. IMPs are autonomous and can be built to exhibit certain "behaviors." These behaviors define the IMP's purpose and its interaction with the human collaborators. For example, an IMP could be designed as a help resource for users new to the collaboration environment. This kind of IMP would monitor the user's actions and offer shortcuts or suggestions based on those actions and the user's profile.
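
For illustration only, a help-resource behavior of this kind might be sketched in Python roughly as follows; the class, event names, and client interface below are assumptions made for the sketch, not the actual IMP software.

```python
# Hypothetical sketch of an IMP "behavior" -- names and event fields are assumptions,
# not the actual CVW/IMP interfaces described in the article.

class HelpBehavior:
    """Watches a user's actions and offers shortcuts or suggestions."""

    SHORTCUTS = {
        "open_document_via_menu": "Tip: you can drag a document onto your icon to open it.",
        "page_user_manually": "Tip: type '@page <name> <message>' to page someone directly.",
    }

    def __init__(self, imp_client):
        self.imp = imp_client          # the IMP's connection to the virtual environment
        self.history = []              # per-user action history used to infer intent

    def on_user_action(self, user, action):
        """Called whenever a monitored user does something in the environment."""
        self.history.append((user, action))
        tip = self.SHORTCUTS.get(action)
        if tip and self._is_new_user(user):
            self.imp.whisper(user, tip)   # private suggestion, visible only to that user

    def _is_new_user(self, user):
        # A crude heuristic: treat users with little recorded history as "new."
        return sum(1 for u, _ in self.history if u == user) < 20
```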

IMPs can be constructed to follow profiles and to store and retrieve data based on situation or room events. Currently, IMPs fall into three broad categories: place-based assistants, personal assistants, and global resources.

Using MITRE's Collaborative Virtual Workspace (CVW) as an example (cvw.mitre.org), place-based IMPs occupy a room in CVW. CVW provides a persistent virtual environment where teams can communicate, collaborate, and share information regardless of their geographic location. It provides virtual collocation through persistent virtual rooms, which house the people, information, and tools appropriate to a task, operation, or service. To a user, CVW is a building divided into floors and rooms, where each room provides a context for communication and document sharing. Defining rooms as the basis for communication means that users are not required to set up sessions or know user locations; they need only enter a room. The place-based IMP stays in the room, serving as a resource for anyone present. An example of room-based logging is discussed below.

The personal assistant is a mobile IMP; it can either remain in a fixed location or follow a user throughout the virtual environment. This class of IMP might mimic the behavior of a human executive assistant and is most useful to the person(s) it supports. Although not yet built, this class of IMP will be used to explore adaptive learning techniques.

The last type is the global resource IMP, a cross between the capabilities of the room-based IMP and the personal assistant. This class is only now being framed and defined.

To the human collaborators in the virtual environment, the IMP looks just like another user. In the accompanying screenshot, the IMP's picture appears in the same place as those of the human collaborators. Participants can interact with the IMP just as they would with another participant, directing requests or commands to it. For the human collaborators in the environment, a dialogue interface provides a natural means of interacting with the IMP and requires no additional training. Textual dialogue with the IMP appears in the textual scroll-back area, in the same place as human dialogue.
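
As an illustration of this dialogue style, the sketch below shows one way an IMP could pick out requests addressed to it from the room's text channel; the command phrasing, the "Scribe" wiring, and the reply callback are assumptions for the sketch, not the actual CVW interface.

```python
import re

# Hypothetical sketch of the dialogue interface: the IMP listens to the room's text
# channel and reacts only to utterances addressed to it by name.

class DialogueInterface:
    def __init__(self, imp_name, commands):
        self.pattern = re.compile(rf"^{re.escape(imp_name)}[,:]?\s+(.*)$", re.IGNORECASE)
        self.commands = commands       # maps a command phrase to a handler function

    def on_room_text(self, speaker, text, reply):
        match = self.pattern.match(text.strip())
        if not match:
            return                     # ordinary human-to-human dialogue; ignore it
        request = match.group(1).lower()
        handler = self.commands.get(request)
        if handler:
            reply(handler(speaker))
        else:
            reply(f"Sorry {speaker}, I don't understand '{request}'.")

# Example wiring: a "Scribe" IMP that can start and stop logging on request.
scribe = DialogueInterface("Scribe", {
    "start logging": lambda who: f"OK {who}, I am now logging public actions in this room.",
    "stop logging": lambda who: f"OK {who}, logging stopped.",
})
```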

Internally, the IMP is a special-purpose collaboration client with additional software to provide the features that human users see as the behavior of the IMP. Consequently, to the server of the virtual environment, the IMP appears as a client in a client-server relationship (a connection sketch follows the list below). This interface style is important for two reasons:

  1. The client-server model allows the IMP to connect to the server and perform all of its functions without any modification to the server. The only exception is logging private actions between a user and the server, or private actions between two users. The current security model of the server prevents any other user (in this example, the IMP) from seeing or being aware of any private interactions. In order to capture and log these interactions, a patch must be applied to the server to channel them to a third party (i.e., the IMP).
  2. The client-server model allows the IMP to extend the capabilities of the virtual environment. The IMP provides a mechanism to experiment with additional capabilities and their effects prior to modifying the server or the user's client software.
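
As a rough illustration of the first point, the sketch below shows an IMP logging in and listening for events the same way an ordinary client would. CVW's real wire protocol is not described in this article, so the line-oriented commands here are invented placeholders.

```python
import socket

# Hypothetical sketch: the IMP connects as an ordinary client, so the server needs no
# modification. The "connect"/"join"/"say" commands are placeholders, not CVW's protocol.

class ImpClient:
    def __init__(self, host, port, name, password):
        self.sock = socket.create_connection((host, port))
        self._send(f"connect {name} {password}")     # same login path as a human client

    def _send(self, line):
        self.sock.sendall((line + "\n").encode("utf-8"))

    def join_room(self, room):
        self._send(f"join {room}")

    def say(self, text):
        self._send(f"say {text}")

    def events(self):
        """Yield server events line by line; behaviors are driven from this stream."""
        stream = self.sock.makefile("r", encoding="utf-8")
        for line in stream:
            yield line.rstrip("\n")
```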

Research

Current research is focused on building a class of IMPs that are aware of their surroundings and of user actions within them. By building up a history of these interactions, the IMP will be able to discern user intent and begin to proactively assist users in several ways. To further this research, a reusable IMP framework has been constructed to allow researchers and other developers to rapidly prototype different types of IMPs without having to reinvent the collaboration-client portion of the IMP. This allows developers to concentrate on the behavior of the IMP, which is the user-visible functionality, instead of on the mechanics of the virtual environment. Examples of end-user functionality include searches using external web sites, note-taking for textual discussions in virtual rooms in MITRE's CVW, and multimodal logging of collaborative sessions. The framework has been used to construct a series of room-based IMPs that are the tools of the current research.
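
The separation the framework enforces might look roughly like the following sketch, in which the framework owns the client mechanics and dispatches events to pluggable behavior objects. All names and the event format are assumptions; the actual framework is not shown in this article.

```python
# Hypothetical sketch of the reusable-framework idea: a new IMP is written by supplying
# behavior objects only, while the framework handles the collaboration-client mechanics.

class ImpFramework:
    def __init__(self, client):
        self.client = client            # e.g., the ImpClient sketched earlier
        self.behaviors = []

    def add_behavior(self, behavior):
        self.behaviors.append(behavior)

    def run(self):
        for line in self.client.events():
            kind, actor, payload = parse_event(line)
            for behavior in self.behaviors:
                handler = getattr(behavior, "on_" + kind, None)
                if handler:
                    handler(actor, payload)

def parse_event(line):
    """Placeholder parser: assume server events look like '<kind> <actor> <payload>'."""
    parts = line.split(" ", 2)
    while len(parts) < 3:
        parts.append("")
    return parts[0], parts[1], parts[2]
```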

In order to extend the IMP's ability to discover and categorize user interactions, it was necessary to capture, classify, time-stamp, and store all public interactions that occur in a virtual room. Examples of these actions include textual conversation, audio conversation, non-verbal communication, user actions (e.g., arrivals, departures, taking objects, leaving objects), and whiteboard actions. These interactions can now be recorded and saved, providing a real-time log that captures all public interactions -- multimodal interactions -- that occur within a room. The log contains not only a time-ordered sequence of the events but also the contents of the events themselves, permitting replay of the captured interactions.
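
A minimal sketch of such a log record and its replay, assuming a simple in-memory store rather than the actual MML schema, might look like this:

```python
import time
from dataclasses import dataclass, field

# Hypothetical sketch of a multimodal log record; the fields are assumptions, not the
# schema actually used with the multimodal logger (MML).

@dataclass
class RoomEvent:
    room: str
    actor: str
    kind: str            # e.g. "text", "audio", "whiteboard", "arrival", "departure"
    content: object      # the event contents themselves, so the session can be replayed
    timestamp: float = field(default_factory=time.time)

class RoomLog:
    def __init__(self):
        self.events = []

    def record(self, event: RoomEvent):
        self.events.append(event)

    def replay(self, handler):
        """Feed events back in time order, e.g. to reconstruct a captured session."""
        for event in sorted(self.events, key=lambda e: e.timestamp):
            handler(event)
```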

In this interface we can view a dialogue, participants, and other contents in the room "CVW Development Center." Note the IMP icon shown alongside the human collaborators. Also, note the IMP's interaction as "Scribe" in the dialogue section.


Application of Research

Recent Air Force exercises and experiments have used virtual environments to support the geographic distribution of elements of the Joint Forces Air Component Commander's (JFACC's) Air Operations Center (AOC). This geographic distribution includes splitting up team members who would normally be collocated. This separation makes traditional interactions difficult, since the team members are separated not only by distance but often also by time. Setting up virtual workspaces in an environment like CVW allows the team members to have a common workroom with easy-to-access persistent storage. One application of the current research would be using an IMP to record minutes of meetings and make them available to team members. IMPs could also be used to monitor other workspaces and notify team members when activities occur there, thereby extending the awareness of team members.

Typically, Air Force exercises span several days and often have extended shifts of more than 12 hours. Exercises often have several measures of effectiveness that must be monitored to determine overall results. In addition to exercises, the Air Force has established an annual Expeditionary Force Experiment.

Since exercises are graded to determine unit efficiency, and experiments are monitored to determine outcomes, significant amounts of data must be collected. Typical methods of collecting data include hanging video cameras and microphones from the ceiling to capture user actions. Additionally, observer staff wander the floors with clipboards to manually capture discrete user actions and record overall impressions. Reviewing video and audio logs and collating observers' notes is a long, tedious process. In the case of interactions in virtual worlds, IMPs become the perfect vehicles for collecting the low-level event data.

One of the research IMPs has been extended and made robust enough to support the multimodal logging of data described above. The IMP inhabits a room in CVW and, when asked, starts logging all of the public actions in the room. The IMP is capable of notifying the current room inhabitants, as well as new arrivals, that it is logging public actions. This notification provides important feedback to the room's inhabitants. The IMP takes advantage of previous work done under MITRE's Sponsored Research program: the multimodal logger (MML). The MML is used to time-stamp and store the collected data. Data analysis tools built to work with the MML's database can be used to visualize different aspects of the data.
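
A sketch of the logging IMP's notification behavior, with a simple store standing in for the MML (whose real API is not shown here), might look like this:

```python
# Hypothetical sketch: when logging is on, current occupants are told, and each new
# arrival is told on entry. The mml_store object is an assumed stand-in for the MML.

class LoggingBehavior:
    NOTICE = "Notice: I am logging all public actions in this room."

    def __init__(self, imp_client, mml_store):
        self.imp = imp_client
        self.store = mml_store
        self.active = False

    def start(self):
        self.active = True
        self.imp.say(self.NOTICE)                # notify everyone already in the room

    def on_arrival(self, user):
        if self.active:
            self.imp.whisper(user, self.NOTICE)  # notify each new arrival

    def on_public_event(self, event):
        if self.active:
            self.store.save(event)               # MML time-stamps and stores the data
```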

SQL-like query tools can be used to find all occurrences of a particular type or piece of data. Timeline analysis tools can be used to see the data in the order in which it occurred. Filters can be applied to allow either full or partial visualization of the entire data set. This type of visualization demonstrates where this technology can be useful in post-exercise reconstruction and analysis. The collected data itself becomes valuable not only as a collection that can be searched, but also as a record of the events themselves. In the case of exercises and experiments, it serves as a means of determining outcomes and can also be used in training situations. In real-world situations, the data serves as an easily obtained record of actions.
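
For illustration, the same kinds of queries and filtered timelines could be expressed over the in-memory log sketched earlier; the real MML tools are database-backed, so this is only an analogue.

```python
# Hypothetical analysis helpers over the RoomLog sketched above.

def find_events(log, kind):
    """Rough analogue of an SQL-style query: all occurrences of one type of data."""
    return [e for e in log.events if e.kind == kind]

def timeline(log, predicate=lambda e: True):
    """Time-ordered view of the data, optionally filtered for partial visualization."""
    return sorted((e for e in log.events if predicate(e)), key=lambda e: e.timestamp)

# Example usage (assuming room_log is a populated RoomLog):
# whiteboard_actions = find_events(room_log, "whiteboard")
# alice_timeline = timeline(room_log, lambda e: e.actor == "alice")
```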


For more information, please contact Michael Krutsch using the employee directory.


