
Creating Internet2 Transcript

Internet2 Applications: Samples

Ted Hanss
Director of Applications Development
Internet2
ted@internet2.edu

In this section we're going to take a look at specific examples of the types of applications that Internet2 is designed to enable.

One characteristic shared by advanced applications is that they require an advanced network in order to run. Another characteristic is that they have the ability to powerfully affect the way we work in research and education.

The advanced applications that we are looking at in the first stages of Internet2 can be grouped into several different categories. One of these categories is access to remote scientific instruments. There are some very remarkable and expensive instruments on university campuses across the country, but currently only a limited number of researchers have access to them. Internet2 potentially can help address this problem.

Another category of Internet2 applications is real-time modeling based on information gathered from sensors. We're also working on collaborative projects using large-scale, multi-site computation, and tele-immersion.

Each of the applications we'll be looking at in this section combines some or all of these characteristics.

The first example of an advanced application that we will look at is streaming, high-fidelity audio from a project called "Variations" at Indiana University. This application lets a music student listen to digitized orchestral recordings while sitting at a computer workstation connected to the network on the Indiana University campus. Accessing these recordings over the network eliminates the need for students to request an audio tape of a particular recording from a central circulation desk, wait for the tape to be retrieved, go to a player to listen to the tape, and then return it to the central desk.

Even after these recordings were digitized, access was initially extremely limited. When a music student wanted access to these digitized orchestral recordings, he or she needed to go to a single room on the Indiana campus with only 70 workstations in it. The fidelity of the recordings required high bandwidth -- well over a megabit per second per listener. And the controls needed to be as responsive as the CD player in your home -- the latency had to be very low. So existing network connections couldn't adequately support the requirements of this application.
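The bandwidth requirement just mentioned can be checked with a little arithmetic. This sketch uses the standard parameters of uncompressed CD-quality stereo audio -- the constants are textbook CD values, not figures from the seminar:

```python
# Raw bit rate of uncompressed CD-quality stereo audio. The constants are
# the standard CD-audio parameters, used here purely for illustration.
SAMPLE_RATE_HZ = 44_100   # CD sampling rate
BITS_PER_SAMPLE = 16      # CD bit depth
CHANNELS = 2              # stereo

def cd_audio_kbps() -> float:
    """Bit rate in kilobits per second."""
    return SAMPLE_RATE_HZ * BITS_PER_SAMPLE * CHANNELS / 1000

print(round(cd_audio_kbps()))  # 1411 -- over a megabit per second per listener
```

At roughly 1.4 megabits per second per listener, plus the low-latency control traffic, a building full of music students quickly exceeds what a shared commodity link of the era could deliver.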

The potential for this application is greatly increased with Internet2 connectivity. Now, any workstation user with access to a high-performance network, like the vBNS, can access those recordings as if he or she were on the Indiana University campus. We demonstrated this application at the Internet2 meeting in Washington, D.C. in October 1997. We have also demonstrated this application over the vBNS. The response and quality are the same as if you were in the music workstation room on the IU campus. Applications of this type hold the potential for music students at any university to access digital recordings from across the country.

Another application of remote access to instruments is tele-microscopy, as developed at the Collaboratory for Microscopic Digital Anatomy by the University of California at San Diego and Cornell University. This application allows a researcher who may be thousands of miles away to remotely control and acquire data from an advanced intermediate high-voltage electron microscope (IVEM) located in San Diego at the National Center for Microscopy and Imaging Research, a shared facility sponsored by the National Center for Research Resources of the NIH.

This tele-microscopy application enables researchers from sites around the country to work collaboratively with images generated by this microscope. After working together to collect images, researchers working on the same or related projects share the images and data over the network. The collaboratory will also provide remote access to high-performance computing platforms and to 3-dimensional data sets of biological structures from IVEM images. Using these resources, researchers can apply electron tomography and advanced computer graphics programs to visualize and analyze 3-D data. This application also makes new kinds of instructional experiences possible.

This application is also changing the way the microscope can be used. Instead of just displaying the content on the screen in front of the microscope, the picture can be digitized, encrypted, and then delivered over the network. Then, with appropriate interface software, a researcher can zoom in, pan across, and examine the sample in different ways.
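The control loop just described can be pictured as a thin client that sends small, named commands to the instrument while the digitized image stream flows back separately. Everything in this sketch -- the MicroscopeClient class, the command names -- is a hypothetical illustration, not the actual collaboratory software:

```python
# Hypothetical remote-control client: small pan/zoom commands go out to
# the instrument; here we simply record them instead of using a network.
from dataclasses import dataclass, field

@dataclass
class MicroscopeClient:
    zoom: float = 1.0
    x: float = 0.0
    y: float = 0.0
    log: list = field(default_factory=list)

    def send(self, command: str, **params) -> None:
        # In a real system this would travel over the network to the
        # microscope; here we just append it to a log.
        self.log.append((command, params))

    def pan(self, dx: float, dy: float) -> None:
        self.x += dx
        self.y += dy
        self.send("pan", dx=dx, dy=dy)

    def set_zoom(self, factor: float) -> None:
        self.zoom = factor
        self.send("zoom", factor=factor)

scope = MicroscopeClient()
scope.pan(10.0, -5.0)
scope.set_zoom(4.0)
print(scope.log)  # [('pan', {'dx': 10.0, 'dy': -5.0}), ('zoom', {'factor': 4.0})]
```

The point of the design is that commands are tiny, so responsiveness depends almost entirely on network latency -- exactly the property the transcript says today's connections struggle to provide.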

Researchers use the electron microscope from computers in their own laboratories to obtain critical information on microscopic biological structures of cells and tissues. This information helps them understand how the cells and tissues function normally, and provides the basis for studying pathological conditions and disease. The system is currently being tested by several scientists at sites around the country in preparation for widespread deployment. Once deployed, this project will enable more and better research projects to be accomplished with fewer resources. This project will also push the envelope of available networking technologies. It will require addressing problems associated with bandwidth and Quality of Service encountered in today's networks.

The capabilities being developed by Internet2 will improve the functionality of the application. The existing Internet strains to provide the bandwidth required for high-resolution imaging and the low latency required for responsive remote control. (As has been explained in other sections of the seminar, these are some of the issues being studied by Internet2 working groups.) Internet2 will provide the necessary network performance to significantly improve the collaborative nature of the application. Eventually, interactive control of this type of instrument and high-speed data acquisition will be a common and routine method of data collection and collaboration.

Part of the excitement about this type of application is that it takes a device that usually sits in some remote corner of an engineering building and places it virtually into any number of laboratories or classrooms. A huge device like an electron microscope can be brought over the network to any computer with the proper client software. Applications like this expand the accessibility of a scientific instrument and can transform what was purely a research tool into an instructional tool.

The University of Michigan is developing another collaborative application, called the Upper Atmospheric Research Collaboratory, or UARC. The UARC is a joint venture of researchers in upper atmospheric and space physics, computer science, and behavioral science. It consists of a suite of collaboration tools that enable geographically distributed scientists to work together. These tools include shared data visualizers, a whiteboard, and chat. They are built on top of a specialized set of protocols, or transport layer, that provides application-level Quality of Service.
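One way to picture "application-level Quality of Service" is a transport that hands small, urgent messages -- control traffic, chat -- to the application ahead of bulky sensor data, regardless of arrival order. The message classes and priorities in this toy sketch are invented for illustration; they are not UARC's actual protocol:

```python
# Toy priority-based delivery queue: lower priority number = delivered first.
import heapq
from itertools import count

PRIORITY = {"control": 0, "chat": 1, "sensor-data": 2}

class QosQueue:
    def __init__(self):
        self._heap = []
        self._seq = count()  # tie-breaker keeps FIFO order within a class

    def push(self, kind: str, payload: str) -> None:
        heapq.heappush(self._heap, (PRIORITY[kind], next(self._seq), kind, payload))

    def pop(self):
        _, _, kind, payload = heapq.heappop(self._heap)
        return kind, payload

q = QosQueue()
q.push("sensor-data", "radar frame 1")
q.push("chat", "anyone seeing this aurora?")
q.push("control", "switch instrument mode")
print(q.pop())  # ('control', 'switch instrument mode') -- urgent traffic first
```

A best-effort network delivers packets in whatever order congestion allows; a QoS-aware transport like the one sketched here lets the collaboratory keep its interactive tools responsive even while large data transfers are in flight.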

Upper atmospheric scientists were once required to fly to Greenland and sit in a trailer to monitor the scientific instruments that they install in that cold and isolated area. The geographic placement of the instruments was essential for the types of observations they wanted to make. Now, with this tool they can remotely -- and collaboratively -- access data from their instruments over the Internet. They can access real-time data -- that is, watch events as they happen -- as well as recall stored data.

A second-order effect of creating this collaboratory is that there is greater participation in the scientific campaigns -- the scheduled observations of atmospheric events. Previously, access to the facilities in Greenland was quite limited. Now, it isn't only the scientists who can afford to travel to Greenland who have a chance to participate in these observations. Graduate students now make up the largest group of UARC participants. UARC has not only extended the ability of top scientists to participate, but has also made instruments and data available to a much larger part of the academic community.

Another important application, developed at the University of Pittsburgh Medical Center, demonstrates how effective the marriage of scientific instruments with supercomputing and visualization tools can be. This application, called "Watching the Brain in Action," comes from the combined efforts of the University of Pittsburgh, Carnegie Mellon University, and the Pittsburgh Supercomputing Center. It involves real-time visualization of brain activity during visual and memory tasks performed by a subject in a remote Magnetic Resonance Imaging (MRI) scanner. This application is currently being used in scientific research to improve the quality of data acquisition, but there are several clinical uses envisioned. This tool might eventually be used to diagnose brain pathology and psychiatric or cognitive disorders. It may be used to plan for neurosurgery. More broadly, the application can also serve as the basis for other tele-medicine and tele-research applications.

In this application, an individual may be undergoing a brain scan on an MRI machine. As the person is given a visual stimulus, the MRI machine scans for neural activity and the raw MRI data are transmitted over one leg of the network to the supercomputer at the Pittsburgh Supercomputing Center. As the data are received, the Cray supercomputer processes many MRI images into a single synchronized set of data and creates a three-dimensional volume visualization of the brain. This visualization is in turn sent over another leg of the network to a high-performance workstation that renders it in three dimensions on the desktop. The image can be rendered so that by wearing special glasses, you can see a 3-dimensional view of the brain. A really exciting part of this application is that because it produces 3-dimensional views of the brain in real-time, you can actually see the different patterns of neurons firing in the person's brain as they are shown different visual patterns -- you can actually see the person thinking!
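The data path just described -- scanner to supercomputer to workstation -- can be pictured as a three-stage pipeline. The stage functions and data shapes in this sketch are hypothetical stand-ins, not the actual Pittsburgh software:

```python
# Illustrative three-stage pipeline: raw frames from the "scanner" are
# fused into volumes by the "supercomputer" stage, then rendered by the
# "workstation" stage. All names and data shapes are invented for scale.
def scanner(n_frames):
    """Stand-in for the MRI machine emitting raw 2-D frames."""
    for i in range(n_frames):
        yield {"frame": i, "signal": i * 0.1}

def supercomputer(frames, batch_size=4):
    """Stand-in for the Cray: fuse each batch of frames into one volume."""
    batch = []
    for f in frames:
        batch.append(f)
        if len(batch) == batch_size:
            yield {"volume": [b["signal"] for b in batch]}
            batch = []

def workstation(volumes):
    """Stand-in for the desktop renderer."""
    return [f"render 3-D volume of {len(v['volume'])} slices" for v in volumes]

print(workstation(supercomputer(scanner(8))))
```

Because each stage streams its output to the next as data arrive, the end-to-end delay is set by the slowest link -- which is why both legs of the network need high bandwidth and low latency for the visualization to feel "real-time."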

This capability allows someone -- a surgeon, for example -- anywhere with a high-speed network connection to see, in real-time, where the visual stimulation occurred in the brain. The surgeon could go in and remove a tumor using this tool as a way of understanding more precisely where the tumor affects certain areas of the brain. Clinical assessment tools that respond this quickly, with an environment as rich as the human brain, simply do not exist today.

And, this capability can be accessed wherever there is an MRI machine and a high-performance connection. A medical specialist can remotely diagnose patients using capabilities located anywhere in the world. This is the kind of tool that will have a qualitative effect on people's lives, and it is a perfect example of how the capabilities of an advanced network can make the distance among instruments, computers, and people less important.

A final type of advanced application is tele-immersion. Now, tele-immersion is a very exciting area not only because of the kinds of things the application enables, but also because it puts a very high demand on the network. More than any other application we've talked about, tele-immersion demands very high bandwidth and extremely low delay.

In some ways, tele-immersion is a benchmark application because its demands are so much greater; the networking capabilities we need to develop for tele-immersion will allow many other types of applications being developed to work as well.

Tele-immersion typically uses special, large display devices -- like ImmersaDesks and CAVEs -- which provide the sense of being "immersed" in an image. These displays are very large, or may even completely surround the user. Tele-immersion also often uses special glasses to give images a 3-D effect. The result is the sense of being fully present in a computer-generated scene.

Since the concept of a CAVE is fairly new, let's take a moment to define it. CAVE stands for CAVE Automatic Virtual Environment, and it is a room-sized advanced visualization tool which creates the illusion of complete "immersion" in a virtual environment for one or more users. It does this by bringing together high-resolution, stereoscopic projection and 3D computer graphics.

Tele-immersion is the activity made possible through the use of a CAVE. Tele-immersion also makes use of special goggles that allow a user to see images in three dimensions, and tracking devices that follow a user's point of view in the virtual environment. These requirements, along with the especially intense demands they make on the network, mean that this application is years from being widely deployed.
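A rough calculation shows why tele-immersion stresses the network more than any other application in this section. The figures below -- wall count, resolution, frame rate -- are assumptions chosen for scale, not specifications from the seminar:

```python
# Back-of-the-envelope bandwidth for streaming uncompressed stereo video
# to every projection wall of a CAVE. All constants are assumed values
# for illustration only.
WALLS = 4                  # typical number of CAVE projection surfaces
WIDTH, HEIGHT = 1024, 768  # per-wall resolution (assumed)
BYTES_PER_PIXEL = 3        # 24-bit color
FPS = 30                   # frames per second
EYES = 2                   # separate image per eye for stereo

def cave_gbps() -> float:
    """Uncompressed video bit rate for the whole CAVE, in gigabits/s."""
    bits = WALLS * WIDTH * HEIGHT * BYTES_PER_PIXEL * 8 * FPS * EYES
    return bits / 1e9

print(f"{cave_gbps():.1f} Gbit/s uncompressed")  # 4.5 Gbit/s uncompressed
```

Even with aggressive compression, that leaves a demand orders of magnitude beyond the streaming-audio example earlier -- and the latency requirement is just as severe, since the scene must update as fast as the user's head moves.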

But, the work that's being done today has resulted in some very interesting applications.

One example is the work being done at the Electronic Visualization Laboratory at the University of Illinois at Chicago. This application is called NICE, which stands for Narrative Immersive Constructionist/Collaborative Environments. NICE is a virtual reality (VR) educational environment for children. Real and computer-generated users, motivated by an underlying story, can build persistent virtual worlds through collaboration. For example, several children at remote sites around the country could use CAVE-based virtual reality systems to learn together while tending a virtual garden. NICE is being used as a testbed to investigate how virtual reality can be made into an effective educational environment for 6- to 10-year-olds.

While using virtual reality to teach children about gardening may seem a little silly, it does provide an exceptionally good testing ground for exploring the kinds of issues we face when working at the limits of our application and engineering abilities. We can learn lessons that can then be used in similar applications which might allow doctors to explore a patient's anatomy, or engineers and designers to work together. The sorts of problems presented by tele-immersion are very difficult and they are just the sort that Internet2 members and working groups hope to eventually solve. The Internet2 project will help make tele-immersion a much more accessible technology by beginning to work on the very difficult networking issues.

As work on the Internet2 project continues on member campuses, the number and kinds of advanced applications being developed will begin to increase rapidly. And our experience from the round of networking development that occurred with the current Internet suggests that there is a whole set of applications waiting to be developed that will take advantage of new networking capabilities that we haven't even imagined yet.

For the latest information about advanced applications being developed by researchers at Internet2 universities, keep in touch with the Internet2 website.

© CREN, 1999
