

«Information Systems and Technologies»



Variant 1

Text 1

Inside Artificial Reality

You can fly to the moon by pointing your finger. With a flick of your wrist, see the world through the eyes of a child. Reach out and grasp furniture, windows, or walls that exist only within the silicon memory of a personal computer. Wave your hand to create virtual paper on an empty desktop, a simplified skid in a nonexistent car, or X rays of the human body.

Artificial reality was the stuff computer researchers’ dreams were made of – until recently. It has become possible thanks to a confluence of developments in new technologies emerging from the labs, including lightweight 3-D stereoscopic displays, magnetic positioning systems, advanced graphics chips, continuing expansion of computer memory, and novel interfaces between human and computer.

What’s most remarkable isn’t that computerized artificial reality exists, but that it is emerging so quickly from the research labs into public reach. Already, prototype systems have migrated from customized graphics workstations to off-the-shelf PCs like the Compaq Deskpro 386/33. Within a year, PC users will be able to design their own artificial worlds for eye-popping presentations and realistic engineering and architectural modeling.

The creative fervor in artificial reality is reminiscent of the atmosphere that prevailed 2 decades ago, when PCs were beginning to make their mark. A handful of researchers on both coasts are combining home-brewed and professional-grade components to process complex artificial environments. Such virtual worlds are meant to be perceived not only by a person’s sight and hearing, but also by touch, shattering the barrier between the computer screen and worlds beyond.

Within 50 miles (80 kilometers) of one another along the San Francisco Bay are three innovative outposts of research on the frontiers of computerized reality: the National Aeronautics and Space Administration (NASA) Ames Research Center at Moffett Field, VPL Research of Redwood City, and Autodesk in Sausalito.

The NASA Ames Research Center anchors the southern tip of the bay and nearby Silicon Valley to the edge of the 21st century. Here, in a building near one of the world’s largest experimental wind tunnels, is the modest lab of the Human Interface Research Branch of NASA’s Aerospace Human Research Division. From this room has emerged much of the technology associated with artificial reality.

To experience the NASA Ames personal simulator, you don a special headgear: a round frame bearing a wraparound visor and a rectangular aluminium enclosure the size of a tissue box. Look into this head-mounted display, and you see a stereoscopic 3-D image of the lab in black and white, complete with walls, checkerboard floor, ceiling, furniture, desktop computers, and equipment racks. When you move your head to the side or up and down, the computer shifts the display to realistically match your point of view.

Next, slip a black Lycra glove attached to strands of black cable onto your right hand. Move your hand to calibrate this Data Glove, then point your index and middle fingers while bending your third finger and pinkie. You see a disembodied image of the glove do the same; the display moves in the direction you indicate. Point straight up, and the room appears to fall away. With your feet still on the ground, you’re flying through the air.

Research scientist Dr. Michael McGreevy, who initiated and guided NASA’s journey into artificial reality, is about to make a lifelong dream come true, using this setup to simulate visits to the solar system. The scientist has secured funding for a project called Visualization for Planetary Exploration, which could result in virtual environments of the moon and the planets.

“We’ll use the visual data recorded by space probes and satellites to create computer models of each planet. When we are through, you’ll be able to hold the moon or any planet in your hand and point to where you want to go on its surface. The computer will scale the environment back to life size, and you can be virtually present at the indicated location. You’d feel like you were there,” McGreevy explains.

Using the same artificial-reality application, known as virtual travel, you could take simulated trips to exotic locales, faraway resorts, or business meetings without moving from your sofa or desk.

The democratization of space travel is one example of how artificial reality may provide greater access to every individual. We need to democratize the technology because that will unleash thousands of talented people who will collectively develop its rich promise.

McGreevy can be credited with designing and implementing the first practical artificial-reality system on an off-the-shelf computer system which included an inexpensive 3-D, head-mounted display.

Considered an essential component of artificial-reality technology by many developers, the head-mounted display was not invented at NASA Ames, but researchers there refined it to a practical size and cost. In 1965 computer pioneer Ivan Sutherland started developing a helmet sporting a pair of CRTs with left-and-right-eye views that were adjusted by computer according to the user’s head movements. By 1982 Tom Furness of the Wright-Patterson Air Force Base in Ohio had built an elaborate aircraft-training simulator into what became known as the Darth Vader helmet. In 1984 McGreevy asked Furness to sell him a helmet for his experiments, but was told it would cost a cool $1 million.

By cannibalizing two $79.95 Radio Shack pocket TVs with black-and-white LCDs and placing them in a $60 motorcycle helmet, McGreevy and hardware contractor Jim Humphries were able to assemble the first NASA Ames head-mounted display for a lot less: under $2,000.

NASA’s government-funded lab was later able to build on early artificial-reality work at Atari Research, where an affordable head-mounted display, a glove input device, and educational and game software were developed but never marketed. After the first great video-game boom went bust, sending Atari into decline, several of its researchers joined forces with NASA: hardware engineer Scott Fisher, software whiz Warren Robinett, and glove inventor Thomas Zimmerman. Fisher is recognized as having brought the glove – now considered of vital importance for interacting with artificial reality – to the NASA Ames personal-simulator system.

 

Answer the questions on the text:

1. What can virtual travel be used for?

2. Where can one be virtually present with the help of the Visualization for Planetary Exploration project?

3. Why is it important to democratize the space travel technology?

4. Whom was the head-mounted display refined by?

5. How much would McGreevy have paid for the Darth Vader helmet if he had agreed to buy it in 1984?

6. What did McGreevy and Jim Humphries do to assemble the first NASA Ames head-mounted display?

7. Who were the Atari researchers that joined forces with NASA?

Text 2

Artificial Reality In Use

Artificial-reality systems allow computer users to experience another world, the world on the other side of a computer screen. By wearing a head-mounted display called an Eye Phone and a black Lycra Data Glove, one enters into a three-dimensional ‘virtual reality’ that can be touched and moved around with a wave of the hand. Some advanced artificial-reality systems allow two people to share virtual reality. For example, one participant can play Alice and the other the Mad Hatter in a computerized reenactment of the tea party from ‘Alice in Wonderland’. A participant can lift objects by means of a Data Glove.

The NASA bureaucracy is reported to have been confused at first about the usefulness of the Virtual Interface Environment Workstation (VIEW), the official name for its artificial-reality system. Originally it was seen as just a way to display operational data, essentially a virtual instrument panel. Notes Robinett, now a project manager for artificial-reality research at the University of North Carolina: “The real application for the VIEW system had to be telepresence – artificial presence in hazardous environments like space. With the head-mounted display hooked up to remote cameras, you’ve got what we call ‘telerobotics’. For a lot of users, it’s safer than being there. You can assemble a space station without risking astronauts’ lives.”

With video-camera input or computer generated graphics, telerobotics could also be useful at the bottom of the ocean or in handling nuclear materials. Microtelepresence, a variation using optical or computer-aided magnification, would allow the manipulation of materials at a microscopic or even molecular level. This technology should not be confused with computer-generated simulations of molecules, an area that was pioneered at the University of North Carolina and is now being tested for the modeling and synthesis of drugs.

At NASA Ames, Fisher and colleagues McGreevy, Humphries, and Dr. Beth Wenzel are now perfecting the next-generation VIEW system. When completed, the system will show stereoscopic images in color or high-resolution black and white that are generated by computer, supplied by video cameras, or replayed from videodiscs. The updated VIEW system will also incorporate 3-D sound positioning and speech recognition and synthesis.

To see the hottest advances in artificial reality, you once had to be privy to government, university, and private research labs but on June 7, 1989, computerized environments went public.

Two firms commercializing the technology held demonstrations in San Francisco and Anaheim. VPL Research proclaimed the occasion a holiday, Virtual Reality Day. Declared its press release: “Like Columbus Day, VR Day will be celebrated every year with a parade and virtual beauty contest inside Virtual Reality.”

Despite such whimsy, big business – as represented by Pacific Bell – is taking artificial reality seriously. In San Francisco’s Brooks Hall, at the regional-telephone-company-sponsored Texpo’89 exhibit of telecommunications products, a place of honor is reserved for VPL Research’s demonstration amid the applications that will be possible once broadband fiber-optic phone lines connect offices and homes. On display is the world’s first shared virtual reality. Side-by-side color monitors suspended above eye level show two views of a colorful computer-generated child-day-care center, as seen from the points of view of two participants seated below the screens, each wearing 3-D goggles and a Data Glove.

Transmitting image data over fiber-optic cable, users in separate cities could manipulate the same computer-mediated environment – a welcome development for an architect and client, say, with offices in different parts of the country. For Pacific Bell, shared virtual reality carries the allure of profits from the use of its extensive fiber-optic-cable networks.

To enter this world of virtual reality, you put on a VPL Data Glove and a VPL-manufactured head-mounted color display called an Eye Phone. You can see the day-care center’s interior, in three dimensions, complete with doors, windows, furnishings, and a mannequin representing your colleague. Look down to see your own effigy and a disembodied hand floating nearby. To move yourself around, flex your index and middle fingers and then simply “let your fingers do the walking.”

If you want to change the placement of the water fountain, reach out to it and make a fist; grasp and move it, then flatten your hand to release it. This kind of tactile interaction is one way artificial reality differs from computer simulations of the past. Experience the day-care center from a six-year-old’s eye level: to get short, point down with your little finger. To get big again, point up.

Presiding over the demonstration is Jaron Lanier, founder of VPL Research and originator of VR Day. He is introducing the first commercial shared virtual reality, RB2, which stands for Reality Built for Two. According to Lanier, “the essence of virtual reality is that it’s shared.”

Even with sophisticated graphics workstations powering RB2, its synthetic environments are hardly detailed enough to be confused with the real world. And Lanier’s ‘first new level of objectively shared reality available to humanity since the physical world’ has other drawbacks: by the end of the first day’s showings, one of the system demonstrators is experiencing motion sickness and has to take Dramamine before the next day’s session. Such ‘simulator sickness’ is common in flight training, where the body reacts to conflicting sensory cues from simulation and reality.

 

Answer the questions on the text:

1. What makes a welcome development for an architect and a client with offices in different parts of the country?

2. Who is the founder of VPL Research?

3. How is Virtual Reality Day celebrated?

4. What is an Eye Phone?

5. Why does one of the RB2 system demonstrators have to take Dramamine before the next day’s session?

6. What is the essence of virtual reality, according to Lanier?

7. What must one do to enter the world of virtual reality?

Text 3

Cyberspace on a Desktop

The current activity in perfecting artificial reality, much of it in California, is not unlike the development of the movies in Paris at the turn of the 20th century. Two apparently opposite tendencies emerged then: a realistic, documentary bent, characterized by the Lumiere brothers, who filmed scenes of everyday life; and an imaginative, fictional inclination, exemplified by Georges Melies and his fancifully staged Voyage to the Moon. More recently, filmmaker Jean-Luc Godard argues that the two types of movies were much closer in execution than historians would have us believe. If Jaron Lanier, with his hallucinatory environments, is the Melies of artificial reality, then the developers at Sausalito-based Autodesk, working within the bounds of realistic computer-aided design software, are virtual reality’s counterparts to the Lumiere brothers. Yet by drawing inspiration from the work of science fiction author William Gibson, they, too, are crossing the line between reality and fantasy.

On Virtual Reality Day in Anaheim, just a gnome’s throw from Disneyland, the members of the Autodesk Research Lab are demonstrating their version of artificial reality for invited guests in a hotel suite. One by one, visitors navigate within either of two colorful virtual worlds: a cityscape with a few buildings at an intersection surrounded by fields, a lake, and clouds; or a room interior that can be ‘grasped’ and moved around. While these are strictly solo experiences rather than shared realities, they, too, rely on the VPL Eye Phone and Data Glove as the connection between human and computer.

“Since Autodesk’s mainstay design program is the world’s most widely used CAD software, engineers and architects will soon be able to create a building design and move through it in any direction,” says William Bricken, lab director at Autodesk. “To reposition a design element like a window, you’ll be able to reach out and shift it.”

Achieving architectural walk-through has long been a goal of artificial-reality researchers. The most advanced work is being done by Fred Brooks’ team at the University of North Carolina, where a deluxe system shows high-resolution depictions of floor plans and 3-D, full-color interiors, complete with real-time adjustment of the position of the sun and the ratio of direct/ambient light. To navigate through such a virtual structure, you can use velocity-modulating joysticks (in what is called a helicopter metaphor), move a head-mounted display (the eyeball metaphor), or, in the closest thing to being there, walk on a treadmill you steer using handlebars (the shopping cart metaphor). Because of hardware limitations, Autodesk’s virtual reality program will be less detailed and somewhat slower in real time; but it nonetheless represents a major advance in PC-based software.

In the absence of a preexisting need or market, Autodesk’s attempt at making a product from artificial reality is a gamble. Who will spend $3,000 to $4,000 on a software package requiring a top-of-the-line PC and $20,000 in extras? Current AutoCAD users in architecture, design, and manufacturing may want to use artificial reality in demos, presentations, and sales pitches. Also expected is a new breed of virtuality hackers, using the Autodesk program to create their own alternative realities.

Autodesk’s product may be called Cyberspace, in homage to the term coined by William Gibson to describe the nightmarish universal computer data network of tomorrow.

At work on two film scripts and a novel set in a Victorian England in which Charles Babbage succeeds in building the first computer, Gibson confesses that he’s not curious about current developments in virtual reality. “When you’ve imagined it in an evolved state,” he says, “the art isn’t interesting. Besides opening things up for paraplegics, I don’t see what virtual reality will do for anyone. It won’t help save the rain forests; in that respect, I would prefer seeing cold fusion happen. Artificial reality will just be a gadget for rich countries, for military applications. At the low end, it will just end up being used for better Nintendo games.”

With an input device like the VPL Data Glove priced at $8,800, artificial reality should remain a rarefied pursuit for some time. As of this writing, only a thousand or so people have experienced artificial reality directly. Thousands more have experienced Myron Krueger’s Videoplace in museums and exhibits, but they had little idea what they were seeing.

At the same time, the ability to plunge into virtual worlds beyond the computer screen is so compelling that the market can hardly wait for real-world technology to catch up, and Gibson’s vision may already be turning to fact. Artificial-reality components may soon find their way into as many as 1.5 million homes, thanks to a Data-Glove-like plastic glove and a competing flat-panel sensor device, selling for $80 to $90 apiece as Nintendo computer-game-system peripherals. The orders are in from Toys “R” Us and other retailers: Mattel’s Power Glove and Broderbund’s U-Force are expected to be big hits. At first they will be rather expensive Nintendo joystick replacements, with the accompanying virtual-reality software expected sometime this year.

With 20 million home Nintendo systems, projections show an aftermarket for 7 million artificial-reality add-ons. Even with deep discounts, a half-billion-dollar U.S. artificial-reality industry seems inevitable within a few years. And that’s just for lowest-common-denominator video games. PC hardware and software entrepreneurs, take note.

 

Answer the questions on the text:

1. What tendencies emerged in the development of the movies in Paris?

2. Where is VR Day celebrated?

3. What is the world’s most widely used CAD software?

4. What devices are used in architectural walk-through?

5. How much does the VPL Data Glove cost?

6. Why isn’t Gibson interested in current developments in virtual reality?

7. Why does a U.S. artificial-reality industry seem inevitable within a few years?

 

Variant 2

Text 1

