
Search Results

80 images found


  • Virtual reality: Michael McGreevy, PhD, stands in front of a pair of video images of the Valles Marineris of the planet Mars, computer-generated from data provided by the Viking spacecraft at NASA's Ames Research Center, California. Sophisticated computers & sensors provide the user with a telepresence in the virtual world, through small video screens mounted in goggles on a headset, whilst a spherical joystick controls movement through the virtual landscape. One future Martian application of this system might be in gathering geological samples by remote control using a rover robot. A sensor in the geologist's headset could direct the robot at specific sample targets. Model Released (1990)
    USA_SCI_VR_35_xs.jpg
  • Applications of virtual reality systems in medical education. Here, Scott Delp and Scott Fisher are using a system developed at NASA's Ames Research Center in Menlo Park, California, to study the anatomy of the human leg. They both wear a headset equipped with 3-D video displays to view the computer-generated graphical images - one is shown between the two doctors. Physical exploration of the leg anatomy is afforded by using the data glove, a black rubber glove with woven optical fiber sensors, which relays data on their physical hand movements back to the computer. Model Released (1990)
    USA_SCI_VR_06_xs.jpg
  • FINAL CONTACT: "GRAVEWATCH". Photo Illustration for the Future of Communication GEO (Germany) Special issue. Fictional Representation and Caption: Interactive gravestones became quite popular in the 21st century. Adding snippets of video of the deceased was quite easy to program since nearly every family had extensively documented their family time with small digital videocams. AI (artificial intelligence) computer programs made conversations with the dead quite easy. These virtual visits to the underworld became passé within a decade, however, and graveyard visits became less common. By mid-century many people wanted to ensure that their relatives would continue paying their respects and keeping their memory alive. New technology ensured regular visits to the gravesite to pick up a monthly inheritance check issued electronically by a built-in device with wireless connection to the living relative's bank account. Face recognition (and retinal scanners on high-end models) ensured that family members were present during the half-hour visits. A pressure pad at the foot of the grave activated the system, and after 30 minutes of kneeling at the grave, watching videos or prerecorded messages or admonitions, a message flashed on the screen, indicating that a deposit had been made electronically to their bank account. For the Wright family of Napa, California, there is no way to collect Uncle Eno's inheritance other than by monthly kneelings. ["Gravewatch" tombstones shown with "Retscan" retinal scanning ID monitors.] MODEL RELEASED
    USA_SCI_COMM_07_xs.jpg
  • BEDTIME FOR BOZOS WITH THE "HONEYMOONER" Photo Illustration for the Future of Communication GEO (Germany) Special issue. Fictional Representation and Caption: Video phones and teledildonic interactive body gloves facilitated long-distance relationships among huge numbers of couples in an age where job mobility was crucial to financial well-being. But as divorce rates grew, the interpersonal skills for maintaining relationships atrophied, and couples found it easier to have a virtual partner with a physical presence in the bedroom. No more headaches, bad breath, receding hair or cellulite to worry about. With a "Honeymooner" robotic sex doll, programmable with a PC, all kinds of simulations are possible. Richard "Dick" Kravitz of Sonoma, California. MODEL RELEASED.
    USA_SCI_COMM_05_xs.jpg
  • David Chaum, managing director of DigiCash, Amsterdam (31)20-665-2611. The rush is on to buy and sell on the Internet. David Chaum's company has developed a system of digital cash. Buyers' identities are kept secret, and by encrypting their account numbers and transaction details, privacy and security are assured. He has developed an experimental currency trial on the Internet using "ecash", which uses "cyberbucks" as its virtual currency.
    USA_SCI_COMP_04_120_xs.jpg
  • FINAL CONTACT: "GRAVEWATCH". Photo Illustration for the Future of Communication GEO (Germany) Special issue. Fictional Representation and Caption: Interactive gravestones became quite popular in the 21st century. Adding snippets of video of the deceased was quite easy to program since nearly every family had extensively documented their family time with small digital videocams. AI (artificial intelligence) computer programs made conversations with the dead quite easy. These virtual visits to the underworld became passé within a decade, however, and graveyard visits became less common. By mid-century many people wanted to ensure that their relatives would continue paying their respects and keeping their memory alive. New technology ensured regular visits to the gravesite to pick up a monthly inheritance check issued electronically by a built-in device with wireless connection to the living relative's bank account. Face recognition (and retinal scanners on high-end models) ensured that family members were present during the half-hour visits. A pressure pad at the foot of the grave activated the system, and after 30 minutes of kneeling at the grave, watching videos or prerecorded messages or admonitions, a message flashed on the screen, indicating that a deposit had been made electronically to their bank account. For the Wright family of Napa, California, there is no way to collect Uncle Eno's inheritance other than by monthly kneelings. ["Gravewatch" tombstones shown with "Retscan" retinal scanning ID monitors.] MODEL RELEASED
    USA_SCI_COMM_06_xs.jpg
  • SUPER SUPPER WITH "I-GOGS" Photo Illustration for the Future of Communication GEO (Germany) Special issue. Fictional Representation and Caption: Statistics and cultural studies have long held that families who dine "ensemble" have much better relations than those who do not. The time-honored tradition of families eating together fell by the wayside by the end of the 20th century. In the time-starved 21st century, families re-instituted the practice, but with a twist. They ritualistically eat together but are nearly all multi-tasking at the same time. But they can and often do interact with new half-mirrored goggles, "I-GOGS", that allow virtually any computer, TV, school, or video game program to be played at any time. Mealtime became an opportunity to share data as well as food. The Elkins family of Yountville, California are all surfing various audio-visual entertainment nodes while partaking of their Friday evening fish logs, sports drinks and Jello. MODEL RELEASED.
    USA_SCI_COMM_04_xs.jpg
  • Matthew Jones, wearing 3-D glasses to view computer simulations from the Stanford Linear Collider (SLC) experiment, seen with a computer-simulated collision event between an electron and a positron. The SLC produces Z-zero particles by this collision process, which takes place at extremely high energies. The Z-zero is one of the mediators of the weak nuclear force, the force behind radioactive decay, and was discovered at CERN in 1983. The scientist is seen wearing special glasses that enable viewing of computer-generated stereoscopic images of the particle tracks following the collision inside the Large Detector. The first Z-zero seen at SLC was detected on 11 April 1989. MODEL RELEASED [1988]
    USA_SCI_PHY_08_xs.jpg
  • FIRST CONTACT: "FETALFONE" Photo Illustration for the Future of Communication GEO (Germany) Special Issue. Fictional Representation and Caption: The Smiths of Vallejo, California were not certain that the latest hi-tech form of giving their (unborn) child a headstart was effective, but it sure was fun to see Junior react to their voice on his "fetalfone". It was true that the youngster could only use it to listen (even if he could talk, it would be very difficult in the amniotic fluid), but they enjoyed the idea that their offspring would be comfortable with a cell phone from Day Minus-90 to Day One when he popped out. The flat screen imaging unit affords the parents (and in this case an older sister) the opportunity to track the unborn's development and also watch his reactions when they talk to him on the "Fetalfone". [Fetus with "Fetalfone" shown on "Babewatch", a fetus-scan home imaging system that can be monitored by an absent parent via the Internet.] MODEL RELEASED.
    USA_SCI_COMM_02_xs.jpg
  • The robotic hand developed at the Deutsches Zentrum für Luft- und Raumfahrt (German Aerospace Center), in the countryside outside Munich, Germany, demonstrates the power of a control technique called force-feedback. To pick up an object, Max Fischer (in control room), one of the hand's developers, uses the data-glove to transmit the motion of his hand to the robot. If he moves a finger, the robot moves the corresponding finger. From the book Robo sapiens: Evolution of a New Species, page 135.
    GER_rs_13_qxxs.jpg
  • Showscan developed moving theater seats and enhanced movie projection that work together to give audiences bigger thrills. Film is projected at 60 frames per second to enhance clarity, and seats on hydraulic lifts follow the movie action. Hollywood, California. Shot for the book project: A Day in the Life of Hollywood. USA.
    USA_HLWD_3_xs.jpg
  • Myron Kruger jumps in front of a VideoPlace screen. Kruger designed this system to allow people to interface directly with computers. The operator stands in front of this large, backlit screen. A video camera is used to form an image of the silhouette - the computer then interprets different poses or actions as different commands. The results are displayed on an equally large video screen, the image of the operator being manipulated in response to the commands. Kruger was the first to use the term 'artificial reality' for this concept. Model released. (1990)
    USA_SCI_VR_19_xs.jpg
  • Myron Kruger and his assistant, Katrin Hinrichsen, 'shooting' at each other with computer-generated sparks. Kruger is a pioneer of artificial reality, a method allowing people to interface directly with computers. In Kruger's method, called VideoPlace, the participants stand in front of a backlit screen. A video camera forms an image of their silhouette; the computer is programmed to respond to particular actions in a particular way. Here the computer sees the operators pointing, and interprets this as 'fire a spark in this direction'. The computer-generated image appears in the background here on a large video screen. Model Released (1990)
    USA_SCI_VR_03_xs.jpg
  • Matthew Jones, wearing 3-D glasses to view computer simulations from the Stanford Linear Collider (SLC) experiment, seen with a computer-simulated collision event between an electron and a positron. The SLC produces Z-zero particles by this collision process, which takes place at extremely high energies. The Z-zero is one of the mediators of the weak nuclear force, the force behind radioactive decay, and was discovered at CERN in 1983. The scientist is seen wearing special glasses that enable viewing of computer-generated stereoscopic images of the particle tracks following the collision inside the Large Detector. The first Z-zero seen at SLC was detected on 11 April 1989. MODEL RELEASED [1988]
    USA_SCI_PHY_07_xs.jpg
  • FIRST DAY OF SCHOOL: "GREENREAD" AND "WASTEWATCHER" Photo Illustration for the Future of Communication GEO (Germany) Special Issue. Fictional Representation and Caption: The first day of school for one-year-olds is less traumatic when the learning guide can monitor their progress with infant-friendly "Greenreads" (friendly retro laptops with green-red monitoring LEDs that display learning progress). Getting a jump-start on education is crucial to the future success of a citizen in this very wired world. Moving beyond abdominal skin speakers to fetal cell phones became so common by mid-century that many children were able to communicate very well by the time they started school at 12 months, even though they had not mastered verbal speech. Photographed at Headzup Learning Center in Napa, California. MODEL RELEASED.
    USA_SCI_COMM_03_xs.jpg
  • New Age meditation technology. Customers relaxing during a 'brain tune-up' session at the Universe of You clinic. Each customer is wearing a Synchro-Energiser. This projects patterns of colored lights into the eyes, and plays the sound of ocean waves into the ears. It is claimed that this helps the wearer to achieve a meditative state, from which they enjoy deep mental and physical relaxation. Further claims for long-term use of the system include increased creativity, improved memory and improvements in problem-solving and decision-making abilities. A session lasts for 45 minutes. The clinic is in Corte Madera, California. [1988].
    USA_SCI_NEWAGE_01_xs.jpg
  • Silicon Valley, California; Jay Eisenlohr, VP of marketing for Rendition Software of Mountain View, maker of 3-D graphic chips for games. Eisenlohr in his living room playing an on-line racing game while his wife and daughter watch TV (classic old US TV shows on Nickelodeon). Model Released. (1999).
    USA_SVAL_29_xs.jpg
  • At the MIT Media Lab in Cambridge, MA, David Koons is a graduate student working under Richard Bolt on a Ph.D. dissertation on multi-modal processing. In the photo, Koons is programming at the large-screen monitor. Gloves, jacket, and head-mounted eye-tracking gear are in the background.
    Usa_rs_104_xs.jpg
  • UC Berkeley graduate student Eric Paulos calibrates his Personal Roving Presence (PRoP), which he describes as "a simple, inexpensive, Internet-controlled, untethered tele-robot that strives to provide the sensation of tele-embodiment in a remote real space." Berkeley, CA. From the book Robo sapiens: Evolution of a New Species, page 168.
    USA_rs_445_qxxs.jpg
  • At the University of Utah in Salt Lake City, computer scientist John M. Hollerbach puts a lab staff member on the SARCOS Treadport, a device that mimics the tug and pull of acceleration. Walking on a treadmill, the staffer is surrounded by a projected simulation of a Western mountainside. On a real hill, hikers must struggle with their own inertia to surmount the slope, a sensation no ordinary treadmill can provide. The Treadport uses force-feedback to push or pull at the user, uncannily evoking the sensation of climbing, a new dimension of realism for this type of simulation. From the book Robo sapiens: Evolution of a New Species, page 137 top.
    USA_rs_432_120_qxxs.jpg
  • New Age meditation technology. A client at the Altered States Float Center and Mind Gym, West Hollywood, California. MODEL RELEASED [1988]
    USA_SCI_NEWAGE_03_xs.jpg
  • Virtual reality: Rich Holloway wears a prototype headset that employs half-silvered mirrors to enable the user to view a projected image of a virtual environment (and thus exist in virtual reality) and also see in front of his nose. A virtual environment is one created by a computer. A person entering such an environment does so with the aid of such a headset, which displays virtual imagery. Tactile interaction with the environment may be made using a data glove, a Spandex garment wired with sensors, which relays movement of the hand & fingers to the virtual environment. Model Released (1990)
    USA_SCI_VR_13_xs.jpg
  • Virtual reality. Jamaea Commodore, wearing a virtual reality headset and data glove, appears immersed in a computer-generated world. Virtual reality headsets contain two screens in front of the eyes, both displaying a computer-generated environment such as a room or landscape. The screens show subtly different perspectives to create a 3-D effect. The headset responds to movements of the head, changing the view so that the user can look around. Sensors on the data glove track the hand, allowing the user to manipulate objects in the artificial world with a virtual hand that appears in front of them. Model Released (1990)
    USA_SCI_VR_28_xs.jpg
  • Virtual reality. Cyberspace racquetball game: real strokes made by Christopher Allis, the player, are returned by the Cyberspace computer through the virtual, computer-generated environment displayed on the monitor. Admission to this virtual squash court is provided by 3-D video goggles, a magnetic sensor & optical fiber sensors woven into a black rubber glove. The headset sensor transmits data to the computer on the player's position in space, whilst the data glove connects real hand movements to the virtual racquet court. Photo taken at AutoDesk Inc., Sausalito, California. Model Released (1990)
    USA_SCI_VR_27_xs.jpg
  • Silicon Valley, California; Linda Jacobson, Virtual Reality Evangelist at Silicon Graphics, Incorporated, Mountain View, California. Jacobson stands poised over the operations area of one of Silicon Graphics' RealityCenters. The high-tech console operates the large wrap-around screen behind her. Jacobson's dream is to be the host of a virtual reality talk show. In the meantime, this former Wired Magazine reporter is content to tout the virtues of Immersive Visualization, the newly coined industry name, she says, for virtual reality. The tangible element of her job at SGI is to manage and market SGI's RealityCenters, facilities designed to do quick representations in a fully interactive graphical interface. These can include virtual factory tours, automobile mock-ups, and mock-up product changes depending on the desires of the purchasing company. Model Released (1999).
    USA_SVAL_127_120_xs.jpg
  • Virtual sex. Pornographic application of virtual reality, showing a man mauling his virtual conquest provided by his headset, data glove & an unseen computer system. Virtual, in computer parlance, describes equipment or programs that assume one form yet give the illusion of another. Here, the image of the woman is provided by the system through goggles in the headset; contact is effectively faked by fiber-optic sensors in the black rubber data glove, which relay information on the aspect and movement of the man's fingers. Photographed at Autodesk Inc., USA. MODEL RELEASED. (1990)
    USA_SCI_VR_08_xs.jpg
  • Virtual reality: data suit design. John Bumgarner at VPL Research Inc., Redwood City, California, discussing technical points relating to the design of the blue data suit being worn by Lou Ellen Jones on left. VPL produces virtual reality systems - computer generated graphical environments that a user may enter & interact with. Visual contact is provided by a headset equipped with 3-D goggles. A spatial sensor on the headset (to fix the user's position in space) and numerous optical fiber sensors woven into the data suit relay data back to the computer. The forerunner to the data suit is the data glove, which restricted the user's virtual interaction to hand gestures. Model Released (1990)
    USA_SCI_VR_33_xs.jpg
  • Virtual reality: Jaron Lanier, head of VPL Research of Redwood City, California, photographed surrounded by demonstration images of the virtual, non-real worlds that VPL have created. Fiber-optic sensors in the black rubber glove Lanier is wearing transmit a user's movements into the computer-generated virtual environment. A user's view of such a world is projected by the computer into 2 eye phones mounted on a headset. Model Released (1990)
    USA_SCI_VR_25_xs.jpg
  • Virtual reality: Lewis Hitchner manipulates a pair of video images of the Valles Marineris of the planet Mars, computer-generated from data provided by the Viking spacecraft at NASA's Ames Research Center, California. Sophisticated computers & sensors provide the user with a telepresence in the virtual world, through small video screens mounted in goggles on a headset, whilst a spherical joystick controls movement through the virtual landscape. One future Martian application of this system might be in gathering geological samples by remote control using a rover robot. A sensor in the geologist's headset could direct the robot at specific sample targets. Model Released (1990)
    USA_SCI_VR_17_xs.jpg
  • Virtual reality. Harry Marples, Computer Scientist, programming a system that will allow visitors a 3-D guided tour of a new building before it is even built. Plans for a proposed design are fed into a computer, which is capable of displaying them in sophisticated 3-D graphics. Thus the real building is presented by the computer as a virtual one. Visitors wearing special headsets fitted with video goggles and spatial sensors can move from room to room within the virtual space as if they were in the real world. Optical fibers woven into rubber data gloves provide a tactile dimension. Photo taken at the Computer Science Dept., University of North Carolina. Model Released (1990)
    USA_SCI_VR_07_xs.jpg
  • Virtual reality. Harry Marples, Computer Scientist, programming a system that will allow visitors a 3-D guided tour of a new building before it is even built. Plans for a proposed design are fed into a computer, which is capable of displaying them in sophisticated 3-D graphics. Thus the real building is presented by the computer as a virtual one. Visitors wearing special headsets fitted with video goggles and spatial sensors can move from room to room within the virtual space as if they were in the real world. Optical fibers woven into rubber data gloves provide a tactile dimension. Photo taken at the Computer Science Dept., University of North Carolina. Model Released (1990)
    USA_SCI_VR_05_xs.jpg
  • Virtual Reality: Henry Fuchs, University of North Carolina. Henry Fuchs is a pioneer in the development of virtual reality. He has worked with 3D biomedical imaging and graphics since 1969 and with head-mounted displays since 1970. He has been on the faculty of the Department of Computer Science at the University of North Carolina at Chapel Hill since 1978. At present, he is predominantly involved in the field of virtual reality in medicine through his work in ultrasound-guided, head-mounted displays, and in telecollaboration as part of the National Tele-immersion Initiative. (1990)
    USA_SCI_VR_46_xs.jpg
  • Virtual reality: fitting adjustments being made to a data suit (blue, center) by Lou Ellen Jones, Asif Emon and Bea Holster at VPL Research, Redwood City, California. VPL specializes in virtual or artificial reality systems, the production of computer-generated graphical environments that users may enter. Visual contact with such artificial worlds is provided by a headset equipped with 3-D goggles. A spatial sensor on the headset (to fix the user's position in space) and numerous optical fiber sensors woven into the data suit relay data back to the computer. The forerunner to the data suit is the data glove, which restricted the user's virtual interaction to hand gestures. Model Released (1990)
    USA_SCI_VR_34_xs.jpg
  • Virtual reality: Jaron Lanier, head of VPL Research of Redwood City, California, photographed surrounded by demonstration images of the virtual, non-real worlds that VPL have created. Fiber-optic sensors in the black rubber glove Lanier is wearing transmit a user's movements into the computer-generated virtual environment. A user's view of such a world is projected by the computer into 2 eye phones mounted on a headset (seen unworn at left, on top of the computer monitor). Model Released (1990)
    USA_SCI_VR_24_xs.jpg
  • Virtual reality: Jaron Lanier, head of VPL Research of Redwood City, California, photographed surrounded by demonstration images of the virtual, non-real worlds that VPL have created. Fiber-optic sensors in the black rubber glove Lanier is wearing transmit a user's movements into the computer-generated virtual environment. A user's view of such a world is projected by the computer into 2 eye phones mounted on a headset (seen unworn at left, on top of the computer monitor). Model Released (1990)
    USA_SCI_VR_21_xs.jpg
  • At the MIT Media Lab in Cambridge, MA, Joshua Bers models virtual reality gloves and tracking devices while calibrating them. Bers is working on his master's thesis under Richard Bolt. He is seen wearing the equipment detailed above for calibration purposes. Once programmed and calibrated, he can move virtual objects around in a virtual room. Bolt is working on multi-modal interaction using speech, gesture, and gaze. He is attempting to program computers to interact with their users by methods other than the standard keyboard and mouse.
    Usa_rs_105_xs.jpg
  • Nautical application of virtual reality used for undersea viewing. NOAA personnel demonstrating a concept developed by the University of Washington's Human Interface Technology Laboratory: to be able to see underwater objects, fish or terrain by combining sonar with a computer graphics system that would be viewed by the operator wearing laser micro-scanner glasses. Here, a NOAA operator looks out over the stern of a small boat whilst wearing the pink, plastic-rimmed laser glasses & data glove that connect him to the virtual undersea world created by the computer. (1990)
    USA_SCI_VR_45_xs.jpg
  • Virtual reality: Jaron Lanier, head of VPL Research of Redwood City, California. Fiber-optic sensors in the black rubber glove Lanier is wearing transmit a user's movements into the computer-generated virtual environment. A user's view of such a world is projected by the computer into 2 eye phones mounted on a headset. Model Released (1990)
    USA_SCI_VR_23_xs.jpg
  • Virtual reality videogame: Jack Menzel wears a Nintendo Power Glove to interact with the fictional (or virtual) Super Mario Brothers (Nintendo characters) in the living room of his home in Napa, California. Model Released (1990)
    USA_SCI_VR_42_xs.jpg
  • Virtual reality in undersea exploration: bench testing of an undersea tele-robotic robot arm, being developed for the U.S. Navy by the Center for Engineering Design at the University of Utah, Salt Lake City. The functions of this robot are the performance of complex underwater tasks by remote manipulation from the surface. Underwater video cameras & other imaging systems relay information to a computer that produces a 3-D virtual image of the seabed. The operator is linked to this world through a headset equipped with 3-D goggles, & spatial sensor, and data gloves or other clothing that relay precision movements back through the computer to tools on the robot's limbs. (1990)
    USA_SCI_VR_40_xs.jpg
  • Virtual reality in undersea exploration: bench testing of an undersea tele-robotic robot arm, being developed for the U.S. Navy by the Center for Engineering Design at the University of Utah, Salt Lake City. The functions of this robot are the performance of complex underwater tasks by remote manipulation from the surface. Underwater video cameras & other imaging systems relay information to a computer that produces a 3-D virtual image of the seabed. The operator is linked to this world through a headset equipped with 3-D goggles, & spatial sensor, and data gloves or other clothing that relay precision movements back through the computer to tools on the robot's limbs. (1990)
    USA_SCI_VR_39_xs.jpg
  • Virtual Reality: Rick Walsh, director of the Resource Center for the Handicapped in Seattle, has an office that he runs with voice-command-activated computers. He is working with the Human Interface Technology Lab on innovative uses of Virtual Reality for the handicapped. Model Released
    USA_SCI_VR_31_xs.jpg
  • Virtual or artificial reality. Alvar Green, CEO of Autodesk in 1990, playing Cyberspace, a sophisticated videogame designed by AutoDesk Inc., USA. The computer monitor displays an image of one of Cyberspace's virtual (non-real) environments - a room - into which the player enters by wearing a headset & data glove. Two video images of the environment are projected into the eyes, whilst physical interaction is achieved through spatial sensors in the headset & optical fibers woven into the black rubber data glove, which send data to the computer on the player's position & movements in space. Model Released (1990)
    USA_SCI_VR_26_xs.jpg
  • Virtual reality: Warren Robinett wears a prototype (1st generation) headset. Virtual environments are generated by computer systems to allow users to interact with them in much the same way as they might with a real environment. The computer environments are displayed to their users using sophisticated graphics projected through small video monitors mounted on the headset. In addition, some headsets have a sensor which informs the computer of the wearer's spatial aspect, that is, in 3-D. This particular model features displays with half-silvered mirrors that allow the user to see the computer image & look ahead. Model Released (1990)
    USA_SCI_VR_14_xs.jpg
  • Virtual reality in air traffic control (ATC) systems. Bill Wiseman from the University of Washington Human Interface Technology Laboratory, Seattle, demonstrating how ATC might operate in the future. Optical fiber sensors in his black data glove & the pink-rimmed micro-laser scanner glasses connect the operator with a virtual, computer-generated, 3-D image of the airspace he is controlling. Through raising his gloved hand to touch an icon (projected image) of an approaching jet, he is placed in instant voice communication with the pilot. This photograph was taken with the cooperation of SEA/TAC international airport, Seattle. MODEL RELEASED. (1990)
    USA_SCI_VR_11_xs.jpg
  • Virtual reality in air traffic control (ATC) systems. Bill Wiseman from the University of Washington Human Interface Technology Laboratory, Seattle, demonstrating how ATC might operate in the future. Optical fiber sensors in his black data glove & the pink-rimmed micro-laser scanner glasses connect the operator with a virtual, computer-generated, 3-D image of the airspace he is controlling. Through raising his gloved hand to touch an icon (projected image) of an approaching jet, he is placed in instant voice communication with the pilot. This photograph was taken with the cooperation of SEA/TAC international airport, Seattle. MODEL RELEASED. (1990)
    USA_SCI_VR_09_xs.jpg
  • Virtual reality videogame: Evan Menzel wears a Nintendo Power Glove to interact with the fictional/virtual Super Mario Brothers (Nintendo characters) in the living room of his home in Napa, California. Model Released (1990)
    USA_SCI_VR_44_xs.jpg
  • Virtual reality videogame: Jack Menzel wears a Nintendo Power Glove to interact with the fictional (or virtual) Super Mario Brothers (Nintendo characters) in the living room of his home in Napa, California. Model Released (1990)
    USA_SCI_VR_43_xs.jpg
  • Virtual reality videogame: Evan & Jack Menzel appear to do battle over who is to wear the Nintendo Power Glove to interact with the fictional (or virtual) Super Mario Brothers (Nintendo characters) in the living room of their home in Napa, California. Model Released (1990)
    USA_SCI_VR_41_xs.jpg
  • Virtual reality: Jim Chong wears a prototype (1st generation) headset. Virtual environments are generated by computer systems to allow users to interact with them in much the same way as they might with a real environment. The computer environments are displayed to their users using sophisticated graphics projected through small video monitors mounted on the headset. In addition, some headsets have a sensor which informs the computer of the wearer's spatial aspect, that is, in 3-D. This particular model features displays with half-silvered mirrors that allow the user to see the computer image & look ahead. Model Released (1990)
    USA_SCI_VR_30_xs.jpg
  • Virtual reality: Jaron Lanier, head of VPL Research of Redwood City, California. Fiber-optic sensors in the black rubber glove Lanier is wearing transmit a user's movements into the computer-generated virtual environment. A user's view of such a world is projected by the computer into 2 eye phones mounted on a headset. Model Released (1990)
    USA_SCI_VR_22_xs.jpg
  • Virtual reality. Appearing to be supported by a high-tech Zimmer frame, computer scientist John Airey uses a steerable treadmill to progress on a walk-through tour of a virtual image of a church hall. As he paces on the real treadmill, so he moves towards the altar of the 3-D computer-generated image of the church. Such software packages would be invaluable to architects in judging how their designs may be received by the people who will use them, perhaps well in advance of any real foundations being laid. This photo was taken in the Computer Science Department at the University of North Carolina. Model Released (1990)
    USA_SCI_VR_20_xs.jpg
  • Virtual reality in air traffic control (ATC) systems. Bill Wiseman from the University of Washington Human Interface Technology Laboratory, Seattle, demonstrating how ATC might operate in the future. Optical fiber sensors in his black data glove & the pink-rimmed micro-laser scanner glasses connect the operator with a virtual, computer-generated, 3-D image of the airspace he is controlling. Through raising his gloved hand to touch an icon (projected image) of an approaching jet, he is placed in instant voice communication with the pilot. This photograph was taken with the cooperation of SEA/TAC international airport, Seattle. MODEL RELEASED. (1990)
    USA_SCI_VR_12_xs.jpg
  • Virtual reality in air traffic control (ATC) systems. Bill Wiseman from the University of Washington Human Interface Technology Laboratory, Seattle, demonstrating how ATC might operate in the future. Optical fiber sensors in his black data glove & the pink-rimmed micro-laser scanner glasses connect the operator with a virtual, computer-generated, 3-D image of the airspace he is controlling. Through raising his gloved hand to touch an icon (projected image) of an approaching jet, he is placed in instant voice communication with the pilot. This photograph was taken with the cooperation of SEA/TAC international airport, Seattle. MODEL RELEASED. (1990)
    USA_SCI_VR_10_xs.jpg
  • Seeming to touch the objects on his screen, Peter Berkelman, then a graduate student at the Carnegie Mellon Robotics Institute in Pittsburgh, PA, scoops up virtual blocks with a special device that communicates the sensation of touching them. The device, which has a handle suspended in powerful magnetic fields, can move with all six possible degrees of freedom: up and down, side to side, back and forth, yaw, pitch, and roll. Used with special "haptic" software, the device has force-feedback. From the book Robo sapiens: Evolution of a New Species, page 136.
    USA_rs_27A_120_qxxs.jpg
  • Cyberthon: Virtual Reality Conference, San Francisco, California. (1990)
    USA_SCI_VR_47_xs.jpg
  • Cyberthon: Wavy Gravy (Hugh Romney) at a Virtual Reality Conference, San Francisco, California. He was at Woodstock as a member of an entertainment/activist commune known as the Hog Farm. (1990)
    USA_SCI_VR_38_xs.jpg
  • Cyberthon: Author Howard Rheingold at a Virtual Reality Conference, San Francisco, California Model Released (1990)
    USA_SCI_VR_37_xs.jpg
  • Virtual reality: Margaret Minsky works with a force-feedback joystick being developed in the MIT Media Laboratory. The joystick is designed to give its user a physical impression of features in a computer-generated environment. In this demonstration, the user is invited to feel shapes & textures whilst running a cursor over the various images displayed on the screen, and be able to differentiate between them. Model Released (1990)
    USA_SCI_VR_36_xs.jpg
  • Application of virtual reality computer systems in the experimental design of novel drugs (molecular modeling). Russ Taylor docking drug and protein with a force-feedback robot arm. This system allows chemists not only to see whether two molecules might fit together but also to feel how well they do. The force-feedback robot grip lets the chemist rotate a simulated drug and fit it into a protein molecule. The computer calculates electrostatic forces & other parameters concerned with the probability of a reaction occurring & feeds this information back to the robot grip, so the designer may feel how smoothly or otherwise the reaction is proceeding. Photo taken at the University of North Carolina. Model Released (1990)
    USA_SCI_VR_29_xs.jpg
  • Virtual reality & the home computer. Home-based computer scientist John Schultz plays a 3-D video game, with 3-D stereo sound, featuring dog-fighting space-planes, which he wrote for his home computer. Entitled The Event Horizon Simulator, the game runs on an Atari 2000 computer, using conventional stereo headphones and a basic LCD headset. Model Released (1990)
    USA_SCI_VR_18_xs.jpg
  • Cyberspace hi-cycle: Carolyn Hedrich pedals an exercise bike through a virtual, computer generated landscape, projected into her eyes through two video screens in her headset. Riders are encouraged to pedal as fast as they are capable, because, on reaching a certain pedal speed, the computer creates the impression of take-off and flight. Model Released (1990)
    USA_SCI_VR_16_xs.jpg
  • Cyberspace hi-cycle: Carolyn Hedrich pedals an exercise bike through a virtual, computer generated landscape, projected into her eyes through two video screens in her headset. Riders are encouraged to pedal as fast as they are capable, because, on reaching a certain pedal speed, the computer creates the impression of take-off and flight. Model Released (1990)
    USA_SCI_VR_15_xs.jpg
  • Application of virtual (artificial) reality computer systems in medical diagnostic imaging, showing a magnetic resonance image (MRI) of the head next to a scientist wearing a headset. Computer scientists here at the University of North Carolina aim to distill various types of diagnostic images (X-rays, CT, MRI) into a vivid digital model that is displayed through the head-mounted displays. Advantages of this type of presentation include not being bound by screen conventions, such as a lack of step-back features or wider-area views, & the need to control a keyboard or mouse. Future uses may exist in the accurate targeting of radiotherapy. Stereotactic radiotherapy technique. Model Released (1990)
    USA_SCI_VR_04_xs.jpg
  • Virtual reality: Ralph Hollis, IBM, NY. "Feeling" gold atoms by working with a scanning tunneling microscope (STM) (at right) linked to a tele-robotic manipulation system with atomic-scale force-feedback. The minute movements of the STM's probe as it traverses the gold sample surface are linked to a force-feedback magic wrist, enabling the scientist, whose hand is in contact with the magic wrist, to feel the texture of the gold atoms. In the background is a false-color STM image of the gold surface, revealing the cobbled pattern of individual atoms. The photo was taken at IBM's Thomas J. Watson Research Center, Yorktown Heights, New York. Model Released (1990)
    USA_SCI_VR_02_xs.jpg
  • Virtual reality: Ralph Hollis, IBM, NY. "Feeling" gold atoms by working with a scanning tunneling microscope (STM) (at right) linked to a tele-robotic manipulation system with atomic-scale force-feedback. The minute movements of the STM's probe as it traverses the gold sample surface are linked to a force-feedback magic wrist, enabling the scientist, whose hand is in contact with the magic wrist, to feel the texture of the gold atoms. In the background is a false-color STM image of the gold surface, revealing the cobbled pattern of individual atoms. The photo was taken at IBM's Thomas J. Watson Research Center, Yorktown Heights, New York. (1990)
    USA_SCI_VR_01_xs.jpg
  • Virtual Reality: Dick Schlicting, Kenworth Trucking Company. Dick Schlicting drives a Kenworth tractor trailer. In 1990 the Human Interface Technology Lab was working on the idea of truck drivers using the same type of heads-up display that fighter pilots use: indicators and gauges hover semi-transparent in front of their helmets/glasses. Model Released (1990)
    USA_SCI_VR_32_xs.jpg
  • To study the flight control behavior of fruit flies, Dickinson and his researchers have come up with something even more bizarre than RoboFly. They have built a virtual reality flight simulator for fruit flies in an upstairs lab. A tiny fly is glued to a probe positioned in an electronic arena of hundreds of flashing LEDs that can also measure its wing motion and flight forces. By altering its wing motion, the fly itself can change the display of the moving electronic panorama, tricking the fly into "thinking" it is really flying through the air. The amplified humming of the fruit fly as it buzzes through its imaginary flight surrounded by computers in the darkened lab is quite bizarre.
    Usa_rs_616_xs.jpg
  • Pattie Maes (and grad student Cecil). Maes is photographed with "ALIVE," a real-time virtual reality system.  She captioned the photo:  "A novel system developed at the MIT Media Lab makes it possible for a person to interact with artificial creatures such as this dog using natural gestures."
    Usa_rs_101_xs.jpg
  • Long-EZ flying above the Mojave desert in California. The aircraft is of an unusual design, having forward-mounted "canard" wings instead of a tail plane and a rear-mounted "pusher" propeller. The canard makes the plane virtually stall proof. It has a slightly steeper tilt than the regular wing; thus the canard begins to stall before the main wing, and as it does so, it drops the nose and gains speed. The Long-EZ has a range of up to 7700 kilometers, a ceiling of 27,000 feet (8230 meters) and a top speed of 309 kilometers per hour. The aircraft is available in a kit form, manufactured by the Rutan Aircraft Factory, which can be assembled in as few as 1000 hours.
    USA_SCI_AVIA_14_xs.jpg
  • Long-EZ landing at the Mojave airport in California. The aircraft is of an unusual design, having forward-mounted "canard" wings instead of a tail plane and a rear-mounted "pusher" propeller. The canard makes the plane virtually stall proof. It has a slightly steeper tilt than the regular wing; thus the canard begins to stall before the main wing, and as it does so, it drops the nose and gains speed. The Long-EZ has a range of up to 7700 kilometers, a ceiling of 27,000 feet (8230 meters) and a top speed of 309 kilometers per hour. The aircraft is available in a kit form, manufactured by the Rutan Aircraft Factory, which can be assembled in as few as 1000 hours.
    USA_SCI_AVIA_15_xs.jpg
  • Long-EZ flying above the Mojave desert in California. The aircraft is of an unusual design, having forward-mounted "canard" wings instead of a tail plane and a rear-mounted "pusher" propeller. The canard makes the plane virtually stall proof. It has a slightly steeper tilt than the regular wing; thus the canard begins to stall before the main wing, and as it does so, it drops the nose and gains speed. The Long-EZ has a range of up to 7700 kilometers, a ceiling of 27,000 feet (8230 meters) and a top speed of 309 kilometers per hour. The aircraft is available in a kit form, manufactured by the Rutan Aircraft Factory, which can be assembled in as few as 1000 hours.
    USA_SCI_AVIA_11_xs.jpg
  • Long-EZ flying above the Mojave desert in California. The aircraft is of an unusual design, having forward-mounted "canard" wings instead of a tail plane and a rear-mounted "pusher" propeller. The canard makes the plane virtually stall proof. It has a slightly steeper tilt than the regular wing; thus the canard begins to stall before the main wing, and as it does so, it drops the nose and gains speed. The Long-EZ has a range of up to 7700 kilometers, a ceiling of 27,000 feet (8230 meters) and a top speed of 309 kilometers per hour. The aircraft is available in a kit form, manufactured by the Rutan Aircraft Factory, which can be assembled in as few as 1000 hours.
    USA_SCI_AVIA_09_xs.jpg
  • Long-EZ flying above the Mojave desert in California. The aircraft is of an unusual design, having forward-mounted "canard" wings instead of a tail plane and a rear-mounted "pusher" propeller. The canard makes the plane virtually stall proof. It has a slightly steeper tilt than the regular wing; thus the canard begins to stall before the main wing, and as it does so, it drops the nose and gains speed. The Long-EZ has a range of up to 7700 kilometers, a ceiling of 27,000 feet (8230 meters) and a top speed of 309 kilometers per hour. The aircraft is available in a kit form, manufactured by the Rutan Aircraft Factory, which can be assembled in as few as 1000 hours.
    USA_SCI_AVIA_08_xs.jpg
  • Wafaa Al Haggan, assisted by one of the many foreign guest workers who do virtually all the manual labor in Kuwait, shops at her local co-op supermarket in Kuwait City. Although Kuwait imports 98 percent of its food, much of it from thousands of miles away, the choice and quality of the goods on display in supermarkets in Kuwait easily match those in European or U.S. markets, and the prices are lower. (Supporting image from the project Hungry Planet: What the World Eats.)
    KUW03_5476_xf1b.jpg
  • Many Okinawans used to work into their nineties, farming, and weaving bashofu, a fine fabric made from a local banana fiber. Bashofu weaving was a home-based craft, and highly valued, but there are few, if any, weavers producing the fabric at home anymore. The workshop of Toshiko Taira, 87, at left, with a young apprentice, in the northern Okinawa village of Kijoka, is virtually all that is left of the art. She has been designated a national treasure of Japan. She and her daughter are attempting to keep the fine practice alive. (Supporting image from the project Hungry Planet: What the World Eats)
    JOK03_0038_xf1b.jpg
  • Long-EZ flying above the Mojave desert in California. The aircraft is of an unusual design, having forward-mounted "canard" wings instead of a tail plane and a rear-mounted "pusher" propeller. The canard makes the plane virtually stall proof. It has a slightly steeper tilt than the regular wing; thus the canard begins to stall before the main wing, and as it does so, it drops the nose and gains speed. The Long-EZ has a range of up to 7700 kilometers, a ceiling of 27,000 feet (8230 meters) and a top speed of 309 kilometers per hour. The aircraft is available in a kit form, manufactured by the Rutan Aircraft Factory, which can be assembled in as few as 1000 hours.
    USA_SCI_AVIA_13_xs.jpg
  • Long-EZ flying above the Mojave desert in California. The aircraft is of an unusual design, having forward-mounted "canard" wings instead of a tail plane and a rear-mounted "pusher" propeller. The canard makes the plane virtually stall proof. It has a slightly steeper tilt than the regular wing; thus the canard begins to stall before the main wing, and as it does so, it drops the nose and gains speed. The Long-EZ has a range of up to 7700 kilometers, a ceiling of 27,000 feet (8230 meters) and a top speed of 309 kilometers per hour. The aircraft is available in a kit form, manufactured by the Rutan Aircraft Factory, which can be assembled in as few as 1000 hours.
    USA_SCI_AVIA_12_xs.jpg
  • Long-EZ flying above the Mojave desert in California. The aircraft is of an unusual design, having forward-mounted "canard" wings instead of a tail plane and a rear-mounted "pusher" propeller. The canard makes the plane virtually stall proof. It has a slightly steeper tilt than the regular wing; thus the canard begins to stall before the main wing, and as it does so, it drops the nose and gains speed. The Long-EZ has a range of up to 7700 kilometers, a ceiling of 27,000 feet (8230 meters) and a top speed of 309 kilometers per hour. The aircraft is available in a kit form, manufactured by the Rutan Aircraft Factory, which can be assembled in as few as 1000 hours.
    USA_SCI_AVIA_10_xs.jpg
  • Toshiko Taira, 87, of Kijoka, Okinawa, Japan. Many Okinawans used to work into their nineties, farming, and weaving bashofu, a fine fabric made from a local banana fiber. Bashofu weaving was a home-based craft, and highly valued, but there are few, if any, weavers producing the fabric at home anymore. The workshop of Toshiko Taira, 87, and her daughter, in the northern Okinawa village of Kijoka, is virtually all that is left of the art. She has been named a national treasure of Japan. She and her daughter are attempting to keep the fine practice alive. Although older generations of Okinawans are still living into their one-hundredth year, some say that the decline of weaving in the home was the beginning of the decline of the lengthy life spans of Okinawans.
    JOK03_0194_xf1b.jpg

Peter Menzel Photography
