We've all seen video images of astronauts in space: A smiling astronaut reaches for a screwdriver floating by her head; a newly constructed satellite slowly emerges from the cargo bay against the backdrop of Earth. What we rarely see is how these images are captured and relayed to the ground. NASA has integrated video into manned space flights since the days of Apollo, yet NASA scientists and engineers seldom discuss the details of its use. When they do, their answers provide a fascinating view of how video can be used to communicate in extreme situations and at extreme distances - and, in one particularly exceptional effort, to help build the International Space Station, the largest collaborative orbital construction project in history.
But if you imagine that NASA's current budgets rival those of the Kennedy/Johnson space era, think again. With the Cold War long over and a multinational space station under way, the agency now operates more like a private enterprise. The majority of its employees are contractors or subcontractors from private corporations with a common goal: to keep an eye on the bottom line for the agency's shareholders - in this case, American taxpayers. And despite the sheer combined brainpower of NASA's specialist employees at the Jet Propulsion Lab and the Johnson, Marshall and Kennedy space centers, the agency is not always the first to employ cutting-edge technology. Safety concerns and the rigorous, lengthy testing schedules that come with them limit NASA's adoption of new tools and technology. You might be surprised to learn, for example, that although the public has seen live video from space since the 1960s, the ability to send live video back to the spacecraft didn't exist until 1995. But more on that later.
Wish You Were Here
What sets NASA apart from other corporate production companies, of course, are the vast and varied ways in which it employs digital video. In addition to being used in the space-shuttle cabin and on the space station to broadcast public affairs and educational events and to transmit data from onboard experiments, video will play an essential role in the construction of the International Space Station. Right now, the space station consists of only two sectional spacecraft, or "modules": the United States "Unity" module and the Russian "Zarya" module, which were joined as one in December 1998.
Highly customized video cameras that can operate within the vacuum of space will be mounted outside the space station to document the construction of additional U.S. and multinational modules. They also will provide an ongoing view of the process, which NASA estimates will be completed sometime in 2004. "There will actually be 14 different places on the station where you can mount a camera - not to mention the cameras that are already going to be on the robotic arms," says Ed Wilson, a photo-TV trainer with the mission operations group at the Johnson Space Center in Houston. "Right now, the baseline is for four cameras, so we can have any four cameras in any four of these 14 locations."
Installing the cameras over the next few years will require an astronaut to go to the space station's exterior, unplug the camera, move it and set it up in a new position. "We'll configure them to remain in their positions for as long a time as we can, just to be more efficient," says Wilson. "We'll use them in those positions based on what will be assembled over the next two to six months, as the different flights come up."
Video cameras in the cabin will assist the crew during the construction. "For some sections of the station's assembly, the arm operators will not have a direct view of what they're putting together," says Wilson. "They'll rely heavily on images from video cameras to help them get the modules into the correct orientation." The space-station crew will also point handheld camcorders out the windows when they need a tighter view of a particular construction area.
All of the cameras used regularly on NASA space flights shoot in NTSC and range from "small Sony mini-cams to Canon L Series [lens] camcorders," says Wilson. Because the United States will finish its switch to digital television by 2006, NASA plans to phase out NTSC in order to maintain compatibility. "Whether it will be standard-definition or high-definition, it's still going to be a digital format," Wilson says.
A more pressing compatibility issue involves the multiple-format video systems that exist or will be built by various nations on the space station. The U.S. portion of the space station uses NTSC, and the Russian portion uses a combination of PAL and SECAM. In order to interface with the U.S. system, video systems from other countries will have to be at least partly NTSC-based, Wilson says. Though NASA has no immediate plans to connect the incompatible Russian and American video systems through a video standards converter, there are ongoing discussions about it.
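To see why the systems can't simply be wired together, it helps to compare the scan parameters of the three analog standards involved. The sketch below uses the standard published line counts and frame rates (these figures are general TV-engineering facts, not from the article, and the function names are illustrative):

```python
# Scan parameters of the three analog TV standards in play on the station.
# (Standard published figures; not taken from NASA documentation.)
STANDARDS = {
    "NTSC":  {"lines": 525, "fps": 29.97},  # U.S. segment
    "PAL":   {"lines": 625, "fps": 25.0},   # Russian acquisition (camcorders)
    "SECAM": {"lines": 625, "fps": 25.0},   # Russian downlink/broadcast
}

def needs_converter(a, b):
    """A standards converter is required when scan parameters differ.

    Note: PAL and SECAM share scan parameters but differ in color
    encoding, so this check only captures the harder scan-conversion
    problem (resampling lines and frame rate), not color transcoding.
    """
    return STANDARDS[a] != STANDARDS[b]

print(needs_converter("NTSC", "SECAM"))  # NTSC <-> SECAM needs full conversion
print(needs_converter("PAL", "SECAM"))   # same scan; only color encoding differs
```

The line-count and frame-rate mismatch is why bridging the U.S. and Russian systems requires a true standards converter rather than a simple cable patch.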
Several NASA working groups are also addressing how to bring the multiple systems together once the United States moves from NTSC. "It's just a matter of what is the best and most cost-effective way to get there," Wilson says. "The Russians acquire in PAL, using PAL camcorders. They actually transmit to the ground in SECAM, which is the format used by Russian television. Most of this is based on the system that was used on Mir, which was pretty robust."
But the independent video systems on board are a backup, says Wilson. "If we have problems with the U.S. system, we can at least get video via the Russian system over Russian ground stations." The NTSC video system in the Japanese portion of the space station will link to the U.S. video system. The Japanese may be able to "downlink directly to Japan via a Japanese satellite or directly to the ground when they're over Japan," according to Wilson.
When you're sending and receiving video from space, bandwidth is the primary limiting factor for the quality and number of simultaneous transmissions. Dr. Norman Kluksdahl, a control systems engineer with the Johnson Space Center, works with the portion of the space station's and shuttle's bandwidth that is set aside specifically for digital videoconferencing. "On the shuttle, we have 128 kilobits per second going uphill," he says. The space station, however, will have access to a bigger bandwidth pipe. "On the station, we'll have 3 megabits per second going uphill, so we'll have 24 times the bandwidth. On the shuttle, we can get two or four megabits per second downhill, depending on which Ku-band satellite channel we're using. On the station, we'll be fixed at 6 megabits, so we'll have more bandwidth down, as well."
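Kluksdahl's figures can be checked with a little arithmetic. The sketch below simply restates the quoted link rates and computes the uplink gain (the constant names are my own, not NASA's):

```python
# Back-of-the-envelope check of the link budgets quoted above.
# All rates in bits per second; figures are as quoted in the article.
SHUTTLE_UPLINK = 128_000        # 128 kbps "going uphill" on the shuttle
STATION_UPLINK = 3_000_000      # 3 Mbps uphill on the station
SHUTTLE_DOWNLINK = 4_000_000    # up to 4 Mbps downhill, Ku-band best case
STATION_DOWNLINK = 6_000_000    # fixed 6 Mbps downhill on the station

uplink_gain = STATION_UPLINK / SHUTTLE_UPLINK
print(f"Station uplink is {uplink_gain:.1f}x the shuttle's")
```

This works out to about 23.4x, which matches the "24 times the bandwidth" in the quote once you round up.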
All of the video on the space station will be processed digitally by a video baseband signal processor, which can compress and interlace the video so that more of it can be squeezed through limited bandwidth. "It's a multiplex-type system, where four different signals will feed into it from four different channels," says Wilson. "There will still be analog sources going into it, but the processing of the video and the downlinking of the video will be digital."
The maximum bandwidth of the video baseband signal processor is 50 megabits per second. "But when you take the overhead out of the packetizing and all the stuff that they do to put that in, you have about 43 megabits per second," he says. "If you split that into four even channels coming down, you would have about 11 megabits per channel. You could also have one channel take up effectively half of the bandwidth, and have the other three use the other half of it. It's all scaleable, but you can't have more than four channels at a time."
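The channel-allocation scheme Wilson describes can be sketched as a simple weighted split of the usable downlink. This is an illustration of the arithmetic in the quote, not NASA's actual scheduling logic, and the function name is my own:

```python
# Sketch of the video baseband signal processor's bandwidth budget,
# using the figures quoted above (50 Mbps raw, ~43 Mbps after
# packetizing overhead, at most four channels at a time).
RAW_DOWNLINK_MBPS = 50.0
USABLE_MBPS = 43.0  # after packetization overhead is removed

def split_channels(weights):
    """Divide the usable bandwidth among up to four channels by weight."""
    assert 1 <= len(weights) <= 4, "no more than four channels at a time"
    total = sum(weights)
    return [USABLE_MBPS * w / total for w in weights]

print(split_channels([1, 1, 1, 1]))  # four even channels: 10.75 Mbps each
print(split_channels([3, 1, 1, 1]))  # one channel takes half: 21.5 Mbps
```

Splitting evenly gives 10.75 Mbps per channel, consistent with the "about 11 megabits per channel" in the quote; a 3:1:1:1 weighting gives one channel exactly half the usable bandwidth, matching the other scenario Wilson describes.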
Why multiple channels? Some of the scientific experiments on board the space station require a station-to-ground video link. "You may need to watch a crystal grow, or you may need to watch some fluid do something," says Wilson. The video is routed through the station's video system and transmitted to the ground, where a scientist can watch the experiment. "There's an interface document that tells you what parameters you have to meet, what your video has to be like, and what connector you have to put on it in order to interface with the video system that's going to be onboard," he says. Internal video switchers accept the video signals from the experiments, as well as the signals from portable video cameras that can be plugged in as needed.
The two-way video systems are also used for private videoconferences. "We use the exact same video path that we use for everything else," says Wilson. "We just turn off all the monitors and unplug them, so no one else can see it, except in two or three places." These restricted videoconferences include private medical discussions between the flight surgeon and individual crew members, as well as family conferences with crew members. "Because the station crew will be up there for long periods of time, they plan to expand this, so the crew can exchange e-mail with their families," says Andy Stolz, who works with the U.S. payload video-photo operations group at the Marshall Space Flight Center in Huntsville, Alabama.
Much of the in-flight videoconferencing is implemented using Intel's ProShare Video System on notebook computers. Intel modified NASA's older version of ProShare to accommodate the irregularities of transmission from space. Ironically, Intel's newest systems can't be used because they can't be modified for the delay. "It tends to time out when trying to send the packets, and it never really establishes a good, firm two-way link," says Kluksdahl. "Intel increased the delay time in terms of how long it takes to get an acknowledgment back to establish the link." Part of the delay results from the video signal being sent to an orbiting Tracking and Data Relay Satellite (TDRS) before it is relayed to Earth. "And sometimes communications get a little interrupted, depending on the attitude of the orbiter. We have AOS [Acquisition of Signal] and LOS [Loss of Signal] cycles, so that we lose communications, and then it comes back."
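The timeout problem Kluksdahl describes comes down to simple light-travel delay. The rough estimate below uses general physical constants and my own assumed figures (the timeout value is a plausible terrestrial default, not Intel's actual parameter), just to show why a relay through geosynchronous orbit breaks software tuned for ground networks:

```python
# Illustrative estimate of why a terrestrial acknowledgment timeout
# fails over a TDRS relay. Figures are rough physical constants and
# assumptions, not NASA's or Intel's actual link parameters.
C = 299_792_458            # speed of light, m/s
GEO_ALTITUDE = 35_786_000  # TDRS satellites orbit near geosync altitude, m

# One leg (ground -> TDRS -> spacecraft) is roughly two geosync-length
# hops; a full request/acknowledge round trip is about four.
one_hop = GEO_ALTITUDE / C
round_trip = 4 * one_hop
print(f"~{round_trip * 1000:.0f} ms minimum round trip")

TERRESTRIAL_TIMEOUT = 0.2  # assumed LAN-era ack timeout, seconds
print("link times out" if round_trip > TERRESTRIAL_TIMEOUT else "link holds")
```

Even this best-case estimate of nearly half a second of pure propagation delay, before any processing or AOS/LOS gaps, exceeds a timeout tuned for terrestrial networks, which is why the acknowledgment window had to be lengthened.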
Because all of the space station's video will be processed digitally, the transmissions will become just another form of data. "The goal is to move toward treating the spacecraft as a node in a wide-area network," says Kluksdahl. The video data stream is kept separate from the systems-control data stream for obvious security reasons. "We're doing all non-critical stuff. If the e-mail system goes down, or if it doesn't work, it's not going to affect the safety of the crew."
Each station crew member will have a DVD-equipped laptop, which is referred to as a Station Support Computer (SSC). "It can be used for keeping experiment notes, doing e-mail, videoconferencing and watching movies via DVD-ROM," says Wilson. Another type of laptop, referred to as a Portable Computer System (PCS), is used to control the station's systems. "It makes sure the air is right, the fans are blowing and the temperature is correct."
Both sets of computers are IBM ThinkPad 760 XD notebooks, which will make plugging in a USB-equipped video camera very easy. "The crew might have one or two video cameras, most likely a Canon XL1, that they can plug in when they need to have a videoconference."
Science and Sensibility
NASA can't always use the latest video equipment because each component must be rigorously tested before it is introduced into space-flight operations. "In certifying stuff for space, it's a little harder to take advantage of technology creep," says Dave Scott, a self-described technologist and "gizmologist" at Marshall Space Flight Center. "The ripple effect of making a change can be immense." New video equipment is often tested on Detailed Test Objective (DTO) flights. "It's a step-by-step process where a camera might be evaluated for its image quality, reliability and power requirements," says Ed Wilson. On an April 1999 shuttle mission, for example, the crew tested a Canon XL1 digital camcorder and a Sony high-definition camera with a Canon lens.
Bringing two-way, real-time video to the astronauts was a particularly drawn-out conundrum for NASA. "When you saw the stuff on TV [in the 1960s] with the president talking to the crew, you saw the president - the crew didn't," says Scott. In 1994, NASA sent prerecorded video files via a data link to the astronauts' notebook computers. "You could send a couple of pictures, or you could send a QuickTime or AVI movie, though it would take a while to load," says Scott. True two-way, real-time video was finally achieved in 1995. Uplink fax capability, however, has been used since the 1980s whenever engineers on the ground have needed to describe something visual to the crew.
NASA devised work-arounds in the 1970s. "If you've seen photos of Skylab, you may have noticed that one solar panel is missing," says Scott. "There's only one solar panel going out to the side. The panel on the other side didn't deploy properly, and NASA decided it would be safer to cast off the solar panel and let it float away than have it flap around. They had to jury-rig a bracket, and this guy I was working with had done the design for it. He figured out how to draw a diagram on Skylab's teletype machine using overstrikes and backing up the carriage. It took a long time, but he was able to send a picture." Today, NASA can send those instructions via a live video feed.
When it comes to choosing video equipment for NASA's space missions, reliability is the single most important factor. "I might go to Radio Shack and buy a couple of pieces for an electronics project at home," says Scott. "If it doesn't work, it's not a big deal - I can jump in the car, drive five blocks and go back to Radio Shack."
Outer space, however, "is a little farther than five blocks," he says.

David English is a freelance writer in Greensboro, North Carolina.
Copyright © 2001 Knowledge Industry Publications, Inc. All rights reserved.