Research News
FYI Research: Interactive maps speak to visually impaired

Using an ordinary computer keyboard, touchpad and stylus, Jason Morris navigates a map of the British Isles in Roman times. When his stylus hits the ocean, he hears the sound of waves. When he touches land, horses' hoofs thunder. As he moves over a landmark, a computerized voice speaks and spells the landmark's name.

These sounds are much more than window dressing. Morris, a graduate student in classics, doesn't experience the world the way that most people do. He's blind.
With Morris as a consultant, five undergraduates in computer science
have created software that can help visually impaired people explore
maps. The team calls it BATS (Blind Audio Tactile Mapping System). Getting
the software to work with the British Isles map took a semester of work.
But the students now think of it as just a beginning.

Maps provide spatial information that is hard to get any other way. But most maps aren't accessible to people with visual impairments. Braille maps do exist, but they can include only a small fraction of the information found on conventional maps. And there isn't a standardized tactile way of signifying features such as bodies of water, said James Kessler, director of Disability Services.

Morris knows this problem firsthand. When he came to Carolina to study classics and began working in the Ancient World Mapping Center, he and Tom Elliott, director of the mapping center, wanted to make parts of the Barrington Atlas of the Greek and Roman World (edited by Richard Talbert, professor of history) accessible to visually impaired college students. The atlas includes 175 pages of full-color maps, none of which could be used by visually impaired students and researchers. Elliott and Morris were planning to make parts of it into tactile maps.

Meanwhile, Gary Bishop, associate professor of computer science, was looking for a way to use his computer-graphics knowledge to help people with visual impairments. But first Bishop needed a visually impaired person to help him. "If you work with a user, then you really find out what is necessary for the tool," Bishop said. He had looked for such a person for a while.

Then one day, he was walking across campus. "Here comes this blind guy walking the other way on the sidewalk," Bishop said. "I know he's blind because he's got a [guide] dog. But I'm not going to say to him, `Oh hi, I notice you're blind.' And so I just walked past him." But Morris stopped Bishop. "What street am I on?" he asked. Bishop told him he was on a campus sidewalk. Morris' guide dog, Annie, had taken a wrong turn. As Bishop helped Morris get on the right course, they introduced themselves. Bishop had found his user.
For starters, Bishop suggested that a group of students work on an interface for part of the atlas as their project for Computer Science 145, taught by Kye Hedlund, associate professor of computer science. That's when undergraduates Shawn Hunter, Thomas Logan, Chad Haynes, Elan Dassani and Anthony Perkins began working with Morris on the map of the British Isles. Their completed project helps Morris explore a map that wouldn't be available to him otherwise. "I've been very impressed with the way these undergrads have participated in research," Bishop said.

Right now the software gives feedback only via sounds. As a next step -- this summer, if they can find the funding -- the students want to add more ways of exploring the map, such as a haptic (touch) interface, so that the user could also get information about the map through, say, vibration of the stylus.

For now the software is experimental, Morris said. It's programmed to work specifically with the British Isles map, and that took a lot of work. For instance, to make the software read elevation data, Elliott used a Geographic Information System to lay a grid over the map and assigned an elevation value to each point on the grid. He also gave the students access to the mapping center's database of place names and other data. The students developed software that relates points in the grid to the information in the database. As Morris traces the map with the stylus, the software reads data from the grid and from the database, interpreting it for Morris via sound and synthesized speech.

All those involved with BATS want to take their ideas even further and design a program that could be used to explore any type of map with little programming. To do that, the software would have to use technology that is just emerging, such as scalable vector graphics, which would allow the software to read almost any digitized map that had been prepared accordingly.
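The grid-and-database lookup described above can be sketched roughly as follows. This is a hypothetical illustration, not the actual BATS code: the grid cells, place names and audio cues are invented stand-ins for the GIS-derived data the students used.

```python
# Hypothetical sketch of a BATS-style lookup: a grid laid over the map
# stores an elevation value per cell plus an optional key into a database
# of place names. As the stylus moves, the cell under it is looked up and
# rendered as sound or synthesized speech. All data here is illustrative.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Cell:
    elevation: int            # meters; <= 0 is treated as water
    place_key: Optional[str]  # key into the place-name database, if any

# A tiny 3x3 "map" standing in for the GIS-derived grid.
GRID = [
    [Cell(-5, None), Cell(-2, None), Cell(10, "Londinium")],
    [Cell(-3, None), Cell(15, None), Cell(40, "Eboracum")],
    [Cell(20, None), Cell(55, None), Cell(80, None)],
]

PLACES = {"Londinium": "Roman London", "Eboracum": "Roman York"}

def feedback(row: int, col: int) -> str:
    """Return the audio cue for the stylus position (row, col)."""
    cell = GRID[row][col]
    if cell.elevation <= 0:
        return "sound: waves"          # stylus is over the ocean
    cue = "sound: hoofbeats"           # stylus is over land
    if cell.place_key:                 # landmark: speak and spell its name
        name = cell.place_key
        cue += f"; speech: {name} ({PLACES[name]}), " + "-".join(name)
    return cue
```

Tracing the stylus over the water cell at (0, 0) yields the wave sound; moving onto the landmark cell at (0, 2) adds spoken and spelled-out place-name output on top of the hoofbeat cue.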
"People can publish information via their web server, and a device that understands how to speak these standard protocols can immediately interact with that data, even if the device has never seen it before," Elliott said.

What began as a class project has now attracted interest from James Kessler, director of Disability Services, and Toby Considine, a technology staff member in Facilities Services, because of its possibilities for instantly providing updated maps to different kinds of people. If the University could leverage the right technology, Considine said, it's feasible that Facilities Services could update a map once, and that information would automatically be transmitted to a variety of devices -- Palm Pilots, web browsers and tactile devices such as BATS. So a visitor with a Palm Pilot or a blind person with an audio/tactile device could walk into a campus building and immediately get information about the layout of the building and the services offered there. Or a blind student who needed to find an unfamiliar building or avoid a construction fence could download a map that could be read by an audio/tactile device.

Realizing those goals will take a lot of work, some funding and a short wait for the technology to become standard. But the students are willing to work. "We want to build a foundation now to make something that can be extended, and create an application that has far larger implications than what we are able to do this summer," Hunter said.

Provided by Research and Graduate Studies
Editor: Neil Caudle
Writer: Angela Spivey

Other projects

Along with BATS, students took on these projects in Kye Hedlund's computer science class:
- Software that provides quick, online translations of Greek texts into Braille.
- Software that makes it possible to use a digital scanner to get an audio translation of currency.
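The scalable-vector-graphics idea mentioned above works because an SVG map can carry its own feature names in standard markup, so software that has never seen a particular map can still announce its features. A minimal sketch, assuming a map whose features include SVG `<title>` elements (the SVG content and function name here are illustrative, not from the actual project):

```python
# Hypothetical sketch: extract speakable features from a prepared SVG map.
# Each map feature carries a <title> child naming it, so a generic reader
# can list (feature-kind, name) pairs for a speech engine to announce.
import xml.etree.ElementTree as ET

SVG_NS = "{http://www.w3.org/2000/svg}"

SVG_MAP = """<svg xmlns="http://www.w3.org/2000/svg">
  <path d="M0 0 L10 10" class="water"><title>North Sea</title></path>
  <circle cx="5" cy="5" r="1" class="settlement"><title>Londinium</title></circle>
</svg>"""

def spoken_features(svg_text: str):
    """Return (feature-kind, name) pairs a speech engine could announce."""
    root = ET.fromstring(svg_text)
    features = []
    for elem in root.iter():
        title = elem.find(SVG_NS + "title")  # direct <title> child, if any
        if title is not None:
            features.append((elem.get("class", "feature"), title.text))
    return features

print(spoken_features(SVG_MAP))
# [('water', 'North Sea'), ('settlement', 'Londinium')]
```

Because the feature names travel inside the map file itself, the same reader works for any map prepared this way, which is the "little programming" goal the BATS team describes.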
Grant to fund cell signaling research

A team of investigators in the School of Medicine has received a five-year, $5 million program project grant from the National Institute of General Medical Sciences to study molecular aspects of cell signaling pathways.

Kendall Harden, professor of pharmacology and the project's principal investigator, said the study brings together five scientists "who share a common passion but widely different expertise for unraveling biochemical mechanisms that determine how cells respond to hormones, neurotransmitters and growth factors." The group also includes Channing Der, David Siderovski and John Sondek of the Department of Pharmacology and Henrik Dohlman of the Department of Biochemistry & Biophysics. All are members of the UNC Lineberger Comprehensive Cancer Center.

Harden noted that the project team will apply state-of-the-art biochemical, genetic, biological and structural approaches to gain new insights into key proteins that orchestrate cellular events at the heart of diseases as diverse as cancer, heart disease and mental disorders. The research should illuminate new targets for drug therapies. Noting that more than 30 scientists at Carolina now study cell signaling, Harden said Carolina has become "one of the best places in the country to do this kind of work."
Group receives $8 million to model 3-D objects from images

A University research group has received a five-year, $8 million grant from the National Cancer Institute to develop techniques for characterizing anatomical objects seen in medical images.
The Medical Image Display and Analysis Group focuses on using medical
images created by computed tomography, magnetic resonance and other
means to help doctors diagnose disease and deliver therapy. The grant
will support the next phase of the group's ongoing Medical Image Presentation
project. "We are excited that we can combine medical contributions which can improve the health of patients with scientific contributions as to what anatomic shape is and how to compute its properties," said Stephen Pizer, the project's principal investigator. Pizer is a Kenan professor who represents the departments of computer science, radiation oncology, radiology and biomedical engineering.

The objective of this phase of the project, called Structural Image Analysis and Medical Uses, is to develop ways to model 3-D anatomical objects, such as organs, from computed tomography and magnetic resonance images. Such models can help doctors plan radiation treatment for cancer and make psychiatric diagnoses. The models also will support research into the development and diagnosis of schizophrenia. The project also will further develop methods for 3-D modeling based on m-reps, a means of representing pliable objects invented by Pizer and the Medical Image Display and Analysis Group.

Project leaders are Edward Chaney, professor in the department of radiation oncology; Guido Gerig, Taylor Grandy professor with a joint appointment in the departments of computer science and psychiatry; Keith Muller, associate professor in the department of biostatistics; and Graham Gash, software engineer in the department of computer science.

The Medical Image Display and Analysis Group, formed in 1974, is a multidisciplinary group of about 100 faculty, graduate students and staff representing 11 departments. Much of the group's recent work has focused on using 3-D medical images to describe the location, orientation, size and shape of anatomic objects. These projects include 3-D modeling of brain structure change in patients with psychiatric diseases and modeling of patients' vascular networks to assist in surgery. For more information on the group's research, see midag.cs.unc.edu.