In This Issue...
Hooked on Phonons: NIST-led Research Group Measures Graphene Vibrations
An international research group led by scientists at the National Institute of Standards and Technology's (NIST) Center for Nanoscale Science and Technology has developed a method for measuring crystal vibrations in graphene. Understanding these vibrations is a critical step toward controlling future technologies based on graphene, a one-atom-thick form of carbon.
They report their findings in the June 19, 2015, issue of Physical Review Letters.*
Carbon atoms in graphene sheets are arranged in a regularly repeating honeycomb-like lattice—a two-dimensional crystal. Like other crystals, when enough heat or other energy is applied, the forces that bond the atoms together cause the atoms to vibrate and spread the energy throughout the material, akin to how the vibration of a violin's string resonates throughout the body of the violin when played.
And just as every violin has its own unique character, each material vibrates at unique frequencies. The collective vibrations, which have frequencies in the terahertz range (a trillion oscillations per second), are called phonons.
Understanding how phonons interact gives clues as to how to put in, take out or move energy around inside a material. In particular, finding effective ways to remove heat energy is vital to the continued miniaturization of electronics.
One way to measure these tiny vibrations is to bounce electrons off the material and measure how much energy the electrons have transferred to the vibrating atoms. But it's difficult. The technique, called inelastic electron tunneling spectroscopy, elicits only a small blip that can be hard to pick out over more raucous disturbances.
"Researchers are frequently faced with finding ways to measure smaller and smaller signals," says NIST researcher Fabian Natterer. "To suppress the chaos and get a grip on the small signals, we use the very distinct properties of the signal itself."
Unlike a violin that sounds at the lightest touch, according to Natterer, phonons have a characteristic threshold energy. That means they won't vibrate unless they get just the right amount of energy, such as that supplied by the electrons in a scanning tunneling microscope (STM).
To filter the phonons' signal from other distractions, NIST researchers used their STM to systematically alter the number of electrons moving through their graphene device. As the number of electrons was varied, the unwanted signals also varied in energy, but the phonons remained fixed at their characteristic frequency. Averaging the signals over the different electron concentrations diluted the annoying disturbances but reinforced the phonon signals.
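The averaging trick can be illustrated with a toy model (illustrative only; the energies, line shapes and noise levels below are invented for the sketch, not the team's data): a spectral feature that shifts with carrier concentration washes out under averaging, while a feature pinned at a fixed threshold energy survives.

```python
import numpy as np

rng = np.random.default_rng(0)
energy = np.linspace(0, 200, 400)   # tunneling bias energy axis (arbitrary units)
PHONON_E = 60.0                     # hypothetical fixed phonon threshold energy

def spectrum(gate_shift):
    """One simulated tunneling spectrum at a given carrier concentration."""
    # The phonon signal is a step at a fixed threshold energy...
    phonon = 1.0 / (1.0 + np.exp(-(energy - PHONON_E)))
    # ...while other spectral features disperse as the carrier density changes.
    dispersing = 2.0 * np.exp(-((energy - (80.0 + gate_shift)) / 5.0) ** 2)
    noise = rng.normal(0.0, 0.1, energy.size)
    return phonon + dispersing + noise

# Average spectra taken at many different carrier concentrations: the
# dispersing feature and the noise are diluted; the fixed phonon step is not.
avg = np.mean([spectrum(s) for s in np.linspace(-40, 40, 41)], axis=0)
```

In the averaged spectrum the step at the (made-up) threshold energy remains sharp while the moving peak is spread thin, which is the essence of the filtering described above.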
The team was able to map all the graphene phonons this way, and their findings agreed well with their Georgia Tech collaborators' theoretical predictions.
According to NIST Fellow Joe Stroscio, learning to pick out the phonons' signal enabled them to observe a peculiar and surprising behavior.
"The phonon signal intensity fell off sharply when we switched the graphene charge carrier from holes to electrons—positive to negative charges," says Stroscio. "A clue to what's initially enhancing the phonons' signals and then causing them to fall off are whispering gallery modes, which become filled with electrons and stop the phonons from vibrating when we switch from hole to electron doping."
The team notes that this effect is similar to resonance-induced effects seen in small molecules. They speculate that if the same effect were happening here, it could mean that the system—graphene and STM—is mimicking a giant molecule, but say that they still don't have a firm theoretical foundation for what's happening.
The high purity graphene device was fabricated by NIST researcher Y. Zhao in the Center for Nanoscale Science and Technology's Nanofab, a national user facility available to researchers from industry, academia and government.
*F. Natterer, Y. Zhao, J. Wyrick, Y. Chan, W. Ruan, M. Chou, K. Watanabe, T. Taniguchi, N. Zhitenev and J. Stroscio. Strong asymmetric charge carrier dependence in inelastic electron tunneling spectroscopy of graphene phonons. Physical Review Letters, 114, 245502. Published June 16, 2015. DOI: 10.1103/PhysRevLett.114.245502.
Media Contact: Mark Esser, firstname.lastname@example.org, 301-975-8735
To Give Cancer a Deadly Fever, NIST Explores Better Nanoparticle Design
Heat may be the key to killing certain types of cancer, and new research from a team including National Institute of Standards and Technology (NIST) scientists has yielded unexpected results that should help optimize the design of magnetic nanoparticles that can be used to deliver heat directly to cancerous tumors.
When combined with other treatments such as radiotherapy or chemotherapy, heat applied directly to tumors helps increase the effectiveness of those types of treatments, and it reduces the necessary dose of chemicals or radiation.
This is where magnetic nanoparticles come in. These balls of iron oxide, just a few tens of nanometers in diameter, heat up when exposed to a powerful magnetic field. Their purpose is to bring heat directly to the tumors. Materials research, performed in part at the NIST Center for Neutron Research (NCNR), revealed magnetic behavior that proved counterintuitive to the scientific team—a finding that will affect which particles are chosen for a particular treatment.
Choosing the right kind of particles is important because, depending on their structure, they deliver a different dose of heat to the cancer. Some heat up quickly at first, while others require a stronger magnetic field to get going but ultimately deliver more heat.
“You want to design your nanoparticles for the kind of cancer you’re treating—whether it’s localized or spread through the body,” says NIST’s Cindi Dennis. “The amount of electricity needed to create the field can be 100 kilowatts or more. That costs a lot of money, so we want to help engineer particles that will do the best job.”
Although the magnetic field applied for hyperthermia is 100 to 1,000 times weaker than that typically used for MRI, Dennis explains, it’s an alternating field (the magnetic polarity switches rapidly), which requires a lot more power.
With colleagues at Johns Hopkins University School of Medicine, the University of Manitoba and in industry, the team studied two kinds of iron-oxide nanoparticles, each of which has a different internal structure. In one, iron-oxide crystals are stacked neatly, like bricks in a wall; in the other, the arrangement is more haphazard, like balls in a playpen. While subjecting both types to an alternating magnetic field, the team discovered that the neatly stacked ones needed a stronger field than expected to heat up, while the haphazard particles got hot more quickly, even when the field was still weak.
It took a trip to the NCNR to figure out why these nanoparticles acted strangely. The neutron experiments showed regions of different sizes and shapes in the particles. Within each region, the so-called magnetic moments are uniform and point in the same direction. But the regions themselves did not align with each other. This unexpected behavior among regions, it turns out, profoundly affects the nanoparticles’ response to a magnetic field.
"Materials often behave unexpectedly on the nanoscale, and here we have another example of that,” Dennis says. “We expect it will help design better cancer treatments. A localized cancer could be treated with nanoparticles that give out lots of heat right away because the field can be focused on a small region.”
* C.L. Dennis, K.L. Krycka, J.A. Borchers, R.D. Desautels, J. van Lierop, N.F. Huls, A.J. Jackson, C. Grüttner and R. Ivkov. Internal magnetic structure of nanoparticles dominates time-dependent relaxation processes in a magnetic field. Advanced Functional Materials. Published online June 2, 2015. DOI: 10.1002/adfm.201500405.
Media Contact: Chad Boutin, email@example.com, 301-975-4261
NIST’s NextGen PIV Card Strengthens Security and Authentication
The National Institute of Standards and Technology (NIST) has updated its technical specifications and guidance for the next generation of “smart” identity cards used by the federal government's workforce. The new specifications add enhanced security features to verify employees’ and contractors’ identities, as well as new capabilities that work with mobile devices such as smartphones.
Federal employees and contractors use Personal Identification Verification (PIV) Cards for secure access to government facilities and computers. The PIV Card features a microchip with the employee’s photo, PIN, fingerprint information and other details.
The next generation PIV Card can be used with mobile devices, enabling federal employees to connect securely to government computer networks from such devices. This feature is in addition to the Derived PIV Credential as specified in Guidelines for Derived Personal Identity Verification (PIV) Credentials, issued in December 2014. The card provides stronger identity assurance for federal workers to enter many government facilities and use computers at those locations.
The revised Federal Information Processing Standard (FIPS) 201-2 of 2013 set the stage for the new generation of PIV Cards by specifying new technologies for the strong authentication credential and providing enhanced support for mobile devices, based on lessons learned from federal agencies.
NIST has issued updates to two key documents that lay out the technical details identified in FIPS 201-2 for government PIV Cards.
The publications are designed for U.S. government agencies to upgrade their PIV Cards, for vendors that make the cards, and for vendors that develop hardware and software to work with the cards.
Media Contact: Evelyn Brown, firstname.lastname@example.org, 301-975-5661
NIST Releases Update of Industrial Control Systems Security Guide
The National Institute of Standards and Technology (NIST) has issued the second revision to its Guide to Industrial Control Systems (ICS) Security. It includes new guidance on how to tailor traditional IT security controls to accommodate unique ICS performance, reliability and safety requirements, as well as updates to sections on threats and vulnerabilities, risk management, recommended practices, security architectures and security capabilities and tools.
Downloaded more than 3 million times since its initial release in 2006, the ICS security guide advises on how to reduce the vulnerability of computer-controlled industrial systems to malicious attacks, equipment failures, errors, inadequate malware protection and other threats.
ICS encompass the hardware and software that control equipment and the information technologies that gather and process data. They are commonly used in factories and by operators of electric utilities, pipelines and other major infrastructure systems.
Most ICS began as proprietary, stand-alone collections of hardware and software that were walled off from the rest of the world and isolated from most external threats. Today, widely available software applications, Internet-enabled devices and other nonproprietary IT offerings have been integrated into most such systems. This connectivity has delivered many benefits, but it also has increased the vulnerability of these systems. Cybersecurity threats to ICS can pose significant risks to human health and safety, the environment, and business and government operations.
Due to unique performance, reliability and safety requirements, securing ICS often requires adaptations and extensions to NIST-developed security standards and guidelines commonly used to secure traditional IT systems.
A significant addition in this revision is a new ICS overlay offering tailored guidance on how to adapt and apply security controls and control enhancements detailed in the 2013 comprehensive update of Security and Privacy Controls for Federal Information Systems and Organizations (NIST Special Publication 800-53, revision 4) to ICS. SP 800-53 contains a catalog of security controls that can be customized to meet specific needs stemming from an organization's mission, operational environment, or the particular technologies used. Using the ICS overlay, utilities, chemical companies, food manufacturers, automakers and other ICS users can adapt and refine these security controls to address their specialized security needs.
Media Contact: Mark Bello, email@example.com, 301-975-3776
Nothing Says You Like a Tattoo: NIST Workshop Considers Ways to Improve Tattoo Recognition
An international group of experts from industry, academia and government gathered today at the National Institute of Standards and Technology (NIST) to discuss challenges and potential approaches to automated tattoo recognition, which could assist law enforcement in the identification of criminals and victims.
One in five American adults sports a tattoo, and among the criminal population that proportion is much higher. And while tattoos can help identify people who may have committed a crime, they are also potentially valuable in identifying victims of mass casualties such as tsunamis and earthquakes.
Participants at NIST's Tattoo Recognition Technology Challenge Workshop heard the results of a preliminary trial of existing tattoo recognition software. NIST challenged industry and academia to take initial steps into automated image-based tattoo matching technology at the request of the FBI Biometric Center of Excellence (BCOE).
The current method of cataloging tattoo images for sharing relies on a keyword based process. But the increasing variety of tattoo designs requires multiple keywords, and examiner subjectivity can lead to the same tattoo being labeled differently depending on the examiner.
All of the participating organizations used the same BCOE-provided dataset of thousands of images from government databases. NIST provided five use cases and asked the participants to report their performance on each.
NIST computer scientist Mei Ngan organized the challenge and found that "the state-of-the-art algorithms fared quite well in detecting tattoos, finding different instances of the same tattoo from the same subject over time, and finding a small part of a tattoo within a larger tattoo."
Two areas that could use further research, she said, were detecting visually similar tattoos on different people and recognizing a tattoo image from a sketch or sources other than a photo.
"Improving the quality of tattoo images during collection is another area that may also improve recognition accuracy," Ngan said.
The six organizations that participated in the challenge were Compass Technical Consulting, LLC; the Fraunhofer Institute of Optronics, System Technologies and Image Exploitation; the French Alternative Energies and Atomic Energy Commission; MITRE; MorphoTrak; and Purdue University.
In addition to discussing the self-reported findings from this initial research step, participants also discussed the utility of image-based tattoo matching in operations, identified gaps and needs for improving tattoo recognition, and considered next steps NIST might take in this area.
For more details regarding the discussions and outcomes of the workshop, visit www.nist.gov/itl/iad/ig/tatt-c.cfm.
NIST has been the go-to source for testing and evaluation in biometrics for more than half a century, with the goal of ensuring biometric systems are accurate. The laboratory has played a major role in biometric research, including large-scale evaluations of fingerprint and face recognition systems.
Media Contact: Evelyn Brown, firstname.lastname@example.org, 301-975-5661
NIST's 'Nano-Raspberries' Could Bear Fruit in Fuel Cells
Researchers at the National Institute of Standards and Technology (NIST) have developed a fast, simple process for making platinum "nano-raspberries"—microscopic clusters of nanoscale particles of the precious metal. The berry-like shape is significant because it has a high surface area, which is helpful in the design of catalysts. Even better news for industrial chemists: the researchers figured out when and why the berry clusters clump into larger bunches of "nano-grapes."
The research could help make fuel cells more practical. Nanoparticles can act as catalysts to help convert methanol to electricity in fuel cells. NIST's 40-minute process for making nano-raspberries, described in a new paper,* has several advantages. The high surface area of the berries encourages efficient reactions. In addition, the NIST process uses water, a benign or "green" solvent. And the bunches catalyze methanol reactions consistently and are stable at room temperature for at least eight weeks.
The berries were made of platinum, but because the metal is expensive, it was used only as a model. The study will actually help guide the search for alternative catalyst materials, for which clumping behavior in solvents is a key issue. For fuel cells, nanoparticles often are mixed with solvents to bind them to an electrode. To learn how such formulations affect particle properties, the NIST team measured particle clumping in four different solvents for the first time. For applications such as liquid methanol fuel cells, catalyst particles should remain separated and dispersed in the liquid, not clumped.
"Our innovation has little to do with the platinum and everything to do with how new materials are tested in the laboratory," project leader Kavita Jeerage says. "Our critical contribution is that after you make a new material you need to make choices. Our paper is about one choice: what solvent to use. We made the particles in water and tested whether you could put them in other solvents. We found out that this choice is a big deal."
The NIST team measured conditions under which platinum particles, ranging in size from 3 to 4 nanometers (nm) in diameter, agglomerated into bunches 100 nm wide or larger. They found that clumping depends on the electrical properties of the solvent. The raspberries form bigger bunches of grapes in solvents that are less "polar," that is, where solvent molecules lack regions with strongly positive or negative charges. (Water is a strongly polar molecule.)
The researchers expected that. What they didn't expect is that the trend doesn't scale in a predictable way. The four solvents studied were water, methanol, ethanol and isopropanol, ordered by decreasing polarity. There wasn't much agglomeration in methanol; bunches got about 30 percent bigger than they were in water. But in ethanol and isopropanol, the clumps got 400 percent and 600 percent bigger, respectively—really humongous bunches. This is a very poor suspension quality for catalytic purposes.
Because the nanoparticles clumped up slowly and not too much in methanol, the researchers concluded that the particles could be transferred to that solvent, assuming they were to be used within a few days—effectively putting an expiration date on the catalyst.
Two college students in NIST's Summer Undergraduate Research Fellowship (SURF) program helped with the extensive data collection required for the study.
* I. Sriram, A.E. Curtin, A.N. Chiaramonti, J.H. Cuchiaro, A.D. Weidner, T.M. Tingley, L.F. Greenlee and K.M. Jeerage. Stability and phase transfer of catalytically active platinum nanoparticle suspensions. Journal of Nanoparticle Research, 17:230. Published online May 22, 2015. DOI: 10.1007/s11051-015-3034-1.
Media Contact: Laura Ost, email@example.com, 303-497-4880
Lockheed Martin Study: NIST Computational Tool Improves Product Testing and Saves Money
Lockheed Martin, the global security and aerospace company, estimates that widely used software testing methods developed at the National Institute of Standards and Technology (NIST) can trim test planning and design costs by up to 20 percent, while greatly improving the thoroughness of product and system testing during development.
The projected benefits are based on the results of the company’s two-year pilot study of combinatorial testing, as reported in an article in IEEE Computer.* The study was carried out under a Cooperative Research and Development Agreement (CRADA) with NIST.
In contrast to creating test cases manually, combinatorial testing is a comprehensive “troubleshooting” tool for rooting out faulty code, calculation errors, unanticipated interactions and other causes of software failures. Used during the early stages of product development, the computational tool can detect faults that might otherwise linger until late-stage testing or, perhaps, slip through entirely.
Corrective rework to fix late-detected software errors can wreck project schedules and may be the costliest part of the development process, especially for complex products and systems.
Developed with collaborators from the University of Texas at Arlington, NIST’s Advanced Combinatorial Testing System (ACTS) “uses proven mathematical techniques to greatly reduce the number of tests a company needs to perform to ensure the quality of a product or process,” explains NIST computer scientist Richard Kuhn.
ACTS generates a plan for testing combinations of two to six variables that can interact and cause errors.
While studying software crashes of medical devices and Web browsers, Kuhn and colleagues determined that between 70 and 95 percent of software failures are triggered by interactions between only two variables. Nearly all software failures are triggered by no more than six variables.
“For example, all six-way combinations of 34 switches could be tested with only 522 tests instead of 17 billion for all possible combinations,” said Raghu Kacker, a NIST mathematical statistician.
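The combinatorial idea can be sketched with a toy greedy construction of a pairwise (2-way) test set. This is a simplified illustration, not the ACTS algorithm; ACTS uses far more sophisticated methods and scales up to six-way interactions.

```python
from itertools import combinations, product

def pairwise_tests(params):
    """Greedily build a small test set covering every pair of parameter values.

    params: one list of possible values per parameter.
    """
    # Every (parameter i = a, parameter j = b) combination that must appear
    # in at least one test.
    uncovered = set()
    for (i, vi), (j, vj) in combinations(enumerate(params), 2):
        uncovered.update((i, a, j, b) for a in vi for b in vj)

    candidates = list(product(*params))  # the exhaustive test set
    tests = []
    while uncovered:
        # Pick the candidate test that covers the most still-uncovered pairs.
        best = max(candidates, key=lambda t: sum(
            (i, t[i], j, t[j]) in uncovered
            for i, j in combinations(range(len(t)), 2)))
        tests.append(best)
        for i, j in combinations(range(len(best)), 2):
            uncovered.discard((i, best[i], j, best[j]))
    return tests

# Four parameters with three values each: 3**4 = 81 exhaustive tests,
# but a pairwise set needs only about a dozen.
suite = pairwise_tests([[0, 1, 2]] * 4)
```

Extending coverage from pairs to higher-order tuples trades a larger test set for higher assurance; as the 34-switch example shows, the savings over exhaustive testing grow dramatically with the number of parameters.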
In one project, combinatorial testing was compared with conventional methods for testing commercial Web browser software. NIST used the method to find 100 percent of flaws using less than 5 percent of the original number of tests.
During Lockheed Martin’s two-year study to assess the merits of the software testing approach, the company chose ACTS as its primary tool, which it supplemented with other combinatorial testing packages. In all, the tool’s utility was evaluated in eight pilot development projects, including an electronic warfare system, the redesign of a fin on a fighter jet, and an evaluation of aircraft engine failure modes.
“Lockheed Martin’s initial estimate is that combinatorial testing and supporting technology can save up to 20 percent in test planning and design costs if used early on in a program and can increase test coverage by 20 to 50 percent,” company and NIST researchers wrote in the Computer article.
“Our experience showed that the combinatorial testing technique is maturing,” says Thomas L. Wissink, director of integration, test, and evaluation at Lockheed Martin. “We are continuing to use combinatorial testing at Lockheed Martin and are planning to extend its use through the company.”
NIST also is building on lessons learned from this collaboration. Knowledge and insights gained are guiding NIST research to further improve measurement and testing capabilities focused on ensuring the quality, safety and reliability of software and systems.
The ACTS tool and tutorial are publicly available.
* J. Hagar, T. Wissink, D. Kuhn and R. Kacker. Introducing combinatorial testing in a large organization. Computer, April 2015.
Media Contact: Evelyn Brown, firstname.lastname@example.org, 301-975-5661
Learn About the State of Federal Cloud Computing at NIST, July 7-10, 2015
The Cloud Computing Forum & Workshop VIII will be held July 7-10, 2015, at the National Institute of Standards and Technology (NIST) in Gaithersburg, Md. One of the most influential annual meetings on the topic, particularly for government users, it will feature five tracks covering cloud customers, forensics, research, security and standards.
Tuesday, July 7, is Cloud Standards Day. The first two international cloud computing standards have been completed by the International Organization for Standardization (ISO)/International Electrotechnical Commission (IEC), and work on standards continues at the International Telecommunication Union (ITU). How are they being used, and how will they shape the future of cloud standards? Two leading figures in standards development—Hewlett-Packard Director of Standards Initiatives Karen Higginbottom and Oracle Vice President of Standards Strategy and Architecture Donald Deutsch—will lead the discussion.
Security is the theme of Wednesday, July 8. The day offers insight into the intersection of the NIST Cloud Computing Program and the GSA-based Federal Risk and Authorization Management Program (FedRAMP) that reviews and authorizes cloud vendors for government use, overcoming security obstacles when buying cloud-based systems and managing service level agreements and security. The day will end with a discussion on privacy and security.
Thursday, July 9, features two themes. The Cloud Research Track will examine the next direction of cloud computing now that it has become basic infrastructure. Researchers will discuss the next steps for the cloud and how to measure cloud services. A special focus will be the concept of a federated cloud, a vision of a future in which the resources of different cloud service providers are connected in a seamless network resembling today’s Internet.
Thursday’s second theme is Forensics—law enforcement access to cloud data. Speakers will review topics including e-Discovery, data governance, deletion and data integrity in the cloud.
Cloud Customers are the focus on Friday, July 10. Dawn Leaf will share her unique perspective on cloud computing as the current Labor Department CIO and former NIST Senior Executive for Cloud Computing. Other topics to be covered include lessons in cloud computing acquisitions, government needs and available solutions, a practical guide to cloud service level agreements, and how the cloud can assist people with disabilities.
For more information or to register for the Cloud Computing Forum and Workshop VIII: www.nist.gov/itl/cloud/cloud_computing_wkshp_viii.cfm.
Media Contact: Evelyn Brown, email@example.com, 301-975-5661