Showing posts with label cloud computing.

June 05, 2014

Softbank $2,000 Companion robot for 2015 will test many social issues such as learning your family's habits and technical ones like cloud artificial intelligence

Japanese firm Softbank has unveiled a robot called Pepper, which it says can read human emotions.

It uses an "emotional engine" and a cloud-based artificial intelligence system that allows it to analyse gestures, expressions and voice tones.

The firm said people could communicate with it "just like they would with friends and family" and it could perform various tasks.

It will go on sale to the public next year for 198,000 yen ($1,930; £1,150).

"People describe others as being robots because they have no emotions, no heart," Masayoshi Son, chief executive of Softbank, said at a press conference.

The machine will be on display starting Friday at Softbank retailers.

* This robot is part of a wave of new higher-capability, affordable robots for the home and business

Softbank has set an ambitious target of making this robot able to independently babysit children and attend to elder care. If it can actually deliver a large part of those capabilities, that would mean potentially over 100 million units sold over many years and broad global societal impact. The robot does not seem capable of that level of success now, so part of the challenge is getting enough initial traction and filling certain capability niches to gain momentum.




Softbank has a 48-page PDF presentation.



February 05, 2014

China is expected to take the lead in research and development spending around 2022

Global research and development (R&D) spending is forecast to grow by 3.8 percent—or $60.0 billion—to $1.6 trillion in 2014, according to the closely watched annual forecast by Battelle and R&D Magazine. After a flat year of R&D spending in 2013, the U.S. is projected to show modest growth while China is expected to continue its two-decade upward trajectory in R&D investment.

Industry Snapshots

Life Sciences: The biopharmaceutical sector accounts for 85 percent of all expenditures in the life sciences industry, which also includes medical instruments and devices, animal/agricultural bioscience and commercial life science research and testing. The global industry is forecast to have a healthy recovery after a flat 2013, increasing 3.1 percent to $201 billion in 2014. In the U.S., a small projected rebound of 2.2 percent would increase spending to about $93 billion, with growth primarily coming from smaller biopharmaceutical innovators and medical device manufacturers.

Information and Communication Technologies: This industry is the largest private-sector R&D investor in the U.S., performing nearly one-third of the total, and is expected to grow by 5.4 percent to $146 billion in 2014. U.S. firms also are dominant globally and will account for more than half of the industry’s worldwide R&D expenditures of $257 billion in 2014. Cloud computing and associated technologies will remain the major R&D thrust for the foreseeable future.

China's PPP research and development spending is a bit over half the US level and almost double Japan's. China is projected to pass the combined countries of the EU in about 2019 and pass the US in about 2022 in R&D spending.
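The crossover-year arithmetic can be sketched with compound growth. In the sketch below, the 2014 PPP levels approximate the Battelle forecast figures, and the sustained growth rates are illustrative assumptions rather than values from the report:

```python
import math

# Hedged sketch: project R&D spending to estimate the year China's spending
# passes the US level. Starting levels approximate Battelle's 2014 PPP
# figures; the growth rates are assumptions chosen for illustration.
us_2014, china_2014 = 465e9, 284e9      # USD (PPP), approximate
us_growth, china_growth = 0.025, 0.10   # assumed sustained annual growth

# Solve china_2014 * (1 + g_cn)^t = us_2014 * (1 + g_us)^t for t.
t = math.log(us_2014 / china_2014) / math.log((1 + china_growth) / (1 + us_growth))
crossover_year = 2014 + math.ceil(t)
print(crossover_year)  # → 2021, roughly in line with the ~2022 projection
```

With these assumed rates the crossover lands around 2021; slower Chinese growth or faster US growth pushes it later, toward the report's ~2022.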

Here is the 37-page 2014 forecast report on global research.

January 09, 2014

IBM is making a multi-billion bet on Watson Artificial Intelligence for a Siri for Science and Business

This is IBM attempting to use enterprise artificial intelligence to accelerate technology in science, medicine and business. If it succeeds, this could be the first real artificial-intelligence-driven acceleration of technological development, enhancing the productivity of science and business. It is a huge lift, and it would not be a fast iteration of improvement. It is not like the vision of vastly greater-than-human artificial intelligence setting off an intelligence explosion.

IBM unveiled three new Watson services delivered over the cloud.

1) Watson Discovery Advisor is designed to accelerate and strengthen research and development projects in industries such as pharmaceuticals, publishing and biotechnology.

2) Watson Analytics delivers visualized Big Data insights, based on questions posed in natural language by any business user.

3) IBM Watson Explorer helps users across an enterprise uncover and share data-driven insights more easily, while empowering organizations to launch Big Data initiatives faster.

The services are being developed and will be offered by the new IBM Watson Group, announced today at an event in New York City. The Watson Group will accelerate a new class of cognitive computing services, software and apps into the marketplace that analyze, improve by learning, and discover answers and insights to complex questions from massive amounts of disparate data.

IBM is investing $1 billion in the IBM Watson Group, with $100 million in venture capital earmarked for new Watson apps, and a shiny new Watson headquarters in New York's East Village neighborhood.

IBM wants to transform Watson into a Siri for business. The platform is designed for users to ask Watson questions, with Watson giving answers--such as medical diagnoses for hard-to-diagnose diseases, or the likely outcome of business decisions--on the spot.



December 16, 2013

Briefly Profitable Litecoin mining via Amazon Cloud Services and clever coding

Litecoin is an alternative to Bitcoin (BTC) that is designed to reduce the comparative advantage of custom ASICs (or GPUs) over conventional CPUs for mining. As a latecomer, its future is even less certain than BTC's, of course, but the technically interesting bits lie in its proof-of-work hash function: scrypt. Scrypt is designed to be "memory-hard" in addition to being computationally hard.
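The scrypt proof-of-work itself is easy to reproduce with Python's standard library. A minimal sketch, using the parameters Litecoin is generally documented to use (N=1024, r=1, p=1, 32-byte digest, with the 80-byte block header serving as both password and salt); the demo header and toy target are placeholders, not real chain data:

```python
import hashlib

# Litecoin-style scrypt proof-of-work hash (N=1024, r=1, p=1, 32-byte digest).
# The 80-byte block header is used as both the password and the salt.
def scrypt_pow_hash(header: bytes) -> bytes:
    return hashlib.scrypt(header, salt=header, n=1024, r=1, p=1, dklen=32)

# A share is valid when the hash, read as a little-endian integer, falls
# below the target implied by the current difficulty (toy target here).
def meets_target(header: bytes, target: int) -> bool:
    return int.from_bytes(scrypt_pow_hash(header), "little") < target

demo_header = b"\x00" * 80  # placeholder header, not a real Litecoin block
digest = scrypt_pow_hash(demo_header)
print(digest.hex())
```

The memory-hardness comes from the N=1024 parameter forcing roughly 128 KB of working state per hash, which is what narrows the gap between CPUs and dedicated hardware.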

Just before a spike in litecoin prices, a developer spent a week and a half writing a better scrypt miner for Nvidia cards using the Kepler architecture. As a result, as of this writing, his code was about 30% to 40% faster than any of the public code for scrypt hashing on Nvidia cards. The core improvements: it went from about 150 kh/s to about 220 kh/s, and its CPU use dropped from 30% to 0.1%.

With the GPU hitting 215-220 kh/s and the CPU adding 35 kh/s (remember, the CPU was now idle), he figured he could conservatively get 230 kh/s. Assuming a 10% overhead in converting LTC to USD, and taking a conservative view of the likely sale price (there is a lag between mining coins and selling them), mining on Amazon would produce about $50-$75 per month per instance.

November 14, 2013

IBM announces Watson Developers Cloud for a new era of cognitive Apps starting in 2014

IBM today announced that, for the first time, it will make its IBM Watson technology available as a development platform in the cloud, to enable a worldwide community of software application providers to build a new generation of apps infused with Watson's cognitive computing intelligence.

The move aims to spur innovation and fuel a new ecosystem of entrepreneurial software application providers – ranging from start-ups and emerging, venture capital backed businesses to established players. Together with IBM, these business partners share a vision for creating a new class of cognitive applications that transform how businesses and consumers make decisions.

To bring this shared vision to life, IBM will be launching the IBM Watson Developers Cloud, a cloud-hosted marketplace where application providers of all sizes and industries will be able to tap into resources for developing Watson-powered apps. This will include a developer toolkit, educational materials and access to Watson's application programming interface (API).

IBM partners that build Watson-powered apps in the cloud will be able to choose from two sources of data-driven content, to prepare their apps to uncover insights for users. App providers can use their own company’s data, or access the IBM Watson Content Store, featuring third-party content that offers data-rich resources that can fuel Watson’s ever expanding knowledge.



October 14, 2013

GE gets traction with the industrial internet which is incorporating sensors, big data and cloud to make industry more efficient

On Wednesday GE said that it has brought in $290 million so far this year from products built using this industrial internet philosophy and booked an anticipated $400 million in revenue. That may not seem like much for a company that had sales of $147.36 billion last year, but this is a two-year-old effort inside GE. GE also expanded the line of product offerings it has in its Predictivity line from 10 to 24, announced Intel, Cisco and AT&T as its latest partners and detailed its platform for building out the industrial internet, called Predix. Think of it as Amazon Web Services for the industrial internet.

* Predix is a platform for industrial applications. Applications can be built for any system or machine — from jet engines to MRI scanners — and be remotely managed while connected to the internet. So far there are four components to the platform, for the sensors themselves, analytics, management of the connected devices, and a vague one called Predix Experience.

* Next year GE plans to offer a developer program that lets third parties integrate Predix platform technologies into their own services.

* Of the new partners, AT&T will handle connectivity via cellular, wireline and perhaps even Wi-Fi management techniques courtesy of AT&T’s Wayport division.

* Cisco and GE will continue an existing business relationship to “include collaboration in industries that may include oil and gas, transportation, healthcare, and power generation.”

* GE says it will work with Intel to “embed virtualization and cloud-based, standardized interfaces within the GE Predix platform.”

The Predix platform and Predictivity products are simply a way for GE to get even better data while offering the real advantages of the internet of things to its clients.

In November, 2012, GE announced it would invest $1.5 billion in efforts to fine-tune its machines’ performance and capture big efficiency gains by connecting them to its enterprise software and to the wider Internet. GE thinks that cheaper computing power and sensors are now poised to usher in a new era of big data for industry. Jeff Immelt, GE’s CEO, has called the idea a revolution, and the company’s top economist has suggested it could help increase worker productivity by as much as 1.5 percent a year.
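The 1.5-percent-a-year figure compounds into roughly the multi-decade income gains GE's Industrial Internet report projects. A quick check, assuming a 20-year horizon:

```python
# Compounding check: up to 1.5%/year productivity growth over 20 years.
annual_gain = 0.015
years = 20
cumulative = (1 + annual_gain) ** years - 1
print(f"{cumulative:.1%}")  # ~34.7% cumulative over two decades
```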

May 28, 2013

Seven Multi-trillion dollar technologies

The McKinsey Global Institute identifies 12 technologies that could drive truly massive economic transformations and disruptions in the coming years. Applications of the 12 technologies discussed in the report could have a potential economic impact between $14 trillion and $33 trillion a year in 2025. This estimate is neither predictive nor comprehensive. It is based on an in-depth analysis of key potential applications and the value they could create in a number of ways, including the consumer surplus that arises from better products, lower prices, a cleaner environment, and better health. Some technologies detailed in the report have been gestating for years and thus will be familiar.

1. Mobile Internet $3.7-10.5 trillion
2. Automation of knowledge work $5.2-6.7 trillion
3. Internet of things $2.7-6.2 trillion
4. Cloud $1.7-6.2 trillion
5. Advanced robotics $1.7-4.5 trillion
6. Autonomous or near-autonomous cars $0.2-1.9 trillion
7. Next generation genomics $0.7-1.6 trillion

These can be compared to the GDP expected to be added by leading countries (without currency or inflation adjustment):
1. China $10-20 trillion
2. United States $4-7 trillion
3. India $2-4 trillion
4. Indonesia $1.5-2.5 trillion
5. Brazil $1-2 trillion
6. Japan $1-2 trillion
7. South Korea $1 trillion
8. Mexico $1 trillion
9. Russia $1 trillion

Worldbank 2011 report on global development horizons

McKinsey forecasted urban GDP growth by country


May 05, 2013

Keeping the internet going when there is a crisis or where connectivity and power are not reliable

The people behind Ushahidi, a software platform for communicating information during a crisis, have now developed what they are dubbing a “backup generator for the Internet”—a device that can connect with any network in the world, provide eight hours of wireless connectivity on battery power, and can be programmed for new applications, such as remote sensing.

The gadget, dubbed BRCK—slated to be unveiled Monday at a conference in Berlin—is a Wi-Fi router that can serve as many as 20 devices when there is an Internet connection. In other contexts it can act as a 3G or 4G modem with data settings that work on any network in the world—just swap in whatever prepaid SIM card you need.

The BRCK connects to a cloud-based server that lets any BRCK user monitor its performance remotely and manage alerts; leave one at home, for example, and it can send you a text message when the power goes out. The device is also programmable, apps can be written for it, and it comes with up to 16 gigabytes of storage. Plug in a camera or other sensor and it’s a monitoring device.

The BRCK was prototyped over the past nine months. To fund the manufacture of the first 1,000 gadgets, the team is planning a fundraising campaign on Kickstarter.


This prototype wireless communications device has eight hours of battery power.

March 25, 2013

73.7 terabits per second and 30% lower latency using new hollow core fiber design

Researchers at the University of Southampton in England have produced optical fibers that can transfer data at 99.7% of the universe’s speed limit: the speed of light. The researchers have used these new optical fibers to transfer data at 73.7 terabits per second — roughly 10 terabytes per second, and some 1,000 times faster than today’s state-of-the-art 40-gigabit fiber optic links — and at much lower latency.

The researchers overcame issues with vacuum transmission by fundamentally improving the hollow core design, using an ultra-thin photonic-bandgap rim. This new design enables low loss (3.5 dB/km), wide bandwidth (160nm), and latency that blows the doors off normal optic fiber — light, and thus the data, really is travelling 31% faster down this new hollow fiber. To achieve the transmission rate of 73.7 terabits per second, the researchers used wavelength-division multiplexing (WDM) to transmit 37 40-gigabit signals down the hollow fiber. As far as we’re aware, this is one of the fastest ever transmission rates in the lab.

As for real-world applications, loss of 3.5 dB/km is okay, but it won’t be replacing normal glass fiber any time soon. For short stretches, though, such as in data centers and supercomputer interconnects, these speed-of-light fibers could provide a very significant speed and latency boost.
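The latency gain is easy to quantify. In the sketch below, the group index of 1.45 assumed for conventional silica fiber is a typical textbook value (and consistent with the ~31% speed difference above); the 1,000 km link length is an illustrative choice:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def one_way_latency_ms(distance_km, speed_m_s):
    """One-way propagation delay in milliseconds."""
    return distance_km * 1_000 / speed_m_s * 1_000

v_glass = C / 1.45    # typical group index for silica fiber (assumed)
v_hollow = 0.997 * C  # the hollow-core result from the paper

d = 1_000  # km, e.g. a long-haul link (illustrative)
t_glass = one_way_latency_ms(d, v_glass)
t_hollow = one_way_latency_ms(d, v_hollow)
print(f"glass: {t_glass:.2f} ms, hollow: {t_hollow:.2f} ms")
# Light in standard fiber travels roughly 31% slower than in the hollow core:
print(f"slowdown: {1 - v_glass / v_hollow:.1%}")
```

Over 1,000 km that works out to roughly 1.5 ms saved each way, which is exactly the kind of margin latency-sensitive trading and interconnect applications pay for.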

This will be rapidly adopted because the lower latency will be very valuable for computerized financial trading. It will also go into supercomputer data centers and cloud computing centers.


Nature Photonics - Towards high-capacity fibre-optic communications at the speed of light in vacuum


March 14, 2013

Fujitsu Develops First Optical Transmission Technology to Achieve 100 Gbps using conventional components

Fujitsu Laboratories Limited and Fujitsu Research and Development Center Co., Ltd. of China today announced the development of the world's first optical-transmission technology that can achieve 100 Gbps transmission speeds using widely available, conventional components intended for 10 Gbps networking.

Increasing data-transfer rates has typically required new components designed for those higher speeds, with which existing components have not been compatible. Moreover, there is a limit to the speed improvements that can be achieved with transmission methods using the simple modulation and demodulation formats that have been used to date. Fujitsu Laboratories and Fujitsu R&D Center have applied a Discrete Multi-Tone (DMT) modulation/demodulation format using digital signal processing (DSP) to transmit at 100 Gbps per channel using conventional components intended for transmission speeds of 10 Gbps per channel.

Applying this technology to an optical transceiver with four channels would result in a 400 Gbps Ethernet transceiver, which is needed in the next generation of datacenters to increase their data transmission speeds and processing capacity to better support cloud services.

Application for this technology

March 06, 2013

Google's systems send out alerts when they are down to their last few petabytes

One of the best-kept secrets of Google’s rapid evolution into the most dominant force on the web is a software system called Borg. Google has been using the system for a good nine or ten years, and John Wilkes and his team are now building a new version of the tool, codenamed Omega.

Borg is a way of efficiently parceling work across Google’s vast fleet of computer servers, and according to Wilkes, the system is so effective, it has probably saved Google the cost of building an extra data center. Yes, an entire data center. That may seem like something from another world — and in a way, it is — but the new-age hardware and software that Google builds to run its enormous online empire usually trickles down to the rest of the web. And Borg is no exception.

Google's systems are big. Google engineers might receive an emergency alert because a system that stores data is down to its last few petabytes of space. In other words, billions of megabytes can flood a fleet of Google machines in a matter of hours.

Google’s system provides a central brain for controlling tasks across the company’s data centers. Rather than building a separate cluster of servers for each software system — one for Google Search, one for Gmail, one for Google Maps, etc. — Google can erect a cluster that does several different types of work at the same time. All this work is divided into tiny tasks, and Borg sends these tasks wherever it can find free computing resources, such as processing power or computer memory or storage space.

Wilkes says it’s like taking a massive pile of wooden blocks — blocks of all different shapes and sizes — and finding a way to pack all those blocks into buckets. The blocks are the computer tasks. And the buckets are the servers. The trick is to make sure you never waste any of the extra space in the buckets.

“If you just throw the blocks in the buckets, you’ll either have a lot of building blocks left over — because they didn’t fit very well — or you’ll have a bunch of buckets that are full and a bunch that are empty, and that’s wasteful,” Wilkes says.
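Wilkes's blocks-and-buckets analogy is essentially bin packing. A first-fit-decreasing sketch (the task demands and server capacities below are hypothetical illustrative numbers, not Borg internals):

```python
# First-fit-decreasing bin packing in the spirit of Wilkes's analogy.
# Task demands and server capacities are hypothetical illustrative numbers.
def schedule(tasks, capacity):
    """Pack (cpu, mem) task demands onto servers of a fixed capacity."""
    servers = []  # each server tracks its remaining (cpu, mem)
    # Placing the biggest blocks first wastes less bucket space
    # than placing them in arbitrary order.
    for cpu, mem in sorted(tasks, reverse=True):
        for s in servers:
            if s["cpu"] >= cpu and s["mem"] >= mem:  # first fit
                s["cpu"] -= cpu
                s["mem"] -= mem
                s["tasks"].append((cpu, mem))
                break
        else:  # no existing server fits: start a new one
            servers.append({"cpu": capacity[0] - cpu,
                            "mem": capacity[1] - mem,
                            "tasks": [(cpu, mem)]})
    return servers

tasks = [(2, 4), (1, 1), (3, 2), (2, 2), (1, 3), (4, 4)]
placement = schedule(tasks, capacity=(4, 8))
print(len(placement), "servers used")
```

Real cluster schedulers juggle many more dimensions (priorities, preemption, failure domains), but the core economics are the same: every bucket you avoid opening is a server, or a data center, you do not have to build.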


Rather than run separate software systems on separate server clusters, Google can run everything on one cluster — thanks to Borg and its successor, Omega. Illustration: Ross Patton


February 23, 2013

Sony Playstation 4 Will Bring Movie-Like CGI with Motion Control Interaction

Sony Playstation 4 will bring movie-like CGI combined with superior motion control interaction.

Simple User Experience Enabled by technology

The PS4 will offer a low-power sleep state, so it will have instant-on capabilities. There will also be background downloading when the console is asleep or even powered off. For DLC, gamers will be able to start playing once the download starts.

Sony will have a second chip for uploads and downloads. Games will be playable even while they are downloading.




Sharing

They want to make sharing of video clips of games as easy as screenshots today.

Friends can take over your controller to help you get through levels that you are having trouble with.

Games

Blizzard is bringing Diablo 3 to the Playstation 3 and Playstation 4.
There is a new game called Knack.

Bungie (the makers of Halo) is bringing the first-person shooter Destiny to the PS4. They also call it the first "Shared World Shooter".





January 23, 2013

Open Source Web Crawl Copy of the Internet

A nonprofit called Common Crawl is now using its own Web crawler and making a giant copy of the Web that it makes accessible to anyone. The organization offers up over five billion Web pages, available for free so that researchers and entrepreneurs can try things otherwise possible only for those with access to resources on the scale of Google’s.

The ccBot crawler is a distributed crawling infrastructure that makes use of the Apache Hadoop project. We use Map-Reduce to process and extract crawl candidates from our crawl database. This candidate list is sorted by host (domain name) and then distributed to a set of spider (bot) servers. We do not use Nutch for the purposes of crawling, but instead utilize a custom crawl infrastructure to strictly limit the rate at which we crawl individual web hosts. The resulting crawl data is then post processed (for the purposes of link extraction and deduplication) and then reintegrated into the crawl database.
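The host-sorted, rate-limited crawl described above can be sketched with a bucketing step plus a per-host politeness limiter. The 2-second delay and example URLs below are illustrative, not Common Crawl's actual settings:

```python
import time
from collections import defaultdict
from urllib.parse import urlparse

# Sketch of the host-sorted, politeness-limited crawl described above.
# The 2-second delay is an illustrative setting, not Common Crawl's value.
def group_by_host(urls):
    """Bucket crawl candidates by host, mirroring the host-sorted candidate list."""
    buckets = defaultdict(list)
    for url in urls:
        buckets[urlparse(url).netloc].append(url)
    return buckets

class HostRateLimiter:
    """Strictly limit how often any single web host is fetched."""
    def __init__(self, min_delay_s=2.0):
        self.min_delay_s = min_delay_s
        self.last_fetch = {}  # host -> timestamp of the last fetch

    def ready(self, url, now=None):
        now = time.monotonic() if now is None else now
        host = urlparse(url).netloc
        return now - self.last_fetch.get(host, float("-inf")) >= self.min_delay_s

    def record(self, url, now=None):
        now = time.monotonic() if now is None else now
        self.last_fetch[urlparse(url).netloc] = now

urls = ["http://example.com/a", "http://example.com/b", "http://example.org/c"]
print({host: len(batch) for host, batch in group_by_host(urls).items()})
```

Sorting candidates by host before distributing them to spider servers means each server can enforce this kind of limit locally, without coordinating with the rest of the fleet.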

They store the crawl data on Amazon’s S3 service, allowing it to be bulk downloaded as well as directly accessed for map-reduce processing in EC2.

Technology Review discusses the open web crawl.

December 11, 2012

Intel has 6 watt Atom chip with 64 bit server capabilities

The Intel Atom S1200 server system-on-chip hits low power levels (6 watts) and includes key features such as error-correcting code (ECC) memory, 64-bit support, and the virtualization technologies required for use inside data centers. The Intel Atom processor S1200 is shipping today to customers, with a recommended customer price starting at $54 in quantities of 1,000 units. Intel introduced three processors within the new Intel Atom S1200 product family, with actual TDPs as follows: Intel Atom S1260 (8.5 watts), Intel Atom S1240 (6.1 watts) and Intel Atom S1220 (8.1 watts).

More from Intel in 2013
Intel is working on the next generation of Intel Atom processors for extreme energy efficiency codenamed "Avoton." Available in 2013, Avoton will further extend Intel's SoC capabilities and use the company's leading 3-D Tri-gate 22 nm transistors, delivering world-class power consumption and performance levels.

For customers interested in low-voltage Intel® Xeon® processor models for low-power servers, storage and networking, Intel will introduce the new Intel Xeon processor E3 v3 product family based on the "Haswell" microarchitecture next year. These new processors will take advantage of new energy-saving features in Haswell and provide balanced performance-per-watt, giving customers even more options.

November 27, 2012

“Industrial Internet” Report From GE Finds That Combination of Networks and Machines Could Add $10 to $15 Trillion to Global GDP

GE - the Industrial Revolution radically changed the way we use energy and make things. The Internet Revolution altered how we communicate, consume information, and spend money. A combination of these two transformations, called the Industrial Internet, now links networks, data and machines. It promises to remake global industry, boost productivity, and launch an entirely new age of prosperity and robust growth.

The authors found that in the U.S. alone the Industrial Internet could boost average incomes by 25 to 40 percent over the next 20 years and lift growth back to levels not seen since the late 1990s. If the rest of the world achieved half of the U.S. productivity gains, the Industrial Internet could add from $10 to $15 trillion to global GDP – the size of today’s U.S. economy – over the same period. “With better health outcomes at lower cost, substantial savings in fuel and energy, and better performing and longer-lived physical assets, the Industrial Internet will deliver new efficiency gains, accelerating productivity growth the way that the Industrial Revolution and the Internet Revolution did,” Evans and Annunziata write. “These innovations promise to bring greater speed and efficiency to industries as diverse as aviation, rail transportation, power generation, oil and gas development, and health care delivery. It holds the promise of stronger economic growth, better and more jobs and rising living standards, whether in the US or in China, in a megacity in Africa or in a rural area in Kazakhstan.”

The full 37 page report is here



November 20, 2012

Five IT Predictions for China in 2013 and Beyond from Gartner

Enterprise spending on IT in China is forecast to grow from US$117.8 billion in 2013 to reach $172.4 billion in 2016, representing a compound annual growth rate of 8 percent, compared to a global growth rate of 3 percent over the same period, according to Gartner.

“In common with many emerging markets, cloud and mobile initiatives are hot and enterprises are also making progress in adopting virtualization technologies, a key stepping stone in the journey to cloud,” said Matthew Cheung, principal research analyst at Gartner. “Without the legacy systems that hamper many western enterprises, Chinese organizations have an opportunity to leapfrog in the adoption of new technologies.”

1. By 2013, Lenovo will become the top smartphone vendor in China.

Lenovo is the world’s top PC manufacturer, and the company’s mobile phone business has gained real momentum in China. Its smartphone market share rose from 1.7 percent in 3Q11 to 14.8 percent in 3Q12, making it the No. 2 smartphone brand, ahead of Apple (6.9 percent) and behind Samsung (16.7 percent). The brand is positioned at the mid-to-lower end, which will drive much of its future growth, and this is where global brands are less competitive.

November 12, 2012

Google and Acer announce $199 C7 Chromebook

Zdnet - Google is continuing its push to build the Chromebook market with the latest model by Acer. The Acer C7 Chromebook is available for $199.

The $199 model weighs about 25 percent more (3.05 pounds instead of 2.43 pounds) and gets 3.5 hours of battery life instead of 6.5 hours for the $249 Samsung Chromebook.

The Acer Chromebook can be bought at Google Play

Google has been seriously pushing the Chrome OS with the recent partnership with Samsung to offer the new Samsung Chromebook. That device lowered the price of a Chromebook to just $249. Continuing that push is the new Acer C7 Chromebook just released for a mere $199. Like the Samsung model before, Google is also offering free goodies worth more than the price tag of the Acer C7 Chromebook.

The new C7 from Acer weighs in at 3.05 pounds and is only an inch thick. In that slim body are an Intel Celeron processor, 2 GB of memory and -- surprising for a Chromebook -- a 320GB hard drive. The 11.6-inch display is capable of HD video, and the quoted battery life is over 3.5 hours.

Google is offering two years of 100GB of free storage in the cloud for new purchasers of the Acer, along with 12 free Gogo Internet passes. Those two offers are together worth more than the price of the new Acer Chromebook, so you could say Google will pay you to get one.


August 27, 2012

IBM Watson 2.0 for your smartphone and tablet for mobile assistance and analytics

Business Week - International Business Machines Corp. (IBM) researchers spent four years developing Watson, the computer smart enough to beat the champions of the quiz show “Jeopardy!” Now they’re trying to figure out how to get those capabilities into the phone in your pocket.

Finding additional uses for Watson is part of IBM’s plan to tap new markets and boost revenue from business analytics to $16 billion by 2015. After mastering history and pop culture for its “Jeopardy!” appearance, the system is crunching financial information for Citigroup Inc. and cancer data for WellPoint Inc. The next version, dubbed “Watson 2.0,” would be energy-efficient enough to work on smartphones and tablets.

IBM expects to generate billions in sales by putting Watson to work in finance, health care, telecommunications and other areas.

It takes a while for Watson to do the “machine learning” necessary to become a reliable assistant in an area. Watson’s deal with WellPoint (WLP) was announced in September of last year, and the system won’t master the field of oncology until at least late 2013.

Researchers also need to add voice and image recognition to the service so that it can respond to real-world input, said Katharine Frase, vice president of industry research at Armonk, New York-based IBM.

August 22, 2012

Ferroelectric materials could bring down cost of cloud computing and electronic devices

A new class of organic materials developed at Northwestern University boasts a very attractive but elusive property: ferroelectricity. The crystalline materials also have a great memory, which could be very useful in computer and cellphone memory applications, including cloud computing.

A team of organic chemists discovered they could create very long crystals with desirable properties using just two small organic molecules that are extremely attracted to each other. The attraction between the two molecules causes them to self-assemble into an ordered network -- order that is needed for a material to be ferroelectric.

The starting compounds are simple and inexpensive, making the lightweight materials scalable and very promising for technology applications. In contrast, conventional ferroelectric materials -- special varieties of polymers and ceramics -- are complex and expensive to produce. The Northwestern materials can be made quickly and are very versatile.

In addition to computer memory, the discovery of the Northwestern materials could potentially improve sensing devices, solar energy systems and nanoelectronics.

Nature - Room-temperature ferroelectricity in supramolecular networks of charge-transfer complexes


Crystal structures of LASO complexes

July 15, 2012

DOE funds pre-Exaflop technology development from Intel, AMD, Nvidia and Whamcloud to set the stage for Main Exaflop Supercomputer project

HPCWire - Intel, AMD, NVIDIA, and Whamcloud have been awarded tens of millions of dollars by the US Department of Energy (DOE) to kick-start research and development required to build exascale supercomputers. The work will be performed under the FastForward program, a joint effort run by the DOE Office of Science and the National Nuclear Security Administration (NNSA) that will focus on developing future hardware and software technologies capable of supporting such machines.

Although we're only six to eight years away from the first exaflops systems, the DOE's primary exascale program has yet to be funded. (And since this is an election year in the US, such funding will probably not fall into place until 2013.) In the interim, FastForward was devised in order to begin the needed R&D for some of the exascale foundational technologies, in particular, processors, memory and storage.

At least some of the impetus for the program came from the vendors themselves. According to Mark Seager, Intel's CTO for the company's High Performance Computing Ecosystem group, the DOE was told by multiple commercial partners that research for the component pieces needed to get underway this year if they hoped to field an exascale machine by 2020. That led to the formation of the program, and apparently there was enough loose change rolling around at the Office of Science and NNSA to fund this more modest effort.

Although all the FastForward subcontracts have yet to be made public, as of today there are four known awards:

* Intel: $19 million for both processor and memory technologies
* AMD: $12.6 million for processor and memory technologies
* NVIDIA: $12 million for processor technology
* Whamcloud (along with EMC, Cray and HDF Group): Unknown dollar amount for storage and I/O technologies
