
IBM Moves Closer to Creating Computer Based on the Brain

At the Supercomputing 2009 conference, IBM announced significant progress toward creating a computer system that simulates and emulates the brain’s abilities for sensation, perception, action, interaction and cognition, while rivaling the brain’s low power consumption and compact size. Scientists at IBM Research and Lawrence Berkeley National Lab have performed the first near-real-time cortical simulation of the brain that exceeds the scale of a cat cortex, containing 1 billion spiking neurons and 10 trillion individual learning synapses. The simulation ran on Lawrence Livermore National Lab’s Dawn Blue Gene/P supercomputer with 147,456 CPUs and 144 terabytes of main memory.
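The simulation described above belongs to the class of spiking-neuron models. A toy leaky integrate-and-fire (LIF) network sketches the idea at miniature scale; this is only an illustrative stand-in, not IBM's actual simulator, and every parameter below is invented:

```python
import numpy as np

# Toy leaky integrate-and-fire network: neurons integrate input with a
# leak, fire a spike when they cross a threshold, reset, and propagate
# the spike through a synaptic weight matrix. Illustrative parameters only.
rng = np.random.default_rng(0)
n = 100                          # neurons (the real run used ~1.6 billion)
w = rng.normal(0, 0.04, (n, n))  # synaptic weights
v = np.zeros(n)                  # membrane potentials
tau, v_thresh = 0.9, 1.0         # leak factor and firing threshold

total_spikes = 0
for step in range(200):
    i_ext = rng.random(n) * 0.25     # random external input current
    v = tau * v + i_ext              # leaky integration
    spikes = v >= v_thresh           # neurons that fire this step
    v[spikes] = 0.0                  # reset fired neurons
    v += w @ spikes                  # propagate spikes through synapses
    total_spikes += int(spikes.sum())

print(total_spikes)
```

Even this toy version shows why the real simulation is so demanding: every step touches all N² synapses, and IBM's run had 10 trillion of them.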

IBM scientists have also collaborated with researchers from Stanford University to develop an algorithm that exploits the Blue Gene® supercomputing architecture in order to noninvasively measure and map the connections between all cortical and sub-cortical locations within the human brain using magnetic resonance diffusion weighted imaging. Mapping the wiring diagram of the brain is crucial to untangling its vast communication network and understanding how it represents and processes information.
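One common way a "wiring diagram" is represented: tractography on diffusion-weighted images yields streamlines (fiber paths), and counting streamline endpoints between brain regions gives a region-to-region connectivity matrix. The regions and streamlines below are hypothetical, and this is a generic sketch of the representation, not the Stanford/IBM algorithm itself:

```python
import numpy as np

# Hypothetical sketch: building a connectivity matrix from tractography
# streamlines. Each streamline is recorded as (start_region, end_region);
# the labels and counts here are made up for illustration.
n_regions = 4
streamlines = [(0, 1), (0, 1), (1, 2), (2, 3), (3, 0), (1, 2), (0, 2)]

conn = np.zeros((n_regions, n_regions), dtype=int)
for a, b in streamlines:
    conn[a, b] += 1
    conn[b, a] += 1   # treat connections as undirected

print(conn)  # symmetric matrix; entry [i, j] counts fibers between i and j
```

Scaling this from four toy regions to every cortical and sub-cortical location is what requires a supercomputing architecture.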

These advancements will provide a unique workbench for exploring the computational dynamics of the brain, moving the team closer to its goal of building a compact, low-power synaptronic chip using nanotechnology. This work stands to break the mold of conventional computing, creating a new paradigm to meet the system requirements of the instrumented and interconnected world of tomorrow.

Traditional Computing              Cognitive Computing
Stored program model               Replicated neurons and synapses
Digital                            Mixed-mode analog-digital
Synchronous                        Asynchronous
Serial                             Parallel
Centralized                        Distributed
Hardwired circuits                 Reconfigurable
Explicit memory addressing         Implicit memory addressing
Over-writes data                   Updates state when info changes
Separates computation from data    Blurs data/computation boundary
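One row of the comparison above, updating state only when information changes rather than overwriting data on every write, can be illustrated with a minimal event-driven cell. This is purely illustrative; the class and its behavior are invented for the example:

```python
# A cell that reacts only to change, in the spirit of the
# "updates state when info changes" column. Purely illustrative.
class EventDrivenCell:
    def __init__(self):
        self.state = 0.0
        self.updates = 0   # counts real state changes, not writes

    def observe(self, value):
        if value != self.state:    # ignore redundant input
            self.state = value
            self.updates += 1

cell = EventDrivenCell()
for v in [1.0, 1.0, 1.0, 2.0, 2.0]:
    cell.observe(v)

print(cell.state, cell.updates)  # five writes arrived, only two changed state
```

In a conventional store, all five writes would cost the same; here work is proportional to how often the input actually changes, which is the efficiency argument behind event-driven, brain-like designs.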

As the amount of digital data we create continues to grow massively and the world becomes more instrumented and interconnected, there is a need for new computing systems with intelligence that can spot patterns in diverse digital and sensor data; analyze and integrate information in real time and in context; and deal with the ambiguity found in complex environments.

Businesses will simultaneously need to monitor, prioritize, adapt and make rapid decisions based on ever-growing streams of critical data and information. A cognitive computer could quickly and accurately put together the disparate pieces of this complex puzzle, while taking into account context and previous experience, to help business decision makers come to a logical response.

“Learning from the brain is an attractive way to overcome power and density challenges faced in computing today,” said Josephine Cheng, IBM Fellow and lab director of IBM Research – Almaden. “As the digital and physical worlds continue to merge and computing becomes more embedded in the fabric of our daily lives, it’s imperative that we create a more intelligent computing system that can help us make sense of the vast amount of information that’s increasingly available to us, much the way our brains can quickly interpret and act on complex tasks.”

IBM and its university partners were recently awarded $16.1 million in additional funding from the Defense Advanced Research Projects Agency (DARPA) for Phase 1 of its Systems of Neuromorphic Adaptive Plastic Scalable Electronics (SyNAPSE) initiative. This phase of research will focus on the components, brain-like architecture and simulations to build a prototype chip. The long-term mission of IBM’s cognitive computing initiative is to discover and demonstrate the algorithms of the brain and deliver low-power, compact cognitive computers that approach mammalian-scale intelligence and use significantly less energy than today’s computing systems.

“The goal of the SyNAPSE program is to create new electronics hardware and architecture that can understand, adapt and respond to an informative environment in ways that extend traditional computation to include fundamentally different capabilities found in biological brains,” said DARPA program manager Todd Hylton, PhD. Technical insight and more details on the SyNAPSE project can be found on the Cognitive Computing blog.

November 18, 2009 at 11:36 pm

Microsoft Offers Betas for Supercomputing Clusters

At Supercomputing 2009, Microsoft Corp. announced the availability of betas for Windows HPC Server 2008 R2 and distributed Microsoft Office Excel 2010 for the cluster. Together with the recently announced Microsoft Visual Studio 2010 Beta, which helps simplify parallel programming, these advances make it possible for more users to access supercomputing power through familiar technologies and tools such as Microsoft Office Excel, Windows Server and Visual Studio.

“Until now, the power of high-performance and parallel computing has largely been available to a limited subset of customers due to the complexity of environments and applications, as well as the challenges of parallel programming,” said Vince Mendillo, senior director of High Performance Computing at Microsoft. “Today, we’re seeing performance numbers that rival Linux … ISVs are seeing 30% to 40% performance improvements in the speed of their code on Windows HPC Server.”

Although multicore systems are becoming ubiquitous, few developers can build parallel applications that truly leverage the available resources. Visual Studio 2010 Beta 2 will help make parallel programming simpler and more efficient for a broad base of developers across both client and cluster workloads. In addition, by moving Microsoft Office Excel 2010 to the cluster, customers are seeing linear performance scaling of complex spreadsheets: workbooks that previously took weeks to complete now finish their calculations in a few hours.
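The weeks-to-hours improvement is roughly what near-linear scaling predicts when almost every cell in a workbook can be computed independently. A back-of-the-envelope Amdahl's-law sketch makes this concrete; the parallel fraction, node counts and runtime below are assumptions for illustration, not Microsoft's figures:

```python
# Amdahl's law: speedup on n nodes when a fraction p of the work
# parallelizes. Numbers below are illustrative assumptions.
def amdahl_speedup(p, n):
    """Overall speedup when fraction p of the work runs on n nodes."""
    return 1.0 / ((1.0 - p) + p / n)

# A hypothetical workbook that takes two weeks (336 hours) serially,
# with 99.9% of its cells independent of one another:
hours = 336.0
for nodes in (1, 64, 256):
    s = amdahl_speedup(0.999, nodes)
    print(nodes, round(hours / s, 1))  # estimated wall-clock hours
```

Under these assumptions a 256-node cluster brings the two-week workbook down to under two hours, consistent with the scaling behavior described in the article.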

Windows HPC Server 2008 R2 improvements include the following:

  • Improved scalability with support for deploying, running and managing clusters up to 1,000 nodes;
  • New configuration and deployment options such as diskless boot, mixed-version clusters and support for a remote head node database;
  • Improved system management, diagnostics and reporting with an enhanced heat map, multiple customizable tabs, an extensible diagnostic framework and the ability to create richer custom reports;
  • Improved support for service-oriented architecture (SOA) workloads, automatic restart and failover of broker nodes, and improved management, monitoring, diagnostics and debugging; and
  • New ways to accelerate Microsoft Office Excel workbooks such as support for Cluster-Aware User-Defined Functions and the capability to run distributed Excel 2010 for the cluster.

“Many frontline researchers, analysts and scientists desperately need access to more computational power than they currently have, but find it either difficult or too costly in time to gain access to expanded HPC resources. Windows HPC Server 2008 has been designed to address the needs of those wishing to expand their access to HPC, without requiring them to become computer programming experts,” said Earl Joseph, program VP, high-performance computing, IDC. “Microsoft’s latest investments in HPC and parallelism help to reduce the complexities of supercomputing, in particular making it easier to program and thereby making it more accessible to business, academia and government users.”

November 18, 2009 at 4:15 am
