Posts tagged ‘IBM’

IBM Uses IT to Boost Oilsands Efficiencies

IBM is using information technology to help energy companies extend the life of oilfields and make them more efficient and more environmentally friendly. IBM’s Oil Sands Centre of Excellence in Calgary has developed an “integrated information framework” that analyzes huge amounts of real-time data from oil operations to identify problems sooner.

“There’s no more easy oil,” said Andy MacRae, a partner with IBM’s Business Consulting Services unit in Calgary. “The next evolution in this trend is digital energy, which drives a significant improvement in the energy efficiency of oil operations.”

In February, Shell and IBM announced a collaboration to extend the life of oil and natural gas fields using complex analytics and simulations.

“Using predictive analytics to drive new intelligence into oil and natural gas reservoir management has the potential to extend the life of existing oil and gas fields in a responsible way,” said John Kelly III, Sr. VP and director of IBM Research.

IBM is helping oil companies become more efficient in many different ways. One oil and gas company in Calgary wanted to optimize the extraction of heavy bitumen from the oilsands, but found that the effectiveness of its extraction process varied significantly with the acidity or alkalinity of the water, changes in temperature, calcium content and ore quality.

IBM mapped and modeled patterns across multiple areas to show how to adjust the extraction process under various conditions. As a result, the efficiency of the entire operation improved, reducing both its energy consumption and its environmental footprint.
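
IBM has not published the details of that model, but the basic idea can be sketched in a few lines of Python. Everything below is illustrative only: a hypothetical table of historical water pH, temperature, calcium content and ore grade is regressed against measured recovery, and the fitted model is then used to pick the most promising process setting for the current conditions.

```python
import numpy as np

# Hypothetical historical process records: water pH, temperature (deg C),
# calcium content (ppm) and ore grade (% bitumen), with the measured
# extraction recovery (%) for each run. All values are invented.
X = np.array([
    [8.2, 50.0, 120.0, 10.5],
    [7.6, 45.0, 180.0,  9.8],
    [8.5, 55.0,  90.0, 11.2],
    [7.9, 48.0, 150.0, 10.1],
    [8.8, 52.0,  80.0, 11.0],
])
recovery = np.array([88.0, 81.0, 91.0, 84.0, 92.0])

# Fit a simple linear model, recovery ~ conditions, via least squares.
A = np.hstack([X, np.ones((X.shape[0], 1))])  # add intercept column
coef, *_ = np.linalg.lstsq(A, recovery, rcond=None)

def predicted_recovery(ph, temp, calcium, ore_grade):
    """Predict extraction recovery for a given set of process conditions."""
    return np.dot([ph, temp, calcium, ore_grade, 1.0], coef)

# Given today's ore quality and water chemistry, scan candidate process
# temperatures and pick the one the model expects to recover the most oil.
candidates = np.arange(40.0, 60.0, 1.0)
best = max(candidates, key=lambda t: predicted_recovery(8.0, t, 130.0, 10.0))
print(f"suggested process temperature: {best:.0f} C")
```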

“Not only the energy, but the waste products that get discharged were reduced significantly,” MacRae said. “If you can take more oil off in the extraction process, you end up with less in the tailings pond.”

In another example, an oilsands operator was managing its mine and upgrader “fairly well,” but the operation as a whole was not optimally efficient. After using an integrated software and automation process, routine maintenance was stepped up and scheduled in the least disruptive manner. The strategy included finding more efficient ways to plan shutdowns and improve turnaround time using condition-based monitoring through a series of sensors and fiber optic cables, as well as robotics.

“It means the reliability of the equipment is better and they run more efficiently and use less energy,” MacRae explained.
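
Condition-based monitoring of this kind reduces, at its core, to watching sensor trends against alert thresholds. The sketch below is a minimal, hypothetical version in Python; the threshold value and readings are invented for illustration and are not MacRae's or IBM's figures.

```python
from statistics import mean

# Illustrative vibration readings (mm/s RMS) from a sensor on a pump;
# the threshold is assumed for this sketch, not a vendor limit.
ALERT_LEVEL = 7.1      # assumed "unsatisfactory" vibration boundary
TREND_WINDOW = 24      # number of recent readings to average

def needs_maintenance(readings):
    """Flag a unit for scheduled maintenance when either the latest
    reading or the recent average crosses the alert threshold."""
    recent = readings[-TREND_WINDOW:]
    return readings[-1] >= ALERT_LEVEL or mean(recent) >= ALERT_LEVEL

readings = [3.1, 3.3, 3.2, 3.8, 4.5, 5.2, 6.0, 6.8, 7.3]
if needs_maintenance(readings):
    print("schedule pump for the next planned shutdown window")
```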

IBM spends about $6 billion on research and development each year, and oil companies can leverage some of that research to reduce the costs and environmental impact of oil and gas operations. As energy companies are forced to look for oil and gas in more difficult places, IT is expected to play an increasing role in how those resources are developed, MacRae predicted.

www.ibm.com

March 24, 2010 at 10:08 am

Shell, IBM Team Up to Extend Field Life

Shell and IBM have announced a research collaboration that aims to extend the life of oil and natural gas fields. Shell sees potential to reduce the time and money required to model its reservoirs. IBM’s long-standing analytics and simulation experience will meet Shell’s strong subsurface and reservoir expertise to create a more efficient, more accurate picture of energy recovery.

            The companies will explore advanced techniques for reconciling geophysical and reservoir engineering field data. As a result of applying improved algorithms, analytics and accelerated simulations, Shell can reduce the educated guesswork and extract natural resources with more certainty and efficiency, thereby optimizing the recovery of oil and gas.

            “This collaboration is remarkable,” said Gerald Schotman, Exec. VP of Shell Innovation, Research & Development. “Two industrial research giants are coming together to solve a very specific, real-world problem and make the most of oil and natural gas reservoirs. This will not be done through expensive, experimental facilities, but by bringing together a powerful team and powerful computers so we can be smarter than before.”

            The complex process of reconciling often-differing views of oil and natural gas fields can take several months to complete and involves measurements of production volumes, flow rates and pressures. For example, geophysicists must examine time-lapse seismic data from subsurface rock formations; reservoir engineers receive well and laboratory data, and geophysicists receive cross-well seismic tomography data covering wide spaces between the wells.

            Shell and IBM will reformulate and automate the task of reconciling the different data and create an enhanced, yet practical, mathematical optimization solution. This can improve the cost-effectiveness of the data inversion process and, once available, will become part of Shell’s proprietary reservoir modeling tool kits for application in new oil and natural gas developments as well as existing assets.
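
Neither company has published the formulation, but the underlying pattern, adjusting reservoir model parameters until simulated output matches observed field data, is a classic least-squares inversion. The toy sketch below, with an invented exponential-decline forward model and made-up production data, shows the shape of such a history-matching step in Python.

```python
import numpy as np
from scipy.optimize import least_squares

# Toy forward model: predicted production rate of a single well declines
# exponentially, controlled by two unknown reservoir parameters.
def forward_model(params, t):
    initial_rate, decline = params
    return initial_rate * np.exp(-decline * t)

# "Observed" field data (rates in bbl/day at monthly intervals), made up.
t_obs = np.arange(12)
q_obs = np.array([980, 900, 840, 770, 720, 660,
                  615, 570, 530, 490, 455, 425], dtype=float)

# Residual between simulated and observed data; the inversion adjusts the
# reservoir parameters until the mismatch is minimized.
def residuals(params):
    return forward_model(params, t_obs) - q_obs

fit = least_squares(residuals, x0=[1000.0, 0.05])
initial_rate, decline = fit.x
print(f"matched initial rate {initial_rate:.0f} bbl/day, "
      f"decline {decline:.3f} per month")
```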

            “Working with Shell is a prime example of the importance of collaborative research in the effort to build a smarter planet,” said John E. Kelly III, Sr. VP and Director of IBM Research. “Using predictive analytics to drive new intelligence into oil and natural gas reservoir management has the potential to extend the life of existing oil and gas fields in a responsible way.”

            As part of this Joint Development Agreement, IBM and Shell research scientists will work in several laboratories in both the US and the Netherlands.

www.ibm.com

www.shell.com


March 2, 2010 at 11:36 am

IBM Updates Lotus Sametime Videoconferencing App

At the annual Lotusphere conference, IBM announced the availability of new IBM Lotus Sametime features, including online meetings and expanded audio and video integration, that help make unified communications and collaboration (UC2) easier and more cost-effective.

            Available now, Lotus Sametime 8.5 offers a new online meeting experience that provides a consolidated calendar view and enables users to start or join a meeting with a single click. Users can easily invite participants to a meeting by dragging names from the instant messaging contact list and dropping them into the meeting. Participants can also accept meeting invitations with a single click and upload materials to the meeting with a drag-and-drop capability.

            Leading companies from around the world participated in a beta program to test the new Lotus Sametime 8.5 capabilities and experience its simplicity and cost savings.

            “For more than eight years, we have used Lotus Sametime to help our employees around the world connect and collaborate through a real-time, integrated communications platform,” said Thomas Eidenmueller of Merck KGaA. “With our planned deployment of Lotus Sametime 8.5 in the second quarter of 2010, we will be able to further lower travel expenses by running education sessions both internally and externally with our partners. This is an important upgrade that delivers simplified capabilities that will help us increase the use of meetings and connect to our Polycom video conferencing system to promote richer collaboration.”

            The latest version of Lotus Sametime features always-ready, reservation-less meetings that offer password-protected meeting rooms that are always available so that an online meeting can be started instantly. Documents such as meeting minutes and presentations can be stored in the personalized meeting room for future use. Additionally, standards-based audio and video integration makes it easier to interoperate with existing audio and video conferencing systems and improves utilization.

            In addition, Lotus Sametime now offers a new zero-download Web client built on a new Web 2.0 toolkit that makes it easier for businesses to embed Lotus Sametime capabilities into their applications and websites. For example, a business can now include presence, instant messaging, click-to-call and click-to-meet features on its website so that customers can start a conversation or ask questions with one click. This capability helps businesses reduce charges associated with toll-free customer support lines and improve customer service. Additionally, the zero-download Web client makes it easy for external users to participate in meetings.

            The new release of Lotus Sametime offers expanded mobility support with a new, browser-based Apple iPhone chat client, and an improved mobile client for Microsoft Windows Mobile devices.

www.ibm.com/software/lotus/unified-communications

January 27, 2010 at 8:49 am

New IBM Service Helps Companies Monitor Data Center Health

IBM announced the availability of online software as a monthly subscription service to help monitor, predict and prevent IT outages.

            Today even the smallest IT departments demand capabilities to identify where bottlenecks might occur, prevent them, and automate data center processes. IT staff need a central point of control to oversee the disparate parts of the data center while faced with shrinking capital budgets.

            To help meet that demand, IBM is introducing Tivoli Live Monitoring Services delivered on the IBM cloud to help companies manage the health and performance of their IT resources, including operating systems, virtualized servers, middleware and software applications. Tivoli Live Monitoring Services offers enterprise-class monitoring capabilities as a service – without the need to deploy hardware, purchase separate software licenses, or engage in extensive software configuration.

            The service helps quickly identify and address potential outages and bottlenecks that threaten application availability before they impact end users. When the service detects a potential problem, such as running out of resource capacity, it automatically alerts IT operations and displays the relevant information in a dashboard to help them analyze and correct the issue. Using IBM’s autonomic computing capabilities, the service can be programmed to automate tasks that enable the affected system to “self-heal” when faced with certain issues.
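
Tivoli Live's internals are not public, but the monitor-alert-remediate loop it describes can be sketched generically. In the hypothetical Python below, a disk-capacity check stands in for "running out of resource capacity", and the self-heal step is a placeholder for whatever remediation an operator scripts.

```python
import shutil
import time

# Assumed thresholds for this sketch; a real service would load these
# from a monitoring policy rather than hard-code them.
DISK_WARN_FRACTION = 0.90
CHECK_INTERVAL_SECONDS = 60

def disk_used_fraction(path="/"):
    """Return the used-space fraction for a filesystem."""
    usage = shutil.disk_usage(path)
    return usage.used / usage.total

def self_heal():
    """Placeholder remediation, e.g. rotating logs or clearing a cache;
    the real 'self-heal' action would be whatever the operator scripted."""
    print("running automated cleanup task...")

while True:
    used = disk_used_fraction()
    if used >= DISK_WARN_FRACTION:
        print(f"ALERT: disk {used:.0%} full -- notifying IT operations")
        self_heal()
    time.sleep(CHECK_INTERVAL_SECONDS)
```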

            “With digital information as the lifeblood of more organizations, even the smallest companies or divisions consider the data center’s functionality mission-critical,” said Al Zollar, general manager of IBM Tivoli. “With this new service, IBM is delivering our smartest data center software in which businesses choose and pay for what they need. It’s so easy that we expect most companies can sign up for it on Monday and have it running by Friday. The simplicity is an attractive addition to our service management portfolio.”

            IBM also provides these service management capabilities as on-premise software, managed services and software appliances.

            With Tivoli Live Monitoring Services, customers can access pre-configured and dedicated instances of IBM Tivoli Monitoring 6.2.1, IBM Tivoli Monitoring for Microsoft Applications 6.2 and IBM Tivoli Composite Application Manager for Applications 6.2. The service supports up to 500 monitored resources such as operating systems, applications and devices, offers 24×7 phone and e-mail support, and includes extensive self-help content to help customers get up and running quickly. Services include:

  • Touchless Monitoring – Agent-less monitoring per operating system and/or device;
  • Distributed Monitoring – Agent-based operating system and application monitoring per operating system and/or application; and
  • Performance Services – Historical reporting per operating system useful for capacity planning.

            The offerings are priced per service or monitored element on a monthly basis. There is a one-time setup fee for on-boarding costs. Contracts have a 90-day minimum, with terms running from one to three years. The operating systems supported include Linux, AIX, HP-UX and Microsoft Windows.

www.ibm.com/software/tivoli/products/monitor/


December 11, 2009 at 12:39 am

IBM Moves Closer to Creating Computer Based on the Brain

At the Supercomputing 2009 conference, IBM announced significant progress toward creating a computer system that simulates and emulates the brain’s abilities for sensation, perception, action, interaction and cognition, while rivaling the brain’s low power consumption and compact size. Scientists at IBM Research and Lawrence Berkeley National Lab have performed the first near real-time cortical simulation of the brain that exceeds the scale of a cat cortex and contains 1 billion spiking neurons and 10 trillion individual learning synapses. The simulation was performed using Lawrence Livermore National Lab’s Dawn Blue Gene/P supercomputer with 147,456 CPUs and 144 terabytes of main memory.
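
IBM's cortical simulator itself is proprietary and massively parallel, but the two ingredients named here, spiking neurons and learning synapses, can be illustrated at toy scale. The Python sketch below is a 100-neuron leaky integrate-and-fire network with a crude Hebbian learning rule; all constants are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy network: 100 leaky integrate-and-fire neurons with all-to-all
# plastic synapses (the real run simulated roughly a billion neurons).
N = 100
THRESHOLD, RESET, DECAY = 1.0, 0.0, 0.95
V = np.zeros(N)                            # membrane potentials
W = rng.uniform(0.0, 0.1, size=(N, N))     # synaptic weights
spikes = np.zeros(N, dtype=bool)

for step in range(200):
    # Leaky integration of external drive plus recurrent spike input.
    V = DECAY * V + rng.uniform(0.0, 0.05, size=N) + W.T @ spikes
    spikes = V >= THRESHOLD
    V[spikes] = RESET                      # fire-and-reset

    # Crude Hebbian rule: synapses between co-active neurons strengthen,
    # standing in for the simulation's "learning synapses".
    W += 0.001 * np.outer(spikes, spikes)

print(f"spikes in final step: {int(spikes.sum())}")
```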

            IBM scientists have also collaborated with researchers from Stanford University to develop an algorithm that exploits the Blue Gene® supercomputing architecture in order to noninvasively measure and map the connections between all cortical and sub-cortical locations within the human brain using magnetic resonance diffusion weighted imaging. Mapping the wiring diagram of the brain is crucial to untangling its vast communication network and understanding how it represents and processes information.

            These advancements will provide a unique workbench for exploring the computational dynamics of the brain, moving the team closer to its goal of building a compact, low-power synaptronic chip using nanotechnology. This work stands to break the mold of conventional computing, creating a new paradigm to meet the system requirements of the instrumented and interconnected world of tomorrow.

Traditional Computing                  Cognitive Computing

Stored-program model                   Replicated neurons and synapses
Digital                                Mixed-mode analog-digital
Synchronous                            Asynchronous
Serial                                 Parallel
Centralized                            Distributed
Hardwired circuits                     Reconfigurable
Explicit memory addressing             Implicit memory addressing
Over-writes data                       Updates state when info changes
Separates computation from data        Blurs data/computation boundary

            As the amount of digital data that we create continues to grow massively and the world becomes more instrumented and interconnected, there is a need for new computing systems with intelligence that can spot patterns in various digital and sensor data; analyze and integrate information in real time and in context; and deal with the ambiguity found in complex environments.

Businesses will simultaneously need to monitor, prioritize, adapt and make rapid decisions based on ever-growing streams of critical data and information. A cognitive computer could quickly and accurately put together the disparate pieces of this complex puzzle, while taking into account context and previous experience, to help business decision makers come to a logical response.

            “Learning from the brain is an attractive way to overcome power and density challenges faced in computing today,” said Josephine Cheng, IBM Fellow and lab director of IBM Research – Almaden. “As the digital and physical worlds continue to merge and computing becomes more embedded in the fabric of our daily lives, it’s imperative that we create a more intelligent computing system that can help us make sense of the vast amount of information that’s increasingly available to us, much the way our brains can quickly interpret and act on complex tasks.”

            IBM and its university partners were recently awarded $16.1 million in additional funding from the Defense Advanced Research Projects Agency (DARPA) for Phase 1 of its Systems of Neuromorphic Adaptive Plastic Scalable Electronics (SyNAPSE) initiative. This phase of research will focus on the components, brain-like architecture and simulations to build a prototype chip. The long-term mission of IBM’s cognitive computing initiative is to discover and demonstrate the algorithms of the brain and deliver low-power, compact cognitive computers that approach mammalian-scale intelligence and use significantly less energy than today’s computing systems.

            “The goal of the SyNAPSE program is to create new electronics hardware and architecture that can understand, adapt and respond to an informative environment in ways that extend traditional computation to include fundamentally different capabilities found in biological brains,” said DARPA program manager Todd Hylton, PhD. Technical insight and more details on the SyNAPSE project can be found on the Cognitive Computing blog at http://modha.org/.

www.ibm.com/research

November 18, 2009 at 11:36 pm

IBM Announces New Software for Managing Data Centers

IBM has introduced groundbreaking new software for managing data centers. The new technology has the potential to dramatically cut the cost of operations while speeding the deployment of new applications from weeks to minutes.

            The introduction of IBM’s new VMControl product for enterprises, combined with IBM Tivoli software, gives businesses a single point of control across multiple types of IT systems and virtualization technologies. It spans UNIX, Linux, mainframe and x86 systems, as well as storage and networks.

            VMControl helps companies that have turned to virtualization – the creation of multiple virtual servers or storage on a single physical system – to reduce infrastructure costs, but have encountered new struggles as they try to manage enterprises made up of disparate platforms, each with its own virtualization technology. VMControl allows combinations of physical and virtual IBM servers to be managed as a single entity. This approach, known as system pooling, expands the benefits of virtualization by helping corporate data centers simplify complex management functions and better share and prioritize use of critical resources such as processing power, memory and storage.
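
VMControl's placement logic is not public, but system pooling of this sort boils down to treating hosts as one pool of capacity and placing workloads by available resources. The hypothetical Python sketch below places a new virtual machine on whichever pooled host has the most free memory; the host names and sizes are invented.

```python
# Toy placement scheduler for a pool of hosts: assign each new virtual
# machine to the host with the most free memory that can still fit it.
hosts = {
    "host-a": {"mem_total": 64, "mem_used": 40},
    "host-b": {"mem_total": 128, "mem_used": 96},
    "host-c": {"mem_total": 64, "mem_used": 12},
}

def place_vm(vm_name, mem_needed):
    """Pick a host from the pool and reserve memory for the new VM."""
    candidates = [
        (name, info["mem_total"] - info["mem_used"])
        for name, info in hosts.items()
        if info["mem_total"] - info["mem_used"] >= mem_needed
    ]
    if not candidates:
        raise RuntimeError(f"no host in the pool can fit {vm_name}")
    host, _free = max(candidates, key=lambda c: c[1])
    hosts[host]["mem_used"] += mem_needed
    return host

print(place_vm("app-server-01", mem_needed=16))   # -> host-c
```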

            Centralizing control of virtualized environments brings new intelligence to data center operations. Companies can manage their vast pools of information and processing resources and parcel them out to applications when and where they’re needed. This breakthrough capability increases the overall capacity utilization of the IT infrastructure to lower capital, operational and energy costs, improves application availability, and gives IT managers the flexibility to adapt to new demands prompted by the surge of data from internet-connected devices.

            VMControl will also accelerate the deployment of new IT delivery models, like cloud computing, which allows information and processing resources to be tapped remotely. The new product, together with IBM Tivoli software, helps companies improve service while reducing cost and risk. IBM also announced a new version of Tivoli Provisioning Manager that provides enhanced automation of the manual tasks of provisioning and configuring servers, operating systems, middleware, software applications, storage and network devices.

www.ibm.com/dynamicinfrastructure

October 21, 2009 at 3:49 am

