Posts filed under ‘The Bleeding Edge’

Tendeka Launches New Temperature Sensing System

Tendeka has launched the new Sensornet Oryx-XR Distributed Temperature Sensing (DTS) system, designed for use in harsh environments. The Oryx-XR is an enhanced version of the original Oryx system, launched in 2009. The upgraded model extends the sensing range from 4 km to 12 km and improves measurement performance, providing temperature resolution as fine as 0.01°C.

            The autonomous, low-power device provides a temperature sample every meter along a fiber optic cable, covers a wide operating temperature window from -50°C to 650°C and can run on solar or wind power. The permanent, standalone unit contains the sensing optoelectronics and operates remotely through an intuitive, user-friendly software interface, making it simple to use and easy to transport.

            The Oryx-XR features an inbuilt multiplexing module with either two or four channels, enabling up to four single-ended measurements or two double-ended measurements. It provides data back to the client’s office by any available satellite, radio or fiber communications link, making it a powerful remote logging DTS unit.
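
            For a rough sense of the data volumes such a unit produces, here is a minimal back-of-envelope sketch in Python. The range and channel counts come from the announcement; the measurement cadence and bytes per sample are illustrative assumptions, not Tendeka specifications.

```python
# Back-of-envelope data volume for a DTS unit such as the Oryx-XR.
# Range and channel count come from the announcement; bytes per
# sample and measurement cadence are illustrative assumptions.

SENSING_RANGE_M = 12_000      # 12 km of fiber, per the announcement
SPATIAL_SAMPLING_M = 1        # one temperature sample per meter
CHANNELS = 4                  # up to four single-ended measurements
BYTES_PER_SAMPLE = 4          # assumed: one 32-bit float per reading
TRACES_PER_HOUR = 60          # assumed: one trace per channel per minute

samples_per_trace = SENSING_RANGE_M // SPATIAL_SAMPLING_M
bytes_per_hour = samples_per_trace * BYTES_PER_SAMPLE * CHANNELS * TRACES_PER_HOUR

print(f"{samples_per_trace:,} temperature points per trace")   # 12,000
print(f"~{bytes_per_hour / 1e6:.1f} MB of readings per hour")  # ~11.5 MB
```

            Under these assumptions a single unit generates only megabytes per hour, which is why a low-bandwidth satellite or radio link, as described above, is sufficient for remote logging.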

            The system has been designed to complement the performance of Tendeka’s family of Sentinel DTS units, making it suitable for the toughest monitoring challenges, such as horizontal well activity.

            Dan Watley, Tendeka’s VP for Upstream Digital Monitoring, said: “The Oryx-XR is a natural progression of the original system. Technology developments mean we can now provide monitoring across greater lengths and even smaller temperature differences. Operators are increasingly targeting harder-to-reach reserves with increasingly complex wells, and the Oryx-XR technology is ideally suited to monitoring these challenging applications.”

www.tendeka.com

April 26, 2010 at 9:45 am

WesternGeco Sets Seismic Record in Kuwait with UniQ

Schlumberger’s WesternGeco business unit announced that its UniQ integrated point-receiver land seismic system has set a new industry record in Kuwait for Kuwait Oil Company (KOC) in acquiring data from 80,000 live digital point-receiver channels at a two-millisecond sample interval.

            During sustained slip-sweep production in February, UniQ technology acquired one terabyte of data per hour and quality checked it in real time – the equivalent of five days of production for a typical 3,000-channel conventional crew. All data were concurrently pre-conditioned using the Q-Xpress infield integrated seismic data acquisition and processing workflow for near-real-time seismic data analysis. The UniQ system is being deployed in conjunction with WesternGeco DX-80 Desert Explorer vibrators and MD Sweep low-frequency technology.
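
            The one-terabyte-per-hour figure can be sanity checked from the announced channel count and sample interval. The sketch below assumes 32-bit sample storage and a rough overhead factor for headers, auxiliary channels and QC copies; both are illustrative assumptions, not published UniQ specifications.

```python
# Sanity check of the ~1 TB/hour figure from the announced numbers.
# Bytes per sample and the overhead factor are assumptions.

CHANNELS = 80_000             # live point-receiver channels (announced)
SAMPLE_INTERVAL_S = 0.002     # two-millisecond sample interval (announced)
BYTES_PER_SAMPLE = 4          # assumed 32-bit storage per sample
OVERHEAD = 1.7                # assumed headers, aux channels, QC copies

samples_per_s = CHANNELS / SAMPLE_INTERVAL_S          # 40 million samples/s
raw_bytes_per_hour = samples_per_s * BYTES_PER_SAMPLE * 3600

print(f"raw:   {raw_bytes_per_hour / 1e12:.2f} TB/hour")              # ~0.58 TB
print(f"+ovhd: {raw_bytes_per_hour * OVERHEAD / 1e12:.2f} TB/hour")   # ~0.98 TB
```

            The raw payload alone comes to roughly 0.6 TB per hour, so the announced figure is plausible once formatting and quality-control overhead are included.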

            Part of the Q-Technology point-receiver seismic hardware and software portfolio, the UniQ system combines extreme channel count technology with support for advanced simultaneous source techniques. Building upon the fidelity provided by the broad bandwidth geophone accelerometer sensor, the system can support up to 150,000 live channels at a two-millisecond sample interval.

            “UniQ continues to set new standards of performance in the land seismic business, proving that sustained high productivity can be reliably combined with high channel count recording,” said Marwan Moufarrej, VP Land, WesternGeco.

www.westerngeco.com

 

March 9, 2010 at 10:08 am

IBM Moves Closer to Creating Computer Based on the Brain

At the Supercomputing 2009 conference, IBM announced significant progress toward creating a computer system that simulates and emulates the brain’s abilities for sensation, perception, action, interaction and cognition, while rivaling the brain’s low power consumption and compact size. Scientists at IBM Research and Lawrence Berkeley National Lab have performed the first near real-time cortical simulation of the brain that exceeds the scale of a cat cortex, containing 1 billion spiking neurons and 10 trillion individual learning synapses. The simulation was performed using Lawrence Livermore National Lab’s Dawn Blue Gene/P supercomputer with 147,456 CPUs and 144 terabytes of main memory.
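
            Dividing the announced memory footprint by the synapse count shows how lean the per-synapse state must be. The short sketch below does that arithmetic; the field breakdown at the end is an illustrative guess, not IBM’s actual data layout.

```python
# Memory budget per synapse implied by the announced figures.
# The field breakdown below is an illustrative assumption only.

MAIN_MEMORY_BYTES = 144e12    # 144 TB of main memory (announced)
SYNAPSES = 10e12              # 10 trillion learning synapses (announced)

bytes_per_synapse = MAIN_MEMORY_BYTES / SYNAPSES
print(f"~{bytes_per_synapse:.1f} bytes per synapse")   # ~14.4 bytes

# One plausible way that budget could be spent (hypothetical):
#   4 bytes   target neuron index
#   4 bytes   synaptic weight (float32)
#   2 bytes   axonal delay / last-spike timestamp
#   ~4 bytes  simulator bookkeeping and everything else
```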

            IBM scientists have also collaborated with researchers from Stanford University to develop an algorithm that exploits the Blue Gene® supercomputing architecture in order to noninvasively measure and map the connections between all cortical and sub-cortical locations within the human brain using magnetic resonance diffusion weighted imaging. Mapping the wiring diagram of the brain is crucial to untangling its vast communication network and understanding how it represents and processes information.

            These advancements will provide a unique workbench for exploring the computational dynamics of the brain, moving the team closer to its goal of building a compact, low-power synaptronic chip using nanotechnology. This work stands to break the mold of conventional computing, creating a new paradigm to meet the system requirements of the instrumented and interconnected world of tomorrow.

Traditional Computing               Cognitive Computing
Stored program model                Replicated neurons and synapses
Digital                             Mixed-mode analog-digital
Synchronous                         Asynchronous
Serial                              Parallel
Centralized                         Distributed
Hardwired circuits                  Reconfigurable
Explicit memory addressing          Implicit memory addressing
Over-writes data                    Updates state when info changes
Separates computation from data     Blurs data/computation boundary
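
            To make the “replicated neurons and synapses” column concrete, here is a toy leaky integrate-and-fire neuron in Python. It is a textbook model shown purely for illustration and bears no relation to IBM’s simulator or hardware: state leaks continuously, is updated when input arrives and emits a spike when a threshold is crossed.

```python
# Toy leaky integrate-and-fire neuron: the membrane potential decays,
# integrates incoming current, and fires a spike past a threshold.
# A textbook illustration of the "cognitive" column, not IBM's code.

LEAK = 0.95          # per-step decay of membrane potential
THRESHOLD = 1.0      # potential at which the neuron spikes
RESET = 0.0          # potential after a spike

def run(inputs):
    """Return the spike train produced by a stream of input currents."""
    potential, spikes = 0.0, []
    for current in inputs:
        potential = potential * LEAK + current   # leak, then integrate
        if potential >= THRESHOLD:               # threshold crossing
            spikes.append(1)
            potential = RESET
        else:
            spikes.append(0)
    return spikes

print(run([0.3, 0.3, 0.3, 0.3, 0.0, 0.9, 0.4]))  # [0, 0, 0, 1, 0, 0, 1]
```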

 As the amount of digital data that we create continues to grow massively and the world becomes more instrumented and interconnected, there is a need for new computing systems with intelligence that can spot patterns in diverse digital and sensor data; analyze and integrate information in real time and in context; and deal with the ambiguity found in complex environments.

Businesses will simultaneously need to monitor, prioritize, adapt and make rapid decisions based on ever-growing streams of critical data and information. A cognitive computer could quickly and accurately put together the disparate pieces of this complex puzzle, while taking into account context and previous experience, to help business decision makers come to a logical response.

            “Learning from the brain is an attractive way to overcome power and density challenges faced in computing today,” said Josephine Cheng, IBM Fellow and lab director of IBM Research – Almaden. “As the digital and physical worlds continue to merge and computing becomes more embedded in the fabric of our daily lives, it’s imperative that we create a more intelligent computing system that can help us make sense of the vast amount of information that’s increasingly available to us, much the way our brains can quickly interpret and act on complex tasks.”

            IBM and its university partners were recently awarded $16.1 million in additional funding from the Defense Advanced Research Projects Agency (DARPA) for Phase 1 of its Systems of Neuromorphic Adaptive Plastic Scalable Electronics (SyNAPSE) initiative. This phase of research will focus on the components, brain-like architecture and simulations to build a prototype chip. The long-term mission of IBM’s cognitive computing initiative is to discover and demonstrate the algorithms of the brain and deliver low-power, compact cognitive computers that approach mammalian-scale intelligence and use significantly less energy than today’s computing systems.

            “The goal of the SyNAPSE program is to create new electronics hardware and architecture that can understand, adapt and respond to an informative environment in ways that extend traditional computation to include fundamentally different capabilities found in biological brains,” said DARPA program manager Todd Hylton, PhD. Technical insight and more details on the SyNAPSE project can be found on the Cognitive Computing blog at http://modha.org/.

             www.ibm.com/research

November 18, 2009 at 11:36 pm

Microsoft Offers Betas for Supercomputing Clusters

At Supercomputing 2009, Microsoft Corp. announced the availability of betas for Windows HPC Server 2008 R2 and a distributed version of Microsoft Office Excel 2010 for the cluster. Together with the recently announced Microsoft Visual Studio 2010 Beta, which helps simplify parallel programming, these advances make it possible for more users to access supercomputing power through familiar technologies and tools such as Microsoft Office Excel, Windows Server and Visual Studio.

            “Until now, the power of high-performance and parallel computing has largely been available to a limited subset of customers due to the complexity of environments and applications, as well as the challenges of parallel programming,” said Vince Mendillo, senior director of High Performance Computing at Microsoft. “Today, we’re seeing performance numbers that rival Linux … ISVs are seeing 30% to 40% performance improvements in the speed of their code on Windows HPC Server.”

            Although multicore systems are becoming ubiquitous, few developers can build parallel applications that truly leverage the available resources. Visual Studio 2010 Beta 2 will help make parallel programming simpler and more efficient for a broad base of developers across both client and cluster workloads. In addition, by moving Microsoft Office Excel 2010 to the cluster, customers are seeing linear performance scaling of complex spreadsheets – workbooks that previously took weeks to complete now finish their calculations in a few hours.
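
            As a rough illustration of the kind of embarrassingly parallel workload that scales this way, the generic Python sketch below fans independent “cell” computations out to worker processes. It is not the Excel-on-cluster or Visual Studio tooling itself, just the pattern those tools exploit: when cells have no dependencies on one another, throughput grows almost linearly with the number of workers.

```python
# Generic illustration of linear scaling for independent "cell"
# computations, in the spirit of cluster-accelerated workbooks.
# A multiprocessing sketch, not Microsoft's HPC stack.

from multiprocessing import Pool

def evaluate_cell(seed: int) -> float:
    """Stand-in for one expensive, independent spreadsheet cell."""
    total = 0.0
    for i in range(1, 200_000):
        total += ((seed * i) % 7919) / 7919.0
    return total

if __name__ == "__main__":
    cells = range(64)                  # 64 independent cells
    with Pool() as pool:               # one worker per local core
        results = pool.map(evaluate_cell, cells)
    print(f"evaluated {len(results)} cells; first result {results[0]:.1f}")
```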

            Windows HPC Server 2008 R2 improvements include the following:

  • Improved scalability with support for deploying, running and managing clusters up to 1,000 nodes;
  • New configuration and deployment options such as diskless boot, mixed-version clusters and support for a remote head node database;
  • Improved system management, diagnostics and reporting with an enhanced heat map, multiple customizable tabs, an extensible diagnostic framework and the ability to create richer custom reports;
  • Improved support for service-oriented architecture (SOA) workloads, automatic restart and failover of broker nodes, and improved management, monitoring, diagnostics and debugging; and
  • New ways to accelerate Microsoft Office Excel workbooks such as support for Cluster-Aware User-Defined Functions and the capability to run distributed Excel 2010 for the cluster.

            “Many frontline researchers, analysts and scientists desperately need access to more computational power than they currently have, but find it either difficult or too costly in time to gain access to expanded HPC resources. Windows HPC Server 2008 has been designed to address the needs of those wishing to expand their access to HPC, without requiring them to become computer programming experts,” said Earl Joseph, program VP, high-performance computing, IDC. “Microsoft’s latest investments in HPC and parallelism help to reduce the complexities of supercomputing, in particular making it easier to program and thereby making it more accessible to business, academia and government users.”

www.microsoft.com/hpc

November 18, 2009 at 4:15 am

Mobile Spy Monitors GPS Locations, SMS Messages and Calls

Arizona-based Retina-X Studios LLC announced the availability of Mobile Spy for smartphones running the Android operating system. Using this technology, users can silently monitor GPS locations, incoming and outgoing text messages (SMS) and call information of children or employees — even if activity logs are erased. Mobile Spy had already been available for the BlackBerry, iPhone, Windows Mobile and Symbian OS smartphones. The new version for the Android platform is now on the market.

www.mobile-spy.com

 

November 9, 2009 at 11:19 pm

DeepOcean, Statoil Pilot Windows 7 Operating System

Following the official launch of Windows 7 around the globe, Microsoft Corp. announced that two Norwegian customers from the oil and gas industry have successfully piloted the new Windows 7 platform. DeepOcean, a subsea services and construction support firm, and Statoil, one of the world’s largest offshore oil companies, have deployed or are piloting the Windows 7 operating system, improving employee productivity and increasing overall business performance.

            Albrecht “Ali” Ferling, PhD, managing director of Microsoft’s Worldwide Oil and Gas Industries, said: “Our industry is facing unprecedented challenges, and doing more with less is a priority for many of our customers. The role of IT as a key enabler to drive business efficiency is more important than ever, and Windows 7 and Windows Server 2008 R2 bring a powerful combination of cost savings, greater productivity and improved capacity for innovation to our oil and gas customers.”

            Windows 7 and Windows Server 2008 R2 have been developed with today’s economy in mind, where long-term business success needs to be built on two things: innovation and productivity. Their main features allow employees easy access to information anywhere at any time while organizations can reduce risk through improved security and drive cost savings through virtualization and streamlined management capabilities.

 

DeepOcean Migrates from UNIX

In 2000, DeepOcean was among the first in its industry to implement the Windows platform for its onshore operations and offshore data processing, moving away from an outdated UNIX-based platform with flat files so that it could develop new applications on Windows. DeepOcean is now moving to Windows 7 and Windows Server 2008 R2 to support its sales force and engineers, who travel between onshore and offshore locations. These mobile employees rely on portable computers, which account for more than 25% of the company’s machines.

            When DeepOcean first migrated to the Windows platform, it implemented Windows NT 4.0 on its 50 client computers and Windows NT Server 4.0 on its 10 servers; the servers are now being gradually upgraded from Windows Server 2003 to Windows Server 2008 R2. DeepOcean uses Microsoft Forefront Client Security to help protect its client and server environment from Internet-based threats.

            In an effort to enhance security for its portable computers and to address challenges with its virtual private network solution, the company also decided to migrate to the Windows 7 operating system. As a result of the upgrade, DeepOcean has simplified IT management, enhanced IT security and improved employee productivity.

            “Windows 7 has enabled our mobile work force to connect to the corporate network and access all the resources they need faster and more easily,” said Per Arne Stromo, IT manager at DeepOcean. “At the same time, Microsoft technology offers us a highly secure and reliable tool to help protect our confidential data and intellectual property even when on the road.”

 

Statoil Enables Remote Access

Statoil also wanted to improve employee productivity by making sure that workers in its increasingly global operations could fully collaborate with their colleagues. To address remote access issues that could hinder employee productivity and collaboration, the oil company intends to implement the Windows 7 and Windows Server 2008 R2 operating systems, which together offer features such as BranchCache to improve data access at branch offices and DirectAccess to simplify remote connectivity. As a result of the upgrade, Statoil will deliver seamless access to the corporate network for traveling employees, improve information access at branch offices and enhance IT security.

            “Using Windows 7 and Windows Server 2008 R2, we’ll be able to better support our strategy as a global company and more easily share information no matter where our employees and consultants reside,” said Petter Wersland, leading advisor for IT Infrastructure, Statoil.

            Building on this core IT infrastructure, Microsoft and its partners are continuing to develop technology solutions for some of the industry’s top priorities – better collaboration, unified communications and role-based productivity. The aim is to fundamentally change the way people work by introducing novel workflows and knowledge management capabilities that maximize scarce labor talent and bring business-critical information to workers wherever they are.

www.microsoft.com/oilandgas

November 6, 2009 at 1:06 am

CGGVeritas Deploys SeisMovie for Monitoring Heavy Oil Production

CGGVeritas has announced the successful completion of the acquisition phase of the first continuous land 3D SeisMovie™ project conducted for Shell Canada in the Peace River region, Alberta, Canada. The project is a joint technology trial with Shell Canada; data acquisition spanned 90 days of continuous operations and was completed with no HSE incidents.

            SeisMovie is a patented reservoir monitoring technology that is particularly applicable to steam-assisted heavy oil production. On this project, multiple SeisMovie sources were permanently buried and continuously and simultaneously activated over a period of three months to collect up to 1 TB of raw data daily. After real-time, in-field automatic processing, a daily stack was produced. These data are currently being analyzed in a collaborative effort between Shell and CGGVeritas to extract a time-lapse reservoir signal.
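
            The daily stack at the heart of this workflow is conceptually simple: repeated shots from the fixed, buried sources are averaged so that random noise cancels (roughly as the square root of the number of shots) and the weak time-lapse signal emerges. Below is a minimal sketch on synthetic traces; it is not CGGVeritas’s actual processing flow.

```python
# Conceptual daily stack: average many repeated traces from a fixed,
# buried source so random noise cancels (noise std shrinks like
# 1/sqrt(N)) and a weak signal emerges. Synthetic data only.

import random

SAMPLES = 500          # samples per trace
SHOTS_PER_DAY = 1000   # repeated source activations stacked per day

# A weak event at samples 200-209, far below the single-trace noise (std 1.0).
signal = [0.2 if 200 <= i < 210 else 0.0 for i in range(SAMPLES)]

def synthetic_trace():
    """One noisy recording of the same repeating source signature."""
    return [s + random.gauss(0.0, 1.0) for s in signal]

stack = [0.0] * SAMPLES
for _ in range(SHOTS_PER_DAY):
    for i, v in enumerate(synthetic_trace()):
        stack[i] += v / SHOTS_PER_DAY

# After stacking, noise std is ~1/sqrt(1000) ≈ 0.03, so the 0.2 event,
# invisible in any single trace, stands clearly above the noise floor.
print(f"event peak ~{max(stack[200:210]):.2f}, "
      f"background peak ~{max(abs(v) for v in stack[:200]):.2f}")
```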

            Jean-Jacques Postel, Exec. VP, Land Acquisition Product Line, CGGVeritas, said: “SeisMovie is the only 4D seismic acquisition solution of its kind in the world. Its deployment enables companies to effectively monitor reservoir changes over long periods of time while minimizing human intervention and environmental impact. It is a safe, long-term solution for autonomous, real-time monitoring of reservoirs. SeisMovie offers our clients the ability to see and utilize time-lapse effects to better understand the evolution of reservoir production and ultimately enhance recovery rates.”

www.cggveritas.com

October 31, 2009 at 12:05 am
