Upstream E&Ps leverage data centers for increased visualization

July 1, 2008

Blake McLane - CyrusOne

For upstream E&P companies that use 3D seismic data to view and assess geological conditions, the ability to generate and configure large data sets constantly outpaces the ability to visualize that data using 3D models. This can lead to analysis gaps and verification inaccuracies that keep companies from realizing the full value of the data being generated.

The problem is caused by increasingly large volumes of data – often dozens of terabytes or more – being channeled simultaneously into visualization applications that must deliver highly detailed, real-time images to help upstream exploration units make accurate decisions. Both hardware and software struggle to keep pace with the sheer quantity of data being processed at this speed. In fact, companies find it nearly impossible to achieve optimal visualization under these conditions using traditional servers and storage hardware.

Computational fluid dynamics (CFD) modeling is required in high-density data center environments running visualization software. Simulating proper airflow with this tool helps minimize application downtime and increase cooling efficiency.

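As a rough illustration of the sizing work that precedes a full CFD study, the sketch below uses the standard sensible-heat approximation for air at sea level (CFM ≈ 3,160 × kW / ΔT°F) to estimate the supply airflow a high-density rack demands. The rack loads and the 20°F temperature rise are illustrative assumptions, not measured figures:

```python
# Rule-of-thumb airflow estimate for a high-density rack, assuming the
# standard sensible-heat approximation for air at sea level:
#   BTU/hr = 1.08 x CFM x dT(degF), with 1 kW = 3,412 BTU/hr,
# which reduces to CFM = 3,160 x kW / dT. A real CFD study models the
# whole room in three dimensions; this only sizes a single rack.

def required_cfm(rack_kw: float, delta_t_f: float = 20.0) -> float:
    """Supply air (cubic feet per minute) needed to remove rack_kw of
    heat at a delta_t_f temperature rise across the rack."""
    return 3160.0 * rack_kw / delta_t_f

for kw in (5, 10, 20):  # illustrative rack loads, low to high density
    print(f"{kw:>2} kW rack -> {required_cfm(kw):,.0f} CFM of supply air")
```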

While many E&P companies have proprietary algorithms in place to reprocess portions of the data quickly, capturing fully rendered 3D images requires the most powerful software coupled with best-in-class servers and storage hardware. These companies typically require a solution that delivers three critical components: speed, scalability, and accuracy.

Improved performance at a cost

Seismic visualization software is the fastest, most proven way to work quickly and accurately through huge data sets. The good news is that today’s feature-rich seismic data offers unprecedented visibility and accuracy, a vast improvement over the tedious, time-consuming “line-by-line” methodologies of the past. This software is, in many cases, tightly integrated with parallel business applications that deliver interactive interpretation at maximum speed.

The bad news is that it requires huge IT investments in best-in-class hardware offering maximum performance and exponential scalability. To ensure the software works at peak levels, the application must reside on the most advanced hardware architecture available. Only then can E&P companies be sure they’re making the best possible business decisions.

3D requires peak performance

The hardware solution is found within the data center – specifically data centers with an industry-leading Tier 3+ rating to ensure optimal future-proof performance, scalability, and reliability. For most E&Ps, that means co-located application hosting with an outside data center. The costs and manpower to build or retrofit an in-house data center are high for companies that want to dedicate all available time and resources to achieving field results. Hosting their entire IT infrastructures provides cost-conscious E&Ps with an affordable solution, ensuring maximum redundancy and performance dependability.

Best-in-class Tier 3+ data centers, whether in-house or co-located, provide scalable architectures designed for optimal high density and dedicated server needs, and reduce the need for in-house IT staff to run and manage visualization applications. They are equipped to meet the principal speed, scalability, and accuracy criteria.

Application speed is ensured by dedicated, high-density servers. Although the data sets are enormous, hosting and running these applications on best-in-class servers allows access to real-time visualization with seamless updates. This means all back office hardware, including storage, servers, and switches, is fully integrated and optimized for the highest speed and availability. In the case of upstream seismic visualization, this can require a large hardware footprint to accommodate everything needed to process 10, 20, 30, or even more terabytes of data simultaneously.
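The footprint arithmetic can be sketched in a few lines. Every figure below – drive capacity, drives per shelf, RAID overhead – is an illustrative assumption rather than a vendor specification; the point is how quickly tens of terabytes of raw seismic data translate into shelves and rack units before servers and switches are even counted:

```python
import math

# Back-of-the-envelope storage footprint for a seismic data set. All
# figures are illustrative assumptions, not vendor specs: 1 TB drives,
# 12 drives per 2U shelf, and 25% RAID/parity overhead.

def storage_footprint(dataset_tb: float, drive_tb: float = 1.0,
                      drives_per_shelf: int = 12, shelf_u: int = 2,
                      raid_overhead: float = 1.25):
    drives = math.ceil(dataset_tb * raid_overhead / drive_tb)  # raw drives
    shelves = math.ceil(drives / drives_per_shelf)             # 2U shelves
    return drives, shelves, shelves * shelf_u                  # rack units

for tb in (10, 20, 30):  # data set sizes cited in the text
    drives, shelves, units = storage_footprint(tb)
    print(f"{tb} TB -> {drives} drives, {shelves} shelves, {units}U of rack space")
```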

To address scalability, these data centers use “application-agnostic” architectures. This enables a seamless fit for a wide variety of applications and system requirements, with full availability of any chosen operating system. It also gives 3D applications full access to OpenGL libraries, which provide greater rendering capability but are more demanding than conventional, closed alternatives. These architectures also facilitate process load-balancing rather than user load-balancing, an approach that works better for 3D visualization. This, in turn, provides real-time adaptability and load shifting to guarantee optimal performance at all times.
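A minimal sketch of the process load-balancing idea follows, assuming hypothetical node names and load units. Each rendering process is placed on the currently least-loaded node, so a single user’s heavy session is spread across machines rather than pinned to one:

```python
import heapq

# Minimal sketch of process-level load balancing, as contrasted with
# user-level balancing in the text. Node names and load units are
# hypothetical; a production scheduler would track real CPU/GPU metrics.

class ProcessBalancer:
    def __init__(self, nodes):
        # min-heap of (current_load, node): the least-loaded node pops first
        self.heap = [(0.0, n) for n in nodes]
        heapq.heapify(self.heap)

    def assign(self, process_id: str, cost: float) -> str:
        load, node = heapq.heappop(self.heap)   # least-loaded node
        heapq.heappush(self.heap, (load + cost, node))
        return node

balancer = ProcessBalancer(["node-a", "node-b", "node-c"])
# Each rendering process is scheduled independently of which user owns it.
for pid, cost in [("render-1", 0.6), ("render-2", 0.4), ("render-3", 0.7)]:
    print(pid, "->", balancer.assign(pid, cost))
```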

Redundancy requirements

Data accuracy (and reliability) is delivered through the deployment of 2N redundant architectures. This means each system runs on parallel architecture, providing full 1:1 data and application backup in the event of failure. Each dedicated server has a backup unit for immediate, seamless switchover. Using a 2N protocol guarantees business continuity – a system will remain up and running despite adverse events, whether external or internal to the data center. This is essential to upstream exploration, where system failure, even for a brief time, could mean lost business opportunities.
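The switchover logic behind a 2N pair can be sketched as a simple heartbeat monitor. The server names and timeout below are illustrative assumptions, not a description of any particular high-availability product:

```python
import time

# Toy illustration of 2N (1:1) redundancy: every primary has a dedicated
# standby, and a missed heartbeat triggers immediate switchover.

class RedundantPair:
    def __init__(self, primary: str, standby: str, timeout_s: float = 2.0):
        self.active, self.standby = primary, standby
        self.timeout_s = timeout_s
        self.last_heartbeat = time.monotonic()

    def heartbeat(self):
        """Called by the active server while it is healthy."""
        self.last_heartbeat = time.monotonic()

    def check(self):
        """Promote the standby if the active server has gone silent."""
        if time.monotonic() - self.last_heartbeat > self.timeout_s:
            self.active, self.standby = self.standby, self.active
            self.last_heartbeat = time.monotonic()
            print(f"failover: {self.standby} silent, {self.active} now active")

pair = RedundantPair("viz-server-1", "viz-server-2", timeout_s=0.1)
time.sleep(0.2)   # simulate a missed heartbeat window
pair.check()      # -> failover: viz-server-1 silent, viz-server-2 now active
```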

Planning for the future

For E&P companies, data center needs extend beyond visualization applications. As seismic visualization shifts to more mature business models with increased software and system integration, the IT burden increases across the board.

The industry is seeing business decision models that provide, at a single interface, the ability to better visualize geological conditions in conjunction with surface and thermal conditions, along with tactical recommendations for working around shallow hazards. This gives a more immediate, comprehensive picture of the task at hand and allows for more precise drilling trajectories and well placements. But it also means more software, more data, and more system integration.

As new applications and software become necessary, the role of the data center is paramount. Under the supervision of best-in-class data center personnel, these applications can be seamlessly deployed and integrated with zero downtime. Some in-house data centers deliver these capabilities, but they are typically built out on an ad hoc basis, or require retrofit architectures and physical relocation of other data storage and system hardware. As many E&Ps have found, co-located data centers provide “future-proof peace of mind” and the ability to dedicate IT resources and staff to other customer and business projects.