
Technology challenges in Oil & Gas

21 July, 2010

These factors are placing exploration and production (E&P) asset teams under pressure to characterise prospects and existing dormant assets more precisely. At the same time, data managers are looking to stretch shrinking information technology budgets.

NetApp Fabric Attached Storage (FAS) systems, combined with the NetApp Data ONTAP operating system, provide a cost-effective means of pooling the significant volumes of diverse data resources used in E&P operations. Several of NetApp’s innovative data management features enable Oil & Gas industry users to access petabytes of information efficiently and cost-effectively.

Data storage and management challenges

In uncertain economic times, oil and gas companies need to quickly and accurately assess the potential of prospects. New interpretation, visualisation, and modelling applications are multiplying the amount of data used to evaluate a prospect. As members of exploration teams collaborate to characterise properties, they consume and create even more data, requiring more storage and better data management solutions.

Commodity storage prices continue to decline, which encourages greater consumption but creates new problems in manageability, usability, and data centre resources.

Using FAS systems, NetApp delivers a comprehensive solution to store and manage seismic volumes, attribute volumes, horizons, faults, wells, and other data types, allowing users to achieve the highest levels of performance and data protection.

Converging technologies bring high-end interpretation, modelling, and simulation to desktops

Two primary components of E&P decision making, seismic interpretation and reservoir modelling, have recently undergone rapid transformation due to information technology advances. These advances present a unique opportunity for E&P managers: the ability to combine improved data interpretation and modelling capabilities with the data storage and management infrastructure to support them, all on a Windows-based desktop.

Escalating volumes of data

Due to the long-term nature of the energy exploration industry, multiple disparate seismic datasets typically characterise any single property. Hence, data managers and asset teams may need to combine seismic information from an original survey conducted years ago with data from current production operations on the same reservoir. In addition, managers need accurate models of brownfield reservoirs to make better drilling and other operational efficiency decisions as oil depletes.

Increasingly complex reservoirs, growing amounts of data, and a shortage of experienced petro-technical professionals have driven the need for extremely efficient workflows. Increased data volumes lead to increased data processing, storage, and management challenges.

The data storage challenge

Outside seismic processing, seismic interpretation is by far the largest consumer of data storage in exploration divisions, accounting for up to 80 percent of enterprise storage requirements. A primary ongoing challenge for data managers, regardless of the interpretation method, has always been how to manage huge data volumes.
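To give a sense of that scale, the back-of-envelope sketch below sizes a single, hypothetical 3D post-stack volume; all of the survey dimensions are assumptions chosen for illustration, not figures from NetApp or the industry.

```python
# Rough size of a hypothetical 3D post-stack seismic volume stored as
# 32-bit floating-point samples. Every figure here is an assumption.
inlines = 2000              # inlines in the survey
crosslines = 2000           # crosslines in the survey
samples_per_trace = 1500    # e.g. a 6 s record sampled every 4 ms
bytes_per_sample = 4        # 32-bit float

volume_bytes = inlines * crosslines * samples_per_trace * bytes_per_sample
print(f"One post-stack volume: {volume_bytes / 1e9:.1f} GB")        # ~24 GB

# An interpretation project typically holds many derived volumes
# (attributes, filtered versions, reprocessed vintages) alongside it.
derived_volumes = 15
project_bytes = derived_volumes * volume_bytes
print(f"Project working set:   {project_bytes / 1e12:.2f} TB")      # ~0.36 TB
```

Multiplied across the active projects in a division, interpretation data can quickly come to dominate enterprise storage.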

Data storage and management become significantly larger challenges when data from disparate systems must be consolidated to fuel volume interpretation. Larger amounts of higher-fidelity seismic data and the application of new processing techniques contribute to significant and growing data management challenges.

The cost and complexity of architecting, implementing, and managing traditional enterprise storage infrastructures can be substantial. And when the needs of the dynamic enterprise inevitably change, often quickly and even dramatically, traditional static systems require reprovisioning. This can cause disruption and increase management overhead and risk.

To address this, IT organisations typically overprovision systems and pre-allocate resources. However, this approach is costly and offers only temporary relief from inevitable expense and disruption.
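The cost of that pre-allocation is easy to illustrate with a small, hypothetical calculation; the capacities and unit cost below are assumptions, not vendor figures.

```python
# Hypothetical cost of pre-allocating storage against a projected peak,
# compared with growing capacity as it is actually consumed.
projected_peak_tb = 500                    # capacity bought and allocated up front
quarterly_usage_tb = [120, 180, 240, 310]  # capacity actually used each quarter
cost_per_tb = 2000                         # assumed fully loaded cost per TB

preallocated_spend = projected_peak_tb * cost_per_tb
incremental_spend = max(quarterly_usage_tb) * cost_per_tb
idle_tb = projected_peak_tb - max(quarterly_usage_tb)

print(f"Spend when pre-allocated:        ${preallocated_spend:,}")   # $1,000,000
print(f"Spend if grown with actual use:  ${incremental_spend:,}")    # $620,000
print(f"Capacity idle after a year:      {idle_tb} TB")              # 190 TB
```

Even in this simple sketch, more than a third of the up-front spend buys capacity that sits idle, and the relief is temporary because the projected peak will eventually be revised upwards.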

One reason is that the performance of traditional static systems is inherently bound by the I/O capabilities of individual storage devices. The need to overprovision is compounded for operations such as oil and gas exploration, which require distributed data processing environments and large datasets.

The exploration interpretation process may create multiple large-scale versions of data, often in a consolidated storage environment shared among several asset teams. This means that data managers incur additional overhead, management costs, and data risk to support these processes.
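A quick, hypothetical sum shows how fast that overhead compounds when several teams each keep full working copies; the project size, team count, and change rate below are assumptions for illustration only.

```python
# Hypothetical footprint of per-team working versions of a shared project.
base_project_tb = 5        # consolidated dataset shared by the asset teams
versions_per_team = 3      # scenario versions each team keeps
teams = 6

full_copy_tb = base_project_tb * versions_per_team * teams
print(f"If every version is a full copy: {full_copy_tb} TB")      # 90 TB

# If versions could share unchanged data and each one altered ~10% of it,
# the incremental footprint would be far smaller.
changed_fraction = 0.10
shared_tb = base_project_tb * (1 + changed_fraction * versions_per_team * teams)
print(f"If unchanged data were shared:   {shared_tb:.0f} TB")     # 14 TB
```

The gap between those two figures is, in broad terms, the duplication overhead described above, and it is exactly the kind of redundancy that space-efficient data management features aim to remove.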

Even the most robust and highly resourced static systems can be cumbersome and inflexible when handling multiple scenarios with extremely large datasets. As a result, many organisations have settled for decimating their seismic data to a small percentage of its original size, a compromise that hampers success.
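To make that trade-off concrete, the sketch below (Python with NumPy; the cube dimensions are assumed) decimates a seismic cube by keeping every second inline, crossline, and sample, shrinking the footprint to one eighth of the original.

```python
import numpy as np

# Hypothetical post-stack cube: 800 inlines x 800 crosslines x 1200 samples,
# stored as 32-bit floats (~3 GB). Dimensions are assumed for the example.
volume = np.zeros((800, 800, 1200), dtype=np.float32)

# "Decimation": keep every 2nd inline, crossline, and sample, then
# materialise the result so it occupies its own (much smaller) buffer.
decimated = volume[::2, ::2, ::2].copy()

print(f"Original cube:  {volume.nbytes / 1e9:.2f} GB")     # 3.07 GB
print(f"Decimated cube: {decimated.nbytes / 1e9:.2f} GB")  # 0.38 GB
```

The saving is real, but so is the loss: the detail carried by the discarded samples is no longer available to the interpreter, which is why decimation is a workaround rather than a solution.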