STORAGE CHALLENGE – OIL AND GAS
Big data is the lifeblood of the oil & gas industry. It is a simple fact that quality equals volume: new 3D seismic imagery, for example, requires roughly 10X the storage capacity of standard seismic imagery. The question every HPC manager faces is, “How do I balance spending on the latest interpretation and visualization applications that drive revenue against the need to expand storage capacity for massive amounts of data?” The logical answer is to spend less on storage and more on what drives revenue, but how?
Traditional Storage Not Suited For Today’s Big Seismic Data
Storage efficiency, from both a capacity and a performance perspective, is key to delivering a solution that works with today’s big seismic data. Once data is stored, it must remain accessible for processing, interpretation and visualization, and the storage system must deliver the read/write performance needed to move large data sets to and from the HPC cluster, end users and other applications. Finally, data loss is not an option in oil & gas HPC environments, so high data durability is a must. Traditional storage built on RAID and tape backup no longer meets the challenge. The main reason is that as drive sizes within RAID sets increase, so do rebuild times after a failure; some rebuilds take as long as a week, causing severe performance problems. In addition, standard tape backup processes can no longer handle the massive data volumes, are increasingly cumbersome and do not provide the necessary reliability.
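The week-long rebuild claim is easy to sanity-check with back-of-envelope arithmetic. The drive size and rebuild rates below are illustrative assumptions, not figures from the text:

```python
def rebuild_time_days(drive_bytes, rebuild_mb_per_s):
    """Estimate how long a single failed drive takes to rebuild,
    given a sustained rebuild rate in MB/s (assumption: the whole
    drive must be reconstructed, as in a classic RAID rebuild)."""
    seconds = drive_bytes / (rebuild_mb_per_s * 1_000_000)
    return seconds / 86_400  # seconds per day

# A hypothetical 4 TB drive rebuilding at a healthy 100 MB/s
# finishes in under half a day. Under production I/O load the
# effective rate can fall to ~10 MB/s, stretching the same
# rebuild toward a week while performance suffers.
idle = rebuild_time_days(4 * 10**12, 100)
busy = rebuild_time_days(4 * 10**12, 10)
```

As drives grow into the multi-terabyte range while rebuild rates stay roughly flat, the rebuild window, and the exposure to a second failure during it, grows with them.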
- Extreme scalability for large seismic datasets, up to exabytes of capacity
- No single point of failure; high data integrity and durability
- Performance that scales independently, up to tens of GB/s aggregate throughput
- Integrates easily with existing applications without any modification
- Self-heals in the background with minimal impact on system performance
- Single-pane system management with non-disruptive changes
- TCO approximately 50% less than traditional storage systems
The new generation of object storage solutions such as AmpliStor was built specifically to solve this big energy-exploration data challenge. Its modular architecture and fully abstracted software stack deliver unbreakable durability, virtually unlimited scalability and extreme efficiency at a lower cost. Optimized for Intel-based commercial off-the-shelf hardware, AmpliStor is designed to take full advantage of the latest Intel® Xeon® processors, with performance scaling linearly with each new controller in the system. AmpliStor is ideal for creating a common repository for vast amounts of seismic data and can consolidate several tiers of storage into a single tier.
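The linear scale-out claim can be sketched as a simple capacity-planning model. The per-controller throughput figure here is a hypothetical placeholder for illustration, not a published AmpliStor specification:

```python
import math

def aggregate_throughput_gbs(controllers, per_controller_gbs=1.5):
    """Linear scale-out model: aggregate throughput grows by a fixed
    amount with each controller added (per_controller_gbs is an
    assumed, illustrative figure)."""
    return controllers * per_controller_gbs

def controllers_needed(target_gbs, per_controller_gbs=1.5):
    """Smallest controller count whose aggregate meets a target rate."""
    return math.ceil(target_gbs / per_controller_gbs)
```

Under this model, reaching a given aggregate rate is a matter of adding controllers rather than forklift-upgrading a monolithic array, which is the practical meaning of "performance that scales independently."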