Seismic data processing is a complex process comprising several elements that contribute to its outcome. Because these elements are unpredictable, ensuring the quality of any seismic processing outcome is difficult. Seismic interpretation is a fundamental tool for depicting subsurface geology and supporting activities in various domains, such as environmental engineering and petroleum exploration.
A successful processing outcome depends on factors such as the people involved, the quality control applied, and the software and hardware used (Ghiath et al., 2011). Arranging all of these factors properly produces a successful outcome while reducing uncertainty and saving cost.
As the cost of conducting a seismic survey is much greater than the cost of processing the data, it is imperative that the process and technology used for data processing are effective and up to date. Having proper QA/QC procedures in place ensures that the data produced are of good quality and fit for use by the team.
When making processing decisions for 3D seismic surveys, it is best to adopt a team approach, with a multidisciplinary team comprising:
- Seismic processing geophysicists
- Interpretation geophysicists
- Reservoir engineers
- Management representatives
- Other experts in field acquisition and/or processing
A multidisciplinary team helps ensure that the outcome is robust, impartial and comprehensive. Other best practices include:
- Ensure the evaluation team is multidisciplinary – technical and commercial
- Ensure that the evaluation strategy and criteria are clearly outlined at the outset of the project
- Ensure that all available data are made available to the processing team – well logs, seismic, regional data, reports, etc.
- Ensure that the terms of reference (TOR) document contains a detailed scope of work, and that all deliverables required for future work, both intermediate and final, are clearly defined
- Ensure that, at the kick-off workshop, all members of the multidisciplinary team set up for the project are present and involved in determining its final scope and processing objectives
- Ensure that allowance is made for alternative testing, technologies and/or expert feedback
- Ensure that each processing step has a clearly defined QA/QC plan or structure, so that best practices are maintained at every stage
- Ensure that no decision is made prior to consultation with the technical evaluation team
- Ensure that, at each processing step, the project team sets up and takes part in review sessions covering input parameters and product quality. These reviews should usually take place at the workstations, not through a review of presentation slides
- Ensure that variations in the resolution, multiple attenuation, S/N ratio, imaging, fault resolution, phase stability, and amplitude integrity of the data are identified and, where unexpected, a reason is given or the variation is corrected
- Ensure that the processing geophysicists generate several sub-frequency cubes covering low-, medium- and high-frequency ranges
- Ensure that, when comparing volumes produced by different migration algorithms, appropriate seismic lines are QC'd to highlight the strengths and limitations of one versus the other
- Ensure that the interpreting geophysicists and geologists have regional interpretation experience and geological knowledge
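The sub-frequency cube step above is often implemented as zero-phase band-pass filtering of the processed volume along the time axis. A minimal sketch in Python, assuming a cube stored as a NumPy array with time as the last axis; the function name, sample interval and corner frequencies are illustrative assumptions, not values from any specific survey:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def sub_frequency_cubes(cube, dt, bands):
    """Split a 3D seismic cube (inline, xline, time) into band-limited copies.

    cube  : ndarray with time as the last axis
    dt    : sample interval in seconds
    bands : list of (low_hz, high_hz) corner-frequency pairs
    """
    fs = 1.0 / dt  # sampling frequency in Hz
    out = {}
    for lo, hi in bands:
        # 4th-order Butterworth band-pass; sosfiltfilt applies it forward and
        # backward so the result is zero-phase and event timing is preserved
        sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        out[(lo, hi)] = sosfiltfilt(sos, cube, axis=-1)
    return out

# Example on a synthetic cube sampled at 4 ms; the three bands below are
# hypothetical low/medium/high ranges for illustration only
rng = np.random.default_rng(0)
cube = rng.standard_normal((10, 10, 500))
cubes = sub_frequency_cubes(cube, dt=0.004, bands=[(5, 15), (15, 35), (35, 60)])
```

Each returned cube has the same geometry as the input, so the interpreter can load the low-, medium- and high-frequency versions side by side and check which band best resolves a given target.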
3D Seismic Data Processing is a 3-day training course held from 25-27 November 2019 in Kuala Lumpur. The objective of the course is to build an understanding of the theoretical background of seismic data processing algorithms, and of the quality control and quality assurance applied at each processing step, to ensure an optimal seismic data product from a data-quality perspective. The course covers the full seismic processing workflow, leading to seismic data that are amenable to interpretation, with case studies from onshore, transition-zone and offshore seismic surveys.