Some elements of the seismic data processing sequence are virtually universal, regardless of whether the intention is to perform time imaging, depth imaging, multicomponent imaging, or reservoir studies. Data conditioning and signal processing form the foundation of the seismic processing workflow.
Signal processing and data conditioning encompass a broad range of technologies that address challenges throughout the processing sequence: from data calibration and regularization through noise attenuation, demultiple, and signal enhancement.
Reducing multiple contamination is one of the greatest challenges in seismic processing, and no single approach fits all scenarios. We offer an extensive portfolio of demultiple algorithms and innovative workflows. In practice, these methods may be combined and cascaded to obtain the optimum solution.
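One classical member of such a demultiple portfolio is predictive deconvolution, which exploits the periodicity of multiples to predict and subtract them. The sketch below is a minimal single-trace illustration, not a production implementation; the function name, parameters, and the use of autocorrelation normal equations solved via `scipy.linalg.solve_toeplitz` are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import solve_toeplitz

def predictive_decon(trace, pred_lag, filt_len, eps=1e-3):
    """Suppress periodic multiples on a single trace (illustrative sketch).

    pred_lag -- prediction distance in samples (the multiple period)
    filt_len -- length of the prediction filter in samples
    eps      -- white-noise stabilization added to the zero-lag autocorrelation
    """
    n = len(trace)
    # One-sided autocorrelation of the trace (lags 0 .. n-1)
    ac = np.correlate(trace, trace, mode="full")[n - 1:]
    r = ac[:filt_len].copy()
    r[0] *= 1.0 + eps                          # pre-whitening for stability
    g = ac[pred_lag:pred_lag + filt_len]       # lagged autocorrelation (RHS)
    f = solve_toeplitz((r, r), g)              # prediction filter coefficients
    # Predicted multiple energy: filter output delayed by the prediction lag
    pred = np.convolve(trace, f)[:n]
    pred = np.concatenate([np.zeros(pred_lag), pred])[:n]
    return trace - pred                        # prediction-error (demultipled) trace
```

In practice the prediction lag is picked from the autocorrelation of the data, and multichannel or wave-equation methods take over where the periodicity assumption breaks down.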
All our signal processing techniques are designed to enhance structural continuity, boost resolution of fine details, and optimize amplitude and phase consistency in preparation for reservoir characterization.
Compensating for irregular acquisition geometries and for variations in the seismic wavelet is a vital step in any processing workflow. Numerous processing steps rely on assumptions of regular geometry, while confidence in many amplitude-dependent processes (such as AVO and inversion) requires control of the wavelet characteristics.
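At its simplest, geometry regularization maps traces recorded at irregular positions onto a uniform grid. The sketch below does this with per-sample linear interpolation along the offset axis; the function name and parameters are illustrative assumptions, and production regularization would typically use more sophisticated methods (e.g. anti-leakage Fourier reconstruction) that honor the signal spectrum.

```python
import numpy as np

def regularize_offsets(traces, offsets, target_offsets):
    """Interpolate traces from irregular offsets onto a regular offset grid.

    traces         -- (n_traces, n_samples) gather, one row per trace
    offsets        -- irregular source-receiver offsets, one per trace (sorted)
    target_offsets -- desired regular offset grid
    """
    out = np.empty((len(target_offsets), traces.shape[1]))
    for i in range(traces.shape[1]):           # loop over time samples
        # Linear interpolation across traces at this time sample
        out[:, i] = np.interp(target_offsets, offsets, traces[:, i])
    return out
```

Linear interpolation preserves amplitudes that vary smoothly with offset, which is the property amplitude-dependent processes such as AVO rely on.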
We offer a comprehensive set of tools to attenuate both coherent and random noise modes, from wind and swell noise through mud roll, ambient noise, spikes, and seismic interference.
We also provide techniques and workflows that combine multiple geophysical and geological measurements to improve characterization, delivering better analysis of the near-surface geology and, ultimately, a sharper image of the reservoir.