2018 October 22

Location: GRL 107

Group members discussed impressions of the machine-learning presentations at the SEG Annual Meeting. A rough outline of the topics discussed follows.

Attending: Andy, Antoine, Thomas, Iga, Jihyun, Bane, Arnab, Hayden

TODO:

  • find a room for weekly meetings @ noon (Thomas)
  • prepare next week’s topic slides (Bane + Hayden + Arnab)

Windowed CNN for fault classification

  • Problem: “Where are the faults?”
  • Xinming Wu (originally CWP, now at UT)
  • Training data generated from simple synthetic reflectivity models.
  • Artificial faults introduced into the data via the Mines Java Toolkit (JTK).
  • Results look convincing.
  • Fault classification given by quantizing:
    1. fault existence,
    2. azimuth, and
    3. dip.
  • Windowed approach, assuming the fault passes through the central voxel of the window (??); see the sketch after this list.
  • Faults reconstructed after classification
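
As a concrete reference, here is a minimal sketch of the windowed idea in PyTorch: a small 3D CNN over a seismic window with three quantized heads (existence, azimuth bin, dip bin). The window size, bin counts, and architecture are illustrative assumptions, not the network from the talk.

    # Minimal windowed 3D CNN fault classifier (PyTorch sketch).
    # Window size, bin counts, and architecture are assumptions.
    import torch
    import torch.nn as nn

    class WindowedFaultNet(nn.Module):
        def __init__(self, n_azimuth_bins=8, n_dip_bins=8):
            super().__init__()
            # Shared feature extractor over a small seismic window.
            self.features = nn.Sequential(
                nn.Conv3d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool3d(2),
                nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool3d(1), nn.Flatten(),
            )
            # Three quantized outputs: existence, azimuth bin, dip bin.
            self.existence = nn.Linear(32, 2)
            self.azimuth = nn.Linear(32, n_azimuth_bins)
            self.dip = nn.Linear(32, n_dip_bins)

        def forward(self, x):                 # x: (batch, 1, nz, ny, nx)
            h = self.features(x)
            return self.existence(h), self.azimuth(h), self.dip(h)

    net = WindowedFaultNet()
    window = torch.randn(4, 1, 16, 16, 16)    # four 16^3 windows
    exist, azi, dip = net(window)
    print(exist.shape, azi.shape, dip.shape)  # (4, 2) (4, 8) (4, 8)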

Classifying signals in ambient seismic

  • Problem: “I’m only interested in certain ambient signals.”
  • Fantine Huot (Stanford SEP)
  • Application to DAS and finding specific types of signals
  • Unsupervised clustering first, hand-pick the interesting clusters, then supervised training; rinse and repeat (see the sketch after this list).
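
A minimal sketch of that loop with scikit-learn, assuming per-window feature vectors; the features, cluster picks, and classifier are stand-ins, not Huot's actual pipeline.

    # Cluster, pick, supervise, repeat (scikit-learn sketch).
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)
    features = rng.normal(size=(1000, 16))  # e.g. spectral features per DAS window

    # 1. Unsupervised: group unlabeled windows into clusters.
    clusters = KMeans(n_clusters=10, n_init=10, random_state=0).fit_predict(features)

    # 2. Pick: an interpreter flags clusters that look like the
    #    signal of interest (hypothetical choice here).
    interesting = {2, 5}
    labels = np.isin(clusters, list(interesting)).astype(int)

    # 3. Supervised: train on the picked labels, apply to new data,
    #    inspect the mistakes, and repeat from step 1.
    clf = RandomForestClassifier(random_state=0).fit(features, labels)
    print(clf.predict(features[:5]))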

Extrapolating frequencies for improving FWI

  • Problem: “I need lower frequencies for my FWI to work.”
  • Is this approach physically valid?
  • Hopfield network + Boltzmann machine used (a toy Hopfield sketch follows this list).
    Q: Why and how do NNs deal with RTM artifacts “better”?
  • Overall fishy, probably a waste of time; we can already do RTM.
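
For reference only, a toy Hopfield network in NumPy showing the associative-recall building block named in the talk; this does not reproduce the frequency-extrapolation method itself.

    # Toy Hopfield network: store two patterns, recall from a noisy one.
    import numpy as np

    patterns = np.array([[1, 1, 1, 1, -1, -1, -1, -1],
                         [1, -1, 1, -1, 1, -1, 1, -1]], dtype=float)

    # Hebbian storage: sum of outer products, zero diagonal.
    W = patterns.T @ patterns
    np.fill_diagonal(W, 0.0)

    # Recall: start from pattern 0 with its first bit flipped,
    # then iterate sign updates until the state settles.
    state = np.array([-1, 1, 1, 1, -1, -1, -1, -1], dtype=float)
    for _ in range(3):
        state = np.sign(W @ state)
    print(state)  # recovers pattern 0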

FWI updates via NN

  • Problem: “My FWI model updates take a while and aren’t great.”
  • Network weights are updated instead of the model itself (see the sketch after this list).
    Q: Is this like what Andy is doing?
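
A minimal sketch of that reparameterization in PyTorch: the velocity model is the output of a network, so gradient steps land on the network weights rather than on the model directly. The forward operator F and the data d are hypothetical linear stand-ins for a real wave-equation solver.

    # FWI-style updates through network weights (PyTorch sketch).
    import torch
    import torch.nn as nn

    n = 64                                  # model size (illustrative)
    F = torch.randn(128, n)                 # stand-in linear "forward modeling"
    d = torch.randn(128)                    # stand-in observed data

    net = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, n))
    z = torch.randn(16)                     # fixed latent input
    opt = torch.optim.Adam(net.parameters(), lr=1e-2)

    for it in range(200):
        m = net(z)                          # velocity model = network output
        loss = ((F @ m - d) ** 2).mean()    # data misfit
        opt.zero_grad()
        loss.backward()                     # gradients w.r.t. the weights
        opt.step()                          # update weights, not m itself
    print(float(loss))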

ML in interpretation

  • Problem: “Proper salt interpretation takes too much time and expert knowledge.”
  • Salt interpretation is popular.
  • Top, bottom, and existence interpretations.
  • Quantifying error in the results is a problem.
  • Interpreter still required for QC.
  • 2D is overused; 3D is more appropriate (see the sketch after this list).
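
A minimal sketch of per-voxel salt segmentation with a small 3D CNN in PyTorch; the architecture is an illustrative assumption, not a network from any of the talks. Note that the sigmoid output is a score, not a calibrated uncertainty, which is exactly the error-quantification problem above.

    # Per-voxel salt probability from a tiny 3D CNN (PyTorch sketch).
    import torch
    import torch.nn as nn

    seg = nn.Sequential(
        nn.Conv3d(1, 16, 3, padding=1), nn.ReLU(),
        nn.Conv3d(16, 16, 3, padding=1), nn.ReLU(),
        nn.Conv3d(16, 1, 1),               # one logit per voxel
    )
    cube = torch.randn(1, 1, 32, 32, 32)   # seismic amplitude cube
    salt_prob = torch.sigmoid(seg(cube))   # salt probability per voxel
    print(salt_prob.shape)                 # (1, 1, 32, 32, 32)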

Surface wave attenuation

  • Problem: “Someone get these surface waves out of my data please.”
  • So many GANs.
  • Input: noisy data
  • Output: denoised data given by (insert denoising alg. here)
  • Training data: inputs and outputs from (insert denoising alg. here)
  • You could go unsupervised here (a supervised version is sketched after this list).
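
A minimal sketch of the supervised setup above: train a network to mimic an existing denoising algorithm by using that algorithm's outputs as labels. `conventional_denoise` is a hypothetical placeholder for whatever surface-wave attenuation method supplies the pairs, and the plain L2 loss stands in for the GAN losses from the talks.

    # Train a denoiser on pairs produced by an existing algorithm (PyTorch).
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    def conventional_denoise(x):
        # Placeholder for the existing algorithm (here: a crude smoother).
        return F.avg_pool1d(x, kernel_size=5, stride=1, padding=2)

    noisy = torch.randn(32, 1, 256)         # toy traces with "surface waves"
    clean = conventional_denoise(noisy)     # labels from the existing method

    net = nn.Sequential(nn.Conv1d(1, 16, 5, padding=2), nn.ReLU(),
                        nn.Conv1d(16, 1, 5, padding=2))
    opt = torch.optim.Adam(net.parameters(), lr=1e-3)
    for _ in range(100):
        loss = ((net(noisy) - clean) ** 2).mean()
        opt.zero_grad()
        loss.backward()
        opt.step()
    print(float(loss))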

Next steps

  • finding sparse representations of data (dictionary learning) for denoising and interpolation (Iga, Arnab, Thomas); see the sketch after this list
  • denoising and monitoring DAS data (Jihyun)
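
As a starting point for the dictionary-learning item, a minimal patch-based sketch with scikit-learn; the toy data, patch size, and dictionary size are illustrative assumptions.

    # Patch-based dictionary learning for denoising (scikit-learn sketch).
    import numpy as np
    from sklearn.decomposition import MiniBatchDictionaryLearning
    from sklearn.feature_extraction.image import (
        extract_patches_2d, reconstruct_from_patches_2d)

    rng = np.random.default_rng(0)
    section = rng.normal(size=(64, 64))     # stand-in seismic section
    patches = extract_patches_2d(section, (8, 8))
    X = patches.reshape(len(patches), -1)
    means = X.mean(axis=1, keepdims=True)   # remove per-patch means

    # Learn a sparse dictionary of patch atoms, then code each patch.
    dico = MiniBatchDictionaryLearning(n_components=50, alpha=1.0, random_state=0)
    codes = dico.fit(X - means).transform(X - means)

    # Denoise by rebuilding each patch from its sparse code.
    recon = (codes @ dico.components_ + means).reshape(patches.shape)
    denoised = reconstruct_from_patches_2d(recon, section.shape)
    print(denoised.shape)                   # (64, 64)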

Next week

Topics

  • Bane - semester project
  • Hayden - “I have research, I got this.”
  • Arnab - existing project
  • Backup - Iga
