Scaling seismic foundation models on AWS: Distributed training with Amazon SageMaker HyperPod and expanding context windows

📰 AWS Machine Learning

This post describes how TGS achieved near-linear scaling for distributed training and expanded context windows for its Vision Transformer-based seismic foundation model (SFM) using Amazon SageMaker HyperPod. The joint solution cut training time from 6 months to just 5 days while enabling analysis of seismic volumes larger than previously possible.

Published 2 Apr 2026