Abstract
Gravitational wave data are often contaminated by non-Gaussian noise transients, “glitches,” which can bias the inference of astrophysical signal parameters. Traditional approaches either subtract glitches in a preprocessing step or include a glitch model built from an agnostic wavelet basis (e.g., BayesWave). In this work, we introduce a machine-learning-based approach to building a parametrized model of glitches. We train a normalizing flow on known glitches from the Gravity Spy catalog, constructing an informative prior on the glitch model. By incorporating this model into the Bayesian inference analysis with bilby, we estimate glitch and signal parameters simultaneously. We demonstrate the performance of our method through bias reduction, glitch identification, and Bayesian model selection on real glitches. Our results show that this approach effectively removes glitches from the data, significantly improving source parameter estimation and reducing bias.
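The core idea, joint estimation of signal and glitch parameters with a learned, informative glitch prior, can be illustrated with a deliberately minimal toy sketch. This is not the paper's code: the "flow" is reduced to a single affine map (so its maximum-likelihood fit is just a mean and standard deviation), the glitch and signal waveforms are hypothetical one-parameter templates, and all names and numbers are illustrative assumptions.

```python
# Toy sketch (assumptions, not the paper's pipeline): fit an affine "flow"
# prior on glitch amplitudes from a stand-in catalog, then jointly infer a
# signal amplitude s and a glitch amplitude g from simulated data.
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a glitch catalog: past glitch amplitudes to train the prior on.
train_glitch_amps = rng.normal(loc=2.0, scale=0.3, size=500)

# One-layer affine flow z = (g - shift) / scale; its MLE fit is mean/std.
shift, scale = train_glitch_amps.mean(), train_glitch_amps.std()

def flow_log_prior(g):
    """Log-density of the trained affine flow (a Gaussian here)."""
    z = (g - shift) / scale
    return -0.5 * z**2 - np.log(scale) - 0.5 * np.log(2 * np.pi)

# Simulated strain: signal (amplitude s_true) + glitch (g_true) + noise.
s_true, g_true, sigma = 1.0, 2.1, 0.2
t = np.linspace(0.0, 1.0, 64)
signal = lambda s: s * np.sin(2 * np.pi * 3 * t)
glitch = lambda g: g * np.exp(-((t - 0.5) ** 2) / 0.002)  # short blip
data = signal(s_true) + glitch(g_true) + rng.normal(0.0, sigma, t.size)

def log_posterior(s, g):
    """Gaussian likelihood on the residual, flow prior on g, flat prior on s."""
    resid = data - signal(s) - glitch(g)
    return -0.5 * np.sum(resid**2) / sigma**2 + flow_log_prior(g)

# Grid-search the joint MAP over (s, g): signal and glitch fit simultaneously,
# so the glitch is "subtracted" as part of the inference, not in preprocessing.
S, G = np.meshgrid(np.linspace(0.0, 2.0, 201), np.linspace(1.0, 3.0, 201))
lp = np.vectorize(log_posterior)(S, G)
i, j = np.unravel_index(np.argmax(lp), lp.shape)
s_map, g_map = S[i, j], G[i, j]
print(f"MAP estimates: s = {s_map:.2f}, g = {g_map:.2f}")
```

In the actual analysis the affine map is replaced by a trained normalizing flow over a multi-dimensional glitch parametrization, and the grid search by stochastic sampling within bilby, but the structure of the joint posterior is the same.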
| Original language | English |
|---|---|
| Article number | 024071 |
| Journal | Physical Review D |
| Volume | 112 |
| Issue number | 2 |
| DOIs | |
| Publication status | Published - 30 Jul 2025 |