Calibrating gravitational-wave detectors with GW170817 [CL]

http://arxiv.org/abs/1902.08076


The waveform of a compact binary coalescence is predicted by general relativity. It is therefore possible to directly constrain the response of a gravitational-wave (GW) detector by analyzing a signal's observed amplitude and phase evolution as a function of frequency. GW signals alone constrain the relative amplitude and phase between different frequencies within the same detector and between different detectors. We analyze GW170817's ability to calibrate the LIGO/Virgo detectors, finding a relative amplitude calibration precision of approximately $\pm20\%$ and a relative phase precision of $\pm15^\circ$ (1-$\sigma$ uncertainty) between the LIGO Hanford and Livingston detectors. Incorporating additional information about the distance and inclination of the source from electromagnetic observations tightens the relative amplitude calibration of the LIGO detectors to $\sim\pm15\%$. We also investigate the ability of future events to improve astronomical calibration. By simulating the cumulative uncertainties from an ensemble of detections, we find that several hundred events with electromagnetic counterparts, or several thousand events without counterparts, reach percent-level astronomical calibration. This corresponds to $\sim$5-10 years of operation at advanced LIGO and Virgo design sensitivity. We emphasize that direct in-situ measurements of detector calibration provide significantly higher precision than astronomical sources, and already constrain the calibration to a few percent in amplitude and a few degrees in phase. In this sense, astronomical calibrators only corroborate existing calibration measurements. Nonetheless, astrophysical calibration may become an important cross-check of existing calibration methods, providing a completely independent constraint on potential systematics.
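
As a rough back-of-the-envelope illustration of the quoted scaling (not taken from the paper, which simulates full posteriors over an ensemble of detections), the sketch below assumes each event contributes an independent relative-amplitude constraint, so the combined uncertainty averages down as $1/\sqrt{N}$. The per-event uncertainties used here (15% with an electromagnetic counterpart, 50% for a typical weaker event without one) are illustrative placeholders, not values from the paper; only the counterpart case is loosely motivated by the GW170817 numbers quoted above.

```python
import numpy as np

def events_needed(per_event_sigma, target_sigma=0.01):
    """Number of independent events needed for the ensemble-averaged
    calibration uncertainty to reach `target_sigma`, assuming each event
    contributes an independent Gaussian constraint of width `per_event_sigma`,
    so the combined uncertainty scales as per_event_sigma / sqrt(N).
    Back-of-the-envelope only; not the paper's simulation."""
    return int(np.ceil((per_event_sigma / target_sigma) ** 2))

# Hypothetical per-event relative-amplitude uncertainties:
with_counterpart = 0.15      # ~15%, loosely motivated by GW170817 with EM information
without_counterpart = 0.50   # placeholder for a typical event without a counterpart

for label, sigma in [("with EM counterpart", with_counterpart),
                     ("without counterpart", without_counterpart)]:
    print(f"{label}: ~{events_needed(sigma)} events to reach 1% amplitude calibration")
```

With these assumed inputs the estimate returns a few hundred events with counterparts and a few thousand without, consistent in order of magnitude with the abstract's statement, though the actual numbers depend entirely on the per-event uncertainties assumed.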


R. Essick and D. Holz
Fri, 22 Feb 19
22/52

Comments: 12 pages, 6 figures