Laser-based spectroscopic techniques, such as cavity ring-down spectroscopy (CRDS), provide a new, cost-effective, and more widely available approach to measuring the oxygen isotope ratio in water molecules, ¹⁸O/¹⁶O (δ¹⁸O), and are increasingly used to measure δ¹⁸O in the world's oceans. Here, we present results from an interlaboratory comparison designed to evaluate the quality of CRDS-derived measurements and their consistency with values measured by isotope ratio mass spectrometry (IRMS). We also discuss the influence of salt on instrument performance and sample throughput for the analysis of seawater samples. This study compared measurements of δ¹⁸O from natural samples with a wide range of salinities (0, 29.4, and 34.6) performed by four independent labs: two using CRDS and two using IRMS. We also compared δ¹⁸O measurements of Northeast Atlantic Deep Water collected in 2013, 2012, 2009, and 1995 from the AR7W repeat hydrography transect across the Labrador Sea. The within-lab precision of ocean-based CRDS measurements approaches 0.03‰, which is better than the manufacturer's typically stated analytical precision (around ±0.05‰) and comparable to that achievable with IRMS. The interlaboratory difference (highest minus lowest) among measurements reported by the four labs is taken as an indicator of overall accuracy and is conservatively estimated to be < 0.1‰, with the potential to approach 0.05‰. Overall, these results show that CRDS-based δ¹⁸O measurements of seawater can be equivalent to high-quality measurements by IRMS.