We evaluate the effect that radome transparency has on atmospheric opacity measurements performed by the skydip technique. We show that, except at rather high opacities, it is not sufficient to neglect losses in the radome (or ‘window’) during the data analysis and simply subtract them from the derived atmospheric opacity afterwards. Perhaps surprisingly, unless radome transparency is correctly modelled, the atmosphere will appear to have a minimum opacity that is many times greater than the radome losses. Our conclusion is that some previous site studies may have significantly underestimated the quality of the best submillimetre sites, and that the difference between these sites and poorer sites may be much greater than currently believed. We also show that part of the residual 857-GHz opacity at the best sites, currently ascribed to ‘dry-air opacity’, may in fact be an artefact caused by failing to model the radome properly during the data analysis.
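The size of this bias can be illustrated with a minimal numerical sketch (this is not the paper's actual analysis; all parameter values below are illustrative assumptions). Skydip data are simulated for an atmosphere viewed through a slightly lossy radome at ambient temperature, and then fitted with a radome-free skydip model. The airmass-independent emission contributed by the radome forces the fit towards a much larger apparent opacity, so that even subtracting the radome loss afterwards leaves an opacity floor several times the true value:

```python
import numpy as np
from scipy.optimize import curve_fit

# --- Illustrative parameters (assumed for this sketch, not from the paper) ---
T_atm = 270.0                   # effective atmospheric temperature [K]
T_rad = 270.0                   # radome physical temperature [K] (ambient)
t_rad = 0.99                    # radome power transmission (1 per cent loss)
tau_true = 0.05                 # true zenith opacity [nepers]
radome_loss = -np.log(t_rad)    # radome loss expressed as an opacity (~0.01)

# Synthetic skydip: sky brightness versus airmass A, seen through the radome.
A = np.linspace(1.0, 3.0, 25)
T_sky = T_atm * (1.0 - np.exp(-tau_true * A))     # atmosphere alone
T_meas = t_rad * T_sky + (1.0 - t_rad) * T_rad    # attenuated sky + radome emission

# Naive analysis: fit a radome-free skydip model with amplitude and opacity free.
def no_radome_model(A, T0, tau):
    return T0 * (1.0 - np.exp(-tau * A))

(T0_fit, tau_app), _ = curve_fit(no_radome_model, A, T_meas, p0=(T_atm, tau_true))

print(f"true zenith opacity        : {tau_true:.3f}")
print(f"radome loss as opacity     : {radome_loss:.3f}")
print(f"apparent (fitted) opacity  : {tau_app:.3f}")
print(f"'corrected' opacity        : {tau_app - radome_loss:.3f}")
```

With these numbers the fitted opacity comes out well above 0.1, far exceeding the true 0.05 plus the ~0.01 radome loss, which mirrors the qualitative claim above: the apparent minimum opacity can be many times the radome loss itself.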