Published online by Cambridge University Press: 10 November 2010
In this paper we discuss the differences between normal supernova remnants (SNRs), which evolve in comparatively low-density ambient media (n₀ ≪ 10⁵ cm⁻³), and “compact” supernova remnants, which evolve in the high-density (n₀ > 10⁴ cm⁻³) environments expected in the starburst regions of galaxies. For normal SNRs, radiative losses do not become important until time scales of the order of 10⁴ yr, when thin shell formation sets in. For compact SNRs, however, the evolution proceeds at a much quicker pace: because of the high densities, radiative losses, dominated by free-free emission at the high temperatures involved (≥ 10⁷ K), become important early on. Thin shell formation in this case sets in over time scales of the order of years, and most of the radiation is emitted in X-rays and the UV. We argue that the compact supernova activity associated with starburst regions in the centers of galaxies gives rise to most of the typical properties of the Broad Line Regions of active galactic nuclei.
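The density dependence behind these two time scales can be illustrated with the commonly quoted shell-formation time for a remnant expanding in a uniform medium (the numerical coefficient below is the standard estimate for an explosion energy of 10⁵¹ erg and roughly solar abundances; it is an illustration under those assumptions, not a formula taken from this paper):

```latex
% Standard shell-formation (onset of radiative phase) time estimate,
% with E_{51} the explosion energy in units of 10^{51} erg and
% n_0 the ambient density in cm^{-3}:
t_{\mathrm{sf}} \simeq 3.6 \times 10^{4}\,
    E_{51}^{3/14}\, n_{0}^{-4/7}\ \mathrm{yr}
% At fixed explosion energy the scaling is t_sf \propto n_0^{-4/7}:
%   n_0 \sim 1\ \mathrm{cm^{-3}}       ->  t_sf ~ 10^4 yr   (normal SNR)
%   n_0 \sim 10^{7}\ \mathrm{cm^{-3}}  ->  t_sf ~ a few yr  (compact SNR)
% since (10^{7})^{4/7} = 10^{4}, consistent with the time scales above.
```

The steep n₀⁻⁴/⁷ scaling is what compresses the radiative phase from ∼ 10⁴ yr down to years once the ambient density reaches starburst values.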
Starbursts are usually traced by their bright photoionized regions and large cluster luminosities, at either optical or IR wavelengths. The mass spectrum of the resulting stellar groups is difficult to derive, but a large fraction of massive stars is usually implied. Stars with initial masses above ∼ 8 M⊙ have strong UV radiation fields and significant mass loss throughout their evolution, and they are also the progenitors of Type II and Ib supernovae (SNe).