Interconnection metallization uses film stacks, often composed of thin (<10 nm) Ti, TiN, or Ti/TiN underlayer(s) with a thick (200–1000 nm) Al-alloy film deposited on top. The texture, or preferred orientation, in such film stacks has important implications for both processing and reliability. Earlier studies¹ have demonstrated the importance of the underlayers for Al texture; however, to date no systematic work has been done on the effect of processing conditions on underlayer texture. This study examines the effect of deposition parameters on underlayer texture development, as well as the effect of that underlayer texture on subsequently deposited Al-alloy films. Fiber plots were obtained for the Ti <002> and <101> and Al <111> reflections for a series of 20 nm Ti/10 nm TiN/400 nm AlCu films using both a conventional Siemens D500 diffractometer with a pole figure attachment and a Siemens HI-STAR area detector system, with Cu radiation from a rotating anode source. Because of overlap between the Al <111> and Ti <101> reflections, the Al was removed with a subtractive etch. In this way both the Al and underlayer film textures could be quantified. It was found that the Ti and Al-alloy film textures vary with deposition temperature, deposition method, and final film thickness. For example, an increase in the substrate temperature from 300 °C to 500 °C caused the Ti film texture to change from <002> to <101>. Additionally, switching the TiN deposition process from physical vapor deposition (PVD) sputtering to chemical vapor deposition (CVD) in a Ti/TiN/AlCu film stack caused a degradation in the AlCu <111> texture.
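The peak overlap that motivated the subtractive Al etch can be made concrete with Bragg's law. The sketch below is not from the study itself; it assumes Cu Kα1 radiation (λ ≈ 1.5406 Å) and handbook lattice parameters for fcc Al and hcp α-Ti (values not given in the text) to estimate the 2θ positions of the relevant reflections. The Al <111> and Ti reflections land within roughly 2° of one another, a separation that broad thin-film peaks in a pole figure scan cannot reliably resolve.

```python
import math

WAVELENGTH = 1.5406  # Cu K-alpha1 wavelength, angstroms (assumed)

def d_cubic(a, h, k, l):
    """Interplanar spacing for a cubic lattice with parameter a."""
    return a / math.sqrt(h * h + k * k + l * l)

def d_hexagonal(a, c, h, k, l):
    """Interplanar spacing for a hexagonal lattice with parameters a, c."""
    inv_d_sq = (4.0 / 3.0) * (h * h + h * k + k * k) / a**2 + l**2 / c**2
    return 1.0 / math.sqrt(inv_d_sq)

def two_theta(d, wavelength=WAVELENGTH):
    """Bragg angle 2*theta (degrees) for spacing d at the given wavelength."""
    return 2.0 * math.degrees(math.asin(wavelength / (2.0 * d)))

# Handbook lattice parameters (assumed): Al a = 4.0495 A;
# alpha-Ti a = 2.9505 A, c = 4.6826 A
al_111 = two_theta(d_cubic(4.0495, 1, 1, 1))          # ~38.5 deg
ti_002 = two_theta(d_hexagonal(2.9505, 4.6826, 0, 0, 2))  # ~38.4 deg
ti_101 = two_theta(d_hexagonal(2.9505, 4.6826, 1, 0, 1))  # ~40.2 deg

print(f"Al <111>: {al_111:.2f} deg  Ti <002>: {ti_002:.2f} deg  "
      f"Ti <101>: {ti_101:.2f} deg")
```

Removing the Al film before re-measuring, as described above, sidesteps this crowding entirely, so the underlayer reflections can be quantified on their own.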