Accurate low-dimensional models for the dynamics of falling liquid films subject to localized or time-varying heating are essential for applications that involve patterning or control. However, existing modelling methodologies either fail to respect fundamental thermodynamic properties or do not accurately capture the effects of advection and diffusion on the temperature profile. We argue that the best-performing long-wave models are those that give the surface temperature implicitly as the solution of an evolution equation in which the wall temperature alone (and none of its derivatives) appears as a source term. We show that, for both flat and non-uniform films, such a model can be derived rationally by expanding the temperature field about its free-surface values. We test this model in linear and nonlinear regimes, and show that its predictions are in remarkable quantitative agreement with full Navier–Stokes calculations for the surface temperature, the internal temperature field and the surface displacement that would result from temperature-induced Marangoni stresses.