As design and design thinking become increasingly important competencies for the modern workforce, the burden of assessing these fuzzy skills creates a scalability bottleneck. Toward addressing this need, this paper presents an exploratory study of a scalable computational approach to design thinking assessment. In this study, student responses to a variety of contextualized design questions, gathered both before and after participation in a design thinking training course, are analyzed. Specifically, a variety of text features are engineered, tested, and interpreted within a design thinking framework to identify specific markers of design thinking skill acquisition. Key findings of this work include the identification of text features that may enable scalable measurement of (1) user-centric language and (2) design thinking concept acquisition. These results contribute toward the creation of computational tools that ease the burden of providing feedback on design thinking skills to a wide audience.