Abstract: Many survey researchers are interested in gauging public support for government policy, but there is strong evidence that a question’s wording affects responses to it. I develop the first automated and scalable method to predict the magnitude and direction of the partisan bias a question’s wording may impose on survey responses, and show using a series of survey experiments that it outperforms public opinion scholars in predicting that bias. Using a novel data set of almost one million survey questions from 1997 to 2017, I then examine trends in partisan survey question bias over time. I find that while questions related to economic issues are relatively unbiased, questions related to Barack Obama become steadily more conservatively biased from 2008 to 2017. Questions related to abortion and immigration are generally conservatively biased, while questions related to healthcare and education are consistently liberally biased. Substantively, my results suggest that measurements of American public opinion are systematically biased; I discuss the implications of this result for democratic representation. Methodologically, this paper opens up new opportunities for studying ideology from text and for improving survey methodology and measurement in public opinion.