
Political Knowledge and Foreign Policy Attitudes: A Preliminary Investigation of Support for the Use of Military Force in Ukraine

Supplementary Information

April 14, 2014

Kyle Dropp, Joshua D. Kertzer, and Thomas Zeitzoff1

We received a large number of questions and comments in response to our Monkey Cage blog post last week exploring the extent to which Americans can put Ukraine on a map. We were unable to address all of these questions in our follow-up post due to space limitations, so we’re including more information here.

Figure 1: Americans have trouble placing Ukraine on a map

We’re encouraged that our blog post last Monday exploring the extent to which Americans can put Ukraine on a map has received so much attention. As academics, we’re excited when anyone outside of our immediate family reads something we’ve written, so we’re thrilled by all of the comments, emails, blog posts, and tweets (we love you, Stephen Colbert!). More seriously, in a year when lawmakers and pundits have expressed skepticism about the contribution and relevance of the social sciences, we think the fact that a post by three political scientists topped the Washington Post most-read list serves as an important reminder that there’s a very real appetite for the kind of work that our discipline does.

1 Kyle Dropp is an Assistant Professor of Government at Dartmouth College and Visiting Associate Research Scholar at Princeton University’s Center for the Study of Democratic Politics (CSDP). Joshua D. Kertzer is a Dartmouth Fellow in US Foreign Policy and International Security at Dartmouth College, and Assistant Professor of Government at Harvard University. Thomas Zeitzoff is a Postdoctoral Fellow in Regional Political Economy at the Niehaus Center for Globalization and Governance at Princeton University. In the Fall of 2014 he will be an assistant professor in the School of Public Affairs at American University. Thanks to Pat Egan, Brendan Nyhan, John Sides, Josh Tucker, and Sean Westwood for their helpful comments.

In the previous post, we presented preliminary research we’ve undertaken that suggests three things. First, many Americans have trouble placing Ukraine on a map. Second, accuracy varies across some subgroups (the more education you have, for example, the better your chance of getting it right) but not across others (Republicans and Democrats fare about the same). Third, there is a relationship between how inaccurate people were and their attitudes related to US military intervention.

We’ve received a number of questions and comments about the data we collected, the methods we used, and what the results mean. Below, we first describe why political knowledge matters, next discuss our survey design, then review our main findings, and finally include some diagnostic checks. We are eager to hear from you, so please contact us directly with further reactions.

1 Why we think this is important

1.1 Why does it matter whether Americans can put Ukraine on a map?

Researchers studying political attitudes care deeply about the processes underlying those attitudes, such as partisanship, trust, political knowledge, and tolerance. And political scientists have shown that knowledge is related to a wide range of attitudes and behaviors. A key assumption in many of the existing theories that seek to explain variation in military conflict is that democratic publics can constrain their leaders’ actions in the foreign policy realm. However, these theories assume citizens have the knowledge to hold their leaders accountable for their actions. If, as some have argued, foreign policy is far removed from most people’s daily lives, then the ability of the public to serve as a check on their leaders’ foreign policy is greatly diminished.

1.2 What do we think is going on between geographic knowledge and attitudes about the use of force?

We are still in the early stages of explaining why Americans with lower foreign policy knowledge are more supportive of using military force in Ukraine. One possible explanation is that our Ukraine distance measure is a proxy for overall knowledge and news consumption. Americans who place Ukraine closer to its actual border may have seen news on the subject, may be aware that elites (e.g., members of Congress, the President) on both sides are hesitant to engage in military action, or may have an elevated sense of the overall costs of action in Ukraine (in terms of casualties). Less informed Americans may not have seen that elites on both sides are hesitant to engage in military action. Along these lines, we observe an absence of partisan differences once we control for general foreign policy attitudes, a finding that is consistent with this explanation. Since neither Republican nor Democratic leaders are actually advocating that the US use military force, it makes sense that the people who are more likely to favor intervention are also the people less likely to be engaged in the first place.

This reflects an ongoing debate in political science: we know that knowledge matters, and that voters who can name the Vice President, for example, behave differently from voters who can’t, but it is less clear whether this difference stems from knowledgeable voters putting their command of facts to good use or from certain types of people being more likely to have this knowledge in the first place.

2 Survey design and measurement

2.1 How was the survey designed?

This poll was conducted from March 28-31, 2014, with a national sample of 2,066 registered voters. The survey was programmed in Qualtrics online survey software, conducted by Paragon Insights, and fielded online by Survey Sampling International, Inc. (SSI). The data were weighted to approximate a target sample of registered voters based on age, race/ethnicity, gender, educational attainment, geographic region, annual household income, homeownership status, and marital status. The results we reported use weighted data, but the findings are the same with the raw, unweighted data. The full survey has a margin of error of plus or minus two percentage points.

The survey questionnaire contained the following sections: two commonly used scales drawn from previous research measuring respondents’ isolationism and military assertiveness; the visual maps section; questions about a number of possible actions the United States could take in Ukraine; a question battery asking Americans to describe the possible consequences if the United States used military force or took no military action; and a wide range of demographic items such as age, race, gender, educational attainment, and party affiliation.

The March 28-31, 2014, survey (Survey 2) is a replication of another survey we conducted from March 21-24, 2014 (Survey 1), on a separate national sample of 1,997 registered voters. Levels of support for US actions in Survey 2 closely matched those from Survey 1. The principal difference is that in Survey 1 we used a map of Eurasia and in Survey 2 we used a map of the entire world.


2.2 How did we obtain Americans’ responses and measure distance?

During the survey, respondents were shown a large, empty map and saw the following prompt: “We’ve heard a lot about Ukraine in the news lately. Can you tell us where Ukraine is by clicking on the map below? Please do not look up the answer online.” We then recorded the x-y position (i.e., horizontal and vertical in pixels) of where each respondent clicked. If respondents clicked multiple times, we recorded their last click.

There are a number of different ways to measure the distance between two points, but we chose the simplest one: we used a Euclidean distance measure from the closest Ukrainian border to where participants clicked on the map (akin to placing a ruler between two points). The earth, of course, isn’t flat, so this distance measure is really a measure of distance on the map participants saw rather than distance in the world. Finally, we reduced the influence of outlier observations by using a logarithmic transformation of total distance. We thought this was a defensible method for measuring respondents’ accuracy, but we anticipate exploring other measures in the future.
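As a rough sketch of this calculation (the coordinates and function name are our own illustrative assumptions, and we use a log(1 + d) transformation so that an exactly correct click maps to zero; the precise variant of the log transformation is not specified here):

```python
import math

def log_distance(click_xy, border_xy):
    """Euclidean pixel distance between a respondent's click and the
    nearest point on Ukraine's border, log-transformed to dampen the
    influence of extreme outliers. Coordinates are illustrative."""
    dx = click_xy[0] - border_xy[0]
    dy = click_xy[1] - border_xy[1]
    distance = math.hypot(dx, dy)  # straight-line ("ruler") distance
    return math.log(1 + distance)  # log(1 + d) so a perfect click gives 0

# A click 300 px right and 400 px down from the nearest border point
# is 500 px away before the log transformation:
print(log_distance((800, 600), (500, 200)))
```

A correct click (zero pixels from the border) yields a transformed distance of zero, and the transformation compresses the gap between, say, clicks in Kazakhstan and clicks in Brazil.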

2.3 How does our distance measure compare with other, more conventional foreign policy knowledge measures?

Political scientists have long measured foreign policy knowledge using multiple-choice question batteries that test for recall skills or familiarity with key processes. On Surveys 1 and 2, we also asked Americans multiple-choice questions to identify the current Chairman of the Joint Chiefs of Staff, treaty allies of the United States, and the name of the current Prime Minister of the United Kingdom. Respondents could score from 0 to 3 on this measure. The correlation between the number of questions answered correctly and our Ukraine distance measure is -0.2. In other words, people who answer more knowledge questions correctly are more accurate in placing Ukraine on the world map, though the correlation is relatively weak.
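To make the sign of that correlation concrete, here is a toy calculation with entirely made-up values (not survey data): a negative correlation means higher knowledge scores go together with smaller logged distances.

```python
import numpy as np

# Made-up knowledge scores (0-3) and logged map distances for six
# hypothetical respondents; illustrative only, not survey data.
knowledge = np.array([0, 1, 1, 2, 3, 3])
log_dist = np.array([7.2, 6.8, 7.0, 6.1, 5.4, 5.9])

# Pearson correlation: negative here because higher knowledge
# accompanies smaller (more accurate) distances.
r = np.corrcoef(knowledge, log_dist)[0, 1]
print(r)
```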

2.4 Is it strange that Americans have trouble putting Ukraine on a map?

Political scientists have known for a long time that people don’t know very much about international politics. This shouldn’t be surprising: even if “soft news” programs help bring information about foreign affairs to otherwise politically uninvolved viewers, foreign policy issues are far removed from most people’s daily lives. What people know about world affairs seems to be associated with what they think; in other research we’ve done, for example, we’ve found that higher-knowledge respondents are more likely to react to events on the world stage, simply because they’re more likely to be aware of them in the first place. One reason why our results may seem so striking is that people who are relatively politically engaged (e.g., the sort of person who reads The Monkey Cage in their spare time) tend to interact with other people who are relatively politically engaged, and thus overestimate how much other people know.

2.5 How would people do in other countries?

We don’t have data on how people in other countries would do on our map quiz (yet), but a 1994 survey found that Americans tend to be less knowledgeable about foreign affairs than citizens in four other western democracies. Other, more recent studies from education research echo these findings about America’s lack of geographic literacy compared to other industrialized countries. In ongoing research, we asked ∼1,000 residents of the United Kingdom in mid-April 2014 to identify many of the same countries on the same world map, along with Syria, North Korea, and Iran.

2.6 Would Americans do better with other maps?

This is a question we’re still exploring. Below is a map for the United Kingdom (UK), where respondents do relatively well.

Figure 2: Americans have little trouble placing the United Kingdom on a map

These results are from a poll conducted April 3-6, 2014, also via Survey Sampling International (SSI), in which one in six respondents was asked to locate the UK on a world map (hence the smaller number of dots on the map).

2.7 How did we measure support for military operations in Ukraine?

After viewing the map of Ukraine and clicking on a location, respondents viewed the following prompt:

“As you may know, Russian troops have invaded and occupied a portion of Ukraine known as Crimea, to protect what it sees as a threat to the majority ethnic Russians living there. In a March 16, 2014, referendum, the vast majority of Crimeans voted to secede from Ukraine and join Russia. Ukrainian officials have declared it a violation of their national territory and an act of war. US and European officials have demanded that Russia remove its troops from Ukrainian territory and have called Russian claims on Crimea a violation of international law. The US has levied economic sanctions against top Russian officials. Many analysts argue that the US must send a strong signal to Russia that further threatening of other states and allies will not be tolerated.”2

Respondents then answered the following question: “Please state how strongly you support or oppose each action. The United States should take military action in Ukraine.”

The response options were 1 through 7, where 1 was labeled “Strongly Oppose,” 4 was labeled “Neither Support nor Oppose,” and 7 was labeled “Strongly Support.” Thirteen percent of Americans selected options 5 through 7, which we informally categorized as support; 24 percent selected option 4 (“Neither Support nor Oppose”); and 63 percent selected options 1-3, which we categorized as oppose. As we note later in this post, the 7-point (continuous) and 3-point (support/neither/oppose) versions of this variable yield similar results in the model specifications.
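A minimal sketch of that 3-point recode (the function name and sample responses are our own illustrations):

```python
from collections import Counter

# Collapse the 7-point item into the 3-point version described above:
# 1-3 -> oppose, 4 -> neither, 5-7 -> support.
def recode_support(response):
    if response <= 3:
        return "oppose"
    if response == 4:
        return "neither"
    return "support"

responses = [1, 1, 2, 4, 5, 7, 3, 4]  # made-up raw answers
counts = Counter(recode_support(r) for r in responses)
print(counts)  # tallies of oppose / neither / support
```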

2.8 How depressing are these results?

What you take away from these results depends on where you sit. The Kazakhstani news site TengriNews, for example, summarized the findings with the headline “Americans confuse Ukraine and Kazakhstan on world map.” But we don’t think the results paint as negative a picture of the public as some have interpreted. First, we presented respondents with a blank map of the entire world, with no differentiation between landmasses and bodies of water, and asked them to identify a country in the interior of Europe.

Although there were some dramatic outliers, most respondents were reasonably close. If we look at the results as a sign of the wisdom of crowds, the modal response was quite accurate. More generally, since many survey respondents often don’t try very hard, it is difficult to tell whether the respondents who clicked in Brazil did so because they had no idea and were clicking randomly or because they genuinely thought Ukraine was in South America. (Of course, more traditional multiple-choice measures face the same problem, as anyone who has ever completed a multiple-choice test knows.)

2 This prompt contains more information and context before asking about attitudes toward military operations in Ukraine. This additional information, however, is unlikely to drive our observed relationship between accuracy and support for military operations in Ukraine.


3 Results

3.1 Main results

We tested the relationship between distance and support for military intervention using a number of statistical models. Our main models used an ordered logistic regression examining support for intervention as a function of logged distance, controlling for respondent age, educational level, race, gender, and party affiliation.
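For readers unfamiliar with the model, the sketch below shows how an ordered logit turns a latent index into probabilities over the seven response categories. The coefficient and cutpoints are made-up illustrations, not our estimates:

```python
import math

def ordered_logit_probs(xb, cutpoints):
    """Category probabilities implied by an ordered logit: compare the
    latent index xb to ascending cutpoints; each category's probability
    is the logistic mass between adjacent cutpoints."""
    logistic = lambda z: 1.0 / (1.0 + math.exp(-z))
    cum = [logistic(c - xb) for c in cutpoints] + [1.0]
    return [cum[0]] + [cum[i] - cum[i - 1] for i in range(1, len(cum))]

# Hypothetical positive coefficient on logged distance: a click farther
# from Ukraine raises the latent index, shifting probability mass toward
# the "support" end of the 7-point scale.
beta_dist = 0.3
cuts = [0.5, 1.2, 2.0, 2.6, 3.2, 4.0]  # six cutpoints, seven categories

probs_near = ordered_logit_probs(beta_dist * 4.0, cuts)  # accurate click
probs_far = ordered_logit_probs(beta_dist * 8.0, cuts)   # distant click
print(probs_far[6] > probs_near[6])  # more mass on "Strongly Support"
```

The probabilities for each respondent sum to one by construction; the model's coefficients determine how that mass shifts as logged distance changes.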

3.2 Interpretation of these results

Some readers have interpreted our results as saying that people who are ignorant support military intervention. We would push back against this interpretation for two reasons. 1) Most people do not want US military intervention in Ukraine (only 13% were in favor of it). 2) This is preliminary research, and we did not ask how strongly individuals supported intervention or how certain they were of their support. It could be that individuals who were less likely to know where Ukraine is were more uncertain about their responses and thus fell closer to the center of the scale than other respondents. Future waves of surveys will attempt to tease out whether a lack of geographical knowledge leads to more uncertainty or to greater support for the use of force.

3.3 A note on effect sizes

In general, the effect sizes on our principal variable of interest, logged distance, were modest. Compared with individuals at the 75th percentile of the distance metric, the 16% of individuals who correctly identified Ukraine’s location were about two percentage points less likely to support using military force abroad. Compared with individuals near the maximum of the distance metric, individuals who correctly identified Ukraine’s location were about four percentage points less likely to say they supported using military force abroad. The effect sizes, while modest, are identical across Survey 1 (March 21-24) and Survey 2 (March 28-31). They are similar in magnitude to the effect of moving up two increments on a five-point education scale (i.e., from some college to post-graduate education), and are considerably larger than the impact of party affiliation, which does not appear to have much explanatory power on this occasion. If we take into account the fact that respondents clustered at the “strongly oppose” end of the 7-point scale (i.e., a 1) and run a Tobit model instead, our results for the effect of distance on support for the use of force are even stronger.


4 Diagnostics

4.1 Sensitivity of our results

Our main models showed a significant relationship between distance and support for using military force. Subsequent variations, such as using raw unweighted data (versus survey weights with post-stratification), estimating a linear regression model (versus the ordered logit model), or including additional control variables, did not change the substantive or statistical significance of the relationship. Furthermore, our results are robust if we exclude extreme outliers (i.e., people who thought Ukraine was in the Bahamas), and the results are slightly stronger if we exclude any respondents in the furthest (highest) 25% of the distance metric.

4.2 What’s with the vertical line in Russia?

A larger than anticipated proportion of the sample took the survey using computers with low screen resolutions. As a result, these respondents had to actively scroll over to see the rest of the map. To the extent that this reduced the default area in which participants could click, it actually increased their accuracy.

In the previous week (March 21-24), we administered the study on a different sample of 1,977 Americans using a map of Eurasia (not the entire world) that fit into nearly all monitors, and we found the same relationships reported in this post.

4.3 Were people paying attention? And does that influence the findings?

One potential explanation for the results is that respondents simply weren’t paying attention, and that these inattentive participants were less accurate both on the map and in how they answered questions about their foreign policy preferences regarding Ukraine. We considered this possibility, but think there are a few reasons why it is less plausible.

First, we included a question at the beginning of the survey to measure whether respondents were paying attention. Respondents saw a block of text and were directed, at the end of that block of text, to provide an incorrect answer to the previous question (see Figure 3).

When restricting the sample to the approximately six in 10 respondents who correctly answered the attention check in Survey 2, the substantive results look the same. The median of the distance metric is identical across the full sample and the sample of attentive respondents. The same holds for Survey 1.

Figure 3: A sample attention-check

A second attention-related explanation for the results is that since the primary dependent variable of interest, support for using military force, has a low overall value and is on a seven-point scale, its value would be inflated if respondents were answering at random. This is plausible if inattentive individuals are selecting 1 through 7 at random, but less likely if such respondents are disproportionately picking the first choice on the left (“Strongly Oppose”). Nonetheless, if respondents are answering at random, then our distance metric should also predict responses to other survey questions that yield very low (or very high) levels of support. We do not find that to be the case across a number of other outcome variables. For example, we find no relationship between our distance metric and responses to a five-point item in the military assertiveness scale (“Rather than simply reacting to our enemies, it’s better for us to strike first”) with which only 15% of respondents agree or strongly agree.

Higher levels of random response patterns among individuals who incorrectly identify the location of Ukraine would also yield the opposite predictions for some of the other findings we reported in the original post. For instance, we found that less accurate respondents said Russia posed a larger threat to the United States than respondents who were able to identify Ukraine on a map. Since two in three respondents overall said Russia is “very threatening” or “somewhat threatening” to U.S. interests, and the remainder said Russia was “not too threatening” or “not threatening at all” (a 4-point scale), random response patterns would make inattentive respondents appear to view Russia as less of a threat. That is, if response patterns were random, we would expect approximately 50% of respondents to say “very threatening” or “somewhat threatening,” a figure far lower than the observed share of more than six in 10. Moreover, if respondents with a high distance measure are paying less attention than others, we would not expect them to be more likely than other, more attentive respondents to say Russia is “very threatening” or “somewhat threatening.” Either way, this is an important question worthy of further investigation, and future studies might randomize whether the first item is “Strongly Oppose” or “Strongly Support,” or examine how larger primacy effects among inattentive respondents might influence treatment effects.
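The 50% benchmark here is simply the uniform-random baseline on a four-option item: two of the four response options fall in the “threatening” half. A quick simulation (option labels abbreviated, purely illustrative) confirms it:

```python
import random

random.seed(0)

# Uniform random answering over the four threat options: the expected
# share choosing "very" or "somewhat threatening" is 2/4 = 50%.
options = ["very", "somewhat", "not too", "not at all"]
draws = [random.choice(options) for _ in range(100_000)]
share = sum(d in ("very", "somewhat") for d in draws) / len(draws)
print(share)  # close to 0.5, well below the observed two in three
```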

As a final note, our research raises important questions about the role that information plays in the public’s support for different foreign policy options. We examine an important relationship: low geographic knowledge is correlated with greater support for the use of force. These studies are still ongoing, and future surveys will seek to further disentangle the mechanisms connecting political knowledge to foreign policy attitudes.
