Interpretive Social Science: An Anti-Naturalist Approach

Mark Bevir and Jason Blakely

Print publication date: 2018

Print ISBN-13: 9780198832942

Published to Oxford Scholarship Online: December 2018

DOI: 10.1093/oso/9780198832942.001.0001





5 Methods
Interpretive Social Science

Mark Bevir

Jason Blakely

Oxford University Press

Abstract and Keywords

This chapter draws on the latest methodological literature in order to show how an anti-naturalist framework justifies multi-methods in social science research. Contrary to the widespread debate that pits “quantitative” versus “qualitative” methods, researchers are free to use methods from across the social sciences provided they remain aware of anti-naturalist concepts and concerns. Leading methods are analyzed in light of the latest social science, including: mass surveys, random sampling, regression analysis, statistics, rational choice modeling, ethnography, archival research, and long-form interviewing. A full-blown interpretive approach to the social sciences can make use of all the major methods and techniques for studying human behavior, while also avoiding the scientism that too often plagues their current deployment.

Keywords:   multi-methods, random sampling, regression analysis, rational choice, ethnography, qualitative versus quantitative

Social science today has largely reduced the interpretive turn to a commitment to “qualitative” methods. Unfortunately, this narrow and mistaken view of the interpretive turn is widespread among many of the most influential practitioners of social science. For example, three of the most eminent methodologists working in political science today equate interpretivism with deep immersion in the customs, practices, and institutions of a particular group of people.1 In this way, interpretive research becomes synonymous with ethnography and immersive, onsite fieldwork.2 Viewed from this perspective, the opposite of “interpretive” is quantitative methods like mass surveys, statistics, and various forms of modeling. The interpretive turn is then thought of as little more than an attempt to have qualitative methodologies reign supreme. The entire methodological debate becomes limited to a highly polarized discussion about “quantitative” versus “qualitative” research. So widespread is this confusion today that even those who defend the interpretive turn in social science often do so in the name of qualitative methods.3

Here our anti-naturalist framework can dispel confusions, pointing to a more viable way forward past the gridlock of a method dispute between “qualies” and “quants.” Anti-naturalist premises help show how the interpretive turn does not bind social scientists by logical chains to either qualitative or quantitative methods. To the contrary, social scientists are free to employ the full range of methods from the “soft” qualitative to the “hard” quantitative so long as they keep clear of certain naturalist malformations. This is because the primary methods used by social scientists today are philosophically compatible.4 Ethnography may be combined with statistical analysis, rational choice mixed and matched with long-form interviews, mass surveys carried out side by side with immersive field research. Methodology, properly conceived, is nothing more than forms of data collection, data analysis, and heuristics. Methods are instrumental, much like a hammer or nail, and whether they are put in the service of building a naturalist or an anti-naturalist edifice depends on how they are employed. This means social scientists can stop worrying about whether two methods are incompatible and leave the debate between “qualitative” versus “quantitative” behind. As instruments, methods are always subordinate to the wider purposes of the researcher and do not dictate the philosophical commitments of the work. Instead, the primary form of incompatibility social scientists do need to worry about is philosophical—especially the naturalist versus anti-naturalist divide. Our hope is to turn attention away from supposed methodological divisions, to the deeper and more important strata of philosophical dispute. Clearly, then, our use of the term “methodology” emerges out of philosophical concerns, and although it is used this way by a wide number of social scientists, it is also at odds with other uses of the term that consider methodology and philosophy to be more synonymous.5

Methods can be distinguished philosophically into three general, not entirely mutually exclusive categories. First, methods can be used for data collection or to generate information about the world (as in the case of ethnography, long-form and semi-structured interviews, and mass surveys). Second, they may be used to find patterns in data or what we call data analysis (as in the case of random sampling, statistical inference, case studies, grounded theory, and Q methodology). Finally, they can serve as heuristics that, although not chiefly concerned with either data collection or analysis, help inspire insights about social reality (as is the case with certain kinds of formal modeling, especially rational choice theory). While anti-naturalist philosophy prescribes clear parameters about how social scientists should treat data generation, data analysis, and heuristics, it does nothing to prohibit the use of any one particular technique. Social scientists determine what method is best for their research goals by exercising their experienced judgment within context. Provided social scientists steer clear of naturalism, they are free to creatively make use of whatever method best serves their research goals and purposes. Anti-naturalism makes clear the way in which multi-methods is fully compatible with the interpretive turn.

Data collection: ethnography, interviews, and mass surveys

Social science research is impossible to conduct without some way of grasping what social reality actually looks like on the ground. This means social scientists need tools for generating information about the world—what we will refer to as “data collection.” Data collection involves collecting information about a particular group’s beliefs, actions, practices, and way of life. Such data collection can take a number of forms—from the techniques of ethnography to mass surveys and census data. We will look at each of these techniques in turn to clarify how social scientists should use these methods while also remaining cautious of the possibility of naturalist vitiation. However, a brief clarification of our use of the word “data” may first be necessary. By “data” we do not mean to reintroduce some naturalist notion of empiricism or brute verification. In prior chapters we have already argued that no such immediate sense data is available in the social sciences. This is because the social sciences are engaged in trying to interpret webs of meanings and beliefs. Social reality is expressive of these holistic webs and so must be studied through a hermeneutic circle of relating part to whole. In other words, our use of the word “data” here has nothing to do with a naturalist claim to foundationalism or brute empiricism in the social sciences (a connotation it too often takes in naturalist-dominated disciplines). We mean something very different by data here—simply information about the meanings, beliefs, actions, practices, and so on that comprise social reality. In short, data is information about the social world that respects the holistic and interpretive nature of human agency.

Because of the dominance of the qualitative versus quantitative debate, ethnography has become the method most commonly associated with the interpretive turn. Indeed, many people believe that interpretive research is little more than a movement advocating the spread of ethnography and other such qualitative research methods. But anti-naturalism makes clear how this view is mistaken. Like any other form of data collection, ethnography can be put to either anti-naturalist or naturalist uses.

Largely developed by anthropologists and sociologists, ethnography is a method defined by immersion in the way of life of a particular group or culture. This is normally accomplished by onsite research and observation. The goal of ethnography is the construction of “thick descriptions.”6 Thick descriptions are detailed accounts of human belief and action within their surrounding webs of meaning. The most common way to conduct ethnography is through observer-participation, in which a social scientist scrutinizes a given group in its daily life, developing a sense for its patterns and rhythms. Observer-participation can take the form of “researcher alone,” in which the social scientist never departs from his or her role as a detached observer, or “situational participant,” in which the social scientist directly partakes in the customs, rituals, and other aspects of social life.7

Ethnographers often combine observer-participation with in-depth interviews. In-depth interviewing allows researchers to pursue the complexities and nuances of an individual’s or group’s beliefs through conversation. Unlike mass surveying, in-depth interviewing gives the researcher the chance to extemporize: following unexpected leads and other unforeseen elements of dialogue and exchange. This approach contrasts with “fixed-format” interviews that follow a single, preordained track and “forbid researchers from digging in areas that emerge as promising during the course of an interview.”8 Like the immersive practices of observer-participation, long-form interviews allow social scientists to arrive at thick descriptions—or highly nuanced, complex accounts of the meanings constituting social reality.

Ethnography is clearly an extremely powerful tool for social science. The vast and diverse body of successful research that currently employs ethnographic techniques confirms the strength and fruitfulness of this method. To choose only a few of many examples, the sociologist Kenneth MacLeish has used ethnographic techniques to track the effects of the Iraq war on military communities far beyond the frontlines of the battlefield. He accomplished this through the use of both observer-participation and long-form interviewing at Fort Hood, Texas. What MacLeish found is that the violence of war extends far beyond the battlefield and mixes into the ordinary lives of military families and their communities. The otherwise invisible tolls of war are therefore made concrete.9

In a different vein, Alice Goffman made a much-discussed contribution to urban ethnography by immersing herself in the world of young African American men living in the ghettos of Philadelphia.10 Goffman used thick descriptions to challenge the standard narrative which holds that racial equality was achieved with the civil rights movement, instead arguing that poor black men are “enveloped in intensive penal supervision” that is “the latest chapter in a long history of black exclusion and civic diminishment.”11 Further examples include the deep insights into the political and economic despair of white working-class conservative voters in rural Louisiana offered by Arlie Russell Hochschild, and Matthew Desmond’s masterful ethnographic work on the pressures of urban poverty and eviction.12

Regardless of the final assessment of these respective studies, they stand as examples of the effective use of ethnographic methods in an anti-naturalist study of society. Indeed, ethnography is a highly attractive and powerful tool for anti-naturalist research in particular for at least three related reasons. First, ethnography is especially helpful for crafting concepts that emerge out of dialogue with the social actors studied. Rather than concepts crafted “a priori by the researcher,” the ethnographer can form concepts from “the bottom up” in a way that is “context-situated.”13 Done correctly, this helps social scientists steer clear of the naturalist error of instrumental concept formation and instead form concepts that are dialogical. This averts the instrumentalist errors we discussed in Chapter 4 on concept formation.

A second reason ethnography is attractive for anti-naturalist inquiry is that thick descriptions are particularly useful for bringing to light the unnoticed meanings and beliefs structuring social reality. Ellen Pader emphasizes the power of ethnography in this regard by recounting the case of a single, working-class mother in New York City who, although she struggled to pay for food and rent, spent some of her meager salary on two luxury goods—an air-conditioning unit and a videogame for her children. Pader notes that many social science students, when presented with this poor mother’s purchases, polarize into two groups: one decrying the fiscal irresponsibility of the lower classes, the other defending the purchases as another case of a poor woman succumbing to the nefarious pressures of commercial society. Neither group comes close to seeing what ethnography revealed about this tragic case of human poverty. Namely, this single mother purchased the videogame and air-conditioning unit in a desperate attempt at childcare. Unable to afford a babysitter or daycare during the hot summer months when her children were out of school, the mother purchased these two items to try to keep her children off the rough streets of their New York City neighborhood.14 In this way, ethnographies of New York City’s public housing projects have made the motives driving particular residents intelligible in ways that would otherwise have remained undetected. Ethnography offers the opportunity for thick descriptions that grasp the meanings and beliefs informing agents’ particular actions. In this case the hard plights and no-win situations facing the poorest families would remain unintelligible without the help of ethnography.

A final way ethnography helps advance anti-naturalist research is that it can be used to attack oversimplifications of social reality by generating greater sensitivity to the contextual nature of meaning and language. Frederic Schaffer, for example, has used ethnography to great effect in this way—correcting contemporary political science’s tendency to assume that the word “democracy” means the same thing everywhere. As discussed in Chapter 4 on concept formation, this essentialist error occurs frequently within the study of democratic theory. Schaffer’s extended study of the word “demokaraasi” in Senegal shows how deeply this term differs from the “English-language concept of democracy used by American social scientists.”15 In this need to grasp language, all social science depends (if only inchoately and implicitly) on an ethnographic background of knowledge.

Yet ethnography need not exclusively be used in support of anti-naturalist research in the social sciences. Nothing about ethnography magically guarantees the right philosophical position. On the contrary, during the twentieth century ethnographic techniques were also often used to bolster naturalist research programs. For example, during the 1920s and 1930s, anthropologists like Margaret Mead used ethnography as a way to travel the globe in an effort to ascertain the “essential institutions and structures” of a culture.16 Mead and her students believed anthropologists could use short forms of observer-participation in order to identify the essential, abstract types that define all human social life. This search for essentialist, synchronic, and ahistorical structures clearly shares the philosophical attributes of naturalism and later came to be derisively referred to as “airplane ethnography.”17 But the episode also makes clear that there is nothing philosophical or logical barring the use of ethnographic tools for naturalist ends.

Indeed, longer stays and intensive language training would not necessarily have freed this kind of ethnography from naturalist distortion. Many ethnographers spend their entire lives specializing in a few cultural subgroups and carry out forms of immersive observer-participation that last years and even decades. Such practices often make for good workmanship and generate important insights about social reality, but they do not guarantee an escape from naturalist distortions. After all, a researcher could spend years mastering the appropriate language, and decades in the field engaged in observer-participation, and still adopt essentialist, ahistorical structures of explanation. Arguably, many naturalist social scientists working across modern societies today live out de facto just such an immersive experience in their own home cultures. In short, while long-term exposure to a cultural group may enrich ethnographic data, it does not guarantee freedom from naturalist conclusions.

Naturalism is also evident in some of the earliest ethnographies ever conducted. Historically speaking, ethnography began in part as a colonial endeavor. The pioneers of ethnographic technique often sought to make far-flung and purportedly primitive cultures intelligible to European colonialists. What arose from this colonial approach to social science were too often essentialist and “orientalist” fictionalizations of the “natives.” The native was stripped of the dignity of human agency by social scientists and treated as simply another representative of a monolithic cultural type.18 Where Mead sought ahistorical, universal structures shared by all global cultures, colonial anthropologists instead ethnocentrically assumed there was a particular, essential “spirit” defining a culture, which the ethnographer tapped.19 Once again, then, ethnography can clearly be put to naturalist uses—for the construction of monolithic natural types: essentialist and sometimes even racist conceptions of the core nuclei of cultural features that invariantly characterize a people. Ethnography need not take on such misbegotten features, but nothing philosophical guarantees that it is a method pure of naturalist taint. The qualitative–quantitative debate, with its insistence that methods guarantee interpretive outcomes, is therefore potentially misleading in this regard.

Fortunately, ethnographers in our own time have developed a number of post-colonial strategies for avoiding naturalist pitfalls like that of essentialism. These strategies are important to keep in mind for anyone wishing to make use of ethnography. The first is making sure to disaggregate meanings and to guard against the creation of false monoliths. In ethnography’s colonial phase, anthropologists sometimes favored the study of small, bounded locales, like a village or a particular political institution, and from these they would extrapolate the supposed overarching spirit of an entire people. However, today ethnographers are more apt to study all varieties of social organization, from the army to large multi-national corporations, while also pursuing “multi-sited” studies, following communities across borders and into the different branches of an increasingly globalized society.20 Where ethnographers once spent much time trying to make the far-flung and strange familiar (e.g., Balinese cockfights), now they are as likely to focus on features of the nearby in an effort to make the familiar appear strange (e.g., American shopping habits). Ethnographers can avoid the error of essentialism by always searching for the heterogeneity and variances within cultures and not rushing to convert local meanings into civilizational monoliths. As we already discussed at length, meanings are a series of family resemblances and not a fixed nucleus or core. One of the important anti-naturalist lessons of post-colonial ethnography is to be alive to the sheer heterogeneity and variance in meaning.

Similarly, the post-colonial turn in ethnography has inspired researchers to develop innovative dialogical approaches to their research. Rather than the monologue of the ethnocentric colonialist, ethnographers today have developed strategies of deeper dialogue, including lengthy quotation, various forms of co-authorship, and other such attempts to let the subject of the study speak in his or her own voice.21 Naturalist conceptual problems like essentialism and object-side instrumentalism can potentially be corrected by these more radically dialogical strategies.

Ultimately, ethnography is not any more immune to naturalism than other forms of data collection. The fact that it can be classified as a “qualitative” method does not resolve the deeper philosophical issues of which social scientists must remain aware. Because naturalism and anti-naturalism are philosophies and not mere techniques, they can subordinate any number of methods into their service. Furthermore, because all methods are compatible with one another, ethnography is easily paired with other kinds of data generation that are less immersive. For example, ethnography and long-form interviewing are often used by social scientists in conjunction with mass surveys and questionnaires.22 So researchers conducting a mass census survey might supplement their work with in-depth interviews of particular subgroups within that population in order to gain a more nuanced sense of their beliefs. Similarly, a mass survey registering widespread dissatisfaction with a given public figure can be combined with in-depth interviewing to uncover the specific and myriad reasons for this dissatisfaction. Thus, ethnography can be joined with other more “quantitative” methods in the social sciences by providing contextual, local knowledge that might otherwise remain inaccessible. How this mixing and matching can be realized will become clearer by briefly scrutinizing a second form of data collection—mass surveys.

If ethnography is frequently mistaken as an exclusively interpretive method, mass surveys are too often thought of as inescapably naturalist. Again, this is due to the spell cast by the quantitative–qualitative debate, which assumes that social scientists must choose a side when it comes to methods. Yet mass surveys remain one of the most effective tools for collecting information about the social world. Such surveys can be used to describe individual beliefs and behaviors as well as to capture the attributes of particular social groupings—for instance, how many households own a firearm or how many cities run a public transportation system. In this way, surveys can describe both individuals and larger group organizations (churches, clubs, companies, cities, and nations). Mass surveys allow social scientists to sketch the social world in broad strokes. Such sketches of social reality can either take the form of a single snapshot (“cross-sectional”) or else depict a population over an extended period of time (“longitudinal”).23 Any social science that could not make use of such a powerful tool would be greatly impoverished. One can hardly be surprised when well-meaning social scientists, taught that they must decide between qualitative and quantitative methods (and who therefore believe the interpretive turn is incompatible with the use of mass surveys), opt for quantitative methods and forgo anything they could learn from hermeneutic philosophy.

Fortunately, anti-naturalism makes clear that no such exclusive choice between quantitative methods and the interpretive turn is necessary. Rather, the main naturalist threat that must be avoided in the use of surveys is atomization. In Chapter 4 on concept formation we argued at length that under the influence of naturalist philosophy, social scientists often try to atomize political reality in order to correlate one bit of reality with another. This is done in the hope of discovering general causal mechanisms or laws. For instance, naturalist social scientists might try to establish a link between one isolated feature discovered by a mass survey (like a respondent’s race, age, class, or gender) and another (like their voter preferences). In doing so they might try to explain voter behavior by treating some atomistic fact as if it stood in a necessary causal bond with another such fact. When one atomistic bit of social reality occurs, another bit is said to necessarily follow (or at least be correlated in some way). Indeed, survey research is often erroneously presented by naturalist methodologists as tied to just such “multivariate analysis.”24 When framed by such concepts and ambitions, mass surveys are thought to simply generate data for mechanistic, naturalist explanations.

But there is no logical link binding mass surveys to such atomism and mechanism. Instead, social scientists may affirm the descriptive validity of particular mass survey research (for example, X percent of white female Christians voted for the rightwing candidate) while recognizing that any attempt at explaining these beliefs will require further exploration of the reasons and beliefs held by these agents. In other words, explanation requires placing beliefs in a wider web of meanings. In order to move from description to explanation, mass survey research might benefit from supplementing its findings with ethnographic research or panel studies that take a sample of respondents in order to delve into their reasons for holding the beliefs that they do.25

A similar point can be made about structured and semi-structured interview methodologies. Structuring an interview refers to “the degree to which the questions and other interventions made by the interviewer are in fact pre-prepared by the researcher”; thus, interviewing can vary in approach from “lightly structured to heavily structured…from the completely unstructured to the fully structured.”26 Heavily structured interviews can sometimes veer into naturalism by ignoring the contingent beliefs and meanings of those being interviewed. In these cases, interviewers can fall into the trap of treating the answers given by interviewees as reified, brute data (as if they were merely extracting readymade, interpretation-free facts). When language and beliefs are treated this way, social reality is seriously disfigured. But there is nothing about the degree of structuring of an interview that forces researchers to treat reality in a naturalist way. To the contrary, staying aware of the interpretive features of the meanings gathered allows researchers to decide how much or how little to structure their interviews on a purely pragmatic basis.

Thus, there is no need to make a hard choice between “quantitative” mass surveys or structured interviews and the insights of anti-naturalist philosophy. To the contrary, anti-naturalist philosophy shows that the descriptions of mass survey research and structured interviews can be accepted as valid while also insisting that explanation requires embedding these beliefs within some narrative. Both concept formation and explanatory forms must remain anti-naturalist, but the methods for generating information need not be rejected, even if some caution in their use is in order.

In short, data collection (be it in the form of immersive ethnography, long-form interviews, or mass surveys and questionnaires) may be mixed and matched by social scientists to best serve their research goals. But what about the use of statistics that are often paired with mass surveys? Surely such randomization and statistical inference are the mark of quantitative methods that are necessarily tied to the naturalist project? In order to grapple with this problem, we must consider methods that are not forms of information collection but of data analysis.

Data analysis: random sampling, statistical inference, case studies, grounded theory, and Q methodology

Ethnography, interviewing, and mass surveys are all forms of data collection—that is, of gathering information about a given population’s beliefs, actions, and way of life. This is distinct from data analysis, in which such information is subjected to a certain kind of organization in search of patterns. Yet similar to data collection, data analysis is also frequently subject to the widespread assumption that certain forms of it are either inescapably naturalist or inescapably interpretivist. For example, random sampling and statistical inference are often thought to be inherently naturalistic, while case studies are too often conceived as necessarily sensitive to meanings. Other methods like grounded theory and Q methodology are often subject to considerable confusion and claimed as both scientistic and sensitive to meanings.27 Yet as was the case with data collection, these tools of analysis are largely instrumental and can take either naturalist or anti-naturalist forms. Social scientists are thus free to make use of whichever data analysis tools they choose so long as they guard themselves carefully against naturalist tendencies. Looking at each of these forms of data analysis will clarify how this is the case.

Random sampling makes use of statistical theory in order to generalize about a large population while studying only a much smaller, more manageable subset. Without a technique like random sampling, social scientists would be unable to describe larger populations because such a project might be impractical, cost prohibitive, or in some other way unmanageable. Random sampling resolves this problem by allowing social scientists to conduct a rational form of guesswork when trying to infer the characteristics of a larger population by examining only a thin slice. This form of data analysis requires that every single member of a population has an equal chance of selection. Such randomization is an effort to eliminate bias in selecting the subset. The subset then gives a picture of how beliefs are distributed.

This is not the place for an extended exposition on the statistical theory that supports the inference from the small subset to the total population—but in general terms, this inference is possible because a distribution of samples takes on a fixed, normal shape. For example, support for the Green Party in a given population may be 7 percent. A sample of the wider population might instead place the percentage of Green Party support at 4 percent, a second sample at 9 percent, a third at 6 percent. So long as the survey respondents are truly randomly selected, the more samples researchers take from a given larger population, the more the sample estimates will converge around a mean. This creates a distribution of sample estimates clustered around a given number. In the use of random sampling, this number or mean always remains a guess or approximation. Because the population has not been exhaustively interviewed, the mean may be wrong. But the more samples that are taken, the more reliable this guesswork becomes. Indeed, statistical theory allows researchers to quantify the degree of their uncertainty. The sample mean allows researchers to build a “bridge” from what they know about the sample population to what they “believe, probabilistically to be true about the broader population.”28 This inference, from a small subset of a population to what is likely about a large population, is the very heart of what is called statistical inference.
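The convergence described above can be sketched in a short simulation. The numbers here are hypothetical (a population of 100,000 in which true Green Party support is fixed at 7 percent), and the code is our own illustrative sketch rather than anything drawn from the survey literature:

```python
import random

random.seed(42)

# Hypothetical population: 100,000 people, exactly 7 percent of whom
# support the Green Party (1 = supporter, 0 = non-supporter).
TRUE_SUPPORT = 0.07
POPULATION_SIZE = 100_000
SAMPLE_SIZE = 500

population = ([1] * int(POPULATION_SIZE * TRUE_SUPPORT)
              + [0] * int(POPULATION_SIZE * (1 - TRUE_SUPPORT)))

def sample_proportion(pop, n):
    """Simple random sample: every member has an equal chance of selection."""
    return sum(random.sample(pop, n)) / n

# Any single sample is only a guess (it may say 4, 6, or 9 percent)...
estimates = [sample_proportion(population, SAMPLE_SIZE) for _ in range(200)]

# ...but the estimates cluster around the true value of 7 percent.
mean_estimate = sum(estimates) / len(estimates)
print(f"single samples range from {min(estimates):.3f} to {max(estimates):.3f}")
print(f"mean of 200 sample estimates: {mean_estimate:.3f}")
```

Each individual sample proportion scatters around the true figure, while the mean of many such samples settles close to 0.07, which is the sense in which randomization turns guesswork into rational guesswork.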

Thus, there is an ineliminable uncertainty in the method of random sampling and statistical inference. Randomization and statistical inference are a form of data analysis that allows for highly sophisticated guesswork. No researcher should ever forget this. Indeed, the guesswork can itself be subject to statistical description and analysis in terms of standard deviations that can be treated as “confidence” levels.29 As with mass surveys more generally, there are many who mistakenly associate random sampling and statistical inference with an exclusively “scientific” and naturalist approach.30 But there is nothing inherent to this method that is philosophically incompatible with an interpretive approach. Guesswork, even of a highly sophisticated statistical form, is in no way out of bounds for anti-naturalists. Rather, as with mass surveys more generally, the real naturalist threat occurs in how social scientists use this information. If they tie this information to multivariate analysis that atomizes political reality in order to seek out general causal laws, then something has gone badly wrong. If, however, they use this descriptive guesswork to embed their findings in further webs of meaning and belief (perhaps with the aid of ethnography or other more immersive techniques), then they are on firm philosophical footing.
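How statistical theory quantifies this uncertainty can likewise be sketched in a few lines. The standard-error formula below is textbook statistical theory, not a formula given in this chapter, and the survey figures are invented for illustration:

```python
import math

# Hypothetical survey result: 35 of 500 randomly sampled respondents
# (7 percent) say they support the Green Party.
n = 500
p_hat = 35 / n

# Standard error of a sample proportion (classical statistical theory).
standard_error = math.sqrt(p_hat * (1 - p_hat) / n)

# A conventional 95 percent confidence interval: the quantified
# "guesswork" described above. Roughly 95 of every 100 intervals built
# this way would cover the true population proportion.
low = p_hat - 1.96 * standard_error
high = p_hat + 1.96 * standard_error
print(f"estimate: {p_hat:.3f}, 95% interval: ({low:.3f}, {high:.3f})")
# prints: estimate: 0.070, 95% interval: (0.048, 0.092)
```

The interval does not eliminate the guesswork; it simply states how wide the guess is, which is exactly the kind of disciplined uncertainty an anti-naturalist can accept as descriptively useful.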

Both the promise and the peril of random sampling and statistical inference can be clarified with an example from the social science literature. In 2005, sociologists Christian Smith and Melinda Lundquist Denton published a deep analysis of the National Study of Youth and Religion, a random-digit-dial sample of teens across the United States ages thirteen to seventeen.31 The fruit of this research was the single largest trove of statistical data on U.S. teen attitudes toward religion and spirituality at that time. Such a massive random sample survey can be embraced by anti-naturalists as providing an important descriptive map of certain features of social reality. For example, anti-naturalists can affirm Smith and Denton’s finding that contrary to widespread belief at that time, U.S. teens were not trending away from traditional churches and congregations; indeed, the “vast majority of U.S. teenagers profess to be theists.”32 This could be taken as a descriptively valid finding at the time of the survey research.

However, anti-naturalists should also take note of an important caveat to this finding that was furnished by Smith and Denton’s own supplementary, more ethnographic research on this topic. Specifically, Smith and Denton used a randomized subsample of the survey respondents in order to conduct 267 in-depth face-to-face interviews. Among other things, this in-depth investigation uncovered that “the de facto religion” among U.S. teens at that time was what the authors termed “Moralistic Therapeutic Deism” or the belief that traditional religion is primarily a tool for making people nice and generating good individual self-esteem (as opposed to, say, the more mystical, metaphysical, and political aims of various traditional forms of religious belief).33 In this respect, the in-depth interviewing revealed a deeper underlying complexity to the statistical survey research. While the majority of American teens continued to identify with the traditional churches, what they meant by “God” had changed significantly. God was an impersonal designer, a lawmaker whose primary concern was fostering self-esteem and kindness. He was not so much a real person, a teacher of sin and grace, or a redeemer of human history. In this way, mass surveys combined with in-depth interviews established that U.S. adolescents were both more and less religious than their contemporaries suspected.

(p.99) The lesson for an anti-naturalist social science is clear: mass surveys, random sampling, and statistical inference are valuable for large-scale sketches of social reality, but they do not replace an analysis of the deeper beliefs and traditions that create a given status quo. As Smith and Denton observe, mass statistical surveys provide an “overarching sense of our social world,” but they are also “oversimplifying” and must be supplemented with deeper engagement with cultural meanings and textures.34 In this respect Smith and Denton’s work serves as a model for anti-naturalist researchers, expertly employing both mass survey statistics and ethnographic inquiry. In doing so, they also illustrate our point that all methods are compatible and can be effectively mixed and matched. The move from so-called “quantitative” to “qualitative” methods presents no impassable boundary to the working social scientist.

However, Smith and Denton’s research also serves as a cautionary tale insofar as it occasionally creeps toward naturalism. To pick just one example, Smith and Denton propose possible “empirical correlations and causal relations” between the intensity of religious belief and adolescent wellbeing or good life “outcomes.”35 Of course, the idea that there would be a mechanistic causal relationship between teenage belief in religion and wellbeing commits a number of the errors we have already discussed in prior chapters. Not only can such an explanation not cope with anomalies, but it also neglects the contingency of meanings and the basic narrative structure of human agency. Yet the point at present is not to rehash these arguments. Rather, the point is to make clear the way that statistical data can be both used and abused. Random sampling and statistical inference are powerful tools for anti-naturalist social science provided researchers keep clear of the temptation to impose mechanistic explanations between atomized, essentialized, or reified variables (here “religiosity” and “wellbeing”). Instead, social scientists should explore the inherited traditions and contingent reasons and beliefs that have helped create a particular distribution of beliefs, attitudes, and behaviors. They should connect a given set of findings to wider webs of belief and meaning. This means engaging in the hermeneutic circle, and explaining a particular pattern through the construction of a holistic narrative of meanings, not formulating mechanistic laws. Agents should be situated within traditions, and stories told about how they inherited or modified their beliefs, actions, and practices.

For example, if religious U.S. adolescents are outperforming their peers in terms of “wellbeing,” the anti-naturalist social scientist should consider a whole battery of questions that remain obscured by the tacit naturalist assumptions of Smith and Denton. This means looking at questions like: Are certain religious traditions more likely to create certain kinds of wellbeing (for example, Max Weber’s famous link between Calvinist theology and capitalist prosperity)? Such cultural links might then reveal that the term “wellbeing” is not simply neutrally descriptive. Whose definition of “wellbeing” is being granted priority in such (p.100) research? After all, varying traditions have starkly opposed conceptions of “wellbeing.” The term is not essentially or atomistically self-evident, but must be related to wider webs of meaning in ways that produce rival and contestable conceptions of wellbeing and not just one. In this vein, a reader of Smith and Denton’s study whose philosophical intuitions have been honed by anti-naturalism might ask: Is the study neglecting rival definitions of “wellbeing” held by other religious traditions or teens who do not necessarily have religious beliefs? Perhaps atheistic and agnostic teens simply do not share the same conception of wellbeing (this would mean Smith and Denton might be committing conceptual errors like essentialism and object-side instrumentalism).

Yet even if the concept of wellbeing is widely shared, the link is not mechanistic, but involves contingent belief formation and self-interpretation. An anti-naturalist social science must be prepared to explore the complex matrices of meaning, belief, tradition, and practice that have created certain descriptive relationships and not move to the oversimplifying, distortive mechanistic explanatory forms of naturalism. As with mass surveys more generally, data analysis like random sampling and statistical inference can be accepted as a powerful tool provided social scientists do not mistakenly begin to atomize and correlate such findings into mechanistic forms of explanation.

A parallel set of problems arises with tools of analysis like case studies, which are often conversely thought to be a guarantee of more historically and interpretively sensitive inquiry than statistical inference and random sampling. Case studies are employed across the social sciences, but in political science are most often identified with the more “qualitative” subfields, especially comparative politics. One of the most widely cited methodologists of case studies is the political scientist John Gerring. Briefly scrutinizing his influential claims will help clarify the naturalist dangers and anti-naturalist potential of this tool of analysis.

Gerring defines a “case” as a single instance of a phenomenon (e.g., a nation-state, a city, a prison, a voter) within a wider class of that phenomenon (e.g., nation-states, cities, prisons, voters). Studying particular cases is thus a strategy for analyzing political reality by intensively drilling down on a single historical instance of it (or perhaps a cross-case study of a handful of cases) with the goal of better understanding something about the larger series or class.36 In comparative politics the case study is often presented as eminently historical, sensitive to context, and anti-scientistic. Yet as with the other two forms of data analysis, there is nothing logically binding this method of inquiry to either naturalism or anti-naturalism. What is crucial from a philosophical perspective is that researchers retain a strong sense of the holistic, historical, contingent, and narrative nature of social and political reality. So cases can be fruitfully analyzed by interpretive social scientists to search for analogies or wider patterns of meaning and belief. But these must always be (p.101) conceptualized in terms of narratives about contingent features of reality and not ahistorical, formal, atomistic, or mechanistic applications.

Unfortunately, Gerring’s own treatment of case studies slides in a naturalist direction, not only through his atomizing of ahistorical units (as if a nation-state or prison were a reified chunk of reality that could be moved from context to context) but also through his extended analysis of case studies as tools for finding causal pathways and relationships between formal variables. For example, Gerring argues that one of the primary virtues of case-study analyses is that they can be used to test either strong or weak causal bonds between ahistorical variables. In the case of strong causal bonds, “X is assumed to be necessary and/or sufficient for Y’s occurrence,” whereas in weak causal bonds, the “mechanisms” are “more tenuous,” “highly irregular,” and “probabilistic.”37 When the former is the case, a single case study can be used to disprove a claim of a strong causal bond, but when a claim is being made of probabilistic causal bonds, Gerring recommends social scientists use a cluster of cases (or “cross-case” studies) to test the hypothesis. In both scenarios, the problem is that Gerring is fitting case-study forms of analysis to a naturalist schema of mechanistic explanation between ahistorical, atomized variables.

For instance, in the case of weak, probabilistic causal bonds between variables Gerring writes of “democracy” and “the economy” as if they were formal, ahistorical variables: “democracy, if it has any effect on economic growth at all, probably has only a slight effect over the near-to-medium term, and this effect is probably characterized by many exceptions.”38 What this passage reveals is Gerring’s naturalist tendency to treat political reality as reified, essentialized, and atomistic units. Gerring’s defense and articulation of case-study analysis is certainly not without merit but needs to be disentangled from these naturalist tendencies. By contrast, interpretive social scientists analyze cases not for the sake of setting up mechanistic and ahistorical causal bonds, but for the construction of contingent narratives, family resemblances, and patterns of meaning. So the case study, far from being inherently interpretive or historical, is (like the other methods of data analysis) subsumable under either one of the competing philosophical paradigms.

That methods can be bent to either naturalist or anti-naturalist ends is something that a growing number of methodologists have begun to recognize. For instance, the latest methodological work on “grounded theory” or coding notes that this tool can be put to either more interpretively sensitive or more scientistic uses.39 Grounded theory is a method of data analysis that draws on in-depth interviewing in order to formulate abstract categories. The building of these abstract categories is called “coding” and “involves constructing short labels that describe, dissect, and distill” meanings from the interviewing process with the goal of sorting and synthesizing large troves of beliefs and actions.40 Researchers can use the coding system they have devised to think more (p.102) broadly about the tacit meanings and beliefs of a group of people; coding also allows researchers to creatively conceptualize and draw comparisons.

Of course, all this can be taken in a very naturalist direction if meanings are treated as self-evident, brute bits of empirical reality in little need of context or interpretation. Indeed, under the philosophical sway of naturalism there is a serious danger that the coding categories or labels will become completely exogenous to the social world supposedly being studied. When this happens researchers will be more likely to ignore the holistic, contingent, and historical dimensions of meanings. Instead, under the spell of naturalism, researchers might start treating coded terms like reified, essentialist, or atomistic objects ready to be plugged into mechanistic explanations. Researchers can correct this mistake by always remaining aware of the way their coded categories should be derived from the contingent beliefs and meanings of the interview subjects. Anti-naturalism implies that coded concepts should be the fruit of a dialogue between the interviewer and interviewee—as such, the concepts should have a bottom-up sensitivity to agent language and self-understandings. Coded concepts that fail to capture the contingent meanings and beliefs of the subjects in question are philosophically defective. For this reason, the most recent handbooks on grounded theory have rightly stressed the need for an “iterative” process in which the coded concepts are continually refined, modified, scrapped, reformulated, and changed in light of ongoing discussions with the subjects of study.41 Provided this is the case, there is nothing philosophically barring interpretive social scientists from making careful use of the abstraction, labeling, and conceptualization involved in coding meanings.
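A minimal sketch of coding can make the iterative process concrete. The interview excerpts, the keywords, and the labels below are all invented for illustration; none of this is drawn from Smith and Denton or from any grounded-theory handbook.

```python
from collections import Counter

# Hypothetical interview excerpts and a first-pass codebook mapping
# provisional labels to keywords drawn from subjects' own language.
excerpts = [
    "Church mostly helps me feel good about myself.",
    "God wants people to be nice to each other.",
    "I pray when I need help feeling better.",
]

codebook = {
    "self-esteem": ["feel good", "feeling better"],
    "moralism": ["be nice", "nice to each other"],
}

def code_excerpts(excerpts, codebook):
    """Attach each codebook label to the excerpts that contain its keywords."""
    coded = {}
    for excerpt in excerpts:
        labels = [label for label, keywords in codebook.items()
                  if any(kw in excerpt.lower() for kw in keywords)]
        coded[excerpt] = labels
    return coded

coded = code_excerpts(excerpts, codebook)
tally = Counter(label for labels in coded.values() for label in labels)
# The "iterative" step: any label that fails to capture what interviewees
# actually mean is revised or scrapped, and the excerpts are re-coded.
```

The crucial anti-naturalist point is the revision step: labels such as “self-esteem” are provisional constructions, answerable to the dialogue with interviewees rather than imposed exogenously on it.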

A final method that combines elements of both data collection and data analysis is Q methodology. Q methodology has mostly been developed and employed by psychologists in order to study configurations of opinions and beliefs within a group. Carrying out this method involves several steps, beginning with selecting a sample collection of items (these can be pictures, objects, or statements) for participants in the study to then rank along a continuum. The continuum can be devised along any number of scales like “most agree to most disagree, most characteristic to most uncharacteristic, most attractive to most unattractive.”42 The participants then assign each of the items selected by the researchers a number along the scale (for example, +5 or “I strongly agree” with this statement). The significance of statements on a given theme (like love or crime or moods) can in this way be organized and studied by researchers across a group of participants. A key feature of Q methods is the exit interview in which “open-ended comments” are requested to discover “how the participant has interpreted the items.”43 The results of the various rankings can then be mathematically sorted, ordered, and analyzed into various patterns, which researchers then subject to interpretation.

In light of the discussion of other methods, it should be clear that Q methodology can be pulled in either interpretive or naturalist directions. (p.103) The risk with Q methodology is that it will be fitted to naturalist aims that neglect meanings, intentions, and historical contingency. In current discussions of Q methodology this often happens when researchers begin to treat aggregate configurations or rankings among a group like a reified “gestalt” viewpoint floating above any one participant. For instance, two prominent methodologists slide in this naturalist direction when they write that the findings of Q methodology are “designed to communicate a ‘shared’ viewpoint, and hence…they need not provide a veridical representation of a participant’s own opinion.”44

By contrast, interpretive philosophy shows that meanings are always the product of the contingent reasons and intentions of particular agents. Reification happens when features of social reality are stripped and disconnected from agent intentions, purposes, reasons, and beliefs. Researchers should never be mystified into treating meanings as objects ascribable to a group “gestalt” that hovers above any one participant. For this involves a basic naturalist philosophical confusion over the nature of meanings. Fortunately, the same methodological writers provide the antidote to bring Q methods in a more interpretively sound direction. This can be done by keeping the rankings and configurations of Q methods embedded within the reasons and beliefs that participants gave in their interviews. This can be used to “fill out” the various competing meanings and reasons that have led to certain rankings of items.45 The formal configurations must not be allowed to turn into formal, ahistorical features of social reality that have no connection to the thought-world, beliefs, meanings, and practices of the participants in the study.

In sum, the various methods of data analysis offer highly valuable research tools for working social scientists. A social science without the tools of random sampling, statistical inference, and case studies would be deeply impaired. Moreover, there is no reason why interpretive social scientists cannot make careful and selective use of Q methodology if it helps them answer a particular research question or explore the meanings and beliefs of those they are studying. However, when using these tools social scientists must also be philosophically cautious and resist the temptation to atomize, reify, or mechanize political reality. The task of the social scientist is not to hunt down correlations that might yield the holy grail of ahistorical causal bonds, but rather to historicize findings by placing them within the scope of a particular narrative and world of meaning.

Heuristics: formal modeling and the case of rational choice theory

So far we have argued that social scientists wishing to follow the interpretive turn may make use of methods of data collection that generate information (p.104) about a way of life (ethnography, interviews, mass surveys) and data analysis that finds patterns in that information (random sampling and statistical inference). Yet a third and final category of methods deserves careful attention: “heuristics.” A heuristic is a potentially fruitful way to reach a valid conclusion about social reality. This is distinct from an explanation or description of social reality itself. Explanations, as we saw in prior chapters, must consider the actual reasons and beliefs that led particular individuals or groups to specific actions or meanings. Explanations, therefore, have a narrative form and engage the social world. Data analysis and collection, meanwhile, are foremost concerned with describing some feature of that social reality. By contrast, heuristics often deal in formal or ideal models that initially put aside the question of how well they actually describe or explain social reality. A heuristic is a way to think about or consider social reality that may potentially lead to further insights. Anti-naturalist philosophy makes clear that, like the other methods we have studied so far, heuristics can be used by social scientists provided they remain cautious and vigilant when it comes to concept formation and explanation.

In the social sciences, heuristics most often take the shape of formal modeling and rational choice theory. Of all the heuristics used by social scientists today, perhaps none is more often thought to be the antithesis of an interpretive approach than rational choice theory. After all, what could be more contrary to interpretive philosophy than creating a model of rationality completely divorced from the actual beliefs of individuals in their life-worlds? And we already saw that scholars like Colin Hay have insightfully linked this kind of project to a deterministic view of social reality as fixed by certain option or incentive environments. Yet, like the other methods we have examined in this chapter, rational choice theory can be put to either naturalist or anti-naturalist uses. Anti-naturalist social scientists will be able to appreciate this once they view rational choice theory as a heuristic and not foremost an attempt to conduct data collection or analysis. Once social scientists realize that rational choice theory is a heuristic, they are free to playfully make use of this method within certain limited contexts and as suits their purposes—again, they must steer clear of naturalist snags. This series of points will become clearer by taking a closer look at the specifics of rational choice.

Rational choice theory does not begin by looking at the actual social world but instead by building an ideal conception of human rationality. Like randomization and statistical inference, rational choice is a way of organizing and reasoning about certain features of social reality. But unlike data analysis, rational choice does not begin by paying very much attention to the actual features of social reality. Instead, rational choice proposes an ideal theory of decision-making and strategic game scenarios. In the case of rational choice, this is achieved by formulating axioms about how a certain kind of rational agent makes decisions—a process known as “axiomatization.”46 Two key (p.105) axioms of rational choice modeling are the assumptions that individual preferences are complete and transitive. Completeness assumes that a rational actor will always be able to compare and rank preferences (though ties and indifference are both allowed). What is not allowed by the completeness axiom is that a rational actor will be unable to compare and rank two preferences. In addition, the transitivity axiom assumes that a rational actor can transfer the preference of one object over another to other objects. So, a rational actor who prefers x to y and y to z must also prefer x to z.

Completeness and transitivity are two of the most important (though by no means the only) axioms of rational choice theory. The idealized picture of decision-making that rational choice generates is the basis for game theory and social choice theory, and has also been central to the development of neoclassical economics.47 Rational choice constructs a thin or minimalist view of human rationality—as is widely acknowledged today, this ideal of rationality is at wide variance with the actual empirical workings of human psychology (actual human beings do not uniformly arrive at their beliefs in compliance with this idealized pattern).48 However, in building an ideal model, rational choice has nonetheless proven a powerful tool for modeling how idealized strategic and rational decision-making scenarios might play out, casting light on situations as diverse as economic exchange, geopolitical strategy, voter behavior, and other game-like interactions. Indeed, as Hay (who has been particularly helpful on the uses and abuses of this method) has noted, rational choice theorists have made significant contributions to social science and public policy debates. For instance, using these theories to model social reality, they have drawn “attention to the often perverse and collectively irrational effects of individually rational action” in cases like the so-called free-rider problem or the “tragedy of the commons” in which the devastation of some shared good is motivated by short-term individual gains.49 If enough people really follow a “self-serving, utility maximizing behavior,” then rational choice models can show how this “translates into collectively irrational outcomes.”50
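The “tragedy of the commons” logic that Hay highlights can be sketched with a toy payoff model. The payoff numbers are invented; all that matters is that the private gain from adding an animal exceeds any one herder’s share of the cost, while the total cost exceeds the total gain.

```python
# Toy commons: each of 10 herders may add one extra animal to shared land.
N_HERDERS = 10
PRIVATE_GAIN = 1.0   # benefit to the herder who adds an animal
SHARED_COST = 0.3    # cost each extra animal imposes on every herder

def payoff(my_extra, total_extra):
    """One herder's payoff, given their own choice and the group total."""
    return PRIVATE_GAIN * my_extra - SHARED_COST * total_extra

# For any number k of other defectors, adding an animal is individually
# rational: payoff(1, k + 1) - payoff(0, k) = 1.0 - 0.3 = 0.7 > 0.
# Yet if every herder follows that logic, each ends up worse off than
# if no one had defected.
all_defect = payoff(1, N_HERDERS)   # every herder adds an animal
all_cooperate = payoff(0, 0)        # no one does
```

Individually rational maximizing leaves each herder with a negative payoff, while universal restraint leaves each at zero: the “collectively irrational outcome” the passage describes.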

What does anti-naturalist philosophy make of such formal models built before researchers have even had a chance to consider the actual self-understandings and webs of meaning of particular individuals out in the social world? First, anti-naturalism stresses the way in which rational choice theory’s axiomatization and idealization of social reality is effective only within a very limited domain.51 The biggest naturalist pitfall for rational choice theory is to mistakenly assume that it offers a universal, historically transcendent account of the human subject. Indeed, many rational choice theorists have themselves begun to affirm that this kind of formal model can in no way be taken as a universal theory of human agency. The best way to see this, and avoid naturalist pitfalls, is by understanding some of the limits of axiomatization—many of which have been mapped out by rational choice theorists and their (p.106) critics. We will focus on only a couple of examples of such limits in order to impress upon readers the importance of treating rational choice as a heuristic and not as a universal theory explaining human action. We will then turn to possible anti-naturalist uses of rational choice as a heuristic.

The completeness axiom assumes that any two objects can be compared or else are simply objects of indifference. Pairs of objects, in other words, no matter how different, are comparable and susceptible to ranking. But this excludes from the outset all goods that are incommensurable or unable to be compared to one another. Critics of rational choice have attempted to establish the existence of incommensurable goods in various ways. One influential way is the “small improvements” argument.52 The small improvements argument asks us to imagine a person who is unable to decide between two options—say, seriously endangering the life of a loved one versus saving his small bankrupt country one trillion dollars. The completeness axiom holds that all goods are comparable, granted there may be ties. If the two options were merely tied, then a small improvement to one of them should break the tie: were one million dollars added to the one trillion, the individual, so long as he or she is a rational decision-maker under this definition, should at that point prefer the money to keeping the loved one safe. Yet such incremental changes often fail to break the stalemate, which shows that the goods in question are neither strictly ranked nor tied, and hence not in fact comparable. And yet neither can it be said that the individual was indifferent to the economic fate of his country or the wellbeing of this person.53 The completeness axiom is therefore not a psychologically valid description of human attachment to certain goods, which individuals can resist subordinating to a calculative rationality. Indeed, for many individuals, subordinating such goods to calculation is itself potentially corruptive or demeaning of those goods.54

This is a larger problem than it may at first appear because human life abounds in incommensurable goods. For example, can we compare one person we love dearly to another? What of competing political and ethical goods like security and freedom, justice and mercy? Utilitarian philosophers often argue that all goods are subject to completeness and ordering. What is certain is that society is not made up exclusively of utilitarian philosophers. Individuals daily deal with what they perceive as incommensurable goods that comprise their distinctive ethical outlooks. Moreover, what counts as an incommensurable good will depend on the self-understandings of the individual or group. Social scientists cannot legislate this beforehand. In one culture newborns, totems, and the land will be incomparable goods; in another a particular species of animal, a tabernacle, a piece of bread. From a sociological perspective, completeness of preference is simply not always an accurate or even useful way to look at human decision-making.55 Thus, the basic axioms of rational choice should not be taken as universal descriptions or explanations of human action. The idealization of one kind of formal rational structure is not a successful or universal human anthropology. In fact, in a (p.107) very different way, it veers back into the problematic universal, ahistorical, and autonomous subject critically examined earlier in the debates between old-school phenomenologists and Foucault. Foucault’s radically historicist critique applies here as well.

This dilemma with completeness and an ahistorical subject is closely related to another naturalist problem with rational choice. Namely, by its very nature, rational choice begins by bracketing the actual psychological beliefs and motives that explain why individuals act the way they do. Instead, rational choice simply posits an order of preferences. The assumption is that an individual is involved in some form of preference maximizing. However, research has shown that in many scenarios individuals do not respond rationally to risk, but tend to inflate small probabilities “as if they were larger than they are known to be.”56 The descriptive account of this failure of individuals to carry out the ideal of a rational assessment of probable risks is known as “prospect theory.” Once again, there is a gap between the formal model of rational choice and the thick psychological reality of human belief formation. The idea that rational choice is a universal or imperial theory of human behavior is untenable because actual human beliefs and actions are frequently incompatible with its basic assumptions. The bracketing of human beliefs and self-interpretations thus comes at a significant cost to rational choice theory. It cannot be adopted as explanatory or descriptive of the human social and political world.
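The overweighting of small probabilities that prospect theory describes is standardly modeled with a probability weighting function. The functional form and the parameter value below follow Tversky and Kahneman’s commonly cited estimates and are offered only as an illustration of the gap between objective and perceived probability, not as part of the chapter’s own argument.

```python
def weight(p, gamma=0.61):
    """Probability weighting: small probabilities are inflated and
    large ones deflated, relative to their objective values."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

# A 1 percent objective risk is treated as if it were roughly five
# times larger, while a 95 percent probability is underweighted.
w_small = weight(0.01)   # about 0.055
w_large = weight(0.95)   # about 0.79
```

The gap between p and weight(p) is precisely the gap the passage notes between the formal model of rational choice and the thick psychological reality of belief formation.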

Because rational choice theory begins by intentionally bracketing the beliefs of individuals in order to construct an ideal of rationality, it is highly limited in its legitimate application.57 Rational choice is a heuristic that can, within some contexts, help shed light on the analysis of social dynamics. But social scientists must always consider whether rational choice is the proper tool based on an assessment of the social actors in question. Are the preferences involved complete and transitive? Is the actor, through repeated practice, education, or other forms of socialization, adept at strategic reasoning and calculation? As with mass surveys, proper use of rational choice requires a grasp of the ethnographic background of the actors involved. This is necessary in order to determine whether the individuals involved are in fact interpreting themselves as the sorts of strategic actors posited by rational choice axioms. In cases where individuals are dealing with incommensurable goods (for instance, as is often the case in politics and many domains of psychology), then rational choice will most likely not be as helpful a heuristic. Indeed, in such cases rational choice might be downright harmful, generating a completely ahistorical subjectivity that actually occludes what is happening in the social world that social scientists wish to understand and explain.

The above line of reasoning makes clear why the domain in which rational choice has met with the greatest success is contemporary economics. This is because economic actors often aspire to approximate the type of decision-maker offered by neoclassicism and rational choice. The practices of modern (p.108) consumer capitalism have habituated individuals to the treatment of goods as complete and transitive. Likewise, the discourses of rational choice have seeped into the self-understandings of many in market societies who learned these discourses in economics and business schools. By contrast, rational choice and modern economics have been much less successful in domains that deal with incommensurable goods and non-strategic, non-calculative self-understandings and practices. But economics is an autonomous domain only in academia. The actual economic world is always embedded in values, practices, and institutions that extend beyond the market and its calculative practices.58 Human social, economic, and political reality is permeated by incommensurable goods as well as non-strategic forms of reasoning that can make the assumed axioms a stumbling block to effective social research. This perhaps goes a long way toward explaining the intense combination of successes and disappointments that is the modern discipline of economics.

The anti-naturalist upshot of all of this is clear: rational choice is an effective but also highly limited tool. Social scientists should employ rational choice when there is some approximate fit between the actual self-understandings of the agents involved and the idealized model. They must never mistake what is a heuristic for actual explanation or description. The thin or minimalist sociology generated by rational choice must be evaluated in light of thick understandings and descriptions. And even in those limited cases where rational choice proves useful, the model remains an abstraction and idealization that needs serious re-embedding in the fabric of agent self-understandings. This process can once again be aided by multi-methods and the compatibility of the full range of techniques. Rational choice should thus be used in tandem with other methods such as ethnography and interviewing. Far from being rivals, these qualitative and quantitative methods may often complement one another.

In addition to this domain-limited use of rational choice theory, Hay has drawn attention to the ways in which rational choice can be employed to illuminate hypothetical, what-if scenarios. The purpose of the rational choice method in this case would be "hypothetical thought experiments" that ask the question "what if the world were like this?"59 If social scientists are careful not to mistake rational choice theories for naturalistic explanations, then these models might "provide timely and powerful warnings about the likely consequences of existing political trajectories."60 Vicious cycles and perverse incentives that encourage the squandering of shared goods in the environment (like water and climate) might be clarified using rational choice models. Similarly, the consequences of neoliberalizing or marketizing goods like public education or welfare might be explored in this way—not as "predictive hypotheses" but as "precautionary political warnings."61 These are what Hay dubs "as-if" uses of naturalist assumptions and theories, and we will return to them at length in our treatment of public policy. For now the point is that Hay adds another important sense in which rational choice may be employed as a heuristic method, divorced from naturalist philosophical assumptions.
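A minimal sketch can convey what such an "as-if" thought experiment looks like in practice. The toy model below is a hypothetical illustration of ours, not drawn from Hay or any actual policy study: it asks what would happen to a shared resource if every agent behaved as the strategic extractor posited by the model, and all of its parameters are invented for the purpose.

```python
def simulate_commons(stock=100.0, agents=5, rounds=10,
                     extraction_share=0.2, regrowth=1.05):
    """Toy 'what-if' model: each agent extracts a fixed share of the
    remaining stock every round, after which the commons regrows slightly."""
    history = [stock]
    for _ in range(rounds):
        for _ in range(agents):
            stock -= stock * extraction_share  # individually 'rational' extraction
        stock *= regrowth                      # natural regrowth
        history.append(stock)
    return history

# If everyone behaved as the model's strategic actor, the shared stock
# would collapse within a few rounds.
run = simulate_commons()
print(f"initial stock: {run[0]:.1f}, after 10 rounds: {run[-1]:.2f}")
```

Read as a precautionary warning rather than a prediction, the collapse of the stock dramatizes the perverse incentives and vicious cycles the text describes, without any pretense that real actors in fact reason this way.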

Unfortunately, at present very few practitioners of rational choice heed anti-naturalist and interpretive insights. Recent studies show that the bulk of rational choice research today woefully neglects interpretive evidence and instead treats its models as quasi-universal explanations of social reality.62 Interpretive social scientists using rational choice and game theory must correct this tendency by always asking themselves questions like: Are the real-life participants playing this game and strategizing in a way that approximates the structure of the idealized axioms or not? Will simplifying through formal modeling produce useful insight into a given social reality or rather occlude the actual social dynamics? What actual reasons do the people involved have for specific beliefs and actions? What are the limits of modeling a strategic or market scenario in this way? What forms of ethical reasoning, irrationality, or non-calculative thinking become invisible when this particular piece of social reality is modeled in this way? Does employing rational choice in this way provide a useful thought experiment or political warning? What dangerous biases might the as-if uses of rational choice and the focus on strategic rationality and game scenarios create? Failure to weigh such considerations will perpetuate what several critics of rational choice have diagnosed as a "flight from reality"—the neglect of social and political reality in favor of highly precise models that bear little relationship to political life.63 Such uses of rational choice are more explicable in terms of the insularity of much of modern scholarship than as a genuine effort to respond to social reality.64

But advocates of the interpretive turn and qualitative methods should also take note. Social scientists should not squeamishly cut themselves off from the uses of rational choice analysis. Such models can generate, and have generated, insights into the ramifications of strategic and calculative reasoning within human practices, particularly in market scenarios where individuals are highly habituated to thinking in this way. The successes of modern economics as a discipline are often tied to the strength of rational choice as a heuristic. Once again, anti-naturalism is uniquely placed to identify both the strengths and the weaknesses of this tool. Anti-naturalism rejects the overblown antagonism between rational choice scholars and their opponents. Instead, anti-naturalism is able to absorb and integrate rational choice into a vast array of social science methods. This is accomplished, moreover, without ever giving way to the naturalist myth that rational choice is anything like an adequate philosophical anthropology. Indeed, rational choice must never be mistaken for an actual anthropology or even a very good account of how humans across history form their beliefs (a mistake made all too frequently by neoclassical economists).65 But even economists need anti-naturalist concepts like traditions, beliefs, practices, and the social background. These will attune them to whether a particular actor or set of actors within a context might be more given to rational choice strategic thinking because they themselves were instructed in this heuristic or learned approximations of it through long iterative decisions in the marketplace.

In sum, anti-naturalism gives back to social scientists the freedom to consider and judge within context which method is best for their particular research goals. Social scientists need not sit behind artificial methodological walls, defensively committing to either quantitative or qualitative methods. Instead, they can make use of the full range of social science tools. They need only learn to use these tools toward anti-naturalist rather than naturalist ends. In other words, social scientists must stop neglecting philosophy.

We have seen that data collection, data analysis, and heuristics can all be reconciled. This means tools as diverse as participant observation, in-depth interviews, mass surveys, random sampling, statistical inference, case studies, Q methodology, grounded theory, and rational choice modeling can all be either used or abused. Anti-naturalism is uniquely placed to create a vast synthesis of the data collection, data analysis, and heuristics that social scientists have developed over the last two centuries.

Of course, within the limits of a single research project, mixing methods (like advanced econometric analysis and ethnography) may be difficult, given the practical unlikelihood of finding all the requisite knowledge in a single researcher. Indeed, the disciplinary demands of graduate school and modern scholarship make it difficult for any one person to master and apply highly diverse methods. Moreover, this intensely specialized training often serves to narrow rather than broaden the kind of research that contemporary scholars conduct. One way to resolve this is to begin to enact a far more cooperative form of social science—one in which anti-naturalist scholars do not expect to find all the requisite methodological expertise embodied in a single researcher or discipline.66

The anti-naturalist case for multi-methods does in this regard seem to imply a shift toward much greater levels of cooperation across research communities than has thus far been the case. Unfortunately, scholarly communities clustered tribalistically around method-expertise and technical wizardry are currently the norm. Each tribe claims for itself the one true path to social science via a particular method or methods. By contrast, future anti-naturalist ethnographers and statisticians might collaborate in the construction of a narrative social science. But even in the absence of this more cooperative future, current anti-naturalists working in intellectual isolation are free to learn from the research of both hard-nosed quants and linguistically adept qualies. Anti-naturalist social scientists can employ philosophy to help them sift through the mountains of empirical findings generated by current researchers. What is distorted by naturalism can be carefully separated out from what is valid and admirable but still in need of interpretive and historical contextualization. The repurposing of existing findings into interpretive and narrative forms of explanation is arguably an enormous area of untapped research potential. An entire generation of social scientists could devote itself to taking the many bricks and isolated pieces of information generated by the naturalist focus on data and building them into narrative, sociological edifices.


(1.) The authors call this “soaking and poking.” Gary King, Robert O. Keohane, and Sidney Verba, Designing Social Inquiry: Scientific Inference in Qualitative Research (Princeton, NJ: Princeton University Press, 1994) 38–9. For another widely used textbook that makes a similar assumption, see: John W. Creswell, Research Design: Qualitative, Quantitative, and Mixed Methods Approaches, 3rd ed. (Los Angeles, CA: SAGE Publications, 2009) 8.

(2.) This is not necessarily an error made among ethnographers themselves: Edward Schatz, ed., Political Ethnography: What Immersion Contributes to the Study of Politics (Chicago, IL: University of Chicago Press, 2009).

(3.) For example: Henry Brady and David Collier, Rethinking Social Inquiry: Diverse Tools, Shared Standards, 2nd ed. (Lanham, MD: Rowman & Littlefield, 2010) 86; Peregrine Schwartz-Shea and Dvora Yanow, Interpretive Research Design: Concepts and Processes (New York: Routledge, 2012).

(4.) A similar point about a pluralist approach to methods has been echoed in the action research literature. For example: Davydd James Greenwood and Morten Levin, Introduction to Action Research, 2nd ed. (Thousand Oaks, CA: SAGE Publications, 2007).

(5.) For a usage similar to our own see: David Marsh and Gerry Stoker, eds., Theory and Methods in Political Science, 3rd ed. (New York: Palgrave Macmillan, 2010) 3.

(6.) The famous phrase is from Clifford Geertz, “Thick Description: Toward an Interpretive Theory of Culture,” in The Interpretation of Cultures (New York: Basic Books, 1973) 6. See also: John Van Maanen, “Ethnography as Work,” Journal of Management Studies 48:1 (2011): 219–20.

(7.) Schwartz-Shea and Yanow, Interpretive Research Design, 63–5.

(8.) Joe Soss, “Talking Our Way to Meaningful Explanations: A Practice-Centered View of Interviewing for Interpretive Research,” in Interpretation and Method: Empirical Research Methods and the Interpretive Turn, eds. Dvora Yanow and Peregrine Schwartz-Shea (Armonk, NY: M.E. Sharpe, 2006) 135.

(9.) Kenneth MacLeish, Making War at Fort Hood: Life and Uncertainty in a Military Community (Princeton, NJ: Princeton University Press, 2013).

(10.) Alice Goffman, On the Run: Fugitive Life in an American City (Chicago, IL: University of Chicago Press, 2014). For a recent overview of urban ethnography see: Mitchell Duneier, Philip Kasinitz, and Alexandra Murphy, eds., The Urban Ethnography Reader (Oxford: Oxford University Press, 2014).

(11.) Goffman, On the Run, 203.

(12.) Arlie Russell Hochschild, Strangers in Their Own Land: Anger and Mourning on the American Right (New York: The New Press, 2016); Matthew Desmond, Evicted: Poverty and Profit in the American City (New York: Penguin Random House, 2016).

(13.) Schwartz-Shea and Yanow, Interpretive Research Design, 38.

(14.) Ellen Pader, “Seeing With an Ethnographic Sensibility,” in Interpretation and Method, eds. Yanow and Schwartz-Shea, 167.

(15.) Frederic C. Schaffer, Democracy in Translation: Understanding Politics in an Unfamiliar Culture (Ithaca, NY: Cornell University Press, 1998) xi.

(16.) James Clifford, “On Ethnographic Authority,” Representations 2 (1983): 124.

(17.) Dvora Yanow, Sierk Ybema, and Merlijn van Hulst, “Practicing Organizational Ethnography,” in The Practice of Qualitative Organizational Research: Core Methods and Current Challenges, eds. Catherine Cassel and Gillian Symon (London: Sage, 2012) 331–50. See also: James Clifford, “On Ethnographic Authority,” 125.

(18.) Clifford, “On Ethnographic Authority,” 119.

(19.) Van Maanen, “Ethnography as Work,” 225.

(20.) Schwartz-Shea and Yanow, Interpretive Research Design, 65–6.

(21.) Clifford, “On Ethnographic Authority,” 139–40.

(22.) See, for example, our extended discussion below of: Christian Smith and Melinda Lundquist Denton, Soul Searching: The Religious and Spiritual Lives of American Teenagers (Oxford: Oxford University Press, 2005) 67.

(23.) For further discussion see: Earl Babbie, Survey Research Methods, 2nd ed. (Belmont, CA: Wadsworth Publishing Company, 1998) 56–9; John W. Creswell, Research Design: Qualitative, Quantitative, and Mixed Methods Approaches, 3rd ed. (Thousand Oaks, CA: SAGE, 2009) 146.

(24.) Babbie, Survey Research Methods, 52. Another example of this naturalist error can be seen in: Arlene Fink, The Survey Handbook, 2nd ed. (Thousand Oaks, CA: SAGE, 2003) 55–60.

(25.) Even survey researchers who promote naturalist modes of explanation admit the efficacy of such supplementary study: Babbie, Survey Research Methods, 58.

(26.) Tom Wengraf, Qualitative Research Interviewing: Biographic Narrative and Semi-Structured Methods (London: SAGE, 2001) 60.

(27.) See: Simon Watts and Paul Stenner, “Doing Q-Methodology: Theory, Method, and Interpretation,” Qualitative Research in Psychology 2 (2005): 67–91.

(28.) Paul M. Kellstedt and Guy D. Whitten, The Fundamentals of Political Science Research (Cambridge: Cambridge University Press, 2009) 121.

(29.) For a more detailed discussion, see: Kellstedt and Whitten, The Fundamentals of Political Science Research, 126–7.

(30.) Lesley Andres, Designing and Doing Survey Research (Thousand Oaks, CA: SAGE Publications, 2012) 9.

(31.) Smith and Denton, Soul Searching, 292–3.

(32.) Smith and Denton, Soul Searching, 68.

(33.) Smith and Denton, Soul Searching, 163.

(34.) Smith and Denton, Soul Searching, 67.

(35.) Smith and Denton, Soul Searching, 263.

(36.) John Gerring, “The Case Study: What It Is and What It Does,” in The Oxford Handbook of Political Science, ed. Robert E. Goodin (Oxford: Oxford University Press, 2011) 1137–8.

(37.) Gerring, “The Case Study,” 1152–3.

(38.) Gerring, "The Case Study," 1153.

(39.) Kathy Charmaz and Linda Liska Belgrave, “Qualitative Interviewing and Grounded Theory Analysis,” in The SAGE Handbook of Interview Research: The Complexity of the Craft, 2nd ed., eds. Jaber Gubrium, James Holstein, Amir Marvasti, and Karyn McKinney (Thousand Oaks, CA: SAGE, 2012) 349.

(40.) Charmaz and Belgrave, “Qualitative Interviewing and Grounded Theory Analysis,” 356.

(41.) Charmaz and Belgrave, “Qualitative Interviewing and Grounded Theory Analysis,” 348.

(42.) Watts and Stenner, “Doing Q-Methodology,” 77.

(43.) Watts and Stenner, “Doing Q-Methodology,” 78.

(44.) Watts and Stenner, “Doing Q-Methodology,” 85.

(45.) Watts and Stenner, “Doing Q-Methodology,” 76.

(46.) Itzhak Gilboa, Rational Choice (Cambridge, MA: MIT Press, 2010) 39–40.

(47.) See: Julian Reiss, Philosophy of Economics (New York: Routledge, 2013) 6; Daniel M. Hausman, “Philosophy of Economics,” in Routledge Encyclopedia of Philosophy, vol. 3, ed. Edward Craig (London: Routledge, 1998) 211–22.

(48.) This has been a key finding of behavioral economics. For famous early pieces disputing the transitivity of human preferences on psychological grounds, see: Amos Tversky and Daniel Kahneman, “The Framing of Decisions and the Psychology of Choice,” Science 211 (1981): 453–8; Kenneth O. May, “Intransitivity, Utility, and the Aggregation of Preference Patterns,” Econometrica 22:1 (1954): 1–13; Amos Tversky, “Intransitivity of Preferences,” Psychological Review 76:1 (1969): 31–48.

(49.) Colin Hay, Political Analysis: A Critical Introduction (New York: Palgrave Macmillan, 2002) 9.

(50.) Colin Hay, “Theory, Stylized Heuristic, or Self-Fulfilling Prophecy? The Status of Rational Choice Theory in Public Administration,” Public Administration 82:1 (2004): 42.

(51.) See Donald Green and Ian Shapiro’s important critique of rational choice’s pretensions toward “universalism.” Green and Shapiro, Pathologies of Rational Choice Theory (New Haven, CT: Yale University Press, 1994) 54.

(52.) Martin Peterson, An Introduction to Decision Theory (Cambridge: Cambridge University Press, 2009) 170.

(53.) Some rational choice theorists have responded to the problem of incommensurable goods with the theory of revealed preferences, which simply holds that whatever decision individuals in fact make reveals their comparison. Yet the theory of revealed preference, which is by no means held by all rational choice theorists, runs completely afoul of an interpretive conception of actions as expressive of beliefs. Social scientists must be sensitive to the meaning of actions by interpreting them in light of the beliefs of the actors involved. To unilaterally impose a meaning on actions from the outside is a clear form of naturalist distortion. Decision theorists have also argued that the "revealed preference dogma" is negated by the existence of "probabilistic preferences," or preferences that are held only a certain percentage of the time. In the case of probabilistic preferences, one does not always prefer A to B or B to A. Peterson, An Introduction to Decision Theory, 292.

(54.) For one extended account of goods not susceptible to rational calculation, see: Charles Taylor, Sources of the Self: The Making of the Modern Identity (Cambridge: Cambridge University Press, 1989) Part I.

(55.) Rational choice theorists have similarly found that individuals do not always hold transitive preferences but in some contexts hold cyclical preferences. Peterson, An Introduction to Decision Theory, 290.

(56.) Gilboa, Rational Choice, 43.

(57.) Gilboa, Rational Choice, 22–3.

(58.) For an extended historical account of markets and market rationality as culturally embedded, see: Karl Polanyi, The Great Transformation: The Political and Economic Origins of Our Time (Boston, MA: Beacon Press, 2001).

(59.) Hay, “Theory, Stylized Heuristic, or Self-Fulfilling Prophecy?”, 55.

(60.) Hay, “Theory, Stylized Heuristic, or Self-Fulfilling Prophecy?”, 56.

(61.) Hay, “Theory, Stylized Heuristic, or Self-Fulfilling Prophecy?”, 57.

(62.) Iain Hampsher-Monk and Andrew Hindmoor, "Rational Choice and Interpretive Evidence: Caught Between a Rock and a Hard Place?", Political Studies 58 (2010): 49.

(63.) Ian Shapiro, The Flight From Reality in the Human Sciences (Princeton, NJ: Princeton University Press, 2005).

(64.) Because of our concern with rational choice as a social science method, we have intentionally avoided the related debate over whether rational choice ought to be normative for decision-making.

(65.) For a famous articulation of this erroneous view see Nobel laureate, Gary S. Becker, The Economic Approach to Human Behavior (Chicago, IL: University of Chicago Press, 1976).

(66.) These issues have been extensively and provocatively discussed in action research literature: Morten Levin and Davydd Greenwood, “Revitalizing Universities by Reinventing the Social Sciences: Bildung and Action Research,” in The SAGE Handbook of Qualitative Research, eds. Norman Denzin and Yvonna Lincoln (Thousand Oaks, CA: SAGE Publications, 2011) 27–42.