Fetishism and The “Variables Paradigm”

Are all regression coefficients riddled with metaphysical subtleties and theological niceties?

Categories: Sociology, Theory
Author: andrés castro araújo
Published: January 4, 2024

Karl Marx’s criticism of commodity fetishism is very similar to Andrew Abbott’s criticism of the so-called “variables-based approach” in sociology (i.e., most standard quantitative research).

The former holds that treating “value” as a property of commodities (and not of people doing things) is a capitalist illusion. The latter holds that treating “causality” as a property of variables (and not of people doing things) is a positivist illusion.

Both criticisms are usually blown out of proportion, mostly because it is rare that anyone is actually in the grip of such “illusions” in the first place.

Before getting into detail, let me take each criticism in turn.

Commodity Fetishism

A commodity appears at first sight an extremely obvious, trivial thing. But its analysis brings out that it is a very strange thing, abounding in metaphysical subtleties and theological niceties.

— Karl Marx, Capital

The commodity fetishism argument is an invitation to look beyond the surface of appearances, an illusion according to which “the definite social relation between men themselves” assumes “the fantastic form of a relation between things.”

The accusation is that normal people see value as an intrinsic property of commodities and fail to see it for what it is: a relational property.

This is the best two-paragraph explanation I’ve seen of what all this means.

There are two ways of ascribing properties to objects. Both have the same surface grammatical form: \(A\) is \(F\). The book is red, the man is tall, the woman is rich. They differ, however, at a deeper level. The height of a person is a quality that inheres in him quite independently of social context. Wealth, on the other hand, can only be predicated of a person who is inserted in a web of social relations. It makes sense to say that Robinson Crusoe on his island was tall, not that he was rich, even if perhaps he brought some gold coins along with him. To be rich means that other people are willing to exchange their goods or labor for your money. Being rich, unlike being tall, is a relational predicate.

[…]

Commodity fetishism is the belief that goods possess value just as they have weight, as an inherent property. To the unmystified mind, it is clear that a commodity has exchange value only because it stands in certain relations to human labor and human needs. In the bewitched world of commodity fetishism, however, goods appear to exchange at a certain rate because of their inherent values. Such, at any rate, was Marx’s argument. It is somewhat unconvincing, because it is hard to believe that anyone ever committed this particular fallacy.

Elster (1986, pp. 56–57, emphasis added)

I agree. It’s hard to believe that anyone has ever thought about the value of commodities in this way. But being more sophisticated than others makes us feel good.

Note. Other accusations of economic fetishism are more on point. For example, the idea that “money” has an intrinsic value (especially in the form of gold or silver) has been an amply documented illusion throughout history. A similar dynamic happens with “capital” when it’s considered as raw materials or instruments of labor; the “stuff” we identify as capital only becomes capital when it’s embedded in a process of accumulation (M-C-M′).

Thus, arguments about fetishism may hold true in certain domains (e.g., money, capital), but we should doubt the truth of the narrower commodity fetishism argument.

The “Variables Paradigm”

Some time ago Andrew Abbott (2001) mounted an attack on traditional quantitative researchers by accusing them of foolishly believing that “variables do things, not social actors.”

Here are the charges:

  • Quantitative researchers simplify too much.

  • They forget the importance of narrative.

  • They are plain silly.

    They don’t see individuals as motivated actors, but merely as the locale for the variables “doing their thing” (Abbott 2001, p. 132). For example, they actually believe that some abstraction called education acts on another abstraction called occupation.

    Our normal methods parse social reality into fixed entities with variable qualities. They attribute causality to the variables—hypostatized social characteristics—rather than to agents; variables do things, not social actors.

    Abbott (2001, p. 183)

  • They ignore that social activity is located in space and time, unlike the always rich and sophisticated Chicago School.

    For the idea of a variable is the idea of a scale that has the same causal meaning whatever its context: the idea, for example, that “education” can have “an effect” on “occupation” irrespective of the other qualities of an individual, whether those qualities be other past experiences, other personal characteristics, or friends, acquaintances, and connections. Within variable-based thinking, one allows for a few “interactions” to modify this single causal meaning contextually, but the fundamental image of variables’ independence is enshrined in the phrase “net of other variables” and in the aim to discover this net effect, whether through experimental or statistical manipulation. The Chicago view was that the concept of net effect was social scientific nonsense. Nothing that ever occurs in the social world occurs “net of other variables.” All social facts are located in contexts. So why bother to pretend that they are not?

    Abbott (1997, p. 1152)

    Except it’s not nonsense!

The problem with this discussion is that the target of criticism—the so-called “variables paradigm”—does not exist. It has never existed. It’s hard to believe that someone out there is really that dumb.

In a particularly egregious example of bad faith, Andrew Abbott reads a couple of articles from the American Journal of Sociology and finds that “narrative sentences usually have variables as subjects” (Abbott 2001, p. 140).1 Now imagine overhearing a conversation in which someone mentions the sunset and then concluding they’re a fool for believing the sun revolves around the earth. It is deeply obnoxious. We must be able to distinguish between convenient “figures of speech” and actual “beliefs.”

Note. After writing the above I came across Mustafa Emirbayer’s (1997) “Manifesto for a Relational Sociology.”

TL;DR: it’s not good; it’s mostly nonsense.

Among other things, he argues that it’s futile for quantitative researchers to adjust for confounders because all such attempts “ignore the ontological embeddedness or locatedness of entities within actual situational contexts” (Emirbayer 1997, p. 289).

Yikes!

Data and Phenomena

As mentioned in a previous blogpost, I find it useful to distinguish between data and phenomena:

  • Phenomena: recurrent features of the world.

  • Data: “public records produced by measurement and experiment that serve as evidence for the existence or features of phenomena” (Woodward 2011, p. 166).

Criticism of the so-called “variables paradigm” may be valid in the rare circumstances in which researchers forget this distinction, or in cases in which people conflate data mining with the scientific modeling of real-world phenomena.

Self-proclaimed critical thinkers might complain and insist that quantitative research is riddled with variable fetishism; that GLMs and DAGs merely depict relationships between variables, not between real-world phenomena; and that convenient figures of speech reveal some kind of weird metaphysics.

But, I assure you, no one out there really thinks that variables “cause” other variables in a social vacuum.

On Trained Incapacities

Note. I added this section after receiving some useful feedback from friends.

Here’s a more charitable view of Andrew Abbott’s full-blown attack on linear regression.

  • Traditional linear regression—particularly when used to model cross-sectional datasets—is ignorant of temporal context in ways that are counterproductive. The significance of some measured “variables” is severely distorted when they’re ripped from their temporal context, making some research questions impossible to answer.

    In particular, the order in which things happen determines the outcomes that most historical researchers are interested in explaining.

  • A similar thing happens when we want to answer questions for which spatial context is important. This is why there’s a whole subfield of statistics dedicated to modeling “spatial data.”

  • Paul Pierson has a great culinary analogy for how time “works” in linear regression for cross-sectional data:

    Imagine that your friend invites you to the trendiest new restaurant in town, charmingly named “The Modern Social Scientist.” As an added bonus, he informs you that he knows the chef well, and that you will have a chance to tour the kitchen. When you arrive, the chef explains that the kitchen is divided into two parts. On the left, she has all the ingredients (which, to your puzzlement, she refers to as “variables”). These ingredients, she insists, are the freshest available and carefully selected. On the right is an extraordinary profusion of measuring devices. You express astonishment at their complexity and detailed ornamentation, and the chef explains that each requires years to learn how to operate properly.

    The chef proceeds to elaborate her culinary approach: good cooking, she says, amounts to having the perfect ingredients, perfectly measured. Traditional cooks have stressed how important the cooking process itself is, including the sequence, pace, and specific manner in which the ingredients are to be combined. Not so, says the proprietor of The Modern Social Scientist. As long as you have the correct ingredients and they are properly measured, she insists, how, in what order, and for how long they are combined makes no difference.

    Pierson (2011, p. 1)

    But quantitative research does not have to be decontextualized in this way. There are many ways of adapting quantitative research to answer some of the criticisms levied by scholars like Pierson and Abbott (e.g., Wawro and Katznelson 2022).

    Actually, thinking about temporal assumptions in linear regression can be very helpful—e.g., did you know that fixed-effect models produce wrong estimates when the “causal lags in the real world do not match the lags found in panel data” (Vaisey and Miles 2017)?

    However, there is only one correct way of answering to someone calling you an unsophisticated fool with weird metaphysics: silence.

  • Specializing in quantitative methods produces a sort of “trained incapacity,” limiting the scope of questions that researchers can answer. People who specialize in qualitative methods are usually not experts in quantitative research; people who specialize in social network analysis are usually not experts in archival research; and so on.

    Note. Sociology is actually better than most other disciplines at producing researchers that defy these expectations (e.g., Mario Small, Emily Erikson, Peter Bearman).

    However, there’s no reason to think that becoming competent in one form of social research will make you less competent in another.

This last point is worth repeating. I worry that students will stumble upon criticism of the “variables paradigm” and conclude that learning about linear regression will actually make them worse sociologists. I had a similar reaction about a decade ago.
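Incidentally, the lag-mismatch point mentioned earlier is easy to see for yourself. The sketch below is not Vaisey and Miles’s own analysis; it is just a minimal numpy simulation with made-up parameters. The outcome depends on a regressor from two periods back, and a fixed-effects (within) estimator recovers the true effect only when the specified lag matches the real one.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T = 500, 10
beta = 1.0   # true effect of x on y, two periods later
rho = 0.5    # persistence of x within units

alpha = rng.normal(size=N)        # unit fixed effects
x = np.zeros((N, T))
x[:, 0] = rng.normal(size=N)
for t in range(1, T):             # AR(1) regressor within each unit
    x[:, t] = rho * x[:, t - 1] + rng.normal(size=N)

# outcome depends on x two periods back; drop periods without a valid lag
y = alpha[:, None] + beta * np.roll(x, 2, axis=1) + 0.5 * rng.normal(size=(N, T))
y = y[:, 2:]

def fe_slope(y, x):
    """Within (fixed-effects) estimator: demean by unit, then pooled OLS slope."""
    yd = y - y.mean(axis=1, keepdims=True)
    xd = x - x.mean(axis=1, keepdims=True)
    return (xd * yd).sum() / (xd ** 2).sum()

b_right = fe_slope(y, x[:, :-2])   # regressor lagged 2 periods (correct)
b_wrong = fe_slope(y, x[:, 1:-1])  # regressor lagged 1 period (mis-specified)

print(f"true beta = {beta}, correct lag: {b_right:.2f}, wrong lag: {b_wrong:.2f}")
```

With the wrong lag, the estimate is pulled toward the regressor’s autocorrelation times the true effect; if the regressor were white noise within units, it would vanish entirely. The point is not that regression is hopeless, but that its temporal assumptions are worth thinking about.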

I think this quote by Arthur Stinchcombe sums it up best:

A “trained incapacity” is produced by a sufficient level of specialization so that some kinds of good work cannot be produced by some of the competent practitioners in the discipline. If it takes so much time to learn statistics up to a modern standard that quantitative researchers do not have time to learn any history, or if the incentive system among quantitative people discourages dilettantism, or if quantitative workers have a positive ideology against becoming learned in “soft” subjects, or if the social circles that know about hierarchical models do not know about recent good books, then quantitative people will not be able to see things that one can find out about only historically. What one has to look for is not competencies, for there is no general reason that one competence should interfere with another (in fact in general disparate competencies are positively correlated, though generally the correlation is small). Instead one has to look for tendencies for competence in one thing to decrease the ability to develop competence at another thing.

Stinchcombe (1986, p. 25, emphasis added)

References

Abbott, Andrew. 1997. “Of Time and Space: The Contemporary Relevance of the Chicago School.” Social Forces 75(4): 1149–82.
———. 2001. Time Matters: On Theory and Method. University of Chicago Press.
Baehr, Peter. 2019. The Unmasking Style in Social Theory. Routledge.
Elster, Jon. 1986. An Introduction to Karl Marx. Cambridge: Cambridge University Press.
Emirbayer, Mustafa. 1997. “Manifesto for a Relational Sociology.” American Journal of Sociology 103(2): 281–317.
Healy, Kieran. 2017. “Fuck Nuance.” Sociological Theory 35(2): 118–27.
Pierson, Paul. 2011. Politics in Time. Princeton University Press.
Stinchcombe, Arthur L. 1986. Stratification and Organization: Selected Papers. Cambridge: Cambridge University Press.
Vaisey, Stephen, and Andrew Miles. 2017. “What You Can—and Can’t—Do With Three-Wave Panel Data.” Sociological Methods & Research 46(1): 44–67.
Wawro, Gregory, and Ira Katznelson. 2022. Time Counts: Quantitative Analysis for Historical Social Science. Princeton University Press.
Woodward, James F. 2011. “Data and Phenomena: A Restatement and Defense.” Synthese 182: 165–79.

Footnotes

  1. “My procedure is simple. I find all the narrative sentences, the sentences whose predicates are activities or parts of the social process. I then consider who are the subjects of these sentences, what kinds of activities are involved, and how the predicates are related to causality” (Abbott 2001, p. 130).↩︎