(In)Effective Altruism
I wasn’t going to bother writing this because frankly I think that with Israel currently massacring the people of Gaza in a live genocide, the dangers posed by Effective Altruism are piffling at best (and outside the remit of this blog). However, in a moment of lukewarm synchronicity, Katherine Dee just posted an article on an Efilist – an ideology that is, if not exactly downriver of EA, certainly a tributary – having bombed a fertility clinic… Having hung out with the large Cambridge chapter of Astral Codex Ten/Slate Star Codex readers [a bright bunch, probably my opposite Myers-Briggs type, though they wouldn’t endorse Myers-Briggs types] I’ve had to hear more than my fair share about EA and utilitarianism more broadly. I have thoughts.
I don’t find people concerned about the welfare of farmed shrimp especially sinister, and I’m also more sympathetic to the philosophy of Efilists and Zizians and other anti-natalists than Katherine Dee is. I’m very keen on improving the lives of the children who currently exist on this planet (I’m certainly not a teacher for the salary or the working hours) but I have no interest in bringing any more into existence. The fact remains, most beings want to keep on living. It would be ludicrously arrogant to wag my finger at that, far less to violate that wish.
Most Effective Altruists would agree with that. My friend who is most linked to the EA community is a sweetheart who has worked on improving rice crop yields and mostly cares about geometric algebra. The only murderer I have ever been friends with was, on the surface, the most “normal”, centrist-liberal guy I knew, who was also massively into the Marvel Cinematic Universe. I hate the conflation of surface-level weirdness with a deeper moral or ethical deviance. I see more and more Substack and social media posts urging people to trust their reactive attitudes, and I mostly see it weaponised against trans* and autistic folks. Fuck that.
So, this is a somewhat affectionate critique… or, at the very least, an observation not motivated by hate or disgust.
In short, EA has an issue with commodity fetishism. To quote Oxford Reference, commodity fetishism is:
The mistaken view that the value of a commodity is intrinsic and the corresponding failure to appreciate the investment of labour that went into its production. Karl Marx created this term, borrowing the notion of the fetish from anthropology, where it refers to a sacred or symbolic object that according to its worshippers has supernatural power.
Yesterday, also, Laura Basu provided a helpful potted definition from her toddler.
At my friend’s bachelor party, his brother gave a PechaKucha-style presentation on EA. In this he argued that it was morally far superior to donate $1000 to a charity that provides mosquito nets against malaria than to donate one organ since the mosquito net will save many lives whereas donating your kidney will just save one.
Now, as someone in the process of donating a kidney [or, rather, attending the numerous medical appointments and check-ups that permit you to donate a kidney] I have some skin in the game regarding this argument! Frankly, I’m most inclined to urge you to do both!
However, as is often the case with EA adherents, this argument builds a false equivalence and it does so by invisibilising labour.
Because it is not the EA donor who is making or distributing those mosquito nets. While the chances of those nets saving lives are probabilistic, that probability is determined by the work of cotton weavers/factory workers, those who grew and harvested the cotton (or engaged in the polymerization process if we’re not dealing with cotton), those making and documenting the distributions etc. etc. etc.
Of course, the same is true of my kidney. I need to thank my mother first for enabling me to grow one at all! However, it is abundantly clear that there is not a flat ontology that unites the donation of a kidney with the donation of a mosquito net as equivalent acts of giving. The problem is, EA gives very little consideration to ontology (much less phenomenology) despite the fact that gift giving is a site of deep-rooted ontological enmeshment that sits at the foundation of human social relationships. We don’t even need to be religious and consider John Milbank’s theology of the gift; we can go back to Marcel Mauss’s formative 1925 essay on the gift. Since my Substack is free from the strictures of academic publishing, I’ll quote Wiki:
From the disparate evidence, he builds a case for a foundation to human society based on collective (vs. individual) exchange practices. In doing so, he refutes the English tradition of liberal thought, such as utilitarianism, as distortions of human exchange practices.
Maybe I’m being overly precious, but I think trying to make deep-rooted human practices mathematically elegant and streamlined rather bypasses what is human about them: their enmeshment in collective sociality. Even more vitally, this reductionist drive towards mathematical purity can lead to some troubling and inhumane places.
While I do not believe this fact alone should stop you from donating to the Against Malaria Foundation (one of EA’s NGOs of choice that at least isn’t devoted to the ludicrous concern that AI is going to become sentient), I think it is notable that their partners include Microsoft, Citi and Sumitomo. Microsoft has been providing AI technology to assist the Israeli military in the slaughter of the people of Gaza. Citigroup is being targeted by both pro-Palestine and climate-justice movements. Sumitomo Corp. has invested over $100 million into Israel over the last five years.
Do I think all executives of these corporations are Zionists deeply invested in the genocide of the Palestinian people? No, I think they are deeply invested in profit and see the Palestinian people as, at best, numbers on a spreadsheet or, at worst, irritants and irrelevancies. From their utilitarian long-termist perspective, what does it matter if thousands upon thousands of Palestinians are crushed, starved and tortured to death if it means the system of global capital and the comfort and supremacy of the Global North are maintained?
Ultimately, utilitarianism is not non-ideological and cannot be extricated from the socio-political systems in which utilitarians exist. If utilitarians seek, Steven Pinker-like, to improve outcomes while maintaining a murderous and grotesque status quo, then the system will just become more efficient at perpetuating the inequalities it serves to uphold.