In the months leading up to the RadicalxChange conference in March, I wrote a series of critiques of prominent contemporary ideologies (capitalism, statism and nationalism) as well as an attempt to sketch the positive beliefs of the RxC movement. Since then, however, it has become apparent that I omitted a critical contemporary ideology, perhaps the one with which RxC is most likely to be confused by outsiders (and which most RxC participants previously subscribed to): technocracy. I myself was socialized into a highly technocratic culture. In this blog post I try to fill this lacuna.
By technocracy, I mean the view that most of governance and policy should be left to some type of “experts”, distinguished by meritocratically-evaluated training in formal methods used to “optimize” social outcomes. Many technocrats are at least open to a degree of ultimate popular sovereignty over government, but believe that such democratic checks should operate at a quite high level, evaluating government performance on “final outcomes” rather than the means of achieving these. They thus believe the intelligibility and evaluability of technocratic designs by the broader public are of little value. Within these broad outlines, technocracy comes in many flavors. A couple of notable and less democratic versions are the forms adopted by the Chinese Communist Party and by the “neoreactionary” movement, with its celebration of Lee Kwan Yew’s Singapore.
Yet perhaps the most prominent version, especially in democratic countries, is a belief in a technocracy based on a mixture of analytic philosophy, economic theory, computational power and high-volume statistical analysis, often using experimentation. This form of technocracy is a widely held view among much of the academic and high technology elites, who are among the most powerful groups in the world today. I focus on this tendency as I assume it will be the form of technocracy most familiar and attractive to my readers, and because the neoreactionary and Chinese Communist technocracies have much in common with it, both conceptually and in their intellectual histories. Some examples of more extreme versions of this view, likely to be popular among my readers, are common in the “rationalist” community and projects adjoining it such as effective altruism, mechanism design, artificial intelligence alignment and, to a lesser extent, humane design. I will critique each of these tendencies in detail as archetypes of technocracy.
Such rationalist projects are generally “outcome oriented” and utilitarian, and have great faith in formal and quantitative methods of analysis and measurement. Their standard operating procedure is to take abstract goals related to human welfare, derive from these a series of more easily-measurable target metrics (ranging from gross domestic product to specific village-level health outcomes) and use optimization tools and empirical analysis derived from economics, computer science and statistics to maximize these outcomes. This process is imagined as taking place overwhelmingly outside the public eye and is viewed as technical in nature. The public is invited to judge final outcomes only, and to offer input into the process only through formalisms such as “likes”, bets, votes, etc. Constraints on this process based on democratic legitimacy or explicability, “common sense” restrictions on what should or shouldn’t be optimized, unstructured or verbal input into the process by those lacking formal training, etc. are all viewed as harmful noise at best and as destructive meddling by ill-informed politics at worst.
The fundamental problem with technocracy on which I will focus (as it is most easily understood within the technocratic worldview) is that formal systems of knowledge creation always have their limits and biases. They always leave out important considerations that are only discovered later and that often turn out to have a systematic relationship to the limited cultural and social experience of the groups developing them. They are thus subject to a wide range of failure modes that can be interpreted as reflecting a mixture of corruption and incompetence of the technocratic elite. Only systems that leave a wide range of latitude for broader social input can avoid these failure modes. Yet allowing such social input requires simplification, distillation, collaboration and a relative reduction in the social status and monetary rewards allocated to technocrats compared to the rest of the population, thereby running directly against the technocratic ideology. While technical knowledge, appropriately communicated and distilled, has potentially great benefits in opening social imagination, it can only achieve this potential if it understands itself as part of a broader democratic conversation.
My argument proceeds in six parts:
- Formal social systems intended to serve broad populations always have blind spots and biases that cannot be anticipated in advance by their designers.
- Historically, these blind spots have often led to disastrous outcomes when left unchecked by external input. If this input is deferred to the outcome stage, disasters must occur before the system is reconsidered, rather than biases being caught during the process.
- Failures of technocracy in managing economic and computational systems today bear significant responsibility for widespread feelings of illegitimacy that threaten respect for the best-grounded science that technocrats believe is most important for the public to trust.
- Technical insights and designs are best able to avoid this problem when, whatever their analytic provenance, they can be conveyed in a simple and clear way to the public, allowing them to be critiqued, recombined, and deployed by a variety of members of the public outside the technical class.
- Technical experts therefore have a critical role precisely if they can make their technical insights part of a social and democratic conversation that stretches well beyond the role for democratic participation imagined by technocrats. Ensuring this role cannot be separated from the work of design.
- Technocracy divorced from the need for public communication and accountability is thus a dangerous ideology that distracts technical experts from the valuable role they can play by tempting them to assume undue, independent power and influence.
Continues in source: Why I Am Not A Technocrat | RadicalxChange