Center for Humane Technology


How do you change complex systems?
In this newsletter, we offer a preliminary framework for how to intervene in complex systems, provide our highlights from Frances Haugen’s UK testimony last week, and share a new resource for educators wanting to bring The Social Dilemma into the classroom.
As we learned from the Facebook Files, social media’s operating model is having disastrous effects on our global society. We’re 15 years into a mass experiment where our attention is mined for profit, and we’re seeing escalating distraction, addiction, outrage, and polarization in ways that are degrading our mental health, social cohesion, and democratic institutions.

Where do we go from here? Actionable change requires elevating the discussion beyond harms and towards a systems perspective. Inspired by Donella Meadows’ 12 Leverage Points to Intervene in a System, we developed a simplified framework of leverage points for how to intervene in the extractive tech ecosystem.

The Leverage Points Framework shows that change happens at multiple levels with different degrees of impact. Importantly, it demonstrates why pushes for immediate design tweaks at major platforms must be paired with longer-term systemic reform, like changing the fundamental business models. Generally, leverage increases from left to right on the framework. However, so does the difficulty of implementing changes. Because of this, multiple efforts at multiple points of leverage are important.

Below are working definitions and examples for each lever:
1. Platform Changes: Platform changes are adjustments that platforms themselves make to the design (visual, interactive, etc.) of their products. For example, a platform can prompt users via a notification to read an article before sharing it. While these design changes can have material impact (e.g., changing the Share button as #OneClickSafer proposes), they don’t address the root causes stemming from the operating model.

2. Internal Governance: Internal governance changes are implemented by decision-makers within platforms to shift how internal systems and structures operate. Examples could include having the Facebook Oversight Board oversee unsafe design features (not simply whether a piece of content is good or bad) or changing employee bonuses to pay out for increasing people’s safety and well-being (not for increasing user engagement).

3. External Regulation: External regulation occurs when outside forces, such as legislators or regulators, pass laws that set common platform safety requirements, mandate age-appropriate design, force interoperability with competing platforms, or create liabilities for unsafe business practices or harms. While these changes take longer to enact, they are more enduring and have higher impact potential. Recent examples include the GDPR, COPPA, and the proposed KIDS Act.

4. Business Model: Business model changes shift the fundamental operations and profit structures of a firm. An example would be a social media platform moving to a subscription model with a sliding scale to ensure broad access. Business model changes may arise from internal governance or external regulation, supply and demand changes (e.g., a lawsuit that makes the current “viral engagement” business models unaffordable, thereby changing what venture capitalists deem profitable), or operating system changes (e.g., Apple changing iOS to limit user tracking, reducing the profitability of surveillance business models).

5. Economic Goal: Economic goal changes occur when the orientation of the system itself transforms through regulation, investor behavior, new financial models, or new market entrants. An example would be if Facebook were accountable to metrics that reflect a healthier society (instead of optimizing for quarterly profits) or were converted into a public benefit corporation based on a stakeholder model rather than a shareholder model.

6. Operating Paradigm: Paradigm changes are the highest leverage point and the most difficult to shift. They occur when there is widespread change in our core beliefs, values, behaviors, and operating norms. Examples include:
- A mass shift in consumer sentiment, as with Big Tobacco: over several decades, cigarettes went from being “cool” to being seen as dangerous and lethal. Public attitudes toward “viral engagement” social media could shift in the same way, with people coming to see it as dangerous.
- A change in the cultural beliefs of technologists, who come to see attention-harvesting, “race to the bottom of the brain stem” addictive platforms as unethical and dangerous to society.
- A society-wide change in the North Star of what we’re seeking. We could ask ourselves: is our ultimate goal to have “30% less toxic social media” than we have now, or to build humane technology that enables thriving 21st-century digital societies?
This is a preliminary framework. We’d value your feedback on how to improve it. Please drop us a note at with your thoughts.