A magnifying glass positioned over a stack of dollar bills, symbolizing the need to take a closer look at public funds. (Credit: Sasun Bughdaryan on Unsplash)

Digital Colonialism, a series co-produced by NPQ and MediaJustice, explores how the rapid expansion of artificial intelligence (AI) data centers is reshaping communities across the United States.


The challenge of this moment is not how to stop artificial intelligence (AI) or its momentum, but how to shape it.

Nearly 60 years ago, in one of his final public remarks, civil rights leader Dr. Martin Luther King Jr. issued a call to action we have yet to fully heed: the need to build “a new stage of massive active, nonviolent resistance to the evils of modern corporate society.”

As corporate-driven AI reshapes our public health, workplaces, economies, and futures, it has become one of the most pressing frontiers of today’s civil rights movement. What makes this moment especially unsettling is that AI projects that degrade human life are being financed not solely by distant private interests but in part by public dollars and institutional funds tied to our retirement savings, endowments, and charitable contributions, often without our knowledge or consent.

Dr. King’s call still rings out today. In the hundreds of US locations where data centers are being built or planned, corporate overreach is creating harmful impacts for local communities: poor air quality, health risks, and higher utility rates.

Stewards of Our Future

As we acknowledge the “evils of modern corporate society,” we must reject the idea that “the corporation” is inherently or wholly immoral. We must recognize that corporations take on the moral charge we allow. They are shaped by the choices of executives, investors, regulators, consumers, and governments, and by the incentives we set and the standards we fail to enforce.

In the race to profit from AI, corporate decision-making is increasingly determining who absorbs risk, who captures value, and whose communities are sacrificed. These corporate decisions are unfolding faster than democratic oversight can respond, creating conditions where public consent is bypassed and accountability is optional.

When corporate decision-making reflects the lived knowledge of impacted stakeholders—workers, retirees, beneficiaries, and civil society leaders—our systems become safer and healthier. Biology teaches us that resilient systems depend on feedback from those who inhabit them; humans are not an exception. Their inclusion is essential to governing our complex economies and technologies in ways that are mutually beneficial rather than extractive.

Due to the systemic exclusion of impacted stakeholders, the AI investment boom is unfolding alongside economic insecurity for millions. While everyday people confront an affordability crisis, AI is driving US equity markets and corporate valuations to historic highs.

According to Harvard economist Jason Furman, US GDP growth in the first half of 2025 “was almost entirely driven by investment in data centers and information processing technology.” These capital flows are not neutral: They actively shape what we live through now and what we will live through tomorrow.

As systems and our tools evolve and new risks emerge, our moral and organizing frameworks must adapt and evolve as well.

When Investments Undermine Our Values

If AI’s expansion were financed solely by speculative capital—investment in high-risk assets—the story would be simpler. But that is not the case.

Majority Action, the organization I lead that mobilizes investor power for corporate accountability, recently published a report titled Emerging Technologies, Evolving Responsibilities: Why Investors Must Act to Mitigate AI’s System-Level Impacts. In it, an uncomfortable yet opportune truth emerges: Much of the money powering unbridled AI and data center growth comes from investor institutions with a fiduciary and moral duty to serve people. Public pension funds, nonprofit and foundation endowments, university endowments, and employer-sponsored retirement plans are not abstract pools of money; they are real investments built from workers’ paychecks, charitable contributions, and savings entrusted for future security.

These institutions are entrusted to grow value without compromising the people they exist to protect. Yet in practice, their assets are frequently channeled through private equity and venture capital structures that obscure outcomes and accountability. As a result, institutions committed to social good are often unaware that their assets are helping shape technologies and infrastructure that undermine labor protections, strain local environments, and deepen inequality. This is a profound breach of trust: Workers’ retirement savings are financing systems that can weaken their own workplace protections, nonprofit endowments may be backing developments that harm the communities they aim to uplift, and university endowments are underwriting infrastructure that can shape economic futures in ways that disadvantage students.

However, expecting investors to step away from AI entirely is not realistic, given the current structure of markets and the geopolitical framing of AI as a modern-day “space race.” This limits full divestment as a primary strategy and elevates the importance of stronger, more accountable corporate governance, enforceable standard-setting, and active stewardship.

Right now, it is crucial that we organize our capital and the institutions entrusted to manage it. We are called to remember that we are not outside the systems shaping our future: We are the people who make them up. It is our money, our retirement plans, our endowments, our partners, and our institutions. And with that comes the right and responsibility to have a say in how our funds are deployed. We can redirect our capital flows toward technologies that advance and serve the greater good.

Through managing our capital, we can:

  • Demand renewable energy and labor standards for data centers
  • Establish enforceable labor and community protections across AI and data-center expansion
  • Require clear accountability and meaningful consequences for social and environmental harm

Building Toward Beloved Community

Majority Action and our collaborators, including investors who play an influential role in corporate society as well as community members and workers impacted by corporate decision-making, are creating a pathway forward. We support investor due diligence by providing strategic guidance, developing practical tools to inform investment policies and corporate engagement, and cultivating meaningful relationships through site visits and shared learning spaces. Corporate guardrails and enforceable standards are the difference between performative solidarity and real accountability. Public pension funds, nonprofit and foundation endowments, and university endowments can either remain passive financiers of harm or seize the opportunity to become active stewards of our shared prosperity.

Bound by their duty to their beneficiaries, public pension funds and mission-driven endowments are often the leaders, setting expectations that ripple outward across the financial ecosystem, helping shape norms for other asset owners and asset managers, and potentially setting the stage for state and federal policy. We collectively benefit when foundations better align their endowments with the values and priorities of their institutions.

For example, pension funds like CalPERS, California’s public employee retirement system, established labor principles that set clear expectations for worker rights and corporate accountability. Public stewards such as former New York City Comptroller Brad Lander urged pension boards to reconsider mandates with large asset managers whose practices fell short of climate and governance commitments. This is how we reclaim capital as a tool for shared security across generations.

Nonprofit leaders, trustees, and asset owners can begin by asking simple but powerful questions:

  • Where is your organization’s pension plan or endowment invested?
  • What kind of digital future is that capital currently building?
  • What levers of engagement do you have if those investments conflict with your values or your duty to your beneficiaries?

As we pursue corporate and capital strategies, we must also be clear-eyed about the façade of accountability. Corporate leaders know it often takes little to quiet opposition. Selective transparency or symbolic gestures can weaken public pressure while corporate capture and power continue to consolidate. Resisting performative accountability requires solidarity, clarity, and a sustained commitment to real change, measured against concrete benchmarks.

As we celebrate the wins and take note of the critical organizing and local fights around data centers, our momentum must expand beyond individual projects to grapple directly with the financial systems underwriting this emerging technology. We are at a narrow inflection point before potentially harmful patterns lock in.

What we choose to build reflects what we believe is possible. There are leaders across the corporate and investment landscape who understand that stability, democratic principles, and long-term value are inseparable, and who remain accountable to the people and communities living through the consequences.

When misaligned corporate actors and actions threaten our collective wellbeing, we can respond by generating something more just, more humane, more life-giving—together. The question here is simple: Do we want a future where our own money works against us, or one where it sustains us? The stage is set. Our invitation stands.