📨 AI for Social Impact Deep Dive: The AI Ecosystem

✍🏼 A Note From the Editor

Today, we’re getting into how the AI ecosystem works. The AI we all know and love (or have mixed feelings about) is the result of 70+ years of development and a pivotal 2017 research breakthrough: the transformer. Now a staggering amount of capital, from a very small number of companies and investors, is powering AI and the economy around it. Understanding this structure can help you ask better questions, make smarter vendor decisions, and think strategically about the concentration of AI power.

⛓️‍💥 The Transformer Breakthrough

In 2017, a team of Google researchers published a paper called "Attention Is All You Need," introducing a new architecture: the transformer. Its breakthrough is that it lets AI treat almost anything, like text, images, sound, DNA sequences, and even fMRI brain scans, as a form of language. That meant a breakthrough in one domain could become a breakthrough across all domains, and thus the modern AI boom was born.

🏗️ The AI Ecosystem: A Layered Pyramid

🧱 Layer 1: Infrastructure

The foundation of all AI tools is the infrastructure that powers them: semiconductor chips (particularly Nvidia's GPUs, the dominant hardware for training AI models), data centers, electricity, cooling systems, and the computational power needed to train and run large models. The investment these components require is what makes building a foundation model cost-prohibitive for smaller players. “Hyperscalers” Amazon, Google, Microsoft, and Meta are projected to spend upwards of $650 billion on AI in 2026, which is 67% to 74% more than what they spent in 2025.

🧠 Layer 2: Foundation Models

This is where the actual AI "intelligence" lives. Foundation models are the large, pre-trained systems, like GPT (the models behind ChatGPT), Claude, Gemini, and Llama, that learn patterns from vast amounts of data and apply them to a broad range of tasks. The cost of training AI models has grown at a rate of 2.4x per year since 2016, with the most significant expenses, AI accelerator chips and staff, each running into the tens of millions of dollars.
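To make that 2.4x-per-year growth rate concrete, here's a quick back-of-the-envelope sketch. The $1 million starting point is purely illustrative, not a real 2016 price tag; only the 2.4x multiplier comes from the figure above:

```python
# Back-of-the-envelope compounding of a 2.4x annual growth rate.
# The $1M starting cost is a hypothetical placeholder, not a real figure.
start_cost = 1_000_000       # hypothetical 2016 training cost, in dollars
growth_per_year = 2.4        # cost multiplier per year

cost = start_cost
for _ in range(2024 - 2016):  # eight years of compounding
    cost *= growth_per_year

print(f"Implied 2024 cost: ${cost:,.0f}")  # roughly $1.1 billion
```

In other words, anything compounding at 2.4x a year grows about a thousandfold in eight years, which is why training budgets that once fit a research lab now require hyperscaler-level capital.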

📦 Layer 3: Applications

Applications are the tools and products built on top of foundation models. Application developers access foundation models via APIs (application programming interfaces), paying per query or per token. They build the user experience and the specialized functionality of the tool, while the model layer provides the underlying intelligence.
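To make "paying per token" concrete, here's a minimal sketch of how an application developer might estimate the cost of a single API call. The prices are hypothetical placeholders, not any provider's real rates:

```python
# Hypothetical per-token prices; real rates vary by provider and model.
PRICE_PER_INPUT_TOKEN = 3.00 / 1_000_000    # e.g. $3 per million input tokens
PRICE_PER_OUTPUT_TOKEN = 15.00 / 1_000_000  # e.g. $15 per million output tokens

def estimate_call_cost(input_tokens: int, output_tokens: int) -> float:
    """Rough cost of one API call, in dollars."""
    return (input_tokens * PRICE_PER_INPUT_TOKEN
            + output_tokens * PRICE_PER_OUTPUT_TOKEN)

# A ~2,000-token prompt with a ~500-token response:
cost = estimate_call_cost(2_000, 500)
print(f"${cost:.4f} per call")
```

Each call costs fractions of a cent, but a popular app making millions of calls a day turns those fractions into real money, which is why application-layer companies watch token counts so closely.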

👤 Layer 4: End Users

That's you. You are interacting either directly with a foundation model or through a tool in the application layer, likely powered by one of these models. The caveat here is that a vendor might be using an open-weight model, which offers an alternative to the centralization of power within Big Tech but has its limitations, too. Because of the enormous amount of money pouring into Big Tech models, they remain the best-in-class tools. Open-weight models may lag in reasoning, creativity, and nuanced language tasks, but they can be adapted to a specific use case through fine-tuning or retrieval-augmented generation (RAG), and they aren't at the mercy of Big Tech companies on pricing, for example.

💸 The Economics

The AI industry runs on a circular funding model: a loop where the same small number of companies invest in each other, and the capital flows back to the original investor through cloud computing contracts and hardware purchases. One company injects capital into another, which then funnels that capital back into the original investor's revenue stream.

Microsoft and OpenAI (ChatGPT) pioneered this deal structure: by 2023, Microsoft had invested more than $13 billion in OpenAI, giving OpenAI access to the computing resources it needed to build and deploy powerful models. As part of this deal, OpenAI committed to using Microsoft Azure as its primary cloud provider.

Then, Amazon ($4 billion) and Google ($2 billion) invested in Anthropic (Claude), with deals to use AWS for AI training and Google’s chips and cloud services. Most recently, Microsoft and Nvidia together invested $15 billion in Anthropic, while Anthropic committed to purchasing $30 billion of Microsoft Azure compute capacity powered by Nvidia systems. OpenAI also just named AWS its exclusive third-party cloud provider for its new enterprise agent platform, with an investment of up to $50 billion.

Everyone is invested in or has contracts with just about everyone else in the industry, which deepens the circularity and makes all these companies even more interdependent. The concern some analysts have raised is structural fragility: if any major player falters, the reverberations could cascade quickly through the whole system.

😵‍💫 So now what?

It’s easy to feel overwhelmed and powerless at the hands of Big Tech, but knowledge is power. The more we build our AI literacy, instead of turning away from AI completely, the more agency we have in shaping our AI future. We can start small: opt out of model training when using paid tools, ask vendors which models power their products to see whether those providers align with our values, and learn where open-weight tools fit into the equation. The more we know, the better off we are.

👋🏼 About AI for Social Impact

I’m Joanna, and I’m on a mission to help folks in the social impact sector understand, experiment with, and responsibly adopt AI. We don’t have time to waste, but we also can’t get left behind.

Let’s move the sector forward together. 💫

♥️ Spread the Love

Spread the love and forward this newsletter to anyone who might benefit from a dose of AI inspo!

Thank you for being part of the community. 🫶🏼