Gavin Baker on the Evolving AI Landscape: Nvidia, TPUs, and Data Centers in Space

In this podcast episode, investor Gavin Baker shares his framework for processing rapid AI advancements like Gemini 3, argues for testing models at their premium tiers, and makes the case for space-based data centers, while mapping the competitive dynamics in AI hardware and business models.

Investor Gavin Baker brings his encyclopedic knowledge of technology markets to bear on the rapidly evolving AI landscape. In this wide-ranging discussion, he shares insights on processing breakthrough developments like Gemini 3, the critical importance of accessing premium AI models, and his bold vision for space-based data centers.

The Reality of AI Model Testing

Most investors make a fundamental mistake when evaluating AI progress: they judge frontier models based on free tiers. Baker emphasizes this creates a distorted view of capabilities.

“The free tier is like you’re dealing with a 10-year-old and you’re making conclusions about the 10-year-old’s capabilities as an adult,” Baker explains. “You have to pay for the highest tier, whether it’s Gemini Ultra, SuperGrok, whatever it is—you have to pay for the $200-per-month tier, and those are like a fully-fledged 30- or 35-year-old.”

This distinction matters for investment decisions. The gap between free and premium tiers represents the difference between a basic assistant and a sophisticated reasoning system capable of complex problem-solving.

Following the Signal in AI Development

Baker tracks AI progress through multiple channels, with Twitter/X serving as the primary battleground for real-time developments. He points to public disputes between research teams—like the PyTorch team at Meta and the JAX team at Google—as indicators of competitive intensity.

The key insight: “If on planet Earth there’s 500 to 1,000 people who really understand this and are at the cutting edge, and a good number of them live in China, you have to follow those people closely.”

His approach involves:

  • Monitoring posts from leading researchers like Andrej Karpathy
  • Listening to podcasts featuring lab personnel from OpenAI, Google, Anthropic, and xAI
  • Using AI itself to process and analyze the constant stream of information

The Gemini 3 Breakthrough and Scaling Laws

Gemini 3’s release provided crucial confirmation that pre-training scaling laws remain intact. This matters because scaling laws represent empirical observations rather than understood phenomena—like ancient civilizations precisely measuring celestial movements without understanding orbital mechanics.

“Our understanding of scaling laws for pre-training is kind of like the ancient Egyptians’ understanding of the sun,” Baker notes. “They can measure it so precisely that the east-west axis of the great pyramids is perfectly aligned with the equinoxes… but they had no idea how or why.”

The confirmation means continued progress is possible as hardware advances, particularly with the transition from Hopper to Blackwell chips.
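
The “empirical, not understood” point can be made concrete: a pre-training scaling law is just a power-law curve fitted to measured compute/loss pairs, with no mechanistic theory underneath. A minimal sketch on synthetic data (the coefficients are illustrative, not drawn from any real training run):

```python
import numpy as np

# Synthetic (compute, loss) pairs following an assumed power law
# L(C) = a * C**(-b) plus noise -- coefficients are illustrative only.
rng = np.random.default_rng(0)
compute = np.logspace(18, 24, 12)  # training budgets in FLOPs
loss = 4.0 * compute ** -0.05 * rng.normal(1.0, 0.01, compute.size)

# Fit the power law in log space: log L = log a + slope * log C
slope, intercept = np.polyfit(np.log(compute), np.log(loss), 1)
a_fit = np.exp(intercept)

print(f"fitted: L(C) = {a_fit:.2f} * C^({slope:.3f})")
# The fit extrapolates loss at larger compute budgets, but -- like
# the pyramid builders -- it says nothing about *why* the exponent holds.
```

The fit recovers the exponent purely from measurements; that is the entire epistemic status of the scaling laws Baker describes.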

The Infrastructure Battle: Nvidia vs. Google

The AI landscape centers on a fundamental competition between Nvidia’s GPU ecosystem and Google’s TPU approach. This battle has profound implications for cost structures and competitive positioning.

Google has maintained an advantage as the lowest-cost producer of tokens, enabling them to “suck the economic oxygen out of the AI ecosystem.” However, this dynamic shifts as Blackwell deployment scales up.

Baker predicts xAI will release the first Blackwell-trained model in early 2026, benefiting from Elon Musk’s rapid data center construction capabilities. This represents a critical inflection point where cost advantages may reverse.

The Economics of AI Competition

For the first time in Baker’s career as a technology investor, being the low-cost producer matters fundamentally. Unlike Apple’s premium positioning in phones or Microsoft’s software margins, AI success requires operational efficiency at massive scale.

The transition to Blackwell and subsequent generations creates a prisoner’s dilemma among major players. Companies fear falling behind if they reduce spending, yet the economics become increasingly challenging as infrastructure costs soar.

“This is a life or death decision that essentially everyone except Microsoft is failing,” Baker observes, referring to the willingness to accept lower margins for AI capabilities.
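
The prisoner’s-dilemma structure can be sketched as a toy payoff matrix. The numbers below are purely illustrative, chosen only to show why “keep spending” dominates even when mutual restraint would pay more:

```python
# Toy payoff matrix for the AI capex prisoner's dilemma.
# Payoffs are illustrative relative profits for players (A, B);
# each chooses to keep spending on AI infrastructure or to cut back.
payoffs = {
    ("spend", "spend"): (1, 1),   # both accept thin margins
    ("spend", "cut"):   (5, -3),  # the spender wins the market
    ("cut",   "spend"): (-3, 5),
    ("cut",   "cut"):   (3, 3),   # collectively better, but unstable
}

def best_response(opponent_action):
    """Player A's best action given B's choice."""
    return max(("spend", "cut"),
               key=lambda a: payoffs[(a, opponent_action)][0])

# "Spend" is the best response whatever the rival does, so everyone
# keeps spending even though mutual restraint yields higher payoffs.
print(best_response("spend"), best_response("cut"))  # spend spend
```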

The Space Data Center Vision

Baker’s most audacious prediction involves data centers in space becoming reality within 3-4 years. The physics make compelling sense:

Power advantages: Space-based solar panels receive sunlight roughly 30% more intense than at Earth’s surface, available around the clock, eliminating battery storage needs and yielding about six times the total irradiance of Earth-based systems.

Cooling benefits: The vacuum of space provides free cooling through radiators facing away from the sun, eliminating the complex HVAC systems that account for much of a terrestrial data center’s mass.

Network performance: Laser links between satellites carry light through vacuum roughly 50% faster than fiber optic cables carry it through glass, potentially creating superior network coherence.

User experience: Direct satellite-to-device communication eliminates multiple network hops, reducing latency and improving response times.
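
The physics claims above survive a back-of-the-envelope check. A sketch using standard figures not taken from the episode (fiber refractive index ~1.47, ~1361 W/m² solar irradiance in orbit, and an assumed ~20% capacity factor for ground solar):

```python
# Back-of-the-envelope check on the space data center physics.
C_VACUUM = 299_792   # speed of light in vacuum, km/s
FIBER_INDEX = 1.47   # typical refractive index of optical fiber

# Network: light in fiber is ~32% slower than in vacuum.
c_fiber = C_VACUUM / FIBER_INDEX
distance_km = 10_000  # e.g. one long intercontinental hop
latency_vacuum_ms = distance_km / C_VACUUM * 1000
latency_fiber_ms = distance_km / c_fiber * 1000
print(f"10,000 km one way: {latency_vacuum_ms:.1f} ms in vacuum "
      f"vs {latency_fiber_ms:.1f} ms in fiber")

# Power: ~1361 W/m^2 in orbit, sunlit ~24/7, versus ~1000 W/m^2 peak
# on the ground averaged down by a ~20% capacity factor -- together,
# roughly the "six times" irradiance advantage cited.
orbit_wm2 = 1361
ground_avg_wm2 = 1000 * 0.20
print(f"irradiance ratio: {orbit_wm2 / ground_avg_wm2:.1f}x")
```

The latency gap compounds across every hop, which is why vacuum laser links plus direct satellite-to-device paths can plausibly beat terrestrial routes end to end.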

The primary constraint remains launch capacity, requiring significant Starship deployment to make the economics work.

Implications for Traditional Software Companies

Baker sees SaaS companies making the same mistake brick-and-mortar retailers made with e-commerce: refusing to embrace new technology due to margin concerns.

“Application SaaS companies are making the exact same mistake that brick-and-mortar retailers did with e-commerce,” he argues. Traditional software enjoys 70-90% gross margins, while AI applications typically achieve 40% margins due to computational requirements.

The solution requires accepting lower margins while leveraging existing customer relationships and data advantages. Companies like Salesforce, ServiceNow, and HubSpot could build AI agents that access their proprietary data, but most hesitate due to margin preservation instincts.
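
The margin math behind that hesitation is simple. A sketch with hypothetical unit economics, chosen only to land in the 70–90% versus ~40% gross-margin ranges discussed above:

```python
# Illustrative unit economics: why AI features compress SaaS margins.
# All figures are hypothetical, per seat per month, in dollars.

def gross_margin(price, cogs):
    """Gross margin as a fraction of revenue."""
    return (price - cogs) / price

seat_price = 100.0      # monthly price per seat
hosting_cogs = 15.0     # classic SaaS: hosting and support only
inference_cogs = 45.0   # added GPU inference cost for AI features

saas_margin = gross_margin(seat_price, hosting_cogs)
ai_margin = gross_margin(seat_price, hosting_cogs + inference_cogs)
print(f"classic SaaS margin: {saas_margin:.0%}")   # 85%
print(f"AI-feature margin:   {ai_margin:.0%}")     # 40%
```

Holding price constant, the inference bill comes straight out of gross margin, which is exactly the trade-off Baker says most incumbents are refusing to make.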

The Broader Technology Ecosystem

The AI boom has revitalized semiconductor venture capital, with experienced engineers leaving established companies to start specialized component firms. This ecosystem development proves crucial for maintaining the annual chip upgrade cycles that major players like Nvidia now target.

“Your average semiconductor venture founder is like 50 years old,” Baker notes, describing seasoned professionals who see massive market opportunities in AI infrastructure components.

Investment Philosophy and Truth-Seeking

Baker frames investing as “the search for truth”—finding hidden insights before markets recognize them. His background in history and current events provides the analytical framework for identifying technological and economic inflection points.

This approach requires constant learning and adaptation. Baker emphasizes how AI-native young entrepreneurs demonstrate remarkable sophistication by leveraging AI tools for business strategy, HR decisions, and operational challenges.

The convergence of historical pattern recognition, current event analysis, and technological understanding creates the foundation for identifying tomorrow’s dominant platforms and business models.

As AI continues reshaping technology markets, Baker’s insights highlight both the massive opportunities and fundamental risks facing companies that must choose between preserving existing margins and embracing transformative but margin-dilutive technologies. The winners will be those who recognize that in this new paradigm, being the low-cost producer of intelligence may matter more than traditional software economics.