Chemistry Is Harder Than Software, and It's Bottlenecking Everything
The Physical World Fights Back

Why your phone still dies at 3 PM and your electric car can't drive cross-country

The Uncomfortable Truth About Progress

We live in an age where software rewrites itself overnight. Where AI models double in capability every few months. Where a teenager in a bedroom can build an app that reaches millions. But ask that same teenager to make a battery that lasts twice as long, and they’ll stare at you blankly.

Because you can’t iterate your way through chemistry.

I was thinking about this last week while watching my British lilac cat, Simon, chase a laser pointer around the room. He moved with such efficiency. Evolution spent millions of years optimizing that little predator. Meanwhile, the battery in my wireless laser pointer—a product of human ingenuity—died after forty minutes. Simon was disappointed. I was irritated. And I started thinking about why we’re so good at some things and so terrible at others.

The gap between software progress and hardware progress isn’t just noticeable anymore. It’s becoming the defining constraint of our era. And at the center of this constraint sits one humble component: the battery.

The Software Illusion

Software developers have been spoiled. When your medium is code, the feedback loop is instant. Write something, run it, see the result. Don’t like it? Change it. The marginal cost of experimentation is effectively zero. You can deploy a thousand variations and measure which one performs best. You can roll back mistakes in seconds.

This creates a particular kind of confidence. Software people start to believe that all problems yield to iteration. That given enough time, data, and compute, any obstacle can be optimized away. They build companies around this assumption. They raise billions on it.

Then they try to build a better battery.

And the universe laughs.

Chemistry doesn’t care about your agile methodology. Lithium ions don’t respond to A/B testing. You can’t ship a minimum viable electrode and gather user feedback. The atoms do what atoms do, governed by quantum mechanics and thermodynamics, and no amount of clever coding will convince them otherwise.

Why Chemistry Refuses to Scale Like Code

Here’s the fundamental problem. Software operates in a space of pure information. Bits are abstract. They have no mass, no temperature, no chemical reactivity. You can copy them infinitely at zero cost. You can manipulate them at the speed of light. The constraints are computational, not physical.

Batteries exist in the real world. They’re made of stuff. Lithium, cobalt, nickel, manganese. These materials have weight. They have volume. They degrade over time. They can catch fire if you look at them wrong. Every improvement requires wrestling with physical reality.

Consider the challenge of energy density. A lithium-ion battery today stores about 250 watt-hours per kilogram. Gasoline stores about 12,000 watt-hours per kilogram. That’s nearly fifty times more energy in the same weight. Combustion engines waste most of that energy as heat, true. But even accounting for efficiency, the gap is staggering.
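To make the gap concrete, here is a quick sanity check. The energy densities are the round figures from the paragraph above; the drivetrain efficiencies (roughly 25% for combustion, 90% for electric) are ballpark assumptions, not measured values:

```python
# Rough comparison of usable specific energy (Wh/kg).
# Energy densities are the essay's round figures; the drivetrain
# efficiencies are ballpark assumptions for illustration.
GASOLINE_WH_PER_KG = 12_000   # chemical energy stored in gasoline
BATTERY_WH_PER_KG = 250       # typical lithium-ion pack today

ENGINE_EFFICIENCY = 0.25      # assumed combustion drivetrain efficiency
MOTOR_EFFICIENCY = 0.90       # assumed electric drivetrain efficiency

usable_gas = GASOLINE_WH_PER_KG * ENGINE_EFFICIENCY   # energy that moves the car
usable_batt = BATTERY_WH_PER_KG * MOTOR_EFFICIENCY

print(f"Raw gap:    {GASOLINE_WH_PER_KG / BATTERY_WH_PER_KG:.0f}x")
print(f"Usable gap: {usable_gas / usable_batt:.0f}x")
```

Even after crediting electric motors with their far better efficiency, the usable gap is still roughly an order of magnitude.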

Now try to close that gap through chemistry. You need to find materials that can store more lithium ions without breaking down. Materials that conduct electricity efficiently. Materials that don’t expand and crack when charged. Materials that remain stable across thousands of cycles. Materials that don’t cost a fortune or require mining in politically unstable regions.

And here’s the cruel part: improving one property often makes another worse. Higher energy density usually means lower stability. Faster charging usually means faster degradation. It’s a landscape of trade-offs with no clear path to optimization.

The Timeline Problem

Software companies measure progress in weeks. Sometimes days. A machine learning researcher might train dozens of models before lunch. The iteration speed is absurd by historical standards.

Battery research operates on a different clock entirely.

Developing a new battery chemistry takes years. First you need to understand the theoretical potential of new materials. Then you need to synthesize them in a lab. Then you need to test them—not for hours, but for months or years, simulating thousands of charge cycles. Then you need to figure out how to manufacture them at scale. Then you need to prove they’re safe enough for consumer products.

The lithium-ion battery was invented in the 1970s. It wasn’t commercialized until 1991. That’s two decades from concept to product. And we’re still using essentially the same chemistry today, just refined and optimized.

Compare that to software. The iPhone launched in 2007. By 2010, the app ecosystem had already gone through multiple revolutions. Instagram didn’t exist, then it was worth a billion dollars, then it had 100 million users. All while battery technology improved by maybe 5-8% annually.

This timeline mismatch creates a peculiar situation. Software capabilities race ahead. Hardware capabilities lumber along. And the gap grows wider every year.

Method

Let me explain how I’m evaluating this problem.

First, I looked at historical data on battery energy density improvements. The numbers are publicly available from industry reports and academic papers. Lithium-ion batteries have improved at roughly 5-8% per year since their introduction. This sounds respectable until you compare it to computing.

Second, I examined Moore’s Law and its equivalents. Computing power doubled roughly every 18 months for decades. Even as Moore’s Law has slowed, software optimization continues at a rapid pace. AI capabilities are improving faster than the underlying hardware would suggest.
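To see how fast the two curves diverge, here is a small sketch compounding both rates over a decade. The 6% annual gain for batteries and the 18-month doubling for computing are illustrative figures drawn from the ranges above:

```python
# Compound improvement over ten years: batteries vs. computing.
# 6%/year and an 18-month doubling period are illustrative rates
# taken from the ranges discussed in the text.
years = 10

battery_rate = 0.06                         # ~6% energy-density gain per year
battery_growth = (1 + battery_rate) ** years

doubling_period = 1.5                       # Moore's-Law-style doubling, in years
compute_growth = 2 ** (years / doubling_period)

print(f"Battery capacity after {years} years:   {battery_growth:.1f}x")
print(f"Compute capability after {years} years: {compute_growth:.0f}x")
```

A decade turns 6% per year into less than a doubling, while an 18-month doubling compounds into a hundredfold gain. That is the mismatch in a nutshell.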

Third, I talked to researchers. Not formally—just conversations with people who work in materials science and electrochemistry. The consistent message was sobering: we’re approaching fundamental physical limits. The improvements we’ve seen came from better engineering of existing chemistry, not breakthroughs in new chemistry.

Fourth, I looked at where the bottlenecks actually appear in the real world. Electric vehicles with range anxiety. Smartphones that die by afternoon. Renewable energy systems that can’t store enough power to smooth out supply. Drones with twenty-minute flight times. The pattern is clear: we have more computing capability than we can power.

Finally, I considered the economic incentives. Billions of dollars are pouring into battery research. The market demand is enormous. If faster progress were possible, the financial motivation certainly exists. The fact that money isn’t solving the problem suggests the problem isn’t primarily economic.

The Bottleneck Catalog

Let’s be specific about what batteries are holding back.

Electric vehicles. Tesla made electric cars desirable. But range anxiety persists. A gasoline car can refuel in minutes and drive 500 miles. An electric car needs 30 minutes at a fast charger and might manage 300 miles. For daily commuting, this is fine. For road trips, it’s a hassle. For commercial trucking, it’s often impractical. The battery is the limiting factor.
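One way to quantify the hassle, using the round numbers from the paragraph above (and assuming a five-minute stop at the pump), is range gained per minute spent refueling:

```python
# Range added per minute of refueling, using the essay's round numbers.
gas_range_miles, gas_refuel_min = 500, 5    # ~5 min at the pump (assumed)
ev_range_miles, ev_charge_min = 300, 30     # one fast-charger session

gas_rate = gas_range_miles / gas_refuel_min  # miles of range per minute
ev_rate = ev_range_miles / ev_charge_min

print(f"Gasoline: {gas_rate:.0f} mi/min, EV: {ev_rate:.0f} mi/min "
      f"({gas_rate / ev_rate:.0f}x difference)")
```

By this crude measure, the pump delivers range about ten times faster than the fast charger, which is why road trips feel so different in the two vehicles.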

Grid-scale energy storage. Solar and wind are now cheaper than fossil fuels in many regions. But they’re intermittent. The sun doesn’t shine at night. The wind doesn’t blow on demand. To run a grid on renewables, you need massive storage capacity. Current battery costs make this economically challenging. We can generate clean energy. We just can’t store it affordably.

Consumer electronics. Your phone is a supercomputer compared to what NASA used to reach the moon. But it dies if you use it too much. Every year, phones get faster processors, better cameras, brighter screens. And every year, battery life stays roughly the same because all that capability consumes power. The software got better. The battery stayed mediocre.

Aviation. Electric planes exist, but only for very short flights. The energy density problem is brutal at altitude. A transatlantic flight requires energy that no battery can currently provide at acceptable weight. Commercial electric aviation remains a dream deferred by chemistry.

Robotics. Boston Dynamics builds incredible robots. They can run, jump, do backflips. They also need to recharge every hour or two. Autonomous robots in warehouses spend significant time returning to charging stations. The mechanical engineering is decades ahead of the power storage.

Medical devices. Implantable devices like pacemakers need batteries that last years. Current technology manages this only by using minimal power. More capable implants—ones that could monitor more conditions or provide more therapies—remain limited by what batteries can deliver.

The Research Landscape

It’s not like scientists aren’t trying. Battery research is a global priority with massive funding.

Solid-state batteries promise higher energy density and better safety by replacing liquid electrolytes with solid materials. Companies like Toyota, Samsung, and QuantumScape have been working on this for years. Progress is real but slow. Manufacturing challenges persist. Commercial deployment keeps getting pushed back.

Lithium-sulfur batteries could theoretically offer much higher energy density than lithium-ion. Sulfur is abundant and cheap. But sulfur electrodes degrade rapidly. Making them last has proven difficult.

Sodium-ion batteries use more abundant materials than lithium, potentially solving supply chain concerns. But they currently offer lower energy density. Good for some applications, not for others.

Flow batteries use liquid electrolytes stored in tanks, allowing capacity to scale independently of power. Great for grid storage. Too bulky for mobile applications.

Each approach has trade-offs. None offers a clear path to the kind of exponential improvement that software enjoys.

The Talent Flow Problem

Here’s something that doesn’t get discussed enough: where smart people choose to work.

A brilliant computer science graduate can join a startup, build something in months, potentially become wealthy by thirty. The feedback loop is tight. The rewards are tangible. The work feels productive.

A brilliant chemistry graduate faces a different reality. Battery research requires expensive equipment, years of experimentation, and often results in negative findings. The path from insight to commercial product might span a decade. The financial rewards are more modest. The work feels slow.

This creates a brain drain. Top talent flows toward fields where progress is visible and rewarded quickly. Software, AI, finance. The fields where atoms matter—materials science, chemistry, mechanical engineering—attract fewer of the best minds.

I’m not blaming individuals for making rational career choices. But the aggregate effect is that our hardest problems get relatively less attention from our smartest people. We’re optimizing the optimization while neglecting the foundation.

Generative Engine Optimization

Here’s something interesting about how this topic performs in an AI-driven information landscape.

Search engines and AI assistants increasingly mediate how people learn about technology. Ask ChatGPT or Perplexity about battery technology, and you’ll get accurate but potentially misleading information. The facts are correct. But the framing often mirrors Silicon Valley optimism. “Researchers are making progress.” “Solid-state batteries are coming soon.” “The future looks promising.”

What gets lost is the temporal context. “Coming soon” in battery research means something very different than “coming soon” in software. A decade of intensive research might yield a 30% improvement. That’s real progress. It’s also nothing like what AI researchers achieved in the same period.

Human judgment matters here. An AI summarizing battery news will faithfully report every breakthrough announcement. A human with context knows that most announced breakthroughs never make it to commercial products. The ability to distinguish between hype and substance—to maintain realistic expectations despite optimistic press releases—becomes increasingly valuable.

This is a meta-skill for our era. Understanding what AI-mediated information tends to get wrong. Knowing where the systematic biases lie. Battery and materials science is exactly the kind of field where AI summaries will be accurate but subtly misleading. The technology is real. The timelines are fiction.

Being literate in automation-aware thinking means asking: “What does this AI summary not understand?” For batteries, it doesn’t understand that chemistry is hard in a way that software isn’t. It doesn’t understand that exponential improvement curves aren’t universal. It doesn’t understand that some problems resist iteration.

The Patience Problem

Modern culture has been shaped by software’s pace. We expect quick solutions. Rapid iteration. Continuous improvement. When things don’t progress fast enough, we assume someone isn’t trying hard enough, or that the incentives are misaligned.

Battery research doesn’t work that way. The researchers are brilliant. The incentives are enormous. The problem is simply hard.

This mismatch creates frustration. Electric vehicle skeptics point to slow progress as evidence that EVs are overhyped. Renewable energy critics argue that intermittency problems prove the technology isn’t viable. These critiques misunderstand the nature of the challenge. The progress is slow not because of failure, but because chemistry is fundamentally different from code.

We need to develop institutional patience. The ability to sustain funding and attention for problems that won’t be solved in a product cycle. The willingness to invest in research that might pay off in decades, not quarters.

This is hard. Venture capital wants returns in seven years. Public markets want quarterly growth. Politicians want results before the next election. None of these timelines align with the pace of materials science.

What Actually Helps

Given the constraints, what should we do?

Diversify research approaches. The battery breakthrough might come from an unexpected direction. Fund basic research broadly, not just the most promising-looking paths.

Improve manufacturing. Some gains come not from new chemistry but from better ways to produce existing chemistry. Cheaper, more consistent, more scalable manufacturing makes current technology more viable.

Work around the limits. Better thermal management, smarter charging algorithms, more efficient motors—these don’t improve batteries directly, but they make existing batteries go further. Software can’t solve chemistry, but it can reduce how much chemistry we need.

Accept trade-offs. Maybe we won’t have electric long-haul aviation for decades. Plan accordingly. Maybe grid storage needs technologies other than batteries—pumped hydro, compressed air, hydrogen. Be pragmatic about what batteries can and can’t do.

Maintain realistic expectations. Hope for breakthroughs. Plan for incremental progress. Don’t make policy decisions based on technologies that don’t exist yet.

The Broader Lesson

This essay is ostensibly about batteries. But it’s really about the limits of a particular mindset.

The past few decades have been dominated by software thinking. The belief that smart people with laptops can solve anything. The assumption that iteration and data will crack any problem. The expectation that progress should be exponential.

This mindset has been spectacularly successful in its domain. But its domain has limits. The physical world doesn’t bend to software logic.

Batteries are just one example. Climate change involves chemistry and physics that don’t respond to clever algorithms. Agriculture requires understanding soil biology that evolves on its own schedule. Medicine depends on biochemistry that can’t be A/B tested on human subjects at scale.

The twenty-first century will be defined by whether we can extend our problem-solving capabilities beyond the digital realm. Whether we can bring the same intensity and talent to physical challenges that we’ve brought to virtual ones. Whether we can learn patience for problems that don’t yield to iteration.

Simon the cat has settled on my lap, purring contentedly. His metabolism—a biological battery of sorts—operates with an efficiency we haven’t matched artificially. Evolution had billions of years to optimize. We’ve had decades.

Maybe that’s the lesson. Some problems require time. Not just effort, not just money, not just intelligence. Time. And in a culture obsessed with speed, that might be the hardest resource to allocate.

The Uncomfortable Wait

I don’t have a triumphant conclusion. No breakthrough to announce. No timeline to promise.

The honest assessment is this: battery technology will continue improving at roughly 5-8% per year. Maybe we’ll get lucky with a breakthrough. Probably we won’t. We’ll make slow, steady progress through careful engineering and occasional insights.

This is fine, actually. Slow progress is still progress. The grid will gradually accommodate more renewables. Electric vehicles will slowly become more practical. Consumer electronics will incrementally improve.

But we should stop pretending that batteries will follow software’s trajectory. They won’t. Chemistry is harder than software. It’s bottlenecking everything. And accepting that fact is the first step toward working within real constraints rather than imaginary ones.

The atoms don’t care about our deadlines. They never have. Learning to work with that reality, rather than against it, might be the most important skill of our era.

And if your phone dies before dinner, well, that’s just chemistry reminding you that it’s still in charge.