Getting climate, energy & environment news right.

Conservatives have been vocal about our climate for years. Those voices won’t be ignored any longer.


This was initially published for the Center for Strategic and International Studies.

Texas has long stood apart from national energy policy—its own grid, its own rules. In 2026, that independence has made it the top destination for AI-driven electricity demand. While Washington grapples with the pressures of building data centers, ensuring reliability, and controlling costs, Texas is tackling everything at once. The Electric Reliability Council of Texas (ERCOT) market is solving for speed-to-power: surmounting transmission bottlenecks, weeding out phantom load growth, buying down residential rate increases, and more.

Last week, a single hearing of the Texas Senate Committee on Business and Commerce brought this into view. Public Utility Commission of Texas (PUCT) Chairman Thomas Gleeson and ERCOT CEO Pablo Vegas told the committee that they were managing 410 gigawatts (GW) of load applications, nearly five times the capacity of today’s ERCOT grid, and nearly 90 percent of them were data centers. In the face of potentially overwhelming demand, Texas will become an example of either success or failure in the United States’ approach to winning the AI race.

The Texas legislature and regulators have developed a suite of policies to manage this growth. Three stand out as particularly important:

  1. Market-Based Reliability: The move toward a Dispatchable Reliability Reserve Service (DRRS), mandated in H.B. 1500 and codified in PURA § 39.159(d)-(e), shows that even in liberalized markets, long-duration dispatchable resources (such as gas plants and long-duration batteries) need a specific price signal to survive alongside zero-marginal-cost renewables and short-duration batteries.
  2. The Denominator Effect: Texas is shifting the conversation from “How much will this cost ratepayers?” to “How much can we grow the load base to dilute system costs?” An answer is emerging in Texas to the debate over who bears the fixed costs of grid upgrades and how the grid is utilized, and by what technologies, to spread those fixed costs.
  3. The End of the “Doom Loop” and Flexible Connection: By ditching the traditional load study process for a “batching” approach, Texas is in a first-of-its-kind exercise to solve load interconnection with strict project maturity criteria, bankable energization timelines, and as-available service options for loads to receive power faster.
     
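The “denominator effect” in item 2 is, at bottom, simple division: the same fixed costs spread over a larger load base mean a lower per-unit charge for everyone. A minimal sketch of that arithmetic, using entirely hypothetical numbers chosen only for illustration:

```python
# Illustrative sketch of the "denominator effect": spreading fixed grid
# costs over a larger load base lowers the per-MWh charge for everyone.
# All figures below are hypothetical, chosen only to show the arithmetic.

def fixed_cost_rate(annual_fixed_costs: float, annual_load_mwh: float) -> float:
    """Per-MWh charge needed to recover a fixed annual cost."""
    return annual_fixed_costs / annual_load_mwh

fixed_costs = 2_000_000_000   # $2B/year of transmission fixed costs (assumed)
base_load = 400_000_000       # 400 TWh/year of existing load (assumed)

before = fixed_cost_rate(fixed_costs, base_load)

# Add 50 TWh/year of new data center load paying into the same base,
# plus $100M/year of incremental upgrade costs attributable to it.
after = fixed_cost_rate(fixed_costs + 100_000_000, base_load + 50_000_000)

print(f"${before:.2f}/MWh before, ${after:.2f}/MWh after")
```

So long as new load grows faster than the new fixed costs it triggers, the per-MWh rate falls even though total system costs rise, which is the bet Texas is making.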

Gas for Tail Risk and Solar for Affordability—Core Texas Themes Resonate Nationwide

Vegas reported that the scale of projected demand has fundamentally shifted Texas’s generation priorities. On April 1, the Texas grid was running entirely on wind and solar with 17 GW of operating reserves. Vegas noted that those operating reserves sit idle and unpaid until called. When asked how the state could justify adding more gas if investors only see a return on the handful of “risky days,” he noted that a new structural signal is required to price the attribute of availability for tail risk days.

>>>READ: How Renewables and Batteries Saved the Texas Grid in 2025

For the first time in years, the natural gas interconnection queue (currently 60 GW) has overtaken wind. However, a massive financing gap remains. While the legislature’s Texas Energy Fund (TEF) catalyzed an initial 9 GW through low-cost debt, roughly 5 GW of proposed gas capacity lacks a viable pathway in markets. Vegas was explicit that state-backed loans are a jump-start, not a permanent solution. He pointed to the DRRS proceedings, which are set to create a dedicated revenue stream to bring longer-duration reserves to the grid. There are now two dueling DRRS implementation proposals pending at ERCOT. NPRR 1309 excludes batteries. NPRR 1310 introduces a “release factor” mechanism that allows a battery to be paid for DRRS while also being available for energy or other ancillary services.

Who Pays for Transmission?

To protect residential ratepayers from grid costs incurred to meet speculative data center loads, the PUCT is developing a standardized large load interconnection process. The proposed rules aim to establish a rigorous financial gauntlet to ensure that only viable, “real” projects receive grid connection allocations. To achieve that, the commission is currently weighing how developers must prove land ownership, financial security, and procurement of long lead-time equipment before they can advance in the queue. As currently drafted in the proposal:

  • Proposed Front-Loaded Entry Fees: Developers could be required to pay a $50,000-per-megawatt (MW) deposit when they sign their first paperwork (an “intermediate agreement”) to lock in their place in line.
  • Nonrefundable Interconnection Fees: Once studies are finalized and a project moves to a final interconnection agreement, the developer would pay an additional nonrefundable fee, currently proposed at $50,000 per MW.
  • The 80 Percent “Claw Back”: If a project is downsized or withdrawn, roughly 80 percent of the posted security (as currently proposed) could be forfeited and applied directly to the transmission provider’s rate base.

PUCT Chairman Gleeson emphasized that this mechanism intends to use forfeited funds from failed data center developments to directly “buy down” transmission charges for residential customers.
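The financial exposure these draft rules create can be sketched directly from the figures above (the $50,000-per-MW deposit, the $50,000-per-MW nonrefundable fee, and the roughly 80 percent forfeiture). The numbers track the proposal as described, but the rule is still in draft, so treat this as illustrative only:

```python
# Sketch of the proposed PUCT fee structure for a hypothetical large load,
# using the figures described in the draft rule. Illustrative only; the
# proposal has not been finalized.

DEPOSIT_PER_MW = 50_000     # due at the intermediate agreement
FINAL_FEE_PER_MW = 50_000   # nonrefundable, at the final agreement
CLAWBACK_SHARE = 0.80       # share of posted security forfeited on withdrawal

def exposure_if_withdrawn(size_mw: float) -> float:
    """Security forfeited if a project withdraws after posting its deposit."""
    return size_mw * DEPOSIT_PER_MW * CLAWBACK_SHARE

def cost_to_reach_final_agreement(size_mw: float) -> float:
    """Cash a developer must commit to hold its place through both stages."""
    return size_mw * (DEPOSIT_PER_MW + FINAL_FEE_PER_MW)

mw = 500  # a hypothetical 500 MW data center campus
print(f"Forfeit on withdrawal: ${exposure_if_withdrawn(mw):,.0f}")
print(f"Committed through final agreement: ${cost_to_reach_final_agreement(mw):,.0f}")
```

A 500 MW project would have tens of millions of dollars at risk before energization, which is precisely the screen against speculative “phantom” load the commission is after.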

However this proceeding turns out, the national debate on AI and affordability may find in Texas a proof point: armed with better permitting regimes, abundant land, megawatt-hours powered by the sun, and the best free-market construct for batteries in the country, load growth in advance of new fixed costs can hedge rate hikes.

A National Blueprint for “First-Ready” Load Growth

The 410 GW of proposed interconnection in Texas reflects both the desirability of building there and the failure of the one-off bespoke methodology that dominates interconnection processes. A grid operator assesses each project as it arrives for firm power rights, but because dozens of other projects are advancing in parallel, the topology of the grid shifts constantly. This leads to the ultimate surprise for developers: a pre-energization study that announces massive, unforeseen system changes, effectively sending a project that has already invested billions back to the drawing board for a total restudy.

The transition to a proposed batching process for loads in Texas is a counterpoint to the failure of “cluster” studies for generators in other markets. The ERCOT process is being designed to (1) take a fixed snapshot of projects that meet strict maturity criteria, (2) study them as a single group, (3) identify required transmission upgrades, and (4) allocate firm power pro rata across the projects of the batch. ERCOT intends to approve batching rules for the first such group, “batch zero,” at the June 1, 2026, ERCOT board meeting and seek PUCT ratification in July 2026.
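Step (4), the pro rata allocation, is the part that determines how much firm power each project actually receives. A minimal sketch of that logic, with hypothetical project names and megawatt figures (the actual ERCOT allocation rules are still being designed):

```python
# Minimal sketch of pro rata firm-power allocation across a batch.
# Project names and MW figures are hypothetical; the real ERCOT rules
# for "batch zero" have not yet been approved.

def allocate_pro_rata(requests_mw: dict[str, float], available_mw: float) -> dict[str, float]:
    """When the batch is oversubscribed, give each project the same
    fraction of its request; otherwise grant requests in full."""
    total = sum(requests_mw.values())
    share = min(1.0, available_mw / total)
    return {name: mw * share for name, mw in requests_mw.items()}

batch = {"campus_a": 600.0, "campus_b": 300.0, "campus_c": 100.0}  # 1,000 MW requested
offers = allocate_pro_rata(batch, available_mw=250.0)              # only 250 MW firm

for name, mw in offers.items():
    print(f"{name}: offered {mw:.0f} MW of firm power")
```

In this sketch every project gets 25 percent of its request, which illustrates the “misfit” risk discussed below: a uniform haircut can leave a large campus with less than the minimum it needs to operate a single building.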

Even with these improvements, the latest from ERCOT indicates that batch zero projects will not see their first drop of “firm” power until 2028 at the earliest; many will not reach full requested capacity until 2033. A developer who makes it into a favorable batch might still only be offered a small fraction of the electricity they requested. This creates a high-stakes “misfit” risk between grid and commercial timelines: If that partial offer is less than the bare minimum needed to run at least one building on grid power, the project might not be able to begin operations—effectively leaving the investment sitting idle. This problem epitomizes the national conversation on flexible load connections (and is also addressed in a major rulemaking from the Federal Energy Regulatory Commission).

>>>READ: What CERAWeek 2026 Says About Energy’s Next Chapter

ERCOT is already a leader in connecting generators faster than other regional grids, because power plants accept curtailed use of the transmission system in congested hours and free use in the hours where grid headroom is available. Now, ERCOT is working on similar stakeholder-proposed solutions so that large loads in the batching process can connect earlier with flexible consumption tied to grid headroom. Doing so could allow the batch process itself to become more effective: More loads could “flex” between on-site sources, off-site workload shifting, and available grid headroom, derisking scenarios where they receive less firm power or delayed firm power from the grid.

A snapshot of Texas today captures the full debate over how to energize AI loads. Texas is unique, but policymakers around the country face the same questions: who pays for grid upgrades, who bears the cost if data centers do not materialize, how to energize new loads without losing reliability, and how to reward dispatchability effectively. Texas is confronting all of this at once, since it is a locus of growth and data center development. Major ERCOT and state decisions in the next 12 months will force the issue.

Avalanche Energy just landed a DARPA contract to solve one of fusion power’s trickiest problems – turning damaging radiation into usable electricity. The Seattle-based fusion startup is developing a new class of materials that could transform how next-generation reactors capture energy, potentially accelerating the timeline to commercial fusion power. It’s a critical piece of infrastructure that the entire industry needs but few are tackling head-on.

Avalanche Energy is taking a different angle in the fusion power race. While competitors chase higher temperatures and longer plasma burns, the startup is building the batteries that’ll actually turn fusion’s intense radiation into electricity you can plug into the grid.

Read more in The Tech Buzz here.

This piece was initially published in the Washington Examiner.

If America is energy dominant, why is the shutdown of a strait thousands of miles away spiking our gas prices?

Following U.S. and Israeli strikes on Iran, shipping traffic through the Strait of Hormuz collapsed nearly overnight. Qatar declared force majeure on liquefied natural gas deliveries. Oil surged past $90 a barrel. Gas prices jumped at the pump.

The economy or the climate? Why not both?

Subscribe for ideas that support the environment and the people. 

And yet, America is producing more energy than at any point in our history. What gives? The answer is not a failure of the Trump administration’s American energy dominance policy. It is a structural reality of global commodity markets that no amount of domestic production can fully escape.

Oil and natural gas trade on global markets. When roughly 20% of the world’s daily oil supply and more than 20% of global LNG trade stops moving through a single chokepoint, prices spike everywhere — including at American gas stations. Energy dominance is real and worth defending. But dominance over production is not the same as immunity from price shocks. The Iran war has made that distinction impossible to ignore.

The good news is that America is already building the answer to transform our dominance into true independence. On March 4, the Nuclear Regulatory Commission (NRC) approved the construction permit for TerraPower’s Natrium reactor in Kemmerer, Wyoming — the first commercial-scale reactor permit in nearly a decade, and the first approval ever for a commercial non-light water reactor. The review was completed in 18 months, well ahead of the original 26-month schedule. Construction on nuclear-related portions of the plant begins in the coming weeks with a target operational date of 2030.

Novel small modular reactors (SMRs) are also emerging from the prototype phase and will soon make the power of atomic energy portable, deployable, and localized in ways never achieved before. If AI makes nuclear fusion commercially viable, our energy problems begin to evaporate.

>>>READ: America Needs to Fix Nuclear Economics, Not Just Go Smaller

Advanced nuclear is not a solution to today’s energy crisis. A reactor or fusion technology that comes online in 2030 does not reopen the Strait of Hormuz. But it represents exactly the kind of long-term investment that structurally insulates America from the next one.

That’s because abundant nuclear power generates electricity that has precisely zero exposure to the Strait of Hormuz. Its fuel is domestic. Its production can’t be stopped by weather, war, or drones. And as America’s economy electrifies — driven by data centers, EVs, and the broader shift away from fossil fuels — the share of our economic activity directly exposed to global oil and gas price volatility shrinks with every megawatt of advanced nuclear capacity we bring online.

Nuclear already provides roughly 18% of U.S. electricity generation, more than any other carbon-free source, and runs at near full output more than nine days out of 10. No other clean energy source comes close to that combination of reliability and insulation from external shocks.

The NRC has proven with the Natrium reactor in Kemmerer that it can compress nuclear approval timelines. Now, as Meta has signed agreements for deployment of up to eight additional Natrium reactors, the government should move even faster. We must safely grant permits for advanced nuclear energy at a pace that matches the urgency of our energy security needs.

Likewise, the administration should streamline SMR deployment and publicly track application processing times to introduce accountability.

Of course, nuclear energy isn’t entirely insulated from supply chain risk, as the U.S. imports most of its uranium. But over half comes from Canada and Australia, close allies whose supply routes are much more secure than the Strait of Hormuz. Coupling securely sourced nuclear power with the removal of barriers to every other available source of energy — solar, wind, geothermal, hydrogen — is the only way to protect Americans from global disruption.

>>>READ: DOE Takes Important Step to Modernize Nuclear Permitting

Energy security isn’t an accomplishment. It is a diversified portfolio. Domestic oil and gas production gives us the production dominance that makes America the indispensable energy supplier to the world. Advanced nuclear and other energy sources give us reliable clean energy that ends our electricity grid’s remaining exposure to global commodity shocks. Both matter. Neither is sufficient alone. 

After the Iran war ends and the Strait of Hormuz opens, gas prices will fall. But when the next crisis comes, we’ll be happy for every reactor we have, large and small. So let’s build.

A California-headquartered advanced nuclear energy company said it has received U.S. Dept. of Energy (DOE) approval of the Documented Safety Analysis for the company’s Mark-0 reactor. Antares, which is building compact nuclear microreactors, on April 7 said the DOE’s approval confirms the agency’s acceptance of the final design for the Mark-0, along with the safety case supporting it.

Tuesday’s announcement comes after the DOE approved a Preliminary Documented Safety Analysis for the technology in January of this year. The safety approval comes under DOE standard 1271, part of a streamlined regulatory pathway for nuclear power technology.

Read more in Power Magazine here.

The Nuclear Regulatory Commission (NRC) has voted to no longer lead security drills at power plants, instead allowing companies to lead their own drills in the coming years.

Last week, the commission decided to transition to company-led drills rather than agency-led ones to assess the nuclear energy fleet’s preparation for attacks.

Read more in The Hill here.

When President Trump stood before Congress in February 2026, he delivered a stark message to America’s technology leaders: “You have an obligation to provide for your own power needs.” His call to action on behind-the-meter nuclear power for data centers wasn’t mere rhetoric. It was a recognition that the energy demands of artificial intelligence and modern computing require a fundamental shift in how we power our economy. Meanwhile, NASA and the Department of Energy announced their commitment to deploy a nuclear fission reactor on the lunar surface by 2030, signaling America’s serious intent to lead in next-generation nuclear technology.


These announcements capture a crucial moment for American energy policy. Yet, they mask a deeper challenge that the nuclear industry has not fully confronted: the economics of nuclear power, including Small Modular Reactors (SMRs), remain tenuous.

SMRs are undoubtedly a promising development. Their smaller size enables factory construction, modularity, and potential deployment in locations unsuitable for large reactors. The industry has done genuinely impressive work shrinking proven technologies into more flexible packages. But there’s a critical distinction between innovation in form and innovation in economics.

Today’s SMR narrative rests on a bet that scaling will work: take proven reactor designs, make them smaller, and assume that manufacturing at scale will eventually close the cost gap with larger plants. Unfortunately, this is not guaranteed. The physics of smaller reactors fundamentally differs from that of larger ones. Smaller systems have higher surface-area-to-volume ratios, meaning proportionally higher heat losses, more complex engineering challenges per unit output, and less ability to leverage the economies of scale that made large reactors attractive in the first place.
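The surface-area-to-volume claim above is easy to check with a simple model. Treating a reactor core as a sphere (a deliberate simplification for illustration), SA/V works out to 3/r, so halving the radius doubles the ratio, meaning proportionally more heat-losing surface per unit of power-producing volume:

```python
# Quick numerical check of the surface-area-to-volume scaling claim,
# modeling a reactor core as a simple sphere. This is an illustrative
# simplification, not a model of any real reactor geometry.

import math

def sa_to_v(radius_m: float) -> float:
    """Surface-area-to-volume ratio of a sphere; simplifies to 3/r."""
    surface = 4 * math.pi * radius_m**2
    volume = (4 / 3) * math.pi * radius_m**3
    return surface / volume

large, small = sa_to_v(3.0), sa_to_v(1.5)
print(f"SA/V at r=3.0 m: {large:.2f} per m; at r=1.5 m: {small:.2f} per m")
```

This is the geometric headwind small reactors must overcome: the economics cannot be assumed to follow from miniaturization alone.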

While these SMRs will be critical in the near term, the real path forward isn’t to hope that volume manufacturing solves our cost problems; it’s to redesign reactors from the ground up to be cheaper by physics and engineering, not cheaper at volume. This will entail rethinking materials, coolant systems, passive safety mechanisms, and operational requirements. It means pursuing designs in which the inherent characteristics of the system itself drive down costs, where a smaller reactor is genuinely cheaper to operate and maintain, not just smaller.

>>>READ: DOE Takes Important Step to Modernize Nuclear Permitting

Still, industry innovation alone won’t get us there. Policymakers have an equally important role to play, not through subsidies or cost-overrun insurance, but by confronting the regulatory frameworks that have made building new nuclear plants so prohibitively expensive in the first place. The NRC’s licensing timelines stretch for years, adding enormous carrying costs before a single watt is generated. The ALARA principle, or “as low as reasonably achievable,” has, in practice, evolved into an ever-tightening standard that demands diminishing safety returns at exponentially increasing cost. Layer on duplicative environmental reviews, outdated siting restrictions, and a regulatory culture that treats any new design with deep institutional skepticism, and even the most elegantly engineered reactor faces a cost structure inflated before construction begins. If we’re serious about reactors that are cheaper by design, we need a regulatory environment that doesn’t penalize innovation by default.

Why does this distinction matter? Because without it, we risk repeating the same cycle that has plagued nuclear energy for decades: betting on cost reductions that never materialize. The next wave of nuclear innovation must focus on reactors designed to be cheaper. When that happens, economies of scale amplify an already advantageous cost structure, rather than desperately patching fundamentally uneconomical designs.

>>>READ: Florida Is Leading the Next Nuclear Revolution

Energy security reinforces this urgency. This month, Iran closed the Strait of Hormuz, disrupting 20 percent of global oil supply and sending energy prices toward $100 per barrel. America cannot afford to rely indefinitely on volatile foreign energy markets. Nuclear power, including well-designed SMRs and next-generation systems, offers a path to genuine energy independence. But only if the economics work.

President Trump’s vision of data centers powered by nuclear energy is compelling. But it can only succeed if we build reactors that are not just smaller, but smarter. The technology exists. The talent exists. What’s needed now is the commitment to pursue nuclear innovation that puts economics first, not as an afterthought.

America’s energy future depends on it.

Alina Voss is on the founding team of NX Atomics, a next-generation nuclear energy company based in Indiana.

A small Wisconsin city upended by a data center backed by President Donald Trump is set to vote Tuesday on a referendum that could reshape grassroots resistance to AI projects nationwide.

The vote in Port Washington, a lakeside town of roughly 12,000 people just north of Milwaukee, appears to be the first time any U.S. municipality will go to the ballot to kneecap data center development. It marks an aggressive new tactic in an escalating movement to oppose the hulking artificial intelligence factories — and offers a potential blueprint for other small towns challenging Big Tech.

Read more in Politico here.

This piece was initially published in The National Interest.

Granting small refinery exemptions would ease fuel costs and protect US refining capacity, preventing further price increases during the Iran War. 

The unprecedented supply disruption tied to the Strait of Hormuz has delivered the most severe energy shock in decades. The national average price of gasoline has surpassed $4 per gallon for the first time in over four years. Drivers in some parts of the country are paying closer to $5 per gallon, and California is inching closer to $6.


With seemingly no end to the Iran War in sight, the Trump administration has taken several actions to soften the economic blow, including waiving the Jones Act and releasing 172 million barrels of oil from the Strategic Petroleum Reserve. One sensible solution to help consumers is to grant exemptions for the small refineries that are critical to US energy security and affordability.

How the Renewable Fuel Standard Works 

Signed into law in 2005, the Renewable Fuel Standard (RFS) mandates that fuel suppliers blend renewable fuels into America’s gasoline supply. The most common fuel is corn-based ethanol, but other feedstocks include soybeans, sugarcane, crop residues, and used cooking oil. Each year, the Environmental Protection Agency (EPA) sets volume targets for the biofuels market.

Each refiner has a renewable volume obligation that requires a set percentage of the fuel they sell into the US market to include renewable fuels. That requirement can be met either by physically blending biofuels or by purchasing compliance credits. 
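The compliance choice described above, physically blending versus buying credits, reduces to straightforward arithmetic. A minimal sketch, with entirely hypothetical volumes, obligation percentage, and credit price (actual RVO percentages and credit prices vary by year and fuel category):

```python
# Sketch of RFS compliance arithmetic: a refiner covers the portion of
# its renewable volume obligation (RVO) not met by physical blending
# with compliance credits. All numbers are hypothetical.

def compliance_cost(fuel_sold_gal: float, rvo_pct: float,
                    blended_gal: float, credit_price: float) -> float:
    """Cost of covering the unblended remainder of the obligation with credits."""
    obligation_gal = fuel_sold_gal * rvo_pct
    shortfall_gal = max(0.0, obligation_gal - blended_gal)
    return shortfall_gal * credit_price

fuel_sold = 500_000_000   # gallons sold into the US market (assumed)
rvo = 0.10                # 10% obligation (assumed, not an actual EPA figure)
blended = 30_000_000      # gallons of ethanol physically blended (assumed)
credit = 1.20             # $/credit (assumed)

print(f"Credit cost: ${compliance_cost(fuel_sold, rvo, blended, credit):,.0f}")
```

For a refiner without blending infrastructure, the shortfall term dominates, which is why compliance costs fall so unevenly on small refineries, as the next section explains.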

The Problems with the Renewable Fuel Standard

There are several longstanding problems with the RFS, chief among them that it is an economic burden forced upon businesses and consumers through Soviet-style quotas. While ethanol is an important oxygenate that makes gasoline burn cleaner, its use should be determined by market needs rather than government mandates.

The RFS’s other claimed environmental benefits are dubious at best. The mandate has led to land use changes and crop switching, increasing food prices for households. The land conversion and increased agricultural inputs produce fuels that are no better, and sometimes worse, than gasoline from a climate perspective.

Why Small Refineries Are Disproportionately Affected by the Renewable Fuel Standard

For refiners, the cost of RFS compliance can be significant; for small and mid-sized refiners, it can be one of their largest expenses. According to an August 2025 analysis by Turner, Mason & Company, the mandate could cost refiners nearly $70 billion annually, nearly double what it cost them in 2023. Whether small or large, refiners must either absorb the economic hit or pass costs on to consumers.

Recognizing that one-size-fits-all mandates may be economically harmful, Congress created small refinery exemptions to prevent “disproportionate economic hardship.” Stripping away that relief now risks forcing refinery closures and making the pain at the pump even worse. There are roughly 50 small refineries in the United States, with 37 or so consistently engaged in the RFS exemption process.

Many of these small refineries are in rural communities and are the economic anchors of their towns. They help support entire local economies, funding schools, public safety, and infrastructure through their tax base. In many cases, they are the largest employer in town.

Together, they provide roughly 1.8 million barrels per day, about 10 percent of US refining capacity, a meaningful share of the fuels Americans rely on every single day. Critically, these refiners often produce the specialized fuels necessary for our military, farmers, and manufacturers.

Higher Prices Are at Stake for Consumers and the Economy 

Denying small refiner exemptions would tighten fuel supply and drive prices even higher at a time when Americans are already feeling the squeeze. Gasoline prices have surged by nearly a dollar per gallon from a year ago. But the economic pain extends beyond the price at the pump. Higher fuel prices mean higher prices for groceries, travel, and all the goods that move by trucking, freight rail, shipping, and airlines. 

The United States cannot afford to lose refining capacity, especially in the middle of a seismic energy shock. Energy policy works best when it fosters competition and delivers for consumers. Biofuels work when they deliver value: if a biofuel is cheaper and compatible with vehicles, consumers will choose it, and refiners and blenders will supply it. In fact, the small refinery exemptions during the first Trump administration did not destroy ethanol demand precisely because ethanol was economically competitive.

However, punishing small refineries would layer a bad decision on top of an antiquated, two-decade-old mandate that never should have existed in the first place. With gas prices where they are, denying small refinery exemptions would be remarkably bad timing. 

When Bob Hersey Jr., a Maine lobsterman, pulls up his traps, he gets more than tasty crustaceans. He’s collecting vital details about the changing ocean environment.

Mr. Hersey, who also dives for sea urchins, is among nearly 150 fishermen who have installed temperature sensors on their traps or trawl nets from Maine to North Carolina as part of a program run by a nonprofit organization with help from the National Oceanic and Atmospheric Administration.

The soda-can-size sensors are dragged along the seafloor, giving fishermen and scientists a three-dimensional map of the ocean rather than just conditions on the surface, which can be checked using satellites or thermometers on boats. The data is continuously collected and fed into regional weather and climate models.

Read more in the New York Times here.

After 22 years at IBM, where he rose to senior vice president and director of IBM Research, Dr. Dario Gil now leads one of the most ambitious science and technology initiatives in a generation. As director of the Genesis Mission, Gil is orchestrating a convergence of high-performance computing, AI, and quantum computing aimed at fundamentally transforming how the nation does science and engineering.

As a guest on The POWER Podcast, Gil explained what the Genesis Mission is, how it works, and why its implications extend from fusion reactors to the Texas power grid. Here are the key takeaways.

Read more in Power Magazine here.
