November 5, 2024

Collaboration Key to Managing Growing Western Load, Panelists Say

Collaboration among stakeholders is crucial to maintaining Western grid reliability in the face of increasing demand posed by large loads such as new data centers, speakers said Sept. 4 during a webinar hosted by WECC.

Representatives from Elevate Energy Consulting, the Pacific Northwest Utilities Conference Committee (PNUCC) and the Grant County Public Utility District in Washington participated in the webinar. The panelists discussed the challenges of integrating large loads in the Western Interconnection.

According to PNUCC’s Northwest Regional Forecast for 2024, electricity demand is projected to increase from approximately 23,700 average MW (aMW) in 2024 to about 31,100 aMW in 2033, an increase of more than 30% over the next 10 years.

“That is an increase of 7,000 average MW, or enough electricity to power seven cities the size of Seattle,” said Crystal Ball, PNUCC’s executive director. She noted the increase in demand comes primarily from three things: data center development, high-tech manufacturing growth and electrification.

“But really, we see it coming from these companies developing large data centers in the Pacific Northwest,” Ball added.
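As a quick arithmetic check of the PNUCC forecast figures cited above, here is a minimal sketch using only the rounded totals reported in this article:

```python
# Rough check of the PNUCC Northwest Regional Forecast figures cited above.
demand_2024_amw = 23_700   # approximate 2024 demand, average MW
demand_2033_amw = 31_100   # projected 2033 demand, average MW

growth_amw = demand_2033_amw - demand_2024_amw           # ~7,400 aMW
growth_pct = 100 * growth_amw / demand_2024_amw          # ~31%

print(f"Projected growth: {growth_amw:,} aMW ({growth_pct:.0f}%)")
# Roughly 7,000-plus aMW, consistent with the "over 30%" increase Ball described.
```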

Grant County has been dealing with the increase in large loads for some time, according to Shane Lunderville, business development manager for the county’s publicly owned utility.

“We’ve had data centers for the last 10 years and a lot of growth that has not stopped,” Lunderville said. “We have averaged in just industrial growth between 5 to 7% per year of that growth, and we’re not seeing it slow down.”

Ball said the increased demand for electricity is a sign of economic growth opportunities. However, it also poses significant reliability challenges, such as integrating large loads while adhering to efforts to reduce carbon emissions.

“One misstep really could lead to cascading consequences,” according to Ball. “It’s really the reliability of the power system that is at risk during this transition while meeting this increasing demand for electricity.”

She added that stakeholders must work collaboratively and focus on proactive solutions.

Kyle Thomas, vice president of compliance services at Elevate Energy Consulting, agreed, saying that “all parties have to be at the table.”

“Doing one thing on the grid actually involves many different departments … because it’s so interconnected, it’s so involved, and the data centers is no exception,” Thomas said. “So, we need policy, we need regulatory, we need legal, we need the engineers.”

However, according to Thomas, one issue is that data centers often have strict confidentiality rules due to the competitive space between different developers. This makes it difficult to study how to best integrate data centers while ensuring reliability, he said.

“We should still start and try and figure out where our gaps of knowledge are and partner with them to get information, get data, get models, and then learn from these real operations with monitoring data and get that cycle as fast as possible,” Thomas said.

The U.S. also could learn from other countries that have successfully brought on data centers while ensuring the reliability of the grid, according to Thomas.

“You look at Ireland and their adoption of data centers is unbelievable,” he noted. “You look at the [European Union], they have had interconnection requirements in place and policies for large loads since about 2009. We can learn from others in the collective global industry here to learn and accelerate our knowledge where it may be lacking, and we can also help others in that aspect.”

Clean Power Installations Hit Record for Second Quarter, ACP Says

American Clean Power Association on Sept. 5 released its latest Clean Power Quarterly Market Report, in which developers set a record for the second quarter with 11 GW of installations — up 91% from the same period last year. 

In all, 137 utility-scale projects went online, and they brought the cumulative nameplate capacity up to 283.6 GW across utility-scale solar, storage and wind — enough to power 70 million homes. 

The second quarter brought installations up to 19 GW across the first half of the year. ACP noted the second half usually sees higher installations, so 2024 could set a record for solar, storage and wind installations. The first half of the year beat the previous five-year average by more than 10 GW. 
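As a quick arithmetic sketch of those totals, using only the figures reported above (the first-quarter 2024 and second-quarter 2023 values below are implied by those figures, not separately reported here):

```python
# Derive implied figures from the ACP numbers cited above.
q2_2024_gw = 11.0          # reported Q2 2024 installations
h1_2024_gw = 19.0          # reported first-half 2024 installations
yoy_growth = 0.91          # Q2 2024 reported as up 91% year over year

q1_2024_gw = h1_2024_gw - q2_2024_gw          # implied Q1 2024: ~8 GW
q2_2023_gw = q2_2024_gw / (1 + yoy_growth)    # implied Q2 2023: ~5.8 GW

print(f"Implied Q1 2024: {q1_2024_gw:.1f} GW; implied Q2 2023: {q2_2023_gw:.1f} GW")
```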

“While all clean energy technologies are expanding their market share, energy storage is scaling at a stunning speed and has surpassed 20 GW of operating capacity,” ACP CEO Jason Grumet said in a statement. “With rapidly growing demand and the need to make significant strides in decarbonizing our economy, the stakes are high. Our recent progress is encouraging, but we are not moving fast enough.”   

The quarter saw California lose its crown as leading state for utility-scale solar, as Texas added 1,656 MW — bringing its total installations to a whopping 21,932 MW. Across the country, solar saw 6.7 GW installed, or 61% of overall clean energy installations. 

Storage added 2.9 GW around the country in the second quarter, which brings its total installations to 21.6 GW. 

Land-based wind saw the smallest installations among the three technologies, with 1,370 MW installed in the second quarter. But that was more than triple the capacity added in the first quarter. 

Onshore wind still represents most of the installed capacity for renewables in the country, but solar is closing the gap — at 39% at the end of the second quarter, up from 26% on Jan. 1, 2020. 

Some 32 states and the District of Columbia added clean energy capacity in the quarter, with Texas leading the way with 2,596 MW overall. California came in second, adding 1,947 MW. Of that, nearly 70% was storage, a trend that should continue because storage represents 64% of the clean capacity in the Golden State’s development pipeline. 

The clean power pipeline of projects under development is at nearly 163 GW, which is up 13% from the middle of 2023 and has been growing at 5% annually. The steady expansion of the pipeline can be attributed to storage and solar, which have grown at average rates of 12% and 3% per quarter since 2022.  

The pipeline dipped 1.5% from the first quarter due to projects entering operation and the cancellation of New York’s third offshore wind project. The decline should be short lived, with an additional 8 GW to 12 GW of offshore wind set to be added to the pipeline as states wrap up ongoing procurements. 

ACP Welcomes the 10th BOEM Offshore Wind Approval

The report was released a few hours before the Bureau of Ocean Energy Management approved the country’s 10th offshore wind project — the Maryland Offshore Wind Project, which brought total approvals by the agency to 15 GW. 

“Today’s approval of our nation’s 10th offshore wind project — a total game change from the zero projects approved before President Biden and Vice President Harris took office — shows the tremendous progress we are making to harness this economic opportunity that [benefits both] American workers and the planet alike,” White House National Climate Advisor Ali Zaidi said in a statement. “From port infrastructure upgrades and new tax credits to speeding responsible and efficient permitting, we are using every tool available to continue turbocharging this industry and delivering a clean energy future for the nation.”  

The Maryland project is planned to add 2 GW of offshore wind capacity and would supply power to the Delmarva Peninsula. Its lease area is 8.7 nautical miles offshore Maryland and about 9 nautical miles from Sussex County, Delaware. 

“In just four years, the U.S. went from zero permitted offshore wind power projects to 10, representing real progress,” ACP Vice President for Offshore Wind Anne Reynolds said in a statement. “Of these, one project is completed and five are under active construction.” 

DOE Awards $430M for Hydro Maintenance

The U.S. Department of Energy has awarded $430 million to nearly 300 projects at hydroelectric facilities to enhance dam safety, strengthen grid resilience and improve the environment.

The funding, announced Sept. 5, comes from the DOE’s Grid Deployment Office through the Maintaining and Enhancing Hydroelectricity Incentive program.

The 293 projects to receive funding are spread across 33 states. Eighty-four projects focus on grid resilience, 149 are dam safety projects, and 60 are environmental improvement projects. Award amounts range from $7,200 to $5 million.

Hydropower contributes nearly 27% of the nation’s renewable electricity generation and 93% of utility-scale energy storage. But the fleet is aging, officials noted. Facilities selected for funding are on average 79 years old.

For example, Entergy Arkansas marked the 100-year anniversary of the Remmel Dam this year. The DOE awarded $1.8 million for safety improvements at the dam.

“We’re thrilled to invest in this hydroelectric fleet that is such an important part of our nation’s electric system,” Maria Robinson, Grid Deployment Office director, said during a news conference.

For the most part, the projects selected for funding through the Maintaining and Enhancing Hydroelectricity Incentive program won’t increase generation or capacity, DOE officials said.

Rather, the program focuses on strengthening grid resilience at dams through measures such as turbine or generator replacement or transformer upgrades.

Safety measures funded by the program might include improvements to emergency spillways or concrete replacement to prevent water seepage through the dam.

In addition, the program funds environmental and recreational improvements such as fish ladders or improved boating access.

Multiple Awards

Many utilities are receiving awards for multiple projects. PacifiCorp Renewable Resources was awarded $38 million for nine projects, including $5 million each for the Fish Creek pumped storage facility and Weber Dam improvements.

Pacific Gas and Electric is receiving more than $34 million for 19 projects. Among the funding is $123,289 for improvements at the Potter Valley fish hotel and $5 million for the Lower Bucks spillway restoration project.

Michigan-based Consumers Energy is receiving about $23 million for 10 projects, including $5 million each for improvements at the Rogers and Hardy spillways.

Seattle City Light was awarded about $21 million for five projects, including $5 million for dam safety at the Cedar Falls hydroelectric project.

The Maintaining and Enhancing Hydroelectricity Incentive is one of three DOE programs that fund hydroelectric projects through the Bipartisan Infrastructure Law.

Another program, the Hydroelectric Production Incentives, will provide $125 million to hydroelectric facilities for electricity generated and sold. In 2023, 66 hydro facilities were awarded $36.7 million. Applications for a second funding round are now under review.

The third program, the Hydroelectric Efficiency Improvement Incentives, will invest $75 million into hydropower facilities. In February, DOE awarded $71.5 million to 46 hydroelectric projects in 19 states.

The Grid Deployment Office will discuss the latest Maintaining and Enhancing Hydroelectricity Incentive awards during a webinar on Sept. 11 from 1 to 1:30 p.m. ET.

DOE expects to announce a second round of funding for the program next year.

‘Leaning’ Evident in BPA Response to NW Senators

CAISO’s adoption of the West-Wide Governance Pathways Initiative’s “Step 1” changes won’t overcome the Bonneville Power Administration’s objections to the governance of the ISO’s Extended Day-Ahead Market (EDAM), BPA Administrator John Hairston told U.S. senators from the Pacific Northwest.

“Our specific concern is that, with only Step 1 in place, the market governance remains under the ultimate authority of California,” Hairston wrote in an Aug. 21 letter to the senators, which has not yet been posted on the agency’s website.

Hairston’s comments were part of a broader response to a series of questions posed to him in a July 25 letter signed by Democratic Sens. Jeff Merkley (Ore.), Ron Wyden (Ore.), Maria Cantwell (Wash.) and Patty Murray (Wash.).

In their letter, the senators urged the agency to “act carefully and deliberately” before selecting a day-ahead market and to delay a “draft letter to the region” relaying its decision, previously slated for Aug. 29, until more developments play out around EDAM and SPP’s competing Markets+ offering. (See NW Senators Urge BPA to Delay Day-ahead Market Decision.)

The senators’ letter signaled a preference shared by many Western state officials, public interest groups and large energy users — and some utilities — that the region will benefit most from a single organized electricity market that includes CAISO.

It also expressed concern that BPA staff in April issued a “leaning” recommending the agency choose Markets+ over EDAM, citing the SPP market’s independent governance and overall design as primary factors supporting the opinion. The senators directed the agency to answer 14 detailed questions to clarify the reasons behind the leaning. (See BPA Staff Recommends Markets+ over EDAM.)

Inadvertently or not, the senators got one wish: In his Aug. 21 response, Hairston said BPA would delay its market decision until next year, an announcement it later would relay to its stakeholders on Aug. 25, saying both markets have “outstanding issues that require additional analysis.” (See BPA Postpones Day-ahead Market Decision Until 2025.)

But Hairston’s Aug. 21 response to the senators clearly — and understandably — shows the fingerprints of the staff that produced the leaning. It also evinces continued concerns among some parties in both the Northwest and Southwest about a market arrangement that could be dominated by California and its interests.

In response to the senators’ question about which market BPA expects “will provide the greatest improvement in grid reliability in the Northwest,” Hairston cites the benefit of the Markets+ requirement that its participating entities also participate in the Western Power Pool’s Western Resource Adequacy Program (WRAP).

“The EDAM proposal’s lack of a common resource adequacy metric makes it difficult to assess whether the market or its participants will be resource adequate in the planning horizon for the market,” Hairston wrote, adding that California’s “state-mandated” RA metrics don’t align with WRAP requirements and that EDAM will accept non-California participants that haven’t committed to the WRAP.

Responding to another question about which market would do more to reduce greenhouse gas emissions from the Northwest’s electricity sector, Hairston said Markets+ has made progress in developing GHG tracking and accounting procedures that would allow BPA’s customer base of publicly owned utilities to meet Washington’s cap-and-invest program obligations and Oregon’s “non-pricing” carbon requirements.

“Our continuing concern with CAISO’s EDAM design is that California is able to deem a disproportionate share of carbon-free market-traded resources as delivered to California, to the disadvantage of utilities in the Northwest and their ability to meet their state goals,” he wrote.

Addressing a question about the impact on the Northwest grid from “seams” between two different markets, Hairston cited BPA’s previous experience using the Coordinated Transmission Agreement with CAISO to enable several of the region’s utilities to use BPA’s transmission system to participate in the ISO’s Western Energy Imbalance Market before the agency itself joined that market.

“Bonneville expects to undertake a similar exercise if necessary to manage day-ahead market seams,” he said.

Governance Still Key

But the issue of CAISO’s state-run governance was front and center in Hairston’s response to the senators — just as in the staff leaning.

“Bonneville seeks to participate in a market that has a durable, effective and independent governance structure [that] provides fair representation to all market participants and stakeholders,” he wrote.

Hairston described the choice as being between Markets+, with its independent board of directors, and EDAM, which would fall under the “shared authority” of the Western Energy Markets (WEM) Governing Body and the ISO Board of Governors “appointed by the governor of California.”

Hairston acknowledged the progress made by the Pathways Initiative in forcing movement on CAISO’s governance. The ISO and WEM boards last month voted to approve the Pathways plan giving WEM officials “primary authority” over WEIM- and EDAM-related market-rule decisions. (See CAISO, WEM Boards Approve Pathways ‘Step 1’ Plan.)

But his response to the senators’ question about that effort illustrated his skepticism around whether the “Step 2” plan for advancing a California bill to grant the WEM Governing Body “sole authority” over the EDAM would get traction or meet BPA’s requirements.

“While we appreciate the Pathways sponsors’ optimism for a positive outcome, such efforts have repeatedly failed to secure legislative approval. It also remains to be determined what legislative conditions and constraints will continue to impede an independent governance structure,” he wrote.

Pathways backers expect to begin working with California lawmakers on a bill this fall after the conclusion of the current session. They hope to get the bill introduced and passed during the 2025 session, which starts in January.

That bill might not progress in alignment with BPA’s new day-ahead market decision timeline. The agency now plans to release its draft decision in March 2025, followed by a final decision in May.

AEU Webinar Highlights Potential Queue Improvements

Speeding up the interconnection queues is becoming more important as demand growth and the retirement of existing generators combine to cut into reserve margins around the U.S., experts said during a webinar Sept. 4 hosted by Advanced Energy United.

Grid Strategies President Rob Gramlich summarized a recent report his firm co-authored with the Brattle Group for United and the Solar and Storage Industries Institute called “Unlocking America’s Energy: How to Efficiently Connect New Generation to the Grid.”

The report’s recommendations would give generation developers more certainty in the interconnection process, something they lack when they enter the queues and put up deposits to hold a place in line, Gramlich argued.

“The developers really need certainty, or else, if you don’t have it, you’ll continue to have this queue churn and projects coming in and out in order to get information,” Gramlich said.

After putting up their deposits, projects end up in a study process that is usually lengthy; then getting them built can be delayed because of required transmission upgrades, Gramlich said.

Supply chain issues are delaying both the construction of network upgrades and sometimes the generators themselves, with PJM reporting that nearly 40 GW of projects have made it through its queue but are not in operation yet.

“There’s no single answer to that, but there’s maybe a few common challenges,” said Hannah Muller, senior director of external and market affairs at Clearway Energy Group. “One is supply chain; that’s both for the actual project equipment, but also the transmission infrastructure. There’s just years [of] delay in getting necessary equipment; it’s just a function of the global economy at this point.”

Other issues include permitting and community opposition to new infrastructure; FERC and the RTOs can speed up interconnection, but those issues fall outside their purview, Muller said.

FERC made changes to the baseline for queues with Order 2023, but the report and panelists argued that additional measures are needed. The commission is holding a two-day technical conference from Sept. 10 to 11 on that subject; Gramlich said it would be a good venue for helping to share best practices from the different regions.

In addition to certainty, the Grid Strategies report argues for quicker schedules and non-discrimination that guarantees a level playing field for “similarly situated interconnection customers.” Adopting an interconnection entry fee for proactively planned capacity would provide customers with significant interconnection cost certainty and address cost allocation of the upgrades identified. The report also suggests a fast-track process to use existing and already planned interconnection capacity that would prioritize the projects most ready to go live.

The interconnection study process also needs improvements to enable the fast-track process and make studies more efficient generally, the report said.

One promising way to improve the study process that is in its early stages is the use of artificial intelligence, said Kyle Davis, the Clean Energy Buyers Association’s senior director of federal affairs.

“SPP, Amazon Web Services, Pearl Street [Technologies] and NextEra [Energy] are testing out Pearl Street’s SUGAR platform to try and help organize and bring that machine-learning process to the interconnection queue analysis,” Davis said.

Testing so far indicates the technology can whittle what has historically taken three months down to as little as an hour, he added. The report discusses the pilot effort, and Pearl Street’s CEO, David Bromberg, is a witness at FERC’s technical conference next week.

The final set of modifications the report suggests focuses on working through the transmission construction backlog that is delaying needed network upgrades. The report noted that while it did not focus on proactive transmission planning, such planning is key to speeding up the queues; several transmission providers already plan proactively, while others are developing long-term planning processes to comply with FERC Order 1920.

Speeding up construction also would mean addressing supply chain issues, with which the Department of Energy could help, Gramlich said. As for FERC jurisdictional issues, it is unclear why some utilities can get network upgrades built more quickly than others; the report suggests some independent monitoring of that stage to identify why that is happening and address issues that slow down construction, Gramlich added.

Demand is growing, but supply is out there that could address it, said R Street Institute Senior Fellow Beth Garza, who is also speaking at FERC’s technical conference next week.

“Those costs eventually are borne by consumers,” Garza said. “It’s consumers that are using the electricity. They’re the ones getting value out of having the electricity. … It’s the indirect costs that consumers absolutely bear by inefficient processes.”

DOE Set to Fund 2 Energy Storage Research Hubs

Teams led by Argonne National Laboratory and Stanford University are in line for $125 million to boost their research into next-generation energy storage.

The U.S. Department of Energy announced the funding Sept. 3. It said the two Energy Innovation Hubs will accelerate development of storage technology beyond lithium-ion batteries, with a priority on use of inexpensive and abundant materials.

The Energy Storage Research Alliance (ESRA) led by Argonne will focus on new compact batteries for heavy-duty transportation and grid-scale energy storage, while the Aqueous Battery Consortium (ABC) led by Stanford will work to establish the scientific foundation for large-scale development and deployment of aqueous batteries for long-duration grid storage technologies.

If finalized, the awards will be worth up to $62.5 million each and will extend up to five years.

In its own announcement, Argonne said:

“ESRA seeks to enable transformative discoveries in materials chemistry, gain a fundamental understanding of electrochemical phenomena at the atomic scale and lay the scientific foundations for breakthroughs in energy storage technologies.”

The goal is high-energy batteries that provide days of output, do not catch fire and have decades-long service lives. They will rely on abundant materials, which may mitigate the cost and supply chain volatility associated with present-day batteries.

ESRA Director Shirley Meng said: “To achieve this, energy storage technology must reach levels of unprecedented performance, surpassing the capabilities of current lithium-ion technology. The key to making these transformative leaps lies in a robust research and development initiative firmly grounded in basic science.”

DOE said both projects also will serve as vehicles for workforce development, diversity and inclusion.

ESRA Deputy Director Wei Wang said: “Cultivating a diverse workforce dedicated to safeguarding America’s energy resilience is key to ESRA’s mission. Through our strategic equity and inclusion initiatives, we plan to create a robust training ground for energy storage science from the undergraduate to postdoctoral levels.”

Stanford in its announcement said the ABC seeks to overcome the limitations facing batteries that use water as their electrolyte.

ABC Director Yi Cui explained that enormous amounts of stationary energy storage will be needed to achieve net-zero greenhouse gas emissions, and said water is the only realistic solvent available at the cost and quantity needed.

He said: “How do we control charge transfer between solids and water from the molecular to the device scale and achieve reversibility with an efficiency of nearly 100%? We don’t know the solutions to those hard problems, but with the Department of Energy’s support we intend to find out.”

The aqueous battery concept is in extensive use in the starter systems of internal combustion vehicles, but those batteries are small and rely on toxic lead-acid chemistry.

Using water instead is a tall order.

“The barriers to such a new aqueous battery have stymied inventors for years,” said ABC chief scientist Linda Nazar. “In addition to stubbornly low voltage and energy density, water can corrode battery materials, become the source of undesirable side reactions, and the cells can fail after just hundreds of charge-discharge cycles under demanding practical conditions.”

Dozens of researchers working in six teams will investigate the challenge.

Stanford University and SLAC National Accelerator Laboratory lead the Aqueous Battery Consortium, and are joined by investigators from California State University, Long Beach; Florida A&M University/Florida State University’s College of Engineering; North Carolina State University; Oregon State University; San Jose State University; UCLA; UC-San Diego; UC-Santa Barbara; University of Maryland; University of Texas at Austin; and the University of Waterloo.

ESRA is led by Argonne and co-led by Lawrence Berkeley National Laboratory and Pacific Northwest National Laboratory. Partners are Columbia University; Duke University; Massachusetts Institute of Technology; Princeton University; UC San Diego; UChicago; University of Houston; University of Illinois Chicago; University of Illinois Urbana-Champaign; University of Michigan; Utah State University; and Xavier University.

DOE’s Energy Innovation Hubs are managed by the Basic Energy Sciences (BES) Program of the Office of Science. They seek to overcome key scientific barriers for major energy technologies.

ABC and ESRA build off previous BES-funded efforts, including the Joint Center for Energy Storage Research innovation hub, which ended a decade of operation in 2023 and also was led by Argonne.

MISO: 50% Peak Load Cap, Software Help Key for Crowded, Delayed Queue

MISO is adamant it should limit project proposals in future queue cycles to 50% of annual peak load to moderate its oversaturated queue of more than 300 GW. 

During a Sept. 3 Interconnection Queue Process Working Group teleconference, MISO’s Ryan Westphal said an annual megawatt cap, in conjunction with a tech startup’s software for study help, will allow MISO to build study models faster without the “engineering problem” of too many hypothetical overloads, network upgrades and resources exceeding load. (See MISO Sets Sights on 50% Peak MW Cap in Annual Interconnection Queue Cycles.) 

“We think that gives us the best chance of moving faster,” Westphal said of overall queue processing.  

MISO’s current queue stands at 321 GW.  

Westphal said a cap won’t hinder MISO’s resource adequacy, either. Using the queue’s historical 21% completion rate, Westphal said MISO stands to add roughly 67 GW within a few years, even with capping entrants. 

In the long run, MISO estimates 310 GW will be able to hook up to the system through 2042. Westphal pointed out that figure exceeds the 248 GW of additions by 2042 that the RTO uses in its current transmission planning scenario. Westphal said that number should allay stakeholders’ resource adequacy concerns, where a cap might restrict too many projects from connecting. 
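A back-of-the-envelope check of those queue figures, a minimal sketch using only the numbers reported above:

```python
# Rough check of the MISO queue figures cited above.
queue_gw = 321             # current MISO interconnection queue, GW
completion_rate = 0.21     # historical share of queued projects that reach completion

near_term_additions_gw = queue_gw * completion_rate
print(f"Expected additions from the current queue: ~{near_term_additions_gw:.0f} GW")
# ~67 GW, matching Westphal's estimate; MISO separately projects 310 GW of
# interconnections through 2042, above the 248 GW of additions assumed in its
# current transmission planning scenario.
```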

Clean Grid Alliance’s Rhonda Peters asked how MISO plans to factor in significant new load additions in its annual megawatt cap.  

Westphal said the cap calculation will be based on peak load using MISO’s five-year-out models, which should capture definite load additions.  

“Using what’s in the models as firm makes the most sense,” he said.  

Westphal also said MISO’s attempt to conquer its unwieldy queue using Pearl Street’s study software will not negate the need for a cap.  

MISO will lean on Pearl Street’s SUGAR (Suite of Unified Grid Analyses with Renewables) software to conduct screening of projects prior to conducting studies in earnest and to perform the first phase of studies in the queue.  

MISO has delayed kickoff of studies on the 123 GW of projects that entered the queue in 2023 while Pearl Street assists with modeling. When the 2024 cycle will begin is an open question, since the RTO intends to have the cap in place before it formally accepts a new cycle. (See 2023 Queue Cycle Delayed into 2025 as MISO Seeks Software Help on Studies.)  

“We have long been proponents of technology adoption in this space,” NextEra Energy’s Erin Murphy said, thanking MISO for reaching out for third-party help. However, Murphy said that if MISO and Pearl Street can achieve faster study results with more variables, that could negate the need for a megawatt cap on annual queue cycles. 

Westphal said while SUGAR may help speed up study processing, without a queue cap, MISO still would run into the familiar problem of unrealistic dispatch models overflowing with too many projects. 

Westphal said MISO still wants to return to its usual cadence of one-year queue cycles, in which submissions are accepted in the fall, validated through the holidays and studied beginning in the new year, after MISO’s Board of Directors approves the RTO’s planning models. However, Westphal added that it doesn’t make sense to accept a new queue cycle if the previous cycle isn’t far enough along in the study process.  

Last week, the Union of Concerned Scientists’ Sam Gomberg suggested MISO plan to play “catch up” on queue studies if Pearl Street’s software proves successful. He suggested MISO consider accepting multiple queue cycles in a year to get back on track.  

MISO is not entertaining using a volumetric price escalation — where developers pay fees that increase as they submit more projects for study — in lieu of a cap, as some stakeholders requested. Westphal said enacting escalating fees won’t solve MISO’s underlying “technical issue” of trying to study “load being served by too many generators.” 

“In our minds, it’s not an alternative for a cap. We still believe we need that hard cap there to get us to reasonable study parameters and dispatch [model],” he said.  

Several MISO generation developers argued that a volumetric price escalation would encourage interconnection customers to put only their best projects forward, discourage manipulation of the queue and allow small developers and co-ops an even playing field for submitting projects. 

Savion’s Derek Sunderman said MISO could police the volumetric approach by requiring large corporations to sign forms attesting to their subsidiaries. Sunderman requested that the RTO conduct a stakeholder vote to gauge whether stakeholders prefer the cap or a volumetric price escalation.  
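To make the volumetric price escalation concept concrete, here is a purely illustrative sketch of the kind of escalating fee schedule stakeholders describe; the base fee, multiplier and function below are invented for illustration and are not part of any MISO or stakeholder proposal:

```python
# Hypothetical illustration of a volumetric price escalation: study fees rise
# with each additional project a developer submits in a cycle. The base fee
# and escalation factor are invented for illustration only.
def escalating_study_fees(num_projects: int, base_fee: float = 100_000.0,
                          escalation: float = 1.5) -> float:
    """Total study deposits for a developer submitting num_projects,
    where each successive project costs `escalation` times the previous one."""
    total = 0.0
    fee = base_fee
    for _ in range(num_projects):
        total += fee
        fee *= escalation
    return total

# A developer submitting five projects pays far more per project than one
# submitting a single project, discouraging speculative queue entries.
print(escalating_study_fees(1))   # 100,000
print(escalating_study_fees(5))   # 1,318,750
```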

Some stakeholders have asked MISO to consider giving developers estimated network upgrade costs at the screening stage of the queue, if Pearl Street proves effective at anticipating results.  

MISO still is drawing up a plan to reevaluate the queue cap after three annual cycles, Westphal added. A few stakeholders have asked the RTO to view the cap as a short-term measure and commit to sunsetting the cap after three years of use.  

“Unfortunately, there’s no silver bullet on the queue. It’s just constant improvement,” Westphal said. 

MISO has scheduled a special meeting Sept. 30 to discuss the queue cap again. Westphal said he hopes to present “a final go” of the queue cap by then, make a filing at the end of October and earn FERC approval by the end of the year. 

DC Circuit Strikes Down Emissions Standards for ‘New’ Pre-2020 Boilers

A three-judge panel of the D.C. Circuit Court of Appeals on Sept. 3 set aside EPA emission rules on new large boilers as they applied to those built prior to August 2020, ruling that was a violation of the Clean Air Act (22-1271). 

The rules, issued in October 2022, set National Emission Standards for Hazardous Air Pollutants (NESHAP) for major sources focused on industrial, commercial and institutional boilers. A source is considered “new” under CAA Section 112 if it is built after EPA proposes an emission standard for that source, which the agency did for boilers in August 2020. The court found that EPA had improperly classified certain industrial boilers built before then as “new.” 

In doing so, the court agreed with industry petitioners led by U.S. Sugar, which in 2019 completed a $65 million boiler to help power its facility in Clewiston, Fla., replacing three older, higher-polluting boilers to comply with standards EPA issued in 2011. 

Boilers burn materials such as coal, paper and agricultural waste to create heat, electricity and other forms of energy. That comes with emissions of pollutants like mercury, carbon monoxide and particulate matter. 

U.S. Sugar’s boiler also “surpassed” EPA’s 2022 standards for existing sources, the court said. But under EPA’s rules, it was considered a new source. “Under this regime — whose logic suggests that boilers built after June 4, 2010, are forever ‘new’ — the U.S. Sugar Corp. must spend tens of millions of dollars retrofitting” the boiler, it said. 

EPA is supposed to base its standards on the maximum achievable control technology (MACT), which can differ between new and existing sources. New sources must meet a standard at least as stringent as the emission control achieved in practice by the best controlled similar source, while existing sources must meet one at least as stringent as the average emission limitation achieved by the best-performing 12% of existing sources for which EPA has emissions data. 
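As a rough illustration of the difference between those two floors, here is a minimal sketch with invented emission rates; EPA’s actual floor-setting methodology involves additional data handling and adjustments not shown here:

```python
import math

# Illustrative sketch of the Section 112 "MACT floors" described above,
# using invented emission rates (lb/MMBtu) for a set of existing boilers.
emission_rates = [0.8, 1.1, 0.5, 0.9, 1.4, 0.6, 0.7, 1.2, 1.0, 0.4]

# New-source floor: at least as stringent as the best controlled similar source.
new_source_floor = min(emission_rates)

# Existing-source floor: at least as stringent as the average emission limitation
# achieved by the best-performing 12% of sources for which data exist
# (rounded up to at least one source in this toy example).
best_count = max(1, math.ceil(0.12 * len(emission_rates)))
existing_source_floor = sum(sorted(emission_rates)[:best_count]) / best_count

print(new_source_floor, existing_source_floor)  # 0.4 and 0.45
```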

The agency argued that because it was using the same dataset as when it proposed the 2011 standards, the cutoff date for whether a source is “new” is June 4, 2010, when the proposal was first published.  

But the court found that “when Section 112 references the date ‘an emission standard’ is ‘first propose[d],’ it means the first proposal of each consecutive standard.” It noted that while existing sources are given three years to comply with new standards, new sources are expected to be in compliance upon their effective date. The court pointed to other cases challenging EPA rulemakings for other types of new sources under Section 112 in which the agency also noted this in its arguments. 

“EPA itself has explained that retrofitting older sources to comply with increasingly stringent modern standards may be ‘draconian’ if not ‘impossible,’” the court said. “And we should not lightly assume that a statute is ‘draconian’ or ‘demands the impossible.’” 

Environmentalists including the Sierra Club also challenged the rules, arguing that EPA used old data despite more recent data being available. But the court found that did not violate the CAA. 

In other cases, the court has generally acknowledged that EPA may exercise discretion and use its expertise to calculate standards. The environmentalists’ view would offer no discretion to EPA when choosing its data, which could force the agency to use faulty data if that was all it had, the court said. 

“Because that interpretation of Section 112(d) would substantially hamper EPA’s ability to effectively promulgate standards, we reject environmental petitioners’ interpretation and hold that EPA’s decision to rely on its original dataset was not unlawful,” the court said. 

Heat Pump Tech Could Help Decarbonize Dairy Sector, CEC Says

The California Energy Commission (CEC) is exploring the use of heat pump technologies to accelerate decarbonization in the dairy sector, which accounts for 2.5% of the state’s energy consumption and 1.4% of greenhouse gas emissions.

Home to more than 1,100 dairy farms and more than 130 dairy product processing facilities, California leads the nation in milk production and is the second-largest cheese producer. In 2020, the U.S. dairy industry announced a goal of net-zero carbon emissions by 2050, and commissioners said transitioning from thermal resources to heat pump technologies for processes like pasteurization, evaporation and cleaning could yield significant energy savings while reducing reliance on fossil fuels.

“Knowing how important the dairy sector is to California’s economy and knowing we could bring some innovation to the sector, [we can] really work together with industry to improve the carbon footprint,” CEC Commissioner Andrew McAllister said during an Aug. 29 meeting to discuss decreasing dairy emissions.

Through the Food Production Investment Program, the CEC has awarded up to $117.8 million in grants to help food producers reduce greenhouse gas emissions, including to six California dairy facilities.

“These projects have or will improve operation efficiency and lower production costs, and in general maintained or increased the quality and quantity of production,” said Matthew Stevens, a CEC staffer representing the Food Production Investment Program. “We have done a lot of waste heat capture and storage, general system overhauls, and recently, we’re tackling to replace very inefficient, aging, high global warming refrigeration systems.”

‘Where Everybody Wins’

Several experts who focus on the decarbonization of industrial facilities presented at the meeting, all highlighting the potential for heat pumps to improve energy efficiency in the dairy sector.

Dr. Ahmad Ganji, director of San Francisco State University’s Industrial Assessment Center (IAC), said there are “significant opportunities for energy efficiency in dairy processing plants,” with efficiency increases of at least 10-15%. The IAC analyzes and informs industrial facilities about how they can decarbonize, and Ganji said it plans to recommend heat pump technologies at dairy processing plants as the technology improves.

Most of the emissions from dairy processing facilities come from natural gas-powered steam boilers used for pasteurization and other processes requiring heat, according to Arun Gupta, CEO of Skyven Technologies.

“Twenty percent of global carbon emissions are caused by industrial heat, which, for context, is about as much carbon impact as all of transportation, all of the cars, trains, planes, boats, everything combined,” Gupta said. “Half of that is steam, so steam is enormous.”

Skyven developed a new steam-generating heat pump technology, designed for use in dairy processing plants, that can generate steam for heating and cooling at lower prices than boilers that run on natural gas.

“That allows us to achieve the deep decarbonization that the industry is looking for,” he said. “Decarbonization solutions must be cost competitive with existing boilers and, better than cost competitive, they actually need to save money … Where everybody wins is where decarbonization and cost savings go hand-in-hand.”

Skyven recently was awarded a $145 million grant from the U.S. Department of Energy to deploy steam-generating heat pumps across multiple manufacturing sectors, including at California dairy facilities, with the goal of making the technology an industry standard. Gupta estimated the project will cut GHG emissions by around 400,000 metric tons, create 1,000 jobs and benefit more than 300,000 people through cleaner air.

Curtis Rager, product manager at Johnson Controls, provided additional background on how heat pumps could increase the efficiency of refrigeration systems at dairy facilities. Most dairy plants use ammonia as a refrigerant, which is pumped throughout the system and absorbs heat at the point of use.

For example, refrigerant is sent to milk silos, where it absorbs heat; that heat then flows through the system and is discharged into the atmosphere via evaporative condensers. Heating milk for pasteurization and then cooling it back down requires substantial energy, and heat pumps could help recover waste heat that otherwise is discharged.

“[Heat pumps are] capturing that ammonia refrigeration gas steam that’s going to the evaporative condenser and it’s now taking that and it’s going through another stage of compression,” Rager said. “With a second stage of compression in the heat pump portion you can pump up that temperature … and now through the condenser, you can bring the cold water in and produce hot water all from the energy that was absorbed … from those evaporators.”

The system would allow for significant energy- and water-use savings, contributing to Gupta’s goal of simultaneous decarbonization and reduction of costs.

“We believe that steam decarbonization is crucial for the decarbonization of the industry, and this technology allows that to happen in a way that is profitable for manufacturers and allows them to achieve both the savings and the carbon reductions that they’re looking for,” Gupta said.

NPCC, NYSEG Agree to Settle Control Center Violation

FERC accepted a settlement between New York State Electric and Gas (NYSEG) and the Northeast Power Coordinating Council in which the utility admitted to violating NERC’s requirements for maintaining backup control centers.

The settlement, which carries no monetary penalty, was filed by NERC in its monthly spreadsheet Notice of Penalty on July 31; it was the only settlement in the spreadsheet and the only NOP filed that month (NP24-10). In a filing issued Aug. 30, FERC said it would not further review the settlement. Commissioner Judy Chang did not participate in the decision.

The settlement stemmed from a violation of EOP-008-2 (Loss of Control Center Functionality), approved by FERC in 2018 to “ensure continued reliable operations of the [electric grid] in the event that a control center becomes inoperable.” NPCC discovered the noncompliance during an audit in 2020.

According to the settlement, NPCC found that NYSEG’s backup and primary control centers used a shared communication path with a single point of failure. This contravened requirement R6 of the standard, which mandates that reliability coordinators, balancing authorities and transmission operators ensure their primary and backup control centers maintain separate functionalities.

NPCC reported that seven communications lines terminated in a single room common to both the primary and backup control centers. In the event of a “catastrophic event” at the primary control center, the utility would lose its connection with about 150 remote terminal units (RTUs), 62 of which provide data from its substations. This represents a loss of data from more than half of its 121 grid-connected RTUs.

Further investigation revealed that NYSEG had discovered the issue during a prior audit in 2017 and labeled it an area of concern. The utility first sought to address the problem with its telecommunications vendor, but the vendor delayed implementation of the proposed solution for more than a year before telling NYSEG in 2019 that it “could no longer support the solution as designed.”

NYSEG then pursued a permanent solution, which was “in the planning stages” when NPCC conducted its 2020 audit. But the regional entity said the utility did not assign the task the necessary priority or management oversight, and thus the violation lasted longer than it would have with proper prioritization. Along with EOP-008-2, NPCC also found that NYSEG had violated the standard’s predecessor, EOP-008-1, which was in effect when NYSEG registered as a transmission operator and was required to comply with it.

NPCC assessed the violation as a moderate risk to grid reliability. It pointed out that the shared point of failure would have reduced NYSEG’s visibility into its system and compromised its ability to work remotely if the primary control center became inoperable. The RE said a catastrophic event compromising the primary center “would likely be a long-duration event,” exacerbating the risk.

At the same time, the RE acknowledged that the risk of such a catastrophic event affecting the primary control center is low. It also pointed out that even if NYSEG lost its ability to monitor the system, NYISO and neighboring TOPs and BAs could still monitor their respective systems, ensuring some visibility into the grid’s health.

NPCC determined that no monetary penalty would be required in light of NYSEG’s cooperation in the enforcement process, lack of prior relevant noncompliance and agreement to settle the matter rather than calling for a hearing. However, the RE did feel it necessary to elevate the matter to the spreadsheet NOP because of the length of the noncompliance and the fact that it became aware of the issue through a compliance audit rather than the utility reporting the problem itself.

To mitigate the problem, NYSEG removed the single point of failure by migrating the communications lines. It also created a new NERC compliance tool to monitor compliance projects and make sure schedules are maintained properly, trained relevant personnel on the tool, and updated its project management procedures to specify that leadership must review the project management plan when changes to a project’s schedule are needed.