A team led by engineers at Princeton and the U.S. Department of Energy's Princeton Plasma Physics Laboratory (PPPL) has deployed machine learning methods to suppress damaging edge instabilities, known as edge bursts, without sacrificing plasma performance. The approach optimizes the suppression response in real time, demonstrating high fusion performance without edge bursts at two different fusion facilities with distinct operating parameters. The findings were reported on May 11 in Nature Communications.
"Not only did we show our approach was capable of maintaining a high-performing plasma without instabilities, but we also showed that it can work at two different facilities," said research leader Egemen Kolemen, associate professor of mechanical and aerospace engineering and the Andlinger Center for Energy and the Environment.
"We demonstrated that our approach is not just effective - it's versatile as well."
Researchers have experimented with operating fusion reactors in high-confinement mode, characterized by a steep pressure gradient at the plasma's edge that enhances confinement. However, this mode comes with edge instabilities, requiring creative workarounds. One fix involves using magnetic coils to apply fields to the plasma edge, breaking up structures that could otherwise develop into instabilities. While this approach stabilizes the plasma, it typically comes at the cost of performance.
"We have a way to control these instabilities, but in turn, we've had to sacrifice performance, which is one of the main motivations for operating in the high-confinement mode in the first place," said Kolemen, also a staff research physicist at PPPL.
Part of the performance loss stems from the difficulty of optimizing the shape and amplitude of the magnetic perturbations, a computationally intensive task. Existing methods take tens of seconds to optimize a single point in time, while plasma behavior can change within milliseconds. Consequently, researchers have had to preset the shape and amplitude of the perturbations ahead of each fusion run, giving up the ability to make real-time adjustments.
"In the past, everything has had to be pre-programmed," said co-first author SangKyeun Kim, a staff research scientist at PPPL and former postdoctoral researcher in Kolemen's group.
"That limitation has made it difficult to truly optimize the system, because it means that the parameters can't be changed in real time depending on how the conditions of the plasma unfold."
The machine learning approach reduces computation time to the millisecond scale, enabling real-time optimization. The model can monitor plasma status from one millisecond to the next and adjust the magnetic perturbations as needed, balancing edge burst suppression and high fusion performance.
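The article does not include the control code, but the feedback loop it describes can be pictured roughly as follows. This is a minimal, hypothetical sketch: `read_plasma_state`, `set_coil_amplitude`, and `surrogate_score` are placeholder names invented here, and the stub scoring function merely stands in for the team's trained surrogate model.

```python
# Sketch of a millisecond-scale feedback loop in which a fast surrogate model
# scores candidate magnetic-perturbation settings and the best one is applied
# each cycle. All names and numbers here are illustrative placeholders, not
# the actual control-system API or model.

import time
import numpy as np


def surrogate_score(state: np.ndarray, amplitude: float) -> float:
    """Hypothetical stand-in for a trained surrogate model: returns one
    figure of merit that rewards predicted confinement and penalizes
    predicted edge bursts. A real controller would evaluate a neural
    network here; this stub uses an arbitrary smooth function instead."""
    target = 0.3 + 0.4 * float(np.tanh(state.mean()))
    return -((amplitude - target) ** 2)


def control_step(state: np.ndarray, candidate_amplitudes: np.ndarray) -> float:
    """Score each candidate perturbation amplitude and return the best one."""
    scores = [surrogate_score(state, a) for a in candidate_amplitudes]
    return float(candidate_amplitudes[int(np.argmax(scores))])


def run_loop(read_plasma_state, set_coil_amplitude, period_s: float = 1e-3):
    """Repeat the measure -> evaluate -> actuate cycle once per millisecond."""
    candidates = np.linspace(0.0, 1.0, 21)  # normalized coil amplitudes
    while True:
        t0 = time.perf_counter()
        state = read_plasma_state()                  # diagnostics snapshot
        set_coil_amplitude(control_step(state, candidates))
        # Sleep out whatever is left of the cycle so the loop stays periodic.
        time.sleep(max(0.0, period_s - (time.perf_counter() - t0)))
```

The point of the sketch is the timing budget: because the surrogate answers in well under a millisecond, the measure-evaluate-actuate cycle can keep pace with the plasma rather than relying on values preset before the run.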
"With our machine learning surrogate model, we reduced the calculation time of a code that we wanted to use by orders of magnitude," said co-first author Ricardo Shousha, a postdoctoral researcher at PPPL and former graduate student in Kolemen's group.
The approach is grounded in physics, making it straightforward to apply to different fusion devices. The researchers demonstrated success at both the KSTAR tokamak in South Korea and the DIII-D tokamak in San Diego, achieving strong confinement and high fusion performance without harmful edge bursts.
"Some machine learning approaches have been critiqued for being solely data-driven, meaning that they're only as good as the amount of quality data they're trained on," Shousha said.
"But since our model is a surrogate of a physics code, and the principles of physics apply equally everywhere, it's easier to extrapolate our work to other contexts."
The team is refining their model to be compatible with other fusion devices, including future reactors like ITER, which is under construction. Enhancing the model's predictive capabilities is another active area of work: the goal is to recognize precursors to harmful instabilities so the system can be optimized without ever encountering an edge burst.
Kolemen noted the potential for AI to overcome longstanding bottlenecks in developing fusion power as a clean energy resource. Previously, his team deployed a separate AI controller to predict and avoid another type of plasma instability in real time at the DIII-D tokamak.
"For many of the challenges we have faced with fusion, we've gotten to the point where we know how to approach a solution but have been limited in our ability to implement those solutions by the computational complexity of our traditional tools," said Kolemen.
"These machine learning approaches have unlocked new ways of approaching these well-known fusion challenges."
Research Report: Highest fusion performance without harmful edge energy bursts in tokamak
Related Links
Andlinger Center for Energy and the Environment