
Research Using AI in Energy Applications at CMU Showcases the Frontier of Opportunities
Pioneered at Carnegie Mellon University, artificial intelligence (AI) holds tremendous promise while also posing challenges in its applications and use.
AI’s ability to synthesize mammoth amounts of data is being harnessed across every industry. However, running and iterating on the algorithms that compute new, innovative solutions faster also demands ever greater amounts of energy to power them.
The pace at which AI is advancing also requires thoughtful, long-term strategies for powering the technology efficiently and sustainably, said Daniel Tkacik, executive director of the Wilton E. Scott Institute for Energy Innovation.
“At Carnegie Mellon, we are leading AI for the world, but we're also leading energy for the world,” he said. “We've been leading work at the intersection of AI and energy for decades."
As part of that mission, the Scott Institute is hosting CMU Energy Week, this year bringing together energy and sustainability leaders to combine forces and exchange ideas at the intersection of AI and energy.
“There are good reasons why AI is being pursued in the way it is,” Tkacik said. "It's going to make our lives better in a myriad of ways, and there are a lot of smart people trying to make sure that this is done in an ethical and responsible way."
Harnessing AI’s Possibilities Starts with Definitions and Data
Emma Strubell, Raj Reddy Assistant Professor in the Language Technologies Institute in the School of Computer Science, who was recently named one of the most powerful people in artificial intelligence, said improving AI efficiency should start with agreeing on what counts as artificial intelligence, then measuring and reporting its energy usage.
Strubell contributes to a nationwide project funded by the U.S. National Science Foundation's (NSF) Expeditions in Computing Awards program that aims to lay this groundwork for sustainable computing.
“In order to make informed decisions and policies — for example, around energy use and the relationship between AI and future energy use in the U.S., due to data centers — we need a much better understanding of the actual drivers of that energy use,” they said.
Strubell is among the Carnegie Mellon researchers examining these frontiers of AI in energy and climate solutions who are set to contribute to discussions as part of Energy Week.
“I’ve been thinking about the foundational work,” they said. “There's a need for data. And there's a lot of analysis that we can do in academia with the information that's available to us, but there's also a lot that we can't do because there's not enough data about things like what the workloads actually are in data centers.”
Without Batteries, Devices Storing Energy Create Possibilities
Efficiency in computing also stems from more efficient processing systems, including computer chips, an area that has been the focus of work by Brandon Lucia, Kavčić-Moura Professor of Electrical and Computer Engineering in the College of Engineering, for the past 10 years.
“We need to make computing more energy efficient, because if we don't do that, we can't continue to add functionality to these kinds of energy-constrained devices,” he said. “We can't continue to push AI and computing forward in general without energy efficiency. Going forward, for the next five to 10 years, energy is the only thing that matters.”
Lucia and his research team are working on batteryless computer systems that use energy-harvesting devices and intermittent computing, a term that his team established.
The devices collect energy from the environment, such as sunlight, radio waves or mechanical vibration, and store it in a rechargeable capacitor, a simpler component than a battery.
“There's no environmental impact of having to produce, distribute and dispose of batteries, and there's no maintenance in having to replace batteries,” Lucia said of the project. “You have these devices which have all these other benefits, but they have an unfortunate side effect — they turn off, because you don't always have power in the environment. And when they turn off, your system goes haywire.”
That’s where intermittent computing comes in, Lucia said.
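Conceptually, intermittent computing breaks work into small steps and saves progress to nonvolatile memory, so a power loss costs only the step in flight. The Python sketch below is a loose illustration of that checkpoint-and-resume pattern, not Lucia's actual system (which runs on microcontrollers, typically in C); the file standing in for nonvolatile memory and the simulated power failure are assumptions for the demo.

```python
# A minimal sketch of the checkpoint-and-resume idea behind intermittent
# computing. A JSON file stands in for nonvolatile memory, and a random
# "power failure" stands in for the harvested-energy supply running out.
import json
import os
import random

CHECKPOINT_FILE = "checkpoint.json"  # stand-in for nonvolatile memory

def load_checkpoint():
    """Resume from the last saved state, or start fresh."""
    if os.path.exists(CHECKPOINT_FILE):
        with open(CHECKPOINT_FILE) as f:
            return json.load(f)
    return {"next_index": 0, "running_sum": 0}

def save_checkpoint(state):
    """Persist progress so a power loss only costs one step of work."""
    with open(CHECKPOINT_FILE, "w") as f:
        json.dump(state, f)

def intermittent_sum(data):
    state = load_checkpoint()
    for i in range(state["next_index"], len(data)):
        if random.random() < 0.1:           # simulated power failure
            print(f"power lost at item {i}; progress is safe in the checkpoint")
            return None                      # the device turns off here
        state["running_sum"] += data[i]      # do one unit of work
        state["next_index"] = i + 1
        save_checkpoint(state)               # commit progress before continuing
    return state["running_sum"]

# Re-running the task after each "outage" picks up where it left off.
readings = list(range(100))
result = None
while result is None:
    result = intermittent_sum(readings)
print("total:", result)
```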
As Lucia pushed intermittent computing forward, he and Nathan Beckmann, associate professor in the School of Computer Science, co-advised then-doctoral student Graham Gobieski on spatial dataflow architectures that were well-suited to intermittent computing. The trio founded Efficient Computer to commercialize the spatial dataflow architecture.
“With that little tiny bit of energy, the biggest impediment to the progress of those batteryless devices was the inefficiency of computing to begin with,” Lucia said. “This spatial dataflow architecture is what unlocks all that efficiency, and so we've been developing those efficient architectures.”
Devices built on these technologies could operate in complex, hard-to-access environments where battery-powered devices would otherwise need to be recharged or have their batteries replaced by humans, such as space exploration, disaster response, and construction or industrial sites. Using fewer batteries would also lessen the environmental impact of disposing of or recycling them.
“With these batteryless devices, you can extend the lifetime of the device essentially indefinitely, until the energy transducer, like a solar panel, starts to break down,” Lucia said. “So that might be a five- to 10-year lifetime, up to 30 years, in some cases.”
Ultimately, energy is a cornerstone on which future innovation rests, Lucia said, especially when it comes to applications such as personal devices for each of the roughly eight billion people on the planet.
“We need to think about energy,” he said. “The defining problem for humanity for the next several decades is how to match the need for computing with the energy required, and that includes AI, but not just AI. … Energy is the biggest problem facing humanity.”
AI Increases Speed of Progress Toward Nuclear Fusion Power
Using AI could also help unlock a new potential source of energy to address that problem, as in the work of Jeff Schneider, research professor in the School of Computer Science, and his research team studying nuclear fusion.
The reaction, in which atomic nuclei collide and fuse (distinct from the splitting of atoms in the nuclear fission already used in power plants), is created in a tokamak machine. The billion-dollar reactor heats hydrogen until it becomes plasma, which is then formed into a donut shape, wrapped with magnets and confined within a magnetic field. The system simultaneously controls the injection of hydrogen particles, the shape of the plasma, and its current and density.
“We just don't know how to keep that plasma in place at high enough temperatures and pressures for long enough periods of time so that it can be used in a power plant,” said Schneider, who will also be speaking as part of Energy Week. “That's basically the one thing that's standing between us and unlimited clean energy.”
Machine learning is helping the team synthesize decades of data from past experiments to better understand exactly how each of the “shots” taken inside the machine at the DIII-D National Fusion Facility in San Diego unfolds.
“Over the decades there have been tens of thousands of these shots that have happened, and all of that you can feed to a machine learning model to learn to predict those steps from one state to the next to use that to make a simulator,” Schneider said.
The simulator can then perform millions of shots, “more than have ever been run in real time,” he said. “It just keeps practicing until it finds a control policy, and now what we can do is get to the real bottleneck to progress, the limited time — only a few hours per year — available to run experiments on the tokamak.”
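As a rough illustration of that recipe (fit a model that predicts the next plasma state from the current state and a control action, then use it as a cheap simulator to compare control policies), the sketch below uses synthetic data and an off-the-shelf regressor. It is not the team's code; every feature, target and policy in it is a placeholder assumption.

```python
# A toy sketch: learn a dynamics model from logged "shot" transitions, then
# use it as a simulator to score candidate control policies. Data is synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Synthetic shot logs: columns = plasma state (e.g., density, current) and action.
states = rng.normal(size=(5000, 2))
actions = rng.normal(size=(5000, 1))
next_states = 0.9 * states + 0.3 * actions + 0.05 * rng.normal(size=(5000, 2))

# Learned dynamics model: (state, action) -> next state.
dynamics = RandomForestRegressor(n_estimators=50, random_state=0)
dynamics.fit(np.hstack([states, actions]), next_states)

def rollout(policy, start_state, horizon=50):
    """Simulate one shot under a policy; higher scores mean the state stayed
    closer to a target operating point."""
    s, cost = start_state, 0.0
    for _ in range(horizon):
        a = policy(s)
        s = dynamics.predict(np.hstack([s, a]).reshape(1, -1))[0]
        cost += np.sum((s - np.array([1.0, 0.5])) ** 2)  # distance from target
    return -cost

# Compare two simple candidate policies entirely inside the learned simulator.
aggressive = lambda s: np.array([0.5])
gentle = lambda s: np.array([0.1])
start = np.zeros(2)
print("aggressive:", rollout(aggressive, start))
print("gentle:", rollout(gentle, start))
```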
Schneider said that with machine learning, the team has been able to double the number of shots that run without disruption, and because of that, collaborators asked to run the same algorithm at the KSTAR tokamak in South Korea.
“These methods are getting us these results faster than we otherwise would be able to get them,” he said. “These are both examples of things that physicists have known about and been interested in for years, that you know they just hadn't been able to reach yet. Now, we've proven that we have the tools to solve them, and so what we're really trying to do now is to get the resources to roll this out at scale.”
Making progress on the science behind nuclear fusion will lead to progress toward power plants that can produce considerable energy from a renewable, emission-free source.
“If you think about the world's grand challenges, many of them are just energy problems,” Schneider said, citing global warming and food and water accessibility. “All these things are just clean energy problems, so that's why I'm really excited about solving the problems with clean energy, specifically with fusion.”
Building Data Synthesized by AI Could Reduce Energy Usage
When it comes to research aiding systems that are already in place, Azadeh O. Sawyer, assistant professor in building technology with the School of Architecture, is using AI to make building design more efficient.
Setting a benchmark for the energy usage of a typical building of a given size in a specific location allows future designs to use those figures to keep improving efficiency, but many cities lack the comprehensive energy benchmarking data needed to compare past designs with those under development, Sawyer said.
“We realized we could actually use the power of AI and machine learning to analyze a lot of data from other similar cities and similar environments to predict what it would be for a city that's missing those benchmarks right now,” she said.
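The sketch below illustrates that cross-city idea in its simplest form: train a regression model on buildings from cities that already publish benchmarking data, then estimate a benchmark for a building in a city that does not. The features, data and model choice are hypothetical placeholders, not the project's actual pipeline.

```python
# A minimal sketch of cross-city benchmark prediction with synthetic data.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(1)
n = 2000

# Pretend these rows come from cities with public benchmarking programs.
buildings = pd.DataFrame({
    "floor_area_m2": rng.uniform(500, 50000, n),
    "year_built": rng.integers(1920, 2020, n),
    "heating_degree_days": rng.uniform(1000, 4000, n),
    "cooling_degree_days": rng.uniform(200, 2000, n),
})
# Synthetic target: annual energy use intensity (kWh per square meter).
eui = (120 + 0.01 * buildings["heating_degree_days"]
       - 0.02 * (buildings["year_built"] - 1920)
       + rng.normal(0, 10, n))

model = GradientBoostingRegressor(random_state=0)
model.fit(buildings, eui)

# A building in a city without benchmarking data, but with a similar climate.
unseen = pd.DataFrame([{
    "floor_area_m2": 12000,
    "year_built": 1975,
    "heating_degree_days": 2800,
    "cooling_degree_days": 900,
}])
print("estimated EUI (kWh/m^2/yr):", model.predict(unseen)[0])
```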
Sawyer and her Ph.D. student Tian Li received a seed grant from the Scott Institute for this work, which aims to reduce carbon emissions and energy use in buildings across the country and then identify ambitious, achievable decarbonization targets.
Previously, building scientists would apply statistical methods, she said. Now, using AI, researchers can augment those approaches by uncovering nonlinear patterns to make even better predictions and classifications based on larger and more complex datasets.
“AI really changes how we do research,” she said. “It gives you the opportunity to really do true exploration: Here’s an idea area, I don’t necessarily know everything that’s out there, but I have this amount of data and can ask it to find patterns that our human eyes don’t see.”
In another project, she is working with doctoral student Niloofar Nikookar to develop a dynamic lighting system based on AI.
Static lighting systems remain the same regardless of the amount of daylight filtered through a building’s windows or the color of the light, which could affect the moods and productivity of the people inside.
Adjusting lighting through smart systems can not only help people feel better, but also improve energy efficiency, Sawyer said.
“Our hope is that once we have the datasets of how people respond to different colors of lighting and different sky conditions, then we could actually train the model to create a dynamic lighting system using AI that responds to how someone feels and what kind of space they're in,” she said.
Building design could also benefit from predicting occupant behavior, which can also impact energy usage, Sawyer said, such as how people react to blinds that adjust on their own to create shade and reduce glare.
“We’re designing for people,” she said. “We want to design responsibly so it doesn’t harm the environment and so it doesn’t negatively impact the people that are using our spaces.”
Monitoring Household Systems with AI Could Reduce Energy Usage
Mario Bergés, professor in the Department of Civil and Environmental Engineering, examines the way existing buildings monitor and use energy in order to make them more efficient.
His research involves what is known as non-intrusive load monitoring, which analyzes smart meter data to identify appliance usage and predict malfunctions.
“If you can be smart about how to analyze the data that's coming from your smart meter, then you are going to be able to fingerprint individual appliances and also get to know a lot about the behavior of people in the home through their usage of devices that consume electricity,” Bergés said.
Then, computer systems can be trained to analyze the data from the meter and provide feedback on how to improve energy consumption.
“Say your washing machine is breaking down. There are signatures of that motor failing that could be detected from the smart meter itself,” he said. “If you're careful about what you're paying attention to, you can know not only that these things are on or off, but also what they could be doing, and whether they are malfunctioning.”
Even though identifying every possible signature from every appliance would be difficult, systems could use AI to recognize anomalies closely enough to monitor appliances and make the data accessible.
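As a loose illustration of that anomaly-flagging idea, the sketch below learns what "normal" whole-home smart meter readings look like and flags hours that deviate, which might indicate a misbehaving appliance. It uses a generic off-the-shelf detector on synthetic data, not the specific methods developed by Bergés's team.

```python
# A small sketch: summarize smart meter readings into hourly features, learn
# the normal pattern, then flag anomalous hours for inspection.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(2)

def window_features(power, size=60):
    """Summarize one-minute power readings (watts) into per-hour feature rows."""
    windows = power[: len(power) // size * size].reshape(-1, size)
    return np.column_stack([windows.mean(1), windows.std(1), windows.max(1)])

# A week of synthetic "normal" household load, then a day with a failing motor
# drawing extra, erratic power.
normal = 300 + 150 * rng.random(7 * 24 * 60)
faulty = 300 + 150 * rng.random(24 * 60) + rng.normal(400, 200, 24 * 60)

detector = IsolationForest(contamination=0.05, random_state=0)
detector.fit(window_features(normal))

flags = detector.predict(window_features(faulty))   # -1 marks anomalous hours
print(f"{(flags == -1).sum()} of {len(flags)} hours flagged for inspection")
```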
His team has used artificial intelligence to develop learning-based controls for heating, ventilation and air conditioning systems. AI can monitor a building, learn to emulate how its temperatures are managed, and then manage them more efficiently.
Instead of managing only one building at a time, a set of buildings could be managed together, with their energy usage coordinated by an algorithm. Energy can then be stored or released as needed.
“Then you are essentially allowing all these buildings to act as a very big thermal battery,” he said. “All of them together are creating this storage and they are allowing for the excess production of electricity to be stored.”
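A toy version of that coordination might look like the sketch below, in which a shared price signal tells every building in a fleet when to pre-cool (storing "cold" in the building mass) and when to coast. The thresholds, decision rule and fleet are hypothetical simplifications, not the team's controllers.

```python
# A toy illustration of buildings acting as a shared thermal battery: pre-cool
# when electricity is abundant and cheap, coast when it is scarce and expensive.

ABUNDANT_PRICE = 0.05   # $/kWh below which we store "cold" in the building mass
SCARCE_PRICE = 0.20     # $/kWh above which we avoid running chillers

def hvac_action(price_per_kwh, indoor_temp_c, setpoint_c=22.0, band_c=1.5):
    """Decide one building's HVAC behavior from price and indoor temperature."""
    if price_per_kwh <= ABUNDANT_PRICE and indoor_temp_c > setpoint_c - band_c:
        return "pre-cool"         # charge the thermal battery
    if price_per_kwh >= SCARCE_PRICE and indoor_temp_c < setpoint_c + band_c:
        return "coast"            # discharge: ride on stored coolness
    return "hold setpoint"

# Coordinate a small fleet: the same price signal steers every building,
# shifting the aggregate load toward hours of excess generation.
fleet_temps = [21.2, 22.4, 23.0, 21.8]
for hour, price in enumerate([0.04, 0.04, 0.12, 0.25]):
    actions = [hvac_action(price, t) for t in fleet_temps]
    print(f"hour {hour}: price ${price:.2f}/kWh ->", actions)
```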
Transforming Energy Solutions with AI Starts with Research
Using artificial intelligence in each of these ways requires human innovation and ingenuity, including the interdisciplinary collaboration encouraged by Carnegie Mellon and facilitated by the Scott Institute.
The faculty members agreed that the students and colleagues at CMU are what make working toward solutions to these energy-related challenges possible.
“We have the best students to work on these projects, and it really takes the best folks in AI and machine learning to tackle such hard problems,” said Schneider.
The future benefits of artificial intelligence will depend on how the research and innovation being pursued now also account for sustainability and the environment.
“You can have both innovation and sustainability. We just need to sort of think about things differently,” Strubell said. “Now is a critical time to be thinking about this, because I think we are about to build out a ton of data center infrastructure and the supporting energy systems to power those data centers, and I do believe that we can do that in a way that is compatible with sustainability in various ways.”
AI-driven processes that improve energy usage, storage and reliability, developed now through research at CMU, will continue to transform systems and establish sustainable ones well into the future.
“I'm optimistic, and the Scott Institute is optimistic, which is why we're pursuing this line of research,” Tkacik said. “There are many applications of AI for energy and climate yet to be discovered, and these discoveries are being pursued in a very responsible way by a number of scholars, both across the country and right here at Carnegie Mellon."