[Podcast] Navigating sustainability data with GenAI Copilot 

AI in data systems

How can AI be used effectively in data systems? In this episode of the AI at Scale podcast, host Gosia Gorska puts that question to Jeff Willert, Director of Data Science at Schneider Electric. Jeff shares his career journey, from his early days working with the U.S. Department of Defense to his current role leading a team of data scientists dedicated to sustainability at Schneider Electric.

He also discusses the groundbreaking Resource Advisor solution and its newly integrated Copilot feature, designed to assist users in understanding and managing their energy consumption and carbon footprint.

Jeff elaborates on the functionality of Resource Advisor Copilot, covering its safety protocols, data privacy measures, and the user experience of interacting with the system.   

Finally, Jeff addresses the difficulties involved in ensuring reliable AI responses and underscores the significance of visualization in data reporting. Additionally, he highlights how AI can contribute to ESG (Environmental, Social, and Governance) reporting.

AI in Sustainability Data Management

 

Listen to the AI at Scale podcast

Listen to the Jeff Willert: Navigating sustainability data with GenAI Copilot episode. Subscribe to the show on your preferred streaming platform (Apple Podcasts, Spotify).

Transcript

Gosia Gorska: Welcome. This is Gosia Gorska, the host of the AI at Scale podcast and today I have the pleasure of hosting Jeff Willert, Director of Data Science at Schneider Electric. Jeff holds a Ph.D. in Applied Mathematics from North Carolina State University. He worked in the field of computational physics at Los Alamos National Laboratory as a graduate student and postdoc, and before he joined Schneider Electric’s sustainability business in 2019, he worked in multiple data scientist roles in Washington, D.C. supporting the U.S. Department of Defense. Welcome, Jeff.

Jeff Willert: Thank you, Gosia. It’s a pleasure to be here.

Introduction

Gosia: Yeah, thank you for being with us. And it’s really impressive you have this experience as a data scientist supporting the U.S. Department of Defense. I bet it was an interesting experience for you.

Jeff: It was a great experience. I had the opportunity to learn a lot of lessons both about myself and also about what it really meant to be a data scientist. I was trained as a mathematician and didn’t have any coursework or formal experience in data science at the time, but it turned out that there was a lot of overlap between the work I’d done as a mathematician and the work I was ultimately going to be doing as a data scientist. I also really got the opportunity to learn some less technical lessons. During that time, it was really important for me to be able to communicate technical concepts to non-technical audiences. In the past, I’d attended math conferences or nuclear engineering conferences where everyone I met with was an expert in the topics I was discussing. But all of a sudden, I found myself talking to stakeholders who didn’t care about the technical details as long as the results were robust and actionable. I also got to learn a lot about collaborating intimately with subject matter experts. Even if data suggests a solution is viable, it needs to work within the boundaries of existing processes and within the expectations of the stakeholders. I learned how important change management can be when implementing solutions that turn the status quo upside down. And finally, I learned how important it was for me to find work that was meaningful to me personally.

Gosia: Yeah, I can see what you mean. And it’s pretty amazing that AI is actually putting applied mathematics in the spotlight. We discuss, of course, all these AI solutions, but behind them, we have solid math, we have data science. And I was wondering, was there any particular reason why you decided to join the sustainability department at Schneider Electric? And what do you do in your current role?

Jeff: While the work I was doing was very interesting, it hadn’t been personally fulfilling. I found that I needed to devote my work towards something more meaningful, and sustainability is an obvious choice. In my current role, I lead a team of about a dozen great data scientists and machine learning experts. We execute projects on behalf of the sustainability business of Schneider Electric. Within our energy management division, we’re focused on helping companies across the globe understand how they’re consuming energy or other commodities. We help them source those commodities more cost-effectively, and ultimately, we want to help them understand and reduce their carbon footprint. In addition to leading my team, I’m really passionate about educating others in the business about data science and AI. I really believe that some of the best projects come about when the people who are closest to the client or closest to the business processes recognize a way to incorporate new technology. We’ve held a handful of lunch and learns and other learning opportunities across the sustainability business, which are always met with great excitement.

EcoStruxure Resource Advisor

Gosia: I see, and I’m very happy that I will be learning with you today, along with our audience of the AI at Scale podcast. So, getting to the point of your current work, I know that you are working on a team that supports a solution we call EcoStruxure Resource Advisor, which actually uses AI capabilities in the sustainability realm. So, for those who are not familiar, what is the solution about and how did you work on it?

Jeff: Resource Advisor is a five-in-one platform that enables clients to address many aspects of energy management and their sustainability journey, including sustainability, renewables, carbon efficiency, energy and risk, and ESG reporting. It’s used by over 2,700 clients globally, including 40% of the Fortune 500. The system supports more than 60 gigawatts of managed energy across the globe and 40 billion euros worth of managed spend. This system is used by sustainability professionals, finance and procurement teams, site and operations managers, and CXOs. It’s a really impressive platform with a wide range of applications.

Life of a Facility Manager

Gosia: Yeah, so you really found your place in the sustainability space. I’m trying to imagine the people who are using it on a daily or maybe monthly basis. Could you describe for us a day in the life of a typical, say, facility manager or sustainability manager, and what kind of data do they need to collect?

Jeff: Yeah, these are great questions. Facility managers and sustainability managers potentially have very different interests in what they want to do with Resource Advisor. A sustainability manager is likely interested in computing carbon footprints across scope 1, scope 2, and scope 3. They may be interested in tracking other ESG metrics. That stands for environmental, social, and governance, which are very important in today’s world. Whereas a facility manager is probably more interested in understanding how their site is specifically using different commodities like electricity, natural gas, water, wastewater, and maybe other commodities like propane or liquefied natural gas. And so, for example, at a manufacturing facility, they might have meter data and invoice data that they need to reconcile, which they later will try to turn into an emissions profile. Ultimately, these teams are all seeking to reduce their costs and reduce their carbon footprint.

Data Collection in Resource Advisor

Gosia: Yeah, and that’s really a lot of different sources of data. And I imagine they are also often in different formats. So that’s why you came up with the solution named Resource Advisor. What type of information exactly is available there, and how does it support customers in collecting all this data?

Jeff: Resource Advisor has, as you mentioned, an enormous variety of standard and customizable mechanisms for accessing activity and cost data. And we centralize it into a common platform that can be the trusted source of truth for our clients’ organizations. To answer your question a little bit more specifically, we have connected smart meters that are feeding electric power readings or natural gas data at a regular interval. We have invoice data that we collect from utilities or suppliers every month. And we also have the opportunity for users to enter their own data streams and track metrics that are important to their business, all within the same site. They can also compute calculated data streams. So maybe they’re interested in energy efficiency of some manufacturing process, or if you’re a hotel, maybe the amount of energy being used per guest. So there’s a lot of opportunities and it’s a very flexible platform.
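
A quick toy example of such a calculated data stream (the stream names and figures below are invented purely for illustration): deriving a hotel's per-guest energy metric from two underlying monthly streams.

```python
# Toy illustration of a "calculated data stream": deriving a hotel's
# energy-per-guest metric from two underlying monthly streams.
# All names and figures are made up for the example.
monthly_kwh    = {"2024-01": 52000.0, "2024-02": 48500.0}
monthly_guests = {"2024-01": 1300,    "2024-02": 1210}

kwh_per_guest = {month: monthly_kwh[month] / monthly_guests[month]
                 for month in monthly_kwh}
print(kwh_per_guest)  # {'2024-01': 40.0, '2024-02': 40.08...}
```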

Introducing AI to Resource Advisor

Gosia: Okay, so at this level, we are just talking about collecting all the data in one place and turning it into information that sustainability, procurement, or facility managers can use on a daily or monthly basis. So why and how did this idea come to your team that we need some AI on top of it?

Jeff: Yes, this is something we’ve been discussing for the last five years, and it’s basically how do we make access to the data simpler? And we came up with this tagline, “stop searching for the data, just ask for it.”

Gosia: Yeah, I like that.

Jeff: The final push for this came about two years ago when we began to see rapid improvement in natural language processing and large language modeling capabilities. So these large language models, or LLMs, give us a new way to understand text inputs. And we began asking ourselves, how can we put the information a user wants right at their fingertips, regardless of who they are? And so these new technologies offered us a great opportunity to begin delivering on this longstanding desire.

Gosia: Okay, so basically what you offer now is a friendly companion that sits on your computer, ready when you arrive at your desk. You can ask a question and it gives you the answer. How exactly does it work?

Jeff: It’s actually a very cool system. When we began ideating about Resource Advisor Copilot, we asked ourselves some basic questions. And one in particular was what should the copilot be able to answer? And of course, what shouldn’t it answer? And this led us to develop some of the safety features of the tool. Before building out the rest of the framework, we built models that take as input any user’s utterance, and we try to determine if that’s relevant to energy management or sustainability. Furthermore, we work to quickly identify any negative or adversarial behavior on the part of the user. Overall, our goal is for the user to have a safe, positive, and relevant conversation, which is ultimately useful to the client. Next, we run a set of models to determine the user’s intent. We want to know, are they seeking help understanding complex energy management and sustainability terms? Are they looking for help using Resource Advisor, or are they seeking to query some of their own data? The real challenge for us occurs when the user seeks to query some of their own data. We need to understand the type of data that they seek, as well as all the parameters that comprise the query, including things like data stream names, locations, dates, and more. And if I can get technical for just a moment, these large language models, or LLMs, are designed to process numerical data, not text. Therefore, the first step in language modeling requires us to compute text embeddings, or high-dimensional numerical representations of a word, sentence, or longer passage. And we’ve built more than a dozen classification models that take as input the text embedding of the user’s utterance. This has many benefits, including the fact that we don’t need to utilize a large language model to make every decision within the application. This speeds it up and also means we’re using less energy running large language models.
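
To make this architecture concrete, here is a minimal sketch of the routing pattern Jeff describes: compute the utterance's embedding once, then let small, fast classifiers make decisions instead of invoking an LLM for each one. The libraries (sentence-transformers, scikit-learn), the embedding model, and the intent labels are assumptions chosen for the example, not the actual Resource Advisor Copilot stack.

```python
# Sketch: route a user's utterance with lightweight classifiers over
# text embeddings, reserving the LLM for steps that truly need it.
from sentence_transformers import SentenceTransformer
from sklearn.linear_model import LogisticRegression

embedder = SentenceTransformer("all-MiniLM-L6-v2")  # illustrative model choice

# Hypothetical labeled examples for one routing decision: user intent.
train_utterances = [
    "What does Scope 3 mean?",                       # sustainability knowledge
    "How do I reset my password?",                   # product help
    "Show electricity use for the Louisville site",  # data query
]
train_labels = ["knowledge", "help", "data_query"]

intent_clf = LogisticRegression(max_iter=1000)
intent_clf.fit(embedder.encode(train_utterances), train_labels)

def route(utterance: str) -> str:
    """Predict the intent of one utterance from its embedding."""
    return intent_clf.predict(embedder.encode([utterance]))[0]

# With a realistically sized training set, this routes to "data_query".
print(route("What were my emissions in 2023?"))
```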

Gosia: Yeah, and I think that’s really impressive, because we often overestimate the power of Gen AI a bit, in the sense that, of course, it gives you this conversational capability. And this is great. This is exactly what you wanted to achieve: for a customer to be able to just ask a question and get a reply in their own language. But on the other hand, if I understand it well, you need to put a lot of work in behind it, at the backbone of the application. So it’s not Gen AI that is actually replying to these questions, but you do all the calculations outside of it. Right? To ensure the reliability of the answers.

Jeff: Yes, that’s a great point. Gen AI is an amazing tool, but it doesn’t need to be used to solve every problem that we have. To me, it’s always about right-sizing the solution to the problem we’re trying to solve. And one of the things you just brought up was the ability to perform calculations. Anybody who’s used ChatGPT or other large language models knows that they’re not great at multiplying numbers or doing other simple math. And so we don’t ask the LLM to do math. Instead, we leverage functions or APIs from Resource Advisor, which return the data our clients are requesting in a form that’s as close to the end result as possible. And we also recognize a set of common operations that need to be performed on the resulting data set. This allows us to do things like compute sums, averages, and more without asking the LLM to do any math. It would be overkill and ill-advised to try to let a large language model do that work for us.
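
A minimal sketch of that division of labor, under invented names (the query schema and API below are not Resource Advisor's): the language model only extracts the query parameters, while ordinary code fetches the data and performs the arithmetic deterministically.

```python
# Sketch: the LLM fills in DataQuery fields from the utterance; plain code
# does the math. All names and figures are hypothetical.
from dataclasses import dataclass

@dataclass
class DataQuery:
    site: str
    commodity: str
    year: int
    operation: str  # e.g. "sum" or "average"

def fetch_monthly_values(query: DataQuery) -> list[float]:
    """Stand-in for a platform API returning monthly readings (kWh)."""
    return [1200.0, 1150.0, 1300.0]  # illustrative data only

def answer(query: DataQuery) -> float:
    """Perform the requested operation in ordinary, deterministic code."""
    values = fetch_monthly_values(query)
    if query.operation == "sum":
        return sum(values)
    if query.operation == "average":
        return sum(values) / len(values)
    raise ValueError(f"unsupported operation: {query.operation}")

print(answer(DataQuery("Louisville", "electricity", 2023, "sum")))  # 3650.0
```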

Gosia: Yeah, and then you would really get very varied replies to a question about your energy consumption in a given facility in the last quarter, right? And then, without the LLM doing the math, you can actually get a correct answer.

Jeff: Correct. We want to get deterministic solutions. And the challenge of course is large language models generally aren’t deterministic. You can ask it the same question 10 times and get different results. It’s maybe okay to phrase a solution or an answer in a different way each time, but the underlying result, the computation that needs to be performed, needs to be exact.

Gosia: Yeah. Correct. And I’m also wondering, is it possible to create visualizations of data for the purpose of reporting? So instead of just receiving a text as the answer, also receive a chart, for example?

Jeff: Yes. Resource Advisor has a very extensive set of visualization capabilities already built in, and we didn’t want to reinvent the wheel. So instead of developing our own set of visualizations within Resource Advisor Copilot, we recognize when a query can be linked to an existing visualization capability and provide access to that visualization within the copilot’s response. Additionally, as part of the response, we provide links to dashboards where that data can be explored in more detail.

Gosia: Okay, that sounds very useful. And how is it used for ESG reporting?

Jeff: So far we’ve focused on providing direct access to data streams, or data that’s captured at some regular interval, usually monthly, but occasionally constant or changing at some other frequency. This includes emissions data, which can be transformed in many ways, including being broken out by scope or pollutant. This energy and emissions data are components of our clients’ environmental data. That’s the E in ESG. Social and governance data are exciting areas that we seek to develop more in the future.

Gosia: Yeah, I see. And actually, thank you for sharing the definition of ESG as you were explaining. Is this also something that the copilot can do? Can it advise and maybe serve as an educational companion for customers? So if somebody is new to the job and maybe not yet familiar with all the acronyms or the definitions of, you know, scope one, two, and three, is it also helpful in these cases?

Jeff: Yes, and I think that’s one of the coolest use cases for Resource Advisor Copilot today. Users can access a wide array of sustainability knowledge. A common example I like to use is a user might ask, “What are the emissions at a given site?” And the response comes back and it tells them what their Scope 1, Scope 2, and Scope 3 emissions are. But the user might not be familiar with those terms. Within the same conversation, the user can ask the follow-up, “Can you define Scope 1, 2, and 3 emissions? What are the differences?” Additionally, Resource Advisor Copilot can help a user find appropriate help material when they have a question about how to do something in Resource Advisor. For example, maybe the user has forgotten how to change their password or wants to create a new site group within the system. A simple query to the copilot will point them in the right direction. In the future, we expect to have even more functionality, including giving users access to decarbonization plans, assuming they’ve opted into that kind of service.

Gosia: I see. But it makes me wonder how much work it took, and basically how you prepared the database for this kind of solution. How many definitions and additional documents did you need to make available to the application?

Jeff: The good thing is a lot of this material already existed. For example, any software platform should have robust documentation and help material. We were able to index all of that help material that already existed and connect it to the knowledge functionality of the copilot. So that was actually one of the easier things for us to do when developing this copilot.
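
One plausible shape for that indexing step, shown purely as an illustration (the article titles, model, and libraries are assumptions): embed each help article once, then serve help questions by nearest-neighbor search over the index.

```python
# Sketch: build a one-time embedding index over existing help articles,
# then answer the "help" intent with nearest-neighbor search.
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")  # illustrative model choice

help_articles = [  # invented titles standing in for real documentation
    "How to change your password in Resource Advisor",
    "Creating and managing site groups",
    "Connecting a smart meter data feed",
]
index = model.encode(help_articles, normalize_embeddings=True)  # one row per article

def best_article(question: str) -> str:
    """Return the help article nearest to the question (cosine similarity)."""
    q = model.encode([question], normalize_embeddings=True)[0]
    return help_articles[int(np.argmax(index @ q))]

print(best_article("I forgot how to make a new site group"))
```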

Gosia: I see. But we come to the basic point that is repeated across many of the episodes and conversations that we have: data is really a very important foundation for any AI application.

Jeff: I 100% agree. I’m a believer in the data-centric AI philosophy, meaning I don’t believe that you should spend all of your time building bigger and better models, but focus more on making sure that the data that underlies your solution is accurate, reliable, and 100% correct.

Data Privacy and Security

Gosia: And how about the other side of data, which is the privacy aspect? How do you ensure that the data, or the questions that are asked by the customer, are secure and not shared outside?

Jeff: Another great question. This is something we talked a lot about at the beginning of development of the copilot. We decided that we would absolutely not train our own language models on any client data. First of all, I don’t believe it’s necessary for a large language model to have any client-specific information baked in. We can leverage that type of information at runtime via techniques like retrieval-augmented generation. This is essentially a technique where, when a large language model needs specific information, you provide it just those pieces of information that are relevant to answering the question as it’s about to develop a solution. Second, doing so would require us to maintain a separate large language model for each client and potentially even every user, in order to avoid cross-contamination and ensure that users only have access to the data they’re supposed to see. Finally, the carbon footprint of training and hosting all those models would be extreme. Ultimately, it’s clear that this tactic needed to be avoided at all costs. And if I can follow up a little bit more on the security aspect, Resource Advisor already had robust user-based access controls built in. So two of us who work at Schneider Electric might not have access to the same data. I might have access only to a site in Louisville where I live. Whereas Gosia, you might have access to data across all of Europe. Those same user-based access controls are baked into the copilot so the user can only query data that they’re expected to have access to.
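
Here is a compact toy sketch of retrieval-augmented generation with access control enforced at retrieval time, in the spirit of what Jeff describes: one shared model, no client data baked in, and each user can only retrieve documents they are entitled to see. The document store, scoring, and prompt format are all invented for the example.

```python
# Sketch: RAG with user-based access control applied before retrieval.
# Documents, ACLs, and the keyword scoring are toy stand-ins.
DOCS = [
    {"text": "Scope 2 covers indirect emissions from purchased electricity.", "acl": {"*"}},
    {"text": "The Louisville site used 14.2 GWh of electricity in 2023.", "acl": {"jeff"}},
]

def retrieve(question: str, user: str, k: int = 2) -> list[str]:
    """Top-k snippets among those this user is allowed to see."""
    allowed = [d for d in DOCS if "*" in d["acl"] or user in d["acl"]]
    words = question.lower().split()
    scored = sorted(allowed, key=lambda d: -sum(w in d["text"].lower() for w in words))
    return [d["text"] for d in scored[:k]]

def build_prompt(question: str, user: str) -> str:
    """Assemble the context-grounded prompt sent to the shared LLM."""
    context = "\n".join(retrieve(question, user))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

print(build_prompt("How much electricity did Louisville use?", user="jeff"))
```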

Gosia: Okay, that sounds really great. And how do our customers react? Do you have any feedback, any stories to share with us?

Jeff: Yes. I had the opportunity to present an early version of our copilot at our Perspective conference last fall. We received a ton of excitement from attendees. One of the things they loved about the system was how conversational it was and how easy to use it was. You didn’t need to memorize specific prompts. You could ask questions in your own language, the way you speak. Furthermore, you could ask a question like, “What are our total emissions across all of our sites in 2023?” The Copilot will give you an accurate response and the user can follow up in a conversational manner asking a casual question like, “What about 2022?” The system understands that the user is looking to compare emissions across 2022 and 2023 and quickly yields a response. This type of conversation is more natural to people and makes it much easier to interact with the copilot.
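
One simple way to picture that follow-up behavior, as a toy sketch (the slot names and dict-merge approach are assumptions, not the copilot's real mechanics): parameters the new utterance leaves unfilled are carried over from the previous turn.

```python
# Sketch: "What about 2022?" inherits unfilled parameters from the prior query.
def merge_followup(previous: dict, extracted: dict) -> dict:
    """Fill gaps in the newly extracted query with values from the last turn."""
    return {**previous, **{k: v for k, v in extracted.items() if v is not None}}

prev_query = {"metric": "emissions", "scope": "all sites", "year": 2023}
follow_up  = {"metric": None, "scope": None, "year": 2022}  # from "What about 2022?"
print(merge_followup(prev_query, follow_up))
# {'metric': 'emissions', 'scope': 'all sites', 'year': 2022}
```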

Gosia: I see. And I guess that during this kind of conference you also explain a bit about how the solution works and all the security measures that you just described today with me. Because I think this is really making a difference. And what I really like about this kind of conversation is that we are taking away this level of mystery around AI, and we are really making it understandable to everyone: how exactly it works, what it does, and what kind of value it can bring. And back to your personal experience: how has the experience of delivering Resource Advisor Copilot been? How did you work with the team? How did you like the whole project?

Jeff: This has been probably the most fun, most challenging, and most rewarding project in my career. We set out to build something that not many others were experimenting with at the time. At the end of 2023, we were recognized by the Wall Street Journal as one of five Gen AI early adopters. We had the opportunity to exhibit Resource Advisor Copilot in Microsoft’s Executive Briefing Center, which highlighted innovative solutions that were leveraging large language models using Microsoft Azure. Delivering the project was great because of the excellent team that I got to work with. In addition to the amazing data scientists on my team who helped bring this to life, there were super talented software architects, developers, designers, business analysts, product owners, and business leaders who helped make this a reality. We also have to thank a variety of internal subject matter experts who rigorously tested the system, making sure it generated accurate information and provided correct responses to queries. This couldn’t have been accomplished without that wonderful collaboration, and just talking about this project more generally, we were solving new problems every day, problems that I couldn’t go to the web and say, “How did others solve this?” Because at the time we were developing this, there weren’t a ton of solutions out in the wild that looked like ours. At least none that people were writing about and sharing their experiences. So it was an opportunity to get back to my research roots a little bit and every day felt like a brainstorming session.

Gosia: It sounds like a great experience and I’m really happy that you joined Schneider for this adventure and that today we can offer this solution to our customers. And Jeff, it was really a pleasure to talk with you and to understand all that is hidden inside the copilot. I must admit that I was able to see the architecture of the solution, and I was really impressed with what you mentioned about the domain expertise that is required to make this solution work. So it’s actually not only AI expertise that is needed; you need to really be an expert on the topic. You need to understand what customers may be asking, what they want to achieve from this solution, and only then can you deliver a meaningful solution. So thank you so much for this conversation. Maybe as the last question, you could share some of your thoughts generally about the usage of AI for sustainability objectives. How do you see the future? What kind of needs do you see among the customers in this space?

Jeff: This is a really interesting question, and I can make arguments both for and against using AI if it’s done without being conscious of how it’s implemented. First of all, we all know that AI can have a very large carbon footprint, and that’s something that we need to be conscious of at all times. That said, we are going to be facing some massive challenges over the next decade around decarbonization across the globe, and data is going to be at the heart of that. I think AI will allow us to parse larger data sets, accelerate the processes of understanding carbon footprints, and understand how supply chains can impact scope 3 emissions. And so I think there are a lot of challenges that can’t be solved without AI, and it’s a balancing act that we’ll need to be very careful about. That said, I’m excited because I pay attention to every model release and I’ve seen this trend over the last year where models, specifically large language models, are getting smaller and more performant every day. And that means they will be more efficient, require less energy to run, and that means they’ll be a better choice moving forward as we use them to interpret more data and make better decisions.

Gosia: 100% correct. Thank you so much, Jeff, for being with us. Thank you for your time.

Jeff: Thank you. It’s been a pleasure.

Like what you hear?

  • Visit our AI at Scale website to discover how we transform energy management and industrial automation with artificial intelligence technologies to create more sustainable solutions.

AI at Scale Schneider Electric podcast series continues!

The first Schneider Electric podcast dedicated only to artificial intelligence is available on all streaming platforms. The AI at Scale podcast invites AI practitioners and AI experts to share their experiences, insights, and AI success stories. Through casual conversations, the show provides answers to questions such as: How do I implement AI successfully and sustainably? How do I make a real impact with AI? The AI at Scale podcast features real AI solutions and innovations and offers a sneak peek into the future.  

