Episode Transcript
[00:00:03] Speaker A: Welcome to INA Insights, where prominent leaders and influencers shaping the industrial and industrial technology sector discuss topics that are critical for executives, boards and investors. INA Insights is brought to you by Ina AI, a firm focused on working with industrial companies to make them unrivaled, segment-of-one leaders. To learn more about Ina AI, please visit our website at www.ina.ai.
[00:00:40] Speaker B: Hi everyone, welcome to another episode of Ina Insights podcast hosted by Ina AI.
Today we have with us John Belizaire, CEO of Soluna Holdings. Soluna converts excess renewable energy into computing power, developing modular, scalable green data centers for bitcoin mining, AI and scientific research.
They offer data hosting services providing power and network connectivity for cryptocurrency mining clients.
John, a serial entrepreneur and the CEO of Soluna, drives innovation in renewable energy through computing. He has founded and scaled multiple tech startups, including FirstBest and Theory Center, and contributed to Intel's digital enterprise growth. John, a very warm welcome to our podcast. We are excited to have you and look forward to talking with you, particularly about your insights on renewable computing.
[00:01:38] Speaker C: Well, thanks for having me on the show. It's a pleasure to be here and look forward to the conversation.
[00:01:44] Speaker B: Perfect. So, John, to get started, it would be great if you could tell us a bit more about what Soluna does.
[00:01:51] Speaker C: Absolutely. Here at Soluna, we're on a mission to make renewable energy the global superpower. If we are successful, I should say when we are successful, renewable energy will be the primary source of energy around the world. And in that mission, we believe that can be made possible by using computing as a catalyst. What we do is we find large-scale, so-called utility-scale power plants, either hydro, solar, or wind, that have excess unused power because they're intermittent resources, and the grid wasn't designed to use those types of resources in a major way. So sometimes the grid sees excess energy being produced by these power plants, and those power plants have to shut down. There's a little-known secret that about 30 to 40 percent of the energy of some of these power plants never actually makes it to the grid. And that can have catastrophic financial implications for those power projects.
And so what we do is we build data centers that are co-located with those power plants, interconnected to the power plant and to the grid at the same time. And we turn that wasted energy into global computing resources.
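To put a rough number on the curtailment economics John describes, here is a back-of-the-envelope sketch in Python. The 30 percent curtailment rate is the low end of the figure he cites; the plant size, capacity factor, and wholesale price are purely hypothetical illustrations, not Soluna or customer figures.

```python
# Back-of-the-envelope curtailment economics for a utility-scale wind farm.
# All inputs below are hypothetical illustrations, not figures from the episode
# except the curtailment rate, which is the low end of the 30-40% John cites.

nameplate_mw = 100        # hypothetical utility-scale wind farm capacity
capacity_factor = 0.35    # hypothetical fraction of nameplate actually generated
curtailment_rate = 0.30   # 30% of generation never reaches the grid
wholesale_price = 40.0    # hypothetical $/MWh

annual_generation_mwh = nameplate_mw * capacity_factor * 8760  # hours/year
curtailed_mwh = annual_generation_mwh * curtailment_rate
lost_revenue = curtailed_mwh * wholesale_price

print(f"Curtailed energy: {curtailed_mwh:,.0f} MWh/year")
print(f"Lost revenue:     ${lost_revenue:,.0f}/year")
```

Even under these modest assumptions, roughly 92,000 MWh and several million dollars of revenue per year never reach the grid, which is the "catastrophic financial implication" a co-located, flexible consumer of last resort can monetize instead.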
As you said at the beginning of the podcast, we do this for crypto mining customers. They bring their machines to our data centers, plug them in, and they can mine cryptocurrency at favorable costs with the best customer experience.
And now we've shifted into also doing this for AI and generative AI workloads, so customers can bring their workloads to our data sites and perform those workloads in a much more sustainable, scalable and cost effective way.
So that's what we do as a company. We're essentially remaking the relationship between energy and computing by essentially fusing them together in a convergence, if you will, to help shape the future of both.
[00:04:06] Speaker B: That's very interesting, John. So I imagine that there are other companies trying to harness excess renewable energy like you are and offering it for computing. What, in your opinion, sets Soluna apart from some of them?
[00:04:24] Speaker C: You are right. There are some other players out there, I'll call them industry peers, that are doing similar things, converting stranded energy into computing, locating their data centers close to renewable power plants to benefit from the green electrons and the low power cost.
The difference, I would say, comes down to a couple of factors between us and our partners and peers out there.
The first is we are, as far as I know, the only company that is truly doing behind-the-meter co-location of those data centers. In other words, we sit right at the power plant, not near the power plant somewhere down the line on the grid, where we'd be consuming energy from one or more power plants. This allows us to structure a deal that directly benefits a specific resource. So if you've got a power plant that has this issue and you want to make sure that curtailment is significantly reduced, then you do need a local consumer of that power that can monetize it for you. By co-locating with the power plant, we solve that problem. Other peers will be close to the plant and build their own substation to essentially get some of that excess, but that doesn't really solve the problem at that location.
The other thing we do that's different is that, because we're integrated with the resource, we actually form a very powerful solution for the grid, because now that resource is either a consumer or a producer of power, and depending on what's happening with the data center, it can deliver more or less energy to the grid.
And that's important because the grid essentially needs sort of a retroactive upgrade, if you will, of capabilities or resources that can help it to perform better in these intermittent environments.
Now there's also the data centers themselves.
We have developed a series of innovations in our data centers that allows us to build a very large power consumer in a very small space. They're modular in design and can be combined very easily to scale up to a very large facility. That modular design allows the data center to be very flexible, so we can flex up and down under different scenarios, and we can do it quite quickly.
We have all sorts of thermodynamic innovations that allow us to live in these remote environments, which enables the company to deliver a host of different types of applications, especially next-generation applications. And then we've built an advanced piece of software that senses what's happening with the power plant, the grid, and the weather to determine how big or how small that data center can be, to enable the benefits that we're bringing to the power plant.
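The core idea behind that sensing software, sizing the data center load to the plant's surplus so the grid always gets first claim, can be sketched in a few lines. This is a minimal, hypothetical illustration of the control logic; the function name, inputs, and thresholds are assumptions, and Soluna's actual software is proprietary and far more sophisticated.

```python
def target_datacenter_mw(plant_output_mw: float,
                         grid_demand_mw: float,
                         max_datacenter_mw: float) -> float:
    """Size the co-located data center load to the plant's surplus.

    The grid takes what it wants first; the data center absorbs the
    remainder, up to its own capacity, so the plant avoids curtailment.
    Hypothetical sketch, not Soluna's actual control software.
    """
    surplus = max(plant_output_mw - grid_demand_mw, 0.0)
    return min(surplus, max_datacenter_mw)

# Windy night, low grid demand: flex up and absorb the surplus.
print(target_datacenter_mw(80.0, 30.0, 40.0))   # 40.0 (capped by data center capacity)

# Calm day, grid wants everything the plant makes: flex down to zero.
print(target_datacenter_mw(10.0, 30.0, 40.0))   # 0.0
```

A real controller would also fold in weather forecasts, power prices, and ramp-rate limits, but the flex-up/flex-down behavior John describes reduces to this kind of surplus-following decision made continuously.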
So, in a sense, we're solving the hardest problems, placing the data center in the most challenging location and integrating it directly to the power plant to bring a whole host of benefits fundamentally to the grid and ultimately to our downstream customers. That's the key difference between us and our peers. We're fundamentally trying to accelerate the growth of renewables.
Some of our peers are primarily focused on getting access to cheap power.
[00:08:09] Speaker B: So, like you said, John, this co-location is really the differentiating factor. But if I think about some of the downstream customers, for them it may not be the most preferred location for their data centers. So how do you communicate the value proposition to them?
[00:08:30] Speaker C: That's a great question. What I find with innovation is that innovation is hard, actually. You need a tremendous amount of conviction to be in the innovation business. About six years ago, people told us that integrating data centers or computing with renewables was impossible. There are all sorts of problems you need to solve: the wind won't work properly, you won't have enough uptime. And today, we are among the leaders, as I said, in behind-the-meter, bitcoin-related data centers and hosting, and flexible, grid-responsive data center design is now a thing. When we look at the next generation of computing, we're now saying that AI can in fact be done in these remote locations. In fact, we're going as far as to say that AI should be done there. For it to reach its true potential, we need more sustainable and scalable ways to deliver this technology to the enterprise, because the enterprise will be the primary beneficiary.
And that means that while customers might not expect their computing to happen in these remote locations, getting the grand benefits without creating more problems from a climate perspective will ultimately require it to be there. We're saying that because we are investing in innovations and lateral thinking to solve all the hard engineering problems you need to solve to put data centers near renewable resources. One, you're going to need to solve for the thermodynamics and the management of cooling, or mechanical cooling, of this equipment. Two, you're going to need to find a way to provide massive amounts of energy and high energy density for these assets. Three, you're going to need to deliver connectivity to that location that has never really been built out before.
Four, you're going to need to develop a way to integrate with that renewable resource and the grid and ensure that you can deliver the same level of uptime that you can in other locations. Five, you're going to need to solve for the environmental footprint of that data center. We need to have the carbon footprint go down, not up, by moving it to that location. So that means reducing water usage and looking at different ways of integrating cooling, using higher power density versus evaporative cooling and that sort of thing. All of that is on our docket as a company to deliver to this big new wave of transformation in AI.

The role that we want to play is to be the innovator that can help deliver a series of data centers, distributed around the country and ultimately the world, that provide powerful resources for enterprises looking to put their own proprietary data into a data center, know where that is, have complete transparency, and have the ability to support all of their AI workloads, specifically training and tuning. At the same time, they can feel comfortable that their ESG goals will still be on track, the cost of their training will still be within reasonable levels, and their ability to grow and make this a core part of their business will also be supported. And finally, all of that has to be integrated into their existing footprint and cloud relationships, et cetera. As you can see, there's a lot to do, and we are stepping up to the challenge as a company to go after that. Because if we can solve these challenging things, we can create a tremendous amount of value. But more importantly, we can help to accelerate the journey that we're on to ultimately fight climate change.
[00:12:45] Speaker B: Understood. And John, in terms of your focus on crypto and bitcoin mining, and now AI, what have been some of the drivers that made you choose these segments for your downstream customer focus?
[00:13:03] Speaker C: The key insight that drove our design and our approach is really around the fact that you're living in an environment where power may be intermittent. It's not continuous power. We asked ourselves: what types of applications can live in that environment? It turns out, if you create a map of all the different applications out there, you really have two core types. One is real-time applications: something like your Netflix application or your ERP system; email, for example, could sort of be considered a real-time application these days. You sort of expect the email to hit somebody on the other side fairly quickly. Gaming might be another example, et cetera. Then there is a whole host of other applications that are more batchable in nature. You can run them for a period of time, they're stateless, and you can pause them and return to the process, or you can perform the process at a specific time during the day; they don't have synchronous elements to them. In high performance computing, there's a whole series of these types of applications. And in fact, on the AI side, training and tuning are very much those types of applications. Yes, people will tell me that you really don't ever want to turn your training run off, because it's so expensive and the machines are so expensive.
We heard the same thing for bitcoin.
It's a very expensive capex to purchase these machines, and they need to be on 24/7. Actually, they don't. Bitcoin's design is a very resilient design. When you participate in the bitcoin network, you can enter and exit and participate in batches, if you will, and still participate sufficiently to earn back all of the money for your machines in a reasonable amount of time, if your power cost is also reasonable.

So what we're saying is: take these types of applications, move them to these energy-rich locations, and run them in a way that goes from 24/7 operation down to, say, four nines of uptime instead of five.
That small loss in uptime can significantly drop the capex you spend to support the redundancy of that data center. That's one. Two, the location benefits drive your cost down from the power footprint perspective. Applications like bitcoin and AI use an extremely large amount of energy, which can be harmful to the planet unless we use clean energy. So placing them in those locations not only gives you the sustainability benefits, it greatly reduces your cost. Well, guess what? You can now deliver the same experience to the customer at a significantly lower cost, with the same level of reliability, and you haven't really experienced any loss. Look, if you're running training for a specific model and it runs three months versus two and three-quarter months, it's not going to affect you that much in terms of your experience and your time, but it will have a significant effect on the cost, and it will have a significant effect on the environmental footprint of that process. And that's really what we're focused on. The insight that we had is that batchable computing is a distributed process that can be done anywhere. And if it can be done anywhere, then we can start to bring it closer and closer to the electrons that ultimately should be powering it. And by doing that, we will actually bring financial benefit to those types of projects, which will help us to build more of those projects over time.
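The nines arithmetic and the training-time example above are easy to verify numerically. This short sketch computes the downtime implied by four versus five nines, and how a hypothetical effective availability stretches a training run; the 0.917 availability figure is an assumption chosen to reproduce the 2.75-versus-3-month example, not a Soluna number.

```python
def downtime_hours_per_year(nines: int) -> float:
    """Annual downtime implied by 'N nines' of uptime (8,760 hours/year)."""
    return 10 ** (-nines) * 8760

# Dropping from five nines to four nines costs under an hour a year.
print(f"Five nines: {downtime_hours_per_year(5):.2f} hours of downtime/year")
print(f"Four nines: {downtime_hours_per_year(4):.2f} hours of downtime/year")

# The training-run example: a job that takes 2.75 months at continuous
# power, run at a hypothetical ~92% effective availability at a remote site.
baseline_months = 2.75
effective_availability = 0.917   # hypothetical assumption, not a Soluna figure
print(f"Elapsed time: {baseline_months / effective_availability:.2f} months")  # ~3.00
```

The point of the sketch is the asymmetry: the schedule slips by a few percent, while the redundancy capex and power cost, which scale with the guaranteed-uptime requirement, can fall much more sharply.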
That's why we picked these. You won't find Netflix or your e-commerce applications running in our data centers.
[00:17:23] Speaker B: Interesting. And how have you seen customer adoption ramp up, John, since the beginning of Soluna?
[00:17:33] Speaker C: Initially, we were conscious about how we executed on this plan.
The first thing we did was perform this operation for ourselves. We were the customer in the data center, so that we didn't expose, let's say, our true customers to all of the growing pains and technical challenges that you run into doing this. So we built a small R&D facility.
Then we scaled that to our first greenfield site. And then we scaled it again to double the size of a new greenfield site. And now we're going to double it again. Through that doubling effect, we got closer and closer to customers, and larger and larger customers. We're now the leading provider of hosting to some of the most successful publicly traded bitcoin mining companies.
And we're very maniacal about customer service and the experience that they have.
What we have found is that all of our theory and thesis around what you should expect by placing this computing in these locations, the cost benefits, the uptime, and the technology you need, has proven true, and it has grown our business over time. Customers are delighted. They're like, wow, this actually works. They were very skeptical at first: we don't know if it really works for us to put machines in these locations. It turns out that, done the right way, it can have materially positive effects on those customers. And so that growth has happened.

We're now launching into the AI space, and we're doing it also in a measured and careful way. We're starting with a partner in an existing renewable-energy-backed facility with a small cluster of AI GPUs. And we're targeting enterprises that are just starting with small workgroups and dipping their toe into this AI space. People are starting to get that thinking about sustainability from day one has benefits: it's easier to get approved internally, they can scale it, and so forth. And so we're attracting some very exciting customers in the financial services and biotech space, startup AI labs, et cetera. The goal is to keep scaling that as we build out our newly designed facilities like Helix, which basically incorporates all of the things we talked about earlier: zero water usage, modular design, a closed-loop cooling and thermodynamic environment, and a scalable design directly integrated into renewables. Everything we do at Soluna is very intentional, because it allows us to reduce the risk of our execution and then scale all of our learnings over time to reach our goal of being a leader in the space.
[00:20:29] Speaker B: Got it. And John, as you're doing this at Soluna, ramping up at that pace, I'm sure there are some challenges that you're facing. Can you talk about some of them?
[00:20:39] Speaker C: The big challenges we've faced in the last, let's say, twelve months have been around supply chain: how do you get some of the very scarce electrical equipment you need to build out these sites? We want to go faster, and so managing access to that supply chain has been a challenge for us and other folks in the industry. The other challenge is education.
To your point, people don't readily imagine that their computing would happen in these locations. Where is your data center? You're saying it's not in Virginia? We have to educate them on why they should not be concerned, or, if they're concerned, here are all the mitigants to the concerns they may have.
And I think we're doing a great job of that education. The last thing is the process of getting access to these renewable energy plants. It takes time and a lot of work to get all of the approvals you need, and also to educate the grid operators: you're not putting a battery back there, you're putting a data center. We don't have any models that say you can put data centers behind renewables; that's new. Although they're saying that less these days, because we've proven it can be done, it's still an education that needs to happen.
For me personally, that's no surprise. For the last 25 years or so that I've been in the business of entrepreneurship and innovation, 99% of the work is educating or reeducating people about what the future looks like and convincing them that that future is here today.
And so we've become really good at creating content and having conversations like this that can be distributed out there to help people reframe or re-ask the question: is that still true? Is it still true that only batteries can solve this problem, or that only transmission can solve the problem of wasted energy? What if you brought load to generation?
Then it's, wow, I didn't think about that. And that has helped us a long way over the years.
[00:22:58] Speaker B: Speaking of the future, John, it seems at Soluna you have brought the two right things together: renewable energy and a focus on AI, crypto, and bitcoin mining. There's a lot of focus on renewable energy, given all of the climate change talk, and the popularity of generative AI and crypto just keeps increasing as we speak. So what do you see going forward as the future for Soluna and this industry?
[00:23:30] Speaker C: I have to say I'm very optimistic and hopeful about the future of Soluna. We've been at this for about six years at the time we're talking today, and it hasn't been easy, but there's been a tremendous amount of learning throughout that time. We've had to overcome a series of challenges, and through that pain, we've learned. I always say pain is where the learning happens. And now we can sit at this point and look at the future. We are not without challenges ahead; we always have challenges ahead as we execute on our vision and our plan.
But it's so clear right now, it's so clear, that renewable energy and computing are going to be integrated. Convergence is going to happen, where you actually won't think of energy and computing as two separate things, but as one thing. They're integrating to a point where they will reshape the future of computing into something we call renewable computing, where the computing resources are ubiquitous and integrated into the global power infrastructure. And it will help us to advance every single thing we've done in the power space for decades. That's the exciting thing: through this convergence, you're actually going to reshape and rethink things a bit.

I did a talk earlier this year, called AI and Convergence and the Power of AI, I think it was. And I made this reflection on an early part of my career when I was at Intel. It's funny saying this, because Intel is, to some extent, like a fallen star as we talk, but I have lots of hope that they will rise again. At the time, Andy Grove was the CEO, and he made this very pivotal speech during a big consumer electronics conference in Vegas. The point he was making was: here we are, we have built a company that is helping to shape the personal computer space, but computers are still seen as just personal calculators or productivity tools. I think they can be more. If we integrate them with telecommunications, converge them, essentially fuse the two, then you won't actually know the difference between the two, and computing will become a collaboration tool and a global connection tool.
So if you look at that, that was over 25 years ago. And now, if you told someone that the computer is not a collaboration tool, that we don't connect with each other through computing, they would laugh at you. That's incredible vision. And that was the first time I saw the concept of convergence. The second time, and the most important time, I'm seeing it in a company that I'm involved with: this convergence of renewables and computing. Renewable computing is going to be a thing, and we won't get the reaction of, what is that, I don't understand what you mean. Of course computing is integrated with energy. That's just how it is.
One thing I want to say is, if you look at every single innovation over the last decade or so, maybe two decades, it's all been driven by convergence: the convergence of computing and the Internet, the convergence of connectivity and data to form social networks, the convergence of big data and computing that is really driving the growth of AI. Convergence is actually a very powerful driver of innovation and ubiquity. And when I look at the future, that's what I see for the space. AI will be integrated into business. It will be a central part of business, and it will be powered by renewable computing.
[00:27:43] Speaker B: Thank you so much, John. It was a lovely conversation, and it was very exciting to hear about all of the great stuff that you're doing at Soluna. This is one of the stories we will be very excited to keep track of as things shape up in the future.
[00:28:00] Speaker C: Thank you so much for having me on the show and it was a pleasure talking about our vision. I'm always, always happy to do that and I wish the best to you and the best to the world as we go through this journey together.
[00:28:13] Speaker B: Awesome. Thank you.
[00:28:21] Speaker A: Thanks for listening to Ina Insights. Please visit ina.ai for more podcasts, publications and events on developments shaping the industrial and industrial technology sector.