Astrid talks about systems, design, and how @EnergySRE describes “the unsexy valley” of applying tech to cool new areas like the energy transition. https://twitter.com/shinynew_oz/status/1230994868144492544?s=20
The grid space is behind the curve in adopting contemporary software technology. The last ten years have seen a computing revolution: the cloud isn’t just about using other people’s computers; it’s about using effectively unlimited computing power to solve the hardest problems. This isn’t very well branded – we talk about “the cloud”, “AI”, and even “blockchain” – but these are all fragmentary perspectives on the same idea: building vast software systems composed of thousands or millions of pieces, all working together. That’s what built Google, Facebook, and the vast array of modern internet services large and small.
Each of us interacts with apps on our phone – but it’s easy to miss the thousands of interacting cloud components that make those apps work. Not all software is created equal. Technical solutions and proposals in the grid space typically discuss “algorithms” for solving various problems. But the key part of a modern distributed computing system is the system itself: connecting all the pieces, managing data flows, correcting errors, creating models, visualizing system behavior, managing distributed state and decision-making, reconciling place and time (this is harder than it sounds), managing reliability, distributing work. Only once you’ve done all that do algorithms begin to matter. People refer to “the algorithm” that runs Google.
But Google search is a system, with thousands of small algorithms inside it, each doing one specialized thing. Without the system, no algorithm is meaningful. Within the right system, very complex behavior can be made simple.
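To make the “system, not algorithm” distinction concrete, here’s a deliberately tiny, hypothetical sketch (nothing like Google’s actual design): a toy search pipeline where each stage is a small, specialized algorithm, and the interesting part is the glue that wires them together. All names and logic here are illustrative assumptions.

```python
# A toy search "system": each function is one small, specialized algorithm.
STOPWORDS = {"the", "a", "of"}

def tokenize(text):
    # one small algorithm: split text into lowercase tokens
    return text.lower().split()

def remove_stopwords(tokens):
    # another small algorithm: drop uninformative words
    return [t for t in tokens if t not in STOPWORDS]

def score(doc_tokens, query_tokens):
    # another small algorithm: count query-term overlap
    return sum(doc_tokens.count(t) for t in query_tokens)

def search(docs, query):
    # the "system": connect the pieces and rank the results
    q = remove_stopwords(tokenize(query))
    return sorted(
        docs,
        key=lambda d: score(remove_stopwords(tokenize(d)), q),
        reverse=True,
    )

docs = ["the grid of the future", "a smart meter", "grid software"]
print(search(docs, "the grid")[0])  # prints "the grid of the future"
```

No single stage here is clever; the value comes from composing them, and in a real distributed system each stage would itself be a service with its own data flows, failure modes, and state.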
Building these systems is hard, and there isn’t a good name for it (we’d call it “distributed systems engineering”). It’s the model the cloud was built to enable, and it’s a fundamental shift from how software worked before. This is why what utilities think of as “too much data to handle” isn’t, really. Smart meter data isn’t big data – it’s just too big to fit on a single machine. A profound transformation is coming in the energy space once people really grasp what’s possible. (This is also why distributed systems engineers laugh whenever anyone mentions blockchain: it’s one specific distributed design pattern – and not a very good one – and it doesn’t actually solve any of the systems problems that matter.)
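A back-of-envelope calculation shows why smart meter data is “too big for one machine” but small by cloud standards. The numbers below are illustrative assumptions, not figures from any real utility:

```python
# Back-of-envelope: how "big" is smart meter data, really?
# All inputs are assumed, round numbers for illustration.
meters = 1_000_000        # assumed: one million smart meters
intervals_per_day = 96    # assumed: 15-minute interval reads
bytes_per_reading = 50    # assumed: meter id + timestamp + kWh value

readings_per_year = meters * intervals_per_day * 365
bytes_per_year = readings_per_year * bytes_per_reading

print(f"{readings_per_year:,} readings/year")       # 35,040,000,000
print(f"{bytes_per_year / 1e12:.2f} TB/year, raw")  # 1.75 TB/year, raw
```

Tens of billions of rows won’t fit in one server’s memory, but a couple of terabytes a year is a routine workload for a cloud data pipeline – which is exactly the gap between “too much data to handle” and “a normal Tuesday” in distributed systems terms.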
This is the biggest reason Camus Energy is choosing an open source path: we can’t have a real conversation about the role of software in energy systems until we level up everyone’s expectations of software. It should be better! It should be secure! Software should actually work, solve important problems, and provide tools for managing the growing complexity of the modern grid environment. If it doesn’t do those things, it’s bad software. There is a *lot* of bad software in the grid environment.
This revolution has already come to adjacent industries – see 5G and software-defined networking in telecom – and I’m excited about the possibility of a similar shift in electricity systems. But we have a *lot* of work to do. Most grid tools aren’t built for parallelization. Researchers and startups still lack sufficient publicly available data. Most “open” standards in the grid space aren’t open (they charge for access, which limits innovation). Bringing real change will require a community. We’re excited to have begun engaging with others on open source collaborations, and we hope this becomes more of a trend across the energy industry.