The Actionable Futurist® Podcast

S5 Episode 26: The Transformative Impact of Edge Computing with Mark Swinson from Red Hat

October 30, 2023 Chief Futurist - The Actionable Futurist® Andrew Grill speaks with Mark Swinson from Red Hat Season 5 Episode 26

Curious about edge computing? Want to understand why it's the next big thing in the world of IT?

My conversation with Mark Swinson, an enterprise IT Automation Sales Specialist at Red Hat, might just be the discussion you're looking for.

Mark takes us on a deep dive into the world of edge computing, discussing its benefits, applications, and the crucial role of open-source projects.

We navigate the diverse applications of edge computing, exploring its transformative impact in sectors like retail, autonomous driving, and more.

Mark also enlightens us on the significant role of AI and Kubernetes in shaping the edge computing landscape.

Our conversation also touches upon the unique challenges in edge computing and why data security is paramount in this field.

More on Mark
Mark on LinkedIn

Resources Mentioned
Red Hat Connect London  7 November 2023
The Age of AI - Henry Kissinger, Eric Schmidt, Daniel Huttenlocher


Your Host: Actionable Futurist® & Chief Futurist Andrew Grill
For more on Andrew - what he speaks about and recent talks, please visit ActionableFuturist.com

Andrew's Social Channels
Andrew on LinkedIn
@AndrewGrill on Twitter
@Andrew.Grill on Instagram
Keynote speeches here
Pre-order Andrew's upcoming book - Digitally Curious

Speaker 2:

Welcome to the Actionable Futurist podcast, a show all about the near-term future, with practical and actionable advice from a range of global experts to help you stay ahead of the curve. Every episode answers the question "What's the future of...?" with voices and opinions that need to be heard. Your host is international keynote speaker and Actionable Futurist, Andrew Grill.

Speaker 1:

It's fair to say that data goes to work in challenging and unlikely places. Places like the International Space Station, connected vehicles, factory floors, ships at sea and the neighbourhood pharmacy. Data might have traditionally belonged in a data centre or in the cloud, but many important decisions need to happen out in the field, at the edge of a network. Today's podcast will cover edge computing, a relatively new branch of cloud computing that promises to deliver insights and experiences at the moment they're needed, right where they're needed. To unpack the opportunity and promise of edge computing, I'm delighted to welcome Mark Swinson, Enterprise IT Automation Sales Specialist at Red Hat, to look at the uses of edge computing and how Red Hat is making edge a reality. Welcome, Mark.

Speaker 3:

Hi, Andrew.

Speaker 1:

Fantastic to have you on the podcast. So, for our podcast audience, how do you define edge computing in simple terms?

Speaker 3:

I think edge computing can best be summed up by saying that it's based on the premise that, instead of sending data to a centralised location like a server or the cloud for processing, the processing is done on a device that's closer to the source, and sometimes to the user, the consumer of the data.

Speaker 3:

So this can be a computer or a device located at the edge of a network. One example would be an organisation we're working with that has a lot of assets all around the UK. They have built-in control systems on those sites, connected over a separate network, and they want to do some monitoring to look for any security issues that might cause concern and need investigation. So they've implemented a monitoring system that looks for patterns of unusual behaviour and will generate an alert. But there's very limited bandwidth back to headquarters, so the monitoring solution effectively has to run disconnected most of the time. The other reason you come across edge use cases is data residency, where legislation says that data may not leave the country: if it were transferred to the cloud, it could be inadvertently or maliciously moved into another jurisdiction.

Speaker 1:

Now, I know a lot about Red Hat because I've been around the industry for a while and I used to be at IBM. But it'd also be great if you'd explain more about Red Hat and where they play in the open source community and the broader computing ecosystem.

Speaker 3:

Red Hat is a little bit unusual, because all of the development that Red Hat does is done in the upstream communities. That means we're supplying engineers into those communities, the teams working on open source projects, and the benefit of that, obviously, is that we're in sync with the community. It's a good way of having an ear to the ground, you could say, on what's interesting to users, the open source community and the market at large.

Speaker 3:

We basically collect, curate, integrate, test and harden multiple projects from upstream communities into products that are suitable for enterprises to rely on. And because our code is open, our customers have flexibility and freedom of choice, and they can see where we're going: they can see the enhancement requests, they can see the bugs that have been lodged. Our involvement also means we have some influence, so we can help shape the direction of open source projects to reflect the needs of our customers and the broader community. So really, we try to be open in everything we do. There's no hidden agenda. Customers can see what they're going to get, they can try it, they can work with us, and we encourage collaboration in an ecosystem around our products. This has served us well over the last 30 years, through the rise of Linux to become the predominant server OS, through IT automation with Ansible and, more recently, developing Kubernetes and the ecosystem around it.

Speaker 1:

So what are the benefits of open source when it comes to edge computing?

Speaker 3:

Well, I think it's really about this idea of transparency, because edge is an ecosystem game. It's about having a solution, and that requires putting several components together. This transparency allows the partners playing in the ecosystem to have visibility of the future directions of development and to contribute their own enhancements into the upstream projects, which then become part of Red Hat's products. An example would be working with chip designers like Arm, Intel and NVIDIA to support their latest chipsets and allow the higher levels of the software stack to exploit new features. For the software ISVs, the developers of software that runs on our platforms, it's the reassurance that there's a stable and reliable platform roadmap they can develop to.

Speaker 3:

For solution providers, you know, it's a modular approach to open source, because no single project can provide all of the parts of a solution. It's very much built around this idea of being able to mix and match components, which makes it easy to swap one part out and bring another part in. A good example would be AI/ML, which is becoming a more popular type of edge use case: there are so many components that can be useful in putting together an AI/ML solution, and most of those come from the open source world. So it's this kind of flexibility that I think is really invaluable for edge.

Speaker 1:

So let's get into the edge. Where is Red Hat playing when it comes to edge computing?

Speaker 3:

We've got a few different areas that we're really involved with when it comes to the edge. One very important one is something called the Red Hat In-Vehicle Operating System. This is taking a derivation of Linux and making it suitable to run in the automotive setting. Not surprisingly, because there's so much more software going into cars these days, you need a strong platform, and I think a lot of the manufacturers are realising that it's just not feasible to redevelop that kind of core capability over and over again. So we've got a partnership with a subsidiary of Bosch in that respect, and with others like General Motors.

Speaker 3:

We're obviously also partnering a lot with the hardware providers, the likes of NVIDIA, Intel and Arm, but also some of the players you'd see in a typical industrial setting, people like Siemens and ABB. And of course, Red Hat's had a very strong track record in the telco space: just recently we announced a partnership with Nokia around their telco infrastructure products, and in modern telco there's a lot of edge in that story as well.

Speaker 1:

So what's required to make edge a reality and what are the particular challenges that enterprises might face?

Speaker 3:

Edge is an ecosystem play, so part of the challenge is bringing together parts to create solutions, but doing that in a flexible way requires some common standards and approaches, and an openness and intent to collaborate.

Speaker 3:

To some extent, I think it's necessary to have a willingness to experiment. The use cases we can explore with edge are still evolving, and organisations and businesses are finding new ways to put edge to good use. Up to this point, edge has been treated as somewhat separate and different from the data centre and cloud, and the way those environments are managed. One of the things Red Hat is really pushing is to make that more consistent, so you can take a lot of the same capabilities that have made a big difference in cloud computing over the last five to ten years, standardising things, automating things, and apply those same principles to the edge. We also shouldn't forget security: as you put more devices out at the edge of the network, you increase the attack surface, so security has got to be one of those things you really think about with edge.

Speaker 1:

Now, everyone's talking about AI at the moment, so we've got to get the AI angle. Where can AI play a major role in edge computing?

Speaker 3:

Well, since edge is most often about collecting data close to the source, machine learning models that you can use to do inference on the data are a really important part of the edge use case portfolio. The ability to seamlessly move applications from cloud to edge is also linked to that. You typically do the training of the model in a compute-intensive environment, something like a data centre or the cloud. So that ability to pull data out from the edge, train your model, and then push the model back out to the edge, where it actually executes and you make decisions and act on the data locally, is really key. It's that flexibility to move things back and forth, data and model. There are some great examples of AI/ML being used at the edge. One would be in retail, using it to spot hot spots in the store: you've got a big queue of customers, you need to deploy some extra staff.

Speaker 3:

Autonomous driving is another classic one. Visual inspection too, looking at the quality of items as they come off the manufacturing line: spotting things early and getting into remediation saves a lot of money. If you do that before you've assembled a complete car, it's a lot cheaper than having to fix it as it comes off the end of the production line. And then also augmented reality, so AI supporting decision-making with engineers in the field, guiding actions based on contextual information. All of those fundamentally rely on AI/ML capabilities.
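Mark's train-centrally, infer-locally pattern can be sketched in a few lines of Python. This is a hypothetical illustration, not Red Hat code: the sensor values, threshold and "model" are invented stand-ins for a real ML pipeline. Parameters are learned where the data is aggregated, serialised, and then used for inference on the edge device with no round trip to the cloud.

```python
import json
import statistics

# --- Central site (cloud/data centre): "train" a simple anomaly model ---
# Hypothetical sensor readings previously collected from the edge.
training_data = [20.1, 19.8, 20.4, 20.0, 19.9, 20.2, 20.3]

mean = statistics.mean(training_data)
stdev = statistics.stdev(training_data)

# The "model" here is just its parameters, serialised for distribution;
# in practice this would be a trained ML artefact pushed out to devices.
model = json.dumps({"mean": mean, "stdev": stdev, "threshold": 3.0})

# --- Edge site: load the model and run inference locally ---
params = json.loads(model)

def is_anomaly(reading: float) -> bool:
    """Flag readings more than `threshold` standard deviations from the mean."""
    return abs(reading - params["mean"]) > params["threshold"] * params["stdev"]

print(is_anomaly(20.1))  # a normal reading
print(is_anomaly(35.0))  # clearly anomalous
```

The decision is made where the reading is taken; only the compact model, not the raw data stream, crosses the network.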

Speaker 1:

So one thing I've wondered is edge computing only for the big players, or can smaller companies benefit from edge computing?

Speaker 3:

Everybody can probably benefit from edge computing, actually. Price points for compute capability at the edge are coming down all the time, the price of sensors is falling, and the amount of different data you can collect is growing, so even a small organisation can yield some big benefits. We worked with one small craft beer bottling company that looked at reducing the wastage as they put the beer into the bottles. That's dependent on a few factors, like the amount of CO2 in the beer, the temperature, the humidity, the pressure at which it's pushed in, and they found that just by monitoring those different conditions they could reduce the wastage significantly.
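The bottling example boils down to comparing live readings against acceptable ranges. A minimal sketch in Python, with invented limits rather than real brewery figures:

```python
# A hypothetical sketch of the kind of condition monitoring described above:
# compare live sensor readings against acceptable ranges and flag any drift
# before it turns into wastage. The limits are illustrative, not real specs.
limits = {
    "co2_volumes": (2.4, 2.8),        # dissolved CO2 in the beer
    "temperature_c": (1.0, 4.0),      # beer temperature
    "fill_pressure_bar": (2.0, 3.0),  # counter-pressure at the filler
}

def out_of_range(reading: dict) -> list:
    """Return the names of any readings outside their acceptable range."""
    alerts = []
    for name, (low, high) in limits.items():
        value = reading[name]
        if not (low <= value <= high):
            alerts.append(name)
    return alerts

good = {"co2_volumes": 2.6, "temperature_c": 2.5, "fill_pressure_bar": 2.4}
bad = {"co2_volumes": 2.6, "temperature_c": 7.2, "fill_pressure_bar": 2.4}

print(out_of_range(good))  # no alerts
print(out_of_range(bad))   # temperature has drifted
```

On a real line this loop would run on a small device next to the filler, raising an alert locally rather than streaming every reading to a central server.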

Speaker 1:

So where does edge fit with traditional cloud computing?

Speaker 3:

Edge, I think, is a widely used term, so it's a little bit hard to be specific, because different people use it in different senses. But edge has typically been quite separate from the cloud: edge has been where network connectivity is poor or expensive, or where you're dealing with large volumes of data, like high-frequency readings, or where your compute resources are limited, limited power, storage and so on, versus cloud, where the perception is of infinite resources. But I think the two should be seen as complementary and interoperable, and, as we mentioned with the AI examples, it's got to be connected as well. So we're now seeing edge and cloud as more of a continuum, served by a consistent set of capabilities that make integration and management of the multiple parts of a solution possible and even easy, just as we think about cloud deployments and cloud management as being easy these days. We talk about platforms as serving both the cloud and the edge use cases.

Speaker 1:

So are there different types of edge computing?

Speaker 3:

It comes back to this point about edge being a very widely used term. If we think about it simplistically, there's edge at the edge of the network: if you're talking to a telco, they'll have use cases where they'd say edge is at the edge of their network. If you're talking to a car manufacturer, they'd consider edge to be embedded in the car, the vehicle. You might consider your smartphone an edge device. And then you've got the typical factory floor setting, where you've probably got a fairly significant server rack sitting in the factory somewhere, maybe connecting to some smaller, simpler gateways that then connect to sensors. So yeah, there's a whole range of different specific patterns you see when it comes to edge.

Speaker 1:

And one of the reasons I have this podcast is to really amplify some of the concepts that are emerging, like edge computing, to a broader audience. So for my business leaders listening to the podcast today, what do they need to do to understand the potential benefits of edge computing, and how do they need to work with their existing IT teams to make this a reality?

Speaker 3:

I think there's a lot we can do with edge that we haven't really got to grips with yet. So the first thing I would say to the business person is: recognise that you don't know what you don't know. There's data out there about the way things are operating that will help you be more productive, more efficient, less wasteful and so on. In that respect, you should expect that deciding on or evolving an edge strategy is likely to involve multiple stakeholders in the organisation. It's not just the technologists; it's also the business, it's finance, it's legal. If you're going down this journey, think about assembling that sort of broad team. The other thing is that the potential is to have significant impact in areas we take for granted today. There's still significant wastage in, for example, our food supply chain, or in agriculture in the way we use chemicals and fertilisers. Optimising any one of those decisions can be a small change, but magnified over a large number of occasions it will make a big difference. Keep in mind that there are obligations that come with collecting more data, security, privacy and things like that, but also think about what potential benefits you can get from collecting that data. The other thing I would say is that there's a cost model consideration here as well.

Speaker 3:

Typically, when you think about cloud, it's a pay-as-you-go model with consumption-based pricing. With edge, it tends to be more of a sunk cost: you make a capital investment, deploy devices, and the benefits accrue while the costs stay flat. So that's a different kind of approach to what we've become used to with cloud. And lastly, I would say we've come to expect the speed at which you can get the benefits with cloud computing. It's easy to stand up a new application or a new website and get hundreds, thousands, millions of users, perhaps in a very short time. Edge has more of a physical and often geographically dispersed nature to it, so it can just take longer to get that payback. So those are just some things to think about as you look ahead and consider where you could put edge to use in your business.

Speaker 1:

Now, everyone talks about skills shortages as these new technologies come to market, and I'm sure people are scrambling to find generative AI programmers and the like. But what new skills are needed to unlock the potential of edge computing?

Speaker 3:

In industrial scenarios, we see edge as where IT, the data centre disciplines, meets OT, or operational technology, and these are two worlds that have traditionally been quite separate. We're seeing the lines between IT and OT blurring, so I think one of the areas where we need to develop skills is a mutual understanding of the other side, if you like. By its nature, edge is remote, so it can involve some specialised connectivity options. You come across things like low-power wide-area networking, LoRa, private 5G, and an understanding of the options in those areas is important as well. Essentially, I think the skills of modern cloud-native computing should apply to edge as much as they apply to cloud today. As we said earlier, it really should be seen as part of a continuum between edge and the cloud.

Speaker 1:

I'm glad you mentioned LoRa and private 5G. I've had guests on both those subjects on the podcast previously, so it looks like we're covering all the new emerging technologies. Have you seen some edge cases, pun intended, where the limits of the technology are being pushed?

Speaker 3:

A favourite of mine is that we've got Red Hat's technology running on the International Space Station. We've seen the same kind of technology running in CubeSat environments as well, allowing people to put experiments up onto these low-cost satellites. I've seen projects where we've been working with the MoD here and the DoD in the US to put more cloud computing capability out there at the edge, and some of that stuff, when you get out into the field, is pretty impressive in terms of what they aim to have as a capability. As I think we've seen recently with the war in Ukraine, the ability to collect and coordinate data is so important in modern warfare. Other things? Smart agriculture, again, has a lot we can look at in terms of improving our crop yields and the quality of the produce, and reducing the impact on the environment from fertilisers and pesticides.

Speaker 1:

I did a podcast with David Keane, CEO of Aurrigo, who performed a trial of autonomous vehicles in Cambridge a couple of years ago, and they worked with Vodafone to deploy edge servers at their 5G base stations. So is this a typical use case, and how can it be scaled across tens of thousands of base stations across the country, or across the globe?

Speaker 3:

So I wouldn't say this is typical, I think it's still a bit too early for that, but I think it is indicative of what we're going to see coming very soon. As we live in areas of high population density and there's more congestion in our transport networks, the ability to have real-time insight into conditions and make decisions to optimise will be increasingly important. We simply can't continue to just add capacity. So this type of low-latency, high-speed connectivity and decision-making will be absolutely critical. And as these use cases continue to emerge, we'll need the ability to easily and scalably deploy workloads, and this starts to look like the highly automated deployment and scaling that we've come to expect in the cloud.

Speaker 1:

So your website says that open standards and creative thinking can help you craft an edge strategy that meets your current needs and adapts to the future. So how does creative thinking work alongside designing an edge solution?

Speaker 3:

Well, I think edge is still quite a nascent area, so any investment in something new has to show a return. And in my experience, once an organisation starts to deploy edge solutions, they begin to realise that there are adjacent possibilities, use cases they hadn't thought of. So I think it's very important to approach edge with an open and curious mindset, because the outcomes you thought you were going to get are often not the only ones that can be realised. That's where the creative thinking comes in.

Speaker 1:

And how does edge solve the challenges inherent in a distributed IT deployment?

Speaker 3:

Well, there are different aspects to deploying edge workloads, but the key ingredient is to take the same approach as cloud native, and that's really standardisation and automation. We've got a solution called Ansible Automation Platform, which automates the management and maintenance of pretty much every type of IT asset. It's very flexible: you can automate almost everything with Ansible because it's modular and extensible, it has human-readable definitions, and it's idempotent. It's something you can run again and again, and it will put the IT resource into the configuration you want without you having to worry about rollback and things like that. And it's designed to encourage sharing and reuse of these automation assets.
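The idempotency Mark describes can be illustrated with a toy desired-state function in Python. This is not Ansible itself, just a sketch of the property it relies on: applying the same definition repeatedly converges the target to the desired configuration, making changes only where needed.

```python
# Toy illustration of idempotent configuration management: the desired
# state is declared once, and apply() can run any number of times.
# After the first convergence, re-running it makes no further changes.
# Keys and values are hypothetical, not a real host configuration.
desired_state = {"ntp_enabled": True, "ssh_port": 22, "firewall": "on"}

def apply(desired: dict, actual: dict) -> list:
    """Bring `actual` in line with `desired`; return the changes made."""
    changes = []
    for key, value in desired.items():
        if actual.get(key) != value:
            actual[key] = value
            changes.append(f"set {key} = {value}")
    return changes

# A server that has drifted from the desired configuration.
server = {"ssh_port": 2222, "firewall": "off"}

first_run = apply(desired_state, server)   # makes three changes
second_run = apply(desired_state, server)  # already converged: no changes

print(first_run)
print(second_run)  # empty: safe to run again and again
```

That "safe to re-run" property is what lets a fleet of distributed edge systems be kept continuously in sync without bespoke rollback logic.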

Speaker 3:

So with edge, you're going to have a large number of distributed systems, targets to keep up to date, so you need something like that which can continuously keep that network in sync and up to date. The other thing we've got is something that can be used to trigger a remediation if a problem is detected. That makes that network of distributed systems a bit more autonomous, self-healing if you like, and that takes you into things like the security considerations: making sure everything is correctly configured, and disabling things quickly if that's necessary, if you're detecting an attack or a vulnerability. It's very important to maintain that security posture. But it's also about being able to effectively distribute the work as well, figuring out where the right place is, and respecting the network segmentation, which is part of the security architecture too.

Speaker 1:

Is the explosion of the Internet of Things, or IoT, likely to be one of the drivers behind the growth of edge computing?

Speaker 3:

Yeah, definitely. It essentially comes back to this point about making decisions close to the source and the consumer of the data. IoT, as a huge source of real-time data from sensors that are typically pretty simple and have no data persistence, is a really important use case for edge computing.

Speaker 1:

So you mentioned a few industries you're working with. Are there other industries we haven't mentioned that can really benefit from edge computing?

Speaker 3:

We've talked about retail, we've talked about automotive, and you find there are things in almost every sector. Healthcare is another great example. Anywhere you've got high-value assets that you need to track and maintain, you can save a lot of money. So if you've got equipment that lives in a hospital and is moved around from room to room, just being able to understand where that particular piece of equipment is and get your hands on it quickly can mean that patients get treated more rapidly.

Speaker 3:

In the utility space, with the move to more sustainable energy generation, we're seeing a significant shift in the pattern there. It's no longer about these big centralised generating plants; now you've got solar panels, you've got wind turbines. So the grid needs to evolve to be managed at a more modular, regional level. We've seen some really interesting work done with utilities around understanding the flow of power through the grid at that local level, almost down to the street level.

Speaker 1:

I read a great phrase everything should just work everywhere. So how does edge computing make this a reality?

Speaker 3:

It comes back to consistency. Across Red Hat's portfolio, we work really hard to make sure our solutions work in a consistent way, regardless of where they're deployed. We've got a Kubernetes-based solution that runs in the cloud, runs on premise, runs on a modest-sized server, and will even run on a very small server. Of course, you're compromising on things like performance and disaster recovery resilience, but essentially it gives you the same experience of a Kubernetes environment. And the nice thing is that you can use one control plane, one management plane, to give you a unified view across all of those deployment options, right from the cloud down to a very small footprint server.

Speaker 1:

Now, you've mentioned Kubernetes a number of times. For our business audience who haven't heard that term, what does it mean? How would you explain what it is and what it does to my mum?

Speaker 3:

I think the key thing with Kubernetes is, first of all, to understand that it's all about this idea of orchestrating containers, and a container is a way of packaging up a piece of an application.

Speaker 3:

It might be an entire application, but very often it's a piece of an application, packaged in a way that's very standardised, in the same way that we've done with shipping.

Speaker 3:

It's put inside a standard-sized container, which means you can load it onto a ship in a very efficient way. And this is the same idea applied to applications. You put your application in a container and then use Kubernetes as the director, which says this application, or this container, needs to run on this piece of hardware or this server, and it needs to have these characteristics of performance and responsiveness. What that means is that you can rely on something like Kubernetes to deploy more resources to respond to an increase in demand and then scale things back as that demand drops. So it's quite a powerful idea, especially in this era of cloud. As I mentioned earlier, when you've got almost infinite resources in the cloud, having something that can quickly and easily take advantage of those resources, which is what Kubernetes does, is really important and really powerful. And doing it in a standard, automated way makes life a lot easier for everybody.
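The scale-up, scale-down idea can be sketched as a toy replica calculator in Python. Kubernetes' actual Horizontal Pod Autoscaler uses a more sophisticated control loop; this just shows the principle of matching the number of running copies to demand, within configured bounds.

```python
import math

# A toy sketch of the scaling idea behind Kubernetes: compare observed
# demand against the capacity of one replica and size the deployment
# accordingly, never going below a minimum or above a maximum.
def desired_replicas(requests_per_sec: float,
                     capacity_per_replica: float,
                     min_replicas: int = 1,
                     max_replicas: int = 10) -> int:
    """Return the replica count needed to serve the demand, within bounds."""
    needed = math.ceil(requests_per_sec / capacity_per_replica)
    return max(min_replicas, min(max_replicas, needed))

print(desired_replicas(50, 100))    # light load: scale down to the minimum
print(desired_replicas(450, 100))   # heavier load: scale up to meet demand
print(desired_replicas(5000, 100))  # demand spike: capped at max_replicas
```

Run in a loop against live metrics, this is the "deploy more resources as demand rises, scale back as it drops" behaviour Mark describes.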

Speaker 1:

So the 64-million dollar, pound, euro question: what's the future of edge computing, and where is Red Hat playing a part?

Speaker 3:

One of the really interesting things is AR and VR. Apple Vision Pro may be the product that tips us from a niche kind of interest in the metaverse into something more widespread, in the same way the original iPhone did. The opportunity to have more automation in vehicles and in transport is going to be really important too. It seems to me that if you had a self-driving car that could go and charge itself and then bring itself back to you, that would be a really nice place to start with autonomous driving. It doesn't need to drive down the motorway; it just needs to pop around the corner to the nearest charging spot.

Speaker 1:

So we're almost out of time and we're up to my favourite part of the show, the quick-fire round, where we learn more about our guest. iPhone or Android?

Speaker 3:

iPhone.

Speaker 1:

Window or aisle?

Speaker 3:

I'm not very good at sitting still, so aisle.

Speaker 1:

In the room or in the metaverse?

Speaker 3:

In the room, but I hope the metaverse comes along soon.

Speaker 1:

Your biggest hope for this year and next?

Speaker 3:

I'd like to see a scalable breakthrough in battery technology, because I really think that energy is going to be the biggest factor in our economic prosperity and climate stability over the next three or four decades.

Speaker 1:

I wish that AI could do all of my shopping for gifts. The app you use most on your phone?

Speaker 3:

It's a bit sad, but I think it's Gmail, for work.

Speaker 1:

The best advice you've ever received?

Speaker 3:

Be present. The future is unknown and the past is only an unreliable memory.

Speaker 1:

What are you reading at the moment?

Speaker 3:

I'm reading a book called The Age of AI and Our Human Future. It's by Henry Kissinger, Eric Schmidt and Daniel Huttenlocher.

Speaker 1:

I think I need to read that. Who should I invite next onto the podcast?

Speaker 3:

I would suggest Dr Graham Spickel. Graham's a guy I used to work with at IBM a few years ago, one of the smartest people I've ever met. Since IBM, he's moved on to working in a lot of different fields. He was involved in the UK Technology Strategy Board a few years ago and is now involved with Edinburgh University.

Speaker 1:

Final quick-fire question: how do you want to be remembered?

Speaker 3:

As a problem solver.

Speaker 1:

So, as this is the Actionable Futurist podcast, what three actionable things should our audience do today to prepare for edge computing?

Speaker 3:

Start by looking for areas of inefficiency or waste in your business, areas where better visibility could yield some improvements. The second would be to find a trusted partner you can work with, one that can bring together multiple elements into a solution that addresses your particular needs and situation. And the third would be to experiment and iterate, because there's still a lot to be learned about how applying more data-based insights can improve business outcomes.

Speaker 1:

Mark, a fascinating discussion, as always. How can we find out more about you and your work?

Speaker 3:

Well, I regularly post on LinkedIn, and I also speak at events, particularly Red Hat events. For example, we've got our Summit Connect event coming up in London on the 7th of November. It'd be great to see any of our listeners there.

Speaker 1:

I'll put links to that in the show notes. Mark, thank you so much for your time. A really interesting topic, and I've learned much more about edge. I think I can probably explain it to my mum now.

Speaker 3:

Thank you, Andrew.

Speaker 2:

Thank you for listening to the Actionable Futurist podcast. You can find all of our previous shows at actionablefuturist.com, and if you like what you've heard on the show, please consider subscribing via your favourite podcast app so you never miss an episode. You can find out more about Andrew and how he helps corporates navigate a disruptive digital world with keynote speeches and C-suite workshops, delivered in person or virtually, at actionablefuturist.com. Until next time, this has been the Actionable Futurist podcast.
