Edge computing is one of the areas where we see startup vendors offering innovative solutions, enabling applications to operate where the business operates rather than where the IT team sits. This episode of the Tech Field Day podcast focuses on the melting pot of edge computing and features Guy Currier, John Osmond, Ivan McPhee, and host Alastair Cooke, all of whom attended the recent Edge Field Day in September. To accommodate the diverse and unusual locations where businesses operate, many different technologies are brought together to form the melting pot of edge computing. Containers and AI applications are coming from the massive public cloud data centres to a range of embedded computers on factory floors, industrial sites, and farm equipment. ARM CPUs, sensors, and low-power hardware accelerators are coming from mobile phones to power applications in new locations. Enterprise organizations must still control and manage data and applications across these locations and platforms. Security must be built into the edge from the beginning; edge computing often happens in an unsecured location, often with no human oversight. This melting pot of technology makes edge computing one of the most innovative parts of IT.
There are Too Many Clouds
Public cloud computing is a large part of enterprise IT alongside on-premises computing. Many organizations that took a cloud-first approach are now gaining value from on-premises private clouds and seeing their changing business needs lead to changing cloud use. This episode of the Tech Field Day podcast delves into the complexity of multiple cloud providers and features Maciej Lelusz, Jack Poller, Justin Warren, and host Alastair Cooke, all attendees at Cloud Field Day. The awareness of changing business needs is causing some rethinking of how businesses use cloud platforms, possibly moving away from cloud-vendor-specific services to bare VMs. VMs are far simpler to move from one cloud to another, or between public cloud and private cloud platforms. Over time, the market will speak, and if there are too many cloud providers, we will see mergers, acquisitions, or failures of smaller specialized cloud providers. In the meantime, choosing where to put which application for the best outcome can be a challenge for businesses.
You Don’t Need Post-Quantum Crypto Yet
With the advent of quantum computers, there is a real possibility that modern encryption will be invalidated. New standards from NIST have arrived that usher in the post-quantum era. You don’t need to implement them yet, but you do need to be familiar with them. Tom Hollingsworth is joined by JJ Minella, Drew Conry-Murray, and Alastair Cooke in this episode to discuss why post-quantum algorithms are needed, why you should be readying your enterprise to use them, and how best to plan your implementation strategy.
Network Automation Is More Than Just Tooling
The modern enterprise network automation strategy is failing. This is due in part to a collection of tools masquerading as an automation solution. In this episode, Tom Hollingsworth is joined by Scott Robohn, Bruno Wollmann, and special guest Mike Bushong of Nokia to discuss the current state of automation in the data center. They discuss how tools are often improperly incorporated as well as why organizations shouldn’t rely on just a single person or team to effect change. They also explore ideas around Nokia Event-Driven Automation (EDA), a new operations platform dedicated to solving these issues.
Data Infrastructure Is A Lot More Than Storage
The rise of AI and the importance of data to modern businesses have driven us to recognize that data matters, not storage. This episode of the Tech Field Day podcast focuses on AI data infrastructure and features Camberley Bates, Andy Banta, David Klee, and host Stephen Foskett, all of whom will be attending our AI Data Infrastructure Field Day this week. We’ve known for decades that storage solutions must provide the right access method for applications, not just performance, capacity, and reliability. Today’s enterprise storage solutions have specialized data services and interfaces to enable AI workloads, even as capacity has been driven beyond what we’ve seen in the past. Power and cooling are another critical element, since AI systems are optimized to make the most of expensive GPUs and accelerators. AI also requires extensive preparation and organization of data, as well as traceability and records of metadata for compliance and reproducibility. Another question is interfaces, with modern storage turning to object stores or even vector database interfaces rather than traditional block and file. AI is driving a profound transformation of storage and data.
AI and Cloud Demand a New Approach to Cyber Resilience featuring Commvault
As companies are exposed to more and more attackers, they’re realizing that cyber resilience is increasingly important. On this episode of the Tech Field Day Podcast, presented by Commvault, Senior Director of Product and Ecosystem Strategy Michael Stempf joins Justin Warren, Karen Lopez, and Stephen Foskett to discuss the growing challenges companies face in today’s cybersecurity landscape. As more organizations transition to a cloud-first operation, they’re recognizing the heightened exposure of their data protection strategies to global compliance mandates like DORA and SCI. Adding to this complexity is the emerging threat of AI, raising important questions about how businesses can adapt and maintain resilience in the face of these evolving risks.
Hardware Still Matters at the Edge
Hardware innovation at the edge is driven by diverse and challenging environments found outside traditional data centers. This episode of the Tech Field Day podcast features Jack Poller, Stephen Foskett, and Alastair Cooke considering the special requirements of hardware in edge computing prior to Edge Field Day this week. Edge locations, including energy, military, retail, and more, demand robust, tamper-resistant hardware that can endure harsh conditions like extreme temperatures and vibrations. This shift is fostering new hardware designs, drawing inspiration from industries like mobile technology, to support real-time data processing and AI applications. As edge computing grows, the interplay between durable hardware and adaptive software, including containerized platforms, will be crucial for maximizing efficiency and unlocking new capabilities in these dynamic environments.
AI Solves All Our Problems
Although AI can be quite useful, it seems that the promise of generative AI has led to irrational exuberance on the topic. This episode of the Tech Field Day podcast, recorded ahead of AI Field Day, features Justin Warren, Alastair Cooke, Frederic Van Haren, and Stephen Foskett considering the promises made about AI. Generative AI was so impressive that it escaped from the lab, being pushed into production before it was ready for use. We are still living with the repercussions of this decision on a daily basis, with AI assistants appearing everywhere. Many customers are already frustrated by these systems, leading to a rapid push-back against the universal use of LLM chatbots. One problem the widespread misuse of AI has solved already is the search for a driver of computer hardware and software sales, though this effect already seems to be wearing off. But once we take stock of the huge variety of tools being created, it is likely that we will have many useful new technologies to apply.
Ethernet is not Ready to Replace InfiniBand Yet
AI networking is making huge strides toward standardization, but Ethernet isn’t ready to displace the leading incumbent, InfiniBand, yet. In this episode of the Tech Field Day Podcast, Tom Hollingsworth is joined by Scott Robohn and Ray Lucchesi to discuss the state of Ethernet today and how it is continuing to improve. The guests discuss topics such as the dominance of InfiniBand, why basic Ethernet isn’t suited to latency-sensitive workloads, and how the future will improve the technology.
AI is Not a Fad
The current hype about building massive generative AI models with massive hardware investment is just one aspect of AI. This episode of the Tech Field Day podcast features Frederic Van Haren, Karen Lopez, Marian Newsome, and host Stephen Foskett taking a different perspective on the larger world of AI. Our last episode suggested that AI as it is currently being hyped is a fad, but the bigger world of AI is absolutely real. Large language models are maturing rapidly and even generative AI is getting better by the month, but we are rapidly seeing the reality of the use cases for this technology. All neural networks use patterns in historical data to infer results, so any AI engine could hallucinate. But traditional AI is much less susceptible to errors than the much-hyped generative AI models that are capturing the headlines today. AI is a tool that augments our knowledge and decision making, but it doesn’t replace human intelligence. There is a whole world of AI applications that are productive, responsible, and practical, and these are most certainly not a fad.
AI as We Know It is Just a Fad
Although AI is certain to transform society, not to mention computing, what we know of it is unlikely to last much longer. This episode of the Tech Field Day podcast brings together Glenn Dekhayser, Alastair Cooke, Allyson Klein, and Stephen Foskett to discuss the real and changing world of AI. Looking at AI infrastructure today, we see massive clusters of GPUs being deployed in the cloud and on-premises to train ever-larger language models, but how much business value do these clusters have long-term? It seems that the true transformation promised by LLMs and GenAI will be realized once models are applied across industries with RAG or fine-tuning rather than by developing new models. Ultimately, AI is a feature of a larger business process or application rather than being a product in itself. We can certainly see that AI-based applications will be transformative, but the vast investment required to build out AI infrastructure to date might never be recouped. There is a future for AI, but not the way we have been doing it to date.
AI Has A Place In Networking Operations
Generative AI tools and features are becoming an indispensable part of the way operations teams do their jobs. Tom Hollingsworth is joined by Keith Parsons, Kerry Kulp, and Ron Westfall for this episode discussing the rise of AI tools and how they are implemented. The guests talk about how AI tools should be used by teams to increase their capabilities. They also discuss where AI still has a lot of room to grow and how to avoid traps that could cause issues for stakeholders and champions.
Network as a Service is More of a Financial Model
Network-as-a-Service (NaaS) is a very popular topic in the modern enterprise. It promises a way to consume networking technologies in the same way that one would purchase cloud computing by only charging users for what they need. In this episode of the Tech Field Day podcast, Jordan Martin, Micheline Murphy, and Robb Boyd join Tom Hollingsworth as they discuss the various ways that Network-as-a-Service can be expressed in an organization. They debate the merits of the operational model versus the financial aspects and how NaaS blends into the wider industry trends.
The Mainframe is Still Going Strong
Despite the hype about modern applications, the mainframe remains central to enterprise IT and is rapidly adopting new technologies. This episode of the Tech Field Day podcast features Steven Dickens, Geoffrey Decker, and Jon Hildebrand talking to Stephen Foskett about the modern mainframe prior to the SHARE conference. The modern datacenter is rapidly adopting technologies like containerization, orchestration, and artificial intelligence, and these are coming to the mainframe world as well. And the continued importance of mainframe applications, especially in finance and transportation, makes the mainframe more important than ever. There is a tremendous career opportunity in mainframes as well, with recent grads commanding high salaries and working with exciting modern technologies. Modern mainframes run Linux natively, support OpenShift and containers, and support all of the latest languages and programming models in addition to PL/I, COBOL, Db2, and of course z/OS. We’re looking forward to bringing the latest in the mainframe space from SHARE to our audience.
Network Engineering is a Dying Profession
Network Engineering isn’t the hottest profession on the block and people have expressed concerns that the profession is going to be subsumed into other disciplines in the near future. In this episode of the Tech Field Day podcast, Tom Hollingsworth joins Andy Lapteff and Remington Loose at the table to discuss the decline in network engineering roles. They also talk about changes in perceptions as well as the industry. They close out by discussing the future outlook for roles involving network engineering.
Open Source Helps Small Businesses Modernize Applications
Open-source platforms and managed services are a huge help when it comes to modernizing applications, especially for smaller businesses. This episode of the Tech Field Day podcast, recorded at AppDev Field Day, includes Stephen Foskett and Paul Nashawaty discussing the challenges and solutions for small businesses in modernizing applications.
On-Premises Networks Need to Work Like Cloud Networks
On-premises networks are still very common for specialized applications and need to adopt cloud network operational models. In this episode, Tom Hollingsworth is joined by experts Ron Westfall, Chris Grundemann, and Jeremy Schulman as they discuss how to better implement these preferred methods. They also debate how each model has different requirements and may face headwinds in an enterprise.
Everything is the Cloud and The Cloud is Everything
The cloud operating model is everywhere these days, and just about everything is now called cloud. This episode of the Tech Field Day podcast, recorded live at Cloud Field Day 20, includes Stephen Foskett, Jeffrey Powers, Alastair Cooke, and Steve Puluka discussing the true meaning of the term cloud computing.
GenAI is Revolutionizing the Enterprise
Generative AI will revolutionize enterprise IT, but not in the way people expect. This episode of the Tech Field Day podcast includes Stephen Foskett discussing the impact of GenAI with Jack Poller, Calvin Hendryx-Parker, and Josh Atwell at AppDev Field Day. The discussion centered on the potential impact of generative AI on enterprises, debating whether it will significantly transform business operations or merely offer incremental improvements. Generative AI is still in its infancy and may not yet provide revolutionary benefits, but there is great potential for AI in automating tasks and enhancing efficiency despite challenges in implementation and validation. We must be realistic about the application of AI in enterprises: it is important to understand its real capabilities and limitations, as well as the role of existing vendors in integrating AI functionality into their products.
It’s Time for Private 5G in the Enterprise
Wi-Fi has changed the way we work in the office, but it’s not the only wireless technology. Challenging environments require new solutions like private 5G. In this episode, Tom Hollingsworth is joined by Mark Houtz and Shaun Neal as they discuss the rise of private LTE/5G technologies outside of the carrier space.