John Edwards
Contributing writer

Predictive networking promises faster fixes

Feature
Mar 27, 2023 | 7 mins
Network Management Software | Network Monitoring

Predictive network technology promises to find and fix problems faster.

With the assistance of artificial intelligence (AI) and machine learning (ML), predictive network technology alerts administrators to possible network issues as early as possible and offers potential solutions.

The AI and ML algorithms used in predictive network technology have become critical, says Bob Hersch, a principal with Deloitte Consulting and US lead for platforms and infrastructure. “Predictive network technology leverages artificial neural networks and utilizes models to analyze data, learn patterns, and make predictions,” he says. “AI and ML significantly enhance observability, application visibility, and the ability to respond to network and other issues.”
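
To make that pattern-learning loop concrete, here is a minimal sketch (not any vendor's implementation) that trains a small neural network on a synthetic link-utilization trace and flags windows where the predicted next reading crosses a saturation threshold. The 85% threshold and 12-reading window are assumptions for illustration.

```python
# Minimal sketch: predict the next link-utilization reading from a
# sliding window of history, then flag likely saturation. The synthetic
# trace and the 85% threshold are illustrative assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
t = np.arange(2000)
# Synthetic utilization trace (%): daily cycle plus noise.
util = 60 + 30 * np.sin(2 * np.pi * t / 288) + rng.normal(0, 3, t.size)

WINDOW = 12  # readings of history per prediction
X = np.lib.stride_tricks.sliding_window_view(util[:-1], WINDOW)
y = util[WINDOW:]

model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
model.fit(X[:1500], y[:1500])

preds = model.predict(X[1500:])
alerts = np.flatnonzero(preds > 85)  # windows where saturation is predicted
print(f"{alerts.size} of {preds.size} test windows predict >85% utilization")
```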

While predictive network technology has made impressive strides over the past several years, many developers and observers are confident that the best is yet to come. “Tools and systems are available now, but like most significant evolutions in technology, there are risks for the early adopters, as development, and even how to assess the effectiveness of a shift, are still in flight,” says David Lessin, a director at technology research and advisory firm ISG.

Predictive analytics is no longer just for predicting network outages and proactively handling problems of bandwidth and application performance, says Yaakov Shapiro, CTO at telecommunications software and services provider Tangoe. “Predictive analytics are now being applied to problems surrounding the network and helping to address the downsides of SD-WAN, most notably the issue of provider sprawl and the need for wider carrier-service management and telecom-cost optimization,” he says. “These have become larger issues in the age of trading MPLS—one- and two-carrier services—for broadband services comprising potentially hundreds of internet service providers.”
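
One narrow slice of that carrier-management problem can be framed as constrained selection: for each site, choose the cheapest circuit whose measured performance still meets the SLA. A toy sketch follows; the provider records and SLA figures are invented for illustration.

```python
# Toy sketch of telecom-cost optimization across many ISPs: pick the
# cheapest circuit per site that still meets the SLA. All provider
# records and SLA numbers are invented for illustration.
SLA = {"max_latency_ms": 60, "max_loss_pct": 0.5}

offers = [
    {"site": "nyc", "isp": "AlphaNet",  "price": 420, "latency_ms": 38, "loss_pct": 0.2},
    {"site": "nyc", "isp": "BetaCom",   "price": 310, "latency_ms": 71, "loss_pct": 0.1},
    {"site": "nyc", "isp": "GammaLink", "price": 350, "latency_ms": 52, "loss_pct": 0.4},
    {"site": "lon", "isp": "AlphaNet",  "price": 500, "latency_ms": 44, "loss_pct": 0.3},
    {"site": "lon", "isp": "BetaCom",   "price": 390, "latency_ms": 58, "loss_pct": 0.6},
]

def meets_sla(o):
    return (o["latency_ms"] <= SLA["max_latency_ms"]
            and o["loss_pct"] <= SLA["max_loss_pct"])

best = {}
for o in filter(meets_sla, offers):
    if o["site"] not in best or o["price"] < best[o["site"]]["price"]:
        best[o["site"]] = o

for site, o in sorted(best.items()):
    print(f"{site}: {o['isp']} at ${o['price']}/mo")
```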

AI is moving predictive networking forward.

The latest evolution of AI is the most important development in predictive network technology. “Cloud-based AI technologies can improve the quality and speed of information delivered to network technicians while giving them a valuable tool to investigate outages and other issues,” says Patrick MeLampy, a Juniper Networks fellow. “AI can detect anomalies quicker than humans and can even analyze the root cause of an anomaly, helping to guide a technician to understand and repair the issue faster than before.”

The integration of AI tools into predictive network technology also has the potential to be an economic game-changer. “With mature AI and ML tools at their disposal, service providers and organizations alike can reduce the costs of problem discovery and resolution,” MeLampy says. In addition to bottom-line economic benefits, AI helps to simplify management, either within an enterprise or across a service provider’s portfolio. “Mean time to repair is decreased, improving end-user satisfaction as well,” he says.

Bryan Woodworth, principal solutions strategist at multicloud network technology firm Aviatrix, says that predictive network technology will advance rapidly over the next few years. It already helps resolve network issues quickly and efficiently. “AI can correlate alerts and error conditions across many disparate systems, discovering related patterns in minutes or even seconds, something that would take humans hours or days,” he says.
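
The cross-system correlation Woodworth describes is often approximated by bucketing alerts into short time windows and grouping those that touch the same resource. A minimal sketch of that idea; the alert fields and the 60-second window are assumptions.

```python
# Minimal sketch of cross-system alert correlation: bucket alerts into
# shared time windows and group those touching the same resource. The
# field names and 60-second window are illustrative assumptions.
from collections import defaultdict

alerts = [
    {"ts": 1000, "source": "netflow", "resource": "core-sw-1", "msg": "traffic drop"},
    {"ts": 1012, "source": "syslog",  "resource": "core-sw-1", "msg": "BGP peer down"},
    {"ts": 1015, "source": "snmp",    "resource": "core-sw-1", "msg": "interface flap"},
    {"ts": 1900, "source": "apm",     "resource": "app-db-3",  "msg": "query latency"},
]

WINDOW_S = 60
groups = defaultdict(list)
for a in alerts:
    key = (a["resource"], a["ts"] // WINDOW_S)  # same resource, same window
    groups[key].append(a)

for (resource, _bucket), items in groups.items():
    if len(items) > 1:
        sources = ", ".join(a["source"] for a in items)
        print(f"correlated incident on {resource}: {len(items)} alerts from {sources}")
```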

Predictive network technology can also drastically decrease the number of false positives tucked into log and error analyses, leading to more intelligent and useful alerts, Woodworth says. “You can’t heal from something you don’t detect,” he says. “For example, before you change the network to route around a problem, you must know where that problem is.” Self-healing networks based on AI and ML provide better recommendations on how to recover from errors and avoid outages.
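
Woodworth's point that you must locate a problem before routing around it can be shown with a small graph example: once monitoring flags a link as faulty, remove it and recompute the path. A sketch using networkx, with an invented topology:

```python
# Sketch of "route around a problem": once monitoring localizes a bad
# link, drop it and recompute the path. Topology and weights invented.
import networkx as nx

G = nx.Graph()
G.add_weighted_edges_from([
    ("edge-a", "core-1", 1), ("core-1", "core-2", 1),
    ("core-2", "edge-b", 1), ("edge-a", "core-3", 2),
    ("core-3", "edge-b", 2),
])

print("normal path:", nx.shortest_path(G, "edge-a", "edge-b", weight="weight"))

# Detection step (from anomaly analysis): core-1 <-> core-2 is unhealthy.
G.remove_edge("core-1", "core-2")
print("healed path:", nx.shortest_path(G, "edge-a", "edge-b", weight="weight"))
```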

Predictive modeling works best in data centers.

Network behavior analytics examines network data, such as ports, protocols, performance, and geo-IP data, to alert whenever there’s been a significant change in network behavior that might indicate a threat. “In the future, this data can be fed into an AI model that can help confirm if the threat is real, and then make suggestions on how to remediate the issue by changing the network,” Woodworth says. “This kind of predictive modeling works best within private networks, like the data center, because [that’s where] humans have complete control over all the networking components and the data they generate.”
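
A bare-bones version of that behavior analytics pipeline can be built from per-interval feature vectors (distinct ports, bytes out, distinct geo-IP countries) scored by an unsupervised anomaly detector. The sketch below uses scikit-learn's IsolationForest on synthetic features; the feature set and contamination rate are assumptions.

```python
# Sketch of network behavior analytics: score per-interval feature
# vectors (distinct ports, bytes out, distinct country codes) with an
# unsupervised detector. Features and rates are illustrative.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(1)
# Baseline traffic: ~20 ports, ~5 MB out, ~3 countries per interval.
normal = np.column_stack([
    rng.poisson(20, 500),
    rng.normal(5e6, 5e5, 500),
    rng.poisson(3, 500),
])
# Exfiltration-like intervals: port scan plus large transfers abroad.
odd = np.array([[250, 4e7, 14], [300, 3.5e7, 12]])

detector = IsolationForest(contamination=0.01, random_state=0).fit(normal)
flags = detector.predict(np.vstack([normal[:5], odd]))  # 1 = normal, -1 = anomaly
print(flags)  # the last two rows should come back as -1
```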

For public networks, including those connected to the internet, the task becomes more challenging. Learning models must be designed to compensate for systems that aren’t under direct control or provide incomplete data sets. This means that learning models will make less accurate predictions and may need to be tuned by humans to compensate for the missing data, Woodworth says.

Meanwhile, ongoing advances in cloud technology and graphics processing units (GPUs) are taking modeling to new levels. “Open source and commercial frameworks are helping organizations deploy ML operations rapidly and at scale, with less of the risk associated with the time and complexity required to configure cloud and open-source systems for AI,” says Maggie Smith, managing director for applied intelligence at consulting firm Accenture Federal Services.

To be fully effective, advanced AI and ML models should run at production level and scale for error remediation, Smith says. “Decision-makers need to trust modeling results, and technology sponsors need to execute operations efficiently,” she says.

Smith says that several major cloud providers have already implemented AI model optimization and management features. The technology can be found in tools such as Amazon SageMaker, Google AI Platform, and Azure Machine Learning Studio. “Open-source frameworks like TensorRT and Hugging Face offer additional opportunities for model monitoring and efficiencies,” Smith says.
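
The platforms differ, but the monitoring idea underneath is portable: compare the live prediction distribution against the training baseline and alert on drift. Here is a vendor-neutral sketch using a population-stability-index (PSI) style check; the bin count and the 0.2 alert threshold are conventions assumed for the example.

```python
# Vendor-neutral sketch of model monitoring: compare the live prediction
# distribution to the training baseline with a population stability
# index (PSI). The bin count and 0.2 threshold are assumed conventions.
import numpy as np

def psi(baseline, live, bins=10):
    edges = np.histogram_bin_edges(baseline, bins=bins)
    b = np.histogram(baseline, bins=edges)[0] / len(baseline)
    l = np.histogram(live, bins=edges)[0] / len(live)
    b, l = np.clip(b, 1e-6, None), np.clip(l, 1e-6, None)
    return float(np.sum((l - b) * np.log(l / b)))

rng = np.random.default_rng(2)
train_scores = rng.normal(0.40, 0.10, 10_000)  # baseline predictions
live_scores = rng.normal(0.55, 0.12, 2_000)    # drifted live predictions

score = psi(train_scores, live_scores)
print(f"PSI = {score:.3f}" + ("  -> drift alert" if score > 0.2 else ""))
```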

Predictive networking analyzes cloud and edge workloads.

Big picture, predictive AI-based networking is not as much about the network as it is about cloud workloads, edge delivery, and user endpoint devices, such as laptop computers and mobile devices. “By understanding workloads—the network traffic they generate, latency requirements, and who is consuming data, how, and where—the high-fidelity data needed for predictive networking can be identified to support the automatic adaptation of virtual private clouds (VPCs),” says Curt Aubley, risk and financial advisory managing director and US cyber detect-and-respond leader at business advisory firm Deloitte.
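
One way to picture that automatic adaptation is as a policy that maps workload telemetry to a connectivity change. The sketch below is deliberately simplified; the telemetry fields, thresholds, and actions are all hypothetical.

```python
# Hypothetical sketch of workload-aware VPC adaptation: map observed
# traffic and latency needs to a connectivity action. All field names,
# thresholds, and actions are invented for illustration.
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    egress_gbps: float     # observed traffic it generates
    p95_latency_ms: float  # observed user-facing latency
    latency_slo_ms: float  # latency requirement
    consumers_region: str  # where its users are

def plan_action(w: Workload) -> str:
    if w.p95_latency_ms > w.latency_slo_ms:
        return f"{w.name}: replicate into {w.consumers_region} edge VPC"
    if w.egress_gbps > 5.0:
        return f"{w.name}: add dedicated peering / scale gateways"
    return f"{w.name}: no change"

for w in [
    Workload("video-cdn", 8.2, 35, 50, "eu-west"),
    Workload("checkout-api", 0.4, 120, 80, "us-east"),
    Workload("batch-etl", 1.1, 400, 1000, "us-east"),
]:
    print(plan_action(w))
```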

Microsegmentation, load balancers, and traffic shapers are all helping to optimize delivery. “The same high-fidelity data used for network-focused AI can also be used to complement cybersecurity teams’ consolidated extended detection and response data lakes for security analytics,” Aubley says. AI models are used to detect anomalies, unknown unknowns, and lateral movement. “Using the same high-fidelity data from cloud workloads, networks, and endpoints for different use cases can help ensure the confidentiality, integrity, and availability of applications needed for business or government cyber risk management.”

Routers, wireless access points, switches, and various other general networking gear don’t typically collect user-specific data. While application-performance monitoring tools do measure user data, they can’t correlate results into proactive network actions. “Networks must become user- and application-aware in order to collect the types of data necessary to build actionable models for the use of AI and predictive technologies,” MeLampy says. “If a solution doesn’t measure experience per user, it isn’t going to be successful.”
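
MeLampy's experience-per-user test can be made concrete with a simple score computed from the session metrics a user-aware network would have to collect. In the sketch below, the metric weights and the 0.7 "poor experience" cutoff are assumptions.

```python
# Sketch of per-user experience scoring from session metrics. The
# weights and the 0.7 "poor experience" cutoff are assumptions.
sessions = [
    {"user": "ava", "app": "voip",  "latency_ms": 40, "loss_pct": 0.1, "jitter_ms": 4},
    {"user": "raj", "app": "voip",  "latency_ms": 180, "loss_pct": 2.5, "jitter_ms": 35},
    {"user": "mei", "app": "video", "latency_ms": 60, "loss_pct": 0.2, "jitter_ms": 8},
]

def experience_score(s):
    # Normalize each impairment against a rough tolerance, capped at 1.0.
    lat = min(s["latency_ms"] / 150, 1.0)
    loss = min(s["loss_pct"] / 2.0, 1.0)
    jit = min(s["jitter_ms"] / 30, 1.0)
    return 1.0 - (0.4 * lat + 0.4 * loss + 0.2 * jit)

for s in sessions:
    score = experience_score(s)
    flag = "  <- degraded, act" if score < 0.7 else ""
    print(f"{s['user']:4} {s['app']:5} score={score:.2f}{flag}")
```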

Prescriptive analytics is the future.

The emerging field of neuromorphic computing, based on a chip architecture that’s engineered to mimic human brain structure, promises to provide highly effective ML on edge devices. “Predictive network technology is so powerful because of its ability to intake signals and make accurate predictions about equipment failures to optimize maintenance,” says Gil Dror, CTO at monitoring technology provider SmartSense. He says that neuromorphic computing will become even more powerful when it moves from predictive to prescriptive analytics, which recommends what should be done to ensure future outcomes.

Neuromorphic computing’s chip architecture is geared toward making intelligent decisions on edge devices themselves, Dror says. “The combination of these two technologies will make the field of predictive network technology much more powerful,” he says.

Organizations including IBM, Intel, and Qualcomm are developing neuromorphic computing technologies. “Some companies have released neuromorphic computing chips for research-and-development purposes, such as IBM’s TrueNorth chip and Intel’s Loihi chip,” Dror says. These chips aren’t yet generally available for commercial use, and it’s likely that there will be at least several more years of intense research and development before neuromorphic computing becomes a mainstream technology. “Once it becomes viable, the impact will be massive,” he predicts.