
Red Hat extends Lightspeed generative AI tool to OpenShift and Enterprise Linux

News Analysis
May 07, 2024 | 4 mins
Linux, Network Management Software, Networking

Red Hat's Lightspeed, a gen AI-powered assistant, will be extended to RHEL and OpenShift to help enterprises that want to use Linux, automation, and hybrid clouds but may not have the skills in house.

Credit: Gorodenkoff / Shutterstock

Red Hat’s generative AI-powered Lightspeed tool was first announced last year for the Red Hat Ansible automation platform. This morning, as the Red Hat Summit kicks off in Denver, the company announced that the tool will be extended to Red Hat Enterprise Linux and Red Hat OpenShift.

OpenShift, Red Hat’s Kubernetes-powered hybrid cloud application platform, will be getting it late this year. Red Hat Enterprise Linux Lightspeed is now in its planning phase, with more information coming soon. (At the Summit, Red Hat also announced a new ‘policy as code’ capability for Ansible.)

“This will bring similar genAI capabilities to both of those platforms across the hybrid cloud,” says Chuck Dubuque, senior director for product marketing at Red Hat OpenShift. Users will be able to ask questions in simple English and get usable code as a result, or suggestions for specific actions, he says, and the tool is designed to address skills gaps and the increasing complexities in enterprise IT.

“More seasoned IT pros can use Red Hat Lightspeed to extend their skills by using Red Hat Lightspeed as a force multiplier,” Dubuque says. “It can help quickly generate potential answers to niche questions or handle otherwise tedious tasks at scale. It helps IT organizations innovate and build a stronger skilled core while helping further drive innovation.”

The vision is that Red Hat Lightspeed will help companies address this skills gap and put more power in the hands of organizations that want to use Linux, automation, and hybrid clouds but don’t have the skills in house, he says, “or endless funds to enlist said skills.”

Other generative AI platforms can also answer questions and write code, but those are general-purpose LLMs, he says. “We built a purpose-driven model to solve unique challenges for IT,” he says. “The skill sets required for programming and development haven’t always been widely accessible to the entire talent pool or businesses with limited resources.”

Red Hat didn’t create the core foundation model, however.

Take, for example, Ansible Lightspeed, which became generally available last November. Ansible Lightspeed is based on IBM’s watsonx Code Assistant, which, in turn, is powered by the IBM Granite foundation models, according to Sathish Balakrishnan, vice president and general manager of the Red Hat Ansible Business Unit.

The model is then further trained on data from Ansible Galaxy, an open-source repository of Ansible content covering a variety of use cases, he says, and fine-tuned with additional expertise from Red Hat and IBM.

For example, to create and edit Ansible playbooks and roles, users can type in a plain-English prompt and get output translated into YAML content. That streamlines role and playbook creation, Balakrishnan says. This helps companies translate subject matter expertise into best practices that can scale across teams, standardize and improve quality, and adhere to industry standards.
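
As a sketch of how that workflow can look (the prompt and the generated tasks below are illustrative, not actual Lightspeed output), a user writes a task name as a plain-English prompt inside a playbook, and the assistant proposes the YAML module call that follows it:

---
# Illustrative Ansible playbook; the module calls show the kind of YAML
# an assistant might suggest for the plain-English task names.
- name: Configure web servers
  hosts: webservers
  become: true
  tasks:
    # The task name below acts as the user's plain-English prompt:
    - name: Install the latest version of nginx
      # Suggested completion (hypothetical output):
      ansible.builtin.package:
        name: nginx
        state: latest
    - name: Ensure nginx is enabled and running
      ansible.builtin.service:
        name: nginx
        state: started
        enabled: true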

“The service also helps safeguard private data through data isolation, so sensitive customer information remains untouched and possible data leaks are minimized,” he says.

Hundreds of customers are already using Ansible Lightspeed to generate tasks, says Dubuque. “And we’re expanding it to build full playbooks,” he says. “But Red Hat Lightspeed is bigger than just Ansible. We’re infusing generative AI into all our platforms.”

So, for example, with OpenShift Lightspeed, users will have an assistant integrated right into the OpenShift console so they can ask questions in plain English about the product or get help with troubleshooting. “Our goal is to increase productivity and efficiency,” he says.

However, we’re still in the early days of generative AI and AI assistants, says IDC analyst Stephen Elliot, so companies do need to be careful about how they use the technology. “But it’s a safe assumption that most of these models are going to get better and smarter,” Elliot says.

Maria Korolov
Contributing writer

Maria Korolov is an award-winning technology journalist covering AI and cybersecurity. She also writes science fiction novels, edits a sci-fi and fantasy magazine, and hosts a YouTube show.
