


Red Hat AI
Build AI for your world
Deliver AI value with the resources you have, the insights you own, and the freedom you need.
Red Hat® AI is engineered to help you build and run AI solutions that work exactly how your business does—from first experiments to full production.
Flexible AI for the enterprise
To stay consistent across the hybrid cloud, you need a platform that lets you deploy where your data resides.
Red Hat AI puts you in control of both generative and predictive AI capabilities, whether in the cloud, on premises, or at the edge.
With Red Hat AI, you can stay flexible while you scale.
With Red Hat AI, you can:
- Access open source-assured Granite language and code models.
- Tune smaller, purpose-built models with your own data.
- Optimize inference with vLLM (see the sketch after this list).
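For example, here is a minimal sketch of offline batch inference with the open source vLLM library in Python. The Granite model ID and sampling settings are illustrative assumptions rather than a Red Hat AI default; substitute the model you actually have access to.

```python
# A minimal sketch of offline batch inference with vLLM.
# The model ID and sampling settings are illustrative assumptions.
from vllm import LLM, SamplingParams

prompts = [
    "Summarize the benefits of hybrid cloud in one sentence.",
    "What is retrieval-augmented generation?",
]
sampling_params = SamplingParams(temperature=0.7, top_p=0.95, max_tokens=128)

# Load an open model; swap in the Granite model you have access to.
llm = LLM(model="ibm-granite/granite-3.0-8b-instruct")

outputs = llm.generate(prompts, sampling_params)
for output in outputs:
    print(output.prompt)
    print(output.outputs[0].text)
```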
Red Hat AI includes:
Red Hat Enterprise Linux AI
Red Hat Enterprise Linux® AI helps customers at the beginning of their AI journey who haven’t yet defined their business use cases. The platform is built to develop, test, and run generative AI (gen AI) foundation models.
Features and benefits
- Includes the IBM Granite family of LLMs
- Fine-tune models locally with InstructLab
- Cost-efficient for customers with restricted GPU access
Red Hat OpenShift AI
Red Hat OpenShift® AI is built for customers who are ready to scale their AI applications. This AI platform can help manage the lifecycle of both predictive and gen AI models across hybrid cloud environments.
Features and benefits
- Enterprise MLOps capabilities
- Includes IBM Granite LLMs and InstructLab tooling
- Build and deliver AI at scale with hardware accelerators and hybrid-cloud support
Customize LLMs locally with InstructLab
Red Hat’s InstructLab is a community-driven project that makes it easier for developers, even those with minimal machine learning experience, to experiment with IBM’s Granite models.
It’s a great place to start if you want to experiment with the AI model of your choice or fine-tune foundation models on your local hardware.
This removes the cost and resource barriers to experimenting with AI models before you’re ready to bring AI to your enterprise.
More AI partners. More paths forward.
Experts and technologies are coming together so our customers can do more with AI. A variety of technology partners are working with Red Hat to certify that their technologies interoperate with our solutions.
Solution Pattern
Red Hat AI applications with NVIDIA AI Enterprise
Create a RAG application
Red Hat OpenShift AI is a platform for building data science projects and serving AI-enabled applications. You can integrate all the tools you need to support retrieval-augmented generation (RAG), a method for getting AI answers from your own reference documents. When you connect OpenShift AI with NVIDIA AI Enterprise, you can experiment with large language models (LLMs) to find the optimal model for your application.
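As a rough illustration of the RAG flow (not part of the solution pattern itself), the Python sketch below embeds a few sample documents, retrieves the passage most similar to a user question, and assembles the prompt that would be sent to the served LLM. The documents, embedding model, and prompt format are illustrative assumptions.

```python
# A minimal sketch of the retrieval-augmented generation (RAG) flow:
# embed reference documents, find the passage most similar to a user
# question, and prepend it to the prompt sent to the LLM.
# The documents and embedding model here are illustrative assumptions.
import numpy as np
from sentence_transformers import SentenceTransformer

documents = [
    "OpenShift AI supports serving models with GPU acceleration.",
    "RAG grounds model answers in your own reference documents.",
    "Pipelines can re-embed documents whenever they change.",
]
question = "How does RAG keep answers grounded?"

embedder = SentenceTransformer("all-MiniLM-L6-v2")
doc_vectors = embedder.encode(documents, normalize_embeddings=True)
query_vector = embedder.encode([question], normalize_embeddings=True)[0]

# Cosine similarity reduces to a dot product on normalized vectors.
scores = doc_vectors @ query_vector
best = documents[int(np.argmax(scores))]

prompt = f"Answer using this context:\n{best}\n\nQuestion: {question}"
print(prompt)  # In the full pattern, send this prompt to the served LLM.
```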
Build a pipeline for documents
To make use of RAG, you first need to ingest your documents into a vector database. In our example app, we embed a set of product documents in a Redis database. Because these documents change frequently, we create a pipeline for this process and run it periodically, so the database always holds the latest versions of the documents.
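A minimal sketch of that ingestion step in Python, assuming a local Redis instance and the open all-MiniLM-L6-v2 embedding model; the key layout and sample documents are illustrative, not the exact schema used in the example app.

```python
# A minimal sketch of the document-ingestion step, assuming a local Redis
# instance; the key layout and sample documents are illustrative.
import numpy as np
import redis
from sentence_transformers import SentenceTransformer

product_docs = [
    ("doc:1", "How to configure the product for hybrid cloud deployments."),
    ("doc:2", "Troubleshooting GPU scheduling on OpenShift AI."),
]

embedder = SentenceTransformer("all-MiniLM-L6-v2")
r = redis.Redis(host="localhost", port=6379)

# Store each document's text and embedding; a scheduled pipeline run can
# repeat this whenever the source documents change.
for key, text in product_docs:
    vector = embedder.encode(text).astype(np.float32)
    r.hset(key, mapping={"text": text, "embedding": vector.tobytes()})

print(f"Ingested {len(product_docs)} documents into Redis")
```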
Browse the LLM catalog
NVIDIA AI Enterprise gives you access to a catalog of different LLMs, so you can try different choices and select the model that delivers the best results. The models are hosted in the NVIDIA API catalog. Once you’ve set up an API token, you can deploy a model using the NVIDIA NIM model serving platform directly from OpenShift AI.
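Because the API catalog exposes an OpenAI-compatible endpoint, a quick way to try a hosted model from Python looks roughly like the sketch below; the endpoint URL, model name, and NVIDIA_API_KEY environment variable are assumptions to adapt to your own setup.

```python
# A minimal sketch of calling a catalog-hosted model through an
# OpenAI-compatible API. The endpoint URL, model name, and the
# NVIDIA_API_KEY environment variable are assumptions for illustration.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://integrate.api.nvidia.com/v1",  # assumed catalog endpoint
    api_key=os.environ["NVIDIA_API_KEY"],
)

response = client.chat.completions.create(
    model="meta/llama-3.1-8b-instruct",  # illustrative model choice
    messages=[{"role": "user", "content": "Summarize our returns policy."}],
    max_tokens=200,
)
print(response.choices[0].message.content)
```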
Choose the right model
As you test different LLMs, your users can rate each generated response. You can set up a Grafana monitoring dashboard to compare the ratings, as well as latency and response time for each model. Then you can use that data to choose the best LLM to use in production.
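One way to feed such a dashboard is to expose per-model metrics with the Prometheus client for Python and let Grafana chart them; the metric names, labels, and helper function below are illustrative assumptions, not the solution pattern's exact setup.

```python
# A minimal sketch of exposing per-model rating and latency metrics that a
# Grafana dashboard could chart; metric names and labels are illustrative.
import time
from prometheus_client import Counter, Histogram, start_http_server

RATINGS = Counter(
    "llm_response_ratings_total", "User ratings of generated responses",
    ["model", "rating"],
)
LATENCY = Histogram(
    "llm_response_latency_seconds", "End-to-end response latency", ["model"]
)

def record_interaction(model_name, generate_fn, prompt):
    """Time one generation call and return its output for later rating."""
    start = time.perf_counter()
    answer = generate_fn(prompt)
    LATENCY.labels(model=model_name).observe(time.perf_counter() - start)
    return answer

if __name__ == "__main__":
    start_http_server(8000)  # Prometheus scrapes metrics from :8000/metrics
    answer = record_interaction("demo-model", lambda p: p.upper(), "hello")
    RATINGS.labels(model="demo-model", rating="thumbs_up").inc()
    print(answer)
```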
Red Hat AI in the real world
Ortec Finance accelerates growth and time to market
Ortec Finance, a global technology and solutions provider for risk and return management, is serving ML models on Microsoft Azure Red Hat OpenShift and is adopting Red Hat AI.
DenizBank empowers its data scientists
DenizBank is developing AI models to help identify loans for its customers and detect potential fraud. With Red Hat AI, its data scientists gained a new level of autonomy over their data.


