
Your full stack for AI infrastructure at scale

We make AI infrastructure easy so you can focus on your models. Get all the infrastructure components you need for AI, from the operating system to the MLOps platform and all the way to the secure edge.


Contact Canonical

Why Canonical for enterprise AI infrastructure?

  • Stable and supported end-to-end stack optimised for AI performance
  • Control your TCO with predictable pricing per node
  • Design and deploy your AI infrastructure with expert guidance
  • Fast-track compliance with security across all layers of the stack

Deploy on any platform, scale with a hybrid cloud strategy

Choose the ideal AI infrastructure for your use cases — for instance, start quickly with no risk and low investment on public clouds, then move workloads to your own data centre as you scale.

With Canonical’s solutions, you can run your workloads anywhere, including hybrid and multi-cloud environments.


Download our hybrid cloud strategy playbook ›

Deploy machine learning models to the edge

Unlock real-time data processing in distributed environments by deploying machine learning models to your edge devices.

Canonical infrastructure spans the entire AI journey, from the desktop to the edge.


Access our guide to open source edge AI ›

Stay in control of AI infrastructure costs

Data volumes involved in AI projects can scale rapidly, making public cloud prohibitively costly. If operated efficiently, a private cloud is more cost-effective for running workloads long-term and at scale.

Build your private cloud with our AI infrastructure solutions and enjoy predictable pricing per node with no licence fees.


Check out our cloud pricing report ›

What customers say


“The level of engagement from the Canonical team was remarkable. Even before we entered into a commercial agreement, Canonical offered valuable advice around OEMs and hardware choices. They were dedicated to our project from the get-go.”


Tim Rosenfield, CEO and Co-Founder, Firmus

Download the full case study ›

The #1 Linux in the cloud

Thanks to its security, versatility and policy of regular updates, Ubuntu is the most popular operating system across public clouds. And the best part is: it's free. You pay only for the commercial support you need.

Get started easily on your cloud of choice with optimised and certified images.
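For example, on AWS you can programmatically look up the most recent official Ubuntu image published by Canonical. The sketch below uses boto3 and assumes Canonical's publisher account ID and image naming convention for Ubuntu 22.04 LTS; adjust the release, architecture and region for your own deployment.

```python
# Minimal sketch: find the latest official Ubuntu 22.04 LTS AMI published by
# Canonical on AWS. The image name pattern follows Canonical's published naming
# convention and is an assumption to verify for your release and architecture.
import boto3

def latest_ubuntu_ami(region: str = "eu-west-1") -> str:
    ec2 = boto3.client("ec2", region_name=region)
    response = ec2.describe_images(
        Owners=["099720109477"],  # Canonical's AWS publisher account
        Filters=[
            {"Name": "name", "Values": ["ubuntu/images/hvm-ssd/ubuntu-jammy-22.04-amd64-server-*"]},
            {"Name": "state", "Values": ["available"]},
        ],
    )
    # Pick the most recently created image.
    images = sorted(response["Images"], key=lambda i: i["CreationDate"], reverse=True)
    return images[0]["ImageId"]

if __name__ == "__main__":
    print(latest_ubuntu_ami())
```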


Learn more about Ubuntu on public cloud ›


Accelerate innovation with certified hardware

Get peace of mind with Ubuntu certified hardware from all major OEMs. Choosing validated platforms significantly reduces deployment time and costs for end customers, improves reliability and delivers bug fixes faster.

For example, an Ubuntu-certified platform combined with NVIDIA DGX systems or NVIDIA-Certified systems gives you secure, performant hardware that is guaranteed to deploy quickly and run smoothly.


Explore certified hardware ›



Kubernetes optimised for AI

Kubernetes plays a pivotal role in orchestrating AI applications. Canonical delivers easy-to-use, CNCF conformant Kubernetes distributions for hybrid and multi-cloud operations.

Canonical Kubernetes is optimised for AI/ML performance, incorporating features that boost processing power and reduce latency, developed and integrated in close collaboration with NVIDIA. For instance, you can optimise hardware utilisation through NVIDIA operators and the Volcano scheduler.
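As a rough illustration of how these pieces fit together, the sketch below uses the official Kubernetes Python client to submit a GPU workload to a cluster where the NVIDIA GPU Operator and the Volcano scheduler are assumed to be installed. The container image, pod name and namespace are placeholders rather than Canonical defaults.

```python
# Minimal sketch: submit a single-GPU training pod and hand scheduling to
# Volcano. Assumes the NVIDIA GPU Operator (device plugin) and Volcano are
# already deployed in the cluster; names and image are illustrative.
from kubernetes import client, config

def submit_gpu_pod(namespace: str = "default") -> None:
    config.load_kube_config()  # use load_incluster_config() when running in-cluster

    container = client.V1Container(
        name="trainer",
        image="nvcr.io/nvidia/pytorch:24.01-py3",  # hypothetical training image
        command=["python", "train.py"],
        resources=client.V1ResourceRequirements(
            limits={"nvidia.com/gpu": "1"},  # GPU resource advertised by the NVIDIA device plugin
        ),
    )

    pod = client.V1Pod(
        metadata=client.V1ObjectMeta(name="gpu-train-job"),
        spec=client.V1PodSpec(
            scheduler_name="volcano",  # delegate placement to the Volcano batch scheduler
            restart_policy="Never",
            containers=[container],
        ),
    )

    client.CoreV1Api().create_namespaced_pod(namespace=namespace, body=pod)

if __name__ == "__main__":
    submit_gpu_pod()
```

Setting the pod's scheduler to Volcano lets the batch scheduler apply queueing and gang-scheduling policies suited to training workloads, while the GPU resource limit relies on the device plugin installed by the NVIDIA GPU Operator.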


Download the solution brief:
Kubernetes by Canonical delivered on NVIDIA DGX systems ›


From experimentation to production with a modular MLOps platform

Machine learning operations (MLOps) is like DevOps for machine learning. It is a set of practices that automates machine learning workflows, ensuring scalability, portability and reproducibility.

Our modular MLOps platform equips you with everything you need to bring your models all the way from experimentation to production. With easy access to key tools, you can move quickly and at scale.
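To make this concrete, here is a minimal sketch of a two-step pipeline written with the Kubeflow Pipelines SDK (kfp v2), the workflow engine behind Charmed Kubeflow in Canonical's MLOps portfolio. The component logic, names and accuracy threshold are illustrative only.

```python
# Minimal sketch: a toy experimentation-to-production pipeline with the
# Kubeflow Pipelines SDK (kfp v2). Component bodies are placeholders.
from kfp import compiler, dsl

@dsl.component(base_image="python:3.11")
def train_model(epochs: int) -> float:
    # Placeholder training step that returns a toy accuracy metric.
    return 0.5 + min(epochs, 10) * 0.04

@dsl.component(base_image="python:3.11")
def evaluate_model(accuracy: float) -> bool:
    # Gate promotion to production on a simple accuracy threshold.
    return accuracy >= 0.8

@dsl.pipeline(name="demo-training-pipeline")
def training_pipeline(epochs: int = 8):
    trained = train_model(epochs=epochs)
    evaluate_model(accuracy=trained.output)

if __name__ == "__main__":
    # Compile to a pipeline spec that can be uploaded to a Kubeflow Pipelines instance.
    compiler.Compiler().compile(training_pipeline, "training_pipeline.yaml")
```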


Discover our MLOps platform ›



Security across all layers of the stack

Canonical maintains and supports all the open source tooling in your AI infrastructure stack, including Kubernetes and open source cloud management solutions like OpenStack. Fast-track compliance and run AI projects securely with critical CVEs fixed in under 24h on average.


Explore open source security with Canonical ›


Confidential AI

Confidential AI on Ubuntu protects data in use at the hardware level. Building on Ubuntu confidential VMs, you can now safeguard your sensitive data and intellectual property with a hardware-rooted execution environment that spans both the CPU and GPU.

Ubuntu confidential VMs are available on Azure, Google Cloud and AWS.

You can enable confidential computing in the data centre and at the edge thanks to the Ubuntu Intel TDX build, which comes with all the required components for both the guest and the host.
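As a quick illustration, the sketch below checks from inside a Linux guest whether it appears to be running as an Intel TDX confidential VM. It assumes the guest kernel exposes the tdx_guest CPU flag and the /dev/tdx_guest device node; both may vary across kernel versions, so treat this as a heuristic rather than an attestation.

```python
# Minimal sketch: heuristic check for Intel TDX guest support from inside a
# Linux VM. Assumes the "tdx_guest" CPU flag and /dev/tdx_guest device node
# are exposed by the guest kernel; not a substitute for remote attestation.
from pathlib import Path

def looks_like_tdx_guest() -> bool:
    cpuinfo = Path("/proc/cpuinfo").read_text()
    has_flag = "tdx_guest" in cpuinfo
    has_device = Path("/dev/tdx_guest").exists()
    return has_flag or has_device

if __name__ == "__main__":
    if looks_like_tdx_guest():
        print("This VM appears to be an Intel TDX guest.")
    else:
        print("No TDX guest indicators found; confidential computing may not be enabled.")
```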


Read more about confidential AI ›



Open source AI infrastructure in action