
The Canonical AI Roadshow

How enterprises are using generative AI with open source

From the USA to the Netherlands, and from the UAE to Mexico, Canonical experts are taking the AI roadshow across the globe. We are ready to showcase LLM applications and predictive analytics use cases across various industries. Get answers to your most common AI/ML questions and talk to our team about big data, MLOps and generative AI use cases with open source.

Read the roadshow highlights

Get in touch


Develop artificial intelligence projects
in any environment with Canonical


Join us on the roadshow

Experience the best of both worlds: open source and machine learning, in the company of Canonical experts. Get insight into the latest innovations and how you could build your own LLMs.


Meet AI/ML experts and ask your questions

AI/ML and big data experts will be joining the Canonical AI Roadshow. We will host talks on a range of topics, run workshops to build your expertise, and more.


See our demos and try building your own LLM

We have prepared a series of demos that showcase the power of LLMs and how to build them using open source tooling.

Join one of our workshops and learn how to build your own LLM.


Run AI projects in production securely

Get enterprise support or managed services for your open source tooling. Run your projects securely in production without worrying about upgrades, updates or vulnerabilities.


Learn how open source is leading LLM development

Large language models (LLMs) are the future of AI. They have applications across various industries, but taking them beyond experimentation is still a challenge. Open source tooling covers the entire lifecycle, easing operations and accelerating the entire process.


  • Go from raw data to features

    Perform all the steps needed to deliver features for model training with open source — now the industry standard.

  • From data features to machine learning models

    Build LLMs efficiently, automate workflows and reduce operational overhead.

  • From machine learning models to production

    Upgrade your model serving strategy with open source. A minimal end-to-end sketch of these steps follows this list.
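To make the lifecycle concrete, here is a minimal sketch of those three steps in Python using MLflow, one of the open source tools featured further down this page. The tracking URI, experiment name and registered model name are placeholder assumptions, the Iris dataset stands in for your own feature pipeline, and the final registration step assumes a registry-capable MLflow backend.

import mlflow
import mlflow.sklearn
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Placeholder tracking server; point this at your own MLflow deployment.
mlflow.set_tracking_uri("http://localhost:5000")
mlflow.set_experiment("ai-roadshow-demo")

# Raw data to features: the Iris dataset stands in for a real feature pipeline.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

with mlflow.start_run():
    # Features to model: train and record parameters and metrics.
    model = RandomForestClassifier(n_estimators=100, random_state=42)
    model.fit(X_train, y_train)
    accuracy = accuracy_score(y_test, model.predict(X_test))
    mlflow.log_param("n_estimators", 100)
    mlflow.log_metric("accuracy", accuracy)

    # Model to production: log and register the model so a serving layer
    # can pull it from the registry by name.
    mlflow.sklearn.log_model(
        model,
        artifact_path="model",
        registered_model_name="roadshow-classifier",
    )

In production, the tracking URI would point at a managed MLflow deployment rather than localhost; the rest of the script stays the same.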


Where to find us

Europe

19-21 September 2023
Bilbao, Spain

Find us inside the OpenSearch.org booth


11-12 October 2023
Amsterdam, Netherlands

Talk: Building LLMs: from zero to hero with Maciej Mazur, Canonical

Workshop: Building your LLM Factory with Michael Balint (NVIDIA), Andreea Munteanu and Maciej Mazur (Canonical)


Ubuntu Summit, AI/ML track

3-5 November 2023
Riga, Latvia

Workshop: Move data science beyond experimentation with MLOps platforms with Andreea Munteanu and Kimonas Sotirchos, Canonical

Panel: Future of AI: is it here to help or just take away our jobs? Moderated by Andreea Munteanu with three other participants


6-7 December 2023
Paris, France

AI/ML track


North America

27-29 September 2023
Seattle, Washington

Talk: Infra+OpenSearch in your laptop under 5 minutes with Alastair Flynn

Talk: From document to vector: using OpenSearch to store embedding data with Pedro Leão da Cruz

See you at the Canonical Charmed OpenSearch booth at the conference


25-26 October 2023
Austin, Texas

Talk: MLOps on highly sensitive data — Strict Confinement, Confidential Computing and Tokenization Protecting Privacy with Andreea Munteanu & Maciej Mazur

Workshop: Build LLMs based on your own company's data with open-source with Andreea Munteanu & Maciej Mazur


6-9 November 2023
Chicago, Illinois


6 November 2023
Chicago, Illinois


27 November - 1 December 2023
Las Vegas, Nevada


Middle East & Africa

16-20 October 2023
Dubai, UAE

Talk: Engineering Tomorrow’s Decisions: Building Predictive analytics projects with open source with Andreea Munteanu and Rob Gibbon


Central & South America

21-22 November 2023
São Paulo, Brazil

Round table with Canonical AI experts and partners from the region

To join us at an event, contact us now.
For future AI events, contact pr@canonical.com


Support your AI/ML projects with Canonical

A production-grade MLOps platform for running the end-to-end ML lifecycle.


A security-maintained and fully supported solution for Apache Spark on Kubernetes.


A platform for managing machine learning workflows, used primarily as a model registry.


Enterprise OpenSearch solution with support, security maintenance and operations automation.


Enterprise MongoDB solution with support, security maintenance and operations automation.


Access one of our consultancy lanes to kickstart your journey with our AI and data experts.


AI/ML resources

Whitepaper

A guide to MLOps

Learn how to choose your MLOps tooling and take your AI projects to production.


Case study

University of Tasmania unlocks real-time space tracking with AI/ML supercomputing

The University of Tasmania (UTAS) is modernising its space-tracking data processing with the Firmus Supercloud, built on Canonical's open infrastructure stack.

Follow us on Medium and listen to our podcast