New: Develop ML/AI Faster With Guided Journeys And Fast Image Bakery

Today, as part of our release week, we're introducing two major features to help you build ML/AI systems more quickly on Outerbounds:

  • Guided journeys that help you build end-to-end ML/AI systems for the real world.
  • Blazing fast, automated Docker image builds - no infrastructure, no boilerplate required.

Let’s take a look at the new features.

New: Guided journeys

Let’s start with the why.

A major challenge in building real-world ML/AI systems is managing their inherent complexity, driven by their many moving parts. Consider a typical MLOps stack from the 2020s, assembled from a long list of specialized tools for data, training, experiment tracking, orchestration, and serving.

Many of these tools are excellent, but much of the accidental complexity arises from the gaps between them, not the tools themselves. And there’s more: such a stack might work for an ML developer, but a platform engineer needs to weigh in before the system is ready for production:

  • How do you connect to the data warehouse securely?
  • How do you separate production and staging environments?
  • How do you manage dependencies and security updates - can you manage your own Dockerfiles?
  • How do you roll out changes in production safely?
  • How will you trigger training and inference in production in response to new data?
  • Oh, and security policies require all data processing to happen within the company’s own cloud environment. Can you adapt the stack to these requirements?

Addressing the big picture

Since the beginning, a key value proposition of Outerbounds has been to solve the above issues in a delightfully human-friendly way.

We don’t claim to have the best solution to each subproblem - in fact, some of our customers use point solutions like Weights and Biases successfully with Outerbounds - but we address the big picture. We ensure the needs of both ML developers and platform engineers are met equally, providing a seamless, gap-free solution.

This promise should apply regardless of your industry. We support a wide variety of use cases, from custom models running on massive-scale GPU clusters to reactive data processing and generative AI, each of which has different requirements. Yet, we aim to make it easy for you to build systems tailored to your specific problem domain.

Enter journeys

With the new journeys feature, we provide guided templates for various use cases. Take a look at the example journey below, which demonstrates a typical project that uses XGBoost to train a model continuously, accompanied by a batch inference pipeline that reacts to new data automatically.
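
To make the pattern concrete, here is a minimal sketch of what such a training flow can look like in Metaflow. The event name, data path, package versions, and model parameters are illustrative assumptions, not the journey’s actual code.

    # A rough sketch of a continuously retraining flow: a new run starts
    # whenever fresh data is published as an event. All names and versions
    # below are placeholders for illustration.
    from metaflow import FlowSpec, step, trigger, pypi


    @trigger(event="data_updated")  # start a run when new data arrives
    class ContinuousTrainingFlow(FlowSpec):

        @pypi(packages={"xgboost": "2.0.3", "pandas": "2.2.2", "pyarrow": "16.1.0"})
        @step
        def start(self):
            import pandas as pd
            import xgboost as xgb

            # Load the latest snapshot of training data (placeholder path)
            df = pd.read_parquet("s3://my-bucket/training_data.parquet")
            dtrain = xgb.DMatrix(df.drop(columns=["label"]), label=df["label"])

            # Train and store the model as an artifact for downstream flows
            self.model_bytes = bytes(xgb.train({"max_depth": 4}, dtrain).save_raw())
            self.next(self.end)

        @step
        def end(self):
            pass


    if __name__ == "__main__":
        ContinuousTrainingFlow()

A companion batch inference flow can be decorated with @trigger_on_finish(flow="ContinuousTrainingFlow"), so scoring runs automatically after every successful training run; once deployed to the production orchestrator, the whole chain reacts to new data without manual intervention.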

The journey guides you through the process of building the system on Outerbounds, step by step.

This example journey touches on a number of key features of Outerbounds.

All these features work seamlessly together. Instead of juggling separate tools and gaps between them, you get everything in a unified UI with simple APIs, powered by open-source Metaflow—all running securely in your cloud account.

This is just the beginning of the journey(s). We have many more templates in the works, whether you want to build applications using LLMs, run large-scale GPU processing, or familiarize your teams with CI/CD best practices.

New: Blazing fast, hassle-free image builds with Fast Bakery

Let’s take a closer look at a major new feature that appeared only for a few seconds in the above video—by design.

The topic of software dependency management - that is, the issues you face with pip install and Docker images - is underappreciated in the world of ML/AI. Yet it is one of the biggest practical pain points.

We have previously covered the topic from the end user’s perspective and from that of a security-conscious platform engineer. Today, we are taking our approach to dependency management to the next level with a new feature called Fast Bakery.

Fast Bakery takes your existing @pypi or @conda environments defined in Metaflow and converts them into a Docker image on the fly, blazing fast! No need to craft Dockerfiles by hand, remember which image to use, or wait tens of minutes for a CI/CD system to bake an image.

Consider a classic PITA scenario: installing torch with CUDA drivers so you can execute your code on a cloud GPU instance.
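
As a rough sketch of the scenario (the torch version, Python version, and resource settings below are illustrative assumptions, not the exact code from the demo), a flow along these lines is all you need:

    # Declare torch as a @pypi dependency and request a GPU. Fast Bakery
    # builds a matching container image on the fly, so no Dockerfile is
    # needed. Versions and resources here are placeholders.
    from metaflow import FlowSpec, step, pypi, kubernetes


    class GPUCheckFlow(FlowSpec):

        @kubernetes(gpu=1)  # run this step on a cloud GPU instance
        @pypi(python="3.11", packages={"torch": "2.3.1"})
        @step
        def start(self):
            import torch

            # Confirm that the CUDA-enabled build of torch sees the GPU
            print("CUDA available:", torch.cuda.is_available())
            self.next(self.end)

        @step
        def end(self):
            pass


    if __name__ == "__main__":
        GPUCheckFlow()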

In this case, the image with these massive packages was freshly baked in less than seven seconds!

Less (cognitive) overhead

Besides faster development, Fast Bakery allows you to think about dependencies differently.

Many organizations try to maintain a common shared image so that individual data scientists and ML developers don’t have to learn how to manage Docker images themselves. This is an understandable approach, but it can stifle development, as every project and experiment must work with the exact same set of dependencies.

Alternatively, some teams manage multiple virtual environments or Docker images to allow for more flexibility, but a zoo of environments incurs significant maintenance overhead.

Fast Bakery provides the best of both worlds: developers don’t have to learn the intricacies of building images, and engineers don’t have to maintain them by hand. Thanks to blazing fast build times, you can test different package versions and tools as often as needed, leading to much faster development cycles.
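
For instance, because environments are declared per step, two steps (or two projects) can pin different package versions side by side without anyone curating a shared image. The packages and versions below are illustrative assumptions:

    # Each step declares its own dependencies; Fast Bakery bakes a separate
    # image for each environment. Versions are placeholders.
    from metaflow import FlowSpec, step, pypi


    class SideBySideFlow(FlowSpec):

        @pypi(packages={"pandas": "2.2.2"})
        @step
        def start(self):
            import pandas as pd
            print("modern environment:", pd.__version__)
            self.next(self.legacy)

        @pypi(packages={"pandas": "1.5.3", "scikit-learn": "1.3.2"})
        @step
        def legacy(self):
            # An older environment for a legacy model, isolated from the step above
            import pandas as pd
            print("legacy environment:", pd.__version__)
            self.next(self.end)

        @step
        def end(self):
            pass


    if __name__ == "__main__":
        SideBySideFlow()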

When it is time to push projects into production, you can rest assured that the images are securely stored in your cloud account, and kept stable and interference-free for each deployment.

Start building today

To test the new features in your environment, sign up today for a 30-day free trial. Or come back tomorrow for more announcements!

If you're interested in building systems with GenAI specifically, don't miss our webinar with NVIDIA on Friday.

Join our office hours for a live demo! Whether you're curious about Outerbounds or have specific questions - nothing is off limits.

