Build Reproducible and Scalable Computational Biology Systems
This post introduces high-level trends at the intersection of biology and AI, discusses new (and old) technical challenges in building reproducible and scalable systems for AI-driven computational biology, and shows how frameworks like Metaflow can help address them.
Better, Faster, Stronger LLM Fine-tuning
Reactive, configurable, cheaper LLM fine-tuning with Metaflow
Retrieval-Augmented Generation: How to Use Your Data to Guide LLMs
Learn how to use Retrieval-Augmented Generation to control hallucinations and get more relevant responses from LLMs.
Fine-tuning a Large Language Model using Metaflow, featuring LLaMA and LoRA
A workflow template built with Metaflow for fine-tuning LLMs for custom use cases.
Large Language Models and the Future of Custom, Fine-tuned LLMs
An overview of instruction tuning for large language models using the LLaMA family of models.
Training a Large Language Model With Metaflow, Featuring Dolly
We use Metaflow to train Dolly, demonstrating what it takes to fine-tune LLMs and use these models in practice.
Large Language Models and the Future of the ML Infrastructure Stack
A peek into how Outerbounds views the ongoing evolution of the machine learning stack, in the wake of recent LLM waves.