Guides
Announcing Vellum VPC
Aug 27, 2024
Akash Sharma
Co-founder & CEO

TLDR

We're excited to announce that Vellum now supports Virtual Private Cloud (VPC) installations, allowing enterprises to deploy our AI development platform within their own cloud environment (AWS, Azure, or GCP). With a VPC deployment, all sensitive data stays in your infrastructure, supported by our partnership with Replicated for seamless updates and remote debugging.

As a development platform that enables companies to build AI systems on top of Large Language Models, we knew we were working with sensitive data, ranging from PHI at healthcare companies to personal financial information at banks and insurance companies. We designed the product from day one with enterprise-grade security in mind and have held SOC 2 Type II certification and HIPAA compliance for some time now. However, we kept hearing requests for more:

  • “Our company’s data shouldn’t leave our four walls”
  • “We need better alignment with our security policies to meet industry best practices”
  • “We would like to find more ways to use our pre-committed cloud computing spend”

Today we’re excited to announce support for Virtual Private Cloud installations of our software in our customers’ cloud environments. For enterprises operating in regulated industries or looking to keep their data in-house, a VPC deployment of Vellum is available on all major cloud providers: AWS, Azure, and GCP.

How Vellum’s VPC offering works

Vellum’s Virtual Private Cloud installation gives you complete control over your data. If you use language models hosted in your private cloud, you can also use our platform tooling while keeping all testing and production data private and secure.

We’ve partnered with Replicated so you can easily self-host Vellum in your VPC and keep it up to date without our team needing any access to your infrastructure. Data and compute live in your cloud, and compute usage can be charged against any credits offered by your cloud provider. We rely on Replicated to deliver application updates and on its support bundles to debug issues remotely.
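
For reference, here is a minimal sketch of what a Replicated KOTS-based install can look like from inside your own cluster. The app slug and namespace below are hypothetical placeholders for illustration, not the exact commands from our docs:

  # Install Replicated's KOTS plugin for kubectl
  curl https://kots.io/install | bash

  # Install the application into a namespace in your cluster
  kubectl kots install vellum --namespace vellum

  # Later, check for and deploy application updates, without giving our team cluster access
  kubectl kots upstream upgrade vellum --namespace vellum

Support bundles, a Replicated feature, can then be generated from the KOTS admin console and shared with our team so we can debug issues without direct access to your environment.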

Vellum VPC vs Vellum Managed

When it comes to choosing between Vellum VPC and Vellum Managed, the decision often hinges on your company’s specific needs around data residency, compliance, and security. Vellum VPC is designed for organizations that operate in highly regulated environments or have stringent internal policies that require complete control over their data infrastructure. It provides a dedicated environment within your own virtual private cloud, giving you full visibility and control over your data and workflows.

On the other hand, Vellum Managed is ideal for companies looking for a more hands-off approach. It offers the convenience of a fully managed service, where Vellum takes care of all the operational aspects, including scaling, security updates, and maintenance. This option is perfect for teams that want to focus on building and deploying AI solutions without worrying about the underlying infrastructure.

Our VPC offering is meant for companies with strict data residency, compliance, or security requirements. Below is a comparison of the key features and benefits of Vellum VPC versus Vellum Managed to help you determine which option is the best fit for your organization:

[Comparison table: how Vellum's VPC offering compares to the Managed offering.]

If you have concerns about your company’s data but would like to use AI in production, we’d love to support you! We provide the tooling and best practices while adhering to your privacy and security requirements. You can now deploy Vellum in your own cloud, ensuring data doesn’t leave your four walls while drawing down your pre-existing cloud commitments.

If you’d like to learn more, get in touch!

ABOUT THE AUTHOR
Akash Sharma
Co-founder & CEO

Akash Sharma, CEO and co-founder of Vellum (YC W23), is enabling developers to easily start, develop, and evaluate LLM-powered apps. By talking to over 1,500 people at varying stages of maturity in using LLMs in production, he has acquired a unique understanding of the landscape and is actively sharing his learnings with the broader LLM community. Before starting Vellum, Akash completed his undergrad at the University of California, Berkeley, then spent 5 years at McKinsey's Silicon Valley office.
