
Google Cloud’s Colab Enterprise environment to help tune LLMs


At its Cloud Next conference, the company added new features for advanced MLOps for generative AI and launched Ray on Vertex AI for scaling AI workloads efficiently.


Google Cloud on Tuesday launched a managed data science notebook environment, dubbed Colab Enterprise, to help data scientists customise and tune large language models (LLMs) for their enterprises.

Currently in public preview, with general availability planned for September, Colab Enterprise is based on Google’s cloud-based Jupyter notebook named Colab. Colab currently has seven million monthly active users, Google said.

“Powered by Vertex AI, Colab Enterprise provides access to everything Vertex AI’s platform has to offer, from Model Garden and a range of tuning tools, to flexible compute resources and machine types, to data science and MLOps tooling,” the company said in a statement.

Colab Enterprise powers a notebook experience for BigQuery Studio — a new collaborative workspace for discovering, exploring, analysing, and predicting data in BigQuery.

This feature, according to the company, allows enterprise users to start a notebook in BigQuery to prepare data and then continue working in the same notebook in Vertex AI, with access to its AI infrastructure and tooling.

“Teams can directly access data wherever they're working. With the ability to share notebooks across team members and environments, Colab Enterprise can effectively eliminate the boundaries between data and AI workloads,” the company said.

Google is also expanding support for open source frameworks with the addition of Ray on Vertex AI. Ray joins existing supported frameworks such as TensorFlow, PyTorch, scikit-learn, and XGBoost.

Using Ray, according to Google, will help enterprises reduce costs and boost productivity.

Ray’s support on Vertex AI will allow data scientists to create a Ray cluster and connect it to the Colab Enterprise notebook in order to scale a model training job, the company said, adding that open source models, such as Llama 2, Dolly, or Falcon, can be opened inside the managed data science notebook environment for tuning and prototyping.

Enhancing MLOps for generative AI

Google is also introducing new MLOps capabilities targeted at generative AI. “We’re also introducing a new tuning method for Imagen called Style Tuning, so enterprises can create images aligned to their specific brand guidelines or other creative needs, with as few as 10 reference images required,” the company said, adding that it was making supervised tuning generally available for PaLM 2 Text.

Other capabilities include two new features, Automatic Metrics and Automatic Side by Side, that support continuous iteration and improvement by letting enterprises evaluate the quality of their machine learning models.

While Automatic Metrics evaluates a model against a defined task and dataset, Automatic Side by Side uses a large model to evaluate the output of multiple models being tested.

Google Cloud is also adding a new generation of Vertex AI Feature Store to help enterprises avoid data duplication and preserve data access policies.

Built on BigQuery, the new Feature Store will natively support the vector embedding data type, easing real-time access to unstructured data, the company said.
