Databricks on Google Cloud is a jointly developed service that allows data teams (data engineering, data science, analytics, and ML professionals) to store data in a simple, open lakehouse platform for all data, AI and analytics workloads.
The data analytics company said in a blog post, “Since the announcement of Databricks on Google Cloud, we have seen tremendous momentum for this partnership as customers pursue a multi-cloud approach to their analytics and DS/ML workloads.” It added, “Customers are demanding a simple, unified platform as they move workloads to Databricks on Google Cloud built on open standards with technologies like Delta Lake, MLflow and Google Kubernetes Engine.”
The GA release includes several new features:
- Repos and Projects support to sync your work with a remote Git repository
- Table ACLs that let you programmatically grant and revoke access to data from Python and SQL
- DB Connect to connect to Databricks from your favourite IDE
- Cluster Tags for DBU Usage tracking
- Local SSD Support for caching and improved performance
- Tableau connector to Databricks on Google Cloud
- Terraform provider to easily provision and manage Databricks along with associated cloud infrastructure.
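To give a rough sense of the table ACL feature listed above, the sketch below builds the GRANT and REVOKE statements that would be run through Spark SQL on a cluster with table access control enabled. The table and group names, and the helper functions themselves, are illustrative assumptions, not part of the product.

```python
# Minimal sketch of Databricks table ACL statements. The table name
# ("sales.orders") and principal ("analysts") are hypothetical.

def grant_sql(privilege: str, table: str, principal: str) -> str:
    """Build a GRANT statement for a table ACL."""
    return f"GRANT {privilege} ON TABLE {table} TO `{principal}`"

def revoke_sql(privilege: str, table: str, principal: str) -> str:
    """Build the matching REVOKE statement."""
    return f"REVOKE {privilege} ON TABLE {table} FROM `{principal}`"

# On a live cluster these strings would be executed via the Spark session, e.g.:
#   spark.sql(grant_sql("SELECT", "sales.orders", "analysts"))
print(grant_sql("SELECT", "sales.orders", "analysts"))
print(revoke_sql("SELECT", "sales.orders", "analysts"))
```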
Databricks delivers tight integrations with Google Cloud’s compute, storage, analytics and management products. These include the first Google Kubernetes Engine (GKE)-based, fully containerized Databricks runtime on any cloud, as well as pre-built connectors that quickly and seamlessly integrate Databricks with BigQuery, Google Cloud Storage, Looker and Pub/Sub. In addition, customers can deploy Databricks from the Google Cloud Marketplace for simplified procurement and user provisioning, single sign-on and unified billing.
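As a hedged sketch of what the pre-built BigQuery connector looks like from a notebook, the snippet below assembles the read options the connector expects and shows (as a comment) how they would be passed to a Spark read. The project, dataset and table names are invented for illustration, and the helper function is ours.

```python
# Sketch of reading a BigQuery table from Databricks on Google Cloud via the
# pre-built BigQuery connector. All names below are illustrative assumptions.

def bigquery_read_options(table: str) -> dict:
    """Assemble the options for a spark.read.format('bigquery') call."""
    return {"table": table}

opts = bigquery_read_options("my-project.my_dataset.events")

# On a live cluster (not runnable locally):
#   df = spark.read.format("bigquery").options(**opts).load()
print(opts)
```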