Databricks and Dash Integration

The founders of Databricks created Apache Spark, as well as other open-source data science and machine learning projects, making them valued Plotly partners. Dash applications using Databricks can be easily developed and deployed to Dash Enterprise. Read the documentation or access resources below to get started.

Ballard Power Scales Hydrogen Fuel Cell Fleet Diagnostics with Databricks and Dash


Ballard Power Systems transitioned to an Azure cloud-based architecture, leveraging Databricks for data ingress, at-scale analytics, and compute, alongside Dash Enterprise for interactive reporting and visualization. Register for the webinar on October 19.


Holmusk Delivers Behavioral Health Evidence to Top Biotech and Pharma Companies

Dash Enterprise enabled Holmusk to maximize data-driven insights within their NeuroBlu platform, accelerating development, decision-making, and deployment. They are also adopting Databricks to ensure a long-term, scalable data and analytics architecture.

Collins Aerospace

Collins Aerospace uses Lakehouse and Apache Spark for Jet Streaming Data and Predictive Analytics

With their product Ascentia, Collins Aerospace helps airlines ensure reliable flight schedules and fewer delays by anticipating aircraft maintenance issues in advance. They use Dash Enterprise and Databricks to develop production-grade, highly custom, data-rich apps that are not possible with other tools. Watch the presentation.

Sidmach User Story

Sidmach Launches AI-Driven Platform with Dash Enterprise + Azure Databricks

Dash Enterprise's front-end framework and Databricks' back-end architecture empower Sidmach to go beyond traditional BI tools. View the webinar highlight reel.

Lakehouse Apps

Databricks Lakehouse Apps Announcement

Plotly is pleased to be an early collaborator on the launch of Databricks Lakehouse Apps. Over 3,200 enterprises currently use both Plotly Dash and Databricks, and we look forward to what this collaboration will mean for the future of data and AI.


Mitzu decreases feature ship time from weeks or months to just a couple of days

Using Plotly and Dash, Mitzu.io created a purpose-built analytics solution that takes data from an at-scale lakehouse, interpreting complex Delta table schemas and identifying variations in event properties.


Revenue.AI’s commercial SaaS offering uses Plotly Dash & Databricks

Revenue.AI shares how they built an enterprise-grade production data app that seamlessly connects with their enterprise data architecture.

Plotly in Databricks’ 2023 Top 10 Data & AI Products

Plotly is #2 in Databricks’ 2023 Top 10 Data & AI Products

After examining trends across 9,000 global customers, Databricks ranked Plotly second among data and AI products, highlighting production data apps for interactive, at-scale analytics over traditional BI.

Building Plotly Dash Apps on a Lakehouse with Databricks SQL (Advanced Edition)


Maximizing the value of Databricks-enabled Dash apps using the new SQLAlchemy integration
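As a minimal sketch of what the SQLAlchemy integration looks like in practice: the helper below builds the connection URL used by the `databricks-sqlalchemy` dialect. The hostname, HTTP path, and token shown are placeholder assumptions, not real values.

```python
# Sketch of wiring a Dash app to Databricks SQL through SQLAlchemy.
# All connection values below are placeholders, not real credentials.
import os

def databricks_sqlalchemy_url(host: str, http_path: str, token: str) -> str:
    """Build a SQLAlchemy URL for the databricks-sqlalchemy dialect."""
    return f"databricks://token:{token}@{host}?http_path={http_path}"

url = databricks_sqlalchemy_url(
    host=os.getenv("DATABRICKS_HOST", "dbc-example.cloud.databricks.com"),
    http_path=os.getenv("DATABRICKS_HTTP_PATH", "/sql/1.0/warehouses/abc123"),
    token=os.getenv("DATABRICKS_TOKEN", "dapi-placeholder"),
)
# With `pip install databricks-sqlalchemy` you would then create an engine
# and query it like any other warehouse:
#   engine = sqlalchemy.create_engine(url)
#   df = pd.read_sql("SELECT 1", engine)
```

Reading credentials from environment variables keeps secrets out of the app code when it is deployed to Dash Enterprise.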

Uniper Architecture

Uniper's Energy Trading Analytics is enabled by Dash Enterprise and Databricks

Uniper develops trading decision-support tools and advanced analytics data apps using a workflow that includes initial collaboration in Azure Databricks notebooks. View the webinar highlight reel.

Molson Coors

Molson Coors Streamlines Supply Planning Workflows with Databricks & Plotly Dash

Migrating from Excel using Databricks SQLAlchemy & Plotly Dash AG Grid to deliver editable data apps with write-back capabilities

Databricks and Plotly Dash

Build Real-Time Production Data Apps with Databricks & Plotly Dash

Build at-scale, interactive Plotly Dash analytics apps on streaming data by pairing Databricks Structured Streaming with the Databricks SQL Connector for Python (DB SQL). Watch the joint technical webinar.
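The pattern described above can be sketched as a polling loop: Structured Streaming keeps a Delta table up to date, and the Dash app re-queries it on a timer. The table and column names below are illustrative assumptions; in a real app the SQL built here would be executed through `databricks.sql.connect(...)` inside a callback fired by a `dcc.Interval` component.

```python
# Hypothetical sketch of the polling pattern behind a streaming Dash app.
# Table and column names are assumptions for illustration only.

def latest_rows_query(table: str, time_col: str = "event_time", limit: int = 100) -> str:
    """SQL for the newest rows of a Delta table fed by Structured Streaming."""
    return f"SELECT * FROM {table} ORDER BY {time_col} DESC LIMIT {limit}"

def refresh(n_intervals):
    """Body of a Dash callback wired to Input('poll', 'n_intervals'):
    each interval tick would run this SQL and re-render the figure."""
    return latest_rows_query("main.telemetry.sensor_events")
```

Because the query always orders by the event-time column, each tick of the interval picks up whatever rows the streaming job has appended since the last poll.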

Databricks SQL and Plotly Dash

Building Plotly Dash Apps on a Lakehouse with Databricks SQL

For building Plotly Dash apps on Databricks, the integration process is the same as for any data warehouse. Use the Databricks SQL Connector for Python (DBSQL) to create a JDBC/ODBC connection to a DBSQL endpoint, or use an ORM such as SQLAlchemy.
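A hedged sketch of the connector route: the hostname, HTTP path, and token are placeholders, and the query targets the `samples.nyctaxi.trips` dataset that ships with Databricks SQL warehouses. The import is deferred so the sketch stays importable without `databricks-sql-connector` installed.

```python
# Sketch of querying a DBSQL endpoint with the databricks-sql-connector
# package. Connection values below are placeholder assumptions.

def trips_query(limit: int) -> str:
    """SQL against the built-in NYC taxi sample table."""
    return f"SELECT * FROM samples.nyctaxi.trips LIMIT {int(limit)}"

def fetch_trips(limit: int = 10):
    """Run the query on a SQL warehouse.
    Requires `pip install databricks-sql-connector`."""
    from databricks import sql  # lazy import: sketch loads without the package
    with sql.connect(
        server_hostname="dbc-example.cloud.databricks.com",  # placeholder
        http_path="/sql/1.0/warehouses/abc123",              # placeholder
        access_token="dapi-placeholder",                     # placeholder
    ) as conn:
        with conn.cursor() as cur:
            cur.execute(trips_query(limit))
            return cur.fetchall()
```

The rows returned by `fetch_trips` can be handed straight to `pandas.DataFrame` and then to a Plotly figure inside a Dash callback.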

Dash Enterprise for Public Utility Operations


Within one year, Dash enabled a major public utility to reduce its customer complaints tenfold, scale Dash usage across the company, and integrate AI and Databricks.

Databricks and Plotly Dash - Integration Options

Easily connect your Databricks back end to Dash with the Python SQL Connector, Jobs API, Databricks CLI, or Databricks Connect.

  • SQLAlchemy — Python SQL Connector with SQLAlchemy
    • Seamlessly build Dash apps on a Lakehouse with Databricks SQL. Use the Databricks SQL Connector for Python to connect your Dash app to Databricks clusters and SQL warehouses and run SQL commands.
  • Jobs API
    • Use the Databricks Jobs API to run a data processing task for your Dash app in a cluster with scalable resources. Suitable for single tasks or large, multi-task workflows.
  • Databricks Connect v2
    • With Databricks Connect v2, connect Dash directly to your Databricks Lakehouse and use the Dash Enterprise Workspaces IDE for interactive development and debugging.
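To make the Jobs API option above concrete, here is a minimal sketch that builds (but does not send) a run-now request against the documented Jobs API 2.1 endpoint, using only the standard library. The host, token, job ID, and notebook parameters are placeholder assumptions.

```python
# Sketch of triggering a Databricks job for a Dash app via the Jobs API 2.1.
# Host, token, job_id, and parameters are placeholders, not real values.
import json
import urllib.request

def run_now_request(host, token, job_id, notebook_params=None):
    """Build (but do not send) a POST to /api/2.1/jobs/run-now."""
    payload = {"job_id": job_id}
    if notebook_params:
        payload["notebook_params"] = notebook_params
    return urllib.request.Request(
        f"https://{host}/api/2.1/jobs/run-now",
        data=json.dumps(payload).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )

req = run_now_request("dbc-example.cloud.databricks.com", "dapi-placeholder",
                      job_id=123, notebook_params={"run_date": "2023-10-19"})
# urllib.request.urlopen(req) would actually trigger the job; omitted here.
```

A Dash callback could fire such a request when a user clicks a button, letting the heavy processing run on a scalable Databricks cluster instead of in the app process.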


Dask enables your production-grade Dash application to load and process very large datasets or models using distributed computing with familiar Python data science tools.
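As a minimal illustration of that Dask pattern: partition a large workload and aggregate it in parallel with familiar Python tooling. This assumes `dask` is installed (`pip install dask`), and the numeric workload is purely illustrative.

```python
# Minimal sketch of the Dask pattern a Dash app might lean on:
# split a large dataset into partitions and aggregate in parallel.
# The workload below is an illustrative stand-in for real data.
import dask.bag as db

readings = db.from_sequence(range(100_000), npartitions=8)
total = readings.map(lambda x: x * x).sum().compute()
```

The same `map`/`sum`/`compute` shape scales from a laptop to a distributed cluster without changing the application code.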


Build. Deploy. Scale.

Dash Enterprise




© 2023 Plotly. All rights reserved.