DASH ENTERPRISE
Databricks and Dash Integration
The founders of Databricks created Apache Spark, as well as other open-source data science and machine learning projects, making them valued Plotly partners. The Databricks platform offers a notebook interface, similar to Jupyter Notebooks, for working with Apache Spark. Dash applications that use Databricks can be easily developed and deployed to Dash Enterprise.
On-Demand Technical Session
December 13, 2022
Join solution architects from Databricks and Plotly to learn how the two platforms combine to deliver fast processing and visualization of real-time and large-scale data.
Medium: Build Real-Time Production Data Apps with Databricks & Plotly Dash
At-scale, interactive Plotly Dash analytics apps for streaming data on Databricks are built by combining Databricks Structured Streaming with the Databricks SQL connector for Python (DB SQL).
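As an illustration, here is a minimal sketch of that pattern. It assumes a Structured Streaming job keeps a Delta table up to date (the table name, column names, and environment variable names below are hypothetical placeholders), and a Dash app polls that table through the DB SQL connector on a dcc.Interval timer.

import os

import pandas as pd
import plotly.express as px
from dash import Dash, Input, Output, dcc, html
from databricks import sql  # pip install databricks-sql-connector

# Hypothetical Delta table kept current by a Structured Streaming job.
STREAMING_TABLE = "main.sensors.readings_silver"

def query_latest(limit: int = 500) -> pd.DataFrame:
    """Pull the most recent rows from the streaming Delta table via DB SQL."""
    with sql.connect(
        server_hostname=os.environ["DATABRICKS_SERVER_HOSTNAME"],
        http_path=os.environ["DATABRICKS_HTTP_PATH"],
        access_token=os.environ["DATABRICKS_TOKEN"],
    ) as conn:
        with conn.cursor() as cur:
            cur.execute(
                f"SELECT event_time, sensor_id, reading "
                f"FROM {STREAMING_TABLE} "
                f"ORDER BY event_time DESC LIMIT {limit}"
            )
            rows = cur.fetchall()
            cols = [c[0] for c in cur.description]
    return pd.DataFrame(rows, columns=cols)

app = Dash(__name__)
app.layout = html.Div(
    [
        dcc.Graph(id="live-graph"),
        # Re-query the SQL warehouse every 10 seconds.
        dcc.Interval(id="poll", interval=10_000, n_intervals=0),
    ]
)

@app.callback(Output("live-graph", "figure"), Input("poll", "n_intervals"))
def refresh(_):
    df = query_latest()
    return px.line(df, x="event_time", y="reading", color="sensor_id")

if __name__ == "__main__":
    app.run(debug=True)

Polling keeps the Dash side stateless: the streaming job owns the table, and the app only ever reads the latest rows it needs for the figure.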
Medium: Building Plotly Dash Apps on a Lakehouse with Databricks SQL
For building Plotly Dash apps on Databricks, the integration process is the same as for any other data warehouse: use the databricks-sql Python connector (DBSQL) to connect to a DBSQL endpoint, or use an ORM such as SQLAlchemy.
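A minimal sketch of the SQLAlchemy route is shown below, assuming a Databricks SQL warehouse and a hypothetical sales table, with connection details read from environment variables. Depending on the connector version, the Databricks SQLAlchemy dialect is either bundled with databricks-sql-connector or installed separately as databricks-sqlalchemy.

import os

import pandas as pd
from sqlalchemy import create_engine

# Hypothetical connection details; on Dash Enterprise these would typically
# come from environment variables or a secrets store.
HOST = os.environ["DATABRICKS_SERVER_HOSTNAME"]
HTTP_PATH = os.environ["DATABRICKS_HTTP_PATH"]
TOKEN = os.environ["DATABRICKS_TOKEN"]

# SQLAlchemy URL for the Databricks dialect.
engine = create_engine(
    f"databricks://token:{TOKEN}@{HOST}"
    f"?http_path={HTTP_PATH}&catalog=main&schema=default"
)

# Query a (hypothetical) Delta table exactly as you would any warehouse table.
df = pd.read_sql_query(
    "SELECT region, SUM(revenue) AS revenue FROM sales GROUP BY region",
    engine,
)
print(df.head())

Because the result comes back as an ordinary pandas DataFrame, it can feed a Dash figure or callback with no Databricks-specific code in the app layer.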
Customer Story: Dash Enterprise for Public Utility Operations
Within one year, Dash enabled a major public utility to cut customer complaints tenfold while the company scaled its Dash usage and integrated AI and Databricks.
Dash Apps Built with Dash Enterprise
Dash provides a friendly Python interface for creating flexible, interactive, and customizable apps that connect directly to your analytics code.
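For example, a minimal Dash app wires a dropdown to a Plotly figure with a single callback; the built-in gapminder sample dataset stands in here for your own analytics code.

import plotly.express as px
from dash import Dash, Input, Output, dcc, html

# Sample dataset standing in for the output of your analytics code.
df = px.data.gapminder()

app = Dash(__name__)
app.layout = html.Div(
    [
        html.H2("Life expectancy by country"),
        dcc.Dropdown(sorted(df["continent"].unique()), "Europe", id="continent"),
        dcc.Graph(id="chart"),
    ]
)

@app.callback(Output("chart", "figure"), Input("continent", "value"))
def update(continent):
    # Filter in plain pandas, then return a Plotly figure.
    subset = df[df["continent"] == continent]
    return px.line(subset, x="year", y="lifeExp", color="country")

if __name__ == "__main__":
    app.run(debug=True)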
Architecture
Dask enables your production-grade Dash application to load and process very large datasets or models using distributed computing with familiar Python data science tools.
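A minimal sketch of that architecture might look like the following, with a hypothetical Dask scheduler address and Parquet dataset: the Dash callback pushes the heavy aggregation to the Dask workers and pulls back only the small result needed for the figure.

import dask.dataframe as dd
import plotly.express as px
from dash import Dash, Input, Output, dcc, html
from dask.distributed import Client

# Connect to an existing Dask scheduler (address is hypothetical); omit the
# argument to spin up a local cluster instead.
client = Client("tcp://dask-scheduler:8786")

# Lazily reference a large partitioned dataset; nothing is loaded yet.
trips = dd.read_parquet("s3://example-bucket/trips/*.parquet")

app = Dash(__name__)
app.layout = html.Div(
    [
        dcc.Dropdown(["distance", "fare", "tip"], "fare", id="metric"),
        dcc.Graph(id="summary"),
    ]
)

@app.callback(Output("summary", "figure"), Input("metric", "value"))
def summarize(metric):
    # The heavy groupby runs on the Dask workers; only the small aggregated
    # result is pulled back into the Dash process for plotting.
    daily = trips.groupby("pickup_date")[metric].mean().compute().reset_index()
    return px.line(daily, x="pickup_date", y=metric)

if __name__ == "__main__":
    app.run(debug=True)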