Persistent File System in Dash Enterprise

Meet the Speaker

Austin Kiesewetter
Austin Kiesewetter is a Plotly community member and software engineer at Martin Engineering.
This video walks through how to use the persistent file system in Dash Enterprise to manage large or dynamic datasets within your Dash apps. It demonstrates how to enable the persistent file system in the workspace and explains how it mounts a shared directory inside your Dash app environment. Files like CSVs can be dropped or written into this mount directory and then read directly by the app at runtime.
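As a rough sketch of what that looks like at runtime, the app can treat the shared directory like any local path. The `mount/` location here is an assumption for illustration; the actual mount path depends on how the persistent file system is configured for your Dash Enterprise app.

```python
from pathlib import Path

# Hypothetical mount location; the actual path depends on how the
# persistent file system is configured for your Dash Enterprise app.
MOUNT_DIR = Path("mount")

# List the CSVs currently available in the shared directory, whether they
# were dropped in through the workspace or written by another process.
for csv_file in sorted(MOUNT_DIR.glob("*.csv")):
    print(csv_file.name, csv_file.stat().st_size, "bytes")
```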
The example includes both a Dash app and a Python script. The app reads a CSV file called dynamic_data.csv and displays it in a Dash dcc.Graph component. The accompanying script, task.py, generates new data and writes it to this CSV every five seconds. It uses Python's datetime and random modules to simulate a time-series dataset, which is appended to the file using pandas.
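The actual code isn't reproduced here, but a minimal sketch of each piece follows, assuming the CSV lives at `mount/dynamic_data.csv` with `timestamp` and `value` columns (the path and column names are assumptions, not taken from the video). First, an app that re-reads the CSV on each page load and plots it in a `dcc.Graph`:

```python
import pandas as pd
import plotly.express as px
from dash import Dash, dcc, html

# Assumed location of the shared CSV inside the persistent file system mount.
DATA_PATH = "mount/dynamic_data.csv"

app = Dash(__name__)
server = app.server  # exposed so Gunicorn can serve the app

def serve_layout():
    # Re-read the CSV on every page load so the graph reflects whatever
    # task.py has appended since the last request.
    df = pd.read_csv(DATA_PATH)
    fig = px.line(df, x="timestamp", y="value")  # assumed column names
    return html.Div([dcc.Graph(figure=fig)])

# Assigning a function (rather than a component) makes Dash call it on
# each page load, picking up fresh data without any callbacks.
app.layout = serve_layout

if __name__ == "__main__":
    app.run(debug=True)
```

And a sketch of what task.py might look like, appending one simulated row every five seconds:

```python
import os
import random
import time
from datetime import datetime

import pandas as pd

# Same assumed path as in the app.
DATA_PATH = "mount/dynamic_data.csv"

while True:
    # Simulate one new time-series observation.
    row = pd.DataFrame(
        {"timestamp": [datetime.now().isoformat()], "value": [random.random()]}
    )
    # Append to the shared CSV, writing the header only if the file is new.
    row.to_csv(DATA_PATH, mode="a", header=not os.path.exists(DATA_PATH), index=False)
    time.sleep(5)
```

Because the layout is a callable, the graph picks up new rows on every page load without needing an Interval component or a database round trip.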
To run both the Dash app and the data-generation script in production, a Procfile defines two processes: one serves the app via Gunicorn, while a second runs the script continuously. This setup supports real-time data updates in the app without relying on external APIs or databases.
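A Procfile along these lines would implement that setup. The module name `app` and the `server` attribute are assumptions about how the app file is structured; `task.py` is the script named above.

```
web: gunicorn app:server
worker: python task.py
```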
Key benefits:
- Store and serve datasets up to 25 GB from the local cache
- Dynamically update files used in app visualizations
- Improve load times and performance for high-volume data
Watch the video to follow along and see it in action.