

Chris Parmer

November 3, 2023 - 10 min read

Chris' Reflections on GenAI Dash Apps

Chris Parmer, Plotly Co-Founder and Chief Product Officer, shares his insights with the Dash community and the more than 8,000 subscribers of the Dash Club newsletter.

In this edition of Dash Club, Chris addresses how Plotly is thinking about generative AI and LLMs within the changing data science landscape.

Processing unstructured data with Dash

Dash apps traditionally deal with structured inputs (e.g. a set of dropdowns that map to a set of parameters), a computational engine (crunching numbers), and structured outputs (e.g. a datatable or graph).

GenAI Dash Apps enable you to swap out structured data or traditional computation with natural language:

  • Unstructured Input - A natural language prompt or a chatbot instead of a set of dropdowns (“Hey Siri”).
  • Unstructured Output - Natural language output like paragraphs of text or chat instead of graphs or tables.
  • GenAI Engine - Generating text instead of crunching numbers.

The easiest apps to build today have unstructured data on both sides. For example, ask a question (Unstructured Input) to summarize (GenAI Engine + Unstructured Output) a large document (Unstructured Input).

Summarizing text is a super common task among enterprises, one that has traditionally required hard NLP, sophisticated search infrastructure, or humans in the loop. Our world runs on unstructured text. Finance analysts listen to earnings calls. Pharma scientists consume scientific literature & clinical trials. Support reps read bug reports. Lawmakers write laws as text.

So an entirely new set of use cases is now unlocked within Dash. And it’s trivial to build a Dash app with this unstructured data:

from dash import Input, Output, html, dcc, callback, Dash
import llm

app = Dash(__name__)
docs = ['transcript1.txt', 'transcript2.txt']

app.layout = html.Div([
    doc := dcc.Dropdown(docs, 'transcript1.txt'),
    html.Label('Ask a question'),
    question := dcc.Textarea(),
    output := html.Div(),
])

@callback(Output(output, 'children'), Input(question, 'value'), Input(doc, 'value'))
def update(question, selected_doc):
    if selected_doc not in docs:
        return 'Document not found'
    if not question:
        return ''
    with open(selected_doc, 'r') as f:
        content = f.read()
    # Run the question and the selected transcript through a local model
    model = llm.get_model("llama2")
    prompt = (
        'Answer the following question based off of the following document:\n' +
        'question: ' + question + '\n'
        'document:\n' + content
    )
    return model.prompt(prompt).text()

if __name__ == '__main__':
    app.run(debug=True)

(This example uses the LLM library by Simon Willison - I highly recommend checking out this package and definitely recommend his blog.)

This app runs on my MacBook's GPU. It's about twice as slow as gpt-3.5-turbo, but it runs entirely locally! So you don’t have to worry about leaking sensitive data to a 3rd party. And since LLMs are already pre-trained and work with a wide variety of text, we don’t need to train them on sensitive data the way traditional ML models have required. Sensitive data doesn’t leak into the training set - it’s already trained!
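
For instance, swapping between a local model and a hosted one is roughly a one-line change with the llm library. A minimal sketch, assuming you have installed an llm plugin that registers a "llama2" model alias and configured an OpenAI key for the hosted option:

import llm

# Local model: requires an llm plugin that provides a "llama2" alias
# (for example, a llama.cpp-based plugin). Nothing leaves your machine.
local_model = llm.get_model("llama2")

# Hosted model: same API, but the document text is sent to a 3rd party.
hosted_model = llm.get_model("gpt-3.5-turbo")

prompt = "Summarize the key risks mentioned in this earnings call transcript: ..."
print(local_model.prompt(prompt).text())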

New models are released almost every day and so I expect these models to get smaller, faster, and easier to use across dev & prod environments. The current models are trained on vast swaths of the web but we may not need to encode the entire web’s discourse to work with text within specialized domains.

The app above has unstructured data all the way through: unstructured input into a GenAI model which produces unstructured output. Mixing and matching structured with unstructured gets more difficult. Imagine Dash apps that evolve to use natural language within their UIs as an alternative to dropdowns, sliders, and inputs. This is still difficult to build and manage (see Honeycomb's All the Hard Stuff Nobody Talks About when Building Products with LLMs). The challenges include parsing the output, dealing with consistent quality, rate limiting, and more. But it will get there. And once we enable natural language within apps, we could replace the input box with a microphone. Siri for your Dash apps!
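
One common pattern for that mixing is asking the model for structured output and validating it before it touches the UI. Here is a minimal sketch of the idea, not a recommended recipe; the prompt wording, the expected keys, and the fallback behavior are all illustrative assumptions:

import json
import llm

model = llm.get_model("llama2")

def question_to_filter(question):
    """Ask the model to translate a natural-language question into
    dropdown-style parameters, then validate whatever comes back."""
    prompt = (
        'Convert this question into JSON with the keys "metric" and "year". '
        'Respond with JSON only.\n'
        'Question: ' + question
    )
    raw = model.prompt(prompt).text()
    try:
        params = json.loads(raw)
    except json.JSONDecodeError:
        return None  # inconsistent output quality: fall back to the manual controls
    if params.get('metric') not in {'revenue', 'margin'}:
        return None  # the backend only supports a strict set of inputs
    return params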

But unstructured inputs have their challenges. As an end-user, the "blank slate" can be difficult to deal with: what types of questions can I ask? Fifty years of user interface design have solved these problems with components like dropdowns, which give users the list of available options and a jumping-off point for exploration. And the "backend" of a data app won't support every type of query a user might ask, especially in computational apps that have a strict set of inputs.
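
One lightweight way to soften the blank slate is to combine the two: offer a structured list of example questions that seeds the free-form prompt box. A small sketch in the style of the app above (the suggested questions are placeholders):

from dash import Dash, Input, Output, callback, dcc, html

app = Dash(__name__)

# Placeholder suggestions; a real app would tailor these to its documents.
SUGGESTIONS = [
    'Summarize the key risks in this transcript',
    'What guidance did management give for next quarter?',
]

app.layout = html.Div([
    html.Label('Try one of these questions, or write your own'),
    suggestion := dcc.Dropdown(SUGGESTIONS),
    question := dcc.Textarea(),
])

@callback(Output(question, 'value'), Input(suggestion, 'value'), prevent_initial_call=True)
def fill_question(selected):
    # Copy the chosen suggestion into the free-form prompt box.
    return selected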

LLMs don’t compute. They predict text, one word at a time. They could generate the code to compute, but they won’t compute themselves. Your data apps that compute won’t be replaced with an LLM directly. We may have LLMs on either side of the app to guide a user to pick input parameters or interpret output results. But the computation in the middle remains important in the new era of AI. As Stephen Wolfram pointed out recently, “[Our] modern technological world has been built on engineering that makes use of at least mathematical computations—and increasingly also more general computations.”
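
A sketch of that "LLM on either side" shape: the numbers still come from ordinary computation, and the model only narrates them. The forecasting function below is a deliberately simple stand-in for whatever engine a real app would use:

import llm
import numpy as np

model = llm.get_model("llama2")

def forecast_revenue(history, periods=4):
    # The computation stays ordinary code: a simple linear trend,
    # standing in for whatever engine your app really uses.
    x = np.arange(len(history))
    slope, intercept = np.polyfit(x, history, 1)
    future_x = np.arange(len(history), len(history) + periods)
    return slope * future_x + intercept

def explain_forecast(history):
    forecast = forecast_revenue(history)
    # The LLM sits on the output side, turning numbers into narrative.
    prompt = (
        'In two sentences, describe this revenue forecast for a non-technical reader:\n'
        + ', '.join(f'{v:.1f}' for v in forecast)
    )
    return model.prompt(prompt).text()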

Even if our apps remain structured, our human interaction around the apps isn’t. We explain to our end users what the app does, we tell stories about the results, we train end-users on what the app can and can’t do, we field questions & bug reports. We demo our work verbosely. I can imagine AI being able to handle this informal contextual exchange in the future. Perhaps Dash app developers will first give the demo of their app to the LLM before they give it to the end-user, and that demo will become the in-app chatbot's system prompt to field questions from the end user. Incorporating these chatbots into the apps we share could enable a new era of feedback and engagement in the products and data apps that we create.
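
A rough sketch of that last idea: the developer's write-up of what the app does becomes the system prompt of an in-app chatbot. The app description below is a placeholder written for illustration, and the sketch assumes the llm library's support for passing a system prompt alongside the user prompt:

import llm

# A hypothetical "demo" of the app, written once by its developer.
APP_DEMO = (
    "This app summarizes earnings-call transcripts. Users pick a transcript "
    "from a dropdown and ask free-form questions about it. It cannot fetch "
    "new transcripts or do numerical forecasting."
)

model = llm.get_model("llama2")

def in_app_chatbot(user_question):
    # The developer's demo becomes the system prompt, so the chatbot can
    # field "what can this app do?" questions from end users.
    return model.prompt(user_question, system=APP_DEMO).text()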

Exciting times! Want to explore Dash with GenAI? Join one of our community challenges. The next one explores LangChain. See more details on the forum.

If your organization wants to build their next AI app with Dash Enterprise, reach out to us. We'd love to help.

Want more Plotly & Dash updates?

The Dash Club newsletter brings essays and updates about Plotly and Dash every 8 weeks. To have them delivered directly to your inbox, sign up.

View the full Dash Club edition to learn about our Dash online course, the Dash-LangChain app challenge, and the Component and App of the Month.
