Step-by-step Google BigQuery data connection setup
Google BigQuery is a serverless, highly scalable cloud data warehouse designed for fast SQL queries on massive datasets. It's tightly integrated with the Google Cloud ecosystem and supports real-time analytics at petabyte scale.
Connecting to Google BigQuery requires a service account JSON key generated from the Google Cloud Console. Follow these steps to connect to your BigQuery data and visualize it with Plotly Studio:
Step 1: Create a Google Cloud project
Log into the Google Cloud Console and ensure you have an active project with the BigQuery API enabled. If you're starting fresh, create a new project and note the Project ID; you'll need it later.
Step 2: Create a service account
In the GCP Console, use the global search bar to find Service Accounts (under IAM & Admin). Click Create Service Account and fill in a display name and description. For the role, assign BigQuery > BigQuery Admin, or a more restrictive role such as BigQuery Data Viewer if you only need read access (recommended for production).
Step 3: Generate a JSON key
Once your service account is created, navigate to its detail page and open the Keys tab. Click Add Key > Create new key, select the JSON format, and click Create. A .json file will download to your machine. This is your credential file. Open it in any text editor; you'll paste the entire contents directly into Plotly Studio.
Credentials needed
The JSON key file contains all required fields. When you paste it into Plotly Studio, the following values are extracted automatically:
- type: authentication type (will be service_account)
- project_id: your GCP project ID
- private_key_id: unique key identifier
- private_key: RSA private key for authentication
- client_email: the service account email address
- client_id: the service account client ID
- auth_uri / token_uri: Google OAuth endpoints
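The key fields above can be sanity-checked with a few lines of plain Python before pasting the file anywhere. This is a minimal sketch: `SAMPLE_KEY` uses placeholder values (not real credentials), and `parse_key` is a hypothetical helper name, not part of any Google library.

```python
import json

# A minimal JSON key structure (all values are placeholders, not real credentials).
SAMPLE_KEY = """{
  "type": "service_account",
  "project_id": "my-gcp-project",
  "private_key_id": "abc123",
  "private_key": "-----BEGIN PRIVATE KEY-----\\n...\\n-----END PRIVATE KEY-----\\n",
  "client_email": "studio-reader@my-gcp-project.iam.gserviceaccount.com",
  "client_id": "1234567890",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://oauth2.googleapis.com/token"
}"""

def parse_key(raw: str) -> dict:
    """Parse pasted key contents and check the required fields are present."""
    info = json.loads(raw)
    required = {"type", "project_id", "private_key", "client_email", "token_uri"}
    missing = required - info.keys()
    if missing:
        raise ValueError(f"key is missing fields: {sorted(missing)}")
    if info["type"] != "service_account":
        raise ValueError("expected a service_account key")
    return info

info = parse_key(SAMPLE_KEY)
print(info["project_id"])  # the project ID Plotly Studio extracts automatically
```

If `parse_key` raises, the pasted contents are truncated or not a service account key, which is worth catching before the connection step.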
Tip: contact us if you need help troubleshooting these steps.
LLM prompts for connecting to Google BigQuery
Plotly Studio uses an AI agent to generate and execute the data connection code for you. The prompts below are ready to copy and paste directly into Plotly Studio's data connection chat. Use them to establish a connection, query your data, or do both in one shot. The global context rules are worth saving to your Plotly Studio global context to keep BigQuery connections consistent across projects.
Connection prompt
Connect to Google BigQuery using a service account JSON key. Use the google-cloud-bigquery
Python library and the google-auth library (its google.oauth2 module) for authentication.
Parse the provided JSON
credentials using service_account.Credentials.from_service_account_info() and initialize a
bigquery.Client with those credentials and the project ID extracted from the key. List all
available datasets and tables in the project upon connection.
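The code the agent generates from this prompt might look roughly like the sketch below. It assumes the google-cloud-bigquery and google-auth packages are installed; `connect` and `list_datasets_and_tables` are illustrative names, not library functions.

```python
def connect(key_info: dict):
    """Build an authenticated BigQuery client from a parsed JSON key dict."""
    # Imports deferred so this sketch can load even where the packages
    # are not installed.
    from google.cloud import bigquery
    from google.oauth2 import service_account

    credentials = service_account.Credentials.from_service_account_info(key_info)
    # Take the project ID from the credentials object rather than hardcoding it.
    return bigquery.Client(credentials=credentials, project=credentials.project_id)

def list_datasets_and_tables(client) -> dict:
    """Map each dataset ID in the project to the table IDs it contains."""
    return {
        ds.dataset_id: [t.table_id for t in client.list_tables(ds.dataset_id)]
        for ds in client.list_datasets()
    }
```

The dataset/table listing is what lets Plotly Studio show you what is queryable immediately after the connection succeeds.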
Query prompt
Using the connected BigQuery client, retrieve all rows from the table [YOUR_DATASET].
[YOUR_TABLE] in project [YOUR_PROJECT_ID]. Return the result as a pandas DataFrame. Use
BigQuery's standard SQL dialect. Limit results to [N] rows if the table is large.
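In code, that prompt reduces to building one standard-SQL statement and materializing the result. This is a sketch under the same assumptions as above (google-cloud-bigquery installed, an authenticated client in hand); `build_query` and `fetch_table` are hypothetical helper names.

```python
def build_query(project: str, dataset: str, table: str, limit: int = 10_000) -> str:
    """Standard-SQL SELECT with backtick-quoted identifiers and a row cap."""
    return f"SELECT * FROM `{project}.{dataset}.{table}` LIMIT {int(limit)}"

def fetch_table(client, project: str, dataset: str, table: str, limit: int = 10_000):
    """Run the query on an existing bigquery.Client and return a pandas DataFrame."""
    return client.query(build_query(project, dataset, table, limit)).to_dataframe()
```

Keeping the query-building step separate from execution makes the generated SQL easy to inspect before it runs against a billed project.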
Example one-shot prompt
Connect to Google BigQuery using a service account JSON key. Use google-cloud-bigquery and
the google-auth library (google.oauth2 module) for authentication. Load credentials using
service_account.Credentials.from_service_account_info() with the following key contents:
[PASTE FULL JSON KEY CONTENTS HERE]
Once connected, retrieve all rows from the table [YOUR_DATASET_NAME].[YOUR_TABLE_NAME] in
project [YOUR_PROJECT_ID] using standard SQL. Return the data as a pandas DataFrame and
display a preview.
Global context rules
- Always use google-cloud-bigquery as the primary client library.
- Always authenticate via service_account.Credentials.from_service_account_info() using a
parsed JSON dict — never use file path references.
- Always extract project_id directly from the credentials object (credentials.project_id)
rather than hardcoding it.
- Use standard SQL syntax only. Do not use BigQuery legacy SQL.
- Always return query results as a pandas DataFrame using .to_dataframe().
- If the table schema is unknown, retrieve it first before constructing the query to ensure
column names and types are accurate.
- Do not expose or log raw credential values in any output or error messages.
- If querying large tables, default to adding a LIMIT 10000 clause unless the user
explicitly requests a full export.
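Two of the rules above (fetch the schema first; never expose raw credentials) can be sketched as small helpers. These are illustrative names and a minimal sketch, not part of the BigQuery client library; `get_schema` assumes an authenticated client whose default project contains the table.

```python
def get_schema(client, dataset: str, table: str) -> dict:
    """Fetch column names and BigQuery types before constructing a query."""
    tbl = client.get_table(f"{dataset}.{table}")
    return {field.name: field.field_type for field in tbl.schema}

def redact(key_info: dict) -> dict:
    """Return a copy of the key dict that is safe to log: secrets are masked."""
    secret_fields = {"private_key", "private_key_id"}
    return {k: ("<redacted>" if k in secret_fields else v)
            for k, v in key_info.items()}
```

Passing `redact(key_info)` instead of the raw dict to any logging or error path keeps the private key out of output that might be shared when troubleshooting.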
Troubleshooting and tips
- Key security: Treat this JSON file like a password. Store it securely and delete it from your local machine after connecting. GCP only generates this key once, so if you lose it, you'll need to create a new one. Work with your organization admin to understand the most secure authentication approach for your team.
- Alternative auth: Not every organization supports service account JSON key authentication. Some enterprises require OAuth 2.0, workload identity federation, or other flows. If the JSON key approach is blocked, work with your GCP admin to determine the approved connection method before proceeding.
- Permissions: The service account needs, at minimum, the BigQuery Data Viewer and BigQuery Job User roles to run queries. The BigQuery Admin role works for setup and testing but is overly broad for ongoing use.
- Schema retrieval: Plotly Studio will automatically fetch the table schema before executing your query. This is expected behavior and ensures the generated SQL uses the correct column names and types.
- Multiple tables: You can enrich your dataset at query time by asking Plotly Studio to join across tables within the same project. Just describe the join logic in plain language in your prompt.
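For reference, a two-table join described in plain language typically compiles down to a single standard-SQL statement like the one this hypothetical helper builds (the function name, and the `dataset.table` / join-key arguments, are placeholders for your own names):

```python
def build_join_query(project: str, left: str, right: str,
                     key: str, limit: int = 10_000) -> str:
    """Standard-SQL join of two tables in the same project.

    left/right are 'dataset.table' strings; key is the shared join column.
    USING merges the join column so it appears once in the result.
    """
    return (
        f"SELECT * FROM `{project}.{left}` "
        f"JOIN `{project}.{right}` USING ({key}) "
        f"LIMIT {int(limit)}"
    )
```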
- After connecting: Once your data is loaded, use Explore Mode for automatic chart suggestions, App Prototyping to generate a six-chart dashboard in one shot, or the Spec Builder to create individual components step by step.
Connect to Google BigQuery in minutes with Plotly Studio
Download today for free and get started with Plotly Studio.
