Create a BigQuery Connection

White Label Data allows you to use Google BigQuery as a data warehouse and to map query results to native White Label Data visualizations. Before you can create a BigQuery connection in White Label Data, you must first create a service account in Google Cloud using the instructions here. You must also grant the service account the IAM roles required to perform BigQuery operations. At a minimum, the service account should have the BigQuery Job User and BigQuery Data Viewer roles.

Once the service account is created in Google Cloud, download its key file in JSON format. This saves a file to your local computer; you will need to know its location when configuring White Label Data. Hashpath may also ask for this file in order to configure your hosted app instance to authenticate to BigQuery.
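As a quick sanity check before wiring the file into White Label Data, you can confirm that the download is a valid service account key. The Python sketch below is illustrative only (the helper name and file path are assumptions, not part of White Label Data) and checks the fields that every Google Cloud service account key file contains:

```python
import json

# Fields present in every Google Cloud service account key file.
REQUIRED_FIELDS = {"type", "project_id", "private_key", "client_email"}

def validate_key_file(path):
    """Parse a downloaded key file and raise ValueError if it looks wrong."""
    with open(path) as f:
        key = json.load(f)
    if key.get("type") != "service_account":
        raise ValueError("not a service account key file")
    missing = REQUIRED_FIELDS - key.keys()
    if missing:
        raise ValueError("key file is missing fields: %s" % sorted(missing))
    return key
```

For example, `validate_key_file("/path/to/bigquery-creds.json")` returns the parsed key if the file is usable.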

Example appconfig.json:

 "connections" : [
        {
            "type" : "bigquery",
            "service_account_file": "/opt/services/hashpathapp/hashpathcustomer/bq/bigquery-creds.json"
        }
    ]
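For illustration, the following minimal Python sketch (a hypothetical helper, not White Label Data's actual loader) shows how the bigquery entry can be located in the connections array of appconfig.json:

```python
import json

def find_bigquery_connection(config_path):
    """Return the first "bigquery" entry from appconfig.json, or None."""
    with open(config_path) as f:
        config = json.load(f)
    for conn in config.get("connections", []):
        if conn.get("type") == "bigquery":
            return conn
    return None
```

The returned entry's service_account_file value is the container-internal path where the key file must be available.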
| Option | Example | Description |
| --- | --- | --- |
| type | "bigquery" | The connection type. Use "bigquery" for a BigQuery connection. |
| service_account_file | "/opt/services/hashpathapp/hashpathcustomer/bq/bigquery-creds.json" | The path to the Google Cloud service account key file created in the Google Cloud IAM & Admin page. This file can be created using the following instructions: https://cloud.google.com/iam/docs/creating-managing-service-account-keys. |

Note: this is the file path inside the Docker container, not a path on the host server. In most cases it will have the prefix "/opt/services/hashpathapp/hashpathcustomer". For self-hosted configurations, the key file must be mounted into the Docker container via a volume. For customers whose White Label Data instance is hosted by Hashpath, this file path will be provided for you.

Adding the BigQuery key file to your local development environment

When you add a BigQuery connection, you also need to update your local development environment so that the BigQuery key file is available inside your instance. To do this, edit the docker-compose.yml file that you created when you set up that environment.

Edit the file and add a volume mapping to the volumes section of your instance:

wldinstance:
    image: hashpath/hashpathapp:dev
    container_name: wldinstance
    restart: always
    ports:
      - "2525:80"
    volumes:
      - <local_repository_path>:/opt/services/hashpathapp/hashpathcustomer/config
      - <local_repository_path>:/opt/services/hashpathapp/hashpathcustomer/templates/custom
      - /path/to/local/dir/with/keyfile:/opt/services/hashpathapp/hashpathcustomer/bq
    environment:
      - "HASHPATH_APPNAME=wldinstance"
      - "VIRTUAL_HOST=localhost"
      - "STATIC_URL=http://localhost:2526"
      - "AUTH0_DOMAIN=hashpathapp.auth0.com"
      - "AUTH0_KEY=<auth0_key>"
      - "AUTH0_SECRET=<auth0_secret"
      - "AUTH0_API_DOMAIN=<auth0_domain>"
      - "HASHPATH_DEBUG=True"
      - "HASHPATH_LOCAL=True"
      - "MAPBOX_TOKEN=pk.eyJ1IjoiaGFzaHBhdGgiLCJhIjoiY2p3Z3pxbHB0MDFrZDQzbnMwNmF1d3MybyJ9.0lLpesaHfYsX0-UivfyCzw"

In the example above, we added a new line under volumes that maps the local directory containing the downloaded JSON key file to a specific location inside the Docker container. That container-internal path is the one configured in the connection's service_account_file setting.

To make the changes take effect, recreate the container:

docker-compose up -d