
The BigQuery connector lets you read data from a BigQuery table and store it in an Algolia index.

Authentication

To authenticate the connector, you need a Google service account with the following permissions:

bigquery.datasets.get
bigquery.datasets.getIamPolicy
bigquery.jobs.create
bigquery.models.export
bigquery.models.getData
bigquery.models.getMetadata
bigquery.models.list
bigquery.routines.get
bigquery.routines.list
bigquery.tables.createSnapshot
bigquery.tables.export
bigquery.tables.get
bigquery.tables.getData
bigquery.tables.getIamPolicy
bigquery.tables.list
resourcemanager.projects.get
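
If you prefer granting a predefined role rather than individual permissions, roles/bigquery.dataViewer on the dataset should cover the dataset-level bigquery.* permissions above, while project-level permissions such as bigquery.jobs.create come from a role like roles/bigquery.jobUser granted through IAM. The following is a minimal sketch using BigQuery's GRANT statement; the project, dataset, and service account names are placeholders:

-- Grant dataset-level read access to the connector's service account
GRANT `roles/bigquery.dataViewer`
ON SCHEMA `my-project.my_dataset`
TO "serviceAccount:algolia-connector@my-project.iam.gserviceaccount.com";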

If the BigQuery table is backed by Google Cloud Storage (GCS), the service account also requires the following permissions:

storage.folders.get
storage.objects.get
storage.objects.list

These permissions also speed up indexing by letting the connector read data from the table through the BigQuery Storage API.

Custom SQL statement

By default, the connector imports your selected table as-is. To import with a custom SQL statement instead, write your query and use %s as a placeholder for the table name:

SELECT * FROM %s WHERE Status="Available"

Each row must have a unique identifier that Algolia uses as the object ID.
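
For example, a statement that keeps a unique column alongside the attributes to index might look like this; the column names are placeholders for your own schema:

SELECT
  sku,      -- unique per row, so Algolia can use it as the object ID
  name,
  price,
  Status
FROM %s
WHERE Status = "Available"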

To stay within your BigQuery quota, don’t schedule tasks to run more than once per day.
