
Loading data from BigQuery into Redis

I'm trying to load data from BigQuery into Redis, and after going through the documentation for both over the last three days, I'm turning to SO because I found nothing concrete. What would be a good way to load the results of multiple queries, each of a few hundred records, from BigQuery into Redis? Are there any code samples floating around that at least show how to translate a result set into key-value pairs suited to being loaded into Redis? I'd like to use Python to implement this pipeline. Please help. Thanks!

Edit: We have a few datasets we'd like to push to Redis, so that when a user runs a query, it runs against Redis rather than BigQuery. I have a snippet that pulls the results of a query into a pandas DataFrame in Python. What I have been unable to find is code for loading that data into Redis, or even for translating it into key-value pairs that Redis can consume. That is the part I need help with.
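
To show what I mean, here is roughly the shape of translation I'm imagining, with made-up column names and redis-py's hash commands (I don't know if this is the idiomatic approach, which is exactly what I'm asking about):

    import pandas as pd
    import redis

    # Stand-in for the DataFrame my snippet pulls out of BigQuery.
    df = pd.DataFrame([
        {"user_id": "u1", "score": 42, "segment": "a"},
        {"user_id": "u2", "score": 17, "segment": "b"},
    ])

    r = redis.Redis(host="localhost", port=6379, decode_responses=True)

    # One Redis hash per row, keyed by whichever column uniquely identifies it.
    with r.pipeline() as pipe:
        for row in df.to_dict(orient="records"):
            pipe.hset(f"user:{row['user_id']}",
                      mapping={k: str(v) for k, v in row.items()})
        pipe.execute()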

asked Sep 15 '25 by CodingInCircles


1 Answer

  1. Export the BigQuery table to a GCS bucket as newline-delimited JSON (see the first sketch after this list).
  2. Use a Google Cloud Function, triggered by GCS object changes, to read the JSON file whenever it is written.
  3. Attach a Serverless VPC connector to the Cloud Function so it can reach Redis with a Redis client. It can then write the contents of the file to Redis (second sketch below).
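
For step 1, the export can be kicked off from Python with the BigQuery client library. A minimal sketch, with the project, dataset, table, and bucket names as placeholders:

    from google.cloud import bigquery

    client = bigquery.Client()

    # Placeholder identifiers -- substitute your own.
    table_ref = "my-project.my_dataset.my_table"
    destination_uri = "gs://my-bucket/exports/my_table-*.json"

    job_config = bigquery.ExtractJobConfig(
        destination_format=bigquery.DestinationFormat.NEWLINE_DELIMITED_JSON
    )

    # Export job that writes the table to GCS as newline-delimited JSON.
    extract_job = client.extract_table(
        table_ref, destination_uri, job_config=job_config
    )
    extract_job.result()  # block until the export completes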

I ended up putting the Redis key to use directly in the BigQuery row, but that's a detail you can figure out yourself. I wrote a full blog post about it here: https://www.futurice.com/blog/bigquery-to-memorystore
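
For steps 2 and 3, a rough sketch of the Cloud Function, assuming each exported row carries its Redis key in a key field and its payload in a value field, and that the Redis host is passed in via an environment variable (the real field layout is in the blog post):

    import json
    import os

    import redis
    from google.cloud import storage

    # The Serverless VPC connector is what makes this private Redis
    # address reachable from the function.
    r = redis.Redis(host=os.environ["REDIS_HOST"], port=6379)
    gcs = storage.Client()

    def load_to_redis(event, context):
        """Background Cloud Function triggered by a GCS object change."""
        blob = gcs.bucket(event["bucket"]).blob(event["name"])
        with r.pipeline() as pipe:
            for line in blob.download_as_bytes().splitlines():
                row = json.loads(line)
                # Assumed layout: each BigQuery row carries its own Redis key.
                pipe.set(row["key"], json.dumps(row["value"]))
            pipe.execute()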

answered Sep 18 '25 by Tom Larkworthy