Basic Google BigQuery operations with a Salesforce sync demo in Mule 4

GitHub repository with the Mule project can be found at the end of the post.


When we think about data storage, the first thing that comes to mind is a traditional relational database, such as MySQL, SQL Server, Postgres, or Vertica. But I have noticed that not many people have interacted with the service Google provides for the same purpose: Google BigQuery. Maybe that is because of the pricing, but in the end, many companies are moving to cloud services, and BigQuery seems to be a great fit for them.

In this post, I would like to demonstrate, in a few steps, how to build a sync job that describes a Salesforce instance and uses a few of its objects to create a full schema of those objects (tables) in a Google BigQuery dataset. With the schema created, we can push data from Salesforce into BigQuery and see it in our Google Cloud Console project.


To connect to Salesforce and Google BigQuery, we need a few prerequisites.


  • If you don’t have a Salesforce instance, you can create a developer one here.

  • From the Salesforce side, you will need a username, password, and security token (you can follow this process to get it).

  • A developer instance contains a few records; if you want more data, you can load additional records before running the process, which will give the sync more information to move over.

GCP (Google Cloud Platform)

  • You can sign up here for free. Google gives you $300 for 90 days to test the product (similar to Azure). If you already have a Google account, you can use it for this.

Creating a new project in GCP and setting up our service account key

Once you sign up for your account on GCP, you should be able to click on the New Project option and enter a project name; in this example, I chose mulesoft.
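If you prefer the command line, the same step can be done with the gcloud CLI. This is just a sketch: project IDs must be globally unique, so the ID mulesoft-demo-123 below is a hypothetical placeholder you would replace with your own.

```shell
# Create a new GCP project (the ID must be globally unique;
# "mulesoft-demo-123" is a placeholder).
gcloud projects create mulesoft-demo-123 --name="mulesoft"

# Point subsequent gcloud commands at the new project.
gcloud config set project mulesoft-demo-123
```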

Once the project is created, open the menu on the left and select the IAM & Admin > Service Accounts option.

Now, we should be able to create our service account.

“A service account is a special type of Google account intended to represent a non-human user that needs to authenticate and be authorized to access data in Google APIs. Typically, service accounts are used in scenarios such as Running workloads on virtual machines.”

At the top of the page, you should see the option to create it. Then, you just need to specify a Name and click Create and Continue.

The next step is to set the permissions; for this, select the BigQuery Admin role from the role dropdown.
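For reference, the same service account and role assignment can also be done with the gcloud CLI. Again a sketch: the project ID mulesoft-demo-123 and the account name bq-sync are hypothetical placeholders.

```shell
# Create the service account ("bq-sync" is a placeholder name).
gcloud iam service-accounts create bq-sync \
    --display-name="BigQuery sync service account"

# Grant it the BigQuery Admin role on the project.
gcloud projects add-iam-policy-binding mulesoft-demo-123 \
    --member="serviceAccount:bq-sync@mulesoft-demo-123.iam.gserviceaccount.com" \
    --role="roles/bigquery.admin"
```

BigQuery Admin is broad; for a production setup you may want to scope this down to roles such as BigQuery Data Editor plus BigQuery Job User.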

Once created, we should be able to select the Manage Keys option from the three-dot menu on the right.
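Equivalently, a JSON key for the service account can be generated from the command line (a sketch using the same placeholder names as above):

```shell
# Download a JSON key file for the service account.
# Keep this file safe: it grants BigQuery Admin access to the project.
gcloud iam service-accounts keys create bq-sync-key.json \
    --iam-account="bq-sync@mulesoft-demo-123.iam.gserviceaccount.com"
```

The resulting bq-sync-key.json file is what the Mule project's BigQuery connection will use to authenticate.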