
Google BigQuery and HTTP Request integration

Save yourself the work of writing custom integrations for Google BigQuery and HTTP Request and use n8n instead. Build adaptable and scalable Data & Storage, Development, and Core Nodes workflows that work with your technology stack. All within a building experience you will love.

How to connect Google BigQuery and HTTP Request

  • Step 1: Create a new workflow
  • Step 2: Add and configure nodes
  • Step 3: Connect
  • Step 4: Customize and extend your integration
  • Step 5: Test and activate your workflow

Step 1: Create a new workflow and add the first step

In n8n, click the "Add workflow" button in the Workflows tab to create a new workflow. Then add the starting point: a trigger that determines when your workflow should run, such as an app event, a schedule, a webhook call, another workflow, an AI chat, or a manual trigger. In some cases, the HTTP Request node itself can serve as your starting point.


Step 2: Add and configure Google BigQuery and HTTP Request nodes

You can find Google BigQuery and HTTP Request in the nodes panel. Drag them onto your workflow canvas, selecting their actions. Click each node, choose a credential, and authenticate to grant n8n access. Configure Google BigQuery and HTTP Request nodes one by one: input data on the left, parameters in the middle, and output data on the right.


Step 3: Connect Google BigQuery and HTTP Request

A connection establishes a link between Google BigQuery and HTTP Request (or vice versa) to route data through the workflow. Data flows from the output of one node to the input of another. You can have single or multiple connections for each node.
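To make the data flow concrete, here is a minimal sketch of the shape n8n uses on every connection: each node outputs an array of items, and each item wraps its payload in a `json` key. The latitude/longitude values below are illustrative, not real query results.

```javascript
// Two result rows leaving a Google BigQuery node, as the connected node
// would receive them: an array of items, each with a `json` payload.
const bigQueryOutput = [
  { json: { latitude: 50.11, longitude: -118.3 } },
  { json: { latitude: 50.2, longitude: -118.1 } },
];

// Most downstream nodes (HTTP Request, Set, ...) run once per item, so
// this output drives two executions of the next node in the chain.
console.log(bigQueryOutput.length); // 2
```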


Step 4: Customize and extend your Google BigQuery and HTTP Request integration

Use n8n's core nodes such as If, Split Out, Merge, and others to transform and manipulate data. Write custom JavaScript or Python in the Code node and run it as a step in your workflow. Connect Google BigQuery and HTTP Request with any of n8n’s 1000+ integrations, and incorporate advanced AI logic into your workflows.
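As a minimal sketch of what a Code node step can look like: the function below adds a derived field to every incoming item. The `price` and `quantity` field names are hypothetical examples, not fields from any node in this integration.

```javascript
// Sketch of JavaScript for an n8n Code node. Items arrive as an array of
// { json: {...} } objects and must be returned in the same shape.
function addTotal(items) {
  return items.map((item) => ({
    json: {
      ...item.json,
      total: item.json.price * item.json.quantity, // derived field
    },
  }));
}

// Inside a real Code node, the last line would be:
//   return addTotal($input.all());
const sample = [{ json: { price: 10, quantity: 3 } }];
console.log(addTotal(sample)[0].json.total); // 30
```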


Step 5: Test and activate your Google BigQuery and HTTP Request workflow

Save and run the workflow to check that everything works as expected. Depending on your configuration, data should flow from Google BigQuery to HTTP Request or vice versa. Debugging is straightforward: inspect past executions to isolate and fix any mistakes. Once everything is tested, save your workflow and activate it.


Send location updates of the ISS every minute to a table in Google BigQuery

This workflow allows you to send position updates of the ISS every minute to a table in Google BigQuery.

Cron node: The Cron node will trigger the workflow every minute.

HTTP Request node: This node will make a GET request to the API https://api.wheretheiss.at/v1/satellites/25544/positions to fetch the position of the ISS. This information gets passed on to the next node in the workflow.

Set node: We will use the Set node to ensure that only the data that we set in this node gets passed on to the next nodes in the workflow.

Google BigQuery: This node will send the data from the previous node to the position table in Google BigQuery. If you have created a table with a different name, use that table instead.
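The Set node's role in this workflow can be sketched as a function that keeps only the chosen fields from the wheretheiss.at response. The field names match the API's documented response; the values below are illustrative samples, not live data.

```javascript
// Keep only the fields we want to insert into the BigQuery `position`
// table, dropping everything else the API returns.
function keepPositionFields(position) {
  return {
    name: position.name,
    latitude: position.latitude,
    longitude: position.longitude,
    timestamp: position.timestamp,
  };
}

const apiResponse = {
  name: "iss",
  id: 25544,
  latitude: 50.11,
  longitude: -118.3,
  altitude: 420.5,   // dropped by the Set step
  velocity: 27560.9, // dropped by the Set step
  timestamp: 1698754800,
};
console.log(keepPositionFields(apiResponse));
```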


Popular Google BigQuery and HTTP Request workflows

Web3 Wallet Tracker: Sync Balances to GA4, BigQuery, and Discord Whale Alerts

This workflow bridges the gap between anonymous website traffic and on-chain wallet activity. It captures wallet connections via a webhook, enriches the data with real-time USD balances from the Zerion API, and syncs the results to Google Analytics 4, BigQuery, and Discord for immediate action. This directly helps Web3 marketing and growth teams identify high-value "whales" the moment they connect to your dApp, allowing for real-time monitoring and advanced attribution analysis.

How it works (video tutorial: https://youtu.be/2_wuTRzRpkg)

  • Webhook Trigger: receives the wallet address, GA Client ID, and Session ID from your website via GTM.
  • Zerion API Integration: queries the real-time USD balance and individual chain distributions for the connected wallet.
  • Whale Filtering (Switch): filters wallets against a USD threshold (e.g. >$50) to trigger high-priority alerts.
  • Dynamic Discord Alerts: sends a formatted message to Discord with the total balance rounded to two decimals and a dynamic breakdown of assets across all active chains (Base, Ethereum, etc.).
  • GA4 Push: sends wallet_usd_balance as a custom metric to GA4 via the Measurement Protocol to maintain session continuity.
  • BigQuery Archive: records the wallet address, hashed ID, and USD balance into a secure table for SQL joining with raw GA4 data.

Prerequisites

  • Zerion API key: required for fetching real-time balance and chain data.
  • Discord bot token: required to send automated whale alerts to your team server.
  • Google Cloud project: with BigQuery enabled and a JSON service account key for secure data insertion.
  • GA4 Measurement Protocol API secret: required to push custom metrics back into active GA4 sessions.

Qualify and email literary agents with GPT‑4.1, Gmail and Google Sheets

Inspiration & Notes

This workflow was born out of a very real problem. While writing a book, I found the process of discovering suitable literary agents and managing outreach to be manual and surprisingly difficult to scale. Researching agents, checking submission rules, personalizing emails, tracking submissions, and staying organized quickly became a full-time job on its own. So instead of doing it manually, I automated it. I built this entire workflow in 3 days, and the goal of publishing it is to show that you can do the same. With the right structure and intent, complex sales and marketing workflows don't have to take months to build.

Contact & Collaboration

If you have questions, business inquiries, or would like help setting up automation workflows, feel free to reach out: 📩 [email protected]. I genuinely enjoy designing workflows and automation systems, especially when they support meaningful projects, and I work primarily from interest and impact rather than purely financial motivation. I may take on a project free or paid: if the problem statement matters, the motivation follows, and if I can contribute significant value through system design, I'm more inclined to get involved. If you're building something thoughtful and need help automating it, I'm always happy to have a conversation. Enjoy~!

Overview

Automates the end-to-end literary agent outreach pipeline, from data ingestion and eligibility filtering to deep agent research, personalized email generation, submission tracking, and analytics.

Architecture

The system is modular and divided into four domains, each operating independently and passing structured data downstream:

  • Data Engineering
  • Marketing & Research
  • Sales (Outreach)
  • Data Analysis

Data Engineering

Purpose: ingest and normalize agent data from multiple sources into a single source of truth.

Inputs: Google BigQuery, Azure Blob Storage, AWS S3, Google Sheets, and (optionally) HTTP sources.

Key steps:

  • Scheduled ingestion trigger
  • Merge and normalize heterogeneous data formats (CSV, tables)
  • Deduplication and validation
  • AI-assisted enrichment for missing metadata
  • Append-only writes to a central Google Sheet

Output: clean, normalized agent records ready for eligibility evaluation.

Marketing & Research

Purpose: decide who to contact and how to personalize outreach.

Eligibility evaluation: an AI agent evaluates each record against strict rules: email submissions enabled; not QueryTracker-only or QueryManager-only; genre fit (e.g. Memoir, Spiritual, Self-help, Psychology, Relationships, Family). It outputs send_email (boolean) and reason (an auditable explanation).

Deep research, for eligible agents only: public research from agency sites, interviews, Manuscript Wish List, and LinkedIn (if public). It extracts professional background, editorial interests, genres represented, notable clients/books (if publicly listed), public statements, and source-backed personalization angles. Strict rule: all claims must be explicitly cited; no inference or hallucination is allowed.

Sales (Outreach)

Purpose: execute personalized email outreach and maintain clean submission tracking.

Steps:

  • AI generates agent-specific email copy
  • Copy is normalized for tone and clarity
  • Email is sent (e.g. via Gmail)
  • Submission metadata is logged: submission completed, submission timestamp, channel used

Result: consistent, traceable outreach with CRM-style hygiene.

Data Analysis

Purpose: measure pipeline health and outreach effectiveness.

Features:

  • Append-only decision and submission logs
  • QuickChart visualizations for fast validation (e.g. TRUE vs FALSE completion rates)
  • Optional integration with Power BI and Google Analytics 4

Supports completion rate analysis, funnel tracking, source/platform performance, and decision auditing.

Design principles:

  • Separation of concerns (ingestion ≠ decision ≠ outreach ≠ analytics)
  • AI with hard guardrails (strict schemas, source-only facts)
  • Append-only logging (analytics-safe, debuggable)
  • Modular and extensible (plug-and-play data sources)
  • Human-readable and machine-usable outputs

Constraints & notes: only public, professional information is used; no private or speculative data; HTTP scraping is avoided unless necessary; Power BI Embedded is not required; the workflow was designed and implemented end-to-end in ~3 days.

Use cases: marketing (audience discovery, agent segmentation, personalization at scale, campaign readiness, funnel automation) and sales (lead qualification, deduplication, outreach execution, status tracking, pipeline hygiene).

Tech stack: automation: n8n; AI: OpenAI (GPT); scripting: JavaScript; data stores: Google Sheets; email: Gmail; visualization: QuickChart; BI (optional): Power BI, Google Analytics 4; cloud sources: AWS S3, Azure Blob, BigQuery.

Status: production-ready, modular, and designed for extension into other sales or marketing domains beyond literary outreach.

Sync Multi-Bank Balance Data to BigQuery using Plaid

This workflow automatically fetches balances from multiple financial institutions (RBC, Amex, Wise, PayPal) using Plaid, maps them to QuickBooks account names, and loads structured records into Google BigQuery for analytics.

Who's it for: finance teams, accountants, and data engineers managing consolidated bank reporting in Google BigQuery.

How it works:

  • The Schedule Trigger runs weekly.
  • Four Plaid API calls fetch balances from RBC, Amex, Wise, and PayPal.
  • Each response is split out into individual accounts, which are mapped to QuickBooks names.
  • All accounts are merged into one dataset.
  • The workflow structures the account data, generates UUIDs, and formats SQL inserts.
  • The BigQuery node uploads the finalized records.

How to set up: add Plaid and Google BigQuery credentials, replace client IDs and secrets with variables, test each connection, and schedule the trigger for your reporting cadence.

Sync QuickBooks Chart of Accounts to Google BigQuery

Keep a historical, structured copy of your QuickBooks Chart of Accounts in BigQuery. This n8n workflow runs weekly, syncing new or updated accounts for better reporting and long-term tracking.

Who is this for?

  • Data analysts & BI developers: build a robust financial model and analyze changes over time.
  • Financial analysts & accountants: track structural changes in your Chart of Accounts historically.
  • Business owners: maintain a permanent archive of your financial structure for future reference.

What the workflow does:

  • Extract: every Monday, fetch accounts created or updated in the past 7 days from QuickBooks.
  • Transform: clean the API response, manage currencies, create stable IDs, and format the data.
  • Format: convert cleaned data into an SQL insert-ready structure.
  • Load: insert or update account records in BigQuery.

Setup steps:

  • Prepare BigQuery: create a table (e.g. quickbooks.accounts) with columns matching the final SQL insert step.
  • Add credentials: connect QuickBooks Online and BigQuery credentials in n8n.
  • Configure the HTTP node: open "1. Get Updated Accounts from QuickBooks" and replace {COMPANY_ID} with your real Company ID (press Ctrl + Alt + ? in QuickBooks to find it).
  • Configure the BigQuery node: open "4. Load Accounts to BigQuery", select the correct project, and make sure your dataset and table name are correctly referenced in the SQL.
  • Activate: save and activate the workflow. It will now run every week.

Requirements: a QuickBooks Online account, your QuickBooks Company ID, and a Google Cloud project with BigQuery and a matching table.

Customization options:

  • Change sync frequency: adjust the schedule node to run daily, weekly, etc.
  • Initial backfill: temporarily change the API query to select * from Account for a full pull.
  • Add fields: modify "2. Structure Account Data" to include or transform fields as needed.
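As a sketch of the "prepare BigQuery" step, the DDL below creates a table of the kind the description asks for. The column names and types are illustrative guesses based on the described transform (stable IDs, currency handling); match them to your own final SQL insert step.

```javascript
// One-time DDL you might run in the BigQuery console before activating
// the workflow. Table and column names here are hypothetical.
const ddl = `
CREATE TABLE IF NOT EXISTS \`quickbooks.accounts\` (
  id           STRING,
  name         STRING,
  account_type STRING,
  currency     STRING,
  balance      NUMERIC,
  updated_at   TIMESTAMP
)`;
console.log(ddl.trim());
```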


Build your own Google BigQuery and HTTP Request integration

Create custom Google BigQuery and HTTP Request workflows by choosing triggers and actions. Nodes come with global operations and settings, as well as app-specific parameters that can be configured. You can also use the HTTP Request node to query data from any app or service with a REST API.

Google BigQuery supported actions

Execute Query
Execute a SQL query
Insert
Insert rows in a table
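The Execute Query action takes a raw SQL string. As a sketch, a query like the one below could summarize the ISS positions table built by the example workflow above; the project and dataset names are placeholders, not values from the template.

```javascript
// SQL you might paste into the Google BigQuery node's Execute Query
// action. `my-project.my_dataset.position` is a hypothetical table path.
const query = `
SELECT
  COUNT(*)       AS readings,
  MIN(timestamp) AS first_seen,
  MAX(timestamp) AS last_seen
FROM \`my-project.my_dataset.position\``;
console.log(query.trim());
```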
Use case

Save engineering resources

Reduce time spent on customer integrations, build POCs faster, and keep customer-specific functionality separate from your product, all without having to code.


FAQs

  • Can Google BigQuery connect with HTTP Request?

  • Can I use Google BigQuery’s API with n8n?

  • Can I use HTTP Request’s API with n8n?

  • Is n8n secure for integrating Google BigQuery and HTTP Request?

  • How do I get started with Google BigQuery and HTTP Request integration in n8n?

Need help setting up your Google BigQuery and HTTP Request integration?

Discover our community's latest recommendations and join the discussions about Google BigQuery and HTTP Request integration.

Looking to integrate Google BigQuery and HTTP Request in your company?

Over 3000 companies switch to n8n every single week

Why use n8n to integrate Google BigQuery with HTTP Request

Build complex workflows, really fast


Handle branching, merging and iteration easily.
Pause your workflow to wait for external events.

Code when you need it, UI when you don't

Simple debugging

Your data is displayed alongside your settings, making edge cases easy to track down.

Use templates to get started fast

Use 1000+ workflow templates available from our core team and our community.

Reuse your work

Copy and paste, easily import and export workflows.

Implement complex processes faster with n8n
