
HTTP Request and NocoDB integration

Save yourself the work of writing custom integrations for HTTP Request and NocoDB and use n8n instead. Build adaptable and scalable Development, Core Nodes, and Data & Storage workflows that work with your technology stack. All within a building experience you will love.

How to connect HTTP Request and NocoDB

  • Step 1: Create a new workflow
  • Step 2: Add and configure nodes
  • Step 3: Connect
  • Step 4: Customize and extend your integration
  • Step 5: Test and activate your workflow

Step 1: Create a new workflow and add the first step

In n8n, click the "Add workflow" button in the Workflows tab to create a new workflow. Add the starting point – a trigger that determines when your workflow should run: an app event, a schedule, a webhook call, another workflow, an AI chat, or a manual trigger. Sometimes, the HTTP Request node might already serve as your starting point.


Step 2: Add and configure HTTP Request and NocoDB nodes

You can find HTTP Request and NocoDB in the nodes panel. Drag them onto your workflow canvas and select their actions. Click each node, choose a credential, and authenticate to grant n8n access. Configure the HTTP Request and NocoDB nodes one by one: input data on the left, parameters in the middle, and output data on the right.


Step 3: Connect HTTP Request and NocoDB

A connection establishes a link between HTTP Request and NocoDB (or vice versa) to route data through the workflow. Data flows from the output of one node to the input of another. You can have single or multiple connections for each node.


Step 4: Customize and extend your HTTP Request and NocoDB integration

Use n8n's core nodes such as If, Split Out, Merge, and others to transform and manipulate data. Write custom JavaScript or Python in the Code node and run it as a step in your workflow. Connect HTTP Request and NocoDB with any of n8n’s 1000+ integrations, and incorporate advanced AI logic into your workflows.
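As an illustration of the Code-node step, here is a minimal sketch of reshaping items before writing rows to NocoDB. The `title` and `created_at` field names are assumptions for the example, not fields this integration requires:

```javascript
// Sketch of an n8n Code-node transformation: items arrive as an array of
// { json: {...} } objects; we add a URL-safe slug and a normalized ISO
// timestamp to each item before a downstream NocoDB "create row" step.
function transform(items) {
  return items.map(({ json }) => ({
    json: {
      ...json,
      // derive a slug from the (assumed) title field
      slug: String(json.title ?? '')
        .toLowerCase()
        .replace(/[^a-z0-9]+/g, '-')
        .replace(/(^-|-$)/g, ''),
      // normalize the (assumed) created_at field to ISO 8601
      created_at: new Date(json.created_at ?? Date.now()).toISOString(),
    },
  }));
}

// Inside an actual Code node you would end with:
//   return transform($input.all());
```

The same pattern applies in Python Code nodes; only the item-access syntax differs.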


Step 5: Test and activate your HTTP Request and NocoDB workflow

Save and run the workflow to see if everything works as expected. Based on your configuration, data should flow from HTTP Request to NocoDB or vice versa. Debugging is straightforward: check past executions to isolate and fix any mistakes. Once you've tested everything, save your workflow and activate it.


Scrape and summarize posts of a news site without RSS feed using AI and save them to a NocoDB

The news site of Colt, a telecom company, does not offer an RSS feed, so web scraping is used to extract and process the news.

The goal is to get only the newest posts, a summary of each post, and their respective (technical) keywords.

Note that the news site offers links to each news post, but not the individual posts themselves. We first collect the links and dates of each post, then extract the newest ones.

The result is sent to a SQL database, in this case a NocoDB database.

This process runs each week through a cron job.

Requirements:
  • A basic understanding of CSS selectors and how to find them in a browser (usually: right click → Inspect)
  • An OpenAI (ChatGPT) API account – a regular ChatGPT account is not sufficient
  • A NocoDB database – though you may choose any type of output target

Assumptions:
  • CSS selectors work on the news site
  • The post date has its own CSS selector – meaning the date is not part of the news content

Warnings:
  • Not every site likes to be scraped, especially not at high frequency
  • Every website is structured differently, so the workflow may need several adaptations
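Under the assumptions above, the "keep only the newest posts" step can be sketched in a Code node as follows. The `link` and `date` field names are illustrative, not the template's actual ones, and the 7-day window matches the weekly cron:

```javascript
// Sketch: given scraped { link, date } pairs, drop posts older than the
// last weekly run and return the remainder newest-first, so only fresh
// posts proceed to summarization and keyword extraction.
function newestPosts(posts, now = new Date(), maxAgeDays = 7) {
  const cutoff = now.getTime() - maxAgeDays * 24 * 60 * 60 * 1000;
  return posts
    .filter((p) => new Date(p.date).getTime() >= cutoff)
    .sort((a, b) => new Date(b.date) - new Date(a.date));
}
```

In practice the date strings must first be parsed from whatever format the site uses, which is exactly why the workflow assumes the date has its own CSS selector.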


Popular HTTP Request and NocoDB workflows

Aggregate News Articles from NewsAPI, Mediastack & CurrentsAPI into Database

This workflow pulls news articles from NewsAPI, Mediastack, and CurrentsAPI on a scheduled basis. Each provider's results are normalized into a consistent schema, then written into your database (NocoDB by default). Use case: automated aggregation of categorized news for content pipelines, research agents, or editorial queues.

What you must update before running:
  • API keys – replace all placeholder keys: update API_KEY in the URL of both newsapi.org calls (Top Headlines and categories), "ACCESS_KEY" in the Mediastack JSON, and the "API_KEY" parameter in the CurrentsAPI call.
  • Database connection – the workflow uses NocoDB to store results. Update the NocoDB API token credential to your own and ensure your table includes the fields used in the create operations (source_category, title, summary, author, sources, content, images, publisher_date, etc.). If you prefer Google Sheets, Airtable, or another database, replace each NocoDB node with your equivalent "create row" operation; the Set nodes already provide all the normalized fields you need.
  • Scheduling – all schedulers are disabled by default. Enable the NewsAPI Top Headlines, NewsAPI Categories, Mediastack, and CurrentsAPI triggers so the workflow runs automatically. You may change the run times, but all four must be scheduled for the workflow to function as designed.

What you can configure:
  • Categories – defined in the newsapi.org categories and mediastack categories nodes. Edit these arrays to pull only the categories you care about or to match your API plan limits.
  • Article limits – adjust article_limit in the newsapi.org categories, mediastack categories, and currentsapi config nodes.
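The normalization step described above can be sketched as one mapping function. The output keys mirror a subset of the NocoDB fields listed above; the per-provider input keys (`publishedAt`, `published_at`, `published`) follow each API's typical response shape but should be checked against your plan's actual payloads:

```javascript
// Sketch: reduce three provider-specific article shapes to one schema
// before a single NocoDB "create row" step.
function normalize(provider, article) {
  switch (provider) {
    case 'newsapi': // NewsAPI uses camelCase publishedAt
      return { title: article.title, summary: article.description,
               author: article.author, publisher_date: article.publishedAt };
    case 'mediastack': // Mediastack uses snake_case published_at
      return { title: article.title, summary: article.description,
               author: article.author, publisher_date: article.published_at };
    case 'currentsapi': // CurrentsAPI uses a plain published field
      return { title: article.title, summary: article.description,
               author: article.author, publisher_date: article.published };
    default:
      throw new Error(`unknown provider: ${provider}`);
  }
}
```

In the template this role is played by the Set nodes; a Code node like this is an equivalent single-place alternative.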

Analyze Facebook Ads & Send Insights to Google Sheets with Gemini AI

Stop manually digging through Meta Ads data and spending hours trying to connect the dots. This workflow turns n8n into an AI-powered media buyer that automatically analyzes your ad performance, categorizes your creatives, and delivers insights directly into a Google Sheet.

➡️ Watch the full 4-part setup and tutorial on YouTube: https://youtu.be/hxQshcD3e1Y

About this 4-part automation series
As a media buyer, I built this system to automate the heavy lifting of analyzing ad data and brainstorming new creative ideas. This template is the first foundational part of that larger system.
  • Part 1 (this template): Pulling ad data & getting quick insights – automatically pulls data into a Google Sheet and uses an LLM to categorize ad performance.
  • Part 2: Finding the source files for the best ads – fetches the image or video files for top-performing ads.
  • Part 3: Using AI to understand why an ad works – sends your best ads to Google Gemini for structured notes on hooks, transcripts, and visuals.
  • Part 4: Getting the AI to suggest new creative ideas – uses all the insights to generate fresh ad concepts, scripts, and creative briefs.

What this template (Part 1) does
  • Secure token management – automatically retrieves and refreshes your Facebook long-term access token.
  • Fetch ad data – pulls the last 28 days of ad-level performance data from your Facebook Ads account.
  • Process & clean – parses raw data, standardizes key e-commerce metrics (like ROAS), and filters for sales-focused campaigns.
  • Benchmark calculation – aggregates all data to create an overall performance benchmark (e.g., average cost per purchase).
  • AI analysis – a "Senior Media Buyer" AI persona evaluates each ad against the benchmark and categorizes it as "HELL YES," "YES," or "MAYBE," with justifications.
  • Output to Google Sheets – updates your Google Sheet with both raw performance data and AI-generated insights.

Who is it for?
  • E-commerce store owners
  • Digital marketing agencies
  • Facebook Ads media buyers

How to set it up
  • Credentials: connect your Google Gemini and Google Sheets accounts in the respective nodes. The template uses NocoDB for token management; configure the "Getting Long-Term Token" and "Updating Token" nodes, or replace them with your preferred credential storage method.
  • Update your IDs: in the "Getting Data For the Past 28 Days…" HTTP Request node, replace act_XXXXXX in the URL with your Facebook Ad Account ID. In both Google Sheets nodes ("Sending Raw Data…" and "Updating Ad Insights…"), update the Document ID with your target Google Sheet's ID.
  • Run the workflow: click "Test workflow" to run your first AI-powered analysis!

Tools used: n8n, Facebook for Developers, Google AI Studio (Gemini), NocoDB (or any credential database of your choice)

Enrich LinkedIn Profiles in NocoDB CRM with Apify Scraper

Introduction
Manual LinkedIn data collection is time-consuming, error-prone, and results in inconsistent data quality across CRM/database records. This workflow is great for organizations that struggle with:
  • Incomplete contact records with only LinkedIn URLs but missing profile details
  • Hours spent manually copying LinkedIn information into databases
  • Inconsistent data formats caused by copy-pasting from LinkedIn (emojis, styled text, special characters)
  • Outdated profile information that doesn't reflect current roles/companies
  • No systematic way to enrich contacts at scale

Primary users
  • Sales & marketing teams
  • Event organizers & conference managers (for event materials)
  • Recruitment & HR professionals
  • CRM administrators

Specific problems addressed
  • Data completeness: automatically fills missing profile fields (headline, bio, skills, experience)
  • Data quality: sanitizes problematic characters that break databases/exports
  • Time efficiency: reduces hours of manual data entry to automated monthly updates
  • Error handling: gracefully manages invalid/deleted LinkedIn profiles
  • Scalability: processes multiple profiles in batch without manual intervention
  • Standardization: ensures a consistent data format across all records

Cost
Each URL scraped by Apify costs $0.01 to get all the data above. Apify charges per scrape, regardless of how much data or how many fields you extract/use.

Setup instructions

Prerequisites
  • n8n instance: access to a running n8n instance (self-hosted or cloud)
  • NocoDB account: a database with a table containing LinkedIn URLs
  • Apify account: a free or paid account for LinkedIn scraping

Required fields in the NocoDB table
Input: a single LinkedIn URL (field name: LinkedIn).
Output: first/last/full name, e-mail, bio, headline, profile picture URL, current role, country, skills, current employer, employer URL, experiences (all previous jobs), personal website, and publications (articles).
NocoDB field names: linkedin_full_name, linkedin_first_name, linkedin_headline, linkedin_email, linkedin_bio, linkedin_profile_pic, linkedin_current_role, linkedin_current_company, linkedin_country, linkedin_skills, linkedin_company_website, linkedin_experiences, linkedin_personal_website, linkedin_publications, linkedin_scrape_error_reason, linkedin_scrape_last_attempt, linkedin_scrape_status, linkedin_last_modified.
Technically you also need an Id field, but that is always there, so no need to add it :)

n8n setup
  • Import the workflow: copy the workflow JSON from the template; in n8n, click "Add workflow" → "Import from JSON", paste the workflow, and click "Import".
  • Configure the NocoDB connection: click any NocoDB node in the workflow, add new credentials → "NocoDB Token account", enter your NocoDB API token (found in NocoDB → User Settings → API Tokens), and update the projectId and table parameters in all NocoDB nodes.
  • Set up the Apify integration: create an Apify account at apify.com, generate an API token (Settings → Integrations → API), update the Apify token in the "Get Scraper Results" node, and configure HTTP Query Auth credentials with your token.
  • Map your database fields: review the "Transform & Sanitize Data" node, update the field mappings to match your NocoDB table structure, and ensure the fields listed above exist in your table (LinkedIn, linkedin_headline, linkedin_full_name, linkedin_bio, linkedin_scrape_status, linkedin_last_modified, etc.).
  • Configure the filter: in the "Get Guests with LinkedIn" node, adjust the filter to match your requirements. Default: (LinkedIn,isnot,null)~and(linkedin_headline,is,null)
  • Test the workflow: click "Execute Workflow" with the Manual Trigger, monitor the execution for any errors, and verify that the data is properly updated in NocoDB.
  • Activate the automated schedule: configure the Schedule Trigger node (default: monthly), toggle the workflow to "Active", and monitor executions in the n8n dashboard.

Customization options
  • Data source modifications – different database: replace the NocoDB nodes with Airtable, Google Sheets, or PostgreSQL. Multiple tables: add parallel branches to process different contact tables. Custom filters: modify the WHERE clause to target specific record subsets.
  • Enrichment fields – add fields: include additional LinkedIn data like education, certifications, or recommendations. Remove fields: simplify by dropping unnecessary fields (publications, skills). Custom transformations: add business logic for field calculations or formatting.
  • Scheduling options – frequency: change from monthly to daily, weekly, or hourly. Time-based: set specific times for different timezones. Event-triggered: replace with a webhook trigger for on-demand processing.
  • Error handling enhancement – notifications: add email/Slack nodes to alert on failures. Retry logic: implement wait-and-retry for temporary failures. Logging: add database logging for audit trails.
  • Data quality rules – validation: add If nodes to validate data before updates. Duplicate detection: check for existing records before creating new ones. Data standardization: add custom sanitization rules for industry-specific needs.
  • Integration extensions – CRM sync: add nodes to push data to Salesforce, HubSpot, or Pipedrive. AI enhancement: use OpenAI to summarize bios or extract key skills. Image processing: download and store profile pictures locally.
  • Performance optimization – batch size: adjust the number of profiles processed per run. Rate limiting: add delays between API calls to avoid limits. Parallel processing: split large datasets across multiple workflow executions.
  • Compliance additions – GDPR compliance: add consent checking before processing. Data retention: implement automatic cleanup of old records. Audit logging: track who accessed what data and when.

These customizations allow the workflow to adapt from simple contact enrichment to complex data-pipeline scenarios across various industries and use cases.
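The sanitization performed by the "Transform & Sanitize Data" node could be approximated like this. This is an illustrative sketch of the idea (strip emoji and styled symbols, collapse whitespace), not the template's exact rules:

```javascript
// Sketch: clean LinkedIn text before writing it to a database, removing
// characters that commonly break exports. Ranges cover emoji and
// miscellaneous symbols; variation selector U+FE0F is stripped too.
function sanitize(text) {
  return String(text ?? '')
    .replace(/[\u{1F000}-\u{1FFFF}\u{2600}-\u{27BF}\u{FE0F}]/gu, '') // emoji & symbols
    .replace(/\s+/g, ' ') // collapse runs of whitespace
    .trim();
}
```

Applied to a field like a headline or bio, this yields plain, consistent text across all records.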

Automated SEO Performance Collection from Google Search Console to NocoDB

Problem
Monitoring SEO performance from Google Search Console (GSC) manually is repetitive and prone to human error. For marketers or analysts managing multiple domains, checking reports manually and copying data into spreadsheets or databases is time-consuming. There is a strong need for an automated solution that collects, stores, and updates SEO metrics regularly for easier analysis and dashboarding.

Solution
This workflow automatically pulls performance metrics from Google Search Console – including queries, pages, CTR, impressions, positions, and devices – and stores them in a structured format inside a NocoDB table. It's ideal for SEO specialists, marketing teams, or data analysts who need to automate SEO reporting and centralize data for analytics or dashboards (like Superset or Metabase).

Setup instructions
  • Authorize your Google Search Console account: connect via OAuth2 (requires GSC API access).
  • Create a NocoDB table: define fields to match the GSC response: query (text), page (URL), device (text), clicks (number), impressions (number), ctr (percentage), position (number).
  • Add credentials in n8n: use credential nodes for both Google OAuth2 and the NocoDB API token.
  • Customize the schedule trigger: set the frequency (e.g., weekly) and adjust the domain/date range as needed.
  • Generalize domains: replace specific domains like martechmafia.net with your-domain.com before submission.

NocoDB table structure
The NocoDB table must match the fields coming from GSC's Search Analytics API. Here's a sample schema:

{
  "query": "string",
  "page": "string",
  "device": "string",
  "clicks": "number",
  "impressions": "number",
  "ctr": "number",
  "position": "number"
}
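As a sketch, mapping one row of the GSC Search Analytics response onto that schema looks like this. GSC returns dimension values in a `keys` array, in the same order the dimensions were requested, while the metrics sit at the top level of each row:

```javascript
// Dimensions in the order sent in the Search Analytics query request.
const dimensions = ['query', 'page', 'device'];

// Sketch: turn one GSC row into a flat object matching the NocoDB schema.
function toNocoRow(gscRow) {
  const dims = Object.fromEntries(dimensions.map((d, i) => [d, gscRow.keys[i]]));
  return {
    ...dims,
    clicks: gscRow.clicks,
    impressions: gscRow.impressions,
    ctr: gscRow.ctr,           // fraction (0–1) as returned by GSC
    position: gscRow.position, // average position
  };
}
```

Note that GSC reports `ctr` as a fraction, so store it as a number or convert to a percentage consistently with your NocoDB field definition.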

Analyze LinkedIn Content Performance with OpenAI, Bright Data and NocoDB

AI LinkedIn Content Assistant using Bright Data and NocoDB

Who's it for
This template is designed for creators, founders, and automation builders who publish regularly on LinkedIn and want to analyze their content performance using real data. It's especially useful for users who are already comfortable with n8n and want to build data-grounded AI assistants instead of relying on generic prompts or manual spreadsheets.

What this workflow does
This workflow builds an AI-powered LinkedIn content assistant backed by real engagement data. It automatically:
  • Scrapes LinkedIn posts and engagement metrics using Bright Data
  • Stores structured post data in NocoDB
  • Enables an AI chat interface in n8n to query and analyze your content
  • Returns insights based on historical performance (not hallucinated data)

You can ask questions like: "Which posts performed best last month?", "What content got the most engagement?", or "What should I post next?"

Requirements
  • Self-hosted or cloud n8n instance
  • Bright Data – LinkedIn scraping & data extraction
  • NocoDB – open-source, Airtable-style database
  • OpenAI API – for AI reasoning & insights

Setup
  • Import the workflow into your n8n instance
  • Open the Config node and fill in the required variables
  • Connect your credentials for Bright Data, NocoDB, and the OpenAI API
  • Activate the workflow and run the scraper once to populate data

How to customize the workflow
You can extend this template by:
  • Adding new metrics or post fields in NocoDB
  • Scheduling regular data refreshes
  • Changing the AI system prompt to match your content strategy
  • Connecting additional channels (email, Slack, dashboards)

This template is fully modular and designed to be adapted to your workflow.

Questions or need help? For setup help, customization, or advanced AI workflows, join my 🌟 FREE 🌟 community: Tech Builders Club. Happy building! 🚀 – Kornel Dubieniecki

Auto-Generate & Approve Social Media Posts from RSS Feeds with OpenAI & Telegram

Overview
This workflow automates the process of converting RSS feed articles into ready-to-publish social media posts using OpenAI, NocoDB, and Telegram. It's ideal for content teams, marketing managers, or news portals seeking to automate content curation while maintaining control through a human approval system.

Features
  • RSS feed monitoring – polls a specified RSS feed every 20 minutes and detects new articles automatically.
  • AI-powered content processing – summarizes the full article using OpenAI's Assistant API, creates an image prompt based on the article summary, and generates a platform-specific post for Facebook and LinkedIn using AI.
  • Image generation – leverages OpenAI's image model to generate a relevant image from the prompt, and retrieves and stores the featured image from the original article (via a custom Code node).
  • Post management with NocoDB – stores all content in NocoDB, including the article URL, AI-generated summary, image prompt, post content per platform, generated image URL, and post status (Pending, Approved, Declined).
  • Human approval via Telegram – sends a post preview to a Telegram group or channel with inline buttons: ✅ Approve, ❌ Decline. On approval, posts to Facebook, LinkedIn, and optionally Twitter; on rejection, marks the NocoDB record as "Declined".
  • Conditional Twitter (X) posting – asks the user whether to post with or without a link and posts accordingly based on the user's Telegram response.
  • No-code backend – NocoDB acts as a lightweight CMS to manage, edit, and review AI-generated content before publishing.

Setup instructions
  • Clone the workflow in your n8n instance.
  • Configure the following credentials under Credentials > New: OpenAI API key, Facebook Graph API, LinkedIn access token, Twitter (X) OAuth credentials, Telegram bot token and chat ID, NocoDB API token and base URL.
  • Set the RSS feed URL in the trigger node to your preferred news source.
  • Adjust the NocoDB API node with your table and field names (see below).
  • Deploy the workflow on an interval trigger (20-minute polling recommended).

NocoDB database structure
  • url – original article URL
  • summary – AI-generated summary
  • image_prompt – prompt used for generating the image
  • image_url – final image URL (from OpenAI)
  • post_content – formatted social media post
  • platform – social platform (Facebook, LinkedIn, etc.)
  • status – current status (Pending, Approved, Declined)
  • date_created – date the article was fetched

Requirements
  • An active n8n instance (cloud or self-hosted).
  • API credentials for OpenAI, a Telegram bot, the Facebook Graph API, a LinkedIn developer app, a Twitter/X developer app, and NocoDB (self-hosted or cloud).
  • A Telegram chat (group or user) where the bot is added.

Customization guidance
  • Add more platforms: extend the logic to other platforms like Instagram, Threads, or Mastodon.
  • Customize the AI tone: adjust the OpenAI prompt for a specific writing style (e.g., formal, casual, humorous).
  • Adjust scheduling: modify the interval time or RSS feed frequency as needed.
  • Add a post delay: schedule posts using a delay node to spread them over time.

Use cases
  • Auto-publish summarized news articles to multiple social platforms.
  • Reduce effort for social teams by automating draft creation and media.
  • Maintain editorial control using the Telegram approval step.
  • Repurpose blog or article content into engaging posts with minimal effort.

Integrations used: OpenAI (Assistants & DALL·E image generation), Telegram bot (inline approval workflow), Facebook Graph API (post publishing), LinkedIn API (company or personal posts), Twitter/X API (optional conditional post publishing), RSS feed reader (article fetching), NocoDB (content repository and status manager).

Build your own HTTP Request and NocoDB integration

Create custom HTTP Request and NocoDB workflows by choosing triggers and actions. Nodes come with global operations and settings, as well as app-specific parameters that can be configured. You can also use the HTTP Request node to query data from any app or service with a REST API.

NocoDB supported actions

  • Create – create a row
  • Delete – delete a row
  • Get – retrieve a row
  • Get Many – retrieve many rows
  • Update – update a row
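These actions correspond to NocoDB's REST data API, which you can also call directly from an HTTP Request node or any HTTP client. A minimal sketch following NocoDB's v2 data API shape – the base URL, table ID, and token here are placeholders, not real values:

```javascript
// Build the "list records" endpoint URL for a NocoDB table.
function recordsUrl(baseUrl, tableId, limit = 25) {
  return `${baseUrl}/api/v2/tables/${tableId}/records?limit=${limit}`;
}

// Sketch: fetch rows from NocoDB, authenticating with an API token in
// the xc-token header (the same token an n8n NocoDB credential uses).
async function listRows(baseUrl, tableId, token) {
  const res = await fetch(recordsUrl(baseUrl, tableId), {
    headers: { 'xc-token': token },
  });
  if (!res.ok) throw new Error(`NocoDB request failed: ${res.status}`);
  return (await res.json()).list; // rows come back under `list`
}
```

Creating, updating, and deleting rows use the same endpoint with POST, PATCH, and DELETE respectively; in most cases the dedicated NocoDB node is simpler, and the raw API is useful for operations the node doesn't expose.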
Use case

Save engineering resources

Reduce time spent on customer integrations, engineer faster POCs, and keep your customer-specific functionality separate from your product – all without having to code.


FAQs

  • Can HTTP Request connect with NocoDB?

  • Can I use HTTP Request’s API with n8n?

  • Can I use NocoDB’s API with n8n?

  • Is n8n secure for integrating HTTP Request and NocoDB?

  • How do I get started with the HTTP Request and NocoDB integration in n8n?

Need help setting up your HTTP Request and NocoDB integration?

Discover our latest community's recommendations and join the discussions about HTTP Request and NocoDB integration.

Looking to integrate HTTP Request and NocoDB in your company?

Over 3000 companies switch to n8n every single week

Why use n8n to integrate HTTP Request with NocoDB

Build complex workflows, really fast


Handle branching, merging and iteration easily.
Pause your workflow to wait for external events.

Code when you need it, UI when you don't

Simple debugging

Your data is displayed alongside your settings, making edge cases easy to track down.

Use templates to get started fast

Use 1000+ workflow templates available from our core team and our community.

Reuse your work

Copy and paste workflows, and easily import and export them.

Implement complex processes faster with n8n
