
HTTP Request and Postgres integration

Save yourself the work of writing custom integrations for HTTP Request and Postgres and use n8n instead. Build adaptable and scalable development and data-storage workflows that work with your technology stack, all within a building experience you will love.

How to connect HTTP Request and Postgres

  • Step 1: Create a new workflow
  • Step 2: Add and configure nodes
  • Step 3: Connect
  • Step 4: Customize and extend your integration
  • Step 5: Test and activate your workflow

Step 1: Create a new workflow and add the first step

In n8n, click the "Add workflow" button in the Workflows tab to create a new workflow. Add the starting point: a trigger that determines when your workflow should run. This can be an app event, a schedule, a webhook call, another workflow, an AI chat, or a manual trigger. Sometimes the HTTP Request node might already serve as your starting point.

Step 2: Add and configure HTTP Request and Postgres nodes

You can find HTTP Request and Postgres in the nodes panel. Drag them onto your workflow canvas and select their actions. Click each node, choose a credential, and authenticate to grant n8n access. Configure the HTTP Request and Postgres nodes one by one: input data on the left, parameters in the middle, and output data on the right.

Step 3: Connect HTTP Request and Postgres

A connection establishes a link between HTTP Request and Postgres (or vice versa) to route data through the workflow. Data flows from the output of one node to the input of another. You can have single or multiple connections for each node.
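In code terms, every n8n node emits a list of items whose payloads live under a `json` key, and the next connected node receives that list as its input. A minimal Python sketch of this hand-off (the two node functions are simplified stand-ins for illustration, not n8n internals):

```python
# Sketch of how data flows between two connected n8n nodes (simplified).
# Each node emits a list of "items"; every item wraps its payload in a
# `json` key, and the next node receives that list as its input.

def http_request_node():
    # Stand-in for an HTTP Request node's output: two records from an API.
    return [
        {"json": {"id": 1, "name": "Ada"}},
        {"json": {"id": 2, "name": "Linus"}},
    ]

def postgres_insert_node(items, table="users"):
    # Stand-in for a Postgres Insert node: turn each incoming item into a
    # parameterized INSERT statement (built here, not executed).
    statements = []
    for item in items:
        cols = sorted(item["json"])
        placeholders = ", ".join(["%s"] * len(cols))
        statements.append(
            f"INSERT INTO {table} ({', '.join(cols)}) VALUES ({placeholders})"
        )
    return statements

stmts = postgres_insert_node(http_request_node())
print(stmts[0])  # INSERT INTO users (id, name) VALUES (%s, %s)
```

In the editor you never write this plumbing yourself; the connection between the two nodes does it for you.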

Step 4: Customize and extend your HTTP Request and Postgres integration

Use n8n's core nodes such as If, Split Out, Merge, and others to transform and manipulate data. Write custom JavaScript or Python in the Code node and run it as a step in your workflow. Connect HTTP Request and Postgres with any of n8n’s 1000+ integrations, and incorporate advanced AI logic into your workflows.

Step 5: Test and activate your HTTP Request and Postgres workflow

Save and run the workflow to see if everything works as expected. Based on your configuration, data should flow from HTTP Request to Postgres or vice versa. Debugging is straightforward: check past executions to isolate and fix any mistakes. Once everything passes, save your workflow and activate it.

Generate Instagram content from top trends with AI image generation

How it works

This automated workflow discovers trending Instagram posts and creates similar AI-generated content. Here's the high-level process:

  1. Content Discovery & Analysis
    Scrapes trending posts from specific hashtags
    Analyzes visual elements using AI
    Filters out videos and duplicates

  2. AI Content Generation
    Creates unique images based on trending content
    Generates engaging captions with relevant hashtags
    Maintains brand consistency while being original

  3. Automated Publishing
    Posts content directly to Instagram
    Monitors publication status
    Sends notifications via Telegram

Set up steps

Setting up this workflow takes approximately 15-20 minutes:

  1. API Configuration (7-10 minutes)
    Instagram Business Account setup
    Telegram Bot creation
    API key generation (OpenAI, Replicate, RapidAPI)

  2. Database Setup (3-5 minutes)
    Create required database table
    Configure PostgreSQL credentials

  3. Workflow Configuration (5-7 minutes)
    Set scheduling preferences
    Configure notification settings
    Test connection and permissions

Detailed technical specifications and configurations are available in sticky notes within the workflow.
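The exact table definition for the Database Setup step lives in those sticky notes; purely as an illustration of what a deduplication/status table for this kind of workflow might look like, here is a hypothetical sketch (every name and column below is an assumption, not the template's actual schema):

```python
# Hypothetical DDL for the deduplication/status table this workflow needs.
# The real schema is documented in the workflow's sticky notes; everything
# here (table name, columns) is an illustrative assumption.
POSTS_TABLE_DDL = """
CREATE TABLE IF NOT EXISTS instagram_posts (
    id          SERIAL PRIMARY KEY,
    source_url  TEXT UNIQUE,             -- trending post the content derives from
    caption     TEXT,
    image_url   TEXT,
    status      TEXT DEFAULT 'pending',  -- pending / published / failed
    created_at  TIMESTAMPTZ DEFAULT now()
);
"""

# Running it against a real database would look like this (not executed here):
# import psycopg2
# with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
#     cur.execute(POSTS_TABLE_DDL)
```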

Popular HTTP Request and Postgres workflows

AI-Driven Inventory Management with OpenAI Forecasting & ERP Integration

This n8n workflow automates the monitoring of warehouse inventory and sales velocity to predict demand, generate purchase orders automatically, send them to suppliers, and record all transactions in ERP and database systems. It uses AI-driven forecasting to ensure timely restocking while maintaining operational efficiency and minimizing stockouts or overstocking.

Key Features

  • Automated Scheduling: periodically checks inventory and sales data at defined intervals.
  • Real-Time Data Fetching: retrieves live warehouse stock levels and sales trends.
  • AI Demand Forecasting: uses OpenAI GPT to predict future demand based on sales velocity and stock trends.
  • Auto-Purchase Orders: automatically generates and sends purchase orders to suppliers.
  • ERP Integration: logs completed purchase orders into ERP systems like SAP, Oracle, or Netsuite.
  • Database Logging: saves purchase order details and forecast confidence data into SQL databases (PostgreSQL/MySQL).
  • Email Notifications: notifies relevant teams upon successful order creation and logging.
  • Modular Configuration: each node includes configuration notes and credentials setup instructions.

Workflow Process

  1. Schedule Trigger: runs every 6 hours to monitor stock and sales data. The interval can be adjusted for higher or lower frequency checks.
  2. Fetch Current Inventory Data: retrieves live inventory levels from the warehouse API endpoint. Requires API credentials and optional GET/POST method setup.
  3. Fetch Sales Velocity: pulls recent sales data for forecasting analysis, used later for AI-based trend prediction.
  4. Merge Inventory & Sales Data: combines inventory and sales datasets into a unified JSON structure and prepares the data for AI model input.
  5. AI Demand Forecasting: sends merged data to OpenAI GPT for demand prediction. Returns demand score, reorder need, and confidence levels.
  6. Parse AI Response: extracts and structures forecast results, combining AI data with the original inventory dataset.
  7. Filter (Reorder Needed): identifies items flagged for reorder based on AI output and passes only reorder-required products to the next steps.
  8. Create Purchase Order: automatically creates a PO document with item details, quantity, and supplier information. Calculates total cost and applies forecast-based reorder logic.
  9. Send PO to Supplier: sends the generated purchase order to supplier API endpoints, with response validation for order success/failure.
  10. Log to ERP System: records confirmed purchase orders into ERP platforms (SAP, Oracle, Netsuite), including timestamps and forecast metrics.
  11. Save to Database: stores all PO data, supplier responses, and AI forecast metrics into PostgreSQL/MySQL tables, useful for long-term audit and analytics.
  12. Send Notification Email: sends summary emails upon PO creation and logging, including PO ID, supplier, cost, and demand reasoning.

Setup Instructions

  • Schedule Trigger: adjust to your preferred interval (e.g., every 6 hours or once daily).
  • API Configuration: provide credentials in the Inventory, Sales, and Supplier nodes. Use Authorization headers or API keys as per your system.
  • AI Node (OpenAI): add your OpenAI API key in the credentials section. Modify the prompt if you wish to include additional forecasting parameters.
  • ERP Integration: replace placeholder URLs with your ERP system endpoints. Match fields like purchase order number, date, and cost.
  • Database Connection: configure credentials for PostgreSQL/MySQL in the Save to Database node. Ensure tables (purchase_orders) are created as per the schema provided in sticky notes.
  • Email Notifications: set up SMTP credentials (e.g., Gmail, Outlook, or a custom mail server). Add recipients under workflow notification settings.

Industries That Benefit

This automation is highly beneficial for:

  • Retail & E-commerce: predicts product demand and auto-orders from suppliers.
  • Manufacturing: ensures raw materials are restocked based on production cycles.
  • Pharmaceuticals: maintains optimum inventory for high-demand medicines.
  • FMCG & Supply Chain: balances fast-moving goods availability with minimal overstocking.
  • Automotive & Electronics: prevents delays due to missing components.

Prerequisites

  • API access to inventory, sales, supplier, and ERP systems.
  • Valid OpenAI API key for demand forecasting.
  • SQL database (PostgreSQL/MySQL) for record storage.
  • SMTP or mail server credentials for email notifications.
  • n8n environment with required nodes installed (HTTP, AI, Filter, Email, Database).

Modification Options

  • Change forecast logic or thresholds for different industries.
  • Integrate Slack/Teams for live notifications.
  • Add an approval workflow before sending POs.
  • Extend the AI prompt for seasonality or promotional trends.
  • Add dashboard visualization using Grafana or Google Sheets.

Explore More AI Workflows: get in touch with us to build industry-grade n8n automations with predictive intelligence.
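The reorder decision at the heart of this template boils down to comparing days of stock cover against a threshold. A hedged sketch of that logic (the field names and the 7-day threshold are illustrative assumptions, not the template's actual values):

```python
# Simplified stand-in for the AI forecasting + "Filter: Reorder Needed" steps.
# Field names and the 7-day cover threshold are illustrative assumptions.

def needs_reorder(item, cover_days_threshold=7):
    """Flag an item for reorder when stock covers fewer days than the threshold."""
    daily_sales = item["sales_last_30d"] / 30
    if daily_sales == 0:
        return False  # nothing selling, nothing to reorder
    days_of_cover = item["stock_on_hand"] / daily_sales
    return days_of_cover < cover_days_threshold

inventory = [
    {"sku": "A-100", "stock_on_hand": 40, "sales_last_30d": 300},   # 4 days of cover
    {"sku": "B-200", "stock_on_hand": 500, "sales_last_30d": 150},  # 100 days of cover
]
to_reorder = [i["sku"] for i in inventory if needs_reorder(i)]
print(to_reorder)  # ['A-100']
```

In the actual workflow this judgment is delegated to the AI node, which can weigh trends rather than a single fixed ratio.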

Generate AI trading alerts from CoinGecko and Alpha Vantage via Slack, email and SMS

Automates real-time market monitoring, technical analysis, and AI-powered signal generation for cryptocurrencies (and stocks), filters high-confidence trades, and delivers actionable alerts via multiple channels.

Good to Know

  • Runs every 5–30 minutes (configurable trigger) to catch fresh market opportunities
  • Pulls real-time price data from multiple crypto/stock sources in parallel
  • Calculates popular technical indicators (RSI, MACD, Moving Averages, etc.)
  • Uses an AI model (likely Grok/xAI, OpenAI, or similar) to interpret indicators and generate buy/sell signals with confidence scores
  • Applies multi-layer filtering to reduce noise (thresholds, validation rules)
  • Stores signals in a database, logs execution history, and sends notifications
  • Supports email, Telegram, Discord, SMS (via Twilio), or trading execution webhooks
  • Saves significant time compared to manual chart watching

How It Works

  1. Trigger: Schedule Trigger or Manual Trigger (every 5–30 minutes), with an optional Market Hours / Kill-zone filter (e.g. to avoid low-volume periods). Can also be webhook-based for on-demand runs.
  2. Fetch & Prepare Data: fetches real-time / recent OHLCV data for a watchlist of cryptocurrencies (and possibly stocks) from sources such as CoinGecko, Binance, Alpha Vantage, CoinMarketCap, Bybit, and Kraken (multiple in parallel). Combines data from the different APIs, prepares a structured dataset (candles, volume, current price), and calculates technical indicators in parallel or via a Code node / community nodes (e.g. RSI(14), MACD, EMA/SMA crossovers, Bollinger Bands).
  3. Analysis & Signal Generation: sends the prepared market data plus calculated indicators to an AI model. The prompt instructs the model to analyze current market structure, evaluate indicator confluence, generate a Buy / Sell / Hold signal, assign a confidence score (e.g. 0–100%), and provide short reasoning. Optional: a rule-based pre-filter (e.g. only proceed if RSI < 30 or a MACD crossover occurs).
  4. Validate, Alert & Store: filters signals (minimum confidence threshold, no-duplicate check, max signals per run), validates against additional rules (e.g. volume spike, no recent opposite signal), and stores each signal in a database (PostgreSQL, Supabase, Airtable, Google Sheets, etc.) with timestamp, symbol, signal type, confidence, price, an indicators snapshot, and the AI reasoning. Logs the full execution trace and sends alerts: email, Telegram / Discord messages (with formatting), SMS (Twilio), or a webhook to a trading bot / execution system. Optional: push to a TradingView alert or auto-execute (paper/live).

Data Sources

  • Market Data APIs: CoinGecko, Binance, Alpha Vantage, CoinMarketCap, etc.
  • Technical Indicators: calculated via a Code node, community nodes, or external libraries
  • AI Model: Grok (xAI), OpenAI (GPT-4o), Claude, Gemini, or a local LLM
  • Notification Channels: Email (Gmail/SMTP), Telegram, Discord, Twilio, webhook
  • Storage: Google Sheets, PostgreSQL, Supabase, Notion, Airtable

How to Use

  1. Import the workflow JSON into your n8n instance
  2. Configure credentials: API keys for market data providers (Alpha Vantage, CoinGecko Pro, Binance, etc.), the AI provider (Grok API key, OpenAI key, etc.), notification services (Telegram bot token, email SMTP, Twilio, etc.), and the database connection if used
  3. Set your watchlist: edit the symbols in the fetch node(s)
  4. Tune the schedule: change the interval in the trigger node
  5. Customize the AI prompt: adjust it in the AI node for more aggressive or conservative signals
  6. Set filters: confidence threshold, max alerts per cycle, etc.
  7. Test manually: use the Execute Workflow button with sample data
  8. Activate & monitor: check the Executions tab for logs

Requirements

  • n8n (self-hosted or cloud)
  • API keys for at least one market data provider
  • AI API access (Grok, OpenAI, etc.)
  • Notification credentials (Telegram bot, email account, etc.)
  • Optional: a database for persistent signal history

Customizing This Workflow

  • Add more exchanges/sources for better data redundancy
  • Include on-chain metrics (whale alerts, funding rates) via additional APIs
  • Switch the AI model or fine-tune the prompt for your trading style
  • Add risk management rules (position sizing, stop-loss levels)
  • Integrate auto-trading via an exchange API (Binance, Bybit, Alpaca, etc.)
  • Create dashboard output (Google Sheets + Looker Studio / Grafana)
  • Add a backtesting mode using historical data
  • Implement blackout periods or a news filter to avoid high-impact events
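As an example of the indicator step, the RSI(14) mentioned above can be computed from a series of closing prices. This is the simple-average variant for illustration (workflows of this kind often use Wilder's smoothed RSI, which differs slightly):

```python
# Simple-average RSI, as one example of the indicator math a Code node
# might perform. (Wilder's smoothed RSI gives slightly different values.)

def rsi(closes, period=14):
    gains, losses = [], []
    for prev, cur in zip(closes, closes[1:]):
        change = cur - prev
        gains.append(max(change, 0))
        losses.append(max(-change, 0))
    avg_gain = sum(gains[-period:]) / period
    avg_loss = sum(losses[-period:]) / period
    if avg_loss == 0:
        return 100.0  # no losing candles in the window
    rs = avg_gain / avg_loss
    return 100 - 100 / (1 + rs)

# 15 closes -> 14 changes; strictly rising prices give the maximum RSI
print(rsi(list(range(100, 115))))  # 100.0
```

A rule-based pre-filter like "only proceed if RSI < 30" is then a one-line If-node condition on this value.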

Scrape Google Maps by area & Generate Outreach Messages for Lead Generation

This n8n workflow automates lead extraction from Google Maps, enriches data with AI, and stores results for cold outreach. It uses the Bright Data community node and Bright Data MCP for scraping and AI message generation.

How it works

  1. Form Submission: the user provides a Google Maps starting location, keyword, and country.
  2. Bright Data Scraping: the Bright Data community node triggers a Maps scraping job, monitors progress, and downloads results.
  3. AI Message Generation: uses Bright Data MCP with LLMs to create a personalized cold call script and talking points for each lead.
  4. Database Storage: enriched leads and scripts are upserted to Supabase.

How to use

Set up all the credentials, create your Postgres table, and submit the form. The rest happens automatically.

Requirements

  • LLM account (OpenAI, Gemini…) for API usage.
  • Bright Data account for API and MCP usage.
  • Supabase account (or other Postgres database) to store information.

Automate Event Follow-Ups with GPT-4, LinkedIn & HubSpot Multi-Channel Outreach

Automate your post-event networking with this intelligent n8n workflow. Triggered instantly after an event, it collects attendee and interaction data, enriches profiles with LinkedIn insights, and uses GPT-4 to analyze engagement and generate tailored follow-up messages. High-value leads are prioritized, messages are sent via email, LinkedIn, or Slack, and all activity is logged in your CRM and database. Save hours of manual follow-up while boosting relationship-building and ROI. 🤝✨

Advanced Features

  • Webhook automation: starts instantly on event completion
  • Multi-Source Enrichment: combines event data, interactions, and LinkedIn profiles
  • AI-Powered Insights: GPT-4 analyzes behavior and suggests personalized talking points
  • Smart Priority Filtering: routes leads into High, Medium, and Low priority paths
  • Personalized Content Generation: AI crafts custom emails and LinkedIn messages
  • Multi-Channel Outreach: sends via Email, LinkedIn DM, and Slack
  • CRM Integration: automatically updates HubSpot with contact notes and engagement
  • PostgreSQL Logging: stores full interaction history and analytics
  • ROI Dashboard: tracks response rates, meetings booked, and pipeline impact

What It Does

  • Collects attendee data from your event platform
  • Enriches it with LinkedIn profiles & real-time interaction logs
  • Scores networking potential using engagement algorithms
  • Uses AI to analyze conversations, roles, and mutual interests
  • Generates hyper-personalized follow-up emails and LinkedIn messages
  • Sends messages through preferred channels (email, LinkedIn, Slack)
  • Updates HubSpot CRM with follow-up status and next steps
  • Logs all actions and tracks analytics for performance reporting

Workflow Process

  1. The Webhook Trigger initiates the workflow via a POST request with event and attendee data.
  2. Get Attendees fetches the participant list from the event platform.
  3. Get Interactions pulls Q&A, chat, poll, and networking activity logs.
  4. Enrich LinkedIn Data retrieves professional profiles, job titles, and company details via the LinkedIn API.
  5. Merge & Enrich Data combines all sources into a unified lead profile.
  6. AI Analyze Profile uses GPT-4 to evaluate interaction depth, role relevance, and conversation context.
  7. Filter High Priority routes top-tier leads (e.g., decision-makers with strong engagement).
  8. Filter Medium Priority handles warm prospects for lighter follow-up.
  9. AI Agent1 generates personalized email content using the chat model and memory.
  10. Generate Email creates a professional, context-aware follow-up email.
  11. Send Email delivers the message to the lead's inbox.
  12. AI Agent2 crafts a concise, friendly LinkedIn connection message.
  13. Generate LinkedIn Msg produces a tailored outreach note.
  14. Send LinkedIn posts the message via the LinkedIn API.
  15. Slack Notification alerts your team in real time about high-priority outreach.
  16. Update CRM (HubSpot) adds the contact, tags, and follow-up tasks automatically.
  17. Save to Database (Insert) logs the full lead journey and message content in PostgreSQL.
  18. Generate Analytics compiles engagement metrics and success rates.
  19. Send Response confirms completion back to the event system.

Setup Instructions

  1. Import the workflow JSON into n8n
  2. Configure credentials: Event Platform API (for attendees & interactions), LinkedIn API (OAuth2), OpenAI (GPT-4), SMTP or an email service (SendGrid, etc.), HubSpot API key, PostgreSQL database, and Slack webhook URL
  3. Trigger with a webhook POST containing the event ID and settings
  4. Watch personalized outreach happen automatically!

Prerequisites

  • Event platform with webhook + attendee/interaction API
  • LinkedIn Developer App with API access
  • OpenAI API key with GPT-4 access
  • HubSpot account with API enabled
  • PostgreSQL database (table for leads & logs)
  • Slack workspace (optional, for team alerts)

Example Webhook Payload

  {
    "eventId": "evt_spring2025",
    "eventName": "Annual Growth Summit",
    "triggerFollowUp": true,
    "priorityThreshold": { "high": 75, "medium": 50 }
  }

Modification Options

  • Adjust scoring logic in AI Analyze Profile (e.g., weight Q&A participation higher)
  • Add custom email templates in Generate Email with your brand voice
  • Include meeting booking links (Calendly) in high-priority messages
  • Route VIP leads to Send SMS via Twilio
  • Export analytics to Google Sheets or BI tools (Looker, Tableau)
  • Add an approval step before sending LinkedIn messages

Ready to 10x your event ROI? Get in touch with us for custom n8n automation!
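The High/Medium/Low routing described above amounts to thresholding a lead score. The 75/50 cut-offs below match the example webhook payload, while the scoring weights themselves are illustrative assumptions, not the template's actual algorithm:

```python
# Priority-routing sketch using the example payload's thresholds
# (high: 75, medium: 50). The scoring weights are illustrative assumptions.

def score_lead(lead):
    score = 0
    score += min(lead.get("questions_asked", 0) * 10, 30)      # Q&A engagement, capped
    score += min(lead.get("minutes_attended", 0) // 10 * 5, 30)  # time in sessions, capped
    score += 40 if lead.get("is_decision_maker") else 0          # role relevance
    return score

def route(lead, thresholds={"high": 75, "medium": 50}):
    s = score_lead(lead)
    if s >= thresholds["high"]:
        return "high"
    if s >= thresholds["medium"]:
        return "medium"
    return "low"

vip = {"questions_asked": 3, "minutes_attended": 90, "is_decision_maker": True}
print(route(vip))  # high
```

In the workflow, the two Filter nodes implement exactly this kind of branch on the AI-produced score.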

Generate Post-Event Reports with GPT-4, Email Delivery & Database Storage

Streamline your post-event analysis with this smart n8n workflow. Triggered by a simple webhook, it instantly gathers attendee and engagement data from your event platform, calculates key metrics, and uses AI to generate a polished, professional report. The final summary is emailed to stakeholders and saved securely in a database, all without manual effort. Perfect for conferences, webinars, and corporate events. 📧📈

Key Features

  • Webhook triggered: starts instantly via HTTP POST request
  • Multi-source data collection: fetches attendees & engagement metrics
  • Advanced analytics: calculates attendance rates, engagement scores, top sessions
  • AI-powered insights: uses GPT-4 to generate professional reports
  • Auto-email delivery: sends the report to stakeholders
  • Database archiving: saves reports to PostgreSQL

What it Analyzes

  • Attendance rates & check-ins
  • Average session time
  • Engagement scores (polls, Q&A, networking)
  • Top performing sessions
  • Attendee breakdown (by role & company)
  • AI-generated insights & recommendations

Workflow Process

  1. The Webhook Trigger node starts the workflow when an HTTP POST request is received with event details.
  2. Get Attendees (GET) pulls the list of registered and checked-in participants from your event system.
  3. Get Engagement Metrics (GET) retrieves interaction data like poll responses, Q&A activity, and session views.
  4. Process Metrics calculates key stats: attendance rate, average session duration, engagement score, and ranks the top sessions.
  5. AI Generate Report uses GPT-4 to create a clear, professional summary with insights and recommendations based on the data.
  6. AI Agent coordinates data flow and prepares the final report structure using chat model and memory tools.
  7. Save to Database (Insert) stores the full report and raw metrics in PostgreSQL for future reference.
  8. Send Report Email automatically emails the AI-generated report to the specified recipient.
  9. Send Response returns a confirmation back to the triggering system via webhook.

Setup Instructions

  1. Import this JSON into n8n
  2. Configure credentials: Event API (for GET requests), OpenAI (GPT-4), SMTP (for email delivery), and PostgreSQL (for data storage)
  3. Trigger via webhook with event data
  4. Receive a comprehensive report via email within minutes!

Prerequisites

  • Event platform with REST API (for attendee & engagement data)
  • OpenAI API key (GPT-4 access)
  • SMTP server credentials (Gmail, SendGrid, etc.)
  • PostgreSQL database with write access

Example Webhook Payload

  {
    "eventId": "evt_123",
    "eventName": "Tech Summit 2025",
    "eventDate": "2025-10-29",
    "email": "[email protected]"
  }

Modification Options

  • Add custom metrics in the Process Metrics node (e.g., NPS score, feedback sentiment)
  • Change the AI tone in AI Generate Report (formal, executive summary, or creative)
  • Modify the email template in Send Report Email with your branding
  • Connect to different data sources by updating the GET nodes
  • Add a Slack or Teams notification after Send Report Email

Ready to automate your event reporting? Get in touch with us for custom n8n workflows!
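The Process Metrics step reduces to straightforward arithmetic over the fetched data. A hedged sketch (field names are assumptions for illustration, not taken from the template):

```python
# Sketch of the "Process Metrics" calculations. Field names are assumptions.

def process_metrics(registered, checked_in, session_minutes):
    attendance_rate = checked_in / registered if registered else 0.0
    avg_session = (sum(session_minutes) / len(session_minutes)
                   if session_minutes else 0.0)
    return {
        "attendance_rate": round(attendance_rate, 2),
        "avg_session_minutes": round(avg_session, 1),
    }

metrics = process_metrics(registered=200, checked_in=150,
                          session_minutes=[30, 45, 60])
print(metrics)  # {'attendance_rate': 0.75, 'avg_session_minutes': 45.0}
```

The resulting dictionary is what the AI Generate Report node would summarize into prose.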

Build your own HTTP Request and Postgres integration

Create custom HTTP Request and Postgres workflows by choosing triggers and actions. Nodes come with global operations and settings, as well as app-specific parameters that can be configured. You can also use the HTTP Request node to query data from any app or service with a REST API.
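What the HTTP Request node configures declaratively corresponds to an ordinary HTTP call. A Python equivalent using only the standard library (the URL and token below are placeholders, and the request is built but deliberately not sent):

```python
# Rough Python equivalent of an HTTP Request node configuration.
# The endpoint and token are placeholders; the request is built, not sent.
import json
import urllib.request

req = urllib.request.Request(
    url="https://api.example.com/v1/records",   # placeholder endpoint
    headers={
        "Authorization": "Bearer YOUR_TOKEN",   # placeholder credential
        "Content-Type": "application/json",
    },
    data=json.dumps({"name": "Ada"}).encode(),
    method="POST",
)

# Actually sending it would look like this (commented out to stay side-effect free):
# with urllib.request.urlopen(req) as resp:
#     body = json.load(resp)

print(req.method, req.full_url)
```

In n8n the same fields (method, URL, headers, body, authentication) are filled in through the node's UI instead.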

Postgres supported actions

  • Delete: delete an entire table or rows in a table
  • Execute Query: execute an SQL query
  • Insert: insert rows in a table
  • Insert or Update: insert or update rows in a table
  • Select: select rows from a table
  • Update: update rows in a table
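The Insert or Update action corresponds to a PostgreSQL upsert. As a sketch of the SQL it effectively issues (the table and column names here are made up for the example):

```python
# Illustrative SQL behind an "Insert or Update" (upsert) on Postgres.
# The table and column names are invented for this example.
UPSERT_SQL = """
INSERT INTO users (id, name, email)
VALUES (%s, %s, %s)
ON CONFLICT (id)
DO UPDATE SET name = EXCLUDED.name,
              email = EXCLUDED.email;
"""

# With a driver such as psycopg2 this would run as (not executed here):
# cur.execute(UPSERT_SQL, (1, "Ada", "ada@example.com"))
```

The node generates the matching-column logic for you; you only pick the columns to match on.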

Use case

Save engineering resources

Reduce time spent on customer integrations, engineer faster POCs, and keep your customer-specific functionality separate from your product, all without having to code.

Learn more

FAQs

  • Can HTTP Request connect with Postgres?

  • Can I use HTTP Request’s API with n8n?

  • Can I use Postgres’s API with n8n?

  • Is n8n secure for integrating HTTP Request and Postgres?

  • How do I get started with the HTTP Request and Postgres integration in n8n?

Need help setting up your HTTP Request and Postgres integration?

Discover our community's latest recommendations and join the discussions about the HTTP Request and Postgres integration.
Mikhail Savenkov
Moiz Contractor
theo
Jon
Honza Pav

Looking to integrate HTTP Request and Postgres in your company?

Over 3000 companies switch to n8n every single week

Why use n8n to integrate HTTP Request with Postgres

Build complex workflows, really fast

Handle branching, merging and iteration easily.
Pause your workflow to wait for external events.

Code when you need it, UI when you don't

Simple debugging

Your data is displayed alongside your settings, making edge cases easy to track down.

Use templates to get started fast

Use 1000+ workflow templates available from our core team and our community.

Reuse your work

Copy and paste, easily import and export workflows.

Implement complex processes faster with n8n
