Back to Integrations
Webhook node
MongoDB node

Webhook and MongoDB integration

Save yourself the work of writing custom integrations for Webhook and MongoDB and use n8n instead. Build adaptable and scalable Development, Core Nodes, and Data & Storage workflows that work with your technology stack, all within a building experience you will love.

How to connect Webhook and MongoDB

  • Step 1: Create a new workflow
  • Step 2: Add and configure nodes
  • Step 3: Connect
  • Step 4: Customize and extend your integration
  • Step 5: Test and activate your workflow

Step 1: Create a new workflow and add the first step

In n8n, click the "Add workflow" button in the Workflows tab to create a new workflow. Add the starting point: a trigger that determines when your workflow should run, such as an app event, a schedule, a webhook call, another workflow, an AI chat, or a manual trigger. Sometimes, the HTTP Request node might already serve as your starting point.


Step 2: Add and configure Webhook and MongoDB nodes

You can find Webhook and MongoDB in the nodes panel. Drag them onto your workflow canvas, selecting their actions. Click each node, choose a credential, and authenticate to grant n8n access. Configure Webhook and MongoDB nodes one by one: input data on the left, parameters in the middle, and output data on the right.


Step 3: Connect Webhook and MongoDB

A connection establishes a link between Webhook and MongoDB (or vice versa) to route data through the workflow. Data flows from the output of one node to the input of another. You can have single or multiple connections for each node.


Step 4: Customize and extend your Webhook and MongoDB integration

Use n8n's core nodes such as If, Split Out, Merge, and others to transform and manipulate data. Write custom JavaScript or Python in the Code node and run it as a step in your workflow. Connect Webhook and MongoDB with any of n8n’s 1000+ integrations, and incorporate advanced AI logic into your workflows.
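For example, a few lines of JavaScript in the Code node can reshape incoming items before they reach another node. This is a minimal sketch under assumptions: the field names (`deviceId`, `heartRate`) are hypothetical, and in n8n items arrive and leave as objects wrapped in `{ json: ... }`.

```javascript
// Hypothetical Code-node logic: normalize incoming items before the next node.
// Field names are illustrative, not taken from any specific template.
function transformItems(items) {
  return items.map(({ json }) => ({
    json: {
      deviceId: String(json.deviceId ?? 'unknown'), // coerce to string id
      heartRate: Number(json.heartRate),            // coerce to number
      receivedAt: new Date().toISOString(),         // stamp ingest time
    },
  }));
}

// Inside an n8n Code node you would end with:
// return transformItems($input.all());
```

In a real Code node, `$input.all()` supplies the items and the returned array becomes the node's output.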


Step 5: Test and activate your Webhook and MongoDB workflow

Save and run the workflow to see if everything works as expected. Based on your configuration, data should flow from Webhook to MongoDB or vice versa. Debugging is straightforward: check past executions to isolate and fix mistakes. Once you've tested everything, save your workflow and activate it.


Predictive health monitoring & alert system with GPT-4o-mini

How It Works
The system collects real-time wearable health data, normalizes it, and uses AI to analyze trends and risk scores. It detects anomalies by comparing with historical patterns and automatically triggers alerts and follow-up actions when thresholds are exceeded.

Setup Steps
Configure Webhook Endpoint - Set up webhook to receive data from wearable devices
Connect Database - Initialize storage for health metrics and historical data
Set Normalization Rules - Define data standardization parameters for different devices
Configure AI Model - Set up health score calculation and risk prediction algorithms
Define Thresholds - Establish alert triggers for critical health metrics
Connect Notification Channels - Configure email and Slack integrations
Set Up Reporting - Create automated report templates and schedules
Test Workflow - Run end-to-end tests with sample health data

Workflow Template
Webhook → Store Database → Normalize Data → Calculate Health Score → Analyze Metrics → Compare Previous → Check Threshold → Generate Reports/Alerts → Email/Slack → Schedule Follow-up

Workflow Steps
Ingest real-time wearable data via webhook, store and standardize it, and use GPT-4 for trend analysis and risk scoring. Monitor metrics against thresholds, trigger SMS, email, or Slack alerts, generate reports, and schedule follow-ups.
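The "Check Threshold" step above could be sketched as a small Code-node helper. The metric names and limits below are illustrative assumptions, not values from the template:

```javascript
// Hypothetical thresholds for the alerting step; adjust to your own metrics.
const THRESHOLDS = {
  heartRate: { min: 40, max: 130 },
  spo2: { min: 92, max: 100 },
};

// Returns the names of metrics that fall outside their allowed range.
function exceededThresholds(metrics) {
  return Object.entries(THRESHOLDS)
    .filter(([key, { min, max }]) => {
      const v = metrics[key];
      return typeof v === 'number' && (v < min || v > max);
    })
    .map(([key]) => key);
}
```

A non-empty result would route the workflow down the alert branch (email/Slack); an empty result would skip it.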

Setup Instructions
Configure webhook, database, GPT-4 API, notifications, calendar integration, and customize alert thresholds.

Prerequisites
Wearable API, patient database, GPT-4 key, email SMTP, optional Slack/Twilio, calendar integration.

Use Cases
Monitor glucose for diabetics, track elderly vitals/fall risk, assess corporate wellness, and post-surgery recovery alerts.

Customization
Adjust risk algorithms, add metrics, integrate telemedicine.

Benefits
Early intervention reduces readmissions and automates 80% of monitoring tasks.


Popular Webhook and MongoDB workflows


E-commerce Price Tracker with ScrapeGraphAI, MongoDB, and Mailgun Alerts

Product Price Monitor with Mailgun and MongoDB

⚠️ COMMUNITY TEMPLATE DISCLAIMER: This is a community-contributed template that uses ScrapeGraphAI (a community node). Please ensure you have the ScrapeGraphAI community node installed in your n8n instance before using this template.

This workflow automatically scrapes multiple e-commerce sites, records weekly product prices in MongoDB, analyzes seasonal trends, and emails a concise report to retail stakeholders via Mailgun. It helps retailers make informed inventory and pricing decisions by providing up-to-date pricing intelligence.

Prerequisites
n8n instance (self-hosted, desktop, or n8n.cloud)
ScrapeGraphAI community node installed and activated
MongoDB database (Atlas or self-hosted)
Mailgun account with a verified domain
Publicly reachable n8n Webhook URL (if self-hosted)

Required Credentials
ScrapeGraphAI API Key - enables web scraping across target sites
MongoDB Credentials - connection string (MongoDB URI) with read/write access
Mailgun API Key & Domain - to send summary emails

MongoDB Collection Schema

| Field       | Type   | Example Value         | Notes                             |
|-------------|--------|-----------------------|-----------------------------------|
| productId   | String | SKU-12345             | Unique identifier you define      |
| productName | String | Women's Winter Jacket | Human-readable name               |
| timestamp   | Date   | 2024-09-15T00:00:00Z  | Ingest date (automatically added) |
| price       | Number | 79.99                 | Scraped price                     |
| source      | String | example-shop.com      | Domain where price was scraped    |

How it works
Webhook Trigger - Starts the workflow on a scheduled HTTP call or manual trigger.
Code (Prepare Products) - Defines the list of SKUs/URLs to monitor.
Split In Batches - Processes products in manageable chunks to respect rate limits.
ScrapeGraphAI (Scrape Price) - Extracts price, availability, and currency from each product URL.
Merge (Combine Results) - Reassembles all batch outputs into one dataset.
MongoDB (Upsert Price History) - Stores each price point for historical analysis.
If (Seasonal Trend Check) - Compares the current price against the historical average to detect anomalies.
Set (Email Payload) - Formats the trend report for email.
Mailgun (Send Email) - Emails the weekly summary to the specified recipients.
Respond to Webhook - Returns a "200 OK - Report Sent" response for logging.

Set up steps (setup time: 15-20 minutes)
Install Community Node - In n8n, go to "Settings → Community Nodes" and install @n8n-community/nodes-scrapegraphai.
Create Credentials - Add your ScrapeGraphAI API key, MongoDB credentials (type: MongoDB), and Mailgun credentials (type: Mailgun).
Import Workflow - Download the JSON template, then in n8n click "Import" and select the file.
Configure Product List - Open the Code (Prepare Products) node and replace the example array with your product objects { id, name, url }.
Adjust Cron/Schedule - For a fully automated schedule, replace the Webhook with a Cron node (e.g., every Monday at 09:00).
Verify MongoDB Collection - Ensure the collection (default: productPrices) exists, or let n8n create it on first run.
Set Recipients - In the Mailgun node, update the to, from, and subject fields.
Execute Test Run - Trigger the Webhook URL manually or run the workflow once to verify data flow and email delivery.
Activate - Toggle the workflow to "Active" so it runs automatically each week.

Node Descriptions
Webhook - Entry point that accepts a GET/POST call to start the job.
Code (Prepare Products) - Outputs an array of products to monitor.
Split In Batches - Limits scraping to N products per request to avoid bans.
ScrapeGraphAI - Scrapes the HTML of a product page and parses pricing data.
Merge - Recombines batch results for streamlined processing.
MongoDB - Inserts or updates each product's price-history document.
If - Determines whether the price deviates more than X% from the seasonal average.
Set - Builds an HTML/text email body containing the findings.
Mailgun - Sends the email via the Mailgun REST API.
Respond to Webhook - Returns an HTTP response for logging/monitoring.
Sticky Notes - Provide in-workflow documentation (not executed).

Data Flow
Webhook → Code → Split In Batches
Split In Batches → ScrapeGraphAI → Merge
Merge → MongoDB → If
If (true) → Set → Mailgun → Respond to Webhook

Customization Examples
Change scraping frequency (Cron node settings):
{ "mode": "custom", "cronExpression": "0 6 * * 1,4" } // Monday & Thursday 06:00
Extend the extracted data points (ScrapeGraphAI extraction config):
{ "price": "css:span.price", "inStock": "css:div.availability", "reviewCount": "regex:\"(\\d+) reviews\"" }

Data Output Format
The workflow outputs structured JSON data:
{ "productId": "SKU-12345", "productName": "Women's Winter Jacket", "timestamp": "2024-09-15T00:00:00Z", "price": 79.99, "currency": "USD", "source": "example-shop.com", "trend": "5% below 3-month average" }

Troubleshooting
ScrapeGraphAI returns empty data - Confirm selectors/XPath are correct; test with the ScrapeGraphAI playground.
MongoDB connection fails - Verify IP whitelisting for Atlas, or network connectivity for a self-hosted instance.
Mail not delivered - Check Mailgun logs for bounce or spam rejection, and ensure the from domain is verified.

Performance Tips
Use smaller batch sizes (e.g., 5 URLs) to avoid target-site rate-limit blocks.
Cache static product info; scrape only fields that change (price, stock).

Pro Tips
Integrate the If node with n8n's Slack node to push urgent price drops to a channel.
Add a Function node to calculate moving averages for deeper analysis.
Store raw HTML snapshots in S3/MinIO for auditability and debugging.
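The moving-average Function node suggested in the Pro Tips might look like this minimal sketch; the function name and window size are assumptions, not part of the template:

```javascript
// Compute the average of the most recent `windowSize` prices.
// Returns null when there is not enough history yet.
function movingAverage(prices, windowSize) {
  if (prices.length < windowSize) return null;
  const recent = prices.slice(-windowSize); // last N price points
  return recent.reduce((sum, p) => sum + p, 0) / windowSize;
}
```

In the workflow, you could compare the latest scraped price against this average to decide whether the If node's trend branch fires.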

Create a Secure MongoDB Data Retrieval API with Input Validation and HTTP Responses

Data Extraction from MongoDB

Overview
This workflow exposes a public HTTP GET endpoint that reads all documents from a MongoDB collection, with:
Strict validation of the collection name
Error handling with proper 4xx codes
Response formatting (e.g., _id → id) and a consistent 2xx JSON envelope

Workflow Steps

Webhook Trigger - A public GET endpoint receives requests with the collection name as a parameter. The endpoint follows this pattern:
https://{{your-n8n-instance}}/webhook-test/{{uuid}}/:nameCollection
The :nameCollection parameter is passed directly in the URL and specifies the MongoDB collection to query. Example: https://yourdomain.com/webhook-test/abcd1234/orders would attempt to fetch all documents from the orders collection.

Validation - Before querying the database, the collection name is validated against a regular expression:
^(?!system\.)[a-zA-Z0-9._]{1,120}$
This validation blocks access to MongoDB's reserved system.* collections, prevents injection attacks by allowing only alphanumeric characters, underscores, and dots, and enforces MongoDB's length restriction (max 120 characters). It ensures the workflow cannot be exploited with malicious input.

Conditional Check - If the name is valid, the workflow proceeds to query MongoDB. If invalid, it immediately returns a structured HTTP 400 response, adhering to RESTful standards:
{ "code": 400, "message": "{{ $json.message }}" }

MongoDB Query - The workflow connects to MongoDB and retrieves all documents from the specified collection. To use the MongoDB node, configure credentials in n8n: go to n8n → Credentials → New, select MongoDB, and fill in the following fields:
Host - The MongoDB server hostname or IP (e.g., cluster0.mongodb.net).
Port - Default is 27017 for local deployments.
Database - The database name (e.g., myDatabase).
User - A MongoDB username with read permissions.
Password - The corresponding password.
Connection Type - Standard for most cases, or Connection String if using a full URI.
Replica Set / SRV Record - Enable if using MongoDB Atlas or a replica cluster.
Using a connection string is recommended for MongoDB Atlas. Example URI:
mongodb+srv://<username>:<password>@cluster0.mongodb.net/myDatabase?retryWrites=true&w=majority
Paste this into the Connection String field when selecting "Connection String" as the type, then test the credentials to confirm n8n can connect to your MongoDB instance.
In the workflow's MongoDB node, set Operation to Find, set Collection to the dynamic value passed from the workflow (e.g., {{$json["nameCollection"]}}), and leave Query empty to fetch all documents (or define filters if needed). The node retrieves all documents from the specified collection and passes the dataset as JSON to the next node.

Data Formatting - By default, MongoDB returns its unique identifier as _id. To align with common API conventions, this step renames _id → id, which simplifies downstream usage and makes responses more intuitive for client applications.

Response - The processed dataset is returned as the response to the original HTTP request. Clients receive a clean JSON payload with the expected format and renamed identifiers. Example response:
[ { "id": "64f13c1e2f1a5e34d9b3e7f0", "name": "John Doe", "email": "[email protected]" }, { "id": "64f13c1e2f1a5e34d9b3e7f1", "name": "Jane Smith", "email": "[email protected]" } ]

Workflow Summary
Webhook (GET) → Code (Validation) → IF (Validation Check) → MongoDB (Query) → Code (Transform IDs) → Respond to Webhook

Key Benefits
✅ Security-first design: prevents unauthorized access and injection attacks.
✅ Standards compliance: uses HTTP status codes (400) for invalid requests.
✅ Clean API response: transforms MongoDB's native _id into a more user-friendly id.
✅ Scalability: ready for integration with any frontend, third-party service, or analytics pipeline.
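The validation step can be reproduced with the regular expression quoted above; the regex comes from the workflow description, while the helper's name is our own illustration:

```javascript
// Collection-name rule from the workflow: no system.* collections,
// only [a-zA-Z0-9._], length 1-120.
const COLLECTION_NAME = /^(?!system\.)[a-zA-Z0-9._]{1,120}$/;

// Hypothetical helper as it might appear in the Code (Validation) node.
function isValidCollectionName(name) {
  return typeof name === 'string' && COLLECTION_NAME.test(name);
}
```

Anything that fails this check would be routed to the 400 response branch instead of the MongoDB node.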

Create and Manage Short URLs with Telegram Bot, MongoDB and Nginx Redirects

This workflow contains community nodes that are only compatible with the self-hosted version of n8n. It lets you create and manage custom short URLs directly via Telegram, with all data stored in MongoDB and redirects handled efficiently via Nginx.

How it works
Create via Telegram - Send a long URL to your bot; it will ask whether you want a custom short code.
Store in MongoDB - All long URLs and their corresponding short codes are stored in your MongoDB instance.
Fast Redirects - When a user accesses a short URL, Nginx forwards the request to a dedicated n8n webhook, which redirects them to the original long URL.

Set up steps
Difficulty: Medium (basic n8n/VPS knowledge required). Estimated time: 15-30 minutes.
n8n Instance & VPS - Ensure n8n is running on your VPS (e.g., 2 cores, 2 GB RAM).
Telegram Bot - Create a new bot via @BotFather and get your bot token. Add it as a Telegram credential in n8n.
MongoDB Database - Set up a MongoDB instance (on your VPS or a cloud service like MongoDB Atlas). Create a database and a collection (e.g., url or short_urls), and add your MongoDB credentials in n8n. Example document structure:
[ { "_id": "686a11946a48b580d72d0397", "longUrl": "https://longurl.com/abcdefghijklm/", "shortUrl": "short-code" } ]
Domain/Subdomain - Point a domain or subdomain (e.g., s.yourdomain.com) to your VPS IP address. This will be your short URL base.
Nginx/Caddy Configuration - Configure your web server to proxy requests from your short URL domain to the n8n webhook that handles redirects (a detailed Nginx config is provided as sticky notes in the redirect workflow).
Workflow Setup - Import both provided workflows (Telegram URL Shortener Creator and URL Redirect Handler) and activate them.
Crucial - Set an environment variable named SHORTENER_DOMAIN in your n8n instance (or .env file) with the value of your short URL domain (e.g., https://s.yourdomain.com). Refer to the sticky notes inside the workflows for detailed node configurations and expressions.
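The lookup the URL Redirect Handler performs can be sketched against the document structure shown above. This is an in-memory illustration only; in the real workflow the lookup would be a MongoDB Find node, and the function name is hypothetical:

```javascript
// Sample documents matching the MongoDB structure from the setup steps.
const docs = [
  {
    _id: '686a11946a48b580d72d0397',
    longUrl: 'https://longurl.com/abcdefghijklm/',
    shortUrl: 'short-code',
  },
];

// Resolve a short code to its long URL; null means "respond 404"
// instead of redirecting.
function resolveShortCode(code, documents) {
  const match = documents.find((d) => d.shortUrl === code);
  return match ? match.longUrl : null;
}
```

On a hit, the webhook would answer with an HTTP redirect to the resolved URL.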

Build your own Webhook and MongoDB integration

Create custom Webhook and MongoDB workflows by choosing triggers and actions. Nodes come with global operations and settings, as well as app-specific parameters that can be configured. You can also use the HTTP Request node to query data from any app or service with a REST API.

MongoDB supported actions

Create
Drop
List
Update
Aggregate - Aggregate documents
Delete - Delete documents
Find - Find documents
Find And Replace - Find and replace documents
Find And Update - Find and update documents
Insert - Insert documents
Update - Update documents
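As an illustration of the Aggregate action, here is the kind of pipeline you might paste into the MongoDB node's Query field. The stages are standard MongoDB aggregation stages, but the field names (status, category) are hypothetical:

```javascript
// Hypothetical aggregation pipeline: count active documents per category.
const pipeline = [
  { $match: { status: 'active' } },                     // filter documents
  { $group: { _id: '$category', total: { $sum: 1 } } }, // count per category
  { $sort: { total: -1 } },                             // largest groups first
];
```

The same structure works for any of the node's aggregate queries; only the stages and field names change.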

Webhook and MongoDB integration details

Webhook

Webhooks are automatic notifications that apps send when something occurs. They are delivered to a specific URL, which acts as the receiving app's address, and contain a message or payload. Webhooks are almost always faster than polling, and they require less effort on your side.
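For instance, the sending app typically issues an HTTP POST with a JSON payload to the Webhook node's URL. A minimal sketch of building such a request; the event name and fields are assumptions:

```javascript
// Build the options for delivering a webhook payload (e.g., via fetch).
// The payload shape is hypothetical; real apps define their own schemas.
function buildWebhookRequest(payload) {
  return {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(payload),
  };
}

// Usage (URL is a placeholder, not a real endpoint):
// fetch('https://your-n8n.example.com/webhook/abcd1234', buildWebhookRequest({ event: 'order.created', id: 42 }));
```

The n8n Webhook node receives the call, parses the body, and hands the payload to the rest of the workflow.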

Use case

Save engineering resources

Reduce time spent on customer integrations, engineer faster POCs, and keep your customer-specific functionality separate from your product, all without having to code.

Learn more

FAQs

  • Can Webhook connect with MongoDB?

  • Can I use Webhook’s API with n8n?

  • Can I use MongoDB’s API with n8n?

  • Is n8n secure for integrating Webhook and MongoDB?

  • How to get started with Webhook and MongoDB integration in n8n.io?

Need help setting up your Webhook and MongoDB integration?

Discover the latest recommendations from our community and join the discussions about the Webhook and MongoDB integration.
Benjamin Hatton
Albert Ashkhatoyan
Víctor González
João Textor
Salomão

Looking to integrate Webhook and MongoDB in your company?

Over 3000 companies switch to n8n every single week

Why use n8n to integrate Webhook with MongoDB

Build complex workflows, really fast

Handle branching, merging and iteration easily.
Pause your workflow to wait for external events.

Code when you need it, UI when you don't

Simple debugging

Your data is displayed alongside your settings, making edge cases easy to track down.

Use templates to get started fast

Use 1000+ workflow templates available from our core team and our community.

Reuse your work

Copy and paste, easily import and export workflows.

Implement complex processes faster with n8n
