
Baserow and Slack integration

Save yourself the work of writing custom integrations for Baserow and Slack and use n8n instead. Build adaptable and scalable Data & Storage, Communication, and human-in-the-loop (HITL) workflows that work with your technology stack. All within a building experience you will love.

How to connect Baserow and Slack

  • Step 1: Create a new workflow
  • Step 2: Add and configure nodes
  • Step 3: Connect
  • Step 4: Customize and extend your integration
  • Step 5: Test and activate your workflow

Step 1: Create a new workflow and add the first step

In n8n, click the "Add workflow" button in the Workflows tab to create a new workflow. Add the starting point: a trigger that determines when your workflow should run, such as an app event, a schedule, a webhook call, another workflow, an AI chat, or a manual trigger. Sometimes, the HTTP Request node might already serve as your starting point.


Step 2: Add and configure Baserow and Slack nodes

You can find Baserow and Slack in the nodes panel. Drag them onto your workflow canvas, selecting their actions. Click each node, choose a credential, and authenticate to grant n8n access. Configure Baserow and Slack nodes one by one: input data on the left, parameters in the middle, and output data on the right.


Step 3: Connect Baserow and Slack

A connection establishes a link between Baserow and Slack (or vice versa) to route data through the workflow. Data flows from the output of one node to the input of another. You can have single or multiple connections for each node.


Step 4: Customize and extend your Baserow and Slack integration

Use n8n's core nodes such as If, Split Out, Merge, and others to transform and manipulate data. Write custom JavaScript or Python in the Code node and run it as a step in your workflow. Connect Baserow and Slack with any of n8n’s 1000+ integrations, and incorporate advanced AI logic into your workflows.
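As a concrete illustration of the Code node, here is a minimal JavaScript sketch of a transformation step. The item shape, field names, and the 20% markup are placeholders, and the `items` array is stubbed in because n8n normally supplies it at runtime.

```javascript
// Sketch of a Code node body ("Run Once for All Items" mode).
// In n8n, `items` is injected by the runtime; it is stubbed here so the
// snippet is self-contained. Field names are illustrative.
const items = [
  { json: { name: "Winter Jacket", price: 59.99 } },
  { json: { name: "Rain Boots", price: 34.5 } },
];

// Keep items above a price threshold and attach a derived field.
const result = items
  .filter((item) => item.json.price > 40)
  .map((item) => ({
    json: { ...item.json, priceWithTax: +(item.json.price * 1.2).toFixed(2) },
  }));

// Inside an actual Code node this would end with: return result;
```

The same pattern works for any per-item transformation: filter, map, and hand back an array of `{ json: ... }` objects.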


Step 5: Test and activate your Baserow and Slack workflow

Save and run the workflow to see if everything works as expected. Based on your configuration, data should flow from Baserow to Slack or vice versa. Debugging is straightforward: check past executions to isolate and fix any mistakes. Once you've tested everything, save your workflow and activate it.


E-commerce product price tracker with ScrapeGraphAI, Baserow and Slack alerts

Product Price Monitor with Mailchimp and Baserow

⚠️ COMMUNITY TEMPLATE DISCLAIMER: This is a community-contributed template that uses ScrapeGraphAI (a community node). Please ensure you have the ScrapeGraphAI community node installed in your n8n instance before using this template.

This workflow scrapes multiple e-commerce sites for product pricing data, stores the historical prices in Baserow, analyzes weekly trends, and emails a neatly formatted seasonal report to your Mailchimp audience. It is designed for retailers who need to stay on top of seasonal pricing patterns to make informed inventory and pricing decisions.

Pre-conditions/Requirements

Prerequisites:
  • Running n8n instance (self-hosted or n8n cloud)
  • ScrapeGraphAI community node installed
  • Mailchimp account with at least one audience list
  • Baserow workspace with edit rights
  • Product URLs or SKU list from target e-commerce platforms

Required Credentials

| Credential | Used By | Scope |
|------------|---------|-------|
| ScrapeGraphAI API Key | ScrapeGraphAI node | Web scraping |
| Mailchimp API Key & Server Prefix | Mailchimp node | Sending emails |
| Baserow API Token | Baserow node | Reading & writing records |

Baserow Table Setup
Create a table named price_tracker with the following fields:

| Field Name | Type | Example |
|------------|------|---------|
| product_name | Text | “Winter Jacket” |
| product_url | URL | https://example.com/winter-jacket |
| current_price | Number | 59.99 |
| scrape_date | DateTime | 2023-11-15T08:21:00Z |
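Before upserting scraped data into this table, a small validation step can catch malformed records. The sketch below checks a record against the four fields above; the helper name and rules are illustrative, not part of the template.

```javascript
// Hedged sketch: validate a scraped record against the price_tracker fields.
function isValidPriceRecord(rec) {
  return (
    typeof rec.product_name === "string" &&
    /^https?:\/\//.test(rec.product_url) &&     // product_url must be a URL
    typeof rec.current_price === "number" &&    // numeric price
    !Number.isNaN(Date.parse(rec.scrape_date)) // parseable timestamp
  );
}

const ok = isValidPriceRecord({
  product_name: "Winter Jacket",
  product_url: "https://example.com/winter-jacket",
  current_price: 59.99,
  scrape_date: "2023-11-15T08:21:00Z",
});
```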

How it works


Key Steps:
  • Schedule Trigger: Fires every week (or a custom CRON schedule) to start the monitoring cycle.
  • Code (Prepare URLs): Loads or constructs the list of product URLs to monitor.
  • SplitInBatches: Processes product URLs in manageable batches to avoid rate-limit issues.
  • ScrapeGraphAI: Scrapes each product page and extracts the current price and name.
  • If (Price Found?): Continues only if scraping returns a valid price.
  • Baserow: Upserts the scraped data into the price_tracker table.
  • Code (Trend Analysis): Aggregates weekly data to detect price increases, decreases, or stable trends.
  • Set (Mail Content): Formats the trend summary into an HTML email body.
  • Mailchimp: Sends the seasonal price-trend report to the selected audience segment.
  • Sticky Note: Documentation node explaining the business logic in-workflow.
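The "Prepare URLs" step above might look roughly like the following Code node body. The URLs are the template's sample placeholders; a real list would come from your own catalog or a Baserow table.

```javascript
// Sketch of the "Prepare URLs" Code node: emit one n8n item per product URL.
const productUrls = [
  "https://example.com/winter-jacket",
  "https://example.com/rain-boots",
];

const result = productUrls.map((url) => ({ json: { product_url: url } }));
// Inside an actual Code node: return result;
```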

Set up steps

Setup Time: 10-15 minutes

  1. Clone the template: Import the workflow JSON into your n8n instance.
  2. Install ScrapeGraphAI: Install n8n-nodes-scrapegraphai via the Community Nodes panel.
  3. Add credentials:
     a. ScrapeGraphAI API Key
     b. Mailchimp API Key & Server Prefix
     c. Baserow API Token
  4. Configure the Baserow node: Point it to your price_tracker table.
  5. Edit the product list: In the “Prepare URLs” Code node, replace the sample URLs with your own.
  6. Adjust the schedule: Modify the Schedule Trigger CRON expression if weekly isn’t suitable.
  7. Test run: Execute the workflow manually once to verify credentials and data flow.
  8. Activate: Turn on the workflow for automatic weekly monitoring.

Node Descriptions

Core Workflow Nodes:
  • Schedule Trigger – Initiates the workflow on a weekly CRON schedule.
  • Code (Prepare URLs) – Generates an array of product URLs/SKUs to scrape.
  • SplitInBatches – Splits the array into chunks of 5 URLs to stay within request limits.
  • ScrapeGraphAI – Scrapes each URL, using XPath/CSS selectors to pull price & title.
  • If (Price Found?) – Filters out failed or empty scrape results.
  • Baserow – Inserts or updates the price record in the database.
  • Code (Trend Analysis) – Calculates week-over-week price changes and flags anomalies.
  • Set (Mail Content) – Creates an HTML table with product, current price, and trend arrow.
  • Mailchimp – Sends or schedules the email campaign.
  • Sticky Note – Provides inline documentation and edit hints.

Data Flow:
Schedule Trigger → Code (Prepare URLs) → SplitInBatches
SplitInBatches → ScrapeGraphAI → If (Price Found?) → Baserow
Baserow → Code (Trend Analysis) → Set (Mail Content) → Mailchimp
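The trend-analysis step in this flow can be sketched as a plain comparison of each product's previous and current price. The row shape and field names here are assumptions for illustration.

```javascript
// Sketch of week-over-week trend detection in the "Trend Analysis" step.
function weeklyTrend(previousPrice, currentPrice) {
  if (currentPrice > previousPrice) return "increase";
  if (currentPrice < previousPrice) return "decrease";
  return "stable";
}

// `rows` stands in for records read back from the price_tracker table.
const rows = [
  { product_name: "Winter Jacket", previous_price: 64.99, current_price: 59.99 },
  { product_name: "Rain Boots", previous_price: 34.5, current_price: 34.5 },
];

const report = rows.map((r) => ({
  ...r,
  weekly_trend: weeklyTrend(r.previous_price, r.current_price),
}));
```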

Customization Examples

Change scraping frequency
// Schedule Trigger CRON for daily at 07:00 UTC
0 7 * * *

Add competitor comparison column
// Code (Trend Analysis), "Run Once for Each Item" mode
// Note: competitor_price must already be present on the item.
const item = $input.item.json;
item.competitor_price_diff = item.current_price - item.competitor_price;
return { json: item };

Data Output Format

The workflow outputs structured JSON data:

{
  "product_name": "Winter Jacket",
  "product_url": "https://example.com/winter-jacket",
  "current_price": 59.99,
  "scrape_date": "2023-11-15T08:21:00Z",
  "weekly_trend": "decrease"
}

Troubleshooting

Common Issues
  • Invalid ScrapeGraphAI key – Verify the API key and ensure your subscription is active.
  • Mailchimp “Invalid Audience” error – Double-check the audience ID and that the API key has correct permissions.
  • Baserow “Field mismatch” – Confirm your table fields match the names/types in the workflow.

Performance Tips
  • Limit each SplitInBatches run to ≤10 URLs to reduce scraping timeouts.
  • Enable caching in ScrapeGraphAI to avoid repeated requests to the same URL within short intervals.

Pro Tips:
  • Use environment variables for all API keys to avoid hard-coding secrets.
  • Add an extra If node to alert you if a product’s price drops below a target threshold.
  • Combine with n8n’s Slack node for real-time alerts in addition to Mailchimp summaries.
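The price-drop tip above boils down to a one-line condition; in n8n it would live in an If node expression, shown here as a function (the threshold value is an arbitrary example).

```javascript
// Sketch of the price-drop check an extra If node could perform.
const TARGET_PRICE = 50; // illustrative threshold
function shouldAlert(currentPrice) {
  return currentPrice < TARGET_PRICE;
}
```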

Nodes used in this workflow

Popular Baserow and Slack workflows

E-commerce Product Price Tracker with ScrapeGraphAI, Baserow and Slack Alerts


Track Play Store App Rankings with SerpApi, Baserow & Slack Alerts

Automatically track your Android app’s keyword rankings on Google Play. This workflow checks ranks via SerpApi, updates a Baserow table, and posts a heads-up in Slack so your team can review changes quickly.

💡 Perfect for
  • ASO teams tracking daily keyword positions
  • Growth & marketing standups that want quick rank visibility
  • Lightweight historical logging without a full BI stack

🧠 What it does
  • Runs on a schedule (e.g., weekly)
  • Queries SerpApi for each keyword’s Play Store ranking
  • Saves results to Baserow: Current Rank, Previous Rank, Last modified
  • Sends a Slack alert: “Ranks updated — review in Baserow”

⚡ Requirements
  • SerpApi account & API key
  • Baserow account + API token
  • Slack connection (bot/app or credential in n8n)

⚙️ Setup Instructions

1) Create a Baserow table
  • Create a new table (any name).
  • Add user-field names exactly: Keywords (text), Current Rank (number), Previous Rank (number), Last modified (date/time).
  • Optional fields you can add later: Notes, Locale, Store Country, App Package ID.

2) Connect credentials in n8n
  • Baserow: add your API token and select your Database and Table in the Baserow nodes.
  • Slack: connect your Slack account/workspace in the Slack node.
  • SerpApi: open the HTTP Request node and put your API key under Query Parameters → api_key = YOUR_KEY.

3) Verify field mapping
In the Baserow (Update Row) node, map:
  • Row ID → {{$json.id}}
  • Current Rank → {{$json["Current Rank"]}}
  • Previous Rank → your Code node should set this (the template copies the old “Current Rank” into “Previous Rank” before writing the new one)
  • Last modified → {{$now}} (or the timestamp you compute)

🛟 Notes & Tips
  • If you prefer a single daily Slack summary instead of multiple pings, add a Code node after the updates to aggregate lines and send one message.
  • Treat 0 or missing ranks as “not found” and flag them in Slack if helpful.
  • For multi-country tracking, include hl/gl (locale/country) in your SerpApi query params and store them as columns.

🤝 Need a hand?
I’m happy to help you get this running smoothly, or tailor it to your brand. Reach out to me via email: [email protected]
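The rank rollover described in step 3 (copying the old "Current Rank" into "Previous Rank" before writing the new one) can be sketched like this; the row shape mirrors the table fields above, and the sample keyword is a placeholder.

```javascript
// Sketch of the rank rollover the template's Code node performs.
function rolloverRank(row, newRank) {
  return {
    ...row,
    "Previous Rank": row["Current Rank"],
    "Current Rank": newRank,
  };
}

const updated = rolloverRank(
  { Keywords: "todo app", "Current Rank": 12, "Previous Rank": 15 },
  9
);
```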

Smart RSS Feed Monitoring with AI Filtering, Baserow Storage, and Slack Alerts

This workflow automates the process of monitoring multiple RSS feeds, intelligently identifying new articles, maintaining a record of processed content, and delivering timely notifications to a designated Slack channel. It leverages AI to ensure only truly new and relevant articles are dispatched, preventing duplicate alerts and information overload.

🚀 Main Use Cases
  • Automated News Aggregation: Continuously monitor industry news, competitor updates, or specific topics from various RSS feeds. 📈
  • Content Curation: Filter and deliver only new, unprocessed articles to a team or personal Slack channel. 🎯
  • Duplicate Prevention: Maintain a persistent record of seen articles to avoid redundant notifications. 🛡️
  • Enhanced Information Delivery: Provide a streamlined and intelligent way to stay updated without manual checking. 📧

How it works
The workflow operates in distinct, interconnected phases to ensure efficient and intelligent article delivery:

RSS Feed Data Acquisition 📥
  • Initiation: The workflow is manually triggered to begin the process. 🖱️
  • RSS Link Retrieval: It connects to a Baserow database to fetch a list of configured RSS feed URLs. 🔗
  • Individual Feed Processing: Each RSS feed URL is then processed independently. 🔄
  • Content Fetching & Parsing: An HTTP Request node downloads the raw XML content of each RSS feed, which is then parsed into a structured JSON format for easy manipulation. 📄➡️🌳

Historical Data Management 📚
  • Seen Articles Retrieval: Concurrently, the workflow queries another Baserow table to retrieve a comprehensive list of article GUIDs or links that have been previously processed and notified. This forms the basis for duplicate detection. 🔍

Intelligent Article Filtering with AI 🧠
  • Data Structuring for AI: A Code node prepares the newly fetched articles and the list of already-seen articles into a specific JSON structure required by the AI Agent. 🏗️
  • AI-Powered Filtering: An AI Agent, powered by an OpenAI Chat Model and supported by a Simple Memory component, receives this structured data. It is precisely prompted to compare the new articles against the historical "seen" list and return only those articles that are genuinely new and unprocessed. 🤖
  • Output Validation: A Structured Output Parser ensures that the AI Agent's response adheres to a predefined JSON schema, guaranteeing data integrity for subsequent steps. ✅
  • JSON Cleaning: A final Code node takes the AI's raw JSON string output, parses it, and formats it into individual n8n items, ready for notification and storage. 🧹

Notification & Record Keeping 🔔
  • Persistent Record: For each newly identified article, its link is saved to the Baserow "seen products" table, marking it as processed and preventing future duplicate notifications. 💾
  • Slack Notification: The details of the new article (title, content, link) are then formatted and sent as a rich message to a specified Slack channel, providing real-time updates. 💬

Summary Flow:
Manual Trigger → RSS Link Retrieval (Baserow) → HTTP Request → XML Parsing | Seen Articles Retrieval (Baserow) → Data Structuring (Code) → AI-Powered Filtering (AI Agent, OpenAI, Memory, Parser) → JSON Cleaning (Code) → Save Seen Articles (Baserow) → Slack Notification

🎉 Benefits:
  • Fully Automated: Eliminates manual checking of RSS feeds and Slack notifications. ⏱️
  • Intelligent Filtering: Leverages AI to accurately identify and deliver only new content, avoiding duplicates. 💡
  • Centralized Data Management: Utilizes Baserow for robust storage of RSS feed configurations and processed article history. 🗄️
  • Real-time Alerts: Delivers timely updates directly to your team or personal Slack channel. ⚡
  • Scalable & Customizable: Easily adaptable to monitor various RSS feeds and integrate with different Baserow tables and Slack channels.

⚙️ Setup Requirements:
  • Baserow API Key: Required for accessing and updating your Baserow databases. 🔑
  • OpenAI API Key: Necessary for the AI Agent to function. 🤖
  • Slack Credentials: Either a Slack OAuth token (recommended for full features) or a Webhook URL for sending messages. 🗣️
  • Baserow Table Configuration: A table with an rssLink column to store your RSS feed URLs, and a table with a Nom column to store the links of processed articles.

For any questions or further assistance, feel free to connect with me on LinkedIn: https://www.linkedin.com/in/daniel-shashko/
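Stripped of the AI layer, the duplicate check at the heart of this workflow reduces to a set membership test. The sketch below shows that equivalent logic with placeholder links; the template itself delegates this comparison to the AI Agent.

```javascript
// Sketch of the new-vs-seen article filter as a plain set comparison.
const seenLinks = new Set(["https://example.com/old-post"]);

const fetched = [
  { title: "Old", link: "https://example.com/old-post" },
  { title: "New", link: "https://example.com/new-post" },
];

// Keep only articles whose link has never been recorded in Baserow.
const fresh = fetched.filter((article) => !seenLinks.has(article.link));
```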

Build your own Baserow and Slack integration

Create custom Baserow and Slack workflows by choosing triggers and actions. Nodes come with global operations and settings, as well as app-specific parameters that can be configured. You can also use the HTTP Request node to query data from any app or service with a REST API.
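For example, an HTTP Request node pointed at Baserow's REST API needs little more than a URL and a token header. The sketch below builds such a request; the endpoint path and `Token` auth scheme follow Baserow's public API, but verify them against the Baserow API docs for your instance, and the table ID and token are placeholders.

```javascript
// Hedged sketch: the request an HTTP Request node would send to list rows
// via Baserow's REST API. Table ID and token are placeholders.
function buildBaserowListRowsRequest(tableId, token) {
  return {
    url: `https://api.baserow.io/api/database/rows/table/${tableId}/?user_field_names=true`,
    headers: { Authorization: `Token ${token}` },
  };
}

const req = buildBaserowListRowsRequest(123, "YOUR_API_TOKEN");
// In a standalone script you would then run:
//   const res = await fetch(req.url, { headers: req.headers });
```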

Baserow supported actions

Batch Create
Create up to 200 rows in one request
Batch Delete
Delete up to 200 rows in one request
Batch Update
Update up to 200 rows in one request
Create
Create a row
Delete
Delete a row
Get
Retrieve a row
Get Many
Retrieve many rows
Update
Update a row
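Since the batch operations above cap out at 200 rows per request, larger datasets need to be split first. A generic chunking helper (illustrative, not part of the node itself) looks like this:

```javascript
// Split rows into chunks of at most 200 for Baserow's batch operations.
function chunkRows(rows, size = 200) {
  const chunks = [];
  for (let i = 0; i < rows.length; i += size) {
    chunks.push(rows.slice(i, i + size));
  }
  return chunks;
}

// 450 placeholder rows → three batches: 200 + 200 + 50.
const rows = Array.from({ length: 450 }, (_, i) => ({ product_name: `Item ${i}` }));
const batches = chunkRows(rows);
```

Each chunk then becomes one Batch Create (or Batch Update / Batch Delete) call.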

Slack supported actions

Channel
Archive
Archives a conversation
Close
Closes a direct message or multi-person direct message
Create
Initiates a public or private channel-based conversation
Get
Get information about a channel
Get Many
Get many channels in a Slack team
History
Get a conversation's history of messages and events
Invite
Invite a user to a channel
Join
Joins an existing conversation
Kick
Removes a user from a channel
Leave
Leaves a conversation
Member
List members of a conversation
Open
Opens or resumes a direct message or multi-person direct message
Rename
Renames a conversation
Replies
Get a thread of messages posted to a channel
Set Purpose
Sets the purpose for a conversation
Set Topic
Sets the topic for a conversation
Unarchive
Unarchives a conversation

File
Get
Get a file
Get Many
Get and filter team files
Upload
Create or upload an existing file

Message
Delete
Delete a message
Get Permalink
Get a message permalink
Search
Search for messages
Send
Send a message
Send and Wait for Response
Send a message and wait for a response
Update
Update a message

Reaction
Add
Adds a reaction to a message
Get
Get the reactions of a message
Remove
Remove a reaction from a message

Star
Add
Add a star to an item
Delete
Delete a star from an item
Get Many
Get many stars of the authenticated user

User
Get
Get information about a user
Get Many
Get a list of many users
Get User's Profile
Get a user's profile
Get User's Status
Get the online status of a user
Update User's Profile
Update a user's profile

User Group
Add Users
Add users to a user group
Create
Create a user group
Disable
Disable a user group
Enable
Enable a user group
Get Many
Get many user groups
Get Users
Get the users of a user group
Update
Update a user group

FAQs

  • Can Baserow connect with Slack?

  • Can I use Baserow’s API with n8n?

  • Can I use Slack’s API with n8n?

  • Is n8n secure for integrating Baserow and Slack?

  • How do I get started with the Baserow and Slack integration in n8n?

Need help setting up your Baserow and Slack integration?

Discover our community's latest recommendations and join the discussions about the Baserow and Slack integration.

Looking to integrate Baserow and Slack in your company?

Over 3000 companies switch to n8n every single week

Why use n8n to integrate Baserow with Slack

Build complex workflows, really fast


Handle branching, merging and iteration easily.
Pause your workflow to wait for external events.

Code when you need it, UI when you don't

Simple debugging

Your data is displayed alongside your settings, making edge cases easy to track down.

Use templates to get started fast

Use 1000+ workflow templates available from our core team and our community.

Reuse your work

Copy and paste, easily import and export workflows.

Implement complex processes faster with n8n
