
Postgres and S3 integration

Save yourself the work of writing custom integrations for Postgres and S3 and use n8n instead. Build adaptable and scalable Development and Data & Storage workflows that work with your technology stack. All within a building experience you will love.

How to connect Postgres and S3

  • Step 1: Create a new workflow
  • Step 2: Add and configure nodes
  • Step 3: Connect Postgres and S3
  • Step 4: Customize and extend your integration
  • Step 5: Test and activate your workflow

Step 1: Create a new workflow and add the first step

In n8n, click the "Add workflow" button in the Workflows tab to create a new workflow. Add the starting point – a trigger that determines when your workflow should run: an app event, a schedule, a webhook call, another workflow, an AI chat, or a manual trigger. Sometimes, the HTTP Request node can serve as your starting point.


Step 2: Add and configure Postgres and S3 nodes

You can find Postgres and S3 in the nodes panel. Drag them onto your workflow canvas and select their actions. Click each node, choose a credential, and authenticate to grant n8n access. Configure the Postgres and S3 nodes one by one: input data on the left, parameters in the middle, and output data on the right.


Step 3: Connect Postgres and S3

A connection establishes a link between Postgres and S3 (or vice versa) to route data through the workflow. Data flows from the output of one node to the input of another. You can have single or multiple connections for each node.
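To make the flow concrete, here is a minimal sketch of how data moves between connected nodes: n8n passes a list of items, each wrapping its payload under a json key. The row fields and key scheme below are illustrative assumptions, not the nodes' actual output:

```python
# Sketch of n8n's item format: each node outputs a list of items whose
# payload lives under the "json" key (binary data goes under "binary").
postgres_output = [
    {"json": {"id": 1, "email": "a@example.com"}},
    {"json": {"id": 2, "email": "b@example.com"}},
]

def to_s3_items(items):
    """Map each Postgres row to an S3 upload item (hypothetical key scheme)."""
    return [
        {"json": {"key": f"exports/user-{item['json']['id']}.json",
                  "body": item["json"]}}
        for item in items
    ]
```

Wired into the next node, each item in the output would become one S3 upload.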


Step 4: Customize and extend your Postgres and S3 integration

Use n8n's core nodes such as If, Split Out, Merge, and others to transform and manipulate data. Write custom JavaScript or Python in the Code node and run it as a step in your workflow. Connect Postgres and S3 with any of n8n’s 1000+ integrations, and incorporate advanced AI logic into your workflows.
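For instance, a Code node step might filter and reshape items before they reach the next node. The sketch below is standalone Python with illustrative field names; inside n8n you would read the incoming items through the node's input helpers:

```python
def transform(items):
    """Drop rows without an email and add a derived display_name field."""
    out = []
    for item in items:
        row = item["json"]
        if not row.get("email"):
            continue  # acts like an If node filtering out invalid rows
        out.append({"json": {**row, "display_name": row.get("name", "").title()}})
    return out

sample = [
    {"json": {"name": "ada lovelace", "email": "ada@example.com"}},
    {"json": {"name": "no address", "email": None}},
]
```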


Step 5: Test and activate your Postgres and S3 workflow

Save and run the workflow to see if everything works as expected. Based on your configuration, data should flow from Postgres to S3 or vice versa. Debugging is straightforward: check past executions to isolate and fix any issues. Once you've tested everything, save your workflow and activate it.


Sync Gmail emails to PostgreSQL with S3 attachment storage

Automated Gmail Email Processing System

Who's it for

Businesses and individuals who need to:

  • Archive email communications in a searchable database
  • Back up email attachments to cloud storage
  • Analyze email patterns and communication data
  • Comply with data retention policies
  • Integrate emails with other business systems

What it does
This workflow automatically captures, processes, and stores Gmail emails in a PostgreSQL database while uploading file attachments to S3/MinIO storage. It handles both individual emails (via Gmail Trigger) and bulk processing (via Schedule Trigger).

Key features:

  • Dual processing: real-time individual emails + scheduled bulk retrieval
  • Complete email metadata extraction (sender, recipients, labels, timestamps)
  • HTML to plain text conversion for searchable content
  • Binary attachment processing with metadata extraction
  • Organized S3/MinIO file storage structure
  • UPSERT database operations to prevent duplicates

How it works

  • Email capture: the Gmail Trigger detects new emails; the Schedule Trigger retrieves bulk emails from the last hour
  • Parallel processing: emails with attachments go through binary processing; others go directly to transformation
  • Attachment handling: extract metadata, upload to S3/MinIO, create database references
  • Data transformation: convert the Gmail API format to the PostgreSQL structure
  • Storage: UPSERT emails to the database with linked attachment information
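The transformation step above can be sketched as a single function. This is standalone Python; the Gmail payload shape and column names are assumptions, not the template's exact code:

```python
import re
from html import unescape

def gmail_to_row(msg):
    """Flatten a Gmail-API-style message into a dict shaped like a
    PostgreSQL `messages` row (field names are illustrative)."""
    headers = {h["name"].lower(): h["value"] for h in msg["payload"]["headers"]}
    # Crude HTML-to-text conversion for searchable content; a real
    # workflow might use a proper HTML parser instead.
    text = unescape(re.sub(r"<[^>]+>", " ", msg.get("html", "")))
    return {
        "gmail_id": msg["id"],
        "sender": headers.get("from"),
        "subject": headers.get("subject"),
        "body_text": " ".join(text.split()),
    }
```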

Requirements

Credentials needed:

  • Gmail OAuth2 (gmail.readonly scope)
  • PostgreSQL database connection
  • S3/MinIO storage credentials

Database setup:
Run the provided SQL schema to create the messages table with JSONB fields for flexible data storage.
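As a rough illustration of the duplicate-safe write (the template's actual schema lives in its setup sticky note), an UPSERT in Postgres is an INSERT with an ON CONFLICT clause. The table and column names here are assumptions:

```python
# Hypothetical UPSERT statement of the kind the "Insert or Update"
# Postgres node issues; gmail_id is assumed to be the unique key.
UPSERT_SQL = """
INSERT INTO messages (gmail_id, sender, subject, raw)
VALUES (%(gmail_id)s, %(sender)s, %(subject)s, %(raw)s::jsonb)
ON CONFLICT (gmail_id) DO UPDATE
SET sender  = EXCLUDED.sender,
    subject = EXCLUDED.subject,
    raw     = EXCLUDED.raw;
"""
```

Re-running the workflow over the same hour then updates existing rows instead of inserting duplicates.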

How to set up

  • Gmail OAuth2: enable the Gmail API in Google Cloud Console and create OAuth2 credentials
  • PostgreSQL: create the database and run the SQL schema provided in the setup sticky note
  • S3/MinIO: create a bucket named "gmail-attachments" with proper upload permissions
  • Configure: update authenticatedUserEmail in the transform scripts to your email
  • Test: start with a single email before enabling bulk processing

How to customize

  • Email filters: modify Gmail queries (in:sent, in:inbox) to target specific emails
  • Storage structure: change the S3 file path format in the Upload node
  • Processing schedule: adjust trigger frequencies based on email volume
  • Database fields: extend the PostgreSQL schema for additional metadata
  • Attachment types: add file type filtering in the binary processing logic
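The storage structure customization comes down to how the object key is built before upload. This key builder is a hypothetical sketch of a one-prefix-per-message layout inside the gmail-attachments bucket:

```python
def attachment_key(message_id, filename):
    """Build the S3 object key for an attachment: one prefix per message.
    The layout is an assumption; change it here to reorganize storage."""
    safe = filename.replace("/", "_")  # avoid accidental extra prefixes
    return f"{message_id}/{safe}"
```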

Note: This workflow processes emails from the last hour to avoid overwhelming the system. Adjust timeframes based on your email volume and processing needs.
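The last-hour window can be expressed as a Gmail search query: Gmail's after: filter accepts a Unix timestamp, so widening the timeframe is a one-line change. A sketch, with the in:inbox scope as an assumption:

```python
from datetime import datetime, timedelta, timezone

def recent_query(hours=1, now=None):
    """Build a Gmail search query for messages received in the last N hours."""
    now = now or datetime.now(timezone.utc)
    cutoff = int((now - timedelta(hours=hours)).timestamp())
    return f"in:inbox after:{cutoff}"
```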


Popular Postgres and S3 workflows

Sync Gmail emails to PostgreSQL with S3 attachment storage


Build your own Postgres and S3 integration

Create custom Postgres and S3 workflows by choosing triggers and actions. Nodes come with global operations and settings, as well as app-specific parameters that can be configured. You can also use the HTTP Request node to query data from any app or service with a REST API.

Postgres supported actions

  • Delete: Delete an entire table or rows in a table
  • Execute Query: Execute an SQL query
  • Insert: Insert rows in a table
  • Insert or Update: Insert or update rows in a table
  • Select: Select rows from a table
  • Update: Update rows in a table

S3 supported actions

Bucket

  • Create: Create a bucket
  • Delete: Delete a bucket
  • Get Many: Get many buckets
  • Search: Search within a bucket

File

  • Copy: Copy a file
  • Delete: Delete a file
  • Download: Download a file
  • Get Many: Get many files
  • Upload: Upload a file

Folder

  • Create: Create a folder
  • Delete: Delete a folder
  • Get Many: Get many folders

FAQs

  • Can Postgres connect with S3?

  • Can I use Postgres’s API with n8n?

  • Can I use S3’s API with n8n?

  • Is n8n secure for integrating Postgres and S3?

  • How do I get started with Postgres and S3 integration in n8n.io?

Need help setting up your Postgres and S3 integration?

Discover our community's latest recommendations and join the discussions about Postgres and S3 integration.

Looking to integrate Postgres and S3 in your company?

Over 3000 companies switch to n8n every single week

Why use n8n to integrate Postgres with S3

Build complex workflows, really fast

Handle branching, merging and iteration easily.
Pause your workflow to wait for external events.

Code when you need it, UI when you don't

Simple debugging

Your data is displayed alongside your settings, making edge cases easy to track down.

Use templates to get started fast

Use 1000+ workflow templates available from our core team and our community.

Reuse your work

Copy and paste, easily import and export workflows.

Implement complex processes faster with n8n
