Upload Files to AWS S3 with a Node.js + Express Server
Learn how to set up a Node.js + Express server to upload files securely to Amazon S3. This guide covers configuring AWS SDK, using multer for file handling, and setting up environment variables. With scalable, durable storage in S3, your app can handle file uploads efficiently and securely.

Uploading files to Amazon S3 is a common need for web applications, whether it’s for storing images, documents, or other media files. Amazon S3 (Simple Storage Service) offers scalable, durable, and cost-effective file storage in the cloud, making it ideal for this purpose.
In this guide, we’ll walk through setting up a Node.js + Express server that allows users to upload files directly to an S3 bucket. We’ll use the AWS SDK to configure the connection to S3, and the popular middleware multer to handle the file uploads. Let’s dive in!
Prerequisites
To follow along, you’ll need:
- Node.js and npm installed on your system.
- An AWS account and an S3 bucket created for file storage.
- AWS IAM credentials with permission to upload files to S3.
Step 1: Set Up a Node.js Project and Install Dependencies
Create a new Node.js project and install the required packages: Express for setting up the server, @aws-sdk/client-s3 for interacting with S3, multer for handling file uploads, and dotenv for managing environment variables.
yarn init --yes
yarn add express @aws-sdk/client-s3 multer dotenv
Step 2: Configure Environment Variables
To securely manage AWS credentials, create a .env file in your project root to store sensitive information like your AWS Access Key, Secret Key, and Bucket Name.
AWS_REGION=your-aws-region
AWS_ACCESS_KEY_ID=your-access-key-id
AWS_SECRET_ACCESS_KEY=your-secret-access-key
AWS_BUCKET_NAME=your-s3-bucket-name
Add .env to your .gitignore file to prevent credentials from being exposed in version control:
# .gitignore
.env
Step 3: Set Up the S3 Client
We’ll use the AWS SDK to configure an S3 client, which will connect our server to Amazon S3. In a file called s3Client.js, initialize the S3 client:
import { S3Client } from "@aws-sdk/client-s3";
import dotenv from "dotenv";

dotenv.config();

export const s3 = new S3Client({
  region: process.env.AWS_REGION,
  credentials: {
    accessKeyId: process.env.AWS_ACCESS_KEY_ID,
    secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
  },
});
This file exports the S3 client, which we’ll use in our upload route.
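Before moving on, it can be useful to confirm that the credentials and bucket name are wired up correctly. This is an optional sanity check, not part of the upload flow itself; it assumes the s3Client.js and .env setup from above:

```javascript
// One-off connectivity check: HeadBucket succeeds only if the bucket
// exists and the configured credentials are allowed to access it.
import { HeadBucketCommand } from "@aws-sdk/client-s3";
import { s3 } from "./s3Client.js";

try {
  await s3.send(new HeadBucketCommand({ Bucket: process.env.AWS_BUCKET_NAME }));
  console.log("S3 connection OK");
} catch (error) {
  console.error("Cannot reach bucket:", error.name);
}
```

Run it once with node; if it logs an error, fix your region, credentials, or bucket name before continuing.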
Step 4: Set Up Multer for File Upload Handling
In order to upload files, we need multer, a Node.js middleware for handling multipart/form-data. We’ll configure it to store files in memory temporarily, allowing us to pass the file data directly to the S3 client.
Create a file called upload.js:
import multer from "multer";
// Configure multer to store files in memory
const storage = multer.memoryStorage();
export const upload = multer({ storage });
With multer.memoryStorage(), uploaded files are stored as buffer data in memory, which is perfect for direct upload to S3 without saving files on the server.
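Memory storage also makes it easy to add guardrails at the multer layer. As a sketch (the allowed types and the 5 MB cap here are assumptions, adjust them to your app), you can reject unexpected MIME types with a fileFilter function and cap the upload size with the limits option:

```javascript
// Hypothetical allow-list; adjust to the types your app should accept.
const ALLOWED_TYPES = ["image/jpeg", "image/png", "application/pdf"];

// A fileFilter with multer's (req, file, cb) signature: pass true to
// accept the file, or an Error to reject the request.
function attachmentFilter(req, file, cb) {
  if (ALLOWED_TYPES.includes(file.mimetype)) {
    cb(null, true);
  } else {
    cb(new Error(`Unsupported file type: ${file.mimetype}`));
  }
}

// How it would plug into the existing upload.js (5 MB cap shown):
// export const upload = multer({
//   storage,
//   limits: { fileSize: 5 * 1024 * 1024 },
//   fileFilter: attachmentFilter,
// });
```

Rejected uploads surface as the err argument in the controller below, so they’re handled by the same error branch.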
Step 5: Create the File Upload Controller
The core of our functionality is a controller function that uploads the file to S3 and returns the file’s S3 URL. In uploadController.js, create the uploadAttachment function to handle this process:
import { PutObjectCommand } from "@aws-sdk/client-s3";
import { s3 } from "./s3Client.js";
import { upload } from "./upload.js";

export const uploadAttachment = (req, res) => {
  upload.single("file")(req, res, async (err) => {
    if (err) {
      return res.status(500).json({ message: "File upload failed", error: err.message });
    }
    if (!req.file) {
      return res.status(400).json({ message: "No file uploaded" });
    }

    const fileBuffer = req.file.buffer;
    const fileName = `${Date.now()}-${req.file.originalname}`;

    try {
      const uploadParams = {
        Bucket: process.env.AWS_BUCKET_NAME,
        Key: fileName,
        Body: fileBuffer,
        ContentType: req.file.mimetype,
      };

      const command = new PutObjectCommand(uploadParams);
      await s3.send(command);

      const fileUrl = `https://${process.env.AWS_BUCKET_NAME}.s3.${process.env.AWS_REGION}.amazonaws.com/${fileName}`;

      res.status(200).json({
        message: "File uploaded successfully",
        fileUrl,
      });
    } catch (error) {
      console.error("Error uploading file:", error);
      res.status(500).json({ message: "Error uploading file to S3", error: error.message });
    }
  });
};
Here’s what each part does:
- Error Handling: If there’s a multer error or no file is uploaded, the function responds with an appropriate error.
- Generating a Unique Filename: The filename combines a timestamp with the original file name to prevent overwriting files.
- Setting Up S3 Parameters: We define the bucket, file key (name), file body (data), and MIME type.
- Uploading to S3: Using PutObjectCommand, we upload the file to S3, and then return the public URL for accessing the file.
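One subtlety worth knowing: the raw originalname can contain spaces or special characters, which end up percent-encoded (or worse) in the resulting URL. A small helper, hypothetical and not part of the code above, can build the same timestamp-prefixed key with the name sanitized first:

```javascript
// Builds an S3 object key the same way the controller does
// (`${Date.now()}-name`), but replaces characters outside a
// URL-safe set with "-".
function makeObjectKey(originalName, now = Date.now()) {
  const safeName = originalName.replace(/[^a-zA-Z0-9._-]/g, "-");
  return `${now}-${safeName}`;
}

// Example: makeObjectKey("my photo.jpg", 1700000000000)
// → "1700000000000-my-photo.jpg"
```

If you adopt something like this, swap it in for the fileName line in the controller.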
Step 6: Set Up the Express Route
In the main server file (e.g., index.js), configure an Express route that uses the uploadAttachment controller function to handle POST requests to upload files. Since the code uses ES module import syntax, make sure your package.json includes "type": "module".
import express from "express";
import { uploadAttachment } from "./uploadController.js";

const app = express();
const PORT = process.env.PORT || 3000;

// Route to handle file upload
app.post("/upload", uploadAttachment);

app.listen(PORT, () => {
  console.log(`Server running on port ${PORT}`);
});
This route listens for POST requests at /upload, triggers the uploadAttachment function, and returns the S3 URL on a successful upload.
Step 7: Testing the Upload Endpoint
With the server running, it’s time to test the file upload. You can use a tool like Postman or curl to send a POST request with a file to http://localhost:3000/upload.
Here’s an example using curl:
curl -X POST -F "file=@/path/to/your/file.jpg" http://localhost:3000/upload
If the upload is successful, you should see a JSON response with the S3 file URL:
{
  "message": "File uploaded successfully",
  "fileUrl": "https://your-s3-bucket-name.s3.your-aws-region.amazonaws.com/your-file-name.jpg"
}
Error Handling and Security Considerations
In a real-world application, consider handling the following scenarios:
- Large File Sizes: Uploading large files from the server to S3 can affect performance. For larger files, consider implementing direct uploads from the client using signed URLs.
- Access Control: Ensure only authenticated users can upload files by adding authentication middleware to your routes.
- File Type Validation: Validate file types before uploading to prevent unwanted files in your S3 bucket.
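On access control for downloads: the fileUrl returned by the controller is only directly reachable if your bucket policy allows public reads. For private buckets, a common pattern is to return a pre-signed URL instead, generated with the @aws-sdk/s3-request-presigner package (yarn add @aws-sdk/s3-request-presigner). A minimal sketch, reusing the s3 client from earlier:

```javascript
import { GetObjectCommand } from "@aws-sdk/client-s3";
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";
import { s3 } from "./s3Client.js";

// Returns a temporary, signed download URL for an object
// in a private bucket. The caller doesn't need AWS credentials;
// the signature embedded in the URL authorizes the request.
async function getDownloadUrl(key) {
  const command = new GetObjectCommand({
    Bucket: process.env.AWS_BUCKET_NAME,
    Key: key,
  });
  // The URL expires after one hour (3600 seconds).
  return getSignedUrl(s3, command, { expiresIn: 3600 });
}
```

The same approach with PutObjectCommand gives you client-side direct uploads, which addresses the large-file concern above.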
In this guide, we built a simple Node.js + Express server that lets users upload files to AWS S3. By configuring multer and the AWS SDK, we created a flexible solution that stores files securely and reliably. Amazon S3’s scalability and durability make it ideal for file storage, and using the AWS SDK in Node.js makes integration straightforward and secure.
With these steps, you’re well-equipped to implement file uploads in your own projects. From here, you can explore additional features like pre-signed URLs for secure access, Lambda triggers for processing files, and S3 lifecycle policies to manage file storage costs effectively.
That's it for this article! See ya 👋