Amazon S3 (Simple Storage Service) is a highly scalable, reliable, and low-latency data storage infrastructure that can store and retrieve any amount of data anytime. Integrating S3 with a Node.js application can empower you to build powerful features like file uploads, downloads, and more. In this article, I’ll guide you through building a simple Node.js application that allows you to upload, list, download, and delete files on AWS S3.
Prerequisites
Before we dive into the code, make sure you have the following prerequisites:
- Node.js and npm are installed on your machine.
- AWS account with an S3 bucket created.
- AWS IAM user with appropriate permissions to interact with S3.
- GitHub CLI (optional) if you want to publish the repository directly to GitHub.
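For reference, the IAM user only needs a few S3 permissions for this tutorial. Here is a minimal policy sketch (the bucket name is a placeholder; scope it to your own bucket):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:GetObject", "s3:DeleteObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::your-s3-bucket-name",
        "arn:aws:s3:::your-s3-bucket-name/*"
      ]
    }
  ]
}
```

Note that s3:ListBucket applies to the bucket ARN itself, while the object-level actions apply to the /* resource.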
Let’s start by setting up our project.
1. Setting Up the Project
Initialize a New Node.js Project
First, create a new directory for your project and navigate into it:
mkdir simple-s3-with-node
cd simple-s3-with-node
Now, initialize a new Node.js project:
npm init -y
This will generate a package.json file with default configurations.
Install Dependencies
We need to install several npm packages for our project:
npm install aws-sdk dotenv express multer multer-s3@2 nodemon
Note: Make sure the versions of multer, multer-s3, and aws-sdk are compatible. multer-s3 v3 targets the v3 AWS SDK (@aws-sdk/client-s3), while this article uses the v2 aws-sdk, which pairs with multer-s3 v2 (pinned above). Mismatched versions will cause errors when uploading. If you are not sure, pin the versions explicitly.
- aws-sdk: Official AWS SDK for Node.js, used to interact with AWS services like S3.
- dotenv: Loads environment variables from a .env file.
- express: Minimalist web framework for Node.js.
- multer: Middleware for handling multipart/form-data, primarily used for file uploads.
- multer-s3: Multer storage engine for AWS S3.
- nodemon: Utility that monitors for changes in your Node.js application and automatically restarts the server.
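Since nodemon is a development-time tool, it helps to wire it into the scripts section of package.json (a suggested addition; npm init does not generate it for you):

```json
{
  "scripts": {
    "start": "node index.js",
    "dev": "nodemon index.js"
  }
}
```

You can then start the server with npm run dev, and it will restart on every file change.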
2. Project Structure and Configuration
Creating the .env File
In the root of your project, create a .env file to store your AWS credentials and S3 bucket information. This file should look like this:
AWS_ACCESS_KEY=your-aws-access-key
AWS_ACCESS_SECRET=your-aws-secret-access-key
AWS_BUCKET_NAME=your-s3-bucket-name
REGION_NAME=your-aws-region
Make sure you replace your-aws-access-key, your-aws-secret-access-key, your-s3-bucket-name, and your-aws-region with your actual AWS credentials and bucket details.
Note: Never commit your .env file to version control. Add it to your .gitignore file to keep your credentials safe.
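A minimal .gitignore for this project looks like:

```
.env
node_modules/
```

node_modules/ is excluded as usual, since dependencies can always be reinstalled with npm install.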
Starting the Express Application
Create a file named index.js and add the following code:
require("dotenv").config();
const express = require("express");
const multer = require("multer");
const multerS3 = require("multer-s3");
const aws = require("aws-sdk");
const app = express();
- dotenv: Loads the environment variables.
- express: Sets up the Express app.
- multer and multer-s3: Handle file uploads to S3.
- aws-sdk: Configures the AWS SDK to interact with S3.
3. Configuring AWS SDK and Multer
Now let’s configure the AWS SDK and set up Multer to handle file uploads:
aws.config.update({
accessKeyId: process.env.AWS_ACCESS_KEY,
secretAccessKey: process.env.AWS_ACCESS_SECRET,
region: process.env.REGION_NAME,
});
const s3 = new aws.S3();
Here, we update the AWS configuration using the credentials and region specified in the .env file. Then, we create an instance of the S3 class to interact with S3.
Next, configure Multer to upload files directly to S3:
const BUCKET_NAME = process.env.AWS_BUCKET_NAME;
const upload = multer({
storage: multerS3({
s3: s3,
bucket: BUCKET_NAME,
// acl: "public-read",
contentType: multerS3.AUTO_CONTENT_TYPE, // Automatically set content type
key: (req, file, cb) => {
console.log("Uploading file:", file.originalname);
cb(null, `${Date.now().toString()}-${file.originalname}`); // Unique filename
},
}),
});
4. Implementing the Routes
Basic Route
Let’s start with a simple route that returns a welcome message:
app.get("/", (req, res) => {
res.json({ message: "Welcome to AWS S3 Upload" });
});
This route will help us verify that our server is running correctly.
Upload File Route
Next, we’ll create the route to handle file uploads:
app.post("/upload", upload.single("file"), (req, res) => {
  try {
    res.status(200).json({ message: "File uploaded successfully" });
  } catch (err) {
    console.error("Failed to upload file", err.stack);
    res.status(500).json({ message: "Failed to upload file" });
  }
});
This route listens for POST requests to the /upload endpoint. It uses Multer’s single() method to handle a single file upload. If the upload is successful, it returns a 200 status code with a success message.
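On success, multer-s3 attaches metadata such as key and location (the object's URL) to req.file, which you can echo back to the client. Here is a sketch using a mocked file object (buildUploadResponse is a hypothetical helper):

```javascript
// Build the JSON payload for a successful upload from the multer-s3
// file object, which carries key, bucket, and location fields.
function buildUploadResponse(file) {
  return {
    message: "File uploaded successfully",
    key: file.key,
    location: file.location,
  };
}

// Inside the route it would be used as:
// res.status(200).json(buildUploadResponse(req.file));

// Mocked multer-s3 file object for illustration:
const mockFile = {
  key: "1700000000000-photo.png",
  bucket: "your-s3-bucket-name",
  location: "https://your-s3-bucket-name.s3.amazonaws.com/1700000000000-photo.png",
};
console.log(buildUploadResponse(mockFile));
```

Returning the key is handy because the client needs it later for the download and delete routes.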
List Files Route
Next, let’s add a route to list all files in the S3 bucket:
app.get("/list", async (req, res, next) => {
try {
let r = await s3.listObjectsV2({ Bucket: BUCKET_NAME }).promise();
let files = r.Contents.map((item) => item.Key);
res.status(200).json(files);
} catch (err) {
console.error("Failed to list files", err.stack);
next(err);
}
});
This route fetches a list of objects (files) from the S3 bucket and returns their names as a JSON array.
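listObjectsV2 resolves to an object whose Contents array holds one entry per stored object, each with a Key (plus Size, LastModified, and more). The mapping step can be tested in isolation with a mocked response (extractKeys is a hypothetical helper):

```javascript
// Pull just the object names out of a listObjectsV2-style response.
function extractKeys(response) {
  return response.Contents.map((item) => item.Key);
}

// Mocked response shaped like the aws-sdk v2 result:
const mockResponse = {
  Contents: [
    { Key: "1700000000000-photo.png", Size: 1024 },
    { Key: "1700000000001-report.pdf", Size: 2048 },
  ],
};
console.log(extractKeys(mockResponse));
```

One caveat worth knowing: listObjectsV2 returns at most 1,000 objects per call, so large buckets need to be paginated with the ContinuationToken field.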
Download File Route
We’ll now create a route to download files:
app.get("/download/:filename", async (req, res, next) => {
  try {
    const filename = req.params.filename;
    const file = await s3
      .getObject({ Bucket: BUCKET_NAME, Key: filename })
      .promise();
    res.attachment(filename); // Sets Content-Disposition so browsers download the file
    res.send(file.Body);
  } catch (err) {
    console.error("Failed to download the file", err.stack);
    next(err);
  }
});
This route takes a file name as a URL parameter, retrieves the file from S3, and sends it in the response.
Delete File Route
Finally, let’s add a route to delete files from S3:
app.delete("/delete/:filename", async (req, res, next) => {
try {
const filename = req.params.filename;
await s3.deleteObject({ Bucket: BUCKET_NAME, Key: filename }).promise();
    console.log("Deleted file successfully");
    res.status(204).send(); // A 204 No Content response must not carry a body
} catch (err) {
console.error("Failed to delete the file", err.stack);
next(err);
}
});
This route also takes a file name as a URL parameter, deletes the corresponding file from S3, and returns a 204 No Content status on success.
5. Handling Errors Globally
To handle errors gracefully, we’ll add global error-handling middleware:
app.all("*", (req, res, next) => {
const err = new Error(`Requested URL ${req.path} not found!`);
err.statusCode = 404;
next(err);
});
app.use((err, req, res, next) => {
const statusCode = err.statusCode || err.response?.status || 500;
res.status(statusCode).json({
success: 0,
message: err?.message || "Internal Server Error",
});
});
These middleware functions handle 404 errors (when a route is not found) and any other errors that might occur during request processing.
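The status-code fallback in the handler above can be captured as a small pure function (resolveStatus is a hypothetical helper mirroring that chain):

```javascript
// Pick the most specific status code available, falling back to 500.
// err.statusCode covers errors we raise ourselves (like the 404 above);
// err.response?.status covers errors from upstream HTTP clients.
function resolveStatus(err) {
  return err.statusCode || err.response?.status || 500;
}

console.log(resolveStatus({ statusCode: 404 })); // → 404
console.log(resolveStatus({ response: { status: 403 } })); // → 403
console.log(resolveStatus(new Error("boom"))); // → 500
```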
6. Starting the Server
Finally, start the server by adding this line at the end of the index.js file:
app.listen(3494, () => {
console.log("Server is listening on port 3494");
});
This starts the Express server on port 3494. You can now access your application at http://localhost:3494.
7. Publishing to GitHub (Optional)
If you want to publish this repository to GitHub and set it as public, you can use the following command:
gh repo create simple-s3-with-node --public --source=. --remote=origin
This command creates a new public repository on GitHub using the current directory as the source and sets origin as the remote repository.
Conclusion
Congratulations! You’ve successfully built a Node.js application that can upload, list, download, and delete files on AWS S3. This application can serve as a foundation for more complex file management systems. Feel free to expand upon this by adding features such as user authentication, file versioning, or integrating with other AWS services.
