Creating Node.js REST APIs for AWS S3 - Upload, List, and Delete files
Hello, today I am writing this to briefly document the high-level process of using AWS S3 to upload files and publicly access the uploaded content.
Here is the summary of what I will be covering in this article:
- Create an IAM user for Programmatic-access to S3 bucket
- Create an S3 bucket and make its content public-readable
- Share a very minimal and working Node.js repo
- Overview of created REST APIs to UPLOAD, LIST, and DELETE objects
- Importable Postman file to test the REST APIs
Note: While writing this article, I am also doing each step practically in my AWS account, so I don’t overlook anything. I will share my code below.
Prerequisites:
1. An AWS account
2. Node.js installed in your system
3. Postman to test the REST APIs
So let's get started:
1. AWS: Create an IAM User
Writing the steps below —
- Click on “Services” > search “IAM” > click to open the IAM page
- Under “IAM resources”, click on “Users” to open existing IAM user list
- Click on the “Add user” button
> Set a “User name”
> Set Access type to “Programmatic access”
- The next step is “Set permissions” > Click “Attach existing policies directly”
- Then search with the keyword “s3”. To make things simple I chose/checked the “AmazonS3FullAccess” policy name
- Click “Next: Tags”; I am not setting any tags here
- Finish the process, click “Next: Review” > click “Create user”
- For ease of use, REMEMBER to download the CSV (it contains a “Secret Key” and an “Access key ID” that we are gonna need soon) > Save file > Keep it safe ;)
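If you prefer the terminal, the same user setup can also be scripted with the AWS CLI (an optional sketch; the user name here is just an example):
# create the user (example name), attach the S3 policy, and generate keys
aws iam create-user --user-name s3-api-user
aws iam attach-user-policy --user-name s3-api-user \
    --policy-arn arn:aws:iam::aws:policy/AmazonS3FullAccess
# prints the “Access key ID” and “Secret access key”; store them safely
aws iam create-access-key --user-name s3-api-user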
2. AWS: Create an S3 bucket
Writing the steps below —
- Click on “Services” > search “S3” > click to open your S3 bucket list
- Click on “Create bucket”
- Enter a valid + unique name. I named it “mygallerybucket33”, then click “Next”
- I kept everything default in the “Configure options” section, click “Next”
- In this step “Set permissions”:
>> Uncheck “Block all public access” (to uncheck everything), then
>> Check “Block public access to buckets and objects granted through new public bucket or access point policies”
>> Check “Block public and cross-account access to buckets and objects through any public bucket or access point policies”
- NOTE: Information is given below the checkboxes there, please read them to understand what they do.
- Acknowledge the alert checkbox, and click “Next”
- Click > “Create bucket”
You will find your newly created bucket in the bucket list; now we have to enable public access to the bucket contents.
- Click to open your bucket; I clicked mine, “mygallerybucket33”
- Click on “Create folder” > give it a name, say “public_asset” > “Save”
- Click on “Permissions” > click “Access Control List” > navigate to the “Public access” section > click “Everyone”.
- A pop up “Everyone” will appear for setting further options:
Under “Access to the objects” > check “List objects” and “Write objects” > click “Save”
Notice: The bucket is now publicly accessible.
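As an aside, the same public read access can also be expressed as a bucket policy instead of ACL checkboxes (a sketch, assuming the bucket and folder names from above):
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadPublicAsset",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::mygallerybucket33/public_asset/*"
    }
  ]
}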
3. Node.js repo
Important packages are as follows:
"aws-sdk": To communicate with S3,
"body-parser": Helps to read request body,
"cors": Well its enables CORS
"cross-env": To set our local environment
"express": Our Node.js framework
"helmet": To protect ofcourse ;)
"multer": Handle file upload from client
"multer-s3": To move uploaded files to S3
"nodemon": For development and setting environment variables
"ts-node": For typescript support,
"typescript": Yes, I am using Typescript :)
"uuid": To create unique Ids for uploaded file
Here is the code:
https://github.com/SiddharthaChowdhury/AWS_S3_Node_REST
The repo contains a sample_nodemon.json file with replaceable <_xxxx_> content.
- Rename the file to nodemon.json
- Replace <_xxxx_> with the valid AWS credentials you got from the CSV in Step 1 above
- All 4 AWS_ variables are important.
// nodemon.json
{
"env":{
"PORT": 4000,
"AWS_ACCESS_KEY_ID": "blaablaablaa234bla",
"AWS_SECRET_KEY": "bla123bla123blablablaa/blaba+blabla",
"AWS_BUCKET_NAME": "gallerybucket33",
"AWS_REGION": "eu-central-1"
}
}
- Install dependencies
npm install
- Start the server
npm start
4. APIs Overview
The file below, uploadSetup.ts, is an important file that is referenced by the rest of the routes further down.
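Since the embedded file may not render here, below is a minimal sketch of what uploadSetup.ts looks like, reconstructed from the notes that follow (the line numbers in those notes refer to the original file in the repo; the exact callback types and mimetype check are my assumptions):
// uploadSetup.ts (sketch; see the repo for the exact file)
import aws from "aws-sdk";
import multer from "multer";
import multerS3 from "multer-s3";
import { v4 as uuidv4 } from "uuid";

// Configure the AWS SDK from the nodemon.json environment variables
aws.config.update({
    accessKeyId: process.env.AWS_ACCESS_KEY_ID,
    secretAccessKey: process.env.AWS_SECRET_KEY,
    region: process.env.AWS_REGION
});

// One configured S3 instance, exported for use by the other APIs
export const S3 = new aws.S3();

// Returns whether the given mimetype is an image; we upload only images
const isAllowedMimeType = (mime: string): boolean => mime.startsWith('image/');

// multer-specific filter: rejects non-image files in the upload request
const fileFilter = (req: any, file: any, cb: any) => {
    if (isAllowedMimeType(file.mimetype)) cb(null, true);
    else cb(new Error('Invalid file type, only images are allowed'));
};

// Takes the original filename (with extension) and generates a unique name via UUID
const getUniqFileName = (originalname: string): string => {
    const ext = originalname.split('.').pop();
    return `${uuidv4()}.${ext}`;
};

// Upload middleware: streams incoming files straight into the S3 folder
export const handleUploadMiddleware = multer({
    fileFilter,
    storage: multerS3({
        s3: S3,
        bucket: process.env.AWS_BUCKET_NAME!,
        acl: 'public-read',
        contentType: multerS3.AUTO_CONTENT_TYPE,
        key: (req, file, cb) => cb(null, `public_asset/${getUniqFileName(file.originalname)}`)
    })
});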
Important notes about the file above are as follows —
1. Lines [7-11]: we configure our AWS setup
2. Line [13]: we create an S3 instance and export it, so that we can use the configured S3 from the other APIs
3. Line [14]: the isAllowedMimeType function returns whether the given mimetype is an image or not. Here we are uploading only image files.
4. Lines [15-22]: fileFilter is a multer-specific filter function that helps filter non-image files out of the upload API request
5. Lines [23-27]: the getUniqFileName function takes the original filename with extension and generates a unique filename using UUID
API: Files UPLOAD —
// Route
import express from "express";
import {handleUploadMiddleware} from "./uploadSetup";
import api_uploadFiles from "./api_uploadFiles"; // implementation shown below (path per your repo layout)

// We will take no more than 6 files to upload
const router = express.Router();
router.post('/upload',
    handleUploadMiddleware.array('input_files', 6),
    api_uploadFiles
);

// Implementation
import {Request, Response} from "express";

const api_uploadFiles = (req: Request, res: Response) => {
    res.status(200);
    return res.json({
        msg: "Uploaded!",
        files: req.files
    });
}

export default api_uploadFiles;
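To try it, send a multipart/form-data request with up to 6 files under the input_files field (a sample call, assuming the router is mounted at the app root as in the bootstrap sketch above; the file names are made up):
curl -X POST http://localhost:4000/upload \
    -F "input_files=@./cat.jpg" \
    -F "input_files=@./dog.png"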
API: Files LISTING —
This API takes a folder name as input; in our case we have only one folder, public_asset, where we upload our files.
// Route
router.get('/list', api_ListFiles);

// Implementation
import {Request, Response} from "express";
import {S3} from "./uploadSetup";

const api_ListFiles = (req: Request, res: Response) => {
    const {folderName} = req.query;
    if (!folderName) {
        res.status(400);
        return res.json({error: 'Error! Folder name is missing.'});
    }

    const listParams = {
        Bucket: process.env.AWS_BUCKET_NAME!,
        Prefix: folderName.toString()
    };

    S3.listObjectsV2(listParams, function (err, data) {
        if (err) throw err;

        if (data.Contents && data.Contents.length > 0) {
            const fileObjArr: any[] = [];
            // fileObj: S3.Object
            data.Contents.forEach((fileObj: any) => {
                // Folder placeholders come back as zero-size objects; skip them
                if (fileObj.Size > 0) {
                    fileObjArr.push({
                        ...fileObj,
                        // Virtual-hosted-style URL; works across regions
                        location: `https://${process.env.AWS_BUCKET_NAME}.s3.${process.env.AWS_REGION}.amazonaws.com/${fileObj.Key}`
                    });
                }
            });
            data.Contents = fileObjArr;
        }

        res.status(200);
        return res.json({data});
    });
}

export default api_ListFiles;
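A sample call (again assuming the router is mounted at the app root):
curl "http://localhost:4000/list?folderName=public_asset"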
Note: The AWS S3 URL creation is derived from the hint given in http://www.wryway.com/blog/aws-s3-url-styles/
API: Files DELETE —
// Route
router.delete('/remove', api_deleteFiles);

// Implementation
import {Request, Response} from "express";
import {S3} from "./uploadSetup";

const api_deleteFiles = (req: Request, res: Response) => {
    const {fileKeys} = req.body;
    if (!fileKeys || !Array.isArray(fileKeys) || fileKeys.length === 0) {
        res.status(400);
        return res.json({error: 'Error! File keys not found.'});
    }

    const deleteParam = {
        Bucket: process.env.AWS_BUCKET_NAME!,
        Delete: {
            Objects: fileKeys.map((key: string) => ({Key: key}))
        }
    };

    S3.deleteObjects(deleteParam, function (err, data) {
        if (err) throw err;
        res.status(200);
        return res.json({msg: 'Deleted!'});
    });
}

export default api_deleteFiles;
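A sample call; the fileKeys values are the Key fields returned by the listing API (the UUID file name here is a placeholder):
curl -X DELETE http://localhost:4000/remove \
    -H "Content-Type: application/json" \
    -d '{"fileKeys": ["public_asset/<some-uuid>.jpg"]}'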
5. Concluding, with Postman file ofc :)
The implementation here might not be the best or most secure, but I did my best to keep it a simple, get-the-job-done kind of example.
I hope it helps someone looking to get started with AWS S3 and Node.js. I spent a good amount of time wandering around the internet to get this right.
There are tons of resources out there about how to do the same, but I found them mostly incomplete and some outdated. So here I tried to write a step-by-step guide, doing everything practically as I went, to help someone who is totally new.
Lastly —
- Here is the link to the importable Postman file
- And once again the repo link for people who skipped the boring lecture above ;)
Thanks —