
Security News
Axios Maintainer Confirms Social Engineering Attack Behind npm Compromise
Axios compromise traced to social engineering, showing how attacks on maintainers can bypass controls and expose the broader software supply chain.
multi-cloud-upload
A Node.js package that provides an easy interface to upload and manage files on both Cloudinary and AWS S3. This package supports both JavaScript and TypeScript projects, making it versatile and easy to integrate into various environments.
To install the package, use npm or yarn:
npm install multi-cloud-upload
or
yarn add multi-cloud-upload
const { StorageFactory } = require("multi-cloud-upload");
const fs = require("fs");
const path = require("path");
require("dotenv").config();

// Load configuration from environment variables
const accessKeyId = process.env.AWS_ACCESS_KEY_ID || '';
const secretAccessKey = process.env.AWS_SECRET_ACCESS_KEY || '';
const region = process.env.AWS_REGION || '';
const bucket = process.env.AWS_BUCKET || '';

const cloudinaryConfig = {
  cloud_name: process.env.CLOUDINARY_CLOUD_NAME || '',
  api_key: process.env.CLOUDINARY_API_KEY || '',
  api_secret: process.env.CLOUDINARY_API_SECRET || '',
  // additional params can go in here
};

const s3Config = {
  accessKeyId: accessKeyId,
  secretAccessKey: secretAccessKey,
  region: region,
  bucket: bucket,
};

const cloudinaryProvider = StorageFactory.createProvider('cloudinary', cloudinaryConfig);
const s3Provider = StorageFactory.createProvider('s3', s3Config);

// Read the actual image file into a buffer
const filePath = path.join(__dirname, 'test.png');
const fileBuffer = fs.readFileSync(filePath);

// Now you can use either provider with the same interface
(async () => {
  let context = {};

  // Cloudinary: upload, download, list, delete, and get a URL
  const example1 = await cloudinaryProvider.upload(fileBuffer, 'example.png', context);
  const example2 = await cloudinaryProvider.download('example.png', context);
  const example3 = await cloudinaryProvider.list(context);
  const example4 = await cloudinaryProvider.delete('example.png', context);
  const example5 = await cloudinaryProvider.getUrl('gccyckcnbpcy5vx8gjyd', context);

  // S3: upload file
  const example6 = await s3Provider.upload(fileBuffer, 'examplex.png', { ContentType: 'image/png' });
  // download file
  const example7 = await s3Provider.download('examplex.png', context);
  // delete object
  const example8 = await s3Provider.delete('examplex.png', context);
  // list objects
  context = { MaxKeys: 5, ContinuationToken: 'xyz' };
  const example9 = await s3Provider.list(context);
  // get a pre-signed URL for a single object
  context = { expiresIn: 60 }; // seconds
  const example10 = await s3Provider.getUrl('examplex.png', context);
})();
// Usage example (TypeScript)
import { StorageFactory } from "multi-cloud-upload";
import * as dotenv from 'dotenv';
import * as fs from 'fs';
import * as path from 'path';

dotenv.config();

// Load configuration from environment variables
const accessKeyId = process.env.AWS_ACCESS_KEY_ID || '';
const secretAccessKey = process.env.AWS_SECRET_ACCESS_KEY || '';
const region = process.env.AWS_REGION || '';
const bucket = process.env.AWS_BUCKET || '';

const cloudinaryConfig = {
  cloud_name: process.env.CLOUDINARY_CLOUD_NAME || '',
  api_key: process.env.CLOUDINARY_API_KEY || '',
  api_secret: process.env.CLOUDINARY_API_SECRET || '',
};

const s3Config = {
  accessKeyId: accessKeyId,
  secretAccessKey: secretAccessKey,
  region: region,
  bucket: bucket,
};

const cloudinaryProvider = StorageFactory.createProvider('cloudinary', cloudinaryConfig);
const s3Provider = StorageFactory.createProvider('s3', s3Config);

// Read the actual image file into a buffer
const filePath = path.join(__dirname, 'test.png');
const fileBuffer = fs.readFileSync(filePath);

// Now you can use either provider with the same interface
(async () => {
  let context = {};

  // Cloudinary: upload, download, list, delete, and get a URL
  const example1 = await cloudinaryProvider.upload(fileBuffer, 'example.png', context);
  const example2 = await cloudinaryProvider.download('example.png', context);
  const example3 = await cloudinaryProvider.list(context);
  const example4 = await cloudinaryProvider.delete('example.png', context);
  const example5 = await cloudinaryProvider.getUrl('gccyckcnbpcy5vx8gjyd', context);

  // S3: upload file
  const example6 = await s3Provider.upload(fileBuffer, 'examplex.png', { ContentType: 'image/png' });
  // download file
  const example7 = await s3Provider.download('examplex.png', context);
  // delete object
  const example8 = await s3Provider.delete('examplex.png', context);
  // list objects
  context = { MaxKeys: 5, ContinuationToken: 'xyz' };
  const example9 = await s3Provider.list(context);
  // get a pre-signed URL for a single object
  context = { expiresIn: 60 }; // seconds
  const example10 = await s3Provider.getUrl('examplex.png', context);
})();
To use this package, you need to configure your credentials and options for Cloudinary or AWS S3 depending on your storage service of choice.
Set up your Cloudinary credentials:
const cloudinaryConfig = {
  cloud_name: process.env.CLOUDINARY_CLOUD_NAME || '',
  api_key: process.env.CLOUDINARY_API_KEY || '',
  api_secret: process.env.CLOUDINARY_API_SECRET || '',
  // more params can go in here
};
Set up your AWS credentials and region:
const s3Config = {
  accessKeyId: accessKeyId,
  secretAccessKey: secretAccessKey,
  region: region,
  bucket: bucket,
};
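The environment variables referenced above are typically loaded from a .env file via dotenv. A minimal sketch (every value below is a placeholder, not a real credential):

```shell
# AWS S3
AWS_ACCESS_KEY_ID=your-access-key-id
AWS_SECRET_ACCESS_KEY=your-secret-access-key
AWS_REGION=us-east-1
AWS_BUCKET=your-bucket-name

# Cloudinary
CLOUDINARY_CLOUD_NAME=your-cloud-name
CLOUDINARY_API_KEY=your-api-key
CLOUDINARY_API_SECRET=your-api-secret
```

Keep the .env file out of version control so credentials are never committed.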
The StorageProvider class provides a common interface for uploading, downloading, listing, deleting, and generating URLs for files.
upload(file: Buffer, fileName: string, context: object = {}): Promise<any>
download(fileName: string, context: object = {}): Promise<any>
delete(fileName: string, context: object = {}): Promise<any>
list(context: object = {}): Promise<string[]>
getUrl(fileName: string, context: object = {}): Promise<string>

The CloudinaryProvider class implements the StorageProvider interface for Cloudinary.
upload(file: Buffer, fileName: string, context: object = {}): Promise<any>
Uploads a file to Cloudinary.
download(fileName: string, context: object = {}): Promise<any>
Downloads a file from Cloudinary.
delete(fileName: string, context: object = {}): Promise<any>
Deletes a file from Cloudinary.
list(context: object = {}): Promise<string[]>
Lists files from Cloudinary.
getUrl(fileName: string, context: object = {}): Promise<string>
Generates a URL for accessing a file in Cloudinary.
The S3Provider class implements the StorageProvider interface for AWS S3.
upload(file: Buffer, fileName: string, context: object = {}): Promise<any>
Uploads a file to an S3 bucket.
download(fileName: string, context: object = {}): Promise<any>
Downloads a file from an S3 bucket.
delete(fileName: string, context: object = {}): Promise<any>
Deletes a file from an S3 bucket.
list(context: object = {}): Promise<string[]>
Lists files from an S3 bucket.
getUrl(fileName: string, context: object = {}): Promise<string>
Generates a pre-signed URL for accessing a file in an S3 bucket.
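The value of the common interface is that code written against it works with any backend. The sketch below is illustrative, not code from the package: the interface mirrors the documented signatures, and InMemoryProvider is a hypothetical extra backend showing how any storage target can satisfy the same contract.

```typescript
// The common contract, matching the documented method signatures.
interface StorageProvider {
  upload(file: Buffer, fileName: string, context?: object): Promise<any>;
  download(fileName: string, context?: object): Promise<any>;
  delete(fileName: string, context?: object): Promise<any>;
  list(context?: object): Promise<string[]>;
  getUrl(fileName: string, context?: object): Promise<string>;
}

// A hypothetical in-memory backend, useful for tests: any class with these
// five methods can be swapped in wherever a provider is expected.
class InMemoryProvider implements StorageProvider {
  private files = new Map<string, Buffer>();

  async upload(file: Buffer, fileName: string): Promise<any> {
    this.files.set(fileName, file);
    return { fileName };
  }
  async download(fileName: string): Promise<any> {
    return this.files.get(fileName);
  }
  async delete(fileName: string): Promise<any> {
    return this.files.delete(fileName);
  }
  async list(): Promise<string[]> {
    return [...this.files.keys()];
  }
  async getUrl(fileName: string): Promise<string> {
    return `memory://${fileName}`;
  }
}
```

Application code that takes a StorageProvider can then run unchanged against Cloudinary, S3, or this in-memory stand-in.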
Contributions are welcome! Please fork the repository and create a pull request. For major changes, please open an issue first to discuss what you would like to change.
This project is licensed under the MIT License. See the LICENSE file for more details.
FAQs
The npm package multi-cloud-upload receives a total of 1 weekly download. As such, multi-cloud-upload was classified as not popular.
We found that multi-cloud-upload demonstrated an unhealthy version release cadence and project activity because the last version was released a year ago. It has 1 open source maintainer collaborating on the project.