# NPM Storage Package
A **powerful TypeScript-based NPM package** for seamless file uploads to **AWS S3**, **Google Cloud Storage (GCS)**, and **Azure Blob Storage** with **Express.js/Multer integration**. One unified API for all your cloud storage needs!
[npm](https://www.npmjs.com/package/npm-storage-package) | [TypeScript](https://www.typescriptlang.org/) | [MIT License](https://opensource.org/licenses/MIT)
## Features
- ✅ **Unified API** – One consistent interface for S3, GCS, and Azure
- ✅ **Express.js + Multer Integration** – Drop-in replacement for standard multer
- ✅ **Multiple Upload Patterns** – Single file, multiple files, multiple fields
- ✅ **TypeScript Support** – Full type safety and IntelliSense
- ✅ **Flexible Configuration** – Field-specific settings and validation
- ✅ **Production Ready** – Error handling, file filtering, size limits
## Installation
```bash
npm install npm-storage-package multer express
```
```bash
yarn add npm-storage-package multer express
```
## Quick Start
### 1. Basic Setup
```javascript
import express from 'express';
import { StorageFactory } from 'npm-storage-package';
const app = express();
// Configure once, use everywhere
const storage = new StorageFactory({
  azure: {
    bucket: 'my-container',
    credentials: { connectionString: process.env.AZURE_CONNECTION_STRING }
  },
  s3: {
    bucket: 'my-bucket',
    region: 'us-east-1',
    credentials: {
      accessKeyId: process.env.AWS_ACCESS_KEY_ID,
      secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY
    }
  },
  gcs: {
    bucket: 'my-gcs-bucket',
    projectId: 'my-project-id',
    keyFilename: './serviceaccountkey.json'
  }
});
```
### 2. Single File Upload
```javascript
app.post('/upload/single',
  storage.azureMulter({
    destination: 'uploads/',
    generateFileName: (req, file) => `${Date.now()}-${file.originalname}`
  }).single('file'),
  (req, res) => {
    res.json({
      success: true,
      file: {
        originalName: req.file.originalname,
        cloudUrl: req.file.path,
        cloudKey: req.file.cloudKey,
        size: req.file.size
      }
    });
  }
);
```
### 3. Multiple Files Upload
```javascript
app.post('/upload/multiple',
  storage.s3Multer({
    destination: 'gallery/',
    fileFilter: (req, file) => file.mimetype.startsWith('image/'),
    limits: { fileSize: 5 * 1024 * 1024 } // 5MB
  }).array('photos', 10),
  (req, res) => {
    const uploadedFiles = req.files.map(file => ({
      originalName: file.originalname,
      cloudUrl: file.path,
      size: file.size
    }));
    res.json({
      success: true,
      totalFiles: req.files.length,
      files: uploadedFiles
    });
  }
);
```
### 4. Multiple Fields Upload
```javascript
app.post('/upload/profile',
  storage.gcsMulter({
    fieldConfigs: {
      avatar: {
        destination: 'avatars/',
        allowedExtensions: ['.jpg', '.jpeg', '.png'],
        fileSizeLimit: 2 * 1024 * 1024, // 2MB
        generateFileName: (req, file) => `avatar-${req.user.id}-${Date.now()}.jpg`
      },
      document: {
        destination: 'documents/',
        allowedExtensions: ['.pdf', '.doc', '.docx'],
        fileSizeLimit: 10 * 1024 * 1024 // 10MB
      }
    }
  }).fields([
    { name: 'avatar', maxCount: 1 },
    { name: 'document', maxCount: 3 }
  ]),
  (req, res) => {
    res.json({
      success: true,
      files: {
        avatar: req.files.avatar?.[0]?.path,
        documents: req.files.document?.map(f => f.path)
      }
    });
  }
);
```
## API Reference
### StorageFactory Class
The main class for creating configured storage instances.
```javascript
const storage = new StorageFactory({
  azure?: AzureConfig,
  s3?: S3Config,
  gcs?: GCSConfig
});
```
### Provider Methods
Each provider has a unified method that returns a configured multer instance:
#### `storage.azureMulter(options)`
#### `storage.s3Multer(options)`
#### `storage.gcsMulter(options)`
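All three return a standard multer-style instance, so `.single()`, `.array()`, and `.fields()` work the same way regardless of provider. A minimal sketch (assuming the configured `storage` instance and Express `app` from the Quick Start):
```javascript
// Same call pattern for every provider; only the factory method changes.
const upload = storage.s3Multer({ destination: 'uploads/' });

app.post('/upload', upload.single('file'), (req, res) => {
  // After a successful upload, req.file.path holds the cloud URL
  res.json({ success: true, url: req.file.path });
});
```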
### Upload Options
| Option | Type | Description |
|--------|------|-------------|
| `destination` | `string` | Upload destination path |
| `generateFileName` | `Function` | Custom filename generator |
| `fileFilter` | `Function` | File validation function |
| `limits` | `Object` | File size/count limits |
| `fieldConfigs` | `Object` | Field-specific configurations |
### Field Configuration
For multiple fields upload, each field can have:
| Option | Type | Description |
|--------|------|-------------|
| `destination` | `string` | Field-specific upload path |
| `allowedExtensions` | `string[]` | Allowed file extensions |
| `fileSizeLimit` | `number` | Maximum file size in bytes |
| `generateFileName` | `Function` | Custom filename for this field |
## Configuration
### Azure Blob Storage
```javascript
azure: {
  bucket: 'container-name',
  credentials: {
    connectionString: 'DefaultEndpointsProtocol=https;AccountName=...'
  }
}
```
### AWS S3
```javascript
s3: {
  bucket: 'bucket-name',
  region: 'us-east-1',
  credentials: {
    accessKeyId: 'AKIA...',
    secretAccessKey: 'your-secret-key'
  }
}
```
### Google Cloud Storage
```javascript
// Option 1: Service Account Key File
gcs: {
  bucket: 'bucket-name',
  projectId: 'your-project-id', // or auto-extracted from key file
  keyFilename: './path/to/serviceaccountkey.json'
}

// Option 2: JSON Credentials
gcs: {
  bucket: 'bucket-name',
  credentials: {
    // Service account JSON object
    type: 'service_account',
    project_id: 'your-project-id',
    private_key: '-----BEGIN PRIVATE KEY-----\n...',
    client_email: 'service-account@project.iam.gserviceaccount.com'
    // ... other fields
  }
}
```
## Advanced Examples
### Custom File Validation
```javascript
const imageUpload = storage.s3Multer({
  destination: 'images/',
  fileFilter: (req, file) => {
    // Only allow images
    if (!file.mimetype.startsWith('image/')) {
      throw new Error('Only image files are allowed');
    }
    // Custom validation logic
    if (file.originalname.length > 100) {
      throw new Error('Filename too long');
    }
    return true;
  },
  generateFileName: (req, file) => {
    const userId = req.user?.id || 'anonymous';
    const timestamp = Date.now();
    const extension = file.originalname.split('.').pop();
    return `${userId}-${timestamp}.${extension}`;
  }
});

app.post('/upload/image', imageUpload.single('image'), handler);
```
### Dynamic Configuration
```javascript
function createDynamicUpload(provider) {
  const configs = {
    azure: storage.azureMulter({ destination: 'azure-uploads/' }),
    s3: storage.s3Multer({ destination: 's3-uploads/' }),
    gcs: storage.gcsMulter({ destination: 'gcs-uploads/' })
  };
  return configs[provider];
}

app.post('/upload/:provider', (req, res, next) => {
  const upload = createDynamicUpload(req.params.provider);
  upload.single('file')(req, res, next);
}, handler);
```
### Error Handling
```javascript
app.use((error, req, res, next) => {
  if (error.code === 'LIMIT_FILE_SIZE') {
    return res.status(400).json({
      success: false,
      error: 'File too large'
    });
  }
  if (error.code === 'LIMIT_FILE_COUNT') {
    return res.status(400).json({
      success: false,
      error: 'Too many files'
    });
  }
  // Handle cloud storage errors
  if (error.message.includes('bucket')) {
    return res.status(500).json({
      success: false,
      error: 'Storage configuration error'
    });
  }
  res.status(500).json({
    success: false,
    error: error.message
  });
});
```
## File Object Properties
After successful upload, files are available in `req.file` (single) or `req.files` (multiple):
```javascript
{
  fieldname: 'avatar',
  originalname: 'profile.jpg',
  encoding: '7bit',
  mimetype: 'image/jpeg',
  size: 15234,
  destination: 'avatars/',
  filename: 'avatar-123456789.jpg',
  path: 'https://storage.googleapis.com/bucket/avatars/avatar-123456789.jpg',
  cloudKey: 'avatars/avatar-123456789.jpg'
}
```
## Testing Your Setup
Check out the complete test project in the `/test-project` directory:
```bash
cd test-project
npm install
cp .env.example .env
# Edit .env with your credentials
npm start
# Visit http://localhost:3000
```
The test project includes:
- ✅ Single file upload example
- ✅ Multiple files upload example
- ✅ Multiple fields upload example
- ✅ HTML form for easy testing
- ✅ Error handling demonstrations
## Security Best Practices
- ✅ **Environment Variables** – Never hardcode credentials
- ✅ **File Validation** – Always validate file types and sizes
- ✅ **Rate Limiting** – Implement upload rate limits (see the sketch after this list)
- ✅ **Authentication** – Protect upload endpoints
- ✅ **CORS Configuration** – Set proper CORS for web uploads
- ✅ **Virus Scanning** – Consider adding virus scanning
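As an illustration of the rate-limiting and authentication points above, here is a hedged sketch that puts both in front of an upload route. It assumes the separate `express-rate-limit` package and a placeholder `requireAuth` middleware; neither ships with this package.
```javascript
import rateLimit from 'express-rate-limit'; // assumption: installed separately

// Placeholder auth check -- replace with your real authentication middleware
const requireAuth = (req, res, next) => {
  if (!req.headers.authorization) {
    return res.status(401).json({ success: false, error: 'Unauthorized' });
  }
  next();
};

// Allow at most 20 uploads per client per 15 minutes
const uploadLimiter = rateLimit({ windowMs: 15 * 60 * 1000, max: 20 });

app.post('/upload/secure',
  requireAuth,
  uploadLimiter,
  storage.s3Multer({ destination: 'uploads/' }).single('file'),
  (req, res) => res.json({ success: true, url: req.file.path })
);
```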
### Example Environment Setup
```bash
# .env file
AZURE_CONNECTION_STRING=DefaultEndpointsProtocol=https;AccountName=...
AZURE_CONTAINER_NAME=uploads
AWS_ACCESS_KEY_ID=AKIA...
AWS_SECRET_ACCESS_KEY=...
AWS_REGION=us-east-1
AWS_S3_BUCKET=my-bucket
GCS_PROJECT_ID=my-project
GCS_BUCKET_NAME=my-bucket
GCS_KEY_FILE=./serviceaccountkey.json
```
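A sketch of wiring those variables into the factory at startup (assumes `dotenv` is used to load the `.env` file; the variable names match the example above):
```javascript
import 'dotenv/config'; // assumption: dotenv loads the .env file
import { StorageFactory } from 'npm-storage-package';

const storage = new StorageFactory({
  azure: {
    bucket: process.env.AZURE_CONTAINER_NAME,
    credentials: { connectionString: process.env.AZURE_CONNECTION_STRING }
  },
  s3: {
    bucket: process.env.AWS_S3_BUCKET,
    region: process.env.AWS_REGION,
    credentials: {
      accessKeyId: process.env.AWS_ACCESS_KEY_ID,
      secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY
    }
  },
  gcs: {
    bucket: process.env.GCS_BUCKET_NAME,
    projectId: process.env.GCS_PROJECT_ID,
    keyFilename: process.env.GCS_KEY_FILE
  }
});
```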
## More Examples
For detailed examples and advanced usage patterns, see:
- [EXAMPLES.md](./EXAMPLES.md) - Complete usage examples
- [test-project/](./test-project/) - Working test implementation
- [TypeScript Definitions](./src/index.ts) - Full type definitions
## Contributing
We welcome contributions! Please feel free to submit a Pull Request.
1. Fork the repository
2. Create your feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request
## License
This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.
## Support
- **Issues**: [GitHub Issues](https://github.com/FlickShot1432/npm-storage-package/issues)
- **Documentation**: [Examples & Guides](./EXAMPLES.md)
- **Discussions**: [GitHub Discussions](https://github.com/FlickShot1432/npm-storage-package/discussions)
## Show Your Support
If this package helped you, please consider:
- Starring the repository
- Reporting issues
- Suggesting improvements
- Contributing code

**Made with ❤️ for the developer community**