S3 Storage
Handling file uploads, such as user avatars or document attachments, requires a robust storage strategy. Rather than storing files locally on your server (which poses scaling and persistence issues), Harpia provides a built-in S3Service powered by Bun’s native S3Client — no external AWS SDK required.
This service supports Amazon S3 and any S3-compatible alternative, such as DigitalOcean Spaces, Cloudflare R2, or MinIO.
Configuration
To connect to an S3 bucket, store the required credentials in your .env file:
S3_KEY=your_access_key
S3_SECRET=your_secret_key
S3_REGION=us-east-1
S3_BUCKET=my-harpia-bucket
# Optional: for S3-compatible providers (e.g. MinIO, R2)
# S3_ENDPOINT=https://your-custom-endpoint.com
A shared instance is created in app/services/s3/index.ts, the same file that defines the S3Service class, with the credentials injected directly:
// app/services/s3/index.ts — S3Service is defined in this file, so no import is needed
export const storage = new S3Service({
accessKeyId: process.env.S3_KEY!,
secretAccessKey: process.env.S3_SECRET!,
region: process.env.S3_REGION!,
bucket: process.env.S3_BUCKET!,
// endpoint: process.env.S3_ENDPOINT, // Uncomment for S3-compatible providers
});
[!TIP] For S3-compatible providers such as MinIO or Cloudflare R2, simply provide the endpoint option. Bun's native S3Client handles the rest automatically.
Uploading Files
Standard Upload
Use the send() method to upload any file content. It accepts a string, Buffer, or a Response object as the content body.
import { storage } from "app/services/s3";
// Upload a buffer received from a multipart form
await storage.send(`avatars/${userId}.png`, fileBuffer, {
type: "image/png", // MIME type of the file
acl: "public-read", // Makes the file publicly accessible
});
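In practice, the buffer usually comes from a multipart form in a route handler. The sketch below assumes a Fetch-style Request and introduces a hypothetical avatarKey helper (not part of S3Service) to derive the object key:

```typescript
// Hypothetical helper: map a user ID and MIME type to an object key.
function avatarKey(userId: string, mime: string): string {
  const ext = mime === "image/png" ? "png" : mime === "image/jpeg" ? "jpg" : "bin";
  return `avatars/${userId}.${ext}`;
}

// Inside a route handler (sketch, assuming a Fetch-style Request):
// const form = await request.formData();
// const file = form.get("avatar") as File;
// const buffer = Buffer.from(await file.arrayBuffer());
// await storage.send(avatarKey(userId, file.type), buffer, {
//   type: file.type,
//   acl: "public-read",
// });
```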
Large File Upload (Streaming)
For large files, use sendLargeFile(). It streams the upload to S3 in chunks via multipart upload, so a file never has to be held in memory in its entirety.
await storage.sendLargeFile(`exports/${fileName}.csv`, largeBuffer, {
type: "text/csv",
partSize: 10 * 1024 * 1024, // 10 MB per chunk (default: 5 MB)
queueSize: 5, // Concurrent chunk uploads (default: 10)
retry: 3, // Retry attempts per chunk (default: 3)
});
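To reason about the partSize trade-off, it helps to know how many parts a given file will produce. The helper below is an illustrative sketch of the arithmetic, not part of the S3Service API:

```typescript
// Number of parts a multipart upload produces for a given file size.
function partCount(fileSize: number, partSize: number): number {
  return Math.max(1, Math.ceil(fileSize / partSize));
}

// A 1 GiB file with 10 MiB parts yields 103 parts:
// partCount(1024 ** 3, 10 * 1024 * 1024) === 103
```

Larger parts mean fewer requests; smaller parts mean cheaper retries when a single chunk fails.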
Reading Files
The S3Service provides typed methods for reading file contents directly from S3.
// Read a file as a plain text string
const content = await storage.readAsText("config/settings.json");
// Read a file and parse it as JSON
const settings = await storage.readAsJson<AppSettings>("config/settings.json");
// Read a file as an ArrayBuffer (useful for binary processing)
const buffer = await storage.readAsArrayBuffer("reports/monthly.pdf");
Partial Read
If you only need a specific byte range of a file (e.g., reading the header of a large binary), use readPartial():
// Read bytes 0–1023 of the file (first 1 KB)
const header = await storage.readPartial("uploads/large-file.bin", 0, 1024);
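A common use of partial reads is sniffing a file's magic bytes without downloading it. Assuming readPartial resolves to an ArrayBuffer (mirroring readAsArrayBuffer), a PNG check might look like:

```typescript
// The 8-byte PNG signature that starts every valid PNG file.
const PNG_MAGIC = Uint8Array.from([0x89, 0x50, 0x4e, 0x47, 0x0d, 0x0a, 0x1a, 0x0a]);

function isPng(header: ArrayBuffer): boolean {
  const bytes = new Uint8Array(header);
  return bytes.length >= 8 && PNG_MAGIC.every((b, i) => bytes[i] === b);
}

// Fetch only the first 8 bytes instead of the whole object (sketch):
// const header = await storage.readPartial("uploads/image.bin", 0, 8);
// if (!isPng(header)) throw new Error("Not a PNG file");
```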
Deleting Files
// Permanently removes the file from S3
await storage.delete(`avatars/${userId}.png`);
Presigned URLs
Presigned URLs allow you to grant temporary, time-limited access to private files without making them publicly readable. This is useful for serving user-specific documents or download links.
// Generates a temporary GET URL valid for 1 hour (3600 seconds)
const downloadUrl = storage.generatePresignedUrl(`invoices/${invoiceId}.pdf`, {
expiresIn: 3600,
method: "GET",
});
// You can also generate presigned PUT URLs for client-side direct uploads
const uploadUrl = storage.generatePresignedUrl(`uploads/${fileName}`, {
expiresIn: 300, // 5 minutes
method: "PUT",
});
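On the client, a presigned PUT URL works with a plain fetch. One constraint worth knowing: SigV4 presigned URLs are valid for at most 7 days (604,800 seconds). The clampExpiry helper below is an illustrative sketch, not part of S3Service:

```typescript
// SigV4 caps presigned URL lifetime at 7 days (604,800 s); keep requests in range.
function clampExpiry(seconds: number): number {
  return Math.min(Math.max(Math.floor(seconds), 1), 604_800);
}

// Server side (sketch): issue a short-lived PUT URL.
// const uploadUrl = storage.generatePresignedUrl(`uploads/${fileName}`, {
//   expiresIn: clampExpiry(300),
//   method: "PUT",
// });

// Client side (browser, sketch): upload straight to S3, bypassing your server.
// await fetch(uploadUrl, {
//   method: "PUT",
//   headers: { "Content-Type": file.type },
//   body: file,
// });
```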
[!NOTE] Presigned URLs are generated synchronously and do not require a network request. The URL is signed using your secret credentials locally.
Raw File Reference
For advanced use cases, you can obtain a lazy S3File reference to an object in your bucket using file(). This gives you direct access to Bun’s native S3File API.
import { storage } from "app/services/s3";
const s3file = storage.file("uploads/report.pdf");
// Check file size without downloading
const size = await s3file.size;
// Stream the file directly into a Response
const response = new Response(s3file);
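One practical use of the raw reference is streaming a private file to the browser as a download. The contentDisposition helper below is a hypothetical sketch with naive filename sanitization, not part of S3Service:

```typescript
// Hypothetical helper: build a Content-Disposition header, replacing characters
// that could break the quoted filename with underscores.
function contentDisposition(filename: string): string {
  const safe = filename.replace(/[^\w.\- ]/g, "_");
  return `attachment; filename="${safe}"`;
}

// Route sketch: hand the S3File to Response so the body streams from S3.
// const s3file = storage.file("uploads/report.pdf");
// return new Response(s3file, {
//   headers: { "Content-Disposition": contentDisposition("report.pdf") },
// });
```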