Enterprise Asset Storage
Store and serve large digital assets with our S3-compatible object storage. Backed by a continent-wide CDN, Afribase Storage delivers images, videos, and documents with low latency across Africa.
Storage Infrastructure
Unified S3 Access
Afribase Storage is fully S3-compatible, so you can manage your files with the S3 tools and libraries you already use.
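Because the service speaks the S3 protocol, a standard S3 client can be pointed at your project by overriding its endpoint. Here is a sketch using the AWS SDK for JavaScript v3 — the endpoint path, credential placeholders, and `fileBuffer` are assumptions to replace with values from your project's storage settings:

```javascript
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";

// Endpoint and credentials are placeholders — take the real values
// from your project's storage settings (assumed endpoint path).
const s3 = new S3Client({
  endpoint: "https://your-project.afribase.dev/storage/v1/s3",
  region: "auto",
  credentials: {
    accessKeyId: "YOUR_ACCESS_KEY",
    secretAccessKey: "YOUR_SECRET_KEY",
  },
  forcePathStyle: true, // bucket name goes in the path, not the subdomain
});

// Upload a file; `fileBuffer` is a Buffer or stream you provide.
await s3.send(new PutObjectCommand({
  Bucket: "my-bucket",
  Key: "photo.png",
  Body: fileBuffer,
}));
```

Any other S3-compatible tool should work the same way once its endpoint is overridden.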
African CDN Nodes
Static assets are cached at the edge across our regional nodes, ensuring your users in Lagos or Cairo receive content with minimal latency.
Quick Start — Raw Fetch API
You don't need an SDK. Upload and retrieve files using plain fetch() calls from any JavaScript environment.
Upload a File
const STORAGE_URL = "https://your-project.afribase.dev/storage/v1/YOUR_PROJECT";
const BUCKET = "my-bucket";
const file = document.getElementById('fileInput').files[0];
const fileName = `${Date.now()}.${file.name.split('.').pop()}`;
const res = await fetch(`${STORAGE_URL}/object/${BUCKET}/${fileName}`, {
method: 'POST',
headers: {
'apikey': YOUR_ANON_KEY,
'Authorization': `Bearer ${session.access_token}`
},
body: file // Send the File/Blob directly — no FormData needed
});
if (!res.ok) throw new Error(await res.text());
console.log("Uploaded!", await res.json());
Get the Public URL
const publicUrl = `${STORAGE_URL}/object/public/${BUCKET}/${fileName}`;
// Use it in an <img> or <a> tag immediately
List Files in a Bucket
const res = await fetch(`${STORAGE_URL}/object/list/${BUCKET}`, {
method: 'POST',
headers: {
'apikey': YOUR_ANON_KEY,
'Authorization': `Bearer ${session.access_token}`,
'Content-Type': 'application/json'
},
body: JSON.stringify({ prefix: '', limit: 100 })
});
const files = await res.json();
Delete a File
await fetch(`${STORAGE_URL}/object/${BUCKET}/file-to-delete.png`, {
method: 'DELETE',
headers: {
'apikey': YOUR_ANON_KEY,
'Authorization': `Bearer ${session.access_token}`
}
});
SDK Reference
If you're using the Afribase client library, storage operations are even simpler.
Upload an Image
JavaScript
const { data, error } = await afribase.storage
.from('avatars')
.upload('profiles/avatar-1.png', file);
Python
data = client.storage.from_("avatars").upload(
"profiles/avatar-1.png", file_bytes
)
Dart
final data = await client.storage
.from('avatars')
.upload('profiles/avatar-1.png', file);
Swift
let data = try await afribase.storage
.from("avatars")
.upload(path: "profiles/avatar-1.png", file: fileData)
Kotlin
val data = afribase.storage
.from("avatars")
.upload("profiles/avatar-1.png", fileBytes)
Go
data, err := client.Storage.From("avatars").Upload(
"profiles/avatar-1.png", fileReader,
)
Generate Public Link
JavaScript
const { data } = afribase.storage.from('avatars').getPublicUrl('avatar-1.png');
Python
url = client.storage.from_("avatars").get_public_url("avatar-1.png")
Dart
final url = client.storage.from('avatars').getPublicUrl('avatar-1.png');
Swift
let url = afribase.storage.from("avatars").getPublicUrl(path: "avatar-1.png")
Kotlin
val url = afribase.storage.from("avatars").getPublicUrl("avatar-1.png")
Go
url := client.Storage.From("avatars").GetPublicUrl("avatar-1.png")
Signed URLs & Image Transforms
JavaScript
// Signed URL (expires in 60s)
const { data } = await afribase.storage
.from('docs').createSignedUrl('private.pdf', 60);
// Image Transform
const { data: img } = afribase.storage
.from('avatars').getPublicUrl('user.png', {
transform: { width: 200, height: 200 }
});
Python
# Signed URL
url = client.storage.from_("docs").create_signed_url("private.pdf", 60)
# Image Transform
url = client.storage.from_("avatars").get_public_url(
"user.png", transform={"width": 200}
)
Dart
// Signed URL
final url = await client.storage
.from('docs').createSignedUrl('private.pdf', 60);
// Image Transform
final imgUrl = client.storage.from('avatars').getPublicUrl(
'user.png', transform: TransformOptions(width: 200)
);
Kotlin
// Signed URL
val url = afribase.storage
.from("docs").createSignedUrl("private.pdf", 60)
// Image Transform
val imgUrl = afribase.storage.from("avatars")
.getPublicUrl("user.png") { transform(width = 200) }
Fine-grained Security
Control who can read and write files using **Storage Roles**. You can define specific permissions for different user levels or automated services.
Read-only access to specific buckets.
Upload and manage files within a bucket.
Full control over bucket configuration and data.
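As a sketch of how these three tiers differ, a client-side guard might map each role to the operations it permits. The role names, operation names, and permission matrix below are illustrative assumptions, not a documented API:

```javascript
// Illustrative permission matrix for the three storage role tiers.
// Role and operation names are assumptions for this sketch.
const ROLE_PERMISSIONS = {
  reader: ["read"],                                   // read-only access
  writer: ["read", "upload", "delete"],               // manage files in a bucket
  admin:  ["read", "upload", "delete", "configure"],  // full bucket control
};

// Returns true if `role` may perform `operation`; unknown roles get nothing.
function canPerform(role, operation) {
  return (ROLE_PERMISSIONS[role] ?? []).includes(operation);
}

console.log(canPerform("reader", "upload"));   // false
console.log(canPerform("admin", "configure")); // true
```

In practice the server enforces these rules; a matrix like this is only useful for hiding UI actions the current user cannot perform anyway.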
Access Control & Troubleshooting
Managing Storage Policies
Since Storage metadata is stored in a separate schema, you must use the **Schema Selector** in the Auth Policies dashboard to switch to "storage". Target the "objects" table to define your rules.
If you're getting a "new row violates row-level security policy" error, create an INSERT policy for the storage.objects table using the owner column:
-- Pattern for user-specific uploads
(bucket_id = 'your_bucket') AND (auth.uid() = owner)
Type Mismatch: uuid = integer
If you see the error "operator does not exist: uuid = integer", you are comparing auth.uid() (a UUID) to an integer column. Cast the column to UUID in your expression:
auth.uid() = user_id::uuid
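Putting the pieces together, a complete INSERT policy on storage.objects might look like the following. The policy name, bucket name, and the use of the built-in owner column are assumptions to adapt to your schema:

```sql
-- Allow authenticated users to upload only objects they own
-- into the 'avatars' bucket (names here are examples).
CREATE POLICY "Users can upload their own files"
ON storage.objects
FOR INSERT
WITH CHECK (
  bucket_id = 'avatars'
  AND auth.uid() = owner
);
```

Matching SELECT, UPDATE, and DELETE policies follow the same shape, with the expression in a USING clause instead of WITH CHECK.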