Arpia Bucket Object Store S3 API Documentation
Table of Contents
- Key Features
- Authentication
- Endpoint Overview
- Usage Summary
- General Information
- Detailed Endpoints
- GET: List Files or Get a Specific File from Bucket
- DELETE: Delete a File from the Bucket
- PUT: Upload a File to the Bucket
- POST: Work Area Project Submission
- Using AutoAPI with Buckets
Key Features
- File Management: List, retrieve, and delete files from a designated bucket.
- File Upload: Easily upload files directly to the bucket using base64 encoding or binary transfer, with instant access to the file link.
- Project Submission: Submit projects to a workspace, complete with debug mode for enhanced feedback.
- Flexible Upload Methods: Support for both base64-encoded and binary file uploads.
Authentication
All requests require an authentication token. Include a valid token in the request query parameters to securely access the API.
Tokens can be created and managed using AutoAPI for seamless integration and workflow automation.
Note: You can find your token and bucket token in the AutoAPI of type object-storage.
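For illustration, a minimal Python sketch of how the token travels with a request (placeholder values for the token and bucket):

```python
import requests

# Every request carries the authentication token as a query parameter
response = requests.get(
    'https://cloud.arpia.ai/api/bucket/',
    params={'token': 'YOUR_TOKEN', 'bucket': 'YOUR_BUCKET_TOKEN'},
)
print(response.status_code)
```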
Endpoint Overview
The Object Store S3 API supports the following HTTP methods:
| Method | Endpoint | Purpose |
|---|---|---|
| GET | /api/bucket/ | Retrieve files from the bucket |
| DELETE | /api/bucket/ | Remove files from the bucket |
| PUT | /api/bucket/ | Upload files to the bucket |
| POST | /api/ | Submit project data to the workspace |
Usage Summary
Each endpoint allows for the following actions:
| Method | Description |
|---|---|
| GET | Retrieve a list of files or a specific file from the bucket. |
| DELETE | Remove a specific file from the bucket. |
| PUT | Upload files to the bucket, with support for base64 and binary transfer methods. |
| POST | Submit project data to the workspace, enabling integration with other ARPIA tools. |
General Information
File Size Limits
- Maximum file size: 1 GB per file
- Total bucket storage: Varies by plan (check your ARPIA subscription)
Supported File Types
- All file types are supported
- Common MIME types: image/*, application/json, application/pdf, text/*, video/*, audio/*
Rate Limiting
- Standard requests: 100 requests per minute per token
- Large file uploads: 20 uploads per minute
- Exceeding limits returns HTTP 429 (Too Many Requests)
Important Notes
- File names must be URL-safe (avoid special characters)
- Files with identical names will be overwritten
- All timestamps are in UTC format
Detailed Endpoints
1. GET: List Files or Get a Specific File from Bucket

- Endpoint:
  GET https://cloud.arpia.ai/api/bucket/
- Query Parameters:
  - token (Required): Authentication token for accessing the API.
  - bucket (Required): Unique identifier for the bucket, e.g., LElC1IWg.
  - file (Optional): Name of the file to retrieve, e.g., upload_test.txt. If omitted, lists all files in the bucket.
- Description:
  Retrieve a list of all files in a bucket, or get information about a specific file if the file parameter is specified.
- Example Request:
  GET https://cloud.arpia.ai/api/bucket/?token=YOUR_TOKEN&bucket=LElC1IWg
- Responses:
  - 200 OK: Request successful, with a JSON response listing files or file data.

    ```json
    [
      {
        "name": "upload_test.txt",
        "url": "https://store-pty1.datakubes.io/LElC1IWg/upload_test.txt",
        "size": 124652,
        "modified": "2023-12-13 14:05:18"
      },
      {
        "name": "upload_test2.txt",
        "url": "https://store-pty1.datakubes.io/LElC1IWg/upload_test2.txt",
        "size": 124652,
        "modified": "2023-12-13 14:08:23"
      }
    ]
    ```

  - 400 Bad Request: Missing required parameters.

    ```json
    { "error": "YES", "error_message": "Missing required parameter: token or bucket" }
    ```

  - 401 Unauthorized: Invalid or expired authentication token.

    ```json
    { "error": "YES", "error_message": "Invalid authentication token" }
    ```

  - 404 Not Found: Bucket or file not found.

    ```json
    { "error": "YES", "error_message": "Bucket or file not found" }
    ```

  - 429 Too Many Requests: Rate limit exceeded.

    ```json
    { "error": "YES", "error_message": "Rate limit exceeded. Please try again later." }
    ```
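For reference, a minimal Python sketch of this call using the requests library (the helper name is illustrative; field names are taken from the 200 OK example above):

```python
import requests

BUCKET_API = 'https://cloud.arpia.ai/api/bucket/'

def list_files(token, bucket, file=None):
    # Omit the 'file' parameter to list every file in the bucket
    params = {'token': token, 'bucket': bucket}
    if file is not None:
        params['file'] = file
    response = requests.get(BUCKET_API, params=params)
    response.raise_for_status()  # raise on 4xx/5xx instead of parsing the error body
    return response.json()

# List all files, printing name, size, and modified timestamp
for entry in list_files('YOUR_TOKEN', 'LElC1IWg'):
    print(entry['name'], entry['size'], entry['modified'])
```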
2. DELETE: Delete a File from the Bucket

- Endpoint:
  DELETE https://cloud.arpia.ai/api/bucket/
- Query Parameters:
  - token (Required): Authentication token.
  - bucket (Required): Unique identifier for the bucket, e.g., LElC1IWg.
  - file (Required): Name of the file to delete, e.g., upload_test.txt.
- Description:
  Delete a specific file from the bucket. This action is permanent and cannot be undone.
- Example Request:
  DELETE https://cloud.arpia.ai/api/bucket/?token=YOUR_TOKEN&bucket=LElC1IWg&file=upload_test.txt
- Responses:
  - 200 OK: File deleted successfully.

    ```json
    { "result": "OK", "message": "File deleted successfully", "file": "upload_test.txt" }
    ```

  - 400 Bad Request: Missing required parameters.

    ```json
    { "error": "YES", "error_message": "Missing required parameter: file" }
    ```

  - 401 Unauthorized: Invalid authentication token.

    ```json
    { "error": "YES", "error_message": "Invalid authentication token" }
    ```

  - 404 Not Found: File not found in bucket.

    ```json
    { "error": "YES", "error_message": "File not found in bucket" }
    ```

  - 405 Method Not Allowed: Request method not supported for this resource.

    ```json
    { "error": "YES", "error_message": "This Request Method is not supported!" }
    ```

  - 429 Too Many Requests: Rate limit exceeded.

    ```json
    { "error": "YES", "error_message": "Rate limit exceeded" }
    ```
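A minimal Python sketch of the delete call (helper name illustrative; parameters as documented above):

```python
import requests

def delete_file(token, bucket, filename):
    # Deletion is permanent; confirm before calling in production code
    response = requests.delete(
        'https://cloud.arpia.ai/api/bucket/',
        params={'token': token, 'bucket': bucket, 'file': filename},
    )
    return response.json()

result = delete_file('YOUR_TOKEN', 'LElC1IWg', 'upload_test.txt')
print(result)  # e.g. {"result": "OK", "message": "File deleted successfully", ...}
```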
3. PUT: Upload a File to the Bucket

- Endpoint:
  PUT https://cloud.arpia.ai/api/bucket/
- Query Parameters:
  - token (Required): Authentication token.
  - bucket (Required): Unique identifier for the bucket, e.g., LElC1IWg.
  - file (Required): Name of the file being uploaded, e.g., upload_test.json.
  - base64 (Optional): Set to y to indicate base64-encoded file content. Omit for binary upload.
- Description:
  Upload a file to the specified bucket. The API supports two upload methods:
  - Base64 Upload: Send file content as a base64-encoded string in a JSON body
  - Binary Upload: Send raw file content directly in the request body
Method 1: Base64 Upload
When to use: For uploading files through web applications, JavaScript clients, or when the file content needs to be transmitted as text/JSON.
Endpoint Example:
PUT https://cloud.arpia.ai/api/bucket/?token=[YOUR_TOKEN]&bucket=[BUCKET_TOKEN]&base64=y&file=[FILE_NAME]
Request Headers:
Content-Type: application/json
Request Body:

```json
{
  "file": "iVBORw0KGgoAAAANSUhEUgAACK8AAAMFCAYAAAC4PkclAAAACXBIWXMAABcRAAAXEQHKJvM..."
}
```
JavaScript Example:

```javascript
// Convert file to base64
const fileToBase64 = (file) => {
  return new Promise((resolve, reject) => {
    const reader = new FileReader();
    reader.readAsDataURL(file);
    reader.onload = () => resolve(reader.result.split(',')[1]);
    reader.onerror = error => reject(error);
  });
};

// Upload file (file names must be URL-safe, so encode the name)
const uploadFile = async (file, token, bucket) => {
  const base64Content = await fileToBase64(file);
  const response = await fetch(
    `https://cloud.arpia.ai/api/bucket/?token=${token}&bucket=${bucket}&base64=y&file=${encodeURIComponent(file.name)}`,
    {
      method: 'PUT',
      headers: {
        'Content-Type': 'application/json'
      },
      body: JSON.stringify({ file: base64Content })
    }
  );
  return response.json();
};
```
Method 2: Binary Upload
When to use: For direct file uploads from command line, server-to-server transfers, or when maximum efficiency is needed.
Endpoint Example:
PUT https://cloud.arpia.ai/api/bucket/?token=[YOUR_TOKEN]&bucket=[BUCKET_TOKEN]&file=[FILE_NAME]
Request Headers:
Content-Type: application/octet-stream (or the file's specific MIME type)
cURL Example:

```bash
curl --location --request PUT \
  'https://cloud.arpia.ai/api/bucket/?token=YOUR_TOKEN&bucket=BUCKET_TOKEN&file=example.json' \
  --header 'Content-Type: application/octet-stream' \
  --data-binary '@/path/to/your/file.json'
```
Python Example:

```python
import requests

def upload_file(file_path, token, bucket, filename):
    url = 'https://cloud.arpia.ai/api/bucket/'
    params = {
        'token': token,
        'bucket': bucket,
        'file': filename
    }
    # Stream the raw file bytes as the request body
    with open(file_path, 'rb') as f:
        response = requests.put(
            url,
            params=params,
            data=f,
            headers={'Content-Type': 'application/octet-stream'}
        )
    return response.json()

# Usage
result = upload_file('/path/to/file.json', 'YOUR_TOKEN', 'BUCKET_TOKEN', 'file.json')
print(result)
```
Responses (Both Methods):

- 200 OK: File uploaded successfully.

  ```json
  { "result": "OK", "file": "https://store-pty1.datakubes.io/LElC1IWg/upload_test.json", "message": "File uploaded successfully" }
  ```

- 400 Bad Request: Missing required parameters or invalid file data.

  ```json
  { "error": "YES", "error_message": "Invalid file data or missing parameters" }
  ```

- 401 Unauthorized: Invalid authentication token.

  ```json
  { "error": "YES", "error_message": "Invalid authentication token" }
  ```

- 413 Payload Too Large: File exceeds the maximum size limit (1 GB).

  ```json
  { "error": "YES", "error_message": "File size exceeds maximum limit of 1 GB" }
  ```

- 429 Too Many Requests: Upload rate limit exceeded.

  ```json
  { "error": "YES", "error_message": "Upload rate limit exceeded" }
  ```

- 500 Internal Server Error: Server-side error during upload.

  ```json
  { "error": "YES", "error_message": "Internal server error. Please try again later." }
  ```
Note: You can find your token and bucket token in the AutoAPI of type object-storage.
Important: Files with the same name will be overwritten without warning. Ensure unique file names or implement versioning in your application.
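For completeness, a Python counterpart to the JavaScript base64 example above; a minimal sketch (helper name illustrative) that builds the documented {"file": "<base64>"} JSON body:

```python
import base64
import requests

def upload_file_base64(file_path, token, bucket, filename):
    # Encode the raw file bytes as base64 text for the JSON body
    with open(file_path, 'rb') as f:
        encoded = base64.b64encode(f.read()).decode('ascii')

    response = requests.put(
        'https://cloud.arpia.ai/api/bucket/',
        params={'token': token, 'bucket': bucket, 'base64': 'y', 'file': filename},
        json={'file': encoded},  # {"file": "<base64>"} body, as documented above
    )
    return response.json()
```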
4. POST: Work Area Project Submission

- Endpoint:
  POST https://cloud.arpia.ai/api/
- Query Parameters:
  - token (Required): Authentication token.
  - wsp (Required): Workspace ID where the project will be submitted, e.g., hZBM6k4h.
  - debug (Optional): Set to true to enable debug mode for detailed processing information.
- Request Body:
  - json: Data payload for the project submission.

    ```json
    {
      "text": "Sample project data for submission.",
      "metadata": {
        "author": "John Doe",
        "version": "1.0"
      }
    }
    ```

- Description:
  Submit project or workflow data to a specified workspace within the ARPIA platform. This endpoint is designed for integrating bucket operations with ARPIA's project management and workflow automation tools.

  Use Cases:
  - Submitting processed data from bucket files to a workspace
  - Triggering workflows based on uploaded files
  - Creating project records with associated bucket resources
- Example Request:
  POST https://cloud.arpia.ai/api/?token=YOUR_TOKEN&wsp=hZBM6k4h&debug=true
- Responses:
  - 200 OK: Request successful, with a JSON response detailing each step of the submission process.

    ```json
    [
      { "Step": 1, "StepName": "First Step", "Msg": "Initial step for the action", "Error": "N", "Result": "DONE" },
      { "Step": 2, "StepName": "Database", "Msg": "Database Transaction Query", "Error": "N", "Result": "DONE" },
      { "Step": 3, "StepName": "File Process", "Msg": "File process success.", "Error": "N", "Result": "DONE" },
      { "Step": 4, "StepName": "Relational Node Data Entry", "Msg": "Save to Relational Node Query", "Error": "N", "Result": "DONE" },
      { "Step": 5, "FinalDBStep": "Y", "StepName": "Final DB Commit", "Msg": "Commit all database transactions", "Error": "N", "Result": "DONE" }
    ]
    ```

  - 400 Bad Request: Invalid request payload or missing parameters.

    ```json
    { "error": "YES", "error_message": "Invalid request body or missing workspace ID" }
    ```

  - 401 Unauthorized: Invalid authentication token.

    ```json
    { "error": "YES", "error_message": "Invalid authentication token" }
    ```

  - 404 Not Found: Workspace not found.

    ```json
    { "error": "YES", "error_message": "Workspace not found" }
    ```
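A minimal Python sketch of this call, assuming the payload is sent as a JSON body (as the Request Body section above suggests); values are placeholders:

```python
import requests

payload = {
    "text": "Sample project data for submission.",
    "metadata": {"author": "John Doe", "version": "1.0"}
}

response = requests.post(
    'https://cloud.arpia.ai/api/',
    params={'token': 'YOUR_TOKEN', 'wsp': 'hZBM6k4h', 'debug': 'true'},
    json=payload,  # assumption: JSON payload in the request body
)

# With debug=true the response lists each processing step
for step in response.json():
    print(step['Step'], step['StepName'], step['Result'])
```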
Using AutoAPI with Buckets
If you are using buckets within the ARPIA ecosystem, integrating them with the AutoAPI system can significantly enhance your workflow efficiency. The AutoAPI allows you to:
- Dynamically retrieve or update bucket data
- Automate file processing workflows
- Create custom endpoints that interact with your bucket resources
- Build event-driven architectures triggered by file uploads
Configuration Requirements
When setting up AutoAPI endpoints for bucket operations:
- Select the appropriate HTTP method (GET, POST, PUT, DELETE)
- Link the AutoAPI to your object store in the configuration panel
- Configure method-specific settings:
- For GET: Define which files or metadata to retrieve
- For PUT: Configure upload parameters and validation rules
- For POST: Set up workspace integration and data transformation
- For DELETE: Implement access controls and confirmation workflows
Common Integration Patterns
Pattern 1: Automated File Processing
Upload a file to the bucket → AutoAPI triggers → Process file → Store results in workspace
Pattern 2: Data Synchronization
External system uploads to bucket → AutoAPI validates → Sync to database → Notify stakeholders
Pattern 3: Multi-Stage Workflows
Bucket upload → AutoAPI Stage 1 (validation) → AutoAPI Stage 2 (transformation) → Final storage
Important Configuration Notes
⚠️ Common Misconfiguration: Selecting PUT method without properly linking the object store will cause setup failures. Always verify the bucket-to-AutoAPI linkage before deploying.
✅ Best Practice: Test your AutoAPI configuration with small files first, then enable debug mode to monitor the processing pipeline.
Additional Resources
For comprehensive guidance on setting up and managing AutoAPI endpoints with buckets, including:
- Step-by-step configuration tutorials
- Advanced authentication options
- Rate limiting and throttling strategies
- Error handling best practices
See the AutoAPI Documentation.
Best Practices
Security
- Never expose tokens in client-side code - Use server-side proxies for web applications
- Rotate tokens regularly - Update authentication tokens periodically
- Implement access controls - Use workspace-level permissions to restrict bucket access
- Validate file types - Check file extensions and MIME types before upload
Performance
- Use binary uploads for large files - More efficient than base64 encoding
- Implement retry logic - Handle rate limiting and temporary failures gracefully
- Batch operations - Group multiple operations when possible
- Monitor rate limits - Track API usage to avoid throttling
Error Handling
- Always check response status codes - Don't assume success
- Implement exponential backoff - For retry strategies after rate limiting
- Log detailed error messages - Include request IDs when contacting support
- Validate before upload - Check file size and format client-side to reduce errors
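A sketch of the retry-with-exponential-backoff pattern described above, using the requests library; the status codes, delays, and attempt count are illustrative defaults, not API-mandated values:

```python
import time
import requests

def request_with_backoff(method, url, max_attempts=5, **kwargs):
    # Retry on 429 and transient 5xx errors with growing delays (1s, 2s, 4s, ...)
    for attempt in range(max_attempts):
        response = requests.request(method, url, **kwargs)
        if response.status_code not in (429, 500, 502, 503, 504):
            return response
        time.sleep(2 ** attempt)
    return response  # last response after exhausting retries
```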
File Management
- Use descriptive file names - Include timestamps or unique identifiers
- Implement versioning - Track file versions in your application logic
- Clean up old files - Implement retention policies to manage storage costs
- Document file structures - Maintain metadata about bucket contents
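Since files with identical names are overwritten, one simple way to follow the naming and versioning advice above is to prefix uploads with a UTC timestamp (a hypothetical helper, not part of the API):

```python
from datetime import datetime, timezone

def versioned_name(filename):
    # Prefix a UTC timestamp (the API stores timestamps in UTC) to avoid collisions
    stamp = datetime.now(timezone.utc).strftime('%Y%m%dT%H%M%SZ')
    return f'{stamp}_{filename}'

print(versioned_name('upload_test.txt'))  # e.g. 20231213T140518Z_upload_test.txt
```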
Troubleshooting
Common Issues
Problem: "Invalid authentication token" error
Solution: Verify your token is current and hasn't expired. Generate a new token from the AutoAPI object-storage configuration.
Problem: Files not appearing after upload
Solution: Check the upload response for success confirmation. Verify bucket ID is correct. Allow a few seconds for indexing.
Problem: Base64 upload fails with large files
Solution: Use binary upload method for files > 50 MB. Base64 encoding increases payload size by ~33%, and large base64 strings can cause timeout or memory issues.
Problem: Rate limit errors during bulk uploads
Solution: Implement request queuing with delays between uploads. Consider increasing rate limits for your plan.
Support
For additional assistance:
- Documentation: https://docs.arpia.ai
- Community Forum: https://community.arpia.ai
- Technical Support: [email protected]
- Status Page: https://status.arpia.ai
Last Updated: December 2024
API Version: 2.0
Document Version: 2.1