Overview
Creates multiple transactions in a single batch operation. This endpoint is optimized for high-throughput scenarios.
Limits:
- Single batch: Up to 100,000 transactions per request; request body must not exceed 50 MB.
- Multiple files in one call: Use POST /transactions/batch/background-multi with a sources array (up to 5 files, 100,000 transactions each); body limit 150 MB.
- File upload (multipart): Use POST /transactions/batch/upload to send CSV, Excel, or JSON files directly; max 5 files per request. The per-file transaction limit depends on your plan.
Execution behavior (sync endpoint):
- The API keeps the connection open for up to 30 seconds.
- If the batch finishes within 30 seconds, the API returns 200 with the full summary (created, skipped, rules executed, processing time, etc.).
- If processing exceeds 30 seconds, the API returns 202 with a jobId and continues processing in the background. When finished, the client is notified in the dashboard (and via real-time socket if connected). The response message indicates that the process may take longer and that the client will be notified when it is done.
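The 200/202 branching above can be sketched client-side. This is a minimal sketch: handle_batch_response is a hypothetical helper (not part of the API), and the field names (created, failed, jobId) follow the response shapes documented on this page.

```python
def handle_batch_response(status_code: int, body: dict) -> dict:
    """Branch on the sync endpoint's status code."""
    if status_code == 200:
        # Completed within 30 seconds: the full summary is in the body
        return {"done": True, "created": body["created"], "failed": body["failed"]}
    if status_code == 202:
        # Still processing: keep the jobId and wait for the
        # dashboard/socket notification
        return {"done": False, "job_id": body["jobId"]}
    raise RuntimeError(f"Unexpected status {status_code}: {body}")
```

The caller then decides whether to read the result immediately or to wait for the completion notification keyed by job_id.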
Background endpoints (always asynchronous):
POST /transactions/batch/background — Single batch; returns 202 immediately and notifies via socket when done.
POST /transactions/batch/background-multi — Multiple files in one request (max 5); returns 202 and one socket notification when all files are processed.
POST /transactions/batch/upload — Multipart form-data: send one or more files (CSV, Excel, JSON) in the file field; the server parses them and runs the same batch flow. Ideal when the client has files instead of a JSON body.
Ideal for:
- Importing historical transactions
- Processing large transaction files (CSV, Excel, JSON)
- High-volume payment processors
- Data migration scenarios
Endpoints
POST http://api.gu1.ai/transactions/batch
POST http://api.gu1.ai/transactions/batch/background
POST http://api.gu1.ai/transactions/batch/background-multi
POST http://api.gu1.ai/transactions/batch/upload (multipart/form-data)
File upload (multipart)
Use POST /transactions/batch/upload when you have CSV, Excel, or JSON files and want the server to parse them. You must send the request as multipart/form-data (FormData), not JSON.
- Max 5 files per request.
- Accepted formats: CSV (.csv), Excel (.xlsx, .xls), JSON (.json).
- Per-file transaction limit depends on your organization plan (see Limits by plan).
- Query parameter: validateGap (default true). If you send multiple files and there is a time gap of 30 minutes or more between the end of one file and the start of the next (by transactedAt), the API returns 400 with code: "GAP_VALIDATION_REQUIRED" and a list of all gaps. To proceed anyway, resend the same request with ?validateGap=false.
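The gap check above can be approximated client-side before uploading. This is a sketch under the rule described on this page (a gap is flagged when the last transactedAt of one file and the first transactedAt of the next are 30 minutes or more apart); find_gaps is a hypothetical pre-check helper, not part of the API, and it assumes naive ISO 8601 timestamps for simplicity.

```python
from datetime import datetime, timedelta

GAP_THRESHOLD = timedelta(minutes=30)

def find_gaps(files):
    """Return (file_index, next_file_index) pairs where consecutive
    files have a gap of 30 minutes or more between them.
    files: list of lists of ISO 8601 transactedAt strings, in upload order."""
    gaps = []
    for i in range(len(files) - 1):
        end = max(datetime.fromisoformat(t) for t in files[i])
        start = min(datetime.fromisoformat(t) for t in files[i + 1])
        if start - end >= GAP_THRESHOLD:
            gaps.append((i, i + 1))
    return gaps
```

If this returns a non-empty list, you can either fix the files or knowingly resend with ?validateGap=false.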
| Field | Type | Required | Description |
|---|---|---|---|
| file | File | Yes | One or more files (CSV, Excel, JSON). Send multiple files by repeating the file field. |
| executeRules | string | No | "true" (default) or "false" |
| skipDuplicates | string | No | "true" (default) or "false" |
| validateExistingEntity | string | No | "true" (default) or "false" |
const form = new FormData();
form.append('file', fileInput.files[0]); // or multiple: form.append('file', file1); form.append('file', file2);
form.append('executeRules', 'true');
form.append('skipDuplicates', 'true');
form.append('validateExistingEntity', 'true');

const response = await fetch('http://api.gu1.ai/transactions/batch/upload', {
  method: 'POST',
  headers: { 'Authorization': 'Bearer YOUR_API_KEY' },
  body: form // Do not set Content-Type; the browser sets the multipart boundary
});
const data = await response.json();
Example: upload with cURL
curl -X POST http://api.gu1.ai/transactions/batch/upload \
-H "Authorization: Bearer YOUR_API_KEY" \
-F "file=@transactions.csv" \
-F "file=@more-transactions.xlsx" \
-F "executeRules=true" \
-F "skipDuplicates=true"
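The same multi-file upload can be expressed with Python's requests library; passing the files as a list of tuples with a repeated 'file' field mirrors repeating -F "file=@..." in cURL. This sketch prepares the request without sending it, which shows the multipart encoding offline (the file contents here are placeholder bytes).

```python
import requests

req = requests.Request(
    'POST',
    'http://api.gu1.ai/transactions/batch/upload',
    headers={'Authorization': 'Bearer YOUR_API_KEY'},
    files=[
        # Repeat the 'file' field once per file
        ('file', ('transactions.csv', b'externalId,type,amount,currency\n', 'text/csv')),
        ('file', ('more-transactions.csv', b'externalId,type,amount,currency\n', 'text/csv')),
    ],
    data={'executeRules': 'true', 'skipDuplicates': 'true'},
)
prepared = req.prepare()
# requests sets the multipart/form-data Content-Type with a boundary;
# to actually send, use requests.Session().send(prepared)
print(prepared.headers['Content-Type'].split(';')[0])
```

As with the browser example, do not set Content-Type yourself; requests generates the multipart boundary.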
Limits by plan
The maximum number of transactions per file (for both the upload endpoint and background-multi) depends on your organization's plan:
| Plan | Max transactions per file |
|---|---|
| Freemium | 4,000 |
| Startup | 12,000 |
| Growth | 30,000 |
| Enterprise | 100,000 |
| Usage Based (pay-as-you-go) | 100,000 |
If batch upload is disabled for your organization, the upload and background endpoints return 403. Contact support to enable it or to change your plan.
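A client can pre-check file sizes against the limits in the table above before uploading. PLAN_LIMITS and check_file_size are hypothetical client-side helpers, not part of the API; the plan names and figures are taken from the table.

```python
# Per-file transaction limits from the "Limits by plan" table
PLAN_LIMITS = {
    'freemium': 4_000,
    'startup': 12_000,
    'growth': 30_000,
    'enterprise': 100_000,
    'usage_based': 100_000,
}

def check_file_size(plan: str, row_count: int) -> bool:
    """True if a file with row_count transactions fits within the plan's limit."""
    return row_count <= PLAN_LIMITS[plan]
```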
Templates
To build CSV, Excel, or JSON files that match the expected schema:
- Dashboard: In Transaction Monitoring, open the Batch upload modal. Use the Download CSV, Download Excel, or Download JSON buttons to get templates with the correct headers and a sample row. This is the easiest way to get started.
- Required fields per row: Each transaction must include at least externalId, type, amount, and currency. Optional but recommended: transactedAt, status, originName, destinationName, description. For the full schema see Create Transaction.
- CSV/Excel: Column names can be camelCase (externalId, transactedAt) or snake_case (external_id, transacted_at); the server normalizes them. Date format: ISO 8601 (e.g. 2025-01-15T10:30:00.000Z).
There is no public URL to download templates outside the dashboard; use the batch upload modal to get the latest templates.
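The header normalization mentioned above (snake_case to camelCase) can be reproduced client-side when building files programmatically. to_camel is a hypothetical helper for preparing your own files; the server performs its own normalization regardless.

```python
def to_camel(header: str) -> str:
    """Convert a snake_case column name to camelCase.
    Already-camelCase names without underscores pass through unchanged."""
    first, *rest = header.strip().split('_')
    return first + ''.join(word.capitalize() for word in rest)
```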
Authentication
Requires a valid API key in the Authorization header:
Authorization: Bearer YOUR_API_KEY
Request Body (sync and background)
Array of transaction objects to create (minimum 1, maximum 100,000 per request). Each transaction object has the same structure as in the Create Transaction endpoint. The request body must not exceed 50 MB.
Whether to execute the rules engine for all transactions in the batch
Skip transactions with duplicate externalId instead of failing the entire batch
If true, validate that origin and destination entities exist and auto-link by externalId. Set to false to create transactions without entity matching (e.g. bulk import).
Request Body (background-multi only)
Array of objects: { "fileName": "optional string", "transactions": [ ... ] }. Minimum 1, maximum 5 sources. Each transactions array: max 100,000 items (or your plan limit per file). Total request body must not exceed 150 MB.
Whether to execute rules for all transactions in every source
Skip transactions with duplicate externalId
If true, validate and auto-link entities; set to false for bulk import without entity matching
Response (200 when completed within 30s)
Indicates if the batch operation completed successfully
Number of transactions successfully created
Number of transactions skipped (duplicates or validation errors)
Number of transactions that failed to create
Array of created transaction objects
Array of error objects for failed transactions, including:
index - Index of the failed transaction in the input array
externalId - External ID of the failed transaction
error - Error message
code - Error code
Total processing time (e.g. "3.45s")
When rules are executed: total, created, skipped, failed, autoLinked, validationEnabled
Response (202 when processing exceeds 30s or background)
When the batch is processed in the background (either because it exceeded 30 seconds on the sync endpoint or because you called the background endpoint), the API returns:
Unique job identifier; use it to correlate with the real-time notification when the job completes
Human-readable message indicating that the process is running in the background and the client will be notified in the dashboard when it finishes
(Only for background-multi.) Number of files being processed
When the job completes, a real-time event is emitted (e.g. transaction:batch-completed or transaction:batch-failed) so the dashboard can show a toast and refresh the transaction list.
Examples
Batch Import with 100 Transactions
curl -X POST http://api.gu1.ai/transactions/batch \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "transactions": [
      {
        "externalId": "txn_001",
        "type": "PAYMENT",
        "amount": 50.00,
        "currency": "USD",
        "originExternalId": "customer_001",
        "destinationExternalId": "merchant_100",
        "channel": "web_browser"
      },
      {
        "externalId": "txn_002",
        "type": "TRANSFER",
        "amount": 200.00,
        "currency": "EUR",
        "originExternalId": "customer_002",
        "destinationExternalId": "customer_003",
        "channel": "mobile_app"
      },
      ... // Up to 100,000 transactions
    ],
    "executeRules": true,
    "skipDuplicates": true
  }'
Large Import with Chunking (100,000 per batch)
import requests
from typing import Dict, List

def chunk_list(lst: List, chunk_size: int):
    """Split list into chunks of specified size"""
    for i in range(0, len(lst), chunk_size):
        yield lst[i:i + chunk_size]

def import_transactions_in_batches(transactions: List[Dict], batch_size: int = 100000):
    """Import large number of transactions in batches"""
    total_created = 0
    total_skipped = 0
    total_failed = 0
    for batch_num, batch in enumerate(chunk_list(transactions, batch_size), 1):
        print(f"Processing batch {batch_num} ({len(batch)} transactions)...")
        response = requests.post(
            'http://api.gu1.ai/transactions/batch',
            headers={
                'Authorization': f'Bearer {API_KEY}',
                'Content-Type': 'application/json'
            },
            json={
                'transactions': batch,
                'executeRules': False,  # Skip rules for faster import
                'skipDuplicates': True,
                'validateExistingEntity': False  # Skip validation for speed
            }
        )
        result = response.json()
        total_created += result['created']
        total_skipped += result['skipped']
        total_failed += result['failed']
        print(f"Batch {batch_num}: Created {result['created']}, Skipped {result['skipped']}, Failed {result['failed']}")
        # Handle errors
        if result.get('errors'):
            print(f"Errors in batch {batch_num}:")
            for error in result['errors'][:5]:  # Show first 5 errors
                print(f"  - {error['externalId']}: {error['error']}")
    print(f"\nTotal: Created {total_created}, Skipped {total_skipped}, Failed {total_failed}")
    return {
        'created': total_created,
        'skipped': total_skipped,
        'failed': total_failed
    }

# Usage
all_transactions = load_all_transactions_from_database()
result = import_transactions_in_batches(all_transactions, batch_size=100000)
Batch with Device and Location Details
// Import mobile app transactions with device info
const mobileTransactions = await database.query(`
SELECT
t.id as transaction_id,
t.amount,
t.currency,
d.platform,
d.os_name,
d.device_model,
l.latitude,
l.longitude,
l.city,
l.country
FROM transactions t
JOIN devices d ON t.device_id = d.id
JOIN locations l ON t.location_id = l.id
`);
const response = await fetch('http://api.gu1.ai/transactions/batch', {
method: 'POST',
headers: {
'Authorization': 'Bearer YOUR_API_KEY',
'Content-Type': 'application/json'
},
body: JSON.stringify({
transactions: mobileTransactions.map(txn => ({
externalId: txn.transaction_id,
type: 'PAYMENT',
amount: txn.amount,
currency: txn.currency,
channel: 'mobile_app',
deviceDetails: {
platform: txn.platform,
osName: txn.os_name,
model: txn.device_model
},
locationDetails: {
latitude: txn.latitude,
longitude: txn.longitude,
city: txn.city,
country: txn.country
}
})),
executeRules: true,
skipDuplicates: true
})
});
const result = await response.json();
Response Example
{
  "success": true,
  "created": 98,
  "skipped": 2,
  "failed": 0,
  "processingTime": 1245,
  "transactions": [
    {
      "id": "550e8400-e29b-41d4-a716-446655440000",
      "externalId": "txn_001",
      "type": "PAYMENT",
      "amount": 50.00,
      "currency": "USD",
      "status": "CREATED",
      "riskScore": 12,
      "createdAt": "2024-10-03T14:30:00.000Z"
    },
    ... // All created transactions
  ],
  "errors": [
    {
      "index": 45,
      "externalId": "txn_046",
      "error": "Invalid currency format",
      "code": "VALIDATION_ERROR"
    }
  ]
}
Error Handling
Partial Success
The batch endpoint uses a partial success model:
- Successfully created transactions are committed
- Failed transactions are skipped
- The response includes both successes and errors
const result = await createBatchTransactions(transactions);
if (result.failed > 0) {
  console.error(`${result.failed} transactions failed:`);
  result.errors.forEach(error => {
    console.error(`Transaction ${error.externalId} at index ${error.index}: ${error.error}`);
    // Retry failed transactions individually
    retryTransaction(transactions[error.index]);
  });
}
console.log(`Successfully created ${result.created} transactions`);
console.log(`Successfully created ${result.created} transactions`);
400 Bad Request - Batch Too Large
{
  "success": false,
  "error": {
    "code": "BATCH_SIZE_EXCEEDED",
    "message": "Batch size exceeds maximum of 100,000 transactions per request",
    "details": {
      "provided": 150000,
      "maximum": 100000
    }
  }
}
413 Payload Too Large
Request body exceeds 50 MB (single batch) or 150 MB (multi-file). Split into smaller batches or use multiple requests.
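Splitting by item count alone does not guarantee staying under the byte limit, so a size-aware chunker can help. This is a sketch: chunk_by_bytes is a hypothetical helper using the 50 MB figure from this page, and it estimates payload size from each item's serialized JSON rather than the exact wire size.

```python
import json

def chunk_by_bytes(transactions, max_bytes=50 * 1024 * 1024):
    """Split a transaction list into chunks whose serialized JSON
    stays under max_bytes (approximately)."""
    chunks, current, current_size = [], [], 2  # 2 bytes for the enclosing "[]"
    for txn in transactions:
        item_size = len(json.dumps(txn).encode()) + 1  # +1 for the separator comma
        if current and current_size + item_size > max_bytes:
            chunks.append(current)
            current, current_size = [], 2
        current.append(txn)
        current_size += item_size
    if current:
        chunks.append(current)
    return chunks
```

Each resulting chunk can then be sent as its own batch request.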
400 Bad Request - Empty Batch
{
  "success": false,
  "error": {
    "code": "EMPTY_BATCH",
    "message": "Batch must contain at least 1 transaction",
    "details": {
      "provided": 0,
      "minimum": 1
    }
  }
}
Throughput and limits
- Maximum batch size: 100,000 transactions per request (single batch); body max 50 MB
- Multi-file: Up to 5 files, 100,000 transactions each (or plan limit per file); body max 150 MB
- Sync endpoint: Waits up to 30 seconds; if the batch completes in time, returns 200 with full result; otherwise returns 202 and continues in background with dashboard/socket notification when done
- Typical processing time: Depends on volume and rules; for large batches, expect 202 and wait for the completion notification
Optimization tips
- Disable rules for bulk imports: Set executeRules: false for faster processing
- Skip entity validation: Set validateExistingEntity: false for bulk import without entity matching
- Use chunking: For over 100,000 transactions, send multiple batches (e.g. 100,000 per request) or use background-multi with up to 5 files
- Handle 202: If you receive 202, the job continues in the background; the dashboard shows a notification when it completes (and a socket event if connected)
- Handle duplicates: Always set skipDuplicates: true to avoid failures on duplicate externalId
Example: Optimized Bulk Import
import asyncio
import aiohttp

async def create_batch_async(session, transactions):
    async with session.post(
        'http://api.gu1.ai/transactions/batch',
        headers={'Authorization': f'Bearer {API_KEY}'},
        json={
            'transactions': transactions,
            'executeRules': False,  # Skip for speed
            'skipDuplicates': True,
            'validateExistingEntity': False  # Skip validation
        }
    ) as response:
        return await response.json()

async def bulk_import_parallel(all_transactions, batch_size=10000, max_concurrent=5):
    """Import transactions with parallel batches"""
    batches = [all_transactions[i:i + batch_size] for i in range(0, len(all_transactions), batch_size)]
    semaphore = asyncio.Semaphore(max_concurrent)

    async def process_batch(batch):
        async with semaphore:
            async with aiohttp.ClientSession() as session:
                return await create_batch_async(session, batch)

    results = await asyncio.gather(*[process_batch(batch) for batch in batches])
    total_created = sum(r['created'] for r in results)
    total_failed = sum(r['failed'] for r in results)
    print(f"Imported {total_created} transactions, {total_failed} failed")
    return results

# Usage
transactions = load_100k_transactions()
asyncio.run(bulk_import_parallel(transactions))
Use Cases
Historical Data Migration
Import past transactions from legacy system:
# Migrate historical data with minimal processing
response = requests.post(
    'http://api.gu1.ai/transactions/batch',
    headers={'Authorization': f'Bearer {API_KEY}'},
    json={
        'transactions': legacy_transactions,
        'executeRules': False,  # Don't analyze historical data
        'skipDuplicates': True,
        'validateExistingEntity': False
    }
)
Daily Transaction Import
Schedule daily batch imports:
// Cron job: Import yesterday's transactions (schedule comes from the node-schedule package)
async function importYesterdayTransactions() {
  const yesterday = new Date();
  yesterday.setDate(yesterday.getDate() - 1);
  const transactions = await fetchTransactionsFromDatabase(yesterday);
  const result = await fetch('http://api.gu1.ai/transactions/batch', {
    method: 'POST',
    headers: {
      'Authorization': 'Bearer YOUR_API_KEY',
      'Content-Type': 'application/json'
    },
    body: JSON.stringify({
      transactions: transactions,
      executeRules: true, // Analyze for alerts
      skipDuplicates: true
    })
  });
  const data = await result.json();
  console.log(`Imported ${data.created} transactions from ${yesterday.toDateString()}`);
}

// Run daily at 2am
schedule.scheduleJob('0 2 * * *', importYesterdayTransactions);
Best Practices
- Respect limits: Max 100,000 transactions per request and 50 MB body (150 MB for multi-file)
- Always enable skipDuplicates: Prevents batch failures from duplicates
- Disable rules for historical imports: Set executeRules: false to save time
- Handle 200 vs 202: On the sync endpoint, 200 = result in body; 202 = job in background, wait for the dashboard/socket notification
- Validate data before sending: Ensure the required fields (externalId, type, amount, currency) per Create Transaction
- Use background-multi or upload for files: Up to 5 files in one request; use POST /transactions/batch/upload with FormData to send CSV/Excel/JSON directly
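The pre-send validation suggested above can be sketched as a small helper. validate_rows is hypothetical and only checks the minimum required fields listed on this page; the API performs the authoritative validation.

```python
# Minimum required fields per transaction, per the Templates section
REQUIRED = ('externalId', 'type', 'amount', 'currency')

def validate_rows(transactions):
    """Return a list of (index, missing_fields) for every invalid row."""
    problems = []
    for i, txn in enumerate(transactions):
        missing = [f for f in REQUIRED if f not in txn or txn[f] in (None, '')]
        if missing:
            problems.append((i, missing))
    return problems
```

Running this before submission lets you fix or drop bad rows instead of relying on the partial-success error list after the fact.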
Next Steps