# Pipeline & Transaction API
Batch-execute multiple commands for improved performance.

The Pipeline and Transaction API runs a batch of commands in a single HTTP request, which significantly reduces latency when you need to execute many commands.
## Endpoints
| Method | Endpoint | Description |
|---|---|---|
| POST | /{plateId}/pipeline | Execute multiple commands in pipeline |
| POST | /{plateId}/transaction | Execute multiple commands in transaction |
## Why Use Pipeline/Transaction?
### The Problem
Each HTTP request pays a full network round trip. If you need to run 100 commands:

- Naive: 100 HTTP requests, so 100 round trips of latency
- Pipeline: 1 HTTP request carrying all 100 commands, so a single round trip
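To put rough numbers on this, here is a back-of-the-envelope comparison. The 50 ms round-trip time is an assumed, illustrative figure; real numbers depend on your network:

```javascript
// Latency estimate: each HTTP request pays one network round trip.
const rttMs = 50;         // assumed round-trip time (illustrative only)
const commandCount = 100;

const naiveMs = commandCount * rttMs; // one request per command
const pipelineMs = 1 * rttMs;         // one request for the whole batch

console.log(naiveMs, pipelineMs); // 5000 50
```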
### The Solution
- Pipeline: Send multiple commands in one request, get all results at once
- Transaction: Pipeline + atomic execution (all or nothing)
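Since both endpoints take the same request shape, the boilerplate repeated in the examples below can be factored into a small helper. This is only a sketch: `buildBatchRequest` is a made-up name, and `endpoint` is either `"pipeline"` or `"transaction"`:

```javascript
// Hypothetical helper that assembles the fetch arguments for either endpoint.
function buildBatchRequest(baseUrl, plateId, apiKey, endpoint, commands) {
  return {
    url: `${baseUrl}/${plateId}/${endpoint}`,
    options: {
      method: "POST",
      headers: {
        "Authorization": apiKey,
        "Content-Type": "application/json"
      },
      body: JSON.stringify(commands)
    }
  };
}

const { url, options } = buildBatchRequest(
  "https://example.invalid", "my-plate", "your-api-key",
  "pipeline", [["SET", "key1", "value1"], ["GET", "key1"]]
);
console.log(url); // https://example.invalid/my-plate/pipeline
// const results = await (await fetch(url, options)).json();
```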
## Pipeline - Batch Commands
Execute multiple commands in sequence. If one command fails, the rest still run.
### Basic Pipeline Request
```javascript
const plateId = "[id]";
const apiKey = "your-api-key";
const baseUrl = "[base-url]";

const response = await fetch(`${baseUrl}/${plateId}/pipeline`, {
  method: "POST",
  headers: {
    "Authorization": apiKey,
    "Content-Type": "application/json"
  },
  body: JSON.stringify([
    ["SET", "key1", "value1"],
    ["GET", "key1"],
    ["INCR", "counter"],
    ["HGET", "user:1", "name"]
  ])
});

const results = await response.json();
// results: ["OK", "value1", 1, "John"]
console.log(results);
```

### Use Cases
- Bulk loading: Import lots of data
- Batch reads: Get many values at once
- Multiple writes: Update many keys at once
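For example, a bulk import can be expressed as one pipeline body built from your data. The `seedData` object here is made up; the request itself follows the examples below:

```javascript
// Turn a plain object into one SET command per key, sent as a single pipeline.
const seedData = {
  "user:1": "Ada",
  "user:2": "Grace",
  "user:3": "Alan"
};

const commands = Object.entries(seedData).map(([key, value]) => ["SET", key, value]);
console.log(commands.length); // 3

// One request instead of three:
// await fetch(`${baseUrl}/${plateId}/pipeline`, {
//   method: "POST",
//   headers: { "Authorization": apiKey, "Content-Type": "application/json" },
//   body: JSON.stringify(commands)
// });
```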
### Pipeline: Bulk Set and Get
```javascript
const plateId = "[id]";
const apiKey = "your-api-key";
const baseUrl = "[base-url]";

const response = await fetch(`${baseUrl}/${plateId}/pipeline`, {
  method: "POST",
  headers: {
    "Authorization": apiKey,
    "Content-Type": "application/json"
  },
  body: JSON.stringify([
    ["MSET", "key1", "val1", "key2", "val2", "key3", "val3"],
    ["MGET", "key1", "key2", "key3"]
  ])
});

const results = await response.json();
// results: ["OK", ["val1", "val2", "val3"]]
console.log(results);
```

### Pipeline: Multiple Data Types
```javascript
const plateId = "[id]";
const apiKey = "your-api-key";
const baseUrl = "[base-url]";

const response = await fetch(`${baseUrl}/${plateId}/pipeline`, {
  method: "POST",
  headers: {
    "Authorization": apiKey,
    "Content-Type": "application/json"
  },
  body: JSON.stringify([
    ["SET", "string:key", "value"],
    ["HSET", "hash:key", "field", "value"],
    ["SADD", "set:key", "member"],
    ["ZADD", "zset:key", "100", "member"],
    ["LPUSH", "list:key", "item"]
  ])
});

const results = await response.json();
// results: ["OK", 1, 1, 1, 1]
console.log(results);
```

## Transaction - Atomic Execution
Execute multiple commands atomically: either all of them are applied, or none are.
### Basic Transaction Request
```javascript
const plateId = "[id]";
const apiKey = "your-api-key";
const baseUrl = "[base-url]";

const response = await fetch(`${baseUrl}/${plateId}/transaction`, {
  method: "POST",
  headers: {
    "Authorization": apiKey,
    "Content-Type": "application/json"
  },
  body: JSON.stringify([
    ["INCR", "balance"],
    ["DECR", "debt"],
    ["SET", "lastTransaction", "2024-01-01"]
  ])
});

const results = await response.json();
// results: [100, 50, "OK"]
console.log(results);
```

### Transaction: Account Transfer
```javascript
const plateId = "[id]";
const apiKey = "your-api-key";
const baseUrl = "[base-url]";

const response = await fetch(`${baseUrl}/${plateId}/transaction`, {
  method: "POST",
  headers: {
    "Authorization": apiKey,
    "Content-Type": "application/json"
  },
  body: JSON.stringify([
    ["DECRBY", "account:a", "100"],
    ["INCRBY", "account:b", "100"],
    ["SET", "transfer:completed", "true"]
  ])
});

const results = await response.json();
console.log(results);
```

## Pipeline vs Transaction
| Feature | Pipeline | Transaction |
|---|---|---|
| Atomic | No | Yes |
| Rollback on failure | No | Yes |
| Performance gain | High | Medium |
| Use case | Read-heavy bulk ops | Write-heavy atomic ops |
## When to Use Each
### Use Pipeline When:
- Doing bulk operations
- Performance is critical
- Individual failures don't affect each other
### Use Transaction When:
- Need atomic operations
- Dependent updates (if one fails, all should fail)
- Financial operations, inventory, etc.
## Error Handling
### Pipeline Errors
If a command fails, the response includes the error, but other commands still run:
```javascript
// Request
[
  ["SET", "key1", "value1"],
  ["GET", "nonexistent"],
  ["INCR", "notanumber"]
]

// Response
[
  "OK",
  null,
  {"error": "ERR value is not an integer or out of range"}
]
```

### Transaction Errors
If any command in a transaction fails, the entire transaction fails:
```javascript
// Request - if INCR fails, the entire transaction is rolled back
[
  ["SET", "key1", "value1"],
  ["INCR", "notanumber"]
]

// Response
{"error": "EXECABORTED", "message": "..."}
```

## Best Practices
- Batch size: Avoid sending thousands of commands in one request; split large jobs into smaller batches
- Pipeline for reads: Great for reducing latency on multiple reads
- Transaction for writes: Use when atomicity matters
- Check results: Always check response for errors
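Two of the practices above can be sketched as small helpers. The function names are made up, the batch size of 100 is a tunable guess, and the error shape matches the Pipeline Errors example above:

```javascript
// Split a large command list into smaller batches.
function chunkCommands(commands, size = 100) {
  const batches = [];
  for (let i = 0; i < commands.length; i += size) {
    batches.push(commands.slice(i, i + size));
  }
  return batches;
}

// Collect the indices of failed commands in a pipeline response,
// where failures come back as objects with an "error" field.
function findErrors(results) {
  return results.flatMap((result, index) =>
    result !== null && typeof result === "object" &&
    !Array.isArray(result) && "error" in result
      ? [{ index, error: result.error }]
      : []
  );
}

const batches = chunkCommands(new Array(250).fill(["GET", "key1"]), 100);
console.log(batches.length); // 3 (100 + 100 + 50 commands)

const errors = findErrors(["OK", null, { error: "ERR value is not an integer or out of range" }]);
console.log(errors); // [{ index: 2, error: "ERR value is not an integer or out of range" }]
```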