Objective
Extract and transform Solana blockchain data into structured formats for external analysis, reporting, and integration with third-party systems.
Solar Sentra supports multiple extraction patterns:
Batch Extraction
Export historical data in bulk for offline analysis
Streaming Extraction
Real-time data pipeline for live monitoring systems
On-Demand Extraction
Query-based extraction for specific data requirements
JSON
Standard format for API integrations and web applications.
import fs from 'fs';

const data = await client.export.toJSON({
  wallet: '7xKXtg2CW87...',
  dataType: 'transactions',
  startDate: '2025-01-01',
  endDate: '2025-10-16'
});

// Persist the exported records as pretty-printed JSON
fs.writeFileSync('transactions.json', JSON.stringify(data, null, 2));
CSV
Optimized for Excel, Google Sheets, and data analysis tools.
const csv = await client.export.toCSV({
  wallet: '7xKXtg2CW87...',
  fields: ['timestamp', 'type', 'amount', 'token', 'counterparty'],
  delimiter: ',',
  includeHeaders: true
});
Parquet
High-performance columnar format for big data processing.
import pyarrow.parquet as pq

data = client.export.to_parquet(
    wallet="7xKXtg2CW87...",
    compression="snappy"
)
pq.write_table(data, "transactions.parquet")
Data Pipelines
Direct Database Integration
Stream data directly to PostgreSQL, MongoDB, or MySQL.
pipeline:
  source:
    type: solar-sentra
    wallets:
      - 7xKXtg2CW87d97TXJSDpbD5jBkheTqA83TZRuJosgAsU
  transform:
    - deduplicate: signature
    - enrich: token_metadata
    - filter: amount > 0.01
  destination:
    type: postgresql
    connection: postgresql://user:pass@localhost/analytics
    table: solana_transactions
    mode: upsert
    primary_key: signature
Execution (the cron expression schedules the pipeline to run every five minutes):
solar-sentra pipeline run config.yaml --schedule "*/5 * * * *"
Advanced Queries
GraphQL Interface
Complex queries with nested relationships.
query ExtractWalletData($address: String!, $from: DateTime!) {
  wallet(address: $address) {
    transactions(from: $from) {
      signature
      blockTime
      amount
      fee
      instructions {
        programId
        accounts
        data
      }
      tokenTransfers {
        mint
        from
        to
        amount
      }
    }
  }
}
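A query like this can be submitted as an ordinary GraphQL POST request. The minimal sketch below assumes a GraphQL endpoint URL and an API-key header; both are placeholders for illustration, not documented values.

// Illustrative only: the endpoint URL and Authorization header are assumptions.
const query = /* GraphQL */ `
  query ExtractWalletData($address: String!, $from: DateTime!) {
    wallet(address: $address) {
      transactions(from: $from) {
        signature
        blockTime
        amount
      }
    }
  }
`;

const response = await fetch('https://api.solarsentra.example/graphql', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'Authorization': 'Bearer YOUR_API_KEY' // placeholder credential
  },
  body: JSON.stringify({
    query,
    variables: { address: '7xKXtg2CW87...', from: '2025-09-01T00:00:00Z' }
  })
});

const { data } = await response.json();
console.log(`Extracted ${data.wallet.transactions.length} transactions`);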
SQL-Like Queries
Use familiar SQL syntax for data extraction.
SELECT
  signature,
  block_time,
  amount,
  token_symbol,
  type
FROM transactions
WHERE
  wallet = '7xKXtg2CW87...'
  AND block_time > '2025-09-01'
  AND amount > 100
ORDER BY block_time DESC
LIMIT 1000
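The documentation does not show how this query is submitted. The sketch below assumes a hypothetical client.export.query() method that accepts a SQL string and resolves to an array of row objects; the method name and return shape are illustrative, not a documented API.

// Hypothetical invocation: `client.export.query` and the row shape are
// assumptions for illustration only.
const rows = await client.export.query(`
  SELECT signature, block_time, amount, token_symbol, type
  FROM transactions
  WHERE wallet = '7xKXtg2CW87...'
    AND block_time > '2025-09-01'
    AND amount > 100
  ORDER BY block_time DESC
  LIMIT 1000
`);

for (const row of rows) {
  console.log(`${row.signature}: ${row.amount} ${row.token_symbol}`);
}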
Streaming Pipeline
Real-time data extraction via event streams.
import { createWriteStream } from 'fs';
import { Transform } from 'stream';

const stream = client.export.createStream({
  wallet: '7xKXtg2CW87...',
  format: 'ndjson' // Newline-delimited JSON
});

const transform = new Transform({
  objectMode: true,
  transform(chunk, encoding, callback) {
    // Custom transformation
    const processed = {
      ...chunk,
      usd_value: chunk.amount * chunk.price
    };
    callback(null, JSON.stringify(processed) + '\n');
  }
});

const output = createWriteStream('stream_output.jsonl');

stream
  .pipe(transform)
  .pipe(output)
  .on('finish', () => console.log('Export complete'));
Data Enrichment
Enhance extracted data with additional context.
enriched_data = client.export.with_enrichment(
    wallet="7xKXtg2CW87...",
    enrichments=[
        "token_prices",
        "token_metadata",
        "counterparty_labels",
        "dex_trade_info",
        "nft_metadata"
    ]
)

# Each transaction now includes enriched fields
for tx in enriched_data:
    print(f"{tx.signature}: {tx.token_name} @ ${tx.usd_price}")
Compression Options
Optimize storage for large datasets.
| Format | Compression Ratio | Extraction Speed | Use Case |
|---|---|---|---|
| gzip | 10:1 | Medium | General purpose |
| brotli | 12:1 | Slow | Maximum compression |
| lz4 | 5:1 | Fast | Real-time processing |
| zstd | 11:1 | Fast | Balanced performance |
await client.export.toFile({
  wallet: '7xKXtg2CW87...',
  format: 'json',
  compression: 'gzip',
  output: 'data.json.gz'
});
Scheduled Exports
Automate recurring data exports.
await client.export.schedule({
  name: 'daily_wallet_export',
  wallet: '7xKXtg2CW87...',
  format: 'csv',
  destination: 's3://mybucket/exports/',
  schedule: '0 0 * * *', // Daily at midnight
  retention: '30d'
});
Parallel Extraction
Split large exports across multiple workers for faster processing (see the sketch after the incremental example below).
Incremental Extraction
Extract only new data since last export using checkpoints.
Filtered Extraction
Reduce payload size by filtering at source.
const incrementalExport = await client.export.incremental({
  wallet: '7xKXtg2CW87...',
  checkpoint: 'last_signature_hash',
  batchSize: 1000
});
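For the parallel extraction pattern described above, a minimal sketch is to partition the export by date range and run the partitions concurrently. It reuses the documented toJSON options; the month-sized partitions and the assumption that toJSON resolves to an array of records are illustrative, not documented behavior.

// Illustrative sketch of parallel extraction: split the export into
// date-range partitions and run them concurrently with Promise.all.
// The date boundaries are arbitrary examples.
const ranges = [
  { startDate: '2025-07-01', endDate: '2025-07-31' },
  { startDate: '2025-08-01', endDate: '2025-08-31' },
  { startDate: '2025-09-01', endDate: '2025-09-30' }
];

const parts = await Promise.all(
  ranges.map(range =>
    client.export.toJSON({
      wallet: '7xKXtg2CW87...',
      dataType: 'transactions',
      ...range
    })
  )
);

// Assumes each export resolves to an array of records
const allTransactions = parts.flat();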
All extraction operations support resuming: an interrupted export can continue from the last checkpoint.
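As a rough sketch of how resuming might look in practice, the example below persists the last processed signature to a local file and feeds it back into the documented incremental export on the next run. The checkpoint file name, the record shape (an array with a signature field), and the interpretation of the checkpoint option as a signature value are all assumptions, not documented behavior.

import fs from 'fs';

// Assumed checkpoint file; not part of the documented API.
const CHECKPOINT_FILE = 'checkpoint.txt';

// Read the previous checkpoint if one exists.
const lastSignature = fs.existsSync(CHECKPOINT_FILE)
  ? fs.readFileSync(CHECKPOINT_FILE, 'utf8').trim()
  : undefined;

const batch = await client.export.incremental({
  wallet: '7xKXtg2CW87...',
  checkpoint: lastSignature, // assumes the option accepts a signature value
  batchSize: 1000
});

// Persist the newest signature so an interrupted or later run resumes here.
// Assumes the batch is an array of records with a `signature` field.
if (batch.length > 0) {
  fs.writeFileSync(CHECKPOINT_FILE, batch[batch.length - 1].signature);
}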