Objective

Extract and transform Solana blockchain data into structured formats for external analysis, reporting, and integration with third-party systems.

Extraction Methods

Solar Sentra supports three extraction patterns, sketched in code after this list:

Batch Extraction
Export historical data in bulk for offline analysis.

Streaming Extraction
Real-time data pipelines for live monitoring systems.

On-Demand Extraction
Query-based extraction for specific data requirements.
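
All three patterns go through the same client. A minimal sketch of how they map onto the methods used later in this guide; the `filter` parameter in the on-demand call is a hypothetical illustration, not a documented option:

// Batch: bulk historical export (detailed under "Export Formats")
const batch = await client.export.toJSON({
  wallet: '7xKXtg2CW87...',
  dataType: 'transactions',
  startDate: '2025-01-01',
  endDate: '2025-10-16'
});

// Streaming: live event pipeline (detailed under "Streaming Pipeline")
const live = client.export.createStream({
  wallet: '7xKXtg2CW87...',
  format: 'ndjson'
});

// On-demand: a narrow, query-scoped pull; `filter` is a hypothetical parameter
const recent = await client.export.toJSON({
  wallet: '7xKXtg2CW87...',
  dataType: 'transactions',
  filter: { type: 'swap' }
});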

Export Formats

JSON

Standard format for API integrations and web applications.
import fs from 'fs';

const data = await client.export.toJSON({
  wallet: '7xKXtg2CW87...',
  dataType: 'transactions',
  startDate: '2025-01-01',
  endDate: '2025-10-16'
});

fs.writeFileSync('transactions.json', JSON.stringify(data, null, 2));

CSV

Optimized for Excel, Google Sheets, and data analysis tools.
import fs from 'fs';

const csv = await client.export.toCSV({
  wallet: '7xKXtg2CW87...',
  fields: ['timestamp', 'type', 'amount', 'token', 'counterparty'],
  delimiter: ',',
  includeHeaders: true
});

// Persist the result; toCSV is assumed to resolve to a CSV string
fs.writeFileSync('transactions.csv', csv);

Parquet

High-performance columnar format for big data processing.
import pyarrow.parquet as pq

data = client.export.to_parquet(
    wallet="7xKXtg2CW87...",
    compression="snappy"
)

pq.write_table(data, "transactions.parquet")

Data Pipelines

Direct Database Integration

Stream data directly to PostgreSQL, MongoDB, or MySQL.
pipeline:
  source:
    type: solar-sentra
    wallets:
      - 7xKXtg2CW87d97TXJSDpbD5jBkheTqA83TZRuJosgAsU
    
  transform:
    - deduplicate: signature
    - enrich: token_metadata
    - filter: amount > 0.01
    
  destination:
    type: postgresql
    connection: postgresql://user:pass@localhost/analytics
    table: solana_transactions
    mode: upsert
    primary_key: signature
Execution:
solar-sentra pipeline run config.yaml --schedule "*/5 * * * *"  # run every five minutes

Advanced Queries

GraphQL Interface

Complex queries with nested relationships.
query ExtractWalletData($address: String!, $from: DateTime!) {
  wallet(address: $address) {
    transactions(from: $from) {
      signature
      blockTime
      amount
      fee
      
      instructions {
        programId
        accounts
        data
      }
      
      tokenTransfers {
        mint
        from
        to
        amount
      }
    }
  }
}
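
A sketch of executing this query from the TypeScript client. The `client.graphql` method name, its signature, and the shape of the result are assumptions shown only to illustrate the pattern; consult the client reference for the actual entry point:

// Trimmed version of the ExtractWalletData query above
const query = `
  query ExtractWalletData($address: String!, $from: DateTime!) {
    wallet(address: $address) {
      transactions(from: $from) { signature blockTime amount fee }
    }
  }
`;

// Hypothetical invocation, assumed to resolve to the data payload
const result = await client.graphql(query, {
  address: '7xKXtg2CW87...',
  from: '2025-09-01T00:00:00Z'
});

console.log(`Fetched ${result.wallet.transactions.length} transactions`);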

SQL-Like Queries

Use familiar SQL syntax for data extraction.
SELECT 
  signature,
  block_time,
  amount,
  token_symbol,
  type
FROM transactions
WHERE 
  wallet = '7xKXtg2CW87...'
  AND block_time > '2025-09-01'
  AND amount > 100
ORDER BY block_time DESC
LIMIT 1000
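
A sketch of running the same statement programmatically; the `client.sql` method and the returned row shape are assumptions, shown only to illustrate the pattern:

// Hypothetical: method name and row shape are assumptions
const rows = await client.sql(`
  SELECT signature, block_time, amount, token_symbol, type
  FROM transactions
  WHERE wallet = '7xKXtg2CW87...'
    AND block_time > '2025-09-01'
    AND amount > 100
  ORDER BY block_time DESC
  LIMIT 1000
`);

for (const row of rows) {
  console.log(row.signature, row.block_time, row.amount);
}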

Streaming Pipeline

Real-time data extraction via event streams.
import { createWriteStream } from 'fs';
import { Transform } from 'stream';

const stream = client.export.createStream({
  wallet: '7xKXtg2CW87...',
  format: 'ndjson'  // Newline-delimited JSON
});

const transform = new Transform({
  objectMode: true,
  transform(chunk, encoding, callback) {
    // Custom transformation
    const processed = {
      ...chunk,
      usd_value: chunk.amount * chunk.price
    };
    callback(null, JSON.stringify(processed) + '\n');
  }
});

const output = createWriteStream('stream_output.jsonl');

stream
  .pipe(transform)
  .pipe(output)
  .on('finish', () => console.log('Export complete'));

Data Enrichment

Enhance extracted data with additional context.
enriched_data = client.export.with_enrichment(
    wallet="7xKXtg2CW87...",
    enrichments=[
        "token_prices",
        "token_metadata",
        "counterparty_labels",
        "dex_trade_info",
        "nft_metadata"
    ]
)

# Each transaction now includes enriched fields
for tx in enriched_data:
    print(f"{tx.signature}: {tx.token_name} @ ${tx.usd_price}")

Compression Options

Optimize storage for large datasets.

| Format | Compression Ratio | Extraction Speed | Use Case |
|--------|-------------------|------------------|----------|
| gzip   | 10:1              | Medium           | General purpose |
| brotli | 12:1              | Slow             | Maximum compression |
| lz4    | 5:1               | Fast             | Real-time processing |
| zstd   | 11:1              | Fast             | Balanced performance |
await client.export.toFile({
  wallet: '7xKXtg2CW87...',
  format: 'json',
  compression: 'gzip',
  output: 'data.json.gz'
});
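
Reading the compressed export back requires no Solar Sentra APIs. A minimal sketch using Node's built-in fs and zlib modules, assuming the export is a JSON array:

import { readFileSync } from 'fs';
import { gunzipSync } from 'zlib';

// Decompress and parse the gzipped export produced above
const raw = gunzipSync(readFileSync('data.json.gz'));
const records = JSON.parse(raw.toString('utf8'));  // assumed to be an array
console.log(`Loaded ${records.length} records`);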

Scheduled Exports

Automate recurring data exports.
await client.export.schedule({
  name: 'daily_wallet_export',
  wallet: '7xKXtg2CW87...',
  format: 'csv',
  destination: 's3://mybucket/exports/',
  schedule: '0 0 * * *',  // Daily at midnight
  retention: '30d'
});

Performance Optimization

Parallel Extraction
Split large exports across multiple workers for faster processing (see the sketch after the incremental example below).

Incremental Extraction
Extract only new data since the last export using checkpoints.

Filtered Extraction
Reduce payload size by filtering at the source.
const incrementalExport = await client.export.incremental({
  wallet: '7xKXtg2CW87...',
  checkpoint: 'last_signature_hash',  // resume from the last exported signature
  batchSize: 1000
});
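
A sketch of the parallel pattern, approximated here with concurrent requests in a single process (a true multi-worker setup would distribute these ranges across processes). `client.export.toJSON` is the batch method shown earlier; the date ranges are illustrative:

// Split one large export into per-month ranges and run them concurrently.
// Ranges are illustrative; pick a granularity that suits the wallet's volume.
const ranges: [string, string][] = [
  ['2025-07-01', '2025-07-31'],
  ['2025-08-01', '2025-08-31'],
  ['2025-09-01', '2025-09-30']
];

const batches = await Promise.all(
  ranges.map(([startDate, endDate]) =>
    client.export.toJSON({
      wallet: '7xKXtg2CW87...',
      dataType: 'transactions',
      startDate,
      endDate
    })
  )
);

// Assumes each batch resolves to an array of transactions
const merged = batches.flat();
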
All extraction operations support resume: an interrupted export can continue from its last checkpoint.