Client-side caching

oneapi.finance already runs a stale-while-revalidate cache server-side; that is why you can hit /v1/quote?symbol=AAPL 1000 times a day and get answers instantly. But you can do better by caching on your side too.

This recipe lays out TTLs that match the underlying refresh cadence of each endpoint and shows reference implementations.

| Endpoint | Recommended client TTL | Why |
| --- | --- | --- |
| /v1/quote (during market hours) | 60 seconds | Underlying delay is ~15 minutes; sub-minute polling is wasted. |
| /v1/quote (after-hours) | 1 hour | The closing print does not move until the next session opens. |
| /v1/time_series (intraday) | 5 minutes | Most consumers refresh charts at this cadence. |
| /v1/time_series (daily/weekly/monthly) | 4 hours during market hours, 24 hours otherwise | EOD bars finalize after close. |
| /v1/statistics | 6 hours | Most fields refresh daily or quarterly. |
| /v1/profile | 7 days | Effectively static. |
| /v1/dividends | 24 hours | New events arrive on the order of weeks. |
| /v1/splits | 7 days | New events arrive on the order of months. |
| /v1/symbol_search | 24 hours | Results are stable per query. |
| /v1/fx/time_series (current) | 5 minutes | Intraday FX moves, but slowly; 5 minutes is fine for portfolio totals. |
| /v1/fx/time_series (history) | 7 days | Closed bars do not change. |

These are starting points. Your application’s tolerance for staleness is the real input — a real-time-feeling watchlist might want 30-second quote TTLs even though 60 is “objectively” enough.
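As a starting point, the quote and time-series rows of the table can be encoded in a small TTL helper. This is a sketch under stated assumptions: the function names are ours, and the market-hours check is deliberately naive — it assumes `now` is already an exchange-local datetime and ignores holidays.

```python
from datetime import datetime, time as dtime

def _market_open(now: datetime) -> bool:
    # Naive check: weekday between 9:30 and 16:00 exchange-local time.
    return now.weekday() < 5 and dtime(9, 30) <= now.time() <= dtime(16, 0)

def quote_ttl(now: datetime) -> int:
    # 60 seconds during market hours, 1 hour after-hours (per the table).
    return 60 if _market_open(now) else 3600

def time_series_ttl(interval: str, now: datetime) -> int:
    # Intraday intervals: 5 minutes. Daily and coarser: 4h in-session, 24h otherwise.
    if interval in ("1min", "5min", "15min", "30min", "1h"):
        return 300
    return 4 * 3600 if _market_open(now) else 24 * 3600
```

A real deployment would want proper timezone handling and an exchange holiday calendar; the point is only that the TTL should be a function of the clock, not a constant.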

Choosing a cache layer

| Scale | Suggested cache |
| --- | --- |
| Single-machine CLI / cron job | SQLite |
| One-server web app | In-memory LRU + Redis fallback |
| Multi-instance app | Redis or Memcached |
| Edge-deployed app (Vercel, Cloudflare Workers) | KV or Workers Cache + per-region in-memory |

The goal is to avoid stampedes: when a popular symbol falls out of cache, ten clients should not simultaneously request it from us.
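The in-memory LRU layer from the table can be sketched with nothing but the standard library. This is a minimal illustration, not a production cache: it is not thread-safe, and a one-server web app would typically pair it with a Redis fallback so restarts do not start cold.

```python
import time
from collections import OrderedDict

class TTLCache:
    """Tiny in-memory cache: per-entry TTL plus least-recently-used eviction."""

    def __init__(self, maxsize: int = 1024):
        self.maxsize = maxsize
        self._data: OrderedDict[str, tuple[float, object]] = OrderedDict()

    def get(self, key: str):
        entry = self._data.get(key)
        if entry is None:
            return None
        expires_at, value = entry
        if expires_at <= time.time():
            del self._data[key]  # lazily drop expired entries
            return None
        self._data.move_to_end(key)  # mark as recently used
        return value

    def set(self, key: str, value, ttl: int) -> None:
        self._data[key] = (time.time() + ttl, value)
        self._data.move_to_end(key)
        while len(self._data) > self.maxsize:
            self._data.popitem(last=False)  # evict least recently used
```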

SQLite reference (Python)

```python
import json
import sqlite3
import time
from typing import Callable

import httpx  # used by the example at the bottom

API_KEY = "your-api-key"


class SqliteCache:
    def __init__(self, path: str = "oneapi_cache.sqlite"):
        self.db = sqlite3.connect(path)
        self.db.execute("""
            CREATE TABLE IF NOT EXISTS cache (
                key TEXT PRIMARY KEY,
                value TEXT NOT NULL,
                expires_at INTEGER NOT NULL
            )
        """)

    def get_or_fetch(self, key: str, ttl: int, fetch: Callable[[], dict]) -> dict:
        row = self.db.execute(
            "SELECT value, expires_at FROM cache WHERE key = ?", (key,)
        ).fetchone()
        if row and row[1] > time.time():
            return json.loads(row[0])  # cache hit, still fresh
        fresh = fetch()
        self.db.execute(
            "INSERT OR REPLACE INTO cache VALUES (?, ?, ?)",
            (key, json.dumps(fresh), int(time.time()) + ttl),
        )
        self.db.commit()
        return fresh


# Usage:
cache = SqliteCache()

def cached_quote(symbol: str) -> dict:
    return cache.get_or_fetch(
        f"quote:{symbol}",
        ttl=60,
        fetch=lambda: httpx.get(
            "https://api.oneapi.finance/v1/quote",
            params={"symbol": symbol},
            headers={"Authorization": f"Bearer {API_KEY}"},
        ).json(),
    )
```
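One housekeeping note: the SQLite file grows without bound, since expired rows are skipped but never deleted. A sketch of an occasional sweep (the function name is ours; it assumes the `cache` table schema above):

```python
import sqlite3
import time

def purge_expired(db: sqlite3.Connection) -> int:
    """Delete rows whose TTL has elapsed; returns the number removed."""
    cur = db.execute(
        "DELETE FROM cache WHERE expires_at <= ?", (int(time.time()),)
    )
    db.commit()
    return cur.rowcount
```

Running this at the start of a cron job is usually enough.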

Redis reference (Node)

```js
import { createClient } from "redis";

const redis = createClient();
await redis.connect();

async function getOrFetch(key, ttlSec, fetcher) {
  const cached = await redis.get(key);
  if (cached) return JSON.parse(cached);
  const fresh = await fetcher();
  await redis.set(key, JSON.stringify(fresh), { EX: ttlSec });
  return fresh;
}

const quote = await getOrFetch(
  "oneapi:quote:AAPL",
  60,
  async () => {
    const r = await fetch("https://api.oneapi.finance/v1/quote?symbol=AAPL", {
      headers: { Authorization: `Bearer ${process.env.ONEAPI_KEY}` },
    });
    return r.json();
  },
);
```

Stampede protection

A naive cache lets every concurrent miss trigger an upstream call. Add a single-flight lock:

```python
import threading

class SingleFlight:
    """Hands out one lock per cache key, so concurrent misses serialize."""

    def __init__(self):
        self._locks: dict[str, threading.Lock] = {}
        self._guard = threading.Lock()

    def lock_for(self, key: str) -> threading.Lock:
        with self._guard:
            if key not in self._locks:
                self._locks[key] = threading.Lock()
            return self._locks[key]


flight = SingleFlight()

def cached_quote(symbol: str) -> dict:
    # cache.peek / cache.set / upstream_fetch are stand-ins for your cache
    # layer's read, write, and API-call helpers.
    key = f"quote:{symbol}"
    cached = cache.peek(key)
    if cached:
        return cached
    with flight.lock_for(key):
        # Re-check after acquiring the lock: another thread may have
        # populated the cache while we were waiting.
        cached = cache.peek(key)
        if cached:
            return cached
        fresh = upstream_fetch(symbol)
        cache.set(key, fresh, ttl=60)
        return fresh
```

Stale-while-revalidate

When freshness is “soon enough” but you want zero perceived latency, return the cached value and trigger a background refresh:

```js
async function swr(key, ttlSec, staleSec, fetcher) {
  const wrapped = await redis.hGetAll(key);
  const value = wrapped.value ? JSON.parse(wrapped.value) : null;
  const fetchedAt = Number(wrapped.fetchedAt || 0);
  const age = Date.now() / 1000 - fetchedAt;
  if (value && age < ttlSec) return value; // fresh: serve directly

  const refresh = (async () => {
    const fresh = await fetcher();
    await redis.hSet(key, {
      value: JSON.stringify(fresh),
      fetchedAt: String(Math.floor(Date.now() / 1000)),
    });
    return fresh;
  })();

  if (value && age < staleSec) {
    refresh.catch(() => {}); // fire-and-forget; errors surface on the next hard miss
    return value;           // stale but tolerable: serve immediately
  }
  return refresh; // too stale or missing: block on the fetch
}
```

See also