Build a fundamentals screener

A screener is a filter over the universe of investable instruments. With /v1/stocks to enumerate and /v1/statistics to evaluate each name, you can build a custom screener in under a hundred lines of code.

This guide walks through a “GARP” screener: growth at a reasonable price.

The screen

Filter for US common stocks with:

  • Market cap > $5B (no microcaps)
  • P/E (trailing) between 5 and 25 (screens out unprofitable names at the low end and stretched valuations at the high end)
  • ROE > 15% (capital efficiency)
  • 5-year EBITDA growth > 10% (the “growth” half — proxied by recent revenue)
  • Dividend yield > 1% (real return cushion)

Implementation

  1. Enumerate the universe via /v1/stocks.

    import os, httpx
    from typing import Iterator

    API_KEY = os.environ["ONEAPI_KEY"]
    HEADERS = {"Authorization": f"Bearer {API_KEY}"}

    def list_us_common_stock() -> Iterator[dict]:
        """Yield every US common stock, following cursor-based pagination."""
        cursor = None
        while True:
            params = {"country": "US", "type": "Common Stock", "outputsize": 1000}
            if cursor:
                params["cursor"] = cursor
            r = httpx.get(
                "https://api.oneapi.finance/v1/stocks",
                params=params,
                headers=HEADERS,
                timeout=30.0,
            )
            r.raise_for_status()
            body = r.json()
            yield from body["instruments"]
            cursor = body.get("next_cursor")
            if cursor is None:  # no next_cursor means this was the last page
                break
  2. Evaluate each name with /v1/statistics.

    def stats(symbol: str) -> dict | None:
        """Fetch /v1/statistics for one symbol; None on network error or unknown symbol."""
        try:
            r = httpx.get(
                "https://api.oneapi.finance/v1/statistics",
                params={"symbol": symbol},
                headers=HEADERS,
                timeout=10.0,
            )
        except httpx.HTTPError:
            return None
        if r.status_code == 404:
            return None
        r.raise_for_status()
        return r.json()
  3. Apply the filter. The growth criterion is left out for now (it waits on the planned /v1/financials endpoint, see Going further); the other four checks come straight from the statistics payload.

    def passes(s: dict | None) -> bool:
        """True if the statistics payload clears every threshold."""
        if not s:
            return False
        checks = [
            (s.get("marketCap") or 0) > 5_000_000_000,
            5 < (s.get("trailingPe") or -1) < 25,
            (s.get("roe") or 0) > 0.15,
            (s.get("dividendYield") or 0) > 0.01,
        ]
        return all(checks)

    def screen() -> list[dict]:
        results = []
        for inst in list_us_common_stock():
            s = stats(inst["symbol"])
            if passes(s):
                results.append({
                    "symbol": inst["symbol"],
                    "name": inst["name"],
                    "market_cap": s["marketCap"],
                    "pe": s["trailingPe"],
                    "roe": s["roe"],
                    "yield": s["dividendYield"],
                })
        return sorted(results, key=lambda x: x["market_cap"], reverse=True)
  4. Print or persist results.

    if __name__ == "__main__":
        hits = screen()
        print(f"{len(hits)} names pass the screen.")
        for h in hits[:25]:
            print(f"{h['symbol']:<8} {h['name']:<32} "
                  f"PE={h['pe']:>5.1f} ROE={h['roe']*100:>5.1f}% "
                  f"Yield={h['yield']*100:>4.2f}%")

Quota math

The US universe is roughly 6,500 active common stocks. The script above makes:

  • ~7 calls to /v1/stocks (1,000 per page).
  • ~6,500 calls to /v1/statistics (one per symbol).

That is about 6,500 calls per run. On the Indie tier (100,000 calls/month) a weekly run fits comfortably, and with the caching described below a daily run does too; 30 uncached daily runs would be roughly 195,000 calls and blow the quota. On the free tier (1,000/month), a single run will not finish.

For a heavier schedule, restrict the universe with cheaper filters first (market-cap floor via a precomputed local table, sector restriction, exchange restriction) so you only call /v1/statistics for plausible candidates.
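
A minimal sketch of that pre-filter, assuming you maintain a small local CSV of rough market caps; the file name and column names here are made up for illustration, and only list_us_common_stock comes from step 1:

    import csv

    def load_rough_caps(path: str = "rough_caps.csv") -> dict[str, float]:
        # Hypothetical local table: one row per symbol with an approximate market cap,
        # refreshed occasionally from whatever cheap source you already have.
        with open(path, newline="") as f:
            return {row["symbol"]: float(row["market_cap"]) for row in csv.DictReader(f)}

    def plausible_candidates(floor: float = 5_000_000_000):
        caps = load_rough_caps()
        for inst in list_us_common_stock():
            # Only names at or above the rough floor go on to /v1/statistics.
            if caps.get(inst["symbol"], 0) >= floor:
                yield inst

screen() can then iterate plausible_candidates() instead of list_us_common_stock(), leaving the rest of the pipeline unchanged.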

Caching

Statistics for individual names refresh quarterly (with the issuer's filings). Caching results for 7 days locally cuts calls by an order of magnitude. See the caching recipe.

    import json, sqlite3, time

    CACHE_TTL = 7 * 86400  # seven days, in seconds

    def cached_stats(symbol: str) -> dict | None:
        db = sqlite3.connect("screener_cache.db")
        db.execute(
            "CREATE TABLE IF NOT EXISTS stats (symbol TEXT PRIMARY KEY, data TEXT, fetched_at INT)"
        )
        row = db.execute(
            "SELECT data, fetched_at FROM stats WHERE symbol = ?", (symbol,)
        ).fetchone()
        if row and time.time() - row[1] < CACHE_TTL:
            return json.loads(row[0])  # fresh cache hit, no API call
        fresh = stats(symbol)
        if fresh is not None:
            db.execute(
                "INSERT OR REPLACE INTO stats VALUES (?, ?, ?)",
                (symbol, json.dumps(fresh), int(time.time())),
            )
            db.commit()
        return fresh
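
To put the cache to work, have screen() call cached_stats() instead of stats(); nothing else in the pipeline changes.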

Going further

  • Add European listings by enumerating each country separately (country=DE, country=GB, etc.); see the sketch after this list.
  • Add a minimum-revenue check by joining against a future GET /v1/financials endpoint (roadmap).
  • Persist results in Postgres so you can run change-over-time analysis (“what fell out of the screen this month?”).
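
For the European listings above, a sketch that reuses the imports, HEADERS, and pagination loop from step 1. The country=DE and country=GB values come from the bullet; the other codes are illustrative, and whether every market labels its shares "Common Stock" is an assumption worth verifying:

    def list_common_stock(country: str) -> Iterator[dict]:
        # Same cursor loop as step 1, parameterized by country code.
        cursor = None
        while True:
            params = {"country": country, "type": "Common Stock", "outputsize": 1000}
            if cursor:
                params["cursor"] = cursor
            r = httpx.get(
                "https://api.oneapi.finance/v1/stocks",
                params=params,
                headers=HEADERS,
                timeout=30.0,
            )
            r.raise_for_status()
            body = r.json()
            yield from body["instruments"]
            cursor = body.get("next_cursor")
            if cursor is None:
                break

    def list_european_common_stock() -> Iterator[dict]:
        for country in ("DE", "GB", "FR", "NL"):  # extend with the markets you care about
            yield from list_common_stock(country)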

See also