DEVELOPERS
Build with Latence in minutes.
Real components shipping today. A typed SDK, a REST API, a CLI, and the open-source retrieval stack underneath them.
pip install latence
Pipeline to retrieval, in one file.
Three ways to integrate. The SDK, a plain REST call, or the CLI. Same concepts, same primitives.
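The REST path carries the same payloads as the SDK. A minimal sketch of what that POST could look like, assuming a hypothetical /v1/pipeline/run route, host, and field names (illustrative only; the documented API may differ):

```python
import json
from urllib import request

# Assumption: host, route, and body fields below are illustrative,
# not the documented Latence REST API.
API = "https://api.latence.example/v1"

def pipeline_run_request(files, api_key):
    """Build the HTTP request corresponding to a pipeline run call."""
    body = json.dumps({"files": files}).encode()
    return request.Request(
        f"{API}/pipeline/run",
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = pipeline_run_request(["contract_q4.pdf", "pricing.xlsx"], "YOUR_KEY")
# urllib.request.urlopen(req) would send it; omitted here.
```

The SDK path wraps the same concepts in typed calls: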
from latence import Latence

client = Latence()

# 1. Run the intelligence pipeline
pkg = client.pipeline.run(
    files=["contract_q4.pdf", "pricing.xlsx"]
)

# 2. Ingest the retrieval package into voyager-index
index = client.search.ingest(pkg, graph_sidecar=True)

# 3. Query with MaxSim, hybrid BM25, optional rerank
results = index.search(
    "Q4 renewal obligations", k=10, rerank=True
)
for r in results:
    print(r.score, r.source, r.snippet)
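MaxSim, named in step 3, is late-interaction scoring in the ColBERT style: each query token embedding keeps only its best match among the document's token embeddings, and those maxima are summed into one relevance score. A generic sketch of the technique, not voyager-index's implementation:

```python
def maxsim(query_vecs, doc_vecs):
    """ColBERT-style MaxSim: for each query token embedding, take its
    maximum dot-product similarity over the document's token embeddings,
    then sum those maxima into a single score."""
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))
    return sum(max(dot(q, d) for d in doc_vecs) for q in query_vecs)

# Toy 2-d embeddings: doc_b covers both query tokens, doc_a covers neither well.
query = [[1.0, 0.0], [0.0, 1.0]]
doc_a = [[0.5, 0.5]]
doc_b = [[1.0, 0.0], [0.0, 1.0]]
print(maxsim(query, doc_a) < maxsim(query, doc_b))  # True
```

Because each query token matches independently, a document scores well if it covers all query terms somewhere, which is what makes MaxSim pair naturally with a term-level signal like BM25 in a hybrid ranker.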
The stack underneath is open.
Real technical proof — not slideware. Every repo is maintained and publicly benchmarked.
REPO
latenceai-dataset-intelligence
Pipeline that emits retrieval packages. Parse, chunk, extract, graph.
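The chunk stage between parse and extract can be approximated with a sliding window. A toy sketch assuming fixed-size character chunks with overlap; the pipeline's actual strategy is likely structure-aware and may differ entirely:

```python
def chunk(text: str, size: int = 400, overlap: int = 50) -> list[str]:
    """Fixed-size sliding-window chunking: an illustrative stand-in for
    the pipeline's chunk stage, not its actual algorithm."""
    step = size - overlap
    return [text[i:i + size] for i in range(0, max(len(text) - overlap, 1), step)]

parts = chunk("x" * 900)
print([len(p) for p in parts])  # [400, 400, 200]
```

The overlap keeps a sentence that straddles a boundary retrievable from both neighboring chunks, at the cost of a little index redundancy.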