Layered. Governed. Hybrid.
cgiCore is more than retrieval and prompt stuffing. Each layer contributes a different kind of signal, and a governance layer reconciles them into the context your AI actually sees.
End-to-end architecture.
From connectors to downstream agents, every layer is explicit and every boundary is a governed contract. Scroll through the diagram, then read each layer in plain English below.
Each layer earns its place.
cgiCore is hybrid by design: it combines reasoning, semantic retrieval, pattern signal, and governed assembly. No single layer decides the answer.
Graph
A knowledge graph holds entities, their traits, and their relationships. It answers questions about topology — what connects to what, what depends on what, what governs what.
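The kind of topology question this layer answers can be sketched in a few lines. This is a minimal illustration over a typed edge list, not cgiCore's actual schema; all entity and relation names here are made up.

```python
from collections import defaultdict

# Illustrative typed edges: (source, relation, target).
edges = [
    ("billing-svc", "depends_on", "auth-svc"),
    ("auth-svc", "depends_on", "user-db"),
    ("data-policy", "governs", "user-db"),
]

adjacency = defaultdict(list)
for src, rel, dst in edges:
    adjacency[src].append((rel, dst))

def transitive(entity, relation):
    """Follow one relation type transitively: e.g. everything an entity depends on."""
    seen, stack = set(), [entity]
    while stack:
        node = stack.pop()
        for rel, dst in adjacency[node]:
            if rel == relation and dst not in seen:
                seen.add(dst)
                stack.append(dst)
    return seen

# "What does billing-svc depend on, directly or indirectly?"
print(transitive("billing-svc", "depends_on"))  # auth-svc and user-db
```

Because relations are typed, "what depends on what" and "what governs what" are just different traversals over the same structure.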
Vector
A semantic index finds passages and entities by meaning, not by keyword. Cross-encoder reranking brings the best candidates to the top.
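The two-stage shape of this layer — broad recall by embedding similarity, then precise reranking of the shortlist — can be sketched as follows. The scoring functions are stand-ins for real models, and the first stage is exhaustive here where a production index would use approximate nearest-neighbor search.

```python
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

def retrieve(query_vec, index, k=50):
    # Stage 1: recall candidates by embedding similarity (cheap per pair).
    ranked = sorted(index, key=lambda doc: cosine(query_vec, doc["vec"]), reverse=True)
    return ranked[:k]

def rerank(query_text, candidates, cross_encoder_score):
    # Stage 2: a cross-encoder reads query and passage together, so it is
    # slower but more precise; only the shortlist pays that cost.
    return sorted(candidates,
                  key=lambda d: cross_encoder_score(query_text, d["text"]),
                  reverse=True)
```

The design point: the cheap stage decides what gets considered, the expensive stage decides what comes first.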
Reasoning
A reasoning engine enforces logical consistency. It surfaces contradictions, checks completeness, and runs inference — the logical backbone of trust.
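At its simplest, surfacing a contradiction means noticing that two facts assert different values for the same thing. The sketch below shows that core check over attribute triples; a real reasoning engine also handles inference rules and completeness, and these fact values are invented for illustration.

```python
def find_contradictions(facts):
    """facts: iterable of (entity, attribute, value) triples.
    Returns ((entity, attribute), first_value, conflicting_value) tuples."""
    seen = {}
    conflicts = []
    for entity, attribute, value in facts:
        key = (entity, attribute)
        if key in seen and seen[key] != value:
            conflicts.append((key, seen[key], value))
        seen.setdefault(key, value)
    return conflicts

facts = [
    ("invoice-42", "status", "paid"),
    ("invoice-42", "status", "overdue"),  # conflicts with the line above
]
print(find_contradictions(facts))  # [(('invoice-42', 'status'), 'paid', 'overdue')]
```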
ML Layer
An ML layer produces pattern embeddings and similarity scores — a supporting signal for ranking and resolution, never the autonomous final judge.
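One way to read "supporting signal, never the final judge": similarity contributes a weighted term to the ranking score, while policy can exclude a candidate outright before scoring even happens. A minimal sketch, with weights and field names chosen purely for illustration:

```python
def rank(candidates, policy_allows, weight_sim=0.3, weight_rel=0.7):
    """Similarity is a bounded contribution to the score; policy is a gate."""
    allowed = [c for c in candidates if policy_allows(c)]  # authority acts first
    return sorted(
        allowed,
        key=lambda c: weight_rel * c["relevance"] + weight_sim * c["pattern_sim"],
        reverse=True,
    )
```

A candidate with perfect pattern similarity still loses to policy, and a modest similarity score cannot outvote strong relevance on its own.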
Governance
Provenance, policy, contradiction workflow, and audit are a layer, not an afterthought. Every fact sourced. Every reconciliation routed. Every step traced.
Proxy & packing
cgiCore acts as the LLM proxy for your stack: it injects assembled context, routes the call to the model you choose (frontier for reasoning, efficient models for bulk mapping work), captures completions on writeback, and packs to each call’s token budget.
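The proxy flow described above can be sketched end to end. Every interface here is an assumption for illustration — the function names, the two-tier model split, and the writeback hook are stand-ins, not cgiCore's API.

```python
def proxy_call(task, prompt, assemble_context, models, writeback, budget_tokens):
    """Assemble context, route to a model tier, capture the completion."""
    context = assemble_context(prompt, budget_tokens)       # packed to the call's budget
    tier = "frontier" if task == "reasoning" else "efficient"  # route by workload
    completion = models[tier](context + "\n\n" + prompt)    # single call through the proxy
    writeback(prompt, completion)                           # capture for audit / reuse
    return completion
```

Because the proxy sits on the call path, context injection, routing, and writeback all happen in one place rather than being scattered across application code.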
What makes this multi-layered and governed.
Three details worth dwelling on for architects evaluating cgiCore.
Reasoning is the backbone.
The ML layer contributes signal for ranking and resolution. It does not replace reasoning or policy — those remain the authority on what is true and what is allowed.
Provenance isn’t a log. It’s an index.
Every fact references its extraction run, evidence, and authoring session. When a model cites something, your teams can follow the citation all the way to the original source, in real time.
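"Index, not log" means a citation resolves by direct lookup rather than by searching through event history. A minimal sketch of that shape — every record, field name, and value below is invented for illustration:

```python
# Each fact keys directly into its extraction run and evidence.
facts = {
    "fact-901": {"claim": "SLA is 99.9%", "run": "run-17", "evidence": "ev-3"},
}
runs = {
    "run-17": {"source_doc": "contracts/acme-msa.pdf", "extracted_at": "2024-06-01"},
}
evidence = {
    "ev-3": {"quote": "Uptime commitment: 99.9%", "page": 12},
}

def resolve_citation(fact_id):
    """Follow a cited fact to its source document and supporting quote."""
    fact = facts[fact_id]
    return {
        "claim": fact["claim"],
        "source": runs[fact["run"]]["source_doc"],
        "quote": evidence[fact["evidence"]]["quote"],
    }

print(resolve_citation("fact-901")["source"])  # contracts/acme-msa.pdf
```

Three dictionary lookups, no scan — which is what makes "follow the citation in real time" plausible at scale.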
Token budget is a first-class constraint.
Context assembly isn’t “return the top N chunks.” cgiCore ranks by relevance and priority, condenses, and packs to a tight budget — so your completions stay fast, cheap, and focused.
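The rank-condense-pack loop can be sketched as a greedy pass over prioritized items. This is an illustration of the idea, not cgiCore's packer: token counts here are plain integers, where a real system would count with the target model's tokenizer, and the condense step is pluggable.

```python
def pack(items, budget, condense):
    """items: dicts with 'text', 'tokens', 'priority'. Greedily fill the budget
    in priority order; condense an item that won't fit whole."""
    packed, used = [], 0
    for item in sorted(items, key=lambda i: i["priority"], reverse=True):
        if used + item["tokens"] <= budget:
            packed.append(item["text"])
            used += item["tokens"]
        else:
            short = condense(item, budget - used)  # shrink to the remaining room
            if short:
                packed.append(short["text"])
                used += short["tokens"]
    return packed, used
```

The invariant is the point: `used` never exceeds `budget`, so the completion's input cost is bounded by construction rather than by hoping the top-N chunks happen to fit.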