Singapore court citation checker -- detect hallucinated cases in AI-generated legal submissions.
Verifies citations against eLitigation (Singapore's official case law database), detecting hallucinated authorities, mismatched case names, fabricated paragraph references, and distorted quotations.
npm install -g sgcite

Or run without installing:

npx sgcite

Requires Node.js 18+.
# 1. Extract citations from text or IR JSON
sgcite extract-citations --input submission.txt --output citations.json
# 2. Resolve citations to eLitigation URLs (offline)
sgcite resolve --input citations.json --output resolved.json
# 2b. Resolve SLR citations via live eLitigation search
sgcite resolve --input citations.json --output resolved.json --search
# 3. Fetch judgments from eLitigation (only network step)
sgcite fetch --input resolved.json --cache ./cache/
# 4. Check citations against cached judgments (offline)
sgcite check --input resolved.json --cache ./cache/ --output ./output/report.md
# 4b. (If review.md has items) Fill verdicts in review.json, then apply
sgcite apply-verdicts --report ./output/report.json --review ./output/review.json --output ./output/
# 5. Annotate document with check results (requires superdoc-redlines IR)
sgcite annotate --ir ir.json --report ./output/report.json --output edits.json --all-outcomes

For DOCX workflows, use superdoc-redlines to extract IR and apply edits:

node superdoc-redline.mjs extract --input submission.docx --output ir.json
# ... run sgcite pipeline above ...
node superdoc-redline.mjs apply -i submission.docx -o annotated.docx -e edits.json

| Command | Description | Network? |
|---|---|---|
| `extract-citations` | Extract case citations from text, IR JSON, or PDF | No |
| `validate-citation` | Validate a single citation string | No |
| `resolve` | Resolve citations to eLitigation URLs (offline; `--search` for online SLR) | Opt. |
| `fetch` | Fetch judgment HTML from eLitigation into cache | Yes |
| `parse-judgment` | Parse judgment HTML into structured JSON | No |
| `check` | Verify citations against cached judgments | No |
| `annotate` | Generate superdoc-redlines edit JSON from check report + IR | No |
| `apply-verdicts` | Apply agent review verdicts from review.json to report.json | No |
| `split-citations` | Split citations JSON into batches for parallel fetch | No |
| `merge-reports` | Merge multiple check report JSONs into one | No |
The default output format is Markdown (report.md). A JSON sidecar (report.json) is always co-produced by sgcite check.
For JSON-only output:
sgcite check --input resolved.json --cache ./cache/ --format json

| Status | Meaning |
|---|---|
| `VERIFIED` | Authority exists, case name matches, paragraph found. If no pinpoint paragraph is specified and no claim context exists, verification is structural only (authority + case name). |
| `PROPOSITION_UNVERIFIED` | Authority exists and case name matches, but the stated proposition has not been verified against the judgment. Requires agent review: read the full judgment to assess. |
| `AUTHORITY_NONEXISTENT` | No judgment found on eLitigation |
| `CITATION_MISMATCH` | Judgment exists but case name does not match |
| `PARAGRAPH_HALLUCINATION` | Paragraph number exceeds total paragraphs. For multi-section judgments, the message shows main-section and total paragraph counts separately. |
| `QUOTATION_FABRICATION` | Quoted text not found in judgment |
| `PARAPHRASE_DISTORTION` | Possible meaning distortion (requires review) |
| `CITATION_MALFORMED` | Citation string is structurally invalid |
| `UNVERIFIABLE` | Check incomplete (rate limited, pre-2000, maintenance, etc.) |
Non-VERIFIED annotations are applied to every occurrence of a citation in the document. VERIFIED annotations are deduplicated by default (use --all-occurrences to annotate all).
Court hierarchy mismatches (e.g. document describes "Court of Appeal" but citation resolves to High Court) are detected and noted in annotations when present.
Where the authority was located on eLitigation, annotations include a clickable hyperlink to the judgment source (requires superdoc-redlines v0.3.0+).
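As an illustration of the kind of structural test behind `CITATION_MALFORMED`, the sketch below checks whether a string looks like a well-formed Singapore neutral citation. This is a hypothetical Python sketch, not sgcite's actual implementation (which is Node.js); the court codes in the pattern mirror the coverage table further below.

```python
import re

# Hypothetical grammar for neutral citations such as "[2023] SGCA 12".
# sgcite's real validation rules may accept more (or fewer) forms.
NEUTRAL_CITATION = re.compile(
    r"\[(?P<year>(19|20)\d{2})\]\s+"                            # year in square brackets
    r"(?P<court>SG(CA|HC(\(I\)|\(A\))?|HCF|HCR|DC|MC|FC))\s+"   # court code
    r"(?P<number>\d+)"                                          # judgment number
)

def is_structurally_valid(citation: str) -> bool:
    """Return True if the string parses as a neutral citation."""
    return NEUTRAL_CITATION.fullmatch(citation.strip()) is not None
```

A string that fails this check never reaches the network steps; it is reported as malformed without consulting eLitigation.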
- Pre-2000 coverage: eLitigation does not have coverage for judgments before 2000. Citations predating 2000 are marked `UNVERIFIABLE`.
- Court-specific coverage: Not all court categories have coverage from 2000. See the table below for an estimate of the earliest available judgments per court on eLitigation.
- SLR citations: Singapore Law Reports citations require a search-based resolution step (`sgcite resolve --search`) that may not find a match. Unresolved SLR citations are marked `UNVERIFIABLE`.
- Rate limits: Default: 1 request/second, 500 requests per job. Exponential backoff on HTTP 429/5xx.
- Courts only: sgcite verifies court judgments available on eLitigation. Tribunal and board decisions (e.g., `SGPDPC`, `SGIPOS`, `SGITBR`, `SGDT`) are currently out of scope for sgcite.
- Single source: eLitigation is the sole data source. Agents should perform a supplementary web search before accepting `AUTHORITY_NONEXISTENT` findings.
- Maintenance windows: eLitigation undergoes periodic maintenance. Citations fetched during maintenance are marked `UNVERIFIABLE (source_unavailable)`.
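The rate-limit behaviour described above (exponential backoff on HTTP 429/5xx) can be sketched as follows. This is a hypothetical Python illustration, not sgcite's actual Node.js implementation; the constants and function names are assumptions.

```python
import random
import time

def backoff_delays(max_retries: int = 5, base: float = 1.0, cap: float = 60.0):
    """Yield the wait (seconds) before each retry: exponential, capped, with jitter."""
    for attempt in range(max_retries):
        delay = min(cap, base * (2 ** attempt))
        yield delay + random.uniform(0, delay * 0.1)

def fetch_with_retry(fetch_once, max_retries: int = 5, base: float = 1.0):
    """Call fetch_once() -> (status, body); back off and retry on 429/5xx."""
    for delay in backoff_delays(max_retries, base):
        status, body = fetch_once()
        if status == 429 or status >= 500:
            time.sleep(delay)
            continue
        return status, body
    # Retries exhausted: sgcite would mark the citation UNVERIFIABLE.
    raise RuntimeError("retries exhausted")
```

Capping the delay and adding jitter keeps a 500-request job from hammering eLitigation when it recovers from a transient error.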
| Code | Court | Earliest Available | Notes |
|---|---|---|---|
| `SGCA` | Court of Appeal | 2000 | Full coverage from 2000 |
| `SGHC` | High Court (General Division) | 2000 | Full coverage from 2000 |
| `SGHC(I)` | High Court (SICC) | 2015 | SICC established 2015 |
| `SGHC(A)` | High Court (Appellate Division) | 2021 | Appellate Division established 2020 |
| `SGHCF` | High Court (Family Division) | 2015 | |
| `SGHCR` | High Court (Registrar) | 2012 | Sparse before 2015 |
| `SGDC` | District Court | 2005 | Very sparse before 2019 |
| `SGMC` | Magistrate's Court | 2015 | Very sparse; few published judgments |
| `SGFC` | Family Court | 2015 | Sparse |
| `SGYC` | Youth Court | — | No published judgments on eLitigation |
| `SGAT` | State Courts Appellate Tribunal | — | No published judgments on eLitigation |
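The coverage table can be applied mechanically before fetching, to predict whether an `AUTHORITY_NONEXISTENT` result might instead reflect a coverage gap. A hypothetical sketch (Python for illustration; the mapping simply restates the table above):

```python
# Earliest year with judgments on eLitigation, per court code.
# SGYC and SGAT are omitted: no published judgments on eLitigation.
EARLIEST_AVAILABLE = {
    "SGCA": 2000, "SGHC": 2000, "SGHC(I)": 2015, "SGHC(A)": 2021,
    "SGHCF": 2015, "SGHCR": 2012, "SGDC": 2005, "SGMC": 2015,
    "SGFC": 2015,
}

def expected_on_elitigation(court: str, year: int) -> bool:
    """True if eLitigation can reasonably be expected to hold the judgment."""
    earliest = EARLIEST_AVAILABLE.get(court)
    return earliest is not None and year >= earliest
```

Note that "available from" does not mean "complete": several courts are marked sparse in the table, so a missing judgment within the covered range still warrants a supplementary search.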
Judgments are retrieved from eLitigation (elitigation.sg).
sgcite retrieves publicly available court judgments for the purpose of citation verification.
This tool is provided on an "as is" and "where is" basis, and use is entirely at your own risk.
Citation checking in legal workflows is high-risk and can materially affect legal outcomes. sgcite is a support tool, not legal advice and not a substitute for independent legal judgment. You must independently verify all outputs (including citation existence, ratios, quotations, and context) against authoritative sources before relying on them in any filing, advice, or decision.
No warranty is given as to completeness, accuracy, fitness for purpose, or uninterrupted availability. See LICENSE for full terms.
See SKILL.md for agent-oriented documentation including the full command reference, reading results guidance, and workflow instructions.
For a ready-to-use system prompt that instructs an AI agent to run the full citation-checking pipeline on a DOCX file (simulated AI-created legal submissions for resisting summary judgment in the Singapore High Court), see docs/model-agent-prompt.md.
sgcite follows an agent-as-orchestrator design. There is no built-in pipeline -- the AI agent (Claude Code, Cursor, etc.) calls each CLI command in sequence.
sgcite works alongside superdoc-redlines for DOCX annotation. They are peer CLI tools with no runtime dependency on each other. The agent orchestrates both:
superdoc-redlines:
- extract (DOCX -> IR JSON)
- read (DOCX -> text)
- apply (DOCX + edits -> DOCX)
- recompress

sgcite:
- extract-citations (text/IR -> citations JSON)
- resolve (citations -> URLs)
- fetch (URLs -> cached judgments)
- check (citations + cache -> report)
- annotate (IR + report -> edits JSON)
- apply-verdicts (review.json -> updated report)
Only sgcite fetch makes network requests. All other commands are fully offline.
Special thanks to jamescockburn47/hallucinationauditor for being an important foundation and inspiration for this project. Its clear architecture, verification-first mindset, and practical workflow helped shape the direction of sgcite.
Thanks also to legalquant/casekit, an open-source desktop application that organises civil dispute cases in England & Wales and checks legal citations against BAILII and the National Archives. It was an invaluable reference point for sgcite's citation verification infrastructure.
The project also acknowledges the work of Matthew Lee (barrister, England & Wales), whose categorisation of AI hallucinations in case law was instrumental in guiding sgcite's verification approach: AI Hallucinations in Case Law: 8 Types and How to Spot Them.