RepoGEO

REPOGEO REPORT · LITE

lumina-ai-inc/chunkr

Default branch main · commit 1bde59be · scanned 5/12/2026, 8:32:04 AM

GitHub: 2,942 stars · 182 forks

AI VISIBILITY SCORE
35 / 100
Critical
Category recall
0 / 2
Not recommended in any query
Rule findings
1 pass · 1 warn · 0 fail
Objective metadata checks
AI knows your name
3 / 3
Direct prompts that named your repo
HOW TO READ THIS REPORT

Action plan is what to do next — copy-pasteable changes prioritized by impact. Category visibility is the real GEO test: when a user asks an AI a brand-free question that should surface lumina-ai-inc/chunkr, does the AI actually recommend you — or your competitors? Objective checks verify the metadata signals AI engines weight first. Self-mention check detects whether AI even knows you exist by name.

Action plan — copy-paste fixes

3 prioritized changes generated by gemini-2.5-flash. Mark items done after you ship the fix.

OVERALL DIRECTION
  • #1 · HIGH · topics
    Add relevant topics to the repository

    Why:

    COPY-PASTE FIX
    rag, llm, document-processing, ocr, layout-analysis, semantic-chunking, pdf-parser, document-intelligence, vision-language-models
  • #2 · HIGH · readme
    Refine the README's main heading to emphasize RAG/LLM purpose

    Why:

    CURRENT
    <h3 align="center">Chunkr | Open Source Document Intelligence API</h3>
    COPY-PASTE FIX
    <h3 align="center">Chunkr | Open Source Document Intelligence API for RAG & LLM Data</h3>
  • #3 · MEDIUM · about
    Enhance the 'About' description with key functionalities

    Why:

    CURRENT
    Vision infrastructure to turn complex documents into RAG/LLM-ready data
    COPY-PASTE FIX
    Vision infrastructure for document layout analysis, OCR, and semantic chunking to turn complex documents into RAG/LLM-ready data.
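The topics and About-description fixes above can be applied from the command line with the GitHub CLI. A sketch, assuming `gh` is installed and authenticated with write access to lumina-ai-inc/chunkr:

```shell
# Add the suggested repository topics (the --add-topic flag is repeatable).
gh repo edit lumina-ai-inc/chunkr \
  --add-topic rag --add-topic llm --add-topic document-processing \
  --add-topic ocr --add-topic layout-analysis --add-topic semantic-chunking \
  --add-topic pdf-parser --add-topic document-intelligence \
  --add-topic vision-language-models

# Replace the 'About' description with the suggested copy.
gh repo edit lumina-ai-inc/chunkr \
  --description "Vision infrastructure for document layout analysis, OCR, and semantic chunking to turn complex documents into RAG/LLM-ready data."
```

The README heading change (#2) is an ordinary edit to README.md and ships through a normal commit or pull request rather than `gh repo edit`.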

Category GEO backends resolved for this scan: google/gemini-2.5-flash, deepseek/deepseek-v4-flash

Category visibility — the real GEO test

Brand-free queries asked to google/gemini-2.5-flash. Did AI recommend you, or someone else?

The same questions are asked of every model, so you can compare answers and rankings side by side.

Recall
0 / 2
0% of queries surface lumina-ai-inc/chunkr
Avg rank
Lower is better. #1 = top recommendation.
Share of voice
0%
Of all named tools, what % are you?
Top rival
LangChain
Recommended in 1 of 2 queries
COMPETITOR LEADERBOARD
  1. LangChain · recommended 1×
  2. OpenAI · recommended 1×
  3. Hugging Face · recommended 1×
  4. Chroma · recommended 1×
  5. FAISS · recommended 1×
  • CATEGORY QUERY
    How to prepare unstructured documents like PDFs for retrieval-augmented generation (RAG) applications?
    you: not recommended
    AI recommended (in order):
    1. LangChain
    2. OpenAI
    3. Hugging Face
    4. Chroma
    5. FAISS
    6. Pinecone
    7. Unstructured
    8. PyMuPDF
    9. Apache Tika
    10. spaCy
    11. NLTK
    12. OpenAI API
    13. Azure OpenAI Service

    AI recommended 13 alternatives but never named lumina-ai-inc/chunkr. This is the gap to close.

  • CATEGORY QUERY
    What open-source tools perform document layout analysis and OCR for LLM data ingestion?
    you: not recommended
    AI recommended (in order):
    1. PaddleOCR
    2. LayoutParser
    3. Tesseract OCR
    4. Donut
    5. Surya
    6. DocTR

    AI recommended 6 alternatives but never named lumina-ai-inc/chunkr. This is the gap to close.


Objective checks

Rule-based audits of metadata signals AI engines weight most.

  • Metadata completeness
    warn

    Suggestion:

  • README presence
    pass

Self-mention check

Does AI even know your repo exists when asked about it directly?

  • Compared to common alternatives in this category, what is the core differentiator of lumina-ai-inc/chunkr?
    pass
    AI named lumina-ai-inc/chunkr explicitly

    AI answers can be confidently wrong. Read for accuracy: does it match your actual tech stack, audience, and differentiator?

  • If a team adopts lumina-ai-inc/chunkr in production, what risks or prerequisites should they evaluate first?
    pass
    AI named lumina-ai-inc/chunkr explicitly

  • In one sentence, what problem does the repo lumina-ai-inc/chunkr solve, and who is the primary audience?
    pass
    AI named lumina-ai-inc/chunkr explicitly

Embed your GEO score

Drop this badge into the README of lumina-ai-inc/chunkr. It auto-updates whenever the report is rescanned and links back to the latest report — easy public proof that you care about AI discoverability.

RepoGEO badge preview
MARKDOWN (README)
[![RepoGEO](https://repogeo.com/badge/lumina-ai-inc/chunkr.svg)](https://repogeo.com/en/r/lumina-ai-inc/chunkr)
HTML
<a href="https://repogeo.com/en/r/lumina-ai-inc/chunkr"><img src="https://repogeo.com/badge/lumina-ai-inc/chunkr.svg" alt="RepoGEO" /></a>

Subscribe to Pro for deep diagnoses

Lite scans of lumina-ai-inc/chunkr stay free; this card compares Pro's limits with Lite's.

  • Deep reports: 10 / month
  • Brand-free category queries: 5 (vs 2 in Lite)
  • Prioritized action items: 8 (vs 3 in Lite)