Offline AI #16
LeanBitLab started this conversation in Offline AI
I tried the Xenova grammar small quantized model (T5-based) and Pszemraj's grammar small quantized model, and neither works as intended. So they are out of consideration for now, I think.
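For reference, here is a minimal sketch of how one of these quantized T5-style grammar checkers can be tried with transformers.js; the model id and the "grammar: " task prefix are assumptions, not confirmed repo details.

```typescript
import { pipeline } from '@xenova/transformers';

async function tryGrammarModel() {
  // text2text-generation covers T5-style seq2seq models; @xenova/transformers v2
  // fetches the quantized ONNX weights by default.
  const fix = await pipeline(
    'text2text-generation',
    'Xenova/t5-small-grammar-correction', // placeholder id, swap in the actual repo
    { quantized: true },
  );

  // Many grammar-correction T5 fine-tunes expect a task prefix such as "grammar: ".
  const out = await fix('grammar: He go to school every days.');
  console.log(out); // e.g. [{ generated_text: '...' }]
}

tryGrammarModel();
```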
My exams are going on this month, so consider me mostly offline. Still, here are some of my interesting finds:
1) Prithivida Gramformer (quantized, if available; I was unable to find it in this haste)
2) Falcon H1 Tiny 90M Instruct
3) HRM Grammar Light V2 (2026 edition)
Also, another question: have you tried the Qwen offline models? And how about implementing a single model.onnx? It would be better for memory and processing (see the sketch after this comment).
…On Sat, 31 Jan 2026, 11:38 pm, LeanBitLab ***@***.*** wrote:
> Only Visheratin T5 Tiny worked for me too
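On the single model.onnx idea above, a rough sketch of what that could look like with onnxruntime-web: one lazily created inference session shared by every caller, so only one copy of the weights is ever resident. The file name and the input tensor name are placeholders for whatever the app actually exports.

```typescript
import * as ort from 'onnxruntime-web';

let session: ort.InferenceSession | null = null;

// Create the single shared session on first use and reuse it afterwards.
async function getSession(): Promise<ort.InferenceSession> {
  if (!session) {
    session = await ort.InferenceSession.create('model.onnx'); // placeholder path
  }
  return session;
}

// Every request funnels through the same session, keeping memory usage flat.
async function run(inputIds: bigint[]) {
  const s = await getSession();
  const feeds = {
    // "input_ids" is a placeholder tensor name; match the exported model's inputs.
    input_ids: new ort.Tensor('int64', BigInt64Array.from(inputIds), [1, inputIds.length]),
  };
  return s.run(feeds);
}
```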
Model Name: Visheratin T5 Tiny
Device Specs (RAM/CPU): 4 GB RAM, Snapdragon 720G CPU
Feedback / Issue: Welcome to the Offline AI discussion!