Detect and redact PII locally with SOTA performance
Extract structured data from local or remote LLM models
Main code chunks used for models in the publication "Exploring the Potential of Adaptive, Local Machine Learning (ML) in Comparison to the Prediction Performance of Global Models: A Case Study from Bayer's Caco-2 Permeability Database"
Extracting complete webpage articles from a screen recording using local models
Vision-based avatar that reads Google News and extracts news by itself using only local models
A comprehensive learning repository for Model Context Protocol (MCP) - from simple tools to complex agentic workflows using local Ollama models
A streamlined interface for interacting with local Large Language Models (LLMs) using Streamlit. Features interactive chat, configurable model parameters, and more.
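The last entry above describes a Streamlit front end for chatting with local LLMs. As a rough illustration of that pattern (not the repository's actual code), the sketch below wires Streamlit's chat widgets to a local Ollama server; the endpoint, model names, and default parameters are assumptions.

```python
# chat_app.py - minimal sketch of a Streamlit chat UI for a local model.
# Assumes an Ollama server is running on localhost:11434 and the selected
# model has already been pulled; model names below are placeholders.
import requests
import streamlit as st

OLLAMA_URL = "http://localhost:11434/api/chat"  # default Ollama chat endpoint

st.title("Local LLM chat")

# Configurable model parameters in the sidebar.
model = st.sidebar.selectbox("Model", ["llama3.2", "mistral"])  # placeholder names
temperature = st.sidebar.slider("Temperature", 0.0, 1.0, 0.7)

# Keep the running conversation in session state.
if "messages" not in st.session_state:
    st.session_state.messages = []

# Replay the conversation so far.
for msg in st.session_state.messages:
    with st.chat_message(msg["role"]):
        st.markdown(msg["content"])

if prompt := st.chat_input("Ask the local model..."):
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.markdown(prompt)

    # Send the whole history to the local server; nothing leaves the machine.
    resp = requests.post(
        OLLAMA_URL,
        json={
            "model": model,
            "messages": st.session_state.messages,
            "stream": False,
            "options": {"temperature": temperature},
        },
        timeout=120,
    )
    resp.raise_for_status()
    answer = resp.json()["message"]["content"]

    st.session_state.messages.append({"role": "assistant", "content": answer})
    with st.chat_message("assistant"):
        st.markdown(answer)
```

Launched with `streamlit run chat_app.py`, this keeps the conversation in `st.session_state` and sends it to the local endpoint on each turn.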