
Conversation

@loci-dev

Mirrored from ggml-org/llama.cpp#18422

Needed to test my memory MCP server, so I had Claude add MCP support to the webui. All AI-generated code should be tagged with the AI tag as required. Tested a bit; it seems to work. A permission system still needs to be added, using toasts or another dialog system. MCP settings live in their own modal, which I figured was better for quickly toggling servers on and off. Tested with my default searxng/Selenium Grid MCP setup and a few others. Enjoy.
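To illustrate the "quick on and off" behavior the separate settings modal is meant to provide, here is a minimal sketch of a per-server MCP settings entry and a toggle helper. All names here (`McpServerConfig`, `toggleServer`, the example URL) are hypothetical illustrations, not taken from the actual PR code.

```typescript
// Hypothetical shape of one MCP server entry in the webui settings modal.
// Field names are illustrative assumptions, not the PR's actual types.
interface McpServerConfig {
  name: string;      // display name, e.g. "searxng"
  url: string;       // MCP server endpoint (placeholder value below)
  enabled: boolean;  // quick on/off toggle exposed by the modal
}

// Flip one server's enabled flag without mutating the stored list,
// mirroring the quick toggle behavior described above.
function toggleServer(
  servers: McpServerConfig[],
  name: string
): McpServerConfig[] {
  return servers.map((s) =>
    s.name === name ? { ...s, enabled: !s.enabled } : s
  );
}

const servers: McpServerConfig[] = [
  { name: "searxng", url: "http://localhost:8080/mcp", enabled: false },
];
const updated = toggleServer(servers, "searxng");
console.log(updated[0].enabled); // true
```

Keeping the toggle immutable (returning a new array) fits the reactive-store style most webui frameworks expect, so the settings modal re-renders when a server is switched on or off.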

@loci-agentic-ai

Explore the complete analysis inside the Version Insights

I've generated a summary report for your project. The analysis shows that Pull Request #726 for the llama.cpp repository has minimal to no performance impact.

Key highlights:

  • ✅ No modified functions showed performance changes greater than 2%
  • ✅ Both response time and throughput remain stable
  • ✅ Safe to merge from a performance perspective

The changes appear to maintain performance stability without introducing any measurable performance regressions.

loci-dev force-pushed the main branch 9 times, most recently from f2e8c7f to b3f45e1 on December 29, 2025 at 06:15.

