
Global community pools to share LocalAI federated instances and workers #3113

Closed
mudler opened this issue Aug 2, 2024 · 4 comments · Fixed by #3125
mudler commented Aug 2, 2024

Now that we have federation support (#2915 and #2343), it makes sense to build a place under the LocalAI website to list and visualize community pools.

By community pools, I'm referring to a way for people to share swarm tokens, so they can both provide hardware capabilities and use the federation for inference (like Petals, but with more "shards").

The idea is to have an "explorer" or "dashboard" that shows a list of active pools, how many federated instances or llama.cpp workers each has, and reports their capability and availability.

Things to notice:

  • In the dashboard we should list only active pools, and remove pools that are offline or have 0 workers/federated instances
  • Users can add arbitrary tokens/pools; these get scanned periodically, and the dashboard reports their status
  • We need to state explicitly that this comes without any warranty: contribute and use it at your own risk. We take no responsibility for how you use it, or for malicious actors who try to fiddle with your systems. We will of course tackle bugs as a community, but users should be very well aware that this is experimental and might be insecure to deploy on your hardware (unless you take all the precautions).

This would allow users to:

  1. set up a cluster and dedicate it to a specific community
  2. share compute resources with others
  3. run inference even without beefy hardware, using compute provided by other community peers
mudler added the enhancement, roadmap, and area/p2p labels on Aug 2, 2024

mudler commented Aug 2, 2024

This would likely be a new Golang app that could be deployed e.g. on Vercel, and it would need a simple form for users to submit tokens.

I see two sections in this app:

  1. a page or form to insert new tokens and provide a description/name
  2. a landing page showing all the global pools, with availability, number of workers, and hardware specs (note: hardware specs are not yet collected by the p2p swarm functionality)
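The two sections could be sketched as a minimal Go app with `net/http` and `html/template`. This is an illustrative sketch only, not the actual explorer code; the `PoolEntry` type, routes, and form fields are all assumptions:

```go
package main

import (
	"fmt"
	"html/template"
	"net/http"
	"sync"
)

// PoolEntry is a hypothetical submission: a swarm token plus metadata.
type PoolEntry struct {
	Name        string
	Description string
	Token       string
	Workers     int // filled in later by a periodic scanner
}

var (
	mu    sync.Mutex
	pools []PoolEntry
)

// page renders the landing page (section 2) plus the submission form (section 1).
var page = template.Must(template.New("index").Parse(`
<h1>LocalAI community pools</h1>
<ul>{{range .}}<li>{{.Name}}: {{.Workers}} workers ({{.Description}})</li>{{end}}</ul>
<form method="POST" action="/submit">
  <input name="name" placeholder="pool name">
  <input name="description" placeholder="description">
  <input name="token" placeholder="swarm token">
  <button>Add pool</button>
</form>`))

func index(w http.ResponseWriter, r *http.Request) {
	mu.Lock()
	defer mu.Unlock()
	page.Execute(w, pools)
}

func submit(w http.ResponseWriter, r *http.Request) {
	if r.Method != http.MethodPost {
		http.Error(w, "POST only", http.StatusMethodNotAllowed)
		return
	}
	mu.Lock()
	pools = append(pools, PoolEntry{
		Name:        r.FormValue("name"),
		Description: r.FormValue("description"),
		Token:       r.FormValue("token"),
	})
	mu.Unlock()
	http.Redirect(w, r, "/", http.StatusSeeOther)
}

func main() {
	http.HandleFunc("/", index)
	http.HandleFunc("/submit", submit)
	fmt.Println("listening on :8080")
	http.ListenAndServe(":8080", nil)
}
```

A real deployment would also need persistence and the periodic scan that fills in `Workers`; this sketch keeps everything in memory to show the shape of the two pages.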


mudler commented Aug 3, 2024

Thinking about it again: there's no need for Vercel or a dynamic web app at all. It can all be static, with GitHub workflow pipelines running "cron" jobs to update the data.
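A scheduled workflow for this static approach could look roughly like the following. The file name, script, and data path are all hypothetical (and note this idea is retracted in the next comment):

```yaml
# .github/workflows/update-pools.yaml (hypothetical)
name: Update community pool data
on:
  schedule:
    - cron: "0 * * * *" # hourly scan
  workflow_dispatch: {}
jobs:
  scan:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # scan-pools.sh is a hypothetical script that probes each token
      # and rewrites the static JSON the dashboard reads
      - run: ./scan-pools.sh > data/pools.json
      - run: |
          git config user.name "pool-bot"
          git config user.email "bot@users.noreply.github.com"
          git commit -am "chore: refresh pool data" || true
          git push
```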


mudler commented Aug 9, 2024

> Thinking about it again: there's no need for Vercel or a dynamic web app at all. It can all be static, with GitHub workflow pipelines running "cron" jobs to update the data.

Scratch that: adding new tokens would then be too complicated.


mudler commented Aug 15, 2024

https://explorer.localai.io is now live. It still misses some UX around how to run things, but that's low-hanging fruit on the documentation side and will be addressed in follow-ups.
