   <div class="workers mt-12 text-center">
     <h2 class="text-3xl font-semibold text-gray-100 mb-8">
-      <i class="fa-solid fa-network-wired"></i> P2P Network
+      <i class="fa-solid fa-circle-nodes"></i> Distributed inference with P2P
       <a href="https://localai.io/features/distribute/" target="_blank">
         <i class="fas fa-circle-info pr-2"></i>
       </a>
     </h2>
-    <p class="mb-4">LocalAI uses P2P technologies to enable distribution of work between peers. It is possible to share an instance with Federation and/or split the weights of a model across peers (only available with llama.cpp models).</p>
+    <h5 class="mb-4 text-justify">LocalAI uses P2P technologies to enable distribution of work between peers. It is possible to share an instance with Federation and/or split the weights of a model across peers (only available with llama.cpp models). You can now share computational resources between your devices or your friends!</h5>

     <!-- Tabs for Instructions -->
     <div class="bg-gray-800 p-6 rounded-lg shadow-lg mb-12 text-left">
-      <h3 class="text-2xl font-semibold text-gray-100 mb-6">Start a new llama.cpp P2P worker</h3>
+      <h3 class="text-2xl font-semibold text-gray-100 mb-6"><i class="fa-solid fa-book"></i> Start a new llama.cpp P2P worker</h3>
       <p class="mb-4">You can start llama.cpp workers to distribute weights between the workers and offload part of the computation. To start a new worker, you can use the CLI or Docker.</p>

       <!-- Tabs navigation -->
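The paragraph above points at the CLI and Docker as the two ways to start a worker. As a minimal sketch of what that looks like, based on the LocalAI distributed-inference docs (the TOKEN variable and exact flags may differ between versions):

    # On the main instance: start LocalAI with P2P enabled and note the
    # network token it prints (it is also shown in this UI).
    local-ai run --p2p

    # On each worker: join the swarm with the shared token and start a
    # llama.cpp RPC worker that can take a share of the model weights.
    TOKEN=<token-from-main-instance> local-ai worker p2p-llama-cpp-rpc

    # The same worker via Docker (host networking so peers can discover it):
    docker run --rm --net host -e TOKEN=<token-from-main-instance> \
      localai/localai:latest worker p2p-llama-cpp-rpc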
|
@@ -54,15 +54,15 @@ <h3 class="text-2xl font-semibold text-gray-100 mb-6">Start a new llama.cpp P2P
|
       </div>
     </div>

-    <p class="text-xl font-semibold text-gray-200"> Nodes: <span hx-get="/p2p/ui/workers-stats" hx-trigger="every 1s"></span> </p>
+    <p class="text-xl font-semibold text-gray-200"> <i class="text-gray-200 fa-solid fa-circle-nodes"></i> Workers (llama.cpp): <span hx-get="/p2p/ui/workers-stats" hx-trigger="every 1s"></span> </p>
     <div class="grid grid-cols-1 sm:grid-cols-2 md:grid-cols-3 gap-4 mb-12">
       <div hx-get="/p2p/ui/workers" hx-trigger="every 1s"></div>
     </div>

     <hr class="border-gray-700 mb-12">

     <div class="bg-gray-800 p-6 rounded-lg shadow-lg mb-12 text-left">
-      <h3 class="text-2xl font-semibold text-gray-100 mb-6">Start a federated instance</h3>
+      <h3 class="text-2xl font-semibold text-gray-100 mb-6"><i class="fa-solid fa-book"></i> Start a federated instance</h3>
       <p class="mb-4">You can start LocalAI in federated mode to share your instance, or start the federated server to balance requests between nodes of the federation.</p>

       <!-- Tabs navigation -->
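The federated paragraph above distinguishes two roles: sharing an instance with the federation, and running a balancer in front of it. A hedged sketch of both, assuming the --federated flag and the federated subcommand described in the LocalAI docs:

    # Share this instance as a node of the federation:
    local-ai run --p2p --federated

    # Or run a federated server that load-balances incoming requests
    # across the federation's nodes:
    local-ai federated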
|
@@ -112,7 +112,7 @@ <h3 class="text-2xl font-semibold text-gray-100 mb-6">Start a federated instance
|
       </div>
     </div>

-    <p class="text-xl font-semibold text-gray-200"> Nodes: <span hx-get="/p2p/ui/workers-federation-stats" hx-trigger="every 1s"></span> </p>
+    <p class="text-xl font-semibold text-gray-200"> <i class="text-gray-200 fa-solid fa-circle-nodes"></i> Federated Nodes: <span hx-get="/p2p/ui/workers-federation-stats" hx-trigger="every 1s"></span> </p>
     <div class="grid grid-cols-1 sm:grid-cols-2 md:grid-cols-3 gap-4 mb-12">
       <div hx-get="/p2p/ui/workers-federation" hx-trigger="every 1s"></div>
     </div>
|