
Commit e729515

Release v1.0.0
Update README.md and SETUP.md to refine spam blocking instructions and enhance flow setup guidance
1 parent fd592ab commit e729515

File tree

2 files changed: +6 -6 lines changed


README.md

Lines changed: 3 additions & 3 deletions
@@ -46,14 +46,13 @@ Check out the [setup & usage process](SETUP.md) to get started!
 1. Name your assistant
 2. Limit the max amount of messages per hour per user
 3. Blacklist and allowlist features (allowlist overrides blacklist)
-4. Block spam feature (provided you give your list of phone contacts to Automate)
+4. Block spam feature (provided you give phone contact permissions to Automate)
 5. Optional reason why you are not available, to let senders briefly know what you are doing.
 6. Possibility of doing much more as well!
 
 ## Can I run this project on my phone without an external computer?
 1. **Yes**, you can run the servers (both Flask and Ollama) locally using Termux. Just clone the repo to Termux, install Flask in Termux if you haven't already, and run the flask_backend.py file. However, the Ollama server may be slow on a phone, and you may need to use OpenAI for better context handling.
 
-
 ## Project Structure
 
 - `src/flask_backend.py`: Flask backend implementation that processes SMS messages.
@@ -63,7 +62,8 @@ Check out the [setup & usage process](SETUP.md) to get started!
 As of now, there are only a few known limitations to how this works. The biggest one (the first below) may be added as a feature later on, although that would require some restructuring of the project, since the Flask backend would have to do more of the work than it does now, as explained in the [setup & usage process](SETUP.md).
 
 1. **AndroidSecretary can't process more than one response at a time.** In other words, if one message is still being processed and another message comes in at the same time, the second message will not be registered. Each message must finish processing before the Automate flow can detect a second incoming message. Since the chat_log is handled by the Automate app in conjunction with the Flask backend, simply duplicating or running multiple instances of the Automate flow would not work either: the chat_logs would be disjoint between the flows, leading to improper message validation.
-2. No RCS support. So far, only SMS messages work. With some modifying of the Automate flow, however, MMS messages may also be supported, although that hasn't been tested. Until Automate will release RCS support, only SMS is indeed supported.
+2. **No RCS support.** So far, only SMS messages work. With some modification of the Automate flow, MMS messages may also be supported, although that hasn't been tested. Until Automate releases RCS support, only SMS is supported.
+3. **I have not yet tested using the project outside of a local network scope.** It may work, but I have not tried it out yet. It might still work using e.g. cellular data on the phone, but the host machine would likely need an open, public IP, and it would most likely make sense to use OpenAI in that scenario.
 
 ## Tested LLM Models
 1. Llama3.1 with Ollama has been tested. The 8B parameter model works just fine, so higher-parameter models would work as well.
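For readers skimming this diff, here is how the pieces referenced above (the Flask backend, the Automate flow, and Ollama) might fit together. This is a hypothetical sketch, not code from this commit or from the actual `src/flask_backend.py`: the `/sms` endpoint name, the JSON field names, and the prompt wording are assumptions for illustration. The backend port 4445 and the llama3.1 model come from the docs above; 11434 is simply Ollama's standard default port.

```python
# Hypothetical sketch only -- NOT the project's actual src/flask_backend.py.
# Assumes the Automate flow POSTs JSON like {"sender": "...", "message": "..."}
# to a /sms endpoint and sends the returned "reply" back out as an SMS.
from flask import Flask, request, jsonify
import requests

app = Flask(__name__)
OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's standard default port

@app.route("/sms", methods=["POST"])
def handle_sms():
    data = request.get_json(force=True)
    prompt = (
        f"You are an SMS assistant. Reply briefly to this message from "
        f"{data['sender']}: {data['message']}"
    )
    # Ask the local Ollama server for a completion (non-streaming).
    resp = requests.post(
        OLLAMA_URL,
        json={"model": "llama3.1", "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return jsonify({"reply": resp.json()["response"]})

if __name__ == "__main__":
    # 4445 is the default backend port mentioned in SETUP.md.
    app.run(host="0.0.0.0", port=4445)
```

In the real project the Automate flow, not the backend sketch above, maintains the chat_log and enforces the allow/blacklist and rate limits, which is why simply duplicating the flow does not work (see limitation 1 above).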

SETUP.md

Lines changed: 3 additions & 3 deletions
@@ -9,12 +9,12 @@
 
 ## Automate
 1. First, install the [Automate](https://llamalab.com/automate/) app on your Android device. This app allows you to create flows that can interact with Android APIs.
-2. Then, import the flow linked in the releases page of Github here, into the Automate app.
+2. Then, import the flow (.flo files), linked here on the GitHub releases page, into the Automate app.
 3. In the flow, set the **BACKEND_HOST** variable to the IP address of the server running the Flask backend. If you use Ollama, this will be the IP address of the computer running the Ollama server as well. If you want to use OpenAI instead of Ollama, set USEOPENAI to 1, indicating True. The default port of the backend is 4445.
 
 **If you want to use OpenAI's API instead of Ollama, then you must set the OPENAI_API_KEY variable in the Automate flow to your OpenAI API key, AND enable OpenAI by setting USEOPENAI to 1 in the flow.**
 
-### The Automate flow setup is complete! Now, you can run the flask backend, and if you are using ollama, start your ollama server.
+### The Automate flow setup is complete! To customize the flow further, see below under [Customization Instructions](#customization-instructions). Now, you can run the Flask backend, and if you are using Ollama, start your Ollama server.
 
 **Note:** The default model for Ollama in the Automate flow is llama3.1, as that has been the most tested for this project and the best performing. If you want to use a different model, you can manually change the model in the Automate flow.
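As a rough illustration of what the USEOPENAI and OPENAI_API_KEY settings imply for whichever component builds the reply, here is a sketch under assumptions, not the project's actual code: when the flag is 1, the request would go to OpenAI's chat completions API instead of the local Ollama server. The function name, the specific OpenAI model, and reading the key from an environment variable are assumptions for illustration.

```python
# Hypothetical sketch of the Ollama/OpenAI switch implied by USEOPENAI -- the
# actual project wires this through the Automate flow and flask_backend.py.
import os
import requests

def generate_reply(prompt: str, use_openai: bool = False) -> str:
    if use_openai:
        # OPENAI_API_KEY must be provided when USEOPENAI is 1.
        resp = requests.post(
            "https://api.openai.com/v1/chat/completions",
            headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
            json={
                "model": "gpt-4o-mini",  # assumed model, not specified by the project
                "messages": [{"role": "user", "content": prompt}],
            },
            timeout=120,
        )
        resp.raise_for_status()
        return resp.json()["choices"][0]["message"]["content"]
    # Otherwise use the local Ollama server with the default llama3.1 model.
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": "llama3.1", "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]
```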

@@ -69,7 +69,7 @@ Important Variables (these should NOT be optional, so please fill them out when
 
 1. `MAXMESSAGESPERHOURPERUSER`: Maximum messages a number can send per hour (default: 3)
 2. `USEOPENAI`: 0 for false, 1 for true. If this is enabled, make sure that you have provided an OpenAI API key!
-3. `BLOCKSPAM`: Set to 1 to block numbers not in your contacts
+3. `BLOCKSPAM`: Set to 1 to block numbers not in your contacts (default: 1). To disable, set to 0.
 4. `ALLOWLIST`: The only numbers the assistant is **allowed** to respond to. The allow list overrides everything except the max-messages-per-hour-per-person limit. If you want other features such as block spam or the blacklist to work, make sure your allow list is empty.
 5. `BLACKLISTNUMBERS`: Numbers you do not want the assistant to respond to. However, if a blacklisted number is also in the allow list, it will be allowed.
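The precedence of these variables, as described above (the allow list overrides the blacklist and spam blocking, but not the hourly limit), can be summarized with a small sketch. This is only an illustration of the documented rules; the real checks live inside the Automate flow, and the function and parameter names here are made up.

```python
# Illustrative sketch of the documented filtering rules -- the real logic is
# implemented as blocks inside the Automate flow, not as Python.
def should_respond(sender, allowlist, blacklist, contacts,
                   block_spam=True, msgs_last_hour=0, max_per_hour=3):
    # MAXMESSAGESPERHOURPERUSER applies to everyone, even allowlisted numbers.
    if msgs_last_hour >= max_per_hour:
        return False
    # A non-empty ALLOWLIST overrides the blacklist and spam blocking.
    if allowlist:
        return sender in allowlist
    # BLACKLISTNUMBERS: never respond to these (unless allowlisted above).
    if sender in blacklist:
        return False
    # BLOCKSPAM: ignore numbers that are not in the phone's contacts.
    if block_spam and sender not in contacts:
        return False
    return True
```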
