
Dev refactor #93

Merged · 100 commits · Mar 13, 2024

Conversation

litagin02
Owner

Checks and fixes for #92, etc.

…from server_editor.py

The logic has not been changed; modules were only renamed, split, and moved on a per-function basis.
The existing code is left in place for the time being to avoid breaking the training code, which is out of scope for this refactoring.
…OX to style_bert_vits2/text_processing/japanese/user_dict/
…tyle_bert_vits2/models/

The code has not yet been cleaned up, just moved.
… loaded BERT models/tokenizers, and replace all from_pretrained() calls with load_model()/load_tokenizer()
…each language to style_bert_vits2/text_processing/(language)/bert_feature.py
… style_bert_vits2/text_processing/__init__.py

These three functions were usually used together as a set, so splitting them into separate modules felt wasteful given how few lines each had.
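The replacement of repeated from_pretrained() calls with shared loader functions mentioned above amounts to a per-language cache of loaded models. A minimal sketch of that caching pattern, assuming hypothetical names (load_model and the loader callback are illustrative, not the project's actual API):

```python
from typing import Any, Callable, Dict

# Hypothetical sketch: keep one loaded instance per language instead of
# calling from_pretrained() at every call site. Names are illustrative.
_model_cache: Dict[str, Any] = {}

def load_model(language: str, loader: Callable[[], Any] = None) -> Any:
    """Return the cached model for `language`, loading it on first use."""
    if language not in _model_cache:
        if loader is None:
            raise ValueError("first call for a language must supply a loader")
        _model_cache[language] = loader()  # expensive load happens once
    return _model_cache[language]
```

Subsequent callers can then fetch the already-loaded model without knowing how it was constructed.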
…VITS2

Since app.py and server_editor.py already exist as alternative Web UIs, there is no need to revive webui.py in the future.
I have judged this to be excessive for this project at this point.
"text_processing" is clearer but makes import statements longer; "nlp" is shorter and still makes it clear that the package handles natural language processing.
pyopenjtalk_worker.initialize() has the side effect of starting another process, so it should not run automatically on import.
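The import-side-effect concern above is the classic case for lazy initialization: defer the side-effectful call until first real use. A hedged sketch of that pattern (the worker name mirrors pyopenjtalk_worker, but the implementation here is purely illustrative):

```python
import threading
from typing import Callable

# Illustrative lazy-init guard: importing this module never starts a worker
# process; the start function runs exactly once, on first demand.
_initialized = False
_lock = threading.Lock()

def _ensure_initialized(start_worker: Callable[[], None]) -> None:
    """Run start_worker() exactly once, on first use (not at import time)."""
    global _initialized
    with _lock:  # guard against concurrent first calls
        if not _initialized:
            start_worker()
            _initialized = True

def g2p(text: str, start_worker: Callable[[], None]) -> str:
    _ensure_initialized(start_worker)
    return text  # placeholder for the real grapheme-to-phoneme call
```

Callers that never invoke g2p() never pay the cost of spawning the worker process.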
@litagin02 litagin02 mentioned this pull request Mar 12, 2024
tsukumijima and others added 13 commits March 12, 2024 17:45
…consumption during training in the Web UI

Since the BERT features of the dataset are pre-extracted by bert_gen.py, there is no need to load the BERT model at training time.
Include in sdist only the minimum required files for style-bert-vits2 as a library.
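Restricting the sdist to the library's minimum files is typically done declaratively. A hypothetical MANIFEST.in sketch (assuming setuptools; the actual directives and file list in the repository are not shown here):

```
# Hypothetical sketch only -- not the repository's actual MANIFEST.in.
prune tests
prune docs
recursive-include style_bert_vits2 *.py
include LICENSE README.md
```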
…PU VRAM consumption during training in the Web UI"

This reverts commit e8a76e5.
…consumption during training in the Web UI

Since the BERT features of the dataset are pre-extracted by bert_gen.py, there is no need to load the BERT model at training time.
@litagin02
Owner Author

It seems to work without problems, so I'll merge this into dev; after adding some other features, I'll bump the version to 2.4.

@litagin02 litagin02 merged commit c028840 into dev Mar 13, 2024
@litagin02 litagin02 deleted the dev-refactor branch March 13, 2024 06:42