Hello, while training PPMatting the metric plateaus at around 300 and stops improving, and the final results are also very poor. How should I resolve this?
@largerwxt This question is too vague. Please check for yourself whether there are problems with the dataset annotations, the learning rate, and similar settings.
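One concrete way to act on the "check your dataset annotations" advice is to validate the list file before training. Below is a minimal, hypothetical sketch (the function name `check_list_file` is mine, not part of PaddleSeg) that assumes the MattingDataset list file uses one `image|alpha` pair per line joined by the configured separator, and verifies that each line splits into exactly two paths that exist under the dataset root:

```python
import os

def check_list_file(dataset_root, list_name, separator='|'):
    """Validate a matting list file: every line should hold an image path
    and an alpha-matte path joined by `separator`, and both referenced
    files should exist under `dataset_root`. Returns (line_no, message)
    tuples for each problem found; an empty list means the file looks OK."""
    problems = []
    list_path = os.path.join(dataset_root, list_name)
    with open(list_path) as f:
        for lineno, line in enumerate(f, start=1):
            parts = line.strip().split(separator)
            if len(parts) != 2:
                problems.append((lineno, 'expected 2 fields, got %d' % len(parts)))
                continue
            for rel in parts:
                if not os.path.exists(os.path.join(dataset_root, rel)):
                    problems.append((lineno, 'missing file: ' + rel))
    return problems
```

Running this against both `train.txt` and `test.txt` (with the same `separator: '|'` as in the config) quickly surfaces malformed lines or broken paths that would otherwise silently corrupt training batches.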
Hello, I have already checked the dataset. I am using the Distinctions-646 dataset, prepared following the official documentation. My config is as follows:

```yaml
batch_size: 4
iters: 300000

train_dataset:
  type: MattingDataset
  dataset_root: /home/wxt/pp-matting/work/PaddleSeg/Matting/data/Distinctions-646/train
  train_file: train.txt
  transforms:
    - type: LoadImages
    - type: Padding
      target_size: [512, 512]
    - type: ResizeByShort
      short_size: 512
    - type: RandomCrop
      crop_size: [[512, 512], [640, 640], [800, 800]]
    - type: Resize
      target_size: [512, 512]
    - type: RandomDistort
    - type: RandomBlur
      prob: 0.1
    - type: RandomHorizontalFlip
    - type: Normalize
  mode: train
  separator: '|'

val_dataset:
  type: MattingDataset
  dataset_root: /home/wxt/pp-matting/work/PaddleSeg/Matting/data/Distinctions-646/test
  val_file: test.txt
  transforms:
    - type: LoadImages
    - type: LimitShort
      max_short: 1536
    - type: ResizeToIntMult
      mult_int: 32
    - type: Normalize
  mode: val
  get_trimap: False
  separator: '|'

model:
  type: PPMatting
  backbone:
    type: HRNet_W48
    pretrained: https://bj.bcebos.com/paddleseg/dygraph/hrnet_w48_ssld.tar.gz
  pretrained: Null

optimizer:
  type: sgd
  momentum: 0.9
  weight_decay: 4.0e-5

lr_scheduler:
  type: PolynomialDecay
  learning_rate: 0.01
  end_lr: 0
  power: 0.9
```

During training, however, the semantic loss does not go down, and SAD stops falling at around 200. What could be the cause?
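For reference, a `PolynomialDecay` schedule with these settings follows the standard polynomial-decay formula below (a sketch of the textbook formula under the assumption that decay runs over the full `iters: 300000`, not a reproduction of PaddlePaddle's exact implementation):

```python
def poly_decay_lr(step, base_lr=0.01, end_lr=0.0, power=0.9, total_iters=300000):
    """Standard polynomial decay: interpolate from base_lr down to end_lr
    over total_iters steps, with curvature controlled by `power`."""
    step = min(step, total_iters)
    return (base_lr - end_lr) * (1 - step / total_iters) ** power + end_lr
```

So the run starts at 0.01 and decays smoothly toward 0. With a batch size of only 4, a base learning rate of 0.01 may be on the high side; lowering it (e.g. halving it) and watching whether the semantic loss starts moving is one common first experiment when a loss plateaus early.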