
Commit d4b72b0

Merge branch 'ultralytics:master' into trtNMS

2 parents: 89b5914 + 3c1afd9
34 files changed: +1055 −704 lines

.github/README_cn.md

Lines changed: 0 additions & 344 deletions
This file was deleted.
Lines changed: 26 additions & 0 deletions

@@ -0,0 +1,26 @@
+ # YOLOv5 🚀 by Ultralytics, GPL-3.0 license
+ # README translation action to translate README.md to Chinese as README.zh-CN.md on any change to README.md
+
+ name: Translate README
+
+ on:
+   push:
+     branches:
+       - translate_readme  # replace with 'master' to enable action
+     paths:
+       - README.md
+
+ jobs:
+   Translate:
+     runs-on: ubuntu-latest
+     steps:
+       - uses: actions/checkout@v3
+       - name: Setup Node.js
+         uses: actions/setup-node@v3
+         with:
+           node-version: 16
+       # ISO Language Codes: https://cloud.google.com/translate/docs/languages
+       - name: Adding README - Chinese Simplified
+         uses: dephraiim/translate-readme@main
+         with:
+           LANG: zh-CN

.pre-commit-config.yaml

Lines changed: 4 additions & 4 deletions
@@ -13,7 +13,7 @@ ci:
 
  repos:
    - repo: https://github.com/pre-commit/pre-commit-hooks
-     rev: v4.3.0
+     rev: v4.4.0
      hooks:
        # - id: end-of-file-fixer
        - id: trailing-whitespace
@@ -24,7 +24,7 @@ repos:
        - id: check-docstring-first
 
    - repo: https://github.com/asottile/pyupgrade
-     rev: v3.2.0
+     rev: v3.3.0
      hooks:
        - id: pyupgrade
          name: Upgrade code
@@ -50,15 +50,15 @@ repos:
          additional_dependencies:
            - mdformat-gfm
            - mdformat-black
-         exclude: "README.md|README_cn.md"
+         exclude: "README.md|README.zh-CN.md"
 
    - repo: https://github.com/asottile/yesqa
      rev: v1.4.0
      hooks:
        - id: yesqa
 
    - repo: https://github.com/PyCQA/flake8
-     rev: 5.0.4
+     rev: 6.0.0
      hooks:
        - id: flake8
          name: PEP8
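
The mdformat `exclude` value is a Python regular expression that pre-commit matches against each candidate file path, so the README rename has to be mirrored here for the Chinese README to stay unformatted. A quick standalone check of the updated pattern (plain Python, not part of the commit; note the unescaped dots, which pre-commit tolerates):

import re

# pre-commit applies the `exclude` regex to each candidate file path with a regex search.
pattern = re.compile(r"README.md|README.zh-CN.md")

for path in ("README.md", "README.zh-CN.md", "README_cn.md", "some/subdir/README.md"):
    print(f"{path}: {'excluded' if pattern.search(path) else 'formatted'}")
# The deleted README_cn.md no longer matches; a nested README.md still does,
# because the search looks anywhere in the path.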

CITATION.cff

Lines changed: 14 additions & 0 deletions
@@ -0,0 +1,14 @@
+ cff-version: 1.2.0
+ preferred-citation:
+   type: software
+   message: If you use YOLOv5, please cite it as below.
+   authors:
+     - family-names: Jocher
+       given-names: Glenn
+       orcid: "https://orcid.org/0000-0001-5950-6979"
+   title: "YOLOv5 by Ultralytics"
+   version: 7.0
+   doi: 10.5281/zenodo.3908559
+   date-released: 2020-5-29
+   license: GPL-3.0
+   url: "https://github.com/ultralytics/yolov5"

README.md

Lines changed: 137 additions & 40 deletions
Large diffs are not rendered by default.

README.zh-CN.md

Lines changed: 482 additions & 0 deletions
Large diffs are not rendered by default.

classify/predict.py

Lines changed: 3 additions & 1 deletion
@@ -8,6 +8,8 @@
          vid.mp4                         # video
          screen                          # screenshot
          path/                           # directory
+         list.txt                        # list of images
+         list.streams                    # list of streams
          'path/*.jpg'                    # glob
          'https://youtu.be/Zgi9g1ksQHc'  # YouTube
          'rtsp://example.com/media.mp4'  # RTSP, RTMP, HTTP stream
@@ -74,7 +76,7 @@ def run(
      save_img = not nosave and not source.endswith('.txt')  # save inference images
      is_file = Path(source).suffix[1:] in (IMG_FORMATS + VID_FORMATS)
      is_url = source.lower().startswith(('rtsp://', 'rtmp://', 'http://', 'https://'))
-     webcam = source.isnumeric() or source.endswith('.txt') or (is_url and not is_file)
+     webcam = source.isnumeric() or source.endswith('.streams') or (is_url and not is_file)
      screenshot = source.lower().startswith('screen')
      if is_url and is_file:
          source = check_file(source)  # download
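
The docstring lines and the `webcam` test belong together: a `*.streams` text file (one stream URL per line) is now routed to the multi-stream loader, while a `*.txt` list of image paths keeps going through the regular file loader. A standalone sketch of that dispatch, using a hypothetical `classify_source` helper rather than the real `run()` code:

from pathlib import Path

IMG_FORMATS = ('bmp', 'jpeg', 'jpg', 'png', 'tif', 'tiff', 'webp')  # abbreviated for illustration
VID_FORMATS = ('avi', 'mkv', 'mov', 'mp4')

def classify_source(source: str) -> str:
    """Hypothetical helper: report which loader a --source string would be routed to."""
    if source.lower().startswith('screen'):
        return 'screenshot'
    is_file = Path(source).suffix[1:].lower() in (IMG_FORMATS + VID_FORMATS)
    is_url = source.lower().startswith(('rtsp://', 'rtmp://', 'http://', 'https://'))
    webcam = source.isnumeric() or source.endswith('.streams') or (is_url and not is_file)
    return 'stream loader' if webcam else 'image/video loader'

for s in ('0', 'list.streams', 'list.txt', 'data/images/bus.jpg', 'https://youtu.be/Zgi9g1ksQHc'):
    print(f'{s}: {classify_source(s)}')
# '0' (webcam) and 'list.streams' go to the stream loader; 'list.txt' and 'bus.jpg' go to the file loader.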

classify/train.py

Lines changed: 1 addition & 1 deletion
@@ -6,7 +6,7 @@
      $ python classify/train.py --model yolov5s-cls.pt --data imagenette160 --epochs 5 --img 224
 
  Usage - Multi-GPU DDP training:
-     $ python -m torch.distributed.run --nproc_per_node 4 --master_port 1 classify/train.py --model yolov5s-cls.pt --data imagenet --epochs 5 --img 224 --device 0,1,2,3
+     $ python -m torch.distributed.run --nproc_per_node 4 --master_port 2022 classify/train.py --model yolov5s-cls.pt --data imagenet --epochs 5 --img 224 --device 0,1,2,3
 
  Datasets:            --data mnist, fashion-mnist, cifar10, cifar100, imagenette, imagewoof, imagenet, or 'path/to/data'
  YOLOv5-cls models:   --model yolov5n-cls.pt, yolov5s-cls.pt, yolov5m-cls.pt, yolov5l-cls.pt, yolov5x-cls.pt
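
The only change is the example `--master_port`: port 1 is a privileged port that normally cannot be bound without elevated permissions, so the launcher's rendezvous endpoint would fail to start, while 2022 is an ordinary free port. For context, `torch.distributed.run` exports `MASTER_ADDR`, `MASTER_PORT`, `RANK`, `LOCAL_RANK` and `WORLD_SIZE` to every worker before the process group is created. A minimal single-process sketch of that handshake (illustrative, not the YOLOv5 training code):

import os
import torch
import torch.distributed as dist

# torch.distributed.run sets these for each worker it spawns; they are set explicitly here
# to mimic a single-worker launch on one machine.
os.environ.setdefault('MASTER_ADDR', '127.0.0.1')
os.environ.setdefault('MASTER_PORT', '2022')  # any free, unprivileged port
os.environ.setdefault('RANK', '0')
os.environ.setdefault('WORLD_SIZE', '1')

backend = 'nccl' if torch.cuda.is_available() else 'gloo'
dist.init_process_group(backend=backend)  # env:// init reads MASTER_ADDR/MASTER_PORT
print(f"rank {dist.get_rank()} of {dist.get_world_size()} ready on port {os.environ['MASTER_PORT']}")
dist.destroy_process_group()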

classify/tutorial.ipynb

Lines changed: 41 additions & 38 deletions
@@ -9,7 +9,7 @@
  "<div align=\"center\">\n",
  "\n",
  " <a href=\"https://ultralytics.com/yolov5\" target=\"_blank\">\n",
- " <img width=\"1024\", src=\"https://github.com/ultralytics/assets/raw/master/yolov5/v62/splash_readme.png\"></a>\n",
+ " <img width=\"1024\", src=\"https://raw.githubusercontent.com/ultralytics/assets/master/yolov5/v70/splash.png\"></a>\n",
  "\n",
  "\n",
  "<br>\n",
@@ -36,20 +36,20 @@
  },
  {
  "cell_type": "code",
- "execution_count": 1,
+ "execution_count": null,
  "metadata": {
  "colab": {
  "base_uri": "https://localhost:8080/"
  },
  "id": "wbvMlHd_QwMG",
- "outputId": "43b2e1b5-78d9-4e1d-8530-ee9779bba160"
+ "outputId": "0806e375-610d-4ec0-c867-763dbb518279"
  },
  "outputs": [
  {
  "output_type": "stream",
  "name": "stderr",
  "text": [
- "YOLOv5 🚀 v6.2-258-g7fc7ed7 Python-3.7.15 torch-1.12.1+cu113 CUDA:0 (Tesla T4, 15110MiB)\n"
+ "YOLOv5 🚀 v7.0-3-g61ebf5e Python-3.7.15 torch-1.12.1+cu113 CUDA:0 (Tesla T4, 15110MiB)\n"
  ]
  },
  {
@@ -94,30 +94,30 @@
  },
  {
  "cell_type": "code",
- "execution_count": 2,
+ "execution_count": null,
  "metadata": {
  "colab": {
  "base_uri": "https://localhost:8080/"
  },
  "id": "zR9ZbuQCH7FX",
- "outputId": "1b610787-7cf7-4c33-aac2-aa50fbb84a94"
+ "outputId": "50504ef7-aa3e-4281-a4e3-d0c7df3c0ffe"
  },
  "outputs": [
  {
  "output_type": "stream",
  "name": "stdout",
  "text": [
- "\u001b[34m\u001b[1mclassify/predict: \u001b[0mweights=['yolov5s-cls.pt'], source=data/images, data=data/coco128.yaml, imgsz=[224, 224], device=, view_img=False, save_txt=True, nosave=False, augment=False, visualize=False, update=False, project=runs/predict-cls, name=exp, exist_ok=False, half=False, dnn=False, vid_stride=1\n",
- "YOLOv5 🚀 v6.2-258-g7fc7ed7 Python-3.7.15 torch-1.12.1+cu113 CUDA:0 (Tesla T4, 15110MiB)\n",
+ "\u001b[34m\u001b[1mclassify/predict: \u001b[0mweights=['yolov5s-cls.pt'], source=data/images, data=data/coco128.yaml, imgsz=[224, 224], device=, view_img=False, save_txt=False, nosave=False, augment=False, visualize=False, update=False, project=runs/predict-cls, name=exp, exist_ok=False, half=False, dnn=False, vid_stride=1\n",
+ "YOLOv5 🚀 v7.0-3-g61ebf5e Python-3.7.15 torch-1.12.1+cu113 CUDA:0 (Tesla T4, 15110MiB)\n",
  "\n",
- "Downloading https://github.com/ultralytics/yolov5/releases/download/v6.2/yolov5s-cls.pt to yolov5s-cls.pt...\n",
- "100% 10.5M/10.5M [00:03<00:00, 2.94MB/s]\n",
+ "Downloading https://github.com/ultralytics/yolov5/releases/download/v7.0/yolov5s-cls.pt to yolov5s-cls.pt...\n",
+ "100% 10.5M/10.5M [00:00<00:00, 12.3MB/s]\n",
  "\n",
  "Fusing layers... \n",
  "Model summary: 117 layers, 5447688 parameters, 0 gradients, 11.4 GFLOPs\n",
  "image 1/2 /content/yolov5/data/images/bus.jpg: 224x224 minibus 0.39, police van 0.24, amphibious vehicle 0.05, recreational vehicle 0.04, trolleybus 0.03, 3.9ms\n",
- "image 2/2 /content/yolov5/data/images/zidane.jpg: 224x224 suit 0.38, bow tie 0.19, bridegroom 0.18, rugby ball 0.04, stage 0.02, 4.1ms\n",
- "Speed: 0.3ms pre-process, 4.0ms inference, 1.5ms NMS per image at shape (1, 3, 224, 224)\n",
+ "image 2/2 /content/yolov5/data/images/zidane.jpg: 224x224 suit 0.38, bow tie 0.19, bridegroom 0.18, rugby ball 0.04, stage 0.02, 4.6ms\n",
+ "Speed: 0.3ms pre-process, 4.3ms inference, 1.5ms NMS per image at shape (1, 3, 224, 224)\n",
  "Results saved to \u001b[1mruns/predict-cls/exp\u001b[0m\n"
  ]
  }
@@ -149,29 +149,29 @@
  },
  {
  "cell_type": "code",
- "execution_count": 3,
+ "execution_count": null,
  "metadata": {
  "colab": {
  "base_uri": "https://localhost:8080/"
  },
  "id": "WQPtK1QYVaD_",
- "outputId": "92de5f34-cf41-49e7-b679-41db94e995ac"
+ "outputId": "20fc0630-141e-4a90-ea06-342cbd7ce496"
  },
  "outputs": [
  {
  "output_type": "stream",
  "name": "stdout",
  "text": [
- "--2022-11-18 21:48:38-- https://image-net.org/data/ILSVRC/2012/ILSVRC2012_img_val.tar\n",
+ "--2022-11-22 19:53:40-- https://image-net.org/data/ILSVRC/2012/ILSVRC2012_img_val.tar\n",
  "Resolving image-net.org (image-net.org)... 171.64.68.16\n",
  "Connecting to image-net.org (image-net.org)|171.64.68.16|:443... connected.\n",
  "HTTP request sent, awaiting response... 200 OK\n",
  "Length: 6744924160 (6.3G) [application/x-tar]\n",
  "Saving to: ‘ILSVRC2012_img_val.tar’\n",
  "\n",
- "ILSVRC2012_img_val. 100%[===================>] 6.28G 7.15MB/s in 11m 13s \n",
+ "ILSVRC2012_img_val. 100%[===================>] 6.28G 16.1MB/s in 10m 52s \n",
  "\n",
- "2022-11-18 21:59:52 (9.55 MB/s) - ‘ILSVRC2012_img_val.tar’ saved [6744924160/6744924160]\n",
+ "2022-11-22 20:04:32 (9.87 MB/s) - ‘ILSVRC2012_img_val.tar’ saved [6744924160/6744924160]\n",
  "\n"
  ]
  }
@@ -183,25 +183,25 @@
  },
  {
  "cell_type": "code",
- "execution_count": 4,
+ "execution_count": null,
  "metadata": {
  "colab": {
  "base_uri": "https://localhost:8080/"
  },
  "id": "X58w8JLpMnjH",
- "outputId": "9961ad87-d639-4489-b578-0a0578fefaab"
+ "outputId": "41843132-98e2-4c25-d474-4cd7b246fb8e"
  },
  "outputs": [
  {
  "output_type": "stream",
  "name": "stdout",
  "text": [
  "\u001b[34m\u001b[1mclassify/val: \u001b[0mdata=../datasets/imagenet, weights=['yolov5s-cls.pt'], batch_size=128, imgsz=224, device=, workers=8, verbose=True, project=runs/val-cls, name=exp, exist_ok=False, half=True, dnn=False\n",
- "YOLOv5 🚀 v6.2-258-g7fc7ed7 Python-3.7.15 torch-1.12.1+cu113 CUDA:0 (Tesla T4, 15110MiB)\n",
+ "YOLOv5 🚀 v7.0-3-g61ebf5e Python-3.7.15 torch-1.12.1+cu113 CUDA:0 (Tesla T4, 15110MiB)\n",
  "\n",
  "Fusing layers... \n",
  "Model summary: 117 layers, 5447688 parameters, 0 gradients, 11.4 GFLOPs\n",
- "validating: 100% 391/391 [04:48<00:00, 1.35it/s]\n",
+ "validating: 100% 391/391 [04:57<00:00, 1.31it/s]\n",
  "                   Class      Images    top1_acc    top5_acc\n",
  "                     all       50000       0.715       0.902\n",
  "                   tench          50        0.94        0.98\n",
@@ -1269,45 +1269,47 @@
  },
  {
  "cell_type": "code",
- "execution_count": 5,
+ "execution_count": null,
  "metadata": {
  "colab": {
  "base_uri": "https://localhost:8080/"
  },
  "id": "1NcFxRcFdJ_O",
- "outputId": "638c55b1-dc45-4eee-cabc-4921dc61faf5"
+ "outputId": "77c8d487-16db-4073-b3ea-06cabf2e7766"
  },
  "outputs": [
  {
  "output_type": "stream",
  "name": "stdout",
  "text": [
- "\u001b[34m\u001b[1mclassify/train: \u001b[0mmodel=yolov5s-cls.pt, data=imagenette160, epochs=3, batch_size=16, imgsz=224, nosave=False, cache=ram, device=, workers=8, project=runs/train-cls, name=exp, exist_ok=False, pretrained=True, optimizer=Adam, lr0=0.001, decay=5e-05, label_smoothing=0.1, cutoff=None, dropout=None, verbose=False, seed=0, local_rank=-1\n",
+ "\u001b[34m\u001b[1mclassify/train: \u001b[0mmodel=yolov5s-cls.pt, data=imagenette160, epochs=5, batch_size=64, imgsz=224, nosave=False, cache=ram, device=, workers=8, project=runs/train-cls, name=exp, exist_ok=False, pretrained=True, optimizer=Adam, lr0=0.001, decay=5e-05, label_smoothing=0.1, cutoff=None, dropout=None, verbose=False, seed=0, local_rank=-1\n",
  "\u001b[34m\u001b[1mgithub: \u001b[0mup to date with https://github.com/ultralytics/yolov5 ✅\n",
- "YOLOv5 🚀 v6.2-258-g7fc7ed7 Python-3.7.15 torch-1.12.1+cu113 CUDA:0 (Tesla T4, 15110MiB)\n",
+ "YOLOv5 🚀 v7.0-3-g61ebf5e Python-3.7.15 torch-1.12.1+cu113 CUDA:0 (Tesla T4, 15110MiB)\n",
  "\n",
  "\u001b[34m\u001b[1mTensorBoard: \u001b[0mStart with 'tensorboard --logdir runs/train-cls', view at http://localhost:6006/\n",
  "\n",
  "Dataset not found ⚠️, missing path /content/datasets/imagenette160, attempting download...\n",
  "Downloading https://github.com/ultralytics/yolov5/releases/download/v1.0/imagenette160.zip to /content/datasets/imagenette160.zip...\n",
- "100% 103M/103M [00:09<00:00, 11.1MB/s]\n",
+ "100% 103M/103M [00:00<00:00, 347MB/s] \n",
  "Unzipping /content/datasets/imagenette160.zip...\n",
- "Dataset download success ✅ (13.2s), saved to \u001b[1m/content/datasets/imagenette160\u001b[0m\n",
+ "Dataset download success ✅ (3.3s), saved to \u001b[1m/content/datasets/imagenette160\u001b[0m\n",
  "\n",
  "\u001b[34m\u001b[1malbumentations: \u001b[0mRandomResizedCrop(p=1.0, height=224, width=224, scale=(0.08, 1.0), ratio=(0.75, 1.3333333333333333), interpolation=1), HorizontalFlip(p=0.5), ColorJitter(p=0.5, brightness=[0.6, 1.4], contrast=[0.6, 1.4], saturation=[0.6, 1.4], hue=[0, 0]), Normalize(p=1.0, mean=(0.485, 0.456, 0.406), std=(0.229, 0.224, 0.225), max_pixel_value=255.0), ToTensorV2(always_apply=True, p=1.0, transpose_mask=False)\n",
  "Model summary: 149 layers, 4185290 parameters, 4185290 gradients, 10.5 GFLOPs\n",
  "\u001b[34m\u001b[1moptimizer:\u001b[0m Adam(lr=0.001) with parameter groups 32 weight(decay=0.0), 33 weight(decay=5e-05), 33 bias\n",
  "Image sizes 224 train, 224 test\n",
  "Using 1 dataloader workers\n",
  "Logging results to \u001b[1mruns/train-cls/exp\u001b[0m\n",
- "Starting yolov5s-cls.pt training on imagenette160 dataset with 10 classes for 3 epochs...\n",
+ "Starting yolov5s-cls.pt training on imagenette160 dataset with 10 classes for 5 epochs...\n",
  "\n",
  "     Epoch   GPU_mem  train_loss    val_loss    top1_acc    top5_acc\n",
- "       1/3    0.348G        1.31        1.09       0.794       0.979: 100% 592/592 [01:02<00:00, 9.47it/s]\n",
- "       2/3    0.415G        1.09       0.852       0.883        0.99: 100% 592/592 [00:59<00:00, 10.00it/s]\n",
- "       3/3    0.415G       0.954       0.776       0.907       0.994: 100% 592/592 [00:59<00:00, 9.89it/s]\n",
+ "       1/5     1.47G        1.05       0.974       0.828       0.975: 100% 148/148 [00:38<00:00, 3.82it/s]\n",
+ "       2/5     1.73G       0.895       0.766       0.911       0.994: 100% 148/148 [00:36<00:00, 4.03it/s]\n",
+ "       3/5     1.73G        0.82       0.704       0.934       0.996: 100% 148/148 [00:35<00:00, 4.20it/s]\n",
+ "       4/5     1.73G       0.766       0.664       0.951       0.998: 100% 148/148 [00:36<00:00, 4.05it/s]\n",
+ "       5/5     1.73G       0.724       0.634       0.959       0.997: 100% 148/148 [00:37<00:00, 3.94it/s]\n",
  "\n",
- "Training complete (0.051 hours)\n",
+ "Training complete (0.052 hours)\n",
  "Results saved to \u001b[1mruns/train-cls/exp\u001b[0m\n",
  "Predict:         python classify/predict.py --weights runs/train-cls/exp/weights/best.pt --source im.jpg\n",
  "Validate:        python classify/val.py --weights runs/train-cls/exp/weights/best.pt --data /content/datasets/imagenette160\n",
@@ -1320,7 +1322,7 @@
  ],
  "source": [
  "# Train YOLOv5s Classification on Imagenette160 for 3 epochs\n",
- "!python classify/train.py --img 224 --batch 16 --epochs 3 --data imagenette160 --model yolov5s-cls.pt --cache"
+ "!python classify/train.py --model yolov5s-cls.pt --data imagenette160 --epochs 5 --img 224 --cache"
  ]
  },
  {
@@ -1339,19 +1341,20 @@
  },
  "source": [
  "## Comet Logging and Visualization 🌟 NEW\n",
- "[Comet](https://bit.ly/yolov5-readme-comet) is now fully integrated with YOLOv5. Track and visualize model metrics in real time, save your hyperparameters, datasets, and model checkpoints, and visualize your model predictions with [Comet Custom Panels](https://bit.ly/yolov5-colab-comet-panels)! Comet makes sure you never lose track of your work and makes it easy to share results and collaborate across teams of all sizes! \n",
+ "\n",
+ "[Comet](https://www.comet.com/site/lp/yolov5-with-comet/?utm_source=yolov5&utm_medium=partner&utm_campaign=partner_yolov5_2022&utm_content=yolov5_colab) is now fully integrated with YOLOv5. Track and visualize model metrics in real time, save your hyperparameters, datasets, and model checkpoints, and visualize your model predictions with [Comet Custom Panels](https://www.comet.com/docs/v2/guides/comet-dashboard/code-panels/about-panels/?utm_source=yolov5&utm_medium=partner&utm_campaign=partner_yolov5_2022&utm_content=yolov5_colab)! Comet makes sure you never lose track of your work and makes it easy to share results and collaborate across teams of all sizes!\n",
  "\n",
  "Getting started is easy:\n",
  "```shell\n",
  "pip install comet_ml # 1. install\n",
  "export COMET_API_KEY=<Your API Key> # 2. paste API key\n",
  "python train.py --img 640 --epochs 3 --data coco128.yaml --weights yolov5s.pt # 3. train\n",
  "```\n",
- "\n",
- "To learn more about all of the supported Comet features for this integration, check out the [Comet Tutorial](https://github.com/ultralytics/yolov5/tree/master/utils/loggers/comet). If you'd like to learn more about Comet, head over to our [documentation](https://bit.ly/yolov5-colab-comet-docs). Get started by trying out the Comet Colab Notebook:\n",
+ "To learn more about all of the supported Comet features for this integration, check out the [Comet Tutorial](https://github.com/ultralytics/yolov5/tree/master/utils/loggers/comet). If you'd like to learn more about Comet, head over to our [documentation](https://www.comet.com/docs/v2/?utm_source=yolov5&utm_medium=partner&utm_campaign=partner_yolov5_2022&utm_content=yolov5_colab). Get started by trying out the Comet Colab Notebook:\n",
  "[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/1RG0WOQyxlDlo5Km8GogJpIEJlg_5lyYO?usp=sharing)\n",
  "\n",
- "<img width=\"1920\" alt=\"yolo-ui\" src=\"https://user-images.githubusercontent.com/26833433/202851203-164e94e1-2238-46dd-91f8-de020e9d6b41.png\">"
+ "<a href=\"https://bit.ly/yolov5-readme-comet2\">\n",
+ "<img alt=\"Comet Dashboard\" src=\"https://user-images.githubusercontent.com/26833433/202851203-164e94e1-2238-46dd-91f8-de020e9d6b41.png\" width=\"1280\"/></a>"
  ]
  },
  {
@@ -1474,4 +1477,4 @@
  },
  "nbformat": 4,
  "nbformat_minor": 0
- }
+ }
