8 - 1 - Non-linear Hypotheses (10 min).srt
1
00:00:00,440 --> 00:00:01,400
In this and in the
在这节课和接下来的课程中
(Subtitles compiled by Huang Haiguang, Ocean University of China, haiguang2000@qq.com)
2
00:00:01,480 --> 00:00:02,640
next set of videos, I'd like
我将给大家介绍
3
00:00:02,780 --> 00:00:04,270
to tell you about a learning
一种叫“神经网络”(Neural Network)
4
00:00:04,550 --> 00:00:06,110
algorithm called a Neural Network.
的机器学习算法
5
00:00:07,190 --> 00:00:07,900
We're going to first talk about
我们将首先讨论
6
00:00:08,079 --> 00:00:09,330
the representation and then
神经网络的表层结构
7
00:00:09,600 --> 00:00:10,390
in the next set of videos
在后续课程中
8
00:00:10,410 --> 00:00:12,160
talk about learning algorithms for it.
再来具体讨论的学习算法
9
00:00:12,660 --> 00:00:14,070
Neural networks is actually
神经网络实际上是一个
10
00:00:14,510 --> 00:00:15,870
a pretty old idea, but had
相对古老的算法
11
00:00:16,290 --> 00:00:17,680
fallen out of favor for a while.
并且后来沉寂了一段时间
12
00:00:18,200 --> 00:00:19,270
But today, it is the
不过到了现在
13
00:00:19,580 --> 00:00:20,820
state-of-the-art technique for
它又成为许多机器学习问题
14
00:00:21,090 --> 00:00:22,390
many different machine learning problems.
的首选技术
15
00:00:23,740 --> 00:00:25,740
So why do we need yet another learning algorithm?
不过我们为什么还需要这个学习算法?
16
00:00:26,300 --> 00:00:28,030
We already have linear regression and
我们已经有线性回归和逻辑回归算法了
17
00:00:28,180 --> 00:00:31,260
we have logistic regression, so why do we need, you know, neural networks?
为什么还要研究神经网络?
18
00:00:32,280 --> 00:00:34,260
In order to motivate the discussion
为了阐述研究
19
00:00:34,790 --> 00:00:35,970
of neural networks, let me
神经网络算法的目的
20
00:00:36,120 --> 00:00:37,130
start by showing you a few
我们首先来看几个
21
00:00:37,310 --> 00:00:38,720
examples of machine learning
机器学习问题作为例子
22
00:00:38,930 --> 00:00:40,100
problems where we need
这几个问题的解决
23
00:00:40,300 --> 00:00:41,850
to learn complex non-linear hypotheses.
都依赖于研究复杂的非线性分类器
24
00:00:43,850 --> 00:00:45,650
Consider a supervised learning classification
考虑这个监督学习分类的问题
25
00:00:46,530 --> 00:00:48,440
problem where you have a training set like this.
我们已经有了对应的训练集
26
00:00:49,280 --> 00:00:50,530
If you want to apply logistic
如果利用逻辑回归算法
27
00:00:50,960 --> 00:00:52,710
regression to this problem, one
来解决这个问题
28
00:00:52,900 --> 00:00:54,250
thing you could do is apply
首先需要构造
29
00:00:54,660 --> 00:00:56,140
logistic regression with a
一个包含很多非线性项的
30
00:00:56,190 --> 00:00:57,720
lot of nonlinear features like that.
逻辑回归函数
31
00:00:58,170 --> 00:00:59,580
So here, g as usual
这里g仍是s型函数 (即f(x)=1/(1+e^-x) )
32
00:01:00,070 --> 00:01:01,710
is the sigmoid function, and we
我们能让函数
33
00:01:01,780 --> 00:01:04,680
can include lots of polynomial terms like these.
包含很多像这样的多项式项
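As a side note, here is a minimal Python sketch (not from the lecture) of such a hypothesis, assuming just two features x1, x2 and hypothetical parameter values: the sigmoid g is applied to a hand-picked set of polynomial terms.

import numpy as np

def sigmoid(z):
    # g(z) = 1 / (1 + exp(-z)), the sigmoid function mentioned above
    return 1.0 / (1.0 + np.exp(-z))

def hypothesis(x1, x2, theta):
    # Hypothetical polynomial feature vector: [1, x1, x2, x1*x2, x1^2, x2^2]
    features = np.array([1.0, x1, x2, x1 * x2, x1 ** 2, x2 ** 2])
    return sigmoid(theta @ features)

# Hypothetical parameters (in practice theta would be learned from the training set);
# with these values the decision boundary is the circle x1^2 + x2^2 = 1.
theta = np.array([-1.0, 0.0, 0.0, 0.0, 1.0, 1.0])
print(hypothesis(0.3, 0.4, theta))   # below 0.5, so this point is classified as negative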
34
00:01:05,450 --> 00:01:06,790
And, if you include enough polynomial
事实上 当多项式项数足够多时
35
00:01:07,370 --> 00:01:08,280
terms then, you know, maybe
那么可能
36
00:01:08,950 --> 00:01:10,280
you can get a hypothesis
你能够得到一个
37
00:01:11,600 --> 00:01:13,780
that separates the positive and negative examples.
分开正样本和负样本的分界线
38
00:01:14,630 --> 00:01:16,080
This particular method works well
当只有两项时
39
00:01:16,470 --> 00:01:18,400
when you have only, say, two
比如 x1 x2
40
00:01:18,620 --> 00:01:20,180
features - x1 and x2
这种方法确实能得到不错的结果
41
00:01:20,190 --> 00:01:20,980
- because you can then include
因为你可以
42
00:01:21,500 --> 00:01:22,880
all those polynomial terms of
把x1和x2的所有组合
43
00:01:23,400 --> 00:01:24,620
x1 and x2.
都包含到多项式中
44
00:01:24,810 --> 00:01:26,280
But for many interesting machine learning
但是对于许多
45
00:01:26,520 --> 00:01:27,730
problems, we would have a
复杂的机器学习问题
46
00:01:27,910 --> 00:01:29,230
lot more features than just two.
涉及的项往往多于两项
47
00:01:30,780 --> 00:01:31,760
We've been talking for a while
我们之前已经讨论过
48
00:01:32,320 --> 00:01:34,560
about housing prediction, and suppose
房价预测的问题
49
00:01:35,130 --> 00:01:36,990
you have a housing classification
假设现在要处理的是
50
00:01:38,020 --> 00:01:39,280
problem rather than a
关于住房的分类问题
51
00:01:39,390 --> 00:01:41,170
regression problem, like maybe
而不是一个回归问题
52
00:01:41,580 --> 00:01:43,350
if you have different features of
假设你对一栋房子的多方面特点
53
00:01:43,440 --> 00:01:44,760
a house, and you want
都有所了解
54
00:01:45,010 --> 00:01:46,000
to predict what are the
你想预测
55
00:01:46,050 --> 00:01:47,590
odds that your house will
房子在未来半年内
56
00:01:47,700 --> 00:01:48,710
be sold within the next
能被卖出去的概率
57
00:01:48,910 --> 00:01:51,040
six months, so that will be a classification problem.
这是一个分类问题
58
00:01:52,100 --> 00:01:53,060
And as we saw we can
我们可以想出
59
00:01:53,260 --> 00:01:55,130
come up with quite a
很多特征
60
00:01:55,260 --> 00:01:56,480
lot of features, maybe a hundred
对于不同的房子有可能
61
00:01:56,840 --> 00:01:58,270
different features of different houses.
就有上百个特征
62
00:02:00,130 --> 00:02:01,610
For a problem like this, if
对于这类问题
63
00:02:01,880 --> 00:02:03,260
you were to include all the
如果要包含
64
00:02:03,370 --> 00:02:04,980
quadratic terms, all of
所有的二次项
65
00:02:05,100 --> 00:02:06,260
these, even all of the
即使只包含
66
00:02:06,540 --> 00:02:07,540
quadratic, that is, the second
二项式或多项式的计算
67
00:02:07,930 --> 00:02:10,450
order polynomial terms, there would be a lot of them.
最终的多项式也可能有很多项
68
00:02:10,560 --> 00:02:11,580
There would be terms like x1 squared,
比如x1^2
69
00:02:12,960 --> 00:02:17,610
x1x2, x1x3, you know, x1x4
x1x2 x1x3 x1x4
70
00:02:18,750 --> 00:02:21,880
up to x1x100 and then
直到x1x100
71
00:02:21,980 --> 00:02:23,620
you have x2 squared, x2x3
还有x2^2 x2x3
72
00:02:25,620 --> 00:02:25,980
and so on.
等等很多项
73
00:02:26,510 --> 00:02:27,770
And if you include just
因此
74
00:02:28,060 --> 00:02:29,200
the second order terms, that
即使只考虑二阶项
75
00:02:29,330 --> 00:02:30,750
is, the terms that are
也就是说
76
00:02:30,840 --> 00:02:32,090
a product of, you know,
两个项的乘积
77
00:02:32,220 --> 00:02:33,390
two of these terms, x1
x1乘以x1
78
00:02:33,510 --> 00:02:35,010
times x1 and so on, then,
等等类似于此的项
79
00:02:35,780 --> 00:02:36,920
for the case of n equals
那么 在n=100的情况下
80
00:02:38,180 --> 00:02:40,280
100, you end up with about five thousand features.
最终也有5000个二次项
81
00:02:41,890 --> 00:02:44,880
And, asymptotically, the
而且渐渐地
82
00:02:45,000 --> 00:02:46,330
number of quadratic features grows
随着特征个数n的增加
83
00:02:46,770 --> 00:02:48,670
roughly as order n
二次项的个数大约以n^2的量级增长
84
00:02:48,820 --> 00:02:50,330
squared, where n is the
其中
85
00:02:50,460 --> 00:02:52,790
number of the original features,
n是原始项的个数
86
00:02:53,370 --> 00:02:54,780
like x1 through x100 that we had.
即我们之前说过的x1到x100这些项
87
00:02:55,700 --> 00:02:58,750
And it's actually closer to n squared over two.
事实上二次项的个数大约是(n^2)/2
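To make the counting concrete, here is a small sketch (my own, not from the video): every quadratic term is a product x_i * x_j with i <= j, so with n features there are n(n+1)/2 of them, which is roughly n^2/2 and about 5,000 when n = 100.

from itertools import combinations_with_replacement

n = 100  # original features x1 ... x100
# Each quadratic term is a product x_i * x_j with i <= j
quadratic_terms = list(combinations_with_replacement(range(1, n + 1), 2))
print(len(quadratic_terms))   # 5050
print(n * (n + 1) // 2)       # same count from the closed-form formula, roughly n^2 / 2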
88
00:02:59,920 --> 00:03:01,440
So including all the
因此要包含所有的
89
00:03:01,560 --> 00:03:02,920
quadratic features doesn't seem
二次项是很困难的
90
00:03:03,220 --> 00:03:04,220
like it's maybe a good
所以这可能
91
00:03:04,300 --> 00:03:05,380
idea, because that is a
不是一个好的做法
92
00:03:05,580 --> 00:03:07,050
lot of features and you
而且由于项数过多
93
00:03:07,220 --> 00:03:08,920
might end up overfitting the training
最后的结果很有可能是过拟合的
94
00:03:09,330 --> 00:03:10,500
set, and it can
此外
95
00:03:10,740 --> 00:03:12,800
also be computationally expensive, you know, to
在处理这么多项时
96
00:03:14,080 --> 00:03:15,120
be working with that many features.
也存在运算量过大的问题
97
00:03:16,450 --> 00:03:17,540
One thing you could do is
当然 你也可以试试
98
00:03:17,770 --> 00:03:19,090
include only a subset of
只包含上边这些二次项的子集
99
00:03:19,290 --> 00:03:20,950
these, so if you include only the
例如 我们只考虑
100
00:03:21,050 --> 00:03:22,630
features x1 squared, x2 squared,
x1^2 x2^2
101
00:03:23,590 --> 00:03:25,180
x3 squared, up to
x3^2直到
102
00:03:25,580 --> 00:03:27,750
maybe x100 squared, then
x100^2 这些项
103
00:03:28,100 --> 00:03:29,500
the number of features is much smaller.
这样就可以将二次项的数量大幅度减少
104
00:03:29,980 --> 00:03:31,720
Here you have only 100 such
减少到只有100个二次项
105
00:03:32,070 --> 00:03:33,850
quadratic features, but this
但是由于
106
00:03:34,120 --> 00:03:35,950
is not enough features and
忽略了太多相关项
107
00:03:36,100 --> 00:03:37,170
certainly won't let you fit
在处理类似左上角的数据时
108
00:03:37,290 --> 00:03:39,330
the data set like that on the upper left.
不可能得到理想的结果
109
00:03:39,570 --> 00:03:40,550
In fact, if you include
实际上
110
00:03:41,040 --> 00:03:42,720
only these quadratic features together
如果只考虑x1的平方
111
00:03:43,170 --> 00:03:44,870
with the original x1, and
到x100的平方
112
00:03:45,350 --> 00:03:46,500
so on, up to x100 features,
这一百个二次项
113
00:03:47,460 --> 00:03:48,530
then you can actually fit very
那么你可能会
114
00:03:48,910 --> 00:03:50,210
interesting hypotheses. So, you
拟合出一些特别的假设
115
00:03:50,330 --> 00:03:52,350
can fit things like, you know, axis-aligned
比如可能拟合出
116
00:03:52,490 --> 00:03:53,860
ellipses like these, but
一个椭圆状的曲线
117
00:03:55,080 --> 00:03:56,240
you certainly cannot fit a more
但是肯定不能拟合出
118
00:03:56,340 --> 00:03:57,930
complex data set like that shown here.
像左上角这个数据集的分界线
119
00:03:59,360 --> 00:04:00,530
So 5000 features seems like
所以5000个二次项看起来已经很多了
120
00:04:00,620 --> 00:04:03,090
a lot, if you were
而现在假设
121
00:04:03,230 --> 00:04:04,860
to include the cubic, or
包括三次项
122
00:04:05,140 --> 00:04:06,100
third order terms of the features,
或者三阶项
123
00:04:06,440 --> 00:04:08,050
the x1, x2, x3.
例如x1 x2 x3
124
00:04:08,400 --> 00:04:09,800
You know, x1 squared,
x1^2
125
00:04:10,310 --> 00:04:12,240
x2, x10 and
x2 x10
126
00:04:12,900 --> 00:04:15,280
x11, x17 and so on.
x11 x17等等
127
00:04:15,700 --> 00:04:18,110
You can imagine there are gonna be a lot of these features.
类似的三次项有很多很多
128
00:04:19,040 --> 00:04:19,770
In fact, there are going to be
事实上
129
00:04:20,050 --> 00:04:21,260
on the order of n cubed such features
三次项的个数是以n^3的量级增加
130
00:04:22,210 --> 00:04:23,830
and if n is 100
当n=100时
131
00:04:24,150 --> 00:04:25,660
you can compute that, you
可以计算出来
132
00:04:25,740 --> 00:04:26,870
end up with on the order
最后能得到
133
00:04:27,730 --> 00:04:29,650
of about 170,000 such cubic
大概170,000个三次项
134
00:04:30,040 --> 00:04:31,670
features and so including
所以
135
00:04:32,260 --> 00:04:34,470
these higher order polynomial features when
当初始特征个数n增大时
136
00:04:34,920 --> 00:04:36,050
your original feature set n
这些高阶多项式项数
137
00:04:36,230 --> 00:04:37,730
is large this really dramatically
将以几何级数递增
138
00:04:38,530 --> 00:04:40,440
blows up your feature space and
特征空间也随之急剧膨胀
139
00:04:41,070 --> 00:04:42,180
this doesn't seem like a
当特征个数n很大时
140
00:04:42,320 --> 00:04:43,320
good way to come up with
如果找出附加项
141
00:04:43,560 --> 00:04:45,050
additional features with which
来建立一些分类器
142
00:04:45,240 --> 00:04:48,100
to build nonlinear classifiers when n is large.
这并不是一个好做法
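The same counting argument (again just an illustrative sketch, not part of the lecture) shows how quickly the feature space blows up with the degree when n = 100: the number of degree-d monomials with repetition is C(n + d - 1, d), which gives about 5,000 quadratic terms but roughly n^3/6, around 170,000, cubic terms.

from math import comb

n = 100  # original features x1 ... x100
for d in (2, 3):
    # number of degree-d monomials in n variables, repetition allowed
    print(d, comb(n + d - 1, d))
# degree 2 -> 5050 (about n^2 / 2); degree 3 -> 171700 (about n^3 / 6, i.e. roughly 170,000)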
143
00:04:49,590 --> 00:04:52,560
For many machine learning problems, n will be pretty large.
对于许多实际的机器学习问题 特征个数n是很大的
144
00:04:53,270 --> 00:04:53,560
Here's an example.
我们看看下边这个例子
145
00:04:55,000 --> 00:04:58,140
Let's consider the problem of computer vision.
关于计算机视觉中的一个问题
146
00:04:59,670 --> 00:05:00,770
And suppose you want to
假设你想要
147
00:05:01,260 --> 00:05:02,620
use machine learning to train
使用机器学习算法
148
00:05:02,710 --> 00:05:04,610
a classifier to examine an
来训练一个分类器
149
00:05:04,710 --> 00:05:05,880
image and tell us whether
使它检测一个图像
150
00:05:06,160 --> 00:05:08,030
or not the image is a car.
来判断图像是否为一辆汽车
151
00:05:09,480 --> 00:05:11,900
Many people wonder why computer vision could be difficult.
很多人可能会好奇 这对计算机视觉来说有什么难的
152
00:05:12,390 --> 00:05:13,140
I mean when you and I
当我们自己看这幅图像时
153
00:05:13,270 --> 00:05:15,670
look at this picture it is so obvious what this is.
里面有什么是一目了然的事情
154
00:05:15,900 --> 00:05:17,000
You wonder how is it
你肯定会很奇怪
155
00:05:17,190 --> 00:05:18,320
that a learning algorithm could possibly
为什么学习算法竟可能会不知道
156
00:05:18,910 --> 00:05:20,880
fail to know what this picture is.
图像是什么
157
00:05:22,110 --> 00:05:23,330
To understand why computer vision
为了解答这个疑问
158
00:05:23,720 --> 00:05:25,380
is hard let's zoom
我们取出这幅图片中的一小部分
159
00:05:25,650 --> 00:05:26,690
into a small part of the
将其放大
160
00:05:26,940 --> 00:05:28,180
image like that area where the
比如图中
161
00:05:28,510 --> 00:05:30,240
little red rectangle is.
这个红色方框内的部分
162
00:05:30,400 --> 00:05:31,330
It turns out that where you
结果表明
163
00:05:31,450 --> 00:05:34,270
and I see a car, the computer sees that.
当人眼看到一辆汽车时 计算机实际上看到的却是这个
164
00:05:34,780 --> 00:05:35,930
What it sees is this matrix,
一个数据矩阵
165
00:05:36,600 --> 00:05:38,110
or this grid, of pixel
或像这种格网
166
00:05:38,580 --> 00:05:40,350
intensity values that tells
它们表示了像素强度值
167
00:05:40,610 --> 00:05:42,930
us the brightness of each pixel in the image.
告诉我们图像中每个像素的亮度值
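For readers who want to see what this grid of intensity values looks like in code, here is a tiny hypothetical example (the numbers are made up): a grayscale patch is just a matrix of brightness values, and the feature vector the learning algorithm works with is that matrix unrolled into one long vector.

import numpy as np

# A made-up 4x4 grayscale patch; each entry is a pixel brightness in [0, 255]
patch = np.array([
    [ 12,  40,  43,  15],
    [ 35, 210, 215,  40],
    [ 38, 208, 212,  44],
    [ 10,  42,  41,  13],
])

# What the classifier actually sees: the intensities unrolled into a feature vector
features = patch.reshape(-1)
print(features)   # 16 numbers here; a real image would give thousands of such features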
168
00:05:43,640 --> 00:05:45,170
So the computer vision problem is
因此 对于计算机视觉来说问题就变成了
169
00:05:45,550 --> 00:05:47,230
to look at this matrix of
根据这个像素点亮度矩阵
170
00:05:47,310 --> 00:05:49,140
pixel intensity values, and tell
来告诉我们
171
00:05:49,410 --> 00:05:52,440
us that these numbers represent the door handle of a car.
这些数值代表一个汽车门把手
172
00:05:54,230 --> 00:05:55,740
Concretely, when we use
具体而言
173
00:05:56,030 --> 00:05:57,220
machine learning to build a
当用机器学习算法构造
174
00:05:57,430 --> 00:05:59,060
car detector, what we do
一个汽车识别器时
175
00:05:59,360 --> 00:06:00,510
is we come up with a
我们要想出
176
00:06:00,800 --> 00:06:02,690
labeled training set, with, let's
一个带标签的样本集
177
00:06:02,890 --> 00:06:04,250
say, a few labeled examples
其中一些样本
178
00:06:04,730 --> 00:06:05,850
of cars and a few
是各类汽车
179
00:06:06,000 --> 00:06:07,150
labeled examples of things that
另一部分样本
180
00:06:07,380 --> 00:06:08,780
are not cars, then we
是其他任何东西
181
00:06:09,090 --> 00:06:10,590
give our training set to
将这个样本集输入给学习算法
182
00:06:10,720 --> 00:06:12,230
the learning algorithm to train a
以训练出一个分类器
183
00:06:12,310 --> 00:06:13,500
classifier and then, you
训练完毕后
184
00:06:13,680 --> 00:06:14,700
know, we may test it and show
我们输入一幅新的图片
185
00:06:14,890 --> 00:06:16,710
it a new image and ask, "What is this new thing?"
让分类器判定 “这是什么东西?”
186
00:06:17,980 --> 00:06:20,030
And hopefully it will recognize that that is a car.
理想情况下 分类器能识别出这是一辆汽车
187
00:06:21,410 --> 00:06:24,000
To understand why we
为了理解引入
188
00:06:24,120 --> 00:06:26,810
need nonlinear hypotheses, let's take
非线性分类器的必要性
189
00:06:27,050 --> 00:06:27,940
a look at some of the
我们从学习算法的训练样本中
190
00:06:28,190 --> 00:06:29,360
images of cars and maybe
挑出一些汽车图片
191
00:06:29,480 --> 00:06:31,780
non-cars that we might feed to our learning algorithm.
和一些非汽车图片
192
00:06:32,960 --> 00:06:33,920
Let's pick a couple of pixel
让我们从其中
193
00:06:34,090 --> 00:06:35,630
locations in our images, so
每幅图片中挑出一组像素点
194
00:06:35,750 --> 00:06:37,040
that's pixel one location and
这是像素点1的位置
195
00:06:37,180 --> 00:06:39,500
pixel two location, and let's
这是像素点2的位置
196
00:06:39,730 --> 00:06:42,390
plot this car, you know, at the
在坐标系中标出这幅汽车的位置
197
00:06:42,510 --> 00:06:44,010
location, at a certain
在某一点上
198
00:06:44,360 --> 00:06:45,890
point, depending on the intensities
车的位置取决于
199
00:06:46,430 --> 00:06:47,870
of pixel one and pixel two.
像素点1和像素点2的亮度
200
00:06:49,260 --> 00:06:50,630
And let's do this with a few other images.
让我们用同样的方法标出其他图片中汽车的位置
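As a rough illustration of the plot being described (all values here are made up), each image is reduced to the intensities at the two chosen pixel locations and drawn as a single point, with cars and non-cars marked differently.

import matplotlib.pyplot as plt

# Hypothetical (pixel 1, pixel 2) intensities for a few labeled images
cars     = [(90, 200), (110, 180), (95, 210)]    # made-up values for "car" images
non_cars = [(30, 60), (200, 40), (60, 120)]      # made-up values for "not a car" images

plt.scatter(*zip(*cars), marker='+', label='Cars')
plt.scatter(*zip(*non_cars), marker='o', label='Non-cars')
plt.xlabel('pixel 1 intensity')
plt.ylabel('pixel 2 intensity')
plt.legend()
plt.show()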