12 - 1 - Optimization Objective (15 min).srt
1
00:00:00,570 --> 00:00:01,860
By now, you've seen the range
到目前为止
(Subtitles compiled by Huang Haiguang, Ocean University of China, haiguang2000@qq.com)
2
00:00:02,090 --> 00:00:04,860
of different learning algorithms. Within supervised learning,
你已经见过一系列不同的学习算法
3
00:00:05,280 --> 00:00:06,810
the performance of many supervised learning algorithms
在监督学习中 许多学习算法的性能
4
00:00:07,300 --> 00:00:08,830
will be pretty similar
都非常类似
5
00:00:09,650 --> 00:00:10,740
and what will matter less will often be
因此 重要的不是
6
00:00:11,040 --> 00:00:12,140
whether you use
你该选择使用
7
00:00:12,440 --> 00:00:13,450
learning algorithm A or learning algorithm
学习算法A还是学习算法B
8
00:00:13,660 --> 00:00:15,020
B, but what will
而更重要的是
9
00:00:15,190 --> 00:00:16,190
often matter more are
应用这些算法时
10
00:00:16,360 --> 00:00:17,100
things like the amount of data
所创建的
11
00:00:17,330 --> 00:00:18,530
you're training these algorithms on.
大量数据
12
00:00:19,280 --> 00:00:20,480
That, and your skill in
在应用这些算法时
13
00:00:20,600 --> 00:00:21,990
applying these algorithms, things like
表现情况通常依赖于你的水平 比如
14
00:00:23,150 --> 00:00:24,480
your choice of the features that you
你为学习算法
15
00:00:24,660 --> 00:00:25,790
designed to give the learning
所设计的
16
00:00:26,010 --> 00:00:27,030
algorithms and how you
特征量的选择
17
00:00:27,200 --> 00:00:28,530
choose the regularization parameter
以及如何选择正则化参数
18
00:00:29,190 --> 00:00:31,690
and things like that. But there's
诸如此类的事
19
00:00:31,930 --> 00:00:34,110
one more algorithm that is very
还有一个
20
00:00:34,380 --> 00:00:35,460
powerful and it's very
更加强大的算法
21
00:00:35,580 --> 00:00:37,400
widely used both within industry
广泛的应用于
22
00:00:38,050 --> 00:00:39,590
and in Academia. And that's called the
工业界和学术界
23
00:00:39,850 --> 00:00:41,080
support vector machine, and compared to
它被称为支持向量机(Support Vector Machine)
24
00:00:41,200 --> 00:00:42,600
both the logistic regression and neural networks, the
与逻辑回归和神经网络相比
25
00:00:46,770 --> 00:00:48,190
support vector machine or the SVM
支持向量机 或者简称SVM
26
00:00:48,440 --> 00:00:50,120
sometimes gives a cleaner
在学习复杂的非线性方程时
27
00:00:50,890 --> 00:00:52,040
and sometimes more powerful way
提供了一种更为清晰
28
00:00:52,480 --> 00:00:53,250
of learning complex nonlinear functions.
更加强大的方式
29
00:00:54,970 --> 00:00:56,300
And so I'd like to take the next
因此 在接下来的视频中
30
00:00:56,480 --> 00:00:57,850
videos to
我会探讨
31
00:00:57,890 --> 00:01:00,100
talk about that.
这一算法
32
00:01:00,400 --> 00:01:01,400
Later in this course, I will do
在稍后的课程中
33
00:01:01,540 --> 00:01:02,710
a quick survey of the range
我也会对监督学习算法
34
00:01:03,100 --> 00:01:04,340
of different supervised learning algorithms just
进行简要的总结
35
00:01:05,200 --> 00:01:06,790
to very briefly describe them
当然 仅仅是作简要描述
36
00:01:07,430 --> 00:01:08,870
but the support vector machine, given
但对于支持向量机
37
00:01:09,370 --> 00:01:10,840
its popularity and how powerful
鉴于该算法的强大和受欢迎度
38
00:01:10,980 --> 00:01:11,920
it is, this will be
在本课中 我会花许多时间来讲解它
39
00:01:12,060 --> 00:01:13,800
the last of the supervised learning algorithms
它也是我们所介绍的
40
00:01:14,440 --> 00:01:16,710
that I'll spend a significant amount of time on in this course.
最后一个监督学习算法
41
00:01:19,260 --> 00:01:20,440
As with our development of other
正如我们之前开发的学习算法
42
00:01:20,670 --> 00:01:22,280
learning algorithms, we are going to start by talking
我们从
43
00:01:22,650 --> 00:01:23,940
about the optimization objective,
优化目标开始
44
00:01:24,750 --> 00:01:26,420
so let's get started on
那么 我们开始学习
45
00:01:26,620 --> 00:01:27,920
this algorithm.
这个算法
46
00:01:29,420 --> 00:01:30,960
In order to describe the support
为了描述支持向量机
47
00:01:31,270 --> 00:01:32,570
vector machine, I'm actually going
事实上 我将会
48
00:01:32,610 --> 00:01:34,020
to start with logistic regression
从逻辑回归开始
49
00:01:34,990 --> 00:01:35,990
and show how we can modify
展示我们如何
50
00:01:36,820 --> 00:01:37,630
it a bit and get what
一点一点修改
51
00:01:38,240 --> 00:01:39,260
is essentially the support vector machine.
来得到本质上的支持向量机
52
00:01:40,290 --> 00:01:41,740
So, in logistic regression we have
那么 在逻辑回归中
53
00:01:41,950 --> 00:01:43,680
our familiar form of
我们已经熟悉了
54
00:01:43,740 --> 00:01:46,000
the hypothesis there and the
这里的假设函数形式
55
00:01:46,450 --> 00:01:48,590
sigmoid activation function shown on the right.
和右边的S型激励函数
56
00:01:50,390 --> 00:01:51,330
And in order to explain
然而 为了解释
57
00:01:51,800 --> 00:01:52,650
some of the math, I'm going
一些数学知识
58
00:01:52,850 --> 00:01:55,960
to use z to denote theta transpose x here.
我将用 z 表示 θ 转置乘以 x
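For reference, the hypothesis and the shorthand z being described are, in the standard logistic regression notation used in this course,
$$ h_\theta(x) = g(\theta^T x) = \frac{1}{1 + e^{-\theta^T x}}, \qquad z = \theta^T x. $$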
59
00:01:57,620 --> 00:01:58,650
Now let's think about what
现在 让一起考虑下
60
00:01:58,900 --> 00:02:01,150
we would like logistic regression to do.
我们想要逻辑回归做什么
61
00:02:01,270 --> 00:02:02,800
If we have an example with
如果有一个
62
00:02:03,070 --> 00:02:04,360
y equals 1, and by
y=1 的样本
63
00:02:04,540 --> 00:02:05,480
this I mean an example
我的意思是
64
00:02:06,100 --> 00:02:07,100
in either a training set
不管是在训练集中 或是在测试集中
65
00:02:07,440 --> 00:02:11,780
or the test set or, you know, the cross validation set, where y is equal to 1, then
又或者在交叉验证集中 总之是 y=1
66
00:02:12,030 --> 00:02:14,300
we are sort of hoping that h of x will be close to 1.
现在 我们希望 h(x) 趋近1
67
00:02:14,380 --> 00:02:15,760
So, right, we are hoping to
因为 我们想要
68
00:02:16,140 --> 00:02:17,330
correctly classify that example
正确地将此样本分类
69
00:02:18,520 --> 00:02:19,390
and what, having h of x
这就意味着
70
00:02:19,510 --> 00:02:20,710
close to 1, what that means
当 h(x) 趋近于1时
71
00:02:20,850 --> 00:02:22,080
is that theta transpose x
θ 转置乘以 x
72
00:02:22,360 --> 00:02:23,380
must be much larger
应当
73
00:02:23,770 --> 00:02:24,990
than 0, so there's
远大于0 这里的
74
00:02:25,330 --> 00:02:26,680
greater than, greater than sign, that
大于大于号 >>
75
00:02:26,900 --> 00:02:28,220
means much, much greater
意思是
76
00:02:28,530 --> 00:02:30,880
than 0 and that's
远远大于0
77
00:02:31,120 --> 00:02:32,840
because it is z, that is theta transpose
这是因为 由于 z 表示
78
00:02:32,960 --> 00:02:34,750
x
θ 转置乘以 x
79
00:02:34,940 --> 00:02:35,910
is when z is much bigger than
当 z 远大于
80
00:02:36,010 --> 00:02:37,240
0, is far to the
0时 即到了
81
00:02:37,310 --> 00:02:39,060
right of this figure that, you know, the
该图的右边 你不难发现
82
00:02:39,360 --> 00:02:42,430
output of logistic regression becomes close to 1.
此时逻辑回归的输出将趋近于1
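In other words, for a positive example the relationship being described is
$$ y = 1: \quad h_\theta(x) = \frac{1}{1 + e^{-z}} \approx 1 \quad\Longleftrightarrow\quad z = \theta^T x \gg 0. $$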
83
00:02:44,510 --> 00:02:45,580
Conversely, if we have
相反地 如果我们
84
00:02:45,630 --> 00:02:46,870
an example where y is
有另一个样本
85
00:02:47,000 --> 00:02:48,470
equal to 0 then what
即 y=0
86
00:02:48,750 --> 00:02:49,620
we're hoping for is that the hypothesis
我们希望假设函数
87
00:02:50,420 --> 00:02:51,890
will output a value
的输出值
88
00:02:52,010 --> 00:02:53,850
close to 0 and that corresponds to theta transpose x
将趋近于0 这对应于 θ 转置乘以 x
89
00:02:54,650 --> 00:02:55,990
or z being much
或者就是 z 会
90
00:02:56,250 --> 00:02:57,080
less than 0 because
远小于0
91
00:02:57,440 --> 00:02:58,720
that corresponds to
因为对应的
92
00:02:59,160 --> 00:03:01,250
the hypothesis outputting a value close to 0. If
假设函数的输出值趋近0
93
00:03:02,180 --> 00:03:03,590
you look at the
如果你进一步
94
00:03:03,760 --> 00:03:06,300
cost function of logistic regression, what
观察逻辑回归的代价函数
95
00:03:06,440 --> 00:03:07,470
you find is that each
你会发现
96
00:03:07,710 --> 00:03:09,400
example (x, y),
每个样本 (x, y)
97
00:03:10,190 --> 00:03:11,520
contributes a term like
都会为总代价函数
98
00:03:11,700 --> 00:03:14,320
this to the overall cost function.
增加这里的一项
99
00:03:15,450 --> 00:03:16,900
All right. So, for the overall cost function, we usually, we will
因此 对于总代价函数
100
00:03:17,390 --> 00:03:18,600
also have a sum over
通常会有对所有的训练样本求和
101
00:03:18,890 --> 00:03:21,430
all the training examples and a 1 over m term.
并且这里还有一个1/m项
102
00:03:22,450 --> 00:03:22,740
But this
但是
103
00:03:23,240 --> 00:03:24,150
expression here. That's
在逻辑回归中
104
00:03:24,470 --> 00:03:25,450
the term that a single
这里的这一项
105
00:03:26,220 --> 00:03:28,490
training example contributes to
就是表示一个训练样本
106
00:03:28,780 --> 00:03:31,550
the overall objective function for logistic regression.
所对应的表达式
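The overall objective being referred to, with the sum over the m training examples and the 1/m term (and omitting regularization for brevity), is the standard logistic regression cost
$$ J(\theta) = \frac{1}{m} \sum_{i=1}^{m} \Big[ -y^{(i)} \log h_\theta(x^{(i)}) - (1 - y^{(i)}) \log\big(1 - h_\theta(x^{(i)})\big) \Big]. $$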
107
00:03:33,250 --> 00:03:34,350
Now, if I take the definition
现在 如果我将完整定义的
108
00:03:35,190 --> 00:03:36,120
for the full form of my hypothesis
假设函数
109
00:03:37,030 --> 00:03:38,700
and plug it in, over here,
代入这里
110
00:03:39,790 --> 00:03:40,710
then what I get is that
那么 我们就会得到
111
00:03:40,920 --> 00:03:43,130
each training example contributes this term, right?
每一个训练样本都影响这一项
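With the sigmoid hypothesis plugged in, the term contributed by a single training example (x, y), ignoring the 1/m factor, is
$$ -y \log\frac{1}{1 + e^{-\theta^T x}} - (1 - y) \log\Big(1 - \frac{1}{1 + e^{-\theta^T x}}\Big). $$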
112
00:03:44,270 --> 00:03:45,480
Ignoring the 1 over
现在 先忽略1/m这一项
113
00:03:45,720 --> 00:03:47,130
m but it contributes that term
但是这一项
114
00:03:47,470 --> 00:03:49,470
to my overall cost function for
是影响整个总代价函数
115
00:03:49,680 --> 00:03:52,260
logistic regression. Now let's
中的这一项的
116
00:03:52,820 --> 00:03:54,310
consider the 2 cases
现在 一起来考虑两种情况
117
00:03:54,700 --> 00:03:55,970
of when y is equal to 1
一种是y等于1的情况
118
00:03:56,040 --> 00:03:57,250
and when y is equal to 0.
一种是y等于0的情况
119
00:03:57,820 --> 00:03:59,040
In the first case, let's
在第一种情况中
120
00:03:59,170 --> 00:04:00,260
suppose that y is equal
假设 y
121
00:04:00,520 --> 00:04:01,960
to 1. In that case,
等于1 此时
122
00:04:02,440 --> 00:04:04,850
only this first term in
在目标函数中
123
00:04:04,980 --> 00:04:06,910
the objective matters because this
只需有第一项起作用
124
00:04:07,130 --> 00:04:08,830
1 minus y term will be equal
因为y等于1时
125
00:04:09,210 --> 00:04:10,510
to 0 if y is equal to 1.
(1-y) 项将等于0
126
00:04:13,640 --> 00:04:15,340
So, when y is equal to
因此 当在y等于
127
00:04:15,400 --> 00:04:17,130
1, when in an example (x, y),
1的样本中时
128
00:04:17,310 --> 00:04:18,240
when y is equal to
即在 (x, y) 中
129
00:04:18,420 --> 00:04:19,840
1, what we get is this
y等于1
130
00:04:20,010 --> 00:04:21,340
term minus log 1
我们得到
131
00:04:21,560 --> 00:04:22,370
over 1 plus e to the negative
-log(1/(1+e^-z)) 这样一项
132
00:04:22,860 --> 00:04:25,050
z. Where, similar to the last slide,
这里同上一张幻灯片一致
133
00:04:25,330 --> 00:04:26,480
I'm using z to denote
我用 z
134
00:04:27,490 --> 00:04:29,430
theta transpose x. And
表示 θ 转置乘以 x
135
00:04:29,640 --> 00:04:30,930
of course, in the cost we
当然 在代价函数中
136
00:04:31,040 --> 00:04:32,130
actually had this minus y
y 前面有负号
137
00:04:32,380 --> 00:04:33,490
but we just said that you know, if y is
我们只是这样表示
138
00:04:33,540 --> 00:04:34,790
equal to 1. So that's equal
如果y等于1 代价函数中
139
00:04:35,020 --> 00:04:36,500
to 1. I just simplified it
这一项也等于1 这样做
140
00:04:36,580 --> 00:04:38,010
away in the expression that
是为了简化
141
00:04:38,300 --> 00:04:39,820
I have written down here.
此处的表达式
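Written out, the curve about to be plotted for the y = 1 case is
$$ -\log\frac{1}{1 + e^{-z}} = \log\big(1 + e^{-z}\big), \qquad z = \theta^T x, $$
which tends to 0 for large positive z and grows roughly linearly as z becomes very negative.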
142
00:04:41,950 --> 00:04:43,030
And if we plot this function,
如果画出
143
00:04:43,580 --> 00:04:45,080
as a function of z, what
关于 z 的函数
144
00:04:45,230 --> 00:04:46,320
you find is that you get
你会看到
145
00:04:47,160 --> 00:04:48,630
this curve shown on the
左下角的
146
00:04:49,220 --> 00:04:50,290
lower left of this slide
这条曲线
147
00:04:51,120 --> 00:04:52,290
and thus we also see
我们同样可以看到
148
00:04:52,640 --> 00:04:53,590
that when z is equal
当 z 增大时
149
00:04:53,860 --> 00:04:54,930
to a large value, that is, when
也就是相当于
150
00:04:55,440 --> 00:04:56,930
theta transpose x is large
θ 转置乘以x 增大时
151
00:04:57,800 --> 00:04:58,790
that corresponds to a
z 对应的值
152
00:04:58,890 --> 00:04:59,900
value of z that gives
会变的非常小
153
00:05:00,100 --> 00:05:02,050
us a very small value, a very
对整个代价函数而言
154
00:05:03,000 --> 00:05:04,650
small contribution to the
影响也非常小
155
00:05:04,740 --> 00:05:06,120
cost function and this
这也就解释了
156
00:05:06,270 --> 00:05:07,790
kind of explains why when
为什么
157
00:05:08,260 --> 00:05:10,020
logistic regression sees a positive example
逻辑回归在观察到
158
00:05:10,640 --> 00:05:12,200
with y equals 1 it tries
正样本 y=1 时
159
00:05:12,860 --> 00:05:14,220
to set theta transpose x
试图将 θ^T*x
160
00:05:14,650 --> 00:05:15,810
to be very large because that
设置的非常大
161
00:05:15,980 --> 00:05:17,440
corresponds to this term
因为 在代价函数中的
162
00:05:18,300 --> 00:05:21,490
in a cost function being small. Now, to build
这一项会变的非常小 现在
163
00:05:21,760 --> 00:05:23,640
the Support Vector Machine, here is what we are going to do.
开始建立支持向量机 我们从这里开始
164
00:05:23,740 --> 00:05:24,780
We are going to take this cost function, this
我们会从这个代价函数开始
165
00:05:25,740 --> 00:05:29,420
minus log 1 over 1 plus e to the negative z and modify it a little bit.
也就是 -log(1/(1+e^-z)) 一点一点修改
166
00:05:31,270 --> 00:05:32,450
Let me take this point
让我取这里的
167
00:05:33,590 --> 00:05:35,120
1 over here and let
z=1 点
168
00:05:36,150 --> 00:05:37,200
me draw the cost function that I'm going to
我先画出将要用的代价函数
169
00:05:37,280 --> 00:05:38,510
use, the new cost function is gonna
新的代价函数将会
170
00:05:38,870 --> 00:05:40,320
be flat from here on out
水平的从这里到右边 (图外)
171
00:05:42,000 --> 00:05:42,980
and then I'm going to draw something
然后我再画一条
172
00:05:43,170 --> 00:05:45,720
that grows as a straight
同逻辑回归
173
00:05:46,280 --> 00:05:49,230
line similar to logistic
非常相似的直线
174
00:05:49,530 --> 00:05:50,710
regression but this is going to be the
但是 在这里
175
00:05:50,950 --> 00:05:52,740
straight line in this portion.
是一条直线
176
00:05:52,870 --> 00:05:55,040
So the curve that
也就是 我用紫红色画的曲线
177
00:05:55,190 --> 00:05:57,580
I just drew in magenta, the curve that I just drew in purple and magenta.
就是这条紫红色的曲线
178
00:05:58,090 --> 00:05:59,580
So, it's a pretty
那么 到了这里
179
00:05:59,730 --> 00:06:01,840
close approximation to the
已经非常接近
180
00:06:02,310 --> 00:06:03,480
cost function used by logistic
逻辑回归中
181
00:06:03,900 --> 00:06:05,060
regression except that it is
使用的代价函数了
182
00:06:05,130 --> 00:06:06,590
now made out of two line segments. This
只是这里是由两条线段组成
183
00:06:07,490 --> 00:06:09,110
is the flat portion on the right
即位于右边的水平部分
184
00:06:09,430 --> 00:06:11,590
and then this is a straight
和位于左边的
185
00:06:11,860 --> 00:06:14,340
line portion on the
直线部分
186
00:06:14,630 --> 00:06:16,460
left. And don't worry too much about the slope of the straight line portion.
先别过多的考虑左边直线部分的斜率
187
00:06:16,930 --> 00:06:18,930
It doesn't matter that
这并不是很重要
188
00:06:19,180 --> 00:06:21,630
much but that's the
但是 这里
189
00:06:21,730 --> 00:06:23,910
new cost function we're going to use where y is equal to 1 and
我们将使用的新的代价函数 是在 y=1 的前提下的
190
00:06:24,100 --> 00:06:25,240
you can imagine it
你也许能想到
191
00:06:25,340 --> 00:06:28,310
should do something pretty similar to logistic regression
这应该能做同逻辑回归中类似的事情
192
00:06:29,190 --> 00:06:30,470
but it turns out that this will give the
但事实上
193
00:06:30,750 --> 00:06:32,630
support vector machine computational advantages
在之后的优化问题中
194
00:06:33,690 --> 00:06:34,470
that will give us later on
这会变得更坚定 并且为支持向量机
195
00:06:34,890 --> 00:06:37,190
an easier optimization problem, that
带来计算上的优势
196
00:06:37,570 --> 00:06:39,670
will be easier to solve, and so on.
例如 更容易计算股票交易的问题 等等
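One way to write the piecewise-linear replacement sketched above, usually called cost_1(z) later in this lecture for the y = 1 case (the slope k of the left segment is only an illustrative placeholder, since, as noted, its exact value does not matter much), is
$$ \text{cost}_1(z) = \begin{cases} 0, & z \ge 1 \\ k\,(1 - z), & z < 1 \end{cases} \qquad k > 0, $$
flat to the right of z = 1 and a straight line to the left of it.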
197
00:06:41,050 --> 00:06:41,990
We just talked about the case
目前 我们只是讨论了
198
00:06:42,120 --> 00:06:43,300
of y equals to 1. The other
y=1 的情况 另外
199
00:06:43,370 --> 00:06:44,420
case is if y is equal
一种情况是当
200
00:06:44,660 --> 00:06:46,120
to 0. In that case,
y=0 时 此时