14 - 4 - Principal Component Analysis Algorithm (15 min).srt
1
00:00:00,340 --> 00:00:01,410
In this video I'd like
在这个视频中,我将(字幕翻译:中国海洋大学,孙中卫)
2
00:00:01,550 --> 00:00:03,020
to tell you about the principal
介绍
3
00:00:03,340 --> 00:00:04,570
components analysis algorithm.
主成分分析算法
4
00:00:05,600 --> 00:00:06,560
And by the end of this
在视频的最后,
5
00:00:06,710 --> 00:00:09,200
video, you'll know how to implement PCA for yourself.
你应该能独自使用主成分分析算法。
6
00:00:10,170 --> 00:00:12,540
And use it to reduce the dimension of your data.
同时,能够使用PCA对自己的数据进行降维。
7
00:00:13,100 --> 00:00:14,690
Before applying PCA, there is
在使用PCA之前,
8
00:00:14,800 --> 00:00:17,760
a data pre-processing step which you should always do.
将首先进行数据预处理。
9
00:00:18,510 --> 00:00:20,220
Given the training set of
给定训练样本的集合,
10
00:00:20,520 --> 00:00:22,290
examples, it is important to
重要的步骤是
11
00:00:22,600 --> 00:00:24,070
always perform mean normalization,
要总是执行均值标准化。
12
00:00:25,330 --> 00:00:26,140
and then depending on your data,
依据你的数据,
13
00:00:26,840 --> 00:00:28,540
maybe perform feature scaling as well.
大概也需要进行特征的缩放。
14
00:00:29,620 --> 00:00:30,950
This is very similar to the
这是很相似的,
15
00:00:31,650 --> 00:00:33,250
mean normalization and feature scaling
在均值标准化过程与特征缩放的过程之间
16
00:00:34,080 --> 00:00:36,580
process that we have for supervised learning.
在我们已学习的监督学习中。
17
00:00:36,910 --> 00:00:38,240
In fact it's exactly the
实际上,确实是
18
00:00:38,390 --> 00:00:40,160
same procedure except that we're
相同过程,除了我们现在
19
00:00:40,310 --> 00:00:41,790
doing it now to our unlabeled
对未标记数据做的,
20
00:00:42,930 --> 00:00:43,670
data, X1 through Xm.
X1到Xm。
21
00:00:44,180 --> 00:00:45,530
So for mean normalization we
对于均值标准化,
22
00:00:45,720 --> 00:00:47,080
first compute the mean of
我们首先计算
23
00:00:47,390 --> 00:00:49,070
each feature and then
每个特征的均值,
24
00:00:49,340 --> 00:00:50,900
we replace each feature, X,
我们取代每个特征X用
25
00:00:51,150 --> 00:00:52,680
with X minus its mean,
X减去它的均值。
26
00:00:52,810 --> 00:00:54,120
and so this makes each feature
这将使每个特征
27
00:00:54,520 --> 00:00:57,450
now have exactly zero mean
现在均值恰好为零。
28
00:00:58,690 --> 00:01:00,690
The different features have very different scales.
不同的特征有非常不同的取值范围。
29
00:01:01,540 --> 00:01:03,050
So for example, if x1
例如,x1
30
00:01:03,080 --> 00:01:04,060
is the size of a house, and
是房子的尺寸,
31
00:01:04,100 --> 00:01:05,390
x2 is the number of bedrooms, to
x2是卧室的数量,
32
00:01:05,580 --> 00:01:07,370
use our earlier example, we
对于我们先前的例子,
33
00:01:07,480 --> 00:01:08,680
then also scale each feature
我们也可以缩放每个特征
34
00:01:09,130 --> 00:01:10,540
to have a comparable range of values.
使其具有可比较的取值范围。
35
00:01:10,980 --> 00:01:12,490
And so, similar to what
相似于
36
00:01:12,680 --> 00:01:13,860
we had with supervised learning,
我们之前的监督学习,
37
00:01:14,060 --> 00:01:16,200
we would take x superscript i, subscript
我们提取xi
38
00:01:16,680 --> 00:01:17,620
j, that's the jth feature
的第j个特征。
39
00:01:23,250 --> 00:01:25,530
and so we would
所以我们能够
40
00:01:25,890 --> 00:01:27,610
subtract off the mean,
减去均值。
41
00:01:28,370 --> 00:01:29,520
now that's what we have on top, and then divide by sj.
这就是我们上面得到的,然后再除以sj。
42
00:01:29,610 --> 00:01:30,020
Here, sj is some measure of the range of values of feature j. So, it could be the max minus
在这里,sj是对特征j取值范围的某种度量。所以,它可以是最大值减去
43
00:01:30,080 --> 00:01:31,310
min value, or more commonly,
最小值,或者更常见的是,
44
00:01:31,890 --> 00:01:33,540
it is the standard deviation of
它是特征j的标准差。
45
00:01:33,640 --> 00:01:35,520
feature j. Having done
46
00:01:36,230 --> 00:01:39,480
this sort of data pre-processing, here's what the PCA algorithm does.
做了数据预处理流程后,这是PCA算法做的。
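A rough Octave sketch of this pre-processing step is below. The names X, mu, s, and X_norm are illustrative rather than from the lecture, and X is assumed to be an m-by-n data matrix with one training example per row.

% Mean normalization: subtract each feature's mean from every example.
mu = mean(X);                              % 1 x n row vector of per-feature means
X_norm = X - repmat(mu, size(X, 1), 1);
% Feature scaling: divide each feature by some measure of its range,
% e.g. max minus min, or more commonly the standard deviation.
s = std(X_norm);
X_norm = X_norm ./ repmat(s, size(X, 1), 1);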
47
00:01:40,620 --> 00:01:41,630
We saw from the previous video
我们能从先前的视频中看到
48
00:01:41,960 --> 00:01:43,050
that what PCA does is, it
PCA所做的,
49
00:01:43,170 --> 00:01:44,520
tries to find a lower
它尝试着找到一个
50
00:01:44,790 --> 00:01:46,080
dimensional sub-space onto which to
低维子空间,
51
00:01:46,170 --> 00:01:47,500
project the data, so as
来投影数据,
52
00:01:47,650 --> 00:01:49,780
to minimize the squared projection
只为了最小化平方投影
53
00:01:50,540 --> 00:01:51,660
errors, sum of the
误差,
54
00:01:51,740 --> 00:01:53,080
squared projection errors, as the
最小化平方和投影误差,
55
00:01:53,420 --> 00:01:54,800
square of the length of
也就是这些蓝线长度
56
00:01:54,870 --> 00:01:56,790
those blue lines, and so
的平方,
57
00:01:57,110 --> 00:01:58,510
what we wanted to do specifically
我们想做的是
58
00:01:59,210 --> 00:02:02,730
is find a vector, u1, which
找到一个向量,u1,
59
00:02:03,280 --> 00:02:04,750
specifies that direction or
指定这个方向或者
60
00:02:05,040 --> 00:02:06,630
in the 2D case we want
在2维中,我们
61
00:02:06,880 --> 00:02:08,760
to find two vectors, u1 and
想找到2个向量,u1和
62
00:02:10,640 --> 00:02:12,980
u2, to define this surface
u2,来定义这个表面,
63
00:02:13,590 --> 00:02:14,610
onto which to project the data.
用于投射数据。
64
00:02:16,620 --> 00:02:17,920
So, just as a
正如
65
00:02:18,040 --> 00:02:19,160
quick reminder of what reducing
快速地提醒一下,降低
66
00:02:19,730 --> 00:02:20,820
the dimension of the data means,
数据的维数意味着什么。
67
00:02:21,490 --> 00:02:22,430
for this example on the
对左边的例子,
68
00:02:22,470 --> 00:02:23,560
left we were given
我们给予
69
00:02:23,680 --> 00:02:26,010
the examples xi, which are in R2.
样本xi,它们在R2中。
70
00:02:26,300 --> 00:02:28,390
And what we
我们
71
00:02:28,660 --> 00:02:29,500
like to do is find
想找到
72
00:02:29,970 --> 00:02:32,400
a set of numbers zi in
一组数zi,在
73
00:02:33,000 --> 00:02:34,950
R1 to represent our data.
R1中,来代表我们的数据。
74
00:02:36,000 --> 00:02:37,820
So that's what reduction from 2D to 1D means.
这就是把数据从2维降到1维的含义。
75
00:02:39,020 --> 00:02:41,450
So specifically by projecting
具体来说,就是
76
00:02:42,710 --> 00:02:44,080
data onto this red line there.
把数据投射到红线上。
77
00:02:44,800 --> 00:02:46,320
We need only one number to
我们仅需一个数字
78
00:02:46,450 --> 00:02:48,340
specify the position of the points on the line.
来表示在线上的点的位置。
79
00:02:48,590 --> 00:02:49,380
So I'm going to call that number
所以我们叫这个数字为
80
00:02:50,700 --> 00:02:51,830
z or z1.
z或z1.
81
00:02:52,020 --> 00:02:54,850
Z here is a real number, so that's like a one dimensional vector.
Z在这里是一个实数,就像一个一维向量。
82
00:02:55,380 --> 00:02:56,650
So z1 just refers to
所以z1仅涉及
83
00:02:56,690 --> 00:02:58,080
the first component of this,
这个的第一个分量,
84
00:02:58,280 --> 00:03:00,430
you know, one by one matrix, or this one dimensional vector.
也就是一个1×1的矩阵,或者说一个一维向量。
85
00:03:01,670 --> 00:03:03,170
And so we need only
所以我们仅需要
86
00:03:03,490 --> 00:03:05,590
one number to specify the position of a point.
一个数字来指明一个点的位置。
87
00:03:06,330 --> 00:03:07,940
So if this example
所以如果这个例子
88
00:03:08,460 --> 00:03:09,510
here was my example
是我的一个例子X1,
89
00:03:10,610 --> 00:03:13,160
X1, then maybe that gets mapped here.
大概得到相应的映射。
90
00:03:13,900 --> 00:03:15,450
And if this example was X2
如果这个例子是x2
91
00:03:15,680 --> 00:03:17,250
maybe that example gets mapped
大概例子也得到映射。
92
00:03:17,530 --> 00:03:18,790
And so this point
所以对应的点
93
00:03:19,060 --> 00:03:20,400
here will be Z1
将是z1
94
00:03:20,840 --> 00:03:21,920
and this point here will be
这个点对应的将是
95
00:03:22,080 --> 00:03:24,240
Z2, and similarly we
z2,相似的
96
00:03:24,620 --> 00:03:26,410
would have those other points
我们将有其他的点对于这些,
97
00:03:26,840 --> 00:03:30,230
for These, maybe X3,
大概是x3,
98
00:03:30,510 --> 00:03:32,550
X4, X5 get mapped to Z3, Z4, Z5.
x4,x5,被映射到z3,z4和z5。
99
00:03:34,360 --> 00:03:35,940
So what PCA has
所以PCA要做的是
100
00:03:36,050 --> 00:03:36,830
to do is we need to
我们需要
101
00:03:36,930 --> 00:03:38,920
come up with a way to compute two things.
想出一个方法计算两个事情。
102
00:03:39,310 --> 00:03:40,710
One is to compute these vectors,
一个是计算这些向量
103
00:03:41,830 --> 00:03:44,970
u1, and in this case u1 and u2.
比如u1,在这个例子中是u1和u2。
104
00:03:45,230 --> 00:03:46,880
And the other is
另一个是
105
00:03:47,130 --> 00:03:48,140
how do we compute these numbers,
如何来计算这些数字
106
00:03:49,360 --> 00:03:51,200
Z. So on the
Z.所以
107
00:03:51,430 --> 00:03:53,910
example on the left we're reducing the data from 2D to 1D.
在左边的例子中,我们把数据从2维降到1维。
108
00:03:55,290 --> 00:03:56,100
In the example on the right,
在右边的例子中,
109
00:03:56,510 --> 00:03:58,100
we would be reducing data from
我们能降维数据从
110
00:03:58,450 --> 00:04:00,600
3 dimensional as in
三维
111
00:04:00,710 --> 00:04:04,840
r3, to zi, which is now two dimensional.
到zi,它是一个二维数据。
112
00:04:05,390 --> 00:04:07,790
So these z vectors would now be two dimensional.
所以这些z向量现在将是二维的。
113
00:04:08,450 --> 00:04:09,590
So it would be z1
所以它将是z1
114
00:04:10,150 --> 00:04:11,410
z2 like so, and so
z2这样的,所以
115
00:04:11,640 --> 00:04:12,940
we need a way to compute
我们需要有方法计算
116
00:04:13,670 --> 00:04:15,410
these new representations, the z1
这些新的表示,
117
00:04:15,570 --> 00:04:17,350
and z2 of the data as well.
也就是数据的z1和z2。
118
00:04:18,280 --> 00:04:20,350
So how do you compute all of these quantities?
那么,你如何计算所有这些量呢?
119
00:04:20,520 --> 00:04:21,520
It turns out that a mathematical
已经证明一个数学的
120
00:04:22,490 --> 00:04:23,660
derivation, also the mathematical
推导,也就是数学的
121
00:04:24,300 --> 00:04:26,020
proof, for what is
证明对于
122
00:04:26,090 --> 00:04:27,970
the right value U1, U2, Z1,
U1,U2,Z1,Z2等是正确的。
123
00:04:28,290 --> 00:04:29,480
Z2, and so on.
124
00:04:29,690 --> 00:04:31,230
That mathematical proof is very
这个数学证明是非常
125
00:04:31,480 --> 00:04:32,890
complicated and beyond the
复杂的,超出了
126
00:04:32,950 --> 00:04:34,620
scope of the course.
课程的范围。
127
00:04:35,280 --> 00:04:37,290
But once you've done it, it
但是一旦你做完了,就
128
00:04:37,590 --> 00:04:38,590
turns out that the procedure
证明这个过程
129
00:04:39,350 --> 00:04:40,570
to actually find the value
实际能够找到
130
00:04:41,200 --> 00:04:42,210
of u1 that you want
你想要的u1值,
131
00:04:42,950 --> 00:04:43,950
is not that hard, even though
这是不困难的,尽管
132
00:04:44,180 --> 00:04:45,640
the mathematical proof that
这个数学证明
133
00:04:45,840 --> 00:04:46,940
this value is the correct
这个值是正确的。
134
00:04:47,260 --> 00:04:48,450
value is somewhat more
要更复杂一些,
135
00:04:48,700 --> 00:04:49,960
involved and more than I want to get into.
超出了我想深入讲解的范围。
136
00:04:50,880 --> 00:04:52,070
But let me just describe the
但让我描述
137
00:04:52,480 --> 00:04:53,830
specific procedure that you
具体的过程,
138
00:04:53,960 --> 00:04:55,250
have to implement in order
你想执行为了
139
00:04:55,440 --> 00:04:56,450
to compute all of these
计算所有的
140
00:04:56,570 --> 00:04:57,840
things, the vectors, u1, u2,
参数,向量u1,u2
141
00:04:58,910 --> 00:05:00,980
the vector z. Here's the procedure.
向量z。下面是具体步骤。
142
00:05:02,070 --> 00:05:02,970
Let's say we want to reduce
我们想把
143
00:05:03,170 --> 00:05:04,220
the data from n dimensions
n维的数据降到
144
00:05:04,840 --> 00:05:05,760
to k dimensions. What we're
k维,
145
00:05:06,760 --> 00:05:07,640
going to do is first
我们首先要做的
146
00:05:07,900 --> 00:05:09,400
compute something called the
是计算
147
00:05:09,830 --> 00:05:11,140
covariance matrix, and the covariance
协方差矩阵,这个协方差矩阵
148
00:05:11,700 --> 00:05:13,620
matrix is commonly denoted by
通常用
149
00:05:13,820 --> 00:05:15,050
this Greek alphabet which is
希腊字母表中
150
00:05:15,190 --> 00:05:16,880
the capital Greek alphabet sigma.
的大写字母Sigma来表示。
151
00:05:18,000 --> 00:05:19,210
It's a bit unfortunate that the
不幸的是
152
00:05:19,310 --> 00:05:21,080
Greek alphabet sigma looks exactly
这个希腊字母sigma看起来
153
00:05:21,760 --> 00:05:22,710
like the summation symbols.
像一个求和标记。
154
00:05:23,210 --> 00:05:24,620
So this is the
所以这里的
155
00:05:24,700 --> 00:05:26,220
Greek alphabet Sigma is used
希腊字母Sigma被用来
156
00:05:26,420 --> 00:05:29,540
to denote a matrix and this here is a summation symbol.
标记一个矩阵,在那里是一个求和标记。
157
00:05:30,510 --> 00:05:32,330
So hopefully in these slides
所以希望在幻灯片
158
00:05:32,680 --> 00:05:34,190
there won't be ambiguity about which
不要有模糊对于
159
00:05:34,410 --> 00:05:36,340
is Sigma Matrix, the
Sigma 矩阵,
160
00:05:36,520 --> 00:05:37,850
matrix, which is a
另一个是
161
00:05:38,090 --> 00:05:39,620
summation symbol, and hopefully
求和标记,希望
162
00:05:39,940 --> 00:05:41,460
it will be clear from context when
根据上下文就能清楚地区分,
163
00:05:41,820 --> 00:05:43,510
I'm using each one.
当我们使用每一个的时候。
164
00:05:43,740 --> 00:05:44,790
How do you compute this matrix
如何计算这个矩阵
165
00:05:45,530 --> 00:05:46,550
let's say we want to
比如说我们想
166
00:05:47,135 --> 00:05:47,640
store it in an Octave variable
存储它到一个octave变量中,
167
00:05:48,120 --> 00:05:49,970
called sigma.
变量叫sigma。
168
00:05:50,840 --> 00:05:51,890
What we need to do is
我们需要做什么
169
00:05:52,030 --> 00:05:53,660
compute something called the
来计算被叫做
170
00:05:54,130 --> 00:05:56,190
eigenvectors of the matrix sigma.
矩阵sigma的特征向量。
171
00:05:57,560 --> 00:05:58,450
And in Octave, the way you
在Octave中,你的做法
172
00:05:58,590 --> 00:05:59,820
do that is you use this
是用这个
173
00:05:59,970 --> 00:06:01,020
command, [U, S, V] equals
命令
174
00:06:01,350 --> 00:06:02,600
svd of Sigma.
即 [U,S,V] = svd(Sigma)。
175
00:06:03,650 --> 00:06:06,090
SVD, by the way, stands for singular value decomposition.
SVD代表奇异值分解,
176
00:06:08,520 --> 00:06:10,590
This is a much
这是一个更
177
00:06:10,790 --> 00:06:12,660
more advanced singular value decomposition.
加高级的奇异值分解。
178
00:06:14,450 --> 00:06:15,560
It is much more advanced linear
它也是更加高级的线性
179
00:06:15,800 --> 00:06:16,950
algebra than you actually need
代数比你实际需要
180
00:06:16,950 --> 00:06:18,770
to know. But it turns out
知道的。但结果是
181
00:06:18,950 --> 00:06:20,250
that when sigma is equal
当sigma等于这个
182
00:06:20,480 --> 00:06:21,800
to this matrix, there are
矩阵,有几个
183
00:06:21,880 --> 00:06:23,420
a few ways to compute these
方法可以计算这些
184
00:06:23,610 --> 00:06:25,810
eigenvectors, and if you
特征向量,如果你
185
00:06:25,960 --> 00:06:27,350
are an expert in linear algebra
是一个专家在线性代数上,
186
00:06:27,700 --> 00:06:28,610
and if you've heard of eigen
如果你以前听说过特征
187
00:06:28,860 --> 00:06:30,170
vectors before you may know
向量,你可能知道
188
00:06:30,350 --> 00:06:31,660
that there is another Octave function
有另一个叫eig的Octave函数,
189
00:06:31,990 --> 00:06:33,420
called eig, which can
它能够
190
00:06:33,520 --> 00:06:35,030
also be used to compute the same thing.
被用来计算相同的事。
191
00:06:35,950 --> 00:06:36,980
And it turns out that the
已经证明这个
192
00:06:37,370 --> 00:06:39,180
SVD function and the
SVD函数和
193
00:06:39,290 --> 00:06:40,310
eig function will give
eig函数将给
194
00:06:40,370 --> 00:06:42,170
you the same vectors, although SVD
你相同的向量,但SVD
195
00:06:42,840 --> 00:06:44,210
is a little more numerically stable.
在数值上更稳定一些。
196
00:06:44,540 --> 00:06:45,890
So I tend to use SVD, although
所以我趋向于用SVD,尽管
197
00:06:46,140 --> 00:06:47,040
I have a few friends that use
我有几个朋友用
198
00:06:47,280 --> 00:06:48,720
the eig function to do
eig函数来做这些,
199
00:06:48,920 --> 00:06:50,050
this as well. But when you
做这些。但当你
200
00:06:50,130 --> 00:06:51,270
apply this to a covariance matrix
把它应用到协方差矩阵上时
201
00:06:51,750 --> 00:06:52,960
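Putting the steps described in this video together, a minimal Octave sketch of the procedure might look like the following. It assumes the usual covariance definition Sigma = (1/m) * X' * X on mean-normalized data, an illustrative choice of k, and the standard projection z = Ureduce' * x; none of these details are spelled out in this excerpt, so treat it as a sketch rather than the lecture's exact code.

% X is m x n, already mean-normalized (and feature-scaled if needed).
m = size(X, 1);
Sigma = (1 / m) * (X' * X);     % n x n covariance matrix (capital Sigma, not a sum)
[U, S, V] = svd(Sigma);         % eig(Sigma) gives the same eigenvectors here,
                                % but svd is a little more numerically stable
k = 2;                          % illustrative target dimension
Ureduce = U(:, 1:k);            % first k columns of U define the surface to project onto
Z = X * Ureduce;                % each row of Z is the k-dimensional z for one example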