- Intel® Extension for PyTorch* Backend
+ Intel® Extension for PyTorch* Backend
=====================================

- To work better with `torch.compile`, Intel® Extension for PyTorch* implements a backend ``ipex``.
- It targets to improve hardware resource usage efficiency on Intel platforms for better performance.
- The `ipex` backend is implemented with further customizations designed in Intel® Extension for
- PyTorch* for the model compilation.
+ **Author**: `Hamid Shojanazeri <https://github.com/jingxu10>`_
+ **Translator**: `jh941213 <https://github.com/jh941213>`_

- Usage Example
+ - To work better with `torch.compile`, Intel® Extension for PyTorch* implements a backend called ``ipex`` (a registration check is sketched below).
+ - This backend aims to improve hardware resource usage efficiency on Intel platforms for better performance.
+ - The `ipex` backend is implemented with additional customizations designed in Intel® Extension for PyTorch* for model compilation.
+
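Importing ``intel_extension_for_pytorch`` is presumably what makes the ``ipex`` backend visible to ``torch.compile``. Below is a minimal sketch to confirm that registration, assuming the extension is installed; ``torch._dynamo.list_backends()`` is a stock PyTorch 2.x helper, not part of the extension.

.. code:: python

    import torch
    # The import below is assumed to register the "ipex" compile backend as a side effect.
    import intel_extension_for_pytorch as ipex

    # list_backends() reports the backend names that torch.compile accepts as strings.
    print("ipex" in torch._dynamo.list_backends())  # expected to print True once registered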
+ Usage Example
~~~~~~~~~~~~~

- Train FP32
+ Train FP32
----------

- Check the example below to learn how to utilize the `ipex` backend with `torch.compile` for model training with FP32 data type.
-
+ Check the example below to learn how to utilize the `ipex` backend with `torch.compile` for model training with the FP32 data type.
.. code:: python

    import torch
@@ -44,10 +45,9 @@ Check the example below to learn how to utilize the `ipex` backend with `torch.c
    optimizer = torch.optim.SGD(model.parameters(), lr=LR, momentum=0.9)
    model.train()

-    #################### code changes ####################
+    #################### code changes ####################
    import intel_extension_for_pytorch as ipex
-
-    # Invoke the following API optionally, to apply frontend optimizations
+    # Optionally invoke the following API to apply frontend optimizations.
    model, optimizer = ipex.optimize(model, optimizer=optimizer)

    compile_model = torch.compile(model, backend="ipex")
@@ -61,10 +61,10 @@ Check the example below to learn how to utilize the `ipex` backend with `torch.c
        optimizer.step()


- Train BF16
+ Train BF16
----------

- Check the example below to learn how to utilize the `ipex` backend with `torch.compile` for model training with BFloat16 data type.
+ Check the example below to learn how to utilize the `ipex` backend with `torch.compile` for model training with the BFloat16 data type.

.. code:: python

@@ -96,10 +96,9 @@ Check the example below to learn how to utilize the `ipex` backend with `torch.c
    optimizer = torch.optim.SGD(model.parameters(), lr=LR, momentum=0.9)
    model.train()

-    #################### code changes ####################
+    #################### code changes ####################
    import intel_extension_for_pytorch as ipex
-
-    # Invoke the following API optionally, to apply frontend optimizations
+    # Optionally invoke the following API to apply frontend optimizations.
    model, optimizer = ipex.optimize(model, dtype=torch.bfloat16, optimizer=optimizer)

    compile_model = torch.compile(model, backend="ipex")
@@ -114,10 +113,10 @@ Check the example below to learn how to utilize the `ipex` backend with `torch.c
        optimizer.step()


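The elided portion of the BFloat16 training example presumably wraps the forward pass in a CPU autocast context; below is a self-contained sketch under that assumption, again with a hypothetical toy model and random tensors.

.. code:: python

    import torch
    import intel_extension_for_pytorch as ipex

    model = torch.nn.Linear(10, 2)                 # hypothetical toy model
    criterion = torch.nn.CrossEntropyLoss()
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
    model.train()

    # Prepare model and optimizer for BFloat16, then compile with the ipex backend.
    model, optimizer = ipex.optimize(model, dtype=torch.bfloat16, optimizer=optimizer)
    compile_model = torch.compile(model, backend="ipex")

    data = torch.rand(8, 10)
    target = torch.randint(0, 2, (8,))

    optimizer.zero_grad()
    # Run the forward pass and loss under CPU autocast so ops execute in BFloat16.
    with torch.amp.autocast("cpu", dtype=torch.bfloat16):
        loss = criterion(compile_model(data), target)
    loss.backward()
    optimizer.step()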
- Inference FP32
+ Inference FP32
--------------

- Check the example below to learn how to utilize the `ipex` backend with `torch.compile` for model inference with FP32 data type.
+ Check the example below to learn how to utilize the `ipex` backend with `torch.compile` for model inference with the FP32 data type.

.. code:: python

@@ -128,10 +127,9 @@ Check the example below to learn how to utilize the `ipex` backend with `torch.c
    model.eval()
    data = torch.rand(1, 3, 224, 224)

-    #################### code changes ####################
+    #################### code changes ####################
    import intel_extension_for_pytorch as ipex
-
-    # Invoke the following API optionally, to apply frontend optimizations
+    # Optionally invoke the following API to apply frontend optimizations.
    model = ipex.optimize(model, weights_prepack=False)

    compile_model = torch.compile(model, backend="ipex")
@@ -141,10 +139,10 @@ Check the example below to learn how to utilize the `ipex` backend with `torch.c
        compile_model(data)


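A compact, self-contained sketch of the FP32 inference flow follows, with a hypothetical toy model in place of the full example's model; wrapping the calls in ``torch.no_grad()`` is an assumption about the elided call sites.

.. code:: python

    import torch
    import intel_extension_for_pytorch as ipex

    model = torch.nn.Linear(10, 2)   # hypothetical toy model
    model.eval()

    # Frontend optimizations (no weight prepacking, matching the example above), then compile.
    model = ipex.optimize(model, weights_prepack=False)
    compile_model = torch.compile(model, backend="ipex")

    data = torch.rand(8, 10)
    with torch.no_grad():
        compile_model(data)              # first call triggers compilation (warm-up)
        output = compile_model(data)     # later calls run the compiled graph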
- Inference BF16
+ Inference BF16
--------------

- Check the example below to learn how to utilize the `ipex` backend with `torch.compile` for model inference with BFloat16 data type.
+ Check the example below to learn how to utilize the `ipex` backend with `torch.compile` for model inference with the BFloat16 data type.

.. code:: python

@@ -155,10 +153,9 @@ Check the example below to learn how to utilize the `ipex` backend with `torch.c
    model.eval()
    data = torch.rand(1, 3, 224, 224)

-    #################### code changes ####################
+    #################### code changes ####################
    import intel_extension_for_pytorch as ipex
-
-    # Invoke the following API optionally, to apply frontend optimizations
+    # Optionally invoke the following API to apply frontend optimizations.
    model = ipex.optimize(model, dtype=torch.bfloat16, weights_prepack=False)

    compile_model = torch.compile(model, backend="ipex")
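And the BFloat16 inference variant as a self-contained sketch; combining ``torch.no_grad()`` with CPU autocast is an assumption about how the elided call sites run the compiled model.

.. code:: python

    import torch
    import intel_extension_for_pytorch as ipex

    model = torch.nn.Linear(10, 2)   # hypothetical toy model
    model.eval()

    # Prepare the model for BFloat16 (no weight prepacking), then compile with the ipex backend.
    model = ipex.optimize(model, dtype=torch.bfloat16, weights_prepack=False)
    compile_model = torch.compile(model, backend="ipex")

    data = torch.rand(8, 10)
    # Assumed usage: no_grad plus CPU autocast so the compiled graph runs in BFloat16.
    with torch.no_grad(), torch.amp.autocast("cpu", dtype=torch.bfloat16):
        compile_model(data)              # warm-up / compilation
        output = compile_model(data)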