
Commit dec8fdb (1 parent: 791a9fb)

[Inference] resolve rebase conflicts
fix

File tree: 2 files changed (+1 / -2 lines)

colossalai/inference/core/engine.py (1 addition, 1 deletion)

@@ -1,6 +1,6 @@
 import time
 from itertools import count
-from typing import Dict, List, Optional, Tuple, Union, Iterable
+from typing import Dict, List, Optional, Tuple, Union

 import numpy as np
 import torch

colossalai/shardformer/layer/embedding.py (0 additions, 1 deletion)

@@ -175,7 +175,6 @@ class VocabParallelEmbedding1D(ParallelModule):
 he initializer of weight, defaults to normal initializer.

 The ``args`` and ``kwargs`` used in :class:``torch.nn.functional.embedding`` should contain:
-::

 max_norm (float, optional): If given, each embedding vector with norm larger than max_norm is
     renormalized to have norm max_norm. Note: this will modify weight in-place.
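The docstring being edited describes the `max_norm` behavior of `torch.nn.functional.embedding`: any embedding row whose norm exceeds `max_norm` is rescaled to that norm, and the rescaling mutates the weight tensor in place. A minimal standalone sketch of that behavior (not code from this commit):

```python
import torch
import torch.nn.functional as F

# Two embedding rows with norms 5.0 and 0.5 respectively.
weight = torch.tensor([[3.0, 4.0], [0.3, 0.4]])
idx = torch.tensor([0, 1])

# max_norm=1.0: row 0 (norm 5.0 > 1.0) is renormalized to norm 1.0,
# row 1 (norm 0.5) is left untouched. The rescaling modifies `weight`
# in place, exactly as the docstring warns.
out = F.embedding(idx, weight, max_norm=1.0)

print(weight[0].norm())  # row 0 now has unit norm
print(weight[1].norm())  # row 1 keeps its original norm
```

This in-place mutation is why the docstring explicitly calls the behavior out: callers who reuse the weight tensor elsewhere can be surprised by the side effect.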
