Commit 89396d1

chore: remove repetitive words (pytorch#1244)
Signed-off-by: hugehope <cmm7@sina.cn>
1 parent ecd951d

File tree

1 file changed: +3 −3 lines


gat/README.md

Lines changed: 3 additions & 3 deletions
````diff
@@ -17,7 +17,7 @@ This repository provides a clean and short implementation of the official GAT mo
 
 ## Key Features
 
-- **GAT Model**: Implementation of the Graph Attention Network model with multi-head attention based on on the paper "Graph Attention Network" by Velickovic et al.
+- **GAT Model**: Implementation of the Graph Attention Network model with multi-head attention based on the paper "Graph Attention Network" by Velickovic et al.
 - **Graph Attention Layers**: Implementation of graph convolutional layers that aggregate information from neighboring nodes using a self-attention mechanisms to learn node importance weights.
 - **Training and Evaluation**: Code for training GAT models on graph-structured data and evaluating their performance on node classification tasks on the *Cora* benchmark dataset.
 
@@ -45,7 +45,7 @@ Following the official implementation, the first GAT layer consists of **K = 8 a
 
 
 # Usage
-Training and evaluating the GAT model on the Cora dataset can be done through running the the `main.py` script as follows:
+Training and evaluating the GAT model on the Cora dataset can be done through running the `main.py` script as follows:
 
 1. Clone the PyTorch examples repository:
 
@@ -60,7 +60,7 @@ cd examples/gat
 pip install -r requirements.txt
 ```
 
-3. Train the GAT model by running the the `main.py` script as follows:: (Example using the default parameters)
+3. Train the GAT model by running the `main.py` script as follows:: (Example using the default parameters)
 
 ```bash
 python main.py --epochs 300 --lr 0.005 --l2 5e-4 --dropout-p 0.6 --num-heads 8 --hidden-dim 64 --val-every 20
````

0 commit comments
