```
pip install mxnet --pre
```
### Neighbor Sampling & Skip Connection
cora: test accuracy ~83% with `--num-neighbors 2`, ~84% by training on the full graph
```
DGLBACKEND=mxnet python3 examples/mxnet/sampling/train.py --model gcn_ns --dataset cora --self-loop --num-neighbors 2 --batch-size 1000 --test-batch-size 5000
```
citeseer: test accuracy ~69% with `--num-neighbors 2`, ~70% by training on the full graph
```
DGLBACKEND=mxnet python3 examples/mxnet/sampling/train.py --model gcn_ns --dataset citeseer --self-loop --num-neighbors 2 --batch-size 1000 --test-batch-size 5000
```
pubmed: test accuracy ~78% with `--num-neighbors 3`, ~77% by training on the full graph
```
DGLBACKEND=mxnet python3 examples/mxnet/sampling/train.py --model gcn_ns --dataset pubmed --self-loop --num-neighbors 3 --batch-size 1000 --test-batch-size 5000
```
reddit: test accuracy ~91% with `--num-neighbors 3` and `--batch-size 1000`, ~93% by training on the full graph
```
DGLBACKEND=mxnet python3 examples/mxnet/sampling/train.py --model gcn_ns --dataset reddit-self-loop --num-neighbors 3 --batch-size 1000 --test-batch-size 5000 --n-hidden 64
```
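
For readers new to the method, the `gcn_ns` model combines layer-wise neighbor sampling (each node aggregates only `--num-neighbors` randomly chosen neighbors per layer) with a skip connection that feeds the layer's input directly into its output. The snippet below is a minimal NumPy sketch of one such layer, not the actual code in `examples/mxnet/sampling/train.py`; the helper names (`sample_neighbors`, `gcn_ns_layer`) and the plain mean aggregation are illustrative assumptions.

```
import numpy as np

def sample_neighbors(adj_list, node, num_neighbors, rng):
    # Uniformly sample up to `num_neighbors` neighbors of `node` (hypothetical helper).
    nbrs = adj_list[node]
    if len(nbrs) <= num_neighbors:
        return list(nbrs)
    return list(rng.choice(nbrs, size=num_neighbors, replace=False))

def gcn_ns_layer(h, adj_list, nodes, num_neighbors, W, W_skip, rng):
    # One sampled GCN layer with a skip connection: aggregate a few sampled
    # neighbors instead of the full neighborhood, then add a transformed copy
    # of the node's own input features (the skip connection).
    out = np.zeros((len(nodes), W.shape[1]))
    for i, v in enumerate(nodes):
        nbrs = sample_neighbors(adj_list, v, num_neighbors, rng)
        agg = h[nbrs].mean(axis=0) if nbrs else np.zeros(h.shape[1])
        out[i] = np.maximum(agg @ W + h[v] @ W_skip, 0.0)  # ReLU
    return out

# Toy usage: 5 nodes, 4 input features, 8 output features, 2 sampled neighbors.
rng = np.random.default_rng(0)
adj_list = {0: [1, 2, 3], 1: [0, 4], 2: [0], 3: [0, 4], 4: [1, 3]}
h = rng.normal(size=(5, 4))
W, W_skip = rng.normal(size=(4, 8)), rng.normal(size=(4, 8))
out = gcn_ns_layer(h, adj_list, list(range(5)), num_neighbors=2, W=W, W_skip=W_skip, rng=rng)
```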
### Control Variate & Skip Connection
cora: test accuracy ~84% with `--num-neighbors 1`, ~84% by training on the full graph
```
DGLBACKEND=mxnet python3 examples/mxnet/sampling/train.py --model gcn_cv --dataset cora --self-loop --num-neighbors 1 --batch-size 1000000 --test-batch-size 1000000
```
citeseer: test accuracy ~69% with `--num-neighbors 1`, ~70% by training on the full graph
```
DGLBACKEND=mxnet python3 examples/mxnet/sampling/train.py --model gcn_cv --dataset citeseer --self-loop --num-neighbors 1 --batch-size 1000000 --test-batch-size 1000000
```
pubmed: test accuracy ~79% with `--num-neighbors 1`, ~77% by training on the full graph
```
DGLBACKEND=mxnet python3 examples/mxnet/sampling/train.py --model gcn_cv --dataset pubmed --self-loop --num-neighbors 1 --batch-size 1000000 --test-batch-size 1000000
```
reddit: test accuracy ~93% with `--num-neighbors 1` and `--batch-size 1000`, ~93% by training on the full graph
```
DGLBACKEND=mxnet python3 examples/mxnet/sampling/train.py --model gcn_cv --dataset reddit-self-loop --num-neighbors 1 --batch-size 10000 --test-batch-size 5000 --n-hidden 64
```
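
For context, the `gcn_cv` model implements the control-variate (variance-reduction) estimator from the [Control Variate](https://arxiv.org/abs/1710.10568) paper: each layer keeps stale "history" activations for every node, aggregates the full neighborhood over that cheap history, and samples only a few neighbors for the small correction term, which is why `--num-neighbors 1` already matches full-graph accuracy. Below is a rough dense-matrix illustration of the estimator; it is not the example's code, and the function name and dense shapes are assumptions made for brevity.

```
import numpy as np

def cv_aggregate(A_norm, H, H_hist, sampled_mask):
    # Control-variate neighbor aggregation for one layer (illustrative, dense).
    #   A_norm:       normalized adjacency matrix, shape (N, N)
    #   H:            current input activations of the layer, shape (N, d)
    #   H_hist:       stale "history" of those activations, shape (N, d)
    #   sampled_mask: 0/1 matrix keeping only the sampled edges, shape (N, N)
    # Cheap part: aggregate the history over ALL neighbors.
    hist_term = A_norm @ H_hist
    # Sampled part: aggregate the small delta (H - H_hist) over the few sampled
    # neighbors, rescaled by (true degree / sampled degree) so the sampled sum
    # is an unbiased estimate of the full sum.
    edges = (A_norm != 0)
    deg = np.maximum(edges.sum(axis=1, keepdims=True), 1)
    deg_sampled = np.maximum((edges & (sampled_mask != 0)).sum(axis=1, keepdims=True), 1)
    delta_term = (deg / deg_sampled) * ((A_norm * sampled_mask) @ (H - H_hist))
    return hist_term + delta_term
```

In the full algorithm the history is refreshed with the newly computed activations after each mini-batch, so the correction term stays small.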
### Control Variate & GraphSAGE-mean
Following [Control Variate](https://arxiv.org/abs/1710.10568), we use the mean pooling variant of GraphSAGE (GraphSAGE-mean).
reddit: test accuracy 96.1% with `--num-neighbors 1` and `--batch-size 1000`, ~96.2% in [Control Variate](https://arxiv.org/abs/1710.10568) with `--num-neighbors 2` and `--batch-size 1000`
```
DGLBACKEND=mxnet python3 examples/mxnet/sampling/train.py --model graphsage_cv --batch-size 1000 --test-batch-size 5000 --n-epochs 50 --dataset reddit --num-neighbors 1 --n-hidden 128 --dropout 0.2 --weight-decay 0
```
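
`graphsage_cv` applies the same control-variate sampling to a GraphSAGE-mean layer, which concatenates a node's own features with the (sampled) mean of its neighbors' features before a linear transform. The following is a minimal NumPy sketch of that update rule, with hypothetical names and toy shapes rather than the example's actual code.

```
import numpy as np

def graphsage_mean_update(h_self, h_neigh_mean, W):
    # GraphSAGE-mean update: concatenate self features with the mean of the
    # neighbors' features, apply a linear layer, then ReLU (feature
    # normalization and dropout are omitted for brevity).
    z = np.concatenate([h_self, h_neigh_mean], axis=1) @ W
    return np.maximum(z, 0.0)

# Toy shapes: 4 nodes, 8 input features each, hidden size 16 (so W is 16 x 16).
rng = np.random.default_rng(0)
h_self = rng.normal(size=(4, 8))
h_neigh_mean = rng.normal(size=(4, 8))  # e.g. a control-variate estimate of the neighbor mean
W = rng.normal(size=(16, 16)) * 0.1
out = graphsage_mean_update(h_self, h_neigh_mean, W)  # shape (4, 16)
```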
### Run multi-processing training