Gemaxis/GAT

Description

This is a simple improvement to Graph Attention Networks: by adding the scaling factor from the scaled dot-product attention formula in Attention Is All You Need, GAT trains faster and reaches higher accuracy.

In layer.py, line 54, I have added e = e / math.sqrt(1433): 1433 is the input feature dimension of the Cora dataset, so this divides the attention logits by sqrt(d) in the spirit of scaled dot-product attention, which makes GAT train faster and perform better.
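To make the change concrete, here is a minimal single-head GAT layer in PyTorch showing where that scaling slots in. The layout (W, a, LeakyReLU, masked softmax) follows the widely used pyGAT implementation; the class name, signatures, and the self-loop assumption below are illustrative, not this repository's exact layer.py.

```python
import math

import torch
import torch.nn as nn
import torch.nn.functional as F


class GraphAttentionLayer(nn.Module):
    """One GAT attention head, laid out after the common pyGAT code.

    Illustrative sketch: names and shapes mirror pyGAT's layers.py,
    but this is not this repository's exact file.
    """

    def __init__(self, in_features, out_features, alpha=0.2):
        super().__init__()
        self.in_features = in_features  # 1433 on Cora
        self.W = nn.Parameter(torch.empty(in_features, out_features))
        self.a = nn.Parameter(torch.empty(2 * out_features, 1))
        nn.init.xavier_uniform_(self.W)
        nn.init.xavier_uniform_(self.a)
        self.leakyrelu = nn.LeakyReLU(alpha)

    def forward(self, h, adj):
        # h: (N, in_features) node features; adj: (N, N) adjacency,
        # assumed to include self-loops so every softmax row is finite.
        Wh = h @ self.W                            # (N, out_features)
        # Additive attention logits e_ij = LeakyReLU(a^T [Wh_i || Wh_j]),
        # computed without materializing the concatenation.
        Wh1 = Wh @ self.a[: self.a.size(0) // 2]   # (N, 1)
        Wh2 = Wh @ self.a[self.a.size(0) // 2 :]   # (N, 1)
        e = self.leakyrelu(Wh1 + Wh2.T)            # (N, N)
        # The one-line change this README describes: divide the logits
        # by sqrt(d), as in scaled dot-product attention. Here d = 1433,
        # Cora's input feature dimension.
        e = e / math.sqrt(self.in_features)
        # Mask non-edges, then normalize over each node's neighborhood.
        e = e.masked_fill(adj == 0, float("-inf"))
        attention = F.softmax(e, dim=1)
        return attention @ Wh                      # (N, out_features)
```

On Cora, a head would be constructed as, for example, GraphAttentionLayer(1433, 8), so the division is exactly e / math.sqrt(1433) as in the README.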

PyTorch Graph Attention Network

This is a PyTorch implementation of the Graph Attention Network (GAT) model presented by Veličković et al. (2017, https://arxiv.org/abs/1710.10903).

This repo was initially forked from https://github.com/tkipf/pygcn. The official (TensorFlow) GAT repository is available at https://github.com/PetarV-/GAT. If you make use of the pyGAT model in your research, please cite the following:

@article{velickovic2018graph,
  title={Graph Attention Networks},
  author={Veli{\v{c}}kovi{\'{c}}, Petar and Cucurull, Guillem and Casanova, Arantxa and Romero, Adriana and Li{\`{o}}, Pietro and Bengio, Yoshua},
  journal={International Conference on Learning Representations},
  year={2018},
  url={https://openreview.net/forum?id=rJXMpikCZ},
  note={accepted as poster},
}

Requirements

pyGAT relies on Python 3.9 and PyTorch 1.10.2.

Issues/Pull Requests/Feedback

Don't hesitate to reach out with any feedback, or to open issues and pull requests.
