[Day 97] Review of GNN structure and training (last 2 days) + starting Colab 2 of XCS224W: ML with Graphs
Hello :)
Today is Day 97!
A quick summary of today:
- read through and reviewed my notes on the intro to GNNs (Day 94)
- read through and reviewed my notes on Designing a GNN layer (Day 95)
- read through and reviewed my notes on GNN training (Day 96)
- started the 2nd homework of the course, which covers the topics above
First I had a quick look through the 2nd homework, which covers the material from Day 94 and Day 95, and I decided it would be best to do a proper review of the important concepts from those days.
Topics I went over (a few illustrative code sketches follow the list):
- designing a single layer of a GNN
- message computation
- aggregation
- GCN
- GraphSAGE
- GAT
- attention and multi-head attention in graphs
- stacking GNN layers
- the problem of over-smoothing
- shallow GNNs
- using skip connections
- graph augmentation
- feature augmentation
- training GNNs at the node, edge, and graph level
- pooling for graph-level tasks
- DiffPool
- supervised, unsupervised, and self-supervised learning
- loss functions
- evaluation metrics
- splitting data
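To make the message computation and aggregation ideas concrete, here is a minimal from-scratch sketch of a GCN-style layer in PyTorch. It assumes a dense adjacency matrix for simplicity, and the class name GCNLayer and the toy graph are my own, not from the course materials:

```python
import torch
import torch.nn as nn

class GCNLayer(nn.Module):
    """One GCN-style layer: message = linear transform of each neighbor's
    features, aggregation = degree-normalized sum over the neighborhood."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, X, A):
        # add self-loops so each node also keeps its own message
        A_hat = A + torch.eye(A.size(0))
        # symmetric degree normalization: D^-1/2 (A + I) D^-1/2
        deg = A_hat.sum(dim=1)
        D_inv_sqrt = torch.diag(deg.pow(-0.5))
        A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt
        # message computation (linear transform) + aggregation (weighted sum)
        return torch.relu(A_norm @ self.linear(X))

# toy usage: 4 nodes with 3 features each, a small ring graph
X = torch.randn(4, 3)
A = torch.tensor([[0, 1, 0, 1],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [1, 0, 1, 0]], dtype=torch.float)
layer = GCNLayer(3, 8)
out = layer(X, A)   # shape (4, 8): one new embedding per node
```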
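Similarly, a single-head GAT-style layer can be sketched by scoring every edge with a small learned attention function and softmax-normalizing the scores over each neighborhood. This is just an illustrative dense version (real implementations work over sparse edge lists), and it assumes A already contains self-loops so every row of the softmax has at least one valid entry:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GATLayer(nn.Module):
    """Single-head GAT-style layer: an attention score e_uv is computed from
    the transformed features of each edge's endpoints, normalized with a
    softmax over node u's neighborhood, and used as the aggregation weight."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)
        self.a = nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, X, A):
        H = self.W(X)                                    # (N, out_dim)
        N = H.size(0)
        # pairwise concatenation [h_u || h_v] for every node pair (u, v)
        h_pairs = torch.cat(
            [H.unsqueeze(1).expand(N, N, -1),
             H.unsqueeze(0).expand(N, N, -1)], dim=-1)   # (N, N, 2*out_dim)
        e = F.leaky_relu(self.a(h_pairs).squeeze(-1))    # (N, N) raw scores
        # mask out non-edges before the per-neighborhood softmax
        # (A is assumed to include self-loops, so no row is all -inf)
        e = e.masked_fill(A == 0, float('-inf'))
        alpha = torch.softmax(e, dim=1)                  # attention weights
        return torch.relu(alpha @ H)
```

Multi-head attention then just runs K independent copies of such a layer and concatenates (or averages) their outputs.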
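And to tie together stacking layers, skip connections, and graph-level pooling, here is a hypothetical little classifier (my own sketch, reusing the GCNLayer above) that adds a residual skip whenever the shapes allow, which helps keep earlier, less-smoothed embeddings in the mix against over-smoothing, and then mean-pools the node embeddings into a single graph embedding:

```python
import torch
import torch.nn as nn

class GraphClassifier(nn.Module):
    """Stack of GNN layers + mean pooling + a linear prediction head.
    Mean pooling collapses the (N, d) node-embedding matrix into one
    (d,) graph embedding, which the head maps to class scores."""
    def __init__(self, gnn_layers, hidden_dim, num_classes):
        super().__init__()
        self.gnn_layers = nn.ModuleList(gnn_layers)
        self.head = nn.Linear(hidden_dim, num_classes)

    def forward(self, X, A):
        H = X
        for layer in self.gnn_layers:
            H_new = layer(H, A)
            # skip connection (only when input/output dims match)
            H = H_new + H if H_new.shape == H.shape else H_new
        h_graph = H.mean(dim=0)     # graph-level mean pooling
        return self.head(h_graph)   # class logits for the whole graph

# toy usage, reusing X and A from the first sketch
model = GraphClassifier([GCNLayer(3, 8), GCNLayer(8, 8)],
                        hidden_dim=8, num_classes=2)
logits = model(X, A)   # shape (2,)
```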
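Finally, the core of DiffPool fits in a few lines: a learned soft assignment matrix S pools the N nodes into K clusters, coarsening both the feature matrix and the adjacency via X' = SᵀZ and A' = SᵀAS. In the actual method S is produced by a separate pooling GNN; this sketch only shows the coarsening step itself, with made-up toy inputs:

```python
import torch

def diffpool_coarsen(Z, S_logits, A):
    """One DiffPool coarsening step.
    Z: (N, d) node embeddings, S_logits: (N, K) raw cluster-assignment
    scores (from a pooling GNN in the real method), A: (N, N) adjacency."""
    S = torch.softmax(S_logits, dim=1)   # soft assignment of nodes to clusters
    X_coarse = S.T @ Z                   # (K, d) cluster features
    A_coarse = S.T @ A @ S               # (K, K) cluster connectivity
    return X_coarse, A_coarse

# toy usage: pool 6 nodes into 2 clusters
Z = torch.randn(6, 8)
A = (torch.rand(6, 6) > 0.5).float()
S_logits = torch.randn(6, 2)             # hypothetical assignment scores
X_c, A_c = diffpool_coarsen(Z, S_logits, A)   # shapes (2, 8) and (2, 2)
```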
As for the 2nd homework, I am not allowed to share any code from it, and I did not finish it today. I decided to go slowly with this one and completed just 4 of the questions. Tomorrow I think I will be able to finish the rest and submit it for a final grade ^^
That is all for today!
See you tomorrow :)