
[Day 97] Review of the GNN structure and training (last 2 days) + starting Colab 2 of XCS224W: ML with Graphs

Hello :) Today is Day 97! A quick summary of today:

- read through and reviewed my notes on intro to GNNs (Day 94)
- read through and reviewed my notes on Designing a GNN layer (Day 95)
- read through and reviewed my notes on GNN training (Day 96)
- started the 2nd homework of the course, which covers the above material

First I had a quick look through the 2nd homework, which covers the material from Day 94 and Day 95, and I decided it would be best to do a proper review of the important concepts from those days first.

Topics I went over:

- designing a single layer of a GNN: message computation and aggregation (a small sketch follows after this list)
- GCN, GraphSAGE, GAT
- attention and multi-head attention in graphs
- stacking GNN layers, the problem of over-smoothing, keeping GNNs shallow, and using skip connections
- graph augmentation and feature augmentation
- training GNNs at the node level, edge level, and graph level
- pooling for graph-level tasks, DiffPool
- supervised, unsupervised, and self-supervised learning
- loss functions and evaluation metrics
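To make the message computation and aggregation part of the review a bit more concrete, here is a minimal sketch of a GraphSAGE-style layer in plain PyTorch: messages are the neighbors' feature vectors, aggregation is a mean, and the update is a linear transform plus ReLU. This is just my own toy illustration (the class name, the dense adjacency matrix, and the specific update rule are my assumptions for clarity), not the implementation used in the course colabs.

```python
import torch
import torch.nn as nn


class SimpleSAGELayer(nn.Module):
    """Toy GraphSAGE-style layer (illustrative only):
    message = neighbor features, aggregation = mean,
    update = linear transforms of self + aggregated neighbors, then ReLU."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin_self = nn.Linear(in_dim, out_dim)   # transforms the node's own features
        self.lin_neigh = nn.Linear(in_dim, out_dim)  # transforms the aggregated neighbor message

    def forward(self, x, adj):
        # x:   (num_nodes, in_dim) node feature matrix
        # adj: (num_nodes, num_nodes) dense 0/1 adjacency matrix without self-loops
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1)  # avoid dividing by zero for isolated nodes
        neigh_mean = (adj @ x) / deg                     # mean aggregation of neighbor messages
        return torch.relu(self.lin_self(x) + self.lin_neigh(neigh_mean))


# Tiny usage example on a 3-node path graph: 0 - 1 - 2
x = torch.randn(3, 4)
adj = torch.tensor([[0., 1., 0.],
                    [1., 0., 1.],
                    [0., 1., 0.]])
layer = SimpleSAGELayer(in_dim=4, out_dim=8)
print(layer(x, adj).shape)  # torch.Size([3, 8])
```

Stacking several of these layers is what leads to the over-smoothing issue mentioned above, which is why the notes also cover keeping GNNs shallow or adding skip connections.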