Representation Learning
-
Why is learning features of nodes in a graph important? -- Traditional machine learning techniques treat each entity independently, whereas graph-based feature learning (aka representation learning) exploits the structure and semantics of how nodes are connected in a network while learning the latent representations.
-
What is the drawback of representation learning techniques like node2vec/deepwalk? -- These techniques involve a multi-step pipeline of random-walk generation followed by semi-supervised (skip-gram) training, where each step has to be optimized separately rather than end to end.
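The first stage of that pipeline can be sketched in a few lines. This is a minimal DeepWalk-style uniform walk over a hypothetical toy adjacency list; the walks would then be handed to a separate skip-gram training step:

```python
import random

def random_walk(adj, start, walk_length, rng):
    """Uniform random walk of fixed maximum length, DeepWalk-style."""
    walk = [start]
    for _ in range(walk_length - 1):
        neighbors = adj[walk[-1]]
        if not neighbors:
            break
        walk.append(rng.choice(neighbors))
    return walk

# Toy undirected graph as an adjacency list (hypothetical example).
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
rng = random.Random(42)
walks = [random_walk(adj, node, 5, rng) for node in adj for _ in range(2)]
```

The two-stage structure is visible here: nothing about the downstream embedding objective influences how these walks are sampled.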
-
How are neural networks used for graph representation learning? -- The node data is passed through a neural network that predicts node labels in a semi-supervised manner. Backpropagation updates the network's weights, and once trained, the hidden activations of the network can be used as features of the nodes.
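A minimal pure-Python sketch of this idea (hypothetical toy graph and labels; a real model would use a deep-learning framework): a one-hidden-layer network is trained to predict node labels from adjacency rows, and its hidden activations are then read off as node features.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Each node's input feature is its adjacency row (4-node toy graph).
X = [[0, 1, 1, 0],
     [1, 0, 1, 0],
     [1, 1, 0, 1],
     [0, 0, 1, 0]]
y = [0, 0, 1, 1]  # node labels (all labeled here for brevity)

rng = random.Random(0)
H = 3  # hidden size
W1 = [[rng.uniform(-0.5, 0.5) for _ in range(H)] for _ in range(4)]
W2 = [rng.uniform(-0.5, 0.5) for _ in range(H)]

def forward(x):
    h = [sigmoid(sum(x[i] * W1[i][j] for i in range(4))) for j in range(H)]
    out = sigmoid(sum(h[j] * W2[j] for j in range(H)))
    return h, out

lr = 0.5
for _ in range(500):
    for x, t in zip(X, y):
        h, out = forward(x)
        d_out = out - t  # gradient of the log loss w.r.t. the output logit
        for j in range(H):
            d_h = d_out * W2[j] * h[j] * (1 - h[j])
            W2[j] -= lr * d_out * h[j]
            for i in range(4):
                W1[i][j] -= lr * d_h * x[i]

# After training, the hidden activations serve as node embeddings.
embeddings = [forward(x)[0] for x in X]
```

The point is the last line: the network was trained for label prediction, but the learned hidden representation is the reusable node feature.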
-
What is word2vec? How does it work? -- word2vec learns dense vector representations of words by training a shallow neural network on a context-prediction task: the skip-gram variant predicts surrounding words from a center word, while CBOW predicts the center word from its context. Trained with negative sampling, words that appear in similar contexts end up with similar vectors.
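A minimal sketch of the sampling step skip-gram trains on: generating (center, context) pairs within a window (hypothetical toy sentence):

```python
def skipgram_pairs(tokens, window):
    """(center, context) training pairs as in word2vec's skip-gram."""
    pairs = []
    for i, center in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

pairs = skipgram_pairs(["the", "cat", "sat"], window=1)
# pairs == [("the","cat"), ("cat","the"), ("cat","sat"), ("sat","cat")]
```

The model then learns embeddings that score observed pairs higher than randomly sampled (negative) ones.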
-
What is node2vec? -- node2vec adapts word2vec to graphs: it generates second-order biased random walks controlled by a return parameter p and an in-out parameter q (interpolating between BFS-like and DFS-like exploration), treats the walks as sentences, and feeds them to skip-gram to learn node embeddings.
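The biased second-order step can be sketched as follows (hypothetical toy graph; real implementations precompute alias tables for speed):

```python
import random

def node2vec_step(adj, prev, cur, p, q, rng):
    """One biased step: returning to prev is weighted 1/p, moving to a
    node at distance 1 from prev is weighted 1 (BFS-like), and moving
    to a node at distance 2 from prev is weighted 1/q (DFS-like)."""
    neighbors = adj[cur]
    weights = []
    for nxt in neighbors:
        if nxt == prev:
            weights.append(1.0 / p)
        elif nxt in adj[prev]:
            weights.append(1.0)
        else:
            weights.append(1.0 / q)
    return rng.choices(neighbors, weights=weights, k=1)[0]

def node2vec_walk(adj, start, length, p, q, rng):
    walk = [start, rng.choice(adj[start])]  # first step is unbiased
    while len(walk) < length:
        walk.append(node2vec_step(adj, walk[-2], walk[-1], p, q, rng))
    return walk

adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1], 3: [1]}
walk = node2vec_walk(adj, 0, 6, p=0.5, q=2.0, rng=random.Random(1))
```

Small p encourages backtracking (local, BFS-like views); small q encourages venturing outward (DFS-like views).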
-
What is metapath2vec? How is it different from other random walk based embedding techniques? -- metapath2vec targets heterogeneous graphs (multiple node and edge types). Its random walks are guided by a meta-path scheme (e.g. author-paper-author), so each step may only move to a neighbor of the type the scheme prescribes; the resulting walks are fed to a heterogeneous skip-gram model. Unlike DeepWalk or node2vec, which ignore node types, this keeps the walks semantically meaningful in heterogeneous networks.
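A minimal sketch of a meta-path-guided walk, assuming a scheme whose first and last symbols match (e.g. "APA") so it can be cycled; the graph and types are a hypothetical toy example:

```python
import random

def metapath_walk(adj, node_type, start, metapath, length, rng):
    """Walk that only moves to neighbors whose type matches the next
    symbol of the meta-path. Assumes metapath[0] == metapath[-1], so
    the scheme repeats by cycling over its first len-1 symbols."""
    assert node_type[start] == metapath[0]
    walk = [start]
    i = 0
    while len(walk) < length:
        i += 1
        want = metapath[i % (len(metapath) - 1)]
        candidates = [n for n in adj[walk[-1]] if node_type[n] == want]
        if not candidates:
            break  # no neighbor of the required type
        walk.append(rng.choice(candidates))
    return walk

# Toy author-paper graph (hypothetical): "APA" meta-path.
adj = {"a1": ["p1"], "a2": ["p1", "p2"], "p1": ["a1", "a2"], "p2": ["a2"]}
node_type = {"a1": "A", "a2": "A", "p1": "P", "p2": "P"}
walk = metapath_walk(adj, node_type, "a1", "APA", 5, random.Random(0))
```

Filtering candidates by type is the only change relative to a plain DeepWalk-style walk, but it is what keeps walks on semantically coherent paths.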
-
What are possible drawbacks of shallow embedding techniques? -- They learn one embedding vector per node, so parameters grow linearly with the number of nodes; no parameters are shared between nodes; they cannot produce embeddings for nodes unseen during training (transductive, not inductive); and they ignore node attributes/features.
-
What is GraphSAGE? How is it different? -- GraphSAGE is an inductive framework: instead of learning a fixed embedding per node, it learns aggregator functions that combine a node's own features with features sampled from its neighborhood. Because the learned parameters are shared across nodes, embeddings can be computed for nodes (or entire graphs) never seen during training.
-
What are the steps for GraphSAGE? -- For each node: (1) sample a fixed-size set of neighbors at each depth; (2) aggregate the sampled neighbors' representations (mean, LSTM, or pooling aggregator); (3) combine the aggregate with the node's current representation and pass it through a weight matrix and nonlinearity; (4) L2-normalize, and repeat for K layers so each node's embedding reflects its K-hop neighborhood.
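The aggregate-combine-normalize loop of one layer can be sketched as below. This is a simplified mean aggregator on a hypothetical toy graph; scalar weights stand in for the learned weight matrices of the real model, and neighbor sampling is skipped:

```python
import math

def sage_mean_layer(adj, feats, w_self, w_neigh):
    """One GraphSAGE-style layer with a mean aggregator (scalar weights
    for brevity; GraphSAGE proper uses learned weight matrices)."""
    out = {}
    for node, x in feats.items():
        neigh = adj[node]
        dim = len(x)
        # Step 2: aggregate neighbor features by taking their mean.
        if neigh:
            mean = [sum(feats[u][i] for u in neigh) / len(neigh)
                    for i in range(dim)]
        else:
            mean = [0.0] * dim
        # Step 3: combine self and neighborhood representations.
        h = [w_self * x[i] + w_neigh * mean[i] for i in range(dim)]
        # Step 4: L2-normalize.
        norm = math.sqrt(sum(v * v for v in h)) or 1.0
        out[node] = [v / norm for v in h]
    return out

# Toy star graph (hypothetical example).
adj = {0: [1, 2], 1: [0], 2: [0]}
feats = {0: [1.0, 0.0], 1: [0.0, 1.0], 2: [1.0, 1.0]}
h1 = sage_mean_layer(adj, feats, w_self=1.0, w_neigh=0.5)
```

Stacking K such layers makes each node's output depend on its K-hop neighborhood.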
-
What are the limitations of conventional GNNs? -- Stacking many layers causes over-smoothing, where node representations converge and become indistinguishable; message passing struggles to capture long-range dependencies; aggregating over full neighborhoods is expensive on large graphs; and standard message-passing architectures are no more expressive than the 1-WL graph isomorphism test, so some graph structures cannot be distinguished.
-
What is the difference between strongly connected and weakly connected components of a graph? -- Both are defined for directed graphs. A strongly connected component is a maximal set of vertices in which every vertex is reachable from every other vertex following edge directions. A weakly connected component is a maximal set of vertices that is connected when edge directions are ignored.
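Both notions can be computed directly. A minimal sketch on a hypothetical directed toy graph, using Kosaraju's two-pass DFS for strong components and an undirected flood fill for weak components:

```python
def sccs(adj):
    """Kosaraju: DFS finish order, then DFS on the reversed graph."""
    order, seen = [], set()
    def dfs1(v):
        seen.add(v)
        for u in adj[v]:
            if u not in seen:
                dfs1(u)
        order.append(v)
    for v in adj:
        if v not in seen:
            dfs1(v)
    radj = {v: [] for v in adj}          # reverse all edges
    for v in adj:
        for u in adj[v]:
            radj[u].append(v)
    comps, seen = [], set()
    def dfs2(v, comp):
        seen.add(v)
        comp.add(v)
        for u in radj[v]:
            if u not in seen:
                dfs2(u, comp)
    for v in reversed(order):
        if v not in seen:
            comp = set()
            dfs2(v, comp)
            comps.append(comp)
    return comps

def wccs(adj):
    """Weakly connected components: treat every edge as undirected."""
    und = {v: set() for v in adj}
    for v in adj:
        for u in adj[v]:
            und[v].add(u)
            und[u].add(v)
    comps, seen = [], set()
    for v in und:
        if v in seen:
            continue
        stack, comp = [v], set()
        while stack:
            w = stack.pop()
            if w in seen:
                continue
            seen.add(w)
            comp.add(w)
            stack.extend(und[w] - seen)
        comps.append(comp)
    return comps

# 0→1→2→0 is a directed cycle (one SCC); 3 hangs off it via 2→3.
adj = {0: [1], 1: [2], 2: [0, 3], 3: []}
```

On this graph the cycle {0, 1, 2} forms one strong component and {3} another, while ignoring direction joins all four nodes into a single weak component.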