Heterogeneous Graph Representation Learning

Neural architectures for dynamic heterogeneous graphs (2022-2024)

Overview

From 2022 to 2024 I spearheaded research on novel neural models for heterogeneous graph representation learning, that is, graphs with multiple node and edge types that evolve over time. Our architectures enable downstream tasks on dynamic heterogeneous graphs, including keyword search and subgraph mining, going beyond what existing GNN methods support.

Key Contributions

  • Proposed temporally aware message-passing layers tailored to multi-relation graphs (see the sketch after this list).
  • Introduced contrastive pre-training strategies that boost performance in low-label regimes (a loss sketch also follows below).
  • Open-sourced a benchmarking suite for dynamic heterogeneous graph tasks.
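
A minimal sketch, assuming a PyTorch implementation, of what a temporally aware, relation-typed message-passing layer could look like. The class name, its parameters, and the exponential time-decay weighting are illustrative assumptions rather than the project's actual architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class RelationTemporalLayer(nn.Module):
    """Illustrative relation-typed, time-aware message-passing layer (not the project's code)."""

    def __init__(self, dim: int, num_relations: int):
        super().__init__()
        # One message transform per relation (edge) type.
        self.rel_transforms = nn.ModuleList([nn.Linear(dim, dim) for _ in range(num_relations)])
        # Learnable rate controlling how quickly old edges are down-weighted.
        self.decay = nn.Parameter(torch.tensor(1.0))
        self.update = nn.Linear(2 * dim, dim)

    def forward(self, x, edge_index, edge_type, edge_time, query_time):
        # x: [N, dim] node states; edge_index: [2, E]; edge_type, edge_time: [E].
        src, dst = edge_index
        # Relation-specific transform of each source node's state.
        msgs = torch.stack([
            self.rel_transforms[t](x[s])
            for s, t in zip(src.tolist(), edge_type.tolist())
        ])
        # Down-weight messages from older edges with an exponential time decay.
        gap = (query_time - edge_time).clamp(min=0).float()
        msgs = msgs * torch.exp(-F.softplus(self.decay) * gap).unsqueeze(-1)
        # Sum incoming messages per target node, then update node states.
        agg = torch.zeros_like(x)
        agg.index_add_(0, dst, msgs)
        return F.relu(self.update(torch.cat([x, agg], dim=-1)))
```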
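
Likewise, a hedged sketch of one form the contrastive pre-training objective could take: a symmetric InfoNCE loss between two views of the same nodes (for example, embeddings computed from two nearby time slices). The function name and the view construction are assumptions for illustration only.

```python
import torch
import torch.nn.functional as F

def info_nce(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.1) -> torch.Tensor:
    """Contrastive (InfoNCE) loss between two views; matching rows are positives.

    z1, z2: [N, dim] embeddings of the same N nodes under two augmented views
    (e.g. two time slices); every non-matching pair in the batch is a negative.
    """
    z1 = F.normalize(z1, dim=-1)
    z2 = F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / temperature              # scaled cosine similarities
    targets = torch.arange(z1.size(0), device=z1.device)
    # Symmetric cross-entropy: each view must identify its counterpart in the other.
    return 0.5 * (F.cross_entropy(logits, targets) + F.cross_entropy(logits.t(), targets))
```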

Collaborators

  • AT&T Chief Data Office
  • York University

Output

  • Five publications (references missing)

These advances lay the groundwork for scalable, real-time analytics on richly typed, rapidly changing graph data.