Rex (Zhitao) Ying

I'm currently an assistant professor in the Department of Computer Science at Yale University. I work on AI and machine learning algorithms that leverage the graph structure of data. I'm also excited to be a founding member of Kumo.ai!

Welcome to my personal website

Refer to my Yale profile for more information. Our lab is hiring motivated and capable Ph.D. students interested in geometric deep learning, graph neural networks, or trustworthy AI.

About Me

Introduction

I'm an assistant professor in the Department of Computer Science at Yale University. My research focuses on algorithms for graph neural networks, geometric embeddings, and explainable models. I am the author of many widely used GNN algorithms such as GraphSAGE, PinSAGE, and GNNExplainer. In addition, I have worked on a variety of applications of graph learning in physical simulations, social networks, knowledge graphs, and biology. I developed the first billion-scale graph embedding service at Pinterest and a graph-based anomaly detection algorithm at Amazon.


Education

I obtained my Ph.D. degree in computer science at Stanford University, advised by Jure Leskovec. My thesis, which focuses on expressive, scalable, and explainable graph neural networks (GNNs), is available on GitHub. Prior to that, I graduated from Duke University in 2016 with the highest distinction, majoring in computer science and mathematics.

Notice

I have moved my website to this new address, where I will post updates more regularly. Content at this current address might not be up to date.

Dr. Rex (Zhitao) Ying

Assistant Professor

News

  • I'm the winner of the KDD 2022 Dissertation Award, and will present my thesis at the KDD 2022 conference.
  • I'm excited to be a founding engineer of Kumo.ai, where we work on building a cloud AI platform based on state-of-the-art GNNs and the PyG library.
  • I'm one of the 10 winners of the 2019 Baidu Scholarship in Artificial Intelligence.

Research Outline

My research spans three broad areas: deep learning for graphs, geometric representation learning, and real-world applications with relational reasoning and modeling. My interests range from theoretical questions in graph learning, generalization, and manifold geometry to practical use cases in science and technology, in collaboration with science institutes and IT companies.

My list of publications can be found on Google Scholar.

Graph Neural Networks

Graph neural networks (GNNs) are powerful tools that play an important role in machine learning. I use relational information between nodes (entities) to enhance the capabilities of deep learning (a minimal sketch of this idea appears after this overview).

Geometric Representation Learning

I'm interested in empowering deep learning architectures with effective representation geometry, which is essential in modeling data manifolds with different characteristics.

Unlimited Applications

I work on real-world applications in physical simulations, chemistry, biology, knowledge graphs, natural languages, recommendations, and social networks.
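To make the first area above concrete, here is a minimal, illustrative sketch of the message-passing idea behind GNNs: each node updates its representation by aggregating the features of its neighbors. The toy graph, feature dimensions, and weights are made up for illustration and are not taken from any specific paper.

```python
import torch

# Toy graph: 4 nodes on a cycle, each with a 3-dimensional feature vector.
num_nodes, dim = 4, 3
x = torch.randn(num_nodes, dim)              # initial node features
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]     # undirected edges

# Row-normalized adjacency matrix, so each node averages over its neighbors.
adj = torch.zeros(num_nodes, num_nodes)
for i, j in edges:
    adj[i, j] = adj[j, i] = 1.0
adj = adj / adj.sum(dim=1, keepdim=True)

# One message-passing step: aggregate neighbor features, then apply a
# learnable linear transform and a non-linearity.
W = torch.nn.Linear(dim, dim, bias=False)
h = torch.relu(W(adj @ x))
print(h.shape)  # torch.Size([4, 3])
```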

Services

I'm excited to serve the research community in various capacities. I co-lead the open-source project PyTorch Geometric, which aims to make developing graph neural networks easy and accessible for researchers, engineers, and a general audience with a variety of backgrounds. I have served as a program committee member for machine learning conferences including AAAI, ICML, NeurIPS, ICLR, KDD, and WebConf for over 7 years, and I am serving as an area chair for LoG 2022. In addition, I have organized a variety of workshops on topics including graph learning, graph neural networks, and deep learning for simulation. I'll also teach the course "Deep Learning for Graph-Structured Data" at Yale this coming academic year. More course materials will be available soon!
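As a small taste of what PyTorch Geometric makes easy, here is a minimal sketch of a two-layer graph convolutional network written with the library. The layer sizes and the toy graph are arbitrary choices for illustration, not code from PyG or Kumo.

```python
import torch
import torch.nn.functional as F
from torch_geometric.data import Data
from torch_geometric.nn import GCNConv

class TwoLayerGCN(torch.nn.Module):
    def __init__(self, in_dim, hidden_dim, num_classes):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hidden_dim)       # first message-passing layer
        self.conv2 = GCNConv(hidden_dim, num_classes)  # maps node embeddings to logits

    def forward(self, x, edge_index):
        x = F.relu(self.conv1(x, edge_index))
        return self.conv2(x, edge_index)

# Toy graph: 3 nodes, 2 undirected edges (stored in both directions), 4-dim features.
x = torch.randn(3, 4)
edge_index = torch.tensor([[0, 1, 1, 2],
                           [1, 0, 2, 1]])
data = Data(x=x, edge_index=edge_index)

model = TwoLayerGCN(in_dim=4, hidden_dim=8, num_classes=2)
logits = model(data.x, data.edge_index)
print(logits.shape)  # torch.Size([3, 2])
```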

My Past Workshops

  • New Frontiers in Graph Learning (GLFrontiers) at NeurIPS 2022
  • Deep Learning for Simulation (SimDL) at ICLR 2021
  • Stanford Graph Learning Workshop (SGL)
  • Graph Representation Learning and Beyond (GRL+) at ICML 2020
  • Co-organized the 2020 KDD Cup Competition on Graph AutoML.

Selected Publications

A few selected publications are listed for each research direction. See Google Scholar for a full list of publications.
Most of the algorithms developed are open-sourced as part of the PyTorch Geometric Library.

Deep Learning on Graphs

I focus on advancing graph neural network (GNN) architectures and improving the expressiveness, scalability, interpretability and robustness of GNNs.

ICML 2022

LA-GNN is a general pre-training framework that improves GNN performance through augmentation.

ICLR 2021

GNNs can learn to execute graph algorithms.

AAAI 2021

ID-GNN improves the expressiveness of GNNs by considering node identities.

NeurIPS 2017

GraphSAGE is a general GNN framework for large-scale graph learning.

ICML 2018

GraphRNN is one of the first graph generative models for learning the distribution of graphs.

NeurIPS 2019

GNNExplainer is the first framework to explain predictions made by GNNs!

Representation Learning

I innovate in representation learning techniques and embedding geometry for data with different characteristics (hierarchical, heterogeneous, etc.).

NeurIPS 2019

HGCN embeds nodes in a graph in hyperbolic space to capture hierarchical structure. It's one of the first hyperbolic GNNs.

NeurIPS 2021

ConE uses hyperbolic cones to model the heterogeneous hierarchies in knowledge graphs.

NeurIPS 2021

We embed biological sequences in Euclidean and hyperbolic spaces for solving challenging problems such as multiple sequence alignment.

IEEE Data Engineering Bulletin 2017

A survey on graph representation learning, including distributed embedding approaches and GNNs.

NeurIPS 2018

DiffPool learns hierarchical graph representations through differentiable graph pooling.

ICML 2019

P-GNN improves the expressiveness of node embeddings by incorporating position information.

Applications

Natural phenomena and the world's knowledge can often be expressed in the language of graphs. In addition to popular applications of graphs such as social networks, recommender systems, knowledge graphs, biological networks, and molecules, I'm also interested in novel ways of incorporating relational reasoning into other fields of science and technology, such as physical simulations, natural language, and industrial relational database predictions.

ICML 2020

We enable graph neural networks to learn to produce realistic simulations of different materials.

KDD 2022

We collaborate with Saudi Aramco to use machine learning for simulating oil and water flows, underground pressure and oil production.

NAACL 2021

Heterogeneous graph attention on dependency parsing trees can improve robustness and performance of aspect-level sentiment analysis in NLP.

KDD 2018

The first GNN-based recommender system applied to billion-scale industrial platforms. It is deployed at Pinterest and also serves as an embedding service for many downstream use cases.

NeurIPS 2018

We combine GNNs and reinforcement learning to generate realistic molecules with desired chemical and biological properties.

KDD 2021

A dynamic GNN for anomaly detection at Amazon.com, via an efficient message passing procedure and pre-training.

Acknowledgement

My work would not have been possible without the support from my family, friends, students, and awesome collaborators! Check out some of my collaborators: Jure Leskovec, Christopher Ré, Pietro Liò, Jiaxuan You, Marinka Zitnik, William Hamilton, Xiang Ren, Bowen Liu, Peter Battaglia, Petar Veličković, Ines Chami, Hanjun Dai, Matthias Fey, and many more...

Organizations

Aside from university collaborations, I have also collaborated with many industrial companies and non-profit organizations, including Pinterest, Facebook AI Research, Siemens, DeepMind, Amazon, SLAC National Accelerator Laboratory, Saudi Aramco, and more.

Contact

Location

I'm currently located in New Haven, CT 06520.

Email

You can reach me via email.
I will try my best to respond if my schedule permits, unless I'm overwhelmed by emails.