
News: I will be joining Yale University as an assistant professor in late 2022. I welcome recommendations of potential PhD students and postdocs with a background in graph learning and a strong research record.
I'm currently a fifth-year computer science PhD student at Stanford University, advised by Jure Leskovec. My research focuses on developing machine learning algorithms for graph-structured data. I'm interested in advancing graph neural network (GNN) architectures that improve expressiveness, scalability and interpretability, as well as in the many applications of graph learning, including physical simulation, chemistry and biology prediction, knowledge graphs, natural language, recommendation and social networks. I graduated from Duke University in 2016 with highest distinction, majoring in computer science and mathematics, where I did research in geometric algorithms, computational topology, numerical analysis and signal processing. Email: rexying AT stanford.edu
News
 May 2021: I organized the Simulation for Deep Learning (SimDL) Workshop at ICLR 2021. The workshop website is https://simdl.github.io/, and the official virtual workshop site is https://iclr.cc/virtual/2021/workshop/2141
 May 2021: I'm excited to join Yale University as an assistant professor next year (late 2022)!
 May 2018: Gave a poster presentation at SDSI workshop.
 May 2018: GraphRNN code is made public at https://github.com/JiaxuanYou/graph-generation . Feel free to open issues if you have any questions :)
 Looking forward to an internship at Facebook AI Research, NYC in summer 2018!
 April 2018: Gave a tutorial "Representation Learning on Networks" at WWW 2018 with Jure Leskovec, William L. Hamilton and Rok Sosic.
Website: http://snap.stanford.edu/proj/embeddings-www/
 Dec 2017: One main-track poster presentation at NIPS 2017 and an invited talk on GraphSAGE at the NIPS MLTrain Workshop.
PhD Research
Graph Representation Learning
When applying machine learning techniques to graph-structured data, an important step is to map nodes in the graph to dense vector representations (embeddings) that describe the neighborhood structure of the nodes, so that machine learning algorithms can take these node embeddings as input features. This is analogous to word embeddings in NLP, where words are mapped to embeddings and used for various downstream tasks.
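As a rough sketch of what "node embeddings" means in practice, the toy mean-aggregation layer below computes one by repeatedly averaging each node's neighbors' features and applying a learned transformation. This is only in the spirit of GNNs, not the exact GraphSAGE update; all names and values here are illustrative.

```python
import numpy as np

def embed_nodes(adj, feats, weight, num_layers=2):
    """Toy neighborhood aggregation: at each layer, every node averages
    its neighbors' current embeddings, applies a shared linear map and a
    ReLU. A sketch of the idea, not the exact GraphSAGE update."""
    # Row-normalize the adjacency so each node averages over its neighbors.
    deg = adj.sum(axis=1, keepdims=True)
    norm_adj = adj / np.maximum(deg, 1)
    h = feats
    for _ in range(num_layers):
        h = np.maximum(norm_adj @ h @ weight, 0)  # aggregate, transform, ReLU
    return h

# Toy graph: a path 0-1-2 with 2-dimensional input features.
adj = np.array([[0, 1, 0],
                [1, 0, 1],
                [0, 1, 0]], dtype=float)
feats = np.eye(3, 2)   # one-hot-ish input features
weight = np.eye(2)     # identity "learned" transform, for illustration
emb = embed_nodes(adj, feats, weight)
print(emb.shape)  # (3, 2): one embedding vector per node
```

The resulting rows can then be fed to any downstream classifier as ordinary feature vectors, exactly as word embeddings are in NLP.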
My work primarily focuses on graph neural network (GNN) architectures for learning graph and node embeddings, in the following aspects:
 Learn embeddings in the inductive setting, scaling to extremely large graphs with billions of nodes (GraphSAGE, PinSage).
 Effectively leverage hierarchical structure in graphs for learning both graph-level (DiffPool) and node-level (HGCN) representations.
 Build generative models (GraphRNN, GCPN) for graphs, which allow us to model distributions of graphs or generate graphs with desired properties.
Applications include machine learning for social networks, recommender systems, biological networks, knowledge graphs, physical systems and molecular chemistry.
Research summary
Embeddings and Hierarchies for Graphs
Knowledge Graphs and Natural Languages
Simulation
Scalable Graph Representation Learning (Graph Neural Networks)
Graph Generation
 Direct Multi-hop Attention based Graph Neural Network, IJCAI 2021, Guangtao Wang, Rex Ying, Jing Huang, Jure Leskovec
 Identity-aware Graph Neural Networks, AAAI 2021, Jiaxuan You, Jonathan Gomes-Selman, Rex Ying, Jure Leskovec
 Frequent Motif Mining by Walking in Order Embedding Space
 Neural Subgraph Matching, Rex Ying, Andrew Wang, Jiaxuan You, Chengtao Wen, Arquimedes Canedo, Jure Leskovec
 Hyperbolic Graph Convolutional Networks, NeurIPS 2019, Rex Ying*, Ines Chami*, Christopher Ré, Jure Leskovec
 Hierarchical Graph Representation Learning with Differentiable Pooling, NeurIPS 2018, Rex Ying, Jiaxuan You, Christopher Morris, William L. Hamilton, Jure Leskovec
Knowledge Graphs and Natural Languages
 Modeling Heterogeneous Hierarchies with Relation-specific Hyperbolic Cones, in submission, Yushi Bai*, Rex Ying*, Hongyu Ren, Jure Leskovec
 Graph Ensemble Learning over Multiple Dependency Trees for Aspect-level Sentiment, NAACL 2021, Xiaochen Hou, Peng Qi, Guangtao Wang, Rex Ying, Jing Huang, Xiaodong He, Bowen Zhou
Simulation
 Learning to Simulate Complex Physics with Graph Networks, ICML 2020, Alvaro Sanchez-Gonzalez*, Jonathan Godwin*, Tobias Pfaff*, Rex Ying*, Jure Leskovec, Peter W. Battaglia
 Neural Execution of Graph Algorithms, ICLR 2020, Petar Veličković, Rex Ying, Matilde Padovano, Raia Hadsell, Charles Blundell
Scalable Graph Representation Learning (Graph Neural Networks)
 Bipartite Dynamic Representations for Abuse Detection, KDD 2021, Andrew Wang, Rex Ying, Pan Li, Nikhil Rao, Karthik Subbian, Jure Leskovec
 Redundancy-Free Computation Graphs for Graph Neural Networks, KDD 2020, Zhihao Jia, Sina Lin, Rex Ying, Jiaxuan You, Jure Leskovec, Alex Aiken
 GNNExplainer: Generating Explanations for Graph Neural Networks, NeurIPS 2019, Rex Ying, Dylan Bourgeois, Jiaxuan You, Marinka Zitnik, Jure Leskovec
 Position-aware Graph Neural Networks, ICML 2019, Jiaxuan You, Rex Ying, Jure Leskovec
 Graph Convolutional Networks for Web-Scale Recommender Systems, KDD 2018, Rex Ying, Ruining He, Kaifeng Chen, Pong Eksombatchai, William L. Hamilton, Jure Leskovec
 Inductive Representation Learning on Large Graphs, NeurIPS 2017, William L. Hamilton*, Rex Ying*, Jure Leskovec
 Representation Learning on Graphs: Methods and Applications, IEEE Data Engineering Bulletin. 2017, William L. Hamilton, Rex Ying, and Jure Leskovec.
Graph Generation
 Graph Convolutional Policy Network for Goal-Directed Molecular Graph Generation, NeurIPS 2018, Jiaxuan You*, Bowen Liu*, Rex Ying, Vijay S. Pande, Jure Leskovec
 GraphRNN: Deep Generative Models of Graphs, ICML 2018, Jiaxuan You*, Rex Ying*, Xiang Ren, William L. Hamilton, Jure Leskovec
Particle-level simulation in time and space is important in many scientific domains. These projects investigate simulating physical processes, chemical reactions and computer science algorithms using graph neural networks.
Rotations
 In spring 2016, I rotated with Professor Greg Valiant, working on efficient approximate maximum inner product search.
 In fall 2016, I rotated with Professor Leo Guibas, working on inferring driving state (velocity, acceleration, steering angle, etc.) from cars' front-camera videos.
Undergraduate Research
My undergraduate research topics include geometric algorithms, numerical analysis and signal processing. I also did several projects in image processing and computational topology during my senior year. During my summer internships at Google Ads, I worked on using machine learning to detect API queries from malicious scripts and to debias user ratings.
Here is some of my research at Duke.
A Simple Efficient Approximation Algorithm for Dynamic Time Warping, ACM SIGSPATIAL 2016
Rex Ying, Jiangwei Pan, Kyle Fox, Pankaj K. Agarwal
Dynamic time warping (DTW) is a widely used curve similarity measure. We present a simple and efficient (1 + eps)-approximation algorithm for DTW between a pair of point sequences. Although an algorithm with similar asymptotic time complexity was presented in our previous paper, our algorithm is considerably simpler and more efficient in practice. Our experiments on both synthetic and real-world data sets (GeoLife trajectories) show that it is an order of magnitude faster than the standard exact DP algorithm on point sequences of length 5,000 or more, while keeping the approximation error within 5-10%. For inputs of size 50,000, our algorithm is more than 200 times faster.
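For reference, the standard exact dynamic program that the approximation is compared against can be sketched as follows. This is a textbook quadratic-time implementation, not the paper's approximation algorithm.

```python
import numpy as np

def dtw(a, b):
    """Standard exact O(len(a) * len(b)) dynamic program for dynamic time
    warping between two point sequences. The cost of matching two points
    is their Euclidean distance; DTW minimizes the summed matched-pair
    cost over monotone alignments of the two sequences."""
    n, m = len(a), len(b)
    dp = np.full((n + 1, m + 1), np.inf)
    dp[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(np.asarray(a[i - 1]) - np.asarray(b[j - 1]))
            # Extend the best of: advance both sequences, or only one.
            dp[i, j] = cost + min(dp[i - 1, j], dp[i, j - 1], dp[i - 1, j - 1])
    return dp[n, m]

# Two short 1-D point sequences.
print(dtw([(0,), (1,), (2,)], [(0,), (1,), (2,)]))  # 0.0 (identical)
print(dtw([(0,), (1,), (2,)], [(0,), (2,), (2,)]))  # 1.0
```

The nested loops make the quadratic cost explicit, which is why an order-of-magnitude speedup matters at sequence lengths of 5,000 and beyond.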

EMG, Journal of Biomechanics 2016, Rex Ying, Christine E. Wall
We present a new statistical method and MATLAB script (EMG-Extractor) that includes an adaptive algorithm to discriminate noise regions from electromyography (EMG) signals, improving labeling accuracy.
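To give a flavor of the task, the sketch below flags low-amplitude windows of a signal as likely noise using a fixed RMS threshold. This is purely illustrative: the EMG-Extractor itself uses an adaptive statistical criterion, and the function and parameter names here are made up.

```python
import numpy as np

def flag_noise_regions(signal, window=50, k=1.5):
    """Hypothetical illustration of noise-region discrimination in an
    EMG-like trace: split the trace into fixed windows and mark windows
    whose RMS amplitude is below k times a crude baseline (the median
    window RMS) as noise. A fixed-threshold sketch of the problem, not
    the adaptive algorithm from the paper."""
    n = len(signal) // window
    rms = np.array([np.sqrt(np.mean(signal[i * window:(i + 1) * window] ** 2))
                    for i in range(n)])
    baseline = np.median(rms)   # crude baseline amplitude estimate
    return rms < k * baseline   # True = likely noise window

# Synthetic trace: quiet baseline noise with a burst of activity in the middle.
rng = np.random.default_rng(0)
trace = rng.normal(0, 0.05, 500)
trace[200:300] += rng.normal(0, 1.0, 100)
flags = flag_noise_regions(trace)
print(flags)  # True for quiet windows, False for the burst
```

Real EMG segmentation must adapt the threshold to drifting baselines and varying contraction strength, which is the gap the adaptive method addresses.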

Locally Adaptive Spline Interpolation, Rex Ying, Xiaobai Sun, Alexandros-Stavros Iliopoulos
In this paper, we give a framework and an efficient algorithm for constructing a high-order, non-standard (inhomogeneous) spline from low-order splines; an inhomogeneous spline is one whose piecewise functions do not all have the same order.
Using this technique, we improve on the traditional spline technique: the result adapts more efficiently to local updates, is resilient to outliers, is naturally parallel and has less error propagation, while still preserving the smoothness conditions of the spline.
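As a point of comparison, a classical local cubic interpolant (Catmull-Rom) already shows why locality helps with updates: each segment depends on only four nearby control points, so editing one point re-shapes only neighboring segments. This is a standard baseline sketch, not the inhomogeneous construction from the paper.

```python
import numpy as np

def catmull_rom(points, t):
    """Evaluate a Catmull-Rom spline, a C^1 local cubic interpolant.
    Each segment is determined by only four neighboring control points,
    so a change to one control point affects only nearby segments.
    `t` is a global parameter in [0, len(points) - 3]; the curve
    interpolates points[1] .. points[-2]."""
    i = min(int(t), len(points) - 4)   # segment index
    u = t - i                          # local parameter in [0, 1]
    p0, p1, p2, p3 = (np.asarray(points[i + k], dtype=float) for k in range(4))
    return 0.5 * (2 * p1
                  + (-p0 + p2) * u
                  + (2 * p0 - 5 * p1 + 4 * p2 - p3) * u ** 2
                  + (-p0 + 3 * p1 - 3 * p2 + p3) * u ** 3)

pts = [0.0, 1.0, 4.0, 9.0, 16.0]  # samples of x^2 at x = 0..4
print(catmull_rom(pts, 0.0))  # 1.0  (interpolates the second point)
print(catmull_rom(pts, 1.0))  # 4.0  (and the third)
```

A global spline, by contrast, couples all pieces through its smoothness constraints, so a single edited point can propagate changes (and errors) across the whole curve.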