Hanzhi Wang   /   王涵之

I am a postdoctoral researcher at BARC (Basic Algorithms Research Copenhagen), University of Copenhagen, working with Prof. Mikkel Thorup. I obtained my Ph.D. in Computer Science and Technology from the School of Information at Renmin University of China in May 2024, under the supervision of Prof. Zhewei Wei. I received my B.E. degree in Computer Science and Technology from the School of Information at Renmin University of China in June 2019.

hzwang.helen@gmail.com  /  GitHub  /  Google Scholar

Research Interests

My research interests lie in graph analysis algorithms, with a particular focus on the design of scalable algorithms for large graphs. My previous work primarily centers on sublinear-time algorithms for random-walk probability estimation. I am also interested in local clustering and graph neural networks.

Publications

Note: Papers marked with ** list authors in alphabetical order, following the convention of theoretical computer science.

Mixing Time Matters: Accelerating Effective Resistance Estimation via Bidirectional Method.
Guanyu Cui, Hanzhi Wang, Zhewei Wei
KDD, 2025

This paper improves the dependence on L_max in the time complexity of estimating effective resistance (ER) on undirected graphs. Here, L_max is a graph parameter influenced by the mixing time of random walks on the graph. We emphasize that L_max can be very large in real-world networks, whereas previous work often assumes it is small, which can lead to significant approximation errors. We hope this paper draws attention to the critical role of L_max in ER approximation.
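For context, the quantity being estimated has a simple exact characterization via the Laplacian pseudoinverse: R(s, t) = (e_s − e_t)^T L^+ (e_s − e_t). The sketch below computes this directly with NumPy; it only pins down what the estimators approximate, takes cubic time, and is not the bidirectional method of the paper. The tiny triangle graph is a made-up example.

```python
import numpy as np

def effective_resistance(adj, s, t):
    """Exact effective resistance R(s, t) via the Laplacian pseudoinverse.

    adj: symmetric (weighted) adjacency matrix of a connected undirected graph.
    This brute-force O(n^3) computation is for illustration only; scalable
    estimators avoid materializing L^+ altogether.
    """
    adj = np.asarray(adj, dtype=float)
    laplacian = np.diag(adj.sum(axis=1)) - adj
    lap_pinv = np.linalg.pinv(laplacian)
    # R(s, t) = (e_s - e_t)^T L^+ (e_s - e_t)
    return lap_pinv[s, s] + lap_pinv[t, t] - 2.0 * lap_pinv[s, t]

# Toy example: on a triangle with unit weights, R(0, 1) = 2/3.
triangle = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]
print(effective_resistance(triangle, 0, 1))  # ~0.6667
```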

**Optimal Dynamic Parameterized Subset Sampling.
Junhao Gan, Seeun William Umboh, Hanzhi Wang, Anthony Wirth, Zhuo Zhang
PODS, 2025
Technical Report

We study the fundamental subset sampling problem in a general dynamic setting and achieve optimal complexity results for both query and update time simultaneously.

Revisiting Local PageRank Estimation on Undirected Graphs: Simple and Optimal.
Hanzhi Wang
KDD, 2024
Technical Report / slides

We establish matching upper and lower bounds for the worst-case computational complexity of single-node PageRank computation on undirected graphs, using surprisingly simple algorithms and analysis.

Efficient Algorithms for Personalized PageRank Computation: A Survey.
Mingji Yang, Hanzhi Wang, Zhewei Wei, Sibo Wang, Ji-Rong Wen
TKDE, 2024
Technical Report

We recap several frequently used techniques for Personalized PageRank (PPR) computation and conduct a comprehensive survey of various recent PPR algorithms from an algorithmic perspective.

**Revisiting Local Computation of PageRank: Simple and Optimal.
Hanzhi Wang, Zhewei Wei, Ji-Rong Wen, Mingji Yang
STOC, 2024
Technical Report / slides

We establish matching upper and lower bounds for the worst-case computational complexity of single-node PageRank computation on directed graphs. Additionally, for the celebrated ApproxContributions algorithm (also known as Backward Push or Local Push, proposed by Andersen, Borgs, Chayes, Hopcroft, Mirrokni, and Teng in WAW 2007), we upper bound its worst-case computational complexity on directed graphs and prove its optimality by establishing a matching lower bound.
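For readers less familiar with the primitive analyzed above, the following is a minimal sketch of the classic ApproxContributions (Backward Push) procedure, written from its standard description rather than from this paper; the graph representation, the parameter defaults, and the residue threshold r_max are illustrative choices.

```python
from collections import defaultdict

def approx_contributions(in_neighbors, out_degree, target, alpha=0.2, r_max=1e-4):
    """Classic Backward Push (ApproxContributions) for a single target node.

    in_neighbors[v]: iterable of nodes u with a directed edge u -> v.
    out_degree[u]:   out-degree of node u.
    Returns reserve[u], an estimate of the PPR contribution pi_u(target)
    with additive error at most r_max (the standard invariant of the method).
    """
    reserve = defaultdict(float)
    residue = defaultdict(float)
    residue[target] = 1.0
    queue = [target]
    while queue:
        v = queue.pop()
        if residue[v] <= r_max:
            continue  # already pushed since it was enqueued
        rv, residue[v] = residue[v], 0.0
        reserve[v] += alpha * rv           # settle an alpha-fraction at v
        for u in in_neighbors.get(v, []):  # propagate the rest backwards
            residue[u] += (1.0 - alpha) * rv / out_degree[u]
            if residue[u] > r_max:
                queue.append(u)
    return reserve
```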

Estimating Single-Node PageRank in O(min(d_t, √m)) Time.
Hanzhi Wang, Zhewei Wei
VLDB, 2023
Technical Report / code / slides / poster

For the problem of estimating a single node's PageRank score in an undirected graph, we tighten the upper bound on the query time complexity from O(√(n·d_t)) to O(min(d_t, √m)) (omitting logarithmic factors). Here, n and m denote the numbers of nodes and edges in the given graph, respectively, and d_t denotes the degree of the given target node t.

Optimal Dynamic Subset Sampling: Theory and Applications.
Lu Yi, Hanzhi Wang, Zhewei Wei
KDD, 2023
Technical Report / slides / poster

We study the fundamental problem of sampling independent events (known as subset sampling) in a dynamic setting. We present ODSS, the first algorithm to achieve optimal time complexity per sample and per update simultaneously.
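To make the problem concrete, here is the naive subset sampler that flips one coin per element; it spends Θ(n) time on every draw no matter how few elements come out, which is the cost ODSS avoids. This is a sketch of the problem statement, not of the ODSS data structure, and the names are illustrative.

```python
import random

def naive_subset_sample(probabilities):
    """Subset sampling: include element i independently with probability p_i.

    This baseline takes Theta(n) time per draw regardless of the expected
    output size; ODSS supports draws and probability updates in optimal time
    (see the paper), which this sketch makes no attempt to match.
    """
    return [i for i, p in enumerate(probabilities) if random.random() < p]

# Example: a few likely elements among many unlikely ones.
probs = [0.9, 0.8, 0.7] + [0.001] * 1000
print(naive_subset_sample(probs))  # usually close to [0, 1, 2]
```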

Edge-based Local Push for Personalized PageRank.
Hanzhi Wang, Zhewei Wei, Junhao Gan, Ye Yuan, Xiaoyong Du, Ji-Rong Wen
VLDB, 2022
Technical Report / code / slides

LocalPush, the state-of-the-art algorithm for Personalized PageRank computation, can be rather inefficient on weighted graphs. In this paper, we propose the Edge-based Push method (EdgePush), which decomposes the push operations of LocalPush into separate edge-based push operations and achieves superior query efficiency over LocalPush on weighted graphs.
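As background, the sketch below is a vanilla LocalPush (Forward Push) routine for single-source PPR on a weighted graph, with residue spread in proportion to edge weights; it is written from the textbook description of the baseline whose node-level pushes EdgePush splits into per-edge operations, and the stopping threshold eps is an illustrative choice.

```python
from collections import defaultdict

def forward_push_weighted(neighbors, source, alpha=0.2, eps=1e-4):
    """Vanilla Forward Push (LocalPush) for single-source PPR on a weighted graph.

    neighbors[u]: list of (v, w) pairs with edge weight w > 0; every node of
    the graph must appear as a key. A push at u settles an alpha-fraction of
    its residue and spreads the rest to neighbors proportionally to weights.
    """
    weighted_degree = {u: sum(w for _, w in nbrs) for u, nbrs in neighbors.items()}
    reserve = defaultdict(float)
    residue = defaultdict(float)
    residue[source] = 1.0
    queue = [source]
    while queue:
        u = queue.pop()
        if residue[u] <= eps * weighted_degree[u]:
            continue  # residue too small to be worth pushing
        ru, residue[u] = residue[u], 0.0
        reserve[u] += alpha * ru
        for v, w in neighbors[u]:
            residue[v] += (1.0 - alpha) * ru * w / weighted_degree[u]
            if residue[v] > eps * weighted_degree[v]:
                queue.append(v)
    return reserve
```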

Instant Graph Neural Networks for Dynamic Graphs.
Yanping Zheng, Hanzhi Wang, Zhewei Wei, Jiajun Liu, Sibo Wang
KDD, 2022
Technical Report / code

We propose Instant Graph Neural Network (InstantGNN), an incremental computation approach to support instant updates of graph representation results on dynamic graphs.

Approximate Graph Propagation.
Hanzhi Wang, Mingguo He, Zhewei Wei, Sibo Wang, Ye Yuan, Xiaoyong Du, Ji-Rong Wen
KDD, 2021
Technical Report / code / slides / poster

We propose Approximate Graph Propagation (AGP), a unified randomized algorithm that computes various proximity queries and GNN feature propagation in almost optimal time complexity.

ExactSim: Benchmarking Single-Source SimRank Algorithms with High-Precision Ground Truths.
Hanzhi Wang, Zhewei Wei, Yu Liu, Ye Yuan, Xiaoyong Du, Ji-Rong Wen
VLDB Journal, 2021

This work is an extended version of the conference paper "Exact Single-Source SimRank Computation on Large Graphs" accepted by SIGMOD 2020. With the ground truths computed by ExactSim, we conduct the first experimental study of the accuracy/cost trade-offs of existing approximate SimRank algorithms on large graphs.

Personalized PageRank to a Target Node, Revisited.
Hanzhi Wang, Zhewei Wei, Junhao Gan, Sibo Wang, Zengfeng Huang
KDD, 2020
Technical Report / code / slides

We propose Randomized Backward Search (RBS), a novel algorithm that answers approximate single-target Personalized PageRank queries (also known as PageRank contribution queries) with near-optimal time complexity.

Exact Single-Source SimRank Computation on Large Graphs.
Hanzhi Wang, Zhewei Wei, Ye Yuan, Xiaoyong Du, Ji-Rong Wen
SIGMOD, 2020
Technical Report / code / slides

We propose ExactSim, the first algorithm that enables probabilistic exact single-source SimRank queries on large graphs. ExactSim can provide ground truths with precision of up to 7 decimal places for single-source SimRank queries on large graphs within reasonable query time.
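For completeness, SimRank (Jeh and Widom, 2002) is the fixed point of s(u, u) = 1 and, for u ≠ v, s(u, v) = C / (|I(u)|·|I(v)|) · Σ_{a in I(u), b in I(v)} s(a, b), where I(·) denotes in-neighbors, C is a decay factor, and s(u, v) = 0 if either in-neighbor set is empty. The sketch below iterates this definition directly on a small graph; it is a textbook all-pairs baseline to ground the quantity, not ExactSim, and the parameter values are illustrative.

```python
import numpy as np

def simrank_all_pairs(in_neighbors, decay=0.6, iterations=20):
    """All-pairs SimRank by naive fixed-point iteration of the definition.

    in_neighbors: list of length n; in_neighbors[u] lists the nodes with an
    edge into u. This O(iterations * n^2 * d^2) baseline does not scale; it
    only pins down the scores that single-source algorithms compute per query.
    """
    n = len(in_neighbors)
    sim = np.eye(n)
    for _ in range(iterations):
        new_sim = np.eye(n)  # s(u, u) = 1 stays fixed
        for u in range(n):
            for v in range(n):
                if u == v or not in_neighbors[u] or not in_neighbors[v]:
                    continue
                total = sum(sim[a, b] for a in in_neighbors[u] for b in in_neighbors[v])
                new_sim[u, v] = decay * total / (len(in_neighbors[u]) * len(in_neighbors[v]))
        sim = new_sim
    return sim

# Toy example: nodes 1 and 2 share the single in-neighbor 0.
print(simrank_all_pairs([[], [0], [0]])[1, 2])  # 0.6 (the decay factor)
```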

Honors and Awards
  • China Computer Federation (CCF) Outstanding Doctoral Dissertation Award, 2024.
  • Wu Yuzhang Scholarship, 2024.
  • Certified Excellence in Reviewing for KDD 2023.
  • Microsoft Research Asia Fellowship, 2022.
  • Baidu Scholarship, 2021.

Academic Services

  PC Member / Reviewer:

    • IEEE Transactions on Knowledge and Data Engineering (TKDE), 2025
    • IEEE International Conference on Data Engineering (ICDE), 2025
    • International Conference on Knowledge Discovery and Data Mining (KDD), 2023/2024/2025
    • International Conference on Learning Representations (ICLR), 2024/2025
    • ACM International Conference on Web Search and Data Mining (WSDM), 2023/2024/2025
    • ACM International Conference on Information and Knowledge Management (CIKM), 2024
    • Annual Conference on Neural Information Processing Systems (NeurIPS), 2023/2025
    • SIAM International Conference on Data Mining (SDM), 2024
    • Australasian Database Conference (ADC), 2023
    • IEEE Communications Magazine
    • Information Retrieval Journal
    • International Conference on Machine Learning (ICML), 2022

Invited Talks

  Approximate Graph Propagation. [slide]
  AMiner
  Online, Oct. 2021

  Efficient Computation of Random Walk and the Applications in Graph Representation Learning. [slide]
  DataFun
  Online, Aug. 2021

  A Near Optimal Algorithm for Single-Target Personalized PageRank. [slide]
  The 9th National Social Media Processing Conference (SMP 2020)
  Online, Sept. 2020


Last update: Dec. 2024