
Graph Embedding: From Learning to Hashing

Published by: Computer and Information | Posted: 2019-05-14

Talk Title: Graph Embedding: From Learning to Hashing

Speaker: Dr. Ling Chen

Affiliation: University of Technology Sydney

Time: 10:00-11:00 a.m., Friday, May 17, 2019

Venue: Conference Room A909, Science and Education Building, Feicuihu Campus


Abstract: The surge of real-world graph data, such as chemical compounds, networks and social communities, has led to the rise of graph mining research. However, due to the arbitrary structure of graph data, computing graph similarities is not easy. The essential challenge of graph mining is to represent graph data in a vector space that facilitates downstream data mining and machine learning tasks. Graph representation learning has therefore received a great deal of attention recently. While most existing graph representation research focuses on learning vector representations for graph data, this talk introduces our recent works that use randomised hashing to map graphs to vectors of a fixed number of dimensions. We have developed efficient hashing algorithms for both node embedding and graph embedding. Our experimental results demonstrate that our hashing algorithms serve as the most efficient graph feature extraction method, and the generated hashing codes lead to no observable performance loss in applications such as graph classification.
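To give a flavour of the idea, the following is a minimal illustrative sketch (not the speaker's actual algorithm) of how randomised hashing can map a variable-sized graph neighborhood to a fixed-length vector: a MinHash signature over a node's 1-hop neighborhood labels. The graph representation, function names, and signature length here are all assumptions chosen for the example.

```python
import hashlib

def stable_hash(value, seed):
    """Deterministic integer hash of a string under a given seed
    (stands in for one of the random hash permutations in MinHash)."""
    digest = hashlib.md5(f"{seed}:{value}".encode()).hexdigest()
    return int(digest, 16)

def node_minhash(graph, node, num_perm=8):
    """MinHash signature of a node's 1-hop neighborhood labels.

    `graph` is a dict: node -> (label, list of neighbor nodes).
    Whatever the neighborhood size, the output has `num_perm` entries,
    so every node is embedded into a vector of fixed dimension.
    """
    label, neighbors = graph[node]
    items = [label] + [graph[n][0] for n in neighbors]
    return [min(stable_hash(x, seed) for x in items)
            for seed in range(num_perm)]

# Toy labeled graph: nodes "a" and "d" have overlapping neighborhood labels,
# so their signatures agree on a fraction of positions that estimates
# the Jaccard similarity of those label sets.
graph = {
    "a": ("C", ["b", "c"]),
    "b": ("O", ["a"]),
    "c": ("H", ["a"]),
    "d": ("C", ["e"]),
    "e": ("O", ["d"]),
}
sig_a = node_minhash(graph, "a")
sig_d = node_minhash(graph, "d")
similarity = sum(x == y for x, y in zip(sig_a, sig_d)) / len(sig_a)
```

Because each signature position keeps only a minimum over hashed items, the whole pass is a single scan over the neighborhood, which is what makes hashing-based embedding attractive as a fast feature-extraction step compared with trained embedding models.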


Speaker Bio: Dr. Ling Chen is an Associate Professor at the University of Technology Sydney (UTS). She received her Ph.D. in Computer Engineering from Nanyang Technological University, Singapore. Dr. Chen's main research interests are data mining and machine learning. She has worked on fundamental data mining tasks such as (regular and irregular) pattern mining from structured and uncertain data, and hashing and embedding learning for structured data. Her recent work also includes knowledge discovery from social networks and social media, where she has developed novel and effective algorithms for event detection from social media and recommendation in social networks. Dr. Chen frequently publishes in major international data mining conferences, including SIGKDD, ICDM and SDM, and in journals such as IEEE TNNLS and TKDE.