Chapter 1: Representation Learning

Liang Zhao, Emory University, liang.zhao@emory.edu
Lingfei Wu, JD.COM Silicon Valley Research Center, lwu@email.wm.edu
Peng Cui, Tsinghua University, cuip@tsinghua.edu.cn
Jian Pei, Simon Fraser University, jpei@cs.sfu.edu

Abstract

In this chapter, we first describe what representation learning is and why it is needed. Among the various ways of learning representations, this chapter focuses on deep learning methods: those formed by the composition of multiple non-linear transformations, with the goal of producing more abstract and ultimately more useful representations. We summarize representation learning techniques in different domains, focusing on the unique challenges and models for different data types, including images, natural language, speech signals, and networks. Finally, we summarize the chapter and provide further reading on mutual information-based representation learning, a recently emerging unsupervised representation learning technique.

Contents

  • Representation Learning: An Introduction
  • Representation Learning in Different Areas
    • Representation Learning for Image Processing
    • Representation Learning for Speech Recognition
    • Representation Learning for Natural Language Processing
    • Representation Learning for Networks
  • Summary

Citation

@incollection{GNNBook-ch1-zhao,
  author    = "Zhao, Liang and Wu, Lingfei and Cui, Peng and Pei, Jian",
  editor    = "Wu, Lingfei and Cui, Peng and Pei, Jian and Zhao, Liang",
  title     = "Representation Learning",
  booktitle = "Graph Neural Networks: Foundations, Frontiers, and Applications",
  year      = "2022",
  publisher = "Springer Singapore",
  address   = "Singapore",
  pages     = "3--15",
}

L. Zhao, L. Wu, P. Cui, and J. Pei, “Representation learning,” in Graph Neural Networks: Foundations, Frontiers, and Applications, L. Wu, P. Cui, J. Pei, and L. Zhao, Eds. Singapore: Springer Singapore, 2022, pp. 3–15.