Leveraging Natural Supervision for Language Representation Learning and Generation: Acknowledgements

:::info
Author:

(1) Mingda Chen.

:::

Table of Links

Abstract

Acknowledgements

1 INTRODUCTION

1.1 Overview

1.2 Contributions

2 BACKGROUND

2.1 Self-Supervised Language Pretraining

2.2 Naturally-Occurring Data Structures

2.3 Sentence Variational Autoencoder

2.4 Summary

3 IMPROVING SELF-SUPERVISION FOR LANGUAGE PRETRAINING

3.1 Improving Language Representation Learning via Sentence Ordering Prediction

3.2 Improving In-Context Few-Shot Learning via Self-Supervised Training

3.3 Summary

4 LEARNING SEMANTIC KNOWLEDGE FROM WIKIPEDIA

4.1 Learning Entity Representations from Hyperlinks

4.2 Learning Discourse-Aware Sentence Representations from Document Structures

4.3 Learning Concept Hierarchies from Document Categories

4.4 Summary

5 DISENTANGLING LATENT REPRESENTATIONS FOR INTERPRETABILITY AND CONTROLLABILITY

5.1 Disentangling Semantics and Syntax in Sentence Representations

5.2 Controllable Paraphrase Generation with a Syntactic Exemplar

5.3 Summary

6 TAILORING TEXTUAL RESOURCES FOR EVALUATION TASKS

6.1 Long-Form Data-to-Text Generation

6.2 Long-Form Text Summarization

6.3 Story Generation with Constraints

6.4 Summary

7 CONCLUSION

APPENDIX A – APPENDIX TO CHAPTER 3

APPENDIX B – APPENDIX TO CHAPTER 6

BIBLIOGRAPHY

ACKNOWLEDGEMENTS

Like all great travellers, I have seen more than I remember, and remember more than I have seen.


– Benjamin Disraeli


The Ph.D. journey is an adventure mixed with daunting challenges, unanticipated bafflements, and moments of delight. Many people have guided me through the challenges, clarified my confusion, and shared in my happiness. I am enormously grateful for their help along the journey.


First, I want to thank my advisor Kevin Gimpel for his technical insight and research philosophy throughout these years. He was always knowledgeable about everything we worked on, meticulous about every word we wrote in papers, and patient with my mistakes. The work in this thesis would not have been possible without his positive, steady influence.


I thank the rest of my thesis committee: Karen Livescu, Sam Wiseman, and Luke Zettlemoyer, for being generous with their time and insight. I also thank Karl Stratos for his guidance.


I thank my fellow students at the Toyota Technological Institute at Chicago and the University of Chicago, especially Qingming Tang for the technical (and nontechnical) conversations, Bumeng Zhuo for the fun activities during weekends, and Zewei Chu for the bike rides along the Chicago lakefront. I am also grateful to my fellow interns and mentors at Google and Facebook.


Lastly, I would like to thank my family for inspiring my interest in learning, encouraging me to apply to graduate school, and being supportive and interested in listening to my research ramblings.

:::info
This paper is available on arxiv under CC 4.0 license.

:::
