Qintong Li, Piji Li, Wei Bi, Zhaochun Ren, Yuxuan Lai, and Lingpeng Kong.
Event Transition Planning for Open-Ended Text Generation. ACL 2022 (Findings).
[pdf] [code] [bib]
Changying Hao, Liang Pang, Yanyan Lan, Yan Wang, Jiafeng Guo, and Xueqi Cheng.
Sketch and Customize: A Counterfactual Story Generator. AAAI 2021.
[pdf] [code] [bib]
Wei Bi, Huayang Li, and Jiacheng Huang.
Data Augmentation for Text Generation Without Any Augmented Data. ACL 2021.
[pdf] [code] [bib]
Wei Wang, Piji Li, and Haitao Zheng.
Generating Diversified Comments via Reader-Aware Topic Modeling and Saliency Detection. AAAI 2021.
[pdf] [code] [bib]
Yixuan Su, Deng Cai, Yan Wang, David Vandyke, Simon Baker, Piji Li, and Nigel Collier.
Non-Autoregressive Text Generation with Pre-trained Language Models. EACL 2021.
[pdf] [code] [bib]
Wei Wang, Piji Li, and Hai-Tao Zheng.
Consistency and Coherency Enhanced Story Generation. ECIR 2021.
[pdf] [code] [bib]
Qile Zhu, Wei Bi, Xiaojiang Liu, Xiyao Ma, Xiaolin Li, and Dapeng Wu.
A Batch Normalized Inference Network Keeps the KL Vanishing Away. ACL 2020.
Piji Li, Haisong Zhang, Xiaojiang Liu, and Shuming Shi.
Rigid Formats Controlled Text Generation. ACL 2020.
[pdf] [code] [bib]
Linfeng Song, Ante Wang, Jinsong Su, Yue Zhang, Kun Xu, Yubin Ge, and Dong Yu.
Structural Information Preserving for Graph-to-Text Generation. ACL 2020.
[pdf] [code] [bib]
Jie Lei, Liwei Wang, Yelong Shen, Dong Yu, Tamara Berg, and Mohit Bansal.
MART: Memory-Augmented Recurrent Transformer for Coherent Video Paragraph Captioning. ACL 2020.
[pdf] [code] [bib]
Zhenyi Wang, Xiaoyang Wang, Bang An, Dong Yu, and Changyou Chen.
Towards Faithful Neural Table-to-Text Generation with Content-Matching Constraints. ACL 2020.
[pdf] [code] [bib]
Wenhu Chen, Jianshu Chen, Yu Su, Zhiyu Chen, and William Yang Wang.
Logical Natural Language Generation from Open-Domain Tables. ACL 2020.
[pdf] [code] [bib]
Wang Chen, Hou Pong Chan, Piji Li, and Irwin King.
Exclusive Hierarchical Decoding for Deep Keyphrase Generation. ACL 2020.
[pdf] [code] [bib]
Ruize Wang, Zhongyu Wei, Piji Li, Qi Zhang, and Xuanjing Huang.
Storytelling from an Image Stream using Scene Graphs. AAAI 2020.
[pdf] [code] [bib]
Xiaocheng Feng, Yawei Sun, Bing Qin, Heng Gong, Yibo Sun, Wei Bi, Xiaojiang Liu, and Ting Liu.
Learning to Select Bi-Aspect Information for Document-Scale Text Content Manipulation. AAAI 2020.
[pdf] [code] [bib]
Shen Gao, Xiuying Chen, Piji Li, Zhangming Chan, Dongyan Zhao, and Rui Yan.
How to Write Summaries with Patterns? Learning towards Abstractive Summarization through Prototype Editing. EMNLP 2019.
Jiaao Chen, Jianshu Chen, and Zhou Yu.
Incorporating Commonsense Knowledge for Story Completion. AAAI 2019.
Shen Gao, Xiuying Chen, Piji Li, Lidong Bing, Zhaochun Ren, Dongyan Zhao, and Rui Yan.
Abstractive Text Summarization by Incorporating Reader Comments. AAAI 2019.
Mingyue Shang, Piji Li, Zhenxin Fu, Lidong Bing, Dongyan Zhao, Shuming Shi, and Rui Yan.
Semi-supervised Text Style Transfer: Cross Projection in Latent Space. EMNLP 2019.
Juntao Li, Yan Song, Haisong Zhang, Dongmin Chen, Shuming Shi, and Rui Yan.
Generating Classical Chinese Poems via Conditional Variational Autoencoder and Adversarial Training. EMNLP 2018.
Xiuying Chen, Shen Gao, Chongyang Tao, Yan Song, Dongyan Zhao, and Rui Yan.
Iterative Document Representation Learning Towards Summarization with Polishing. EMNLP 2018.
Yi Liao, Lidong Bing, Piji Li, Shuming Shi, Wai Lam, and Tong Zhang.
QuaSE: Sequence Editing under Quantifiable Guidance. EMNLP 2018.
Lianhui Qin, Lemao Liu, Wei Bi, Yan Wang, Xiaojiang Liu, Zhiting Hu, Hai Zhao, and Shuming Shi.
Automatic Article Commenting: the Task and Dataset. ACL 2018 (Short).
Piji Li, Wai Lam, Lidong Bing, Weiwei Guo, and Hang Li.
Cascaded Attention based Unsupervised Information Distillation for Compressive Summarization. EMNLP 2017.
Piji Li, Wai Lam, Lidong Bing, and Zihao Wang.
Deep Recurrent Generative Decoder for Abstractive Text Summarization. EMNLP 2017.
Xin Zheng, Aixin Sun, Sibo Wang, and Jialong Han.
Semi-Supervised Event-related Tweet Identification with Dynamic Keyword Generation. CIKM 2017.
Chinese Literary Style Dataset
We create a new dataset for the problem of semi-supervised text style transfer. The dataset contains a small-scale parallel corpus of ancient-Chinese-poem-style and modern-Chinese-style sentence pairs, together with two large non-parallel corpora, one for each style.
Refer to our paper for more details: Semi-supervised Text Style Transfer: Cross Projection in Latent Space. EMNLP 2019.
Please cite our paper if you find the dataset helpful.
Chinese Article Commenting Dataset
We create a new dataset for the problem of automatic article commenting. The dataset contains real user comments and a human-annotated subset characterizing the comments' varying quality.
Refer to our paper for more details: Automatic Article Commenting: the Task and Dataset. ACL 2018.
Please cite our paper if you find the dataset helpful.
Virtual Human for E-Sports
We explore AI technologies and applications for E-sports broadcasting and commentary. In these scenarios, the AI must do more than produce an optimal sequence of actions and keystrokes: in an open environment, it has to prioritize many events by combining knowledge, experience, importance, highlight value, coherence, and other factors, and then deliver its analysis of the game as a multimodal presentation in which language style, voice, and appearance are consistent. For ordinary users, the virtual human can play alongside you and help sharpen your E-sports skills; for spectators, it can provide personalized commentary, letting a commentator you customize watch and explain the match with you; for professional players, it can serve as a coach and data analyst, helping them go further in competition through professional analysis of their play.