
Wan Xiaojun (1979- ), male, Ph.D., doctoral supervisor at the Wangxuan Institute of Computer Technology, Peking University. His main research interests include automatic summarization and text generation, sentiment analysis and semantic computing, and multilingual and multimodal NLP. He has served as an editorial board member of Computational Linguistics, a leading international journal in computational linguistics, and as program committee chair of EMNLP 2019. He is currently secretary-general of the CCF-NLP technical committee, a council member of the Chinese Information Processing Society of China and deputy director of its NLGIW technical committee, an action editor of TACL and ARR, and an editorial board member of NLE and JCST. He has repeatedly served as senior area chair or area chair for major international conferences in the field (ACL, NAACL, EMNLP, EACL, AACL). He received the ACL 2017 Outstanding Paper Award and the IJCAI 2018 Distinguished Paper Award, and has developed several AI writing robots, such as Xiaoming, Xiaonan, and Xiaoke, which are in use at a number of media organizations.
Online publication date: 2023-03
Print publication date: 2023-03-15
万小军. 智能文本生成:进展与挑战[J]. 大数据, 2023,9(2):99-109. DOI: 10.11959/j.issn.2096-0271.2023014.
Xiaojun WAN. Intelligent text generation: recent advances and challenges[J]. Big data research, 2023, 9(2): 99-109. DOI: 10.11959/j.issn.2096-0271.2023014.
Intelligent text generation is a frontier research direction in artificial intelligence and natural language processing, and a key enabling technology behind AI-generated content (AIGC). It can greatly improve the efficiency of text content production and has attracted close attention from both academia and industry in recent years, with deployments across many industries and scenarios, including media publishing and e-commerce. This paper gives a systematic overview of the current applications and mainstream approaches of intelligent text generation, focuses on deep-learning-based text generation techniques, and summarizes the challenges facing existing techniques.
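Abstracting away model details, the neural text generation techniques surveyed in the paper (seq2seq, Transformer, GPT-style models) share one core mechanism: an autoregressive decoding loop that emits one token at a time conditioned on the tokens so far. The sketch below illustrates that loop with a hypothetical toy bigram table standing in for a trained network; all token names and probabilities are illustrative, not from the paper.

```python
# Minimal sketch of greedy autoregressive decoding, the core loop
# behind neural text generators. BIGRAM_PROBS is a toy stand-in for
# a trained model: it maps the previous token to a distribution over
# possible next tokens.
BIGRAM_PROBS = {
    "<s>": {"text": 0.7, "data": 0.3},
    "text": {"generation": 0.9, "data": 0.1},
    "data": {"to": 0.8, "text": 0.2},
    "to": {"text": 1.0},
    "generation": {"</s>": 1.0},
}

def greedy_generate(max_len=10):
    """Decode one token at a time, taking the argmax at each step,
    until the end-of-sequence token </s> or max_len is reached."""
    tokens = ["<s>"]
    for _ in range(max_len):
        dist = BIGRAM_PROBS.get(tokens[-1], {"</s>": 1.0})
        next_tok = max(dist, key=dist.get)  # argmax over next-token probs
        if next_tok == "</s>":
            break
        tokens.append(next_tok)
    return tokens[1:]  # drop the start symbol

print(greedy_generate())  # → ['text', 'generation']
```

Real systems replace the bigram table with a neural network's predicted distribution and often replace the argmax with sampling or beam search, but the step-by-step conditioning structure is the same.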