AI METHODS THAT CAN GENERATE LONGER TEXTS WITH MINIMAL INITIAL INPUT

Authors

  • Á. P. Sándor, Kalashnikov Izhevsk State Technical University; Széchenyi István University in Győr

DOI:

https://doi.org/10.22213/2618-9763-2021-4-109-121

Keywords:

natural language processing, artificial intelligence, fourth industrial revolution, generative pre-trained transformer 2, stochastic models

Abstract

The article is devoted to the description and study of text generators that can produce longer texts from a minimum of initial input. The research is interdisciplinary, carried out at the intersection of Natural Language Processing (NLP) and Computational Linguistics (CL). The rapid growth of web-based textual information has significantly accelerated the development of scientific fields that have existed for many decades but, lacking broad access to input data, did not develop as rapidly as in the past two decades. Several promising ideas and experiments in the automatic processing of natural-language texts have already been implemented in many systems in use, including many commercial ones. In this research the author seeks to better understand and compare existing CL applications and how they operate within each topic; in practice, a given application usually falls into a single topic, with little or no difference in its area of operation and application. The NLP methods used in the field of artificial intelligence are described. Writing is the most important and most traceable form of communication, and artificial intelligence and automation are at the heart of the ongoing fourth industrial revolution. In the course of the research the author drew several conclusions: among other things, he recognized the importance of data vectorization and that any abstract process can be modelled well by mathematical means. Although the various text generators are based on AI, the role of the human is not negligible: setting the input parameters and checking the output results still requires human control for every method. The layout and “quality assurance” of the generated text remain the responsibility of AI researchers. The techniques and methods in this field have evolved and been extended significantly over the past 20 years, and the latest text generators almost reach the scope and quality of text written by a person. This does not mean that in the future we will only read machine-generated text: intuition and creativity still require the presence of a human; the machine processes the information it is given, learns the patterns in it, and then generates texts from them.
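As an illustration of the kind of minimal-input generation discussed above, the sketch below continues a short prompt into a longer text with GPT-2 via the Hugging Face transformers library. It is not taken from the article; the model name, prompt, and sampling parameters are illustrative assumptions, chosen to show that the human still sets the input parameters and must check the output, as the abstract notes.

    # Minimal sketch (illustrative, not from the article): continuing a short
    # prompt with GPT-2 using the Hugging Face "transformers" library.
    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")

    prompt = "Artificial intelligence and automation are"  # minimal initial input
    # The tokenizer vectorizes the text into a sequence of token ids.
    input_ids = tokenizer.encode(prompt, return_tensors="pt")

    # Sampling settings (max_length, top_k, top_p) are exactly the kind of
    # human-controlled input parameters the article emphasizes.
    output = model.generate(
        input_ids,
        max_length=100,
        do_sample=True,
        top_k=50,
        top_p=0.95,
        pad_token_id=tokenizer.eos_token_id,
    )
    print(tokenizer.decode(output[0], skip_special_tokens=True))

The continuation is stochastic: each run samples a different text from the model's learned patterns, which is one reason the output still needs human review.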

Author Biography

Á. P. Sándor, Kalashnikov Izhevsk State Technical University; Széchenyi István University in Győr

Student, Faculty of Mechanical Engineering, Informatics and Electrical Engineering

References

Rubya S., Monir S., and Ferdous H. S. Genetic approach to a flexible cell phone keypad with reduced keystrokes and key jamming for better human technology interaction. J. Multimed., Oct. 2012, vol. 7, no. 5, pp. 341-352. DOI: 10.4304/jmm.7.5.341-352.

Sang-Hun C. Rule of Thumbs: Koreans Reign in the Texting World. The New York Times, 2010. Available at: https://www.nytimes.com/2010/01/28/world/asia/28seoul.html?th&emc=th

James C. L. and Reischel K. M. Text input for mobile devices. Proceedings of the SIGCHI conference on Human factors in computing systems - CHI '01, 2001, pp. 365-371. DOI: 10.1145/365024.365300

Sandnes F. E. Reflective Text Entry: A Simple Low Effort Predictive Input Method Based on Flexible Abbreviations. Procedia Comput. Sci. 2015, vol. 67, pp. 105-112. DOI: 10.1016/j.procs.2015.09.254.

Russell S. and Norvig P. Artificial Intelligence: A Modern Approach. Second, revised, extended edition, pp. 737-738.

Qafmolla N. Automatic Language Identification. Eur. J. Lang. Lit., Jan. 2017, vol. 7, no. 1, pp. 140-150. DOI: 10.26417/ejls.v7i1.

Dunlop M. D., Durga N., Motaparti S., De Meo R., and Dona P. Open Adaptxt: An Open Source Enabling Technology for High Quality Text Entry. CHI 2012 Extended Abstracts, 2012.

MacKenzie I. S., Kober H., Smith D., Jones T., and Skepner E. LetterWise: Prefix-based disambiguation for mobile text input. Proc. ACM Symposium on User Interface Software and Technology (UIST 2001), 2001, p. 111.

Radev D. R., Joseph M. T., Gibson B., and Muthukrishnan P. A bibliometric and network analysis of the field of computational linguistics. J. Assoc. Inf. Sci. Technol., Mar. 2016, vol. 67, no. 3, pp. 683-706. DOI: 10.1002/asi.23394.

Nadkarni P. M., Ohno-Machado L., and Chapman W. W. Natural language processing: An introduction. J. Am. Med. Informatics Assoc., Sep. 2011, vol. 18, no. 5, pp. 544-551. DOI: 10.1136/amiajnl-2011-000464.

Cambria E. and White B. Jumping NLP Curves: A Review of Natural Language Processing Research. IEEE Comput. Intell. Mag., May 2014, vol. 9, no. 2, pp. 48-57. DOI: 10.1109/MCI.2014.2307227.

Kang M., Ahn J., and Lee K. Opinion mining using ensemble text hidden Markov models for text classification. Expert Syst. Appl., Mar. 2018, vol. 94, pp. 218-227. DOI: 10.1016/j.eswa.2017.07.019.

Abney S. Data-Intensive Experimental Linguistics. Linguist. Issues Lang. Technol., 2011, vol. 6, pp. 1-27.

Szűts Z. and Yoo J. Taxonomy, use cases, strengths and challenges of chatbots. Inf. Tarsad., Jul. 2018, vol. 18, no. 2, pp. 41-55. DOI: 10.22503/inftars.XVIII.2018.2.3.

Kiddon C., Zettlemoyer L., and Choi Y. Globally coherent text generation with neural checklist models. EMNLP 2016 - Conf. Empir. Methods Nat. Lang. Process. Proc., 2016, pp. 329-339. DOI: 10.18653/v1/d16-1032.

Writer B. Lithium-Ion Batteries. Cham: Springer International Publishing, 2019.

Pap G. and Szűcs G. Continuous Markov Chains; Kolmogorov Equations, in Stochastic Processes, 2013.

Ibidem.

Ogada K. and Mwangi W. N-gram Based Text Categorization Method for Improved Data Mining. J. Inf. Eng. Appl., 2015, vol. 5, no. 8, pp. 35-44.

Romano M., Paolino L., Tortora G., and Vitiello G. The Tap and Slide Keyboard: A New Interaction Method for Mobile Device Text Entry. International Journal of Human-Computer Interaction, 2014, vol. 30, no. 12, pp. 935-945. DOI: 10.1080/10447318.2014.924349.

Weir D., Pohl H., Rogers S., Vertanen K., and Kristensson P. O. Uncertain text entry on mobile devices. In Conference on Human Factors in Computing Systems - Proceedings, 2014, pp. 2307-2316. DOI: 10.1145/2556288.2557412.

Dey A. Machine Learning Algorithms: A Review. Int. J. Comput. Sci. Inf. Technol., 2016, vol. 7, no. 3, pp. 1174-1179.

Sebastiani F. Machine Learning in Automated Text Categorization. ACM Comput. Surv., Mar. 2002, vol. 34, no. 1, pp. 1-47. DOI: 10.1145/505282.505283.

Yeo H. S., Phang X. S., Castellucci S. J., Kristensson P. O., and Quigley A. Investigating tilt-based gesture keyboard entry for single-handed text entry on large devices. In Conference on Human Factors in Computing Systems - Proceedings, May 2017, pp. 4194-4202. DOI: 10.1145/3025453.3025520.

Sotsenko A., Zbick J., Jansen M., and Milrad M. Flexible and contextualized cloud applications for mobile learning scenarios. In Advances in Intelligent Systems and Computing, 2016, vol. 406, pp. 167-192.

Inan H. A. et al. Training Data Leakage Analysis in Language Models, 2021.

Askell A. et al. Better Language Models and Their Implications. OpenAI, 2019. Available at: https://openai.com/blog/better-language-models/ (accessed 12.11.2021).

Ibidem.

Radford A., Wu J., Child R., Luan D., Amodei D., and Sutskever I. [GPT-2] Language Models are Unsupervised Multitask Learners. OpenAI Blog, 2019, vol. 1, pp. 1-7.

Askell A. et al. Better Language Models and Their Implications. OpenAI, 2019. Available at: https://openai.com/blog/better-language-models/ (accessed 2.11.2021).

Lai I. Conditional Text Generation by Fine Tuning GPT-2. Towards Data Science, 2021. Available at: https://towardsdatascience.com/conditional-text-generation-by-fine-tuning-gpt-2-11c1a9fc639d (accessed 19.11.2021).

google-research/bert. GitHub repository. Available at: https://github.com/google-research/bert (accessed 11.11.2021).

Chen Y.-C., Gan Z., Cheng Y., Liu J., and Liu J. Distilling the Knowledge of BERT for Text Generation. ICLR 2020 Conference withdrawn submission, 2019.

Omeiza D., Adewole K. S., and Nkemelu D. EEG-based Communication with a Predictive Text Algorithm. Submitted to 31st Conference on Neural Information Processing Systems (NIPS 2018). arXiv, 2018.

America M. Talk to Transformer. The Routledge Handbook of Remix Studies and Digital Humanities, 2021. Available at: https://app.inferkit.com/dem (accessed 08.11.2021).

Published

18.01.2022

How to Cite

Sándor Á. P. (2022). AI METHODS THAT CAN GENERATE LONGER TEXTS WITH MINIMAL INITIAL INPUT. Social’no-Ekonomiceskoe Upravlenie: Teoria I Praktika, 17(4), 109–121. https://doi.org/10.22213/2618-9763-2021-4-109-121

Issue

Section

Articles