Augmented Creativity: Leveraging Natural Language Processing for Creative Writing

Abstract

Recent advances in artificial intelligence have moved natural language processing (NLP) beyond mere grammar- and spell-checking functionality. One new use that has emerged is the ability to suggest content to writers in order to inspire new ideas through “machine-in-the-loop” strategies in creative writing. To explore the possibilities of such a strategy, this study provides a model to be adopted in creative writing courses in higher education. An NLP application was created using Python and spaCy and deployed via Streamlit. The application allowed students to check whether their grammar aligned with the principles and techniques taught in class, both to deepen their understanding of the grammatical aspects of the content and to improve their creativity as writers. The study seeks to determine the efficacy of a new, proprietary NLP tool in improving students’ understanding of grammar and their creativity in writing. Participants in the study were assessed through surveys and open-ended questions. Findings indicate that participants agreed the algorithm helped them better understand grammar but were less receptive to its assistance in improving their creativity. It should also be noted that the suggestions provided by the algorithm did not necessarily improve the written artifacts submitted in the study. Results indicate that students enjoy using NLP as part of the creative writing process but largely, as with other language processing tools, to assist with grammar and syntax.

Share and Cite:

Plate, D. and Hutson, J. (2022) Augmented Creativity: Leveraging Natural Language Processing for Creative Writing. Art and Design Review, 10, 376-388. doi: 10.4236/adr.2022.103029.

1. Introduction

Artificial intelligence (AI) encompasses a wide variety of computer programming capable of performing functions hitherto associated only with the human mind. While AI is often used as a collective term, it can be subdivided into several specialized fields, including robotics, computer vision, machine learning (ML), and natural language processing (NLP). NLP, the ability of an AI system to process and understand written and oral communication, undergirds daily interactions with information. It processes and analyzes vast amounts of data and is currently used in virtual assistants, search engines, and smartphones. At the same time, NLP can go beyond analyzing, interpreting, and improving how we access information (Chowdhary, 2020). For instance, these algorithms are able to assist writers with structuring, editing, and refining their work. Word processors are so ubiquitous now that spell- and grammar-checking, version control, and style and language analysis are taken for granted (Clark, Ross, Tan, Ji, & Smith, 2018). Yet, while NLP is increasingly common in word processing software such as Microsoft Word and Grammarly, such systems have moved beyond merely providing grammar- or spell-checking and are now able to augment a writer’s capabilities in a more robust fashion. These new abilities sit at the intersection of human-computer interaction, natural language generation and, importantly, computational creativity. The next phase in developing these capabilities will be machines collaborating with writers as co-authors of creative texts. Research into how collaborating with AI will help future writers has expanded beyond the bounds of computer science and is now discussed more broadly. For instance, Zeiba (2021) discussed the potential for the Literary Hub, a popular website for writers. The author notes that computational or electronic literature is not new, but AI has received more attention and is now responsible for more significant roles in the writing process. The year 2020 witnessed the publication of Pharmako-AI, touted as the first book written with “emergent AI.” There is now no shortage of AI writing aids for authors to choose from, and the role of AI in authorship needs to be defined, including copyright considerations granted to non-human entities.

Despite the broad use of AI and NLP in professional writing, as well as in daily emails, there remains a reluctance among postsecondary educators in English and creative writing to include these technologies in classroom instruction and curriculum design. Secondary school examples exist, such as that recently reported by Woo (2022), in which the first Human-AI contest in Hong Kong was implemented to teach K-12 students text generation and to bring the technology into mainstream education. When the potential is discussed in postsecondary studies, the focus remains on using AI specifically to teach composition in English classes (Liu & Kong, 2021). Therefore, this study contributes to the growing body of scholarship on how NLP is being applied in postsecondary education to improve writing, and more specifically creative writing, through the creation and empirical study of an AI algorithm developed and implemented using spaCy. Study participants agreed that the AI tool deepened their understanding of grammar and expressed a desire to use it in the creation of future works of creative writing and short stories; however, the role the tool played in improving creativity was inconclusive.

2. Literature Review

2.1. AI Use in Linguistics

Natural language processing software has demonstrated uses beyond chatbots and virtual assistants. AI has been leveraged for a wide range of uses in linguistics and English education, including language learning, corpus linguistics, reading, vocabulary, pronunciation, error analysis, evaluation of reading support tools, testing of spoken English, and the development of computer-assisted language learning (CALL) systems.

NLP has been leveraged in teaching and learning for many purposes. An example is provided by Ibrahim and Ahmad (2010), in which NLP was used in conjunction with domain ontology techniques to create Unified Modeling Language (UML) diagrams. Static structure diagrams were extracted from informal natural-language requirements through a prototype tool called Requirements Analysis and Class Diagram Extraction (RACE). The tool was intended to give analysts an efficient method for producing class diagrams. Although still in beta testing at the time, such examples demonstrate the potential for NLP in a range of disciplines.

Other examples in tutoring include The Writing Pal, described by McNamara, Crossley, and Roscoe (2013). As the researchers relate, The Writing Pal is an intelligent tutoring system (ITS) capable of providing secondary and postsecondary students with strategies to improve the quality of their writing, primarily in essay writing. The most significant AI resides in the NLP algorithms designed to assess the quality of essays and provide feedback to writers. Given that writing is subjective and highly individualized, these algorithms had to be developed to consider a vast array of rhetorical, contextual, and linguistic features. The study assessing the efficacy of the tool considered the potential for creating computational indices that better predict human assessment of the same essays. Past research has found that indices of cohesion are not predictive of human assessment of essay quality, but that word frequency, syntactic complexity, and linguistic indices are. To address the gaps in research, the study by McNamara et al. included a larger data set, along with a larger set of indices assessing syntactic, reading, rhetorical, cohesion, and lexical elements. The specific model used in the study looked at three indices relating to word frequency, syntactic complexity, and lexical diversity.

Along the same lines as The Writing Pal, the Automated Writing Evaluation (AWE) system was developed to assist in evaluating and improving writing among secondary education students. Snow, Allen, Jacovina, Crossley, Perret, and McNamara (2015) expanded this research to ask whether high-scoring high school writers demonstrated flexibility and how that flexibility might be quantified. The study investigated the hypothesis by comparing student use of linguistic features such as cohesion and narrativity. Entropy analyses were then used along with NLP to calculate the relative rigidity or flexibility of students in their use of cohesive and narrative linguistic features over time. These results were then related to individual differences in vocabulary knowledge, comprehension ability, and prior experience, as well as to essay quality. The study provided a baseline for researchers who seek to quantify the ability of students to write flexibly over specific periods of time.
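As a rough illustration of the entropy component of such analyses, the sketch below computes Shannon entropy over how often a writer relies on each of several linguistic features across essays; the feature labels and counts are hypothetical placeholders, and the exact features and formula used by Snow et al. are not reproduced here. A more even distribution of usage yields a higher value, which in their framing would suggest greater flexibility.

import math
from collections import Counter

def shannon_entropy(feature_counts):
    """Shannon entropy (in bits) of a distribution of feature-usage counts."""
    total = sum(feature_counts.values())
    probs = [count / total for count in feature_counts.values() if count > 0]
    return -sum(p * math.log2(p) for p in probs)

# Hypothetical counts of how often one student leaned on each feature across essays
usage = Counter({"high_cohesion": 4, "high_narrativity": 3,
                 "low_cohesion": 2, "low_narrativity": 1})
# A value closer to log2(4) = 2.0 would indicate more evenly distributed (flexible) usage
print(round(shannon_entropy(usage), 3))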

Another example of NLP being used to improve writing was discussed in a study of eRevise by Zhang, Magooda, Litman, Correnti, Wang, Matsumura, and Quintana (2019). The tool was designed as a web-based environment that analyzes writing and assists with revision using NLP. Its features included rubric-based essay scoring that triggered early, formative feedback messages for students engaged in response-to-text writing. The goal of the tool was to help students understand assignment criteria for using text-based evidence during writing so that they would ultimately be better able to revise their own drafts. At the same time, the increased access to formative feedback showed positive results in reducing the demands on teachers to help students use textual evidence effectively. Results from initial classroom studies concluded that tools like eRevise can help writing students improve their essays through early, formative interventions in the writing process, leading to greater engagement in revision.

In the teaching and learning of foreign languages, corpora have demonstrated special usefulness. These corpora of language data—collections of texts or text extracts assembled to serve as a sample of a language or language variety—began, at the turn of the millennium, to play an ever more important role in framing how language curricula are structured (Coniam, 2004). For example, Hunston (2002) outlined the varied ways in which corpora have been applied in foreign language studies, including stylistics, grammar, translation studies, and dictionary development. Johns (1997) had already noted that one of the most common classroom uses was for data-driven approaches to teaching and learning. Nor is the use limited to the humanities, as Kunioshi, Noguchi, Tojo, and Hayashi (2016) relate in a study of how science and engineering graduate students improved their writing abilities by analyzing small, discipline-specific corpora from their fields of interest.

In order to maximize the usefulness of a corpus, a software tool is necessary to process and display the results of specific searches. Many concordancers and corpus analysis programs have been developed; some of the most widely used include WordSmith Tools and MonoConc Pro. Interestingly, very few of these tools have been designed specifically for a classroom context. As such, their features tend to be aimed at researchers and therefore include many that are rarely used by learners in a classroom setting. Compounding the issue, the user interface design of such programs is often overly complex and does not follow the conventions of recent Windows-based applications. Since then, attempts have been made to develop tools specifically for the classroom. For instance, Anthony (2004) discussed the use case of AntConc, a corpus analysis toolkit designed specifically for classroom use. Such freeware applications, which can now run on both Linux and Windows-based systems, continue to improve and are used in secondary and postsecondary education, where budgets are often more limited than in industry. The feature most often used in tools such as AntConc is the concordancer. Given their ability to support the learning of vocabulary, grammar, writing styles, and collocations in a second or foreign language, concordancers are popular for functions beyond pure research (Sun & Wang, 2003). At the same time, other applications have been developed to assist ESL instruction. Chang and Chang (2004), for instance, presented their results from the three-year Project CANDLE. The project utilized various corpora and NLP to create an online learning environment for non-native English speakers in Taiwan. Using the English-Chinese parallel corpus Sinorama, students were presented with materials to learn reading and writing. Sinorama was coupled with TotalRecall, an online bilingual concordancer, and the reference tool TANGO. Online lessons covered reading, verb-noun collocations, and vocabulary. However, these initial reports did not assess the effectiveness of NLP for teaching English to non-native speakers. Finally, Crossley, Allen, Kyle, and McNamara (2014) discussed the Simple NLP (SiNLP) tool, developed to support discourse processing research. Results of the study found that the tool performs as well as more robust text-analysis tools such as Coh-Metrix on discourse processing tasks.
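To make the concordancer concept concrete, here is a minimal keyword-in-context (KWIC) sketch in Python; it is a toy illustration under assumed inputs, not the implementation behind AntConc, WordSmith Tools, or MonoConc Pro.

def kwic(text, keyword, window=4):
    """Print each occurrence of keyword with `window` words of context on each side."""
    tokens = text.split()
    for i, token in enumerate(tokens):
        if token.lower().strip(".,;:!?") == keyword.lower():
            left = " ".join(tokens[max(0, i - window):i])
            right = " ".join(tokens[i + 1:i + 1 + window])
            print(f"{left:>40} | {token} | {right}")

# Hypothetical mini-corpus for demonstration
corpus = ("The rain fell and the streets emptied. The writer watched the rain "
          "and the city slept while the rain kept falling.")
kwic(corpus, "rain")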

2.2. AI and Creative Writing

Specific to the field of creative writing, a number of natural language processing tools have been created and studies conducted on their efficacy in teaching grammar and improving creativity in postsecondary education. For instance, Clark, Ross, Tan, Ji, and Smith (2018) studied the possible use of machine-in-the-loop creative writing in two case studies using prototypes for slogan and short story writing. Some participants were instructed to write with the AI tool and others without. Results indicated that participants found the tool engaging and helpful and stated they would continue using it in the future. One interesting finding was that the tool did not necessarily produce better examples from participant submissions; however, the authors suggested that redesigning the systems could lead to better support for creative writing in the future.

There is a growing body of scholarship on machine learning (ML) and creativity. Franceschelli and Musolesi (2021), for instance, reviewed the history of ML techniques and computational creativity theories and how those may be leveraged for automatic writing evaluation methods. Although the idea originated in the nineteenth century, the last three decades of the twentieth century witnessed many attempts to build machines with the capability to “originate.” Harold Cohen’s AARON project, a program designed to draw images autonomously, and Margaret Masterman’s computerized haiku generator (http://www.in-vacua.com/cgi-bin/haiku.pl) are but two examples of AI applied to artistic fields in an attempt to program creativity (Cohen, 1988). Likewise, the story generator TALE-SPIN, the poetry of RACTER, and the short narratives of MEXICA are all examples of AI being used in creative enterprises (Meehan, 1977; Racter, 1984; Perez, 2017). These examples were also examined in depth by Douglas Hofstadter in order to better understand how self-reference works in the production of creative works and what that could mean for AI (Hofstadter, 1979).

Another AI interface to improve creativity in writing was discussed in a study by Roemmele and Gordon (2018). The tool, Creative Help, was designed to assist writers by suggesting new sentences in a story while leaving writers in control of the final edits to the generated suggestions. The authors used a recurrent neural network (RNN) language model to generate straightforward, probable suggestions for writers. Interestingly for the study at hand, the researchers varied how random the suggestions were in order to determine the specific role of unpredictability in creativity. Results demonstrated that author interactions are, in fact, influenced by the degree of randomness presented to them.

Examples of AI assisting creative writing extend beyond individual student help; there is a growing trend in the field to examine how it can also support collaboration. Kantosalo and Riihiaho (2019) investigated this possibility and attempted to identify which quantitative metrics may be used to analyze “human-computer co-creativity” in primary school education. Participants were asked to write poems using three different co-creative writing processes: collaborating with AI (human-computer), with another student (human-human), and with another student and AI (human-human-computer). The computer assistant in the study was Poetry Machine, an AI-based application. Participants were given questionnaires after each experience and at the end of the process for a holistic comparison. The metrics used in the questionnaires to evaluate the experience were “immediate fun,” “long-term enjoyment,” “creativity, self-expression, outcome satisfaction,” “ease of starting and finishing writing,” “quality of ideas,” “support from others,” and “ownership.” Respondents disagreed most with regard to long-term enjoyment, quality of ideas, support, fun, and ownership. When collaborating with both another human and the AI application, participants demonstrated the highest levels of long-term enjoyment. At the same time, the AI was judged weakest in terms of support and idea quality.

The studies reviewed demonstrate the viability of AI, machine learning, and NLP for the teaching of creative writing in postsecondary education. At the same time, these tools have not been broadly adopted in the creative writing process. Not surprisingly, faculty in the field are rarely trained in coding, programming, or AI. Furthermore, tools to assist with integration in the classroom may not be widely known or accessible to instructors. To address the latter issue, William Mattingly created and published Python for the Digital Humanities (https://pythonhumanities.com/), which grew out of his dissertation work in 2015. In exploring Carolingian exegesis and the networks of those who wrote scriptural commentaries in eighth- and ninth-century Europe, Mattingly leveraged the scripting language Python. Out of that exercise came resources to help others from a humanities background, with no programming experience, start learning how to code. The AI tool investigated in this study is spaCy, an open-source NLP library written in Python and Cython and released under the MIT license. Through such tools as spaCy and others mentioned above, including Creative Help and Poetry Machine, readily available software can be integrated to prepare students for the future of collaborative writing with machines.
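To illustrate the kind of analysis spaCy makes available to non-programmers, the following is a minimal sketch, not the proprietary classroom application described later, that tags each token’s part of speech and dependency role in a sentence; it assumes the small English model en_core_web_sm has been installed (python -m spacy download en_core_web_sm).

import spacy

# Load the small English pipeline
nlp = spacy.load("en_core_web_sm")
doc = nlp("The rain fell and the streets emptied and the city slept.")

for token in doc:
    # Part-of-speech tag, dependency label, and syntactic head for each word
    print(f"{token.text:<10} {token.pos_:<6} {token.dep_:<10} head={token.head.text}")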

3. Methodology

The mixed-methods study included survey data collected from students. The sample was drawn from Lindenwood University, a private, four-year, liberal arts institution in the suburban ring of St. Louis, Missouri. Participants included 19 undergraduate students from the College of Arts and Humanities. The purpose of the study was to investigate the efficacy of existing natural language AI software and determine the best pedagogical approach to implementing these tools in creative writing courses. The project utilized a mixed-methods design that included quantitative results and qualitative, thematically analyzed open-ended comments from an online survey. The concurrent triangulation design sought to contextualize the quantitative data with the free responses in order to best understand the utility of the tool and any challenges participants may have faced in its use. The survey was administered in Spring 2022 and collected data on student demographics, comfort with technology, experience using the AI tool, and whether the tool improved understanding of grammar and improved creativity. The survey was embedded within classroom activities to ensure a high response rate. First, demographic information was gathered from participants, including year in school, age, sex, ethnicity, residential status, international student or military veteran status, first-generation status, and major.

Next, participants were asked to indicate, via a 1 - 10 Likert scale, their perceptions of AI technology in general and the new tool in particular. First, their general comfort with technology was assessed, along with their comfort in learning new technology. Their experience with the spaCy tool was then assessed: importantly, whether the tool helped improve their grammatical knowledge and, secondly, whether it helped improve their creativity in writing. The final two Likert questions asked whether anything was surprising when using the tool and whether students would want to use a similar AI tool to support their writing in the future.

Students were asked an open-ended question regarding their experience. Students were contacted either through the University course management system or by email with links to the online survey. The survey was available for approximately two weeks in the middle of the term, when the AI tool was used in class, and all data were collected using Qualtrics to ensure privacy and anonymity of responses. Results were sorted by demographics, and data were exported from the survey system. Descriptive statistics were calculated and used for comparisons between groups.
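As a rough illustration of this analysis step, the following sketch shows how exported responses might be summarized with pandas; the file name and column names (survey_export.csv, grammar_rating, creativity_rating, sex) are hypothetical placeholders, not the study’s actual export.

import pandas as pd

# Load the hypothetical survey export (column names are placeholders)
responses = pd.read_csv("survey_export.csv")

# Descriptive statistics for the Likert items, overall and by demographic group
print(responses[["grammar_rating", "creativity_rating"]].describe())
print(responses.groupby("sex")[["grammar_rating", "creativity_rating"]].mean())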

4. Results

The study examined the experiences of creative writing students with a newly developed NLP application at a mid-sized private university. Special attention was paid to whether, and in what manner, students improved their understanding of grammar and their creativity through use of the tool. The survey instrument included numeric and open-ended questions, and the resulting data were analyzed through descriptive and thematic methods. The total sample size for this study was 19 student responses.

Students enrolled in an advanced creative writing class were surveyed, and demographic data were collected from them. Fourteen students responded to the survey in the face-to-face class in Spring 2022. Of those respondents, 64.29% identified as juniors; all identified as 18 - 24 years of age; 50% identified as female, 42.86% as male, and 7.14% opted not to say. With regard to race and ethnicity, 92.86% identified as non-Hispanic, 78.57% as White, and 21.43% as Black or African American. Only one student identified as an international student. Additional demographic considerations included status as an athlete, student employee, veteran, and more. Of respondents, 85.71% identified as non-athletes; 78.57% were not student employees; 21.43% claimed to have a disability; no students identified as veterans; and only 14.29% identified as first-generation students. Students were evenly split between commuter and residential status. Finally, with regard to majors, 35.71% of students reported pursuing the Creative Writing emphasis within the BA in English and another 28.57% the BA in English, while other degrees cited included Acting (BFA), Game Design (BA) with an emphasis in Game Art, History (BA), Spanish (BA), and Theatre (BA).

Students were then surveyed on their experience with technology and their use of the AI tool. Of respondents, 71.43% claimed to be somewhat or very comfortable with technology in general. Turning to experience with the algorithm specifically, 85.72% reported feeling “somewhat” to “extremely good” about the experience of using the application. All students agreed that the spaCy tool either helped or may have helped with their knowledge of grammar, with 85.71% responding in the affirmative. These responses aligned with the qualitative, open-ended responses and offhand in-class remarks regarding the value of the experience. On the other hand, students did not as readily see a correlation between use of the new tool and an improvement in their creativity. Only 21.43% claimed that the tool had improved their creativity, while 78.57% answered “maybe” or responded in the negative. A majority (64.29%) stated that there was nothing surprising about the use of the tool, suggesting a familiarity with similar applications or an expectation that the tool would assist with a better understanding of grammar, but likely not creativity. Finally, 92.86% stated that they would consider using a similar tool in the future during their writing process; however, given the more negative responses regarding creativity, this likely indicates intended use for grammar rather than for creativity.

As noted, students anecdotally claimed that the tool assisted with understanding the grammatical lessons of the class, with several enthusiastically endorsing the use of the technology for creative writing. In the open-ended survey question, students indicated that, overall, the tool was very useful specifically for understanding parataxis. For instance, one student wrote: “I thought it was very user friendly which is the most important aspect of what was made. It being user friendly made it easy for me to just chuck sentences I was unsure of into the tester. It certainly made my life easier.” Prior to using the tool in class, students expressed a great deal of confusion over the concepts associated with paratactic style, while after using the application an understanding was more often cited. Very few students noted difficulties in using the technology. Most comments focused on the relevance for grammar instruction: “I liked how the paratactic checker shows which sentences were correct and I can change them so they can be in a more paratactic style.”

5. Recommendations

The most important recommendation based on the results of this study is increased research, development, and implementation of NLP tools in creative writing (and related) courses. Though the results leave some room for ambiguity of interpretation—primarily due to the sample size and the limited period of use of the tool in the course—the overwhelmingly positive gestalt response of the students is rare. Students seldom express near unanimity that a particular pedagogical approach, exercise, or assignment should be used again and that it created a positive writing experience. This should be the primary focus of any interpretation of this study. The students enjoyed using the tool, felt that it improved their understanding of a technically complex subject, and expressed interest in using other similar tools in the future. Though the following analysis raises some questions and qualifications with respect to these findings, this overarching recommendation can be made with confidence. More research on and development of NLP tools in creative writing, English, and the humanities in general should be pursued with as much energy and funding as possible.

This being said, the manner of implementing such tools in the classroom could be improved. The most significant shortcoming of the study (other than simple limitations of sample size, number of class sections, time, etc.) was that the tool was not incorporated into synchronous classroom activities. Students were given the URL of an application that allowed them to input sentences and test whether they were paratactic in style. Students enjoyed using the tool and made repeated use of it as they revised their writing assignment, but because this work was done exclusively outside of the classroom, the instructor had no way to observe how the students altered sentences or to interact with students as they made revision choices. An important additional use of the tool might have been to allow more time for students to work in groups to generate sentences that fit the style (as demonstrated by the app) and to attempt to “break” the expectations of the app. Creativity in writing often comes from informal experimentation, doodling, and low-stakes attempts at babble alternating with polished, serious writing. This could be encouraged and implemented more effectively in the classroom with the web-based application up on the screen for demonstration purposes.
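For readers curious about what such an application might look like, the following is a speculative sketch only: the study does not publish the tool’s internal rules, so the heuristic below (treating the absence of subordinating markers as paratactic) and the Streamlit interface are illustrative assumptions, not the authors’ implementation.

import spacy
import streamlit as st

nlp = spacy.load("en_core_web_sm")

# Assumed heuristic: dependency labels that typically signal subordination (hypotaxis)
SUBORDINATE_DEPS = {"mark", "advcl", "ccomp", "acl", "relcl"}

def looks_paratactic(sentence: str) -> bool:
    """Return True if no subordinating markers are found (illustrative rule only)."""
    doc = nlp(sentence)
    has_subordination = any(
        token.dep_ in SUBORDINATE_DEPS or token.pos_ == "SCONJ" for token in doc
    )
    return not has_subordination

st.title("Paratactic style checker (illustrative sketch)")
text = st.text_area("Enter a sentence to test:")
if text:
    verdict = "paratactic" if looks_paratactic(text) else "hypotactic (subordinated)"
    st.write(f"This sentence reads as {verdict}.")

A script like this could be run locally with streamlit run app.py after installing spacy, streamlit, and the en_core_web_sm model, which would reproduce the browser-based, sentence-at-a-time workflow the students describe.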

Though this article recommends synchronous use of the tool in the classroom to reinforce the learning of syntactic structures, it is important to note the potential of this and related tools for asynchronous instruction. One of the challenges of online learning is the shortage of engaging, back-and-forth pedagogical tools that allow students to get frequent feedback on their work. This is particularly the case in writing-intensive classes, where the traditional pedagogical model of typed feedback on papers is still the most common method for showing students the strengths and weaknesses in their work. This NLP tool was used by some students quite frequently over a short period of time, and the students enjoyed the gamified side of the experience. A student could type sentences into the tool, discover whether the sentences had a paratactic style, make revisions to those sentences, and then resubmit them to see if they had “beaten” the app. More NLP tools that allow students to check themselves independently against a reliable, real-time “instructor” are among the best ways for students to improve in technical subject matter.

The most challenging recommendation to make has to do with the question of creativity. While students expressed near unanimity in enjoying the tool and finding that it improved their technical skills with respect to grammatical and stylistic analysis, they were less positive about using the tool to enhance their creativity. Possibly, this is because the tool most naturally comes into play during the proofreading and editing phases of the writing process, which fewer students conceive of as creative. But it is undeniable that the study shows little evidence of the tool improving student originality or even stylistic sophistication.

This recommendation is likely related to the point made above about incorporating the tool into synchronous in-class activities. If students had had more experience using the tool to revise sentences in interesting ways and, more importantly, to generate sentences they might not have thought of otherwise, it is probable the tool would have improved their creativity, at least with respect to stylistic choices. It is important to clarify that this is a speculative point, however. The data from the study do not show that students improved in their ability to generate more creative products, either at the sentence level or at the global level of a story, essay, or poem.

6. Conclusion

As stated in the recommendations, the results of this study point to only one conclusion. More work of this sort should be done in most academic fields, and in fields such as creative writing where not much has been done so far, the acceleration should be as great as time and funding permit.

Two changes in higher education make the use of these tools especially promising. First, the ever-expanding pace of technological development makes it inadvisable for disciplines in the humanities to see themselves as continuing in a traditional vein that favors critical thinking, language skills, writing, and so on, over STEM skills. NLP tools in particular are the perfect bridge between language-and-text disciplines and those considered quantitative. NLP tools bring out the fact that the Internet itself, and any code used to run these tools, is built of text. Students can be drawn into seeing disciplines traditionally at odds as complementing each other in their educational experience.

More practically, the shift in higher education toward online and remote forms of education will inevitably change the way in-person, in-classroom pedagogy functions as well. Tools such as the one piloted in this study show a way forward in developing tools for in-person instruction that can then be adapted in minor ways to function in a more “hands-off” manner in asynchronous course modules. As with the humanities-STEM opposition above, the online versus in-person opposition has been contentious in recent years. This study shows that these oppositions need not hold. NLP tools can augment face-to-face instruction in ways that students enjoy, and they can facilitate high-quality pedagogy in contexts where face-to-face interaction is not possible or simply not the intention.

As much research in AI is showing, this should not be an either-or pedagogical debate or struggle. AI tools broadly and NLP tools in particular allow augmented learning and should be developed and utilized wherever possible.

Conflicts of Interest

The authors declare no conflicts of interest regarding the publication of this paper.

References

[1] Anthony, L. (2004). AntConc: A Learner and Classroom Friendly, Multi-Platform Corpus Analysis Toolkit. In IWLeL 2004: An Interactive Workshop on Language e-Learning (pp. 7-13).
[2] Chang, J. S., & Chang, Y. C. (2004). Computer Assisted Language Learning Based on Corpora and Natural Language Processing: The Experience of Project CANDLE. In L. Anthony, S. Fujita, & Y. Harada (Eds.), IWLeL 2004: An Interactive Workshop on Language e-Learning (pp. 15-23).
[3] Chowdhary, K. R. (2020). Natural Language Processing. In K. R. Chowdhary (Ed.), Fundamentals of Artificial Intelligence (pp. 603-649). Springer.
https://doi.org/10.1007/978-81-322-3972-7_19
[4] Clark, E., Ross, A. S., Tan, C., Ji, Y., & Smith, N. A. (2018). Creative Writing with a Machine in the Loop: Case Studies on Slogans and Stories. In Proceedings of 23rd International Conference on Intelligent User Interfaces (pp. 329-340). Association for Computing Machinery.
https://doi.org/10.1145/3172944.3172983
[5] Cohen, H. (1988). How to Draw Three People in a Botanical Garden. In Proceedings of AAAI-88 (pp. 846-855).
[6] Coniam, D. (2004). Concordancing Oneself: Constructing Individual Textual Profiles. International Journal of Corpus Linguistics, 9, 271-298.
https://doi.org/10.1075/ijcl.9.2.06con
[7] Crossley, S. A., Allen, L. K., Kyle, K., & McNamara, D. S. (2014). Analyzing Discourse Processing Using a Simple Natural Language Processing Tool. Discourse Processes, 51, 511-534.
https://doi.org/10.1080/0163853X.2014.910723
[8] Franceschelli, G., & Musolesi, M. (2021). Creativity and Machine Learning: A Survey. arXiv:2104.02726.
[9] Hofstadter, D. (1979). Gödel, Escher, Bach: An Eternal Golden Braid. Basic Books, Inc.
[10] Hunston, S. (2002). Corpora in Applied Linguistics. Cambridge University Press.
https://doi.org/10.1017/CBO9781139524773
[11] Ibrahim, M., & Ahmad, R. (2010). Class Diagram Extraction from Textual Requirements Using Natural Language Processing (NLP) Techniques. In Proceedings for 2010 Second International Conference on Computer Research and Development (pp. 200-204). IEEE.
https://doi.org/10.1109/ICCRD.2010.71
[12] Johns, T. (1997). Contexts: The Background, Development and Trialling of a Concordance-Based CALL Program. In A. Wichmann, S. Fligelstone, T. McEnery, & G. Knowles (Eds.), Teaching and Language Corpora (pp. 100-115). Longman.
https://doi.org/10.4324/9781315842677-9
[13] Kantosalo, A., & Riihiaho, S. (2019). Quantifying Co-Creative Writing Experiences. Digital Creativity, 30, 23-38.
https://doi.org/10.1080/14626268.2019.1575243
[14] Kunioshi, N., Noguchi, J., Tojo, K., & Hayashi, H. (2016). Supporting English-Medium Pedagogy through an Online Corpus of Science and Engineering Lectures. European Journal of Engineering Education, 41, 293-303.
[15] Liu, A., & Kong, D. (2021). Research on the Teaching Mode of College English Based on Artificial Intelligence. Journal of Physics: Conference Series, 1848, Article ID: 012117.
https://doi.org/10.1088/1742-6596/1848/1/012117
[16] McNamara, D. S., Crossley, S. A., & Roscoe, R. (2013). Natural Language Processing in an Intelligent Writing Strategy Tutoring System. Behavior Research Methods, 45, 499-515.
https://doi.org/10.3758/s13428-012-0258-1
[17] Meehan, J. (1977). TALE-SPIN, an Interactive Program That Writes Stories. In IJCAI 1977 (pp. 91-98). Morgan Kaufmann Publishers Inc.
[18] Perez, R. (2017). Mexica: 20 Years-20 Stories [20 años-20 historias]. Counterpath Press.
[19] Racter. (1984). The Policeman’s Beard Is Half Constructed. Warner Books, Inc.
[20] Roemmele, M., & Gordon, A. (2018). Linguistic Features of Helpfulness in Automated Support for Creative Writing. In Proceedings of the First Workshop on Storytelling (pp. 14-19). Association for Computational Linguistics.
https://doi.org/10.18653/v1/W18-1502
[21] Snow, E. L., Allen, L. K., Jacovina, M. E., Crossley, S. A., Perret, C. A., & McNamara, D. S. (2015). Keys to Detecting Writing Flexibility over Time: Entropy and Natural Language Processing. Journal of Learning Analytics, 2, 40-54.
https://doi.org/10.18608/jla.2015.23.4
[22] Sun, Y. C. & Wang, L. Y. (2003). Concordancers in the EFL Classroom: Cognitive Approaches and Collocation Difficulty. Computer Assisted Language Learning, 16, 83-94.
https://doi.org/10.1076/call.16.1.83.15528
[23] Woo, D. J. (2022). Secondary School Student-AI Creative Writing: Strategies from Text Generator Interactions. arXiv preprint arXiv:2207.01484.
[24] Zeiba, D. (2021). How Collaborating with Artificial Intelligence Could Help Writers of the Future. Literary Hub.
https://lithub.com/how-collaborating-with-artificial-intelligence-could-help-writers-of-the-future/
[25] Zhang, H., Magooda, A., Litman, D., Correnti, R., Wang, E., Matsumura, L. C. et al. (2019). eRevise: Using Natural Language Processing to Provide Formative Feedback on Text Evidence Usage in Student Writing. Proceedings of the AAAI Conference on Artificial Intelligence, 33, 9619-9625.
https://doi.org/10.1609/aaai.v33i01.33019619

Copyright © 2024 by authors and Scientific Research Publishing Inc.

Creative Commons License

This work and the related PDF file are licensed under a Creative Commons Attribution 4.0 International License.