
So if we asked GPT to write a book, it would hallucinate a chain of words without sticking to any coherent plot. However, we could use a "multi-resolution" approach even with today's version of GPT: at the top level we ask it to write a brief plot for the entire novel; at the next level we use this plot as context and ask it to outline the sub-plots of the 3 books in our novel; at the third level we use the overall plot and a book's summary as context to generate brief descriptions of the chapters in that book, and so on.
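The top-down idea could be sketched roughly like this. Everything here is hypothetical: `complete()` stands in for a real GPT API call (stubbed below), and the prompt wording and counts are just illustrative.

```python
def complete(prompt: str) -> str:
    # Placeholder for an actual GPT API call; returns canned text here.
    return f"<generated text for: {prompt[:40]}>"

def write_novel(premise: str, n_books: int = 3, chapters_per_book: int = 10):
    # Top level: a brief plot for the entire novel.
    plot = complete(f"Write a brief plot for a novel about: {premise}")
    books = []
    for b in range(n_books):
        # Second level: the overall plot is the context for each book's sub-plot.
        subplot = complete(
            f"Overall plot: {plot}\n"
            f"Outline the sub-plot of book {b + 1} of {n_books}.")
        chapters = []
        for c in range(chapters_per_book):
            # Third level: plot + book summary are the context for each chapter.
            chapters.append(complete(
                f"Overall plot: {plot}\nBook summary: {subplot}\n"
                f"Briefly describe chapter {c + 1} of {chapters_per_book}."))
        books.append((subplot, chapters))
    return plot, books
```

Each level only ever sees a short, fixed-size context, which is the whole point: the coherence lives in the summaries, not in one giant prompt.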


It’s interesting because in the writing world there’s a spectrum with plotters at one end and pantsers at the other. Plotters work similarly to what you’ve suggested, starting with a plot and working their way down to the actual writing. Pantsers just start writing ‘by the seat of their pants’ and see what emerges. Stephen King is famously in the latter camp. Most people fall somewhere in between, having a rough plot in mind and working out the rest as they go along. It would be interesting to see different AIs take different approaches and see what emerged.


The pantsers can also fit the model I've described. In this case GPT would keep in memory a sliding window of past N=1024 words, like it does today, but in addition to that it would remember the past N paragraph-tokens (symbols that are blurry versions of all the words in that paragraph), the past N chapter-tokens and so on. When generating words, GPT would first generate the next chapter-token, then the next paragraph-token and finally the next word-token.
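A minimal sketch of that multi-resolution memory, with stand-in names throughout: the class, the `blur()` summarizer (a real model would learn a pooled embedding, not concatenate strings), and N=1024 are all assumptions for illustration.

```python
from collections import deque

class MultiResolutionContext:
    """Sliding windows at three resolutions: words, paragraphs, chapters."""

    def __init__(self, n: int = 1024):
        self.words = deque(maxlen=n)       # fine-grained word tokens, as today
        self.paragraphs = deque(maxlen=n)  # blurry paragraph-tokens
        self.chapters = deque(maxlen=n)    # blurry chapter-tokens

    def blur(self, tokens) -> str:
        # Placeholder summarizer: collapses a token sequence into one
        # coarse token. A real model would produce an embedding instead.
        return " ".join(tokens)[:32]

    def add_word(self, word: str):
        self.words.append(word)

    def end_paragraph(self):
        # Emit one paragraph-token summarizing the recent words.
        self.paragraphs.append(self.blur(self.words))

    def end_chapter(self):
        # Emit one chapter-token summarizing the recent paragraph-tokens.
        self.chapters.append(self.blur(self.paragraphs))
```

Generation would then condition on all three windows at once, predicting the next chapter-token, then the next paragraph-token, then the next word-token.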


https://arxiv.org/abs/2209.14958 This paper outlines a similar method, but with the addition of guiding the plot structure. See page 30 for the specific prompt sets they used.


It's worth a try, but I expect you will still get continuity issues between chapter 1 and later chapters. The output isn't necessarily coherent even at small scale.



