nanogenmo / 2018
National Novel Generation Month, 2018 edition.
Home Page: https://nanogenmo.github.io/
I hesitate between:
Or, more difficult and more interesting but less funny:
Frustrated by how much writing my other idea needs, and by chronic fatigue, I'm going with a simpler idea.
Have four rooms, with four weapons, and four different mythic creatures. And get them to interact.
I'm going to create a novel, in the form of a diary, using UFO sightings data.
The plan for this is to build a scraper to find UFO sighting data. 👽
I originally reported this incident to bufora with dogmatic response which i was never satisfied with. i felt that it lacked scrutiny. prognosis was a bolide space debris, however there was no explosion - the object simply disappeared, it was not a straight flight, it was utterly silent - the night was very still for a november, fireworks emit bangs, there were none. unusually it was very hard to range the object which i don't normally find difficult being a professional engineer. i also gave another incident to report from the same location of an orange orb in 2009 also reported with a dismissive response again that i felt didn't address the issue. photos attached carry the originally completed forms. please take seriously. many thanks.
My idea for a novel would be to take lyrics written by The Beatles, either from one of their albums or all of them, and put them in reverse order. This idea was sparked by the many conspiracy theorists who believe that the original Paul McCartney is dead and currently has a twin replacement. The Beatles were also known to have a keen sense of humor and played on these myths by putting subliminal messages in their music and album covers as a joke.
Here's a crossover NaNoGenMo / NaNoLiPo entry!
We all know a word containing "e"s.
meow
And we all know a word not containing "e"s.
woof
And we all know a monosyllable not containing "e"s.
woof
Here is my NaNoGenMo 2014 entry, 50,000 Meows, refactored (copied and pasted) into 50,000 Woofs.
Original | woof | words | with translation | words |
---|---|---|---|---|
Moby Dick; or The Whale, by Herman Melville | txt pdf | 215,136 | txt pdf | 430,272 |
Here's part of Moby Dick; or Th Whal, by Hrman Mlvill:
"WOOOF.... Wf. wof Wof. WOOF. Woof woooof wf wooof woof wooooooof wf
wooooof; wof wf Wof. WOOOF wf woooof wf wooooof." --WOOOOOF'W WOOOOOOOOF
"WOOOF.... Wf wf woof wooooooooof woof wof Wof. wof Wof. WOOOOF; W.W.
WOOF-WOF, wf woof, wf woooof." --WOOOOOOOOF'W WOOOOOOOOF
WOOOF, WOOOF.
WOOOF, WOOOF.
WOOOF, WOOOF-WOOOF.
WOOOF, WOOOOF.
WOF, WOOOF.
WOOF, WOOOOOF.
WOOOF, WOOOOOOOF.
WOOOF, WOOOOOF.
WOOOOOF, WOOOOF.
WOOOOOF, WOOOOOF.
WOOOF-WOOF-WOOF, WOOOF.
WOOOF-WOOF-WOOF, WOOOOOOOOOF.
WOOOOOOF (Woooooof wf w Wof-Wof-Wooooooof).
And with line-by-line translations:
That night, in the mid-watch, when the old man--as his wont at
Woof wooof, wf wof wof-wooof, woof wof wof wof--wf wof woof wf
intervals--stepped forth from the scuttle in which he leaned, and went
wooooooof--wooooof wooof woof wof wooooof wf wooof wf woooof, wof woof
to his pivot-hole, he suddenly thrust out his face fiercely, snuffing
wf wof wooof-woof, wf woooooof woooof wof wof woof woooooof, woooooof
up the sea air as a sagacious ship's dog will, in drawing nigh to
wf wof wof wof wf w wooooooof woof'w wof woof, wf wooooof woof wf
some barbarous isle. He declared that a whale must be near. Soon that
woof wooooooof woof. Wf woooooof woof w wooof woof wf woof. Woof woof
peculiar odor, sometimes to a great distance given forth by the
woooooof woof, wooooooof wf w wooof woooooof wooof wooof wf wof
living sperm whale, was palpable to all the watch; nor was any mariner
woooof wooof wooof, wof woooooof wf wof wof wooof; wof wof wof wooooof
surprised when, after inspecting the compass, and then the dog-vane, and
wooooooof woof, wooof woooooooof wof wooooof, wof woof wof wof-woof, wof
then ascertaining the precise bearing of the odor as nearly as possible,
woof woooooooooof wof wooooof wooooof wf wof woof wf woooof wf woooooof,
Ahab rapidly ordered the ship's course to be slightly altered, and the
Woof wooooof wooooof wof woof'w woooof wf wf woooooof wooooof, wof wof
sail to be shortened.
woof wf wf wooooooof.
https://github.com/hugovk/NaNoLiPo2018/tree/master/01-avoidlipo
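The translation scheme visible in the excerpts maps each word of length n to "w" + (n-2) "o"s + "f", preserving capitalization and punctuation. A sketch of that transformation (my reconstruction, not the code in the linked repository):

```python
import re

def dogify(word):
    """Map a word of length n to a 'woof' of the same length:
    'w' + (n-2) 'o's + 'f', preserving capitalization.
    Reconstruction of the scheme seen in the excerpts."""
    n = len(word)
    if n == 1:
        return "W" if word.isupper() else "w"
    body = "w" + "o" * (n - 2) + "f"
    if word.isupper():
        return body.upper()
    if word[0].isupper():
        return body.capitalize()
    return body

def translate(text):
    # Replace each alphabetic run, leaving punctuation and spacing intact.
    return re.sub(r"[A-Za-z]+", lambda m: dogify(m.group()), text)
```

For example, `translate("That night, in the mid-watch")` reproduces the "Woof wooof, wf wof wof-wooof" pattern of the excerpt above.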
I will create a "novel" that describes a tournament for the card game War, with a combination of play by play and color commentary. We'll see how the color commentary goes.
I wrote a simple War simulator in JavaScript that I'll post, and I'll use this as the basis for the tournament coverage.
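The author's simulator is in JavaScript; for illustration, here is a minimal War round-loop in Python under the standard rules (higher card takes the trick; ties trigger a "war" where each player buries up to three cards). The pot-shuffling is my own tweak to avoid non-terminating games, not part of the author's design:

```python
import random
from collections import deque

def play_war(seed=None, max_turns=10_000):
    """Simulate one game of War; returns (winner, turns)."""
    rng = random.Random(seed)
    deck = [rank for rank in range(2, 15) for _ in range(4)]  # 13 ranks x 4 suits
    rng.shuffle(deck)
    hands = (deque(deck[:26]), deque(deck[26:]))
    for turn in range(1, max_turns + 1):
        pot = []
        while True:
            if not hands[0]:
                return 1, turn  # player 1 wins
            if not hands[1]:
                return 0, turn  # player 0 wins
            a, b = hands[0].popleft(), hands[1].popleft()
            pot += [a, b]
            if a != b:
                winner = 0 if a > b else 1
                break
            # Tie: a "war" -- each player buries up to three cards, then flips again.
            for hand in hands:
                for _ in range(min(3, max(len(hand) - 1, 0))):
                    pot.append(hand.popleft())
        rng.shuffle(pot)  # shuffling the pot avoids endless deterministic games
        hands[winner].extend(pot)
    return None, max_turns  # stalemate by turn limit
```

The play-by-play commentary would then hang off the events inside the round loop.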
I'm going to start off with something easyish, and then see what follows.
First by reproducing some historical procgen shared by James Ryan:
extremely early #procgen: Pfizer used an IBM 702 in 1956 to generate 42,000 prospective drug names, which were compiled into a printed book!
I'll start with one or both of these from Corpora to get word endings:
I'll find a bunch of one and two syllable words, then combine them to get ~42,000 words.
Only 42k words? Worry not.
Like the IBM machine, it will be "taught to be discreet" and "fixed to automatically eliminate four-letter combinations that wouldn't be proper in a family medicine chest". So I'll need a list of "improper words" to filter those out. And unlike the original, I think they will belong in an ~8k-word appendix.
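The combine-and-filter pipeline described above can be sketched like this; the syllable lists and blocklist are illustrative placeholders, not the actual Corpora data:

```python
import itertools

# Placeholder syllables; the real lists would come from the Corpora word endings.
ONSETS = ["zan", "lor", "vex", "dru", "pfi", "mab", "cor", "tel"]
CODAS = ["ex", "ol", "ine", "ax", "umab", "icin", "azole", "il"]
BLOCKLIST = {"rol"}  # stand-in for the "improper words" filter

def drug_names(onsets, codas, blocklist):
    """Yield every onset+coda combination that passes the blocklist."""
    for a, b in itertools.product(onsets, codas):
        name = a + b
        if not any(bad in name for bad in blocklist):
            yield name.capitalize()

names = list(drug_names(ONSETS, CODAS, BLOCKLIST))
```

With larger real syllable lists, the product easily reaches the ~42,000-name target, and the rejected names can be diverted into the proposed appendix instead of discarded.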
I may also dip into @ojahnn's NaNoLiPo for other ideas, or some unrealised ideas from 2017, 2016 and 2015 and 2014.
(There were other goals but this turned out to be the main one.)
Of particular interest are the world-description in 930 lines of Samovar and the 363-line Python 3 script that renders the generated events into sentences.
The version of Samovar used was 0.2. For more information on Samovar, see its entry at Cat's Eye Technologies or its repository on GitHub.
In celebration of GitHub's recent acquisition by Microsoft, I have provided this document for you in Microsoft Word format. It is 85 pages long and consists of 53,129 words.
(If you cannot (or prefer not to) view files in Microsoft Word format, there is also a Markdown version which you can view directly on GitHub.)
Because it tells a story over the course of 50,000 words, I feel that a single excerpt would not do it justice, so here are a handful of them.
[...] Moonlight flooded in through the French window and illuminated the suit of armor. The shadowy figure rubbed his chin. An owl hooted outside and the shadowy figure froze. The shadowy figure coughed and was now sure no one else was about. An owl hooted outside and the shadowy figure froze. The shadowy figure cast a furtive glance around the room and coughed and was now sure no one else was about and examined the leather couch closely and leaned back in the leather couch and looked out the French window and examined the leather couch closely and looked out the French window and leaned back in the leather couch and got up and stretched and coughed and rubbed his chin and coughed and
[...] Pranehurst put down the encyclopedia. Scurthorpe looked at Pranehurst. Throgmorton nodded to Pranehurst. Furze-Platt looked at Pranehurst. Pranehurst nodded to Throgmorton and nodded to Furze-Platt. Scurthorpe picked up the quill pen and got up and stretched and walked around the library and coughed. Throgmorton looked at Scurthorpe. "I shall write to Old Grisbourne. He will know just what to do," said Throgmorton. Throgmorton looked at Pranehurst. Scurthorpe walked around the library. Furze-Platt examined the bookshelf closely. Throgmorton brushed some dust off his coat sleeve. Pranehurst nodded to Scurthorpe and
[...] Nearby there was a grandfather clock. Furze-Platt walked over to the fireplace. Throgmorton sat down on the leather chair and leaned back in the leather chair. Furze-Platt rubbed his chin. Throgmorton brushed some dust off his coat sleeve. Furze-Platt rubbed his chin and picked up the whiskey. Throgmorton looked at Furze-Platt. "I think YOU stole the silver statuette of Artemis, Furze-Platt!" shouted Throgmorton. "WHAT?" bellowed Furze-Platt. Throgmorton rubbed his chin and put down the newspaper. Furze-Platt walked around the sitting room. Throgmorton rubbed his chin and coughed and nodded to Furze-Platt. "I think YOU stole the silver statuette of Artemis, Throgmorton!" shouted Furze-Platt. "Well I never!" bellowed Furze-Platt. Furze-Platt spluttered and looked out the window.
[...] Furze-Platt looked out the grimy kitchen window and put down the empty teapot and put down the empty kettle and picked up the tea infuser and picked up the empty teapot and put down the empty teapot and looked out the grimy kitchen window and rubbed his chin and picked up the cannister of tea and rubbed his chin and picked up the empty teacup and picked up the empty kettle and examined the grimy kitchen window closely and put down the empty teacup and picked up the empty teapot and put down the empty kettle and picked up the empty kettle and coughed and rubbed his chin and rubbed his wrist and coughed and looked out the grimy kitchen window and walked away from the grimy kitchen window and walked over to the oven and walked away from the oven and
(Original content of this post is retained below)
For past NaNoGenMos I've alternated between "experimental works" and generating "proper novels". Last year I did some "experimental works" so this year I guess I better generate a proper novel, hey? Not that I have the time for this.
Looking at my previous generators, The Swallows was essentially simulation-based and MARYSUE was more-or-less grammar-based. For this one, I'd like to combine the two approaches using techniques that could be called railroading (TVTropes link).
Also, both of those generators modelled the world as discrete objects, in the manner of, say, a typical text adventure game. In this one, in contrast, I'd like to model the world as a set of propositions, similar to the "database" in Prolog. (I don't think I'll actually use Prolog - I mean logic is great and all but I've never been convinced it's very good for programming in. I actually sketched a DSL for this approach a while back, but I don't think I'll use that either. With the right set of abstractions, doing it in a "mainstream" language should be fine, and Python is what I'm most used to these days.)
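As a toy sketch of the "world as a set of propositions" idea in plain Python (the class and method names are mine, not from the planned generator):

```python
class World:
    """World state as a set of (predicate, *args) tuples, in the
    spirit of a Prolog fact database."""

    def __init__(self):
        self.facts = set()

    def tell(self, *fact):
        self.facts.add(fact)

    def retract(self, *fact):
        self.facts.discard(fact)

    def holds(self, *fact):
        return fact in self.facts

    def query(self, predicate):
        """Every fact whose first element is `predicate`."""
        return [f for f in self.facts if f[0] == predicate]

world = World()
world.tell("in_room", "throgmorton", "library")
world.tell("holding", "throgmorton", "quill_pen")
world.retract("holding", "throgmorton", "quill_pen")
```

Events in the simulation then become assertions and retractions against this store, rather than mutations of discrete room/object instances.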
My hope is that those two things will work well together and will allow some more sophisticated narrative development, stuff that was kind of awkward in the previous generators, to fall out fairly naturally.
There are certainly other things I'd like to tackle, but finding the time to do what I've described is already a stretch. But at least one deserves mentioning, which is the actual construction of sentences. The output of a simulation is a sequence of events, and yes you can write one sentence per event, but it's horrendous, even if you use a lot of templates. What would be ideal is if the actual "writing" part of the generator could construct sentences more "from scratch", based on a grammar (obviously) but not just expanding that grammar randomly (obviously) but rather reflecting the content of the events. This is obviously incredibly difficult and I'm not going to get very far in this area, but I'd regard even a tiny bit of progress here as a success.
I have three ideas this year, and I'm making a separate issue for each. This one is a little different, and technically it isn't a NaNoGenMo entry.
First, a bit of background: I'm a college professor working at a school where most of my job is teaching (a 4/4 load). As such, I don't have as much time as I'd like to work on research and publishing. Fortunately, my career advancement is tied mostly to teaching, but I do like writing and publishing things when I can. The problem -- besides time -- is that without the extrinsic motivation of "publish or perish" now that I have tenure, it's been harder to focus on actually finishing things.
I've currently got three or four projects in various stages of completion, so I want to use NaNoGenMo 2018 as a catalyst for finishing one of those things. Specifically, I did a conference paper for ELO 2018 about NaNoGenMo, and I want to expand that into a journal article by the end of the month.
I've got a repository where I'll be collecting my data and updating my draft as it grows. Even if no one cares to look at it as I'm writing, I hope that just having it public at this stage will help me stay accountable for finishing it.
I'm trying to get my students into NaNoGenMo, so I made a couple of quick novels yesterday to demonstrate the concept.
Because it was raining a lot then (and still is today), here are:
and
Not sure if I was going to get time to complete this in November, so I worked on it early. This novel (or more accurately, a novella with a bunch of "junk meows" at the end to meet an arbitrary word count) is an entry along the same lines as "The Track Method", and was an attempt to find and reuse other people's works. The main difference is that the novel text doesn't solely rely on public domain works, but also works that have been licensed under CC-BY-SA (such as Wikiquotes and the StackExchange network). I also generate citations so that people know what works I used (and where I used them).
I'm torn about this novel.
As for the novella's text itself - some of the scenes (not all) in the novella are really cool for me to read and write (as they deal with certain themes that I'm interested in). It's possible that "The Track Method" may be a superior novella though.
The novel consists of:
11,995 words (if you're counting only readable words)
12,382 words (if you're counting readable words and citations)
62,393 words (if you're counting readable words, citations, and junk meows to get me past the word count)
This is my yearly "yes I'm going to finish something for NaNoGenMo this year!"
Hopefully I'll finally find the time and motivation.
I have three ideas this year, and I'm making separate issues for each. This is one of them.
A little while ago I made a Twitter bot out of the text you get when you ask Microsoft Cognitive Services to describe what it sees in an image. There are some limitations to that API, but I bet it could work on a series of images to eventually produce 50K words of text. The trick will be figuring out a meaningful series of images.
It might work to feed it a graphic novel one panel at a time, but I don't think the AI works very well with drawn images. I feel like it's just going to say "a picture of a drawing" every time.
I have yet to try it, though, so maybe that will work or maybe I'll need to think of something else.
The best writers were alcoholics. Presumably their editors fixed all the "I can't find the keyboard" typos. But we don't need editors, because we have autocorrect. Surely hammering out an entire novel while drunk and then having autocorrect fix all the typos will lead to a perfectly accurate text, and not some nonsensical abomination, right?
...Right?
Project X will be about showing the wonders of what `x` is, what it should be, its definitions, and its usage. Project X will wander the documentation space of R functions. I'm an R user; all function arguments in R are documented. Many of them are called `x`. Example below:
plot(x, y, ...)

Arguments:
  x: the coordinates of points in the plot. Alternatively, a single plotting structure, function or any R object with a plot method can be provided.

print(x, ...)

Arguments:
  x: an object used to select a method.
So `x` can be a lot.

For this submission to NaNoGenMo, I'll get all the textual documentation of all R function arguments called `x`, and I'll generate new possible values of `x`. How will these be generated?
Inspired in part by Casey Callich's wonderful post on Asemic Writing, I decided to produce a novel with no semantic content.
If I stay super dedicated, maybe I'll get as far as a procedurally generated font and some nice rendering tools so that I can get away from these horribly meaning-rich ASCII glyphs. But for now the goal is just to make a novel out of meaningless wordlike ASCII units.
It felt somehow appropriate, even important, to produce a full novel as the absolute first output of my program – no inspecting the outputs of components as I worked, no bugfixing, a pure attempt to envision the shape of the output of my code and then see what happened. From blank file to output was about 40 minutes.
This exercise went very well – I got about 90% of what I wanted, and the bugs that made it in gave the output a really quite incredible flavor that I never would've imagined on my own. I'm very pleased.
I will likely go ahead and fiddle with the numbers and shape it into something more closely resembling my original intention, but for now, here's Ogaeiouaeioujaeiou.
The code is available at https://github.com/jseakle/nanogenmo2018
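For flavor, here is a minimal take on generating "meaningless wordlike ASCII units"; this is my own sketch, not the linked generator:

```python
import random

VOWELS = "aeiou"
CONSONANTS = "bcdfghjklmnpqrstvwz"

def asemic_word(rng):
    # Alternating consonant-vowel pairs keep the units wordlike.
    return "".join(
        rng.choice(CONSONANTS) + rng.choice(VOWELS)
        for _ in range(rng.randint(1, 4))
    )

def asemic_text(n_words, seed=0):
    """Emit at least n_words of pronounceable nonsense, grouped
    into capitalized sentence-shaped runs."""
    rng = random.Random(seed)
    sentences, count = [], 0
    while count < n_words:
        words = [asemic_word(rng) for _ in range(rng.randint(4, 14))]
        count += len(words)
        sentences.append(" ".join(words).capitalize() + ".")
    return " ".join(sentences)
```

Setting `n_words=50_000` gives a full "novel" of semantic-free text in one call.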
Having never attempted anything like this before, I think I'm going to write this in a "Griffin and Sabine" style: a series of correspondence between two characters. So the first letter would be the input to the second character's reply, and then a response again from the first, and so on.
Andres is from Buenos Aires Argentina.
Brook is from St Louis MO USA.
Wish me luck.
I have in mind what I consider a more painterly method. Lots of layers, lots of curation.
What I want most is to have the results readable, if not compelling. Humor is also very important. I want a sense of progression and narrative thread. I've made procedural/mechanical books before and they've always seemed odd and not very fun for other, more pedestrian, readers.
Here and here and here you'll find my purchasable poetry books (some have a few preview pages to give you a taste of them).
Instead of a smoothie style text where any one part is pretty similar to any other part, I want to paint with distinct moments. I want to sculpt a journey across a fictional land with verisimilitude, with texture and feeling. Still, I want to embrace the machine part of it and will leave lots of room for buggy fun and discovery of non-intuitive forms.
Narrative will be presented in 2nd person for an extra hook of engagement.
To this end I've begun compiling lists of the essences of day-to-day life around the world. I will also be relying heavily on the amazing database over at ConceptNet.io for filling this world.
Here is a rough outline of the content of the journal entries:
List of types of entries:
(Probabilities are independent and entries stack)
50-200 words each day journaled, about 600 entries total.
Type (content source)
Common: 50-75%
Personal Activity (1-5x)
Fellow Activity (if present)
Environment Activity (using ConceptNet)
Weather + Clouds (web)
Dialogue with fellow (if present)
Frequent: 25-50%
Personal psychology/religion (list of feeling words)
Fellow psychology (if present)
Environment hopes/predictions (see weather)
Missing relatives/home/hometown (wiki kinship/houseItems/TownParts)
Rare: <25%
Medical emergency (body part, then ConceptNet UsedFor)
Quotes (TEXT FILES * 1liners.cap)
Song (scout songs)
Poetry (web)
Holidays (holidays and observations in the united states, markovify)
FoodRecipe (recipes tried and true, markovify (title & then content))
ChildhoodMemory (web A New England Girlhood, search: “I was”, “I did”, “I saw”, …)
InventoryProblems (web)
Once:
Haunting of previous versions of protagonist (at certain times)
Ghost story (Spooky campfire stories)
Death / Killed by travel companion (based on strand length)
Travel companion gives birth / lays eggs / mating.
With audacious ambition, I also want to introduce a recursive plot structure, kind of like:

ABCD(CD(CD)E)EF
Where the parentheses denote the story of a fellow traveler met in the previous step.
And with locations like:
A = ocean, B = beach, C = forest & ghost encounter, D = town & new traveler, E = desert & ghost story, F = mountain goal
Reactions? Ideas? Suggestions?
Last year I wanted to create a language model from Obama's speeches and use that to generate dialogue in a play or novel. Probably a play, because the structure will help.
The goal here is to generate a glossy catalog of bizarre and unique objects - like you might see in SkyMall or The Sharper Image, except these will be even stranger.
The title, description and price of objects are procedurally generated. (Now is as good a time as any to learn Tracery...) Each object will also have a "photo", made by procedurally generating a POV-Ray scene file and rendering it. The final collection of objects will go into a layout tool that generates a .pdf or .ps with cover, table of contents, back page, advertisements, etc.
For determining the word count of the pictures, I'll probably count the words in the POV-Ray script that drives rendering.
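A Tracery-style grammar can be sketched in a few lines of Python; the rules below are illustrative stand-ins for the planned catalog grammar, not Tracery itself:

```python
import random
import re

# Illustrative rules in Tracery's #symbol# style.
GRAMMAR = {
    "origin": ["#adj# #object#, only $#price#.99"],
    "adj": ["Self-Folding", "Levitating", "Whisper-Quiet", "Artisanal"],
    "object": ["Umbrella Valet", "Desk Fountain", "Nostalgia Lamp"],
    "price": ["19", "49", "199", "999"],
}

def expand(symbol, grammar, rng):
    """Pick a random rule for `symbol` and recursively expand
    every #name# placeholder it contains."""
    rule = rng.choice(grammar[symbol])
    return re.sub(r"#(\w+)#", lambda m: expand(m.group(1), grammar, rng), rule)

item = expand("origin", GRAMMAR, random.Random(7))
```

Title, description, and price can each get their own start symbol, with the result fed to the POV-Ray scene generator and layout tool.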
I'd like to develop a formal system for generating plots. I'm inspired by previous generative attempts by others. In particular (wish I had links; may have to edit this later):
The Swallows, which I remember as a logic engine with rules for picking up objects, putting them down, moving people from one room to another, and so forth. This got more interesting when one of the objects was a corpse in the bathtub.
A planning system wherein someone is planning a crime. The data in the system consists of a number of actions that can be performed, which have preconditions and consequences. There are also probabilities associated with the actions. The "novels" put out are more or less a trace of the backtracking search procedure.
The drawback in the first one seems to be that there is no direction. There does not seem to be much in the way of individual motives or desires, which could lead to conflicts.
The drawback in the second was that the actions and situations were modeled by fixed strings. This made for a finite search space. I'm hoping to parameterize the actions and situations to make the system more flexible.
Thus I'm going to introduce unification for a logic in the style of the first inspiration, and a probabilistic choice of actions in the style of the second.
I'm also likely to do this while reading Antonio Damasio's book The Strange Order of Things, which is about the evolution of mind. I'm expecting to learn something a bit formal about emotions there, but I'll see what happens.
The whole process of writing this is likely to be very heuristic and experimental, with only intuitive exploration of partial results. Advice and commentary will be very welcome.
I like playing Seedship quite a lot. Recently I thought to myself, "Maybe I should try something similar in NaNoGenMo?" And that's what I will be doing. My goal is to write code that can generate a description of a spaceship's journey through the stars in search of a new home for humanity.
I have no idea yet what I'm going to do or how, but I wanted to put a stake in the ground to associate some cost, no matter how small, with simply flaking out.
OK, I'm going to try this year.
I want to start with the NaNoWriMo novel I wrote in 2015, which is here. I'll tag it by hand, then infer some kind of stochastic model to capture its structure, then use that, with a bunch of models trained on Gutenberg corpora, to generate something new. I want to do it all in Rust, which I'll be learning along the way.
I have a terrible track record of completing hobby projects.
(time passes)
Yay, I finished! You can read the generated novel and an explanation of how it was all done here:
https://github.com/kranzky/insoluble/blob/master/README.md
I have three ideas this year, and this is one of them. I'm making separate issues for each.
As you may know, I've started a bit of research on NaNoGenMo, which includes a catalog (i.e. some basic metadata) of every completed entry since 2013, as well as an archived copy of the full text, if I could get it.
All told, that's 382 novels that I have the full text of, which all together contain about 45 million words.
So for this entry, I want to make a NaNoGenMo retrospective by using this corpus as the input to some of the common Travesty-esque text generation methods used by participants over the years.
To get to 50K, I think I'll do chapters of maybe 8,000 words each, something like:
And depending how many techniques I end up with, I could do multiple chapters of some techniques and just change the parameters.
I'd also like to generate some literary criticism about NaNoGenMo, so I'm looking at SciGen and the Pomo generator to see if I can get somewhere by tweaking the nouns. I could also markovify some Harold Bloom (for example) but switch out the examples in his prose with examples from NaNoGenMo.
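The "Travesty-esque" methods mentioned are n-gram text generators; a minimal word-level Markov chain over such a corpus might look like this (a generic sketch, not the planned retrospective code):

```python
import random
from collections import defaultdict

def build_model(text, order=2):
    """Map each `order`-word window to the words that follow it."""
    words = text.split()
    model = defaultdict(list)
    for i in range(len(words) - order):
        model[tuple(words[i:i + order])].append(words[i + order])
    return model

def generate(model, n_words, seed=None):
    rng = random.Random(seed)
    state = rng.choice(list(model))
    out = list(state)
    for _ in range(n_words - len(out)):
        choices = model.get(state)
        if not choices:  # dead end: restart from a random known state
            state = rng.choice(list(model))
            choices = model[state]
        word = rng.choice(choices)
        out.append(word)
        state = state[1:] + (word,)
    return " ".join(out)
```

Varying `order` per chapter is one easy way to get the "multiple chapters of some techniques with changed parameters" described above.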
I intend to enter something this year, and the plan is to create a book consisting of randomly-generated horoscopes. I don't know how exactly I will generate them, but I've got a few ideas, and it's not as if my randomly-generated horoscopes could be any less accurate than the real thing.
I finally got around to reading M. R. James' anthology Ghost Stories of an Antiquary. While his stories do indeed have the same structure as most of Lovecraft's non-Dreamlands stories (i.e., an academic stumbles upon something that Should Not Be Known By Man), the style is strikingly different. In fact, were it not for the encounter (often a Twilight Zone worthy twist -- such as a hotel room no. 13 that appears only at night, or an engraving that displays a stop motion image of an abduction), these would be James telling a mundane story about a friend who went to a town in order to study parish records or look at an old building, stayed in a hotel, and had a perfectly nice time consuming large amounts of whiskey in a genteel manner.
What makes this a prime candidate for generation is that, while James' references to architectural styles or historical events are presumably meaningful, they largely went over my head (and thus over the head of the average reader) -- in other words, without the supernatural element to act as a foil, there's very little reason to believe anything is wrong with the rest of the flavor text, and very often the supernatural element is subtle or comes in quite late!
In other words, we can readily emit descriptions of obscure historical events or technical details about architecture, the business of nineteenth century academic print distribution, the collecting of rare medieval manuscripts, or any of the other things James spends pages describing, and make these descriptions utter nonsense, but the fact that these descriptions are utter nonsense can be hidden from the reader.
Tracery is the ideal tool for such a project: we can produce descriptions of buildings, medieval or renaissance local political squabbles, folios containing pieces of assorted known works, various accommodations, et cetera, in James' style and marry them together with generated descriptions of well-bred Cambridge men talking about golf over whiskey-and-soda, and preface each one with his typical introduction wherein he claims we "must remember" some acquaintance of his, who once told him this story in confidence after they met at some museum or other.
I'll update this issue with my thought processes.
High-level plan:
As I'm attempting NaNoWriMo this year, I'm doing a minimal NNGM which will use my NNWM draft as raw material, probably via a recurrent neural net.
I'm in. Zero ideas about what I'm gonna do and how I'll do it. But still.
Not sure what I'm making yet. Some ideas:
Recently I've been super into Dwarf Fortress and I am in awe of the complexity of the world/character generation, so I want to make a story where the characters are random (but not 100% random, still logical) and that affects what happens throughout the story.
The story will overall be about 4 men trying to get through a haunted forest, other than that I'm aiming for as many possibilities and outcomes as possible, all based on the characters and their stats etc.
My main goal for this is, like DF, to have a seemingly infinite amount of stories to be generated from one program. When you're done reading, you will be able to press a button to generate a new story and read a new one.
For my 1.0 release, I'm going to target something that generates a conversation between characters about a randomly-selected object. Source code will be hosted here:
This is an open issue where you can comment and add resources that might come in handy for NaNoGenMo.
There are already a ton of resources on the old resources threads for the 2013 edition, the 2014 edition, the 2015 edition, the 2016 edition, and the 2017 edition.
I’m hoping to make a field guide of (some facts but mostly) generative poetry and prose on the subject of various flora. This is perhaps a generous definition of fiction, but it’s inspired by the sort of travel-guide structure at work in, e.g., the Annals of the Parrigues.
I have no idea yet what techniques I will use as I haven’t done a lot of research! But as soon as I start a repository I will add it here.
The idea is to have a dungeon which is a very long corridor, full of creatures, items, magic things and others.
Adventurers will go down the corridor, looking for the magical item at the end of it. Each chapter will be one adventurer going down the corridor, fighting monsters until they eventually perish or reach the end, where the book ends.
Fights will be resolved using D&D 5e rules; the items a character carries will also stay in the dungeon for the next adventurer to collect and use.
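For illustration, one attack in the 5e-style combat loop could be resolved like this (the bonuses and armor class are placeholder stats, and this is a sketch of the basic rules, not the entry's code):

```python
import random

def attack(rng, attack_bonus, target_ac, damage_die=8, damage_bonus=2):
    """One 5e-style attack roll: d20 + bonus vs. AC.
    A natural 20 hits and doubles the damage dice; a natural 1 misses.
    Returns damage dealt (0 on a miss)."""
    roll = rng.randint(1, 20)
    if roll == 1:
        return 0  # automatic miss
    crit = roll == 20
    if not crit and roll + attack_bonus < target_ac:
        return 0  # miss
    dice = 2 if crit else 1
    return sum(rng.randint(1, damage_die) for _ in range(dice)) + damage_bonus

rng = random.Random(0)
damage = attack(rng, attack_bonus=5, target_ac=13)
```

A chapter would then be a log of these rolls narrated as prose, with the adventurer's hit points ticking down room by room.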
Okay, I'm back.
AND WORSE THAN EVER!
I'm actually planning on NOT GENERATING NEW TEXT but assembling text into images, or using them to reveal images, or something like that.
I hope to make a headless version of my (poorly-named) imagetexter app
web-version and source
https://www.instagram.com/p/BpTDsmDhS8z/
A new text could be generated, but my focus will be on automation: piping chunks of text into the app and returning images that will fill a book. And somehow assembling them into a book, so a PDF or something.
Or maybe something else. Who knows. I just moved, again. This is becoming a bad habit. I don't even know where my pants are right now. My wife said "they're in a box. I think you labelled it miscellaneous."
Not sure about my time commitment, but this is something I've been planning for a while
I would like to make some sort of Addventure (a collaborative gamebook) using Jupyter Notebook and some Python 3
I have a general concept of how it is going to work, but I am still very much in planning stages
The idea is to create a bunch of actions that have preconditions, expressed as prolog, and consequences. For example:
start_pursuit(A, B) :- character(A),
character(B),
knows_whereabouts(A, B),
desires_interaction(A, B, OutcomeA, _),
positive(OutcomeA).
which has as consequence `pursuing(A, B)`. Another program will handle introducing and removing facts from the Prolog environment (frankensteined linear Prolog, basically). Possibly I will take character knowledge up one meta level, but not two. The novel is then a bunch of actions. Already this gives interesting narrative structures, like "tell the story in topological sort order but intentionally off by two" so you're always finding out why actions were just taken. The issues that I can foresee are:

- Can I define enough predicates like `pursuing` and `knows_whereabouts` so I can write enough different actions?
- `desires_interaction` is doing a lot of work here.

I'm sure I will find many other problems, but I think the approach is promising; I know similar things have been done before.
PDF/source: https://github.com/eq/nanogenmo2018/releases
This is going to be my 'gimme' novel this year.
Sound has frequency and words have frequency -- and both music and rhetoric use repetition -- so why not perform a conversion? Specifically, I'm planning to take midi files of existing & well-known songs, convert each note into a frequency, and then substitute it with the word whose frequency rank is the same as the note's frequency in hertz.
(I may or may not adjust by 20. Human hearing range is approximately 20Hz-20kHz, and while the top 20 most common words are extremely boring, music very rarely sits on the border of infrasound.)
Open questions: should I treat each voice as a separate stream entirely, make them parallel (with some kind of formatting), or somehow combine them (with an average, or a sum, or maybe combined with word2vec)?
Tools: midi2csv, a list of the 20,000 most common English words, maybe a table of notes to frequencies (in case I somehow have trouble with calculating scales).
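The note-to-word mapping described above can be sketched in a few lines. This is a hedged illustration, not the poster's tooling: the word list here is a placeholder for the real 20,000-word frequency list, and the `offset` parameter is the optional "adjust by 20" mentioned above.

```python
# Map a MIDI note to its frequency in Hz (standard tuning: A4 = note 69
# = 440 Hz), then to the word whose frequency rank matches that number.

def midi_note_to_hz(note):
    """Equal-temperament frequency of a MIDI note number."""
    return 440.0 * 2 ** ((note - 69) / 12)

def note_to_word(note, common_words, offset=20):
    """offset=20 skips the 20 most common (and most boring) words."""
    rank = round(midi_note_to_hz(note)) + offset
    return common_words[min(rank, len(common_words) - 1)]

# Toy stand-in for the 20,000 most common English words.
words = [f"word{i}" for i in range(20000)]
print(midi_note_to_hz(69))        # 440.0
print(note_to_word(69, words))    # word460
```

Middle C (note 60) lands near rank 262, so most melodies would stay comfortably inside a 20,000-word list.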
I've been kicking an idea around for a while, though I've not yet tried to implement it.
Procedurally generate a MUD-style world, and then send a series of adventurers through it, and log their progress.
Not very exciting, though perhaps verging on readable, which is sort of my goal.
There's the issue that adventurers may die before we get the length of novel we want, because I rely so heavily on randomness for this sort of thing. Hence the series of adventurers.
Not sure if I want to keep trying until I get one of the right length, or send in new adventurers, and have the dead haunt whatever room they died in.
Tweaking how the NPCs, who write our story for us, behave would probably take the most amount of time.
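The generate-world, send-adventurers, log-progress loop above can be sketched roughly like this. All names and probabilities are assumptions for illustration, not the poster's code; the "keep sending new adventurers" strategy is the one described above.

```python
# Generate a tiny MUD-style map, send adventurers through it, and keep
# logging until the story is long enough, replacing adventurers who die.
import random

def make_world(n_rooms, seed=0):
    """Rooms are numbered; each connects to two random rooms."""
    rng = random.Random(seed)
    return {r: rng.sample(range(n_rooms), k=2) for r in range(n_rooms)}

def run_adventurer(world, name, rng, max_steps=50):
    """Walk randomly; each step carries a small chance of death."""
    log, room = [], 0
    for _ in range(max_steps):
        room = rng.choice(world[room])
        log.append(f"{name} enters room {room}.")
        if rng.random() < 0.05:          # heavy reliance on randomness
            log.append(f"{name} dies in room {room}.")
            break
    return log

rng = random.Random(1)
world = make_world(10, seed=1)
story, adventurer = [], 1
while len(story) < 200:                  # target length, in log lines
    story.extend(run_adventurer(world, f"Adventurer {adventurer}", rng))
    adventurer += 1
print(len(story) >= 200)   # True
```

Having the dead haunt their death rooms would just mean recording `(name, room)` pairs and letting later adventurers' logs mention them.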
In the book of Chronicles in yon Bible of yore, there's just them lists of who begat whom:
Adam begat Seth; and Seth, Enos, Kenan, Mahalaleel, Jered, Henoch, Methuselah, Lamech, Noe, Shem, Ham, and Japheth.
The sons of Japheth were Gomer, Magog, Madai, and Javan, Tubal, Meshech, and Tiras.
Forsooth the sons of Gomer were Ashchenaz, and Riphath, and Togarmah.
And the sons of Javan were Elishah, and Tarshish, Kittim, and Dodanim.
The sons of Ham were Cush, and Mizraim, Put, and Canaan.
And the sons of Cush were Seba, and Havilah, Sabta, and Raamah, and Sabtecha. And the sons of Raamah were Sheba, and Dedan.
[....]
And Caleb, the son of Hezron, took a wife, Azubah, by name, of whom he begat Jerioth; and his sons were Jesher, and Shobab, and Ardon (and her sons were Jesher, and Shobab, and Ardon).
And when Azubah was dead, Caleb took a wife, Ephrath, which childed Hur to him (who bare him Hur).
And Hur begat Uri; (and) Uri begat Bezaleel.
After these things Hezron entered to the daughter of Machir, the father of Gilead, and he took her to wife, when he was of sixty years; and she childed Segub to him. (After these things Hezron went to the daughter of Machir, the father of Gilead, and he took her for his wife, when he was sixty years old; and she bare him Segub.)
So here's the plan: Make that. But with some occasional flair:
"Zebleth begat Womam, who traveled to the distant East and conquered the kingdoms there of Geotash and Gammom, becoming known thereafter as Womam, Razer of the East."
"The lone daughter of Laasies was Aarpya, who went to the delta of the Ten-Moon River in her 23rd year and swam in it, so that she could not be killed by snakes or knives ever thereafter."
etc.
I also think it would be fun to include some language shift as the generations progress; people being named for their ancestors but with slightly different spellings as "ch" becomes "k", doubled letters are combined, silent vowels dropped, etc.
Anyway that's my plan! We'll see how it goes. Feels simple enough to be completable as a first attempt at language gen but also complex enough to stay interesting for a month. Will probably be streaming work on this every Thursday at twitch.tv/fourbitfriday, if anyone wants to keep me company!
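The generational spelling-drift idea above can be sketched as a handful of substitution rules plus doubled-letter collapsing. The specific rules here are illustrative guesses, not a real sound-change model:

```python
# Apply one generation of spelling drift to an ancestor's name.
DRIFT_RULES = [
    ("ch", "k"),        # "ch" hardens to "k", as described above
    ("th", "t"),        # illustrative extra rule
]

def drift(name):
    name = name.lower()
    for old, new in DRIFT_RULES:
        name = name.replace(old, new)
    # collapse doubled letters: "Aarpya" -> "Arpya"
    out = []
    for ch in name:
        if not out or out[-1] != ch:
            out.append(ch)
    return "".join(out).capitalize()

print(drift("Henoch"))   # Henok
print(drift("Aarpya"))   # Arpya
```

Applying `drift` once per generation would let "Henoch" beget a "Henok" a few begats down the list.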
Here's a crossover NaNoGenMo / NaNoLiPo entry!
We all know a word whose letters all have at least one axis of symmetry.
woof
F it. Let's try another.
moo
Here is my NaNoGenMo 2014 entry, 50,000 Meows, refactored (copied and pasted) into 50,000 Woofs (including unit tests), then refactored (copied and pasted) into 50,000 Moos.
cp ../data/pg2701.txt .
python3 moo.py pg2701.txt > moo-pg2701.txt
python3 moo.py --translation pg2701.txt > moo-x2-pg2701.txt
# Or with PDFs, needs py2pdf.py and 'pip install reportlab'
./moo.sh pg2701.txt "Mo Dick; o T Wal, Hman Mlvill (or Mooby Dick; Or the White Sea Cow)"
Original | moo | words | with translation | words |
---|---|---|---|---|
Moby Dick; or The Whale, by Herman Melville | txt pdf | 215,136 | txt pdf | 430,272 |
Here's part of Mo Dick; o T Wal, Hman Mlvill (or Mooby Dick; Or the White Sea Cow):
"MOOOO.... Mo. moo Moo. MOOO. Mooo mooooo mo moooo mooo moooooooo mo
moooooo; moo mo Moo. MOOOO mo mooooo mo moooooo." --MOOOOOO'M MOOOOOOOOO
"MOOOO.... Mo mo mooo moooooooooo mooo moo Moo. moo Moo. MOOOOO; M.M.
MOOO-MOO, mo mooo, mo mooooo." --MOOOOOOOOO'M MOOOOOOOOO
MOOOO, MOOOO.
MOOOO, MOOOO.
MOOOO, MOOOO-MOOOO.
MOOOO, MOOOOO.
MOO, MOOOO.
MOOO, MOOOOOO.
MOOOO, MOOOOOOOO.
MOOOO, MOOOOOO.
MOOOOOO, MOOOOO.
MOOOOOO, MOOOOOO.
MOOOO-MOOO-MOOO, MOOOO.
MOOOO-MOOO-MOOO, MOOOOOOOOOO.
MOOOOOOO (Mooooooo mo m Moo-Moo-Moooooooo).
And with line-by-line translations:
Fishiest of all fishy places was the Try Pots, which well deserved
Mooooooo mo moo moooo mooooo moo moo Moo Mooo, moooo mooo mooooooo
its name; for the pots there were always boiling chowders. Chowder for
moo mooo; moo moo mooo moooo mooo mooooo moooooo mooooooo. Moooooo moo
breakfast, and chowder for dinner, and chowder for supper, till you
moooooooo, moo moooooo moo mooooo, moo moooooo moo mooooo, mooo moo
began to look for fish-bones coming through your clothes. The area
moooo mo mooo moo mooo-moooo mooooo moooooo mooo moooooo. Moo mooo
before the house was paved with clam-shells. Mrs. Hussey wore a polished
mooooo moo moooo moo moooo mooo mooo-mooooo. Moo. Mooooo mooo m mooooooo
necklace of codfish vertebra; and Hosea Hussey had his account books
mooooooo mo moooooo mooooooo; moo Moooo Mooooo moo moo moooooo moooo
bound in superior old shark-skin. There was a fishy flavor to the milk,
moooo mo mooooooo moo moooo-mooo. Moooo moo m moooo mooooo mo moo mooo,
too, which I could not at all account for, till one morning happening
moo, moooo M moooo moo mo moo moooooo moo, mooo moo moooooo moooooooo
to take a stroll along the beach among some fishermen's boats, I saw
mo mooo m mooooo moooo moo moooo moooo mooo moooooooo'm moooo, M moo
Hosea's brindled cow feeding on fish remnants, and marching along the
Moooo'm mooooooo moo moooooo mo mooo mooooooo, moo mooooooo moooo moo
sand with each foot in a cod's decapitated head, looking very slip-shod,
mooo mooo mooo mooo mo m moo'm moooooooooo mooo, moooooo mooo mooo-mooo,
I assure ye.
M mooooo mo.
Not sure exactly what I'll have time to do, but I have a lot of random ideas. I'll definitely not be able to get through all of them, but I'm committing to get at least one of them done.
I also have some vague ideas about using neural networks to extract hierarchical formulaic story structures, but that's probably overambitious. I sort of approached this problem the last couple of years, but never got very far.
I'm going to work on an encyclopedia of medicines/cures/treatments for conceptual / philosophical ailments. The main thing I want to do is explore what can be done with Wikidata, in terms of connecting concepts and ideas and things and people to each other, as well as in terms of being an interesting source of random stuff.
I got started today, with some code for plucking a random "concept" — instance of (P31) concept (Q151885) — from Wikidata, and finding a random orchid species.
https://github.com/ragesoss/NaNoGenMo2018
So far, I've got it building some preliminary titles/subtitles of encyclopedia entries, like this:
Genoplesium formosum: end of history's balm
Pelexia fiebrigii: anti-dormancy draught
Epidendrum rigidiflorum: anti-creditor draught
Cyrtochilum tetracopis: essence of range
Ponera dressleriana: tincture of population group
Scaphyglottis coriacea: anti-open access in South Korea draught
Pterostylis clavigera: distilled linear village
Pleurothallis excelsa: distilled samadhi
I'm planning to use PrimitivePic for illustrations, to provide slightly abstract visuals (starting from the linked pictures on Wikidata).
Not sure yet how I'm going to fill out the bodies of each entry, but I'd like to do something with building a narrative and/or instructions for preparation and usage out of connected facts from Wikidata and/or Wikipedia.
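The "random concept from Wikidata" step described above boils down to a SPARQL query against the Wikidata Query Service. The sketch below only builds the query string (the endpoint would be `https://query.wikidata.org/sparql`); the random-offset trick and function name are assumptions for illustration — see the linked repo for the actual code.

```python
# Build a SPARQL query for items that are instance of (P31)
# concept (Q151885); a random OFFSET gives a different slice each run.
import random

def random_concept_query(limit=100, seed=None):
    rng = random.Random(seed)
    offset = rng.randrange(0, 5000)
    return f"""
    SELECT ?item ?itemLabel WHERE {{
      ?item wdt:P31 wd:Q151885 .
      SERVICE wikibase:label {{ bd:serviceParam wikibase:language "en". }}
    }}
    LIMIT {limit} OFFSET {offset}
    """

query = random_concept_query(seed=42)
print("wdt:P31" in query and "wd:Q151885" in query)   # True
```

The same pattern with a different P31 target (orchid taxa) would supply the species half of each title.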
We don't have a particular approach planned out quite yet, but Team Wuthering Hyperparameters (@dellison and @daggledmermaids) will be participating this year!
A compendium of various national (and otherwise notable) flags, with a brief history of each.
Hi! I intend to participate in the challenge. This is my first time joining NaNoGenMo. I have some experience with procedural content generation and a newly found interest in solo RPG games, so I thought that starting with a procedurally generated fantasy novel would be fun. My programming language of choice is Python, and I'll probably use ideas from many solo RPG engines and see what it becomes. My plan is to make it as readable as it can be; maybe I'll go with a diary format to avoid some text-coherency issues. I'll start today by setting up a repo on GitHub and messing around with some of the ideas I came up with.