Just One More Level

I wrote a story about the AI-in-a-box problem: if we manage to create a superintelligent machine mind and want to trap it in a box, can it convince us to set it free? I wrote about 85% of a novella (~35k words) about a superintelligent AI trapped inside several nested worlds: a medieval Japanese whaling expedition hunting a 6-dimensional whale; a colony of naked mole rats attacked by a foreign invader; a gang of zealous sperm swimming towards their life's only goal; a tribe of lizard people migrating across a scorching world to reach the promised cool land; and finally a slightly randomized linear cellular automaton world with a deity trapped inside. These chapters are interspersed with perspectives from the human programmers creating these worlds, who wonder whether they are doing the right thing by trapping a godlike sentience in inert hardware. I'm still looking for the final chapter. If you read it, like it, and have ideas, let me know :)
I am interested in languages and worldbuilding. The background images on this page are from a fictional script I created for the Balas'hai, a race of worms that live in a spherical universe that is closed in both space and time. Here are some notes from the Balas'hai universe.

Aside from fictional languages, I am also fascinated by real-world linguistics. The World Atlas of Language Structures is a really cool site that catalogs the syntactic and phonological features of most languages spoken worldwide.

I once taught myself to read the International Phonetic Alphabet (IPA), and it's really useful when I'm learning a foreign script.