Posted by Trixter on August 27, 2013
I have the dumbest first-world problems. The current one: I’ve come up with new ideas for a retroprogramming/demoscene project, ideas that nobody on my platform (4.77MHz 8088) has ever attempted. They’re pretty radical, and on par with 8088 Corruption in terms of code cleverness and impact, except this time, I actually know what I’m doing (as opposed to stumbling into it somewhat by accident with Corruption). Every time I think about what is possible, I get really excited and obsess over implementation details. In my head, it’s all coded and running beautifully, bringing shrieks of laughter to other retroprogrammers and demosceners alike.
The problem is, even after weeks of design, it’s still in my head.
What I have planned relies heavily on the same tricks as the 8-bit democoding scene, namely lots of tables. Generation of those tables, compared to how they are used, is a highly asymmetric process: It can take hours or days to create the table, whereas using it only costs a few cycles for a memory lookup. This is very similar to the process of vector quantization, where coming up with the codebook is an NP-hard problem that there is no universal solution for. To bring my ideas to life, I need to write programs that take advantage of gigs of memory and multiple cores to come up with the tables I need. So why not just write them? What’s stopping me is the dumbest of dumb problems: I simply haven’t written any large, modern, multithreaded applications before. All of my programming over the last 30 years has been either in Pascal or assembler, using dialects and architectures that aren’t applicable to modern resources any more.
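To make the asymmetry concrete, here’s a toy sketch in (modern, decidedly non-8088) Python. Everything in it is hypothetical: the naive k-means loop stands in for the hours-or-days codebook search, and the 256-entry table stands in for the kind of lookup table the 8088 side would actually consume. The point is only the shape of the process: all the cost lives in the offline build, while runtime use is a single indexed read.

```python
# Toy illustration of table-generation asymmetry (all names hypothetical).
# Offline: an expensive search for a codebook (naive 1-D k-means here,
# a stand-in for a real NP-hard vector-quantization codebook search).
# Runtime: quantizing a value is just a memory lookup.
import random

random.seed(42)

def build_codebook(samples, k, iterations=20):
    """Offline step: hours/days in the real problem; brute force here."""
    centroids = random.sample(samples, k)
    for _ in range(iterations):
        buckets = [[] for _ in range(k)]
        for s in samples:
            # Assign each sample to its nearest centroid (the slow part).
            nearest = min(range(k), key=lambda i: abs(s - centroids[i]))
            buckets[nearest].append(s)
        centroids = [sum(b) / len(b) if b else centroids[i]
                     for i, b in enumerate(buckets)]
    return centroids

def build_table(codebook, domain=256):
    """Precompute the nearest codebook entry for every possible input byte."""
    return [min(range(len(codebook)),
                key=lambda i: abs(v - codebook[i])) for v in range(domain)]

# Offline, expensive:
samples = [random.randrange(256) for _ in range(2000)]
codebook = build_codebook(samples, k=16)
table = build_table(codebook)

# Runtime, a few cycles: one lookup replaces the whole nearest-neighbor search.
value = 200
quantized = codebook[table[value]]
```

In the real version, `build_codebook` is where the gigs of memory and multiple cores would go; the table it emits is all the 4.77MHz side ever sees.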
“So pick up a new language, ya moron!” is what all of you just screamed at me. I know, I know. But this is what kills me: It would take me at least a month to pick up a new language, learn its IDE, learn its visual form/component system, and, inevitably, learn how to debug my code when it all goes pants. I’ve been putting it off for at least a decade. I can do it… but it would take that month to ramp up to full speed! And (said in Veruca Salt cadence) I want to code up my idea NOW!
I’m frozen. What a stupid, stupid first-world problem.
I need an algorithm for getting over the hump that can be solved in polynomial time.