January 2, 2008 3:48 AM CST by psilord in category Bitter Words

A Viewpoint of Ashes

One day recently I realized that I, the product of 4.5 billion years of evolution, use my portion of the vast and bountiful power of the Sun to very carefully order magnetic field orientations on hard drives that I will never see in my life. Sometimes I might order a few dozen thousand of them in one day, a pretty good number for pushing around little things that no one can see and that are understood only by consensus.

Needless to say, and obvious to those who know me, this is depressing. I've decided that I'm going to take a break from hobby programming for a while. And by hobby programming, I mean software for software's sake. Basically, any idiot can write code these days, and most idiots do, I'm afraid. Computing is so cheap at this point that some drooling moron can poke around, find a bunch of badly written libraries, and cobble something together that never really works well but always works just well enough as long as you're looking at it. The world has proven to us that this is good enough.

Frankly, most of the good stuff has been written already, and where it hasn't been, it is only because the disturbingly incremental cool thing would take thousands of man-years to write, and only companies with lots of people and resources can support creating it.

What I just wrote might make it appear that I'm some naive idiot (and maybe I am, but since there is no comment feature for this blog, I'll never know). However, if one notices that an incalculable number of man-hours, probably billions, went into XML for what is, as far as I can tell, a shiny replacement for grouping things with open and close parentheses, one gets a bit sick to one's stomach. How many quantum leaps of human advancement have we so carefully avoided by spending our resources so unwisely?
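
To put that comparison in concrete terms, here is a rough sketch (Python, with a little record whose names I invented purely for the illustration) of the same tree written both ways:

# The same nested record written twice: once as XML markup, once as a
# plain parenthesized grouping (a nested Python tuple here). The names
# "book", "title", and "author" are invented for this example.

as_xml = """
<book>
  <title>On Growth and Form</title>
  <author>D'Arcy Thompson</author>
</book>
"""

as_parens = ("book",
             ("title", "On Growth and Form"),
             ("author", "D'Arcy Thompson"))

# Both encode the identical tree; one simply demands far more tooling
# to read, write, and validate.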

Much of Computer Science is like the red-headed stepchild in the family of Mathematics. The part that is not is detailed in Knuth's books and in the work of other founding fathers like Church and Turing. Even so, at the practical, everyday level, whenever a new language is designed, someone creates a different syntactic form of, say, a hash table, with a slightly different interface. It isn't as if, when the Jacobian determinant was invented, you had to change the syntax and slightly alter the meaning of addition all the way down to the core. Even complex numbers, which did alter addition, merely extended it into a new abstraction.

For a while I spent time looking at and skimming a lot of the computational science papers from 1940 to 1975. It is surprising how many of the modern things people re-think up in some "not invented here" orgasm were thought of, expanded upon, and carried to their logical conclusions over 50 years ago. Honestly, it was simply the rise of fast machines that made the forward-thinking ideas and implementations of those faraway times possible today, where they appear "new" to the uninitiated. The web does a lot to disseminate said information to the gibbering masses, but sadly, it appears they don't read it.

Having grown tired of this problem of thinking of cool software that would take me tens of years to type in during my free time, not really finding anyone who wants to help or has the ability to, and never having enough money to hire people to type it in for me, I've decided to think about something else for a while.

Lately, I've arrived at the opinion that the rise of digital computers based upon Boolean mathematics has stunted the field of Artificial Intelligence. A bold claim, to say the least. But if you look at the roots of AI, cognitive science, and even the underpinnings of the mathematics with which we try to describe how the world behaves, you will eventually realize that the world's "nominal operating state" is not Boolean arithmetic. The world is made up of frequencies, periodic waves, probability fields, differential equations on continuous manifolds, and many other analog constructs. Digital computers can deal with these things, but at a terrible cost in simulation time and a loss of representational complexity. Digital computers are rational computers, but the world is simply irrational.
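
To make that cost concrete, here is a toy sketch (mine, in Python, with coefficients, step size, and step count invented purely for illustration) of a digital machine grinding through a hundred thousand discrete steps to approximate one small continuous thing, a damped oscillator, that a handful of op-amp integrators would trace out continuously:

# Digital approximation of a continuous system: a damped oscillator,
# x'' + c*x' + k*x = 0, stepped forward in discrete time.
# The coefficients, step size, and step count are arbitrary values
# chosen only for this illustration.

def simulate(k=1.0, c=0.2, dt=0.001, steps=100_000):
    x, v = 1.0, 0.0           # initial position and velocity
    for _ in range(steps):    # 100,000 steps for 100 seconds of model time
        a = -k * x - c * v    # acceleration from the differential equation
        v += a * dt           # crude discrete integration (semi-implicit Euler)
        x += v * dt
    return x, v

print(simulate())             # two numbers an analog circuit would have
                              # produced as continuously evolving voltages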

So, I've decided to think about, learn about, and begin building analog computers, and to meld them with the hindsight of the last few decades of computing algorithms and theory, which arose while analog computers were out of favor.

These computers, at the point of their mainstream abandonment in the 1970s, were left in a state where they were being built to solve systems of differential equations and feedback control problems. They still exist today, controlling things like oil refineries and radios, but in my opinion they've gotten into a rut in their application space. However, since I personally think systems of differential equations and control theory are fundamental components of any meaningful AI, it is pragmatic to understand exactly how to implement complicated mathematics on analog computers. More importantly, most of this computation happens in what amounts to real time. This appeals to me so much that it is worth spending a significant amount of my life to see whether analog computers can somehow be "evolved" with modern ideas.
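
To give a flavor of the kind of feedback control problem I mean, here is a minimal sketch (again mine, in Python, with the gain, setpoint, step size, and plant all made up for illustration) of a proportional controller pushing a first-order plant toward a setpoint in discrete steps; on an analog computer the same loop is a summer or two, a potentiometer, and an integrator, with no clock and no step size, just running:

# A toy feedback loop: a first-order plant dy/dt = u - y driven toward a
# setpoint by a proportional controller u = Kp * (setpoint - y).
# Gain, setpoint, step size, and step count are made-up illustration values.

def control_loop(setpoint=1.0, Kp=2.0, dt=0.01, steps=1000):
    y = 0.0                       # plant output
    for _ in range(steps):
        error = setpoint - y      # compare output against the reference
        u = Kp * error            # proportional control law
        dydt = u - y              # first-order plant dynamics
        y += dydt * dt            # discrete integration step
    return y

print(control_loop())             # settles near Kp/(Kp+1) of the setpoint, about 0.67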

And if I fail and nothing comes of it, so what... It is worth trying.

End of Line.
