
Angry Unix Programmer

Segmentation fault
Debugger process has died
Sanity breaking.
January 2, 2008 3:48 AM CST by psilord in category Bitter Words

A Viewpoint of Ashes

One day recently I realized that I--the product of 4.5 billion years of evolution--use my portion of the vast and bountiful power of the Sun to very carefully order magnetic field orientations on hard drives that I will never see in my life. Sometimes, I might order a few dozen thousand of them in one day--a pretty good number for pushing around little things that no one can see and that are only understood by consensus.

Needless to say, and obvious to those who know me, this is depressing. I've decided that I'm going to take a break from hobby programming for a while. And when I say hobby programming, I mean software for software's sake. Basically, any idiot can write code these days, and most idiots do, I'm afraid. Computing is so cheap at this point that some drooling moron can poke around to find a bunch of badly written libraries and cobble something together which never really works well, but always works just well enough as long as you're looking at it. The world has proven to us that this is good enough.

Frankly, most of the good stuff has been written already, and if not, it is only because it'll take thousands of man years to write the disturbingly incremental cool thing, and only companies with lots of people and resources can support creating it.

What I just wrote might make it appear that I'm some naive idiot (and maybe I am, but since there is no comment feature for this blog, I'll never know). However, if one notices that an incalculable number of man hours--probably billions--went into XML for what is, as far as I can tell, a shiny replacement for grouping things with open and close parentheses, one gets a bit sick to one's stomach. How many quantum leaps of human advancement have we so carefully avoided by spending our resources so unwisely?

Much of Computer Science is like the red-headed stepchild in the family of Mathematics. The part that is not is detailed in Knuth's books and in the work of founding fathers like Church and Turing. But even so, at the practical, everyday level, whenever a new language is designed, a different syntactic form with a slightly different interface for, say, a hash table gets created. It isn't like when the Jacobian determinant was invented you had to change the syntax and slightly alter the meaning of addition all the way down to the core. Even complex numbers, which did alter addition, merely extended it into a new abstraction.

For a while I spent time looking at and skimming a lot of the computational science papers from 1940 to 1975. It is surprising how many of the modern things which people re-think up in some "not invented here" orgasm were thought of, expanded upon, and moved to their logical conclusions over 50 years ago. Honestly, it was simply the rise of fast machines which made the forward thinking ideas and implementations of those far away times possible today, and which makes them appear "new" to the uninitiated. The web does a lot to disseminate said information to the gibbering masses, but sadly, it appears they don't read it.

Having grown tired of this problem of thinking of cool software which would take me tens of years to type in during my free time, not really finding anyone who wants to or has the ability to help, and not ever having enough money to hire people to type it in for me, I've decided to think about something else for a while.

Lately, I've arrived at the opinion that the rise of digital computers based upon boolean mathematics has stunted the field of Artificial Intelligence. A bold claim, to say the least. But if you look at the roots of AI, cognitive science, and even the underpinnings of the mathematics with which we try to describe how the world behaves, you will eventually realize that the world's "nominal operating state" is not boolean arithmetic. The world is made up of frequencies, periodic waves, probability fields, differential equations on continuous manifolds, and many other analog constructs. Digital computers can deal with these things, but at a terrible cost of simulation time and loss of representational complexity. Digital computers are rational computers, but the world is simply irrational.

So, I've decided to think about, learn about, and begin building analog computers, and to meld them with the hindsight of the last few decades of computing algorithms and theory which arose while analog computers were out of favor.

These computers, at the point of their mainstream abandonment in the 1970's, were left in a state where they were being created to solve systems of differential equations and feedback control problems. They still exist today and control things like oil refineries, radios, and many other things, but in my opinion they've sort of gotten into a rut in their application space. However, since I personally think systems of differential equations and control theory are fundamental components of any meaningful AI, it is pragmatic to understand exactly how to implement complicated mathematics in analog computers. More importantly though, most of this computation happens in what amounts to real time. This appeals to me so much that it is worth spending significant amounts of my life to see if analog computers can somehow be "evolved" with modern ideas.

And if I fail and nothing comes of it, so what... It is worth trying.

End of Line.

September 25, 2007 6:13 PM CDT by psilord in category Empty Space

A Slice Of Mind

When you look at this, it is the mental equivalent of biting into an apple and only seeing half a worm.

APL Sudoku

[9x9 sudoku grid rendered with the APL glyphs ⌹ ⍟ ⍋ ⍕ ○ ⍱ ⍉ ↑ ↓ in place of the digits 1-9]

I have the solved sudoku. Later, if I haven't forgotten, I'll put it up.

Also, thanks to Alan De Smet for being so disgusted at the original sudoku HTML that he made some CSS classes for me to use for the sudoku grid.

End of Line.

December 16, 2006 10:03 PM CST by psilord in category Useless Ideas

A Spline in My Foot

I have always liked the concept of splines. I think they are one of the coolest forms of math simply because they are so generally useful and the pretty picture coefficient for them is quite high.

A while ago, I used variants of them for interpolating between regions of stability in chaotic functions. I needed an interpolation such that in between the regions there was good behavior instead of the wild flailing of pure chaos itself. Here is an example of a variant I made.

You can find much better descriptions on the web about Bezier curves than I am going to give here. Suffice it to say, I will simply write the two equations of note and move on. The first one produces a collection of functions called the Bernstein polynomials:

B_{i,n}(t) = \binom{n}{i} \, t^i (1 - t)^{n - i}, \qquad i = 0, 1, \ldots, n

The next one is the Bezier curve function itself in one dimension:

B(t) = \sum_{i=0}^{n} B_{i,n}(t) \, P_i, \qquad t \in [0, 1]

The Bezier function takes a set of control points which dictate the order of the curve (quadratic, cubic, quartic, etc.) depending on how many control points there are. They are specified in the P_i variable of the above equation. This is the part of the equation with which we are going to have fun. Let's replace each control point with a function which produces control points instead. That's far more fun.

B(s, t) = \sum_{i=0}^{n} B_{i,n}(t) \, F_i(s)

The variable s needs a little bit of explanation. It is kind of like a parametric variable for F_i(s). F must produce a control point of some arbitrary dimensionality according to the designer's wishes for the dimensionality of the curve. For the purposes of this discussion, we will assume F produces 3-dimensional control points.

Some examples of F could be:

[Example Function 1] [Example Function 2] [Example Function 3]

An interesting thing to see is that in this case, s is used as the domain for the little function in the y axis of the generated control point. So, all three of these curves act like they have been created in an x,y plane, and the z component is for placing these individual x,y planes in "parallel" to each other. Of course, this is quite a restrictive use of these functions, but it doesn't have to be that way. It just happens to be for this example. Also, there may be many more than 3 functions, which just increases the order of the curve.

I bet you are wondering what something like that would look like if a computer program drew it. Fear not: instead of doing something useful with my life like making money or helping the sick, I've written one so you can see what kind of patch surface the above example would create in 3d given varying values of s and t. The white lines are the individual curves as t varies, and each white line is on account of s varying. The green dots are locations on the "control functions", and the green lines are visualization lines to show the typical lines associated with the control points of an individual curve. The rectangular box in the middle bottom is a scale that goes from -2*PI to 2*PI in the x axis, and -1 to 1 in the y and z axes. The coordinate system is: left to right is negative to positive x, bottom to top is negative to positive y, and behind the screen to in front of the screen is negative to positive z. The various functions have been scaled a bit to make them pretty (the scalings are not reflected in the above equations). [Example of Above Equations]
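If you want to play with this yourself, here is a minimal sketch in C++ of the patch evaluation. Caveat: the three control functions below are stand-ins I picked for illustration (a sine, a parabola, and a cosine living in parallel planes), not necessarily the ones in the picture, and the pretty-making scalings are left out just like in the equations.

// bezier_patch.cpp -- a Bezier curve whose "control points" are produced
// by functions F_i(s) instead of being fixed points.
// Build: g++ -o bezier_patch bezier_patch.cpp -lm
#include <cmath>
#include <cstdio>

struct Point { double x, y, z; };

// Bernstein basis polynomial B_{i,n}(t) = C(n,i) * t^i * (1-t)^(n-i)
static double bernstein(int i, int n, double t) {
    double c = 1.0;
    for (int k = 1; k <= i; k++)      // binomial coefficient C(n,i)
        c = c * (n - k + 1) / k;
    return c * std::pow(t, i) * std::pow(1.0 - t, n - i);
}

// Stand-in control functions: each lives in an x,y plane, and the z
// component just places those planes in parallel.
static Point F0(double s) { return { s, std::sin(s), -1.0 }; }
static Point F1(double s) { return { s, 0.25 * s * s,  0.0 }; }
static Point F2(double s) { return { s, std::cos(s),   1.0 }; }

// B(s,t) = sum_i B_{i,n}(t) F_i(s); three functions here, so n = 2 and
// each individual curve in t is quadratic.
static Point patch(double s, double t) {
    Point f[3] = { F0(s), F1(s), F2(s) };
    Point p = { 0.0, 0.0, 0.0 };
    for (int i = 0; i < 3; i++) {
        double b = bernstein(i, 2, t);
        p.x += b * f[i].x;  p.y += b * f[i].y;  p.z += b * f[i].z;
    }
    return p;
}

int main(void) {
    // Each fixed s produces one of the "white line" curves as t varies.
    for (double s = -2.0 * M_PI; s <= 2.0 * M_PI; s += M_PI / 8.0) {
        for (double t = 0.0; t <= 1.0; t += 0.125) {
            Point p = patch(s, t);
            std::printf("%f %f %f\n", p.x, p.y, p.z);
        }
        std::printf("\n");  // blank line between curves, gnuplot-friendly
    }
    return 0;
}

Feeding the output to gnuplot's splot with lines gives a wireframe in the same spirit as the picture above.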

Interesting, no? There are some things I haven't yet fully figured out about the surface generated using this method. For example, what kind of a surface is it? Is it continuous, smooth, differentiable? To what extent? An awesome example to put these questions in a sharp light comes if you replace F2 with (x, tan(x), 0), while F1 and F3 are simply straight horizontal lines in the x,y plane. Of course, there are some dirty scalings going on, but I've left them (and the green visualization lines) out of the examples because they clutter up the equations. So, here is what a stranger patch may look like: [Example Using Tan(x)]

See what I mean about characterizing what kinds of patches these are? They look like they should be integrable and have partial derivatives which may describe all sorts of good things like slopes and gradients and whatnot, but due to the tainting of the Bezier method, that might not be true. This might seem like a limitation, but I don't think it is. For example, F may be anything: an iterated function system, a set of differential equations, other parametric curves, normal functions....

Of course, the next step would be to design some splines to interpolate through each function's control point given some more information, like partial derivatives at said control point, etc, etc, etc. This would make it very interesting to have, say, a sine wave for the first function, an iterated function converging to a fern for the second, and a quadratic equation for the third. You could have a curve interpolate through each one, which might be interesting for a lot of reasons, most of them pretty to look at. As for whether or not that would form a surface, hmm....

I haven't gotten to that part yet. Might never. Depends on how bored I am.

End of Line.

December 15, 2006 12:56 AM CST by psilord in category Idiocy

Box of Oreos... Check

C++ can die screaming in a fire, preferably with the viciously beaten body of Make wedged underneath it.

I spent three hours today debugging a crazy bug where a value held in a class field variable was correct in one function of the class and garbage in another function of the same class, with no assignments or buffer overflows in between. I wrote all sorts of debugging code and did nasty tricks with gdb to find out how that damn variable was changing from one function to the next.

When I found the source of the bug, it was one of those moments where I understood that I was dangerously straining against the clear and frail veil of sanity. I was absolutely positive that the lush verdant fields so tantalizingly close to my grasp were infinitely better than the burned out weeds at my feet. Upon "The Realization", the headache which pierced through my brain like an old man's car through a farmer's market splintered my will to live into pieces and scattered them across the permanently salted fields of primal rage.

It was a class declaration mismatch in object files. This happens when you change the structure of a class declaration, but only some of the objects using that class get recompiled. Some object files are using the old declaration of the class and others are using the new declaration. Each object file bakes the field offsets of the declaration it saw into its code, so the offsets are correct in some object files and wrong in others, leading to corruption at nothing more than the call of a function. I'd say that this bug is actually worse than a memory corruption bug that is outside the ability of a memory checker program--because the declaration skew leaves no incorrect source code to discover upon inspection.
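If you want to see the mechanism in miniature, here is a contrived one-file illustration--not my actual code; in the real bug the two layouts came from two versions of the same header seen by different object files:

// offset_skew.cpp -- a one-file illustration of declaration skew.
// In the real bug, these two layouts are the SAME class as seen by two
// object files built against different versions of one header.
#include <cstddef>
#include <cstdio>

struct ThingOld {       // what the stale a.o was compiled against
    int value;          // 'value' lives at offset 0
};

struct ThingNew {       // what the freshly built b.o was compiled against
    int padding;        // a newly added field...
    int value;          // ...shifts 'value' to offset 4
};

int main(void) {
    std::printf("old offsetof(value) = %zu\n", offsetof(ThingOld, value));
    std::printf("new offsetof(value) = %zu\n", offsetof(ThingNew, value));
    // Member offsets are baked in at compile time. Code in the stale
    // object writes 'value' at offset 0 while code in the fresh object
    // reads it at offset 4, so the value "changes" between two methods
    // of the same object even though no line of source is wrong.
    return 0;
}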

This was a classic case of a particular failure of Make. I typed make in a subdirectory of the workspace, and it didn't go back out and recompile the object file in another directory which should have been updated. Since I thought no one else was using the class declaration, it didn't occur to me to go back a directory and recompile. However, it shouldn't have to occur to me...

I looked at some make replacements and found a couple of note. Between SCons and Jam I'd choose SCons because the dependency language is also Python itself. Jam's language does look pretty rich, but having full Python (and its libraries) available to you edges SCons out over Jam.

However, it looks like SCons's autoconf-like functionality has been hastily grafted on and doesn't look too complete. If the functionality were extended to be at least a superset of autoconf, then SCons would be a pretty powerful tool. Sadly, autoconf's solution of producing a configure script really is a good one, because all UNIX machines everywhere have a Bourne shell on them. SCons is nice, but will some old and crusty HP-UX machine be able to deal with it? Probably not.

I wonder if there is an SConstruct to configure translator....

End of Line.

November 20, 2006 1:48 AM CST by psilord in category Artificial Intelligence

N * 1 = N

On Identity.

Identity may be defined as those inputs which statistically--directly or indirectly--correlate to the implementation of the AI available in the pattern space, any statistical information processing system which models said implementation, and any predictive modification of the pattern space which affects the implementation.

The obvious ramification of this is that the AI's modifications to the pattern space may also modify its own implementation. This also implies that the AI can not only gauge how well a self-modification may have been executed, but may goal search to find the best modification available given the limited context of its senses and history.

Reification of this idea into actual practice also allows "hard coded" statistical overrides for the manipulation of the AI's implementation towards a goal. For lack of a better definition, especially one that wouldn't take up 10 pages of scientific explanation, this hard coded system bounding the manipulation of one's own implementation may be regarded as the modeling of pain.

The pain model starts off as a hard coded response system which checks to see if certain inputs (think of a continuous region, not discrete inputs) from the pattern space, or potential predictions of modification of the pattern space, would result in pushing the parameters of the implementation out of stability. The system is initially hard coded because when the pain signal becomes intense "enough", hard coded modifications of the pattern space are mixed into the entire statistical processing algorithm of the AI, forcing it to move out of the statistical context which is causing the pain. After the pattern space stimulus is removed or lessened, the usual information processing path is again allowed to control the information flow through the system.

However, the true power of the pain model becomes evident when the pain signal itself becomes a new dimension in the pattern space. This allows statistical prediction of pattern space modifications which would affect the pain intensity signal without having to actually perceive the pain. Of course, minimizing the pain signals would be part of the designer's goal in the construction of the AI and the previous experience of pain would be the guide to learning pattern space modifications to specifically avoid those experiences.
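To make the control flow concrete, here is a toy sketch in C++. Everything in it is invented for illustration--the names, the thresholds, the one-parameter "implementation"--and a real system would put a statistical learner where I've put three lines of arithmetic:

// pain_model.cpp -- a toy of the pain model's control flow, nothing more.
#include <cmath>
#include <cstdio>

struct Agent {
    double param = 0.0;   // stand-in for the AI's "implementation"
    double pain  = 0.0;   // pain intensity, also fed back as an input

    // Hard coded stability bound: pain grows as the implementation
    // parameter is pushed toward the edge of its stable region.
    double sense_pain() const {
        return std::fmax(0.0, std::fabs(param) - 0.5);
    }

    void step(double proposed_change) {
        pain = sense_pain();
        if (pain > 0.4) {
            // Hard coded override: when the pain is intense enough,
            // ignore the learned proposal and retreat toward stability.
            param -= 0.25 * (param > 0.0 ? 1.0 : -1.0);
        } else {
            // Usual path: the statistical machinery drives the change,
            // with the current pain level available as one of its inputs.
            param += proposed_change * (1.0 - pain);
        }
    }
};

int main(void) {
    Agent a;
    for (int i = 0; i < 12; i++) {
        a.step(0.2);  // a learner that always wants to push param upward
        std::printf("step %2d  param=%5.2f  pain=%4.2f\n",
                    i + 1, a.param, a.pain);
    }
    return 0;
}

Run it and you can watch the learner push toward the stability boundary, get shoved back by the override, and settle into oscillating around the edge--the hard coded system keeping the "implementation" alive while the learner does its thing.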

The upshot of the pain model skeleton is that arbitrary (not just pain) and non-hard coded models may be imposed upon the implementation by the AI itself. For example, if the AI had been designed to have a sense of symmetry and to prefer symmetry over non-symmetry, then the AI may modify the pattern space in order to produce symmetry so as to rate the statistical information processor's inputs better than if those modifications had not been performed. Since the pattern space includes the implementation, it is not inconceivable that the implementation may be modified to conform to the model of symmetry, if such modifications do not exceed the designer's limits of the pain model.

End of Line.

November 2, 2006 1:41 AM CST by psilord in category Codecraft

Back When I Was Smart

So a good friend, Alan, wandered into my office today babbling about some analysis I had written about something or other in the late 90's or early 2000's. Knowing Alan for many years, I'm happy I got at least that much information out of him. After a few minutes of opening mental doors I thought long sealed by tequila and vomit, I stumbled in the dark over what the hell he had been talking about.

Primes.

Yes, those numbers which have driven people mad as they've ground their lives into meaningless dust trying to understand them. Those numbers which have broken people's souls and left them to rot in the hell of misery--thrice damned by the alienation of friends, family, and reason.

It was back in 2003 when another friend of mine named Mike Turner had asked a simple question about the Chomsky classification of a program he'd seen (which Alan had been referencing in my office):

On Sun, Feb 16, 2003 at 04:22:19PM -0800, Michael Turner wrote:
> Most people are familiar with the prime number 'regular' expression
> tester in perl.
> 
> perl -e 'print "PRIME" if (1 x shift) !~ /^(11+)\1+$/' #
>  (where '#' is any number to be tested)
>
> It can eaisly be proven that this is not a true regular language.
> My question is - what type of language is it?
> 
> The Chomsky hierarchy of languages (with ammedments) has several
> types of lanauages and various machines that recognize them:

etc. etc. etc.

I stared at that tiny perl program like a portrait of a person stares at the wall across from it. I was fascinated by it. I just couldn't believe that it would find primes. But test after test, it magically decided correctly for every number I gave it--well, until the perl interpreter segfaulted....

How in the hell did it work? I just had to know. Here is my analysis, which I've extensively edited from the original to explain some concepts better.

The Theory of Operation for:

perl -e 'print "PRIME" if (1 x shift) !~ /^(11+)\1+$/' #

Let's start off with a few substitutions for what happens when you run it with say, the number 15:

print "PRIME" if 111111111111111 ~! /^(11+)\1+$/;

So, each 1 on the left of the !~ represents exactly that: a one. The fact that there are fifteen of them implies a "1 group by 15 ones" grouping of those ones. This style of grouping will come back to help us out later.

What about the right hand side, the funny looking regex? It means: greedily match the (11+) against the input, and then, after you've produced the largest (11+) group, copy it next to itself (this is what the backreference does). In particular, pay attention to the + after the \1 backreference. This ensures that there will be "enough" groups of ones to potentially match the left hand side. Later in the examples, you'll see how this alters one of the factors being searched for in the target number. For the number 15, this ends up starting at (111111111111111)(111111111111111). The left group is the greedy match of the first (11+) in the regex, and the right group is due to the \1 backreference.

Why (11+) and not something else? It is because the (11+) represents a factor of 2 or more (remember, the ones actually stand for ones, and any collection of ones in a ( ) group represents an actual integer). The number of (11+) groups, which starts at two due to the backreference, may increase during the backing off of the greedy matcher via the \1 backreference, which will always add another group of the current size when it is determined that the greedy matcher has backed off too far and there aren't enough groups of ones to match the set of ones to the left of the !~.

Specifically:

(11+)           represents this: (1 + 1 + [1 + 1 + ...]) >= 2
(11)(11)        represents this: (1 + 1) + (1 + 1) == 4 or (2 + 2) = 4
(1111)          represents this: (1 + 1 + 1 + 1) == 4
(111)(111)(111) == 3 + 3 + 3 = 9 == 3 * 3 = 9.

So you see that, say the number 15, can be represented like this:

(111111111111111)          1 group by 15 ones
(11111)(11111)(11111)      3 groups by 5 ones
(111)(111)(111)(111)(111)  5 groups by 3 ones

Now, a trick we must utilize:
2 * 2 = 4 means the same as 2 + 2 = 4. 

The trick is used for things like (111)(111)(111)(111)(111). This is not only 5 groups by 3 ones (5 * 3), but it is also 3 + 3 + 3 + 3 + 3 = 15. The latter form makes much more sense if you are representing each number in a group as a collection of ones.

A very important thing to realize is that in the regexp language the ( ) operator is merely a grouper, so (111)(111) is equivalent to (111111). Of course, this isn't true if you are referencing a specific match, but in this case we are not.
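As an aside, you don't need Perl to play along at home. std::regex speaks enough ECMAScript to handle the backreference, so here is a quick port of the one-liner (be warned: like the Perl version, it gets catastrophically slow as the numbers grow--Perl outright segfaulted on me):

// unary_prime.cpp -- the one-liner's test, ported to std::regex.
// A number n is composite exactly when a string of n ones matches
// (11+)\1+ in its entirety.
#include <cstdio>
#include <regex>
#include <string>

static bool is_prime(unsigned n) {
    // regex_match already anchors at both ends, so ^ and $ are implied.
    static const std::regex composite("(11+)\\1+");
    if (n < 2) return false;
    return !std::regex_match(std::string(n, '1'), composite);
}

int main(void) {
    for (unsigned n = 2; n <= 30; n++)
        if (is_prime(n))
            std::printf("%u is PRIME\n", n);
    return 0;
}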

Now some examples. In these examples, you'll see the initial state of the regexp for a particular input number, and then watch the backoff of the greedy + operator and the addition of the currently sized group via the backreference. As you get to see the algorithm play out, you'll gain a feel for how it detects primes. At each step, check to see if the pattern specified equals the pattern desired; if not, consider it false and either backtrack, or, if there is finally nowhere left to backtrack to, declare the whole thing false:

Initial Number:        3 
Initial Number Group:  111

Regex Algorithm Start:

(111)(111) false, backtrack
(11)(11) false

Stop backtracking, each group can't get smaller than initially specified (11)!

Input Number             Final Groups
------------             ------------
(111)                    (11)(11)
 3                        4

They don't match so number is prime.

Factors Discovered: none

Initial Number:        4 
Initial Number Group:  1111

Regex Algorithm Start:

(1111)(1111) false, backtrack
(111)(111)   false, backtrack
(11)(11)     true

Stop backtracking, each group can't get smaller than initially specified (11)!

Input Number             Final Groups
------------             ------------
(1111)                   (11)(11)
 4                        4

They match so number is composite.

Factors Discovered: 2 * 2

Initial Number:        5 
Initial Number Group:  11111

Regex Algorithm Start:

(11111)(11111) false, backtrack
(1111)(1111)   false, backtrack
(111)(111)     false, backtrack and activate backreference
(11)(11)(11)   false

Stop backtracking, each group can't get smaller than initially specified (11)!

Input Number             Final Groups
------------             ------------
(11111)                  (11)(11)(11)
 5                        6

They don't match so number is prime.

Factors Discovered: none

Initial Number:        6 
Initial Number Group:  111111

Regex Algorithm Start:

(111111)(111111) false, backtrack
(11111)(11111)   false, backtrack
(1111)(1111)     false, backtrack
(111)(111)       true

Input Number             Final Groups
------------             ------------
(111111)                 (111)(111)
 6                        6

They match so the number is composite.

Factors Discovered: 2 * 3

Initial Number:        7
Initial Number Group:  1111111

Regex Algorithm Start:
Notice how the backtracking is really coming out of each group...

(1111111)(1111111) false, backtrack
(111111)(111111)   false, backtrack
(11111)(11111)     false, backtrack
(1111)(1111)       false, backtrack and activate backreference
(111)(111)(111)    false, backtrack and activate backreference
(11)(11)(11)(11)   false

Stop backtracking, each group can't get smaller than initially specified (11)!

Input Number             Final Groups
------------             ------------
(1111111)                (11)(11)(11)(11)
 7                        8

They don't match so the number is prime.

Factors Discovered: none

Initial Number:        8
Initial Number Group:  11111111

Regex Algorithm Start:

(11111111)(11111111) false, backtrack
(1111111)(1111111)   false, backtrack
(111111)(111111)     false, backtrack
(11111)(11111)       false, backtrack
(1111)(1111)         true

Input Number             Final Groups
------------             ------------
(11111111)               (1111)(1111)
 8                        8

They match so the number is composite.

Factors Discovered: 2 * 4

What about a more interesting composite number that doesn't have 2 for any factor?

Initial Number:        9
Initial Number Group:  111111111

Regex Algorithm Start:

(111111111)(111111111) false, backtrack
(11111111)(11111111)   false, backtrack
(1111111)(1111111)     false, backtrack
(111111)(111111)       false, backtrack
(11111)(11111)         false, backtrack and activate backreference
(1111)(1111)(1111)     false, backtrack
(111)(111)(111)        true

Input Number             Final Groups
------------             ------------
(111111111)              (111)(111)(111)
 9                        9

They match so the number is composite.

Factors Discovered: 3 * 3

In pseudo-math that is closer to how people think, here is the explanation of number 9 with a different notation:

9 + 9 = 18 == 9? NO
8 + 8 = 16 == 9? NO
7 + 7 = 14 == 9? NO
6 + 6 = 12 == 9? NO
5 + 5 = 10 == 9? NO
oops 4 + 4 is less than 9, so add another 4 factor!
4 + 4 + 4 = 12 == 9? NO
3 + 3 + 3 = 9 == 9?  YES
3 groups * 3 ones in each group == 9? YES therefore COMPOSITE

And now here is a larger number for fun.

Initial Number:        15
Initial Number Group:  111111111111111

Regex Algorithm Start:

(111111111111111)(111111111111111) false, backtrack
(11111111111111)(11111111111111)   false, backtrack
(1111111111111)(1111111111111)     false, backtrack
(111111111111)(111111111111)       false, backtrack
(11111111111)(11111111111)         false, backtrack
(1111111111)(1111111111)           false, backtrack
(111111111)(111111111)             false, backtrack
(11111111)(11111111)               false, backtrack and activate backreference
(1111111)(1111111)(1111111)        false, backtrack
(111111)(111111)(111111)           false, backtrack
(11111)(11111)(11111)              true

Input Number             Final Groups
------------             ------------
(111111111111111)        (11111)(11111)(11111)
 15                       15

They match so the number is composite.

Factors Discovered: 3 * 5

So, it appears to me that the regexp deconstructs the number to see if it can be constructed through the addition of two identical numbers (m + m, meaning 2 is a factor, therefore composite), or, if that fails, whether it can be constructed by a factorization where the number of ones in a group multiplied by the number of groups produces the number. I predict this algorithm will always find the minimum number of factors required to determine compositeness. And the minimum number of factors found for a composite number is always two, since one of the factors is the number of groups and the other is the number of ones in a group. I also predict the smaller factor will be the number of groups.

So, if we used 30, then the first match found will be 15 * 2, and never something like 2 * 3 * 5.

If we used 50, then the first match found will be 25 * 2, and never something like 2 * 5 * 5.

Note: I've retconned this summary since the last time you read it. I'm some nobody with some dumb blog nobody reads so I don't feel bad about it. Though, thinking about it, I don't think I'd feel bad retconning this under any circumstances.

So what happens is that this algorithm basically divides the number by 2 at the first step, by 3 at the first \1+ invocation, then by 4, 5, 6, 7, and so on, with the number of groups being one of the factors and the number of ones in each group being the other factor. Since the algorithm really starts with 2*n due to the \1+ aspect of the regex, many steps are wasted performing backtracks even before the grouping system starts becoming mathematically effective.
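You can check that reading, and the predictions above, by mirroring the search order in ordinary code: the greedy group shrinks from n/2 ones downward, so the first group size that tiles the input exactly is the largest proper divisor. A sketch, assuming my reading of the engine's backoff is right:

// backoff.cpp -- mirror the regex's search order for the first match.
#include <cstdio>

int main(void) {
    unsigned tests[] = { 9, 15, 30, 50 };
    for (unsigned n : tests) {
        bool composite = false;
        for (unsigned g = n / 2; g >= 2; g--) {  // ones per group, shrinking
            if (n % g == 0) {                    // the groups tile exactly
                std::printf("%u = %u groups * %u ones per group\n",
                            n, n / g, g);
                composite = true;
                break;
            }
        }
        if (!composite)
            std::printf("%u is prime\n", n);
    }
    return 0;
}

For 30 it reports 2 groups * 15 ones, and for 50 it reports 2 groups * 25 ones--the number of groups is the smaller factor, as predicted.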

End of Line.

October 2, 2006 2:18 AM CDT by psilord in category Empty Space

Human++

Science fiction, the good stuff anyway, goes to great lengths to reason about what will happen to our culture and civilization when stunning scientific advances allow us to alter our environment in profound ways. However, I've noticed that there isn't much science fiction which alters what it means to be human.

The problem is that most or all scientific advances in s.f. fall into the category of tools. Hyper drives, teleporters, cyber eyes, livers which metabolize alcohol or other poisons in 3 seconds or less, wormholes, space ships--all are just tools to be used by what usually appear to be normal human beings. Sometimes the human has something like telepathy, but even that is often portrayed as a magical cell phone or an action at a distance tool.

So again, why is there no science fiction which alters the definition of humanness?

One answer may be that once a human consciousness ceases to be human, and by definition becomes alien, we as humans--who experience these characters--can never scry the intentions of the alien characters. This disconnects them from us, and we don't care about their actions since they cannot be put into a pattern or predicted by us. As an example, in the s.f. show Babylon 5, the only true aliens were the Vorlons and the Shadows. All the other aliens' motivations and responses were decidedly human-like and could be easily categorized as something a human could have done. However, the Vorlons and Shadows, to my disappointment, by the end of the series had reasoning and motivations that were understandable by humans. It was a let down precisely because the intrinsic alienness--which generated much of their horror--of those two races vanished when human laws of consciousness applied to them. They just became everyday humans who happened to have a lot of power.

So, why am I talking about this stuff at all? I see some trends in our civilization which will provide us with the best tool that could ever exist. A tool which may alter our own implementation.

It will take its form as a set of tools. One of them will be the genome sequencing of all life. Another will be the understanding of protein function and folding. Another will be the unraveling of the information processing equations of the mammalian brain (you should see the work at IBM modeling a rat brain cell by cell, and Paul Allen's mouse brain project). Another will be nanotechnology at the atomic level for self assembly of structures which have interesting and strange properties allowed by the laws of nature but rarely seen in nature--negative refractive index lenses, for example. Another will be enough synchronized computing power to bootstrap the first of the new forms of life.

Once these technological fields are mature, humanity will begin, by instinct, to tamper with its own implementation. It will be slow, since the affront to the Identity of what it means to be Human will cause a deep fundamental resistance. At first there will be very small modifications to ourselves, things that can be viewed as tools and only tools--an organ replacement, a cancer treatment, a hearing replacement. Then it will move on to performance enhancement, longevity, increased efficiency of our bodies, and a wider selection of energy acquisition methods.

Finally, we will move to the brain itself. This will be the last, final, and most altering frontier of modification. One day, we will decide that having better mathematical ability is en vogue, craft a dozen grams of new neocortex--grown from our own cells, with a nourishing and cooling blood supply--and graft it into our heads. The bone will regrow naturally around it as if it were always there, and unimaginably complex mathematics will be available to us with enough training and time. Laws will be set up governing certain mathematical rules about how our consciousness is implemented so as to keep some sort of normalcy in the self interaction of our own species. We will endlessly tinker with ourselves until our consciousnesses are linked by high speed, high bandwidth electronic devices and our many brains function in a truly parallelized fashion.

We will become a collective, since that is the natural extension of the group behavior, culture, and language which has allowed the Idea of us to evolve and prosper throughout the countless millennia of the past. It will provide us with low latency planetary computational ability, and great waves of information integration will ride across the surface of the earth, passing through each new being for its little bit of processing. The intelligence, by my reckoning now even beyond the concept of decision making, will simply design whatever body and consciousness it needs to leave this planet, as it now has the information processing capability to understand what really needs to happen.

The thoughts, desires, and emotions of that hyper consciousness--and whether or not those concepts even have meaning anymore at that time to someone like me--call to me like the Siren's song.

End of Line.

July 27, 2006 8:57 PM CDT by psilord in category Unknown Kadath

0 / 1

So a good friend of mine had asked me to join LiveJournal for a decent purpose. I thought about it for a while, but will decline.

The reason is pretty simple. I'm waiting for my blog to die in people's eyes. I want the traffic on it to be so low for such a long time, that people go "Huh, I guess he didn't write anything interesting after all" and remove me from their RSS feeds, aggregators, and bookmarks. Only when I'm passed over, forgotten, and deleted, will I begin to write.

End of Line.

July 12, 2005 1:27 AM CDT by psilord in category Bitter Words

Hello, My Name is 867-53-0969.

The only reason you have any privacy at all is because someone hasn't bothered to look through your information yet.

This lack of privacy has even become a cottage industry. Phone companies charge you a fee (per month) to keep an unlisted number; credit card companies offer you account protection for when you lose your card (read: when some cracker steals 5 million records of customer data from their servers and sells them to the Russian Mafia). Supermarkets and bookstores give you "discounts" to use their niche cards.

People still scream about their loss of privacy, but really the effect of the discussion is the same as lamenting the loss of your favorite sports team. With wide adoption of things like gmail, the people have spoken, and they say to hell with privacy.

Sure, Google has a nice sentiment about privacy: no human ever reads your email, and they don't show or sell the information. And you know what? I believe them that they won't abuse your email. But think about something here.... They say you should NEVER delete your email, but search and create views instead? Great. What happens in 20 years when Google ends up in junk bonds and John Q. Tyrant, Inc. buys the assets? OOPS! Now you have 20 years of billions of people's emails to cross correlate to find all sorts of goodies, and a search engine powerful enough to find that Jack T. Smith once spoke against the State in .035 seconds. The fact that Jack T. Smith has a bad heart, known from all of his emails with his Doctor, will only provide a simple means of making it look like an accident....

Think about gmail, hotmail, and yahoo email services for a second. You've offloaded the most personal and private things about you to a third party company which will do what it takes to make a profit. They know the countries you've visited, the people with whom you communicate, whether you take your daughter to piano lessons, if your mother's sick--everything. You can make the same argument about having your mail kept at a place of employment or anywhere that isn't a direct delivery to your home computer. Hmm.... I speak of "they" like a schizophrenic speaks of "they", except, sadly, I'm sane. The real deal is that people have given up their privacy and in fact expect to give it up.

So, now that the cat's out of the bag concerning your entire personal history in many aspects of your life, might as well teach the cat to hunt mice. Know what I mean?

End of Line.

June 29, 2005 12:36 AM CDT by psilord in category Apocryphal

Lateness Is A State of Mind

Sorry for the chasm of time between posts.

I got married.

End of Line.

