Wednesday, August 19, 2009

Analog Machines

For many years I used to imagine building an analog computer. Unlike the binary system we are so committed to nowadays, it would deal in ranges of values other than just 1 and 0. Those values could be arbitrary smooth fractions between 0 and 1, or indefinitely many unsigned integers, either with however many digits they need. The computer would be made "somehow", even if only of "unobtainium." A value could be a voltage, or a frequency, or the shape of a waveform. Some kind of signal.

In an analog computer the instructions would be made of signals, and the signals would be entirely analog. There are only as many "pieces" of a signal as the receiver of the signal deems necessary. For instance, an "ear" would be sensitive to the low end of the spectrum, while a radio would be sensitive to a large part of the middle ranges. There would also be digital inputs, some coming from the outside and some from the inside. However, these are really just analog signals shaped to approximate square waves well enough to be interpreted as digital. They could be of as high or low a frequency as needed, up to a certain practical limit (super gamma X-ray frequencies would not be safe for use in any kind of computing device).

This computer must be capable of storing and retrieving signals. One such storage would be a light beam reflecting inside one of many tubes of mirrors. Electronically, it could be voltage amplitudes bouncing between two identical feedback amplifiers, each increasing the output amplitude only a fractional amount, just enough to make up for attenuation and other propagation losses. Time lapse would still occur, at a cost of one echo per connection. Such propagation effects already exist in current machines. The goal is to minimize the delay so that reality can be "modeled", or interacted with, in as realistic a manner as possible.
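To make the storage idea concrete, here is a toy digital sketch of that loop. The 2% loss per hop and the 5-volt stored value are numbers I made up for illustration; the point is only that a fractional gain exactly offsetting the attenuation keeps the remembered signal stable:

    #include <cstdio>

    // Toy model of the "two feedback amplifiers" storage idea: a sample
    // bounces between two stages, losing a little amplitude on each hop,
    // and each amplifier applies just enough gain to restore it.
    int main() {
        const double attenuation = 0.98;        // 2% loss per hop (made up)
        const double gain = 1.0 / attenuation;  // fractional boost to compensate
        double stored = 5.0;                    // the "remembered" voltage

        for (int echo = 1; echo <= 10; ++echo) {
            stored *= attenuation;  // propagation loss in the medium
            stored *= gain;         // restoring amplification
            printf("echo %2d: stored value = %.6f volts\n", echo, stored);
        }
        return 0;
    }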

However the action is done, whether by optical means (light beams and mirrors) or by cascades of transistors, capacitors and inductors, the electrical signals would not need to be high speed all the time, just very accurately handled, regardless of frequency. That could mean many very, very slow signals (allowing interim multiprocessing), or very tightly synchronized ultra-high-speed signals that test the fastest limits of the hardware.

As a computational instance, if I wanted the square root of a number, I would put 25 volts into one hole of "the square rooter circuit" and the other hole would instantly output 5 volts. Only the reaction time of some transistor or other component matters, and they are generally quite fast, to the point of taking nearly zero time. Even if a component were bypassed "to save time," there would still be some time used by electrical conduction in the wire, however close to zero. Other mapping functions could be wired the same way: X volts in yields f(X) volts out, or whatever. There are far more possible instances of specific electronic mathematics that could be integrated within the flexible fabric of electronic connectivity than anyone could enumerate, since they cover almost everything in reality and everything that has never happened yet.
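In software terms, such a component boils down to a transfer function. A minimal sketch, with the function call standing in for what would be a single physical circuit rather than a sequence of instructions:

    #include <cmath>
    #include <cstdio>

    // A digital stand-in for the imagined "square rooter circuit": one input
    // terminal, one output terminal, and a fixed transfer function between.
    double square_rooter(double volts_in) {
        return std::sqrt(volts_in);
    }

    int main() {
        printf("in: %g volts -> out: %g volts\n", 25.0, square_rooter(25.0));
        // Any other mapping f(X) could be "wired in" the same way.
        return 0;
    }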

A single input and a single output are only a simplification; there may be many inputs and many outputs for any given "thing which does something" in the machine.

You don't think of radio as being a "batch job" that takes all day to process, like baking pastries. If pastry baking were like radio, pastries would take very little time or energy to bake. It takes little time for the input to a radio transmitter to be sent and received simultaneously by multiple radio receivers at long distances and at a vast number of locations. It would be as if, as soon as you shoveled eggs, flour, sugar and milk into the input pipe, fully cooked cupcakes instantly fluttered from the output pipe in somebody else's living room. Computers made in such a way would be far faster than today's linear, step-by-step digital computers. (I don't think cupcakes will ever be quite so convenient this way...)

As time goes on, various electronic component designs are made in ever-increasing numbers. There will always be far more ideas than actual implementations because of the need for real wires and real components. Talk is cheap but pickles cost nickels. Yet, over time, these accumulated parts have been segmented, cataloged and stored in a vast, human-maintained memory, such as all the engineering file cabinets and databases in all the companies in the world that make such things.

There are fields within fields of connections possible between devices, and some are like ponds and rivers of undulating liquids with millions of separate sources and sinks for the voltages.

A huge number of things would not even have names, just as nobody has a name for every grain of sand. A Z-transistor with a K-capacitor and an I-coil might have a name in a database, such as Part Number ZKI-123. A computer which manipulated or used such devices might even create them on the fly, only momentarily hooking up parts X, M and P for a microsecond, never to be used again, because the conditions that called for that combination never occur again.

Flexible machines of this sort, which could synthesize parts on an as-needed basis, could also evolve solutions to problems never encountered before. Much like human immune cells, which can recognize things that are NOT correct in the body, such a machine could recognize when nothing in its current configuration is suitable for the problem at hand. It would counter the problem with a shaped electrical field and a collection of molecules made ideally for the situation, just as water in a gravity field will fill every nook and cranny of the cup it is poured into. The water does not "think" about the shape; it just automatically assumes it. The auto-pilot in airplanes has some similarity to this effect.

An analog computer could be made from mirrors, light emitters and light sensors, hopefully of very high reaction speed. A device built as spheres within spheres, with mirrored surfaces, could emit light signals on one side of a sphere while sensors on the far side pick up the reflections, with no wiring necessary. The "individuality" of signals would lie not in discrete wires but in discrete colors of the spectrum, as many as are needed, all the way down to the Planck length (theoretically). Alternatively, signals could be serial and monochromatic, using only encoded "from-to" addressing to separate one signal from another, similar to how Ethernet works -- though certainly not as fast as discrete, private spectral subdivisions could be. The parts which synthesize other physical stuff, such as unique molecules, are more chemical and nano-mechanical in nature, but the brains for such things would be in the analog computer itself.
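The addressed-serial variant is easy to sketch in software. In this toy model (the frame layout is hypothetical, not real Ethernet), every receiver sees every frame on the shared beam and keeps only what is addressed to it:

    #include <cstdio>
    #include <vector>

    // One monochromatic channel carrying interleaved conversations,
    // separated only by "from-to" addressing.
    struct Frame {
        int from;       // source address
        int to;         // destination address
        double signal;  // the analog payload, reduced to one sample here
    };

    int main() {
        std::vector<Frame> beam = {
            {1, 2, 0.25}, {3, 4, 0.90}, {1, 4, 0.10}, {2, 3, 0.55},
        };

        const int me = 4;  // this receiver's address
        for (const Frame& f : beam) {
            if (f.to == me) {
                printf("node %d got %.2f from node %d\n", me, f.signal, f.from);
            }
        }
        return 0;
    }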

Powering such devices might occur at the "heat" level. The heat radiating from the planet passes through everything on its way out -- sort of a photon wind. Those photons and the resulting molecular motion within crystals could then produce controlled oscillations which act as tiny "generators" and "motors" in our little machines. Of course there may be limits, and the entropy may be such that it actually damages machines instead of powering them! The implementation details are left to more qualified minds -- I am only thinking out loud. Perhaps if such things were possible, living systems would have already stumbled upon them.

Nevertheless, I can imagine, on some far distant world, after billions of years of biological evolution had run its course, that what we now consider "machines" would evolve in this way. What we consider nano-tech in our vernacular would be the DNA of their evolution. I can't imagine what the end result might be, nor whether it would be good or bad. I suppose there are many bad science fiction stories that could be made about such things. But, regardless of whether God created our own evolution, there is nothing, so far, to indicate that any other kind of evolution would thereby be prohibited.

Thursday, August 13, 2009

Cave Men

There are many things that bother me about the world I live in. In America we still have a large number of bigots, racists, hate groups, militias, gangs and other throngs of violent or verbally abusive people. This is not the only country with such entities, and perhaps not even the worst.

However, in America we like to claim that we are superior, that we have a moral high ground, that we promote "the greatest good" in the world. Whenever I see those words I grimace. I always hope that such things will be true someday, for the sake of my grandchildren.

Today, in the "brave new world" of the 21st century, we still live like tribal savages from the time before the Pyramids. We have barely evolved, if at all, from the cave-men caricatures of our ancestry. In fact, if not for a continuous monitoring of the populace, including a monitoring of the monitors themselves -- our civilization would collapse into a Mad Max world of savages. Racism and bigotry would be the major forces in the world. Intellect and civility would be overrun like Easter bonnets in a cattle stampede.

So, just think about the world you are creating, and what you will leave for your children and grandchildren. If you are just teaching them bigotry, then you are destroying their future.

Tuesday, August 11, 2009

Virtuoso

Recently I have tried using "virtual computing" and I'm very impressed, really. Although I had tried to do this before, using a computer that wasn't quite up to the specifications needed to handle it, this time it has been far more successful.

I am using an Alienware computer with 6 GB of memory and an Intel i7 processor with 8 threads, so the horsepower and memory are already there this time around.

Using a free, for-home-use product called VmPlayer (a tool in the VmWare product line), and another tool from a web page (known as EZY-VMX), I was able to generate several virtual machines to run on a single computer. I tried a few different arrangements -- multiple processors with a large memory space, a single processor with a small memory space -- all running Linux in the virtual systems.
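For flavor, here is a cut-down example of the kind of .vmx definition file such tools generate. The key names are from memory of that era of VmWare, and the values are illustrative rather than my exact setup:

    config.version = "8"
    virtualHW.version = "7"
    displayName = "Ubuntu 8.04"
    guestOS = "ubuntu"
    memsize = "1024"
    numvcpus = "2"
    scsi0.present = "TRUE"
    scsi0:0.present = "TRUE"
    scsi0:0.fileName = "ubuntu.vmdk"
    ide1:0.present = "TRUE"
    ide1:0.deviceType = "cdrom-image"
    ide1:0.fileName = "ubuntu-8.04-desktop-i386.iso"
    ethernet0.present = "TRUE"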

So first I made a system for Ubuntu 8.04 (using an existing Ubuntu boot CD I had). No problems there -- everything worked flawlessly, and the virtual machine hardly made a dent in the Alienware's Vista system. What was even more impressive was rebooting the Alienware into an existing Ubuntu Linux OS, downloading VmPlayer for Linux and running exactly the same virtual machine there as had run in Vista. I was a little troubled that I had done so much work for an older version of Linux, however.

Next, I found a pre-built "appliance" for VmPlayer, where someone else had already booted OpenSuse 11 into a virtual machine and stored it as a compressed blob. All I had to do was download the blob, decompress it and run it in VmPlayer. It worked flawlessly as well, and it ran at the same time as the other virtual machine with Ubuntu. So I had all three running at once: Vista, OpenSuse Linux and Ubuntu Linux. The machine was happily humming along. I then decided to get a more up-to-date version of Ubuntu (9.04) by downloading an appropriate blob for that.

I then updated them all with the latest and greatest open source software, especially for software development and word processing, etc. And once I got everything working well, I compressed the disk files that comprise each virtual machine on Vista and backed them up as single blobs of my own. So if I blow something up I can just retrieve the backed-up machine and proceed from there.

So, anyway, this has been a good experience with virtual machines, though I have to admit that I did it all out of curiosity rather than necessity. If I were a business, however, I would probably go ahead and shell out the bucks for a total VmWare package so that I could tweak things and get all the proper updates, etc.

Now, I'm not advertising any of this; I don't make a dime off anything I'm saying here. It is just my personal experience with these systems. For all I know, other virtual machine software out there might offer just the same abilities; it just so happened I tried this combination first. I'm also not easily impressed, but this is good stuff.

Tuesday, August 4, 2009

Why I dislike C++ and C#

I admit it. I am an "Object Oriented Programming" hater.

At first there was just a kind of disconcerting feeling about it, such as the sudden shift from the simple terminology of C to the "elitist" terminology of C++. (These are not the only examples of opposing languages, but they make good examples.) I always tried to program computers in the most simple, direct manner, so that I could understand what I did later when it needed to be changed. Although C cannot make things assuredly simple, C++ makes the simplest things hard.

For instance, the "Hello World" program can be made almost identical in C and C++, since C++ will allow certain forms of C syntax and function to work right out of the box. This is a good thing; otherwise most C++ programs would never work. Yet it is possible to use the "class object method" model to program "Hello World", and suddenly a 2 or 3 line C program becomes an entire screen full of symbols and gobbledygook with the string "Hello World" stuck in several places.
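To show what I mean, here is a sketch. The class-based version is deliberately bureaucratic -- it is not the only way to write C++, just the shape of the style I'm complaining about. The C-style version sits in the leading comment for comparison:

    #include <cstdio>
    #include <string>

    // The C-style version, which is also valid C++:
    //
    //     #include <stdio.h>
    //     int main(void) { printf("Hello World\n"); return 0; }
    //
    // The same job in "class object method" style:
    class Greeting {
    public:
        explicit Greeting(const std::string& text) : text_(text) {}
        void deliver() const { printf("%s\n", text_.c_str()); }
    private:
        std::string text_;
    };

    class Greeter {
    public:
        void run() const {
            Greeting greeting("Hello World");
            greeting.deliver();
        }
    };

    int main() {
        Greeter greeter;
        greeter.run();
        return 0;
    }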

Now, I am not saying that C++ doesn't have good aspects. Certainly I like the fact that objects clean up after themselves in a more orderly fashion than C functions, although part of that still falls on the programmer to make sure it is done. I had the same habits when I programmed in C -- making sure all allocations were freed, all files closed, all errors returned, etc. It was just a habit of programming rather than a structural part of the language.

I originally wrote code in BAL (IBM Basic Assembly Language), Burroughs Assembly Language and PDP-11/70 Assembly Language. There were some other awful things in there too, like Cobol, various Basics and Fortrans. Lisp, Forth and some self-written languages also made my list. But when you program in assembly language, you learn to think in certain patterns that keep you from shooting yourself in the feet. Other languages try to force your feet to keep out of the way of bullets, or disallow bullets entirely.

Macros were an important part of assembly languages. These allowed repetitious aspects of programming to be done once and then reused wherever necessary in new programs. In some ways the C language is merely an enormous macro language encapsulating all the goop of assembly language. Yet the very thing I liked best about assembly was the pinpoint accuracy it gave you. Whatever the machine was capable of, you could make it do it. In today's world, most of a machine's capabilities are wasted, and some small subset is used in 99% of programs.
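For anyone who never wrote assembler macros, the C preprocessor gives the same flavor of write-once, stamp-out-everywhere reuse. A small sketch:

    #include <cstdio>

    // Write the repetitious pattern once, expand it wherever needed,
    // just like an assembler macro.
    #define SWAP(type, a, b) do { type tmp_ = (a); (a) = (b); (b) = tmp_; } while (0)

    int main() {
        int x = 1, y = 2;
        SWAP(int, x, y);      // expands in place
        printf("x=%d y=%d\n", x, y);

        double p = 3.5, q = 7.25;
        SWAP(double, p, q);   // same pattern, different operands
        printf("p=%g q=%g\n", p, q);
        return 0;
    }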

Although I do not wish to program in assembly language any more (carpal tunnel hell), I do miss the pinpoint accuracy. Using C makes me feel like I'm using very dull pencils. Using C++ makes me feel like I'm using Legos with Swiss Army Knife attachments made from balsa wood. With C++ I hardly ever achieve exactly what I set out to do with a particular program. It always winds up being what I am allowed to do by some hidden Fascist inside the machine.

C# is another level of icky gooey stuff poured over C++. In some ways it is like a scripting language, or a little bit like Java. I think the benefit of C# is sort of lost -- it is just another arbitrary thing created by Microsoft that could just as well have been done with Java (but without Microsoft's purely profit-driven reasoning...)

I stay away from C# for that reason. It isn't that I want to program with difficult, syntactically punctuated languages at all; I just dislike arbitrary reinventions of wheels. C# was a great waste of programmer time, and it is a waste of my own time to learn and use it for anything, especially since there is a performance and capability loss with the use of C# (and its .NET world). It is like using C++ with thick mittens on, under the watchful eye of a vicious Nun.

Sunday, August 2, 2009

Thinkless Machines

I read all kinds of stuff about Apple's hardware, software, iPhones, iPods, iWhatevers. I am not against Apple, really, but I don't own any of those things.

I know lots of people with iPods, etc., such as my daughter and most of her friends. Her husband evidently ran over one with his car recently -- I found the flattened thing of broken thin glass and metal in a compartment in his truck when I was looking for a rolling marble or something. Whatever it cost, it is worthless now.

I have many computers, including an old, obsolete 1999 Thinkpad, a newer medium-quality Dell laptop, a higher-end Alienware desktop, plus a lesser Dell desktop about five years old. I use all of them for various purposes -- sort of as software quality filters. If something will still run decently on the Thinkpad (with Linux), it will run extremely well on the Alienware box (with any OS...)

But, back to the Apples. Why don't I have an Apple? Actually my first "home" computer was an Apple, although I had played with TTL circuitry and made weird little contraptions -- sort of proto-computers -- before I plopped down 2500 bucks for an Apple IIe. Now, that was a SLOW computer.

Later on, due to my profession in software, I had access to very powerful machines -- servers, workstations, robotic systems, etc. -- and didn't really have time to screw around with gutless machines. So I needed whatever the latest greatest fastest stuff was -- usually an Intel box, but sometimes it was SGI or even an IBM system of some kind. Apple was not on the list.

I did play with a NeXT box for a while, a machine we were porting software to, but it wasn't a very popular system for whatever reason -- cost, or lack of color, or something. I thought it was OK, and I certainly liked the C and C++ programming for it. But it was just a brief project, and on to the next junk.

I have never used a Mac, especially an iMac or whatever they have now, either the desktop or the laptop, or the iPod or iPhone or anything else. I guess the closest I've come is having used the Safari web browser for a few days, and occasionally using iTunes for playing mp3s (but not syncing to an iPod or using the iStore...)

And now that I'm getting on in years, I probably won't ever purposely buy anything from Apple. If somebody buys me one as a gift or as a work project or something, I guess I wouldn't kick it off the desk. But I'm not rushing out to empty my wallet on one anytime soon.

For one thing, for absolutely FREE, I can use myriad instances of Linux (I know, I know -- Apple-heads look down on Linux). Yet, for all practical purposes, except for the prices of the machines and the software, Linux is very similar to Apple's stuff. Not identical, no. Nor is it identical to Microsoft's, nor is Linux even identical to Linux, since there are so many flavors.

But I am a machine head -- I like the fact that I can mold Linux any which way I want, or not, on a whim. I don't have Cupertino's lawyers breathing down my neck, nor the ghost of Bill Gates haunting me. I just have nice systems running in all my machines.

I also have Microsoft, of course. I have to, because I'm a software guy. But I use that the way I'd use a semi-truck -- to haul cargo. I use Linux the way I'd use a spacecraft -- to do whatever the heck I feel like doing. I'm using Ubuntu Linux on my Alienware to write this, but I could just as well have done it with the Thinkpad, although a bit slower.

Apple? I'm not sure I even know what to think about those things. I hardly ever think about them at all. And I for sure will never pay actual money for one, ever again. I paid way, way too much for that Apple IIe, and it couldn't even think one bit.