The idea occurred to me as I was listening to some of the J-pop music I have on CD. I don't know Japanese yet, so I don't quite understand it (not that that prevents me from enjoying it). The idea is built around foreign languages, your native tongue, and the specialized jargon that develops within each language for particular fields, and around how CPU architectures, compiled and assembled programs, and similarities in CPU design mimic (or mock) the behavior of human languages.

Now let's say, for instance, that you are an English speaker, and that your brain is along the lines of an Intel x86. When someone speaks or writes in a way that's easy to understand, that's the equivalent of a program written in x86 assembly. A French speaker's brain would be along the lines of a MIPS, with plain French as its native assembly. Now, say our English-speaking x86 person is handed a document, say, a college thesis. The level of writing there is usually much higher (some might say, at times, obfuscated). A normal person, like a normal x86, can interpret such a text/program, but it takes more CPU cycles (more time thinking) to grasp what's being given to them; this is the equivalent of writing the program in a higher-level language. A legal document handed to your average, everyday person would be the equivalent of giving an insanely complex Visual Basic program to a 486. A well-written C program would be like handing a HOWTO to someone who is generally adept with computers (a beginning sysadmin, let's say) and having them work through it.
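
If you want that spectrum in code, here's a minimal sketch of the same "sentence" at two levels, assuming GCC on an x86 machine; the variable names are just for illustration. The inline assembly is the plain speech an x86 brain parses directly, while the C expression is the college-thesis version that has to be translated down before the CPU can follow it:

    /* The same addition, phrased natively and phrased abstractly. */
    #include <stdio.h>

    int main(void)
    {
        int a = 2, b = 3, sum;

        /* "Plain speech": x86 assembly, understood as-is. */
        __asm__ ("movl %1, %0\n\t"
                 "addl %2, %0"
                 : "=&r"(sum) : "r"(a), "r"(b));
        printf("assembly says %d\n", sum);

        /* "College thesis": the compiler burns cycles translating it
           down to the same instructions before the CPU ever sees it. */
        printf("C says %d\n", a + b);
        return 0;
    }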

Now let's throw a bit of technical jargon in there. Technical jargon, in this case, equates to specialized CPU functions. If your technical bias is toward computers and you speak to someone whose vocabulary doesn't include those words, they won't be able to interpret what you say. It's like writing a program that depends on MMX: it won't run on any CPU that lacks those instructions. Now suppose you wrote a program that could use, and was optimized for, say, AltiVec, but wasn't dependent on it. Likewise, you'd have to simplify your language a bit for those who didn't understand the jargon, and while your message would get across, it would take longer and some of the detail would be lost along the way (i.e. the program eats more CPU, and you get less detail).
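
Here's a minimal sketch of that "can use it, but doesn't depend on it" pattern, assuming GCC on an x86 machine; sum_fast() and sum_plain() are hypothetical stand-ins for an MMX-tuned routine and its ordinary fallback:

    /* Detect an optional instruction-set extension at run time and fall
       back to plain code when it's absent, so the program still gets its
       "message" across on older CPUs, just with more effort. */
    #include <stdio.h>

    static void sum_fast(void)  { puts("fast path (would use MMX here)"); }
    static void sum_plain(void) { puts("plain path (ordinary integer code)"); }

    int main(void)
    {
    #if defined(__GNUC__) && (defined(__i386__) || defined(__x86_64__))
        if (__builtin_cpu_supports("mmx"))
            sum_fast();     /* the listener knows the jargon */
        else
    #endif
            sum_plain();    /* simplified speech: understood, but slower */
        return 0;
    }

A program that hard-codes the MMX instructions instead simply dies on a CPU without them; there is no simplifying on the fly.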

Foreign languages are like different CPU architectures. An English x86 speaker would have a hard time understanding a German SPARC speaker, unless the x86 brain were given software to emulate the other architecture (i.e. taught the language). But, like emulation, it's rarely as good as the native hardware (a language spoken from childhood), and it burns more CPU cycles to run at what is usually a slower speed. If you've ever tried to run a program built for the Alpha version of NT on the i386 version, it pops up an error saying "this program isn't made for this type of CPU," the computer world's equivalent of "no hablo español." But given an emulator, like the Alpha has with FX!32, the Alpha can be taught to understand the language of the x86 and grasp what is "being said."
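
To make the emulation point concrete, here's a toy model of the kind of thing an emulator like FX!32 has to do, using an invented instruction set: the host CPU runs a software loop that fetches, decodes, and acts out each foreign instruction, so every guest instruction costs many host instructions. That overhead is exactly why the emulated speaker is slower than a native one:

    /* A tiny interpreter for a made-up stack-machine "language". */
    #include <stdio.h>

    enum { OP_LOAD, OP_ADD, OP_PRINT, OP_HALT };

    int main(void)
    {
        int program[] = { OP_LOAD, 2, OP_LOAD, 3, OP_ADD, OP_PRINT, OP_HALT };
        int stack[16], sp = 0, pc = 0;

        for (;;) {                       /* fetch... */
            switch (program[pc++]) {     /* ...decode... */
            case OP_LOAD:  stack[sp++] = program[pc++];      break; /* ...execute */
            case OP_ADD:   sp--; stack[sp - 1] += stack[sp]; break;
            case OP_PRINT: printf("%d\n", stack[sp - 1]);    break;
            case OP_HALT:  return 0;
            }
        }
    }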

CPU clones, such as those from AMD, Cyrix, and the other past x86-compatible makers, are like dialects. For instance, the passage of x86 from Intel to AMD, to Cyrix, and then to IDT is similar to the spread of English, and the changes it picked up along the way, from England to the Americas, Australia, and New Zealand.

New CPUs, such as Transmeta's VLIW-based chip, strike me as a brain that speaks a language like no other, yet can learn any other language quickly and accelerate its rate of interpretation, but only at the cost of forgetting any previously learned languages.
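
Here's a rough sketch of how I picture that working; this is my own guess at the mechanics, not anything Transmeta has confirmed, and all the names are invented. Translate foreign code once, cache the translation so repeats run fast, and let a small cache mean that learning a new language evicts the old one:

    /* A toy one-slot translation cache: fast on repeats, forgetful. */
    #include <stdio.h>
    #include <string.h>

    static char cached_lang[16] = "";   /* room for one learned language */

    static void run(const char *lang)
    {
        if (strcmp(cached_lang, lang) != 0) {
            printf("translating %s (slow; forgets %s)\n",
                   lang, cached_lang[0] ? cached_lang : "nothing");
            strncpy(cached_lang, lang, sizeof cached_lang - 1);
        } else {
            printf("running cached %s (fast)\n", lang);
        }
    }

    int main(void)
    {
        run("x86");  run("x86");   /* learn it, then accelerate */
        run("mips");               /* a new language evicts the old */
        run("x86");                /* ...which must now be relearned */
        return 0;
    }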

As for machine language, no one speaks that yet, unless you've figured out how to directly trigger the impulses in the brain that turn sound into thoughts (bypassing the hearing apparatus entirely) and just haven't told the rest of us.
