Saturday, November 9, 2013

The Chief Minion: Hello My Name Is... CPU


So it’s been a while since my last post. Yes, after the initial, improbable success in placing a CNN link into a website page, I fielded several follow-up questions that had me worried:

“Are you going to design a website?”
“I didn’t know you wanted to make a website!”
etc…

It turns out, I do not want to design a website. But the minion language HTML is for designing websites, so I find myself in a bit of a pickle. My choices are either to continue learning HTML things even though I have no interest in actually using them (I did complete my Codecademy course. Promise.) or to acknowledge an early wrong turn and seek a new direction for my blogging code-cracking expedition.

Note: I Googled “Wrong Turn.” Apparently that’s a horror flick from 2003 about inbred monsters from West Virginia. I do not think my brief foray into HTML was this much of a “wrong turn” but perhaps some of you will enjoy the movie reference. I don’t like horror. It gives me nightmares.

Anyway, I was discussing my blog with a minion expert I met recently. After seeming confused (bemused?), then politely humoring me, he nonchalantly referred to something as “The Chief Minion.”
Wait a minute. There’s a Chief Minion? 

Why yes, he explained, and it’s called the CPU. I became so excited I literally ran away from the conversation to find my iPhone and make a note to myself. “Chief Minion = CPU.” I need to really get to know my minions, and I’m going to start with their Chief: the CPU.

Anthony Michael Hall is my eternal representation of the King of the Nerds and is therefore my blog’s stunt double for the Chief Minion. One can only hope the CPU is as awesomely hilarious as Anthony Michael Hall. To be honest, I have doubts.



CPU stands for Central Processing Unit (according to both the minion expert and Wikipedia). I tried to read the Wikipedia entry. Tried. The first paragraph went pretty well: the central hardware of the computer that carries out all the basic math and stuff…things I can logically infer from the name. The next paragraph gets a little tricky: the CPU has two parts, an ALU (arithmetic logic unit), which is where the math happens, and a CU (control unit), which directs traffic by telling the other parts what to do. I suddenly feel like my brief foray into HTML and the CNN tab was like learning Pig Latin in order to tackle Shakespeare. The third paragraph talks about array or vector processing and distributed computing that doesn’t even have a CPU. Minions without a Chief. Sounds too trippy for a beginner.

This is my visualization of minions without a Chief. Trippy.

I have concluded, after slogging through the remainder of the entry, that the CPU is where my minions use the much-talked-about binary code. Unlike the HTML language, in which I use lots of symbols and letters and numbers to speak to my minions, the CPU is where they get to do their 01001010101 thing. So why is that? I mean, if my CPU is the Chief of all Minions, why doesn’t the Chief speak the most complex minion-speak, complete with all the symbols and numbers and letters, instead of the most stripped-down one with only 0s and 1s?
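For the curious: here’s a tiny sketch (in Python, which is not part of this story — just a convenient way to peek) showing that the symbols and letters I type really do come down to that 01001010101 thing. Every character has a numeric code, and that number is stored as a pattern of 0s and 1s.

```python
# Each character gets a numeric code (ord), which the machine stores as bits.
# format(..., "08b") shows that number as eight 0s and 1s.
for ch in "CPU":
    print(ch, "->", format(ord(ch), "08b"))
# C -> 01000011
# P -> 01010000
# U -> 01010101
```

So “CPU” itself, to the Chief Minion, is just three rows of eight switches.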

Here’s what I’ve surmised:
 
(Note: the following has been fact-checked for glaring absurdity or inaccuracy. However, no self-respecting minion expert would sign off on officially “approving” this description of why the CPU uses binary code. Fair enough.) 

The CPU is hardware. This means it exists. I mean, it exists physically in space. It’s that little “chip” you see in Intel commercials. I’m going to compare the electrical signals of this physical chip to movements that are a little easier for my tired brain to visualize: light switches. In my simple brain, there are two types of light switch: an up/down version and a dimmer switch that allows for awesome seamless transitions from light to dark and vice versa.

 

The dimmer switch, cool as it is with its myriad of possible lighting levels, is hard to use if the goal is to 100% accurately replicate the exact, particular lighting level you want every single time. A little more up…no, down a little…no, up a little…etc. That dimmer is like using all the symbols/letters/numbers. Cool, but complex.



The up/down toggle switch is way less cool. Up. Down. 0. 1. But you can always tell where you are: on or off. 0 or 1. So the little electric signals in the CPU need that type of accuracy, precision and reliability. Up. Down. But here’s where it gets cool: if you are looking to reliably and precisely replicate a certain level of lighting, you can set up a series of up/down switches (imagine them all lined up against a wall) whose end result is exactly the level of lighting you want. Up. Up. Down. Up. Down. 00101. The CPU is where your computer exists in a physical reality of tiny electric up/down switches. And the more up/down switches you have and the smaller you can make them to fit in one “room,” the more varied the lighting arrangements you can make.  
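The wall-of-switches idea above can be sketched in a few lines of Python (again, just an illustration — the switch names are mine, not anything official). Read a row of up/down switches as a binary number and you get back one exact, repeatable “lighting level,” and every extra switch doubles how many levels the room can have.

```python
# A wall of five up/down switches: 0 is down, 1 is up.
switches = [0, 0, 1, 0, 1]  # the "00101" pattern from above

# Read the whole row as one binary number: that is the exact lighting level.
level = int("".join(str(s) for s in switches), 2)
print(level)  # 00101 in binary is the number 5

# More switches crammed into the "room" means more possible levels:
# n switches give 2**n distinct arrangements.
print(2 ** len(switches))  # 5 switches allow 32 different levels
```

Same pattern of ups and downs, same level, every single time — which is exactly what the dimmer couldn’t promise.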



I feel like Marie Curie who’s just discovered radium. Actually, I’m like some other person who just realized x-rays exist 50-75 years after Marie Curie. And after everyone else in the world. And after my great aunt Betsy had her arm x-rayed. Better late than never.
