Friday, September 27, 2013

Blog 5: Where I voluntarily make myself nuts, bit by byte.



Stock prompt: what is the most complex or difficult about the reading's references to Boole, Russell, Gödel, Maxwell, and others? Why is it challenging? What words or concepts in particular are problematic? I pose these questions because I know that difficulty is often a way in or point of access: if you figure out what's difficult about a text, you've figured out its problem. That lets you investigate the problem. If you're a designer, an artist, a videographer, a musician, or somebody who works in media other than text, I'd love to see how you'd represent and upload to your blog the most complex or difficult problem posed by Gleick.

So, ever since the very first chapter of Gleick, Claude Shannon and his work have rather confused me. I've been struggling mainly with a single question: what exactly is a bit? How can information, which once could only be conveyed through physical or more direct means (a record for music, a letter for writing, paint for colors, or even a face-to-face conversation), now somehow be represented through a mathematical code of 1's and 0's?

I understand that Charles Babbage created the beginning of mechanized information, Ada Lovelace foresaw the future of mechanized information, and Claude Shannon played a huge role in the reality that is our digital forms of information today -- but I just don't understand the connections between binary, information theory, and everything we can gain from our computers and phones -- sound, images, text, etc.

I wish I could say that I successfully did my research and can now explain it all awesomely to you. But...not really. Don't get me wrong, I did my research -- it just turns out, information theory isn't exactly something that can be understood in a day, especially by a rather technologically and mathematically illiterate person such as myself. However, I'll try to lay out what I did understand from the reading, and what I have come to understand from my investigations:

The comparison that first helped me begin to understand what a bit is, is the description on page 174 of Gleick comparing binary coding to logic: "As in logic, he saw that circuitry could make 'if...then' choices." This is basically what a single bit is: capable only of an if/then, yes/no sort of question and answer. A 1 represents a closed circuit (current flowing), or basically "yes," while a 0 represents an open circuit, or "no." It's a very basic, electronic sort of logic. In its simplest form, binary is more or less a digital maze for an electronic pulse, each binary code giving it varied commands. As more bits are added, the paths become more complex, like so:


 (Behold, a crappy image drawn far too late at night)

Perhaps not the best illustration, since 1's and 0's are generally described as closed and open circuits, but I feel like it shows how different binary combinations can lead to different paths, and thus very different command executions. From what I've seen, the more bits you have, the more complexity the code can offer, and thus the more complex the commands a programmer can give. Consider the difference in what you can achieve with a single bit (a yes/no answer), a byte (equal to 8 bits, enough to represent the numbers 0 to 255), multiple bytes (30-90 bytes might equal a line of text), and on up through megabytes, gigabytes, and beyond.
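To make that concrete, here's a tiny sketch in Python (my own toy example, not anything from Gleick -- I'm just using it as a calculator) of the idea that each extra bit doubles the number of possible patterns:

```python
# Toy example: n bits can encode 2**n distinct patterns,
# so each added bit doubles what you can say.
for n in (1, 2, 4, 8):
    print(n, "bits can encode", 2 ** n, "different patterns")

# One byte (8 bits) can therefore represent any number from 0 to 255:
print(int("11111111", 2))  # -> 255
```

That doubling is the whole reason bytes, kilobytes, and megabytes grow so quickly in what they can hold.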

So more memory -- more bytes -- allows for more complex functions, such as sound, text, and color. Exactly how it all works is, I think, the part that is beyond my current understanding. I think part of it is where encodings that extend binary beyond plain numbers come in, such as ASCII, and where programming languages such as C++ and Java further help programmers instruct computers without having to write out each individual 1 and 0. I'm sure Claude Shannon's information theory would shed some light on the matter if, again, I had even remotely the brains and background to comprehend it.
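For instance, here's another little Python sketch (again, just my own toy example) of how ASCII turns characters into numbers, and numbers into bits:

```python
# ASCII gives each character a number; the computer stores that number as bits.
for ch in "Hi":
    code = ord(ch)               # character -> its ASCII number
    bits = format(code, "08b")   # that number as an 8-bit binary string
    print(ch, "=", code, "=", bits)

# And back the other way: bits -> number -> character.
print(chr(int("01000001", 2)))  # -> A
```

So a line of text really is just a row of bytes, which is just a long row of 1's and 0's.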

All in all? I admit that I didn't gain anything close to the comprehensive knowledge of the nitty-gritty of electronic information that I was hoping for, but it didn't take me long into my research to realize that this is a hopelessly giant and highly technical subject to try to comprehend in a day. However, I do feel like my research has helped me to understand, just a little tiny bit (pun not intended?), the background of digital information. My investigation was a pretty interesting experience too -- interesting enough that I didn't just do the smart thing by dropping the subject and writing instead about telegraphs or telephones. If nothing else, I do feel like I understand my computer slightly better.

Also, I now have an even greater respect for the scientists, engineers, programmers, and mathematicians who can fully understand this subject, because daaaaaaang: this is some complicated stuff right here.

I'll leave you with some links that I thought were helpful for a layman like me:

YouTube: Computer Tutorial: How Binary Code Works
A Binary Overview
More info on bits and bytes, and a little on ASCII
Webopedia entry on ASCII

Also, a shout-out to my brother, who has done some programming, and was able to tolerate my breath-taking ignorance long enough to give me a somewhat more knowledgeable perspective on computing languages.

Finally, I shall end with a fun fact: apparently if you type “binary” into Google, it will tell you that there are “About 0b1000001110011011011010000000 results.” Google humor strikes again!
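(And if you want to decode Google's joke yourself, Python's built-in int() will happily read binary, and the punchline turns out to be a nice round number:)

```python
# Decoding the binary "result count" from Google's joke back into decimal.
print(int("1000001110011011011010000000", 2))  # -> 138000000
```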


