There is a topic I've been learning about in my computer science class that caught my attention recently, and I thought I'd share it with my readers. It's something called abstraction, which apparently is one of the foundational concepts of programming. It's something I'd been doing before I even knew what it was, and now that I look back, it still confuses me sometimes.
Abstraction is all about using something without really knowing what it truly is. We use a symbol of some kind to represent the thing and work with that symbol in the program. Does that sound confusing? Let me give you a basic example.
Perhaps you know that computers think in a number system called binary, expressed in zeros and ones. The use of zeros and ones as symbols is itself a form of abstraction. The computer isn't really operating on actual zeros and ones; rather, there are electrical signals flowing through the circuits of the machine. We represent a higher voltage with a 1 and a lower voltage with a 0, which makes things easier to understand and work with. I don't know the deep electrical mechanics of a computer, but I do know how to handle zeros and ones. It's easier to think in zeros and ones than in high and low voltages.
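Programming languages carry that abstraction along for us. As a tiny illustration of my own (in C#, the language we use in my class), you can even write a number in binary directly and never think about voltages at all:

// To the programmer these are just zeros and ones; the
// compiler and the hardware worry about the actual voltages.
int thirteen = 0b1101; // binary 1101 = 8 + 4 + 0 + 1 = 13
System.Console.WriteLine(thirteen); // prints 13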
We could have some fun with this. Suppose that instead of electrical currents we used streams of water. There would be two possible states, water flowing and water not flowing, which could be represented by 1 and 0, respectively. Because flowing water behaves enough like electric current, it could be used to build a huge water-based computer, although the graphics wouldn't be very nice.
Let's take this to the next level of programming. At the very bottom, computers operate by handling sets of zeros and ones in specific ways built into their hardware. One step up from the raw ones and zeros is something called assembly language, which is still very primitive. It may look something like this:
LD A, 0
INC A
ADD HL, BC
RRCA
RET
It doesn't look very readable, does it? Each of those commands corresponds to a specific pattern of zeros and ones that the processor understands. This is abstraction because we don't need to know the exact zeros and ones in order to make the computer operate. We just need to know this vaguely more English-like set of commands.
Take this one more step up, into the languages most programmers are familiar with: Java, C++, PHP, JavaScript, and so on. These look much more like written language, even if they keep a mathematical appearance, and they are a huge leap in abstraction. Consider a FOR or WHILE loop. You don't know all the bits and bytes being moved around, but that's alright. You don't need to; you can just write the loop and it works.
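For instance, here's a little loop of my own in C# (just an illustration, not something from class):

// Add up the numbers 1 through 10. I never have to think about
// which registers or jump instructions make the loop happen;
// the compiler handles all of that underneath.
int total = 0;
for (int i = 1; i <= 10; i++)
{
    total += i;
}
System.Console.WriteLine(total); // prints 55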
Or think of variables. They have all sorts of types and names. The same idea shows up in fields of math like algebra and calculus: you have a variable x, and even though you don't know what it is, you can still do things with it, such as solving or simplifying equations. In JavaScript, you might do something like this:
function showMessage(x) { alert(x); } // x could be a string, a number, an object... anything
The function doesn't know what x is; you could throw any kind of value at it, and it would work nonetheless. Another example of abstraction came up recently in my CIS 300 class, where we created an array of type T. It was part of a class definition and went something like this:
private T[] arrayName = new T[10];
That was in C#. It's a cool example because T means the data type could be anything the user wants, be it int, string, long, bool, object, Thingamadoozy, or whatever! You don't need to know what the data type is when you write the code; you just use T as a stand-in until an actual type is supplied.
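To show what I mean, here's a rough sketch of my own of how such a generic class might look and be used (the names are made up; this isn't the actual code from CIS 300):

using System;

class Program
{
    static void Main()
    {
        // Here T becomes string, but it could have been anything.
        SimpleList<string> names = new SimpleList<string>();
        names.Add("Alice");
        Console.WriteLine(names.Get(0)); // prints Alice
    }
}

// A tiny generic container. T is a stand-in for whatever type
// the user picks when the list is created.
class SimpleList<T>
{
    private T[] items = new T[10]; // room for ten items of type T
    private int count = 0;

    public void Add(T item)
    {
        items[count] = item;
        count++;
    }

    public T Get(int index)
    {
        return items[index];
    }
}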
I think abstraction, along with the very structured hierarchy that many languages (particularly Java) exhibit, is one of the things catching my interest these days.
Thursday, September 6, 2012