I remember reading a sci-fi book called “2001: A Space Odyssey”…well, I did find that book a little ‘too’ futuristic when I read it in school, back in early 1998. But like millions of other foreseers, I too dreamt of a very modern world when mankind would step into the new millennium. What was I looking for, anyway?
2001 arrived & I was still waiting for someone to build a Starship Enterprise. But to my dismay there weren’t any wormholes, nor were there any parallel universes. They still reside in their original form, & that would be ‘theories’. I read about space a lot, but I found computers more interesting.
Forgetting the future for a while, let’s go back to the invention of the transistor. Long back in 1947, three physicists named John Bardeen, Walter Brattain, and William Shockley gave birth to something so important that we would soon forget about its existence & start using it blindly, just like we use oxygen. They made the first transistor, at Bell Labs. What was the itch that made these guys make the most important invention of the 20th century? All they wanted to do was use the knowledge of electrical devices gathered since the 1800s to make devices for communication.
This urge gave birth to the Information Age. At first, the computer was not high on the list of potential applications for this tiny device. This is not surprising: when the first computers were built in the 1940s and 1950s, few scientists saw in them the seeds of technology that would in a few decades come to permeate almost every sphere of human life. Then came the digital explosion. What do I mean by the explosion? Well, mass-producing transistors & integrating them into chips (not the fried ones, but the silicon ones) gave computers a new meaning. Computers grew smaller in size, from huge rooms to table tops & now laps & palms. Ever wondered how 2005 would be without the transistor? Don’t even bother to think about it.
All this info I gave you is just a small intro to what actually follows. What exactly do we mean by the words Information Technology? One can easily “Google” the words “information technology” & generate some 100 million pages in 0.12 secs. But that’s not what IT is all about. It’s all about the application of that information…Sounds techy? Wait till you read ahead.
Information – any relevant data that can be accumulated & presented is known as information. That would be a very basic definition, but information is too vast to define. In the early ages people used to gather information by looking at the sky (to find what time it was, duh), but it worked. I don’t remember ever looking up at the sky for the time since I learned to talk & could just ask people what time it was. Anyway, the example of staring at the sky for the time sounds as obsolete as using a caveman’s tool to cut veggies. All I am trying to imply is that the use of very basic things would sound so weird if described long back in the past, before their invention. Whoever thought that someone would come up with the idea of making a mechanical piece of metal to tell you the time! Well, someone sure did come up with it. It’s the same with computers. Computers have had generations of growth. They still want to grow, but this time it’s not in size. No matter how small the technology grows, it will always become simpler to handle. That’s when the human element of stupidity comes into the picture.
Imagine a caveman making a stone tool for himself. Finally he comes up with a way to make it & then starts mass-producing it so that he can profit from his simple invention. This tool, over time, becomes a basic necessity for all the people who use it. They can’t do many things if they lose such a handy tool. They devise various ways to decipher the simplicity & structure of the tool. Some of the wiser ones actually do decipher it & then use it more productively. But some people seem to overlook the simplicity & use the tool blindly.
Now consider this tool to be a computer. How many computer users actually know what’s inside the casing, & why they paid the mean-looking PC dealer just to bring it home? It’s really not expected of every person on this planet to understand what this technology is made of. Now just filter these potential users & extract the ones in the IT field. What do we get? All we find here is that 60% of these users don’t bother to know what goes on behind the scenes. Of the remaining 40%, we can say that 5% were responsible for making it, 15% are somehow related to one of the components used inside the computer, & the other 20% are developing the software that governs the hardware. So why do we have this rift? Why do people not look into the things they believe in so blindly?
Someone might say that it’s none of their business. I agree with this up to a certain point, the point being the actual implementation of the hardware, the actual flow of signals & so on. Expecting everyone to know that much would contradict the basic principle behind modern computers: simplicity. But how can someone argue about computers on the basis of usage alone!
It’s been quite a while since I started looking for a legitimate answer to this question…but today I feel I have found it. The magic word is “Curiosity”. If someone asks me why curiosity, I’d reply, “it works”. It was curiosity that taught us about gravitation; it was curiosity that led to the invention of computers in the first place. It drives the 40% crowd in IT & baffles the remaining 60%. Those who feel no curiosity towards a computer are to be counted amongst the ones who don’t bother to know what the hell they are doing in IT!
So really, what the hell are they doing there?
I finally feel like saying something to all those people in the field of IT who don’t dig computers:
"Chill dudes, its just ones & zeros"