Tuesday, 3 December 2013

The Laws of Computing

Takeaway:
"Even in the highly abstract field of computing, there are some observable laws - just like in mathematics. By studying these laws, we can build on our understanding of computing and expand innovation."

While computer science isn't exactly like physics, where laws can be observed in nature, researchers have identified a number of "laws" of their own over the years. They might seem old-school, but they're the foundation upon which innovation is built. Check it out!

Moore's Law

Moore's Law is probably the best-known "law" in the computing world. It's named for Intel co-founder Gordon Moore. In a 1965 paper, he observed that the number of transistors on an integrated circuit kept doubling at a steady rate - about every two years, by the revised estimate he gave a decade later. This meant that chips offered more functionality than before for the same price. In other words, as time went on, chips did more for less.
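
To make the doubling concrete, here's a minimal sketch in Python (my own illustration, not something from Moore's paper) that projects a transistor count forward under the assumption of a doubling every two years. The starting figure of about 2,300 transistors is the Intel 4004 from 1971.

# Minimal sketch: projecting transistor counts under Moore's Law,
# assuming the count doubles roughly every two years.
def projected_transistors(initial_count, years, doubling_period_years=2):
    """Return the projected transistor count after `years` have passed."""
    return initial_count * 2 ** (years / doubling_period_years)

# The Intel 4004 (1971) had about 2,300 transistors. Projecting 40 years
# forward gives roughly 2.4 billion - the same ballpark as high-end CPUs
# shipping around 2011.
print(f"{projected_transistors(2300, 40):,.0f}")  # ~2,411,724,800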

You've probably seen this in your own life. When you buy a new computer, it's generally faster than the last one you bought - and costs less as well.

Moore's Law is observable not only in microprocessors, but also in memory and storage capacity. It might seem like there's no limit, but chip makers can squeeze only so many circuits onto those silicon wafers. Quantum computers may eventually offer a way around that barrier, though they're still a long way off from mainstream use.

By closely studying these laws, we can build on our understanding of computing and expand innovation to create even better things.

More about this in my next post :)
