History

At first, computers were big and few in number, and they ran just one program at a time.

Later, as they became more powerful, they could run more than one program at a time, and they allowed more than one user to use the same powerful central computer. Everyone shared the same copy of the software on that central computer, but a number of screens and keyboards (terminals) were connected to it.

Then we started to see individuals and families getting their own workstations and computers. Each computer had its own copy of everything and ran one program at a time. Eventually these computers could run more than one program at a time, but still for just one user at a time, since they had just one screen and keyboard.

Then came another major shift, as these single-user computers began to allow multiple users over a network. A single computer could become a “server” which someone on another computer could utilize over a network. So instead of terminals, we had full-fledged computers that could do everything themselves, sharing resources on a centralized “server” in order to enable more collaboration and communication between users. Software was installed and running both on the individual computers and on the server.

Local networks of computers were interlinked to form the Internet, and servers came under greater demands for performance and scalability. Single central servers were broken up into multiple servers to handle the increased demands for throughput and reliability.

Then we started to see software being developed that ran “in the cloud” - centrally, requiring no installation on the computer the user accesses it from. This is similar to how terminals worked with large central computers long ago, except that now each browser tab is its own terminal and can connect to a different central server, and the browser can do a lot more by itself.

The smartphone also arrived on the scene, meaning that computers became even smaller and more convenient than before, but mainly running their own software rather than using the cloud.

Also deserving honorable mention is the invention of various peer-to-peer technologies, which allowed users’ computers to collaborate (almost) without a central server.

One pattern I see here is probably the most obvious: computers are getting smaller, cheaper, and more portable.

I also see that computers are becoming more numerous - what ran on one server before now runs on several. People who used a single computer in the past are now often toting two or more (a laptop, a smartphone, a tablet).

New generations of computers are less customizable, with fewer capabilities, and as a result easier to use than their predecessors. I recently thought to myself that as software enters the “late majority” it must become no harder to figure out than a shampoo bottle. Then I realized my 3-year-old can use an iPad but not wash his own hair.

I also see that new sizes and form factors are introduced periodically; they begin as standalone single-user systems and eventually become networked multi-user systems that take advantage of centralized resources and servers.

The future?

If these trends continue, we should see a new, more convenient type of computer come on the scene in the next decade or so. Some of the possible manifestations that come to mind are wearable computers, Star Trek-style intercoms, and implanted “cyberware”.

Smartphones should continue to depend more and more on the Internet for their functionality, especially for collaborative and productivity apps. They will probably eventually allow one to “switch users” on the same phone much the way we can with personal computers now.

Powerful desktop computers and laptops should continue to decline in popularity despite the fact that they offer more bang for the buck; people are willing to pay more for the convenience of a smaller device. People will choose the smallest device that meets their personal needs, and only certain categories of professionals and gamers really need a powerful desktop or laptop. Smartphones, tablets, and whatever the next generation turns out to be will get better and better, to the point where the majority of users can accomplish all their normal computing tasks well enough not to miss the computer.