Early computers were much larger than those available today, sometimes filling entire rooms or sections of buildings. Many people are familiar with the term “mainframe,” a large computer that can run many different tasks simultaneously. Modern and older mainframes are not much different in function: both provide a centralized point where data is processed and sometimes stored. Older mainframes, however, lacked the newer technologies available today.
As computers evolved and shrank, the use of mainframes to handle workloads decreased and standalone computers became more popular. The difference between standalones and the machines attached to a mainframe, called workstations, is that standalones could run applications by themselves, while workstations depended on the mainframe being online in order to function. Much older mainframes ran different programs on a schedule, and operators were assigned to feed or load each scheduled application at the appointed time.
When a program was scheduled to run, an operator would load punch cards or tape spools into the mainframe so it could run the specified application, and the workstations would then access that application from the mainframe. In more recent times, servers have largely replaced mainframes, though some companies still use modern mainframes as an alternative to maintaining hundreds of servers in different locations. The shrinking size of the computer is one of the clearest measures of how far the technology has come.
We have moved from computers that filled entire buildings to laptops with as much functionality as a desktop, yet portable enough to carry around every day. A quick timeline of how computers have evolved includes the move from punch-card program input to magnetic tape input, the shift of storage to magnetic drum media, the replacement of vacuum tube components with transistors, and ultimately the steady reduction in the size of transistors, which is the primary reason computers have reached where they are today.
An interesting concept to examine in computing is Moore’s Law, which describes a long-running trend: the number of transistors that fit on a chip roughly doubles every two years. It also suggests a ‘block’, the farthest point this kind of progress can reach. As components keep getting smaller and smaller, allowing more features to be crammed into a single circuit, the trend implies that once transistors approach atomic size, this route to improving computers will have to stop.
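The doubling trend above can be sketched with a few lines of code. This is a minimal illustration, not part of the original text: the starting count (roughly that of the Intel 4004, about 2,300 transistors) and the two-year doubling period are assumptions chosen for the example.

```python
def transistor_count(years_elapsed, initial_count=2_300, doubling_period=2):
    """Project a transistor count, assuming it doubles every
    `doubling_period` years (the trend Moore's Law describes).
    The default starting count (~2,300, an early-1970s microprocessor)
    is an illustrative assumption."""
    return initial_count * 2 ** (years_elapsed / doubling_period)

# Twenty years means ten doublings, i.e. a factor of 2**10 = 1024.
print(round(transistor_count(20)))  # 2300 * 1024 = 2355200
```

The example also shows why the trend cannot continue forever: exponential growth in transistor count means exponential shrinkage in transistor size, which eventually runs into atomic limits.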