Evolution Of Operating Systems Information Technology Essay

Operating systems have evolved from slow and expensive systems to present-day technology in which computing power has grown exponentially while costs have fallen dramatically. In the beginning, computers were manually loaded with program code to control computer functions and process code related to business logic. This type of computing introduced problems with program scheduling and setup time. As more users demanded increased computer time and resources, computer scientists determined they needed a system to improve convenience, efficiency, and growth (Stallings, 2009, p. 51). As a result, they created an operating system (OS) to process jobs in batches. Later they created Multitasking and Time-Sharing to run multiple jobs and allow user interaction, further improving efficiency. Multitasking brought challenges in managing the I/O operations required by multiple jobs, which computer vendors resolved with interrupts.

Operating System Concept

An OS provides an interface between a user application and the computer (Stallings, 2009, p. 51). The purpose of an OS is to manage system resources, schedule processes, handle error conditions, and perform audit logging. Using an OS freed programmers from writing code to handle machine functions so they could concentrate on writing code for user applications.

Serial processing

From the 1940s to the mid-1950s, programmers set up and controlled early computers by loading programs from punch cards or magnetic tape (Stallings, 2009, p. 55). All output went to a printer if the program did not abort due to errors. When an error occurred, indicator lights illuminated to notify the programmer. Users who required computer time had to reserve it on a sign-up sheet, indicating the amount of time they needed (Stallings, 2009, p. 55). This method of scheduling presented problems whenever a program generated errors or the computer failed: the user might run out of the allotted time slot and have to reschedule the job for another time.


Simple Batch Processing

After Serial Processing, the mid-1950s introduced Simple Batch Processing, the first operating system. This technology improved the efficiency of scheduling and setup time: an operator loaded user jobs sequentially in batches, which monitor software then accessed (Stallings, 2009, p. 58). The monitor processed each job in the order it was loaded. When one job finished, the monitor ran the next job in line from the batch until all jobs completed. Despite being an improvement over Serial Processing, Simple Batch Processing was slow and consumed large amounts of processing time.
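The monitor's behavior can be sketched in a few lines. This is a hypothetical illustration, not from the source: each job is modeled as a callable, and the monitor simply runs each one to completion in load order before starting the next.

```python
# Toy sketch (illustrative only) of a batch monitor: jobs are loaded
# in order, and the monitor runs each one to completion before moving
# on to the next job in the batch.

def monitor(batch):
    """Process each job sequentially, in the order it was loaded."""
    results = []
    for run_job in batch:
        results.append(run_job())   # run to completion, then move on
    return results

# Hypothetical jobs standing in for punched-card programs.
batch = [lambda: "payroll done", lambda: "inventory done"]
results = monitor(batch)
```

The weakness the essay notes is visible here: while one job waits on a slow device, the loop cannot start the next one, so the processor sits idle.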


Multitasking

Multitasking, an improved form of Simple Batch Processing, took advantage of processor idle time by loading the processor with multiple user jobs. When one program finishes processing, the results transfer to an I/O device, and the processor executes another job waiting in memory (Stallings, 2009, p. 60). Multitasking utilizes computer resources efficiently as it switches between jobs until each one completes. Operating systems like Microsoft Windows 7 still use multitasking today.
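The switching described above can be sketched with Python generators. This is a simplified, hypothetical model (the names and mechanics are illustrative, not from the source): each job yields whenever it would wait on I/O, and a loop hands the processor to the next job in memory.

```python
# Minimal sketch (assumption: each job signals an I/O wait by yielding)
# of multitasking: the loop switches between jobs until all complete.

def job(name, steps):
    """A job that alternates between computing and waiting on I/O."""
    for step in range(steps):
        yield f"{name}: step {step}"   # pause here as if waiting on I/O

def multitask(jobs):
    """Switch the processor between jobs until every job completes."""
    log = []
    while jobs:
        current = jobs.pop(0)
        try:
            log.append(next(current))
            jobs.append(current)       # job not finished: requeue it
        except StopIteration:
            pass                       # job completed: drop it
    return log

log = multitask([job("A", 2), job("B", 2)])
```

Because control passes to another job at every I/O wait, the work of jobs A and B interleaves instead of running strictly one after the other.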

Time-Sharing Systems

Time-sharing is an extension of Multitasking: a technique that lets multiple users share system resources simultaneously, giving each user the opportunity to interact directly with the computer (Stallings, 2009, p. 62). Using a terminal and keyboard, each user submits a job request by pressing a transmit key and waits for a response from the processor. The intention of Time-sharing is to minimize response time to the user, reduce idle time, and still maximize processor usage (Stallings, 2009, p. 62). Today, UNIX systems still use Time-sharing.


Interrupts

As technology further evolved, computer vendors designed hardware to support I/O interrupt controls for Multitasking systems. Interrupts provide flow control for activities between the processor and I/O devices. A controller built into the processor manages interrupt requests, and an interrupt handler, a program contained in the OS, processes I/O routines. Each I/O device uses an interrupt request (IRQ) to send a signal alerting the processor that it is ready for an action. Using the interrupt controller, the processor can transfer control of a job waiting on an I/O operation to the appropriate device (Stallings, 2009, p. 62). After a device sends a signal, the processor temporarily halts the current program and enters an interrupt service routine (Stallings, 2009, p. 18). When the OS routine is complete, the processor resumes the program it paused before the interrupt.
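The dispatch cycle described above can be sketched as follows. All names and IRQ numbers here are illustrative assumptions, not from the source: a table maps each IRQ number to its service routine, and raising an interrupt saves the paused program's position, runs the handler, and resumes.

```python
# Hedged sketch (illustrative only) of interrupt dispatch: each IRQ
# number maps to a handler; when a device raises an IRQ, the
# "processor" saves its place, runs the handler, then resumes.

handlers = {}  # IRQ number -> interrupt service routine

def register_handler(irq, routine):
    """The OS installs a service routine for a given IRQ number."""
    handlers[irq] = routine

def raise_interrupt(irq, program_counter):
    """Simulate a device interrupt: save state, run handler, resume."""
    saved_pc = program_counter       # save where the program paused
    result = handlers[irq]()         # enter the interrupt service routine
    return saved_pc, result          # resume at the saved location

# Hypothetical example: IRQ 14 signals that a disk read finished.
register_handler(14, lambda: "disk read complete")
resume_at, status = raise_interrupt(14, program_counter=42)
```

The key point the sketch captures is that the interrupted program's state is preserved, so execution continues exactly where it paused once the routine finishes.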


Scheduling and Resource Management

An important role of an operating system is to manage system resources efficiently and schedule processes so that each one has an equal opportunity to complete its job. To accomplish this, an OS uses a scheduler and resource management to control memory, I/O devices, programs, and data (Stallings, 2009, p. 53). The OS uses a Short-term queue to hold processes loaded in main memory while they wait for processor availability. When notified, the Short-term scheduler selects the next job by priority level or using a round-robin technique (Stallings, 2009, p. 73). The OS places new job requests in the Long-term queue and moves a process to the Short-term queue when memory space becomes available. Together, the scheduler and resource manager ensure each process has a fair amount of time to complete its task as efficiently as possible.
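The two-queue model above can be sketched concretely. This is a hedged illustration under assumed values (the memory capacity and job names are invented for the example): new jobs wait in the Long-term queue, move into the Short-term queue when memory frees up, and are then serviced round-robin.

```python
# Illustrative sketch of the two-queue scheduling model: a Long-term
# queue of new job requests, a Short-term queue of jobs in main
# memory, and a round-robin scheduler that services the latter.
from collections import deque

MEMORY_SLOTS = 2                              # assumed memory capacity

long_term = deque(["job1", "job2", "job3"])   # new job requests
short_term = deque()                          # jobs loaded in main memory

def admit_jobs():
    """Move jobs from the Long-term queue while memory is available."""
    while long_term and len(short_term) < MEMORY_SLOTS:
        short_term.append(long_term.popleft())

def run_one_slice():
    """Round-robin: run the next job for one time slice, then requeue."""
    job = short_term.popleft()
    short_term.append(job)
    return job

admit_jobs()
order = [run_one_slice() for _ in range(4)]
```

With two memory slots, job1 and job2 alternate time slices while job3 waits in the Long-term queue until a slot opens, which is the fairness property the paragraph describes.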

Flavors of Linux Operating Systems

Linux is an open source operating system built on a Unix-like platform that comes in a wide variety of distributions. Linus Torvalds wrote the first version of Linux in 1991 while attending college (Stallings, 2009, p. 94). Since that time, many people have written versions for various reasons. Some felt compelled to create their own version; others may have written versions in hopes of making money. Groups of people collaborate to improve current versions, and businesses may write their own versions to perform specific functions. Linux distributions are suitable for business environments as well as home use. More than 200 distributions are available worldwide, and many are based on the following five: RedHat, Debian, Mandriva, Slackware, and Fedora (Thoke, O. 2009). A few examples of distribution types are power-user editions for experts, Live CDs that boot directly from a CD, and personal editions for novice users.


A version I could apply in our office is Debian GNU/Linux. After visiting the project's website, Debian appears to be a viable option for an office environment and a software development team since it comes with over 25,000 packages as part of the distribution. Debian claims it runs on almost all personal computers, supports many devices, and is easy to install, upgrade, and maintain (Debian Project, 2010). As a software developer, I want a system that is dependable and easy to manage.


Operating systems are an evolving work in progress; developers learn lessons from previous concepts to improve the next. Batch Processing overcame the problems of scheduling and job setup posed by Serial Processing. However, it was still an inefficient system that caused long delays between jobs. Multitasking and Time-Sharing resolved the issue of processor idle time by allowing multiple jobs and users to interact simultaneously. Hardware vendors played an important role in improving performance by creating better processors and I/O controls. Current technology continues to use Multitasking and Time-Sharing concepts; today, however, graphical user interfaces incorporate them within desktop environments.
