What Is a Microprocessor? An Information Technology Essay
A microprocessor incorporates most or all of the functions of a computer’s central processing unit (CPU) on a single integrated circuit (IC, or microchip). Computer processors were for a long period constructed out of small- and medium-scale ICs containing the equivalent of a few to a few hundred transistors. The integration of the whole CPU onto a single chip therefore greatly reduced the cost of processing capacity. From their humble beginnings, continued increases in microprocessor capacity have rendered other forms of computers almost completely obsolete, with one or more microprocessors serving as the processing element in everything from the smallest embedded systems and handheld devices to the largest mainframes and supercomputers.
History of the Microprocessor
In November 1971, a company called Intel publicly introduced the world’s first single-chip microprocessor, the Intel 4004 (U.S. Patent #3,821,715), invented by Intel engineers Federico Faggin, Ted Hoff, and Stan Mazor. After the invention of integrated circuits revolutionized computer design, the only direction left to go was down in size. The Intel 4004 chip took the integrated circuit one step further by placing all the parts that made a computer think on one small chip. Programming intelligence into inanimate objects had now become possible.
History of Intel
In 1968, Bob Noyce and Gordon Moore were two unhappy engineers working for the Fairchild Semiconductor Company who decided to quit and create their own company at a time when many Fairchild employees were leaving to create start-ups. People like Noyce and Moore were nicknamed the “Fairchildren”.
Bob Noyce typed up a one-page description of what he wanted to do with his new company, and that was enough to convince San Francisco venture capitalist Art Rock to back Noyce’s and Moore’s new venture. Rock raised $2.5 million in less than two days.
The Intel 4004 Microprocessor
The 4004 was the world’s first universal microprocessor. In the late 1960s, many scientists had discussed the possibility of a computer on a chip, but nearly everyone felt that integrated circuit technology was not yet ready to support such a chip. Intel’s Ted Hoff felt differently; he was the first person to recognize that the new silicon-gate MOS technology might make a single-chip CPU (central processing unit) possible.
Hoff and the Intel team developed such an architecture with just over 2,300 transistors in an area of only 3 by 4 millimeters. With its 4-bit CPU, command register, decoder, decoding control, machine-command monitoring and interim register, the 4004 was one heck of a little invention. Today’s 64-bit microprocessors are still based on similar designs, and the microprocessor is still the most complex mass-produced product ever, with more than 5.5 million transistors performing hundreds of millions of calculations each second – numbers that are sure to be outdated fast.
How Microprocessors Work
The computer you are using to read this page uses a microprocessor to do its work. The microprocessor is the heart of any normal computer, whether it is a desktop machine, a server or a laptop. The microprocessor you are using might be a Pentium, a K6, a PowerPC, a SPARC or any of the many other brands and types of microprocessors, but they all do approximately the same thing in approximately the same way.
A microprocessor — also known as a CPU or central processing unit — is a complete computation engine that is fabricated on a single chip. The first microprocessor was the Intel 4004, introduced in 1971. The 4004 was not very powerful — all it could do was add and subtract, and it could only do that 4 bits at a time. But it was amazing that everything was on one chip. Prior to the 4004, engineers built computers either from collections of chips or from discrete components (transistors wired one at a time). The 4004 powered one of the first portable electronic calculators.
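To make the “4 bits at a time” point concrete, here is a small illustrative sketch in Python (not actual 4004 code): a 4-bit ALU can still add wider numbers by processing one 4-bit nibble per step and carrying between steps.

```python
# Illustrative sketch only: how a 4-bit ALU like the 4004's can add
# wider numbers by working 4 bits (one "nibble") at a time,
# propagating a carry between steps.

def add_4bit(a, b, carry_in):
    """Add two 4-bit values plus a carry; return (4-bit sum, carry out)."""
    total = a + b + carry_in
    return total & 0xF, total >> 4  # keep the low 4 bits; the carry is bit 4

def add_wide(x, y, nibbles=4):
    """Add two wider numbers nibble by nibble, as a 4-bit CPU must."""
    result, carry = 0, 0
    for i in range(nibbles):
        s, carry = add_4bit((x >> 4 * i) & 0xF, (y >> 4 * i) & 0xF, carry)
        result |= s << 4 * i
    return result

print(add_wide(1234, 4321))  # 5555, computed four bits at a time
```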
Microprocessor performance
The number of transistors available has a huge effect on the performance of a processor. For example, a typical instruction in a processor like the 8088 took 15 clock cycles to execute. Because of the design of its multiplier, it took approximately 80 cycles just to do one 16-bit multiplication on the 8088. With more transistors, much more powerful multipliers capable of single-cycle speeds become possible.
More transistors also allow for a technology called pipelining. In a pipelined architecture, instruction execution overlaps. So even though it might take five clock cycles to execute each instruction, there can be five instructions in various stages of execution simultaneously. That way it looks like one instruction completes every clock cycle.
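A minimal sketch of that arithmetic, assuming a hypothetical five-stage pipeline into which one new instruction can enter per cycle:

```python
# Minimal sketch of why pipelining raises throughput, assuming a
# five-stage pipeline: each instruction still takes 5 cycles of
# latency, but a new instruction can enter the pipeline every cycle.

STAGES = 5

def cycles_unpipelined(n_instructions):
    # Each instruction must finish all stages before the next starts.
    return n_instructions * STAGES

def cycles_pipelined(n_instructions):
    # The first instruction takes STAGES cycles to fill the pipeline;
    # after that, one instruction completes every cycle.
    return STAGES + (n_instructions - 1)

n = 1000
print(cycles_unpipelined(n))  # 5000 cycles
print(cycles_pipelined(n))    # 1004 cycles: roughly one per cycle
```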
64-bit Microprocessors
Sixty-four-bit processors have been with us since 1992, and in the 21st century they have started to become mainstream. Both Intel and AMD have introduced 64-bit chips, and the Mac G5 sports a 64-bit processor. Sixty-four-bit processors have 64-bit ALUs, 64-bit registers, 64-bit buses and so on.
One reason why the world needs 64-bit processors is their enlarged address spaces. Thirty-two-bit chips are often constrained to a maximum of 2 GB or 4 GB of RAM access. That sounds like a lot, given that most home computers currently use only 256 MB to 512 MB of RAM. However, a 4-GB limit can be a severe problem for server machines and machines running large databases. And even home machines will start bumping up against the 2 GB or 4 GB limit pretty soon if current trends continue. A 64-bit chip has none of these constraints, because a 64-bit address space is essentially infinite for the foreseeable future: 2^64 bytes of RAM is roughly 17 billion gigabytes, as the quick check below shows.
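A back-of-the-envelope check of those limits, as a short illustrative Python snippet:

```python
# Back-of-the-envelope check of the address-space limits above.
GB = 2**30  # one gigabyte, in bytes

print(2**32 // GB)  # 4 -> a 32-bit address can reach only 4 GB
print(2**64 // GB)  # 17179869184 -> roughly 17 billion gigabytes
```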
With a 64-bit address bus and wide, high-speed data buses on the motherboard, 64-bit machines also offer faster I/O (input/output) speeds to things like hard disk drives and video cards. These features can greatly increase system performance.
Use of microprocessors
At first, the use of microprocessors was limited to task-specific operations in industries such as the automobile sector. The concept of a ‘personal computer’ was still a distant dream for the world, and microprocessors had yet to come into personal use. Sixteen-bit microprocessors became a commercial success in the 1980s, the first popular one being the TMS9900 of Texas Instruments.
Intel developed the 8086, which still serves as the base model for all the latest advancements in the microprocessor family. It was largely a complete processor, integrating all the required features. Motorola’s 68000 was among the early microprocessors to implement its instruction set with microcode. These designs were later extended to 32-bit architectures. Similarly, players like Zilog, IBM and Apple succeeded in getting their own products to market. However, Intel held a commanding position in the market right through the microprocessor era.
The 1990s saw large-scale application of microprocessors in the personal computer products developed by Apple, IBM and Microsoft. The decade witnessed a revolution in the use of computers, which by then had become a household entity.
This growth was complemented by highly sophisticated development in the commercial use of microprocessors. In 1993, Intel brought out its ‘Pentium’ processor, which remains one of the most popular processor families in use to date. It was followed by a series of excellent processors in the Pentium family, leading into the 21st century. The latest ones in commercial use are the Pentium Dual Core technology and the Xeon processor. They have opened up a whole new world of diverse applications.
Intel 8085
Produced: from 1977 to the 1990s
Common manufacturer(s): Intel and several others
Max. CPU clock rate: 3, 5 and 6 MHz
Instruction set: pre-x86
Package(s): 40-pin DIP
The Intel 8085 is an 8-bit microprocessor introduced by Intel in 1977. It was binary-compatible with the more-famous Intel 8080 but required less supporting hardware, thus allowing simpler and less expensive microcomputer systems to be built.
The “5” in the model number came from the fact that the 8085 required only a +5-volt (V) power supply rather than the +5V, -5V and +12V supplies the 8080 needed. Both processors were sometimes used in computers running the CP/M operating system, and the 8085 later saw use as a microcontroller.
The 8085 had a very long life as a controller. Once designed into such products as the DECtape controller and the VT100 video terminal in the late 1970s, it continued to serve for new production throughout the life span of those products (generally many times longer than the new manufacture lifespan of desktop computers).
The 8085 was a binary-compatible follow-up to the 8080, which was itself the successor to the original Intel 8008. The 8080 and 8085 used the same basic instruction set as the 8008 (developed by Computer Terminal Corporation), and they were source-code compatible with their predecessor. However, the 8080 added several useful 16-bit operations to the 8008 instruction set: the 16-bit stack pointer in the 8080 enabled the stack to hold parameters and local data as well as return addresses, just as in larger CPUs, and the single 8008 addressing mode via the HL register pair was complemented by direct addressing for 8/16-bit loads and stores. The ability to employ BC and DE as two additional 16-bit pointers was also new in the 8080. The 8085 added only a few relatively minor instructions above the 8080 set. The toy model below illustrates the stack discipline those 16-bit operations made possible.
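The following is a toy Python model (not 8080 code) of a downward-growing stack addressed by a 16-bit stack pointer, holding a return address, a parameter and a local together:

```python
# Toy model (Python, not 8080 assembly) of the stack discipline the
# 8080's 16-bit stack pointer made possible: return addresses,
# parameters and locals all live on one downward-growing stack.

memory = [0] * 0x10000   # 64 KB address space
sp = 0x10000             # stack pointer starts at the top of memory

def push(word):
    global sp
    sp -= 2                      # the stack grows downward
    memory[sp] = word & 0xFFFF   # one 16-bit word per cell, for simplicity

def pop():
    global sp
    word = memory[sp]
    sp += 2
    return word

# A "call": push the return address, then a parameter and a local.
push(0x0103)   # return address
push(42)       # parameter
push(7)        # local variable

local, arg, ret = pop(), pop(), pop()
print(hex(ret), arg, local)  # 0x103 42 7
```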
Intel 8086
The 8086 (also called the iAPX 86) is a 16-bit microprocessor chip designed by Intel which gave rise to the x86 architecture; development work on the 8086 design started in the spring of 1976, and the chip was introduced to the market in the summer of 1978. The Intel 8088, released in 1979, was a slightly modified chip with an external 8-bit data bus (allowing the use of cheaper and fewer supporting logic chips), and is notable as the processor used in the original IBM PC.
Background
In 1972, Intel launched the 8008, the first 8-bit microprocessor. It implemented an instruction set designed by Datapoint Corporation with programmable CRT terminals in mind, one that also proved to be fairly general-purpose.
Two years later, in 1974, Intel launched the 8080, employing the new 40-pin DIL packages originally developed for calculator ICs to enable a separate address bus. It had an extended instruction set that was source- (not binary-) compatible with the 8008 and also included some 16-bit instructions to make programming easier. The 8080 device was eventually succeeded by the 8085, which, as noted above, sufficed with a single +5 V supply.
The 8086 project started in May 1976 and was originally intended as a temporary substitute for the ambitious and delayed iAPX 432 project. It was an attempt to draw attention away from the less-delayed 16- and 32-bit processors of other manufacturers (such as Motorola, Zilog, and National Semiconductor) and at the same time to counter the threat from the Zilog Z80 (designed by former Intel employees), which had become very successful. Both the architecture and the physical chip were therefore developed rather quickly by a small group of people, using the same basic microarchitecture elements and physical implementation techniques as the slightly older 8085 (of which the 8086 would also serve as a continuation).
The first revision of the instruction set and high-level architecture was ready after about three months, and since almost no CAD tools were used, four engineers and 12 layout people worked on the chip simultaneously. The 8086 took a little more than two years from idea to working product, which was considered rather fast for a complex design in 1976-78.
The 8086 was sequenced using a mix of random logic and microcode and was implemented using depletion-load nMOS circuitry with approximately 20,000 active transistors (29,000 counting all ROM and PLA sites). It was soon moved to a new refined nMOS manufacturing process called HMOS (for High-performance MOS) that Intel originally developed for manufacturing fast static RAM products. This was followed by HMOS-II and HMOS-III versions and, eventually, a fully static CMOS version for battery-powered devices, manufactured using Intel’s CHMOS processes. The original chip measured 33 mm² and the minimum feature size was 3.2 µm.
Application
The microprocessor is provided with an instruction set consisting of various instructions such as MOV, ADD, SUB and JMP. These instructions are written in the form of a program, which is used to perform operations such as branching, addition, subtraction, bitwise logical operations and bit shifts. More complex operations and other arithmetic must be implemented in software; for example, multiplication is implemented using a multiplication algorithm, as sketched below.
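Here is the classic shift-and-add algorithm in Python as an illustration; a processor without a hardware multiply instruction, such as the 8085, must run an equivalent loop in its own instruction set.

```python
# Sketch of the kind of software multiply a processor without a
# hardware multiplier must run: the classic shift-and-add algorithm,
# shown here for unsigned operands.

def multiply(a, b):
    product = 0
    while b:
        if b & 1:            # is the lowest bit of the multiplier set?
            product += a     # add the shifted multiplicand
        a <<= 1              # shift the multiplicand left
        b >>= 1              # shift the multiplier right
    return product

print(multiply(25, 11))  # 275, built from shifts and adds only
```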
The 8085 processor found marginal use in small-scale computers into the 21st century. The TRS-80 Model 100 line used the 80C85. The CMOS 80C85 version of the NMOS/HMOS 8085 processor has had several manufacturers, and some versions (e.g., Tundra Semiconductor Corporation’s CA80C85B) offer additional functionality, such as extra machine-code instructions. One niche application for the radiation-hardened version of the 8085 has been in on-board instrument data processors for several NASA and ESA space physics missions.