I found that Blogspot has a new utility to update the look and feel of your blog:
Blogger Template Designer
For the next few weeks you might be seeing some updates to the style and look of the page. Let me know if there are any that you like.
Saturday, March 27, 2010
Wednesday, March 24, 2010
Toastmasters Speech #8 "Get Comfortable with Visual Aids"
Microprocessors
How many have heard of microprocessors? What do you think about when you hear the term? Do you think about computers like a Dell or an Apple? Of course. How about an iPhone or a cell phone or an Xbox? Ok, I can see that. What about an engine controller or a washing machine or a coffee maker? Wait a second. Microprocessors are widespread. Most electronic devices contain some form of microprocessor, which is the central brain or processing unit of the device. So, why are microprocessors so widespread and used in such a wide variety of applications? How does a microprocessor work, and how does it process data? Today, I would like to answer some of these questions as I explain the history of the microprocessor, the architecture of the microprocessor, and a simple example of how microprocessors work.
Before the microprocessor was invented, most digital circuits were custom made for a single purpose. Each one could perform only a specific function and could not be used for anything else; you could not reprogram it, and it was difficult to make updates to the hardware. This resulted in a long development cycle. Then, in 1969, an Intel employee named Ted Hoff invented a general-purpose device called a microprocessor that could be easily programmed. This allowed companies to have a much quicker development cycle, in which they only had to develop software for an already available microprocessor and could easily edit the program for quick enhancements or fixes.
So we know why we need microprocessors, but now we want to know: what is inside a microprocessor? This (pic 1) is a simplified diagram of a microprocessor. It consists of an Arithmetic Logic Unit (ALU), which performs addition and subtraction; an instruction fetcher and instruction decoder; registers to locally store data; and a memory interface to store data and the instructions. When a computer first starts up, a program is stored in memory and the instruction fetcher grabs the first instruction from memory. This instruction is passed on to the instruction decoder to determine the operation. Basically, a decoded instruction can do one of three things: add or subtract using the ALU, move data from one memory location to another, or make a decision and jump to a new set of instructions. This is the basic architecture of a microprocessor.
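To make that fetch, decode, and execute cycle a little more concrete, here is a minimal sketch in Python of a toy processor loop. The instruction format, opcode names, and register names are hypothetical stand-ins invented for illustration, not any real instruction set, but the loop shows the same three kinds of operations described above.

```python
# A toy fetch/decode/execute loop. The instruction encoding here is made up
# for illustration; it is not any real microprocessor's instruction set.

memory = {0: ("LOAD", "A", 100),   # load memory[100] into register A
          1: ("LOAD", "B", 101),   # load memory[101] into register B
          2: ("ADD",  "A", "B"),   # A = A + B (an ALU operation)
          3: ("STORE", "A", 102),  # store register A into memory[102]
          4: ("HALT",),
          100: 7, 101: 5, 102: 0}  # data stored alongside the program

registers = {"A": 0, "B": 0}
pc = 0  # program counter: address of the next instruction to fetch

while True:
    instruction = memory[pc]          # fetch the instruction from memory
    opcode = instruction[0]           # decode: figure out the operation
    if opcode == "LOAD":              # execute: move memory -> register
        _, reg, addr = instruction
        registers[reg] = memory[addr]
    elif opcode == "STORE":           # execute: move register -> memory
        _, reg, addr = instruction
        memory[addr] = registers[reg]
    elif opcode == "ADD":             # execute: use the ALU
        _, dst, src = instruction
        registers[dst] = registers[dst] + registers[src]
    elif opcode == "JUMP":            # execute: jump to a new instruction
        pc = instruction[1]
        continue
    elif opcode == "HALT":
        break
    pc += 1                           # move on to the next instruction

print(registers["A"], memory[102])    # both print 12
```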
How many of you have ever heard of binary code? It is basically just 1's and 0's. Now how many of you can read this (pic2)? I can't either. Thank goodness a microprocessor can. What this represents is a set of instructions that a simple microprocessor can read. Of course no one else can, but using the following tables we can decode exactly what the microprocessor is instructed to do. Now if you feel a little overwhelmed by this table, that is OK; it is only here for reference. From this table (pic3) we can see, for example, that the instruction ADD gets mapped to the binary value 10. You can also see from the next table (pic4) that each register and memory location is also mapped to a binary value. Using these tables we can encode an instruction to add register A and register B (pic5). Building on that example (pic6), we are able to decipher the binary code that I showed you before. It shows that the microprocessor is instructed to load the contents of memory1 into register A, load the contents of memory2 into register B, add register A and register B and put the result in register A, and finally store the value of register A into memory3. Congratulations, you just finished your first software program.
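As a rough sketch of how that encoding works, here is a small Python example that maps mnemonics, registers, and memory locations to binary values and then decodes the four-instruction program described above. Only ADD = 10 comes from the table in the speech; every other bit pattern here is an invented placeholder for the values shown in pic3 and pic4.

```python
# Illustrative encoding tables. Only ADD -> "10" is taken from the speech;
# the other bit patterns are placeholders for the values in the slides.
OPCODES = {"LOAD": "00", "STORE": "01", "ADD": "10", "JUMP": "11"}
OPERANDS = {"A": "000", "B": "001",                        # registers
            "mem1": "100", "mem2": "101", "mem3": "110"}   # memory locations

def encode(mnemonic, dst, src):
    """Turn a human-readable instruction into a binary string."""
    return OPCODES[mnemonic] + OPERANDS[dst] + OPERANDS[src]

def decode(bits):
    """Turn a binary string back into a human-readable instruction."""
    opcode = {v: k for k, v in OPCODES.items()}[bits[:2]]
    dst = {v: k for k, v in OPERANDS.items()}[bits[2:5]]
    src = {v: k for k, v in OPERANDS.items()}[bits[5:8]]
    return opcode, dst, src

# The program from the example: mem1 -> A, mem2 -> B, A = A + B, A -> mem3
program = [encode("LOAD", "A", "mem1"),
           encode("LOAD", "B", "mem2"),
           encode("ADD",  "A", "B"),
           encode("STORE", "mem3", "A")]

for bits in program:
    print(bits, decode(bits))
```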
In conclusion, I have given you an overview of the history and operation of the microprocessor. We have seen why microprocessors are necessary for quick development and can support a variety of applications, and we have seen a simple example of how a microprocessor works. When Ted Hoff invented the microprocessor, little did he know the impact that microprocessors would have on the world over the next 40 years; now it would be difficult to imagine life without them.