Evolution and History of Programming Languages
EVOLUTION
To write a program for a computer, we must use a computer language. A computer language is a set of predefined words that are combined into a program according to predefined rules (syntax). Over the years, computer languages have evolved from machine language to high-level languages.
Machine languages
In the earliest days of computers, the only programming languages available were machine languages. Each computer had its own machine language, which was made of streams of 0s and 1s. In Chapter 5 we showed that a primitive hypothetical computer needs eleven lines of code to read two integers, add them, and print the result. Written in machine language, these become eleven lines of binary code, each 16 bits long, as shown in Table 9.1.
Machine languages (first-generation languages) are the most basic type of computer languages, consisting of strings of numbers the computer's hardware can use.
Different types of hardware use different machine code. For example, IBM computers use a different machine language from Apple computers.
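To make the idea of a 16-bit machine instruction concrete, the sketch below packs an opcode and an operand address into one binary word. The 4-bit opcode / 12-bit address layout is an illustrative assumption, not the actual format of the book's hypothetical machine.

```python
# Hypothetical layout: 4-bit opcode in the high bits, 12-bit operand
# address in the low bits (illustrative only).
def encode(opcode: int, address: int) -> str:
    """Pack an opcode and address into a 16-bit binary string."""
    word = (opcode << 12) | (address & 0xFFF)
    return format(word, "016b")

# e.g. a hypothetical opcode 1 with operand address 64
print(encode(1, 64))  # -> 0001000001000000
```

Each line of a machine-language program is one such 16-bit word; a programmer working in machine language would have to write these bit patterns by hand.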
Assembly languages
The next evolution in programming came with the idea of replacing binary code for instructions and addresses with symbols, or mnemonics. Because they used symbols, these languages were first known as symbolic languages; they were later referred to as assembly languages. The assembly language for our hypothetical computer, replacing the machine language in Table 9.1, is shown in Program 9.1.
Assembly languages (second-generation languages) are only somewhat easier to work with than machine languages. To create programs in assembly language, developers use cryptic English-like phrases to represent strings of numbers. The code is then translated into object code, using a translator called an assembler.
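The translation step an assembler performs can be sketched as a table lookup: each mnemonic maps to an opcode, and the operand is packed alongside it. The mnemonic names, opcode values, and one-operand format below are invented for illustration; real assemblers also handle labels, directives, and symbol resolution, typically in multiple passes.

```python
# Toy assembler sketch (hypothetical mnemonics and opcodes).
OPCODES = {"LOAD": 0b0001, "ADD": 0b0010, "STORE": 0b0011, "HALT": 0b1111}

def assemble(line: str) -> str:
    """Translate one 'MNEMONIC operand' line into a 16-bit binary word."""
    parts = line.split()
    opcode = OPCODES[parts[0]]
    address = int(parts[1]) if len(parts) > 1 else 0
    return format((opcode << 12) | (address & 0xFFF), "016b")

# A symbolic program and its machine-language translation
program = ["LOAD 64", "ADD 65", "STORE 66", "HALT"]
for line in program:
    print(assemble(line))
```

The symbolic form on the left is what the programmer writes; the binary words printed on the right are what the hardware actually executes, which is why every assembly program must pass through an assembler before it can run.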