2
Human language: commonly used to express feelings and to understand what other people express.
Computer language: a language by which a user commands a computer to carry out an algorithm the user has written and produce an output.
There are two types of computer languages:
Low-level languages
High-level languages
3
Program: a finite set of instructions that performs a specific task. A program is usually written in a programming language, installed on a computer, and then used by the user. A computer does not understand any language except machine language. Machine language consists of 0s and 1s (binary code). Applications and programs are translated from the language in which they are written into machine language using a suitable tool such as an interpreter, compiler, or assembler.
4
To build programs, people use languages that are similar to human language. The results are translated into machine code, which computers understand.
Machine language
Machine languages (first-generation languages) are the most basic type of computer languages, consisting of strings of numbers (0, 1) that the computer's hardware can understand. Different types of hardware use different machine code. For example, IBM computers use a different machine language than Apple computers.
5
Assembly languages (second-generation languages) are only slightly easier to work with than machine languages. To create programs in assembly language, developers use cryptic English-like phrases to represent strings of numbers. The code is then translated into object code (machine language code) using a translator called an assembler.
6
Figure: Assembly code → Assembler → Object code
7
Higher-level languages are more powerful than assembly language and allow the programmer to work in a more English-like environment (they are more flexible). Higher-level programming languages are divided into three "generations," each more powerful than the last:
Third-generation languages
Fourth-generation languages
Fifth-generation languages
8
Bit (binary digit): the basic and smallest unit of information in every modern computer system.
Nibble: a four-bit binary number.
Byte (short for binary term): a unit of storage for a single character, typically an eight-bit (2-nibble) binary number.
1 bit = a single digit, either 1 or 0
8 bits = 1 byte, a combination of 1s and 0s
1024 bytes = 1 KB (kilobyte)
1024 kilobytes = 1 MB (megabyte)
1024 megabytes = 1 GB (gigabyte)
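As a quick illustration, the unit relationships above can be written as a minimal Python sketch (the constant names KB, MB, and GB are our own illustrative choices):

```python
# Storage unit sizes in bytes, following the powers-of-1024 definitions above.
BITS_PER_BYTE = 8
KB = 1024          # 1 kilobyte = 1024 bytes
MB = 1024 * KB     # 1 megabyte = 1024 kilobytes
GB = 1024 * MB     # 1 gigabyte = 1024 megabytes

print(GB)                   # 1073741824 bytes in a gigabyte
print(GB * BITS_PER_BYTE)   # 8589934592 bits in a gigabyte
```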
9
When we count upward through the positive integers in decimal, we start with a 0 in the ones place and increment that value until we reach the upper limit of a single digit, i.e., 9. At that point we have run out of the symbols we use to count, so we increment the next digit, the tens place, reset the ones place to zero, and start the cycle again.
10
Figure: Counting in Decimal
Counting in binary works exactly the same way. The only difference is that decimal uses ten symbols (0, 1, 2, 3, 4, 5, 6, 7, 8, and 9) while binary uses only two symbols (0 and 1).
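A small Python sketch makes the parallel concrete, counting upward and printing each value in both decimal and binary so the same carry-and-reset cycle is visible with only two symbols:

```python
# Count from 0 to 8, printing each value in decimal and in 4-digit binary.
# Binary "carries" as soon as a digit would pass 1, just as decimal
# carries when a digit would pass 9.
for n in range(9):
    print(f"{n:>2}  ->  {n:04b}")
```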
11
To convert a decimal number to a binary number, we repeatedly divide the original integer by 2; this is called the remainder method. For example, we apply the remainder method to convert (23)₁₀ to base 2. As shown, the integer is repeatedly divided by 2, and each division leaves a remainder of 0 or 1.
12
Figure: A conversion from a base 10 integer to a base 2 integer using the remainder method:
23 ÷ 2 = 11, remainder 1
11 ÷ 2 = 5, remainder 1
5 ÷ 2 = 2, remainder 1
2 ÷ 2 = 1, remainder 0
1 ÷ 2 = 0, remainder 1
Reading the remainders from last to first gives (23)₁₀ = (10111)₂.
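These division steps translate naturally into a few lines of Python; this is a sketch of the remainder method, and the helper name to_binary is an illustrative choice:

```python
def to_binary(n):
    """Convert a non-negative decimal integer to a binary string
    using the remainder method."""
    if n == 0:
        return "0"
    remainders = []
    while n > 0:
        remainders.append(str(n % 2))  # remainder of dividing by 2: 0 or 1
        n //= 2                        # integer-divide by 2 and repeat
    # The first remainder is the least significant bit, so the
    # collected digits are read in reverse order.
    return "".join(reversed(remainders))

print(to_binary(23))  # 10111
```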
13
We can check the result by converting it from base 2 back to base 10 using the polynomial method:
(10111)₂ = 1×2⁴ + 0×2³ + 1×2² + 1×2¹ + 1×2⁰ = 16 + 0 + 4 + 2 + 1 = (23)₁₀
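The polynomial method can be sketched the same way, multiplying each digit by the power of 2 matching its position; the helper name from_binary is again an illustrative choice:

```python
def from_binary(bits):
    """Convert a binary string back to a decimal integer using the
    polynomial method: each digit times 2 to the power of its position."""
    value = 0
    for position, digit in enumerate(reversed(bits)):
        value += int(digit) * 2 ** position  # digit times its place value
    return value

print(from_binary("10111"))  # 23, i.e. 16 + 0 + 4 + 2 + 1
```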
14
Convert the following decimal numbers to binary numbers:
1- 4
2- 8
3- 6
4- 10
5- 25
6- 30
7- 33
15
Convert the following binary numbers to decimal numbers:
1- 1001
2- 1110
3- 0010
4- 111
5- 11
6- 01010
7- 1100
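As a quick self-check for both exercise sets, Python's built-in bin() and int() functions can verify your answers: bin(n) returns a '0b'-prefixed binary string, and int(s, 2) parses a base-2 string:

```python
# Decimal-to-binary answers: strip the '0b' prefix from bin(n).
for n in (4, 8, 6, 10, 25, 30, 33):
    print(n, "->", bin(n)[2:])

# Binary-to-decimal answers: int(s, 2) parses a base-2 string.
for s in ("1001", "1110", "0010", "111", "11", "01010", "1100"):
    print(s, "->", int(s, 2))
```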
16
Thank You! Questions?