ASCII
ASCII was developed in the 1960s for teleprinters and became the foundation for modern computer character sets, HTML, and the Internet. Unicode extends ASCII to support more than a million code points, with UTF-8 becoming the dominant encoding on the web in the late 2000s.
ASCII is a character encoding system that converts text into numerical values for computers and electronic devices. It uses a seven-bit binary system to represent 128 characters, including letters, numbers, punctuation, and control codes. Thirty-three of those codes (0–31 plus 127) handle non-printing functions like tab and line feed, while the remaining 95 are printable characters. Extended eight-bit variants allow for 256 characters, adding symbols from other languages. ASCII has been a key part of computing and data communication for decades.
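To see those numeric values in action, here's a minimal sketch using Python's built-in ord() and chr() (the string "Hi!" is just an arbitrary example):

```python
# ord() gives a character's ASCII code; chr() goes the other way.
print(ord('A'))        # 65
print(chr(65))         # 'A'
print(ord('\t'))       # 9  -- tab, a non-printing control code
print(ord('\n'))       # 10 -- line feed, also a control code

# Every ASCII character fits in seven bits:
for ch in "Hi!":
    print(ch, format(ord(ch), '07b'))  # e.g. H -> 1001000
```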
Roman Numerals
Roman numerals originated in ancient Rome and were used across Europe until the Late Middle Ages. They represent numbers using Latin letters, each with a fixed value. This system remained the standard for writing numbers for centuries.
Roman numerals are read based on addition and subtraction rules. Repeating a letter adds its value (II = 2), while placing a smaller letter before a larger one means subtraction (IV = 4). If a larger letter comes first, the values are added (CX = 110). The letters V, L, and D can’t be repeated, and only I, X, and C are used for subtraction. Their exact origin is unclear, but they may have been created for trade.
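Those addition and subtraction rules translate directly into code. Here's a minimal Python sketch (the function name roman_to_int is our own, and it assumes a well-formed numeral with no validation):

```python
VALUES = {'I': 1, 'V': 5, 'X': 10, 'L': 50, 'C': 100, 'D': 500, 'M': 1000}

def roman_to_int(numeral: str) -> int:
    total = 0
    for i, letter in enumerate(numeral):
        value = VALUES[letter]
        # Subtract when a smaller letter precedes a larger one (IV = 4);
        # otherwise add its value.
        if i + 1 < len(numeral) and value < VALUES[numeral[i + 1]]:
            total -= value
        else:
            total += value
    return total

print(roman_to_int("II"))   # 2
print(roman_to_int("IV"))   # 4
print(roman_to_int("CX"))   # 110
```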
Binary Numbers
The modern binary number system was studied in Europe in the 16th and 17th centuries by Thomas Harriot and Gottfried Leibniz. However, systems related to binary numbers appeared earlier in multiple cultures, including ancient Egypt, China, Europe, and India.
Binary numbers are the foundation of computing. The binary, or base-2, system represents numbers using only two symbols, 0 and 1, known as bits. Each bit position corresponds to a power of two, so converting binary to decimal means multiplying each bit by its power of two and adding the results. A rational number has a finite binary representation exactly when it is the quotient of an integer by a power of two; all other rationals have repeating binary expansions. Binary is essential throughout digital technology and computing.
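Here's a minimal Python sketch of that conversion, using an arbitrary example value alongside the built-ins int() and bin():

```python
bits = "1011"  # hypothetical example value

# Multiply each bit by its power of two and sum the results.
total = sum(int(bit) * 2 ** power
            for power, bit in enumerate(reversed(bits)))
print(total)          # 11, since 1*8 + 0*4 + 1*2 + 1*1
print(int(bits, 2))   # 11 -- Python's built-in does the same thing
print(bin(11))        # '0b1011' -- and back again
```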
Characters
Characters in programming languages have evolved alongside computing. Early languages like FORTRAN (1957) and COBOL (1959) used basic character sets, while ASCII (1963) standardized text representation. Later, Unicode (1991) expanded character support to include global languages and symbols.
In programming, characters represent letters, digits, symbols, and control codes. In many languages a single character literal is written between single quotes and stored in a type such as char (or produced by functions like chr). Character strings, enclosed in double quotes, can include letters, digits, spaces, and special symbols. Control characters like newlines and tabs perform specific functions rather than displaying visible symbols. These characters play a crucial role in coding, formatting, and device communication.
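A short Python sketch illustrates the distinction (note that Python has no separate char type, so a character is simply a length-1 string):

```python
letter = 'A'
print(ord(letter))        # 65 -- the character's code point
print(chr(97))            # 'a' -- and back again

greeting = "Hello, world!"         # a character string
print(len(greeting))               # 13 characters, spaces included

# Control characters format output instead of printing a visible symbol:
print("col1\tcol2\nrow2")          # \t = tab, \n = newline
```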
Octal Numbers
Octal, or base-8, has been around since the early days of computing. It was a handy way for engineers to work with binary in a more readable format. While it was widely used in older computers, it later took a backseat to hexadecimal.
Octal is a number system that uses only eight digits: 0 to 7. It’s like a shortcut for writing long binary numbers, since each octal digit represents a group of three binary digits. Back in the day, programmers and engineers used it heavily for reading and writing machine code, especially on early computers whose word sizes were multiples of three bits. You can still see octal in UNIX file permissions, where numbers like 755 or 644 control access to files. While it’s not as common today, it remains a cool and useful way to handle numbers in computing!
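Here's a minimal Python sketch of the three-bits-per-digit relationship, using the 755 and 644 permission values mentioned above:

```python
# 0o prefix for octal literals; oct() and int(s, 8) convert back and forth.
print(0o755)                 # 493 in decimal
print(oct(493))              # '0o755'
print(int('644', 8))         # 420

# 755 as UNIX permissions: each digit expands to three bits, the
# rwx flags for owner / group / others -> 111 101 101
for digit in '755':
    print(digit, format(int(digit, 8), '03b'))
```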
Hexadecimal Numbers
Hexadecimal, or base-16, became popular in computing as a more efficient way to represent binary numbers. It was widely adopted in the 1960s when computers started using 8-bit bytes, making hex a perfect fit. Today, it’s still used in programming, memory addresses, and color codes.
Hexadecimal uses 16 symbols: the digits 0-9 and the letters A-F. It’s a shorthand for binary, where each hex digit represents four binary digits, making it easier to read and write long binary numbers. You’ve probably seen hex in HTML color codes, like #FF5733, which defines colors on websites. It’s also essential in programming, helping computers store and process data efficiently. While we don’t use it in everyday math, it’s a must-know for tech and coding!
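Here's a minimal Python sketch of the four-bits-per-digit relationship, including splitting the #FF5733 color code mentioned above into its red, green, and blue components:

```python
# 0x prefix for hex literals; hex() and int(s, 16) convert back and forth.
print(0xFF)                      # 255
print(hex(255))                  # '0xff'
print(format(0xF, '04b'))        # '1111' -- one hex digit = four bits

# Splitting the HTML color #FF5733 into red/green/blue components:
color = "FF5733"
r, g, b = (int(color[i:i + 2], 16) for i in range(0, 6, 2))
print(r, g, b)                   # 255 87 51
```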