Introduction

  • Early computer systems used electrical switches. Although these switches were later replaced by less mechanical devices, first the vacuum tube, then the transistor, then the integrated circuit, the concept of switching on and off remained central to computers, and a way of representing this on/off behaviour was needed.
  • The term digital, in computing and electronics, refers to converting real-world information into binary numeric form.
  • The binary (base-2) number system represents two discrete values using two symbols, or digits: 0 and 1.
  • Because two states are relatively simple to implement in electronic circuits built from switches, the binary system maps naturally onto all modern digital computers.
  • The binary number system, in which a zero symbolizes the absence of electrical current (OFF) and a one symbolizes the presence of current (ON), became the standard means of representing a computer's internal workings.
  • By combining sequences of these 0s and 1s (OFF/ON), a computer can represent arbitrarily complex information.
  • All computer data (alphanumeric symbols and characters, audio, graphics and video) are represented, or encoded, as sequences of binary digits that are interpreted by the appropriate software.
  • Below are some simple examples illustrating how binary numbers may be used to represent different data according to different interpretive systems.
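As a quick illustration, the following minimal Python sketch (with an arbitrarily chosen example bit pattern) shows how the same sequence of bits can be read either as an unsigned binary number or as a text character, depending on the interpretation applied:

    # The same 8-bit pattern, interpreted two different ways.
    bits = "01000001"           # example bit pattern (chosen arbitrarily)

    as_number = int(bits, 2)    # interpreted as an unsigned binary integer
    as_text = chr(as_number)    # interpreted as an ASCII/Unicode character

    print(as_number)            # 65
    print(as_text)              # A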