What is a computer? A computer is a digital electronic machine that can be programmed to perform arithmetic and logical operations (computation) in a predetermined order. Programs are collections of such operations that enable modern computers to carry out a wide variety of tasks. A computer system is a “full” computer that includes the necessary hardware, operating system (primary software), and peripheral devices for “full” operation. The term can also refer to a connected, cooperating group of computers, such as a computer network or a data network.
Computers control a wide range of industrial and consumer equipment. Examples include simple special-purpose devices such as microwave ovens and remote controls, factory machines such as industrial robots and computer-aided design systems, and general-purpose machines such as laptops and mobile devices such as smartphones. Computers also power the Internet, which links billions of other computers and their users.
Early computers were designed to be used only for calculations. Since ancient times, simple manual devices such as the abacus have aided people in performing computations. Early in the Industrial Revolution, some mechanical devices were built to automate long, repetitive tasks, such as guiding patterns for looms.
In the early twentieth century, more sophisticated electrical machines performed specialized analog calculations. The first digital electronic calculating machines were developed during World War II. The first semiconductor transistors appeared in the late 1940s, followed by silicon-based MOSFET (MOS transistor) and monolithic integrated circuit (IC) technologies in the late 1950s, paving the way for the microprocessor and microcomputer revolutions of the 1970s.
Since then, computer speed, power, and versatility have increased tremendously, with transistor counts rising rapidly (as predicted by Moore’s law), driving the Digital Revolution of the late twentieth and early twenty-first centuries.
A modern computer typically has at least one processing element, such as a central processing unit (CPU) in the form of a microprocessor, together with some type of memory, such as semiconductor memory chips. The processing element performs arithmetic and logical operations, while a sequencing and control unit can change the order of operations in response to stored information.
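The interplay between memory, the processing element, and the control unit can be illustrated with a minimal sketch of a stored-program machine. The toy instruction set below (the opcode names and the accumulator design are illustrative, not any real CPU’s architecture) shows a fetch-decode-execute loop in which a conditional jump changes the order of operations based on stored data:

```python
# Sketch of a hypothetical stored-program machine (toy instruction set).
def run(program, memory):
    """Fetch, decode, and execute instructions until HALT."""
    pc = 0   # program counter: index of the next instruction
    acc = 0  # accumulator register
    while True:
        op, arg = program[pc]       # fetch and decode
        pc += 1
        if op == "LOAD":            # acc <- memory[arg]
            acc = memory[arg]
        elif op == "ADD":           # acc <- acc + memory[arg]
            acc += memory[arg]
        elif op == "STORE":         # memory[arg] <- acc
            memory[arg] = acc
        elif op == "JUMP_IF_ZERO":  # control unit alters the order
            if acc == 0:            # of operations based on data
                pc = arg
        elif op == "HALT":
            return memory

# Example program: add memory[0] and memory[1], store the sum in memory[2].
mem = {0: 2, 1: 3, 2: 0}
prog = [("LOAD", 0), ("ADD", 1), ("STORE", 2), ("HALT", None)]
print(run(prog, mem)[2])  # -> 5
```

Real processors follow the same cycle in hardware, with far richer instruction sets and many registers.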
Peripheral Devices of a Computer
Peripheral devices include input devices (such as keyboards, mice, and joysticks), output devices (such as monitor displays and printers), and input/output devices that perform both roles. Peripheral devices allow information to be received from an external source and enable the results of operations to be saved and retrieved.
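The receive-save-retrieve flow described above can be sketched in a few lines. In this illustration (the function name and file path are hypothetical, chosen only for the example), standard output stands in for an output device and a file on disk stands in for an input/output storage device:

```python
# Illustrative sketch: a program using peripherals via the OS.
# stdout plays the role of an output device (display); a disk file
# plays the role of an input/output device (storage).
import os
import sys
import tempfile

def echo_and_save(text, path):
    """Show text on the 'display', save it to 'storage', and read it back."""
    sys.stdout.write(text + "\n")   # output device: display the result
    with open(path, "w") as f:      # I/O device: save the result
        f.write(text)
    with open(path) as f:           # I/O device: retrieve the result
        return f.read()

path = os.path.join(tempfile.gettempdir(), "peripheral_demo.txt")
print(echo_and_save("hello", path))  # -> hello (displayed, saved, reloaded)
```

In a real system the operating system mediates all of this traffic through device drivers, so programs can use the same read/write interface for very different hardware.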