
SHOWCASE – INFO TECH

Information technology (IT) is the use of computers to store, retrieve, transmit, and manipulate data[1] or information, often in the context of a business or other enterprise.[2] IT is considered a subset of information and communications technology (ICT). An information technology system (IT system) is generally an information system, a communications system or, more specifically, a computer system – including all hardware, software, and peripheral equipment – operated by a limited group of users.

Humans have been storing, retrieving, manipulating, and communicating information since the Sumerians in Mesopotamia developed writing in about 3000 BC,[3] but the term information technology in its modern sense first appeared in a 1958 article published in the Harvard Business Review; authors Harold J. Leavitt and Thomas L. Whisler commented that “the new technology does not yet have a single established name. We shall call it information technology (IT).” Their definition consists of three categories: techniques for processing, the application of statistical and mathematical methods to decision-making, and the simulation of higher-order thinking through computer programs.[4]

The term is commonly used as a synonym for computers and computer networks, but it also encompasses other information distribution technologies such as television and telephones. Several products or services within an economy are associated with information technology, including computer hardware, software, electronics, semiconductors, internet, telecom equipment, and e-commerce.[5][a]

Based on the storage and processing technologies employed, it is possible to distinguish four distinct phases of IT development: pre-mechanical (3000 BC – 1450 AD), mechanical (1450–1840), electromechanical (1840–1940), and electronic (1940–present).[3] This article focuses on the most recent period (electronic).

Academic perspective

In an academic context, the Association for Computing Machinery defines IT as “undergraduate degree programs that prepare students to meet the computer technology needs of business, government, healthcare, schools, and other kinds of organizations …. IT specialists assume responsibility for selecting hardware and software products appropriate for an organization, integrating those products with organizational needs and infrastructure, and installing, customizing, and maintaining those applications for the organization’s computer users.”[37]

Commercial and employment perspective

Companies in the information technology field are often discussed as a group as the “tech sector” or the “tech industry”.[38][39][40]

Many companies now have IT departments for managing the computers, networks, and other technical areas of their businesses.

In a business context, the Information Technology Association of America has defined information technology as “the study, design, development, application, implementation, support or management of computer-based information systems”.[41] The responsibilities of those working in the field include network administration, software development and installation, and the planning and management of an organization’s technology life cycle, by which hardware and software are maintained, upgraded, and replaced.

Ethical perspectives

The field of information ethics was established by mathematician Norbert Wiener in the 1940s.[43]:9 Some of the ethical issues associated with the use of information technology include:[44]:20–21

  • Breaches of copyright by those downloading files stored without the permission of the copyright holders
  • Employers monitoring their employees’ emails and other Internet usage
  • Unsolicited emails
  • Hackers accessing online databases
  • Web sites installing cookies or spyware to monitor a user’s online activities