'If data is oil, then data centres are the new reservoirs,' goes the popular adage. Data centres are computing infrastructures connected to the internet where centralised computing takes place.
Cisco defines a data centre as a physical facility used by an organisation to store data and critical applications. Data centres are built around a network of computing and storage resources that facilitates the delivery of data and shared applications. The key components of a data centre include routers, switches, firewalls, storage systems, servers, and application delivery controllers.
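The component roles listed above can be sketched as a minimal inventory model. This is purely illustrative: the class names, role descriptions, and structure below are hypothetical simplifications, not any real data-centre management API.

```python
# Minimal, illustrative model of the key data-centre components
# named above. All names and role strings are hypothetical.

from dataclasses import dataclass, field


@dataclass
class Component:
    name: str  # e.g. "router", "firewall"
    role: str  # what the component contributes to the facility


@dataclass
class DataCentre:
    components: list[Component] = field(default_factory=list)

    def roles(self) -> dict[str, str]:
        """Map each component name to its role."""
        return {c.name: c.role for c in self.components}


dc = DataCentre([
    Component("router", "directs traffic between networks"),
    Component("switch", "connects devices within the facility"),
    Component("firewall", "filters and secures traffic"),
    Component("storage system", "holds organisational data"),
    Component("server", "runs applications and services"),
    Component("application delivery controller", "load-balances application traffic"),
])

print(len(dc.components))  # the six key component types listed above
```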
Over the years, data centres have evolved in terms of physical and virtual infrastructure. Most enterprises now use hybrid cloud to take advantage of the best of both on-premise and public cloud. Earlier, data centres struggled with infrastructure hassles such as power supply, cooling systems, cabling woes and lack of mobility; now most of the major concerns are around speed, performance and efficiency.
Evolution of technology in data centres
1946: The Electronic Numerical Integrator And Computer (ENIAC) is considered the first general-purpose electronic digital computer. Commissioned by the US Army during the Second World War, it was built to calculate artillery firing tables. Scientists working on thermonuclear weapons at Los Alamos also ran early calculations on ENIAC.
1951-54: The Universal Automatic Computer (UNIVAC) became the first commercial computer to store data on magnetic tape. It used vacuum-tube circuits, with results printed out or stored on magnetic tape.
1960: Transistorised computers such as Bell Labs' TRADIC helped data centres expand from the military domain to the commercial space, eliminating the need for labyrinthine vacuum-tube systems. Transistors substantially increased computational power while making computer systems smaller and easier to fit into multi-purpose spaces like office buildings. Transistorised computing also helped NASA put a man on the Moon.
1971: Intel introduced the 4004, its first general-purpose programmable processor. The first in the line of Intel CPUs, it was also the first commercially produced microprocessor, and it was customisable.
1973: The Xerox Alto was the first desktop computer to use a graphical UI, complete with a bit-mapped high-resolution screen, large internal storage and special-purpose software.
1977: ARCnet, the first commercially deployed LAN, was installed at Chase Manhattan Bank. It could connect up to 255 computers on a network and supported data rates of 2.5 Mbps.
1980: The introduction of Personal Computers ushered in a new era in personal computing. PCs were rapidly adopted because they could be operated directly by the end user, without a dedicated operator in between. A PC can be a desktop computer, laptop, netbook, tablet or handheld device.
1990: Modern data centres were born as microcomputers began filling old mainframe computer rooms as servers, and these rooms became known as data centres. Big companies started assembling such server rooms inside their own office premises.
Mid-90s: The emergence of the internet created demand for data centres as a service. As a result, facilities grew to house thousands of servers in a single room.
1999: VMware Workstation, similar to Virtual PC, was released. The initial versions ran only on Windows, but later versions supported other operating systems.
2001: VMware ESX was launched. This server virtualisation product featured a bare-metal hypervisor that ran directly on server hardware, without any underlying operating system.
2002: Amazon Web Services launched its first web services, including storage and computation, with Amazon Mechanical Turk following in 2005.
2006: Amazon Web Services began offering IT infrastructure services to businesses through web services, now known as cloud computing.
2007: Sun Microsystems introduced the modular data centre, transforming the fundamental economics of corporate computing.
2008-2011: Enterprises started to focus on power efficiency, cooling technologies, and management of data centre facilities.
2011: Facebook launched the Open Compute Project, an initiative to share best practices and specifications for creating energy-efficient and economical data centres.
2013: Google invested USD 7.35 billion in capital expenditures to expand its global data centre network.
Telcordia introduced generic requirements for telecommunications data centre equipment and spaces.
2019: A distributed computing paradigm evolved, changing industry dynamics as edge data centres became prevalent.
Today, data is connected across multiple data centres, the edge, and public and private clouds, and a data centre can communicate across sites, both on-premises and in the cloud.
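A rough way to picture the edge paradigm described above is a client being routed to whichever site answers fastest. The sketch below is hypothetical: the site names and latency figures are illustrative, not real measurements or any real routing API.

```python
# Hypothetical sketch of edge-aware routing: pick the lowest-latency
# site among on-premises, cloud, and edge candidates. All site names
# and latency values below are illustrative assumptions.

SITES = {
    "on-prem-dc": 40.0,    # round-trip latency to the client, in ms
    "cloud-region": 25.0,
    "edge-node": 5.0,      # nearby edge data centre
}


def nearest_site(latencies_ms: dict[str, float]) -> str:
    """Return the name of the site with the lowest measured latency."""
    return min(latencies_ms, key=latencies_ms.get)


print(nearest_site(SITES))  # prints "edge-node"
```

In practice, real traffic steering uses DNS-based or anycast routing rather than a client-side lookup like this, but the principle of serving from the closest site is the same.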