Information technologies such as the Internet, artificial intelligence and the Internet of Things are changing the way people live and work; some jobs have even been taken over by AI. These technologies also generate enormous volumes of data, and the facility that houses all of this data is commonly known as a data center.
You may not realize it, but the amount of data we consume and create amounts to a data explosion. According to a report by DOMO, by 2020 every person on the planet would be generating 1.7 MB of data per second. Storage giant EMC estimated that the world would hold about 40 trillion gigabytes of data by 2020. These staggering numbers feel almost unreal and abstract, and so do the data centers where all this data is physically stored.
Data centers around the world work 24/7 in the background while we are busy messaging friends or binge-watching shows on Netflix. Not long ago, data centers were little more than processing and storage space. But with the arrival of cloud, big data and analytics, the data center is finally taking center stage in the IT world.
As the name implies, hyperscale is the ability to scale rapidly to meet outsized and growing demand: to add capacity quickly and efficiently, with speed to market as the priority. Hyperscale data centers are typically defined by best-in-class performance across floor space, functionality, computing power, memory, network infrastructure and storage resources.
For example, while a conventional data center (DC) can support hundreds of physical servers and thousands of virtual machines, a hyperscale facility can support thousands of physical servers and millions of virtual machines. IDC defines a hyperscale facility as one with at least 5,000 servers and no less than 10,000 square feet of space, but in practice hyperscale data centers are usually far larger than that threshold.
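To make that comparison concrete, here is a rough back-of-the-envelope sketch in Python. The server counts and the VMs-per-server density are illustrative assumptions for this article, not figures from IDC or any specific facility.

```python
# Back-of-the-envelope comparison of a conventional vs. hyperscale facility.
# All numbers below are illustrative assumptions, not measurements.

CONVENTIONAL_SERVERS = 200   # "hundreds of physical servers"
HYPERSCALE_SERVERS = 50_000  # "thousands of physical servers" and up
VMS_PER_SERVER = 20          # assumed virtualization density

def vm_capacity(physical_servers: int, vms_per_server: int = VMS_PER_SERVER) -> int:
    """Rough virtual machine capacity for a given server fleet."""
    return physical_servers * vms_per_server

print(f"Conventional DC: ~{vm_capacity(CONVENTIONAL_SERVERS):,} VMs")   # thousands
print(f"Hyperscale DC:   ~{vm_capacity(HYPERSCALE_SERVERS):,} VMs")     # millions
```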
To put that in perspective, Microsoft's hyperscale DC in Quincy, Washington, contains 24,000 miles of network cable, almost enough to circle the globe, while the Azure data center in Singapore is twice as large and used enough concrete to lay a sidewalk from London to Paris. Yotta, meanwhile, is opening India's largest data center, with 8.2 million square feet of space and 7,200 racks.
Beyond sheer size, one of the biggest advantages of hyperscale DCs is scalability. Rapid growth is a huge challenge for a legacy system; a hyperscale data center, by contrast, can handle horizontal or vertical scaling efficiently and with minimal hassle, as the sketch below illustrates. This improves end-user uptime and load times and lets the facility run large workloads that can easily consume enormous amounts of power. On top of all this, mega-DCs add a layer of analytics and machine learning.
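A minimal sketch of the two scaling strategies, assuming a toy cluster model where capacity is just nodes times cores; real capacity planning involves many more variables (network, storage, failure domains).

```python
# Horizontal scaling (scale out) vs. vertical scaling (scale up).
# Node counts and sizes are hypothetical, chosen only for illustration.

from dataclasses import dataclass

@dataclass
class Cluster:
    nodes: int           # number of physical servers
    cores_per_node: int  # compute capacity of each server

    @property
    def total_cores(self) -> int:
        return self.nodes * self.cores_per_node

baseline = Cluster(nodes=100, cores_per_node=32)

# Horizontal scaling: add more nodes of the same size.
scaled_out = Cluster(nodes=baseline.nodes + 50, cores_per_node=baseline.cores_per_node)

# Vertical scaling: keep the node count, move to larger machines.
scaled_up = Cluster(nodes=baseline.nodes, cores_per_node=64)

print(f"Baseline:   {baseline.total_cores:,} cores")
print(f"Scaled out: {scaled_out.total_cores:,} cores")
print(f"Scaled up:  {scaled_up.total_cores:,} cores")
```

Hyperscale operators favor scaling out because capacity can be added in small, identical increments while the system keeps running; scaling up usually means downtime and a hard ceiling on how big one machine can get.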
Since efficiency is a hard requirement at this scale, automation is inevitable. The companies that build and operate these DCs invest heavily in automation and self-healing processes; the resulting systems are instrumented and automated to the point that the inevitable outages and slowdowns in the environment correct themselves, improving overall efficiency.
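The core of self-healing is a simple control loop: probe, detect, remediate. The toy sketch below shows the idea with hypothetical check_health() and restart() hooks; production systems delegate this loop to orchestrators such as Kubernetes rather than hand-rolling it.

```python
# Toy self-healing control loop. check_health() and restart() are
# stand-ins invented for this sketch; real probes hit HTTP endpoints
# and real remediation reschedules workloads or replaces hardware.

import random
import time

def check_health(service: str) -> bool:
    """Simulated health probe with a ~20% failure rate."""
    return random.random() > 0.2

def restart(service: str) -> None:
    """Simulated remediation step."""
    print(f"[heal] restarting {service}")

SERVICES = ["web-frontend", "object-store", "ml-batch"]

for _ in range(3):  # a few iterations instead of an infinite loop
    for svc in SERVICES:
        if not check_health(svc):
            restart(svc)  # the detected outage corrects itself, no operator needed
        else:
            print(f"[ok] {svc} healthy")
    time.sleep(1)
```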
Energy efficiency is another pillar of hyperscale data centers. Large facilities optimize their power architectures as far as possible, significantly reducing both costs and environmental impact. A hyperscale data center also optimizes airflow throughout the structure, ensuring that hot air flows in one direction, and often recovers heat from the exhaust stream for reuse. As a result, the power usage effectiveness (PUE) of a hyperscale facility is much lower, and therefore much closer to the ideal of 1.0, than that of a traditional DC, making it far more environmentally friendly.
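PUE is defined as total facility energy divided by the energy delivered to IT equipment, so a value of 1.0 means every watt reaches the servers. A quick calculation makes the gap visible; the sample power figures below are illustrative, not measurements from any facility.

```python
# Power usage effectiveness: PUE = total facility energy / IT equipment energy.
# 1.0 is the theoretical ideal. Sample figures are illustrative only.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    return total_facility_kw / it_equipment_kw

# Traditional DC: substantial overhead for cooling, lighting, power conversion.
print(f"Traditional: PUE = {pue(2000, 1100):.2f}")  # 1.82

# Hyperscale DC: optimized airflow and power architecture shrink the overhead.
print(f"Hyperscale:  PUE = {pue(1200, 1050):.2f}")  # 1.14
```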