One of our core theses at Data Center Frontier is that the world is being transformed by data, and that the infrastructure that manages all this data becomes more mission-critical to our society by the day.

This theme is constantly reinforced by the growth we see all around us, including larger data centers, bigger campuses and eye-popping data on the volume of leasing and construction.

Sometimes we see data points affirm the extraordinary nature of this shift to digital business platforms. But every now and again someone in the industry shares a data point that just blows your mind. That was the case yesterday, as a blog post from Amazon’s Jeff Barr noted the 15th anniversary of the launch of Amazon Web Services, which kicked off the era of cloud computing. Jeff’s post included a mind-blowing data point.

“Today I am happy to announce that S3 now stores over 100 trillion (10¹⁴, or 100,000,000,000,000) objects, and regularly peaks at tens of millions of requests per second,” Barr wrote. “That’s almost 13,000 objects for each person in the world, or 50 objects for every one of the roughly two trillion galaxies in the Universe.”
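Barr’s per-person and per-galaxy figures check out with simple division. A quick sketch (the population and galaxy counts are rough estimates we’re assuming here, not values from the post):

```python
# Sanity-check the arithmetic behind Barr's comparison.
objects = 100 * 10**12          # 100 trillion S3 objects (from the post)
world_population = 7.8 * 10**9  # rough 2021 world population (assumption)
galaxies = 2 * 10**12           # ~2 trillion galaxies (rough estimate)

per_person = objects / world_population  # roughly 12,800 -- "almost 13,000"
per_galaxy = objects / galaxies          # exactly 50.0

print(f"{per_person:,.0f} objects per person")
print(f"{per_galaxy:g} objects per galaxy")
```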

How does a service get from zero to 100 trillion in 15 years? One day at a time. No one is using less data today than they were yesterday. Tomorrow, we will all use more data, as technology continues to create new capabilities, and shifts others from physical to virtual.

At Data Center Frontier we think about data generation in terms of streams – there are many different streams of data coming into the data center industry from a lot of different technologies – AI, the Internet of Things, edge computing, 5G wireless, satellites, virtual reality, drones, robots, connected cars and much more. Each of these technologies generates streams of data, which merge into rivers and oceans and manifest in a tsunami for the world’s data center industry to store and manage.

AWS is a primary example of this, as companies pursuing all of these next-generation technologies now store and manage their data on AWS, alongside the huge volume of business from traditional IT shops that are discovering the value of cloud services.

All of this requires real-world infrastructure to deliver. Here are some stories that examine what this looks like in the real world:

  • AWS Plans Massive Expansion of its Northern Virginia Cloud Cluster: Amazon Web Services is continuing to expand in Northern Virginia, including a massive data center campus just south of Dulles Airport, which could add up to 2.5 million square feet of cloud computing capacity. It’s part of an ongoing expansion of the company’s massive cloud computing infrastructure.
  • Inside Amazon’s Cloud Computing Infrastructure: Shortly after we launched DCF, we did a deep dive into Amazon’s cloud infrastructure, including how it builds its data centers and where they live (and why). Though details have changed over the years, this provides a good overview of the foundations for their cloud and its subsequent growth.
  • How AWS Cloud Customers Are Using Local Zones for Edge Computing: Amazon customers are tapping Local Zones, the AWS edge computing initiative, to run hybrid environments and tap single-digit millisecond latency for services like game rendering.
  • Amazon Bets $10 Billion on the Future of Space Commerce: Amazon will invest $10 billion in Project Kuiper, a satellite network to provide broadband Internet service. Analysts say Amazon’s ambitions in space could expand the market for its online retail and cloud computing businesses.
  • Amazon Web Services Archive: All our coverage of AWS and its growth.
