These days, cloud computing is big news, not just in the tech industry but in almost every sector imaginable. Cloud computing is a huge part of how the internet now works for businesses, and in most people’s personal lives as well.
The first question many people ask when ‘new’ technology hits the mainstream is ‘when was it invented?’. With cloud computing, the assumption is that it came into being some time in the last decade, or perhaps as far back as the turn of the millennium. In fact, the term ‘cloud computing’ was first used in this context in 1996, but the origins of the concept stretch much further back, all the way to the 1950s.
Back when computing was really taking off in the 1950s, mainframe computers were huge and expensive beasts. For most organizations, giving every employee an individual computer would have been prohibitively expensive. As a workaround, companies purchased one or two mainframe computers, then gave users access to them via ‘dumb’ terminals with no processing power of their own. Providing shared access to a centralized piece of technology made sense, particularly as the processing power individual users actually required back then was usually minimal. Shared computing power was the standard, and it planted the seeds of the concept of ‘cloud’ computing.
In 1969, the Advanced Research Projects Agency Network (ARPANET) was established by the US government. It was designed to share resources within the scientific community by connecting the computers of four universities. ARPANET marked the first time digital resources could be shared among computers in remote locations, and it was an early precursor to the internet as we know it today.
In the early 1970s, the concept of virtual computers became reality. Multiple distinct virtual computing environments could be created within one mainframe, running different operating systems and allowing multiple users, even multiple organizations, to work from a single physical machine. This was arguably the first true application of cloud computing, and it paved the way for the virtual telecommunications boom of the 1990s, when Virtual Private Networks (VPNs) allowed telecommunications companies to offer virtual services at reduced cost without building out physical infrastructure.
With Tim Berners-Lee’s invention of the World Wide Web in 1990, the internet became accessible and visible to anyone with a computer. Resources and information were shared between computers at speed, and organizations increasingly began to work ‘online’. By the mid-90s, businesses and educational institutions were already using elements of the cloud in their work, and in 1999 the launch of Salesforce.com brought to market the first service to offer enterprise applications via the cloud, pioneering software as a service (SaaS).
The final piece of the cloud computing puzzle, and the beginning of cloud computing as we know it today, was the launch of Amazon Web Services (AWS) in 2006. This platform delivered a suite of web-based computing services that let businesses integrate Amazon services into their own sites while paying only for what they actually used. Following the launch of AWS, Amazon went further, modernising their data centres so they could offer excess capacity to customers as ‘cloud storage’.
In 2010, Microsoft launched Azure, its own cloud computing platform. Originally designed to support the development of web and mobile apps over the internet, Azure now covers SaaS, IaaS and PaaS.
Over the last decade, cloud computing has become commonplace, and businesses are increasingly moving their entire infrastructure online. With cloud vendors looking to the Internet of Things and machine learning, cloud computing is only going to become more powerful and ubiquitous in the future!