The ‘science’ behind data center location that makes edge computing tick.

Edge computing is the new game in town: The decentralized computing solution for bringing data, applications and services much closer to the consumers, enterprise businesses and industrial organisations that actually need them.

Edge computing is also seen as the way forward to delivering the full potential of new technologies including Artificial Intelligence (AI), Machine Learning (ML), Augmented Reality (AR), Virtual Reality (VR) and numerous Internet of Things (IoT) innovations.

The key to everything is reducing latency – the delay, measured in milliseconds, between a user issuing an instruction on their device and that request being fulfilled. Lower latency delivers a better, faster, closer-to-real-time user experience.
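To put those milliseconds into perspective, the short sketch below estimates just the fibre propagation component of latency from distance alone. It assumes light travels at roughly 200,000 km/s in optical fibre (about two-thirds of its speed in a vacuum); the distances, constant and function name are illustrative only, and real-world latency also includes routing, switching and server processing time.

```python
# Illustrative sketch: the fibre propagation share of latency.
# Assumption: ~200,000 km/s in optical fibre, i.e. ~200 km per millisecond.
# Real-world figures also include routing, switching and server time.

KM_PER_MS_IN_FIBRE = 200

def round_trip_delay_ms(distance_km: float) -> float:
    """Return the round-trip fibre propagation delay in milliseconds."""
    return 2 * distance_km / KM_PER_MS_IN_FIBRE

# Hypothetical comparison: a nearby edge site vs a distant centralised one.
for label, km in [("regional edge site (~50 km)", 50),
                  ("centralised site (~800 km)", 800)]:
    print(f"{label}: ~{round_trip_delay_ms(km):.1f} ms round trip")
```

Even on this simplified view, keeping traffic within a region shaves whole milliseconds off every round trip before any other optimisation is considered.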

It would be no exaggeration to say that none of the above can be fully realized without an underlying fit-for-purpose edge infrastructure.

Such infrastructure is the digital economy’s take – albeit a vastly more technically advanced one – on the telephone exchanges of the last century.

In this scenario, regional and metropolitan fibre networks terminate (connect) at data centers positioned near consumers and enterprise businesses to reduce latency. This approach also curtails the often significant data transit (backhaul) costs that network service providers and businesses incur when sending or receiving data between different locations.

Connected edge data center infrastructure brings the required compute and data storage resources within closer reach of local populations.

However, the skills and local knowledge of leading data center operators such as nLighten and of dark fibre infrastructure providers make all the difference when weighing the geographies, enterprise densities and customer demographics involved in locating edge data center sites. Along with a strategic location, the data centers must be able to rapidly provision and scale compute and storage resources exactly when and where they’re needed – without compromising IT security and resilience. They must also be energy efficient, low-carbon and sustainable.

Done right, this pays dividends for customers: greater operational efficiency and agility through reduced latency, and lower costs by cutting out the distances otherwise involved in backhauling data hundreds of miles to centralised data centers.

The keys to successful edge data center deployment

To warrant the significant edge infrastructure investment involved, the target region or metro area needs a sufficiently large population (number of eyeballs) and a sizeable enterprise density. These factors directly affect both latent and future market demand for the colocation facilities that will support edge-based applications and enable high-speed delivery of services such as content streaming, online gaming and mobile apps.

As part of the decision-making process, a prudent colocation provider will pay close attention to the proximity of the incumbent carrier’s fibre network as well as to existing or planned dark fibre infrastructure. Consideration is also given to where on the network the data center will sit. These factors determine its suitability for specific latency use cases, its ability to establish a direct point of presence (PoP) on the right networks for serving the region, and its access to nationwide and international connections. The more strategic the networks, the more compelling the data center will be for enterprise businesses, cloud and CDN providers looking for a viable colocation solution.

Furthermore, a diverse choice of networks will enable the colocation operator to more easily scale its edge platform by leveraging a fully-meshed infrastructure, capable of interconnecting all of its sites to create a seamless national and pan-European solution – all from a single source.            

Only when these evaluation criteria have been satisfactorily addressed is the way clear to implement a regional edge data center.  

Enterprises, manufacturers, and tech clusters in the vicinity will gain access to a wide choice of low latency services.

Benefits include faster response times for applications, among them the latest generation of latency-dependent AI technologies. The result: greater productivity, agility, competitiveness and customer satisfaction.

Additionally, the data center will present opportunities to partner with local energy providers – helping to support grid stabilization – or to re-use the excess heat it produces for the benefit of the local community, in municipal buildings and public facilities, for example.