
Data Centres and Cloud Efficiency

Updated: Jun 3, 2019


The data centre market is built on being able to store your data somewhere safe, secure, and cheap. Before the Internet of Things, it didn't much matter where large amounts of data were stored, because access speeds were fast enough to reach that data efficiently across vast distances. To meet the need, data storage companies built large warehouses for digital data in rural areas, where land is cheap. This worked until devices needed to access data instantaneously and the amount of data started growing exponentially.


As applications have gotten faster, the time between a request and a response, or a question and an answer, has had to shrink. Think of it like using your Sat Nav / GPS. When you come to a crossroads, would you accept the Sat Nav saying, “Oh, hold on a second, I'll just try and find out which way to turn. It will take me a couple of minutes to know if you are turning left or right”? No, you would throw the device out the window and hand a map to your wife to do the navigation. Ten years ago you might have waited a while for the Sat Nav / GPS to make a decision, but not in today’s instant-decision economy.


Data Centre Distance, Speed, and IoT


With the Internet of Things, the quantity of data is growing exponentially. It's not just me talking to you. It's my fridge talking to Safeway, Safeway talking to the farmer, and everyone else along the supply chain. What everybody is trying to do is cut down the transit time between where the data resides and the endpoint device or customer. To shorten the distance the data has to travel, you push the data repository as close to the endpoint device as you can. That's called edge computing, and it means the time between request and response, or question and answer, is as short as possible.
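To put rough numbers on why distance matters, here is a back-of-the-envelope sketch in Python. It counts only propagation delay, assuming light in fibre travels at roughly 200,000 km/s; real round trips add routing, queuing, and processing time on top.

```python
# Back-of-the-envelope round-trip propagation delay over fibre.
# Assumes signals travel at roughly two-thirds the speed of light in glass
# (~200,000 km/s); real-world latency adds routing, queuing, and processing.

SPEED_IN_FIBRE_KM_PER_S = 200_000

def round_trip_ms(distance_km: float) -> float:
    """Round-trip propagation delay in milliseconds for a one-way distance."""
    return (2 * distance_km / SPEED_IN_FIBRE_KM_PER_S) * 1000

for km in (5, 100, 1000, 4000):
    print(f"{km:>5} km away -> ~{round_trip_ms(km):.2f} ms round trip")
```

Even before any processing happens, a repository 4,000 km away costs tens of milliseconds per round trip, while an edge site a few kilometres away costs a fraction of a millisecond.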


The best solution is to have three types of data centres. First, the big cloud data centres, which sit outside the city in huge warehouses and provide cheap storage for redundant and old data. These can draw upwards of 20 to 25 megawatts, with thousands of racks inside them. The second type is the smaller community data centre: around three floors tall, no more than 900 racks, and probably drawing four megawatts at the absolute maximum. The third is a newer trend in Canada: putting data centres in city centres to serve a single building or block.


You can't get a huge amount of real estate in the city centre because it is so expensive. Data centre pods let companies store data as close to the device as possible without incurring high rents. A pod is a 40-foot shipping container that sits in a car park or the basement of a building and keeps the building or block's data right next to its applications. The closer the data is to the application, the faster decisions can be made, which matters for industries that need instantaneous communication between request and response, like health care, finance, and media. If there is too much distance between the data centre and the application or device, you end up out of sync. When this happens on your TV, the lips don't stay in line with the sound because you've got packet loss or jitter, the two things that slow data down on its way from the repository to the application. It sort of doesn't matter when you're watching Charlie's Angels, but it does matter when you're looking for information about a defibrillator.
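Packet loss and jitter can both be estimated from how a stream of packets actually arrives. The sketch below uses made-up sequence numbers and arrival times purely to illustrate the two measurements; real tools such as RTP receivers use a smoothed variant of the same jitter idea.

```python
# Minimal sketch: estimating packet loss and jitter from a stream of packets.
# The sequence numbers and arrival times below are made-up illustrative data.

arrivals = {  # sequence number -> arrival time in milliseconds
    1: 0.0, 2: 20.1, 3: 40.3, 5: 81.0, 6: 99.8,  # packet 4 never arrived
}

expected = max(arrivals)                     # highest sequence number seen
loss_pct = 100 * (expected - len(arrivals)) / expected

times = [arrivals[s] for s in sorted(arrivals)]
gaps = [b - a for a, b in zip(times, times[1:])]
# Jitter here is taken as the average deviation of the inter-arrival gaps
# from their mean; RTP (RFC 3550) uses a smoothed version of the same idea.
mean_gap = sum(gaps) / len(gaps)
jitter = sum(abs(g - mean_gap) for g in gaps) / len(gaps)

print(f"loss: {loss_pct:.0f}%  jitter: {jitter:.1f} ms")
```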


The trend is to bring as much data as possible back into the city and divide your data into different loads geographically. Most companies tend to sign a data centre deal and put everything into their cloud storage, but they don't need everything in the closest, most expensive data centre. They can have some of it in Reykjavik, some of it in Prince George; they can store older data wherever they want, because they don't need access to it on a regular basis. If the IRS or somebody calls up and says "I need to see your data," you can retrieve it from a cheap storage facility in the middle of nowhere instead of using up expensive urban storage space. The strategic play is planning where data is stored; the tactical play is getting the relevant data as geographically close to the devices as possible to increase speed.
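As a rough illustration of that strategic-versus-tactical split, here is a toy tiering rule in Python. The tier names, the thresholds, and the regulated-data shortcut are all invented for the example; a real policy would be driven by actual access patterns and legal requirements.

```python
# Illustrative data-tiering rule: the tier names and thresholds are invented
# here to show the idea of splitting workloads by how often data is touched.

def choose_tier(days_since_last_access: int, regulated: bool) -> str:
    if regulated:
        return "in-country regional data centre"   # sovereignty comes first
    if days_since_last_access <= 7:
        return "edge pod / city-centre site"       # hot, latency-sensitive data
    if days_since_last_access <= 365:
        return "regional community data centre"    # warm data
    return "remote bulk storage (rural site)"      # cold archives, fetched on demand

print(choose_tier(2, regulated=False))     # -> edge pod / city-centre site
print(choose_tier(2000, regulated=False))  # -> remote bulk storage (rural site)
```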


Strategic and Tactical Use of the Cloud


The cloud is nothing new. I love the phrase and I hate the phrase in equal measure. In the good old days, we used to have our computers sitting underneath our desks. Then, when we needed more data storage, we had a little room in the basement with three or four servers daisy-chained together, and Johnny the IT guy came in twice a week to do whatever he needed to do.


Then we had a power outage in the building and decided, “Hold on, we've lost all of our data, now what?” Maybe we need to put it outside of the company’s offices, and so the outsourced IT market was born. This happened in Europe 10 to 15 years ago, and we are now beginning to see the same growth here in Canada.


Your existing IT department can do the smart and clever things while the grunt work of ICT management is outsourced. Outsourcing means putting the data in someone else's data centre by renting space. You can either co-locate, buying power, property, internet connectivity, and a rack, and install and manage your own servers, or you can take managed services and never turn up at the data centre at all, because the services company does the install and data management.


Now, once you've got one data centre, you probably want redundancy, resiliency, disaster recovery, or business continuity. A Cloud company is likely to sell you space in their data centre and mirror your workloads. If you have large volumes going through it, you might want to do what's called load balancing. IT can say, “there's too much workload going across this server, so we are going to split half the volume and put it across that other server.” Now you are load balancing your work and you need two data centres.
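For the load-balancing step, a minimal sketch of the round-robin idea looks like the following. The hostnames are placeholders, and in practice you would use a dedicated load balancer (HAProxy, nginx, or a cloud service) rather than a hand-rolled loop.

```python
from itertools import cycle

# Minimal sketch of round-robin load balancing across two mirrored servers;
# the hostnames are placeholders for two data centres holding the same workload.

servers = cycle(["dc1.example.internal", "dc2.example.internal"])

def route(request_id: int) -> str:
    """Send each incoming request to the next server in rotation."""
    target = next(servers)
    return f"request {request_id} -> {target}"

for i in range(4):
    print(route(i))  # alternates dc1, dc2, dc1, dc2
```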


The Cloud and Sovereignty


"The Cloud" is a great marketing term by IBM, nothing more. It means you have access to a miriade of Data Centres, however, they are still in the same racks, still in the same servers, and are still connected to the internet. Johnny, your IT guy, still looks after them, however, you don't necessarily know where the data is geographically. This has caused its own problems due to data sovereignty, which means, if I'm serving a particular market, my data needs to be in that geographical market by law. With AWS, certain verticals cannot use the cloud. The healthcare market is really struggling to use the cloud because the industry regulators want to know where the data is residing. You know it's on server number 1754321, but you won't know where in the world that particular server is located. It could be in Bogota. It could be in Silicon Valley. It could be in Seattle. It could be in Toronto. You don't know. The cloud players are struggling to keep up with building the data centres to allow people to have data sovereignty.


The next layer is personal sovereignty, which is not so much about where the data resides as about who has control of it. We are beginning to say: not only do I want to know where my data is, I want to know who has access to it, and I want to be able to partition that access. For example, if I get hit by a truck, I want the paramedic to instantaneously see 100% of my health record to know what's wrong with me, whether I have any allergies, or whether I'm taking certain medications. I don’t want the insurance companies to have access to the same information, because they will raise my premium, or they might not pay the medical bill. The data centre is not a one-size-fits-all solution, nor a one-conurbation-or-geography-fits-all solution.
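A toy example of that partitioning might look like this. The roles and record fields are invented; the point is simply that the requester's role, not the storage location, decides what is visible.

```python
# Toy illustration of partitioned access to one health record: the roles and
# fields are invented, but the idea is that who is asking decides what they see.

RECORD = {
    "allergies": ["penicillin"],
    "medications": ["warfarin"],
    "billing_history": ["2018-11-02: ER visit"],
}

VISIBLE_FIELDS = {
    "paramedic": {"allergies", "medications"},  # full clinical picture, instantly
    "insurer": set(),                           # nothing without explicit consent
}

def view(role: str) -> dict:
    """Return only the fields this role is allowed to see."""
    allowed = VISIBLE_FIELDS.get(role, set())
    return {k: v for k, v in RECORD.items() if k in allowed}

print(view("paramedic"))  # {'allergies': [...], 'medications': [...]}
print(view("insurer"))    # {}
```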


The cloud is good for scaling a business because you can spin servers up quickly and take them down again as your workload changes. The problem is data sovereignty and personal sovereignty.


The Cost of the Cloud


Another issue is the cost of the cloud. Data storage in the cloud is very expensive. I did a review of a client’s data needs, and if he put the data into AWS it would cost 40,000 Canadian dollars a month. To build his own data centre and place the data into a redundant data centre would cost $100,000 once, plus monthly utility costs. The cloud is good, it's quick, but it is expensive. Tracking the spend is quite difficult too.
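Using the figures above, a rough break-even calculation looks like this. The CA$8,000 monthly utility and operations figure is a placeholder assumption, not something from the client review, so the real crossover point will differ.

```python
# Rough break-even comparison using the figures above; the CA$8,000 monthly
# utility/operations figure is a made-up placeholder, not from the client review.

cloud_per_month = 40_000   # CA$ per month for the same data held in AWS
build_once = 100_000       # CA$ one-off build cost
own_per_month = 8_000      # CA$ assumed utilities/operations (placeholder)

months = 1
while build_once + own_per_month * months > cloud_per_month * months:
    months += 1
print(f"Owning breaks even after roughly {months} months")
# 100,000 + 8,000*m <= 40,000*m  ->  m >= 100,000 / 32,000 ≈ 3.2, so ~4 months
```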


The industry made it easy to spin up a data centre or a server using a credit card, which is cool. However, you now have to track who's using which credit card and which budget the cost goes against. IT managers are grappling with where the expense for that spin-up sits. Johnny, the IT guy, put the data storage purchase on his personal credit card and the expense went under miscellaneous; it didn't go to a computing budget. The company still pays it, but the IT department doesn't know the true cost of the cloud.


Once everybody's got it, everybody uses it, but not everybody needs to use it. Cloud storage keeps growing, but that growth is not necessarily linked to real company demand. It's just getting bigger because we've all got the “shiny, shiny,” so let's use it. It is neither efficient nor cost effective to upload all of a company’s data to the cloud.



 

Nicholas Jeffery is a Smart City and IoT expert who has worked internationally on various Smart City initiatives. Nicholas pivoted in the new millennium to bring his business acumen and strategic thinking to bear in the technology, media, and telecommunications markets, helping companies flesh out their growth strategies, business development, and sales operations.

Find out more about his impressive resume on LinkedIn
