Like many other IT functions, data analytics can be moved to the cloud, and in many organizations that move is already underway. It requires new approaches, architectures, and skills compared to traditional in-house analysis.
According to Gartner, one of the top 10 data and analytics technology trends for 2021 is open, containerized analytics architectures that make analytics capabilities more composable. This trend enables enterprises to quickly build flexible, intelligent applications that help data analysts connect insights to actions. And as the center of data gravity moves to the cloud, composable data and analytics will become a more agile way to build analytics applications, enabled by cloud marketplaces and low-code and no-code solutions.
According to the systems and technology director at LEAP (Loveurope and Partners), cloud-based analytics enables the scalability a company needs for high-compute workloads. Traditional analytics doesn't scale the way the cloud does, and as the world continues to digitize everything, organizations need to be able to work with file data at exponential scale.
Along with these opportunities come challenges that organizations will need to overcome.
Fear of losing control
Data analytics is highly strategic for enterprises, and the idea of moving the analytics process to the cloud can be daunting for technology leaders accustomed to having complete control over such resources.
As cloud adoption becomes the new norm, the traditional status quo is being challenged. One of the most daunting challenges is the fear among CIOs and CDOs (chief data officers) of losing control. According to Anthony Abbattista, Advanced Analytics Enablement Leader at Deloitte Consulting, one of the key challenges most clients face is organizational inertia and the fear of losing control.
Among the senior IT executives he has worked with on shifting to cloud-based analytics, he sees a pattern: the traditional role of IT and the CIO has been to protect and act as guardian of data assets. The cloud challenges that status quo because it can be quicker to market; for example, product selection and assessment are more streamlined, provisioning is point-and-click, and no large incremental capital expenditures are needed.
Chief data officers and CIOs need to work together to vet and get comfortable with cloud platforms, so they can help derive business value and competitive advantage at least as quickly as their competitors. This might require the adoption of acceptable, proven, and emerging models in the market rather than designing/architecting the analytics environment from the ground up.
Due to outdated and inflexible existing analytics processes, many organizations are slow to explore new analytics capabilities. This outdatedness can mean fewer incentives and initiatives to try new capabilities and drive innovation. To overcome this, the IT department at WAEPA uses a cloud-enabled sandbox environment to establish a trial-and-error ideation process, drawing on key performance indicators from key stakeholders and creating a prototype-first analytics environment.
Making the shift
Even once IT leaders have overcome the fear of losing control and of the unknown, the actual migration, and ensuring that services are not interrupted, is another hurdle that makes many hesitate.
For many IT leaders, the hardest thing is navigating the path to the cloud. With the right solutions, that path need not be as tricky as their previous experiences suggest.
When migrating data analytics to the cloud, IT leaders, in many cases, start with the “lift and shift” approach by porting existing operations over to the cloud. Often this means re-tooling applications and systems to re-architect them for the cloud.
Suppose, for example, that you want to migrate a massive amount of unstructured file data to the cloud for use in an analytics platform. Most companies' file data shares a common trait: it is spread across a range of disparate legacy storage systems, which makes it extremely labor-intensive for data administrators to manage and locate data at different points in the workflow.
For cloud service providers such as CMC Global, shifting all data without refactoring applications for the cloud requires choosing a tool that makes it simple to replicate and extract data across multiple environments.
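Replication of that kind is usually followed by an integrity pass before the old systems are retired. Below is a minimal sketch in Python, not tied to any particular migration tool; the directory layout and function names are illustrative assumptions. It streams every file in a source tree through SHA-256 and reports any replica that is missing or whose contents differ.

```python
import hashlib
from pathlib import Path


def sha256_of(path: Path) -> str:
    """Stream a file through SHA-256 so large files never load fully into memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()


def verify_replica(source_dir: Path, replica_dir: Path) -> list[str]:
    """Return relative paths whose replica is missing or whose checksum differs."""
    mismatches = []
    for src in source_dir.rglob("*"):
        if not src.is_file():
            continue
        rel = src.relative_to(source_dir)
        dst = replica_dir / rel
        if not dst.is_file() or sha256_of(src) != sha256_of(dst):
            mismatches.append(str(rel))
    return mismatches
```

An empty return value means every source file has a byte-identical replica; anything else points an administrator at exactly which files to re-copy.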
This kind of shift enables companies to optimize their data analytics and accelerate performance by up to 240 times. Analytics lets the company see how many clients are connected, who is using the most bandwidth, and where the system is growing fastest. Especially for organizations with a global network of partners and employees working remotely, leveraging the cloud to collaborate on creative projects efficiently and securely keeps the production process from coming to a halt.
Acquiring the right skills
Successful IT projects come down to having the necessary skills available, and moving analytics to the cloud is no exception.
According to research at Deloitte, the demand for skills is beginning to shift. Rather than specialists to support each part of the technology stack in traditional analytics/BI, the cloud analytics environment requires more ‘full stack’ thinking. To address this challenge, the technology team supporting these new-age environments needs to understand the offerings on a cloud platform, adopt standard patterns, and then evolve as new techniques, tools, and offerings become available.
Companies that choose to build their own analytics platform in a cloud environment, or that rely on vendor systems, will need specific in-house technical expertise. These in-house employees will need the skills to create, maintain, and derive analytics from a data lake, as well as an understanding of how best to employ cloud-native or third-party artificial intelligence and machine learning capabilities to draw additional insights from the environment.
These gaps can be filled by experienced partners and consultants, Jewett says. For some, the best arrangement is to gain experience from outside experts; once the contract is up, the company will possess the knowledge and expertise to continue evolving its cloud-based analytics as needed.
Securing the data
The fear is real, especially with malware on the rise as COVID-19 has kept everyone at home for more than a year. So no matter how much cloud service providers emphasize the security of their infrastructure, many clients will always be concerned about how safe their data actually is in the cloud.
Security is top of mind any time you move your company's valuable data out of a private data center, because the insights gained from analyzing data can be a competitive differentiator. There is also the worry of exposing sensitive data such as customer information.
The real challenge is making sure customers' data is protected in a cloud accessed by both internal and external users. Hopping between cloud accounts and securely storing and exchanging keys becomes a security issue. There needs to be strong governance around the appropriate use of data, and this is more urgent in the cloud than on-premises because it is so easy for people to copy data and use it in unauthorized ways.
The ease with which someone can use cloud applications opens up challenges, many of which are rooted in the fact that people can inadvertently create security, privacy, and economic concerns.
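One common governance control for the customer-data concern above is pseudonymization: replacing identifiers with keyed tokens before data ever leaves the private data center. The sketch below is a minimal illustration, not a prescribed policy; the `PII_FIELDS` set and the key-handling convention are assumptions. It uses a keyed HMAC so analysts in the cloud can still group and join on the tokens, but cannot recover the raw values without the key, which stays on-premises.

```python
import hashlib
import hmac

# Hypothetical field list; in practice this would come from a data
# classification policy, not a hard-coded constant.
PII_FIELDS = {"email", "phone", "customer_id"}


def pseudonymize(record: dict, key: bytes) -> dict:
    """Replace PII values with keyed HMAC-SHA256 tokens.

    The same input always yields the same token, so cloud-side analytics
    can still count, group, and join on these fields without ever seeing
    the underlying values.
    """
    out = {}
    for field, value in record.items():
        if field in PII_FIELDS:
            out[field] = hmac.new(key, str(value).encode(), hashlib.sha256).hexdigest()
        else:
            out[field] = value
    return out
```

Because the tokenization is deterministic under a given key, two exports of the same customer still line up in the cloud, while a leaked analytics dataset exposes only opaque digests.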
Budget is always a concern
Although using cloud services can help organizations avoid costs such as on-premises storage systems, expenses can quickly get out of control or come in higher than expected.
The one-size-fits-all data architecture can be an IT spending trap. When making the decision to move analytics to the cloud, enterprises often feel pressured to pay a high upfront cost and get locked into a long contract that doesn’t fit their current needs.
The key is to find a provider that doesn't force cloud lock-in – but how? When evaluating cloud platforms, shop around for a solution that addresses your current analytics needs, with the flexibility to scale up for future ones.
Two of the most effective ways to control cloud costs are:
- Take control of the way cloud accounts are created.
- Be completely transparent about who is consuming cloud resources.
As for the first point, we migrate all cloud accounts under each provider into a single ‘master’ account. Centralizing who can create new cloud accounts means individuals and groups must go through a formal request process; each request includes a business justification, department budget information, and a business owner.
As for transparency, whenever a request is approved, the new cloud account is created by the central team under the master account. This policy gives us transparency into the costs our cloud providers invoice us for. Each account is created with the information provided in the request, and we can then use the cloud providers’ portals or consoles to monitor spending against each initial request.
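The request-and-tagging process above can be sketched in a few lines. The field names below (`business_justification`, `department_budget_code`, `business_owner`) are hypothetical placeholders for whatever an intake form captures; the point is simply that a request is held back if any required field is empty, and that approved accounts carry tags tracing every invoice line back to its originating request.

```python
from dataclasses import dataclass


# Hypothetical request shape; field names mirror the process described above.
@dataclass
class CloudAccountRequest:
    requester: str
    business_justification: str
    department_budget_code: str
    business_owner: str


def validate_request(req: CloudAccountRequest) -> list[str]:
    """Return the names of missing required fields; an empty list means approvable."""
    missing = []
    for field in ("business_justification", "department_budget_code", "business_owner"):
        if not getattr(req, field).strip():
            missing.append(field)
    return missing


def tags_for(req: CloudAccountRequest) -> dict:
    """Tags the central team applies to the new account, so provider invoices
    can later be matched to the budget and owner from the original request."""
    return {
        "BudgetCode": req.department_budget_code,
        "BusinessOwner": req.business_owner,
        "Requester": req.requester,
    }
```

With every account tagged this way, the spend shown in a provider's billing console can be grouped by `BudgetCode` and reconciled against each department's original justification.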