Migrating to cloud: solving the big data problem

Alberto Pan, CTO at Denodo, discusses overcoming struggles with big data when it comes to migrating to the cloud

Leveraging big data still isn't without its obstacles.

For many organisations, cloud computing is now a fact of life. Over the years, it has established a reputation as the key to achieving maximum agility, flexibility and scalability. The benefits of cloud technologies for big data are well documented. Organisations that adopt them are able to scale their data storage and compute capacity up and down as needed. This provides a new level of flexibility for dynamic workloads, enabling the business to take advantage of new data types or new business opportunities without making huge commitments to infrastructure.

With recent studies revealing that 42% of companies operating in the UK have some sort of cloud service in place, an 18% increase on 2014, adoption levels aren't set to slow down anytime soon.

And, whilst many organisations are already experiencing the benefits of migrating their data to the cloud, some are thinking about how to take it up a notch by introducing big data into the mix.

Cloud computing and big data are each powerful advancements in technology when considered separately. Brought together, they could give a business a competitive edge.

But, whilst the benefits could be huge, migrating big data to the cloud is not without its challenges.

For many organisations, when it comes to migrating to cloud, big data can pose a big problem.

Due to its need for large amounts of processing power, big data has traditionally been processed on premise. Its complexity means that companies can find it difficult even to contemplate moving it to a different environment.

One of the first challenges businesses need to face is the move from a physical to a virtual infrastructure. To do this effectively, connectivity between different data sources is essential. Businesses need to ensure that they are able to migrate their big data to the cloud whilst running their on-premise systems as efficiently as possible. They also need to be sure that different systems and applications can still connect following the migration to ensure the smooth flow of data; otherwise they risk harming overall business productivity and even causing downtime.

Once this step is complete, organisations often face another problem. For many, migrating to the cloud can bring a sense of lost control, especially when compared with on-premise systems. This is usually because there is less direct contact with the data, and the problem is often magnified when big data is involved. Big data will often contain sensitive information, such as the personal details of individuals, including both employees and customers. Under regulations such as the GDPR, keeping this data protected has never been more important in order to avoid fines and reputational damage.

Although these two challenges need not hinder the movement of big data to the cloud, the fear so often associated with them can make it difficult to get the whole company on board. Often, certain members of the C-suite will need reassurance before they buy into the idea of migrating this information.

Enter data virtualisation: the missing link when it comes to cloud migration.

Data virtualisation has the potential to help organisations overcome the fear that so often reigns when it comes to migrating big data to the cloud.

By generating a single, logical view of all business data, no matter where it resides, and without having to duplicate information in a physical repository, it enables organisations to overcome both connectivity and security challenges.

This single view gives organisations the power to monitor connectivity, as well as performance, between different sources during the migration stages. For example, when a data source is moved to the cloud, only the virtual layer needs to be reconfigured to point to the new location; applications continue to query it as before, with no need for manual reconfiguration. This saves resources and, ultimately, cost.
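To make the redirection idea concrete, the sketch below shows, in hypothetical Python, how a virtual layer might map logical view names to physical sources: consuming applications query the logical name, and a migration only changes the mapping. The class names and connection strings are illustrative assumptions, not Denodo's implementation.

```python
# A minimal, hypothetical sketch of the redirection idea behind data
# virtualisation: applications query a logical view name, and only the
# virtual layer knows (and can change) where the data physically lives.

from dataclasses import dataclass


@dataclass
class DataSource:
    """Physical location of a dataset (on-premise or cloud)."""
    name: str
    connection_uri: str  # hypothetical connection string


class VirtualLayer:
    """Maps logical view names to physical data sources."""

    def __init__(self):
        self._views: dict[str, DataSource] = {}

    def register_view(self, view_name: str, source: DataSource) -> None:
        self._views[view_name] = source

    def migrate_view(self, view_name: str, new_source: DataSource) -> None:
        # Migration: only the mapping changes; consuming applications
        # keep querying the same logical view name.
        self._views[view_name] = new_source

    def query(self, view_name: str) -> str:
        source = self._views[view_name]
        # A real system would dispatch the query to the source;
        # here we simply report where it would go.
        return f"querying '{view_name}' at {source.connection_uri}"


# Usage: the application code never changes, even after the migration.
layer = VirtualLayer()
layer.register_view("customer_sales",
                    DataSource("on_prem_warehouse", "jdbc://onprem-db:5432/sales"))
print(layer.query("customer_sales"))   # hits the on-premise system

layer.migrate_view("customer_sales",
                   DataSource("cloud_warehouse", "jdbc://cloud-dw.example.com/sales"))
print(layer.query("customer_sales"))   # same call, now hits the cloud source
```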

Similarly, from a security standpoint, once big data is in the cloud environment, this single layer grants a full view of all data sources, which can help companies to better protect their most confidential information and comply with the latest regulatory standards, without impacting the overall performance of operations.
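As a simple illustration of enforcing protection once at that layer, the hypothetical sketch below applies a masking rule to sensitive fields before any record reaches a consumer, regardless of whether the underlying source is on-premise or in the cloud. The field names and masking rule are assumptions made for illustration only.

```python
# Hypothetical sketch: a data-protection policy applied once at the
# virtual layer, so every consumer sees masked values no matter which
# underlying source (on-premise or cloud) the record comes from.

def mask_email(value: str) -> str:
    """Redact the local part of an email address."""
    local, _, domain = value.partition("@")
    return f"{local[:1]}***@{domain}" if domain else "***"

# Fields considered sensitive, and the masking function applied to each.
SENSITIVE_FIELDS = {"email": mask_email}

def apply_policy(record: dict) -> dict:
    """Return a copy of the record with sensitive fields masked."""
    return {
        key: SENSITIVE_FIELDS[key](value) if key in SENSITIVE_FIELDS else value
        for key, value in record.items()
    }

# The same policy applies wherever the row physically lives, because all
# access goes through this layer.
row = {"customer_id": 42, "email": "jane.doe@example.com", "region": "UK"}
print(apply_policy(row))  # {'customer_id': 42, 'email': 'j***@example.com', 'region': 'UK'}
```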

Moving big data to the cloud has many benefits. The cloud's scalable environment is far more cost-effective, and it can also improve the speed, performance and scalability of business operations.

Data virtualisation is emerging as the key to helping organisations achieve these benefits. By providing a single, logical view of all data, no matter where it resides, it enables businesses to shift big data into a cloud environment whilst still managing connectivity, responding to security fears and compliance requirements and, ultimately, finding whatever information they require.

When it comes to cloud migration, data virtualisation has the power to convert the big data problem into a big data opportunity.
