IT Transformation

Data and Information Transformation

By Choong Keng Leong June 29, 2012

We continue our IT transformation journey with the data and information layer. Another way of depicting the transformation layers discussed in my previous articles IT Transformation – Where to Start? and Transforming Your Infrastructure can be seen below.

Figure 1: IT Transformation Layers

Why is data and information part of the transformation journey, and what role does it play? Data and information are continuously being generated by the Business, Operations, and Consumers. This trend is not abating as mobility, social media, and the cloud continue to push the explosive growth of data and information. As a result, we have to transform how we manage, store, and mine data and information for business and operational use, more efficiently and cost-effectively.

While on this topic, I thought it would be good to clarify some terminology:

Intelligence vs. Analytics

While “intelligence” is about mining past data and information to report on “what” has happened, “analytics” is about using statistical and predictive modeling to answer the “what if” and “why” something happened.
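A minimal sketch can make the distinction concrete. The monthly sales figures below are purely illustrative: the "intelligence" part reports what happened, while the "analytics" part fits a simple trend line (ordinary least squares, done by hand here) to predict what comes next.

```python
# Hypothetical monthly sales figures (illustrative data only).
sales = [100, 110, 125, 130, 150, 160]

# "Intelligence": report on what has happened.
total = sum(sales)
growth = sales[-1] - sales[0]
print(f"Total sales: {total}, growth over period: {growth}")

# "Analytics": fit a simple trend line to predict the next month.
n = len(sales)
xs = range(n)
mean_x = sum(xs) / n
mean_y = total / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, sales)) / \
        sum((x - mean_x) ** 2 for x in xs)
intercept = mean_y - slope * mean_x
forecast = slope * n + intercept
print(f"Predicted next month: {forecast:.1f}")
```

Real analytics would of course use richer models and far more data; the point is only that prediction goes beyond summarizing history.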

If an enterprise can analyze and predict future trends, actions, or consumption behavior and patterns, then it has a winning edge over its competitors. The enterprise will be able to dangle a product or service that the customer needs or wants at that moment, or complement what he or she is doing. That is the power of data and information analytics.

Big Data

“Big Data” is not just about managing large quantities of data. A big dataset may not be Big Data if it is highly structured and can be processed using existing data management tools and statistical and visualization packages.

From MIKE2.0, Big Data is defined by the complexity of processing and analyzing data from many independent sources, including historical data and streaming inputs, to derive useful permutations and real-time outputs. A system that takes multiple real-time data feeds and performs detailed correlation and analysis is called Complex Event Processing (CEP).
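To illustrate the CEP idea, the toy sketch below (my own illustrative example, not a real CEP engine) correlates two independent sensor feeds and raises an alert whenever both feeds exceed a threshold within the same sliding window:

```python
from collections import deque

def detect_correlated_events(feed, window=3, threshold=100):
    """Flag timestamps where readings from two independent feeds
    both exceed a threshold within a sliding window (illustrative CEP)."""
    recent_a, recent_b = deque(maxlen=window), deque(maxlen=window)
    alerts = []
    for t, source, value in feed:  # feed merges both streams, ordered by time
        (recent_a if source == "a" else recent_b).append((t, value))
        if any(v > threshold for _, v in recent_a) and \
           any(v > threshold for _, v in recent_b):
            alerts.append(t)
    return alerts

# Interleaved readings from two hypothetical sensors "a" and "b".
events = [(1, "a", 50), (2, "b", 60), (3, "a", 120), (4, "b", 130), (5, "a", 40)]
print(detect_correlated_events(events))
```

Production CEP engines handle out-of-order events, time-based windows, and far more complex pattern rules; the sketch only shows the core idea of correlating streams as they arrive.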

Gartner defines Big Data as being three-dimensional: increasing volume (amount of data), velocity (speed of data in and out), and variety (range of data types and sources).

Transforming How We Manage Data & Information

The goal of an enterprise is to extract as much business value from its data as possible. In the Big Data era, this is achieved by manipulating large sets of structured, unstructured, or semi-structured data, using real-time techniques to access, secure, move, analyze, process, visualize, and enhance the data.

In order to support these real-time, complex, and computationally intensive processes, enterprises need to consider transforming the different layers in Figure 1:

  • Infrastructure

In the Big Data use case, the more data you have, the more accurate the forecasts and predictions from data analytics. This puts a strain on existing storage infrastructure. The requirement now is to have more data online, nearer to the applications, with high bandwidth. This differs from how we have traditionally designed storage, optimizing for IOPS performance and tiering and moving data to cheaper storage. As a result, scale-out storage solutions, Flash, and SSDs may be a better fit for Big Data.

Compute virtualization also provides the elasticity that allows one to easily and programmatically spin up or down computational resources as data processing demands grow or shrink.
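The elasticity idea can be sketched as a simple scaling rule. The function below is a hypothetical autoscaling policy of my own construction (one worker per fixed batch of queued jobs, clamped to a range); real platforms expose this through their own provisioning APIs.

```python
def scale_workers(queue_depth, per_worker=10, min_workers=1, max_workers=20):
    """Toy autoscaling rule: one worker per `per_worker` queued jobs,
    clamped to [min_workers, max_workers] (illustrative only)."""
    desired = -(-queue_depth // per_worker)  # ceiling division
    return max(min_workers, min(max_workers, desired))

print(scale_workers(45))   # moderate backlog -> scale up
print(scale_workers(5))    # light load -> scale down to the floor
print(scale_workers(500))  # heavy load -> capped at the ceiling
```

In practice the decision would feed a virtualization or cloud API call that actually spins virtual machines up or down.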

  • Applications

Big Data is being generated by everything around us, from smartphones, PCs, and laptops to the electronic sensors found in modern automobiles, in every organization, industry, and country.

Big Data Applications (BDAs) are a new breed of applications that consume Big Data. They are based on new architectures that can work with new data management approaches and tools, such as Hadoop, MapReduce, and MPP (Massively Parallel Processing), to handle massive amounts of data at reasonable cost and performance.
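The MapReduce model mentioned above is easy to sketch in miniature. The classic word-count example below runs in-process in plain Python, whereas Hadoop would distribute the same map and reduce phases across a cluster:

```python
from collections import defaultdict

def map_phase(document):
    """Map: emit a (word, 1) pair for each word in a document."""
    return [(word.lower(), 1) for word in document.split()]

def reduce_phase(pairs):
    """Reduce: sum the counts emitted for each word."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

docs = ["big data needs new tools", "new data new approaches"]
pairs = [p for d in docs for p in map_phase(d)]
print(reduce_phase(pairs))
```

The value of the model is that the map calls are independent and the reduce step groups by key, so both phases parallelize naturally across many machines.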

  • End User Computing

End users generate the bulk of Big Data, and they also consume it. With the consumerization of IT, computing and IT services are shifting towards lifestyle-based, self-service, on-demand, on-the-go, access-anywhere models. End users, through their smartphones, tablets, or laptops, can visualize and analyze data presented by a Big Data service provider, which aggregates data from various sources and runs the data analytics in the back-end.

This changes how we provide end-user applications and IT services.

  • Operating & Business Model

The rise of Big Data requires new thinking and new skill sets, such as Data Scientists and specialists with data analytics skills. With the required technical skills around Hadoop, MapReduce, and proprietary Big Data frameworks, they can help unlock business value from Big Data.

  • Security & Compliance

With a growing mass of unstructured data, accessed by a distributed cloud of users and applications that slice and dice the data in a million and one ways, security and compliance teams face a new challenge: keeping tabs on regulated information such as PCI, PHI, and PII data.

Implementing a data governance framework and classifying the data is the starting point for securing Big Data. Coupling this with data analytics to automate the process of understanding where all the sensitive data is located, who is accessing it, and what they are doing with it will help keep organizations in compliance.
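A first pass at locating sensitive data can be as simple as pattern scanning. The patterns below are deliberately crude illustrations; real PCI/PII discovery needs checksums (e.g. Luhn for card numbers), context, and false-positive handling.

```python
import re

# Illustrative patterns only -- real compliance scanning is far more involved.
PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "card_like": re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b"),
}

def scan_for_sensitive(text):
    """Return which sensitive-data categories appear in a blob of text."""
    return {name for name, pat in PATTERNS.items() if pat.search(text)}

record = "Contact alice@example.com, card 4111-1111-1111-1111"
print(sorted(scan_for_sensitive(record)))
```

Running such a scan over new data as it lands, and logging who touches the flagged records, is the kind of automation the paragraph above describes.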

  • Data Protection – DR, Backup, Recovery & Archive

Big Data requires organizations to rethink data protection. How do you protect and recover Big Data efficiently?

Potentially, cloud storage offers a cost effective and viable solution to safeguard big data and reduce recovery time; at the same time providing the benefits of scalability, elasticity and rapid deployment.
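Whatever the target medium, recovery is only as good as the integrity of the copy. The small sketch below (my own illustration, with an in-memory stand-in for a retrieved cloud object) shows the common practice of verifying a backup against a stored checksum:

```python
import hashlib

def checksum(data: bytes) -> str:
    """SHA-256 digest used to verify a backup copy on restore."""
    return hashlib.sha256(data).hexdigest()

original = b"big data block"
backup = original  # stand-in for a copy retrieved from cloud storage
assert checksum(backup) == checksum(original), "backup corrupted"
print("backup verified")
```

At Big Data scale this is typically done per chunk, so a corrupted chunk can be re-fetched without restoring the whole dataset.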

  • Service Automation

A typical Big Data application consists of multiple layers: database, web, process, cache, data synchronization and distribution, report, etc. Each layer uses different management, provisioning, monitoring and troubleshooting tools.

Without a consistent framework for management, monitoring, and orchestration across these layers, maintaining and managing Big Data Applications becomes significantly harder.
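The "consistent framework" point can be sketched as a single monitoring loop that treats every layer the same way. The per-layer checks below are hypothetical stubs; in reality each would call that layer's own tooling behind the shared interface.

```python
# Hypothetical per-layer health checks behind one uniform interface.
def check_database():  return True
def check_web():       return True
def check_cache():     return False   # simulate a failing layer

LAYERS = {"database": check_database, "web": check_web, "cache": check_cache}

def monitor(layers):
    """Run every layer's check through the same orchestration loop."""
    return {name: ("OK" if check() else "FAIL") for name, check in layers.items()}

print(monitor(LAYERS))
```

The payoff is that adding a new layer means registering one more check, rather than bolting on yet another monitoring tool.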


The era of Big Data is here, and the value it brings to business is clear. As with virtualization and cloud computing, organizations need to transform to leverage Big Data successfully: investing in proven technologies and updating workforce skills and data management processes, in order to implement a Big Data platform that suits the business and its objectives.

About Choong Keng Leong

Keng Leong has spent over 18 years dealing with large IT infrastructure projects in banks, government agencies, large telcos, and other organizations. He recognized the importance of IT as a Service early on, and has successfully helped many organizations move down that path.

Keng Leong has many professional certifications, including EMC Cloud Architect Expert (EMCCAe), Data Science Associate (EMCDSA) and ITIL v3 Expert, but his most important certification remains his sincere passion for IT as a Service and his strong belief in the future of IT being very cloud-centric.
