Big data represents an incredible amount of data: it accounts for more than 80% of the world's data today and comes in many forms – structured, semi-structured, unstructured, and real-time. It is being created and shared at a pace never seen before, driven by the rise of Information Technology Channels (ITC), which range from smartphones to tablets and include any kind of connected device (computers, TVs, GPS units, cars, etc.). The challenge for businesses is therefore to correctly understand and integrate all this data, and to find a solution that helps sort and extract the most essential and workable information.
Riding the Big Data wave, Data Quality Management (DQM) has become a critical issue that businesses should start taking seriously. According to a recent Forrester survey conducted among members of the Association of Business Process Management Professionals (ABPMP), only 38% of those working on improving CRM processes have evaluated the impact of poor-quality data on the overall performance of the process (1).
Strong evidence shows that companies should care about the quality of the data they use to make decisions. The benefits appear, on the one hand, as cost savings and, on the other, as improved Customer Relationship Management. When the quality of the data held in a company's information system is ensured, it becomes a strong competitive advantage and enables informed decision-making.
Nevertheless, spending time and money on improving data quality can look like a very expensive investment, because of the sometimes heavy projects it implies as well as the high price of the IT tools that must be set up. At the same time, it is becoming essential for businesses to start investing in data consolidation and standardization in order to maintain a real-time, centralized view of the business's situation and of customer information.
Therefore, the solution businesses can adopt is an IT tool that converts this mass of information into smart data – information that is workable right away. A holistic, synthetic view of an activity is the most efficient way to avoid losing time searching and sorting through all the available data. To preserve real-time insight, the solution should also make it possible to incorporate qualitative information, such as customer sentiment. Extracting meaningful information from any kind of source is an alternative to costly projects that involve a deep redesign of the company's information systems. The main objective is to differentiate valuable information from what should be considered "noise"!
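To make the "valuable information versus noise" idea concrete, here is a minimal, purely illustrative sketch of rule-based quality filtering. The field names, records, and validity rules are hypothetical examples, not taken from any particular DQM tool: the point is simply that a few explicit completeness and validity checks can separate records that are workable for decision-making from records that are noise.

```python
# Illustrative sketch: rule-based separation of workable records from "noise".
# All field names, records, and thresholds below are hypothetical examples.

def is_workable(record):
    """A record is 'workable' if the fields a decision needs are present and valid."""
    required = ("customer_id", "email", "amount")
    if any(record.get(field) in (None, "") for field in required):
        return False  # incomplete records are noise for decision-making
    if "@" not in record["email"]:
        return False  # trivially invalid contact data
    return True

records = [
    {"customer_id": 1, "email": "ana@example.com", "amount": 120.0},
    {"customer_id": 2, "email": "", "amount": 55.0},                # missing email
    {"customer_id": 3, "email": "bob-at-example", "amount": 10.0},  # invalid email
]

workable = [r for r in records if is_workable(r)]
noise = [r for r in records if not is_workable(r)]
```

A real DQM solution would of course apply far richer rules (standardization, deduplication, reference-data lookups), but even this toy filter shows how the distinction between "workable" and "noise" can be made explicit and automatic.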
Big Data, then, does not necessarily lead to poor-quality data. Businesses just have to keep in mind that it is the way they collect, store and process information that leads to data degradation. To avoid these issues, companies should implement a solution that helps them stay focused on what matters and on the meaningful insights coming from their data sources, with the option, if required, to drill down into their ocean of data.