Big data lives in a world of its own. It is messy, unstructured and constantly changing. To make use of it, (marketing) professionals need to leverage the power of data integration tools and get that data to a point where it is ready for analysis. This process, however, comes with its own challenges, so we have made a list of our top 5. Read on!


Challenge 1: Getting the data where you want it

Data is coming at you from multiple directions. The number of data sources you use is likely quite large already and it is going to grow even further in the future. Collecting, transforming and sending your data to a designated location is therefore becoming increasingly complex.

Clearly, finding the data is not the problem: It is there and there is plenty of it. The challenge is to manage it so that when it lands in your target database, it is easy to find, use and derive insights from. IT professionals Dipthi Karnad and Kapil Tulsan advise strongly in favor of integrating individual datasets one at a time in order to avoid errors early on.

“Suppose you need to merge data from three applications: a merchandise management system, a customer database, and a product database. Break it down into individual datasets such as customer information, sales data, financial data, etc., and merge them one at a time.”

While it may seem more efficient to do everything at once, the truth is that your integration tool may become overwhelmed by the volume of incoming data and end up performing at less than 100%. That, in turn, can lead to compromised data quality and faulty business decisions further down the line.
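As a minimal sketch of the "one dataset at a time" approach, here is what an incremental merge could look like in Python, using hypothetical in-memory datasets keyed on a shared customer ID. The dataset names and fields are illustrative, not taken from any particular tool.

```python
from copy import deepcopy

# Hypothetical datasets, keyed on a shared customer ID.
customers = {101: {"name": "Alice"}, 102: {"name": "Bob"}}
sales = {101: {"orders": 3}, 102: {"orders": 1}}
financials = {101: {"lifetime_value": 540.0}}

def merge_step(target, source, label):
    """Merge one dataset into the target, flagging records with no match."""
    for key, record in target.items():
        extra = source.get(key)
        if extra is None:
            print(f"warning: no {label} data for record {key}")
        else:
            record.update(extra)
    return target

# Merge one dataset at a time, so errors surface after each step
# instead of all at once at the end.
merged = merge_step(deepcopy(customers), sales, "sales")
merged = merge_step(merged, financials, "financial")
```

Because each step is isolated, a missing record (here, customer 102 has no financial data) is caught at the step that introduced it, rather than somewhere in a single all-at-once merge.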

Request a live demo for our data integration tool!

Challenge 2: Achieving great data quality

As soon as you have your data collected from its initial sources, the challenge is to make sure that it is all properly cleansed and harmonised, so you can move on to deriving insights. In other words, you need to ensure the highest possible quality of your data.

As we have said before, quality data is complete, consistent, timely and up-to-date. The last two of those characteristics can be particularly challenging, especially since data first enters your BI tool as one big, clunky mess. There is no real structure or order to it in this initial phase. What’s more, different sources push data at different intervals and in formats that often vary from one another. The risk of your data getting out of sync is, after all, not that small.

Bring your data quality up by activating a powerful ETL tool, and eliminate any possible inconsistencies:

"To improve data quality, make sure your Extract and Transform processes occur at the source level, and not at the target level. The Load and Display processes can then be completed at the target level."

Dipthi & Kapil
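A hypothetical sketch of that advice in Python: Extract and Transform run per source, so only cleansed, harmonised records ever reach the target. The sources, field names and formats are invented for illustration.

```python
from datetime import datetime

def extract(source):
    """Extract runs at the source level."""
    return source["records"]

def transform(records):
    """Transform also runs at the source level: trim names, normalise dates."""
    return [
        {
            "customer": r["customer"].strip().title(),
            "date": datetime.strptime(r["date"], r["fmt"]).date().isoformat(),
        }
        for r in records
    ]

def load(target, records):
    """Load runs at the target level, receiving already-clean records."""
    target.extend(records)

# Two sources with different name and date conventions (illustrative data).
source_a = {"records": [{"customer": "  alice smith", "date": "2017-03-01", "fmt": "%Y-%m-%d"}]}
source_b = {"records": [{"customer": "BOB JONES ", "date": "01/03/2017", "fmt": "%d/%m/%Y"}]}

target_db = []
for src in (source_a, source_b):
    load(target_db, transform(extract(src)))
```

Because harmonisation happens before loading, the target ends up with one consistent name and date format, no matter how each source formatted its records.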

Challenge 3: Integrating data at the deepest level possible

The end quality of your data depends almost entirely on how well you clean, transform and integrate it into your business processes. If there is even the slightest inconsistency in your initial data, it will seep through to every next move you make.

Karnad and Tulsan thus stress the importance of what is known as drill-down reporting. Say you are doing marketing for a retail brand: the data you collect can be sorted into a number of categories, from product and department to store, city, country and region, to name a few.

"The best way to integrate data is at the product level, because then the BI tool can automatically merge it with the higher levels as needed. Integrating data directly at the company level, for example, will open up challenges when generating product-level drill-down reports."

Dipthi & Kapil
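The roll-up the quote describes can be sketched as follows, using hypothetical product-level sales records. Store- and country-level figures are derived by aggregating upwards, which is exactly what becomes impossible if you only ever integrate at the company level.

```python
from collections import defaultdict

# Hypothetical product-level records; the stores, countries and figures
# are illustrative only.
rows = [
    {"product": "shoes",  "store": "Berlin-1", "country": "DE", "revenue": 120},
    {"product": "shirts", "store": "Berlin-1", "country": "DE", "revenue": 80},
    {"product": "shoes",  "store": "Paris-1",  "country": "FR", "revenue": 200},
]

def roll_up(records, level):
    """Aggregate product-level revenue up to a coarser level."""
    totals = defaultdict(int)
    for r in records:
        totals[r[level]] += r["revenue"]
    return dict(totals)

by_store = roll_up(rows, "store")
by_country = roll_up(rows, "country")
```

The reverse direction does not work: a single company-level total cannot be broken back down into product-level figures, which is why integrating at the finest level keeps drill-down reports possible.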

The more accurate the basis of your data is, the fewer errors you are bound to encounter and of course, the better the decisions you make in the end.

Challenge 4: The how and when of scalability

Even if it’s not top of mind all the time, big data and scalability go practically hand in hand. As data growth shows no sign of slowing down, being prepared for even larger volumes in the future is a must. Unfortunately, however, only a few organisations realise that before they embrace a specific BI tool.

While many still prefer to have their data housed on-premise, there are more and more cloud-based options out there. Deciding on a tool - whether on-premise, in the cloud, or a combination of both - involves an evaluation of a number of factors. Speed, cost, scalability and security are among them.

If you’re looking for speed, tight security and control, an on-premise data warehouse may be the choice for you: all your data sits in the server room at the back of your office, under your direct control. That said, security is also a main benefit of the cloud, where protocols are updated on a regular basis and potential bugs are fixed as soon as they are detected.

Speed is certainly a strong suit of on-premise solutions, since users do not have to wait for data to bounce through multiple servers, often in multiple, far-away countries.

Where a cloud solution clearly takes the lead, though, is in its much lower entry cost (no servers, no hardware!) and greater potential for scaling. After all, the space you can have in the cloud is practically infinite.

Challenge 5: Finding the right data talent

While one option is to have your data activities outsourced to a third party, there seems to be growing consensus around the fact that companies need to hire more talent with strong data skills.

In a survey we conducted earlier this year, more than 50% of respondents said they have various people responsible for the data integration process, and 47% have already created a dedicated position - anything from Data Analyst to Head of Data Technology and Chief Data Officer (CDO).

According to Wired, “the CDO ensures that strategic importance of data is properly maintained and managed throughout the organization. With access to detailed market information, customer data and predictive analytics, [the] CDO is in perfect position to identify ways in which this data can be harnessed.”

It looks like data and tech talent will be forming the core of many of today’s companies. Scott Hagedorn, CEO of Omnicom's new media agency Hearts & Science, is one of the people to highlight this shift that organisations are making in becoming more data-driven: “I see the roles changing [... and] that includes the rise of the marketing technologist as being a fundamental person on the team.”

At the end of the day, data integration is there to make the work of marketing, IT and other professionals easier and more efficient. It does not, however, come without its challenges, so do keep those in mind when crafting your long-term data (integration) strategy.

Download the PDF case study


Written by Mina Nacheva

Content marketer and strategist. With a background in media and business, Mina is particularly excited about the new trends in digital journalism, marketing and everything social media. Loves blogging, good jokes, and dreams about flying to Mars.


