
Interview with industry expert

Business intelligence migration: changing or adding a BI tool.

You’re in luck.

You know why? Because we sat down with one of our founding partners, Oskar Grondahl, to seek answers on the complex matter of migrating your data landscape from one BI tool to another. We recorded everything and transcribed it into this comprehensive interview and guide on business intelligence migration.

Oskar Grondahl
CEO @ NodeGraph

Having spent the last 15 years in the BI industry, Oskar specializes in Data Analytics, Governance and Enterprise Data Strategies. In 2016, Oskar left his CEO role at a successful Qlik consultancy to co-found NodeGraph – now a leading data intelligence platform with a unique automated solution.


See NodeGraph in action

Learn more about how NodeGraph can help automatically map your entire BI landscape and prepare it for migration by watching our online demo.

Watch Demo

When does a company need to migrate from one BI tool to another? What could be the reason for this?

There are a couple of different reasons. Mainly, what tends to happen is that many of these BI tools are brought into the company by different departments. For example, the Finance department might choose to buy Tableau because someone on that team is a native user of that tool, while Sales chooses to buy Qlik, and everyone starts building their own respective applications. Then, as this starts to grow, the IT department wants to organize everything but is stuck with a mashup of several different tools and platforms (Power BI, Qlik, Tableau, etc.) and needs to start consolidating – perhaps choosing to migrate over to a single tool or some decided-upon combination of tools.

Another reason is unique to QlikView and Qlik Sense. Historically, you could always just upgrade QlikView without needing to migrate anything. But when Qlik Sense arrived, it introduced a whole new technology, which means you actually have to migrate your solution. So that is one typical business intelligence migration case.

Then there’s always the case of acquisitions. When a smaller company gets acquired, it needs to adhere to the decided-upon tools of the parent company.

You mentioned the decision to use several tools simultaneously. Is there a reason why companies might choose to do this?

No, not really. Especially when we are talking about tools like Power BI, Tableau and Qlik, because they are fundamentally quite similar. They all have their edge, but overall you can accomplish the same result with each of them. There are some other BI tools with more mathematical structures, focusing on statistics and more complicated analytics. So, you might choose to have one visualization tool and complement it with something more mathematical.

Otherwise, I think it’s pretty hard to justify why you would choose to use more than one tool. Because the problem – and this is another driver to migrate – is that it’s very expensive to have several tools at once. And this isn’t even primarily linked to license costs, but more so to the cost of hiring consultants who know each respective tool to ensure competency. So, there are quite a few reasons against mixing tools.

But it’s still a huge challenge because a lot of this happens outside the control of IT (in other departments across the organization). And, generally, these departments move quite a lot faster than IT, so you end up in a situation where IT just gives them the go-ahead because they can’t keep up. And when you are up against management, departments like Sales often have more clout, again forcing IT to step down.

What common pitfalls exist in a business intelligence migration project?

I think the most common mistake is to try to do everything at once, which causes you to become overwhelmed and not really move forward. So, I think you need to refocus on your big wins and start the process there.

Then you also have the challenge of properly understanding both environments involved in the migration – which is especially hard when it comes to the tool you are moving away from.

Basically, it’s hard to motivate someone to document an entire solution that is in the process of being decommissioned.

And this is when it becomes especially important to have a platform like NodeGraph – which allows someone who does not necessarily have the competency in the tool in question to apply NodeGraph to the solution and understand how everything works.

How complex and time-consuming can such a project be? Is it possible to correctly estimate cost and time? What hidden costs are usually overlooked?

Let us look at an example. Omitting the actual client’s name, we have been working on a project where the client has managed to build up an environment of about 2,000 Qlik Sense applications in a very short amount of time (about 2 or 3 years) that they now want to migrate to Power BI.

So, this means that you have to go through each of these 2,000 applications and examine what expressions (i.e. KPIs) there are, and you also need to understand what data is being used. This means not only examining the expressions but also the code in all the different scripts. To do this manually, in 2,000 applications, is a massive project.
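To make that inventory step a little more concrete, here is a minimal Python sketch of the kind of scan involved. It assumes the load scripts have been exported as plain-text .qvs files into one folder; the folder name and the regular expressions are illustrative assumptions, not an actual NodeGraph or Qlik export format.

```python
# Rough inventory sketch, assuming each application's load script has been
# exported as a .qvs text file under one root folder. The folder name and
# regexes are illustrative assumptions only.
import re
from pathlib import Path
from collections import Counter

SCRIPT_ROOT = Path("exported_scripts")          # hypothetical export folder
SOURCE_PATTERN = re.compile(r"FROM\s+\[?([\w\./\\:\-]+)\]?", re.IGNORECASE)
EXPRESSION_PATTERN = re.compile(r"\b(Sum|Count|Avg|Min|Max)\s*\(", re.IGNORECASE)

sources = Counter()
expression_hits = Counter()

for script in SCRIPT_ROOT.rglob("*.qvs"):
    text = script.read_text(errors="ignore")
    # Tally every data source referenced in a LOAD ... FROM statement.
    for match in SOURCE_PATTERN.finditer(text):
        sources[match.group(1)] += 1
    # Crude count of aggregation expressions per application.
    expression_hits[script.stem] += len(EXPRESSION_PATTERN.findall(text))

print("Most referenced data sources:", sources.most_common(10))
print("Applications with the most expressions:", expression_hits.most_common(10))
```

Even a crude scan like this makes it clear why repeating the exercise by hand across 2,000 applications quickly becomes a massive project.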

Can you estimate ahead of time how much time a business intelligence migration project is going to take?

Yes. And I think the way you would go about doing this is to categorize your applications into three sections: easy, medium and difficult. Then you can estimate that a difficult application might take, for example, 80 hours to migrate, a medium one 40 and an easy one 20. Then you extrapolate the migration time based on this, plus or minus 5 or 10%.
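As a worked version of that estimate (the hours per category come from the answer above; the application counts are hypothetical):

```python
# Bucket the applications into easy/medium/difficult, multiply by the hours
# per bucket, and report a plus-or-minus 10% range. Counts are made up.
hours_per_app = {"easy": 20, "medium": 40, "difficult": 80}
app_counts = {"easy": 1200, "medium": 600, "difficult": 200}   # example only

base_estimate = sum(hours_per_app[k] * app_counts[k] for k in hours_per_app)
low, high = base_estimate * 0.9, base_estimate * 1.1

print(f"Base estimate: {base_estimate:,} hours")
print(f"With +/-10% margin: {low:,.0f} - {high:,.0f} hours")
```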

What hidden costs might arise?

I think the biggest cost here, and one that a lot of people underestimate, is the required involvement of people from the data management side, i.e. those who deliver the data. And here, communication and time issues often arise. The fact that you don’t own the whole data flow from start to finish, but only from the point where it gets delivered from the data warehouse, creates a potential point of tension where collaboration and knowledge-sharing is required.

Then, it’s worth noting that the BI tools do have their individual strong suits. Even though they look similar visually, some are more effective than others when it comes to, for example, handling large volumes of data. So, say we are looking at a case where you are moving from Qlik, which handles data in QVDs (an extremely efficient way to process data), to Power BI, and you realize that you just can’t do the same thing in Power BI. In this case, you would be required to build an additional staging environment just to make it work. So, there might be certain unforeseen functionality and compatibility issues that risk sinking the migration project.

Want to learn how to optimize time and cost when changing BI tool?

Then you should join our webinar on Thursday, September 24th 2020 – 3:00 PM (CEST). During this 30-minute live session, Oskar Grondahl (CEO at NodeGraph) will share more practical insights about…

  • Best practices and proven strategies for migrating to a different BI tool
  • Analyzing the impact of a data migration project on your BI environment
  • How you can use automation to speed up the migration, improve quality, and maintain business continuity

Fill in your details to register now.

In these business intelligence migration projects, how likely is it that you are going to find bad, outdated or undocumented data?

I’d say your chances are about 100%. You have to understand that a lot of the development of BI visualization environments is driven from the business side, and that those teams don’t usually have expertise in building IT software. So, they will build something that solves their problems without necessarily focusing on proper documentation. Often these solutions will also contain a lot of shortcuts – a result of having been built by someone with a business background rather than a tech background.

How can you identify them as early as possible without overspending on data preparation and without increasing the project scope exponentially?

The first thing to do, or the first thing I would do if I were looking at a migration project, is to analyze the underlying structure of the data solution. For example, how are the folders organized? This will give you an initial indication of the state of the solution. Then I would go into an example app and look more specifically at the code.

The problem is that, unless you have a tool like NodeGraph, you have to manually track how everything is connected. This becomes very time-consuming, and you also don’t get a comprehensive overview of the entire solution. By applying a tool like NodeGraph, however, you can instantly see where all the dependencies exist – in a way that would be impossible if you were manually opening each individual folder.
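For readers who want to attempt the manual first pass Oskar describes, here is a small Python sketch that summarizes how a solution’s folders are organized. The root path and file extensions are assumptions; adjust them to the environment being assessed.

```python
# Quick look at how a BI solution's folders are organized, as a first health
# check before a migration. Root path and extensions are assumptions.
from pathlib import Path
from collections import Counter
from datetime import datetime, timezone

ROOT = Path("/bi/solutions")                         # hypothetical root folder
APP_EXTENSIONS = {".qvf", ".qvw", ".pbix", ".twbx"}  # Qlik, Power BI, Tableau

apps_per_folder = Counter()
oldest, newest = None, None

for path in ROOT.rglob("*"):
    if path.suffix.lower() in APP_EXTENSIONS:
        apps_per_folder[path.parent.relative_to(ROOT)] += 1
        modified = datetime.fromtimestamp(path.stat().st_mtime, tz=timezone.utc)
        oldest = min(oldest or modified, modified)
        newest = max(newest or modified, modified)

print("Applications per folder:")
for folder, count in apps_per_folder.most_common():
    print(f"  {folder}: {count}")
print("Oldest / newest modification:", oldest, "/", newest)
```

A script like this only gives a first impression – which is exactly Oskar’s point about needing automated dependency mapping for anything beyond that.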

What tools and strategies are worth investing in?

There aren’t really any tools that will do the whole migration job for you. In other words, the migration has to take place in your native tools. Qlik has created a tool that helps you migrate from a QlikView application to Qlik Sense, so that is a possible way to migrate a small installation, but in a complex environment it will not work.

But I think that the strategy is key. As I said, first categorize your applications in terms of difficulty level, but also in terms of their importance. This is also a great time to sort through and clean your solution, remove any redundancies and fully optimize the way that you use your data.

All in all, a BI migration project is a big investment. And the larger it is, the more likely it is that you will need additional investment along the way. How likely is it that you will need to invest further during the migration process?

My feeling is that IT projects generally tend to become larger than initially predicted. And I think this applies here as well. Subsequent needs for further investment often stem from bad project management and incorrect prioritization.

How profitable can it be and how soon can you expect to get a return on your investment?

This is an impossible question to answer with a specific number. But instead of trying to apply one, you can look at it in terms of the resources you were allocating to different BI tools and teams prior to the migration. For example, if you have Power BI, Tableau and Qlik running simultaneously in your organization, you’ll have a cost associated with each of these. That acts as one parameter. Another important thing to consider is the quality of your solution(s). By uniting your data in one tool, you can ensure that you are basing your business decisions on the correct KPI (rather than having two or three separate versions to choose from).

Ultimately, I would say the ROI is split between direct savings (lower software and manpower costs) and, perhaps most importantly, indirect savings (the quality of your data solution).
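As a back-of-the-envelope illustration of the direct-savings side, the sketch below sums per-tool costs before and after consolidation. All figures are hypothetical placeholders, not benchmarks.

```python
# Hypothetical annual figures per tool - replace with your own numbers.
annual_cost_before = {
    "Qlik":     {"licenses": 120_000, "consultants": 200_000},
    "Power BI": {"licenses":  60_000, "consultants": 150_000},
    "Tableau":  {"licenses":  90_000, "consultants": 180_000},
}
annual_cost_after = {"Power BI": {"licenses": 150_000, "consultants": 250_000}}

before = sum(sum(v.values()) for v in annual_cost_before.values())
after = sum(sum(v.values()) for v in annual_cost_after.values())
print(f"Direct annual savings: {before - after:,} (before {before:,}, after {after:,})")
# Indirect savings - one agreed-upon definition per KPI instead of several -
# are harder to quantify but, as noted above, often matter more.
```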

What can you do to improve the project’s ROI / get a return faster?

I think a lot of this is related to your overall understanding of the solution in question, your ability to reach this understanding and the amount of time and money required to get there. Again, this can be drastically shortened by using NodeGraph. It’s a question of having to go through 2,000 applications manually vs. spending 8 hours to configure a platform that will do it for you – further allowing you to click around in the dependencies and really drill down on specific elements.

I think it’s in automating the mapping of your current data landscape that you will find the biggest time savings. When it comes to moving code, that takes the time it takes. However, when you’re talking about documentation and understanding the landscape, automation really is key to maximizing ROI and success.

Aside from applying NodeGraph during the migration process, does it serve a purpose beyond this?

Now we are entering the governance perspective – more specifically, how to assess complex environments. When you build a BI landscape, you need a solid structure to make it work. Sure, if you only build one application, you can wing it – but this won’t work for bigger solutions. And, for the purpose of monitoring this and ensuring that every application is built according to a predetermined standard, NodeGraph is perfect. Really, the value of NodeGraph is recurring and compounds over time: you can ensure that everything is properly documented and adheres to your governance guidelines.

Who usually leads a project like this? What roles in the project team do you need? How many people are typically involved?

You definitely need a clearly defined project manager who owns the migration project, together with some kind of enterprise architect who can establish the governance structure, best practices, and so on. These two roles, working in concert, are necessary.

And how many need to be involved if you use NodeGraph?

You can’t really say without knowing the size of the project. But what you can say is that the most critical part of a migration project lies in understanding the data solution(s) in question, and that is where NodeGraph shines.

What are the top challenges that businesses typically face when changing BI tools?

First off, it’s not necessarily easy to get the whole organization on board with the migration. It’s worth noting that there is probably a champion behind every BI platform currently in use who believes their tool is the best. So, you need to present the migration project not only in terms of IT optimization but in a way that everyone can get behind.

Next up, mapping the landscape and determining what should be migrated can be hugely challenging.

How does NodeGraph help to deal with these challenges? How does the NodeGraph data intelligence platform work? What does it do?

Alright. Well, say you don’t have NodeGraph and that you decide to use it for the first time during your business intelligence migration project. The first step would be to install NodeGraph and connect it to your current BI solution – allowing you to visualize your entire landscape. From here, you can make all the necessary categorizations and determine what elements should be migrated. After deciding this, you would move on to selecting the applications one-by-one (in NodeGraph) in order to see what dependencies exist in your solution. Following this step, you would connect NodeGraph to your new BI solution and compare the landscapes this way – ensuring that everything looks correct.

Finally, you can also do a quality check in NodeGraph – automatically testing your data to ensure that everything has been properly set up. I highly recommend setting up these tests and letting them run for a few weeks, because a lot of the logic in a BI solution is time-based (e.g. update this biweekly, read this incrementally), leaving room for errors that can really only be identified over time.
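A very stripped-down version of such a recurring check might look like the sketch below, assuming daily exports of the same KPI table from the old and new environments land as CSV files. The paths, column names and tolerance are illustrative assumptions – this is not NodeGraph’s own test framework. Scheduled daily (for example via cron) and left running for a few weeks, this is the kind of check that catches the time-based logic Oskar mentions.

```python
# Reconcile KPI totals exported from the old and new BI environments.
# File layout and column names are assumptions for this sketch.
import csv
from datetime import date
from pathlib import Path

OLD_EXPORT = Path("exports/old_env_kpis.csv")   # hypothetical daily export
NEW_EXPORT = Path("exports/new_env_kpis.csv")
TOLERANCE = 0.001                               # 0.1% relative difference

def kpi_totals(path: Path) -> dict[str, float]:
    """Read a CSV with kpi_name and value columns into a dict of totals."""
    totals: dict[str, float] = {}
    with path.open(newline="") as handle:
        for row in csv.DictReader(handle):
            totals[row["kpi_name"]] = totals.get(row["kpi_name"], 0.0) + float(row["value"])
    return totals

old_totals, new_totals = kpi_totals(OLD_EXPORT), kpi_totals(NEW_EXPORT)
for kpi, old_value in old_totals.items():
    new_value = new_totals.get(kpi)
    if new_value is None:
        print(f"{date.today()}: {kpi} missing in the new environment")
    elif old_value and abs(new_value - old_value) / abs(old_value) > TOLERANCE:
        print(f"{date.today()}: {kpi} drifted: {old_value} vs {new_value}")
```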

Are there any alternative tools or strategies? When is NodeGraph’s solution better than these alternatives?

I would say that without NodeGraph, the main strategy is to map and move the solution manually. Alternatively, you might choose to close down the former environment entirely and build the new landscape based on what people miss as a result of the shutdown. The latter is not a very popular way to go, but it can provide a level of efficiency that comes hand-in-hand with working on a blank canvas.

Can you touch on some specific cases where clients have successfully migrated their BI solution with NodeGraph? What were the key factors to success?

I think if we go back to the unnamed client from before: we are migrating a lot of different applications, but also a lot of different versions of the same KPI. For example, we have one KPI that NodeGraph can automatically tell us has 27 different definitions. If you were to manually map and migrate this, you might mistakenly migrate these 27 as individual KPIs, whereas NodeGraph clearly shows us that they are in fact one and the same.
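As an illustration of the underlying idea (not of NodeGraph’s implementation), duplicate KPI definitions can be detected by normalizing the expression text and grouping on it. The sample expressions below are made up:

```python
# Group KPI expressions by a normalized form to spot duplicate definitions.
import re
from collections import defaultdict

expressions = [
    ("Net Sales", "Sum(Sales) - Sum(Discounts)"),
    ("Net Sales", "sum( Sales )-sum( Discounts )"),
    ("Net Sales", "Sum(Sales)-Sum(Returns)"),
]

def normalize(expression: str) -> str:
    """Lowercase and strip whitespace so trivially different copies match."""
    return re.sub(r"\s+", "", expression).lower()

definitions = defaultdict(set)
for kpi_name, expression in expressions:
    definitions[kpi_name].add(normalize(expression))

for kpi_name, variants in definitions.items():
    # Here: two distinct definitions - the first two copies match once
    # normalized, while the third genuinely differs and needs review.
    print(f"{kpi_name}: {len(variants)} distinct definition(s)")
```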

That’s it for this time.


Want to learn more about the NodeGraph platform?

NodeGraph is a multinational data intelligence platform that helps large companies around the world visualize, understand, and scale their data environments in an intuitive way. To find out how NodeGraph can help you during your business intelligence migration and beyond, watch an online demo or connect with one of our experts directly.
Watch Demo