In these business intelligence migration projects, how likely is it that you are going to find bad, outdated or undocumented data?
I’d say your chances are about 100%. You have to understand that a lot of the development of BI visualization environments is driven from the business side, and those teams don’t usually have expertise in building IT software. So they will build something that solves their problems without necessarily focusing on proper documentation. Often these solutions will also contain a lot of shortcuts, a natural result of having been built by someone with a business background rather than a tech background.
How can you identify them as early as possible without overspending on data preparation and without increasing the project scope exponentially?
The first thing to do, or at least the first thing I would do when looking at a migration project, is to analyze the underlying structure of the data solution. For example, how are the folders organized? This gives you an initial indication of the state of the solution. Then I would go into an example app and look more specifically at the code.
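To make that first pass concrete, here is a minimal sketch of what a structural audit could look like, assuming the applications live as .qvw files under a deployment folder. The path and the one-year staleness threshold are illustrative assumptions, not something prescribed in the interview:

```python
from pathlib import Path
from collections import Counter
from datetime import datetime

# Hypothetical root of a QlikView deployment; adjust to your environment.
ROOT = Path("/srv/qlikview/apps")

apps_per_folder = Counter()
stale_apps = []

for app in ROOT.rglob("*.qvw"):
    apps_per_folder[app.parent.relative_to(ROOT)] += 1
    # Flag applications untouched for over a year as review candidates.
    modified = datetime.fromtimestamp(app.stat().st_mtime)
    if (datetime.now() - modified).days > 365:
        stale_apps.append((app.name, modified.date()))

for folder, count in apps_per_folder.most_common():
    print(f"{folder}: {count} app(s)")

print(f"\n{len(stale_apps)} app(s) not modified in over a year:")
for name, date in sorted(stale_apps, key=lambda item: item[1]):
    print(f"  {name} (last modified {date})")
```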
The problem is that, unless you have a tool like NodeGraph, you have to track how everything is connected manually. This becomes very time-consuming, and you still don’t get a comprehensive overview of the entire solution. With a tool like NodeGraph, however, you can instantly see where all the dependencies exist, in a way that would be practically impossible to achieve by opening each individual folder by hand.
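As an illustration of why the manual route is so laborious, a rough first approximation might scrape exported load scripts for FROM clauses to build a dependency edge list. The folder name and the regular expression here are assumptions, and real Qlik scripts have many more forms (RESIDENT loads, STORE statements, includes), which is exactly the gap a dedicated tool covers:

```python
import re
from pathlib import Path
from collections import defaultdict

# Hypothetical folder of exported load scripts (one .txt per application).
SCRIPT_DIR = Path("./load_scripts")

# Rough pattern for "FROM [source]" clauses; a first approximation,
# not a complete script parser.
FROM_PATTERN = re.compile(r"FROM\s+\[?([^\]\s;]+)\]?", re.IGNORECASE)

dependencies = defaultdict(set)
for script in SCRIPT_DIR.glob("*.txt"):
    text = script.read_text(errors="ignore")
    for source in FROM_PATTERN.findall(text):
        dependencies[script.stem].add(source)

# Print a simple edge list: application -> data source.
for app, sources in sorted(dependencies.items()):
    for source in sorted(sources):
        print(f"{app} -> {source}")
```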
What tools and strategies are worth investing in?
There aren’t really any tools that will do the whole migration job for you. In other words, the migration has to take place in your native tools. Qlik has created a tool that helps you migrate QlikView applications to Qlik Sense, so that is a possible route for a small installation, but in a complex environment it will not work.
But I think that strategy is key. As I said, first categorize your applications by difficulty level, but also by their importance. This is also a great time to sort through and clean up your solution, remove any redundancies, and fully optimize the way you use your data.
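A minimal sketch of that difficulty-versus-importance triage, with invented applications and scores (in practice the scores would come from your landscape analysis: dependency counts, script size, usage statistics), might look like this:

```python
# Hypothetical applications with 1-5 scores for difficulty and importance.
apps = [
    {"name": "Sales Dashboard",  "difficulty": 2, "importance": 5},
    {"name": "Legacy HR Report", "difficulty": 4, "importance": 1},
    {"name": "Finance KPI App",  "difficulty": 5, "importance": 5},
]

def bucket(app, threshold=3):
    """Place an app in one of four migration buckets."""
    hard = app["difficulty"] >= threshold
    important = app["importance"] >= threshold
    if important and not hard:
        return "migrate first (quick win)"
    if important and hard:
        return "plan carefully (high value, high effort)"
    if not important and not hard:
        return "migrate opportunistically"
    return "candidate for retirement"

for app in apps:
    print(f"{app['name']}: {bucket(app)}")
```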
All in all, a BI migration project is a big investment. And the larger it is, the more likely it is that you will need additional investment along the way. How likely is it that you will need to invest further during the migration process?
My feeling is that IT projects generally tend to become larger than initially predicted. And I think this applies here as well. Subsequent needs for further investment often stem from bad project management and incorrect prioritization.
How profitable can it be and how soon can you expect to get a return on your investment?
This is an impossible question to answer with a specific number. But instead of trying to put a figure on it, you can look at it in terms of the resources you were allocating to different BI tools and teams prior to the migration. For example, if you have Power BI, Tableau, and Qlik running simultaneously in your organization, you’ll have a cost associated with each of them. That is one parameter. Another important thing to consider is the quality of your solution(s). By uniting your data in one tool, you can ensure that you are basing your business decisions on the correct KPI, rather than having two or three separate versions to choose from.
Ultimately, I would say the ROI is split between direct savings, meaning lower software and manpower costs, and, perhaps most importantly, indirect savings in the form of a higher-quality data solution.
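To make the direct-savings side tangible, here is a back-of-the-envelope calculation with entirely hypothetical licence and staffing figures; plug in your own numbers:

```python
# Hypothetical annual licence costs for three parallel BI platforms.
annual_costs = {
    "Power BI": 40_000,
    "Tableau":  60_000,
    "Qlik":     80_000,
}
admin_cost_per_platform = 50_000  # hypothetical manpower cost per tool

current_total = sum(annual_costs.values()) + admin_cost_per_platform * len(annual_costs)
consolidated = annual_costs["Qlik"] + admin_cost_per_platform  # keep one platform

print(f"Running three platforms: {current_total:,} per year")
print(f"After consolidation:     {consolidated:,} per year")
print(f"Direct annual savings:   {current_total - consolidated:,}")
```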
What can you do to improve the project’s ROI / get a return faster?
I think a lot of this is related to your overall understanding of the solution in question, your ability to reach that understanding, and the amount of time and money required to get there. Again, this can be drastically shortened by using NodeGraph. It’s the difference between going through 2,000 applications manually and spending 8 hours configuring a platform that does it for you, one that then lets you click through the dependencies and really drill down on specific elements.
I think it’s in automating the mapping of your current data landscape that you will find the biggest time savings. Moving code takes the time it takes. But when you’re talking about documentation and understanding the landscape, automation really is key to maximizing ROI and success.
Aside from applying NodeGraph during the migration process, does it serve a purpose beyond this?
Now we are entering the governance perspective, and more specifically how to assess complex environments. When you build a BI landscape, you need a solid structure to make it work. Sure, if you only build one application, you can wing it – but this won’t work on bigger solutions. And, for the purpose of monitoring this and ensuring that every application is built according to a predetermined standard, NodeGraph is perfect. Really, the value of NodeGraph is recurring and exponential. You can ensure that everything is properly documented and adhering to your governance guidelines.
Who usually leads a project like this? What roles in the project team do you need? How many people are typically involved?
You definitely need a clearly defined project manager who owns the migration project, together with some kind of enterprise architect who can establish the governance structure, best practices, and so on. These two roles, working in concert, are essential.
And how many need to be involved if you use NodeGraph?
You can’t really say without knowing the size of the project. What you can say is that the most critical part of a migration project lies in understanding the data solution(s) in question, and this is where NodeGraph shines.
What are the top challenges that businesses typically face when changing BI tools?
First off, it’s not necessarily easy to get the whole organization on board with the migration. It’s worth noting that there is probably a champion behind every BI platform in use who believes their tool is the best. So you need to present the migration project not only in terms of IT optimization but in a way that everyone can get behind.
Next up, mapping the landscape and determining what should be migrated can be hugely challenging.
How does NodeGraph help to deal with these challenges? How does the NodeGraph data intelligence platform work? What does it do?
Alright. Well, say you don’t have NodeGraph and decide to use it for the first time during your business intelligence migration project. The first step would be to install NodeGraph and connect it to your current BI solution, allowing you to visualize your entire landscape. From here, you can make all the necessary categorizations and determine which elements should be migrated. After deciding this, you would select the applications one by one (in NodeGraph) to see what dependencies exist in your solution. Following this step, you would connect NodeGraph to your new BI solution and compare the two landscapes, ensuring that everything looks correct.
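As an illustration of that comparison step, you could think of each landscape as an edge list and diff them. The edges below are invented, and NodeGraph presents this visually rather than as a set difference, but the idea is the same:

```python
# Hypothetical dependency edges (application, data source) for each platform,
# e.g. produced by a mapping pass like the earlier script.
old_edges = {
    ("Sales Dashboard", "crm.opportunities"),
    ("Sales Dashboard", "erp.invoices"),
    ("Finance KPI App", "erp.general_ledger"),
}
new_edges = {
    ("Sales Dashboard", "crm.opportunities"),
    ("Finance KPI App", "erp.general_ledger"),
}

# Anything in the old landscape but not the new one was not carried over.
for app, source in sorted(old_edges - new_edges):
    print(f"MISSING in new landscape: {app} -> {source}")
```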
Finally, you can also do a quality check in NodeGraph, automatically testing your data to ensure that everything has been set up properly. I highly recommend setting up these tests and letting them run for a few weeks. This is because a lot of the logic in a BI solution is time-based (e.g. update this biweekly, read this incrementally), leaving room for errors that can really only be identified over time.
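A minimal, tool-agnostic sketch of such a recurring test might be a scheduled freshness check like the one below. The table names, the last_loaded column, the ISO-formatted timestamps, and the SQLite target are all assumptions; the point is that it runs repeatedly (e.g. via cron) so that time-based load logic has a chance to fail on its own schedule:

```python
import sqlite3
from datetime import datetime, timedelta

# Hypothetical freshness limits per table.
MAX_AGE = {
    "sales_facts": timedelta(days=1),
    "hr_snapshot": timedelta(days=14),
}

conn = sqlite3.connect("warehouse.db")  # hypothetical target database
for table, max_age in MAX_AGE.items():
    # Assumes last_loaded is stored as ISO-8601 text.
    (last_loaded,) = conn.execute(f"SELECT MAX(last_loaded) FROM {table}").fetchone()
    age = datetime.now() - datetime.fromisoformat(last_loaded)
    status = "OK" if age <= max_age else "STALE"
    print(f"{table}: last load {age} ago [{status}]")
```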
Are there any alternative tools or strategies? When is NodeGraph’s solution better than these alternatives?
I would say that without NodeGraph, the main strategy is to map and move the solution manually. Alternatively, you might choose to shut down the former environment entirely and build the new landscape based on what people miss as a result of the shutdown. The latter is not a very popular way to go, but it can provide the kind of efficiency that comes with working on a blank canvas.
Can you touch on some specific cases where clients have successfully migrated their BI solution with NodeGraph? What were the key factors to success?
If we go back to the unnamed client from before, we are migrating a lot of different applications, but also a lot of different versions of the same KPI. For example, we have one KPI that NodeGraph can automatically tell us has 27 different definitions. If you were to map and migrate this manually, you might mistakenly migrate those 27 as individual KPIs, whereas NodeGraph can clearly show us that they are in fact one and the same.
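As a toy illustration of how such duplicates can be detected, you could normalize each KPI expression before comparing them. The expressions below are invented, and NodeGraph performs this kind of analysis automatically across the whole landscape:

```python
from collections import defaultdict

# Hypothetical KPI definitions collected from different applications.
kpi_definitions = {
    "Sales App":   "Sum(Revenue) - Sum(Cost)",
    "Finance App": "sum(revenue)-sum(cost)",
    "Ops App":     "Sum( Revenue ) - Sum( Cost )",
}

def normalize(expr: str) -> str:
    # Lowercase and strip whitespace so trivially different spellings match.
    return "".join(expr.lower().split())

groups = defaultdict(list)
for app, expr in kpi_definitions.items():
    groups[normalize(expr)].append(app)

# All three apps collapse to one canonical definition.
for canonical, apps in groups.items():
    print(f"{canonical!r} used in: {', '.join(apps)}")
```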