How to make your data more valuable using Rapid BI

What is Rapid Business Intelligence (BI)? According to Lucas Thelosen, founder of DAS42, there is no single right or wrong approach; there are many right approaches and many wrong ones. Traditional BI can be explained by looking at one of Thelosen’s favorite books, The Hitchhiker’s Guide to the Galaxy. You formulate a question, for example: “What is the answer to life, the universe, and everything?”, the computer thinks about it for seven and a half million years, and it comes up with “42”. That is traditional BI. You ask a question, wait forever, and at some point you get an answer, and it’s not really the one you wanted.

In the traditional way of prototyping, you focus a lot of time on the transformation layer. You spend a good amount of time figuring out your extract, your transform, and your load, and the first big part of the project is building up your data warehouse and working out its structure. Many companies never make it to analytics in traditional BI projects because they have already spent too much money, too many resources, and too many hours setting up their data infrastructure.
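
To make that flow concrete, here is a minimal, purely illustrative Python sketch of traditional ETL: the business rules live in a transform step that has to be designed and built before anything reaches the warehouse. The file, table, and column names are assumptions made up for this example, not anything from Thelosen's projects.

    import csv
    import sqlite3

    def extract(path):
        # Pull raw rows out of a source system export (here: a CSV file).
        with open(path, newline="") as f:
            return list(csv.DictReader(f))

    def transform(rows):
        # The business rules are decided and coded up front,
        # long before anyone sees a report.
        return [
            {"order_id": r["id"], "revenue": float(r["amount"]) - float(r["refund"])}
            for r in rows
            if r["status"] == "completed"
        ]

    def load(rows, conn):
        # Only the already-transformed result lands in the warehouse.
        conn.execute("CREATE TABLE IF NOT EXISTS fact_orders (order_id TEXT, revenue REAL)")
        conn.executemany("INSERT INTO fact_orders VALUES (:order_id, :revenue)", rows)
        conn.commit()

    conn = sqlite3.connect("warehouse.db")
    load(transform(extract("orders.csv")), conn)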

The approach Lucas Thelosen proposes is the one that works best for his clients. It is based on building a data infrastructure as quickly as possible by centralizing data, virtually or physically. The second step is setting up your BI layer (analytics infrastructure), where you and your end users can see reports. With the business users in the same room looking at those reports, you can then set up the transformation layers you actually need. The transformation happens later in the process and together with the end user, in contrast to the traditional approach, where the transformation is done up front, long before any reports or dashboards are available.
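
As a rough sketch of what “centralize first, transform with the end user” can look like, the snippet below loads the raw data unchanged and then expresses the transformation as a view that can be rewritten while the reports are on screen. Again, the file, table, and column names are illustrative assumptions, not part of Thelosen’s or Data Virtuality’s tooling.

    import csv
    import sqlite3

    conn = sqlite3.connect("warehouse.db")

    # Step 1: centralize the raw data as-is, without reshaping it first.
    with open("orders.csv", newline="") as f:
        rows = list(csv.DictReader(f))
    conn.execute(
        "CREATE TABLE IF NOT EXISTS raw_orders (id TEXT, amount REAL, refund REAL, status TEXT)"
    )
    conn.executemany(
        "INSERT INTO raw_orders VALUES (:id, :amount, :refund, :status)", rows
    )

    # Step 2: the transformation is just a view, written (and rewritten)
    # together with the business users once the first reports exist.
    conn.execute("""
        CREATE VIEW IF NOT EXISTS fact_orders AS
        SELECT id AS order_id,
               amount - refund AS revenue
        FROM raw_orders
        WHERE status = 'completed'
    """)
    conn.commit()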

There are many benefits to Thelosen’s approach, but also some pitfalls. The main benefit is that you move fast and actually succeed with your BI, because you can provide insights to the people in the organization. Secondly, the approach is agile, so you can react quickly to changes in your business.

The number one pitfall is that many people do not take the time to clean up by deleting dashboards and reports they don’t use. They tend to hang onto them because they think they might need them later, and in the end, things become very messy. 

Model your data! That is one thing Lucas Thelosen stresses. It is very important to define what something means: “What is a new buyer?”, “What do we actually count as revenue?”, “What do we not count as revenue?”, “Who belongs in this cohort?”, “Why is this a cohort?” In short, define and document your KPIs (key performance indicators). This is incredibly important because it means you will never have to fight or argue over your KPIs in the reporting later on.
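
One lightweight way to do this, shown here purely as an illustration rather than a prescribed tool, is to keep every KPI definition in a single, versioned place so that “revenue” or “new buyer” means exactly the same thing in every report. The names, rules, and SQL fragments below are hypothetical.

    # Hypothetical KPI catalogue: the point is that each definition is
    # written down once, documented, and owned by a named team.
    KPI_DEFINITIONS = {
        "revenue": {
            "description": "Completed order amounts minus refunds; excludes taxes and gift cards.",
            "sql": "SUM(amount - refund) FILTER (WHERE status = 'completed')",
            "owner": "Finance",
        },
        "new_buyer": {
            "description": "A customer whose first completed order falls within the reporting period.",
            "sql": "COUNT(DISTINCT customer_id) FILTER (WHERE first_order_date >= :period_start)",
            "owner": "Marketing",
        },
    }

    def print_kpi_docs(definitions=KPI_DEFINITIONS):
        """Render the KPI catalogue as plain text for a wiki page or README."""
        for name, kpi in definitions.items():
            print(f"{name}: {kpi['description']} (owner: {kpi['owner']})")

    print_kpi_docs()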

According to Matthias Korn, Head of Solution Engineering at Data Virtuality, his company helps organizations build an agile data infrastructure within one day. Their solution, Data Virtuality, accelerates rapid BI prototyping by automatically managing the data flow between data sources and business intelligence tools, thereby freeing up time for the analytics part of the BI project. The Data Virtuality Logical Data Warehouse offers more than 200 connectors and a powerful modeling layer, making it, in Korn’s view, the most agile and performant data infrastructure for a company of any size.

How long does it take to create a BI setup?

Many companies struggle with setting up their BI. Months pass and it feels like not much has been accomplished. We asked experts how long it takes to create a BI setup with a rapid BI approach.

Thelosen remembers a client where step one, extracting and loading the data into a data warehouse, took a matter of days. The second step, starting the analytics implementation and getting initial dashboards up, took another five days; those dashboards were then used as a starting point to talk to people and iterate.

The biggest mistake Thelosen observes is that companies try to build all their connectors in-house. That is a huge waste of time and puts the project in danger. If it is your core competency and your name happens to be Data Virtuality, it makes sense to build these connectors in-house; if not, it is unwise.

Should I use a set of specialised tools, or a full stack BI tool?

There are some full stack systems on the market that do everything from loading and centralization right through to the analysis. Thelosen recommends separating those steps. With a full stack solution (Domo dashboards are a common example), you run into the problem that you do not have direct access to your data warehouse, which limits your data scientists’ ability to examine the data thoroughly. You are also tied to a vendor who controls the speed of your database, which effectively means that the vendor owns your database. It is therefore important that companies own their own data and keep it in-house and fully accessible.

Imagine a merger or an acquisition situation: from the perspective of company value and integration, it would be preferable to own your golden nuggets, that is, your data.

Furthermore, moving away from a full stack BI tool is difficult. Keeping the components separate allows companies to swap out each part of the BI stack independently: data warehouse, data integration, and analytics solutions can be chopped and changed as needs evolve. This makes companies truly agile, because they can adapt their BI stack at the speed their business changes. Companies locked into full stack BI tools lack these options and consequently operate at a chronic data disadvantage.

You can find the entire video series right here on our YouTube channel.