How to make your data more valuable using Rapid BI?

What is Rapid BI?

According to Lucas Thelosen, founder of DAS42, there is no single right or wrong approach: there are many right approaches and many wrong ones. Traditional BI can be explained with one of Lucas Thelosen's favourite books, The Hitchhiker's Guide to the Galaxy. You formulate a question, for example: "What is the answer to life, the universe, and everything?", the computer thinks about it for seven and a half million years, and comes up with "42". That is traditional BI. You ask a question, wait forever, and at some point you get an answer, and it is not really the one you wanted.

The traditional way of prototyping is to spend most of your time on the transformation layer: you invest a good amount of time figuring out your extract, your transform, and your load. The first big part of your project is building up your data warehouse and figuring out its structure. Many companies never make it to analytics in traditional BI projects, because they have already spent too much money, too many resources, and too many hours setting up the data infrastructure.

The approach Lucas Thelosen proposes is the one that works best for his clients. It is based on building a data infrastructure as quickly as possible by centralizing data, virtually or physically. The second step is setting up your BI layer (analytics infrastructure), where you and your end users can see reports. With the business users in the same room, you can now look at the reports together and set up the transformation layers that you need. You do the transformation later in the project and together with the end user, in contrast to the traditional approach, where the transformation comes first, long before any reports and dashboards are available.
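To make the ordering concrete, here is a minimal sketch of the load-first, transform-later idea, using Python's built-in sqlite3 as a stand-in warehouse and a hypothetical orders table (both are illustrative assumptions, not anything the article prescribes). Raw data is centralized as-is; the transformation is just a view that is cheap to rewrite while iterating with business users.

```python
import sqlite3

# Stand-in "warehouse": in rapid BI you extract and load raw source data
# first, and leave the transform step for later, done with end users.
conn = sqlite3.connect(":memory:")

# Step 1: centralize the raw data without any upfront transformation.
conn.execute(
    "CREATE TABLE raw_orders "
    "(order_id INTEGER, customer TEXT, amount REAL, status TEXT)"
)
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?, ?)",
    [(1, "acme", 100.0, "paid"),
     (2, "acme", 40.0, "refunded"),
     (3, "beta", 75.0, "paid")],
)

# Step 2: the transformation layer is a view, easy to change on the spot
# while reviewing reports with business users (here: only paid orders count).
conn.execute(
    "CREATE VIEW orders_clean AS "
    "SELECT order_id, customer, amount FROM raw_orders WHERE status = 'paid'"
)

revenue = conn.execute("SELECT SUM(amount) FROM orders_clean").fetchone()[0]
print(revenue)  # 175.0
```

If a user in the room points out that, say, partial refunds should count differently, only the view changes; the loaded raw data stays untouched.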

There are many benefits to this approach, but also some pitfalls. The main benefit is that you move fast and actually succeed in BI, because you can provide insights to the people in the organisation. Secondly, the approach is agile, so you can react quickly to changes in your business.

The number one pitfall is that many people do not take the time to clean up by deleting dashboards and reports they don't use. They tend to hang onto them because they think they might need them later, and in the end things become very messy. Model your data! That is one thing Lucas Thelosen stresses. It is very important to define what something means: "What is a new buyer?", "What do we actually count as revenue?", "What do we not count as revenue?", "Who belongs in this cohort?", "Why is this a cohort?". In short, define and document your KPIs (Key Performance Indicators). This is incredibly important because it means you will never have to fight or argue over your KPIs in the reporting later on. Modelling is very important.
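One practical way to pin KPI definitions down is to encode each one as a named, commented view in the warehouse, so every report shares a single definition. The sketch below assumes a hypothetical orders schema and again uses sqlite3 purely for illustration; the specific definitions (refunds excluded from revenue, "new buyer" means first order in the period) are example choices, not the article's.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders "
    "(order_id INTEGER, customer TEXT, amount REAL, is_refund INTEGER, order_date TEXT)"
)
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?, ?, ?)",
    [(1, "acme", 100.0, 0, "2024-01-05"),
     (2, "acme", 100.0, 1, "2024-01-06"),   # a refund: excluded from revenue
     (3, "beta", 50.0, 0, "2024-02-01")],
)

# KPI: "revenue" is defined exactly once, with the exclusion documented here.
conn.execute(
    "CREATE VIEW kpi_revenue AS "
    "SELECT SUM(amount) AS revenue FROM orders WHERE is_refund = 0"
)

# KPI: a "new buyer" in January is a customer whose FIRST order is in January.
conn.execute(
    "CREATE VIEW kpi_new_buyers_jan AS "
    "SELECT customer, MIN(order_date) AS first_order FROM orders "
    "GROUP BY customer HAVING MIN(order_date) LIKE '2024-01%'"
)

print(conn.execute("SELECT revenue FROM kpi_revenue").fetchone()[0])  # 150.0
print([r[0] for r in conn.execute("SELECT customer FROM kpi_new_buyers_jan")])  # ['acme']
```

Every dashboard then queries kpi_revenue rather than re-deriving revenue, which is what removes the later arguments over whose number is right.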

According to Matthias Korn, Head of Solution Engineering at DataVirtuality, his company helps organizations build an agile data infrastructure within one day. It accelerates rapid BI prototyping, as DataVirtuality automatically manages the data flow between data sources and analytics tools, freeing up time for the analytics part of the BI project. The DataVirtuality Logical Data Warehouse has more than 100 connectors and a powerful modeling layer, making it the most agile and performant data infrastructure for companies of any size.

How long does it take to create a BI setup?

Many companies struggle with setting up their BI. Months pass, and it feels like not much has been accomplished. We asked experts how long it takes to create a BI setup with a rapid BI approach.

Lucas Thelosen, Founder of DAS42, remembers a client where step one, extracting and loading the data into a warehouse, took a matter of days. The second step, starting the analytics implementation and getting initial dashboards up that could serve as a starting point to talk to people and iterate, took another five days. The biggest mistake Lucas Thelosen observes is that companies try to build and write all their connectors in-house. That is a huge waste of time and puts the project in danger. If it is your core competency and your name happens to be DataVirtuality, it makes sense to build these connectors in-house. If not, it is unwise.

Should I use a set of specialised tools, or a full stack BI tool?

There are some full stack systems on the market that do everything from loading and centralisation to analysis. Lucas Thelosen, Founder of DAS42, recommends separating those steps. With a full stack solution (a common example would be Domo dashboards), you run into the problem that you do not have direct access to your data warehouse, which limits the ability of your data scientists to examine the data thoroughly. Besides this, you are also tied to a vendor who controls the speed of your database, which effectively means that the vendor owns your database. It is therefore important that companies own their own data and have it in-house and fully accessible.

Imagine a merger or an acquisition situation: from the perspective of company value and integration, it would be preferable to own your golden nuggets, your data.

Furthermore, moving away from a full stack BI tool is difficult. Having separate components allows companies to switch easily between the individual parts of the BI stack: data warehouse, data integration solutions, and analytics solutions can be chopped and changed according to need. This enables companies to be truly agile, as they can adapt their BI stack at the speed their business changes. Companies with full stack BI tools lack these possibilities and, as a consequence, suffer from a chronic data disadvantage.
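One way this swappability plays out in code: if the analytics logic talks to the warehouse through a standard interface rather than a vendor-specific one, replacing the warehouse means changing the connection, not the reports. The sketch below is a hypothetical illustration using Python's DB-API, with SQLite standing in for the warehouse; the top_customers function and the orders table are invented for the example.

```python
import sqlite3

def top_customers(conn, limit=3):
    """Analytics logic written against the DB-API cursor interface, not a
    specific engine. Any DB-API-compatible connection can be passed in
    (note: the parameter placeholder style may differ between drivers)."""
    cur = conn.cursor()
    cur.execute(
        "SELECT customer, SUM(amount) AS total FROM orders "
        "GROUP BY customer ORDER BY total DESC LIMIT ?",
        (limit,),
    )
    return cur.fetchall()

# Today the "warehouse" is SQLite; if the company swaps in another engine,
# only the connection setup below changes, not the analytics function.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("acme", 100.0), ("beta", 75.0), ("acme", 25.0)],
)

print(top_customers(conn))  # [('acme', 125.0), ('beta', 75.0)]
```

Keeping the components decoupled at this interface boundary is what makes the "chop and change" flexibility described above realistic rather than theoretical.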