Over the weekend I had a bit of time on my hands and wanted to try out some new features in the latest Power BI release.
While looking for an interesting data set I thought about the Austrian election; as most of you know, Austria is the country where I was born before I moved to Australia. I came across an interesting website called “Neuwal.com“ that aggregates poll results and other election information, and the Neuwal operators are kind enough to make their data available in JSON format. So I started up the shiny new latest version of Power BI, connected to that data set, and put an initial simple analysis together in about an hour:
I tried the fairly new ribbon chart visual to show the aggregated average poll results; it is an interesting alternative to stacked bar charts for showing trends. The table visual on the right contains links to the publications that published the poll results, so with a click on the survey timeline the relevant links are automatically filtered and the user can navigate to the specific publications. Finally, at the top left I show a comparison of the last election result with the average of all polls for a selected time frame.
Putting this analysis together took about an hour, with a live link to the data source, so new Neuwal results are updated automatically. If you would like to “play with the application”, please use the embedded report below:
To discuss how Managility can help with your analytics projects please contact us here: Contact Managility
Managility just completed a project with a large retail company, analysing and optimising their shipment processes with a Power BI solution. The relevant data set included 300,000 annual shipment transactions with all relevant details like pickup/drop-off location, weight, volume, provider and shipment cost, as well as pricing data from a variety of logistics companies.
The initial step involved gaining insights into the data and providing the client with a clear picture. The outcome is shown in the dashboard below (for confidentiality reasons all data is anonymised). It enables insights into developments over time, grouping of data into weight and volume buckets, rankings of top sender and drop-off locations, as well as a variety of filter options (e.g. dangerous goods). Helpful insight was gained through a new custom visual called “Flow Map” that visualises route details through the thickness of the link between two locations.
The next step in the project focused on realising tangible savings. Shipment rates of all relevant logistics suppliers were added to the model. This fairly complex requirement, with rate parameters varying by route availability, weight, time of day, urgency, volume, etc., was handled effectively with Power Query. DAX calculation logic was then implemented to determine the optimal provider/rate option to minimise cost.
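The project's actual rate logic is not public, but a minimal sketch of the "cheapest eligible rate" idea can be expressed in DAX. All table and column names here (Shipments, Rates, Route, MinWeight, MaxWeight, Price) are hypothetical placeholders, not the client's model:

```dax
-- Hypothetical sketch: for each shipment, find the lowest price among
-- rate options that serve the same route and cover the shipment's weight,
-- then sum that minimum cost over all shipments.
Optimal Shipment Cost =
SUMX (
    Shipments,
    MINX (
        FILTER (
            Rates,
            Rates[Route] = Shipments[Route]
                && Shipments[Weight] >= Rates[MinWeight]
                && Shipments[Weight] < Rates[MaxWeight]
        ),
        Rates[Price]
    )
)
```

Comparing a measure like this against the actual cost paid per shipment is one way such a savings figure can be derived.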
The results of this process are available in an interactive dashboard that includes all providers and their shipping products. As shown in the screenshot below, savings of around $1m through using an optimal mix of logistics products were identified, which can then be further filtered using the criteria on the right.
The project was initiated using Managility’s Fast Start program, which included clarification of project deliverables and priorities, the building of an initial prototype, and a plan for the entire project, which in the end took only 10 days from start to finish. With the legacy BI solutions Managility used previously, a project like this would have taken weeks.
For information on the Managility Fast Start program and product/supplier mix optimization please contact us here.
PowerApps is a relatively new Microsoft service that enables business users to build mobile or web-browser-based apps on top of their corporate data, in an Office-like environment, with little or no code. At Managility we have been using PowerApps (apps) and the related Flow (workflows) services mostly to implement customised budgeting/forecasting process steps, for example budget data collection and ongoing tracking of actual purchases against assigned budgets. Recently we wanted to test how PowerApps could be utilised for our internal processes and built a practice management/time sheet system for our consultants. Within just a few days an initial version of the application was ready that runs as a mobile app or from a web browser.
PowerApps Active Projects screen
How does it work?
Initially, our consultants are automatically authenticated using their Office 365 subscription, without needing to log in separately. From that point onwards they can select a project and enter tasks and hours against it using an easy, touch-enabled interface.
Users with admin rights can access summary reports directly in the app, create new projects, or invoke a billing process that generates invoices, with a summary of billable hours, to be sent via email using our Xero accounting solution.
Finally, there is also the option to use the data of the app with an interactive Power BI dashboard:
Power BI Report
Contact us to discuss how PowerApps mobile applications could streamline your processes: Contact Managility
Power BI is an awesome tool, but no tool will be of any use if your underlying data features the 3 “I”s: Incomplete, Incorrect, and Inconsistent.
For a simple ad-hoc analysis of a single source, you can get by with fixing #1 in Power Query and hoping to identify the “Incorrect” during the analysis. It is no secret that the initial benefit of many Business Intelligence projects is typically the identification of data issues that become apparent through the improved insights effective data visualisation provides.
As soon as you want to use multiple sources, #3 becomes an issue: fixes in the front end (i.e. Power Query) will often no longer be enough, and you will require a broader BI repository strategy. A key part of that strategy is the architecture of an integrated data storage approach for analytical purposes, often referred to as a “Data Warehouse”.
For us at Managility, the reporting and front-end side is typically the easiest part of a project, taking 10-20% of the overall implementation time. The major portion of the remaining time is spent on designing and implementing the back end: loading and cleansing data, ensuring consistency between the different sources, and landing it all in a database optimised for analytics.
In our tech speak this process is referred to as ETL (Extraction, Transformation and Loading). As much as software vendors try to position their tools as “really easy” and “self-service”, an inherent part of the data integration process is the set of complex issues where data simply “doesn’t match”: mappings and rules will need to be established that no out-of-the-box process can cover completely, and that typically requires specialist knowledge.
At a minimum, we recommend that as soon as you embark on more complex analytics projects with different sources, you make yourself familiar (or find someone to help you…) with multi-dimensional modelling and how to structure data into star schemas: in a nutshell, the separation of facts (records of what is happening) from dimensions (the details and hierarchies by which you want to analyse your data). Structuring data this way is a more or less mandatory concept in Power BI if you want to analyse across different sources. At a minimum, you will require a dimension table with distinct elements by which the different sources and their “Fact Tables” can be joined. The simplest and likely most widely used example is a common date table, which we will cover in more detail in the next tips & tricks post here.
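As a taste of what such a shared dimension looks like, here is a minimal DAX calculated date table. The column names are illustrative choices, not a fixed convention:

```dax
-- Minimal shared date dimension. CALENDARAUTO scans all date columns in
-- the model and returns one contiguous range of dates covering them all;
-- ADDCOLUMNS then adds the attributes you want to slice and group by.
Date =
ADDCOLUMNS (
    CALENDARAUTO (),
    "Year", YEAR ( [Date] ),
    "MonthNo", MONTH ( [Date] ),
    "Month", FORMAT ( [Date], "MMM YYYY" )
)
```

Relating each fact table's date column to this one table is what lets a single date slicer filter every source at once.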
Inspired by Delta Master, a BI tool that we have worked with for many years and that combines advanced analytics with a great UI, I was looking to recreate one of its cool features in Power BI. Delta Master is a brilliant tool but unfortunately carries an enterprise price tag. As we will see, we can get pretty close to one of its analysis methods using Power BI.
The Delta Master feature is called “Power Search” and performs a ranking across all dimensions and levels in a multi-dimensional cube (e.g. Analysis Services).
The user specifies a measure, e.g. “Revenue”, and the method returns a ranking across all dimensions, as shown here for example:
This is a great starting point for seeing what is “driving” your revenue. In this case the first two results are trivial, as we only operate in this market and with one company, but beyond those it gets more interesting: we see that a customer group has the biggest share of revenue with 80%, followed by a product group classification with 78%, and so on.
A ranking like this is also particularly interesting on a measure like actual-versus-budget variance, as it saves me from navigating through multiple dimensions to find where the variance is coming from. Using the Power Search approach, I get the key factors immediately.
Taking this approach further, Power Search also allows you to include attribute combinations in the ranking, which is when it gets even more interesting, for example:
To realise something similar in Power BI, just add a Table visual to your dashboard and drag the attributes that you want to use in the ranking into it:
And voilà: we see an overview of what is driving revenue across all relevant attributes. To calculate each row’s share of the total, just use a simple DAX calculation:
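The original calculation isn't reproduced here, but a common way to express a share-of-total measure, assuming a base measure named [Revenue] already exists in the model, is:

```dax
-- Hypothetical share-of-total measure: divide the current row's revenue
-- by the revenue of everything visible under the current slicer selection.
-- DIVIDE returns BLANK instead of erroring when the denominator is zero.
Revenue % of Total =
DIVIDE (
    [Revenue],
    CALCULATE ( [Revenue], ALLSELECTED () )
)
```

Formatted as a percentage, this gives each attribute value its share of the filtered total, which is exactly what the Power Search style ranking needs.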