Power BI & The Austrian Election

Over the weekend I had a bit of time on my hands and wanted to try out some new features in the latest Power BI release.

While looking for an interesting data set I thought about the Austrian election. As most of you know, Austria is the country where I was born before moving to Australia. I came across an interesting web site called “Neuwal.com“ that aggregates poll results and other election information, and the Neuwal operators are kind enough to make their data available in JSON format. So I started my shiny new latest version of the Power BI engine, connected to that data set, and put an initial simple analysis together in about an hour:

I tried the fairly new ribbon chart visual to show the aggregate, average poll results; this is an interesting alternative to stacked bar charts for showing trends. The table visual on the right contains the links to the publications that have published the poll results. So, with a click on the survey timeline, the relevant links are automatically filtered and the user can navigate to the specific publications. Finally, at the top left I am showing a comparison of the last election to the average of all polls for a selected time frame.
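For anyone curious about the mechanics, a comparison like this boils down to very little DAX. A minimal sketch, assuming a Polls table with one row per party per poll and an Elections table holding the last election results (these table and column names are my own illustration, not Neuwal’s actual schema):

-- Average polled share for the parties and time frame currently selected
Average Poll Share = AVERAGE ( Polls[Share] )

-- Gap between the poll average and the last election result
Poll vs Last Election = [Average Poll Share] - AVERAGE ( Elections[Share] )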

Putting this analysis together took about an hour, with a live link to the data source, so new Neuwal results are updated automatically. If you would like to “play with the application” please use the embedded report below:

To discuss how Managility can help with your analytics projects, please contact us here: Contact Managility

Managility Logistics Optimization For Power BI

Managility just completed a project with a large retail company analyzing and optimizing their shipment processes based on a Power BI solution. The relevant data set included 300,000 annual shipment transactions with all relevant details like pickup/drop-off location, weight, volume, provider and shipment cost, as well as pricing data from a variety of logistics companies.

The initial step involved gaining insights into the data and providing the client with a clear picture. The outcome is shown in the dashboard below (for confidentiality reasons all data is anonymised). It enables insights into developments over time, grouping of data into weight and volume buckets, rankings of top sender and drop-off locations, as well as a variety of filter options (e.g. dangerous goods). Helpful insight was gained through a new custom visual called “Flow Map” that visualises route details through the thickness of the link between two locations.

The next step in the project was focused on realising tangible savings. Shipment rates of all relevant logistics suppliers were added to the model. This fairly complex requirement, with a variety of varying rate parameters like availability on the route, differentiation by weight, time of day, urgency and volume, was handled effectively with Power Query. DAX calculation logic was then implemented to determine the optimal provider/rate option to minimise cost.
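The client model is confidential, but the core of the cost-minimisation logic can be sketched in a few lines of DAX. The Shipments and Rates tables and their columns below are assumptions for illustration, matching rates to shipments by route only (the real model also handled weight, urgency and time of day):

-- For each shipment, subtract the cheapest matching rate from the cost actually paid
Potential Saving =
SUMX (
    Shipments,
    Shipments[Cost]
        - CALCULATE (
            MIN ( Rates[Cost] ),
            FILTER ( ALL ( Rates ), Rates[Route] = Shipments[Route] )
        )
)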

The results of this process are available in an interactive dashboard that includes all providers and their shipping products. As shown in the screenshot below, savings of around $1m from using an optimal mix of logistics products were identified; the results can then be further filtered using the criteria on the right.

The project was initiated using Managility’s Fast Start program, which included clarifying project deliverables and priorities, building an initial prototype, and creating a plan for the entire project, which in the end took only 10 days from start to finish. With the legacy BI solutions Managility had been using beforehand, a project like this would have taken weeks.

For information on the Managility Fast Start program and product/supplier mix optimization, please contact us here.

Powerapps based Practice Management App

What Is It?

PowerApps is a relatively new service by Microsoft that enables business users to build mobile or web browser based apps in conjunction with their corporate data, in an Office-like environment, with no or very few lines of code.
At Managility we have been using PowerApps (apps) and the related Flow (workflows) services mostly to implement customised budgeting/forecasting process steps, such as budget data collection and ongoing tracking of actual purchases against assigned budgets.
Recently we wanted to test how PowerApps could be utilised for our internal processes and built a practice management / time sheet system for our consultants. Within just a few days an initial version of the application was ready that runs as a mobile app or from a web browser.

How Does It Work?

Initially, our consultants are automatically authenticated using their Office 365 subscription without needing to log in separately. From that point onwards they can select a project and enter tasks and hours against it using an easy, touch-enabled interface.

Users with admin rights can access summary reports directly in the app, create new projects, or invoke a billing process that generates invoices with a summary of the billable hours, sent via email using our Xero accounting solution.

Finally, there is also the option to use the app’s data in an interactive Power BI dashboard:

Contact us to discuss how PowerApps mobile applications could streamline your processes: Contact Managility

Power BI Top 10 Learnings Tip #1 GIGO: Garbage In Garbage Out

Power BI is an awesome tool but no tool will be of any use if your underlying data features the 3 “I”s:

1. incomplete,

2. incorrect,

3. inconsistent

For a simple ad-hoc analysis of one source, you can get by with fixing #1 in Power Query and hoping to identify the “incorrect” in the analysis. It’s no secret that the initial benefits of many Business Intelligence projects are typically the identification of data issues that become apparent through improved insights with the use of effective data visualisation.

As soon as you want to use multiple sources, #3 will become an issue: changes in the front end (i.e. Power Query) will often no longer fix the problems, and you will require a broader BI repository strategy. A key part of that strategy is the architecture of an integrated data storage approach for analytical purposes, often referred to as a “Data Warehouse”.

For us at Managility, the reporting and front end side is typically the easiest part of a project, taking 10-20% of the overall implementation time. The major portion of the remaining time in our projects is spent on designing and implementing the back end: loading, cleansing, and ensuring consistency of data between the different sources in a database optimised for analytics.

A simple Star Schema used for ClearJelly.net

This process is referred to as ETL (Extraction, Transformation and Loading) in our tech speak. As much as software vendors will try to position their tool as “really easy” and “self-service”, an inherent part of the data integration process is dealing with complex issues where data typically “doesn’t match”. Mappings and rules will need to be established that no out-of-the-box process will be able to cover completely, and that typically requires specialist knowledge.

At a minimum, we recommend that as soon as you embark on more complex analytics projects with different sources, you make yourself familiar (or find someone to help you…) with multi-dimensional modelling and how to structure data into star schemas: in a nutshell, the separation of facts (records of what is happening) and dimensions (the details and hierarchies by which you want to analyse your data). It is a more or less mandatory concept in Power BI to have data structured that way if you want to analyse across different sources. At minimum, you will require a dimension table with distinct elements by which the different sources and their “Fact Tables” can be joined. The simplest and likely most widely used example is a common date table, which we will cover in more detail in the next tips & tricks post here.
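As a starting point, such a common date table can be created directly in DAX as a calculated table. A minimal sketch (the date range is arbitrary; extend the columns as needed):

-- One row per day; every fact table joins to this on its date key
Date =
ADDCOLUMNS (
    CALENDAR ( DATE ( 2015, 1, 1 ), DATE ( 2018, 12, 31 ) ),
    "Year", YEAR ( [Date] ),
    "Month Number", MONTH ( [Date] ),
    "Month", FORMAT ( [Date], "MMM" )
)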

Easy Sentiment Analysis for Twitter

In this post we cover how the new Microsoft Flow can be used to insert data from Twitter, with sentiment analysis, into a Power BI streaming dataset.

Create a streaming dataset in Power BI:

Go to PowerBI.com –> “Streaming datasets” –> Create streaming dataset.

Now we will create a dataset of type API.

Name the dataset, then add the following values:

Time: DateTime

Tweet: Text

Sentiment: Number

Sentiment values will be numbers between 0 and 1 (with 0 being negative and 1 positive).

Create a flow to push data from Twitter to Power BI:

Go to flow.microsoft.com –> My Flows –> Create from blank.

Enter your flow name and title, and on the screen below click on the Twitter category.

Then select New step –> Add an action –> search for “Sentiment” and select “Text Analytics – Detect Sentiment”.

For example, select the tweet text as the text that will be analysed by Text Analytics – Detect Sentiment.

In the last step we are going to push the data into the Power BI streaming dataset.

Click on New step –> Add an action –> select Power BI –> then click on “Power BI – Add rows to a dataset”.

Then select your workspace, dataset, and table.

Then map the following fields:

Time: Created at

Tweet: Tweet Text

Sentiment: Score

Then select Create flow.

Now the data is being inserted directly into your streaming dataset.

Let’s see how to use that in a dashboard:

Go to Power BI, open a dashboard, then click on Add tile.

Click on the tile –> select “Custom Streaming Data” –> select your streaming dataset –> select your visual type, e.g. Line Chart –> then add Time as Axis, Tweet as Legend, and Sentiment as Values.

Or you can build a report directly from your streaming dataset.

Go to the streaming dataset –> click on Create report.
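If you connect to the dataset from Power BI Desktop, a couple of simple DAX measures can summarise the scores. A sketch, assuming the streaming table is exposed under the name RealTimeData (an assumption here; substitute your own table name):

-- Average sentiment score of the tweets in the current filter context
Average Sentiment = AVERAGE ( RealTimeData[Sentiment] )

-- Number of tweets scoring above an arbitrary 0.5 cut-off
Positive Tweets = COUNTROWS ( FILTER ( RealTimeData, RealTimeData[Sentiment] >= 0.5 ) )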

Power Search

Inspired by Delta Master, a BI tool that we have worked with for many years and that combines advanced analytics with a great UI, I was looking to realise one of its cool features in Power BI. Delta Master is a brilliant tool but unfortunately has an “enterprise” price tag. As we will see, we can get pretty close to one of its analysis methods using Power BI.

The Delta Master feature is called “Power Search” and does a ranking across all dimensions and levels in a multi-dimensional cube (e.g. Analysis Services).

The user can specify a measure, e.g. “Revenue”, and the method returns a ranking across all dimensions, for example as shown here:

This is a great starting point to see what is “driving” your revenue. In this case the first 2 results are trivial, as we are only operating in this market and with one company, but from rank 2 it gets more interesting: we see that a customer group has the biggest share of “Revenue” with 80%, followed by a product group classification with 78%, etc.

A ranking like this is also particularly interesting on a measure like Actual-Budget variance, as it avoids having to navigate through multiple dimensions to find where the variance is coming from. Using the Power Search approach, I get the key factors immediately.
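The variance measure feeding such a ranking can be as simple as the following sketch (the Actuals and Budget table and column names are assumptions):

-- Actual minus budget; drop this into the table visual and sort by it
Budget Variance = SUM ( Actuals[Amount] ) - SUM ( Budget[Amount] )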

Taking this approach further, Power Search also allows you to include combinations of attributes in the ranking, which is when it gets even more interesting:

To realise something similar in Power BI, you just add the Table visual to your dashboard and drag the attributes that you want to use in the ranking into it:

Power Search

And voilà, we see an overview of what is driving revenue across all relevant attributes. To calculate the share of the total, just use this simple DAX calculation:

Share = SUM ( Xero_Sales_Data[LineAmount] ) / CALCULATE ( SUM ( Xero_Sales_Data[LineAmount] ), ALLSELECTED () )
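To take this one step further, an explicit rank can be shown next to the share with a RANKX measure along these lines (a sketch; CustomerGroup stands in for whichever attribute you have dragged into the table visual):

-- Rank the currently selected customer groups by revenue; ties share a rank
Revenue Rank =
RANKX (
    ALLSELECTED ( Xero_Sales_Data[CustomerGroup] ),
    CALCULATE ( SUM ( Xero_Sales_Data[LineAmount] ) )
)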