PowerApps-based Practice Management App

What Is It?

PowerApps is a relatively new Microsoft service that enables business users to build mobile or browser-based apps on top of their corporate data, in an Office-like environment, with little to no code.
At Managility we have mostly used PowerApps (apps) and the related Flow (workflows) service to implement customised budgeting/forecasting process steps, for example budget data collection and ongoing tracking of actual purchases against assigned budgets.
Recently we wanted to test how PowerApps could be utilised for our internal processes, and built a practice management / timesheet system for our consultants. Within just a few days an initial version of the application was ready, running as a mobile app or in a web browser.

 

How does it work?

Our consultants are automatically authenticated through their Office 365 subscription, with no separate login. From there they can select a project and enter tasks and hours against it using an easy, touch-enabled interface.

Users with admin rights can access summary reports directly in the app, create new projects, or invoke a billing process that generates invoices, with a summary of billable hours, to be sent via email through our Xero accounting solution.

 

 

Finally, there is also the option to use the app’s data in an interactive Power BI dashboard:

 

Contact us to learn how PowerApps mobile applications could streamline your processes: Contact Managility

Power BI Top 10 Learnings Tip #1 GIGO: Garbage In, Garbage Out

Power BI is an awesome tool, but no tool will be of any use if your underlying data features the three “I”s:

1. incomplete,

2. incorrect,

3. inconsistent

For a simple ad-hoc analysis of one source you can get by fixing #1 in Power Query and hoping to identify the “incorrect” during the analysis. It’s no secret that the initial benefit of many Business Intelligence projects is typically the identification of data issues that become apparent through improved insights and effective data visualisation.

As soon as you want to use multiple sources, #3 becomes an issue. Changes in the front end (i.e. Power Query) will often no longer fix the problems, and you will require a broader BI repository strategy, a key part of which is an integrated data storage architecture for analytical purposes, often referred to as a “Data Warehouse”.

For us at Managility, the reporting and front-end side is typically the easiest part of a project, taking 10-20% of the overall implementation time. The major portion of the remaining time in our projects is typically spent on the design and implementation of the back end: loading and cleansing data, ensuring consistency between the different sources, and bringing it into a database optimised for analytics.

A simple Star Schema used for ClearJelly.net

This process is referred to as ETL (Extraction, Transformation and Loading) in tech speak. As much as software vendors try to position their tools as “really easy” and “self-service”, an inherent part of the data integration process is dealing with complex issues where data typically “doesn’t match”. Mappings and rules will need to be established that no out-of-the-box process can cover completely, and that typically requires specialist knowledge.

At a minimum, we recommend that as soon as you embark on more complex analytics projects with different sources, you make yourself familiar (or find someone to help you…) with multi-dimensional modelling and how to structure data into star schemas: in a nutshell, the separation of facts (records of what is happening) and dimensions (details and hierarchies by which you want to analyse your data). Structuring data this way is more or less mandatory in Power BI if you want to analyse across different sources. At a minimum, you will require a dimension table with distinct elements by which the different sources and their “fact tables” can be joined. The simplest and likely most widely used example is a common date table, which we will cover in more detail in the next tips & tricks post here.
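To make this concrete, here is a minimal sketch that generates such a common date dimension as a CSV with one distinct row per day, which each source’s fact table can then join to (PHP purely for illustration; the column names are our own choice):

[code language="php"]
<?php
// Illustrative: generate a simple date dimension, one distinct row per day,
// that fact tables from different sources can join on.
$start = new DateTime('2015-01-01');
$end   = new DateTime('2017-12-31');

$out = fopen('date_dimension.csv', 'w');
fputcsv($out, ['DateKey', 'Date', 'Year', 'Quarter', 'Month', 'MonthName', 'Day', 'Weekday']);

for ($d = clone $start; $d <= $end; $d->modify('+1 day')) {
    fputcsv($out, [
        $d->format('Ymd'),              // surrogate key, e.g. 20160131
        $d->format('Y-m-d'),
        $d->format('Y'),
        'Q' . ceil($d->format('n') / 3),
        $d->format('n'),
        $d->format('F'),
        $d->format('j'),
        $d->format('l'),
    ]);
}
fclose($out);
[/code]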

Easy Sentiment Analysis for Twitter

In this post we cover how the new Microsoft Flow can be used to insert data from Twitter, enriched with sentiment analysis, into a Power BI streaming dataset.

Create a streaming dataset in Power BI:

Go to PowerBI.com –> “Streaming datasets” –> Create streaming dataset.

Now we will create a dataset of type API.

Name the dataset, then add the following fields:

Time: DateTime

Tweet: Text

Sentiment: Number

Sentiment values are numbers between 0 and 1 (0 being negative and 1 positive).
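For background, an API streaming dataset is fed through the push URL that Power BI shows under the dataset’s “API Info”. Flow will handle this for us in the next section, but you can also push rows yourself; a minimal sketch (the URL, including its key, is a placeholder you copy from Power BI):

[code language="php"]
<?php
// Illustrative: push one row into a Power BI streaming dataset.
// Copy the real push URL (with its key) from the dataset's "API Info" page.
$pushUrl = 'https://api.powerbi.com/beta/<workspace>/datasets/<dataset-id>/rows?key=<key>';

$rows = json_encode([[
    'Time'      => gmdate('c'),          // DateTime field
    'Tweet'     => 'Sample tweet text',  // Text field
    'Sentiment' => 0.87,                 // Number between 0 and 1
]]);

$ch = curl_init($pushUrl);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $rows);
curl_setopt($ch, CURLOPT_HTTPHEADER, ['Content-Type: application/json']);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_exec($ch);
curl_close($ch);
[/code]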

Create a flow to push data from Twitter to Power BI:

Go to flow.microsoft.com –> My Flows –> Create from blank.

Enter your flow name and, on the screen below, click on the Twitter category.

Then select New step –> Add an action –> search for “Sentiment” and select “Text Analytics – Detect Sentiment”.

For example, select Tweet text as the text to be analysed by Text Analytics – Detect Sentiment.

In the last step we are going to push the data into the Power BI streaming dataset.

Click on New step –> Add an action –> select Power BI –> then click on “Power BI – Add rows to a dataset”.

Then select your workspace, dataset, and table

Then map the following fields:

Time: Created at

Tweet: Tweet text

Sentiment: Score

Then click Create flow.

Now the data is inserted directly into your streaming dataset.

Let’s see how to use that in a dashboard:

Go to Power BI, open a dashboard, then click on Add tile.

Select “Custom Streaming Data” –> select your streaming dataset –> select your visual type (e.g. Line Chart) –> then add Time as Axis, Tweet as Legend, and Sentiment as Values.

Alternatively, you can build a report directly from your streaming dataset.

Go to the streaming dataset –> click on Create report.

Power Search

Inspired by Delta Master, a BI tool that we have worked with for many years and that combines advanced analytics with a great UI, I was looking to realise one of its cool features in Power BI. Delta Master is a brilliant tool but unfortunately carries an “enterprise” price tag. As we will see, we can get pretty close to one of its analysis methods using Power BI.

The Delta Master feature is called “Power Search” and does a ranking across all dimensions and levels in a multi-dimensional cube (e.g. Analysis Services).

The user specifies a measure, e.g. “Revenue”, and the method returns a ranking across all dimensions, as shown here:

This is a great starting point to see what is “driving” your revenue. In this case the first two results are trivial, as we only operate in this market and with one company, but from rank 2 onwards it gets more interesting: we see that a customer group has the biggest share of revenue with 80%, followed by a product group classification with 78%, etc.

A ranking like this is also particularly interesting on a measure like actual-to-budget variance, as it avoids having to navigate through multiple dimensions to find where the variance is coming from. Using the Power Search approach I get the key factors immediately.

Taking this approach further, Power Search also allows you to include attribute combinations in the ranking, which is when it gets even more interesting:

To realise something similar in Power BI, you just add a Table visual to your report and drag in the attributes that you want to use in the ranking:

Power Search

 

And voilà, we see an overview of what is driving revenue across all relevant attributes. To calculate the share of the total, just use this simple DAX calculation:

Share =
CALCULATE (
    SUM ( Xero_Sales_Data[LineAmount] )
        / CALCULATE ( SUM ( Xero_Sales_Data[LineAmount] ), ALLSELECTED () )
)

OPTION PRICING AND GREEKS – JEDOX

Hi,

In this post, I would like to present “Option Pricing and Greeks” in Jedox.

Continuing on from my previous post, I would like to highlight the potential of Jedox in terms of number-crunching capacity: I have designed a macro for the option pricing model most commonly used by front desk teams in investment banks.

For background on option derivatives used in the financial industry and the measures (the Greeks) incorporated in this model, see the references below:

http://www.investopedia.com/university/options/

http://www.investopedia.com/university/option-greeks/

http://en.wikipedia.org/wiki/Option_(finance)

http://en.wikipedia.org/wiki/Greeks_(finance)

This Jedox model computes option prices and their Greeks based on user inputs, and combined with the other benefits of Jedox the possibilities are enormous.

Code:

[code language="php"]
function dOne($S, $X, $T, $r, $v, $d){
    // d1 = (ln(S/X) + (r - d + v^2/2) * T) / (v * sqrt(T))
    $n1 = application()->Log($S / $X);
    $n2 = ($r - $d + 0.5 * pow($v, 2)) * $T;
    $d1 = $v * application()->sqrt($T);
    return ($n1 + $n2) / $d1;
}

function NdOne($S, $X, $T, $r, $v, $d){
    // Standard normal density at d1
    $nd = dOne($S, $X, $T, $r, $v, $d);
    $n1 = -1 * (pow($nd, 2) / 2);
    $d1 = application()->sqrt(2 * application()->Pi());
    return application()->Exp($n1) / $d1;
}

function dTwo($S, $X, $T, $r, $v, $d){
    return dOne($S, $X, $T, $r, $v, $d) - $v * application()->sqrt($T);
}

function NdTwo($S, $X, $T, $r, $v, $d){
    return application()->NormSDist(dTwo($S, $X, $T, $r, $v, $d));
}

function OptionPrice($OptionType, $S, $X, $T, $r, $v, $d){
    if($OptionType == 'C'){
        // Call: S * e^(-dT) * N(d1) - X * e^(-rT) * N(d2)
        $value = application()->Exp(-$d * $T) * $S * application()->NormSDist(dOne($S, $X, $T, $r, $v, $d))
            - $X * application()->Exp(-$r * $T) * application()->NormSDist(dOne($S, $X, $T, $r, $v, $d) - $v * application()->sqrt($T));
    } elseif($OptionType == 'P'){
        // Put: X * e^(-rT) * N(-d2) - S * e^(-dT) * N(-d1)
        $m1 = application()->Exp(-1 * $r * $T);
        $m2 = application()->NormSDist(-1 * dTwo($S, $X, $T, $r, $v, $d));
        $m3 = application()->Exp(-1 * $d * $T);
        $m4 = application()->NormSDist(-1 * dOne($S, $X, $T, $r, $v, $d));
        $s1 = $X * $m1 * $m2;
        $s2 = $S * $m3 * $m4;
        $value = $s1 - $s2;
    }
    return $value;
}

function OptionDelta($OptionType, $S, $X, $T, $r, $v, $d){
    if($OptionType == 'C'){
        $value = application()->NormSDist(dOne($S, $X, $T, $r, $v, $d));
    } elseif($OptionType == 'P'){
        $value = application()->NormSDist(dOne($S, $X, $T, $r, $v, $d)) - 1;
    }
    return $value;
}

function Gamma($S, $X, $T, $r, $v, $d){
    return NdOne($S, $X, $T, $r, $v, $d) / ($S * ($v * application()->sqrt($T)));
}

function Vega($S, $X, $T, $r, $v, $d){
    // Price change for a one percentage point move in volatility
    return 0.01 * $S * application()->sqrt($T) * NdOne($S, $X, $T, $r, $v, $d);
}

function OptionRho($OptionType, $S, $X, $T, $r, $v, $d){
    if($OptionType == 'C'){
        $value = 0.01 * $X * $T * application()->Exp(-1 * $r * $T) * application()->NormSDist(dTwo($S, $X, $T, $r, $v, $d));
    } elseif($OptionType == 'P'){
        $value = -0.01 * $X * $T * application()->Exp(-1 * $r * $T) * (1 - application()->NormSDist(dTwo($S, $X, $T, $r, $v, $d)));
    }
    return $value;
}

function OptionTheta($OptionType, $S, $X, $T, $r, $v, $d){
    if($OptionType == 'C'){
        $value = -(($S * $v * NdOne($S, $X, $T, $r, $v, $d)) / (2 * application()->sqrt($T)) + $r * $X * application()->Exp(-1 * $r * $T) * NdTwo($S, $X, $T, $r, $v, $d)) / 365;
    } elseif($OptionType == 'P'){
        $value = -(($S * $v * NdOne($S, $X, $T, $r, $v, $d)) / (2 * application()->sqrt($T)) - $r * $X * application()->Exp(-1 * $r * $T) * (1 - NdTwo($S, $X, $T, $r, $v, $d))) / 365;
    }
    return $value;
}
[/code]
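As a quick sanity check, the functions above can be called from a Jedox macro like this; the inputs are illustrative (spot 100, strike 100, one year to expiry, 5% rate, 20% volatility, no dividend yield):

[code language="php"]
// Illustrative inputs: S = 100, X = 100, T = 1 year, r = 5%, v = 20%, d = 0
$call  = OptionPrice('C', 100, 100, 1, 0.05, 0.20, 0);   // approx. 10.45
$put   = OptionPrice('P', 100, 100, 1, 0.05, 0.20, 0);   // approx. 5.57
$delta = OptionDelta('C', 100, 100, 1, 0.05, 0.20, 0);   // approx. 0.64
[/code]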

SCREEN SHOT ON WEB VIA JAVASCRIPT AND SAVING BACK TO SERVER

Screenshots, taken with the Windows keyboard’s “Print Screen” key, have been and still are utilised heavily to report errors occurring on the client side of an application. When clients do not capture the screen state at the time of an error, it becomes difficult to reproduce the issue on the development side in order to resolve it.

To simplify this situation, I thought it would be great if the screenshot could be captured at the time the error occurs (in the error handling “catch” block), sent across to the server, and saved as an image, without the client going through the pain of capturing, saving, attaching and sending it via portal tickets or email.

Below is a small POC demonstrating this with the help of a JavaScript library and some custom PHP code.

The functionality illustrated above allows you to capture a screenshot of the current spreadsheet in Jedox (though the widget area will remain blank, as the JavaScript library excludes iframe HTML components from the rendered screen capture). The technique is not restricted to Jedox: it can be part of any large-scale custom HTML application.

One caveat about this interesting technique: it can be misused to target confidential client-side information, so please beware! (Mischievous intent people, shoo away!)

The process illustrated above has two steps: first, a screenshot of the target area is captured via “html2canvas.js”, converted, and appended to the document body as an HTML “canvas” element; second, that canvas is picked up by the next JavaScript block and sent to the server as raw content, where it is saved as a “PNG” file.

Sample Code:

HTML/Javascript:

[code language="html"]

<html>
<body>
<p>Screen Shot Widget</p>
<input type="button" value="Capture Screenshot" onclick="takeScreenShot();">
<input type="button" value="Post Screenshot" onclick="postImage();">
<script src="/pr/custom/html2canvas.js"></script>
<script type="text/javascript">
 // Render the parent document into a <canvas> appended to this widget
 function takeScreenShot(){
  html2canvas(window.parent.document.body, {
   onrendered: function(canvas) {
    // Remove the canvas from any previous capture first
    var cand = document.getElementsByTagName('canvas');
    if(cand[0] !== undefined && cand[0] !== null){
     document.body.removeChild(cand[0]);
    }
    document.body.appendChild(canvas);
   }
  });
 }

 // Post the captured canvas to the server as a base64 encoded PNG
 function postImage(){
  var cand = document.getElementsByTagName('canvas');
  var canvasData = cand[0].toDataURL("image/png");
  var ajax = new XMLHttpRequest();
  ajax.open("POST", '/pr/custom/testSave.php', false);
  ajax.setRequestHeader('Content-Type', 'application/upload');
  ajax.send(canvasData);
  alert('done');
 }

 takeScreenShot();

</script>

</body>

</html>
[/code]

PHP:

[code language="php"]

<?php
// Read the raw POST body; php://input is the portable way to do this
// ($GLOBALS["HTTP_RAW_POST_DATA"] was removed in PHP 7).
$imageData = file_get_contents('php://input');
if ($imageData !== false && $imageData !== '')
{
// Remove the header part ("data:image/png;base64,").
// A real application should use it, e.g. to check the image type.
$filteredData = substr($imageData, strpos($imageData, ",") + 1);

// Decode before saving, since the data received is base64 encoded
$unencodedData = base64_decode($filteredData);

// Save the file. This example uses a hard coded filename for testing,
// but a real application could take the filename from a POST variable.
$fp = fopen('test.png', 'wb');
fwrite($fp, $unencodedData);
fclose($fp);
}
?>

[/code]

CALENDAR HANDLING IN JEDOX: FORMULA DRIVEN DYNAMIC CALENDAR

In this post I share a trick for laying out a dynamic, formula-driven calendar in a Jedox spreadsheet: based on your input month and year values, the laid-out monthly calendar rearranges itself according to the weekdays.

The calendar is driven by another sheet in the .wss file, where formulas are laid out for the monthly template to look up according to the weekday indexes.

The inspiration to build this calendar came partially from a project requirement, along with the following blog posts laying out the same for Excel:

http://chandoo.org/wp/2011/12/27/download-free-2012-calendar/

http://www.excelhero.com/blog/2010/01/live-calendar-musings.html
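The core of the trick is simple arithmetic: take the weekday index of the first day of the input month and offset the day numbers in the monthly grid accordingly. A minimal sketch of that logic, in PHP here purely to illustrate what the spreadsheet formulas compute:

[code language="php"]
// Illustrative: lay out a month as a Monday-first 6x7 calendar grid.
function calendarGrid($year, $month){
    $first  = new DateTime(sprintf('%04d-%02d-01', $year, $month));
    $offset = (int)$first->format('N') - 1;   // 0 when the 1st falls on a Monday
    $days   = (int)$first->format('t');       // number of days in the month
    $grid   = [];
    for ($week = 0; $week < 6; $week++) {
        for ($dow = 0; $dow < 7; $dow++) {
            $day = $week * 7 + $dow - $offset + 1;   // grid cell -> day number
            $grid[$week][$dow] = ($day >= 1 && $day <= $days) ? $day : '';
        }
    }
    return $grid;
}
[/code]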

This is also a clear demonstration of the power of Jedox in bringing Excel to the web, in its complete essence.

Download the attached .wss file to review for yourself, and feel free to ask me any questions about it.

Enjoy !!!

USING PROFESSIONAL DEVELOPMENT TOOLS WITH JEDOX

In this post I would like to share a few challenges I faced when embarking on a large project powered by Jedox, which involved writing several thousand lines of macro code:

Challenges:

1. Editor simplicity: As a BI product, Jedox is geared more towards providing business users with the functionality they require than towards helping developers write better code. No problem; they do best at what they deliver, and I have worked out a way to leverage our old friends among developer tools, such as Eclipse, for PHP (macro) development.

2. Code re-use: This was a major roadblock for me until I figured out a way to build complete, application-style solutions for projects delivered with Jedox. When you write a macro it is encapsulated in its own shell, so functionality written once and used across multiple spreadsheets is hard to achieve in a native Jedox installation, unless you expose individual pieces of functionality by tweaking the “macro_engine_config.xml” file as mentioned in the admin manual.

Solution:

The magic bullet that resolves the issues mentioned above (and more you may face), and which helped me big time to write better code, is as follows:

1. Create a folder “shared_macro” (or any name you like) which will be the entry point for your macro-based application. Nested subfolders and PHP files representing macros (comprising your application) can all be created and referenced relatively beneath this folder. This is the folder that will be configured in Jedox as the entry point for your custom code.

C:\Program Files (x86)\Jedox\Jedox Suite\httpd\app\shared_macro

2. Edit “macro_engine_config.xml” with the required entries:

C:\Program Files (x86)\Jedox\Jedox Suite\core\macro_engine_config.xml

Add the entries to the above configuration file, then restart the “JedoxSuiteCoreService” service to reload the configuration.

That’s it: Jedox can now peek into this folder and utilise any PHP files as macros. Here is how to use it:

– Create a file named “dummy.php” inside the folder created in step 1 above, with code content along the following lines:
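A minimal sketch of what such a file can contain (the function name is just an example):

[code language="php"]
// shared_macro/dummy.php: shared functionality, written once,
// usable from macros in any spreadsheet.
function hello_shared_macro()
{
    return 'Hello from the shared macro folder!';
}
[/code]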

– Reference the file from any spreadsheet macro and call the function it contains, as follows:
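As a sketch, assuming the folder above is on the macro engine’s include path and the Jedox macro API’s activesheet() accessor is available, a button macro in a spreadsheet could look like this (the cell reference and function names are illustrative):

[code language="php"]
// Spreadsheet macro: pull in the shared file and call its function.
require_once 'dummy.php';

function btn_hello()
{
    // Write the shared function's result into a cell of the active sheet
    activesheet()->range('A1')->value = hello_shared_macro();
}
[/code]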
And finally, when you hit the button, you get the desired output:

Now that we have configured Jedox to peek into and utilise files from our custom folder, how do we edit, amend and build a comprehensive PHP-project-style application via Eclipse?

That’s easy enough to achieve as well:

First, download the Eclipse package and install the necessary software as illustrated.

Then create a PHP project and add the “shared_macro” directory created above to the new project as follows:

(Ignore the name; it can be anything. As I already had the project created previously, for this illustration I had to suffix the name with “.1”.)

Once the project is successfully created, attach the “shared_macro” folder as follows:

And that’s it: your project is now fully configured in Eclipse to rock-n-roll out some comprehensive macros, with all the benefits of code re-use, IntelliSense and more available to developers.
Hope the above steps help you go into overdrive with Jedox…

SUCCESS FACTORS FOR PLANNING AND BUDGETING SOLUTIONS

Key Criteria For Effective Processes

Planning is typically a dreaded activity, often involving inefficient tangles of spreadsheets and related maintenance nightmares. Managility specialises in implementing effective solutions that run largely automated, responsive processes using the technologies best suited to our clients’ needs.

Below we have compiled ten success factors that have been tested and successfully applied over fifteen years in hundreds of client projects across the globe:

1. Push Button Setup: On-Premise or in the Cloud

The core framework of a planning solution needs to be available for an immediate start, without complex installations or special hardware requirements.

We have had great experiences with preconfigured virtual machines tailored to the client’s specific needs (e.g. with regard to the required platform components). These are easy to deploy in the internal network or to host securely as a cloud-based solution in the client’s preferred location and jurisdiction.

With the Managility Fast Start program, we guarantee an initial version of an effective planning process that incorporates your requirements based on this approach.

2. Simple Integration of Source System

A key component of any planning system is the integration of source systems to obtain comparison data, structures and hierarchies. The chosen solution should offer a broad selection of standard connectors: native connectors to the relevant sources (e.g. ERP systems like SAP, Microsoft Dynamics, etc.), relational databases, and at a minimum flexible text import functionality. The process should preferably empower business users directly, providing them with options to implement changes or new models using simple drag-and-drop interfaces.

3. Flexible Modelling and Maintenance

In our experience of implementing planning systems for more than 15 years, one of the key success factors is the ability for business users to adapt structures to changing needs and extend existing business models independently of the source systems used (conceptual modelling).

An effective solution needs to make it easy to simulate, for example, the introduction of a new product in an existing dimension, or the acquisition of a company in the organisational hierarchy, without reliance on external or specialised IT resources.

4. Comprehensive Security and Compliance

It’s no surprise that security and compliance are an absolute must for planning systems. Managility focuses on four aspects:
a. Security of the infrastructure: through encryption of all data transfers and state-of-the-art data storage platforms.
b. Integration with existing authentication standards like Active Directory and LDAP, avoiding the need to manage users separately in the planning system.
c. Safety mechanisms around budgeting system data for specific users. The key criterion here is cell-based security (i.e., rights can be precisely defined on any combination; for example, User A is allowed to modify data for only one product group in a specific planning scenario) that applies independently of whatever front end is used, e.g. web browser, Excel or mobile.
d. Functional role-based access: for every functional object in the system it should be possible to define specific access rights, e.g. whether a user is able to write global calculations, has access to special analytical methods, etc.

5. Top-down, Bottom-up

An effective planning process needs to combine easy data collection at a detailed level with simple top-down changes and flexible allocation mechanisms. This should preferably happen in real time: when the user enters a figure at a detailed level, they can immediately view the effects at the aggregate level as well (and vice versa, where an adaptation at the top level immediately reflects down to its children).
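To illustrate the mechanics, here is a minimal sketch of proportional top-down spreading, a common allocation rule (purely illustrative PHP; planning platforms implement this kind of logic natively):

[code language="php"]
// Spread a new aggregate total over detail cells in proportion
// to their current values (a common top-down allocation rule).
function spreadProportionally(array $details, $newTotal){
    $oldTotal = array_sum($details);
    $result = [];
    foreach ($details as $key => $value) {
        $result[$key] = ($oldTotal != 0)
            ? $newTotal * $value / $oldTotal
            : $newTotal / count($details);   // spread evenly if all details are zero
    }
    return $result;
}

// Entering 1200 at the aggregate level immediately updates the children:
// ['ProductA' => 600, 'ProductB' => 400, 'ProductC' => 200]
$children = spreadProportionally(
    ['ProductA' => 300, 'ProductB' => 200, 'ProductC' => 100],
    1200
);
[/code]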

6. Best Practice Planning Logic Library

Avoid re-inventing the wheel and ensure that your provider of planning software has comprehensive experience and preferably already implemented a similar requirement.
At Managility we developed an extensive library of business process methodologies that effectively handle calculation and workflow requirements for specific aspects. These cover a wide array of subjects, from complex HR planning, CAPEX planning, and sales budgeting processes to strategic driver-based budgeting approaches.

7. Horses for Courses: Interchangeable Front Ends (Web, Excel, Mobile)

In typical planning scenarios, user requirements vary dramatically, from simple completion of data entry forms to complex ad-hoc analysis and modelling. We recommend considering solutions that run equally well in web browsers, offer bi-directional Excel integration where users can read/write and directly interact and model with data (as opposed to just an offline data dump into spreadsheets), and provide specialised mobile applications.

8. Tailored Workflows

More complex budgeting processes require the easy management of workflow scenarios within the process. Our recommendation is to use platforms that enable flexible and simple setup, as well as execution of workflows (e.g., for approval processes, alerts and exception handling).

9. Extensibility

Despite the wide range of existing functionality in the various platforms, features are often required that are not part of the offering. We recommend working with platforms that are extensible: that enable the integration of third-party web services like Google Maps or online FX rates, and that offer customisation of existing features via API access or macros.

10. Support for Advanced Predictive Analytics (e.g. R-Libraries)

The integration of advanced statistical and data mining functions has become a critical feature of planning and budgeting systems. Your chosen solution should offer the integration of the respective calculation logic, ideally through standards like R libraries (e.g., the easy application of different forecasting methods, or the integration of choice modelling metrics for the prediction of customer behaviour).

REALISING POSTAL CODE SHAPES IN GOOGLE MAPS IN JEDOX WEB

Hi, I hope you enjoy my first post. I am a Senior Consultant at Managility specialising in ETL, report development and custom visualisations.

Recently I faced an interesting challenge at a Managility client who was looking to visualise sales employee performance within specific Australian postal code areas.

A quick and free option is to use Google Maps techniques integrated in a Jedox Web widget to enable some very interesting insights.

To start off, the end result we are aiming for is depicted in the screenshot below:

The solution that I worked out is composed of the following components:

1. Google Geocoding API: This API is used primarily to place the marker on the map, which requires the latitude and longitude coordinates corresponding to the Australian postal code; it translates the postal code value into the coordinates required for the map marker (a sketch of this lookup follows below the link).

Google Geocoding API
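As an illustration of that lookup (server-side PHP rather than the widget’s JavaScript; the API key is a placeholder):

[code language="php"]
<?php
// Illustrative: resolve an Australian postcode to lat/lng coordinates
// via the Google Geocoding API (the key is a placeholder).
$postcode = '2008';
$query = http_build_query([
    'components' => 'postal_code:' . $postcode . '|country:AU',
    'key'        => 'YOUR_API_KEY',
]);
$response = json_decode(
    file_get_contents('https://maps.googleapis.com/maps/api/geocode/json?' . $query),
    true
);
if ($response['status'] === 'OK') {
    $location = $response['results'][0]['geometry']['location'];
    echo $location['lat'] . ', ' . $location['lng'];   // coordinates for the map marker
}
[/code]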

2. Google Charts API: This API is used to plot the salesperson performance data in the Google Maps marker info window. In the sample I have rendered a column chart, but the API offers a wide variety of chart types, with fine-grained tweaks and settings.

Google Charts API

3. Google Maps (Marker/Info window): These API features are used to place the marker and the associated info window (containing the embedded chart).

Google Map Markers

Google Map Info windows

4. Postal code area highlight: Now here’s the tricky bit, which took me a while to figure out. Nowadays Google itself offers a highlight mechanism for postal code areas, as illustrated below:

In the above, note the red dotted line mapping the postal code boundary.

Google Map Default Postal Code Boundary (AU, NSW 2008)

My first thought was to leverage what Google provides by default, and I tried using it in conjunction with the rest of my task: setting the marker with an info window and embedding the chart inside it. But scanning the entire Google Maps API, I was unable to figure out a way to combine all the pieces of the puzzle.

Googling for examples, I stumbled upon the following link, which partially answered my problem (and was very close to my requirements):

www.aus-emaps.com

Now the only thing left was to marry it all together…

Not to elongate the mystery: the solution I found uses KML (Keyhole Markup Language) files, custom directive files for shapes, in conjunction with Google Maps to draw custom shapes on the map.

Keyhole Markup Language (KML Wiki)

Google Maps KML Layer API

So now I was able to draw shapes on Google Maps, but I had no plans to write or draw these KML files myself for the entire Australian continent, which landed me on the following Australian Bureau of Statistics page:

Postal Areas (POA) 2006 Digital Boundaries in ESRI Shapefile Format

The next challenge was how to convert the SHP file obtained from the site into the KML file required by Google Maps.

The problem was answered by the following open source software, which converts the shapefile to KML and also shows how the shapefile lays out on the map:

QGIS, A Free and Open Source Geographic Information System

When I opened the newly generated KML file I faced my next task: it comprised the bulk postcode data all in one file. It needed to be split into individual postcode areas, so that on demand the KML for the respective postcode area could be picked up and rendered on the map.

That’s where my Excel/VBA skills came into play, splitting the bulk KML file into individual postal code files. Finally I had everything I needed to bring all the pieces of the puzzle into one place and render my final solution.

There was just one last snag: my KML files didn’t work. I could see the map, with the marker and the info window containing the required chart, but no postal area was highlighted.

After a little hair pulling, I resolved the issue by noticing that the KML files passed to Google for rendering require parsing on Google’s side: Google cannot pick up files from a localhost URL and needs a publicly hosted URL to fetch, parse and render the relevant KML file on the map. To overcome this last glitch, the following JavaScript library was required:

KML processor for the Google Maps JavaScript API V3

And after using the above library… voilà… I ended up with the result I wanted.

Please contact us for further information or help with your Jedox Business Intelligence requirements at info@managility.com.au.