Guest interview in Power BI Developer Blog and a new Send To Power BI app for Excel

Be sure not to miss the latest post on the official Power BI Developer Blog :) The amazing Power BI team asked us to share some thoughts on our latest work & research with the new Power BI APIs and capabilities, and we were deeply honored to contribute. How cool is that? :)

Read the full post here.

And to celebrate the day :) we're also announcing that, even though you can already use SmartCharts to push & sync data to the new Power BI Preview directly from Excel, you now have a dedicated app just for that :) a brand new Send To Power BI Office App for Excel.

Give it a test drive and leave us your feedback or a review. :) It will be greatly appreciated!

Download the new app from the App store.

For more info:

DevScope GitHub repository
Send To Power BI Excel App
SmartCharts Excel App
DevScope Office Apps
Rui Quintino Blog
Rui Romano Blog
www.devscope.net

Awesome day! :)

Rui

More Power BI API awesomeness, now on PowerShell & SmartCharts

An amazing week for data lovers & business intelligence minds: the Power BI Preview is now *globally* available! Be sure to check out the public announcement for details, and Chris Webb's BI blog for additional insights.

By now you know that there's a new & strong developer story for the new Power BI, and even if at the moment we only have a simple API to test drive, it already allows us to think of interesting scenarios, particularly real-time dashboards or the click-to-insights pattern I happen to obsess so much about…

Really, it seems there will be no place for data to hide from Power BI :) When Power BI can't get there… we will bring the data to Power BI in the blink of an eye, in real time if needed, all using the brand new APIs.

Don't have Power BI Designer? Or even Excel? And need to quickly load some data to explore, in real time? Check out the DevScope GitHub & use the awesome Power BI API PowerShell modules to load your data into Power BI directly from the command line (for more info & a cool real-time sample, don't miss Rui Romano's blog post).
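If you'd rather see the raw pattern behind those modules, here's a minimal PowerShell sketch against the preview REST endpoints as documented at the time of writing. It assumes $accessToken already holds a valid OAuth bearer token, and the dataset/table/column names are just illustrative placeholders (not the DevScope module's cmdlets):

# Minimal sketch: create a push dataset and add rows via the Power BI beta API.
# Assumes $accessToken holds a valid OAuth bearer token; all names are placeholders.
$headers = @{ Authorization = "Bearer $accessToken" }

# 1) create a push dataset with a single table
$dataset = @{
    name   = "SalesDemo"
    tables = @(@{
        name    = "Sales"
        columns = @(@{ name = "Product"; dataType = "string" },
                    @{ name = "Amount";  dataType = "Int64" })
    })
} | ConvertTo-Json -Depth 5

$created = Invoke-RestMethod -Method Post -Uri "https://api.powerbi.com/beta/myorg/datasets" `
    -Headers $headers -ContentType "application/json" -Body $dataset

# 2) push rows into the new table (this is what enables the real-time scenarios)
$rows = @{ rows = @(@{ Product = "Bikes"; Amount = 42 }) } | ConvertTo-Json -Depth 5
Invoke-RestMethod -Method Post `
    -Uri "https://api.powerbi.com/beta/myorg/datasets/$($created.id)/tables/Sales/rows" `
    -Headers $headers -ContentType "application/json" -Body $rows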

Not forgetting that we already have NuGet Power BI API modules for .NET, Xamarin, iOS & Android by Sylvain Pontoreau, or, more recently, new sample web apps for Power BI API scenarios.

Adding to that, as of… now :) you can also use SmartCharts to quickly send & live sync data to Power BI directly from Excel. :)

Start with your data as usual, or with the sample dataset, and just click Send To Power BI.

Authenticate & authorize the app for Power BI access, then choose a dataset and a table name,

and your data will be available in your Power BI account, ready for more Power BI awesomeness.

(Note: if you use date/time columns, we also add some extra time-related columns that we happen to use in the app.)

Your data is then ready to explore in Power BI.

Note that you should be able to do all this in trial mode without any restrictions. Got feedback or requests? Let us know (there's a feedback form in the app :) and we will get back to you).

Get SmartCharts here, or check out our other DevScope Labs Office Apps here.

If you want to know more about the new APIs, some additional recommended reading:

More to come :) stay tuned

Rui

AzureML Web Service Scoring with Excel and Power Query

While drafting some samples & labs for this Saturday's SQLPort AzureML workshop, I decided to dig in a little and try to use a regular AzureML scoring web service (request/response) directly in Excel, using Power Query. :)

So… can it be done? Sure! Advisable? Far, far from it… but for a quick sample/test scenario it might be handy. :)

Saying it again: what I'll show here is far from recommended web service usage for AzureML in Excel. It will be slow, and probably insecure if you need to distribute your file, and I'm sure there will soon be better ways to do it with the batch web service mode, I hope (without the need to pass through the Azure Marketplace).

Side notes on how it works:

To the point: you can use the Power Query Web.Contents function to invoke the AzureML web service API, as long as you are careful about 1) using POST and not GET, 2) setting the authorization header expected by AzureML, 3) setting the content type to JSON, and 4) building the JSON request as shown in the API help page.

Power Query will handle the JSON response automatically. We encapsulate the web request in a function; we can then use it to build our scored table, passing the score input column values and getting the score in return.
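By the way, you can sanity-check the very same POST outside Excel first. A minimal PowerShell sketch, assuming your own endpoint URI & API key (the feature names match the Iris model used below):

# Minimal sketch: invoke an AzureML request/response scoring endpoint.
# $serviceUri & $serviceKey are placeholders for your own endpoint & API key.
$serviceUri = "*yourserviceuri*"   # note: ends with /score
$serviceKey = "yourservicekey"

# same JSON shape the API help page shows for this model
$body = @{
    Id = "score00001"
    Instance = @{
        FeatureVector = @{
            sepal_length = "5.1"; sepal_width = "3.5"
            petal_length = "1.4"; petal_width = "0.2"
        }
        GlobalParameters = @{}
    }
} | ConvertTo-Json -Depth 5

Invoke-RestMethod -Method Post -Uri $serviceUri -Body $body `
    -Headers @{ Authorization = "Bearer $serviceKey" } `
    -ContentType "application/json; charset=utf-8"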

Step by Step: From AzureML to Excel Power Query Scoring

You'll need an AzureML scoring web service. I used a simple two-class logistic regression model to classify the well-known Fisher Iris dataset (with only two classes, as the original has three).

The AzureML experiment & the web service experiment: it can't get much simpler than this.

Training Experiment

Web Service Experiment

Publishing the trained model as a Web Service

AzureML-Iris Classifier Web Service API Help page

This JSON request format will be needed for Power Query, as will the service key.

In Power Query you'll then create a function like this (full text version below).

Invoking the Power Query scoring function

Then you just import the original table into Power Query and add a custom column invoking the new scoring function.

Note: to test this it is advisable to filter to the top 1-2 rows first; it is not lightning fast… :)

The result should now be visible; a little more Power Query expanding and renaming, and that's it!

To check that I'm getting the expected score, I just used another small tool (more on this later).

Power Query results

Scoring the whole 100-row Iris dataset

The problem with using the regular Power Query Web.Contents function this way is that you will start receiving API usage threshold errors from the AzureML API very, very soon. So I used another trick, inspired (again) by this post from Chris Webb (an amazing tip by Alejandro Lopez-Lago), to add some delay between requests.

Just remove the top filter and refresh the query, then wait between 30-60 seconds (this will depend on the kind of endpoint you are using; AzureML is changing this whole staging/production web service model right now, from what I'm seeing).
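Outside Power Query, by the way, the same throttling workaround is just a plain sleep between requests; a quick PowerShell sketch, where $rows is your input data and Invoke-IrisScore is a hypothetical wrapper around the Invoke-RestMethod call shown earlier:

# Sleep between scoring requests to stay under the AzureML usage threshold.
# $rows and Invoke-IrisScore are hypothetical placeholders (see the sketch above).
foreach ($row in $rows) {
    Start-Sleep -Milliseconds 500   # delay between requests
    Invoke-IrisScore $row
}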

Note: I think the latest Power Query update now shows the row count progress. Neat! Perfect for this. :)

Some misses there :) I guess the model needs some additional training!

Download the sample here (note: you will need your own AzureML endpoint & key; you do know that AzureML has a free tier, right? :) ). The full Power Query score function for this model is below.

Take care!

Rui

let

    Score = (sepal_length as any, sepal_width, petal_length, petal_width) =>

let

    serviceUri = "*yourserviceuri_note:ends with score*",
    serviceKey = "yourservicekey",

    // JSON request body, built exactly as shown in the API help page
    PostContents = "

{
  ""Id"": ""score00001"",
  ""Instance"": {
    ""FeatureVector"": {
      ""sepal_length"": """&sepal_length&""",
      ""sepal_width"": """&sepal_width&""",
      ""petal_length"": """&petal_length&""",
      ""petal_width"": """&petal_width&"""
    },
    ""GlobalParameters"": {}
  }
}
",
    // Wait function (the Chris Webb / Alejandro Lopez-Lago trick):
    // spin until the given number of seconds has elapsed, then invoke action
    Wait = (seconds as number, action as function) =>
            if (List.Count(
             List.Generate(
              () => DateTimeZone.LocalNow() + #duration(0,0,0,seconds),
              (x) => DateTimeZone.LocalNow() < x,
              (x) => x)
              ) = 0)
               then null else action(),
    // wait 0.05 seconds between requests, then POST to the scoring service
    Source = Wait(0.05, () => Web.Contents(serviceUri,
        [Content = Text.ToBinary(PostContents),
         Headers = [Authorization = "Bearer " & serviceKey,
                    #"Content-Type" = "application/json; charset=utf-8"]])),
    #"Imported JSON" = Json.Document(Source),
    #"Table from List" = Table.FromList(#"Imported JSON", Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    #"Transposed Table" = Table.Transpose(#"Table from List")
in
    #"Transposed Table"

in Score

Hot July 2013 for #msbi content – Selected posts & news on #ssas #powerbi #powerview

I don’t remember the last post I wrote just to gather/curate content on #msbi, but these last weeks/days have been so full of good #msbi content, I really can’t resist! …and we still have two days left! :)

Amazing content, read for yourself (anything missing? just say so, comment below!):

MSFT Blogs & all about Power BI

“Pure Power” :)

Don't miss the amazing Power BI demo by Amir Netz: Power BI Demo

Power BI Blog

Worldwide Demographics & Statistics using Power Query and Power View by Faisal Mohamood [MSFT]

Office Blogs

Announcing Power BI for Office 365

Microsoft Business Intelligence

Q&A Feature of Power BI

MSFT research

DO YOU KNOW WHERE YOU’LL BE 285 DAYS FROM NOW AT 2 P.M.? THESE DATA-MASTERS DO

Kasper de Jonge PowerPivot Blog

Power BI

 

#msbi community

Chris Webb’s BI Blog

New MDX Divide() Function

Point-In-Time Dimension Reporting In DAX

Defining DAX Measures In The With Clause Of An MDX Query

Some Thoughts About Power BI

 

Jason Thomas Blog

Drill-down Behaviour in Power View

Power BI for Office 365

 

Gerhard Brueckl’s BI Blog

DAX vs. MDX: DataMembers in Parent-Child Hierarchies

 

Blue Granite Blog

Is Microsoft Power BI a Game Changer?

 

CSS SQL Server Engineers

Tracking down Power View Performance Problems by Adam W. Saxton

 

LessThanDot

Rants about Connect by Koen Verbeeck

 

From the ORAYLIS blog:

Round function in MDX by Hilmar Buchta

PDW: Cube processing experience by Michael Mukovskiy

Considerations for the distribution key for distributed tables on PDW (part 2) by Hilmar Buchta

Considerations for the distribution key for distributed tables on PDW (part 1) by Hilmar Buchta

Caution with SQL Server 2012 SP1 CU4, Bug with nested multi-valued parameter by Stefan Grigat

 

Paul Turley’s SQL Server BI Blog

DAX: Returning the Nth Selected Value in a Slicer

We’ve Got The Power: “Power BI”, New Microsoft BI Suite Announced

 

http://redphoenix.me/

Excluding Analysis Services files from Anti Virus scanning

 

Jason’s Technical Topics

Cache Warming on an Analysis Services Tabular server

 

Byobi

SSRS + MDX: Dynamic Dimension Attribute on Row Axis

Why You Need to Routinely ProcessFull

 

Other selected news & content:

Stephen Few

A new edition of Information Dashboard Design

http://www.idgconnect.com/

Portugal: An Unlikely Business Intelligence Hotspot? :)

SmartPivot new build – now with Saved/Quick Pivot filters and Filter by List improvements

Fresh new build for SmartPivot users, download here as usual. This version features:

  • Improvements to the Filter by List feature (interestingly, one of the features most mentioned by users)
  • And (finally!) a new Saved/Quick Filters feature. :)

New Quick/Saved Filters feature

This feature allows saving filter selections on a pivot for later use on other pivots targeting identical hierarchies. The saved/quick filters are persisted locally in the user profile configuration, so you can always get back to them very quickly.

Ex: filter your pivot as usual, or through the SmartPivot fast search.

 

 

Now pick the Quick/Saved Filters option to save your filter.

 

 

Select your pivot field and the Save Current Filters option.

Now just create any other connection to the same cube, and use the quick/saved filters to apply the exact same selection as before.

(As with most features in SmartPivot, you can fast search saved filters.)

Double-click, and your previous selection is now filtering the new pivot.

Improvements in Filter By List

The Filter by List feature now works as a task pane (expect most SmartPivot dialogs to be migrated to task panes, as they allow for better usability).

Improvements in this release:

  • Available as an Excel Task Pane
  • Pick values from range
  • Filter by member names or member keys

Other improvements

Fixed an issue with table reports and orphaned calculated measures (undefined measure groups)

Wrapping up

Download SmartPivot through the product page here.

 

Take care,

Rui

Fastest way to get a #msbi VM with #sharepoint 2013 & #PowerView Multidimensional? Use online MS sales demo VMs

Do you need a clean SharePoint 2013 #msbi VM with Power View Multidimensional for demo/testing purposes? Don't want to (or can't) build one from scratch? And do you have access to Microsoft partner resources?

Online pre-prepared #msbi VMs & demos at http://fs.mssalesdemos.com/

Well, be sure to check the MS partner site http://fs.mssalesdemos.com/ for demo VMs. You'll find at least two fully loaded VMs for #msbi with SharePoint 2013 and plenty of prepared demo content & resources (Power View, PowerPivot, Excel).

These VMs work extremely well for demo or testing purposes; they start in minutes and you connect using Remote Desktop. Performance is very acceptable. And you can save them for later use! :)

Unfortunately, neither has CU4 applied, which is needed to run Power View over multidimensional cubes, but we can solve that very quickly! Just download & install CU4 (or the most recent CU5). :)

Download and install SQL 2012 SP1 CU4 for SSAS multidimensional support in SharePoint PowerView

I started up an instance of the "SQL Server 2012 BI Demo – Breakthrough Insight (March 2013)" VM, which already has SQL 2012 SP1.

Wait until the VM starts, then connect using the RDP link.

Just download the hotfix inside the VM (it should be very fast; the 770 MB download took only a few minutes in my case).

Apply SQL 2012 SP1 CU4 (I updated all instances just in case, which took a bit longer). PS: be sure to stop the World Wide Web Publishing Service first to avoid a restart, close Management Studio and similar apps, and restart the WWW service after CU4 finishes.
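If you like to script that housekeeping, a minimal PowerShell sketch (the CU setup line is commented out and its path & switches are just placeholders):

# Stop the World Wide Web Publishing Service so the CU doesn't force a restart
Stop-Service W3SVC

# Run the downloaded CU setup here (path & switches are placeholders;
# check the hotfix readme for the exact command line)
# & "C:\Temp\SQLServer2012-SP1-CU4-Setup.exe" /quiet

# Restart the WWW service once the CU has finished
Start-Service W3SVC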

And you’re done!

Create a “Semantic Model connection for PowerView” data source for your SSAS cubes

Now just create a new data source in SharePoint (e.g. http://intranet.contoso.com/Shared%20Documents/) and open it to start Power View. You even get some multidimensional models already installed (e.g. Contoso Retail).

Click the created data source and Power View should open, connected to the Contoso Retail cube.

Check the official guide to learn more: "Understanding Power View for Multidimensional Models".

& Save the demo VM for later use on mssalesdemos

And the best part :) is that you can then save the VM for later use (just don't trust that it will be there forever… they sometimes expire and disappear without warning…).

For this, use the "Stop and Save" option (prepare to wait 5-10 minutes while the VM is saved).

The new VM should then appear in your catalog. You can come back later and start from where you left off.

BTW, if you want to check other demo resources in the VM:

 

http://intranet.contoso.com/sites/bicenter

http://intranet.contoso.com/PowerPivot%20Gallery/Forms/Gallery.aspx

Take care!

Rui

New samples at @devscope #msbi codeplex – II: #PowerShell Dynamic Outliers with #SSAS mining models

Finally…! We updated our CodePlex project with an improved sample for this. :) BTW, see the first post in this thread for other PowerShell MSBI data-driven/dynamic samples & utilities.

II: #PowerShell Dynamic Outliers Monitoring with #SSAS mining models

The idea for this sample goes right back to the initial releases of the Excel Data Mining Add-in. Now that's a cool Excel add-in; in fact it's even more valuable today, and it's a little sad that data mining is somewhat forgotten in the MSBI stack :( (yes, it is…). The issue with SSAS data mining, and particularly this add-in, I think, is that it really was ahead of its time! Because now, in our always-changing, data-driven world, it is the time for such agile & self-service data mining/data intelligence scenarios.

Anyway, that add-in showed that it was possible to do 100% dynamic outlier detection based only on the given data/schema. Models would be generated at runtime for each dataset.

And so, finally, we had the time to take this concept and turn to PowerShell to make a lot of other data-driven scenarios scriptable & able to run completely automated (e.g. scheduled/intelligent alerts/trends/anomalies monitoring).

So, what if we could use this to monitor the several data streams we have available in our solutions? We already have & use several very agile & data-driven KPIs/reports/datasets… but those still usually involve fixed alert thresholds. What if we had something looking at these data streams every day that would tell us, "hey, here's something that's not usual, are you aware?" (i.e. something that would do what I usually do, so that I can go on doing several other things… like blogging and reading :) ).

And that's precisely what the sample script does:

  • Iterate over all the views in a predefined schema (I love this pattern for DBA/BI monitoring)
  • Execute them & get the datasets; we used SQL Server, but any DataTable will do
  • Pass the DataTables to the data mining module (like the Excel data mining add-in, it will dynamically create a temporary model for each table, do the outlier stuff plus some nasty hacks of our own, and get an outlier probability value for each row)
  • Then filter (in this scenario) for today's outliers, using a convention-styled approach: choosing the first date column available (BTW, a better approach would be to save the last execution time for each view and use that value as a date filter)
  • If there are outliers, send the report by mail (using the cmdlets I talked about in my last post)
  • If not, keep quiet!

And any time I want to monitor another dataset, I just create another view in that schema and the script will adapt accordingly… hence the data-driven mantra applies :) (a data+services mindset to explore in a future post…).

Like this:

A schema (mon) where I can put the data streams I want to monitor for outliers/anomalies.

As a sample I use a dataset of SQL Analysis Services processing logs/elapsed times, by object (measure group, partition, dimension,…).

I make sure there is a date column, to allow for proper date/time filtering after the outlier detection (one-month-old outliers aren't that interesting anyway).

 

Now imagine that I force something strange into that data stream, changing the start date so as to increase the processing time for that SSAS object (creating an anomaly)…

And I run the provided outlier sample script… :)

And… amazingly (remember that I did not configure any kind of warning thresholds…), my two "simulated" outliers are now highlighted in my mailbox.

How cool is that? :) The script doesn't have to know the data in any way; anything goes, and that's the beauty of it.

(Defending myself :) from possible comments by the MSBI team regarding this screenshot… I can only say that I promise to reassess my mobile stack after the "new" release of the MSBI stack for mobile… :) )

Be advised, though, that the script is just a sample; there are a (few) known "issues" at the moment:

  • Do not expect perfect results; we didn't spend much time on SSAS mining model tuning for the moment, and we are limited to the accuracy we can expect from Analysis Services mining models (it would be great to get some help/feedback from data mining experts!)
  • Some column types can cause the mining model/structure not to be created
  • We had to use a hack… and introduce an outlier probability of 2 when we were getting an outlier probability of 0… (cof, I know, we will have to dig deeper into the reason for this… surely we messed up somewhere)
  • Still missing is highlighting the column where the outlier is most probable; that's possible (the data mining add-in does this), but we didn't have time for it yet, sorry
  • Several other issues will appear, I'm sure… (I did say it's a sample, right? :) )

That's it for today. Browse/get the sample at http://devscopebisamples.codeplex.com!

Note that there’s a bundle download available with Excel Data Insights AddIn, Mail DataTable Reports & this outliers sample.

The “tiny” main script:

# for each view found in the monitoring schema
$viewName = $reportViewsSchema + "." + $_.Name

# execute the view and get the dataset
$data = Invoke-OLEDBCommand -connectionString $connStr -sql "select * from $viewName"

# convention-styled approach: pick the first datetime column for date filtering
$firstDateCol = ($data[0].Table.Columns | ? {$_.DataType -eq [datetime]} | select -First 1).ColumnName

# build a temporary SSAS mining model and get an outlier probability per row
$dmTable = Set-Outliers -dataTable $data[0].Table -connectionString $ssasconn

write-host "Evaluating OutlierProbability threshold..."
# use the 6th highest outlier probability as the highlight threshold
$x = $dmTable | sort-object OutlierProbability -descending | select -index 5 | select OutlierProbability

write-host "Highlighting..."
# keep only today's rows and flag the ones at or above the threshold
$dmTable = $dmTable | ? {($_.$firstDateCol -ge [System.DateTime]::Now.Date)} |
    select *, @{name="SysRowCssClass"; expression={if ($_.OutlierProbability -ge $x.OutlierProbability) { "highlight" } else { "" }}} |
    sort-object $firstDateCol -descending

$alertCount = ($dmTable | ? {$_.SysRowCssClass -like "highlight"}).Count
write-host "Today outliers: $alertCount"

# send the mail report only if there are outliers today
if ($alertCount -gt 0)
{
    $html = Out-Email -subject "Outliers in $viewName : $alertCount (today)" -inputObject $dmTable -to $mailto -from $mailfrom -smtpServer $mailsmtp
}

PS: don't tell my boss that PowerShell & data mining are doing our work now! ;) We'll figure out something else to do eventually…

Take care,

 

Rui