Fastest way to get a #msbi VM with #sharepoint 2013 & #PowerView Multidimensional? Use the online MS sales demo VMs

Do you need a clean SharePoint 2013 #msbi VM with PowerView multidimensional for demo/testing purposes? Don’t want to (or can’t) build one from scratch? And do you have access to Microsoft partner resources?

Online pre-prepared #msbi VMs & demos at

Well, be sure to check the MS partner site for demo VMs. You’ll find at least two fully loaded #msbi VMs with SharePoint 2013, plus plenty of prepared demo content & resources (PowerView, PowerPivot, Excel).

These VMs work extremely well for demo or testing purposes: they start in minutes and you connect using Remote Desktop. Performance is very acceptable. And you can save them for later use! :)

Unfortunately none has CU4 applied, which is needed to run PowerView over multidimensional cubes, but we can solve that very quickly! Just download & install CU4 (or the more recent CU5). :)

Download and install SQL 2012 SP1 CU4 for SSAS multidimensional support in SharePoint PowerView

I started up an instance of the “SQL Server 2012 BI Demo – Breakthrough Insight (March 2013)” VM. It already has SQL 2012 SP1.


Wait until the VM starts, then connect using the RDP link:


Then just download the hotfix inside the VM (should be very fast; the 770 MB download took only a few minutes in my case).


Apply SQL 2012 SP1 CU4 (I updated all instances just in case, which took a bit longer). (PS: be sure to stop the World Wide Web Publishing Service to avoid a restart, and close Management Studio and similar apps;

restart the service after CU4 finishes.)



And you’re done!

Create a “Semantic Model connection for PowerView” data source for your SSAS cubes

Now, just create a new data source in SharePoint and open it to start PowerView. You even get some multidimensional models already installed (e.g., Contoso Retail).



Click the created data source and PowerView should open connected to the Contoso Retail cube:

Check the official guide to know more: “Understanding Power View for Multidimensional Models”.


Save the demo VM for later use on mssalesdemos

And best of all :), you can then save the VM for later use (just don’t trust that it will be there forever… they sometimes expire and disappear without warning…).

For this, use the “Stop and Save” option (expect to wait 5–10 minutes while the VM is saved).



The new VM should then appear in your catalog. You can come back later and start from where you left off:


BTW, if you want to check other demo resources in the VM:


Take care!



Exploring saved #Kindle Highlights with #SQL2012 Semantic Search

Kindle Highlights

I have said it before: archiving book highlights is one of the best features available on Kindle. You highlight your books on Kindle or any Kindle app, and then you can review them online whenever you want. And with a little effort… you can (you should, actually; who knows what Amazon will decide tomorrow…) save them locally for personal reference.

Anyway, I was reading about the new SQL 2012 Semantic Search feature and thought it could be interesting to run the new semantic search over my Kindle highlights.

There’s an amazing sample by Roger Doherty on CodePlex called MySemanticSearch. I don’t know any other good app to explore (as in freely explore…) my highlights, so this seemed a very good start for exploring the new SQL 2012 semantic search.



Getting my Kindle Highlights

First of all, cranking some PowerShell against the Kindle site, I got my very own highlights for several of my Kindle books:




Loading Streams into SQL using SQL 2012 FileTable feature

Using FILESTREAM and the new SQL 2012 FileTable feature, I quickly loaded these into SQL:

-- Create the database with a FILESTREAM filegroup (required for FileTables)
CREATE DATABASE MySemanticSearch
ON PRIMARY (NAME = FileTableDB, FILENAME = 'D:\SQL2012\MSSQL11.SQL2012\MSSQL\DATA\MySemanticSearch.mdf'),
FILEGROUP FileStreamFG CONTAINS FILESTREAM
    (NAME = FileTableFS, FILENAME = 'D:\SQL2012\MSSQL11.SQL2012\MSSQL\FileStream\MySemanticSearch')
LOG ON (NAME = FileTableDBLog, FILENAME = 'D:\SQL2012\MSSQL11.SQL2012\MSSQL\DATA\MySemanticSearch.ldf')
WITH FILESTREAM (NON_TRANSACTED_ACCESS = FULL, DIRECTORY_NAME = N'MySemanticSearch');

Then the FileTable:

CREATE TABLE MyKindleHighlights AS FileTable
WITH (FileTable_Directory = 'MyKindleHighlights');


Loading files into BLOBs using SQL FileTable shared folders

Then I just copied all my text files into the newly created folder for the FileTable (created automatically by SQL):



The table updates automatically, content/streams up and running!
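As a quick sanity check, the FileTable can be queried like any other table. A minimal sketch, assuming the table name from above (`FileTableRootPath` is the built-in function that returns the share path):

```sql
-- The Windows share SQL exposes for this FileTable (where the files get copied)
SELECT FileTableRootPath('dbo.MyKindleHighlights') AS ShareFolder;

-- After copying the .txt files in, rows appear automatically
SELECT name, file_type, cached_file_size
FROM dbo.MyKindleHighlights
WHERE is_directory = 0;
```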


Creating the full text/semantic index

You can do this with T-SQL of course (see here); I chose the lazy way, through the Management Studio UI :)
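For reference, the T-SQL route is roughly the following sketch. The catalog name is mine, and the KEY INDEX clause must name the FileTable's unique key index, whose auto-generated name varies per instance (check sys.indexes):

```sql
CREATE FULLTEXT CATALOG KindleCatalog AS DEFAULT;

-- STATISTICAL_SEMANTICS adds the semantic index on top of the full-text index
CREATE FULLTEXT INDEX ON dbo.MyKindleHighlights
    (file_stream TYPE COLUMN file_type LANGUAGE 1033 STATISTICAL_SEMANTICS)
    KEY INDEX PK_MyKindleHighlights  -- replace with your FileTable's actual unique key index name
    ON KindleCatalog
    WITH CHANGE_TRACKING AUTO;
```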








Using semantic search

It’s now possible to use semantic queries to explore my highlights:
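For example, a sketch using the documented SEMANTICKEYPHRASETABLE function against the FileTable created earlier:

```sql
-- Top key phrases across all indexed highlight files
SELECT TOP 20 keyphrase, AVG(score) AS avg_score
FROM SEMANTICKEYPHRASETABLE(dbo.MyKindleHighlights, file_stream)
GROUP BY keyphrase
ORDER BY avg_score DESC;
```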


Well… not so great. I don’t know exactly why, but my text files (apostrophes and some non-printable chars, I think) caused some confusion in the indexer; I get keywords like “isna” and “whata”… not good.

Reparsing the text files was not an option for now, so I opted for adding some more stop words to the existing system stop word list. For that I listed all the words in my semantic index:



I loaded that list into PowerShell and, with a very crude dictionary-check script borrowed from the web, listed the words the dictionary didn’t recognize. With that I could now create a SQL full-text custom stop word list based on the existing system stoplist:
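The statements themselves are simple; a sketch (the stoplist name is mine, and the two noise words are the examples mentioned earlier):

```sql
-- Clone the system stoplist, then add the noise words flagged by the dictionary check
CREATE FULLTEXT STOPLIST KindleStopList FROM SYSTEM STOPLIST;

ALTER FULLTEXT STOPLIST KindleStopList ADD 'isna' LANGUAGE 1033;
ALTER FULLTEXT STOPLIST KindleStopList ADD 'whata' LANGUAGE 1033;
```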


I used the PowerShell script to produce something like:


Then I configured the full-text index to use that stoplist instead:
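In T-SQL that’s a one-liner (a sketch; KindleStopList stands in for whatever you named your custom stoplist):

```sql
-- Point the full-text index at the custom stoplist; repopulation starts automatically
ALTER FULLTEXT INDEX ON dbo.MyKindleHighlights SET STOPLIST KindleStopList;
```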



Running the query again, as expected, those noise words were now removed from my semantic query results:


Loading into MySemanticSearch codeplex sample

Time to load that FileTable into MySemanticSearch! I made a few visual adjustments, and a few others so it would work on my dev laptop (some strange problem joining stream_id to document_key; I used path_locator instead of stream_id).
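In case it helps anyone hitting the same issue: document_key is the value of whatever column the full-text KEY INDEX was built on, which for my FileTable setup was path_locator rather than stream_id. A hedged sketch of the join that worked for me:

```sql
-- document_key maps to the full-text key index column (path_locator here, not stream_id)
SELECT f.name, k.keyphrase, k.score
FROM SEMANTICKEYPHRASETABLE(dbo.MyKindleHighlights, file_stream) AS k
JOIN dbo.MyKindleHighlights AS f
    ON f.path_locator = k.document_key
ORDER BY f.name, k.score DESC;
```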

I opted for building MySemanticSearch from the source code, using VS 2010, so that took me some additional time to set up.

But it’s finally all running! :) Connecting to my SQL 2012 instance:


It scans for available FileTables and finds MyKindleHighlights immediately:


Some interesting insights (mind you, these are from my biased highlights only, a very small subset of the complete books):






Final thoughts

There seem to be some shortcomings (it appears to handle unigrams only, and treats plurals/singulars as completely different keywords, e.g. cat and cats…). From a “semantic” search point of view I would expect a little better. Maybe I’m missing something, as this was a very initial test.

Nevertheless, as an easily configurable and completely automated unstructured text mining tool now available out of the box, it is a very valuable addition to SQL Server.




SQL Server “Denali” CTP 3 is Out – Best Posts & My Top Picks (Business Intelligence Features)

Yes, the next version of SQL Server, “Denali” CTP3, is out, so finally… we can go public, start making some noise, and share thoughts. :)

No, you won’t find another deep review of “Denali” here. :) No time for that; there’s so much to explore. But I’ve been reviewing several posts, so I leave you with my top posts & notes so far.

(I’ll try to keep this updated… as in… “try” :) )

Download & Documentation

Download SQL Server “Denali” CTP 3 here.

Technet > What’s New in SQL Server “Denali” CTP3

Release Notes

Product Documentation for Microsoft SQL Server Code-Named “Denali” CTP3

Books Online for SQL Server “Denali”

Online Documentation  – Tabular Model Solutions (SSAS)

Online Documentation – Tabular Modeling (Adventure Works Tutorial)

Online Documentation- DAX Table Query Syntax Reference

Online Documentation- Introduction to the CSDL Extensions for Tabular Models

Online Documentation- CSDL Concepts

Online Documentation-Connect to a Tabular Model Database (SSAS)

Online Documentation-Data Quality Services

Great Crescent Tutorial

Crescent design experience

SQL Samples for Denali CTP3

Download Details – Microsoft Download Center – Microsoft® SQL Server® code name ‘Denali’ CTP 3 Feature Pack

Installation & Walkthroughs

Dan English’s BI Blog >SQL Server ‘Denali’ CTP3 Install Experience

MSDN > Deployment Checklist: Reporting Services, Project Crescent, and PowerPivot for SharePoint

MSDN > SQL Server BI Feature Installation with SharePoint

MSDN > Install PowerPivot for SharePoint 

MSDN > Installing Reporting Services SharePoint Mode Report Server for Project Crescent and Data Alerting

Best resources & community posts so far


Microsoft® SQL Server® code name ‘Denali’, Community Technology Preview 3 (CTP 3) Product Guide


SSIS Team Blog > SSIS – What’s New in SQL Server Denali

SQL Server Code Name “Denali” Breakthrough Insight

New “BISM”,Tabular Models & DAX

Andrew Fryer > Analysis Services cubed in SQL Server Denali ctp3

Analysis Services and PowerPivot Team Blog > Welcome to Tabular Projects

Analysis Services and PowerPivot Team Blog > Creating a tabular Model with AMO

Chris Webb’s BI Blog > DAX Queries, Part 1

Chris Webb’s BI Blog > DAX Queries, Part 2

Cathy Dumas’s Blog > Tabular Designer Architecture

WesleyB’s Blog > SQL Server Denali CTP3–Lots of interesting resources

SQLBI – Marco Russo > First steps with #ssas #Tabular in #Denali #CTP3

Data Visualization > Tabular Model, Columnstore, new BIDS released!

Analysis Services and PowerPivot Team Blog > The Diagram is here…

PowerPivot Denali: Parent child using DAX

Project “Crescent”

YouTube > NEW Demo for SQL Server Project “Crescent”

Getting Started with Project “Crescent” and PowerPivot for Excel in SQL Server Code-Named “Denali” Community Technology Preview 3 (CTP3)


SQL Server Reporting Services Team Blog  > SQL Server codename “Denali” CTP3, including Project “Crescent” is now publically available

Technet> Project Crescent Overview

Technet>Data Visualizations in Project Crescent

Small multiples - one bubble chart per month

Technet > Crescent FAQ, Troubleshooting, Tips, and Tricks

Technet > Tutorial Create Charts, Tiles, and Other Visualizations in Project Crescent

Reporting Alerts *NEW!!!*

Reporting Alerting in SQL Server Denali ctp3 – Insufficient data from Andrew Fryer – Site Home – TechNet Blogs

new data alert 2

Data Quality Services *NEW!!!*

Data Quality Services (DQS) > Introducing Data Quality Services

Data Quality Services (DQS) > How to add Reference Data Services in Data Quality Services (DQS)

Interesting findings

Hope that’s not for RTM… Does this mean no Excel on tabular models in DirectQuery mode??? Will Excel support DAX directly? 2010? 2012?

“Connecting to DirectQuery Models

After you have switched the model to DirectQuery mode, traditional OLAP clients cannot connect to the model. For example, if you attempt to create an MDX query against a DirectQuery model, you will get an error indicating that the cube cannot be found, or has not been processed. However, you can use DAX formulas and XMLA queries. For more information about how you can perform ad hoc queries against tabular models, see Tabular Model Data Access.”

Promising… finally! Note that it’s driven by data feeds… so we’ve got great potential here.

Open Alert Designer from SharePoint library

Self Service Alerting:

Alerting is a new capability that we are adding to Reporting Services. As soon as the report server is upgraded to Microsoft SQL Server Code Name “Denali”, it will enable any user that can access reports previously built with SSRS 2005, SSRS 2008 or SSRS 2008 R2 whether in Report Builder or BIDS, to setup alert rules and be alerted when report data changes occur that match a set of rules.  No changes are required to the existing reports. You’ll need to have SSRS integrated with SharePoint in order to take advantage of this new capability (the reports must also use stored credentials data sources, similar to subscriptions).  Management of alerts is also enabled through SharePoint through a common interface.

CTP 3 Books online – Data Alerts (SSRS)

Workflow in Reporting Services alerting

Data Alert Designer

Areas within the Alert Designer user interface

“RDLC Designer and Report Viewer Control: Have been upgraded with Reporting Services “DENALI” code base (Including support for DOCX, XLSX Export, Map functionality, RDL 2010 schema support…). Starting with the next release of Visual Studio, RDLC and Report Viewer will ship with the same level of functionality of the most current release of SQL Server.”

I want to believe that for RTM that’s still hope for a *real* semantic model… that feeds crescent & other tools like excel, and supports both tabular models, as olap models… Come on guys…

“You always start Crescent from a tabular model in a SharePoint Server 2010 document library or in a PowerPivot Gallery. The model can be:

A PowerPivot file (XLSX) in a PowerPivot Gallery in SharePoint Server 2010. For more information, see Use PowerPivot Gallery.
A shared data source (RSDS) with a Microsoft Business Intelligence Semantic Model data source type, based either on a PowerPivot file or on a tabular model on an Analysis Services server. For more information, see Tabular Model Connection Type (SSRS).
A BISM connection file (BISM) based on a tabular model on an Analysis Services server. BISM connection files can be in either a standard SharePoint Server 2010 document library or a PowerPivot Gallery. For more information, see PowerPivot BISM Connection (.bism).”


Closing “rants“ :)


– Ahhh… Excel team: restricting the new models to pivot-table-style reporting… hmmm… you’re fixing this, right? ;) It seems almost unbelievable that MS keeps missing the wide usage of *their own* Excel as a BI tool, and is not (yet) building a proper query editor that covers *real* business intelligence semantic models… with much better query editing & query results (both pivots & tables)…

– And yes, I’ve got a feeling, and for now I’m agreeing with Andrei Pandre (sorry Andrew, I have to disagree): I can’t easily spot a proper BISM (semantic model) (cof, yes, apart from the .bism extension…) that unifies & shields client APIs from tabular or typical OLAP models. I see two models, with some clients (Excel) being forced to see a tabular model as an OLAP model, and others (Crescent) seeing only a tabular model… kind of “messy”… Hope it will get better! The old report models seem much more semantic to me…

– And sorry guys… it’s a little too early to go into a “frenzy” of deep-dive DAX learning. :) No need to rush… honestly, we should be focusing more on helping MS get a great new version out. If they get it right, DAX will then be a huge success… but only then…

Kind regards,


DevDays DAT302, Tuning & understanding MDX Query Performance in SQL Server 2008 Analysis Services





A little bit late, I know, but now in English. Thanks for the “heads up”, Pedro.


Anyway, here it is: all the materials I used in the DevDays session. Notes of interest:

  • Report/dashboard that shows the MDX trace log, properly organized by query
  • Possible to filter, using subreport links
  • Uses a custom report in Management Studio
  • I think it should run fine in SQL 2005 Management Studio (needs SP2 for custom reports)
  • Sample code: use as is, can be buggy! :)

All feedback is welcome!





MDX Studio

Blog Mosha Pasumansky

BIDS Helper

A Solution for Collecting Analysis Services Performance Data for Performance Analysis

Microsoft SQL Server Community Samples: Analysis Services

SQL Server 2008 White Paper: Analysis Services Performance Guide

SQL Server Best Practices Article: Identifying and Resolving MDX Query Performance Bottlenecks in SQL Server 2005 Analysis Services

Microsoft SQL Server 2008 Analysis Services Unleashed (Paperback)