Power Who? … Launching Power View

Power View is the new whiz-bang reporting tool from Microsoft, part of the next generation of easy to use yet powerful data visualisation tools. For a start, if you don't have at least Excel 2010 (with the PowerPivot 2012 RTM add-in installed), SharePoint Server 2010 and the Business Intelligence, Developer or Enterprise edition of SQL Server 2012, then forget it. But if you do have all of the above and you haven't used Power View previously, you are in for a visual treat.

To embark on a quick guide to launching Power View, read on…

Unlike Report Builder, launching a Power View report begins with creating one of the data sources that Power View supports: a PowerPivot database, a BISM Connection file or a shared (Tabular) data source. To create a Power View report from PowerPivot, click the Create Power View Report icon in the upper right-hand corner of the PowerPivot workbook (.xlsx). Yes, you must have Excel 2010 to run the free PowerPivot 2012 RTM add-in. To create a Power View report within SharePoint, click the down arrow next to the Shared Data Source or BISM Connection file and select Create Power View Report. The Power View design environment then opens and you can start building some exciting visuals with this new reporting tool.

If you are unable to launch Power View, or cannot create a BISM Connection file from within your document library, you will need to make some adjustments to the features for your site. For a start, we assume you have correctly integrated Reporting Services and PowerPivot into your SharePoint Server 2010 environment (only basic Reporting Services integration is available under SharePoint Foundation 2010). Now go to the site that contains the document library in which you will store your Power View reports, select Site Actions, Site Settings, Site Collection Features, then ensure the Power View and PowerPivot Integration features are Active. The BISM content type must also be specifically enabled for your document library, which you can do under Library Settings, Advanced Settings, and Add from Existing Content Types.

Further material available here.

SSAS 2012 … The 3 Headed Dog

Cerberus, in Greek and Roman mythology, is a multi-headed hound (usually three heads) that guards the gates of the Underworld to prevent those who have crossed the river Styx from ever escaping. Kerberos, required for a PerformancePoint extranet topology (https://popbi.wordpress.com/2012/04/01/performance-point-2010-external-access-must-read/), also refers to three heads: the client, the server and a trusted third party (the Active Directory domain controller).

What's this guy on about? OK, Analysis Services is, for now, a three-headed beast too, the three heads being different instance types, with an organisation choosing to install three, just two or only one. These three heads are Multidimensional/Data Mining, Tabular and PowerPivot, which are three physically separate installations of Analysis Services. You can tell one from another by the difference in icon style when all three are registered in Management Studio, or by the DeploymentMode property in the msmdsrv.ini file (0 = Multidimensional and Data Mining, 1 = SharePoint/PowerPivot, 2 = Tabular), as outlined here: Differentiating SSAS Instance Types.

Traditional SSAS Multidimensional databases are simply too hard to construct and support for some organisations, so these new approaches offer very good news. PowerPivot/Tabular is a more user-friendly option for developing analytical cubes that won't make you feel like throwing up in the wastebin, unlike MDX for some people.

Each instance type of Analysis Services supports certain data source types, can be programmed with certain design tools, and has its own model features, all of which are outlined in this useful reference: Differentiating Features of SSAS Instance Types. PowerPivot and Tabular are fundamentally similar, differing mainly in their design tools and some subtle model features. You can, however, import a PowerPivot database into a BISM project and publish it back out as a Tabular model.

There are some gotchas before you can use the new Tabular approach… As long as your requirements can be modelled without writeback, many-to-many relationships, very complex calculations or very large datasets, you can use this fantastic new method of developing analytical cubes. You won't need to know MDX (you will instead need to learn the Excel PowerPivot DAX feature set), and can revel in new features such as connecting to a range of external data sources and blistering performance courtesy of the VertiPaq engine.

There is an excellent whitepaper on DAX for Tabular models here: DAX in the Tabular Model Whitepaper and Samples. If you are simply looking for some visual capability demos, check out the various demos such as Picnic and Contoso here: Power View Demos (you will need your Windows Live ID and Silverlight installed).

Finally, some additional design perspectives are offered by Paul Te Braak here Paul’s Stuff.

Source: http://en.wikipedia.org/wiki/Cerberus

SharePoint 2010 Report Server File Sync Feature

If your users will frequently upload published Report Server items directly to SharePoint document libraries, consider enabling the SharePoint 2010 Report Server File Sync feature as a best practice.

The Report Server File Sync Feature synchronizes Report Server files (.rdl, .rsds, .smdl, .rsd, .rsc) from a SharePoint document library to the report server when files are added or updated in the document library.

But what does this mean? I have already integrated Reporting Services with SharePoint, so I thought this would already be the case! Yes and no. The File Sync feature uses SharePoint event handlers to synchronize the report server catalog with items in document libraries more frequently. The reality is that in an integrated Reporting Services environment you have a Report Server database and a separate SharePoint content database that share information. If the File Sync feature is not activated, content will still be synchronized, just not as frequently, so why take the risk?

A classic example is if you are using third party products to build scheduled solution documentation snapshots, such as Pragmatic Works BI Documenter. This tool reads the Report Server catalog (as well as the database, SSIS and SSAS components) to build an exact as-built snapshot of your current Microsoft B.I. content and solutions. This gives you the ability to find out what has changed between any two snapshots, hence an excellent example of why you would want the Report Server catalog kept strictly up to date.

And setup is mindless…

1. From the main page of your site, click the Site Actions menu and click Site Settings

2. Under Site Actions, click Manage Site Features

3. Find Report Server File Sync in the list

4. Click Activate

 

B.I. with SharePoint 2010 Budgie (Foundation) edition

Before you jump into SharePoint 2010, have you considered running the budget, no-frills Foundation 2010 edition for starters? Check out Compare Editions for a feature comparison. You may be surprised at what you get in Foundation: it's free to download and you just need to be licensed for Windows. Check out the official license info here, and more basic and insightful license coverage here.

Aside from Reporting Services 2012 integration, Foundation edition has loads of other features to bring fully functional and professional looking portals to life, and comes with Business Connectivity Services. This feature is handy for businesses that want to maintain their own reference data and need writeback to an external data source. On the SQL Server 2012 Standard edition side, you still get these great features and hence can still run the Analysis Services and Integration Services packages demanded by an end to end Business Intelligence solution. In this integrated combination you don't get Reporting Services Data Alerts or Power View; however, for most organisations starting down the Business Intelligence road, where baby steps are usually taken, Reporting Services 2012 integration with SharePoint Foundation 2010 might do just nicely for minimal dollars.

Let's do it! Where can I download it?!

Part 1: Xray Vision for SQL Server Error Logs

You could be forgiven for not looking at your error logs… but only once… after all, look what happened to Krypton!

In this multi-part series, we walk through how to transform the volumes of mundane SQL error logs for your entire SQL Server fleet into insightful, integrated and simplified 30,000-foot-view dashboards. Consolidating your error logs and creating powerful centralised dashboards on a common SharePoint 2010 platform then creates other flow-on benefits, such as delegating those monitoring tasks to someone else, like the helpdesk!

In this first part of the series we consolidate the error logs for your entire SQL Server fleet into one table. I decided not to reinvent the wheel here… after all, Rodney Landrum has already done an outstanding job of creating an SSIS package for you; download the source code here.
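If you just want to eyeball what a single instance's current error log contains before automating anything, sp_readerrorlog gives a quick look from T-SQL. This is purely for orientation and is separate from Rodney's package, which handles the fleet-wide collection.

-- Read the current error log (0 = current log, 1 = SQL Server log rather than the Agent log)
EXEC sp_readerrorlog 0, 1;
-- Optionally filter the output, e.g. to login related entries only
EXEC sp_readerrorlog 0, 1, N'Login';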

Rodney has also published many other useful methods of monitoring your SQL Server databases, and an excellent ebook to download called SQL Server Tacklebox, full of work-smarter-not-harder solutions. But enough of the Rodney plug…

The skills required for this walkthrough are database and SSIS. If you are new to business intelligence then this will be a perfect end to end learning tool, allowing you to apply these techniques to data you are very familiar with… SQL Server error logs! Let's move on with staging and consolidating the data we are interested in.

1. To get started, download the original package here and extract the solution files to a common folder where you will be able to add further solutions from this series later. If you have trouble downloading Rodney's package, just Download All My Part 1 Files Here, containing the SSIS solution, the specific database objects required and the entire database backup. Whilst SQL Server 2012 is now released, I have used SQL Server 2008 R2 versions of the files for better compatibility.

2. Create yourself a database that will contain your consolidated error logs, then create the database objects required. You can either create the database objects manually or just restore the starter database, both of which are available in my Part 1 Download Link.
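As a rough sketch, and assuming you call the database DBA_Rep to match the package's connection manager, either route looks something like the following. The backup file name and path are placeholders; point them at wherever you extracted the Part 1 download.

-- Option A: create an empty database, then run the object creation scripts from the download
CREATE DATABASE DBA_Rep;
GO
-- Option B: restore the starter database instead (check the logical file names first if your
-- server's default file locations differ from those stored in the backup)
RESTORE FILELISTONLY FROM DISK = N'C:\ErrorLogSeries\DBA_Rep.bak';
RESTORE DATABASE DBA_Rep FROM DISK = N'C:\ErrorLogSeries\DBA_Rep.bak' WITH RECOVERY;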

4. Populate the ServerList_SSIS configuration table to include one row for each server you want to manage.

As an example, you could build a series of INSERT statements such as the following:

INSERT dbo.ServerList_SSIS (Server, Connect, DMZ)
VALUES ('YourSQLInstanceName', 1, 0)

5. Open the SSIS solution and update the Connection Manager named “DBA_Rep” to reflect the correct connection string of your new database.
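For reference, an OLE DB connection string for a SQL Server 2008 R2 target typically looks something like the line below. The server name is a placeholder, DBA_Rep is the database name assumed earlier, and the provider version will vary with the client libraries installed on your machine.

Data Source=YourSQLInstanceName;Initial Catalog=DBA_Rep;Provider=SQLNCLI10.1;Integrated Security=SSPI;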

6. Set the package ProtectionLevel property accordingly (e.g. DontSaveSensitive), then save the package.

7. Create a SQL Server Agent job that runs this package. This requires adding a job step that runs the SSIS package from either a file location or the MSDB database (whichever deployment method you chose for the package). If unsure, just go for the file location option. A good general reference for running SSIS packages through SQL Server Agent jobs is outlined here.
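If you prefer to script the job rather than click through the New Job dialogs, a rough T-SQL sketch follows. The job name, step name and package path are placeholders (the actual .dtsx file name in Rodney's solution may differ), and the step uses the file location option mentioned above.

USE msdb;
GO
-- Create the job and point it at the local server
EXEC dbo.sp_add_job @job_name = N'Consolidate SQL Error Logs';
EXEC dbo.sp_add_jobserver @job_name = N'Consolidate SQL Error Logs', @server_name = N'(local)';
-- Add a step that runs the SSIS package from its file location
EXEC dbo.sp_add_jobstep
    @job_name = N'Consolidate SQL Error Logs',
    @step_name = N'Run error log package',
    @subsystem = N'SSIS',
    @command = N'/FILE "C:\ErrorLogSeries\SSIS_ErrorLog_Package.dtsx" /CHECKPOINTING OFF /REPORTING E';
GO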

8. Schedule the package to run once daily (say, 7am). This will capture the overnight logs, which sets us up well for Part 2 of this series.
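Continuing the sketch above, a daily 7am schedule can be attached with sp_add_jobschedule (again, the job and schedule names are just placeholders):

USE msdb;
GO
EXEC dbo.sp_add_jobschedule
    @job_name = N'Consolidate SQL Error Logs',
    @name = N'Daily 7am',
    @freq_type = 4,               -- daily
    @freq_interval = 1,           -- every day
    @active_start_time = 070000;  -- 07:00:00
GO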

9. Run the package and make sure it completes successfully. Note: before we move on to Part 2 of the series, you need to ensure there are rows in the consolidated error log table for each of the servers you configured in the ServerList_SSIS table.

SELECT Server, COUNT(1) AS CountErrorLogRows
FROM SQL_ErrorLog
GROUP BY Server
ORDER BY Server
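To make the "one row per configured server" check explicit, a quick left join against the configuration table will list any servers that have produced no rows yet (table and column names as used in the steps above):

SELECT s.Server
FROM dbo.ServerList_SSIS AS s
LEFT JOIN dbo.SQL_ErrorLog AS e ON e.Server = s.Server
WHERE e.Server IS NULL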

In Part 2 of this Xray Vision for SQL Server Error Logs series, we use a second SSIS package to funnel our consolidated error logs into a predesigned Kimball star schema data model. Stay tuned!

If you have any questions, please Get in Touch on LinkedIn.

SSRS – Quick Date Formatting for Tables, Textboxes and Charts

Datasets containing datetime fields can be excellent for ordering, but you will want to apply some quick formatting techniques to tidy up those date formats when they are used in tables, textboxes, or even labels in charts. Outlined below are some basic suggestions for quickly tidying up three different scenarios in your report.

For a datetime field used in a Tablix, go to the Format property and enter … 

dd/MMM/yyyy

For chart labels you will need to enter an expression in the Label property …

=Format(Fields!YourDateTimeField.Value,"dd/MMM/yyyy")

For textboxes, for example one displaying the datetime of the report execution, you can enter the following in the Format property of the textbox …

dd/MMM/yyyy hh:mm tt

For dates coming from OLAP cubes, you may find the above approaches fail to work. If that is the case, try using CDate in your tablix cell first, using the following field expression…

=CDATE(Fields!YourOLAPDateField.Value)

Then in the Format property for the textbox, enter your date format e.g. …

dd/MMM/yyyy
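Alternatively, you can collapse the two steps into a single expression in the cell (using the same hypothetical field name as above), which avoids touching the Format property at all:

=Format(CDate(Fields!YourOLAPDateField.Value), "dd/MMM/yyyy")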

PerformancePoint Unexpected Error Occurred 11861 Creating Report

You have launched PerformancePoint Dashboard Designer in SharePoint 2010, created a data source connection to your cube, then attempted to create an Analytical Chart report. You get an Unexpected Error 11861 popup message.

You just need to right-click your new connection and select Save, and then you will be able to create your Analytical Chart.

Reference: PerformancePoint Unexpected Error Occurred 11861