I just finished reviewing the November entries in the TechNet Wiki Guru competition and was quite impressed with the submission from Visakh Murukesan, titled “Random SSRS Musings 2 : Simulating NOT IN and NOT LIKE Functionality In SSRS Filter Expressions”. It is a well-written post, and both techniques are excellent examples of how simple expressions can be used to extend the capabilities of reporting features.
Categories : SQL Server, SSRS Design
Can an SSRS report be designed to drill-through to an Excel workbook in-context (showing only the same filtered detail data)? I have to admit that I have been chasing this solution for a few years before I was motivated enough to work it through to a viable result. More than a few consulting clients have asked if it is possible to build a dashboard solution with the option to drill-through to transactional details in Excel. The answer has always been “well, sort of”… we could only drill-through to an SSRS report and then export the contents of that report to Excel. That answer usually wasn’t good enough for the Excel power users who need to create their own workbook formulas and calculations, and use other Excel formatting and features like PivotTables, slicers and conditional visualizations. Over the past few years, I have used some clumsy work-around techniques and discovered, for example, that if the target workbook is published in SharePoint and managed in a web part, workbook parameters can be used, with considerable effort, to achieve this task. However, that option has not proven to be a practical solution in most cases. As my good friend Steve Eaton once said: “Anything is possible if you have a positive mental attitude, tons of money and supernatural powers.” I’ll admit that I’m short on two of the three, but I do have persistence and I’m bull-headed enough to apply a little out-of-the-box thinking now and again. The technique I will demonstrate works in a standard Reporting Services deployment with any edition of Excel on the desktop.
I’ll start with a quick demonstration of the finished solution. The Order Details report, shown here in Internet Explorer, gets data from a data warehouse (AdventureWorksDW2014 in this example). The order details and line items are stored in our line-of-business transactional database (for this demo, AdventureWorks2014). As you can see, I choose a date range using the standard date parameter UI. I’ve also exposed a Product parameter using a set of cascading lists and custom actions in the report header (I’ll cover that technique in another post). The relevant point is that we’re selecting some parameters to get a filtered view of the report data. After choosing the date range, I use the Category, Subcategory and Product lists in the report header to select Clothing, then Socks and then the product: Racing Socks, L. The cascading list simplifies the selection from among several hundred products.
A list of orders is displayed for the selected date range and product. In the list, I click the Sales Order Number for which I want to see the order and line details in my Excel report.
This displays a link “button” in the table header with a summary of the order number I selected.
When I click this link, the web browser confirms that I want to open the Excel file. This existing file stored in a network file share contains a connection to the transactional database with the order detail information. When the report opens, it applies the filters and shows the order with line item details. Our business users are thrilled with this because they’re actually using Excel with any features and capabilities they want. Rather than dumping a copy of data into a static table, live data is presented in PivotTables or charts which contain their calculation formulas and custom formatted data. If the business user decides to add another column, calculation, chart or other item to their report; they simply save it with those changes and use that as their detail report going forward. The new drill-through data just magically shows up in their workbook file with those additions the next time they drill-through from the summary report.
How Does It Work?
Reporting Services allows us to use parameters to pass information between reports – and that’s awesome if you’re only using Reporting Services. It allows you to maintain the context of properties and filtering options. But, if you’re not using Reporting Services and don’t have some kind of mechanism to “pass” parameters (like QueryString parameters to send information to a web page), we need to put those values some place so the target “report” (i.e. Excel workbook in this case) can retrieve them and apply them as filters. So, where would be a reliable place to store parameter information? How about SQL Server! Novel, huh?
To get started, open SQL Server Management Studio and create a database named ReportStandards. Let’s add all of the objects at once and then I’ll step through the use of each one. For demonstration purposes, I have not taken time to optimize this database and adding a few indexes would be advisable in a production scenario. Execute this T-SQL script:
-- (Reconstructed script; exact data type sizes are my choices.)
create table dbo.ReportUserFilters (
	UserName		nvarchar(100)	not null,
	TableName		nvarchar(100)	not null,
	FilterKey		varchar(100)	not null,
	InsertDateTime	datetime		not null default getdate(),
	constraint pk_ReportUserFilters primary key
		(UserName, TableName, FilterKey)
);
go

-- Inserts one filter value for the current user.
-- @KeyColumnName is informational; the value itself is stored in FilterKey.
create procedure dbo.InsertReportUserFilter
	@TableName nvarchar(100), @KeyColumnName nvarchar(100), @FilterKey varchar(100)
as
	insert into ReportUserFilters
		( UserName, TableName, FilterKey, InsertDateTime )
	values ( system_user, @TableName, @FilterKey, getdate() );
go

-- Removes stored filter values for the current user
create procedure dbo.ClearReportUserFilters
as
	delete from ReportUserFilters where UserName = system_user;
go

-- Returns the stored filter values for the current user and table
create function dbo.GetReportUserFilters( @TableName nvarchar(100) )
returns @TableOut table ( FilterKey varchar(100) )
as
begin
	insert into @TableOut
		select FilterKey from ReportUserFilters where UserName = system_user
			and TableName = @TableName;
	return;
end
go

-- Implementation for the Sales Order report
create procedure dbo.PrepareReportSalesOrder
	@SalesOrderNumber varchar(100)
as
	if @SalesOrderNumber is not null
		and @SalesOrderNumber <> '-1'
	begin
		exec dbo.ClearReportUserFilters;
		exec dbo.InsertReportUserFilter
			'FactResellerSales', 'SalesOrderNumber', @SalesOrderNumber;
		select 'Success' as Result;
	end
	else
		select 'NoSelection' as Result;	-- any non-'Success' value keeps the link hidden
go
So, what did we create? Here’s a quick summary:

ReportUserFilters table
Stores sets of parameter values for a user running a report

InsertReportUserFilter stored procedure
Inserts a parameter record into the ReportUserFilters table

ClearReportUserFilters stored procedure
Removes stored parameter records for the current user

GetReportUserFilters function
Returns a set of filter parameters to be used in a SQL WHERE clause IN function

PrepareReportSalesOrder stored procedure
An implementation for a specific report
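As a quick sanity check, the whole flow can be exercised directly in Management Studio before wiring anything up in the report. The order number here is just a sample value:

```
-- simulate the report action for one selected order
exec dbo.PrepareReportSalesOrder @SalesOrderNumber = 'SO51234';

-- the Excel view's subquery will now see this stored filter value
select FilterKey from dbo.GetReportUserFilters('FactResellerSales');
```

If the first call returns “Success” and the second returns the order number you passed in, the plumbing is working.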
I’ll open the finished SSRS report in Report Builder to show you the working solution.
There are several features of this report that aren’t directly related to the Excel drill-through action so I’ll fast-forward through those after a brief summary. This is like that scene in The Princess Bride where the masked swordsman wakes up after being mostly dead and brought back to life by Miracle Max. Inigo says “let me e’splain… No. There is too much… let me sum up.” Here’s the summary:
The relevant report elements are circled in red and annotated with red numbers. Everything else is standard stuff that I would have designed into a report that doesn’t do this drill-through thing to Excel. The non-circled elements are parameters, datasets and other report elements that let the user interact with the report and filter a list of orders for a selected product and date range. I’ll refer to these numbers as I describe these report design elements.
Item 1 is a report parameter named SalesOrderNumber. A report action on the SalesOrderNumber textbox in the table (item 5) sets this parameter value. The parameter is defined as Text type, is hidden and has a default value of -1. The default value is a placeholder value that isn’t a valid SalesOrderNumber value.
The Orders dataset is just a plain old query that returns order records from the AdventureWorksDW2014 database filtered on the ProductKey, OrderDateFrom and OrderDateTo parameters. Nothing fancy here:
select rs.SalesOrderNumber, rs.OrderDate	-- (remaining columns omitted)
from FactResellerSales rs
where rs.OrderDate BETWEEN @OrderDateFrom AND @OrderDateTo
	and rs.ProductKey = @ProductKey
The table (Item 5) is unsophisticated as well. The SalesOrderNumber textbox in the detail row has a report action defined. Open the textbox properties window and select the Action page which looks like this in the designer:
The target report expression (labelled “Specify a report”) just refers to the Globals!ReportName object. This target re-runs this report when the user clicks this textbox. All the parameters but the last one are simply used to maintain their current values when the report is re-rendered. Ignore the ShowParameters item as well. The SalesOrderNumber parameter is set to pass the SalesOrderNumber field value so we know which order the user selected.
Item 3 is a dataset named SetupSalesOrderReport which serves two purposes. Most importantly, it writes the selected SalesOrderNumber value to a table so it can be used to filter the result set in the Excel workbook. This dataset is a simple stored procedure that returns a flag value used to display the drill-through link. Item 4 is a textbox that serves this purpose and its Hidden property is set to the following expression:
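Assuming the dataset’s flag field is named Result (as described below), the Hidden expression is along these lines:

```
=IIF(First(Fields!Result.Value, "SetupSalesOrderReport") = "Success", False, True)
```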
This simply says “if the SetupSalesOrderReport Result field value is ‘Success’, show this textbox”.
The SetupSalesOrderReport dataset references the PrepareReportSalesOrder stored procedure and passes the SalesOrderNumber report parameter. When the selected order number is passed to the parameter by this report action, the procedure stores the value and returns “Success”. This, in turn, displays the textbox showing the link. The Value of the textbox uses the following expression to display a dynamic instruction to the user:
="Sales order " & Parameters!SalesOrderNumber.Value & " details in Excel"
…and the Action for this textbox uses a Go to URL link with this expression:
="file:\\\\tsclient\D\Projects\Excel Drillthrough Reports\Sales Order Details.xlsx"
Any valid UNC path will work here, prefixed with "file:\\". This particular path is for a folder on my laptop that I am accessing from within a virtual machine I use for development and demonstrations. You will need to grant file system permission to the folder or share to be able to open this file.
The Excel “report” is a standard workbook. I’m using Excel 2013 but any supported version or edition of Excel will work. The important element of this solution component is the connection used to drive the PivotTables in the workbook. You can use a SQL statement to define a connection/query in Excel but it’s much easier to use a view. This report uses the following view which I created in the AdventureWorks2014 database:
-- (View name and select list are placeholders; the joins and filter are the point.)
create view dbo.vSalesOrderDetails
as
select soh.SalesOrderNumber, soh.OrderDate, sod.OrderQty, sod.LineTotal, p.Name as ProductName
from Sales.SalesOrderHeader soh
join Sales.SalesOrderDetail sod on soh.SalesOrderID = sod.SalesOrderID
join Sales.Customer c on soh.CustomerID = c.CustomerID
join Person.Person cp on c.PersonID = cp.BusinessEntityID
join Sales.SalesPerson sp on soh.SalesPersonID = sp.BusinessEntityID
join HumanResources.Employee e on soh.SalesPersonID = e.BusinessEntityID
join Production.Product p on sod.ProductID = p.ProductID
where soh.SalesOrderNumber in
	(select * from ReportStandards.dbo.GetReportUserFilters('FactResellerSales'))
After selecting this view when defining the connection in Excel, the command text is simply the name of the view.
I’ve updated the connection properties to refresh the data when the workbook file is opened. This will run the query and return live sales order data and apply the filtering logic that was added from the Reporting Services report.
Two PivotTables are added to the worksheet along with a calculated field (just to demonstrate that calculations can be performed in Excel rather than externally). Several styling enhancements are added in Excel such as data bars, font and color changes, hiding the grids, etc.
There is a lot of opportunity to enhance this solution depending on specific business needs. For example, a report name column can be added to the ReportUserFilters table to store user/parameter values separately for each report. I haven’t used the InsertDateTime column in this example but this could be used to go back and run the detail report for a point-in-time.
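For instance, the per-report scoping could be added with a change along these lines (the column name is my own, and the primary key would also need to be extended to include it):

```
-- hypothetical enhancement: scope stored filter values by report
alter table dbo.ReportUserFilters
	add ReportName nvarchar(200) not null default '';
```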
The previous example only inserted one parameter value but the following enhancement could be used to pass multiple selected parameter values from SSRS so they would all be included in the detail report:
-- (Reconstructed sketch: the procedure name, parameter list and the tail of the
--  dynamic SQL string are my assumptions; @FilterList is a comma-separated list.)
create procedure dbo.InsertReportUserFilters
	@TableName nvarchar(100), @KeyColumnName nvarchar(100), @FilterList nvarchar(1000)
as
declare @SQL nvarchar(1000)
set @SQL =
	'insert into ReportUserFilters ( UserName, TableName, FilterKey, InsertDateTime ) '
	+ 'select distinct system_user, '''
	+ @TableName + ''', '
	+ @KeyColumnName
	+ ', getdate() from '
	+ @TableName
	+ ' t where cast('
	+ @KeyColumnName
	+ ' as varchar) in('''
	+ replace(@FilterList, ',', ''',''')
	+ ''')'
exec sp_executesql @SQL
Categories : SQL Server, SQL Syndication, SSRS Design
Please join my associates and me for an all-day SQL Server Upgrade workshop on November 3rd
If you are planning an upgrade to a newer version of SQL Server, you won’t want to miss this all-day, preconference workshop.
John Martin, Sr. Technology Evangelist from Microsoft, will spend the morning discussing migration planning from SQL Server 2005.
In the afternoon, SolidQ mentors Ron Talmage, Richard Waymire, Jim Miller and Paul Turley will talk about and demonstrate upgrading specific workloads (projects and features) from older product versions to newer product versions. We will introduce the comprehensive SQL Server 2014 Upgrade Whitepaper that we recently wrote for publication by Microsoft.
In addition to upgrading specifically from SQL Server 2005 to SQL Server 2014, we will also talk about upgrading from SQL Server 2008 and 2008 R2 to SQL Server 2012 and 2014.
From the PASS Summit Sessions page:
An upgrade and/or migration is far more than just a case of moving a database or installing a new version of SQL Server; there have been so many changes since SQL 2005 arrived that we need to do a lot of tasks to ensure we have a successful upgrade project.
This session will guide you through the process, looking at not only the technology but the methodology, processes and tools that we have at our disposal to make sure that when we do move from SQL Server 2005 to 2012/2014 or to the cloud with SQL Server in an Azure VM or Azure SQL Database that we do so in a way that we can be confident of success. We will take a special look at workload-specific upgrade needs for OLTP, HA, SSAS/SSRS, and SSIS environments.
In my afternoon section, I will demonstrate the capabilities of SSAS Tabular models and discuss upgrading and migrating Reporting Services from 2005, 2008 and 2008R2 to 2012 and 2014.
Categories : PASS, SolidQ, SQL Server, SQL Syndication, SQLServerPedia, SSAS Administration, SSAS Design, SSRS Administration, SSRS Design, Tabular Models
Thank you to the Microsoft MVP Program for awarding me MVP for six years running! I love this program and look forward to another year of great community events and opportunities to advise and mentor.
This year I had the privilege to speak at numerous SQL Saturdays, conferences and user groups, help develop a certification program for the University of Washington, contribute to books and papers, moderate forums, test new products and advise Microsoft product teams, direct the Oregon SQL PASS Chapter, and work with consulting clients to develop database and BI solutions using cutting-edge technologies. I’m planning the Oregon SQL Saturday this year and will be co-presenting a pre-conference at the PASS Global Summit.
I’m always looking for new opportunities to support the SQL Server and Microsoft Business Intelligence community. If you’re looking for a speaker, writer or need some help, please contact me.
Categories : BI Industry, BI Projects, Business Analytics, Microsoft BI Platform, MVP Community, PASS, SolidQ, SQL Saturday, SQL Server, SQL Server Pro Magazine, SQL Syndication, SQLServerPedia
I just received some bad news and found out that a friend passed away today. Reed Jacobson was one of the early leaders in the Microsoft Business Intelligence community. Somewhere around 2004-2008 I worked with Reed at Hitachi Consulting in Seattle, where we travelled, taught classes and worked on client projects together. He was one of the earliest adopters and gurus in the BI community, working for a small start-up called OLAP Train that helped bring SSAS (aka “OLAP Services”) to the masses. He spoke at many conferences and taught a lot of classes.
Not only was he one of the smartest people I’ve known when it came to solving business problems with Analysis Services & MDX, but he was a kind, friendly man who really enjoyed helping and connecting with people. He wrote many books about data analysis with Excel & SSAS and he wrote the MDX training curriculum for Hitachi Consulting. He was my mentor. We co-taught some of those early classes and I spent some time with him on the road. He bent my brain intellectually and we talked about work, family, politics and religion. He was a man of principle who asked questions and gave a lot of thought to his actions.
Reed left a lot more than a collection of old technical books behind. He had many good friends, customers and connections who appreciate his leadership, guidance and friendship. I am privileged to have worked with Reed.
A memorial will be held this Sunday, July 20 at 7:00 PM
Dr. Jose Rizal Park; 1008 12th Ave S, Seattle, Washington 98134
Tags: Reed Jacobson
Categories : BI Industry, PASS, SQL Server, SQL Syndication, SQLServerPedia
I’ve created a simple query performance logging tool for Analysis Services, called the SSAS Performance Logger. The tool allows you to navigate through the metadata for a tabular model and select from measures and table attributes to build a simple query which is executed and timed. The query can be executed many times to get an accurate sampling of test variations and to test performance while various loads are placed on the server.
I literally created this over the weekend and plan to add additional capabilities as time allows – so please check back. To provide feedback, add comments to this post.
My initial objective was to choose two different measures that were alternate methods of calculating the same value, run them side-by-side and then see which performed best. Then it occurred to me that we can run any number of measure performance tests in iterative loops and compare the results by playing back the captured log results. Since the results are captured in a log database, the test results for each measure or query can easily be compared and analyzed using tools like Excel and Reporting Services.
One of my main objectives for a future version of the tool is to add logging for server performance counters like memory usage, thread counts and CPU capacity; while these queries are running.
What you will need:
I’ve developed SSAS Performance Logger for use with SSAS Tabular in SQL Server 2012. To manage the logging database, it uses the OLEDB data provider for SQL Server 2012, which may be backward compatible as far back as SQL Server 2008. It also uses the ADOMD data provider for Analysis Services. When MDX support is added to a future version, it should support SSAS sources as far back as SSAS 2008.
- Microsoft .NET Framework 4.5
- ADOMD 6.0 (can be installed with the SQL Server 2012 SP1 Feature Pack)
- An instance of SQL Server 2008 or better with administrative/create database rights for the logging database
With these components installed, you should be able to copy the executable to a folder and run it.
- Download the zip file and extract it to a folder on your system
- Verify that the required dependent components are installed
- Run SSAS Perf Tester.exe
The first time it runs, the application will check for the logging database and prompt you to create it
- On the SSAS Query page, enter the Analysis Services server or instance name and press Enter
- Select a model or perspective, measure and table/attribute from the drop-down list boxes to build the query
- Click the Start button to run the query once and see how long it took to run
- On the Options page, type or select the number of times to repeat the query
- Click the Start button to run the query in a loop. The results are displayed in a line chart showing the duration in milliseconds for each execution
Every query execution is logged for later analysis. Here is a view of the logItem table in the SSASPerformanceTest database:
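Since the log lands in a plain SQL Server table, comparing test runs is a simple query away. A sketch, with column names that are my guesses at the logItem schema:

```
-- compare queries by average and worst-case duration (column names assumed)
select QueryText, count(*) as Executions,
	avg(DurationMS) as AvgDurationMS, max(DurationMS) as MaxDurationMS
from SSASPerformanceTest.dbo.logItem
group by QueryText
order by AvgDurationMS desc;
```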
Features I hope to add soon:
- Generate MDX queries in addition to DAX
- Handle multiple measures in one testing batch
- Log server performance counters
Tags: SSAS Performance Logger
Categories : Business Analytics, DAX, SolidQ, SQL Server, SQL Syndication, SSAS Administration, SSAS Design, Tabular Models
This fact is probably not documented very well but this is an important factor in SSRS visual report design…
Rectangles are a core part of Reporting Services report design, and they change rendering behavior in an important way. Cells in a tablix will grow vertically to accommodate wrapping text, and when that happens, any object on the same row will stretch to fit the row height. For images, charts (which are rendered as images), gauges, indicators (which are gauges) and sparklines (which are charts), this means the image must be resized and scaled to fit. Image scaling adds a lot of rendering overhead. Placing any of these objects inside a rectangle prevents the image scaling and speeds things up considerably. This is by design and a recommended practice.
Melissa Coates posted a related topic:
In several comments to that post, users noted slow rendering when charts and sparklines were used in a table. In addition to the rectangle trick, one user suggested “Setting the SizeToReportContent to false on the report viewer allows the images to be consolidated into one image and the performance improves dramatically.”
In July, 2012, Jason Thomas posted on a similar topic:
In that post, he provides step-by-step instructions, with images, for using rectangles in a dashboard-style report to correct the rendering issue. He also has several posts on related topics.
Categories : SQL Server, SQL Syndication, SQLServerPedia, SSRS Design