I have previously written about upgrading and moving Reporting Services to SSRS/2008 R2/SP2010, and also on upgrading to the new Service application in 2012. Both of these deal with moving prior versions of Reporting Services running in SharePoint mode to more recent versions, also running in integrated mode. What has been lacking from Microsoft until now was a mechanism to help move an organization from Reporting Services in Native mode to Reporting Services in integrated mode.
The solution to date has been to go back to the source projects in BIDS and redeploy them to the Integrated Mode server. This of course assumes that BIDS was used for report design (not Report Builder), and that the projects are available. You also lose all server side configurations (like subscriptions) with this approach.
On Friday, April 20 2012 Microsoft released Version 1.0 of the Reporting Services Migration Tool, which allows you to do just that. It’s a high level tool that brings all of the artifacts out of the Native mode instance, and at a later point in time, import them into the Integrated Mode instance. Ultimately, the stated aim of the tool is to allow a file system level backup of your Reporting Services Instances, be they Native or Integrated mode.
It can be run either by command line, or through a GUI. A snapshot of the GUI screen can be seen below.
The tool is definitely version 1, and has several limitations which I’ll outline below, but it does work. It does so by connecting to either the WMI provider, or the Reporting Services web services, then extracting all of the available content, and then building a PowerShell script which can be run to place the backed up content in a SharePoint document library that has been properly configured to support the Reporting Services content types.
Operation of the tool is relatively straightforward, and is adequately documented on the download page, so I won’t go through a step by step, but I do want to share a few observations.
Firstly, migration is from Native mode to Integrated mode only. The stated objective of this tool is to support both modes on either end of the migration path, but for now it’s a one way trip, which for the moment limits its usefulness as a backup tool. However, if you examine your output folder, you’ll find all of your report files, connection files, and so on, so if you built your reports with BIDS and lost the original source project, it’s a great way to get them back out of the Reporting Services database.
I have also been unable to get the WMI provider to work at all. I’ve tested with both SSRS 2008R2 and SSRS 2012 Native mode sources, but the tool can’t seem to find the WMI instance. The tool still works in this configuration, but it will not back up passwords or history snapshots. I’ll update this post if/when a solution to this can be found.
UPDATE – Thanks to Tristan in this MSDN forum thread – The WMI provider is working. I have added the paragraph and image below.
The Instance Name field is mislabelled. It should be SERVER\Instance for non-default instances, or just SERVER for default instances. Unfortunately, nowhere in the help is this server name requirement mentioned. Essentially, you should treat this field the same way you would treat the Server field when connecting via Management Studio. The image above has been updated to show the correct value for Instance Name (in this case, although not necessary, I have included the name of the default instance).
As outlined on the download page, the tool does not back up Reporting Services security information, or role information – which makes sense when moving to a new security model. Also, linked reports aren’t supported in Integrated mode, so they’re not backed up at all.
For a complete list of constraints and instructions, visit the download page.
For all its limitations, this tool is a very welcome addition to the toolkit. Migrating from Native mode to SharePoint Integrated mode Reporting Services no longer needs to be painful.
PowerView is one of the shiny new features that become available to SharePoint users with the implementation of SQL Server 2012. To be sure, you don’t need to convert your database engine over to SQL Server 2012, but you do need to install PowerPivot for SharePoint and/or the new Reporting Services service application for SharePoint, both only available in SQL Server 2012.
PowerView requires a connection to an xVelocity Analysis Services engine (xVelocity tabular mode was formerly known as VertiPaq), or to an Excel spreadsheet with an embedded PowerPivot model, in order to function. When connecting to the Analysis Services engine, there are a couple of permissions that need to be set on that engine that you should be aware of.
If you wish to connect to the Analysis Services engine, you have two choices. Firstly, you can use a Reporting Services connection (.rds) and select “Microsoft BI Semantic Model for PowerView” as the Data Source type.
Once this is done, it can be used to create PowerView reports. This approach works without any additional configuration, because Reporting Services takes advantage of the EFFECTIVEUSERNAME connection property for user impersonation, something that BISM does as well. When using an RS connection, the stored credential (in this case) is used to establish a connection with the data source, and then all queries are executed with the identity of the requesting end user (provided that the “Set execution context…” option is not selected).
For further reading on how Reporting Services uses the EFFECTIVEUSERNAME property, check this article. Since BISM uses the same property, it’s highly relevant to PowerView.
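To make the mechanism concrete, a connection string carrying this property looks something like the fragment below. This is only an illustration of what Reporting Services does on your behalf; the server, database, and user names are placeholders, not values from any real environment.

```
Data Source=ASSERVER\TABULAR;Initial Catalog=SalesModel;EffectiveUserName=DOMAIN\enduser
```

The service opens the connection with its own stored credential, and the EffectiveUserName property tells Analysis Services to evaluate all queries as the named end user.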
However, a problem arises if we choose the second option and want to work with a BISM file directly. If we want to use a basic BISM file, we first add it by creating a document with the BI Semantic Model content type.
The connection form is far simpler than the one used to configure the Reporting Services connection. You’ll notice immediately that there is no option for storing credentials. For a server connection, all that is necessary is a name for the connection, the name of the server, and the name of the database.
Unfortunately, if you haven’t gotten everything configured just so and you click OK, you’ll receive the error “Cannot connect to the server or database”.
This is because, when the server performs the validation check against the Analysis Services server, it uses a system account to make the initial connection. This is documented in this Connect forum entry. The corrective action is to add that account to the Analysis Services Server Administrator role:
What account needs to be added? The forum article I referenced above indicates the PowerPivot service account, which is now listed in SharePoint as the “SQL Server Analysis Services” account. However, my testing has indicated that the farm account is the one that is used to verify connections with Analysis Services, so that’s what I’m recommending here. This is only used for validation purposes, and to get around this problem without adding the account to the administrators role, you can simply select the “Save the link file without connection validation” option. This may however be unsettling to users.
Once you’ve created your BISM connection, you should then be able to use it to launch Excel, or create a PowerView report. When you launch Excel, the BISM is referenced by an Excel connection, which then connects directly to the data source with the credentials of the end user. No problem. However, if you attempt to create a PowerView report, you may find yourself faced with “An error occurred while loading the model for the item or data source”.
The body of the error misleadingly leads you to believe that it’s the end user (btidwell in this case) who doesn’t have access. What should happen is that, as explained above, the Reporting Services service makes a connection with the AS server, and then uses the EffectiveUserName property to impersonate the user. For more detail on how BISM authenticates to back end data sources, see this MSDN article.
If you’re seeing this error, the chances are that the service account that is being used for Reporting Services doesn’t have access to the Analysis Services database.
As in the verification step above, granting the account access to the specific database isn’t enough; you need to add the service account used by Reporting Services to the Server Administrators role. Once you’ve done that, PowerView should behave smoothly.
If you’ve used SQL Server Reporting Services to any great extent, you’ve likely encountered the need to generate reports automatically. This requirement may be for delivery purposes, archival purposes, or simply to reduce report rendering wait times for end users. SSRS supports this out of the box – a report administrator can set up a subscription, enter the required parameters, and the report will be generated and delivered on that schedule.
This approach is highly manual, and puts the onus of subscription creation on the report administrator. To address this, SSRS also supports data driven subscriptions, which allow subscriptions to be looked up from a SQL table. How that table is maintained is up to the individual organization, but it does allow a measure of dynamism. With SQL Server Reporting Services 2012, this feature is made much more user friendly through the use of User Driven Subscriptions.
The downside to any of this dynamic behaviour is that in every case, it requires the Enterprise edition of SQL Server (with SQL Server 2012, the BI SKU also has this capability). In addition, with SQL Server versions prior to 2012, the capability is somewhat less than user friendly.
In this post, I will outline a methodology that will allow you to provide SharePoint list based report subscriptions that will allow users to subscribe to published reports, and have them published to a SharePoint document library. The approach is not restricted to SharePoint – indeed it could be used with native mode to read through a list of possible parameter values, and email the resulting reports, or store them in a file system, but the SharePoint example is the one that I will be using below.
I should also point out that although the examples below use SQL Server 2012, the approach should work with versions back to SQL Server 2005.
The primary components of this solution are a SharePoint list that will be read to determine what reports to render (the subscription list), a SQL Server Integration Services (SSIS) package that will read through the subscription list and use the values therein to render the report, and finally, a SharePoint document library that will house the reports. Of course, we also need a report to be rendered, and in our case, this report is also stored in a SharePoint document library, as Reporting Services is running in SharePoint Integrated mode.
The good news is that all of the constituent portions of this solution are either downloadable for free, or come with every edition of SQL Server other than Express. Therefore, the chances are that if you have SharePoint, you already have all of the tools that you need.
Step 1 – Obtain the SharePoint List Source and Destination Project
Out of the box, SSIS doesn’t know how to talk to SharePoint data. Fortunately, there’s an excellent Codeplex project that adds the required capability. If you haven’t already done so, download the SharePoint List Source and Destination project from Codeplex. You will find a good blog post on working with this tool here. Once installed, you will be ready to build the solution. Of course, this step is only necessary if you want to use a SharePoint list as a subscription source.
Step 2 – Create your subscription and report library
In this solution, we will allow a user to enter a subscription request in our subscription list. The user can specify the URL of the report to be run, the parameters for the report, the file type that is to be produced, and the library where the report is to be stored. In order to support this, we’ll need at least one document library where the produced reports will be stored, and one custom list.
Next, create a custom list. In our case, the list will be named “Subscriptions” and will be created in the “http://home.nautilusinc.local/sites/nmarine/IT/Sandbox” site. Where you create this list is not important, but what is important is the display name of the list, and the URL of its parent site.
For our use case, we want the user to be able to specify the Report to be rendered, the destination to place the rendered reports, the parameters to use for the rendered report, and the file type of the rendered report. To that end, we will add 4 additional columns to the list, as shown below.
You will also note that the “Title” field has been renamed to “Subscription” on this list. This is purely cosmetic. Three of the new fields are simply single line text fields, while the Format field is a choice field. In our example, the options available for the choice field are WORDOPENXML, PDF, EXCELOPENXML, IMAGE, and NULL. You can allow any of the possible output types that Reporting Services supports. I have outlined these types previously in another post here.
While it is outside the scope of this article, you will likely want to modify the form to display more user friendly names for the options than “WORDOPENXML”, etc, and automatically calculate the value for the subscription field. InfoPath would be an excellent tool to do this with, and there are other alternatives as well. For our purposes, we will work with the form as is.
Once done, you will want to add a couple of subscriptions. In our case, we’re working on a very simple report as shown below:
The report takes a single parameter, employee name, and renders the report filtered by that parameter. The subscription list item that we’ll create will look something like below:
After adding two subscriptions, our subscription list appears as follows:
When our job runs (defined below) it will iterate through this list and create a corresponding PDF file and Word file in the destination library. Next, we create the SSIS package that will actually do the work.
Step 3 – Create a Reporting Services Web Service Proxy Class
In order to render the Reporting Services reports, we will need to call the Reporting Services web service from an SSIS Script task. To do that, we’ll need a proxy class. Luckily, we can generate one using the WSDL.EXE tool available in the .Net 3.5 SDK. You run the tool with the following options:
Once you have the output file, make note of its location – we’ll use it below when creating the script task in SSIS.
Step 4 – Build the SSIS Package
I’m going to assume that most people reading this have little or no exposure to SSIS, so I’ll try to be as detailed as possible. You’ll need to start SQL Server Data Tools (if you’re using SQL Server 2012) or Business Intelligence Development Studio (for SQL versions prior to 2012).
You may notice that it has a striking resemblance to Visual Studio 2010. That’s because it is VS2010. Select “New Project”, then in the “Business Intelligence” section, select “Integration Services Project”. Give the new project a name and location and click OK.
Once created, we’ll need to create a SharePoint List connection manager. From the Solution explorer, right click on “Connection Managers” and select “New Connection Manager”. Scroll down on the window, select “SPCRED” and click Add.
You will only see SPCRED if you completed Step 1 above. The Connection Manager will then prompt for a name and a set of credentials. Provide the name, and also provide it with an account that has access to the subscription list. If the SSIS service account has access, you can select “Use Credentials of Executing Process”, otherwise provide a service account with access.
We’ll be working within a Data flow task, so drag a Data Flow Task onto the design canvas.
Next, double click on the data flow task, or click on the Data Flow tab, to bring up the Data Flow Task editor. From there, drag a “SharePoint List Source” action onto the canvas. (Note: if the SharePoint List Source does not appear, there may have been a problem installing it. Consult the documentation for the SharePoint List Source and Destination project for troubleshooting steps.) Double click on the List Source action to configure it. The first item to configure is the Connection Manager. Simply select it from the (hidden!) drop down list by clicking on the area beside “SharePoint Credential Conn…” to reveal the dropdown.
Next, click on the “Component Properties” tab. Here, you perform the bulk of the action configuration. There are many options to choose from, but the ones that we’re concerned with here are SiteUrl and SiteListName. SiteUrl is the absolute URL of the site that contains our list, and SiteListName is the display name of the list. I stress display name because this differs from most other SharePoint APIs, which tend to use the internal name. Also, it’s relatively easy for users to change the display name of a list, and doing so will break the package until it is reconfigured.
Next, drag a Script Component onto the canvas, below the data source. If prompted, choose “Transformation” for the script type. Next, connect the two actions by dragging the arrow from the SharePoint List Source to the Script Component.
Next, double click on the script component to bring up the script component editor. From the left, select Input Columns and select all of the columns to use in this script. In our case, we’ll be working with the columns shown below:
Next, click on the Script section, choose the language that you want to work with, then click the “Edit Script” button.
Without getting into too much detail of how the script action works, what we are going to do is to add code that will run for each row of data that flows through the transformation. In our case, that will be for each configuration item. We’re going to use the values of the columns of each configuration item to render the reports. Therefore, the code that we will write will go into the “Input0_ProcessInputRow” sub.
Before we can do that however, we need to add some supporting items. Firstly, since we’ll be working with web services, we’ll need to reference the .Net System.Web.Services library. Right click on the project name in solution explorer, and select Add Reference. From the .Net tab, select System.Web.Services, and click OK.
Next, expand the “Imports” section and import the System.IO and System.Net namespaces.
We now need to add our Reporting Services proxy class. The best way to do this is to first create a new class. Right click on the project in solution explorer, and select Add – Class.
Next, give the class a name. I like to match the name to the main class embedded, so the new name is ReportExecutionService.vb. Next, using Notepad, open the file that you created or downloaded in Step 3 above. Select all text, copy it into the clipboard, and then paste it into the newly created class, overwriting anything already there. Once done, save and close the class.
Next, I add a helper function to the script that deals with URLs missing a trailing slash. You can add it immediately above the “Input0_ProcessInputRow” sub. The code is below:
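The idea is simple enough to sketch language-independently; here is the equivalent logic in Python (the function name is my own, not from the original script):

```python
def ensure_trailing_slash(url: str) -> str:
    # Append a trailing slash only when the URL lacks one, so that
    # a site URL and a file name can be concatenated safely.
    return url if url.endswith("/") else url + "/"
```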
As we saw above, the output format values aren’t the friendliest, and we will need to specify the extension for the output file. To handle this, I also wrote a small helper function to turn output format values into file extensions. This also needs to be added to the script.
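As a sketch of that mapping (in Python rather than the VB of the script itself), covering the formats offered by our choice field:

```python
# Map SSRS render format names to file extensions.
FORMAT_EXTENSIONS = {
    "WORDOPENXML": ".docx",
    "EXCELOPENXML": ".xlsx",
    "PDF": ".pdf",
    "IMAGE": ".tif",  # the IMAGE renderer emits TIFF by default
    "NULL": "",       # the NULL renderer produces no output file
}

def format_to_extension(fmt: str) -> str:
    # Unknown formats fall back to no extension rather than failing.
    return FORMAT_EXTENSIONS.get(fmt.upper(), "")
```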
r = m_WC.UploadData(destinationUrl, "PUT", report)
Catch ex As Exception
Again, without getting into too much detail, some explanation of the above code is in order.
Lines 3-5 initialize the web service, assign it a URL (Don’t forget to change this for your environment!!) and assign it the credentials to use when calling the web service.
When this sub is called by SSIS, it is passed a row object. The row object contains column objects for each column that is used by the script (this was configured above). Therefore, to get the value of any given column, you simply refer to it as row.ColumnName. In our case, to get the value of the Parameters column, you use row.Parameters. Lines 9 through 17 get the value of the Parameters column, split the value into an array of string objects using a semicolon as a value delimiter, then, for each of these objects, separate them into name/value pairs using the equals sign as a delimiter, and then finally assign them to a Reporting Services parameter collection.
Using this approach, we can use a single field to store all of the parameters for a report, and any report can have any number of parameters.
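To make the parameter scheme concrete, here is the same parsing logic sketched in Python (the function name is mine):

```python
def parse_parameters(raw: str):
    """Split a 'Name1=Value1;Name2=Value2' string into (name, value)
    pairs, mirroring the semicolon and equals-sign delimiters used by
    the subscription list's Parameters field."""
    pairs = []
    for item in raw.split(";"):
        if not item.strip():
            continue  # tolerate a trailing semicolon or empty segment
        name, _, value = item.partition("=")
        pairs.append((name.strip(), value.strip()))
    return pairs
```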
Lines 19-32 are primarily used for initialization. Line 33 loads the report specified in the subscription (by calling row.ReportURL). Line 34, sets the parameters, and lines 35-36 set the destination variables.
Finally, Line 40 calls the web service to actually render the report into a byte stream, and line 43 uses the .Net WebClient object to upload the file directly into SharePoint. In this example, we don’t actually add any metadata to the SharePoint library, but if this was required, you could use the techniques outlined in this post. We are now ready to test the process.
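For illustration, the upload step can be expressed with the Python standard library; the destination URL below is a made-up example, and real use would also need to attach credentials (which is what the .Net WebClient handles in the script):

```python
import urllib.request

def build_report_upload(destination_url: str, report_bytes: bytes) -> urllib.request.Request:
    # Equivalent of WebClient.UploadData(destinationUrl, "PUT", report):
    # a single HTTP PUT writes the rendered bytes into the document library.
    return urllib.request.Request(destination_url, data=report_bytes, method="PUT")

req = build_report_upload(
    "http://sharepoint.example/sites/reports/ReportLibrary/Employee%20Report.pdf",
    b"%PDF-1.4 ...",
)
# urllib.request.urlopen(req) would send it once authentication is wired up.
```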
Step 5 – Run the Package
Close the Script editor window and click the OK button. If all is well, your Script Component action should show no errors. When ready, click the run button to test your package. If all is well, after a short compilation period, you should see that 2 records were successfully read from the subscription list, and both steps should show green. If things don’t go well, the error messages are pretty good.
Navigating to the destination library, we see the two requested reports.
Obviously, every time this package runs, the reports will be overwritten. This may be the desired behaviour, but if not, you may want to turn on version control in the library (each run will then be stored as a new version) or modify the script to change the file name on each run (date stamping is a common technique).
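A date-stamped file name is easy to produce; here is a minimal Python sketch of the technique (names are mine):

```python
from datetime import datetime

def stamped_file_name(base: str, extension: str, when: datetime) -> str:
    # "Employee Report" + ".pdf" on 2012-04-20 -> "Employee Report 2012-04-20.pdf",
    # so each run writes a new file instead of overwriting the last one.
    return f"{base} {when:%Y-%m-%d}{extension}"
```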
In addition, you will want the package to run automatically without human intervention. To do this, you’ll want to deploy it to a SQL Server running SSIS, and schedule it to run as an Agent job. There is a wealth of information online on how to do that.
The example provided above covers a single use case, but with minor adjustment could be used to automate all sorts of reporting tasks. A common one would be to use the NULL renderer to refresh report caches on a server. If you find any unique uses of this approach, I would love to hear about it. Please post a comment!