
SQL Server Data Tools, BIDS, Visual Studio – What Do You Need?

The Business Intelligence design components of SQL Server have an identity crisis.

What I’m talking about are the tools that designers use to create BI objects in the SQL Server Business Intelligence stack, specifically Analysis Services (SSAS) OLAP cubes and tabular models, Integration Services (SSIS) ETL packages, and Reporting Services (SSRS) reports. These tools have always been bundled into a single product as part of the SQL Server distribution. The original incarnation was called Business Intelligence Development Studio, or just BIDS. It was introduced with SQL Server 2005, and was included on the SQL distribution media as an optional install component.

BIDS was a distribution of the Visual Studio shell plus a set of bundled project types for creating cubes, SSIS packages and SSRS reports. The original version was based on Visual Studio 2005, and subsequent releases of SQL Server stayed in step with more recent versions of Visual Studio. The projects were tied tightly to the Visual Studio versions: if, for example, you already had Visual Studio 2010 set up, installing BIDS would still install a separate Visual Studio to support the projects. BIDS maintained its own identity. Installing BIDS is straightforward: you simply run setup from the SQL Server media and select “Business Intelligence Development Studio”.

When SQL Server 2012 was released, a change was introduced. In addition to gaining the projects required for SSAS tabular models, the tools also received a new name: henceforth they were to be known as SQL Server Data Tools (SSDT). As expected, the requisite level of Visual Studio was incremented, and SSDT was based on Visual Studio 2010. The installation experience didn’t change fundamentally; the installation option just took on the new name.

So far so good. Products get renamed all of the time. Simply substitute SSDT for BIDS, and everything is pretty much as it once was.

SQL Server 2014 is the latest SQL Server version, and it introduced another major change: Data Tools is no longer available on the SQL Server installation media.

This is where it gets very confusing. It was always possible to download BIDS or SSDT directly from Microsoft. These products don’t require a license to deploy and use, so they were freely available. This is also true with SQL 2014, but now a download is the only way to get the tools. That’s simple enough. However, if you search for SQL Server Data Tools and download the version for either Visual Studio 2012 or Visual Studio 2013, you won’t find the projects that you were looking for. You’ll instead find projects for deploying databases and DACPACs. So what’s going on here?

As it turns out, a separate Microsoft team put out a separate set of Visual Studio project templates in the SQL Server 2012 timeframe that were also called SQL Server Data Tools. Apparently there is a finite quantity of names.

The product that we originally knew as BIDS, and then as SSDT, was renamed once again for the SQL Server 2014 wave of products. Well, only sort of renamed. It’s now called SQL Server Data Tools – BI. Apparently this was intended to avoid confusion…. SSDT-BI is available for either Visual Studio 2012 or Visual Studio 2013. If you don’t already have Visual Studio installed, it will install a VS shell for you.

The summary of all this is to say that if you want to build BI projects for SQL Server, you’ll need to have the right tooling for your target server, and that tooling is as follows:

Server                        | Toolset Name                              | Location
SQL Server 2008 R2 and below  | Business Intelligence Development Studio  | On SQL media
SQL Server 2012 and 2012 SP1  | SQL Server Data Tools                     | On SQL media
SQL Server 2014               | SQL Server Data Tools – BI                | Download for Visual Studio 2012 or Visual Studio 2013

The complete current Data Tools product set is laid out and can be downloaded from here. The Data Tools team blog is here.

I hope that this helps to clear up some confusion. I can’t wait to see what they have in store for us in V.Next….


License Requirements for SQL Server Reporting Services in SharePoint Integrated Mode

As I am keen to point out at every opportunity, when installing SQL Server Reporting Services in SharePoint Integrated mode, it is important to ensure that it is installed on a SharePoint server, and not on a SQL Server. It’s a bit counter-intuitive, because the installation files are on the SQL Server media, and not the SharePoint media. This causes confusion in a number of areas, but one question that I hear a lot is in the realm of licensing.

Technically, using any SQL Server component on any server requires a full SQL Server license for that component on each server. Running SQL Server Analysis Services and the SQL Server database engine on two different machines requires two licenses. Seen that way, putting the SSRS service on a SharePoint server would require a second SQL Server license, which can be an expensive proposition. It also doesn’t make much sense, because it promotes bad design. Luckily, the SSRS service application is specifically exempted from additional licensing requirements. The following is taken directly from the Microsoft SQL Server 2012 SP1 license document. Section 2.5 states:

2.5    Running Instances of the Additional Software.

You may run or otherwise use any number of instances of additional software listed below in physical or virtual operating system environments on any number of devices. You may use additional software only with the server software directly, or indirectly through other additional software.

  • Business Intelligence Development Studio
  • Client Tools Backward Compatibility
  • Client Tools Connectivity
  • Client Tools SDK
  • Data Quality Client
  • Data Quality Services
  • Distributed Replay Client
  • Distributed Replay Controller
  • Management Tools – Basic
  • Management Tools – Complete
  • Reporting Services – SharePoint
  • Reporting Services Add-in for SharePoint Products
  • Master Data Services
  • Sync Framework
  • SQL Client Connectivity SDK
  • SQL Server 2012 Books Online

Notice that little bullet point “Reporting Services – SharePoint”? That’s the service application. Put simply, this means that in order to use SSRS in SharePoint Integrated mode, you need at least one licensed SQL Server, but you don’t need to run SSRS on the SQL machine. In fact, given the phrase “any number of instances”, you can run it on as many SharePoint servers as you wish to take advantage of load balancing, without incurring any additional SQL Server licensing cost.


Schedule Data Refresh for SSAS Connected Excel Workbooks with PowerPivot for SharePoint

Using Excel Services, SharePoint users have been able to share workbooks that are connected to back-end data since SharePoint 2007. Typically, the connection is made to SQL Server or to Analysis Services, although a wide variety of sources are available. It’s also possible to publish individual components from these workbooks anywhere within the site collection through the Excel Web Access web part. Users can navigate to a dashboard page that contains all sorts of elements, including an Excel chart that is connected to back-end data. Well, to be precise, it was connected to back-end data the last time the workbook was saved. The workbook itself can be refreshed, but only manually.

When you open an Excel workbook in a browser through Excel Services, by default you’ll see the visualizations and any stored data precisely as they were when the workbook was last saved. If you need to see more up-to-date data, you can select “Refresh Connections”. If (and sometimes that’s a big if) the server and connections are set up properly, the server will fetch updated data and update the workbook.

This works well enough, but the problem is that when you, or anyone else, opens the workbook again, they’ll still see the old version of the workbook, and will need to manually refresh the data again. In addition, any visualizations published elsewhere on a dashboard will also continue to show old data unless manually refreshed. If the amount of data is significant, this poses a serious performance issue for the server(s). There’s also a significant usability impact, in that it’s a pretty big ask to have end users constantly hitting a refresh button.

To get around this issue, one option is to set the refresh options in the data connections of the workbook. Excel Services respects these options. There are two settings that we need to be aware of: periodic refresh, and refresh on open. Connection properties can be accessed within the Excel client by selecting the Data tab, choosing Connections, then highlighting the connection in question and selecting Properties.

Periodic refresh will allow the workbook to be automatically refreshed in the background while it is opened in the browser. This can be useful when the source data is changing frequently. Refresh on opening will have the greatest impact in our scenario, as it will automatically refresh the data in the workbook whenever the file is opened. This will also work with published objects (Excel Web Access web parts) – every time that the web part is opened, the data will be automatically refreshed. This solves the usability problem above because the user no longer needs to manually update the data. However, it does not affect the server load problem.
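
For the curious, these settings live in the workbook package itself: the connection definitions are stored in the xl/connections.xml part, where “Refresh data when opening the file” corresponds to the refreshOnLoad attribute. As a rough sketch only (standard library Python, a hypothetical file name, and no guarantee your workbooks are structured identically), you could flip that flag in bulk like this:

```python
# Sketch: set refreshOnLoad="1" on every connection in a workbook by
# rewriting the xl/connections.xml part of the .xlsx package.
# Assumes Python 3.8+; "dashboard.xlsx" is a hypothetical file name.
import zipfile
import xml.etree.ElementTree as ET

NS = "http://schemas.openxmlformats.org/spreadsheetml/2006/main"
ET.register_namespace("", NS)  # keep the default namespace on re-serialization

src, dst = "dashboard.xlsx", "dashboard-refresh-on-open.xlsx"

with zipfile.ZipFile(src) as zin, \
     zipfile.ZipFile(dst, "w", zipfile.ZIP_DEFLATED) as zout:
    for item in zin.infolist():
        data = zin.read(item.filename)
        if item.filename == "xl/connections.xml":
            root = ET.fromstring(data)
            for conn in root.findall(f"{{{NS}}}connection"):
                # Equivalent to ticking "Refresh data when opening the file"
                conn.set("refreshOnLoad", "1")
            data = ET.tostring(root, xml_declaration=True, encoding="UTF-8")
        zout.writestr(item, data)  # copy every other part through unchanged
```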

Because the data and visualizations retain the state that they had when the workbook was last saved, this also affects search. When the search indexer runs, it will only index the data that is saved in the workbook; it has no means of refreshing the data. Finally, in addition to the load imposed on the servers by constant refreshes, if the quantity of data being refreshed is large, users can experience significant lag when loading the file. This obviously introduces another usability problem. While the refresh options in Excel are helpful, they don’t fully solve the problem. What is needed is a way to automatically open the file for editing, refresh the data, and resave it to SharePoint.

If you have ever used PowerPivot for SharePoint, you know that it can do exactly that. PowerPivot for SharePoint contains two primary elements: a specialized instance of SQL Server Analysis Services that allows users to interact with workbooks that contain embedded PowerPivot models, and a SharePoint service application that, among other things, keeps those embedded models refreshed. Using the PowerPivot Gallery (enabled when PowerPivot for SharePoint is installed), you can configure a workbook’s refresh options by clicking on the icon in the Gallery view, or by selecting “Manage PowerPivot Data Refresh” in the simple All Documents view.

Data Refresh options in PowerPivot Gallery View

Data Refresh options in All Documents View

Once configured, the PowerPivot for SharePoint service will refresh the data model in the workbook on a periodic basis (at most once per day). The service essentially opens the workbook in edit mode, refreshes all of the data connections, and saves the workbook back to the library. If versioning is enabled, it is saved as a new version. Unfortunately, if you’re not using a PowerPivot data model, the options are unavailable: in Gallery view the icons simply don’t work, and while the menu option is still present in the All Documents view, selecting it results in an error.

On the surface, it would seem that using workbooks with PowerPivot is the only option for keeping large volumes of back-end data up to date in Excel visualizations. However, there is a small loophole that you can take advantage of.

The refresh function in PowerPivot for SharePoint refreshes all of the connections in a workbook. While the option is unavailable when the workbook has no embedded PowerPivot model, when it does have one, the service refreshes ALL of the data connections in the workbook, whether they connect to the model, a back-end SSAS server, SQL Server, or whatever. Therefore, if you want to keep your connected data refreshed, the solution is to add a dummy PowerPivot model to your workbook.

Simply open up the PowerPivot window, import a small amount of data from an external source, and save it. Once saved, the PowerPivot refresh options will appear, and you’ll be able to schedule data refresh for your workbook. You can even deselect the refresh of the source data for your dummy model, and the other connections will work just fine.
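
If you have a library full of workbooks and want to check which ones actually contain an embedded model (and will therefore expose the refresh options), the model is stored as a part inside the .xlsx package. Here is a quick sketch, with the caveat that the xl/model/ path is my assumption based on how Excel 2013 packages the Data Model, so verify it against your own files:

```python
# Sketch: test whether a workbook carries an embedded PowerPivot/Data Model
# part. Assumption: Excel 2013-era packages store the model under xl/model/
# (e.g. xl/model/item.data); "dashboard.xlsx" is a hypothetical file name.
import zipfile

def has_powerpivot_model(path: str) -> bool:
    with zipfile.ZipFile(path) as z:
        return any(name.startswith("xl/model/") for name in z.namelist())

print(has_powerpivot_model("dashboard.xlsx"))
```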

Once your workbooks are being updated automatically, your users will be presented with up-to-date data on load with no delays, all dashboard visualizations will be current and quick to render, and the visible data will be picked up by your search crawler. All will be well with the world.


The New Power BI – Now With Enterprise!

Yesterday Microsoft announced the next step in the evolution of Power BI. It’s getting quite a bit of attention, and rightly so, for its aim of bringing Business Intelligence closer to users. Democratizing BI has always proved a challenge – it’s been the realm of the gurus in the white coats who hold the keys to the data. Microsoft is aiming to accomplish this democratization through a combination of user focus and, as of yesterday, a drastic change in its pricing model. Power BI just went from about $40 per user per month to free, or $9.99 per user per month for advanced capabilities. That’s quite a drop, and arguably the biggest announcement from yesterday – it will have a massive impact. The detailed price breakdown can be found here.

However, all of the focus on personal BI is, in my opinion, missing a key component. Power BI and its components have always focused squarely on personal and team BI solutions – that is to say, the ability for a power user to model data, visualize it quickly and easily, and share it with fellow team members. While that capability is certainly retained in the new Power BI, this new version contains the first appearance of enterprise-grade cloud BI from Microsoft.

To fully understand this, it’s necessary to touch on the Microsoft BI stack as it stands today.

Microsoft BI On Premises

The On-Premises BI story from Microsoft may be confusing, and occasionally difficult to understand, but it is very powerful, and relatively complete. In a nutshell, the story is good from a personal, team and enterprise perspective.

On the enterprise side, there are products from both the SQL Server team and the Office team. Data warehousing is served by SQL Server, and ETL duties fall to SQL Server Integration Services (SSIS). Analytical storage is served by SQL Server Analysis Services in both multidimensional (OLAP) and tabular modes, and reporting is performed by SQL Server Reporting Services (SSRS). The SQL product line doesn’t have much on the client side for analysis apart from SSRS, but this slack is taken up by the analysis tools available in Excel, and through PerformancePoint Services in SharePoint.

Indeed, SharePoint also provides a platform for SSRS via SSRS SharePoint mode, and for Excel based analytical workbooks connected to SQL Server and to SSAS through Excel Services.

On the personal BI side, that role has traditionally fallen to Excel. The pitfalls of importing data into Excel workbooks for analysis are well documented and don’t need to be discussed here, but the bulk of those issues were addressed with the introduction of PowerPivot several years ago. PowerPivot allows massive amounts of data to be cached within the Excel file for analysis without any data integrity concerns. The addition in recent years of analytic visuals (Power View, Power Map) and ETL capabilities (Power Query) has further rounded out the offering.

Taking that Excel workbook and sharing it brings us into the realm of Team BI. This is to say that the analyses are relatively modest in size, and of interest to a targeted group. These models may not require the rigour or reliability associated with enterprise BI models. Once again, the technology involved here is SharePoint. A user can take a workbook with an embedded PowerPivot model, share it through a SharePoint library, and other users can interact with that embedded model using only a browser. This capability requires PowerPivot for SharePoint, which is really a specialized version of SSAS, along with a SharePoint service application.

One thing to note about these seemingly disparate approaches is that a power user can build a Power Pivot data model with Excel, share it to a team via SharePoint, and when it requires sufficient rigour or management, it can be “upgraded” into SSAS in tabular mode. This common model approach is powerful, and is key to understanding Microsoft’s entire BI strategy. You can also see here that SharePoint straddles the two worlds of team and enterprise BI.

Moving to the cloud

The BI workload is one of the last Microsoft workloads to move to the cloud, and with good reason. Massive amounts of data present problems of scale, and security or data sovereignty concerns tend to keep data on premises. However, there is a very real need to provide BI to users outside of the firewall.

SharePoint is the hub of BI on-prem, so it’s logical to assume that, with SharePoint Online, it could continue to perform that function in the cloud. The big catch here is that on-prem, SharePoint is simply the display platform: in the enterprise scenario, users connect through SharePoint to the back-end servers. This isn’t an option in the cloud, so enterprise BI was left off the table.

With the personal and team BI scenarios, data is cached in a PowerPivot data model, which could be supported in the cloud. When Office 365 moved SharePoint Online to the SharePoint 2013 code base, rudimentary support for embedded PowerPivot models was indeed added – essentially a “light” version of PowerPivot for SharePoint. I call it light for two major reasons. Firstly, data models could be no larger than 10 MB. Secondly, there was no way to update the data contained within the PowerPivot cache short of re-uploading the Excel workbook. This is still true without a Power BI license. The inability to refresh the data renders team BI almost useless, except in static data scenarios.

The first generation of Power BI changed all of that. With a Power BI license, it was possible to install a Data Management Gateway on premises that would connect to team BI workbooks in Office 365 and update them on a scheduled basis. Yes, the gateway had many limitations (many of which have been removed over time), but finally, the on-prem refresh story was solved. In addition, the model size limit was increased to 250 MB. However, we were still left with a number of problems or limitations.

  1. Daily data refresh schedule. Automatic data refreshes could be daily at their most frequent; manual refreshes could be done anytime.
  2. Capacity. The maximum size of a data model was increased to 250 MB, which is relatively small for enterprise scenarios. In addition, refreshes aren’t differential, which means that the entire model is re-uploaded on every refresh.
  3. Data sensitivity/sovereignty. The refresh problem was solved, but because the data is still cached in the workbooks, there can be reluctance to sending it outside of the corporate firewall.
  4. Per-user security. PowerPivot data models have no concept of per-user security in a workbook (tabular models in SSAS do); security is at the workbook level.
  5. Cost. The initial cost of Power BI was $40 per user per month, and a Power BI license was required to interact with any workbook that had a data model larger than 10 MB. Considering that a full Office 365 E3 license was around $25 per user per month, this price tended to limit the audience for sharing.

All of this is to say that Power BI in its first (and as yet current) incarnation is suitable for personal and team BI only. There has been no enterprise cloud BI story.

Power BI V2

The announcements yesterday outlined the next generation of Power BI. Going forward, Power BI will be available as a standalone offering, at the price points mentioned above. Office 365 users will continue to be able to use it from Office 365, but Office 365 will no longer be required. In its early days, Power BI was a SharePoint app, but a careful examination of URLs in the current offering quickly reveals that it’s actually two apps at present, both running on Azure (not in SharePoint).

If you’ve signed up for the new Power BI preview, you may notice that the URL is http://app.powerbi.com/…… so this move isn’t a big surprise.

With the new model, Excel is no longer the central container. Users connect to data and publish it directly to Power BI. Behind the scenes, the service is doing something very similar to what it does with PowerPivot models – it’s storing them in SSAS. In fact, the same limits still apply – 250 MB per model (at least for now). Excel can still be used, but now as a data source.

Visualizations are performed through Power View, and data is acquired through Power Query. These are no longer Excel add-ons, but are available on their own through the Power BI Designer. This decoupling is good news for those that have not made an investment in SharePoint Online or Excel.

These changes to the architecture and the cost are great news for adoption, but they don’t address the needs of the enterprise. Except for one thing: the SSAS connector.


One of the data sources available to the new Power BI is the SSAS connector. This connector is a piece of code that runs on premises (it actually includes the Data Management Gateway) and acts as a bridge between the Power BI service and an on-prem SSAS server.

The biggest distinction worth noting is that with the gateway, data is NOT uploaded to the service; it remains on-prem. When a user interacts with a visualization from the cloud, a query is sent to the SSAS server through the gateway. That query is run, its results are sent back to the user’s visualization, and the data is not persisted in the cloud.

In addition, when the query is sent to the SSAS server, it is run with the permissions of the user making the request. This is accomplished through the EffectiveUserName feature in SSAS. This provides full user-level security, and since tabular models in SSAS support per-user security, we no longer need to rely on proxy accounts or document-level security.
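
To make that concrete, here’s a minimal sketch of the same mechanism as any middle tier would use it. It assumes the pyadomd package (a Python wrapper for the ADOMD.NET client, which must be installed on the machine), a tabular instance, and hypothetical server, database and column names. Note that the account opening the connection needs administrative rights on the SSAS instance for EffectiveUserName to be honored:

```python
# Sketch: query SSAS while impersonating an end user via EffectiveUserName.
# Assumes the pyadomd package and the ADOMD.NET client libraries; the server,
# catalog, user and DAX query below are all hypothetical.
from pyadomd import Pyadomd

conn_str = (
    "Provider=MSOLAP;"
    "Data Source=ssas-server;"            # hypothetical SSAS instance
    "Catalog=SalesTabular;"               # hypothetical tabular database
    "EffectiveUserName=CONTOSO\\jsmith;"  # SSAS evaluates roles as this user
)

dax = "EVALUATE VALUES(Customer[Name])"   # hypothetical model column

with Pyadomd(conn_str) as conn:
    with conn.cursor().execute(dax) as cur:
        for row in cur.fetchall():
            print(row)
```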

Finally, because the data is being stored in an on prem SSAS server, it can be refreshed automatically as often as desired. For the same reason, we have no capacity limits – you can grow your own SSAS servers as large as you like.

The SSAS connector removes most of the limitations that prevent cloud based enterprise Business Intelligence, and the new pricing model removes the rest. Certainly there are going to be feature limits in the near term, but it appears to me at least that the back of this thorny problem has finally been cracked.


Error 1603 Installing Reporting Services Add-In onto a New SharePoint Server

I had an interesting (annoying) error this week at a customer site. I was adding a new server to an existing SharePoint farm. This was a relatively new farm, installed a couple of months previously, and it has SQL Server Reporting Services (SSRS) installed. The new server wasn’t running the SSRS service application, but as with any web front end (WFE) server, it needed the SSRS SharePoint add-in installed.

Although the SQL Server Reporting Services add-in is included as part of the SharePoint Server prerequisites install, it’s an older version. Since in this case I was using SSRS 2012 SP1, I needed to install it manually. However, attempting to do so resulted in a rather nasty failure, error 1603, and the add-in simply refused to install. I hadn’t seen this one before.

Poking around a bit, I found that you can get this error when the version of the bits installed on a SharePoint server doesn’t match the version that the farm has been configured with. This is the state of a server after a Cumulative Update or Service Pack has been installed, but PSConfig (the Products Configuration Wizard) has not yet been run. I then ran the configuration wizard on the new server, and it did in fact indicate that an upgrade was needed. This was a bit confusing, as I had only just installed the SharePoint bits on this machine. How did this happen?

Well, as has been my practice for some time, whenever I install the SharePoint bits on a server, I immediately run Windows Update to make sure that everything is nice and new. It’s a new server, what’s the worst that could happen, right? Wrong. As I was aware (but had forgotten), in September Microsoft started delivering SharePoint updates along with regular Windows updates. This, in my opinion, is not a good thing, and is why, as pointed out by Todd Klindt, you should no longer enable automatic updates on your SharePoint servers. Apparently you should be careful with manual updates as well.

My little Windows update had snuck in some SharePoint updates, causing the upgrade requirement and the problem with the SSRS add-in. In addition, since this was a multi-server environment, it put this server out of step with the other servers. After bringing the rest of the servers up to the latest update level and running PSConfig on all of them, I was able to install the SSRS add-in on the new server. All was right with the world again.
