
Month: November 2013

Changes to the Power BI Data Management Gateway – Nov 2013

Several of the Power BI preview components were updated last week, most notably Power Query and the administration app in BI sites. Without much fanfare, the Data Management Gateway (DMG) was updated as well. There is a fair bit of documentation on what’s new in Power Query, and I’ve added my own thoughts here. There’s also some documentation outlining the changes to the admin application here. However, although I looked, I haven’t found anything for the DMG.

I therefore decided to poke around a bit myself, and discovered a few things. This should obviously not be taken as a comprehensive list, and if I find anything else, I’ll update it. What follows are the changes that I’ve observed.

Performance Improvements With Azure data sources

In my article on working with the Data Management Gateway, I observed that refreshing a relatively large model (1 million rows, 20 columns) required upwards of 10 minutes. After updating my version of the DMG, the refresh required only about 1.5 minutes. I am told that performance was a focus for the team, and that focus appears to have paid off.

No more Gateway Limits

If you set up an on-premises data source, you can expose it as an OData feed and consume it with Power Query. In early testing, users would bump into a limit with large data sets – as soon as the data stream hit 100 MB, an error would result. This was due to a built-in limit in the gateway. In my current testing, that limit appears to have been removed entirely in this version.
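For reference, consuming one of these published feeds from Power Query comes down to pointing at the feed URL. The following is a minimal sketch in the Power Query formula language; the URL and the table name are placeholders for whatever the admin center assigns to your data source.

let
    // Placeholder URL – use the OData feed URL shown for your data source in the Power BI admin center
    Source = OData.Feed("https://contoso.example.com/odata/OnPremSales"),
    // The feed is exposed as a navigation table; pick the table you want (the name here is illustrative)
    Orders = Source{[Name="Orders"]}[Data]
in
    Orders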

Oracle support

Until now, the only data refresh support has been for SQL Server. In this version of the gateway, Oracle support has been added. I haven’t tested it, as I don’t have an Oracle instance handy, but it’s there.

[Screenshot: the new Oracle data source option]

Credential storage and use

When creating a data connection, it is necessary to input credentials. These credentials are used to connect to the original data source when the model is refreshed. Due to security constraints, Microsoft cannot “see” the credentials, and they are stored locally. With this version of the gateway, there is now an option to save the (encrypted) credentials in the cloud. This helps with restoring gateways. 

System Health OData Feed

In the admin center, it’s possible to get quick visibility into the performance of all gateway machines.

[Screenshot: gateway performance overview in the admin center]

With the November 2013 update, you can also get this performance information via an OData feed, perform your own analysis, and use tools like Power View to visualize it.
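As a rough sketch, pulling that feed into Power Query for your own analysis might look something like this (the URL is a placeholder for the feed address shown in the admin center, and the column names are purely illustrative):

let
    // Placeholder URL – copy the actual System Health feed URL from the admin center
    Source = OData.Feed("https://contoso.example.com/odata/GatewayHealth"),
    // Take the first table exposed by the feed
    Health = Source{0}[Data],
    // Keep only the columns of interest for a quick Power View visualization
    Trimmed = Table.SelectColumns(Health, {"GatewayName", "CpuUsage", "MemoryUsage", "Timestamp"})
in
    Trimmed

From there, the result can be loaded into the data model and visualized with Power View.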

[Screenshot: gateway health data retrieved via the OData feed]

This is what I’ve uncovered so far. I’ll update this post if I learn of any more.


New Clarity on SharePoint and Office Release Cadence

For the past few years, Microsoft has been talking about cloud computing as it continues to transition itself into a devices and services company. As part of this transition, there has been a strong push from within to increase its release cadence, shortening the time between major releases of its products. In fact, we’ve seen the result of this in Office 365, with new features coming to the product every 3-4 months or so. As a side note, the place to keep track of the changes is the Office 365 technology blog.

What has been less clear is how this impacts the traditional on-premises versions of the products. Being a part of the SharePoint community, I’m obviously concerned with staying on top of developments with it, and from some of the messaging from Microsoft, it appeared that we were looking at minor releases on a quarterly basis, with major releases coming annually.

While the idea of this is very interesting from a technical standpoint, it causes some very real concern for IT departments that don’t want to be retooling every year. The good news is that these concerns are unfounded. This morning, at a session for the Office group MVPs, Julia White, a General Manager in the Office division, cleared this up for us and succinctly articulated the release strategy moving forward. (She specifically stated that this information was not NDA and could be freely disseminated.)

While the cloud services in Office 365 will continue to receive updates every few months or so, the on-premises versions of the components will continue to be updated every 2-3 years, with patches released in the form of incremental updates every two months. Essentially, it’s business as usual for on-premises deployments.

In my opinion, this is a solid approach. Those wanting access to the latest and greatest can turn to Office 365, or use it in a hybrid scenario, and IT departments will be able to keep pace as well. This “cloud first” approach benefits everyone. New features don’t need to wait for a major release to see the light of day, early adopters can get access to the latest and greatest in a speedier fashion, and when major releases are rolled out, they will have benefitted from the fact that the constituent features have been proven in the field.


Conditionally Starting a Workflow with the Nintex Workflow Filter Action

For quite some time now, it has been possible to start workflows conditionally with Nintex Workflow. Standard options for starting a workflow are manual start, run when items are created, and run when items are edited. For the latter two, it’s possible to set conditions for the triggers, which is to say “only run when these conditions are satisfied”. This is a great feature, and works in most cases, but I’ve run across cases when it is problematic.

In some cases, if the list item encounters multiple simultaneous edits, the underlying SharePoint engine may back off and place the workflow into a “Starting” state for a period of time (usually until the next timer job run). This may be OK, but in other cases the lag is unacceptable. Another problem is that “these conditions” are restricted to the values available on the item that the workflow is running on. There’s no way to interrogate system values, time values, database values, etc. It may be necessary to perform a few actions before deciding not to run further.

Whenever I’ve encountered these conditions, I swap out the conditional start capability for a filter action. I can’t say that I much like the name of the action (it’s not very descriptive), but it’s quite useful. In essence, what it does is to evaluate a condition, and if the condition fails, the workflow is ended. It’s almost like it never ran at all, which is the effect that we’re trying to achieve.

If I’m just trying to get past the “stuck on Starting” problem, I’ll add a filter action (found in the logic and flow section) to the beginning of the workflow.

[Screenshot: a filter action at the beginning of the workflow]

Configuring it is straightforward. Simply enter your conditions in the same manner as you would a “Run If” action. You can leverage all of your Nintex workflow design elements – variables, list lookups, context items, constants or user profiles.

[Screenshot: the filter action configuration]

In the above, a simple item value is interrogated: if its value is greater than zero, the workflow continues; otherwise, it ends. That’s it.

Another nice feature of the filter action is that it can be placed anywhere in the workflow, not just at the beginning, which allows you to conditionally run only part of the workflow. In many cases there are also other ways to accomplish this, but this is quick and easy.

If the standard conditional start options work for you, then by all means continue to use them. However, if you start to run into concurrency problems, or your requirements grow in complexity, you might want to have a look at the filter action. It has worked well for me every time. 


Changes to Data Loading in the November 2013 Power Query Update

Last week, Microsoft released a new version of the Power Query preview for Power BI. It’s a significant update and contains some welcome improvements. I’m not going to list them all out here, as the Power BI team has done so here on their blog. I do, however, want to focus on the changes to the basic data load procedure and how queries are built. The process is easier, and significantly faster, but I still have some concerns, which I will get to.

Most of the changes concern data navigation. One of the major changes is the ability to use T-SQL directly when importing data from SQL Server. This makes the tool significantly more useful when dealing with complex queries, or if you just happen to be handy with T-SQL. The opportunity to use your own query occurs immediately after you select SQL Server as a data source.

[Screenshot: the SQL Server connection dialog with an optional SQL statement]

Once you click OK, you are presented with the new Query editor window. Previously, the navigator and the query were contained in this window, but the “meta” query information, such as the query name and load properties, was displayed along with the data in the worksheet. This led to a fair bit of confusion, and the new arrangement is welcome. You can see the new unified dialog below.

[Screenshot: the new Query editor window]

It’s worth pointing out a few things about this window. Firstly, at the top, we see the new editor ribbon. All of the functions in the ribbon were previously available in context menus, but the ribbon makes them far more discoverable (and therefore easier to use). In the command line, you can see that the native T-SQL query has been transformed into the internal Power Query language, and it can be edited further at this point. On the right of the window, you can see that in addition to the Applied Steps list, you can edit the name of the query and change the load settings. Previously, the load had to be initiated before the settings could be changed, so this is quite welcome. However, I still have some concerns with how this works, and I’d like to expand on them below.
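To give a sense of what appears in the editor, here is a rough sketch of the kind of query that gets generated (the server name, database name and SQL statement are placeholders; transformations added afterwards show up as entries in the Applied Steps list):

let
    // Placeholder server and database – the T-SQL entered in the connection dialog is passed through as a native query
    Source = Sql.Database("MYSERVER", "AdventureWorks",
        [Query="SELECT OrderDate, SubTotal FROM Sales.SalesOrderHeader"]),
    // A subsequent transformation, which appears as a step in the Applied Steps list
    Renamed = Table.RenameColumns(Source, {{"SubTotal", "OrderTotal"}})
in
    Renamed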

My first concern is that I believe the defaults are backwards. Previously (with an older version of Power Query), I discussed the different ways that you can bypass Excel and load data directly into the data model. This is important not simply because of the row limit in Excel (1,048,576 rows), but because of the limit imposed on Excel Services by Office 365 (10 MB file size). Power BI allows for data models up to 250 MB, but only if the data is contained in the model – the 10 MB limit still applies to the Excel workbook content. I discuss this at length in this article. With the current default of loading to the worksheet, users will load data into the worksheet without thinking about it, upload the file to Power BI, and run straight into the file size limit. I can see this being very frustrating, and limiting adoption.

Another option might be to warn the user – “your file size is approaching 10 MB – would you like to load data into the model exclusively?”, or something of that nature.

My other concern is that you no longer necessarily see the Query editor at all when importing data. On the surface, this is a good thing, and if the data load defaults were as I suggest, I would cheer it, but as things stand we run directly into the same problem. Let’s say that I just want to import a lot of data from a single SQL table. From Excel, I simply select the Power Query tab and then “From SQL Server Database”. If I don’t enter a custom query as above and click OK, I won’t see the editor window. Instead, I see the data navigator in the worksheet.

[Screenshot: the data navigator]

One big feature to note here is the checkbox at the top of the navigator, “Select Multiple Items”. With this version of Power Query, you can now select multiple tables at once – something that was previously unavailable. While you can bring up the editor window by selecting the Edit Query button at the bottom, you can also load the data simply by clicking the Load button. This is simple, but doing so loads all of the data directly into Excel. At best, you wind up with a workbook that is likely too big for Power BI (obviously depending on the amount of data), but at worst you bump into the row limit.

[Screenshot: data loaded directly into the worksheet]

Unfortunately, when this happens, it isn’t immediately obvious how to fix the problem – the Edit Query button disappears. You can edit the query by double-clicking the query name and changing the load options. However, users may just assume that they can’t use Power BI to do what they need, which is dead wrong. Again, even a few warning dialogs here would help significantly.

I like this new Power Query. It’s easier, has more features, and is significantly faster than its predecessors. I would like to see it succeed, and I’d hate for misunderstandings or minor UI issues to get in the way of that.


Running a Business in the Cloud – 3 Years In

A couple of years ago, I wrote a post called “Why I love Office 365”. It centered on the reasons that a SharePoint infrastructure consultant might want to outsource their SharePoint operations. After running our business in the cloud with it and a few other services for a few years now, I thought that I would revisit the topic and share a few observations and impressions.

In 2010, when UnlimitedViz became UnlimitedViz again (it’s a long story), we made a conscious decision not to invest in traditional “brick and mortar” infrastructure. Everyone working with us would work from their homes, hotel rooms, or wherever they happened to be. In fact, as I write this, I am sitting on a beach just outside of Playa del Carmen in Mexico (it’s tough, I know). To do that, we would either need to set up our own IT infrastructure at a hosting center, or rely on cloud services. We opted for the latter. At the time, it really was a leap of faith.

In the early days, we used BPOS – the original cloud offering from Microsoft that provided SharePoint, Exchange and Live Meeting. At the time, the SharePoint offering was extremely limited, so we primarily used Exchange for email, and that was about it. Because we’re a SharePoint shop, we also relied on local virtual machines for development and demonstration farms. We used to carry around a huge machine configured with several VMs to recreate a real-world, multi-server infrastructure. To complete the picture, we chose Freshbooks for our time and expense entry service – it has served us well.

Eventually, we moved to Cloudshare for our demonstration environment to support our infrastructure, workflow and BI demonstrations. No more lugging around big hardware. The big concern there is what happens if there is no connectivity, or if there is a problem with the platform. These continue to be concerns, although connection availability has become even better over time. Since hardware failure is also a risk, I consider that and the platform risk to be a wash.

When BPOS became Office 365, it became our core cloud platform. From our vantage point, Exchange continued to work well, and we gained Lync, which we now use every day for instant messaging and conferencing, but the big bonus was that SharePoint Online became a viable option for us. The more recent update to the SharePoint 2013 feature set brought even more value.

In that spirit, I thought that I would list a few of the Office 365 features that we use on a regular basis, and how we use them.

Exchange Online – This is the service that we’ve used the longest: secure, reliable email with server-side spam filtering that you can control. It almost never goes down. Email is probably the service most sensitive to outages, and in the three years that we’ve been using it, we’ve noticed outages twice – only once during business hours, and in both cases it was remedied quickly. It’s a track record that I don’t think many on-premises installations could match.

Lync Online – As mentioned above, we are a small but highly distributed organization. Lync allows everyone to stay in touch and to understand what each other is up to. Lync-to-Lync communication is great, and we constantly use it to ask for assistance or to demonstrate concepts quickly. Our cloud-only implementation prevents us from taking advantage of the call-to-landline capability, and video and voice quality isn’t quite as good as I’ve found with Skype (which I use instead of a landline). Hopefully these things will be addressed with future Skype integration.

We’ve federated Lync with Microsoft, which means that we can communicate just as easily with Microsoft employees, as well as with any other organizations that have federated their Lync environment. Of course, the Skype and MSN users are federated too, so Lync can be used to communicate with them as well.

Document Management – All of our corporate documentation goes into SharePoint. However, even though we always preach good document management practices, when it came to our own documents we weren’t always as disciplined as we should have been, and too many documents were sent via email or stored locally. There were good reasons for this, as it wasn’t always simple to get the documents uploaded into SharePoint. SkyDrive Pro has removed these barriers, making the process as simple as working with the file system. In fact, when your library is synchronized, you are working with the file system, and any third-party tools can work with it seamlessly, whether online or offline.

Team OneNote Notebooks – We have a wide variety of customers, with a wide variety of information to keep track of. This information isn’t well structured, so it’s a perfect problem for OneNote to solve. We have a single library with a single notebook for each customer that is available to all of our consultants. OneNote replicates, so it isn’t necessary to be online, and it’s a great place to store things like VPN client installers, credential information, etc. OneNote is a fantastic place to put down quick thoughts, meeting notes and the like so that they don’t get lost. It allows one consultant to pick up where another left off just by opening the notebook. It really is a handy tool and something my team uses every day.

InfoPath Forms – Although we use Office 365 for our SharePoint needs, not all of our customers do. The bulk of them are still on-premises, and we help them with their infrastructure. When working with a SharePoint farm, there is a lot of information to keep track of – server names, IP addresses, service accounts, running services, etc. To keep all of this organized, we use InfoPath forms. This provides a consistent set of information about our customers’ SharePoint environments that can be used by anyone within the company. In fact, it’s been used more than once to provide information back to our customers when they had lost it.

Demonstration Environment – I mentioned before that we use a mock company infrastructure in Cloudshare for our demonstration environment. As more and more features are enabled in Office 365, we can increasingly move to it for our demonstrations. This is certainly true of information architecture demos, but with the recent release of Nintex Forms for Office 365, our workflow demos are moving there too. At the moment, there are many Business Intelligence features in SharePoint that we can’t use in Office 365, but with the pending release of Power BI, even that gap is narrowing.

Customer Collaboration – Office 365 allows external users to be invited to a SharePoint site at no extra cost. All they need is a Microsoft (Live) or Organizational (Office 365) account. Whenever we need to collaborate with a customer on a project, we can spin up a team site, or an entire site collection, invite all of the team members, whether internal or external, and get going. All of the content is in SharePoint and is therefore searchable (if desired) throughout the Office 365 tenant. We don’t always use this capability, but it comes in very handy when it’s needed.

There are many more features that we use quite regularly, but these are a few of them. Quite frankly, Office 365 allows us to run our business with our lean, highly distributed model. Right out of the box, it covers the bulk of our business requirements. In Canada, the “all-in” version of Office 365 (Enterprise E4), which includes Office Professional Plus, costs $25.50 per user per month. My cell phone bill regularly comes in at over $200 per month, and when I compare the value offered by the two, it isn’t even close. In my opinion, Office 365 is far and away the best value of any of our infrastructure investments.

So yes – 3 years in, I still love Office 365.
