Using Power BI for SharePoint Analytics

Here at DevFacto, we love SharePoint. But we also believe that the value of this enterprise solution is fully realized only when users leverage it in their daily work.

There are various tools that can help you track SharePoint user adoption – in fact, Microsoft provides some valuable SharePoint Analytics right out of the box. However, if you’re looking for a more robust SharePoint reporting solution, you might want to consider integrating your SharePoint with Power BI. In this article we’ll show you how the built-in SharePoint Analytics stack up against Power BI and which solution might be optimal for your organization.

To start, let’s take a look at the reporting capabilities that come with SharePoint out of the box.

Built-in SharePoint Reporting Capabilities

Depending on whether you are using SharePoint Online or On-Premise, your solution will have some form of built-in analytics:

Usage Reports

Usage reports are meant to show statistics about how visitors have interacted with SharePoint content. All reports are broken down by usage event, which means, for example, that you can see the number of views or the number of recommendations displayed for an item. The two main reports included are Most Popular Items and Popularity Trends. This offering is only available in SharePoint On-Premise.

Site Usage

Site usage visuals are available to all users on your SharePoint site. The site usage visual shows information such as unique viewers, total site visits, site traffic insights, as well as what’s being shared with external users. Site usage reports are meant to be a quick snippet of very high-level information; no detail is provided.

SharePoint site usage report

Site Activity Web Part

The site activity web part can be added to a modern web page on your SharePoint site. It shows documents that are being edited, added or deleted within your SharePoint site.

Audit Log Reports

The audit log reports are meant to track who is opening SharePoint files and folders in any site, and what they are doing with those files. Not all of the tracked information is available in both SharePoint On-Premise and SharePoint Online. These audit log reports are very detailed and are displayed in a line-by-line format.

Office 365 Admin Center Reports

The Office 365 admin center reports are only available for SharePoint Online. In the O365 Admin Center you can view application usage on a user-by-user basis; however, the information is limited and does not offer usage insights. It is intended to be very high-level and is only available to administrators of Office 365.

SharePoint Admin Center Report – Site Usage by User

Office 365 Adoption Content Pack

The Office 365 Adoption Content Pack is a detailed Power BI report created by Microsoft. It visually shows summary analytics about Office 365 adoption. However, it is only available for SharePoint Online with Office 365. While only Office 365 administrators have full control over the information, they are able to grant permissions to various people within the company.

Microsoft 365 Usage Analytics – O365 Adoption Content Pack

Limitations of built-in SharePoint Analytics

Although the out-of-the-box offerings for SharePoint Analytics provide a lot of information, they do come with some limitations:

The data isn’t quite what you need to make quick decisions

All of these offerings present either generalized summary analytics or raw, un-summarized information. For example, the Site Usage visuals show quick summary charts that don’t give you in-depth detail, while the Audit Log Reports show line-by-line data but no summary analytics. Depending on your situation, you may want to view a summary and drill down into detail, or vice versa.

The most detailed data available isn’t the right detail

Understanding how users interact with SharePoint content is the most important aspect of adoption monitoring. You may want to know which users are visiting which sites, which users are not using any sites, which sites are visited the most, which devices are being used to access your sites, and so on. However, none of the OOTB offerings show this level of detail, which likely means you will have to draw conclusions on your own or summarize the data yourself.

No single spot for conclusive insights

Since the six built-in offerings all show different data, you may find yourself going to different spots to track down information. This means sifting through large volumes of data just to find relevant insights. In addition, some of the offerings are available only on SharePoint Online or only on SharePoint On-Premise, but not both.

The data comes in all forms

Some of the data comes in the form of Excel spreadsheets, while other data comes as online charts or visualizations. This makes it impossible to combine all sources of data into a single, at-a-glance summary of your information.

There is no ability to set permission levels

Companies often want to set up SharePoint reports based on permission levels. For example, your CEO may want to view all summary information to understand how the company is adopting SharePoint. On the other hand, your Operations Manager may want to view a slice of detailed information related to everyday operations. Unfortunately, this kind of functionality does not currently exist in SharePoint out of the box. While some reports (such as the audit reports) are only available to admins, they aren’t easily accessible to others in your organization.

Automating SharePoint Analytics with Power BI

So, is there a better way to report on SharePoint usage, ideally using the tools you already have? Being huge fans of simple, user-friendly Power BI dashboards, we developed a solution that integrates the two applications, giving you full visibility into your SharePoint metrics in one centralized location and helping you make quick, informed decisions based on actual SharePoint usage. This reporting solution works for both SharePoint Online and On-Premise, and combines Power BI and Azure Application Insights to deliver reliable SharePoint usage information.

By integrating SharePoint with Power BI you can track all key SharePoint metrics and determine which users are (or aren’t) using your sites, folders and files; which sites are getting used and which aren’t; which browser types are being utilized; which devices are connecting to which sites; and the total number of users per day. In addition, the solution can be configured to collect other usage data that is important to your company.

Reporting SharePoint Usage in Power BI
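Under the hood, usage telemetry like this is typically collected by adding the Application Insights JavaScript SDK to your SharePoint pages (for example via an SPFx application customizer). As a rough sketch only – not our exact implementation – a page could record a view and a custom event like this (the instrumentation key, event name and property values are placeholders):

```typescript
import { ApplicationInsights } from '@microsoft/applicationinsights-web';

// Initialize the SDK with your Application Insights resource key (placeholder below).
const appInsights = new ApplicationInsights({
  config: { instrumentationKey: '00000000-0000-0000-0000-000000000000' }
});
appInsights.loadAppInsights();

// Record the page view so Power BI can report views per page and per site.
appInsights.trackPageView({ name: document.title, uri: window.location.href });

// Record a custom event with extra dimensions (site, user) to slice on in Power BI.
appInsights.trackEvent(
  { name: 'SharePointPageVisit' },
  { siteUrl: window.location.hostname, userLogin: 'user@contoso.com' } // placeholder user
);
```

Power BI then connects to the collected Application Insights data (for example through its query or export capabilities) and turns those raw events into the summary and drill-down reports described above.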

Benefits of Using Power BI for SharePoint Reporting

Get the right depth out of your data

By integrating SharePoint with Power BI you can access both summary statistics and in-depth data about your SharePoint adoption. You can choose whichever view is most important to you and dive deeper where appropriate.

Only see the data that is important to you

Whereas the OOTB offerings show you all of the available data regardless of your needs, a Power BI dashboard is easily tailored to your actual usage. This means that you can choose to see only the data that matters to you, and nothing more.

Automate SharePoint Reporting

Thanks to Azure App Insights, the process of updating your data is completely automated, which means your reports are available on demand, anytime.

Set permission levels and change views

With different user types you can tailor access to data based on job needs, giving the right data to those who need it.

Access all SharePoint usage data in one place

By using Power BI for SharePoint analytics you can access all your reports in one central location, eliminating redundancy.

Leverage the Microsoft Stack

All technologies involved in this solution are a part of the Microsoft stack. This means no additional enterprise agreements and little to no added ongoing costs.

Best of all? This solution is simple and inexpensive, allowing you to get started quickly.

 

Need help with gaining deeper business insights from your SharePoint analytics? Get in touch. We can help you integrate your SharePoint with Power BI for reliable and accessible SharePoint usage metrics.

Surfacing On-Premise Data in Power BI

If you have been looking at Power BI, or are in the process of implementing it, one of the things you have undoubtedly wondered about is whether you can surface your on-premise data in Power BI. The short answer is yes, and in this blog post, I’m going to walk through how that is done within the context of Power BI. I’ll talk about the difference between accessing on-prem data using Power BI Desktop and Power BI Service. I’ll also talk about using the Power BI Personal Gateway and the Power BI Enterprise Gateway to give Power BI Service secure access to your on-prem data.

Before we dive in, I must point out that Power BI is an ever-changing tool, and the information in this guide at some point might go out of date. We do our best to keep this post current, but if you notice that there’s new information that’s missing, let us know in the comments.

Desktop vs. Service

When it comes to Power BI, there are two main ways to use it: Power BI Desktop and Power BI Service. Power BI Desktop is a free desktop app you can download to your computer. In this app, you can connect to data and create reports and dashboards. Power BI Service is a web app within Office 365 that you log into to do the same kinds of things you would do in the Desktop version. In addition, you can publish from Power BI Desktop to Power BI Service and share your dashboards with others. There are some differences between the two products, but that’s beyond the scope of this post. What’s important to know is that there are two pieces, and they can be used separately or together to super-charge your Power BI experience.

In April 2019, Microsoft also released Power BI Report Builder, a new desktop application focused on building paginated reports that can be hosted in the Power BI Service. This is a separate application from Power BI Desktop, and is effectively the SSRS report builder for connecting to Power BI datasets.

On-Premise Data Using Power BI Desktop

If you are already using Excel to connect to your on-prem data sources, the idea is the same. When you’re inside your corporate network, you can easily use Power BI Desktop to connect to your data sources. However, once you leave your network, external connectivity needs to be enabled for the on-prem data source. Sometimes, how you connect can also change depending on whether you are in the network or not, which means you might need to make some adjustments every time you move. For example, when you are at work you connect to the sales database as “SalesDB”, but when you are working from home you might need to type “sales.apps.contoso.com” to get the same access. What you need to consider when using Power BI Desktop is where the data is stored and how it is accessed.

Power BI Service

If you are already using Office 365’s Excel Online to connect to source data, the procedure you’ll follow for Power BI Service will be the same. There is a shared workspace in the cloud that people can connect to and access files. This data needs to be refreshed for all users, which means the data source needs to be accessible from the cloud service. In Power BI Service, you use a Power BI Gateway to let the service access your data in a secure fashion. For data sources that are not externally available, you install a gateway on a machine (typically a server) in your corporate network. Once the gateway is configured, it acts as a go-between for the Power BI Service and your data: when the Power BI Service needs data, it asks the gateway for it. This means you don’t need to expose your data to the entire internet or change authentication mechanisms to get at it. The one thing you need to consider with Power BI Service is how you’re going to reach your on-prem data from outside your organization.
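As an aside, once a gateway is configured, dataset refreshes can also be requested programmatically through the Power BI REST API rather than waiting for a schedule. The sketch below is a minimal illustration only; it assumes you already have an Azure AD access token with dataset write permissions and the target dataset’s ID (both placeholders):

```typescript
// Queue a refresh of a Power BI dataset. The Power BI Service will, in turn,
// ask the gateway to pull fresh data from the on-prem source.
async function refreshDataset(datasetId: string, accessToken: string): Promise<void> {
  const response = await fetch(
    `https://api.powerbi.com/v1.0/myorg/datasets/${datasetId}/refreshes`,
    {
      method: 'POST',
      headers: {
        Authorization: `Bearer ${accessToken}`,
        'Content-Type': 'application/json'
      },
      body: JSON.stringify({ notifyOption: 'MailOnFailure' })
    }
  );
  if (!response.ok) {
    throw new Error(`Refresh request failed with status ${response.status}`);
  }
}
```

Whether you trigger refreshes this way or rely on a schedule configured in the Service, the gateway remains the piece that actually reaches into your network for the data.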

Enterprise Gateway vs. Personal Gateway

There are two choices when it comes to gateways: Enterprise Gateway or Personal Gateway. The difference between these two options comes down to governance and data sources you can use.

Let’s start with the Personal Gateway. The user installs the Personal Gateway onto a personal use computer, usually a desktop or laptop machine, and points the gateway to any Excel workbooks or Power BI Desktop files used as sources. An example would be a user with a desktop computer at work who needs to surface a list of third-party vendors. Once the installation is complete, the gateway will refresh data and send it to the Power BI Service to be used in the corresponding reports.

The Enterprise Gateway does all of this and more. With the Enterprise Gateway, you can have an administrator centrally set up connections to data sources in the company (both files like Excel workbooks but also to databases and cubes), and then specify which users get to use that connection. IT can also see statistics about how these connections are being used. Lastly, these connections can be used for live data access. This means that when a dashboard or report is accessed, the data is pulled live from the on-prem data source instead of from a stored version retrieved with a refresh schedule.

You can use both types of gateways in your organization. By doing that, you maintain governance over important or sensitive data sources with the Enterprise Gateway and ensure that users have the flexibility to collaborate by leveraging the benefits of the Personal Gateway.

Power BI Report Server

For some organizations, publishing their reports to the Power BI Service is not practical (or not allowed) even if the data stays on premises (which is the focus of this post). Power BI Report Server is an option in this scenario because it allows you to connect published reports to on-premises data without using any cloud services. Power BI Report Server does not have all the functionality of the Power BI Service; nonetheless, it’s a useful tool for sharing and securing data and reports for users.

In Summary

In this post, we talked about Power BI Desktop and Power BI Service, and how the two are different when it comes to accessing on-prem data. We also covered the differences between the Power BI Personal Gateway and the Power BI Enterprise Gateway. If you want to know more about Power BI, or you have questions about the material I discussed in this post, reach out to us.

Resources for Power BI Desktop:

https://powerbi.microsoft.com/en-us/documentation/powerbi-desktop-getting-started/

https://powerbi.microsoft.com/en-us/documentation/powerbi-desktop-data-sources/

Resources for Power BI Service:

https://powerbi.microsoft.com/en-us/documentation/powerbi-service-get-started/

https://powerbi.microsoft.com/en-us/documentation/powerbi-service-get-data/

https://powerbi.microsoft.com/en-us/documentation/powerbi-personal-gateway/

https://powerbi.microsoft.com/en-us/documentation/powerbi-gateway-enterprise/

5 Master Data Management Myths You Can Stop Believing

There are many exciting topics under the Data Management umbrella, but if I had to choose one that resonates with me the most, I’d go with Master Data Management (MDM). Why? Because MDM programs can enable any organization to realize the value of one of its most crucial assets: data.

Master Data Management is a hot topic in the IT industry. However, many organizations are on the fence about implementing it due to the wide impact of changes and a relatively high rate of MDM project failures. Yet, despite this, developing a comprehensive MDM strategy should be a priority for modern, data-driven organizations.

In my two decades of work in the data management space, I’ve noticed that even though MDM has gained considerable traction, it isn’t always well understood. In this post, I’ll share and hopefully debunk the most common myths that surround it.

But before diving in, let’s review the fundamentals of MDM.

Master Data Management is a framework that allows organizations to generate uniquely identifiable, business-critical data. This data is often referred to as an “entity”. In essence, MDM makes corporate data an integrated, harmonious whole by continuously bringing together source data, assessing its quality and ironing out the inconsistencies to solve data-related business problems.

Now that we’ve established what MDM is, let’s explore what it isn’t.

Myth #1: Master Data Management is a software product.

Too often, we see that MDM is perceived as a software solution, when it really is a framework. Unfortunately, no software can handle the entire MDM framework right out of the box. Many vendors will pitch their product as the ultimate, holistic system, but what they don’t tell you is that MDM software is just an accelerator.

Of course, there is undisputed value in MDM software, especially when it comes to simplifying and expediting certain elements of the master data management program, such as Identity Resolution, Automation, Survivorship, and Remediation. However, approaching vendors to find out what is available on the market shouldn’t be the first step when planning an MDM program.
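To make “Identity Resolution” a little more concrete, here is a deliberately simplified sketch of the idea: two client records coming from different systems are treated as the same master entity when their normalized identifying attributes line up. Real MDM tools use far more sophisticated, often probabilistic, matching – treat this purely as an illustration:

```typescript
// A simplified "client" record as it might arrive from two different source systems.
interface ClientRecord {
  sourceSystem: string; // e.g. "CRM" or "Finance"
  firstName: string;
  lastName: string;
  email: string;
  postalCode: string;
}

// Normalize free-text values before comparing them.
const normalize = (value: string): string =>
  value.trim().toLowerCase().replace(/\s+/g, '');

// Two records resolve to the same entity on a strong email match,
// or on a fallback match of name plus postal code.
function isSameClient(a: ClientRecord, b: ClientRecord): boolean {
  if (normalize(a.email) === normalize(b.email)) {
    return true;
  }
  return (
    normalize(a.lastName) === normalize(b.lastName) &&
    normalize(a.firstName) === normalize(b.firstName) &&
    normalize(a.postalCode) === normalize(b.postalCode)
  );
}
```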

While tools can certainly help or hinder MDM efforts, a successful MDM implementation is not made or broken by a tool. The real key to effective MDM lies in identifying the fundamental elements of the program and carefully designing the implementation roadmap. Planning early on will increase the probability of MDM implementation success and will help avoid unnecessary software spend.

Myth #2: MDM can be done in a silo

Organizational silos and resulting data silos are generally not conducive to effective data management operations.

Let’s take the finance department of a large manufacturing enterprise. This department takes in information from a couple of different financial systems, which leads to duplicated and inconsistent data. The organization’s CFO decides to run MDM solely for the finance department, focusing specifically on the “client” entity. As a result, this newly implemented MDM solution generates unique client records within the financial systems. It all looks good until a client contacts the company to change their address. Although the change is promptly reflected in the CRM, the financial systems remain untouched. The client receives their product, but the invoice never arrives because it was sent to the old address.

Successful MDM requires that we track our chosen data entity across the ENTIRE organization, without exceptions. If, as in the example above, client data lives in 80% of departments, then we need to incorporate ALL of them into the solution.

Myth #3: Master Data Management is expensive

Because MDM is typically an enterprise-scale project, it’s automatically associated with large investments and significant effort. As with any large-scale project, the strain on corporate resources is hard to predict and can go well beyond the initial estimates. In the presence of multiple risk factors (broad scope, high impact of changes, technology that’s new to the organization), MDM projects can be loaded with uncertainty. However, there are ways to mitigate the risks and pave the way for success. One of the best ways to do it is by phasing the MDM implementation based on the criticality of data entities and their prevalence across the organization. This approach helps companies validate the program with a small budget, while still offering value to the organization.

For example, rather than focusing on a major data entity that’s widespread across the organization, companies can drastically reduce the risk by starting with a “smaller” and less critical entity. For many organizations, “employee” is a good test entity to use as a proof of concept. Not only is it less prevalent across the enterprise than much more critical entities such as “customer” or “product”, but it also offers a good starting point for further program expansion.

A gradual, iterative approach to MDM alleviates the risk of failure, improves adoption and sets the program up for success when rolled out on a large scale.

Myth #4: MDM can be successful without Data Quality

In my experience, enterprises are very confident in Data Quality across their systems, until that confidence is put to the test. In most cases, Data Quality is lacking, and the organization is rarely fully aware of it. There are two main reasons why this happens:

  1. There is no such thing as perfect data. Data Quality rules are based on current business needs and subjective decisions about the quality of data, so it is hard to achieve and maintain a state where data is perfect.
  2. Data is constantly changing, and live data tends to bring inconsistencies. Data quality is therefore really a threshold at which the data reaches an acceptable level.

Before embarking on an MDM implementation, it’s vital to address critical data discrepancies between existing systems. This can mean, for example, standardizing the way you specify state or province names across all systems (e.g. picking between the abbreviation “AB” and the full name “Alberta”). Without proper Data Quality, records that should match will remain separate and potentially create false-negative scenarios.
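As a trivial illustration of that kind of standardization rule (the mapping table here is partial and made up), the idea is to map every variant a source system might contain onto one canonical value, and to flag anything that can’t be mapped for remediation:

```typescript
// Canonical province codes keyed by the variants found in source systems (partial list).
const PROVINCE_CANONICAL: Record<string, string> = {
  'ab': 'AB',
  'alberta': 'AB',
  'alta.': 'AB',
  'bc': 'BC',
  'british columbia': 'BC'
  // ...remaining provinces and territories
};

// Returns the canonical code, or undefined to flag a value that needs remediation.
function standardizeProvince(raw: string): string | undefined {
  return PROVINCE_CANONICAL[raw.trim().toLowerCase()];
}

standardizeProvince('Alberta'); // 'AB'
standardizeProvince('AB');      // 'AB'
```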

Myth #5: MDM is best left to IT

Assigning the MDM implementation to IT might seem like an obvious choice. After all, the project will deal with both data and software systems. And while the IT department is one of the major MDM stakeholders and is responsible for the technical implementation, MDM is also about solving specific business problems. When IT alone is assigned to establish MDM, the business tends to get less involved. This means that even though the technology is handled, business needs may not be fully addressed, which can lead to low MDM adoption.

Due to the wide scope and reach of MDM, the program should be centrally organized, overseen by a committee and implemented by a well-represented, cross-functional team drawn from across the entire organization.

Are you ready for MDM?

Each year, the amount of data enterprises gather and produce increases exponentially. But without a structured organizational approach to data management, the data not only diminishes in value but also poses liability risks when mishandled. MDM can do a lot more for an organization than just reduce these risks – it can actively help realize the full potential of data by establishing an enterprise-wide state of data clarity.

Curious about Master Data Management, but don’t quite know where to start? Contact us today and we’ll walk you through the steps for adopting the framework in your organization.

Spark Some Joy and Declutter your Business Intelligence

It’s human nature to hoard stuff. I, for example, have a habit of buying tools specific to the hobby project that I’m working on. My behaviour usually follows the same pattern: I’m in the midst of a challenging task when I realize that a specialized tool would make my job easier. I then do a quick run to the local Home Depot to pick up what’s needed. When the task is complete, I find the right storage space for my new purchase. But despite trying to stay on top of what I own, I’m periodically confronted with a garage full of gadgets that serve a similar purpose.

Most often, one of these two things happens when we’re faced with clutter: we lose track of what we have and resort to using only the things from the top of the pile; or, like me, we keep getting new tools, only to realize that we own a lot of similar items that take up space but add little value.

The same principles apply to business intelligence and analytics. Most organizations that have BI programs in place struggle at some point with duplicate reports, messy BI storage, or plain old lack of regular planning.

Luckily, as with other types of mess, BI too can be organized.

In fact, organizations that make a conscious effort to maintain order and declutter from time to time, reap the benefits of timely analytics that meet business needs without sacrificing performance.

Decluttering Business Intelligence

Setting up a successful Business Intelligence (BI) program requires planning. In the early stages of getting business analytics off the ground, we tend to spend a lot of time deliberating the size, scope and scale of the program. However, once the Business Intelligence program is up and running, new problems emerge, making organizations question their original approach. Just as in keeping a tidy house, the key to an efficient BI program lies in the ongoing rationalization and decluttering of its various components.

How to tell if my Business Intelligence needs reorganizing?

If:

  • You’ve got business intelligence reports or visualizations, but they don’t fully address your current needs. You find yourself creating completely new reports based on new data sets.
  • You’ve got many data sets in a centralized location, but you aren’t entirely sure where to find the right data.
  • You’re doubling or tripling the effort and/or data.

…you might be experiencing the common symptoms of cluttered BI.

On the surface, the above issues seem easy to address. Create a couple of catalogs, decommission old BI elements and the BI platform will be clean. Unfortunately, it is rarely that easy…

Consider this example:

Suppose you have a BI report which shows which products were sold in different regions and at what price. To get this report, source data was extracted from several siloed systems in each of your regions including production systems, financial systems, and ERP systems. After extraction, your data was cleansed, transformed and consolidated in the dimensional Enterprise Data Warehouse. Then, the required dimensions and facts were extracted to a Data Mart and your report was created.
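To make the “dimensions and facts” in that example concrete, the star schema behind such a report might look roughly like this (the table and column names are purely illustrative):

```typescript
// Dimensions describe the things being measured...
interface DimProduct { productKey: number; productName: string; category: string; }
interface DimRegion  { regionKey: number; regionName: string; country: string; }
interface DimDate    { dateKey: number; calendarDate: string; fiscalQuarter: string; }

// ...while the fact table records one row per product sold in a region on a given day.
interface FactSales {
  productKey: number;   // -> DimProduct
  regionKey: number;    // -> DimRegion
  dateKey: number;      // -> DimDate
  unitsSold: number;
  salesAmount: number;  // the price actually charged
}
```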

However, the market situation has changed, and you need to analyse how different types of clients consume your product. You reach out to the Data Analysts from the business unit, but they have a hard time locating required industry information in the consolidated BI storage. When you turn to the team responsible for maintaining BI storage, they offer little insight into the data content. You realize that there are no data owners since data sets in the BI Storage are all based on consolidated data. Now, to get the results you’re looking for, you will need to start a new project, extract the entire data set again and create a new report. Often, at the very end of it, you will realize you’ve created a duplicate data set.

Take steps to organize your Business Intelligence

It’s easy to imagine what your business intelligence will look like after repeatedly generating BI products without addressing the underlying issues. At some point, BI rationalization (decluttering) will be required.

While there are many variations of BI implementations from an organizational and technical standpoint, at a high level the fundamental elements and data flow process remain the same:

Process of Organizing Business Intelligence

1. Start by reviewing your Data Sources

We often see that one of the leading causes of BI clutter is ingesting more data from the sources than what’s needed. To get organized, begin by identifying the current and future use cases for your business intelligence. For starters, every extracted data set should be associated with a Data Owner (also called a Data Steward or Data Custodian) who can provide insights into that data set.

Once the data is extracted from its sources, it’s often transformed, normalized (restructured and modeled) and merged or joined based on the BI Data Architecture. This is the point where things can go awry. To avoid future complications, always opt to retain metadata lineage that can trace the data back to its sources.
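What that lineage metadata looks like will differ from one organization to the next, but even a simple catalogue record per extracted data set goes a long way. A hypothetical sketch of such a record (the field names are illustrative, not a standard):

```typescript
// One catalogue entry per extracted data set: who owns it, where it came from,
// how it was transformed, and what is built on top of it.
interface DataSetLineage {
  dataSetName: string;
  dataOwner: string;            // the Data Steward / Custodian accountable for the content
  sourceSystems: string[];      // e.g. ["ERP", "RegionalFinanceDB"]
  extractedOn: string;          // ISO date of the last extraction
  transformations: string[];    // human-readable cleansing / merge / join steps applied
  downstreamProducts: string[]; // reports and dashboards that consume this data set
}
```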

2. Take a good look at your BI Storage

Business requirements and data consumption performance are the two major drivers that affect decisions on how data is originally joined, merged, and structured. However, in virtually all cases, business needs will change over time. When they do, newly created BI products (dashboards, reports) and added sources of data will affect the structure of your BI storage, sometimes leading to unnecessary complexity and poor performance.

There are several ways to reduce this complexity, and they all boil down to planning and organizing the storage, regardless of whether you’re using dimensional or non-dimensional data stores for your program. It’s self-evident that modeling is the most important clutter-limiting measure for dimensional databases, as it enables future development and helps assure performance. However, what is less often understood is that non-dimensional BI stores also require a good dose of organizing. Any non-dimensional BI storage can benefit from clearly defining, maintaining and enabling all data types and linking them back to their sources. By also associating data with a specific owner and documenting it in a catalogue, you can ensure that your BI storage will be ready for further analysis.

3. Analyse your BI Consumption

Numerous organizations struggle with efficient intelligence analysis because their BI products don’t meet the changing organizational needs. As the enterprise grows and evolves, its dashboards and reports should follow suit. One of the best ways to ensure that the insights you receive are relevant and timely is by cataloguing BI products and attaching them to current business requirements.

You will notice that when data lineage is preserved across the full BI data lifecycle, decluttering becomes simpler and a lot more efficient.

4. Consider the organizational approach to Business Intelligence

As the last step, take a look at the big picture. For some companies, the source of BI clutter may be rooted in organizational approach rather than the BI process itself. For example, when BI elements are handled by several distinct groups within the organization, there is a significant risk that the approach to data isn’t always consistent.

In other organizations, what’s needed is a revamped project methodology. For instance, BI processes, strategy, and delivery are rarely effective when following the waterfall approach. Conversely, agile methodology can offer greater benefits, especially when paired with a defined Information Strategy and governance.

Finally, remember that you’re not alone. In fact, the industry has been struggling with disorganized BI for a long time. What helps set the leaders on track is a holistic, organization-wide change that comes with the implementation of architecture practices and defined governance, enabling timely, efficient and structured BI.

Do you see any red flags in your BI program? Give us a call. Together we can ensure that your organization has the right insights for accurate decision-making.

Business Intelligence – Tools Are Not Enough

We worry about tools too much.

Don’t get me wrong – tools are great. It’s hard to do much without them. But when it comes to building business intelligence solutions, focusing on tools rather than the business you’re building the solution for is a key reason why these solutions often end up being failures. The solutions don’t meet their users’ needs and are thus simply not used.