Using Data to Mitigate the Impact of COVID-19 on Today’s Businesses

As the concern over COVID-19 continues, companies globally are exploring the actions they need to take now to maintain their business. This crisis has hammered organizations, forcing layoffs, reducing services, and creating anxiety among leaders and employees alike.

The C-suite has a vital role in making sure the organization can function as concern mounts over the scale and impact of the pandemic. Leaders are being forced to make tough decisions about how to operate amid the chaos. What should they focus on, where can they cut, and how should they prepare for the new world as the crisis eventually lifts – these are the thoughts weighing heavy on leaders’ minds today. Pairing high levels of uncertainty with massive amounts of new information means they have to forge ahead without the luxury of past experience to help them navigate.

There is a reason for hope, however. Information, data insights, and technology have fundamentally changed our capacity to successfully manage a threat like this pandemic and adapt for the future. Taking advantage of your data effectively to guide business practice and improve the well-being of the organization and its employees is the single best option to mitigate the impact of COVID-19.

Transforming data into actionable information and analytics that drive business decisions will fast-track the return to normalcy. Volumes of raw operational data are just waiting to be turned into valuable business intelligence assets that will draw the path forward.

Business intelligence helps leaders frame their strategy in three ways:

Looking at today

Understanding real-time or near real-time data trends helps leaders prescribe responses rather than merely predict scenarios.

Having complete visibility into cash flows and other operational metrics is a critical first step to understanding the current business situation and achieving working capital performance goals.

Gaining access to available government incentives and programs and recognizing expense-reduction opportunities are two examples of responding to the immediate cash-flow management challenges. Focusing on current business needs is integral to a company’s overall COVID-19 risk assessment and action planning in the near term.

Looking to the future

The most critical analysis is a forward-looking perspective. Foresight provides organizations with the business and operating capabilities they need to position themselves for a successful post-crisis recovery.

A detailed assessment of future working capital and cash-flow requirements, along with key business and division unit metrics, will help determine the point, if any, at which continuing certain operations is no longer viable and suspending business activities becomes preferable. Alternatively, the metrics could indicate what variable measures should or can be taken to ensure healthy operation.

Run scenario planning to ensure financing remains viable and to understand how much cash is needed and for how long. This informs considerations around such things as determining what capital investments can be postponed and which are required for creating a competitive advantage in the rebound. Gaining insight into end-to-end supply chain visibility for future demand planning, inventory management, and production planning and scheduling can ensure you have a robust framework for managing supply chain risk. Depending on what the financial scenario planning reveals, you may need to consider alternative financing options to support the future success of the business.
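As an illustration, a first-pass cash runway model can be sketched in a few lines of Python. All figures and scenario names below are invented for the example:

```python
def months_of_runway(cash, monthly_net_burn, max_months=60):
    """Number of months until cash runs out at a constant net burn rate."""
    if monthly_net_burn <= 0:          # cash-flow positive: runway is effectively unbounded
        return max_months
    months = 0
    while cash >= monthly_net_burn and months < max_months:
        cash -= monthly_net_burn
        months += 1
    return months

# Three hypothetical scenarios: net monthly burn under each assumption.
scenarios = {"optimistic": 50_000, "base": 120_000, "pessimistic": 200_000}
cash_on_hand = 1_200_000
runway = {name: months_of_runway(cash_on_hand, burn) for name, burn in scenarios.items()}
print(runway)  # {'optimistic': 24, 'base': 10, 'pessimistic': 6}
```

In practice these inputs would come from the cash-flow visibility work described above, and each scenario would vary revenue, collections, and spend separately rather than a single net-burn figure.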

Looking back

Does the past predict the future? Maybe not, but it is the best way to protect us from being blindsided by future structural shocks like this pandemic. Monitoring and measuring how the business gets through this crisis remains the best way to forecast for the future. Now is an excellent time to measure and evaluate business efficiencies, or inefficiencies, stemming from the alternate models of work that have been adopted.

There are two ways businesses can come through this crisis. You can use data to tell the story about the past, the present, and the future, or you can use a personal perspective (from gut instinct to experience). The problem with using your personal view is that it is often filled with bias and assumption. Using data is the only objective way to measure risk and lessen the impact of COVID-19 on today’s businesses.

Organizations that gain insight into their business in new and innovative ways will outperform those that don’t by making data-driven decisions that mitigate risk, drive innovation, and take advantage of opportunities for efficiency.


The Changing Shape of Digital Transformation

The moment of truth – will COVID-19 be the tipping point for digital transformation?


If “necessity is the mother of invention,” then COVID-19 has forced business leaders all around the world to rethink their digital transformation strategies.

At its core, coronavirus is disruption. More precisely, it is a disruption frenzy: as nations try to slow the virus, the pandemic is impacting our political environment, our healthcare system, our economy, and the way we use technology. When disruption comes, businesses must adapt or die. And not just adapt, but adapt better and faster than competitors.

The truth is, even before COVID-19, the business world was changing at a pace that was hard to keep up with. Businesses that had already embraced a culture of change and realized their digital transformation goals might have been better prepared for this disruption than others. For the rest, it’s been a challenge. A recent survey by Gartner reports that only 12% of organizations are highly prepared for the impact of coronavirus. This does not mean you should hunker down and wait out COVID-19; it’s time to rev up your digital engine and position yourself for growth when the pandemic subsides.

Here are the areas you should be focusing on:

Cloud first

As COVID-19 pushes businesses to their limit, there has never been a better time to focus on cloud strategies. In the rapid response to the pandemic, millions of people have shifted to remote work, pushing companies to adopt cloud-based solutions at an unprecedented rate. Quick to deploy, cloud-based solutions are the reason organizations around the world remain productive during this time. If there was ever a good time to migrate to the cloud, this is it.

Boost process automation

Unexpected circumstances place an added burden on the workforce. Consider automating repetitive processes to free up teams and allow them to focus on more meaningful work. Extending automation throughout the enterprise can help mitigate disruption and offer stability in times of uncertainty.
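As a deliberately simple illustration of the kind of repetitive work worth automating, the Python sketch below consolidates daily CSV reports into a single file. The file names and columns are invented for the example:

```python
import csv
import tempfile
from pathlib import Path

def consolidate_reports(folder: Path, out_file: Path) -> int:
    """Merge every daily CSV report in `folder` into `out_file`; return data rows written."""
    rows_written = 0
    with out_file.open("w", newline="") as out:
        writer = None
        for report in sorted(folder.glob("*.csv")):
            if report.name == out_file.name:      # don't read the file we're writing
                continue
            with report.open(newline="") as f:
                reader = csv.reader(f)
                header = next(reader, None)
                if header is None:                # skip empty files
                    continue
                if writer is None:                # write the header only once
                    writer = csv.writer(out)
                    writer.writerow(header)
                for row in reader:
                    writer.writerow(row)
                    rows_written += 1
    return rows_written

# Demo with two hypothetical daily reports in a temporary folder.
folder = Path(tempfile.mkdtemp())
(folder / "2020-04-01.csv").write_text("region,sales\nWest,100\n")
(folder / "2020-04-02.csv").write_text("region,sales\nEast,250\n")
print(consolidate_reports(folder, folder / "combined.csv"))  # 2
```

A scheduled job running a script like this replaces a daily copy-and-paste chore; enterprise RPA platforms extend the same idea to processes that span multiple systems.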

Harness the power of data

Now more than ever, business leaders need to analyze risks and develop the best methods for mitigating them. They need to know the effectiveness of the current recovery processes. To do this, they need to use data effectively to guide business decisions and improve the well-being of the organization and its employees. Fostering a data-driven organization galvanizes the vision of faster, better-informed decisions to enable businesses to return to normalcy sooner.

Be champions of collaboration

No matter the industry, digital collaboration platforms are the foundation of work during the pandemic. The success of operations today relies heavily on the ability to collaborate in real time and from anywhere. Many businesses are recognizing the need to improve collaboration to ensure business continuity now and into the future.

Deliver valuable experiences

As the coronavirus crisis accelerates the transition to a digital future, the shift to digital customer experience also hits fast-forward. Empathic tools such as service design and design thinking seek to address customers’ acute needs and forge stronger ties with the market in the post-COVID-19 era. Reimagining customer experience to meet customers’ changing needs should drive the next steps in your digital transformation journey.

The new era of business reinvention is upon us. For most industries, revenues will fall in 2020; that’s a given. But companies can emerge stronger, more innovative, and more purposeful. By creating a forward-thinking, customer-focused digital company, leadership teams can mitigate today’s threat and accelerate into an eventual recovery.



How to Export from Power BI to Excel – An Overview of Methods

Whenever we talk about data tools, we always joke about the “time-to-Excel” – the amount of time a user spends in a tool before choosing to export to more familiar software like Excel, where they can customize the data exactly the way they want. This happens more often than many would like to admit, and it’s hardly something to be shy about. If you’re wondering how to export from Power BI to Excel, we’ve got a handy chart for you to compare the available methods. Below, I explain how to get from different parts of the Power BI ecosystem back to Excel when preferred.

Power BI, as a tool, is primarily meant for data exploration and analysis within the tool itself – that is, Power BI Desktop, the Power BI Service, or other parts of the Power BI ecosystem. Exporting for further analysis is not the primary goal of the platform.

Luckily for those who would much rather work with the familiar worksheet view, there are ways to manipulate data from Power BI in Excel. I’ve listed them below in order from the easiest and most complete to the most difficult and least complete.

Exporting from Power BI to Excel

| Method | Requirements | Output Format | Row Limit | Other Limitations | Premium Required | Pro License Required |
|---|---|---|---|---|---|---|
| Connect to Data Mart/Warehouse Directly (No Power BI) | View access to data mart/warehouse | Any (as allowed by tool) | None | None | No | No |
| Export from Paginated Reports (Desktop) | Paginated Report Builder; view access to data source or Build permission on a shared PBI dataset | CSV, Excel, Word, PowerPoint, PDF, XML, MHTML | None | None | No | No |
| Export using DAX Studio | View access to data mart/warehouse | CSV, SQL Server | None | Local hardware | No | No |
| Connect to Data Mart/Warehouse with Excel Power Query | View access to data mart/warehouse | Excel | Excel – 1,048,576 | None* | No | No |
| Analyze in Excel from Power BI Service | View access to workspace or app | Excel | Excel – 1,048,576 | None | No | Maybe* |
| Export from Paginated Reports (Published to Service) | View access to workspace or app | CSV, Excel, Word, PowerPoint, PDF, XML, MHTML | None | Table structure fixed to how the report was built | Yes | Maybe* |
| Copy from Power BI Desktop Data View (Table View) to Excel | Power BI Desktop; view access to data source or Build permission on a shared PBI dataset | Excel | Excel – 1,048,576 | Local hardware | No | No |
| Export from Power BI Service Visual | View access to workspace or app | Excel or CSV | Excel – 150,000; CSV – 30,000 | Limited by visual summarization* | No | Maybe* |
| Export from Power BI Desktop Visual | Power BI Desktop; view access to data source or Build permission on a shared PBI dataset | Excel | Excel – 30,000 | Limited by visual summarization | No | Maybe* |

*Please note that the values above are for Import reports. DirectQuery reports have slightly different export limits.
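For paginated reports published to the Service, exports can also be scripted through the Power BI REST “Export To File” API. The sketch below only builds the request shapes; the workspace (group) and report IDs are placeholders, and a real client would send these requests with an Azure AD bearer token (for example via the requests library), poll the status URL, then download the file:

```python
# Base URL of the Power BI REST API for the caller's organization.
BASE = "https://api.powerbi.com/v1.0/myorg"

def export_request(group_id: str, report_id: str, fmt: str = "XLSX"):
    """Return the URL and JSON body that start an export-to-file job."""
    url = f"{BASE}/groups/{group_id}/reports/{report_id}/ExportTo"
    return url, {"format": fmt}

def export_status_url(group_id: str, report_id: str, export_id: str) -> str:
    """Poll this URL until the job reports success, then GET <url>/file for the bytes."""
    return f"{BASE}/groups/{group_id}/reports/{report_id}/exports/{export_id}"

url, body = export_request("ws-123", "rpt-456")
print(url)   # https://api.powerbi.com/v1.0/myorg/groups/ws-123/reports/rpt-456/ExportTo
print(body)  # {'format': 'XLSX'}
```

This route inherits the same format support as the paginated-report export options in the chart above, so XLSX and CSV are available where interactive report exports are limited to PDF, PPTX, and PNG.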

Further Reading:

Excel Power Query Limitations –

Paginated Report Limitations –

Paginated Report FAQs –

Helpful Links:

Power BI Desktop Download:

Power BI Paginated Report Download:

Excel Power Query Download:


Using Power BI for SharePoint Analytics

Here at DevFacto, we love SharePoint. But we also believe that its value is fully realized only when users leverage it in their daily work.

The thing is that organizations don’t always know how their employees are using SharePoint. Luckily, there are various tools that can help you track SharePoint user adoption – in fact, Microsoft provides some valuable SharePoint Analytics right out of the box. However, if you’re looking for a more robust SharePoint reporting solution, you might want to consider integrating your SharePoint with Power BI. In this article we’ll show you how the built-in SharePoint Analytics stack up against Power BI, and which solution might be optimal for your organization.

To start, let’s take a look at the reporting capabilities that come with SharePoint out of the box.

Built-in SharePoint Reporting Capabilities

Depending on whether you are using SharePoint Online or On-Premise, your solution will have some form of built-in analytics:

Usage Reports

Usage reports are meant to show statistics about how visitors have interacted with SharePoint content. All usage reports are broken down by usage event. This means, for example, that you can see the number of views or the number of recommendations displayed for an item. The two main reports included are Most Popular Items and Popularity Trends. This offering is only available in SharePoint On-Premise.

Site Usage

Site usage visuals are available to all users on your SharePoint site. The site usage visual shows information such as unique viewers, total site visits, site traffic insights, and what’s being shared with external users. These reports don’t offer detail; rather, they serve as a quick snapshot of very high-level information. Here is an example of what a Site Usage report looks like:

SharePoint site usage report

Site Activity Web Part

The site activity web part can be added to a modern web page on your SharePoint site. It shows documents that are being edited, added or deleted within your SharePoint site.

Audit Log Reports

The audit log reports are meant to track who is opening SharePoint files and folders in any site, and what they are doing with those files. Not all tracked information is available in both SharePoint On-Premise and SharePoint Online. These audit log reports are very detailed and display information in a line-by-line format.

Office 365 Admin Center Reports

The Office 365 admin center reports are only available for SharePoint Online. In the O365 Admin Center you can view application usage on a user-by-user basis; however, the information is limited and does not offer usage insights. This is a high-level report that is only available to administrators of Office 365.

For us at DevFacto, this is what an Office 365 admin center report looks like:

SharePoint Admin Center Report – Site Usage by User

Office 365 Adoption Content Pack

The Office 365 Adoption Content Pack is a detailed Power BI report created by Microsoft. It visually shows summary analytics about Office 365 adoption. However, it is only available for SharePoint Online with Office 365. While only Office 365 administrators have full control over the information, they are able to grant permissions to various people within the company. This is what an out of the box Microsoft 365 Power BI report looks like:

Microsoft 365 Usage Analytics – O365 Adoption Content Pack

Limitations with built-in SharePoint Analytics

Although the out of the box offerings for SharePoint Analytics provide a lot of information, they do come with some limitations:

1. The data isn’t quite what you need to make quick decisions

All these offerings are presented either as generalized summary analytics or as raw, un-summarized information. For example, the Site Usage visuals show quick summary charts that don’t give you in-depth detail, while the Audit Log Reports show line-by-line data but no summary analytics. Depending on your situation, you may want to view a summary and drill down into detail, or vice versa.

2. The most detailed data available isn’t the right detail

Understanding how the users interact with SharePoint content is the most important aspect of adoption monitoring. You may want to know which users are visiting which sites, which users are not using any sites, which sites are visited the most, which devices are being used to access your site, etc. However, none of the OOTB offerings show the detail that you might be searching for, which likely means you will have to come to conclusions on your own or summarize the data yourself.

3. No single spot for conclusive insights

Since the six built-in offerings all show different data, you may find yourself going to different spots to track down information. This means sifting through large volumes of data just to find relevant insights. In addition, some of the offerings are available only on SharePoint Online or only on SharePoint On-Premises, but not both.

4. The data comes in all forms

Some of the data comes in the form of Excel spreadsheets, while other data comes as online charts or visualizations. For this reason, it becomes impossible to combine all sources of data and get a quick picture of the situation.

5. There is no ability to set permission levels

Companies often want to set up SharePoint reports based on permission levels. For example, your CIO may want to view all summary information to understand how the company is adopting SharePoint. On the other hand, your Operations Manager may want to view a slice of detailed information related to everyday operations. Unfortunately, this kind of functionality does not currently exist in SharePoint out of the box. While some reports (such as the audit reports) are only available to admins, they aren’t easily accessible to others in your organization.

Automating SharePoint Analytics with Power BI

So, is there a better way to report SharePoint usage? And, ideally, can you do it using the tools you already have? Being huge fans of simple, user-friendly Power BI dashboards, we developed a solution that integrates these two applications, giving you full visibility of your SharePoint metrics in one centralized location. By far the best thing about it is the ability to monitor all your key SharePoint metrics at a glance and drill into detailed data when needed. The dashboard makes it easy to understand your organization’s SharePoint usage and to make quick, informed decisions based on your own data. What’s more, our reporting solution works for both SharePoint Online and SharePoint On-Premise, which means you can get the insights you need no matter which version you’re using. This solution combines Power BI and Azure Application Insights to automatically deliver reliable SharePoint usage information.

What kind of data can you track by integrating SharePoint with Power BI? Here are some examples of insights you can get from this automated SharePoint analytics solution:

  • The total number of users per day;
  • Which users are or aren’t using your sites, folders and files;
  • Sites that are used the most and/or the least;
  • Browser types used to access SharePoint;
  • Types of devices that connect to your sites.

In addition to that, the solution can collect other usage data that are important to your company.
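To make the list above concrete, here is a minimal, dependency-free Python sketch of the kinds of aggregations behind such a dashboard. The telemetry rows are invented; in the actual solution this data comes from Azure Application Insights and is visualized in Power BI:

```python
from collections import Counter
from datetime import date

# Invented page-view telemetry rows, standing in for Application Insights data.
views = [
    {"day": date(2020, 4, 1), "user": "amy",  "site": "/hr",    "browser": "Edge"},
    {"day": date(2020, 4, 1), "user": "ben",  "site": "/sales", "browser": "Chrome"},
    {"day": date(2020, 4, 1), "user": "amy",  "site": "/sales", "browser": "Edge"},
    {"day": date(2020, 4, 2), "user": "cara", "site": "/sales", "browser": "Chrome"},
]

# Total number of unique users per day.
users_per_day = {d: len({v["user"] for v in views if v["day"] == d})
                 for d in {v["day"] for v in views}}

# Sites used the most/least, and the browsers used to access SharePoint.
top_sites = Counter(v["site"] for v in views).most_common()
browsers = Counter(v["browser"] for v in views)

print(top_sites)  # [('/sales', 3), ('/hr', 1)]
```

Each of these aggregations maps directly onto one dashboard tile; adding a new metric is a matter of logging one more field and writing one more grouping.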

Here is what our DevFacto SharePoint usage dashboard looked like at a point in time:

SharePoint Site Usage Report in Power BI

Benefits of Using Power BI for SharePoint Reporting

Here is how you can benefit from leveraging this Power BI and Azure App Insights based solution in reporting your organization’s SharePoint usage.

1. Get the right depth out of your data

By integrating SharePoint with Power BI you can access both summary statistics and in-depth data about your SharePoint adoption. You can choose which view is the most important to you and dive deeper when you see appropriate.

2. Only see the data that is important to you

Unlike the OOTB offerings that show you all of the available data regardless of your needs, Power BI dashboard is easily tailored to your actual usage. This means that you can choose to see only the data that matters to you, and nothing more.

3. Automate SharePoint Reporting

Thanks to Azure App Insights, your data updates automatically, which means your reports are available on-demand anytime.

4. Set permission levels and change views

With different user types you can tailor access to data based on job needs, giving the right data to those who need it.

5. Access all SharePoint usage data in one place

By using Power BI for SharePoint analytics you can access all your reports in one central location eliminating redundancy.

6. Leverage the Microsoft Stack

All technologies involved in this solution are a part of the Microsoft stack. This means no additional enterprise agreements and little to no added ongoing costs.

Best of all? This solution is simple and inexpensive, so you can get started quickly.


Need help with gaining deeper business insights from your SharePoint analytics? Get in touch. We can help you integrate your SharePoint with Power BI for reliable and accessible SharePoint usage metrics.

Surfacing On-Premise Data in Power BI

If you have been looking at Power BI, or are in the process of implementing it, one of the things you have undoubtedly wondered about is whether you can surface your on-premise data in Power BI. The short answer is yes, and in this blog post, I’m going to walk through how that is done within the context of Power BI. I’ll talk about the difference between accessing on-prem data using Power BI Desktop and Power BI Service. I’ll also talk about using the Power BI Personal Gateway and the Power BI Enterprise Gateway to give Power BI Service secure access to your on-prem data.

Before we dive in, I must point out that Power BI is an ever-changing tool, and the information in this guide at some point might go out of date. We do our best to keep this post current, but if you notice that there’s new information that’s missing, let us know in the comments.

Desktop vs. Service

When it comes to Power BI, there are two main ways to use it: Power BI Desktop and Power BI Service. Power BI Desktop is a free desktop app you can download to your computer. In this app, you can connect to data and create reports and dashboards. Power BI Service is a web app within Office 365 that you log into and do the same kinds of things as you did in the Desktop version. In addition to that, you can also publish from Power BI Desktop to Power BI Service and share your dashboards with others. There are some differences between the two products, but that’s beyond the scope of this post. What’s important to know is that there are two pieces, and they can be used separately or together to super-charge your Power BI experience.

In April 2019, Microsoft also released Power BI Report Builder, a new desktop application focused on building paginated reports that can be hosted in the Power BI Service. This is a separate application from Power BI Desktop, and is effectively the SSRS report builder for connecting to Power BI datasets.

On-Premise Data Using Power BI Desktop

If you are already using Excel to connect to these data sources, the idea is the same. When you’re inside your corporate network, you can easily use Power BI Desktop to connect to your data sources. Once you leave your network, however, external connectivity would need to be enabled for the on-prem data source. How you connect can also change depending on whether you are in the network or not, which means you might need to make some adjustments every time you move. For example, at work you might connect to the sales database as “SalesDB”, but when working from home you might need to use the server’s fully qualified external name to get the same access. What you need to consider when using Power BI Desktop is where the data is stored and how it is accessed.

Power BI Service

If you are already using Office 365’s Excel Online to connect to source data, the procedure you’ll follow for Power BI Service will be the same. There is a shared workspace in the cloud that people can connect to and access files. This data needs to be refreshed for all users, which means the data source needs to be accessible from that server. In Power BI Service, you would use a Power BI Gateway to allow Power BI Service to access your data in a secure fashion. For data sources that are not externally available, you would install a gateway on a machine (typically a server) in your corporate network. Once the gateway is configured, it acts as a go-between for the Power BI Service and your data. When the Power BI Service needs to get data, it asks the gateway for it. This means you don’t need to expose your data to the entire internet or change authentication mechanisms to get at it. The one thing you need to consider with Power BI Service is how you’re going to access data from outside your organization.

Enterprise Gateway vs. Personal Gateway

There are two choices when it comes to gateways: Enterprise Gateway or Personal Gateway. The difference between these two options comes down to governance and data sources you can use.

Let’s start with the Personal Gateway. The user installs the Personal Gateway onto a personal use computer, usually a desktop or laptop machine, and points the gateway to any Excel workbooks or Power BI Desktop files used as sources. An example would be a user with a desktop computer at work who needs to surface a list of third-party vendors. Once the installation is complete, the gateway will refresh data and send it to the Power BI Service to be used in the corresponding reports.

The Enterprise Gateway does all of this and more. With the Enterprise Gateway, you can have an administrator centrally set up connections to data sources in the company (both files like Excel workbooks but also to databases and cubes), and then specify which users get to use that connection. IT can also see statistics about how these connections are being used. Lastly, these connections can be used for live data access. This means that when a dashboard or report is accessed, the data is pulled live from the on-prem data source instead of from a stored version retrieved with a refresh schedule.

You can use both types of gateways in your organization. By doing that, you maintain governance over important or sensitive data sources with the Enterprise Gateway and ensure that users have the flexibility to collaborate by leveraging the benefits of the Personal Gateway.

Power BI Report Server

For some organizations, publishing their reports to the Power BI Service is not practical (or not allowed) even if the data stays on premises (which is the focus of this post). Power BI Report Server is an option in this scenario because it allows you to surface on-premises data in published reports without using any cloud services. Power BI Report Server does not have all the functionality of the Power BI Service; nonetheless, it’s a useful tool for sharing and securing data and reports for users.

In Summary

In this post, we talked about Power BI Desktop and Power BI Service, and how the two are different when it comes to accessing on-prem data. We also covered the differences between the Power BI Personal Gateway and the Power BI Enterprise Gateway. If you want to know more about Power BI, or you have questions about the material I discussed in this post, reach out to us.

Resources for Power BI Desktop:

Resources for Power BI Service:

5 Master Data Management Myths You Can Stop Believing

There are many exciting topics under the Data Management umbrella, but if I had to choose one that resonates with me the most, I’d go with Master Data Management (MDM). Why? Because MDM programs can enable any organization to realize the value of one of its most crucial assets: data.

Master Data Management is a hot topic in the IT industry. However, many organizations are on the fence about implementing it due to the wide impact of changes and a relatively high rate of MDM project failures. Yet, despite this, developing a comprehensive MDM strategy should be a priority for modern, data-driven organizations.

In my two decades of work in the data management space, I’ve noticed that even though MDM has gained considerable traction, it isn’t always well understood. In this post, I’ll share and hopefully debunk the most common myths that surround it.

But before diving in, let’s review the fundamentals of MDM.

Master Data Management is a framework that allows organizations to generate uniquely identifiable, business-critical data. This data is often referred to as an “entity”. In essence, MDM makes corporate data an integrated, harmonious whole by continuously bringing together source data, assessing its quality and ironing out the inconsistencies to solve data-related business problems.

Now that we established what MDM is, let’s explore what it isn’t.

Myth #1: Master Data Management is a software product.

Too often, we see that MDM is perceived as a software solution, when it really is a framework. Unfortunately, no software can handle the entire MDM framework right out of the box. Many vendors will pitch their product as the ultimate, holistic system, but what they don’t tell you is that MDM software is just an accelerator.

Of course, there is an undisputed value in MDM software, especially when it comes to simplifying and expediting certain elements of the master data management program such as Identity Resolution, Automation, Survivorship, and Remediation. However, approaching vendors to find what is available on the market shouldn’t be the first step when planning an MDM program.

While tools can certainly help or hinder MDM efforts, a successful MDM implementation is not made or broken by a tool. The real key to effective MDM lies in identifying the fundamental elements of the program and carefully designing the implementation roadmap. Planning early on will increase the probability of MDM implementation success and will help avoid unnecessary software spend.

Myth #2: MDM can be done in a silo

Organizational silos and resulting data silos are generally not conducive to effective data management operations.

Let’s take the finance department of a large manufacturing enterprise. This department takes in information from a couple of different financial systems, which leads to duplicated and inconsistent data. The organization’s CFO decides to run MDM solely for the finance department, focusing specifically on the “client” entity. As a result, this newly implemented MDM solution generates unique client records within the financial systems. It all looks good until a client contacts the company to change his address. Although the change is promptly reflected in the CRM, the financial systems remain untouched. Even though the client receives his product, the invoice never arrives because it was sent to the old address.

Successful MDM requires that we track our chosen data entity across the ENTIRE organization, without exceptions. If, as in the example above, client data lives in 80% of departments, then we need to incorporate ALL of them into the solution.
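A toy sketch of the identity-resolution and survivorship ideas at play. The matching key (name plus postal code) and the “newest record wins” rule are deliberately naive assumptions for illustration; real MDM tooling uses much richer matching:

```python
def normalize(record):
    """Crude identity key: lower-cased name plus postal code with spaces removed."""
    return (record["name"].strip().lower(),
            record["postal"].replace(" ", "").upper())

def build_master(*systems):
    """Merge client records from several systems; the newest record wins (survivorship)."""
    master = {}
    for system in systems:
        for rec in system:
            key = normalize(rec)
            if key not in master or rec["updated"] > master[key]["updated"]:
                master[key] = rec
    return master

# Hypothetical records: the CRM holds the client's new address, finance the old one.
crm = [{"name": "Acme Corp", "postal": "T5J 0K7", "address": "12 New St",
        "updated": "2020-04-02"}]
finance = [{"name": "ACME CORP", "postal": "t5j0k7", "address": "9 Old Rd",
            "updated": "2019-11-30"}]

master = build_master(crm, finance)
print(len(master))  # 1 golden record instead of two conflicting ones
```

The point of the sketch is the silo problem: unless every system feeds `build_master` and consumes its golden record, the finance system keeps mailing invoices to "9 Old Rd".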

Myth #3: Master Data Management is expensive

Because MDM is typically an enterprise-scale project, it’s automatically associated with large investments and significant effort. As with any large-scale project, the strain on corporate resources is hard to predict and can go well beyond the initial estimates. In the presence of multiple risk factors (broad scope, high impact of changes, technology that’s new to the organization), MDM projects can be loaded with uncertainty. However, there are ways to mitigate the risks and pave the way for success. One of the best is phasing the MDM implementation based on the criticality of data entities and their prevalence across the organization. This approach helps companies validate the program with a small budget, while still offering value to the organization.

For example, rather than focusing on a major data entity that’s widespread across the organization, companies can drastically reduce the risk by starting with a “smaller” and less critical entity. For many organizations, “employee” is a good test entity to use as a proof of concept. Not only is it less prevalent across the enterprise than much more critical entities such as “customer” or “product”, but it also offers a good starting point for further program expansion.

A gradual, iterative approach to MDM alleviates the risk of failure, improves adoption and sets the program up for success when rolled out on a large scale.

Myth #4: MDM can be successful without Data Quality

In my experience, enterprises are very confident in Data Quality across their systems, until that’s put to the test. In most cases, Data Quality is lacking, and the organization is rarely fully aware of it. There are two main reasons why this happens:

  1. There is no such thing as perfect data. Data Quality rules are based on current business needs and subjective decisions about quality of data, therefore it is hard to achieve and maintain a state where data is perfect.
  2. Data is constantly changing, and live data tends to bring inconsistencies. Data Quality is therefore a threshold at which quality reaches an acceptable level, not a state of perfection.

Before embarking on an MDM implementation, it’s vital to address critical data discrepancies between existing systems. This can mean, for example, standardizing the way you specify state or province names across all systems (e.g. picking between an abbreviated name “AB” and a full name “Alberta”). Without proper Data Quality, records that refer to the same entity will remain separate, creating false-negative matches.
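As a minimal sketch of that standardization step (assuming a small, illustrative mapping of Canadian province names, not a complete list), normalizing values to one canonical form lets records like “AB” and “Alberta” match instead of producing a false negative:

```python
# Illustrative subset of a province-name standardization table.
PROVINCE_MAP = {
    "AB": "Alberta", "ALBERTA": "Alberta",
    "ON": "Ontario", "ONTARIO": "Ontario",
    "BC": "British Columbia", "BRITISH COLUMBIA": "British Columbia",
}

def normalize_province(value):
    # Map any known spelling to the canonical form; pass through unknowns.
    return PROVINCE_MAP.get(value.strip().upper(), value.strip())

def same_client(record_a, record_b):
    # Without normalization, "AB" != "Alberta" and the match fails
    # (a false negative); with it, the two records merge correctly.
    return (record_a["name"].lower() == record_b["name"].lower()
            and normalize_province(record_a["province"])
                == normalize_province(record_b["province"]))

finance_rec = {"name": "Acme Ltd", "province": "AB"}
crm_rec = {"name": "ACME LTD", "province": "Alberta"}
print(same_client(finance_rec, crm_rec))  # True once both are normalized
```

Real matching engines use far richer rules (fuzzy name matching, address parsing), but the principle is the same: agree on one canonical representation before comparing records.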

Myth #5: MDM is best left to IT

Assigning the MDM implementation to IT might seem like an obvious choice. After all, the project deals with both data and software systems. And while the IT department is one of the major MDM stakeholders and is responsible for the technical implementation, MDM is also about solving specific business problems. When IT alone is assigned to establish MDM, the business tends to get less involved. This means that even though the technology is handled, business needs may not be fully addressed, which can lead to low MDM adoption.

Due to the wide scope and reach of MDM, the program should be centrally organized, overseen by a committee, and implemented by a well-represented, cross-functional team spanning the entire organization.

Are you ready for MDM?

Each year, the amount of data enterprises gather and produce increases exponentially. But without a structured organizational approach to data management, the data not only diminishes in value but also poses liability risks when mishandled. MDM can do a lot more for an organization than just reduce these risks – it can actively help realize the full potential of data by establishing an enterprise-wide state of data clarity.

Curious about Master Data Management, but don’t quite know where to start? Contact us today and we’ll walk you through the steps for adapting the framework in your organization.

Spark Some Joy and Declutter your Business Intelligence

It’s human nature to hoard stuff. I, for example, have a habit of buying tools specific to the hobby project that I’m working on. My behaviour usually follows the same pattern: I’m in the midst of a challenging task when I realize that a specialized tool would make my job easier. I then do a quick run to the local Home Depot to pick up what’s needed. When the task is complete, I find the right storage space for my new purchase. But despite trying to stay on top of what I own, I’m periodically confronted with a garage full of gadgets that serve a similar purpose.

Most often, one of these two things happens when we’re faced with clutter: we lose track of what we have and resort to using only the things from the top of the pile; or, like me, we keep getting new tools, only to realize that we own a lot of similar items that take up space but add little value.

The same principles apply to business intelligence and analytics. Most organizations with BI programs in place struggle at some point with duplicate reports, messy BI storage, or just plain lack of regular planning.

Luckily, as with other types of mess, BI too can be organized.

In fact, organizations that make a conscious effort to maintain order and declutter from time to time, reap the benefits of timely analytics that meet business needs without sacrificing performance.

Decluttering Business Intelligence

Setting up a successful Business Intelligence (BI) program requires planning. In the early stages of getting business analytics off the ground, we tend to spend a lot of time deliberating the size, scope and scale of the program. However, once the Business Intelligence program is up and running, new problems emerge, making organizations question their original approach. Just as in keeping a tidy house, the key to an efficient BI program lies in an ongoing rationalization and decluttering of various BI components.

How to tell if my Business Intelligence needs reorganizing?


  • You’ve got business intelligence reports or visualizations, but they don’t fully address your current needs. You find yourself creating completely new reports based on new data sets.
  • You’ve got many data sets in a centralized location, but you aren’t entirely sure where to find the right data.
  • You’re doubling or tripling the effort and/or data.

…if any of these sound familiar, you might be experiencing the common symptoms of cluttered BI.

On the surface, the above issues seem easy to address. Create a couple of catalogues, decommission old BI elements, and the BI platform will be clean. Unfortunately, it is rarely that easy…

Consider this example:

Suppose you have a BI report which shows which products were sold in different regions and at what price. To get this report, source data was extracted from several siloed systems in each of your regions including production systems, financial systems, and ERP systems. After extraction, your data was cleansed, transformed and consolidated in the dimensional Enterprise Data Warehouse. Then, the required dimensions and facts were extracted to a Data Mart and your report was created.

However, the market situation has changed, and you need to analyse how different types of clients consume your product. You reach out to the Data Analysts from the business unit, but they have a hard time locating the required industry information in the consolidated BI storage. When you turn to the team responsible for maintaining the BI storage, they offer little insight into the data content. You realize that there are no data owners, since data sets in the BI storage are all based on consolidated data. Now, to get the results you’re looking for, you will need to start a new project, extract the entire data set again, and create a new report. Often, at the very end of it, you will realize you’ve created a duplicate data set.

Take steps to organize your Business Intelligence

It’s easy to imagine what your business intelligence will look like after repeatedly generating BI products without addressing the underlying issues. At some point, BI rationalization (decluttering) will be required.

While there are many variations of BI implementations from an organizational and technical standpoint, at a high level the fundamental elements and data flow process remain the same:

Process of Organizing Business Intelligence

1. Start by reviewing your Data Sources

We often see that one of the leading causes of BI clutter is ingesting more data from the sources than what’s needed. To get organized, begin by identifying the current and future use cases for your business intelligence. For starters, every extracted data set should be associated with a Data Owner (also called Data Steward or Data Custodian) who can provide insights into this data set.

Once the data is extracted from its sources, it’s often transformed, normalized (restructured and modeled) and merged or joined based on the BI Data Architecture. This is a point where things can go awry. To avoid future complications, always opt to retain metadata lineage that can trace the data back to its sources.
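As an illustration of what retaining lineage can look like in practice, here is a minimal Python sketch (the field names are my own assumptions, not any specific tool’s schema) in which every derived data set carries its sources and the transformations applied to it:

```python
from dataclasses import dataclass, field

@dataclass
class DataSet:
    name: str
    rows: list
    sources: list = field(default_factory=list)          # upstream set names
    transformations: list = field(default_factory=list)  # steps applied

def transform(dataset, step_name, fn):
    # Apply a transformation while extending, not discarding, lineage.
    return DataSet(
        name=f"{dataset.name}__{step_name}",
        rows=[fn(r) for r in dataset.rows],
        sources=list(dataset.sources) or [dataset.name],
        transformations=dataset.transformations + [step_name],
    )

raw = DataSet("erp_sales_extract", rows=[{"amt": "10"}, {"amt": "25"}])
clean = transform(raw, "cast_amounts", lambda r: {"amt": float(r["amt"])})

# The derived set still knows where it came from and how it was built.
print(clean.sources)            # ['erp_sales_extract']
print(clean.transformations)    # ['cast_amounts']
```

Dedicated lineage and catalogue tools do this at scale, but even a lightweight convention like the above makes decluttering far easier later, because every data set can answer “where did you come from?”.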

2. Take a good look at your BI Storage

Business requirements and data consumption performance are the two major drivers that affect decisions on how data is originally joined, merged, and structured. However, in virtually all cases, business needs will change over time. When they do, the newly created BI products (dashboards, reports) and added sources of data will affect the structure of your BI storage sometimes leading to unnecessary complexity and poor performance.

There are several ways to reduce this complexity, and they all boil down to planning and organizing the storage, regardless of whether you’re using dimensional or non-dimensional data stores for your program. It’s self-evident that modeling is the most important clutter-limiting measure for dimensional databases, as it enables future development and assures performance. However, what’s less often understood is that non-dimensional BI stores also require a good dose of organizing. Any non-dimensional BI storage can benefit from clearly defining, maintaining and enabling all data types and linking them back to their sources. By also associating data with a specific owner and documenting it in a catalogue, you can ensure that your BI storage will be ready for further analysis.

3. Analyse your BI Consumption

Numerous organizations struggle with efficient intelligence analysis because their BI products don’t meet the changing organizational needs. As the enterprise grows and evolves, its dashboards and reports should follow suit. One of the best ways to ensure that the insights you receive are relevant and timely is by cataloguing BI products and attaching them to current business requirements.
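One hedged sketch of such a catalogue (all product and requirement names below are hypothetical examples): each BI product is tied to the business requirement it serves, and anything without an active requirement surfaces as a decommissioning candidate.

```python
# A toy BI product catalogue: each entry links a report or dashboard
# to the business requirement it serves (None = no known requirement).
catalogue = [
    {"product": "Regional Sales Dashboard", "requirement": "REQ-014"},
    {"product": "Legacy Margin Report",     "requirement": None},
    {"product": "Client Segment Analysis",  "requirement": "REQ-027"},
]

# Requirements the business currently recognizes as live.
active_requirements = {"REQ-014", "REQ-027"}

def decommission_candidates(catalogue, active_requirements):
    # A product is a candidate if it has no requirement at all,
    # or its requirement is no longer active.
    return [entry["product"] for entry in catalogue
            if entry["requirement"] not in active_requirements]

print(decommission_candidates(catalogue, active_requirements))
# ['Legacy Margin Report']
```

Reviewing this list on a regular cadence is one simple way to keep reports and dashboards aligned with changing organizational needs.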

You will notice that when data lineage is preserved and ready to handle the full BI data lifecycle, decluttering will become simpler and a lot more efficient.

4. Consider the organizational approach to Business Intelligence

As the last step, take a look at the big picture. For some companies, the source of BI clutter may be rooted in organizational approach rather than the BI process itself. For example, when BI elements are handled by several distinct groups within the organization, there is a significant risk that the approach to data isn’t always consistent.

In other organizations, what’s needed is a revamped project methodology. For instance, BI processes, strategy, and delivery are rarely effective when following the waterfall approach. Conversely, agile methodology can offer greater benefits, especially when paired with a defined Information Strategy and governance.

Finally, remember that you’re not alone. In fact, the industry has been struggling with disorganized BI for a long time. What helps set the leaders on track is a holistic, organization-wide change that comes with the implementation of architecture practices and defined governance, enabling timely, efficient and structured BI.

Do you see any red flags in your BI program? Give us a call. Together we can ensure that your organization has the right insights for accurate decision-making.

Business Intelligence – Tools Are Not Enough

We worry about tools too much.

Don’t get me wrong – tools are great. It’s hard to do much without them. But when it comes to building business intelligence solutions, focusing on tools rather than the business you’re building the solution for is a key reason why these solutions often end up being failures. The solutions don’t meet their users’ needs and are thus simply not used.