The Microsoft IT journey to transform BI

Technical Case Study

Published June 2015

Microsoft IT is on a multi-year journey to transform how the company works with BI. This is a major undertaking for any enterprise, requiring changes to people, processes, and technology. Learn about the roadmap we’ve built; the progress we’ve made so far; and the lessons learned, best practices, and reference architecture you can leverage to help your organization effect its own transformation.

Download

Download Technical Case Study, 2.41 MB, Microsoft Word file

Situation

The traditional highly structured BI data that Microsoft IT managed centrally did not align with the company’s transformation from a software licensing-driven company to a services and devices company. In particular, the Finance department needed a more agile solution that could provide strategic, tactical, and operational types of reporting based on controlled and governed data sets—but also offered more intuitive front-end tools that business users could use to collaborate, mash data, and derive new insights—something structured reports couldn’t accommodate.

Solution

Microsoft IT is building out a bimodal solution that supports Finance’s mature data reports while simultaneously adding an agile, self-serve aspect that opens up BI analysis to thousands of business users across the company. Currently in the middle of a multi-year roadmap, Microsoft IT is helping drive change in three key areas that are required to ensure a successful implementation:

  • People: Make analytics a cultural attribute of the company so that it becomes embedded in the corporate culture and is pervasive across the value chain.
  • Processes: Change the processes behind accessing, managing, and governing the data so that BI analysis is always based on quality data and can be performed by everyone.
  • Technologies: Create “one version of the truth” without having to move existing disparate data sources to a centralized warehouse, and offer intuitive tools that simplify data analysis.

Benefits

  • Productivity gains
  • Faster, better insights for leadership
  • Easy, powerful BI for business users
  • Frees Microsoft IT to focus on high-value services

Products & Technologies

  • Microsoft SQL Server
  • Power BI
  • Azure

Table of Contents

Executive Summary

The Situation for Finance

The Situation for Microsoft IT

The Need to Think Differently

Architecture and Functional Components

Use Case: BI Data Flow for Finance

Outcomes to Date

Lessons Learned and Best Practices

Conclusions

Resources

Related Videos

For More Information

Executive Summary

Businesses are transforming. As a consequence, how a company measures its business must change. Done well, Business Intelligence (BI) and analytics can empower business leaders and decision makers throughout an organization. Those who need BI must be able to access it quickly—and in such a way that they can both understand and trust the underlying data they use to support their decision making.

Despite this potential, most organizations struggle with their data, as John Lucker, Principal and Global Advanced Analytics and Modeling Leader at Deloitte, notes in the TechRepublic article Ask the 'crunchy' big data questions that lead to breakthrough analytics: “Most CxOs do not feel that they have a good handle on how to get the most out of their big data analytics… They say that they do analytics, but it's hard for them to define exactly what value they are getting from it.”

Historically, business leaders have not viewed their data in the same way as they have other more tangible assets, and the inherent complexity involved in managing data has overwhelmed even those who have applied significant effort to the task. As a result, most businesses find that, despite their best efforts, they end up with too many disparate data systems such as data warehouses, departmental databases, enterprise resource planning (ERP) and customer relationship management (CRM) systems, and others. As these core applications continually evolve, it becomes increasingly challenging for organizations to provide a unified view of both historical performance and predictive trends.

The scale of the challenge for any enterprise is enormous. After struggling with a deluge of data for years without achieving the desired levels of success, Microsoft IT is spearheading an initiative to undergo the most significant change to BI in the history of the company. Microsoft IT is following what Gartner has termed a bimodal approach: one that provides commonality at the core while pushing agility at the edge. These seemingly contradictory goals can be achieved by offering a combination of more traditional BI processes that focus on stability and efficiency (such as for Finance), while providing BI for other more experimental groups such as Sales or Marketing that look to accelerate product time-to-market.

In this paper, we discuss our ongoing, multi-year journey to adopt a new bimodal BI data model, to drive a culture change that embraces agility, and to transform how the company works with BI in order to enable the business to make strategic decisions more quickly and to ensure that those decisions are based on higher quality data.

Although the story we describe focuses on a specific organization’s (Finance’s) need, the concepts, the functional architecture, the roadmap, and the benefits the business is deriving from the steps Microsoft IT has achieved to this point are applicable to virtually any function within your organization.

The Situation for Finance

Similar to other large enterprises, Microsoft collects, stores, and analyzes large amounts of data to derive the BI used to make strategic decisions. The internal Finance organization, for example, utilizes the following key data systems and solutions for its BI:

  • Revenue reporting solution

  • Planning and forecast solution

  • Expense reporting solution

  • An analytics reporting solution that works in tandem with a Microsoft Excel plug-in and Microsoft SQL Server

Around the time that Microsoft transitioned from Steve Ballmer to Satya Nadella as CEO, Finance began reviewing the metrics it had been using to drive the business at the top level. Up until this point, the company had been very licensing-focused, which was a natural reflection of its revenues being derived primarily from software licensing. But Microsoft and its markets were shifting. With new cloud-based services such as Microsoft Office 365 and Azure, it became clear that the future revenue driver would be the cloud, with subscriptions and online services driving future growth.

This change in revenue models had significant downstream impact. Finance needed to identify and gain consensus on what the new key metrics should be, and then determine what types of reporting they needed in order to align with the company’s efforts to transform from a software licensing company to a cloud and devices business. Simply put, Finance had to re-assess how to define success.

Properly defining success is especially challenging when the business is evolving as rapidly as it is at Microsoft. How do you measure growth and success when they aren’t based purely on gross margins? How will the rhythm of the business be determined? How would the engineering and sales teams and other core groups be evaluated? Responding to these questions has been this initiative’s foundation—and it is still a work in progress. Finance has solidified around a core set of metrics, but its goal moves in step with the ever-escalating speed of change in the business.

This is the challenge Finance brought to Microsoft IT: to deliver a more robust underlying data architecture that addresses its current and future business needs in a dynamic way. The new solution needed to provide strategic, tactical, and operational types of reporting based on controlled and governed data sets. Furthermore, these data sets had to be accessible by more intuitive front-end tools that business users could use to collaborate, mash data, and derive new insights—something structured reports can’t accommodate.

The Situation for Microsoft IT

Historically, Microsoft IT was tasked with centrally managing all BI data, developing the data queries based on input from Finance (and other internal customers), and delivering the compiled BI reports to Finance. Microsoft IT was taking a set of pre-defined KPIs and building out systems to support them. This process evolved during a time when Finance’s reporting needs were very structured: reports were always compiled in a very specific way, run once a month, and the reporting pattern didn’t change.

During this period, Microsoft IT focused its BI development efforts on building 5-10 large, centralized data warehouses that were designed to serve the needs of a few internal business customers (including Finance). Each new report development effort required 6-9 months, and—given the structured nature of the data—the reports were designed to surface known factors as opposed to identifying potentially new insights.

The speed of business is much faster today than it was 10 to 20 years ago. Decisions now must be made based on data that comes in by the second, not by the day, week, or month. For example, Finance must regularly account for corporate acquisitions; reports must be adjusted to accommodate new markets and spaces that the company is entering. The legacy, highly structured BI reporting model could no longer keep pace with the business. As a result, Finance was not able to ask the questions that changes in the company’s business mandated.

In order to better support the business, Microsoft IT needed to evolve its data gathering, management, and reporting processes into an agile BI solution that could react to changes at a much faster pace. But how should it undertake such a large, complex initiative? Doing so required re-examining every aspect of how BI is owned, managed, delivered, and consumed.

The Need to Think Differently

Although the business need was clear, the way forward was not. As a technology-oriented culture, Microsoft IT has traditionally led initiatives with technology. However, after attempting a few unsuccessful technology-based approaches, Microsoft IT came to the realization that technology by itself could not produce such a solution. Delivering this new type of BI necessitated looking beyond hardware and software requirements.

Microsoft IT determined that it needed to develop a portfolio of capabilities that would improve access to, and the quality of, the data. In contrast to previous efforts that delivered reports on pre-defined KPIs, the new solution had to be bimodal in its approach (as termed by Gartner), addressing the changes to Finance’s mature data reports while simultaneously adding an agile, self-serve aspect that would enable more business users to benefit from BI insights.

As noted previously, successful implementation of this new solution would require more than a new technology stack. By examining the business drivers from an end-to-end view, Microsoft IT identified three interrelated areas that needed to change:

  • People: Make analytics a cultural attribute of the company so that it is embedded in the corporate culture and is pervasive across the value chain.

  • Processes: Change the processes behind accessing, managing, and governing the data so that BI analysis is always based on quality data and can be performed by everyone.

  • Technologies: Create “one version of the truth” without the need to move existing disparate data sources to a centralized warehouse, and offer intuitive tools that simplify data analysis.

Figure 1. Three areas that must be incorporated into a roadmap to drive effective change in the company’s BI data gathering and management process.

This new holistic approach was essential—previous initiatives that focused solely on a technology solution had failed. A successful outcome to this initiative mandated that the new roadmap would articulate what needed to change in all three of these areas.

Drive change in people

Our epiphany occurred when we realized that Finance should be enabled to perform their own BI analysis and reporting, and that Microsoft IT should focus on providing, scaling, and supporting the data. But this simple statement had wide-ranging ramifications.

Embracing a cultural change

Getting an IT organization to think differently—to become a business enabler—is critical. Microsoft IT needed the ability to provide data in a way that enabled Finance to ask questions with the speed and flexibility they required.

Microsoft IT assembled a group of people to work together, architect a plan, and obtain buy-in. This included:

  • Assembling managers and stakeholders—people at the right levels who could help create the appropriate connections and messaging—so they could work through their concerns and frustrations with the current system and hammer out an agreement on how to move forward.

  • Ensuring that all the vice presidents’ direct reports were truly aligned with the initiative and were committed to work out blocking issues as they arose.

  • Adding "boots on the ground" representatives who could provide tactical feedback back to management.

Changes for IT people

Microsoft IT decided to split IT personnel into two teams: business operations and engineering, and then to physically move the business operations people into Finance. Sitting within the Finance organization forced the business operations people to be closer to the business and to ask smarter questions about the data, while still maintaining a strong tie to Microsoft IT. This was a huge cultural change, requiring Microsoft IT personnel to re-evaluate their value in a new operational paradigm.

Changes for business people

Microsoft IT personnel weren’t the only ones impacted; changes needed to happen in the business as well. Finance had to understand that they should own the data because they own the process and the business impact they were trying to drive. By doing so, Finance could ask critical questions such as: Where are expenses coming from and going in the company? Where is my ability to make money in the company? What are the trends? Can I offset situations that are occurring in the industry with things that are happening in my own company (such as supply chain)? Having the ability to answer these types of questions would help Finance make more accurate predictions to positively impact the company’s bottom line.

Owning their data also meant that Finance needed to take on the responsibility of learning about their data as well as how to use tools and technologies to drive insights. In the past, when a person received a report that informed them they were not hitting their sales targets, they didn’t need to know the intricacies of a report’s underlying calculations or how certain errors were ignored in order to surface only the valid data. In this new paradigm, the business would need to know everything about its data; they would be accountable. No longer could Finance depend on Microsoft IT to clean up the data in the reporting—Finance would have to care about how their data was being entered by their organization and by other organizations.

Driving this cultural change of understanding peoples’ new roles and responsibilities in the organization is key. The new owners of the data—Finance—must know what to do to ensure the data they use on a daily basis is where it needs to be. This culture shift where the business becomes more knowledgeable about their data reflects a critical process change—data governance—which is discussed in the following section.

Drive change in processes

The single most important change Microsoft IT needed to make to business processes was to have Finance own their data. As introduced from the role perspective in the previous section, no longer would Microsoft IT be the owner of BI data; Finance recognized that they must drive and own how their systems are built in the company, and how they want to do predictive analytics.

In order to achieve this change, an entirely new set of processes around data governance had to be put into place to support the business in considering, managing, and using its data as a business-critical corporate asset.

Establishing data governance

Data governance is a set of processes for defining, implementing, and enforcing data policies to meet a company’s mission or to achieve specific business objectives (such as, “Know your customer”). It is about putting people in charge of fixing and preventing data issues so that the enterprise achieves its business goals.

When properly implemented, data governance ensures there is a mechanism to facilitate and communicate a common definition and understanding of information. Although we define data governance as a process in this paper, Microsoft IT recognized that aspects of it must encompass all three areas of change (people, processes, and technologies) in order to ensure that key information delivered throughout the organization is appropriately managed and maintained.

Data governance refocuses current efforts by setting governance in place to ensure data is correct and consistent across the organization, including:

  • Assigning an enterprise/institutional governance council to oversee efforts in managing data as an asset across the organization.

  • Defining roles to govern and own decisions about the data and business intelligence.

  • Outlining governance operating processes to carry out policies and procedures.

  • Defining technology enablers to support the organization.

  • Aligning to security, data life-cycle management, and quality standards.

Figure 2. Data governance spans all three areas of change.

Organizational structure

Multiple strategies must be adopted to achieve a goal. One strategy that plays a key role in making data governance successful is defining an organizational structure that aligns to, executes, and potentially accelerates the journey toward the goal.

As illustrated in Figure 3, a range of options exist for data governance structures. At one end of the spectrum, all decisions occur centrally at the enterprise level. At the other end, decentralized models place all data decisions “locally,” at the department or functional-area level. In between these extremes lie models that share varying levels of control between IT and the business.

Figure 3. Microsoft IT is determining the best data governance structure to use for the company, which lies somewhere between a Balanced and Federated data governance structure.

For Microsoft IT, recognizing the need to move away from its traditional centralized model was easy. The challenge has been in striking the right balance between the amount of centralized (IT) control versus local (business) control—something Microsoft IT is still working on fine-tuning today.

Operating model

The Finance data governance operational model Microsoft IT has built in conjunction with guidance from a network of governance bodies provides accountability at all levels of the organization to ensure data standards and policies are adhered to. The following table summarizes the roles involved.

Data Governance Sponsor

  • Provides overall guidance and makes overarching decisions

  • Sets vision and direction for strategic data issues

Data Governance Committee

  • Establishes the data strategy, and prioritizes the KPIs, business rules, and data quality requirements of the data program

  • Resolves issues that cannot be resolved at lower levels and are escalated to this committee

Data Domain Owners and Business Data Stewards

  • Push out policies, practices, initiatives, compliance, etc.

  • Bring data issues and requests through the governance process

Data Producers and Data Consumers

  • Data Producers: Create, update, and close data in accordance with enterprise standards and policies

  • Data Consumers: End users who receive data outputs for business or technical purposes

Table 1. Roles involved in the BI data governance operational model for Finance.

Drive change in technologies

In order to support this new data model where the business owned the data and performed their own analysis, Microsoft IT had to drive technology change at both the front end and back end.

Building out infrastructure

Microsoft IT needed to build out the infrastructure that could support these new BI processes. Each time a new system (such as a new planning system) was brought online, Microsoft IT had no standard means to combine the data within the UI layer for reporting. As an example, comparing how traditionally licensed on-premises data was collected versus capturing information on Office 365 subscriptions required a whole new engineering cycle to get the data into the reporting analytics tool and expose its information.

Microsoft IT reviewed how it aggregated data from data warehouses and other sources, and then how it surfaced the data. By designing the infrastructure in a set of layers, new underlying data sources could quickly be tied into a data processing layer, which, in turn, could surface data in an easy-to-consume fashion for data visualization tools (as described in the following sections).

Tip: Details of the solution’s reference architecture and its implementation are summarized in this paper’s following sections.

Developing BI tools for the business user

For Finance, Microsoft IT initially focused its technology efforts on custom code, establishing a pattern of defining views, performing an extract, transform, load (ETL) process, and getting the data to a state where metadata could be applied against it. Then, when this was served up to customers, Microsoft IT could examine the metadata layer and data views to answer a specific business question.

Microsoft IT decided to provide the data to business users and to expose views that are consumable, while shielding them from the technical intricacies of how all the data is presented. This was achieved in two stages:

  1. First, the business looked at prescribed views to gain insights. These views advanced the reporting process to a certain point by moving report writing out of the way and by enabling decision makers to gain insight on specific business questions. However, these prescribed views limited the types of questions that could be asked to those within their scope.

  2. Next, Microsoft IT added the new capabilities of Power BI, such as Power Pivot and Power View, that enable different perspectives and data views to be mashed together. Equally important is how Power BI’s ease of use opens BI exploration to the average business person. No longer is BI confined to the analyst; instead, thousands of people can now play with data and make discoveries—which ultimately translates to a much greater likelihood of gaining new business insights.
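
As a toy illustration of the first stage's pattern—ETL into a curated store, then a prescribed view that shields business users from the plumbing—consider the following sketch. The table names, schema, and SQLite backing store are illustrative assumptions, not the production design.

```python
import sqlite3

# Land raw rows, run a simple ETL step, then expose a consumable view.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_revenue (region TEXT, amount_usd REAL, status TEXT)")
conn.executemany(
    "INSERT INTO raw_revenue VALUES (?, ?, ?)",
    [("EMEA", 120.0, "valid"), ("APAC", 95.5, "valid"), ("EMEA", -1.0, "error")],
)
# ETL step: filter invalid rows into a curated table.
conn.execute(
    "CREATE TABLE curated_revenue AS SELECT region, amount_usd "
    "FROM raw_revenue WHERE status = 'valid'"
)
# Prescribed view: the business sees aggregates, not the underlying tables.
conn.execute(
    "CREATE VIEW revenue_by_region AS SELECT region, SUM(amount_usd) AS total "
    "FROM curated_revenue GROUP BY region"
)
totals = dict(conn.execute("SELECT region, total FROM revenue_by_region"))
print(totals)
```

As the text notes, such prescribed views answer only the questions within their scope; the second stage (Power BI mashups) is what opens exploration beyond them.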

Architecture and Functional Components

In this section, we provide a reference functional architecture of the BI solution and offer some descriptions of key components that can be applied to virtually any enterprise organization. Figure 4 provides a high-level diagram of the solution’s layers and key functional components.

Note: This graphic represents the solution’s functional components and their interrelationships; it is not designed to compare any construct’s relative importance or complexity versus another.

Figure 4. A reference architecture for a BI solution that highlights layers and key functional components.

Additional details concerning these layers and their components are provided in the following sections.

Data source layer

The data source layer represents the entire data ecosystem filled with applications, devices, sensors, operational data stores, and other types of sources from which data originates. These various generators of data could be on-premises, in the cloud, or supplied from third-party sources.

No data-generating source should be excluded from this layer, because you won’t be able to predict which pieces of data provide you with the best insights. The power you gain is through the connection of data—which isn’t readily apparent when thinking through traditional reporting needs. You can prioritize which sources you should get into the data processing layer first, and then slowly work down the list until you have all your data flowing there.

The volume and frequency of the data that generates in these sources will vary dramatically, and so will the required sensitivity, data freshness, and criticality. Based on these needs, it will be important to select the appropriate transport method to get the data into your data processing layer—be that from traditional batch processing, event-based processing, or data streaming.
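
The transport decision described above can be sketched as a simple routing function. The thresholds and category names here are illustrative assumptions, not prescriptive guidance:

```python
def choose_transport(freshness_seconds: float, event_driven: bool = False) -> str:
    """Pick an ingestion path based on how fresh the data must be.

    freshness_seconds: maximum acceptable staleness for the data.
    event_driven: whether the source emits discrete business events.
    """
    if freshness_seconds <= 5:
        return "streaming"      # e.g., telemetry or clickstream data
    if event_driven:
        return "event-based"    # e.g., an order placed or account created
    return "batch"              # e.g., nightly ERP extracts

# Illustrative uses:
print(choose_transport(1))                          # near-real-time source
print(choose_transport(3600, event_driven=True))    # business events
print(choose_transport(86400))                      # daily refresh is fine
```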

Data processing layer

The data processing layer is the business analytics engine that handles secure data ingestion of varying formats and velocity. Whether data flows in via event-based, stream-based, or traditional batch processing, it all lands in the data lake, which acts as the central repository for data. The layer then leverages both relational (data warehouse, data mart, and cubes, as described in the following text) and non-relational (data lake, as described in the following text) data processing capabilities to transform the data into information that can be leveraged for insights, distributing it in various curated states to the presentation/consumption layer. It also provides a single enterprise data directory of its distribution-ready data assets to support self-discovery and consumption by BI developers, analysts, and decision makers alike.

Figure 5. The data factory is the heart of the data processing layer, orchestrating all the solution’s data processing.

Data factory

At the heart of the data processing layer is the data factory. The data factory is the orchestrator for all processing that must occur with the data, such as ingestion, profiling, cleansing, integration, curation, and cataloging. The key is having that “single pane of glass” to see how your data moves and transforms between the different components in your data processing layer. This becomes hugely critical in data management—especially when attempting to understand and address data quality.
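
A minimal sketch of this orchestration idea follows: each processing step is applied in order, and a lineage log records what happened to the data—the "single pane of glass" the text describes. The step names and record shape are hypothetical:

```python
def run_pipeline(record, steps):
    """Apply each named processing step in order and keep a lineage log,
    so every transformation the record went through is visible in one place."""
    lineage = []
    for name, step in steps:
        record = step(record)
        lineage.append(name)
    return record, lineage

# Hypothetical steps mirroring the data factory's responsibilities.
steps = [
    ("ingest", lambda r: dict(r, source="erp")),          # tag origin on ingestion
    ("cleanse", lambda r: dict(r, amount=max(r["amount"], 0.0))),  # fix bad values
    ("catalog", lambda r: dict(r, registered=True)),      # register for discovery
]

result, lineage = run_pipeline({"amount": -10.0}, steps)
print(result)
print(lineage)
```

Real orchestrators add scheduling, retries, and monitoring, but the core value is the same: one place to see how data moves and transforms, which is what makes data quality issues traceable.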

Non-relational processing – data lake

One of the key components of the data processing layer is what we describe as the data lake. This construct is the starting point for the enterprise data in the company. Think of it as a sort of “source control” for the data where all ingestion happens. This will be a big data platform that can be in the cloud, on-premises, or a hybrid of the two. The important aspect of this construct has to do with the two key purposes it serves:

  • To be the ingestion layer for all generated data, and

  • To provide non-relational processing capabilities

The massive scale-out capabilities and distributed computing power of the data lake make it a perfect candidate for these two purposes, and provide the foundation for your data storage strategy. In addition to data flowing in from sources outside the data processing layer, data will also flow in from the relational components of the data processing layer.

For example, you may have moved relational data into a data warehouse for data mastering purposes. Or perhaps you completed further curation in a data mart for finance purposes. Once those transformations are complete, your curated datasets would then be copied right back into the data lake. This ensures anyone can always depend on the fact that the data lake holds all the enterprise data they may need.
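
The copy-back convention can be sketched in a few lines. The lake is modeled as a simple keyed store; the function names and dataset names are illustrative assumptions:

```python
# Toy model of the data lake: it holds both raw ingested data and curated
# datasets copied back from the relational stores, so consumers can rely
# on one repository containing all enterprise data.
data_lake = {}

def ingest(dataset, rows):
    """All generated data lands in the lake first."""
    data_lake[("raw", dataset)] = rows

def publish_curated(dataset, rows):
    """Curated output from a warehouse or mart flows back into the lake."""
    data_lake[("curated", dataset)] = rows

ingest("revenue", [{"region": "EMEA", "amount": 120.0}])
# After mastering/curation in the relational layer, copy results back:
publish_curated("revenue_by_region", [{"region": "EMEA", "total": 120.0}])
```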

Relational processing - data warehouse, data mart, and data cubes

The relational components of the data processing layer provide capabilities for more traditional relational processing operations that are still prevalent in today’s ecosystem. While Hadoop-based offerings excel at non-relational, unstructured data needs, they still can’t replace the relational capabilities of traditional SQL platforms.

Some common examples are maintaining master data entities, performing “set processing,” and setting fine-grained data security. In addition, there are still many presentation/consumption layers—especially those targeted at non-technical decision makers—which are designed to connect to relational sources such as multi-dimensional and tabular cubes. Therefore, the relational components in your data processing layer will still be key assets for leveraging in a holistic solution.

Enterprise data directory

One aspect in such a complex ecosystem that can’t be forgotten is the discoverability of the data assets that are generated, so insights can be derived. This is where an enterprise data directory comes in. This is a central directory where data assets can be registered for discoverability and consumption—whether by the BI developer, analyst, or decision maker.

Similar to how the data factory is the glue that ties the process together from a technical perspective, the enterprise data directory is the glue that ties it together from a business process perspective. The integration of this component with the data factory is key to ensuring that the registration is frictionless. If your directory is a separate stand-alone solution requiring a separate business process to keep it up to date, the likelihood of it succeeding is low (which will have a direct impact on the discoverability of the data assets—and therefore the insights derived which will drive business value).
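
The "frictionless registration" point can be illustrated with a sketch in which publishing a dataset registers it in the directory as a side effect, so no separate business process is needed. The class, storage URI scheme, and field names are hypothetical:

```python
class DataDirectory:
    """Toy enterprise data directory: assets register themselves as part
    of publication, so discoverability never depends on a manual step."""

    def __init__(self):
        self._assets = {}

    def register(self, name, owner, location, tags):
        self._assets[name] = {"owner": owner, "location": location, "tags": set(tags)}

    def search(self, tag):
        return [n for n, a in self._assets.items() if tag in a["tags"]]

directory = DataDirectory()

def publish(name, rows, owner, tags):
    location = f"lake://curated/{name}"  # hypothetical storage URI
    # ... write rows to storage here ...
    # Registration happens inside publish, not as a separate process.
    directory.register(name, owner, location, tags)
    return location

publish("revenue_by_region", [], owner="finance", tags={"revenue", "finance"})
print(directory.search("finance"))  # -> ['revenue_by_region']
```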

Presentation and consumption layer

The presentation and consumption layer is the final piece of the business analytics infrastructure, exposing the corporate data assets to the business, customers, and partners alike. Whether it’s the use of Power BI for self-service analytics (as described in the following section), custom applications and services for customer experiences, or machine learning for advanced analytics insights, this suite of consumption assets gives people the power to analyze and combine data in new ways beyond traditional static departmental reporting.

Making data more accessible to the non-technical decision maker is key to empowering the whole organization to gain new insights and perspectives quickly—which, in turn, drives better decision-making. Gone are the days where only a select few could harness the richness of the data that was locked up in the corporate data ecosystem, or when the business could afford to wait weeks or months for IT to deliver on reporting requirements provided to them.

The decisions made on the technologies in this layer should be about easy discoverability, frictionless access, and intuitive manipulation of the data assets to solve real business problems. Furthermore, this should be accomplished through self-service and allow for easy collaboration and sharing. The social elements in this data analysis process are vital to create a healthy corporate data culture.  

A company can only get so far if the consumption layer is still viewing the data in a siloed mentality (such as where a sales person looks only at sales-related data). Individuals must be able to make connections beyond the data they normally work within. In order to view BI through a single pane of glass, people also must be able to “play” with the data to discover new insights that present themselves beyond the standard questions they may ask.

Use Case: BI Data Flow for Finance

To better exemplify the broader architecture from the enterprise perspective described previously, this section details a specific use case—BI for Finance. This use case presents Finance’s new BI ecosystem at a high level, touching on data input, processing, retrieval and publishing, and data presentation. Although this first implementation of the new BI solution is still underway, it is already providing value to the business—and continues to benefit Finance at each milestone.

Overview

Microsoft has a reporting and analytics tool (a plug-in to Excel) that provides predefined, customizable views of the data from the solutions it sits on top of: Revenue, Planning, Forecast, and so on. In this example, by consolidating worldwide sales into an efficient single solution, the revenue reporting system provides users with a common, accurate, and consistent picture of the state of the company's business. In turn, this information enables better strategic decisions throughout the company and its subsidiaries.

Figure 6 provides a specific implementation of the architecture presented in the previous section, where the relevant parts of the generic architectural diagram are mapped to the Finance solution’s specific components and data.

Figure 6. Finance’s revenue reporting BI solution.

Data source layer

The revenue reporting system has over 1,400 upstream reporting data sources and provides over 20 years of revenue details for all businesses and channels across the company. This accumulates to approximately 9 billion rows of data and almost 2 TB total used data space that is hosted on nearly 30 servers. Clearly, this is an extremely complex system.

This system is both a provider and a subscriber of various amounts of data. Primary inputs are SAP, CRM, and many other internal and external reporting partners' data. The revenue reporting system receives this data and lands it in its data warehouse. The data then goes through the data factory process, where various business rules are applied, resulting in the formation of marts and cubes that offer unique perspectives of the data (such as end customer, licensing, reseller, and others).

A few years ago, the Microsoft business model changed slightly to align with cloud computing. Because cloud service SKUs are handled differently than boxed products, the reporting solution had to begin accepting these files in an updated format through our internal systems and big data instances. This new data flow is the future for Microsoft, and the Finance space has already begun to adopt it.

Data processing layer

After landing in the revenue reporting system, upstream data enters the factory process, which sends it through a series of complex business rules and allocations that the business is able to update in real time. This process was built with flexibility in mind so that the business can update rules on its own, eliminating the need to submit change requests to the engineering team. The end result is the creation of multiple cubes and marts for downstream consumption by additional systems or individuals.
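The essence of this design is that business rules live as data the business can edit, rather than as code requiring an engineering change request. A minimal sketch of that idea, with hypothetical rule and field names:

```python
# Hypothetical sketch: business rules expressed as data, applied by the
# factory to each landed row to produce a mart for downstream consumption.

rules = [
    # Each rule is a predicate plus the fields it sets; because rules are
    # data, the business can add or change them without an engineering cycle.
    {"when": lambda r: r["channel"] == "OEM", "set": {"segment": "Devices"}},
    {"when": lambda r: r["product"].startswith("O365"), "set": {"segment": "Cloud"}},
]

def apply_rules(row, rules):
    out = dict(row)  # don't mutate the landed row
    for rule in rules:
        if rule["when"](out):
            out.update(rule["set"])
    return out

landed = [
    {"channel": "OEM", "product": "Windows", "revenue": 100.0},
    {"channel": "Direct", "product": "O365-E3", "revenue": 250.0},
]
mart = [apply_rules(r, rules) for r in landed]
```

In the real solution the rules are far more complex (allocations, hierarchies), but the design choice is the same: keep the transformation logic editable by its business owners.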

Presentation and consumption layer

The revenue reporting system's primary analytical tool for data consumption is the company's internal reporting analytics system. Prior to November 2014, this analytics system was used across multiple data sets in independent extracts to pull data into multiple Excel files and tabs, where the data would be mashed up manually in an attempt to gain further insights. With the integration of Power BI (the company's Excel-based BI visualization tool) into the analytics system, users can pull those data sets into a single analytics client, export to a Power Pivot model, combine the data based on defined keys, and then create a visual representation of the data in Power View.
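The key-based combination step can be illustrated with a simple join. This is a sketch of the concept, not Power Pivot's actual engine, and the column names are hypothetical:

```python
# Hypothetical sketch: combining two data sets on a defined key, as a
# Power Pivot model does when related tables are pulled into one model.

def join_on_key(left, right, key):
    # Index the right-hand table by key, then enrich each left-hand row
    # with its matching attributes (an inner join on the key).
    index = {row[key]: row for row in right}
    combined = []
    for row in left:
        match = index.get(row[key])
        if match:
            combined.append({**match, **row})
    return combined

revenue = [{"customer_id": 1, "revenue": 500.0},
           {"customer_id": 2, "revenue": 125.0}]
customers = [{"customer_id": 1, "region": "EMEA"},
             {"customer_id": 2, "region": "Americas"}]

mashup = join_on_key(revenue, customers, "customer_id")
```

Once the data sets share well-defined keys, this kind of combination is mechanical, which is what lets business users mash data without manual Excel copy-and-paste.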

Once a data set is curated enough to visualize, whether it comes from the enterprise data directory or from Power BI, a few different tools can be used to display the information: Power View, Power Map, Power BI.com (currently available as a public preview), or Microsoft SharePoint Online with BI Sites enabled. Across the enterprise, these new technologies have proven to increase speed to insight, provide frictionless access to once-"hidden" data, and deliver a rich new experience with what is now actionable information.

Power BI

Microsoft Power BI provides a rich suite of data visualizations that enable users to interact with their information quickly and efficiently. Power BI natively integrates with Stream Analytics to consume data and refresh dashboards seamlessly, without the need for a page refresh or reissuing a query to a data source. Power BI also supports the more traditional approach of consuming data from Azure SQL Database (among many other relational and non-relational data sources) and storing it in a data model hosted in Azure.

This model can be scheduled to automatically refresh the data against the underlying data source on a periodic basis—down to minutes if appropriate. By supporting both real-time event-driven dashboards and batch analysis based on pre-populated data models, Power BI allows organizations to capture data from an array of different system types while also accommodating differences in data velocity.

To support both real-time and scheduled-refresh data in this reference architecture, Power BI connects to Azure Stream Analytics to consume events egressed from the engine, and to Azure SQL Database to refresh the populated data model. Data visualizations representing data from either Stream Analytics or SQL Database can be combined onto a single dashboard to support one view across all information, regardless of the underlying source.
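The two refresh paths described above can be modeled abstractly. The following is a toy model, not Power BI's actual implementation: event-driven tiles update the moment an event is pushed, while model-backed tiles re-query their source on a schedule.

```python
# Toy model (hypothetical, not Power BI internals) of the two refresh paths:
# push-based real-time tiles and pull-based scheduled-refresh tiles.

class Dashboard:
    def __init__(self):
        self.tiles = {}

    def push_event(self, tile, value):
        # Real-time path: an event arrives (e.g., from a streaming engine)
        # and the tile updates immediately; no query is reissued.
        self.tiles[tile] = value

    def scheduled_refresh(self, tile, query_source):
        # Batch path: on its schedule, the data model re-queries the
        # underlying store and the tile shows the refreshed value.
        self.tiles[tile] = query_source()

sql_store = {"monthly_revenue": 1_000_000}

dash = Dashboard()
dash.push_event("live_orders", 42)
dash.scheduled_refresh("monthly_revenue", lambda: sql_store["monthly_revenue"])
```

Both paths converge on the same dashboard, which is what gives viewers a single view regardless of data velocity or underlying source.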

Although it is not reflected in the previous diagram, Power BI also provides support for connecting to Azure Blob Storage and HDInsight directly. This is a common analytics scenario, but not one that is generally part of an event-based real-time processing solution.

Outcomes to Date

Although we are still on our journey of BI transformation, we are already deriving benefit from our efforts. Figure 7 provides a snapshot comparison of our old way of doing BI versus the value we are gaining from the progress we have made so far.

Figure 7. Comparing the results of the old way Microsoft IT did BI with the new solution.

Microsoft roadmap

As illustrated in Figure 8, Microsoft IT devised a multi-phase roadmap that allowed a realistic span of years to fully implement Finance's BI solution while providing value to the business at each milestone along the path toward completion.

Figure 8. Our multi-phase, multi-year roadmap to achieving an agile BI solution.

Phase 1: Reorganize personnel

Phase 1 was about changing the mental model of the organization around how analytics gets done:

  • Getting IT to focus on implementing a data ecosystem that allowed information workers to easily discover and consume the data and drive their own insights.

  • Getting the business to be more data-driven in their decision making by leveraging the data available to them to drive new insights.

All of this is accomplished by driving a data culture and shifting the appropriate resources into the business to improve the organization's analytics capability.

With the re-allocation of business operations IT personnel into Finance completed, Microsoft IT then focused on encouraging people to share BI information and ideas, and to especially leverage other peoples’ ideas to innovate. By fostering a sense of inter-organizational collaboration, the movement continues to capture more attention and promotes the culture shift in ever-widening circles.

Phase 2: Define and implement processes

Phase 2 is about changing the processes behind accessing, managing, and governing the data so that BI analysis is always based on quality data and can be performed by everyone. The goal of Phase 2 is to improve the discoverability, timeliness, accuracy, and connectability of the data to accelerate data-driven decision-making and drive quicker, deeper, and richer business insights.

Phase 3: Scale across the enterprise

In the third phase, Microsoft IT will focus on the technologies needed to scale and support the solution once the people are on board, and the new systems, processes, and roles are established throughout the enterprise.

The milestone to achieve in this phase is to evolve from the successes of a local BI level (such as Finance) to helping connect the entire company. When business users can connect to different people in different groups and with different responsibilities across departments and organizations, they will gain insights that were never even considered possible previously.

Phase 4: Embrace future changes

For Microsoft IT, implementing Phase 3 correctly means reaching BI nirvana. The challenge with Phase 4 is that it is pushing into the unknown and deriving business value from future technologies. It might be out there, but we don’t know what it is yet.

Today, the way we make connections is by reacting to the data. We're trying to get iterative data streaming in, we're trying to run predictive modeling on what we see, and we're possibly using some machine learning capabilities so the system starts to learn from what we're seeing. Technology, and the way we design technology, has to catch up. We need to get to a place where we're actually influencing the data that's going to be generated: creating situations and interactions that drive certain new behaviors with our customers and ourselves, where we allow the technology to self-evolve.

Machine learning is in its infancy. Cortana, as an example, has a conversation with the user and starts to understand that person: what they think and feel, and how they react to things. Cortana can use this information to craft new connections that are not even in the original data structure.

This is new technology, but it's coming quickly. If a company hasn't reached the Phase 3 level in the next few years, it simply won't be able to compete against companies that achieve Phase 4. We've seen similar technology waves before. Consider the 1970s and 1980s, when microcomputers first became popular. PCs were interesting, but not central to the survival of the business. Yet where would any enterprise be today without its computer systems? Could it compete? Of course not. Microsoft thinks the same way about BI and its future criticality. Some companies might consider it attractive but not necessary today. But in the next few years, the company's survival might be at stake.

Productivity gains

Although such a large and complex initiative is expected to require several years to be implemented fully, we are already seeing how our efforts are producing value to the business at each milestone.

One example is the increase in productivity. Microsoft Finance Director Marc Reguera estimates his team of BI analysts recovered approximately 20 percent of the time they used to spend searching for data and pulling individual pivots into PowerPoint. The new solution enables them to surface the right data quickly through powerful visualizations—driving interactive, informed decision-making without needing to engage with Microsoft IT for coding support each time they want to ask a different data modeling question.

Faster, better insights for leadership

Another benefit we are seeing from our efforts is the alignment with the “One Microsoft” initiative to provide end-to-end views of key internal processes. With this new, agile BI solution, leadership can gain faster and better insights into the business, helping them make data-driven decisions within moments of asking their questions.

Figure 9. An example of how the new BI solution enables leadership to get immediate and accurate answers to strategic questions.

Easy, powerful BI for the business user

Building powerful BI tools that work within Excel has brought BI to the masses. The new BI solution’s front-end tools offer managed self-service BI that enable business users to mash data easily and to create interactive, dynamic reports on their own.


The exponentially increasing number of people who are playing with BI data and generating reports is strong evidence of the Microsoft culture shift in action. It also reflects the intersection between the change in people (to move from a siloed to a collaborative mindset), the change in process (where the business owns the data and provides certified quality information for people to use), and the change in technology (to provide intuitive tools that are easy for the business user to work with).

Cumulatively, the new BI solution is helping everyone derive new insights, not just the relatively few power analysts. When you consider the tens of thousands of users who are now visualizing their data in thousands of different ways, the business can find answers and insights from questions that they didn’t even know they had.

Frees Microsoft IT to focus on high-value services

In this new BI solution, Microsoft IT provides support for the underlying infrastructure and services, while the business owns accountability for defining data and use. This frees Microsoft IT to focus on delivering high-value services to internal customers, including scaling the solution to ultimately deliver self-service BI functionality companywide.

One example project that Microsoft IT has been able to pursue is master data management, which involves defining a "single source of truth" for very important and commonly used data (such as product, person, or company). Defining this single source is critical to connecting data correctly across the company. Before the new BI solution, Microsoft IT had no time to devote to master data management. But today, the new BI solution's self-service capabilities allow Microsoft IT to operate proactively. Moreover, the master data management effort acts as an accelerant on the BI initiative by feeding better data into the system. In turn, the business uses the higher quality data that comes from master data management to improve BI analysis and to glean more insight.
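At its core, master data management maps the many variants of an entity found in source systems to one canonical master record. A minimal illustration of that idea, using entirely hypothetical data:

```python
# Hypothetical sketch of master data management: resolve the name variants
# used by different source systems to a single canonical master record.

master_customers = {
    "C-001": {"name": "Contoso Ltd.", "country": "US"},
}

# Alias table: how each upstream system refers to the same customer.
aliases = {
    "contoso": "C-001",
    "contoso ltd": "C-001",
    "contoso limited": "C-001",
}

def resolve(source_name):
    # Normalize the incoming name, then look up the master record it maps to.
    key = source_name.strip().lower().rstrip(".")
    master_id = aliases.get(key)
    return master_customers.get(master_id) if master_id else None

record = resolve("Contoso Ltd.")
```

With every source system resolving to the same master identifier, joins across organizations connect correctly instead of fragmenting on spelling variants.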

Lessons Learned and Best Practices

People best practices

People come first and foremost in the steps to success. If you don’t get the people on board, you can put all the processes in place and incorporate all the latest technology, and your initiative will fail miserably. A CIO might buy the latest and greatest of everything and adopt every business process and best practice, yet still not achieve success if the people haven’t been brought into the equation.

As a company, Microsoft has learned this from our own experiences. Due to our traditional culture, we have always led initiatives with technology; it’s always been our starting point. The people and the processes would not be the core focus, nor would they garner much attention. In fact, there were occasions where the people aspect was never considered at all. The mindset was always technology first, then process as necessity demanded. Such an initiative would never get off the ground. Lead with your people.

Move away from a siloed organizational mindset

What we learned

Moving to a federated data model creates a lot of tension on duplicate data. Microsoft IT learned that it still needed a level of governance around the subject areas across the company that are shared as data is pushed to the edges.

Our recommended best practices

In the beginning, a siloed mentality can help you optimize specific verticals quickly. However, these seams quickly become major barriers to driving true data insights and will result in a fragmented experience for your customers. A siloed mindset can also reduce the organization's ability to align and drive business outcomes holistically.

The ability of organizations in an enterprise to leverage each other seamlessly is becoming critical to maintaining competitiveness. If your company is similar to Microsoft, in that the culture is one of a startup mindset where each organization is its own entity and there is very little sharing and very little dependency management, that model must change. Some companies might have more time to make the required change than others, but ultimately the success of any enterprise will in part be determined by its ability to encourage inter-organizational innovation. Silos must go.

Build a v-team that includes both top level and boots on the ground representatives

What we learned

The best assemblage of people is a virtual team that includes both top-level and "boots on the ground" representatives who collectively can ensure that the actual change management happens from a people and a process perspective. If a CIO thinks he or she can just have this initiative work organically, it will get to a certain point and then die. If the CIO simply issues an edict to direct reports, the effort will still fail, because the "boots on the ground" will never receive a clear message or gain an understanding of the initiative and their value in the new order. The message has to come from both top and bottom, meet in the middle, and adjust as needed.

Microsoft IT came to this realization after seeing how earlier initiatives would fail when driven only by top-down or bottom-up approaches.

  • The problem with top-down is that most organizations don’t have the ability to go seven levels down and keep the message crisp to where it’s actionable. This includes Microsoft.

  • Similarly, with the bottom-up approach, boots on the ground can achieve a certain level of progress from a departmental perspective, because that is the level they can control. But at the scale of a Finance department in a Fortune 500 company such as Microsoft, people don't have to go very far before they hit interdependencies on aspects that are outside their control.

Our recommended best practices

To get people to invest in solving these interdependent problems, you must have the top-down in conjunction with the bottom-up. Without the top, the bottom might achieve pockets of success where the initiative is working well, but it will never be able to scale to the enterprise level.

You have to meet in the middle. You must have buy-in from both ends, and you have to learn from each other. The bottom will find areas where the top’s message didn’t hit all the VPs or middle management, and the bottom can highlight these trouble spots and the top level can ensure the message is passed properly. The top level might request the bottom perform a certain task as part of the initiative, but the bottom can provide feedback about potential unforeseen tactical issues that might put the effort’s success at risk. This regular cycling of sharing, offering feedback, and learning from each other remains key to success.

There is one more dimension to consider beyond bottom-up and top-down: across organizations. Whatever process you're dealing with or vertical you're in, this third dimension concerns the need to tie measurable success to situations where cross-organizational insights, innovations, or other successes are achieved. This harkens back to the necessary change in culture from a single-person, heads-down-to-solve-the-problem mindset to a collaborative approach where people proactively seek out each other's experiences to help find solutions to problems or new opportunities. Essentially, this means it's top-down, bottom-up, and across.

Identify who in IT to move into the business

What we learned

One of the critical elements required to effectively enable a data culture is having individuals who are skilled in analytics sit in the business, to help drive up the analytics maturity of that organization. When these people sat in IT, Microsoft IT was perpetuating the old process of Finance providing requirements to Microsoft IT, which then had to deliver on the data analytics needs without a good understanding of the business.

Microsoft IT also encountered challenges with skills alignment when trying to think differently about affected roles. Analysts who had spent their careers being good at performing analytics were suddenly faced with relearning how to curate data for their business to consume. Without clear communication, morale was impacted. Microsoft IT realized that it’s all about having the right role and resource in the right location in the organization.

Our recommended best practices

How do you determine who in your IT organization should be physically moved into the business they support? First, move the people who are purely report writers out of IT. If those who actually build the reports are sitting in IT, they need to be reorganized so that they sit in the business. Create an analytics organization within Finance (or the relevant business group) that has report-creating capabilities.

In contrast, keep the people in IT who have been serving up the data and driving processes. Of course, they too must think differently: these IT people need to provide the data in a way that the report writers (who now sit in the business) can use to drive the insights.

There is no room anymore for a person who sits inside IT and builds reports. The natural tendency for IT people when they're told they're no longer providing IT reports is to ask, "What am I going to do? This is what I've been trained for, and what I'm good at doing. You're taking my job away." If IT isn't positioned correctly and its people aren't messaged to correctly, a dynamic is created in which there is massive angst that they are being outsourced and no longer have value.

To alleviate this, leadership must be clear about communicating the overall picture and the new, important roles these IT people can play. To maintain relevance, report writers will need to have at least one of the following skills:

  • Being comfortable as a business person who is very skilled at analytics: Those who want to be the person who drives insights in a business should move to the business and become an enabler within that department.

  • The interest and desire to become a teacher and to evangelize: Those who like to teach and help drive adoption of the process should stay within IT’s engineering team.

Indicators for when it’s time to change

You know your organization needs to make this culture change when IT’s perspective is, “the business can never make up its mind; they’re constantly changing their requirements, we build what they ask for, and they still don’t want it;” and the business perspective is, “IT is too slow, they never give me what I asked for, and by the time they give it to me I’ve already lost the opportunity. I’m tired of waiting for them, so I’m going to do this myself.” When you hear two organizations describe each other in these terms, you know they’re ripe for this cultural change.

When Microsoft IT and Finance arrived at this point, Microsoft IT decided to find a way to give Finance a certain set of views that they could use to write the reports themselves. As a first step, Microsoft IT developed a reporting analytics plug-in that ran on top of Excel. Its views allowed Finance to answer the questions they were asking. However, the addition of a new technology was not an answer unto itself. Finance suddenly had challenges managing the data properly, which caused a good deal of frustration for business analysts and increased the amount of support time Microsoft IT had to spend locking down the data in an effort to minimize the potential errors Finance might introduce.

Tip: This type of challenge does not mean your initiative is flawed; it's a growing pain that every organization is going to go through. Organizations must have an opportunity to review the situation and determine whether the way they're approaching their BI solution needs adjustment to reduce some of these newly introduced pain points. Is it just a matter of the business needing to mature? Or is IT not partnering with the business well, and not educating them on the things IT understands inherently but the business doesn't, because it's not part of the business's DNA? IT might not even know it needs to tell the business about these things.

To be successful in this venture, both organizations—IT and the business—must trust each other and be willing to take on the new roles and responsibilities and know how to partner in this initiative. This causes IT to become much more involved with the business: to sit with them, to offer learnings and share experiences to help bring the business along. This is when IT becomes a true business enabler.

Do BI closer to the business to derive insights

What we learned

Another people-oriented aspect concerns the different types of reporting and a recognition that deriving insights requires the closest affiliation with the business. While Microsoft IT had a team skilled at scaling systems, that team didn't have a solid understanding of the business. Microsoft IT had to constantly ask the business to provide more information and context in order to build out reports that provided value, which was not the best use of scarce IT resources.

Our recommended best practices

The farther you are from the business, the more likely you are to set up reporting on past actions. But insight is more about responding to things that are happening now. This type of reporting in particular must be close to the business, because it's the business that has the context necessary to ask the right questions and derive valuable insights.

Process best practices

Start with early adopter organizations and then scale up to the enterprise

What we learned

The Reporting Analytics and Finance IT teams within Finance are innovators, and were ready to work with Microsoft IT on the new BI solution. Other groups within the same organization are more conservative with their data and were not appropriate partners for such early engagement. Making this initiative work meant identifying the right teams who wanted to move fast, capturing learnings at that scale, and then applying those lessons when the project was ready to scale more broadly.

Our recommended best practices

CIOs must understand that they cannot drive this type of change at the enterprise level. It’s far too disruptive, too complex, and involves too many people. Instead, search for the path of least resistance. Find the champions—the people and organization who are willing to be innovators and to move fast and try new things. Get those successes early, learn from your failures, and then apply these best practices to a broader audience.

Implement an effective data governance process

What we learned

Without an effective data governance process, we have only been able to get so far with the connectability and morphability of the data. Poor data governance results in siloed data, which requires a lot of cleanup and coding to get to a usable state. Even then, these efforts can't get the data to a place that makes it easy to consume, especially when trying to combine data from different organizations.

Our recommended best practices

While data governance initiatives can be driven by a desire to improve data quality, they are more often driven by C-Level leaders responding to external regulations. Examples of these regulations include Sarbanes-Oxley, Basel I, Basel II, HIPAA, and a number of data privacy regulations. To achieve compliance with these regulations, business processes and controls require formal management processes to govern the data subject to these regulations. Successful programs identify drivers meaningful to both supervisory and executive leadership.

Common themes among the external regulations center on the need to manage risk. The risks can be:

  • Financial misstatement

  • Inadvertent release of sensitive data

  • Poor data quality for key decisions

Methods to manage these risks vary from industry to industry. Examples of commonly referenced best practices and guidelines include COBIT, ISO/IEC 27018, ISO/IEC 38500, and others. The proliferation of regulations and standards creates challenges for data governance professionals, particularly when multiple regulations overlap the data being managed. Organizations often launch data governance initiatives to address these challenges.

The following list describes how data governance touches all three areas.

  • People: Define your data governance organization

    • Data governance standards are generally defined by the business and are supported by IT.

    • Data governance must be sponsored at the highest levels of the organization.

    • Data governance must have the authority and sponsorship to make and enforce decisions and standards.

    • Data governance stewards are accountable for the quality and accuracy of data and must rely upon internal controls to manage data access, usage, and quality.

    • The business must view data governance as an enterprise initiative where processes are regularly reviewed, updated, and enhanced.

  • Processes: Data management

    • Data governance must have real authority including the ability to resolve business issues, review project data issues, and settle disputes.

    • Data governance priorities should be assessed and decisions applied with an enterprise point of view, which implies trade-offs between operational and analytical needs, and between department/entity and enterprise needs.

    • Data governance initiatives should be legitimized and implemented with strong communication, change management actions, and issue escalation processes.

    • Data governance metrics should be put in place to measure the governance and data management processes, monitor and enforce standards, and identify data quality issues.

    • Data governance should include identification/documentation of data security levels, governing security regulations, and data security ownership as part of data definition and capturing business metadata.

  • Technologies: Develop data governance tools

    • Tools should be utilized to enable and support the established processes.

    • Interface and application design standards should adhere to data governance goals.

    • Tools should facilitate data governance activities, including the ability to:

      • Monitor and audit data.

      • Request data changes and support change processes and workflow.

      • Run reports and comparisons.

      • Communicate and track issues.

Customize the process for the business

What we learned

As a company, Microsoft initially maintained a decentralized data management model that reflected its decentralized corporate culture. However, like many other enterprise technology companies, Microsoft IT later embraced the advent of enterprise data warehouses and moved its data management into a highly centralized model. That model didn't work either: trying to impose a "command and control" structure on a company with a decentralized culture was too much of a misfit. The pendulum had swung too far.

As a result, adoption of the costly central warehouse didn't move us toward our end goal of data enablement. It also created even more duplication of data, which cascaded into additional problems as people continued to take copies from that location to meet their own needs.

Today, Microsoft IT is moving the pendulum away from the centralized extreme and basing its BI model on a balanced, or federated, data management model.

Our recommended best practices

There is no single optimal model for all businesses; there is no “one size fits all.” Each company must review its own business needs in light of the spectrum from highly centralized to highly decentralized, and apply a BI model that works for them.

For smaller startups that have a reduced number of touch points, a centralized model could work well. At the other extreme, a large enterprise might have thousands of systems that would make centralized control virtually impossible; these organizations clearly would benefit from shifting toward a more decentralized model.

Large enterprises might also discover that different models need to be applied to different organizations within the company, such as more centralized for Finance and HR departments, and more decentralized models for Sales and Marketing.

Start small, fail fast, and experiment often to define a successful process

What we learned

The company’s business is changing from the top down, requiring updates to many systems across organizations. As an example, when the Microsoft Office team was building reporting and metrics to track the company’s cloud-based Office 365 service, the initial model asserted that the “customer” dimensionality used in the existing volume licensing space did not apply, because the Office 365 service focuses on new aspects such as tenants and workloads.

Based on this assumption, every customer was assigned all the usage for their tenant on the service. However, because one part of a company could be paying for Office 365 while the service was being used in different geographies around the globe, the new model lost visibility into local customers and could not accurately compare the data coming in from Office 365 with its volume licensing counterpart. The model failed quickly.

Rapidly building out, testing, and identifying the failure helped the team move in a better direction. The Office team consulted with other groups in the company, modified their approach to blend the old and the new, and consequently rolled out a successful new experience.

Our recommended best practices

True agility requires courage to try new ideas and expect many of them to fail. Not everything has to be a multi-million-dollar, 12-month project. This is all about enabling agility at your core.

  1. Begin with an idea about the business outcome you want to achieve; start with a small data set and iterate quickly with stakeholders.

  2. Define the key metrics and stakeholders, and assess the metrics’ quality and availability.

  3. Connect to the necessary data sources. Pull the data in, profile it, connect it, learn, adjust, get more data, discard some, and so on.

  4. Quickly validate the right data, then expand the amount being used, mash it up, model it, and visualize it.

  5. Perform quick iterations to validate through stakeholder reviews and feedback; adjust and repeat.
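The steps above can be sketched end to end in a few lines. This is an illustrative sketch only; the data sources, column names, and the candidate metric (revenue per seat) are invented for the example:

```python
# Fail-fast iteration sketch: pull a small slice of two sources, profile
# them, mash them together, and compute a candidate metric to review with
# stakeholders. Source names and columns are hypothetical.
import pandas as pd

# Start small: a tiny slice of usage data and billing data.
usage = pd.DataFrame({"tenant": ["A", "B", "C"], "seats": [120, 40, 75]})
billing = pd.DataFrame({"tenant": ["A", "B", "D"], "revenue": [9600.0, 2800.0, 500.0]})

# Connect and profile: an outer merge exposes gaps between the sources.
merged = usage.merge(billing, on="tenant", how="outer", indicator=True)
gaps = merged[merged["_merge"] != "both"]
print(f"{len(gaps)} tenant(s) missing from one source")  # learn, adjust, get more data

# Validate on the matched subset, then compute the metric to visualize.
matched = merged[merged["_merge"] == "both"]
metric = (matched["revenue"] / matched["seats"]).round(2)
print(metric.tolist())  # review with stakeholders, adjust, and repeat
```

The point is the loop, not the code: each pass takes minutes, so a wrong assumption (like the Office 365 customer dimensionality above) surfaces quickly and cheaply.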

Technology best practices

Adjust infrastructure to accommodate the new model

What we learned

In Finance, revenue recognition and quotas were processed separately for each business model: tracking shrink-wrapped software for enterprise customers was handled one way, ecommerce micro-transactions were handled through a different process, brick-and-mortar sales had still another process, and so on. There was no mandate to standardize these systems and processes to ensure that all the different sales transactions and their revenues would be recognized consistently in one standard process.

Our historical sets of fragmented data meant that we could only answer questions in silos. We recognized the need to change, but the proliferation of discrete systems meant that the effort to integrate it all into an interconnected solution that could help drive self-service was significant.

Our recommended best practices

IT must build an infrastructure that provides data to the business quickly. IT can no longer just build systems based on “business scenarios” (such as an operations business, where systems are designed from a purely rational, operational perspective). Instead, you must ask what you are trying to do with your customers. How does a given system help? IT organizations must start thinking about the data and how it flows through the business. It’s a slight shift in perspective, but it has a profound effect on the system architecture.

Build tools for the information worker

What we learned

In earlier times, when Microsoft IT used mature processes for all BI analysis, manipulation of the data (“active BI”) was performed solely by technical analysts, with Excel as virtually their only tool, and even then only for generating static reports. Manipulating data in such a manual manner was especially difficult when there were millions of rows to analyze. Clearly, front-end tools needed to be developed to streamline reporting.

As a first step, Microsoft IT added a reporting analytics plug-in to Excel. Analysts would still need to know a certain amount of information about the data set they were examining, but this reporting analytics plug-in provided a canvas to drag-and-drop fields, tables, and other data they wanted to review. Unfortunately, even with this plug-in, Finance analysts were still struggling to take multiple data sets from multiple systems, mash them together, and present a unified version of history and projection into a singular, all-up fiscal year view of the business. This need is what drove the development of Power BI.

In its latest generation, the reporting analytics plug-in integrates with the Microsoft Power BI suite. Users can now point a query at whatever data set they’re viewing. Data can be exported to Power Pivot, other data sets brought in and mashed together, and even real-time analytics can be added against multiple data sets.

Beyond the insights themselves, these data visualization tools give the business the ability to share those insights. This is incredibly important. Instead of a single team of a few people reviewing a report, information can now be shared easily with many people, such as the field, which continually demands insight from our Finance team.

Our recommended best practices

Sharing information outside of one’s organization and across the enterprise is the key to enabling people to leverage each other’s skill sets and to find answers to questions that didn’t exist before. But where do you start? Take a look at the time you spend copying and pasting into static Microsoft PowerPoint decks. Instead, consider connecting the presentation layer to the data layer directly.

Anytime you have a deck with numbers, you should ask yourself if there is a way to connect that front-end presentation to back-end data. This minimizes the risk of errors and significantly increases the time people can spend on converting data to information and insight. All of this increased agility comes without any compromise of security—a critical element of any solution. A self-service technology for Information Workers should help them easily pull data together and present it in a way that would traditionally require IT assistance.
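As a minimal sketch of connecting presentation to data, a script can regenerate a deck’s figures straight from the data layer each time, instead of copying and pasting them. The query, table, and figures here are hypothetical; in practice this might be a live Power Query or Power Pivot connection rather than a script:

```python
# Sketch: derive a slide's headline number directly from the back-end data
# so the deck can never drift out of sync with the source. The table and
# values are invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE revenue (region TEXT, amount REAL)")
conn.executemany("INSERT INTO revenue VALUES (?, ?)",
                 [("EMEA", 1.2), ("AMER", 2.5), ("APAC", 0.8)])

# One query feeds the slide; rerunning the script refreshes every figure.
total, = conn.execute("SELECT ROUND(SUM(amount), 1) FROM revenue").fetchone()
print(f"FY revenue (illustrative): ${total}B")  # prints "FY revenue (illustrative): $4.5B"
```

The design choice is what matters: the presentation layer reads from the data layer on demand, so a change in the source is reflected everywhere the number appears.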

At Microsoft, combining Power BI with Power Query, Power Pivot, Power View, Power Map, and Q&A allows our users to tell a story with their data, and then share it through Power BI Sites.

Figure 10. At Microsoft, users work with the Power BI suite to view, analyze, present, and share their BI.

These are the kinds of “uncommon” connections that drive real innovation and help the company differentiate. Remember: the logical way of doing business never generates quantum leaps. If it did, everybody would be doing it, and everyone would be innovative. Instead, it’s the connections you would never think are associative that produce breakthroughs.

Conclusions

The risk of the siloed company

Our philosophy is that maintaining a siloed organizational structure today risks stagnation, because silos by their nature hinder innovation from occurring across the company. We are in the digital age. No longer can businesses focus on wins-based optimization; now companies must innovate or die. The traditional view of separate teams not needing to interact is based on an outdated business model: everyone now must be all in if the company is to thrive.

The problem is that you don’t know what you don’t know. What is clear is that maintaining silos will prevent the insights that could arise from a more collaborative culture. Depending on how diversified you are as a company, the amount of time before an entrenched silo mindset will negatively impact the company’s revenue will vary. Such an organization might continue profitably for a while, but leadership must steer the company away from silos and into the collaborative world where innovation drives success.

Measuring success

Our message to leadership is this: the biggest challenge overall isn’t technology; it’s culture. Although every enterprise is different, recognizing the need for a culture shift and what it encompasses is the single most significant factor in achieving success. We cannot overemphasize the criticality of putting people at the core of the initiative, and then adding the processes and technologies to support your transformation.

At Microsoft, we have seen the need to evolve our own siloed mindset into a collaborative culture where the whole is greater than the sum of its parts. We recognize that true innovation can arise only when people with different backgrounds and perspectives, and from different organizations, collaborate to generate new insights. We believe this culture change is helping fuel the tool with the greatest potential for innovation we have ever encountered: the creativity that comes from our own people working together.

Change at this scale isn’t easy, and we are still on our own journey of transformation. Over the past two years, Finance has run through a number of exercises to define aspects of success that would be most relevant in the new world of cloud-based services. We’ve come a long way; we have a much better sense of the metrics that are used to drive our sales behavior in the field, how we drive financial performance, and what we want to do to drive usage.

We have locked onto the top 30-40 things across the company that we need to measure. We are continuing to define how to get the data to measure these metrics and ensure that the proper governance processes are in place. With an understanding of the governance processes—and the requisite changes to personnel and roles that accompany that—Finance has been able to respond to its business needs in a more agile way.

You need to start moving. We’re giving you some tools and food for thought about the right people to bring together, the sponsorship that’s required, and the process you can use along your own journey to help you identify what milestones you must reach in order to ensure a successful transition.

Next steps

You've heard Microsoft's story. Now it's your turn: take your company further by incorporating a similar BI solution that adds agility to your organization and encourages the innovation you need to maintain a competitive edge. Just like Microsoft IT is transforming Microsoft Finance reporting, Microsoft Consulting Services (MCS) can help you reinvent how your organization manages data to achieve more agile business reporting and analytics.

Why partner with MCS?

Before you commit funds and resources to such a large project, you want to make sure the first step you take puts you on the right path. MCS can help you run an assessment to determine maturity and build a roadmap specifically for your organization. Microsoft Services is uniquely positioned to help you with every step along your journey:

We’ve been where you are now. As an operating company, we are running an extremely complex version of what your business is running today. The resources we used to define our own roadmap are available to share learnings and expertise that can be applied to your business, and the insights and lessons learned from our real-world, hands-on, multi-year initiative to transform our own BI are available to our consultants.

We offer the only true end-to-end solution. Microsoft builds the complete spectrum of technologies required to achieve your next-generation BI. This spans everything from cloud-based solutions, to on-premises back-end data stores, to the analytics layer, the reporting layer, and the insight layer. With the power of the Microsoft stack behind them, MCS can review your existing infrastructure and design a plan of action that is purpose-built to give your company a true end-to-end solution. The Microsoft platform is built to leverage and augment existing platforms, so we will take into consideration investments already made in your organization.

We have a proven track record of developing the right tools and technologies. Our releases of Microsoft Power BI, Azure Machine Learning, SQL Server (on-premises and cloud-based in Azure), and others enable us to build out your solution using a technology stack that is designed from the ground up to interoperate seamlessly—and to provide powerful, intuitive visualization tools that can be accessed through the world’s most popular spreadsheet: Microsoft Excel.

How to engage

MCS offers a broad range of custom services and solutions, but our engagements usually follow a standard roadmap designed to get organizations moving toward achieving their goals quickly and cost-effectively:

  1. We typically recommend that our clients start by working with us in a business-focused strategy briefing, where our consultants learn about the organization’s top business priorities.

  2. Next, we work with both business and technology leaders to gain a full understanding of the organization’s present state in terms of systems, technologies, and processes.

  3. Finally, we establish an implementation plan designed to deliver value as rapidly as possible and in an iterative fashion, showing business value along the way.

Through this approach, we help our customers get the most out of their technology investments. We use our expertise across the broad Microsoft portfolio—along with our network of partners, technical communities, tools, diagnostics and channels—to help our customers transform their businesses—regardless of where they are on their journey.

For more information, contact your local Microsoft Account Team or Microsoft Consulting Services representative.

Resources

Microsoft Finance Leverages Power BI to Transform Reporting

Related Videos

The agile development process for self-service BI at Microsoft

Top 5 things to avoid when rolling out Business Intelligence in your company

Real World IT Deep Dive: Launching Power BI Services at Microsoft

For More Information

For more information about Microsoft products or services, call the Microsoft Sales Information Center at (800) 426-9400. In Canada, call the Microsoft Canada Order Centre at (800) 933-4750. Outside the 50 United States and Canada, please contact your local Microsoft subsidiary. To access information via the World Wide Web, go to:

http://www.microsoft.com

http://www.microsoft.com/microsoft-IT

http://www.microsoft.com/itshowcase/topTrends/dataInsights

https://www.microsoft.com/en-us/microsoftservices/

© 2015 Microsoft Corporation. All rights reserved. Microsoft and Windows are either registered trademarks or trademarks of Microsoft Corporation in the United States and/or other countries. The names of actual companies and products mentioned herein may be the trademarks of their respective owners. This document is for informational purposes only. MICROSOFT MAKES NO WARRANTIES, EXPRESS OR IMPLIED, IN THIS SUMMARY.
