Data is quickly becoming a company’s most important asset. To monetize this asset, organizations leverage analytical tools that generate insights in order to gain a competitive edge in the marketplace. One issue that has plagued corporations for years is the lack of standards and methods around their analytical capabilities — for example, deciding which analytical tools to build deep competencies around. How can an organization gain efficiencies, promote reusability, and build deep common skillsets if each group is using a different analytical tool?
To make matters more confusing, analytical tools tend to proliferate within organizations at the intersection of market innovation cycles and underserved business groups. As the market innovates with new functionality, the business sees these new capabilities — which promise deeper or faster analytical insights — as tools it needs to add to its arsenal. Underserved business groups often go ‘rogue’ and purchase their own analytical tools, which are rarely the corporate standard.
The latest innovation cycle is creating an environment where analytical tools proliferate in organizations with little to no oversight or standards. For example, one part of the organization may have a visualization tool, like Tableau, while another group has a similar tool, like Power BI, creating issues around common methods, reusability, and the need for varied skillsets.
The natural answer to this challenge is to rationalize the varied set of tools and create standards for each new tool purchase or analytical project to adhere to. This is typically performed by a governing body made up of various analytical business constituents and representatives from IT. The governing body provides the team with standards and a process for tool selection, but it often needs a method for conducting such an exercise. The steps outlined below provide your organization with a framework for optimizing its analytical tool portfolio and preventing further tool proliferation:
1. Research and Discovery – One of the first steps is to conduct interviews with key stakeholders, including end users in all user groups: data scientists, analysts, developers, IT administrators, and executives. The goal is to map the current state of analytical tool usage and analytical capabilities within the organization. It’s important to build an exhaustive inventory of the tools and capabilities each group uses in its respective area. Also determine users’ pain points, gaps in functionality in their current toolset, and any upcoming tool purchases they desire.
2. Current State Landscape – The second step is to inventory the marketplace of existing analytical tools and map them into tool classes. Organizations often need multiple analytical tools that reside in one or more classes. This mapping is useful in situations where the focus is on driving down complexity, as well as in circumstances where users simply need help choosing which type of tool to use for which type of business problem. We have identified the following tool classes as a starting point for your exercise:
- Report Writers
- Semantic Layer Reporting Tools
- MDX/Cube Query Tools
- Data Discovery & Visualization Tools
- Embedded BI & Reporting Tools
- Data Science & Modeling Tools
- AI & ML Use Case Driven Tools
3. Capability Tree – The third step is to create a capability tree that leverages the inventory of capabilities from step one and classifies them against the current landscape of tools. From this exercise, you may see overlaps and gaps in your organization’s current capabilities. Reading analyst reports on the criteria used to rank analytical tools can be helpful in filling out the capability tree, which will provide direction for future purchases or rationalization exercises. If the analysis is based on specific tools, it may be relevant to include non-technical criteria such as pricing, support, and the existing presence and skillsets for the tool within your company.
4. Decision Matrix – The next step is to create a Decision Matrix that provides a method for scoring the various capabilities. For example, you can rate each capability on a five-point scale and assign a weighting to each capability for each tool or tool class, depending on the importance of that capability to the organization. The weighted ratings are then combined into a final score for each tool; the variability of scores across tools can also inform how heavily each capability should be weighted.
5. Decision Tool – Finally, create a decision tool from the decision matrix that makes it easy to determine which tool should be used for which business capability or project. The tool should leverage the decision matrix to determine which capabilities to include in a particular tool decision and to produce the overall scoring when comparing various tools against one another.
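As a minimal sketch of the weighted scoring described in steps 4 and 5 — all tool names, capabilities, weights, and ratings below are hypothetical placeholders your governing body would replace with its own:

```python
# Hypothetical decision-matrix scoring sketch. Each capability is rated
# on a five-point scale per tool, and each capability carries a weight
# reflecting its importance to the organization. All values are illustrative.

# Capability weights set by the governing body (sum to 1.0 for readability)
weights = {
    "data_visualization": 0.40,
    "self_service_modeling": 0.35,
    "enterprise_governance": 0.25,
}

# Five-point ratings gathered during the evaluation (illustrative values)
ratings = {
    "Tool A": {"data_visualization": 5, "self_service_modeling": 3,
               "enterprise_governance": 2},
    "Tool B": {"data_visualization": 4, "self_service_modeling": 4,
               "enterprise_governance": 4},
}

def weighted_score(tool_ratings, weights):
    """Combine a tool's five-point ratings into one weighted score."""
    return sum(weights[cap] * score for cap, score in tool_ratings.items())

# Rank the candidate tools by their weighted score, highest first
for tool in sorted(ratings, key=lambda t: weighted_score(ratings[t], weights),
                   reverse=True):
    print(f"{tool}: {weighted_score(ratings[tool], weights):.2f}")
```

In this invented example, Tool B wins (4.00 vs. 3.55) despite Tool A’s stronger visualization rating, because the weights spread importance across governance and modeling as well — the kind of trade-off the decision matrix is designed to surface.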
The decision tool provides clarity on which class of tool should be used to solve which specific problem, and it can help deter rogue purchases of new tools for business problems that may already be satisfied by currently owned analytical tools.
Analytical tools are evolving at a faster pace than ever. Leverage the process outlined above to develop a decision tool that provides clarity on what class of tool (or what specific tool) should be used to solve a specific problem. Running this decision tool against all current and planned analytical projects will likely tease out which tools or tool classes are truly useful within your organization and which may be redundant or obsolete.
Josh Levy is a Manager in the Analytics practice of Aspirent, a management-consulting firm focused on analytics. He has spent the past 20 years working in various capacities and industries within the Business Intelligence space.
Andrew Roman Wells is the CEO of Aspirent, a management-consulting firm focused on analytics, and co-author of Monetizing Your Data: A Guide to Turning Data into Profit-Driving Strategies and Solutions. For more information, please visit www.aspirent.com or www.monetizingyourdata.com.