The Dangerous World of Self-Service Analytics

Apr 10, 2023

Most analytics leaders put self-service analytics at the top of their annual roadmap, but they rarely talk about the potentially disastrous implications of creating self-service tools. Enabling self-service analytics sounds straightforward: data engineering teams work with analytics teams to build dashboards. Once deployed, these dashboards let stakeholders obtain the information and insights they need for most of the functions of their job. In theory, business stakeholders can access the required data directly, without reaching out to analysts, explaining the business situation, and waiting for the analyst to produce an analysis. Stakeholders love it because they aren’t bottlenecked by overworked analysts, and analysts love it because there’s one less request to deal with. But self-service enablement almost always neglects to protect stakeholders and the business from themselves because, as the old saying goes, you can prove anything with data, even if that data is inaccurate or misinterpreted.

I’ve seen it time and time again: a stakeholder takes clean, accurate data and uses it to claim causation where there was no causation, merely correlation. I’ve seen cases where stakeholders don’t realize that there is something nuanced about the data, such as a system outage that caused an anomaly in daily sales results, skewing averages and other metrics. Sure, the stakeholder was able to get the data, but getting the data doesn’t necessarily equate to getting the right data or drawing the right conclusions. This is where the danger comes into play.
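To make the outage scenario concrete, here is a minimal sketch of the kind of comparison an analyst might run. The table and column names (daily_sales, system_outages, revenue) are hypothetical; the pattern simply shows how a single un-flagged outage day can quietly distort an average:

-- Compare the naive average against one that excludes known outage days.
-- All table and column names are assumed for illustration.
SELECT
    AVG(s.revenue) AS avg_daily_revenue_all_days,
    AVG(CASE WHEN o.outage_date IS NULL THEN s.revenue END) AS avg_daily_revenue_excl_outages
FROM daily_sales s
LEFT JOIN system_outages o
    ON s.sale_date = o.outage_date;

If the two averages diverge meaningfully, the “obvious” number on the dashboard is telling an incomplete story, and a stakeholder who never thinks to run such a comparison will never know.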

Stakeholder Errors

I have countless stories from every organization I’ve ever worked with where a stakeholder was given the keys to the car but never taught how to drive it. They were also never trained to spot bad drivers or other hazards on the road, which is critical to avoiding costly accidents. For example, at one company I was working with a team of product managers who ran an A/B test, saw that the test group was outperforming the control group, and nearly invested millions of dollars into a new product that would have failed, because the team misinterpreted the results and confused correlation with causation.

At another company, I was working with an executive who was reviewing a self-service dashboard and saw low sales performance for our newly discounted clearance products. Unsatisfied with the performance, he asserted that the clearance pricing was not compelling enough to customers and that prices needed to be reduced further to increase sales. However, the problem had nothing to do with the price of the products; it was caused by improper inventory placement and price-labeling practices within stores. Had the company followed the executive’s guidance, based on his reading of the self-service data, it would have lost over $40M in revenue.

Yet even with stories like these, analytics leaders continue to push the narrative of a utopian self-service analytics environment without ensuring that the users of that environment have the proper training, skills, knowledge, and partnerships necessary for success. Part of the reason is a lack of awareness of the potential issues; the other part is that many stakeholders don’t even realize they are making mistakes in the first place. What’s worse is that in some environments, such as at Amazon, self-service is pushed to an even riskier state where product managers and other individuals are given access to write their own SQL statements and query data directly from the database to perform their own analysis. On the surface this may sound like a great idea because it empowers team members and removes burden from data analysts, but companies are playing with fire with these capabilities.

The Way Things Really Work

In my 20 years in tech, in roles spanning software development, software quality assurance, software release management, and analytics, I’ve spent the last 12 years leading and building world-class analytics teams. During this time I’ve interviewed, trained, and worked with hundreds of analysts, and only a small percentage had a deep understanding of the SQL language, the database, and the business domain, yet they were the declared experts. While it’s true that these analysts were experts compared to everyone else in the company, that didn’t mean they possessed expert-level knowledge, efficiency, or quality of results. In fact, over the last 12 years working at start-ups, eBay, GameStop, Amazon, VMWare, and more, I’ve yet to walk into an organization that was already functioning with a world-class analytics architecture.

This is a crucial call-out: if the teams that are supposed to be the experts lack the specific tools, skills, knowledge, and business domain expertise needed to ensure data accuracy and proper business usage, it is unreasonable to expect individuals outside the analytics organization to safely perform these functions. This is why self-service enablement can be so dangerous to an organization. To produce accurate information and insights within the proper context, analytics teams need to be equipped with the proper training and tools before other team members are enabled to use the analytics environment.

However, in the analytics organization there is rarely any documentation for the database tables and views, nor is there much documentation for the Tableau and Power BI dashboards that analysts work with and that stakeholders frequently use. Meaning, even the analysts aren’t always familiar with how to work with the data or the dashboards that have been built. But those dashboards are just the tip of the iceberg.
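Even lightweight documentation that lives next to the data would help. As a minimal sketch, many databases and warehouses (Postgres, Redshift, Snowflake, and others) support attaching comments to tables and columns; the names below are hypothetical and the exact syntax varies by engine:

-- Hypothetical table and column names; COMMENT syntax differs between engines.
COMMENT ON TABLE analytics.daily_sales IS
    'One row per store per day. Excludes cancelled orders. Refreshed nightly.';
COMMENT ON COLUMN analytics.daily_sales.revenue IS
    'Gross revenue in USD, before refunds and discounts.';

Even two lines like these answer the questions, what is the grain and what does the number include, that trip up both analysts and self-service users.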

SQL is the main language for communicating with the database to query data and produce an analysis. But the interesting thing about SQL is that just because a query returns data doesn’t mean the results are accurate. Yet even with this knowledge, little emphasis is placed on the necessary architecture, proper code reviews, and quality assurance tests of SQL queries and results within the analytics organization. Again, if these practices aren’t being properly applied within the analytics team, how can they be performed properly by untrained employees outside of it? And even if proper quality assurance practices and a world-class analytics architecture were in place, stakeholders would still be at risk of producing inaccurate insights.
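What do quality assurance tests for SQL look like in practice? Here is a minimal sketch of two assertion-style checks, assuming hypothetical analytics.orders_reporting and raw.orders tables. Each query should return zero rows; any row returned signals a problem:

-- Check 1: the reporting table should contain exactly one row per order_id.
SELECT order_id, COUNT(*) AS duplicate_rows
FROM analytics.orders_reporting
GROUP BY order_id
HAVING COUNT(*) > 1;

-- Check 2: total revenue in the reporting table should reconcile with the raw source.
SELECT 'revenue mismatch' AS failed_check
FROM (SELECT SUM(revenue) AS total FROM analytics.orders_reporting) r,
     (SELECT SUM(revenue) AS total FROM raw.orders) s
WHERE ABS(r.total - s.total) > 0.01;

Checks like these are exactly the kind of safety net an untrained self-service user will never think to write.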

This is because there’s an assumption that the underlying database tables, views, and visualization dashboards remain in a static and stable state. But this is rarely the case. Engineers and analysts are constantly finding bugs, solving issues, and modifying underlying assets, which can cause old SQL statements to produce incorrect results even though the code still runs and produces an output. In practice, this means you can easily have product managers, project managers, and marketing team members running, or heaven forbid, writing their own SQL statements. Those queries could be out of date, producing incorrect results without the stakeholder ever knowing. Yet this potentially incorrect information is what stakeholder teams have been enabled with and are relying upon to make multi-million-dollar business decisions. To avoid putting business decisions and the business itself at risk, analytics organizations need to do more than enable team members through self-service analytics capabilities.
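As a hypothetical illustration of how this breaks silently, suppose a stakeholder saved the query below back when dim_customer held one row per customer (all names here are assumed for the example). If that table is later rebuilt with one row per customer per address, the join fans out and revenue is double-counted, yet the query still runs and returns a perfectly plausible-looking number:

-- Written against a dim_customer table with one row per customer_id.
SELECT
    c.customer_segment,
    SUM(o.revenue) AS total_revenue  -- silently inflated if dim_customer changes grain
FROM fct_orders o
JOIN dim_customer c
    ON o.customer_id = c.customer_id
GROUP BY c.customer_segment;

Nothing errors and nothing warns; the only defense is someone who understands the data model reviewing the query and its assumptions.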

How to Improve

First, analytics teams need to ensure that they are building a world-class analytics organization for themselves. That organization must have the proper analytics tools and architecture, and it must ensure that all analysts are sufficiently trained to produce work in a manner that delivers accurate results and maintains a stable infrastructure. After building this foundation, stakeholders must be properly trained on how to work with the self-service tools and data, how the data could lead them to inaccurate conclusions, and why they need a true partnership with the analytics team. Lastly, that partnership must actually be created between stakeholders and analysts. In it, the analyst is expected to fully understand the business, the stakeholder’s needs, and the business context; this is necessary for the analyst to provide detailed, valuable, and accurate guidance and insights. In cases where stakeholders are using self-service tools without an analyst, the analyst should still be consulted to double-check the stakeholder’s assumptions, usage, and conclusions. By following these practices, companies will find themselves working with more accurate information, will avoid costly decisions driven by inaccurate data and misinterpretation, and will elevate their teams to a level they’ve never experienced.

 

Brandon Southern, MBA, is the founder of Analytics Mentor, specializing in providing analytics advising, consulting, training, and mentorship for organizations and individuals. Brandon has been in tech for 20 years in roles including analytics, software development, release management, quality assurance, six-sigma process improvement, project & product management, and more. He has been an individual contributor as well as a senior leader at start-up companies, GameStop, VMWare, eBay, Amazon, and more. Brandon specializes in building world-class analytics organizations and elevating individuals.

You can learn more about Brandon and Analytics Mentor at http://www.analyticsmentor.io/about

 

 
