HACKER Q&A
📣 former-aws

How do you manage delivering dashboards between data teams and others?


Full context: A friend and I are working on addressing a pain point that bugs both of us: the painful disconnect between understaffed data teams, who clean data and build dashboards/reports, and the non-technical teams who end up consuming those reports.

To probe further, we ran a lean-startup-style "customer discovery" interview process, talking to 35+ people in the SMB space. That validated some of our initial hypotheses but didn't give a strong signal on the top pain point.

So, what are the challenges you face? I would love to hear how you are dealing with these challenges. If you prefer to shoot me an email, you can find my email on my profile.

Thanks in advance!


  👤 focusedone Accepted Answer ✓
My experience is that ascertaining exactly what the consumer wants to see on a dashboard is, by far, the most difficult part of the process.

Often the consumer wants a visual way to confirm a vague feeling. By the time the data team ascertains what the consumer actually needs to see, collects and wrangles the data, and builds the dashboard, the consumer may have lost interest or become frustrated and moved on.

Identifying a feeling and representing it graphically with data is extremely difficult, and it led me to a place of great despair at a previous job.

Anyway, I don't have a good answer for this but I hope you can crack it and become wildly successful doing so!


👤 solardev
In my experience, small teams especially like to fall into the "any data is good data" trap, where they build a bunch of big, beautiful dashboards full of irrelevant, misleading data. A solar company produces a bunch of time-series lines with electrical jargon and pretty icons that maybe 2% of its customers actually understand. The devs making the dashboards don't always know what the data actually means, how it's measured, what the precision is, what the nominal range should be, etc. A marketing team ends up with something that looks like Google Analytics had a baby with NASA Mission Control, with big scary numbers, awe-inspiring trendlines, and fancy charts that your math teacher wouldn't even be able to name... but that don't result in anything actionable, because nobody can put it all in context.

People can't really focus on more than 2-3 things at a time, and ideally even fewer. If you just give them a big dashboard full of pages of all that pretty data you've massaged, IMHO you've met your own needs ("how do I present my collection?") more than theirs ("what do I need to do differently to improve performance?").

More than the raw numbers, consumers have to be able to derive useful insight from the data, and that requires both deep domain knowledge and some rudimentary understanding of statistics; yet many dashboards aren't really set up to highlight actionable things. They might show a % change from last week, but do they show the outliers (most weeks we see +/- x% change, but suddenly one data series shows a 400% gain... is that highlighted? Alerted?)? Do they show changes in the rates of change? Has the variance/IQR/etc. of a relatively stable dataset suddenly shifted? Has a trendline elsewhere reversed? Does the dashboard easily surface the big-picture highlights (the "this is interesting..." tidbits) to facilitate big-picture discussions at the weekly meeting, while still allowing deeper drill-downs during someone's individual time? How does the data presented help with a SWOT analysis, or even just the question of what people should look into that week?
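To make the outlier point concrete, here's a minimal sketch of the kind of check being described — flagging a week whose % change sits far outside the usual range. The function name and threshold are my own assumptions, not anyone's actual implementation; it uses a robust median/MAD-based z-score, since one huge outlier would inflate an ordinary standard deviation enough to hide itself.

```python
from statistics import median

def flag_outliers(pct_changes, threshold=3.5):
    """Return indices of values whose modified z-score exceeds threshold.

    Uses the median and median absolute deviation (MAD) rather than
    mean/stdev, so a single extreme value can't mask itself.
    """
    med = median(pct_changes)
    mad = median(abs(x - med) for x in pct_changes)
    if mad == 0:
        return []  # no variation at all; nothing to flag
    # 0.6745 scales MAD to be comparable to a standard deviation
    return [
        i for i, x in enumerate(pct_changes)
        if 0.6745 * abs(x - med) / mad > threshold
    ]

# Most weeks move a few percent; one series suddenly shows a +400% gain.
weekly = [2.1, -1.5, 0.8, 3.0, -2.2, 400.0, 1.1, -0.5]
print(flag_outliers(weekly))  # → [5]
```

Note that a plain mean/stdev z-score with a cutoff of 3 would actually miss the 400% week here, because the outlier drags the standard deviation up with it — which is exactly why "use the proper statistical tests" matters.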

If it's the same bunch of numbers and charts over and over again, eyes will just start glazing over. It's the anomalies that really matter, but teams don't always know which ones to watch out for and how to use the proper statistical tests to measure them. Software can help with that by abstracting relatively complex stats into UI elements (colors, highlights, warnings, notifications), but you still need someone with the domain knowledge and statistical know-how to set those up in the first place. A dashboard is just a prettier spreadsheet unless it's smart enough to surface business concerns.
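The "abstract stats into UI elements" idea can be as simple as mapping an anomaly score to a highlight color, so consumers see "red/amber/neutral" instead of a z-score. A toy sketch, with thresholds and color names purely illustrative:

```python
def severity_color(z_score):
    """Map an anomaly score to a dashboard highlight color.

    Thresholds are illustrative: tune them to the dataset and to how
    noisy you can afford your alerts to be.
    """
    z = abs(z_score)
    if z > 3.5:
        return "red"      # alert-worthy anomaly
    if z > 2.0:
        return "amber"    # unusual, worth a look
    return "neutral"      # ordinary week-to-week variation

print([severity_color(z) for z in (0.4, 2.7, -5.0)])
# → ['neutral', 'amber', 'red']
```

The statistical judgment (which score, which thresholds) still has to come from someone with domain knowledge; the UI just makes that judgment legible to everyone else.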

A basic stats background (even just an intro online course) can go a lot further than a dashboard that only uses basic arithmetic, because stats can dramatically improve the signal-to-noise ratio of a dataset. Sadly, many teams and devs in my experience don't think that way ("what useful signals can I derive from this noise, and what statistical tools do I need in order to do so"), and instead just think "how do I make sure all the data is here, and how do I fit it all on screen". They focus on quantity and completeness, which is often the opposite of what they actually want (a good signal-to-noise ratio that highlights actionable things, not just a mountain of overwhelming data). A good dashboard helps them focus on what actually matters; a poor one just adds to their stress...