HACKER Q&A
📣 hjkm

Lessons learned from implementing user-facing analytics / dashboards?


We're currently writing an article about this.

If you'd be up for sharing some lessons / takeaways / challenges here, or even better, having a chat (I'll reach out) that would be amazing.

We'll of course fully attribute learnings / quotes etc.


  👤 PLenz Accepted Answer ✓
No matter what you do, someone will use your dashboard to post-hoc justify a pre-made decision. When it all goes wrong you'll be blamed for making a bad dashboard.

👤 bradhe
I started a company about this. We did OK. You can learn more in my profile. Happy to help if you think it's useful.

👤 jwsteigerwalt
You will often have to polish the users' half-baked metrics. Even large orgs with teams of business analysts will leave gaps that aren't uncovered until partway through the build.

👤 grvdrm
I'll enjoy watching this thread evolve. Some thoughts from my experience:

- Everyone asks you to translate simpler spreadsheets and Excel charts/graphs into dashboards in your BI tool of choice. As soon as it's there, they'll ask you why they can't export to manage the data themselves. This vicious cycle can sometimes be stopped but is a slow-motion drag on productivity in lots of orgs.

- Build in validations, and/or work on ways to check the dashboard. Dashboards sometimes put their builders and consumers on auto-pilot. The dashboard "must be right" but could easily have a bug or inaccuracy for weeks/months/etc. that isn't obvious without some external validation.

- The dashboard never has the "right" metrics - users will continue asking for changes. Be your own best advocate and say no as a way of gauging how important the ask really is.

- Related: always ask why about everything you're building into or modifying in dashboards. Business users often ask for things without an ounce of rationale.

- Related: taking away is harder than not doing at all!

Finally, I think most dashboards miss one fundamental point. Imagine you're the CEO/COO and you've got this beautiful 3 or 4-chart dashboard in front of you. What should you know about what you're seeing? What's the succinct summary?

I like building in spots to write 2-3 sentence executive summaries.


👤 tagspace
Biggest finding for us has been that no matter how many charts / filters / options / etc. we give to our users, they always want something more.

Answers don't just lead to Eureka moments - they lead to follow-up questions, and then more follow-up questions.

Not a complaint - it's actually great. Just an observation (and a challenge).


👤 candiddevmike
Are you talking about internal or external users?

👤 awildfivreld
I'm a software developer.

There is a chicken-and-egg problem when it comes to designing these things.

I can ask "What do you want the dashboard to look like" and they'll answer "I don't know before I see the data".

Then I'll ask what data they want to see, and they'll respond "What will it look like?", or we'll spend significant time on data collection only to find they never actually want it in a dashboard after all.

Far and away the most time-consuming aspect of this entire domain is finding out what users actually want to see, as they almost never have something specific enough when they approach me.


👤 ravdeepchawla
Biggest lesson: all metrics _must_ be defined in code, not manager-speak.

For instance, if a marketing head wants to plot CAC (cost of acquiring customers) over time, saying CAC is marketing spend divided by the number of customers acquired is manager-speak. Spend is budgeted higher early in the month and adjusted against actuals. Customers ask for refunds and cancel accounts. Some campaigns have volume incentives that are only known later... and so on. The solution is to write well-commented SQL that laypeople can audit and improve.
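As a sketch of what "defined in code" can look like - the table names, columns, and sample data here are all invented for illustration - the edge cases become WHERE clauses that anyone can read and audit:

```python
import sqlite3

# Toy schema: monthly ad spend (budgeted vs. actualized) and customers
# (some of whom refunded). All names and figures are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE ad_spend (month TEXT, amount REAL, is_actual INTEGER);
CREATE TABLE customers (id INTEGER, signup_month TEXT, refunded INTEGER);
INSERT INTO ad_spend VALUES ('2023-01', 1000.0, 1), ('2023-01', 200.0, 0);
INSERT INTO customers VALUES (1, '2023-01', 0), (2, '2023-01', 0),
                            (3, '2023-01', 1);
""")

# CAC per month = actualized spend / customers acquired, with the
# edge cases written down where a layperson can audit them.
CAC_SQL = """
SELECT month,
       s.total_spend / c.n_customers AS cac
FROM (SELECT month, SUM(amount) AS total_spend
      FROM ad_spend
      WHERE is_actual = 1              -- exclude budgeted-only spend
      GROUP BY month) AS s
JOIN (SELECT signup_month AS month, COUNT(*) AS n_customers
      FROM customers
      WHERE refunded = 0               -- refunds don't count as acquired
      GROUP BY signup_month) AS c
USING (month)
"""
rows = list(conn.execute(CAC_SQL))
print(rows)  # → [('2023-01', 500.0)]
```

The two subqueries aggregate spend and customers separately before joining, so a month with several spend lines doesn't double-count customers.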


👤 RyanHamilton
80% of the time people should display a table, 15% a time-series or line chart. The other 5% is probably wrong. Anyone who asks for pie charts, 3D charts, ... isn't a real data user ;)

👤 telecomsteve
Lesson learned: start with fewer metrics and observe how they are used and interpreted. It is much easier to expand correctly from there. Collecting requirements in a single pass and building a monolith is rarely as productive as it seems - because the barrier to adding things and shifting responsibility to the dashboard is so low in the beginning, that it can easily become a dumping ground.

👤 anymouse123456
I've seen so many of these projects over the years, and they are almost always used for success theater, promotions or just plain ego.

- What do you hope to learn from this tool?

- Is there a less expensive way to get this information?

- The data will do one of three things: go up, go down, or stay the same. Ahead of time, what will you do in each case? Asking me to change the direction of the line is not an acceptable answer. Do we still need to make the chart? Or were all three answers the same?

- This is not a one-and-done project. The moment some visibility emerges in the fog, you will be desperate for more answers. We must set up a process for the never ending litany of questions that will emerge from this work.

- Smaller is better, incremental, fast iteration and ability to change are all far more important in dashboard work than stable, long term, deeply reliable.

- This is the conversation I have even with myself as I work on data for my own company.


👤 h1fra
For external dashboards, not internal:

- You can output the most elegant metrics, but you will never know if they were the right ones until you talk to actual customers. Most of the time, they don't even understand what is presented.

- Use libraries and UI kits made for this; they will save a huge amount of time.

- Whatever you do, it will never be enough, it will be wrongly interpreted, and it will be used in the wrong context.

- Try to tie graphs and metrics to use cases or questions. E.g. titling a chart "Active users" vs "How many users were active* in the last 30 days?" (* define active in the description) can make a huge difference in comprehension.


👤 analog31
This is completely an aside, but whenever I see "dashboard" I think of those colorful plastic toy dashboards that are given to children sitting in the back seat of the car, so they can pretend that they're actually driving.

👤 james-revisoai
Use colours and graphical elements (generated graphs), but:

Obey rules of spacing more carefully than other rules, to avoid overwhelming users.

Do not use colours unless they signal information, so users can be alert when needed and relaxed otherwise.

As soon as you have more than 2 types of information, have expanding panels, which remember whether the user expanded/collapsed them.

Lastly, remember that speed of loading data matters much more for dashboards than for a random page. Cache data, or initially load only summary data, or load only the latest day by default and then fetch the week's data. Remember that clients may make purchasing decisions based on how fast your SaaS's stats page loads while they are showcasing it to their C-suite; a 15-second wait can cost you an enterprise sale.
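A sketch of the caching idea above, with invented names (`TtlCache`, `load_summary` are not from any particular stack): serve a cheap cached summary immediately, and only fetch detail on demand.

```python
import time

class TtlCache:
    """Tiny TTL cache so repeat dashboard loads skip the expensive query."""

    def __init__(self, ttl_seconds=60):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (expires_at, value)

    def get_or_compute(self, key, compute):
        now = time.monotonic()
        hit = self._store.get(key)
        if hit and hit[0] > now:
            return hit[1]          # fresh enough: no query runs
        value = compute()
        self._store[key] = (now + self.ttl, value)
        return value

# Hypothetical usage: the summary is a cheap aggregate (latest day only);
# the full week's data would be fetched separately after first paint.
cache = TtlCache(ttl_seconds=30)

def load_summary():
    return {"active_users": 1234}  # stand-in for a cheap aggregate query

summary = cache.get_or_compute("summary", load_summary)
print(summary)
```

Within the TTL window, every dashboard open after the first is a dictionary lookup rather than a database round trip.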


👤 jtthe13
No matter what the client says, ensure your prototypes load fast. I had a project turn sour because the C level test end users couldn't be bothered to wait 20 seconds, despite us telling them it was normal.

👤 frgtpsswrdlame
Ah, I saw a great tweet that captured a lot of my feelings about this the other day: https://twitter.com/InsightsMachine/status/17018601232984842...

>“Data is the new oil.” Clive Humby, 2006

>“Most of my career to date seems to involve redesigning legacy reports to make it easier for existing users (if any) to see that they contain absolutely no actionable insight with a lot less effort.” Jeff Weir, 2023

For my perspective:

In general, I find most users can't actually say whether they need any given number/visual on an ongoing basis. So large amounts of work go into building dashboards that are used for a very short amount of time and then discarded. Probably we should do a better job on one-off analyses and only dashboard after the fact.

Many users don't actually want a dashboard, what they actually want is a live data dump into excel where they can pivot table it. Maybe, maybe a bar or line chart.

In general, I find people always ask for more filters, more slicers, just endless options to reconfigure the data as they please. But they quickly become trapped in a swamp of their own making, now nobody knows how this should be sorted or sliced, does it even make sense to do it this way? People think what they want is a 'data democracy' with hundreds of dashboards with hundreds of options with hundreds of users and so they ask for and usually receive it. But they usually just end up coming back to the data team and asking - 'so what's the answer?' What many orgs need is actually a data dictator.

On the other hand, dashboards do allow you to establish really good feedback loops within the business so when you can identify an ongoing constraint, figure out how to track it and then force people to receive it on a regular cadence and be accountable to it, you can make a lot of headway. But that's a more niche use-case than how they're frequently used and the skills involved are different - less visualization skills, more business analysis - and you need to be positioned to make sure someone is held accountable.


👤 itsoktocry
I've worked on many analytics projects across a number of companies as a consultant. I'm a big believer in "decision support systems". Find out what decisions your customers need to make, repeatedly, to do their job. Quantify the heuristics and visualize that information (and that information only) in an easy-to-consume manner. More often than not that's an email or PDF. Another advantage is that by supporting the business users, they feel less threatened by the changes or technology.

I think "self-serve" analytics is silly - the idea that you put all of the data in front of people and they'll derive "insights". That's not how normal people or data work. We just had a discussion on HN the other day about Facebook's Prophet and its pitfalls. Meanwhile we expect Joe in sales to identify useful trends on a chart he made. Every company needs to forecast, regardless of their sophistication. That stuff needs to be defined by the right people and given to the users.


👤 Klaster_1
As a developer who works on database management system monitoring tools, user-facing monitoring dashboards have been my bane for a while. I don't know much about the situation in other companies and products, but here are the main pain points I've encountered:

1. Nobody knows what to monitor exactly, every new dashboard is based on a guess.

2. There isn't much user feedback to base decisions on if you don't have many users to begin with.

3. Often, the metrics exposed by the app being monitored prove grossly inadequate, or suitable metrics do not exist.

4. You can't just add new metrics. Users have to update the whole distributed app for the new metric to become available. This has to be accounted for at the UI design stage.

5. Somebody has to spend a significant amount of time gathering all the information from random people in the company, because see 1.
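Point 4 above can be accounted for at the UI layer with a version-gated metric registry. A minimal sketch - the metric names and version numbers here are invented for illustration:

```python
# Hypothetical registry: each metric records the app version that first
# exports it, so the UI can degrade gracefully on older deployments
# instead of showing empty panels.
METRICS = {
    "query_latency_p99": (1, 0),   # available since v1.0
    "wal_bytes_written": (1, 4),   # only exists from v1.4 onward
}

def available_metrics(app_version):
    """Return the metrics the given deployed version actually exports."""
    return sorted(m for m, since in METRICS.items() if app_version >= since)

print(available_metrics((1, 2)))  # old deployment: latency only
print(available_metrics((1, 5)))  # new deployment: both metrics
```

The dashboard then renders panels only for `available_metrics(...)`, and can show an "upgrade to see this" placeholder for the rest.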


👤 tomrod
james-revisoai captured most of the main ideas.

A few things not emphasized well:

1. Make it accessible. At some point, virtually all of us will have some form of accessibility issue. 508 compliance is a solid standard for this, though it can be a pain to manage without starting with it from the get-go.

2. Make it tabbable (similar to accessible).

3. On the development side, make it able to render client-side OR server-side -- not every dashboard will have or need a rendering server. In Python, Altair is the only interactive, client-side-rendered option I'm aware of. This matters for payload size.

4. Related to 3 - keep payload size in mind. Make it transparent, in debug logs or similar, how large the elements going across the wire are.


👤 polyterative
Make as many metrics as you can configurable. What I mean is that a chart's source of data should be configurable, in its form and in its colors. Also allow users to filter the data coming into the charts; users love messing with the data before exporting it to their pointless and boring PowerPoint presentations.

👤 tobr
In my experience, if your plan is to make a “dashboard”, you’re already on the wrong path. It’s too generic and says nothing about what problems you are there to solve. Think about it yourself: in how many of the products that are important in your life is there any meaningful value produced by a dashboard?

Dashboards seem alluring because we imagine that users will sit there and somehow have insights delivered to them automatically. It’s often less clear what those insights will be or what is needed to produce them, we somehow hope they will materialize by just displaying some data. Often the focus is on making pretty-looking charts (which only ever look good when you demo with picturesque fake data), because you want the product to feel colorful, welcoming and visual.

A better approach is to either make a focused tool for solving a specific problem you know users have - you won’t think of what you end up with as a “dashboard” but it might occasionally end up looking a little like one - or to make general tools that allow users to dig through data interactively to find the things they care about.


👤 petespeed
Similar to supervised and unsupervised learning, one can see dual paths on this journey. One path answers the questions already in users' minds. The other explores unasked ones to find new insights.

👤 rukuu001
Balm for my heart.

I'm looking after a decision support system at the moment, and am encountering all the challenges raised here. Glad to see my experience is not unique.


👤 NicoJuicy
Make everything exportable to csv / excel.

You'll never cover all the edge cases of the people who actually use it.


👤 anonu
Some thoughts:

- a clean data pipeline is critical. Is your data pipeline manageable? Is it observable? Is it monitorable? Can you make changes quickly at different stages? How do overrides work? Does your data pipeline have entitlements? (Can private data points be provisioned to specific users?)

- Should you implement your own dashboard? Or are you reinventing the wheel? Can you reuse/recycle existing BI tools? What are the licenses involved? Power BI is proprietary to Microsoft and will have per-user economics. Grafana is AGPL; be very careful with anything AGPL in your tech stack because it may force you to open source your code. Apache Superset is pretty cool. I've seen big startup valuations built on off-the-shelf BI tools. If it's an MVP, definitely consider using one of these as opposed to rolling your own.

- Making assumptions for your users is bad because users will always ask for more. So building a flexible framework where users can add/remove visuals and build their own analytics may be necessary. The flip side is that this adds complexity and can confuse the user. It's a delicate balance to cater to all types of users: the basic user vs the power user.

- How do users send you feedback? Bad data point? How do you find out? Can the user address it themselves?


👤 hughess
I spent 5 years leading a data team which produced reports for hundreds of users.

In our team’s experience, the most important factor in getting engagement from users is including the right context directly within the report - definitions, caveats, annotations, narrative. This pre-empts a lot of questions about the report, but more importantly builds trust in what the data is showing (vs having a user self-serve, nervous that they’re making a decision with bad data - ultimately they’ll reach out to an analyst to get them to do the analysis for them).

The second most important factor was loading speed - we noticed that after around 8 seconds of waiting, business users would disengage with a report, or lose trust in the system presenting the information (“I think it’s broken”). Most often this resulted in people not logging in to look at reports - they were busy with tons of other things, so once they expected reports to take a while to load, they stopped coming back.

The third big finding was giving people data where they already are, in a format they understand. A complicated filter interface would drive our users nuts and turned into many hours of training and technical support. For this reason, we always wanted a simple UI with great mobile support for reports - our users were on the go and could already do most other things on their phones.

We couldn’t achieve these things in BI tools, so for important decisions, we had to move the work to tools that could offer text support, instant report loading, and a familiar and accessible format: PowerPoint, PDF, and email. Of course this is a difficult workflow to automate and maintain, but for us it was crucial to get engagement on the work we were producing, and it worked.

This experience inspired my colleague and me to start an open source BI tool which could achieve these things with a more maintainable, version-controlled workflow. The tool is called Evidence (https://evidence.dev) if anyone is interested.


👤 sceaux
About a year ago my (new-ish founder) boss came to me and asked me to build him a custom dashboard. "I have all the data in a spreadsheet but I want it in a dashboard," he said. I was a specialized systems dev, only occasionally doing a bit of webdev if necessary, and really didn't have time for that kind of errand.

I showed him a tutorial I had recently seen, just a few minutes long with a slick thumbnail, about how to build a "dashboard" in Excel. https://youtu.be/z26zbiGJnd4?si=HWn8qTbozD8vmXiF

"Oh wow, I didn't know excel could look so beautiful!". He asked for the link, never did anything with it of course but was totally satisfied. I am pretty sure he just wanted a shiny toy and also felt inadequate about "just using excel" to do his important founder work. Showing him that excel can look beautiful and is a powerful tool was enough. No more feeling inadequate, no need for an actual (or even excel) dashboard.


👤 logason
imo there are three core pillars you have to get right here:

1. Relevant: Don't just build a dashboard for the sake of building a dashboard. First, understand what the goal of the user is, and what metrics they'll want to look at to understand their progress towards that goal

2. Reliable: You only have one shot to get this one right. As soon as you present incorrect data to your users, you've lost their trust forever, so make sure you have solid tooling in place across your data stack that ensures data quality, from collection, through transformations to query time

3. Accessible: The data the user will be looking at needs to be either self explanatory, or the user has to have access to documentation that describes the data they're looking at in detail.

For point 1/, here's a framework to help you identify which metrics to focus on: https://www.avo.app/blog/tracking-the-right-product-metrics


👤 spark1212
One of the hardest challenges is ensuring alignment with the end user from ideation to delivery. It can be tough to figure out what the end user needs in the first place, let alone the details of how to define individual metrics or slice the data. This is a huge pain point for both externally and internally facing deliverables, but it's especially tough for external clients because you're likely a lot more limited in your ability to communicate ad-hoc to clarify things down the line. And once you've delivered something that's either irrelevant or inaccurate, then it can end up being game over for the engagement (if you're working externally) or your counterpart's trust in your output (if you're working internally).

So it's super important to get on the same page RE: goals and expectations and keep that alignment going to the end - so that there aren't any unpleasant surprises at the delivery stage. Some more on who to get involved and how here: https://www.avo.app/blog/who-should-be-involved-in-tracking-...


👤 syndicatedjelly
Tons of great advice in the comments. At the risk of repeating others, here's what I've learned working on business intelligence tools for an engineering group:

- What users ask for and what users really want are often extremely different.

- Engineering executives like to place their "thumbprint" on every dashboard you make. They want evidence that the "intelligence" being reported has been customized by them. It's their way of imparting branding on the organization.

- UI/UX is far more important to these users than how you handle the technical details. When discussing implementation with them, start with the UI so that they have a mental model to build from.

- Leave space to create cool things that you/your team want to make. The developers of BI dashboards often have excellent ideas for visualizing data that an end-user would not immediately think of. Leave room to "delight".

- Never assume the data is clean or accurate (even when there are regulatory reasons for it to be either of those things)

- Not everyone's opinion is equally valuable.

- Beware of corporate politics. I once had an analytics project completely shut down because it would expose certain weaknesses in the business that were not acceptable to discuss publicly.


👤 keiferkif
From what I've seen, people just want the data dumped into an Excel document so they can do their own analytics.

👤 alexpetralia
* Design matters a lot - if it looks bad, people won't look at it.

* Layout for dashboards is almost completely formulaic. A panel for selected high-level stats (user growth % increase from last year, user % increase from last month, # new users added), a panel for breakdowns (user growth by marketing channel, user growth by registration cohort), a panel at the top for filters ("let's filter the entire dashboard by just this marketing channel, or just this registration cohort") identical to all breakdowns provided, and finally a drill-down ("show me the users in this particular aggregation"). It took me a very long time to learn that this design is entirely cookie-cutter for good reason. Users always want the same things: stats, breakdowns, filters and drill-downs.

* Padding matters, font matters, color palette matters, no typos matter, visual hierarchy matters (i.e. big bold numbers versus smaller grey numbers).

* Always define the key metrics first (based on fact tables). All dimensions and drill-downs in the dashboard will derive from these front-and-center stats.

* Reconcile to existing metrics before broadcasting widely - almost always, people have the same stats in extant technologies (i.e. Excel, Mixpanel, Salesforce) and will instantly find inconsistencies between your figures and the extant ones.

* The vast majority of users will be passive viewers. Very few users will be "power" EDA (exploratory data analysis) users. EDA views will look different from the view that passive viewers want - keep them separate

* Obviously, the more that is done in code (which promotes modularity and composability), the fewer data-integrity issues you will have.


👤 bdcravens
Users often catch what they see as conflicts, and you need to answer for this.

Often it's something like a different interpretation of data in multiple places (revenue in one place, profit in another) or differing date logic (one query includes a date in the range, others go "up to" that date, etc.). Caching is another issue, especially if you selectively cache only slow queries.

To minimize this, always have an explanation on the chart/card (even if it's hidden but clickable to show)


👤 __tyler__
The biggest help we got was meeting directly with our customers and asking them, "What would it take for you to log in every day to view this dashboard?" They clearly provided metrics and trends they care about but have a hard time getting access to. Also, don't get fancy with your visuals. Lots of big KPI metric visuals, tabular visuals, line charts & bar charts. Users should be able to glance at the visuals and immediately know what's going on and get a sense of what each visual is conveying.

Another thing customers love is the dynamic ability we give them to switch how certain visuals are grouped or what value is being displayed. We can't foresee all the different ways users will want to slice and dice the data, so giving them that ability was huge.


👤 gorbachev
#1 mistake people first dabbling with dashboards make is to show absolutely everything.

Don't do that. Show only the things users need to act on what's on the screen. Minimize the information, make it "glanceable".

If you have a troubleshooting dashboard, and you're showing 999 items with nothing going wrong, that one item that's actually wrong is not going to pop.


👤 BillFranklin
As someone who has made a ton of Grafana dashboards over the years: be prepared for users to hold it wrong. Data visualisations should fail/degrade in clear and expected ways. Users are often surprised when dashboards/charts hit some limit (e.g. they write a non-performant query). A BigQuery-style design (async first, fair queueing) is best if you're letting users write their own queries on their own datasets.
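The async-first idea can be sketched in a few lines. This is a toy illustration, not any particular product's API: submitting a query returns a job id immediately, and a worker drains a FIFO queue, so the UI polls for status instead of hanging on a slow query.

```python
import itertools
import queue

# job_id -> "pending" | ("done", result). In a real system this would be
# a database or cache shared between the API and the workers.
jobs = {}
job_ids = itertools.count(1)
work = queue.Queue()  # FIFO gives rough fairness across users

def submit(user, sql):
    """Accept the query and return at once; the caller polls for results."""
    job_id = next(job_ids)
    jobs[job_id] = "pending"
    work.put((job_id, user, sql))
    return job_id

def run_worker_once():
    """Execute one queued job (stand-in for running sql with row/time limits)."""
    job_id, user, sql = work.get()
    jobs[job_id] = ("done", f"rows for {user}: {sql}")

jid = submit("alice", "SELECT 1")
print(jobs[jid])   # still "pending" right after submit
run_worker_once()
print(jobs[jid])   # now done
```

Because the submit path never runs the query, one user's pathological query can slow the worker pool but never the "is it done yet?" endpoint.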

👤 otronjones
A common saying in statistical consulting is that the entire job is just asking "what question are you trying to answer?" over and over again.

Building dashboards that will actually be useful requires the same approach.


👤 z5h
Worked at a place providing financial research data and models to investors. We spent a lot of effort creating infinitely flexible and customizable reporting and dashboards. Turns out no one used that. Everyone just wanted a general high level report emailed to them.

👤 tlarkworthy
The phenomenal cost of hosting low-latency realtime dashboards for everyone is real. Tons of memory is required if you want them to open quickly for everyone. I wish they could be served more dynamically: if you saw a user logging in, you could populate the query before they reached the page. As it was, it seemed like we had to serve a zillion dashboards no one was actively reading.

👤 danielvaughn
They always seem easy at first. They're never easy. Anyone can toss up a visualization, in fact you don't even need to know how to code, just load up a CSV in Google Sheets and drag it into Google Data Studio.

The hard part is knowing what information to surface, and how to drive the user towards those insights in an intuitive way. You need a strong team that intersects product, data science and UX. Engineering is the least important aspect of it.


👤 kfor
Lots of great suggestions here, but one I haven't seen is providing deep links. Let users share the exact state of their dashboard with others, ideally without requiring some convoluted system of logging in and sharing things. We implemented it by allowing a json config in the url, then providing a button to copy a shortened URL containing the whole config.

Original creator of (the now woefully dated-looking) GBD Compare [https://vizhub.healthdata.org/gbd-compare/] here, where we found this super useful since we had so many controls that it could take a lot of clicking (and knowledge of the UI) to recreate a graph someone else was looking at. It really helped with reach, as folks could email/tweet their specific view then others could use that as a starting point to dive in without starting from scratch or having to create an account.
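One minimal way to implement such deep links (assuming nothing about the original implementation): serialize the dashboard config to JSON and pack it into a URL-safe token, which a shortener can then compress further.

```python
import base64
import json

def encode_state(config: dict) -> str:
    """Pack the dashboard state into a URL-safe query-param value."""
    # sort_keys + compact separators make the token deterministic,
    # so identical views always produce identical links.
    raw = json.dumps(config, sort_keys=True, separators=(",", ":")).encode()
    return base64.urlsafe_b64encode(raw).decode().rstrip("=")

def decode_state(token: str) -> dict:
    # restore the base64 padding stripped at encode time
    raw = base64.urlsafe_b64decode(token + "=" * (-len(token) % 4))
    return json.loads(raw)

# Hypothetical dashboard state with a few controls set
state = {"metric": "deaths", "year": 2019, "countries": ["USA", "KEN"]}
token = encode_state(state)
print(f"https://example.com/dashboard?s={token}")
assert decode_state(token) == state  # round-trips exactly
```

Anyone opening the link gets the exact same controls pre-set, with no login or server-side saved view required.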


👤 revenga99
I'm currently in the middle of building an overly complicated analytics platform with an "easy mode" and an "advanced mode": users pick the devices and metrics they want on the graph, and if they toggle advanced, it shows the SQL used to create the graph. Then they can edit the SQL or do whatever they want.

Giving customers "secure" sql access was a must have feature from upper management, and it was very tricky/a nightmare to get right.

Customers actually liked it though, sql is king.

Advice I would give: make sure your analytics API's data models and queries are well thought out and extensible. Otherwise it becomes very hard to change them and rework the UX.


👤 systems
My advice is mainly about taste.

I always assumed I had good taste, and that my designs were good-looking and should appeal to others.

Different people have completely different ideas of what is usable and what good taste is, so be open-minded enough to listen and accommodate the tastes of others.


👤 JoeyBananas
#1 lesson is "don't be distracted by cool figures." The actual important stats are just numbers, or a list of strings, or an ISO timestamp, etc.

I regularly tell customers "Open up this JSON, hit Ctrl-F and search for the stats that you need." And they're like "Thank you, you just saved me 50,000 hours of work."