Our embedded analytics underperformed because we assumed users wanted flexibility instead of clear answers. As founder and product lead at WhatAreTheBest.com, I shipped in-app performance signals and dashboards on category pages, intended to help brands and partners understand how product rankings and scores behaved (for example, on this live page: https://whatarethebest.com/shop-pet-supplies/shop-dog-supplies/shop-dog-training-supplies/). While the data was accessible, engagement stayed low because users didn't know which metrics mattered or how to act on them. The assumption we got wrong was that more visibility equals more value. If rebuilding today, I would not ship configurable dashboards first. Instead, I would deliver concise, outcome-oriented insights that answer specific business questions. Albert Richer, Founder, WhatAreTheBest.com
Joern Meissner, Founder & Chairman, Manhattan Review — Link: https://www.manhattanreview.com/gmat-course-online-resources/ We provided embedded practice dashboards to help students identify weaknesses in their content preparation and time management, but the dashboards did not perform well because the views were "overly busy" and the action steps were unclear. In one quarter, fewer than 1 in 5 active students who completed practice tests at least once a week reached the first summary page. Our incorrect assumption was that simply providing more detail in the dashboards would create more learning opportunities. Students actually just needed a simple next step to follow (for example, complete 20 geometry problems within 2 minutes per problem) rather than 12 different charts, filters, and views. If we were building again today, the best starting point would be three decision-oriented scores: accuracy, pacing, and difficulty, adding detail to the dashboard only after the student requests it. I would also never present topic mastery as a single number, as this can create a false sense of security in the student's study decisions.
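A minimal sketch of what that three-score summary could look like, assuming a simple practice-attempt record (the Attempt type, its field names, and the difficulty scale below are illustrative assumptions, not Manhattan Review's actual data model):

```typescript
// Hypothetical practice-attempt record; field names are illustrative.
interface Attempt {
  topic: string;
  correct: boolean;
  secondsSpent: number;
  targetSeconds: number;   // pacing budget for this problem
  difficulty: 1 | 2 | 3;   // 1 = easy, 3 = hard
}

interface SummaryScores {
  accuracy: number;    // share of attempts answered correctly (0..1)
  pacing: number;      // share finished within the time budget (0..1)
  difficulty: number;  // average difficulty attempted, normalized to 0..1
}

// Collapse a practice session into three decision-oriented scores,
// deferring per-topic detail until the student explicitly asks for it.
function summarize(attempts: Attempt[]): SummaryScores {
  const n = attempts.length || 1; // guard against empty sessions
  return {
    accuracy: attempts.filter(a => a.correct).length / n,
    pacing: attempts.filter(a => a.secondsSpent <= a.targetSeconds).length / n,
    difficulty: attempts.reduce((sum, a) => sum + (a.difficulty - 1) / 2, 0) / n,
  };
}
```

The point of the shape is that the student sees exactly three numbers tied to decisions (what to practice, how fast, at what level), with per-topic breakdowns kept behind an explicit request.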
Shan Abbasi, Director of Business Development at PayCompass - High Volume - https://paycompass.com/high-volume/ PayCompass developed an embedded merchant dashboard that gave high-volume sellers real-time access to approval, chargeback, and cash-flow information. Still, the dashboard underperformed because "real-time" did not match the time frame in which users saw approvals, chargebacks, and cash flows reflected in their bank or settlement reports. The product failed on trust: a 2-6 hour delay between the time of a transaction and the time it posted in the system caused users to believe the dashboard's data was inaccurate, even after the transactions had completed and matched. We assumed the customer's primary concern was speed, but many were more concerned with having final, matchable numbers they could use for accounting. The first time this surfaced, a user flagged a "revenue drop" on the dashboard, stopped advertising, and subsequently learned that the transactions were merely pending. If we were to rebuild the dashboard today, I would label each state (authorized vs settled), display the last sync time on every chart, and default finance teams to reconciled views. One thing I would never do again is ship a single blended "cash today" metric without definitions and a way for users to audit that definition.
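A rough sketch of that state labeling, assuming a simplified transaction model (the type names, fields, and states here are hypothetical, not PayCompass's schema):

```typescript
// Illustrative settlement states; 'settled' is the only state finance
// teams should reconcile against.
type TxnState = 'authorized' | 'pending' | 'settled' | 'refunded';

interface Transaction {
  id: string;
  amountCents: number;
  state: TxnState;
  settledAt?: Date; // present only once the transaction is final
}

interface ChartPayload {
  transactions: Transaction[];
  lastSyncedAt: Date;       // shown on every chart, per the lesson above
  view: 'live' | 'reconciled';
}

// Default finance users to reconciled numbers so a pending dip is never
// mistaken for a revenue drop.
function revenueCents(payload: ChartPayload): number {
  const included =
    payload.view === 'reconciled'
      ? payload.transactions.filter(t => t.state === 'settled')
      : payload.transactions.filter(t => t.state !== 'refunded');
  return included.reduce((sum, t) => sum + t.amountCents, 0);
}
```

Defaulting the finance view to settled transactions means a pending dip can never read as a revenue drop, and the lastSyncedAt stamp tells users exactly how fresh each chart is.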
Tom Bukevicius, Principal, SCUBE Marketing — URL: https://www.scubemarketing.com/industries/facility-decor-maintenance We built a web-based client dashboard that gave our clients a multi-channel performance reporting interface, and it quietly failed. Clients used it to evaluate performance across marketing channels, but it did not influence decision-making. The primary cause was that the dashboard presented no clear rules or attribution models, so when clients reviewed multiple ROAS numbers from different sources, such as Shopify, Google Ads, and their CRM, they could not determine which number was correct. We had assumed that a single "source of truth" view would eliminate most questions about cross-platform performance; the opposite occurred, because we never clarified which data points were included, excluded, or delayed in the reporting interface. The first warning sign was the call volume pattern: clients would take screenshots of the dashboard and request access to the underlying spreadsheet. If I were rebuilding the same application today, I would begin by defining the fundamental attributes of each channel (e.g., time zones, click windows, returns) and create a guided story within the dashboard ("What changed this week and why?") with a short notes panel for capturing anomalies. One thing I would never do again is hide data freshness and joins - people will forgive you for delivering bad news, but they will not forgive you for unclear math.
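One way to make those channel fundamentals explicit is a small declarative definition per source, rendered next to every metric it feeds; a minimal sketch, with hypothetical field names and values rather than SCUBE's actual configuration:

```typescript
// Hypothetical per-channel definition, so every ROAS number on the
// dashboard can point at the rules that produced it.
interface ChannelDefinition {
  source: 'shopify' | 'google_ads' | 'crm';
  timezone: string;            // reporting timezone for day boundaries
  clickWindowDays: number;     // attribution lookback window
  includesReturns: boolean;    // whether revenue is net of returns
  refreshLagHours: number;     // how stale the data may be
}

const channelDefinitions: ChannelDefinition[] = [
  { source: 'shopify',    timezone: 'America/Chicago', clickWindowDays: 0,  includesReturns: true,  refreshLagHours: 1 },
  { source: 'google_ads', timezone: 'America/Chicago', clickWindowDays: 30, includesReturns: false, refreshLagHours: 24 },
  { source: 'crm',        timezone: 'UTC',             clickWindowDays: 90, includesReturns: false, refreshLagHours: 6 },
];
```

Showing these definitions alongside each ROAS figure explains why the numbers legitimately differ, instead of leaving clients to guess which one is "correct."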
Our embedded analytics underperformed because we optimised for "impressive" charts instead of helping users make one clear recurring decision. It was built into a multi-location services platform to show owners which locations, channels, and offers drove the best LTV and lowest CAC so they could shift budget and staffing. A public example that's close in concept and structure to what we shipped is here: https://www.zenoti.com/product/analytics-reporting. We assumed users wanted deep drill-downs and dozens of filters, but most just wanted 3-4 stable views they could trust and check weekly. I wouldn't start from the data model again; I'd start from 2-3 key questions the operator asks ("where should I move budget this week?") and design only for those, then layer complexity later. I'd also insist on an in-app onboarding tour tied to a simple workflow, instead of dumping people into a dashboard gallery and hoping they explore. Josiah Roche, Fractional CMO, Silver Atlas, www.silveratlas.org.
I have to admit, embedded analytics didn't quite pan out when I assumed customers would be happy with a whole lot of depth but not much clarity. The dashboards were meant to give customers a live view into how their software was performing - right inside the platform they were using - but the usage just never took off. Users were getting bogged down in it all. What we thought we were giving them was a way to ditch manual reporting and have all the data at their fingertips. But it turned out we'd really just given them too many answers - to questions they probably didn't even need to ask. We thought that having a whole bunch of charts would equate to a whole lot of value, but boy, were we wrong - what users actually wanted was a few questions answered well, not more data to sift through. If I'm being honest, I wouldn't even bother trying to fix those analytics without first figuring out what specific decisions they're really going to help people make.
Our embedded analytics failed because we prioritized pretty, real-time charts over actually helping partners make decisions. The dashboard lived in our partner portal and was designed as a simple view for local tour operators into how their booking funnel, cancellations, reviews, and payouts were doing. A public page we made, called "Partner Dashboard Overview," catalogs what we shipped. Our biggest wrong assumption was that partners would want deep drilldowns; they actually wanted simple weekly benchmarks versus similar tours in their city, plus CSV exports their accountants could easily understand. I would not embed a BI tool via iframe a second time with that many complex foreign keys and row-level security rules; it made mobile slow and the whole thing difficult to maintain. If starting from scratch now, I would anchor on three decisions (pricing, capacity, quality), show only the essential KPIs for those, and push insights by email or alerts so partners wouldn't have to hunt them down.
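As a rough illustration of anchoring on those three decisions, here is a sketch of a weekly email digest keyed to them; the Decision type, the KPI names, and the benchmark fields are all hypothetical, not the portal's actual design:

```typescript
// Hypothetical mapping from the three operator decisions to the only
// KPIs the weekly digest surfaces; everything else stays out of the email.
type Decision = 'pricing' | 'capacity' | 'quality';

const digestKpis: Record<Decision, string[]> = {
  pricing: ['revenue_per_seat', 'price_vs_city_benchmark'],
  capacity: ['occupancy_rate', 'cancellation_rate'],
  quality: ['avg_review_score', 'review_volume'],
};

// One KPI reading, paired with the weekly city benchmark partners wanted.
interface DigestRow {
  kpi: string;
  value: number;
  cityBenchmark: number;
}

// Render one decision's section of the weekly email digest as plain text.
function renderSection(decision: Decision, rows: DigestRow[]): string {
  const lines = rows
    .filter(r => digestKpis[decision].includes(r.kpi))
    .map(r => `${r.kpi}: ${r.value} (city benchmark: ${r.cityBenchmark})`);
  return [`== ${decision.toUpperCase()} ==`, ...lines].join('\n');
}
```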
At Engrave Ink, we created an in-app customer dashboard intended to let our clients monitor their order status, read community story shares customized to them, and see soft engagement indicators such as open rates for our memorial outreach campaigns. The feature simply did not perform well: in this sensitive space, customers rarely returned to it, preferring one-time updates by email while they processed their grief. Our assumption was that people would want constant information about community trends, but most wanted information that helped them find closure. Rebuilding today, I would not force a full dashboard behind another log-in. There is real value in respecting quiet personal moments rather than pushing data onto a screen; these memories live gently in people's hearts, beyond the screen.
The first embedded analytics feature we launched utterly flopped because we delivered a data exploration tool instead of a decision-making tool. We had visions of grandeur and built a data exploration app into a logistics platform to show clients their supply chain inefficiencies. Users resisted the power we gave them and ended up doing all the hard work themselves. Our #1 wrong assumption was thinking people would want to "analyze" the raw data we put in front of them, instead of wanting a straight answer to "which routes were the least profitable this week?" We've launched good dashboards since then (take a peek at our portfolio to see some examples: https://bimg.b-cdn.net/our-portfolio/portImage2.png), and I would never again go to market with a filter-heavy V1 interface. Instead, I'd start with a few clickable "insight cards" giving clear, opinionated answers to the top 3 business questions, and force myself to get narrow enough to deliver actionable intelligence from day one.
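A minimal sketch of what such an insight card could look like; the interface and the example values are purely illustrative, not the author's actual design:

```typescript
// Hypothetical insight card: one opinionated answer per business question,
// instead of raw data behind a wall of filters.
interface InsightCard {
  question: string;        // the business question being answered
  answer: string;          // the straight answer, stated plainly
  evidence: string[];      // the few numbers that back it up
  suggestedAction: string; // what to do about it
}

// Illustrative example values only.
const leastProfitableRoutes: InsightCard = {
  question: 'Which routes were least profitable this week?',
  answer: 'Routes 12 and 47 lost money; everything else was profitable.',
  evidence: ['Route 12: -4.2% margin', 'Route 47: -1.8% margin'],
  suggestedAction: 'Review carrier rates on routes 12 and 47 before next week.',
};
```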