I run a device repair shop in Mississippi, not a dev team, but I tackled the exact same problem when I built out 2,000+ repair guides on our site using AI assistance. The bottleneck wasn't the writing--it was the QA loop where every guide had to wait for manual review before publishing. I restructured our workflow so AI-proofread batches of 50 guides got pre-approved for common device families (all iPhone 12 variants, all Samsung Galaxy S series), then only flagged outliers needing human eyes. That one change cut our publication cycle from 14 days per batch down to 4 days--roughly 70% faster time-to-live content. The bigger lesson: identify which steps are actually identical across most of your work, then cache that decision once instead of re-validating it every single time. In your case, if 80% of your TypeScript modules share the same lint/test/build config, bake that into a reusable target so Bazel only recomputes the 20% that actually changed. We did the same thing with our parts inventory--stopped re-checking supplier stock daily for stable SKUs and only polled fast-moving items, which freed up hours every week.
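To make the "cache the decision once" idea concrete on the Bazel side: if most of your TypeScript modules really do share one lint/test/build config, you can capture that config in a small Starlark macro. This is only a sketch — `ts_module` is a name I made up, and `ts_project` assumes `aspect_rules_ts` is wired into your workspace:

```starlark
# defs.bzl -- hypothetical helper macro that captures the shared config once.
load("@aspect_rules_ts//ts:defs.bzl", "ts_project")

def ts_module(name, srcs, deps = []):
    """Declares a TS module that inherits the repo-wide build settings."""
    ts_project(
        name = name,
        srcs = srcs,
        deps = deps,
        # Shared compiler config declared once at the root, not per package:
        tsconfig = "//:tsconfig",
    )
```

Each package's BUILD file then just calls `ts_module(name = "utils", srcs = glob(["*.ts"]))`, and only the minority of modules that genuinely differ declare their own settings.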
This may be the most fruitful, actionable lesson we learned: Bazel forces you to shift your perspective from package-level builds to explicit dependency management at the level of individual targets. In Nx or Turborepo, it's easy to get sucked into thinking in terms of whole workspaces or libraries. With Bazel, you have to declare every single input and output for every micro-target - sometimes just a couple of files. That initial investment is what produces a truly hermetic build graph, and it's what unlocks the massive speedups; without it, undeclared dependencies will render your CI flaky. The change that drove the biggest cut to our CI run time was adding a shared remote cache, which let us fully reap the rewards of breaking our big, fat monolithic packages into hundreds of smaller build and test targets. Where a small change in a shared library used to invalidate the whole package and re-trigger a rebuild for all its dependents (and their dependents), Bazel could now serve the vast majority of unchanged targets straight from cache. Your results may vary, but they can be quite dramatic: one case study I read on Medium described how Stripe migrated their services into a monorepo using Bazel and reduced their CI time from ~45 minutes to ~7 minutes through the same combination of hermetic builds and caching.
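As a sketch of what the fine-grained targets look like (names are illustrative, and `ts_project` assumes an `aspect_rules_ts`-style rule set - any rule set works the same way), a formerly monolithic package split into per-file targets with every input declared might be:

```starlark
# BUILD.bazel for a once-monolithic utils package, now split into small
# targets. Every source file and dependency is declared explicitly, so a
# change to one file only invalidates the targets that actually use it.
load("@aspect_rules_ts//ts:defs.bzl", "ts_project")

ts_project(
    name = "date_utils",
    srcs = ["date_utils.ts"],        # only the files this target owns
    deps = ["//libs/config:types"],  # every dependency named explicitly
)

ts_project(
    name = "currency_utils",
    srcs = ["currency_utils.ts"],
    deps = [":date_utils"],          # intra-package deps are explicit too
)
```

Pointing all CI workers at the same cache is then a one-line change in `.bazelrc`, e.g. `build --remote_cache=grpc://cache.internal:9092` (endpoint hypothetical); cached targets are downloaded instead of rebuilt.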