Why Even Great Data Teams Fail
For over 20 years, data advisor Shachar Meir has built and led data teams everywhere from early-stage startups to global giants like PayPal and Meta. So when he says that most organizations still “have great teams, strong platforms, and lots of data - and nothing works,” it’s worth listening.
In the first episode of בול בדאטה ("Bul BaData"), the data/AI podcast hosted by Hetz partner Guy Fighel, Shachar breaks down why that happens and how companies can fix it. Watch the full episode (in Hebrew), or read through the major themes and takeaways below.
This post distills some of the key lessons from their conversation, expanding on them for founders and data leaders who want to build systems that actually deliver value, not just dashboards.
From Oracle 9i to AI Agents: Three Eras of Data
Twenty years ago, building a data system meant racking physical servers, managing storage manually, and performing nightly backups you prayed would restore. Schema changes required days of coordination. Every query ran through the DBA. The result: control and reliability, but very little agility.
Then came the Hadoop/Data Lake revolution. Suddenly, organizations could dump everything into distributed storage and “figure it out later.” That freedom fueled massive innovation — and an explosion of startups offering narrow solutions to every niche step in the pipeline.
The unintended consequence was what Shachar calls feature-sprawl.
“To move data from A to B today, you need at least seven tools.”
Each one solves a real problem (ingestion, transformation, orchestration, quality, observability, cataloging), but the cumulative complexity has become the bottleneck. The average data engineer now spends more time wiring tools together than generating insight.
We’re now entering a third phase: consolidation. Major ecosystems like AWS, Databricks, Snowflake, and GCP are rebuilding the stack into integrated one-stop shops. The same pattern that once fragmented the space is reversing, as vendors acquire or rebuild the very “features” that spun out as standalone products.
For founders, that shift matters. The bar for point solutions is higher than ever. Integration, workflow fit, and measurable ROI now outweigh novelty.
The Real Reason Data Fails
Most companies don’t fail at data because of poor technology. They fail because their data work is disconnected from business value.
Shachar described seeing countless teams with robust pipelines and talented engineers, yet nobody using the output effectively. Data consumers still pull static reports or chase dashboards that aren’t embedded in real workflows.
A data team’s mission isn’t to build pipelines; it’s to enable better decisions. That requires designing for consumption, not just production.
Imagine being able to pull live performance data directly inside Google Slides while preparing a board deck. Why switch to another tool? Bringing data to the user, rather than forcing the user into data tools, is the next usability frontier.
Buy vs. Build: A Founder's Discipline
Another recurring theme was the buy-versus-build trap. Many engineering-driven organizations prefer to build everything internally, believing they’ll save cost or gain control. In reality, they often end up maintaining half-finished internal products (‘Frankenstein’ projects).
Shachar’s advice: focus your engineering energy on what differentiates your business, not on problems the industry has already solved. Ingestion, data quality, and observability are solved categories. Buy them. Then build the parts that create true strategic leverage.
He referenced a quick LinkedIn poll he ran: more than half of data engineers spend two and a half days per week chasing data-quality issues. For a team of five, that’s the cost equivalent of an extra full-time engineer burned on maintenance.
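A rough back-of-the-envelope check of that claim (the 5-day week and the reading of "more than half" as 3 of 5 engineers are my assumptions, not figures from the episode):

```python
# Back-of-the-envelope cost of data-quality firefighting.
# Assumptions (not from the episode): a 5-day working week, and
# "more than half" of a 5-person team meaning 3 affected engineers.
DAYS_PER_WEEK = 5
AFFECTED_ENGINEERS = 3     # "more than half" of a team of five
DAYS_LOST_EACH = 2.5       # per the LinkedIn poll cited above

fte_lost = AFFECTED_ENGINEERS * DAYS_LOST_EACH / DAYS_PER_WEEK
print(f"Full-time-equivalent engineers lost to maintenance: {fte_lost}")
# 3 * 2.5 / 5 = 1.5 FTE, so "an extra full-time engineer" is, if
# anything, a conservative estimate.
```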
Pricing Models: Simplicity Wins in Enterprise
One of the more practical parts of the discussion touched on pricing, a topic many technical founders underestimate.
Usage-based pricing (“pay per credit or per row processed”) sounds modern and fair, but large organizations hate unpredictability. CFOs prefer fixed or tiered pricing, even if it’s slightly higher, because it’s predictable and easy to forecast.
For early-stage companies, usage-based can still make sense (it scales with customer growth) but founders should always test how the pricing model feels to the budget owner, not just the user.
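The predictability gap is easy to see with numbers. A minimal sketch comparing what a budget owner sees under each model across a spiky workload (all rates, tiers, and volumes below are invented for illustration):

```python
# Illustrative only: monthly cost under usage-based vs. tiered pricing
# for a fluctuating workload. All prices and tiers are hypothetical.
PRICE_PER_MILLION_ROWS = 40.0  # usage-based rate (invented)
# (max million rows, flat monthly price) -- invented tiers
TIERS = [(100, 3_000.0), (500, 9_000.0), (float("inf"), 20_000.0)]

def usage_cost(million_rows: float) -> float:
    """Cost scales linearly with volume: cheap when idle, spiky under load."""
    return million_rows * PRICE_PER_MILLION_ROWS

def tiered_cost(million_rows: float) -> float:
    """Cost is flat within a tier: easy to forecast, steps up at known points."""
    for cap, flat in TIERS:
        if million_rows <= cap:
            return flat
    return TIERS[-1][1]  # unreachable: top tier is uncapped

for volume in [80, 120, 450, 90]:  # million rows per month
    print(f"{volume:>4}M rows  usage=${usage_cost(volume):>8,.0f}"
          f"  tiered=${tiered_cost(volume):>8,.0f}")
```

With these numbers, the usage-based bill swings from $3,200 to $18,000 month to month, while the tiered bill moves between two known steps. That variance, not the average, is what makes the CFO nervous.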
And don’t underestimate the value of a free tier. Let users experience the product before procurement ever starts.
C'mon, Who's Your Actual Buyer?
Data products often face an internal triangle:
- Need sits with the data engineers.
- Authority sits with the CDO or VP Data.
- Budget often sits elsewhere (perhaps with the CIO, CTO, or even the CFO).
If those three don’t align, the sale stalls. Founders must map all three personas early, understand their motivations, and equip champions with the materials they’ll need to convince procurement and legal.
Guy noted that in recent years, data budgets have started migrating upward, from data-specific leaders to broader technology ownership. The same people who buy DevOps or security tools are now also buying data infrastructure. For founders, that means product positioning must speak the language of platform teams, not just analytics.
Governance, Transparency, and the AI Shift
As companies race into AI, the definition of “data” itself is expanding. Call recordings, sensor feeds, and video streams are now all training material. This raises new questions: Who owns the data? How is quality measured? What counts as compliant use?
Shachar drew on his experience leading data for Meta’s Safety & Trust organization. When models make automated decisions (suspending accounts, flagging content, denying credit), transparency becomes existential.
Companies must be able to explain why a model acted a certain way. The more complex and opaque the model, the harder this becomes. Expect regulation to catch up and force explainability. The balance between innovation speed and accountability will define the next few years of AI operations.
A Playbook (in Short)
- Simplify: reduce tool overhead; fewer moving parts mean faster insights.
- Buy where possible: reserve in-house engineering for strategic differentiation.
- Design for consumption: integrate data into real workflows.
- Price predictably: remove uncertainty for enterprise buyers.
- Anchor in business value: measure data success by outcomes, not pipelines.
- Govern early: build transparency and compliance into your systems now.
Subscribe to Hetz Ventures on YouTube to get updated when new episodes release, or follow along on Geektime.