How reMarkable, Salesforce, and OpenAI Set Platform Team Goals
Real-world examples of using a four-layer framework to measure data platform impact

Several months ago, I published an article introducing a framework for setting data platform goals. The framework organized platform goals into four distinct layers, from outputs (features shipped) to direct business outcomes (revenue impact). The article resonated with many people who struggle with the same challenge: how do you prove your platform’s value when you’re several steps removed from the business metrics that matter?
After publishing, I received several messages asking for concrete examples. How are real companies actually using this framework? Which layers are they operating in? What metrics work in practice versus theory?
I reached out to platform managers and leaders and listened to podcast episodes to understand how several companies approach platform goals. I’m happy to share my conclusions in this article.
The Four Layers Revisited
Before diving into the examples, let me briefly recap the four-layer framework for platform goals that I shared in the previous article:
Layer 1: Platform Outputs.
They measure what you ship: features delivered, tickets closed, tools implemented. Easy to track, but they tell you nothing about impact.
Layer 2: Adoption, Engagement, and Reliability.
Goals in this layer measure whether people use what you build and whether it works: adoption rates, active users, and system uptime are examples.
Layer 3: Efficiency Metrics.
They measure productivity gains: time saved, manual work eliminated, faster decision cycles.
Layer 4: Direct Business Outcomes.
Business outcomes directly connect to money, risk reduction, or strategic business metrics, such as revenue attribution, cost savings, and customer retention improvements.
The framework should be read as a hierarchy of preference: aim as high as you can while being honest about what you can credibly measure and impact.
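To make the hierarchy concrete, here is a minimal sketch of how a team might tag its goals by layer and check how high up the hierarchy it currently operates. The goal names are invented examples, not taken from any of the companies discussed below:

```python
from collections import Counter

LAYERS = {
    1: "Platform Outputs",
    2: "Adoption, Engagement, and Reliability",
    3: "Efficiency Metrics",
    4: "Direct Business Outcomes",
}

# Hypothetical goals, each tagged with the layer it belongs to.
goals = [
    ("Ship self-service dashboard tooling", 1),
    ("Reach 60% weekly active analysts", 2),
    ("Keep pipeline uptime above 99.5%", 2),
    ("Cut report turnaround from 5 days to 1", 3),
]

def summarize(goals):
    """Count goals per layer and find the highest layer in use."""
    counts = Counter(layer for _, layer in goals)
    highest = max(counts)
    return counts, highest

counts, highest = summarize(goals)
print(f"Highest layer in use: {highest} ({LAYERS[highest]})")
for layer in sorted(LAYERS):
    print(f"Layer {layer}: {counts.get(layer, 0)} goal(s)")
```

If everything clusters at layer one, that is a signal worth acting on, which is exactly the honest-assessment exercise recommended later in this article.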
Now, let’s move on to the three examples.
reMarkable: Building for Immediate Business Value
Eirik Folkestad leads the data platform team at reMarkable, the Norwegian e-paper tablet company. His team primarily operates between layers two and three, focusing on adoption metrics and efficiency gains.
“We don’t track a single metric,” Eirik explained. “Instead, we assess business impact qualitatively across three areas: enabling insights that wouldn’t otherwise be possible, automation and self-service that save hours on manual processes, and time to insight.”
The team has deliberately moved away from layer-one goals as its platform has matured. In the early days, they needed to build foundational capabilities, including fetching, storing, and transforming data, as well as creating basic reports and dashboards. At that stage, tracking platform outputs made sense. But once those foundations were in place, the team shifted focus.
“With our data platform foundation and core data products now in place, we’re rigged for delivering more layer four goals that help optimize business processes,” Eirik said.
reMarkable’s approach is pragmatic about measurement. They track health indicators, such as active users in self-service analytics tools, and data quality metrics. As you move toward efficiency metrics (layer three) and, where possible, direct business goals (layer four), it is useful to keep adoption and engagement metrics (layer two) as guardrails. But they acknowledge the challenge of direct attribution: “All value is indirect—it flows through our domains and functions.”
The team’s philosophy is straightforward: stay close to the business, understand their needs, and build what they’ll actually use. “If the need is visibility into sales numbers, then deliver that, and don’t create a data catalog that will never be used,” Eirik emphasized. This value-driven approach means developing only features with immediate adoption potential and dropping anything that doesn’t get used.
Salesforce: Anchoring to Efficiency Despite the Challenges
Rounak Mehta leads product for an internal data platform at Salesforce, building a system to turn application telemetry into adoption metrics at scale. His team anchors to layer three (efficiency metrics) as the closest they can get to real business outcomes.
Rounak gave an example: “For instance, one of my OKRs last year was time to metric—how long does it take us to turn product telemetry into adoption metrics.”
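A "time to metric" measurement can be sketched in a few lines. This is an assumed illustration, not Salesforce's actual implementation; the feature names and timestamps are invented, standing in for when telemetry first landed versus when the derived adoption metric became available:

```python
from datetime import datetime
from statistics import median

# Hypothetical events: (feature, telemetry arrived, metric available).
events = [
    ("feature_a", datetime(2024, 3, 1), datetime(2024, 3, 8)),
    ("feature_b", datetime(2024, 3, 2), datetime(2024, 3, 5)),
    ("feature_c", datetime(2024, 3, 4), datetime(2024, 3, 18)),
]

def time_to_metric_days(events):
    """Days from telemetry arrival to a usable adoption metric, per feature."""
    return [(metric_at - telemetry_at).days
            for _, telemetry_at, metric_at in events]

durations = time_to_metric_days(events)
print(f"Median time to metric: {median(durations)} days")
```

Tracking the median (or a high percentile) over time turns a vague sense of "our platform is slow" into a layer-three trend you can set an OKR against.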
The team has operated at the efficiency layer since its founding two years ago, but they’re always looking for opportunities to measure layer four business outcomes. “It’s as you wrote in your article—impact at that layer is usually correlated, not causal,” Rounak explained.
What would Rounak love to measure but can’t? At layer four, he’d track the opportunity cost of applications that couldn’t be built or took too long because of platform gaps. At layer three, he wants more rigorous tracking of time spent finding and using the data his platform produces.
The breakthrough for Rounak’s team came when executives finally felt the pain directly. “One of the big drivers of bringing that pain to light was the time to deploy and scale agents and their performance,” he said. When strategic initiatives stalled because the data platform didn’t exist, suddenly the value became obvious.
His advice to other platform leaders: “Build close relationships with executives who have important and urgent needs that trace back to problems you can solve. As the number and volume of these executives grow, they can build the case for investment better than you would.”
OpenAI: Platform Goals in a Research-Driven Organization
Jake Brill heads the integrity product at OpenAI, managing a platform team that builds shared technology for trust, safety, and core capabilities. His perspective reveals how platform goals work differently in organizations where research and product development are tightly coupled.
The integrity platform team Jake leads focuses on reliability and enablement metrics: latency, uptime, and system maturity. These are layer two metrics, but with a critical distinction—they directly enable other teams to achieve their layer four goals.
“The success metrics for integrity have a different shape from other teams,” Jake explained during a podcast interview I listened to while working on this article. “For our integrity platform team, we view our success metrics as: are we building systems that enable the success of other products?”
“For example, our team is responsible for our identity system,” Jake noted. “We think a lot about how many nines of reliability we have, because if people can’t log into their accounts or sign up, OpenAI isn’t going to accomplish its overall goals.”
This approach reflects a deeper truth about platform work: sometimes your layer-two metrics serve as the foundation for other teams’ layer-four metrics. The platform team doesn’t set direct goals for user growth or revenue, but they build systems that enable those metrics.
What makes OpenAI’s approach distinctive is that they plan quarterly but “write plans in pencil, not in pen,” expecting to accomplish only 60-70% of any given plan. This flexibility acknowledges that platform priorities shift as new capabilities emerge from research.
Jake’s team also faces a challenge common to most platform teams: a massive volume of inbound requests. “Being a platform team, we get multiple times more requests than the average team, both because we’re building shared technology and because we have close partners in operations, policy, and investigations.” In my opinion, you need to try to minimize this while acknowledging that it won’t reach zero and that some capacity needs to be reserved.
Lessons Learned Across Companies
Looking across these three conversations, three clear patterns emerge:
The measurement gap is real. All three leaders struggle with the distance between their work and business outcomes. They’ve made peace with operating at layers two and three, using proxy metrics and qualitative assessment where direct measurement isn’t feasible.
Executive pain drives investment. Both Rounak at Salesforce and Jake at OpenAI emphasized the importance of leadership feeling the direct impact of platform gaps. Abstract ROI calculations matter less than concrete examples of strategic initiatives blocked by missing capabilities.
Value-driven beats feature-driven. Eirik’s emphasis on immediate adoption and Rounak’s focus on urgent executive needs both point to the same lesson: build what people will actually use, not what you think they should want. This requires constant contact with your stakeholders and a willingness to kill features that don’t get traction.
Practical Recommendations
If you’re leading a platform team and struggling to move beyond “features shipped” metrics, here’s what these examples suggest:
Start with an honest assessment. Map your current goals to the four layers. If everything sits at layer one, you have clarity on what needs to change. Don’t feel pressured to reach layer four immediately.
Pick metrics you can actually measure. Eirik initially tracked active users and data quality. Rounak measured time to metric. Jake focuses on reliability and uptime. These are all layer two or three metrics, but they’re concrete, measurable, and connected to value.
Invest in stakeholder relationships. All three leaders emphasized proximity to business needs. You can’t measure impact if you don’t understand what your stakeholders are trying to accomplish. Regular conversations, paired work, and shared goals build the context you need.
Use qualitative assessment when quantitative fails. reMarkable’s approach of qualitative impact assessment across three areas acknowledges that not everything can be quantified. Sometimes “enabled a decision that wouldn’t have been possible” is the best measure you’ll get.
Final Words
The four-layer framework isn’t a destination; it’s a tool for honest conversation about what you can and can’t measure. These three companies show that success doesn’t require operating at layer four. Sometimes layer two or three is exactly right.
What matters more than reaching the highest layer is being transparent about where you are, measuring what you can credibly track, and continuously strengthening the connection between your work and business value. As your platform matures and your stakeholder relationships deepen, opportunities to move up the layers will emerge naturally.
The goal isn’t perfection. The goal is progress, one measurable step at a time.
Enjoyed this post? You might like my book, Data as a Product Driver 🚚


