Featured Work

Multinational Publisher

When I joined this global academic publisher as a strategic BD consultant, their AI Growth function was in its earliest stages. I quickly identified that the central strategic question, how to build recurring revenue into an AI licensing model, had a clear answer: RAG-based licensing.

Over the following year, I rebuilt how the team presented its content offerings to AI model developers, reshaped their content packaging and pricing strategy, and originated and led their first commercial RAG licensing deal: an eight-figure transaction that became the flagship proof point for their AI Growth strategy.

LLM Developer

When I joined this leading LLM developer, there was no data acquisition function, and no standard process for getting a model capability from concept to completion. I built the infrastructure that fixed that.

I created the Data Spec process, establishing how data acquisition and annotation projects were scoped, resourced, and executed across teams. I also built cross-functional frameworks that aligned Sales, Product, Legal, and Data around a unified data strategy.

When I identified that no one was owning a critical NL2SQL model capability improvement for a major enterprise client, I took the lead. Within days, the customer’s SVP of Data & AI was speaking directly to me — not because it was my formal role, but because I was the one driving it.

In one year as Senior Manager of Data, I acquired trillions of tokens and cut STEM annotation costs by 30–50% through strategic data acquisition.

Computer Vision Start-up

When I joined this San Francisco face recognition company, their algorithm was among the top-ranked on the NIST leaderboards. But they were starting from scratch: an FTC Consent Order had required them to delete most of their existing data and the models trained on it. That's exactly when they hired me.

As GM of Data Acquisition, I quickly identified that the real problem wasn't sourcing new data. It was that, after the Consent Order, the team had lost the organizational capacity to engage data subjects directly. I rebuilt that capacity, then executed against it fast.

Within 18 months, I launched a data collection app and a global photographer network that generated more than 2 million images from 40,000 data subjects in a single year, on an extremely limited budget. The result? Measurably reduced error rates in commercial face recognition models across demographic groups.

I find what others won't say out loud,

and fix it.