Case Study

Hebbia helps knowledge workers save thousands of hours with Claude

Hebbia is the AI platform for knowledge work, serving more than a third of the top 50 asset managers as well as tier-1 investment banks and law firms. Hebbia uses Claude to power their platform, analyzing vast amounts of complex documents and generating actionable insights.

The need for an AI financial and legal analyst

Knowledge workers across industries face a critical challenge: processing, analyzing, and extracting insights from vast amounts of complex information efficiently and accurately. As data volumes grow, traditional retrieval methods such as retrieval-augmented generation (RAG) fall short, unable to handle the nuance, context, and scale required for meaningful insights.

This challenge is particularly acute in finance and law, where workers must analyze thousands of dense documents to make critical decisions that require multiple steps of reasoning. The available tools compound the problem: basic AI chat applications fail to handle 84% of real-world queries, according to Hebbia Product Manager Divya Mehta. "These questions required analyzing vast amounts of data across multiple documents. Simple retrieval of relevant text wasn't enough—users needed comprehensive analysis to generate meaningful insights," she explained.

Why Hebbia chose Claude

Hebbia's relationship with Claude is twofold—they offer it as a model choice for customers while using it to power core platform features. For customers, Claude excels at complex analysis tasks. "Users choose Claude for analysis over technical documents, credit agreements, or legal documents, especially when asking nuanced questions requiring detailed and descriptive answers," Mehta said.

Internally, Hebbia leverages Claude to build key platform features, with meta-prompting being a cornerstone of their approach. "All of our meta-prompting in Hebbia is done using Sonnet," Mehta explained. Meta-prompting allows Hebbia to automatically generate optimal prompts for different types of analysis tasks, essentially creating prompts that generate other prompts. Their prompt generator, for instance, was "largely inspired by the Anthropic prompt generator." This sophisticated approach eliminates the need for users to master prompt engineering themselves.
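
To make the idea concrete, here is a minimal sketch of the meta-prompting pattern using the Anthropic Python SDK: one call asks Claude Sonnet to write a task-specific analysis prompt, and a second call runs that generated prompt against a document. The model alias, helper names, and meta-prompt wording are illustrative assumptions, not Hebbia's implementation.

```python
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

# Hypothetical meta-prompt: asks the model to write a prompt, not an answer.
META_PROMPT = (
    "You are an expert prompt engineer. Write a detailed prompt that another "
    "model can follow to answer the question below against a financial or "
    "legal document. Return only the prompt text.\n\n"
    "Question: {question}"
)

def generate_prompt(question: str) -> str:
    """Step 1: have Claude Sonnet generate the task-specific analysis prompt."""
    response = client.messages.create(
        model="claude-3-5-sonnet-latest",  # illustrative model alias
        max_tokens=1024,
        messages=[{"role": "user", "content": META_PROMPT.format(question=question)}],
    )
    return response.content[0].text

def analyze(document: str, question: str) -> str:
    """Step 2: run the generated prompt against the document text."""
    analysis_prompt = generate_prompt(question)
    response = client.messages.create(
        model="claude-3-5-sonnet-latest",
        max_tokens=2048,
        messages=[{
            "role": "user",
            "content": f"{analysis_prompt}\n\n<document>\n{document}\n</document>",
        }],
    )
    return response.content[0].text
```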

"Investment analysts who leverage Hebbia love that they no longer need to become prompt engineers thanks to Hebbia's ability to translate natural language into the most performant prompts," notes Raymond Verbeke, Strategy Lead at Hebbia. This philosophy allows Hebbia to focus on their core innovation—orchestrating complex document analysis—while leveraging Claude's proven capabilities for critical platform features.

How Claude powers the Hebbia platform

Hebbia's Matrix lets users analyze hundreds of documents simultaneously. When users input queries, Claude helps in multiple ways, as sketched in the example after this list:

  • Generates prompts for complex analysis
  • Provides metadata and quick summaries through Claude Haiku
  • Offers detailed analysis of technical documents via Claude
  • Enables natural language interactions so users don't need to write technical prompts
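
The sketch below illustrates one way this kind of model routing can look with the Anthropic Python SDK: a fast Claude Haiku call for metadata and quick summaries, and a Claude Sonnet call for detailed analysis. The model aliases, prompts, and function names are assumptions for the example, not Hebbia's production code.

```python
import anthropic

client = anthropic.Anthropic()

# Illustrative model aliases; substitute whichever versions you have access to.
FAST_MODEL = "claude-3-5-haiku-latest"    # metadata and quick summaries
DEEP_MODEL = "claude-3-5-sonnet-latest"   # detailed technical analysis

def quick_summary(document: str) -> str:
    """Fast, inexpensive pass for metadata-style summaries."""
    response = client.messages.create(
        model=FAST_MODEL,
        max_tokens=300,
        messages=[{
            "role": "user",
            "content": f"Summarize this document in three bullet points:\n\n{document}",
        }],
    )
    return response.content[0].text

def detailed_analysis(document: str, question: str) -> str:
    """Slower, higher-quality pass for nuanced questions."""
    response = client.messages.create(
        model=DEEP_MODEL,
        max_tokens=2048,
        messages=[{
            "role": "user",
            "content": f"{question}\n\n<document>\n{document}\n</document>",
        }],
    )
    return response.content[0].text
```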

Hebbia delivers speedy document analysis with Claude

Hebbia accelerates document analysis and improves accuracy by integrating Claude into their platform. The platform's ability to handle massive document sets in parallel means "running 700 rows is actually not that different than running 10," said Mehta. Speed is paramount for Hebbia's customers, and the company is constantly innovating to deliver faster results.
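
A rough sketch of why row count matters so little: the per-row calls can be fanned out concurrently. This example assumes the Anthropic Python SDK's async client; the model alias and function names are illustrative, and a real system would add rate limiting and error handling.

```python
import asyncio
import anthropic

client = anthropic.AsyncAnthropic()

async def analyze_row(document: str, prompt: str) -> str:
    """One row: a single prompt applied to a single document."""
    response = await client.messages.create(
        model="claude-3-5-sonnet-latest",  # illustrative model alias
        max_tokens=1024,
        messages=[{"role": "user", "content": f"{prompt}\n\n{document}"}],
    )
    return response.content[0].text

async def analyze_all(documents: list[str], prompt: str) -> list[str]:
    # Fan the rows out concurrently; 10 or 700 rows follow the same code path.
    return await asyncio.gather(*(analyze_row(doc, prompt) for doc in documents))

# results = asyncio.run(analyze_all(documents, "Extract the interest rate."))
```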

They're particularly excited about Claude's prompt caching capabilities to speed up analysis. In their document processing, "the prompt is a handful of tokens, but the document is exactly the same every single time," Mehta said. With 95% of tokens repeated across document analysis, prompt caching can reduce response times by up to 85%. This is key for follow-up questions, where users need rapid iterations on their analysis. "Customers want the best answer in the fastest amount of time," Mehta noted. "With prompt caching, the minute users ask a follow-up question, we can load responses instantly." This speed will let users dive deeper into their analysis, exploring different angles and insights in real-time.
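
The sketch below shows the general shape of prompt caching with the Anthropic Python SDK: the large, unchanging document sits in a system block marked for caching, so follow-up questions reuse the cached prefix and only pay for the short new prompt. The file name, system text, and model alias are illustrative assumptions.

```python
import anthropic

client = anthropic.Anthropic()

# Hypothetical document; in practice this is the large, repeated context.
document_text = open("credit_agreement.txt").read()

def ask(question: str) -> str:
    response = client.messages.create(
        model="claude-3-5-sonnet-latest",  # illustrative model alias
        max_tokens=1024,
        system=[
            {"type": "text", "text": "You are a financial document analyst."},
            {
                "type": "text",
                "text": document_text,
                # Mark this block as cacheable; later calls that share the
                # identical prefix can read it from the prompt cache.
                "cache_control": {"type": "ephemeral"},
            },
        ],
        messages=[{"role": "user", "content": question}],
    )
    return response.content[0].text

# The first call writes the cache; follow-up questions reuse it.
print(ask("What is the maturity date?"))
print(ask("Summarize the covenants."))
```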

Shaping the future of work

Hebbia envisions a future where AI augments knowledge workers' capabilities, allowing them to transcend routine tasks and focus on strategic, high-value work. "Today's smartest people graduate college and work in finance, consulting, or at law firms spending a majority of their time looking at documents, filling out spreadsheets, and doing mundane tasks," said Mehta. By automating these time-consuming activities, Hebbia aims to unlock the full potential of human talent. "If you armed all knowledge workers with this technology, the industry could do so much more."

Looking ahead, Hebbia is advancing toward their vision of a fully autonomous AI knowledge worker. "In three months, users will simply drop a folder into Hebbia, and it will orchestrate the analysis, split the document by categories, run all the matrices, and create the final report," said Mehta. In five years, Hebbia sees AI evolving to achieve true autonomy, working alongside humans as an invaluable partner in knowledge work.

With Claude as a key enabler, Hebbia is not just changing document analysis—they're reimagining the future of knowledge work, creating a world where humans are empowered to think bigger, dive deeper, and achieve more than ever before.