In just a few years, Heidi Health has gone from a Melbourne healthtech startup to one of the world’s most used AI medical scribes, supporting millions of consultations each week across more than 100 countries. A key part of that journey has been how the team leveraged Amazon’s generative AI ecosystem, especially Amazon Bedrock and the broader AWS platform, to move fast, stay compliant, and convince investors they could scale safely in one of the most regulated industries on earth.

At AWS re:Invent 2025, Ocha Cakramurti, CTO of Heidi Health, gave a talk titled "How Heidi Health is leveraging GenAI to transform the global healthcare industry". One of the key success factors he highlighted was Heidi's use of Amazon Bedrock, a fully managed AWS service that provides a choice of foundation models, customization, agents, and a marketplace of models from multiple providers.
The Problem: Clinicians Drowning in Documentation
Across healthcare systems, clinicians spend nearly as much time on administration as on patient care, with documentation, evidence search, and follow‑up communications consuming large chunks of every shift. Heidi Health set out to build an AI “care partner” that sits alongside clinicians, listens to consultations, and generates structured clinical documentation and follow‑up content so clinicians can focus on patients rather than paperwork.
In just 18 months, Heidi’s platform returned more than 18 million hours to frontline clinicians by automating critical administrative tasks. That kind of impact doesn’t happen without serious infrastructure and model‑ops decisions behind the scenes.
Why AWS and Amazon Bedrock Were a Strategic Fit
AWS became the default backbone because it offers mature regional coverage, strong compliance primitives, and a growing portfolio of healthcare‑specific AI services, such as AWS HealthScribe (powered by Amazon Bedrock) for clinical note generation. For Heidi, Amazon Bedrock added an important layer: instant access to compliant, region‑available foundation models that could be used day one in new markets while their self‑hosted models caught up.

Hybrid Model Strategy: Bedrock for Speed, EKS for Control
At AWS re:Invent 2025, Heidi's team described a hybrid approach:
- Use Amazon Bedrock to access LLM providers that are already compliant and available in the target region, solving the "cold start" problem when entering new markets
- Run self‑hosted models on Amazon EKS (Elastic Kubernetes Service) for long‑term cost control, customisation, and deeper optimisation once volumes justify it
This “Bedrock for speed, EKS for control” model gave Heidi the agility to launch quickly in new geographies while still having a path to their own specialized, heavily tuned models wherever they needed more control or lower unit economics.
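A minimal sketch of what such routing logic might look like. All names here (`RegionProfile`, `choose_backend`, the volume threshold) are hypothetical illustrations, not Heidi's actual implementation; a real system would call the Bedrock runtime API or a self-hosted EKS endpoint behind this decision.

```python
from dataclasses import dataclass

# Hypothetical region metadata; a real deployment would load this from config.
@dataclass
class RegionProfile:
    name: str
    bedrock_available: bool   # a compliant Bedrock model is offered in-region
    self_hosted_ready: bool   # an EKS-hosted model has been tuned for this market
    monthly_volume: int       # consultations per month in this region

def choose_backend(region: RegionProfile, volume_threshold: int = 100_000) -> str:
    """Route to self-hosted EKS once volume justifies it; else fall back to Bedrock."""
    if region.self_hosted_ready and region.monthly_volume >= volume_threshold:
        return "eks-self-hosted"
    if region.bedrock_available:
        return "bedrock"
    raise RuntimeError(f"No compliant model backend available for {region.name}")

# New market: Bedrock solves the cold start.
print(choose_backend(RegionProfile("eu-new", True, False, 2_000)))    # bedrock
# Mature market: self-hosted for unit economics.
print(choose_backend(RegionProfile("au-core", True, True, 500_000)))  # eks-self-hosted
```

The point of the sketch is that the routing decision is pure configuration, so it can be evaluated per-region without touching application code.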
Standardised Infrastructure as Code Across Regions
Heidi wasn’t just solving a modelling problem; they were solving a global infrastructure problem. Each new region came with different:
- Data residency and sovereignty requirements
- Regulatory expectations on healthcare data
- Preferred or allowed AI vendors
- Latency and network characteristics
The solution was aggressive standardisation:
- All AWS infrastructure was defined as code, so a new deployment in a new region was essentially “apply the template”, not “reinvent the stack”.
- Core components (VPCs, security groups, EKS clusters, Bedrock integrations, logging, and monitoring) were replicated with consistent guardrails.
- Model‑routing logic could select between Bedrock‑hosted models and self‑hosted models depending on the region, compliance posture, and performance needs.
This approach made their architecture not just scalable but portable, which is critical when your product must live close to patient data to meet local laws.
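To make the "apply the template" idea concrete, here is a toy sketch of a shared stack definition with small per-region overrides. The structure, region names, and override keys are illustrative assumptions; in practice this role is played by CloudFormation, CDK, or Terraform templates rather than Python dictionaries.

```python
# Shared baseline every region inherits (hypothetical values).
BASE_STACK = {
    "vpc": {"cidr": "10.0.0.0/16"},
    "eks": {"node_type": "m6i.xlarge", "min_nodes": 3},
    "logging": {"retention_days": 365},
}

# Per-region compliance and routing overrides (hypothetical examples).
REGION_OVERRIDES = {
    "ap-southeast-2": {"model_backend": "eks-self-hosted", "data_residency": "AU"},
    "eu-central-1":   {"model_backend": "bedrock", "data_residency": "EU"},
}

def render_stack(region: str) -> dict:
    """A new region deployment is the shared template plus a small override block."""
    if region not in REGION_OVERRIDES:
        raise KeyError(f"No compliance profile defined for {region}")
    return {**BASE_STACK, "region": region, **REGION_OVERRIDES[region]}
```

Because every region renders from the same baseline, guardrails like log retention travel with the template instead of being re-decided per deployment.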
Building Clinician Trust: Evaluation Loops on Top of Bedrock
In healthcare, model quality is not just BLEU scores or ROUGE metrics; it is whether clinicians actually trust and adopt the tool. Heidi leaned heavily on “clinicians in the loop” evaluation processes, supported by synthetic data and LLM‑as‑judge tools, to continuously evaluate and improve their AI.
Their workflow looked roughly like this:
- Generate or collect de‑identified clinical conversations.
- Use LLMs (including those available via Amazon Bedrock) to propose structured documentation.
- Run LLM‑as‑judge pipelines and human clinicians to compare outputs with expected documentation, flag errors, and refine prompts and model configs.
By combining Bedrock’s models with clinician feedback loops, Heidi could iterate quickly while maintaining quality in highly sensitive workflows.
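The evaluation loop above can be sketched in a few lines. `generate_note` and `judge_note` here are stand-ins I've invented for the real model calls (which would go through the Bedrock runtime or a self-hosted endpoint), and the threshold is an arbitrary assumption; the shape of the loop (score every output, flag low scorers for clinician review) is the point.

```python
from typing import Callable

def evaluate_batch(
    transcripts: list[str],
    generate_note: Callable[[str], str],      # candidate model under test
    judge_note: Callable[[str, str], float],  # LLM-as-judge: 0..1 quality score
    threshold: float = 0.8,
) -> list[tuple[str, float, bool]]:
    """Score each generated note; anything below threshold is flagged for clinician review."""
    results = []
    for transcript in transcripts:
        note = generate_note(transcript)
        score = judge_note(transcript, note)
        needs_review = score < threshold
        results.append((note, score, needs_review))
    return results

# Toy stand-ins to show the flow end to end:
results = evaluate_batch(
    ["patient reports mild headache"],
    generate_note=lambda t: f"HPI: {t}",
    judge_note=lambda t, n: 1.0 if t in n else 0.0,
)
```

Keeping the judge and the generator as injected callables makes it cheap to swap Bedrock-hosted judges for clinician panels on the same batch of transcripts.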
Focus on One Workflow, Then Scale Out
Heidi’s team has been explicit about one of their survival strategies: focus obsessively on one workflow before trying to solve everything in healthcare. For them, that first workflow was medical scribing and documentation—listening to the consultation and producing high‑quality notes, letters, and summaries that fit into clinicians’ existing systems.
Once that core workflow was working reliably at scale, they expanded into:
- Evidence search and retrieval to support clinical decisions
- Follow‑up communications and patient‑friendly recaps
- Broader “AI care partner” features to tackle more of the non‑clinical overhead clinicians face daily
Underneath, AWS made it easier to incrementally bolt on new services without re‑architecting from scratch.
Business Outcomes: From Startup to Global AI Care Partner
The technical choices around AWS and Amazon Bedrock didn’t exist in a vacuum; they translated into business outcomes investors could underwrite.
- Adoption: Heidi supports tens of thousands of clinicians across more than 200 specialties, handling millions of consultations per week in over 100 countries and more than 100 languages.
- Impact: Over 18 million hours returned to frontline clinicians in 18 months by automating administrative tasks like documentation and follow‑up.
- Funding: A 65 million USD Series B led by Point72 Private Investments, taking total funding to nearly 100 million USD and valuing Heidi at around 465 million USD.
Investors called out the platform’s potential to relieve administrative burden, improve capacity in strained healthcare systems, and preserve the human touch in patient care, signalling confidence that the underlying infrastructure and AI strategy could scale.
Key Lessons for Builders Using Amazon Bedrock
For founders and builders looking to replicate Heidi’s trajectory, a few practical lessons stand out:
- Start with one workflow and design your architecture so that it can grow horizontally once you nail product–market fit.
- Use Amazon Bedrock to solve the “day one” model availability problem in new regions, then gradually transition pieces to self‑hosted models on EKS where it makes sense for cost and customisation.
- Treat infrastructure as code from the beginning so new regions are a repeatable deployment, not a bespoke project.
- Close the loop with your expert users; in healthcare that means clinicians in the evaluation pipeline, supported by synthetic data and LLM‑as‑judge systems to keep iteration fast but safe.