Endoscopy Clinical Governance

Structured reporting.
Defensible governance.

Specialist clinical indicator reporting for private day hospitals — aligned to ACSQHC and GESA benchmarks, built for Medical Advisory Committees and executive oversight.

MPH · GradCertEpi · Clinical Governance · NSQHS
Get in touch · View pilot offer
3 Service Tiers · 2–3 Week Turnaround · NSQHS Aligned

Services

Three tiers. One niche.

Each tier is designed for a different stage of governance maturity. Start with quarterly compliance reporting and build toward full enterprise assurance — or begin wherever your facility's needs are greatest.

Tier 1

Compliance Reporting

Pricing on enquiry — contact me to discuss.

  • Quarterly Director/CEO Governance Report
  • Clinician-Level Variance Monitoring Appendix (de-identified)
  • Methodology & Definitions Statement
  • Data validation commentary
  • Benchmark comparison — ACSQHC / GESA standards

Pure reporting. No governance implementation layer. Ideal for facilities establishing a baseline.

Tier 3

Enterprise Assurance

Pricing on enquiry — contact me to discuss.

  • All Tier 1 & 2 deliverables
  • Annual Board-Level Assurance Report (4-quarter rolling)
  • Multi-Site Consolidation Report
  • Governance Maturity Assessment
  • Quality Improvement Portfolio Summary

Board-ready annual assurance documentation and governance maturity framework for enterprise groups.

How It Works

Simple. Structured. No disruption.

The engagement is built around your existing data exports — no system integration, no live data access, no additional workflows for your team.

01
Data Export
You provide a structured procedure export and pathology export from your existing systems — no custom extraction required.
02
Secure Transfer
Data is transferred via agreed secure channel. Handled under a formal confidentiality agreement from day one.
03
Analysis & Reporting
Indicators are calculated, validated, and contextualised against benchmarks. Reports produced within 2–3 weeks of receipt.
04
Delivery
Final reports delivered in PDF format, ready for MAC presentation and executive review.
Data Required
Procedure export + pathology export from existing systems
Turnaround
2–3 weeks from receipt of complete data export
Cadence
Quarterly, aligned to your facility's reporting cycle
System Access
Not required — export-only workflow
Data Handling
De-identified outputs, secure storage, no retention beyond engagement
Payment
Invoiced quarterly in advance, 14-day terms
Not Included
System integration, real-time dashboards, medico-legal opinions, credentialling decisions

About

The specialist behind your reporting

I'm Teyanna Gaeta, an endoscopy governance and clinical indicator reporting specialist with a Master of Public Health and Graduate Certificate in Epidemiology. My work is grounded in years of hands-on experience in clinical governance and colonoscopy quality reporting within the public health system.

I understand what private day hospitals need: structured, credible, benchmark-referenced reporting that holds up under accreditation scrutiny — produced without placing any additional burden on your clinical or administrative teams.

Endoscopy Reporting is an Industry Member of Day Hospitals Australia — the peak body representing private day hospitals across Australia.

All reporting is produced from your existing procedure and pathology exports. No system access required. No disruption to clinical workflows.
Teyanna Gaeta MPH GradCertEpi
📊
Clinical Indicator Expertise
ADR, caecal intubation, SSLDR, withdrawal time, PDR — reported accurately and contextualised against ACSQHC and GESA benchmarks.
🔬
Epidemiological Rigour
MPH and GradCertEpi-trained methodology ensures your inclusion/exclusion logic and statistical interpretation are defensible.
🏛️
Governance Ready
Reports are designed for Medical Advisory Committees, NSQHS accreditation, and executive-level oversight — not just data summaries.
🔒
Confidential Handling
All data is managed under a formal confidentiality agreement. De-identified outputs. No retention beyond the agreed engagement period.

Insights

Thinking on endoscopy governance.

A growing body of work on clinical indicator reporting, unwarranted variation, and what defensible governance actually looks like in private endoscopy.

Article 01

The Governance Gap Nobody in Private Endoscopy Wants to Talk About

Most private endoscopy facilities produce data. Very few produce governance. The gap between the two is where risk lives — and where structured reporting starts.

Read article →
Article 02

Your ADR Is 45%. Does That Actually Mean Anything?

Reporting an ADR and understanding what it means are two different things. Without the right cohort definitions and benchmark context, a number is just a number.

Read article →
Article 03

Nobody Ever Told You Your ADR Was Dropping

In the public system, clinician performance is monitored, benchmarked, and fed back. In private endoscopy, most clinicians have never seen their own data — let alone a trend.

Read article →
Article 04

6 Minutes Is the Minimum. Not the Goal.

A clinician meeting the withdrawal time benchmark may still be the most significant outlier in the room — and no one has told them. The benchmark was never designed to define quality.

Read article →
Teyanna Gaeta

Article 01 · Clinical Governance

The Governance Gap Nobody in Private Endoscopy Wants to Talk About

I spent five years working as a Facility Data Manager in the Queensland public health system. My job was to make sure colonoscopy quality indicators were calculated correctly, reported consistently, and used meaningfully in clinical governance — at hospitals performing thousands of procedures a year, with dedicated teams, established systems, and real accountability structures.

Then I started working with private endoscopy facilities.

The contrast was striking. Not because the clinicians were less capable — they weren't; often quite the opposite — but because the infrastructure simply didn't exist. The same NSQHS standards apply. The same ACSQHC Colonoscopy Clinical Care Standard applies. But the support structures that make compliance meaningful in the public system? Almost entirely absent in the private sector.

That's not a criticism. It's an observation about a structural gap that I think deserves an honest conversation.

What the standard actually requires

The ACSQHC Colonoscopy Clinical Care Standard is not a suggestion. It is the framework against which private endoscopy facilities are assessed during NSQHS accreditation surveys. It calls for:

  • Regular monitoring of colonoscopy quality indicators including adenoma detection rate, caecal intubation rate, and withdrawal time
  • Processes to identify unwarranted variation in clinical practice
  • Mechanisms for clinician-level peer review
  • Governance structures that give executive leadership meaningful oversight of endoscopy quality

Read that list again. Now think about your facility. How many of those four things would you be genuinely confident defending in front of an ACSQHC surveyor?

For most private facilities I work with, the honest answer is: one or two, partially.

How it usually works in practice

Here is the most common picture I encounter. A private endoscopy facility — independent or part of a small group — has a Medical Advisory Committee that meets quarterly. Someone on the committee is a gastroenterologist who cares about quality. The practice manager pulls some numbers from the endoscopy system before the meeting. They're presented. There's a brief discussion. The meeting moves on.

Nobody has applied formal exclusion logic to the data. Nobody has checked whether the pathology linkage is complete. The clinician-level numbers are compared to the benchmark but not to each other. There's no written methodology explaining how the indicators were calculated. There's no action register capturing what was discussed or agreed. There's no structured process for identifying outliers, and no escalation pathway if one is found.

The facility believes it is doing clinical governance. But the output is not defensible.

The unwarranted variation problem

Unwarranted variation is the term used to describe differences in clinical practice — or clinical outcomes — that cannot be explained by patient need or clinical circumstances. In colonoscopy, it shows up as meaningful differences in ADR, withdrawal time, or caecal intubation rates between clinicians performing procedures at the same facility, with the same patient population, under the same conditions.

Unwarranted variation is not unusual. It is expected. The question is not whether it exists — it almost certainly does at your facility — but whether anyone is looking for it, and whether there is a structure to respond constructively when it is found.

In the public system, dedicated governance processes exist specifically to surface and respond to this variation. In the private sector, without structured reporting, unwarranted variation is effectively invisible. The clinician with a withdrawal time that has been below six minutes for three consecutive quarters doesn't know, because nobody has shown them their data in comparison to their peers.

Invisible variation cannot be addressed. And unaddressed variation is, ultimately, a patient safety issue.

Why this is harder in the private sector — and why that's not an excuse

I want to be fair here. The structural challenges in private endoscopy are real. Private facilities don't have dedicated data managers. The practice manager is doing seventeen other things. The gastroenterologists are independent practitioners who don't report to the facility in the same way a salaried clinician does. The endoscopy system may not make it easy to extract clean data. Nobody has the epidemiological training to apply proper exclusion logic to indicator calculations.

These are genuine constraints, not failures of effort or intent. But they don't reduce the compliance obligation. And they don't reduce the governance responsibility that comes with running an accredited facility that performs high-risk procedures on patients who are trusting you with their bowel cancer screening.

The gap between the structural challenge and the compliance requirement is exactly where I work. My argument is not that private facilities need to build an entire data management infrastructure from scratch. It is that structured external reporting — delivered from existing data exports, requiring no system access, aligned to recognised frameworks — can close that gap at a fraction of the cost of what the alternative eventually looks like.

What changes when governance is done properly

I work with facilities that have moved from informal indicator review to structured quarterly reporting, and the difference is consistent and predictable.

The MAC meeting becomes more productive. People arrive having read a pre-read pack that summarises the data clearly. The discussion is focused. Outliers are already identified. The conversation moves from 'what do the numbers say' to 'what do we do about this.'

Clinicians engage differently with their own data when it is presented clearly, benchmarked properly, and delivered in a format that respects their expertise. A well-constructed individual clinician report is not a threat — it is a professional tool. Most clinicians I have worked with respond to good data with curiosity and genuine interest in improvement.

Directors and CEOs can actually exercise oversight. When governance reporting is structured and consistent, executive leadership has something meaningful to review. They are not just being told that quality is fine — they can see it, quarter by quarter, in a format that stands up to scrutiny.

Present, structured, defensible. That's what good governance documentation looks like.

A conversation worth having

I wrote this because I think the gap I'm describing is real, and I think it's under-discussed. Private endoscopy in Australia is a significant part of the healthcare system. The clinicians working in it are skilled and committed. The facilities serving patients are, for the most part, genuinely trying to do the right thing.

But governance infrastructure matters. Data quality matters. Structured peer review matters. And the absence of these things — even in a well-intentioned, well-run facility — creates risk that the current system does not adequately surface.

If you are a director, practice manager, clinical governance lead, or gastroenterologist at a private endoscopy facility reading this and recognising something familiar — I would genuinely welcome a conversation. This is not a complicated problem to solve. It just requires the right structure, applied consistently, by someone who knows what they're looking for.

If this resonates with something you're navigating at your facility, I'd be glad to talk.

Get in touch →

Teyanna Gaeta  |  MPH, GradCertEpi

Endoscopy Clinical Indicator Reporting Specialist

teyanna@endoscopyreporting.com.au

Teyanna Gaeta

Article 02 · Adenoma Detection

Your ADR Is 45%. Does That Actually Mean Anything?

Let's start with a question I get asked more than any other.

A gastroenterologist — experienced, conscientious, genuinely invested in their practice — shows me their adenoma detection rate. It's 45%. Well above the 25% minimum benchmark. They're satisfied. Their facility is satisfied. At the last MAC meeting, someone noted it and moved on.

So why am I not satisfied?

Because 45% doesn't mean much on its own. What it means depends entirely on how it was calculated — and in most private endoscopy facilities I work with, that calculation is shakier than anyone realises.

The benchmark was never designed to be the destination

The 25% ADR benchmark exists for a reason. It was derived from evidence linking adenoma detection to reduced colorectal cancer incidence — specifically, work showing that each 1% increase in ADR is associated with a roughly 3% reduction in the risk of interval colorectal cancer, the cancers that arise between screenings. It is a minimum threshold, not a quality target.
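As a rough worked example of that association (an illustrative sketch only: it assumes the per-point effect compounds multiplicatively, and the function name is mine, not drawn from any cited study):

```python
def relative_interval_cancer_risk(adr_gain_points: float,
                                  reduction_per_point: float = 0.03) -> float:
    """Approximate relative risk of interval colorectal cancer after an
    ADR improvement, compounding the cited ~3% reduction per ADR point."""
    return (1 - reduction_per_point) ** adr_gain_points

# Moving from the 25% minimum to a 30% ADR (a 5-point gain):
risk = relative_interval_cancer_risk(5)  # about 0.86, i.e. roughly 14% lower risk
```

On this sketch, a clinician at 45% is operating in a very different risk environment from one scraping past 25% — which is exactly why the benchmark is a floor, not a target.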

When a facility reports 45% and considers that the end of the conversation, something important has been lost.

The benchmark tells you whether you're in the game. It doesn't tell you whether you're playing well.

The more meaningful questions are: Is that 45% stable across quarters? How does it compare to peers at the same facility? What happens to it when you apply proper exclusion logic? And — critically — is it actually based on the right cohort?

The cohort problem nobody talks about

Here's where most private facility reporting quietly falls apart. ADR should only be calculated on eligible patients: those aged 50 and over, with adequate bowel preparation, undergoing colonoscopy for appropriate indications, with intact colons and no documented inflammatory bowel disease.

When I receive data exports from private facilities and apply that exclusion logic properly — removing IBD indications, complex colonoscopies, patients under 50, inadequate prep procedures — the ADR almost always moves. Sometimes up, sometimes down. Occasionally significantly.

I've seen ADR figures that looked impressive until we removed the IBD patients who were having surveillance colonoscopies for reasons completely unrelated to screening. I've seen figures that looked borderline until we properly excluded the inadequate-prep procedures that were inflating the denominator.

The number itself is only as meaningful as the cohort it was built on. And in most private facilities, the cohort has never been formally audited.
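A minimal sketch of what that cohort logic looks like when applied programmatically (the field names and indication labels here are illustrative assumptions, not a real export schema):

```python
from dataclasses import dataclass

@dataclass
class Procedure:
    age: int
    indication: str        # e.g. "screening", "surveillance", "ibd"
    prep_adequate: bool
    intact_colon: bool
    adenoma_found: bool    # from the linked pathology result

def eligible(p: Procedure) -> bool:
    """Inclusion/exclusion logic from the paragraph above."""
    return (p.age >= 50
            and p.prep_adequate
            and p.intact_colon
            and p.indication != "ibd")

def adr(procedures: list[Procedure]) -> float:
    """Adenoma detection rate over the eligible cohort only."""
    cohort = [p for p in procedures if eligible(p)]
    return sum(p.adenoma_found for p in cohort) / len(cohort) if cohort else 0.0
```

The point of writing it down this way is not the code itself: it is that the same logic, documented once in a methodology statement, can be applied identically every quarter.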

Documentation is the silent variable

There's another issue that compounds this — one that feels administrative but has real clinical governance implications. Pathology linkage.

For an adenoma to count in your ADR, the pathology result has to be linked back to the procedure. In a well-functioning system, the pathology accession ID in the endoscopy system connects the procedure to the histology result. When that field is missing, blank, or inconsistently recorded, adenomas that were detected and removed simply disappear from the calculation.

In the facilities I work with, pathology linkage completeness of 85–90% is common. That means 10–15% of procedures are contributing to the denominator but potentially not contributing their polyp findings to the numerator. The effect on ADR can be meaningful, particularly for clinicians who perform a high proportion of polypectomies.

This isn't a criticism of anyone. It's a documentation infrastructure issue that most private facilities have never had the external lens to identify.
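A simple linkage audit makes the issue visible. This sketch assumes each procedure row in the export carries an accession-ID field and a linked adenoma flag (the key names are illustrative):

```python
def linkage_rate(procedures: list[dict]) -> float:
    """Share of procedures with a pathology accession ID recorded."""
    return sum(1 for p in procedures if p.get("accession_id")) / len(procedures)

def adr_bounds(procedures: list[dict]) -> tuple[float, float]:
    """ADR under two extreme assumptions about unlinked procedures:
    the lower bound treats them all as adenoma-negative (what naive
    reporting silently does); the upper bound treats them all as
    adenoma-positive. A wide gap means linkage, not technique, is
    driving the reported number."""
    n = len(procedures)
    detected = sum(1 for p in procedures if p.get("adenoma_found"))
    unlinked = sum(1 for p in procedures if not p.get("accession_id"))
    return detected / n, (detected + unlinked) / n
```

Note that at 90% linkage the two bounds already sit ten ADR points apart — often larger than the gap between a clinician and the benchmark itself.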

What good ADR reporting actually looks like

Good ADR reporting for a private endoscopy facility means:

  • Consistent cohort definition — the same inclusion and exclusion logic applied every quarter, documented in a Methodology Statement that sits behind every report.
  • Pathology linkage audit — knowing your linkage rate and tracking it over time. A declining linkage rate is a data quality signal, not a formatting problem.
  • Trended comparison — one quarter of ADR data is a data point. Four quarters is a trend. The ability to see whether a clinician's ADR is stable, improving, or declining over time is what transforms data into governance.
  • Peer comparison in context — knowing that your ADR is 45% is more meaningful when you can see that your colleagues at the same facility range from 38% to 61%, and that yours has been stable for three consecutive quarters.
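The trended-comparison point is straightforward to operationalise. A least-squares slope across four quarters is one simple way to flag direction (a sketch; any threshold for acting on a slope would need local clinical agreement):

```python
def quarterly_slope(values: list[float]) -> float:
    """Least-squares slope of an indicator over equally spaced quarters.
    Negative values mean the indicator is declining quarter on quarter."""
    n = len(values)
    mean_x = (n - 1) / 2
    mean_y = sum(values) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(values))
    den = sum((x - mean_x) ** 2 for x in range(n))
    return num / den

# Four quarters of ADR: each number clears the benchmark,
# but the direction is unmistakable once you compute it.
trend = quarterly_slope([0.40, 0.38, 0.36, 0.34])  # -0.02 per quarter
```

Every value in that series is "green" on its own. Only the trend tells the governance story.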

The question worth asking at your next MAC meeting

Not "what is our ADR?" — but "how is our ADR calculated, and are we confident the cohort is right?"

If the answer is uncertain, that's not a failure. It's an opportunity. And it's a much more defensible position than presenting a number with confidence that turns out, on examination, to rest on shaky ground.

I work with private endoscopy facilities across Australia to produce structured, defensible quarterly reporting. If this resonates, I'd be glad to talk.

Get in touch →

Teyanna Gaeta  |  MPH, GradCertEpi

Endoscopy Clinical Indicator Reporting Specialist

teyanna@endoscopyreporting.com.au

Teyanna Gaeta

Article 03 · Clinical Feedback

Nobody Ever Told You Your ADR Was Dropping

In the public system, this problem is largely solved. In private endoscopy, it's barely being asked.

I work in clinical indicator reporting inside the public hospital system. In that environment, clinician-level colonoscopy performance is tracked, benchmarked, and fed back through structured governance frameworks. When an endoscopist's ADR drops, there is a process. Someone notices. A conversation happens. It is documented.

The infrastructure exists because the standard demands it — and because the consequences of not having it are well understood.

Now step into a private endoscopy facility. Ask who is tracking clinician-level ADR over time. Ask who receives a structured quarterly breakdown of their own performance against peers and national benchmarks. Ask what happens when a clinician's detection rate quietly drops 20 percentage points over the course of a year.

In most private facilities, the honest answer is: nobody knows. Because nobody is looking.

Two systems. One standard. A very different reality.

Both public and private endoscopy facilities in Australia are bound by the same national standard — the ACSQHC Colonoscopy Clinical Care Standard. It requires systematic collection of quality indicators, individual clinician feedback, and a governance process for identifying and responding to variation.

The difference is infrastructure. Public hospitals have invested in the systems, the roles, and the processes to make this happen. Many private facilities have not — not because they don't care about quality, but because this kind of structured reporting capability doesn't exist naturally in a smaller, leaner operation.

The data is there. It always is. What's missing is someone to turn it into something that actually gets used.

What the clinician deserves to know

Most gastroenterologists working in private practice are excellent clinicians. They are not looking to avoid scrutiny. They want to know how they're performing. They want to be told if something has shifted.

The problem isn't the clinician. It's that the feedback loop was never built.

When an endoscopist receives a structured quarterly report showing their ADR, their SSLDR, their withdrawal time — benchmarked against the unit and against national standards — they engage with it. They ask questions. They reflect on their technique. That reflection is where quality improvement actually happens.

Without the report, none of that happens. The gap stays invisible. Until it isn't.

When the gap becomes visible, it's usually too late

The moment most private facilities discover they have a governance gap is during accreditation. An accreditor asks to see the clinician-level reporting. They ask what happened when a particular result was flagged. They ask who reviewed it, when, and what the response was.

If the infrastructure wasn't there to capture that information — the answer is silence. And silence, in an accreditation context, is a finding.

The public system learned this lesson years ago. Private endoscopy is still catching up.

The question sitting in every private facility right now

If your ADR dropped 15 percentage points over the last year — would you know? Would your Medical Director know? Would your CEO? Would anyone have told the clinician?

The public system has built the infrastructure to answer yes. Private endoscopy deserves the same standard — and the same confidence when an accreditor walks through the door.

That's exactly what I build. If this resonates, I'm easy to find.

Get in touch →

Teyanna Gaeta  |  MPH, GradCertEpi

Endoscopy Clinical Indicator Reporting Specialist

teyanna@endoscopyreporting.com.au

Teyanna Gaeta

Article 04 · Withdrawal Time

6 Minutes Is the Minimum. Not the Goal.

Why the withdrawal time benchmark is being misused — and what it's costing patients.

A clinician with a median withdrawal time of 6 minutes and 30 seconds is meeting the benchmark. They are also more likely to be missing lesions than a clinician withdrawing at 10 or 12 minutes. The benchmark was designed as a minimum threshold — a floor below which practice is considered inadequate. It was never designed to define quality. It was never designed to be the goal.

But in most private endoscopy facilities across Australia, clearing 6 minutes is treated as the finish line. The report shows green. The meeting moves on.

That is a patient safety problem. And nobody is saying it out loud.

What the benchmark was actually designed to do

The 6-minute median withdrawal time benchmark exists because the evidence showed that very short withdrawal times — under 6 minutes — were associated with significantly lower adenoma detection rates. Endoscopists who rushed were missing things.

The benchmark was introduced to eliminate the bottom of the distribution. To catch the outliers. To set a minimum below which no clinician should fall. It was not introduced to tell you that 6 minutes and 1 second represents good colonoscopy practice.

When a facility reports "withdrawal time — met" and closes the conversation, they are confusing the absence of a red flag with the presence of quality. Those are not the same thing.

The question nobody is asking

If your facility reports median withdrawal time and it clears 6 minutes — what happens next? Does anyone look at the distribution? Does anyone ask how many procedures fell between 6 and 7 minutes — the borderline zone where technically compliant doesn't mean clinically confident? Does anyone compare withdrawal time against ADR for the same clinician to see whether the time taken is actually translating into detection?

In most private endoscopy facilities, the answer is no. Because the benchmark was met. Because the report shows green. Because there is no infrastructure to ask the deeper question.

Without the reporting infrastructure to go beyond the minimum, the benchmark stops being a threshold and becomes a target.

This is not about blame. It's about what good looks like.

The clinician withdrawing at 6 minutes and 30 seconds is not a bad clinician. They are meeting the standard they have been given. The facility reporting green on withdrawal time is not negligent. They are reporting what the framework asks them to report. The problem is the framework is incomplete.

Good clinical governance around withdrawal time asks more than "did we meet the minimum?" It asks:

  • What is the distribution of withdrawal times across our clinicians?
  • Which clinicians are consistently at the lower end of the acceptable range?
  • Is there a relationship between withdrawal time and detection rate for the same clinician?
  • Has any individual clinician's withdrawal time shifted significantly over the last four quarters?
  • What does the data tell us that the benchmark alone cannot?
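Several of those questions reduce to looking at the distribution rather than a single median. A sketch of the summary a structured report might start from (the key names are illustrative):

```python
import statistics

def withdrawal_summary(times_min: list[float]) -> dict:
    """Distribution view of withdrawal times in minutes: the median the
    benchmark checks, plus the shares the single median cannot show."""
    n = len(times_min)
    return {
        "median_min": statistics.median(times_min),
        "share_under_6": sum(t < 6 for t in times_min) / n,
        "share_borderline_6_to_7": sum(6 <= t < 7 for t in times_min) / n,
    }
```

A unit can clear the 6-minute median comfortably while a third of its procedures sit in the borderline zone; the median alone will never say so.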

These are the questions that structured reporting can answer. They are also the questions that most private endoscopy facilities currently cannot answer — because the infrastructure to ask them doesn't exist.

The minimum is not enough

Private endoscopy in Australia has spent years getting comfortable with meeting the minimum. Clearing the benchmarks. Showing green on the report.

The patients sitting in those procedure rooms deserve more than the minimum. They deserve a governance framework that asks not just whether the benchmark was met — but whether the care was genuinely good.

That starts with reporting built to ask harder questions than "did we pass?"

If your facility is ready to go beyond the minimum, I'm easy to find.

Get in touch →

Teyanna Gaeta  |  MPH, GradCertEpi

Endoscopy Clinical Indicator Reporting Specialist

teyanna@endoscopyreporting.com.au

Pilot Engagement

Start with one quarter. No commitment required.

The pilot is a fixed-scope, retrospective engagement using one quarter of your existing site-level data exports. It's the lowest-risk way to see the framework in action before making an ongoing commitment.

Director/CEO Governance Report — ADR, caecal intubation rate, withdrawal time, SSLDR, complications, and operational metrics
Clinician-Level Variance Monitoring Appendix with individual indicator breakdown and de-identified peer comparison
Governance Commentary & Methodology Statement with inclusion/exclusion logic, data limitations, and benchmark references
$5,500
+ GST · Fixed scope · One retrospective quarter
The pilot fee is credited toward your first quarter of ongoing engagement if you proceed. No ongoing commitment is required following the pilot.
Enquire about the pilot →

Contact

Let's talk about your facility.

Whether you're ready to start a pilot or want to understand which tier best suits your governance needs, I'm happy to have a straightforward conversation.

ABN
59 604 065 576
Location
Queensland, Australia · Servicing nationally