
Document approval workflows for compliance

What makes an approval flow defensible under ISO 9001, 21 CFR Part 11, and HIPAA — and why sequential, named, audited beats parallel, group-based, and inferred every time.

18 min read · 4,200 words

TL;DR

  • Email approvals fail at audit — they leave the evidence distributed across inboxes, Teams threads, and memory. Reconstructing 'who approved what, in which role, against which version' takes hours, and sometimes fails outright.
  • Sequential execution is not a preference — it's a compliance requirement. Parallel and state-based workflows are harder to defend when an auditor asks for the approval sequence.
  • Named Microsoft Entra identities per step, not groups or distribution lists. The audit log must attribute every approval to a specific person acting in a specific role.
  • Automatic check-out during the flow guarantees the version approved is the version reviewed — no edits slip in between steps.
  • Fixed approvers encode organizational policy (Quality on every SOP; Medical Director on every clinical procedure) as enforceable workflow.
01

Chapter one

Why ad-hoc approvals fail audits

The default way regulated organizations approve documents — email, Teams, shared inboxes — produces audit evidence that can't be retrieved cleanly. At the worst possible moment, it can't be retrieved at all.

Every compliance audit eventually asks a specific question. Who approved this document, in what role, against which version, on what date, with what comments? In regulated contexts — ISO 9001 surveillance, FDA Part 11 inspections, HIPAA assessments, SOX control testing — an imprecise answer is a finding. The auditor doesn’t care how much work went into the document or how careful the team was. They care whether the evidence exists.

In most organizations that haven’t moved past ad-hoc approval, the honest answer is: the evidence probably exists, somewhere, but finding it takes an afternoon and might not be complete. The approval happened in an email thread. The signed PDF is attached. The Legal review was a Teams DM. The Quality sign-off was a verbal “go ahead” that nobody documented. Version 2 was approved; version 3 has been living in the library for a week; nobody is sure whether version 3 got the same sign-offs or not.

1
FRAGMENTATION

Evidence lives in five places

Email, Teams, SharePoint comments, the document itself, people's memory. Reconstructing the sequence means reconciling five partial stories.

2
DRIFT

The reviewed version ≠ the published one

Someone edits the document after approval but before publication. The PDF distributed to users doesn't match what the approvers actually read.

3
ATTRIBUTION

"Approved by the Quality team"

Not a person, not a role, not a timestamped event. A regulator reading this in an audit log recognizes it as a specific failure: attribution to a group is not attribution to anyone.

4
SKIPS

Mandatory signatures skipped

Quality was supposed to sign every SOP. Nobody remembered on this one. The audit discovers 8% of historical approvals missing a mandatory signatory.

The cost of ad-hoc approval isn't the time it takes to email documents around. It's the week of evidence-gathering when the auditor arrives — and the finding when the evidence turns out to be incomplete.

The move from ad-hoc to structured approval isn’t about efficiency; it’s about producing the evidence as a byproduct of the workflow, so the evidence is always there when asked for.

02

Chapter two

What makes an approval workflow defensible

Five structural properties separate an audit-proof workflow from a "we approved it somehow" process.

The sequential approval workflow at the center of an active document lifecycle is defensible not because it’s sophisticated — it isn’t — but because it has five structural properties that ad-hoc approval lacks. Each property prevents a specific failure mode that auditors know how to look for.

1
ORDER

Sequential execution

Steps execute one at a time, in order. Approver 1 first, then 2, then 3. Not parallel, not state-based.

2
IDENTITY

Named Entra per step

Each step points to one Microsoft Entra identity in one role. Not a group, not a distribution list, not a pool.

3
LOCK

Automatic check-out

Document locked for edits during the flow. The version approved is the version reviewed.

4
MANDATORY

Fixed approvers

Required signatures encoded at document-type level. Authors can't remove them. Policy becomes workflow.

5
EVIDENCE

Complete audit trail

Every step writes an event to the log: who, role, version, timestamp, comment. Append-only by design.

None of these five properties are optional in regulated contexts. An approval workflow that’s “sequential except when someone’s on PTO” isn’t sequential. Attribution to a “group approver” isn’t attribution. Check-out that can be overridden by admins isn’t check-out. Each of them has to hold unconditionally, or the audit evidence they’re supposed to produce isn’t defensible.
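The first two properties can be made concrete with a minimal sketch. The Python below is illustrative only, not the product's implementation, and every name in it (`ApprovalStep`, `Decision`, `run_flow`) is hypothetical: each step carries exactly one named identity acting in one role, steps execute strictly in order, and a single rejection halts everything that follows.

```python
from dataclasses import dataclass

# Hypothetical sketch — class and function names are illustrative,
# not the product's actual API.

@dataclass
class ApprovalStep:
    approver: str   # one named identity, never a group or shared inbox
    role: str       # the capacity in which this person acts

@dataclass
class Decision:
    approved: bool
    comment: str = ""

def run_flow(steps, decide):
    """Execute steps strictly in order; halt on the first rejection."""
    log = []
    for i, step in enumerate(steps, start=1):
        decision = decide(step)   # blocks until this specific approver acts
        log.append((i, step.approver, step.role,
                    "approved" if decision.approved else "rejected",
                    decision.comment))
        if not decision.approved:
            return False, log     # flow halts; later steps never execute
    return True, log
```

Note that a rejected flow still returns its partial log: even a halted flow leaves complete evidence of every step that did execute.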

03

Chapter three

Sequential vs parallel vs state-based

Why "whatever completes first" and "any approver from the pool" are tempting to design, and why compliance auditors know to push back on them.

Workflow products often advertise “flexible routing” — parallel branches, state-based transitions, round-robin assignment, conditional skip rules. These are genuinely useful in operational workflows like purchase orders or IT tickets. In document approval for regulated documents, they’re a trap.

Pattern · What it does · Audit posture

  • Sequential · Approvers 1, 2, 3 review in defined order; on rejection, the flow halts; each step is a discrete event. Audit posture: defensible.
  • Parallel · Multiple approvers are notified simultaneously and all must approve; the order of completion is whatever it happens to be. Audit posture: defensible with care.
  • Any-of-N · "One Quality manager approves": a pool of eligible approvers, first to act wins; reduces bottlenecks. Audit posture: risky.
  • State-based · The document transitions through states (Draft → Review → Approved), with multiple paths between states; the audit log is a state-transition log. Audit posture: hard to defend.
  • Conditional skip · "Skip Legal review if the document value is under €10k": logic that omits steps under specific conditions. Audit posture: hard to defend.

The reason sequential wins in regulated document management isn’t that it’s better. It’s that it answers the auditor’s question in the simplest possible way. When the auditor asks “who approved this, in what order?”, the answer is a list. With parallel approval, the answer is still a list but the order requires explanation. With any-of-N, the answer is “one of these four people, whoever got to it first,” which invites follow-up questions about accountability. With state-based, the answer requires reconstructing the state machine.

Our approval engine supports sequential flows, and only sequential flows, by design. It does not support parallel, any-of-N, state-based, or conditional-skip patterns. This is a deliberate constraint: one of the ways the product stays defensible rather than flexible.

"Flexible" is the antonym of "defensible" in the regulated-documents context. Every degree of flexibility is another piece of evidence the auditor has to reconstruct.

04

Chapter four

Fixed approvers — policy as workflow

Every organization has "non-optional" signatures. Making them structural instead of social is the single highest-value approval-workflow decision.

Every regulated organization has roles whose sign-off on certain document types is non-negotiable. The Quality Manager on every SOP. The Medical Director on every clinical procedure. The Legal Counsel on every customer-facing policy. The Chief Privacy Officer on every data-handling procedure. In a defensible program, these signatures always appear. In a passive program, they usually do — but the word “usually” is where audit findings live.

Fixed approvers encode the non-optional signatures at the document-type level. The Quality Manager is automatically a pre-flow approver on every SOP. The Medical Director is automatically a post-flow approver on every clinical procedure. Authors can add variable middle approvers but cannot remove the fixed ones. Organizational policy becomes enforceable workflow, not social expectation.

1
PRE-FLOW

The required first step

Document authors often aren't qualified to judge whether content meets organizational standards. A pre-flow fixed approver (typically Quality) gates submission — they review before anyone else sees the document.

2
POST-FLOW

The required last signature

Senior accountability for publication sits with a named role — Medical Director, Director of Operations, Chief Compliance Officer. Their sign-off is the flow's terminal event, always present.

3
DOCUMENT-TYPE

Configured once, enforced forever

Fixed approvers are attached to document types at implementation. Every new approval flow of that type picks them up automatically. Authors don't choose; the configuration chooses.

4
AUDITABLE

100% coverage, provably

After implementation, the claim "every SOP has Quality sign-off" becomes true by construction across the library. Auditors can sample any document and the signature will be there.

The practical effect is that organizational policy (“Quality signs every SOP”) is enforced by structure rather than by remembering. New authors join the company and their SOPs still get Quality review, because the system requires it. A team reorganization changes who holds the Quality role; the configuration gets updated once, and every subsequent flow picks up the new person. The approval policy survives turnover, delegation, and time.
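The composition rule behind fixed approvers is small enough to sketch. The following Python is a hypothetical illustration — the configuration shape, identities, and function name are assumptions, not the product's API: fixed pre- and post-flow approvers come from the document-type configuration, the author supplies only the variable middle steps, and no code path removes the fixed ones.

```python
# Hypothetical sketch: document-type configuration injects fixed approvers;
# the author chooses only the variable middle steps. All names illustrative.

DOC_TYPE_CONFIG = {
    "SOP": {
        "pre_flow":  [("quality.manager@example.com", "Quality Manager")],
        "post_flow": [("medical.director@example.com", "Medical Director")],
    },
}

def build_flow(doc_type, author_chosen_steps):
    """Compose the final step list; fixed approvers cannot be removed."""
    cfg = DOC_TYPE_CONFIG[doc_type]
    return cfg["pre_flow"] + list(author_chosen_steps) + cfg["post_flow"]
```

Because the fixed steps are concatenated by the system rather than selected by the author, the "Quality signs every SOP" policy holds for every flow of that type, regardless of who launches it.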

05

Chapter five

E-signature — when cryptographic binding matters

Not every approval needs a signature. For the ones that do, PAdES provides binding that stands up in court, in inspections, and in disputes.

Standard approval captures “who clicked Approve, when, against which version.” That’s enough evidence for most regulated document workflows. For a specific subset — contracts, regulatory submissions, signed policies, documents where non-repudiation is the point — approval isn’t enough. The signature needs to be cryptographically bound to the PDF so that any subsequent modification invalidates it, and so that the signer can’t credibly claim “that wasn’t me” or “that wasn’t the document I signed.”

The DocuSign integration adds PAdES signatures at specific steps of the approval flow. PAdES comes in two levels:

PAdES simple

Standard authenticated signature

Signer authenticates via email + click-through. DocuSign binds the signature to the PDF cryptographically. Suitable for internal contracts, approved-policy signatures, most routine regulatory documents.

PAdES advanced

Identity-verified signature

DocuSign performs identity verification — SMS code, government-ID check, knowledge-based auth. Used when the signature's legal weight needs to be maximally defensible: external contracts, high-value agreements, specific regulatory submissions.

Native

Signing is an approval-step type, not a separate workflow. One flow, one evidence trail, one audit log — not two.

Bound

The signed PDF is cryptographically bound to its content. Any subsequent modification invalidates the signature. The math is the defense.

In-tenant

The signed PDF lands back in SharePoint as the authoritative version. DocuSign is the signing ceremony, not the archive.

What’s not supported: CAdES signatures, qualified e-signatures, and AGID-specific signing modes. These carry different legal and technical requirements (qualified certificates from accredited trust service providers and, in some cases, qualified signature-creation devices), and they require integrations beyond DocuSign PAdES. For most regulated document workflows the distinction doesn’t matter; for customers with specific Italian AGID requirements or eIDAS qualified-signature obligations, a separate tool remains necessary.

The compliance lesson: only pay for cryptographic binding when the document's legal weight genuinely requires it. For most approval flows, the standard audit log is the evidence the regulation asks for.

06

Chapter six

The audit log as approval evidence

Every approval step writes to the log automatically. What the log captures, where it lives, and why immutability matters.

The audit log is the evidence layer that makes all the preceding structure actually defensible. Every approval step — the approval-flow launch, each individual approver’s decision, the final publication event, any rejection — writes an entry. The entries capture eight properties that together answer every audit question about the approval:

WHO

Named Entra identity

Specific person, not a group or shared account.

ROLE

Capacity of the action

Quality Manager, Medical Director, Legal Counsel.

EVENT

What happened

Approved, rejected, signed, launched, published.

VERSION

Which version applied

The exact minor or major version at the moment of the event.

WHEN

Precise timestamp

System clock, not user-reported.

COMMENT

Reviewer's note

What the approver said about the decision, if anything.

CRYPTO

Signature payload

For DocuSign-signed steps: the PAdES signature binding details.

SCOPE

Document identity

Protocol code + document type anchors the event to a specific artifact.

The log is append-only by design. No one — not admins, not the product owner, not Microsoft, not users — can edit or delete entries. This is structural, not a permission setting. The integrity of the approval evidence depends on architecture, not trust.
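What "append-only by design" means structurally can be sketched in a few lines of Python. This is illustrative only — the field names mirror the eight captured properties but are assumptions, not the product's schema: each entry is immutable once created, and the log type exposes an append and a read operation, nothing else.

```python
from dataclasses import dataclass

# Illustrative sketch of the eight captured properties.
# Field names are assumptions, not the product's schema.

@dataclass(frozen=True)   # frozen: an entry cannot be mutated after creation
class AuditEvent:
    who: str        # named Entra identity
    role: str       # capacity of the action
    event: str      # approved / rejected / signed / launched / published
    version: str    # exact version at the moment of the event
    when: str       # system timestamp, ISO 8601
    comment: str    # reviewer's note, possibly empty
    signature: str  # PAdES payload for signed steps, else empty
    scope: str      # protocol code + document type

class AuditLog:
    """Append-only by construction: no update or delete exists to misuse."""
    def __init__(self):
        self._entries = []

    def append(self, event: AuditEvent):
        self._entries.append(event)

    def entries(self):
        return tuple(self._entries)   # read-only view
```

The point of the sketch is the shape, not the code: integrity comes from the absence of mutation operations, not from a permission that could be granted.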

During an audit, the Quality Manager opens the document in the library, clicks the three-dot menu, selects Audit, and filters by event type. The evidence for a specific approval is retrieved in thirty seconds. For bulk evidence (“all SOP approvals in Q3”), the Power BI dashboard aggregates the log data. Either way, the evidence is a query, not a week of archaeology.

07

Chapter seven

A complex approval, walked through

Five-step approval of a clinical SOP revision, with a DocuSign signature at the end and one rejection mid-flow.

START

Day 1 · 10:30 AM

Flow launched

The Quality Coordinator submits SOP-CLIN-0047 v4.1 for approval. She picks her department lead (step 2) and the Medical Director (step 4). Quality (step 1, pre-flow fixed) and Regulatory Affairs (step 3, fixed) populate automatically. Step 5 is configured as DocuSign-signed (PAdES simple) because regulatory submissions require cryptographic binding. The document is checked out automatically.

1

Day 1 · 2:45 PM · Quality Manager

Approves with comment

Quality Manager reviews, notes the updated calibration reference, approves. Event logged: Entra ID, "Quality Manager" role, v4.1, timestamp, comment "Calibration update matches the updated standard." Step 2 (department lead) starts automatically.

Day 2 · 9:10 AM · Department lead

Rejects with comment

Department lead identifies a missing step in the procedure: "Section 4.3 omits the post-calibration verification check. Please add and resubmit." Event logged. The flow halts and the document returns to editing. The Coordinator revises to v4.2 and resubmits; all five steps re-execute from step 1. (Quality reapproves v4.2 in 30 minutes, and the flow proceeds to the department lead.)

2

Day 2 · 4:22 PM · Department lead

Approves revised version

Reviews v4.2 (with the added verification step), approves. Event logged: role, v4.2, timestamp. Step 3 (Regulatory Affairs, fixed) starts.

3

Day 3 · 11:15 AM · Regulatory Affairs

Approves after regulatory check

RA confirms the procedure aligns with current regulation, approves. Event logged. Step 4 (Medical Director) starts.

4

Day 4 · 8:40 AM · Medical Director

Clinical oversight approval

The Medical Director reviews the clinical substance, approves. Step 5 (DocuSign-signed final approval) starts.

Day 4 · 3:05 PM · Chief Compliance Officer · PAdES signed · published as v5.0

Signed, bound, published

CCO routed through DocuSign. Opens the signing ceremony, authenticates via email, signs. DocuSign binds the signature to the PDF cryptographically and returns it to SharePoint. System issues v5.0, publishes the signed PDF to the public area, sends distribution email to the clinical team with read-receipts enabled.

Total elapsed time from initial submission to published v5.0: 3 days 4 hours 35 minutes. Total audit-log events captured: 10 (2 submissions, 5 approvals, 1 rejection, 1 signature, 1 publication). Six months later, when a regulator audits this SOP, the Quality Manager retrieves all 10 events in seconds. No reconstruction.

08

Chapter eight

Compliance mapping — what each regime actually asks

Specific clauses, specific expectations. Which capabilities of an active approval workflow satisfy which compliance requirements.

Regime · Clause · What it expects · Which capability satisfies it

  • ISO 9001, clauses 7.5.2 and 7.5.3: documented information must be reviewed and approved for suitability and adequacy before use, and changes must go through controlled review. Satisfied by sequential approval, the audit log, and version binding on approval events.
  • FDA 21 CFR Part 11, §11.10(e): a secure, computer-generated, time-stamped audit trail that records actions that create, modify, or delete electronic records. Satisfied by the append-only, Entra-attributed, system-timestamped audit log. Validation posture remains with the customer's QA team.
  • FDA 21 CFR Part 11, §11.50 and §11.70: signed electronic records must contain the printed name of the signer, the date and time, and the meaning of the signature, and signatures cannot be excised, copied, or transferred. Satisfied by PAdES signing via DocuSign; cryptographic binding prevents excision or transfer.
  • HIPAA, §164.312(b): implement hardware, software, and/or procedural mechanisms that record and examine activity in information systems containing PHI. The audit log provides the document-management portion of that mechanism; HIPAA program certification sits with the customer.
  • SOX, §404: management and the external auditor must report on the effectiveness of internal control over financial reporting, supported by documented evidence. The audit log and versioning produce the "controls operated as designed" evidence; SOX compliance is enterprise-level.

The thread running through every clause: the evidence regulators ask for is specific and structural. “The approval happened” isn’t sufficient. “Approved by someone with the right role, against the specific version, at a captured moment, with a documented decision” is. A workflow that produces that evidence by default — not as a retrieval effort — is defensible. One that requires reconstruction isn’t.

09

Chapter nine

Common anti-patterns to avoid

Four approval-workflow patterns that look efficient, pass internal review, and fall apart under an auditor's follow-up questions.

ANTI-PATTERN

Group approver ("QA team approves")

Looks efficient: anyone on the QA team can act. Fails audit: attribution is to the group, not a person. Regulators want "who specifically approved, acting in what capacity."

ANTI-PATTERN

Shared-account approval

"quality.approvals@company.com" is a shared inbox. Multiple people use it. Every approval event attributes to the shared account, not a person. Non-defensible under Part 11.

ANTI-PATTERN

"Approval by silence"

Document gets distributed with "please respond if you have objections; otherwise we'll treat it as approved." No affirmative approval event. No signature. Audit retrieval is impossible.

ANTI-PATTERN

Edit during review

Approver notices a typo and fixes it during review. No check-out enforced. The version approved is no longer the version reviewed. "What was approved?" becomes ambiguous.

The common root cause: workflows designed for operational efficiency rather than for producing audit evidence. In regulated documents, the evidence is the output. Any “efficiency” that compromises evidence integrity is a false economy, because it reappears as audit findings and remediation time.

10

Chapter ten

Implementation — from email to workflow

The transition from ad-hoc approval to a structured workflow, in four phases.

Phase 1 · Week 1–2

Map document types

Inventory every document type. For each: which roles must approve, in what order, at which stage (pre-flow / post-flow).

Phase 2 · Week 3–5

Configure flows

Set up fixed approvers per document type. Decide which steps require DocuSign signing. Configure distribution lists for post-publication notification.

Phase 3 · Week 6–8

Pilot on one doc type

Pick SOPs or the highest-volume type. Run approvals through the new workflow. Train approvers (2 hours) and authors (1 hour). Observe, adjust.

Phase 4 · Week 9+

Roll out + measure

Expand to all document types. Turn on the Power BI dashboard. Track cycle time, rejection rate, cadence adherence. Iterate.

A practical note on rejections. In ad-hoc approval, rejections are rare because the document often gets “approved” after informal revision. In structured workflows, rejections become more common — and that’s good. The rejection events capture why documents aren’t yet approvable, which makes the process visible. Cycle times on first submissions go up slightly. Rework drops. Quality of the final approved document goes up. The dashboard metrics that matter are rejection rate (which should settle around 10–15% on a mature workflow) and approval cycle time (which stabilizes once authors calibrate to the approvers’ expectations).
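The two dashboard metrics are simple to compute once the audit log exists. Here is a hypothetical Python sketch — the event representation and function names are assumptions, not the product's Power BI logic: rejection rate is the share of reviewer decisions that were rejections, and cycle time is the elapsed time from launch to publication.

```python
from datetime import datetime

# Hypothetical sketch: event names and timestamp format are illustrative.

def rejection_rate(events):
    """Share of reviewer decisions (approved/rejected) that were rejections."""
    decisions = [e for e in events if e in ("approved", "rejected")]
    return decisions.count("rejected") / len(decisions) if decisions else 0.0

def cycle_time(launched_at, published_at, fmt="%Y-%m-%dT%H:%M"):
    """Elapsed time from flow launch to publication."""
    return datetime.strptime(published_at, fmt) - datetime.strptime(launched_at, fmt)

# The chapter-seven walkthrough as an event stream: two launches (submission
# and resubmission), one rejection among six reviewer decisions.
events = ["launched", "approved", "rejected", "launched", "approved",
          "approved", "approved", "approved", "signed", "published"]
```

Run against the walkthrough's event stream, `rejection_rate(events)` comes out to 1/6 (about 17%), and `cycle_time("2024-03-01T10:30", "2024-03-04T15:05")` is 3 days, 4 hours, 35 minutes.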

The shift from email approval to sequential workflow is cultural as much as technical. The workflow is the easy part; teaching approvers that "LGTM" in an email isn't approval anymore takes a quarter.


If your organization’s document approvals currently happen in email, Teams, or shared inboxes, the risk isn’t theoretical — it’s a known failure mode that regulators look for. Moving to a sequential workflow with named approvers, automatic check-out, and complete audit log is not a large project. It’s a focused 8–10 week rollout, and the payoff is concrete: audit evidence produced as a byproduct of the approvals themselves, retrievable in seconds, complete by construction.

A 30-minute conversation with our team is usually enough to map your current approval practice against the structural properties this guide describes, and to identify the specific failure modes you’re exposed to right now.

See this guide's principles applied to your own documents

Thirty minutes. No cost. No obligation. We'll walk through your current document-management practice and map it against what the guide describes.