Intake-to-Retirement Framework | BPMN-Governed, DMN-Driven, Agent-Ready
Role-Based Reference Architecture for Financial Services Institutions
Vision, problem statement, and strategic context
Before/after, cycle time compression, stakes
Phase 0 through Phase 6 with swim lanes
15 DMN tables, pathways, risk classification
Asset hub, data model, reuse timeline
13 frameworks, evidence chain, deep dives
Three-horizon model and success metrics
Call to action
Software acquisition managed through reactive, disconnected workflows creating duplication and compliance exposure.
Industry average: 90–120 days from intake to deployment. Rework consumes 60%+ of effort due to poor upfront capture.
Inability to demonstrate disciplined, documented, repeatable technology decisions to regulators (OCC, FINRA, SEC).
No centralized registry means duplicate purchases, untracked AI deployments, and ungoverned vendor relationships.
Every material routing and approval decision is governed by explicit, auditable rules (DMN). Agents and AI accelerate intake, enrichment, classification, and generation—but never make unmonitored, unexplained decisions.
Front-loads information harvesting to compress downstream cycle times by 60-75%.
Identifies every opportunity for AI-driven acceleration. ~60% average automation across phases.
No control gaps between TPRM, AI governance, legal, risk, compliance, security, and regulatory obligations.
Transforming governance from bottleneck to accelerator
End-to-end governance from idea inception to retirement
| Phase | Name | Std Risk | High Risk | Automation | DMN Tables |
|---|---|---|---|---|---|
| P0 | Software Asset Intelligence | Continuous | Continuous | 90% | — |
| P1 | Intake & Risk Classification | 1-2 days | 1-2 days | 75% | DMN-09, DMN-15 |
| P2 | AI Routing & Scoring | Real-time | Real-time | 95% | DMN-01, DMN-13 |
| P3 | Product Mgmt Review | 3-5 days | 5-7 days | 60% | DMN-02, DMN-03 |
| P4 | Portfolio Governance | 3-5 days | 5-10 days | 40% | DMN-04, DMN-05 |
| P5A | PDLC Build Pathway | 10-15 days | 15-25 days | 55% | DMN-10, DMN-11, DMN-12 |
| P5B | TPRM Buy Pathway | 5-7 days | 7-10 days | 50% | DMN-06, DMN-07, DMN-08, DMN-14 |
| P6 | Observability & Audit | Ongoing | Ongoing | 65% | DMN-12 |
Single source of truth for all institutional software assets, vendor relationships, and license inventory.
Automated identification of existing capabilities before any new request proceeds, preventing duplicate spend.
SSO and expense management integration for continuous detection of ungoverned software.
DMN-09 (AI Risk Tier) — PRIORITY hit policy. Inputs: decision materiality, credit/capital impact, model complexity, data sensitivity. Output: Tier 1 / Tier 2 / Tier 3.
DMN-15 (Capability Reuse) — UNIQUE hit policy. Inputs: registry match, functional fit, license availability. Output: Reuse-Redirect / Proceed / Evaluate.
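The PRIORITY hit policy above can be sketched as an ordered rule list where the first matching rule wins. This is a minimal illustration only — the predicates, thresholds, and field names are assumptions, not the institution's actual DMN-09 rules:

```python
# Illustrative PRIORITY hit policy: rules are evaluated in declared
# priority order and the first match determines the output tier.
from dataclasses import dataclass

@dataclass
class AiRiskInputs:
    decision_materiality: str    # "material" | "moderate" | "informational"
    credit_capital_impact: bool  # touches credit, capital, or pricing?
    model_complexity: str        # "high" | "medium" | "low"
    data_sensitivity: str        # "restricted" | "confidential" | "public"

# Highest-priority rule first; the final catch-all guarantees a result.
RULES = [
    (lambda i: i.credit_capital_impact or i.decision_materiality == "material", "Tier 1"),
    (lambda i: i.model_complexity == "high" or i.data_sensitivity == "restricted", "Tier 2"),
    (lambda i: True, "Tier 3"),
]

def classify_ai_risk(inputs: AiRiskInputs) -> str:
    for predicate, tier in RULES:
        if predicate(inputs):
            return tier
    raise ValueError("no rule matched")  # unreachable given the catch-all
```

Because PRIORITY resolves overlaps by rule order, a request that is both material and high-complexity lands in Tier 1, never Tier 2.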
1-5 business days. Internal use, no AI, pre-approved vendor, low risk.
8-26 weeks. Net-new internal development where IP or differentiation justifies the investment.
6-16 weeks. Commercial solution, vendor not contracted, standard procurement.
10-20 weeks. Buy commercial core, build proprietary extension.
DMN-04 (Go/No-Go): composite score, strategic alignment, regulatory risk, budget, resources → Priority GO / Standard GO / Defer / NO-GO
DMN-05 (Buy vs Build): market solution, 5-yr TCO, IP differentiation, complexity, strategic fit → Buy / Build / Hybrid / Defer
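The buy-vs-build decision uses a UNIQUE hit policy (per the DMN catalog), meaning exactly one rule may match any input combination. A minimal sketch, reduced to two of the inputs for brevity (TCO, complexity, and strategic fit omitted) and using assumed rule conditions:

```python
# Illustrative UNIQUE hit policy: all rules are evaluated and exactly one
# must match — overlapping or missing matches indicate a modelling error.
def buy_vs_build(market_solution: bool, ip_differentiator: bool) -> str:
    rules = [
        (market_solution and not ip_differentiator, "Buy"),
        (not market_solution and ip_differentiator, "Build"),
        (market_solution and ip_differentiator, "Hybrid"),
        (not market_solution and not ip_differentiator, "Defer"),
    ]
    matches = [outcome for hit, outcome in rules if hit]
    if len(matches) != 1:
        raise ValueError(f"UNIQUE hit policy violated: {len(matches)} rules matched")
    return matches[0]
```

The runtime check makes rule overlap a hard failure rather than a silent wrong answer — the property that makes UNIQUE tables attractive for auditable gateways.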
10-step development lifecycle: Risk Evaluation → High-Level Design → Observability Design → Proof of Concept → Technology & Risk Gate → AI Governance Review → Development → UAT → Go-to-Market → Deployment
6-stage TPRM lifecycle aligned to OCC 2023-17: Vendor Risk Tiering → RFP Assembly → Sourcing → RAE & Selection → Contracting & Funding → Vendor Enablement & Monitoring
WORM-compliant storage with full decision provenance chain for every DMN invocation.
Drift detection, bias monitoring, and hallucination detection for Tier 1 and Tier 2 models per SR 11-7.
DMN-14 drives monitoring frequency: High=monthly, Limited=quarterly, Minimal=semi-annually.
| Swim Lane | candidateGroups | Primary Phases | Key Functions |
|---|---|---|---|
| Governance Board | sla-governance-board | P3-P4 | Policy decisions, final approvals, escalations |
| Business Owner | business-owner | P1-P4 | Requirements, sponsorship, UAT sign-off |
| IT Architecture | it-architecture | P5A | Technical design, integration assessment |
| Procurement | procurement | P5B | Vendor selection, RFP, contract management |
| Legal & Compliance | legal-compliance | P1-P6 | Regulatory review, compliance gates |
| Information Security | information-security | P3, P5 | Security assessment, pen testing |
| Vendor Management | vendor-management | P5B, P6 | Vendor onboarding, SLA tracking |
Each swim lane maps to a Camunda 7 candidateGroup for executable BPMN. All tasks within a lane are assigned to that lane's role(s), ensuring clear accountability.
15 DMN decision tables replacing informal gateway reviews with deterministic, auditable logic
Material decisions on credit, capital, or pricing. Full SR 11-7 validation. Highest monitoring cadence.
Moderate business impact. Standard validation with periodic review.
Non-material, informational use. Simplified validation, lower monitoring frequency.
7 agents operating under strict governance: deterministic outputs, knowledge-base-bound, DMN-governed, full provenance logging.
| Agent | Phase(s) | Knowledge Base | DMN Governed By | Escalation Trigger |
|---|---|---|---|---|
| Intake Bot | P1 | Software Registry, Regulatory KB | DMN-09, DMN-15 | Ambiguous capability |
| Routing Engine | P2 | Registry, Historical outcomes | DMN-01, DMN-13 | Confidence <85% |
| Compliance Agent | P1, P3 | Regulatory KB, Data Gov KB | DMN-02 | Novel regulatory scenario |
| Legal Clause Assembly | P5B | Legal Knowledge Graph | Graph rules | Conflicting clauses |
| Contract Redline | P5B | Legal KG, Precedents | Standards rules | Non-standard deviation |
| Knowledge Staging | All | All KBs (write) | Validation rules | Schema failure |
| Monitoring Agent | P6 | Baselines, Thresholds | DMN-12, DMN-14 | Alert breach |
| ID | Decision Name | Phase | Hit Policy | Key Inputs | Key Outputs |
|---|---|---|---|---|---|
| DMN-01 | AI Routing & Pathway | P2 | UNIQUE | Channel, AI tier, composite score, vendor status | Pathway, fast-track flag, required reviews |
| DMN-02 | Information Completeness | P3 | ANY | Value quantified, data class, integration list, reg flags | Proceed / Return with gap |
| DMN-03 | Duplicate, Merge, Reuse | P3 | UNIQUE | Registry match score, backlog match, match type | Close / Reuse / Merge / Proceed |
| DMN-04 | Go/No-Go Viability | P4 | PRIORITY | Composite score, alignment, reg risk, budget | Priority GO / Standard GO / Defer / NO-GO |
| DMN-05 | Buy vs Build | P4 | UNIQUE | Market solution, 5yr TCO, IP, complexity | Buy / Build / Hybrid / Defer |
| DMN-06 | Vendor Risk Tier | P5B | PRIORITY | Critical activity, data sensitivity, concentration | Tier 1-4, TPRM intensity |
| DMN-07 | Vendor Selection RAE | P5B | UNIQUE | Vendor count, pilot outcome, RAE findings | Proceed / Conditional / Restart |
| DMN-08 | Funding Confirmation | P5B | UNIQUE | Finance engagement, budget, FP&A completion | Funded / Deferred / Escalate |
| DMN-09 | AI Risk Tier | P1 | PRIORITY | Decision materiality, credit impact, complexity | Tier 1 / Tier 2 / Tier 3 |
| DMN-10 | PoC Gate | P5A | UNIQUE | PoC score, Architecture sign-off, CyberSec | Proceed / Refine / Reject |
| DMN-11 | Tech & Risk Eval | P5A | ANY | Completeness, AI Gov, Architecture, CyberSec | Proceed to UAT / Hold / Return |
| DMN-12 | Observability Tier | P5A | PRIORITY | AI risk tier, reg class, decision materiality | Log schema, retention, monitoring cadence |
| DMN-13 | Fast-Track Eligibility | P2 | UNIQUE | Internal, AI flag, production, sensitivity | Fast-track / Standard |
| DMN-14 | TPRM Monitoring Freq | P5B+ | UNIQUE | Risk tier, contract value, service criticality | Monitoring cadence per activity |
| DMN-15 | Capability Reuse | P1 | UNIQUE | Registry match, functional fit, license avail | Reuse-Redirect / Proceed / Evaluate |
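The DMN-13 fast-track check from the catalog above is among the simplest tables: a single conjunction of gating conditions. The specific conjunction below is an assumption, chosen to be consistent with the fast-track criteria stated earlier (internal use, no AI, low risk):

```python
# Illustrative DMN-13 fast-track eligibility check (conditions assumed).
def fast_track_eligibility(internal_use: bool, uses_ai: bool,
                           production_facing: bool, data_sensitivity: str) -> str:
    low_sensitivity = data_sensitivity in ("public", "internal")
    if internal_use and not uses_ai and not production_facing and low_sensitivity:
        return "Fast-track"
    return "Standard"
```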
Centralized asset intelligence preventing duplicate spend and enabling reuse
Single source of truth integrating CMDB, SAM, Vendor Contract Repository with nightly reconciliation.
Functional capability mapping enabling DMN-15 reuse gate to query before any new request proceeds.
Real-time license utilization tracking, renewal forecasting, and compliance status monitoring.
SSO and expense management integration for continuous identification of ungoverned software deployments.
Every request must pass through the capability reuse gate (DMN-15) before proceeding to design or build. The registry query evaluates functional fit, license availability, and prior retirement decisions to determine the optimal path.
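The reuse-gate query described above can be sketched as a registry lookup that maps the best functional match to one of DMN-15's three outputs. The registry schema, field names, and 0.8 fit threshold are assumptions for illustration:

```python
# Illustrative DMN-15 capability reuse gate over a software registry.
def reuse_gate(registry: list[dict], capability: str) -> str:
    # Best-scoring registry asset offering the requested capability, if any.
    best = max(
        (asset for asset in registry if asset["capability"] == capability),
        key=lambda asset: asset["fit_score"],
        default=None,
    )
    if best is None:
        return "Proceed"          # no comparable asset exists
    if best["fit_score"] >= 0.8 and best["licenses_available"] > 0:
        return "Reuse-Redirect"   # redirect requester to the existing asset
    return "Evaluate"             # partial match: route to human review
```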
13 frameworks mapped across 7 governance phases
Full chain logged: input data hash, knowledge base version, agent version, DMN rule applied, output, confidence score.
WORM-compliant storage with automated retention enforcement. Every decision recorded with timestamp and actor.
Pre-built exam packages for OCC, FINRA, SEC. 24-hour retrieval SLA met with automated tooling.
AI-driven anomaly detection: performance drift, security anomalies, compliance deviations. Risk-tiered cadence.
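The provenance chain logged for each DMN invocation can be sketched as a single record builder: hash the canonicalized inputs, then capture versions, rule, output, and confidence alongside a timestamp. Field names are assumptions mirroring the chain elements listed above:

```python
# Illustrative provenance record for one DMN invocation.
import hashlib
import json
from datetime import datetime, timezone

def provenance_record(inputs: dict, kb_version: str, agent_version: str,
                      dmn_rule: str, output: str, confidence: float) -> dict:
    # Canonical JSON (sorted keys) makes the hash deterministic for equal inputs.
    payload = json.dumps(inputs, sort_keys=True).encode()
    return {
        "input_hash": hashlib.sha256(payload).hexdigest(),
        "kb_version": kb_version,
        "agent_version": agent_version,
        "dmn_rule": dmn_rule,
        "output": output,
        "confidence": confidence,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
```

Writing such records to WORM storage gives examiners a replayable chain: identical inputs against the same knowledge-base and agent versions produce the same hash, so divergent outputs are immediately attributable to a rule or version change.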
Three-horizon maturity model: Foundation → Automation → Optimization
From 90-120 days to 29-45 days. 15 DMN tables. 10 BPMN models. 13 regulatory frameworks.
v0.0.0 | Built 2026-03-02 | View Full PRD