Strategy

How the three-layer intelligence engine works — from public data to personalised computation to Open Banking enrichment

The thesis is that people need computation, not dashboards — mathematical translation of macroeconomic forces into personal financial implications. This chapter describes how that translation works in practice: what a user actually experiences, what powers it underneath, and how the architecture replicates across every domain of personal finance.

The easiest way to understand the product is to follow someone through it.

A User Story: Sarah's Mortgage Decision

Sarah is 34. She and her partner bought their first home two years ago with a two-year fixed-rate mortgage at 5.14%. The fix expires in September. She knows she needs to remortgage, but she doesn't know when to start, what rate to aim for, whether to fix for two or five years, or how the Bank of England's recent decisions affect her options. She does what most people do: she Googles it.

Layer 1 — She Arrives, She Learns, She Stays

Sarah searches "should I fix my mortgage for 2 or 5 years" and finds Ask Morty (askmorty.co.uk). Instead of the opinion article or comparison table she expected, she finds an interactive tool.

The tool shows her the current shape of the Bank of England's yield curve — the market's collective expectation of where interest rates are heading — translated into plain language. It shows the current gap between two-year and five-year swap rates, what that gap means historically, and what the market is currently "pricing in" for the path of rates over the next few years. It shows a timeline of upcoming Bank of England decisions and how each one could shift the landscape.

Sarah has entered nothing. No name, no email, no mortgage details. She is using publicly available data — Bank of England yield curves, SONIA swap rates, published lender rate sheets — computed and presented in a way she has never seen before. She spends four minutes on the page. She bookmarks it.

This is Layer 1. The data is public. The computation is ours. The value is immediate. And Sarah now has a reason to come back: the next Bank of England decision is in three weeks, and the tool will update automatically.

What Layer 1 does

Layer 1 tools use only publicly available data — central bank rates, published swap curves, ONS inflation components, FCA regulatory filings, provider fee schedules — and translate them into interactive computations that anyone can use with zero input. This is the entry point, the trust-builder, and the SEO engine. It serves all 54.5 million UK adults with financial lives, not just the fraction willing to register.

Layer 2 — She Gets Personal

Sarah returns after the Bank of England holds rates steady. The tool has updated — the yield curve shape has shifted slightly, the market's implied rate path has moved, and the two-year versus five-year calculus has changed. She sees a prompt: "Want to see what this means for your specific mortgage?"

She enters her mortgage balance (£285,000), her current rate (5.14%), her remaining term (23 years), her property value (£340,000), and her fix expiry date (September 2026). No account creation. No email required. Just numbers into fields.

The tool now runs her scenario. It shows her exactly what happens under three paths: fix for two years at current indicative rates, fix for five years, or sit on SVR if she misses the window. It computes the total cost of each path over five years. It shows her the break-even point — the rate at which the two-year fix stops being cheaper than the five-year. It models what happens if rates rise 0.5%, 1%, or 1.5% when her next fix expires. It quantifies the cost of every week she delays beyond her fix expiry — the SVR penalty in pounds per month.
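The break-even and delay-penalty computations described here reduce to standard repayment-mortgage annuity arithmetic. A minimal sketch, using Sarah's figures from the text and hypothetical indicative rates (a real tool would price against live swap curves and lender rate sheets):

```python
# Illustrative Layer 2 scenario comparison. Rates below are hypothetical.

def monthly_payment(balance: float, annual_rate: float, years: float) -> float:
    """Standard repayment-mortgage annuity formula."""
    r = annual_rate / 12
    n = round(years * 12)
    return balance * r / (1 - (1 + r) ** -n)

def remaining_balance(balance, annual_rate, years, months_elapsed):
    """Balance left after a number of monthly payments at a fixed rate."""
    r = annual_rate / 12
    pmt = monthly_payment(balance, annual_rate, years)
    for _ in range(months_elapsed):
        balance = balance * (1 + r) - pmt
    return balance

BALANCE, TERM = 285_000, 23                      # Sarah's inputs
TWO_YR, FIVE_YR, SVR = 0.0434, 0.0419, 0.0799    # hypothetical rates

# Path A: five-year fix, one rate for the whole 60-month horizon.
cost_5yr = monthly_payment(BALANCE, FIVE_YR, TERM) * 60

# Path B: two-year fix, then refix for three years at an assumed rate.
def two_then_refix(refix_rate: float) -> float:
    pmt1 = monthly_payment(BALANCE, TWO_YR, TERM)
    bal = remaining_balance(BALANCE, TWO_YR, TERM, 24)
    pmt2 = monthly_payment(bal, refix_rate, TERM - 2)
    return pmt1 * 24 + pmt2 * 36

# SVR penalty: pounds per month of delay beyond fix expiry.
svr_penalty = (monthly_payment(BALANCE, SVR, TERM)
               - monthly_payment(BALANCE, TWO_YR, TERM))

# Break-even: the refix rate at which the two-year path stops being
# cheaper than the five-year fix (simple bisection search).
lo, hi = 0.0, 0.15
for _ in range(60):
    mid = (lo + hi) / 2
    if two_then_refix(mid) < cost_5yr:
        lo = mid
    else:
        hi = mid

print(f"5yr fix total over 60 months: £{cost_5yr:,.0f}")
print(f"SVR penalty per month of delay: £{svr_penalty:,.0f}")
print(f"Break-even refix rate: {mid:.2%}")
```

The bisection works because the two-year path's total cost rises monotonically with the assumed refix rate, so there is a single crossing point against the fixed five-year cost.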

Sarah has given no sensitive information. She has not connected her bank account, shared her income, or provided her address. She has entered five numbers — numbers she knows from her last mortgage statement — and received a level of analysis that would previously have required a broker consultation.

She clicks "save my scenario" and enters her email. Now Ask Morty can alert her when market conditions change in ways that affect her specific situation. A new tool has just earned a relationship that aggregation apps spent millions trying to buy.

What Layer 2 does

Layer 2 tools take basic user-provided parameters — mortgage balance, rate, and term; pension pot size and contribution level; income and spending patterns — and run personalised simulations, stress tests, and scenario models. No Open Banking connection required. No account creation for the core tools. This is the engagement engine and the free-to-premium boundary: some Layer 2 features are free, others are available through subscriptions or one-off payments for detailed reports, advanced simulations, and exportable analyses.

Layer 3 — She Goes Deeper

Over the following weeks, Sarah uses Ask Morty three more times — once after a CPI print shifts rate expectations, once to compare two specific lender products side by side, and once to model the impact of overpaying her mortgage versus increasing her pension contributions. Each time she returns, the tool remembers her scenario and shows how conditions have changed.

When September approaches, she sees an option: "Connect your mortgage account for an exact analysis." She has used the tool for two months. She trusts it. She consents to Open Banking access.

Now Ask Morty pulls her actual mortgage data — the precise balance, the exact rate, the specific lender, the contractual terms. It cross-references her lender's published product range. It computes her exact loan-to-value ratio and maps it to the rate tiers available to her. It identifies whether her current deal has early repayment charges, and if so, when they expire and whether the cost of breaking early is offset by the rate improvement available. It generates a comprehensive mortgage review — the kind of document a broker would charge for — that she can take to any adviser or lender.
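Two of the computations named here, the LTV-to-rate-tier mapping and the early-repayment break-even, become straightforward once exact figures are available. A sketch with hypothetical tier boundaries, rates, and ERC terms (real values come from the lender's published product range):

```python
# Hypothetical LTV tiers: (max LTV, indicative rate). Real tiers come
# from the connected lender's product sheet.
LTV_TIERS = [
    (0.60, 0.0409),
    (0.75, 0.0424),
    (0.85, 0.0449),
    (0.95, 0.0499),
]

def rate_for_ltv(balance: float, property_value: float) -> tuple[float, float]:
    """Map an exact loan-to-value ratio to the first tier it fits into."""
    ltv = balance / property_value
    for max_ltv, rate in LTV_TIERS:
        if ltv <= max_ltv:
            return ltv, rate
    raise ValueError("LTV above highest tier: no standard product")

def worth_breaking_early(balance: float, erc_pct: float,
                         old_rate: float, new_rate: float,
                         months_left: int) -> bool:
    """Is the early repayment charge offset by the rate improvement?
    Rough interest-only comparison over the months left on the fix."""
    saving = balance * (old_rate - new_rate) * months_left / 12
    return saving > balance * erc_pct

ltv, rate = rate_for_ltv(285_000, 340_000)
print(f"LTV {ltv:.1%} -> tier rate {rate:.2%}")
print("Break early:", worth_breaking_early(285_000, 0.01, 0.0514, 0.0424, 18))
```

The interest-only comparison in `worth_breaking_early` is a deliberate simplification; an exact version would compare full amortisation schedules, as in the scenario engine.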

Sarah has moved from anonymous visitor to known user to consented data connection — not because she was asked to, but because each layer proved its value before the next was offered.

What Layer 3 does

Layer 3 uses Open Banking integration — with Influx registered as an Account Information Service Provider (AISP) and Third Party Provider (TPP) — to pull real transaction data, real balances, and real product terms with the user's explicit consent. This enables exact computations: real affordability analysis, actual fee detection, precise product matching, and comprehensive financial reviews. Layer 3 is the deepest engagement level and the premium monetisation layer — but it is never required. A user can spend months in Layers 1 and 2 and receive genuine, substantial value throughout.

The Three-Layer Intelligence Engine

Sarah's journey illustrates the architecture, but the architecture is designed to work at scale across every domain of personal finance. The three layers are not a feature roadmap — they are a structural inversion of the aggregation model's fatal flaw.

The aggregation model demanded data upfront and promised value later. Users had to complete extensive onboarding, connect all their accounts, and manually enter product details before the product did anything useful. This created the onboarding wall that killed Moneyhub, Money Dashboard, and every other aggregator.

The three-layer engine inverts this completely: it delivers value first and earns data progressively.

Layer 1: Public Data

Data sources: Bank of England yield curves, SONIA overnight rates, gilt yields, ONS inflation components (CPI, CPIH, RPI at category level), FCA regulatory filings, published provider fee schedules, lender rate sheets, pension scheme public data, Land Registry price data, HMRC tax tables.

What it computes: Macro-to-micro translation — the current rate environment interpreted for consumer decisions. Market-implied rate paths. Inflation regime analysis. Lender criteria overviews. Provider fee comparisons. Regulatory change impact summaries. Historical context for current conditions.

User experience: Zero input required. No registration, no account, no email. The user arrives from search, uses an interactive tool, and gets a computation they cannot find anywhere else. The value is immediate and unconditional.

Business function: This is the traffic engine. Layer 1 tools target verifiable data vacuums — high-volume search queries where no existing result performs the computation our tools perform. They are the entry point to the constellation, the trust-builder, and the foundation for everything that follows. They also serve as the content that gets cited by AI Overviews, linked in forums, and shared socially — because interactive computation cannot be summarised away.

Layer 2: Personalised Computation

Data sources: User-provided parameters (mortgage balance, rate, term, property value; pension pot size, contribution rate, target retirement age; income, spending categories) combined with all Layer 1 public data.

What it computes: Personalised scenario modelling. Mortgage strategy simulations across multiple rate paths. Pension adequacy projections under different contribution, growth, and inflation assumptions. Personal inflation rates from ONS component data reweighted to the user's spending profile. Systemic leakage estimates across the user's described financial products. Stress tests — what happens to the user's position if rates rise, inflation spikes, or markets fall.

User experience: The user enters a handful of numbers — typically 3–8 fields — into a tool they already understand from Layer 1. No Open Banking connection. No sensitive data (no income proof, no bank statements, no National Insurance number). The inputs are structural descriptions of products the user holds, not personal identity data.

Business function: This is the engagement and conversion engine. Layer 2 is where users move from anonymous visitors to identifiable relationships (email capture via "save my scenario" or "alert me when conditions change"). It is also where the free-to-premium boundary sits: core simulations are free; advanced features — multi-scenario comparison, exportable PDF reports, historical backtesting, ongoing monitoring alerts — are available through subscriptions (monthly or annual) or one-off payments.
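The personal inflation rate in the Layer 2 computation list is a reweighting exercise: ONS publishes a 12-month inflation rate per CPI category, and the user's described spending profile supplies the weights. A sketch with illustrative category rates (not real ONS figures):

```python
# Hypothetical 12-month inflation rates per CPI category. A real tool
# would pull these from published ONS component series.
CATEGORY_INFLATION = {
    "food":      0.041,
    "energy":    0.087,
    "transport": 0.012,
    "housing":   0.063,
    "leisure":   0.029,
}

def personal_inflation(spend: dict[str, float]) -> float:
    """Average of category inflation rates, weighted by the user's spend."""
    total = sum(spend.values())
    return sum(CATEGORY_INFLATION[cat] * amount / total
               for cat, amount in spend.items())

# Example profile: heavy housing and food, light transport.
rate = personal_inflation(
    {"food": 450, "energy": 120, "transport": 400, "housing": 900, "leisure": 250}
)
print(f"Personal inflation: {rate:.2%}")
```

The same structure extends to headline-CPI comparison: compute the gap between the user's weighted rate and the published aggregate, and express it in pounds against their annual spend.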

Layer 3: Open Banking Enrichment

Data sources: Real transaction data, real account balances, real product terms, and real lender/provider details pulled via AISP/TPP consent under Open Banking regulations — layered on top of all Layer 1 and Layer 2 data.

What it computes: Exact personal inflation from real spending patterns. Actual fee extraction across connected accounts. Precise loan-to-value ratios mapped to live rate tiers. Real affordability analysis based on verified income and expenditure. Comprehensive financial reviews — mortgage, pension, or cross-product — that pre-structure the information an adviser would need for a fact-find.

User experience: The user, having built trust through weeks or months of Layer 1 and Layer 2 engagement, chooses to connect their accounts. The consent flow is standard Open Banking — regulated, transparent, revocable at any time. The reward is immediate: the tools they already use become dramatically more precise.

Business function: This is the data moat and the premium monetisation layer. Layer 3 users are the highest-value cohort: they generate the most detailed analyses, the most qualified professional referrals, and the richest anonymised benchmark data. Layer 3 is also the foundation for the B2B2C model — the comprehensive financial reviews generated at this layer are exactly what regulated advisers need to serve their clients more efficiently.

The Progressive Engagement Funnel

The three layers are not just a data architecture. They are an engagement model that solves the trust problem that killed every aggregator.

At Layer 1, the user gives nothing and receives genuine value. This is structurally impossible in the aggregation model, where the product is useless until accounts are connected. The result: the 57% of UK consumers who actively limit data sharing with financial providers can still use our tools fully at Layer 1, and meaningfully at Layer 2.

The transition from Layer 1 to Layer 2 is not a registration gate. It is a natural extension of curiosity. A user who has seen how the yield curve affects mortgage timing wants to know what it means for their mortgage. The input fields are not a form to be endured — they are parameters in a computation the user already values. This is why Layer 2 conversion should structurally outperform the aggregation onboarding wall: the user is adding precision to something they already understand, not trusting a stranger with their financial life.

The transition from Layer 2 to Layer 3 is even more selective — and that is by design. Only users who have demonstrated sustained engagement, who return multiple times, who refine their scenarios and explore edge cases, are invited to connect via Open Banking. This self-selection means Layer 3 users are deeply engaged before they ever share real data. They are not cold signups being asked to connect accounts on day one. They are informed users who have already experienced months of value and want more.

The flywheel

More users at Layer 1 generate more data about which computations attract engagement. Better computations attract more users. More Layer 2 users generate richer scenario data that improves model calibration. More Layer 3 users generate anonymised benchmark data that makes Layer 1 and Layer 2 tools more valuable for everyone. Each layer feeds the others. The intelligence compounds.

The Constellation: Same Engine, Different Domains

The three-layer architecture is not specific to mortgages. It is a reusable engine that applies to every domain of personal finance — because the underlying structure is the same everywhere: public macroeconomic data exists, consumers cannot interpret it for their situation, and the computation that bridges the gap has never been built.

Mortgages First: Ask Morty

askmorty.co.uk launches first because mortgages are the highest-intent, most time-sensitive, and most monetisable entry point in UK personal finance.

Mortgage decisions affect more money more immediately than any other consumer financial decision. They recur on known schedules — every 2, 3, or 5 years when a fix expires. They are driven by macroeconomic conditions that change continuously: every Bank of England decision (eight per year), every inflation print (twelve per year), every significant swap rate movement generates fresh computation and a new reason for users to return. This solves the frequency problem that killed aggregation dashboards — the content engine is the economy itself, and the economy never stops moving.

The Layer 1 tools — rate environment analysis, yield curve translation, lender criteria overviews, market trend context — target data vacuums that currently return opinion articles and generic guides. "Should I fix for 2 or 5 years," "when should I start remortgaging," "what happens if I go onto SVR" — these are high-volume queries where no existing result provides personalised, computational analysis.

The Layer 2 tools — mortgage strategy simulator, affordability stress test, remortgage timing optimiser, overpayment versus investment modelling — convert curious visitors into engaged users with quantifiable value. The Layer 3 integration — exact mortgage position, real LTV computation, lender-specific product matching, comprehensive mortgage review — serves the highest-intent users and generates qualified referrals to mortgage brokers.

Pensions Second: Pension Copilot

pensioncopilot.co.uk follows because pensions are the largest asset most people will ever own, the least understood, and the area where the knowledge gap and extraction problem are most acute.

The cohort overlap with mortgage users is substantial. A 35–55 year old making a mortgage decision is statistically likely to hold multiple pension pots, to be uncertain about their pension adequacy, and to have never reviewed their workplace pension fund allocation. Ask Morty identifies these users through their engagement patterns — a user modelling retirement-age mortgage payoff scenarios is signalling pension-related intent — and introduces Pension Copilot as a natural next step.

Layer 1 tools for pensions: pension fee benchmarking against published scheme data, state pension age and entitlement calculators, tax relief explainers that compute the actual benefit for different tax bands, annuity rate trackers from published market data. Layer 2: pension adequacy projections under multiple growth, inflation, and contribution scenarios; consolidation analysis that identifies embedded guarantees before recommending any action; geographic retirement mapping that shows where a projected pension income goes furthest. Layer 3: real pension data pulled via consent, actual fee extraction computed across all connected pots, comprehensive pension review that pre-structures an adviser fact-find.
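The adequacy projections named above can be sketched in a few lines under deliberately simple assumptions (deterministic growth, constant contributions, a rule-of-thumb withdrawal rate); the production models would be stochastic and calibrated:

```python
# Illustrative Layer 2 pension adequacy projection. All scenario
# parameters below are assumptions for the sketch, not model inputs.

def project_pot(pot: float, annual_contribution: float, years: int,
                growth: float, inflation: float) -> float:
    """Project a pension pot forward, reported in today's money."""
    for _ in range(years):
        pot = pot * (1 + growth) + annual_contribution
    return pot / (1 + inflation) ** years   # deflate to real terms

scenarios = {
    "cautious":   dict(growth=0.03, inflation=0.025),
    "central":    dict(growth=0.05, inflation=0.02),
    "optimistic": dict(growth=0.07, inflation=0.02),
}

for name, params in scenarios.items():
    real_pot = project_pot(pot=80_000, annual_contribution=6_000,
                           years=30, **params)
    # 4% is a common rule-of-thumb withdrawal rate, not a recommendation.
    print(f"{name:>10}: £{real_pot:,.0f} pot ~ £{real_pot * 0.04:,.0f}/yr income")
```

Running the same projection across a grid of contribution levels is what turns this from a number into a decision aid: it shows the user how much extra monthly contribution closes their adequacy gap under each scenario.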

The Expansion Path

Each subsequent vertical follows the same pattern: identify the domain, map the public data sources, find the data vacuums, build Layer 1 tools that deliver zero-input value, extend to Layer 2 personalisation, and offer Layer 3 enrichment for users who want it.

The roadmap after mortgages and pensions — ISAs and savings optimisation, protection and insurance, debt management, tax planning — is determined by cohort overlap (which users naturally need the next tool?), data availability (do sufficient public data sources exist?), and monetisation potential (does the domain support premium features, referrals, or B2B licensing?). Each vertical is only built once the preceding verticals have validated the three-layer model with real user data.

Revenue: Monetising Computation, Not Assets

The graveyard proved that "seeing your money and getting insights" cannot sustain a standalone business. Our revenue model does not depend on it. We monetise what we compute, not what we hold.

Premium features sell ongoing access to computations the user has already experienced as valuable. A user who has seen their mortgage strategy simulation update after a Bank of England decision understands what they are paying for. The premium tier is "keep this computation running and alert me when the answer changes" — not "trust that this dashboard will eventually be useful." Revenue comes from monthly or annual subscriptions and one-off payments for detailed reports, multi-scenario analyses, and exportable documents.

Professional referrals emerge from the tools' own triage mechanisms. Users with straightforward situations receive self-directed guidance. Users with genuinely complex situations — defined benefit pensions with safeguarded benefits, high-value mortgages with unusual structures, approaching critical deadlines — are identified, their data pre-structured into a ready-to-analyse information pack, and referred to independent financial advisers or mortgage brokers. This eliminates the £500–1,000 an adviser normally spends on fact-finding, making referral fees of £300–2,000 per qualified case economically rational for both parties. The triage ensures referrals happen only when complexity genuinely warrants them, preserving the trust that makes the referral valuable.

B2B2C licensing monetises the analytical models directly. The same computational engines that power the consumer tools have standalone value for regulated professionals. A mortgage broker can subscribe to Ask Morty's rate environment and strategy simulation tools to use with their own clients. An IFA firm can license Pension Copilot's adequacy projections and fee benchmarking for client reviews. An employer can offer the tools as part of a financial wellbeing programme. This creates a revenue stream that does not depend on consumer willingness to pay, while expanding the tools' reach to populations — employer schemes, adviser client bases, corporate wellbeing programmes — that would never find them through search.

Gated content — educational resources, methodology deep-dives, regulatory guides, and curated reference libraries — provides additional value at the free-to-premium boundary. This content serves both consumers seeking deeper understanding and professionals seeking materials they can share with clients.

Revenue without the graveyard's mistake

Each revenue channel is enabled by the computational assets the tools produce — calibrated models, scenario engines, benchmark datasets — rather than by the data users provide. We do not need to hold assets, manage accounts, or sell financial products. The business model is structurally different from everything in the graveyard.

The Competitive Moat

The aggregation products had no durable moat. Any competitor could access the same Open Banking data through the same providers at the same commodity price. When NatWest and Barclays built account aggregation into their own apps, the startups lost whatever edge they had.

Our competitive advantage sits in a fundamentally different asset: calibrated computational models at the intersection of macroeconomics, financial engineering, and consumer product structures.

The model that translates a yield curve shape into consumer mortgage guidance requires someone who understands both institutional fixed-income markets and how UK mortgage products are structured and priced. The Monte Carlo engine that simulates multi-decade mortgage strategies requires stochastic modelling calibrated to real rate volatility data. The leakage computation framework requires continuously updated benchmark data on fees, rates, and product terms across major UK providers, combined with compounding mathematics that makes the invisible visible.
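As an illustration of the kind of engine described, here is a toy mean-reverting (Vasicek-style) short-rate simulation driving a distribution of five-year interest costs. The parameters are placeholders, not the calibrated real-world volatility data the text refers to:

```python
import random
import statistics

def simulate_rate_path(r0=0.045, mean=0.035, speed=0.3, vol=0.012,
                       years=5, steps_per_year=12, rng=random):
    """One mean-reverting path of the short rate, monthly steps."""
    dt = 1 / steps_per_year
    path, r = [r0], r0
    for _ in range(years * steps_per_year):
        r += speed * (mean - r) * dt + vol * (dt ** 0.5) * rng.gauss(0, 1)
        r = max(r, 0.0)               # floor at zero for simplicity
        path.append(r)
    return path

def five_year_interest(balance, margin, path):
    """Rough interest cost: lender rate = short rate + margin, interest-only."""
    return sum(balance * (r + margin) / 12 for r in path[:-1])

random.seed(42)
costs = [five_year_interest(285_000, 0.01, simulate_rate_path())
         for _ in range(2_000)]
ordered = sorted(costs)
print(f"median 5y interest: £{statistics.median(costs):,.0f}")
print(f"5th-95th percentile: £{ordered[100]:,.0f} - £{ordered[1900]:,.0f}")
```

The production version differs in exactly the ways the moat argument describes: the drift, volatility, and mean-reversion parameters are calibrated to observed rate regimes, and the cost function prices real amortising products rather than an interest-only approximation.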

These models cannot be scraped or crawled: the outputs are numbers, but the calibration and methodology that produce them are never exposed, and they cannot be reverse-engineered from outputs alone. They improve with time as more regime changes are observed, more calibrations are refined, and more edge cases in product structures are handled. And they require genuine cross-disciplinary expertise to replicate: a co-founder with deep financial services regulatory experience across multiple jurisdictions, and a co-founder with a PhD in quantitative modelling and algorithmic design, working in the same room on the same problem.

The moat is not in data access. It is in the interpretation layer. And the interpretation layer is an intellectual asset that compounds with every regime change observed, every model calibrated, every new product structure mapped — rather than a technology commodity that any competitor can buy from the same API provider.

Incumbents will not replicate this. Monzo cannot credibly tell customers their money would earn more elsewhere. Comparison sites cannot help users think beyond the next referral commission. Workplace pension providers cannot build tools that reveal their own fee extraction. The neutrality required for genuine economic translation is structurally incompatible with being a product provider — and that structural incompatibility is the most durable moat of all.