16  Analytics Frameworks: The LAMP Model

16.1 Why Analytics Needs a Framework

A function with strong techniques and no framework will solve well-formed problems brilliantly and ill-formed ones not at all.

Most HR analytics work fails not at the model but at the framing. The data exists, the technique is sound, the chart is drawn well — and the audience walks away unconvinced. The diagnosis is usually the same: the analyst answered the wrong question, or answered the right question for the wrong audience, or made the right argument but built it on a measurement the audience would not trust. A framework is what stops these failures, by making the analyst commit to logic, evidence, measurement, and process before the model is run.

The most influential framework in HR analytics is the LAMP model proposed by John W. Boudreau & Peter M. Ramstad (2007) in their work on talent measurement and decision science. LAMP stands for Logic, Analytics, Measurement, and Process. The four elements work in sequence and as a check on one another. Logic supplies the cause-and-effect argument the work is built on. Analytics applies the techniques that test the logic. Measurement provides the data the analytics depend on. Process is the machinery that turns the result into a decision the organisation actually takes. Skip any one, and the work fails predictably at that point.

The logic-first ordering of LAMP was a deliberate response to the way many HR-analytics programmes had been built. As John W. Boudreau & Peter M. Ramstad (2005) had argued earlier in their introduction of talentship as a decision science, the discipline that distinguishes an HR-analytics function from an HR-reporting function is the willingness to start with logic — the cause-and-effect story about why workforce decisions matter for business outcomes — before anything else. A function that starts with the data ends up with whatever questions the data happens to support. A function that starts with the logic ends up with the questions the business actually needs answered.

The visualisation lens enters at every step. Logic is rendered as a cause-and-effect diagram. Analytics is rendered as a chart with a comparison built in. Measurement is rendered as a tooltip that discloses definitions and lineage. Process is rendered as a recurring decision moment that the page is designed to feed. A dashboard that surfaces all four elements visibly is a dashboard that audiences read as analytics rather than as reporting.

Tip: The LAMP contract
  1. Every analytic the function ships is built on an explicit logic — a cause-and-effect story the audience can audit before any technique is applied.
  2. Logic, analytics, measurement, and process are designed in that order. Reordering produces predictable failures, especially when the work begins with whatever data is most convenient.
  3. The dashboard renders all four LAMP elements: a logic visual, an analytical chart, a measurement disclosure, and a named decision the chart supports.

16.2 The Four Elements of LAMP

LAMP is best read as a working sequence rather than as a list of equally weighted boxes. Each element does a specific job; together they describe the path from a workforce question to a workforce decision the organisation acts on.

Tip: The Four LAMP Elements at a Glance

| Element | Question it answers | Failure mode when it is missing |
|---|---|---|
| Logic | Why does the workforce question we are asking matter to business outcomes? | Answers to questions the business does not need |
| Analytics | What technique will test the logic and produce the answer? | Numbers without an argument behind them |
| Measurement | What data are required, and how will they be defined? | A model whose inputs cannot be defended |
| Process | How does the answer become a decision the organisation acts on? | Insights that are produced and forgotten |
Tip: The LAMP sequence

```mermaid
flowchart LR
  A[Logic<br/>cause-and-effect argument] --> B[Analytics<br/>techniques to test it]
  B --> C[Measurement<br/>data and definitions]
  C --> D[Process<br/>decision integration]
  style A fill:#E8F0FE,stroke:#1A73E8
  style B fill:#FEF7E0,stroke:#F9AB00
  style C fill:#E6F4EA,stroke:#137333
  style D fill:#F3E8FD,stroke:#8430CE
```

Each arrow is a check. If the analytics cannot test the logic, the logic was wrong, and the work returns to the first element. If the measurement cannot support the analytics, the analytics was over-reaching, and the work returns to the second element. If the process cannot use the result, the analysis was not framed for a decision in the first place, and the work returns to the start. LAMP is a framework that fails forward, not a checklist run once.

16.3 Logic and Analytics

Logic and analytics are the upstream half of LAMP. Logic supplies the argument. Analytics tests the argument. The two together form the intellectual core of the work, and the discipline is to build them in the order LAMP prescribes rather than letting the available technique determine the question.

Tip: What Disciplined Logic Looks Like

A disciplined logic statement names the workforce variable being studied, the business outcome it is claimed to influence, the mechanism by which the influence is supposed to operate, and the conditions under which the relationship is expected to hold. “Engagement matters” is not a logic statement. “Engagement among frontline staff influences customer-rated service quality through reduced absenteeism and faster issue-resolution, in stores with at least three months of stable management” is. The longer statement makes the claim testable and tells the analytics layer exactly what comparison and what controls to build.
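The four named parts of such a statement can be captured as a small structured record, so that a claim's completeness can be checked before any technique is chosen. This is an illustrative sketch only: the `LogicStatement` dataclass and its field names are hypothetical, not part of LAMP itself.

```python
from dataclasses import dataclass

@dataclass
class LogicStatement:
    """One auditable cause-and-effect claim (hypothetical structure)."""
    workforce_variable: str      # the workforce input under study
    business_outcome: str        # the outcome it is claimed to influence
    mechanisms: list             # how the influence is supposed to operate
    boundary_conditions: list    # where the relationship is expected to hold

    def is_testable(self) -> bool:
        # A claim is testable only when all four parts are named.
        return all([self.workforce_variable, self.business_outcome,
                    self.mechanisms, self.boundary_conditions])

engagement_claim = LogicStatement(
    workforce_variable="frontline engagement score",
    business_outcome="customer-rated service quality",
    mechanisms=["reduced absenteeism", "faster issue resolution"],
    boundary_conditions=["stores with >= 3 months of stable management"],
)
print(engagement_claim.is_testable())  # True
```

"Engagement matters" would fail this check: it names no outcome, no mechanism, and no boundary conditions.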

Tip: What Analytics Owes the Logic

Analytics owes the logic three things. First, a technique strong enough to test the claim — a paired comparison, a regression, a quasi-experimental design — chosen because it fits the logic, not because the team is comfortable with it. Second, an honest rendering of uncertainty, so that the audience can tell the difference between a tested claim and a confident guess. Third, a visualisation that makes the test legible: the comparison group on the same chart, the time offset rendered correctly, the threshold or counterfactual visible. As John W. Boudreau & Peter M. Ramstad (2007) argue, the analytics step is where the audience earns the right to trust the logic, and the visualisation is the working surface where that trust is built.
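As a minimal sketch of the first two obligations, a paired comparison with its uncertainty rendered alongside the point estimate might look like this. The engagement and service-quality figures below are invented purely for illustration.

```python
import math
import statistics as stats

# Synthetic illustration only: customer service-quality scores for stores
# above and below the engagement median (values are invented, not real data).
high_engagement = [4.3, 4.1, 4.6, 4.4, 4.2, 4.5, 4.0, 4.4]
low_engagement  = [3.9, 4.0, 3.7, 4.1, 3.8, 3.9, 4.0, 3.6]

# Point estimate: the difference in mean service quality between groups.
diff = stats.mean(high_engagement) - stats.mean(low_engagement)

# Standard error of the difference in means (unpooled, Welch-style).
se = math.sqrt(stats.variance(high_engagement) / len(high_engagement)
               + stats.variance(low_engagement) / len(low_engagement))

# Approximate 95% interval, reported alongside the point estimate so the
# audience can judge the strength of the evidence, not just its direction.
lo, hi = diff - 1.96 * se, diff + 1.96 * se
print(f"difference: {diff:.2f}, 95% CI: ({lo:.2f}, {hi:.2f})")
```

On the chart, both groups and the interval belong on the same visual, so the test itself is legible rather than implied.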

16.4 Measurement and Process

Measurement and process are the downstream half of LAMP. Measurement supplies the inputs that the analytics depend on. Process is what converts the analytical output into an organisational decision. The two together are where most well-designed analytics work fails in practice — usually because the team underestimated either the data work or the change-management work that the framework requires.

Tip: What Disciplined Measurement Looks Like

| Property | What measurement promises |
|---|---|
| Definition | The variable is computed the same way every time |
| Lineage | The path from raw data to measure is documented |
| Refresh cadence | The data is fresh enough to support the decision the analytic informs |
| Coverage | The measure applies to the population the logic specified |
| Defensibility | The measure can be explained to a sceptical audience without rephrasing |
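These properties can be carried as metadata alongside the measure itself, so that checks such as refresh cadence are automated rather than remembered. A minimal sketch, with hypothetical field names and values:

```python
from datetime import date, timedelta

# Hypothetical measurement disclosure for one metric; the field names follow
# the properties above and are illustrative, not a standard schema.
measure = {
    "name": "voluntary_turnover_rate",
    "definition": "voluntary leavers / average headcount, per month",
    "lineage": "HRIS exits table -> monthly snapshot -> turnover mart",
    "last_refreshed": date(2024, 6, 1),
    "coverage": "all permanent frontline staff",
}

def is_fresh(measure: dict, max_age_days: int, today: date) -> bool:
    """Refresh-cadence check: is the data recent enough for the decision?"""
    return (today - measure["last_refreshed"]) <= timedelta(days=max_age_days)

print(is_fresh(measure, max_age_days=35, today=date(2024, 6, 20)))  # True
```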
Tip: What Disciplined Process Looks Like

Process is the machinery that turns the analytic output into action. A disciplined process names the recurring meeting where the result is read, the decision owner who is expected to act on it, the cadence at which the result is refreshed, and the way the action will be tracked back to the next cycle’s analytic. Without that machinery, even a flawless logic-and-analytics chain produces studies the organisation forgets. The strongest test of a LAMP-aligned function is whether removing the analytic would visibly disrupt a recurring meeting; if the meeting can carry on without it, the process step has not been built.
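The named parts of a disciplined process can likewise be kept as a structured entry, so that last cycle's decision is read back at the start of the next. The `DecisionRecord` structure and its example values below are hypothetical:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class DecisionRecord:
    """Hypothetical action-tracking entry linking an analytic to a decision."""
    analytic: str        # the analytic output being acted on
    meeting: str         # the recurring meeting where it is read
    decision_owner: str  # the role expected to act
    decision: str        # the action taken
    review_date: date    # when the next cycle reviews the outcome

decision_log = [
    DecisionRecord(
        analytic="frontline attrition forecast",
        meeting="monthly workforce review",
        decision_owner="Head of Retail Operations",
        decision="pilot revised scheduling in one region",
        review_date=date(2024, 7, 8),
    ),
]

# The next cycle opens by reading the previous record back to the meeting.
print(decision_log[-1].decision)
```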

16.5 Visualising LAMP on the Page

LAMP is a framework, not a visual style. But a dashboard built with LAMP discipline shows the framework on the page in specific ways. Five design choices, applied consistently, render the framework legibly to the audience that consumes the work.

Tip: Five Design Choices That Render LAMP

| Choice | What it shows about LAMP |
|---|---|
| Logic strip on the page | A brief cause-and-effect statement at the top of the page |
| Analytics chart with comparison | The technique is visible and the comparison is built in |
| Measurement tooltip | Definition, lineage, refresh, and coverage on hover |
| Decision header | The page names the recurring meeting and decision owner it serves |
| Action-tracking column | The previous cycle's decision and outcome are recorded on the page |
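The measurement-tooltip choice, for instance, can be generated directly from the measure's disclosure metadata rather than hand-written for each chart. A small sketch, with illustrative keys and values:

```python
# Hypothetical: assemble the hover-tooltip text for one chart measure from
# its measurement disclosure; the keys mirror the four disclosed properties.
disclosure = {
    "Definition": "voluntary leavers / average headcount, per month",
    "Lineage": "HRIS exits table -> monthly snapshot -> turnover mart",
    "Refresh": "monthly, first working day",
    "Coverage": "all permanent frontline staff",
}

# One line per property; most charting tools accept a plain multi-line string.
tooltip = "\n".join(f"{label}: {value}" for label, value in disclosure.items())
print(tooltip)
```

Generating the tooltip from metadata keeps the on-page disclosure and the documented lineage from drifting apart.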
Tip: The dashboard as a LAMP artefact

A dashboard built deliberately as a LAMP artefact reads differently from one assembled chart-by-chart. The page opens with a logic strip the audience can audit, surfaces the analytical evidence with comparison built in, exposes the measurement discipline through tooltips, names the decision moment in the header, and records the previous cycle’s action. The audience reads the page as analytics rather than as reporting because every LAMP element is visible. The function earns its place because every LAMP element is also true behind the page.

Summary

Why a Framework Matters

| Concept | Description |
|---|---|
| Frameworks prevent framing failures | Most analytics fails at framing, not at the model; the framework prevents this |
| Logic-first ordering | Start with the cause-and-effect argument before the data or the technique |
| LAMP fails forward | Each LAMP element checks the previous and returns the work when it fails |
| Reporting versus analytics | Reporting describes; analytics tests an argument with a defended technique |
| Visualising LAMP elements | A LAMP-built dashboard makes all four elements visible to the audience |

The Four LAMP Elements

| Concept | Description |
|---|---|
| Logic element | Why does the workforce question matter for the business outcome? |
| Analytics element | What technique will test the logic and produce the answer? |
| Measurement element | What data are required, and how will they be defined? |
| Process element | How does the answer become a decision the organisation acts on? |
| LAMP sequence | Logic, analytics, measurement, process — in that order, with feedback |

Logic and Analytics

| Concept | Description |
|---|---|
| Disciplined logic statement | The full statement names variable, outcome, mechanism, and conditions |
| Workforce variable named | The specific workforce input under study |
| Business outcome named | The specific business outcome the workforce input is claimed to influence |
| Mechanism named | The cause-and-effect path from input to outcome |
| Boundary conditions named | The conditions under which the relationship is expected to hold |
| Technique chosen for the logic | The analytical technique fits the logic, not the team's comfort |
| Honest uncertainty rendering | Charts render uncertainty visibly so the audience can judge the strength |
| Visualisation that makes the test legible | The comparison group, time offset, and threshold are visible on the chart |

Measurement and Process

| Concept | Description |
|---|---|
| Definition discipline | The variable is computed the same way every time |
| Lineage discipline | The path from raw data to measure is documented |
| Refresh cadence discipline | The data is fresh enough to support the decision |
| Coverage discipline | The measure applies to the population the logic specified |
| Defensibility discipline | The measure can be explained to a sceptical audience without rephrasing |
| Decision-owner naming | The named role expected to act on the analytic output |
| Recurring meeting integration | A recurring meeting that depends on the analytic to proceed |
| Action tracking back to the analytic | Last cycle's decision and outcome are tied back to the analytic |

Visualising LAMP

| Concept | Description |
|---|---|
| Logic strip on the page | A brief cause-and-effect statement at the top of the page |
| Analytics chart with comparison | The technique is visible and the comparison is built into the visual |
| Measurement tooltip | Definition, lineage, refresh, and coverage available on hover |
| Decision header | The page names the recurring meeting and decision owner it serves |
| Action-tracking column | Previous cycle's action and result are recorded on the page |
| Dashboard as a LAMP artefact | A dashboard built deliberately to surface every LAMP element |