ISO 17025:2025 Transition
The 36-Month Clock Started
Digital evidence, broader software validation, and operational risk decisions now carry the weight of reassessment, with a September 30, 2028 deadline already running.
The Clock Is Ticking
ISO published the third edition of ISO/IEC 17025 on September 27, 2025. Five days later, the International Laboratory Accreditation Cooperation set the transition deadline: every accredited testing or calibration laboratory still on the 2017 edition must conform to the 2025 edition by September 30, 2028.
Three years sounds generous. It is not. A credible transition runs through a gap assessment, a system selection, a validation plan, configuration and data migration, a parallel run, staff retraining, and a reassessment, in that order, and every stage has a minimum duration. A lab that starts in month 30 will not finish.
The consequence at the deadline is blunt. Accreditation to the 2017 edition will not be recognized under the Global Accreditation Cooperation MRA, successor to the ILAC MRA since January 1, 2026, once the transition closes. A lapsed certificate is a lost contract with every regulator, insurer, and purchaser that specifies accredited results.
What Actually Changed
The 2025 edition is not a rewrite. Most structural clauses carry over. The text that moved is the text that touches how a result is generated, signed, stored, and produced to an assessor on request. Three shifts matter more than the rest.
The first is in Clause 7.11, data and information management. The 2017 edition allowed electronic records as an option. The 2025 edition codifies what working assessors already expected to see: electronic signatures attributable to a named user, audit trails with timestamps, traceable digital calibration data, and Digital Calibration Certificates as a recognized format. The standard does not forbid paper. It requires an evidence chain that withstands scrutiny, and paper-only workflows will struggle to produce that evidence in the form an assessor now expects.
The second concerns software validation, and it reaches well past the LIMS. Under the 2017 edition, most labs treated validation as a LIMS question and ran the rest of the stack on trust. Under the 2025 edition, everything that touches a number on the way to a report is in scope: spreadsheets with embedded formulas, middleware that passes instrument readings, calculation macros, report templates, and the AI-assisted analysis tools arriving in the lab. Each one needs a validation record.
The third is risk-based thinking, which moved out of the management review and into daily operations. In 2017, risk was largely an annual exercise. In 2025, risk decisions thread into personnel competence in 6.2, method selection in 7.2, equipment verification in 6.4, and data management in 7.11. Assessors will expect evidence of risk reasoning in the routine output of the lab, not only in meeting minutes.
Five Requirements to Meet
Digital-first records with traceability
Records must be reproducible in full, including the conditions under which they were created, edited, and approved. Assessors will request an audit trail for a randomly selected sample, traced from instrument output to final report.
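What an audit trail minimally has to capture can be sketched in a few lines. This is an illustrative shape, not any specific LIMS schema; the record IDs, users, and actions are hypothetical.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical minimal audit-trail entry: each change to a record captures
# who acted, when (a UTC timestamp), and what happened. Entries are append-only.
@dataclass(frozen=True)
class AuditEntry:
    record_id: str
    user: str          # a named, unique user, never a shared login
    action: str        # "created", "edited", "approved"
    detail: str
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

trail: list[AuditEntry] = []

def log(entry: AuditEntry) -> None:
    trail.append(entry)  # append-only: existing entries are never rewritten

log(AuditEntry("S-1042", "jsmith", "created", "sample logged at receipt"))
log(AuditEntry("S-1042", "mlee", "approved", "result released to report"))

# Reconstructing the history of one record is a filter over the full trail
history = [e for e in trail if e.record_id == "S-1042"]
print(len(history))  # 2
```

The point of the append-only discipline is exactly the assessor scenario above: any sampled record can be replayed, creation to approval, without gaps.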
Attributable electronic signatures
Every approval in the result-generation path must be bound to a named user with unique credentials. Shared logins fail this. Scanned signatures on PDFs fail this. Assessors will request a signed record plus the system log that proves the signatory could not have been anyone else at that moment.
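The difference between a scanned image and an attributable signature is cryptographic binding. A minimal sketch, using an HMAC secret held only by the named user (the user names and key handling here are illustrative, not a production key-management design):

```python
import hashlib
import hmac

# Hypothetical per-user credential: a signature is attributable when it is
# computed with a secret only the named user holds. A pasted image of a
# signature has no equivalent of this binding.
USER_KEYS = {"mlee": b"secret-held-only-by-mlee"}  # illustrative only

def sign(user: str, record: bytes) -> str:
    return hmac.new(USER_KEYS[user], record, hashlib.sha256).hexdigest()

def verify(user: str, record: bytes, signature: str) -> bool:
    return hmac.compare_digest(sign(user, record), signature)

record = b"S-1042 result: 4.82 mg/L"
sig = sign("mlee", record)

print(verify("mlee", record, sig))                 # True: signer and record match
print(verify("mlee", record + b" edited", sig))    # False: any change breaks it
```

This is why shared logins fail the requirement: if two people hold the same credential, the system log cannot prove which of them signed.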
Software validation beyond the LIMS
Any software that calculates, transfers, stores, or formats data in a client report is in scope. A spreadsheet with an uncertainty formula is software. A script that parses instrument output is software. Assessors will request a validation record for each in-scope system, with a defensible rationale for the level of testing applied.
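What a validation record for one of those spreadsheet formulas might boil down to can be shown concretely. The sketch below assumes a root-sum-of-squares combined-uncertainty calculation and a handful of reference cases; both are illustrative stand-ins for whatever formula actually lives in the workbook.

```python
import math

# Hypothetical in-scope calculation: the kind of formula that often sits
# untested in a spreadsheet cell on the way to a client report.
def combined_uncertainty(components: list[float]) -> float:
    return math.sqrt(sum(u * u for u in components))

# The heart of a validation record: documented evidence that the calculation
# reproduces known reference cases within a stated tolerance.
reference_cases = [
    ([3.0, 4.0], 5.0),
    ([0.1, 0.2, 0.2], 0.3),
]
for inputs, expected in reference_cases:
    assert math.isclose(combined_uncertainty(inputs), expected, rel_tol=1e-9)
print("all reference cases pass")
```

The defensible rationale the assessor asks for is the pairing: which cases were chosen, why they bound the formula's use, and the evidence they pass.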
Risk decisions inside operations
Risk reasoning now appears explicitly in clauses 6.2, 6.4, 7.2, and 7.11. Each expects documented evidence that the lab identified a risk, decided on a response, and acted on the decision. Assessors will request a risk register tied to operational clauses, not a generic attachment to the annual management review.
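The shape of a clause-tied register entry is simple enough to sketch. The entries below are hypothetical examples of the identify-decide-act pattern the clauses expect:

```python
from dataclasses import dataclass

# Hypothetical risk-register entry tied to an operational clause, capturing
# the three things an assessor looks for: the risk identified, the decision
# taken, and a pointer to the record proving the decision was acted on.
@dataclass
class RiskEntry:
    clause: str        # e.g. "6.2", "6.4", "7.2", "7.11"
    risk: str
    response: str
    evidence: str      # reference to the record proving action

register = [
    RiskEntry("7.11", "spreadsheet formulas edited without review",
              "lock formula cells; require reviewer sign-off",
              "validation record VR-017"),
    RiskEntry("6.2", "single analyst qualified on the ICP-MS",
              "cross-train a second analyst",
              "training record TR-203"),
]

# Assessors ask for risks by operational clause, not by annual review cycle
print(sorted({e.clause for e in register}))  # ['6.2', '7.11']
```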
Continuity of data, end to end
From sample receipt to final report, the data trail must be unbroken and internally consistent. A gap at any handoff becomes a finding: a scribbled batch number, a verbal approval, a re-keyed result. Assessors will request a single sample traced from login to certificate with every system, user, and timestamp accounted for.
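One way to make "unbroken and internally consistent" mechanically checkable is to chain each handoff to the one before it. This is a sketch of the idea, not a prescribed implementation; the sample, users, and timestamps are invented.

```python
import hashlib

# Hypothetical unbroken data trail: each handoff records a hash of the
# previous step, so a missing, reordered, or re-keyed step breaks the chain.
def link(prev_hash: str, step: str) -> str:
    return hashlib.sha256((prev_hash + step).encode()).hexdigest()

steps = [
    "receipt: S-1042 logged by jsmith 2028-03-01T09:14Z",
    "analysis: ICP-MS run 88 by mlee 2028-03-01T13:02Z",
    "review: approved by kchen 2028-03-02T08:41Z",
    "report: certificate C-5530 issued 2028-03-02T10:05Z",
]
chain = []
h = ""
for step in steps:
    h = link(h, step)
    chain.append((step, h))

# Verification replays the chain; any altered or missing step is a finding
def verify_chain(chain):
    h = ""
    for step, recorded in chain:
        h = link(h, step)
        if h != recorded:
            return False
    return True

print(verify_chain(chain))  # True
```

Tracing "a single sample from login to certificate" is then a replay of exactly this kind of chain, with every system, user, and timestamp accounted for.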
The Paper-to-Digital Gap
Most labs assume the transition is a LIMS problem. It rarely is. The LIMS was validated years ago and produces its audit trail on demand. The gap is everything the LIMS does not touch: the receiving spreadsheet the sample techs keep open, the calculation workbook the chemists email back and forth, the review binder on a senior analyst's desk, the PDF report that collects a scanned signature on page three.
Each of these sits inside the result-generation path. Each is software or a record under the 2025 definitions. In most labs, none has been validated, assigned an owner, or given an audit trail.
A 36-Month Roadmap
A credible transition fits inside 36 months. Slower is safer. Faster is not available.
Gap assessment
Clause-by-clause comparison of current practice against the 2025 edition, conducted by someone not defending their own procedures. The output is a prioritized list of gaps with an honest read on which require new software rather than new paperwork.
Selection and validation plan
Inventory every piece of software in the result-generation path, including the spreadsheets and middleware most surveys politely ignore. Select the platforms that will consolidate the work, and build a validation master plan that maps each system to the clauses it touches.
Configure, validate, migrate
Stand up the new platforms, validate each against its plan, and migrate historical data with retention and traceability intact. Migration validation is its own workstream, and it is where most projects first discover their data is in worse shape than anyone admitted.
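At its core, migration validation is a record-by-record comparison, not a spot check. A minimal sketch of the idea, with invented record fields; a real run would read both sides from the legacy and new systems rather than a list literal:

```python
import hashlib

# Hypothetical migration check: every legacy record must arrive in the new
# system with its content and its original timestamp intact.
def fingerprint(record: dict) -> str:
    raw = f"{record['id']}|{record['value']}|{record['created']}"
    return hashlib.sha256(raw.encode()).hexdigest()

legacy = [
    {"id": "S-0001", "value": "2.31", "created": "2019-06-04T10:22Z"},
    {"id": "S-0002", "value": "7.95", "created": "2019-06-04T11:05Z"},
]
migrated = [dict(r) for r in legacy]  # stand-in for an export from the new system

mismatches = [
    a["id"] for a, b in zip(legacy, migrated) if fingerprint(a) != fingerprint(b)
]
print(len(legacy) == len(migrated) and not mismatches)  # True
```

Including the original `created` timestamp in the fingerprint is the part most cleanup-minded migrations skip, and it is where the findings come from.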
Parallel run and training
Operate new and legacy systems side by side on the same samples. Reconcile outputs until reconciliation stops producing surprises. Train analysts, reviewers, and quality managers while the old workflow still exists as a fallback.
Retire paper and reassessment prep
Close legacy workflows, archive legacy records under retention rules, and assemble the document pack the assessor will request. Schedule the reassessment so the certificate is in hand before September 30, 2028, not on it.
A lab that begins this work in month 30 will not finish. A lab that begins in month 34 will not have the option to finish. The three years do not flex at the back end.
Common Transition Pitfalls
Treating the LIMS as the only software in scope
The LIMS is usually the best-documented piece of software in the lab, which is why it is the easiest one to point at when an assessor asks for validation evidence. The findings come from the rest of the stack.
Relying on scanned signatures
A scanned signature bound to a PDF is an image attached to a document, not an electronic signature under the 2025 definition. The standard expects a signature that cannot be produced by anyone other than the named signatory, backed by a system log that can prove it.
Skipping data migration validation
Migration gets treated as a cleanup exercise when it is a validation exercise. Records that arrive in the new system without their original timestamps, their full audit trail, or their retention metadata carry a finding into the new accreditation period.
Starting in month 30
Every realistic timeline assumes the phases run in sequence. A lab that starts with six months on the clock will spend those months on a gap assessment and watch the deadline pass.
Your Next Move
If you haven't started, the next 90 days decide whether this transition is a project or a scramble. Commission an honest gap assessment, inventory every piece of software that touches a result, and pull the validation records you already have into one place. That alone will tell you whether month zero looks like cleanup or rebuild.
If you're mid-selection, pressure-test every vendor against Clause 7.11 directly. Ask for the audit trail they produce for a single sample. Ask how they bind a signature to a record. Ask what their migration validation evidence looks like. A vendor who answers in feature language is a vendor who will leave you defending the gap at reassessment.
If you're mid-validation, the work left is the work that is easy to defer: the spreadsheets, the middleware, the macros, the training records. Defer them into the parallel-run phase and they will define your next audit.
Wherever you are on the 36-month map, the decisions ahead are easier to make with someone who has seen the terrain. LabLynx has been building LIMS platforms for accredited labs since 1997, and the 2025 edition describes, clause by clause, the way the LabLynx LIMS was already built to operate.
A 45-minute conversation with a LabLynx compliance specialist will map your current workflow against every clause that changed in the 2025 edition and show you exactly where the gaps are. Bring your current evidence pack; leave with a short list of what matters most before your next reassessment.
Primary standards and accreditation body guidance referenced throughout. Last reviewed April 2026.

