
By: SMT Field Service Lead (10+ years; cross-border dispatch and on-site recovery)
If a placement head is down or a reflow zone drifts out of spec, every minute costs throughput, FPY, and confidence. That’s why your top buying risk isn’t just machine specs—it’s the quality of SMT after-sales service. This scorecard turns service into numbers you can audit, compare, and contract for across regions when sourcing SMT equipment from China.
Key takeaways
Make service network coverage your first gate. Require hard evidence (addresses, coverage maps, dispatch logs) before discussing SLAs or price.
Score vendors with weighted criteria and minimum acceptance thresholds; fail any vendor that misses your non-negotiables even if the total score looks good.
Bind KPIs to contract language and reporting cadence. Evidence beats promises; require logs, not just “24/7” claims.
How the scorecard works
Scale: Score each checklist item from 0–5 (0 = no evidence; 3 = meets minimum; 5 = exceeds target with proof).
Weights: Seven pillars with indicative weights below; adjust for your factory footprint and risk tolerance.
Pass/fail rule: Overall pass ≥70/100 with no pillar below its acceptance gate. A single red-flag fail on P1 coverage or safety/security is a no-go.
Evidence rule: Scores require verifiable artifacts (addresses, logs, rosters, protocols). References alone are insufficient.
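The scale, weights, and pass/fail rule above can be sketched as a small scoring routine. The pillar weights mirror the master scorecard later in this article; the per-pillar gate values (a minimum pillar score out of 100) and the sample item scores are illustrative assumptions, not contractual numbers.

```python
# Sketch of the pass/fail rule: weighted total >= 70 AND every pillar at or
# above its gate AND no red-flag override. Gate values here are assumptions.

PILLARS = {
    # name: (weight %, assumed minimum pillar score out of 100)
    "coverage":   (25, 60),
    "sla":        (20, 60),
    "spares":     (15, 60),
    "engineers":  (10, 50),
    "install":    (10, 50),
    "software":   (10, 50),
    "compliance": (10, 50),
}

def pillar_score(item_scores):
    """Convert 0-5 checklist item scores into a 0-100 pillar score."""
    return 100.0 * sum(item_scores) / (5 * len(item_scores))

def evaluate(vendor_scores, red_flags=()):
    """vendor_scores: {pillar: [item scores 0-5]}. Returns (total, verdict)."""
    total = 0.0
    gates_ok = True
    for name, (weight, gate) in PILLARS.items():
        p = pillar_score(vendor_scores[name])
        total += p * weight / 100.0
        if p < gate:
            gates_ok = False
    if red_flags:  # e.g. a falsified address is an automatic no-go
        return total, "FAIL (red flag)"
    return (total, "PASS") if total >= 70 and gates_ok else (total, "FAIL")
```

Note how a vendor with a strong total still fails if any single pillar misses its gate, or if a red flag is raised, which is exactly the override behavior the scorecard calls for.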
Service network coverage (weight 25%)
Primary lens: Can the vendor physically reach each of your plants fast enough, across time zones and holidays, with certified people and tools? Treat this as your hard filter for SMT after-sales service.
Coverage formula
Coverage Ratio (Tier X) = (Number of buyer sites reachable on-site within X hours) / (Total buyer sites)
Tiers to model: Local ≤8h; Regional ≤24h; International ≤72h
Target ranges to start negotiations
Coverage Ratio: ≥80% of sites within 24 hours; ≥95% within 72 hours.
Time-zone overlap for Tier-1: ≥8 business hours per site; 24/7 hotline for P1 incidents.
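As a worked example, the coverage formula and the negotiation targets above can be checked against a site list in a few lines. The travel times below are invented for illustration; in practice they would come from the vendor's coverage map and dispatch logs.

```python
# Coverage Ratio (Tier X) = sites reachable on-site within X hours / total sites.

def coverage_ratios(onsite_hours, tiers=(8, 24, 72)):
    """onsite_hours: fastest credible on-site time (hours) per buyer site."""
    n = len(onsite_hours)
    return {t: sum(h <= t for h in onsite_hours) / n for t in tiers}

sites = [6, 10, 20, 30, 70, 90]   # hypothetical plants, hours to reach each
ratios = coverage_ratios(sites)

# Gate check against the negotiation targets above (>=80% within 24h,
# >=95% within 72h); this hypothetical vendor misses both.
passes = ratios[24] >= 0.80 and ratios[72] >= 0.95
```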
Evidence to request
Office/authorized partner list with full street addresses and contacts; coverage map with SLA tiers.
Anonymized 12-month dispatch logs showing severity, dispatch time, GPS on-site check-in, and closure.
Holiday/on-call rosters, surge capacity plan for NPI or line-down events.
Cross-border readiness: visa lead times and an ATA Carnet playbook for service tools. The International Chamber of Commerce describes the carnet as a “passport for goods” that speeds duty-free temporary imports, which supports on-site SLAs across borders; see the ICC’s ATA Carnet solution overview.
Why interoperability helps
Vendors aligned with line and factory data standards can diagnose faster. For example, IPC’s CFX overview of smart manufacturing enablement and the IPC-Hermes-9852 v1.4 table of contents show how machine-to-machine and factory messages improve traceability and context for triage.
Red flags to watch
Partner addresses that do not validate in public registries/maps; no proof of on-site arrival logs; one-person “coverage” for vast regions; no holiday or visa plan.
Master scorecard (weights and acceptance gates)
| Pillar | Weight | Minimum acceptance gate | What evidence looks like |
|---|---|---|---|
| Service network coverage | 25% | ≥80% sites ≤24h; ≥95% ≤72h; Tier-1 overlap ≥8h | Addressed partner list, coverage map, 12-month dispatch logs |
| Response and repair SLA | 20% | P1 acknowledge ≤1h; remote diagnostics start ≤2h; on-site: 8h/24h/72h bands | Ticketing timestamps, remote session logs, GPS check-ins |
| Spare parts assurance | 15% | ≥90% critical SKU fill rate in nearest overseas stock; P90 lead time ≤7–10 days | Warehouse SKU snapshot, ATP for blinded SKUs, brokerage SLAs |
| Engineer capability | 10% | FTR target ≥70% with audit trail; active safety/certification registry | Skills matrix, certification records, closed work orders |
| Installation and training | 10% | IQ/OQ/PQ completed per protocol; training pass-rate tracked | Signed IQ/OQ/PQ reports, training assessments, SOPs |
| Software and remote support | 10% | Secure remote access with MFA/VPN and logging; fix-without-visit KPI tracked | Remote access policy, release notes, integration docs |
| Compliance and documentation | 10% | Retain service records ≥3–5 years; calibration and CAPA documented | QMS procedures, sample reports, certs |
Scoring tip: Keep a “red flag” override. For example, any falsified address or absence of safety training should block award regardless of totals.
Response and repair SLAs (weight 20%)
What you want is clarity by severity, with evidence. Acknowledge fast, start remote diagnostics within two hours, and get on-site within tiered targets. Structure SLAs with the service management discipline framed by ISO/IEC 20000; see the service-level management examples in the ISO/IEC 20000-1 in action guide.
KPIs to contract
P1 response: acknowledge ≤1 hour, remote diagnostics start ≤2 hours.
On-site arrival: Local ≤8h; Regional ≤24h; International ≤72h.
MTTR (target window): 24–48 hours for P1, subject to parts availability and escalation.
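The P1 KPIs above are easy to audit mechanically once you have ticket-system exports. The sketch below assumes hypothetical field names (`opened`, `acknowledged`, `remote_start`, `onsite`, `tier`); map them to whatever the vendor's export actually provides.

```python
# Sketch: verify P1 SLA compliance from ticket timestamps.
from datetime import datetime, timedelta

P1_SLA = {
    "acknowledge": timedelta(hours=1),
    "remote_start": timedelta(hours=2),
    "onsite": {"local": timedelta(hours=8),
               "regional": timedelta(hours=24),
               "international": timedelta(hours=72)},
}

def check_p1(ticket):
    """ticket: dict of ISO-8601 timestamps plus a 'tier' key. Returns misses."""
    t = {k: datetime.fromisoformat(v) for k, v in ticket.items() if k != "tier"}
    misses = []
    if t["acknowledged"] - t["opened"] > P1_SLA["acknowledge"]:
        misses.append("acknowledge")
    if t["remote_start"] - t["opened"] > P1_SLA["remote_start"]:
        misses.append("remote_start")
    if t["onsite"] - t["opened"] > P1_SLA["onsite"][ticket["tier"]]:
        misses.append("onsite")
    return misses
```

Run this over a month of exported P1 tickets and the miss list per ticket becomes the input to the RCA/CAPA trigger in the clause starter below.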
Evidence to collect
Ticket system exports with timestamps; remote session logs; GPS on-site check-ins; escalation ladder with timings and owner names.
Clause starter
P1 (line-down): acknowledge ≤1h; remote diagnostics start ≤2h; on-site arrival Local ≤8h / Regional ≤24h / International ≤72h. Monthly SLA reports required; misses trigger joint RCA and CAPA.
Spare parts assurance (weight 15%)
Parts logistics decide whether “target MTTR” is real. Require proof of overseas warehouses, critical SKU fill rates, and percentile lead times. For a practical list of what to ask vendors to include in quotations (warranty, training, local service support, spares expectations), see the contextual checklist ideas in the company’s page on vacuum reflow oven quotation for high‑volume factories.
KPIs to contract
Critical SKU fill rate in nearest overseas stock ≥90%; lead time percentiles: P50 ≤3–5 days, P90 ≤7–10 days.
Advance-exchange availability for selected assemblies; consignment or site safety stock for remote plants.
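Fill rate and the P50/P90 gates above can be verified from a vendor's fulfillment history rather than taken on trust. The sample shipment data below is invented; a nearest-rank percentile is used for simplicity.

```python
# Sketch: fill rate and lead-time percentiles from fulfillment records.
import math

def percentile(sorted_vals, p):
    """Nearest-rank percentile (p in 0-100) over an ascending-sorted list."""
    k = max(0, math.ceil(p / 100 * len(sorted_vals)) - 1)
    return sorted_vals[k]

lead_times = sorted([2, 3, 3, 4, 5, 6, 7, 9, 10, 14])  # days, per shipment
filled_from_stock = 9   # shipments served from nearest overseas warehouse
total_orders = 10

fill_rate = filled_from_stock / total_orders
p50, p90 = percentile(lead_times, 50), percentile(lead_times, 90)

# Gate check against the KPIs above (fill rate >=90%, P50 <=5d, P90 <=10d).
meets_gate = fill_rate >= 0.90 and p50 <= 5 and p90 <= 10
```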
Evidence to collect
Warehouse location list and time-stamped SKU snapshot; blinded-ATP test on five critical SKUs; customs/brokerage SLAs; obsolescence roadmap for key assemblies.
Engineer capability (weight 10%)
More than headcount, you need cross-domain skills and a first-time fix rate with an audit trail. Industry field-service benchmarks often cite a median FTR around the high-80s; use that as directional context and set your threshold with evidence, as discussed in the TSIA field-service KPIs explainer.
KPIs to contract
First-time fix rate ≥70% target with quarterly reporting.
Active certification registry (process/software/electrical), average training hours/engineer/year, and safety training currency.
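The FTR audit trail the KPI above demands reduces to a simple count over closed work orders. The record shape here (`order_id`, `revisit`) is an assumption; the defining question is whether a follow-up visit was needed for the same issue.

```python
# Sketch: first-time fix rate from closed work orders. A work order counts
# as a first-time fix if no revisit was required for the same issue.

def first_time_fix_rate(work_orders):
    fixed_first = sum(1 for w in work_orders if not w["revisit"])
    return fixed_first / len(work_orders)

quarter = [  # invented sample for one reporting quarter
    {"order_id": "WO-101", "revisit": False},
    {"order_id": "WO-102", "revisit": True},
    {"order_id": "WO-103", "revisit": False},
    {"order_id": "WO-104", "revisit": False},
]
ftr = first_time_fix_rate(quarter)   # 0.75, above the 70% contract target
```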
Evidence to collect
Skills matrix, certification copies with expiry dates, closed work orders showing FTR, mentoring/shadowing plans, language capability for local sites.
Installation, commissioning, and training (weight 10%)
Your acceptance criteria live in IQ/OQ/PQ. Tie handover to signed protocols, golden profiles, and operator/maintenance training pass-rates. Reference widely used qualification language from FDA/ISPE-aligned practices when shaping your protocols.
KPIs to contract
IQ/OQ/PQ completion per agreed protocol and schedule; ramp-to-rate achieved within the planned window.
Training completion and pass rates tracked for operators, technicians, and programmers.
Evidence to collect
Signed IQ/OQ/PQ reports; golden profile package; training rosters with assessments; maintenance SOPs and schedules.
Software and remote support (weight 10%)
Secure remote access accelerates triage and raises the fix-without-visit rate—provided it’s controlled and logged. For industrial environments, align with practices like NIST’s guidance in the SP 1800-10B remote access practice guide. Interoperability with factory systems speeds evidence and diagnosis; see IPC’s CFX overview and the Hermes 9852 v1.4 TOC for context on data flows.
KPIs to contract
Remote diagnostic start time ≤2h for P1, fix-without-visit rate tracked and improved quarterly.
Documented firmware lifecycle with release cadence, rollback method, and vulnerability patch SLA.
MES/traceability integration lead-time and compatibility (CFX/Hermes/SMEMA interfaces declared).
Evidence to collect
Remote access policy (MFA/VPN, session recording, logs retention), release notes with issue IDs, API/interface documentation, integration case logs.
Compliance and audit documentation (weight 10%)
Regulated sectors expect retention, traceability, and CAPA discipline. Keep the bar clear and auditable.
KPIs to contract
Retain service logs and records for ≥3–5 years aligned with your QMS.
CAPA closure within agreed timelines; calibration certificates for instruments used on-site.
Evidence to collect
ISO/QMS certificates; document control procedures; anonymized sample service reports; calibration certificates; re-audit closure examples.
A neutral example of evidence request workflow
As a practical illustration, suppose you request documentation from a vendor such as S&M Co.Ltd. Keep it neutral and evidence-led. Ask for: a) an addressed list of overseas offices and authorized service partners plus a coverage map with 8h/24h/72h tiers; b) a time-stamped snapshot of the nearest overseas warehouse critical SKU list with stock quantities; and c) a 12-month anonymized dispatch log showing severity tags, dispatch and arrival timestamps (with GPS check-in), engineer ID, parts-in-hand flag, and closure notes. You’re not asking for performance boasts—you’re collecting artifacts to score the SMT after-sales service against this checklist.
Contract clause starters you can adapt
Coverage commitment: Vendor guarantees ≥80% of Buyer sites are reachable on-site ≤24h and ≥95% ≤72h; provide notified updates to office/partner lists within 10 business days of change.
SLA by severity: P1 (line-down) acknowledge ≤1h; remote diagnostics start ≤2h; on-site arrival Local ≤8h / Regional ≤24h / International ≤72h. Monthly SLA scorecards; misses trigger RCA/CAPA.
Spare parts: Maintain ≥90% critical SKU fill rate in nearest overseas stock; disclose P50/P90 lead times monthly; offer advance-exchange for designated assemblies; provide consignment/safety stock options for remote sites.
Remote support and security: All remote sessions via MFA-protected VPN with session recording and log retention; publish firmware lifecycle and vulnerability patch SLA; provide session logs on request; declare CFX/Hermes/SMEMA interface support.
Next steps
Download and adapt the weighted scorecard to your site list; run a pilot RFI with two vendors and validate evidence before shortlisting. To see how vendors disclose partner networks, review a typical distributor disclosure format such as this global distributor network example.
If your procurement culture tends to overweight price, share this cautionary analysis on the top risks of choosing an SMT supplier based only on price; re-balance your scoring model before issuing the RFP.
—
Frequently asked
Q: Is a 24–48 hour MTTR realistic globally? A: Sometimes. It depends on spare parts placement and customs clearance. That’s why consigned stock or advance-exchange and a documented brokerage SLA matter; otherwise, MTTR targets are just talk.
Q: Where should we start if we have limited data? A: Start with coverage evidence and hotline tests. If a vendor can’t prove reach and answer the phone reliably, deeper SLAs won’t save you.
Last updated: 2026-03-09 | Version: 1.0
