Contact Resistance Tester – Selection Guide
Selecting a contact resistance tester comes down to matching the test current, accuracy and range, measurement method, portability, and feature set to the application, while complying with standards such as DL/T 845.4-2022.

1. Core Parameter Selection
1.1 Test Current (Most Critical)
Standard Requirement: a DC test current of ≥100 A (per DL/T 845.4-2022), high enough to break through oxide films on the contacts so the reading reflects the true contact resistance.
Selection Recommendations:
Conventional switches / circuit breakers / isolating switches: 100 A (preferred for general use)
GIS / new equipment acceptance / high-precision applications: 200 A / 300 A
Large busbars / long cable joints: 400 A – 600 A
Portable / low voltage / light load: 50 A / 100 A handheld type
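The selection table above can be sketched as a simple lookup. The category names and the mapping below are illustrative labels chosen for this example, not terms defined in any standard; the ampere values come from the recommendations above.

```python
# Minimum recommended DC test current per application category.
# Category names are hypothetical; values mirror the guidance above.
RECOMMENDED_TEST_CURRENT_A = {
    "conventional_switchgear": 100,  # switches, circuit breakers, isolators
    "gis_or_acceptance": 200,        # 200 A (or 300 A) for high precision
    "large_busbar": 400,             # 400-600 A for busbars / long cable joints
    "handheld_light_load": 50,       # 50 A or 100 A portable units
}

def recommended_current(category: str) -> int:
    """Return the minimum recommended DC test current in amperes."""
    try:
        return RECOMMENDED_TEST_CURRENT_A[category]
    except KeyError:
        raise ValueError(f"unknown category: {category!r}")
```

For example, `recommended_current("conventional_switchgear")` returns 100, the general-purpose baseline required by the standard.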
1.2 Measurement Range and Accuracy
Range: mainstream instruments cover 0.1 μΩ – 2000 μΩ, which spans the vast majority of switch circuits
Resolution: 0.1 μΩ or finer for conventional work; 0.01 μΩ or finer for low-resistance (<10 μΩ) measurements
Accuracy Class:
Preventive testing: ±(0.5% of reading + 2 digits)
Handover / factory acceptance: ±(0.2% of reading + 1 digit)
Laboratory / calibration: ±0.1% of reading
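A "±(x% of reading + N digits)" specification can be turned into a worst-case error bound, where one "digit" equals one count of the display resolution. The helper below is a sketch of that arithmetic (the function name and the 50 μΩ example reading are illustrative, not from the guide):

```python
def max_error_uohm(reading_uohm: float, pct_of_reading: float,
                   digits: int, resolution_uohm: float) -> float:
    """Worst-case error in uOhm for a spec of +/-(pct% of reading + N digits).

    One 'digit' is one count of the instrument's display resolution.
    """
    return reading_uohm * pct_of_reading / 100 + digits * resolution_uohm

# Preventive-testing spec +/-(0.5% of reading + 2 digits), 0.1 uOhm
# resolution, on a 50.0 uOhm reading:
err = max_error_uohm(50.0, 0.5, 2, 0.1)
# 0.25 uOhm (percentage term) + 0.2 uOhm (digit term) = 0.45 uOhm
```

This shows why the digit term matters at the low end of the range: on a 5 μΩ reading the 2-digit term (0.2 μΩ) dominates the 0.5% term (0.025 μΩ).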
1.3 Measurement Method (Essential)
Must support the Kelvin four-wire method: separate current and voltage leads eliminate the influence of lead and clamp contact resistance on the reading
Wiring principle: current leads on the outside, voltage (sense) leads on the inside, closest to the point under test
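Why four-wire matters can be seen with a small circuit model. In a two-wire measurement the voltmeter sees the device under test (DUT) plus both current leads; in a four-wire measurement the sense leads carry essentially no current, so the sensed voltage drops only across the DUT. The simulation below is a minimal sketch with assumed example values (100 A, 50 μΩ contact, 5 mΩ leads):

```python
def kelvin_measurement(i_test_a: float, r_dut_ohm: float,
                       r_lead_ohm: float) -> tuple[float, float]:
    """Compare two-wire and four-wire resistance readings.

    Two-wire: sensed voltage includes both current leads in series.
    Four-wire: high-impedance sense leads measure only the DUT drop.
    Returns (two_wire_reading_ohm, four_wire_reading_ohm).
    """
    v_two_wire = i_test_a * (r_dut_ohm + 2 * r_lead_ohm)
    v_four_wire = i_test_a * r_dut_ohm
    return v_two_wire / i_test_a, v_four_wire / i_test_a

# 50 uOhm contact measured at 100 A through 5 mOhm leads:
r2, r4 = kelvin_measurement(100.0, 50e-6, 5e-3)
# r2 = 0.01005 ohm (lead-dominated, ~200x too high)
# r4 = 0.00005 ohm (the true contact resistance)
```

With micro-ohm targets and milli-ohm leads, the two-wire reading is swamped by the leads, which is why the four-wire method is non-negotiable here.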
2. Scenario and Function Selection
2.1 Portability and Power Supply
On-site / work-at-height / mobile applications: handheld or portable units (≤5 kg) with lithium battery power (battery life ≥8 hours)
Fixed / laboratory applications: benchtop models with AC power supply for higher output
2.2 Anti-Interference and Stability
Strong electromagnetic environments: choose models with power-frequency interference rejection and automatic current stabilization
Long-distance testing: high current + low drift design
2.3 Data and Operation
Prefer models offering data storage, USB / Bluetooth export, on-board printing, and automatic report generation