Contact resistance testing precisely measures the resistance across electrical connections such as circuit breaker contacts, switches, and busbar joints. The goal is to ensure these points carry current efficiently without generating excessive heat from high resistance, which could lead to power loss, malfunction, or failure.
Key Steps in Contact Resistance Testing:
Setup and Connection:
- The test device, usually a micro-ohmmeter, is connected to the contacts with four leads: two for current injection and two for measuring voltage.
- This four-wire method (or Kelvin method) isolates the measurement from test lead resistance, enabling high accuracy.
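A small numeric sketch illustrates why the four-wire arrangement matters. All values here are hypothetical, chosen only to show that the two-wire result is dominated by lead resistance while the four-wire result recovers the contact value:

```python
# Illustrative sketch (not any tester's firmware): comparing a two-wire
# measurement, where lead resistance is included, with a four-wire (Kelvin)
# measurement, where separate potential leads sense only the contact.

def two_wire_reading(r_contact, r_leads, i_test):
    # Two-wire: the voltage is sensed across the leads AND the contact,
    # so lead resistance inflates the result.
    v = i_test * (r_contact + r_leads)
    return v / i_test

def four_wire_reading(r_contact, r_leads, i_test):
    # Four-wire: the potential leads sense the drop across the contact
    # alone; negligible current flows in the sense leads.
    v = i_test * r_contact
    return v / i_test

R_CONTACT = 50e-6   # 50 micro-ohms: hypothetical breaker contact
R_LEADS = 20e-3     # 20 milliohms of test-lead resistance
I_TEST = 100.0      # 100 A injection current

print(two_wire_reading(R_CONTACT, R_LEADS, I_TEST))   # dominated by the leads
print(four_wire_reading(R_CONTACT, R_LEADS, I_TEST))  # the true contact value
```

The contrast shows why two-wire ohmmeters are unsuitable at the micro-ohm level: the leads alone can be hundreds of times larger than the resistance under test.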
Current Injection and Voltage Measurement:
- A controlled DC current is injected through the contacts via the current leads.
- The potential leads measure the voltage drop across the contacts.
- Using Ohm's Law (Resistance = Voltage / Current), the micro-ohmmeter calculates the contact resistance.
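The calculation the instrument performs can be sketched in a few lines. The current and voltage values below are hypothetical examples, not readings from any particular tester:

```python
# Minimal sketch of the Ohm's-law calculation a micro-ohmmeter performs:
# R = V / I, with the voltage drop sensed across the contact.

def contact_resistance(v_drop, i_injected):
    if i_injected == 0:
        raise ValueError("injection current must be nonzero")
    return v_drop / i_injected

# Example: 100 A injected, 4.2 mV measured across the contact.
r = contact_resistance(4.2e-3, 100.0)
print(f"{r * 1e6:.1f} micro-ohms")  # 42.0 micro-ohms
```

Because healthy contacts are in the micro-ohm range, the voltage drop is tiny, which is why a high, well-controlled injection current is needed to produce a measurable signal.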
Thermal EMF Mitigation:
- Thermal EMFs (thermocouple effects) can interfere with readings. The tester may use polarity reversal or averaging to remove these errors from the measurement, ensuring accurate results.
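The polarity-reversal idea above can be sketched with a simple model. The assumption here is that the thermal EMF adds a fixed offset voltage that does not flip sign when the current direction reverses, while the IR drop does:

```python
# Sketch of polarity-reversal cancellation of thermal EMF. Assumed model:
# the EMF is a fixed offset, so averaging the forward and reversed
# readings removes it and leaves only the true IR drop.

def emf_compensated_resistance(i_test, r_true, v_emf):
    # Forward reading: IR drop plus the EMF offset.
    v_forward = i_test * r_true + v_emf
    # Reversed current: the IR drop flips sign, the EMF offset does not.
    v_reverse = -i_test * r_true + v_emf
    # Half the difference cancels the offset.
    return (v_forward - v_reverse) / (2 * i_test)

# Hypothetical values: 10 A test current, 75 micro-ohm contact,
# 2 microvolts of thermal EMF.
print(emf_compensated_resistance(10.0, 75e-6, 2e-6))  # recovers 75 micro-ohms
```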
Re-testing with Higher Current (if needed):
- If testing at low current produces unreliable (higher-than-expected) resistance readings, re-testing at a higher current can overcome minor issues such as surface oxidation or inconsistent connections.
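A decision rule for when to retest could look like the sketch below. The baseline value and 20% margin are illustrative assumptions, not values from any standard; in practice the acceptance criteria come from the manufacturer or the test specification:

```python
# Hypothetical retest logic: flag a higher-current retest when the
# low-current reading exceeds the expected baseline by a set margin.
# The 20% margin is an illustrative assumption only.

def needs_high_current_retest(measured_ohms, baseline_ohms, margin=0.20):
    # Retest when the reading is more than `margin` above baseline,
    # which may indicate a surface oxide film rather than a bad joint.
    return measured_ohms > baseline_ohms * (1 + margin)

baseline = 50e-6             # expected ~50 micro-ohms for this joint
low_current_reading = 68e-6  # suspiciously high low-current reading
print(needs_high_current_retest(low_current_reading, baseline))  # True
```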
Trend Analysis and Recordkeeping:
- Consistent measurement conditions are essential for reliable trend analysis. By taking periodic measurements under the same conditions, operators can track changes in resistance over time to predict maintenance needs.
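A minimal trend check over periodic readings might look like this. The reading history and 100 micro-ohm alarm level are hypothetical; real alarm limits depend on the equipment and the applicable maintenance standard:

```python
# Illustrative trend analysis over periodic measurements taken under the
# same conditions; flags the joint when resistance drifts past an alarm
# level. The alarm threshold here is an assumption, not a standard value.

def trend(readings_uohm, alarm_uohm=100.0):
    # Returns (latest reading, percent drift vs first reading, alarm flag).
    first, latest = readings_uohm[0], readings_uohm[-1]
    pct = (latest - first) / first * 100
    return latest, pct, latest > alarm_uohm

# Quarterly readings in micro-ohms (hypothetical):
history = [52.0, 55.0, 61.0, 78.0, 112.0]
latest, pct, alarm = trend(history)
print(f"latest={latest} uohm, drift={pct:.0f}%, alarm={alarm}")
```

A steadily rising trend like this one, even before any single reading fails outright, is the signal that lets maintenance be scheduled before the joint overheats.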
Contact resistance testing is vital for identifying and preventing potential faults that can lead to inefficiency, overheating, or equipment failure.
Nov 09, 2024