*Illustration of IR measurement*
The fundamental variation in insulation resistance (IR) readings of lithium battery packs is due to the following reasons:
- Battery Pack Design and Construction
- Voltage Level
- Environmental Conditions
- Contamination or Aging
- Measurement Path and Method
- Capacity/Size of the Pack
The minimum IR requirements for EV high-voltage systems are defined by the following standards:
- SAE J1766 (Society of Automotive Engineers),
- ISO 6469-3 (Electrically Propelled Road Vehicles – Safety Specifications),
- IEC 61851-23 (Electric Vehicle Conductive Charging System),
- UN ECE R100 (Uniform Provisions for Battery Electric Vehicles)
All of the above standards specify a minimum IR of 100 Ω/V for DC systems and 500 Ω/V for AC systems, measured between high-voltage (HV) components and the chassis. IR must remain above these thresholds during crash, vibration, thermal-shock, and environmental tests such as water ingress.
For an 800V system (e.g., the BYD Seal), the minimums work out as follows (a short calculation sketch appears after this list):
- DC side minimum IR: 100 Ω/V × 800 V = 80 kΩ.
- AC side minimum IR: 500 Ω/V × 800 V = 400 kΩ.
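A minimal sketch of this ohm-per-volt calculation, assuming the 100 Ω/V (DC) and 500 Ω/V (AC) figures above; the function name and structure are illustrative only:

```python
# Minimum insulation resistance from the ohm-per-volt rule described above.
OHMS_PER_VOLT_DC = 100   # minimum ohm/V for DC circuits
OHMS_PER_VOLT_AC = 500   # minimum ohm/V for AC circuits

def minimum_ir_ohms(system_voltage_v: float) -> dict:
    """Return the minimum allowed IR (in ohms) for the DC and AC sides."""
    return {
        "dc_min_ohms": OHMS_PER_VOLT_DC * system_voltage_v,
        "ac_min_ohms": OHMS_PER_VOLT_AC * system_voltage_v,
    }

if __name__ == "__main__":
    limits = minimum_ir_ohms(800)  # 800 V pack, as in the example above
    print(f"DC minimum: {limits['dc_min_ohms'] / 1e3:.0f} kΩ")  # -> 80 kΩ
    print(f"AC minimum: {limits['ac_min_ohms'] / 1e3:.0f} kΩ")  # -> 400 kΩ
```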
EV chargers must maintain IR > 1 MΩ under normal conditions to prevent leakage to the grid or chassis, ensuring safe interaction between the vehicle and the charging infrastructure. Diagnostic monitoring devices must detect IR faults during charging.
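A hedged sketch of how such a charging-side guard could look; `read_isolation_megohm` and `stop_charging` are hypothetical placeholders, not a real charger API:

```python
import time

CHARGER_MIN_IR_MOHM = 1.0  # > 1 MΩ required under normal conditions (see text above)

def monitor_charging_ir(read_isolation_megohm, stop_charging, poll_s: float = 1.0):
    """Poll the isolation reading during a charge session and abort the session
    if it drops below the 1 MΩ floor. Both callables are hypothetical hooks."""
    while True:
        ir_mohm = read_isolation_megohm()
        if ir_mohm < CHARGER_MIN_IR_MOHM:
            stop_charging()
            raise RuntimeError(f"Isolation fault during charging: {ir_mohm:.2f} MΩ < 1 MΩ")
        time.sleep(poll_s)
```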
EV manufacturers such as Tesla and BYD calibrate their Battery Management Systems (BMS) and diagnostic tools to monitor IR in real time, typically displaying values in megohms (MΩ). Healthy systems show IR values significantly above the minimum requirements, often 1.5 MΩ to 5 MΩ, as seen through diagnostic tools such as CANalyzer and PEAK CAN with monitoring software.
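A minimal sketch of the kind of threshold check a BMS might apply to a live IR reading, assuming the 80 kΩ regulatory floor from the 800 V example and the 1.5 MΩ "healthy" figure above; the band names are illustrative:

```python
DC_REGULATORY_FLOOR_KOHM = 80.0   # 100 Ω/V × 800 V, from the example above
HEALTHY_THRESHOLD_KOHM = 1500.0   # ~1.5 MΩ, the lower end of typical healthy readings

def classify_ir(ir_kohm: float) -> str:
    """Map a measured isolation resistance (kΩ) to an illustrative status band."""
    if ir_kohm < DC_REGULATORY_FLOOR_KOHM:
        return "FAULT: below regulatory minimum"
    if ir_kohm < HEALTHY_THRESHOLD_KOHM:
        return "WARNING: legal but degraded insulation"
    return "OK: healthy insulation"

print(classify_ir(3200.0))  # OK: healthy insulation (3.2 MΩ)
print(classify_ir(250.0))   # WARNING: legal but degraded insulation
print(classify_ir(40.0))    # FAULT: below regulatory minimum
```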
Connecting the megohmmeter between the positive terminal and chassis ground, or between the negative terminal and chassis ground, and applying a DC test voltage of 500 V or 1000 V will show the resistance value in either kΩ or MΩ depending on the fault condition. Higher resistance means there is no power leakage and the IR is OK.
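A small sketch of how the two megohmmeter readings (positive-to-chassis and negative-to-chassis) might be combined, assuming the lower of the two readings governs; the function name and thresholds are illustrative:

```python
def evaluate_megohmmeter_readings(pos_to_chassis_mohm: float,
                                  neg_to_chassis_mohm: float,
                                  min_required_mohm: float = 0.08) -> str:
    """Report the governing (lower) reading against a required minimum.
    0.08 MΩ (80 kΩ) is the DC-side floor from the 800 V example above."""
    worst = min(pos_to_chassis_mohm, neg_to_chassis_mohm)
    side = "positive" if pos_to_chassis_mohm <= neg_to_chassis_mohm else "negative"
    if worst < min_required_mohm:
        return f"IR fault on {side} side: {worst:.3f} MΩ < {min_required_mohm} MΩ"
    return f"IR OK: worst-case {worst:.2f} MΩ ({side} side)"

print(evaluate_megohmmeter_readings(4.2, 3.8))   # healthy pack
print(evaluate_megohmmeter_readings(0.05, 3.8))  # leakage on the positive side
```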