The foundation for consistent and meaningful conducted immunity testing is defining the appropriate combination of amplifier, injection device, and test set-up. RF conducted immunity (CI) is the test method and standard that substitutes for radiated immunity (RI) testing at lower frequencies, where antenna size, near-field effects, and grounding interactions make RI testing inherently problematic. At these frequencies, radiated energy is more likely to find ingress on the equipment's input/output lines; consequently, CI methods and standards were developed to replicate the radiated fields that would couple onto equipment cabling. Defining the system components is critical; considering and applying the correct method accordingly will result in dependable CI testing.
How does radiated energy get induced onto a cable?
Any conductor can act as an antenna and allow energy to be induced upon it; it does so most efficiently when it is resonant at the given frequency. The free-space impedance of a plane wave is approximately 377 Ω. An effective antenna, or a test cable acting as one, must transform this impedance to induce a current onto its conductor. Antenna impedance varies from approximately 73 Ω for a half-wave dipole to roughly 280 Ω for a folded dipole (see Figure 1). Therefore, test methods have been established with levels based on lower impedance values, namely 50 and 150 Ω. Currents induced on multiple conductors may not be balanced; that is to say, their level and phase are not equal. All CI injection methods, however, are common-mode, applying equal level and phase to all conductors.
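The relationships above are easy to check numerically. The following is a minimal sketch (the function names and the 2 m example cable length are illustrative, not from any standard) that computes the free-space wave impedance from the vacuum constants and the frequency at which a cable of a given length becomes a quarter-wave resonant conductor:

```python
import math

# Physical constants (SI units)
MU0 = 4e-7 * math.pi          # permeability of free space, H/m
EPS0 = 8.8541878128e-12       # permittivity of free space, F/m
C = 299_792_458.0             # speed of light in vacuum, m/s

def free_space_impedance() -> float:
    """Characteristic impedance of a plane wave in free space,
    Z0 = sqrt(mu0 / eps0) ~ 377 ohms."""
    return math.sqrt(MU0 / EPS0)

def quarter_wave_resonance_hz(cable_length_m: float) -> float:
    """Lowest frequency at which a conductor of the given length
    is quarter-wave resonant: f = c / (4 * L)."""
    return C / (4.0 * cable_length_m)

if __name__ == "__main__":
    print(f"Free-space impedance: {free_space_impedance():.1f} ohms")
    # Example: a hypothetical 2 m test cable
    f = quarter_wave_resonance_hz(2.0)
    print(f"A 2 m cable is quarter-wave resonant near {f / 1e6:.1f} MHz")
```

A 2 m interconnect cable, for example, is quarter-wave resonant near 37.5 MHz, squarely in the frequency range where CI methods substitute for RI testing, which is why equipment cabling is so often the dominant coupling path at these frequencies.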