Tolerance setting

Discussion and questions regarding the Radiated Immunity and Conducted Immunity modules of RadiMation.
labemcteseo
Posts: 9
Joined: 07 Oct 2011, 15:16

Tolerance setting

Post by labemcteseo »

Hi all,
I would like to know exactly how the Tolerance setting is treated when an immunity test is performed with regulation on a field probe, current, or voltage reading.

For example: when performing a BCI test at 100 dBuA closed loop (with current reading), if the tolerance is 0.2 dB, the software will regulate the signal generator level so that the reading is between 99.8 and 100.2 dBuA.

But when a radiated immunity test is performed, V/m should be taken into account instead of dBuA. How is the conversion dB --> V/m performed? That is, how much is 0.2 dB of tolerance when dealing with V/m?

Every customer asks us this question.

Thanks a lot!
joro
Posts: 440
Joined: 24 Aug 2011, 09:55

Re: Tolerance setting

Post by joro »

Hi. You are almost correct regarding your example:
For example: when performing a BCI test at 100 dBuA closed loop (with current reading), if the tolerance is 0.2 dB, the software will regulate the signal generator level so that the reading is between 99.8 and 100.2 dBuA.
In your example the software will regulate the signal generator level in order to achieve 100.0 - 100.2 dBuA. The software will always try to achieve AT LEAST the specified test level, so it regulates between "(the specified test-level)" and "(the specified test-level + the specified tolerance)".
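The regulation rule described above can be sketched in a few lines of Python (this is an illustration only, not RadiMation's actual code; the function name is an assumption):

```python
# Sketch of the regulation window for a test level specified in a
# logarithmic unit (e.g. dBuA). The software always tries to reach at
# least the test level, so the lower bound is the test level itself.
def regulation_window_log(test_level_db, tolerance_db):
    """Return the (lower, upper) bounds the level is regulated between."""
    return (test_level_db, test_level_db + tolerance_db)

low, high = regulation_window_log(100.0, 0.2)
print(low, "-", high, "dBuA")  # 100.0 - 100.2 dBuA
```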

Now your real question:
But when a radiated immunity test is performed, V/m should be taken into account instead of dBuA. How is the conversion dB --> V/m performed?
When the test level is specified in a linear unit (like V/m, mA, Vrms, A, etc.) and not in a logarithmic unit (like dBuA, dBV/m, dBVrms, etc.), the tolerance is still specified in dB. In that situation the software still regulates the test level between "(the specified test-level)" and "(the specified test-level + the specified tolerance)". However, the calculation of "(the specified test-level + the specified tolerance)" is more complicated, because it combines a linear and a logarithmic component. The software calculates the value correctly by converting the linear test level to the corresponding logarithmic value, adding the tolerance, and converting the result back.

For example: when a test level of 10 V/m is specified with a tolerance of 0.2 dB:
10 V/m corresponds to 20 dBV/m. When the tolerance of 0.2 dB is added, the top of the regulation window is at 20.2 dBV/m (which corresponds to 10.23 V/m). So in this example the software will regulate the test level between 10 V/m and 10.23 V/m.
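The conversion above can be sketched as follows (a minimal illustration, assuming a field-strength unit so that dB values are 20·log10 of the linear value; the function name is an assumption, not RadiMation's API):

```python
import math

# Apply a dB tolerance to a test level given in a linear unit (V/m):
# convert to dB, add the tolerance, convert back to the linear unit.
def regulation_window_linear(test_level, tolerance_db):
    """Return the (lower, upper) regulation bounds in the linear unit."""
    level_db = 20.0 * math.log10(test_level)   # 10 V/m -> 20 dBV/m
    upper_db = level_db + tolerance_db         # 20 dBV/m + 0.2 dB = 20.2 dBV/m
    upper = 10.0 ** (upper_db / 20.0)          # 20.2 dBV/m -> ~10.23 V/m
    return (test_level, upper)

low, high = regulation_window_linear(10.0, 0.2)
print(f"{low} V/m .. {high:.2f} V/m")  # 10.0 V/m .. 10.23 V/m
```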

I hope this clarifies how dB tolerance values are used in combination with test levels specified in linear units.
joro

Re: Tolerance setting

Post by joro »

There is a big advantage to specifying the tolerance in dB: it results in a constant relative regulation accuracy, independent of the magnitude of the specified test level.

Going back to the example of the previous post:
For example: When a test-level is specified of 10 V/m, and a tolerance of 0.2 dB.
10 V/m corresponds with 20 dBV/m. When the tolerance of 0.2 dB is added, the top of the regulation window is at 20.2 dBV/m (which corresponds to 10.23 V/m). So in this example the software will regulate the test level between 10 V/m and 10.23 V/m.
The 0.2 dB tolerance thus adds a 0.23 V/m margin on top of the test level, which is 2.3% of the specified test level.

However, when a test level of 100 V/m is requested, the values become:
100 V/m corresponds to 40 dBV/m. With the tolerance added, the top of the regulation window is at 40.2 dBV/m, which corresponds to 102.3 V/m. In this second example the 0.2 dB tolerance adds 2.3 V/m on top of the test level. This is logical, because it is again 2.3% of the specified test level. The regulation accuracy of the measurement is thus the same, no matter whether a small or a large test level is requested.
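The point that a fixed dB tolerance gives the same relative accuracy at any level can be checked numerically (a sketch, assuming a field-strength quantity with dB = 20·log10; the function name is hypothetical):

```python
# The extra margin that a dB tolerance adds, as a fraction of the test
# level. Note the result does not depend on the test level at all:
# upper/level = 10**(tolerance_db / 20) is a pure ratio.
def relative_tolerance(test_level, tolerance_db):
    upper = test_level * 10.0 ** (tolerance_db / 20.0)
    return (upper - test_level) / test_level

for level in (10.0, 100.0):
    # prints the same percentage (~2.3%) for both levels
    print(level, "V/m ->", f"{relative_tolerance(level, 0.2):.3%}")
```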

Also across tests (between radiated immunity and conducted immunity) specifying the tolerance in dB results in the same regulation accuracy. In all cases the power needed to achieve the test level (whether signal power, forward power, or net power) is regulated within the specified tolerance.

If the tolerance in the software had to be specified in linear units (for example V/m), it would be necessary to update the tolerance value every time the test level changes. Forgetting to change the tolerance would result either in a regulation window that is too tight (with the risk that the level cannot be regulated within the tolerance at all) or in an unnecessarily inaccurate regulation.