14 December, 02:10

Suppose the maximum safe average intensity of microwaves for human exposure is taken to be 2.50 W / m 2. If a radar unit leaks 10.0 W of microwaves (other than those sent by its antenna) uniformly in all directions, how far away must you be to be exposed to an average intensity considered to be safe? Assume that the power spreads uniformly over the area of a sphere with no complications from absorption or reflection.

Answers (1)
  1. 14 December, 04:27
    r = 0.56 m

    Explanation:

    The power leaked by the radar spreads uniformly over a sphere, so we can use the relationship between intensity and power:

    I = P / A

    A = P / I

    A = 10.0 / 2.50

    A = 4.00 m²

    Now find the radius of the sphere with this area:

    A = 4π r²

    r = √(A / 4π)

    r = √(4.00 / 4π)

    r ≈ 0.56 m
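
    As a quick check, here is a minimal Python sketch of the same calculation, solving I = P / (4π r²) for r. The variable names simply mirror the symbols above and are not from the original answer.

    import math

    P = 10.0  # leaked power, W
    I = 2.50  # maximum safe average intensity, W/m^2

    # Intensity over a sphere of radius r: I = P / (4 * pi * r^2), solved for r
    r = math.sqrt(P / (4 * math.pi * I))
    print(f"r = {r:.2f} m")  # prints: r = 0.56 m

    Changing the leakage power or the safety limit only requires updating P and I.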