31 October, 13:37

Assume a company pays out $100 in dividends in year 1. dividends increase by 10% a year for 4 years and thereafter stay constant. what is the total amount of dividends paid out (rounded to the nearest $1) during years 1-10?

Answers (1)
  1. 31 October, 14:50
    Dividends grow by 10% (a factor of 1.1) per year for 4 years, so the growth applies in years 2 through 5; from year 5 onward the dividend stays constant.

    Therefore

    Dividends paid in year 1 = $100.00

    Dividends paid in year 2 = $100*1.1 = $110.00

    Dividends paid in year 3 = $110*1.1 = $121.00

    Dividends paid in year 4 = $121*1.1 = $133.10

    Dividends paid in year 5 = $133.1*1.1 = $146.41

    For years 6 - 10, dividends remain constant at the year-5 level of $146.41.

    Dividends paid in years 6 - 10 = $146.41*5 = $732.05

    Total dividends paid in years 1-10 is

    100 + 110 + 121 + 133.10 + 146.41 + 732.05 = $1,342.56

    Answer: Total dividends paid in years 1-10 = $1,343 (nearest dollar)
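    The arithmetic above can be checked with a short Python sketch (variable names are illustrative, not from the original problem):

    ```python
    # Year 1 pays $100; dividends grow 10%/year through year 5,
    # then stay constant for years 6-10.
    dividends = [100.0]
    for _ in range(4):                    # growth in years 2-5
        dividends.append(dividends[-1] * 1.1)
    dividends += [dividends[-1]] * 5      # years 6-10 at the year-5 level

    total = sum(dividends)
    print(round(total, 2))   # 1342.56
    print(round(total))      # 1343
    ```

    Listing the per-year amounts and summing keeps the growth window (years 2-5) explicit, which is the step most often miscounted in problems like this.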