26 September, 09:24

When did vaccines become mandatory in the US?

Answers (1)
  1. 26 September, 12:37
    Vaccines became broadly mandatory in the United States around 1978, when school-entry immunization requirements were being enforced in nearly every state. Note that these mandates are set by individual states rather than by a single federal law.