14 May, 04:47

What happened when the First World War ended?

Answers (2)
  1. 14 May, 06:12
    0
    Several nations gained or regained territory or independence after World War I. France, for example, regained Alsace-Lorraine and acquired various African colonies from the German Empire, as well as Middle Eastern territories from the Ottoman Empire. The African and Middle Eastern gains were officially League of Nations Mandates.
  2. 14 May, 07:13
    0
    The fighting stopped under an armistice while the terms of peace were negotiated.