8 June, 11:10

What happened to German territory in the east after WWI?

Answers (1)
  1. 8 June, 14:26
    Germany lost World War I. In the 1919 Treaty of Versailles, the victorious powers (the United States, Great Britain, France, and other Allied states) imposed punitive territorial, military, and economic provisions on defeated Germany ... In the east, Germany ceded most of West Prussia and the province of Posen to the newly restored Poland, and part of Upper Silesia followed after a 1921 plebiscite.