West Germany

/wɛst ˌdʒʌrməni/

Definitions of West Germany
  1. noun
    a republic in north central Europe on the North Sea; established in 1949 from the zones of Germany occupied by the British, French, and Americans after the German defeat; reunified with East Germany in 1990
    example of:
    European country, European nation
    any one of the countries occupying the European continent