West Germany

Definitions of West Germany
  1. noun
    a republic in north central Europe on the North Sea; established in 1949 from the zones of Germany occupied by the British, French, and Americans after the German defeat; reunified with East Germany in 1990
    synonyms: Federal Republic of Germany
    example of:
    European country, European nation
    any one of the countries occupying the European continent