Deutschland

Definitions of Deutschland
  1. noun
    a republic in central Europe; split into East Germany and West Germany after World War II and reunited in 1990
    synonyms: Germania