entropy

The concept of entropy comes from thermodynamics, the branch of physics that deals with heat and energy. It usually refers to the idea that everything in the universe eventually moves from order to disorder, and entropy is the measure of that change.

The word entropy finds its roots in the Greek entropia, which means "a turning toward" or "transformation." The German physicist Rudolf Clausius used the word to describe the measure of disorder, and it appeared in English in 1868. A common example of entropy is that of ice melting in water: the change from formed to free, from ordered to disordered, increases the entropy.

Primary Meanings of entropy

1. n. (communication theory) a numerical measure of the uncertainty of an outcome
2. n. (thermodynamics) a thermodynamic quantity representing the amount of energy in a system that is no longer available for doing mechanical work
Full Definitions of entropy
1. n. (communication theory) a numerical measure of the uncertainty of an outcome

Synonyms: information, selective information
Type of: information measure (a system of measurement of information based on the probabilities of the events that convey information)
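
In this sense, the quantity usually meant is Shannon entropy, which assigns high values to unpredictable sources and low values to predictable ones. The short Python sketch below is an illustration added to this entry, not part of the original definition; the function name shannon_entropy is our own.

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy in bits: the sum of -p * log2(p) over outcomes with p > 0."""
    return sum(-p * math.log2(p) for p in probabilities if p > 0)

# A fair coin is maximally uncertain for two outcomes: 1 bit of entropy.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A heavily biased coin is more predictable, so its entropy is lower.
print(shannon_entropy([0.9, 0.1]))   # about 0.47
# A certain outcome carries no uncertainty at all.
print(shannon_entropy([1.0]))        # 0.0
```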
2. n. (thermodynamics) a thermodynamic quantity representing the amount of energy in a system that is no longer available for doing mechanical work

"entropy increases as matter and energy in the universe degrade to an ultimate state of inert uniformity"
Synonyms: S, randomness
Types: conformational entropy (entropy calculated from the probability that a state could be reached by chance alone)
Type of: physical property (any property used to characterize matter and energy and their interactions)
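
For the thermodynamic sense, the ice-melting example from the introduction can be made quantitative: for a reversible phase change, the entropy gained is ΔS = Q / T, where Q is the heat absorbed and T is the absolute temperature. The sketch below uses standard textbook values (latent heat of fusion of water about 334 J/g, melting point 273.15 K) that are assumptions added here, not figures from the entry.

```python
# Entropy gained by 100 g of ice as it melts at 0 degrees C (273.15 K).
mass_g = 100.0                 # assumed sample size
latent_heat_J_per_g = 334.0    # textbook latent heat of fusion of water
melting_point_K = 273.15       # melting point of ice in kelvin

heat_absorbed_J = mass_g * latent_heat_J_per_g                # Q = m * L_f
entropy_change_J_per_K = heat_absorbed_J / melting_point_K    # delta S = Q / T
print(f"Entropy increase of the ice: {entropy_change_J_per_K:.1f} J/K")  # about 122.3 J/K
```

The surroundings supply that heat at a slightly higher temperature and therefore lose slightly less entropy than the ice gains, which is why melting increases the total entropy, as the introduction states.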
