Expansion vs Compression - What's the difference?
The act or process of expanding.
The fractional change in unit length per unit length per unit temperature change.
A new addition.
A product to be used with a previous product.
That which is expanded; expanse; extended surface.
(steam engines) The operation of steam in a cylinder after its communication with the boiler has been cut off, by which it continues to exert pressure upon the moving piston.
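The thermal sense above ("fractional change in unit length per unit length per unit temperature change") has a simple working formula. A minimal numeric sketch, where the aluminium coefficient is an assumed typical value and not part of this entry:

```python
# Linear thermal expansion: delta_L = alpha * L0 * delta_T, where alpha is the
# fractional change in length per unit length per unit temperature change.
alpha = 23e-6   # 1/degC, assumed typical value for aluminium (illustrative)
L0 = 2.0        # m, original length
delta_T = 50.0  # degC, temperature rise

delta_L = alpha * L0 * delta_T
print(f"{delta_L * 1000:.2f} mm")  # 2.30 mm of lengthening
```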
An increase in density; the act of compressing, or the state of being compressed; compaction.
The cycle of an internal combustion engine during which the fuel and air mixture is compressed.
(computing) The process by which data is compressed.
* 2011, Marcelo A. Montemurro & Damián H. Zanette, "Universal Entropy of Word Ordering Across Linguistic Families", PLoS ONE, http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0019875 (accessed 2012-09-26)
(music) The electronic process by which any sound's gain is automatically controlled.
(astronomy) The deviation of a heavenly body from a spherical form.
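The computing sense listed above can be illustrated with a short sketch using Python's standard zlib module; the sample text is invented for the example:

```python
import zlib

# Highly repetitive input compresses well; decompression restores it exactly,
# which is what makes the compression lossless.
text = b"the quick brown fox jumps over the lazy dog " * 50
compressed = zlib.compress(text, level=9)
restored = zlib.decompress(compressed)

assert restored == text
assert len(compressed) < len(text)
print(len(text), "->", len(compressed), "bytes")
```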
As nouns the difference between expansion and compression
is that expansion is the act or process of expanding, while compression is an increase in density; the act of compressing, or the state of being compressed; compaction.

expansion
English
Noun
- The expansion of metals and plastics in response to heat is well understood.
- My new office is in the expansion behind the main building.
- This expansion requires the original game board.
- the starred expansion of the skies (Beattie)
Antonyms
* (act of expanding) contraction

Derived terms
* expansionism
* expansion joint
* expansion team
* expansion cleat
* expansion pack

compression
English
Noun
- Due to the presence of long-range correlations in language [21], [22] it is not possible to compute accurate measures of the entropy by estimating block probabilities directly. More efficient nonparametric methods that work even in the presence of long-range correlations are based on the property that the entropy of a sequence is a lower bound to any lossless compressed version of it [15]. Thus, in principle, it is possible to estimate the entropy of a sequence by finding its length after being compressed by an optimal algorithm. In our analysis, we used an efficient entropy estimator derived from the Lempel-Ziv compression algorithm that converges to the entropy [19], [23], [24], and shows a robust performance when applied to correlated sequences [25] (see Materials and Methods).
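The estimation idea in the passage, that the length of a losslessly compressed sequence upper-bounds its entropy, can be sketched with an off-the-shelf compressor. This sketch uses zlib rather than the dedicated Lempel-Ziv estimator the authors describe, so it illustrates only the bound, not their method; the sample strings are invented:

```python
import random
import string
import zlib

def entropy_bound_bits_per_char(text: str) -> float:
    """Upper bound on entropy rate (bits/char) from compressed length.

    Any lossless compressor yields such a bound; the cited paper instead
    uses a Lempel-Ziv-based estimator that converges to the true entropy.
    """
    data = text.encode("utf-8")
    return 8.0 * len(zlib.compress(data, level=9)) / len(data)

random.seed(0)
ordered = "ab" * 5000                                            # strongly correlated
scrambled = "".join(random.choices(string.ascii_lowercase, k=10000))

# The correlated sequence admits a far shorter description.
print(entropy_bound_bits_per_char(ordered))    # well under 1 bit/char
print(entropy_bound_bits_per_char(scrambled))  # near or above log2(26), since this is an upper bound
```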