Empathy vs Compression - What's the difference?
empathy
* the intellectual identification of the thoughts, feelings, or state of another person
* capacity to understand another person's point of view or the result of such understanding
* (parapsychology, science fiction) a paranormal ability to psychically read another person's emotions

compression
* an increase in density; the act of compressing, or the state of being compressed; compaction
* the cycle of an internal combustion engine during which the fuel and air mixture is compressed
* (computing) the process by which data is compressed (see the sketch after this list)
* (music) the electronic process by which any sound's gain is automatically controlled
* (astronomy) the deviation of a heavenly body from a spherical form
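
The computing sense can be made concrete with a short Python sketch (the zlib module and the example string are my own choices, not part of the entry): lossless compression shrinks the data, and decompression restores it exactly.

    import zlib

    # An arbitrary example string; repetition makes it compress well.
    text = ("empathy compression " * 50).encode("utf-8")

    compressed = zlib.compress(text)        # DEFLATE-compressed bytes
    restored = zlib.decompress(compressed)  # lossless: the original bytes come back exactly

    print(len(text), "bytes before,", len(compressed), "bytes after")
    assert restored == text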
As nouns, the difference between empathy and compression is that empathy is the intellectual identification of the thoughts, feelings, or state of another person, while compression is an increase in density; the act of compressing, or the state of being compressed; compaction.

empathy
English
Noun
She had a lot of empathy for her neighbor; she knew what it was like to lose a parent too.
Usage notes
Used similarly to sympathy, and interchangeably in looser usage. In stricter usage, empathy is stronger and more intimate, meaning that the subject understands and shares an emotion with the object – as in “I feel your pain” – while sympathy is weaker and more distant – concern, but not shared emotion: “I care for you”.

compression
English
Noun
2011, Marcelo A. Montemurro & Damián H. Zanette, "Universal Entropy of Word Ordering Across Linguistic Families", PLoS ONE, http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0019875 (accessed 2012-09-26):
"Due to the presence of long-range correlations in language [21], [22] it is not possible to compute accurate measures of the entropy by estimating block probabilities directly. More efficient nonparametric methods that work even in the presence of long-range correlations are based on the property that the entropy of a sequence is a lower bound to any lossless compressed version of it [15]. Thus, in principle, it is possible to estimate the entropy of a sequence by finding its length after being compressed by an optimal algorithm. In our analysis, we used an efficient entropy estimator derived from the Lempel-Ziv compression algorithm that converges to the entropy [19], [23], [24], and shows a robust performance when applied to correlated sequences [25] (see Materials and Methods)."
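
The quoted passage rests on the fact that the entropy of a sequence lower-bounds the length of any losslessly compressed version of it, so a compressed length can serve as an entropy estimate. Below is a rough Python sketch of that idea using the standard zlib compressor as a stand-in; it is only an illustration under my own assumptions, not the Lempel-Ziv-derived estimator the authors actually used.

    import zlib

    def compression_entropy_estimate(text: str) -> float:
        """Crude estimate (in bits per byte) of the entropy of `text`, obtained
        from the length of a losslessly compressed version of it. Illustrates
        the general idea only; it is NOT the Lempel-Ziv-derived estimator used
        by Montemurro & Zanette, and zlib's overhead inflates the value on
        short inputs."""
        data = text.encode("utf-8")
        compressed = zlib.compress(data, 9)
        # Bits of compressed output per byte of input; since no lossless code can
        # beat the entropy on average, this roughly bounds the entropy rate from above.
        return 8 * len(compressed) / len(data)

    # A highly repetitive (low-entropy) string yields fewer bits per byte
    # than ordinary running text.
    print(compression_entropy_estimate("ab" * 2000))
    print(compression_entropy_estimate("the quick brown fox jumps over the lazy dog " * 100))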
