quantitykind:InformationEntropy

Predicate    Object
rdf:type qudt:QuantityKind
dcterms:description Information Entropy is a concept from information theory that quantifies how much information an event contains. In general, the more uncertain or random the event, the more information it carries. The concept was introduced by the mathematician Claude Elwood Shannon. It has applications in many areas, including lossless data compression, statistical inference, and cryptography, and more recently in other disciplines such as biology, physics, and machine learning.
qudt:applicableUnit (23 values)
qudt:hasDimensionVector qkdv:A0E0L0I0M0H0T0D1
qudt:informativeReference http://simple.wikipedia.org/wiki/Information_entropy
qudt:plainTextDescription “Information Entropy is a concept from information theory that quantifies how much information an event contains. In general, the more uncertain or random the event, the more information it carries. The concept was introduced by the mathematician Claude Elwood Shannon. It has applications in many areas, including lossless data compression, statistical inference, and cryptography, and more recently in other disciplines such as biology, physics, and machine learning.”
qudt:wikidataMatch http://www.wikidata.org/entity/Q204570
rdfs:comment “Applicable units are those of quantitykind:InformationEntropy”
rdfs:isDefinedBy http://qudt.org/3.1.10/vocab/quantitykind
rdfs:label “Information Entropy”@en
skos:broader quantitykind:Dimensionless
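The description above (more uncertainty means more information) can be made concrete with Shannon's formula H = -Σ p·log₂(p), measured in bits. A minimal illustrative sketch, not part of the QUDT vocabulary itself:

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain for two outcomes: 1 bit per toss.
print(shannon_entropy([0.5, 0.5]))        # 1.0

# A biased coin is more predictable, so each toss carries less information.
print(shannon_entropy([0.9, 0.1]))

# Four equally likely outcomes: 2 bits, matching log2(4).
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
```

Note that the result is dimensionless in the QUDT sense (dimension vector D1): the unit (bit, nat, hartley) only reflects the logarithm base, which is why `skos:broader quantitykind:Dimensionless` applies.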
Generated 2026-01-15T09:03:10.866-05:00