Entropy

Description

Entropy is computed as -sum(pk * log(pk)), where pk are the discrete probabilities of the values. The logarithm base is controlled by the base argument.

Usage

<Expr>$entropy(base = base::exp(1), normalize = TRUE)

Arguments

base The logarithm base to use; defaults to exp(1) (the natural logarithm).
normalize Normalize pk if it doesn't sum to 1; defaults to TRUE.

Value

Expr

Examples

library(polars)

pl$DataFrame(x = c(1, 2, 3, 2))$
  with_columns(entropy = pl$col("x")$entropy(base = 2))
#> shape: (4, 2)
#> ┌─────┬──────────┐
#> │ x   ┆ entropy  │
#> │ --- ┆ ---      │
#> │ f64 ┆ f64      │
#> ╞═════╪══════════╡
#> │ 1.0 ┆ 1.905639 │
#> │ 2.0 ┆ 1.905639 │
#> │ 3.0 ┆ 1.905639 │
#> │ 2.0 ┆ 1.905639 │
#> └─────┴──────────┘
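For intuition, the value above can be reproduced in base R by first normalizing the column to probabilities (since normalize = TRUE) and then applying the entropy formula directly. This is a sketch for illustration, not part of the polars API:

```r
x <- c(1, 2, 3, 2)

# normalize = TRUE: scale the values so they sum to 1
p <- x / sum(x)

# -sum(pk * log(pk)) with base = 2
entropy <- -sum(p * log(p, base = 2))
entropy
#> [1] 1.905639
```

Because entropy is an aggregation over the whole column, the same scalar is broadcast to every row in the with_columns() example above.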