Indices based on information theory

Indices based on information theory, such as entropy and mutual information, can easily be computed. To this end, the ecological network is transformed into a bivariate distribution. This is done by normalizing the adjacency or incidence matrix to obtain a doubly stochastic matrix. The information-theoretic indices are computed either from this matrix or directly from the ecological network. Note that when an array is used as input, the functions do not check whether the matrix is normalized and nonnegative. When the input is an ecological network, the functions automatically convert the network to a normalized probability matrix.

One can compute individual indices or use the function information_decomposition, which performs the entire decomposition at once.

Indices can be calculated for the joint distribution, as well as for the marginal distributions of the two trophic levels (if applicable), by setting the optional argument dims of the function (dims=1 for the rows, dims=2 for the columns).
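A minimal sketch of this workflow, assuming a small bipartite network built from a hypothetical Boolean incidence matrix with the BipartiteNetwork constructor:

using EcologicalNetworks

A = Bool[1 1 0 0; 0 1 1 0; 0 0 1 1]  # hypothetical 3×4 incidence matrix
N = BipartiteNetwork(A)              # small example network, default species names

entropy(N)     # joint entropy of the whole network, in bits
entropy(N, 1)  # marginal entropy of the rows (first trophic level)
entropy(N, 2)  # marginal entropy of the columns (second trophic level)

The sketches below reuse A and N.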

Network conversion

EcologicalNetworks.make_joint_distributionFunction
make_joint_distribution(N::NT) where {NT<:AbstractEcologicalNetwork}

Returns a double stochastic matrix from the adjacency or incidence matrix. Raises an error if the matrix contains negative values.

source
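For example (a hedged sketch, reusing the network N from the intro above):

P = make_joint_distribution(N)  # nonnegative matrix whose entries sum to one
sum(P) ≈ 1.0                    # each interaction weight divided by the total weight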

Indices

EcologicalNetworks.entropyFunction
entropy(P::AbstractArray)

Computes the joint entropy of a double stochastic matrix. Does not check whether the matrix is normalized. Output in bits.

source
entropy(P::AbstractArray, dims::I)

Computes the marginal entropy of a double stochastic matrix. dims indicates whether to compute the entropy for the rows (dims=1) or columns (dims=2). Does not check whether the matrix is normalized. Output in bits.

source
entropy(N::AbstractEcologicalNetwork)

Computes the joint entropy of an ecological network. Output in bits.

source
entropy(N::AbstractEcologicalNetwork, dims::I)

Computes the marginal entropy of an ecological network. dims indicates whether to compute the entropy for the rows (dims=1) or columns (dims=2). Output in bits.

source
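A short hedged illustration of how the joint and marginal entropies relate (standard identity H(X,Y) ≤ H(X) + H(Y); N as in the intro sketch):

Hxy = entropy(N)     # joint entropy of the network
Hx  = entropy(N, 1)  # marginal entropy of the rows
Hy  = entropy(N, 2)  # marginal entropy of the columns
Hxy <= Hx + Hy       # true; equality holds only when rows and columns are independent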
EcologicalNetworks.conditional_entropyFunction
conditional_entropy(P::AbstractArray, given::I)

Computes the conditional entropy of a double stochastic matrix. If given = 1, it computes the entropy of the columns conditional on the rows, and vice versa when given = 2. Output in bits.

source
conditional_entropy(N::AbstractEcologicalNetwork, given::I)

Computes the conditional entropy of an ecological network. If given = 1, it computes the entropy of the columns conditional on the rows, and vice versa when given = 2. Output in bits.

source
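As a hedged check (conventions as described above), the conditional entropy can be recovered from the joint and marginal entropies via the chain rule H(Y|X) = H(X,Y) - H(X):

P = make_joint_distribution(N)
conditional_entropy(P, 1) ≈ entropy(P) - entropy(P, 1)  # columns given rows
conditional_entropy(P, 2) ≈ entropy(P) - entropy(P, 2)  # rows given columns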
EcologicalNetworks.mutual_informationFunction
mutual_information(P::AbstractArray)

Computes the mutual information of a double stochastic matrix. Output in bits.

source
mutual_information(N::NT) where {NT<:AbstractEcologicalNetwork}

Computes the mutual information of an ecological network. Output in bits.

source
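Mutual information ties the marginal and joint entropies together through the standard identity I = H(X) + H(Y) - H(X,Y); a hedged check with the network from above:

mutual_information(N) ≈ entropy(N, 1) + entropy(N, 2) - entropy(N)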
EcologicalNetworks.variation_informationFunction
variation_information(P::AbstractArray)

Computes the variation of information of a double stochastic matrix. Output in bits.

source
variation_information(N::AbstractEcologicalNetwork)

Computes the variation of information of an ecological network. Output in bits.

source
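The variation of information is the sum of both conditional entropies, or equivalently the joint entropy minus the mutual information; a hedged check using the functions documented above:

P = make_joint_distribution(N)
variation_information(P) ≈ conditional_entropy(P, 1) + conditional_entropy(P, 2)
variation_information(P) ≈ entropy(P) - mutual_information(P)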
EcologicalNetworks.potential_informationFunction
potential_information(N::NT)

Computes the maximal potential information in a network, corresponding to every species interacting with every other species. The result for the marginals can be computed with the optional parameter dims. Output in bits.

source
potential_information(N::NT, dims::I)

Computes the maximal potential information in a network, corresponding to every species interacting with every other species. The result for the marginals can be computed with the optional parameter dims. Output in bits.

source
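For a bipartite network with m and n species on the two levels, the potential information is log2(m × n) for the joint distribution and log2(m) or log2(n) for the marginals; a hedged illustration using the incidence matrix A from the intro sketch:

m, n = size(A)
potential_information(N)    ≈ log2(m * n)  # joint
potential_information(N, 1) ≈ log2(m)      # rows
potential_information(N, 2) ≈ log2(n)      # columns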
EcologicalNetworks.diff_entropy_uniformFunction
diff_entropy_uniform(P::AbstractArray)

Computes the difference in entropy of the marginals compared to the entropy of a uniform distribution. The parameter dims indicates which marginals are used, with both if no value is provided. Output in bits.

source
diff_entropy_uniform(P::AbstractArray, dims::I)

Computes the difference in entropy of the marginals compared to the entropy of a uniform distribution. The parameter dims indicates which marginals are used, with both if no value is provided. Output in bits.

source
diff_entropy_uniform(N::AbstractEcologicalNetwork, dims::I=nothing)

Computes the difference in entropy of the marginals compared to the entropy of a uniform distribution. The parameter dims indicates which marginals are used, with both if no value is provided. Output in bits.

source
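In other words, D measures how far the marginal distributions are from uniform; for a single dimension it should equal the potential information minus the marginal entropy. A hedged check:

diff_entropy_uniform(N, 1) ≈ potential_information(N, 1) - entropy(N, 1)
diff_entropy_uniform(N, 2) ≈ potential_information(N, 2) - entropy(N, 2)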

Decomposition

EcologicalNetworks.information_decompositionFunction
information_decomposition(N::AbstractEcologicalNetwork; norm::Bool=false, dims::I=nothing)

Performs an information theory decomposition of a given ecological network, i.e. the information content in the normalized adjacency matrix is split in:

  • :D : difference in entropy of marginals compared to a uniform distribution
  • :I : mutual information
  • :V : variation of information / conditional entropy

If norm=true, the components are normalized such that their sum is equal to 1. One can optionally give the dimension, indicating whether to compute the indices for the rows (dims=1), columns (dims=2) or the whole matrix (default).

Result is returned in a Dict. Outputs in bits.

source
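A hedged usage sketch (N as in the intro example):

decomposition = information_decomposition(N; norm=true, dims=1)
decomposition[:D] + decomposition[:I] + decomposition[:V] ≈ 1.0  # normalized components sum to one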

Effective interactions