sklearn.feature_selection.mutual_info_classif estimates mutual information for a discrete target variable. Mutual information (MI) [1] between two random variables is a non-negative value that measures the dependency between the variables. It quantifies how similar the joint distribution p(x, y) is to the product of the factored marginal distributions p(x)p(y). If X and Y are completely unrelated (and therefore independent), then p(x, y) would equal p(x)p(y), and this integral would be zero.
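A minimal sketch of this behavior: a feature that tracks the target gets a clearly positive MI score, while an independent noise feature scores near zero. The synthetic data below is illustrative, not from the original documentation.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif

rng = np.random.RandomState(0)
y = rng.randint(0, 2, size=500)          # binary target
X = np.column_stack([
    y + 0.1 * rng.randn(500),            # informative: closely tracks the target
    rng.randn(500),                      # uninformative: independent noise
])

mi = mutual_info_classif(X, y, random_state=0)
# mi[0] (informative feature) is well above mi[1] (noise, near 0);
# all scores are non-negative, as MI is by definition.
```

Note that `mutual_info_classif` uses a nearest-neighbor estimator for continuous features, so the noise feature's score is only approximately zero rather than exactly zero.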

*Returns the maximum normalized mutual information scores (i.e. the characteristic matrix M if est="mic_approx", otherwise the equicharacteristic matrix). M is a list of 1d numpy arrays where M[i][j] contains the score using a grid partitioning x-values into i+2 bins and y-values into j+2 bins.*

Mutual information, as computed in this example and as commonly used in the context of image registration, provides a measure of how much uncertainty about the value of a pixel in one image is reduced by measuring the homologous pixel in the other image.
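The registration use of MI can be sketched directly from the definition: estimate the joint intensity distribution of two images with a 2D histogram, then sum p(x, y) log(p(x, y) / (p(x) p(y))). This is a NumPy-only illustration under assumed bin counts and synthetic images, not the estimator used by any particular registration library.

```python
import numpy as np

def histogram_mutual_information(image_a, image_b, bins=20):
    """MI between two same-shaped images, from their joint intensity histogram."""
    joint, _, _ = np.histogram2d(image_a.ravel(), image_b.ravel(), bins=bins)
    pxy = joint / joint.sum()            # joint distribution p(x, y)
    px = pxy.sum(axis=1)                 # marginal p(x)
    py = pxy.sum(axis=0)                 # marginal p(y)
    nz = pxy > 0                         # skip empty cells to avoid log(0)
    return np.sum(pxy[nz] * np.log(pxy[nz] / (px[:, None] * py[None, :])[nz]))

rng = np.random.RandomState(0)
base = rng.rand(64, 64)
# An image measured against itself resolves all uncertainty (high MI);
# against an independent image, MI drops toward zero.
mi_self = histogram_mutual_information(base, base)
mi_noise = histogram_mutual_information(base, rng.rand(64, 64))
```

The independent case does not come out exactly zero: finite sampling with a fixed bin count leaves a small positive bias, which is why registration pipelines typically compare MI values rather than test them against zero.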