
[ENH]: Implement Bias Variance decomposition #302

@fraimondo

Description


Which feature do you want to include?

From @kaurao:

```python
def bias_variance_decomposition_classification(y, A):
    """Kohavi-Wolpert bias-variance decomposition for zero-one loss.

    y: true labels (length n)
    A: predicted labels; rows are repetitions, columns are instances
    see: Kohavi and Wolpert, "Bias Plus Variance Decomposition for Zero-One Loss Functions"
    see: https://github.com/Waikato/weka-trunk/blob/master/weka/src/main/java/weka/classifiers/BVDecompose.java
    """
    import numpy as np

    assert len(y) == np.size(A, 1)
    uy = np.unique(y)
    m_Bias, m_Variance, m_Sigma, m_Bias_uncorr = 0.0, 0.0, 0.0, 0.0

    for i in range(np.size(A, 1)):  # over instances
        bsum, vsum, ssum, busum = 0.0, 0.0, 0.0, 0.0

        for j in range(len(uy)):
            pActual = 1 if y[i] == uy[j] else 0
            pPred = np.mean(A[:, i] == uy[j])

            # unbiased estimator of bias^2, see Kohavi & Wolpert, section 4.2
            bsum = bsum + (pActual - pPred) ** 2 - pPred * (1 - pPred) / (np.size(A, 0) - 1)
            vsum = vsum + pPred * pPred
            ssum = ssum + pActual * pActual
            busum = busum + (pActual - pPred) ** 2

        m_Bias = m_Bias + bsum
        m_Variance = m_Variance + (1 - vsum)
        m_Sigma = m_Sigma + (1 - ssum)
        m_Bias_uncorr = m_Bias_uncorr + busum

    m_Bias = m_Bias / (2 * np.size(A, 1))
    m_Variance = m_Variance / (2 * np.size(A, 1))
    m_Sigma = m_Sigma / (2 * np.size(A, 1))
    m_Bias_uncorr = m_Bias_uncorr / (2 * np.size(A, 1))

    return m_Bias, m_Variance, m_Sigma, m_Bias_uncorr
```
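For reference, a vectorized NumPy sketch of the same decomposition, with a quick sanity check. The name `kw_decompose` is mine, not part of the proposed API; classes are taken from the true labels, mirroring the snippet above. For a perfect, deterministic classifier all four terms should be zero.

```python
import numpy as np

def kw_decompose(y, A):
    # Kohavi-Wolpert zero-one-loss decomposition, vectorized.
    # y: true labels, shape (n,); A: predictions, shape (reps, n).
    y = np.asarray(y)
    A = np.asarray(A)
    reps, n = A.shape
    uy = np.unique(y)
    # p_pred[j, i]: fraction of repetitions predicting class uy[j] for instance i
    p_pred = np.stack([(A == c).mean(axis=0) for c in uy])      # (k, n)
    p_actual = np.stack([(y == c).astype(float) for c in uy])   # (k, n)
    sq = (p_actual - p_pred) ** 2
    bias = (sq - p_pred * (1 - p_pred) / (reps - 1)).sum(axis=0)  # unbiased bias^2
    variance = 1 - (p_pred ** 2).sum(axis=0)
    sigma = 1 - (p_actual ** 2).sum(axis=0)
    bias_uncorr = sq.sum(axis=0)
    return tuple(float(t.sum()) / (2 * n)
                 for t in (bias, variance, sigma, bias_uncorr))

# Perfect, deterministic classifier: every term is 0.
y = np.array([0, 1, 0, 1])
perfect = np.tile(y, (5, 1))
print(kw_decompose(y, perfect))  # -> (0.0, 0.0, 0.0, 0.0)
```

As a second check, a classifier that always predicts the same wrong-half labels (e.g. all zeros on `y = [0, 1]`) has zero variance and sigma, and a bias of 0.5, matching its average zero-one loss.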

How do you imagine this integrated in julearn?

As part of the stats module.

Do you have a sample code that implements this outside of julearn?

Anything else to say?

No response

Metadata

Assignees: none

Labels: enhancement (New feature or request)
