
Analysis -- Distance Metrics

Functions for comparing probability distributions and quantum measurement outcomes.

hellinger_distance

Compute the Hellinger distance between two probability distributions. The Hellinger distance quantifies the difference between two discrete distributions and is bounded in [0, 1].

For discrete distributions p and q:

$$ H(p, q) = \frac{1}{\sqrt{2}} \sqrt{\sum_x \left( \sqrt{p(x)} - \sqrt{q(x)} \right)^2} $$

Signature (dict with integer keys)

```python
hellinger_distance(
    p: dict[int, float],
    q: dict[int, float],
) -> float
```

Signature (dict with string keys)

```python
hellinger_distance(
    p: dict[str, float],
    q: dict[str, float],
) -> float
```

Parameters

| Parameter | Type | Description |
| --- | --- | --- |
| `p` | `dict[int, float]` or `dict[str, float]` | The first probability distribution, represented as a dictionary mapping outcomes to probabilities. |
| `q` | `dict[int, float]` or `dict[str, float]` | The second probability distribution, represented as a dictionary mapping outcomes to probabilities. |

Returns

float -- The Hellinger distance between the two distributions. A value of 0 means the distributions are identical; a value of 1 means they have no overlap.

Examples

```python
from pyqpanda3.quantum_info import hellinger_distance

p = {0: 0.5, 1: 0.5}
q = {0: 0.3, 1: 0.7}

d = hellinger_distance(p, q)
print(d)  # ≈ 0.1452
```
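As a sanity check, the formula can be evaluated directly in a few lines of plain Python. This is a reference sketch independent of the library; `hellinger_reference` is a name invented here, not a pyqpanda3 function:

```python
import math

def hellinger_reference(p, q):
    """Hellinger distance straight from the definition:
    sqrt(sum_x (sqrt(p(x)) - sqrt(q(x)))^2 / 2)."""
    keys = set(p) | set(q)  # union of outcomes; missing keys count as probability 0
    total = sum(
        (math.sqrt(p.get(k, 0.0)) - math.sqrt(q.get(k, 0.0))) ** 2
        for k in keys
    )
    return math.sqrt(total / 2.0)

d = hellinger_reference({0: 0.5, 1: 0.5}, {0: 0.3, 1: 0.7})
print(d)  # ≈ 0.1452
```

Identical distributions give 0, and distributions with disjoint support give 1, matching the stated bounds.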

hellinger_fidelity

Compute the Hellinger fidelity between two probability distributions. The Hellinger fidelity is related to the Hellinger distance by $F_H(p, q) = \left(1 - H(p, q)^2\right)^2$, or equivalently:

$$ F_H(p, q) = \left( \sum_x \sqrt{p(x)\, q(x)} \right)^2 $$

Signature (dict with integer keys)

```python
hellinger_fidelity(
    p: dict[int, float],
    q: dict[int, float],
) -> float
```

Signature (dict with string keys)

```python
hellinger_fidelity(
    p: dict[str, float],
    q: dict[str, float],
) -> float
```

Parameters

| Parameter | Type | Description |
| --- | --- | --- |
| `p` | `dict[int, float]` or `dict[str, float]` | The first probability distribution. |
| `q` | `dict[int, float]` or `dict[str, float]` | The second probability distribution. |

Returns

float -- The Hellinger fidelity between the two distributions. A value of 1 means the distributions are identical; a value of 0 means they have no overlap.

Examples

```python
from pyqpanda3.quantum_info import hellinger_fidelity

p = {0: 0.5, 1: 0.5}
q = {0: 0.3, 1: 0.7}

f = hellinger_fidelity(p, q)
print(f)  # ≈ 0.9583
```
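The identity F_H = (1 - H^2)^2 relating fidelity to distance can be verified numerically. The sketch below uses only the two definitions given above, not the library:

```python
import math

p = {0: 0.5, 1: 0.5}
q = {0: 0.3, 1: 0.7}

# Bhattacharyya coefficient: sum_x sqrt(p(x) * q(x))
bc = sum(math.sqrt(p[k] * q[k]) for k in p)
fidelity = bc ** 2  # F_H = (sum_x sqrt(p(x) q(x)))^2

# Hellinger distance from its own definition
h = math.sqrt(sum((math.sqrt(p[k]) - math.sqrt(q[k])) ** 2 for k in p) / 2)

print(round(fidelity, 4))               # 0.9583
print(abs(fidelity - (1 - h**2) ** 2))  # agreement up to floating-point error
```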

KL_divergence

Compute the Kullback-Leibler (KL) divergence between two probability distributions.

For discrete distributions:

$$ D_{\mathrm{KL}}(p \,\|\, q) = \sum_x p(x) \log \frac{p(x)}{q(x)} $$

For continuous distributions:

$$ D_{\mathrm{KL}}(p \,\|\, q) = \int p(x) \log \frac{p(x)}{q(x)} \, dx $$

Note: The KL divergence is not symmetric: $D_{\mathrm{KL}}(p \,\|\, q) \neq D_{\mathrm{KL}}(q \,\|\, p)$.

Signature (discrete)

```python
KL_divergence(
    p: list[float],
    q: list[float],
) -> float
```

Signature (continuous)

```python
KL_divergence(
    p_pdf: Callable[[float], float],
    q_pdf: Callable[[float], float],
    x_start: float,
    x_end: float,
    dx: float = 1e-4,
) -> float
```

Parameters (discrete)

| Parameter | Type | Description |
| --- | --- | --- |
| `p` | `list[float]` | The first discrete probability distribution. |
| `q` | `list[float]` | The second discrete probability distribution. |

Parameters (continuous)

| Parameter | Type | Description |
| --- | --- | --- |
| `p_pdf` | `Callable[[float], float]` | The probability density function of the first distribution, p(x). |
| `q_pdf` | `Callable[[float], float]` | The probability density function of the second distribution, q(x). |
| `x_start` | `float` | The lower bound of the integration interval. |
| `x_end` | `float` | The upper bound of the integration interval. |
| `dx` | `float` | The step size for numerical integration. Defaults to `1e-4`. |

Returns

float -- The KL divergence of distribution p relative to distribution q. The value is non-negative; it is zero if and only if the distributions are identical.

Examples

Discrete distributions:

```python
from pyqpanda3.quantum_info import KL_divergence

p = [0.5, 0.5]
q = [0.3, 0.7]

d = KL_divergence(p, q)
print(d)
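The discrete formula is short enough to cross-check by hand. The sketch below assumes the natural logarithm; the log base used by the library is not stated in this document, so the library's value may differ by a constant factor:

```python
import math

def kl_reference(p, q):
    """sum_i p_i * log(p_i / q_i), with the convention 0 * log 0 = 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0.0)

print(kl_reference([0.5, 0.5], [0.3, 0.7]))  # ≈ 0.0872 (nats)
```

Note the asymmetry: `kl_reference([0.5, 0.5], [0.3, 0.7])` and `kl_reference([0.3, 0.7], [0.5, 0.5])` give different values.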

Continuous distributions:

```python
import math
from pyqpanda3.quantum_info import KL_divergence

def p_pdf(x):
    return math.exp(-x) if x >= 0 else 0.0

def q_pdf(x):
    return 0.5 * math.exp(-0.5 * x) if x >= 0 else 0.0

d = KL_divergence(p_pdf, q_pdf, 0.0, 20.0, 1e-4)
print(d)
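For these two densities there is a closed form to check against: for exponential distributions with rates lambda_p = 1 and lambda_q = 0.5, D_KL = ln(lambda_p / lambda_q) + lambda_q / lambda_p - 1 = ln 2 - 0.5 ≈ 0.1931. Below is a plain-Python Riemann-sum sketch of the same integral, shown only to illustrate the numerical approach; it is not the library's implementation:

```python
import math

def kl_continuous_reference(p_pdf, q_pdf, x_start, x_end, dx=1e-4):
    """Left Riemann sum of p(x) * log(p(x)/q(x)) over [x_start, x_end]."""
    total = 0.0
    x = x_start
    while x < x_end:
        px, qx = p_pdf(x), q_pdf(x)
        if px > 0.0 and qx > 0.0:  # skip points outside either support
            total += px * math.log(px / qx) * dx
        x += dx
    return total

p_pdf = lambda x: math.exp(-x) if x >= 0 else 0.0               # Exp(rate=1)
q_pdf = lambda x: 0.5 * math.exp(-0.5 * x) if x >= 0 else 0.0   # Exp(rate=0.5)

print(kl_continuous_reference(p_pdf, q_pdf, 0.0, 20.0))  # ≈ 0.1931
```

Truncating the integral at x = 20 is safe here because both densities decay exponentially, so the tail contribution is negligible.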


Released under the MIT License.