Analysis -- Distance Metrics
Functions for comparing probability distributions and quantum measurement outcomes.
hellinger_distance
Compute the Hellinger distance between two probability distributions. The Hellinger distance measures the similarity between two discrete distributions and is bounded in $[0, 1]$.

For discrete distributions $p$ and $q$ over the same outcome set:

$$H(p, q) = \frac{1}{\sqrt{2}} \sqrt{\sum_i \left(\sqrt{p_i} - \sqrt{q_i}\right)^2} = \sqrt{1 - \sum_i \sqrt{p_i\, q_i}}$$
Signature (dict with integer keys)

```python
hellinger_distance(
    p: dict[int, float],
    q: dict[int, float],
) -> float
```

Signature (dict with string keys)

```python
hellinger_distance(
    p: dict[str, float],
    q: dict[str, float],
) -> float
```

Parameters
| Parameter | Type | Description |
|---|---|---|
| p | dict[int, float] or dict[str, float] | The first probability distribution, represented as a dictionary mapping outcomes to probabilities. |
| q | dict[int, float] or dict[str, float] | The second probability distribution, represented as a dictionary mapping outcomes to probabilities. |
Returns
float -- The Hellinger distance between the two distributions. A value of 0 means the distributions are identical; a value of 1 means they have no overlap.
Examples
```python
from pyqpanda3.quantum_info import hellinger_distance

p = {0: 0.5, 1: 0.5}
q = {0: 0.3, 1: 0.7}
d = hellinger_distance(p, q)
print(d)  # ≈ 0.145
```

hellinger_fidelity
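As a cross-check, the distance for this example can be computed by hand in pure Python. This is a sketch assuming the textbook convention $H = \sqrt{1 - \sum_i \sqrt{p_i q_i}}$; the library's normalization may differ:

```python
import math

def hellinger(p, q):
    # Standard definition: H(p, q) = sqrt(1 - sum_i sqrt(p_i * q_i)),
    # where the sum is the Bhattacharyya coefficient.
    keys = set(p) | set(q)
    bc = sum(math.sqrt(p.get(k, 0.0) * q.get(k, 0.0)) for k in keys)
    # max() guards against tiny negative values from floating-point rounding
    return math.sqrt(max(0.0, 1.0 - bc))

p = {0: 0.5, 1: 0.5}
q = {0: 0.3, 1: 0.7}
print(round(hellinger(p, q), 4))  # 0.1452
```

Identical distributions give a Bhattacharyya coefficient of 1 and hence a distance of 0; disjoint supports give a coefficient of 0 and a distance of 1.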
Compute the Hellinger fidelity between two probability distributions. The Hellinger fidelity is related to the Hellinger distance $H(p, q)$ by

$$F(p, q) = \left(1 - H(p, q)^2\right)^2 = \left(\sum_i \sqrt{p_i\, q_i}\right)^2$$
Signature (dict with integer keys)

```python
hellinger_fidelity(
    p: dict[int, float],
    q: dict[int, float],
) -> float
```

Signature (dict with string keys)

```python
hellinger_fidelity(
    p: dict[str, float],
    q: dict[str, float],
) -> float
```

Parameters
| Parameter | Type | Description |
|---|---|---|
| p | dict[int, float] or dict[str, float] | The first probability distribution. |
| q | dict[int, float] or dict[str, float] | The second probability distribution. |
Returns
float -- The Hellinger fidelity between the two distributions. A value of 1 means the distributions are identical; a value of 0 means they have no overlap.
Examples
```python
from pyqpanda3.quantum_info import hellinger_fidelity

p = {0: 0.5, 1: 0.5}
q = {0: 0.3, 1: 0.7}
f = hellinger_fidelity(p, q)
print(f)  # ≈ 0.96
```

KL_divergence
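The fidelity can likewise be reproduced by hand as the squared Bhattacharyya coefficient, assuming the common convention $F = \left(\sum_i \sqrt{p_i q_i}\right)^2$. This is a sketch, not the library implementation:

```python
import math

p = {0: 0.5, 1: 0.5}
q = {0: 0.3, 1: 0.7}

# Bhattacharyya coefficient (assumes p and q share the same key set)
bc = sum(math.sqrt(p[k] * q[k]) for k in p)
fidelity = bc ** 2
print(round(fidelity, 2))  # 0.96
```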
Compute the Kullback-Leibler (KL) divergence between two probability distributions.
For discrete distributions:

$$D_{\mathrm{KL}}(p \parallel q) = \sum_i p_i \log \frac{p_i}{q_i}$$

For continuous distributions:

$$D_{\mathrm{KL}}(p \parallel q) = \int p(x) \log \frac{p(x)}{q(x)}\, dx$$

Note: The KL divergence is not symmetric: in general, $D_{\mathrm{KL}}(p \parallel q) \neq D_{\mathrm{KL}}(q \parallel p)$.
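The asymmetry is easy to check numerically with a small pure-Python implementation of the discrete formula (a sketch, independent of the library):

```python
import math

def kl(p, q):
    # Discrete KL divergence D(p || q) = sum_i p_i * log(p_i / q_i);
    # terms with p_i == 0 contribute zero by convention.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]
q = [0.3, 0.7]
print(kl(p, q), kl(q, p))  # the two directions give different values
```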
Signature (discrete)

```python
KL_divergence(
    p: list[float],
    q: list[float],
) -> float
```

Signature (continuous)

```python
KL_divergence(
    p_pdf: Callable[[float], float],
    q_pdf: Callable[[float], float],
    x_start: float,
    x_end: float,
    dx: float = 1e-4,
) -> float
```

Parameters (discrete)
| Parameter | Type | Description |
|---|---|---|
| p | list[float] | The first discrete probability distribution. |
| q | list[float] | The second discrete probability distribution. |
Parameters (continuous)
| Parameter | Type | Description |
|---|---|---|
| p_pdf | Callable[[float], float] | The probability density function of the first distribution. |
| q_pdf | Callable[[float], float] | The probability density function of the second distribution. |
| x_start | float | The lower bound of the integration interval. |
| x_end | float | The upper bound of the integration interval. |
| dx | float | The step size for numerical integration. Defaults to 1e-4. |
Returns
float -- The KL divergence of distribution p relative to distribution q. The value is non-negative; it is zero if and only if the distributions are identical.
Examples
Discrete distributions:
```python
from pyqpanda3.quantum_info import KL_divergence

p = [0.5, 0.5]
q = [0.3, 0.7]
d = KL_divergence(p, q)
print(d)
```

Continuous distributions:
```python
import math
from pyqpanda3.quantum_info import KL_divergence

def p_pdf(x):
    return math.exp(-x) if x >= 0 else 0.0

def q_pdf(x):
    return 0.5 * math.exp(-0.5 * x) if x >= 0 else 0.0

d = KL_divergence(p_pdf, q_pdf, 0.0, 20.0, 1e-4)
print(d)
```
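For the two exponential densities in the example, the divergence has a closed form, $D_{\mathrm{KL}}(\mathrm{Exp}(\lambda_1) \parallel \mathrm{Exp}(\lambda_2)) = \log(\lambda_1/\lambda_2) + \lambda_2/\lambda_1 - 1$, which a simple Riemann sum reproduces. This pure-Python sketch does not use the library and only illustrates what the continuous form computes:

```python
import math

def kl_continuous(p_pdf, q_pdf, x_start, x_end, dx=1e-4):
    # Left Riemann sum of p(x) * log(p(x)/q(x)) over [x_start, x_end];
    # points where either density vanishes contribute zero.
    total, x = 0.0, x_start
    while x < x_end:
        px, qx = p_pdf(x), q_pdf(x)
        if px > 0 and qx > 0:
            total += px * math.log(px / qx) * dx
        x += dx
    return total

p_pdf = lambda x: math.exp(-x) if x >= 0 else 0.0            # Exp(1)
q_pdf = lambda x: 0.5 * math.exp(-0.5 * x) if x >= 0 else 0.0  # Exp(0.5)

approx = kl_continuous(p_pdf, q_pdf, 0.0, 20.0)
exact = math.log(1.0 / 0.5) + 0.5 / 1.0 - 1.0  # log 2 - 0.5 ≈ 0.1931
print(round(approx, 3), round(exact, 3))  # 0.193 0.193
```

Truncating the integral at 20 is safe here because the exponential tails beyond that point contribute on the order of $e^{-20}$.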