Information Bottleneck

In an [[induction-deduction framework]] (Induction, Deduction, and Transduction), for a given training dataset

$$ \{X, Y\}, $$

we consider a prediction Markov chain

$$ X \to \hat X \to Y, $$

where $\hat X$ is the minimal sufficient statistic of $X$: the smallest representation of $X$ that still captures the relation between $X$ and $Y$, i.e., the [[mutual information]] $I(X;Y)$. Mutual information is defined as

$$ I(X;Y) = \mathbb E_{p_{XY}} \ln \frac{P_{XY}}{P_X P_Y}. $$

If $X$ and $Y$ are independent, then $P_{XY} = P_X P_Y$, and thus $I(X;Y) = 0$.
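The definition above can be checked numerically for discrete distributions. This is a minimal sketch assuming NumPy; the `mutual_information` helper is illustrative and not part of the original note:

```python
import numpy as np

def mutual_information(p_xy):
    """I(X;Y) in nats for a discrete joint distribution.

    p_xy: 2D array with p_xy[i, j] = P(X=i, Y=j), summing to 1.
    """
    # Marginals P_X and P_Y, shaped so their outer product broadcasts.
    p_x = p_xy.sum(axis=1, keepdims=True)
    p_y = p_xy.sum(axis=0, keepdims=True)
    # Sum p(x,y) * ln[ p(x,y) / (p(x) p(y)) ] over cells with nonzero mass.
    mask = p_xy > 0
    return float(np.sum(p_xy[mask] * np.log(p_xy[mask] / (p_x * p_y)[mask])))

# Independent variables: P_XY = P_X P_Y, so I(X;Y) = 0.
p_indep = np.outer([0.3, 0.7], [0.6, 0.4])
print(mutual_information(p_indep))  # ~0.0

# Perfectly correlated binary variables: I(X;Y) = ln 2.
p_corr = np.array([[0.5, 0.0], [0.0, 0.5]])
print(mutual_information(p_corr))  # ~0.6931
```

Natural log is used to match the $\ln$ in the definition; dividing by $\ln 2$ would convert the result to bits.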
https://datumorphism.leima.is/wiki/learning-theory/information-bottleneck/