The question with proper formatting (http://stats.stackexchange.com/questions/244969/log-likelihhood-ratio-with-soft-information)

The log-likelihood ratio of two binary random variables $x$ and $y$ (each taking only the values $0$ or $1$) can be defined as

$LLR_x=\log\frac{P(x=0|y)}{P(x=1|y)}$

The above definition is OK if $y=0$ or $1$.

Now, if $y$ is in the form of a log-likelihood ratio (soft-information output from another system), how can it be incorporated into the above equation? That is, given

$LLR_y=\log\frac{P(y=0|z)}{P(y=1|z)}$

so that,

$LLR_{x,\text{new}}=\log\frac{P(x=0|LLR_y)}{P(x=1|LLR_y)}$

Obviously, I don't want to take a hard decision on the value of $y$ (the problem is straightforward for hard decisions).
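
For concreteness, a hard decision would just threshold the soft value at zero, whereas keeping it soft means mapping $LLR_y$ back to the pair of probabilities $P(y=0|z)$ and $P(y=1|z)$. A minimal Python sketch of both, assuming the sign convention defined above (the function name is just illustrative):

```python
import math

def llr_to_prob0(llr):
    """Invert LLR = log(P(b=0)/P(b=1)) to recover P(b=0) (logistic function)."""
    return 1.0 / (1.0 + math.exp(-llr))

llr_y = 2.0                      # soft output from the other system (example value)
hard_y = 0 if llr_y > 0 else 1   # hard decision: just threshold at zero
p_y0 = llr_to_prob0(llr_y)       # soft: P(y=0|z) ~ 0.88
p_y1 = 1.0 - p_y0                # soft: P(y=1|z) ~ 0.12
```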

In other words, $P(x=0|y)$ is already known, but $y$ is the output of another system and is only available in the form of a log-likelihood ratio. Is it possible to redefine $P(x=0|y)$ in terms of $LLR_y$?
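
One way to make the question concrete (not necessarily the only one) is to marginalize over the two possible values of $y$, weighting the known conditionals $P(x=0|y=0)$ and $P(x=0|y=1)$ by the soft probabilities recovered from $LLR_y$. A minimal sketch of that idea, assuming $x$ depends on $z$ only through $y$; the channel numbers below are made up purely for illustration:

```python
import math

def soft_llr_x(llr_y, p_x0_given_y0, p_x0_given_y1):
    """LLR of x given soft information on y.

    Assumes x depends on z only through y, so
    P(x=0 | LLR_y) = P(x=0|y=0)*P(y=0|z) + P(x=0|y=1)*P(y=1|z),
    with P(y=0|z) recovered from LLR_y via the logistic function.
    """
    p_y0 = 1.0 / (1.0 + math.exp(-llr_y))   # P(y=0|z)
    p_y1 = 1.0 - p_y0                       # P(y=1|z)
    p_x0 = p_x0_given_y0 * p_y0 + p_x0_given_y1 * p_y1
    p_x1 = 1.0 - p_x0
    return math.log(p_x0 / p_x1)

# Hypothetical channel where x equals y with probability 0.9:
print(soft_llr_x(llr_y=2.0, p_x0_given_y0=0.9, p_x0_given_y1=0.1))
```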
