Two probability density functions $\rm (PDF)$ with triangular shapes are considered.
* The random variable $X$ is limited to the range from $0$ to $1$ , and it holds for the PDF (upper sketch):
:$$f_X(x) = 2x \hspace{0.05cm}.$$
*If the ''natural logarithm'' is used, the pseudo-unit "nat" must be added.
*If, on the other hand, the result is asked for in "bit", then the ''binary logarithm'' ⇒ "$\log_2$" is to be used.
In the fourth subtask, the new random variable $Z = A \cdot Y$ is considered. Here, the PDF parameter $A$ is to be determined in such a way that the differential entropy of the new random variable $Z$ yields exactly $1$ bit:<br>
:$$h(Z) = h (A \cdot Y) = h (Y) + {\rm log}_2 \hspace{0.1cm} (A) = 1\ {\rm bit} \hspace{0.05cm}.$$
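The relation $h(A \cdot Y) = h(Y) + \log_2(A)$ holds for any scaling of a continuous random variable: stretching the support by $A$ divides the density by $A$. A minimal numerical sketch of this property, using a uniform PDF purely as a stand-in (an assumption for illustration, not the triangular PDF of this exercise):

```python
import math

def h_bit(pdf, a, b, n=100_000):
    """Numerical differential entropy  h = -∫ f(y)·log2 f(y) dy  (midpoint rule)."""
    dy = (b - a) / n
    total = 0.0
    for i in range(n):
        f = pdf(a + (i + 0.5) * dy)
        if f > 0.0:
            total -= f * math.log2(f) * dy
    return total

# Illustrative stand-in: Y uniform on [0, 1]  →  h(Y) = 0 bit.
A = 4.0
h_Y = h_bit(lambda y: 1.0, 0.0, 1.0)
# Z = A·Y is uniform on [0, A] with density 1/A  →  h(Z) = h(Y) + log2(A).
h_Z = h_bit(lambda y: 1.0 / A, 0.0, A)
print(round(h_Z - h_Y, 6))  # → 2.0  (= log2(4))
```

The difference $h(Z) - h(Y)$ depends only on the scaling factor $A$, not on the shape of the PDF, which is why the exercise can solve for $A$ from $h(Y)$ alone.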
Hints:
*The task belongs to the chapter [[Information_Theory/Differentielle_Entropie|Differential Entropy]].
*Useful hints for solving this task and further information on continuous random variables can be found in the third chapter "Continuous Random Variables" of the book [[Theory of Stochastic Signals]].
'''(1)''' For the probability density function, in the range $0 \le X \le 1$ , it is agreed that:
:$$f_X(x) = 2x = C \cdot x\hspace{0.05cm}.$$
*Here we have replaced "2" by $C$ ⇒ generalization, in order to be able to reuse the following calculation in subtask $(3)$.
*Since the differential entropy is sought in "nat", we use the natural logarithm. With the substitution $\xi = C \cdot x$ we obtain:
:$$h_{\rm nat}(X) = \hspace{0.1cm} - \int_{0}^{1} \hspace{0.1cm} C \cdot x \cdot {\rm ln} \hspace{0.1cm} \big[ C \cdot x \big] \hspace{0.1cm}{\rm d}x =
- \frac{1}{C} \cdot \int_{0}^{C} \hspace{0.1cm} \xi \cdot {\rm ln} \hspace{0.1cm} \big[ \xi \big] \hspace{0.1cm}{\rm d}\xi =
\frac{1}{C} \cdot \Big[ \frac{\xi^2}{4} - \frac{\xi^2}{2} \cdot {\rm ln} \hspace{0.1cm} \big[ \xi \big]\Big]_{0}^{C} =
\frac{C}{4} - \frac{C}{2} \cdot {\rm ln} \hspace{0.1cm} \big[ C \big]
\hspace{0.05cm}.$$
*You can save this conversion if you replace "ln" by "log<sub>2</sub>" directly in the analytical result of subtask $(1)$:
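As a worked instance of this conversion (the numerical values follow from the closed form $C/4 - C/2 \cdot \ln C$ with $C = 2$; they are derived here, not quoted from the source):

```latex
h_{\rm nat}(X) = \frac{2}{4} - \frac{2}{2} \cdot \ln(2)
= \frac{1}{2} - \ln(2) = \ln \frac{\sqrt{\rm e}}{2} \approx -0.193\ {\rm nat}
\hspace{0.3cm}\Rightarrow\hspace{0.3cm}
h_{\rm bit}(X) = \log_2 \frac{\sqrt{\rm e}}{2} \approx -0.279\ {\rm bit}
\hspace{0.05cm}.
```

Writing the nat-result as a single logarithm, $\ln(\sqrt{\rm e}/2)$, is what makes the direct "ln" → "log<sub>2</sub>" replacement possible; dividing $h_{\rm nat}(X)$ by $\ln(2)$ gives the same value.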
*The first integral for the range $-1 \le y \le 0$ is identical in form to that of subtask $(1)$ and only shifted with respect to it, which does not affect the result.
*Now the height $C = 1$ instead of $C = 2$ has to be considered:
*The second integrand is identical to the first except for a shift and reflection. Moreover, the integration intervals do not overlap ⇒ $I_{\rm pos} = I_{\rm neg}$:
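The description above (rising edge on $-1 \le y \le 0$, height $C = 1$, mirrored falling edge) suggests the triangular PDF $f_Y(y) = 1 - |y|$ for $-1 \le y \le +1$; this shape is an inference from the text, not quoted from the exercise. Under that assumption, a short numerical cross-check of $I_{\rm pos} = I_{\rm neg}$ and of the per-edge closed form $C/4 - C/2 \cdot \ln C = 1/4$:

```python
import math

def edge_integral(pdf, a, b, n=200_000):
    """Entropy contribution  -∫_a^b f(y)·ln f(y) dy  in nat (midpoint rule)."""
    dy = (b - a) / n
    s = 0.0
    for k in range(n):
        f = pdf(a + (k + 0.5) * dy)
        if f > 0.0:
            s -= f * math.log(f) * dy
    return s

# Assumed triangular PDF: f_Y(y) = 1 - |y| on -1 <= y <= +1 (height C = 1).
I_neg = edge_integral(lambda y: 1.0 + y, -1.0, 0.0)  # rising edge
I_pos = edge_integral(lambda y: 1.0 - y, 0.0, 1.0)   # falling edge (mirror image)

h_nat = I_neg + I_pos          # ≈ 2 · 1/4 = 0.5 nat
h_bit = h_nat / math.log(2)    # ≈ 0.721 bit
A = 2.0 ** (1.0 - h_bit)       # from h(Z) = h(Y) + log2(A) = 1 bit  →  A ≈ 1.213
print(round(h_nat, 4), round(h_bit, 4), round(A, 4))
```

The last line also previews subtask $(4)$: with $h(Y) = \log_2 \sqrt{\rm e} \approx 0.721$ bit, the scaling factor comes out as $A = 2/\sqrt{\rm e} \approx 1.213$.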