<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Yadav Prasad G B</title>
    <description>The latest articles on Forem by Yadav Prasad G B (@yadav_prasadgb_34fcd06b).</description>
    <link>https://forem.com/yadav_prasadgb_34fcd06b</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3536290%2F2d76f207-c6a7-4adc-93de-c08edd16cc25.jpg</url>
      <title>Forem: Yadav Prasad G B</title>
      <link>https://forem.com/yadav_prasadgb_34fcd06b</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/yadav_prasadgb_34fcd06b"/>
    <language>en</language>
    <item>
      <title>Discrete Random Probability</title>
      <dc:creator>Yadav Prasad G B</dc:creator>
      <pubDate>Sun, 19 Oct 2025 16:36:56 +0000</pubDate>
      <link>https://forem.com/yadav_prasadgb_34fcd06b/discrete-random-probability-4b2n</link>
      <guid>https://forem.com/yadav_prasadgb_34fcd06b/discrete-random-probability-4b2n</guid>
      <description>&lt;h3&gt;
  
  
  Random Variable
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;An assignment of a value to every possible outcome.&lt;/li&gt;
&lt;li&gt;A function from sample space to the real numbers. It is of two types: &lt;strong&gt;discrete&lt;/strong&gt; and &lt;strong&gt;continuous&lt;/strong&gt; random variables.&lt;/li&gt;
&lt;li&gt;It is denoted by &lt;strong&gt;X&lt;/strong&gt;, and its numeric value is denoted by &lt;strong&gt;x&lt;/strong&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;For Example:&lt;/strong&gt;&lt;br&gt;&lt;br&gt;
Tossing a coin once may land either &lt;strong&gt;Heads&lt;/strong&gt; or &lt;strong&gt;Tails&lt;/strong&gt; :&lt;br&gt;&lt;br&gt;
 S = {H, T}&lt;br&gt;&lt;br&gt;
A random variable ( X ) can be defined as:  &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;(X = 1) if it is &lt;strong&gt;Heads&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;(X = 0) if it is &lt;strong&gt;Tails&lt;/strong&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;We can also have &lt;strong&gt;more than one random variable&lt;/strong&gt; in an experiment. The diagram below takes a random data point and maps it to the &lt;strong&gt;height scale&lt;/strong&gt; (in inches). We can also map its &lt;strong&gt;weight&lt;/strong&gt; (in kg).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffr8ealuy4r1hdlrilyss.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffr8ealuy4r1hdlrilyss.png" alt="Height and Weight Mapping" width="734" height="371"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Probability Mass Function (PMF) :
&lt;/h3&gt;

&lt;p&gt;The Probability Mass Function is a function that gives the probability that a discrete random variable takes a specific value. It is written P(X = x), the probability that X takes the value x.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Properties of PMF:&lt;/strong&gt;&lt;br&gt;
0 ≤ P(X = x) ≤ 1 for all possible values x.&lt;br&gt;
The probabilities over all possible values of x sum to 1: ∑ₓ P(X = x) = 1.&lt;/p&gt;

&lt;p&gt;We generally use the PMF to study how probability is distributed across all possible outcomes of a discrete random variable.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;For Example:&lt;/strong&gt; A die is rolled, giving the sample space {1, 2, 3, 4, 5, 6}. If the random variable X equals the face shown, the PMF for this experiment is:&lt;/p&gt;

&lt;p&gt;P(X = 1) = 1/6  P(X = 2) = 1/6  P(X = 3) = 1/6&lt;br&gt;
P(X = 4) = 1/6  P(X = 5) = 1/6  P(X = 6) = 1/6&lt;/p&gt;

&lt;h3&gt;
  
  
  Binomial PMF
&lt;/h3&gt;

&lt;p&gt;The Binomial PMF gives the probability of k successes in n independent trials, where each trial has only two outcomes.&lt;/p&gt;

&lt;p&gt;In the coin-toss example, let p be the probability of heads and 1 - p the probability of tails, and let X be the number of successes (heads) in n trials.&lt;/p&gt;

&lt;p&gt;The formula is given by:&lt;/p&gt;

&lt;p&gt;P(X = k) = [n!/(k!(n-k)!)] * p^k * (1-p)^(n-k)&lt;/p&gt;

&lt;p&gt;Why do we need the combination term, which counts the different sequences in which the k heads can occur? Consider finding P(X = 2) when the coin is tossed 4 times.&lt;/p&gt;

&lt;p&gt;Here n = 4 and k = 2, and each sequence with exactly 2 heads has probability p^2 * (1-p)^2, so:&lt;/p&gt;

&lt;p&gt;P(X = 2) = P(HHTT) + P(HTHT) + P(HTTH) + P(THHT) + P(THTH) + P(TTHH) = 6 * p^2 * (1-p)^2, where 6 = 4!/(2!(4-2)!).&lt;/p&gt;
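&lt;p&gt;As a quick sanity check, the worked example above can be computed with a small Python sketch using only the standard library:&lt;/p&gt;

```python
from math import comb

def binomial_pmf(k, n, p):
    """P(X = k): probability of exactly k successes in n independent trials."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# The worked example above: n = 4 tosses, k = 2 heads, fair coin (p = 0.5).
# comb(4, 2) = 6 sequences, each with probability p^2 * (1-p)^2.
print(binomial_pmf(2, 4, 0.5))  # 0.375
```

Summing the PMF over k = 0..4 returns 1, matching the normalization property above.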

&lt;h3&gt;
  
  
  Expectation
&lt;/h3&gt;

&lt;p&gt;It is the average value of a random variable that you would expect when performing an experiment repeatedly.&lt;/p&gt;

&lt;p&gt;It is denoted by E[X], and it can be interpreted as the center of gravity of the PMF.&lt;/p&gt;

&lt;p&gt;If a random variable X can take values x1, x2, ..., xn with probabilities P(X = xi), then the expected value is:&lt;br&gt;
                       E[X] = ∑ xi P(X = xi)&lt;/p&gt;

&lt;p&gt;Let us take an example. You are playing a game where the chance of winning $1 is 1/6, the chance of winning $2 is 1/2, and the chance of winning $4 is 1/3. How much should the player bet in order to break even in the long term?&lt;/p&gt;

&lt;p&gt;E[X] = 1/6 * 1 + 1/2 * 2 + 1/3 * 4 = 2.5&lt;/p&gt;

&lt;p&gt;Therefore the player can bet up to $2.50 per game to break even in the long term; if the player bets more than the expected value, he is likely to be at a loss after many games.&lt;/p&gt;
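&lt;p&gt;The break-even calculation above is a one-liner in Python; this is just the expectation formula applied to the game's payout table:&lt;/p&gt;

```python
# Expected winnings for the betting example above:
# $1 with probability 1/6, $2 with probability 1/2, $4 with probability 1/3.
outcomes = {1: 1/6, 2: 1/2, 4: 1/3}

expected_value = sum(x * p for x, p in outcomes.items())
print(expected_value)  # 1/6 + 1 + 4/3 = 2.5
```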

&lt;p&gt;But how does expectation determine the center of gravity of a PMF? Let's see this visually.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fldpywzw8s5pmjaqlejpd.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fldpywzw8s5pmjaqlejpd.png" alt=" " width="549" height="409"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Properties of Expectations
&lt;/h3&gt;

&lt;p&gt;Let X and Y be random variables, and a, b, c be constants.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. Linearity of Expectation&lt;/strong&gt;&lt;br&gt;
E[aX + bY + c] = a E[X] + b E[Y] + c&lt;/p&gt;

&lt;p&gt;Expectation distributes over addition and scalar multiplication. This holds even if X and Y are not independent.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Example:&lt;/strong&gt;&lt;br&gt;&lt;br&gt;
If E[X] = 2 and E[Y] = 3,&lt;br&gt;&lt;br&gt;
E[2X + 5Y + 1] = 2·2 + 5·3 + 1 = 20&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;2. Expectation of a Constant&lt;/strong&gt;&lt;br&gt;
E[c] = c&lt;/p&gt;

&lt;p&gt;A constant always takes the same value, so its expected value is itself.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;3. Expectation of a Sum&lt;/strong&gt;&lt;br&gt;
E[X + Y] = E[X] + E[Y]&lt;/p&gt;

&lt;p&gt;A special case of linearity: the mean of the sum equals the sum of the means.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;4. Independent Random Variables&lt;/strong&gt;&lt;br&gt;
If X and Y are independent:&lt;br&gt;&lt;br&gt;
E[XY] = E[X] · E[Y]&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;5. Non-Negativity&lt;/strong&gt;&lt;br&gt;
If X ≥ 0, then E[X] ≥ 0.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;6. Function of a Random Variable&lt;/strong&gt;&lt;br&gt;
For any function g(X):&lt;br&gt;&lt;br&gt;
E[g(X)] = Σ g(x) · P(X = x)&lt;/p&gt;

&lt;h3&gt;
  
  
  Example
&lt;/h3&gt;

&lt;p&gt;Let X be the number shown on a fair die: X ∈ {1, 2, 3, 4, 5, 6}&lt;/p&gt;

&lt;p&gt;E[X] = Σ x · PX(x) = (1/6)(1 + 2 + 3 + 4 + 5 + 6) = 3.5&lt;/p&gt;

&lt;p&gt;Now if Y = 2X + 1:&lt;br&gt;&lt;br&gt;
E[Y] = 2 E[X] + 1 = 2·3.5 + 1 = 8&lt;/p&gt;
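&lt;p&gt;A minimal sketch checking the die example and the linearity property numerically:&lt;/p&gt;

```python
# Fair die: each face 1..6 has probability 1/6.
faces = [1, 2, 3, 4, 5, 6]
pmf = {x: 1/6 for x in faces}

e_x = sum(x * p for x, p in pmf.items())           # E[X] = 3.5
e_y = sum((2*x + 1) * p for x, p in pmf.items())   # E[2X + 1], computed directly

# Linearity: E[2X + 1] = 2 E[X] + 1 = 8 (up to floating-point noise)
print(e_x, e_y)
```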

&lt;h2&gt;
  
  
  Variance Properties for Discrete Random Variables (PMF)
&lt;/h2&gt;

&lt;p&gt;Variance measures how much the values of a random variable differ from the expected mean; it tells us the spread or dispersion of the data around the mean. The square root of the variance is called the standard deviation.&lt;/p&gt;

&lt;p&gt;The formula for a discrete random variable is given by,&lt;br&gt;
Var(X) = Σ (xᵢ - E[X])² · P(X = xᵢ)&lt;/p&gt;




&lt;h2&gt;
  
  
  Properties of Variance
&lt;/h2&gt;

&lt;p&gt;Let X and Y be discrete random variables and a, b, c be constants.&lt;/p&gt;

&lt;h3&gt;
  
  
  1. Variance of a Constant
&lt;/h3&gt;

&lt;p&gt;&lt;code&gt;Var(c) = 0&lt;/code&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A constant does &lt;strong&gt;not vary&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Example: If X = 5 always, then &lt;code&gt;Var(X) = 0&lt;/code&gt;.&lt;/li&gt;
&lt;/ul&gt;




&lt;h3&gt;
  
  
  2. Scaling Property
&lt;/h3&gt;

&lt;p&gt;&lt;code&gt;Var(aX) = a^2 * Var(X)&lt;/code&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Multiplying a random variable by a constant scales variance by a^2.
&lt;/li&gt;
&lt;li&gt;Example: If &lt;code&gt;Var(X) = 2&lt;/code&gt; and &lt;code&gt;Y = 3X&lt;/code&gt;, then &lt;code&gt;Var(Y) = 3^2 * 2 = 18&lt;/code&gt;.&lt;/li&gt;
&lt;/ul&gt;




&lt;h3&gt;
  
  
  3. Adding a Constant
&lt;/h3&gt;

&lt;p&gt;&lt;code&gt;Var(X + b) = Var(X)&lt;/code&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Adding a constant &lt;strong&gt;shifts the distribution&lt;/strong&gt; but does &lt;strong&gt;not change spread&lt;/strong&gt;.
&lt;/li&gt;
&lt;li&gt;Example: &lt;code&gt;Var(X) = 2&lt;/code&gt;, then &lt;code&gt;Var(X + 5) = 2&lt;/code&gt;.&lt;/li&gt;
&lt;/ul&gt;




&lt;h3&gt;
  
  
  4. Variance of a Sum (Independent Random Variables)
&lt;/h3&gt;

&lt;p&gt;&lt;code&gt;Var(X + Y) = Var(X) + Var(Y)&lt;/code&gt;  (if X and Y are independent)&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;For independent variables, total variance is the &lt;strong&gt;sum of individual variances&lt;/strong&gt;.
&lt;/li&gt;
&lt;li&gt;If not independent:
&lt;code&gt;Var(X + Y) = Var(X) + Var(Y) + 2 * Cov(X,Y)&lt;/code&gt;
&lt;/li&gt;
&lt;/ul&gt;




&lt;h3&gt;
  
  
  5. Variance of a Linear Combination
&lt;/h3&gt;

&lt;p&gt;&lt;code&gt;Var(aX + bY) = a^2 * Var(X) + b^2 * Var(Y) + 2ab * Cov(X,Y)&lt;/code&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;If X and Y are independent, &lt;code&gt;Cov(X,Y) = 0&lt;/code&gt;.&lt;/li&gt;
&lt;/ul&gt;




&lt;h3&gt;
  
  
  6. Non-Negativity
&lt;/h3&gt;

&lt;p&gt;&lt;code&gt;Var(X) ≥ 0&lt;/code&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Variance is always &lt;strong&gt;zero or positive&lt;/strong&gt;, since it is the average of squared deviations.&lt;/li&gt;
&lt;/ul&gt;




&lt;h3&gt;
  
  
  7. Variance in Terms of PMF
&lt;/h3&gt;

&lt;p&gt;&lt;code&gt;Var(X) = Σ xi^2 * P(X=xi) - (Σ xi * P(X=xi))^2&lt;/code&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;First term: expected value of X²
&lt;/li&gt;
&lt;li&gt;Second term: square of the mean
&lt;/li&gt;
&lt;li&gt;Difference gives the &lt;strong&gt;spread around the mean&lt;/strong&gt;
&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Numerical Example
&lt;/h2&gt;

&lt;p&gt;Discrete random variable X with PMF:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;x_i&lt;/th&gt;
&lt;th&gt;1&lt;/th&gt;
&lt;th&gt;2&lt;/th&gt;
&lt;th&gt;3&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;P(X=x_i)&lt;/td&gt;
&lt;td&gt;0.2&lt;/td&gt;
&lt;td&gt;0.5&lt;/td&gt;
&lt;td&gt;0.3&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;&lt;strong&gt;Step 1: Compute the mean&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;code&gt;E[X] = 1*0.2 + 2*0.5 + 3*0.3 = 2.1&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 2: Compute E[X²]&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;code&gt;E[X^2] = 1^2*0.2 + 2^2*0.5 + 3^2*0.3 = 4.9&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 3: Compute variance&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;code&gt;Var(X) = E[X^2] - (E[X])^2 = 4.9 - (2.1)^2 = 0.49&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 4: Interpretation&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Variance = 0.49 → small spread around the mean
&lt;/li&gt;
&lt;li&gt;Standard deviation = √0.49 = 0.7 → average deviation from the mean&lt;/li&gt;
&lt;/ul&gt;
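&lt;p&gt;Steps 1 through 4 above translate directly into a few lines of Python:&lt;/p&gt;

```python
# The PMF from the table above.
pmf = {1: 0.2, 2: 0.5, 3: 0.3}

mean = sum(x * p for x, p in pmf.items())        # E[X]   = 2.1
mean_sq = sum(x**2 * p for x, p in pmf.items())  # E[X^2] = 4.9
variance = mean_sq - mean**2                     # 4.9 - 2.1^2 = 0.49
std_dev = variance ** 0.5                        # sqrt(0.49)  = 0.7

print(round(mean, 2), round(mean_sq, 2), round(variance, 2), round(std_dev, 2))
# 2.1 4.9 0.49 0.7
```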




&lt;h3&gt;
  
  
  Joint PMF
&lt;/h3&gt;

&lt;p&gt;A joint PMF describes the probability distribution of two discrete random variables X and Y simultaneously. It tells you the probability that X takes the value xi and Y takes the value yj.&lt;/p&gt;

&lt;p&gt;Formally, for discrete random variables X and Y:&lt;/p&gt;

&lt;p&gt;pX,Y(xi,yj) = P(X=xi,Y=yj)&lt;/p&gt;

&lt;h3&gt;
  
  
  Properties of Joint PMF
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Non-negativity:&lt;/strong&gt;&lt;br&gt;&lt;br&gt;
&lt;code&gt;p_X,Y(xi, yj) &amp;gt;= 0&lt;/code&gt; for all &lt;code&gt;xi, yj&lt;/code&gt;  &lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Sum of all probabilities = 1:&lt;/strong&gt;&lt;br&gt;&lt;br&gt;
&lt;code&gt;Σi Σj pX,Y(xi, yj) = 1&lt;/code&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Marginal PMF:&lt;/strong&gt;&lt;br&gt;&lt;br&gt;
You can get the individual distributions of X or Y by summing over the other variable:&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;ul&gt;
&lt;li&gt;For X: &lt;code&gt;pX(xi) = Σj p_X,Y(xi, yj)&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;For Y: &lt;code&gt;pY(yj) = Σi p_X,Y(xi, yj)&lt;/code&gt;
&lt;/li&gt;
&lt;/ul&gt;
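&lt;p&gt;A small sketch of marginalization; the joint probabilities here are made-up values chosen only so they sum to 1:&lt;/p&gt;

```python
# Hypothetical joint PMF of (X, Y); keys are (x, y) pairs, values sum to 1.
joint = {
    (0, 0): 0.1, (0, 1): 0.2,
    (1, 0): 0.3, (1, 1): 0.4,
}

# Marginal PMFs: sum the joint PMF over the other variable.
p_x, p_y = {}, {}
for (x, y), p in joint.items():
    p_x[x] = p_x.get(x, 0) + p
    p_y[y] = p_y.get(y, 0) + p

print({x: round(p, 2) for x, p in p_x.items()})  # {0: 0.3, 1: 0.7}
print({y: round(p, 2) for y, p in p_y.items()})  # {0: 0.4, 1: 0.6}
```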

&lt;h3&gt;
  
  
  Conditional Probability (under Joint PMF)
&lt;/h3&gt;

&lt;p&gt;For two discrete random variables X and Y, the &lt;strong&gt;conditional probability&lt;/strong&gt; of X = xi given Y = yj is:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;P(X = xi | Y = yj) = P(X = xi AND Y = yj) / P(Y = yj)&lt;/code&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Numerator:&lt;/strong&gt; joint probability that both events happen: &lt;code&gt;P(X = xi, Y = yj)&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Denominator:&lt;/strong&gt; probability of the condition: &lt;code&gt;P(Y = yj)&lt;/code&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;It answers the question:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;“If I know Y = yj has occurred, what is the probability that X = xi occurs?”&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fejn2jw4l18ykdrce294b.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fejn2jw4l18ykdrce294b.png" alt=" " width="522" height="533"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Independent Random Variable
&lt;/h3&gt;

&lt;p&gt;In general, for multiple random variables, the joint PMF factors by the multiplication rule:&lt;/p&gt;

&lt;p&gt;P(X=x,Y=y,Z=z) = P(X=x)⋅P(Y=y∣X=x)⋅P(Z=z∣X=x,Y=y)&lt;/p&gt;

&lt;p&gt;Random variables X, Y, Z are independent if,&lt;/p&gt;

&lt;p&gt;P(X=x, Y=y, Z=z) = P(X=x) * P(Y=y) * P(Z=z)&lt;/p&gt;

&lt;p&gt;and so, PX|Y(x|y) = PX(x)&lt;/p&gt;

&lt;h3&gt;
  
  
  The Hat Problem
&lt;/h3&gt;

&lt;p&gt;N people throw their hats in a box and then each picks one at random. Let X be the number of people who get their own hat.&lt;/p&gt;

&lt;p&gt;Find E[X].&lt;/p&gt;

&lt;p&gt;Let xi = 1 if person i selects their own hat, else 0.&lt;/p&gt;

&lt;p&gt;X = x1 + x2 + x3 + .... + xn&lt;/p&gt;

&lt;p&gt;P(xi = 1) = 1/n [each person has probability 1/n of picking their own hat]&lt;/p&gt;

&lt;p&gt;E[xi] = 1/n * 1 + (1 - 1/n) * 0 = 1/n&lt;br&gt;
E[X] = n * 1/n = 1&lt;/p&gt;

&lt;p&gt;So the expected value is 1, meaning that on average one person gets their own hat back!&lt;/p&gt;

&lt;p&gt;Let's calculate the variance.&lt;/p&gt;

&lt;p&gt;Var(X) = E[X^2] - (E[X])^2&lt;/p&gt;

&lt;p&gt;(E[X])^2 = 1&lt;br&gt;
X^2 = Σ xi^2 + Σ xixj (where i != j), so E[X^2] = Σ E[xi^2] + Σ E[xixj]&lt;/p&gt;

&lt;p&gt;E[xi^2] = E[xi] = 1/n&lt;/p&gt;

&lt;p&gt;E[X^2] = n * 1/n + 1/n * 1/(n-1) * (n^2 - n)&lt;/p&gt;

&lt;p&gt;Why 1/n * 1/(n-1) * (n^2 - n)?&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;n^2 - n (since n^2 elements in total and we remove elements where i = j)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;P(xixj = 1) = P(xi = 1) P(xj = 1 | xi = 1),&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;which is equal to 1/n * 1/(n - 1).&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;thus E[X^2] = 1 + 1 = 2&lt;/p&gt;

&lt;p&gt;Var(X) = 2 - 1 = 1&lt;/p&gt;

&lt;p&gt;Thus, with E[X] = 1 and Var(X) = 1, we can infer that the number of people who get their own hat back will most likely be small, say k = 0, 1, or 2.&lt;/p&gt;
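&lt;p&gt;A quick Monte Carlo sketch of the hat problem: both the sample mean and sample variance of the number of fixed points come out close to 1, matching the derivation above.&lt;/p&gt;

```python
import random

def hat_trial(n):
    """One round: n people pick hats uniformly at random (a random permutation)."""
    hats = list(range(n))
    random.shuffle(hats)
    # Count fixed points: person i picked hat i.
    return sum(1 for i, h in enumerate(hats) if i == h)

random.seed(0)
n, trials = 10, 100_000
samples = [hat_trial(n) for _ in range(trials)]

mean = sum(samples) / trials
var = sum((x - mean) ** 2 for x in samples) / trials
print(round(mean, 2), round(var, 2))  # both close to 1
```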

</description>
      <category>maths</category>
    </item>
    <item>
      <title>Probability - Independence and Counting</title>
      <dc:creator>Yadav Prasad G B</dc:creator>
      <pubDate>Wed, 01 Oct 2025 11:18:48 +0000</pubDate>
      <link>https://forem.com/yadav_prasadgb_34fcd06b/probability-independence-and-counting-497e</link>
      <guid>https://forem.com/yadav_prasadgb_34fcd06b/probability-independence-and-counting-497e</guid>
      <description>&lt;p&gt;before getting into independence and counting let's see a very interesting topic.&lt;/p&gt;




&lt;h2&gt;
  
  
  Infinite Independent Events
&lt;/h2&gt;

&lt;p&gt;Consider flipping a fair coin repeatedly: the probability that the first head appears on the nth toss is 1/2^n, so across tosses these probabilities are 1/2, 1/4, 1/8, and so on. What if we were asked for the probability that the first head lands on an even-numbered flip? The additivity axiom as stated covers finitely many events, so what is the solution for an infinite sequence of disjoint events? We sum an infinite series, and the &lt;strong&gt;geometric series&lt;/strong&gt; is the tool. The formula is given in the picture below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fco30lcmj5d4wl9rzupv6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fco30lcmj5d4wl9rzupv6.png" alt=" " width="363" height="145"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Using this formula we can compute the probability: P({2, 4, 6, ...}) = P(2) + P(4) + P(6) + ...&lt;br&gt;
which equals 1/4 + 1/16 + 1/64 + ... = 1/2^2 + 1/2^4 + 1/2^6 + ...&lt;br&gt;
which comes to 1/3 after applying the formula.&lt;/p&gt;

&lt;p&gt;Geometric series are very useful for infinite sequences of independent events that can be ordered sequentially.&lt;/p&gt;
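&lt;p&gt;The 1/3 answer can be verified numerically: summing many terms of the series agrees with the closed-form geometric sum a/(1 - r) with a = 1/4 and r = 1/4.&lt;/p&gt;

```python
# P(first head on flip n) = 1/2**n for a fair coin.
# P(first head on an even flip) = sum over n = 2, 4, 6, ...
partial = sum(1 / 2**n for n in range(2, 200, 2))

# Closed form from the geometric series: a / (1 - r) with a = 1/4, r = 1/4.
closed = (1/4) / (1 - 1/4)

print(partial, closed)  # both are 1/3
```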


&lt;h2&gt;
  
  
  Independence
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Definition:&lt;/strong&gt; P(A | B) = P(A): the occurrence of B provides no information about whether A occurs.&lt;/p&gt;

&lt;p&gt;We know that P(A ∩ B) = P(B) * P(A|B).&lt;/p&gt;

&lt;p&gt;With independence: P(A ∩ B) = P(B) * P(A).&lt;/p&gt;

&lt;p&gt;Analogously, A and B are &lt;strong&gt;conditionally independent&lt;/strong&gt; given C when:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;P(A n B | C) = P(A|C) * P(B|C)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Where can we see this independence? Take a coin toss with P(heads) = p. If you were told that the first 10 flips were heads, does that change your belief about the probability of heads on the 11th toss? It is the same probability p for every toss, so observing 10 flips does not change your belief about the 11th flip.&lt;/p&gt;




&lt;h2&gt;
  
  
  How conditioning can affect independence
&lt;/h2&gt;

&lt;p&gt;Consider two unfair coins A and B with P(heads | A) = 0.9 and P(heads | B) = 0.1, where the person chooses either coin with equal probability.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb2vj9abde2r6powwgud1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb2vj9abde2r6powwgud1.png" alt=" " width="376" height="390"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once we know it is coin A, are the tosses independent? Yes.&lt;/p&gt;

&lt;p&gt;If we do not know which coin it is, are the tosses independent? What is P(11th toss is heads)? Answer: 1/2 * 0.9 + 1/2 * 0.1 = 1/2 (a 90% chance of heads if we hold coin A and a 10% chance if we hold coin B, each equally likely).&lt;/p&gt;

&lt;p&gt;Now condition on the past: given that the first 10 tosses were heads, what is P(11th toss is heads | first 10 tosses are heads)? Answer: ≈ 0.9.&lt;/p&gt;

&lt;p&gt;After 10 heads we can be nearly certain that coin A is being flipped, though not 100% certain, since coin B could also produce this outcome by a miracle; that chance is very close to zero, precisely (0.1)^10. So conditioning affects the relationship between events: it can make two independent events dependent, or vice versa.&lt;/p&gt;
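&lt;p&gt;The ≈ 0.9 answer comes from a Bayes' rule calculation, sketched here: compute the posterior probability of each coin given 10 heads, then average the two heads probabilities under that posterior.&lt;/p&gt;

```python
# P(11th toss heads | first 10 tosses heads) for the two-coin setup above.
p_a, p_b = 0.5, 0.5    # prior: either coin chosen with equal probability
h_a, h_b = 0.9, 0.1    # P(heads | coin A), P(heads | coin B)

# Posterior over the coin after observing 10 heads (Bayes' rule).
like_a = p_a * h_a**10
like_b = p_b * h_b**10
post_a = like_a / (like_a + like_b)

# Predictive probability for the 11th toss.
p_next = post_a * h_a + (1 - post_a) * h_b
print(post_a, round(p_next, 4))  # posterior ≈ 1, P(next heads) ≈ 0.9
```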




&lt;h1&gt;
  
  
  Basic Counting Principle
&lt;/h1&gt;

&lt;h2&gt;
  
  
  Permutations
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Permutations :&lt;/strong&gt; Number of ways of ordering n elements.&lt;/p&gt;

&lt;p&gt;To understand permutations: given n elements in a set, how many different arrangements can we make? &lt;strong&gt;n! ways&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh6b5nd4rs5kbowrww2hu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh6b5nd4rs5kbowrww2hu.png" alt=" " width="800" height="399"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now consider this problem: a die is thrown 6 times. What is the probability that all six faces appear exactly once across the 6 throws? For example, {1, 3, 2, 4, 6, 5} is one such arrangement.&lt;/p&gt;

&lt;p&gt;We can solve this by first counting all outcomes of throwing a die six times, which gives the sample space |Ω| = 6 * 6 * 6 * 6 * 6 * 6 = 6^6. From the sample space, how many unique arrangements can be made? From the above, n! arrangements, which is 6!. So the resulting probability is:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;|E|       6!
----   = ----    
|Ω|       6^6
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
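&lt;p&gt;Evaluating the ratio above numerically:&lt;/p&gt;

```python
from math import factorial

# P(all six faces appear exactly once in six throws) = 6! / 6^6
prob = factorial(6) / 6**6
print(prob)  # 720 / 46656 ≈ 0.0154
```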






&lt;h2&gt;
  
  
  Combinations
&lt;/h2&gt;

&lt;p&gt;We know that n elements can be arranged in n! ways. What about selecting k elements from an n-element set?&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fva5voxl5bq3vvkfd5a2s.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fva5voxl5bq3vvkfd5a2s.png" alt=" " width="800" height="415"&gt;&lt;/a&gt; &lt;/p&gt;

&lt;p&gt;This can be rewritten as &lt;strong&gt;n!/(n-k)!&lt;/strong&gt; ways.&lt;/p&gt;

&lt;p&gt;Let the number of k-element subsets of a given n-element set be denoted by C(n,k) or ⁿCₖ.&lt;/p&gt;

&lt;p&gt;Each k-element subset chosen from the n-element set can itself be arranged in k! ways.&lt;/p&gt;

&lt;p&gt;from the above equation :&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;k! × C(n,k) = n!/(n-k)!

C(n,k) = n!/[k!(n-k)!]
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;So there are two ways to arrive at the formula C(n,k) = n!/[k!(n-k)!].&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhn1e6y99hx1edppl0ok7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhn1e6y99hx1edppl0ok7.png" alt=" " width="721" height="508"&gt;&lt;/a&gt;&lt;/p&gt;
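&lt;p&gt;The identity k! × C(n,k) = n!/(n-k)! is easy to check numerically; the values n = 10, k = 4 here are just an arbitrary test case:&lt;/p&gt;

```python
from math import comb, factorial

n, k = 10, 4

# Ordered selections of k out of n: n!/(n-k)!
ordered = factorial(n) // factorial(n - k)

# Each k-subset yields k! orderings, so C(n,k) = [n!/(n-k)!] / k!
print(ordered // factorial(k), comb(n, k))  # both 210
```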




&lt;h2&gt;
  
  
  Partitioning
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F25c3cxhrvnhcb8ft2z0y.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F25c3cxhrvnhcb8ft2z0y.png" alt=" " width="439" height="358"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If n elements are partitioned into groups of sizes m1, m2, m3, and m4, the total number of ways to form the groups is&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;        n!
  ---------------
   m1! m2! m3! m4!
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;For example,&lt;/strong&gt; if a deck of 52 cards is partitioned into 4 groups of 13 cards each, the number of ways the cards can be distributed is&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;52!/(13! * 13! * 13! * 13!)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;How does it work?&lt;/strong&gt; Let's understand it intuitively.&lt;/p&gt;

&lt;p&gt;Out of 52 cards, 13 can be selected in (52 choose 13) ways. After selecting the first 13 cards we have 39 remaining, and we select another 13 in (39 choose 13) ways. After the second group we have 26 remaining, and we select another 13 in (26 choose 13) ways. The final 13 cards automatically form the last group: (13 choose 13) = 1.&lt;/p&gt;

&lt;p&gt;So the total number of ways is:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;(52 choose 13) × (39 choose 13) × (26 choose 13) × (13 choose 13)

= [52!/(13!×39!)] × [39!/(13!×26!)] × [26!/(13!×13!)] × [13!/(13!×0!)]
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now let's see the cancellation:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The 39! in the numerator of the second term cancels with the 39! in the denominator of the first term&lt;/li&gt;
&lt;li&gt;The 26! in the numerator of the third term cancels with the 26! in the denominator of the second term
&lt;/li&gt;
&lt;li&gt;The 13! in the numerator of the fourth term cancels with the 13! in the denominator of the third term&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;After all the cancellations we get:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;= 52! / (13! × 13! × 13! × 13!)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
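&lt;p&gt;Since Python integers are exact, the cancellation argument above can be confirmed directly: the chained binomial product equals the single multinomial coefficient.&lt;/p&gt;

```python
from math import comb, factorial

# Dealing 52 cards into four 13-card hands, as the chained products above:
chained = comb(52, 13) * comb(39, 13) * comb(26, 13) * comb(13, 13)

# After cancellation: the multinomial coefficient 52! / (13!)^4
multinomial = factorial(52) // factorial(13) ** 4

print(chained == multinomial)  # True
```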






&lt;p&gt;that's all for now folks, I will catch you up later.....&lt;/p&gt;

</description>
      <category>datascience</category>
      <category>learning</category>
      <category>maths</category>
    </item>
    <item>
      <title>Probability - Introduction, axioms, Conditional and Bayes' Rule</title>
      <dc:creator>Yadav Prasad G B</dc:creator>
      <pubDate>Mon, 29 Sep 2025 14:33:28 +0000</pubDate>
      <link>https://forem.com/yadav_prasadgb_34fcd06b/probability-introduction-axioms-conditional-and-bayes-rule-1on8</link>
      <guid>https://forem.com/yadav_prasadgb_34fcd06b/probability-introduction-axioms-conditional-and-bayes-rule-1on8</guid>
      <description>&lt;h1&gt;
  
  
  Probability
&lt;/h1&gt;

&lt;p&gt;Probability is everywhere. It helps determine the likelihood of certain events given the whole sample space. By understanding probability, we can make better decisions based on the chances of favorable outcomes and reduce the impact of unfavorable events. Probability is widely used in areas like:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Climate prediction
&lt;/li&gt;
&lt;li&gt;Machine learning algorithms
&lt;/li&gt;
&lt;li&gt;Competitive programming
&lt;/li&gt;
&lt;li&gt;Medical research
&lt;/li&gt;
&lt;li&gt;Decision-making in everyday life
&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Terminology
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Sample Space (Ω):&lt;/strong&gt; List of all the elements in a set / all possible outcomes.
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Mutually Exclusive Events:&lt;/strong&gt; Two or more events that cannot occur at the same time in a single trial of an experiment.
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Collectively Exhaustive Events:&lt;/strong&gt; At least one of the events must occur in a single trial of an experiment.
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Event:&lt;/strong&gt; A set of favorable outcomes.
&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Understanding Probability Visually
&lt;/h2&gt;

&lt;p&gt;Visualization helps intuitively understand probability. It works well for discrete sample spaces but can get complex for continuous probability. For example, consider two rolls of a tetrahedral die.  &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyny9a8rrhzxur2wlat7h.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyny9a8rrhzxur2wlat7h.png" alt="Visualizing two dice rolls" width="498" height="462"&gt;&lt;/a&gt;  &lt;/p&gt;

&lt;p&gt;Another way to visualize:  &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdep2hhpkrv4b3bnbe6ry.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdep2hhpkrv4b3bnbe6ry.png" alt="Another sample space visualization" width="592" height="638"&gt;&lt;/a&gt;  &lt;/p&gt;




&lt;h2&gt;
  
  
  Probability Axioms
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Non-negativity:&lt;/strong&gt; P(A) ≥ 0
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Normalization:&lt;/strong&gt; P(Ω) = 1
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Additivity Rule:&lt;/strong&gt; If A ∩ B = ∅, then P(A ∪ B) = P(A) + P(B)
&lt;/li&gt;
&lt;/ol&gt;




&lt;h2&gt;
  
  
  Conditional Probability
&lt;/h2&gt;

&lt;p&gt;Conditional probability calculates the likelihood of an event occurring given that another event has already occurred:&lt;/p&gt;

&lt;p&gt;P(A | B) = P(A ∩ B) / P(B),  where P(B) ≠ 0  &lt;/p&gt;

&lt;p&gt;Equivalently:&lt;/p&gt;

&lt;p&gt;P(A ∩ B) = P(A | B) × P(B)  &lt;/p&gt;

&lt;p&gt;Let's understand how this formula can be derived intuitively with an example.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Example: Student &amp;amp; Exam&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Event B: Student studies for the exam
&lt;/li&gt;
&lt;li&gt;Event A: Student passes the exam
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Suppose:  &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;70% of students study: P(B) = 0.7
&lt;/li&gt;
&lt;li&gt;Given they studied, 90% pass: P(A | B) = 0.9
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Then:&lt;/p&gt;

&lt;p&gt;P(A ∩ B) = 0.7 × 0.9 = 0.63  &lt;/p&gt;

&lt;p&gt;Meaning there’s a 63% chance that a randomly chosen student both studies and passes. 🎯  &lt;/p&gt;

&lt;p&gt;Let’s visualize this with a probability tree diagram:  &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2g1onjhcpicvmhhy1nrh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2g1onjhcpicvmhhy1nrh.png" alt="Probability Tree Diagram" width="729" height="484"&gt;&lt;/a&gt;  &lt;/p&gt;
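&lt;p&gt;The student-and-exam numbers above reduce to one multiplication, sketched here for completeness:&lt;/p&gt;

```python
# Student & exam example: P(B) = 0.7 (studies), P(A | B) = 0.9 (passes given studied).
p_b = 0.7
p_a_given_b = 0.9

# Multiplication rule: P(A ∩ B) = P(A | B) * P(B)
p_a_and_b = p_a_given_b * p_b
print(round(p_a_and_b, 2))  # 0.63
```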




&lt;h2&gt;
  
  
  Bayes’ Theorem
&lt;/h2&gt;

&lt;p&gt;From conditional probability, we get Bayes’ theorem:&lt;/p&gt;

&lt;p&gt;P(A | B) = [P(A) × P(B | A)] / P(B)  &lt;/p&gt;

&lt;p&gt;Where:&lt;/p&gt;

&lt;p&gt;P(A ∩ B) = P(A) × P(B | A)  &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Terminology in Bayes’ Theorem:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Posterior (P(H | E)):&lt;/strong&gt; Probability of the hypothesis given the evidence.
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Prior (P(H)):&lt;/strong&gt; Probability of the hypothesis before observing the evidence.
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Likelihood (P(E | H)):&lt;/strong&gt; Probability of the evidence given the hypothesis is true.
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Marginal (P(E)):&lt;/strong&gt; Probability of the evidence under all possible hypotheses.
&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Multiplicative Rule
&lt;/h2&gt;

&lt;p&gt;For multiple events, the probability of all events occurring is:&lt;/p&gt;

&lt;p&gt;P(A1 ∩ A2 ∩ ... ∩ An) = P(A1) × P(A2 | A1) × ... × P(An | A1 ∩ ... ∩ A(n-1))  &lt;/p&gt;

&lt;p&gt;This is a generalization of conditional probability for multiple events.&lt;/p&gt;

&lt;p&gt;Now let us understand this visually using the probability tree diagram below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxanlbhnrbxfftl320i19.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxanlbhnrbxfftl320i19.png" alt=" " width="526" height="523"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;that's all for now folks, I will catch you up later.....&lt;/p&gt;

</description>
      <category>maths</category>
    </item>
  </channel>
</rss>
