

4.1.4 Solved Problems: Continuous Random Variables
Let $X$ be a continuous random variable with PDF $$f_X(x) = \left\{ \begin{array}{l l} cx^2 & \quad |x| \leq 1\\ 0 & \quad \text{otherwise} \end{array} \right.$$
- Find the constant $c$.
- Find $EX$ and Var$(X)$.
- Find $P(X \geq \frac{1}{2})$.
- To find $P(X \geq \frac{1}{2})$, we can write $$P(X \geq \frac{1}{2})=\frac{3}{2} \int_{\frac{1}{2}}^{1} x^2dx=\frac{7}{16}.$$
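The normalization and the tail probability above can be sanity-checked in exact rational arithmetic; a minimal sketch, assuming the PDF $f_X(x)=\frac{3}{2}x^2$ on $[-1,1]$:

```python
from fractions import Fraction

# Assumed PDF (for this check): f(x) = (3/2) x^2 on [-1, 1].
# Its antiderivative is x^3 / 2.
def F(x):
    return Fraction(x) ** 3 / 2

total = F(1) - F(-1)            # total area under the PDF
p = F(1) - F(Fraction(1, 2))    # P(X >= 1/2)
print(total, p)                 # 1 7/16
```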
Let $X$ be a continuous random variable with PDF given by $$f_X(x)=\frac{1}{2}e^{-|x|}, \hspace{20pt} \textrm{for all }x \in \mathbb{R}.$$ If $Y=X^2$, find the CDF of $Y$.
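One way to solve this (a sketch, using the symmetry of the PDF about $0$): for $y \geq 0$,

```latex
F_Y(y) = P(X^2 \leq y)
       = P(-\sqrt{y} \leq X \leq \sqrt{y})
       = \int_{-\sqrt{y}}^{\sqrt{y}} \tfrac{1}{2} e^{-|x|}\, dx
       = \int_{0}^{\sqrt{y}} e^{-x}\, dx
       = 1 - e^{-\sqrt{y}},
```

and $F_Y(y) = 0$ for $y < 0$.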
Let $X$ be a continuous random variable with PDF \begin{equation} \nonumber f_X(x) = \left\{ \begin{array}{l l} 4x^3 & \quad 0 < x \leq 1\\ 0 & \quad \text{otherwise} \end{array} \right. \end{equation} Find $P\left(X \leq \frac{2}{3} \,\middle|\, X > \frac{1}{3}\right)$.
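Assuming the PDF $f_X(x)=4x^3$ on $(0,1]$ and the conditional probability $P(X \leq \frac{2}{3} \mid X > \frac{1}{3})$, the answer works out in exact arithmetic as follows:

```python
from fractions import Fraction

# CDF for the assumed PDF f(x) = 4x^3 on (0, 1]: F(x) = x^4.
def F(x):
    return Fraction(x) ** 4

num = F(Fraction(2, 3)) - F(Fraction(1, 3))   # P(1/3 < X <= 2/3)
den = 1 - F(Fraction(1, 3))                   # P(X > 1/3)
print(num / den)  # 3/16
```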
Let $X$ be a continuous random variable with PDF \begin{equation} \nonumber f_X(x) = \left\{ \begin{array}{l l} x^2\left(2x+\frac{3}{2}\right) & \quad 0 < x \leq 1\\ 0 & \quad \text{otherwise} \end{array} \right. \end{equation} If $Y=\frac{2}{X}+3$, find Var$(Y)$.
First, note that $$\textrm{Var}(Y)=\textrm{Var}\left(\frac{2}{X}+3\right)=4\textrm{Var}\left(\frac{1}{X}\right), \hspace{15pt} \textrm{using Equation 4.4}$$ Thus, it suffices to find Var$(\frac{1}{X})=E[\frac{1}{X^2}]-(E[\frac{1}{X}])^2$. Using LOTUS, we have $$E\left[\frac{1}{X}\right]=\int_{0}^{1} x\left(2x+\frac{3}{2}\right) dx =\frac{17}{12}$$ $$E\left[\frac{1}{X^2}\right]=\int_{0}^{1} \left(2x+\frac{3}{2}\right) dx =\frac{5}{2}.$$ Thus, Var$\left(\frac{1}{X}\right)=E[\frac{1}{X^2}]-(E[\frac{1}{X}])^2=\frac{71}{144}$. So, we obtain $$\textrm{Var}(Y)=4\textrm{Var}\left(\frac{1}{X}\right)=\frac{71}{36}.$$
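The closed-form answer can be cross-checked by numerical integration; a midpoint-rule sketch, using the PDF $f_X(x)=x^2(2x+\frac{3}{2})$ on $(0,1]$ from the problem statement:

```python
# Midpoint-rule check of Var(Y) for Y = 2/X + 3,
# with f(x) = x^2 (2x + 3/2) on (0, 1].
n = 200_000
h = 1.0 / n
EY = EY2 = 0.0
for i in range(n):
    x = (i + 0.5) * h
    fx = x * x * (2 * x + 1.5)
    y = 2.0 / x + 3.0
    EY += y * fx * h          # accumulates E[Y]
    EY2 += y * y * fx * h     # accumulates E[Y^2]
var_y = EY2 - EY ** 2
print(var_y)  # ≈ 71/36 ≈ 1.9722
```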
Let $X$ be a positive continuous random variable. Prove that $EX=\int_{0}^{\infty} P(X \geq x) dx$.
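A standard argument (a sketch): write $t$ as an integral and swap the order of integration, which is justified by Tonelli's theorem since the integrand is non-negative.

```latex
\begin{aligned}
EX &= \int_{0}^{\infty} t\, f_X(t)\, dt
    = \int_{0}^{\infty} \left( \int_{0}^{t} 1\, dx \right) f_X(t)\, dt \\
   &= \int_{0}^{\infty} \int_{x}^{\infty} f_X(t)\, dt\, dx
    = \int_{0}^{\infty} P(X \geq x)\, dx.
\end{aligned}
```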


Probability Density Function
Continuous distributions are constructed from continuous random variables which take values at every point over a given interval and are usually generated from experiments in which things are “measured” as opposed to “counted”.
With continuous distributions, probabilities of outcomes occurring between particular points are determined by calculating the area under the probability density function (pdf) curve between those points. In addition, the entire area under the whole curve is equal to 1.
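The area interpretation can be made concrete numerically. The sketch below uses the trapezoidal rule with an assumed exponential PDF $f(x)=e^{-x}$ for $x \geq 0$ (chosen only for illustration), for which the exact answer is $e^{-1}-e^{-2}$:

```python
import math

# Assumed PDF (for illustration): f(x) = e^{-x} for x >= 0.
def pdf(x):
    return math.exp(-x)

# P(a < X < b) as the area under the PDF, via the trapezoidal rule.
def prob(a, b, n=10_000):
    h = (b - a) / n
    s = 0.5 * (pdf(a) + pdf(b)) + sum(pdf(a + i * h) for i in range(1, n))
    return s * h

print(prob(1.0, 2.0))  # ≈ e^{-1} - e^{-2} ≈ 0.2325
```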

14.1 - Probability Density Functions
A continuous random variable takes on an uncountably infinite number of possible values. For a discrete random variable \(X\) that takes on a finite or countably infinite number of possible values, we determined \(P(X=x)\) for all of the possible values of \(X\), and called it the probability mass function ("p.m.f."). For continuous random variables, as we shall soon see, the probability that \(X\) takes on any particular value \(x\) is 0. That is, finding \(P(X=x)\) for a continuous random variable \(X\) is not going to work. Instead, we'll need to find the probability that \(X\) falls in some interval \((a, b)\), that is, we'll need to find \(P(a<X<b)\). We'll do that using a probability density function ("p.d.f."). We'll first motivate a p.d.f. with an example, and then we'll formally define it.
Example 14-1

Even though a fast-food chain might advertise a hamburger as weighing a quarter-pound, you can well imagine that it is not exactly 0.25 pounds. One randomly selected hamburger might weigh 0.23 pounds while another might weigh 0.27 pounds. What is the probability that a randomly selected hamburger weighs between 0.20 and 0.30 pounds? That is, if we let \(X\) denote the weight of a randomly selected quarter-pound hamburger in pounds, what is \(P(0.20<X<0.30)\)?
In reality, I'm not particularly interested in using this example just so that you'll know whether or not you've been ripped off the next time you order a hamburger! Instead, I'm interested in using the example to illustrate the idea behind a probability density function.
Now, you could imagine randomly selecting, let's say, 100 hamburgers advertised to weigh a quarter-pound. If you weighed the 100 hamburgers, and created a density histogram of the resulting weights, perhaps the histogram might look something like this:
In this case, the histogram illustrates that most of the sampled hamburgers do indeed weigh close to 0.25 pounds, but some are a bit more and some a bit less. Now, what if we decreased the length of the class interval on that density histogram? Then, the density histogram would look something like this:
Now, what if we pushed this further and decreased the intervals even more? You can imagine that the intervals would eventually get so small that we could represent the probability distribution of \(X\), not as a density histogram, but rather as a curve (by connecting the "dots" at the tops of the tiny tiny tiny rectangles) that, in this case, might look like this:
Such a curve is denoted \(f(x)\) and is called a (continuous) probability density function.
Now, you might recall that a density histogram is defined so that the area of each rectangle equals the relative frequency of the corresponding class, and the area of the entire histogram equals 1. That suggests then that finding the probability that a continuous random variable \(X\) falls in some interval of values involves finding the area under the curve \(f(x)\) sandwiched by the endpoints of the interval. In the case of this example, the probability that a randomly selected hamburger weighs between 0.20 and 0.30 pounds is then this area:
Now that we've motivated the idea behind a probability density function for a continuous random variable, let's now go and formally define it.
The probability density function ("p.d.f.") of a continuous random variable \(X\) with support \(S\) is an integrable function \(f(x)\) satisfying the following:
\(f(x)\) is positive everywhere in the support \(S\), that is, \(f(x)>0\), for all \(x\) in \(S\)
The area under the curve \(f(x)\) in the support \(S\) is 1, that is:
\(\int_S f(x)dx=1\)
If \(f(x)\) is the p.d.f. of \(X\), then the probability that \(X\) belongs to \(A\), where \(A\) is some interval, is given by the integral of \(f(x)\) over that interval, that is:
\(P(X \in A)=\int_A f(x)dx\)
As you can see, the definition for the p.d.f. of a continuous random variable differs from the definition for the p.m.f. of a discrete random variable by simply changing the summations that appeared in the discrete case to integrals in the continuous case. Let's test this definition out on an example.
Example 14-2
Let \(X\) be a continuous random variable whose probability density function is:
\(f(x)=3x^2, \qquad 0<x<1\)
First, note again that \(f(x)\ne P(X=x)\). For example, \(f(0.9)=3(0.9)^2=2.43\), which is clearly not a probability! In the continuous case, \(f(x)\) is instead the height of the curve at \(X=x\), so that the total area under the curve is 1. In the continuous case, it is areas under the curve that define the probabilities.
Now, let's first start by verifying that \(f(x)\) is a valid probability density function.
What is the probability that \(X\) falls between \(\frac{1}{2}\) and 1? That is, what is \(P\left(\frac{1}{2}<X<1\right)\)?
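Since the antiderivative of \(3x^2\) is \(x^3\), this question can be answered exactly; a quick check in exact arithmetic, assuming \(f(x)=3x^2\) on \(0<x<1\) as above:

```python
from fractions import Fraction

# F(x) = x^3 is the antiderivative of f(x) = 3x^2 on (0, 1).
def F(x):
    return Fraction(x) ** 3

total = F(1) - F(0)              # total area: should be 1 (valid PDF)
p = F(1) - F(Fraction(1, 2))     # P(1/2 < X < 1)
print(total, p)                  # 1 7/8
```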
What is \(P\left(X=\frac{1}{2}\right)\)?
It is a straightforward integration to see that the probability is 0:
\(\int^{1/2}_{1/2} 3x^2dx=\left[x^3\right]^{x=1/2}_{x=1/2}=\dfrac{1}{8}-\dfrac{1}{8}=0\)
In fact, in general, if \(X\) is continuous, the probability that \(X\) takes on any specific value \(x\) is 0. That is, when \(X\) is continuous, \(P(X=x)=0\) for all \(x\) in the support.
An implication of the fact that \(P(X=x)=0\) for all \(x\) when \(X\) is continuous is that you can be careless about the endpoints of intervals when finding probabilities of continuous random variables. That is:
\(P(a\le X\le b)=P(a<X\le b)=P(a\le X<b)=P(a<X<b)\)
for any constants \(a\) and \(b\).
Example 14-3
\(f(x)=\dfrac{x^3}{4}\)
for an interval \(0<x<c\). What is the value of the constant \(c\) that makes \(f(x)\) a valid probability density function?
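Setting the total area to 1 gives \(\int_0^c \frac{x^3}{4}dx = \frac{c^4}{16} = 1\), so \(c = 16^{1/4} = 2\); a one-line numerical check:

```python
# Total area under f(x) = x^3/4 on (0, c) is c^4/16; set it to 1 and solve.
c = 16 ** 0.25
print(c)  # 2.0
```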
Probability Density Function
The probability density function is a function that gives the likelihood that the value of a continuous random variable falls within a particular range of values. We use the probability density function for continuous random variables; for discrete random variables, the analogous object is the probability mass function.
The graph of a probability density function depends on the distribution; for the normal distribution, it is the familiar bell curve. The area under the curve between any two specified values gives the probability that the random variable falls between them. We integrate this function to determine the probabilities associated with a continuous random variable. In this article, we will do a detailed analysis of the probability density function and take a look at the various aspects related to it.
What is Probability Density Function?
Probability density function and cumulative distribution function are used to define the distribution of continuous random variables. If we differentiate the cumulative distribution function of a continuous random variable it results in the probability density function. Conversely, on integrating the probability density function we get the cumulative distribution function.
Probability Density Function Definition
Probability density function defines the density of the probability that a continuous random variable will lie within a particular range of values. To determine this probability, we integrate the probability density function between two specified points.
Probability Density Function Example
Say we have a continuous random variable whose probability density function is given by f(x) = x/2, when 0 < x ≤ 2 (note that this integrates to 1 over its support, as a valid PDF must). We want to find P(0.5 < X < 1), so we integrate x/2 within the limits 0.5 and 1. This gives us 3/16 = 0.1875. Thus, the probability that the continuous random variable lies between 0.5 and 1 is 0.1875.
Probability Density Function Formula
The probability density function of a continuous random variable is analogous to the probability mass function of a discrete random variable. Discrete random variables can be evaluated at a particular point, while continuous random variables have to be evaluated over an interval. This is because the probability that a continuous random variable will take an exact value is 0. Given below are the various probability density function formulas.
Probability Density Function of Continuous Random Variable
Suppose we have a continuous random variable, X. Let F(x) be the cumulative distribution function of X. Then the formula for the probability density function, f(x), is given as follows:
f(x) = \(\frac{\mathrm{d} F(x)}{\mathrm{d} x}\) = F'(x)
If we want to find the probability that X lies between lower limit 'a' and upper limit 'b' then using the probability density function this can be given as:
P(a < X ≤ b) = F(b) - F(a) = \(\int_{a}^{b}f(x)dx\)
Here, F(b) and F(a) represent the cumulative distribution function at b and a respectively.
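As a small illustration of this formula, assume (purely for the sake of example) the CDF F(x) = x² on [0, 1], so that f(x) = 2x:

```python
# Assumed CDF for illustration: F(x) = x^2 on [0, 1] (so f(x) = 2x).
def F(x):
    return x * x

p = F(0.6) - F(0.3)  # P(0.3 < X <= 0.6) = 0.36 - 0.09
print(p)             # ≈ 0.27
```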
Probability Density Function of Normal Distribution
Normal distribution is the most widely used type of continuous probability distribution. The notation for normal distribution is given as \(X \sim N(\mu ,\sigma ^{2})\). The probability density function of a normal distribution is given below.
f(x) = \(\frac{1}{\sigma \sqrt{2\pi}}e^{-\frac{1}{2}\left ( \frac{x - \mu }{\sigma } \right )^{2}}\)
Here, \(\mu\) is the mean and \(\sigma\) is the standard deviation, while \(\sigma^{2}\) is the variance.
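The formula can be evaluated directly; a minimal sketch, assuming the standard normal (μ = 0, σ = 1) as the default case:

```python
import math

# Normal PDF evaluated straight from the formula above.
def normal_pdf(x, mu=0.0, sigma=1.0):
    coeff = 1.0 / (sigma * math.sqrt(2.0 * math.pi))
    return coeff * math.exp(-0.5 * ((x - mu) / sigma) ** 2)

print(normal_pdf(0.0))  # ≈ 0.3989, i.e. 1/sqrt(2*pi)
```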
Probability Density Function Graph
If X is a continuous random variable, then the probability distribution of this variable is given by the probability density function. The probability that X will lie between two points a and b is the area under the curve between them.

Mean of Probability Density Function
In the case of a probability density function, the mean is the expected value or the average value of the random variable. If f(x) is the probability density function of the random variable X, then mean is given by the following formula:
E[X] = \(\mu = \int_{-\infty }^{\infty}xf(x)dx\)
Median of Probability Density Function
The median is the value that splits the probability density function curve into two equal halves. Suppose x = m is the value of the median. The area under the curve from \(-\infty\) to m will be equal to the area under the curve from m to \(\infty\), and each of these areas equals 1/2. Thus, the median of the probability density function satisfies:
\(\int_{-\infty }^{m}f(x)dx = \int_{m}^{\infty }f(x)dx\) = 1/2
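For a concrete case, take f(x) = 3x² on (0, 1) (an assumption made here for illustration): the CDF is F(x) = x³, so the median m solves m³ = 1/2. Bisection recovers it numerically:

```python
# Bisection on F(m) = 1/2, with F(x) = x^3 (the CDF of f(x) = 3x^2 on (0, 1)).
lo, hi = 0.0, 1.0
for _ in range(60):
    mid = (lo + hi) / 2
    if mid ** 3 < 0.5:
        lo = mid
    else:
        hi = mid
median = (lo + hi) / 2
print(median)  # ≈ 0.7937, i.e. (1/2)^(1/3)
```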

Variance of Probability Density Function
The expected value of the squared deviation from the mean is the variance of a random variable. Expressing this definition mathematically we get,
Var(X) = \(E[(X - \mu )^{2}]\)
To represent this variance with the help of the probability density function, the formula is given as:
Var(X) = \(\sigma ^{2} = \int_{-\infty }^{\infty }(x - \mu )^{2}f(x)dx\)
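Both moment formulas can be checked numerically; a midpoint-rule sketch assuming f(x) = 3x² on (0, 1), for which E[X] = 3/4 and Var(X) = 3/80:

```python
# Mean and variance of f(x) = 3x^2 on (0, 1) by the midpoint rule.
n = 100_000
h = 1.0 / n
xs = [(i + 0.5) * h for i in range(n)]
mean = sum(x * 3 * x * x * h for x in xs)             # E[X]
var = sum((x - mean) ** 2 * 3 * x * x * h for x in xs)  # E[(X - mu)^2]
print(mean, var)  # ≈ 0.75 and ≈ 0.0375 (= 3/80)
```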
Properties of Probability Density Function
The properties of the probability density function help to solve questions faster. If f(x) is the probability distribution of a continuous random variable, X, then some of the useful properties are listed below:
- f(x) ≥ 0. The probability density function is non-negative for all real numbers; it can never take a value less than 0.
- \(\int_{-\infty }^{\infty}f(x)dx\) = 1. Thus, the total area under the probability density curve will be equal to 1.
Important Notes on Probability Density Function
- Probability density function determines the probability that a continuous random variable will fall between a range of specified values.
- On differentiating the cumulative distribution function, we obtain the probability density function.
- The mean of the probability density function can be given as E[X] = \(\mu = \int_{-\infty }^{\infty}xf(x)dx\).
- As the median divides the probability density function curve into 2 equal halves, the area on each side of the median is equal to 1/2.
- The variance of a probability density function is given by Var(X) = \(\sigma ^{2} = \int_{-\infty }^{\infty }(x - \mu )^{2}f(x)dx\)
Examples on Probability Density Function
- Example 1: If the probability density function is given as: f(x) = \(\left\{\begin{matrix} \frac{2x}{9} &0\leq x \leq 3 \\ 0& \text{otherwise} \end{matrix}\right.\) Find P(1< X < 2). Solution: Integrating the function, \(\int_{1}^{2}\frac{2x}{9}dx\) = \([\frac{x^{2}}{9}]_{1}^{2}\) = 4/9 - 1/9 = 1/3 Answer: P(1< X < 2) = 1/3
- Example 2: If X is a continuous random variable with the probability density function given as: f(x) = \(\left\{\begin{matrix} \frac{be^{-x}}{2} & x\geq 0\\ 0 & \text{otherwise} \end{matrix}\right.\) Find the value of b. Solution: We know from properties of probability density function that \(\int_{-\infty }^{\infty}f(x)dx\) = 1 \(\int_{0}^{\infty }\frac{be^{-x}}{2}dx\) = 1 On integrating, \(\frac{b}{2}\left [ -e^{-x} \right ]_{0}^{\infty }\) = 1 b / 2 = 1 b = 2 Answer: b = 2
- Example 3: Find the expected value of X if the probability density function is defined as: f(x) = \(\left\{\begin{matrix} \frac{3}{8}x^{2} & 0\leq x\leq 2\\ 0& \text{otherwise} \end{matrix}\right.\) Solution: We know that, E[X] = \(\int_{-\infty }^{\infty}xf(x)dx\) = \(\int_{-\infty }^{0}x(0)dx + \int_{0}^{2}x (\frac{3x^{2}}{8})dx + \int_{2}^{\infty }x(0)dx\) = \(\frac{3}{8}[\frac{x^{4}}{4}]_{0}^{2}\) = (3/32) [16] = 3/2 Answer: Mean = 3/2
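Example 2's normalization constant can also be confirmed numerically; a midpoint-rule sketch that truncates the infinite integral at x = 50, where the exponential tail is negligible:

```python
import math

# Check that b = 2 makes f(x) = b*e^{-x}/2 integrate to 1 over [0, inf).
b, upper, n = 2.0, 50.0, 200_000
h = upper / n
total = sum(b * math.exp(-(i + 0.5) * h) / 2.0 * h for i in range(n))
print(total)  # ≈ 1
```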
FAQs on Probability Density Function
What is Meant by the Probability Density Function?
Probability density function is a function that is used to give the probability that a continuous random variable will fall within a specified interval. The integral of the probability density function is used to give this probability.
What is the Probability Density Function Formula?
We can differentiate the cumulative distribution function (cdf) to get the probability density function (pdf). This can be given by the formula f(x) = \(\frac{\mathrm{d} F(x)}{\mathrm{d} x}\) = F'(x). Here, f(x) is the pdf and F(x) is the cdf.
How to Calculate the Probability Density Function?
To calculate the probability density function we differentiate the cumulative distribution function. If we integrate the probability density function, we get the probability that a continuous random variable lies within a certain interval.
How to Find the Mean of Probability Density Function?
The expected value is also known as the mean. The mean of the probability density function is given by the formula \(\mu = \int_{-\infty }^{\infty}xf(x)dx\).
Is Probability Density Function Always Positive?
The probability density function is always non-negative, and so is its integral over any interval. This is because probability can never be negative; as a consequence, the probability density function can never be negative either.
How Do You Find the Probability of a Probability Density Function?
To find the probability that a continuous random variable X falls between an interval a and b, we use the probability density function, f(x). This formula is given as P(a < X < b) = \(\int_{a}^{b}f(x)dx\).
What are the Features of Probability Density Function?
The features of the probability density function are given below:
- The probability density function will always be a positive value.
- The total area under the probability density function curve will always be equal to 1.


Probability Density Function

What is Probability Density Function?
The probability density function gives the density of a continuous random variable over a specific range of values: for any specified range, the function determines how likely the variable is to fall within it.
The function, when integrated, defines the relationship between the random variable and its probability. It is denoted by f(x); for the normal distribution, its graph is the familiar bell curve. The area under the curve between any two particular values gives the probability of the designated outcome.
Key Takeaways
- The probability density function (PDF) gives the output indicating the density of a continuous random variable lying between a specific range of values.
- There are two types of variables: discrete and continuous. The PDF is replaced by the probability mass function when dealing with discrete variables.
- It is used in statistical calculations and is represented graphically as a curve that relates the variable to its probability.
- Its application is significant in machine learning algorithms, analytics, probability theory, neural networks, etc.
Probability Density Function Explained
The probability density function (PDF) is associated with a continuous random variable: it is used to find the probability that the variable falls in a specific interval. A continuous random variable can take an uncountably infinite number of possible values. For a discrete random variable, which takes on finitely or countably many possible values, the probability mass function replaces the PDF.
PDFs have a wide range of applications. For example, they are used in modeling and prediction for chemically reactive turbulent flows and in the analysis of stock price returns. For each application, the corresponding curve is depicted on a graph, and analyzing its features, such as symmetry and the behavior of the left and right tails, gives important information.
Another important concept for understanding the PDF is the cumulative distribution function, which is also used to describe the distribution of random variables, primarily continuous random variables. Differentiating the cumulative distribution function of a continuous random variable gives the PDF, and integrating the PDF gives the cumulative distribution function.
Probability density function formula:
To calculate the PDF, we differentiate the cumulative distribution function:
f(x) = \(\frac{\mathrm{d} F(x)}{\mathrm{d} x}\) = F'(x)
where f(x) is the PDF and F(x) is the CDF. If X lies between a lower limit 'a' and an upper limit 'b', then:
P(a < X ≤ b) = F(b) - F(a) = \(\int_{a}^{b}f(x)dx\)
- F(b): cumulative distribution function at b
- F(a): cumulative distribution function at a
To better understand the formula and its application, consider a PDF example: given a PDF f(x), find P(1 < X < 2) by applying the formula \(P(1 < X < 2) = \int_{1}^{2}f(x)dx\) and evaluating the integral at the upper and lower limits.
- It is used in machine learning algorithms, analytics, probability theory, neural networks, etc.
- Calculates probabilities associated with continuous random variables.
- The PDF of stock price returns is used for studying various market scenarios.
- It is used to model various processes and derive solutions to problems. For example, it is applied to model chemically reacting turbulent flows and to resolve the associated closure problems.

The joint PDF denotes the probability distribution of two or more continuous random variables, which together form a continuous random vector. If two random variables have a joint PDF, they are jointly continuous. Calculating probabilities from a joint PDF involves multiple integrals.
Consider an example with PDF f(x) = x/4, when 1 < x ≤ 3 (this integrates to 1 over its support, so it is a valid PDF). We have to find P(2 < X < 3). Integrating x/4 within the limits 2 and 3 gives the answer 5/8.

- Roots of Complex Numbers
- Roots of Polynomials
- Roots of Unity
- SAS Theorem
- SSS Theorem
- Scalar Triple Product
- Scale Drawings and Maps
- Scale Factors
- Scientific Notation
- Second Order Recurrence Relation
- Sector of a Circle
- Segment of a Circle
- Sequences and Series
- Series Maths
- Similar Triangles
- Similar and Congruent Shapes
- Simple Interest
- Simplifying Fractions
- Simplifying Radicals
- Simultaneous Equations
- Sine and Cosine Rules
- Small Angle Approximation
- Solving Linear Equations
- Solving Linear Systems
- Solving Quadratic Equations
- Solving Radical Inequalities
- Solving Rational Equations
- Solving Simultaneous Equations Using Matrices
- Solving Systems of Inequalities
- Solving Trigonometric Equations
- Solving and Graphing Quadratic Equations
- Solving and Graphing Quadratic Inequalities
- Special Products
- Standard Form
- Standard Integrals
- Standard Unit
- Straight Line Graphs
- Substraction and addition of fractions
- Sum and Difference of Angles Formulas
- Sum of Natural Numbers
- Surjective functions
- Tables and Graphs
- Tangent of a Circle
- The Quadratic Formula and the Discriminant
- Transformations
- Transformations of Graphs
- Translations of Trigonometric Functions
- Triangle Rules
- Triangle trigonometry
- Trigonometric Functions
- Trigonometric Functions of General Angles
- Trigonometric Identities
- Trigonometric Ratios
- Trigonometry
- Turning Points
- Types of Functions
- Types of Numbers
- Types of Triangles
- Unit Circle
- Variables in Algebra
- Verifying Trigonometric Identities
- Writing Equations
- Writing Linear Equations
- Bias in Experiments
- Binomial Distribution
- Binomial Hypothesis Test
- Bivariate Data
- Categorical Data
- Categorical Variables
- Central Limit Theorem
- Chi Square Test for Goodness of Fit
- Chi Square Test for Homogeneity
- Chi Square Test for Independence
- Chi-Square Distribution
- Combining Random Variables
- Comparing Data
- Comparing Two Means Hypothesis Testing
- Conditional Probability
- Conducting a Study
- Conducting a Survey
- Conducting an Experiment
- Confidence Interval for Population Mean
- Confidence Interval for Population Proportion
- Confidence Interval for Slope of Regression Line
- Confidence Interval for the Difference of Two Means
- Confidence Intervals
- Correlation Math
- Cumulative Distribution Function
- Cumulative Frequency
- Data Analysis
- Data Interpretation
- Degrees of Freedom
- Discrete Random Variable
- Distributions
- Empirical Rule
- Errors in Hypothesis Testing
- Estimator Bias
- Events (Probability)
- Frequency Polygons
- Generalization and Conclusions
- Geometric Distribution
- Hypothesis Test for Correlation
- Hypothesis Test for Regression Slope
- Hypothesis Test of Two Population Proportions
- Hypothesis Testing
- Inference for Distributions of Categorical Data
- Inferences in Statistics
- Large Data Set
- Least Squares Linear Regression
- Linear Interpolation
- Linear Regression
- Measures of Central Tendency
- Methods of Data Collection
- Normal Distribution
- Normal Distribution Hypothesis Test
- Normal Distribution Percentile
- Paired T-Test
- Point Estimation
- Probability
- Probability Calculations
- Probability Distribution
- Probability Generating Function
- Quantitative Variables
- Random Variables
- Randomized Block Design
- Residual Sum of Squares
- Sample Mean
- Sample Proportion
- Sampling Distribution
- Scatter Graphs
- Single Variable Data
- Spearman's Rank Correlation Coefficient
- Standard Deviation
- Standard Error
- Standard Normal Distribution
- Statistical Graphs
- Statistical Measures
- Stem and Leaf Graph
- Sum of Independent Random Variables
- Survey Bias
- T-distribution
- Transforming Random Variables
- Tree Diagram
- Two Categorical Variables
- Two Quantitative Variables
- Type I Error
- Type II Error
- Types of Data in Statistics
- Variance for Binomial Distribution
- Venn Diagrams
If you are flipping a coin, it is pretty easy to see that the probability of getting a head is \(0.5\). But what if you want to find the probability of someone being exactly \(2\) metres tall? Height is a continuous variable, not a discrete one, so you can't use the basic probability rules you might already know. Instead, you will need a probability density function. So don't be dense, read on to find out about continuous random variables and probability density functions!
Probability mass function and probability density function
Because the names 'probability mass function' and 'probability density function' are so close, you might think they describe the same thing. Both of them do describe probabilities, and both are functions. The big difference is in what kind of random variable they are used with:
If \(X\) is a discrete random variable, then use a probability mass function, which is a summation.
If \(X\) is a continuous random variable, then use a probability density function, which is an integral.
Going forward you will see information and examples involving the probability density function for a continuous random variable \(X\). If you are interested in probability mass functions, check out the article Discrete Probability Distributions or the article on the Poisson Distribution.
Probability density function graph
First of all, what is a probability density function?
The probability density function, or PDF, of a continuous random variable \(X\) is an integrable function \(f_X(x)\) satisfying the following:
- \(f_X(x) \ge 0\) for all \(x\) in the range of \(X\); and
- \(\displaystyle \int_X f_X(x) \, \mathrm{d} x = 1\).
Then the probability that \(X\) is in the interval \([a,b]\) is \[ P(a<X<b) = \int_a^b f_X(x) \, \mathrm{d} x .\]
That looks more complicated than it actually is. Let's relate it to the graph of a function.
Take the function
\[ f_X(x) = \begin{cases} 0.1 & 1 \le x \le 11 \\ 0 & \text{otherwise} \end{cases} \]
as seen in the graph below.
Let's check it for the properties of a probability density function. It is certainly always at least zero. The area under the curve is \(1\), since that area is just a rectangle with height \(0.1\) and width \(10\). And lastly, you can represent the probability as an area. For example, if you wanted to find \(P(5<X<7)\) you could do so by finding the area of the rectangle in the graph below, getting that \(P(5<X<7) = 0.2\).
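Those two checks can also be done numerically. Here is a minimal Python sketch (the names `f` and `integrate`, and the choice of the midpoint rule, are illustrative, not anything from the text above):

```python
# Numerical check of the uniform density example: the total area
# should be 1, and P(5 < X < 7) should be 0.2.

def f(x):
    """Density of the example: 0.1 on [1, 11], 0 elsewhere."""
    return 0.1 if 1 <= x <= 11 else 0.0

def integrate(g, a, b, n=100_000):
    """Midpoint-rule approximation of the integral of g over [a, b]."""
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

total_area = integrate(f, 0, 12)  # approximately 1
p_5_7 = integrate(f, 5, 7)        # approximately 0.2
print(total_area, p_5_7)
```

The midpoint rule is crude near the jumps at \(x=1\) and \(x=11\), but more than accurate enough to confirm the rectangle areas computed by hand.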
So \(f_X(x)\) is a probability density function. To graph the probability \(P(a<X<b)\) itself, you would need to integrate the density, giving you
\[ P(a<X<b) = \begin{cases} 0 & b \le 1 \text{ or } a \ge 11 \\ 0.1(b-1) & a \le 1 < b \le 11 \\ 0.1(b-a) & 1 \le a < b \le 11 \\ 0.1(11-a) & 1 \le a < 11 < b \\ 1 & a \le 1 \text{ and } 11 \le b \end{cases} \]
That certainly seems like a lot of cases, but you can see it much more easily by looking at the graph below.
Notice that the minimum height of the graph above is \(0\), and the maximum height of the graph is \(1\). This makes sense because probabilities are always at least zero and at most one.
It turns out that the integral of the probability density function is quite useful, and it is called the Cumulative Distribution Function.
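For the uniform example above, integrating the constant density \(0.1\) from \(1\) up to \(x\) gives that cumulative distribution function in closed form. A small Python sketch (the name `cdf` is an illustrative choice):

```python
# CDF of the uniform example: F(x) = P(X <= x) for density 0.1 on [1, 11].

def cdf(x):
    """Integral of the density from minus infinity up to x."""
    if x <= 1:
        return 0.0
    if x >= 11:
        return 1.0
    return 0.1 * (x - 1)

# Interval probabilities fall out as differences of CDF values:
p = cdf(7) - cdf(5)  # approximately 0.2, matching the rectangle area
print(p)
```

Note how the CDF starts at \(0\) and climbs to \(1\), exactly the minimum and maximum heights a probability graph can have.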
Probability density function properties
Using the definition of the probability density function, you can see an important property of them:
\[P(X=a) = 0.\]
It also doesn't matter if you use strict inequalities with continuous density functions:
\[ P(X<a) = P(X\le a).\]
Both of those properties come from the fact that
\[ P(a<X<b) = \int_a^b f_X(x) \, \mathrm{d} x .\]
You might ask if the probability density function can be greater than \(1\). Sure it can! The integral of the function still needs to be equal to \(1\), but the probability density function can take on values larger than that as long as it is also at least zero. One example of this is the probability density function
\[ f_X(x) = \begin{cases} 2 & 0 \le x \le \dfrac{1}{2} \\ 0 & \text{otherwise} \end{cases} .\]
This function is always at least zero, it is integrable, and the integral is \(1\), so it could be a probability density function for a continuous random variable \(X\). Don't confuse the probability density function for actual probabilities!
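You can confirm numerically that this density, despite taking the value \(2\), still integrates to \(1\). A minimal sketch (midpoint rule again; names illustrative):

```python
# The density equals 2 on [0, 1/2], yet its total area is still 1:
# a PDF's values are not probabilities; only its integrals are.

def f(x):
    return 2.0 if 0 <= x <= 0.5 else 0.0

def integrate(g, a, b, n=100_000):
    """Midpoint-rule approximation of the integral of g over [a, b]."""
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

area = integrate(f, -1.0, 1.0)  # approximately 1
print(area)
```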
Probability density function of normal distribution
One of the probability density functions you will see often is the normal distribution. You can see the graph of the standard normal distribution probability density function below.
Just like with other probability density functions, the area under the curve of the standard normal distribution is \(1\).
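That area can be checked with the standard library alone. A hedged sketch using `math.erf`, which gives the integral of the standard normal density from \(-t\) to \(t\) as \(\operatorname{erf}(t/\sqrt{2})\):

```python
import math

def phi(x):
    """Standard normal probability density function."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

# Midpoint-rule integral over [-8, 8]; the tails beyond that are negligible.
n, a, b = 200_000, -8.0, 8.0
h = (b - a) / n
area = sum(phi(a + (i + 0.5) * h) for i in range(n)) * h

closed_form = math.erf(8 / math.sqrt(2))  # same quantity in closed form
print(area, closed_form)  # both approximately 1
```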
Probability density function example
Let's look at some examples.
Suppose that someone tells you that
\[ f_X(x) = \begin{cases} 2x & 0 \le x \le 1 \\ 0 & \text{otherwise} \end{cases}\]
is the probability density function for the length of time, in hours, you will spend waiting in the doctor's office.
(a) Check to be sure this is a probability density function.
(b) Find the probability you will wait less than half an hour to see the doctor.
(c) Find the probability you will wait more than half an hour to see the doctor.
(a) First note that \(X\) is in fact a continuous random variable. In addition, \(f_X(x)\) is always at least zero. It is also integrable, so now it just remains to check that the integral is one. Doing the integration,
\[\begin{align} \int_X f_X(x) \, \mathrm{d} x &= \int_0^1 2x \, \mathrm{d} x \\ &= \left. 2\left(\frac{1}{2}\right)x^2\right|_0^1 \\ &= 1^2-0^2 \\ &= 1.\end{align}\]
So this is, in fact, a probability density function.
(b) You want to know the probability that you will wait less than half an hour. In other words, you need to find \(P(X<0.5)\). Then
\[\begin{align} P(X<0.5) &= \int_0^{0.5} 2x \, \mathrm{d} x \\ &= \left.\phantom{\frac{1}{2}} x^2 \right|_0^{0.5} \\ &= (0.5)^2 - 0^2 \\ &=0.25. \end{align}\]
So the probability that you will wait less than half an hour is \(0.25\). So \(25\%\) of the time, you will wait less than half an hour to see the doctor.
(c) Now you want to find the probability that you will wait more than half an hour to see the doctor. Remember that the area under the probability density function is \(1\), so
\[ P(X > 0.5) = 1 - P(X<0.5).\]
Then using the previous part of the problem, \(P(X> 0.5) =0.75\). This means that \(75\%\) of the time, you will wait more than half an hour to see the doctor!
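The three parts of this example can be verified in a few lines, since the antiderivative of \(2x\) is \(x^2\). A minimal Python sketch (the name `F` is illustrative):

```python
# Waiting-time example: density f(x) = 2x on [0, 1], so the CDF is x^2.

def F(x):
    """P(X <= x) for the density 2x on [0, 1]."""
    if x <= 0:
        return 0.0
    if x >= 1:
        return 1.0
    return x * x

total = F(1)                  # part (a): total probability is 1
less_than_half = F(0.5)       # part (b): P(X < 0.5) = 0.25
more_than_half = 1 - F(0.5)   # part (c): complement rule gives 0.75
print(total, less_than_half, more_than_half)  # 1.0 0.25 0.75
```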
Probability Density Function - Key takeaways
- A probability mass function is used with discrete random variables, and a probability density function is used with continuous random variables.
- The probability density function, or PDF, of a continuous random variable \(X\) is an integrable function \(f_X(x)\) that is always at least zero and whose integral over the range of \(X\) is \(1\).
- The probability that a continuous random variable \(X\) is in the interval \([a,b]\) is \[ P(a<X<b) = \int_a^b f_X(x) \, \mathrm{d} x .\]
- For a continuous random variable \(X\), \(P(X=a) = 0\), and it doesn't matter if you use strict inequalities: \( P(X<a) = P(X \le a)\).
Frequently Asked Questions about Probability Density Function
What are the PDF and CDF?
The PDF is the probability density function and the CDF is the cumulative distribution function.
What is a probability density function? Can you give an example?
An example of a probability density function for a continuous random variable would be the standard normal distribution.
Can a probability density function be negative?
No. They are always at least zero.
What does the probability density function tell us?
A probability density function can tell you the probability of a continuous random variable being within a certain range.
Can a probability density function be greater than 1?
Yes. Remember that the area under the probability density function is 1. As long as that is satisfied, and the function is at least zero, it can take on values larger than 1.
Final Probability Density Function Quiz
Suppose you are rolling a \(20\)-sided die in a game. You could use a ____ to model the probability of rolling \(13\) twice in a row.
Probability mass function.
Suppose you want to model the probability that someone will spend at least \(2\) minutes paying attention to an advertisement. To do this you would use a ____.
Probability density function.
Probability density functions are used with ____ random variables.
Continuous.
What three things do you need to check to be sure that \(f_X(x)\) is a probability density function for the continuous random variable \(X\)?
\(f_X(x)\) is always at least zero.
\(f_X(x)\) is integrable.
The integral of \(f_X(x)\) over the range of \(X\) is \(1\).
Which of the following is true about a probability density function for a continuous random variable?
The area under the curve is equal to \(1\).
True or False: The probability density function for a continuous random variable can't be larger than one.
True or False: The probability density function for a continuous random variable must be larger than zero.
True or False: For a continuous random variable \(X\), \( P(a\le X \le b) = P(a<X <b)\).
The probability that a continuous random variable \(X\) is in the interval \([a,b]\) is found using the formula ____.
\( P(a<X<b) = \displaystyle\int_a^b f_X(x) \, \mathrm{d} x\).
Which of the following are properties of the probability density function \(f_X(x)\) of a continuous random variable \(X\)?
Is \[ f(x) = \begin{cases} 1 & 0 \le x \le 2 \\ 0 & \text{otherwise} \end{cases}\]
a probability density function for a continuous random variable \(X\)?
No. The area under the curve of \(f(x)\) is not equal to \(1\).
Is \[ f(x) = \begin{cases} 2 & 0 \le x \le \frac{1}{2} \\ 0 & \text{otherwise} \end{cases}\] a probability density function for a continuous random variable \(X\)?
Yes. It satisfies all three properties of a probability density function.
Is \[ f(x) = \begin{cases} \frac{1}{2} & 0 \le x \le 2 \\ 0 & \text{otherwise} \end{cases}\] a probability density function for a continuous random variable \(X\)?
Yes. It satisfies all three properties of a probability density function.
For a probability density function, what is \(P(X=a)\)?
\(0\).
For a probability mass function, what is \(P(X=a)\)?
Not enough information to tell.
Section 8.5 : Probability
- Show that \(f\left( x \right)\) is a probability density function.
- Find \(P\left( {X \le 7} \right)\).
- Find \(P\left( {X \ge 7} \right)\).
- Find \(P\left( {3 \le X \le 14} \right)\).
- Determine the mean value of \(X\).
- Verify that \(f\left( t \right)\) is a probability density function.
- What is the probability that a light bulb will have a life span less than 8 months?
- What is the probability that a light bulb will have a life span more than 20 months?
- What is the probability that a light bulb will have a life span between 14 and 30 months?
- Determine the mean value of the life span of the light bulbs.
- Determine the value of \(c\) for which the function below will be a probability density function. \[f\left( x \right) = \left\{ {\begin{array}{*{20}{l}}{c\left( {8{x^3} - {x^4}} \right)}&{{\mbox{if }}0 \le x \le 8}\\0&{{\mbox{otherwise}}}\end{array}} \right.\]
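As a sketch of how that last problem resolves: the antiderivative of \(8x^3 - x^4\) is \(2x^4 - \frac{x^5}{5}\), so \(c\) is the reciprocal of the area under \(8x^3 - x^4\) on \([0,8]\). Exact rational arithmetic keeps the answer clean:

```python
# Find c so that c*(8x^3 - x^4) integrates to 1 over [0, 8].
from fractions import Fraction

def antiderivative(x):
    """Antiderivative 2x^4 - x^5/5 of 8x^3 - x^4, evaluated exactly."""
    x = Fraction(x)
    return 2 * x**4 - x**5 / 5

area = antiderivative(8) - antiderivative(0)  # area under 8x^3 - x^4
c = 1 / area
print(c)  # 5/8192
```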
