
Introduction To The Gamma Distribution

Prologue To The Gamma Distribution

In probability theory and statistics, the gamma distribution is one of the most fundamental distributions: it serves as the basis for the further development of many others, such as the beta, exponential, F, chi-square, and t distributions. A basic intuition for the gamma distribution is greatly helpful when evaluating a regression model built on your hypothesis, and even more so when assessing the power of a test and the precision of machine learning results.

The Gamma Function Γ

The gamma function is very important to the gamma distribution, so first of all we take not only a glance over it, but go through some of its major properties. The gamma function is defined as:
$\Gamma(\alpha)=\int_0^\infty x^{\alpha-1}e^{-x}\,dx$, where $\alpha>0$.

Taking advantage of integration by parts:
Let $u=x^{\alpha-1}$, $dv=e^{-x}dx$, then,
$du=(\alpha-1)x^{\alpha-2}dx$, $v=-e^{-x}$.

$\Gamma(\alpha)=x^{\alpha-1}(-e^{-x})\big|_0^\infty-\int_0^\infty(-e^{-x})(\alpha-1)x^{\alpha-2}\,dx$
$=0+\int_0^\infty e^{-x}(\alpha-1)x^{\alpha-2}\,dx$
$=(\alpha-1)\int_0^\infty e^{-x}x^{\alpha-2}\,dx$
$=(\alpha-1)\Gamma(\alpha-1)$

For example, $\Gamma(5)=4\,\Gamma(4)$. Applying the recursion repeatedly, we can deduce that: $\Gamma(\alpha)=(\alpha-1)\Gamma(\alpha-1)$
$=(\alpha-1)(\alpha-2)\Gamma(\alpha-2)=\cdots$

[1] The corollary has it that:
$\Gamma(n)=(n-1)(n-2)(n-3)\cdots\Gamma(1)$
, where $\Gamma(1)=\int_0^\infty x^0 e^{-x}\,dx=-e^{-x}\big|_0^\infty=1$
, thus, $\Gamma(n)=(n-1)!$ is obtained.
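As a quick sanity check (not part of the original derivation), the short Python sketch below integrates the defining integral numerically and compares it with $(n-1)!$; it assumes SciPy is available, and `gamma_by_integral` is just an illustrative helper name.

```python
import math
from scipy import integrate, special

def gamma_by_integral(alpha):
    # Numerically integrate x^(alpha-1) * e^(-x) over [0, inf).
    value, _err = integrate.quad(lambda x: x**(alpha - 1) * math.exp(-x), 0, math.inf)
    return value

for n in range(1, 7):
    # All three columns agree: Gamma(n) = (n-1)! for positive integers n.
    print(n, gamma_by_integral(n), special.gamma(n), math.factorial(n - 1))
```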

[2] $\Gamma(\frac{1}{2})=\sqrt{\pi}$
There exist some alternatives; either way could be taken:
proof::➀
As we don't like the $\frac{1}{2}$, by means of a change of variable,
let $x=u^2$, then $dx=2u\,du$:
$\Gamma(\frac{1}{2})=\int_0^\infty x^{-\frac{1}{2}}e^{-x}\,dx$
$=\int_0^\infty u^{-1}e^{-u^2}\cdot 2u\,du$
$=2\int_0^\infty e^{-u^2}\,du$

Take $I=\int_0^\infty e^{-u^2}\,du$, then,
$I^2=\int_0^\infty e^{-x^2}\,dx\cdot\int_0^\infty e^{-y^2}\,dy$
$=\int_0^\infty\int_0^\infty e^{-(x^2+y^2)}\,dx\,dy$

Guess what? We have just transformed our integral into the first quadrant.
Take $r^2=x^2+y^2$; we can make the two sets of deductions below:
➀ treating $y$ as a constant:
$\frac{dr^2}{dx}=\frac{d(x^2+y^2)}{dx}=2x$
$dr^2=2x\,dx$

➁ differentiating with respect to $r$:
$\frac{dr^2}{dr}=\frac{d(x^2+y^2)}{dr}$
$2r=\frac{d(x^2+y^2)}{dr}$
$2r\,dr=d(x^2+y^2)$
$2r\frac{dr}{dx}=2x$
$r\,dr=x\,dx$

Replace ➀ and ➁ in the integral below:
$\int_0^\infty e^{-r^2}\,dr^2$
$=\int_0^\infty e^{-r^2}\frac{dr^2}{dx}\,dx$
$=\int_0^\infty e^{-r^2}\cdot 2x\,dx$
$=\int_0^\infty 2r\,e^{-r^2}\,dr$
$=-e^{-r^2}\big|_0^\infty$
$=1$

Please recall that our integration region is the first quadrant. At this moment, back to $I^2$: switch to polar coordinates, where $\theta$ runs from $0$ to $\frac{\pi}{2}$ and $dx\,dy=r\,dr\,d\theta$ (using $r\,dr=x\,dx$ from ➁):
$I^2=\int_0^{\frac{\pi}{2}}\int_0^\infty e^{-r^2}\,r\,dr\,d\theta$
$=\int_0^{\frac{\pi}{2}}d\theta\cdot\int_0^\infty e^{-r^2}\,r\,dr$
$=\frac{\pi}{2}\cdot\left(-\frac{1}{2}e^{-r^2}\right)\Big|_0^\infty$
$=\frac{\pi}{2}\cdot\left(0-\left(-\frac{1}{2}\right)\right)$
$=\frac{\pi}{4}$

$\Gamma(\frac{1}{2})=2\int_0^\infty e^{-u^2}\,du=2I$, where $I=\int_0^\infty e^{-u^2}\,du$ is the integral we have just evaluated.
Therefore, $I^2=\frac{\pi}{4}$, so $I=\frac{\sqrt{\pi}}{2}$, and finally we have $\Gamma(\frac{1}{2})=\sqrt{\pi}$ thus proved.
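A minimal numerical sketch of proof ➀, assuming a SciPy environment: it evaluates $I=\int_0^\infty e^{-u^2}\,du$ with `scipy.integrate.quad` and confirms that $2I$ matches $\Gamma(\frac{1}{2})=\sqrt{\pi}$.

```python
import math
from scipy import integrate, special

# I = the Gaussian-type integral over [0, inf), evaluated numerically.
I, _err = integrate.quad(lambda u: math.exp(-u**2), 0, math.inf)
print(I, math.sqrt(math.pi) / 2)                       # both ≈ 0.886227
print(2 * I, special.gamma(0.5), math.sqrt(math.pi))   # all ≈ 1.772454
```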

proof::➁
$\Gamma(\frac{1}{2})=\int_0^\infty x^{-\frac{1}{2}}e^{-x}\,dx$, here we are again.
Take $x=\frac{z^2}{2}$, then $\frac{dx}{dz}=z$, thus we have $dx=z\,dz$:
$\int_0^\infty x^{-\frac{1}{2}}e^{-x}\,dx$
$=\int_0^\infty\left(\frac{z^2}{2}\right)^{-\frac{1}{2}}e^{-\frac{z^2}{2}}\,z\,dz$
$=\int_0^\infty\sqrt{2}\,z^{-1}e^{-\frac{z^2}{2}}\,z\,dz$
$=\sqrt{2}\int_0^\infty e^{-\frac{z^2}{2}}\,dz$
$=\sqrt{2}\cdot\sqrt{2\pi}\int_0^\infty\frac{1}{\sqrt{2\pi}}e^{-\frac{z^2}{2}}\,dz$
$=\sqrt{2}\cdot\sqrt{2\pi}\cdot\frac{1}{2}$
$=\sqrt{\pi}$

where $\int_{-\infty}^{\infty}\frac{1}{\sqrt{2\pi}}e^{-\frac{z^2}{2}}\,dz=1$ is the total probability of the standard normal distribution, therefore $\int_0^\infty\frac{1}{\sqrt{2\pi}}e^{-\frac{z^2}{2}}\,dz=\frac{1}{2}$ by symmetry.
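The same kind of quick check for the last step of proof ➁, again assuming SciPy is installed: the standard normal density integrates to $\frac{1}{2}$ over $[0,\infty)$, so $\sqrt{2}\cdot\sqrt{2\pi}\cdot\frac{1}{2}=\sqrt{\pi}$.

```python
import math
from scipy import integrate

# The standard normal density integrated over [0, inf) gives 1/2.
half, _err = integrate.quad(lambda z: math.exp(-z**2 / 2) / math.sqrt(2 * math.pi), 0, math.inf)
print(half)                                           # ≈ 0.5
print(math.sqrt(2) * math.sqrt(2 * math.pi) * half)   # ≈ sqrt(pi) ≈ 1.772454
```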

The PDF of Gamma Distribution

Next, we inspect the PDF (probability density function) of the gamma distribution. The PDF $f(x)$ is expressed as:
$f(x)=\frac{1}{\beta^\alpha\Gamma(\alpha)}x^{\alpha-1}e^{-\frac{x}{\beta}}$
$=\frac{1}{\beta\,\Gamma(\alpha)}\left(\frac{x}{\beta}\right)^{\alpha-1}e^{-\frac{x}{\beta}}$
$=\frac{1}{\beta}\cdot\frac{\left(\frac{x}{\beta}\right)^{\alpha-1}e^{-\frac{x}{\beta}}}{\Gamma(\alpha)}$
, where $\alpha>0$, $\beta>0$, $x>0$.

By taking $\lambda=\frac{1}{\beta}$, we then have:
$f(x)=\frac{\lambda(\lambda x)^{\alpha-1}e^{-\lambda x}}{\Gamma(\alpha)}$
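To make the two parameterizations concrete, here is a small hand-rolled sketch of the PDF in both the $\beta$ (scale) form and the $\lambda$ (rate) form; `gamma_pdf_scale` and `gamma_pdf_rate` are illustrative names rather than any library API, and the values of $\alpha$, $\beta$ are arbitrary.

```python
import math
from scipy import special

def gamma_pdf_scale(x, alpha, beta):
    # f(x) = x^(alpha-1) * e^(-x/beta) / (beta^alpha * Gamma(alpha))
    return x**(alpha - 1) * math.exp(-x / beta) / (beta**alpha * special.gamma(alpha))

def gamma_pdf_rate(x, alpha, lam):
    # f(x) = lam * (lam*x)^(alpha-1) * e^(-lam*x) / Gamma(alpha), with lam = 1/beta
    return lam * (lam * x)**(alpha - 1) * math.exp(-lam * x) / special.gamma(alpha)

alpha, beta = 2.0, 3.0   # arbitrary illustrative values
for x in (0.5, 1.0, 2.0, 5.0):
    print(gamma_pdf_scale(x, alpha, beta), gamma_pdf_rate(x, alpha, 1 / beta))
# The two columns coincide: lambda = 1/beta is only a re-labelling of the same PDF.
```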

What do we mean by the parameters $\alpha$, $\beta$, $\lambda$?
➀ $\alpha$ is the shape parameter, controlling the sharpness of the distribution.
➁ $\beta$ is the scale parameter; the spread or dissemination of the distribution is attributed to $\beta$.
➂ $\lambda$ is the rate, that is, the intensity or frequency, in the form of $\frac{\text{count}}{\text{time unit}}$ (see the sketch below for how these map onto SciPy's parameterization).
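For reference, a sketch (under the assumption that SciPy is installed) of how these parameters map onto `scipy.stats.gamma`, which exposes the shape $a=\alpha$ and a `scale` argument equal to $\beta$; the rate form is recovered via `scale = 1/lambda`.

```python
from scipy import stats

alpha, beta = 2.0, 3.0        # arbitrary illustrative values
lam = 1 / beta                # the corresponding rate
scale_form = stats.gamma(a=alpha, scale=beta)
rate_form = stats.gamma(a=alpha, scale=1 / lam)   # same distribution, written via the rate
print(scale_form.pdf(2.0), rate_form.pdf(2.0))    # identical density values
```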

Expected Value And Variance Of The Gamma Distribution

As we know, this is the PDF of the gamma distribution:
$f(x)=\frac{1}{\beta^\alpha\Gamma(\alpha)}x^{\alpha-1}e^{-\frac{x}{\beta}}$

Next, we figure out the expected value and variance of the gamma distribution. The suggestion is to take advantage of the moments described in Introduction To The Moment Generating Function.

$E[X^k]=\frac{1}{\beta^\alpha\Gamma(\alpha)}\int_0^\infty x^k\,x^{\alpha-1}e^{-\frac{x}{\beta}}\,dx$
Let $y=\frac{x}{\beta}$, and we can have $dy=\frac{1}{\beta}dx$, then:
$E[X^k]=\frac{1}{\beta^\alpha\Gamma(\alpha)}\int_0^\infty x^{k+\alpha-1}e^{-\frac{x}{\beta}}\,dx$
$=\frac{1}{\beta^\alpha\Gamma(\alpha)}\int_0^\infty(\beta y)^{k+\alpha-1}e^{-y}\,\beta\,dy$
$=\frac{\beta^{k+\alpha-1}\cdot\beta}{\beta^\alpha\Gamma(\alpha)}\int_0^\infty y^{k+\alpha-1}e^{-y}\,dy$
$=\frac{\beta^k}{\Gamma(\alpha)}\int_0^\infty y^{k+\alpha-1}e^{-y}\,dy$
$=\frac{\beta^k}{\Gamma(\alpha)}\Gamma(k+\alpha)$
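Before specializing to $k=1,2$, a quick numerical check of the general moment formula, assuming SciPy: direct integration of $x^k f(x)$ is compared with $\beta^k\Gamma(k+\alpha)/\Gamma(\alpha)$ for a few $k$ (the chosen $\alpha$, $\beta$ are arbitrary).

```python
import math
from scipy import integrate, special

alpha, beta = 2.5, 1.5   # arbitrary illustrative values

def pdf(x):
    # The gamma PDF from the previous section.
    return x**(alpha - 1) * math.exp(-x / beta) / (beta**alpha * special.gamma(alpha))

for k in (1, 2, 3):
    numeric, _err = integrate.quad(lambda x: x**k * pdf(x), 0, math.inf)
    closed_form = beta**k * special.gamma(k + alpha) / special.gamma(alpha)
    print(k, numeric, closed_form)   # the two columns agree for each k
```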

$E[X]=\mu_1$, the first ordinary moment; by taking $k=1$, we can have the expected value expressed as:
$E[X]=\frac{\beta}{\Gamma(\alpha)}\Gamma(1+\alpha)=\beta\alpha$
$Var[X]=E[X^2]-E^2[X]$; by taking $k=2$, we can obtain $E[X^2]=\mu_2$, the second ordinary moment, and have the expression of the variance:
$Var[X]=\frac{\beta^2}{\Gamma(\alpha)}\Gamma(2+\alpha)-(\beta\alpha)^2$
$=\beta^2(\alpha+1)\alpha-(\beta\alpha)^2$
$=\beta^2\alpha(\alpha+1-\alpha)$
$=\beta^2\alpha$
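As a closing sanity check, again assuming SciPy, the mean and variance reported by `scipy.stats.gamma` match the closed forms $E[X]=\alpha\beta$ and $Var[X]=\alpha\beta^2$ derived above.

```python
from scipy import stats

alpha, beta = 2.5, 1.5   # arbitrary illustrative values
mean, var = stats.gamma.stats(alpha, scale=beta, moments='mv')
print(mean, alpha * beta)       # both 3.75
print(var, alpha * beta**2)     # both 5.625
```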