How to calculate the Akaike information criterion with a probability distribution function?


The Akaike information criterion (AIC) is one of the most useful indicators for selecting variables in multivariate analysis. If N is the number of free parameters, AIC is calculated as follows:

\displaystyle AIC = -2(Maximum\ Log\ Likelihood)+2N

The number of free parameters of a model is the dimension of the space of values the parameters can take in the candidate models. AIC is an evaluation criterion for models estimated by the maximum likelihood method; it reflects the fact that the bias of the maximum log-likelihood is approximately equal to the number of free parameters included in the model.
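
As a concrete illustration, here is a minimal Python sketch that computes AIC for a normal model fitted by maximum likelihood. The sample values and the choice of a normal model are assumptions made only for this example.

# Minimal sketch: AIC for a normal model fitted by maximum likelihood.
# The sample data and the normal model are assumptions for illustration.
import numpy as np
from scipy import stats

x = np.array([4.2, 5.1, 3.8, 6.0, 5.5, 4.9, 5.3, 4.4])  # hypothetical sample

# MLE for a normal distribution: sample mean and (biased) sample standard deviation.
mu_hat = x.mean()
sigma_hat = x.std(ddof=0)

# Maximum log-likelihood: l(theta_hat) = sum_alpha log f(x_alpha | theta_hat).
max_log_likelihood = stats.norm.logpdf(x, loc=mu_hat, scale=sigma_hat).sum()

# N = 2 free parameters (mean and standard deviation).
N = 2
aic = -2.0 * max_log_likelihood + 2 * N
print(aic)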

How do we find the maximum log-likelihood? Let's define the log-likelihood function as the following equation:

\displaystyle l(\theta) = \sum_{\alpha=1}^{n}\log f(x_{\alpha}|\theta)

The maximum likelihood estimator \hat\theta is the value that maximizes l(\theta); this approach is called the maximum likelihood method. l(\hat\theta) = \sum_{\alpha=1}^{n}\log f(x_\alpha |\hat\theta) is called the maximum log-likelihood.
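
The maximization can also be carried out numerically. Below is a minimal Python sketch, assuming an exponential model f(x|\lambda) = \lambda e^{-\lambda x} and a hypothetical sample, that obtains \hat\lambda and l(\hat\lambda) by minimizing the negative log-likelihood.

# Minimal sketch: maximum likelihood by numerical optimization.
# The exponential model and the sample data are assumptions for illustration.
import numpy as np
from scipy.optimize import minimize_scalar

x = np.array([0.8, 1.3, 0.4, 2.1, 0.9, 1.7])  # hypothetical sample

def neg_log_likelihood(lam):
    # l(lambda) = sum_alpha log f(x_alpha | lambda), with log f(x|lambda) = log(lambda) - lambda * x
    return -np.sum(np.log(lam) - lam * x)

res = minimize_scalar(neg_log_likelihood, bounds=(1e-6, 10.0), method="bounded")
lambda_hat = res.x              # maximum likelihood estimator
max_log_likelihood = -res.fun   # maximum log-likelihood l(lambda_hat)
print(lambda_hat, max_log_likelihood)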

If the log-likelihood function l(\theta) is differentiable, the maximum likelihood estimator \hat\theta is obtained by solving the likelihood equation:

\displaystyle \frac{\partial l(\theta)}{\partial \theta} = 0
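
For example, with the exponential distribution f(x|\lambda) = \lambda e^{-\lambda x} (chosen here only as an illustration), the log-likelihood is

\displaystyle l(\lambda) = \sum_{\alpha=1}^{n}\log f(x_{\alpha}|\lambda) = n\log\lambda - \lambda\sum_{\alpha=1}^{n}x_{\alpha}

and solving the likelihood equation

\displaystyle \frac{\partial l(\lambda)}{\partial \lambda} = \frac{n}{\lambda} - \sum_{\alpha=1}^{n}x_{\alpha} = 0

gives the maximum likelihood estimator \hat\lambda = n/\sum_{\alpha=1}^{n}x_{\alpha} = 1/\bar{x}.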


