Proof: Gauss-Markov theorem
References:

- STAT 849 lecture notes, Ch. 4 (UW-Madison): https://pages.stat.wisc.edu/~st849-1/lectures/Ch04.pdf
- Proof of Gauss-Markov theorem (math.stackexchange.com): https://math.stackexchange.com/questions/1542602/proof-of-gauss-markov-theorem
  Sets up the model $Y = X\beta + \varepsilon$ with $Y \in \mathcal M_{n\times 1}(\mathbb R)$, $X \in \mathcal M_{n\times p}(\mathbb R)$, $\beta \in \mathcal M_{p\times 1}(\mathbb R)$, and $\varepsilon \in \mathcal M_{n\times 1}(\mathbb R)$, and gives a compact matrix-algebra proof of the theorem.
- What makes inequality true in proof of Gauss Markov theorem (math.stackexchange.com): asks what justifies the variance inequality in the middle step of that compact proof.
- The Gauss–Markov Theorem (gregorygundersen.com): https://gregorygundersen.com/blog/2022/02/08/gauss-markov-theorem/
  Informally, the theorem states that, under certain conditions, the ordinary least squares (OLS) estimator is the best linear unbiased estimator of the model coefficients.
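
For quick reference, below is a sketch of the standard argument that the links above walk through, under the usual Gauss-Markov assumptions ($X$ has full column rank, $\mathbb E[\varepsilon]=0$, $\operatorname{Var}(\varepsilon)=\sigma^2 I_n$). The notation follows the Stack Exchange question; the exact presentation in each reference may differ.

```latex
\documentclass{article}
\usepackage{amsmath, amssymb}
\begin{document}

\paragraph{Theorem (Gauss--Markov).}
In the model $Y = X\beta + \varepsilon$ with $X \in \mathcal{M}_{n \times p}(\mathbb{R})$ of full
column rank, $\mathbb{E}[\varepsilon] = 0$, and $\operatorname{Var}(\varepsilon) = \sigma^2 I_n$,
the OLS estimator $\hat{\beta} = (X^\top X)^{-1} X^\top Y$ has the smallest variance (in the
positive semidefinite order) among all linear unbiased estimators of $\beta$.

\paragraph{Proof sketch.}
% Any linear estimator can be written as OLS plus a perturbation D.
Let $\tilde{\beta} = CY$ be a linear unbiased estimator and write
$C = (X^\top X)^{-1} X^\top + D$ for some $D \in \mathcal{M}_{p \times n}(\mathbb{R})$.
Unbiasedness for every $\beta$ forces
\[
  \mathbb{E}[\tilde{\beta}] = CX\beta = \beta \ \text{for all } \beta
  \;\Longrightarrow\; CX = I_p
  \;\Longrightarrow\; DX = 0 .
\]
% The cross terms vanish because DX = 0, leaving only a PSD excess term.
Since $\operatorname{Var}(\tilde{\beta}) = C \operatorname{Var}(Y)\, C^\top = \sigma^2 C C^\top$
and $(X^\top X)^{-1} X^\top D^\top = (X^\top X)^{-1} (DX)^\top = 0$,
\begin{align*}
  \operatorname{Var}(\tilde{\beta})
    &= \sigma^2 \left[ (X^\top X)^{-1} + D D^\top \right] \\
    &= \operatorname{Var}(\hat{\beta}) + \sigma^2 D D^\top
    \succeq \operatorname{Var}(\hat{\beta}),
\end{align*}
where the inequality holds because $D D^\top$ is positive semidefinite:
$v^\top D D^\top v = (D^\top v)^\top (D^\top v) \ge 0$ for every $v$.
This is exactly the step asked about in the second Stack Exchange link.

\end{document}
```

The positive-semidefinite ordering $\succeq$ is what "best" means here: for any vector $a$, $\operatorname{Var}(a^\top \tilde{\beta}) \ge \operatorname{Var}(a^\top \hat{\beta})$, so every linear combination of the OLS coefficients has the smallest variance among linear unbiased estimators.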