Understanding Fully Connected Layers in Deep Learning
1. Single-Layer Model

1.1 Recap

$out = f(X @ W + b)$

For example: $out = \operatorname{relu}(X @ W + b)$

Note: $f(x)$ is called the activation function.

2.1 X @ W + b (progressive dimensionality reduction: mapping high-dimensional raw samples down to low-dimensional classification outputs)

$out = \operatorname{relu}(X @ W + b)$

[h00h10h01h11]=relu([x00x1
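The single-layer formula above can be sketched directly in NumPy. This is a minimal illustration, not the note's own code; the shapes (a batch of 4 samples, 784 input features reduced to 10 outputs) are assumptions chosen to show the dimensionality reduction the text describes.

```python
import numpy as np

# Single fully connected layer: out = relu(X @ W + b).
# Assumed shapes: 4 samples, 784 input features -> 10 output features.
rng = np.random.default_rng(0)

X = rng.standard_normal((4, 784))   # input samples, shape [batch, d_in]
W = rng.standard_normal((784, 10))  # weight matrix, shape [d_in, d_out]
b = np.zeros(10)                    # bias vector, shape [d_out]

def relu(x):
    # The activation function f: element-wise max(0, x)
    return np.maximum(0.0, x)

out = relu(X @ W + b)               # shape [batch, d_out]
print(out.shape)
```

Running this prints `(4, 10)`: the layer maps each 784-dimensional sample to a 10-dimensional output, which is the high-to-low dimensional reduction the section refers to.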