Search Resource List
CNNdaima-tuxiangshibie
- A CNN method for deep learning and image recognition; the number of layers is increased from the original five to seven, and the ReLU activation function replaces the original sigmoid function. The code runs. (See the activation sketch below.)
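Swapping sigmoid for ReLU, as this package describes, is a common change for deeper networks. A minimal NumPy sketch of the two activations follows; it is illustrative only and not taken from the package's code:

```python
import numpy as np

def sigmoid(x):
    # Original activation: squashes values into (0, 1) and saturates for large |x|.
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # ReLU activation: zero for negative inputs, identity for positive inputs.
    return np.maximum(0.0, x)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(sigmoid(x))  # values squashed into (0, 1)
print(relu(x))     # [0.  0.  0.  0.5 2. ]
```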
Conv_neural_net_NumPy
- Implements the convolution, pooling, and ReLU operations of a convolutional neural network in NumPy. (numpy, ReLU, pooling, conv) A sketch of these operations appears below.
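A minimal sketch of what such NumPy implementations typically look like, here assuming valid 2D convolution, non-overlapping 2x2 max pooling, and elementwise ReLU; this is illustrative and not the repository's actual code:

```python
import numpy as np

def conv2d(image, kernel):
    # Valid 2D cross-correlation (commonly called "convolution" in CNN code).
    h, w = image.shape
    kh, kw = kernel.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(x, size=2):
    # Non-overlapping max pooling; trims edges that don't fill a full window.
    h, w = x.shape
    h, w = h - h % size, w - w % size
    return x[:h, :w].reshape(h // size, size, w // size, size).max(axis=(1, 3))

def relu(x):
    return np.maximum(0.0, x)

image = np.random.rand(6, 6)
kernel = np.random.rand(3, 3)
feature_map = relu(conv2d(image, kernel))  # 4x4 feature map
pooled = max_pool(feature_map)             # 2x2 pooled output
print(pooled.shape)
```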