
mxnet-Sequential

Published: 2020-12-14 04:14:28 | Category: Big Data | Source: compiled from the web
#!/usr/bin/env python2
# -*- coding: utf-8 -*-
"""
Created on Fri Aug 10 16:13:29 2018
@author: myhaspl
"""
from mxnet import nd
from mxnet.gluon import nn

net = nn.Sequential()
# Add a sequence of layers.
net.add(
    # As with Dense, it is not necessary to specify the input channels
    # via the `in_channels` argument; they are inferred automatically
    # in the first forward pass. We also apply a relu activation on the
    # output. In addition, a tuple can specify a non-square kernel
    # size, such as `kernel_size=(2, 4)`.
    nn.Conv2D(channels=6, kernel_size=5, activation='relu'),
    # One can also use a tuple to specify non-symmetric pool and
    # stride sizes.
    nn.MaxPool2D(pool_size=2, strides=2),
    nn.Conv2D(channels=16, kernel_size=3, activation='relu'),
    nn.MaxPool2D(pool_size=2, strides=2),
    # Flatten the 4-D input into 2-D with shape
    # `(x.shape[0], x.size / x.shape[0])` so that it can be used by the
    # following dense layers.
    nn.Flatten(),
    nn.Dense(120, activation="relu"),
    nn.Dense(84, activation="relu"),
    nn.Dense(10))

print net

net.initialize()
x = nd.random.uniform(shape=(4, 1, 28, 28))
y = net(x)
print y.shape
print net[0].weight.data().shape
print net[7].weight.data().shape

The `print net` statement shows the layer structure:

Sequential(
  (0): Conv2D(None -> 6, kernel_size=(5, 5), stride=(1, 1), Activation(relu))
  (1): MaxPool2D(size=(2, 2), stride=(2, 2), padding=(0, 0), ceil_mode=False)
  (2): Conv2D(None -> 16, kernel_size=(3, 3), stride=(1, 1), Activation(relu))
  (3): MaxPool2D(size=(2, 2), stride=(2, 2), padding=(0, 0), ceil_mode=False)
  (4): Flatten
  (5): Dense(None -> 120, Activation(relu))
  (6): Dense(None -> 84, Activation(relu))
  (7): Dense(None -> 10, linear)
)

The remaining three print statements give:

(4L, 10L)
(6L, 1L, 5L, 5L)
(10L, 84L)

The batch of 4 inputs produces 4 rows of 10 class scores. The first conv layer's weight is (6, 1, 5, 5): 6 output channels, 1 input channel (inferred from the single-channel input on the first forward pass), and a 5x5 kernel. The last dense layer's weight is (10, 84), matching the 84-unit layer before it.
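As a side note, the `in_channels` inference described in the comments can be observed directly on the parameter shapes. Below is a minimal sketch, not from the original post, assuming the MXNet 1.x Gluon API in which an uninferred dimension appears as a 0 placeholder until the first forward pass:

from mxnet import nd
from mxnet.gluon import nn

# `net2` is a hypothetical two-layer network used only for illustration.
net2 = nn.Sequential()
net2.add(nn.Conv2D(channels=6, kernel_size=5, activation='relu'),
         nn.Dense(10))
net2.initialize()
# No data has flowed through yet, so in_channels is still unknown and
# the weight shape should carry a 0 placeholder for that axis.
print net2[0].weight.shape   # expected: (6, 0, 5, 5)
y = net2(nd.random.uniform(shape=(4, 1, 28, 28)))
print net2[0].weight.shape   # expected: (6, 1, 5, 5), in_channels inferred as 1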

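The comments also note that kernel, pool, and stride sizes accept tuples for non-square and non-symmetric settings. A minimal sketch of that follows, with layer sizes chosen purely for illustration (they are not from the original post):

from mxnet import nd
from mxnet.gluon import nn

net3 = nn.Sequential()
net3.add(
    # A 2x4 kernel: with no padding and stride 1, height shrinks by 1
    # and width by 3, so a 28x28 input becomes 27x25.
    nn.Conv2D(channels=6, kernel_size=(2, 4), activation='relu'),
    # A 2x2 pooling window with stride 2 vertically and 1 horizontally:
    # floor((27-2)/2)+1 = 13 rows, floor((25-2)/1)+1 = 24 columns.
    nn.MaxPool2D(pool_size=(2, 2), strides=(2, 1)))
net3.initialize()
x = nd.random.uniform(shape=(4, 1, 28, 28))
print net3(x).shape   # expected: (4, 6, 13, 24)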

