Comparing Sequential() and Class-Based Methods for Building Neural Networks


Preface

If the neural network you are building has a simple structure, Sequential() is enough to describe it; when the layers become more numerous or varied in type, you can instead define a class to describe the network.


The comparison code is as follows:

import torch
from torch import nn
from torch.nn import Conv2d, MaxPool2d, Flatten, Linear, Sequential


class NNet(nn.Module):
    def __init__(self):
        super(NNet, self).__init__()
        '''
        The layers below are replaced by the Sequential container that follows;
        they do exactly the same thing. Sequential's job is simply to chain the
        layers together, which keeps the code tidier.
        self.conv=Conv2d(in_channels=3,out_channels=32,kernel_size=5,padding=2,stride=1)
        self.maxpool=MaxPool2d(2)
        self.conv2=Conv2d(32,32,5,padding=2)
        self.maxpool2=MaxPool2d(2)
        self.conv3=Conv2d(32,64,5,padding=2)
        self.maxpool3=MaxPool2d(2)
        self.flatten=Flatten()  # unrolls the image pixels into a single vector
        self.linear=Linear(1024,64)
        self.linear2=Linear(64,10)
        '''
        self.model = Sequential(
            Conv2d(3, 32, 5, padding=2),
            MaxPool2d(2),
            Conv2d(32, 32, 5, padding=2),
            MaxPool2d(2),
            Conv2d(32, 64, 5, padding=2),
            MaxPool2d(2),
            Flatten(),
            Linear(1024, 64),
            Linear(64, 10),
        )
    def forward(self, x):
        '''
        All of the lines below are replaced by self.model(x):
        x=self.conv(x)
        x=self.maxpool(x)
        x=self.conv2(x)
        x=self.maxpool2(x)
        x=self.conv3(x)
        x=self.maxpool3(x)
        x=self.flatten(x)
        x=self.linear(x)
        x=self.linear2(x)
        '''
        x = self.model(x)
        return x
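As a quick sanity check, the Sequential variant can be run on a dummy batch of CIFAR-10-sized images (3×32×32); after three conv/pool stages the feature map is 64×4×4 = 1024 values, which is why the first Linear layer takes 1024 inputs. The batch size of 64 below is just an illustrative choice:

```python
import torch
from torch.nn import Conv2d, MaxPool2d, Flatten, Linear, Sequential

model = Sequential(
    Conv2d(3, 32, 5, padding=2),   # 3x32x32  -> 32x32x32
    MaxPool2d(2),                  # 32x32x32 -> 32x16x16
    Conv2d(32, 32, 5, padding=2),  # 32x16x16 -> 32x16x16
    MaxPool2d(2),                  # 32x16x16 -> 32x8x8
    Conv2d(32, 64, 5, padding=2),  # 32x8x8   -> 64x8x8
    MaxPool2d(2),                  # 64x8x8   -> 64x4x4
    Flatten(),                     # 64x4x4   -> 1024
    Linear(1024, 64),
    Linear(64, 10),
)

x = torch.ones(64, 3, 32, 32)  # dummy batch of 64 images
y = model(x)
print(y.shape)  # torch.Size([64, 10])
```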

Implementing forward propagation with a call method is also feasible; the following example uses TensorFlow's Keras API:

from tensorflow.keras import Model
from tensorflow.keras.layers import (Conv2D, BatchNormalization, Activation,
                                     MaxPool2D, Dropout, Flatten, Dense)


class Baseline(Model):
    def __init__(self):
        super(Baseline, self).__init__()
        self.c1 = Conv2D(filters=6, kernel_size=(5, 5), padding='same')  # convolutional layer c1
        self.b1 = BatchNormalization()  # batch-normalization layer
        self.a1 = Activation('relu')  # activation layer
        self.p1 = MaxPool2D(pool_size=(2, 2), strides=2, padding='same')  # pooling layer
        self.d1 = Dropout(0.2)  # dropout layer

        self.flatten = Flatten()
        self.f1 = Dense(128, activation='relu')
        self.d2 = Dropout(0.2)
        self.f2 = Dense(10, activation='softmax')

    def call(self, x):
        x = self.c1(x)   # feed x into convolutional layer c1
        x = self.b1(x)
        x = self.a1(x)
        x = self.p1(x)
        x = self.d1(x)

        x = self.flatten(x)
        x = self.f1(x)
        x = self.d2(x)
        y = self.f2(x)
        return y

The reason this works is that the base class implements the call protocol: in Keras, Model defines __call__, which dispatches to the call method you write, just as PyTorch's nn.Module defines __call__ and dispatches to forward. In both frameworks you apply the model instance directly to the input, and the base class routes the call to your method.
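The dispatch pattern described above can be sketched in plain Python, without either framework. The class names here (ModuleBase, Doubler) are purely illustrative; real frameworks also run hooks and other bookkeeping inside __call__:

```python
class ModuleBase:
    """Minimal stand-in for nn.Module / keras.Model."""
    def __call__(self, *args, **kwargs):
        # The base class forwards the call to the method the
        # subclass defines ("forward" in PyTorch, "call" in Keras).
        return self.forward(*args, **kwargs)


class Doubler(ModuleBase):
    def forward(self, x):
        return 2 * x


m = Doubler()
print(m(21))  # calling the instance dispatches to forward -> 42
```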


Copyright notice: this article is an original work by weixin_43759716, licensed under CC 4.0 BY-SA. When reposting, please include a link to the original source and this notice.