ResNet18 model




Structure

ResNet18(
  (conv1): Conv2D(3, 64, kernel_size=[3, 3], padding=1, data_format=NCHW)
  (bn1): BatchNorm2D(num_features=64, momentum=0.9, epsilon=1e-05)
  (relu): ReLU()
  (avagPool): AdaptiveAvgPool2D(output_size=1)
  (classifier): Linear(in_features=512, out_features=1000, dtype=float32)
  (layer1): Sequential(
    (0): Block(
      (conv1): Conv2D(64, 64, kernel_size=[3, 3], padding=1, data_format=NCHW)
      (bn1): BatchNorm2D(num_features=64, momentum=0.9, epsilon=1e-05)
      (conv2): Conv2D(64, 64, kernel_size=[3, 3], padding=1, data_format=NCHW)
      (bn2): BatchNorm2D(num_features=64, momentum=0.9, epsilon=1e-05)
      (relu): ReLU()
      (downsample): Identity()
    )
    (1): Block(
      (conv1): Conv2D(64, 64, kernel_size=[3, 3], padding=1, data_format=NCHW)
      (bn1): BatchNorm2D(num_features=64, momentum=0.9, epsilon=1e-05)
      (conv2): Conv2D(64, 64, kernel_size=[3, 3], padding=1, data_format=NCHW)
      (bn2): BatchNorm2D(num_features=64, momentum=0.9, epsilon=1e-05)
      (relu): ReLU()
      (downsample): Identity()
    )
  )
  (layer2): Sequential(
    (0): Block(
      (conv1): Conv2D(64, 128, kernel_size=[3, 3], stride=[2, 2], padding=1, data_format=NCHW)
      (bn1): BatchNorm2D(num_features=128, momentum=0.9, epsilon=1e-05)
      (conv2): Conv2D(128, 128, kernel_size=[3, 3], padding=1, data_format=NCHW)
      (bn2): BatchNorm2D(num_features=128, momentum=0.9, epsilon=1e-05)
      (relu): ReLU()
      (downsample): Sequential(
        (0): Conv2D(64, 128, kernel_size=[1, 1], stride=[2, 2], data_format=NCHW)
        (1): BatchNorm2D(num_features=128, momentum=0.9, epsilon=1e-05)
      )
    )
    (1): Block(
      (conv1): Conv2D(128, 128, kernel_size=[3, 3], padding=1, data_format=NCHW)
      (bn1): BatchNorm2D(num_features=128, momentum=0.9, epsilon=1e-05)
      (conv2): Conv2D(128, 128, kernel_size=[3, 3], padding=1, data_format=NCHW)
      (bn2): BatchNorm2D(num_features=128, momentum=0.9, epsilon=1e-05)
      (relu): ReLU()
      (downsample): Identity()
    )
  )
  (layer3): Sequential(
    (0): Block(
      (conv1): Conv2D(128, 256, kernel_size=[3, 3], stride=[2, 2], padding=1, data_format=NCHW)
      (bn1): BatchNorm2D(num_features=256, momentum=0.9, epsilon=1e-05)
      (conv2): Conv2D(256, 256, kernel_size=[3, 3], padding=1, data_format=NCHW)
      (bn2): BatchNorm2D(num_features=256, momentum=0.9, epsilon=1e-05)
      (relu): ReLU()
      (downsample): Sequential(
        (0): Conv2D(128, 256, kernel_size=[1, 1], stride=[2, 2], data_format=NCHW)
        (1): BatchNorm2D(num_features=256, momentum=0.9, epsilon=1e-05)
      )
    )
    (1): Block(
      (conv1): Conv2D(256, 256, kernel_size=[3, 3], padding=1, data_format=NCHW)
      (bn1): BatchNorm2D(num_features=256, momentum=0.9, epsilon=1e-05)
      (conv2): Conv2D(256, 256, kernel_size=[3, 3], padding=1, data_format=NCHW)
      (bn2): BatchNorm2D(num_features=256, momentum=0.9, epsilon=1e-05)
      (relu): ReLU()
      (downsample): Identity()
    )
  )
  (layer4): Sequential(
    (0): Block(
      (conv1): Conv2D(256, 512, kernel_size=[3, 3], stride=[2, 2], padding=1, data_format=NCHW)
      (bn1): BatchNorm2D(num_features=512, momentum=0.9, epsilon=1e-05)
      (conv2): Conv2D(512, 512, kernel_size=[3, 3], padding=1, data_format=NCHW)
      (bn2): BatchNorm2D(num_features=512, momentum=0.9, epsilon=1e-05)
      (relu): ReLU()
      (downsample): Sequential(
        (0): Conv2D(256, 512, kernel_size=[1, 1], stride=[2, 2], data_format=NCHW)
        (1): BatchNorm2D(num_features=512, momentum=0.9, epsilon=1e-05)
      )
    )
    (1): Block(
      (conv1): Conv2D(512, 512, kernel_size=[3, 3], padding=1, data_format=NCHW)
      (bn1): BatchNorm2D(num_features=512, momentum=0.9, epsilon=1e-05)
      (conv2): Conv2D(512, 512, kernel_size=[3, 3], padding=1, data_format=NCHW)
      (bn2): BatchNorm2D(num_features=512, momentum=0.9, epsilon=1e-05)
      (relu): ReLU()
      (downsample): Identity()
    )
  )
)

Process finished with exit code 0
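Before reading the code, it helps to check the spatial arithmetic this printout implies: the stem and layer1 keep the input resolution, while the first block of layer2 through layer4 halves it, so a 32×32 input reaches the global average pool at 4×4. A quick sketch of that arithmetic using the standard conv output-size formula (plain Python, no paddle needed; the helper name is illustrative):

```python
def conv_out(size, kernel, stride, padding):
    # standard Conv2D output size: floor((H + 2p - k) / s) + 1
    return (size + 2 * padding - kernel) // stride + 1

h = 32                       # CIFAR-sized input, as in main() below
h = conv_out(h, 3, 1, 1)     # stem: stride 1 keeps 32x32
for stride in (1, 2, 2, 2):  # first block of layer1..layer4
    h = conv_out(h, 3, stride, 1)
print(h)  # 4 -> AdaptiveAvgPool2D(1) then reduces 4x4 to 1x1
```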

Code

import paddle
import paddle.nn as nn


class Identity(nn.Layer):
    def __init__(self):
        super().__init__()

    def forward(self, x):
        return x


class Block(nn.Layer):
    def __init__(self, in_dim, out_dim, stride):
        super().__init__()
        self.conv1 = nn.Conv2D(in_dim, out_dim, 3, stride, 1, bias_attr=False)
        self.bn1 = nn.BatchNorm2D(out_dim)
        self.conv2 = nn.Conv2D(out_dim, out_dim, 3, 1, 1, bias_attr=False)
        self.bn2 = nn.BatchNorm2D(out_dim)
        self.relu = nn.ReLU()
        if stride == 2 or in_dim != out_dim:
            self.downsample = nn.Sequential(
                *[nn.Conv2D(in_dim, out_dim, 1, stride, bias_attr=False), nn.BatchNorm2D(out_dim)])
        else:
            self.downsample = Identity()

    def forward(self, x):
        h = x
        x = self.conv1(x)
        x = self.bn1(x)
        x = self.relu(x)
        x = self.conv2(x)
        x = self.bn2(x)
        identity = self.downsample(h)
        x = x + identity
        x = self.relu(x)
        return x


class ResNet18(nn.Layer):
    def __init__(self, in_dim=64, num_classes=1000):
        super().__init__()
        self.in_dim = in_dim  # easy to forget: makelayer updates this as channels grow
        # stem
        self.conv1 = nn.Conv2D(in_channels=3, out_channels=in_dim, kernel_size=3, stride=1, padding=1, bias_attr=False)
        self.bn1 = nn.BatchNorm2D(in_dim)
        self.relu = nn.ReLU()
        # head
        self.avagPool = nn.AdaptiveAvgPool2D(1)
        self.classifier = nn.Linear(512, num_classes)
        # blocks
        self.layer1 = self.makelayer(64, 2, 1)
        self.layer2 = self.makelayer(128, 2, 2)
        self.layer3 = self.makelayer(256, 2, 2)
        self.layer4 = self.makelayer(512, 2, 2)

    def makelayer(self, out_dim, n_blocks, stride):
        layer_list = []
        layer_list.append(Block(self.in_dim, out_dim, stride))  # note: self.in_dim is the instance attribute, not a local; it is reassigned below
        self.in_dim = out_dim
        for i in range(1, n_blocks):
            layer_list.append(Block(self.in_dim, out_dim, stride=1))
        return nn.Sequential(*layer_list)

    def forward(self, x):
        # stem
        x = self.conv1(x)
        x = self.bn1(x)
        x = self.relu(x)

        # blocks
        x = self.layer1(x)
        x = self.layer2(x)
        x = self.layer3(x)
        x = self.layer4(x)

        # head
        x = self.avagPool(x)
        # print("preflatten:", x.shape)
        x = x.flatten(1)
        # print("flatten:", x.shape)
        x = self.classifier(x)
        # print("classifier:", x.shape)
        return x


def main():
    model = ResNet18()
    x = paddle.randn([2, 3, 32, 32])
    out = model(x)
    print(model)
    # print("x.shape:", x.shape)


if __name__ == "__main__":
    main()
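Because makelayer mutates self.in_dim, the channel/stride schedule it produces is easy to get wrong. A paddle-free sketch of the same bookkeeping (function and variable names here are illustrative, not part of the original code) confirms which blocks receive the Conv2D downsample branch seen in the printed structure:

```python
def trace_layers(in_dim=64, plan=((64, 1), (128, 2), (256, 2), (512, 2)), n_blocks=2):
    # Mirror makelayer: the first block of each stage takes the running
    # in_dim and the stage stride; subsequent blocks use stride 1.
    blocks = []
    for out_dim, stride in plan:
        blocks.append((in_dim, out_dim, stride))
        in_dim = out_dim
        for _ in range(1, n_blocks):
            blocks.append((in_dim, out_dim, 1))
    return blocks

for in_dim, out_dim, stride in trace_layers():
    # Same condition as Block.__init__: a real downsample branch is needed
    # whenever the stride or the channel count changes.
    kind = "downsample" if (stride == 2 or in_dim != out_dim) else "identity"
    print((in_dim, out_dim, stride), kind)
```

The trace matches the structure above: only the first block of layer2, layer3, and layer4 gets the 1×1 Conv2D + BatchNorm2D shortcut; every other block uses Identity.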
Published by 全栈程序员-站长. Please credit the source when reposting: https://javaforall.net/141235.html (original site: https://javaforall.net)
