Note: the activation function layers also come from the nn module.
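For example, ReLU and Sigmoid are both provided as layer classes under torch.nn. The snippet below is only a small sketch of calling them on a hand-made tensor, not part of the original notes:

import torch
from torch import nn

relu = nn.ReLU()        # ReLU layer from torch.nn (accepts an optional inplace=True argument)
sigmoid = nn.Sigmoid()  # Sigmoid layer from torch.nn

x = torch.tensor([[1.0, -0.5],
                  [-1.0, 3.0]])
print(relu(x))     # negative entries become 0
print(sigmoid(x))  # every entry is squashed into (0, 1)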
import torch
import torchvision
from tensorboardX import SummaryWriter
from torch import nn
from torch.nn import ReLU
from torch.utils.data import DataLoader
from torchvision import transforms

# Small manual example, kept commented out:
# input = torch.tensor([[1, -0.5],
#                       [-1, 3]])
# output = torch.reshape(input, (-1, 1, 2, 2))
# print(output.shape)

# Load the CIFAR10 test split (train=False) and wrap it in a DataLoader
train_set = torchvision.datasets.CIFAR10(root="datasets2", train=False,
                                         transform=transforms.ToTensor(), download=True)
train_loader = DataLoader(dataset=train_set, batch_size=64)

class zj_relu(nn.Module):
    def __init__(self) -> None:
        super().__init__()
        self.relu = ReLU()  # the activation function layer also comes from torch.nn

    def forward(self, input):
        output = self.relu(input)
        return output

zj_relu_1 = zj_relu()
# output = zj_relu_1(input)
# print(output)

# Run every batch through the ReLU module and log input/output images to TensorBoard
step = 0
writer = SummaryWriter("logs")
for data in train_loader:
    imgs, targets = data
    output = zj_relu_1(imgs)
    writer.add_images("input", imgs, step)
    writer.add_images("output", output, step)
    step += 1
writer.close()
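Since ToTensor already scales the CIFAR10 pixel values into [0, 1], ReLU leaves the images unchanged, so the "input" and "output" panels in TensorBoard look identical. To see a visible difference, a Sigmoid layer can be swapped in; the class name below is only an illustrative sketch, not part of the original notes:

class zj_sigmoid(nn.Module):
    def __init__(self) -> None:
        super().__init__()
        self.sigmoid = nn.Sigmoid()  # Sigmoid is also a layer class under torch.nn

    def forward(self, input):
        return self.sigmoid(input)  # squashes every pixel value into (0, 1)

After running the script, start TensorBoard with tensorboard --logdir=logs and compare the "input" and "output" image panels.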
Finally
That covers these study notes on the activation function layer in PyTorch.