Hi,

I am trying to add an attention layer to a ResNet50 model in Python. My code is shown below, but it produces the error message that follows. Any suggestions would be appreciated.

AttributeError                            Traceback (most recent call last)
<ipython-input> in <module>
----> 1 flat1 = Flatten()(resnet.output)
      2 class1 = Dense(256, activation='relu')(flat1)
      3 class1 = BatchNormalization()(class1)
      4 # receive 3D and output 3D
      5 class2 = Dense(128, activation='relu')(class1)

~\anaconda3\lib\site-packages\torch\nn\modules\module.py in _call_impl(self, *input, **kwargs)
   1108         if not (self._backward_hooks or self._forward_hooks or self._forward_pre_hooks or _global_backward_hooks
   1109                 or _global_forward_hooks or _global_forward_pre_hooks):
-> 1110             return forward_call(*input, **kwargs)
   1111         # Do not call functions when jit is used
   1112         full_backward_hooks, non_full_backward_hooks = [], []

<ipython-input> in forward(self, x)
     21 class Flatten(nn.Module):
     22     def forward(self, x):
---> 23         return x.view(x.size(0), -1)
     24
     25 class ChannelGate(nn.Module):

AttributeError: 'KerasTensor' object has no attribute 'view'
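For context, the traceback shows two frameworks being mixed: `resnet.output` is a Keras symbolic tensor, but the `Flatten` being called is a custom PyTorch `nn.Module` whose `forward` calls `x.view(...)`, a method only `torch.Tensor` has. One way to avoid the error is to keep the whole head in Keras. The following is a minimal sketch of that approach, assuming the model is built with `tensorflow.keras` and reusing the layer names from the traceback (`flat1`, `class1`, `class2`); the input shape and `weights=None` are placeholder choices:

```python
# Keras-only classification head on ResNet50.
# The Flatten here is tensorflow.keras.layers.Flatten, not the custom
# PyTorch nn.Module from the traceback, so it accepts a KerasTensor.
from tensorflow.keras.applications import ResNet50
from tensorflow.keras.layers import BatchNormalization, Dense, Flatten
from tensorflow.keras.models import Model

# weights=None avoids downloading pretrained weights for this sketch.
resnet = ResNet50(weights=None, include_top=False, input_shape=(64, 64, 3))

flat1 = Flatten()(resnet.output)                 # flatten the conv feature map
class1 = Dense(256, activation='relu')(flat1)
class1 = BatchNormalization()(class1)
class2 = Dense(128, activation='relu')(class1)

model = Model(inputs=resnet.input, outputs=class2)
```

The same head can instead be written entirely in PyTorch (e.g. with `torch.nn.Flatten`), but the two frameworks' tensors cannot be passed through each other's layers.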
