My question is very simple:
How can I reduce a list (or tensor) of about 700 elements down to 512 elements using a max-pooling layer?
I tried the following code:
import torch
import torch.nn as nn

input_ids = tokenizer.encode(question, text)
print(input_ids)  # input_ids is a list of ~700 token ids (Python ints)

m = nn.AdaptiveMaxPool1d(512)
# convert the list to a tensor of shape (1, 1, 700) and apply the max-pooling layer
input_ids = m(torch.tensor([[input_ids]]))
But I get the following error:
RuntimeError: "adaptive_max_pool2d_cpu" not implemented for 'Long'
Could someone help me figure out what is causing this error and how to fix it?
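From what I can tell, torch.tensor() builds a LongTensor from the list of ints, and adaptive max pooling does not appear to be implemented for integer tensors on CPU, which seems to be what the error is complaining about. Below is a minimal sketch of the workaround I am considering (the input_ids here is placeholder data, since the real list comes from the tokenizer): cast to float, pool, then cast back to int. Is this the right approach?

import torch
import torch.nn as nn

input_ids = list(range(700))  # placeholder for the ~700 token ids from the tokenizer

m = nn.AdaptiveMaxPool1d(512)

# AdaptiveMaxPool1d expects a float tensor of shape (N, C, L)
x = torch.tensor(input_ids, dtype=torch.float).reshape(1, 1, -1)  # shape (1, 1, 700)

pooled = m(x)                                  # shape (1, 1, 512)
pooled_ids = pooled.squeeze().long().tolist()  # back to a plain list of 512 ints
print(len(pooled_ids))                         # 512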