Welcome to OStack Knowledge Sharing Community for programmer and developer-Open, Learning and Share
Welcome To Ask or Share your Answers For Others

Categories

0 votes
1.3k views
in Technique[技术] by (71.8m points)

pytorch - slice tensor of tensors using boolean tensor

Having two tensors :inputs_tokens is a batch of 20x300 of token ids and seq_A is my model output with size of [20, 300, 512] (512 vector for each of the tokens in the batch)

seq_A.size()
Out[1]: torch.Size([20, 300, 512])

inputs_tokens.size()
torch.Size([20, 300])

I would like to get only the vectors of the token 101 (CLS) as follow:

cls_tokens = (inputs_tokens == 101) 
cls_tokens
Out[4]: 
tensor([[ True, False, False,  ..., False, False, False],
       [ True, False, False,  ..., False, False, False],
       [ True, False, False,  ..., False, False, False], ...

How do I slice seq_A to get only the vectors which are true in cls_tokens for each batch? when I do

seq_A[cls_tokens].size()
Out[7]: torch.Size([278, 512]) 

but I still need it to bee in the size of [20 x N x 512 ] (otherwise I don't know to which sample it belongs)

question from:https://stackoverflow.com/questions/65908585/slice-tensor-of-tensors-using-boolean-tensor

与恶龙缠斗过久,自身亦成为恶龙;凝视深渊过久,深渊将回以凝视…
Welcome To Ask or Share your Answers For Others

1 Answer

0 votes
by (71.8m points)

TLDR; You can't, all sequences must have the same size along a given axis.


Take this simplified example:

>>> inputs_tokens = torch.tensor([[  1, 101,  18, 101,   9],
                                  [  1,   2, 101, 101, 101]])
>>> inputs_tokens.shape
torch.Size([2, 5])

>>> cls_tokens = inputs_tokens == 101
tensor([[False,  True, False,  True, False],
        [False, False,  True,  True,  True]])

Indexing inputs_tokens with the cls_tokens mask comes down to reducing inputs_tokens to cls_tokens's true values. In a general case where there is a different number of true values per batch, keeping the shape is impossible.

Following the above example, here is seq_A:

>>> seq_A = torch.rand(2, 5, 1)
tensor([[[0.4644],
         [0.7656],
         [0.3951],
         [0.6384],
         [0.1090]],

        [[0.6754],
         [0.0144],
         [0.7154],
         [0.5805],
         [0.5274]]])

According to your example, you would expect to have an output shape of (2, N, 1). What would N be? 3? What about the first batch which only as 2 true values? The resulting tensor can't have different sizes (2 and 3 on axis=1). Hence: "all sequences on axis=1 must have the same size".


If however, you are expecting each batch to have the same number of tokens 101, then you could get away with a broadcast of your indexed tensor:

>>> inputs_tokens = torch.tensor([[  1, 101, 101, 101,   9],
                                  [  1,   2, 101, 101, 101]])
>>> inputs_tokens.shape

>>> N = cls_tokens[0].sum()
3

Here remember, I'm assuming you have:

>>> assert all(cls_tokens.sum(axis=1) == N)

Therefore the desired output (with shape (2, 3, 1)) is:

>>> seq_A[cls_tokens].reshape(seq_A.size(0), N, -1)
tensor([[[0.7656],
         [0.3951],
         [0.6384]],

        [[0.7154],
         [0.5805],
         [0.5274]]])

Edit - if you really want to do this though you would require the use of a list comprehension:

>>> [seq_A[i, cls_tokens[i]] for i in range(cls_tokens.size(0))]
[ tensor([[0.7656],
          [0.6384]]), 
  tensor([[0.7154],
          [0.5805],
          [0.5274]]) ]

与恶龙缠斗过久,自身亦成为恶龙;凝视深渊过久,深渊将回以凝视…
Welcome to OStack Knowledge Sharing Community for programmer and developer-Open, Learning and Share
Click Here to Ask a Question

...