I have 3 sequences in a cell-array:
Input_cell= {'ABCD','ACD', 'ABD'}
S1= 'ABCD' % which means A<B<C<D
S2= 'ACD' % which means A<C<D % missing B in the full string of 'ABCD'
S3= 'ABD' % which means A<B<D % missing C in the full string of 'ABCD'
I want to convert each of the strings in Input_cell into an i-by-j matrix M which has to satisfy these conditions:

M(i,j) and M(j,i) are random
M(i,i) = 0.5
M(i,j) + M(j,i) = 1
M(i,j) < M(j,i) whenever i precedes j in the sequence (for example, if A<B then M(A,B) < M(B,A))
For example, if we have S1 = 'ABCD' (which means A<B<C<D), the expected matrix M1 is:

      A    B    C    D
A   0.5  0.3  0.2  0.1
B   0.7  0.5  0.0  0.4
C   0.8  1.0  0.5  0.1
D   0.9  0.6  0.9  0.5
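The example matrix M1 above can be checked against the three conditions; here is a quick NumPy sanity check (the M1 values are copied from the table):

```python
import numpy as np

# M1 from the example table for S1 = 'ABCD' (rows/cols in order A, B, C, D)
M1 = np.array([[0.5, 0.3, 0.2, 0.1],
               [0.7, 0.5, 0.0, 0.4],
               [0.8, 1.0, 0.5, 0.1],
               [0.9, 0.6, 0.9, 0.5]])

assert np.allclose(np.diag(M1), 0.5)   # M(i,i) = 0.5
assert np.allclose(M1 + M1.T, 1.0)     # M(i,j) + M(j,i) = 1
# A<B<C<D: every upper-triangular entry is below its mirror entry
iu = np.triu_indices(4, k=1)
assert np.all(M1[iu] < M1.T[iu])       # M(i,j) < M(j,i) for i before j
```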
If we have S2 = 'ACD' (which means A<C<D, with B missing from the full string 'ABCD'), we put the value 0.5 in every position involving B, so the expected matrix M2 is:

      A    B    C    D
A   0.5  0.5  0.2  0.1
B   0.5  0.5  0.5  0.5
C   0.8  0.5  0.5  0.1
D   0.9  0.5  0.9  0.5
If we have S3 = 'ABD' (which means A<B<D, with C missing from the full string 'ABCD'), we put the value 0.5 in every position involving C, so the expected matrix M3 is:

      A    B    C    D
A   0.5  0.4  0.5  0.1
B   0.6  0.5  0.5  0.3
C   0.5  0.5  0.5  0.5
D   0.9  0.7  0.5  0.5
How can I create matrices like the ones above from a given cell-array of sequences?
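One way to build such matrices is sketched below in Python/NumPy (the question is MATLAB-flavored, but the same logic ports directly: `rand`, `containers.Map` or `strfind`, and a loop over letter pairs). The sketch assumes the full alphabet is 'ABCD' and that each "losing" entry is drawn uniformly from (0, 0.5), so the mirror entry 1 - r is automatically larger:

```python
import numpy as np

def seq_to_matrix(seq, alphabet='ABCD', rng=None):
    """Build a pairwise matrix for one sequence over the full alphabet.

    Letters absent from seq keep 0.5 in their entire row and column.
    For letters a, b both present with a before b (a < b):
        M[a, b] = r in (0, 0.5),  M[b, a] = 1 - r,
    so M[a, b] + M[b, a] = 1 and M[a, b] < M[b, a].
    """
    rng = np.random.default_rng() if rng is None else rng
    n = len(alphabet)
    M = np.full((n, n), 0.5)           # diagonal and missing letters stay 0.5
    pos = {c: alphabet.index(c) for c in seq}
    for k, earlier in enumerate(seq):
        for later in seq[k + 1:]:
            i, j = pos[earlier], pos[later]
            r = rng.uniform(0.0, 0.5)  # strictly below 0.5
            M[i, j] = r                # earlier letter: M(i,j) < M(j,i)
            M[j, i] = 1.0 - r
    return M

input_cell = ['ABCD', 'ACD', 'ABD']
matrices = [seq_to_matrix(s) for s in input_cell]
```

Drawing r from (0, 0.5) and setting the mirror to 1 - r enforces both the sum-to-one and the strict-inequality conditions in one step, rather than sampling both entries and rejecting invalid pairs.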