Okay, so I'm working on an image blending task: I have a function that blends every image in a given array according to a specified weight, as in the code below:
import math
import numpy as np

# Weighting function: symmetric, Gaussian-shaped weights that sum to 1.
# Note: range(n, -n, -1) yields 2*n values after halving, so this
# assumes an even number of images.
def weight_sym(n):
    n = int(n / 2)
    r = range(n, -n, -1)
    val = [math.exp(-(2 * x / n) ** 2) for x in r]
    return np.array(val) / np.sum(val)

# Blending function: weighted sum of the images along the first axis.
def blend(imgs):
    num = len(imgs)
    Weight = weight_sym(num)
    P = np.einsum("ijkl,i->jkl", imgs, Weight)
    return P.astype(np.uint8)
Note: imgs is an array that holds multiple images in the form of pixel arrays; its shape is (n, 1080, 1920, 3), where n is the number of images, 1080 and 1920 are the image height and width, and 3 is the number of RGB channels.
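For reference, a call looks something like this (the shapes match my data, but the random bright block is just a stand-in for real frames):

import numpy as np

# 20 mostly-black frames with one small changing region, as a stand-in
imgs = np.zeros((20, 1080, 1920, 3), dtype=np.uint8)
imgs[:, 500:520, 900:940] = np.random.randint(0, 256, (20, 20, 40, 3), dtype=np.uint8)

result = blend(imgs)  # -> (1080, 1920, 3) uint8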
Unfortunately, for a large number of images the code slows down really badly. Since about 75% of every image is just black background, I was wondering: can I run the einsum calculation only for pixels whose value actually changes at least once across the images (perhaps using np.allclose() and its boolean output)?
What I mean is that I want to reduce the amount of computation by ignoring pixels whose value never changes in any image. So my question is: is it actually possible to do that and get better performance? If so, can you provide an example?
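For illustration, here is roughly the kind of thing I have in mind. This is an untested sketch: blend_masked is a hypothetical name, and using np.ptp to build the mask is just my guess at how to detect changing pixels.

import numpy as np

# Hypothetical sketch: blend only the pixels that change across frames.
def blend_masked(imgs):
    num = len(imgs)
    weight = weight_sym(num)  # reuse the weighting function from above
    # A pixel "changes" if its value range across frames is nonzero in any
    # channel; np.ptp(..., axis=0) is max - min over the image axis.
    changing = (np.ptp(imgs, axis=0) != 0).any(axis=-1)  # (1080, 1920) bool
    # Static pixels keep their original value: the weights sum to 1, so
    # blending identical values would just return that value anyway.
    out = imgs[0].astype(np.float64)
    # einsum only over the M changing pixels: imgs[:, changing] is (num, M, 3)
    out[changing] = np.einsum("imc,i->mc", imgs[:, changing], weight)
    return out.astype(np.uint8)

I would guess whether this beats the plain einsum depends on how much of the frame is static, since building the mask itself still touches every pixel.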
question from:
https://stackoverflow.com/questions/65871146/is-there-a-way-to-do-einsum-in-python-with-boolean-logic-to-optimize-it