Welcome to OStack Knowledge Sharing Community for programmers and developers - Open, Learning and Share
bitmap - Android SetPixels() Explanation and Example?

I am trying to set a region of pixels in a mutable bitmap to a different color in my Android app. Unfortunately I cannot get setPixels() to work properly; I keep getting ArrayIndexOutOfBoundsExceptions. I think it may have something to do with the stride, but I'm really not sure; that is the only parameter I still don't understand. The only other post I have found on setPixels() (not setPixel()) is "drawBitmap() and setPixels(): what's the stride?", and it did not help me. I have tried setting the stride to 0, to the width of the bitmap, and to the width of the bitmap minus the width of the area I'm trying to draw, and it still crashes. Here is my code:

public void updateBitmap(byte[] buf, int offset, int x, int y, int width, int height) {
    // transform byte[] to int[]
    IntBuffer intBuf = ByteBuffer.wrap(buf).asIntBuffer();
    int[] intarray = new int[intBuf.remaining()];
    intBuf.get(intarray);

    int stride = ??????; // what goes here?
    screenBitmap.setPixels(intarray, offset, stride, x, y, width, height); // crash here
}

My bitmap is mutable, so I know that is not the problem. I am also certain that my byte array is being properly converted to an integer array. But I keep getting ArrayIndexOutOfBoundsExceptions and I don't understand why. Please help me figure this out.
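One thing worth double-checking in isolation (plain Java, no Android needed): ByteBuffer's default byte order is big-endian, so the first byte of each 4-byte group becomes the high-order byte of the resulting int, which for an ARGB_8888 color is the alpha channel. A minimal sketch of that conversion:

```java
import java.nio.ByteBuffer;

public class PackingCheck {
    public static void main(String[] args) {
        // ByteBuffer defaults to big-endian, so {A, R, G, B} bytes
        // pack into the int 0xAARRGGBB, which is exactly the layout
        // Bitmap.setPixels() expects for ARGB_8888 colors.
        byte[] buf = { (byte) 0xFF, 0x03, (byte) 0xFF, 0x03 };
        int color = ByteBuffer.wrap(buf).asIntBuffer().get(0);
        System.out.println(Integer.toHexString(color)); // ff03ff03
    }
}
```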

EDIT - here is how I construct the fake input:

int width = 1300;
int height = 700;
byte[] buf = new byte[width * height * 4 * 4]; // adding another * 4 here seems to work... why?     
for (int i = 0; i < width * height * 4 * 4; i+=4) {
    buf[i] = (byte)255;
    buf[i + 1] = 3;
    buf[i + 2] = (byte)255;
    buf[i + 3] = 3;         
}
//(byte[] buf, int offset, int x, int y, int width, int height)  - for reference        
siv.updateBitmap(buf, 0, 0, 0, width, height);

So the width and height correspond to the correct number of ints (at least, they should).
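As a sanity check on that arithmetic (plain Java; this assumes the bitmap is ARGB_8888, where one pixel is 4 bytes, i.e. exactly one int after conversion): a width × height region should come from width * height * 4 bytes, yielding one int per pixel. The extra * 4 above makes the buffer four times larger than that.

```java
import java.nio.ByteBuffer;
import java.nio.IntBuffer;

public class BufferSizeCheck {
    public static void main(String[] args) {
        int width = 1300, height = 700;

        // One ARGB_8888 pixel = 4 bytes = 1 int, so a width x height
        // region needs width * height * 4 bytes (no extra * 4).
        byte[] buf = new byte[width * height * 4];

        IntBuffer intBuf = ByteBuffer.wrap(buf).asIntBuffer();
        int[] intarray = new int[intBuf.remaining()];
        intBuf.get(intarray);

        // Exactly one int per pixel.
        System.out.println(intarray.length == width * height); // prints true
    }
}
```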

EDIT2 - here is the code for the original creation of screenBitmap:

public Bitmap createABitmap() {
    int w = 1366;
    int h = 766;

    byte[] buf = new byte[h * w * 4];

    for (int i = 0; i < h * w * 4; i += 4) {
        buf[i] = (byte) 255;
        buf[i + 1] = (byte) 255;
        buf[i + 2] = 0;
        buf[i + 3] = 0;
    }

    DisplayMetrics metrics = new DisplayMetrics();
    getWindowManager().getDefaultDisplay().getMetrics(metrics);

    IntBuffer intBuf = ByteBuffer.wrap(buf).asIntBuffer();
    int[] intarray = new int[intBuf.remaining()];
    intBuf.get(intarray);

    Bitmap bmp = Bitmap.createBitmap(metrics, w, h, Bitmap.Config.ARGB_8888);
    bmp.setPixels(intarray, 0, w, 0, 0, w, h);
    return bmp;
}

It seems to work in this instance; I'm not sure what the difference is.


1 Answer


It should probably be:

screenBitmap.setPixels(intarray, 0, width / 4, x, y, width / 4, height);

because you have converted bytes to ints, so the array holds four times fewer entries than there are bytes. Your error is an ArrayIndexOutOfBoundsException; check whether intBuf.remaining() equals width * height / 4.
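More generally, the way the parameters fit together (a plain-Java sketch of the indexing setPixels() documents, not Android-specific): for row j and column i of the region, the pixel at (x + i, y + j) is read from colors[offset + j * stride + i]. So stride is the row length of the source array in ints (at least width, and equal to width for a tightly packed array), and the array must hold at least offset + (height - 1) * stride + width entries or an ArrayIndexOutOfBoundsException is thrown.

```java
public class StrideDemo {
    // Mirrors the indexing Bitmap.setPixels() documents:
    // pixel (x + i, y + j) comes from colors[offset + j * stride + i].
    static int sourceIndex(int offset, int stride, int i, int j) {
        return offset + j * stride + i;
    }

    public static void main(String[] args) {
        int width = 3, height = 2;
        int stride = width; // tightly packed: stride == region width
        int[] colors = { 1, 2, 3,
                         4, 5, 6 };

        // Row 1, column 2 of the region reads the last entry.
        System.out.println(colors[sourceIndex(0, stride, 2, 1)]); // 6

        // Minimum array length setPixels() requires for this call:
        int required = 0 + (height - 1) * stride + width;
        System.out.println(required); // 6
    }
}
```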

