Welcome to OStack Knowledge Sharing Community for programmer and developer-Open, Learning and Share

android - getSupportedPictureSize() returns a value which is not actually supported by Nexus4

I have extended a SurfaceView for displaying the camera feed for a very simple camera application. To find the optimal preview size for each device, I used this sample code which is used in almost all the open source camera apps I have seen:

    List<Camera.Size> sizes = parameters.getSupportedPreviewSizes();
    double minDiff = Double.MAX_VALUE;
    for (Camera.Size size : sizes) {
        // Pick the supported size whose width is closest to the target width.
        if (Math.abs(size.width - width) < minDiff) {
            screenWidth = size.width;
            screenHeight = size.height;
            minDiff = Math.abs(size.width - width);
        }
    }

Everything works perfectly up to this point.

Now, due to the nature of the application, I have to keep two bitmaps in memory during the course of a session, and for the sake of simplicity (avoiding memory issues during testing) I used the same code for the picture size, replacing getSupportedPreviewSizes() with getSupportedPictureSizes(). Everything works great on most devices, though I will eventually have to find some other way to choose the optimum picture size for each device.

Recently, while testing on a Nexus 4, the above loop failed to choose the optimum picture size. Upon investigation, I found that getSupportedPictureSizes() returns a size, 1280x960, which is not actually supported by the Nexus 4 camera. So how does one solve this issue? I mean, isn't this function supposed to return ONLY those sizes which the device's camera supports? I am sure there will be other devices with the same issue which I won't be able to test on. Any clues as to how this should be resolved?

UPDATE: What's happening is that the camera accepts the wrong parameter without any error, and the image it returns is distorted (I will try to attach a picture here as well). There are no runtime exceptions.


1 Answer


I have the same problem on an LG Lucid (model number VS840 4G). Basically, getSupportedPictureSizes() returns these sizes:

Size        Aspect ratio
2560x1920   1.333 (4:3)
3072x1728   1.778 (16:9)
2048x1536   1.333 (4:3)
2304x1296   1.778 (16:9)
1536x864    1.778 (16:9)
1280x960    1.333 (4:3)
640x480     1.333 (4:3)

If we call setPictureSize() with any of the 16:9 sizes, the camera hardware throws no errors, but it returns a distorted image:

[Image: sample of distorted result from the LG Lucid, showing what appears to be a pixel buffer interpreted with the wrong stride]

I tried a number of techniques to get the camera driver to admit this size was not really supported, including checking the result of getPictureSize() and getSupportedPictureFormats() after calling setParameters(). I couldn't find anything that gave away this behavior.

To work around this, we now mimic the behavior of the stock Camera app: we attempt to find a "preferred" size from the getSupportedPictureSizes() list before falling back to selecting the largest (as computed by width * height). Since the preferred sizes all have an aspect ratio of 1.333 (4:3), this works around the issue on this device.

See the Camera app source where it sets the value (line 111) and the hardcoded sizes it prefers (line 63) for the exact implementation.
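The fallback described above can be sketched in plain Java. This is not the exact stock Camera implementation; Camera.Size is replaced by an int[]{width, height} pair so the selection logic can run and be tested off-device, and the preferred 4:3 ratio and the tolerance are assumptions:

```java
import java.util.Arrays;
import java.util.List;

// Sketch of the stock-Camera-style fallback: prefer the largest ~4:3 size,
// otherwise take the largest size overall (by width * height).
public class PictureSizeChooser {

    static final double TARGET_RATIO = 4.0 / 3.0; // assumed preferred ratio
    static final double RATIO_TOLERANCE = 0.01;   // assumed tolerance

    public static int[] choose(List<int[]> sizes) {
        int[] best = null;
        // First pass: largest size whose aspect ratio is close to 4:3.
        for (int[] s : sizes) {
            double ratio = (double) s[0] / s[1];
            if (Math.abs(ratio - TARGET_RATIO) < RATIO_TOLERANCE
                    && (best == null || (long) s[0] * s[1] > (long) best[0] * best[1])) {
                best = s;
            }
        }
        // Fallback: no preferred size found, take the largest overall.
        if (best == null) {
            for (int[] s : sizes) {
                if (best == null || (long) s[0] * s[1] > (long) best[0] * best[1]) {
                    best = s;
                }
            }
        }
        return best;
    }

    public static void main(String[] args) {
        // The LG Lucid list from the table above.
        List<int[]> sizes = Arrays.asList(
                new int[]{2560, 1920}, new int[]{3072, 1728},
                new int[]{2048, 1536}, new int[]{2304, 1296},
                new int[]{1536, 864}, new int[]{1280, 960},
                new int[]{640, 480});
        int[] chosen = choose(sizes);
        System.out.println(chosen[0] + "x" + chosen[1]); // prints 2560x1920
    }
}
```

On this device the sketch skips the larger 3072x1728 (16:9) entry and selects 2560x1920, which is what avoids the distorted output.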
