Camera2 API: how to solve E/BufferQueueProducer: dequeueBuffer: attempting to exceed the max dequeued buffer count (4)?

I am using a Samsung A10 with SDK 28.

I am trying to take images using the Camera2 API, but after collecting 14 frames I always get:

    I/com.oculid.daq: NativeAlloc concurrent copying GC freed 3481(447KB) AllocSpace objects, 33(1580KB) LOS objects, 49% free, 3MB/7MB, paused 149us total 174.997ms
    D/NetworkManagementSocketTagger: tagSocket(92) with statsTag=0xffffffff, statsUid=-1
    D/NetworkManagementSocketTagger: tagSocket(100) with statsTag=0xffffffff, statsUid=-1
    D/NetworkManagementSocketTagger: tagSocket(96) with statsTag=0xffffffff, statsUid=-1
    D/NetworkManagementSocketTagger: tagSocket(101) with statsTag=0xffffffff, statsUid=-1
    D/NetworkManagementSocketTagger: tagSocket(106) with statsTag=0xffffffff, statsUid=-1
    D/IR_CAMERA: camera stateCallback onError
    E/BufferQueueProducer: [ImageReader-1280x720f100m1-965-0] dequeueBuffer: attempting to exceed the max dequeued buffer count (4)

I see this behavior on certain phone models. I am trying to understand what this error means and how to solve it, and would be happy for any leads.

I am attaching my OnImageAvailableListener below in case it helps.

    private ImageReader.OnImageAvailableListener readerListener = new ImageReader.OnImageAvailableListener() {
        /*
         * Receives the ImageReader the callback is associated with,
         * copies the JPEG bytes out of the Image and re-compresses them.
         */
        @RequiresApi(api = Build.VERSION_CODES.P)
        @Override
        public void onImageAvailable(ImageReader reader) {
            //Log.i(TAG, "readerListener on onImageAvailable");
            long imageReaderTime = SystemClock.elapsedRealtimeNanos();
            frameIndexForFileName++;
            //Log.i(TAG, String.format("Frame index: %08d", frameIndexForFileName));

            // try-with-resources closes the Image when the block exits
            try (Image image = reader.acquireNextImage()) {

                // copy the JPEG plane into a byte array;
                // remaining() rather than capacity(): the buffer can be larger than the valid data
                ByteBuffer buffer = image.getPlanes()[0].getBuffer();
                byte[] bytes = new byte[buffer.remaining()];
                buffer.get(bytes);

                double frameCaptureDuration = (double) (SystemClock.elapsedRealtimeNanos() - imageReaderTime) / 1000000;
                Log.i(TAG, String.format("TIME_MEASUREMENT: Capture of frame %08d took %f ms", frameIndexForFileName, frameCaptureDuration));

                // decode the JPEG and re-compress it at lower quality
                Bitmap bmp = BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
                ByteArrayOutputStream stream = new ByteArrayOutputStream();
                bmp.compress(Bitmap.CompressFormat.JPEG, 60, stream);
                byte[] compressImage = stream.toByteArray();
                bmp.recycle();
                //image.close();
                //reader.discardFreeBuffers();
                imageListener.onImage(compressImage, imageReaderTime);

            } catch (Exception e) {
                e.printStackTrace();
                Log.e(TAG, "!!!!!!!!!!!!!!!!onImageAvailable: " + e);
            }
        }
    };

thanks

android
android-camera2
asked on Stack Overflow Mar 3, 2020 by helpper • edited Mar 7, 2020 by helpper

1 Answer

I found a fix for the buffer problem. Changing the ImageReader's ImageFormat from JPEG to a raw format (YUV_420_888, which is usually also used for video) and converting the images to RGB later to produce the JPEGs solved the issue on the Samsung A10. I've also tested it on a Samsung A5, a Huawei, the Fairphone and the Pocophone, and it seems to work on all of them. I've read a lot about real-time camera applications, and using a raw format for the individual frames is recommended there for performance reasons.
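
Roughly what the change looks like, as a sketch: the helper class name, the 1280x720 size (taken from the log above), the maxImages value of 4 and the backgroundHandler parameter are placeholders for whatever your setup uses, and the NV21 packing shortcut only works when the chroma planes carry no row padding (it did on the devices I tested); a fully general converter has to walk the row and pixel strides.

    import android.graphics.ImageFormat;
    import android.graphics.Rect;
    import android.graphics.YuvImage;
    import android.media.Image;
    import android.media.ImageReader;
    import android.os.Handler;
    import java.io.ByteArrayOutputStream;
    import java.nio.ByteBuffer;

    class YuvFrameHelper {

        // Request raw YUV frames instead of a JPEG stream, so the camera HAL does not
        // have to encode every frame before handing it to the ImageReader.
        static ImageReader createYuvReader(ImageReader.OnImageAvailableListener listener,
                                           Handler backgroundHandler) {
            ImageReader reader = ImageReader.newInstance(1280, 720,
                    ImageFormat.YUV_420_888, /* maxImages */ 4);
            reader.setOnImageAvailableListener(listener, backgroundHandler);
            return reader;
        }

        // Convert one YUV_420_888 Image to a JPEG byte array.
        // Simplification: the V and U planes are copied straight after the Y plane to get
        // an NV21 layout, which is only correct when the planes have no row padding.
        static byte[] yuvImageToJpeg(Image image, int quality) {
            ByteBuffer yBuffer = image.getPlanes()[0].getBuffer();
            ByteBuffer uBuffer = image.getPlanes()[1].getBuffer();
            ByteBuffer vBuffer = image.getPlanes()[2].getBuffer();

            int ySize = yBuffer.remaining();
            int uSize = uBuffer.remaining();
            int vSize = vBuffer.remaining();

            byte[] nv21 = new byte[ySize + uSize + vSize];
            yBuffer.get(nv21, 0, ySize);
            vBuffer.get(nv21, ySize, vSize);          // NV21 = Y plane + interleaved V/U
            uBuffer.get(nv21, ySize + vSize, uSize);

            YuvImage yuv = new YuvImage(nv21, ImageFormat.NV21,
                    image.getWidth(), image.getHeight(), null);
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            yuv.compressToJpeg(new Rect(0, 0, image.getWidth(), image.getHeight()),
                    quality, out);
            return out.toByteArray();
        }
    }

Inside onImageAvailable the work then shrinks to a single compressToJpeg call instead of decodeByteArray followed by Bitmap.compress, so each Image can be closed and its buffer handed back to the reader sooner.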

answered on Stack Overflow Mar 7, 2020 by helpper

User contributions licensed under CC BY-SA 3.0