Java I420Frame Class Code Examples


This article collects typical usage examples of the Java class org.webrtc.VideoRenderer.I420Frame. If you have been wondering what I420Frame is for and how to use it, the examples selected below should help.



The I420Frame class belongs to the org.webrtc.VideoRenderer package. Twenty code examples of the class are shown below, sorted by popularity by default.

Example 1: setSize

import org.webrtc.VideoRenderer.I420Frame; // import the required package/class
@Override
public void setSize(final int width, final int height) {
  Log.d(TAG, "ID: " + id + ". YuvImageRenderer.setSize: " +
      width + " x " + height);
  videoWidth = width;
  videoHeight = height;
  int[] strides = { width, width / 2, width / 2  };
  // Frame re-allocation needs to be synchronized with copying the
  // frame to textures in draw() to avoid re-allocating the frame
  // while it is being copied.
  synchronized (frameToRenderQueue) {
    // Clear rendering queue.
    frameToRenderQueue.poll();
    // Re-allocate / allocate the frame.
    yuvFrameToRender = new I420Frame(width, height, strides, null);
    textureFrameToRender = new I420Frame(width, height, null, -1);
    updateTextureProperties = true;
  }
}
 
Developer: jingcmu | Project: MediaCodecTest | Source: VideoRendererGui.java
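The strides `{ width, width / 2, width / 2 }` in setSize() above assume an even frame width. As a self-contained sketch (the class and method names here are my own, not part of the original project), the I420 plane strides and byte sizes can be computed like this, rounding the chroma dimensions up so odd sizes are also handled:

```java
public class I420Planes {
    // Strides for the Y, U and V planes of an I420 frame.
    // Chroma planes are horizontally subsampled by 2, rounded up.
    static int[] strides(int width) {
        int chromaStride = (width + 1) / 2;
        return new int[] { width, chromaStride, chromaStride };
    }

    // Byte size of each plane; chroma planes are subsampled 2x2.
    static int[] planeSizes(int width, int height) {
        int chromaW = (width + 1) / 2;
        int chromaH = (height + 1) / 2;
        return new int[] { width * height, chromaW * chromaH, chromaW * chromaH };
    }

    public static void main(String[] args) {
        // For 640x480: Y is 640x480, U and V are 320x240.
        System.out.println(java.util.Arrays.toString(strides(640)));
        System.out.println(java.util.Arrays.toString(planeSizes(640, 480)));
    }
}
```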


Example 2: YuvImageRenderer

import org.webrtc.VideoRenderer.I420Frame; // import the required package/class
private YuvImageRenderer(
    GLSurfaceView surface,
    int x, int y, int width, int height) {
  Log.v(TAG, "YuvImageRenderer.Create");
  this.surface = surface;
  frameToRenderQueue = new LinkedBlockingQueue<I420Frame>(1);
  // Create texture vertices.
  float xLeft = (x - 50) / 50.0f;
  float yTop = (50 - y) / 50.0f;
  float xRight = Math.min(1.0f, (x + width - 50) / 50.0f);
  float yBottom = Math.max(-1.0f, (50 - y - height) / 50.0f);
  float[] textureVerticesFloat = new float[] {
      xLeft, yTop,
      xLeft, yBottom,
      xRight, yTop,
      xRight, yBottom
  };
  textureVertices = directNativeFloatBuffer(textureVerticesFloat);
}
 
Developer: actorapp | Project: droidkit-webrtc | Source: VideoRendererGui.java


Example 3: queueFrame

import org.webrtc.VideoRenderer.I420Frame; // import the required package/class
/** Queue |frame| to be uploaded. */
public void queueFrame(final Endpoint stream, I420Frame frame) {
  // Paying for the copy of the YUV data here allows CSC and painting time
  // to get spent on the render thread instead of the UI thread.
  abortUnless(framePool.validateDimensions(frame), "Frame too large!");
  final I420Frame frameCopy = framePool.takeFrame(frame).copyFrom(frame);
  boolean needToScheduleRender;
  synchronized (framesToRender) {
    // A new render needs to be scheduled (via updateFrames()) iff there isn't
    // already a render scheduled, which is true iff framesToRender is empty.
    needToScheduleRender = framesToRender.isEmpty();
    I420Frame frameToDrop = framesToRender.put(stream, frameCopy);
    if (frameToDrop != null) {
      framePool.returnFrame(frameToDrop);
    }
  }
  if (needToScheduleRender) {
    queueEvent(new Runnable() {
        public void run() {
          updateFrames();
        }
      });
  }
}
 
Developer: kenneththorman | Project: appspotdemo-mono | Source: VideoStreamsView.java
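The invariant in queueFrame() above — a render pass is scheduled iff the pending map transitions from empty to non-empty, and a newer frame displaces an older undrawn one — can be isolated into a small generic helper. This is a sketch under assumed names, not code from the original project:

```java
import java.util.HashMap;
import java.util.Map;

public class CoalescingQueue<K, V> {
    private final Map<K, V> pending = new HashMap<K, V>();

    /**
     * Stores |frame| as the latest pending frame for |stream|.
     * Sets scheduleRender[0] to true iff the map was empty before,
     * i.e. iff the caller must schedule a render pass. Returns the
     * displaced frame (to be returned to the pool), or null.
     */
    public synchronized V put(K stream, V frame, boolean[] scheduleRender) {
        scheduleRender[0] = pending.isEmpty();
        return pending.put(stream, frame);
    }

    /** Removes and returns the pending frame for |stream|, or null. */
    public synchronized V take(K stream) {
        return pending.remove(stream);
    }

    public static void main(String[] args) {
        CoalescingQueue<String, String> q = new CoalescingQueue<String, String>();
        boolean[] schedule = new boolean[1];
        q.put("local", "frame1", schedule);                   // schedule[0] is true
        String dropped = q.put("local", "frame2", schedule);  // drops "frame1"
        System.out.println(schedule[0] + " " + dropped + " " + q.take("local"));
    }
}
```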


Example 4: updateFrames

import org.webrtc.VideoRenderer.I420Frame; // import the required package/class
private void updateFrames() {
  I420Frame localFrame = null;
  I420Frame remoteFrame = null;
  synchronized (framesToRender) {
    localFrame = framesToRender.remove(Endpoint.LOCAL);
    remoteFrame = framesToRender.remove(Endpoint.REMOTE);
  }
  if (localFrame != null) {
    texImage2D(localFrame, yuvTextures[0]);
    framePool.returnFrame(localFrame);
  }
  if (remoteFrame != null) {
    texImage2D(remoteFrame, yuvTextures[1]);
    framePool.returnFrame(remoteFrame);
  }
  abortUnless(localFrame != null || remoteFrame != null,
              "Nothing to render!");
  requestRender();
}
 
Developer: kenneththorman | Project: appspotdemo-mono | Source: VideoStreamsView.java


Example 5: renderFrame

import org.webrtc.VideoRenderer.I420Frame; // import the required package/class
@Override
public void renderFrame(I420Frame frame) {
  synchronized (frameLock) {
    ++framesRendered;
    width = frame.rotatedWidth();
    height = frame.rotatedHeight();
    frameLock.notify();
  }
  VideoRenderer.renderFrameDone(frame);
}
 
Developer: lgyjg | Project: AndroidRTC | Source: CameraVideoCapturerTestFixtures.java
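Example 5's counting pattern (increment under a lock, notify waiters, then release the frame) can be exercised without the WebRTC stack. A self-contained sketch with hypothetical names, not taken from the test fixture:

```java
public class FrameCounter {
    private final Object frameLock = new Object();
    private int framesRendered = 0;

    // Called from the renderer thread for each delivered frame.
    public void onFrame() {
        synchronized (frameLock) {
            ++framesRendered;
            frameLock.notifyAll();
        }
    }

    // Blocks until at least |n| frames have been counted.
    public void waitForFrames(int n) throws InterruptedException {
        synchronized (frameLock) {
            while (framesRendered < n) {
                frameLock.wait();
            }
        }
    }

    public static void main(String[] args) throws InterruptedException {
        final FrameCounter counter = new FrameCounter();
        Thread renderer = new Thread(new Runnable() {
            public void run() {
                for (int i = 0; i < 3; ++i) {
                    counter.onFrame();
                }
            }
        });
        renderer.start();
        counter.waitForFrames(3);
        renderer.join();
        System.out.println("rendered 3 frames");
    }
}
```

The `while` loop around `wait()` (rather than a bare `if`) guards against spurious wakeups, which is why Example 6's waitForPendingFrames() is written the same way.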


Example 6: waitForPendingFrames

import org.webrtc.VideoRenderer.I420Frame; // import the required package/class
public List<I420Frame> waitForPendingFrames() throws InterruptedException {
  Logging.d(TAG, "Waiting for pending frames");
  synchronized (pendingFrames) {
    while (pendingFrames.isEmpty()) {
      pendingFrames.wait();
    }
    return new ArrayList<I420Frame>(pendingFrames);
  }
}
 
Developer: lgyjg | Project: AndroidRTC | Source: CameraVideoCapturerTestFixtures.java


Example 7: returnBufferLateEndToEnd

import org.webrtc.VideoRenderer.I420Frame; // import the required package/class
public void returnBufferLateEndToEnd() throws InterruptedException {
  final CapturerInstance capturerInstance = createCapturer(false /* initialize */);
  final VideoTrackWithRenderer videoTrackWithRenderer =
      createVideoTrackWithFakeAsyncRenderer(capturerInstance.capturer);
  // Wait for at least one frame that has not been returned.
  assertFalse(videoTrackWithRenderer.fakeAsyncRenderer.waitForPendingFrames().isEmpty());

  capturerInstance.capturer.stopCapture();

  // Dispose everything.
  disposeCapturer(capturerInstance);
  disposeVideoTrackWithRenderer(videoTrackWithRenderer);

  // Return the frame(s), on a different thread out of spite.
  final List<I420Frame> pendingFrames =
      videoTrackWithRenderer.fakeAsyncRenderer.waitForPendingFrames();
  final Thread returnThread = new Thread(new Runnable() {
    @Override
    public void run() {
      for (I420Frame frame : pendingFrames) {
        VideoRenderer.renderFrameDone(frame);
      }
    }
  });
  returnThread.start();
  returnThread.join();
}
 
Developer: lgyjg | Project: AndroidRTC | Source: CameraVideoCapturerTestFixtures.java


Example 8: queueFrame

import org.webrtc.VideoRenderer.I420Frame; // import the required package/class
/** Queue |frame| to be uploaded. */
public void queueFrame(final int stream, I420Frame frame) {
  // Paying for the copy of the YUV data here allows CSC and painting time
  // to get spent on the render thread instead of the UI thread.
  abortUnless(FramePool.validateDimensions(frame), "Frame too large!");
  final I420Frame frameCopy = framePool.takeFrame(frame).copyFrom(frame);
  queueEvent(new Runnable() {
    public void run() {
      updateFrame(stream, frameCopy);
    }
  });
}
 
Developer: KashaMalaga | Project: UMA-AndroidWebRTC | Source: VideoStreamsView.java


Example 9: updateFrame

import org.webrtc.VideoRenderer.I420Frame; // import the required package/class
private void updateFrame(int stream, I420Frame frame) {
  int[] textures = yuvTextures[stream];
  texImage2D(frame, textures);
  framePool.returnFrame(frame);

  requestRender();
}
 
Developer: KashaMalaga | Project: UMA-AndroidWebRTC | Source: VideoStreamsView.java


Example 10: texImage2D

import org.webrtc.VideoRenderer.I420Frame; // import the required package/class
private void texImage2D(I420Frame frame, int[] textures) {
  for (int i = 0; i < 3; ++i) {
    ByteBuffer plane = frame.yuvPlanes[i];
    GLES20.glActiveTexture(GLES20.GL_TEXTURE0 + i);
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textures[i]);
    int w = i == 0 ? frame.width : frame.width / 2;
    int h = i == 0 ? frame.height : frame.height / 2;
    abortUnless(w == frame.yuvStrides[i], frame.yuvStrides[i] + "!=" + w);
    GLES20.glTexImage2D(
                        GLES20.GL_TEXTURE_2D, 0, GLES20.GL_LUMINANCE, w, h, 0,
                        GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, plane);
  }
  checkNoGLES2Error();
}
 
Developer: KashaMalaga | Project: UMA-AndroidWebRTC | Source: VideoStreamsView.java


Example 11: returnFrame

import org.webrtc.VideoRenderer.I420Frame; // import the required package/class
public void returnFrame(I420Frame frame) {
    long desc = summarizeFrameDimensions(frame);
    synchronized (availableFrames) {
        LinkedList<I420Frame> frames = availableFrames.get(desc);
        if (frames == null) {
            throw new IllegalArgumentException("Unexpected frame dimensions");
        }
        frames.add(frame);
    }
}
 
Developer: KashaMalaga | Project: UMA-AndroidWebRTC | Source: FramePool.java


Example 12: validateDimensions

import org.webrtc.VideoRenderer.I420Frame; // import the required package/class
/** Validate that |frame| can be managed by the pool. */
public static boolean validateDimensions(I420Frame frame) {
    return frame.width < MAX_DIMENSION && frame.height < MAX_DIMENSION &&
            frame.yuvStrides[0] < MAX_DIMENSION &&
            frame.yuvStrides[1] < MAX_DIMENSION &&
            frame.yuvStrides[2] < MAX_DIMENSION;
}
 
Developer: KashaMalaga | Project: UMA-AndroidWebRTC | Source: FramePool.java


Example 13: summarizeFrameDimensions

import org.webrtc.VideoRenderer.I420Frame; // import the required package/class
private static long summarizeFrameDimensions(I420Frame frame) {
    long ret = frame.width;
    ret = ret * MAX_DIMENSION + frame.height;
    ret = ret * MAX_DIMENSION + frame.yuvStrides[0];
    ret = ret * MAX_DIMENSION + frame.yuvStrides[1];
    ret = ret * MAX_DIMENSION + frame.yuvStrides[2];
    return ret;
}
 
Developer: KashaMalaga | Project: UMA-AndroidWebRTC | Source: FramePool.java
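summarizeFrameDimensions() packs five fields into one long in base MAX_DIMENSION, so the key is collision-free (and even decodable) as long as validateDimensions() has guaranteed every field is strictly below MAX_DIMENSION. The sketch below demonstrates this round trip; the class name and the MAX_DIMENSION value are assumptions for illustration, since the real constant lives in FramePool:

```java
public class FrameKey {
    static final long MAX_DIMENSION = 4096; // assumed value for illustration

    // Mirrors summarizeFrameDimensions(): base-MAX_DIMENSION packing.
    static long encode(int w, int h, int s0, int s1, int s2) {
        long ret = w;
        ret = ret * MAX_DIMENSION + h;
        ret = ret * MAX_DIMENSION + s0;
        ret = ret * MAX_DIMENSION + s1;
        ret = ret * MAX_DIMENSION + s2;
        return ret;
    }

    // Inverse: recover the five fields from the packed key.
    static int[] decode(long key) {
        int[] out = new int[5];
        for (int i = 4; i >= 0; --i) {
            out[i] = (int) (key % MAX_DIMENSION);
            key /= MAX_DIMENSION;
        }
        return out;
    }

    public static void main(String[] args) {
        long key = encode(640, 480, 640, 320, 320);
        System.out.println(java.util.Arrays.toString(decode(key)));
    }
}
```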


Example 14: YuvImageRenderer

import org.webrtc.VideoRenderer.I420Frame; // import the required package/class
private YuvImageRenderer(
    GLSurfaceView surface, int id,
    int x, int y, int width, int height,
    ScalingType scalingType) {
  Log.d(TAG, "YuvImageRenderer.Create id: " + id);
  this.surface = surface;
  this.id = id;
  this.scalingType = scalingType;
  frameToRenderQueue = new LinkedBlockingQueue<I420Frame>(1);
  // Create texture vertices.
  texLeft = (x - 50) / 50.0f;
  texTop = (50 - y) / 50.0f;
  texRight = Math.min(1.0f, (x + width - 50) / 50.0f);
  texBottom = Math.max(-1.0f, (50 - y - height) / 50.0f);
  float[] textureVerticesFloat = new float[] {
      texLeft, texTop,
      texLeft, texBottom,
      texRight, texTop,
      texRight, texBottom
  };
  textureVertices = directNativeFloatBuffer(textureVerticesFloat);
  // Create texture UV coordinates.
  float[] textureCoordinatesFloat = new float[] {
      0, 0, 0, 1, 1, 0, 1, 1
  };
  textureCoords = directNativeFloatBuffer(textureCoordinatesFloat);
  updateTextureProperties = false;
}
 
Developer: jingcmu | Project: MediaCodecTest | Source: VideoRendererGui.java


Example 15: queueFrame

import org.webrtc.VideoRenderer.I420Frame; // import the required package/class
/** Queue |frame| to be uploaded. */
public void queueFrame(final String stream, I420Frame frame) {
  // Paying for the copy of the YUV data here allows CSC and painting time
  // to get spent on the render thread instead of the UI thread.
  abortUnless(FramePool.validateDimensions(frame), "Frame too large!");
  synchronized (frameDescriptions) {
    FrameDescription desc = frameDescriptions.get(stream);
    if (desc != null && desc.bufferIndex != -1) {
      I420Frame frameToDrop = desc.frameToRender;
      desc.frameToRender = framePool.takeFrame(frame).copyFrom(frame);
      if (frameToDrop != null) {
        framePool.returnFrame(frameToDrop);
      }
    }
  }
  // Throttle: request a render (via updateFrames()) at most once per
  // MIN_NANOS_BETWEEN_FRAMES, and only if one is not already pending.
  long dt = System.nanoTime() - mLastRendered;
  if (dt > MIN_NANOS_BETWEEN_FRAMES
      && mRenderRequested.compareAndSet(false, true)) {
    queueEvent(new Runnable() {
      public void run() {
        updateFrames();
      }
    });
  }
}
 
Developer: K-GmbH | Project: licodeAndroidClient | Source: VideoStreamsView.java


Example 16: texImage2D

import org.webrtc.VideoRenderer.I420Frame; // import the required package/class
private void texImage2D(I420Frame frame, int[] textures) {
	for (int i = 0; i < 3; ++i) {
		ByteBuffer plane = frame.yuvPlanes[i];
		GLES20.glActiveTexture(GLES20.GL_TEXTURE0 + i);
		GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textures[i]);
		int w = i == 0 ? frame.width : frame.width / 2;
		int h = i == 0 ? frame.height : frame.height / 2;
		abortUnless(w == frame.yuvStrides[i], frame.yuvStrides[i] + "!="
				+ w);
		GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_LUMINANCE,
				w, h, 0, GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE,
				plane);
	}
	checkNoGLES2Error();
}
 
Developer: K-GmbH | Project: licodeAndroidClient | Source: VideoStreamsView.java


Example 17: returnFrame

import org.webrtc.VideoRenderer.I420Frame; // import the required package/class
public void returnFrame(I420Frame frame) {
  long desc = summarizeFrameDimensions(frame);
  synchronized (availableFrames) {
    LinkedList<I420Frame> frames = availableFrames.get(desc);
    if (frames == null) {
      throw new IllegalArgumentException("Unexpected frame dimensions");
    }
    frames.add(frame);
  }
}
 
Developer: K-GmbH | Project: licodeAndroidClient | Source: FramePool.java


Example 18: validateDimensions

import org.webrtc.VideoRenderer.I420Frame; // import the required package/class
/** Validate that |frame| can be managed by the pool. */
public static boolean validateDimensions(I420Frame frame) {
  return frame.width < MAX_DIMENSION && frame.height < MAX_DIMENSION &&
      frame.yuvStrides[0] < MAX_DIMENSION &&
      frame.yuvStrides[1] < MAX_DIMENSION &&
      frame.yuvStrides[2] < MAX_DIMENSION;
}
 
Developer: K-GmbH | Project: licodeAndroidClient | Source: FramePool.java


Example 19: summarizeFrameDimensions

import org.webrtc.VideoRenderer.I420Frame; // import the required package/class
private static long summarizeFrameDimensions(I420Frame frame) {
  long ret = frame.width;
  ret = ret * MAX_DIMENSION + frame.height;
  ret = ret * MAX_DIMENSION + frame.yuvStrides[0];
  ret = ret * MAX_DIMENSION + frame.yuvStrides[1];
  ret = ret * MAX_DIMENSION + frame.yuvStrides[2];
  return ret;
}
 
Developer: K-GmbH | Project: licodeAndroidClient | Source: FramePool.java


Example 20: setSize

import org.webrtc.VideoRenderer.I420Frame; // import the required package/class
@Override
public void setSize(final int width, final int height) {
  Log.v(TAG, "YuvImageRenderer.setSize: " + width + " x " + height);
  int[] strides = { width, width / 2, width / 2  };
  // Frame re-allocation needs to be synchronized with copying the
  // frame to textures in draw() to avoid re-allocating the frame
  // while it is being copied.
  synchronized (frameToRenderQueue) {
    // Clear rendering queue
    frameToRenderQueue.poll();
    // Re-allocate / allocate the frame
    frameToRender = new I420Frame(width, height, strides, null);
  }
}
 
Developer: actorapp | Project: droidkit-webrtc | Source: VideoRendererGui.java



Note: the org.webrtc.VideoRenderer.I420Frame examples in this article were collected from open-source projects and documentation platforms such as GitHub and MSDocs. Copyright of the code remains with the original authors; consult each project's license before redistributing or reusing it. Do not reproduce this article without permission.

