
Java VideoCapturerAndroid Class Code Examples


This article collects typical usage examples of the Java class org.webrtc.VideoCapturerAndroid. If you are wondering what VideoCapturerAndroid is for or how to use it, the curated examples below may help.



The VideoCapturerAndroid class belongs to the org.webrtc package. Twelve code examples of the class are shown below, sorted by popularity by default. You can upvote the examples you find useful; your ratings help the system recommend better Java code examples.
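Before the individual examples, here is a minimal sketch of the capture pipeline that most of them share: pick a camera device name, create the capturer, wrap it in a video source and track, and attach a renderer. It targets the legacy pre-Camera1Enumerator libjingle API used throughout these examples and will not run without the WebRTC Android library; the `factory` and `localRender` parameters are assumed to be created elsewhere, and the track ID `"ARDAMSv0"` is an illustrative placeholder.

```java
// Minimal sketch of the typical VideoCapturerAndroid pipeline
// (legacy org.webrtc API as used in the examples below).
import org.webrtc.MediaConstraints;
import org.webrtc.PeerConnectionFactory;
import org.webrtc.VideoCapturerAndroid;
import org.webrtc.VideoRenderer;
import org.webrtc.VideoSource;
import org.webrtc.VideoTrack;

class CaptureSketch {
    VideoTrack createLocalVideoTrack(PeerConnectionFactory factory,
                                     VideoRenderer.Callbacks localRender) {
        // Prefer the front camera; create() returns null if the name is invalid.
        String deviceName = VideoCapturerAndroid.getNameOfFrontFacingDevice();
        VideoCapturerAndroid capturer = VideoCapturerAndroid.create(deviceName, null);
        if (capturer == null) {
            throw new RuntimeException("Failed to open camera: " + deviceName);
        }
        // The capturer feeds a VideoSource, which backs a VideoTrack.
        VideoSource source = factory.createVideoSource(capturer, new MediaConstraints());
        VideoTrack track = factory.createVideoTrack("ARDAMSv0", source);
        track.setEnabled(true);
        track.addRenderer(new VideoRenderer(localRender));
        return track;
    }
}
```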

Example 1: getVideoCapturer

import org.webrtc.VideoCapturerAndroid; // import the required package/class
private VideoCapturer getVideoCapturer(CameraVideoCapturer.CameraEventsHandler eventsHandler) {
    String[] cameraFacing = {"front", "back"};
    int[] cameraIndex = {0, 1};
    int[] cameraOrientation = {0, 90, 180, 270};
    for (String facing : cameraFacing) {
        for (int index : cameraIndex) {
            for (int orientation : cameraOrientation) {
                String name = "Camera " + index + ", Facing " + facing +
                        ", Orientation " + orientation;
                VideoCapturer capturer = VideoCapturerAndroid.create(name, eventsHandler);
                if (capturer != null) {
                    Log.d("Using camera: ", name);
                    return capturer;
                }
            }
        }
    }
    throw new RuntimeException("Failed to open capturer");
}
 
Developer: vivek1794 | Project: webrtc-android-codelab | Lines: 20 | Source: MainActivity.java


Example 2: createCapturerVideoTrack

import org.webrtc.VideoCapturerAndroid; // import the required package/class
private VideoTrack createCapturerVideoTrack(VideoCapturerAndroid capturer) {
    videoSource = factory.createVideoSource(capturer, videoConstraints);
    localVideoTrack = factory.createVideoTrack(VIDEO_TRACK_ID, videoSource);
    localVideoTrack.setEnabled(renderVideo);
    localVideoTrack.addRenderer(new VideoRenderer(localRender));
    return localVideoTrack;
}
 
Developer: nubomedia-vtt | Project: webrtcpeer-android | Lines: 8 | Source: MediaResourceManager.java


Example 3: onViewClicked

import org.webrtc.VideoCapturerAndroid; // import the required package/class
@OnClick({R.id.btnCamera, R.id.btnVoice, R.id.btnChangeCam, R.id.btnEndCall})
    public void onViewClicked(View view) {
        switch (view.getId()) {
            case R.id.btnCamera:
                enableCam = !enableCam;
                btnCamera.setSelected(!enableCam);
                localVideoTrack.setEnabled(enableCam);
                break;
            case R.id.btnVoice:
                enableVoice = !enableVoice;
                btnVoice.setSelected(!enableVoice);
                localAudioTrack.setEnabled(enableVoice);
                break;
            case R.id.btnChangeCam:
                LogUtils.e("change cam");
                videoCapture.switchCamera(new VideoCapturerAndroid.CameraSwitchHandler() {
                    @Override
                    public void onCameraSwitchDone(final boolean b) {
                        LogUtils.e("is Front Camera: " + b);
                        VideoCallActivity.this.runOnUiThread(new Runnable() {
                            @Override
                            public void run() {
                                isFrontCam = b;
                                btnChangeCam.setSelected(!isFrontCam);
//                                VideoRendererGui.update(localRender, 72, 65, 25, 25, RendererCommon.ScalingType.SCALE_ASPECT_FIT, isFrontCam);
                            }
                        });

                    }

                    @Override
                    public void onCameraSwitchError(String s) {
                        LogUtils.e("onCameraSwitchError: ");
                    }
                });
                break;
            case R.id.btnEndCall:
                endCall();
                break;
        }
    }
 
Developer: vuatovuanang | Project: WebRTC-VideoCall-Anrdoid | Lines: 42 | Source: VideoCallActivity.java


Example 4: createPeerConnection

import org.webrtc.VideoCapturerAndroid; // import the required package/class
public void createPeerConnection(
			final EglBase.Context renderEGLContext,
			final VideoRenderer.Callbacks localRender,
			final VideoRenderer.Callbacks remoteRender,
			final PeerConnectionEvents events,
			final PeerConnectionParameters peerConnectionParameters) {
		this.peerConnectionParameters = peerConnectionParameters;
		this.events = events;
		videoCallEnabled = peerConnectionParameters.videoCallEnabled;
//
//		PeerConnectionFactory.initializeAndroidGlobals(, true, true,
//				false);
//		factory = new PeerConnectionFactory();

//		if (peerConnectionParameters == null) {
//			Log.e(TAG, "Creating peer connection without initializing factory.");
//			return;
//		}
		this.localRender = localRender;
		this.remoteRender = remoteRender;

		executor.execute(new Runnable() {
			@Override
			public void run() {
				createMediaConstraintsInternal();
//				createPeerConnectionInternal(renderEGLContext, iceServers);
				if(mediaStream == null) {
					mediaStream = factory.createLocalMediaStream("ARDAMS");
					if (videoCallEnabled) {
						String cameraDeviceName = CameraEnumerationAndroid.getDeviceName(0);
						String frontCameraDeviceName =
								CameraEnumerationAndroid.getNameOfFrontFacingDevice();
						if (numberOfCameras > 1 && frontCameraDeviceName != null) {
							cameraDeviceName = frontCameraDeviceName;
						}
						Log.d(TAG, "Opening camera: " + cameraDeviceName);
						videoCapturer = VideoCapturerAndroid.create(cameraDeviceName, null,
								peerConnectionParameters.captureToTexture ? renderEGLContext : null);
						if (videoCapturer == null) {
							reportError("Failed to open camera");
							return;
						}
						mediaStream.addTrack(createVideoTrack(videoCapturer));
					}

					mediaStream.addTrack(factory.createAudioTrack(
							AUDIO_TRACK_ID,
							factory.createAudioSource(audioConstraints)));
				}
				try {
					manager = new Manager(new URI(mHost));
					client = manager.socket("/");
				} catch (URISyntaxException e) {
					e.printStackTrace();
				}
				client
						.on(INIT_MESSAGE, messageHandler.onInitMessage)
						.on(TEXT_MESSAGE, messageHandler.onTextMessage)
//						.on(INVITE_MESSAGE, messageHandler.onInviteMessage)
//						.on(READY_MESSAGE, messageHandler.onReadyMessage)
//						.on(OFFER_MESSAGE, messageHandler.onOfferMessage)
//						.on(ANSWER_MESSAGE, messageHandler.onAnswerMessage)
//						.on(ICE_CANDIDATE_MESSAGE, messageHandler.onCandidateMessage)
						.on(RTC_MESSAGE, messageHandler.onRtcMessage)
						.on(LEAVE_MESSAGE, messageHandler.onLeaveMessage)
						.on(AVAILABLE_USERS_MESSAGE, messageHandler.onAvailablePeersMessage)
						.on(PRESENCE_MESSAGE, messageHandler.onPresenceMessage);
				client.connect();
			}
		});

	}
 
Developer: ardnezar | Project: webrtc-android | Lines: 73 | Source: PeerConnectionClient.java


Example 5: createVideoTrack

import org.webrtc.VideoCapturerAndroid; // import the required package/class
private VideoTrack createVideoTrack(VideoCapturerAndroid capturer) {
	videoSource = factory.createVideoSource(capturer, videoConstraints);

	localVideoTrack = factory.createVideoTrack(VIDEO_TRACK_ID, videoSource);
	localVideoTrack.setEnabled(renderVideo);
	localVideoTrack.addRenderer(new VideoRenderer(localRender));
	return localVideoTrack;
}
 
Developer: ardnezar | Project: webrtc-android | Lines: 9 | Source: PeerConnectionClient.java


Example 6: createVideoCapturer

import org.webrtc.VideoCapturerAndroid; // import the required package/class
/**
 * Creates an instance of VideoCapturerAndroid.
 * @return VideoCapturerAndroid
 */
private VideoCapturerAndroid createVideoCapturer() {
    switch (mOption.getVideoType()) {
        default:
        case NONE:
            return null;
        case CAMERA:
            return createCameraCapture();
        case EXTERNAL_RESOURCE:
            return createExternalResource();
    }
}
 
Developer: DeviceConnect | Project: DeviceConnect-Android | Lines: 16 | Source: MediaStream.java


Example 7: createVideoTrack

import org.webrtc.VideoCapturerAndroid; // import the required package/class
/**
 * Creates an instance of VideoTrack to be used with a VideoCapturerAndroid.
 * @param capturer Instance of VideoCapturerAndroid
 * @return VideoTrack
 */
private VideoTrack createVideoTrack(final VideoCapturerAndroid capturer) {
    mVideoRender = mOption.getRender();
    mVideoSource = mFactory.createVideoSource(capturer, mVideoConstraints);
    mVideoTrack = mFactory.createVideoTrack(VIDEO_TRACK_ID, mVideoSource);
    mVideoTrack.setEnabled(mEnableVideo);
    mVideoTrack.addRenderer(new VideoRenderer(mVideoRender));
    return mVideoTrack;
}
 
Developer: DeviceConnect | Project: DeviceConnect-Android | Lines: 14 | Source: MediaStream.java


Example 8: startCapture

import org.webrtc.VideoCapturerAndroid; // import the required package/class
@Override
public void startCapture(final int width, final int height, final int frameRate,
                         final Context applicationContext,
                         final VideoCapturerAndroid.CapturerObserver frameObserver) {
    if (DEBUG) {
        Log.i(TAG, "@@@ startCapture size:[" + width + ", " + height
                + "] frameRate:" + frameRate);
    }

    mSurfaceHelper.getSurfaceTexture().setDefaultBufferSize(width, height);
    mSurface = new Surface(mSurfaceHelper.getSurfaceTexture());

    mRequestWidth = width;
    mRequestHeight = height;
    mFrameObserver = frameObserver;

    if (mClient != null) {
        mClient.stop();
        mClient = null;
    }

    mClient = new MixedReplaceMediaClient(mUri);
    mClient.setOnMixedReplaceMediaListener(mOnMixedReplaceMediaListener);
    mClient.start();
}
 
Developer: DeviceConnect | Project: DeviceConnect-Android | Lines: 26 | Source: VideoCapturerExternalResource.java


Example 9: getVideoCapturer

import org.webrtc.VideoCapturerAndroid; // import the required package/class
private VideoCapturer getVideoCapturer() {
    String frontCameraDeviceName = CameraEnumerationAndroid.getDeviceName(0);
    return VideoCapturerAndroid.create(frontCameraDeviceName);
}
 
Developer: ardnezar | Project: webrtc-android | Lines: 5 | Source: WebRtcClient.java


Example 10: onCreate

import org.webrtc.VideoCapturerAndroid; // import the required package/class
@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);

    setContentView(R.layout.activity_main);

    AudioManager audioManager = (AudioManager) this.getSystemService(Context.AUDIO_SERVICE);
    audioManager.setMode(AudioManager.MODE_IN_COMMUNICATION);
    audioManager.setSpeakerphoneOn(true);

    PeerConnectionFactory.initializeAndroidGlobals(
            this,  // Context
            true,  // Audio Enabled
            true,  // Video Enabled
            true,  // Hardware Acceleration Enabled
            null); // Render EGL Context

    peerConnectionFactory = new PeerConnectionFactory();

    VideoCapturerAndroid vc = VideoCapturerAndroid.create(VideoCapturerAndroid.getNameOfFrontFacingDevice(), null);

    localVideoSource = peerConnectionFactory.createVideoSource(vc, new MediaConstraints());
    VideoTrack localVideoTrack = peerConnectionFactory.createVideoTrack(VIDEO_TRACK_ID, localVideoSource);
    localVideoTrack.setEnabled(true);

    AudioSource audioSource = peerConnectionFactory.createAudioSource(new MediaConstraints());
    AudioTrack localAudioTrack = peerConnectionFactory.createAudioTrack(AUDIO_TRACK_ID, audioSource);
    localAudioTrack.setEnabled(true);

    localMediaStream = peerConnectionFactory.createLocalMediaStream(LOCAL_STREAM_ID);
    localMediaStream.addTrack(localVideoTrack);
    localMediaStream.addTrack(localAudioTrack);

    GLSurfaceView videoView = (GLSurfaceView) findViewById(R.id.glview_call);

    VideoRendererGui.setView(videoView, null);
    try {
        otherPeerRenderer = VideoRendererGui.createGui(0, 0, 100, 100, VideoRendererGui.ScalingType.SCALE_ASPECT_FILL, true);
        VideoRenderer renderer = VideoRendererGui.createGui(50, 50, 50, 50, VideoRendererGui.ScalingType.SCALE_ASPECT_FILL, true);
        localVideoTrack.addRenderer(renderer);
    } catch (Exception e) {
        e.printStackTrace();
    }
}
 
Developer: Nitrillo | Project: krankygeek | Lines: 45 | Source: MainActivity.java


Example 11: createLocalMediaStream

import org.webrtc.VideoCapturerAndroid; // import the required package/class
void createLocalMediaStream(Object renderEGLContext,final VideoRenderer.Callbacks localRender) {
    if (factory == null) {
        Log.e(TAG, "Peerconnection factory is not created");
        return;
    }
    this.localRender = localRender;
    if (videoCallEnabled) {
        factory.setVideoHwAccelerationOptions(renderEGLContext, renderEGLContext);
    }

    // Set default WebRTC tracing and INFO libjingle logging.
    // NOTE: this _must_ happen while |factory| is alive!
    Logging.enableTracing("logcat:", EnumSet.of(Logging.TraceLevel.TRACE_DEFAULT), Logging.Severity.LS_INFO);

    localMediaStream = factory.createLocalMediaStream("ARDAMS");

    // If video call is enabled and the device has camera(s)
    if (videoCallEnabled && numberOfCameras > 0) {
        String cameraDeviceName; // = CameraEnumerationAndroid.getDeviceName(0);
        String frontCameraDeviceName = CameraEnumerationAndroid.getNameOfFrontFacingDevice();
        String backCameraDeviceName = CameraEnumerationAndroid.getNameOfBackFacingDevice();

        // If current camera is set to front and the device has one
        if (currentCameraPosition==NBMCameraPosition.FRONT && frontCameraDeviceName!=null) {
            cameraDeviceName = frontCameraDeviceName;
        }
        // If current camera is set to back and the device has one
        else if (currentCameraPosition==NBMCameraPosition.BACK && backCameraDeviceName!=null) {
            cameraDeviceName = backCameraDeviceName;
        }
        // If current camera is set to any then we pick the first camera of the device, which
        // should be a back-facing camera according to libjingle API
        else {
            cameraDeviceName = CameraEnumerationAndroid.getDeviceName(0);
            currentCameraPosition = NBMCameraPosition.BACK;
        }

        Log.d(TAG, "Opening camera: " + cameraDeviceName);
        videoCapturer = VideoCapturerAndroid.create(cameraDeviceName, null);
        if (videoCapturer == null) {
            Log.d(TAG, "Error while opening camera");
            return;
        }
        localMediaStream.addTrack(createCapturerVideoTrack(videoCapturer));
    }

    // Create audio track
    localMediaStream.addTrack(factory.createAudioTrack(AUDIO_TRACK_ID, factory.createAudioSource(audioConstraints)));

    Log.d(TAG, "Local media stream created.");
}
 
Developer: nubomedia-vtt | Project: webrtcpeer-android | Lines: 52 | Source: MediaResourceManager.java


Example 12: switchCamera

import org.webrtc.VideoCapturerAndroid; // import the required package/class
@Override
public void switchCamera(VideoCapturerAndroid.CameraSwitchHandler cameraSwitchHandler) {
    if (DEBUG) {
        Log.e(TAG, "switchCamera:");
    }
}
 
Developer: DeviceConnect | Project: DeviceConnect-Android | Lines: 7 | Source: VideoCapturerExternalResource.java



Note: the org.webrtc.VideoCapturerAndroid examples in this article were collected from open-source projects hosted on GitHub/MSDocs and similar code-hosting platforms. Copyright of the code snippets remains with their original authors; consult each project's license before distributing or reusing them, and do not reproduce this article without permission.

