Hyphenate real-time calls come in two types: video calls and voice calls. The SDK provides simple APIs that let developers integrate real-time calling with minimal effort.
1. Import the libraries into your project:
Hyphenate.framework // contains the real-time audio/video library
AVFoundation.framework
2. Import the header file:
#import <Hyphenate/Hyphenate.h>
3. Configure the call options:
Before making audio or video calls, set the global call options. See the *EMCallOptions* header for the full list of properties.
EMCallOptions *options = [[EMClient sharedClient].callManager getCallOptions];
// Whether to send an offline message and push notification, and wait for a response, when the other party is offline
options.isSendPushIfOffline = NO;
// Video resolution: adaptive, 352 * 288, 640 * 480, or 1280 * 720
options.videoResolution = EMCallVideoResolutionAdaptive;
// Maximum video bitrate, range 50 < videoKbps < 5000; default 0, which means adaptive and is the recommended setting
options.maxVideoKbps = 0;
// Minimum video bitrate
options.minVideoKbps = 0;
// Whether to use a fixed video resolution; default NO
options.isFixedVideoResolution = NO;
[[EMClient sharedClient].callManager setCallOptions:options];
For a concrete implementation, see DemoCallManager and EMCallViewController in the Demo.
Call the start-call API to initiate a real-time voice or video call with an online user.
/*!
 *  Start a real-time call
 *
 *  @param aType             Call type
 *  @param aRemoteName       The user to call (you cannot call yourself)
 *  @param aExt              Call extension info, passed to the callee
 *  @param aCompletionBlock  Completion callback
 */
- (void)startCall:(EMCallType)aType
remoteName:(NSString *)aRemoteName
ext:(NSString *)aExt
completion:(void (^)(EMCallSession *aCallSession, EMError *aError))aCompletionBlock;
Sample code: start a video call
void (^completionBlock)(EMCallSession *, EMError *) = ^(EMCallSession *aCallSession, EMError *aError){
    // Check whether the call session was created successfully
    //TODO: code
};
[[EMClient sharedClient].callManager startCall:EMCallTypeVideo remoteName:aUsername ext:nil completion:^(EMCallSession *aCallSession, EMError *aError) {
completionBlock(aCallSession, aError);
}];
When an incoming call is received, call this API to accept it.
/*!
 *  Accept an incoming call
 *
 *  @param aCallId  Call ID
 *
 *  @result  Error
 */
- (EMError *)answerIncomingCall:(NSString *)aCallId;
// Usage:
// EMError *error = [[EMClient sharedClient].callManager answerIncomingCall:@"sessionId"];
Choose the end-call reason that fits the scenario.
For example, use EMCallEndReasonDecline to reject an incoming call, and EMCallEndReasonHangup to hang up actively.
typedef enum{
    EMCallEndReasonHangup = 0,    /*! The other party hung up */
    EMCallEndReasonNoResponse,    /*! The other party did not respond */
    EMCallEndReasonDecline,       /*! The other party declined the call */
    EMCallEndReasonBusy,          /*! The other party is busy */
    EMCallEndReasonFailed,        /*! The call failed */
    EMCallEndReasonUnsupported,   /*! The feature is not supported */
}EMCallEndReason;
/*!
 *  End the call
 *
 *  @param aCallId  Call ID
 *  @param aReason  End reason
 *
 *  @result  Error
 */
- (EMError *)endCall:(NSString *)aCallId
reason:(EMCallEndReason)aReason;
// Usage:
// [[EMClient sharedClient].callManager endCall:@"sessionId" reason:aReason];
The SDK provides *EMCallLocalView* for displaying the local video and *EMCallRemoteView* for displaying the remote party's video. It is recommended to initialize the EMCallRemoteView only after the video call has been accepted.
// Precondition: an EMCallSession *callSession exists
CGFloat width = 80;
CGFloat height = self.view.frame.size.height / self.view.frame.size.width * width;
callSession.localVideoView = [[EMCallLocalView alloc] initWithFrame:CGRectMake(self.view.frame.size.width - 90, CGRectGetMaxY(_statusLabel.frame), width, height)];
[self.view addSubview:callSession.localVideoView];
// After the video call has been accepted
callSession.remoteVideoView = [[EMCallRemoteView alloc] initWithFrame:CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height)];
// Set the video scale mode
callSession.remoteVideoView.scaleMode = EMCallViewScaleModeAspectFill;
[self.view addSubview:callSession.remoteVideoView];
The recording and screenshot interfaces in EMCallSession have been deprecated; this functionality has been split out into a separate static library as an audio/video plugin. Both recording and screenshots must be invoked only after the call is already in progress.
Click to download the static library
1. Header file
EMAVPluginRecorder.h
2. Libraries
libbz2.tbd
libHyphenatePluginRecorder.a
libffmpeg-ios-full.a
3. Usage
[EMVideoRecorderPlugin initGlobalConfig] must be called before starting the call; it is a global method and only needs to be called once.
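Sample code: the one-time global initialization described above (call it once, for example during app startup, before any call is started):
// Global, one-time configuration of the recorder plugin
[EMVideoRecorderPlugin initGlobalConfig];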
Sample code: start recording (call this only after the call has started)
NSString *recordPath = NSHomeDirectory();
recordPath = [NSString stringWithFormat:@"%@/Library/appdata/chatbuffer", recordPath];
NSFileManager *fm = [NSFileManager defaultManager];
if (![fm fileExistsAtPath:recordPath]) {
    [fm createDirectoryAtPath:recordPath withIntermediateDirectories:YES attributes:nil error:nil];
}
[[EMVideoRecorderPlugin sharedInstance] startVideoRecordingToFilePath:recordPath error:nil];
Sample code: stop recording
EMError *error = nil;
[[EMVideoRecorderPlugin sharedInstance] stopVideoRecording:&error];
if (!error) {
    NSLog(@"Recording succeeded");
} else {
    NSLog(@"Recording failed");
}
Sample code: take a screenshot (call this only after the call has started)
NSString *imgPath = NSHomeDirectory();
imgPath = [NSString stringWithFormat:@"%@/Library/appdata/chatbuffer/img.jpeg", imgPath];
[[EMVideoRecorderPlugin sharedInstance] screenCaptureToFilePath:imgPath error:nil];
APIs for pausing and resuming data transmission during a real-time call:
/*!
 *  Pause voice data transmission
 *
 *  @result  Error
 */
- (EMError *)pauseVoice;
/*!
 *  Resume voice data transmission
 *
 *  @result  Error
 */
- (EMError *)resumeVoice;
/*!
 *  Pause video data transmission
 *
 *  @result  Error
 */
- (EMError *)pauseVideo;
/*!
 *  Resume video data transmission
 *
 *  @result  Error
 */
- (EMError *)resumeVideo;
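Sample code (a minimal sketch, assuming a connected call): pausing the streams corresponds to muting the microphone and turning off the camera, and resuming restores them. Whether these methods are invoked on the call manager or on the call session depends on the SDK version; the call-manager form is assumed here.
// Mute the microphone and stop sending video
EMError *error = [[EMClient sharedClient].callManager pauseVoice];
error = [[EMClient sharedClient].callManager pauseVideo];
// Restore both streams
error = [[EMClient sharedClient].callManager resumeVoice];
error = [[EMClient sharedClient].callManager resumeVideo];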
APIs for switching between the front and rear cameras during a real-time call:
#pragma mark - Camera
/*!
 *  Set whether to use the front or the rear camera; the front camera is used by default
 *
 *  @param aIsFrontCamera  Whether to use the front camera: YES for front, NO for rear
 */
- (void)switchCameraPosition:(BOOL)aIsFrontCamera;
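Sample code (a minimal sketch; whether this method lives on the call manager or the call session depends on the SDK version, the call-manager form is assumed here):
// Switch to the rear camera during a call
[[EMClient sharedClient].callManager switchCameraPosition:NO];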
Register the real-time call delegate:
// Register the call delegate
[[EMClient sharedClient].callManager addDelegate:self delegateQueue:nil];
// Remove the call delegate
[[EMClient sharedClient].callManager removeDelegate:self];
The related callbacks are described below:
/*!
 *  When user A calls user B, user B receives this callback
 *
 *  @param aSession  Call session instance
 */
- (void)callDidReceive:(EMCallSession *)aSession;
/*!
 *  The call channel has been established; both user A and user B receive this callback
 *
 *  @param aSession  Call session instance
 */
- (void)callDidConnect:(EMCallSession *)aSession;
/*!
 *  When user B accepts the call started by user A, user A receives this callback
 *
 *  @param aSession  Call session instance
 */
- (void)callDidAccept:(EMCallSession *)aSession;
/*!
 *  1. When user A or user B ends the call, the other party receives this callback
 *  2. If an error occurs during the call, both parties receive this callback
 *
 *  @param aSession  Call session instance
 *  @param aReason   End reason
 *  @param aError    Error
 */
- (void)callDidEnd:(EMCallSession *)aSession
reason:(EMCallEndReason)aReason
error:(EMError *)aError;
/*!
 *  While user A and user B are in a call, when user A pauses or resumes the data stream, user B receives this callback
 *
 *  @param aSession  Call session instance
 *  @param aType     Type of change
 */
- (void)callStateDidChange:(EMCallSession *)aSession
type:(EMCallStreamingStatus)aType;
The current network status of the call is reported to the app through a callback.
typedef enum{
    EMCallNetworkStatusNormal = 0,  /*! Normal */
    EMCallNetworkStatusUnstable,    /*! Unstable */
    EMCallNetworkStatusNoData,      /*! No data */
}EMCallNetworkStatus;
/*!
 *  While user A and user B are in a call, when user A's network becomes unstable, user A receives this callback
 *
 *  @param aSession  Call session instance
 *  @param aStatus   Current status
 */
- (void)callNetworkDidChange:(EMCallSession *)aSession
                      status:(EMCallNetworkStatus)aStatus;
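The delegate methods above can be implemented as in the following sketch (a minimal example; the auto-accept behavior and the logging are illustrative choices, not part of the SDK):
#pragma mark - EMCallManagerDelegate
- (void)callDidReceive:(EMCallSession *)aSession
{
    // An incoming call arrived; show an incoming-call UI, or accept directly as shown here
    EMError *error = [[EMClient sharedClient].callManager answerIncomingCall:aSession.callId];
    if (error) {
        NSLog(@"Failed to accept the call: %@", error);
    }
}
- (void)callDidEnd:(EMCallSession *)aSession
            reason:(EMCallEndReason)aReason
             error:(EMError *)aError
{
    // Tear down the call UI and report why the call ended
    NSLog(@"Call ended, reason: %d, error: %@", (int)aReason, aError);
}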
To send an offline message and push notification when the callee is offline, configure the options:
EMCallOptions *options = [[EMClient sharedClient].callManager getCallOptions];
// When the other party is offline, send an offline message and push notification and wait for a response
options.isSendPushIfOffline = YES;
[[EMClient sharedClient].callManager setCallOptions:options];
Register the delegate:
[[EMClient sharedClient].callManager setBuilderDelegate:self];
Handle the callback:
- (void)callRemoteOffline:(NSString *)aRemoteName
{
NSString *text = [[EMClient sharedClient].callManager getCallOptions].offlineMessageText;
EMTextMessageBody *body = [[EMTextMessageBody alloc] initWithText:text];
NSString *fromStr = [EMClient sharedClient].currentUsername;
EMMessage *message = [[EMMessage alloc] initWithConversationID:aRemoteName from:fromStr to:aRemoteName body:body ext:@{@"em_apns_ext":@{@"em_push_title":text}}];
message.chatType = EMChatTypeChat;
[[EMClient sharedClient].chatManager sendMessage:message progress:nil completion:nil];
}
// Before a 1v1 custom-video call, you must set EMCallOptions.enableCustomizeVideoData = YES
EMCallOptions *options = [[EMClient sharedClient].callManager getCallOptions];
options.enableCustomizeVideoData = YES;
[[EMClient sharedClient].callManager setCallOptions:options];
[[EMClient sharedClient].callManager startCall:aType remoteName:aUsername ext:@"123" completion:^(EMCallSession *aCallSession, EMError *aError) {
    completionBlock(aCallSession, aError);
}];
// Before a default 1v1 video call, you must set EMCallOptions.enableCustomizeVideoData = NO
EMCallOptions *options = [[EMClient sharedClient].callManager getCallOptions];
options.enableCustomizeVideoData = NO;
[[EMClient sharedClient].callManager setCallOptions:options];
[[EMClient sharedClient].callManager startCall:aType remoteName:aUsername ext:@"123" completion:^(EMCallSession *aCallSession, EMError *aError) {
    completionBlock(aCallSession, aError);
}];
After setting EMCallOptions.enableCustomizeVideoData = YES, you must also set EMCallSession.localVideoView.previewDirectly = NO, and you must supply the camera data yourself.
In the Demo, the relevant code is preceded by the comment "3.3.9 new 自定义视频数据" (custom video data), which can be used as a reference.
/*!
 *  Input custom local video data
 *
 *  @param aSampleBuffer     Video sample buffer
 *  @param aCallId           The 1v1 call session ID, i.e. [EMCallSession callId]
 *  @param aFormat           Video format
 *  @param aRotation         Rotation angle, 0-360, default 0
 *  @param aCompletionBlock  Completion callback
 */
- (void)inputVideoSampleBuffer:(CMSampleBufferRef)aSampleBuffer
callId:(NSString *)aCallId
format:(EMCallVideoFormat)aFormat
rotation:(int)aRotation
completion:(void (^)(EMError *aError))aCompletionBlock;
/*!
 *  Input custom local video data
 *
 *  @param aPixelBuffer      Video pixel buffer
 *  @param aCallId           The 1v1 call session ID, i.e. [EMCallSession callId]
 *  @param aFormat           Video format
 *  @param aRotation         Rotation angle, 0-360, default 0
 *  @param aCompletionBlock  Completion callback
 */
- (void)inputVideoPixelBuffer:(CVPixelBufferRef)aPixelBuffer
callId:(NSString *)aCallId
format:(EMCallVideoFormat)aFormat
rotation:(int)aRotation
completion:(void (^)(EMError *aError))aCompletionBlock;
/*!
 *  Input custom local video data
 *
 *  @param aData             Video data
 *  @param aCallId           The 1v1 call session ID, i.e. [EMCallSession callId]
 *  @param aWidth            Width in pixels
 *  @param aHeight           Height in pixels
 *  @param aFormat           Video format
 *  @param aRotation         Rotation angle, 0-360, default 0
 *  @param aCompletionBlock  Completion callback
 */
- (void)inputVideoData:(NSData *)aData
callId:(NSString *)aCallId
widthInPixels:(size_t)aWidth
heightInPixels:(size_t)aHeight
format:(EMCallVideoFormat)aFormat
rotation:(int)aRotation
completion:(void (^)(EMError *aError))aCompletionBlock;
See the Demo for the complete sample code:
#pragma mark AVCaptureVideoDataOutputSampleBufferDelegate
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    if (!self.callSession || self.videoModel == VIDEO_INPUT_MODE_NONE) {
        return;
    }
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (imageBuffer == NULL) {
        return;
    }
    // Lock for writing (flags 0), because the UV plane is modified below;
    // kCVPixelBufferLock_ReadOnly would be incorrect here
    CVOptionFlags lockFlags = 0;
    CVReturn ret = CVPixelBufferLockBaseAddress(imageBuffer, lockFlags);
    if (ret != kCVReturnSuccess) {
        return;
    }
    static size_t const kYPlaneIndex = 0;
    static size_t const kUVPlaneIndex = 1;
    uint8_t *yPlaneAddress = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(imageBuffer, kYPlaneIndex);
    size_t yPlaneHeight = CVPixelBufferGetHeightOfPlane(imageBuffer, kYPlaneIndex);
    size_t yPlaneWidth = CVPixelBufferGetWidthOfPlane(imageBuffer, kYPlaneIndex);
    size_t yPlaneBytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(imageBuffer, kYPlaneIndex);
    size_t uvPlaneHeight = CVPixelBufferGetHeightOfPlane(imageBuffer, kUVPlaneIndex);
    size_t uvPlaneBytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(imageBuffer, kUVPlaneIndex);
    size_t frameSize = yPlaneBytesPerRow * yPlaneHeight + uvPlaneBytesPerRow * uvPlaneHeight;
    // Overwrite the UV plane with 0x7F so the frame is sent in grayscale (demo effect)
    uint8_t *uvPlaneAddress = yPlaneAddress + yPlaneBytesPerRow * yPlaneHeight;
    memset(uvPlaneAddress, 0x7F, uvPlaneBytesPerRow * uvPlaneHeight);
    if (self.videoModel == VIDEO_INPUT_MODE_DATA) {
        [[EMClient sharedClient].callManager inputVideoData:[NSData dataWithBytes:yPlaneAddress length:frameSize] callId:self.callSession.callId widthInPixels:yPlaneWidth heightInPixels:yPlaneHeight format:EMCallVideoFormatNV12 rotation:0 completion:nil];
    }
    CVPixelBufferUnlockBaseAddress(imageBuffer, lockFlags);
    if (self.videoModel == VIDEO_INPUT_MODE_SAMPLE_BUFFER) {
        [[EMClient sharedClient].callManager inputVideoSampleBuffer:sampleBuffer callId:self.callSession.callId format:EMCallVideoFormatNV12 rotation:0 completion:nil];
    } else if (self.videoModel == VIDEO_INPUT_MODE_PIXEL_BUFFER) {
        [[EMClient sharedClient].callManager inputVideoPixelBuffer:imageBuffer callId:self.callSession.callId format:EMCallVideoFormatNV12 rotation:0 completion:nil];
    }
}