Advanced iOS Development: Custom Video Recording with AVFoundation


The stock video-recording UI rarely satisfies designers or project managers, so a custom recording interface becomes essential. This article walks through building your own video-recording screen.

Demo screenshot

Overview

Custom video recording relies mainly on the AVFoundation and CoreMedia frameworks, covering video input, output, and file writing. The classes we will use are listed below:

AVCaptureSession, AVCaptureVideoPreviewLayer, AVCaptureDeviceInput, AVCaptureConnection, AVCaptureVideoDataOutput, AVCaptureAudioDataOutput, AVAssetWriter, AVAssetWriterInput

Each class and its implementation are described in detail below.

AVCaptureSession

AVCaptureSession is the central hub of AVFoundation's capture classes, so we start here. For video capture, a client instantiates an AVCaptureSession and adds the appropriate inputs (AVCaptureInput subclasses such as AVCaptureDeviceInput) and outputs (such as AVCaptureMovieFileOutput). Calling [AVCaptureSession startRunning] starts the flow of data from the inputs to the outputs, and [AVCaptureSession stopRunning] stops it. Clients can tune the recording quality level or output bit rate through the sessionPreset property.

//the capture session
- (AVCaptureSession *)recordSession {
    if (_recordSession == nil) {
        _recordSession = [[AVCaptureSession alloc] init];
        //add the back-camera input
        if ([_recordSession canAddInput:self.backCameraInput]) {
            [_recordSession addInput:self.backCameraInput];
        }
        //add the microphone input
        if ([_recordSession canAddInput:self.audioMicInput]) {
            [_recordSession addInput:self.audioMicInput];
        }
        //add the video data output
        if ([_recordSession canAddOutput:self.videoOutput]) {
            [_recordSession addOutput:self.videoOutput];
            //read the actual output resolution negotiated for the back camera
            NSDictionary *actual = self.videoOutput.videoSettings;
            _cx = [[actual objectForKey:@"Height"] integerValue];
            _cy = [[actual objectForKey:@"Width"] integerValue];
        }
        //add the audio data output
        if ([_recordSession canAddOutput:self.audioOutput]) {
            [_recordSession addOutput:self.audioOutput];
        }
        //set the recording orientation
        self.videoConnection.videoOrientation = AVCaptureVideoOrientationPortrait;
    }
    return _recordSession;
}
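
The getter above only assembles the session; the sessionPreset, startRunning, and stopRunning calls mentioned earlier are issued elsewhere. A minimal sketch of what that could look like (the helper names startCaptureSession/stopCaptureSession are illustrative and not taken from the demo):

//start the session (illustrative helper, not part of the original demo)
- (void)startCaptureSession {
    //sessionPreset controls the quality level / bit rate of the output
    if ([self.recordSession canSetSessionPreset:AVCaptureSessionPresetHigh]) {
        self.recordSession.sessionPreset = AVCaptureSessionPresetHigh;
    }
    if (!self.recordSession.isRunning) {
        //starts the flow of data from the inputs to the outputs
        [self.recordSession startRunning];
    }
}

//stop the session (illustrative helper)
- (void)stopCaptureSession {
    if (self.recordSession.isRunning) {
        //stops the flow of data
        [self.recordSession stopRunning];
    }
}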

AVCaptureDevice

Each instance of AVCaptureDevice represents a physical device, such as a camera or a microphone. AVCaptureDevice instances cannot be created directly; the available devices are obtained with the class methods devicesWithMediaType: and defaultDeviceWithMediaType:, and a device can provide one or more streams of a given media type. An AVCaptureDevice instance is then wrapped in an AVCaptureDeviceInput and handed to an AVCaptureSession as an input source.

//returns the front camera
- (AVCaptureDevice *)frontCamera {
    return [self cameraWithPosition:AVCaptureDevicePositionFront];
}

//returns the back camera
- (AVCaptureDevice *)backCamera {
    return [self cameraWithPosition:AVCaptureDevicePositionBack];
}

//returns the camera device for the given position
- (AVCaptureDevice *)cameraWithPosition:(AVCaptureDevicePosition)position {
    //all devices that can capture video
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    //walk the devices and return the one matching the requested position
    for (AVCaptureDevice *device in devices) {
        if ([device position] == position) {
            return device;
        }
    }
    return nil;
}
//turn the torch on
- (void)openFlashLight {
    AVCaptureDevice *backCamera = [self backCamera];
    if (backCamera.torchMode == AVCaptureTorchModeOff) {
        [backCamera lockForConfiguration:nil];
        backCamera.torchMode = AVCaptureTorchModeOn;
        backCamera.flashMode = AVCaptureFlashModeOn;
        [backCamera unlockForConfiguration];
    }
}
//turn the torch off
- (void)closeFlashLight {
    AVCaptureDevice *backCamera = [self backCamera];
    if (backCamera.torchMode == AVCaptureTorchModeOn) {
        [backCamera lockForConfiguration:nil];
        backCamera.torchMode = AVCaptureTorchModeOff;
        backCamera.flashMode = AVCaptureFlashModeOff;
        [backCamera unlockForConfiguration];
    }
}

AVCaptureDeviceInput

AVCaptureDeviceInput is the input source of an AVCaptureSession: it feeds media data from a device into the session. It is created from an AVCaptureDevice instance, in our case the front or back camera obtained through [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo].

//back-camera input
- (AVCaptureDeviceInput *)backCameraInput {
    if (_backCameraInput == nil) {
        NSError *error;
        _backCameraInput = [[AVCaptureDeviceInput alloc] initWithDevice:[self backCamera] error:&error];
        if (error) {
            [SVProgressHUD showErrorWithStatus:@"Failed to access the back camera"];
        }
    }
    return _backCameraInput;
}

//front-camera input
- (AVCaptureDeviceInput *)frontCameraInput {
    if (_frontCameraInput == nil) {
        NSError *error;
        _frontCameraInput = [[AVCaptureDeviceInput alloc] initWithDevice:[self frontCamera] error:&error];
        if (error) {
            [SVProgressHUD showErrorWithStatus:@"Failed to access the front camera"];
        }
    }
    return _frontCameraInput;
}
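
The session getter also adds self.audioMicInput, whose getter is not shown above. A minimal sketch, assuming the same lazy pattern and an _audioMicInput ivar:

//microphone input (sketch, assuming the same pattern as the camera inputs)
- (AVCaptureDeviceInput *)audioMicInput {
    if (_audioMicInput == nil) {
        //the default audio capture device is the built-in microphone
        AVCaptureDevice *mic = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
        NSError *error;
        _audioMicInput = [[AVCaptureDeviceInput alloc] initWithDevice:mic error:&error];
        if (error) {
            [SVProgressHUD showErrorWithStatus:@"Failed to access the microphone"];
        }
    }
    return _audioMicInput;
}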

AVCaptureVideoPreviewLayer

AVCaptureVideoPreviewLayer is a Core Animation layer subclass used to preview the video output of an AVCaptureSession; in short, it is the layer on which the captured video is displayed.

//the layer that displays the captured video
- (AVCaptureVideoPreviewLayer *)previewLayer {
    if (_previewLayer == nil) {
        //create the layer from the capture session
        AVCaptureVideoPreviewLayer *preview = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.recordSession];
        //scale the video so it fills the layer
        preview.videoGravity = AVLayerVideoGravityResizeAspectFill;
        _previewLayer = preview;
    }
    return _previewLayer;
}
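
The getter only creates the layer; it still has to be given a frame and inserted into the view hierarchy, for example in the recording controller's setup code (the view names here are placeholders):

//somewhere in the recording view controller's setup (sketch)
self.previewLayer.frame = self.view.bounds;                    //fill the screen
[self.view.layer insertSublayer:self.previewLayer atIndex:0];  //keep it behind the recording controls
//the preview only shows once the session is running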

AVCaptureMovieFileOutput

AVCaptureMovieFileOutput is a subclass of AVCaptureFileOutput that writes captured media to QuickTime movie files. Because on the iPhone it cannot pause a recording and does not let you choose the output container format, it is not used here; instead, the more flexible AVCaptureVideoDataOutput and AVCaptureAudioDataOutput are used to implement recording.

AVCaptureVideoDataOutput

AVCaptureVideoDataOutput is a subclass of AVCaptureOutput that delivers the captured video frames, compressed or uncompressed, as sample buffers that can be processed with other media APIs. An application receives the frames through the delegate method captureOutput:didOutputSampleBuffer:fromConnection:.

//video data output
- (AVCaptureVideoDataOutput *)videoOutput {
    if (_videoOutput == nil) {
        _videoOutput = [[AVCaptureVideoDataOutput alloc] init];
        [_videoOutput setSampleBufferDelegate:self queue:self.captureQueue];
        //deliver frames as bi-planar 4:2:0 YpCbCr pixel buffers
        NSDictionary *setcapSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                        [NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange], kCVPixelBufferPixelFormatTypeKey,
                                        nil];
        _videoOutput.videoSettings = setcapSettings;
    }
    return _videoOutput;
}

AVCaptureAudioDataOutput

AVCaptureAudioDataOutput is a subclass of AVCaptureOutput that delivers the captured audio samples, compressed or uncompressed, as sample buffers that can be processed with other media APIs. An application receives the audio data through the same delegate method, captureOutput:didOutputSampleBuffer:fromConnection:.

//audio data output
- (AVCaptureAudioDataOutput *)audioOutput {
    if (_audioOutput == nil) {
        _audioOutput = [[AVCaptureAudioDataOutput alloc] init];
        [_audioOutput setSampleBufferDelegate:self queue:self.captureQueue];
    }
    return _audioOutput;
}
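
Both data outputs deliver their sample buffers on self.captureQueue, which is not listed above. A minimal sketch, assuming a lazily created serial dispatch queue (the queue label is arbitrary):

//capture queue (sketch): a serial queue keeps buffers in order and off the main thread
- (dispatch_queue_t)captureQueue {
    if (_captureQueue == nil) {
        _captureQueue = dispatch_queue_create("com.wcl.recordCaptureQueue", DISPATCH_QUEUE_SERIAL);
    }
    return _captureQueue;
}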

AVCaptureConnection

AVCaptureConnection represents a connection between one or more AVCaptureInputPort instances and an AVCaptureOutput or AVCaptureVideoPreviewLayer within an AVCaptureSession.

//video connection
- (AVCaptureConnection *)videoConnection {
    _videoConnection = [self.videoOutput connectionWithMediaType:AVMediaTypeVideo];
    return _videoConnection;
}

//audio connection
- (AVCaptureConnection *)audioConnection {
    if (_audioConnection == nil) {
        _audioConnection = [self.audioOutput connectionWithMediaType:AVMediaTypeAudio];
    }
    return _audioConnection;
}

AVAssetWriter

AVAssetWriter writes media data to a new file. An AVAssetWriter instance specifies the format of the output file, such as a QuickTime movie or an MPEG-4 file. It can accept media data for several parallel tracks; here there is a video track and an audio track, described below. A single AVAssetWriter instance can only be used to write one file; a client that wants to write more files must create a new AVAssetWriter instance each time.

//designated initializer for the encoder
- (instancetype)initPath:(NSString *)path Height:(NSInteger)cy width:(NSInteger)cx channels:(int)ch samples:(Float64)rate {
    self = [super init];
    if (self) {
        self.path = path;
        //remove any stale file at the path so the new recording starts fresh
        [[NSFileManager defaultManager] removeItemAtPath:self.path error:nil];
        NSURL *url = [NSURL fileURLWithPath:self.path];
        //create the writer, producing an MPEG-4 file
        _writer = [AVAssetWriter assetWriterWithURL:url fileType:AVFileTypeMPEG4 error:nil];
        //make the file better suited for network playback
        _writer.shouldOptimizeForNetworkUse = YES;
        //set up the video input
        [self initVideoInputHeight:cy width:cx];
        //only set up the audio input once the sample rate and channel count are known
        if (rate != 0 && ch != 0) {
            [self initAudioInputChannels:ch samples:rate];
        }
    }
    return self;
}
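
The capture callback shown later creates the encoder through a class convenience method, encoderForPath:Height:width:channels:samples:. Its body is not in the listing; presumably it simply wraps the initializer above, roughly:

//convenience constructor (sketch, assumed to wrap the initializer above)
+ (instancetype)encoderForPath:(NSString *)path Height:(NSInteger)cy width:(NSInteger)cx channels:(int)ch samples:(Float64)rate {
    return [[self alloc] initPath:path Height:cy width:cx channels:ch samples:rate];
}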

AVAssetWriterInput

An AVAssetWriterInput appends media samples (CMSampleBuffer instances) to one track of the AVAssetWriter's output file. When there are several inputs, AVAssetWriter tries to interleave their media data in a pattern that is efficient for storage and playback. Whether an input can currently accept data is reported by its readyForMoreMediaData property: only when it is YES should you append media data to that input.

//set up the video writer input
- (void)initVideoInputHeight:(NSInteger)cy width:(NSInteger)cx {
    //video settings: H.264 encoding and the output resolution
    NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
                              AVVideoCodecH264, AVVideoCodecKey,
                              [NSNumber numberWithInteger:cx], AVVideoWidthKey,
                              [NSNumber numberWithInteger:cy], AVVideoHeightKey,
                              nil];
    //create the video writer input
    _videoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:settings];
    //the input should tailor its processing for a real-time data source
    _videoInput.expectsMediaDataInRealTime = YES;
    //attach the video input to the writer
    [_writer addInput:_videoInput];
}

//set up the audio writer input
- (void)initAudioInputChannels:(int)ch samples:(Float64)rate {
    //audio settings: AAC encoding, channel count, sample rate, and bit rate
    NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
                              [NSNumber numberWithInt:kAudioFormatMPEG4AAC], AVFormatIDKey,
                              [NSNumber numberWithInt:ch], AVNumberOfChannelsKey,
                              [NSNumber numberWithFloat:rate], AVSampleRateKey,
                              [NSNumber numberWithInt:128000], AVEncoderBitRateKey,
                              nil];
    //create the audio writer input
    _audioInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:settings];
    //the input should tailor its processing for a real-time data source
    _audioInput.expectsMediaDataInRealTime = YES;
    //attach the audio input to the writer
    [_writer addInput:_audioInput];
}

Those are the classes and configuration needed before recording. The rest of the article shows how to handle the captured data and write it to a file.
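
The capture callback below leans on a few state flags (isCapturing, isPaused, discont) that are toggled when recording starts, pauses, and resumes. Those control methods are not in the listing; a minimal sketch of how they might drive the flags, using only the properties that appear below:

//start recording (sketch)
- (void)startCapture {
    @synchronized(self) {
        if (!self.isCapturing) {
            self.recordEncoder = nil;       //the encoder is created lazily in the capture callback
            self.isPaused = NO;
            self.discont = NO;
            _timeOffset = kCMTimeZero;
            self.startTime = CMTimeMake(0, 0);
            self.isCapturing = YES;
        }
    }
}

//pause recording (sketch)
- (void)pauseCapture {
    @synchronized(self) {
        if (self.isCapturing) {
            self.isPaused = YES;
            self.discont = YES;             //the next buffers must be re-timed
        }
    }
}

//resume recording (sketch)
- (void)resumeCapture {
    @synchronized(self) {
        if (self.isPaused) {
            self.isPaused = NO;
        }
    }
}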

Writing the data

#pragma mark - Writing the captured data
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    BOOL isVideo = YES;
    @synchronized(self) {
        if (!self.isCapturing || self.isPaused) {
            return;
        }
        if (captureOutput != self.videoOutput) {
            isVideo = NO;
        }

        //create the encoder lazily: the first audio buffer carries the sample rate and channel count we need
        if ((self.recordEncoder == nil) && !isVideo) {
            CMFormatDescriptionRef fmt = CMSampleBufferGetFormatDescription(sampleBuffer);
            [self setAudioFormat:fmt];
            NSString *videoName = [NSString getUploadFile_type:@"video" fileType:@"mp4"];
            self.videoPath = [[self getVideoCachePath] stringByAppendingPathComponent:videoName];
            self.recordEncoder = [WCLRecordEncoder encoderForPath:self.videoPath Height:_cy width:_cx channels:_channels samples:_samplerate];
        }

        //handle a resumed recording: the recording was paused at some point
        if (self.discont) {
            if (isVideo) {
                return;
            }
            self.discont = NO;
            //work out how long the recording was paused
            CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
            CMTime last = isVideo ? _lastVideo : _lastAudio;
            if (last.flags & kCMTimeFlags_Valid) {
                if (_timeOffset.flags & kCMTimeFlags_Valid) {
                    pts = CMTimeSubtract(pts, _timeOffset);
                }
                CMTime offset = CMTimeSubtract(pts, last);
                if (_timeOffset.value == 0) {
                    _timeOffset = offset;
                } else {
                    _timeOffset = CMTimeAdd(_timeOffset, offset);
                }
            }
            _lastVideo.flags = 0;
            _lastAudio.flags = 0;
        }
        //retain the sample buffer so we can safely hold on to it (or replace it) without it being freed underneath us
        CFRetain(sampleBuffer);
        if (_timeOffset.value > 0) {
            CFRelease(sampleBuffer);
            //shift the timestamps by the accumulated pause offset
            sampleBuffer = [self adjustTime:sampleBuffer by:_timeOffset];
        }
        //remember the timestamp of the last sample so the next pause can be stitched
        CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
        CMTime dur = CMSampleBufferGetDuration(sampleBuffer);
        if (dur.value > 0) {
            pts = CMTimeAdd(pts, dur);
        }
        if (isVideo) {
            _lastVideo = pts;
        } else {
            _lastAudio = pts;
        }
    }
    CMTime dur = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    if (self.startTime.value == 0) {
        self.startTime = dur;
    }
    CMTime sub = CMTimeSubtract(dur, self.startTime);
    self.currentRecordTime = CMTimeGetSeconds(sub);
    //stop encoding once the maximum duration has been reached
    if (self.currentRecordTime > self.maxRecordTime) {
        if (self.currentRecordTime - self.maxRecordTime < 0.1) {
            if ([self.delegate respondsToSelector:@selector(recordProgress:)]) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    [self.delegate recordProgress:self.currentRecordTime/self.maxRecordTime];
                });
            }
        }
        CFRelease(sampleBuffer); //balance the CFRetain above before bailing out
        return;
    }
    //report recording progress to the delegate
    if ([self.delegate respondsToSelector:@selector(recordProgress:)]) {
        dispatch_async(dispatch_get_main_queue(), ^{
            [self.delegate recordProgress:self.currentRecordTime/self.maxRecordTime];
        });
    }
    //hand the sample buffer to the encoder
    [self.recordEncoder encodeFrame:sampleBuffer isVideo:isVideo];
    CFRelease(sampleBuffer);
}

//read the sample rate and channel count from the audio format description
- (void)setAudioFormat:(CMFormatDescriptionRef)fmt {
    const AudioStreamBasicDescription *asbd = CMAudioFormatDescriptionGetStreamBasicDescription(fmt);
    _samplerate = asbd->mSampleRate;
    _channels = asbd->mChannelsPerFrame;
}

//shift a sample buffer's timing information by the given offset
- (CMSampleBufferRef)adjustTime:(CMSampleBufferRef)sample by:(CMTime)offset {
    CMItemCount count;
    CMSampleBufferGetSampleTimingInfoArray(sample, 0, nil, &count);
    CMSampleTimingInfo *pInfo = malloc(sizeof(CMSampleTimingInfo) * count);
    CMSampleBufferGetSampleTimingInfoArray(sample, count, pInfo, &count);
    for (CMItemCount i = 0; i < count; i++) {
        pInfo[i].decodeTimeStamp = CMTimeSubtract(pInfo[i].decodeTimeStamp, offset);
        pInfo[i].presentationTimeStamp = CMTimeSubtract(pInfo[i].presentationTimeStamp, offset);
    }
    CMSampleBufferRef sout;
    CMSampleBufferCreateCopyWithNewTiming(nil, sample, count, pInfo, &sout);
    free(pInfo);
    return sout;
}

//append a sample buffer to the writer
- (BOOL)encodeFrame:(CMSampleBufferRef)sampleBuffer isVideo:(BOOL)isVideo {
    //is the buffer's data ready to be written?
    if (CMSampleBufferDataIsReady(sampleBuffer)) {
        //while the writer status is still unknown, start the session with the first video buffer
        if (_writer.status == AVAssetWriterStatusUnknown && isVideo) {
            //use the buffer's presentation timestamp as the start of the session
            CMTime startTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
            //start writing
            [_writer startWriting];
            [_writer startSessionAtSourceTime:startTime];
        }
        //writing has failed
        if (_writer.status == AVAssetWriterStatusFailed) {
            NSLog(@"writer error %@", _writer.error.localizedDescription);
            return NO;
        }
        //video buffer
        if (isVideo) {
            //only append when the video input can accept more media data
            if (_videoInput.readyForMoreMediaData == YES) {
                [_videoInput appendSampleBuffer:sampleBuffer];
                return YES;
            }
        } else {
            //only append when the audio input can accept more media data
            if (_audioInput.readyForMoreMediaData) {
                [_audioInput appendSampleBuffer:sampleBuffer];
                return YES;
            }
        }
    }
    return NO;
}

Finishing the recording and saving to the photo library

//stop recording
- (void)stopCaptureHandler:(void (^)(UIImage *movieImage))handler {
    @synchronized(self) {
        if (self.isCapturing) {
            NSString *path = self.recordEncoder.path;
            NSURL *url = [NSURL fileURLWithPath:path];
            self.isCapturing = NO;
            dispatch_async(_captureQueue, ^{
                [self.recordEncoder finishWithCompletionHandler:^{
                    self.isCapturing = NO;
                    self.recordEncoder = nil;
                    //save the finished file to the photo library
                    [[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
                        [PHAssetChangeRequest creationRequestForAssetFromVideoAtFileURL:url];
                    } completionHandler:^(BOOL success, NSError * _Nullable error) {
                        if (success) {
                            NSLog(@"video saved to the photo library");
                        }
                    }];
                    [self movieToImageHandler:handler];
                }];
            });
        }
    }
}

//grab the first frame of the recorded movie as a thumbnail
- (void)movieToImageHandler:(void (^)(UIImage *movieImage))handler {
    NSURL *url = [NSURL fileURLWithPath:self.videoPath];
    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:url options:nil];
    AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    generator.appliesPreferredTrackTransform = TRUE;
    CMTime thumbTime = CMTimeMakeWithSeconds(0, 60);
    generator.apertureMode = AVAssetImageGeneratorApertureModeEncodedPixels;
    AVAssetImageGeneratorCompletionHandler generatorHandler =
    ^(CMTime requestedTime, CGImageRef im, CMTime actualTime, AVAssetImageGeneratorResult result, NSError *error) {
        if (result == AVAssetImageGeneratorSucceeded) {
            UIImage *thumbImg = [UIImage imageWithCGImage:im];
            if (handler) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    handler(thumbImg);
                });
            }
        }
    };
    [generator generateCGImagesAsynchronouslyForTimes:
     [NSArray arrayWithObject:[NSValue valueWithCMTime:thumbTime]] completionHandler:generatorHandler];
}

//called when recording finishes; completes writing the output file
- (void)finishWithCompletionHandler:(void (^)(void))handler {
    [_writer finishWritingWithCompletionHandler:handler];
}

That wraps up this post. If you have any questions, feel free to ask. A demo project accompanies this article; have a look to see how everything fits together, and if it helps you, a star would be appreciated. Thanks for reading!

My demo project
