
iphone - AVCaptureVideoDataOutput and AVCaptureMovieFileOutput simultaneously




I need to have AVCaptureVideoDataOutput and AVCaptureMovieFileOutput working at the same time. The code below runs, but the video recording does not work: the didFinishRecordingToOutputFileAtURL delegate method is called immediately after startRecordingToOutputFileURL. Now, if I remove the AVCaptureVideoDataOutput from the AVCaptureSession by simply commenting out the line:

[captureSession addOutput:captureDataOutput];

then the video recording works, but the SampleBufferDelegate is no longer called (which I need).

How can I get AVCaptureVideoDataOutput and AVCaptureMovieFileOutput to work simultaneously?

- (void)initCapture
{
    AVCaptureDeviceInput *captureInput = [AVCaptureDeviceInput
        deviceInputWithDevice:[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo]
                        error:NULL];

    captureDataOutput = [[AVCaptureVideoDataOutput alloc] init];
    [captureDataOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];

    m_captureFileOutput = [[AVCaptureMovieFileOutput alloc] init];

    NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
    NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
    NSDictionary *videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
    [captureDataOutput setVideoSettings:videoSettings];

    captureSession = [[AVCaptureSession alloc] init];
    [captureSession addInput:captureInput];
    [captureSession addOutput:m_captureFileOutput];
    [captureSession addOutput:captureDataOutput];

    [captureSession beginConfiguration];
    [captureSession setSessionPreset:AVCaptureSessionPresetLow];
    [captureSession commitConfiguration];

    [self performSelector:@selector(startRecording) withObject:nil afterDelay:10.0];
    [self performSelector:@selector(stopRecording) withObject:nil afterDelay:15.0];

    [captureSession startRunning];
}

- (void)startRecording
{
    [m_captureFileOutput startRecordingToOutputFileURL:[self tempFileURL] recordingDelegate:self];
}

- (void)stopRecording
{
    if ([m_captureFileOutput isRecording])
        [m_captureFileOutput stopRecording];
}

- (NSURL *)tempFileURL
{
    NSString *outputPath = [[NSString alloc] initWithFormat:@"%@%@", NSTemporaryDirectory(), @"camera.mov"];
    NSURL *outputURL = [[NSURL alloc] initFileURLWithPath:outputPath];
    NSFileManager *fileManager = [NSFileManager defaultManager];
    if ([fileManager fileExistsAtPath:outputPath]) {
        [[NSFileManager defaultManager] removeItemAtPath:outputPath error:nil];
    }
    [outputPath release];
    return [outputURL autorelease];
}

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didStartRecordingToOutputFileAtURL:(NSURL *)fileURL
      fromConnections:(NSArray *)connections
{
    NSLog(@"start record video");
}

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
      fromConnections:(NSArray *)connections
                error:(NSError *)error
{
    NSLog(@"end record");
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // do stuff with sampleBuffer
}

I should add that I am getting the error:

Error Domain=NSOSStatusErrorDomain Code=-12780 "The operation couldn’t be completed. (OSStatus error -12780.)" UserInfo=0x23fcd0 {AVErrorRecordingSuccessfullyFinishedKey=false}

from

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error

Cheers


Although you can't use AVCaptureVideoDataOutput, you can use AVCaptureVideoPreviewLayer simultaneously with AVCaptureMovieFileOutput. See the "AVCam" sample on Apple's website.

In Xamarin.iOS, the code looks like this:

var session = new AVCaptureSession();

var camera = AVCaptureDevice.DefaultDeviceWithMediaType(AVMediaType.Video);
var mic = AVCaptureDevice.DefaultDeviceWithMediaType(AVMediaType.Audio);
if (camera == null || mic == null) {
    throw new Exception("Can't find devices");
}

// AddInput takes an AVCaptureInput, so wrap each device first.
var cameraInput = AVCaptureDeviceInput.FromDevice(camera);
var micInput = AVCaptureDeviceInput.FromDevice(mic);
if (session.CanAddInput(cameraInput)) {
    session.AddInput(cameraInput);
}
if (session.CanAddInput(micInput)) {
    session.AddInput(micInput);
}

var layer = new AVCaptureVideoPreviewLayer(session);
layer.LayerVideoGravity = AVLayerVideoGravity.ResizeAspectFill;

cameraView = new UIView();
cameraView.Layer.AddSublayer(layer);

var filePath = System.IO.Path.Combine(Path.GetTempPath(), "temporary.mov");
var fileUrl = NSUrl.FromFilename(filePath);

var movieFileOutput = new AVCaptureMovieFileOutput();
var recordingDelegate = new MyRecordingDelegate();
session.AddOutput(movieFileOutput);

session.StartRunning();
movieFileOutput.StartRecordingToOutputFile(fileUrl, recordingDelegate);
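For comparison, here is a rough Objective-C sketch of the same idea, since the question is in Objective-C: a preview layer plus AVCaptureMovieFileOutput, following the pattern of the AVCam sample. (self.view and the recording delegate self are assumptions; error handling is omitted.)

    AVCaptureSession *session = [[AVCaptureSession alloc] init];

    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:NULL];
    if ([session canAddInput:input]) {
        [session addInput:input];
    }

    // Preview layer in place of AVCaptureVideoDataOutput
    AVCaptureVideoPreviewLayer *layer = [AVCaptureVideoPreviewLayer layerWithSession:session];
    layer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    layer.frame = self.view.bounds;
    [self.view.layer addSublayer:layer];

    AVCaptureMovieFileOutput *movieOutput = [[AVCaptureMovieFileOutput alloc] init];
    if ([session canAddOutput:movieOutput]) {
        [session addOutput:movieOutput];
    }

    [session startRunning];
    NSURL *fileURL = [NSURL fileURLWithPath:
        [NSTemporaryDirectory() stringByAppendingPathComponent:@"temporary.mov"]];
    [movieOutput startRecordingToOutputFileURL:fileURL recordingDelegate:self];

This gives you an on-screen preview while recording, but note it does not deliver sample buffers to your code, so it only helps if the preview was all you needed the data output for.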


I contacted an Apple support engineer, and he told me that simultaneous use of AVCaptureVideoDataOutput + AVCaptureMovieFileOutput is not supported. I don't know whether they will support it in the future, but he used the words "not supported at this time".
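Given that the combination is unsupported, a common workaround (my suggestion, not something confirmed in this thread) is to keep only the AVCaptureVideoDataOutput and write the movie yourself with AVAssetWriter from inside the existing didOutputSampleBuffer callback. A minimal sketch, where the ivar names m_assetWriter and m_writerInput and the output fileURL are placeholders of mine and error handling is omitted:

    // One-time setup, e.g. in initCapture:
    NSError *error = nil;
    m_assetWriter = [[AVAssetWriter alloc] initWithURL:fileURL
                                              fileType:AVFileTypeQuickTimeMovie
                                                 error:&error];
    NSDictionary *settings = @{ AVVideoCodecKey  : AVVideoCodecH264,
                                AVVideoWidthKey  : @480,
                                AVVideoHeightKey : @320 };
    m_writerInput = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeVideo
                                                   outputSettings:settings];
    m_writerInput.expectsMediaDataInRealTime = YES;
    [m_assetWriter addInput:m_writerInput];

    // In captureOutput:didOutputSampleBuffer:fromConnection:
    // start the writer on the first buffer, then append each one.
    if (m_assetWriter.status == AVAssetWriterStatusUnknown) {
        [m_assetWriter startWriting];
        [m_assetWriter startSessionAtSourceTime:
            CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
    }
    if (m_writerInput.isReadyForMoreMediaData) {
        [m_writerInput appendSampleBuffer:sampleBuffer];
    }

    // When you want to stop recording:
    [m_writerInput markAsFinished];
    [m_assetWriter finishWritingWithCompletionHandler:^{ /* file is ready */ }];

This way a single output both feeds your per-frame processing and produces the movie file, at the cost of managing the writer's lifecycle yourself.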

I encourage you to file a bug report / feature request on this, as I did (bugreport.apple.com), since they do measure how many people want a feature, and perhaps we will see this in the near future.