
iphone - Can you use AVCaptureVideoDataOutput and AVCaptureMovieFileOutput at the same time?




I want to record video and grab frames at the same time with my code.

I am using AVCaptureVideoDataOutput to grab frames and AVCaptureMovieFileOutput to record the video. Each one works on its own, but when both are attached at the same time the recording fails with error code -12780.
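For reference, here is a minimal sketch (Swift, hypothetical names, a standard camera input assumed) of the kind of setup being described; each output works when attached alone, but with both attached the recording fails:

import AVFoundation

let session = AVCaptureSession()
if let camera = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo),
   let input = try? AVCaptureDeviceInput(device: camera) {
    session.addInput(input)
}

let frameOutput = AVCaptureVideoDataOutput()    // for grabbing frames
let movieOutput = AVCaptureMovieFileOutput()    // for recording to a file

// Each output works when it is the only one attached; with both
// attached, recording fails with error -12780.
if session.canAddOutput(frameOutput) { session.addOutput(frameOutput) }
if session.canAddOutput(movieOutput) { session.addOutput(movieOutput) }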

I searched for this problem but couldn't find an answer. Does anyone have the same experience, or an explanation? It has been bothering me for a while.

Thanks.


This is a Swift version of Tommy's answer.

// Set up the capture session
// Add the inputs
// Add the outputs

let outputSettings: [String : Any] = [
    AVVideoWidthKey: 640,
    AVVideoHeightKey: 480,
    AVVideoCodecKey: AVVideoCodecH264
]

let assetWriterInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo,
                                          outputSettings: outputSettings)

let pixelBufferAdaptor = AVAssetWriterInputPixelBufferAdaptor(
    assetWriterInput: assetWriterInput,
    sourcePixelBufferAttributes: [
        kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)
    ])

// handle the thrown error properly in real code
let assetWriter = try AVAssetWriter(url: URLFromSomewhere, fileType: AVFileTypeMPEG4)
assetWriter.add(assetWriterInput)
assetWriterInput.expectsMediaDataInRealTime = true

assetWriter.startWriting()
assetWriter.startSession(atSourceTime: kCMTimeZero)
captureSession.startRunning()

// a very dense way to keep track of the time at which this frame
// occurs relative to the output stream, but it's just an example!
// (kept outside the callback so it isn't reset on every frame)
var frameNumber: Int64 = 0

func captureOutput(_ captureOutput: AVCaptureOutput,
                   didOutputSampleBuffer sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    if assetWriterInput.isReadyForMoreMediaData {
        pixelBufferAdaptor.append(imageBuffer,
                                  withPresentationTime: CMTimeMake(frameNumber, 25))
    }
    frameNumber += 1
}

// ... and, to stop, ensuring the output file is finished properly ...
captureSession.stopRunning()
assetWriter.finishWriting { /* the file is now ready */ }

However, I can't guarantee 100% accuracy, because I'm new to Swift.


I can't answer the specific question, but I have been recording video and grabbing frames at the same time using:

  • AVCaptureSession and AVCaptureVideoDataOutput to route frames into my own code
  • AVAssetWriter, AVAssetWriterInput and AVAssetWriterInputPixelBufferAdaptor to write the frames out to an H.264 encoded movie file

That's without investigating audio. I end up getting CMSampleBuffers from the capture session and then push them into the pixel buffer adaptor.

EDIT: so my code looks more or less like the following, with the bits you're having no problems with skimmed over, and ignoring scope issues:

/* to ensure I'm given incoming CMSampleBuffers */
AVCaptureSession *captureSession = alloc and init, set your preferred preset/etc;
AVCaptureDevice *captureDevice = default for video, probably;
AVCaptureDeviceInput *deviceInput = input with device as above, and attach it to the session;
AVCaptureVideoDataOutput *output = output for 32BGRA pixel format, with me as the delegate and a suitable dispatch queue affixed;

/* to prepare for output; I'll output 640x480 in H.264, via an asset writer */
NSDictionary *outputSettings =
    [NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithInt:640], AVVideoWidthKey,
        [NSNumber numberWithInt:480], AVVideoHeightKey,
        AVVideoCodecH264, AVVideoCodecKey,
        nil];

AVAssetWriterInput *assetWriterInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:outputSettings];

/* I'm going to push pixel buffers to it, so will need an
   AVAssetWriterInputPixelBufferAdaptor, to expect the same 32BGRA input as
   I've asked the AVCaptureVideoDataOutput to supply */
AVAssetWriterInputPixelBufferAdaptor *pixelBufferAdaptor =
    [[AVAssetWriterInputPixelBufferAdaptor alloc]
        initWithAssetWriterInput:assetWriterInput
     sourcePixelBufferAttributes:
        [NSDictionary dictionaryWithObjectsAndKeys:
            [NSNumber numberWithInt:kCVPixelFormatType_32BGRA],
            kCVPixelBufferPixelFormatTypeKey,
            nil]];

/* that's going to go somewhere, I imagine you've got the URL for that
   sorted, so create a suitable asset writer; we'll put our H.264 within
   the normal MPEG4 container */
AVAssetWriter *assetWriter =
    [[AVAssetWriter alloc] initWithURL:URLFromSomewhere
                              fileType:AVFileTypeMPEG4
                                 error:you need to check error conditions, this example is too lazy];
[assetWriter addInput:assetWriterInput];

/* we need to warn the input to expect real time data incoming, so that it
   tries to avoid being unavailable at inopportune moments */
assetWriterInput.expectsMediaDataInRealTime = YES;

... eventually ...

[assetWriter startWriting];
[assetWriter startSessionAtSourceTime:kCMTimeZero];
[captureSession startRunning];

... elsewhere ...

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // a very dense way to keep track of the time at which this frame
    // occurs relative to the output stream, but it's just an example!
    static int64_t frameNumber = 0;
    if (assetWriterInput.readyForMoreMediaData)
        [pixelBufferAdaptor appendPixelBuffer:imageBuffer
                         withPresentationTime:CMTimeMake(frameNumber, 25)];
    frameNumber++;
}

... and, to stop, ensuring the output file is finished properly ...

[captureSession stopRunning];
[assetWriter finishWriting];
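As the comments in both snippets say, the fixed frameNumber/25 clock is just an example. One alternative, sketched below against the same assetWriter, assetWriterInput and pixelBufferAdaptor names as the Swift snippet above (not part of the original answers), is to reuse each sample buffer's own capture timestamp as the presentation time and start the writer session at the first frame's timestamp instead of kCMTimeZero (dropping the earlier startSession(atSourceTime: kCMTimeZero) call):

var writerSessionStarted = false

func captureOutput(_ captureOutput: AVCaptureOutput,
                   didOutputSampleBuffer sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    // Reuse the capture clock instead of counting frames at an assumed 25 fps.
    let timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)

    if !writerSessionStarted {
        // Start the writer's timeline at the first frame's capture time,
        // so every later timestamp can be passed through unchanged.
        assetWriter.startSession(atSourceTime: timestamp)
        writerSessionStarted = true
    }

    if let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer),
       assetWriterInput.isReadyForMoreMediaData {
        pixelBufferAdaptor.append(imageBuffer, withPresentationTime: timestamp)
    }
}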
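On the "without investigating audio" point: the original answers don't cover it, but a hedged sketch of how audio could be folded into the same writer would be a second AVAssetWriterInput fed by an AVCaptureAudioDataOutput; audio sample buffers are appended to the input directly, with no pixel buffer adaptor involved (names continue from the Swift snippet above, settings are illustrative):

import AVFoundation

// Hypothetical audio path, mirroring the video path above.
let audioOutput = AVCaptureAudioDataOutput()
// ... set a delegate/queue on audioOutput and add it to the capture session ...

let audioSettings: [String : Any] = [
    AVFormatIDKey: Int(kAudioFormatMPEG4AAC),  // encode to AAC inside the MPEG4 container
    AVNumberOfChannelsKey: 1,
    AVSampleRateKey: 44100.0
]
let audioWriterInput = AVAssetWriterInput(mediaType: AVMediaTypeAudio,
                                          outputSettings: audioSettings)
audioWriterInput.expectsMediaDataInRealTime = true
assetWriter.add(audioWriterInput)   // must happen before startWriting()

// ... then, in the audio delegate callback ...
// if audioWriterInput.isReadyForMoreMediaData {
//     audioWriterInput.append(sampleBuffer)
// }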