I'm working with AVCaptureVideoDataOutput and want to convert a CMSampleBufferRef to a UIImage. Many existing answers take the same approach, e.g. UIImage created from CMSampleBufferRef not displayed in UIImageView? and AVCaptureSession with multiple previews.
It works fine if I set the AVCaptureVideoDataOutput pixel format to BGRA (credit to this answer: CGBitmapContextCreateImage error):
// Ask the data output for BGRA frames instead of the default YUV
NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
NSDictionary *videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
[dataOutput setVideoSettings:videoSettings];
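For completeness, this is roughly the conversion I run in the delegate callback once the frames arrive as BGRA (a minimal sketch along the lines of the linked answers; imageFromSampleBuffer: is my own helper name):

#import <AVFoundation/AVFoundation.h>
#import <UIKit/UIKit.h>

- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // BGRA maps directly onto a CGBitmapContext with little-endian,
    // premultiplied-first alpha
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
                                                 bytesPerRow, colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    UIImage *image = [UIImage imageWithCGImage:quartzImage];
    CGImageRelease(quartzImage);
    return image;
}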
Without the above videoSettings, I receive the following errors:
CGBitmapContextCreate: invalid data bytes/row: should be at least 2560 for 8 integer bits/component, 3 components, kCGImageAlphaPremultipliedFirst.
<Error>: CGBitmapContextCreateImage: invalid context 0x0
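These errors make sense once you check what the buffer actually contains. A quick sketch in the callback (the NSLog is just my own diagnostic) shows the default frames are biplanar YUV rather than a single RGB plane, so CGBitmapContextCreate's bytes-per-row check fails:

CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
OSType format = CVPixelBufferGetPixelFormatType(imageBuffer);
// Without the BGRA setting this logs '420v' or '420f'
// (kCVPixelFormatType_420YpCbCr8BiPlanar...), not 'BGRA'
NSLog(@"pixel format: %c%c%c%c",
      (char)(format >> 24), (char)(format >> 16),
      (char)(format >> 8),  (char)format);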
However, working with BGRA is not a good choice, since there is conversion overhead from YUV (the capture session's default color space) to BGRA, as Brad and Codo point out in How to get the Y component from CMSampleBuffer resulted from the AVCaptureSession?
So: is there a way to convert a CMSampleBufferRef to a UIImage while keeping the data output in the YUV color space?
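The closest I've gotten is building a grayscale UIImage straight from the Y plane of the biplanar buffer (a minimal, untested sketch; grayscaleImageFromYUVSampleBuffer: is a hypothetical helper). This avoids the YUV-to-BGRA conversion, but it obviously drops the chroma planes, which is why I'm asking whether a full-color path exists:

- (UIImage *)grayscaleImageFromYUVSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    // Plane 0 of a biplanar 420 buffer is the luma (Y) channel
    void *yPlane = CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0);
    size_t width = CVPixelBufferGetWidthOfPlane(imageBuffer, 0);
    size_t height = CVPixelBufferGetHeightOfPlane(imageBuffer, 0);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(imageBuffer, 0);

    // One 8-bit channel, no alpha: treat the Y plane as a grayscale bitmap
    CGColorSpaceRef graySpace = CGColorSpaceCreateDeviceGray();
    CGContextRef context = CGBitmapContextCreate(yPlane, width, height, 8,
                                                 bytesPerRow, graySpace,
                                                 kCGImageAlphaNone);
    CGImageRef cgImage = CGBitmapContextCreateImage(context);

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    CGContextRelease(context);
    CGColorSpaceRelease(graySpace);

    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return image;
}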