Live Photo

How to turn a video into a Live Photo with AVFoundation.

A Live Photo is really just a .mov file paired with a .jpg file, as the save code below shows.

[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    PHAssetCreationRequest *request = [PHAssetCreationRequest creationRequestForAsset];
    PHAssetResourceCreationOptions *options = [[PHAssetResourceCreationOptions alloc] init];
    options.shouldMoveFile = YES;
    [request addResourceWithType:PHAssetResourceTypePhoto fileURL:_imageOutputURL options:options];
    [request addResourceWithType:PHAssetResourceTypePairedVideo fileURL:_videoOutputURL options:options];
} completionHandler:^(BOOL success, NSError * _Nullable error) {
}];

Can any arbitrary .mov and .jpg be combined into a Live Photo? No: the system pairs the two files through markers written into their metadata.
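
You can check this marking yourself by reading back the Apple maker note from the JPEG half of an existing Live Photo. Below is a minimal sketch using ImageIO; LogLivePhotoIdentifier and jpegURL are illustrative names, not part of the original code. The paired .mov carries the matching identifier under the QuickTime key com.apple.quicktime.content.identifier.

#import <ImageIO/ImageIO.h>

/// Logs the asset identifier stored in a Live Photo JPEG's Apple maker note
/// under key "17". Returns silently if the file can't be read.
static void LogLivePhotoIdentifier(NSURL *jpegURL) {
    CGImageSourceRef source = CGImageSourceCreateWithURL((__bridge CFURLRef)jpegURL, NULL);
    if (source == NULL) { return; }
    NSDictionary *properties = CFBridgingRelease(CGImageSourceCopyPropertiesAtIndex(source, 0, NULL));
    CFRelease(source);
    NSDictionary *makerApple = properties[(__bridge NSString *)kCGImagePropertyMakerAppleDictionary];
    NSLog(@"Asset identifier: %@", makerApple[@"17"]);
}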

Begin

NSString *_assetIdentifier;
NSURL *_imageOutputURL;
NSURL *_videoOutputURL;
AVURLAsset *_asset;

NSString* const kKeyAppleMakerNoteAssetIdentifier = @"17";
NSString* const kKeyStillImageTime = @"com.apple.quicktime.still-image-time";

_assetIdentifier is a unique identifier generated with [[NSUUID UUID] UUIDString].

Image

First, grab a frame at some point in the video and turn it into a JPEG.

AVAssetImageGenerator *generator = [AVAssetImageGenerator assetImageGeneratorWithAsset:_asset];
generator.appliesPreferredTrackTransform = YES;
CGImageRef cgImage = [generator copyCGImageAtTime:kCMTimeZero actualTime:nil error:nil];
/// Set the pairing metadata.
NSDictionary *metadata = @{(__bridge NSString *)kCGImagePropertyMakerAppleDictionary: @{kKeyAppleMakerNoteAssetIdentifier: _assetIdentifier}};
NSMutableData *imageData = [NSMutableData new];
CGImageDestinationRef dest = CGImageDestinationCreateWithData((__bridge CFMutableDataRef)imageData, kUTTypeJPEG, 1, nil);
CGImageDestinationAddImage(dest, cgImage, (__bridge CFDictionaryRef)metadata);
CGImageDestinationFinalize(dest);
CGImageRelease(cgImage);
CFRelease(dest);
/// Write to file.
[imageData writeToURL:_imageOutputURL atomically:YES];

You can also use a different image, but you must set the same marker in that JPEG's metadata so the system can pair it with the video.
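
For example, here is a hedged sketch of stamping the identifier onto an existing JPEG; TagImageWithIdentifier, sourceURL, and destURL are illustrative names, and error handling is kept minimal:

/// Copies a JPEG while merging the Live Photo asset identifier into its
/// Apple maker note, preserving the rest of the image's metadata.
static BOOL TagImageWithIdentifier(NSURL *sourceURL, NSURL *destURL, NSString *assetIdentifier) {
    CGImageSourceRef source = CGImageSourceCreateWithURL((__bridge CFURLRef)sourceURL, NULL);
    if (source == NULL) { return NO; }
    NSMutableDictionary *properties = [CFBridgingRelease(CGImageSourceCopyPropertiesAtIndex(source, 0, NULL)) mutableCopy];
    if (properties == nil) { properties = [NSMutableDictionary new]; }
    properties[(__bridge NSString *)kCGImagePropertyMakerAppleDictionary] = @{kKeyAppleMakerNoteAssetIdentifier: assetIdentifier};
    CGImageDestinationRef dest = CGImageDestinationCreateWithURL((__bridge CFURLRef)destURL, kUTTypeJPEG, 1, NULL);
    if (dest == NULL) { CFRelease(source); return NO; }
    CGImageDestinationAddImageFromSource(dest, source, 0, (__bridge CFDictionaryRef)properties);
    BOOL finalized = CGImageDestinationFinalize(dest);
    CFRelease(dest);
    CFRelease(source);
    return finalized;
}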

Video

Two things have to be written into the video: the unique identifier _assetIdentifier, and the still-image time that tells the system which frame is the key photo.

Transcoding

If the source is an .mp4 rather than a .mov, it has to be transcoded. There are two possible routes: AVAssetReaderTrackOutput + AVAssetWriterInput, or AVAssetExportSession.

AVAssetReader && AVAssetWriter

First, get the asset's videoTrack and audioTrack (the latter may not exist).

AVAssetTrack *videoTrack = [[_asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
if (videoTrack == nil) {
    if (completionHandler) {
        dispatch_async(dispatch_get_main_queue(), ^{
            completionHandler(NO, [NSError errorWithDomain:@"LivePhoto Can't get video track." code:-1 userInfo:nil]);
        });
    }
    return;
}
AVAssetTrack *audioTrack = [[_asset tracksWithMediaType:AVMediaTypeAudio] firstObject];
BOOL hasAudio = audioTrack != nil;

Progress

/// Progress is tracked in tenths of a second (duration in seconds * 10).
const int64_t videoTotalUnitCount = (int64_t)(CMTimeGetSeconds(videoTrack.timeRange.duration) * 10);
const int64_t audioTotalUnitCount = hasAudio ? (int64_t)(CMTimeGetSeconds(audioTrack.timeRange.duration) * 10) : 0;
NSProgress *progress = [NSProgress progressWithTotalUnitCount:hasAudio ? videoTotalUnitCount + audioTotalUnitCount : videoTotalUnitCount];

The total unit count comes from the durations of videoTrack and audioTrack. Multiplying the duration in seconds by 10 tracks progress in tenths of a second: a 3 s video with 1.5 s of audio, for example, gives 30 + 15 = 45 units.

AVAssetReader

AVAssetReaderTrackOutput *videoReaderTrackOutput = [[AVAssetReaderTrackOutput alloc] initWithTrack:videoTrack
    outputSettings:@{(__bridge NSString *)kCVPixelBufferPixelFormatTypeKey: [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]}];
AVAssetReader *videoReader = [[AVAssetReader alloc] initWithAsset:_asset error:nil];
[videoReader addOutput:videoReaderTrackOutput];

Here the frames are read out in the kCVPixelFormatType_32BGRA pixel format.

AVAssetWriter

AVAssetWriterInput *videoWriteInput = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeVideo
    outputSettings:@{AVVideoCodecKey: AVVideoCodecH264,
                     AVVideoWidthKey: [NSNumber numberWithDouble:videoTrack.naturalSize.width],
                     AVVideoHeightKey: [NSNumber numberWithDouble:videoTrack.naturalSize.height]}];
videoWriteInput.expectsMediaDataInRealTime = YES;
videoWriteInput.transform = videoTrack.preferredTransform;

Set the output codec to AVVideoCodecH264 and pass the track's natural width and height. Copying preferredTransform onto the writer input preserves the recording orientation.
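
Note that naturalSize is the size before preferredTransform is applied; copying the transform onto the writer input, as above, is what keeps rotated footage upright. If you ever need the displayed, post-transform size instead, a small sketch:

/// Displayed (upright) size of the track after its transform is applied.
/// Not needed by the code above, which preserves the transform instead.
CGSize natural = videoTrack.naturalSize;
CGRect displayRect = CGRectApplyAffineTransform(CGRectMake(0, 0, natural.width, natural.height),
                                                videoTrack.preferredTransform);
CGSize displaySize = CGSizeMake(fabs(displayRect.size.width), fabs(displayRect.size.height));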

Audio

AVAssetWriterInput *audioWriteInput = nil;
AVAssetReaderTrackOutput *audioReaderTrackOutput = nil;
AVAssetReader *audioReader = nil;
if (hasAudio) {
    audioReaderTrackOutput = [[AVAssetReaderTrackOutput alloc] initWithTrack:audioTrack outputSettings:nil];
    audioReader = [[AVAssetReader alloc] initWithAsset:_asset error:nil];
    [audioReader addOutput:audioReaderTrackOutput];
    audioWriteInput = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeAudio outputSettings:nil];
    audioWriteInput.expectsMediaDataInRealTime = YES;
}

The audio is handled separately here: passing nil outputSettings to both the reader output and the writer input makes AVFoundation pass the samples through without decoding and re-encoding them.

Metadata

NSDictionary *spec = @{(__bridge NSString *)kCMMetadataFormatDescriptionMetadataSpecificationKey_Identifier: [NSString stringWithFormat:@"%@/%@", AVMetadataKeySpaceQuickTimeMetadata, kKeyStillImageTime],
                       (__bridge NSString *)kCMMetadataFormatDescriptionMetadataSpecificationKey_DataType: (__bridge NSString *)kCMMetadataBaseDataType_UInt8};
CMFormatDescriptionRef description;
CMMetadataFormatDescriptionCreateWithMetadataSpecifications(kCFAllocatorDefault, kCMMetadataFormatType_Boxed, (__bridge CFArrayRef)(@[spec]), &description);
AVAssetWriterInput *metadataInput = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeMetadata outputSettings:nil sourceFormatHint:description];
AVAssetWriterInputMetadataAdaptor *adaptor = [[AVAssetWriterInputMetadataAdaptor alloc] initWithAssetWriterInput:metadataInput];
CFRelease(description);

Writer Metadata
AVMutableMetadataItem *assetIdentifierMetadataItem = [[AVMutableMetadataItem alloc] init];
assetIdentifierMetadataItem.key = AVMetadataQuickTimeMetadataKeyContentIdentifier;
assetIdentifierMetadataItem.keySpace = AVMetadataKeySpaceQuickTimeMetadata;
assetIdentifierMetadataItem.value = _assetIdentifier;
assetIdentifierMetadataItem.dataType = (__bridge NSString *)kCMMetadataBaseDataType_UTF8;

Attach the unique identifier.

Time Metadata
AVMutableMetadataItem *stillImageTimeMetadataItem = [[AVMutableMetadataItem alloc] init];
stillImageTimeMetadataItem.key = kKeyStillImageTime;
stillImageTimeMetadataItem.keySpace = AVMetadataKeySpaceQuickTimeMetadata;
stillImageTimeMetadataItem.value = [NSNumber numberWithUnsignedShort:0];
stillImageTimeMetadataItem.dataType = (__bridge NSString *)kCMMetadataBaseDataType_UInt8;

Mark the still-image time, i.e. the frame the system shows as the key photo.
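
The full listing below appends this item at kCMTimeZero, so the first frame becomes the key photo. A hedged sketch of marking a later frame instead, assuming generator, adaptor, and stillImageTimeMetadataItem from the sections above; the 1-second mark is arbitrary, and the JPEG should be generated at the same time:

CMTime stillTime = CMTimeMake(600, 600); /// 1.0 s into the video (arbitrary choice).
/// Generate the JPEG at the same time so the photo matches the marked frame;
/// write it out and release it as in the Image section above.
CGImageRef stillImage = [generator copyCGImageAtTime:stillTime actualTime:nil error:nil];
/// Append the still-image-time item over a short range starting at stillTime.
[adaptor appendTimedMetadataGroup:
    [[AVTimedMetadataGroup alloc] initWithItems:@[stillImageTimeMetadataItem]
                                      timeRange:CMTimeRangeMake(stillTime, CMTimeMake(200, 3000))]];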

AVAssetExportSession

AVAssetExportSession apparently cannot be used directly to produce the required .mov: it offers no way to add timed metadata on the timeline, so the system cannot tell which frame is the still image.
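
For completeness, here is a sketch of what AVAssetExportSession can contribute, assuming _asset, _videoOutputURL, and assetIdentifierMetadataItem from the sections above; its limitation is noted in the comments:

AVAssetExportSession *session = [[AVAssetExportSession alloc] initWithAsset:_asset
                                                                 presetName:AVAssetExportPresetPassthrough];
session.outputURL = _videoOutputURL;
session.outputFileType = AVFileTypeQuickTimeMovie;
/// Top-level metadata (the content identifier) can be attached this way...
session.metadata = @[assetIdentifierMetadataItem];
[session exportAsynchronouslyWithCompletionHandler:^{
    /// ...but there is no way to add the timed metadata track carrying
    /// com.apple.quicktime.still-image-time, so the result is not accepted
    /// as the video half of a Live Photo.
    NSLog(@"Export finished with status: %ld", (long)session.status);
}];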

Code

#import <Photos/Photos.h>

/// Details on the Apple Live Photo file format: https://stackoverflow.com/questions/32508375/apple-live-photo-file-format
NSString* const kKeyAppleMakerNoteAssetIdentifier = @"17";
NSString* const kKeyStillImageTime = @"com.apple.quicktime.still-image-time";

@interface LivePhotoConverter () {
    NSString *_assetIdentifier;
    NSURL *_imageOutputURL;
    NSURL *_videoOutputURL;
    AVURLAsset *_asset;
    NSURL *_livePhotoCacheURL;
}

@end

@implementation LivePhotoConverter

- (id)initWithMediaURL:(NSURL *)mediaURL {
    self = [super init];
    if (self) {
        /// Raw asset.
        _asset = [AVURLAsset URLAssetWithURL:mediaURL options:@{AVURLAssetPreferPreciseDurationAndTimingKey: @YES}];
        _livePhotoCacheURL = [LivePhotoConverter livePhotoCacheURL];
        if (![[NSFileManager defaultManager] fileExistsAtPath:_livePhotoCacheURL.path]) {
            [[NSFileManager defaultManager] createDirectoryAtURL:_livePhotoCacheURL withIntermediateDirectories:YES attributes:nil error:nil];
        }
        _assetIdentifier = [[NSUUID UUID] UUIDString];
        _imageOutputURL = [_livePhotoCacheURL URLByAppendingPathComponent:[_assetIdentifier stringByAppendingString:@".jpg"]];
        _videoOutputURL = [_livePhotoCacheURL URLByAppendingPathComponent:[_assetIdentifier stringByAppendingString:@".mov"]];
    }
    return self;
}

- (void)saveLivePhotoToAlbumWithProgressHandler:(void(^)(NSProgress *progress))progressHandler
                              completionHandler:(void(^)(BOOL success, NSError *_Nullable error))completionHandler {

    if (!_asset.URL.isFileURL || ![[NSFileManager defaultManager] fileExistsAtPath:_asset.URL.path]) {
        NSAssert(false, @"Only file URLs are supported and the file must exist.");
        if (completionHandler) {
            dispatch_async(dispatch_get_main_queue(), ^{
                completionHandler(NO, [NSError errorWithDomain:@"LivePhoto Only file url supported and file must exist." code:-1 userInfo:nil]);
            });
        }
        return;
    }

    /// Generate the still image from the first frame.
    AVAssetImageGenerator *generator = [AVAssetImageGenerator assetImageGeneratorWithAsset:_asset];
    generator.appliesPreferredTrackTransform = YES;
    CGImageRef cgImage = [generator copyCGImageAtTime:kCMTimeZero actualTime:nil error:nil];
    /// Set the pairing metadata.
    NSDictionary *metadata = @{(__bridge NSString *)kCGImagePropertyMakerAppleDictionary: @{kKeyAppleMakerNoteAssetIdentifier: _assetIdentifier}};
    NSMutableData *imageData = [NSMutableData new];
    CGImageDestinationRef dest = CGImageDestinationCreateWithData((__bridge CFMutableDataRef)imageData, kUTTypeJPEG, 1, nil);
    CGImageDestinationAddImage(dest, cgImage, (__bridge CFDictionaryRef)metadata);
    CGImageDestinationFinalize(dest);
    CGImageRelease(cgImage);
    CFRelease(dest);
    /// Write to file.
    if (![imageData writeToURL:_imageOutputURL atomically:YES]) {
        if (completionHandler) {
            dispatch_async(dispatch_get_main_queue(), ^{
                completionHandler(NO, [NSError errorWithDomain:@"LivePhoto Failed to write image to file." code:-1 userInfo:nil]);
            });
        }
        return;
    }

    AVAssetTrack *videoTrack = [[_asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
    if (videoTrack == nil) {
        if (completionHandler) {
            dispatch_async(dispatch_get_main_queue(), ^{
                completionHandler(NO, [NSError errorWithDomain:@"LivePhoto Can't get video track." code:-1 userInfo:nil]);
            });
        }
        return;
    }
    AVAssetTrack *audioTrack = [[_asset tracksWithMediaType:AVMediaTypeAudio] firstObject];
    BOOL hasAudio = audioTrack != nil;

    /// Progress is tracked in tenths of a second (duration in seconds * 10).
    const int64_t videoTotalUnitCount = (int64_t)(CMTimeGetSeconds(videoTrack.timeRange.duration) * 10);
    const int64_t audioTotalUnitCount = hasAudio ? (int64_t)(CMTimeGetSeconds(audioTrack.timeRange.duration) * 10) : 0;
    NSProgress *progress = [NSProgress progressWithTotalUnitCount:hasAudio ? videoTotalUnitCount + audioTotalUnitCount : videoTotalUnitCount];

    AVAssetReaderTrackOutput *videoReaderTrackOutput = [[AVAssetReaderTrackOutput alloc] initWithTrack:videoTrack
        outputSettings:@{(__bridge NSString *)kCVPixelBufferPixelFormatTypeKey: [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]}];
    AVAssetReader *videoReader = [[AVAssetReader alloc] initWithAsset:_asset error:nil];
    [videoReader addOutput:videoReaderTrackOutput];
    AVAssetWriterInput *videoWriteInput = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeVideo
        outputSettings:@{AVVideoCodecKey: AVVideoCodecH264,
                         AVVideoWidthKey: [NSNumber numberWithDouble:videoTrack.naturalSize.width],
                         AVVideoHeightKey: [NSNumber numberWithDouble:videoTrack.naturalSize.height]}];
    videoWriteInput.expectsMediaDataInRealTime = YES;
    videoWriteInput.transform = videoTrack.preferredTransform;

    AVAssetWriterInput *audioWriteInput = nil;
    AVAssetReaderTrackOutput *audioReaderTrackOutput = nil;
    AVAssetReader *audioReader = nil;
    if (hasAudio) {
        audioReaderTrackOutput = [[AVAssetReaderTrackOutput alloc] initWithTrack:audioTrack outputSettings:nil];
        audioReader = [[AVAssetReader alloc] initWithAsset:_asset error:nil];
        [audioReader addOutput:audioReaderTrackOutput];
        audioWriteInput = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeAudio outputSettings:nil];
        audioWriteInput.expectsMediaDataInRealTime = YES;
    }

    // --------------------------------------------------
    // Metadata input track.
    // --------------------------------------------------
    NSDictionary *spec = @{(__bridge NSString *)kCMMetadataFormatDescriptionMetadataSpecificationKey_Identifier: [NSString stringWithFormat:@"%@/%@", AVMetadataKeySpaceQuickTimeMetadata, kKeyStillImageTime],
                           (__bridge NSString *)kCMMetadataFormatDescriptionMetadataSpecificationKey_DataType: (__bridge NSString *)kCMMetadataBaseDataType_UInt8};
    CMFormatDescriptionRef description;
    CMMetadataFormatDescriptionCreateWithMetadataSpecifications(kCFAllocatorDefault, kCMMetadataFormatType_Boxed, (__bridge CFArrayRef)(@[spec]), &description);
    AVAssetWriterInput *metadataInput = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeMetadata outputSettings:nil sourceFormatHint:description];
    AVAssetWriterInputMetadataAdaptor *adaptor = [[AVAssetWriterInputMetadataAdaptor alloc] initWithAssetWriterInput:metadataInput];
    CFRelease(description);

    // --------------------------------------------------
    // Metadata item for the asset identifier.
    // --------------------------------------------------
    AVMutableMetadataItem *assetIdentifierMetadataItem = [[AVMutableMetadataItem alloc] init];
    assetIdentifierMetadataItem.key = AVMetadataQuickTimeMetadataKeyContentIdentifier;
    assetIdentifierMetadataItem.keySpace = AVMetadataKeySpaceQuickTimeMetadata;
    assetIdentifierMetadataItem.value = _assetIdentifier;
    assetIdentifierMetadataItem.dataType = (__bridge NSString *)kCMMetadataBaseDataType_UTF8;

    // --------------------------------------------------
    // Metadata item for the still-image time.
    // --------------------------------------------------
    AVMutableMetadataItem *stillImageTimeMetadataItem = [[AVMutableMetadataItem alloc] init];
    stillImageTimeMetadataItem.key = kKeyStillImageTime;
    stillImageTimeMetadataItem.keySpace = AVMetadataKeySpaceQuickTimeMetadata;
    stillImageTimeMetadataItem.value = [NSNumber numberWithUnsignedShort:0];
    stillImageTimeMetadataItem.dataType = (__bridge NSString *)kCMMetadataBaseDataType_UInt8;

    AVAssetWriter *writer = [[AVAssetWriter alloc] initWithURL:_videoOutputURL
                                                      fileType:AVFileTypeQuickTimeMovie error:nil];
    writer.metadata = @[assetIdentifierMetadataItem];
    [writer addInput:videoWriteInput];
    if (audioWriteInput) { [writer addInput:audioWriteInput]; }
    [writer addInput:adaptor.assetWriterInput];

    [videoReader startReading];
    if (hasAudio) { [audioReader startReading]; }
    [writer startWriting];
    [writer startSessionAtSourceTime:kCMTimeZero];
    /// The timed group must be appended after the writer has started writing.
    [adaptor appendTimedMetadataGroup:[[AVTimedMetadataGroup alloc] initWithItems:@[stillImageTimeMetadataItem]
                                                                        timeRange:CMTimeRangeMake(kCMTimeZero, CMTimeMake(200, 3000))]];
    dispatch_queue_t queue = dispatch_queue_create("com.livephoto.video.serial", DISPATCH_QUEUE_SERIAL);
    [videoWriteInput requestMediaDataWhenReadyOnQueue:queue usingBlock:^{
        while (videoWriteInput.isReadyForMoreMediaData) {
            CMSampleBufferRef videoBuffer = [videoReaderTrackOutput copyNextSampleBuffer];
            if (videoReader.status == AVAssetReaderStatusReading) {
                if (videoBuffer) {
                    CMTime presTime = CMSampleBufferGetPresentationTimeStamp(videoBuffer);
                    progress.completedUnitCount = (int64_t)(CMTimeGetSeconds(presTime) * 10);
                    if (progressHandler) {
                        dispatch_async(dispatch_get_main_queue(), ^{
                            progressHandler(progress);
                        });
                    }
                    [videoWriteInput appendSampleBuffer:videoBuffer];
                    CFRelease(videoBuffer);
                }
            } else if (videoReader.status == AVAssetReaderStatusCompleted) {
                [videoWriteInput markAsFinished];
                /// Write the audio track, if one exists. The writing session was
                /// already started above and must not be started a second time.
                if (hasAudio) {
                    [audioWriteInput requestMediaDataWhenReadyOnQueue:queue usingBlock:^{
                        while (audioWriteInput.isReadyForMoreMediaData) {
                            CMSampleBufferRef audioBuffer = [audioReaderTrackOutput copyNextSampleBuffer];
                            if (audioReader.status == AVAssetReaderStatusReading) {
                                if (audioBuffer) {
                                    CMTime presTime = CMSampleBufferGetPresentationTimeStamp(audioBuffer);
                                    progress.completedUnitCount = videoTotalUnitCount + (int64_t)(CMTimeGetSeconds(presTime) * 10);
                                    if (progressHandler) {
                                        dispatch_async(dispatch_get_main_queue(), ^{
                                            progressHandler(progress);
                                        });
                                    }
                                    [audioWriteInput appendSampleBuffer:audioBuffer];
                                    CFRelease(audioBuffer);
                                }
                            } else if (audioReader.status == AVAssetReaderStatusCompleted) {
                                [audioWriteInput markAsFinished];
                                [writer finishWritingWithCompletionHandler:^{
                                    [self saveToAlbumWithCompletionHandler:completionHandler];
                                }];
                                break;
                            } else {
                                /// Audio encoding failed.
                                if (completionHandler) {
                                    dispatch_async(dispatch_get_main_queue(), ^{
                                        completionHandler(NO, audioReader.error);
                                    });
                                }
                                NSLog(@"Failed to encode audio with error: %@", audioReader.error);
                                break;
                            }
                        }
                    }];
                } else {
                    /// No audio.
                    [writer finishWritingWithCompletionHandler:^{
                        [self saveToAlbumWithCompletionHandler:completionHandler];
                    }];
                }
                break;
            } else {
                /// Video encoding failed.
                if (completionHandler) {
                    dispatch_async(dispatch_get_main_queue(), ^{
                        completionHandler(NO, videoReader.error);
                    });
                }
                NSLog(@"Failed to encode video with error: %@", videoReader.error);
                break;
            }
        }
    }];
}

- (void)saveToAlbumWithCompletionHandler:(void(^)(BOOL success, NSError *error))completionHandler {

    if (![[NSFileManager defaultManager] fileExistsAtPath:_imageOutputURL.path]) {
        if (completionHandler) {
            dispatch_async(dispatch_get_main_queue(), ^{
                completionHandler(NO, [NSError errorWithDomain:@"LivePhoto image file not exist." code:-1 userInfo:nil]);
            });
        }
        return;
    }
    if (![[NSFileManager defaultManager] fileExistsAtPath:_videoOutputURL.path]) {
        if (completionHandler) {
            dispatch_async(dispatch_get_main_queue(), ^{
                completionHandler(NO, [NSError errorWithDomain:@"LivePhoto video file not exist." code:-1 userInfo:nil]);
            });
        }
        return;
    }

    /// Save to the photo library.
    [[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
        PHAssetCreationRequest *request = [PHAssetCreationRequest creationRequestForAsset];
        PHAssetResourceCreationOptions *options = [[PHAssetResourceCreationOptions alloc] init];
        /// Move the files into the library instead of copying them.
        options.shouldMoveFile = YES;
        [request addResourceWithType:PHAssetResourceTypePhoto fileURL:_imageOutputURL options:options];
        [request addResourceWithType:PHAssetResourceTypePairedVideo fileURL:_videoOutputURL options:options];
    } completionHandler:^(BOOL success, NSError * _Nullable error) {
        if (completionHandler) {
            dispatch_async(dispatch_get_main_queue(), ^{
                completionHandler(success, error);
            });
        }
    }];
}

+ (NSURL *)livePhotoCacheURL {
    return [[[NSFileManager.defaultManager URLsForDirectory:NSCachesDirectory inDomains:NSUserDomainMask] firstObject] URLByAppendingPathComponent:@"LivePhoto"];
}

@end
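
Usage, assuming photo-library authorization has already been granted; videoFileURL is a placeholder for a local media file URL:

LivePhotoConverter *converter = [[LivePhotoConverter alloc] initWithMediaURL:videoFileURL];
[converter saveLivePhotoToAlbumWithProgressHandler:^(NSProgress *progress) {
    NSLog(@"Converting: %.0f%%", progress.fractionCompleted * 100);
} completionHandler:^(BOOL success, NSError * _Nullable error) {
    NSLog(@"Saved: %d, error: %@", success, error);
}];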

⚠️ The example above uses the first frame as the key photo by default.

The video track is written first, then the audio track. Apple's sample code uses a dispatch_group to write video and audio concurrently, but to my mind the sequential approach is easier to get right.
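
For reference, a hedged sketch of that concurrent pattern: each input gets its own serial queue and a dispatch_group entry, and finishWriting runs once every input has drained.

dispatch_group_t group = dispatch_group_create();
dispatch_queue_t videoQueue = dispatch_queue_create("com.livephoto.video", DISPATCH_QUEUE_SERIAL);

dispatch_group_enter(group);
[videoWriteInput requestMediaDataWhenReadyOnQueue:videoQueue usingBlock:^{
    while (videoWriteInput.isReadyForMoreMediaData) {
        CMSampleBufferRef buffer = [videoReaderTrackOutput copyNextSampleBuffer];
        if (buffer) {
            [videoWriteInput appendSampleBuffer:buffer];
            CFRelease(buffer);
        } else {
            /// Reader drained (or failed; check videoReader.status in real code).
            [videoWriteInput markAsFinished];
            dispatch_group_leave(group);
            break;
        }
    }
}];
/// The audio input follows the same enter/request/leave pattern on its own queue.
dispatch_group_notify(group, dispatch_get_main_queue(), ^{
    [writer finishWritingWithCompletionHandler:^{ /* save to the album */ }];
});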

Reference

Photos框架之Live Photo相关

LivePhotoDemo