How to make a Slow Motion video in iOS

I need to apply "slow motion" to a video file, together with its audio, in between frames, and save the slowed-down result as a new video.

Link: http://www.youtube.com/watch?v=BJ3_xMGzauk (watch from 0 to 10 s)

From my analysis, I found that the AVFoundation framework could be helpful.

Link: http://developer.apple.com/library/ios/#documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/00_Introduction.html

Copied and pasted from the link above:

"Editing: AV Foundation uses compositions to create new assets from existing pieces of media (typically, one or more video and audio tracks). You use a mutable composition to add and remove tracks, and adjust their temporal orderings. You can also set the relative volumes and ramping of audio tracks, and set the opacity, and opacity ramps, of video tracks. A composition is an assemblage of pieces of media held in memory. When you export a composition using an export session, it is collapsed to a file. On iOS 4.1 and later, you can also create an asset from media such as sample buffers or still images using an asset writer."

Questions: Can I apply "slow motion" to a video/audio file using the AVFoundation framework? Or is there some other package? If I want to process the audio and video separately, please suggest how to do that.

Update: Code for the AV export session:

    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *outputURL = paths[0];
    NSFileManager *manager = [NSFileManager defaultManager];
    [manager createDirectoryAtPath:outputURL withIntermediateDirectories:YES attributes:nil error:nil];
    outputURL = [outputURL stringByAppendingPathComponent:@"output.mp4"];
    // Remove Existing File
    [manager removeItemAtPath:outputURL error:nil];
    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:self.inputAsset presetName:AVAssetExportPresetLowQuality];
    exportSession.outputURL = [NSURL fileURLWithPath:outputURL]; // output path;
    exportSession.outputFileType = AVFileTypeQuickTimeMovie;
    [exportSession exportAsynchronouslyWithCompletionHandler:^(void) {
        if (exportSession.status == AVAssetExportSessionStatusCompleted) {
            [self writeVideoToPhotoLibrary:[NSURL fileURLWithPath:outputURL]];
            ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
            [library writeVideoAtPathToSavedPhotosAlbum:[NSURL fileURLWithPath:outputURL] completionBlock:^(NSURL *assetURL, NSError *error){
                if (error) {
                    NSLog(@"Video could not be saved");
                }
            }];
        } else {
            NSLog(@"error: %@", [exportSession error]);
        }
    }];

7 Answers

Solution

You can scale video using the AVFoundation and CoreMedia frameworks. Take a look at the AVMutableCompositionTrack method:

- (void)scaleTimeRange:(CMTimeRange)timeRange toDuration:(CMTime)duration;

Sample:

AVURLAsset* videoAsset = nil; //self.inputAsset;

//create mutable composition
AVMutableComposition *mixComposition = [AVMutableComposition composition];

AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                               preferredTrackID:kCMPersistentTrackID_Invalid];
NSError *videoInsertError = nil;
BOOL videoInsertResult = [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                                        ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                                                         atTime:kCMTimeZero
                                                          error:&videoInsertError];
if (!videoInsertResult || nil != videoInsertError) {
    //handle error
    return;
}

//slow down whole video by 2.0
double videoScaleFactor = 2.0;
CMTime videoDuration = videoAsset.duration;

[compositionVideoTrack scaleTimeRange:CMTimeRangeMake(kCMTimeZero, videoDuration)
                           toDuration:CMTimeMake(videoDuration.value*videoScaleFactor, videoDuration.timescale)];

//export
AVAssetExportSession* assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                     presetName:AVAssetExportPresetLowQuality];

(The audio track from videoAsset probably needs to be added to mixComposition as well.)
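
For reference, a minimal Swift sketch of that extra step, assuming the same source asset and mutable composition as in the sample above (the helper name addScaledAudio(from:to:scaleFactor:) is only illustrative, not part of the original answer):

    import AVFoundation

    /// Inserts the source asset's audio track into the composition and scales it
    /// by the same factor as the video so both tracks stay in sync.
    func addScaledAudio(from sourceAsset: AVAsset,
                        to mixComposition: AVMutableComposition,
                        scaleFactor: Float64) throws {
        guard let sourceAudioTrack = sourceAsset.tracks(withMediaType: .audio).first,
              let compositionAudioTrack = mixComposition.addMutableTrack(
                  withMediaType: .audio,
                  preferredTrackID: kCMPersistentTrackID_Invalid) else { return }

        let fullRange = CMTimeRange(start: .zero, duration: sourceAsset.duration)
        try compositionAudioTrack.insertTimeRange(fullRange, of: sourceAudioTrack, at: .zero)

        // Stretch the audio to the new (slowed-down) duration.
        let scaledDuration = CMTimeMultiplyByFloat64(sourceAsset.duration, multiplier: scaleFactor)
        compositionAudioTrack.scaleTimeRange(fullRange, toDuration: scaledDuration)
    }

Called right after the video track has been inserted and scaled, this keeps the audio aligned with the slowed-down video.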

I tried it and was able to slow down the asset.

compositionVideoTrack?.scaleTimeRange(timeRange, toDuration: scaledVideoDuration) did the trick.

I made a class that will help you create a slower video from an AVAsset. A plus is that you can also make it faster, and another plus is that it handles the audio too.

Here is my custom class:

import UIKit
import AVFoundation

enum SpeedoMode {
    case Slower
    case Faster
}

class VSVideoSpeeder: NSObject {

    /// Singleton instance of `VSVideoSpeeder`
    static var shared: VSVideoSpeeder = {
       return VSVideoSpeeder()
    }()

    /// Range is b/w 1x, 2x and 3x. Nothing will happen if the scale is out of range. Exporter will be nil if the url is invalid or an asset instance can't be created.
    func scaleAsset(fromURL url: URL,  by scale: Int64, withMode mode: SpeedoMode, completion: @escaping (_ exporter: AVAssetExportSession?) -> Void) {

        /// Check the valid scale
        if scale < 1 || scale > 3 {
            /// Can not proceed, Invalid range
            completion(nil)
            return
        }

        /// Asset
        let asset = AVAsset(url: url)

        /// Video Tracks
        let videoTracks = asset.tracks(withMediaType: AVMediaType.video)
        if videoTracks.count == 0 {
            /// Can not find any video track
            completion(nil)
            return
        }

        /// Get the scaled video duration
        let scaledVideoDuration = (mode == .Faster) ? CMTimeMake(asset.duration.value / scale, asset.duration.timescale) : CMTimeMake(asset.duration.value * scale, asset.duration.timescale)
        let timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration)

        /// Video track
        let videoTrack = videoTracks.first!

        let mixComposition = AVMutableComposition()
        let compositionVideoTrack = mixComposition.addMutableTrack(withMediaType: AVMediaType.video, preferredTrackID: kCMPersistentTrackID_Invalid)

        /// Audio Tracks
        let audioTracks = asset.tracks(withMediaType: AVMediaType.audio)
        if audioTracks.count > 0 {
            /// Use audio if video contains the audio track
            let compositionAudioTrack = mixComposition.addMutableTrack(withMediaType: AVMediaType.audio, preferredTrackID: kCMPersistentTrackID_Invalid)

            /// Audio track
            let audioTrack = audioTracks.first!
            do {
                try compositionAudioTrack?.insertTimeRange(timeRange, of: audioTrack, at: kCMTimeZero)
                compositionAudioTrack?.scaleTimeRange(timeRange, toDuration: scaledVideoDuration)
            } catch _ {
                /// Ignore audio error
            }
        }

        do {
            try compositionVideoTrack?.insertTimeRange(timeRange, of: videoTrack, at: kCMTimeZero)
            compositionVideoTrack?.scaleTimeRange(timeRange, toDuration: scaledVideoDuration)

            /// Keep original transformation
            compositionVideoTrack?.preferredTransform = videoTrack.preferredTransform

            /// Initialize Exporter now
            let outputFileURL = URL(fileURLWithPath: "/Users/thetiger/Desktop/scaledVideo.mov")
           /// Note:- Please use directory path if you are testing with device.

            if FileManager.default.fileExists(atPath: outputFileURL.path) {
                try FileManager.default.removeItem(at: outputFileURL)
            }

            let exporter = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality)
            exporter?.outputURL = outputFileURL
            exporter?.outputFileType = AVFileType.mov
            exporter?.shouldOptimizeForNetworkUse = true
            exporter?.exportAsynchronously(completionHandler: {
                completion(exporter)
            })

        } catch let error {
            print(error.localizedDescription)
            completion(nil)
            return
        }
    }

}

I took 1x, 2x and 3x as the valid scales. The class contains proper validation and handling. Below is a sample of how to use this function:

let url = Bundle.main.url(forResource: "1", withExtension: "mp4")!
VSVideoSpeeder.shared.scaleAsset(fromURL: url, by: 3, withMode: SpeedoMode.Slower) { (exporter) in
    if let exporter = exporter {
        switch exporter.status {
        case .failed:
            print(exporter.error?.localizedDescription ?? "Error in exporting..")
        case .completed:
            print("Scaled video has been generated successfully!")
        case .unknown: break
        case .waiting: break
        case .exporting: break
        case .cancelled: break
        }
    } else {
        /// Error
        print("Exporter is not initialized.")
    }
}

This line handles the audio scaling:

compositionAudioTrack?.scaleTimeRange(timeRange, toDuration: scaledVideoDuration)

I know I am late to this thread, but I managed to add slow motion to a video, including the audio, and with the correct output orientation. Hope this helps someone.

 - (void)SlowMotion:(NSURL *)URl
 {
   AVURLAsset* videoAsset = [AVURLAsset URLAssetWithURL:URl options:nil]; //self.inputAsset;

AVAsset *currentAsset = [AVAsset assetWithURL:URl];
AVAssetTrack *vdoTrack = [[currentAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
//create mutable composition
AVMutableComposition *mixComposition = [AVMutableComposition composition];

AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];

NSError *videoInsertError = nil;
BOOL videoInsertResult = [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                                        ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                                                         atTime:kCMTimeZero
                                                          error:&videoInsertError];
if (!videoInsertResult || nil != videoInsertError) {
    //handle error
    return;
}

NSError *audioInsertError =nil;
BOOL audioInsertResult =[compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                                       ofTrack:[[currentAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                                                        atTime:kCMTimeZero
                                                         error:&audioInsertError];

if (!audioInsertResult || nil != audioInsertError) {
    //handle error
    return;
}

CMTime duration =kCMTimeZero;
duration=CMTimeAdd(duration, currentAsset.duration);
//slow down whole video by 2.0
double videoScaleFactor = 2.0;
CMTime videoDuration = videoAsset.duration;

[compositionVideoTrack scaleTimeRange:CMTimeRangeMake(kCMTimeZero, videoDuration)
                           toDuration:CMTimeMake(videoDuration.value*videoScaleFactor, videoDuration.timescale)];
[compositionAudioTrack scaleTimeRange:CMTimeRangeMake(kCMTimeZero, videoDuration)
                           toDuration:CMTimeMake(videoDuration.value*videoScaleFactor, videoDuration.timescale)];
[compositionVideoTrack setPreferredTransform:vdoTrack.preferredTransform];

NSArray *dirPaths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *docsDir = [dirPaths objectAtIndex:0];
NSString *outputFilePath = [docsDir stringByAppendingPathComponent:@"slowMotion.mov"];
if ([[NSFileManager defaultManager] fileExistsAtPath:outputFilePath])
    [[NSFileManager defaultManager] removeItemAtPath:outputFilePath error:nil];
NSURL *_filePath = [NSURL fileURLWithPath:outputFilePath];

//export
AVAssetExportSession* assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                     presetName:AVAssetExportPresetLowQuality];
assetExport.outputURL = _filePath;
assetExport.outputFileType = AVFileTypeQuickTimeMovie;
assetExport.shouldOptimizeForNetworkUse = YES;
[assetExport exportAsynchronouslyWithCompletionHandler:^{
    switch ([assetExport status]) {
        case AVAssetExportSessionStatusFailed:
        {
            NSLog(@"Export session failed with error: %@", [assetExport error]);
            dispatch_async(dispatch_get_main_queue(), ^{
                // completion(nil);
            });
        }
            break;
        case AVAssetExportSessionStatusCompleted:
        {
            NSLog(@"Successful");
            NSURL *outputURL = assetExport.outputURL;

            ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
            if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:outputURL]) {
                [self writeExportedVideoToAssetsLibrary:outputURL];
            }
            dispatch_async(dispatch_get_main_queue(), ^{
                // completion(_filePath);
            });
        }
            break;
        default:
            break;
    }
}];
}

- (void)writeExportedVideoToAssetsLibrary:(NSURL *)url {
NSURL *exportURL = url;
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:exportURL]) {
    [library writeVideoAtPathToSavedPhotosAlbum:exportURL completionBlock:^(NSURL *assetURL, NSError *error){
        dispatch_async(dispatch_get_main_queue(), ^{
            if (error) {
                UIAlertView *alertView = [[UIAlertView alloc] initWithTitle:[error localizedDescription]
                                                                    message:[error localizedRecoverySuggestion]
                                                                   delegate:nil
                                                          cancelButtonTitle:@"OK"
                                                          otherButtonTitles:nil];
                [alertView show];
            }
            if(!error)
            {
               // [activityView setHidden:YES];
                UIAlertView *alertView = [[UIAlertView alloc] initWithTitle:@"Success"
                                                                    message:@"video added to gallery successfully"
                                                                   delegate:nil
                                                          cancelButtonTitle:@"OK"
                                                          otherButtonTitles:nil];
                [alertView show];
            }
 #if !TARGET_IPHONE_SIMULATOR
            [[NSFileManager defaultManager] removeItemAtURL:exportURL error:nil];
#endif
        });
    }];
} else {
    NSLog(@"Video could not be exported to assets library.");
}

}

I would extract all the frames from the original video using ffmpeg, and then assemble them back together using AVAssetWriter, but with a lower frame rate. For a smoother slow motion you may need to apply some blur effect, or even generate an intermediate frame blended from the two existing frames.
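
A rough sketch of that re-timing idea, reading decoded frames with AVAssetReader and writing them back with stretched timestamps through AVAssetWriter (video only; no ffmpeg extraction, frame blending or audio here; the name retimeVideo(from:to:slowFactor:) and the 2x default are assumptions for illustration, not a tested implementation):

    import Foundation
    import AVFoundation
    import CoreMedia

    /// Reads decoded frames from the source and appends them to a new file
    /// with presentation timestamps multiplied by `slowFactor`.
    func retimeVideo(from sourceURL: URL, to outputURL: URL, slowFactor: Float64 = 2.0) throws {
        let asset = AVURLAsset(url: sourceURL)
        guard let videoTrack = asset.tracks(withMediaType: .video).first else { return }

        let reader = try AVAssetReader(asset: asset)
        let readerOutput = AVAssetReaderTrackOutput(
            track: videoTrack,
            outputSettings: [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA])
        reader.add(readerOutput)

        let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)
        let writerInput = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: videoTrack.naturalSize.width,
            AVVideoHeightKey: videoTrack.naturalSize.height])
        writerInput.transform = videoTrack.preferredTransform
        let adaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: writerInput,
                                                           sourcePixelBufferAttributes: nil)
        writer.add(writerInput)

        guard reader.startReading(), writer.startWriting() else { return }
        writer.startSession(atSourceTime: .zero)

        // Pull every decoded frame and append it at originalTime * slowFactor.
        while let sample = readerOutput.copyNextSampleBuffer() {
            guard let pixelBuffer = CMSampleBufferGetImageBuffer(sample) else { continue }
            let originalTime = CMSampleBufferGetPresentationTimeStamp(sample)
            let stretchedTime = CMTimeMultiplyByFloat64(originalTime, multiplier: slowFactor)
            while !writerInput.isReadyForMoreMediaData {
                Thread.sleep(forTimeInterval: 0.01)
            }
            if !adaptor.append(pixelBuffer, withPresentationTime: stretchedTime) { break }
        }

        writerInput.markAsFinished()
        // finishWriting completes asynchronously; a real implementation would wait for it.
        writer.finishWriting {
            print("finished writing, status: \(writer.status.rawValue)")
        }
    }

Note that this re-encodes every frame and drops the audio, so the audio would still have to be scaled separately, for example with a composition as in the answers above.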

Example in Swift:

I

var asset: AVAsset?  
func configureAssets(){

    let videoAsset = AVURLAsset(url: Bundle.main.url(forResource: "sample", withExtension: "m4v")!)
    let audioAsset = AVURLAsset(url: Bundle.main.url(forResource: "sample", withExtension: "m4a")!)
    //    let audioAsset2 = AVURLAsset(url: Bundle.main.url(forResource: "audio2", withExtension: "m4a")!)

    let comp = AVMutableComposition()

    let videoAssetSourceTrack = videoAsset.tracks(withMediaType: AVMediaTypeVideo).first! as AVAssetTrack
    let audioAssetSourceTrack = videoAsset.tracks(withMediaType: AVMediaTypeAudio).first! as AVAssetTrack
    //    let audioAssetSourceTrack2 = audioAsset2.tracks(withMediaType: AVMediaTypeAudio).first! as AVAssetTrack

    let videoCompositionTrack = comp.addMutableTrack(withMediaType: AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)
    let audioCompositionTrack = comp.addMutableTrack(withMediaType: AVMediaTypeAudio, preferredTrackID: kCMPersistentTrackID_Invalid)

    do {

        try videoCompositionTrack.insertTimeRange(
            CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(9 , 600)),
            of: videoAssetSourceTrack,
            at: kCMTimeZero)



        try audioCompositionTrack.insertTimeRange(
            CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(9, 600)),
            of: audioAssetSourceTrack,
            at: kCMTimeZero)

        //
        //      try audioCompositionTrack.insertTimeRange(
        //        CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(3, 600)),
        //        of: audioAssetSourceTrack2,
        //        at: CMTimeMakeWithSeconds(7, 600))

        let videoScaleFactor = Int64(2.0)
        let videoDuration: CMTime = videoAsset.duration


        videoCompositionTrack.scaleTimeRange(CMTimeRangeMake(kCMTimeZero, videoDuration), toDuration: CMTimeMake(videoDuration.value * videoScaleFactor, videoDuration.timescale))
        audioCompositionTrack.scaleTimeRange(CMTimeRangeMake(kCMTimeZero, videoDuration), toDuration: CMTimeMake(videoDuration.value * videoScaleFactor, videoDuration.timescale))
        videoCompositionTrack.preferredTransform = videoAssetSourceTrack.preferredTransform



    }catch { print(error) }

    asset = comp
}

II

func createFileFromAsset(_ asset: AVAsset){

    let documentsDirectory = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0] as URL

    let filePath = documentsDirectory.appendingPathComponent("rendered-audio.m4v")
    deleteFile(filePath)

    if let exportSession = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetLowQuality){

        exportSession.canPerformMultiplePassesOverSourceMediaData = true
        exportSession.outputURL = filePath
        exportSession.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration)
        exportSession.outputFileType = AVFileTypeQuickTimeMovie
        exportSession.exportAsynchronously {
            print("finished: \(filePath) :  \(exportSession.status.rawValue) ")
        }
    }
}

func deleteFile(_ filePath: URL) {
    guard FileManager.default.fileExists(atPath: filePath.path) else {
        return
    }

    do {
        try FileManager.default.removeItem(atPath: filePath.path)
    } catch {
        fatalError("Unable to delete file: \(error) : \(#function).")
    }
}

Creating a "slow motion" video in iOS Swift is not an easy task; I came across many "slow motion" approaches that turned out not to work, or whose code was outdated. So here is how I finally managed to do slow motion in Swift. Note: this code can be used for 120 fps and above. You can slow down the audio in the same way I did.

Here is the code snippet I created for slow motion:

Give me an UPVOTE if this code works.

// Requires `import AVFoundation` and `import Photos`
func slowMotion(pathUrl: URL) {

    let videoAsset = AVURLAsset.init(url: pathUrl, options: nil)
    let currentAsset = AVAsset.init(url: pathUrl)

    let vdoTrack = currentAsset.tracks(withMediaType: .video)[0]
    let mixComposition = AVMutableComposition()

    let compositionVideoTrack = mixComposition.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid)

    do {
        try compositionVideoTrack?.insertTimeRange(
            CMTimeRangeMake(start: .zero, duration: videoAsset.duration),
            of: videoAsset.tracks(withMediaType: .video)[0],
            at: .zero)
    } catch {
        // handle error
        return
    }


    var duration: CMTime = .zero
    duration = CMTimeAdd(duration, currentAsset.duration)
    
    
    //MARK: You see this constant (videoScaleFactor) this helps in achieving the slow motion that you wanted. This increases the time scale of the video that makes slow motion
    // just increase the videoScaleFactor value in order to play video in higher frames rates(more slowly)
    let videoScaleFactor = 2.0
    let videoDuration = videoAsset.duration
    
    compositionVideoTrack?.scaleTimeRange(
        CMTimeRangeMake(start: .zero, duration: videoDuration),
        toDuration: CMTimeMake(value: videoDuration.value * Int64(videoScaleFactor), timescale: videoDuration.timescale))
    compositionVideoTrack?.preferredTransform = vdoTrack.preferredTransform
    
    let dirPaths = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).map(\.path)
    let docsDir = dirPaths[0]
    let outputFilePath = URL(fileURLWithPath: docsDir).appendingPathComponent("slowMotion\(UUID().uuidString).mp4").path
    
    if FileManager.default.fileExists(atPath: outputFilePath) {
        do {
            try FileManager.default.removeItem(atPath: outputFilePath)
        } catch {
        }
    }
    let filePath = URL(fileURLWithPath: outputFilePath)
    
    let assetExport = AVAssetExportSession(
        asset: mixComposition,
        presetName: AVAssetExportPresetHighestQuality)
    assetExport?.outputURL = filePath
    assetExport?.outputFileType = .mp4
    
    assetExport?.exportAsynchronously(completionHandler: {
        switch assetExport?.status {
        case .failed:
            print("asset output media url = \(String(describing: assetExport?.outputURL))")
            print("Export session failed with error: \(String(describing: assetExport?.error))")
            DispatchQueue.main.async(execute: {
                // completion(nil);
            })
        case .completed:
            print("Successful")
            let outputURL = assetExport!.outputURL
            print("url path = \(String(describing: outputURL))")
            
            PHPhotoLibrary.shared().performChanges({
                PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: outputURL!)
            }) { saved, error in
                if saved {
                    print("video successfully saved in photos gallery view video in photos gallery")
                }
                if (error != nil) {
                    print("error in saving video \(String(describing: error?.localizedDescription))")
                }
            }
            DispatchQueue.main.async(execute: {
                //      completion(_filePath);
            })
        case .none:
            break
        case .unknown:
            break
        case .waiting:
            break
        case .exporting:
            break
        case .cancelled:
            break
        case .some(_):
            break
        }
    })
}

Swift 5

Here is @TheTiger's code converted to Swift 5:

import UIKit
import AVFoundation


    enum SpeedoMode {
        case Slower
        case Faster
    }

    class VSVideoSpeeder: NSObject {

        /// Singleton instance of `VSVideoSpeeder`
        static var shared: VSVideoSpeeder = {
           return VSVideoSpeeder()
        }()

        /// Range is b/w 1x, 2x and 3x. Nothing will happen if the scale is out of range. Exporter will be nil if the url is invalid or an asset instance can't be created.
        func scaleAsset(fromURL url: URL,  by scale: Int64, withMode mode: SpeedoMode, completion: @escaping (_ exporter: AVAssetExportSession?) -> Void) {

            /// Check the valid scale
            if scale < 1 || scale > 3 {
                /// Can not proceed, Invalid range
                completion(nil)
                return
            }

            /// Asset
            let asset = AVAsset(url: url)

            /// Video Tracks
            let videoTracks = asset.tracks(withMediaType: AVMediaType.video)
            if videoTracks.count == 0 {
                /// Can not find any video track
                completion(nil)
                return
            }

            /// Get the scaled video duration
            let scaledVideoDuration = (mode == .Faster) ? CMTimeMake(value: asset.duration.value / scale, timescale: asset.duration.timescale) : CMTimeMake(value: asset.duration.value * scale, timescale: asset.duration.timescale)
            let timeRange = CMTimeRangeMake(start: CMTime.zero, duration: asset.duration)

            /// Video track
            let videoTrack = videoTracks.first!

            let mixComposition = AVMutableComposition()
            let compositionVideoTrack = mixComposition.addMutableTrack(withMediaType: AVMediaType.video, preferredTrackID: kCMPersistentTrackID_Invalid)

            /// Audio Tracks
            let audioTracks = asset.tracks(withMediaType: AVMediaType.audio)
            if audioTracks.count > 0 {
                /// Use audio if video contains the audio track
                let compositionAudioTrack = mixComposition.addMutableTrack(withMediaType: AVMediaType.audio, preferredTrackID: kCMPersistentTrackID_Invalid)

                /// Audio track
                let audioTrack = audioTracks.first!
                do {
                    try compositionAudioTrack?.insertTimeRange(timeRange, of: audioTrack, at: CMTime.zero)
                    compositionAudioTrack?.scaleTimeRange(timeRange, toDuration: scaledVideoDuration)
                } catch _ {
                    /// Ignore audio error
                }
            }

            do {
                try compositionVideoTrack?.insertTimeRange(timeRange, of: videoTrack, at: CMTime.zero)
                compositionVideoTrack?.scaleTimeRange(timeRange, toDuration: scaledVideoDuration)

                /// Keep original transformation
                compositionVideoTrack?.preferredTransform = videoTrack.preferredTransform

                /// Initialize Exporter now
                let outputFileURL = URL(fileURLWithPath: "/Users/thetiger/Desktop/scaledVideo.mov")
               /// Note:- Please use directory path if you are testing with device.

                if FileManager.default.fileExists(atPath: outputFileURL.path) {
                    try FileManager.default.removeItem(at: outputFileURL)
                }

                let exporter = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality)
                exporter?.outputURL = outputFileURL
                exporter?.outputFileType = AVFileType.mov
                exporter?.shouldOptimizeForNetworkUse = true
                exporter?.exportAsynchronously(completionHandler: {
                    completion(exporter)
                })

            } catch let error {
                print(error.localizedDescription)
                completion(nil)
                return
            }
        }

    }


With the same usage:

let url = Bundle.main.url(forResource: "1", withExtension: "mp4")!
VSVideoSpeeder.shared.scaleAsset(fromURL: url, by: 3, withMode: SpeedoMode.Slower) { (exporter) in
    if let exporter = exporter {
        switch exporter.status {
        case .failed:
            print(exporter.error?.localizedDescription ?? "Error in exporting..")
        case .completed:
            print("Scaled video has been generated successfully!")
        case .unknown: break
        case .waiting: break
        case .exporting: break
        case .cancelled: break
        }
    } else {
        /// Error
        print("Exporter is not initialized.")
    }
}