1. AVPlayerViewController
Before iOS 9 we typically used MPMoviePlayerController to play video and audio files. As of iOS 9, MPMoviePlayerController is deprecated and Apple recommends AVPlayerViewController instead, which is considerably more capable: a single AVPlayerViewController can play both video and audio files. Since it is built on AVFoundation and AVKit, we need to import the AVFoundation and AVKit frameworks before use. A concrete implementation follows.
Loading a local video file
If you want to load a local audio file instead, just change the resource type to the matching audio extension, such as mp3:

```swift
let filePath = NSBundle.mainBundle().pathForResource("赵传 - 每次都想呼喊你的名字", ofType: "mp3")
```
```swift
import UIKit
import AVKit
import AVFoundation

lazy var playerController: AVPlayerViewController = {
    let filePath = NSBundle.mainBundle().pathForResource("音乐喷泉", ofType: "mp4")
    let sourceMovieURL = NSURL(fileURLWithPath: filePath!)
    let movieAsset = AVURLAsset(URL: sourceMovieURL)
    let playItem = AVPlayerItem(asset: movieAsset)
    let player = AVPlayer(playerItem: playItem)
    let playerController = AVPlayerViewController()
    playerController.player = player
    playerController.view.frame = self.view.frame
    self.addChildViewController(playerController)
    playerController.videoGravity = AVLayerVideoGravityResizeAspectFill
    return playerController
}()

override func viewDidLoad() {
    super.viewDidLoad()
    self.view.addSubview(playerController.view)
}
```
AVFoundation supports three video fill (gravity) modes:
- `AVLayerVideoGravityResizeAspect`: preserves the aspect ratio; unfilled areas are letterboxed in black
- `AVLayerVideoGravityResizeAspectFill`: preserves the aspect ratio and fills the whole area (cropping if necessary)
- `AVLayerVideoGravityResize`: stretches the video to fill all available space
Loading a network video
```swift
lazy var playerController: AVPlayerViewController = {
    let player = AVPlayer(URL: NSURL(string: "http://www.ebookfrenzy.com/ios_book/movie/movie.mov")!)
    let playerController = AVPlayerViewController()
    playerController.player = player
    playerController.view.frame = self.view.frame
    self.addChildViewController(playerController)
    playerController.videoGravity = AVLayerVideoGravityResizeAspectFill
    return playerController
}()
```
2. AVAudioPlayer
AVAudioPlayer encapsulates playback of a single audio file and can be initialized from an NSURL or NSData. It has no streaming capability, but we can play a network resource indirectly by first fetching it into NSData with a network request. Again, start by importing the AVFoundation framework. A concrete implementation follows.
Loading local audio
```swift
import UIKit
import AVFoundation

class AudioPlayerViewController: UIViewController {

    lazy var audioPlayer: AVAudioPlayer? = {
        let filePath = NSBundle.mainBundle().pathForResource("赵传 - 每次都想呼喊你的名字", ofType: "mp3")
        let sourceAudioURL = NSURL(fileURLWithPath: filePath!)
        do {
            let audioPlayer = try AVAudioPlayer(contentsOfURL: sourceAudioURL)
            audioPlayer.delegate = self
            return audioPlayer
        } catch {
            // TODO: couldn't load the file
        }
        return nil
    }()

    override func viewDidLoad() {
        super.viewDidLoad()
        audioPlayer?.play()
    }
}

extension AudioPlayerViewController: AVAudioPlayerDelegate {
    // Called when the audio finishes playing
    func audioPlayerDidFinishPlaying(player: AVAudioPlayer, successfully flag: Bool) { }
    // Called on a decode error
    func audioPlayerDecodeErrorDidOccur(player: AVAudioPlayer, error: NSError?) { }
    // Called when playback is interrupted (e.g. by an incoming call)
    func audioPlayerBeginInterruption(player: AVAudioPlayer) { }
    // Called when the interruption ends
    func audioPlayerEndInterruption(player: AVAudioPlayer, withOptions flags: Int) { }
}
```
Loading network audio
- To load a network audio file, our strategy is to fetch the audio data as NSData over the network, then create a player instance from that data with AVAudioPlayer's initializer and play it.
- On iOS 7 and later, when the file format is known, you can create the instance with -initWithData:fileTypeHint:error: or -initWithContentsOfURL:fileTypeHint:error:, or save the NSData to a file with the matching extension and then use -initWithContentsOfURL:error:.
- Below iOS 7, when the file format is known, the safest approach is to save the NSData to a file with the matching extension and then use -initWithContentsOfURL:error:.
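Following the first two points, here is a minimal sketch for iOS 7+ (the function name is ours, the URL is a placeholder, and error handling is kept to a stub; note the player reference must be held strongly or playback stops immediately):

```swift
import AVFoundation
import Foundation

var audioPlayer: AVAudioPlayer?  // keep a strong reference, or playback stops at once

func playRemoteMP3(url: NSURL) {
    // Download the complete audio data first; AVAudioPlayer cannot stream.
    let task = NSURLSession.sharedSession().dataTaskWithURL(url) { data, _, error in
        guard let data = data where error == nil else { return }
        dispatch_async(dispatch_get_main_queue()) {
            do {
                // fileTypeHint (iOS 7+) tells the decoder the container format
                audioPlayer = try AVAudioPlayer(data: data, fileTypeHint: AVFileTypeMPEGLayer3)
                audioPlayer?.play()
            } catch {
                // couldn't decode the downloaded data
            }
        }
    }
    task.resume()
}
```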
Reference
For more caveats about loading network audio, see this post on the 码农人生 tech blog:
AVAudioPlayer的1937337955错误研究 (a study of AVAudioPlayer error 1937337955)
3. AVPlayer
AVPlayer supports both local and network streaming playback; AVPlayerViewController is in fact a wrapper around AVPlayer. If we want to build a custom streaming video or audio player, we can do so straightforwardly with AVPlayer. A concrete implementation follows.
Initializing AVPlayer
The url here can be either a local media resource URL or a network resource URL.
```swift
let playerItem = AVPlayerItem(URL: NSURL(string: "url")!)
let player = AVPlayer(playerItem: playerItem)
```
Adding property observers
Here we use KVO to observe two properties of the player's currentItem: status and loadedTimeRanges. status is an enum describing the player's state; only when it is ReadyToPlay can we play the resource and read information such as its duration. By observing loadedTimeRanges in real time we can track the resource's buffering progress.
```swift
public enum AVPlayerStatus : Int {
    case Unknown
    case ReadyToPlay
    case Failed
}
```
```swift
// Add observers
func addObserverForPlayItem() {
    player.currentItem?.addObserver(self, forKeyPath: "status", options: .New, context: nil)
    player.currentItem?.addObserver(self, forKeyPath: "loadedTimeRanges", options: .New, context: nil)
}

// Remove observers (e.g. in deinit)
func removeObserverForPlayItem() {
    player.currentItem?.removeObserver(self, forKeyPath: "status")
    player.currentItem?.removeObserver(self, forKeyPath: "loadedTimeRanges")
}

override func observeValueForKeyPath(keyPath: String?, ofObject object: AnyObject?, change: [String : AnyObject]?, context: UnsafeMutablePointer<Void>) {
    if keyPath == "status" {
        if let status = change?[NSKeyValueChangeNewKey] as? Int
            where status == AVPlayerStatus.ReadyToPlay.rawValue {
            // e.g. read the duration and refresh the UI
        }
    } else if keyPath == "loadedTimeRanges" {
        if let timeRange = player.currentItem?.loadedTimeRanges.first?.CMTimeRangeValue {
            let startSeconds = CMTimeGetSeconds(timeRange.start)
            let durationSeconds = CMTimeGetSeconds(timeRange.duration)
            let bufferTime = startSeconds + durationSeconds
            // use bufferTime to update the buffering progress bar
        }
    }
}
```
Other related properties and methods
Play: `player.play()`
Pause: `player.pause()`
Getting the current playback time: `timescale` is the number of time units per second (not the frame rate), so to convert a CMTime's `value` into seconds we divide:
```swift
let currentTime = Double(player.currentTime().value) / Double(player.currentTime().timescale)
```
Getting the total duration:
```swift
let duration = Double(player.currentItem!.duration.value) / Double(player.currentItem!.duration.timescale)
```
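Combining the two values above, a small helper (our own sketch, assuming `player` is an AVPlayer whose item has loaded) can compute a 0–1 progress fraction for driving a progress bar:

```swift
import AVFoundation

// Returns the playback progress as a fraction in [0, 1].
func playbackProgress(player: AVPlayer) -> Double {
    guard let item = player.currentItem else { return 0 }
    let current = CMTimeGetSeconds(player.currentTime())
    let total = CMTimeGetSeconds(item.duration)
    // duration is indefinite (NaN) until the item reaches ReadyToPlay
    return total.isFinite && total > 0 ? current / total : 0
}
```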
Seeking (scrubbing the progress bar): `player.seekToTime(cmTime)`
This method jumps playback to a specified moment. The parameter is a CMTime, a type dedicated to representing media time. We can create a CMTime instance with its initializer:
```swift
let timescale = player.currentItem!.duration.timescale
let cmTime = CMTime(seconds: seconds, preferredTimescale: timescale)
player.seekToTime(cmTime)
```
Related links
码农人生: iOS audio playback blog series