Android Multimedia Playback Flow
- MediaPlayer framework
- MediaPlayer state diagram
- Creating NuPlayer
- setDataSource
- prepareAsync
- OnStart and Decoder
MediaPlayer framework
§ Android 1.0 → PacketVideo OpenCore
§ Android 1.6 → OpenCore 2.0
§ Android 2.0 → Stagefright
§ Android 2.1 → Stagefright
§ Android 2.3 → Stagefright with enhancements
§ Android 3.0 → Stagefright + HLS for the first time
§ Android 4.1 to 4.3 → Stagefright – support for MediaCodec
§ Android 5.0 → MediaSession and MediaController classes introduced
Android's MediaPlayer playback framework has gone through OpenCore, StagefrightPlayer (AwesomePlayer) and then NuPlayerDriver. Android 2.3 introduced the streaming framework, whose core is NuPlayer. From Android 4.0 on, HTTP Live Streaming and RTSP were played by NuPlayer while local playback still used AwesomePlayer; since Android 5.0 (the L release) local playback uses NuPlayer as well.
m8976/frameworks/av/include/media/MediaPlayerInterface.h
enum player_type {
    NU_PLAYER = 4,
    // Test players are available only in the 'test' and 'eng' builds.
    // The shared library with the test player is passed as an
    // argument to the 'test:' url in the setDataSource call.
    TEST_PLAYER = 5,
    DASH_PLAYER = 6,
};
MediaPlayer state diagram
State diagram (figure): the MediaPlayer state machine, covering the Idle, Initialized, Prepared, Started, Paused, Stopped, PlaybackCompleted, End and Error states.
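To make the diagram concrete, here is a minimal Java sketch of one typical pass through those states; the file path and the error handling are illustrative only, not from the original text:

import android.media.MediaPlayer;
import java.io.IOException;

public class StateDemo {
    static void playOnce(String path) {
        MediaPlayer mp = new MediaPlayer();          // Idle
        try {
            mp.setDataSource(path);                  // Idle -> Initialized
            mp.prepare();                            // Initialized -> Prepared (blocks)
            mp.start();                              // Prepared -> Started
            mp.pause();                              // Started -> Paused
            mp.start();                              // Paused -> Started
            mp.stop();                               // Started/Paused -> Stopped
        } catch (IOException | IllegalStateException e) {
            mp.reset();                              // any state -> Idle
        } finally {
            mp.release();                            // -> End, the object is unusable afterwards
        }
    }
}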
Creating NuPlayer
There are two ways for the app layer to create a MediaPlayer:
MediaPlayer mediaPlayer = MediaPlayer.create(context, R.raw.sound_file_1);
mediaPlayer.start(); // no need to call prepare(); create() does that for you
Or: MediaPlayer mediaPlayer = new MediaPlayer(); for example:
public class MyService extends Service implements MediaPlayer.OnPreparedListener {
    private static final String ACTION_PLAY = "com.example.action.PLAY";
    MediaPlayer mMediaPlayer = null;

    public int onStartCommand(Intent intent, int flags, int startId) {
        ...
        if (intent.getAction().equals(ACTION_PLAY)) {
            mMediaPlayer = new MediaPlayer(); // initialize it here
            mMediaPlayer.setOnPreparedListener(this);
            mMediaPlayer.prepareAsync(); // prepare async to not block main thread
        }
    }

    /** Called when MediaPlayer is ready */
    public void onPrepared(MediaPlayer player) {
        player.start();
    }
}
When playback is finished, release the MediaPlayer:
mediaPlayer.release();
mediaPlayer = null;
For DRM-protected content, MediaPlayer initialization and setting the data source are unchanged; the following additional steps are needed:
setOnPreparedListener();
setOnDrmInfoListener();
setDataSource();
prepareAsync();
// ...
// If the data source content is protected you receive a call to the onDrmInfo() callback.
onDrmInfo() {
    prepareDrm();
    getKeyRequest();
    provideKeyResponse();
}
// When prepareAsync() finishes, you receive a call to the onPrepared() callback.
// If there is a DRM, onDrmInfo() sets it up before executing this callback, so you can start the player.
onPrepared() {
    start();
}
// ...play/pause/resume...
stop();
releaseDrm();
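As an app-level illustration of this flow, here is a hedged sketch using the public MediaPlayer DRM API (API 26+); requestLicense() is a hypothetical helper standing in for the license-server round trip, and the "cenc" mime string is an assumption:

import android.media.MediaDrm;
import android.media.MediaPlayer;
import java.util.UUID;

public class DrmPlayback implements MediaPlayer.OnDrmInfoListener, MediaPlayer.OnPreparedListener {
    private final MediaPlayer mp = new MediaPlayer();

    void play(String url) throws Exception {
        mp.setOnPreparedListener(this);
        mp.setOnDrmInfoListener(this);
        mp.setDataSource(url);
        mp.prepareAsync(); // onDrmInfo() is delivered first if the content is protected
    }

    @Override
    public void onDrmInfo(MediaPlayer player, MediaPlayer.DrmInfo drmInfo) {
        try {
            UUID scheme = drmInfo.getSupportedSchemes()[0];      // pick a scheme the device supports
            byte[] initData = drmInfo.getPssh().get(scheme);     // scheme-specific init data
            player.prepareDrm(scheme);
            MediaDrm.KeyRequest request = player.getKeyRequest(
                    null /* keySetId */, initData, "cenc" /* initData mime, assumption */,
                    MediaDrm.KEY_TYPE_STREAMING, null /* optionalParameters */);
            byte[] response = requestLicense(request.getData()); // hypothetical license-server call
            player.provideKeyResponse(null /* keySetId */, response);
        } catch (Exception e) {
            player.release();
        }
    }

    @Override
    public void onPrepared(MediaPlayer player) {
        player.start(); // DRM has been set up by onDrmInfo() before this callback runs
    }

    private byte[] requestLicense(byte[] requestData) {
        // Placeholder: POST requestData to your license server and return the raw response.
        return new byte[0];
    }
}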
After the upper-layer app creates a MediaPlayer with new and calls setDataSource(), the call goes through JNI into MediaPlayer::setDataSource() in mediaplayer.cpp, which calls getMediaPlayerService(); once a MediaPlayerService::Client has been obtained, the real player, NuPlayer, is created.
frameworks/av/media/libmediaplayerservice/MediaPlayerService.cpp
811 sp<MediaPlayerBase> MediaPlayerService::Client::createPlayer(player_type playerType)
812 {
813 // determine if we have the right player type
814 sp<MediaPlayerBase> p = mPlayer;
815 if ((p != NULL) && (p->playerType() != playerType)) {
816 ALOGV("delete player");
817 p.clear();
818 }
819 if (p == NULL) {
820 p = MediaPlayerFactory::createPlayer(playerType, this, notify, mPid);
821 }
822
823 if (p != NULL) {
824 p->setUID(mUid);
825 }
826
827 return p;
828 }
MediaPlayerFactory then creates the corresponding player based on playerType; here that is NuPlayerDriver.
frameworks/av/media/libmediaplayerservice/MediaPlayerFactory.cpp
virtual sp<MediaPlayerBase> createPlayer(pid_t pid) {
227 ALOGV(" create NuPlayer");
228 return new NuPlayerDriver(pid);
229 }
NuPlayerDriver wraps NuPlayer and implements the MediaPlayerInterface; the actual playback work is carried out by NuPlayer.
59 NuPlayerDriver::NuPlayerDriver(pid_t pid)
60 : mState(STATE_IDLE),
61 mIsAsyncPrepare(false),
62 mAsyncResult(UNKNOWN_ERROR),
63 mSetSurfaceInProgress(false),
64 mDurationUs(-1),
65 mPositionUs(-1),
66 mSeekInProgress(false),
67 mPlayingTimeUs(0),
68 mLooper(new ALooper),
69 mPlayer(new NuPlayer(pid)),
70 mPlayerFlags(0),
71 mAnalyticsItem(NULL),
72 mAtEOS(false),
73 mLooping(false),
74 mAutoLoop(false) {
75 ALOGD("NuPlayerDriver(%p) created, clientPid(%d)", this, pid);
76 mLooper->setName("NuPlayerDriver Looper");
77
78 // set up an analytics record
79 mAnalyticsItem = new MediaAnalyticsItem(kKeyPlayer);
80 mAnalyticsItem->generateSessionID();
81
82 mLooper->start(
83 false, /* runOnCallingThread */
84 true, /* canCallJava */
85 PRIORITY_AUDIO);
86 // mPlayer is the NuPlayer instance; it inherits from AHandler
87 mLooper->registerHandler(mPlayer);
88
89 mPlayer->setDriver(this);
90 }
setDataSource
The setDataSource() method does two things: it selects the appropriate player for the data-source type, and it sets up the source so that the matching MediaExtractor can later parse the media file and record its main metadata.
// android/frameworks/av/media/libmedia/mediaplayer.cpp
194 status_t MediaPlayer::setDataSource(const sp<IDataSource> &source)
195 {
196 ALOGV("setDataSource(IDataSource)");
197 status_t err = UNKNOWN_ERROR;
198 const sp<IMediaPlayerService> service(getMediaPlayerService());
199 if (service != 0) {
200 sp<IMediaPlayer> player(service->create(this, mAudioSessionId));
201 if ((NO_ERROR != doSetRetransmitEndpoint(player)) ||
202 (NO_ERROR != player->setDataSource(source))) {
203 player.clear();
204 }
205 err = attachNewPlayer(player);
206 }
207 return err;
208 }
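For reference, these are the common app-level setDataSource() overloads that funnel into the native paths above; the file path, content Uri and asset name below are placeholders:

import android.content.Context;
import android.content.res.AssetFileDescriptor;
import android.media.MediaPlayer;
import android.net.Uri;

public class DataSourceExamples {
    static void examples(Context context) throws Exception {
        MediaPlayer mp = new MediaPlayer();
        // 1. Local file path or http(s)/rtsp URL in string form.
        mp.setDataSource("/sdcard/Movies/sample.mp4");
        mp.reset();
        // 2. Content Uri, resolved through a ContentProvider.
        mp.setDataSource(context, Uri.parse("content://media/external/video/media/1"));
        mp.reset();
        // 3. File descriptor with offset/length, e.g. an asset packed in the APK.
        AssetFileDescriptor afd = context.getAssets().openFd("sample.mp4");
        mp.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
        afd.close();
        mp.release();
    }
}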
prepareAsync
setDataSource() is the first step of MediaPlayer playback; next we walk through the prepareAsync flow. Before prepare there is also a setDisplay() step, which obtains the SurfaceTexture used to display the video.
android/frameworks/base/media/java/android/media/MediaPlayer.java
public static MediaPlayer create(Context context, Uri uri, SurfaceHolder holder,
900 AudioAttributes audioAttributes, int audioSessionId) {
901
902 try {
903 MediaPlayer mp = new MediaPlayer();
904 final AudioAttributes aa = audioAttributes != null ? audioAttributes :
905 new AudioAttributes.Builder().build();
906 mp.setAudioAttributes(aa);
907 mp.setAudioSessionId(audioSessionId);
908 mp.setDataSource(context, uri);
909 if (holder != null) {
910 mp.setDisplay(holder);
911 }
912 mp.prepare();
913 return mp;
914 } catch (IOException ex) {
915 Log.d(TAG, "create failed:", ex);
916 // fall through
917 } catch (IllegalArgumentException ex) {
918 Log.d(TAG, "create failed:", ex);
919 // fall through
920 } catch (SecurityException ex) {
921 Log.d(TAG, "create failed:", ex);
922 // fall through
923 }
924
925 return null;
926 }

public void setDisplay(SurfaceHolder sh) {
750 mSurfaceHolder = sh;
751 Surface surface;
752 if (sh != null) {
753 surface = sh.getSurface();
754 } else {
755 surface = null;
756 }
757 _setVideoSurface(surface);
758 updateSurfaceScreenOn();
759 }
// android/frameworks/base/media/jni/android_media_MediaPlayer.cpp
static void
326 setVideoSurface(JNIEnv *env, jobject thiz, jobject jsurface, jboolean mediaPlayerMustBeAlive)
327 {
328 sp<MediaPlayer> mp = getMediaPlayer(env, thiz);
329 if (mp == NULL) {
330 if (mediaPlayerMustBeAlive) {
331 jniThrowException(env, "java/lang/IllegalStateException", NULL);
332 }
333 return;
334 }
335
336 decVideoSurfaceRef(env, thiz);
337
338 sp<IGraphicBufferProducer> new_st;
339 if (jsurface) {
340 sp<Surface> surface(android_view_Surface_getSurface(env, jsurface));
341 if (surface != NULL) {
342 new_st = surface->getIGraphicBufferProducer();
343 if (new_st == NULL) {
344 jniThrowException(env, "java/lang/IllegalArgumentException",
345 "The surface does not have a binding SurfaceTexture!");
346 return;
347 }
348 new_st->incStrong((void*)decVideoSurfaceRef);
349 } else {
350 jniThrowException(env, "java/lang/IllegalArgumentException",
351 "The surface has been released");
352 return;
353 }
354 }
355
356 env->SetLongField(thiz, fields.surface_texture, (jlong)new_st.get());
357
358 // This will fail if the media player has not been initialized yet. This
359 // can be the case if setDisplay() on MediaPlayer.java has been called
360 // before setDataSource(). The redundant call to setVideoSurfaceTexture()
361 // in prepare/prepareAsync covers for this case.
362 mp->setVideoSurfaceTexture(new_st);//setVideoSurfaceTexture
363 }
The JNI code then calls the native MediaPlayer::setVideoSurfaceTexture():
// android/frameworks/av/media/libmedia/mediaplayer.cpp
244 status_t MediaPlayer::setVideoSurfaceTexture(
245 const sp<IGraphicBufferProducer>& bufferProducer)
246 {
247 ALOGV("setVideoSurfaceTexture");
248 Mutex::Autolock _l(mLock);
249 if (mPlayer == 0) return NO_INIT;
250 return mPlayer->setVideoSurfaceTexture(bufferProducer);
251 }
This ultimately reaches NuPlayer::setVideoSurfaceTextureAsync(), which posts a kWhatSetVideoSurface AMessage:
375 void NuPlayer::setVideoSurfaceTextureAsync(
376 const sp<IGraphicBufferProducer> &bufferProducer) {
377 sp<AMessage> msg = new AMessage(kWhatSetVideoSurface, this);
378
379 if (bufferProducer == NULL) {
380 msg->setObject("surface", NULL);
381 } else {
382 msg->setObject("surface", new Surface(bufferProducer, true /* controlledByApp */));
383 }
384
385 msg->post();
386 }
Follow kWhatSetVideoSurface onward through the flow; the decoders are instantiated when kWhatScanSources is handled:
// android/frameworks/av/media/libmediaplayerservice/nuplayer/NuPlayer.cpp
case kWhatScanSources:
975 {
976 int32_t generation;
977 CHECK(msg->findInt32("generation", &generation));
978 if (generation != mScanSourcesGeneration) {
979 // Drop obsolete msg.
980 break;
981 }
982
983 mScanSourcesPending = false;
984
985 ALOGV("scanning sources haveAudio=%d, haveVideo=%d",
986 mAudioDecoder != NULL, mVideoDecoder != NULL);
987
988 bool mHadAnySourcesBefore =
989 (mAudioDecoder != NULL) || (mVideoDecoder != NULL);
990 bool rescan = false;
991
992 // initialize video before audio because successful initialization of
993 // video may change deep buffer mode of audio.
994 if (mSurface != NULL) {
995 if (instantiateDecoder(false, &mVideoDecoder) == -EWOULDBLOCK) {
996 rescan = true;
997 }
998 }
999
1000 // Don't try to re-open audio sink if there's an existing decoder.
1001 if (mAudioSink != NULL && mAudioDecoder == NULL) {
1002 if (instantiateDecoder(true, &mAudioDecoder) == -EWOULDBLOCK) {
1003 rescan = true;
1004 }
1005 }
1006
1007 if (!mHadAnySourcesBefore
1008 && (mAudioDecoder != NULL || mVideoDecoder != NULL)) {
1009 // This is the first time we've found anything playable.
1010
1011 if (mSourceFlags & Source::FLAG_DYNAMIC_DURATION) {
1012 schedulePollDuration();
1013 }
1014 }
1015
status_t NuPlayer::instantiateDecoder(
1779 bool audio, sp<DecoderBase> *decoder, bool checkAudioModeChange) {
1780 // The audio decoder could be cleared by tear down. If still in shut down
1781 // process, no need to create a new audio decoder.
1782 if (*decoder != NULL || (audio && mFlushingAudio == SHUT_DOWN)) {
1783 return OK;
1784 }
1785
1786 sp<AMessage> format = mSource->getFormat(audio);
1787
1788 if (format == NULL) {
1789 return UNKNOWN_ERROR;
1790 } else {
1791 status_t err;
1792 if (format->findInt32("err", &err) && err) {
1793 return err;
1794 }
1795 }
1796
1797 format->setInt32("priority", 0 /* realtime */);
1798
1799 if (!audio) {
1800 AString mime;
1801 CHECK(format->findString("mime", &mime));
1802
1803 sp<AMessage> ccNotify = new AMessage(kWhatClosedCaptionNotify, this);
1804 if (mCCDecoder == NULL) {
1805 mCCDecoder = new CCDecoder(ccNotify);
1806 }
1807
1808 if (mSourceFlags & Source::FLAG_SECURE) {
1809 format->setInt32("secure", true);
1810 }
1811
1812 if (mSourceFlags & Source::FLAG_PROTECTED) {
1813 format->setInt32("protected", true);
1814 }
1815
1816 float rate = getFrameRate();
1817 if (rate > 0) {
1818 format->setFloat("operating-rate", rate * mPlaybackSettings.mSpeed);
1819 }
1820 }
1821
1822 if (audio) {
1823 sp<AMessage> notify = new AMessage(kWhatAudioNotify, this);
1824 ++mAudioDecoderGeneration;
1825 notify->setInt32("generation", mAudioDecoderGeneration);
1826
1827 if (checkAudioModeChange) {
1828 determineAudioModeChange(format);
1829 }
1830 if (mOffloadAudio) {
1831 mSource->setOffloadAudio(true /* offload */);
1832
1833 const bool hasVideo = (mSource->getFormat(false /*audio */) != NULL);
1834 format->setInt32("has-video", hasVideo);
1835 *decoder = AVNuFactory::get()->createPassThruDecoder(notify, mSource, mRenderer);
1836 ALOGV("instantiateDecoder audio DecoderPassThrough hasVideo: %d", hasVideo);
1837 } else {
1838 AVNuUtils::get()->setCodecOutputFormat(format);
1839 mSource->setOffloadAudio(false /* offload */);
1840
1841 *decoder = AVNuFactory::get()->createDecoder(notify, mSource, mPID, mUID, mRenderer);
1842 ALOGV("instantiateDecoder audio Decoder");
1843 }
1844 mAudioDecoderError = false;
1845 } else {
1846 sp<AMessage> notify = new AMessage(kWhatVideoNotify, this);
1847 ++mVideoDecoderGeneration;
1848 notify->setInt32("generation", mVideoDecoderGeneration);
1849
1850 *decoder = new Decoder(
1851 notify, mSource, mPID, mUID, mRenderer, mSurface, mCCDecoder);
1852 mVideoDecoderError = false;
1853
1854 // enable FRC if high-quality AV sync is requested, even if not
1855 // directly queuing to display, as this will even improve textureview
1856 // playback.
1857 {
1858 if (property_get_bool("persist.sys.media.avsync", false)) {
1859 format->setInt32("auto-frc", 1);
1860 }
1861 }
1862 }
1863 (*decoder)->init();
1864
1865 // Modular DRM
1866 if (mIsDrmProtected) {
1867 format->setPointer("crypto", mCrypto.get());
1868 ALOGV("instantiateDecoder: mCrypto: %p (%d) isSecure: %d", mCrypto.get(),
1869 (mCrypto != NULL ? mCrypto->getStrongCount() : 0),
1870 (mSourceFlags & Source::FLAG_SECURE) != 0);
1871 }
1872
1873 (*decoder)->configure(format);
1874
1875 if (!audio) {
1876 sp<AMessage> params = new AMessage();
1877 float rate = getFrameRate();
1878 if (rate > 0) {
1879 params->setFloat("frame-rate-total", rate);
1880 }
1881
1882 sp<MetaData> fileMeta = getFileMeta();
1883 if (fileMeta != NULL) {
1884 int32_t videoTemporalLayerCount;
1885 if (fileMeta->findInt32(kKeyTemporalLayerCount, &videoTemporalLayerCount)
1886 && videoTemporalLayerCount > 0) {
1887 params->setInt32("temporal-layer-count", videoTemporalLayerCount);
1888 }
1889 }
1890
1891 if (params->countEntries() > 0) {
1892 (*decoder)->setParameters(params);
1893 }
1894 }
1895 return OK;
1896 }

// android/frameworks/av/media/libavextensions/mediaplayerservice/AVNuFactory.cpp
62 sp<NuPlayer::DecoderBase> AVNuFactory::createDecoder(
63 const sp<AMessage> &notify,
64 const sp<NuPlayer::Source> &source,
65 pid_t pid,
66 uid_t uid,
67 const sp<NuPlayer::Renderer> &renderer) {
68 return new NuPlayer::Decoder(notify, source, pid, uid, renderer);
69 }
While initializing the decoder, NuPlayer calls (*decoder)->configure(format), which runs NuPlayer::Decoder::onConfigure():
289 void NuPlayer::Decoder::onConfigure(const sp<AMessage> &format) {
290 CHECK(mCodec == NULL);
291
292 mFormatChangePending = false;
293 mTimeChangePending = false;
294
295 ++mBufferGeneration;
296
297 AString mime;
298 CHECK(format->findString("mime", &mime));
299
300 mIsAudio = !strncasecmp("audio/", mime.c_str(), 6);
301 mIsVideoAVC = !strcasecmp(MEDIA_MIMETYPE_VIDEO_AVC, mime.c_str());
302
303 mComponentName = mime;
304 mComponentName.append(" decoder");
305 ALOGV("[%s] onConfigure (surface=%p)", mComponentName.c_str(), mSurface.get());
306
307 mCodec = AVUtils::get()->createCustomComponentByName(mCodecLooper, mime.c_str(), false /* encoder */, format);
308 if (mCodec == NULL) {
309 mCodec = MediaCodec::CreateByType(
310 mCodecLooper, mime.c_str(), false /* encoder */, NULL /* err */, mPid, mUid);
311 }
312 int32_t secure = 0;
313 if (format->findInt32("secure", &secure) && secure != 0) {
314 if (mCodec != NULL) {
315 mCodec->getName(&mComponentName);
316 mComponentName.append(".secure");
317 mCodec->release();
318 ALOGI("[%s] creating", mComponentName.c_str());
319 mCodec = MediaCodec::CreateByComponentName(
320 mCodecLooper, mComponentName.c_str(), NULL /* err */, mPid, mUid);
321 }
322 }
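The public MediaCodec API mirrors the two creation paths shown above (CreateByType vs. CreateByComponentName); a hedged Java sketch, with the component name and resolution purely illustrative:

import android.media.MediaCodec;
import android.media.MediaFormat;
import android.view.Surface;
import java.io.IOException;

public class CodecCreation {
    static MediaCodec createAvcDecoder(Surface surface) throws IOException {
        // Equivalent of MediaCodec::CreateByType(): let MediaCodecList pick a decoder for the mime.
        MediaCodec codec = MediaCodec.createDecoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
        // The equivalent of MediaCodec::CreateByComponentName() would be:
        //   MediaCodec.createByCodecName("OMX.qcom.video.decoder.avc"); // illustrative name
        MediaFormat format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, 1920, 1080);
        // configure() is the app-level counterpart of NuPlayer::Decoder::onConfigure():
        // it binds the output Surface and, for protected content, a MediaCrypto object.
        codec.configure(format, surface, null /* crypto */, 0 /* flags: decoder */);
        return codec;
    }
}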
After the MediaCodec is created, MediaCodec::init() is called; it sends a kWhatInit message, whose handler calls initiateAllocateComponent(format):
// android/frameworks/av/media/libstagefright/MediaCodec.cpp
case kWhatInit:
1918 {
1919 sp<AReplyToken> replyID;
1920 CHECK(msg->senderAwaitsResponse(&replyID));
1921
1922 if (mState != UNINITIALIZED) {
1923 PostReplyWithError(replyID, INVALID_OPERATION);
1924 break;
1925 }
1926
1927 mReplyID = replyID;
1928 setState(INITIALIZING);
1929
1930 AString name;
1931 CHECK(msg->findString("name", &name));
1932
1933 int32_t nameIsType;
1934 int32_t encoder = false;
1935 CHECK(msg->findInt32("nameIsType", &nameIsType));
1936 if (nameIsType) {
1937 CHECK(msg->findInt32("encoder", &encoder));
1938 }
1939
1940 sp<AMessage> format = new AMessage;
1941
1942 if (nameIsType) {
1943 format->setString("mime", name.c_str());
1944 format->setInt32("encoder", encoder);
1945 } else {
1946 format->setString("componentName", name.c_str());
1947 }
1948
1949 mCodec->initiateAllocateComponent(format);
1950 break;
1951 }
1952
MediaCodec calls ACodec's initiateAllocateComponent() interface to create the codec component; ACodec then posts itself a kWhatAllocateComponent message. As the code below shows, handling that message simply calls onAllocateComponent().
// android/frameworks/av/media/libstagefright/ACodec.cpp
6328 bool ACodec::UninitializedState::onMessageReceived(const sp<AMessage> &msg) {
6329 bool handled = false;
6330
6331 switch (msg->what()) {
6332 case ACodec::kWhatSetup:
6333 {
6334 onSetup(msg);
6335
6336 handled = true;
6337 break;
6338 }
6339
6340 case ACodec::kWhatAllocateComponent:
6341 {
6342 onAllocateComponent(msg);
6343 handled = true;
6344 break;
6345 }
onAllocateComponent() connects the OMX client to the OMX service, obtains the IOMX binder proxy, looks up the matching codecs, allocates an OMX node, and so on:
6392 bool ACodec::UninitializedState::onAllocateComponent(const sp<AMessage> &msg) {
6393 ALOGV("onAllocateComponent");
6394
6395 CHECK(mCodec->mOMXNode == NULL);
6396
6397 OMXClient client;
6398 bool trebleFlag;
6399 if (client.connect(&trebleFlag) != OK) { // connect the OMX client to the OMX service
6400 mCodec->signalError(OMX_ErrorUndefined, NO_INIT);
6401 return false;
6402 }
6403 mCodec->setTrebleFlag(trebleFlag);
6404
6405 sp<IOMX> omx = client.interface(); // obtain the IOMX binder proxy
6406
6407 sp<AMessage> notify = new AMessage(kWhatOMXDied, mCodec);
6408
6409 Vector<AString> matchingCodecs;
6410
6411 AString mime;
6412
6413 AString componentName;
6414 int32_t encoder = false;
6415 if (msg->findString("componentName", &componentName)) {
6416 sp<IMediaCodecList> list = MediaCodecList::getInstance();
6417 if (list != NULL && list->findCodecByName(componentName.c_str()) >= 0) {
6418 matchingCodecs.add(componentName);
6419 }
6420 //make sure if the component name contains qcom/qti, we add it to matchingCodecs
6421 //as these components are not present in media_codecs.xml and MediaCodecList won't find
6422 //these component by findCodecByName
6423 if (matchingCodecs.size() == 0 && (componentName.find("qcom", 0) > 0 ||
6424 componentName.find("qti", 0) > 0)) {
6425 matchingCodecs.add(componentName);
6426 }
6427
6428 } else {
6429 CHECK(msg->findString("mime", &mime));
6430
6431 if (!msg->findInt32("encoder", &encoder)) {
6432 encoder = false;
6433 }
6434 //findMatchingCodecs
6435 MediaCodecList::findMatchingCodecs(
6436 mime.c_str(),
6437 encoder, // createEncoder
6438 0, // flags
6439 &matchingCodecs);
6440 }
6441
6442 sp<CodecObserver> observer = new CodecObserver;
6443 sp<IOMXNode> omxNode;
6444
6445 status_t err = NAME_NOT_FOUND;
6446 for (size_t matchIndex = 0; matchIndex < matchingCodecs.size();
6447 ++matchIndex) {
6448 componentName = matchingCodecs[matchIndex];
6449
6450 pid_t tid = gettid();
6451 int prevPriority = androidGetThreadPriority(tid);
6452 androidSetThreadPriority(tid, ANDROID_PRIORITY_FOREGROUND);
6453 err = omx->allocateNode(componentName.c_str(), observer, &omxNode);//allocate Node
6454 androidSetThreadPriority(tid, prevPriority);
6455
6456 if (err == OK) {
6457 break;
6458 } else {
6459 ALOGW("Allocating component '%s' failed, try next one.", componentName.c_str());
6460 }
6461
6462 omxNode = NULL;
6463 }
6464
6465 if (omxNode == NULL) {
6466 if (!mime.empty()) {
6467 ALOGE("Unable to instantiate a %scoder for type '%s' with err %#x.",
6468 encoder ? "en" : "de", mime.c_str(), err);
6469 } else {
6470 ALOGE("Unable to instantiate codec '%s' with err %#x.", componentName.c_str(), err);
6471 }
6472
6473 mCodec->signalError((OMX_ERRORTYPE)err, makeNoSideEffectStatus(err));
6474 return false;
6475 }
6476
6477 mDeathNotifier = new DeathNotifier(notify);
6478 if (mCodec->getTrebleFlag()) {
6479 auto tOmxNode = omxNode->getHalInterface();
6480 if (!tOmxNode->linkToDeath(mDeathNotifier, 0)) {
6481 mDeathNotifier.clear();
6482 }
6483 } else {
6484 if (IInterface::asBinder(omxNode)->linkToDeath(mDeathNotifier) != OK) {
6485 // This was a local binder, if it dies so do we, we won't care
6486 // about any notifications in the afterlife.
6487 mDeathNotifier.clear();
6488 }
6489 }
6490
6491 notify = new AMessage(kWhatOMXMessageList, mCodec);
6492 notify->setInt32("generation", ++mCodec->mNodeGeneration);
6493 observer->setNotificationMessage(notify);
6494
6495 mCodec->mComponentName = componentName;
6496 mCodec->mRenderTracker.setComponentName(componentName);
6497 mCodec->mFlags = 0;
6498
6499 if (componentName.endsWith(".secure")) {
6500 mCodec->mFlags |= kFlagIsSecure;
6501 mCodec->mFlags |= kFlagIsGrallocUsageProtected;
6502 mCodec->mFlags |= kFlagPushBlankBuffersToNativeWindowOnShutdown;
6503 }
6504
6505 mCodec->mOMX = omx;
6506 mCodec->mOMXNode = omxNode;
6507 mCodec->mCallback->onComponentAllocated(mCodec->mComponentName.c_str());
6508 mCodec->changeState(mCodec->mLoadedState);
6509
6510 return true;
6511 }
Why use a SurfaceTexture rather than a plain surface for display? Before ICS, video and OpenGL content was shown with SurfaceView: SurfaceView renders onto its own surface, whereas TextureView renders onto a SurfaceTexture. What is the difference between the two? A SurfaceView is not part of the application's window; it creates its own window to show the OpenGL or video content. The benefit is that the content can be updated continuously without redrawing the application window, but it also brings limitations: since the SurfaceView does not live inside the application window, it cannot easily be moved, scaled or rotated, which makes it awkward to use inside a ListView or ScrollView. TextureView solves these problems nicely: in addition to everything SurfaceView offers, it behaves like an ordinary View and can be used as one.
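A minimal Java sketch of the TextureView path described above: the TextureView hands us a SurfaceTexture, which we wrap in a Surface for the MediaPlayer (the video path is a placeholder):

import android.content.Context;
import android.graphics.SurfaceTexture;
import android.media.MediaPlayer;
import android.view.Surface;
import android.view.TextureView;

public class VideoTextureView extends TextureView implements TextureView.SurfaceTextureListener {
    private MediaPlayer mp;

    public VideoTextureView(Context context) {
        super(context);
        setSurfaceTextureListener(this);
    }

    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture st, int width, int height) {
        try {
            mp = new MediaPlayer();
            mp.setDataSource("/sdcard/Movies/sample.mp4");
            mp.setSurface(new Surface(st));          // TextureView renders via this SurfaceTexture
            mp.setOnPreparedListener(MediaPlayer::start);
            mp.prepareAsync();
        } catch (Exception e) {
            // ignored in this sketch
        }
    }

    @Override
    public boolean onSurfaceTextureDestroyed(SurfaceTexture st) {
        if (mp != null) { mp.release(); mp = null; }
        return true;                                 // let TextureView release the SurfaceTexture
    }

    @Override public void onSurfaceTextureSizeChanged(SurfaceTexture st, int w, int h) {}
    @Override public void onSurfaceTextureUpdated(SurfaceTexture st) {}
}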
Once the SurfaceTexture has been obtained, we can call prepare()/prepareAsync().
prepare() is a synchronous call, so the lock must be taken; the _l suffix of prepareAsync_l() indicates a method that must be called with the lock already held.
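Because prepare() blocks until the Prepared state is reached (it waits on a condition in NuPlayerDriver::prepare_l(), shown later in this section), an app either calls it off the main thread or uses prepareAsync(); a minimal sketch:

import android.media.MediaPlayer;

public class PrepareExamples {
    // Synchronous: blocks the calling thread until the player reaches the Prepared state.
    static void prepareBlocking(MediaPlayer mp) {
        new Thread(() -> {
            try {
                mp.prepare();        // do not call this on the UI thread
                mp.start();
            } catch (Exception e) { /* handle the error */ }
        }).start();
    }

    // Asynchronous: returns immediately, onPrepared() is delivered on the looper thread.
    static void prepareNonBlocking(MediaPlayer mp) {
        mp.setOnPreparedListener(MediaPlayer::start);
        mp.prepareAsync();
    }
}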
// android/frameworks/av/media/libmedia/mediaplayer.cpp
52 namespace android {
53
54 MediaPlayer::MediaPlayer()
55 {
56 ALOGV("constructor");
57 mListener = NULL;
58 mCookie = NULL;
59 mStreamType = AUDIO_STREAM_MUSIC;
60 mAudioAttributesParcel = NULL;
61 mCurrentPosition = -1;
62 mCurrentSeekMode = MediaPlayerSeekMode::SEEK_PREVIOUS_SYNC;
63 mSeekPosition = -1;
64 mSeekMode = MediaPlayerSeekMode::SEEK_PREVIOUS_SYNC;
65 mCurrentState = MEDIA_PLAYER_IDLE;
66 mPrepareSync = false;
67 mPrepareStatus = NO_ERROR;
68 mLoop = false;
69 mLeftVolume = mRightVolume = 1.0;
70 mVideoWidth = mVideoHeight = 0;
71 mLockThreadId = 0;
72 mAudioSessionId = (audio_session_t) AudioSystem::newAudioUniqueId(AUDIO_UNIQUE_ID_USE_SESSION);
73 AudioSystem::acquireAudioSessionId(mAudioSessionId, -1);
74 mSendLevel = 0;
75 mRetransmitEndpointValid = false;
76 }
77
78 MediaPlayer::~MediaPlayer()
79 {
80 ALOGV("destructor");
81 if (mAudioAttributesParcel != NULL) {
82 delete mAudioAttributesParcel;
83 mAudioAttributesParcel = NULL;
84 }
85 AudioSystem::releaseAudioSessionId(mAudioSessionId, -1);
86 disconnect();
87 IPCThreadState::self()->flushCommands();
88 }
89
90 void MediaPlayer::disconnect()
91 {
92 ALOGV("disconnect");
93 sp<IMediaPlayer> p;
94 {
95 Mutex::Autolock _l(mLock);
96 p = mPlayer;
97 mPlayer.clear();
98 }
99
100 if (p != 0) {
101 p->disconnect();
102 }
103 }
104
105 // always call with lock held
106 void MediaPlayer::clear_l()
107 {
108 mCurrentPosition = -1;
109 mCurrentSeekMode = MediaPlayerSeekMode::SEEK_PREVIOUS_SYNC;
110 mSeekPosition = -1;
111 mSeekMode = MediaPlayerSeekMode::SEEK_PREVIOUS_SYNC;
112 mVideoWidth = mVideoHeight = 0;
113 mRetransmitEndpointValid = false;
114 }
115
116 status_t MediaPlayer::setListener(const sp<MediaPlayerListener>& listener)
117 {
118 ALOGV("setListener");
119 Mutex::Autolock _l(mLock);
120 mListener = listener;
121 return NO_ERROR;
122 }
123
124
125 status_t MediaPlayer::attachNewPlayer(const sp<IMediaPlayer>& player)
126 {
127 status_t err = UNKNOWN_ERROR;
128 sp<IMediaPlayer> p;
129 { // scope for the lock
130 Mutex::Autolock _l(mLock);
131
132 if ( !( (mCurrentState & MEDIA_PLAYER_IDLE) ||
133 (mCurrentState == MEDIA_PLAYER_STATE_ERROR ) ) ) {
134 ALOGE("attachNewPlayer called in state %d", mCurrentState);
135 return INVALID_OPERATION;
136 }
137
138 clear_l();
139 p = mPlayer;
140 mPlayer = player;
141 if (player != 0) {
142 mCurrentState = MEDIA_PLAYER_INITIALIZED;
143 player->getDefaultBufferingSettings(&mCurrentBufferingSettings);
144 err = NO_ERROR;
145 } else {
146 mCurrentBufferingSettings = BufferingSettings();
147 ALOGE("Unable to create media player");
148 }
149 }
150
151 if (p != 0) {
152 p->disconnect();
153 }
154
155 return err;
156 }
157
158 status_t MediaPlayer::setDataSource(
159 const sp<IMediaHTTPService> &httpService,
160 const char *url, const KeyedVector<String8, String8> *headers)
161 {
162 ALOGV("setDataSource(%s)", url);
163 status_t err = BAD_VALUE;
164 if (url != NULL) {
165 const sp<IMediaPlayerService> service(getMediaPlayerService());
166 if (service != 0) {
167 sp<IMediaPlayer> player(service->create(this, mAudioSessionId));
168 if ((NO_ERROR != doSetRetransmitEndpoint(player)) ||
169 (NO_ERROR != player->setDataSource(httpService, url, headers))) {
170 player.clear();
171 }
172 err = attachNewPlayer(player);
173 }
174 }
175 return err;
176 }
177
178 status_t MediaPlayer::setDataSource(int fd, int64_t offset, int64_t length)
179 {
180 ALOGV("setDataSource(%d, %" PRId64 ", %" PRId64 ")", fd, offset, length);
181 status_t err = UNKNOWN_ERROR;
182 const sp<IMediaPlayerService> service(getMediaPlayerService());
183 if (service != 0) {
184 sp<IMediaPlayer> player(service->create(this, mAudioSessionId));
185 if ((NO_ERROR != doSetRetransmitEndpoint(player)) ||
186 (NO_ERROR != player->setDataSource(fd, offset, length))) {
187 player.clear();
188 }
189 err = attachNewPlayer(player);
190 }
191 return err;
192 }
193
194 status_t MediaPlayer::setDataSource(const sp<IDataSource> &source)
195 {
196 ALOGV("setDataSource(IDataSource)");
197 status_t err = UNKNOWN_ERROR;
198 const sp<IMediaPlayerService> service(getMediaPlayerService());
199 if (service != 0) {
200 sp<IMediaPlayer> player(service->create(this, mAudioSessionId));
201 if ((NO_ERROR != doSetRetransmitEndpoint(player)) ||
202 (NO_ERROR != player->setDataSource(source))) {
203 player.clear();
204 }
205 err = attachNewPlayer(player);
206 }
207 return err;
208 }
209
210 status_t MediaPlayer::invoke(const Parcel& request, Parcel *reply)
211 {
212 Mutex::Autolock _l(mLock);
213 const bool hasBeenInitialized =
214 (mCurrentState != MEDIA_PLAYER_STATE_ERROR) &&
215 ((mCurrentState & MEDIA_PLAYER_IDLE) != MEDIA_PLAYER_IDLE);
216 if ((mPlayer != NULL) && hasBeenInitialized) {
217 ALOGV("invoke %zu", request.dataSize());
218 return mPlayer->invoke(request, reply);
219 }
220 ALOGE("invoke failed: wrong state %X, mPlayer(%p)", mCurrentState, mPlayer.get());
221 return INVALID_OPERATION;
222 }
223
224 status_t MediaPlayer::setMetadataFilter(const Parcel& filter)
225 {
226 ALOGD("setMetadataFilter");
227 Mutex::Autolock lock(mLock);
228 if (mPlayer == NULL) {
229 return NO_INIT;
230 }
231 return mPlayer->setMetadataFilter(filter);
232 }
233
234 status_t MediaPlayer::getMetadata(bool update_only, bool apply_filter, Parcel *metadata)
235 {
236 ALOGD("getMetadata");
237 Mutex::Autolock lock(mLock);
238 if (mPlayer == NULL) {
239 return NO_INIT;
240 }
241 return mPlayer->getMetadata(update_only, apply_filter, metadata);
242 }
243
244 status_t MediaPlayer::setVideoSurfaceTexture(
245 const sp<IGraphicBufferProducer>& bufferProducer)
246 {
247 ALOGV("setVideoSurfaceTexture");
248 Mutex::Autolock _l(mLock);
249 if (mPlayer == 0) return NO_INIT;
250 return mPlayer->setVideoSurfaceTexture(bufferProducer);
251 }
252
253 status_t MediaPlayer::getDefaultBufferingSettings(BufferingSettings* buffering /* nonnull */)
254 {
255 ALOGV("getDefaultBufferingSettings");
256
257 Mutex::Autolock _l(mLock);
258 if (mPlayer == 0) {
259 return NO_INIT;
260 }
261 return mPlayer->getDefaultBufferingSettings(buffering);
262 }
263
264 status_t MediaPlayer::getBufferingSettings(BufferingSettings* buffering /* nonnull */)
265 {
266 ALOGV("getBufferingSettings");
267
268 Mutex::Autolock _l(mLock);
269 if (mPlayer == 0) {
270 return NO_INIT;
271 }
272 *buffering = mCurrentBufferingSettings;
273 return NO_ERROR;
274 }
275
276 status_t MediaPlayer::setBufferingSettings(const BufferingSettings& buffering)
277 {
278 ALOGV("setBufferingSettings");
279
280 Mutex::Autolock _l(mLock);
281 if (mPlayer == 0) {
282 return NO_INIT;
283 }
284 status_t err = mPlayer->setBufferingSettings(buffering);
285 if (err == NO_ERROR) {
286 mCurrentBufferingSettings = buffering;
287 }
288 return err;
289 }
290
291 // must call with lock held
292 status_t MediaPlayer::prepareAsync_l()
293 {
294 if ( (mPlayer != 0) && ( mCurrentState & (MEDIA_PLAYER_INITIALIZED | MEDIA_PLAYER_STOPPED) ) ) {
295 if (mAudioAttributesParcel != NULL) {
296 mPlayer->setParameter(KEY_PARAMETER_AUDIO_ATTRIBUTES, *mAudioAttributesParcel);
297 } else {
298 mPlayer->setAudioStreamType(mStreamType);
299 }
300 mCurrentState = MEDIA_PLAYER_PREPARING;
301 return mPlayer->prepareAsync();
302 }
303 ALOGE("prepareAsync called in state %d, mPlayer(%p)", mCurrentState, mPlayer.get());
304 return INVALID_OPERATION;
305 }
The call actually lands in NuPlayerDriver:
264 status_t NuPlayerDriver::prepare() {
265 ALOGV("prepare(%p)", this);
266 Mutex::Autolock autoLock(mLock);
267 return prepare_l();
268 }
269
270 status_t NuPlayerDriver::prepare_l() {
271 switch (mState) {
272 case STATE_UNPREPARED:
273 mState = STATE_PREPARING;
274
275 // Make sure we're not posting any notifications, success or
276 // failure information is only communicated through our result
277 // code.
278 mIsAsyncPrepare = false;
279 mPlayer->prepareAsync();
280 while (mState == STATE_PREPARING) {
281 mCondition.wait(mLock);
282 }
283 return (mState == STATE_PREPARED) ? OK : UNKNOWN_ERROR;
284 case STATE_STOPPED:
285 // this is really just paused. handle as seek to start
286 mAtEOS = false;
287 mState = STATE_STOPPED_AND_PREPARING;
288 mIsAsyncPrepare = false;
289 mPlayer->seekToAsync(0, MediaPlayerSeekMode::SEEK_PREVIOUS_SYNC /* mode */,
290 true /* needNotify */);
291 while (mState == STATE_STOPPED_AND_PREPARING) {
292 mCondition.wait(mLock);
293 }
294 return (mState == STATE_STOPPED_AND_PREPARED) ? OK : UNKNOWN_ERROR;
295 default:
296 return INVALID_OPERATION;
297 };
298 }
299
300 status_t NuPlayerDriver::prepareAsync() {
301 ALOGV("prepareAsync(%p)", this);
302 Mutex::Autolock autoLock(mLock);
303
304 switch (mState) {
305 case STATE_UNPREPARED:
306 mState = STATE_PREPARING;
307 mIsAsyncPrepare = true;
308 mPlayer->prepareAsync();
309 return OK;
310 case STATE_STOPPED:
311 // this is really just paused. handle as seek to start
312 mAtEOS = false;
313 mState = STATE_STOPPED_AND_PREPARING;
314 mIsAsyncPrepare = true;
315 mPlayer->seekToAsync(0, MediaPlayerSeekMode::SEEK_PREVIOUS_SYNC /* mode */,
316 true /* needNotify */);
317 return OK;
318 default:
319 return INVALID_OPERATION;
320 };
321 }
mPlayer->prepareAsync() posts a message to NuPlayer, which forwards the prepare to its source; for local playback the source is GenericSource, and its onPrepareAsync() creates the DataSource and then the extractor:
// android/frameworks/av/media/libmediaplayerservice/nuplayer/GenericSource.cpp
353 void NuPlayer::GenericSource::onPrepareAsync() {
......
    } else {
379 if (property_get_bool("media.stagefright.extractremote", true) &&
380 !FileSource::requiresDrm(mFd, mOffset, mLength, nullptr /* mime */)) {
381 sp<IBinder> binder =
382 defaultServiceManager()->getService(String16("media.extractor"));
383 if (binder != nullptr) {
384 ALOGD("FileSource remote");
385 sp<IMediaExtractorService> mediaExService(
386 interface_cast<IMediaExtractorService>(binder));
387 sp<IDataSource> source =
388 mediaExService->makeIDataSource(mFd, mOffset, mLength);
389 ALOGV("IDataSource(FileSource): %p %d %lld %lld",
390 source.get(), mFd, (long long)mOffset, (long long)mLength);
391 if (source.get() != nullptr) {
392 mDataSource = DataSource::CreateFromIDataSource(source);
393 if (mDataSource != nullptr) {
394 // Close the local file descriptor as it is not needed anymore.
395 close(mFd);
396 mFd = -1;
397 }
398 } else {
399 ALOGW("extractor service cannot make data source");
400 }
401 } else {
402 ALOGW("extractor service not running");
403 }
404 }
......
430 // init extractor from data source
431 status_t err = initFromDataSource();
// android/frameworks/av/media/libmediaplayerservice/nuplayer/GenericSource.cpp
159 status_t NuPlayer::GenericSource::initFromDataSource() {
160 sp<IMediaExtractor> extractor;
161 CHECK(mDataSource != NULL);
162
163 extractor = MediaExtractor::Create(mDataSource, NULL,
164 mIsStreaming ? 0 : AVNuUtils::get()->getFlags());
165
166 if (extractor == NULL) {
167 ALOGE("initFromDataSource, cannot create extractor!");
168 return UNKNOWN_ERROR;
169 }
170
171 mFileMeta = extractor->getMetaData();
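android.media.MediaExtractor exposes to applications the same capability that MediaExtractor::Create() provides to GenericSource here; a short hedged sketch (the file path is a placeholder):

import android.media.MediaExtractor;
import android.media.MediaFormat;
import java.io.IOException;

public class ExtractorDemo {
    static void dumpTracks(String path) throws IOException {
        MediaExtractor extractor = new MediaExtractor();
        extractor.setDataSource(path);                        // parses the container headers
        for (int i = 0; i < extractor.getTrackCount(); i++) {
            MediaFormat format = extractor.getTrackFormat(i); // per-track metadata
            String mime = format.getString(MediaFormat.KEY_MIME);
            System.out.println("track " + i + ": " + mime);   // e.g. video/avc, audio/mp4a-latm
        }
        extractor.release();
    }
}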
This is the main purpose of prepare: based on the media type, find the right extractor and decoder and initialize them.
OnStart and Decoder
To sum up: the first step of MediaPlayer playback is setDataSource(), and before prepare there is also setDisplay(), which obtains the SurfaceTexture used to display the picture.
setDataSource() selects the right player for the data-source type; setVideoSurfaceTexture() supplies the output surface; the decoder is initialized and its MediaCodec is created; and during prepare the MediaExtractor matching the container type parses the file, completing all the preparation needed before starting playback.
Once the onStart command is handled, decoding and display begin.
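What the Decoder and Renderer do after start can be approximated at the app level with MediaExtractor plus MediaCodec; this hedged, simplified loop assumes the video track is track 0 and ignores A/V sync and error handling:

import android.media.MediaCodec;
import android.media.MediaExtractor;
import android.media.MediaFormat;
import android.view.Surface;
import java.io.IOException;
import java.nio.ByteBuffer;

public class SimpleVideoDecoder {
    static void decode(String path, Surface surface) throws IOException {
        MediaExtractor extractor = new MediaExtractor();
        extractor.setDataSource(path);
        int track = 0; // assume track 0 is the video track for this sketch
        extractor.selectTrack(track);
        MediaFormat format = extractor.getTrackFormat(track);

        MediaCodec codec = MediaCodec.createDecoderByType(format.getString(MediaFormat.KEY_MIME));
        codec.configure(format, surface, null, 0);
        codec.start();

        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        boolean inputDone = false, outputDone = false;
        while (!outputDone) {
            if (!inputDone) {
                int inIndex = codec.dequeueInputBuffer(10_000);
                if (inIndex >= 0) {
                    ByteBuffer in = codec.getInputBuffer(inIndex);
                    int size = extractor.readSampleData(in, 0);          // feed compressed samples
                    if (size < 0) {
                        codec.queueInputBuffer(inIndex, 0, 0, 0,
                                MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                        inputDone = true;
                    } else {
                        codec.queueInputBuffer(inIndex, 0, size, extractor.getSampleTime(), 0);
                        extractor.advance();
                    }
                }
            }
            int outIndex = codec.dequeueOutputBuffer(info, 10_000);
            if (outIndex >= 0) {
                codec.releaseOutputBuffer(outIndex, true /* render to the Surface */);
                if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                    outputDone = true;
                }
            }
        }
        codec.stop();
        codec.release();
        extractor.release();
    }
}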