
Android Multimedia Local Playback Flow: Video Playback, Based on Jelly Bean (Part 5)

In the previous two articles we walked through setDataSource and prepare, and ended up with mVideoTrack, mAudioTrack, mVideoSource and mAudioSource: the first two come out of the setDataSource step, the latter two out of prepare.
 
status_t AwesomePlayer::setDataSource_l(const sp<MediaExtractor> &extractor) {
    …
        if (!haveVideo && !strncasecmp(mime.string(), "video/", 6)) {
            setVideoSource(extractor->getTrack(i));
            …
        } else if (!haveAudio && !strncasecmp(mime.string(), "audio/", 6)) {
            setAudioSource(extractor->getTrack(i));
            …
        }
    …
}
void AwesomePlayer::setVideoSource(sp<MediaSource> source) {
    CHECK(source != NULL);
    mVideoTrack = source;
}
void AwesomePlayer::setAudioSource(sp<MediaSource> source) {
    CHECK(source != NULL);
    mAudioTrack = source;
}
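
To make the track-selection rule above concrete, here is a minimal standalone sketch (not AOSP code; the Track struct and the track list are invented for illustration) that picks the first "video/" and the first "audio/" track by comparing the MIME prefix case-insensitively, just like setDataSource_l does:

#include <cstdio>
#include <string>
#include <strings.h>   // strncasecmp (POSIX)
#include <vector>

struct Track {
    int index;
    std::string mime;  // e.g. "video/avc", "audio/mp4a-latm"
};

int main() {
    // A made-up track list such as a MediaExtractor might expose.
    std::vector<Track> tracks = {
        {0, "text/3gpp-tt"}, {1, "video/avc"}, {2, "audio/mp4a-latm"}, {3, "audio/mpeg"},
    };

    const Track* videoTrack = nullptr;  // ~ mVideoTrack
    const Track* audioTrack = nullptr;  // ~ mAudioTrack

    for (const Track& t : tracks) {
        // Same test as in setDataSource_l: only the case-insensitive MIME
        // prefix decides whether a track is video or audio, and only the
        // first match of each kind is kept.
        if (videoTrack == nullptr && strncasecmp(t.mime.c_str(), "video/", 6) == 0) {
            videoTrack = &t;
        } else if (audioTrack == nullptr && strncasecmp(t.mime.c_str(), "audio/", 6) == 0) {
            audioTrack = &t;
        }
    }

    if (videoTrack) printf("video track: #%d (%s)\n", videoTrack->index, videoTrack->mime.c_str());
    if (audioTrack) printf("audio track: #%d (%s)\n", audioTrack->index, audioTrack->mime.c_str());
    return 0;
}

Any further video or audio tracks in the file are simply ignored, which is exactly what the haveVideo/haveAudio flags achieve in the real code.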
 
mVideoSource = OMXCodec::Create(
        mClient.interface(), mVideoTrack->getFormat(),
        false, // createEncoder
        mVideoTrack,
        NULL, flags, USE_SURFACE_ALLOC ? mNativeWindow : NULL);

mAudioSource = OMXCodec::Create(
        mClient.interface(), mAudioTrack->getFormat(),
        false, // createEncoder
        mAudioTrack);
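
The relationship between the two pairs of members can be summed up in a small toy model (all class and function names below are invented; this is not the real MediaSource/OMXCodec API): a track is the demuxer's output and still carries compressed samples, while a source created by OMXCodec::Create wraps that track and hands out decoded buffers.

#include <cstdio>
#include <memory>
#include <string>

// Plays the role of the MediaSource returned by MediaExtractor::getTrack():
// it hands out still-compressed samples.
struct DemuxedTrack {
    std::string mime;
    std::string readEncodedSample() { return "encoded(" + mime + ")"; }
};

// Plays the role of the MediaSource returned by OMXCodec::Create(): it pulls
// compressed samples from the track it wraps and hands out decoded buffers.
struct DecodedSource {
    explicit DecodedSource(std::shared_ptr<DemuxedTrack> track)
        : mTrack(std::move(track)) {}
    std::string read() { return "decoded<" + mTrack->readEncodedSample() + ">"; }
    std::shared_ptr<DemuxedTrack> mTrack;
};

int main() {
    auto videoTrack = std::make_shared<DemuxedTrack>();  // ~ mVideoTrack, from setDataSource_l
    videoTrack->mime = "video/avc";

    DecodedSource videoSource(videoTrack);               // ~ mVideoSource, from prepare (OMXCodec::Create)
    printf("%s\n", videoSource.read().c_str());          // prints: decoded<encoded(video/avc)>
    return 0;
}

So whenever we later read from mVideoSource, the decoder internally pulls compressed samples from mVideoTrack.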
Through mVideoTrack and mAudioTrack we located the matching decoders and initialized them. Next we look at how the MediaPlayer actually plays. We will not go over the upper-layer interfaces again; if anything there is unclear, go back to the setDataSource article. Here we go straight to the AwesomePlayer implementation. First, the overall sequence diagram:


status_t AwesomePlayer::play_l() {
    modifyFlags(SEEK_PREVIEW, CLEAR);
    …
    modifyFlags(PLAYING, SET);
    modifyFlags(FIRST_FRAME, SET);       // set the PLAYING and FIRST_FRAME flags
    …
    if (mAudioSource != NULL) {          // mAudioSource is not NULL: set up the AudioPlayer
        if (mAudioPlayer == NULL) {
            if (mAudioSink != NULL) {

    (1)         mAudioPlayer = new AudioPlayer(mAudioSink, allowDeepBuffering, this);
                mAudioPlayer->setSource(mAudioSource);

                seekAudioIfNecessary_l();
            }
        }

        CHECK(!(mFlags & AUDIO_RUNNING));

        if (mVideoSource == NULL) {      // audio only: start playback right away
            …
    (2)     status_t err = startAudioPlayer_l(
                    false /* sendErrorNotification */);

            modifyFlags((PLAYING | FIRST_FRAME), CLEAR);
            …
            return err;
        }
    }
    …
    if (mVideoSource != NULL) {          // there is video: post an event to the queue and wait for it to be handled
        // Kick off video playback
    (3) postVideoEvent_l();

        if (mAudioSource != NULL && mVideoSource != NULL) {   // both audio and video: check that they stay in sync
    (4)     postVideoLagEvent_l();
        }
    }
    …
    return OK;
}
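
Before walking through it, here is a condensed sketch of the branching in play_l (a toy model with invented names, not the real AwesomePlayer API): audio-only content is started right away, while content with video goes down the event-driven path.

#include <cstdio>

// Toy player holding just enough state to mirror the branching above.
struct ToyPlayer {
    bool hasAudio = false;           // ~ mAudioSource != NULL
    bool hasVideo = false;           // ~ mVideoSource != NULL
    bool audioPlayerCreated = false;

    void play() {
        if (hasAudio) {
            if (!audioPlayerCreated) {
                audioPlayerCreated = true;              // ~ (1) new AudioPlayer(...) + setSource()
                printf("create audio player\n");
            }
            if (!hasVideo) {
                printf("start audio immediately\n");    // ~ (2) startAudioPlayer_l()
                return;                                 // audio-only: done here
            }
        }
        if (hasVideo) {
            printf("post video event\n");               // ~ (3) postVideoEvent_l()
            if (hasAudio) {
                printf("post video-lag event\n");       // ~ (4) postVideoLagEvent_l(), the A/V sync check
            }
        }
    }
};

int main() {
    ToyPlayer audioOnly;
    audioOnly.hasAudio = true;
    audioOnly.play();   // takes the (1) -> (2) path and returns

    ToyPlayer localVideo;
    localVideo.hasAudio = true;
    localVideo.hasVideo = true;
    localVideo.play();  // skips (2), takes (1) -> (3) -> (4)
    return 0;
}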
 
In play_l we first instantiate an AudioPlayer to play the audio. If the content is audio only, playback starts right there; since we are playing a local video, step (2) is skipped and we go straight to steps (3) and (4). Now let's look at postVideoEvent_l(), which is similar to what we saw for prepareAsync_l:
void AwesomePlayer::postVideoEvent_l(int64_t delayUs) {
……………
    mVideoEventPending = true;
    mQueue.postEventWithDelay(mVideoEvent, delayUs < 0 ? 10000 : delayUs);
}
 
mVideoEvent was already defined when AwesomePlayer was constructed:
mVideoEvent = new AwesomeEvent(this, &AwesomePlayer::onVideoEvent);
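
The pattern here is an event object that stores the player pointer together with a pointer to one of its member functions, plus a queue that fires the event after a delay. The following is a minimal sketch of that pattern (TinyQueue and ToyPlayer are invented names; this is not the real TimedEventQueue/AwesomeEvent implementation):

#include <chrono>
#include <cstdint>
#include <cstdio>
#include <functional>
#include <thread>
#include <vector>

// A tiny delayed-event queue: remember when an event should fire and run it
// at that time.
class TinyQueue {
public:
    void postEventWithDelay(std::function<void()> event, int64_t delayUs) {
        mPending.push_back({std::chrono::steady_clock::now() +
                                std::chrono::microseconds(delayUs),
                            std::move(event)});
    }

    // Extremely simplified "event loop": wait for the (single) pending event
    // and fire it.
    void runOnce() {
        if (mPending.empty()) return;
        auto it = mPending.begin();
        std::this_thread::sleep_until(it->fireAt);
        it->event();
        mPending.erase(it);
    }

private:
    struct Pending {
        std::chrono::steady_clock::time_point fireAt;
        std::function<void()> event;
    };
    std::vector<Pending> mPending;
};

class ToyPlayer {
public:
    void onVideoEvent() { printf("onVideoEvent fired\n"); }

    // Same shape as AwesomeEvent(this, &AwesomePlayer::onVideoEvent): bind the
    // player instance to one of its member functions and post it with a delay.
    void postVideoEvent(TinyQueue& queue, int64_t delayUs) {
        queue.postEventWithDelay(std::bind(&ToyPlayer::onVideoEvent, this),
                                 delayUs < 0 ? 10000 : delayUs);  // 10ms default, like postVideoEvent_l
    }
};

int main() {
    TinyQueue queue;
    ToyPlayer player;
    player.postVideoEvent(queue, -1);  // uses the 10000us default
    queue.runOnce();                   // fires ToyPlayer::onVideoEvent after ~10ms
    return 0;
}

In AwesomePlayer the queue (mQueue) keeps running in its own thread, so each postVideoEvent_l simply schedules the next call to onVideoEvent.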
 
So we look at the onVideoEvent method:
void AwesomePlayer::onVideoEvent() {
    …
    if (!mVideoBuffer) {
        for (;;) {
    (1)     status_t err = mVideoSource->read(&mVideoBuffer, &options);   // mVideoSource is the OMXCodec
            options.clearSeekTo();
            …
        }
        …
        ++mStats.mNumVideoFramesDecoded;
    }
    …
    (2) status_t err = startAudioPlayer_l();
    …
    if ((mNativeWindow != NULL)
            && (mVideoRendererIsPreview || mVideoRenderer == NULL)) {
        mVideoRendererIsPreview = false;

    (3) initRenderer_l();
    }

    if (mVideoRenderer != NULL) {
        mSinceLastDropped++;
    (4) mVideoRenderer->render(mVideoBuffer);
    }
    …
    (5) postVideoEvent_l();
}
 
As we can see, read() decodes the samples one by one to obtain a video buffer, which is then rendered to the SurfaceTexture.
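
Put together, onVideoEvent performs one iteration of a decode-and-render loop, and re-posting the event with a roughly 10ms delay is what keeps the loop going. Here is a condensed sketch of that loop (invented names, not the real OMXCodec/AwesomeRenderer classes):

#include <chrono>
#include <cstdio>
#include <optional>
#include <thread>

// Stands in for mVideoSource->read(&mVideoBuffer, &options): returns a
// "decoded frame" until the (fake) stream runs out.
struct FakeVideoSource {
    int nextFrame = 0;
    std::optional<int> read() {
        if (nextFrame == 5) return std::nullopt;   // end of stream
        return nextFrame++;
    }
};

// Stands in for mVideoRenderer->render(mVideoBuffer).
struct FakeRenderer {
    void render(int frame) { printf("render frame %d\n", frame); }
};

int main() {
    FakeVideoSource source;
    FakeRenderer renderer;

    // onVideoEvent runs once per posted event; re-posting the event is what
    // turns the steps below into a loop.
    while (auto frame = source.read()) {                              // ~ (1) decode one sample
        renderer.render(*frame);                                      // ~ (4) render it
        std::this_thread::sleep_for(std::chrono::milliseconds(10));   // ~ (5) postVideoEvent_l()
    }
    printf("end of stream\n");
    return 0;
}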
The read method:
status_t OMXCodec::read(
        MediaBuffer **buffer, const ReadOptions *options) {
    …
    if (mInitialBufferSubmit) {
        mInitialBufferSubmit = false;

        if (seeking) {
            CHECK(seekTimeUs >= 0);
            mSeekTimeUs = seekTimeUs;
            mSeekMode = seekMode;

            // There's no reason to trigger the code below, there's
            // nothing to flush yet.
            …
