MediaClock and Audio-Video Synchronization

Posted by McInfo on Fri, 17 May 2019 00:42:34 +0200

Audio-video synchronization in NuPlayer relies mainly on the anchor time recorded in MediaClock, so let's look at the MediaClock class first.

1.1 Constructor

MediaClock::MediaClock()
    : mAnchorTimeMediaUs(-1),      // anchor media timestamp (PTS); -1 until the first anchor is set
      mAnchorTimeRealUs(-1),       // anchor real (system) timestamp; -1 until the first anchor is set
      mMaxTimeMediaUs(INT64_MAX),  // upper bound for extrapolated media time
      mStartingTimeMediaUs(-1),    // media timestamp of the first frame
      mPlaybackRate(1.0) {         // playback speed multiplier
}

1.2 Anchor point time update

void MediaClock::updateAnchor(
        int64_t anchorTimeMediaUs,
        int64_t anchorTimeRealUs,
        int64_t maxTimeMediaUs) {
    Mutex::Autolock autoLock(mLock);
    int64_t nowUs = ALooper::GetNowUs();
    // Extrapolate the incoming anchor to the current moment; mPlaybackRate is the playback speed multiplier.
    int64_t nowMediaUs =
            anchorTimeMediaUs + (nowUs - anchorTimeRealUs) * (double)mPlaybackRate;
    // Protection against small fluctuations: if the new anchor is only slightly behind
    // the old one (within kAnchorFluctuationAllowedUs), keep the old anchor.
    if (mAnchorTimeRealUs != -1) {
        int64_t oldNowMediaUs =
                mAnchorTimeMediaUs + (nowUs - mAnchorTimeRealUs) * (double)mPlaybackRate;
        if (nowMediaUs < oldNowMediaUs
                && nowMediaUs > oldNowMediaUs - kAnchorFluctuationAllowedUs) {
            return;
        }
    }
    // mAnchorTimeRealUs: the anchor's real (system) timestamp, which can be read as the
    // system-clock time corresponding to the most recently played (audio) frame.
    // mAnchorTimeMediaUs: the anchor's media timestamp, i.e. the media position (PTS)
    // that playback has reached at that system time. Recording this pair ties the audio
    // PTS to the system clock, which is what synchronizes playback to the audio track.
    // Extrapolating to nowUs before storing merely accounts for the time spent on the
    // function call; essentially the two incoming arguments are recorded directly.
    mAnchorTimeRealUs = nowUs;
    mAnchorTimeMediaUs = nowMediaUs;
}
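
To make the anchor mapping concrete, here is a minimal standalone sketch (not part of the AOSP code; the MiniClock type and all timestamps below are made up for illustration). It mimics the idea behind updateAnchor() and the extrapolation that the anchor enables.

#include <cstdint>
#include <cstdio>

// Toy model of MediaClock's anchor: the pair (anchorMediaUs, anchorRealUs)
// ties a media position (PTS) to a point on the system clock.
struct MiniClock {
    int64_t anchorMediaUs = -1;   // media position at the anchor
    int64_t anchorRealUs  = -1;   // system time at the anchor
    double  playbackRate  = 1.0;  // playback speed multiplier

    // Same idea as MediaClock::updateAnchor(): remember where playback is "now".
    void updateAnchor(int64_t mediaUs, int64_t realUs) {
        anchorMediaUs = mediaUs;
        anchorRealUs  = realUs;
    }

    // Same idea as MediaClock::getMediaTime_l(): extrapolate from the anchor.
    int64_t mediaTimeAt(int64_t realUs) const {
        return anchorMediaUs + (int64_t)((realUs - anchorRealUs) * playbackRate);
    }
};

int main() {
    MiniClock clock;
    // Hypothetical numbers: the audio frame with PTS 2,000,000 us was played at
    // system time 5,000,000 us.
    clock.updateAnchor(2000000, 5000000);

    // 100 ms of wall-clock time later, the media position has advanced by 100 ms
    // (at rate 1.0), so this prints 2,100,000 us.
    printf("media position at t+100ms: %lld us\n",
           (long long)clock.mediaTimeAt(5100000));

    // At 2x speed the same wall-clock interval covers 200 ms of media time.
    clock.playbackRate = 2.0;
    printf("media position at t+100ms (2x): %lld us\n",
           (long long)clock.mediaTimeAt(5100000));
    return 0;
}

The point is that a single (media time, system time) pair plus a rate is enough to answer "what should be playing right now?" at any later moment.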

  

1.3 Interpretation excerpted from another blog

In getRealTimeFor (assuming a playback rate of 1.0):

realTimeUs = PTS - nowMediaUs + nowUs
           = PTS - (mAnchorTimeMediaUs + (nowUs - mAnchorTimeRealUs)) + nowUs

where mAnchorTimeMediaUs + (nowUs - mAnchorTimeRealUs) is the calculation done in getMediaTime_l:

mAnchorTimeMediaUs is the anchor media timestamp, which can be interpreted as the media timestamp recorded when playback started.

mAnchorTimeRealUs is the anchor real (system) timestamp.

nowUs - mAnchorTimeRealUs is how long it has been since playback started.

Adding this interval to mAnchorTimeMediaUs gives the "media timestamp corresponding to the current system time".

Subtracting this value from the PTS gives how long from now this frame should be played.

Finally, adding the current system time gives the absolute time at which this frame should be displayed.

--------------------- 

Author: zhanghui_cuc

Source: CSDN

Original: https://blog.csdn.net/nonmarking/article/details/78746671

Copyright statement: this is the blogger's original article; please include a link to the original post when reproducing it.
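
As a worked example with hypothetical numbers (playback rate 1.0, values invented for illustration): suppose the anchor is mAnchorTimeMediaUs = 2,000,000 us recorded at mAnchorTimeRealUs = 5,000,000 us, the current system time nowUs is 5,040,000 us, and the next video frame's PTS is 2,100,000 us. Then nowMediaUs = 2,000,000 + 40,000 = 2,040,000 us, and realTimeUs = 2,100,000 - 2,040,000 + 5,040,000 = 5,100,000 us, i.e. the frame should be rendered 60 ms from now.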


In practice, the renderer takes the PTS decoded by the codec and compares it against the audio anchor to work out how long to delay posting the current video frame. The anchor time is recalculated on every audio frame, so as I understand it:

nowUs - mAnchorTimeRealUs is how long it has been since the last audio frame was anchored.

mAnchorTimeMediaUs + (nowUs - mAnchorTimeRealUs) is the media timestamp that audio playback has reached at the current system time.

PTS - (mAnchorTimeMediaUs + (nowUs - mAnchorTimeRealUs)) + nowUs: the PTS minus the current media position gives how long the video frame should be delayed; adding the current system time turns that delay into the absolute time at which the frame should be displayed.


// Excerpt (abridged) from NuPlayer::Renderer::postDrainVideoQueue() showing how getRealTimeFor is used:
mMediaClock->getRealTimeFor(mediaTimeUs, &realTimeUs) == OK
delayUs = realTimeUs - nowUs;      // how long until this frame is due
msg->setWhat(kWhatPostDrainVideoQueue);
msg->post(postDelayUs);            // postDelayUs is derived from delayUs in the full source

// targetMediaUs is the frame's PTS; outRealUs is the actual (system) time at which to play it.
status_t MediaClock::getRealTimeFor(
        int64_t targetMediaUs, int64_t *outRealUs) const {
    int64_t nowUs = ALooper::GetNowUs();
    int64_t nowMediaUs;
    status_t status =
            getMediaTime_l(nowUs, &nowMediaUs, true /* allowPastMaxTime */);
    if (status != OK) {
        return status;
    }
    *outRealUs = (targetMediaUs - nowMediaUs) / (double)mPlaybackRate + nowUs;
    return OK;
}
status_t MediaClock::getMediaTime_l(
        int64_t realUs, int64_t *outMediaUs, bool allowPastMaxTime) const {
    // Extrapolate the media position from the anchor, then clamp it to the
    // [mStartingTimeMediaUs, mMaxTimeMediaUs] range.
    int64_t mediaUs = mAnchorTimeMediaUs
            + (realUs - mAnchorTimeRealUs) * (double)mPlaybackRate;
    if (mediaUs > mMaxTimeMediaUs && !allowPastMaxTime) {
        mediaUs = mMaxTimeMediaUs;
    }
    if (mediaUs < mStartingTimeMediaUs) {
        mediaUs = mStartingTimeMediaUs;
    }
    if (mediaUs < 0) {
        mediaUs = 0;
    }
    *outMediaUs = mediaUs;
    return OK;
}
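
To see what the division by mPlaybackRate in getRealTimeFor() does, here is a small self-contained sketch (not AOSP code; the function name realTimeFor and all numbers are made up) that mirrors the formula and evaluates it at 1x and 2x speed.

#include <cstdint>
#include <cstdio>

// Mirrors the formula in MediaClock::getRealTimeFor():
//   outRealUs = (targetMediaUs - nowMediaUs) / playbackRate + nowUs
int64_t realTimeFor(int64_t targetMediaUs, int64_t nowMediaUs,
                    int64_t nowUs, double playbackRate) {
    return (int64_t)((targetMediaUs - nowMediaUs) / playbackRate) + nowUs;
}

int main() {
    // Hypothetical values: the clock has reached 2,040,000 us of media time at
    // system time 5,040,000 us, and the next video frame's PTS is 2,100,000 us.
    int64_t nowMediaUs = 2040000, nowUs = 5040000, ptsUs = 2100000;

    // At 1x the 60 ms media gap maps to 60 ms of real time (due at 5,100,000 us).
    printf("1x: due at %lld us\n", (long long)realTimeFor(ptsUs, nowMediaUs, nowUs, 1.0));

    // At 2x the same media gap is covered in 30 ms of real time (due at 5,070,000 us).
    printf("2x: due at %lld us\n", (long long)realTimeFor(ptsUs, nowMediaUs, nowUs, 2.0));
    return 0;
}

A higher playback rate therefore shrinks the real-time delay for a given PTS, which is what keeps video in step with faster audio playback.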


Topics: Android codec