
Implementing Hardware Video Decoding on Android with FFmpeg + MediaCodec

8,622 words · ~18 minute read

2021-12-31 13:48

Earlier articles in this FFmpeg series already covered audio/video playback, recording, applying filters, and more:



This article uses FFmpeg + MediaCodec to build a player that performs hardware video decoding and audio/video synchronization.



Introduction to MediaCodec

MediaCodec is the Android class for encoding and decoding audio and video. It achieves this by accessing the platform's underlying codecs, and it is part of Android's media framework, typically used together with MediaExtractor, MediaSync, MediaMuxer, MediaCrypto, MediaDrm, Image, Surface, and AudioTrack.


See the official documentation for details: https://developer.android.com/reference/android/media/MediaCodec.html

AMediaCodec is the native (NDK) interface to MediaCodec, provided by Google since Android 5.0. Native code must link against the mediandk library at build time. Official demo:


          https://github.com/android/ndk-samples/tree/main/native-codec
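As noted above, the native codec API lives in the mediandk library. A minimal sketch of linking it in a CMake-based NDK build (the target name native-player is a placeholder, not from this project):

```cmake
# Locate the NDK's native media library (requires API level 21+).
find_library(media-ndk mediandk)
find_library(android-lib android)

# Hypothetical target name; replace with your own native library target.
target_link_libraries(
        native-player
        ${media-ndk}
        ${android-lib})
```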

FFmpeg + AMediaCodec

Before Android exposed the MediaCodec interface at the native layer, hardware decoding with FFmpeg required copying the video and audio data up to the Java layer and invoking MediaCodec there (calling Java object methods through JNI).


This article combines FFmpeg with AMediaCodec: FFmpeg handles demuxing and audio decoding, while MediaCodec decodes the video and renders it to a Surface (ANativeWindow). Demuxing, audio decoding, and video decoding each run in their own worker thread, with queues managing the encoded audio/video packets.
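The thread layout described above can be sketched as follows. This is an illustrative skeleton only; the class and method names here are placeholders, not the article's actual code:

```cpp
#include <atomic>
#include <thread>

// Minimal sketch of the three-thread pipeline: one thread demuxes,
// one decodes audio with FFmpeg, one feeds AMediaCodec with video.
// Bodies are stubbed out; real loops are shown later in the article.
struct Player {
    std::atomic<bool> running{true};

    void DemuxLoop()       { /* av_read_frame -> push to audio/video queue */ }
    void AudioDecodeLoop() { /* pop audio packets, decode with FFmpeg      */ }
    void VideoDecodeLoop() { /* pop video packets, feed AMediaCodec        */ }

    void Start() {
        // One worker thread per pipeline stage.
        std::thread demux(&Player::DemuxLoop, this);
        std::thread audio(&Player::AudioDecodeLoop, this);
        std::thread video(&Player::VideoDecodeLoop, this);
        demux.join();
        audio.join();
        video.join();
    }
};
```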


Note: this demo handles H.264 video. Because the data in an AVPacket is not in standard NALU (Annex-B) form, av_bitstream_filter_filter is used to replace the first 4 bytes of each AVPacket with the start code 0x00000001, yielding standard NALU data and ensuring that MediaCodec decodes correctly.
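The conversion the bitstream filter performs can be illustrated in isolation: MP4-style H.264 packets carry a 4-byte big-endian NALU length where Annex-B expects the start code 0x00000001. A minimal sketch, assuming the packet holds exactly one NALU (real packets may contain several, which the FFmpeg filter handles):

```cpp
#include <cstdint>
#include <vector>

// Replace a 4-byte NALU length prefix with the Annex-B start code
// 0x00 0x00 0x00 0x01, which is what MediaCodec expects.
// Simplified: assumes the buffer contains a single NALU.
std::vector<uint8_t> ToAnnexB(const std::vector<uint8_t>& avcc) {
    std::vector<uint8_t> out = avcc;
    if (out.size() >= 4) {
        out[0] = 0x00; out[1] = 0x00; out[2] = 0x00; out[3] = 0x01;
    }
    return out;
}
```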


Configure the AMediaCodec object to decode only the video stream:

          m_MediaExtractor = AMediaExtractor_new();
          media_status_t err = AMediaExtractor_setDataSourceFd(m_MediaExtractor, fd,
                  static_cast<off64_t>(outStart), static_cast<off64_t>(outLen));
          close(fd);
          if (err != AMEDIA_OK) {
              result = -1;
              LOGCATE("HWCodecPlayer::InitDecoder AMediaExtractor_setDataSourceFd fail. err=%d", err);
              break;
          }

          int numTracks = AMediaExtractor_getTrackCount(m_MediaExtractor);
          LOGCATE("HWCodecPlayer::InitDecoder AMediaExtractor_getTrackCount %d tracks", numTracks);
          for (int i = 0; i < numTracks; i++) {
              AMediaFormat *format = AMediaExtractor_getTrackFormat(m_MediaExtractor, i);
              const char *s = AMediaFormat_toString(format);
              LOGCATE("HWCodecPlayer::InitDecoder track %d format: %s", i, s);
              const char *mime;
              if (!AMediaFormat_getString(format, AMEDIAFORMAT_KEY_MIME, &mime)) {
                  LOGCATE("HWCodecPlayer::InitDecoder no mime type");
                  result = -1;
                  break;
              } else if (!strncmp(mime, "video/", 6)) {
                  // Omitting most error handling for clarity.
                  // Production code should check for errors.
                  AMediaExtractor_selectTrack(m_MediaExtractor, i);
                  m_MediaCodec = AMediaCodec_createDecoderByType(mime);
                  AMediaCodec_configure(m_MediaCodec, format, m_ANativeWindow, NULL, 0);
                  AMediaCodec_start(m_MediaCodec);
              }
              AMediaFormat_delete(format);
          }

FFmpeg demuxes in a single thread, pushing the encoded audio and video packets into two separate queues.

          int HWCodecPlayer::DoMuxLoop() {
              LOGCATE("HWCodecPlayer::DoMuxLoop start");

              int result = 0;
              AVPacket avPacket = {0};
              for(;;) {
                  double passTimes = 0;

                  ......

                  if(m_SeekPosition >= 0) { // seek operation
                      //seek to frame
                      LOGCATE("HWCodecPlayer::DoMuxLoop seeking m_SeekPosition=%f", m_SeekPosition);
                      ......
                  }

                  result = av_read_frame(m_AVFormatContext, &avPacket);
                  if(result >= 0) {
                      double bufferDuration = m_VideoPacketQueue->GetDuration() * av_q2d(m_VideoTimeBase);
                      LOGCATE("HWCodecPlayer::DoMuxLoop bufferDuration=%lfs", bufferDuration);
                      //avoid buffering too many packets
                      while (BUFF_MAX_VIDEO_DURATION < bufferDuration && m_PlayerState == PLAYER_STATE_PLAYING && m_SeekPosition < 0) {
                          bufferDuration = m_VideoPacketQueue->GetDuration() * av_q2d(m_VideoTimeBase);
                          usleep(10 * 1000);
                      }

                      //audio and video packets go into their own queues
                      if(avPacket.stream_index == m_VideoStreamIdx) {
                          m_VideoPacketQueue->PushPacket(&avPacket);
                      } else if(avPacket.stream_index == m_AudioStreamIdx) {
                          m_AudioPacketQueue->PushPacket(&avPacket);
                      } else {
                          av_packet_unref(&avPacket);
                      }
                  } else {
                      //demuxing finished, pause the decoders
                      std::unique_lock<std::mutex> lock(m_Mutex);
                      m_PlayerState = PLAYER_STATE_PAUSE;
                  }
              }
              LOGCATE("HWCodecPlayer::DoMuxLoop end");
              return 0;
          }
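The packet queues used above (m_VideoPacketQueue / m_AudioPacketQueue) can be sketched as a standard mutex-and-condition-variable queue. This is a simplified, generic illustration; the article's actual queue stores AVPacket and also tracks total duration for the buffering check:

```cpp
#include <condition_variable>
#include <mutex>
#include <queue>

// Thread-safe FIFO shared between the demux thread (producer)
// and a decode thread (consumer).
template <typename T>
class PacketQueue {
public:
    void Push(T pkt) {
        std::lock_guard<std::mutex> lock(m_Mutex);
        m_Queue.push(std::move(pkt));
        m_Cond.notify_one();  // wake a waiting consumer
    }

    // Blocks until a packet is available.
    T Pop() {
        std::unique_lock<std::mutex> lock(m_Mutex);
        m_Cond.wait(lock, [this] { return !m_Queue.empty(); });
        T pkt = std::move(m_Queue.front());
        m_Queue.pop();
        return pkt;
    }

    size_t Size() {
        std::lock_guard<std::mutex> lock(m_Mutex);
        return m_Queue.size();
    }

private:
    std::queue<T> m_Queue;
    std::mutex m_Mutex;
    std::condition_variable m_Cond;
};
```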

In the video decode thread, native code uses AMediaCodec to decode the video, pulling packets from the video AVPacket queue.

          void HWCodecPlayer::VideoDecodeThreadProc(HWCodecPlayer *player) {
              LOGCATE("HWCodecPlayer::VideoDecodeThreadProc start");
              AVPacketQueue* videoPacketQueue = player->m_VideoPacketQueue;
              AMediaCodec* videoCodec = player->m_MediaCodec;
              AVPacket *packet = av_packet_alloc();
              for(;;) {

                  ....

                  ssize_t bufIdx = -1;
                  bufIdx = AMediaCodec_dequeueInputBuffer(videoCodec, 0);
                  if (bufIdx >= 0) {
                      size_t bufSize;
                      auto buf = AMediaCodec_getInputBuffer(videoCodec, bufIdx, &bufSize);
                      av_bitstream_filter_filter(player->m_Bsfc, player->m_VideoCodecCtx, NULL,
                              &packet->data, &packet->size, packet->data, packet->size,
                              packet->flags & AV_PKT_FLAG_KEY);
                      LOGCATI("HWCodecPlayer::VideoDecodeThreadProc 0x%02X 0x%02X 0x%02X 0x%02X \n",
                              packet->data[0], packet->data[1], packet->data[2], packet->data[3]);
                      memcpy(buf, packet->data, packet->size);
                      AMediaCodec_queueInputBuffer(videoCodec, bufIdx, 0, packet->size, packet->pts, 0);
                  }
                  av_packet_unref(packet);
                  AMediaCodecBufferInfo info;
                  auto status = AMediaCodec_dequeueOutputBuffer(videoCodec, &info, 1000);
                  LOGCATI("HWCodecPlayer::VideoDecodeThreadProc status: %zd\n", status);
                  uint8_t* buffer;
                  if (status >= 0) {
                      SyncClock* videoClock = &player->m_VideoClock;
                      double presentationNano = info.presentationTimeUs * av_q2d(player->m_VideoTimeBase) * 1000;
                      videoClock->SetClock(presentationNano, GetSysCurrentTime());
                      player->AVSync();//audio/video synchronization

                      size_t size;
                      LOGCATI("HWCodecPlayer::VideoDecodeThreadProc sync video curPts = %lf", presentationNano);
                      buffer = AMediaCodec_getOutputBuffer(videoCodec, status, &size);
                      LOGCATI("HWCodecPlayer::VideoDecodeThreadProc buffer: %p, buffer size: %zu", buffer, size);
                      AMediaCodec_releaseOutputBuffer(videoCodec, status, info.size != 0);
                  } else if (status == AMEDIACODEC_INFO_OUTPUT_BUFFERS_CHANGED) {
                      LOGCATI("HWCodecPlayer::VideoDecodeThreadProc output buffers changed");
                  } else if (status == AMEDIACODEC_INFO_OUTPUT_FORMAT_CHANGED) {
                      LOGCATI("HWCodecPlayer::VideoDecodeThreadProc output format changed");
                  } else if (status == AMEDIACODEC_INFO_TRY_AGAIN_LATER) {
                      LOGCATI("HWCodecPlayer::VideoDecodeThreadProc no output buffer right now");
                  } else {
                      LOGCATI("HWCodecPlayer::VideoDecodeThreadProc unexpected info code: %zd", status);
                  }
                  if(isLocked) lock.unlock();
              }

              if(packet != nullptr) {
                  av_packet_free(&packet);
                  packet = nullptr;
              }
              LOGCATE("HWCodecPlayer::VideoDecodeThreadProc end");
          }

The video is synchronized to the audio, and the delay is fine-tuned based on the difference between the actual inter-frame interval and the target frame interval.

          void HWCodecPlayer::AVSync() {
              LOGCATE("HWCodecPlayer::AVSync");
              double delay = m_VideoClock.curPts - m_VideoClock.lastPts;
              int tickFrame = 1000 * m_FrameRate.den / m_FrameRate.num;
              LOGCATE("HWCodecPlayer::AVSync tickFrame=%dms", tickFrame);
              if(delay <= 0 || delay > VIDEO_FRAME_MAX_DELAY) {
                  delay = tickFrame;
              }
              double refClock = m_AudioClock.GetClock();// sync video to audio
              double avDiff = m_VideoClock.lastPts - refClock;
              m_VideoClock.lastPts = m_VideoClock.curPts;
              double syncThreshold = FFMAX(AV_SYNC_THRESHOLD_MIN, FFMIN(AV_SYNC_THRESHOLD_MAX, delay));
              LOGCATE("HWCodecPlayer::AVSync refClock=%lf, delay=%lf, avDiff=%lf, syncThreshold=%lf", refClock, delay, avDiff, syncThreshold);
              if(avDiff <= -syncThreshold) { //video is behind the audio
                  delay = FFMAX(0, delay + avDiff);
              }
              else if(avDiff >= syncThreshold && delay > AV_SYNC_FRAMEDUP_THRESHOLD) { //video is far ahead of the audio
                  delay = delay + avDiff;
              }
              else if(avDiff >= syncThreshold)
                  delay = 2 * delay;

              LOGCATE("HWCodecPlayer::AVSync avDiff=%lf, delay=%lf", avDiff, delay);

              double tickCur = GetSysCurrentTime();
              double tickDiff = tickCur - m_VideoClock.frameTimer;//actual interval between two frames
              m_VideoClock.frameTimer = tickCur;

              if(tickDiff - tickFrame > 5) delay -= 5;//fine-tune the delay
              if(tickDiff - tickFrame < -5) delay += 5;

              LOGCATE("HWCodecPlayer::AVSync delay=%lf, tickDiff=%lf", delay, tickDiff);
              if(delay > 0) {
                  usleep(1000 * delay);
              }
          }
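The sync decision above can be isolated as a pure function for clarity. The threshold constants below are illustrative placeholders, not the values defined in the article's headers (AV_SYNC_THRESHOLD_MIN/MAX, AV_SYNC_FRAMEDUP_THRESHOLD):

```cpp
#include <algorithm>

// Sketch of the branch logic in AVSync(): given the inter-frame delay
// and the video-minus-audio clock difference (both in ms), return the
// adjusted delay before presenting the next frame.
double AdjustDelay(double delay, double avDiff) {
    const double kSyncThresholdMin  = 40.0;   // assumed placeholder, ms
    const double kSyncThresholdMax  = 100.0;  // assumed placeholder, ms
    const double kFrameDupThreshold = 100.0;  // assumed placeholder, ms
    double syncThreshold =
        std::max(kSyncThresholdMin, std::min(kSyncThresholdMax, delay));
    if (avDiff <= -syncThreshold)             // video behind audio: shrink delay
        return std::max(0.0, delay + avDiff);
    if (avDiff >= syncThreshold && delay > kFrameDupThreshold)
        return delay + avDiff;                // video far ahead: wait it out
    if (avDiff >= syncThreshold)
        return 2 * delay;                     // video slightly ahead: double up
    return delay;                             // in sync: keep the frame delay
}
```

For example, with a 33 ms frame delay: if the video is 100 ms behind the audio the delay collapses to 0 (present immediately); if it is 50 ms ahead the delay doubles to 66 ms.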


Full implementation source code:

          https://github.com/githubhaohao/LearnFFmpeg


          -- END --


The Juejin community recently launched a creator voting event. I didn't expect my few dozen articles to have helped so many people; I'll keep working hard to publish more quality content.


Friends, would you cast a few votes for me? (Each person can cast 7-28 votes.)



Recommended reading:

The most complete collection of Android audio/video and OpenGL ES resources, all in one place

How is Douyin's conveyor-belt effect implemented?

All the image transition effects you could want, right here

I built several Douyin-style filters with OpenGL ES
