
[Repost] Learning live555 -- H264 Data Processing (1)

This article is reposted from 白杨's "Learning live555 (13) -- H264 Data Processing (1)".

In "Learning live555 (7) -- Handling the DESCRIBE Command", we already looked at how a file is opened and its SDP information obtained; here we analyse further how H264 data is handled.

When the RTSPServer receives a DESCRIBE request for some media, it finds the corresponding ServerMediaSession and calls ServerMediaSession::generateSDPDescription(). generateSDPDescription() iterates over all of the ServerMediaSubsessions held by the ServerMediaSession, obtains each subsession's SDP via subsession->sdpLines(), and concatenates the pieces into one complete SDP, which is returned. We can be fairly confident that opening and analysing the file happens inside each subsession's sdpLines(); a conceptual sketch of the surrounding loop comes first, then the function itself.
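A minimal sketch of what that iteration conceptually looks like (this is not the literal live555 code; the helper name buildSDPSketch and the std::string buffer are only for illustration, and the parameterless sdpLines() matches the older live555 version discussed here):

  #include <liveMedia.hh>
  #include <string>

  // Conceptual sketch: the complete SDP is the session-level header followed by
  // the SDP lines of every subsession (one m= section per track).
  std::string buildSDPSketch(ServerMediaSession& sms, std::string const& sessionHeader) {
      std::string sdp = sessionHeader;                // v=, o=, s=, t=, ... lines
      ServerMediaSubsessionIterator iter(sms);
      ServerMediaSubsession* subsession;
      while ((subsession = iter.next()) != NULL) {
          char const* lines = subsession->sdpLines(); // may open and analyse the media file
          if (lines == NULL) return "";               // e.g. the file could not be opened
          sdp += lines;                               // append this track's m= section
      }
      return sdp;
  }

Now the real OnDemandServerMediaSubsession::sdpLines():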

  char const* OnDemandServerMediaSubsession::sdpLines()
  {
      if (fSDPLines == NULL) {
          // We need to construct a set of SDP lines that describe this
          // subsession (as a unicast stream).  To do so, we first create
          // dummy (unused) source and "RTPSink" objects,
          // whose parameters we use for the SDP lines:
          unsigned estBitrate;
          FramedSource* inputSource = createNewStreamSource(0, estBitrate);
          if (inputSource == NULL)
              return NULL; // file not found

          struct in_addr dummyAddr;
          dummyAddr.s_addr = 0;
          Groupsock dummyGroupsock(envir(), dummyAddr, 0, 0);
          unsigned char rtpPayloadType = 96 + trackNumber() - 1; // if dynamic
          RTPSink* dummyRTPSink = createNewRTPSink(&dummyGroupsock,
                  rtpPayloadType, inputSource);

          setSDPLinesFromRTPSink(dummyRTPSink, inputSource, estBitrate);
          Medium::close(dummyRTPSink);
          closeStreamSource(inputSource);
      }

      return fSDPLines;
  }

The subsession caches the SDP for its media file directly, but on the first request fSDPLines is NULL, so it must be built first. The way it is built is fairly laborious: temporary source and RTPSink objects are created and hooked together into a StreamToken, and only after "playing" it for a while is fSDPLines obtained. createNewStreamSource() and createNewRTPSink() are both virtual functions, so the source and sink created here are whatever the derived class specifies. We are analysing H264, so they are whatever H264VideoFileServerMediaSubsession specifies; let's look at those two functions:

  FramedSource* H264VideoFileServerMediaSubsession::createNewStreamSource(
          unsigned /*clientSessionId*/,
          unsigned& estBitrate)
  {
      estBitrate = 500; // kbps, estimate

      // Create the video source:
      ByteStreamFileSource* fileSource = ByteStreamFileSource::createNew(envir(), fFileName);
      if (fileSource == NULL)
          return NULL;
      fFileSize = fileSource->fileSize();

      // Create a framer for the Video Elementary Stream:
      return H264VideoStreamFramer::createNew(envir(), fileSource);
  }

  RTPSink* H264VideoFileServerMediaSubsession::createNewRTPSink(
          Groupsock* rtpGroupsock,
          unsigned char rtpPayloadTypeIfDynamic,
          FramedSource* /*inputSource*/)
  {
      return H264VideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic);
  }

As you can see, an H264VideoStreamFramer and an H264VideoRTPSink are created. H264VideoStreamFramer is clearly also a source, but internally it makes use of another source, ByteStreamFileSource. Why it is structured this way will be analysed later; ignore it for now. We still have not seen the code that actually opens the file, so the search continues. The sketch below first shows the same source-to-sink chain wired up by hand, to make the roles of the three objects concrete.
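This is a hedged sketch modelled on live555's testH264VideoStreamer demo rather than on the server path we are tracing; the function name streamFileOnce and the hard-coded payload type 96 are only illustrative:

  #include <liveMedia.hh>
  #include <BasicUsageEnvironment.hh>

  // Wire up the chain by hand:
  //   ByteStreamFileSource (raw bytes) -> H264VideoStreamFramer (NAL units) -> H264VideoRTPSink (RTP packets)
  void streamFileOnce(UsageEnvironment& env, Groupsock& rtpGroupsock, char const* fileName) {
      ByteStreamFileSource* fileSource = ByteStreamFileSource::createNew(env, fileName);
      if (fileSource == NULL) return; // file not found

      // The framer parses the raw byte stream into H.264 NAL units
      // (and remembers the SPS/PPS it sees along the way):
      H264VideoStreamFramer* framer = H264VideoStreamFramer::createNew(env, fileSource);

      // 96 is a dynamic RTP payload type, just as in createNewRTPSink() above:
      RTPSink* videoSink = H264VideoRTPSink::createNew(env, &rtpGroupsock, 96);

      videoSink->startPlaying(*framer, NULL, NULL); // no 'afterPlaying' handler in this sketch
      env.taskScheduler().doEventLoop();            // never returns in this minimal form
  }

Back to the hunt for where the file really gets opened; the next stop is OnDemandServerMediaSubsession::setSDPLinesFromRTPSink():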

  void OnDemandServerMediaSubsession::setSDPLinesFromRTPSink(
          RTPSink* rtpSink,
          FramedSource* inputSource,
          unsigned estBitrate)
  {
      if (rtpSink == NULL)
          return;

      char const* mediaType = rtpSink->sdpMediaType();
      unsigned char rtpPayloadType = rtpSink->rtpPayloadType();
      struct in_addr serverAddrForSDP;
      serverAddrForSDP.s_addr = fServerAddressForSDP;
      char* const ipAddressStr = strDup(our_inet_ntoa(serverAddrForSDP));
      char* rtpmapLine = rtpSink->rtpmapLine();
      char const* rangeLine = rangeSDPLine();
      char const* auxSDPLine = getAuxSDPLine(rtpSink, inputSource);
      if (auxSDPLine == NULL)
          auxSDPLine = "";

      char const* const sdpFmt =
              "m=%s %u RTP/AVP %d\r\n"
              "c=IN IP4 %s\r\n"
              "b=AS:%u\r\n"
              "%s"
              "%s"
              "%s"
              "a=control:%s\r\n";
      unsigned sdpFmtSize = strlen(sdpFmt)
              + strlen(mediaType) + 5 /* max short len */
              + 3 /* max char len */
              + strlen(ipAddressStr) + 20 /* max int len */
              + strlen(rtpmapLine) + strlen(rangeLine) + strlen(auxSDPLine)
              + strlen(trackId());
      char* sdpLines = new char[sdpFmtSize];
      sprintf(sdpLines, sdpFmt,
              mediaType,        // m= <media>
              fPortNumForSDP,   // m= <port>
              rtpPayloadType,   // m= <fmt list>
              ipAddressStr,     // c= address
              estBitrate,       // b=AS:<bandwidth>
              rtpmapLine,       // a=rtpmap:... (if present)
              rangeLine,        // a=range:... (if present)
              auxSDPLine,       // optional extra SDP line
              trackId());       // a=control:<track-id>
      delete[] (char*) rangeLine;
      delete[] rtpmapLine;
      delete[] ipAddressStr;

      fSDPLines = strDup(sdpLines);
      delete[] sdpLines;
  }
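For orientation, the string this sprintf() produces for an H264 track looks roughly like the following; all values here are illustrative, not taken from a real run (the port is 0 because, for on-demand streams, the client learns the real port via SETUP, and an a=range line appears only when the duration is known):

  m=video 0 RTP/AVP 96
  c=IN IP4 192.168.1.10
  b=AS:500
  a=rtpmap:96 H264/90000
  a=fmtp:96 packetization-mode=1;profile-level-id=42C01E;sprop-parameter-sets=Z0LAHtkA8SJq,aMuDyyA=
  a=control:track1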

This function builds the subsession's SDP and saves it into fSDPLines. The file must already have been opened inside rtpSink->rtpmapLine(), or perhaps even when the source was created. Let's set that aside for the moment and first get a thorough picture of how the SDP is obtained, which means focusing on getAuxSDPLine():

  char const* OnDemandServerMediaSubsession::getAuxSDPLine(
          RTPSink* rtpSink,
          FramedSource* /*inputSource*/)
  {
      // Default implementation:
      return rtpSink == NULL ? NULL : rtpSink->auxSDPLine();
  }

Very simple: it just calls rtpSink->auxSDPLine(), so next we would look at H264VideoRTPSink::auxSDPLine(). No need, really; it too is simple: it takes the SPS/PPS stored in the source and turns them into the a=fmtp line. In reality, though, things are not that simple, because H264VideoFileServerMediaSubsession overrides getAuxSDPLine()! If it did not override it, that would mean the aux SDP line had already been obtained when the file was analysed earlier; the fact that it does override it tells us the line has not been obtained yet and has to be produced inside this function. Here it is in H264VideoFileServerMediaSubsession:

  char const* H264VideoFileServerMediaSubsession::getAuxSDPLine(
          RTPSink* rtpSink,
          FramedSource* inputSource)
  {
      if (fAuxSDPLine != NULL)
          return fAuxSDPLine; // it's already been set up (for a previous client)

      if (fDummyRTPSink == NULL) { // we're not already setting it up for another, concurrent stream
          // Note: For H264 video files, the 'config' information ("profile-level-id" and
          // "sprop-parameter-sets") isn't known until we start reading the file.
          // This means that "rtpSink"s "auxSDPLine()" will be NULL initially,
          // and we need to start reading data from our file until this changes.
          fDummyRTPSink = rtpSink;

          // Start reading the file:
          fDummyRTPSink->startPlaying(*inputSource, afterPlayingDummy, this);

          // Check whether the sink's 'auxSDPLine()' is ready:
          checkForAuxSDPLine(this);
      }

      envir().taskScheduler().doEventLoop(&fDoneFlag);

      return fAuxSDPLine;
  }
The comments explain it clearly: for H264, the SPS/PPS cannot be read from a file header (it is a raw elementary-stream file and has no header), so they only become available after playing the stream for a bit. In other words, they cannot be obtained from rtpSink straight away. To guarantee that the aux SDP line is available before the function returns, the main event loop is pulled into this function. afterPlayingDummy() runs when the dummy playback ends, i.e. once the aux SDP has been obtained. So what does the checkForAuxSDPLine() call before the event loop do?
  void H264VideoFileServerMediaSubsession::checkForAuxSDPLine1()
  {
      char const* dasl;

      if (fAuxSDPLine != NULL) {
          // Signal the event loop that we're done:
          setDoneFlag();
      } else if (fDummyRTPSink != NULL
              && (dasl = fDummyRTPSink->auxSDPLine()) != NULL) {
          fAuxSDPLine = strDup(dasl);
          fDummyRTPSink = NULL;

          // Signal the event loop that we're done:
          setDoneFlag();
      } else {
          // try again after a brief delay:
          int uSecsToDelay = 100000; // 100 ms
          nextTask() = envir().taskScheduler().scheduleDelayedTask(uSecsToDelay,
                  (TaskFunc*) checkForAuxSDPLine, this);
      }
  }

It checks whether the aux SDP line has already been obtained; if so, it sets the done flag and returns immediately. If not, it checks whether the sink has produced the aux SDP line by now; if it has, it also sets the done flag and returns. If the line is still not available, the check is rescheduled as a delayed task, so it repeats every 100 milliseconds, and each check essentially amounts to one call to fDummyRTPSink->auxSDPLine(). The event loop stops as soon as it sees fDoneFlag change, at which point the aux SDP line has been obtained. If, however, the end of the file is reached without ever getting the aux SDP line, afterPlayingDummy() runs and stops the event loop from there. Afterwards, the parent subsession class closes these temporary source and sink objects; they are recreated when real playback starts.
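For completeness, here are the helpers this machinery relies on, paraphrased from the live555 sources (the exact code may differ slightly between live555 versions): the static trampolines that the scheduler and startPlaying() call back into, the handler that stops the loop when the dummy playback ends, and setDoneFlag(), which is just a flag write that doEventLoop() watches:

  // Static trampoline used as the delayed task: forward to the member function.
  static void checkForAuxSDPLine(void* clientData) {
      H264VideoFileServerMediaSubsession* subsess
          = (H264VideoFileServerMediaSubsession*)clientData;
      subsess->checkForAuxSDPLine1();
  }

  // Called by startPlaying() when the dummy playback reaches the end of the file:
  static void afterPlayingDummy(void* clientData) {
      H264VideoFileServerMediaSubsession* subsess
          = (H264VideoFileServerMediaSubsession*)clientData;
      subsess->afterPlayingDummy1();
  }

  void H264VideoFileServerMediaSubsession::afterPlayingDummy1() {
      // Unschedule any pending 'checking' task:
      envir().taskScheduler().unscheduleDelayedTask(nextTask());
      // Signal the event loop that we're done:
      setDoneFlag();
  }

  // In the class declaration: setting any bit of fDoneFlag makes
  // doEventLoop(&fDoneFlag) return.
  void setDoneFlag() { fDoneFlag = ~0; }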

Original article: http://blog.csdn.net/nkmnkm/article/details/6931400
