Adaptive playout delay problem

No replies to this topic

#1 bibhas


  • Members
  • 1 posts
Posted 02 May 2015 - 06:43 AM

Hey guys, need your help to solve this problem.

Consider the procedure for determining the playout delay for a VoIP connection. Suppose that u = 0.2 and K = 4, and that the following packets are generated: packet 1 at time 0, packet 2 at 20 ms, packet 3 at 40 ms, packet 4 at 60 ms, packet 5 at 200 ms, packet 6 at 220 ms, and packet 7 at 240 ms. The receiving time for packet i is ri, where r1 = 210 ms, r2 = 270 ms, r3 = 250 ms, r4 = 380 ms, r5 = 450 ms, r6 = 510 ms, and r7 = 470 ms. Packet 1 is the first packet generated by this VoIP connection. Specify the playout time for packet 6 and packet 7, respectively.

Justify your answer.
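For anyone working through this, the procedure the question seems to refer to is the exponentially weighted moving-average (EWMA) playout-delay estimator described in Kurose & Ross's Computer Networking text: d_i = (1-u)·d_{i-1} + u·(r_i - t_i) estimates the average network delay, v_i = (1-u)·v_{i-1} + u·|r_i - t_i - d_i| estimates its deviation, and the first packet of each talk spurt is played out at p_i = t_i + d_i + K·v_i, with later packets in the same spurt keeping the same offset. Here is a sketch of that scheme applied to the numbers above; note that the initialization (d_1 = r_1 - t_1, v_1 = 0) and the talk-spurt rule (a generation gap larger than the 20 ms packetization interval, so packet 5 starts a new spurt) are assumptions, since the problem statement does not pin them down:

```python
# EWMA playout-delay estimator (Kurose & Ross style):
#   d_i = (1 - u) * d_{i-1} + u * (r_i - t_i)        EWMA of network delay
#   v_i = (1 - u) * v_{i-1} + u * |r_i - t_i - d_i|  EWMA of delay deviation
# First packet of a talk spurt is scheduled at t_i + d_i + K * v_i;
# remaining packets in the spurt reuse that playout offset.
# Assumed initialization: d_1 = r_1 - t_1 and v_1 = 0 (conventions vary).

u, K = 0.2, 4
t = [0, 20, 40, 60, 200, 220, 240]       # generation times (ms)
r = [210, 270, 250, 380, 450, 510, 470]  # receive times (ms)

d = float(r[0] - t[0])  # initial delay estimate
v = 0.0                 # initial deviation estimate
offset = 0.0            # playout offset of the current talk spurt
playout = []

for i in range(len(t)):
    if i > 0:  # update the running estimates for every packet after the first
        d = (1 - u) * d + u * (r[i] - t[i])
        v = (1 - u) * v + u * abs(r[i] - t[i] - d)
    # A gap in generation times longer than the 20 ms packetization
    # interval marks a new talk spurt (packet 5 here, t jumps 60 -> 200).
    if i == 0 or t[i] - t[i - 1] > 20:
        offset = d + K * v
    playout.append(t[i] + offset)

for i, p in enumerate(playout, 1):
    print(f"packet {i}: scheduled playout at {p:.4f} ms")
```

Under these assumptions the sketch schedules packet 6 at about 537.37 ms and packet 7 at about 557.37 ms (both using the offset fixed when packet 5 opened the second talk spurt). A different initialization convention for d_1 and v_1 would shift the numbers, so state whichever convention you use when you justify the answer.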
