The operating environment of this tutorial: Windows 7 system, Dell G3 computer.
What does network jitter mean?
Network jitter refers to the variation in packet delay that arises when network congestion occurs: queuing delay affects end-to-end latency, so packets transmitted over the same connection experience different delays, and jitter describes the extent of this variation. Generally, the network jitter value is the difference between the maximum and minimum delay observed in network communication. The smaller the jitter value, the more stable the network quality.
For example, assuming that the maximum delay of network A is 15 milliseconds and the minimum delay is 5 milliseconds, its network jitter value is 10 milliseconds (this is a rough simplification, for reference only). Jitter mainly indicates how stable a network is.
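The max-minus-min calculation above can be sketched in a few lines of Python. The delay samples below are made-up example values chosen to match the "network A" figures in the text:

```python
# Sketch: network jitter as the difference between the maximum and
# minimum observed delay. The sample delays (in milliseconds) are
# hypothetical values for illustration only.
delays_ms = [5, 9, 12, 7, 15, 6]

jitter_ms = max(delays_ms) - min(delays_ms)
print(f"Jitter: {jitter_ms} ms")  # 15 - 5 = 10 ms
```

Real tools such as `ping` report similar statistics (min/avg/max/mdev); the max-minus-min value here is the simple definition the article uses.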
How many milliseconds of network jitter is considered normal?
Network jitter of 1~100ms is considered normal; that is, anything below 100ms is normal.
1~30ms: Extremely fast, with almost no noticeable delay; any game plays extremely smoothly.
31~50ms: Good; games play normally with no obvious delay.
51~100ms: Normal; in competitive games you can feel delays and occasional pauses.
100ms~200ms: Poor; competitive games cannot be played normally, with obvious lag and occasional packet loss or disconnection.
200ms~500ms: Very poor; web pages load with obvious delays and freezes, and packet loss or inaccessibility occurs often.
>500ms: Extremely poor; delay and packet loss are unacceptable, and web pages may not load at all.
>1000ms: Basically inaccessible.
Unit conversion: 1 second = 1000 milliseconds (for example, 30ms is 0.03 seconds).
Extended knowledge:
The impact of network jitter on video:
Video is displayed through rendering. If the data packets supplied to the renderer arrive sometimes slowly and sometimes quickly, the rendering likewise speeds up and slows down, so the video may appear to play suddenly fast and suddenly slow, or appear to freeze.
How a jitter buffer mitigates the impact of network jitter on video:
By measuring network delay we can estimate the size of the jitter and set an appropriate buffer size to store received packets. Suppose the jitter is large at first. We create a buffer to receive data but do not pass it to the decoder or renderer immediately; instead, we wait until a delay determined by the jitter size has elapsed, then feed the buffered data to decoding and rendering.
The buffer holds multiple video frames, so the data the decoder fetches from it is continuous in time, and the video plays smoothly rather than alternating between fast and slow. However, a jitter buffer inevitably introduces a noticeable delay between the rendered video and the source video.
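The buffering idea described above can be sketched as a small Python class. All names, the fixed-delay policy, and the frame format are illustrative assumptions, not a real codec or player API:

```python
import heapq

class JitterBuffer:
    """Minimal jitter-buffer sketch: hold each frame for `delay_ms`
    after its arrival, then release frames in timestamp order so the
    decoder sees a steady, time-ordered stream."""

    def __init__(self, delay_ms):
        self.delay_ms = delay_ms
        # Heap ordered by frame timestamp: (timestamp_ms, arrival_ms, payload)
        self._heap = []

    def push(self, timestamp_ms, arrival_ms, payload):
        # Store an incoming frame instead of decoding it immediately.
        heapq.heappush(self._heap, (timestamp_ms, arrival_ms, payload))

    def pop_ready(self, now_ms):
        # Release frames whose buffering delay has expired, in
        # timestamp order, smoothing out irregular arrival times.
        ready = []
        while self._heap and now_ms - self._heap[0][1] >= self.delay_ms:
            ts, _, payload = heapq.heappop(self._heap)
            ready.append((ts, payload))
        return ready

# Usage: frames arrive at uneven times, but are released on a delayed,
# even schedule. With a 100 ms buffering delay:
buf = JitterBuffer(delay_ms=100)
buf.push(0, 0, "frame0")    # arrives at t=0 ms
buf.push(33, 30, "frame1")  # arrives at t=30 ms
print(buf.pop_ready(50))    # [] - nothing buffered long enough yet
print(buf.pop_ready(120))   # [(0, 'frame0')]
print(buf.pop_ready(200))   # [(33, 'frame1')]
```

The fixed 100 ms delay is the trade-off the text describes: arrival-time irregularities smaller than the buffering delay are absorbed, at the cost of the whole stream playing that much later than the source. Production jitter buffers (e.g. in VoIP or WebRTC stacks) adapt this delay dynamically.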