MySQL: with tens of millions of rows, paging gets slow as the offset grows. How can the speed be optimized?
伊谢尔伦 2017-05-18 10:48:01

Take MySQL as an example. When the amount of data is large, the offset used for paging grows, and query efficiency drops accordingly.

For example, if a post has 2 million comments and 10 are displayed per page, that makes 200,000 pages. How should the query be handled when fetching page 200,000? Is there a good way to solve this?

Example: on NetEase Music, a single song can have more than 1.4 million comments, yet its paging is still very fast.


Replies (2)
给我你的怀抱

First, I'm sure NetEase Music reads that data from a NoSQL store rather than hitting the relational database directly.
That said, if your table isn't accessed very frequently, you can still query the database directly. MySQL's InnoDB engine has an annoying weakness here: the deeper the page, the worse the query performs, because a large offset forces it to scan and discard all the skipped rows. So the usual approach is to page by id to keep the query efficient.
Old style: `LIMIT 100000, 10`. Recommended style: `WHERE id > 100000 LIMIT 10`. This guarantees the index is used. (Note this assumes ids are sequential without gaps; more generally, you pass the last id seen on the previous page.)
You can also split the data across multiple tables to reduce the per-table row count and improve query efficiency.
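The id-based technique above is often called keyset (or "seek") pagination. A minimal sketch of the idea, using Python's built-in sqlite3 so the example is self-contained (the reply's queries are MySQL; the table name `comments` and the ids here are made up for illustration):

```python
import sqlite3

# In-memory table standing in for a comments table with an indexed primary key.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE comments (id INTEGER PRIMARY KEY, body TEXT)")
conn.executemany(
    "INSERT INTO comments (id, body) VALUES (?, ?)",
    ((i, f"comment {i}") for i in range(1, 1001)),
)

PAGE_SIZE = 10

# OFFSET paging: the engine must walk past every skipped row before
# it can return the page, so cost grows with the offset.
offset_page = conn.execute(
    "SELECT id FROM comments ORDER BY id LIMIT ? OFFSET ?",
    (PAGE_SIZE, 500),
).fetchall()

# Keyset paging: remember the last id of the previous page and seek
# past it with an index range scan; cost stays flat at any depth.
last_seen_id = 500  # last id returned on the previous page
keyset_page = conn.execute(
    "SELECT id FROM comments WHERE id > ? ORDER BY id LIMIT ?",
    (last_seen_id, PAGE_SIZE),
).fetchall()

# Both queries return the same rows (ids 501..510 here).
assert offset_page == keyset_page
print([row[0] for row in keyset_page])
```

The trade-off is that keyset pagination only supports "next page" style navigation (you need the previous page's last id), whereas `OFFSET` lets users jump to an arbitrary page number.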
