A file this large cannot be processed by PHP in one go. To process a file that way, PHP must first read it into memory, and the memory each script may use is capped by memory_limit in the PHP configuration file. If your machine really has enough RAM, you can set it to unlimited and let the script use whatever the system can give.
The recommended approach is to cut the file into smaller pieces first: use a shell tool such as sed to split it into chunks of a size your PHP memory limit can handle, and then process the pieces with PHP.
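For the memory_limit part, a minimal sketch of checking and raising the limit at runtime; whether ini_set() is allowed, and what value makes sense, depends entirely on your host and your RAM:

```php
<?php
// Show the current per-script limit from php.ini.
echo 'Current memory_limit: ' . ini_get('memory_limit') . PHP_EOL;

// Try to raise it at runtime; "-1" would mean unlimited, as mentioned above.
// Many shared hosts forbid this, in which case edit memory_limit in php.ini.
ini_set('memory_limit', '2G');

echo 'New memory_limit: ' . ini_get('memory_limit') . PHP_EOL;
echo 'Peak usage so far: ' . round(memory_get_peak_usage(true) / 1048576) . " MB\n";
```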
Read it bit by bit; PHP has functions for reading a file line by line, a few lines at a time. But SQLite is a local database, and reading it is the driver's business, not PHP's. Just don't fetch and print everything at once.
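A minimal sketch of what "don't fetch everything at once" looks like with PDO; the file name huge.sqlite and the records table with id and payload columns are made up for illustration:

```php
<?php
// Open the SQLite file through PDO (the sqlite driver ships with PHP).
$pdo = new PDO('sqlite:huge.sqlite');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$stmt = $pdo->query('SELECT id, payload FROM records');
$seen = 0;

// fetch() pulls one row at a time; fetchAll() would try to build a PHP
// array holding every row, which is what blows up memory on a huge table.
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    // ... process $row here ...
    if (++$seen % 100000 === 0) {
        echo "processed {$seen} rows\n";
    }
}
```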
The data should be moved into MySQL and then read and written there from PHP. SQLite is meant to be a small, file-based database and does not cope well with huge data sets.
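A rough sketch of such a migration, copying rows from SQLite into MySQL in batches; the table name, columns and MySQL credentials are placeholders, not anything from the original question:

```php
<?php
$sqlite = new PDO('sqlite:huge.sqlite');
$sqlite->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$mysql = new PDO('mysql:host=127.0.0.1;dbname=bigdata;charset=utf8mb4', 'user', 'pass');
$mysql->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$select = $sqlite->query('SELECT id, payload FROM records');
$insert = $mysql->prepare('INSERT INTO records (id, payload) VALUES (?, ?)');

$mysql->beginTransaction();
$count = 0;
while ($row = $select->fetch(PDO::FETCH_NUM)) {
    $insert->execute($row);
    // Commit every 10 000 rows so one giant transaction does not pile up
    // and the source rows are still streamed one at a time.
    if (++$count % 10000 === 0) {
        $mysql->commit();
        $mysql->beginTransaction();
    }
}
$mysql->commit();
echo "copied {$count} rows\n";
```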
Shocked that SQLite is still being used for this much data. Besides, if you read dozens of gigabytes into memory in one go, it will certainly blow up the memory unless you really do have dozens of gigabytes of RAM. It is recommended to split this database or convert it to another kind of database for processing.
I don't know SQLite well, but processing such a large file directly is bound to be hard. The general idea, though, is simply to break it up: by database, by table, or something along those lines.
PHP's text processing is not great and quite slow. Importing into MySQL first and processing there is feasible, but it takes more time.
Since it is an SQLite file, just follow database read -> process -> store. PHP can use SQLite directly out of the box.
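A sketch of that read -> process -> store loop against the SQLite file itself, paging by rowid so only one chunk is in memory at a time; table and column names, and the "processing" step, are assumptions:

```php
<?php
$pdo = new PDO('sqlite:huge.sqlite');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE IF NOT EXISTS results (id INTEGER PRIMARY KEY, summary TEXT)');

$chunk  = 5000;
$lastId = 0;
$read  = $pdo->prepare('SELECT rowid AS id, payload FROM records WHERE rowid > ? ORDER BY rowid LIMIT ' . $chunk);
$write = $pdo->prepare('INSERT OR REPLACE INTO results (id, summary) VALUES (?, ?)');

do {
    // Keyset pagination: continue from the last seen rowid instead of
    // using OFFSET, which gets slower the deeper it goes.
    $read->execute([$lastId]);
    $rows = $read->fetchAll(PDO::FETCH_ASSOC);

    $pdo->beginTransaction();
    foreach ($rows as $row) {
        // Placeholder "processing": store the payload length per row.
        $write->execute([$row['id'], strlen($row['payload'])]);
        $lastId = $row['id'];
    }
    $pdo->commit();
} while (count($rows) === $chunk);
```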
Convert the SQLite database to MySQL and then optimize the indexes; there should be no problem after that.
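On the index side, building the index once after the bulk import is usually much faster than maintaining it during millions of INSERTs; a small sketch, where the created_at column is hypothetical:

```php
<?php
$mysql = new PDO('mysql:host=127.0.0.1;dbname=bigdata;charset=utf8mb4', 'user', 'pass');
$mysql->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Index the column(s) the queries actually filter or sort by.
$mysql->exec('CREATE INDEX idx_records_created_at ON records (created_at)');
```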