
How to Avoid Memory Exhaustion Errors when Using file_get_contents() with Large Files?

file_get_contents() Memory Exhaustion: A Comprehensive Solution

When processing large files, the infamous PHP Fatal error: Allowed memory size of N bytes exhausted can be a recurring issue. It arises because file_get_contents() reads the entire contents of the file into memory as a single string, which for a sizeable file easily exceeds PHP's configured memory_limit.
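
As a minimal illustration of the failure mode (the file path and limit here are hypothetical), the following will fatal as soon as the file is larger than the memory limit:

<code class="php">// Hypothetical example: with a 128M limit, reading a multi-gigabyte
// file this way allocates the whole file as one string and triggers
// the "Allowed memory size ... exhausted" fatal error.
ini_set('memory_limit', '128M');

$data = file_get_contents('/path/to/huge.log'); // entire file loaded at once
echo strlen($data);</code>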

Alternative to file_get_contents()

Instead of loading the entire file into memory, a more efficient approach is to open the file as a stream and read it in smaller chunks using fread(). This keeps memory usage bounded by the chunk size rather than the file size, which is critical for handling large files.

Below is a custom function that reads a file in chunks and hands each chunk to a callback, loosely modelled on Node.js's streaming file API:

<code class="php">function file_get_contents_chunked($file, $chunk_size, $callback)
{
    // fopen() does not throw exceptions on failure; it emits a warning
    // and returns false, so check the return value explicitly rather
    // than wrapping it in try/catch.
    $handle = fopen($file, "r");
    if ($handle === false) {
        trigger_error("file_get_contents_chunked::unable to open {$file}", E_USER_NOTICE);
        return false;
    }

    $i = 0;
    while (!feof($handle)) {
        // Read one chunk and pass it to the callback, together with the
        // file handle and the current iteration count.
        $callback(fread($handle, $chunk_size), $handle, $i);
        $i++;
    }

    fclose($handle);

    return true;
}</code>

This function accepts three parameters: the file path, the desired chunk size, and a callback function that will be called for each chunk read.

Using the Custom Function

The file_get_contents_chunked() function can be used as follows:

<code class="php">$success = file_get_contents_chunked("my/large/file", 4096, function ($chunk, &$handle, $iteration) {
    /* Process the chunk here... */
});</code>
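
As a concrete (hypothetical) example, the callback below counts how many times the string "ERROR" appears in a large log file while holding only 4 KB in memory at a time:

<code class="php">$count = 0;

$success = file_get_contents_chunked("my/large/file.log", 4096, function ($chunk, &$handle, $iteration) use (&$count) {
    // Count occurrences within this chunk. Note that a match straddling
    // a chunk boundary would be missed; for boundary-sensitive searches,
    // carry the tail of the previous chunk over into the next one.
    $count += substr_count($chunk, "ERROR");
});

if ($success) {
    echo "Found {$count} occurrences\n";
}</code>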

Regex Considerations

Running multiple regex operations over every chunk of data is comparatively expensive. Where the pattern is a fixed string, prefer native string manipulation functions like strpos(), substr(), trim(), and explode().
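
For instance, extracting the value after a fixed prefix does not need the regex engine at all. A minimal sketch (the field name here is purely illustrative):

<code class="php">$line = "status: 200 OK";

// Regex version: compiles and runs a pattern for a fixed-string search.
if (preg_match('/^status: (.*)$/', $line, $m)) {
    $status = $m[1];
}

// Native version: a plain offset lookup, no regex engine involved.
$prefix = "status: ";
if (strpos($line, $prefix) === 0) {
    $status = substr($line, strlen($prefix));
}</code>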

Example Cleanup

For example, the following attempt to strip carriage returns fails, because "^M" here is a literal caret-and-M two-character string, not the carriage-return control character it is meant to represent:

<code class="php">$newData = str_replace("^M", "", $myData);</code>

Use an explicit escape sequence instead:

<code class="php">$pattern = '/\r\n/';
$replacement = '';
$newData = preg_replace($pattern, $replacement, $myData);</code>
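
Since /\r\n/ contains no regex metacharacters, a fixed-string replacement achieves the same result without the regex engine, which is consistent with the advice above:

<code class="php">// Equivalent fixed-string replacement; "\r\n" in double quotes is the
// actual CRLF sequence, unlike the literal "^M" above.
$newData = str_replace("\r\n", "", $myData);</code>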

By combining chunked reads with lightweight string handling, it is possible to process large files effectively without encountering memory exhaustion errors.
