
How to Efficiently Download Large Files in Go Without Filling Up Memory?

Barbara Streisand
Release: 2024-11-13 12:58:02


Efficiently Downloading Large Files Using Go

Buffering an entire large file in memory can exhaust a machine's resources. How can we download such files in Go by streaming them straight to disk instead?

Answer:

Assuming the download is via HTTP:

import (
    "fmt"
    "io"
    "net/http"
    "os"
)

func DownloadFile(url, filepath string) (int64, error) {
    // Create the destination file
    out, err := os.Create(filepath)
    if err != nil {
        return 0, err
    }
    defer out.Close()

    // Start the HTTP request
    resp, err := http.Get(url)
    if err != nil {
        return 0, err
    }
    defer resp.Body.Close()

    // Fail early on non-200 responses instead of saving an error page to disk
    if resp.StatusCode != http.StatusOK {
        return 0, fmt.Errorf("unexpected status: %s", resp.Status)
    }

    // Stream the response body into the file in chunks,
    // never holding more than io.Copy's internal buffer in memory
    n, err := io.Copy(out, resp.Body)
    if err != nil {
        return 0, err
    }

    return n, nil
}

This works because http.Response.Body is an io.Reader. io.Copy streams from that Reader into the file in fixed-size chunks (32 KiB by default), so memory use stays constant regardless of how large the file is.


source:php.cn