When working with large JSON payloads, decoding the entire document into memory at once can be inefficient or outright impractical. In this article, we explore an alternative approach: using json.Decoder to decode the JSON incrementally as it is streamed in.
The json.Decoder provides the Decoder.Token() method, which allows us to parse the next token in the JSON stream without consuming the entire object. This enables event-driven parsing, where we can process tokens incrementally and build a state machine to track our progress within the JSON structure.
Let's walk through an implementation that addresses this scenario: a top-level object containing ordinary properties plus an "items" array of large objects.
```go
package main

import (
	"encoding/json"
	"fmt"
	"io"
	"log"
)

// LargeObject is one element of the "items" array.
type LargeObject struct {
	Id   string `json:"id"`
	Data string `json:"data"`
}

// he is a helper error handler that aborts on any error.
func he(err error) {
	if err != nil {
		log.Fatal(err)
	}
}

// ParseStream incrementally parses and processes a JSON object stream.
func ParseStream(reader io.Reader) {
	dec := json.NewDecoder(reader)

	// Expect the opening brace of the top-level object.
	t, err := dec.Token()
	he(err)
	if delim, ok := t.(json.Delim); !ok || delim != '{' {
		log.Fatal("Expected object")
	}

	// Read properties one by one.
	for dec.More() {
		t, err = dec.Token()
		he(err)
		prop := t.(string)
		if prop != "items" {
			// Ordinary property: decode its value in one step.
			var v interface{}
			he(dec.Decode(&v))
			log.Printf("Property '%s' = %v", prop, v)
			continue
		}

		// "items": expect the opening bracket of the array.
		t, err = dec.Token()
		he(err)
		if delim, ok := t.(json.Delim); !ok || delim != '[' {
			log.Fatal("Expected array")
		}

		// Decode the large objects one at a time.
		for dec.More() {
			lo := LargeObject{}
			he(dec.Decode(&lo))
			fmt.Printf("Item: %+v\n", lo)
		}

		// Consume the array's closing bracket.
		t, err = dec.Token()
		he(err)
		if delim, ok := t.(json.Delim); !ok || delim != ']' {
			log.Fatal("Expected array closing")
		}
	}

	// Consume the object's closing brace.
	t, err = dec.Token()
	he(err)
	if delim, ok := t.(json.Delim); !ok || delim != '}' {
		log.Fatal("Expected object closing")
	}
}
```
This implementation processes the JSON object incrementally: ordinary properties are decoded in a single step, while each element of the "items" array is decoded separately, so only one large object is held in memory at a time. The he() helper handles errors with a fatal exit.
By avoiding loading the entire JSON response into memory, this approach allows for efficient processing of large payloads.