See this blog post for the rationale and benchmarks.

tl;dr: when writing an array of objects, this library can speed things up, especially as the number of objects in the array and the number of columns in each object grow.
If you already have the entire array in memory, there is a helper function that writes it all out in one call:
```go
package main

import (
	"os"

	// The package is named jsonutil, so the import needs an alias
	jsonutil "github.com/multiprocessio/go-json"
)

func main() {
	// Uses stdlib's encoding/json under the hood
	data := []interface{}{
		map[string]interface{}{"a": 1, "b": 2},
		map[string]interface{}{"a": 5, "c": 3, "d": "xyz"},
	}

	out := os.Stdout // Can be any io.Writer
	err := jsonutil.Encode(out, data)
	if err != nil {
		panic(err)
	}
}
```
If you are producing rows one at a time rather than all at once, the library also supports stream encoding:
```go
package main

import (
	"os"

	// The package is named jsonutil, so the import needs an alias
	jsonutil "github.com/multiprocessio/go-json"
)

func main() {
	// Uses stdlib's encoding/json under the hood
	data := []interface{}{
		map[string]interface{}{"a": 1, "b": 2},
		map[string]interface{}{"a": 5, "c": 3, "d": "xyz"},
	}

	out := os.Stdout // Can be any io.Writer
	encoder := jsonutil.NewStreamEncoder(out, true) // true: wrap the rows in a single JSON array

	for _, row := range data {
		err := encoder.EncodeRow(row)
		if err != nil {
			panic(err)
		}
	}

	err := encoder.Close()
	if err != nil {
		panic(err)
	}
}
```
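To make the streaming idea concrete, here is a minimal, hypothetical sketch of what a stream encoder for a JSON array does, built only on the stdlib's `encoding/json` (the type and function names are illustrative, not this library's implementation): write `[` before the first row, a comma before each subsequent row, and `]` on close.

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"io"
)

// sketchEncoder is a hypothetical illustration of streaming a JSON array
// out row by row. It is NOT this library's actual implementation.
type sketchEncoder struct {
	w       io.Writer
	started bool
}

// EncodeRow writes one row, prefixed with "[" for the first row and ","
// for every row after that.
func (e *sketchEncoder) EncodeRow(row interface{}) error {
	sep := ","
	if !e.started {
		sep = "["
		e.started = true
	}
	if _, err := io.WriteString(e.w, sep); err != nil {
		return err
	}
	b, err := json.Marshal(row)
	if err != nil {
		return err
	}
	_, err = e.w.Write(b)
	return err
}

// Close terminates the array, emitting "[]" if no rows were written.
func (e *sketchEncoder) Close() error {
	if !e.started {
		_, err := io.WriteString(e.w, "[]")
		return err
	}
	_, err := io.WriteString(e.w, "]")
	return err
}

// encodeAll streams every row through the sketch encoder and returns the
// resulting JSON string.
func encodeAll(rows []interface{}) (string, error) {
	var buf bytes.Buffer
	enc := &sketchEncoder{w: &buf}
	for _, row := range rows {
		if err := enc.EncodeRow(row); err != nil {
			return "", err
		}
	}
	if err := enc.Close(); err != nil {
		return "", err
	}
	return buf.String(), nil
}

func main() {
	out, err := encodeAll([]interface{}{
		map[string]interface{}{"a": 1, "b": 2},
		map[string]interface{}{"a": 5, "c": 3, "d": "xyz"},
	})
	if err != nil {
		panic(err)
	}
	fmt.Println(out) // prints [{"a":1,"b":2},{"a":5,"c":3,"d":"xyz"}]
}
```

The point of this shape is that only one row is marshalled at a time, so memory use stays flat no matter how many rows pass through; the real library presumably adds the optimizations benchmarked in the blog post above.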