[!IMPORTANT] Internally, most of LinkedIn has moved over to use https://github.com/hamba/avro for Avro serialization/deserialization needs as we found it to be significantly more performant in large-scale scenarios. goavro is in maintenance mode.
goavro
Goavro is a library that encodes and decodes Avro data.
Description
- Encodes to and decodes from both binary and textual JSON Avro data.
- Codec is stateless and safe for use by multiple goroutines.
- With the exception of features not yet supported, goavro attempts to be fully compliant with the most recent version of the Avro specification.
Dependency Notice
All usage of gopkg.in has been removed in favor of Go modules.
Please update your import paths to github.com/linkedin/goavro/v2. v1
users can still use old versions of goavro by adding a constraint to
their go.mod or Gopkg.toml file.
require (
github.com/linkedin/goavro v1.0.5
)
[[constraint]]
name = "github.com/linkedin/goavro"
version = "=1.0.5"
Major Improvements in v2 over v1
Avro namespaces
The original version of this library was written before I really understood how Avro namespaces ought to work. After using Avro for a long time, and after a lot of research, I believe I now grok Avro namespaces properly, and the library correctly handles every test case the Apache Avro distribution has for namespaces, including being able to refer to a previously defined data type later in the same schema.
Getting Data into and out of Records
The original version of this library required creating goavro.Record
instances, and use of getters and setters to access a record's
fields. When schemas were complex, this required a lot of work to
debug and get right. The original version also required users to break
schemas in chunks, and have a different schema for each record
type. This was cumbersome, annoying, and error prone.
The new version of this library eliminates the goavro.Record type,
and accepts a native Go map for all records to be encoded. Keys are
the field names, and values are the field values. Nothing could be
easier. Conversely, decoding Avro data yields a native Go map from
which the upstream client can pull data back out.
Furthermore, there is never a reason to break your schema down into
per-record schemas. Merely feed the entire schema into the
NewCodec function once when you create the Codec, then use
it. This library knows how to parse the data provided to it and ensure
data values for records and their fields are properly encoded and
decoded.
3x to 4x Performance Improvement
The original version of this library was written with Go's idea
of io.Reader and io.Writer composition in mind. Although
composition is a powerful tool, it meant the original library had to
pull bytes off the io.Reader, often one byte at a time, check for read
errors, decode the bytes, and repeat. This version operates on native
Go byte slices instead; as a result, decoding and encoding complex
Avro data here at LinkedIn is between three and four times faster than
before.
Avro JSON Support
The original version of this library did not support JSON encoding or decoding, because it wasn't deemed useful for our internal use at the time. When writing the new version of the library I decided to tackle this issue once and for all, because so many engineers needed this functionality for their work.
Better Handling of Record Field Default Values
The original version of this library did not well handle default values for record fields. This version of the library uses a default value of a record field when encoding from native Go data to Avro data and the record field is not specified. Additionally, when decoding from Avro JSON data to native Go data, and a field is not specified, the default value will be used to populate the field.
Contrast With Code Generation Tools
If you have the ability to rebuild and redeploy your software whenever data schemas change, code generation tools might be the best solution for your application.
There are numerous excellent tools for generating source code to
translate data between native and Avro binary or textual data. One
such tool is linked below. If a particular application is designed to
work with a rarely changing schema, programs that use code generated
functions can potentially be more performant than a program that uses
goavro to create a Codec dynamically at run time.
I recommend benchmarking the resulting programs on typical data, using both the code-generated functions and goavro, to see which performs better. Not all code-generated functions will outperform goavro for all data corpuses.
If you don't have the ability to rebuild and redeploy software updates
whenever a data schema change occurs, goavro could be a great fit for
your needs. With goavro, your program can be given a new schema while
running, compile it into a Codec on the fly, and immediately start
encoding or decoding data using that Codec. Because Avro encoding
specifies that encoded data always be accompanied by a schema, this is
not usually a problem. If the schema change is backwards compatible,
and the portion of your program that handles the decoded data can
still reference the decoded fields, nothing needs to be done when your
program detects the schema change while using goavro Codec instances
to encode or decode data.
Resources
Usage
Documentation is available via GoDoc.
package main
import (
"fmt"
"github.com/linkedin/goavro/v2"
)
func main() {
codec, err := goavro.NewCodec(`
{
"type": "record",
"name": "LongList",
"fields" : [
{"name": "next", "type": ["null", "LongList"], "default": null}
]
}`)
if err != nil {
fmt.Println(err)
}
// NOTE: May omit fields when using default value
textual := []byte(`{"next":{"LongList":{}}}`)
// Convert textual Avro data (in Avro JSON format) to native Go form
native, _, err := codec.NativeFromTextual(textual)
if err != nil {
fmt.Println(err)
}
// Convert native Go form to binary Avro data
binary, err := codec.BinaryFromNative(nil, native)
if err != nil {
fmt.Println(err)
}
// Convert binary Avro data back to native Go form
native, _, err = codec.NativeFromBinary(binary)
if err != nil {
fmt.Println(err)
}
// Convert native Go form to textual Avro data
textual, err = codec.TextualFromNative(nil, native)
if err != nil {
fmt.Println(err)
}
// NOTE: Textual encoding will show all fields, even those with values that
// match their default values
fmt.Println(string(textual))
// Output: {"next":{"LongList":{"next":null}}}
}
Also please see the example programs in the examples directory for
reference.
OCF file reading and writing
This library also supports reading and writing data in the Object Container File (OCF) format.
package main
import (
"bytes"
"fmt"
"strings"
"github.com/linkedin/goavro/v2"
)
func main() {
avroSchema := `
{
"type": "record",
"name": "test_schema",
"fields": [
{
"name": "time",
"type": "long"
},
{
"name": "customer",
"type": "string"
}
]
}`
// Writing OCF data
var ocfFileContents bytes.Buffer
writer, err := goavro.NewOCFWriter(goavro.OCFConfig{
W: &ocfFileContents,
Schema: avroSchema,
})
if err != nil {
fmt.Println(err)
}
err = writer.Append([]map[string]interface{}{
{
"time": 1617104831727,
"customer": "customer1",
},
{
"time": 1717104831727,
"customer": "customer2",
},
})
if err != nil {
fmt.Println(err)
}
fmt.Println("ocfFileContents", ocfFileContents.String())
// Reading OCF data
ocfReader, err := goavro.NewOCFReader(strings.NewReader(ocfFileContents.String()))
if err != nil {
fmt.Println(err)
}
fmt.Println("Records in OCF File")
for ocfReader.Scan() {
record, err := ocfReader.Read()
if err != nil {
fmt.Println(err)
}
fmt.Println("record", record)
}
}
ab2t
The ab2t program is similar to the reference standard
avrocat program and converts Avro OCF files to Avro JSON
encoding.
arw
The Avro-ReWrite program, arw, can be used to rewrite an
Avro OCF file while optionally changing the block counts or the
compression algorithm. arw can also upgrade the schema, provided the
existing datum values can be encoded with the newly provided schema.
avroheader
The Avro Header program, avroheader, can be used to print various
header information from an OCF file.
splice
The splice program can be used to splice together an OCF file from
an Avro schema file and a raw Avro binary data file.
Translating Data
A Codec provides four methods for translating between a byte slice
of either binary or textual Avro data and native Go data.
The following methods convert data between native Go data and byte slices of the binary Avro representation:
BinaryFromNative
NativeFromBinary
The following methods convert data between native Go data and byte slices of the textual Avro representation:
NativeFromTextual
TextualFromNative
Each Codec also exposes the Schema method to return a simplified
version of the JSON schema string used to create the Codec.