Merge pull request #96 from ucan-wg/container-spec
container: add a specification
109	pkg/container/SPEC.md	Normal file
@@ -0,0 +1,109 @@
# UCAN container Specification v0.1.0

## Editors

* [Michael Muré], [Consensys]

## Authors

* [Michael Muré], [Consensys]
* [Hugo Dias]

## Language

The key words "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL NOT", "SHOULD", "SHOULD NOT", "RECOMMENDED", "NOT RECOMMENDED", "MAY", and "OPTIONAL" in this document are to be interpreted as described in [BCP 14] when, and only when, they appear in all capitals, as shown here.
# 0 Abstract

[User-Controlled Authorization Network (UCAN)][UCAN] is a trustless, secure, local-first, user-originated authorization and revocation scheme. This document describes a container format for transmitting one or more UCAN tokens as bytes, regardless of the transport.

# 1 Introduction

The UCAN spec itself is transport agnostic. This specification describes how to transfer one or more [UCAN] tokens bundled together, regardless of the transport.
# 2 Container format

## 2.1 Inner structure

UCAN tokens, regardless of their kind ([Delegation], [Invocation], [Revocation], [Promise]), MUST first be signed and serialized into DAG-CBOR bytes according to their respective specification. As the token's CID is not part of the serialized container, any CID returned by this operation is to be ignored.

All the tokens' bytes MUST be assembled in a [CBOR] array. The ordering of tokens in the array carries no meaning, and implementations MUST NOT rely on it. This array SHOULD NOT have duplicate entries.

That array is then inserted as the value under the `ctn-v1` string key in a CBOR map. There MUST NOT be any other keys.

For clarity, the CBOR shape is given below:
```json
{
  "ctn-v1": [
    <token1 bytes>,
    <token2 bytes>,
    <token3 bytes>
  ]
}
```
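For readers who want to see concrete wire bytes, here is a minimal hand-rolled sketch of that shape. It is illustrative only: `encodeContainer` is not part of any UCAN library, it only handles fewer than 24 tokens, and it always uses the 1-byte-length form for byte strings; a real implementation would use a CBOR codec.

```go
package main

import "fmt"

// encodeContainer hand-encodes the CBOR map {"ctn-v1": [<token bytes>...]}.
// Illustrative only: assumes fewer than 24 tokens, each under 256 bytes.
func encodeContainer(tokens [][]byte) []byte {
	out := []byte{0xA1}                       // map with 1 key/value pair
	out = append(out, 0x66)                   // text string of length 6
	out = append(out, []byte("ctn-v1")...)    // the version key
	out = append(out, 0x80|byte(len(tokens))) // array of len(tokens) items
	for _, tkn := range tokens {
		out = append(out, 0x58, byte(len(tkn))) // byte string, 1-byte length
		out = append(out, tkn...)
	}
	return out
}

func main() {
	ctn := encodeContainer([][]byte{{0x01, 0x02}, {0x03}})
	fmt.Printf("%x\n", ctn)
}
```

Feeding two dummy "tokens" through this produces a 16-byte container: one map header, the `ctn-v1` key, an array header, and the two length-prefixed byte strings.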
## 2.2 Serialization

To serialize the container into bytes, the inner CBOR structure MUST first be serialized according to the CBOR specification. The resulting bytes MAY be compressed with a supported algorithm, then MAY be encoded with a supported base encoding.

The following compression algorithms are REQUIRED to be supported:

- [GZIP]

The following base encodings are REQUIRED to be supported:

- base64, standard alphabet, with padding
- base64, URL alphabet, no padding

The CBOR bytes MUST be prepended with a single header byte indicating the selected combination of base encoding and compression. This header value MUST be set according to the following table:
| Header as hex | Header as ASCII | Base encoding           | Compression    |
|---------------|-----------------|-------------------------|----------------|
| 0x40          | @               | raw bytes               | no compression |
| 0x42          | B               | base64 std (padding)    | no compression |
| 0x43          | C               | base64 url (no padding) | no compression |
| 0x4D          | M               | raw bytes               | gzip           |
| 0x4F          | O               | base64 std (padding)    | gzip           |
| 0x50          | P               | base64 url (no padding) | gzip           |

For clarity, the resulting serialization has the form `<header byte><cbor bytes, optionally compressed, optionally encoded>`.
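A minimal sketch of that framing for one row of the table, the `0x50` combination (gzip, then base64 URL alphabet without padding). The `wrap`/`unwrap` names are illustrative, not part of any implementation; note that only the payload is encoded, while the header byte is prepended raw.

```go
package main

import (
	"bytes"
	"compress/gzip"
	"encoding/base64"
	"fmt"
	"io"
)

// wrap frames CBOR payload bytes as <0x50 header><gzipped, base64url payload>.
func wrap(cborBytes []byte) (string, error) {
	var gz bytes.Buffer
	zw := gzip.NewWriter(&gz)
	if _, err := zw.Write(cborBytes); err != nil {
		return "", err
	}
	if err := zw.Close(); err != nil { // flush the gzip stream
		return "", err
	}
	// The header byte 'P' (0x50) is prepended raw, outside the base64 payload.
	return "P" + base64.RawURLEncoding.EncodeToString(gz.Bytes()), nil
}

// unwrap reverses wrap for the 0x50 header.
func unwrap(s string) ([]byte, error) {
	if len(s) == 0 || s[0] != 'P' {
		return nil, fmt.Errorf("unexpected header")
	}
	raw, err := base64.RawURLEncoding.DecodeString(s[1:])
	if err != nil {
		return nil, err
	}
	zr, err := gzip.NewReader(bytes.NewReader(raw))
	if err != nil {
		return nil, err
	}
	return io.ReadAll(zr)
}

func main() {
	s, _ := wrap([]byte{0xA1, 0x66}) // stand-in payload, not a real container
	back, _ := unwrap(s)
	fmt.Printf("header=%c payload=%d bytes\n", s[0], len(back))
}
```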
# 3 FAQ

## 3.1 Why not include the UCAN CIDs?

Several attacks are possible if UCAN tokens aren't validated. If CIDs aren't validated, at least two attacks are possible: [privilege escalation] and [cache poisoning], as UCAN delegation proofs depend on a correct hash-linked structure.

By not including the CIDs in the container, the recipient is forced to hash (and thus validate) each token. If presented with a claimed CID paired with the token bytes, implementers could skip CID validation, breaking a core part of the proof chain's security model. Hash functions are very fast on a couple of kilobytes of data, so the overhead is still very low. Omitting the CIDs also significantly reduces the size of the container.
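The validation principle can be sketched as follows. This is illustrative only: a real UCAN CID wraps a multihash and codec rather than a bare SHA-256, but the point is the same, the recipient recomputes the digest from the received bytes instead of trusting a claimed value.

```go
package main

import (
	"bytes"
	"crypto/sha256"
	"fmt"
)

// digestOf recomputes a digest from the received token bytes.
// (Stand-in for real CID derivation, which uses multihash + codec.)
func digestOf(tokenBytes []byte) []byte {
	sum := sha256.Sum256(tokenBytes)
	return sum[:]
}

// validate compares a recomputed digest against an externally claimed one.
func validate(tokenBytes, claimedDigest []byte) bool {
	return bytes.Equal(digestOf(tokenBytes), claimedDigest)
}

func main() {
	token := []byte("some serialized token")
	fmt.Println(validate(token, digestOf(token)))    // honest claim
	fmt.Println(validate(token, make([]byte, 32)))   // tampered claim
}
```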
## 3.2 Why compress? Why not always compress?

Compression is a relatively demanding operation. Using it is therefore a tradeoff between size on the wire and CPU/memory usage, both when writing and when reading a container. The transport itself can make compression worthwhile or not: for example, HTTP/2 and HTTP/3 headers are already compressed, but HTTP/1 headers are not. As this is highly contextual, the choice is left to the final implementer.
# 4 Implementation recommendations

## 4.1 Dissociate reader and writer

While it is tempting to write a single implementation that both reads and writes a container, it is RECOMMENDED to split it into a dedicated reader and writer. The writer can simply accept arbitrary tokens as bytes, while the reader provides a read-only view with convenient access functions.
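A minimal sketch of that split (the names are illustrative, not the go-ucan API): the writer only accumulates opaque sealed tokens, while the reader is a separate read-only view built once at decode time.

```go
package main

import "fmt"

// Writer accumulates arbitrary sealed tokens as bytes; it never decodes them.
type Writer struct{ tokens [][]byte }

func (w *Writer) AddSealed(data []byte) { w.tokens = append(w.tokens, data) }

// Reader is a read-only view; a real reader would hold decoded, validated
// tokens keyed by CID and expose convenient accessors.
type Reader struct{ tokens [][]byte }

func (r Reader) Len() int { return len(r.tokens) }

func main() {
	var w Writer
	w.AddSealed([]byte{0x01})
	w.AddSealed([]byte{0x02})
	r := Reader{tokens: w.tokens} // built once, then only read from
	fmt.Println(r.Len())
}
```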
# 5 Acknowledgments

Many thanks to the [Fission] team, and in particular to [Brooklyn Zelenka], for creating and pushing [UCAN] and other critical pieces like [WNFS], and for generally being awesome and supportive people.
<!-- External Links -->

[BCP 14]: https://www.rfc-editor.org/info/bcp14
[Brooklyn Zelenka]: https://github.com/expede
[CBOR]: https://www.rfc-editor.org/rfc/rfc8949.html
[Consensys]: https://consensys.io/
[Delegation]: https://github.com/ucan-wg/delegation/tree/v1_ipld
[Fission]: https://fission.codes
[GZIP]: https://datatracker.ietf.org/doc/html/rfc1952
[Hugo Dias]: https://github.com/hugomrdias
[Invocation]: https://github.com/ucan-wg/invocation
[Michael Muré]: https://github.com/MichaelMure/
[Promise]: https://github.com/ucan-wg/promise/tree/v1-rc1
[Revocation]: https://github.com/ucan-wg/revocation/tree/first-draft
[UCAN]: https://github.com/ucan-wg/spec
[WNFS]: https://github.com/wnfs-wg
[cache poisoning]: https://en.wikipedia.org/wiki/Cache_poisoning
[privilege escalation]: https://en.wikipedia.org/wiki/Privilege_escalation
118	pkg/container/packaging.go	Normal file
@@ -0,0 +1,118 @@
package container

import (
	"compress/gzip"
	"encoding/base64"
	"errors"
	"fmt"
	"io"
)

const containerVersionTag = "ctn-v1"

type header byte

const (
	headerRawBytes             = header(0x40)
	headerBase64StdPadding     = header(0x42)
	headerBase64URL            = header(0x43)
	headerRawBytesGzip         = header(0x4D)
	headerBase64StdPaddingGzip = header(0x4F)
	headerBase64URLGzip        = header(0x50)
)
func (h header) encoder(w io.Writer) *payloadWriter {
	res := &payloadWriter{rawWriter: w, writer: w, header: h}

	switch h {
	case headerBase64StdPadding, headerBase64StdPaddingGzip:
		b64Writer := base64.NewEncoder(base64.StdEncoding, res.writer)
		res.writer = b64Writer
		res.closers = append([]io.Closer{b64Writer}, res.closers...)
	case headerBase64URL, headerBase64URLGzip:
		b64Writer := base64.NewEncoder(base64.RawURLEncoding, res.writer)
		res.writer = b64Writer
		res.closers = append([]io.Closer{b64Writer}, res.closers...)
	}

	switch h {
	case headerRawBytesGzip, headerBase64StdPaddingGzip, headerBase64URLGzip:
		gzipWriter := gzip.NewWriter(res.writer)
		res.writer = gzipWriter
		res.closers = append([]io.Closer{gzipWriter}, res.closers...)
	}

	return res
}
func decodePayload(r io.Reader) (io.Reader, error) {
	headerBuf := make([]byte, 1)
	// io.ReadFull (rather than a bare Read) guarantees the header byte is
	// actually read: Read is allowed to return 0 bytes with a nil error.
	_, err := io.ReadFull(r, headerBuf)
	if err != nil {
		return nil, err
	}
	h := header(headerBuf[0])

	switch h {
	case headerRawBytes,
		headerBase64StdPadding,
		headerBase64URL,
		headerRawBytesGzip,
		headerBase64StdPaddingGzip,
		headerBase64URLGzip:
	default:
		return nil, fmt.Errorf("unknown container header 0x%02X", headerBuf[0])
	}

	switch h {
	case headerBase64StdPadding, headerBase64StdPaddingGzip:
		r = base64.NewDecoder(base64.StdEncoding, r)
	case headerBase64URL, headerBase64URLGzip:
		r = base64.NewDecoder(base64.RawURLEncoding, r)
	}

	switch h {
	case headerRawBytesGzip, headerBase64StdPaddingGzip, headerBase64URLGzip:
		gzipReader, err := gzip.NewReader(r)
		if err != nil {
			return nil, err
		}
		r = gzipReader
	}

	return r, nil
}
var _ io.WriteCloser = &payloadWriter{}

// payloadWriter is tasked with two things:
//   - prepend the header byte
//   - call Close() on all the underlying io.Writers
type payloadWriter struct {
	rawWriter   io.Writer
	writer      io.Writer
	header      header
	headerWrote bool
	closers     []io.Closer
}

func (w *payloadWriter) Write(p []byte) (n int, err error) {
	if !w.headerWrote {
		_, err := w.rawWriter.Write([]byte{byte(w.header)})
		if err != nil {
			return 0, err
		}
		w.headerWrote = true
	}
	return w.writer.Write(p)
}

func (w *payloadWriter) Close() error {
	var errs error
	for _, closer := range w.closers {
		if err := closer.Close(); err != nil {
			errs = errors.Join(errs, err)
		}
	}
	return errs
}
@@ -2,7 +2,6 @@ package container

 import (
 	"bytes"
-	"encoding/base64"
 	"errors"
 	"fmt"
 	"io"
@@ -11,7 +10,7 @@ import (

 	"github.com/ipfs/go-cid"
 	"github.com/ipld/go-ipld-prime"
-	"github.com/ipld/go-ipld-prime/codec/dagcbor"
+	"github.com/ipld/go-ipld-prime/codec/cbor"
 	"github.com/ipld/go-ipld-prime/datamodel"

 	"github.com/ucan-wg/go-ucan/token"
@@ -25,6 +24,72 @@ var ErrMultipleInvocations = fmt.Errorf("multiple invocations")

 // Reader is a token container reader. It exposes the tokens conveniently decoded.
 type Reader map[cid.Cid]token.Token

+// FromBytes decodes a container from a []byte.
+func FromBytes(data []byte) (Reader, error) {
+	return FromReader(bytes.NewReader(data))
+}
+
+// FromString decodes a container from a string.
+func FromString(s string) (Reader, error) {
+	return FromReader(strings.NewReader(s))
+}
+
+// FromReader decodes a container from an io.Reader.
+func FromReader(r io.Reader) (Reader, error) {
+	payload, err := decodePayload(r)
+	if err != nil {
+		return nil, err
+	}
+
+	n, err := ipld.DecodeStreaming(payload, cbor.Decode)
+	if err != nil {
+		return nil, err
+	}
+	if n.Kind() != datamodel.Kind_Map {
+		return nil, fmt.Errorf("invalid container format: expected map")
+	}
+	if n.Length() != 1 {
+		return nil, fmt.Errorf("invalid container format: expected single version key")
+	}
+
+	// get the first (and only) key-value pair
+	it := n.MapIterator()
+	key, tokensNode, err := it.Next()
+	if err != nil {
+		return nil, err
+	}
+
+	version, err := key.AsString()
+	if err != nil {
+		return nil, fmt.Errorf("invalid container format: version must be string")
+	}
+	if version != containerVersionTag {
+		return nil, fmt.Errorf("unsupported container version: %s", version)
+	}
+
+	if tokensNode.Kind() != datamodel.Kind_List {
+		return nil, fmt.Errorf("invalid container format: tokens must be a list")
+	}
+
+	ctn := make(Reader, tokensNode.Length())
+	it2 := tokensNode.ListIterator()
+	for !it2.Done() {
+		_, val, err := it2.Next()
+		if err != nil {
+			return nil, err
+		}
+		data, err := val.AsBytes()
+		if err != nil {
+			return nil, err
+		}
+		err = ctn.addToken(data)
+		if err != nil {
+			return nil, err
+		}
+	}
+	return ctn, nil
+}
+
 // GetToken returns an arbitrary decoded token, from its CID.
 // If not found, ErrNotFound is returned.
 func (ctn Reader) GetToken(cid cid.Cid) (token.Token, error) {
@@ -65,7 +130,7 @@ func (ctn Reader) GetAllDelegations() iter.Seq2[cid.Cid, *delegation.Token] {

 // GetInvocation returns a single invocation.Token.
 // If none are found, ErrNotFound is returned.
-// If more than one invocation exist, ErrMultipleInvocations is returned.
+// If more than one invocation exists, ErrMultipleInvocations is returned.
 func (ctn Reader) GetInvocation() (*invocation.Token, error) {
 	var res *invocation.Token
 	for _, t := range ctn {
@@ -95,110 +160,6 @@ func (ctn Reader) GetAllInvocations() iter.Seq2[cid.Cid, *invocation.Token] {
 	}
 }

-// FromCbor decodes a DAG-CBOR encoded container.
-func FromCbor(data []byte) (Reader, error) {
-	return FromCborReader(bytes.NewReader(data))
-}
-
-// FromCborReader is the same as FromCbor, but with an io.Reader.
-func FromCborReader(r io.Reader) (Reader, error) {
-	n, err := ipld.DecodeStreaming(r, dagcbor.Decode)
-	if err != nil {
-		return nil, err
-	}
-	if n.Kind() != datamodel.Kind_Map {
-		return nil, fmt.Errorf("invalid container format: expected map")
-	}
-	if n.Length() != 1 {
-		return nil, fmt.Errorf("invalid container format: expected single version key")
-	}
-
-	// get the first (and only) key-value pair
-	it := n.MapIterator()
-	key, tokensNode, err := it.Next()
-	if err != nil {
-		return nil, err
-	}
-
-	version, err := key.AsString()
-	if err != nil {
-		return nil, fmt.Errorf("invalid container format: version must be string")
-	}
-	if version != currentContainerVersion {
-		return nil, fmt.Errorf("unsupported container version: %s", version)
-	}
-
-	if tokensNode.Kind() != datamodel.Kind_List {
-		return nil, fmt.Errorf("invalid container format: tokens must be a list")
-	}
-
-	ctn := make(Reader, tokensNode.Length())
-	it2 := tokensNode.ListIterator()
-	for !it2.Done() {
-		_, val, err := it2.Next()
-		if err != nil {
-			return nil, err
-		}
-		data, err := val.AsBytes()
-		if err != nil {
-			return nil, err
-		}
-		err = ctn.addToken(data)
-		if err != nil {
-			return nil, err
-		}
-	}
-	return ctn, nil
-}
-
-// FromCborBase64 decodes a base64 DAG-CBOR encoded container.
-func FromCborBase64(data string) (Reader, error) {
-	return FromCborBase64Reader(strings.NewReader(data))
-}
-
-// FromCborBase64Reader is the same as FromCborBase64, but with an io.Reader.
-func FromCborBase64Reader(r io.Reader) (Reader, error) {
-	return FromCborReader(base64.NewDecoder(base64.StdEncoding, r))
-}
-
-// FromCar decodes a CAR file encoded container.
-func FromCar(data []byte) (Reader, error) {
-	return FromCarReader(bytes.NewReader(data))
-}
-
-// FromCarReader is the same as FromCar, but with an io.Reader.
-func FromCarReader(r io.Reader) (Reader, error) {
-	_, it, err := readCar(r)
-	if err != nil {
-		return nil, err
-	}
-
-	ctn := make(Reader)
-
-	for block, err := range it {
-		if err != nil {
-			return nil, err
-		}
-
-		err = ctn.addToken(block.data)
-		if err != nil {
-			return nil, err
-		}
-	}
-
-	return ctn, nil
-}
-
-// FromCarBase64 decodes a base64 CAR file encoded container.
-func FromCarBase64(data string) (Reader, error) {
-	return FromCarReader(strings.NewReader(data))
-}
-
-// FromCarBase64Reader is the same as FromCarBase64, but with an io.Reader.
-func FromCarBase64Reader(r io.Reader) (Reader, error) {
-	return FromCarReader(base64.NewDecoder(base64.StdEncoding, r))
-}
-
 func (ctn Reader) addToken(data []byte) error {
 	tkn, c, err := token.FromSealed(data)
 	if err != nil {
@@ -23,13 +23,21 @@ import (
 func TestContainerRoundTrip(t *testing.T) {
 	for _, tc := range []struct {
 		name           string
-		writer         func(ctn Writer, w io.Writer) error
-		reader         func(io.Reader) (Reader, error)
+		expectedHeader header
+		writer         any
 	}{
-		{"car", Writer.ToCarWriter, FromCarReader},
-		{"carBase64", Writer.ToCarBase64Writer, FromCarBase64Reader},
-		{"cbor", Writer.ToCborWriter, FromCborReader},
-		{"cborBase64", Writer.ToCborBase64Writer, FromCborBase64Reader},
+		{"Bytes", headerRawBytes, Writer.ToBytes},
+		{"BytesWriter", headerRawBytes, Writer.ToBytesWriter},
+		{"BytesGzipped", headerRawBytesGzip, Writer.ToBytesGzipped},
+		{"BytesGzippedWriter", headerRawBytesGzip, Writer.ToBytesGzippedWriter},
+		{"Base64StdPadding", headerBase64StdPadding, Writer.ToBase64StdPadding},
+		{"Base64StdPaddingWriter", headerBase64StdPadding, Writer.ToBase64StdPaddingWriter},
+		{"Base64StdPaddingGzipped", headerBase64StdPaddingGzip, Writer.ToBase64StdPaddingGzipped},
+		{"Base64StdPaddingGzippedWriter", headerBase64StdPaddingGzip, Writer.ToBase64StdPaddingGzippedWriter},
+		{"Base64URL", headerBase64URL, Writer.ToBase64URL},
+		{"Base64URLWriter", headerBase64URL, Writer.ToBase64URLWriter},
+		{"Base64URLGzip", headerBase64URLGzip, Writer.ToBase64URLGzip},
+		{"Base64URLGzipWriter", headerBase64URLGzip, Writer.ToBase64URLGzipWriter},
 	} {
 		t.Run(tc.name, func(t *testing.T) {
 			tokens := make(map[cid.Cid]*delegation.Token)
@@ -44,17 +52,49 @@ func TestContainerRoundTrip(t *testing.T) {
 				dataSize += len(data)
 			}

-			buf := bytes.NewBuffer(nil)
-			err := tc.writer(writer, buf)
-			require.NoError(t, err)
-
-			t.Logf("data size %d", dataSize)
-			t.Logf("container overhead: %d%%, %d bytes", int(float32(buf.Len()-dataSize)/float32(dataSize)*100.0), buf.Len()-dataSize)
-
-			reader, err := tc.reader(bytes.NewReader(buf.Bytes()))
-			require.NoError(t, err)
+			var reader Reader
+			var serialLen int
+
+			switch fn := tc.writer.(type) {
+			case func(ctn Writer, w io.Writer) error:
+				buf := bytes.NewBuffer(nil)
+				err := fn(writer, buf)
+				require.NoError(t, err)
+				serialLen = buf.Len()
+
+				h, err := buf.ReadByte()
+				require.NoError(t, err)
+				require.Equal(t, byte(tc.expectedHeader), h)
+				err = buf.UnreadByte()
+				require.NoError(t, err)
+
+				reader, err = FromReader(bytes.NewReader(buf.Bytes()))
+				require.NoError(t, err)
+
+			case func(ctn Writer) ([]byte, error):
+				b, err := fn(writer)
+				require.NoError(t, err)
+				serialLen = len(b)
+
+				require.Equal(t, byte(tc.expectedHeader), b[0])
+
+				reader, err = FromBytes(b)
+				require.NoError(t, err)
+
+			case func(ctn Writer) (string, error):
+				s, err := fn(writer)
+				require.NoError(t, err)
+				serialLen = len(s)
+
+				require.Equal(t, byte(tc.expectedHeader), s[0])
+
+				reader, err = FromString(s)
+				require.NoError(t, err)
+			}
+
+			t.Logf("data size %d, container size %d, overhead: %d%%, %d bytes",
+				dataSize, serialLen, int(float32(serialLen-dataSize)/float32(dataSize)*100.0), serialLen-dataSize)

 			for c, dlg := range tokens {
 				tknRead, err := reader.GetToken(c)
 				require.NoError(t, err)
@@ -98,10 +138,12 @@ func BenchmarkContainerSerialisation(b *testing.B) {
 		writer func(ctn Writer, w io.Writer) error
 		reader func(io.Reader) (Reader, error)
 	}{
-		{"car", Writer.ToCarWriter, FromCarReader},
-		{"carBase64", Writer.ToCarBase64Writer, FromCarBase64Reader},
-		{"cbor", Writer.ToCborWriter, FromCborReader},
-		{"cborBase64", Writer.ToCborBase64Writer, FromCborBase64Reader},
+		{"Bytes", Writer.ToBytesWriter, FromReader},
+		{"BytesGzipped", Writer.ToBytesGzippedWriter, FromReader},
+		{"Base64StdPadding", Writer.ToBase64StdPaddingWriter, FromReader},
+		{"Base64StdPaddingGzipped", Writer.ToBase64StdPaddingGzippedWriter, FromReader},
+		{"Base64URL", Writer.ToBase64URLWriter, FromReader},
+		{"Base64URLGzip", Writer.ToBase64URLGzipWriter, FromReader},
 	} {
 		writer := NewWriter()

@@ -184,7 +226,7 @@ func FuzzContainerRead(f *testing.F) {
 		_, c, data := randToken()
 		writer.AddSealed(c, data)
 	}
-	data, err := writer.ToCbor()
+	data, err := writer.ToBytes()
 	require.NoError(f, err)

 	f.Add(data)
@@ -194,7 +236,7 @@ func FuzzContainerRead(f *testing.F) {
 	start := time.Now()

 	// search for panics
-	_, _ = FromCbor(data)
+	_, _ = FromBytes(data)

 	if time.Since(start) > 100*time.Millisecond {
 		panic("too long")
1	pkg/container/testvectors/Base64StdPadding	Normal file
File diff suppressed because one or more lines are too long

1	pkg/container/testvectors/Base64StdPaddingGzipped	Normal file
@@ -0,0 +1 @@
OH4sIAAAAAAAA/5zX+ZOUxR3HcY7FwhKToFJGFAU1ghzL8/RztpxzrzsHszuzcy1Ru5/unmd6dubZnXsGE9EtFVnFSCiNEFfFiCSKqIVI8ECBKGiJiMSTFdCYeGDK6GpMWcSUtUOq8ts++Qfev7zq+6nu+5lRys+riKuTYwdvSCw9NrDty6c/Wf668n73nFsqQ988uh1Z8rTNu9X6hkO33S/vG1i92+//ontN6/BubU54fntiUnXqrPNfo9p5bnTWd3PP3IhMt/z52L5i2UD5+aQnvVRsFVqFeQWjVdxsoDKp6SRDrsjS+hUNNZjNtFdJsa1CzUwCxSK8imjO7curtFHwBqy8US3hlBT08XQlkWeGkSPmfGZZ8zEqGLTWOzX9yyemGZli8X+b1XAxESdxALIVX6E3xkiY1coeswLDMOp1BVRgOrGnqxEtukoJo9fqub7fQD09Rmv3z/vRYtZaQT1lOs4olvH/nyU5WkIPmEAQkAYg5lhDqigxjeimJoqEMgC4JipMViURiCYTBQHLBHGZCkyRNcgYzVt5gwauDa67+uiirx+KuQZnjgB9/2H3noUzrj79rQdv27HpuSufnOrfvu3Myf98rfv5Tfds2OX9de9w92Mfm/m9s2eOe/etzPnd/1jdeqI8vvuVbYOb8Q2Ltky0CZRDlAfcJKqHGSklXclQxR+mPKbgcM5n5awkRYmwM2SltSQqjRYo3RFD/lDOnW/TYrxd7ayVQaQme10dVl3RPCardrQHkFJUvTUT2QCyk20CaQpUFF1CnFJJpDqVoGYSDKhuMJVDlaoqpIIMTIZVhcoMcCYAnSgC1GkT6BF9x2d9X19nTBw/+cgI0KO+/W+sk4cmDz7/7AM7D7s2+BdIFw0vGLv9yzm7Vv3h1QN/WZHwxy9Qj0evv2ftkX891BLy7V2cWjY+sGV9f+zYHh4eYxOoXE8E3BFPNB+q0Kw7XexLpfXeXiPrd5WC3N0Wj7iBFfeQMqi06aMFKtcMf9wRrbN4EKOa7KxYfRkQ0/3ecJ/LV/dnkl2OgEPSvIViNmQDyE62CSRDgWEBilwECpWwADVsKkQnhgQZ13QZMVWVITGRCLCmI8ANCWuiSiSqNIF+ExuqnnSds6b7zv4XRoDWzN08c6H1GByE86wXj+561L/q9pc+G1jy4/ya71uPb3kXLJmwe/jlHFu3+NhFK2/8Sce4sZtiLQezq/uVBx6OXnv3n0+zC+T1F5MuENTKFW8u0ld3GcVQNl0gZodSiCk01UkMV9SVauRNXR0tEOtkic5YKtXlDgKt4VFKuhWoGiku417g6wuUimZ7DbTJDAGv2waQnWwTCIka0XVJ4JCKVJGQLGqmIamaYaiQKwpVIFQMgZgUCQLSicqxTKCuyxLATaD4JZf/4mhvbf19y15uGwFyrodHo9vOP7kzObhklvuTfQ/j3IQLn5nz6dKNn0xo7H/nkQs+HT70VvLwmWzvm3c/Ofn7g3f91jvkb7/5vpWHHl8EX3/B7sSxiq9R5bkIsAJKJurhPj9z1GgDh3gy18ViyOP2sa5oUso4C9aoJ84fq/fKKm3LtRe8gWTe568V8z7sMCU1FnA7gJnOhXq9Lk+XaHTamTgb2SaQQIFEEaJcgLIqQEEQgaloGsKIMs5ESRYFQVYlU2UEEVnEXCUSA0zTBLUJJM/nb351ecu/h6dLlREg67y3b3be96Ky/p7L7jzeO9s5Y9Ka5fNxZHc8cVC6+K83iXPfO/GjL5ybrbZrWq4QPu64rfu7adMP7v/uyt/f0r7yJjBpkk2gvgQPOTusdDGqhOuKNxROw0ba29lFUjCbqNV1MSDWZZxgViRkjBbI8jUa1aKl0Y42qz3sTVt11dQSeo5WqsGKD1NY0sN6QS8aUhXaALKTPQWEFU0nKuMi0aGCDayoJjIwZjrGHMoEAUI1QTINxUAShCo3DCwTWdOZ0QRaAhdf1yIf7unm5atGgObsH8rs5FO2wH1r/zb84c8WrNL
L179/za+G51WvkvaKXjguP3XmXWetHTix59zd32Te9h2ftGTbhU/dv3PvjcvAlBnjzrAJVOjMUKhKznS4jZR10ZWMVz1JZ5zWHbmuSqnglzpSMqPVuiPgkkcL1Butd1DQVQmGgByp0S5Xo5SLKXlfvD1QDolQZ2oxXIwFk42AUrEBZCfbBBJlSZFFReIIQZlqhgR1U0MSlilROBAMTACBAjYZMyjUich1w1B+2Dd4auIOnLPCY3wFNzxIc+tGgKqX5an27Qtn37C2d+ykS7XPnz30TSi9onDd2eeMK7N4/6x78XLQMnnlm6/mFlbIeRee9qWybfCjBceSi6YVdg2s7ZxgE6ga7sp29QGTJ+LOWAooJcsXJ9loQgvFmZOXOgNt7nox1jBRyAVGC8SNbK0zCyKmrDsDRrrhShLqqGQ1DKJyJVMvknY/jNV52hQbThtAdrJNIIgVSdBljUuCwCSgMxGbBhENJBLIBWYgAzCsAZNAwdAFXeZMVVX4wywKTaBZTxz9E9y6b0JPgu8YAZo9A258I6l75KfJM2Omb+T3zjDPTW393SUXnHjswB3DY05a7wQ7ux9fXjtjx4QxS6fPfmns3598sWf9lkvB83sOXrxytt1XXK5IHCHk1Tx6sc0fTPdlCjTBYmWHHNH8XR2s3khVjKqeFD085R71BeXrdVytWKqSqCaDMiehTm/Q4jCVrUcdrBKTOfR4vEGto01N2bkgG9lTF6RBwFQicAqJALGiaropCwggCWMuAyJThAxETUIA0CVV57qkUEVlOtSbQGMGJuaOTFzyx/FHjj81AvTqwIExd3vB9FuXrdHmrp7106G33/vgo9tP3zjF8t5ZOLz52q2bjt1x6+yTH6zQbmGpiasOJ5Xw+oU7n3v8YMuH77Vs/bbF7kcVmjBmuhuhVMyVUNWiSXg9XMs2yvUkpCnma8BKxlK0eqXT8o0WqBgpmI5YKuLvyLiUeqTsp42S6qiasUCjlFG9Xk1pr3ujpN0BLWoDyE62CSRpMtBFTeaipKqYyZJKTEwURGWkcVHAWDcY0TSTYipjjFUu6YL6w2P8v0AnP45M2du/fWgHeOWD/wQAAP//IuQ6Y1MQAAA=

1	pkg/container/testvectors/Base64URL	Normal file
File diff suppressed because one or more lines are too long

1	pkg/container/testvectors/Base64URLGzip	Normal file
@@ -0,0 +1 @@
PH4sIAAAAAAAA_5zWa5cUxR3HcVcRiejmuCII3siKwgnubnd1dVW1eIS5rzM9w8zs3HZETFd19_T07PTcZ2cH1KMgoB41YgIo3nM0G1GCRhCRiILIEaIhIEo0Bi9c9ewaFfEazYMd8iCPdvIGvk8-p_71e1RnZaujyt_e2_LQLYk5Fbb1QefFR9-_5LzWO59-9PfWoWV_nN_f177z06eH31j13Je8-Wpk35bTccKjtV_22MRfnz9DufuSg8Ntn02KzZtw83sLxj-mGE441FIoVZhidal9qTl8J9fJdRRZJz_IlIpaI2pavTKjDVxZR_5MQXa4aFIFgr-gMCrrOTWUqWNajNZERzin0GLRz7IxsTcfAgnGsqrRpedyXVQpMq2Wn5K68dmLWLpU-p9msRbQ-usuHEsGorZK0BbwxWzxvCdrhDK2vL23UMCRTM0RlStVxPK5vpsXMaWvj3Vee90i5Wq9s6r0VbRTWalC__-smtXKyu8MERNOwJSZkCACeAFgzkC6xGkYAhOLqkJEiBXJkJAuCUBUTMIA4KEgAEWzchbT5DXHfxwc7LiqNnPmRwdHgBa8NO6jDQ9uvW6wa7DHYetfP3DxO4s7JhybLa1YcerUV-Yae7aDww_P2_zjvH9euWqOOW3yjBWtO2__bKzLvfnR_enEpmaBMtFuy4qEsgrlqRMpPm86mql6apjI4V63V_bFgl5WT-USpQhDowVK2amiFmM8tEUsiRJcCeowHM9AvVQreLGE83LMkcBBI99dAE0ANZNtAGFewJjwkilBJPG8TphoKEQhmihKpsgBXuIVhXEGRRQpRGAmhzkRqIpOaQNo4afuoTWTj60cfjw5fQQoxy98ZPGmF3YsWLJo8voJ587Y9-X38geyLBq_nHXZIy9fPun5m1pebqVzv7J-NuaM08XEzYl3v3kmtfuCr2Yc-VAa3vtxS7NAnpKSKsBEzVaMphWdFB0kgNSEgHpwiPe7a0qaxYtVG_SVq_7RAhkey-asYmfUZ1XKMbGSr-Vo1UZ60rZyNV6Uy8l8MY_iES1ic0abAGom2wCCEqcLVFRNpEKBaghAaCBJUYgKOVNVGMQCUyXRQBqFTJWwKQkIIoZFQW8AvZV64bRDez_6flYbbmu8oDa_euu2yJB9w1vX3uO-YrZvobj2yK07V7-1YebjrRcgbsvS4fiyJ9_gpx4KvT5n_A7PTz-cqF_asuuaoTFnn2N-bbU2CaSHi0bJlrEi3kyh6NFrgqOu59VKOJ2t9Xj4sllT4wXqJdFQfzI2aiBvIGSG_Zls2IsDSdKblRV3QlQz2X7Z2evudVBk16QIM1yiCZsBaiLbAOIEnROQppu8AjgKiKqIhgQVwDQCTCKJkqCIBIqGpoicCilvcoxDWJewcPLE3TD0fHzb0y_BrtRd2gjQN3fd--36ju_CYzuV8adcsOeA95UvVv-09N1t05_bN7DllouWTN-18r4nTm-9Y8uZ2L02fj1XLizHlzyfrZ-x65Pt503b0uyJs0oDhpWuSh4vwFpJjlcDciKRCbtQTzmvxkWrKIWCUQm51FomO1ogKyMLyXSPx0xHxKrixwFT7a9XgzUUTuedXgDi8XKmgGJmJOwMNQHUTLYBhERGkE6YCTXEYYAYxxuE5xGnCbopAkWjAtCgbhCMKAUQmQqkvA4wJaQBpNyhPfDyK9Ijs-yOv40ArTv_hoMtSbB12NjtzD1Ef35ow47k_D3frn41f-2LX6YmdTxzzba3t180_7XFV5m-fz304tj2Td-t7Nx_oblszv7ctI2Hmz1xhayl1myOkhEOxUt2S_S4shFvMI7dIbseyPa4AlalJ8q6g2KhGBktUJGVfQEaSsXtsK7EnXnC6zIP84ZHrYUyGR9vcxXj9lwuaIvG7E0ANZM9-QcBHkNCNFMFEKhYFTRiSBoUOV0RTE3DKmWMCJKhijzgeJWakgoIwyLUuAZQSlq-Y177mBOSHt49AjRF-PBAe9d1H3ffcvyZczr
HLAhP9X3jC5y797YFa-mNxYl_evDqo4fu_8NZh3d88un1R1f_5cfBf8_76p3tv_psyooll_k-nj2mWaC6b8BCmVrCytYjIFZKF1wytmUFKdijVuoCLIdrBV8dk3AxbIwWqJRAkpyRMnkY9ReLcipX8PR3u6RcrheykFINGCCc7-lJ25LMpjUB1Ez25IljDHFYpCYUiUYR4yVkUMKYIEiSiTDBSFcRgAajPAOU6ibjmShBToOwAXRf-6V31zceuXHdjHV9I0D7Jn7wgXzr4TVX_-PYw_mpLatO69_qW7R1Q-m8o9OcsGP2rM_f3lM_3n0_-aL18zWHV-eHzkik9r4_LriMt16a98Tm889sdiTASDqRwFLW3evj6_mU6i96cqInGEY4Ea0F1brlsPUXJSlCrcpogfRCzGll1YKRsqXrWEuVajq21wK5buKv5FxOV7wQy4Wz5Wwg3NRIaCbbABKopoi6BkymSSrTRCpSQ8UqEEQBmkTTBQp0VcSGBgAvYaiZCgQMUiz89wXdVtm7cNtjqvLmb5ZtHAHq2vfX9Uem2q-ZXTzw3oub0ruemj52f29hwuWeZ2_6_qylidcO5d9p3310-_wptTV3Dv1WXZzbuLsNDU2-dzDy7Sf84MxxTQKZAxWi5wO0Agci0WTOnStHklUpbPaHSi7JcsV9UbvH6_IHzHTVNeqRwLtlj6rV_T3ZehxGex2KLA3EXX4XhNmaGZdj-VjCRgy37HA4mhkJTWQbQBImhOkcbyIeQA1oiOcNBVBEdBWYlOc0IgBAREPnBYoABSavUAI0RhFuAM22T_nzs8NTTyz94YFVI0Bze_9-4utz3dvfW5B7fXL7uJWvr-1rO-uG5evusY_jlk-ynvzheOfE5IFIt4ieunAfO3bK-CWuX2w-0HbQP_fDi-8Y2NPsitNYLCfr2YLGEhrwg7LNoL3ZWKLSAzwDxNvNV_OA1L0oFVdToz5xuURswBupyoo7Ggua8Vg8kIjV5TLvcyCUDJsV7PXKuglTehn5mwBqJnvyBSEd6qpKTV5HCEOeU4mBCJSwpImmIkBd4BRGoUEUHUlMQ6amQVVVJB2CBpBwxRsH-9-cvnYnOfvd_wQAAP__hMJ751MQAAA

BIN	pkg/container/testvectors/Bytes	Normal file
Binary file not shown.

BIN	pkg/container/testvectors/BytesGzipped	Normal file
Binary file not shown.
@@ -2,12 +2,11 @@ package container
 
 import (
 	"bytes"
-	"encoding/base64"
 	"io"
 
 	"github.com/ipfs/go-cid"
 	"github.com/ipld/go-ipld-prime"
-	"github.com/ipld/go-ipld-prime/codec/dagcbor"
+	"github.com/ipld/go-ipld-prime/codec/cbor"
 	"github.com/ipld/go-ipld-prime/datamodel"
 	"github.com/ipld/go-ipld-prime/fluent/qp"
 	"github.com/ipld/go-ipld-prime/node/basicnode"
@@ -25,22 +24,92 @@ func (ctn Writer) AddSealed(cid cid.Cid, data []byte) {
 	ctn[cid] = data
 }
 
-const currentContainerVersion = "ctn-v1"
-
-// ToCbor encode the container into a DAG-CBOR binary format.
-func (ctn Writer) ToCbor() ([]byte, error) {
+// ToBytes encode the container into raw bytes.
+func (ctn Writer) ToBytes() ([]byte, error) {
+	return ctn.toBytes(headerRawBytes)
+}
+
+// ToBytesWriter is the same as ToBytes, but with an io.Writer.
+func (ctn Writer) ToBytesWriter(w io.Writer) error {
+	return ctn.toWriter(headerRawBytes, w)
+}
+
+// ToBytesGzipped encode the container into gzipped bytes.
+func (ctn Writer) ToBytesGzipped() ([]byte, error) {
+	return ctn.toBytes(headerRawBytesGzip)
+}
+
+// ToBytesGzippedWriter is the same as ToBytesGzipped, but with an io.Writer.
+func (ctn Writer) ToBytesGzippedWriter(w io.Writer) error {
+	return ctn.toWriter(headerRawBytesGzip, w)
+}
+
+// ToBase64StdPadding encode the container into a base64 string, with standard encoding and padding.
+func (ctn Writer) ToBase64StdPadding() (string, error) {
+	return ctn.toString(headerBase64StdPadding)
+}
+
+// ToBase64StdPaddingWriter is the same as ToBase64StdPadding, but with an io.Writer.
+func (ctn Writer) ToBase64StdPaddingWriter(w io.Writer) error {
+	return ctn.toWriter(headerBase64StdPadding, w)
+}
+
+// ToBase64StdPaddingGzipped encode the container into a pre-gzipped base64 string, with standard encoding and padding.
+func (ctn Writer) ToBase64StdPaddingGzipped() (string, error) {
+	return ctn.toString(headerBase64StdPaddingGzip)
+}
+
+// ToBase64StdPaddingGzippedWriter is the same as ToBase64StdPaddingGzipped, but with an io.Writer.
+func (ctn Writer) ToBase64StdPaddingGzippedWriter(w io.Writer) error {
+	return ctn.toWriter(headerBase64StdPaddingGzip, w)
+}
+
+// ToBase64URL encode the container into a base64 string, with URL-safe encoding and no padding.
+func (ctn Writer) ToBase64URL() (string, error) {
+	return ctn.toString(headerBase64URL)
+}
+
+// ToBase64URLWriter is the same as ToBase64URL, but with an io.Writer.
+func (ctn Writer) ToBase64URLWriter(w io.Writer) error {
+	return ctn.toWriter(headerBase64URL, w)
+}
+
+// ToBase64URLGzip encode the container into a pre-gzipped base64 string, with URL-safe encoding and no padding.
+func (ctn Writer) ToBase64URLGzip() (string, error) {
+	return ctn.toString(headerBase64URLGzip)
+}
+
+// ToBase64URLGzipWriter is the same as ToBase64URLGzip, but with an io.Writer.
+func (ctn Writer) ToBase64URLGzipWriter(w io.Writer) error {
+	return ctn.toWriter(headerBase64URLGzip, w)
+}
+
+func (ctn Writer) toBytes(header header) ([]byte, error) {
 	var buf bytes.Buffer
-	err := ctn.ToCborWriter(&buf)
+	err := ctn.toWriter(header, &buf)
 	if err != nil {
 		return nil, err
 	}
 	return buf.Bytes(), nil
 }
 
-// ToCborWriter is the same as ToCbor, but with an io.Writer.
-func (ctn Writer) ToCborWriter(w io.Writer) error {
+func (ctn Writer) toString(header header) (string, error) {
+	var buf bytes.Buffer
+	err := ctn.toWriter(header, &buf)
+	if err != nil {
+		return "", err
+	}
+	return buf.String(), nil
+}
+
+func (ctn Writer) toWriter(header header, w io.Writer) (err error) {
+	encoder := header.encoder(w)
+
+	defer func() {
+		err = encoder.Close()
+	}()
 	node, err := qp.BuildMap(basicnode.Prototype.Any, 1, func(ma datamodel.MapAssembler) {
-		qp.MapEntry(ma, currentContainerVersion, qp.List(int64(len(ctn)), func(la datamodel.ListAssembler) {
+		qp.MapEntry(ma, containerVersionTag, qp.List(int64(len(ctn)), func(la datamodel.ListAssembler) {
 			for _, data := range ctn {
 				qp.ListEntry(la, qp.Bytes(data))
 			}
@@ -49,60 +118,6 @@ func (ctn Writer) ToCborWriter(w io.Writer) error {
 	if err != nil {
 		return err
 	}
-	return ipld.EncodeStreaming(w, node, dagcbor.Encode)
-}
-
-// ToCborBase64 encode the container into a base64 encoded DAG-CBOR binary format.
-func (ctn Writer) ToCborBase64() (string, error) {
-	var buf bytes.Buffer
-	err := ctn.ToCborBase64Writer(&buf)
-	if err != nil {
-		return "", err
-	}
-	return buf.String(), nil
-}
-
-// ToCborBase64Writer is the same as ToCborBase64, but with an io.Writer.
-func (ctn Writer) ToCborBase64Writer(w io.Writer) error {
-	w2 := base64.NewEncoder(base64.StdEncoding, w)
-	defer w2.Close()
-	return ctn.ToCborWriter(w2)
-}
-
-// ToCar encode the container into a CAR file.
-func (ctn Writer) ToCar() ([]byte, error) {
-	var buf bytes.Buffer
-	err := ctn.ToCarWriter(&buf)
-	if err != nil {
-		return nil, err
-	}
-	return buf.Bytes(), nil
-}
-
-// ToCarWriter is the same as ToCar, but with an io.Writer.
-func (ctn Writer) ToCarWriter(w io.Writer) error {
-	return writeCar(w, nil, func(yield func(carBlock, error) bool) {
-		for c, data := range ctn {
-			if !yield(carBlock{c: c, data: data}, nil) {
-				return
-			}
-		}
-	})
-}
-
-// ToCarBase64 encode the container into a base64 encoded CAR file.
-func (ctn Writer) ToCarBase64() (string, error) {
-	var buf bytes.Buffer
-	err := ctn.ToCarBase64Writer(&buf)
-	if err != nil {
-		return "", err
-	}
-	return buf.String(), nil
-}
-
-// ToCarBase64Writer is the same as ToCarBase64, but with an io.Writer.
-func (ctn Writer) ToCarBase64Writer(w io.Writer) error {
-	w2 := base64.NewEncoder(base64.StdEncoding, w)
-	defer w2.Close()
-	return ctn.ToCarWriter(w2)
-}
+	return ipld.EncodeStreaming(encoder, node, cbor.Encode)
 }