tiny-decoders

Type-safe data decoding and encoding for the minimalist.

It’s similar to zod/mini, but simpler. tiny-decoders is a single file, around a thousand lines of TypeScript, with rather trivial functions and a few kinda nuts TypeScript types. Unlike Zod, there are no date parsing or email validation functions or anything like that. Just decoders and encoders for all basic TypeScript types.

The problem

A typical use case for tiny-decoders is making sure that JSON you read from somewhere actually looks like you expect. Consider this example:

const user = JSON.parse(someData) as User;

if (user.permissions.launchRockets) {
  launchRockets();
}

as User makes TypeScript take your word for it that the JSON actually matches your User type, which can lead to problems:

  • That if statement could crash with TypeError: Cannot read properties of undefined (reading 'launchRockets') if user turned out not to have .permissions. Maybe it’s called something else. Maybe it’s missing if the user has zero permissions. With tiny-decoders, you’d be alerted while parsing the JSON (where the actual problem is) instead of at a random spot in your code.
  • That if statement could wrongly run if it turns out that .launchRockets is an object (instead of a boolean), and you were actually supposed to do if (user.permissions.launchRockets.granted). Oops! Launched some extra rockets there.

Here’s what the code could look like with tiny-decoders (except User would be much longer in reality):

import { boolean, fields, format, type Infer, JSON } from "tiny-decoders";

const User = fields({ permissions: fields({ launchRockets: boolean }) });
type User = Infer<typeof User>;

const userResult = JSON.parse(User, someData);

if (userResult.tag === "DecoderError") {
  throw new Error(`Failed to decode user:\n${format(userResult.error)}`);
}

const user = userResult.value;

if (user.permissions.launchRockets) {
  launchRockets();
}

And here’s an error message you could get:

Failed to decode user:
At root["permissions"]["launchRockets"]:
Expected a boolean
Got: {
  "granted": false
}

Installation

npm install tiny-decoders

👉 Codecs summary

TypeScript requirements

tiny-decoders requires TypeScript 5+ (because it uses const type parameters).

It is recommended to enable strict type-checking options in tsconfig.json, in particular exactOptionalPropertyTypes (see the warning in the field section below).
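
For example, a minimal sketch (strict is a general baseline assumption here; only exactOptionalPropertyTypes is specifically called out by tiny-decoders):

{
  "compilerOptions": {
    "strict": true,
    "exactOptionalPropertyTypes": true
  }
}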

Note that it is possible to use tiny-decoders in plain JavaScript without type checking as well.

Example

import {
  array,
  boolean,
  type DecoderResult,
  field,
  fields,
  format,
  type Infer,
  number,
  string,
} from "tiny-decoders";

// You can also import into a namespace if you want (conventionally called `Codec`):
import * as Codec from "tiny-decoders";

const userCodec = fields({
  name: string,
  active: field(boolean, { renameFrom: "is_active" }),
  age: field(number, { optional: true }),
  interests: array(string),
});

type User = Infer<typeof userCodec>;
// equivalent to:
type User = {
  name: string;
  active: boolean;
  age?: number;
  interests: Array<string>;
};
// Note how the type syntax looks pretty similar to the codec! That’s on purpose!

const payload: unknown = getSomeJSON();

const userResult: DecoderResult<User> = userCodec.decoder(payload);

switch (userResult.tag) {
  case "DecoderError":
    console.error(format(userResult.error));
    break;

  case "Valid":
    console.log(userResult.value);
    break;
}

Here’s an example error message:

At root["age"]:
Expected a number
Got: "30"

Codec<T> and DecoderResult<T>

type Codec<Decoded, Encoded = unknown> = {
  decoder: (value: unknown) => DecoderResult<Decoded>;
  encoder: (value: Decoded) => Encoded;
};

type DecoderResult<Decoded> =
  | { tag: "DecoderError"; error: DecoderError }
  | { tag: "Valid"; value: Decoded };

A codec is an object with a decoder and an encoder.

A decoder is a function that:

  • Takes an unknown value and refines it to any type you want (Decoded).
  • Returns a DecoderResult: Either that refined Decoded or a DecoderError.

An encoder is a function that turns Decoded back into what the input looked like. You can think of it as “turning Decoded back into unknown”, but usually the Encoded type variable is inferred to something more precise.
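
To make the shape concrete, here is a sketch of a hand-written codec (roughly what the built-in string codec does; normally you would just use the built-in codecs and combinators):

import { type Codec } from "tiny-decoders";

// A hand-written codec for strings (illustration only).
const myString: Codec<string, string> = {
  decoder: (value) =>
    typeof value === "string"
      ? { tag: "Valid", value }
      : { tag: "DecoderError", error: { tag: "string", got: value, path: [] } },
  encoder: (value) => value,
};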

That’s it!

tiny-decoders ships with a bunch of codecs, and a few functions to combine codecs. This way you can describe the shape of any data!

tiny-decoders used to only have decoders, and not encoders. That’s why it’s called tiny-decoders and not tiny-codecs. Decoders are still the most interesting part.

Codecs

Here’s a summary of all codecs (with slightly simplified type annotations) and related functions.

unknown
  Type: Codec<unknown>
  JSON: any
  TypeScript: unknown

boolean
  Type: Codec<boolean>
  JSON: boolean
  TypeScript: boolean

number
  Type: Codec<number>
  JSON: number
  TypeScript: number

bigint
  Type: Codec<bigint>
  JSON: n/a
  TypeScript: bigint

string
  Type: Codec<string>
  JSON: string
  TypeScript: string

primitiveUnion
  Type:
    (variants: ["string1", "string2", "stringN", 1, 2, true]) =>
      Codec<"string1" | "string2" | "stringN" | 1 | 2 | true>
  JSON: string, number, boolean, null
  TypeScript: "string1" | "string2" | "stringN" | 1 | 2 | true

array
  Type: (codec: Codec<T>) => Codec<Array<T>>
  JSON: array
  TypeScript: Array<T>

record
  Type: (codec: Codec<T>) => Codec<Record<string, T>>
  JSON: object
  TypeScript: Record<string, T>

fields
  Type:
    (mapping: {
      field1: Codec<T1>,
      field2: Field<T2, { optional: true }>,
      field3: Field<T3, { renameFrom: "field_3" }>,
      fieldN: Codec<TN>
    }) => Codec<{
      field1: T1,
      field2?: T2,
      field3: T3,
      fieldN: TN
    }>
  JSON:
    { "field1": ..., "field2": ..., "field_3": ..., "fieldN": ... }
    or (when the optional field is missing):
    { "field1": ..., "field_3": ..., "fieldN": ... }
  TypeScript:
    { field1: T1, field2?: T2, field3: T3, fieldN: TN }

field
  Type: (codec: Codec<Decoded>, meta: Meta) => Field<Decoded, Meta>
  JSON: n/a
  TypeScript: n/a, only used with fields

taggedUnion
  Type:
    (
      decodedCommonField: string,
      variants: Array<Parameters<typeof fields>[0]>,
    ) => Codec<T1 | T2 | TN>
  JSON: object
  TypeScript: T1 | T2 | TN

tag
  Type: (decoded: "string literal", options?: Options) => Field<"string literal", Meta>
  JSON: string
  TypeScript: "string literal"

tuple
  Type: (codecs: [Codec<T1>, Codec<T2>, Codec<TN>]) => Codec<[T1, T2, TN]>
  JSON: array
  TypeScript: [T1, T2, TN]

multi
  Type:
    (types: ["type1", "type2", "type10"]) =>
      Codec<
        | { type: "type1", value: type1 }
        | { type: "type2", value: type2 }
        | { type: "type10", value: type10 }
      >
  JSON: you decide
  TypeScript:
    A subset of:
      | { type: "undefined"; value: undefined }
      | { type: "null"; value: null }
      | { type: "boolean"; value: boolean }
      | { type: "number"; value: number }
      | { type: "bigint"; value: bigint }
      | { type: "string"; value: string }
      | { type: "symbol"; value: symbol }
      | { type: "function"; value: Function }
      | { type: "array"; value: Array }
      | { type: "object"; value: Record }

recursive
  Type: (callback: () => Codec<T>) => Codec<T>
  JSON: n/a
  TypeScript: T

undefinedOr
  Type: (codec: Codec<T>) => Codec<T | undefined>
  JSON: undefined or …
  TypeScript: T | undefined

nullOr
  Type: (codec: Codec<T>) => Codec<T | null>
  JSON: null or …
  TypeScript: T | null

map
  Type:
    (
      codec: Codec<T>,
      transform: {
        decoder: (value: T) => U;
        encoder: (value: U) => T;
      },
    ) => Codec<U>
  JSON: n/a
  TypeScript: U

flatMap
  Type:
    (
      codec: Codec<T>,
      transform: {
        decoder: (value: T) => DecoderResult<U>;
        encoder: (value: U) => T;
      },
    ) => Codec<U>
  JSON: n/a
  TypeScript: U

unknown

const unknown: Codec<unknown>;

Codec for any JSON value, and a TypeScript unknown. Basically, both the decoder and encoder are identity functions.

boolean

const boolean: Codec<boolean, boolean>;

Codec for a JSON boolean, and a TypeScript boolean.

number

const number: Codec<number, number>;

Codec for a JSON number, and a TypeScript number.

bigint

const bigint: Codec<bigint, bigint>;

Codec for a JavaScript bigint, and a TypeScript bigint.

Note: JSON does not have bigint. You need to serialize them to strings, and then parse them to bigint. This function does not do that for you. It is only useful when you are decoding values that already are JavaScript bigint, but are unknown to TypeScript.
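
If you do need bigints in your JSON, one approach is to serialize them as strings and build a codec on top of string. A sketch (not a built-in feature):

import { type Codec, type DecoderResult, flatMap, string } from "tiny-decoders";

// Hypothetical: bigints stored as strings in the JSON.
const bigintAsString: Codec<bigint, string> = flatMap(string, {
  decoder: (str): DecoderResult<bigint> => {
    try {
      return { tag: "Valid", value: BigInt(str) };
    } catch {
      return {
        tag: "DecoderError",
        error: {
          tag: "custom",
          message: "Expected a stringified bigint",
          got: str,
          path: [],
        },
      };
    }
  },
  encoder: (value) => value.toString(),
});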

string

const string: Codec<string, string>;

Codec for a JSON string, and a TypeScript string.

primitiveUnion

type Color = "green" | "red";

const colorCodec: Codec<Color> = primitiveUnion(["green", "red"]);

Codec for a set of specific primitive values, and a TypeScript union of those values.

It takes one parameter (variants), which is an array of the values you want.

Notes:

  • You must provide at least one variant.
  • If you provide exactly one variant, you get a codec for a single, constant, exact value (a union with just one variant), as sketched below.
  • If you have an object and want to use its keys for a string union, there’s an example of that in the type inference example.
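
For example, a sketch of a codec for a single constant value (a hypothetical API version):

import { primitiveUnion } from "tiny-decoders";

const apiVersionCodec = primitiveUnion([2]);
// apiVersionCodec.decoder(2) => { tag: "Valid", value: 2 }
// apiVersionCodec.decoder(3) => a DecoderError ("unknown primitiveUnion variant")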

Full type definition:

function primitiveUnion<
  const Variants extends readonly [primitive, ...Array<primitive>],
>(variants: Variants): Codec<Variants[number], Variants[number]>;

type primitive = bigint | boolean | number | string | symbol | null | undefined;

array

const stringsCodec: Codec<Array<string>> = array(string);

Codec for a JSON array, and a TypeScript Array.

It takes one parameter (codec), which is a codec for each item of the array.

As shown in the example above, array(string) is a codec for an array of strings (Array<string>).

Full type definition:

function array<DecodedItem, EncodedItem>(
  codec: Codec<DecodedItem, EncodedItem>,
): Codec<Array<DecodedItem>, Array<EncodedItem>>;

record

const countsCodec: Codec<Record<string, number>> = record(number);

Codec for a JSON object, and a TypeScript Record. (Yes, this function is named after TypeScript’s type. Other languages call this a “dict”.)

It takes one parameter (codec), which is a codec for each value of the object.

As shown in the example above, record(number) is a codec for an object where the keys can be any strings and the values are numbers (Record<string, number>).

Full type definition:

function record<DecodedValue, EncodedValue>(
  codec: Codec<DecodedValue, EncodedValue>,
): Codec<Record<string, DecodedValue>, Record<string, EncodedValue>>;

fields

type User = {
  name: string;
  age?: number;
  active: boolean;
};

const userCodec: Codec<User> = fields({
  name: string,
  age: field(number, { optional: true }),
  active: field(boolean, { renameFrom: "is_active" }),
});

Codec for a JSON object with certain fields, and a TypeScript object type (also called interface) with known fields.

It takes one main parameter (mapping), which is an object with the keys you want in your TypeScript object. The values are either Codecs or Fields. A Field is just a Codec with some metadata: Whether the field is optional, and whether the field has a different name in the JSON object. Passing a plain Codec instead of a Field is just a convenience shortcut for passing a Field with the default metadata (the field is required, and has the same name both in TypeScript and in JSON).

Use the field function to create a Field – use it when you need to mark a field as optional, or when it has a different name in JSON than in TypeScript.

fields also takes an allowExtraFields option, which lets you choose between ignoring extraneous fields and making it an error.

  • true (default) allows extra fields on the object.
  • false returns a DecoderError for extra fields.

See also the Extra fields example.
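
For example, a sketch with allowExtraFields turned off:

import { fields, string } from "tiny-decoders";

const strictUserCodec = fields({ name: string }, { allowExtraFields: false });

// strictUserCodec.decoder({ name: "John" })
// => { tag: "Valid", value: { name: "John" } }

// strictUserCodec.decoder({ name: "John", extra: 1 })
// => a DecoderError (extra fields are not allowed)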

Full type definition (it’s a bit of a mouthful and looks much more complicated than using fields actually is):

function fields<Mapping extends FieldsMapping>(
  mapping: Mapping,
  { allowExtraFields = true }: { allowExtraFields?: boolean } = {},
): Codec<InferFields<Mapping>, InferEncodedFields<Mapping>>;

type FieldsMapping = Record<string, Codec<any> | Field<any, any, FieldMeta>>;

type Field<Decoded, Encoded, Meta extends FieldMeta> = Meta & {
  codec: Codec<Decoded, Encoded>;
};

type FieldMeta = {
  renameFrom?: string | undefined;
  optional?: boolean | undefined;
  tag?: { decoded: primitive; encoded: primitive } | undefined;
};

type primitive = bigint | boolean | number | string | symbol | null | undefined;

type InferFields<Mapping extends FieldsMapping> = magic;

type InferEncodedFields<Mapping extends FieldsMapping> = magic;

field

const optionalFieldCodec: Codec<{ maybe?: string }> = fields({
  maybe: field(string, { optional: true }),
});

This function takes a codec and lets you:

  • Mark a field as optional: field(string, { optional: true })
  • Rename a field: field(string, { renameFrom: "some_name" })
  • Both: field(string, { optional: true, renameFrom: "some_name" })

Use it with fields.

Here’s an example illustrating the difference between field(string, { optional: true }) and undefinedOr(string):

const exampleCodec = fields({
  // Required field.
  a: string,

  // Optional field.
  b: field(string, { optional: true }),

  // Required field that can be set to `undefined`:
  c: undefinedOr(string),

  // Optional field that can be set to `undefined`:
  d: field(undefinedOr(string), { optional: true }),
});

The inferred type from exampleCodec is:

type Example = {
  a: string;
  b?: string;
  c: string | undefined;
  d?: string | undefined;
};

Full type definition:

function field<Decoded, Encoded, const Meta extends Omit<FieldMeta, "tag">>(
  codec: Codec<Decoded, Encoded>,
  meta: Meta,
): Field<Decoded, Encoded, Meta>;

type Field<Decoded, Encoded, Meta extends FieldMeta> = Meta & {
  codec: Codec<Decoded, Encoded>;
};

type FieldMeta = {
  renameFrom?: string | undefined;
  optional?: boolean | undefined;
  tag?: { decoded: primitive; encoded: primitive } | undefined;
};

type primitive = bigint | boolean | number | string | symbol | null | undefined;

The tag thing is handled by the tag function. It’s not something you’ll set manually using field. (That’s why the type annotation says Omit<FieldMeta, "tag">.)

Warning

It is recommended to enable the exactOptionalPropertyTypes option in tsconfig.json.

Why? Let’s take this codec as an example:

const exampleCodec = fields({ name: field(string, { optional: true }) });

With exactOptionalPropertyTypes enabled, the inferred type for exampleCodec is:

type Example = { name?: string };

That type allows constructing {} or { name: "some string" }. If you pass either of those to exampleCodec.decoder (such as exampleCodec.decoder({ name: "some string" })), the decoder will succeed. It makes sense that a decoder accepts things that it has produced itself (when no transformation is involved).

With exactOptionalPropertyTypes turned off (which is the default), the inferred type for exampleCodec is:

type Example = { name?: string | undefined };

Notice the added | undefined. That allows also constructing { name: undefined }. But if you run exampleCodec.decoder({ name: undefined }), the decoder will fail. The decoder only supports name existing and being set to a string, or name being missing. It does not support it being set to undefined explicitly. If you wanted to support that, use undefinedOr:

const exampleCodec = fields({
  name: field(undefinedOr(string), { optional: true }),
});

That gives the same inferred type, but also supports decoding the name field being set to undefined explicitly.

All in all, you avoid a slight gotcha with optional fields and inferred types if you enable exactOptionalPropertyTypes.

taggedUnion

type Shape =
  | { tag: "Circle"; radius: number }
  | { tag: "Rectangle"; width: number; height: number };

const shapeCodec: Codec<Shape> = taggedUnion("tag", [
  {
    tag: tag("Circle"),
    radius: number,
  },
  {
    tag: tag("Rectangle"),
    width: field(number, { renameFrom: "width_px" }),
    height: field(number, { renameFrom: "height_px" }),
  },
]);

Codec for JSON objects with a common field (that tells them apart), and a TypeScript tagged union type.

The first parameter (decodedCommonField) is the name of the common field.

The second parameter (variants) is an array of objects. Those objects are “fields objects” – they fit when passed to fields as well. All of those objects must have decodedCommonField as a key, and use the tag function on that key (that’s the field also called tag in the example above).

taggedUnion also takes an allowExtraFields option, which works just like for fields.
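
With the shapeCodec above, decoding reads the JSON field names and encoding writes them back. A sketch of the expected behavior:

shapeCodec.decoder({ tag: "Rectangle", width_px: 1, height_px: 2 })
// => { tag: "Valid", value: { tag: "Rectangle", width: 1, height: 2 } }

shapeCodec.encoder({ tag: "Rectangle", width: 1, height: 2 })
// => { tag: "Rectangle", width_px: 1, height_px: 2 }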

See also the examples in the repository.

Note: If you use the same tag value twice, the last one wins. TypeScript infers a type with two variants with the same tag (which is a valid type), but tiny-decoders can’t tell them apart. Nothing will ever decode to the first one, only the last one will succeed. Trying to encode the first one might result in bad data.

Full type definition (just like fields it’s a bit complicated):

function taggedUnion<
  const DecodedCommonField extends keyof Variants[number],
  Variants extends readonly [
    Variant<DecodedCommonField>,
    ...Array<Variant<DecodedCommonField>>,
  ],
>(
  decodedCommonField: DecodedCommonField,
  variants: Variants,
  { allowExtraFields = true }: { allowExtraFields?: boolean } = {},
): Codec<
  InferFieldsUnion<Variants[number]>,
  InferEncodedFieldsUnion<Variants[number]>
>;

type Variant<DecodedCommonField extends number | string | symbol> = Record<
  DecodedCommonField,
  Field<any, any, { tag: { decoded: primitive; encoded: primitive } }>
> &
  Record<string, Codec<any> | Field<any, any, FieldMeta>>;

type primitive = bigint | boolean | number | string | symbol | null | undefined;

type InferFieldsUnion<MappingsUnion extends FieldsMapping> = magic;

type InferEncodedFieldsUnion<MappingsUnion extends FieldsMapping> = magic;

// See `fields` for the definitions of `Field`, `FieldMeta` and `FieldsMapping`.

tag

const directionCodec: Codec<{ tag: "Left" } | { tag: "Right" }> = taggedUnion(
  "tag",
  [{ tag: tag("Left") }, { tag: tag("Right") }],
);

Used with taggedUnion, once for each variant of the union.

tag("MyTag") returns a Field with a codec that requires the input "MyTag" and returns "MyTag". The metadata of the Field also advertises that the tag value is "MyTag", which taggedUnion uses to know what to do.

tag("MyTag", { renameTagFrom: "my_tag" }) returns a Field with a codec that requires the input "my_tag" but returns "MyTag".

tag("MyTag", { renameFieldFrom: "otherFieldName" }) lets you use another common field name in TypeScript than in JSON – see the Renaming union field example.

You will typically use string tags for your tagged unions, but other primitive types such as boolean and number are supported too.

Full type definition:

function tag<
  const Decoded extends primitive,
  const Encoded extends primitive,
  const EncodedFieldName extends string,
>(
  decoded: Decoded,
  options: { renameTagFrom?: Encoded; renameFieldFrom?: EncodedFieldName } = {},
): Field<
  Decoded,
  Encoded,
  {
    renameFrom: EncodedFieldName | undefined;
    tag: { decoded: primitive; encoded: primitive };
  }
>;

type primitive = bigint | boolean | number | string | symbol | null | undefined;

tuple

type Point = [number, number];

const pointCodec: Codec<Point> = tuple([number, number]);

Codec for a JSON array, and a TypeScript tuple. They both must have the exact same length, otherwise the decoder fails.

It takes one parameter (codecs), which is a tuple of codecs, one codec per slot in the tuple.

See the tuples example for more details.

Full type definition:

function tuple<const Codecs extends ReadonlyArray<Codec<any>>>(
  codecs: Codecs,
): Codec<InferTuple<Codecs>, InferEncodedTuple<Codecs>>;

type InferTuple<Codecs extends ReadonlyArray<Codec<any>>> = magic;

type InferEncodedTuple<Codecs extends ReadonlyArray<Codec<any>>> = magic;

multi

type Id = { tag: "Id"; id: string } | { tag: "LegacyId"; id: number };

const idCodec: Codec<Id> = map(multi(["string", "number"]), {
  decoder: (value) => {
    switch (value.type) {
      case "string":
        return { tag: "Id" as const, id: value.value };
      case "number":
        return { tag: "LegacyId" as const, id: value.value };
    }
  },
  encoder: (id) => {
    switch (id.tag) {
      case "Id":
        return { type: "string", value: id.id };
      case "LegacyId":
        return { type: "number", value: id.id };
    }
  },
});

Codec for multiple types, and a TypeScript tagged union for those types.

This is useful for supporting stuff that can be either a string or a number, for example. It lets you do a JavaScript typeof, basically.

The type annotation for multi is a bit wacky, but it’s not that complicated to use. The types parameter is an array of strings – the wanted types. For example, you can say ["string", "number"]. Then the decoder will give you back either { type: "string", value: string } or { type: "number", value: number }. You can use map to map that to some type of choice, or flatMap to decode further. The example above shows that.
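
With the idCodec above, a sketch of the expected results:

idCodec.decoder("abc") // => { tag: "Valid", value: { tag: "Id", id: "abc" } }
idCodec.decoder(123)   // => { tag: "Valid", value: { tag: "LegacyId", id: 123 } }
idCodec.decoder(true)  // => a DecoderError (booleans are not among the listed types)

idCodec.encoder({ tag: "Id", id: "abc" }) // => "abc"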

The strings in types are the same as what JavaScript’s typeof returns, with two exceptions:

  • null is "null" instead of "object" (because typeof null === "object" is a famous mistake).
  • array is "array" instead of "object" (because arrays are very common).

If you need to tell other objects apart, write a custom codec.

Full type definition:

function multi<Types extends readonly [MultiTypeName, ...Array<MultiTypeName>]>(
  types: Types,
): Codec<Multi<Types[number]>, Multi<Types[number]>["value"]>;

type MultiTypeName =
  | "array"
  | "bigint"
  | "boolean"
  | "function"
  | "null"
  | "number"
  | "object"
  | "string"
  | "symbol"
  | "undefined";

type Multi<Types> = Types extends any
  ? Types extends "undefined"
    ? { type: "undefined"; value: undefined }
    : Types extends "null"
      ? { type: "null"; value: null }
      : Types extends "boolean"
        ? { type: "boolean"; value: boolean }
        : Types extends "number"
          ? { type: "number"; value: number }
          : Types extends "bigint"
            ? { type: "bigint"; value: bigint }
            : Types extends "string"
              ? { type: "string"; value: string }
              : Types extends "symbol"
                ? { type: "symbol"; value: symbol }
                : Types extends "function"
                  ? { type: "function"; value: Function }
                  : Types extends "array"
                    ? { type: "array"; value: Array<unknown> }
                    : Types extends "object"
                      ? { type: "object"; value: Record<string, unknown> }
                      : never
  : never;

recursive

type Person = {
  name: string;
  friends: Array<Person>;
};

const personCodec: Codec<Person> = fields({
  name: string,
  friends: array(recursive(() => personCodec)),
});

When you make a codec for a recursive data structure, you might end up with errors like:

ReferenceError: Cannot access 'personCodec' before initialization

The solution is to wrap personCodec in recursive: recursive(() => personCodec). The unnecessary-looking arrow function delays the reference to personCodec so we’re able to define it.

See the recursive example for more information.

Full type definition:

function recursive<Decoded, Encoded>(
  callback: () => Codec<Decoded, Encoded>,
): Codec<Decoded, Encoded>;

undefinedOr

const undefinedOrStringCodec: Codec<undefined | string> = undefinedOr(string);

Returns a new codec that also accepts undefined.

Notes:

  • Using undefinedOr does not make a field in an object optional. It only allows the field to be undefined. Similarly, using the field function to mark a field as optional does not allow setting the field to undefined, only omitting it.
  • JSON does not have undefined (only null). So undefinedOr is more useful when you are decoding something that does not come from JSON. However, even when working with JSON, undefinedOr still has a use: If you infer types from codecs, using undefinedOr on object fields results in | undefined for the type of the field, so you can assign undefined to it, which is occasionally useful.

Full type definition:

function undefinedOr<Decoded, Encoded>(
  codec: Codec<Decoded, Encoded>,
): Codec<Decoded | undefined, Encoded | undefined>;

nullOr

const nullOrStringCodec: Codec<null | string> = nullOr(string);

Returns a new codec that also accepts null.

Full type definition:

function nullOr<Decoded, Encoded>(
  codec: Codec<Decoded, Encoded>,
): Codec<Decoded | null, Encoded | null>;

map

const numberSetCodec: Codec<Set<number>> = map(array(number), {
  decoder: (arr) => new Set(arr),
  encoder: Array.from,
});

map takes two parameters: codec, and transform, which is an object with two functions.

Use map to run a function (transform.decoder) after a decoder (if it succeeds). The function transforms the decoded data. transform.encoder turns the transformed data back again.
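
For example, with the numberSetCodec above:

numberSetCodec.decoder([1, 2, 3])          // => { tag: "Valid", value: new Set([1, 2, 3]) }
numberSetCodec.encoder(new Set([1, 2, 3])) // => [1, 2, 3]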

Full type definition:

function map<const Decoded, Encoded, NewDecoded>(
  codec: Codec<Decoded, Encoded>,
  transform: {
    decoder: (value: Decoded) => NewDecoded;
    encoder: (value: NewDecoded) => Readonly<Decoded>;
  },
): Codec<NewDecoded, Encoded>;

flatMap

const regexCodec: Codec<RegExp> = flatMap(string, {
  decoder: (str) => {
    try {
      return { tag: "Valid", value: RegExp(str, "u") };
    } catch (error) {
      return {
        tag: "DecoderError",
        error: {
          tag: "custom",
          message: error instanceof Error ? error.message : String(error),
          got: str,
          path: [],
        },
      };
    }
  },
  encoder: (regex) => regex.source,
});

flatMap takes two parameters: codec, and transform, which is an object with two functions.

Use flatMap to run a function (transform.decoder) after a decoder (if it succeeds). The function decodes the decoded data further, returning another DecoderResult which is then “flattened” (so you don’t end up with a DecoderResult inside a DecoderResult). transform.encoder turns the transformed data back again.

Note: Sometimes TypeScript has trouble inferring the return type of the transform.decoder function. No matter what you do, it keeps complaining. In such cases it helps to add a return type annotation to the transform.decoder function.
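
For example, a hypothetical codec for positive numbers with the return type spelled out:

import { type Codec, type DecoderResult, flatMap, number } from "tiny-decoders";

const positiveNumberCodec: Codec<number> = flatMap(number, {
  decoder: (value): DecoderResult<number> =>
    value > 0
      ? { tag: "Valid", value }
      : {
          tag: "DecoderError",
          error: {
            tag: "custom",
            message: "Expected a positive number",
            got: value,
            path: [],
          },
        },
  encoder: (value) => value,
});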

Full type definition:

function flatMap<const Decoded, Encoded, NewDecoded>(
  codec: Codec<Decoded, Encoded>,
  transform: {
    decoder: (value: Decoded) => DecoderResult<NewDecoded>;
    encoder: (value: NewDecoded) => Readonly<Decoded>;
  },
): Codec<NewDecoded, Encoded>;

DecoderError

type DecoderError = {
  path: Array<number | string>;
  orExpected?: "null or undefined" | "null" | "undefined";
} & (
  | {
      tag: "custom";
      message: string;
      got: unknown;
    }
  | {
      tag: "exact fields";
      knownFields: Array<string>;
      got: Array<string>;
    }
  | {
      tag: "missing field";
      field: string;
      got: Record<string, unknown>;
    }
  | {
      tag: "tuple size";
      expected: number;
      got: number;
    }
  | {
      tag: "unknown taggedUnion tag";
      knownTags: Array<primitive>;
      got: unknown;
    }
  | {
      tag: "unknown multi type";
      knownTypes: Array<
        | "array"
        | "boolean"
        | "null"
        | "number"
        | "object"
        | "string"
        | "undefined"
      >;
      got: unknown;
    }
  | {
      tag: "unknown primitiveUnion variant";
      knownVariants: Array<primitive>;
      got: unknown;
    }
  | {
      tag: "wrong tag";
      expected: primitive;
      got: unknown;
    }
  | { tag: "array"; got: unknown }
  | { tag: "bigint"; got: unknown }
  | { tag: "boolean"; got: unknown }
  | { tag: "number"; got: unknown }
  | { tag: "object"; got: unknown }
  | { tag: "string"; got: unknown }
);

type primitive = bigint | boolean | number | string | symbol | null | undefined;

The error returned by all decoders. It keeps track of where in the JSON the error occurred.

Use the format function to get a nice string explaining what went wrong.

const myCodec = array(string);

const decoderResult = myCodec.decoder(someUnknownValue);
switch (decoderResult.tag) {
  case "DecoderError":
    console.error(format(decoderResult.error));
    break;
  case "Valid":
    console.log(decoderResult.value);
    break;
}

When creating your own DecoderError, you probably want to do something like this:

const myError: DecoderError = {
  tag: "custom", // You probably want "custom".
  message: "my message", // What you expected, or what went wrong.
  got: theValueYouTriedToDecode,
  // Usually the empty array; put the object key or array index you’re at if
  // that makes sense. This will show up as for example `At root["myKey"]`.
  path: [],
};

orExpected exists so that undefinedOr and nullOr can say that undefined and/or null also are expected values.

format

function format(error: DecoderError, options?: ReprOptions): string;

Turns a DecoderError into a nicely formatted string. It uses repr under the hood and takes the same options.

repr

type ReprOptions = {
  depth?: number | undefined;
  indent?: string | undefined;
  maxArrayChildren?: number | undefined;
  maxObjectChildren?: number | undefined;
  maxLength?: number | undefined;
  sensitive?: boolean | undefined;
};

function repr(
  value: unknown,
  {
    depth = 0,
    indent = "  ",
    maxArrayChildren = 5,
    maxObjectChildren = 5,
    maxLength = 100,
    sensitive = false,
  }: ReprOptions = {},
): string;

Takes any value, and returns a string representation of it for use in error messages. format uses it behind the scenes. If you want to do your own formatting, repr can be useful.

Options:

  • depth (number, default: 0): How deep to recursively call repr on array items and object values.
  • indent (string, default: "  ", two spaces): The indentation to use for nested values when depth is larger than 0.
  • maxArrayChildren (number, default: 5): The number of array items to print.
  • maxObjectChildren (number, default: 5): The number of object key-values to print.
  • maxLength (number, default: 100): The maximum length of literals, such as strings, before truncating them.
  • sensitive (boolean, default: false): Set it to true if you deal with sensitive data to avoid leaks. See below.

format(someDecoderError) example:

At root["details"]["ssn"]:
Expected a string
Got: 123456789

format(someDecoderError, { sensitive: true }) example:

At root["details"]["ssn"]:
Expected a string
Got: number
(Actual values are hidden in sensitive mode.)

It’s helpful when errors show you the actual values that failed decoding to make it easier to understand what happened. However, if you’re dealing with sensitive data, such as email addresses, passwords or social security numbers, you might not want that data to potentially appear in error logs.

Replacement for JSON.parse and JSON.stringify

const JSON: {
  parse<Decoded>(
    codec: Codec<Decoded>,
    jsonString: string,
  ): DecoderResult<Decoded>;

  stringify<Decoded, Encoded>(
    codec: Codec<Decoded, Encoded>,
    value: Decoded,
    space?: number | string,
  ): string;
};

tiny-decoders exports a JSON object with parse and stringify methods, similar to the standard global JSON object. The difference is that tiny-decoders’ versions also take a Codec, which makes them safer.

You can use ESLint’s no-restricted-globals rule to forbid the global JSON object, for maximum safety:

{
  "rules": {
    "no-restricted-globals": [
      "error",
      {
        "name": "JSON",
        "message": "Import JSON from tiny-decoders and use its JSON.parse and JSON.stringify with a codec instead."
      }
    ]
  }
}

Note

The standard JSON.stringify can return undefined (if you try to stringify undefined itself, or a function or a symbol). tiny-decoders’ JSON.stringify always returns a string – it returns "null" for undefined, functions and symbols.
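
For example, a sketch of a round trip through tiny-decoders’ JSON:

import { fields, type Infer, JSON, number, string } from "tiny-decoders";

const personCodec = fields({ name: string, age: number });
type Person = Infer<typeof personCodec>;

const json = JSON.stringify(personCodec, { name: "John", age: 30 }, 2);
// json is a JSON string, produced via personCodec’s encoder.

const result = JSON.parse(personCodec, json);
// result is a DecoderResult<Person>.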

Type inference

Rather than first defining the type and then defining the codec (which often feels like writing the type twice), you can define only the codec and then infer the type.

const personCodec = fields({
  name: string,
  age: number,
});

type Person = Infer<typeof personCodec>;
// equivalent to:
type Person = {
  name: string;
  age: number;
};

This is a nice pattern (naming the type and the codec the same):

type Person = Infer<typeof Person>;
const Person = fields({ name: string, age: number });

Note that if you don’t annotate a codec, TypeScript infers both type parameters of Codec<Decoded, Encoded>. But if you annotate it with Codec<MyType>, TypeScript does not infer Encoded – it will become unknown. If you specify one type parameter, TypeScript stops inferring them altogether and requires you to specify all of them – except the ones with defaults. Encoded defaults to unknown, which is usually fine, but occasionally you need to work with a more precise type for Encoded. Then it might even be easier to leave out the type annotation!
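
A sketch of the difference:

import { boolean, type Codec, field, fields } from "tiny-decoders";

// No annotation: both Decoded and Encoded are inferred
// (Encoded keeps the "is_active" name).
const inferredCodec = fields({
  active: field(boolean, { renameFrom: "is_active" }),
});

// Annotated with just Codec<MyType>: Encoded falls back to `unknown`.
const annotatedCodec: Codec<{ active: boolean }> = fields({
  active: field(boolean, { renameFrom: "is_active" }),
});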

See the type inference example for more details.

Some people like writing the types first, and then the codecs. Other people like writing only the codec and inferring the type. Some like both. It’s up to you to choose.

Things left out

either

// 🚨 Does not exist!
function either<T, U>(codec1: Codec<T>, codec2: Codec<U>): Codec<T | U>;

The decoder of this codec would try codec1.decoder first. If it fails, go on and try codec2.decoder. If that fails, present both errors. I consider this a blunt tool.

  • If you want either a string or a number, use multi. This lets you switch between any JSON types.
  • For objects that can be decoded in different ways, use taggedUnion. If that’s not possible, see the untagged union example for how you can approach the problem.

The above approaches result in a much simpler DecoderError type, and also in much better error messages, since there’s never a need to present something like “decoding failed in the following 2 ways: …”
