# Review of CL Json Libraries UPDATED 1 Feb 2022

## Changelog

1 Feb 2022 - Picks up updates in shasht and now includes warnings on libraries with safety 0 (jsown, jonathan, st-json).

19 Jan 2022 - Updated shasht for fixes in character, 2D arrays, and nil handling. (If you rely on quicklisp versions, expect these improvements in the next quicklisp version, but they are up on github now.) Fixed some typos, including a missing earmuff.

18 Jan 2022 - Corrected json conformity tests on boost-json, cl-json, json-lib, shasht, st-json, trivial-json-codec and yason. If *read-default-float-format* is set to 'single-float, those libraries would refuse to accept: "[123e65]", "[123e45]" and "[123.456e78]". If *read-default-float-format* is set to 'double-float, then those json strings would be correctly accepted. Net result: with that caveat, boost-json, cl-json, com.gigamonkeys.json, com.inuoe.jzon, json-lib, shasht, st-json and yason all score 95/95 on the json strings that must be accepted. See Standard Conformity - Must Accept.

15 Jan 2022 - Complete rewrite of 1st edition.

## Introduction

The common lisp (CL) landscape of Json libraries has changed since the first edition of this review eight years ago. While I sometimes complain about people writing new libraries when there are already so many, there have been major improvements, and a couple of the new generation of CL json libraries are actually quite exciting. It remains the case, however, that your choice of data structures for your application may drive your choice of json library, and vice versa. As with everything else in life, there are trade-offs to be made, and I hope this paper helps you think about what may or may not be relevant for your situation.

Corrections to this report are welcomed. Please submit issues to https://github.com/sabracrolleton/sabracrolleton.github.io or pull requests against the json-view.org file.

### Common Lisp Encoding and Decoding Libraries

CL currently has at least twelve libraries that address importing and exporting json data. The libraries considered in this report are listed in the table below. (For purposes of this comparison, I will refer to "encoding" as converting from CL to JSON and "decoding" or "parsing" as converting from JSON to CL.)

Table 1: Common Lisp Json Libraries Compared

| Library | Author | License | Website | Quicklisp? | Updated |
|---------|--------|---------|---------|------------|---------|
| boost-json | Jeffrey Massung | Apache | homepage | No | 16 Dec 2021 |
| cl-json | Henrik Hjelte, Boris Smilga, Robert Goldman | MIT | homepage | Yes | 7 Nov 2014 |
| com.gigamonkeys.json | Peter Seibel | BSD-3 | homepage | Yes | 15 Apr 2017 |
| com.inuoe.jzon | Wilfredo Velázquez-Rodríguez | MIT | homepage | No | 30 Nov 2021 |
| jonathan | Rudolph Miller | MIT | homepage | Yes | 1 Sep 2020 |
| json-lib | Alex Nygren | MIT | homepage | No | 19 Dec 2021 |
| json-streams | Thomas Bakketun, Stian Sletner | GPL3 | homepage | Yes | 12 Oct 2017 |
| jsown | Aad Versteden | MIT | homepage | Yes | 4 Feb 2020 |
| shasht | Tarn W. Burton | MIT | homepage | Yes | 19 Jan 2022 |
| st-json | Marijn Haverbeke | zlib | homepage | Yes | 28 Jun 2021 |
| trivial-json-codec (2) | Eric Diethelm | MIT | homepage | Yes | 8 Mar 2020 |
| yason (1) | Hans Huebner | BSD | homepage | Yes | 27 Aug 2021 |
• (1) IMPORTANT: Notice the github location has moved. Hans Huebner's old github location will automatically redirect to Phil Marek's, but quicklisp is not (as of the time of this writing) pulling code from the new github location.
• (2) trivial-json-codec seems more targeted at decoding from json to CL than serializing to json. Its main purpose is deserializing json data into CLOS hierarchical objects.
Table 2: Dependencies

| Library | Dependencies |
|---------|--------------|
| boost-json | (none) |
| cl-json | (none) |
| com.gigamonkeys.json | (none) |
| com.inuoe.jzon | closer-mop, flexi-streams |
| jonathan | cl-syntax, cl-syntax-annot, fast-io, trivial-types, babel, proc-parse, cl-ppcre, cl-annot |
| json-lib | alexandria, str, parse-float, cl-fad, babel |
| json-streams | (none) |
| jsown | (none) |
| shasht | trivial-do, closer-mop |
| st-json | (none) |
| trivial-json-codec | trivial-utilities, log4cl, closer-mop, iterate, parse-number |
| yason | alexandria, trivial-gray-streams |

### Helper Libraries

We should also talk a bit about some helper libraries.

| Library | Author | License | Website | Works With | Purpose |
|---------|--------|---------|---------|------------|---------|
| define-json-expander | Johan Sjölén | MIT | homepage | cl-json | CLOS<->json |
| herodotus | Henry Steere | BSD (1) | homepage | yason | CLOS<->json |
| json-mop | Grim Schjetne | MIT | homepage | yason | CLOS<->json |
| cl-json-helper | Bob Felts | BSD | homepage | cl-json | |
| cl-json-schema | Mark Skilbeck | MIT | homepage | | |
| cl-json-pointer | Yokata Yuki | MIT | homepage | cl-json, st-json, yason, jsown, jonathan, json-streams, com.gigamonkeys.json | (1) 40ants comments |
• (1) asd file says BSD, but license included is MIT.

## Quick Summary

As can be expected, the libraries do much the same thing if you have basic needs. However, differences exist and should be considered when choosing which library is right for any particular project. Many applications are asymmetric in how they will use these libraries, so consider the strengths and weaknesses given the needs of your case. Library links go to the libraries' individual sections below.

• Overall: I quite like the newcomers shasht and com.inuoe.jzon. cl-json and yason are still the workhorses if you need fine control, but speed is not their forte.
• Decoding or Parsing Speed: Speed is not everything, but it seems to be important to a lot of readers. If you are parsing compliant data and are just looking for speed, look at jsown, com.inuoe.jzon (not in quicklisp) or shasht. Jonathan is faster on tiny strings but starts to slow down with nested objects and eventually, depending on the size of the json object, becomes orders of magnitude slower than the other libraries. The effect appeared sooner under sbcl, but as data file size increased it became obvious under ccl and ecl as well. See read-times.
• Encoding Speed: If you mostly need to encode lisp data to json and are just looking for speed, I would look at com.inuoe.jzon, shasht or depending on your data, jonathan or st-json. See write-times
• Safety 0: Three libraries (jsown, jonathan and st-json) have optimize set to safety 0 to increase speed. This has bitten some users hard in the past, so you have been warned.
• Unicode Handling: Most of the libraries can handle unicode. However, if you have unicode surrogate pairs, you need to choose from com.inuoe.jzon, jonathan, json-streams, jsown, shasht, st-json or yason. See Decoding Unicode.
• Extracting Subsets: If you want to extract a subset of json data without paying attention to everything else, jsown is the choice. Jonathan allows you to get a key:value pair, but I could only use it one level deep. Obviously every library can import all the json data into some CL datastructure and then use standard CL functionality to get whatever subset you want.
• Handling all proper json forms: If you remember to set *read-default-float-format* to 'double-float, there is no clear winner: boost-json, cl-json, com.gigamonkeys.json, com.inuoe.jzon, json-lib, shasht, st-json and yason all score 95/95 on the json standard conformity tests that must be met. If *read-default-float-format* is set to 'single-float, only com.inuoe.jzon and com.gigamonkeys.json have perfect scores. See Must Accept.
• Handling potentially invalid data: See Standard Conformity and Dealing with Malformed Data. I think com.inuoe.jzon is the winner; unfortunately, it is not in quicklisp if that is a requirement.
• Proper Distinction Between NULL/Nil/False: Com.inuoe.jzon, shasht and st-json get it right out of the box. Jonathan and yason provide the ability to get there by setting a simple variable.
• CLOS Abilities - Decoding: Trivial-json-codec can decode objects to standard pre-defined classes. Boost-json can decode json objects to a boost-json:json-object, which is a standard object. Cl-json has the ability to decode json objects into a "fluid-class" CLOS object. (Note: this is thread-unsafe if you have not already prepped classes for every expected json object. If you define your own classes with a lispClass member, this can be avoided so long as every class is defined that way.) Personally, I prefer being able to define my own classes; trivial-json-codec then just dumps the json data into your classes. If you use yason, I would also use the helper library json-mop. (com.inuoe.jzon decodes json to hash-tables, so it does not have decoding to CLOS abilities out of the box.) See Decoding to CLOS
• CLOS Abilities - Encoding: On the encoding side, cl-json, com.inuoe.jzon, shasht and trivial-json-codec are able to encode CLOS objects to json out of the box. The rest of the libraries require that you write your own methods for your CLOS classes - not necessarily difficult, but not something you get out of the box. If you like yason, there are two helper libraries: json-mop and herodotus. I have a preference for json-mop, with the caveat that, at the moment, you cannot redefine classes. There is also a cl-json helper library, define-json-expander, which defines classes that cl-json can use to help move data back and forth. See Encoding Objects
• Encoding different lisp types: shasht and com.inuoe.jzon were the best at encoding different lisp data types. For example, they were the only libraries able to handle structs without the user having to define a new method. There is a trade-off between handling multiple lisp types and symmetry: if you have a many:1 matching rather than a 1:1 matching, there may be complications doing a round trip between json and CL and getting back exactly the data type or structure you started with. See Encoding or Encoding Data Structures.
• Incremental Encoding: cl-json, jonathan, json-streams, jsown, shasht, st-json and yason all have the capability to do incremental encoding. You can see small examples in their individual sections at the links in the previous sentence. I found shasht a little easier. com.inuoe.jzon is actively working on the capability. See Incremental Encoding Discussion for more detail.
• Security issues: Security issues are always a potential problem if you are getting data from uncontrolled sources. Malware disguised in the data will likely be either properly formed json strings that try to overload the system (for example, by flooding libraries that intern keys as keywords) or improperly formed json strings that exhaust the stack. See the Security discussion for more information. When facing data from uncontrolled sources, I would look first at com.inuoe.jzon, then maybe at shasht and jonathan. I would suggest that all libraries put some type of limit on the parsing depth of json objects or arrays. At this point only com.inuoe.jzon, json-lib, json-streams and trivial-json-codec have such limits, all of which are configurable. See security.
• Symmetry or "Round Tripping": Symmetric treatment is important to some users. In this area, libraries tend to have issues with NULL/Nil/False as well as whether keys should be symbols or strings. Symmetry is easier going from JSON to CL and back. It is definitely harder going from CL to JSON and back to CL if you do not limit your CL data types (CL has more data types, so there is no 1:1 match). Your choices are to limit your CL data types or to use a library which allows you to precisely specify the CL datatype you want at that point. Overall, I think the symmetry winner is shasht, but your fact pattern may drive a different answer. See symmetry discussion.

At the end of the day, your particular data and requirements will determine which library is best for any particular application. Webapps may have tiny bits of json going back and forth, while other uses will be asymmetric - gigabytes of json getting imported and little or occasional amounts getting exported, or vice versa. You may have different needs depending on whether the json encoded data is strictly controlled or coming in from unknown sources. In one test, yason:encode choked on a list which included a keyword :NULL in an unexpected location; cl-json just encoded it as the string "null" and st-json encoded it as 'null' (not a string). In testing for your use, if you get json data from uncontrolled sources, deliberately feed badly formed data and see how the library reacts. Some will throw recoverable conditions (depending on the error) while others may actually lock up a thread.

## Decoding/Reading/Parsing Json Data to Lisp

The following table shows the basic decoding function for a library and then specialist functions. Each will have more detail in the section specific to that library.

Table 3: Basic Decoding Functions

| Library | Base Function | Specialist Functions |
|---------|---------------|----------------------|
| cl-json | decode-json | decode-json-from-source, decode-json-from-string, decode-json-strict |
| com.gigamonkeys.json | parse-json | |
| com.inuoe.jzon | parse | |
| jonathan | parse | |
| json-lib | parse | |
| jsown | parse | |
| trivial-json-codec | deserialize-raw | deserialize-json |
| yason (1) | parse | parse-json-arrays-as-vectors, parse-json-booleans-as-symbols, parse-json-null-as-keyword, parse-object-as, parse-object-as-alist, parse-object-key-fn, symbol-key-encoder |
• (1) yason:parse takes the keyword parameters :object-key-fn, :object-as, :json-arrays-as-vectors, :json-booleans-as-symbols and :json-nulls-as-keyword.
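For example, a minimal sketch of yason's :object-as keyword in action (result shapes per the default mappings described later in this review):

```lisp
;; Default: json objects become hash-tables.
(yason:parse "{\"a\": 1}")                    ; => a hash-table with key "a"

;; The :object-as keyword switches the representation.
(yason:parse "{\"a\": 1}" :object-as :alist)  ; => (("a" . 1))
(yason:parse "{\"a\": 1}" :object-as :plist)  ; => ("a" 1)
```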

### Decoding Streams or Strings

Does the library take both strings and streams as input?

Table 4: Strings and Streams As Input

| Library | Strings | Streams |
|---------|---------|---------|
| boost-json | YES | YES (1) |
| cl-json | YES | YES |
| com.gigamonkeys.json | YES | NO |
| com.inuoe.jzon | YES | YES |
| jonathan | YES | NO |
| json-lib | UTF-8 Encoded Only | NO |
| json-streams | YES | YES |
| jsown | YES | NO |
| shasht | YES | YES |
| st-json | YES | YES |
| trivial-json-codec | YES | NO |
| yason | YES | YES |
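As a quick illustration of the string/stream difference, using one of the libraries that accepts both (yason here):

```lisp
;; From a string:
(yason:parse "[1, 2, 3]")          ; => (1 2 3)

;; From a stream:
(with-input-from-string (s "[1, 2, 3]")
  (yason:parse s))                 ; => (1 2 3)
```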

### Mapping Data Types and Structures from Json to CL

Json has a limited number of data types, so, depending on the library, decoding will give you different lisp data types.

Table 5: Default Mapping Json to CL

| Library | True / False / Null | Number | Array | Json Object (3)(6) |
|---------|---------------------|--------|-------|--------------------|
| Original Json | "true" / "false" / "null" | "12.3" | "[1,2]" | "{\"a\": 2}" |
| boost-json | T / NIL / NIL | 12.3 | (1 2) | #<JSON-OBJECT {"a":2}> |
| cl-json (1) | T / NIL / NIL | 12.3 | (1 2) | ((A . 2)) |
| com.gigamonkeys.json | TRUE / FALSE / NULL | 12.3d0 | #(1 2) | (a 2) |
| com.inuoe.jzon (3) | T / NIL / NULL | 12.3d0 | #(1 2) | #<HASH-TABLE :TEST EQUAL :COUNT 1> |
| jonathan | T / NIL / NIL | 12.3 | (1 2) | (a 2) |
| json-lib (3) | T / NIL / NIL | 12.3 | #(1 2) | #<HASH-TABLE :TEST EQUAL :COUNT 1> |
| json-streams | T / NIL / NULL | 12.3d0 | (ARRAY 1 2) | (OBJECT (a . 2)) |
| jsown (5) | T / NIL / NIL | 123/10 (5) | (1 2) | (OBJ (a . 2)) |
| shasht (3)(6) | T / NIL / NULL | 12.3 | #(1 2) | #<HASH-TABLE :TEST EQUAL :COUNT 1> (6) |
| st-json | TRUE / FALSE / NULL | 12.3 | (1 2) | #S(JSO :ALIST ((a . 2))) |
| trivial-json-codec (2) | "true" / "false" / "null" | 12.3 | #(1 2) | ((:A 2)) |
| yason (3)(4)(6) | T / NIL / NIL / :NULL | 12.3 | (1 2) / #(1 2) | #<HASH-TABLE :TEST EQUAL :COUNT 1> (6) |
• (1) This is cl-json's default mode. Using cl-json:with-decoder-simple-clos-semantics or cl-json:simple-clos-semantics will switch cl-json into a mode where json arrays are decoded to cl vectors rather than lists, and json objects are decoded to CLOS objects rather than alists.
• (2) There is a difference between how it handles the bare strings "false" and "null" and how it handles them when they appear inside an object rather than as standalone sub-strings. See true-false-null-mapping.
• (3) All four libraries that decode a json object to a CL hash use strings as hash keys by default.
• (4) yason:parse has a keyword parameter :json-nulls-as-keyword and a keyword parameter :json-arrays-as-vectors.
• (5) Parsing json value strings which are not embedded in a json object or array will trigger errors. For example, (jsown:parse "12.3") or (jsown:parse "alpha") will trigger a sb-kernel:bounding-indices-bad error in sbcl. Floats in an array or object will be converted to a ratio:

(jsown:parse "[123e-1, 15.2]")
(123/10 76/5)

• (6) While shasht and yason default to parsing json objects as hash-tables, they can optionally be parsed as alists or plists.

The following tables set out some additional detail on mapping from JSON data structures to lisp data structures using the normal functions listed above with some comments on the results.

#### Unicode

Consider a json string with unicode characters. The json data string is the first row in each table.

Table 6: Decoding Basic Unicode

| Library | Function | Result |
|---------|----------|--------|
| Original | | ["\u004C","明彦","\u2604"] |
| boost-json | json-decode | (L 明彦 ☄) |
| cl-json | decode-json-from-string | (L 明彦 ☄) |
| com.gigamonkeys.json | parse-json | #(L 明彦 ☄) |
| com.inuoe.jzon | parse | #(L 明彦 ☄) |
| json-lib | parse | #( 明彦 ) (1) |
| json-streams | json-parse | (ARRAY L 明彦 ☄) |
| trivial-json-codec | deserialize-raw | #(\u004C 明彦 \u2604) (2) |
| yason | parse | (L 明彦 ☄) |
• (1) Did not handle the unicode char codes
• (2) Repeated the unicode char codes

In the following table, we show the results of attempting to decode unicode with surrogate pairs.

Table 7: Unicode Decoding 2 (surrogate pairs)

| Library | Function | Result |
|---------|----------|--------|
| Original | | "\uD83D\uDE02\uD83D\uDE02" |
| boost-json | json-decode | ���� |
| cl-json | decode-json-from-string | ���� |
| com.gigamonkeys.json | parse-json | ���� |
| com.inuoe.jzon | parse | 😂😂 |
| jonathan | parse | 😂😂 |
| json-lib | parse | ��� |
| json-streams | json-parse | 😂😂 |
| jsown | parse | 😂😂 |
| trivial-json-codec | deserialize-raw | \uD83D\uDE02\uD83D\uDE02 |
| yason | parse | 😂😂 |

Conclusion: If you may have surrogate pairs in your json data, you might want to stick to com.inuoe.jzon, jonathan, json-streams, jsown, shasht, st-json or yason.

#### Number Mapping

Table 8: Number Mapping

| Library | Result | Comment |
|---------|--------|---------|
| Original Json String | "{\"integer\": 32,\"float\": 34.89}" | |
| boost-json | #<JSON-OBJECT {"integer":32,"float":34.89}> | |
| cl-json | ((INTEGER . 32) (FLOAT . 34.89)) | |
| com.gigamonkeys.json | (integer 32 float 34.89d0) | |
| com.inuoe.jzon | ((float . 34.89d0) (integer . 32)) | |
| json-lib | ((float . 34.89) (integer . 32)) | |
| jonathan | ((float . 34.89) (integer . 32)) | |
| json-streams | (OBJECT (integer . 32) (float . 34.89d0)) | |
| jsown | (OBJ (integer . 32) (float . 3489/100)) | Ratio |
| shasht | ((float . 34.89) (integer . 32)) | |
| st-json | #S(JSO :ALIST ((integer . 32) (float . 34.89))) | |
| trivial-json-codec | ((:INTEGER 32) (:FLOAT 34.89)) | |
| yason | ((float . 34.89) (integer . 32)) | |

#### True, False, Null, Empty Array Mapping

Contrary to some people's belief that boolean logic encompasses everything, there is a meaningful difference between "false" and "unknown": null ≠ nil. For that matter, I subscribe to the belief that (not true) is not the same as the empty set. Json's null and empty arrays can track the differences between false, null and empty array. When translating back and forth between CL and json, it is important to be able to keep those distinctions. Thank you, com.gigamonkeys.json, com.inuoe.jzon, shasht and st-json for getting them correct right out of the box, and thank you to jonathan and yason for giving me the ability to get there by setting a variable.

Table 9: True, False, Null Mapping with Json Objects

| Library | Result | Comment |
|---------|--------|---------|
| Original Json String | "{\"1\": true,\"2\": false, \"3\": null}" | |
| boost-json | #<JSON-OBJECT {"1":true,"2":null,"3":null}> | failed false |
| cl-json | ((1 . T) (2) (3)) | failed null |
| com.gigamonkeys.json | (1 TRUE 2 FALSE 3 NULL) | |
| com.inuoe.jzon | ((3 . NULL) (2 . nil) (1 . T)) | |
| jonathan | ((:3 . nil) (:2 . nil) (:1 . T)) | failed null (but see next row) |
| jonathan (4) | ((:3 . :null) (:2 . nil) (:1 . T)) | |
| json-lib | ((3 . nil) (2 . nil) (1 . T)) | failed null |
| json-streams | (OBJECT (1 . T) (2) (3 . NULL)) | failed false (2) |
| jsown | (OBJ (1 . T) (2) (3)) | failed null |
| shasht | ((3 . NULL) (2 . nil) (1 . T)) | |
| st-json | #S(JSO :ALIST ((1 . TRUE) (2 . FALSE) (3 . NULL))) | |
| trivial-json-codec | ((:\|1\| T) (:\|2\| NIL) (:\|3\| NIL)) | failed false and null |
| yason (3) | (("3") ("2") ("1" . T)) | failed null (but see next row) |
| yason (4) | (("3" . :NULL) ("2") ("1" . T)) | |
• (1) The cl-boost author notes in his README that "it is not possible to distinguish between false, null, or []. And, I have personally never found this to be problematic." Obviously your experience may vary; in my experience there are meaningful differences between "false" and "unknown". I think his position is stronger if the question is solely between nil and [].
• (2) When reading a json object with false and null, json-streams provides no value for the cons cell which represents the json value false. I would have expected the cons cell to have a value of nil.
• (3) When called with :object-as :plist, the nil is explicit: ("1" T "2" NIL "3" nil)
• (4) Jonathan called with jonathan:*null-value* set to :null. Yason called with yason:*parse-json-null-as-keyword* set to t or passing the keyword parameter :json-nulls-as-keyword t to yason:parse
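Note (4) in action, as a minimal sketch using the variables and keyword parameter named above:

```lisp
;; jonathan: bind *null-value* so json null decodes as :null instead of nil
;; (see the jonathan (4) row above for the resulting shape).
(let ((jonathan:*null-value* :null))
  (jonathan:parse "{\"1\": true,\"2\": false, \"3\": null}"))

;; yason: request the same via a keyword parameter, or set
;; yason:*parse-json-null-as-keyword* to t.
(yason:parse "{\"1\": true,\"2\": false, \"3\": null}"
             :json-nulls-as-keyword t)
```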

The following table shows how the various libraries decode a json object with an empty array as the value. Com.gigamonkeys.json, com.inuoe.jzon, json-lib and shasht give the clearest representations of the original json.

Table 10: Decoding Json Object with an Empty Array

| Library | Function | Result |
|---------|----------|--------|
| Original Json string | | {"a": []} |
| boost-json | json-decode | #<JSON-OBJECT {"a":null}> |
| cl-json | decode-json-from-string | ((A)) |
| com.gigamonkeys.json | parse-json | (a #()) |
| com.inuoe.jzon | parse | ((a . #())) |
| jonathan | parse | (a NIL) |
| json-lib | parse | ((a . #())) |
| json-streams | json-parse | (OBJECT (a ARRAY)) |
| jsown | parse | (OBJ (a)) |
| trivial-json-codec | deserialize-raw | ((A NIL)) |
| yason | parse | ((a)) |

Now looking at all four types within a json array:

Table 11: True, False, Null and Empty Array Mapping with Json Arrays

| Library | Result | Comment |
|---------|--------|---------|
| Original | [true,false,null, []] | |
| boost-json | (T NIL NIL NIL) | failed null |
| cl-json | (T NIL NIL NIL) | failed null |
| com.gigamonkeys.json | #(TRUE FALSE NULL #()) | |
| com.inuoe.jzon | #(T NIL NULL #()) | |
| jonathan | (T NIL NIL NIL) | failed null (but see next row) |
| jonathan (1) | (T NIL NIL :NULL) | |
| json-lib | #(T NIL NIL #()) | failed null |
| json-streams | (ARRAY T NIL NULL (ARRAY)) | correct this time, compared to decoding an object |
| jsown | (T NIL NIL NIL) | failed null |
| shasht | #(T NIL NULL #()) | |
| st-json | (TRUE FALSE NULL NIL) | |
| trivial-json-codec | #(T NIL NIL NIL) | failed null |
| yason | (T NIL NIL NIL) | failed null (but see next row) |
| yason (1) | (T NIL NIL :NULL) | |
• (1) Jonathan called with jonathan:*null-value* set to :null. Yason called with yason:*parse-json-null-as-keyword* set to t or passing the keyword parameter :json-nulls-as-keyword t to yason:parse

#### Decoding Json Arrays

Table 12: Array Mapping

| Library | Result | Comment |
|---------|--------|---------|
| Original | ["Skoda", "Peugeot", "SEAT"] | |
| boost-json | (Skoda Peugeot SEAT) | list |
| cl-json | (Skoda Peugeot SEAT) | list |
| com.gigamonkeys.json | #(Skoda Peugeot SEAT) | vector |
| com.inuoe.jzon | #(Skoda Peugeot SEAT) | vector |
| jonathan | (Skoda Peugeot SEAT) | list |
| json-lib | #(Skoda Peugeot SEAT) | vector |
| json-streams | (:ARRAY Skoda Peugeot SEAT) | cons |
| jsown | (Skoda Peugeot SEAT) | list |
| shasht | #(Skoda Peugeot SEAT) | vector |
| st-json | (Skoda Peugeot SEAT) | list |
| trivial-json-codec | #("Skoda" "Peugeot" "SEAT") | vector |
| yason | (Skoda Peugeot SEAT) | list |
| yason (1) | #("Skoda" "Peugeot" "SEAT") | vector |
• (1) Yason called with additional keyword parameter :json-arrays-as-vectors t
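Note (1) as a minimal sketch (results per Table 12):

```lisp
;; Default: json arrays decode to lists.
(yason:parse "[\"Skoda\", \"Peugeot\", \"SEAT\"]")
;; => ("Skoda" "Peugeot" "SEAT")

;; With the keyword parameter, arrays decode to vectors instead.
(yason:parse "[\"Skoda\", \"Peugeot\", \"SEAT\"]" :json-arrays-as-vectors t)
;; => #("Skoda" "Peugeot" "SEAT")
```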

#### Decoding Json Objects

For the json object examples, we will be using the following simple json string:

(defparameter *address-1* "{
\"name\": \"George Washington\",
\"birthday\": \"February 22, 1732\",
\"address\": \"Mount Vernon, Virginia, United States\"
}")

Table 13: Decoding Basic Json Objects

| Library | Result | Comment |
|---------|--------|---------|
| Original | {"name":"George Washington","birthday":"February 22, 1732","address":"Mount Vernon, Virginia, United States"} | |
| boost-json | #<JSON-OBJECT {"name":"George Washington","birthday":"February 22, 1732","address":"Mount Vernon, Virginia, United States"}> | |
| cl-json | ((NAME . George Washington) (BIRTHDAY . February 22, 1732) (ADDRESS . Mount Vernon, Virginia, United States)) | |
| com.gigamonkeys.json | (name George Washington birthday February 22, 1732 address Mount Vernon, Virginia, United States) | |
| com.inuoe.jzon | #<HASH-TABLE :TEST EQUAL :COUNT 3 {101F71B1E3}> | (1) |
| jonathan | ((address . Mount Vernon, Virginia, United States) (birthday . February 22, 1732) (name . George Washington)) | |
| json-lib | #<HASH-TABLE :TEST EQUAL :COUNT 3 {101F71CC63}> | (1) |
| json-streams | (OBJECT (name . George Washington) (birthday . February 22, 1732) (address . Mount Vernon, Virginia, United States)) | |
| jsown | (OBJ (name . George Washington) (birthday . February 22, 1732) (address . Mount Vernon, Virginia, United States)) | |
| shasht | #<HASH-TABLE :TEST EQUAL :COUNT 3 {101F71FA83}> | (1) |
| st-json | #S(JSO :ALIST ((name . George Washington) (birthday . February 22, 1732) (address . Mount Vernon, Virginia, United States))) | |
| trivial-json-codec | ((NAME George Washington) (BIRTHDAY February 22, 1732) (ADDRESS Mount Vernon, Virginia, United States)) | |
| yason | #<HASH-TABLE :TEST EQUAL :COUNT 3 {101F722863}> | (1) |
| yason-alist | ((address . Mount Vernon, Virginia, United States) (birthday . February 22, 1732) (name . George Washington)) | |
| yason-plist | (name George Washington birthday February 22, 1732 address Mount Vernon, Virginia, United States) | |
• (1) All four libraries that decode a json object to a CL hash use strings as hash keys by default. All four will allow you to optionally use keywords instead. Shasht and yason allow you to decode a json object to alists or plists instead of hashes.
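To pull values out of those hash-table results, a plain gethash with a string key works. A minimal sketch using yason and the *address-1* string defined above (the other hash-returning libraries produce similarly keyed tables):

```lisp
;; yason:parse returns a hash-table keyed by strings by default.
(let ((ht (yason:parse *address-1*)))
  (gethash "name" ht))
;; => "George Washington"
```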

#### Decoding Nested Json Objects

For nested object examples, we will be using the following simple json object:

(defparameter *nested-address-1* "{
\"first_name\": \"George\",
\"last_name\": \"Washington\",
\"birthday\": \"1732-02-22\",
\"address\": {
\"street_address\": \"3200 Mount Vernon Memorial Highway\",
\"city\": \"Mount Vernon\",
\"state\": \"Virginia\",
\"country\": \"United States\"
}
}")

Table 14: Decoding Nested Json Objects

| Library | Result | Comment |
|---------|--------|---------|
| Original | { "first_name": "George", "last_name": "Washington", "birthday": "1732-02-22", "address": { "street_address": "3200 Mount Vernon Memorial Highway", "city": "Mount Vernon", "state": "Virginia", "country": "United States" }} | |
| boost-json | #<JSON-OBJECT {"first_name":"George","last_name":"Washington", "birthday":"1732-02-22","address":#}> | Nested address info is enclosed in a nested json-object. See boost-json decoding nested objects to clos |
| cl-json | ((FIRST--NAME . George) (LAST--NAME . Washington) (BIRTHDAY . 1732-02-22) (ADDRESS (STREET--ADDRESS . 3200 Mount Vernon Memorial Highway) (CITY . Mount Vernon) (STATE . Virginia) (COUNTRY . United States))) | All info provided |
| com.gigamonkeys.json | (first_name George last_name Washington birthday 1732-02-22 address (street_address 3200 Mount Vernon Memorial Highway city Mount Vernon state Virginia country United States)) | All info provided, but as nested plists |
| com.inuoe.jzon | ((address . #<HASH-TABLE :TEST EQUAL :COUNT 4 {10092D4753}>) (birthday . 1732-02-22) (last_name . Washington) (first_name . George)) | All info available, but shows the need to recursively apply different functions to pull out the nested hashes created by com.inuoe.jzon |
| jonathan | ((address (country . United States) (state . Virginia) (city . Mount Vernon) (street_address . 3200 Mount Vernon Memorial Highway)) (birthday . 1732-02-22) (last_name . Washington) (first_name . George)) | All info provided |
| json-lib | ((address . #<HASH-TABLE :TEST EQUAL :COUNT 4 {10092DA153}>) (birthday . 1732-02-22) (last_name . Washington) (first_name . George)) | All info available, but shows the need to recursively apply different functions to pull out the nested hashes created by json-lib |
| json-streams | (OBJECT (first_name . George) (last_name . Washington) (birthday . 1732-02-22) (address OBJECT (street_address . 3200 Mount Vernon Memorial Highway) (city . Mount Vernon) (state . Virginia) (country . United States))) | All info provided |
| jsown | (OBJ (first_name . George) (last_name . Washington) (birthday . 1732-02-22) (address OBJ (street_address . 3200 Mount Vernon Memorial Highway) (city . Mount Vernon) (state . Virginia) (country . United States))) | All info provided in nested jsown objects |
| shasht | ((address . #<HASH-TABLE :TEST EQUAL :COUNT 4 {10092E2203}>) (birthday . 1732-02-22) (last_name . Washington) (first_name . George)) | All info available, but shows the need to recursively apply different functions to pull out the nested hashes created by shasht |
| st-json | #S(JSO :ALIST ((first_name . George) (last_name . Washington) (birthday . 1732-02-22) (address . #S(JSO :ALIST ((street_address . 3200 Mount Vernon Memorial Highway) (city . Mount Vernon) (state . Virginia) (country . United States)))))) | All info provided in nested st-json objects |
| trivial-json-codec | ((:FIRST_NAME "George") (:LAST_NAME "Washington") (:BIRTHDAY "1732-02-22") (:ADDRESS ((:STREET_ADDRESS "3200 Mount Vernon Memorial Highway") (:CITY "Mount Vernon") (:STATE "Virginia") (:COUNTRY "United States")))) | |
| yason | ((address . #<HASH-TABLE :TEST EQUAL :COUNT 4 {10092E6003}>) (birthday . 1732-02-22) (last_name . Washington) (first_name . George)) | All info available, but shows the need to recursively apply different functions to pull out the nested hashes created by yason |

### Converting json data to a CLOS object

Which libraries have some built-in ability to convert a json object to a CLOS object?

Table 15: Decoding Json Object to a CLOS Object

| Library | Function | Built-in? |
|---------|----------|-----------|
| boost-json | json-decode | YES (1) |
| cl-json | decode-json-from-string | YES (2) |
| com.gigamonkeys.json | | NO |
| com.inuoe.jzon | | NO |
| jonathan | | NO |
| json-lib | | NO |
| json-streams | | NO |
| jsown | | NO |
| shasht | | NO |
| st-json | | NO (3) |
| trivial-json-codec | deserialize-json | YES (4) |
| yason | | NO (5) |
• (1) boost-json creates a boost-json:json-object which is a standard-object. Accessing the slots is done with boost-json:json-getf and boost-json:json-setf functions. See Boost-json decoding to CLOS
• (2) cl-json can create a cl-json:fluid-class which is a standard-object. Accessing the slots is done with the name of the slot. See cl-json-data-to-clos.
• (3) The st-json "object" jso is a struct, not a CLOS object.
• (4) You need to define your classes first
• (5) Possible with helper libraries json-mop or herodotus.

For example, consider two address json objects:

  *address-1*
"{
\"name\": \"George Washington\",
\"birthday\": \"February 22, 1732\",
\"address\": \"Mount Vernon, Virginia, United States\"
}"

"{
\"first_name\": \"George\",
\"last_name\": \"Washington\",
\"birthday\": \"1732-02-22\",
\"address\": {
\"street_address\": \"3200 Mount Vernon Memorial Highway\",
\"city\": \"Mount Vernon\",
\"state\": \"Virginia\",
\"country\": \"United States\"
}
}"


Obviously you can always write your own function to initialize a specific class from an alist, but we do have two libraries (boost-json and cl-json) that try to do something like this for you automagically. Let's take a look.
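Such a hand-rolled initializer might look roughly like this (a hypothetical helper, not part of any of the libraries reviewed; the name alist->instance and the key-to-initarg convention are my own assumptions):

```lisp
(defun alist->instance (class alist)
  "Make an instance of CLASS, treating each (key . value) pair in ALIST
as an initarg. Keys are upcased and interned as keywords, so a pair
like (\"name\" . \"George\") becomes :NAME \"George\"."
  (apply #'make-instance class
         (loop for (key . value) in alist
               collect (intern (string-upcase (string key)) :keyword)
               collect value)))

;; Usage sketch, assuming a PERSON class with :NAME and :ADDRESS initargs:
;; (alist->instance 'person '(("name" . "George") ("address" . "Mount Vernon")))
```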

#### Boost-json

Boost-json will decode the nested json object into an instance of a CLOS class called boost-json:json-object.

(boost-json:json-decode *nested-address-1*)
#<BOOST-JSON:JSON-OBJECT


It appears that you need to call (boost-json:json-getf obj key) as the accessor. E.g.

(let ((data (boost-json:json-decode *nested-address-1*)))
(setf (boost-json:json-getf data "first_name") "Michael")
data)
#<BOOST-JSON:JSON-OBJECT


#### Cl-json

cl-json can decode json objects and convert them to cl-json:fluid-class CLOS objects. You do need to, at least temporarily, switch the decoder to simple-clos-semantics and set *json-symbols-package* to nil. (Note: this is thread-unsafe unless you have pre-created the relevant classes. Per the documentation: "To maintain the mapping between lists of superclass names and fluid classes, the decoder maintains a class registry. Thus, using fluid objects makes the CLOS decoder essentially thread-unsafe. (If every incoming json Object is guaranteed to have a prototype with a "lispClass" member then there are no fluid objects and thread safety is ensured.) If the user wishes to employ fluid objects in a threaded environment it is advisable to wrap the body of entry-point functions in with-local-class-registry.")
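The documentation's suggestion for threaded environments looks roughly like this (a sketch; it assumes the with-local-class-registry macro quoted above and the *address-1* string defined earlier):

```lisp
;; Wrap the decode in a local class registry so that fluid classes
;; created here do not touch the global registry shared across threads.
(cl-json:with-local-class-registry ()
  (cl-json:with-decoder-simple-clos-semantics
    (let ((cl-json:*json-symbols-package* nil))
      (cl-json:decode-json-from-string *address-1*))))
```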

First, looking at the simpler version, notice you need to specify the slots in the fluid-class object:

(cl-json:with-decoder-simple-clos-semantics
  (setf cl-json:*json-symbols-package* nil)
  (let ((x (cl-json:decode-json-from-string *address-1*)))
    (with-slots (name birthday address) x
      (values x name birthday address))))
#<#<JSON:FLUID-CLASS COMMON-LISP:NIL {100F765713}> {100FB22513}>
"George Washington"
"February 22, 1732"
"Mount Vernon, Virginia, United States"


Now looking at the nested version, we need to note that by default cl-json will convert the underscores in the json keys to double hyphens in the slot names.

(cl-json:with-decoder-simple-clos-semantics
  (setf cl-json:*json-symbols-package* nil)
  (let ((x (cl-json:decode-json-from-string *nested-address-1*)))
    (with-slots (first--name last--name birthday address) x
      (values x first--name last--name birthday address))))
#<#<JSON:FLUID-CLASS COMMON-LISP:NIL {100F765713}> {10107CF9E3}>
"George"
"Washington"
"1732-02-22"
#<#<JSON:FLUID-CLASS COMMON-LISP:NIL {100F765713}> {10107CF6F3}>


Because we have a nested class, we would need to drill down and specify the slots for the sub-object as well:

(cl-json:with-decoder-simple-clos-semantics
  (setf cl-json:*json-symbols-package* nil)
  (let ((x (cl-json:decode-json-from-string *nested-address-1*)))
    (with-slots (first--name last--name birthday address) x
      (with-slots (city) address
        (values x first--name last--name birthday address city)))))
#<#<JSON:FLUID-CLASS COMMON-LISP:NIL {100F765713}> {1010E69B93}>
"George"
"Washington"
"1732-02-22"
#<#<JSON:FLUID-CLASS COMMON-LISP:NIL {100F765713}> {1010E698A3}>
"Mount Vernon"


#### Trivial-json-codec

If you have defined your classes, trivial-json-codec does make it easy to decode json data directly to a vector of your classes. Suppose we have a simple person class:

(defclass person ()
  ((name :initarg :name
         :initform "Sabra"
         :accessor name)
   (eye-colour :initarg :eye-colour
               :initform "brown"
               :accessor eye-colour)))


If we have a vector of json objects which are all data for a person class, we can automatically build a vector of persons by specifying the class we want to use:

(let ((data (trivial-json-codec:deserialize-json
             "[{\"name\":\"Claudia\",\"eye-colour\":\"blue\"},
               {\"name\":\"Johann\",\"eye-colour\":\"brown\"}]"
             :class (find-class 'person))))
  (name (aref data 1)))
"Johann"


There is no magic if the json array has objects of different types.

### Extracting a Subset of a json object

Suppose you just want a subset of data from a json file. Obviously every library allows you to parse the entire json file and then use standard lisp functions to pull out the subset you want. Only a few libraries will extract the desired subset directly.

Table 16: Extracting a Data Subset from Json Object
Library Result Comment
boost-json NO
cl-json NO But see cl-json-helper's json-key-value function
com.gigamonkeys.json NO
com.inuoe.jzon NO
jonathan YES First level of data only. Also remember jonathan only reads strings, not streams
json-lib NO
json-streams NO
jsown YES But remember jsown only reads strings, not streams
shasht NO
st-json NO
yason

### Determining Object Keywords

Jsown has the ability, once a json object has been parsed into a jsown object, to get the keywords of the object. The following only gets the first level keywords.

(jsown:keywords (jsown:parse *nested-address-1*))



### Handling NIL

This is a test of how the library function for reading from a string handles nil. Errors are not surprising. What was surprising was shasht:read-json hanging when given a nil.

Table 17: Handling NIL
Library Function Result Comment
boost-json json-decode Error
cl-json json-decode-from-string Error
com.gigamonkeys.json parse-json NIL
com.inuoe.jzon parse Error
jonathan parse Error
json-lib parse ""
json-streams json-parse Error
jsown parse NIL
st-json deserialize-raw Error
yason parse Error

## Encoding Lisp Data to Json

Table 18: Basic Encoding Functions for Each Library
Library Base Function Specialist Functions
boost-json json-encode
cl-json encode-json encode-json-to-string, encode-json-alist, encode-json-alist-to-string, encode-json-plist, encode-json-plist-to-string, encode-object-member, encode-array-member
com.gigamonkeys.json to-json, write-json
com.inuoe.jzon stringify
jonathan to-json
json-lib stringify
json-streams json-stringify
jsown to-json to-json*
shasht write-json write-json-string, write-json*
st-json write-json write-json-to-string, write-json-element
trivial-json-codec (1) serialize serialize-json
yason encode encode-alist, encode-plist, encode-object, encode-slots, encode-object-element, encode-array-element, encode-array-elements
• (1) As previously noted, trivial-json-codec is really intended as a one-way parser from json to CL, not as a serializer to json.

### Encoding to Streams or Strings

Does the library take both strings and streams as output?

Table 19: Strings and Streams As Output
Library String Function Stream Function
boost-json json-encode json-encode
cl-json encode-json-to-string encode-json
com.gigamonkeys.json to-json write-json
com.inuoe.jzon stringify stringify
jonathan to-json with-output …
json-lib stringify No
json-streams json-stringify with-json-output
jsown to-json No
shasht write-json write-json
st-json write-json-to-string write-json
trivial-json-codec serialize or serialize-json serialize
yason encode encode

### Encoding Symbols, Chars, T, nil and :null

Several libraries have an issue encoding symbols. Typically this can be resolved by writing a method for handling symbols. Of course, if you have symbols in your lisp data, you might want to look at libraries which handle them without any additional effort on your part.

As mentioned when discussing mapping from JSON to CL, there is a problem with null in that it represents "unknown", not false. CL does not have that as a data type, which is unfortunate. You cannot determine, absent other additional data, whether CL's NIL = [] or NIL = False or "unknown". As a result, you really have to pay attention serializing to and from JSON as to what you want nil to represent and whether it is unacceptably overloaded.
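A quick illustration of the overload, using cl-json's defaults (other libraries make different choices, as Table 20 shows):

```lisp
;; NIL is simultaneously the empty list, boolean false and "no value"
;; in CL, so an encoder must pick a single json meaning for it.
(cl-json:encode-json-to-string nil)        ; => "null" under cl-json's defaults
(cl-json:encode-json-to-string '(1 nil 2)) ; a nil inside a list gets the
                                           ; same treatment
```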

The following table uses 'a as the symbol input and #\C as a character input. :null was used as a proxy for null.

Table 20: Quick Summary - Encoding Functionality 1
Library Function Symbol Char T Nil/:NULL
boost-json json-encode "\"A\"" Error (3) "true" "null"/"NULL"
cl-json encode-json "a" "C" true null/"null" (4)
com.gigamonkeys.json write-json Error (1) Hangs (3) true {}/null
com.inuoe.jzon stringify "\"A\"" [] (3) "true" "false"/"NULL"
jonathan to-json "\"A\"" Error (3) "true" "[]"/"null"
json-lib stringify "null" (2) "null" (3) "true" "null"/"null"
json-streams json-stringify Error (3) Error (3) "true" "false"/"null"
jsown to-json "\"A\"" Error (3) "true" "[]"/"null"
shasht write-json "A" "C" "true" false/null
st-json write-json Error (3) Error (3) "true" []/null
trivial-json-codec serialize-json "A" Error (3) "true" "null"/":NULL"
yason encode Error (5) Error (3) true null/null
• (1) Keywords Only
• (2) returns "null"
• (3) You could write a method to handle this type. For jonathan, see jonathan-encoding. For st-json see st-json-encoding.
• (4) You can use the helper library cl-json-helper to encode nil as "false".
• (5) Using yason to encode a symbol would require the use of yason:encode-symbol-as-lowercase rather than just the simple yason:encode
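As a sketch of what footnote (3) suggests, here is a hypothetical method teaching st-json to encode a character as a one-character string. It assumes you are willing to specialize the internal st-json::write-json-element generic used elsewhere in this review:

```lisp
;; Hypothetical specialization: emit a CL character as a one-character
;; json string instead of signalling an error.
(defmethod st-json::write-json-element ((element character) stream)
  (st-json::write-json-element (string element) stream))
```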

### Encoding Numbers

There are no surprises in encoding integers. Floats are generally fine, depending on your view of rounding and the use of exponents. Ratios get slightly more interesting. Json-streams and trivial-json-codec were the only libraries to refuse to encode a ratio. Json-lib wrote it as the string "null". The rest converted it to some form of decimal number. The following table uses 3.675 as the float sample and 9/4 as the ratio sample.

Table 21: Encoding Floats and Ratios
Library Function Float Ratios
Original Numbers   3.675 9/4

boost-json json-encode 3.675 "2.25"
cl-json encode-json 3.675 2.25
com.gigamonkeys.json write-json 3.674999952316284 2.25
com.inuoe.jzon stringify 3.675 "2.25"
jonathan to-json 3.675 "2.25"
json-lib (1) stringify 3.675 "null"
json-streams json-stringify 3.675 Error (2)
jsown to-json 3.675 "2.25"
shasht write-json 3.675e+0 2.25e+0
st-json write-json 0.3675e+1 0.2373e+2
trivial-json-codec serialize-json 3.675 Error (2)
yason encode 3.674999952316284 2.25

(1) As of the time of writing, json-lib only handles integers and floats, not ratios. (2) Error: JSON write error: Number must be integer or float, got 9/4.
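If your library of choice rejects ratios, coercing them to floats before encoding is the obvious workaround. A minimal sketch (the helper name is mine; the exact float formatting will vary by library, as Table 21 shows):

```lisp
;; Walk a tree, replacing ratios with double-floats so libraries such
;; as json-streams and trivial-json-codec never see them.
(defun ratios-to-floats (tree)
  (cond ((typep tree 'ratio) (coerce tree 'double-float))
        ((consp tree) (mapcar #'ratios-to-floats tree))
        (t tree)))

(cl-json:encode-json-to-string (ratios-to-floats (list 1 9/4 "x")))
```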

### Encoding Pathnames

Just to see what the libraries do with pathnames, I used #P"/home/sabra" as sample data. Out of the box, only boost-json and shasht turned it into a string. cl-json, jonathan, jsown, st-json, trivial-json-codec and yason indicate that you could write a method to handle the pathname datatype.

Table 22: Encoding Pathnames
Library Function Result
boost-json json-encode "/home/sabra"
cl-json encode-json Not of a type which can be encoded by encode-json
com.gigamonkeys.json write-json hangs
com.inuoe.jzon stringify {}
jonathan to-json No applicable method
json-lib stringify null
json-streams json-stringify Fell through etypecase expression
jsown to-json No applicable method
shasht write-json "home/sabra"
st-json write-json Cannot write object of type Pathname to json
trivial-json-codec serialize-json No applicable method
yason encode No applicable method
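For the libraries that merely report "no applicable method", the fix can be a one-liner. A hypothetical specialization for yason, whose encode generic dispatches on the object's type:

```lisp
;; Hypothetical: encode pathnames as their namestrings.
(defmethod yason:encode ((object pathname) &optional (stream *standard-output*))
  (yason:encode (namestring object) stream))
```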

### Encoding Local-Time Timestamps

Again, just to see what the libraries do with timestamps, I used (local-time:now) as sample data. This is actually a preview of what happens with CLOS objects. Several libraries handled the timestamp and returned a json object, but none returned anything resembling a json date string. As you can see, cl-json, com.inuoe.jzon, shasht and trivial-json-codec returned an object of the timestamp's internal slots. Boost-json, jonathan, jsown, st-json and yason all indicate you could write a method to handle the timestamp.

Table 23: Encoding Timestamps
Library Function Result
boost-json json-encode There is no applicable method for the generic function
cl-json encode-json {"day":7990,"sec":48161,"nsec":580943000}
com.gigamonkeys.json write-json hangs
com.inuoe.jzon stringify {"day":7990,"sec":48161,"nsec":580943000}
jonathan to-json There is no applicable method for the generic function
json-lib stringify null
json-streams json-stringify @2022-01-15T08:22:41.580943-05:00 fell through ETYPECASE expression.
jsown to-json There is no applicable method for the generic function
shasht write-json { "DAY": 7990, "SEC": 48161, "NSEC": 580943000}
st-json write-json Can not write object of type TIMESTAMP as JSON.
trivial-json-codec serialize-json { "DAY" : 7990, "SEC" : 48161, "NSEC" : 580943000}
yason encode There is no applicable method for the generic function
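The same method-writing approach applies to timestamps. A sketch for st-json using local-time's formatter, again assuming you are willing to specialize the internal st-json::write-json-element generic:

```lisp
;; Hypothetical: emit a local-time timestamp as an ISO-8601 json string.
(defmethod st-json::write-json-element ((element local-time:timestamp) stream)
  (st-json::write-json-element
   (local-time:format-timestring nil element) stream))
```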

### Encoding Data Structures to Json (Summary)

The following table is just a quick summary of library functionality for arrays, hashtables, CLOS objects and structs. More detail is provided in the specific section for each of those categories of data structures.

Table 24: Quick Summary - Encoding Functionality Array/Hashtable/CLOS
Library Function Vectors Hash-table Object Struct
boost-json json-encode (6) (1) (4) (4)
cl-json encode-json-to-string YES YES YES (4)
com.gigamonkeys.json write-json YES (2) hangs hangs
com.inuoe.jzon stringify YES YES YES (5) YES
jonathan to-json (8) YES YES (4) (4)
json-lib stringify YES YES "null" NO
json-streams json-stringify (3) (3) (3) (3)
jsown to-json YES (10) YES (4) (4)
shasht write-json YES (10) YES YES YES
st-json write-json (7) (1) (4) (4)
trivial-json-codec serialize-json YES (4) YES (4)
yason encode-* (9) YES (1) (4) (4)
• (1) Succeeds if the hash-table keys are strings, fails if they are symbols
• (2) Invalid results if given a list inside the hash-table or errors if the hash-table keys are symbols
• (3) The basic json-stringify function does not handle data structures, so you need to resort to more complex calls. See json-streams-encoding-hash-tables or json-streams-encoding-arrays for examples.
• (4) May be able to resolve if you write a specialized method.
• (5) Automatically handles standard CLOS objects and also allows you to specialize
• (6) Invalid results. Sample output on a simple nested array looked like: [,"Cork","Limerick"][,[,"Frankfurt","Munich"]]
• (7) You need to write your own st-json::write-json-element function for arrays. See st-json-encoding.
• (8) Typically for these objects jonathan requires extra keyword parameters like :from :alist
• (9) May require using one of the more specialized functions such as encode-alist etc.
• (10) jsown and shasht are the only libraries which can handle multi-dimensional arrays.

### Encoding Lists (Summary)

By default, all libraries except json-streams:json-stringify (which does not accept lists) and trivial-json-codec (which generates invalid json) will return a json array when provided a plain CL list. Some libraries will not accept non-keyword symbols in a list (or will need an additional parameter). If you need to keep the key-value connections of a plist, you may need to convert the plist to another form or use a more specific function. More detail can be found below plain-lists, encoding alists, encoding plists.

In general, plain lists and plists are returned as json arrays unless some other keyword parameter is provided. If the library handles alists, they may be returned as json objects enclosing arrays, arrays enclosing arrays, or other variations. See encoding-alists for more detail. Please see the footnotes to check whether libraries have issues with symbols (including keyword symbols).

Table 25: Quick Summary - Encoding Functionality Lists
Library Function lists alists plists
boost-json json-encode YES (7) YES
cl-json encode-json YES YES (8)
cl-json encode-json-alist YES YES (11)
cl-json encode-json-plist YES (9) YES
com.gigamonkeys.json write-json (2)(12) (1) (2)
com.inuoe.jzon stringify YES YES YES
jonathan to-json YES YES (13) YES
json-lib stringify (4) (1) YES
json-streams json-stringify (6) (6) (6)
jsown to-json YES (1) (10) (3)
shasht write-json YES YES YES
st-json write-json YES YES (10) (3)
trivial-json-codec serialize-json (14) (14) (14)
yason encode (5) (5) (5)
yason encode-plist (12) YES YES
yason encode-alist (5) YES (3)
• (1) com.gigamonkeys.json, json-lib cannot deal with alists directly. Consider using alexandria:alist-hash-table to convert the alist to a hash table.
• (2) Symbols are allowed only if they are keyword symbols, otherwise com.gigamonkeys.json will error.
• (3) Plists are treated the same as plain lists and will lose their key-value connections. Convert to a hash table first.
• (4) Symbols are allowed only if they are keyword symbols. Json-lib will convert a non-keyword symbol to null.
• (5) If it is a plain list or plist, yason:encode and yason:encode-alist will not accept symbols in the list. If it is an alist which has symbols, it will accept them as keys if you run (setf yason:*symbol-key-encoder* 'yason:ENCODE-SYMBOL-AS-LOWERCASE) first. Otherwise it errors.
• (6) Json-streams:json-stringify does not accept lists as input. You would need to use lower level components of json-streams. See json-streams-encoding.
• (7) Fail - the json arrays are invalid. They have ',.' or ',' rather than ':' depending on whether they are dotted cons cells or not.
• (8) As you might expect, plists are treated the same as plain lists and will lose their key-value connections. If you want to keep the key-value connections, you can either convert the list to an alist or hash-table or use the cl-json:encode-json-plist function.
• (9) This function is plist specific and will error if provided an alist.
• (10) If the alist has dotted cons cells, jsown and st-json triggered unhandled memory faults with sbcl 2.1.11-x86-64-linux, ccl version 1.12.1 LinuxX8664 and ecl-version 21.2.1. This appears to be because jsown and st-json have optimized the code and assumed that all lists will be proper lists. The assumption obviously fails in the context of dotted cons cells.
• (11) The *-alist functions require an alist. Shocking, I know.
• (12) com.gigamonkeys.json will assume a plain list is a plist, returning a json object with key-value pairs. If the length of the list is odd, the final value in the list will be treated as a key and an empty set will be inserted as the value.
• (13) For jonathan to properly handle alists, you need to add the additional keyword parameters :from :alist
• (14) Trivial-json-codec is going to give us invalid json from now on with respect to tests involving lists - e.g. '(a b c)) becomes "<A,B,C>", so we will drop it from the rest of the encoding tests involving lists.
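As several of the footnotes suggest, converting a plist or alist to a hash table first sidesteps the list-interpretation guesswork entirely. A sketch using alexandria (the same helper the hash-table tests below rely on):

```lisp
;; A plist pushed through a hash table comes out as a json object in
;; any library from Table 24 that accepts hash tables.
(yason:encode
 (alexandria:plist-hash-table '("a" 1 "b" 2) :test #'equal))
```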

### Encoding Plain Lists

As noted above, if you had symbols in a list, libraries like json-streams and st-json are going to error out because they need additional methods, while com.gigamonkeys.json accepts only keyword symbols. In the below table, I included trivial-json-codec's results without the error that it generates on ratio numbers. Notice that gigamonkeys is interpreting the list as a plist and trying to create key:value pairs.

Using '("A" "b" 4 3.2 9/4) as the sample data

Table 26: Encoding Lists without Symbols
Library Function Result
Original data   '("A" "b" 4 3.2 9/4)

boost-json json-encode ["A","b",4,3.2,2.25]
cl-json encode-json ["A","b",4,3.2,2.25]
com.gigamonkeys.json write-json {"A":"b","4":3.200000047683716,"2.25":{}}
com.inuoe.jzon stringify ["A","b",4,3.2,2.25]
jonathan to-json "[\"A\",\"b\",4,3.2,2.25]"
json-lib stringify "[\"A\", \"b\", 4, 3.2, 9/4]"
json-streams json-stringify Fail on list, (1)
jsown to-json "[\"A\",\"b\",4,3.2,2.25]"
shasht write-json ["A","b",4,3.2e+0,2.25e+0]
st-json write-json ["A","b",4,0.32e+1,0.225e+1]
trivial-json-codec (2) serialize-json "<\"A\",\"b\",4,3.2>"
yason encode ["A","b",4,3.200000047683716,2.25]
• (1) See json-streams-encoding.
• (2) I have no idea why trivial-json-codec wants to generate angle brackets.

### Encoding Alists

I discovered that many libraries have problems if the alist components are dotted pairs (improper lists). So we break alists down between alists with dotted pairs and alists without dotted pairs. We will keep symbols out of the sample data because we already know from encoding-symbols that some libraries will have issues with alists containing symbols as keys. Of note:

• cl-json provides a function to encode the alist properly as key:value,
• com.inuoe.jzon correctly guesses this is an alist which should be encoded as key:value,
• shasht:*write-alist-as-object* provides the ability to write an undotted alist as a json object rather than an array.
• yason provides a specific function to encode the alist properly as key:value,
• If the alist has dotted cons cells, jsown and st-json triggered unhandled memory faults with sbcl 2.1.11-x86-64-linux, ccl version 1.12.1 LinuxX8664 and ecl-version 21.2.1. This appears to be because jsown and st-json have optimized the code and assumed that all lists will be proper lists. The assumption obviously fails in the context of dotted cons cells.

##### Dotted cons cells (dotted pairs)

Using '(("A" . 1) ("B" . 2) ("C" . 3)) as the sample data

Table 27: Encoding Dotted Alists Without Symbols
Library Function Result
Original data   '(("A" . 1) ("B" . 2) ("C" . 3))

boost-json (1) json-encode [["A",. 1],["B",. 2],["C",. 3]]
cl-json encode-json {"A":1,"B":2,"C":3}
cl-json encode-json-alist {"A":1,"B":2,"C":3}
com.gigamonkeys.json write-json Error:Can't stringify (A . 1)
com.inuoe.jzon stringify "{\"A\":1,\"B\":2,\"C\":3}"
jonathan to-json Error: value 1 is not of type list
jonathan to-json XXX :from :alist "{\"A\":1,\"B\":2,\"C\":3}"
json-lib (2) stringify Error value 1 is not of type list
json-streams json-stringify Error (5)
jsown (4) to-json Unhandled memory fault
shasht (3) write-json { "A": 1, "B": 2, "C": 3}
st-json (4) write-json Unhandled memory fault
yason encode-alist {"A":1,"B":2,"C":3}
• (1) Fail - the json arrays are invalid. They have ',.' rather than ','
• (2) You may be able to write a new method to handle dotted pairs.
• (3) Only if (setf shasht:*write-alist-as-object* t), which is not the default. Otherwise it generates an error.
• (4) If the alist has dotted cons cells, jsown and st-json triggered unhandled memory faults with sbcl 2.1.11-x86-64-linux, ccl version 1.12.1 LinuxX8664 and ecl-version 21.2.1. This appears to be because jsown and st-json have optimized the code and assumed that all lists will be proper lists. The assumption obviously fails in the context of dotted cons cells.
• (5) You would need to use lower level components of json-streams. See json-streams-encoding.

Now we change the data slightly, still using dotted pairs, but with a nested list as the value in one pair: Using '(("foo" . "bar") ("baz" . ((1 2 3) (4 5 6)))) as the sample data.

Table 28: Encoding Dotted Alists without symbols with more complex values
Library Function Result
Original data   '(("foo" . "bar") ("baz" . ((1 2 3) (4 5 6))))

boost-json (1) json-encode [["foo",. "bar"],["baz",[1,2,3],[4,5,6]]]
cl-json encode-json {"foo":"bar","baz":[[1,2,3],[4,5,6]]}
cl-json encode-json-alist {"foo":"bar","baz":[[1,2,3],[4,5,6]]}
com.gigamonkeys.json write-json Error:Can't stringify (foo . bar)
com.inuoe.jzon stringify {"foo":"bar","baz":[[1,2,3],[4,5,6]]}
jonathan to-json Error: value 1 is not of type list
jonathan to-json XXX :from :alist {"foo":"bar","baz":{"1":[2,3],"4":[5,6]}}
json-lib (2) stringify Error value "bar" is not of type list
json-streams json-stringify Error (5)
jsown (4) to-json Unhandled memory fault
shasht (3) write-json Error: The value "bar" is not of type LIST
shasht (3a) write-json {"foo":"bar","baz":{1:[2,3],4:[5,6]}}
st-json (4) write-json Unhandled memory fault
yason encode Error: "bar" is not of type list
yason encode-alist {"foo":"bar","baz":[[1,2,3],[4,5,6]]}
• (1) Fail - the json arrays are invalid. They have ',.' rather than ','
• (2) You may be able to write a new method to handle dotted pairs.
• (3) This error is generated if shasht:*write-alist-as-object* is nil (the default).
• (3a) If shasht:*write-alist-as-object* is t (not the default).
• (4) If the alist has dotted cons cells, jsown and st-json triggered unhandled memory faults with sbcl 2.1.11-x86-64-linux, ccl version 1.12.1 LinuxX8664 and ecl-version 21.2.1. This appears to be because jsown and st-json have optimized the code and assumed that all lists will be proper lists. The assumption obviously fails in the context of dotted cons cells.
• (5) You would need to use lower level components of json-streams. See json-streams-encoding.

Notice the difference in the successful results. cl-json and yason's result of {"foo":"bar","baz":[[1,2,3],[4,5,6]]} is what I would have expected. jonathan tried to force a key value pair into the second alist value. {"foo":"bar","baz":{"1":[2,3],"4":[5,6]}}.

com.inuoe.jzon is a special case here. It uses heuristics to predict what data structure it has been given, and this particular sample works because it does not try to make an integer into a key. If the sample data had been something like

'(("foo" . "bar") ("baz" . (("A" 2 3) ("B" 5 6))))


it would have predicted that "A" and "B" should be keys and would have returned

"{\"foo\":\"bar\",\"baz\":{\"A\":[2,3],\"B\":[5,6]}}"


which may or may not have been what you wanted.

##### Without dotted cons cells

Using '(("A" 1) ("B" 2) ("C" 3)) as the sample data.

Note the difference in results. Some libraries return arrays, others return objects. Some libraries return key:[value] pairs, others return two-member arrays.

Table 29: Encoding Undotted Alists with simple values
Library Function Result
Original data   '(("A" 1) ("B" 2) ("C" 3))

boost-json boost-json-write-to-string [["A",1],["B",2],["C",3]]
cl-json encode-json [["A",1],["B",2],["C",3]]
cl-json encode-json-alist {"A":[1],"B":[2],"C":[3]}
com.gigamonkeys.json gigamonkeys-write-to-string Can't stringify (A 1)
com.inuoe.jzon stringify {"A":[1],"B":[2],"C":[3]}
jonathan jonathan-to-json-alist {"A":[1],"B":[2],"C":[3]}
json-lib stringify [["A",1],["B",2],["C",3]]
json-streams json-stringify Error (1)
jsown to-json [["A",1],["B",2],["C",3]]
shasht write-json [["A",1],["B",2],["C",3]]
st-json write-json-to-string [["A",1],["B",2],["C",3]]
yason encode [["A",1],["B",2],["C",3]]
yason encode-alist {"A":[1],"B":[2],"C":[3]}
• (1) Using json-stringify caused the alist used as data to fall through etypecase expression demanding a json-streams:json-array. That means that you have to fall back to more complicated calls. See json-streams-encoding.

Let's make the value portion of the alist another alist. Now using '(("foo" "bar") ("baz" ((1 2 3) (4 5 6)))) as the sample data. The results may or may not be what you want, so look at them carefully.

Table 30: Encoding Undotted Alists with more complex values (1)
Library Function Result
Original data   '(("foo" "bar") ("baz" ((1 2 3) (4 5 6))))

boost-json json-encode [["foo","bar"],["baz",[[1,2,3],[4,5,6]]]]
cl-json encode-json [["foo","bar"],["baz",[[1,2,3],[4,5,6]]]]
cl-json encode-json-alist [["foo","bar"],["baz",[[1,2,3],[4,5,6]]]]
com.gigamonkeys.json write-json Can't stringify (foo bar)
com.inuoe.jzon stringify {"foo":["bar"],"baz":[[[1,2,3],[4,5,6]]]}
jonathan to-json XX :from :alist {"foo":["bar"],"baz":[{"1":[2,3],"4":[5,6]}]}
json-lib stringify [["foo", "bar"], ["baz",[[1,2,3],[4,5,6]]]]
json-streams json-stringify Error (1)
jsown to-json [["foo","bar"],["baz",[[1,2,3],[4,5,6]]]]
shasht write-json [["foo","bar"],["baz",[[1,2,3],[4,5,6]]]]
st-json write-json [["foo","bar"],["baz",[[1,2,3],[4,5,6]]]]
yason encode [["foo","bar"],["baz",[[1,2,3],[4,5,6]]]]
yason encode-alist {"foo":["bar"],"baz":[[[1,2,3],[4,5,6]]]}

Yason:encode-alist, com.inuoe.jzon and jonathan return json objects with key pairs; the rest return arrays. jonathan seems to be trying to force a key-value pair into the baz values, which is probably not what you want. Yason's encode-alist and com.inuoe.jzon's heuristics return the same result. Let's change the data slightly and just look at these two libraries again.

Table 31: Encoding Undotted Alists with more complex values (2)
Library Function Result
New data   '(("foo" "bar") ("baz" ((1 2 3) ("A" 5 6))))

com.inuoe.jzon stringify {"foo":["bar"],"baz":[[[1,2,3],["A",5,6]]]}
yason encode [["foo","bar"],["baz",[[1,2,3],["A",5,6]]]]
yason encode-alist {"foo":["bar"],"baz":[[[1,2,3],["A",5,6]]]}

New data   '(("foo" "bar") ("baz" (("B" 2 3) ("A" 5 6))))
com.inuoe.jzon stringify {"foo":["bar"],"baz":[{"B":[2,3],"A":[5,6]}]}
yason encode [["foo","bar"],["baz",[["B",2,3],["A",5,6]]]]
yason encode-alist {"foo":["bar"],"baz":[[["B",2,3],["A",5,6]]]}

### Encoding Plists

• cl-json:encode-json (returns array) and cl-json:encode-json-plist (returns object). cl-json provides a specialized function to encode the plist properly as key:value
• com.inuoe.jzon correctly guesses this is a plist which should be encoded as key:value
• With keyword symbols, jonathan returns an object, but otherwise returns an array.
• yason provides a function to encode the plist properly as key:value

Using '("a" 1 "b" 2 "c" 3) as the sample data

Table 32: Encoding Plists
Library Function Result Comment
Original data   '("a" 1 "b" 2 "c" 3)

boost-json json-encode ["a",1,"b",2,"c",3] (1)
cl-json encode-json ["a",1,"b",2,"c",3] (1)
cl-json encode-json-plist {"a":1,"b":2,"c":3}
com.gigamonkeys.json write-json {"a":1,"b":2,"c":3}
com.inuoe.jzon stringify "{\"a\":1,\"b\":2,\"c\":3}"
jonathan to-json "[\"a\",1,\"b\",2,\"c\",3]" (1)
json-lib stringify "[\"a\", 1, \"b\", 2, \"c\", 3]"
json-streams json-stringify   Fail: (2)
jsown to-json "[\"a\",1,\"b\",2,\"c\",3]" (1)
shasht write-json ["a",1,"b",2,"c",3] (1)
st-json write-json "[\"a\",1,\"b\",2,\"c\",3]" (1)
yason encode ["a",1,"b",2,"c",3] (1)
yason encode-plist {"a":1,"b":2,"c":3}
• (1) Encoding to array but loses the obvious key-value connection.
• (2) You would need to use lower level components of json-streams. See json-streams-encoding.

### Encoding Arrays

#### Encoding Single Dimensional Vector

Boost-json loses the first value in the vector. Json-streams and st-json do not handle the vector. All the other libraries pass, although some get excited about the floating point number.

Using #("A" 1 2.3) as sample data.

Table 33: Encoding Vectors
Library Function Result
Original data   #("A" 1 2.3)

boost-json json-encode [,1,2.3]
cl-json encode-json ["A",1,2.3]
com.gigamonkeys.json write-json ["A",1,2.299999952316284]
com.inuoe.jzon stringify ["A",1,2.3]
jonathan to-json ["A",1,2.3]
json-lib stringify ["A", 1, 2.3]
json-streams (1) json-stringify #("A" 1 2.3) fell through ETYPECASE expression.
jsown to-json ["A",1,2.3]
shasht write-json ["A",1,2.2999999e+0]
st-json (2) write-json Cannot write object of type (SIMPLE-VECTOR 3) as JSON.
trivial-json-codec serialize-json "[\"A\",1,2.3]"
yason encode ["A",1,2.299999952316284]
• (1) You would need to use lower level components of json-streams. See json-streams-encoding.
• (2) You need to write your own st-json::write-json-element function for arrays. Possibly something like:
(defmethod st-json::write-json-element ((element vector) stream)
  (write-char #\[ stream)
  (loop :for val :across element
        :for first := t :then nil
        :unless first :do (write-char #\, stream)
        :do (st-json::write-json-element val stream))
  (write-char #\] stream))


See st-json-encoding.

#### Encoding Simple Bit Vector

Again boost-json, json-streams and st-json fail. The rest pass.

Using #*10110 as the sample data.

Table 34: Encoding Bit Vectors
Library Exported Function Result Comment
Original data   #*10110

boost-json json-encode Fail [,0,1,1,0] Lost first item
cl-json encode-json [1,0,1,1,0]
com.gigamonkeys.json write-json [1,0,1,1,0]
com.inuoe.jzon stringify [1,0,1,1,0]
jonathan to-json "[1,0,1,1,0]"
json-lib stringify "[1, 0, 1, 1, 0]"
json-streams json-stringify Fail Default fails (1)
jsown to-json "[1,0,1,1,0]"
shasht write-json [1,0,1,1,0]
st-json write-json Fail Default fails (2)
trivial-json-codec serialize-json "[1,0,1,1,0]"
yason encode [1,0,1,1,0]
• (1) You would need to use lower level components of json-streams. See json-streams-encoding.
• (2) Error: Can not write object of type (SIMPLE-BIT-VECTOR 5) as JSON. You can resolve this by writing your own st-json::write-json-element method for bit vectors. See st-json-encoding.

#### Encoding Nested Arrays

The following example uses a very simple nested array:

#(#("Dublin" "Cork" "Limerick") #("Berlin" "Frankfurt" "Munich"))

Table 35: Encoding Nested Array
Library Result Comment
Original data #(#("Dublin" "Cork" "Limerick") #("Berlin" "Frankfurt" "Munich"))

boost-json [,"Cork","Limerick"][,[,"Frankfurt","Munich"]] Fail
cl-json [["Dublin","Cork","Limerick"],["Berlin","Frankfurt","Munich"]]
gigamonkeys [["Dublin","Cork","Limerick"],["Berlin","Frankfurt","Munich"]]
jzon [["Dublin","Cork","Limerick"],["Berlin","Frankfurt","Munich"]]
jonathan [["Dublin","Cork","Limerick"],["Berlin","Frankfurt","Munich"]]
json-lib [["Dublin", "Cork", "Limerick"], ["Berlin", "Frankfurt", "Munich"]]
json-streams   (1)
jsown [["Dublin","Cork","Limerick"],["Berlin","Frankfurt","Munich"]]
shasht [["Dublin","Cork","Limerick"], ["Berlin","Frankfurt","Munich"]]
st-json   (2)
trivial-json-codec "[[\"Dublin\",\"Cork\",\"Limerick\"],[\"Berlin\",\"Frankfurt\",\"Munich\"]]"
yason [["Dublin","Cork","Limerick"],["Berlin","Frankfurt","Munich"]]
• (1) You would need to use lower level components of json-streams. See json-streams-encoding.
• (2) st-json cannot write vectors as json by default. You can resolve this by writing your own st-json::write-json-element method. See st-json-encoding.

#### Encoding 2 Dimensional Array Version

Just to see what happens if we give the libraries a two dimensional array, let's use #2A((1.0 1.0) (1.0 1.0) (1.0 1.0)) as the sample data.

Jsown and shasht are the only libraries to handle a two dimensional array without the user having to write additional methods.

Table 36: Encoding 2D Array
Library Function Result
Original data   #2A((1.0 1.0) (1.0 1.0) (1.0 1.0))

boost-json json-encode Error: No applicable method
cl-json encode-json Error: Unencodable value
com.gigamonkeys.json write-json Method emit-json hangs
com.inuoe.jzon stringify {}
jonathan to-json Error: No applicable method
json-lib stringify "null"
json-streams json-stringify Error: (1)
jsown to-json Passed "[[1.0,1.0],[1.0,1.0],[1.0,1.0]]"
shasht write-json Passed "[[1.0e+0,1.0e+0],[1.0e+0,1.0e+0],[1.0e+0,1.0e+0]]"
st-json write-json Error: Cannot write object as json (2)
trivial-json-codec serialize-json Error: value is not of type sequence
yason encode Error: No applicable method
• (1) You would need to use lower level components of json-streams. See json-streams-encoding.
• (2) You can resolve this by writing your own st-json::write-json-element method. See st-json-encoding.
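If you need a two dimensional array in one of the other libraries, one workaround is to present it as a vector of row vectors, which any library that passed Table 35 can encode. A hypothetical helper using displaced arrays:

```lisp
;; Present each row of a 2D array as a displaced 1D vector, giving the
;; nested-vector shape that the Table 35 encoders understand.
(defun rows-of (array)
  (let* ((rows (array-dimension array 0))
         (cols (array-dimension array 1))
         (result (make-array rows)))
    (dotimes (i rows result)
      (setf (aref result i)
            (make-array cols :displaced-to array
                             :displaced-index-offset (* i cols))))))
```

Encoding (rows-of #2A((1.0 1.0) (1.0 1.0) (1.0 1.0))) should then behave like the nested-array tests above.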

### Encoding hash-tables

#### Using string as key

Using (alexandria:plist-hash-table '("foo" 1 "bar" (7 8 9)) :test #'equal) as the sample data:

Table 37: Encoding Hash Tables without Symbols
Library Function Result Comment
boost-json json-encode {"foo":1,"bar":[7,8,9]}
cl-json encode-json {"foo":1,"bar":[7,8,9]}
com.gigamonkeys.json write-json {"foo":1,"bar":{"7":8,"9":{}}} (1)
com.inuoe.jzon stringify "{\"foo\":1,\"bar\":[7,8,9]}"
jonathan to-json "{\"foo\":1,\"bar\":[7,8,9]}"
json-lib stringify "{\"foo\": 1, \"bar\": [7, 8, 9]}"
json-streams json-stringify Error (2)
jsown to-json "{\"foo\":1,\"bar\":[7,8,9]}"
shasht write-json { "foo": 1, "bar": [7,8,9]}
st-json write-json {"foo":1,"bar":[7,8,9]}
trivial-json-codec serialize-json Error (3)
yason encode {"foo":1,"bar":[7,8,9]}
• (1) The hash table encoding itself is fine. In this example com.gigamonkeys.json is treating the list that is a hash-table value as a plist and trying to create key:value pairs.
• (2) You would need to use lower level components of json-streams. See json-streams-encoding.
• (3) May be able to resolve this with writing your own method.

#### Using symbol as key

Using (alexandria:plist-hash-table '(:foo 1 :bar (7 8 9)) :test #'eq) as the sample data. Again, note what com.gigamonkeys.json does with the embedded list, trying to treat it as key:value pairs.

Table 38: Encoding Hash Tables With Symbols as Key
Library Function Result Comment
boost-json json-encode FAIL: {}
cl-json encode-json {"foo":1,"bar":[7,8,9]}
com.gigamonkeys.json write-json {"foo":1,"bar":{"7":8,"9":{}}} (1)
com.inuoe.jzon stringify {"foo":1,"bar":[7,8,9]}
jonathan to-json {"FOO":1,"BAR":[7,8,9]}
json-lib stringify "{\"foo\": 1, \"bar\": [7,8,9]}" (2)
json-streams json-stringify Error (3)
jsown to-json "{\"FOO\":1,\"BAR\":[7,8,9]}"
shasht write-json {"FOO": 1,"BAR": [7,8,9]}
st-json write-json Error (4)
trivial-json-codec serialize-json Error (5)
yason encode Error (6)
• (1) The symbol must be a keyword symbol or com.gigamonkeys.json will error out. As noted above, it is also trying to treat the embedded list as key:value pairs rather than an array.
• (2) If the symbols were not keywords, json-lib would return "{null: 1, null: [7, 8, 9]}"
• (3) You would need to use lower level components of json-streams. See json-streams-encoding.
• (4) You will need to write your own method to handle symbols in this situation.
• (5) May be able to resolve this with writing your own method.
• (6) There is no applicable method for the generic function when called with arguments :symbol
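Footnote (4)'s suggested fix could be sketched by specializing st-json's write-json-element generic function on symbols. This is an illustrative sketch, not st-json's own recipe; the choice to lowercase symbol names is arbitrary, and the first three clauses just preserve st-json's special boolean/null markers:

```lisp
;; Sketch: teach st-json to encode arbitrary symbols as strings,
;; while preserving its special :true/:false/:null markers.
(defmethod st-json:write-json-element ((element symbol) stream)
  (case element
    (:true  (write-string "true" stream))
    (:false (write-string "false" stream))
    (:null  (write-string "null" stream))
    (t (st-json:write-json-element
        (string-downcase (symbol-name element)) stream))))
```

With a method like this in place, (st-json:write-json-to-string (list :foo :bar)) would render the symbols as the strings "foo" and "bar".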

### Encoding CLOS objects

cl-json, com.inuoe.jzon and shasht are all able to encode CLOS objects by default. Boost-json, jonathan, jsown, st-json and yason require that you write a method for each class. With the other libraries you would either have to convert the CLOS object into another type (hash table, alist, etc.) or write a specific function for that particular class of object. We are using a very simple CLOS object here, just to test the water.

(defclass person ()
  ((name :initarg :name
         :initform "Sabra"
         :accessor name)
   (eye-colour :initarg :eye-colour
               :initform "brown"
               :accessor eye-colour)))


The table below bears this out, at least for this simple example. Note that trivial-json-codec also manages to serialize the object.

Table 39: Encoding CLOS Objects
Library Function Result
boost-json json-encode (1)
cl-json encode-json {"name":"Sabra","eyeColour":"brown"}
com.gigamonkeys.json write-json hangs
com.inuoe.jzon stringify "{\"name\":\"Sabra\",\"eye-colour\":\"brown\"}" (4)
jonathan to-json (1)
json-lib stringify "null"
json-streams json-stringify Error (2)
jsown to-json (1)
shasht write-json { "NAME": "Sabra", "EYE-COLOUR": "brown"}
st-json write-json (1)
trivial-json-codec serialize-json "{\"NAME\":\"Sabra\",\"EYE-COLOUR\":\"brown\"}"
yason encode (1)
• (1) You would need to write a method for the class.
• (2) You would need to use lower level components of json-streams. See json-streams-encoding.
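For the libraries marked (1), a hand-written method is needed. A minimal sketch for yason, assuming its documented encode generic function and with-object/encode-object-element macros, might look like this for the person class above (the slot-name strings are our choice):

```lisp
;; Hypothetical per-class encoder for PERSON; key names chosen by us.
(defmethod yason:encode ((p person) &optional (stream *standard-output*))
  (yason:with-output (stream)
    (yason:with-object ()
      (yason:encode-object-element "name" (name p))
      (yason:encode-object-element "eye-colour" (eye-colour p))))
  p)
```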

### Encoding Structs

Only com.inuoe.jzon and shasht were able to encode a struct without having to define a special method.

The sample data for the following table is:

(defparameter *book1* (make-book :title "C Programming"
                                 :author "Nuha Ali"
                                 :subject "C-Programming Tutorial"
                                 :book-id "478"))
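The book struct itself is not shown; from the make-book call and the field names in the results below, it was presumably defined along these lines:

```lisp
(defstruct book
  title
  author
  subject
  book-id)
```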

Table 40: Encoding Structs
Library Function Result
boost-json json-encode Error (1)
cl-json encode-json Error (2)
com.gigamonkeys.json write-json hangs
com.inuoe.jzon stringify {"title":"C Programming","author":"Nuha Ali","subject":"C-Programming Tutorial","book-id":"478"}
jonathan to-json Error (1)
json-lib stringify null
json-streams json-stringify Error (3)
jsown to-json Error (1)
shasht write-json {"TITLE":"C Programming","AUTHOR":"Nuha Ali","SUBJECT":"C-Programming Tutorial","BOOK-ID":"478"}
st-json write-json Error (4)
trivial-json-codec serialize Error (1)
yason encode Error (1)
• (1) No applicable method
• (2) Not a type which can be encoded by encode-json
• (3) You would need to use lower level components of json-streams. See json-streams-encoding.
• (4) Can not write object of type BOOK as JSON.

### Incremental Encoding

Suppose you do not yet have a specific lisp data structure containing all the data you want to encode into a single json object or array. A simple example would be intermediate function results generated during a loop. How would you go about writing them? Obviously you could collect all the results into a list or other CL data structure first, or you could build a json object or array incrementally, or write the results to a stream incrementally. Examples can be found in the library details section below using the appropriate links in the following table.

Table 41: Incremental Encoding
Library Ability?
boost-json No
cl-json Yes
com.gigamonkeys.json No
com.inuoe.jzon No (1)
jonathan Yes
json-lib No
json-streams Yes
jsown Yes
shasht Yes
st-json Yes
trivial-json-codec No
yason Yes
• (1) being actively worked on
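As a sketch of what incremental encoding looks like in practice, here is yason's streaming API (one of the "Yes" libraries above) emitting a json array element by element; macro names per yason's documentation:

```lisp
;; Build a json array incrementally instead of collecting the values
;; into a lisp sequence first.
(with-output-to-string (s)
  (yason:with-output (s)
    (yason:with-array ()
      (dotimes (i 4)
        (yason:encode-array-element (* i i))))))
;; producing "[0,1,4,9]"
```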

## Symmetry

Are the libraries symmetrical in the sense that (a) going from json to CL and back to json gets you the exact same json and (b) going from CL to json and back to CL gets you the exact same CL data and data structures? (This is sometimes called round-tripping.) This is critical to some users and irrelevant to others. If there were a 1:1 mapping between CL and json data structures, this would be easy, but there isn't: json has a null that CL does not have, and CL has many data structures that json does not have.

Consider starting from json and trying to get back to json. To quote Steve Losh: "For me, the most important quality I need in a JSON library is an unambiguous, one-to-one mapping of types. For example: some libraries will deserialize JSON arrays as Lisp lists, and JSON true/false as t/nil. But this means [] and false both deserialize to nil, so you can't reliably round trip anything!"

Now consider starting from cl and trying to get back to cl. A cl library that handles both lists and arrays might encode both as json arrays. But now when you decode the json array, are you decoding to a cl list or a cl array? If round-tripping is important to you, this will drive your choice of cl data structures so that you know what you are decoding back to. Similarly, what do you do with symbols?

### Json -> CL -> Json

#### First test (Easy)

In the first, easy test, the only surprises were jonathan reversing the order of the elements in the json object, and whatever trivial-json-codec is doing with the angle brackets in the json object test.

Table 42: Symmetry Test 1 (Easy) Json Object
Library Pass/Fail Resulting String Comment
Original   "{\"a\":1,\"b\":\"sales\",\"c\":true}"

boost-json PASS "{\"a\":1,\"b\":\"sales\",\"c\":true}"
cl-json PASS "{\"a\":1,\"b\":\"sales\",\"c\":true}"
com.gigamonkeys.json PASS {"a":1,"b":"sales","c":true}
com.inuoe.jzon PASS "{\"a\":1,\"b\":\"sales\",\"c\":true}"
jonathan FAIL "{\"c\":true,\"b\":\"sales\",\"a\":1}" Reversed order (2)
json-lib PASS "{\"a\":1,\"b\":\"sales\",\"c\":true}"
json-streams PASS "{\"a\":1,\"b\":\"sales\",\"c\":true}"
jsown PASS "{\"a\":1,\"b\":\"sales\",\"c\":true}"
shasht PASS "{\"a\":1,\"b\":\"sales\",\"c\":true}"
st-json PASS "{\"a\":1,\"b\":\"sales\",\"c\":true}"
trivial-json-codec FAIL "<<\":A\",1>,<\":B\",\"sales\">,<\":C\",true>>" (1)
yason PASS "{\"a\":1,\"b\":\"sales\",\"c\":true}"
• (1) trivial-json-codec is really intended as a parser (one way) from json to CL, not really serializing to json.
• (2) Ignoring the reversed order issue, consider the following from jonathan. We decode a json object as an alist. Jonathan returns an alist with dotted pairs. E.g.
(jonathan:parse "{\"a\":1,\"b\":2}" :as :alist)
=> (("b" . 2) ("a" . 1))


If we then pass that to jonathan:to-json, it throws an error because jonathan does not handle alists with dotted pairs. I am only calling this out because jonathan expressly provides you with the ability to return a json object as a dotted pair alist, but then cannot handle it going back the other direction.

Everyone passes when dealing with the easy array with no nulls involved.

Table 43: Symmetry Test 1 (Easy) Json Array
Library Pass/Fail Resulting String
Original   "[1,\"sales\",true]"

boost-json PASS "[1,\"sales\",true]"
cl-json PASS "[1,\"sales\",true]"
com.gigamonkeys.json PASS [1,"sales",true]
com.inuoe.jzon PASS "[1,\"sales\",true]"
jonathan PASS "[1,\"sales\",true]"
json-lib PASS "[1, \"sales\", true]"
json-streams PASS "[1,\"sales\",true]"
jsown PASS "[1,\"sales\",true]"
shasht PASS "[1,\"sales\",true]"
st-json PASS "[1,\"sales\",true]"
trivial-json-codec PASS "[1,\"sales\",true]"
yason PASS "[1,\"sales\",true]"

#### Second Test (Array Inside a Json Object)

Table 44: Symmetry Test 2 Json Array Within a Json Object
Library Pass/Fail Resulting String
Original   "{\"items\": [1,2,3]}"

boost-json PASS "{\"items\":[1,2,3]}"
cl-json FAIL (1) "[[\"items\",1,2,3]]"
com.gigamonkeys.json PASS {"items":[1,2,3]}
com.inuoe.jzon PASS "{\"items\":[1,2,3]}"
jonathan PASS "{\"items\":[1,2,3]}"
json-lib PASS "{\"items\": [1, 2, 3]}"
json-streams PASS "{\"items\":[1,2,3]}"
jsown PASS "{\"items\":[1,2,3]}"
shasht PASS "{\"items\":[1,2,3]}"
st-json PASS "{\"items\":[1,2,3]}"
trivial-json-codec FAIL "<<\":ITEMS\",[1,2,3]>>"
yason PASS "{\"items\":[1,2,3]}"
• (1) I was surprised that cl-json failed here by returning a json array instead of a json object. This is flagged on the homepage as issue 22.

#### Third test (Trickier Data Types)

Now we make it a bit trickier with the following json string, which we decoded and then re-encoded to see if we got back the original. It is trickier because of the unicode escape, the exponent, and the false and null values. The results are pretty much what you would expect.

"{\"key1\":\"value\\n\",\"key2\":1,\"key3\":[\"Hello \\u2604\",1.2e-34 ,true,false,null]}"

Table 45: Symmetry Test 3
Library Pass/Fail Resulting String
Original   "{\"key1\":\"value\\n\",\"key2\":1, \"key3\":[\"Hello \\u2604\",1.2e-34 ,true,false,null]}"
boost-json FAIL (3) "{\"key1\":\"value\\n\",\"key2\":1, \"key3\":[\"Hello \\u2604\", 0.00000000000000000000000000000000012000001, true,null,null]}"
cl-json FAIL (3) "{\"key1\":\"value\\n\",\"key2\":1, \"key3\":[\"Hello \\u2604\",0.00000000000000000000000000000000012000001, true,null,null]}"
com.gigamonkeys.json PASS {"key1":"value\n","key2":1, "key3":["Hello ☄",1.2e-34,true,false,null]}
com.inuoe.jzon FAIL (3) "{\"key1\":\"value\\n\",\"key2\":1, \"key3\":[\"Hello ☄\", 0.00000000000000000000000000000000012, true,false,null]}"
jonathan (1) FAIL (3)(5) "{\"key3\":[\"Hello ☄\", 0.00000000000000000000000000000000012000001, true,[],[]],\"key2\":1, \"key1\":\"value\\n\"}"
json-lib FAIL "{\"key1\": \"value\\n\", \"key2\": 1, \"key3\": [\"Hello \", 1.2000001e-34, true, null, null]}"
json-streams (1) PASS "{\"key1\":\"value\\n\",\"key2\":1, \"key3\":[\"Hello ☄\",1.2E-34,true,false,null]}"
jsown FAIL (3)(5) "{\"key1\":\"value\\n\",\"key2\":1, \"key3\":[\"Hello \\u2604\", 0.00000000000000000000000000000000012, true,[],[]]}"
shasht (1) PASS "{\"key1\":\"value\\n\",\"key2\":1, \"key3\":[\"Hello ☄\",1.2e-34,true,false,null]}"
st-json PASS "{\"key1\":\"value\\n\",\"key2\":1, \"key3\":[\"Hello \\u2604\",0.12e-33,true,false,null]}"
trivial-json-codec FAIL (4) "<<\":KEY1\",\"value\\n\">,<\":KEY2\",1>, <\":KEY3\",[\"Hello \\u2604\",1.2000001e-34,true,null,null]>>"
yason (1) (2) PASS? (3) "{\"key1\":\"value\\n\",\"key2\":1, \"key3\":[\"Hello ☄\", 0.00000000000000000000000000000000011999999642058263, true,false,null]}"
• (1) unicode expanded to the appropriate character which I will treat as a success
• (2) You need to pass :json-booleans-as-symbols t to the yason parse function as a keyword argument in order to get false and it will come out as a YASON:FALSE symbol
• (3) Exponent expanded to a decimal which I could argue is a success. Your call.
• (4) trivial-json-codec is really intended as a parser (one way) from json to CL, not really serializing to json.
• (5) both false and null are converted to [], but they are different concepts.
Table 46: Symmetry Test 3 Points of Failure
Library Point(s) of Failure Comment
boost-json exponent (1), false
cl-json exponent (1), false
com.gigamonkeys.json
com.inuoe.jzon exponent (1)
jonathan backwards, exponent (1), false, null unicode expanded to the appropriate character
json-lib unicode, false
json-streams   unicode expanded to the appropriate character
jsown exponent (1), false, null
shasht   unicode expanded to the appropriate character
st-json exponent (1) 0.12e-33 v. 1.2e-34 should count as pass
yason exponent (1) unicode expanded to the appropriate character
• (1) Up to you whether you want to treat the exponent expanded to a decimal as a pass or fail

### CL -> Json -> CL

We did this testing with a simple alist with undotted pairs. Since this is an alist, cl-json uses specialized functions such as encode-json-alist-to-string to successfully write json, then reads back a result that is symmetrical to the original alist.

'((:NAME "George Washington")
  (:BIRTHDAY "February 22, 1732")
  (:ADDRESS "Mount Vernon, Virginia, United States"))

Table 47: Symmetry CL -> Json -> CL Test 1 starting with undotted alist
Library Pass/Fail Resulting String
Original   ((:NAME "George Washington") (:BIRTHDAY "February 22, 1732") (:ADDRESS "Mount Vernon, Virginia, United States"))
boost-json FAIL Error: Unexpected #\N
cl-json PASS ((:NAME "George Washington") (:BIRTHDAY "February 22, 1732") (:ADDRESS "Mount Vernon, Virginia, United States"))
com.inuoe.jzon FAIL (("address" . #("Mount Vernon, Virginia, United States")) ("birthday" . #("February 22, 1732")) ("name" . #("George Washington")))
jonathan PASS (3) ((:NAME "George Washington") (:BIRTHDAY "February 22, 1732") (:ADDRESS "Mount Vernon, Virginia, United States"))
json-lib FAIL #(#("name" "George Washington") #("birthday" "February 22, 1732") #("address" "Mount Vernon, Virginia, United States"))
json-streams FAIL (4)
jsown FAIL (("NAME" "George Washington") ("BIRTHDAY" "February 22, 1732") ("ADDRESS" "Mount Vernon, Virginia, United States"))
shasht FAIL #(#("NAME" "George Washington") #("BIRTHDAY" "February 22, 1732") #("ADDRESS" "Mount Vernon, Virginia, United States"))
st-json FAIL Depends on how you write a method for handling symbols. See st-json-encoding
trivial-json-codec PASS ((:NAME "George Washington") (:BIRTHDAY "February 22, 1732") (:ADDRESS "Mount Vernon, Virginia, United States"))
yason (2) FAIL (("address" "Mount Vernon, Virginia, United States") ("birthday" "February 22, 1732") ("name" "George Washington"))
• (1) com.inuoe.jzon and yason return hash tables, so alexandria:hash-table-alist was used to get an alist back.
• (2) (yason:parse (with-output-to-string (s) (yason:encode-alist data s)) :object-as :alist)
• (3) Caveat here if we tried to do this with respect to alists having dotted pairs. In that case jonathan would also fail.
• (4) You could write your own methods to wrap lower level components of json-streams. See json-streams-encoding.
Table 48: Symmetry CL -> Json -> CL Test 1 Points of Failure
Library Point(s) of Failure
boost-json Unexpected #\N
cl-json
com.gigamonkeys.json ???
com.inuoe.jzon keys are strings, not keywords, values are embedded in vectors
jonathan (1)
json-lib result is vectors in vectors with the keys being strings, not keywords
json-streams You would need to use lower level components of json-streams. See json-streams-encoding.
jsown keys are strings, not keywords
shasht result is vectors in vectors with the keys being strings, not keywords
st-json You need to write your own method for handling symbols.
trivial-json-codec
yason keys are strings, not keywords
• (1) As noted when going from Json -> CL -> Json, jonathan is not symmetric if the alists have dotted pairs.

Now let's try starting from an array which also contains a :NULL keyword symbol as a substitute for cl not having a proper null value.

Table 49: Symmetry CL -> Json -> CL Test 2 Starting with array
Library Pass/Fail Resulting String
Original   #("a" 1 4.2 NIL :NULL)

boost-json FAIL [,1,4.2,null,"NULL"]
cl-json DEFAULT FAIL ("a" 1 4.2 NIL "null")
cl-json PASS (1)(2) #("a" 1 4.2 NIL "null")
com.gigamonkeys.json FAIL ["a",1,4.199999809265137,{},null]
com.inuoe.jzon PASS (2) #("a" 1 4.2d0 NIL "NULL")
jonathan FAIL (a 1 4.2 NIL NIL)
json-lib PASS (2) #(a 1 4.2 NIL "null")
json-streams FAIL Error
jsown FAIL (a 1 21/5 NIL NIL)
shasht PASS (3) #("a" 1 4.2000003 NIL :NULL)
st-json FAIL Error
trivial-json-codec PASS #(a 1 4.2 NIL NULL)
yason (4) FAIL #("a" 1 4.2 NIL NIL)
• (1) Needs to use cl-json:with-decoder-simple-clos-semantics or cl-json:set-decoder-simple-clos-semantics
• (2) Pass assuming you deal with the "NULL" or "null" string somehow to get back to :NULL
• (3) Pass assuming the float result is acceptable to you
• (4) Assumes yason:parse has keyword parameter :json-arrays-as-vectors set to t
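For reference, the yason decoding options mentioned in the footnotes of the two CL -> Json -> CL tests combine like this (keyword argument names per yason's documentation):

```lisp
;; Decode json objects to alists and json arrays to vectors
;; instead of the default hash tables and lists.
(yason:parse "{\"items\":[1,2,3]}"
             :object-as :alist
             :json-arrays-as-vectors t)
```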
Table 50: Symmetry CL -> Json -> CL Test 2 Points of Failure
Library Point(s) of Failure
boost-json lost the first value, nil came back as null
cl-json need to parse the result and convert "null" to :NULL
com.gigamonkeys.json {} braces instead of () or nil
com.inuoe.jzon need to parse the result and convert "NULL" to :NULL
jonathan result is a list and it converts :NULL to nil
json-lib need to parse the result and convert "null" to :NULL
json-streams You would need to write your own method wrapping components of json-streams to handle a vector
jsown returns list instead of array and :NULL is converted to nil. Interesting that the float is converted to a ratio.
shasht
st-json You need to write your own method for handling vectors
trivial-json-codec
yason Null converted to nil

## Security

Getting json objects from another source is just as insecure as receiving any other data from another source. You are still responsible for ensuring that you have properly sanitized, validated or otherwise checked the security of the data.

Redditor lokedhs has pointed out that "I'd be careful about using any JSON library that uses keywords for hash keys (like CL-JSON). The reason is that if you are using it to parse unchecked input, it can be used to cause a denial of service attack by sending maps that contain random keys. Every key will be interned into the keyword package, which are never garbage collected, causing an out of memory condition after a while." Cl-json flags the issue and provides the function safe-json-intern, which will throw an error if the keyword to be interned does not already exist in the *json-symbols-package*, so at least you are warned and provided with an alternative. The following code will throw an error if alpha-omega is not already interned in the *json-symbols-package*.

(setf cl-json::*identifier-name-to-key* #'cl-json::safe-json-intern)
(cl-json:decode-json-from-string "{\"alpha-omega\": 1}")


Boost-json and jonathan also seem to have this as a potential issue. Boost-json decodes json objects to a CLOS object with slot-value names interned in the boost-json package.

com.gigamonkeys.json, com.inuoe.jzon, json, json-lib, json-streams, jsown, shasht and st-json do not intern the keywords. Yason doesn't intern the keywords in the library but the test files do, giving you the impression that it assumes that will be normal practice.

You will see in the next section, when it discusses malformed data, that certain strings can hang the system or exhaust the stack (typically by opening thousands of json arrays or objects and never closing them). Com.inuoe.jzon, json-lib and json-streams have maximum nesting-depth limits that can be used to prevent this type of overloading.
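As an illustration of those limits, com.inuoe.jzon's parse takes a :max-depth keyword argument (per its README; the hostile input below is our own construction):

```lisp
;; A string of 100,000 unclosed open-brackets signals an error quickly
;; instead of exhausting the stack.
(handler-case
    (com.inuoe.jzon:parse (make-string 100000 :initial-element #\[)
                          :max-depth 32)
  (error () :rejected))
```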

## Standard Conformity and Dealing with Malformed Data

To check standard conformity and test for malformed data, we will take advantage of some of the test suites that have come out for json, specifically https://github.com/nst/JSONTestSuite. You may also find interesting reading at https://yitzchak.github.io/shasht/parsing.html#17 and http://www.seriot.ch/parsing_json.php.

One thing to take into consideration, bearing in mind your own fact pattern, is whether you agree or disagree with Jon Postel's robustness principle:

"Be strict when sending and tolerant when receiving. Implementations must follow specifications precisely when sending to the network, and tolerate faulty input from the network. When in doubt, discard faulty input silently, without returning an error message unless this is required by the specification."

If you follow this logic, your encoding function should be perfectly compliant and your decoding function may want to accept invalid json (subject to security concerns). According to https://datatracker.ietf.org/doc/html/draft-thomson-postel-was-wrong-03, the robustness principle should be read in the context of dealing with imperfect protocols, not everything under the sun including implementation bugs.

### Conformity Testing

While json has a standard, it is arguably under-specified. Using the tests from https://github.com/nst/JSONTestSuite, I intended to focus on whether the libraries correctly parsed json strings that must be accepted and whether they throw errors on json strings that must be rejected. However, as you will see, how the libraries handle the "must reject" strings and "under-specified" strings also implicate security and stability. Some libraries' attempts to handle intentionally malformed json strings actually hung sbcl or triggered stack exhaustion. These are not the libraries you want facing uncontrolled input.

#### Must Accept

This batch of json strings should be accepted by all libraries. The only libraries to meet that standard are com.inuoe.jzon and com.gigamonkeys.json. Almost all the rest came "reasonably" close.

Table 51: Conformity Testing - Must Accept
Library Correct Incorrect Comment
boost-json 95/(92) 0 (1)
cl-json 95/(92) 0 (1)
gigamonkeys 95 0
com.inuoe.jzon 95 0
jonathan 94 1 (2)
json-lib 95/(92) 0 (1)
json-streams 93 2 (3)
jsown 93 2 (4)
shasht 95/(92) 0 (1)
st-json 95/(92) 0 (1)
trivial-json-codec 90/(87) 5 (1)(5)
yason 95/(92) 0 (1)
• (1) If *read-default-float-format* is set to 'single-float, there will be decoding failures on [123e65], [123e45] and [123.456e78].
• (2) Failed on [123.456e78] even with *read-default-float-format* set to 'double-float
• (3) rejected files with duplicate keys. This can be resolved by passing the keyword parameter :duplicate-key-check nil
• (4) Failed on lonely numbers (an integer or negative real not within a json array or json object)
• (5) Failed on [[] ], [0e+1], [1E+2], [1e+2], { "min": -1.0e+28, "max": 1.0e+28 }
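The footnote (1) caveat can be avoided by binding the standard reader variable; shasht is shown here, but the same binding helps the other footnote (1) libraries:

```lisp
;; With the default of SINGLE-FLOAT, 123e65 overflows single-float
;; range; binding to DOUBLE-FLOAT lets the string parse.
(let ((*read-default-float-format* 'double-float))
  (shasht:read-json "[123e65]"))
```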

#### Can Accept or Reject

This batch of json strings was designed to find the under-specified holes in the json specification. As such, libraries could accept or reject these strings without being out of compliance with the standard. For purposes of the following table, I only tested files that could be read as encoded in utf-8.

What I found troubling was that jonathan and jsown both hung on one or more of these strings.

Table 52: Conformity Testing - Can Accept or Reject
Library Accepted Rejected Comment
boost-json 13 4
cl-json 13 4
gigamonkeys 14 3
com.inuoe.jzon 7 10
jonathan 9 7 (1)
json-lib 12 5
json-streams 1 16
jsown 11 5 (1)
shasht 9 8
st-json 8 9
trivial-json-codec 13 4
yason 8 9
• (1) hang on i_number_real_underflow [123e-10000000]

### Malformed Data

##### Must Reject

Only three libraries did not hang or trigger stack exhaustion on strings designed to open thousands of nested arrays or objects without ever closing them: com.inuoe.jzon, json-lib and json-streams. All three have a nesting-depth limit and refuse to exceed it.

Table 53: Malformed Data Testing - Must Reject
Library Correct Incorrect Comment
boost-json 145 28 stack exhausted (1)
cl-json 153 20 stack exhausted (1)
com.gigamonkeys.json 0 173 stack exhausted (1)
com.inuoe.jzon 173 0
jonathan 128 45 stack exhausted (1)
json-lib 110 63
json-streams 173 0
jsown 102 71 stack exhausted (1)
shasht 159 18 recoverable error (2)
st-json 140 33 stack exhausted (1)
trivial-json-codec 101 72
yason 119 54 stack exhausted (1)
• (1) n_structure_open_array_object.json, n_structure_100000_opening_arrays.json
• (2) n_structure_open_array_object.json

## Benchmarking

Ok. We are going to show read benchmarking results using sbcl, ccl and ecl because the results for jonathan differ so dramatically between implementations.

Jsown maintains its crown as the fastest parser on sbcl (assuming no errors in the file). It is beaten slightly by jonathan on tiny strings, but jonathan shows a slow-down with certain nested json objects that gets progressively worse as string sizes increase. The effect appears soonest under sbcl, where the slow-down starts below 12.7k of data. It is obvious even under ccl and ecl by the time you reach 221k json objects, and by the time you get over 1MB, jonathan is orders of magnitude slower than all the other libraries, not just jsown.

If we take a quick look at benchmarking with tiny json strings, we get read time chart comparisons that look like the following. The numbers come from applying the cost-of-nothing benchmark function against the libraries parsing the json string immediately below and using cl-spark's vspark function.

yason-alist and yason-plist are shorthand for (yason:parse data :object-as :alist) and (yason:parse data :object-as :plist)

(defparameter +json-string+ "{\"key1\": \"value\\n\",
\"key2\":1,\"key3\":[\"Hello \\u2604\",  1.2e-34 ,true,
false,null]}")
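Roughly, the number behind each bar comes from something like the following (a sketch; it assumes the-cost-of-nothing's benchmark macro, which returns seconds per evaluation, and cl-spark's vspark with its :labels argument):

```lisp
;; One seconds-per-call measurement per library, rendered as a
;; spark chart. Only two libraries shown for brevity.
(cl-spark:vspark
 (list (the-cost-of-nothing:benchmark (jsown:parse +json-string+))
       (the-cost-of-nothing:benchmark (shasht:read-json +json-string+)))
 :labels '("jsown" "shasht"))
```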

                                JSON Read Times (sbcl tiny json object)
0                  6.147095e-6                  1.229419e-5
˫----------------------------+----------------------------˧
boost-json █████████████████████████▎
cl-json ████████████████████████████████████▎
com.gigamonkeys.json ███████████████████████▎
jonathan ███████▍
com.inuoe.jzon ███████████▍
json-lib █████████████████████████████████▌
json-streams ███████████████████████████████████████████████████████████
jsown ████████▏
shasht █████████████▊
st-json ████████████████████████████████▏
trivial-json-codec ████████████████▉
yason ████████████████████████████████████▉
yason-alist ███████████████████████████████████▏
yason-plist ███████████████████████████████████▎

JSON Read Times (ccl tiny json object)
0                  3.137722E-5                  6.275444E-5
˫----------------------------+----------------------------˧
boost-json ███████████████▉
cl-json ██████████████████▋
com.gigamonkeys.json █████████████████▌
jonathan ███▏
com.inuoe.jzon █████▋
json-lib ███████████████████████████████████████████████████████████
json-streams █████████████████████████████████████████▍
jsown █████▎
shasht ██████████▏
st-json ████████████████▍
yason █████████████████▎

JSON Read Times (ecl tiny json object)
0                 1.3122593e-4                 2.6245185e-4
˫----------------------------+----------------------------˧
boost-json ███████▍
cl-json ██████████████████████████▋
com.gigamonkeys.json █████████▋
jonathan █▋
com.inuoe.jzon █████▍
json-lib █████████▊
json-streams ███████████████████████████████████████████████████████████
jsown ██▋
shasht █████████████▎
st-json █████████▊
yason ████████▌


The read times start to look different with respect to nested json objects as the file sizes increase. The next chart is a 2.7k nested json object. With sbcl, jonathan is slightly behind jsown, but still looking good compared to the rest of the libraries. The json dataset for the charts between 2.7k and 221k comes from a Nobel prize dataset which can be found at http://api.nobelprize.org/v1/prize.json. The 2.7k data is just 2021. The 12.9k data is 2017-2021 and the 221k data is all years. We are dropping trivial-json-codec at this point because it claims the data is invalid.

                                JSON Read Times (sbcl 2.7k json object)
0                 1.3253705e-4                  2.650741e-4
˫----------------------------+----------------------------˧
boost-json ████████████▉
cl-json █████████████████████████████████▌
com.gigamonkeys.json ██████████████████▏
jonathan ██████▏
com.inuoe.jzon ██████████████▌
json-lib ████████████████████████████████████████▌
json-streams ███████████████████████████████████████████████████████████
jsown ███▊
shasht █████████████▊
st-json ██████████████▏
yason ██████████████████████████████████████████▊
yason-alist █████████████████████████████████████████▍
yason-plist █████████████████████████████████████████▍

JSON Read Times (ccl 2.7k json object)
0                   4.59796E-4                   9.19592E-4
˫----------------------------+----------------------------˧
boost-json █████████████▋
cl-json ████████████████████████████▎
com.gigamonkeys.json █████████████████▎
jonathan ███▍
com.inuoe.jzon ██████████▌
json-lib ███████████████████████████████████████████████████████████
json-streams ██████████████████████████████████████████████████████████▏
jsown ███████▍
shasht ████████▏
st-json █████████████▏
yason ███████████████████████████▏

JSON Read Times (ecl 2.7k json object)
0                 0.0029186574                 0.0058373148
˫----------------------------+----------------------------˧
boost-json █████▍
cl-json ████████▌
com.gigamonkeys.json ███████████▏
jonathan █▌
com.inuoe.jzon ████▋
json-lib █████████████████▋
json-streams ███████████████████████████████████████████████████████████
jsown ████▌
shasht ███▍
st-json █████▉
yason ███████████▎


Now we increase the size to a 12.9k nested json object. Using sbcl, jsown is out in front by itself and jonathan has fallen behind four other libraries. On the other hand, jonathan is still among the fastest with ccl and ecl.

                                JSON Read Times (sbcl 12.9k json object)
0                 6.402964e-4                  0.0012805928
˫----------------------------+----------------------------˧
boost-json █████████████▏
cl-json █████████████████████████████████▋
com.gigamonkeys.json █████████████████▉
jonathan ████████████████▍
com.inuoe.jzon ██████████████▌
json-lib ███████████████████████████████████████▌
json-streams ███████████████████████████████████████████████████████████
jsown ███▋
shasht █████████████▊
st-json █████████████▉
yason ██████████████████████████████████████████▌
yason-alist ████████████████████████████████████████▊
yason-plist ████████████████████████████████████████▉

JSON Read Times (ccl 12.9k json object)
0                  0.0022879406                 0.004575881
˫----------------------------+----------------------------˧
boost-json █████████████▎
cl-json ███████████████████████████▋
com.gigamonkeys.json ████████████████▍
jonathan ███████▍
com.inuoe.jzon ██████████▌
json-lib ███████████████████████████████████████████████████████████
json-streams █████████████████████████████████████████████████████████▍
jsown ███████▏
shasht ████████▏
st-json ████████████▊
yason ██████████████████████████▎

JSON Read Times (ecl 12.9k json object)
0                   0.02525251                   0.05050502
˫----------------------------+----------------------------˧
boost-json ███▍
cl-json ███▍
com.gigamonkeys.json ████▏
jonathan █▎
com.inuoe.jzon █████████████▍
json-lib ██████▏
json-streams ███████████████████████████████████████████████████████████
jsown ██▊
shasht █████▋
st-json █▉
yason █████████▍


Now we jump up to the 221k json object: jonathan falls far behind with sbcl and, for unknown reasons, suddenly starts struggling under ccl and ecl as well.

                                JSON Read Times (sbcl 221k json object)
0                   0.03486486                   0.06972972
˫----------------------------+----------------------------˧
boost-json ████▏
cl-json ██████████▋
com.gigamonkeys.json █████▊
jonathan ███████████████████████████████████████████████████████████
com.inuoe.jzon ████▉
json-lib █████████████▌
json-streams ████████████████████▌
jsown █▎
shasht ████▋
st-json ████▍
yason ██████████████▎

JSON Read Times (ccl 221k json object)
0                  0.045712363                  0.091424726
˫----------------------------+----------------------------˧
boost-json ███████████▌
cl-json ██████████████████████▉
com.gigamonkeys.json █████████████████▌
jonathan ███████████████████████████████████████████████████████████
com.inuoe.jzon ███████████▉
json-lib ███████████████████████████████████████████████████████▏
json-streams ███████████████████████████████████████████████▉
jsown ██████▍
shasht ██████████████████▍
st-json ██████████▉
yason ██████████████████████████████▉

JSON Read Times (ecl 221k json object)
0                   0.13347456                   0.26694912
˫----------------------------+----------------------------˧
boost-json ████████████████▋
cl-json █████████████▊
com.gigamonkeys.json ███████████▍
jonathan █████████████████████████████████████████████████▏
com.inuoe.jzon █████████▌
json-lib ███████████████▊
json-streams ████████████████████████████████████████████████████▋
jsown ██████████████████████████████████████████████████████████
shasht ████▊
st-json █████████████▌
yason ███████████████████▎


Switching to trivial-benchmark numbers, we take the cumulative timing and consing for 20 runs of each of these three json strings and compare just jonathan v. com.inuoe.jzon. Obviously the first thing that jumps out is the garbage collection while jonathan parses the 221k json string. Look at the relative increase: the largest string is 8162% larger than the smallest, but jonathan's run time increases by over 100000% and its bytes consed increase by over 575000%. At the same time, com.inuoe.jzon's figures increase slightly less than the increase in string size.

Table 54: Jonathan Parsing (sbcl)
File Size (Bytes)   2715         12961        221618       Relative Increase: 8162%
                    Total (sec)  Total (sec)  Total (sec)
RUN-TIME            0.001276     0.009836     1.320854     103515%
USER-RUN-TIME       0.001174     0.009696     1.233804     105094%
SYSTEM-RUN-TIME     0.000105     0.000145     0.087132     82983%
GC-RUN-TIME         0            0            217.427
BYTES-CONSED        1754032      37784576     10100294512  575832%

Table 55: Com.inuoe.jzon Parsing (sbcl)
File Size (Bytes)   2715         12961        221618       Relative Increase: 8162%
                    Total (sec)  Total (sec)  Total (sec)
RUN-TIME            0.002695     0.012667     0.138126     5125%
USER-RUN-TIME       0.002694     0.012666     0.13813      5127%
SYSTEM-RUN-TIME     0            0            0            0
GC-RUN-TIME         0            0            0            0
BYTES-CONSED        504800       2246960      37353184     7399%
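For readers who want to reproduce numbers like these, the metric rows above (RUN-TIME, GC-RUN-TIME, BYTES-CONSED, etc.) are the rows printed by trivial-benchmark's with-timing macro. A minimal sketch, assuming the libraries are loaded via quicklisp and that *json-221k* is a placeholder variable holding the 221k test string:

```lisp
;; Sketch only: *json-221k* is a placeholder for the 221k test string.
;; (ql:quickload '(:trivial-benchmark :jonathan :com.inuoe.jzon))

;; Time 20 parses with jonathan; prints a table whose rows include
;; RUN-TIME, USER-RUN-TIME, SYSTEM-RUN-TIME, GC-RUN-TIME and BYTES-CONSED.
(benchmark:with-timing (20)
  (jonathan:parse *json-221k*))

;; The same 20 parses with com.inuoe.jzon for comparison.
(benchmark:with-timing (20)
  (com.inuoe.jzon:parse *json-221k*))
```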

Now looking at parsing a 1.2 MB json string with nested objects, jonathan is orders of magnitude slower than all the other libraries regardless of which compiler is used. The json dataset can be found at https://github.com/mledoze/countries/blob/master/countries.json.

                                JSON Read Times (sbcl 1.2MB json object)
0                     1.193597                     2.387194
˫----------------------------+----------------------------˧
boost-json ▌
cl-json █▏
com.gigamonkeys.json ▊
jonathan ███████████████████████████████████████████████████████████
com.inuoe.jzon ██▏
json-lib █▋
json-streams ██▍
jsown ▍
shasht ▋
st-json ▊
yason █▌
yason-alist █▍
yason-plist █▍

CCL results are not included since ccl had an issue with some of the unicode in the file.

JSON Read Times (ecl 1.2MB json object)
0                          2.0                          4.0
˫----------------------------+----------------------------˧
boost-json █▍
cl-json █████████▏
com.gigamonkeys.json ████████▏
jonathan ███████████████████████████████████████████████████████████
com.inuoe.jzon █████▌
json-lib █████▎
json-streams ██████████████▊
jsown ██▍
shasht █▋
st-json █▋
yason █████████▊



If we drop jonathan and look at the remaining libraries and add in results for yason decoding to plists and alists, it looks like this:

                                JSON Read Times (sbcl 1.2 MB json object)
0                   0.04651775                    0.0930355
˫----------------------------+----------------------------˧
boost-json ███████████▍
cl-json ██████████████████████████████▎
com.gigamonkeys.json ███████████████████▍
com.inuoe.jzon ██████████████████████████████████████████████████████▉
json-lib █████████████████████████████████████████▎
json-streams ██████████████████████████████████████████████████████████
jsown ███████▍
shasht ███████████████▋
st-json ███████████████████▍
yason ████████████████████████████████████▍
yason-alist █████████████████████████████████▉
yason-plist █████████████████████████████████▉


Interestingly, if we go up to a 9.8 MB file (still on the small side) downloaded from https://www.vizgr.org/historical-events/search.php?format=json&begin_date=-3000000&end_date=20151231&lang=en, we see substantial improvement from jzon (com.inuoe.jzon) and some slowing down from gigamonkeys. I am going to guess that there is something different in the structure of the files which jzon handles faster than the 1.2 MB file. Again, trivial-json-codec rejected the data as invalid, so it is left out.

                                JSON Read Times (sbcl 9.8MB json object)
0                    8.908338                     17.816675
˫----------------------------+----------------------------˧
boost-json ▊
cl-json █▉
com.gigamonkeys.json █▎
jonathan ███████████████████████████████████████████████████████████
com.inuoe.jzon ▋
json-lib ██▍
json-streams ███▎
jsown ▎
shasht ▋
st-json ▊
yason ██▍

JSON Read Times (ccl 9.8MB json object)
0                    11.320626                    22.641253
˫----------------------------+----------------------------˧
boost-json ██▏
cl-json ███▋
com.gigamonkeys.json ███▏
jonathan ███████████████████████████████████████████████████████████
com.inuoe.jzon █▎
json-lib ██████████▊
json-streams ███████▊
jsown █▎
shasht ▉
st-json █▉
yason ███▋

JSON Read Times (ecl 9.8 MB json object)
0                         11.0                         22.0
˫----------------------------+----------------------------˧
boost-json ██▋
cl-json ████▏
com.gigamonkeys.json ██▋
jonathan ███████████████████████████████████████████████████████████
com.inuoe.jzon ██▋
json-lib █████▍
json-streams ████████████████▏
jsown ██▋
shasht █▍
st-json ██▋
yason █████▍


#### Read Times From Stream (sbcl)

The following charts show reading from a stream rather than from a string (so a slightly smaller list of libraries). Jonathan choked on both files when read from a stream, so it is excluded as well. We see the same comparative results when reading from a stream as when we read from strings.
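As a sketch of what reading from a stream looks like (the file name here is a placeholder; com.inuoe.jzon's parse accepts streams as well as strings):

```lisp
;; Sketch: parsing directly from a file stream instead of reading the
;; whole file into a string first. "countries.json" is a placeholder path.
(with-open-file (in "countries.json" :external-format :utf-8)
  (com.inuoe.jzon:parse in))
```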

##### Countries File
                                JSON Read Times (sbcl) 1.2 MB Countries File
0                     0.045713887                      0.09142777
˫-------------------------------+-------------------------------˧
boost-json █████████████▏
cl-json █████████████████████████████████▍
com.inuoe.jzon █████████████████████████████████████████████████████████████████
json-streams █████████████████████████████████████████████████████████████▊
shasht ████████████████▊
st-json █████████████████████▏
yason █████████████████████████████████▋
yason-alist ████████████████████████████████▊
yason-plist █████████████████████████████████▋


##### Historical Events File
                                JSON Read Times (sbcl) 9.8 MB Historical Events File
0                       3.6496155                        7.299231
˫-------------------------------+-------------------------------˧
boost-json █▊
cl-json ████▊
com.inuoe.jzon ██▋
json-streams ███████▌
shasht █▊
st-json █▉
yason ████▉
yason-alist ████████████████████████████████████████████████████████████████▊
yason-plist █████████████████████████████████████████████████████████████████



### Write Times

Unlike the read times with jonathan, the write times did not show any surprising differences between tiny bits of data and longer nested data. First, writing tiny bits of data. We are skipping trivial-json-codec because of its inability to deliver valid json from lists.

                                JSON Write Times
0                    5.904379e-6                     1.1808758e-5
˫-------------------------------+-------------------------------˧
boost-json █████████████████████████████████████▋
cl-json █████████████████████████████████████████████████████████████████
com.inuoe.jzon ████████████████████████████████████▏
jonathan ██████████████████████████████████▊
json-lib ██████████████████████████████████████████████▋
json-streams ████████████████████████████▌
jsown ███████████████████████████████████████▌
shasht ████████████▊
st-json ███████▊
yason ██████████████████████████████████▉


Now the lisp data equivalent to the json 2.7k data string:

                                JSON Write Times 2.7k data
0                 1.1808636e-4                 2.3617272e-4
˫----------------------------+----------------------------˧
boost-json ███████████████████████████████████████████████████████████
cl-json ██████████████████████████████████████████████▌
com.gigamonkeys.json █████████▏
com.inuoe.jzon ████████▉
jonathan █████████▊
json-lib ██████████████████████████████████▌
json-streams ███████████████████████████████████████▎
jsown ██████████████████████▉
shasht ████████████▎
st-json ███████▋
yason ███████████████████▊



Now the lisp data equivalent to the json 12.9k data string:

                                JSON Write Times 12.9k data
0                 5.776428e-4                  0.0011552856
˫----------------------------+----------------------------˧
boost-json ███████████████████████████████████████████████████████████
cl-json ██████████████████████████████████████████████▍
com.gigamonkeys.json ████████▌
com.inuoe.jzon ████████▊
jonathan █████████▊
json-lib ██████████████████████████████████▏
json-streams ██████████████████████████████████████▋
jsown ██████████████████████▌
shasht ████████████▏
st-json ███████▏
yason ███████████████████▏



Now the lisp data equivalent to a json 221.6k data string:

                          JSON Write Times 221.6k data
0                  0.014210499                  0.028420998
˫----------------------------+----------------------------˧
boost-json ███████████████████████████████████████████████████████████
cl-json ███████████████████████████████▋
com.gigamonkeys.json █████▉
com.inuoe.jzon █████▊
jonathan ██████▋
json-lib ████████████████████████▉
json-streams ██████████████████████████▉
jsown ███████████████▌
shasht ████████▏
st-json ████▊
yason █████████████▍



### boost-json

boost-json Jeffrey Massung Apache v.2 https://github.com/cl-boost/json Not in quicklisp

Boost-json is one of the faster decoders. The author notes that he personally uses it to parse extremely large genomics JSON-list files (several GB in size). The author and I have different opinions on nil/false/null/[] and whether they actually have meaningful differences. He does not think so; I disagree. It does have issues on the encoding side, sometimes losing values when encoding vectors, and it is generally one of the slowest writers. I think it has value in the right use cases, but you do need to make sure it meets your particular needs.

#### Default Mapping

Please note the direction of the arrows in the following table.

Lisp   Json
integer <-> number with no frac or exp parts
float <-> number with frac or exp parts
rational -> number with frac or exp parts
ratio -> number with frac or exp parts
T <-> true
nil <-> null
nil <- false
other symbol -> string
character -> Error
string <-> string
list (except alists) <-> array
alist (dotted pairs) -> Invalid json array
alist (undotted pairs) <-> nested array
hash-table -> object
CLOS object <- object
standard object -> need to write a method

#### Decoding

Boost-json has different functions for decoding from strings (json-decode) or streams (json-read). Json arrays are decoded to CL lists and json objects are decoded to a CLOS object with slot-value names interned in the boost-json package.
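For example, a minimal sketch of the two entry points (this assumes json-read takes the stream as its argument, per the exported-symbols list below; arrays decode to CL lists):

```lisp
;; Decoding from a string:
(boost-json:json-decode "[1, 2, 3]")
;; => (1 2 3)

;; Decoding the same data from a stream:
(with-input-from-string (in "[1, 2, 3]")
  (boost-json:json-read in))
```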

I find its handling of nil and false a bit confusing. Consider the following examples:

(boost-json:json-decode "[false]")
(NIL)

(boost-json:json-decode "{\"A\":false}")
#<BOOST-JSON:JSON-OBJECT {"A":null}>


Within an array, json's 'false' is converted to CL nil, but within a json object, json's 'false' is converted to :null. Why?

We also have a symmetry problem with respect to json's 'false'. See the following:

(boost-json:json-encode
(boost-json:json-decode "{\"A\":false}"))
{"A":null}


The starting point with 'false' got converted to null.

Boost-json does not handle unicode surrogate pairs if you care about that sort of thing.

##### Decoding to CLOS object

Now let's talk about boost-json's automagic decoding json objects to a json-object which is a standard CLOS object. Let's start with a simple version before we go to nested objects.

(boost-json:json-decode "{ \"weights\" : [ 0.5, 0.5 ] }")
#<BOOST-JSON:JSON-OBJECT {"weights":#}>

(describe (boost-json:json-decode "{ \"weights\" : [ 0.5, 0.5 ] }"))
#<BOOST-JSON:JSON-OBJECT {"weights":#}>
[standard-object]

Slots with :INSTANCE allocation:
MEMBERS                        = (("weights" (0.5 0.5)))


It appears that you need to call (boost-json:json-getf obj keyword) in order to act as an accessor to the automagically built CLOS object:

(boost-json:json-getf
 (boost-json:json-decode "{ \"weights\" : [ 0.5, 0.5 ] }")
 "weights")
(0.5 0.5)


or, of course, you can use slot-value; just remember that the slot name will be interned in the boost-json package.
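Alternatively, the exported accessor json-object-members returns the underlying members alist that the describe output above showed in the MEMBERS slot:

```lisp
;; json-object-members is an exported accessor on boost-json:json-object;
;; it returns the alist stored in the MEMBERS slot.
(boost-json:json-object-members
 (boost-json:json-decode "{ \"weights\" : [ 0.5, 0.5 ] }"))
;; => (("weights" (0.5 0.5)))
```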

##### Nested Objects

Now let's look at a nested object. As a reminder, we will use the following parameter:

(defparameter *nested-address-1* "{
\"first_name\": \"George\",
\"last_name\": \"Washington\",
\"birthday\": \"1732-02-22\",
\"address\": {
\"street_address\": \"3200 Mount Vernon Memorial Highway\",
\"city\": \"Mount Vernon\",
\"state\": \"Virginia\",
\"country\": \"United States\"
}
}")


If we want to get the city from the CLOS object that boost-json created, it might look something like this:

(boost-json:json-getf
 (boost-json:json-getf
  (boost-json:json-decode *nested-address-1*)
  "address")
 "city")
"Mount Vernon"


#### Encoding

On the plus side, boost-json was one of two libraries which could encode a pathname.

On the neutral side, data types like char, local-time:timestamps and structs would require you to write an encoding method to handle them if you have them.

On the "choose your data structures carefully" side:

• Encoding hash-tables succeeds if the hash-table keys are strings, fails if they are symbols
• Encoding vectors produces invalid results, generally losing the first value in the array. Sample output on a simple nested array looked like: [,"Cork","Limerick"][,[,"Frankfurt","Munich"]]
• Boost-json will try to encode alists as json arrays of arrays. It will generate invalid json if the alist has dotted pairs.

(boost-json:json-encode '(("A" . 1) ("B" . 2) ("C" . 3)))
[["A",. 1],["B",. 2],["C",. 3]]


If the alist does not use dotted pairs, boost-json will give you arrays of arrays.

(boost-json:json-encode '(("A"  1) ("B"  2) ("C"  3)))
[["A",1],["B",2],["C",3]]

##### Encoding CLOS class instances

For boost-json to encode a CLOS class instance, you need to provide a new json-write method for that class unless it is a json-object class. In the decoding examples, we saw a json object "{\"weights\" : [ 0.5, 0.5 ] }" get decoded to a boost-json:json-object:

(boost-json:json-encode
(boost-json:json-decode "{ \"weights\" : [ 0.5, 0.5 ] }"))
{"weights":[0.5,0.5]}


If we tried that with our simple person class which is not a boost-json:json-object,

(boost-json:json-encode (make-instance 'person))
; Evaluation aborted on #<SB-PCL::NO-APPLICABLE-METHOD-ERROR {1005573E93}>.


You could use the existing methods in https://github.com/cl-boost/json/blob/main/encode.lisp or, for our very simple person class, something like the following could work:

(defmethod boost-json:json-write ((person person) &optional stream)
  (let ((accessors '(("name" name) ("eye_colour" eye-colour))))
    (write-char #\{ stream)
    (loop :for (key val) :in accessors
          :for first := t :then nil
          :unless first :do (write-char #\, stream)
          :do (boost-json::json-write key stream)
              (write-char #\: stream)
              (boost-json::json-write (funcall val person) stream))
    (write-char #\} stream)))


#### Symmetry

From the standpoint of symmetry or round-tripping, going from Json->CL->Json, it only fails on dealing with 'false'.

It had more problems going from CL->Json->CL. In some tests it triggered an error with an expected #\N, in others nil came back as null, and it could lose the first value in vectors.

#### Security

There were two general security issues we considered: (1) interning keywords and (2) issues with malformed data.

Boost-json did not exhibit the first issue in the same way that e.g. cl-json does. However, it does decode json objects to a CLOS object with slot-value names interned in the boost-json package.

With respect to the second issue, boost-json properly rejected 145 malformed test cases but accepted 28. Some of the accepted malformed test cases actually triggered stack exhaustion, for example by opening many levels of json arrays without closing them.

#### Conformity with Json Standard

Boost-json accepted 95 of the 95 test cases that are considered "must accept". If *read-default-float-format* is set to 'single-float, it would refuse to accept: [123e65], [123e45] and [123.456e78].

It accepted 13 of the 17 test cases considered to be part of the gray area of the json specification - you could accept or reject.

#### boost-json exported symbols

• json-decode - Convert a JSON string into a Lisp object.
• json-encode - Encodes a Lisp value into a stream.
• json-read - decode from a stream
• json-object
• json-object-members
• json-getf - Find a member's value in a JSON object.
• json-setf - Assign a value to a key in a JSON object.

### cl-json

cl-json Henrik Hjelte, Boris Smilga, Robert Goldman MIT https://github.com/hankhero/cl-json

cl-json is an old workhorse in this area; however, it has not been updated in the last seven years and is fairly slow, as you can tell from the benchmarks. It does not handle unicode surrogate pairs or json's null, and it is not as likely to be symmetric as some of the other libraries. Like most of the other libraries (except jsown and shasht), cl-json does not encode multi-dimensional arrays. As noted in the security section, cl-json interns keys into the keyword package, which can open you up to the equivalent of a DOS attack. cl-json does provide a mitigation function which will throw an error if the keyword to be interned does not already exist in the *json-symbols-package*.

(setf cl-json::*identifier-name-to-key* #'cl-json::safe-json-intern)
(cl-json:decode-json-from-string "{\"alpha-omega\": 1}")


On the plus side, cl-json probably has the best support for converting json data to clos objects and vice versa, but surprisingly does not handle structs. Local-time:timestamps are returned as json objects {"day":7990,"sec":0,"nsec":0} rather than as javascript date objects. Unlike many of the other libraries, it does handle symbols without you having to write a new method. It also supports incremental encoding.

With respect to conformity testing, cl-json was 95/95 for the json strings it must accept. It did not do so well rejecting malformed strings and, like most of the other libraries could exhaust the stack when facing certain types of malformed strings.

When you are looking at plists and alists, cl-json provides specific functions for dealing with those structures rather than attempting to guess how they should be translated into json.
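A short sketch of the structure-specific encoders (using the string-returning variants encode-json-plist-to-string and encode-json-alist-to-string):

```lisp
;; Encode a plist as a json object, keeping the key/value pairing:
(cl-json:encode-json-plist-to-string '(:a 1 :b 2))
;; => "{\"a\":1,\"b\":2}"

;; Encode a dotted alist as a json object:
(cl-json:encode-json-alist-to-string '((:a . 1) (:b . 2)))
;; => "{\"a\":1,\"b\":2}"
```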

Table 56: Cl-json Outstanding Issues
Symmetry violations                                         issue 22, issue 4
Json with duplicate keys is returned rather than flagged    issue 16
Does not handle unicode surrogate pairs                     issue 11
Does not handle null properly

#### Default Mapping

Please note the direction of the arrows in the following table.

Lisp   Json
integer <-> number with no frac or exp parts
float <-> number with frac or exp parts
rational -> number with frac or exp parts
ratio -> number with frac or exp parts
T <-> true
nil <-> null
nil <- false
other symbol -> string
character -> string
string <-> string
list (except alists) <-> array (1)
other sequences -> array
alist with dotted pairs <-> object (1)
hash-table -> object
standard object -> object
• (1) This is cl-json's default mode. Using cl-json:with-decoder-simple-clos-semantics or cl-json:set-decoder-simple-clos-semantics will switch cl-json into a mode where json arrays are decoded to cl vectors rather than lists, and json objects are decoded to CLOS objects rather than alists.

#### Decoding

Cl-json uses different functions to decode from a string (decode-json-from-string x) v. decoding from a stream (decode-json x).

It converts json's 'null' to NIL, which I disagree with. It also fails to deal with unicode surrogate pairs if you care about those.

Json objects are converted to alists with dotted pairs, which is unusual compared to the rest of the libraries.
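A short example of the default decoding: the json object becomes a dotted alist with keyword keys, and 'true', 'false' and 'null' become T, NIL and NIL respectively:

```lisp
;; Default list semantics: objects -> dotted alists, arrays -> lists.
(cl-json:decode-json-from-string "{\"a\": 1, \"b\": [true, false, null]}")
;; => ((:A . 1) (:B T NIL NIL))
```

Note that the dotted pair (:B . (T NIL NIL)) prints as (:B T NIL NIL), and that false and null have become indistinguishable.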

##### Json data to CLOS Object

While cl-json normally returns json objects as alists, you can tell it to return the json object as a cl-json:fluid-class CLOS object. You do need to at least temporarily change the decoder to use simple-clos-semantics and set *json-symbols-package* to nil. (It should be noted that since the decoder maintains a class registry, this is thread unsafe. According to the docs, if every incoming JSON Object is guaranteed to have a prototype with a "lispClass" member, then there are no fluid objects and thread safety is ensured. If the user wishes to employ fluid objects in a threaded environment, it is advisable to wrap the body of entry-point functions in with-local-class-registry.)

(cl-json:set-decoder-simple-clos-semantics)


You can reset the decoder back to lists with the function:

(set-decoder-simple-list-semantics)


This example temporarily changes the cl-json decoder semantics so that it creates a cl-json:fluid-class instance; then we can get the birthday slot value of that instance. As a reminder, the two sample parameters are:

*address-1*
"{
\"name\": \"George Washington\",
\"birthday\": \"February 22, 1732\",
\"address\": \"Mount Vernon, Virginia, United States\"
}"

*nested-address-1*
"{
\"first_name\": \"George\",
\"last_name\": \"Washington\",
\"birthday\": \"1732-02-22\",
\"address\": {
\"street_address\": \"3200 Mount Vernon Memorial Highway\",
\"city\": \"Mount Vernon\",
\"state\": \"Virginia\",
\"country\": \"United States\"
}
}"


First, looking at the simpler version, notice you need to specify the slots in the fluid-class object:

(cl-json:with-decoder-simple-clos-semantics
  (setf cl-json:*json-symbols-package* nil)
  (let ((x (cl-json:decode-json-from-string *nested-address-1*)))
    (with-slots (birthday) x
      birthday)))
"1732-02-22"


Just to check something, let's describe that instance of the fluid-class:

(cl-json:with-decoder-simple-clos-semantics
  (setf cl-json:*json-symbols-package* nil)
  (let ((x (cl-json:decode-json-from-string *address-1*)))
    (describe x)))

<#<JSON:FLUID-CLASS COMMON-LISP:NIL {100319AEA3}> {1003358763}>
[standard-object]

Slots with :INSTANCE allocation:
NAME                           = "George Washington"
BIRTHDAY                       = "February 22, 1732"
ADDRESS                        = "Mount Vernon, Virginia, United States"


Now looking at the nested version, we need to note that by default cl-json will convert the underscores in the json keys to double hyphens in the slot names.

(cl-json:with-decoder-simple-clos-semantics
  (setf cl-json:*json-symbols-package* nil)
  (let ((x (cl-json:decode-json-from-string *nested-address-1*)))
    (with-slots (first--name last--name birthday address) x
      (values x first--name last--name birthday address))))
#<#<JSON:FLUID-CLASS COMMON-LISP:NIL {100F765713}> {10107CF9E3}>
"George"
"Washington"
"1732-02-22"
#<#<JSON:FLUID-CLASS COMMON-LISP:NIL {100F765713}> {10107CF6F3}>


Because we have a nested class, we would need to drill down and specify the slots for the sub-object as well:

(cl-json:with-decoder-simple-clos-semantics
  (setf cl-json:*json-symbols-package* nil)
  (let ((x (cl-json:decode-json-from-string *nested-address-1*)))
    (with-slots (first--name last--name birthday address) x
      (with-slots (city) address
        (values x first--name last--name birthday address city)))))
#<#<JSON:FLUID-CLASS COMMON-LISP:NIL {100F765713}> {1010E69B93}>
"George"
"Washington"
"1732-02-22"
#<#<JSON:FLUID-CLASS COMMON-LISP:NIL {100F765713}> {1010E698A3}>
"Mount Vernon"


We can also use slot values to get the info, but before we do that, let's use the nested-address sample data and just describe the object instance.

(cl-json:with-decoder-simple-clos-semantics
  (setf cl-json:*json-symbols-package* nil)
  (let ((x (cl-json:decode-json-from-string *nested-address-1*)))
    (describe x)))

#<#<JSON:FLUID-CLASS COMMON-LISP:NIL {100319AEA3}> {10039F9CC3}>
[standard-object]

Slots with :INSTANCE allocation:
NAME                           = #<unbound slot>
BIRTHDAY                       = "1732-02-22"
ADDRESS                        = #<#<JSON:FLUID-CLASS COMMON-LISP:NIL {100319AEA3}> {100388FC13}>
CITY                           = #<unbound slot>
STATE                          = #<unbound slot>
COUNTRY                        = #<unbound slot>
FIRST--NAME                    = "George"
LAST--NAME                     = "Washington"


Ok, this surprised me. The fluid-class is showing all the slots it created from *address-1* as well as the slots it created from *nested-address-1*. We also see that the keys "first_name", "last_name" and "street_address" acquire double hyphens when they become slot names, and that the fluid class created slots for the embedded address object.

So, just to demonstrate using slot-value to get the data from a fluid object:

(cl-json:with-decoder-simple-clos-semantics
  (setf cl-json:*json-symbols-package* nil)
  (let ((x (cl-json:decode-json-from-string *nested-address-1*)))
    (slot-value x 'first--name)))
"George"


Now suppose we want to go into the nested address and get just the city. For that we need to descend into the nested cl-json:fluid-class object and then access its city slot value:

(cl-json:with-decoder-simple-clos-semantics
  (setf cl-json:*json-symbols-package* nil)
  (let ((x (cl-json:decode-json-from-string *nested-address-1*)))
    (with-slots (first--name last--name birthday address) x
      (with-slots (city) address
        city))))
"Mount Vernon"


Or, using the slot-value approach:

(cl-json:with-decoder-simple-clos-semantics
  (setf cl-json:*json-symbols-package* nil)
  (let ((x (cl-json:decode-json-from-string *nested-address-1*)))
    (slot-value (slot-value x 'address) 'city)))
"Mount Vernon"


#### Encoding

Basic encoding functionality is provided by the generic function encode-json, which can be customised with an entire series of macros listed in the documentation. cl-json's basic encoding function returns an object when handed a dotted alist and returns an array when handed a plist. When handed a list of plists or a list of alists, encode-json will return an array, but the list of plists returns an array of arrays while the list of alists returns an array of objects.

cl-json provides a function, cl-json:encode-json-alist, to encode an alist as key:value pairs, but my samples do not show any difference between cl-json:encode-json and cl-json:encode-json-alist. Either will encode a dotted alist as a json object with key:value pairs and a proper-list alist as an array of arrays.

(cl-json:encode-json '(("A" . 1) ("B" . 2) ("C" . 3)))
{"A":1,"B":2,"C":3}

(cl-json:encode-json '(("A" 1) ("B" 2) ("C" 3)))
[["A",1],["B",2],["C",3]]


On the plus side,

• cl-json was the only library to handle encoding char out of the box.

On the "be careful" side:

• As you might expect, plists are treated the same as plain lists and will lose their key-value connections. If you want to keep the key-value connections, you can either convert the list to an alist or hash-table or use the cl-json:encode-json-plist function.

On the "slightly additional work" side:

• Encoding structures, pathnames and timestamps would require writing a specialized method

On the not-so-plus side:

• It encodes nil as null. You can use the helper library cl-json-helper to encode nil as "false".

##### Incremental Encoding

The following examples use two exercises. First, incrementally build a json array. Second, incrementally build a json object which also contains an incrementally built json array.

(cl-json:with-array ()
  (dotimes (i 3)
    (cl-json:encode-array-member i)))
[0,1,2]


Now the second:

(cl-json:with-object ()
  (cl-json:encode-object-member "hello" "hu hu")
  (cl-json:as-object-member ("harr")
    (cl-json:with-array ()
      (dotimes (i 3)
        (cl-json:encode-array-member i)))))
{"hello":"hu hu","harr":[0,1,2]}


#### Symmetry

From the standpoint of symmetry or round-tripping, going from Json->CL->Json, it only fails on dealing with false.

Similarly, going from CL->Json->CL, you would need to deal with the fact that :NULL got converted to "null".

#### Security

There were two general security issues we considered: (1) interning keywords and (2) issues with malformed data.

With respect to the first issue, cl-json flags the issue and provides the function safe-json-intern, which will throw an error if the keyword to be interned does not already exist in the *json-symbols-package*, so at least you are warned and provided with an alternative.

With respect to the second issue, cl-json properly rejected 153 malformed test cases but accepted 20. Some of the accepted malformed test cases actually triggered stack exhaustion, for example by opening many levels of json arrays without closing them.

#### Conformity with Json Standard

cl-json accepted 95 of the 95 test cases that are considered "must accept". If *read-default-float-format* is set to 'single-float, it would refuse to accept: [123e65], [123e45] and [123.456e78].

It accepted 13 of the 17 test cases considered to be part of the gray area of the json specification - you could accept or reject.

#### Miscellaneous Information

1. Error Conditions: Cl-json has several error conditions, some of them recoverable and some not. These include "unrecoverable-value-error", "json-syntax-error", "no-char-for-code", "cell-error", "type-error", errors for calling functions in the wrong environment, and others. Please read the user manual for more details.
2. Cl-json has a converter from camel case to "lisp" (i.e., kebab case) and back again.
3. Cl-json has a lot of other capabilities. The documentation is excellent and you should seriously consider the security considerations section of the user manual if you are going to be decoding uncontrolled JSON objects.
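A quick sketch of the case converters mentioned in item 2 (per the docstrings, camel-case-to-lisp produces a Lisp-style hyphenated string and lisp-to-camel-case goes the other way):

```lisp
;; From a json-style camelCase key to a Lisp-style hyphenated name:
(cl-json:camel-case-to-lisp "camelCaseKey")

;; And back again from a Lisp-style name to a camelCase string:
(cl-json:lisp-to-camel-case "CAMEL-CASE-KEY")
```

These are the same conversions the decoder and encoder apply to keys by default, via *json-identifier-name-to-lisp* and *lisp-identifier-name-to-json*.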

#### cl-json exported symbols

• *aggregate-scope-variables*
• *array-member-handler*
• *array-scope-variables*
• *beginning-of-array-handler*
• *beginning-of-object-handler*
• *beginning-of-string-handler*
• *boolean-handler*
• *end-of-array-handler*
• *end-of-object-handler*
• *end-of-string-handler*
• *identifier-name-to-key* - Designator for a function which, during decoding, maps the *json-identifier-name-to-lisp* -transformed key to the value it will have in the result object.
• *integer-handler*
• *internal-decoder*
• *json-array-type*
• *json-identifier-name-to-lisp* - Designator for a function which maps string (a JSON Object key) to string (name of a Lisp symbol).
• *json-input* - The default input stream for decoding operations.
• *json-output* - The default output stream for encoding operations.
• *json-symbols-package* - The package where JSON Object keys etc. are interned. Default keyword, nil = use current package.
• *lisp-identifier-name-to-json* - Designator for a function which maps string (name of a Lisp symbol) to string (e. g. JSON Object key).
• *object-key-handler*
• *object-scope-variables*
• *object-value-handler*
• *prototype-name*
• *real-handler*
• *string-char-handler*
• *string-scope-variables*
• *use-strict-json-rules* - If non-nil, signal error on unrecognized escape sequences in JSON Strings. If nil, translate any such sequence to the char after slash.
• as-array-member - BODY should be a program which encodes exactly one JSON datum to STREAM. AS-ARRAY-MEMBER ensures that the datum is properly formatted as a Member of an Array, i. e. separated by comma from any preceding or following Member.
• as-object-member - BODY should be a program which writes exactly one JSON datum to STREAM. AS-OBJECT-MEMBER ensures that the datum is properly formatted as a Member of an Object, i. e. preceded by the (encoded) KEY and colon, and separated by comma from any preceding or following Member.
• bignumber-string
• bind-custom-vars
• camel-case-to-lisp - Take a camel-case string and convert it into a string with Lisp-style hyphenation.
• clear-class-registry - Reset the *CLASS-REGISTRY* to NIL.
• current-decoder - Capture current values of custom variables and return a custom decoder which restores these values in its dynamic environment.
• custom-decoder - Return a function which is like DECODE-JSON called in a dynamic environment with the given CUSTOMIZATIONS.
• decode-json - Read a JSON Value from STREAM and return the corresponding Lisp value.
• decode-json-from-source - Decode a JSON Value from source using the value of decoder (default 'decode-json) as decoder function. If the source is a string, the input is from this string; if it is a pathname, the input is from the file that it names; otherwise, a stream is expected as source.
• decode-json-from-string - Read a JSON Value from json-string and return the corresponding Lisp value.
• decode-json-strict - Same as decode-json, but allow only Objects or Arrays on the top level, no junk afterwards.
• encode-array-member - Encode OBJECT as the next Member of the innermost JSON Array opened with WITH-ARRAY in the dynamic context. OBJECT is encoded using the ENCODE-JSON generic function, so it must be of a type for which an ENCODE-JSON method is defined.
• encode-json - Write a JSON representation of OBJECT to STREAM and return NIL.
• encode-json-alist - Write the JSON representation (Object) of alist to stream (or to json-output). Return nil.
• encode-json-alist-to-string - Return the JSON representation (Object) of alist as a string
• encode-json-plist - Write the JSON representation (Object) of plist to stream (or to json-output). Return nil.
• encode-json-plist-to-string - Return the JSON representation (Object) of plist as a string.
• encode-json-to-string - Return the JSON representation of object as a string.
• encode-object-member - Encode KEY and VALUE as a Member pair of the innermost JSON Object opened with WITH-OBJECT in the dynamic context. KEY and VALUE are encoded using the ENCODE-JSON generic function, so they both must be of a type for which an ENCODE-JSON method is defined. If KEY does not encode to a String, its JSON representation (as a string) is encoded over again.
• fluid-class - A class to whose instances arbitrary new slots may be added on the fly.
• fluid-object
• json-bind
• json-bool - Intended for the JSON-EXPLICIT-ENCODER. Converts a non-nil value to (:true), which produces a json true value when used in the explicit encoder; a nil value converts to (:false).
• json-decode
• json-encode
• json-getf
• json-intern - Intern STRING in the current *JSON-SYMBOLS-PACKAGE*.
• json-object
• json-object-members
• json-or-null - Intended for the JSON-EXPLICIT-ENCODER. Returns a non-nil value as itself, or a nil value as a json null value.
• json-setf
• json-syntax-error - Signal a JSON-SYNTAX-ERROR condition
• lisp-to-camel-case - Take a string with Lisp-style hyphenation and convert it to camel case. This is an inverse of CAMEL-CASE-TO-LISP.
• make-object - If CLASS is not NIL, create an instance of that class. Otherwise, create a fluid object whose class has the given SUPERCLASSES (null list by default). In either case, populate the resulting object using BINDINGS (an alist of slot names and values).
• make-object-prototype - Return a PROTOTYPE describing the OBJECT's class or superclasses, and the package into which the names of the class / superclasses and of the OBJECT's slots are to be interned.
• no-char-for-code
• pass-code
• placeholder
• prototype - A PROTOTYPE contains metadata for an object's class in a format easily serializable to JSON: either the name of the class as a string or (if it is anonymous) the names of the superclasses as a list of strings; and the name of the Lisp package into which the names of the class's slots and the name of the class / superclasses are to be interned.
• rational-approximation
• safe-json-intern - The default json-intern is not safe. Interns of many unique symbols could potentially use a lot of memory. An attack could exploit this by submitting something that is passed through cl-json that has many very large, unique symbols. This version is safe in that respect because it only allows symbols that already exist.
• set-custom-vars
• set-decoder-simple-clos-semantics - Set the decoder semantics to the following:
  • Strings and Numbers are decoded naturally, reals becoming floats.
  • The literal name true is decoded to T, false and null to NIL.
  • Arrays are decoded to sequences of the type *JSON-ARRAY-TYPE*.
  • Objects are decoded to CLOS objects. Object keys are converted by the function *JSON-IDENTIFIER-NAME-TO-LISP*. If a JSON Object has a field whose key matches *PROTOTYPE-NAME*, the class of the CLOS object and the package wherein to intern slot names are inferred from the corresponding value, which must be a valid prototype. Otherwise, a FLUID-OBJECT is constructed whose slot names are interned in *JSON-SYMBOLS-PACKAGE*.
• set-decoder-simple-list-semantics - Set the decoder semantics to the following:
  • Strings and Numbers are decoded naturally, reals becoming floats.
  • The literal name true is decoded to T, false and null to NIL.
  • Arrays are decoded to sequences of the type *JSON-ARRAY-TYPE*.
  • Objects are decoded to alists. Object keys are converted by the function *JSON-IDENTIFIER-NAME-TO-LISP* and then interned in the package *JSON-SYMBOLS-PACKAGE*.
• simplified-camel-case-to-lisp - Insert - between lowercase and uppercase chars. Ignore _ + * and several consecutive uppercase.
• stream-array-member-encoder - Return a function which takes an argument and encodes it to STREAM as a Member of an Array. The encoding function is taken from the value of ENCODER (default is #'ENCODE-JSON).
• stream-object-member-encoder - Return a function which takes two arguments and encodes them to STREAM as a Member of an Object (String : Value pair)
• substitute-char
• substitute-printed-representation
• unencodable-value-error - Signal an UNENCODABLE-VALUE-ERROR
• unknown-symbol-error
• use-explicit-encoder
• use-guessing-encoder
• with-array - Open a JSON Array, run BODY, then close the Array. Inside the BODY, AS-ARRAY-MEMBER or ENCODE-ARRAY-MEMBER should be called to encode Members of the Array.
• with-custom-decoder-level - Execute BODY in a dynamic environment such that, when nested structures are decoded, the outermost level is decoded with the given custom handlers (CUSTOMIZATIONS) whereas inner levels are decoded in the usual way
• with-decoder-simple-clos-semantics - Execute BODY in a dynamic environment where the decoder semantics is such as set by SET-DECODER-SIMPLE-CLOS-SEMANTICS.
• with-decoder-simple-list-semantics - Execute BODY in a dynamic environment where the decoder semantics is such as set by SET-DECODER-SIMPLE-LIST-SEMANTICS.
• with-explicit-encoder
• with-guessing-encoder
• with-local-class-registry - Run BODY in a dynamic environment where *CLASS-REGISTRY* is a temporary local list. If :INHERIT is non-null, the local registry shall initially have the same content as the exterior *CLASS-REGISTRY*, otherwise it shall be NIL.
• with-object - Open a JSON Object, run BODY, then close the Object. Inside the BODY, AS-OBJECT-MEMBER or ENCODE-OBJECT-MEMBER should be called to encode Members of the Object.
• with-substitute-printed-representation-restart - Establish a SUBSTITUTE-PRINTED-REPRESENTATION restart for OBJECT and execute BODY.
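As a quick orientation to the symbols above, here is a minimal round trip using cl-json's default list semantics (decode-json-from-string and encode-json-to-string):

```lisp
;; Decoding: object keys are run through *json-identifier-name-to-lisp*
;; (camel case to kebab case) and interned per *json-symbols-package*.
(cl-json:decode-json-from-string "{\"firstName\":\"George\"}")
;; => ((:FIRST-NAME . "George"))

;; Encoding: the alist goes back out with keys camel-cased again.
(cl-json:encode-json-to-string '((:first-name . "George")))
;; => "{\"firstName\":\"George\"}"
```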

### com.gigamonkeys.json

com.gigamonkeys.json Peter Seibel BSD-3 https://github.com/gigamonkey/monkeylib-json

Com.gigamonkeys.json is one of the oldest libraries. In fact the author commented that he had forgotten he had written a json library. It is included for completeness, but I think the newer libraries have passed it by.

#### Default Mapping

Please note the direction of the arrows.

Lisp   Json
integer <-> number with no frac or exp parts
float <-> number with frac or exp parts
rational -> number with frac or exp parts
ratio -> number with frac or exp parts
T <-> true
nil <-> {}
:FALSE <- false
:NULL <- null
other symbol -> string (only keywords allowed)
character -> Hangs
string <-> string
list (except alists) <-> object (will force plist key:values)
vector <-> array
alist -> Error
hash-table -> object
standard object -> object

#### Decoding

Com.gigamonkeys.json only takes strings as inputs. It does not handle unicode surrogate pairs. On the plus side, it handles NULL issues correctly.

As noted in the mapping table above, json objects will be decoded as plists (or nested plists). Consider the following example of decoding a nested json object:

(com.gigamonkeys.json:parse-json "{
\"first_name\": \"George\",
\"last_name\": \"Washington\",
\"birthday\": \"1732-02-22\",
\"address\": {
\"street_address\": \"3200 Mount Vernon Memorial Highway\",
\"city\": \"Mount Vernon\",
\"state\": \"Virginia\",
\"country\": \"United States\"
}
}")

("first_name" "George" "last_name" "Washington" "birthday" "1732-02-22" "address"
("street_address" "3200 Mount Vernon Memorial Highway" "city" "Mount Vernon"
"state" "Virginia" "country" "United States"))


Arrays are decoded to vectors.

#### Encoding

There were a few surprises looking at what com.gigamonkeys.json would encode and not encode.

• It encodes keyword symbols, but not other symbols.
• It actually hangs on encoding chars and CLOS objects, which I found strange. Most of the other libraries generated errors.
• Attempting to encode a pathname also results in hanging
• nil encodes to {} (yes, empty object)
• Like some other libraries, it cannot deal with alists directly. Consider using alexandria:alist-hash-table to convert the alist to a hash table.
• On the other hand, in a reversal from some of the other libraries, com.gigamonkeys.json will assume a plain list is a plist, returning a json object with key-value pairs. If the length of the list is odd, the final value in the list will be treated as a key and an empty object will be inserted as the value.
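For example, the alexandria:alist-hash-table workaround for alists mentioned above (a sketch; key ordering in the output may vary):

```lisp
;; com.gigamonkeys.json cannot encode an alist directly, but it can
;; encode the equivalent hash table:
(com.gigamonkeys.json:json
 (alexandria:alist-hash-table '(("a" . 1) ("b" . 2)) :test #'equal))
;; => a json object string such as "{\"a\":1,\"b\":2}"
```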

Let's take a closer look at the encoding functions. As noted previously, com.gigamonkeys.json will assume a plain list is a plist, returning a json object with key-value pairs. In the examples below, that means that since it is given a three-element list, the third element is assumed to be the second key in an object and its value is then an empty object.

Table 57: Encoding Functions applied to a list
Function Input Result
json '(1 2 3) "{\"1\":2,\"3\":{}}"
to-json '(1 2 3) (1 2 3)
write-json '(1 2 3) {"1":2,"3":{}}
Table 58: Encoding Functions applied to an array
Function Input Result
json #(1 2 3) "[1,2,3]"
to-json #(1 2 3) #(1 2 3)
write-json #(1 2 3) [1,2,3]

#### Symmetry

From the standpoint of symmetry or round-tripping, going from Json->CL->Json, it did not have any issues.

As you might expect given how it treats plain lists as plists, going from CL->Json->CL resulted in failure for tests starting with lists. If you started from an array, nil turned into {}, and floats could come back with more decimal places than they started with.

#### Security

There were two general security issues we considered: (1) interning keywords and (2) issues with malformed data. com.gigamonkeys.json did not exhibit the first issue.

With respect to the second issue, com.gigamonkeys.json accepted many malformed test cases which triggered stack exhaustion by opening too many levels of json open arrays and not closing them or similar types of issues.

#### Conformity with Json Standard

com.gigamonkeys.json accepted all 95 test cases that are considered "must accept".

It accepted 14 of the 17 test cases considered to be part of the gray area of the json specification - you could accept or reject.

#### com.gigamonkeys.json exported symbols

• *object-type*
• json - The top-level function for converting Lisp objects into a string in the JSON format. It can convert any object that can be converted to a json-exp via the to-json generic function.
• json-stringify - Convert object directly to a json representation as a string. Default methods are provided for strings, symbols (which must be keywords), and numbers but there may be situations where it is appropriate to define new methods on this function. In general, however, it is probably better to define a method on to-json to convert the object to a sexp that can be rendered as json.
• parse-json - Parse json text into Lisp objects. Hash tables are used to represent Javascript objects and vectors to represent arrays.
• to-json - Generic function that can convert an arbitrary Lisp object to a json-exp, i.e. a sexp that can then be rendered as json. To make an arbitrary class convertable to JSON, add a method to this generic function that generates a json-exp.
• write-json - Write data to stream in json format.

### com.inuoe.jzon

com.inuoe.jzon Wilfredo Velázquez-Rodríguez MIT https://github.com/Zulu-Inuoe/jzon Not in quicklisp

Com.inuoe.jzon is my new overall favorite. Unfortunately it is not in quicklisp (if that matters to you). It is fast, it handles null correctly, and it encodes all kinds of lists, CLOS objects, structures and hash-tables. The one improvement I would request is the ability to do incremental encoding, and that is being worked on.

#### Default Mapping

##### Type Mapping Json -> CL
json CL
true t
false nil
null null
number integer or double float
string simple-string
array simple-vector
object hash-table using equal as the test function
##### Type Mapping CL -> Json

Using the stringify function, com.inuoe.jzon will map the following CL data types.

CL Json
symbol string (generally downcased unless they contain mixed case characters)
numbers number
alist object
plist object
list or sequence array
CLOS objects object (using bound slots as keys)
structures object

Com.inuoe.jzon looks at lists/alists/plists and tries to determine what is an alist or plist by looking at the key values.

#### Decoding

##### Acceptable Input for Parse

com.inuoe.jzon will accept strings, octets in utf8, streams (character or binary in utf8), or a pathname (at which point the library will assume you want it to open a file for reading).

The README also notes that (parse …) accepts the following keyword arguments:

• :maximum-depth This controls the maximum depth to allow arrays/objects to nest. Can be a positive integer, or nil to disable depth tests. This turned out to be important when dealing with deliberately malformed incoming json data because the library was able to call a halt well before the stack was exhausted.
• :key-fn A function of one argument responsible for 'interning' object keys. It should accept a simple-string and return the 'interned' key. The :key-fn parameter could be #'alexandria:make-keyword if you wanted to make object keys into symbols (but this is a bad practice from a security standpoint if the incoming json data is uncontrolled).
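A short sketch of the two keyword arguments described above:

```lisp
;; :maximum-depth guards against hostile, deeply nested input by
;; signaling an error instead of exhausting the stack.
(com.inuoe.jzon:parse "[[[[1]]]]" :maximum-depth 10)

;; :key-fn controls how object keys are produced; interning them as
;; keywords is convenient but risky for uncontrolled input.
(com.inuoe.jzon:parse "{\"a\": 1}" :key-fn #'alexandria:make-keyword)
;; => a hash-table whose key is the keyword :|a|
```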

#### Encoding

com.inuoe.jzon can encode to either a stream or string. E.g.:

(com.inuoe.jzon:stringify '("A" "b" 4 3.2 9/4) :stream *standard-output*)
["A","b",4,3.2,2.25]

(com.inuoe.jzon:stringify '("A" "b" 4 3.2 9/4))
"[\"A\",\"b\",4,3.2,2.25]"


The stringify function also accepts the following keyword arguments:

• :pretty If true, output pretty-formatted JSON.
• :coerce-element A function for coercing 'non-native' values to JSON.
• :coerce-key A function for coercing key values to strings.
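A sketch of these options in use (the :coerce-key lambda here is my own example, not a library default):

```lisp
;; :pretty produces indented, human-readable output.
(com.inuoe.jzon:stringify #(1 2 3) :pretty t)

;; :coerce-key normalizes keys before they are written; here we
;; downcase symbol keys.
(com.inuoe.jzon:stringify (alexandria:plist-hash-table '(:name "George"))
                          :coerce-key (lambda (k) (string-downcase (string k))))
;; => "{\"name\":\"George\"}"
```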

On the plus side,

• It automatically handles standard CLOS objects and also allows you to specialize.
• It is one of only two libraries which can encode a structure

On the "pay attention to your data structures" side: com.inuoe.jzon:stringify will encode an alist as a json object, using heuristics to guess which data structure it has been given. Consider the difference between giving it an alist with dotted pairs and an alist without dotted pairs:

(com.inuoe.jzon:stringify '(("A" . 1) ("B" . 2) ("C" . 3)) :stream *standard-output*)
{"A":1,"B":2,"C":3}

(com.inuoe.jzon:stringify '(("A" 1) ("B" 2) ("C" 3)) :stream *standard-output*)
{"A":[1],"B":[2],"C":[3]}


Both are returned as json objects, but in the case of the undotted pairs, each value is embedded in its own array. It is pretty good at correctly guessing when something is an alist which should be encoded as key:value. Consider the following:

(com.inuoe.jzon:stringify '(("foo" . "bar") ("baz" . ((1 2 3) (4 5 6))))
:stream *standard-output*)
{"foo":"bar","baz":[[1,2,3],[4,5,6]]}


and this particular sample works because it does not try to make an integer into a key. However, if you change the sample data slightly so that the first value in each list in the embedded alist is a string, it will predict that "A" and "B" should be keys. The next two examples show the differences between its interpretation depending on whether you provided dotted pairs or not:

  (com.inuoe.jzon:stringify '(("foo" . "bar") ("baz" . (("A" 2 3) ("B" 5 6))))
:stream *standard-output*)
{"foo":"bar","baz":{"A":[2,3],"B":[5,6]}}

(com.inuoe.jzon:stringify '(("foo"  "bar") ("baz" (("A" 2 3) ("B" 5 6))))
:stream *standard-output*)
{"foo":["bar"],"baz":[{"A":[2,3],"B":[5,6]}]}


This may or may not have been what you wanted.

On the downside,

• Attempting to encode a pathname resulted in an empty json object.

##### Specialized Serialization

com.inuoe.jzon allows you to specialize the coerced-fields method to handle CL data types not included in the above list, including excluding, renaming and adding fields. See the README for examples.

##### Incremental Encoding

jzon incremental encoding is actively being worked on.

#### Symmetry

From the standpoint of symmetry or round-tripping, going from Json->CL->Json, it did not have any issues. It struggled a bit more with respect to going from CL->Json->CL. The first test had a starting point of:

((:NAME "George Washington") (:BIRTHDAY "February 22, 1732")
(:ADDRESS "Mount Vernon, Virginia, United States"))


and a result (adjusted to get the elements out of the hash-table) of:

(("address" . #("Mount Vernon, Virginia, United States"))
("birthday" . #("February 22, 1732")) ("name" . #("George Washington")))


So we went from undotted alist to dotted alist where each value was an array instead of a string. The second test started with an array instead of an alist and com.inuoe.jzon handled it with the one caveat that :NULL turned into "NULL".

#### Security

There were two general security issues we considered: (1) interning keywords and (2) issues with malformed data. com.inuoe.jzon did not exhibit the first issue.

With respect to the second issue, com.inuoe.jzon rejected all the malformed test cases. It was one of four packages which allowed you to limit the depths to which it would dive in nested json objects and, therefore, did not trigger the stack exhaustion exhibited by most of the other packages.

#### Conformity with Json Standard

com.inuoe.jzon accepted all 95 test cases that are considered "must accept".

It accepted 7 of the 17 test cases considered to be part of the gray area of the json specification - you could accept or reject.

#### com.inuoe.jzon exported symbols

• coerce-element - Coerce "element" into a "json-element", using "coerce-key" in cases the result is a hash-table.
• coerce-key - Coerce "key" into a string designator, or "nil" if "key" is an unsuitable key.
• coerced-fields - Return a list of key definitions for "element". A key definition is a three-element list of the form (name value type). Name is the key name and will be coerced if not already a string. Value is the value, and will be coerced if not a json-element. Type is a type for the key, in order to handle ambiguous "nil" interpretations.
• json-atom - a type definition including t, nil, null, real and string
• json-element - a type definition including json-atoms, vectors and hash-tables
• json-eof-error - a json-parse-error
• json-error - a simple condition
• json-parse-error - a json-error condition
• parse - Read a JSON value from 'in', which may be a vector, a stream, or a pathname. Keyword parameters: :maximum-depth controls the maximum depth of the object/array nesting. :allow-comments controls whether or not single-line // comments are allowed. :key-fn is a function of one value which 'pools' object keys, or null for the default pool.
• stringify - Serialize "element" into JSON. Returns a fresh string if "stream" is nil, nil otherwise. ":stream" like the "destination" in "format" ":pretty" if true, pretty-format the output ":coerce-element" is a function of two arguments, and is used to coerce an unknown value to a "json-element" ":coerce-key" is a function of one argument, and is used to coerce object keys into non-nil string designators. See "coerce-element" and "coerce-key".

### jonathan

jonathan Rudolph Miller MIT https://github.com/Rudolph-Miller/jonathan

While jsown is the winner in the decoding speed stakes, Jonathan can be faster or orders of magnitude slower depending on the amount of json data and its structure. Jonathan is among the leaders in encoding speed along with st-json, com.inuoe.jzon, and com.gigamonkeys.json. However, speed is not everything and there are a few concerning issues. It has optimize set for safety 0 to increase speed. This has bitten some users hard in the past, so you have been warned.

#### Default Mapping

Please note the direction of the arrows in the following table.

Lisp   Json
integer <-> number with no frac or exp parts
float <-> number with frac or exp parts
rational -> number with frac or exp parts
ratio -> number with frac or exp parts
T <-> true
nil <-> []
nil <- false
nil <- null
other symbol -> string
character -> Error
string <-> string
list (except alists) <-> array
vector -> array
alist w/o dotted pairs -> array of arrays
alist with dotted pairs -> Error
hash-table -> object
plist <- object (1)
standard object -> Error
• (1) Jonathan can parse a json object into plists (the default), alists, hash-tables or a "json object" by passing different keyword parameters to the parse function.

#### Decoding

##### Overview

Jonathan decodes strings, not streams. In testing, jonathan hung when trying to decode "[123e-10000000]". In its default settings, jonathan will decode a json null as nil, but if you set jonathan:*null-value* to :null, it will decode a json null properly as :null.
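To illustrate, binding *null-value* preserves the null distinction during parsing (a sketch):

```lisp
;; Default: json null collapses to nil.
(jonathan:parse "{\"a\":null}")
;; => (:|a| NIL)

;; With *null-value* bound, null survives as :null.
(let ((jonathan:*null-value* :null))
  (jonathan:parse "{\"a\":null}"))
;; => (:|a| :NULL)
```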

Jonathan parses a json object by default into plists, usually in reversed order. You can change the resulting data structure by passing :as :XXX to the parse function. E.g.

(jonathan:parse "{\"a\":1,\"b\":2}")
(:|b| 2 :|a| 1)

(jonathan:parse "{\"a\":1,\"b\":2}" :as :alist)
(("b" . 2) ("a" . 1))

(jonathan:parse *address-1* :as :jsown)
(:OBJ ("address" . "Mount Vernon, Virginia, United States")
("birthday" . "February 22, 1732") ("name" . "George Washington"))


The result is a cons cell format that appears to have originated with the jsown library.

##### Unicode

Jonathan can parse unicode and escaped unicode characters and can return unicode characters or escaped unicode characters.

(jonathan:parse "\"\\u30b8\\u30e7\\u30ca\\u30b5\\u30f3\"")
"ジョナサン"

(jonathan:parse "\"\\u30b8\\u30e7\\u30ca\\u30b5\\u30f3\""
:unescape-unicode-escape-sequence nil)
"u30b8u30e7u30cau30b5u30f3"

(jonathan:parse "{\"name\": \"ジョナサン\"}")
(:|name| "ジョナサン")

##### Nested Json Objects (Filters and Subsets)

Jonathan can easily extract a subset of data from the first level of a nested object, but you need to write a recursion method if you need to extract a subset of nested data that is deeper in the json object.

(jonathan:parse *nested-address-1* :keywords-to-read '("first_name"))
(:|first_name| "George")

(jonathan:parse *nested-address-1* :keywords-to-read '("address"))
(:|address|
(:|country| "United States" :|state| "Virginia" :|city| "Mount Vernon"
:|street_address| "3200 Mount Vernon Memorial Highway"))

(jonathan:parse *nested-address-1* :keywords-to-read '("city"))
nil

##### Other Parsing Options

According to the README, jonathan's parser:

• can allow junked JSON format strings (:junk-allowed t)
• can customize null-value, false-value and empty-array-value
• can normalize keywords (:keyword-normalizer)
• cannot normalize keywords in nested objects
• can ignore keywords when the normalizer returns NIL
• can unescape unicode escape sequences (:unescape-unicode-escape-sequence)

#### Encoding

The basic encoding function is jonathan:to-json. Jonathan can return either a string or octets.

(jonathan:to-json '(:name "Common Lisp" :born 1984 :impls (SBCL KCL))
:octets t)
#(123 34 78 65 77 69 34 58 34 67 111 109 109 111 110 32 76 105 115 112 34 44 34
66 79 82 78 34 58 49 57 56 52 44 34 73 77 80 76 83 34 58 91 34 83 66 67 76 34
44 34 75 67 76 34 93 125)


When encoding alists, plists or jsown "objects", jonathan requires extra keyword parameters such as :from :alist (or :plist, :jsown):

(jonathan:to-json '((a 1) (b 2)) :from :alist)
"{\"A\":[1],\"B\":[2]}"


While to-json can handle a plist without any additional parameters, it will throw an error, without warning, if handed an alist. This is resolved by adding the keyword parameters :from :alist:

(jonathan:to-json '((A . 1) (B . 2) (C . 3)) :from :alist)
"{\"A\":1,\"B\":2,\"C\":3}"


Jonathan expects simple-strings, so if your data source does not produce simple strings, you may have to massage the input to get there. Similarly, attempting to encode a char or a pathname will trigger an error. You should be able to write methods that handle them. The same situation arises with respect to encoding structs.
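For example, a hedged sketch of such a method for pathnames (not part of jonathan itself); it simply reuses the string encoder:

```lisp
;; Encode pathnames as their namestrings; jonathan has no built-in
;; handling for them and would otherwise signal an error.
(defmethod jonathan:%to-json ((path pathname))
  (jonathan:%to-json (namestring path)))

;; Afterwards, a pathname inside a plist encodes like any string:
;; (jonathan:to-json (list :config #p"/etc/app.conf"))
```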

Now consider what happens when jonathan tries to encode alists with dotted pairs:

(jonathan:to-json '(("foo" . "bar") ("baz" . ((1 2 3) (4 5 6)))) :from :alist)

"{\"foo\":\"bar\",\"baz\":{\"1\":[2,3],\"4\":[5,6]}}"


and now without dotted pairs.

(jonathan:to-json '(("foo" "bar") ("baz"  ((1 2 3) (4 5 6)))) :from :alist)
"{\"foo\":[\"bar\"],\"baz\":[{\"1\":[2,3],\"4\":[5,6]}]}"


In both situations, jonathan is trying to force key:value pairs into places you would not expect.

##### Encoding Clos

Jonathan can encode CLOS objects to json if you create a method for that class. Consider our person class:

(defclass person ()
((name
:initarg :name :initform "Sabra"
:accessor name)
(eye-colour :initarg :eye-colour
:initform "brown"
:accessor eye-colour)))


Creating the required method for this class is straightforward:

(defmethod jonathan:%to-json ((person person))
(jonathan:with-object
(jonathan:write-key-value "name" (slot-value person 'name))
(jonathan:write-key-value "eye-colour" (slot-value person 'eye-colour))))


That then allows you to write something like this:

(let ((data (make-instance 'person)))
(jonathan:to-json data))
"{\"name\":\"Sabra\",\"eye-colour\":\"brown\"}"

##### Incremental Encoding

The following examples use two exercises. First, incrementally build a json array. Second, incrementally build a json object which also contains an incrementally built json array.

  (jonathan:with-output (*standard-output*)
(jonathan:with-array ()
(dotimes (i 3)
(jonathan:write-item i))))
[0,1,2]


The second is also straightforward.

(jonathan:with-output (*standard-output*)
(jonathan:with-object
(jonathan:write-key-value "hello" "hu hu")
(jonathan:write-key "harr")
(jonathan:write-value
(jonathan:with-array ()
(dotimes (i 3)
(jonathan:write-item i))))))
{"hello":"hu hu","harr":[0,1,2]}


#### Symmetry

From the standpoint of symmetry or round-tripping, going from Json->CL->Json, jonathan reversed the order of the list and converted both false and null to [].

Going from CL->Json->CL, it is not symmetric if the alists have dotted pairs. The array test resulted in getting the array converted into a list and :NULL was converted to nil.

#### Security

There were two general security issues we considered: (1) interning keywords and (2) issues with malformed data. Jonathan has the first issue.

With respect to the second issue, jonathan properly rejected 128 out of 173 malformed test cases. Unfortunately it was one of the packages that accepted malformed json data that would trigger stack exhaustion.

#### Conformity with Json Standard

jonathan accepted 94 of the 95 test cases that are considered "must accept". It signaled an overflow when trying to decode "[123.456e78]".

It accepted 9 out of the 17 test cases considered to be part of the gray area of the json specification - you could accept or reject. It actually hung when trying to decode the underflow number [123e-10000000].

#### jonathan exported symbols

• %to-json - Write obj as JSON string.
• %write-char - Write character to *stream*.
• %write-string - Write string to *stream*.
• *empty-array-value* - LISP value of [].
• *empty-object-value* - LISP value of {}.
• *false-value* - LISP value of false.
• *from* - Default value of from used by #'to-json.
• *null-value* - LISP value of null.
• *octets* - Default value of octets used by #'to-json.
• <jonathan-error> - Base condition of jonathan-errors.
• <jonathan-incomplete-json-error>
• <jonathan-not-supported-error>
• <jonathan-unexpected-eof-error>
• <jonathan-without-tail-surrogate-error>
• compile-encoder - Compile encoder
• parse - Convert JSON String to LISP object.
• to-json - Convert LISP object to JSON String.
• with-array - Make writing array safe.
• with-object - Make writing object safe.
• with-output - Bind *stream* to stream.
• with-output-to-string - Output *stream* as string.
• write-item - Write item of array.
• write-key - Write key part of object.
• write-key-value - Write key and value of object.
• write-value - Write value part of object.

### json-lib

json-lib Alex Nygren MIT https://github.com/KinaKnowledge/json-lib Not in quicklisp

Json-lib describes itself as a simple json decoder and encoder which tries to achieve symmetry. It falls a little short when dealing with json's false and with unicode char codes. On the plus side, it was one of four libraries to limit depth and avoid stack exhaustion on malformed json data. I think it generally does what it was intended for.

#### Mapping

JSON CL
null nil
false nil
true T
integer integer
float double-float
string string
array vector
object hash-table

#### Decoding

Json-lib parses utf-8 encoded json strings (not streams). So if you are reading a json encoded file, you need to specify that the input be read as utf-8. E.g.

(json-lib:parse (alexandria:read-file-into-string "file.json"
:external-format :utf8))


As with many of the libraries, it decodes a json 'null' as NIL. It also fails to handle unicode surrogate pairs if you care about that.

According to the README, the json-lib parser uses whitespace presence, regardless of commas, as a delimiting marker.

##### Nested JSON Objects

Json-lib does require that you completely parse the json data; you cannot filter it while reading it in. Taking the nested JSON object below, how could we get information out of the innermost nested object?

{
"items": [
{
"index": 1,
"integer": 29,
"float": 16.8278,
"fullname": "Milton Jensen",
"bool": false
}
]
}


You need to know your data structure so that you can figure out how to walk the tree. How would we get the value of the key "integer"? Looking at it, it is a JSON object whose key "items" contains an array which contains a JSON object.

By default, json-lib decodes json objects to hash-tables and arrays into vectors. So we can descend the parsed json tree in this particular example something like this (assuming the json object was in a file named json4.txt):

(gethash "integer"
  (aref (gethash "items"
          (json-lib:parse
            (alexandria:read-file-into-string "json4.txt"
                                              :external-format :utf8)))
        0))

29


#### Encoding

Json-lib will encode to strings, not streams. Some idiosyncrasies are noted in the following points:

• It encodes keyword symbols, but other symbols get encoded to "null"
• It encodes a char to "null". You could write a method to handle char
• Ratios are encoded as "null"
• Attempting to encode a CLOS object or pathname returned "null".
• In encoding a hash-table, json-lib will be successful if the hash-table keys are strings or keyword symbols. If the keys are other symbols, the key will be rendered as "null".

#### Symmetry

From the standpoint of symmetry or round-tripping, going from Json->CL->Json, json-lib lost the unicode character in the middle of a string and converted false to null.

Going from CL->Json->CL, it took an input of alists and converted it to vectors of vectors with keyword symbols being turned into strings. In the second test it took an array input and the only hiccup was converting :NULL to "null".

#### Security

There were two general security issues we considered: (1) interning keywords and (2) issues with malformed data. json-lib did not exhibit the first issue.

With respect to the second issue, json-lib rejected 110 out of 173 malformed test cases. It was one of four packages which allowed you to limit the depths to which it would dive in nested json objects and, therefore, did not trigger the stack exhaustion exhibited by most of the other packages.

#### Conformity with Json Standard

json-lib accepted 95 of the 95 test cases that are considered "must accept". If *read-default-float-format* is set to 'single-float, it would refuse to accept: [123e65], [123e45] and [123.456e78].

It accepted 12 of the 17 test cases considered to be part of the gray area of the json specification - you could accept or reject.

#### Miscellaneous

Json-lib has conversion functions lisp-to-snakecase, snakecase-to-lisp and lisp-to-camelcase. These will only be applied to keyword symbols.
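As a sketch of how these might be used (the exact calling convention is an assumption drawn from the stringify description below — verify against the json-lib source):

```lisp
;; Hypothetical usage: pass a case-conversion function to stringify
;; so keyword keys are emitted in camelCase.
(json-lib:stringify
 (alexandria:plist-hash-table '(:user-name "Milton"))
 #'json-lib:lisp-to-camelcase)
;; something like "{\"userName\":\"Milton\"}"
```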

#### Json-lib exported symbols

• encode-string
• parse - Given an encoded utf-8 json string, returns a cl structure or value. If use-keywords-for-keys is T, then hash table keys will be constructed as keywords. If an object-key-handler lambda/function is provided, this will be called for each object key and the result value used for the specific object key. By default the limit is 1000 for structural depth, but this can be set with the keyword max-depth. Exponent representation in serialized form is limited to a length of 2 to prevent huge values causing slow downs and other issues in the conversion process.
• stringify - Converts the given data structure to a stringified json form, suitable for serialization and other uses. An optional function can be passed for case encoding of native lisp keyword structures. If a function for unencodable-items is provided, this function will be called, and should return a JSON compliant string representing the encoded items. If no value is provided for unencodable-items, the json null value is used.
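To illustrate parse's keyword arguments described above (a sketch; the keyword names are taken from the description and not independently verified):

```lisp
;; Parse with keyword keys and a tighter depth limit than the
;; default of 1000 (keyword argument names assumed from the docs).
(json-lib:parse "{\"a\": [1, 2, 3]}"
                :use-keywords-for-keys t
                :max-depth 100)
```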

### json-streams

json-streams Thomas Bakketun, Stian Sletner GPL3 http://github.com/rotatef/json-streams

The author describes Json-streams advantages as:

• Separation of low-level and high-level functionality. json-streams provides a basic tokenizer for json, so that it's easy to build any data structure mapping one wants on top of it. All the other json libraries could in theory be built on it. The problem with many of them is that they've chosen a particular data mapping that very often is non-deterministic (False and [] both map to NIL, for example), or simply doesn't suit a particular use case. With json-streams you have full control of these things. But it also comes with a high-level API with a chosen mapping so it's ready to use.
• The API is streaming, so you don't have to process more than you want to, and you can process files of any size.
• Both Unicode and numbers are properly handled.

I am a bit surprised that something like json-streams, which advertises itself as a building block for more high level libraries, has no doc strings. Fortunately the README appears to be fairly comprehensive. However, after trying it during the course of testing, I am not convinced that it offers enough flexibility to counter the out-of-the-box functionality of many of the other libraries, nor does it offer any speed advantage.

#### Default Mapping

This is the default mapping. Do you see a distinct lack of data structures on the CL side? See the discussion under Json-streams Encoding for more info.

JSON Lisp
true T
false NIL
null :NULL
string string
number integer, float or ratio
array (:ARRAY … ) (1)
object (:OBJECT (key . value) … ) (1)

(1) This is a cons and not a CLOS object.

#### Decoding

Generally speaking, json-streams does what you would expect with respect to decoding. Given the name, it can handle either stream or string input. However, consider parsing the two following json strings. The first is a json object which includes a json array and the second is just an array:

(json-streams:json-parse "{\"A\":false,\"B\":[false]}")
(:OBJECT ("A") ("B" :ARRAY NIL))

(json-streams:json-parse "[\"B\",false]")
(:ARRAY "B" NIL)


Both of the results are CONS. I do not know what I was expecting, but it probably was not that.

#### Encoding

I found json-streams frustrating when it comes to simple, straightforward encoding. You cannot just pass CL data to it and assume that it will work.

Data Result Comment
T true
NIL false
:null null
A (symbol) Error (1)
"b" "b"
1 1
1.2 1.2
9/17 Error (1)
(1 2) Error (1)
((A . 1)) Error (1)
(B 2) Error (1)
#(1 2 3) Error (1)
(MAKE-INSTANCE 'PERSON) Error (1)
• (1) A fell through ETYPECASE expression. Wanted one of ((OR STRING REAL) (MEMBER T NIL) (MEMBER :NULL) JSON-STREAMS:JSON-ARRAY JSON-STREAMS:JSON-OBJECT).

So how do you actually use stringify? Well,

Data Result Comment
(json-streams:json-stringify '(:array 1 :null t nil)) "[1,null,true,false]"
(json-stringify '(:object ("a" . 1) ("b" . 2) ("c" . 3))) "{\"a\":1,\"b\":2,\"c\":3}"
(json-stringify '(:object ("a" 1) ("b" 2) ("c" 3))) Error (1) Oops
• (1) A fell through ETYPECASE expression. Wanted one of ((OR STRING REAL) (MEMBER T NIL) (MEMBER :NULL) JSON-STREAMS:JSON-ARRAY JSON-STREAMS:JSON-OBJECT).

In case you were wondering, json-stringify json-stringify-multiple and json-stringify-single are regular functions. You cannot just write a new method to deal with a new datatype. They take a type json-object, json-array or json-string defined as:

(deftype json-object ()
'(cons (member :object) t))

##### Encoding Hash Tables

As a result, when you want to encode CL data structures, you need to resort to more complicated calls, such as the following for dealing with hash tables:

(let ((data (alexandria:plist-hash-table '("foo" 1 "bar" (7 8 9)) :test #'equal)))
(json-streams:with-json-output (nil :key-encoder #'string-downcase :indent t)
(json-streams:with-json-object
(json-streams:json-output-member :my-hash-table data))))

"{\"my-hash-table\": {\"foo\": 1,\"bar\": [7,8,9]}}"


This does not always work, and there is no obvious way to handle new lisp data types other than writing your own methods which wrap around lower-level json-streams components. This essentially means that dealing with any type of CL data structure requires that you engage in incremental encoding.

##### Encoding Arrays

Encoding arrays is similar to encoding hash-tables (see above).

##### Incremental Encoding

The following examples use two exercises. First, incrementally build a json array. Second, incrementally build a json object which also contains an incrementally built json array.

(json-streams:with-json-output (nil :key-encoder #'string-downcase :indent t)
(json-streams:with-json-array
(dotimes (i 3) (json-streams:json-output-value i))))
"[0,1,2]"


And now the second:

(json-streams:with-json-output (nil :key-encoder #'string-downcase)
(json-streams:with-json-object
(json-streams:json-output-member "hello" "hu hu")
(json-streams:with-json-member :harr
(json-streams:with-json-array
(dotimes (i 3) (json-streams:json-output-value i))))))
"{\"hello\":\"hu hu\",\"harr\":[0,1,2]}"


The following is a more complicated version from the README:

(with-json-output (nil :key-encoder #'string-downcase :indent t)
(with-json-object
(json-output-member :first-name "John")
(json-output-member :last-name "Smith")
(with-json-member :is-alive (json-output-boolean t))
(json-output-member :age 25)
(json-output-alist '((:street-address . "21 2nd Street")
(:city . "New York")
(:state . "NY")
(:postal-code . "10021-3100"))))
(with-json-member :phone-numbers
(with-json-array
(json-output-plist '(:type "home"
:number "212 555-1234"))
(json-output-plist '(:type "office"
:number "646 555-4567"))
(json-output-plist '(:type "mobile"
:number "123 456-7890"))))
(json-output-member :children #())
(with-json-member :spouse (json-output-null))))

"{
\"first-name\": \"John\",
\"last-name\": \"Smith\",
\"is-alive\": true,
\"age\": 25,
\"city\": \"New York\",
\"state\": \"NY\",
\"postal-code\": \"10021-3100\"
},
\"phone-numbers\": [
{
\"type\": \"home\",
\"number\": \"212 555-1234\"
},
{
\"type\": \"office\",
\"number\": \"646 555-4567\"
},
{
\"type\": \"mobile\",
\"number\": \"123 456-7890\"
}
],
\"children\": [
],
\"spouse\": null
}"


#### Symmetry

From the standpoint of symmetry or round-tripping, going from Json->CL->Json, json-streams had no issues.

Going from CL->Json->CL, it failed (as you might expect) with both the alist input and the vector input because you would need to write a new method for dealing with either one.

#### Security

There were two general security issues we considered: (1) interning keywords and (2) issues with malformed data. json-streams did not exhibit the first issue.

With respect to the second issue, json-streams rejected all 173 malformed test cases. It was one of four packages which allowed you to limit the depths to which it would dive in nested json objects and, therefore, did not trigger the stack exhaustion exhibited by most of the other packages.

#### Conformity with Json Standard

Without additional parameters, json-streams accepted 93 of 95 test cases that are considered "must accept". The rejected test cases were files with duplicate keys and those would have been accepted by passing the keyword parameter :duplicate-key-check nil.
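For example, duplicate keys could be allowed like this (a sketch based on the keyword parameter named above):

```lisp
;; Accept a document with duplicate keys by disabling the check.
(json-streams:json-parse "{\"a\":1,\"a\":2}"
                         :duplicate-key-check nil)
```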

It accepted 1 of the 17 test cases considered to be part of the gray area of the json specification - you could accept or reject.

#### json-streams exported symbols

As mentioned above, there are no doc-strings for json-streams' exported symbols.

• call-with-json-output
• json-array
• json-close
• json-error
• json-input-stream
• json-object
• json-output-alist
• json-output-boolean
• json-output-member
• json-output-null
• json-output-plist
• json-output-stream
• json-output-value
• json-parse
• json-parse-error
• json-parse-multiple
• json-stream
• json-stream-position
• json-string
• json-stringify
• json-stringify-multiple
• json-write
• json-write-error
• make-json-input-stream
• make-json-output-stream
• most-negative-json-integer
• most-positive-json-integer
• with-json-array
• with-json-member
• with-json-object
• with-json-output
• with-open-json-stream

### jsown

Yes, it is the fastest in pure decoding but speed is not everything. It has optimize set for safety 0 to increase speed. This has bitten some users hard in the past, so you have been warned. You do have to be careful about what data structures you use. For example, encoding dotted alists triggered memory fault issues. It fails to deal with json's null properly and actually hung when trying to decode a json real number with an underflow problem.

On the plus side, it has the ability to extract a subset out of a json string, but you have to have read the entire string since it does not deal with streams.

#### Default Mapping

Please note the direction of the arrows in the following table.

Lisp   Json
integer <-> number with no frac or exp parts
float -> number with frac or exp parts
rational -> number with frac or exp parts
ratio <- number with frac or exp parts
T <-> true
nil -> []
nil <- false
nil <- null
nil <- []
other symbol -> string
character -> Error
string <-> string
list (except alists) <-> array
cons <- object
vector -> array
alist w/o dotted pairs -> array of arrays
alist with dotted pairs -> Error
hash-table -> object
standard object -> Error

#### Decoding

This is where jsown really shines. Jsown is by far the fastest decoder. It not only decodes, it also makes it easy to pull out elements of the json-object (at least at the first level of the object).

One thing that was interesting was that jsown is the only library to decode json floats to CL ratios.

As a minor annoyance, form feeds in parsed strings get translated to line feeds.

When reading json objects, jsown converts their content to the most lispy translation of what was in there. As such, json's false will be translated to nil, which coincidentally is also the translation of JSON's []. Json's null is also translated to nil.

As you can tell from the default mapping table, json arrays are decoded to lists and json objects decoded to cons cells:

(jsown:parse "[1, \"a\"]")
(1 "a")

(jsown:parse "{\"foo\": \"alpha\", \"bar\":3.2}")
(:OBJ ("foo" . "alpha") ("bar" . 16/5))

##### Nested JSON Objects (Filters and Subsets)

Now consider the nested JSON object *nested-address-1*. Again, as a reminder, here is the object.

"{
\"first_name\": \"George\",
\"last_name\": \"Washington\",
\"birthday\": \"1732-02-22\",
\"address\": {
\"street_address\": \"3200 Mount Vernon Memorial Highway\",
\"city\": \"Mount Vernon\",
\"state\": \"Virginia\",
\"country\": \"United States\"
}
}"


What can we do with it in jsown? It is a JSON object whose keyword "address" contains another JSON object.

A basic call to parse returns a cons representing the entire object

(jsown:parse *nested-address-1*)
(:OBJ ("first_name" . "George") ("last_name" . "Washington")
 ("birthday" . "1732-02-22")
 ("address" :OBJ ("street_address" . "3200 Mount Vernon Memorial Highway")
  ("city" . "Mount Vernon") ("state" . "Virginia")
  ("country" . "United States")))


If we add the keyword address, we get back a subset of the object:

(jsown:parse *nested-address-1* "address")
(:OBJ
 ("address" :OBJ ("street_address" . "3200 Mount Vernon Memorial Highway")
  ("city" . "Mount Vernon") ("state" . "Virginia")
  ("country" . "United States")))


According to the README, "In order to achieve high performance when parsing specific keywords, the keywords to be found should be known at compile time. The compiler-macro-function can calculate the keyword container with the requested keywords at compile-time. When specifying the keywords in which you’re interested you should ignore any escaped characters. For instance, supplying the string “foo” will automatically match “f\\\\oo” too."

We can walk the tree deeper by applying the filter and val functions.

(jsown:val
 (jsown:filter (jsown:parse *nested-address-1*) "address")
 "city")
"Mount Vernon"


Jsown has the ability, once a json object has been parsed into a jsown object, to get the first level keywords of the object.

(jsown:keywords (jsown:parse *nested-address-1*))


Jsown also has the ability to loop over the first level keywords of an object:

(jsown:do-json-keys (keyword value)
    (jsown:parse *nested-address-1*)
  (format t "~a => ~a~%" keyword value))

first_name => George
last_name => Washington
birthday => 1732-02-22
address => (OBJ (street_address . 3200 Mount Vernon Memorial Highway) (city . Mount Vernon) (state . Virginia) (country . United States))


On the downside, jsown actually hung when trying to decode an underflow number: "[123e-10000000]".

#### Encoding

(to-json x) is a generic function which you can specialize on your own types. This allows you to nest lisp objects in a jsown object and serialize them in a suitable way.

(to-json* x) is the non-generic function variant of the same thing. It isn't as smart, but it is faster.

Encoding chars, pathnames, CLOS objects, structs and other data structures not listed in the default mapping above will require that you write a new to-json method to handle that particular datatype.

On the plus side, jsown is one of the two libraries (the other is shasht) which can handle multi-dimensional arrays.

When writing to json, lisp's nil is translated to the empty JSON list []. You can write json's false by writing lisp's keywords :false or :f.

(jsown:to-json (jsown:new-js ("items" nil) ("falseIsEmptyList" :f) ("success" t)))

"{\"items\":[],\"falseIsEmptyList\":false,\"success\":true}"


As you can tell from the default mapping table, lists and vectors are automatically encoded to json arrays and hash-tables are encoded to json objects. As alists are considered lists of lists, an alist will return an array of arrays. Plists are treated the same as plain lists and will lose their key-value connections.

(jsown:to-json '(("a" "alpha") ("b" "beta")))
"[[\"a\",\"alpha\"],[\"b\",\"beta\"]]"


You can specify that to-json returns an object. The following call to to-json wraps the alist in another list headed by :obj and returns a json object.

(jsown:to-json '(:obj (("a" "alpha") ("b" "beta"))))
"{\"(a alpha)\":[[\"b\",\"beta\"]]}"


If jsown tries to encode an alist which has dotted cons cells, jsown triggered unhandled memory faults with sbcl 2.1.11-x86-64-linux, ccl version 1.12.1 LinuxX8664 and ecl-version 21.2.1. This appears to be because jsown has optimized the code and assumed that all lists will be proper lists. The assumption obviously fails in the context of dotted cons cells.

When to-json is called, jsown will internally call to-json each step of the way. This has a performance downside, yet it seems to provide the least surprises in the output. If you need more performance, to-json* offers that, at the cost of flexibility.

If you are constructing json objects, consider using the jsown:new-js and jsown:extend-js functions. jsown:new-js has a clean and clear interface for building content. It works together with jsown:extend-js if you need to split up the object creation.

Jsown has a setf-expander on (setf jsown:val) which automatically creates a jsown-object if no such object was available at the designated place. An example should clarify:

(let (doc)
(setf (jsown:val (jsown:val (jsown:val doc "properties") "color") "paint") "red")
(jsown:to-json doc))

"{\"properties\":{\"color\":{\"paint\":\"red\"}}}"


It turns out to be a handy little feature when you need to build deeply nested json documents.

##### Encoding CLOS

Jsown does not have a handy ability to automagically encode clos classes. For that you would have to write your own jsown:to-json method. Taking our simple person class with slots name and eye-colour, such a method could look something like this:

(defmethod jsown:to-json ((object person))
  (jsown:to-json
   `(:obj ("name" . ,(name object))
          ("eye-colour" . ,(eye-colour object)))))

(jsown:to-json (make-instance 'person))
"{\"name\":\"Sabra\",\"eye-colour\":\"brown\"}"

##### Incremental Encoding

The following examples use two exercises. First, incrementally build a json array. Second, incrementally build a json object which also contains an incrementally built json array. With jsown, I found using loop easier than dotimes.

(jsown:to-json
(loop :for i :from 0 :to 2 :collect i))
"[0,1,2]"


The second exercise

(jsown:to-json
 `(:obj
   ("hello" . "hu hu")
   ("harr" .
    ,(loop :for i :from 0 :to 2 :collect i))))
"{\"hello\":\"hu hu\",\"harr\":[0,1,2]}"


#### Symmetry

From the standpoint of symmetry or round-tripping, going from Json->CL->Json, jsown had no issues except for dealing with false and null.

Going from CL->Json->CL was more problematic. Keyword symbols became strings, :NULL became nil, a float was converted to a ratio and where the input started as an array, it returned a list.

#### Security

There were two general security issues we considered: (1) interning keywords and (2) issues with malformed data. Jsown did not exhibit the first issue. With respect to the second issue, jsown rejected 102 out of 173 malformed test cases. Unfortunately some of those malformed json data strings did trigger the stack exhaustion exhibited by most of the other packages.

#### Conformity with Json Standard

jsown accepted 93 of the 95 test cases that are considered "must accept". It failed on lonely numbers (an integer or negative real not within a json array or json object).

It accepted 9 of the 17 test cases considered to be part of the gray area of the json specification - you could accept or reject.

#### Miscellaneous

Issue 27 on github claims an unhandled memory fault on the following bad json string in sbcl 2.0.8, but I do not see that with ccl, clisp or sbcl 2.1.11, so this issue may have been fixed.

(jsown:parse "{\"foo\": a}")


#### jsown exported symbols

• *parsed-empty-list-value* - value to emit when parsing a json empty list '[]'
• *parsed-false-value* - value to emit when parsing json's 'false'
• *parsed-null-value* - value to emit when parsing json's 'null'
• *parsed-true-value* - value to emit when parsing json's 'true'
• as-js-bool - returns <value> as a boolean value (in jsown format)
• as-js-null - returns <value> with nil being javascript's null (in jsown format).
• build-key-container - Builds an internal structure to speed up parsing of the keywords which you want to read. This should be used when the keywords needed are not known at compile time, but you still want to parse those keywords of a lot of documents.
• do-json-keys - Macro - Iterates over the json key-value pairs
• empty-object - Returns an empty object which can be used to build new objects upon
• extend-js - Macro - fills in a bunch of jsown values for obj. each spec should contain a list with the first element being the string which represents the key and the second being the form which evaluates to the value to which the key should be set
• filter - Fancy filtering for jsown-parsed objects. spec can be one of the following:
• [object] key to find. will transform into (jsown:val value key)
• [cl:map] use this modifier with an [object] modifier after it, to filter all elements in the list.
• json-encoded-content - Describes a json object whose content is serialized already
• keyp - Returns non-nil iff <object> has key <key>
• keywords - Returns a list of all the keywords contained in the object
• new-js - Macro - creates a new empty object and fills it as per jsown-values
• parse - Reads a json object from the given string, with the given keywords being the keywords which are fetched from the object. All parse functions assume <string> is not an empty json object. (string/= string \"{}\")
• parse-with-container - Parses the keywords which have been specified in the container from the json string json-string. For most cases you can just use the parse function without a special key container. This is only here to support some cases where the building of the key container takes too much time. See #'parse for the normal variant. See #'build-key-container for a way to build new keyword containers.
• remkey - Removes key from object
• to-json - Writes the given object to json in a generic way.
• to-json* - Converts an object in internal jsown representation to a string containing the json representation
• val - Returns the value of the given key in object
• val-safe - Returns the value of <key> in <object> if <object> existed, or nil if it did not. A second value is returned which indicates whether or not the key was found.
• with-injective-reader - Rebinds the *parsed-...-value* variables so that reading json documents is injective and converting them back to json yields roughly the same document as the original. Rebinds: *parsed-true-value* => :true, *parsed-false-value* => :false, *parsed-null-value* => :null
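As a sketch of build-key-container and parse-with-container working together (the keyword names are illustrative, reusing *nested-address-1* from above):

```lisp
;; Build the key container once, then reuse it across many documents.
(let ((container (jsown:build-key-container "first_name" "birthday")))
  (jsown:parse-with-container *nested-address-1* container))
```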

### shasht

shasht Tarn W. Burton MIT https://github.com/yitzchak/shasht

Shasht is one of the two new libraries that I particularly like and is already in quicklisp. It is fast, it handles null correctly, it encodes CLOS objects, structures and hash-tables. It can also do incremental encoding.

#### Default Mapping

Please note the direction of the arrows in the following table.

Lisp   Json
integer <-> number with no frac or exp parts
float <-> number with frac or exp parts
rational -> number with frac or exp parts
ratio -> number with frac or exp parts
T <-> true
:true -> true
:false -> false
nil <-> false
:NULL <-> null
#() <- []
other symbol -> string
character -> string
string <-> string
pathname -> string
list (except alists) -> array
vector <-> array
multi-dimensional array -> nested array
alist -> array
hash-table <-> object
standard object -> object
structure -> object
:empty-array -> []
:empty-object -> {}
'(:array 1 2 3) -> [1,2,3]
'(:object-alist ("a" . 1)("b" . 2)) -> {"a":1,"b":2}
'(:object-plist "a" 1 "b" 2) -> {"a":1,"b":2}

The following subsections are from the README:

##### Mapping of Number Types

The format of a number read from JSON when a decimal or an exponent is present in the number literal can be influenced with cl:*read-default-float-format*. This is the same behavior as cl:read. In order to read JSON numbers with large exponents one would need to do something like the following.

(shasht:read-json "[2.232e75]" :float-format 'double-float)

##### Mapping of Array Types

The dynamic variables *read-default-array-format*, *write-empty-array-values*, and *write-array-tags* all influence the mapping of JSON arrays to Common Lisp vectors and lists. Common Lisp vectors and multi-dimensional arrays are always written as JSON arrays. By default JSON arrays are read as Common Lisp vectors. With the default settings only non-nil lists that don't satisfy some other mapping rule are written as JSON arrays.

If one wants to use lists as the default JSON array format then *read-default-false-value*, *read-default-array-format*, and *write-false-values* will need to be set to appropriate values since in the default mapping nil maps to false. For example, the following could be done.

(let ((shasht:*read-default-false-value* :false)
(shasht:*write-false-values* '(:false)))
(shasht:write-json ...))


Lists with a CAR eql to a value in *write-array-tags*, *write-object-alist-tags*, *write-object-plist-tags* will still be written as an array or object as appropriate. To completely disable this behavior the variables would need to be bound to nil. Or one could do the following.

(shasht:write-json '(1 2 3) :false-value '(:false) :array-tags nil
:object-alist-tags nil :object-plist-tags nil)


In this case the mapping for array types would become:

Common Lisp   JSON
vector -> array
multi-dimensional array -> nested array
list <-> array
##### Mapping of Object Types

The dynamic variables *read-default-object-format*, *write-alist-as-object*, *write-plist-as-object*, *write-empty-object-values*, *write-object-alist-tags*, and *write-object-plist-tags* all influence the mapping of JSON objects to Common Lisp hash tables, alists, and plists. Common Lisp hash tables are always written as JSON objects. By default JSON objects are read as Common Lisp hash tables.

In order to use alists as the default JSON object format the dynamic variables *read-default-object-format*, *write-alist-as-object*, *read-default-false-value*, and *write-false-values* will need to be set to appropriate values. For example, the following would use alists as the default JSON object format and :false as the JSON false value.

(let ((shasht:*read-default-object-format* :alist)
(shasht:*write-alist-as-object* t)
(shasht:*write-false-values* '(:false)))
(shasht:write-json ...))


In this case the mapping for object types would become:

Common Lisp   JSON
hash table -> object
alist <-> object
standard object -> object
structure object -> object

The same could be accomplished for plists by doing the following.

(let ((shasht:*read-default-object-format* :plist)
(shasht:*write-plist-as-object* t)
(shasht:*write-false-values* '(:false)))
(shasht:write-json ...))


#### Decoding

To quote from the README: The primary interface to parsing and reading JSON is the read-json function.

(read-json &optional input-stream-or-string (eof-error-p t) eof-value single-value-p)


The argument input-stream-or-string can be a stream, a string to read from, or nil to use *standard-input*. The arguments eof-error-p and eof-value have the same effect as they do in the CL function read. If the single-value-p argument is true then the input to read-json is assumed to be a single value, which means that extra tokens at the end will cause an error to be generated.
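For instance, the eof handling can be exercised like this (a sketch; the return value is an assumption based on the analogy with cl:read):

```lisp
;; With eof-error-p nil, empty input should return the eof-value.
(shasht:read-json "" nil :done)
;; presumably :DONE
```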

There are a number of dynamic variables that will influence the parsing of JSON data.

• common-lisp:*read-default-float-format* — Controls the floating-point format that is to be used when reading a floating-point number.
• *read-default-true-value* — The default value to return when reading a true token. Initially set to t.
• *read-default-false-value* — The default value to return when reading a false token. Initially set to nil.
• *read-default-null-value* — The default value to return when reading a null token. Initially set to :null.
• *read-default-array-format* — The default format to use when reading an array. Current supported formats are :vector or :list. Initially set to :vector.
• *read-default-object-format* — The default format to use when reading an object. Current supported formats are :hash-table, :alist or :plist. Initially set to :hash-table.
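A quick sketch combining two of these variables (the output shown is what I would expect, not verified):

```lisp
;; Read arrays as lists and objects as alists instead of the defaults.
(let ((shasht:*read-default-array-format* :list)
      (shasht:*read-default-object-format* :alist))
  (shasht:read-json "{\"a\": [1, 2]}"))
;; expecting the alist (("a" 1 2)), i.e. (("a" . (1 2)))
```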

There is also a keyword variant read-json* which will set the various dynamic variables from supplied keywords.

(read-json* :stream nil
:eof-error t
:eof-value nil
:single-value nil
:true-value t
:false-value nil
:null-value :null
:array-format :vector
:object-format :hash-table
:float-format 'single-float)


#### Encoding

The primary interface to serializing and writing JSON is the write-json function.

(write-json value &optional (output-stream t))


On the plus side, shasht was one of two libraries which could encode a pathname or a struct. It also is one of the few libraries that can encode CLOS objects out of the box:

(shasht:write-json (make-instance 'person))
{
"NAME": "Sabra",
"EYE-COLOUR": "brown"
}


Local-time:timestamps were encoded as {"DAY": 7990, "SEC": 0, "NSEC": 0}, so if you wanted them to be in any particular javascript time representation, you would have to write a specialized method.
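Such a specialized method might look like the following sketch, which leans on print-json-value (shasht's documented extension point) and local-time:format-timestring; treat it as illustrative rather than tested:

```lisp
;; Emit local-time timestamps as ISO-8601 strings instead of
;; letting shasht serialize the raw DAY/SEC/NSEC slots.
(defmethod shasht:print-json-value ((value local-time:timestamp) output-stream)
  (shasht:print-json-value
   (local-time:format-timestring nil value) output-stream))
```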

As you might expect, plists are treated the same as plain lists and will lose their key-value connections. If you want to use plists, you will need to convert to a hash table first.
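Two sketched ways around that: convert the plist to a hash table, or bind *write-plist-as-object* (listed below); both unverified:

```lisp
;; Option 1: convert the plist to a hash table first.
(shasht:write-json
 (alexandria:plist-hash-table '(:a 1 :b 2)))

;; Option 2: tell shasht to treat plists as json objects.
(let ((shasht:*write-plist-as-object* t))
  (shasht:write-json '(:a 1 :b 2)))
```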

There are a number of dynamic variables that will influence the serialization of JSON data.

• common-lisp:*print-pretty* — If true then a simple indentation algorithm will be used.
• *write-indent-string* — The string to use when indenting objects and arrays. Initially set to #\space.
• *write-ascii-encoding* — If true then any non ASCII values will be encoded using Unicode escape sequences. Initially set to nil.
• *write-true-values* — Values that will be written as a true token. Initially set to '(t :true).
• *write-false-values* — Values that will be written as a false token. Initially set to '(nil :false).
• *write-null-values* — Values that will be written as a null token. Initially set to (:null).
• *write-alist-as-object* — If true then undotted association lists will be written as an object. Initially set to nil.
• *write-plist-as-object* — If true then property lists will be written as an object. Initially set to nil.

Consider the following:

(setf common-lisp:*print-pretty* nil)
(shasht:write-json '(("foo" . "bar") ("baz" . ((1 2 3) (4 5 6)))))
[["foo"; Evaluation aborted on #<TYPE-ERROR expected-type: LIST datum: "bar">.


Now if we set shasht:*write-alist-as-object* t,

(setf common-lisp:*print-pretty* nil)
(setf shasht:*write-alist-as-object* t)
(shasht:write-json '(("foo" . "bar") ("baz" . ((1 2 3) (4 5 6)))))

{"foo":"bar","baz":{1:[2,3],4:[5,6]}}


The actual serialization of JSON data is done by the generic function print-json-value which can be specialized for additional value types.

(print-json-value value output-stream)


There is also a keyword variant write-json* which will set the various dynamic variables from supplied keywords.

(write-json* value :stream t
                   :ascii-encoding nil
                   :true-values '(t :true)
                   :false-values '(nil :false)
                   :null-values '(:null)
                   :empty-array-values '(:empty-array)
                   :empty-object-values '(:empty-object)
                   :alist-as-object nil
                   :plist-as-object nil
                   :pretty nil
                   :indent-string "  ")
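As a hedged sketch, the alist example from above can be written with write-json* instead of setf-ing the dynamic variable:

(shasht:write-json* '(("foo" . "bar") ("baz" . ((1 2 3) (4 5 6))))
                    :stream *standard-output*
                    :alist-as-object t)
{"foo":"bar","baz":{1:[2,3],4:[5,6]}}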



In order to facilitate extending the serialization facilities of shasht there are a number of helper functions available. To aid in the printing of JSON strings there is the following.

(write-json-string value output-stream)


In order to ease the serialization of objects and arrays there is with-json-object and with-json-array. Both of these macros take an output stream as the first argument then enable indentation and automatic handling of all delimiter tokens. Inside the body of with-json-object the function (print-json-key-value key value output-stream) should be used to output a key value pair. Inside the body of with-json-array the function (print-json-value value output-stream) should be used to output a single value. Example usage can be seen in the source code.

##### Incremental Encoding

The following examples use two exercises. First, incrementally build a json array. Second, incrementally build a json object which also contains an incrementally built json array.

(shasht:with-json-array *standard-output*
  (dotimes (i 3)
    (shasht:write-json i)))
[0,1,2]


One way of doing the second exercise could look like this:

(shasht:with-json-object *standard-output*
  (shasht:print-json-key-value ()
                               "hello" "hu hu" *standard-output*)
  (shasht:print-json-key-value ()
                               "harr"
                               (loop :for i :from 0 :to 2 :collect i)
                               *standard-output*))
{"hello": "hu hu", "harr": [0,1,2]}


The following two examples are taken from the README. There are keyword literals that can be used to help constructing json objects and arrays. The values and tags that indicate these literals can be configured via the dynamic variables *write-empty-array-values*, *write-empty-object-values*, *write-array-tags*, *write-object-alist-tags*, and *write-object-plist-tags*.

These literal forms are only meant for serialization, not for round-trip mapping; there is no way to read JSON back into the same forms:

(shasht:write-json
 '(:object-alist ("a" . :empty-array)
                 ("b" . :empty-object)
                 ("c" . (:object-plist "d" 1
                                       "e" (:array 1 2 3)))))
{
  "a": [],
  "b": {},
  "c": {
    "d": 1,
    "e": [
      1,
      2,
      3
    ]
  }
}

(:OBJECT-ALIST ("a" . :EMPTY-ARRAY) ("b" . :EMPTY-OBJECT)
 ("c" :OBJECT-PLIST "d" 1 "e" (:ARRAY 1 2 3)))


#### Symmetry

From the standpoint of symmetry or round-tripping, going from Json->CL->Json, shasht had no issues.

Going from CL->Json->CL was more problematic. The alist input was converted to vectors of vectors and keyword symbols became strings.

#### Security

There were two general security issues we considered: (1) interning keywords and (2) issues with malformed data. shasht did not exhibit the first issue.

With respect to the second issue, shasht rejected 159 out of 173 malformed test cases. While it did not trigger the stack exhaustion issue exhibited by most of the other packages, some of the malformed json data strings did trigger recoverable error situations.

#### Conformity with Json Standard

shasht accepted 95 of the 95 test cases that are considered "must accept". If *read-default-float-format* is set to 'single-float, it would refuse to accept: [123e65], [123e45] and [123.456e78].

It accepted 9 of the 17 test cases considered to be part of the gray area of the json specification - you could accept or reject.

#### Shasht Exported Symbols

• *read-default-array-format* - The default format to use when reading an array. Current supported formats are :vector or :list. Initially set to :vector.
• *read-default-false-value* - The default value to return when reading a false token. Initially set to nil
• *read-default-null-value* - The default value to return when reading a null token. Initially set to :null
• *read-default-object-format* - The default format to use when reading an object. Current supported formats are :hash-table, :alist or :plist. Initially set to :hash-table
• *read-default-true-value* - The default value to return when reading a true token. Initially set to t
• *symbol-name-function*
• *write-alist-as-object* - If true then association lists will be written as an object. Initially set to nil
• *write-array-tags* - Indicators in the CAR of a list that indicate that the CDR of the list should be written as an array.
• *write-object-alist-tags* - Indicators in the CAR of a list that indicate that the CDR of the list is an alist and should be written as an object.
• *write-object-plist-tags* - Indicators in the CAR of a list that indicate that the CDR of the list is a plist and should be written as an object.
• *write-ascii-encoding* - If true then any non ASCII values will be encoded using Unicode escape sequences. Initially set to nil
• *write-empty-array-values*
• *write-empty-object-values*
• *write-false-values* - Values that will be written as a false token. Initially set to '(nil :false)
• *write-indent-string* - The string to use when indenting objects and arrays. Initially set to #\space
• *write-null-values* - Values that will be written as a null token. Initially set to (:null)
• *write-plist-as-object* - If true then property lists will be written as an object. Initially set to nil
• *write-true-values* - Values that will be written as a true token. Initially set to '(t :true)
• make-object
• print-json-delimiter
• print-json-key-value
• print-json-value - The actual serialization of JSON data is done by the generic function print-json-value which can be specialized for additional value types.
(read-json &optional input-stream-or-string (eof-error-p t) eof-value single-value-p)

• read-json* - will set the various dynamic variables from supplied keywords.
(read-json* :stream nil
            :eof-error t
            :eof-value nil
            :single-value nil
            :true-value t
            :false-value nil
            :null-value :null
            :array-format :vector
            :object-format :hash-table
            :float-format 'single-float)

• shasht-parse-error
• with-json-array - Macro that takes an output stream as its first argument, then enables indentation and automatic handling of all delimiter tokens. Inside the body, the function (print-json-value value output-stream) should be used to output a single value.
• with-json-key
• with-json-object - Macro that takes an output stream as its first argument, then enables indentation and automatic handling of all delimiter tokens. Inside the body, the function (print-json-key-value key value output-stream) should be used to output a key value pair.
• write-json - The primary writing function
(write-json value &optional (output-stream t))

• write-json* - Will set the various dynamic variables from supplied keywords.
(write-json* value :stream t
                   :ascii-encoding nil
                   :true-values '(t :true)
                   :false-values '(nil :false)
                   :null-values '(:null)
                   :empty-array-values '(:empty-array)
                   :empty-object-values '(:empty-object)
                   :alist-as-object nil
                   :plist-as-object nil
                   :pretty nil
                   :indent-string "  ")

• write-json-string

### st-json

st-json Marijn Haverbeke zlib-style https://github.com/marijnh/ST-JSON

As the README says, st-json does mostly the same thing as cl-json, but is simpler and more precise about types (distinguishing boolean false, the empty array, and the empty object). It is very fast at encoding. While not the fastest decoder, it has respectable decoding speed. It does have optimize set to safety 0 to increase speed. This has bitten some users hard in the past, so you have been warned. It handles json's null correctly. You will have to write your own methods to handle symbols and CLOS objects.

Documentation is found at https://marijnhaverbeke.nl/st-json/.

#### Default Mapping

Please note the direction of the arrows in the following table.

Lisp   Json
integer <-> number with no frac or exp parts
float <-> number with frac or exp parts
rational -> number with frac or exp parts
ratio -> number with frac or exp parts
T <-> true
nil <-> []
:FALSE <- false
:NULL <- null
other symbol (1) -> Error
character (1) -> Error
string <-> string
list (except alists) <-> array
vector (1) -> Error
alist (2) -> array
hash-table (3) -> object
struct <- object
standard object (1) -> Error
• (1) You can write a method to handle these
• (2) Only if the alist does not have dotted cons cells.
• (3) Encoding hash-tables works with the caveat that if the hash-table has symbols as keys, you will have to write your own method to handle symbols first.
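To illustrate the straightforward case from note (3), a hedged sketch: a hash table with string keys encodes directly, with no extra methods required:

(let ((ht (make-hash-table :test #'equal)))
  (setf (gethash "a" ht) 1
        (gethash "b" ht) 2)
  (st-json:write-json-to-string ht))

This returns a json object string such as "{\"a\":1,\"b\":2}" (key order follows the hash table's iteration order).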

#### Decoding

In decoding, st-json creates instances of a struct "jso" which wraps an alist. The underlying idea for using a struct is that hash tables are often too heavyweight for the need.

(st-json:read-json "{\"foo\": \"alpha\", \"bar\":3.2,\"baz\": \"null\"}")

#S(ST-JSON:JSO :ALIST (("foo" . "alpha") ("bar" . 3.2) ("baz" . "null")))


#### Encoding

The basic encoding function is write-json. Similarly to cl-json, st-json has a function for writing json output to string as well as to a stream. St-json has a limited number of lisp datatypes that it handles out of the box. For example, vectors are not included. It is easy to write methods for those types, but that is an additional amount of work not necessary in most of the other libraries.

##### Encoding Symbols

You need to write your own st-json::write-json-element function for symbols. Possibly something like:

(defmethod st-json:write-json-element ((element symbol) stream)
  (st-json:write-json-element (string element) stream))


You can also write methods to handle encoding the other lisp datatypes in the default mapping table which show an error.

##### Encoding Arrays

You need to write your own st-json::write-json-element function for arrays. Possibly something like:

(defmethod st-json:write-json-element ((element vector) stream)
  (write-char #\[ stream)
  (loop :for val :across element
        :for first := t :then nil
        :unless first :do (write-char #\, stream)
        :do (st-json:write-json-element val stream))
  (write-char #\] stream))

##### Encoding Alists

If the alist has dotted cons cells, jsown and st-json triggered unhandled memory faults with sbcl 2.1.11-x86-64-linux, ccl version 1.12.1 LinuxX8664 and ecl-version 21.2.1. This appears to be because jsown and st-json have optimized the code and assumed that all lists will be proper lists. The assumption obviously fails in the context of dotted cons cells.

##### Encoding CLOS

St-json requires that you actually define a write-json-element method for the CLOS class you want to encode. Consider our little person class.

(defclass person ()
  ((name
    :initarg :name :initform "Sabra"
    :accessor name)
   (eye-colour :initarg :eye-colour
               :initform "brown"
               :accessor eye-colour)))


Creating the required method for this class is straight forward.

(defmethod st-json:write-json-element ((person person) stream)
  (let ((accessors '(("name" name) ("eye_colour" eye-colour))))
    (write-char #\{ stream)
    (loop :for (key val) :in accessors
          :for first := t :then nil
          :unless first :do (write-char #\, stream)
          :do
             (st-json:write-json-element key stream)
             (write-char #\: stream)
             (st-json:write-json-element (funcall val person) stream))
    (write-char #\} stream)))

#<STANDARD-METHOD ST-JSON:WRITE-JSON-ELEMENT (PERSON T) {101561B163}>
JSON-TESTS> (let ((data (make-instance 'person))) (st-json:write-json data *standard-output*))
{"name":"Sabra","eye_colour":"brown"}

##### Incremental Encoding

The following examples use two exercises. First, incrementally build a json array. Second, incrementally build a json object which also contains an incrementally built json array. The first exercise (again, like jsown, I found using loop easier than dotimes):

(st-json:write-json-to-string
 (loop :for i :from 0 :to 2 :collect i))
"[0,1,2]"


The second exercise:

(st-json:write-json-to-string
 (st-json:jso "hello" "hu hu" "harr"
              (loop :for i :from 0 :to 2 :collect i)))

"{\"hello\":\"hu hu\",\"harr\":[0,1,2]}"


#### Symmetry

From the standpoint of symmetry or round-tripping, going from Json->CL->Json, st-json had no issues.

Going from CL->Json->CL was more problematic solely because you need to write a method to deal with encoding symbols.

#### Security

There were two general security issues we considered: (1) interning keywords and (2) issues with malformed data. st-json did not exhibit the first issue.

With respect to the second issue, st-json rejected 140 out of 173 malformed test cases. Unfortunately it did trigger the stack exhaustion issue exhibited by most of the other packages.

#### Conformity with Json Standard

st-json accepted 95 of the 95 test cases that are considered "must accept". If *read-default-float-format* is set to 'single-float, it would refuse to accept: [123e65], [123e45] and [123.456e78].

It accepted 8 of the 17 test cases considered to be part of the gray area of the json specification - you could accept or reject.

#### Other Information

St-json provides error conditions json-type-error, json-parse-error, json-error and json-eof-error. These are undocumented, so you will have to look at the source code for how to use them.

#### st-json exported symbols

• *decode-objects-as* - Valid values: :jso :hashtable Controls how js objects should be decoded. :jso means decode to internal struct which can be processed by getjso, mapjso etc. :hashtable means decode as hash tables.
• *output-literal-unicode* - Bind this to T in order to reduce the use of \uXXXX Unicode escapes, by emitting literal characters (encoded in UTF-8). This may help reduce the parsing effort for any recipients of the JSON output, if they can already read UTF-8, or else, they'll need to implement complex unicode (eg UTF-16 surrogate pairs) escape parsers.
• *script-tag-hack* - Bind this to T when writing JSON that will be written to an HTML document. It prevents '</script>' from occurring in strings by escaping any slash following a '<' character.
• as-json-bool - Convert a generalised boolean to a :true/:false keyword.
• from-json-bool - Convert :true or :false to its boolean equivalent.
• getjso - Fetch a value from a JS object. Returns a second value like gethash.
• getjso* - The getjso* function in theory allows you to take a key in the form of "a.b.c" and st-json will generate a series of getjso calls to go down each level and return the value for key c. This, however, does not seem to work if read-json does not result in jso objects all the way down. Consider, for example, if we have a jso object which wraps a cons which wraps a jso object. The intermediary cons prevents getjso* from walking down the nested list.
• jso - Create a JS object. Arguments should be alternating labels and values.
• json-bool - (deftype json-bool () '(member :true :false))
• json-eof-error
• json-error
• json-null - (deftype json-null () '(eql :null))
• json-parse-error
• json-type-error
• mapjso - Iterate over the key/value pairs in a JS object.
• read-json - Read a JSON-encoded value from a stream or a string.
• read-json-as-type - Read a JSON value and assert the result to be of a given type. Raises a json-type-error when the type is wrong.
• write-json - Write a value's JSON representation to a stream.
• write-json-element - Method used for writing values of a specific type. You can specialise this for your own types.
• write-json-to-string
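To make getjso and getjso* concrete, a short hedged sketch, assuming the fully nested jso case where getjso* works:

(let ((jso (st-json:read-json "{\"a\": {\"b\": {\"c\": 42}}}")))
  (values (st-json:getjso "c" (st-json:getjso "b" (st-json:getjso "a" jso)))
          (st-json:getjso* "a.b.c" jso)))

Both values should be 42; getjso* simply expands into the chain of getjso calls written out in the first value.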

### trivial-json-codec

trivial-json-codec Eric Diethelm MIT https://gitlab.com/ediethelm/trivial-json-codec

As the website says, trivial-json-codec is a json parser focused on the ability to handle class hierarchies. The classes must be defined, and each class in a hierarchy must have at least one slot named differently. It does not handle the difference between json's null and false (everything is nil once it gets to the CL side). The one thing I do not understand is its use of angle brackets when encoding lists to json. E.g.

(trivial-json-codec:serialize '(1 2 3) *standard-output*)
<1,2,3>


Encoding vectors looks perfectly standard:

(trivial-json-codec:serialize #(1 2 3) *standard-output*)
[1,2,3]


I see this encoding of lists as resulting in invalid json, but maybe I am missing something.

#### Default Mapping

Please note the direction of the arrows in the following table.

Lisp   Json
integer <-> number with no frac or exp parts
float <-> number with frac or exp parts
rational -> number with frac or exp parts
ratio -> Error (1)
T <-> true
nil <-> []
nil <- false
nil <-> null
other symbol -> string
character -> Error (1)
string <-> string
list (except alists) <-> invalid angle bracket array?
vector <-> array
alist w/o dotted pairs -> invalid angle bracket array
alist with dotted pairs -> Error (1)
hash-table -> Error (1)
plist <- invalid angle bracket array
standard object -> object
alist <- object (2)
alist <- object (3)
• (1) You could write a method to handle these
• (2) Using deserialize-raw
• (3) Using deserialize-json if a base class is given

#### Decoding

Trivial-json-codec has two functions that "deserialize" json strings. The first is deserialize-raw. Json arrays are converted to vectors and json objects are converted to alists.

(trivial-json-codec:deserialize-raw "[1.23,\"alpha\",false,null]")
#(1.23 "alpha" NIL NIL)


The second is deserialize-json. This function takes a json string, a CL class, a read-table and constructors. The last three default to nil. If you do not supply a CL class, deserialize-json can only handle native types:

(trivial-json-codec:deserialize-json "[1.23,\"alpha\",false,null]")
#(1.23 "alpha" NIL NIL)


If you do supply a class, it will try to create an instance of the class with data from the json string. Let's try with our simple person class which only has slots name and eye-colour.

(describe
 (trivial-json-codec:deserialize-json "{\"name\":\"Rebecca\",\"eye-colour\":\"blue\"}"
                                      :class (find-class 'person)))
#<PERSON {1010770303}>
[standard-object]

Slots with :INSTANCE allocation:
NAME                           = "Rebecca"
EYE-COLOUR                     = "blue"


To no one's surprise, if the json string is an array, it will just return a vector. If the array contains the proper json objects, it will return a vector of class objects:

(describe
 (aref
  (trivial-json-codec:deserialize-json
   "[{\"name\":\"Rebecca\",\"eye-colour\":\"blue\"}, {\"name\":\"Johann\",\"eye-colour\":\"brown\"}]"
   :class (find-class 'person))
  1))
#<PERSON {1011E67053}>
[standard-object]
Slots with :INSTANCE allocation:
NAME                           = "Johann"
EYE-COLOUR                     = "brown"


If you have a class hierarchy, it will work through the inherited classes. Each class will need to have at least one slot named differently.

#### Encoding

Trivial-json-codec has two functions for encoding: serialize and serialize-json. Serialize is the underlying generic function and takes a stream parameter as well as the data. Serialize-json uses serialize and returns a string.
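A minimal sketch of the relationship between the two, mirroring the vector example from the introduction above:

(trivial-json-codec:serialize #(1 2 3) *standard-output*) ; writes [1,2,3] to the stream
(trivial-json-codec:serialize-json #(1 2 3))              ; returns the same json as a string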

It does encode CLOS objects although you might want to write specialized methods to handle particular formatting. For example, local-time:timestamps were encoded as {"DAY": 7990, "SEC": 0, "NSEC": 0}, so if you wanted them to be in any particular javascript time representation, you would have to write a specialized method.

You would also need to write applicable methods for any structs, pathnames or hash-tables.

As mentioned above, encoding lists returns strings with angle brackets.

##### Hierarchy Use Case

I am just going to quote from the README here. (The quoted code was mangled in formatting: the defclass headers for the Payload subclasses and part of the Message class were lost. StringPayload is named later in the README; the other names below are reconstructed placeholders.)

(defclass Payload ()
  ())

;; placeholder name; the original subclass name was lost
(defclass IntPayload (Payload)
  ((value :type integer
          :initarg :value)))

(defclass StringPayload (Payload)
  ((value :type string
          :initarg :value)
   (message-id :type trivial-utilities:positive-fixnum
               :initarg :message-id)))

;; placeholder name; the original subclass name was lost
(defclass CargoPayload (Payload)
  ((cargo :type fixnum
          :initarg :cargo)))

(defclass Message ()
  ((uid :initarg :uid
        :initform nil
        :accessor uid)
   ;; slot reconstructed from the serialized output below
   (payload :type Payload
            :initarg :payload
            :initform nil
            :accessor payload)))

(c2mop:ensure-finalized (find-class 'Message))

(let ((message (make-instance 'Message :uid 1
                              :value 12345)))
  (trivial-json-codec:serialize-json message))
=> "{ \"UID\" : 1,  \"PAYLOAD\" : { \"VALUE\" : 12345}}"

(deserialize-json "{ \"UID\" : 1,  \"PAYLOAD\" : { \"VALUE\" : 12345}}"
                  :class (find-class 'Message))

(let ((message (make-instance 'Message
                              :uid 2
                              :value "abc"
                              :message-id 17)))
  (trivial-json-codec:serialize-json message))
=> "{ \"UID\" : 2,  \"PAYLOAD\" : { \"VALUE\" : \"abc\",
\"MESSAGE-ID\" : 17}}"

(deserialize-json "{ \"UID\" : 2,
                   \"PAYLOAD\" : { \"VALUE\" : \"abc\",
                   \"MESSAGE-ID\" : 17}}"
                  :class (find-class 'Message))

(let ((message (make-instance 'Message :uid 2
                              :cargo -147)))
  (trivial-json-codec:serialize-json message))
=> "{ \"UID\" : 2,  \"PAYLOAD\" : { \"CARGO\" : -147}}"

(deserialize-json "{ \"UID\" : 2,  \"PAYLOAD\" : { \"CARGO\" : -147}}"
                  :class (find-class 'Message))


Due to the known limitation mentioned in the description, the following is NOT possible:

(defclass StringPayload (Payload)
  ((value :type string
          :initarg :value)))

(let ((message
        (make-instance 'Message :uid 2
                       :value "abc")))
  (trivial-json-codec:serialize-json message))
=> "{ \"UID\" : 2,  \"PAYLOAD\" : { \"VALUE\" : \"abc\"}}"

(deserialize-json "{ \"UID\" : 2,  \"PAYLOAD\" : { \"VALUE\" : \"abc\"}}"
                  :class (find-class 'Message))
=> This terminates with an error due to non-unique class mapping.


#### Symmetry

From the standpoint of symmetry or round-tripping, going from Json->CL->Json, trivial-json-codec continued its insistence that json arrays should be shown with angle brackets. trivial-json-codec is really intended as a parser (one way) from json to CL, not as a serializer to json.

Interestingly, trivial-json-codec had no problems going from CL->Json->CL.

#### Security

There were two general security issues we considered: (1) interning keywords and (2) issues with malformed data. trivial-json-codec did not exhibit the first issue.

With respect to the second issue, trivial-json-codec rejected 101 out of 173 malformed test cases. It did not trigger the stack exhaustion issue exhibited by most of the other packages.

#### Conformity with Json Standard

trivial-json-codec accepted 90 of the 95 test cases that are considered "must accept". It failed to accept: [[] ], [0e+1], [1E+2], [1e+2], { "min": -1.0e+28, "max": 1.0e+28 }. If *read-default-float-format* is set to 'single-float, it would refuse to accept: [123e65], [123e45] and [123.456e78].

It accepted 13 of the 17 test cases considered to be part of the gray area of the json specification - you could accept or reject.

#### trivial-json-codec exported symbols

• serialize-json obj - Takes obj and serializes it into a string. Uses the generic serialize to do the job.
• deserialize-json json-str &key (class nil) (read-table nil) (constructors nil) - Reads json-str and creates a corresponding object. If class is non-nil and represents a class, an instance of it is returned; otherwise only built-in types can be deserialized. Read-table makes it possible to inject specific readers, as counterparts to serialize. It has the form of an alist containing the dispatch character as car and the deserialization function as cdr. Constructors holds an alist mapping the keyword returned by a specific reader to an object construction function.
• serialize obj stream - Serialize an object obj into stream. Implementations for some built-in types already exist. The user might extend these with methods for specific types.
• deserialize-raw json-str &key (read-table nil) - Deserialize json-str into a property list. As opposed to deserialize-json, this function does not require a base class to deserialize.

### yason

yason Hans Huebner BSD https://github.com/phmarek/yason

IMPORTANT: Notice the github location has moved. Hans Huebner's old github location will automatically redirect to Phil Marek's, but quicklisp is not (as of the time of this writing) pulling code from the new github location. It is actively maintained by Phil Marek and he is very responsive.

From the author: "the major difference between YASON and the other JSON libraries that were available when I wrote it is that YASON does not require the user to come up with a Lisp data structure that reflects the JSON data structure that should be generated. Rather, I wanted a way to generate JSON directly from my internal data structures.

The reason for that desire was that I had to generate different JSON format in different contexts. That is, a class instance would sometimes be generated including all its attributes, sometimes just with a select set of attributes and sometimes as a reference. Thus, there was no right way to render an object as JSON, and I found the approach to first generate a data structure that would then be rendered as JSON to be wasteful and, as CL has no literal syntax for hash tables, ugly.

Instead of going through an intermediate data structure, YASON allows you to encode to a JSON stream on the fly (http://common-lisp.net/project/yason/#stream-encoder)."

#### Mapping DataTypes and Structures

json   cl Notes
object -> hash-table :test #'equal Keys are strings by default (see *parse-object-key-fn*). Set *parse-object-as* to :alist in order to have yason parse objects as alists or to :plist to parse them as plists. When using plists, you probably want to also set *parse-object-key-fn* to a function that interns the object's keys to symbols.
array -> list Can be changed to read to vectors (see *parse-json-arrays-as-vectors*)
string <-> string JSON escape characters are recognized upon reading. Upon writing, known escape characters are used, but non-ASCII Unicode characters are written as is.
number <-> number Parsed with READ, printed with PRINC
true <-> t Can be changed to read as TRUE (see *parse-json-booleans-as-symbols*).
false <-> nil Can be changed to read as FALSE (see *parse-json-booleans-as-symbols*).
null <-> nil or :NULL nil unless yason:parse is called with yason:*parse-json-null-as-keyword* set to t or passing the keyword parameter :json-nulls-as-keyword t to yason:parse
(1) <- symbol
(2) <- object
(3) <- alists
• (1) If you are working with yason version 0.8.2 or above, you can set the special variable *symbol-encoder* to the function #'YASON:ENCODE-SYMBOL-AS-LOWERCASE. In that case symbols will be encoded as lower case strings in json. Otherwise attempts to encode symbols will trigger a no-applicable-method error.
• (2) Will trigger a no-applicable-method error. So you could write class specific encoding methods for your objects.
• (3) Will return an object. Undotted alists will return values in an array, dotted alists will return individual values for each key. See Yason Encoding Alists below.
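The plist advice in the table can be sketched as follows (hedged: yason:parse's :object-key-fn keyword is assumed to correspond to *parse-object-key-fn*, as :object-as corresponds to *parse-object-as*):

(yason:parse "{\"a\": 1, \"b\": 2}"
             :object-as :plist
             :object-key-fn (lambda (key)
                              (intern (string-upcase key) :keyword)))

which should return a plist along the lines of (:A 1 :B 2).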

#### Decoding

Yason uses its generic function parse to generate a hash-table of the received json-object. It is possible to set the special variable *parse-object-as* :hash-table, :plist or :alist to specify the data structure that objects are parsed into. The default is :hash-table with test being equal. It is also possible to set the special variable *parse-json-arrays-as-vectors* to t, in which case the json arrays will be parsed as vectors and not as lists.

The following are decoding tests on JSON arrays, arrays in arrays and embedded arrays in an object and the results. I am using the additional yason parameters to show the alist representation instead of a hash-table and to return JSON arrays as vectors rather than lists.

(defparameter *short-encoded-items-A*
  "[\"items\",
{\"index\":1,
\"float\":19.2041,
\"name\":\"Jennifer\",
\"surname\":\"Snow7\",
\"fullname\":\"Andrew4 Vaughan\",\"email\":\"sherri@ritchie.zw\",
\"bool\":null},
{\"index\":2,
\"float\":14.9888,\"name\":\"Alfred\",
\"surname\":\"Pitts\",\"fullname\":\"Barry Weiner\",
\"email\":\"cheryl@craven.re\",\"bool\":null}]")

(yason:parse *short-encoded-items-A* :json-arrays-as-vectors t :object-as :alist)

'("items"
(("bool") ("email" . "sherri@ritchie.zw") ("fullname" . "Andrew4 Vaughan")
("surname" . "Snow7") ("name" . "Jennifer") ("float" . 19.2041)
("index" . 1))
(("bool") ("email" . "cheryl@craven.re") ("fullname" . "Barry Weiner")
("surname" . "Pitts") ("name" . "Alfred") ("float" . 14.9888)
("index" . 2)))

##### Nested JSON Objects

Yason does require that you completely parse the json data instead of being able to filter it while taking it in. Taking the nested JSON object below, how could we get information out of the innermost nested object?

{
  "items": [
    {
      "index": 1,
      "integer": 29,
      "float": 16.8278,
      "fullname": "Milton Jensen",
      "bool": false
    }
  ]
}


You need to know your data structure so that you can figure out how to walk the tree. How would we get the value of the key "integer"? Looking at it, it is a JSON object whose key "items" contains an array which contains a JSON object.

By default, yason decodes json objects to hash-tables and arrays into lists. So we can descend the parsed json tree in this particular example something like this (assuming the json object was in a file named json4.txt):

(gethash "integer"
         (first
          (gethash "items"
                   (yason:parse #P"json4.txt"))))

29


#### Encoding

The base yason encoding function is encode, which takes some data and an optional stream, defaulting to *standard-output*. It writes the json encoding to the stream and returns the lisp item, so at the REPL you see both:

(yason:encode '("a" 1 "b" 2))
["a",1,"b",2]
("a" 1 "b" 2)


Yason has multiple specialized methods that may be required, depending on the data structure to be encoded. For example: yason:encode-alist, yason:encode-plist, yason:encode-object, etc. The following demonstrates that providing a plist to yason:encode results in a json array, but providing a plist to yason:encode-plist results in a json object.

(yason:encode '("a" 1 "b" 2))
["a",1,"b",2]

(yason:encode-plist '("a" 1 "b" 2))
{"a":1,"b":2}

##### Encoding Symbols

The version of yason in quicklisp as of this writing (version 0.7.8) has no applicable method for encoding a symbol. This also means that by default, yason will not encode lists with symbols. That is resolved in version 0.8.2 on github, but for some reason it is not getting picked up in quicklisp. You can resolve this in version 0.7.8 by writing a new method to handle symbols. Such a method could look like this:

(defmethod encode ((object symbol) &optional (stream *json-output*))
  (let ((new (funcall #'ENCODE-SYMBOL-AS-LOWERCASE object)))
    (assert (stringp new))
    (encode new stream)))


If you have symbols in alists as keys, you can (setf yason:*symbol-key-encoder* 'yason:ENCODE-SYMBOL-AS-LOWERCASE) and then yason:encode-alist will work, but not yason:encode.

(yason:encode-alist '((a 1) (b 2)))
{"a":[1],"b":[2]}
((A 1) (B 2))

##### Encoding Alists

Unlike some other libraries, yason's encode-alist function handles both dotted and undotted alists. It will return an object. The values in undotted alists will be returned as arrays, the values in dotted alists will be returned as individual values. As mentioned just above in dealing with symbols, if you have symbols as keys in your alist, you need to have set *symbol-key-encoder* to 'yason:ENCODE-SYMBOL-AS-LOWERCASE in order to make them usable without triggering errors.

(setf yason:*symbol-key-encoder* 'yason:ENCODE-SYMBOL-AS-LOWERCASE)

(yason:encode-alist '((a 1 2 3 4) (b 5 6 7)))
{"a":[1,2,3,4],"b":[5,6,7]}

(yason:encode-alist '((a . 1) (b . 2)))
{"a":1,"b":2}

##### Encoding CLOS

For objects, you will have to write your own method to extend yason to encode objects using the generic functions encode-object and encode-slots or encode-object-slots. Taking our simple person class, we can define a new encode-slots, then within an output context, we can call encode-object and output the clos class instance.

(defmethod yason:encode-slots progn ((person person))
  (yason:encode-object-slots person '(name eye-colour)))

(yason:with-output-to-string* ()
  (yason:encode-object (make-instance 'person)))
"{\"NAME\":\"Sabra\",\"EYE-COLOUR\":\"brown\"}"


Or you can take a look at json-mop or herodotus. I have a preference for json-mop with the caveat that, at the moment, you cannot redefine classes.

##### Incremental Encoding

The following examples use two exercises. First, incrementally build a json array. Second, incrementally build a json object which also contains an incrementally built json array.

(yason:with-output (*standard-output*)
  (yason:with-array ()
    (dotimes (i 3)
      (yason:encode-array-element i))))
[0,1,2]

(yason:with-output (*standard-output*)
  (yason:with-object ()
    (yason:encode-object-element "hello" "hu hu")
    (yason:with-object-element ("harr")
      (yason:with-array ()
        (dotimes (i 3)
          (yason:encode-array-element i))))))
{"hello":"hu hu","harr":[0,1,2]}


#### Symmetry

From the standpoint of symmetry or round-tripping, going from Json->CL->Json, yason had no issues. Going from CL->Json->CL was more problematic solely because keyword symbols were written to strings and :NULL was converted to nil.
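A minimal sketch of that asymmetry, assuming yason 0.8.x with the symbol key encoder bound as discussed above:

```lisp
;; Sketch (not from the yason docs) of why CL->Json->CL does not
;; round-trip cleanly. Assumes yason 0.8.x.
(setf yason:*symbol-key-encoder* 'yason:encode-symbol-as-lowercase)

;; CL -> Json: the keyword key :A is written out as the string "a".
(yason:encode-alist '((:a . 1)))
;; prints {"a":1}

;; Json -> CL: parsing it back yields a hash-table keyed by the
;; string "a", not the keyword :A.
(gethash "a" (yason:parse "{\"a\":1}"))
;; => 1

;; Json null likewise comes back as NIL unless
;; *parse-json-null-as-keyword* is set, so :NULL is lost on the way back.
(yason:parse "[null]")
;; => (NIL)
```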

#### Security

There were two general security issues we considered: (1) interning keywords and (2) issues with malformed data. yason did not exhibit the first issue.

With respect to the second issue, yason rejected 119 out of 173 malformed test cases. Unfortunately it did trigger the stack exhaustion issue exhibited by most of the other packages.

#### Conformity with Json Standard

yason accepted 95 of the 95 test cases that are considered "must accept". If *read-default-float-format* is set to 'single-float, it would refuse to accept: [123e65], [123e45] and [123.456e78].

It accepted 8 of the 17 test cases considered to be part of the gray area of the json specification - you could accept or reject.

#### yason exported symbols

• *list-encoder* - function to call to translate a cl list into json data. 'yason:encode-plain-list-to-array is the default; 'yason:encode-plist and 'yason:encode-alist are available to produce json objects. this is useful to translate a deeply recursive structure in a single yason:encode call.
• *parse-json-arrays-as-vectors* - if set to a true value, json arrays will be parsed as vectors, not as lists. nil is the default.
• *parse-json-booleans-as-symbols* - if set to a true value, json booleans will be read as the symbols true and false instead of t and nil, respectively. nil is the default.
• *parse-json-null-as-keyword* - if set to a true value, json null will be read as the keyword :null, instead of nil. nil is the default.
• *parse-object-as* - can be set to :hash-table to parse objects as hash tables, :alist to parse them as alists or :plist to parse them as plists. :hash-table is the default.
• *parse-object-as-alist*
• *parse-object-key-fn* - function to call to convert a key string in a json object to a key in the cl hash produced. identity is the default.
• *symbol-encoder* - warning: as of the time of this writing, this variable is in version 0.8.2 on github and not in version 0.7.8 in quicklisp. function to call to translate a cl symbol into a json string. the default is to error out, to provide backwards-compatible behaviour. a useful function that can be bound to this variable is yason:encode-symbol-as-lowercase
• *symbol-key-encoder* - defines the policy to encode symbols as keys (e.g. in hash tables). the default is to error out, to provide backwards-compatible behaviour. a useful function that can be bound to this variable is yason:encode-symbol-as-lowercase
• encode - encode object in json format and write to stream. may be specialized by applications to perform specific rendering. stream defaults to *standard-output*.
• encode-alist - encodes object, an alist, in json format and write to stream.
• encode-array-element - encode object as next array element to the last json array opened with with-array in the dynamic context. object is encoded using the encode generic function, so it must be of a type for which an encode method is defined.
• encode-array-elements - encode objects, a series of json encodable objects, as the next array elements in a json array opened with with-array. encode-array-elements uses encode-array-element, which must be applicable to each object in the list (i.e. encode must be defined for each object type). additionally, this must be called within a valid stream context.
• encode-object - generic function to encode an object. the default implementation opens a new object encoding context and calls encode-slots on the argument.
• encode-object-element - encode key and value as object element to the last json object opened with with-object in the dynamic context. key and value are encoded using the encode generic function, so they both must be of a type for which an encode method is defined.
• encode-object-elements - encodes the parameters into json in the last object opened with with-object using encode-object-element. the parameters should consist of alternating key/value pairs, and this must be called within a valid stream context.
• encode-object-slots - encodes each slot in slots for object in the last object opened with with-object using encode-object-element. the key is the slot name, and the value is the slot value for the slot on object.
• encode-plain-list-to-array
• encode-plist - encodes object, a plist, in json format and write to stream.
• encode-slots - generic function to encode object slots. there is no default implementation. it should be called in an object encoding context. it uses progn combination with most-specific-last order, so that base class slots are encoded before derived class slots.
• encode-symbol-as-lowercase
• false
• make-json-output-stream - creates a json-output-stream instance that wraps the supplied stream and optionally performs indentation of the generated json data. the indent argument is described in with-output. note that if the indent argument is nil, the original stream is returned in order to avoid the performance penalty of the indentation algorithm.
• no-json-output-context - this condition is signalled when one of the stream encoding functions is used outside the dynamic context of a with-output or with-output-to-string* body.
• null
• parse - input &key (object-key-fn *parse-object-key-fn*) (object-as *parse-object-as*) (json-arrays-as-vectors *parse-json-arrays-as-vectors*) (json-booleans-as-symbols *parse-json-booleans-as-symbols*) (json-nulls-as-keyword *parse-json-null-as-keyword*) => object. parse input, which must be a string or a stream, as json. returns the lisp representation of the json structure parsed. the keyword arguments object-key-fn, object-as, json-arrays-as-vectors, json-booleans-as-symbols, and json-nulls-as-keyword may be used to specify different values for the parsing parameters from the current bindings of the respective special variables.
• true
• with-array - open a json array, then run body. inside the body, encode-array-element must be called to encode elements to the opened array. must be called within an existing json encoder context (see with-output and with-output-to-string*)
• with-object - open a json object, then run body. inside the body, encode-object-element or with-object-element must be called to encode elements to the object. must be called within an existing json encoder with-output and with-output-to-string*.
• with-object-element - open a new encoding context to encode a json object element. key is the key of the element. the value will be whatever body serializes to the current json output context using one of the stream encoding functions. this can be used to stream out nested object structures.
• with-output - macro: set up a json streaming encoder context on stream, then evaluate body. indent can be set to t to enable indentation with a default indentation width or to an integer specifying the desired indentation width. by default, indentation is switched off.
• with-output-to-string* - set up a json streaming encoder context on stream-symbol (by default a gensym), then evaluate body. return a string with the generated json output. see with-output for the description of the indent keyword argument.
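As a quick illustration of the parse keyword arguments documented above, they can override the special variables for a single call (a sketch):

```lisp
;; Sketch: overriding two parsing parameters for one call rather than
;; rebinding the special variables.
(yason:parse "{\"a\": [1, null, true]}"
             :object-as :alist
             :json-arrays-as-vectors t)
;; returns the alist (("a" . #(1 NIL T))) -- the object becomes an
;; alist and the array a vector; null and true become NIL and T.
```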

## Helper Libraries

### cl-json-helper

cl-json-helper Bob Felts BSD homepage cl-json Last updated 2018

cl-json-helper is a very small package that adds two functions to help cl-json encode JSON data and one function to help process decoded JSON data. I will use its nickname :xjson in the examples. At the end of the day, I see a slight benefit here, but given that it is really just a cl-json helper and the latest generation of json libraries have passed cl-json by, I think this is only for cl-json devotees.

#### cl-json-helper exported symbols

• json-bool - (json-bool val) returns an object that cl-json will decode to "true" or "false"
• json-empty - (json-empty) returns an object that cl-json will decode to '{}'
• json-key-value - (json-key-value key list) returns the value associated with key
• value-of

#### Json-bool

json-bool simply adds the ability for nil to be translated to json as false.

(cl-json:encode-json-to-string '(t nil))
"[true,null]"

(cl-json:encode-json-to-string `(,(xjson:json-bool t) ,(xjson:json-bool nil)))
"[true,false]"


#### json-empty

json-empty provides the ability to return an empty object. E.g.

(cl-json:encode-json-to-string `(("Empty" . ,(xjson:json-empty))))
"{\"Empty\":{}}"


#### Json-key-value

If cl-json:decode-json-from-string applied to *nested-address-1* looks like this:

(cl-json:decode-json-from-string *nested-address-1*)
((:FIRST--NAME . "George") (:LAST--NAME . "Washington")
 (:BIRTHDAY . "1732-02-22")
 (:ADDRESS (:STREET--ADDRESS . "3200 Mount Vernon Memorial Highway")
  (:CITY . "Mount Vernon") (:STATE . "Virginia") (:COUNTRY . "United States")))


Then you can access the address using cl-json-helper

(xjson:json-key-value :address (cl-json:decode-json-from-string *nested-address-1*))
((:STREET--ADDRESS . "3200 Mount Vernon Memorial Highway")
(:CITY . "Mount Vernon") (:STATE . "Virginia") (:COUNTRY . "United States"))


The only difference between using this and using (assoc ..) is that assoc will also keep the key:

(assoc :address (cl-json:decode-json-from-string *nested-address-1*))
(:ADDRESS (:STREET--ADDRESS . "3200 Mount Vernon Memorial Highway")
 (:CITY . "Mount Vernon") (:STATE . "Virginia") (:COUNTRY . "United States"))
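Put differently, json-key-value behaves like cdr composed with assoc; a sketch on a literal alist of the same shape as the decoded data:

```lisp
;; Sketch: json-key-value on a decoded alist is essentially
;; (cdr (assoc key alist)). Shown on a literal alist shaped like the
;; decoded *nested-address-1*.
(let ((decoded '((:first--name . "George")
                 (:address (:city . "Mount Vernon") (:state . "Virginia")))))
  (cdr (assoc :address decoded)))
;; => ((:CITY . "Mount Vernon") (:STATE . "Virginia"))
```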


### define-json-expander

define-json-expander Johan Sjölén MIT homepage cl-json Last updated 2014

Define-json-expander is, I think, interesting to look at, but I do not see it adding a great deal of value. Given that the last time it was updated was eight years ago, I think it is of historical interest only.

#### define-json-expander exported symbols

• *accessor-prefix*
• define-json-expander

#### Explanation

Let's take our simplest json object address example which we keep in address-1:

"{
\"name\": \"George Washington\",
\"birthday\": \"February 22, 1732\",
\"address\": \"Mount Vernon, Virginia, United States\"
}"


We now use define-json-expander to create an address class that can be used:

(define-json-expander:define-json-expander address-class ())


Now we have a decode-address function that we can apply to a cl-json decoded json string:

  (decode-address-class  (cl-json:decode-json-from-string *address-1*))

[standard-object]

Slots with :INSTANCE allocation:
REST                           = NIL
NAME                           = "George Washington"
BIRTHDAY                       = "February 22, 1732"
ADDRESS                        = "Mount Vernon, Virginia, United States"


HOWEVER, as far as I can tell, the only accessor or reader method defined is for REST, which has a nil value. Net result: even after reading the docs and the test cases in the source files, I am unsure whether this is fully baked. Given the last update was 8 years ago, I would give this a pass.

### herodotus

herodotus Henry Steere BSD (1) homepage yason Wrapper around yason to handle CLOS classes more easily. Last updated 16 Jun 2021

Yason does the json parsing and serialisation for herodotus. The other dependencies are cl-ppcre and alexandria. I think it may be useful if you are a yason user.

#### herodotus exported symbols

• define-json-model
• to-json

#### Setup

Let's make an analogue of our simple person class with herodotus:

(herodotus:define-json-model herodotus-person (name eye-colour) :snake-case)


This creates a package named HERODOTUS-PERSON-JSON with a function for parsing json, HERODOTUS-PERSON-JSON:FROM-JSON, and a generic method for writing to json. It has accessors and initargs of name and eye-colour. The created class only has accessor and initarg specs for the slots; it does not have the ability to create default values. The last value specifies that it expects json keys to use snake-case. (The default is :camel-case; the other options are :snake-case, :kebab-case and :screaming-snake-case.)

From the README: "The define-json-model macro takes three arguments: name, slots and an optional argument for case-type. The name argument is the name of the generated CLOS class. The slots argument is a collection of slot descriptors and the case-type argument is a keyword.

Slot descriptors can be either symbols or lists. If a slot descriptor is a symbol then the value of the corresponding CLOS slot will be a deserialised json primitive in lisp form: a number, boolean, string, vector (for arrays), or hash-table (for objects).

If a slot descriptor is a list then first argument is the CLOS slot name, the second argument is either () or the name of a previously defined json model to deserialise the value of this field to. The optional third argument is a special case name for this field which can have custom formatting."

#### Decoding

At this point a simple json object could then be decoded into a herodotus-person instance.

(eye-colour
(herodotus-person-json:from-json "{\"name\":\"Claudia\",\"eye_colour\":\"blue\"}"))
"blue"


If the json object key was "eye-colour" instead of "eye_colour", the return value would have been nil because the case would have been wrong.
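A sketch of that case sensitivity, assuming the herodotus-person model defined above with :snake-case:

```lisp
;; Sketch: the model declared :snake-case, so a kebab-case key is not
;; matched and the slot is left empty, giving NIL.
(eye-colour
 (herodotus-person-json:from-json "{\"name\":\"Claudia\",\"eye-colour\":\"blue\"}"))
;; => NIL
```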

If you have an array of json objects, the from-json function would return a vector of herodotus-person objects:

(name
 (aref
  (herodotus-person-json:from-json
   "[{\"name\":\"Rebecca\",\"eye_colour\":\"blue\"},{\"name\":\"Johann\",\"eye_colour\":\"brown\"}]")
  1))
"Johann"


#### Encoding

Encoding usage looks much like yason:

(herodotus:to-json (make-instance 'herodotus-person :eye-colour "Green" :name "Persephone"))
"{\"name\":\"Persephone\",\"eye_colour\":\"Green\"}"


##### Nested classes

You can also define classes that have class members using the type specifier syntax. This block defines two json models, tree and branch. A tree has branch members and the branch members will be parsed from json using the parser defined for branch.

CL-USER> (herodotus:define-json-model branch (size))
CL-USER> (herodotus:define-json-model tree ((branches branch)))


The syntax (branches branch) declares that the field named branches must be parsed as the type branch. Json models for nested classes need to be defined before the models for the classes they are nested in or an error will be thrown. The error is thrown at macro expansion time.

CL-USER> (herodotus:define-json-model test-no-parser ((things not-parseable)))
class-name TEST-NO-PARSER slots ((THINGS NOT-PARSEABLE))
; Evaluation aborted on #<SIMPLE-ERROR "Could not find parser for
; class NOT-PARSEABLE. Please define a json model for it."
; {100599D903}>.


##### None, one or many semantics

Fields in class definitions are parsed as either nil (if missing from the json), a single instance if the field is not an array and isn’t empty or a vector if the json contains an array of elements.

CL-USER> (herodotus:define-json-model numbers (ns))
CL-USER> (ns (numbers-json:from-json "{ }"))
NIL
CL-USER> (ns (numbers-json:from-json "{ \"ns\": 1 }"))
1
CL-USER> (ns (numbers-json:from-json "{ \"ns\": [1, 2, 3] }"))
#(1 2 3)

##### Special case field names

Parsing specific field names can be done using the third argument of a field specifier. If a special field name is provided it doesn’t have to match the name of the slot in the CLOS class and can use any formatting convention.

(herodotus:define-json-model special-case ((unusual-format () "A_very-UniqueNAME")))

(unusual-format (special-case-json:from-json "{ \"A_very-UniqueNAME\": \"Phineas Fog\" }"))
"Phineas Fog"

(herodotus:to-json (special-case-json:from-json "{ \"A_very-UniqueNAME\": \"Phineas Fog\" }"))
"{\"A_very-UniqueNAME\":\"Phineas Fog\"}"


### json-mop

json-mop Grim Schjetne MIT homepage yason Last updated 9 Mar 2021

Unlike herodotus, which creates a new package, json-mop wants you to define your classes with the class option

(:metaclass json-serializable-class)


I quite like json-mop (assuming I was using yason), but you do need to watch for one issue: At the moment you cannot redefine a class. The issue has been flagged twice on github and I ran into it in testing.

For slots that you want to appear in the JSON representation of your class, add the slot option :json-key with the string to use as the attribute name. The option :json-type defaults to :any, but you can control how each slot value is transformed to and from JSON with one of the following JSON type specifiers:

Type Remarks
:any Guesses the way to encode and decode the value
:string Enforces a string value
:number Enforces a number value
:hash-table Enforces a hash table value
:vector Enforces a vector value
:list Enforces a list value
:bool Maps T and NIL with true and false
<symbol> Uses a (:metaclass json-serializable-class) class definition to direct the transformation of the value

So, taking our minimal person object and following these instructions:

(defclass mop-person ()
  ((name
    :initarg :name :initform "Sabra" :json-key "name" :json-type :string
    :accessor name)
   (eye-colour :initarg :eye-colour :json-key "eye-colour" :json-type :string
               :initform "brown"
               :accessor eye-colour))
  (:metaclass json-mop:json-serializable-class))
#<JSON-MOP:JSON-SERIALIZABLE-CLASS JSON-TESTS::MOP-PERSON>

(json-mop:encode (make-instance 'mop-person))
{"name":"Sabra","eye-colour":"brown"}


We can get a CLOS instance out of an appropriate json object with (json-to-clos data class-name)

(describe (json-mop:json-to-clos "{\"name\":\"Karla\",\"eye-colour\":\"green\"}"
                                 'mop-person))
#<MOP-PERSON {1009BE24B3}>
[standard-object]

Slots with :INSTANCE allocation:
NAME                           = "Karla"
EYE-COLOUR                     = "green"


It gets interesting when you create an instance with an invalid data type. Suppose we create a mop-person with a list for a name and an integer for an eye colour. The instance gets created that way, but when we try to encode it, we get an error:

(json-mop:encode (make-instance 'mop-person :name '(1 2 3) :eye-colour 12))
{

There is no class named :STRING.
[Condition of type SB-PCL:CLASS-NOT-FOUND-ERROR]


If the json object actually has a null value for eye-colour, that will be treated as an unbound slot. Our mop-person class has an initform of "brown" for eye-colour and our initform will overrule the json null:

(describe (json-mop:json-to-clos "{\"name\":\"Andre\",\"eye-colour\":null}"  'mop-person))
#<MOP-PERSON {1009C0C693}>
[standard-object]

Slots with :INSTANCE allocation:
NAME                           = "Andre"
EYE-COLOUR                     = "brown"


The README has an example of nested objects:

(defclass book ()
  ((title :initarg :title
          :json-type :string
          :json-key "title")
   (published-year :initarg :year
                   :json-type :number
                   :json-key "year_published")
   (fiction :initarg :fiction
            :json-type :bool
            :json-key "is_fiction"))
  (:metaclass json-serializable-class))

(defclass author ()
  ((name :initarg :name
         :json-type :string
         :json-key "name")
   (birth-year :initarg :year
               :json-type :number
               :json-key "year_birth")
   (bibliography :initarg :bibliography
                 :json-type (:list book)
                 :json-key "bibliography"))
  (:metaclass json-serializable-class))

(defparameter *author*
  (make-instance 'author
                 :name "Mark Twain"
                 :year 1835
                 :bibliography
                 (list
                  (make-instance 'book
                                 :title "The Gilded Age: A Tale of Today"
                                 :year 1873
                                 :fiction t)
                  (make-instance 'book
                                 :title "Life on the Mississippi"
                                 :year 1883
                                 :fiction nil)
                  (make-instance 'book
                                 :year 1884
                                 :fiction t))))
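A quick way to exercise the book class from the README example (a sketch; json-mop:encode is used here the same way as in the earlier mop-person example, and key order in the output may differ):

```lisp
;; Sketch: encoding one fully initialized book instance from the
;; README example; :bool maps NIL to false per the table above.
(json-mop:encode (make-instance 'book
                                :title "Life on the Mississippi"
                                :year 1883
                                :fiction nil))
;; prints something like
;; {"title":"Life on the Mississippi","year_published":1883,"is_fiction":false}
```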


#### json-mop exported symbols

• json-mop:encode
• json-mop:json-serializable
• json-mop:json-serializable-class
• json-mop:json-to-clos
• json-mop:json-type
• json-mop:json-type-error
• json-mop:no-values-class
• json-mop:no-values-hash-table
• json-mop:no-values-parsed
• json-mop:null-in-homogenous-sequence
• json-mop:null-value
• json-mop:slot-name
• json-mop:slot-not-serializable
• json-mop:to-json-value
• json-mop:to-lisp-value

### cl-json-pointer

cl-json-pointer Yokata Yuki MIT homepage cl-json, st-json, yason, jsown, jonathan, json-streams, com.gigamonkeys.json (1) 40ants comments
• (1) There is an outstanding issue asking for com.inuoe.jzon support opened on May 14, 2021, but no action.

I am going to borrow 40ants' review here:

This library implements RFC 6901 - a format for accessing nested JSON data-structures. In some sense, JSON pointer is similar to JSON path, but more suitable for use as part of a URL fragment.

cl-json-pointer's README provides many examples, but all of them are applied to an object which is almost flat. Let's try to reproduce an example from the JSON path site:

{
  "firstName": "John",
  "lastName" : "doe",
  "age"      : 26,
  "address"  : {
    "city"         : "Nara",
    "postalCode"   : "630-0192"
  },
  "phoneNumbers": [
    {
      "type"  : "iPhone",
      "number": "0123-4567-8888"
    },
    {
      "type"  : "home",
      "number": "0123-4567-8910"
    }
  ]
}


Now we'll translate the JSON path $.phoneNumbers[0].type into the JSON pointer /phoneNumbers/0/type:

(defparameter *obj* (jsown:parse (alexandria:read-file-into-string "data.json")))

(cl-json-pointer:get-by-json-pointer *obj* "/phoneNumbers/0/type" :flavor :jsown)
"iPhone"
("type" . "iPhone")
NIL


It is also possible to add/set/delete elements using cl-json-pointer. You will find more examples in the official docs. Compared to JSON path, the pointer has clearer character escaping rules and is able to work with keys containing dots, slashes and other symbols. But it does not support slicing and some other features of JSON path.

#### cl-json-pointer exported symbols

• json-pointer-error
• *json-object-flavor* - Default flavor of JSON library currently used. This value is used for the :FLAVOR argument of exported functions. Currently acceptable values are held by *cl-json-pointer-supported-json-flavors*.
• *cl-json-pointer-supported-json-flavors*
• parse-json-pointer - Parses OBJ to an internal representation.
• get-by-json-pointer - Traverses OBJ with POINTER and returns three values: the found value (nil if not found), a generalized boolean saying whether the place pointed to by POINTER exists, and NIL.
• exists-p-by-json-pointer - Traverses OBJ with POINTER and returns whether the place pointed to by POINTER exists.
• set-by-json-pointer - Traverses OBJ with POINTER, sets VALUE into the pointed-to place, and returns the modified OBJ.
• add-by-json-pointer - Works the same as set-by-json-pointer, except it tries to make a new list when setting to lists.
• delete-by-json-pointer - Traverses OBJ with POINTER, deletes the pointed-to place, and returns the modified OBJ.
• remove-by-json-pointer - Works the same as delete-by-json-pointer, except it tries to make a new list when deleting from lists.
• update-by-json-pointer - Modify macro of set-by-json-pointer. This sets the result of set-by-json-pointer to the referred place.
• deletef-by-json-pointer - Modify macro of delete-by-json-pointer. This sets the result of delete-by-json-pointer to the referred place.

### cl-json-schema

cl-json-schema Mark Skilbeck MIT homepage

Per https://json-schema.org/, Json Schema is a vocabulary that allows you to annotate and validate json documents. cl-json-schema is an attempt to go from a json schema to a clos class definition. The intended sequel, https://gitlab.com/Gnuxie/json-schema2/, has not really been developed. The author's description in a reddit post was: "It is bad. It should only be used to learn and do things better. I don't feel comfortable providing any form of support for this hack. But if you have a directory on the filesystem with some JSON schema in it, you should be able to use the macro json-schema2:define-schema-spec and list the pathname in there. You might get lucky and find everything you need is supported when you view the macroexpansion."

The sum total of documentation for cl-json-schema is the following: The main entrypoint is (cl-json-schema:validate thing schema) where thing is a JSON-compatible value, and schema is a hash-table. (Alternatively, if you prefer, (json-schema:validate thing schema).) For example:

(let ((schema (yason:parse "{
  \"type\": \"object\",
  \"propertyNames\": {
    \"pattern\": \"^[A-Za-z_][A-Za-z0-9_]*$\"
  }
}")))
  ;; NEAT!
  (json-schema:validate (yason:parse "{\"_a_proper_token_001\": \"value\"}")
                        schema)
  ;; NO BUENO! Key does not match the required pattern
  (json-schema:validate (yason:parse "{\"001 invalid\": \"value\"}")
                        schema))


## Appendix

### Json Refresher

JSON (JavaScript Object Notation) is a lightweight data-interchange format, codified in RFC 8259. JSON has the following data structures:

• object: { "key1": "value1", "key2": "value2" }
• array: [ "first", "second", "third" ]
• number: 42, 3.1415926 (note that there is no separate type for integer or floating point).
• string: "This is a string"
• boolean: true, false
• null: null

Json objects can be nested, potentially creating the need for a json schema. Taking an example from https://json-schema.org/understanding-json-schema/about.html, we can see two json objects containing information about a person:

{
"name": "George Washington",
"birthday": "February 22, 1732",
"address": "Mount Vernon, Virginia, United States"
}

{
  "first_name": "George",
  "last_name": "Washington",
  "birthday": "1732-02-22",
  "address": {
    "street_address": "3200 Mount Vernon Memorial Highway",
    "city": "Mount Vernon",
    "state": "Virginia",
    "country": "United States"
  }
}


If you wanted to validate the json objects, you would need a schema which you could check against the objects. The first object would fail validation and the second object would pass validation if they were validated against the following schema:

{
  "type": "object",
  "properties": {
    "first_name": { "type": "string" },
    "last_name": { "type": "string" },
    "birthday": { "type": "string", "format": "date" },
    "address": {
      "type": "object",
      "properties": {
        "street_address": { "type": "string" },
        "city": { "type": "string" },
        "state": { "type": "string" },
        "country": { "type" : "string" }
      }
    }
  }
}


### Decoding Trivial-Benchmark Summary

The following summary table was generated by trivial-benchmark from the libraries parsing the countries.json file (a small 1.2 MB file) downloaded from https://github.com/mledoze/countries/blob/master/countries.json, cumulated over 100 runs.

Table 59: Parsing a 1.2 MB Json File
Library Function User Run Time (Sec) Bytes Consed
boost-json parse 4.419011 2893418496
cl-json decode-json-from-string 4.493974 2893418880
com.gigamonkeys.json parse-json 2.959515 1343528080
com.inuoe.jzon parse 8.542783 789266144
jonathan decode-json-from-string 214.78308 1569003877056
json-lib parse 6.463969 3588339072
json-streams json-parse :duplicate-key-check nil 9.426854 1879613408
jsown parse 1.166661 564075440
yason parse 5.954201 1454553056
Table 60: Parsing a 9.8 MB Json File
Library Function User Run Time (Sec) Bytes Consed
boost-json parse 47.6092 28504785264
cl-json decode-json-from-string 47.9664 28504799776
com.gigamonkeys.json parse-json 33.7283 16456969856
com.inuoe.jzon parse 20.2055 5946667616
jonathan decode-json-from-string 1478.3938 8950991326592
json-lib parse 68.3475 37769586576
json-streams json-parse :duplicate-key-check nil 84.3768 24939253568
jsown parse 5.4876 4919788560
yason parse 68.2322 14313634192

### Decoding Trivial-Benchmark Detail By Library

#### 1.2 MB Json File

##### Boost-Json
Table 61: boost-json Parsing a 1.2 MB Json File
SAMPLES TOTAL MINIMUM MAXIMUM MEDIAN DEVIATION
REAL-TIME 100 1.803323 0.016666 0.023333 0.016667 0.001832
RUN-TIME 100 1.804574 0.01772 0.025428 0.017829 0.001108
USER-RUN-TIME 100 1.804623 0.017721 0.02543 0.017829 0.001108
SYSTEM-RUN-TIME 100 0.000015 0 0.000007 0 0.000001
PAGE-FAULTS 100 0 0 0 0 0.0
GC-RUN-TIME 100 17.386 0 6.855 0 1.008626
BYTES-CONSED 100 418195312 4170304 4205280 4188976 8412.935
EVAL-CALLS 100 0 0 0 0 0.0
##### Cl-json
Table 62: cl-json Parsing a 1.2 MB Json File
SAMPLES TOTAL MINIMUM MAXIMUM MEDIAN DEVIATION
REAL-TIME 100 4.543314 0.039999 0.106666 0.043333 0.006987
RUN-TIME 100 4.541149 0.042466 0.106167 0.043373 0.00693
USER-RUN-TIME 100 4.493974 0.037929 0.089708 0.043293 0.005491
SYSTEM-RUN-TIME 100 0.047482 0 0.016466 0 0.002092
PAGE-FAULTS 100 0 0 0 0 0.0
GC-RUN-TIME 100 160.947 0 59.966 0 6.378449
BYTES-CONSED 100 2893418880 28926752 28943680 28931936 7004.0913
EVAL-CALLS 100 0 0 0 0 0.0
##### Com.gigamonkeys.json
Table 63: com.gigamonkeys.json Parsing a 1.2 MB Json File
SAMPLES TOTAL MINIMUM MAXIMUM MEDIAN DEVIATION
REAL-TIME 100 2.99332 0.026666 0.083333 0.03 0.006055
RUN-TIME 100 2.996168 0.027761 0.083301 0.028399 0.005998
USER-RUN-TIME 100 2.959515 0.023301 0.07341 0.028338 0.005142
SYSTEM-RUN-TIME 100 0.036928 0 0.009893 0 0.001397
PAGE-FAULTS 100 0 0 0 0 0.0
GC-RUN-TIME 100 116.258 0 54.453 0 5.885685
BYTES-CONSED 100 1343528080 13383008 17653728 13392608 423975.53
EVAL-CALLS 100 0 0 0 0 0.0
##### Com.inuoe.jzon
Table 64: com.inuoe.jzon Parsing a 1.2 MB Json File
SAMPLES TOTAL MINIMUM MAXIMUM MEDIAN DEVIATION
REAL-TIME 100 8.556628 0.083332 0.103333 0.086666 0.002791
RUN-TIME 100 8.552521 0.084413 0.104917 0.084938 0.002441
USER-RUN-TIME 100 8.542783 0.084415 0.094946 0.084942 0.001753
SYSTEM-RUN-TIME 100 0.0 100 22 0.0001 0.000992
PAGE-FAULTS 100 0 0 0 0 0.0
GC-RUN-TIME 100 45.106 0 18.975 0 2.299594
BYTES-CONSED 100 789266144 7884976 7903872 7887792 8024.648
EVAL-CALLS 100 0 0 0 0 0.0
##### Jonathan

HUH??? In both decoding this real world file and the next, Jonathan is orders of magnitude slower than the other libraries.

Table 65: jonathan Parsing a 1.2 MB Json File
SAMPLES TOTAL MINIMUM MAXIMUM MEDIAN DEVIATION
REAL-TIME 100 243.45543 2.393321 2.566654 2.429988 0.03203
RUN-TIME 100 244.48048 2.403443 2.575086 2.44082 0.032173
USER-RUN-TIME 100 214.78308 2.078124 2.291118 2.145707 0.034326
SYSTEM-RUN-TIME 100 29.697786 0.240362 0.368743 0.290953 0.031794
PAGE-FAULTS 100 0 0 0 0 0.0
GC-RUN-TIME 100 31995.117 306.53 400.945 316.783 13.0481
BYTES-CONSED 100 1569003877056 15689986304 15690142992 15690009472 48889.26
EVAL-CALLS 100 0 0 0 0 0.0
##### Json-lib
Table 66: json-lib Parsing a 1.2 MB Json File
SAMPLES TOTAL MINIMUM MAXIMUM MEDIAN DEVIATION
REAL-TIME 100 6.586635 0.059998 0.09 0.063333 0.005871
RUN-TIME 100 6.582603 0.061952 0.090146 0.063047 0.005646
USER-RUN-TIME 100 6.463969 0.055334 0.088421 0.063019 0.004434
SYSTEM-RUN-TIME 100 0.119001 0 0.019625 0 0.003254
PAGE-FAULTS 100 0 0 0 0 0.0
GC-RUN-TIME 100 204.444 0 25.049 0 4.62796
BYTES-CONSED 100 3588339072 35872896 35893952 35883440 2640.1985
EVAL-CALLS 100 0 0 0 0 0.0
##### Json-streams
Table 67: json-streams Parsing a 1.2 MB Json File
SAMPLES TOTAL MINIMUM MAXIMUM MEDIAN DEVIATION
REAL-TIME 100 9.456621 0.089999 0.123333 0.093333 0.003962
RUN-TIME 100 9.454452 0.092222 0.122918 0.093099 0.003993
USER-RUN-TIME 100 9.426854 0.086571 0.122918 0.093097 0.00398
SYSTEM-RUN-TIME 100 0.027914 0 0.014832 0 0.001649
PAGE-FAULTS 100 0 0 0 0 0.0
GC-RUN-TIME 100 78.426 0 9.447 0 2.269145
BYTES-CONSED 100 1879613408 18766656 20651648 18777280 186493.95
EVAL-CALLS 100 0 0 0 0 0.0
##### Jsown
Table 68: jsown Parsing a 1.2 MB Json File
SAMPLES TOTAL MINIMUM MAXIMUM MEDIAN DEVIATION
REAL-TIME 100 1.166661 0.009999 0.023333 0.01 0.002517
RUN-TIME 100 1.163073 0.011116 0.023445 0.011194 0.001859
USER-RUN-TIME 100 1.159894 0.011075 0.023436 0.011195 0.001822
SYSTEM-RUN-TIME 100 0.003364 0 0.003139 0 0.000312
PAGE-FAULTS 100 0 0 0 0 0.0
GC-RUN-TIME 100 12.289 0 4.551 0 0.701991
BYTES-CONSED 100 564075440 5630816 5648304 5647344 7997.883
EVAL-CALLS 100 0 0 0 0 0.0
##### Shasht
Table 69: shasht Parsing a 1.2 MB Json File
SAMPLES TOTAL MINIMUM MAXIMUM MEDIAN DEVIATION
REAL-TIME 100 2.27999 0.019999 0.05 0.023333 0.004339
RUN-TIME 100 2.282006 0.02094 0.050451 0.021578 0.004173
USER-RUN-TIME 100 2.232487 0.016563 0.043803 0.021496 0.00336
SYSTEM-RUN-TIME 100 0.04981 0 0.006699 0 0.001575
PAGE-FAULTS 100 0 0 0 0 0.0
GC-RUN-TIME 100 111.285 0 28.047 0 4.036195
BYTES-CONSED 100 1597571088 15962992 15984800 15971744 8653.811
EVAL-CALLS 100 0 0 0 0 0.0
##### St-json
 SAMPLES TOTAL MINIMUM MAXIMUM MEDIAN DEVIATION
REAL-TIME 100 3.00332 0.026666 0.036667 0.03 0.001105
RUN-TIME 100 3.001743 0.02964 0.035216 0.029852 0.000717
USER-RUN-TIME 100 2.995316 0.027776 0.033852 0.029826 0.000568
SYSTEM-RUN-TIME 100 0.006622 0 0.003205 0 0.000435
PAGE-FAULTS 100 0 0 0 0 0.0
GC-RUN-TIME 100 8.664 0 5.026 0 0.61437
BYTES-CONSED 100 429072000 4287680 4304800 4288464 5829.427
EVAL-CALLS 100 0 0 0 0 0.0
##### Yason
 SAMPLES TOTAL MINIMUM MAXIMUM MEDIAN DEVIATION
REAL-TIME 100 6.019973 0.056666 0.129999 0.059999 0.0077
RUN-TIME 100 6.013932 0.057522 0.130408 0.058553 0.007594
USER-RUN-TIME 100 5.954201 0.05142 0.120414 0.058432 0.006634
SYSTEM-RUN-TIME 100 0.060035 0 0.009996 0 0.001779
PAGE-FAULTS 100 0 0 0 0 0.0
GC-RUN-TIME 100 132.346 0 71.077 0 7.417082
BYTES-CONSED 100 1454553056 14519840 15860304 14537152 132363.33
EVAL-CALLS 100 0 0 0 0 0.0

#### 9.8 MB File

##### Boost Json
 SAMPLES TOTAL MINIMUM MAXIMUM MEDIAN DEVIATION
REAL-TIME 100 20.533205 0.176664 0.296664 0.193332 0.030789
RUN-TIME 100 20.536613 0.177694 0.297648 0.19184 0.03093
USER-RUN-TIME 100 19.744255 0.171358 0.271028 0.17994 0.024436
SYSTEM-RUN-TIME 100 0.792737 0 0.043307 0.000011 0.011387
PAGE-FAULTS 100 0 0 0 0 0.0
GC-RUN-TIME 100 2342.315 0 117.524 0 29.684881
BYTES-CONSED 100 8123005392 81224960 81230800 81230272 621.9398
EVAL-CALLS 100 0 0 0 0 0.0
##### Cl-json
 SAMPLES TOTAL MINIMUM MAXIMUM MEDIAN DEVIATION
REAL-TIME 100 50.209743 0.449999 0.576664 0.506664 0.031353
RUN-TIME 100 50.207718 0.449199 0.575751 0.505171 0.031618
USER-RUN-TIME 100 47.966446 0.434959 0.532878 0.471581 0.022254
SYSTEM-RUN-TIME 100 2.241677 0 0.066838 0.016578 0.019715
PAGE-FAULTS 100 0 0 0 0 0.0
GC-RUN-TIME 100 6780.997 23.482 136.011 70.721 26.335197
BYTES-CONSED 100 28504799776 285044336 285064480 285047984 1734.8457
EVAL-CALLS 100 0 0 0 0 0.0
##### Com.gigamonkeys.json
 SAMPLES TOTAL MINIMUM MAXIMUM MEDIAN DEVIATION
REAL-TIME 100 35.636482 0.279999 0.476665 0.353332 0.037821
RUN-TIME 100 35.659466 0.281294 0.475776 0.354051 0.037583
USER-RUN-TIME 100 33.728333 0.263166 0.4192 0.343631 0.028073
SYSTEM-RUN-TIME 100 1.931565 0 0.060133 0.016573 0.02011
PAGE-FAULTS 100 0 0 0 0 0.0
GC-RUN-TIME 100 7478.528 0 203.77 77.85 37.92367
BYTES-CONSED 100 16456969856 164567360 164575008 164569936 820.5605
EVAL-CALLS 100 0 0 0 0 0.0
##### Com.inuoe.jzon
 SAMPLES TOTAL MINIMUM MAXIMUM MEDIAN DEVIATION
REAL-TIME 100 20.266563 0.199998 0.229998 0.203332 0.003742
RUN-TIME 100 20.279003 0.199857 0.230551 0.201254 0.003799
USER-RUN-TIME 100 20.205465 0.189838 0.21385 0.201131 0.002979
SYSTEM-RUN-TIME 100 0.073949 0 0.019984 0 0.003129
PAGE-FAULTS 100 0 0 0 0 0.0
GC-RUN-TIME 100 129.135 0 26.551 0 2.92862
BYTES-CONSED 100 5946667616 59464000 59466832 59466768 373.24005
EVAL-CALLS 100 0 0 0 0 0.0
##### Jonathan
 SAMPLES TOTAL MINIMUM MAXIMUM MEDIAN DEVIATION
REAL-TIME 100 1908.7975 18.729918 19.469904 19.086578 0.115386
RUN-TIME 100 1913.2931 18.778744 19.512901 19.131992 0.114834
USER-RUN-TIME 100 1478.3938 14.512775 15.216936 14.779124 0.107808
SYSTEM-RUN-TIME 100 434.89975 4.165258 4.500475 4.347707 0.064543
PAGE-FAULTS 100 0 0 0 0 0.0
GC-RUN-TIME 100 228871.9 2238.792 2364.713 2284.703 28.799946
BYTES-CONSED 100 8950991326592 89509910464 89509985824 89509912576 7298.351
EVAL-CALLS 100 0 0 0 0 0.0
##### Json-lib
 SAMPLES TOTAL MINIMUM MAXIMUM MEDIAN DEVIATION
REAL-TIME 100 71.676285 0.673329 0.786663 0.71333 0.028969
RUN-TIME 100 71.70299 0.668354 0.789122 0.713246 0.029252
USER-RUN-TIME 100 68.34757 0.65903 0.743143 0.679933 0.016493
SYSTEM-RUN-TIME 100 3.355867 0.002895 0.103614 0.026859 0.023092
PAGE-FAULTS 100 0 0 0 0 0.0
GC-RUN-TIME 100 8757.628 63.089 150.257 75.73 22.03588
BYTES-CONSED 100 37769586576 377680784 377697696 377696928 2368.0583
EVAL-CALLS 100 0 0 0 0 0.0
##### Json-streams
 SAMPLES TOTAL MINIMUM MAXIMUM MEDIAN DEVIATION
REAL-TIME 100 84.45293 0.779997 0.943329 0.833329 0.039585
RUN-TIME 100 84.37682 0.779965 0.944611 0.83044 0.039825
USER-RUN-TIME 100 81.8803 0.767179 0.911267 0.808016 0.026314
SYSTEM-RUN-TIME 100 2.496926 0 0.099895 0.016629 0.025798
PAGE-FAULTS 100 0 0 0 0 0.0
GC-RUN-TIME 100 7375.233 26.304 178.178 63.66 32.3792
BYTES-CONSED 100 24939253568 249389696 249401168 249392656 984.2861
EVAL-CALLS 100 0 0 0 0 0.0
##### Jsown
 SAMPLES TOTAL MINIMUM MAXIMUM MEDIAN DEVIATION
REAL-TIME 100 5.916637 0.043332 0.123332 0.046667 0.021926
RUN-TIME 100 5.923252 0.046274 0.121666 0.046463 0.02184
USER-RUN-TIME 100 5.487612 0.030406 0.108614 0.046446 0.017627
SYSTEM-RUN-TIME 100 0.436004 0 0.049775 0 0.008856
PAGE-FAULTS 100 0 0 0 0 0.0
GC-RUN-TIME 100 1151.569 0 71.668 0 21.207699
BYTES-CONSED 100 4919788560 49191344 49213856 49197856 1753.3816
EVAL-CALLS 100 0 0 0 0 0.0
##### Shasht
 SAMPLES TOTAL MINIMUM MAXIMUM MEDIAN DEVIATION
REAL-TIME 100 19.533234 0.189999 0.216666 0.193333 0.004643
RUN-TIME 100 19.535679 0.189569 0.217319 0.193956 0.004622
USER-RUN-TIME 100 19.506966 0.18178 0.217326 0.193623 0.004849
SYSTEM-RUN-TIME 100 0.02914 0 0.012187 0.001376
PAGE-FAULTS 100 0 0 0 0 0.0
GC-RUN-TIME 100 290.639 0 18.918 2.986 1.857263
BYTES-CONSED 100 16092328544 160919648 160923776 160923616 861.9954
EVAL-CALLS 100 0 0 0 0 0.0
##### St-json
 SAMPLES TOTAL MINIMUM MAXIMUM MEDIAN DEVIATION
REAL-TIME 100 22.603218 0.199999 0.306665 0.213332 0.028007
RUN-TIME 100 22.606417 0.200335 0.307404 0.212905 0.028388
USER-RUN-TIME 100 21.968456 0.191959 0.284119 0.205496 0.023547
SYSTEM-RUN-TIME 100 0.638355 0 0.046407 0.000023 0.009977
PAGE-FAULTS 100 0 0 0 0 0.0
GC-RUN-TIME 100 2180.432 0 107.748 0 28.770987
BYTES-CONSED 100 7806199792 78059632 78065344 78062176 572.11426
EVAL-CALLS 100 0 0 0 0 0.0
##### Yason
 SAMPLES TOTAL MINIMUM MAXIMUM MEDIAN DEVIATION
REAL-TIME 100 68.23965 0.666663 0.709996 0.679997 0.009673
RUN-TIME 100 68.24949 0.665376 0.707559 0.682328 0.009683
USER-RUN-TIME 100 68.23221 0.65959 0.707563 0.681629 0.009905
SYSTEM-RUN-TIME 100 0.017674 0 0.011396 0 0.000177
PAGE-FAULTS 100 0 0 0 0 0.0
GC-RUN-TIME 100 250.379 0 3.624 3.037 1.261508
BYTES-CONSED 100 14313634192 143132624 143136768 143136656 731.3266
EVAL-CALLS 100 0 0 0 0 0.0
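
Tables in this format can be reproduced with the trivial-benchmark library, which reports exactly these metrics (REAL-TIME, RUN-TIME, PAGE-FAULTS, GC-RUN-TIME, BYTES-CONSED, EVAL-CALLS). A minimal sketch follows; the file path is a placeholder, shasht stands in for whichever parser you want to measure, and the precise set of columns varies by trivial-benchmark version (newer releases also print an AVERAGE column):

```lisp
;; Sketch: time 100 parses of a JSON file with trivial-benchmark.
;; The path below is a placeholder; substitute any library's parse
;; function for shasht:read-json to benchmark it instead.
(ql:quickload '(:trivial-benchmark :shasht))

(let ((payload (uiop:read-file-string #p"/path/to/sample.json")))
  ;; Prints a table of SAMPLES, TOTAL, MINIMUM, MAXIMUM, MEDIAN and
  ;; DEVIATION for each metric trivial-benchmark tracks.
  (benchmark:with-timing (100)
    (shasht:read-json payload)))
```

Reading the whole file into a string first keeps disk I/O out of the measured region, so the numbers reflect parsing cost rather than file-system behavior.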