Review of CL JSON Libraries UPDATED 15 May 2023
Changelog
15 May 2023
Picks up regressions in com.inuoe.jzon. It has dropped its heuristics for handling lists. This can change the resulting data type and, as a side effect, it no longer handles dotted lists. The latter is likely to be corrected in the future.
12 Apr 2023
com.inuoe.jzon (aka jzon) is now in quicklisp. Both jzon and shasht have had active development. The versions used are the github repositories as of 11 Apr 2023. Benchmarking has been updated.
A summary of changes in jzon:
- incremental parsing and writing are now implemented
- The issue with empty object on pathname has been 'fixed' - we now coerce to namestring
- Float<->JSON encoding/decoding have both been substantially improved in terms of speed, accuracy, and portability across lisp implementations (bundled float encoder/decoder)
- Parsing is now safe on the stack regardless of object nesting setting
- Parsing performance improvements across the board for a variety of file sizes/varieties
- Coerce strings to simple-base-string when possible for 1/4 memory usage on SBCL (ASCII vs UTF32 encoding) with minimal overhead.
- New :allow-trailing-comma, and block comment support for :allow-comments in order to allow for parsing jwcc data (https://nigeltao.github.io/blog/2021/json-with-commas-comments.html)
- Much more detailed errors across the board when a parse error occurs
Changes in shasht include adding read-level and read-length limits to address potential security concerns. (These are set to nil by default).
1 Feb 2022
Picks up updates in shasht and now includes warnings on libraries with safety 0 (jsown, jonathan, st-json).
19 Jan 2022
Updated shasht for fixes in character, 2D arrays, and nil handling. Fixed some typos, including a missing earmuff.
18 Jan 2022
Corrected JSON conformity tests on boost-json, cl-json, json-lib, shasht, st-json, trivial-json-codec and yason. If *read-default-float-format* is set to 'single-float, those libraries would refuse to accept "[123e65]", "[123e45]" and "[123.456e78]". If *read-default-float-format* is set to 'double-float, then those JSON strings would be correctly accepted. Net result: with that caveat, boost-json, cl-json, com.gigamonkeys.json, com.inuoe.jzon, json-lib, shasht, st-json and yason all score 95/95 on the JSON strings that must be accepted. See Standard Conformity - Must Accept.
15 Jan 2022
Complete rewrite of 1st edition.
Introduction
The Common Lisp (CL) landscape with respect to JSON libraries has changed since the first edition of this review eight years ago. While I sometimes complain about someone writing new libraries when there are already so many, there have been major improvements and a couple of the new generation of CL JSON libraries are actually quite exciting. It still remains the case, however, that your choice of data structures for your application may drive your choice of JSON library, and vice versa. Like everything else in life, there are trade-offs to be made and I hope this paper helps you think about what may or may not be relevant for your situation.
Corrections to this report are welcomed. Please submit issues to https://github.com/sabracrolleton/sabracrolleton.github.io or pull requests against the json-view.org file.
Common Lisp Encoding and Decoding Libraries
CL currently has at least twelve libraries that address importing and exporting JSON data. The libraries considered in this report are listed in the table below. (For purposes of this comparison, I will refer to "encoding" as converting from CL to JSON and "decoding" or "parsing" as converting from JSON to CL.)
Library | Author | License | Website | Quicklisp? | Updated |
---|---|---|---|---|---|
boost-json | Jeffrey Massung | Apache | homepage | No | 16 Dec 2021 |
cl-json | Henrik Hjelte, Boris Smilga, Robert Goldman | MIT | homepage | Yes | 7 Nov 2014 |
com.gigamonkeys.json | Peter Seibel | BSD-3 | homepage | Yes | 15 Apr 2017 |
com.inuoe.jzon | Wilfredo Velázquez-Rodríguez | MIT | homepage | Yes | 20 Mar 2023 |
jonathan | Rudolph Miller | MIT | homepage | Yes | 1 Sep 2020 |
json-lib | Alex Nygren | MIT | homepage | Yes | 22 Oct 2022 |
json-streams | Thomas Bakketun, Stian Sletner | GPL3 | homepage | Yes | 12 Oct 2017 |
jsown | Aad Versteden | MIT | homepage | Yes | 4 Feb 2020 |
shasht | Tarn W. Burton | MIT | homepage | Yes | 23 Feb 2023 |
st-json | Marijn Haverbeke | zlib | homepage | Yes | 28 Jun 2021 |
trivial-json-codec (2) | Eric Diethelm | MIT | homepage | Yes | 26 Apr 2022 |
yason (1) | Hans Huebner | BSD | homepage | Yes | 2 Feb 2023 |
- (1) IMPORTANT: Notice the GitHub location has moved. Hans Huebner's old GitHub location will automatically redirect to Phil Marek's.
- (2) trivial-json-codec is more targeted at decoding from JSON to CL than at serializing to JSON. Its main purpose is deserializing JSON data into CLOS hierarchical objects.
Library | Dependencies |
---|---|
boost-json | |
cl-json | |
com.gigamonkeys.json | |
com.inuoe.jzon | closer-mop, flexi-streams, trivial-gray-streams, uiop and float-features (except when using ECL) |
jonathan | cl-syntax, cl-syntax-annot, fast-io, trivial-types, babel, proc-parse, cl-ppcre, cl-annot |
json-lib | alexandria, str, parse-float, cl-fad, babel |
json-streams | |
jsown | |
shasht | trivial-do, closer-mop |
st-json | |
trivial-json-codec | trivial-utilities, log4cl, closer-mop, iterate, parse-number |
yason | alexandria, trivial-gray-streams |
Helper Libraries
We should also talk a bit about some helper libraries.
Library | Author | License | Website | Works With | Comments |
---|---|---|---|---|---|
define-json-expander | Johan Sjölén | MIT | homepage | cl-json | CLOS<->JSON |
herodotus | Henry Steere | BSD (1) | homepage | yason | CLOS<->JSON |
json-mop | Grim Schjetne | MIT | homepage | yason | CLOS<->JSON |
cl-json-helper | Bob Felts | BSD | homepage | cl-json | |
cl-json-schema | Mark Skilbeck | MIT | homepage | ||
cl-json-pointer | Yokata Yuki | MIT | homepage | cl-json, st-json, yason, jsown, jonathan, json-streams, com.gigamonkeys.json (1) | 40ants comments |
- (1) asd file says BSD, but license included is MIT.
Quick Summary
As can be expected, the libraries do much the same if you have basic needs. However, differences exist and should be considered in choosing which library is right for any particular project. Many applications are asymmetric in how they will use these libraries, so consider the strengths and weaknesses given the needs of your case. Library links are to the libraries' individual sections below.
- Overall: I quite like the newcomers shasht and com.inuoe.jzon. cl-json and yason are still the work horses if you need fine control, but speed is not their forte.
- Decoding or Parsing Speed: Speed is not everything, but it seems to be important to a lot of readers. If you are parsing compliant data and are just looking for speed, look at jsown, com.inuoe.jzon or shasht. Jonathan is faster on tiny strings but starts to slow down with nested objects and eventually, depending on the size of the JSON object, becomes orders of magnitude slower than the other libraries. The effect appeared sooner under SBCL, but as data file size increased it was also obvious under CCL and ECL. See read-times.
- Encoding Speed: If you mostly need to encode lisp data to JSON and are just looking for speed, I would look at com.inuoe.jzon, shasht or depending on your data, jonathan or st-json. See write-times
- Safety 0: Three libraries (jsown, jonathan and st-json) have optimize set to safety 0 to increase speed. This has bitten some users hard in the past, so you have been warned. If you restrict safety to 2 or 3 on SBCL, Jonathan is unable to parse floats. See https://github.com/rudolph-miller/jonathan/issues/66
- Unicode Handling: Most of the libraries can handle unicode. However, if you have unicode surrogate pairs, you need to choose from com.inuoe.jzon, jonathan, json-streams, jsown, shasht, st-json or yason. See Decoding Unicode.
- Extracting Subsets: If you want to extract a subset of JSON data without paying attention to everything else, jsown is the choice. Jonathan allows you to get a key:value pair, but I could only use it one level deep. Obviously every library can import all the JSON data into some CL datastructure and then use standard CL functionality to get whatever subset you want.
- Handling all proper JSON forms: If you remember to set *read-default-float-format* to 'double-float, there is no clear winner. boost-json, cl-json, com.gigamonkeys.json, com.inuoe.jzon, json-lib, shasht, st-json and yason all score 95/95 on the JSON standard conformity tests that must be met. If *read-default-float-format* is set to 'single-float, only com.inuoe.jzon and com.gigamonkeys.json have perfect scores. See Must Accept. (A short example of binding *read-default-float-format* follows this summary.)
- Handling potentially invalid data: See Standard Conformity and Dealing with Malformed Data. I think com.inuoe.jzon is the winner.
- Proper Distinction Between NULL/Nil/False: Com.inuoe.jzon, shasht and st-json get it right out of the box. jonathan and yason provide the ability to get there by setting a simple variable.
- CLOS Abilities - Decoding: Trivial-json-codec can decode objects to standard pre-defined classes. Boost-json can decode JSON objects to a boost-json:json-object, which is a standard-object. Cl-json has the ability to decode JSON objects into a "fluid-class" CLOS object. (Note, this is thread unsafe if you have not already prepped classes for every expected JSON object. If every incoming JSON object has a "lispClass" member and you define those classes yourself, this can be avoided.) Personally I prefer being able to define my own classes; trivial-json-codec then just dumps the JSON data into your classes. If you use yason, I would also use the helper library json-mop. (com.inuoe.jzon decodes JSON to hash-tables, so it does not decode to CLOS objects out of the box.) See Decoding to CLOS.
- CLOS Abilities - Encoding: On the encoding side, cl-json, com.inuoe.jzon, shasht and trivial-json-codec are able to encode CLOS objects to JSON out of the box. The rest of the libraries require that you write your own methods for your CLOS classes - which is not necessarily that difficult, but not something you get out of the box. If you like yason, there are two helper libraries: json-mop or herodotus. I have a preference for json-mop with the caveat that, at the moment, you cannot redefine classes. There is also a cl-json helper library, define-json-expander, which defines classes which can be used by cl-json to help move data back and forth. See Encoding Objects.
- Encoding different lisp types: shasht and com.inuoe.jzon were the best at encoding different lisp data types. For example, they were the only libraries able to handle structs without the user having to define a new method. There is a trade-off between handling multiple lisp types and symmetry. If you do not have a 1:1 matching but rather a many:1 matching, there may be complications doing a round trip between JSON and CL and getting exactly the same data type or structure you started with. See Encoding or Encoding Data Structures.
- Incremental Encoding: cl-json, com.inuoe.jzon, jonathan, json-streams, jsown, shasht, st-json and yason all have the capability to do incremental encoding. You can see small examples in their individual sections at the links in the previous sentence. I found shasht a little easier. See Incremental Encoding Discussion for more detail.
- Security issues: Security issues are always a potential problem if you are getting data from uncontrolled sources. Malware disguised in the data will likely be either properly formed JSON strings that try to overload the system by exploiting libraries that intern object keys as keyword symbols, or improperly formed JSON strings that exhaust the stack. See the Security discussion for more information. If you face data from uncontrolled sources, I would look first at com.inuoe.jzon, then maybe at shasht. I would suggest that all libraries put some type of limit on the parsing depth of JSON objects or arrays. At this point only com.inuoe.jzon, json-lib, json-streams and trivial-json-codec have such limits, all of which are configurable. See security.
- Symmetry or "Round Tripping": Symmetric treatment is important to some users. In this area, libraries tend to have issues with NULL/Nil/False as well as whether keys should be symbols or strings. Symmetry is easier going from JSON to CL and back. It is definitely harder if you go from CL to JSON and back to CL and you do not limit your CL data types. (CL has more data types, so you do not have a 1:1 match.) Your choices are to limit your CL data types or to use libraries which allow you to precisely specify the CL datatype you want at that point. Overall, I think the symmetry winner is shasht, but your fact pattern may drive a different answer. See symmetry discussion.
At the end of the day, your particular data and requirements will determine which library is best for any particular application. Webapps may have tiny bits of JSON going back and forth while other uses will be asymmetric - gigabytes of JSON getting imported and little or occasional amounts getting exported or vice versa. You may have different needs depending on whether the JSON encoded data is strictly controlled or it is coming in from unknown sources. In one test, yason:encode choked on a list which included a keyword :NULL in an unexpected location. Cl-json just encoded it as the string "null" and st-json encoded it as 'null' (not a string). In testing for your use, if you get JSON data from uncontrolled sources, deliberately feed badly formed data and see how the library reacts. Some will throw recoverable conditions (depending on the error) while others may actually lock up a thread.
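As flagged in the summary above, the float-format caveat is easy to work around by binding the standard variable around the call to the parser. A minimal, unverified sketch (shown here with yason; the same binding helps the other libraries listed above):
(let ((*read-default-float-format* 'double-float))
  (yason:parse "[123.456e78]"))
;; expected: a one-element list containing a double-float, instead of a
;; floating-point overflow under the 'single-float default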
Decoding/Reading/Parsing JSON Data to Lisp
The following table shows the basic decoding function for a library and then specialist functions. Each will have more detail in the section specific to that library.
Library | Base Function | Specialist Functions |
---|---|---|
boost-json | json-decode | json-read |
cl-json | decode-json | decode-json-from-source, decode-json-from-string, decode-json-strict |
com.gigamonkeys.json | parse-json | |
com.inuoe.jzon | parse | |
jonathan | parse | |
json-lib | parse | |
json-streams | json-parse | json-read |
jsown | parse | |
shasht | read-json | read-json* |
st-json | read-json | read-json-as-type, read-json-from-string |
trivial-json-codec | deserialize-raw | deserialize-json |
yason (1) | parse | parse-json-arrays-as-vectors, parse-json-booleans-as-symbols, parse-json-null-as-keyword, parse-object-as, parse-object-as-alist, parse-object-key-fn, symbol-key-encoder |
- (1) yason:parse takes keyword parameters :object-key-fn, :object-as, :json-arrays-as-vectors, :json-booleans-as-symbols and :json-nulls-as-keyword
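To make footnote (1) concrete, here is a minimal sketch of passing two of those keyword parameters to yason:parse; the expected result is my reading of the tables below, not re-verified output:
(yason:parse "[1, null, [2]]" :json-arrays-as-vectors t :json-nulls-as-keyword t)
;; expected: #(1 :NULL #(2))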
Decoding Streams or Strings
Does the library take both strings and streams as input?
Library | Strings | Streams |
---|---|---|
boost-json | YES | YES (1) |
cl-json | YES | YES |
com.gigamonkeys.json | YES | NO |
com.inuoe.jzon | YES | YES |
jonathan | YES | NO |
json-lib | UTF-8 Encoded Only | NO |
json-streams | YES | YES |
jsown | YES | NO |
shasht | YES | YES |
st-json | YES | YES |
trivial-json-codec | YES | NO |
yason | YES | YES |
- (1) Use boost-json:json-read instead of boost-json:json-decode
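As a quick illustration of the string/stream distinction, a sketch using shasht, which accepts both; the other stream-capable libraries work analogously:
(shasht:read-json "{\"a\": 1}")              ;; parse directly from a string
(with-input-from-string (in "{\"a\": 1}")
  (shasht:read-json in))                     ;; parse from any character input stream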
Mapping Data Types and Structures from JSON to CL
JSON has a limited number of data types. You will get different lisp data-types from the decoding depending on the library.
Library | True/False/ Null | Number | Array | JSON Object (3)(6) |
---|---|---|---|---|
Original JSON | "true"/ "false"/ "null" | "12.3" | "[1,2]" | "{\"a\": 2}" |
boost-json | T / NIL/ NIL | 12.3 | (1 2) | #<JSON-OBJECT {"a":2}> |
cl-json (1) | T / NIL / NIL | 12.3 | (1 2) | ((A . 2)) |
com.gigamonkeys.json | TRUE / FALSE / NULL | 12.3d0 | #(1 2) | (a 2) |
com.inuoe.jzon (3) | T / NIL / NULL | 12.3d0 | #(1 2) | #<HASH-TABLE :TEST EQUAL :COUNT 1> |
jonathan | T / NIL / NIL | 12.3 | (1 2) | (a 2) |
json-lib (3) | T / NIL/ NIL | 12.3 | #(1 2) | #<HASH-TABLE :TEST EQUAL :COUNT 1> |
json-streams | T / NIL/ NULL | 12.3d0 | (ARRAY 1 2) | (OBJECT (a . 2)) |
jsown (5) | T / NIL / NIL | 123/10 (5) | (1 2) | (OBJ (a . 2)) |
shasht (3)(6) | T / NIL / NULL | 12.3 | #(1 2) | #<HASH-TABLE :TEST EQUAL :COUNT 1> (6) |
st-json | TRUE / FALSE / NULL | 12.3 | (1 2) | #S(JSO :ALIST ((a . 2))) |
trivial-json-codec (2) | "true"/ "false"/ "null" | 12.3 | #(1 2) | ((:A 2)) |
yason (3)(4)(6) | T / NIL/ NIL/:NULL | 12.3 | (1 2)/#(1 2) | #<HASH-TABLE :TEST EQUAL :COUNT 1> (6) |
- (1) This is cl-json's default mode. Using cl-json:with-decoder-simple-clos-semantics or cl-json:set-decoder-simple-clos-semantics will switch cl-json into a mode where JSON arrays are decoded to cl vectors rather than lists, and JSON objects are decoded to CLOS objects rather than alists.
- (2) trivial-json-codec handles the bare strings false and null differently from how it handles them when they appear as values inside an object rather than as standalone strings. See true-false-null-mapping.
- (3) All four libraries that decode a JSON object to a CL hash use strings as hash keys by default
- (4) yason:parse has a keyword parameter :json-nulls-as-keyword and a keyword parameter :json-arrays-as-vectors.
- (5) Parsing JSON value strings which are not embedded in a JSON object or array will trigger errors. For example, (jsown:parse "12.3") or (jsown:parse "alpha") will trigger an sb-kernel:bounding-indices-bad error in SBCL. Floats in an array or object will be converted to a ratio: (jsown:parse "[123e-1, 15.2]") => (123/10 76/5)
- (6) While shasht and yason default to parsing JSON objects as hash-tables, they can optionally be parsed as alists or plists.
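A short sketch of footnote (6) using yason's :object-as parameter (shasht has its own switches for the same purpose); the results shown are what the tables in this report lead me to expect:
(yason:parse "{\"a\": 2}")                    ;; default: a hash-table with string keys
(yason:parse "{\"a\": 2}" :object-as :alist)  ;; expected: (("a" . 2))
(yason:parse "{\"a\": 2}" :object-as :plist)  ;; expected: ("a" 2)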
The following tables set out some additional detail on mapping from JSON data structures to lisp data structures using the normal functions listed above with some comments on the results.
Unicode
Consider a JSON string with unicode characters. The JSON data string is the first row in each table.
Library | Function | Result | Comment |
---|---|---|---|
Original | ["\u004C","明彦","\u2604"] | ||
boost-json | json-decode | (L 明彦 ☄) | |
cl-json | json-decode-from-string | (L 明彦 ☄) | |
com.gigamonkeys.json | parse-json | #(L 明彦 ☄) | |
com.inuoe.jzon | parse | #(L 明彦 ☄) | |
json-lib | parse | #( 明彦 ) | (1) |
json-streams | json-parse | (ARRAY L 明彦 ☄) | |
shasht | read-json | #(L 明彦 ☄) | |
st-json | read-json-from-string | (L 明彦 ☄) | |
trivial-json-codec | deserialize-raw | #(\u004C 明彦 \u2604) | (2) |
yason | parse | (L 明彦 ☄) |
- (1) Did not handle the unicode char codes
- (2) Repeated the unicode char codes
In the following table, we show the results of attempting to decode unicode with surrogate pairs:
Library | Function | Result |
---|---|---|
Original | "\uD83D\uDE02\uD83D\uDE02" | |
boost-json | json-decode | ���� |
cl-json | json-decode-from-string | ���� |
com.gigamonkeys.json | parse-json | ���� |
com.inuoe.jzon | parse | 😂😂 |
jonathan | parse | 😂😂 |
json-lib | parse | ��� |
json-streams | json-parse | 😂😂 |
jsown | parse | 😂😂 |
shasht | read-json | 😂😂 |
st-json | read-json-from-string | 😂😂 |
trivial-json-codec | deserialize-raw | \uD83D\uDE02\uD83D\uDE02 |
yason | parse | 😂😂 |
Conclusion: If you may have surrogate pairs in your JSON data, you might want to stick to com.inuoe.jzon, jonathan, json-streams, jsown, shasht, st-json or yason.
Number Mapping
Library | Result | Comment |
---|---|---|
Original JSON String | "{\"integer\": 32,\"float\": 34.89}" | |
boost-json | #<JSON-OBJECT {"integer":32,"float":34.89}> | |
cl-json | ((INTEGER . 32) (FLOAT . 34.89)) | |
com.gigamonkeys.json | (integer 32 float 34.89d0) | |
com.inuoe.jzon | ((float . 34.89d0) (integer . 32)) | |
json-lib | ((float . 34.89) (integer . 32)) | |
jonathan | ((float . 34.89) (integer . 32)) | |
json-streams | (OBJECT (integer . 32) (float . 34.89d0)) | |
jsown | (OBJ (integer . 32) (float . 3489/100)) | Ratio |
shasht | ((float . 34.89) (integer . 32)) | |
st-json | #S(JSO :ALIST ((integer . 32) (float . 34.89))) | |
trivial-json-codec | ((:INTEGER 32) (:FLOAT 34.89)) | |
yason | ((float . 34.89) (integer . 32)) |
True, False, Null, Empty Array Mapping
Contrary to some people's belief that boolean logic encompasses everything, there is a meaningful difference between "false" and "unknown". Null ≠ nil. For that matter, I subscribe to the belief that (not true) is not the same as the empty set. JSON's null and empty arrays can track the differences between false, null and an empty array. When translating back and forth between CL and JSON, it is important to be able to keep those distinctions. Thank you, com.gigamonkeys.json, com.inuoe.jzon, shasht and st-json for getting them correct right out of the box, and thank you to jonathan and yason for giving me the ability to get there by setting a variable.
Library | Result | Comment |
---|---|---|
Original JSON String | "{\"1\": true,\"2\": false, \"3\": null}" | |
boost-json (1) | #<JSON-OBJECT {"1":true,"2":null,"3":null}> | failed false |
cl-json | ((1 . T) (2) (3)) | failed null |
com.gigamonkeys.json | (1 TRUE 2 FALSE 3 NULL) | |
com.inuoe.jzon | ((3 . NULL) (2 . nil) (1 . T)) | |
jonathan | ((:3 . nil) (:2 . nil) (:1 . T)) | failed null (but see next row) |
jonathan (4) | ((:3 . :null) (:2 . nil) (:1 . T)) | |
json-lib | ((3 . nil) (2 . nil) (1 . T)) | failed null |
json-streams | (OBJECT (1 . T) (2) (3 . NULL)) | failed false (2) |
jsown | (OBJ (1 . T) (2) (3)) | failed null |
shasht | ((3 . NULL) (2 . nil) (1 . T)) | |
st-json | #S(JSO :ALIST ((1 . TRUE) (2 . FALSE) (3 . NULL))) | |
trivial-json-codec | ((:|1| T) (:|2| NIL) (:|3| NIL)) | failed false and null |
yason (3) | (("3") ("2") ("1" . T)) | failed null (but see next row) |
yason (4) | (("3" . :NULL) ("2") ("1" . T)) |
- (1) The boost-json author notes in his README that "it is not possible to distinguish between false, null, or []. And, I have personally never found this to be problematic." Obviously your experience may vary. Mine is that there are meaningful differences between "false" and "unknown". I think his position is stronger if the question is solely between nil and [].
- (2) When reading a JSON object with nil and null, json-streams provides no value for the cons cell which represents the JSON value false. I would have expected the cons cell to have a value of nil.
- (3) When called with :object-as :plist, the nil is explicit: ("1" T "2" NIL "3" nil)
- (4) Jonathan called with jonathan:*null-value* set to :null. Yason called with yason:*parse-json-null-as-keyword* set to t or passing the keyword parameter :json-nulls-as-keyword t to yason:parse
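A minimal sketch of the settings described in footnote (4); the expected values mirror the tables above and are not re-verified:
(let ((jonathan:*null-value* :null))
  (jonathan:parse "[true,false,null]"))       ;; expected: (T NIL :NULL)
(yason:parse "[true,false,null]" :json-nulls-as-keyword t)
;; expected: (T NIL :NULL)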
The following table shows how the various libraries try to decode a JSON object with an empty array as the value. Com.gigamonkeys, com.inuoe.jzon, json-lib and shasht are the clearest representation of the original JSON.
Library | Function | Result |
---|---|---|
Original JSON string | {"a": []} | |
boost-json | json-decode | #<JSON-OBJECT {"a":null}> |
cl-json | json-decode-from-string | ((A)) |
com.gigamonkeys.json | parse-json | (a #()) |
com.inuoe.jzon | parse | ((a . #())) |
jonathan | parse | (a NIL) |
json-lib | parse | ((a . #())) |
json-streams | json-parse | (OBJECT (a ARRAY)) |
jsown | parse | (OBJ (a)) |
shasht | read-json | ((a . #())) |
st-json | read-json-from-string | #S(JSO :ALIST ((a))) |
trivial-json-codec | deserialize-raw | ((A NIL)) |
yason | parse | ((a)) |
Now looking at all four types within a JSON array:
Library | Result | Comment |
---|---|---|
Original | [true,false,null, []] | |
boost-json | (T NIL NIL NIL) | failed null |
cl-json | (T NIL NIL NIL) | failed null |
com.gigamonkeys.json | #(TRUE FALSE NULL #()) | |
com.inuoe.jzon | #(T NIL NULL #()) | |
jonathan | (T NIL NIL NIL) | failed null but see next row |
jonathan (1) | (T NIL NIL :NULL) | |
json-lib | #(T NIL NIL #()) | failed null |
json-streams | (ARRAY T NIL NULL (ARRAY)) | correct this time compared to decoding object |
jsown | (T NIL NIL NIL) | failed null |
shasht | #(T NIL NULL #()) | |
st-json | (TRUE FALSE NULL NIL) | |
trivial-json-codec | #(T NIL NIL NIL) | failed null |
yason | (T NIL NIL NIL) | failed null (but see next row) |
yason (1) | (T NIL NIL :null) | |
- (1) Jonathan called with jonathan:*null-value* set to :null. Yason called with yason:*parse-json-null-as-keyword* set to t or passing the keyword parameter :json-nulls-as-keyword t to yason:parse
Decoding JSON Arrays
Library | Result | Comment |
---|---|---|
Original | ["Skoda", "Peugeot", "SEAT"] | |
boost-json | (Skoda Peugeot SEAT) | list |
cl-json | (Skoda Peugeot SEAT) | list |
com.gigamonkeys.json | #(Skoda Peugeot SEAT) | vector |
com.inuoe.jzon | #(Skoda Peugeot SEAT) | vector |
jonathan | (Skoda Peugeot SEAT) | list |
json-lib | #(Skoda Peugeot SEAT) | vector |
json-streams | (:ARRAY Skoda Peugeot SEAT) | cons |
jsown | (Skoda Peugeot SEAT) | list |
shasht | #(Skoda Peugeot SEAT) | vector |
st-json | (Skoda Peugeot SEAT) | list |
trivial-json-codec | #("Skoda" "Peugeot" "SEAT") | vector |
yason | (Skoda Peugeot SEAT) | list |
yason (1) | #("Skoda" "Peugeot" "SEAT") | vector |
- (1) Yason called with additional keyword parameter :json-arrays-as-vectors t
Decoding JSON Objects
For the JSON object examples, we will be using the following simple JSON string:
(defparameter *address-1* "{ \"name\": \"George Washington\", \"birthday\": \"February 22, 1732\", \"address\": \"Mount Vernon, Virginia, United States\" }")
Library | Result | Comment |
---|---|---|
Original | {"name":"George Washington","birthday":"February 22, 1732","address":"Mount Vernon, Virginia, United States"} | |
boost-json | #<JSON-OBJECT {"name":"George Washington","birthday":"February 22, 1732","address":"Mount Vernon, Virginia, United States"}> | |
cl-json | ((NAME . George Washington) (BIRTHDAY . February 22, 1732) (ADDRESS . Mount Vernon, Virginia, United States)) | |
com.gigamonkeys.json | (name George Washington birthday February 22, 1732 address Mount Vernon, Virginia, United States) | |
com.inuoe.jzon | #<HASH-TABLE :TEST EQUAL :COUNT 3 {101F71B1E3}> | (1) |
jonathan | ((address . Mount Vernon, Virginia, United States) (birthday . February 22, 1732) (name . George Washington)) | |
json-lib | #<HASH-TABLE :TEST EQUAL :COUNT 3 {101F71CC63}> | (1) |
json-streams | (OBJECT (name . George Washington) (birthday . February 22, 1732) (address . Mount Vernon, Virginia, United States)) | |
jsown | (OBJ (name . George Washington) (birthday . February 22, 1732) (address . Mount Vernon, Virginia, United States)) | |
shasht | #<HASH-TABLE :TEST EQUAL :COUNT 3 {101F71FA83}> | (1) |
st-json | #S(JSO :ALIST ((name . George Washington) (birthday . February 22, 1732) (address . Mount Vernon, Virginia, United States))) | |
trivial-json-codec | ((NAME George Washington) (BIRTHDAY February 22, 1732) (ADDRESS Mount Vernon, Virginia, United States)) | |
yason | #<HASH-TABLE :TEST EQUAL :COUNT 3 {101F722863}> | (1) |
yason-alist | ((address . Mount Vernon, Virginia, United States) (birthday . February 22, 1732) (name . George Washington)) | |
yason-plist | (name George Washington birthday February 22, 1732 address Mount Vernon, Virginia, United States) |
- (1) All four libraries that decode a JSON object to a CL hash use strings as hash keys by default. All four will allow you to optionally use keywords instead. Shasht and yason allow you to decode a JSON object to alists or plists instead of hashes.
Decoding Nested JSON Objects
For nested object examples, we will be using the following simple JSON object:
(defparameter *nested-address-1* "{ \"first_name\": \"George\", \"last_name\": \"Washington\", \"birthday\": \"1732-02-22\", \"address\": { \"street_address\": \"3200 Mount Vernon Memorial Highway\", \"city\": \"Mount Vernon\", \"state\": \"Virginia\", \"country\": \"United States\" } }")
Library | Result | Comment |
---|---|---|
Original | { "first_name": "George", "last_name": "Washington", "birthday": "1732-02-22", "address": { "street_address": "3200 Mount Vernon Memorial Highway", "city": "Mount Vernon", "state": "Virginia", "country": "United States" }} | |
boost-json | #<JSON-OBJECT {"first_name":"George","last_name":"Washington", "birthday":"1732-02-22","address":#}> | Nested address info is enclosed in a nested json-object. See boost-json decoding nested objects to clos |
cl-json | ((FIRST--NAME . George) (LAST--NAME . Washington) (BIRTHDAY . 1732-02-22) (ADDRESS (STREET--ADDRESS . 3200 Mount Vernon Memorial Highway) (CITY . Mount Vernon) (STATE . Virginia) (COUNTRY . United States))) | All info provided |
com.gigamonkeys.json | (first_name George last_name Washington birthday 1732-02-22 address (street_address 3200 Mount Vernon Memorial Highway city Mount Vernon state Virginia country United States)) | All info but nested plists provided |
com.inuoe.jzon | ((address . #<HASH-TABLE :TEST EQUAL :COUNT 4 {10092D4753}>) (birthday . 1732-02-22) (last_name . Washington) (first_name . George)) | all info available, but shows the need to recursively apply different functions to pull out the nested hashes created by com.inuoe.jzon |
jonathan | ((address (country . United States) (state . Virginia) (city . Mount Vernon) (street_address . 3200 Mount Vernon Memorial Highway)) (birthday . 1732-02-22) (last_name . Washington) (first_name . George)) | All info provided |
json-lib | ((address . #<HASH-TABLE :TEST EQUAL :COUNT 4 {10092DA153}>) (birthday . 1732-02-22) (last_name . Washington) (first_name . George)) | all info available, but shows the need to recursively apply different functions to pull out the nested hashes created by json-lib |
json-streams | (OBJECT (first_name . George) (last_name . Washington) (birthday . 1732-02-22) (address OBJECT (street_address . 3200 Mount Vernon Memorial Highway) (city . Mount Vernon) (state . Virginia) (country . United States))) | All info provided |
jsown | (OBJ (first_name . George) (last_name . Washington) (birthday . 1732-02-22) (address OBJ (street_address . 3200 Mount Vernon Memorial Highway) (city . Mount Vernon) (state . Virginia) (country . United States))) | All info provided in nested jsown objects |
shasht | ((address . #<HASH-TABLE :TEST EQUAL :COUNT 4 {10092E2203}>) (birthday . 1732-02-22) (last_name . Washington) (first_name . George)) | all info available, but shows the need to recursively apply different functions to pull out the nested hashes created by shasht |
st-json | #S(JSO :ALIST ((first_name . George) (last_name . Washington) (birthday . 1732-02-22) (address . #S(JSO :ALIST ((street_address . 3200 Mount Vernon Memorial Highway) (city . Mount Vernon) (state . Virginia) (country . United States)))))) | All info provided in nested st-json objects |
trivial-json-codec | ((:FIRST_NAME "George") (:LAST_NAME "Washington") (:BIRTHDAY "1732-02-22") (:ADDRESS ((:STREET_ADDRESS "3200 Mount Vernon Memorial Highway") (:CITY "Mount Vernon") (:STATE "Virginia") (:COUNTRY "United States")))) | |
yason | ((address . #<HASH-TABLE :TEST EQUAL :COUNT 4 {10092E6003}>) (birthday . 1732-02-22) (last_name . Washington) (first_name . George)) | all info available, but shows the need to recursively apply different functions to pull out the nested hashes created by yason |
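For the hash-table-returning libraries, "recursively applying different functions" in practice means chaining gethash calls. A sketch with com.inuoe.jzon against *nested-address-1*:
(let ((parsed (com.inuoe.jzon:parse *nested-address-1*)))
  (gethash "city" (gethash "address" parsed)))
;; expected: "Mount Vernon"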
Converting JSON data to a CLOS object
Which libraries have some built-in ability to convert a JSON object to a CLOS object?
Library | Function | Built-in CLOS? |
---|---|---|
boost-json | json-decode | YES (1) |
cl-json | decode-json-from-string | YES (2) |
com.gigamonkeys.json | | NO |
com.inuoe.jzon | | NO |
jonathan | | NO |
json-lib | | NO |
json-streams | | NO |
jsown | | NO |
shasht | | NO |
st-json | | NO (3) |
trivial-json-codec | deserialize-json | YES (4) |
yason | | NO (5) |
- (1) boost-json creates a boost-json:json-object which is a standard-object. Accessing the slots is done with boost-json:json-getf and boost-json:json-setf functions. See Boost-json decoding to CLOS
- (2) cl-json can create a cl-json:fluid-class which is a standard-object. Accessing the slots is done with the name of the slot. See cl-json-data-to-clos.
- (3) The st-json "object" jso is a struct, not a CLOS object.
- (4) You need to define your classes first
- (5) Possible with helper libraries json-mop or herodotus.
For example, consider two address JSON objects:
*address-1* "{ \"name\": \"George Washington\", \"birthday\": \"February 22, 1732\", \"address\": \"Mount Vernon, Virginia, United States\" }"
*nested-address-1* "{ \"first_name\": \"George\", \"last_name\": \"Washington\", \"birthday\": \"1732-02-22\", \"address\": { \"street_address\": \"3200 Mount Vernon Memorial Highway\", \"city\": \"Mount Vernon\", \"state\": \"Virginia\", \"country\": \"United States\" } }"
Obviously you can always write your own function to initialize a specific class from an alist, but we do have two libraries (boost-json and cl-json) that try to do something like this for you automagically. Let's take a look.
Boost-json
Boost-json will decode the JSON nested object to a CLOS object class called boost-json:json-object.
(boost-json:json-decode *nested-address-1*)
#<BOOST-JSON:JSON-OBJECT {"first_name":"George","last_name":"Washington","birthday":"1732-02-22","address":#}>
It appears that you need to call (boost-json:json-getf obj key), which acts as the accessor. E.g.
(let ((data (boost-json:json-decode *nested-address-1*)))
  (setf (boost-json:json-getf data "first_name") "Michael")
  data)
#<BOOST-JSON:JSON-OBJECT {"first_name":"Michael","last_name":"Washington","birthday":"1732-02-22","address":#}>
Cl-json
cl-json can decode and convert them to a cl-json:fluid-class CLOS object. You do need to at least temporarily change the decoder to use simple-clos-semantics and set *json-symbols-package* to nil. (Note, this is thread unsafe unless you have pre-created the classes. Per the documentation: "To maintain the mapping between lists of superclass names and fluid classes, the decoder maintains a class registry. Thus, using fluid objects makes the CLOS decoder essentially thread-unsafe. (If every incoming JSON Object is guaranteed to have a prototype with a "lispClass" member then there are no fluid objects and thread safety is ensured.) If the user wishes to employ fluid objects in a threaded environment it is advisable to wrap the body of entry-point functions in with-local-class-registry.")
First, looking at the simpler version, notice you need to specify the slots in the fluid-class object:
(cl-json:with-decoder-simple-clos-semantics
  (setf cl-json:*json-symbols-package* nil)
  (let ((x (cl-json:decode-json-from-string *address-1*)))
    (with-slots (name birthday address) x
      (values x name birthday address))))
#<#<JSON:FLUID-CLASS COMMON-LISP:NIL {100F765713}> {100FB22513}>
"George Washington"
"February 22, 1732"
"Mount Vernon, Virginia, United States"
Now looking at the nested version, we need to note that by default cl-json will convert the underscores in the JSON keys to double hyphens in the slot names.
(cl-json:with-decoder-simple-clos-semantics
  (setf cl-json:*json-symbols-package* nil)
  (let ((x (cl-json:decode-json-from-string *nested-address-1*)))
    (with-slots (first--name last--name birthday address) x
      (values x first--name last--name birthday address))))
#<#<JSON:FLUID-CLASS COMMON-LISP:NIL {100F765713}> {10107CF9E3}>
"George"
"Washington"
"1732-02-22"
#<#<JSON:FLUID-CLASS COMMON-LISP:NIL {100F765713}> {10107CF6F3}>
Because we have a nested class, we would need to drill down and specify the slots for the sub-object as well:
(cl-json:with-decoder-simple-clos-semantics
  (setf cl-json:*json-symbols-package* nil)
  (let ((x (cl-json:decode-json-from-string *nested-address-1*)))
    (with-slots (first--name last--name birthday address) x
      (with-slots (street--address city state country) address
        (values x first--name last--name birthday address city)))))
#<#<JSON:FLUID-CLASS COMMON-LISP:NIL {100F765713}> {1010E69B93}>
"George"
"Washington"
"1732-02-22"
#<#<JSON:FLUID-CLASS COMMON-LISP:NIL {100F765713}> {1010E698A3}>
"Mount Vernon"
Trivial-json-codec
If you have defined your classes, trivial-json-codec does make it easy to decode JSON data directly to a vector of your classes. Suppose we have a simple person class:
(defclass person ()
  ((name :initarg :name :initform "Sabra" :accessor name)
   (eye-colour :initarg :eye-colour :initform "brown" :accessor eye-colour)))
If we have a vector of JSON objects which are all data for a person class, we can automatically build a vector of persons by specifying the class we want to use:
(let ((data (trivial-json-codec:deserialize-json
             "[{\"name\":\"Claudia\",\"eye-colour\":\"blue\"}, {\"name\":\"Johann\",\"eye-colour\":\"brown\"}]"
             :class (find-class 'person))))
  (name (aref data 1)))
"Johann"
There is no magic if the JSON array has objects of different types.
Extracting a Subset of a JSON object
Suppose you just want a subset of data from a JSON file. Obviously every library allows you to parse the entire JSON file and then use standard lisp functions to pull out the subset you want. Only a few libraries will extract the desired subset directly.
Library | Result | Comment |
---|---|---|
boost-json | NO | |
cl-json | NO | But see cl-json-helper's json-key-value function |
com.gigamonkeys.json | NO | |
com.inuoe.jzon | NO | |
jonathan | YES | First level of data only. Also remember jonathan only reads strings, not streams |
json-lib | NO | |
json-streams | NO | |
jsown | YES | But remember jsown only reads strings, not streams |
shasht | NO | |
st-json | NO | |
yason | NO | |
Determining Object Keywords
Jsown has the ability, once a JSON object has been parsed into a jsown object, to get the keywords of the object. The following only gets the first level keywords.
(jsown:keywords (jsown:parse *nested-address-1*))
("first_name" "last_name" "birthday" "address")
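jsown can also restrict parsing to the keys you care about and pull individual values back out. A hedged sketch; the output shapes follow jsown's :OBJ convention shown earlier and are not re-verified:
(jsown:parse *nested-address-1* "first_name" "birthday")
;; expected: (:OBJ ("first_name" . "George") ("birthday" . "1732-02-22"))
(jsown:val (jsown:parse *nested-address-1*) "first_name")
;; expected: "George"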
Handling NIL
This is a test of how the library function for reading from a string handles nil. Errors are not surprising.
Library | Function | Result | Comment |
---|---|---|---|
boost-json | json-decode | Error | |
cl-json | json-decode-from-string | Error | |
com.gigamonkeys.json | parse-json | NIL | |
com.inuoe.jzon | parse | Error | |
jonathan | parse | Error | |
json-lib | parse | "" | |
json-streams | json-parse | Error | |
jsown | parse | NIL | |
shasht | read-json | Error | |
st-json | read-json-from-string | Error |
yason | parse | Error |
Encoding Lisp Data to JSON
Library | Base Function | Specialist Functions |
---|---|---|
boost-json | json-encode | |
cl-json | encode-json | encode-json-to-string, encode-json-alist, encode-json-alist-to-string, encode-json-plist, encode-json-plist-to-string, encode-object-member, encode-array-member |
com.gigamonkeys.json | to-json, write-json | |
com.inuoe.jzon | stringify | |
jonathan | to-json | |
json-lib | stringify | |
json-streams | json-stringify | |
jsown | to-json | to-json* |
shasht | write-json | write-json-string, write-json* |
st-json | write-json | write-json-to-string, write-json-element |
trivial-json-codec (1) | serialize | serialize-json |
yason | encode | encode-alist, encode-plist, encode-object, encode-slots, encode-object-element, encode-array-element, encode-array-elements |
- (1) As previously noted, trivial-json-codec is really intended as a parser (one way) from JSON to CL, not really serializing to JSON.
Encoding to Streams or Strings
Does the library take both strings and streams as output?
Library | String Function | Stream Function |
---|---|---|
boost-json | json-encode | json-encode |
cl-json | encode-json-to-string | encode-json |
com.gigamonkeys.json | to-json | write-json |
com.inuoe.jzon | stringify | stringify |
jonathan | to-json | with-output … |
json-lib | stringify | No |
json-streams | json-stringify | with-json-output |
jsown | to-json | No |
shasht | write-json | write-json |
st-json | write-json-to-string | write-json |
trivial-json-codec | serialize or serialize-json | serialize |
yason | encode | encode |
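For the stream-capable encoders, any character output stream will do; a minimal sketch with yason and shasht (the file name here is purely illustrative):
(with-output-to-string (out)
  (yason:encode (list 1 2 3) out))            ;; expected: "[1,2,3]"
(with-open-file (out "example.json" :direction :output :if-exists :supersede)
  (shasht:write-json #("a" "b" "c") out))     ;; write straight to a file stream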
Encoding Symbols, Chars, T, nil and :null
Several libraries have an issue encoding symbols. Typically these can be resolved if you write a method for handling symbols. Of course, if you have symbols in your lisp data, you might want to look at libraries which handle them without any additional effort on your part.
As mentioned when discussing mapping from JSON to CL, there is a problem with null in that it represents "unknown", not false. CL does not have that as a data type, which is unfortunate. You cannot determine, absent additional data, whether CL's NIL means [], False or "unknown". As a result, you really have to pay attention when serializing to and from JSON as to what you want nil to represent and whether it is unacceptably overloaded.
The following table uses 'a as the symbol input and #\C as a character input. :null was used as a proxy for null.
Library | Function | Symbol | Char | T | Nil/:NULL |
---|---|---|---|---|---|
boost-json | json-encode | "\"A\"" | Error (3) | "true" | "null"/"NULL" |
cl-json | encode-json | "a" | "C" | true | null/"null" (4) |
com.gigamonkeys.json | write-json | Error (1) | Hangs (3) | true | {}/null |
com.inuoe.jzon | stringify | "\"A\"" | [] (3) | "true" | "false"/"NULL" |
jonathan | to-json | "\"A\"" | Error (3) | "true" | "[]"/"null" |
json-lib | stringify | "null" (2) | "null" (3) | "true" | "null"/"null" |
json-streams | json-stringify | Error (3) | Error (3) | "true" | "false"/"null" |
jsown | to-json | "\"A\"" | Error (3) | "true" | "[]"/"null" |
shasht | write-json | "A" | "C" | "true" | false/null |
st-json | write-json | Error (3) | Error (3) | "true" | []/null |
trivial-json-codec | serialize-json | "A" | Error (3) | "true" | "null"/":NULL" |
yason | encode | Error (5) | Error (3) | true | null/null |
- (1) Keywords Only
- (2) returns "null"
- (3) You could write a method to handle this type. For jonathan, see jonathan-encoding. For st-json see st-json-encoding.
- (4) You can use the helper library cl-json-helper to encode nil as "false".
- (5) Using yason to encode a symbol would require the use of yason:encode-symbol-as-lowercase rather than just the simple yason:encode
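As footnote (3) suggests, the usual fix is to add a method on the library's encoding generic. A sketch for st-json and characters (this method is my own addition, not part of st-json):
(defmethod st-json:write-json-element ((element character) stream)
  ;; reuse st-json's existing string handling
  (st-json:write-json-element (string element) stream))
(st-json:write-json-to-string #\C)            ;; expected: "\"C\""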
Encoding Numbers
There are no surprises in encoding integers. Floats are generally fine, depending on your view of rounding and the use of exponents. Ratios get slightly more interesting. Json-streams and trivial-json-codec were the only libraries to refuse to encode a ratio. Json-lib wrote it as the string "null". The rest converted it to some form of decimal number. The following table uses 3.675 as the sample float and 9/4 as the sample ratio.
Library | Function | Float | Ratios |
---|---|---|---|
Original Numbers | | 3.675 | 9/4 |
boost-json | json-encode | 3.675 | "2.25" |
cl-json | encode-json | 3.675 | 2.25 |
com.gigamonkeys.json | write-json | 3.674999952316284 | 2.25 |
com.inuoe.jzon | stringify | 3.675 | "2.25" |
jonathan | to-json | 3.675 | "2.25" |
json-lib (1) | stringify | 3.675 | "null" |
json-streams | json-stringify | 3.675 | Error (2) |
jsown | to-json | 3.675 | "2.25" |
shasht | write-json | 3.675e+0 | 2.25e+0 |
st-json | write-json | 0.3675e+1 | 0.225e+1 |
trivial-json-codec | serialize-json | 3.675 | Error (2) |
yason | encode | 3.674999952316284 | 2.25 |
- (1) As of the time of writing, json-lib only handles integers and floats, not ratios.
- (2) Error: JSON write error: Number must be integer or float, got 9/4.
Encoding Pathnames
Just to see what the libraries did with pathnames, using #P"/home/sabra" as sample data. Out of the box, only boost-json, com.inuoe.jzon and shasht turned it into a string. cl-json, jonathan, jsown, st-json, trivial-json-codec and yason indicate that you could write a method to handle the pathname datatype; a sketch of one such method follows the table.
Library | Function | Result |
---|---|---|
boost-json | json-encode | "/home/sabra" |
cl-json | encode-json | Not of a type which can be encoded by encode-json |
com.gigamonkeys.json | write-json | hangs |
com.inuoe.jzon | stringify | "/home/sabra" |
jonathan | to-json | No applicable method |
json-lib | stringify | null |
json-streams | json-stringify | Fell through etypecase expression |
jsown | to-json | No applicable method |
shasht | write-json | "home/sabra" |
st-json | write-json | Cannot write object of type Pathname to json |
trivial-json-codec | serialize-json | No applicable method |
yason | encode | No applicable method |
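The "write a method" option mentioned above looks roughly like this for yason and pathnames (a sketch of my own, not something yason provides):
(defmethod yason:encode ((object pathname) &optional (stream *standard-output*))
  ;; encode a pathname as its namestring
  (yason:encode (namestring object) stream))
(with-output-to-string (out)
  (yason:encode #P"/home/sabra" out))         ;; expected: "\"/home/sabra\""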
Encoding Local-Time Timestamps
Again, just to see what the libraries did with timestamps, using (local-time:now) as sample data. This is actually just a preview of what happens with CLOS objects. Several libraries handled the timestamp by serializing its internal slots as a JSON object, but none produced anything resembling a date. As you can see, cl-json, com.inuoe.jzon, shasht and trivial-json-codec returned an object. Boost-json, jonathan, jsown, st-json and yason all indicate you could write a method to handle the timestamp.
Library | Function | Result |
---|---|---|
boost-json | json-encode | There is no applicable method for the generic function |
cl-json | encode-json | {"day":7990,"sec":48161,"nsec":580943000} |
com.gigamonkeys.json | write-json | hangs |
com.inuoe.jzon | stringify | {"day":7990,"sec":48161,"nsec":580943000} |
jonathan | to-json | There is no applicable method for the generic function |
json-lib | stringify | null |
json-streams | json-stringify | @2022-01-15T08:22:41.580943-05:00 fell through ETYPECASE expression. |
jsown | to-json | There is no applicable method for the generic function |
shasht | write-json | { "DAY": 7990, "SEC": 48161, "NSEC": 580943000} |
st-json | write-json | Can not write object of type TIMESTAMP as JSON. |
trivial-json-codec | serialize-json | { "DAY" : 7990, "SEC" : 48161, "NSEC" : 580943000} |
yason | encode | There is no applicable method for the generic function |
Encoding Data Structures to JSON (Summary)
The following table is just a quick summary of library functionality for arrays, hashtables, CLOS objects and structs. More detail is provided in the specific sections for each of those categories of data structures.
Library | Function | Vectors | Hash-table | Object | Struct |
---|---|---|---|---|---|
boost-json | json-encode | (6) | (1) | (4) | (4) |
cl-json | encode-json-to-string | YES | YES | YES | (4) |
com.gigamonkeys.json | write-json | YES | (2) | hangs | hangs |
com.inuoe.jzon | stringify | YES | YES | YES (5) | YES |
jonathan | to-json (8) | YES | YES | (4) | (4) |
json-lib | stringify | YES | YES | "null" | NO |
json-streams | json-stringify | (3) | (3) | (3) | (3) |
jsown | to-json | YES (10) | YES | (4) | (4) |
shasht | write-json | YES (10) | YES | YES | YES |
st-json | write-json | (7) | (1) | (4) | (4) |
trivial-json-codec | serialize-json | YES | (4) | YES | (4) |
yason | encode-* (9) | YES | (1) (4) | (4) | (4) |
- (1) Succeeds if the hash-table keys are strings, fails if they are symbols
- (2) Invalid results if given a list inside the hash-table or errors if the hash-table keys are symbols
- (3) The basic json-stringify function does not handle data structures, so you need to resort to more complex calls. See json-streams-encoding-hash-tables or json-streams-encoding-arrays for examples.
- (4) May be able to resolve if you write a specialized method.
- (5) Automatically handles standard CLOS objects and also allows you to specialize
- (6) Invalid results. Sample output on a simple nested array looked like: [,"Cork","Limerick"][,[,"Frankfurt","Munich"]]
- (7) You need to write your own st-json::write-json-element function for arrays. See st-json-encoding.
- (8) Typically for these objects jonathan requires extra keyword parameters like :from :alist
- (9) May require using one of the more specialized functions such as encode-alist etc.
- (10) jsown and shasht are the only libraries which can handle multi-dimensional arrays.
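To illustrate the struct and hash-table columns above, a quick sketch with com.inuoe.jzon; the exact key casing in the output is my expectation, not verified:
(defstruct point x y)
(com.inuoe.jzon:stringify (make-point :x 1 :y 2))
;; expected: something like "{\"x\":1,\"y\":2}"
(let ((table (make-hash-table :test 'equal)))
  (setf (gethash "a" table) 1)
  (com.inuoe.jzon:stringify table))           ;; expected: "{\"a\":1}"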
Encoding Lists (Summary)
By default, all libraries except json-streams:json-stringify (which does not accept lists) and trivial-json-codec (which generates invalid JSON) will return a JSON array when provided a plain CL list. Some libraries will not accept non-keyword symbols in a list (or will need an additional parameter). If you need to keep the key-value connections of a plist, you may need to convert the plist to another form or use a more specific function. More detail can be found below: plain-lists, encoding alists, encoding plists.
In general, plain lists and plists are returned as JSON arrays unless some other keyword parameter is provided. If the library handles alists, they may be returned as JSON objects enclosing arrays, arrays enclosing arrays, or other variations. See encoding-alists for more detail. Please see the footnotes for whether libraries have issues with symbols (including keyword symbols).
Library | Function | lists | alists | plists |
---|---|---|---|---|
boost-json | json-encode | YES | (7) | YES |
cl-json | encode-json | YES | YES | (8) |
cl-json | encode-json-alist | YES | YES | (11) |
cl-json | encode-json-plist | YES | (9) | YES |
com.gigamonkeys.json | write-json | (2)(12) | (1) | (2) |
com.inuoe.jzon | stringify | YES | YES (15) | YES |
jonathan | to-json | YES | YES (13) | YES |
json-lib | stringify | (4) | (1) | YES |
json-streams | json-stringify | (6) | (6) | (6) |
jsown | to-json | YES | (1) (10) | (3) |
shasht | write-json | YES | YES | YES |
st-json | write-json | YES | YES (10) | (3) |
trivial-json-codec | serialize-json | (14) | (14) | (14) |
yason | encode | (5) | (5) | (5) |
yason | encode-plist | (12) | YES | YES |
yason | encode-alist | (5) | YES | (3) |
- (1) com.gigamonkeys.json, json-lib cannot deal with alists directly. Consider using alexandria:alist-hash-table to convert the alist to a hash table.
- (2) Symbols are allowed only if they are keyword symbols, otherwise com.gigamonkeys.json will error.
- (3) Plists are treated the same as plain lists and will lose their key-value connections. Convert to a hash table first.
- (4) Symbols are allowed only if they are keyword symbols. Json-lib will convert a non-keyword symbol to null.
- (5) If it is a plain list or plist, yason:encode and yason:encode-alist will not accept symbols in the list. If it is an alist which has symbols, it will accept them as keys if you run (setf yason:*symbol-key-encoder* 'yason:ENCODE-SYMBOL-AS-LOWERCASE) first. Otherwise it errors.
- (6) Json-streams:json-stringify does not accept lists as input. You would need to use lower level components of json-streams. See json-streams-encoding.
- (7) Fail - the JSON arrays are invalid. They have ',.' or ',' rather than ':' depending on whether they are dotted cons cells or not.
- (8) As you might expect, plists are treated the same as plain lists and will lose their key-value connections. If you want to keep the key-value connections, you can either convert the list to an alist or hash-table or use the cl-json:encode-json-plist function.
- (9) This function is plist specific and will error if provided an alist.
- (10) If the alist has dotted cons cells, jsown and st-json triggered unhandled memory faults with SBCL 2.1.11-x86-64-linux, CCL version 1.12.1 LinuxX8664 and ecl-version 21.2.1. This appears to be because jsown and st-json have optimized the code and assumed that all lists will be proper lists. The assumption obviously fails in the context of dotted cons cells.
- (11) The *-alist functions require an alist. Shocking, I know.
- (12) com.gigamonkeys.json will assume a plain list is a plist, returning a JSON object with key-value pairs. If the length of the list is odd, the final value in the list will be treated as a key and an empty set will be inserted as the value.
- (13) For jonathan to properly handle alists, you need to add the additional keyword parameters :from :alist
- (14) Trivial-json-codec is going to give us invalid JSON from now on with respect to tests involving lists - e.g. '(a b c) becomes "<A,B,C>", so we will drop it from the rest of the encoding tests involving lists.
- (15) Recent changes in jzon mean that it currently does not accept dotted lists.
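Footnote (8) above mentions cl-json's plist-specific encoder; a minimal sketch of its string-returning variant (output is my expectation under cl-json's default camel-case key translation):
(cl-json:encode-json-plist-to-string '(:a 1 :b 2))
;; expected: "{\"a\":1,\"b\":2}"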
Encoding Plain Lists
As noted above, if you had symbols in a list, libraries like json-streams and st-json are going to error out because they need additional methods, while com.gigamonkeys.json accepts only keyword symbols. In the below table, I included trivial-json-codec's results without the error that it generates on ratio numbers. Notice that gigamonkeys is interpreting the list as a plist and trying to create key:value pairs.
Using '("A" "b" 4 3.2 9/4) as the sample data
Library | Function | Result |
---|---|---|
Original data | '("A" "b" 4 3.2 9/4) | |
boost-json | json-encode | ["A","b",4,3.2,2.25] |
cl-json | encode-json | ["A","b",4,3.2,2.25] |
com.gigamonkeys.json | write-json | {"A":"b","4":3.200000047683716,"2.25":{}} |
com.inuoe.jzon | stringify | ["A","b",4,3.2,2.25] |
jonathan | to-json | "[\"A\",\"b\",4,3.2,2.25]" |
json-lib | stringify | "[\"A\", \"b\", 4, 3.2, 9/4]" |
json-streams | json-stringify | Fail on list, (1) |
jsown | to-json | "[\"A\",\"b\",4,3.2,2.25]" |
shasht | write-json | ["A","b",4,3.2e+0,2.25e+0] |
st-json | write-json | ["A","b",4,0.32e+1,0.225e+1] |
trivial-json-codec (2) | serialize-json | "<\"A\",\"b\",4,3.2>" |
yason | encode | ["A","b",4,3.200000047683716,2.25] |
- (1) See json-streams-encoding.
- (2) I have no idea why trivial-json-codec wants to generate angle brackets.
Encoding Alists
I discovered that many libraries have problems if the alist components are dotted pairs (improper lists). So we break alists down between alists with dotted pairs and alists without dotted pairs. We will keep symbols out of the sample data because we already know from encoding-symbols that some libraries will have issues with alists containing symbols as keys. Of note:
- cl-json provides a function to encode the alist properly as key:value,
- com.inuoe.jzon has dropped its heuristics and currently does not handle dotted pairs
- shasht:*write-alist-as-object* provides the ability to write an undotted alist as a JSON object rather than an array.
- yason provides a specific function to encode the alist properly as key:value,
- If the alist has dotted cons cells, jsown and st-json triggered unhandled memory faults with SBCL 2.1.11-x86-64-linux, CCL version 1.12.1 LinuxX8664 and ecl-version 21.2.1. This appears to be because jsown and st-json have optimized the code and assumed that all lists will be proper lists. The assumption obviously fails in the context of dotted cons cells.
Dotted cons cells (dotted pairs)
Using '(("A" . 1) ("B" . 2) ("C" . 3)) as the sample data
Library | Function | Result |
---|---|---|
Original data | '(("A" . 1) ("B" . 2) ("C" . 3)) | |
boost-json (1) | json-encode | [["A",. 1],["B",. 2],["C",. 3]] |
cl-json | encode-json | {"A":1,"B":2,"C":3} |
cl-json | encode-json-alist | {"A":1,"B":2,"C":3} |
com.gigamonkeys.json | write-json | Error:Can't stringify (A . 1) |
com.inuoe.jzon (6) | stringify | Error: value 1 is not of type list |
jonathan | to-json | Error: value 1 is not of type list |
jonathan | to-json XXX :from :alist | "{\"A\":1,\"B\":2,\"C\":3}" |
json-lib (2) | stringify | Error value 1 is not of type list |
json-streams | json-stringify | Error (5) |
jsown (4) | to-json | Unhandled memory fault |
shasht (3) | write-json | { "A": 1, "B": 2, "C": 3} |
st-json (4) | write-json | Unhandled memory fault |
yason | encode-alist | {"A":1,"B":2,"C":3} |
- (1) Fail - the JSON arrays are invalid. They have ',.' rather than ','
- (2) You may be able to write a new method to handle dotted.
- (3) Only if (setf shasht:*write-alist-as-object* t), which is not the default. Otherwise it generates an error.
- (4) If the alist has dotted cons cells, jsown and st-json triggered unhandled memory faults with SBCL 2.1.11-x86-64-linux, CCL version 1.12.1 LinuxX8664 and ecl-version 21.2.1. This appears to be because jsown and st-json have optimized the code and assumed that all lists will be proper lists. The assumption obviously fails in the context of dotted cons cells.
- (5) You would need to use lower level components of json-streams. See json-streams-encoding.
- (6) This is a regression
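To make footnote (3) and the yason row concrete, a short sketch of encoding a dotted alist as a JSON object; the expected output follows the table above and is not re-verified:
(with-output-to-string (out)
  (let ((shasht:*write-alist-as-object* t))
    (shasht:write-json '(("A" . 1) ("B" . 2)) out)))
;; expected: a JSON object like { "A": 1, "B": 2}
(with-output-to-string (out)
  (yason:encode-alist '(("A" . 1) ("B" . 2)) out))
;; expected: {"A":1,"B":2}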
Now we change the data slightly, still using dotted pairs, but with a nested list as the value in one pair: Using '(("foo" . "bar") ("baz" . ((1 2 3) (4 5 6)))) as the sample data.
Library | Function | Result |
---|---|---|
Original data | '(("foo" . "bar") ("baz" . ((1 2 3) (4 5 6)))) | |
boost-json (1) | json-encode | [["foo",. "bar"],["baz",[1,2,3],[4,5,6]]] |
cl-json | encode-json | {"foo":"bar","baz":[[1,2,3],[4,5,6]]} |
cl-json | encode-json-alist | {"foo":"bar","baz":[[1,2,3],[4,5,6]]} |
com.gigamonkeys.json | write-json | Error:Can't stringify (foo . bar) |
com.inuoe.jzon (6) | stringify | Error: value "bar" is not a list |
jonathan | to-json | Error: value 1 is not of type list |
jonathan | to-json XXX :from :alist | {"foo":"bar","baz":{"1":[2,3],"4":[5,6]}} |
json-lib (2) | stringify | Error value "bar" is not of type list |
json-streams | json-stringify | Error (5) |
jsown (4) | to-json | Unhandled memory fault |
shasht (3) | write-json | Error: The value "bar" is not of type LIST |
shasht (3a) | write-json | {"foo":"bar","baz":{1:[2,3],4:[5,6]}} |
st-json (4) | write-json | Unhandled memory fault |
yason | encode | Error: "bar" is not of type list |
yason | encode-alist | {"foo":"bar","baz":[[1,2,3],[4,5,6]]} |
- (1) Fail - the JSON arrays are invalid. They have ',.' rather than ','
- (2) You may be able to write a new method to handle dotted pairs.
- (3) This error is generated if shasht:*write-alist-as-object* is nil (the default).
- (3a) If shasht:*write-alist-as-object* is t (not the default).
- (4) If the alist has dotted cons cells, jsown and st-json triggered unhandled memory faults with SBCL 2.1.11-x86-64-linux, CCL version 1.12.1 LinuxX8664 and ecl-version 21.2.1. This appears to be because jsown and st-json have optimized the code and assumed that all lists will be proper lists. The assumption obviously fails in the context of dotted cons cells.
- (5) You would need to use lower level components of json-streams. See json-streams-encoding.
- (6) This is a regression
Notice the difference in the successful results. cl-json and yason's result of {"foo":"bar","baz":[[1,2,3],[4,5,6]]} is what I would have expected. jonathan tried to force a key:value pair into the second alist value: {"foo":"bar","baz":{"1":[2,3],"4":[5,6]}}.
Without dotted cons cells
Using '(("A" 1) ("B" 2) ("C" 3)) as the sample data.
Note the difference in results. Some libraries return arrays, others return objects. Some libraries return key:[value] pairs, others return two-member arrays.
Library | Function | Result |
---|---|---|
Original data | '(("A" 1) ("B" 2) ("C" 3)) | |
boost-json | boost-json-write-to-string | [["A",1],["B",2],["C",3]] |
cl-json | encode-json | [["A",1],["B",2],["C",3]] |
cl-json | encode-json-alist | {"A":[1],"B":[2],"C":[3]} |
com.gigamonkeys.json | gigamonkeys-write-to-string | Can't stringify (A 1) |
com.inuoe.jzon (2) | stringify | [["A",1],["B",2],["C",3]] |
jonathan | jonathan-to-json-alist | {"A":[1],"B":[2],"C":[3]} |
json-lib | stringify | [["A",1],["B",2],["C",3]] |
json-streams | json-stringify | Error (1) |
jsown | to-json | [["A",1],["B",2],["C",3]] |
shasht | write-json | [["A",1],["B",2],["C",3]] |
st-json | write-json-to-string | [["A",1],["B",2],["C",3]] |
yason | encode | [["A",1],["B",2],["C",3]] |
yason | encode-alist | {"A":[1],"B":[2],"C":[3]} |
- (1) Using json-stringify caused the alist to fall through an etypecase expression demanding a json-streams:json-array. That means you have to fall back to more complicated calls. See json-streams-encoding.
- (2) Before recent changes, com.inuoe.jzon would have returned {"A":[1],"B":[2],"C":[3]}
Let's make the value portion of the alist another alist. Now using '(("foo" "bar") ("baz" ((1 2 3) (4 5 6)))) as the sample data. The results may or may not be what you want, so look at them carefully.
Library | Function | Result |
---|---|---|
Original data | '(("foo" "bar") ("baz" ((1 2 3) (4 5 6)))) | |
boost-json | json-encode | [["foo","bar"],["baz",[[1,2,3],[4,5,6]]]] |
cl-json | encode-json | [["foo","bar"],["baz",[[1,2,3],[4,5,6]]]] |
cl-json | encode-json-alist | [["foo","bar"],["baz",[[1,2,3],[4,5,6]]]] |
com.gigamonkeys.json | write-json | Can't stringify (foo bar) |
com.inuoe.jzon (2) | stringify | [["foo","bar"],["baz",[[1,2,3],[4,5,6]]]] |
jonathan | to-json XX :from :alist | {"foo":["bar"],"baz":[{"1":[2,3],"4":[5,6]}]} |
json-lib | stringify | [["foo", "bar"], ["baz",[[1,2,3],[4,5,6]]]] |
json-streams | json-stringify | Error (1) |
jsown | to-json | [["foo","bar"],["baz",[[1,2,3],[4,5,6]]]] |
shasht | write-json | [["foo","bar"],["baz",[[1,2,3],[4,5,6]]]] |
st-json | write-json | [["foo","bar"],["baz",[[1,2,3],[4,5,6]]]] |
yason | encode | [["foo","bar"],["baz",[[1,2,3],[4,5,6]]]] |
yason | encode-alist | {"foo":["bar"],"baz":[[[1,2,3],[4,5,6]]]} |
- (1) You would need to use lower level components of json-streams. See json-streams-encoding.
- (2) Before recent changes, com.inuoe.jzon would have returned {"foo":["bar"],"baz":[[[1,2,3],[4,5,6]]]}
yason:encode-alist and jonathan return JSON objects with key:value pairs; the rest try to return arrays. jonathan seems to be trying to force a key:value pair into the baz values, which is probably not what you want. yason's encode-alist produces a similar object but keeps the baz values as nested arrays. Let's change the data slightly and look at yason again.
Library | Function | Result |
---|---|---|
New data | '(("foo" "bar") ("baz" ((1 2 3) ("A" 5 6)))) | |
yason | encode | [["foo","bar"],["baz",[[1,2,3],["A",5,6]]]] |
yason | encode-alist | {"foo":["bar"],"baz":[[[1,2,3],["A",5,6]]]} |
New data | '(("foo" "bar") ("baz" (("B" 2 3) ("A" 5 6)))) | |
yason | encode | [["foo","bar"],["baz",[["B",2,3],["A",5,6]]]] |
yason | encode-alist | {"foo":["bar"],"baz":[[["B",2,3],["A",5,6]]]} |
Encoding Plists
- cl-json:encode-json returns an array; the specialized function cl-json:encode-json-plist encodes the plist properly as key:value pairs and returns an object.
- com.inuoe.jzon used to guess that this is a plist and encode it as key:value pairs, but after dropping its heuristics it now encodes it as an array (see note (3) below).
- With keyword symbols, jonathan returns an object, but otherwise returns an array.
- yason provides a function to encode the plist properly as key:value
Using '("a" 1 "b" 2 "c" 3) as the sample data
Library | Function | Result | Comment |
---|---|---|---|
Original data | '("a" 1 "b" 2 "c" 3) | ||
boost-json | json-encode | ["a",1,"b",2,"c",3] | (1) |
cl-json | encode-json | ["a",1,"b",2,"c",3] | (1) |
cl-json | encode-json-plist | {"a":1,"b":2,"c":3} | |
com.gigamonkeys.json | write-json | {"a":1,"b":2,"c":3} | |
com.inuoe.jzon | stringify | ["a",1,"b",2,"c",3] | (3) |
jonathan | to-json | ["a",1,"b",2,"c",3] | (1) |
json-lib | stringify | ["a", 1, "b", 2, "c", 3] | |
json-streams | json-stringify | Fail: (2) | |
jsown | to-json | ["a",1,"b",2,"c",3] | (1) |
shasht | write-json | ["a",1,"b",2,"c",3] | (1) |
st-json | write-json | ["a",1,"b",2,"c",3] | (1) |
yason | encode | ["a",1,"b",2,"c",3] | (1) |
yason | encode-plist | {"a":1,"b":2,"c":3} |
- (1) Encoding to array but loses the obvious key-value connection.
- (2) You would need to use lower level components of json-streams. See json-streams-encoding.
- (3) Before recent changes, jzon would have returned {"a":1,"b":2,"c":3}
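For reference, the plist-specific calls look roughly like this (a minimal sketch; the result strings are the ones shown in the table above):
(cl-json:encode-json-plist-to-string '("a" 1 "b" 2 "c" 3))
;; => "{\"a\":1,\"b\":2,\"c\":3}"
(with-output-to-string (s)
  (yason:encode-plist '("a" 1 "b" 2 "c" 3) s))
;; => "{\"a\":1,\"b\":2,\"c\":3}"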
Encoding Arrays
Encoding Single Dimensional Vector
Boost-json loses the first value in the vector. Json-streams and st-json do not handle the vector. All the other libraries pass, although some get excited about the floating point number.
Using #("A" 1 2.3) as sample data.
Library | Function | Result |
---|---|---|
Original data | #("A" 1 2.3) | |
boost-json | json-encode | [,1,2.3] |
cl-json | encode-json | ["A",1,2.3] |
com.gigamonkeys.json | write-json | ["A",1,2.299999952316284] |
com.inuoe.jzon | stringify | ["A",1,2.3] |
jonathan | to-json | ["A",1,2.3] |
json-lib | stringify | ["A", 1, 2.3] |
json-streams (1) | json-stringify | #("A" 1 2.3) fell through ETYPECASE expression. |
jsown | to-json | ["A",1,2.3] |
shasht | write-json | ["A",1,2.2999999e+0] |
st-json (2) | write-json | Cannot write object of type (SIMPLE-VECTOR 3) as JSON. |
trivial-json-codec | serialize-json | ["A",1,2.3] |
yason | encode | ["A",1,2.299999952316284] |
- (1) You would need to use lower level components of json-streams. See json-streams-encoding.
- (2) You need to write your own st-json::write-json-element function for arrays. Possibly something like:
(defmethod write-json-element ((element vector) stream)
  (declare #.*optimize*)
  (write-char #\[ stream)
  (loop :for val :across element
        :for first := t :then nil
        :unless first :do (write-char #\, stream)
        :do (write-json-element val stream))
  (write-char #\] stream))
See st-json-encoding.
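Assuming a method along those lines has been defined (note that write-json-element is an internal st-json symbol, so the defmethod has to name st-json::write-json-element or be written inside the package), the vector example would then encode along the lines of:
(st-json:write-json-to-string #("A" 1 2.3))
;; expected to produce something like "[\"A\",1,2.3]"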
Encoding Simple Bit Vector
Again boost-json, json-streams and st-json fail. The rest pass.
Using #*10110 as the sample data.
Library | Exported Function | Result | Comment |
---|---|---|---|
Original data | #*10110 | ||
boost-json | json-encode | Fail [,0,1,1,0] | Lost first item |
cl-json | encode-json | [1,0,1,1,0] | |
com.gigamonkeys.json | write-json | [1,0,1,1,0] | |
com.inuoe.jzon | stringify | [1,0,1,1,0] | |
jonathan | to-json | "[1,0,1,1,0]" | |
json-lib | stringify | "[1, 0, 1, 1, 0]" | |
json-streams | json-stringify | Fail | Default fails (1) |
jsown | to-json | "[1,0,1,1,0]" | |
shasht | write-json | [1,0,1,1,0] | |
st-json | write-json | Fail | Default fails (2) |
trivial-json-codec | serialize-json | "[1,0,1,1,0]" | |
yason | encode | [1,0,1,1,0] |
- (1) You would need to use lower level components of json-streams. See json-streams-encoding.
- (2) Error: Can not write object of type (SIMPLE-BIT-VECTOR 5) as JSON. You can resolve this by writing your own method for handling vectors; the write-json-element sketch above also covers bit vectors. See st-json-encoding.
Encoding Nested Arrays
The following example uses a very simple nested array:
#(#("Dublin" "Cork" "Limerick") #("Berlin" "Frankfurt" "Munich"))
Library | Result | Comment |
---|---|---|
Original data | #(#("Dublin" "Cork" "Limerick") #("Berlin" "Frankfurt" "Munich")) | |
boost-json | [,"Cork","Limerick"][,[,"Frankfurt","Munich"]] | Fail |
cl-json | [["Dublin","Cork","Limerick"],["Berlin","Frankfurt","Munich"]] | |
gigamonkeys | [["Dublin","Cork","Limerick"],["Berlin","Frankfurt","Munich"]] | |
jzon | [["Dublin","Cork","Limerick"],["Berlin","Frankfurt","Munich"]] | |
jonathan | [["Dublin","Cork","Limerick"],["Berlin","Frankfurt","Munich"]] | |
json-lib | [["Dublin", "Cork", "Limerick"], ["Berlin", "Frankfurt", "Munich"]] | |
json-streams | (1) | |
jsown | [["Dublin","Cork","Limerick"],["Berlin","Frankfurt","Munich"]] | |
shasht | [["Dublin","Cork","Limerick"], ["Berlin","Frankfurt","Munich"]] | |
st-json | (2) | |
trivial-json-codec | [["Dublin","Cork","Limerick"],["Berlin","Frankfurt","Munich"]] | |
yason | [["Dublin","Cork","Limerick"],["Berlin","Frankfurt","Munich"]] |
- (1) You would need to use lower level components of json-streams. See json-streams-encoding.
- (2) st-json cannot write vectors as JSON by default. You can resolve this by writing your own method for handling vectors, as shown above. See st-json-encoding.
Encoding 2 Dimensional Array Version
Just to see what happens if we give the libraries a two dimensional array, let's use #2A((1.0 1.0) (1.0 1.0) (1.0 1.0)) as the sample data.
Jsown and shasht are the only libraries to handle a two dimensional array without the user having to write additional methods.
Library | Function | Result |
---|---|---|
Original data | #2A((1.0 1.0) (1.0 1.0) (1.0 1.0)) | |
boost-json | json-encode | Error: No applicable method |
cl-json | encode-json | Error: Unencodable value |
com.gigamonkeys.json | write-json | Method emit-json hangs |
com.inuoe.jzon | stringify | {} |
jonathan | to-json | Error: No applicable method |
json-lib | stringify | "null" |
json-streams | json-stringify | Error: (1) |
jsown | to-json | Passed "[[1.0,1.0],[1.0,1.0],[1.0,1.0]]" |
shasht | write-json | Passed "[[1.0e+0,1.0e+0],[1.0e+0,1.0e+0],[1.0e+0,1.0e+0]]" |
st-json | write-json | Error: Cannot write object as json (2) |
trivial-json-codec | serialize-json | Error: value is not of type sequence |
yason | encode | Error: No applicable method |
- (1) You would need to use lower level components of json-streams. See json-streams-encoding.
- (2) You can resolve this by writing your own write-json-element method for multi-dimensional arrays. See st-json-encoding.
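If you do need to encode a two dimensional array with one of the other libraries, one workable approach is to convert it into a vector of row vectors first. A minimal sketch, where the helper name 2d-array-to-rows is mine and not part of any library:
(defun 2d-array-to-rows (arr)
  "Return ARR (a 2D array) as a vector of row vectors that JSON encoders understand."
  (coerce (loop :for i :below (array-dimension arr 0)
                :collect (coerce (loop :for j :below (array-dimension arr 1)
                                       :collect (aref arr i j))
                                 'vector))
          'vector))
(com.inuoe.jzon:stringify (2d-array-to-rows #2A((1.0 1.0) (1.0 1.0) (1.0 1.0))))
;; => something like "[[1.0,1.0],[1.0,1.0],[1.0,1.0]]"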
Encoding hash-tables
Using string as key
Using (alexandria:plist-hash-table '("foo" 1 "bar" (7 8 9)) :test #'equal) as the sample data:
Library | Function | Result | Comment |
---|---|---|---|
boost-json | json-encode | {"foo":1,"bar":[7,8,9]} | |
cl-json | encode-json | {"foo":1,"bar":[7,8,9]} | |
com.gigamonkeys.json | write-json | {"foo":1,"bar":{"7":8,"9":{}}} | (1) |
com.inuoe.jzon | stringify | "{\"foo\":1,\"bar\":[7,8,9]}" | |
jonathan | to-json | "{\"foo\":1,\"bar\":[7,8,9]}" | |
json-lib | stringify | "{\"foo\": 1, \"bar\": [7, 8, 9]}" | |
json-streams | json-stringify | Error | (2) |
jsown | to-json | "{\"foo\":1,\"bar\":[7,8,9]}" | |
shasht | write-json | { "foo": 1, "bar": [7,8,9]} | |
st-json | write-json | {"foo":1,"bar":[7,8,9]} | |
trivial-json-codec | serialize-json | Error | (3) |
yason | encode | {"foo":1,"bar":[7,8,9]} |
- (1) The hash table encoding is fine. In this example com.gigamonkeys.json is treating the list which is a hash table value as a plist and trying to create key:value pairs.
- (2) You would need to use lower level components of json-streams. See json-streams-encoding.
- (3) May be able to resolve this with writing your own method.
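The call pattern for the string-keyed case looks roughly like this (a sketch using yason; the other passing libraries are analogous, and hash-table key order may vary):
(let ((ht (alexandria:plist-hash-table '("foo" 1 "bar" (7 8 9)) :test #'equal)))
  (with-output-to-string (s)
    (yason:encode ht s)))
;; => "{\"foo\":1,\"bar\":[7,8,9]}"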
Using symbol as key
Using (alexandria:plist-hash-table '(:foo 1 :bar (7 8 9)) :test #'eq) as the sample data. Again, note what com.gigamonkeys.json is doing in trying to treat the list value as key:value pairs.
Library | Function | Result | Comment |
---|---|---|---|
boost-json | json-encode | FAIL: {} | |
cl-json | encode-json | {"foo":1,"bar":[7,8,9]} | |
com.gigamonkeys.json | write-json | {"foo":1,"bar":{"7":8,"9":{}}} | (1) |
com.inuoe.jzon | stringify | {"foo":1,"bar":[7,8,9]} | |
jonathan | to-json | {"FOO":1,"BAR":[7,8,9]} | |
json-lib | stringify | "{\"foo\": 1, \"bar\": [7,8,9]}" | (2) |
json-streams | json-stringify | Error | (3) |
jsown | to-json | "{\"FOO\":1,\"BAR\":[7,8,9]}" | |
shasht | write-json | {"FOO": 1,"BAR": [7,8,9]} | |
st-json | write-json | Error | (4) |
trivial-json-codec | serialize-json | Error | (5) |
yason | encode | Error | (6) |
- (1) The symbol must be a keyword symbol or com.gigamonkeys.json will error out. As noted above, it is also trying to treat the embedded list as key:value pairs rather than an array.
- (2) If the symbols were not keywords, json-lib would return "{null: 1, null: [7, 8, 9]}"
- (3) You would need to use lower level components of json-streams. See json-streams-encoding.
- (4) You will need to write your own method to handle symbols in this situation
- (5) May be able to resolve this with writing your own method.
- (6) There is no applicable method for the generic function when it is called with a keyword symbol argument.
Encoding CLOS objects
cl-json, com.inuoe.jzon and shasht are all able to encode CLOS objects by default. boost-json, jonathan, jsown, st-json and yason require that you write a method for each class. With the other libraries you would either have to convert the CLOS object into another type (hash-table, alist, etc.) or write a specific function for that particular class of object. We are using a very simple CLOS object here, just to test the water.
(defclass person ()
  ((name :initarg :name :initform "Sabra" :accessor name)
   (eye-colour :initarg :eye-colour :initform "brown" :accessor eye-colour)))
As you can see from the table below, that holds true at least for this simple example.
Library | Function | Result |
---|---|---|
boost-json | json-encode | (1) |
cl-json | encode-json | {"name":"Sabra","eyeColour":"brown"} |
com.gigamonkeys.json | write-json | hangs |
com.inuoe.jzon | stringify | "{\"name\":\"Sabra\",\"eye-colour\":\"brown\"}" (4) |
jonathan | to-json | (1) |
json-lib | stringify | "null" |
json-streams | json-stringify | Error (2) |
jsown | to-json | (1) |
shasht | write-json | { "NAME": "Sabra", "EYE-COLOUR": "brown"} |
st-json | write-json | (1) |
trivial-json-codec | serialize-json | "{\"NAME\":\"Sabra\",\"EYE-COLOUR\":\"brown\"}" |
yason | encode | (1) |
- (1) You need to write a method for each CLOS object. See boost-json-encoding-clos, jonathan-encoding-clos, jsown-encoding-clos, st-json-encoding-clos or yason-encoding-clos.
- (2) You would need to use lower level components of json-streams. See json-streams-encoding.
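As an illustration of note (1), a yason method for the person class above might look something like the following sketch (this is my code, not something the library supplies):
(defmethod yason:encode ((p person) &optional (stream *standard-output*))
  (yason:with-output (stream)
    (yason:with-object ()
      (yason:encode-object-element "name" (name p))
      (yason:encode-object-element "eye-colour" (eye-colour p))))
  p)
;; (with-output-to-string (s) (yason:encode (make-instance 'person) s))
;; => "{\"name\":\"Sabra\",\"eye-colour\":\"brown\"}"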
Encoding Structs
Only com.inuoe.jzon and shasht were able to encode a struct without having to define a special method.
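The book struct itself is not shown in this report; a definition consistent with the sample data would be something like:
(defstruct book title author subject book-id)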
The sample data for the following table is:
(defparameter *book1*
  (make-book :title "C Programming" :author "Nuha Ali"
             :subject "C-Programming Tutorial" :book-id "478"))
Library | Function | Result |
---|---|---|
boost-json | json-encode | Error (1) |
cl-json | encode-json | Error (2) |
com.gigamonkeys.json | write-json | hangs |
com.inuoe.jzon | stringify | {"title":"C Programming","author":"Nuha Ali","subject":"C-Programming Tutorial","book-id":"478"} |
jonathan | to-json | Error (1) |
json-lib | stringify | null |
json-streams | json-stringify | Error (3) |
jsown | to-json | Error (1) |
shasht | write-json | {"TITLE":"C Programming","AUTHOR":"Nuha Ali","SUBJECT":"C-Programming Tutorial","BOOK-ID":"478"} |
st-json | write-json | Error (4) |
trivial-json-codec | serialize | Error (1) |
yason | encode | Error (1) |
- (1) No applicable method
- (2) Not a type which can be encoded by encode-json
- (3) You would need to use lower level components of json-streams. See json-streams-encoding.
- (4) Can not write object of type BOOK as JSON.
Incremental Encoding
Suppose you do not yet have a specific lisp data structure with all the data you want to encode to a single JSON object or array. A simple example would be intermediate function results during a loop. How would you go about writing it? Obviously you could collect all the results into a list or other CL data structure, or you could build a JSON object or array incrementally, or write the results to a stream incrementally. Examples can be found in the library details section below using the appropriate links in the following table.
Library | Ability? |
---|---|
boost-json | No |
cl-json | Yes |
com.gigamonkeys.json | No |
com.inuoe.jzon | Yes |
jonathan | Yes |
json-lib | No |
json-streams | Yes |
jsown | Yes |
shasht | Yes |
st-json | Yes |
trivial-json-codec | No |
yason | Yes |
- (1) being actively worked on
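As a flavour of what incremental encoding looks like, here is a minimal sketch using yason's streaming macros (the other libraries have their own equivalents, reachable through the links above):
(with-output-to-string (s)
  (yason:with-output (s)
    (yason:with-array ()
      (loop :for i :from 1 :to 3
            :do (yason:encode-array-element (* i i))))))
;; => "[1,4,9]"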
Symmetry
Are the libraries symmetrical in the sense that (a) going from JSON to cl and back to JSON gets you the exact same JSON and (b) going from cl to JSON and back to cl gets you the exact same cl data and data structure? (This is sometimes called round-tripping.) This is critical to some users, not critical to others. If there were a 1:1 mapping of cl and JSON data structures, this would be easy, but there isn't. JSON has a null that cl does not have, and cl has many data structures that JSON does not have.
Consider starting from JSON and trying to get back to JSON. To quote Steve Losh: "For me, the most important quality I need in a JSON library is an unambiguous, one-to-one mapping of types. For example: some libraries will deserialize JSON arrays as Lisp lists, and JSON true/false as t/nil. But this means [] and false both deserialize to nil, so you can't reliably round trip anything!"
Now consider starting from cl and trying to get back to cl. A cl library that handles both lists and arrays might encode both as JSON arrays. But now when you decode the JSON array, are you decoding to a cl list or a cl array? If round-tripping is important to you, this will drive your choice of cl data structures so that you know what you are decoding back to. Similarly, what do you do with symbols?
JSON -> CL -> JSON
First test (Easy)
In the first easy test, the only surprises were jonathan reversing the order of the elements in the JSON object, and whatever trivial-json-codec is doing with the angle brackets in the JSON object test.
Library | Pass/Fail | Resulting String | Comment |
---|---|---|---|
Original | "{\"a\":1,\"b\":\"sales\",\"c\":true}" | ||
boost-json | PASS | "{\"a\":1,\"b\":\"sales\",\"c\":true}" | |
cl-json | PASS | "{\"a\":1,\"b\":\"sales\",\"c\":true}" | |
com.gigamonkeys.json | PASS | {"a":1,"b":"sales","c":true} | |
com.inuoe.jzon | PASS | "{\"a\":1,\"b\":\"sales\",\"c\":true}" | |
jonathan | FAIL | "{\"c\":true,\"b\":\"sales\",\"a\":1}" | Reversed order (2) |
json-lib | PASS | "{\"a\":1,\"b\":\"sales\",\"c\":true}" | |
json-streams | PASS | "{\"a\":1,\"b\":\"sales\",\"c\":true}" | |
jsown | PASS | "{\"a\":1,\"b\":\"sales\",\"c\":true}" | |
shasht | PASS | "{\"a\":1,\"b\":\"sales\",\"c\":true}" | |
st-json | PASS | "{\"a\":1,\"b\":\"sales\",\"c\":true}" | |
trivial-json-codec | FAIL | "<<\":A\",1>,<\":B\",\"sales\">,<\":C\",true>>" | (1) |
yason | PASS | "{\"a\":1,\"b\":\"sales\",\"c\":true}" |
- (1) trivial-json-codec is really intended as a parser (one way) from JSON to CL, not really serializing to JSON.
- (2) Ignoring the reversed order issue, consider the following from jonathan. We decode a JSON object as an alist. Jonathan returns an alist with dotted pairs. E.g.
(jonathan:parse "{\"a\":1,\"b\":2}" :as :alist)
(("b" . 2) ("a" . 1))
If we then pass that to jonathan:to-json, it throws an error because jonathan does not handle alists with dotted pairs. I am only calling this out because jonathan expressly provides you with the ability to return a JSON object as a dotted pair alist, but then cannot handle it going back the other direction.
Everyone passes when dealing with the easy array with no nulls involved.
Library | Pass/Fail | Resulting String |
---|---|---|
Original | "[1,\"sales\",true]" | |
boost-json | PASS | "[1,\"sales\",true]" |
cl-json | PASS | "[1,\"sales\",true]" |
com.gigamonkeys.json | PASS | [1,"sales",true] |
com.inuoe.jzon | PASS | "[1,\"sales\",true]" |
jonathan | PASS | "[1,\"sales\",true]" |
json-lib | PASS | "[1, \"sales\", true]" |
json-streams | PASS | "[1,\"sales\",true]" |
jsown | PASS | "[1,\"sales\",true]" |
shasht | PASS | "[1,\"sales\",true]" |
st-json | PASS | "[1,\"sales\",true]" |
trivial-json-codec | PASS | "[1,\"sales\",true]" |
yason | PASS | "[1,\"sales\",true]" |
Second Test (Array Inside a JSON Object)
Library | Pass/Fail | Resulting String |
---|---|---|
Original | "{\"items\": [1,2,3]}" | |
boost-json | PASS | "{\"items\":[1,2,3]}" |
cl-json | FAIL (1) | "[[\"items\",1,2,3]]" |
com.gigamonkeys.json | PASS | {"items":[1,2,3]} |
com.inuoe.jzon | PASS | "{\"items\":[1,2,3]}" |
jonathan | PASS | "{\"items\":[1,2,3]}" |
json-lib | PASS | "{\"items\": [1, 2, 3]}" |
json-streams | PASS | "{\"items\":[1,2,3]}" |
jsown | PASS | "{\"items\":[1,2,3]}" |
shasht | PASS | "{\"items\":[1,2,3]}" |
st-json | PASS | "{\"items\":[1,2,3]}" |
trivial-json-codec | FAIL | "<<\":ITEMS\",[1,2,3]>>" |
yason | PASS | "{\"items\":[1,2,3]}" |
- (1) I was surprised that cl-json failed here by returning a JSON array instead of a JSON object. This is flagged on the homepage as issue 22.
Third test (Trickier Data Types)
Now we make it a bit trickier with the following JSON string, which we decode and then encode to see if we get back the original. The tricky bits are the unicode escape, the exponent, and the false and null values. The results are pretty much what you would expect.
"{\"key1\":\"value\\n\",\"key2\":1,\"key3\":[\"Hello \\u2604\",1.2e-34 ,true,false,null]}"
Library | Pass/Fail | Resulting String |
---|---|---|
Original | "{\"key1\":\"value\\n\",\"key2\":1, \"key3\":[\"Hello \\u2604\",1.2e-34 ,true,false,null]}" | |
boost-json | FAIL (3) | "{\"key1\":\"value\\n\",\"key2\":1, \"key3\":[\"Hello \\u2604\", 0.00000000000000000000000000000000012000001, true,null,null]}" |
cl-json | FAIL (3) | "{\"key1\":\"value\\n\",\"key2\":1, \"key3\":[\"Hello \\u2604\",0.00000000000000000000000000000000012000001, true,null,null]}" |
com.gigamonkeys.json | PASS | {"key1":"value\n","key2":1, "key3":["Hello ☄",1.2e-34,true,false,null]} |
com.inuoe.jzon | FAIL (3) | "{\"key1\":\"value\\n\",\"key2\":1, \"key3\":[\"Hello ☄\", 0.00000000000000000000000000000000012, true,false,null]}" |
jonathan (1) | FAIL (3)(5) | "{\"key3\":[\"Hello ☄\", 0.00000000000000000000000000000000012000001, true,[],[]],\"key2\":1, \"key1\":\"value\\n\"}" |
json-lib | FAIL | "{\"key1\": \"value\\n\", \"key2\": 1, \"key3\": [\"Hello \", 1.2000001e-34, true, null, null]}" |
json-streams (1) | PASS | "{\"key1\":\"value\\n\",\"key2\":1, \"key3\":[\"Hello ☄\",1.2E-34,true,false,null]}" |
jsown | FAIL (3)(5) | "{\"key1\":\"value\\n\",\"key2\":1, \"key3\":[\"Hello \\u2604\", 0.00000000000000000000000000000000012, true,[],[]]}" |
shasht (1) | PASS | "{\"key1\":\"value\\n\",\"key2\":1, \"key3\":[\"Hello ☄\",1.2e-34,true,false,null]}" |
st-json | PASS | "{\"key1\":\"value\\n\",\"key2\":1, \"key3\":[\"Hello \\u2604\",0.12e-33,true,false,null]}" |
trivial-json-codec | FAIL (4) | "<<\":KEY1\",\"value\\n\">,<\":KEY2\",1>, <\":KEY3\",[\"Hello \\u2604\",1.2000001e-34,true,null,null]>>" |
yason (1) (2) | PASS? (3) | "{\"key1\":\"value\\n\",\"key2\":1, \"key3\":[\"Hello ☄\", 0.00000000000000000000000000000000011999999642058263, true,false,null]}" |
- (1) unicode expanded to the appropriate character which I will treat as a success
- (2) You need to pass :json-booleans-as-symbols t to the yason parse function as a keyword argument in order to get false and it will come out as a YASON:FALSE symbol
- (3) Exponent expanded to a decimal which I could argue is a success. Your call.
- (4) trivial-json-codec is really intended as a parser (one way) from JSON to CL, not really serializing to JSON.
- (5) both false and null are converted to [], but they are different concepts.
Library | Point(s) of Failure | Comment |
---|---|---|
boost-json | exponent (1), false | |
cl-json | exponent (1), false | |
com.gigamonkeys.json | ||
com.inuoe.jzon | exponent (1) | |
jonathan | backwards, exponent (1), false, null | unicode expanded to the appropriate character |
json-lib | unicode, false | |
json-streams | unicode expanded to the appropriate character | |
jsown | exponent (1), false, null | |
shasht | unicode expanded to the appropriate character | |
st-json | exponent (1) | 0.12e-33 v. 1.2e-34 should count as pass |
yason | exponent (1) | unicode expanded to the appropriate character |
- (1) Up to you whether you want to treat the exponent expanded to a decimal as a pass or fail
CL -> JSON -> CL
We did this testing with a simple alist with undotted pairs. Since this is an alist, cl-json uses the specialized function encode-json-alist-to-string to successfully write JSON and then reads back a result that is symmetrical to the original alist.
'((:NAME "George Washington") (:BIRTHDAY "February 22, 1732") (:ADDRESS "Mount Vernon, Virginia, United States"))
Library | Pass/Fail | Resulting String |
---|---|---|
Original | ((:NAME "George Washington") (:BIRTHDAY "February 22, 1732") (:ADDRESS "Mount Vernon, Virginia, United States")) | |
boost-json | FAIL | Error: Unexpected #\N |
cl-json | PASS | ((:NAME "George Washington") (:BIRTHDAY "February 22, 1732") (:ADDRESS "Mount Vernon, Virginia, United States")) |
com.inuoe.jzon | FAIL (1) | (("address" . #("Mount Vernon, Virginia, United States")) ("birthday" . #("February 22, 1732")) ("name" . #("George Washington"))) |
jonathan | PASS (3) | ((:NAME "George Washington") (:BIRTHDAY "February 22, 1732") (:ADDRESS "Mount Vernon, Virginia, United States")) |
json-lib | FAIL | #(#("name" "George Washington") #("birthday" "February 22, 1732") #("address" "Mount Vernon, Virginia, United States")) |
json-streams | FAIL | (4) |
jsown | FAIL | (("NAME" "George Washington") ("BIRTHDAY" "February 22, 1732") ("ADDRESS" "Mount Vernon, Virginia, United States")) |
shasht | FAIL | #(#("NAME" "George Washington") #("BIRTHDAY" "February 22, 1732") #("ADDRESS" "Mount Vernon, Virginia, United States")) |
st-json | FAIL | Depends on how you write a method for handling symbols. See st-json-encoding |
trivial-json-codec | PASS | ((:NAME "George Washington") (:BIRTHDAY "February 22, 1732") (:ADDRESS "Mount Vernon, Virginia, United States")) |
yason (2) | FAIL | (("address" "Mount Vernon, Virginia, United States") ("birthday" "February 22, 1732") ("name" "George Washington")) |
- (1) com.inuoe.jzon and yason return hash-tables, so alexandria:hash-table-alist was used to get an alist back.
- (2) (yason:parse (with-output-to-string (s) (yason:encode-alist data s)) :object-as :alist)
- (3) Caveat here if we tried to do this with respect to alists having dotted pairs. In that case jonathan would also fail.
- (4) You could write your own methods to wrap lower level components of json-streams. See json-streams-encoding.
Library | Point(s) of Failure |
---|---|
boost-json | Unexpected #\N |
cl-json | |
com.gigamonkeys.json | ??? |
com.inuoe.jzon | keys are strings, not keywords, values are embedded in vectors |
jonathan | (1) |
json-lib | result is vectors in vectors with the keys being strings, not keywords |
json-streams | You would need to use lower level components of json-streams. See json-streams-encoding. |
jsown | keys are strings, not keywords |
shasht | result is vectors in vectors with the keys being strings, not keywords |
st-json | You need to write your own method for handling symbols. |
trivial-json-codec | |
yason | keys are strings, not keywords |
- (1) As noted when going from JSON -> CL -> JSON, jonathan is not symmetric if the alists have dotted pairs.
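For reference, the symmetric cl-json path for this alist looks roughly like this (a sketch using the alist-specific encoder mentioned above):
(cl-json:decode-json-from-string
 (cl-json:encode-json-alist-to-string
  '((:NAME "George Washington") (:BIRTHDAY "February 22, 1732"))))
;; => ((:NAME "George Washington") (:BIRTHDAY "February 22, 1732"))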
Now let's try starting from an array which also contains a :NULL keyword symbol as a substitute for cl not having a proper null value.
Library | Pass/Fail | Resulting String |
---|---|---|
Original | #("a" 1 4.2 NIL :NULL) | |
boost-json | FAIL | [,1,4.2,null,"NULL"] |
cl-json DEFAULT | FAIL | ( "a" 1 4.2 NIL "null") |
cl-json | PASS (1)(2) | #("a" 1 4.2 NIL "null") |
com.gigamonkeys.json | FAIL | ["a",1,4.199999809265137,{},null] |
com.inuoe.jzon | PASS (2) | #("a" 1 4.2d0 NIL "NULL") |
jonathan | FAIL | (a 1 4.2 NIL NIL) |
json-lib | PASS (2) | #(a 1 4.2 NIL "null") |
json-streams | FAIL | Error |
jsown | FAIL | (a 1 21/5 NIL NIL) |
shasht | PASS (3) | #("a" 1 4.2000003 NIL :NULL) |
st-json | FAIL | Error |
trivial-json-codec | PASS | #(a 1 4.2 NIL NULL) |
yason (4) | FAIL | #("a" 1 4.2 NIL NIL) |
- (1) Needs to use cl-json:with-decoder-simple-clos-semantics or cl-json:set-decoder-simple-clos-semantics
- (2) Pass assuming you deal with the "NULL" or "null" string somehow to get back to :NULL
- (3) Pass assuming the float result is acceptable to you
- (4) Assumes yason:parse has keyword parameter :json-arrays-as-vectors set to t
Library | Point(s) of Failure |
---|---|
boost-json | lost the first value, nil came back as null |
cl-json | need to parse the result and convert "null" to :NULL |
com.gigamonkeys.json | {} braces instead of () or nil |
com.inuoe.jzon | need to parse the result and convert "NULL" to :NULL |
jonathan | result is a list and it converts :NULL to nil |
json-lib | need to parse the result and convert "null" to :NULL |
json-streams | You would need to write your own method wrapping components of json-streams to handle a vector |
jsown | returns list instead of array and :NULL is converted to nil. Interesting that the float is converted to a ratio. |
shasht | |
st-json | You need to write your own method for handling vectors |
trivial-json-codec | |
yason | Null converted to nil |
Security
Getting JSON objects from another source is just as insecure as getting any other data from another source. You are still responsible for ensuring that you have properly sanitized, validated or otherwise checked the security of the data.
Redditor lokedhs has pointed out that "I'd be careful about using any JSON library that uses keywords for hash keys (like CL-JSON). The reason is that if you are using it to parse unchecked input, it can be used to cause a denial of service attack by sending maps that contain random keys. Every key will be interned into the keyword package, which are never garbage collected, causing an out of memory condition after a while." Cl-json flags the issue and provides the function safe-json-intern, which will throw an error if the keyword to be interned does not already exist in the *json-symbols-package*, so at least you are warned and provided with an alternative. The following code will throw an error if alpha-omega is not already interned in the *json-symbols-package*.
(setf cl-json::*identifier-name-to-key* #'cl-json::safe-json-intern)
(cl-json:decode-json-from-string "{\"alpha-omega\": 1}")
Boost-json and Jonathan also seem to have this as a potential issue. Boost-json decodes JSON objects to a CLOS object with slot-value names interned in the boost-json package.
com.gigamonkeys.json, com.inuoe.jzon, json, json-lib, json-streams, jsown, shasht and st-json do not intern the keywords. Yason doesn't intern the keywords in the library but the test files do, giving you the impression that it assumes that will be normal practice.
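By contrast, a decoder that keeps keys as strings sidesteps the interning problem entirely. For example, with yason's default behaviour (a sketch):
(gethash "alpha-omega" (yason:parse "{\"alpha-omega\": 1}"))
;; => 1 ; the key stays a string in an equal hash-table and nothing is interned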
You will see in the next section when it talks about malformed data that certain strings can also hang the system or exhaust the stack (typically by opening thousands of JSON arrays or objects and never closing them). Com.inuoe.jzon, json-lib, json-streams and shasht have maximum level limits that can be used to prevent this type of overloading from happening.
Standard Conformity and Dealing with Malformed Data
To check standard conformity and test for malformed data, we will take advantage of some of the test suites that have come out for JSON, specifically https://github.com/nst/JSONTestSuite. You may also find interesting reading at https://yitzchak.github.io/shasht/parsing.html#17 and http://www.seriot.ch/parsing_json.php.
One thing to take into consideration, bearing in mind your own fact pattern, is whether you agree or disagree with Jon Postel's robustness principle:
"Be strict when sending and tolerant when receiving. Implementations must follow specifications precisely when sending to the network, and tolerate faulty input from the network. When in doubt, discard faulty input silently, without returning an error message unless this is required by the specification."
If you follow this logic, your encoding function should be perfectly compliant and your decoding function may want to accept invalid JSON (subject to security concerns). According to https://datatracker.ietf.org/doc/html/draft-thomson-postel-was-wrong-03, the robustness principle should be read in the context of dealing with imperfect protocols, not everything under the sun including implementation bugs.
Conformity Testing
While JSON has a standard, it is arguably under-specified. Using the tests from https://github.com/nst/JSONTestSuite, I intended to focus on whether the libraries correctly parsed JSON strings that must be accepted and whether they throw errors on JSON strings that must be rejected. However, as you will see, how the libraries handle the "must reject" strings and "under-specified" strings also implicate security and stability. Some libraries' attempts to handle intentionally malformed JSON strings actually hung SBCL or triggered stack exhaustion. These are not the libraries you want facing uncontrolled input.
Must Accept
This batch of JSON strings should be accepted by all libraries. The only libraries to meet that standard are com.inuoe.jzon and com.gigamonkeys.json. Almost all the rest came "reasonably" close.
Library | Correct | Incorrect | Comment |
---|---|---|---|
boost-json | 95/(92) | 0 | (1) |
cl-json | 95/(92) | 0 | (1) |
gigamonkeys | 95 | 0 | |
com.inuoe.jzon | 95 | 0 | |
jonathan | 94 | 1 | (2) |
json-lib | 95/(92) | 0 | (1) |
json-streams | 93 | 2 | (3) |
jsown | 93 | 2 | (4) |
shasht | 95/(92) | 0 | (1) |
st-json | 95/(92) | 0 | (1) |
trivial-json-codec | 90/(87) | 5 | (1)(5) |
yason | 95/(92) | 0 | (1) |
- (1) If *read-default-float-format* is set to 'single-float, there will be decoding failures on [123e65], [123e45], [123.456e78],
- (2) Failed on [123.456e78] even with *read-default-float-format* is set to 'double-float
- (3) rejected files with duplicate keys. This can be resolved by passing the keyword parameter :duplicate-key-check nil
- (4) Failed on lonely numbers (an integer or negative real not within a JSON array or JSON object)
- (5) Failed on [[] ], [0e+1], [1E+2], [1e+2], { "min": -1.0e+28, "max": 1.0e+28 }
Can Accept or Reject
This batch of JSON strings was designed to find the under-specified holes in the JSON specification. As such, libraries could accept or reject the strings without being out of compliance with the standard. For purposes of the following table, I only tested files that could be read as encoded in utf-8.
What I found troubling was that both jonathan and jsown hung on one or more of these strings.
Library | Accepted | Rejected | Comment |
---|---|---|---|
boost-json | 13 | 4 | |
cl-json | 13 | 4 | |
gigamonkeys | 14 | 3 | |
com.inuoe.jzon | 7 | 10 | |
jonathan | 9 | 7 | (1) |
json-lib | 12 | 5 | |
json-streams | 1 | 16 | |
jsown | 11 | 5 | (1) |
shasht | 9 | 8 | |
st-json | 8 | 9 | |
trivial-json-codec | 13 | 4 | |
yason | 8 | 9 |
- (1) hang on i_number_real_underflow [123e-10000000]
Malformed Data
Must Reject
Only three libraries did not hang or trigger stack exhaustion on strings designed to open thousands of nested arrays or objects without ever closing them. Those are com.inuoe.jzon, json-lib and json-streams. All three have a nesting limit and refuse to exceed it.
Library | Correct | Incorrect | Comment |
---|---|---|---|
boost-json | 145 | 28 | stack exhausted (1) |
cl-json | 153 | 20 | stack exhausted (1) |
com.gigamonkeys.json | 0 | 173 | stack exhausted (1) |
com.inuoe.jzon | 173 | 0 | |
jonathan | 128 | 45 | stack exhausted (1) |
json-lib | 110 | 63 | |
json-streams | 173 | 0 | |
jsown | 102 | 71 | stack exhausted (1) |
shasht | 159 | 18 | recoverable error (2) |
st-json | 140 | 33 | stack exhausted (1) |
trivial-json-codec | 101 | 72 | |
yason | 119 | 54 | stack exhausted (1) |
- (1) n_structure_open_array_object.json, n_structure_100000_opening_arrays.json
- (2) n_structure_open_array_object.json
Benchmarking
Read Times
Ok. We are going to show read benchmarking results using SBCL, CCL and ECL because the results for jonathan differed so much between implementations. Some other libraries also showed interesting differences, with shasht actually taking the speed lead on larger files when using ECL.
Jsown maintains its crown as the fastest parser on SBCL (assuming no errors in the file). It is beaten slightly by jonathan on tiny strings, but jonathan shows a slow-down with certain nested JSON objects that gets progressively worse as string sizes increase. This effect shows up earliest under SBCL, where jonathan starts slowing down on data under 12.9k. The effect is obvious even under CCL and ECL by the time you get to 221k JSON data objects, and by the time you get over 1MB JSON data objects, jonathan is orders of magnitude slower than all other libraries, not just jsown.
If we take a quick look at benchmarking with tiny JSON strings, we get read time chart comparisons that look like the following. The numbers come from applying the cost-of-nothing benchmark function against the libraries parsing the JSON string immediately below and using cl-spark's vspark function.
yason-alist and yason-plist are shorthand for (yason:parse data :object-as :alist) and (yason:parse data :object-as :plist)
(defparameter +json-string+ "{\"key1\": \"value\\n\", \"key2\":1,\"key3\":[\"Hello \\u2604\", 1.2e-34 ,true, false,null]}")
JSON Read Times (SBCL 2.3.2 tiny JSON object) 0 5.8130204e-6 1.1626041e-5 ˫----------------------------+----------------------------˧ boost-json ████████████████████████▎ cl-json ████████████████████████████████▊ com.gigamonkeys.json ███████████████████████▋ jonathan ███████▍ com.inuoe.jzon ███████▌ json-lib ███████████████████████████████▍ json-streams ███████████████████████████████████████████████████████████ jsown ███████▉ shasht ███████████▉ st-json ██████████████████████████████▊ yason █████████████████████████████████████▋ JSON Read Times (CCL 1.12.1 tiny JSON object) 0 3.1699197E-5 6.3398395E-5 ˫----------------------------+----------------------------˧ boost-json ███████████████▏ cl-json ██████████████████▎ com.gigamonkeys.json █████████████████▍ jonathan ███▏ com.inuoe.jzon ██████▊ json-lib ██████████████████████████████████████████████████████████ json-streams ██████████████████████████████████████████▎ jsown █████▎ shasht ██████████▎ st-json ████████████████▌ yason ██████████████████▌ JSON Read Times (ECL 21.2.1 tiny JSON object) 0 1.7382258e-4 3.4764517e-4 ˫----------------------------+----------------------------˧ boost-json █████████████▎ cl-json ████████▎ com.gigamonkeys.json ████▊ jonathan ████▏ com.inuoe.jzon ██████▍ json-lib ██████████████████████▎ json-streams ███████████████████████████████████▊ jsown ███████████████████████████████████████████████████████████ shasht █████▌ st-json ████▉ yason ███████▍
The read times start to look different with respect to nested JSON objects as the file sizes increase. The next chart is a 2.7k JSON nested object. With SBCL, jonathan is slightly behind jsown, but still looking good compared to the rest of the libraries. The JSON datasets for the charts between 2.7k and 221k all come from a Nobel prize dataset which can be found at http://api.nobelprize.org/v1/prize.json. The 2.7k data is just 2021. The 12.9k data is 2017-2021 and the 221k data is all years. We are dropping trivial-json-codec at this point because it claims the data is invalid.
JSON Read Times (SBCL 2.3.2 2.7k JSON object) 0 1.2616655e-4 2.523331e-4 ˫----------------------------+----------------------------˧ boost-json ███████████▍ cl-json █████████████████████████████▋ com.gigamonkeys.json █████████████████▎ jonathan █████▌ com.inuoe.jzon █████████▌ json-lib ████████████████████████████████████▋ json-streams ███████████████████████████████████████████████████████████ jsown ███▌ shasht █████████████▍ st-json ████████████▍ yason ███████████████████████████████████████████▋ JSON Read Times (CCL 1.12.1 2.7k JSON object) 0 4.6838302E-4 9.3676604E-4 ˫----------------------------+----------------------------˧ boost-json █████████████▋ cl-json ████████████████████████████▏ com.gigamonkeys.json █████████████████▍ jonathan ███▍ com.inuoe.jzon █████████▊ json-lib █████████████████████████████████████████████████████████▌ json-streams ███████████████████████████████████████████████████████████ jsown ███████▍ shasht ███████▉ st-json ████████████▋ yason ███████████████████████████▎ JSON Read Times (ECL 21.2.1 2.7k JSON object) 0 0.010292258 0.020584516 ˫----------------------------+----------------------------˧ boost-json █▌ cl-json ███▌ com.gigamonkeys.json ███▉ jonathan █▎ com.inuoe.jzon █▉ json-lib █████▋ json-streams ███████████████████████████████████████████████████████████ jsown ██▊ shasht █▎ st-json ████████▋ yason ███████████▊
Now we increase the size to a 12.9k JSON nested object. Using SBCL, jsown is out in front by itself and jonathan has fallen behind four other libraries. On the other hand, jonathan is still faster with CCL and ECL.
JSON Read Times (SBCL 2.3.2 12.9k JSON object) 0 6.1594247e-4 0.0012318849 ˫----------------------------+----------------------------˧ boost-json ███████████▍ cl-json █████████████████████████████▊ com.gigamonkeys.json ████████████████▉ jonathan █████████████████▉ com.inuoe.jzon █████████▍ json-lib ███████████████████████████████████▉ json-streams ███████████████████████████████████████████████████████████ jsown ███▌ shasht █████████████▏ st-json ████████████▏ yason ██████████████████████████████████████████▋ JSON Read Times (CCL 1.12.1 12.9k JSON object) 0 0.0022662715 0.004532543 ˫----------------------------+----------------------------˧ boost-json █████████████▋ cl-json ████████████████████████████▍ com.gigamonkeys.json ████████████████▉ jonathan ██████▌ com.inuoe.jzon █████████▋ json-lib █████████████████████████████████████████████████████████▉ json-streams ███████████████████████████████████████████████████████████ jsown ███████▏ shasht ████████▎ st-json ████████████▉ yason ███████████████████████████▌ JSON Read Times (ECL 21.2.1 12.9k JSON object) 0 0.018946955 0.03789391 ˫----------------------------+----------------------------˧ boost-json █████▎ cl-json ████████████▍ com.gigamonkeys.json █████▋ jonathan █████▍ com.inuoe.jzon █████▋ json-lib ████████████████████████▊ json-streams ███████████████████████████████████████████████████████████ jsown ███████▊ shasht ████████▏ st-json █████▎ yason ████████▍
Now we jump up to the 221k JSON object and jonathan is really falling behind with SBCL and suddenly starts struggling under CCL and ECL for unknown reasons.
JSON Read Times (SBCL 2.3.2 221k JSON object) 0 0.030821392 0.061642785 ˫----------------------------+----------------------------˧ boost-json ███▊ cl-json █████████▊ com.gigamonkeys.json █████▉ jonathan ██████████████████████████████████████████████████████████ com.inuoe.jzon ███▎ json-lib ████████████▍ json-streams ███████████████████▋ jsown █▎ shasht ████▌ st-json ████▏ yason ██████████████▋ JSON Read Times (CCL 1.12.1 221k JSON object) 0 0.0410785 0.082157 ˫----------------------------+----------------------------˧ boost-json ████████████▉ cl-json ██████████████████████████▏ com.gigamonkeys.json ██████████████████████▎ jonathan █████████████████████████████████████████████████████▉ com.inuoe.jzon ██████████████▊ json-lib ███████████████████████████████████████████████████████████ json-streams █████████████████████████████████████████████████████▋ jsown ███████▍ shasht ███████████████████████▉ st-json ████████████▏ yason ████████████████████████████████▉ JSON Read Times ECL 21.2.1 221k JSON object 0 0.17499994 0.34999987 ˫----------------------------+----------------------------˧ boost-json ██████████▎ cl-json █████████████▎ com.gigamonkeys.json █████████▉ jonathan ██████████████████████████████████████████▏ com.inuoe.jzon █████████▎ json-lib ███████████████████▊ json-streams ███████████████████████████████████████████████████████████ jsown ██████████████████████████▏ shasht ████▉ st-json █████▉ yason ███████████████████████████▌
Switching to trivial-benchmark numbers, the next two tables show the cumulative timing and consing for 20 runs over these three JSON strings, comparing jonathan against com.inuoe.jzon. Obviously the first thing that jumps out is the garbage collection while jonathan parses the 221k JSON string. Look at the relative increase: the largest string is 8162% larger than the smallest, but jonathan's run time increases by over 127000% and its bytes consed by over 557000%. At the same time, com.inuoe.jzon's run time and consing increase by less than the increase in string size.
jonathan | | | | Relative Increase |
---|---|---|---|---|
File Size (Bytes) | 2715 | 12961 | 221618 | 8162% |
 | Total (sec) | Total (sec) | Total (sec) | |
RUN-TIME | 0.000993 | 0.008474 | 1.261165 | 127005% |
USER-RUN-TIME | 0.000970 | 0.008460 | 1.184836 | 122251% |
SYSTEM-RUN-TIME | 0.000004 | 0.000023 | 0.076502 | 1912550% |
GC-RUN-TIME | 0 | 0 | 155.651 | |
BYTES-CONSED | 1811312 | 37779888 | 10098760112 | 557538% |
com.inuoe.jzon | | | | Relative Increase |
---|---|---|---|---|
File Size (Bytes) | 2715 | 12961 | 221618 | 8162% |
 | Total (sec) | Total (sec) | Total (sec) | |
RUN-TIME | 0.001822 | 0.004031 | 0.071345 | 3915% |
USER-RUN-TIME | 0.001822 | 0.004031 | 0.071326 | 3915% |
SYSTEM-RUN-TIME | 0 | 0 | 0 | 0 |
GC-RUN-TIME | 0 | 0 | 0 | 0 |
BYTES-CONSED | 426640 | 1839472 | 30431056 | 4311% |
Now looking at parsing a 1.2 MB JSON string with nested objects, jonathan is orders of magnitude slower than all the other libraries regardless of which compiler is used. The JSON dataset can be found at https://github.com/mledoze/countries/blob/master/countries.json.
JSON Read Times (SBCL 2.3.2 1.2MB JSON object) 0 1.0532565 2.106513 ˫----------------------------+----------------------------˧ boost-json ▌ cl-json █▏ com.gigamonkeys.json ▊ jonathan ██████████████████████████████████████████████████████████ com.inuoe.jzon ▍ json-lib █▋ json-streams ██▌ jsown ▍ shasht ▋ st-json ▊ yason █▌ CCL results not included since CCL had an issue with some of the unicode in the file. JSON Read Times (ECL ECL 21.2.1 1.2MB JSON object) 0 3.5 7.0 ˫----------------------------+----------------------------˧ boost-json █▍ cl-json █▉ com.gigamonkeys.json █▉ jonathan ███████████████████████████████████████████████████████████ com.inuoe.jzon █████▏ json-lib ████▎ json-streams ████████▍ jsown █▉ shasht █▌ st-json █▍ yason █▉
If we drop jonathan and look at the remaining libraries, it looks like this:
JSON Read Times 0 0.045177087 0.090354174 ˫----------------------------+----------------------------˧ boost-json ██████████▎ cl-json ███████████████████████████▍ com.gigamonkeys.json ██████████████████▍ com.inuoe.jzon ████████▌ json-lib █████████████████████████████████████▌ json-streams ███████████████████████████████████████████████████████████ jsown ███████▍ shasht █████████████▍ st-json █████████████████▎ yason ███████████████████████████████████▎ JSON Read Times (ECL 21.2.1 1.2MB JSON object) 0 0.49999994 0.9999999 ˫----------------------------+----------------------------˧ boost-json ██████████▋ cl-json ████████████████████▋ com.gigamonkeys.json ████████████████▊ com.inuoe.jzon ████████████████████████████████▌ json-lib ███████████████████████▋ json-streams ███████████████████████████████████████████████████████████ jsown ███████████████████▋ shasht █████████▋ st-json █████████▏ yason █████████████▎
Moving up to a 9.8 MB file (still on the small side) downloaded from https://www.vizgr.org/historical-events/search.php?format=json&begin_date=-3000000&end_date=20151231&lang=en, Yason's standard parsing into a hash complained that there was a duplicate key in the data, so we are using yason's parsing into an alist instead.
With Jonathan
JSON Read Times (SBCL 2.3.2 9.8 MB JSON object) 0 8.069131 16.138262 ˫----------------------------+----------------------------˧ boost-json ▋ cl-json █▋ com.gigamonkeys.json █▎ jonathan ███████████████████████████████████████████████████████████ com.inuoe.jzon ▍ json-lib ██▎ json-streams ███▏ jsown ▎ shasht ▋ st-json ▋ yason-alist ██▌ JSON Read Times (CCL 1.12.1 9.8 MB Json Object) 0 10.630925 21.26185 ˫----------------------------+----------------------------˧ boost-json ██▎ cl-json ███▊ com.gigamonkeys.json ███▍ jonathan ███████████████████████████████████████████████████████████ com.inuoe.jzon ▉ json-lib ███████████▎ json-streams ████████▏ jsown █▍ shasht █▏ st-json ██▏ JSON Read Times (ECL 21.2.1 9.8 MB Json Object) 0 19.5 39.0 ˫----------------------------+----------------------------˧ boost-json █▌ cl-json ███▏ com.gigamonkeys.json ██▎ jonathan ███████████████████████████████████████████████████████████ com.inuoe.jzon █▌ json-lib ██████▏ json-streams ████████████▏ jsown ███▏ shasht █▌ st-json █▌ yason-alist ███▏
Without Jonathan
JSON Read Times (SBCL 2.3.2 9.8 MB JSON object) 0 0.4027305 0.805461 ˫----------------------------+----------------------------˧ boost-json ████████████▋ cl-json ████████████████████████████████▍ com.gigamonkeys.json ███████████████████████▎ com.inuoe.jzon █████▉ json-lib ████████████████████████████████████████████▊ json-streams ███████████████████████████████████████████████████████████ jsown ████▍ shasht █████████████▏ st-json ████████████▊ yason-alist ███████████████████████████████████████████████████▋ JSON Read Times (CCL 9.8MB JSON object) 0 2.043566 4.087132 ˫----------------------------+----------------------------˧ boost-json ███████████▊ cl-json ███████████████████▉ com.gigamonkeys.json ████████████████▉ com.inuoe.jzon █████▏ json-lib ███████████████████████████████████████████████████████████ json-streams ██████████████████████████████████████████▉ jsown ██████▉ shasht █████▍ st-json ██████████▉ yason-alist █████████████████████████▋ JSON Read Times (ECL 9.8 MB JSON object) 0 4.5 9.0 ˫----------------------------+----------------------------˧ boost-json ██████▌ cl-json █████████████▏ com.gigamonkeys.json █████████████▏ com.inuoe.jzon ██████▌ json-lib ███████████████████▋ json-streams ███████████████████████████████████████████████████████████ jsown █████████████▏ shasht █████▊ st-json ██████▌ yason-alist █████████████▏
Read Times From Stream (SBCL)
The following charts read from a stream rather than from a string (so the list of libraries is slightly smaller). jonathan choked on both files when read from a stream, so it is excluded as well. We see the same comparative results when reading from a stream as when we read from strings.
Countries File
JSON Read Times (SBCL) 1.2 MB Countries File 0 0.04425004 0.08850008 ˫-------------------------------+-------------------------------˧ boost-json ██████████▋ cl-json ████████████████████████████▌ com.inuoe.jzon ████████████████▉ json-streams █████████████████████████████████████████████████████████████████ shasht █████████████▋ st-json ████████████████▎ yason ████████████████████████████████▏ yason-alist ██████████████████████████████ yason-plist ███████████████████████████████▏
Historical Events File
JSON Read Times (SBCL) 9.8 MB Historical Events File 0 0.39376867 0.78753734 ˫-------------------------------+-------------------------------˧ boost-json ████████████▋ cl-json ██████████████████████████████████▌ com.inuoe.jzon █████████████████▉ json-streams █████████████████████████████████████████████████████████████████ shasht █████████████▏ st-json █████████████▎ yason-alist ██████████████████████████████████████████▏
Write Times
Unlike the read times with jonathan, write times did not show any surprising differences between tiny bits of data and longer nested data. First, writing tiny bits of data. We skip trivial-json-codec because of its inability to deliver valid JSON from lists, and com.gigamonkeys.json had difficulties with some of the data as well, so it was dropped. The data used is the same data as in the reading benchmarks, but encoded by yason. Due to time constraints, I did not run the write time benchmarks for CCL and ECL.
JSON Write Times (SBCL trivial sized list) 0 5.3087347e-6 1.0617469e-5 ˫-------------------------------+-------------------------------˧ boost-json █████████████████████████████████████████████████████████████████ cl-json ███████████████████████████████████████▍ com.inuoe.jzon ████████████▊ json-lib ████████████████████████████████████████████████▊ jonathan ██████████████████████████████████████████████████████▌ jsown ███████████████████████████████████████████████▏ shasht █████████████████▌ st-json █████████████████████████████▍ yason ██████████████████████████████████████▊
Now the lisp data equivalent to the JSON 2.7k data string:
JSON Write Times 2.7k data (SBCL) 0 1.1911893e-4 2.3823786e-4 ˫-------------------------------+-------------------------------˧ boost-json █████████████████████████████████████████████████████████████████ cl-json ███████████████████▍ com.inuoe.jzon ██████████▌ json-lib ███████████████████████████████████▉ jonathan ██████████▌ jsown ████████████████████████████▌ shasht ███████████▎ st-json ████████▉ yason ██████████████████████▏
Now the lisp data equivalent to the JSON 12.9k data string
JSON Write Times 12.9k data (SBCL) 0 5.9223245e-4 0.0011844649 ˫-------------------------------+-------------------------------˧ boost-json █████████████████████████████████████████████████████████████████ cl-json ██████████████████▉ com.inuoe.jzon ██████████▎ json-lib ██████████████████████████████████▊ jonathan ██████████▌ jsown ███████████████████████████▍ shasht ██████████▊ st-json ████████▌ yason █████████████████████▏
Now the lisp data equivalent to a JSON 221.6k data string
JSON Write Times 221.6k data (SBCL) 0 0.014435494 0.028870989 ˫-------------------------------+-------------------------------˧ boost-json █████████████████████████████████████████████████████████████████ cl-json ████████████▌ com.inuoe.jzon ██████▉ json-lib ████████████████████████▎ jonathan ██████▊ jsown ███████████████████▌ shasht ███████▎ st-json █████▋ yason ██████████████▋
JSON Libraries Specific Comments
boost-json
Library | Author | License | Website | Comments |
---|---|---|---|---|
boost-json | Jeffrey Massung | Apache v.2 | https://github.com/cl-boost/json | Not in Quicklisp |
Boost-json is one of the faster decoders. The author notes that he personally uses it to parse extremely large genomics JSON-list files (several GB in size). The author and I have different opinions on nil/false/null/[] and whether they actually have meaningful differences. He does not think so; I disagree. It does have issues on the encoding side, sometimes losing values when encoding vectors, and it is generally one of the slowest writers. I think it has value in the right use cases, but you do need to make sure it meets your particular needs.
Default Mapping
Please note the direction of the arrows in the following table.
Lisp | JSON | |
---|---|---|
integer | <-> | number with no frac or exp parts |
float | <-> | number with frac or exp parts |
rational | -> | number with frac or exp parts |
ratio | -> | number with frac or exp parts |
T | <-> | true |
nil | <-> | null |
nil | <- | false |
other symbol | -> | string |
character | -> | Error |
string | <-> | string |
list (except alists) | <-> | array |
alist (dotted pairs) | -> | Invalid JSON array |
alist (undotted pairs) | <-> | nested array |
hash-table | -> | object |
CLOS object | <- | object |
standard object | -> | need to write a method |
Decoding
Boost-json has different functions for decoding from strings (json-decode) or streams (json-read). JSON arrays are decoded to CL lists and JSON objects are decoded to a CLOS object with slot-value names interned in the boost-json package.
I find its handling of nil and false a bit confusing. Consider the following examples:
(boost-json:json-decode "[false]") (NIL) (boost-json:json-decode "{\"A\":false}") #<BOOST-JSON:JSON-OBJECT {"A":null}>
Within an array, JSON's 'false' is converted to CL nil, but within a JSON object, JSON's 'false' is converted to CL :null. Why?
We also have a symmetry problem with respect to JSON's 'false'. See the following:
(boost-json:json-encode (boost-json:json-decode "{\"A\":false}")) {"A":null}
The starting point with 'false' got converted to null.
Boost-json does not handle unicode surrogate pairs if you care about that sort of thing.
Decoding to CLOS object
Now let's talk about boost-json's automagic decoding of JSON objects to a json-object, which is a standard CLOS object. Let's start with a simple version before we go to nested objects.
(boost-json:json-decode "{ \"weights\" : [ 0.5, 0.5 ] }")
#<BOOST-JSON:JSON-OBJECT {"weights":#}>

(describe (boost-json:json-decode "{ \"weights\" : [ 0.5, 0.5 ] }"))
#<BOOST-JSON:JSON-OBJECT {"weights":#}>
  [standard-object]
Slots with :INSTANCE allocation:
  MEMBERS = (("weights" (0.5 0.5)))
It appears that you need to call (boost-json:json-getf obj key), which acts as an accessor for the automagically built CLOS object:
(boost-json:json-getf (boost-json:json-decode "{ \"weights\" : [ 0.5, 0.5 ] }") "weights") (0.5 0.5)
Or, of course, you can use slot-value; just remember that the slot name will be interned in the boost-json package.
Nested Objects
Now let's look at a nested object. As a reminder, we will use the following parameter:
(defparameter *nested-address-1* "{ \"first_name\": \"George\", \"last_name\": \"Washington\", \"birthday\": \"1732-02-22\", \"address\": { \"street_address\": \"3200 Mount Vernon Memorial Highway\", \"city\": \"Mount Vernon\", \"state\": \"Virginia\", \"country\": \"United States\" } }")
If we want to get the city from the CLOS object that boost-json created, it might look something like this:
(boost-json:json-getf (boost-json:json-getf (boost-json:json-decode *nested-address-1*) "address") "city") "Mount Vernon"
Encoding
On the plus side, boost-json was one of two libraries which could encode a pathname.
On the neutral side, data types like characters, local-time:timestamps and structs would require you to write an encoding method to handle them if you use them.
On the "choose your data structures carefully side",
- Encoding hash-tables succeeds if the hash-table keys are strings, fails if they are symbols (a short sketch follows this list)
- Encoding vectors produces invalid output, generally losing the first value in the array. Sample output on a simple nested array looked like: [,"Cork","Limerick"][,[,"Frankfurt","Munich"]]
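A minimal sketch of the hash-table case from the list above, assuming alexandria is available; the key order in the printed output depends on the hash-table:

(boost-json:json-encode
 (alexandria:alist-hash-table '(("name" . "Sabra") ("eye_colour" . "brown")) :test 'equal))
;; expected output along the lines of {"name":"Sabra","eye_colour":"brown"}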
Boost-json will try to encode alists as JSON arrays of arrays. It will generate invalid JSON if the alist has dotted pairs.
(boost-json:json-encode '(("A" . 1) ("B" . 2) ("C" . 3))) [["A",. 1],["B",. 2],["C",. 3]]
If the alist does not use dotted pairs, boost-json will give you arrays of arrays.
(boost-json:json-encode '(("A" 1) ("B" 2) ("C" 3))) [["A",1],["B",2],["C",3]]
Encoding CLOS class instances
For boost-json to encode a CLOS class instance, you need to provide a new json-write method for that class unless it is a json-object class. In the decoding examples, we saw a JSON object "{\"weights\" : [ 0.5, 0.5 ] }" get decoded to a boost-json:json-object:
(boost-json:json-encode (boost-json:json-decode "{ \"weights\" : [ 0.5, 0.5 ] }")) {"weights":[0.5,0.5]}
If we tried that with our simple person class which is not a boost-json:json-object,
(boost-json:json-encode (make-instance 'person)) ; Evaluation aborted on #<SB-PCL::NO-APPLICABLE-METHOD-ERROR {1005573E93}>.
You could use the existing methods in https://github.com/cl-boost/json/blob/main/encode.lisp or, for our very simple person class, something like the following could work:
(defmethod boost-json:json-write ((person person) &optional stream)
  (let ((accessors '(("name" name) ("eye_colour" eye-colour))))
    (write-char #\{ stream)
    (loop :for (key val) :in accessors
          :for first := t :then nil
          :unless first :do (write-char #\, stream)
          :do (boost-json::json-write key stream)
              (write-char #\: stream)
              (boost-json::json-write (funcall val person) stream))
    (write-char #\} stream)))
Symmetry
From the standpoint of symmetry or round-tripping, going from JSON->CL->JSON it only fails on dealing with 'false'.
It had more problems going from CL->JSON->CL. In some tests it triggered an error with an expected #\N, in others nil came back as nul and it could lose the first value in vectors.
Security
There were two general security issues we considered: (1) interning keywords and (2) issues with malformed data.
Boost-json did not exhibit the first issue in the same way that e.g. cl-json does. However, it does decode JSON objects to a CLOS object with slot-value names interned in the boost-json package.
With respect to the second issue, boost-json properly rejected 145 malformed test cases but accepted 28. Some of those malformed test cases which were accepted actually triggered stack exhaustion by opening too many levels of JSON open arrays and not closing them or similar types of issues.
Conformity with JSON Standard
Boost-json accepted 95 of the 95 test cases that are considered "must accept". If *read-default-float-format* is set to 'single-float, it would refuse to accept: [123e65], [123e45] and [123.456e78].
It accepted 13 of the 17 test cases considered to be part of the gray area of the JSON specification - you could accept or reject.
Benchmarking
See Benchmarking
boost-json exported symbols
- json-decode - Convert a JSON string into a Lisp object.
- json-encode - Encodes a Lisp value into a stream.
- json-read - decode from a stream
- json-enable-reader-macro - enable json-object-reader macro
- json-object
- json-object-members
- json-getf - Find an member's value in a JSON object.
- json-setf - Assign a value to a key in a JSON object.
cl-json
Library | Author | License | Website |
---|---|---|---|
cl-json | Henrik Hjelte, Boris Smilga, Robert Goldman | MIT | https://github.com/hankhero/cl-json |
cl-json is an old workhorse in this area; however, it has not been updated in the last seven years and is fairly slow, as you can tell from the benchmarks. It does not handle unicode surrogate pairs or JSON's null, and it is not as likely to be symmetric as some of the other libraries. Like most of the other libraries (except jsown and shasht), cl-json does not encode multi-dimensional arrays. As noted in the security section, cl-json interns keys into the keyword package, which can open you up to the equivalent of a DOS attack. cl-json does provide a mitigation function which will throw an error if the keyword to be interned does not already exist in the *json-symbols-package*.
(setf cl-json::*identifier-name-to-key* #'cl-json::safe-json-intern)
(cl-json:decode-json-from-string "{\"alpha-omega\": 1}")
On the plus side, cl-json probably has the best support for converting JSON data to CLOS objects and vice versa, but surprisingly it does not handle structs. Local-time:timestamps are returned as JSON objects {"day":7990,"sec":0,"nsec":0} rather than as javascript date objects. Unlike many of the other libraries, it handles symbols without you having to write a new method. It also allows incremental encoding.
With respect to conformity testing, cl-json was 95/95 for the JSON strings it must accept. It did not do so well rejecting malformed strings and, like most of the other libraries could exhaust the stack when facing certain types of malformed strings.
When you are looking at plists and alists, cl-json provides specific functions for dealing with those structures rather than attempting to guess how they should be translated into JSON.
Issue | Comments |
---|---|
Symmetry violations issue 22, issue 4 | |
Decoding JSON with duplicate keys get returned rather than flagged issue 16 | |
Does not handle unicode surrogate pairs issue 11 | |
Does not handle null properly |
Default Mapping
Please note the direction of the arrows in the following table.
Lisp | JSON | |
---|---|---|
integer | <-> | number with no frac or exp parts |
float | <-> | number with frac or exp parts |
rational | -> | number with frac or exp parts |
ratio | -> | number with frac or exp parts |
T | <-> | true |
nil | <-> | null |
nil | <- | false |
other symbol | -> | string |
character | -> | string |
string | <-> | string |
list (except alists) | <-> | array (1) |
other sequences | -> | array |
alist with dotted pairs | <-> | object (1) |
hash-table | -> | object |
standard object | -> | object |
- (1) This is cl-json's default mode. Using cl-json:with-decoder-simple-clos-semantics or cl-json:set-decoder-simple-clos-semantics will switch cl-json into a mode where JSON arrays are decoded to CL vectors rather than lists, and JSON objects are decoded to CLOS objects rather than alists.
Decoding
Cl-json uses different functions to decode from a string (decode-json-from-string x) vs. decoding from a stream (decode-json x).
It converts JSON's 'null' to NIL, which I disagree with. It also fails to deal with unicode surrogate pairs if you care about those.
JSON objects are converted to alists with dotted pairs, which is unusual compared with the rest of the libraries.
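For example, with the default simple-list semantics, decoding looks something like this (a hedged sketch; the exact keyword produced for each key depends on *json-identifier-name-to-lisp*):

(cl-json:decode-json-from-string "{\"a\": 1, \"b\": [true, null]}")
;; expected: ((:A . 1) (:B T NIL))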
JSON data to CLOS Object
While cl-json normally returns JSON objects as alists, you can tell it to return the JSON object as a cl-json:fluid-class CLOS object. You do need to at least temporarily change the decoder to use simple-clos-semantics and set *json-symbols-package* to nil. (It should be noted that since the decoder maintains a class registry, this is thread unsafe. According to the docs, if every incoming JSON Object is guaranteed to have a prototype with a "lispClass" member then there are no fluid objects and thread safety is ensured. If the user wishes to employ fluid objects in a threaded environment, it is advisable to wrap the body of entry-point functions in with-local-class-registry.)
(cl-json:set-decoder-simple-clos-semantics)
You can reset the decoder back to lists with the function:
(set-decoder-simple-list-semantics)
This example temporarily changes the cl-json decoder semantics so that it creates a cl-json:fluid class, then we can get the birthday slot value of that class.
For example purposes, consider two JSON objects, *address-1* and *nested-address-1*:
*address-1* "{ \"name\": \"George Washington\", \"birthday\": \"February 22, 1732\", \"address\": \"Mount Vernon, Virginia, United States\" }" *nested-address-1* "{ \"first_name\": \"George\", \"last_name\": \"Washington\", \"birthday\": \"1732-02-22\", \"address\": { \"street_address\": \"3200 Mount Vernon Memorial Highway\", \"city\": \"Mount Vernon\", \"state\": \"Virginia\", \"country\": \"United States\" } }"
First, looking at the simpler version, notice you need to specify the slots in the fluid-class object:
(cl-json:with-decoder-simple-clos-semantics
  (setf cl-json:*json-symbols-package* nil)
  (let ((x (cl-json:decode-json-from-string *address-1*)))
    (with-slots (name birthday address) x
      birthday)))
"February 22, 1732"
Just to check something, let's describe that instance of the fluid-class:
(cl-json:with-decoder-simple-clos-semantics
  (setf cl-json:*json-symbols-package* nil)
  (let ((x (cl-json:decode-json-from-string *address-1*)))
    (describe x)))
<#<JSON:FLUID-CLASS COMMON-LISP:NIL {100319AEA3}> {1003358763}>
  [standard-object]
Slots with :INSTANCE allocation:
  NAME = "George Washington"
  BIRTHDAY = "February 22, 1732"
  ADDRESS = "Mount Vernon, Virginia, United States"
Now looking at the nested version, we need to note that by default cl-json will convert the underscores in the JSON keys to double hyphens in the slot names.
(cl-json:with-decoder-simple-clos-semantics (setf cl-json:*json-symbols-package* nil) (let ((x (cl-json:decode-json-from-string *nested-address-1*))) (with-slots (first--name last--name birthday address) x (values x first--name last--name birthday address)))) #<#<JSON:FLUID-CLASS COMMON-LISP:NIL {100F765713}> {10107CF9E3}> "George" "Washington" "1732-02-22" #<#<JSON:FLUID-CLASS COMMON-LISP:NIL {100F765713}> {10107CF6F3}>
Because we have a nested class, we would need to drill down and specify the slots for the sub-object as well:
(cl-json:with-decoder-simple-clos-semantics (setf cl-json:*json-symbols-package* nil) (let ((x (cl-json:decode-json-from-string *nested-address-1*))) (with-slots (first--name last--name birthday address) x (with-slots (street--address city state country) address (values x first--name last--name birthday address city))))) #<#<JSON:FLUID-CLASS COMMON-LISP:NIL {100F765713}> {1010E69B93}> "George" "Washington" "1732-02-22" #<#<JSON:FLUID-CLASS COMMON-LISP:NIL {100F765713}> {1010E698A3}> "Mount Vernon"
We can also use slot values to get the info, but before we do that, let's use the nested-address sample data and just describe the object instance.
(cl-json:with-decoder-simple-clos-semantics
  (setf cl-json:*json-symbols-package* nil)
  (let ((x (cl-json:decode-json-from-string *nested-address-1*)))
    (describe x)))
#<#<JSON:FLUID-CLASS COMMON-LISP:NIL {100319AEA3}> {10039F9CC3}>
  [standard-object]
Slots with :INSTANCE allocation:
  NAME = #<unbound slot>
  BIRTHDAY = "1732-02-22"
  ADDRESS = #<#<JSON:FLUID-CLASS COMMON-LISP:NIL {100319AEA3}> {100388FC13}>
  STREET--ADDRESS = #<unbound slot>
  CITY = #<unbound slot>
  STATE = #<unbound slot>
  COUNTRY = #<unbound slot>
  FIRST--NAME = "George"
  LAST--NAME = "Washington"
OK, this surprised me. The fluid-class is showing all the slots it created from *address-1* as well as the slots it created from *nested-address-1*. We also see that the keys "first_name", "last_name" and "street_address" have double hyphens when they are slot names, and the fluid class created slots for the embedded address object.
So, just to demonstrate using slot-value to get the data from a fluid object:
(cl-json:with-decoder-simple-clos-semantics (setf cl-json:*json-symbols-package* nil) (let ((x (cl-json:decode-json-from-string *nested-address-1*))) (slot-value x 'first--name))) "George"
Now suppose we want to go into the nested address and get just the city. For that we need to descend down, creating another cl-json:fluid object and then access its city slot value:
(cl-json:with-decoder-simple-clos-semantics (setf cl-json:*json-symbols-package* nil) (let ((x (cl-json:decode-json-from-string *nested-address-1*))) (with-slots (first--name last--name birthday address) x (with-slots (street--address city state country) address city)))) "Mount Vernon"
Or, using the slot-value approach:
(cl-json:with-decoder-simple-clos-semantics (setf cl-json:*json-symbols-package* nil) (let ((x (cl-json:decode-json-from-string *nested-address-1*))) (slot-value (slot-value x 'address) 'city))) "Mount Vernon"
Encoding
Basic encoding functionality is provided by the generic function encode-json. This can be customised with an entire series of macros as listed in the documentation. cl-json's basic encoding function returns an object when handed an alist and returns an array when handed a plist. When handed a list of plists or a list of alists, encode-json returns an array: the list of plists becomes an array of arrays and the list of alists becomes an array of objects.
cl-json provides a function to encode an alist properly as key:value pairs, but my samples do not show any difference between cl-json:encode-json and cl-json:encode-json-alist. It will encode a dotted alist as a JSON object with key:value pairs and a proper-list alist as an array of arrays.
(cl-json:encode-json '(("A" . 1) ("B" . 2) ("C" . 3))) {"A":1,"B":2,"C":3} (cl-json:encode-json '(("A" 1) ("B" 2) ("C" 3))) [["A",1],["B",2],["C",3]]
On the plus side,
- cl-json was the only library to handle encoding char out of the box.
On the "be care side:
- As you might expect, plists are treated the same as plain lists and will lose their key-value connections. If you want to keep the key-value connections, you can either convert the list to an alist or hash-table, or use the cl-json:encode-json-plist function (see the sketch below).
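A minimal sketch using the string-returning variant, encode-json-plist-to-string; the exact key casing depends on *lisp-identifier-name-to-json* (the default converts lisp-style names to camel case):

(cl-json:encode-json-plist-to-string '(:name "Sabra" :eye-colour "brown"))
;; expected: "{\"name\":\"Sabra\",\"eyeColour\":\"brown\"}"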
On the "slightly additional work" side:
- Encoding structures, pathnames and timestamps would require writing a specialized method
On the not-so-plus side:
- It encodes nil as null. You can use the helper library cl-json-helper to encode nil as "false".
Incremental Encoding
The following examples use two exercises. First, incrementally build a JSON array. Second, incrementally build a JSON object which also contains an incrementally built JSON array.
(cl-json:with-array () (dotimes (i 3) (cl-json:encode-array-member i))) [0,1,2]
Now the second:
(cl-json:with-object () (cl-json:encode-object-member "hello" "hu hu") (cl-json:as-object-member ("harr") (cl-json:with-array () (dotimes (i 3) (cl-json:encode-array-member i))))) {"hello":"hu hu","harr":[0,1,2]}
Symmetry
From the standpoint of symmetry or round-tripping, going from JSON->CL->JSON, it only fails on dealing with false.
Similarly, going from CL->JSON->CL, you would need to deal with the fact that :NULL got converted to "null".
Security
There were two general security issues we considered: (1) interning keywords and (2) issues with malformed data.
With respect to the first issue, cl-json flags the issue and provides the function safe-json-intern, which will throw an error if the keyword to be interned does not already exist in *json-symbols-package*, so at least you are warned and provided with an alternative.
With respect to the second issue, cl-json properly rejected 153 malformed test cases but accepted 20. Some of those malformed test cases which were accepted actually triggered stack exhaustion by opening too many levels of JSON open arrays and not closing them or similar types of issues.
Conformity with JSON Standard
cl-json accepted 95 of the 95 test cases that are considered "must accept". If *read-default-float-format* is set to 'single-float, it would refuse to accept: [123e65], [123e45] and [123.456e78].
It accepted 13 of the 17 test cases considered to be part of the gray area of the JSON specification - you could accept or reject.
Benchmarking
See Benchmarking
Miscellaneous Information
- Error Conditions Cl-json has several error conditions, some of them recoverable and some of them not. These include "unrecoverable-value-error", "json-syntax-error", "no-char-for-code", "cell-error", "type-error", errors for calling functions in the wrong environment, and others. Please read the user manual for more details.
- Cl-json has a converter from camel case to "lisp" (i.e., kebab case) and back again (a short sketch follows this list).
- Cl-json has a lot of other capabilities. The documentation is excellent and you should seriously consider the security considerations section of the user manual if you are going to be decoding uncontrolled JSON objects.
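As a quick illustration of the case converters mentioned above (a hedged sketch; verify the exact output on your version):

(cl-json:camel-case-to-lisp "firstName")   ;; expected: "FIRST-NAME"
(cl-json:lisp-to-camel-case "first-name")  ;; expected: "firstName"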
cl-json exported symbols
- *aggregate-scope-variables*
- *array-member-handler*
- *array-scope-variables*
- *beginning-of-array-handler*
- *beginning-of-object-handler*
- *beginning-of-string-handler*
- *boolean-handler*
- *end-of-array-handler*
- *end-of-object-handler*
- *end-of-string-handler*
- *identifier-name-to-key* - Designator for a function which, during decoding, maps the *json-identifier-name-to-lisp* -transformed key to the value it will have in the result object.
- *integer-handler*
- *internal-decoder*
- *json-array-type*
- *json-identifier-name-to-lisp* - Designator for a function which maps string (a JSON Object key) to string (name of a Lisp symbol).
- *json-input* - The default input stream for decoding operations.
- *json-output* - The default output stream for encoding operations.
- *json-symbols-package* - The package where JSON Object keys etc. are interned. Default keyword, nil = use current package.
- *lisp-identifier-name-to-json* - Designator for a function which maps string (name of a Lisp symbol) to string (e. g. JSON Object key).
- *object-key-handler*
- *object-scope-variables*
- *object-value-handler*
- *prototype-name*
- *real-handler*
- *string-char-handler*
- *string-scope-variables*
- *use-strict-json-rules* - If non-nil, signal error on unrecognized escape sequences in JSON Strings. If nil, translate any such sequence to the char after slash.
- as-array-member - BODY should be a program which encodes exactly one JSON datum to STREAM. AS-ARRAY-MEMBER ensures that the datum is properly formatted as a Member of an Array, i. e. separated by comma from any preceding or following Member.
- as-object-member - BODY should be a program which writes exactly one JSON datum to STREAM. AS-OBJECT-MEMBER ensures that the datum is properly formatted as a Member of an Object, i. e. preceded by the (encoded) KEY and colon, and separated by comma from any preceding or following Member.
- bignumber-string
- bind-custom-vars
- camel-case-to-lisp - Take a camel-case string and convert it into a string with Lisp-style hyphenation.
- clear-class-registry - Reset the *CLASS-REGISTRY* to NIL.
- current-decoder - Capture current values of custom variables and return a custom decoder which restores these values in its dynamic environment.
- custom-decoder - Return a function which is like DECODE-JSON called in a dynamic environment with the given CUSTOMIZATIONS.
- decode-json - Read a JSON Value from STREAM and return the corresponding Lisp value.
- decode-json-from-source - Decode a JSON Value from source using the value of decoder (default 'decode-json) as decoder function. If the source is a string, the input is from this string; if it is a pathname, the input is from the file that it names; otherwise, a stream is expected as source.
- decode-json-from-string - Read a JSON Value from json-string and return the corresponding Lisp value.
- decode-json-strict - Same as decode-json, but allow only Objects or Arrays on the top level, no junk afterwards.
- encode-array-member - Encode OBJECT as the next Member of the innermost JSON Array opened with WITH-ARRAY in the dynamic context. OBJECT is encoded using the ENCODE-JSON generic function, so it must be of a type for which an ENCODE-JSON method is defined.
- encode-json - Write a JSON representation of OBJECT to STREAM and return NIL.
- encode-json-alist - Write the JSON representation (Object) of alist to stream (or to json-output). Return nil.
- encode-json-alist-to-string - Return the JSON representation (Object) of alist as a string
- encode-json-plist - Write the JSON representation (Object) of plist to stream (or to json-output). Return nil.
- encode-json-plist-to-string - Return the JSON representation (Object) of plist as a string.
- encode-json-to-string - Return the JSON representation of object as a string.
- encode-object-member - Encode KEY and VALUE as a Member pair of the innermost JSON Object opened with WITH-OBJECT in the dynamic context. KEY and VALUE are encoded using the ENCODE-JSON generic function, so they both must be of a type for which an ENCODE-JSON method is defined. If KEY does not encode to a String, its JSON representation (as a string) is encoded over again.
- fluid-class - A class to whose instances arbitrary new slots may be added on the fly.
- fluid-object
- json-bind
- json-bool - Intended for the JSON-EXPLICT-ENCODER. Converts a non-nil value to a value (:true) that creates a JSON true value when used in the explict encoder. Or (:false).
- json-decode
- json-enable-reader-macro
- json-encode
- json-getf
- json-intern - Intern STRING in the current *JSON-SYMBOLS-PACKAGE*.
- json-object
- json-object-members
- json-or-null - Intended for the JSON-EXPLICT-ENCODER. Returns a non-nil value as itself, or a nil value as a JSON null-value
- json-read
- json-setf
- json-syntax-error - Signal a JSON-SYNTAX-ERROR condition
- lisp-to-camel-case - Take a string with Lisp-style hyphenation and convert it to camel case. This is an inverse of CAMEL-CASE-TO-LISP.
- make-object - If CLASS is not NIL, create an instance of that class. Otherwise, create a fluid object whose class has the given SUPERCLASSES (null list by default). In either case, populate the resulting object using BINDINGS (an alist of slot names and values).
- make-object-prototype - Return a PROTOTYPE describing the OBJECT's class or superclasses, and the package into which the names of the class / superclasses and of the OBJECT's slots are to be interned.
- no-char-for-code
- pass-code
- placeholder
- prototype - A PROTOTYPE contains metadata for an object's class in a format easily serializable to JSON: either the name of the class as a string or (if it is anonymous) the names of the superclasses as a list of strings; and the name of the Lisp package into which the names of the class's slots and the name of the class / superclasses are to be interned.
- rational-approximation
- safe-json-intern - The default json-intern is not safe. Interns of many unique symbols could potentially use a lot of memory. An attack could exploit this by submitting something that is passed through cl-json that has many very large, unique symbols. This version is safe in that respect because it only allows symbols that already exists.
- set-custom-vars
- set-decoder-simple-clos-semantics - Set the decoder semantics to the following:
- Strings and Numbers are decoded naturally, reals becoming floats.
- The literal name true is decoded to T, false and null to NIL.
- Arrays are decoded to sequences of the type *JSON-ARRAY-TYPE*.
- Objects are decoded to CLOS objects. Object keys are converted by the function *JSON-IDENTIFIER-NAME-TO-LISP*. If a JSON Object has a field whose key matches *PROTOTYPE-NAME*, the class of the CLOS object and the package wherein to intern slot names are inferred from the corresponding value which must be a valid prototype. Otherwise, a FLUID-OBJECT is constructed whose slot names are interned in *JSON-SYMBOLS-PACKAGE*
- set-decoder-simple-list-semantics - Set the decoder semantics to the following:
- Strings and Numbers are decoded naturally, reals becoming floats.
- The literal name true is decoded to T, false and null to NIL.
- Arrays are decoded to sequences of the type *JSON-ARRAY-TYPE*.
- Objects are decoded to alists. Object keys are converted by the function *JSON-IDENTIFIER-NAME-TO-LISP* and then interned in the package *JSON-SYMBOLS-PACKAGE*.
- simplified-camel-case-to-lisp - Insert - between lowercase and uppercase chars. Ignore _ + * and several consecutive uppercase.
- stream-array-member-encoder - Return a function which takes an argument and encodes it to STREAM as a Member of an Array. The encoding function is taken from the value of ENCODER (default is #'ENCODE-JSON).
- stream-object-member-encoder - Return a function which takes two arguments and encodes them to STREAM as a Member of an Object (String : Value pair)
- substitute-char
- substitute-printed-representation
- unencodable-value-error - Signal an UNENCODABLE-VALUE-ERROR
- unknown-symbol-error
- use-explicit-encoder
- use-guessing-encoder
- with-array - Open a JSON Array, run BODY, then close the Array. Inside the BODY, AS-ARRAY-MEMBER or ENCODE-ARRAY-MEMBER should be called to encode Members of the Array.
- with-custom-decoder-level - Execute BODY in a dynamic environment such that, when nested structures are decoded, the outermost level is decoded with the given custom handlers (CUSTOMIZATIONS) whereas inner levels are decoded in the usual way
- with-decoder-simple-clos-semantics - Execute BODY in a dynamic environment where the decoder semantics is such as set by SET-DECODER-SIMPLE-CLOS-SEMANTICS.
- with-decoder-simple-list-semantics - Execute BODY in a dynamic environment where the decoder semantics is such as set by SET-DECODER-SIMPLE-LIST-SEMANTICS.
- with-explicit-encoder
- with-guessing-encoder
- with-local-class-registry - Run BODY in a dynamic environment where *CLASS-REGISTRY* is a temporary local list. If :INHERIT is non-null, the local registry shall initially have the same content as the exterior *CLASS-REGISTRY*, otherwise it shall be NIL.
- with-object - Open a JSON Object, run BODY, then close the Object. Inside the BODY, AS-OBJECT-MEMBER or ENCODE-OBJECT-MEMBER should be called to encode Members of the Object.
- with-shadowed-custom-vars
- with-substitute-printed-representation-restart - Establish a SUBSTITUTE-PRINTED-REPRESENTATION restart for OBJECT and execute BODY.
com.gigamonkeys.json
Library | Author | License | Website |
---|---|---|---|
com.gigamonkeys.json | Peter Seibel | BSD-3 | https://github.com/gigamonkey/monkeylib-json |
Com.gigamonkeys.json is one of the oldest libraries. In fact the author commented that he had forgotten he had written a JSON library. It is included for completeness, but I think the newer libraries have passed it by.
Default Mapping
Please note the direction of the arrows.
Lisp | JSON | |
---|---|---|
integer | <-> | number with no frac or exp parts |
float | <-> | number with frac or exp parts |
rational | -> | number with frac or exp parts |
ratio | -> | number with frac or exp parts |
T | <-> | true |
nil | <-> | {} |
:FALSE | <- | false |
:NULL | <- | null |
other symbol | -> | Only keywords allowed. string |
character | -> | Hangs |
string | <-> | string |
list (except alists) | <-> | object (will force plist key:values) |
vector | <-> | array |
alist | -> | Error |
hash-table | -> | object |
standard object | -> | object |
Decoding
Com.gigamonkeys.json only takes strings as inputs. It does not handle unicode surrogate pairs. On the plus side, it handles NULL issues correctly.
As noted in the mapping table above, JSON objects will be decoded as plists (or nested plists). Consider the following example of decoding a nested JSON object:
(com.gigamonkeys.json:parse-json "{ \"first_name\": \"George\", \"last_name\": \"Washington\", \"birthday\": \"1732-02-22\", \"address\": { \"street_address\": \"3200 Mount Vernon Memorial Highway\", \"city\": \"Mount Vernon\", \"state\": \"Virginia\", \"country\": \"United States\" } }") ("first_name" "George" "last_name" "Washington" "birthday" "1732-02-22" "address" ("street_address" "3200 Mount Vernon Memorial Highway" "city" "Mount Vernon" "state" "Virginia" "country" "United States"))
Arrays are decoded to vectors.
Encoding
There were a few surprises looking at what com.gigamonkeys.json would encode and not encode.
- It encodes keyword symbols, but not other symbols.
- It actually hangs on encoding chars and CLOS objects, which I found strange. Most of the other libraries generated errors.
- Attempting to encode a pathname also results in hanging
- nil encodes to {} (yes, empty object)
- Like some other libraries, it cannot deal with alists directly. Consider using alexandria:alist-hash-table to convert the alist to a hash table (a short sketch follows this list).
- On the other hand, in a reversal from some of the other libraries, com.gigamonkeys.json will assume a plain list is a plist, returning a JSON object with key-value pairs. If the length of the list is odd, the final value in the list will be treated as a key and an empty object will be inserted as the value.
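A minimal sketch of the alist workaround mentioned above, assuming alexandria is loaded; key order in the output depends on the hash-table:

(com.gigamonkeys.json:json
 (alexandria:alist-hash-table '(("a" . 1) ("b" . 2)) :test 'equal))
;; expected: "{\"a\":1,\"b\":2}" (possibly with the keys in the other order)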
Let's take a closer look at the encoding functions. As noted previously, com.gigamonkeys.json will assume a plain list is a plist, returning a JSON object with key-value pairs. In the examples below, that means that since it is given a three-element list, the third value is assumed to be the second key in an object and its value is then an empty object.
Function | Input | Result |
---|---|---|
json | '(1 2 3) | "{\"1\":2,\"3\":{}}" |
to-json | '(1 2 3) | (1 2 3) |
write-json | '(1 2 3) | {"1":2,"3":{}} |
Function | Input | Result |
---|---|---|
json | #(1 2 3) | "[1,2,3]" |
to-json | #(1 2 3) | #(1 2 3) |
write-json | #(1 2 3) | [1,2,3] |
Symmetry
From the standpoint of symmetry or round-tripping, going from JSON->CL->JSON, it did not have any issues.
As you might have expected given how it treats lists, going from CL->JSON->CL resulted in failure for tests starting with lists. If you started from an array, nil turned into {}, and it could get a little excited about how many decimal places it would return for a float.
Security
There were two general security issues we considered: (1) interning keywords and (2) issues with malformed data. com.gigamonkeys.json did not exhibit the first issue.
With respect to the second issue, com.gigamonkeys.json accepted many malformed test cases which triggered stack exhaustion by opening too many levels of JSON open arrays and not closing them or similar types of issues.
Conformity with JSON Standard
com.gigamonkeys.json accepted all 95 test cases that are considered "must accept".
It accepted 14 of the 17 test cases considered to be part of the gray area of the JSON specification - you could accept or reject.
Benchmarking
See Benchmarking
com.gigamonkeys.json exported symbols
- *object-type*
- json - The top-level function for converting Lisp objects into a string in the JSON format. It can convert any object that can be converted to a json-exp via the to-json generic function.
- json-stringify - Convert object directly to a JSON representation as a string. Default methods are provided for strings, symbols (which must be keywords), and numbers but there may be situations where it is appropriate to define new methods on this function. In general, however, it is probably better to define a method on to-json to convert the object to a sexp that can be rendered as JSON.
- parse-json - Parse JSON text into Lisp objects. Hash tables are used to represent Javascript objects and vectors to represent arrays.
- to-json - Generic function that can convert an arbitrary Lisp object to a json-exp, i.e. a sexp that can then be rendered as JSON. To make an arbitrary class convertable to JSON, add a method to this generic function that generates a json-exp.
- write-json - Write data to stream in JSON format.
com.inuoe.jzon
Library | Author | License | Website | Comments |
---|---|---|---|---|
com.inuoe.jzon | Wilfredo Velázquez-Rodríguez | MIT | https://github.com/Zulu-Inuoe/jzon | |
Com.inuoe.jzon is my new overall favorite. It is fast, it handles null correctly, it encodes all kinds of lists, CLOS objects, structures and hash-tables.
Default Mapping
Type Mapping JSON -> CL
JSON | CL |
---|---|
true | t |
false | nil |
null | null |
number | integer or double float |
string | simple-string |
array | simple-vector |
object | hash-table using equal as the test function |
Type Mapping CL -> JSON
Using the stringify function, com.inuoe.jzon will map the following CL data types.
CL | JSON |
---|---|
symbol | string (generally downcased unless they contain mixed case characters) |
numbers | number |
alist | object |
plist | object |
list or sequence | array |
CLOS objects | object (using bound slots as keys) |
structures | object |
Decoding
Acceptable Input for Parse
com.inuoe.jzon will accept strings, octets in utf8, stream characters or binary in utf8, or a pathname (at which point the library will assume you want it to open a file for reading).
The README also notes that (parse …) accepts the following keyword arguments (a short sketch follows the list):
- :allow-comments This allows the given JSON to contain //cpp-style comments
- :maximum-depth This controls the maximum depth to allow arrays/objects to nest. Can be a positive integer, or nil to disable depth tests. This turned out to be important when dealing with deliberately malformed incoming JSON data because the library was able to call a halt well before the stack was exhausted.
- :key-fn A function of one argument responsible for 'interning' object keys. Should accept a simple-string and return the 'interned' key. The :key-fn parameter could be #'alexandria:make-keyword if you wanted to make object keys into symbols (but this is a bad practice from a security standpoint if the incoming JSON data is uncontrolled).
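A short sketch combining these parameters; the sample string and the depth limit are illustrative values only:

(com.inuoe.jzon:parse "{\"a\": [1, 2, {\"b\": null}]}"
                      :maximum-depth 32
                      :allow-comments t
                      :key-fn #'identity)
;; expected: a hash-table mapping "a" to #(1 2 #<hash-table>), with the inner
;; hash-table mapping "b" to NULL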
Encoding
com.inuoe.jzon can encode to either a stream or string. E.g.:
(com.inuoe.jzon:stringify '("A" "b" 4 3.2 9/4) :stream *standard-output*) ["A","b",4,3.2,2.25] (com.inuoe.jzon:stringify '("A" "b" 4 3.2 9/4)) "[\"A\",\"b\",4,3.2,2.25]"
Additional Keyword Parameters for Stringify
- :pretty If true, output pretty-formatted JSON
- :coerce-element A function for coercing 'non-native' values to JSON.
- :coerce-key A function for coercing key values to strings (see the sketch below).
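For instance, a hedged sketch of :coerce-key controlling how symbol keys are written when stringifying a hash-table (the default coercion may already behave similarly):

(let ((ht (make-hash-table :test 'equal)))
  (setf (gethash :user-name ht) "Sabra")
  (com.inuoe.jzon:stringify ht :coerce-key (lambda (k) (string-downcase (string k)))))
;; expected: "{\"user-name\":\"Sabra\"}"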
On the plus side,
- It automatically handles standard CLOS objects and also allows you to specialize.
- It is one of only two libraries which can encode a structure
On the "pay attention to your data structures" side: Recent change: com.inuoe.jzon:stringify will no longer encode an alist as a JSON object. In dropping its heuristics, com.inuoe.jzon has, at least temporarily, lost the ability to handle dotted pairs.
Encoding Objects
com.inuoe.jzon actually uses the :type of each slot to determine what to do with nil. Consider the following where two slots have no specified type, two have a list type and two have an array type. Specifically, consider the resulting difference between value-list0 and value-array0 when writing the object out:
(defclass my-class ()
  ((name :initarg :name)
   (value0 :initarg :value0)
   (value-list0 :initarg :value-list0 :type list)
   (value-array0 :initarg :value-array0 :type array)
   (value1 :initarg :value1)
   (value-list1 :initarg :value-list1 :type list)
   (value-array1 :initarg :value-array1 :type array)))

(let ((obj (make-instance 'my-class
                          :name "Name"
                          :value0 nil
                          :value-list0 nil
                          :value-array0 nil
                          :value1 1
                          :value-list1 '(1 2 3)
                          :value-array1 #("a" "b" "c"))))
  (stringify obj))

"{ \"name\":\"Name\", \"value0\":null, \"value-list0\":[], \"value-array0\":null, \"value1\":1, \"value-list1\":[1,2,3], \"value-array1\":[\"a\",\"b\",\"c\"] }"
Specialize Serialization
com.inuoe.jzon allows you to specialize the coerced-fields method to handle CL data types not included in the above list, as well as to exclude, rename and add fields. See the README for examples.
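A minimal sketch of such a specialization for the simple person class used elsewhere in this review; the (name value type) triples follow the coerced-fields description in the exported symbols list below:

(defmethod com.inuoe.jzon:coerced-fields ((person person))
  ;; Rename eye-colour to eye_colour and expose only these two fields.
  (list (list "name" (name person) 'string)
        (list "eye_colour" (eye-colour person) 'string)))

(com.inuoe.jzon:stringify (make-instance 'person))
;; expected: "{\"name\":\"Sabra\",\"eye_colour\":\"brown\"}"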
Incremental Encoding
The first simple example is just incrementally writing an array. We will set :pretty to nil.
(com.inuoe.jzon:with-writer* (:stream *standard-output* :pretty nil) (com.inuoe.jzon:begin-array*) (dotimes (i 3) (com.inuoe.jzon:write-value* i)) (com.inuoe.jzon:end-array*)) [0,1,2]
The second example borrows from the readme documentation and incrementally writes a JSON object:
(com.inuoe.jzon:with-writer* (:stream *standard-output* :pretty t)
  (com.inuoe.jzon:with-object*
    (com.inuoe.jzon:write-key* :age)
    (com.inuoe.jzon:write-value* 24)
    (com.inuoe.jzon:write-property* :colour :blue)
    (com.inuoe.jzon:write-properties* :outside nil
                                      :interests #()
                                      :talent 'null)
    (com.inuoe.jzon:write-key* "an-array")
    (com.inuoe.jzon:with-array*
      (com.inuoe.jzon:write-values* :these :are :elements))
    (com.inuoe.jzon:write-key* "another array")
    (com.inuoe.jzon:write-array* :or "you" "can use this")))
{
  "age": 24,
  "colour": "BLUE",
  "outside": false,
  "interests": [],
  "talent": null,
  "an-array": [
    "THESE",
    "ARE",
    "ELEMENTS"
  ],
  "another array": [
    "OR",
    "you",
    "can use this"
  ]
}
Symmetry
From the standpoint of symmetry or round-tripping, going from JSON->CL->JSON, it did not have any issues. It struggled a bit more with respect to going from CL->JSON->CL. The first test had a starting point of:
((:NAME "George Washington") (:BIRTHDAY "February 22, 1732") (:ADDRESS "Mount Vernon, Virginia, United States"))
and a result (adjusted to get the elements out of the hash-table) of:
(("address" . #("Mount Vernon, Virginia, United States")) ("birthday" . #("February 22, 1732")) ("name" . #("George Washington")))
So we went from undotted alist to dotted alist where each value was an array instead of a string. The second test started with an array instead of an alist and com.inuoe.jzon handled it with the one caveat that :NULL turned into "NULL".
Security
There were two general security issues we considered: (1) interning keywords and (2) issues with malformed data. com.inuoe.jzon did not exhibit the first issue.
With respect to the second issue, com.inuoe.jzon rejected all the malformed test cases. It was one of four packages which allowed you to limit the depths to which it would dive in nested JSON objects and, therefore, did not trigger the stack exhaustion exhibited by most of the other packages.
Conformity with JSON Standard
com.inuoe.jzon accepted all 95 test cases that are considered "must accept".
It accepted 7 of the 17 test cases considered to be part of the gray area of the JSON specification - you could accept or reject.
Benchmarking
See Benchmarking
com.inuoe.jzon exported symbols
- coerce-element - Coerce "element" into a "json-element", using "coerce-key" in cases the result is a hash-table.
- coerce-key - Coerce "key" into a string designator, or "nil" if "key" is an unsuitable key.
- coerced-fields - Return a list of key definitions for "element". A key definition is a three-element list of the form (name value type). Name is the key name and will be coerced if not already a string. Value is the value, and will be coerced if not a json-element. Type is a type for the key, in order to handle ambiguous "nil" interpretations.
- json-atom - a type definition including t, nil, null, real and string
- json-element - a type definition including json-atoms, vectors and hash-tables
- json-eof-error - a json-parse-error
- json-error - a simple condition
- json-parse-error - a json-error condition
- parse - Read a JSON value from `in', which may be a vector, a stream, or a pathname. Keyword parameters: :maximum-depth controls the maximum depth of the object/array nesting. :allow-comments controls whether or not single-line // comments are allowed. :key-fn is a function of one value which 'pools' object keys, or null for the default pool.
- stringify - Serialize "element" into JSON. Returns a fresh string if "stream" is nil, nil otherwise. ":stream" like the "destination" in "format" ":pretty" if true, pretty-format the output ":coerce-element" is a function of two arguments, and is used to coerce an unknown value to a "json-element" ":coerce-key" is a function of one argument, and is used to coerce object keys into non-nil string designators. See "coerce-element" and "coerce-key".
jonathan
Library | Author | License | Website |
---|---|---|---|
jonathan | Rudolph Miller | MIT | https://github.com/Rudolph-Miller/jonathan |
While jsown is the winner in the decoding speed stakes, Jonathan can be faster or orders of magnitude slower depending on the amount of JSON data and its structure. Jonathan is among the leaders in encoding speed along with st-json, com.inuoe.jzon, and com.gigamonkeys.json. However, speed is not everything and there are a few concerning issues. It has optimize set for safety 0 to increase speed. This has bitten some users hard in the past, so you have been warned.
- As previously noted, jonathan shows substantial slow downs in decoding nested JSON objects of any real size.
- Jonathan is unable to parse floats if SAFETY is set to 2 or 3 on SBCL. See https://github.com/Rudolph-Miller/jonathan/issues/66.
Default Mapping
Please note the direction of the arrows in the following table.
Lisp | JSON | |
---|---|---|
integer | <-> | number with no frac or exp parts |
float | <-> | number with frac or exp parts |
rational | -> | number with frac or exp parts |
ratio | -> | number with frac or exp parts |
T | <-> | true |
nil | <-> | [] |
nil | <- | false |
nil | <- | null |
other symbol | -> | string |
character | -> | Error |
string | <-> | string |
list (except alists) | <-> | array |
vector | -> | array |
alist w/o dotted pairs | -> | array of arrays |
alist with dotted pairs | -> | Error |
hash-table | -> | object |
plist | <- | object (1) |
standard object | -> | Error |
- (1) Jonathan can parse a JSON object into plists (the default), alists, hash-tables or a "JSON object" by passing different keyword parameters to the parse function.
Decoding
Overview
Jonathan decodes strings, not streams. In testing, jonathan hung when trying to decode "[123e-10000000]". In its default settings, jonathan will decode a JSON null as nil, but if you set jonathan:*null-value* to :null, it will decode a JSON null properly as :null.
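A hedged sketch of that *null-value* behaviour:

(jonathan:parse "[null]")
;; expected: (NIL)

(let ((jonathan:*null-value* :null))
  (jonathan:parse "[null]"))
;; expected: (:NULL)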
Jonathan parses a JSON object by default into plists, usually in reversed key order. You can change the resulting data structure by passing :as :XXX to the parse function. E.g.
(jonathan:parse "{\"a\":1,\"b\":2}") (:|b| 2 :|a| 1) (jonathan:parse "{\"a\":1,\"b\":2}" :as :alist) (("b" . 2) ("a" . 1))
(jonathan:parse *address-1* :as :jsown) (:OBJ ("address" . "Mount Vernon, Virginia, United States") ("birthday" . "February 22, 1732") ("name" . "George Washington"))
The result is a cons cell format that appears to have originated with the jsown library.
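A hash-table target is also available per the mapping footnote above; a hedged sketch, assuming it takes the same :as keyword style as :alist and :jsown:

(let ((ht (jonathan:parse "{\"a\":1,\"b\":2}" :as :hash-table)))
  (gethash "a" ht))
;; expected: 1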
Unicode
Jonathan can parse unicode and escaped unicode characters and can return unicode characters or escaped unicode characters.
(jonathan:parse "\"\\u30b8\\u30e7\\u30ca\\u30b5\\u30f3\"") "ジョナサン" (jonathan:parse "\"\\u30b8\\u30e7\\u30ca\\u30b5\\u30f3\"" :unescape-unicode-escape-sequence nil) "u30b8u30e7u30cau30b5u30f3" (jonathan:parse "{\"name\": \"ジョナサン\"}") (:|name| "ジョナサン")
Nested JSON Objects (Filters and Subsets)
Jonathan can easily extract a subset of data from the first level of a nested object, but you need to write a recursion method if you need to extract a subset of nested data that is deeper in the JSON object.
(jonathan:parse *nested-address-1* :keywords-to-read '("first_name")) (:|first_name| "George") (jonathan:parse *nested-address-1* :keywords-to-read '("address")) (:|address| (:|country| "United States" :|state| "Virginia" :|city| "Mount Vernon" :|street_address| "3200 Mount Vernon Memorial Highway")) (jonathan:parse *nested-address-1* :keywords-to-read '("state")) nil
Allow JSON comment string
- can allow junked JSON format string (:junk-allowed t)
- can customize null-value, false-value and empty-array-value.
- can restrict keywords to read. (:keywords-to-read)
- can normalize keywords. (:keyword-normalizer)
- can not normalize keywords in nested objects.
- can ignore keywords when normalizer returns NIL.
- can unescape unicode escape sequences. (:unescape-unicode-escape-sequence)
Encoding
The basic encoding function is jonathan:to-json. Jonathan can return either a string or octets.
(jonathan:to-json '(:name "Common Lisp" :born 1984 :impls (SBCL KCL)) :octets t) #(123 34 78 65 77 69 34 58 34 67 111 109 109 111 110 32 76 105 115 112 34 44 34 66 79 82 78 34 58 49 57 56 52 44 34 73 77 80 76 83 34 58 91 34 83 66 67 76 34 44 34 75 67 76 34 93 125)
When encoding alists or plists or jsown "objects", jonathan requires extra keyword parameters like :from :alist (or :plist, :jsown)
(jonathan:to-json '((a 1) (b 2)) :from :alist) "{\"A\":[1],\"B\":[2]}"
While to-json can handle a plist without any additional parameters, to-json will throw an error, without warning, if handed an alist. This is resolved by adding the additional keyword parameters :from :alist
(jonathan:to-json '((A . 1) (B . 2) (C . 3)) :from :alist) "{\"A\":1,\"B\":2,\"C\":3}"
Jonathan expects simple-strings, so if your data source does not produce simple strings, you may have to massage the input to get there. Similarly, attempting to encode a char or a pathname will trigger an error. You should be able to write methods that would handle them. The same situation arises with respect to encoding structs.
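For instance, a minimal sketch of such a method for pathnames (my own assumption about how to write it, not something jonathan provides out of the box):

(defmethod jonathan:%to-json ((path pathname))
  ;; Encode the pathname as its namestring.
  (jonathan:%to-json (namestring path)))

(jonathan:to-json (list :file #P"/tmp/data.json"))
;; expected: "{\"FILE\":\"/tmp/data.json\"}"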
Now consider what happens when jonathan tries to encode alists with dotted pairs:
(jonathan:to-json '(("foo" . "bar") ("baz" . ((1 2 3) (4 5 6)))) :from :alist) "{\"foo\":\"bar\",\"baz\":{\"1\":[2,3],\"4\":[5,6]}}"
and now without dotted pairs.
(jonathan:to-json '(("foo" "bar") ("baz" ((1 2 3) (4 5 6)))) :from :alist) "{\"foo\":[\"bar\"],\"baz\":[{\"1\":[2,3],\"4\":[5,6]}]}"
In both situations, jonathan is trying to force key:value pairs into places you would not expect.
Encoding Clos
Jonathan can encode CLOS objects to JSON if you create a method for that class. Consider our person class:
(defclass person () ((name :initarg :name :initform "Sabra" :accessor name) (eye-colour :initarg :eye-colour :initform "brown" :accessor eye-colour)))
Creating the required method for this class is straightforward:
(defmethod jonathan:%to-json ((person person)) (jonathan:with-object (jonathan:write-key-value "name" (slot-value person 'name)) (jonathan:write-key-value "eye-colour" (slot-value person 'eye-colour))))
That then allows you to write something like this:
(let ((data (make-instance 'person))) (jonathan:to-json data)) "{\"name\":\"Sabra\",\"eye-colour\":\"brown\"}"
Incremental Encoding
The following examples use two exercises. First, incrementally build a JSON array. Second, incrementally build a JSON object which also contains an incrementally built JSON array.
(jonathan:with-output (*standard-output*) (jonathan:with-array () (dotimes (i 3) (jonathan:write-item i)))) [0,1,2]
The second is also straightforward.
(jonathan:with-output (*standard-output*) (jonathan:with-object (jonathan:write-key-value "hello" "hu hu") (jonathan:write-key "harr") (jonathan:write-value (jonathan:with-array () (dotimes (i 3) (jonathan:write-item i)))))) {"hello":"hu hu","harr":[0,1,2]}
Symmetry
From the standpoint of symmetry or round-tripping, going from JSON->CL->JSON, jonathan reversed the order of the list and converted both false and null to [].
Going from CL->JSON->CL, it is not symmetric if the alists have dotted pairs. The array test resulted in getting the array converted into a list and :NULL was converted to nil.
Security
There were two general security issues we considered: (1) interning keywords and (2) issues with malformed data. Jonathan has the first issue.
With respect to the second issue, jonathan properly rejected 128 out of 173 malformed test cases. Unfortunately it was one of the packages that accepted malformed JSON data that would trigger stack exhaustion.
Conformity with JSON Standard
jonathan accepted 94 of the 95 test cases that are considered "must accept". It signaled an overflow when trying to decode "[123.456e78]".
It accepted 9 out of the 17 test cases considered to be part of the gray area of the JSON specification - you could accept or reject. It actually hung when trying to decode the underflow number [123e-10000000].
Benchmarking
See Benchmarking
jonathan exported symbols
- %to-json - Write obj as JSON string.
- %write-char - Write character to *stream*.
- %write-string - Write string to *stream*.
- *empty-array-value* - LISP value of [].
- *empty-object-value* - LISP value of {}.
- *false-value* - LISP value of false.
- *from* - Default value of from used by #'to-json.
- *null-value* - LISP value of null.
- *octets* - Default value of octets used by #'to-json.
- <jonathan-error> - Base condition of jonathan-errors.
- <jonathan-incomplete-json-error>
- <jonathan-not-supported-error>
- <jonathan-unexpected-eof-error>
- <jonathan-without-tail-surrogate-error>
- compile-encoder - Compile encoder
- parse - Convert JSON String to LISP object.
- to-json - Convert LISP object to JSON String.
- with-array - Make writing array safe.
- with-object - Make writing object safe.
- with-output - Bind *stream* to stream.
- with-output-to-string - Output *stream* as string.
- write-item - Write item of array.
- write-key - Write key part of object.
- write-key-value - Write key and value of object.
- write-value - Write value part of object.
json-lib
Library | Author | License | Website | Comments |
---|---|---|---|---|
json-lib | Alex Nygren | MIT | https://github.com/KinaKnowledge/json-lib | Not in Quicklisp |
Json-lib describes itself as a simple JSON decoder and encoder which tries to achieve symmetry. It falls a little short when dealing with JSON's false and with unicode char codes. On the plus side, it was one of four libraries to limit depth and avoid stack exhaustion on malformed JSON data. I think it generally does what it was intended for.
Mapping
JSON | CL |
---|---|
null | nil |
false | nil |
true | T |
integer | integer |
float | double-float |
string | string |
array | vector |
object | hash-table |
Decoding
Json-lib parses utf-8 encoded JSON strings (not streams). So if you are reading a JSON-encoded file, you need to specify that the input stream use utf-8. E.g.
(json-lib:parse (alexandria:read-file-into-string "file.json" :external-format :utf8))
As with many of the libraries, it decodes a JSON 'null' as NIL. It also fails to handle unicode surrogate pairs if you care about that.
According to the README, the json-lib parser uses whitespace presence, regardless of commas, as a delimiting marker.
Nested JSON Objects
Json-lib does require that you completely parse the JSON data instead of being able to filter it while reading it in. Taking the nested JSON object below, how could we get information out of the innermost nested object?
{ "items": [ { "index": 1, "integer": 29, "float": 16.8278, "fullname": "Milton Jensen", "bool": false } ] }
You need to know your data structure so that you can figure out how to walk the tree. How would we get the value of the key "integer"? Looking at it, it is a JSON object whose key "items" contains an array which contains a JSON object.
By default, json-lib decodes JSON objects to hash-tables and arrays into vectors. So we can descend the parsed JSON tree in this particular example something like this (assuming the JSON object was in a file named json4.txt):
(gethash "integer" (aref (gethash "items" (json-lib:parse (alexandria:read-file-into-string #P"/home/sabra/json4.txt"))) 0)) 29
Encoding
Json-lib will encode to strings, not streams. Some idiosyncrasies are noted in the following points:
- It encodes keyword symbols, but other symbols get encoded to "null"
- It encodes a char to "null". You could write a method to handle char
- Ratios are encoded as "null"
- Attempting to encode a CLOS object or pathname returned "null".
- In encoding a hash-table, json-lib will be successful if the hash-table keys are strings or keyword symbols. If the keys are other symbols, the key will be rendered as "null".
Symmetry
From the standpoint of symmetry or round-tripping, going from JSON->CL->JSON, json-lib lost the unicode character in the middle of a string and converted false to null.
Going from CL->JSON->CL, it took an input of alists and converted it to vectors of vectors, with keyword symbols being turned into strings. In the second test it took an array input and the only hiccup was converting :NULL to "null".
Security
There were two general security issues we considered: (1) interning keywords and (2) issues with malformed data. json-lib did not exhibit the first issue.
With respect to the second issue, json-lib rejected 110 out of 173 malformed test cases. It was one of four packages which allowed you to limit the depths to which it would dive in nested json objects and, therefore, did not trigger the stack exhaustion exhibited by most of the other packages.
Conformity with JSON Standard
json-lib accepted 95 of the 95 test cases that are considered "must accept". If *read-default-float-format* is set to 'single-float, it would refuse to accept: [123e65], [123e45] and [123.456e78].
It accepted 12 of the 17 test cases considered to be part of the gray area of the JSON specification - you could accept or reject.
Benchmarking
See Benchmarking
Miscellaneous Comments
Json-lib has conversion functions lisp-to-snakecase, snakecase-to-lisp and lisp-to-camelcase. These will only be applied to keyword symbols.
Json-lib exported symbols
- encode-string
- parse - Given an encoded utf-8 JSON string, returns a cl structure or value. If use-keywords-for-keys is T, then hash table keys will be constructed as keywords. If an object-key-handler lambda/function is provided, this will be called for each object key and the result value used for the specific object key. By default the limit is 1000 for structural depth, but this can be set with the keyword max-depth. Exponent representation in serialized form is limited to a length of 2 to prevent huge values causing slow downs and other issues in the conversion process.
- stringify - Converts the given data structure to a stringified JSON form, suitable for serialization and other uses. An optional function can be passed for case encoding of native lisp keyword structures. If a function for unencodable-items is provided, this function will be called, and should return a JSON compliant string representing the encoded items. If no value is provided for unencodable-items, the JSON null value is used.
json-streams
Library | Author | License | Website |
---|---|---|---|
json-streams | Thomas Bakketun, Stian Sletner | GPL3 | http://github.com/rotatef/json-streams |
The author describes Json-streams advantages as:
- Separation of low-level and high-level functionality. json-streams provides a basic tokenizer for JSON, so that it's easy to build any data structure mapping one wants on top of it. All the other JSON libraries could in theory be built on it. The problem with many of them is that they've chosen a particular data mapping that very often is non-deterministic (False and [] both map to NIL, for example), or simply doesn't suit a particular use case. With JSON-streams you have full control of these things. But it also comes with a high-level API with a chosen mapping so it's ready to use.
- The API is streaming, so you don't have to process more than you want to, and you can process files of any size.
- Both Unicode and numbers are properly handled.
I am a bit surprised that something like json-streams, which advertises itself as a building block for higher-level libraries, has no doc strings. Fortunately the README appears to be fairly comprehensive. However, after trying it during the course of testing, I am not convinced that it offers enough flexibility to make up for the out-of-the-box functionality of many of the other libraries, nor does it offer any speed advantage.
Default Mapping
This is the default mapping. Do you see a distinct lack of data structures on the CL side? See the discussion under Json-streams Encoding for more info.
JSON | Lisp |
---|---|
true | T |
false | NIL |
null | :NULL |
string | string |
number | integer, float or ratio |
array | (:ARRAY … ) (1) |
object | (:OBJECT (key . value) … ) (1) |
(1) This is a cons and not a CLOS object.
Decoding
Generally speaking json-streams does what you would expect with respect to decoding. As you might expect, given the name, it can handle either stream or string input. However, consider parsing the following two JSON strings. The first is a JSON object which includes a JSON array and the second is just an array:
(json-streams:json-parse "{\"A\":false,\"B\":[false]}") (:OBJECT ("A") ("B" :ARRAY NIL)) (json-streams:json-parse "[\"B\",false]") (:ARRAY "B" NIL)
Both of the results are CONS. I do not know what I was expecting, but it probably was not that.
Encoding
I found json-streams frustrating when it comes to simple, straightforward encoding. You cannot just pass CL data to it and assume that it will work.
Data | Result | Comment |
---|---|---|
T | true | |
NIL | false | |
:null | null | |
A (symbol) | Error (1) | |
"b" | "b" | |
1 | 1 | |
1.2 | 1.2 | |
9/17 | Error (1) | |
(1 2) | Error (1) | |
((A . 1)) | Error (1) | |
(B 2) | Error (1) | |
#(1 2 3) | Error (1) | |
(MAKE-INSTANCE 'PERSON) | Error (1) |
- (1) A fell through ETYPECASE expression. Wanted one of ((OR STRING REAL) (MEMBER T NIL) (MEMBER :NULL) JSON-STREAMS:JSON-ARRAY JSON-STREAMS:JSON-OBJECT).
So how do you actually use stringify? Well,
Data | Result | Comment |
---|---|---|
(json-streams:json-stringify '(:array 1 :null t nil)) | "[1,null,true,false]" | |
(json-stringify '(:object ("a". 1) ("b" . 2) ("c". 3))) | "{\"a\":1,\"b\":2,\"c\":3}" | |
(json-stringify '(:object ("a" 1) ("b" 2) ("c" 3))) | Error (1) | Oops |
- (1) A fell through ETYPECASE expression. Wanted one of ((OR STRING REAL) (MEMBER T NIL) (MEMBER :NULL) JSON-STREAMS:JSON-ARRAY JSON-STREAMS:JSON-OBJECT).
In case you were wondering, json-stringify, json-stringify-multiple and json-stringify-single are regular functions. You cannot just write a new method to deal with a new datatype. They take a value of type json-object, json-array or json-string, defined as:
(deftype json-object () '(cons (member :object) t))
Encoding Hash Tables
As a result, when you want to encode CL data structures, you need to resort to more complicated calls, such as the following for dealing with hash tables:
(let ((data (alexandria:plist-hash-table '("foo" 1 "bar" (7 8 9)) :test #'equal))) (json-streams:with-json-output (nil :key-encoder #'string-downcase :indent t) (json-streams:with-json-object (json-streams:json-output-member :my-hash-table data)))) "{\"my-hash-table\": {\"foo\": 1,\"bar\": [7,8,9]}}"
This does not always work, and there is no obvious way to handle new lisp data types other than writing your own functions that wrap around the lower-level json-streams components. This essentially means that dealing with any type of CL data structure requires that you engage in incremental encoding.
Encoding Arrays
Encoding arrays is similar to encoding hash-tables (see above).
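As a hedged sketch, encoding a vector with only the exported building blocks could look like this (I am assuming the keyword arguments to with-json-output can simply be omitted; the result string is what I would expect, not verified output):
(let ((data #(1 2 3))) (json-streams:with-json-output (nil) (json-streams:with-json-array (loop :for x :across data :do (json-streams:json-output-value x))))) "[1,2,3]"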
Incremental Encoding
The following examples use two exercises. First, incrementally build a JSON array. Second, incrementally build a JSON object which also contains an incrementally built JSON array.
(json-streams:with-json-output (nil :key-encoder #'string-downcase :indent t) (json-streams:with-json-array (dotimes (i 3) (json-streams:json-output-value i)))) "[0,1,2]"
And now the second:
(json-streams:with-json-output (nil :key-encoder #'string-downcase) (json-streams:with-json-object (json-streams:json-output-member "hello" "hu hu") (json-streams:with-json-member :harr (json-streams:with-json-array (dotimes (i 3) (json-streams:json-output-value i)))))) "{\"hello\":\"hu hu\",\"harr\":[0,1,2]}"
The following is a more complicated version from the README:
(with-json-output (nil :key-encoder #'string-downcase :indent t) (with-json-object (json-output-member :first-name "John") (json-output-member :last-name "Smith") (with-json-member :is-alive (json-output-boolean t)) (json-output-member :age 25) (with-json-member :address (json-output-alist '((:street-address . "21 2nd Street") (:city . "New York") (:state . "NY") (:postal-code . "10021-3100")))) (with-json-member :phone-numbers (with-json-array (json-output-plist '(:type "home" :number "212 555-1234")) (json-output-plist '(:type "office" :number "646 555-4567")) (json-output-plist '(:type "mobile" :number "123 456-7890")))) (json-output-member :children #()) (with-json-member :spouse (json-output-null)))) "{ \"first-name\": \"John\", \"last-name\": \"Smith\", \"is-alive\": true, \"age\": 25, \"address\": { \"street-address\": \"21 2nd Street\", \"city\": \"New York\", \"state\": \"NY\", \"postal-code\": \"10021-3100\" }, \"phone-numbers\": [ { \"type\": \"home\", \"number\": \"212 555-1234\" }, { \"type\": \"office\", \"number\": \"646 555-4567\" }, { \"type\": \"mobile\", \"number\": \"123 456-7890\" } ], \"children\": [ ], \"spouse\": null }"
Symmetry
From the standpoint of symmetry or round-tripping, going from JSON->CL->JSON, json-streams had no issues.
Going from CL->JSON->CL, it failed (as you might expect) with both the alist input and the vector input because you would need to write a new method for dealing with either one.
Security
There were two general security issues we considered: (1) interning keywords and (2) issues with malformed data. json-streams did not exhibit the first issue.
With respect to the second issue, json-streams rejected all 173 malformed test cases. It was one of four packages which allowed you to limit the depths to which it would dive in nested JSON objects and, therefore, did not trigger the stack exhaustion exhibited by most of the other packages.
Conformity with JSON Standard
Without additional parameters, json-streams accepted 93 of 95 test cases that are considered "must accept". The rejected test cases were files with duplicate keys and those would have been accepted by passing the keyword parameter :duplicate-key-check nil.
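A hedged sketch of that call, assuming json-parse passes :duplicate-key-check through to the underlying input stream:
(json-streams:json-parse "{\"a\":1,\"a\":2}" :duplicate-key-check nil) ; now accepts the duplicate "a" keys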
It accepted 1 of the 17 test cases considered to be part of the gray area of the JSON specification - you could accept or reject.
Benchmarking
See Benchmarking
json-streams exported symbols
As mentioned above, there are no doc-strings for json-streams' exported symbols.
- call-with-json-output
- json-array
- json-close
- json-error
- json-input-stream
- json-object
- json-output-alist
- json-output-boolean
- json-output-member
- json-output-null
- json-output-plist
- json-output-stream
- json-output-value
- json-parse
- json-parse-error
- json-parse-multiple
- json-read
- json-stream
- json-stream-position
- json-string
- json-stringify
- json-stringify-multiple
- json-write
- json-write-error
- make-json-input-stream
- make-json-output-stream
- most-negative-json-integer
- most-positive-json-integer
- with-json-array
- with-json-member
- with-json-object
- with-json-output
- with-open-json-stream
jsown
Library | Author | License | Website |
---|---|---|---|
jsown | Aad Versteden | MIT | https://github.com/madnificent/jsown |
Yes, it is the fastest in pure decoding but speed is not everything. It has optimize set for safety 0 to increase speed. This has bitten some users hard in the past, so you have been warned. You do have to be careful about what data structures you use. For example, encoding dotted alists triggered memory fault issues. It fails to deal with JSON's null properly and actually hung when trying to decode a JSON real number with an underflow problem.
On the plus side, it has the ability to extract a subset out of a JSON string, but you have to have read the entire string since it does not deal with streams.
Default Mapping
Please note the direction of the arrows in the following table.
Lisp | JSON | |
---|---|---|
integer | <-> | number with no frac or exp parts |
float | -> | number with frac or exp parts |
rational | -> | number with frac or exp parts |
ratio | <- | number with frac or exp parts |
T | <-> | true |
nil | -> | [] |
nil | <- | false |
nil | <- | null |
nil | <- | [] |
other symbol | -> | string |
character | -> | Error |
string | <-> | string |
list (except alists) | <-> | array |
cons | <- | object |
vector | -> | array |
alist w/o dotted pairs | -> | array of arrays |
alist with dotted pairs | -> | Error |
hash-table | -> | object |
standard object | -> | Error |
Decoding
This is where jsown really shines. Jsown is by far the fastest decoder. It not only decodes, it also makes it easy to pull out elements of the json-object (at least at the first level of the object).
One interesting observation: jsown is the only library that decodes JSON floats to CL ratios.
As a minor annoyance, form feeds in parsed strings get translated to line feeds.
When reading JSON objects, jsown converts their content to the most lispy translation of what was in there. As such, JSON's false will be translated to nil, which coincidentally is also the translation of JSON's []. JSON's null is also translated to nil.
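A quick check of those defaults (the result is what the mapping table above implies):
(jsown:parse "[false, null, []]") (NIL NIL NIL)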
As you can tell from the default mapping table, JSON arrays are decoded to lists and JSON objects decoded to cons cells:
(jsown:parse "[1, \"a\"]") (1 "a") (jsown:parse "{\"foo\": \"alpha\", \"bar\":3.2}") (:OBJ ("foo" . "alpha") ("bar" . 16/5))
Nested JSON Objects (Filters and Subsets)
Now consider the nested JSON object *nested-address-1*. Again, as a reminder, here is the object.
"{ \"first_name\": \"George\", \"last_name\": \"Washington\", \"birthday\": \"1732-02-22\", \"address\": { \"street_address\": \"3200 Mount Vernon Memorial Highway\", \"city\": \"Mount Vernon\", \"state\": \"Virginia\", \"country\": \"United States\" } }"
What can we do with it in jsown? It is a JSON object whose key "address" contains another JSON object.
A basic call to parse returns a cons representing the entire object:
(jsown:parse *nested-address-1*) (:OBJ ("first_name" . "George") ("last_name" . "Washington") ("birthday" . "1732-02-22") ("address" :OBJ ("street_address" . "3200 Mount Vernon Memorial Highway") ("city" . "Mount Vernon") ("state" . "Virginia") ("country" . "United States")))
If we add the keyword address, we get back a subset of the object:
(jsown:parse *nested-address-1* "address") (:OBJ ("address" :OBJ ("street_address" . "3200 Mount Vernon Memorial Highway") ("city" . "Mount Vernon") ("state" . "Virginia") ("country" . "United States")))
According to the README, "In order to achieve high performance when parsing specific keywords, the keywords to be found should be known at compile time. The compiler-macro-function can calculate the keyword container with the requested keywords at compile-time. When specifying the keywords in which you’re interested you should ignore any escaped characters. For instance, supplying the string “foo” will automatically match “f\\\\oo” too."
We can walk the tree deeper by applying the filter and val functions.
(jsown:val (jsown:filter (jsown:parse *nested-address-1*) "address") "city") "Mount Vernon"
Jsown has the ability, once a JSON object has been parsed into a jsown object, to get the first level keywords of the object.
(jsown:keywords (jsown:parse *nested-address-1*)) ("first_name" "last_name" "birthday" "address")
Jsown also has the ability to loop over the first level keywords of an object:
(jsown:do-json-keys (keyword value) (jsown:parse *nested-address-1*) (format t "~a => ~a~%" keyword value)) first_name => George last_name => Washington birthday => 1732-02-22 address => (OBJ (street_address . 3200 Mount Vernon Memorial Highway) (city . Mount Vernon) (state . Virginia) (country . United States))
On the downside, jsown actually hung when trying to decode an underflow number: "[123e-10000000]".
Encoding
(to-json x) is a generic function which you can specialize on your own types. This allows you to nest lisp objects in a jsown object and serialize them in a suitable way.
(to-json* x) is the non-generic function variant of the same thing. It isn't as smart, but it is faster.
Encoding chars, pathnames, CLOS objects, structs and other data structures not listed in the default mapping above will require that you write a new to-json method to handle that particular datatype.
On the plus side, jsown is one of the two libraries (the other is shasht) which can handle multi-dimensional arrays.
When writing to JSON, lisp's nil is translated to the empty JSON list []. You can write JSON's false by writing lisp's keywords :false or :f.
(jsown:to-json (jsown:new-js ("items" nil) ("falseIsEmptyList" :f) ("success" t))) "{\"items\":[],\"falseIsEmptyList\":false,\"success\":true}"
As you can tell from the default mapping table, lists and vectors are automatically encoded to JSON arrays and hash-tables are encoded to JSON objects. As alists are considered lists of lists, an alist will return an array of arrays. Plists are treated the same as plain lists and will lose their key-value connections.
(jsown:to-json '(("a" "alpha") ("b" "beta"))) "[[\"a\",\"alpha\"],[\"b\",\"beta\"]]"
You can specify that the result be a JSON object. The following call to to-json wraps the alist in another list headed by :obj and returns a JSON object.
(jsown:to-json '(:obj (("a" "alpha") ("b" "beta")))) "{\"(a alpha)\":[[\"b\",\"beta\"]]}"
When jsown tried to encode an alist with dotted cons cells, it triggered unhandled memory faults with SBCL 2.1.11-x86-64-linux, CCL version 1.12.1 LinuxX8664 and ecl version 21.2.1. This appears to be because jsown has optimized the code and assumed that all lists will be proper lists. The assumption obviously fails in the context of dotted cons cells.
When to-json is called, jsown will internally call to-json each step of the way. This has a performance downside, yet it seems to provide the least surprises in the output. If you need more performance, to-json* offers that, at the cost of flexibility.
If you are constructing JSON objects, consider using the jsown:new-js and jsown:extend-js functions. jsown:new-js has a clean and clear interface for building content. It works together with jsown:extend-js if you need to split up the object creation.
Jsown has a setf-expander on (setf jsown:val) which automatically creates a jsown-object if no such object was available at the designated place. An example should clarify:
(let (doc) (setf (jsown:val (jsown:val (jsown:val doc "properties") "color") "paint") "red") (jsown:to-json doc)) "{\"properties\":{\"color\":{\"paint\":\"red\"}}}"
It turns out to be a handy little feature when you need to build deeply nested JSON documents.
Encoding CLOS
Jsown does not have a handy ability to automagically encode CLOS classes. For that you would have to write your own jsown:to-json method. Taking our simple person class with slots name and eye-colour, such a method could look something like this:
(defmethod jsown:to-json ((object person)) (jsown:to-json `(:obj ("name" . ,(name object)) ("eye-colour" . ,(eye-colour object))))) (jsown:to-json (make-instance 'person)) "{\"name\":\"Sabra\",\"eye-colour\":\"brown\"}"
Incremental Encoding
The following examples use two exercises. First, incrementally build a JSON array. Second, incrementally build a JSON object which also contains an incrementally built JSON array. With jsown, I found using loop easier than dotimes.
(jsown:to-json (loop :for i :from 0 :to 2 :collect i)) "[0,1,2]"
The second exercise
(jsown:to-json `(:obj ("hello" . "hu hu") ("harr" . ,(loop :for i :from 0 :to 2 :collect i)))) "{\"hello\":\"hu hu\",\"harr\":[0,1,2]}"
Symmetry
From the standpoint of symmetry or round-tripping, going from JSON->CL->JSON, jsown had no issues except for dealing with false and null.
Going from CL->JSON->CL was more problematic. Keyword symbols became strings, :NULL became nil, a float was converted to a ratio and where the input started as an array, it returned a list.
Security
There were two general security issues we considered: (1) interning keywords and (2) issues with malformed data. Jsown did not exhibit the first issue. With respect to the second issue, jsown rejected 102 out of 173 malformed test cases. Unfortunately some of those malformed JSON data strings did trigger the stack exhaustion exhibited by most of the other packages.
Conformity with JSON Standard
jsown accepted 93 of the 95 test cases that are considered "must accept". It failed on lonely numbers (an integer or negative real not within a JSON array or JSON object).
It accepted 9 of the 17 test cases considered to be part of the gray area of the JSON specification - you could accept or reject.
Benchmarking
See Benchmarking
Miscellaneous
Issue 27 on GitHub claims an unhandled memory fault on the following bad JSON string in SBCL 2.0.8, but I do not see that with CCL, clisp or SBCL 2.1.11, so this issue may have been fixed.
(jsown:parse "{\"foo\": a}")
jsown exported symbols
- *parsed-empty-list-value* - value to emit when parsing a JSON empty list '[]'
- *parsed-false-value* - value to emit when parsing JSON's 'false'
- *parsed-null-value* - value to emit when parsing JSON's 'null'
- *parsed-true-value* - value to emit when parsing JSON's 'true'
- as-js-bool - returns <value> as a boolean value (in jsown format)
- as-js-null - returns <value> with nil being javascript's null (in jsown format).
- build-key-container - Builds an internal structure to speed up fetching the keywords which you want to read. This should be used when the keywords needed are not known at compile time, but you still want to parse those keywords out of a lot of documents. (A short sketch of this workflow follows this list.)
- do-json-keys - Macro - Iterates over the JSON key-value pairs
- empty-object - Returns an empty object which can be used to build new objects upon
- extend-js - Macro - fills in a bunch of jsown values for obj. each spec should contain a list with the first element being the string which represents the key and the second being the form which evaluates to the value to which the key should be set
- filter - Fancy filtering for jsown-parsed objects. spec can be one of the following:
- [object] key to find. will transform into (jsown:val value key)
- [cl:map] use this modifier with an [object] modifier after it, to filter all elements in the list.
- json-encoded-content - Describes a JSON object whose content is serialized already
- keyp - Returns non-nil iff <object> has key <key>
- keywords - Returns a list of all the keywords contained in the object
- new-js - Macro - creates a new empty object and fills it as per the given jsown values
- parse - Reads a JSON object from the given string, with the given keywords being the keywords which are fetched from the object. All parse functions assume <string> is not an empty JSON object. (string/= string \"{}\")
- parse-with-container - Parses the keywords which have been specified in the container from the JSON string json-string. For most cases you can just use the parse function without a special key container. This is only here to support some cases where the building of the key container takes too much time. See #'parse for the normal variant. See #'build-key-container for a way to build new keyword containers.
- remkey - Removes key from object
- to-json - Writes the given object to JSON in a generic way.
- to-json* - Converts an object in internal jsown representation to a string containing the JSON representation
- val - Returns the value of the given key in object
- val-safe - Returns the value of <key> in <object> if <object> existed, or nil if it did not. A second value is returned which indicates whether or not the key was found.
- with-injective-reader - Rebinds the *parsed-...-value* variables so that reading JSON documents is injective and converting them back to JSON yields roughly the same document as the original. Rebinds: *parsed-true-value* => :true, *parsed-false-value* => :false, *parsed-null-value* => :null
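As promised above under build-key-container, here is a hedged sketch of the key-container workflow using *nested-address-1*; the argument order of parse-with-container (JSON string first, container second) is my assumption from the docstrings:
(let ((container (jsown:build-key-container "first_name" "address"))) (jsown:parse-with-container *nested-address-1* container)) ; should return only the "first_name" and "address" members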
shasht
Library | Author | License | Website |
---|---|---|---|
shasht | Tarn W. Burton | MIT | https://github.com/yitzchak/shasht |
Shasht is one of the two new libraries that I particularly like. It is fast, it handles null correctly, it encodes CLOS objects, structures and hash-tables. It can also do incremental encoding.
Default Mapping
Please note the direction of the arrows in the following table.
Lisp | JSON | |
---|---|---|
integer | <-> | number with no frac or exp parts |
float | <-> | number with frac or exp parts |
rational | -> | number with frac or exp parts |
ratio | -> | number with frac or exp parts |
T | <-> | true |
:true | -> | true |
:false | -> | false |
nil | <-> | false |
:NULL | <-> | null |
#() | <- | [] |
other symbol | -> | string |
character | -> | string |
string | <-> | string |
pathname | -> | string |
list (except alists) | -> | array |
vector | <-> | array |
multi-dimensional array | -> | nested array |
alist | -> | array |
hash-table | <-> | object |
standard object | -> | object |
structure | -> | object |
:empty-array | -> | [] |
:empty-object | -> | {} |
'(:array 1 2 3) | -> | [1,2,3] |
'(:object-alist ("a" . 1)("b" . 2)) | -> | {"a":1,"b":2} |
'(:object-plist "a" 1 "b" 2) | -> | {"a":1,"b":2} |
The following subsections are from the README:
Mapping of Number Types
The format of a number read from JSON when a decimal or an exponent is present in the number literal can be influenced with cl:*read-default-float-format*. This is the same behavior as cl:read. In order to read JSON numbers with large exponents one would need to do something like the following.
(let ((cl:*read-default-float-format* 'double-float)) (shasht:read-json "[2.232e75]"))
Mapping of Array Types
The dynamic variables *read-default-array-format*, *write-empty-array-values*, and *write-array-tags* all influence the mapping of JSON arrays to Common Lisp vectors and lists. Common Lisp vectors and multi-dimensional arrays are always written as JSON arrays. By default JSON arrays are read as Common Lisp vectors. With the default settings only non-nil lists that don't satisfy some other mapping rule are written as JSON arrays.
If one wants to use lists as the default JSON array format then *read-default-false-value*, *read-default-array-format*, and *write-false-values* will need to be set to appropriate values since in the default mapping nil maps to false. For example, the following could be done.
(let ((shasht:*read-default-false-value* :false) (shasht:*read-default-array-format* :list) (shasht:*write-false-values* '(:false))) (shasht:read-json ...) (shasht:write-json ...))
Lists with a CAR eql to a value in *write-array-tags*, *write-object-alist-tags*, *write-object-plist-tags* will still be written as an array or object as appropriate. To completely disable this behavior the variables would need to be bound to nil. Or one could do the following.
(shasht:write-json '(1 2 3) :false-value '(:false) :array-tags nil :object-alist-tags nil :object-plist-tags nil)
In this case the mapping for array types would become:
Common Lisp | JSON | |
---|---|---|
vector | -> | array |
multi-dimensional array | -> | nested array |
list | <-> | array |
Mapping of Object Types
The dynamic variables *read-default-object-format*, *write-alist-as-object*, *write-plist-as-object*, *write-empty-object-values*, *write-object-alist-tags*, and *write-object-plist-tags* all influence the mapping of JSON objects to Common Lisp hash tables, alists, and plists. Common Lisp hash tables are always written as JSON objects. By default JSON objects are read as Common Lisp hash tables.
In order to use alists as the default JSON object format the dynamic variables *read-default-object-format*, *write-alist-as-object*, *read-default-false-value*, and *write-false-values* will need to be set to appropriate values. For example, the following would use alists as the default JSON object format and :false as the JSON false value.
(let ((shasht:*read-default-object-format* :alist) (shasht:*write-alist-as-object* t) (shasht:*read-default-false-value* :false) (shasht:*write-false-values* '(:false))) (shasht:read-json ...) (shasht:write-json ...))
In this case the mapping for object types would become:
Common Lisp | JSON | |
---|---|---|
hash table | -> | object |
alist | <-> | object |
standard object | -> | object |
structure object | -> | object |
The same could be accomplished for plists by doing the following.
(let ((shasht:*read-default-object-format* :plist) (shasht:*write-plist-as-object* t) (shasht:*read-default-false-value* :false) (shasht:*write-false-values* '(:false))) (shasht:read-json ...) (shasht:write-json ...))
Decoding
To quote from the README: The primary interface to parsing and reading JSON is the read-json function.
(read-json &optional input-stream-or-string (eof-error-p t) eof-value single-value-p)
The argument input-stream-or-string can be a stream, a string to read from, or nil to use *standard-input*. The arguments eof-error-p and eof-value have the same effect as they do in the CL function read. If the single-value-p argument is true then the input to read-json is assumed to be a single value, which means that extra tokens at the end will cause an error to be generated.
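For example, the eof arguments can be used to get a sentinel value back instead of an error (a hedged check; I am assuming an empty string simply reads as end of input):
(shasht:read-json "" nil :no-more-input) :NO-MORE-INPUT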
There are a number of dynamic variables that will influence the parsing of JSON data.
- common-lisp:*read-default-float-format* — Controls the floating-point format that is to be used when reading a floating-point number.
- *read-default-true-value* — The default value to return when reading a true token. Initially set to t.
- *read-default-false-value* — The default value to return when reading a false token. Initially set to nil.
- *read-default-null-value* — The default value to return when reading a null token. Initially set to :null.
- *read-default-array-format* — The default format to use when reading an array. Current supported formats are :vector or :list. Initially set to :vector.
- *read-default-object-format* — The default format to use when reading an object. Current supported formats are :hash-table, :alist or :plist. Initially set to :hash-table.
There is also a keyword variant read-json* which will set the various dynamic variables from supplied keywords.
(read-json* :stream nil :eof-error t :eof-value nil :single-value nil :true-value t :false-value nil :null-value :null :array-format :vector :object-format :hash-table :float-format 'single-float)
Encoding
The primary interface to serializing and writing JSON is the write-json function.
(write-json value &optional (output-stream t))
On the plus side, shasht was one of two libraries which could encode a pathname or a struct. It also is one of the few libraries that can encode CLOS objects out of the box:
(shasht:write-json (make-instance 'person)) { "NAME": "Sabra", "EYE-COLOUR": "brown" }
Local-time:timestamps were encoded as {"DAY": 7990, "SEC": 0, "NSEC": 0}, so if you wanted them to be in any particular javascript time representation, you would have to write a specialized method.
As you might expect, plists are treated the same as plain lists and will lose their key-value connections. If you want to use plists, you will need to convert to a hash table first.
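Two hedged ways of doing that, one via a hash table as suggested and one via the :object-plist literal tag from the mapping table (output shown compactly; actual whitespace depends on cl:*print-pretty*, and key order from the hash table may differ):
(shasht:write-json (alexandria:plist-hash-table '("a" 1 "b" 2) :test #'equal) nil) "{\"a\":1,\"b\":2}"
(shasht:write-json '(:object-plist "a" 1 "b" 2) nil) "{\"a\":1,\"b\":2}"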
There are a number of dynamic variables that will influence the serialization of JSON data.
- common-lisp:*print-pretty* — If true then a simple indentation algorithm will be used.
- *write-indent-string* — The string to use when indenting objects and arrays. Initially set to #\space.
- *write-ascii-encoding* — If true then any non ASCII values will be encoded using Unicode escape sequences. Initially set to nil.
- *write-true-values* — Values that will be written as a true token. Initially set to '(t :true).
- *write-false-values* — Values that will be written as a false token. Initially set to '(nil :false).
- *write-null-values* — Values that will be written as a null token. Initially set to (:null).
- *write-alist-as-object* — If true then undotted association lists will be written as an object. Initially set to nil.
- *write-plist-as-object* — If true then property lists will be written as an object. Initially set to nil.
Consider the following:
(setf common-lisp:*print-pretty* nil) (shasht:write-json '(("foo" . "bar") ("baz" . ((1 2 3) (4 5 6))))) [["foo"; Evaluation aborted on #<TYPE-ERROR expected-type: LIST datum: "bar">.
Now if we set shasht:*write-alist-as-object* t,
(setf common-lisp:*print-pretty* nil) (setf shasht:*write-alist-as-object* t) (shasht:write-json '(("foo" . "bar") ("baz" . ((1 2 3) (4 5 6))))) {"foo":"bar","baz":{1:[2,3],4:[5,6]}}
The actual serialization of JSON data is done by the generic function print-json-value which can be specialized for additional value types.
(print-json-value value output-stream)
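For example, the local-time:timestamp case mentioned earlier could be handled with a specialized method along these lines (a hedged sketch; the ISO-8601 choice and the reliance on local-time are mine, not shasht's):
(defmethod shasht:print-json-value ((value local-time:timestamp) output-stream)
  ;; write the timestamp as an ISO-8601 JSON string instead of the default slot-by-slot object
  (shasht:print-json-value (local-time:format-timestring nil value) output-stream))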
There is also a keyword variant write-json* which will set the various dynamic variables from supplied keywords.
(write-json* value :stream t :ascii-encoding nil :true-values '(t :true) :false-values '(nil :false) :null-values '(:null) :empty-array-values '(:empty-array) :empty-object-values '(:empty-object) :alist-as-object nil :plist-as-object nil :pretty nil :indent-string " ")
In order to facilitate extending the serialization facilities of shasht there are a number of helper functions available. To aid in the printing of JSON strings there is the following.
(write-json-string value output-stream)
In order to ease the serialization of objects and arrays there is with-json-object and with-json-array. Both of these macros take an output stream as the first argument then enable indentation and automatic handling of all delimiter tokens. Inside the body of with-json-object the function (print-json-key-value key value output-stream) should be used to output a key value pair. Inside the body of with-json-array the function (print-json-value value output-stream) should be used to output a single value. Example usage can be seen in the source code.
Encoding Objects
Shasht actually uses the :type of each slot to determine what to do with nil. Consider the following where two slots have no specified type, two have a list type and two have an array type:
(defclass my-class () ((name :initarg :name) (value0 :initarg :value0) (value-list0 :initarg :value-list0 :type list) (value-array0 :initarg :value-array0 :type array) (value1 :initarg :value1) (value-list1 :initarg :value-list1 :type list) (value-array1 :initarg :value-array1 :type array))) (let ((obj (make-instance 'my-class :name "Name" :value0 nil :value-list0 nil :value-array0 nil :value1 1 :value-list1 '(1 2 3) :value-array1 #("a" "b" "c"))) (cl:*print-pretty* t) (shasht:*write-false-values* '(:false)) (shasht:*write-empty-array-values* '(:empty-array nil))) (shasht:write-json obj nil)) "{ \"NAME\": \"Name\", \"VALUE0\": false, \"VALUE-LIST0\": [], \"VALUE-ARRAY0\": false, \"VALUE1\": 1, \"VALUE-LIST1\": [ 1, 2, 3 ], \"VALUE-ARRAY1\": [ \"a\", \"b\", \"c\" ] }"
Incremental Encoding
The following examples use two exercises. First, incrementally build a JSON array. Second, incrementally build a JSON object which also contains an incrementally built JSON array.
(shasht:with-json-array *standard-output* (dotimes (i 3) (shasht:write-json i))) [0,1,2]
One way of doing the second exercise could look like this:
(shasht:with-json-object *standard-output* (shasht:print-json-key-value () "hello" "hu hu" *standard-output*) (shasht:print-json-key-value () "harr" (loop :for i :from 0 :to 2 :collect i) *standard-output*)) {"hello": "hu hu", "harr": [0,1,2]}
The following two examples are taken from the README. There are keyword literals that can be used to help constructing JSON objects and arrays. The values and tags that indicate these literals can be configured via the dynamic variables *write-empty-array-values*, *write-empty-object-values*, *write-array-tags*, *write-object-alist-tags*, and *write-object-plist-tags*.
These literal forms are only meant for serialization and not for round-trip mapping. Therefore there is no way to read JSON in the same format:
(shasht:write-json '(:object-alist ("a" . :empty-array) ("b" . :empty-object) ("c" . (:object-plist "d" 1 "e" (:array 1 2 3))))) { "a": [], "b": {}, "c": { "d": 1, "e": [ 1, 2, 3 ] } } (:OBJECT-ALIST ("a" . :EMPTY-ARRAY) ("b" . :EMPTY-OBJECT) ("c" :OBJECT-PLIST "d" 1 "e" (:ARRAY 1 2 3)))
Symmetry
From the standpoint of symmetry or round-tripping, going from JSON->CL->JSON, shasht had no issues.
Going from CL->JSON->CL was more problematic. The alist input was converted to vectors of vectors and keyword symbols became strings.
Security
There were two general security issues we considered: (1) interning keywords and (2) issues with malformed data. shasht did not exhibit the first issue.
With respect to the second issue, shasht rejected 159 out of 173 malformed test cases. While it did not trigger the stack exhaustion issue exhibited by most of the other packages, some of the malformed JSON data strings did trigger recoverable error situations.
Shasht has also added read-level and read-length limits but these are set to nil by default.
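If you are parsing untrusted input, a hedged sketch of turning those limits on looks like this (the particular numbers are arbitrary and untrusted-json-string is a hypothetical variable holding the incoming text):
(let ((shasht:*read-level* 64) (shasht:*read-length* 10000)) (shasht:read-json untrusted-json-string))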
Conformity with JSON Standard
shasht accepted 95 of the 95 test cases that are considered "must accept". If *read-default-float-format* is set to 'single-float, it would refuse to accept: [123e65], [123e45] and [123.456e78].
It accepted 9 of the 17 test cases considered to be part of the gray area of the JSON specification - you could accept or reject.
Benchmarking
See Benchmarking
Shasht Exported Symbols
- *read-default-array-format* - The default format to use when reading an array. Current supported formats are :vector or :list. Initially set to :vector.
- *read-default-false-value* - The default value to return when reading a false token. Initially set to nil
- *read-default-null-value* - The default value to return when reading a null token. Initially set to :null
- *read-default-object-format* - The default format to use when reading an object. Current supported formats are :hash-table, :alist or :plist. Initially set to :hash-table
- *read-default-true-value* - The default value to return when reading a true token. Initially set to t
- *read-length* - The maximum number of values in an array or object. Initially set to nil, which disables length checking.
- *read-level* - The maximum number of levels to allow during reading for arrays and objects. Initially set to nil, which disables level checking.
- *symbol-name-function*
- *write-alist-as-object* - If true then association lists will be written as an object. Initially set to nil
- *write-array-tags* - Indicators in the CAR of a list that indicate that the CDR of the list should be written as an array.
- *write-object-alist-tags* - Indicators in the CAR of a list that indicate that the CDR of the list is an alist and should be written as an object.
- *write-object-plist-tags* - Indicators in the CAR of a list that indicate that the CDR of the list is a plist and should be written as an object.
- *write-ascii-encoding* - If true then any non ASCII values will be encoded using Unicode escape sequences. Initially set to nil
- *write-empty-array-values*
- *write-empty-object-values*
- *write-false-values* - Values that will be written as a false token. Initially set to '(nil :false)
- *write-indent-string* - The string to use when indenting objects and arrays. Initially set to #\space
- *write-null-values* - Values that will be written as a null token. Initially set to (:null)
- *write-plist-as-object* - If true then property lists will be written as an object. Initially set to nil
- *write-true-values* - Values that will be written as a true token. Initially set to '(t :true)
- make-object
- print-json-delimiter
- print-json-key-value
- print-json-value - The actual serialization of JSON data is done by the generic function print-json-value which can be specialized for additional value types.
- read-json - The primary read function
(read-json &optional input-stream-or-string (eof-error-p t) eof-value single-value-p)
- read-json* - will set the various dynamic variables from supplied keywords.
(read-json* :stream nil :eof-error t :eof-value nil :single-value nil :true-value t :false-value nil :null-value :null :array-format :vector :object-format :hash-table :float-format 'single-float)
- shasht-parse-error
- with-json-array - Macro; takes an output stream as the first argument, then enables indentation and automatic handling of all delimiter tokens. Inside the body, the function (print-json-value value output-stream) should be used to output a single value.
- with-json-key
- with-json-object - Macro; takes an output stream as the first argument, then enables indentation and automatic handling of all delimiter tokens. Inside the body, the function (print-json-key-value key value output-stream) should be used to output a key value pair.
- write-json - The primary writing function
(write-json value &optional (output-stream t))
- write-json* - Will set the various dynamic variables from supplied keywords.
(write-json* value :stream t :ascii-encoding nil :true-values '(t :true) :false-values '(nil :false) :null-values '(:null) :empty-array-values '(:empty-array) :empty-object-values '(:empty-object) :alist-as-object nil :plist-as-object nil :pretty nil :indent-string " ")
- write-json-string
st-json
Library | Author | License | Website |
---|---|---|---|
st-json | Marijn Haverbeke | zlib-style | https://github.com/marijnh/ST-JSON |
As the README says, st-json does mostly the same thing as cl-json, but is simpler and more precise about types (distinguishing boolean false, the empty array, and the empty object). It is very fast at encoding. While not the fastest decoder, it has respectable decoding speed. It does have optimize set for safety 0 to increase speed. This has bitten some users hard in the past, so you have been warned. It handles JSON's null correctly. You will have to write your own methods to handle symbols and CLOS objects.
Documentation is found at https://marijnhaverbeke.nl/st-json/.
Default Mapping
Please note the direction of the arrows in the following table.
Lisp | JSON | |
---|---|---|
integer | <-> | number with no frac or exp parts |
float | <-> | number with frac or exp parts |
rational | -> | number with frac or exp parts |
ratio | -> | number with frac or exp parts |
T | <-> | true |
nil | <-> | [] |
:FALSE | <- | false |
:NULL | <- | null |
other symbol (1) | -> | Error |
character (1) | -> | Error |
string | <-> | string |
list (except alists) | <-> | array |
vector (1) | -> | Error |
alist (2) | -> | array |
hash-table (3) | -> | object |
struct | <- | object |
standard object (1) | -> | Error |
- (1) You can write a method to handle these
- (2) Only if the alist does not have dotted cons cells.
- (3) Encoding hash-tables works with the caveat that if the hash-table has symbols as keys, you will have to write your own method to handle symbols first.
Decoding
In decoding, st-json creates instances of a struct "jso" which wraps an alist. The underlying idea for using a struct is that hash tables are often too heavyweight for the need.
(st-json:read-json "{\"foo\": \"alpha\", \"bar\":3.2,\"baz\": \"null\"}") #S(ST-JSON:JSO :ALIST (("foo" . "alpha") ("bar" . 3.2) ("baz" . "null")))
Encoding
The basic encoding function is write-json. Similarly to cl-json, st-json has a function for writing JSON output to string as well as to a stream. St-json has a limited number of lisp datatypes that it handles out of the box. For example, vectors are not included. It is easy to write methods for those types, but that is an additional amount of work not necessary in most of the other libraries.
Encoding Symbols
You need to write your own st-json::write-json-element function for symbols. Possibly something like:
(defmethod st-json:write-json-element ((element symbol) stream) (st-json:write-json-element (string element) stream))
You can also write methods to handle encoding the other lisp datatypes in the default mapping table which show an error.
Encoding Arrays
You need to write your own st-json::write-json-element function for arrays. Possibly something like:
(defmethod st-json:write-json-element ((element vector) stream) (write-char #\[ stream) (loop :for val :across element :for first := t :then nil :unless first :do (write-char #\, stream) :do (st-json:write-json-element val stream)) (write-char #\] stream))
Encoding Alists
If the alist has dotted cons cells, st-json (like jsown) triggered unhandled memory faults with SBCL 2.1.11-x86-64-linux, CCL version 1.12.1 LinuxX8664 and ecl version 21.2.1. This appears to be because both libraries have optimized their code and assumed that all lists will be proper lists. The assumption obviously fails in the context of dotted cons cells.
Encoding CLOS
St-json requires that you actually define a write-json-element method for the CLOS class you want to encode. Consider our little person class.
(defclass person () ((name :initarg :name :initform "Sabra" :accessor name) (eye-colour :initarg :eye-colour :initform "brown" :accessor eye-colour)))
Creating the required method for this class is straightforward.
(defmethod st-json:write-json-element ((person person) stream) (let ((accessors '(("name" name) ("eye_colour" eye-colour)))) (write-char #\{ stream) (loop :for (key val) :in accessors :for first := t :then nil :unless first :do (write-char #\, stream) do (st-json:write-json-element key stream) (write-char #\: stream) (st-json:write-json-element (funcall val person) stream)) (write-char #\} stream))) #<STANDARD-METHOD ST-JSON:WRITE-JSON-ELEMENT (PERSON T) {101561B163}> JSON-TESTS> (let ((data (make-instance 'person))) (st-json:write-json data *standard-output*)) {"name":"Sabra","eye_colour":"brown"}
Incremental Encoding
The following examples use two exercises. First, incrementally build a JSON array. Second, incrementally build a JSON object which also contains an incrementally built JSON array. The first exercise (again, like jsown, I found using loop easier than dotimes):
(st-json:write-json-to-string (loop :for i :from 0 :to 2 :collect i)) "[0,1,2]"
The second exercise:
(st-json:write-json-to-string (st-json:jso "hello" "hu hu" "harr" (loop :for i :from 0 :to 2 :collect i))) "{\"hello\":\"hu hu\",\"harr\":[0,1,2]}"
Symmetry
From the standpoint of symmetry or round-tripping, going from JSON->CL->JSON, st-json had no issues.
Going from CL->JSON->CL was more problematic solely because you need to write a method to deal with encoding symbols.
Security
There were two general security issues we considered: (1) interning keywords and (2) issues with malformed data. st-json did not exhibit the first issue.
With respect to the second issue, st-json rejected 140 out of 173 malformed test cases. Unfortunately it did trigger the stack exhaustion issue exhibited by most of the other packages.
Conformity with JSON Standard
st-json accepted 95 of the 95 test cases that are considered "must accept". If *read-default-float-format* is set to 'single-float, it would refuse to accept: [123e65], [123e45] and [123.456e78].
It accepted 8 of the 17 test cases considered to be part of the gray area of the JSON specification - you could accept or reject.
Benchmarking
See Benchmarking
Other Information
St-json provides error conditions json-type-error, json-parse-error, json-error and json-eof-error. These are undocumented, so you will have to look at the source code for how to use them.
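A hedged sketch of trapping a bad parse, assuming json-parse-error and json-eof-error both inherit from json-error:
(handler-case (st-json:read-json-from-string "{\"a\": ") (st-json:json-error (e) (format nil "rejected: ~a" e)))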
st-json exported symbols
- *allow-comments* - Non-nil means ignore comments when parsing.
- *decode-objects-as* - Valid values: :jso :hashtable Controls how js objects should be decoded. :jso means decode to internal struct which can be processed by getjso, mapjso etc. :hashtable means decode as hash tables.
- *output-literal-unicode* - Bind this to T in order to reduce the use of \uXXXX Unicode escapes, by emitting literal characters (encoded in UTF-8). This may help reduce the parsing effort for any recipients of the JSON output, if they can already read UTF-8, or else, they'll need to implement complex unicode (eg UTF-16 surrogate pairs) escape parsers.
- *script-tag-hack* - Bind this to T when writing JSON that will be written to an HTML document. It prevents '</script>' from occurring in strings by escaping any slash following a '<' character.
- as-json-bool - Convert a generalised boolean to a :true/:false keyword.
- from-json-bool - Convert :true or :false to its boolean equivalent.
- getjso - Fetch a value from a JS object. Returns a second value like gethash.
- getjso* - The getjso* function in theory allows you to take a key in the form of "a.b.c" and st-json will generate a series of getjso calls to go down each level and return the value for key c. This, however, does not seem to work if read-json does not result in jso objects all the way down. Consider, for example, if we have a jso object which wraps a cons which wraps a jso object. The intermediary cons prevents getjso* from walking down the nested list.
- jso - Create a JS object. Arguments should be alternating labels and values.
- json-bool - (deftype json-bool () '(member :true :false))
- json-eof-error
- json-error
- json-null - (deftype json-null () '(eql :null))
- json-parse-error
- json-type-error
- mapjso - Iterate over the key/value pairs in a JS object.
- read-json - Read a JSON-encoded value from a stream or a string.
- read-json-as-type - Read a JSON value and assert the result to be of a given type. Raises a json-type-error when the type is wrong.
- read-json-from-string
- write-json - Write a value's JSON representation to a stream.
- write-json-element - Method used for writing values of a specific type. You can specialise this for your own types.
- write-json-to-string
trivial-json-codec
Library | Author | License | Website |
---|---|---|---|
trivial-json-codec | Eric Diethelm | MIT | https://gitlab.com/ediethelm/trivial-json-codec |
As the website says, trivial-json-codec is a JSON parser focused on the ability to handle class hierarchies. The classes must be defined, and each class in a hierarchy must have at least one slot named differently. It does not handle the difference between JSON's null and false (everything is nil once it gets to the CL side). The one thing I do not understand is its use of angle brackets when encoding lists to JSON. E.g.
(trivial-json-codec:serialize '(1 2 3) *standard-output*) <1,2,3>
Encoding vectors looks perfectly standard:
(trivial-json-codec:serialize #(1 2 3) *standard-output*) [1,2,3]
I see this encoding of lists as resulting in invalid JSON, but maybe I am missing something.
Default Mapping
Please note the direction of the arrows in the following table.
Lisp | JSON | |
---|---|---|
integer | <-> | number with no frac or exp parts |
float | <-> | number with frac or exp parts |
rational | -> | number with frac or exp parts |
ratio | -> | Error (1) |
T | <-> | true |
nil | <-> | [] |
nil | <- | false |
nil | <-> | null |
other symbol | -> | string |
character | -> | Error (1) |
string | <-> | string |
list (except alists) | <-> | invalid angle bracket array? |
vector | <-> | array |
alist w/o dotted pairs | -> | invalid angle bracket array |
alist with dotted pairs | -> | Error (1) |
hash-table | -> | Error (1) |
plist | <- | invalid angle bracket array |
standard object | -> | object |
alist | <- | object (2) |
alist | <- | object (3) |
- (1) You could write a method to handle these
- (2) Using deserialize-raw
- (3) Using deserialize-json if a base class is given
Decoding
Trivial-json-codec has two functions that "deserialize" JSON strings. The first is deserialize-raw. JSON arrays are converted to vectors and JSON objects are converted to alists.
(trivial-json-codec:deserialize-raw "[1.23,\"alpha\",false,null]") #(1.23 "alpha" NIL NIL) (trivial-json-codec:deserialize-raw *address-1*) ((:NAME "George Washington") (:BIRTHDAY "February 22, 1732") (:ADDRESS "Mount Vernon, Virginia, United States"))
The second is deserialize-json. This function takes a JSON string, a CL class, a read-table and constructors. The last three default to nil. If you do not supply a CL class, deserialize-json can only handle native types:
(trivial-json-codec:deserialize-json "[1.23,\"alpha\",false,null]") #(1.23 "alpha" NIL NIL)
If you do supply a class, it will try to create an instance of the class with data from the JSON string. Let's try with our simple person class which only has slots name and eye-colour.
(describe (trivial-json-codec:deserialize-json "{\"name\":\"Rebecca\",\"eye-colour\":\"blue\"}" :class (find-class 'person))) #<PERSON {1010770303}> [standard-object] Slots with :INSTANCE allocation: NAME = "Rebecca" EYE-COLOUR = "blue"
To no one's surprise, if the JSON string is an array, it will just return a vector. If the array contains the proper JSON objects, it will return a vector of class objects:
(describe (aref (trivial-json-codec:deserialize-json "[{\"name\":\"Rebecca\",\"eye-colour\":\"blue\"}, {\"name\":\"Johann\",\"eye-colour\":\"brown\"}]" :class (find-class 'person)) 1)) #<PERSON {1011E67053}> [standard-object] Slots with :INSTANCE allocation: NAME = "Johann" EYE-COLOUR = "brown"
If you have a class hierarchy, it will work through the inherited classes. Each class will need to have at least one slot named differently.
Encoding
Trivial-json-codec has two functions for encoding: serialize and serialize-json. Serialize is the underlying generic function and takes a stream parameter as well as the data. Serialize-json uses serialize and returns a string.
It does encode CLOS objects although you might want to write specialized methods to handle particular formatting. For example, local-time:timestamps were encoded as {"DAY": 7990, "SEC": 0, "NSEC": 0}, so if you wanted them to be in any particular javascript time representation, you would have to write a specialized method.
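Using the little person class defined in the st-json section above, encoding looks something like this (the exact whitespace and upper-case slot names are my guesses based on the README examples below):
(trivial-json-codec:serialize-json (make-instance 'person)) => "{ \"NAME\" : \"Sabra\", \"EYE-COLOUR\" : \"brown\"}"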
You would also need to write applicable methods for any structs, pathnames or hash-tables.
As mentioned above, encoding lists returns strings with angle brackets.
Hierarchy Use Case
I am just going to quote from the README here:
(defclass Payload () ()) (defclass SimplePayload (Payload) ((value :type integer :initarg :value))) (defclass ComplicatedPayload (Payload) ((value :type string :initarg :value) (additional-info :type string :initarg :additional-info) (message-id :type trivial-utilities:positive-fixnum :initarg :message-id))) (defclass DifferentPayload (Payload) ((cargo :type fixnum :initarg :cargo))) (defclass Message () ((uid :initarg :uid :initform nil :accessor uid) (payload :type (or null Payload) :initarg :payload :accessor payload))) (c2mop:ensure-finalized (find-class 'Payload)) (c2mop:ensure-finalized (find-class 'SimplePayload)) (c2mop:ensure-finalized (find-class 'ComplicatedPayload)) (c2mop:ensure-finalized (find-class 'DifferentPayload)) (c2mop:ensure-finalized (find-class 'Message)) (let ((message (make-instance 'Message :uid 1 :payload (make-instance 'Simplepayload :value 12345)))) (trivial-json-codec:serialize-json message)) => "{ \"UID\" : 1, \"PAYLOAD\" : { \"VALUE\" : 12345}}" (deserialize-json "{ \"UID\" : 1, \"PAYLOAD\" : { \"VALUE\" : 12345}}" :class (find-class 'Message)) => #<MESSAGE> with a payload of type SimplePayload (let ((message (make-instance 'Message :uid 2 :payload (make-instance 'ComplicatedPayload :value "abc" :message-id 17 :additional-info "1234")))) (trivial-json-codec:serialize-json message)) => "{ \"UID\" : 2, \"PAYLOAD\" : { \"VALUE\" : \"abc\", \"ADDITIONAL-INFO\" : \"1234\", \"MESSAGE-ID\" : 17}}" (deserialize-json "{ \"UID\" : 2, \"PAYLOAD\" : { \"VALUE\" : \"abc\", \"ADDITIONAL-INFO\" : \"1234\", \"MESSAGE-ID\" : 17}}" :class (find-class 'Message)) => #<MESSAGE> with a payload of type ComplicatedPayload (let ((message (make-instance 'Message :uid 2 :payload (make-instance 'DifferentPayload :cargo -147)))) (trivial-json-codec:serialize-json message)) => "{ \"UID\" : 2, \"PAYLOAD\" : { \"CARGO\" : -147}}" (deserialize-json "{ \"UID\" : 2, \"PAYLOAD\" : { \"CARGO\" : -147}}" :class (find-class 'Message)) => #<MESSAGE> with a payload of type DifferentPayload
Due to the known limitation mentioned in the description, the following is NOT possible:
(defclass StringPayload (Payload) ((value :type string :initarg :value))) (let ((message (make-instance 'Message :uid 2 :payload (make-instance 'StringPayload :value "abc")))) (trivial-json-codec:serialize-json message)) => "{ \"UID\" : 2, \"PAYLOAD\" : { \"VALUE" : \"abc\"}}" (deserialize-json "{ \"UID\" : 2, \"PAYLOAD\" : { \"VALUE" : \"abc\"}}" :class (find-class 'Message)) => This terminates with an error due to non-unique class mapping. StringPayload and Simplepayload differ only on the slot's type.
Symmetry
From the standpoint of symmetry or round-tripping, going from JSON->CL->JSON, trivial-json-codec continued its insistence that JSON arrays should be shown as angle brackets. trivial-json-codec is really intended as a parser (one way) from JSON to CL, not really serializing to JSON.
Interestingly, trivial-json-codec had no problems going from CL->JSON->CL.
Security
There were two general security issues we considered: (1) interning keywords and (2) issues with malformed data. trivial-json-codec did not exhibit the first issue.
With respect to the second issue, trivial-json-codec rejected 101 out of 173 malformed test cases. It did not trigger the stack exhaustion issue exhibited by most of the other packages.
Conformity with JSON Standard
trivial-json-codec accepted 90 of the 95 test cases that are considered "must accept". It failed to accept: [[] ], [0e+1], [1E+2], [1e+2], { "min": -1.0e+28, "max": 1.0e+28 }. If *read-default-float-format* is set to 'single-float, it would refuse to accept: [123e65], [123e45] and [123.456e78].
It accepted 13 of the 17 test cases considered to be part of the gray area of the JSON specification - you could accept or reject.
Benchmarking
See Benchmarking
trivial-json-codec exported symbols
- serialize-json obj - Takes obj and serializes it into a string. Uses the generic serialize to do the job.
- deserialize-json json-str &key (class nil) (read-table nil) (constructors nil) - Reads json-str and creates a corresponding object. If class is non-nil and represents a class, an instance of it is returned; otherwise only built-in types can be deserialized. Read-table makes it possible to inject specific readers, as counterparts to serialize; it has the form of an alist containing the dispatch character as car and the deserialization function as cdr. Constructors holds an alist mapping the keyword returned by a specific reader to an object construction function.
- serialize obj stream - Serializes an object obj into stream. Implementations for some built-in types already exist; the user can extend it with methods for specific types.
- deserialize-raw json-str &key (read-table nil) - Deserializes json-str into a property list. As opposed to deserialize-json, this function does not require a base class to deserialize.
yason
Library | Author | License | Website |
---|---|---|---|
yason | Hans Huebner | BSD | https://github.com/phmarek/yason |
IMPORTANT: Notice the GitHub location has moved. Hans Huebner's old GitHub location will automatically redirect to Phil Marek's, but Quicklisp is not (as of the time of this writing) pulling code from the new GitHub location. It is actively maintained by Phil Marek and he is very responsive.
From the author: "the major difference between YASON and the other JSON libraries that were available when I wrote it is that YASON does not require the user to come up with a Lisp data structure that reflects the JSON data structure that should be generated. Rather, I wanted a way to generate JSON directly from my internal data structures.
The reason for that desire was that I had to generate different JSON format in different contexts. That is, a class instance would sometimes be generated including all its attributes, sometimes just with a select set of attributes and sometimes as a reference. Thus, there was no right way to render an object as JSON, and I found the approach to first generate a data structure that would then be rendered as JSON to be wasteful and, as CL has no literal syntax for hash tables, ugly.
Instead of going through an intermediate data structure, YASON allows you to encode to a JSON stream on the fly (http://common-lisp.net/project/yason/#stream-encoder)."
Mapping DataTypes and Structures
JSON | | CL | Notes |
---|---|---|---|
object | -> | hash-table :test #'equal | Keys are strings by default (see *parse-object-key-fn*). Set *parse-object-as* to :alist in order to have yason parse objects as alists or to :plist to parse them as plists. When using plists, you probably want to also set *parse-object-key-fn* to a function that interns the object's keys to symbols. |
array | -> | list | Can be changed to read to vectors (see *parse-json-arrays-as-vectors*) |
string | <-> | string | JSON escape characters are recognized upon reading. Upon writing, known escape characters are used, but non-ASCII Unicode characters are written as is. |
number | <-> | number | Parsed with READ, printed with PRINC |
true | <-> | t | Can be changed to read as TRUE (see *parse-json-booleans-as-symbols*). |
false | <-> | nil | Can be changed to read as FALSE (see *parse-json-booleans-as-symbols*). |
null | <-> | nil or :NULL | nil unless yason:parse is called with yason:*parse-json-null-as-keyword* set to t or passing the keyword parameter :json-nulls-as-keyword t to yason:parse |
(1) | <- | symbol | |
(2) | <- | object | |
(3) | <- | alists | |
- (1) If you are working with yason version 0.8.2 or above, you can set the special variable *symbol-encoder* to the function #'YASON:ENCODE-SYMBOL-AS-LOWERCASE. In that case symbols will be encoded as lower case strings in JSON. Otherwise attempts to encode symbols will trigger a no-applicable-method error.
- (2) Will trigger a no-applicable-method error. So you could write class specific encoding methods for your objects.
- (3) Will return an object. Undotted alists will return values in an array, dotted alists will return individual values for each key. See Yason Encoding Alists below.
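As a quick illustration of the null and boolean rows above, here is a minimal sketch (return values shown after =>, assuming yason's default settings):
(yason:parse "[null, true, false]") => (NIL T NIL)
(yason:parse "[null]" :json-nulls-as-keyword t) => (:NULL)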
Decoding
Yason uses its generic function parse to generate a hash table from the received JSON object. It is possible to set the special variable *parse-object-as* to :hash-table, :plist or :alist to specify the data structure that objects are parsed into. The default is :hash-table with test equal. It is also possible to set the special variable *parse-json-arrays-as-vectors* to t, in which case JSON arrays will be parsed as vectors and not as lists.
The following are decoding tests on JSON arrays, arrays in arrays and embedded arrays in an object and the results. I am using the additional yason parameters to show the alist representation instead of a hash-table and to return JSON arrays as vectors rather than lists.
(defparameter *short-encoded-items-A*
  "[\"items\", {\"index\":1, \"float\":19.2041, \"name\":\"Jennifer\", \"surname\":\"Snow7\", \"fullname\":\"Andrew4 Vaughan\",\"email\":\"sherri@ritchie.zw\", \"bool\":null}, {\"index\":2, \"float\":14.9888,\"name\":\"Alfred\", \"surname\":\"Pitts\",\"fullname\":\"Barry Weiner\", \"email\":\"cheryl@craven.re\",\"bool\":null}]")
(yason:parse *short-encoded-items-A* :json-arrays-as-vectors t :object-as :alist)
'("items"
  (("bool") ("email" . "sherri@ritchie.zw") ("fullname" . "Andrew Vaughan") ("surname" . "Snow7") ("name" . "Jennifer") ("float" . 19.2041) ("index" . 1))
  (("bool") ("email" . "cheryl@craven.re") ("fullname" . "Barry Weiner") ("surname" . "Pitts") ("name" . "Alfred") ("float" . 14.9888) ("index" . 2)))
Nested JSON Objects
Yason does require that you completely parse the JSON data instead of being able to filter it while taking it in. Taking the nested JSON object below, how could we get information out of the innermost nested object?
{ "items": [ { "index": 1, "integer": 29, "float": 16.8278, "fullname": "Milton Jensen", "bool": false } ] }
You need to know your data structure so that you can figure out how to walk the tree. How would we get the value of the key "integer"? Looking at it, it is a JSON object whose key "items" contains an array, which in turn contains a JSON object.
By default, yason decodes JSON objects to hash-tables and arrays into lists. So we can descend the parsed JSON tree in this particular example something like this (assuming the JSON object was in a file named json4.txt):
(gethash "integer"
         (first (gethash "items"
                         (yason:parse (alexandria:read-file-into-string #P"/home/sabra/json4.txt")))))
29
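If you prefer alists, here is a hedged sketch of the same lookup with :object-as :alist; note that assoc needs an explicit :test because the keys are strings:
(let ((parsed (yason:parse (alexandria:read-file-into-string #P"/home/sabra/json4.txt") :object-as :alist)))
  (cdr (assoc "integer"
              (first (cdr (assoc "items" parsed :test #'string=)))
              :test #'string=)))
29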
Encoding
The base yason encoding function is encode, which takes some data and an optional stream, defaulting to *standard-output*. It writes the JSON-encoded item to the stream and returns the lisp item, so at the REPL you see the printed JSON followed by the returned value:
(yason:encode '("a" 1 "b" 2)) ["a",1,"b",2] ("a" 1 "b" 2)
Yason has multiple specialized methods that may be required, depending on the data structure to be encoded. For example: yason:encode-alist, yason:encode-plist, yason:encode-object, etc. The following demonstrates that providing a plist to yason:encode results in a JSON array, while providing a plist to yason:encode-plist results in a JSON object.
(yason:encode '("a" 1 "b" 2)) ["a",1,"b",2] (yason:encode-plist '("a" 1 "b" 2)) {"a":1,"b":2}
Encoding Symbols
The version of yason in Quicklisp as of this writing (version 0.7.8) has no applicable method for encoding a symbol. This also means that, by default, yason will not encode lists containing symbols. That is resolved in version 0.8.2 on GitHub, but for some reason it is not getting picked up in Quicklisp. You can resolve this in version 0.7.8 by writing a new method to handle symbols. Such a method could look like this:
(defmethod encode ((object symbol) &optional (stream *json-output*))
  (let ((new (funcall #'ENCODE-SYMBOL-AS-LOWERCASE object)))
    (assert (stringp new))
    (encode new stream)))
If you have symbols in alists as keys, you can (setf yason:*symbol-key-encoder* 'yason:ENCODE-SYMBOL-AS-LOWERCASE) and then yason:encode-alist will work, but not yason:encode.
(yason:encode-alist '((a 1) (b 2))) {"a":[1],"b":[2]} ((A 1) (B 2))
Encoding Alists
Unlike some other libraries, yason's encode-alist function handles both dotted and undotted alists. It will return an object. The values in undotted alists will be returned as arrays; the values in dotted alists will be returned as individual values. As mentioned just above in dealing with symbols, if you have symbols as keys in your alist, you need to set *symbol-key-encoder* to 'yason:ENCODE-SYMBOL-AS-LOWERCASE in order to make them usable without triggering errors.
(setf yason:*symbol-key-encoder* 'yason:ENCODE-SYMBOL-AS-LOWERCASE)
(yason:encode-alist '((a 1 2 3 4) (b 5 6 7)))
{"a":[1,2,3,4],"b":[5,6,7]}
(yason:encode-alist '((a . 1) (b . 2)))
{"a":1,"b":2}
Encoding CLOS
For objects, you will have to write your own methods to extend yason to encode them, using the generic functions encode-object and encode-slots or encode-object-slots. Taking our simple person class, we can define a new encode-slots method; then, within an output context, we can call encode-object and output the CLOS class instance.
(defmethod yason:encode-slots progn ((person person))
  (yason:encode-object-slots person '(name eye-colour)))
(yason:with-output-to-string* ()
  (yason:encode-object (make-instance 'person)))
"{\"NAME\":\"Sabra\",\"EYE-COLOUR\":\"brown\"}"
Or you can take a look at json-mop or herodotus. I have a preference for json-mop with the caveat that, at the moment, you cannot redefine classes.
Incremental Encoding
The following examples use two exercises. First, incrementally build a JSON array. Second, incrementally build a JSON object which also contains an incrementally built JSON array.
(yason:with-output (*standard-output*)
  (yason:with-array ()
    (dotimes (i 3)
      (yason:encode-array-element i))))
[0,1,2]
(yason:with-output (*standard-output*)
  (yason:with-object ()
    (yason:encode-object-element "hello" "hu hu")
    (yason:with-object-element ("harr")
      (yason:with-array ()
        (dotimes (i 3)
          (yason:encode-array-element i))))))
{"hello":"hu hu","harr":[0,1,2]}
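The same incremental interface can build a string instead of writing to a stream. A small sketch using with-output-to-string* and encode-array-elements:
(yason:with-output-to-string* ()
  (yason:with-array ()
    (yason:encode-array-elements 1 2 3)))
"[1,2,3]"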
Symmetry
From the standpoint of symmetry or round-tripping, going from JSON->CL->JSON, yason had no issues. Going from CL->JSON->CL was more problematic solely because keyword symbols were written to strings and :NULL was converted to nil.
Security
There were two general security issues we considered: (1) interning keywords and (2) issues with malformed data. yason did not exhibit the first issue.
With respect to the second issue, yason rejected 119 out of 173 malformed test cases. Unfortunately it did trigger the stack exhaustion issue exhibited by most of the other packages.
Conformity with JSON Standard
yason accepted 95 of the 95 test cases that are considered "must accept". If *read-default-float-format* is set to 'single-float, it would refuse to accept: [123e65], [123e45] and [123.456e78].
It accepted 8 of the 17 test cases considered to be part of the gray area of the JSON specification - you could accept or reject.
Benchmarking
See Benchmarking
yason exported symbols
- *list-encoder* - function to call to translate a cl list into JSON data. 'yason:encode-plain-list-to-array is the default; 'yason:encode-plist and 'yason:encode-alist are available to produce JSON objects. This is useful to translate a deeply recursive structure in a single yason:encode call (see the sketch after this list).
- *parse-json-arrays-as-vectors* - if set to a true value, JSON arrays will be parsed as vectors, not as lists. nil is the default.
- *parse-json-booleans-as-symbols* - if set to a true value, JSON booleans will be read as the symbols true and false instead of t and nil, respectively. nil is the default.
- *parse-json-null-as-keyword* - if set to a true value, JSON null will be read as the keyword :null, instead of nil. nil is the default.
- *parse-object-as* - can be set to :hash-table to parse objects as hash tables, :alist to parse them as alists or :plist to parse them as plists. :hash-table is the default.
- *parse-object-as-alist*
- *parse-object-key-fn* - function to call to convert a key string in a JSON object to a key in the cl hash produced. identity is the default.
- *symbol-encoder* - warning: as of the time of this writing, this variable is in version 0.8.2 on GitHub and not in version 0.7.8 in Quicklisp. function to call to translate a cl symbol into a JSON string. the default is to error out, to provide backwards-compatible behaviour. a useful function that can be bound to this variable is yason:encode-symbol-as-lowercase
- *symbol-key-encoder* - defines the policy to encode symbols as keys (e.g. in hash tables). the default is to error out, to provide backwards-compatible behaviour. a useful function that can be bound to this variable is yason:encode-symbol-as-lowercase
- encode - encode object in JSON format and write to stream. may be specialized by applications to perform specific rendering. stream defaults to *standard-output*.
- encode-alist - encodes object, an alist, in JSON format and write to stream.
- encode-array-element - encode object as next array element to the last JSON array opened with with-array in the dynamic context. object is encoded using the encode generic function, so it must be of a type for which an encode method is defined.
- encode-array-elements - encode objects, a series of JSON encodable objects, as the next array elements in a JSON array opened with with-array. encode-array-elements uses encode-array-element, which must be applicable to each object in the list (i.e. encode must be defined for each object type). additionally, this must be called within a valid stream context.
- encode-object - generic function to encode an object. the default implementation opens a new object encoding context and calls encode-slots on the argument.
- encode-object-element - encode key and value as object element to the last JSON object opened with with-object in the dynamic context. key and value are encoded using the encode generic function, so they both must be of a type for which an encode method is defined.
- encode-object-elements - encodes the parameters into JSON in the last object opened with with-object using encode-object-element. the parameters should consist of alternating key/value pairs, and this must be called within a valid stream context.
- encode-object-slots - encodes each slot in slots for object in the last object opened with with-object using encode-object-element. the key is the slot name, and the value is the slot value for the slot on object.
- encode-plain-list-to-array
- encode-plist - encodes object, a plist, in JSON format and write to stream.
- encode-slots - generic function to encode object slots. there is no default implementation. it should be called in an object encoding context. it uses progn method combination with most-specific-last order, so that base class slots are encoded before derived class slots.
- encode-symbol-as-lowercase
- false
- make-json-output-stream - creates a json-output-stream instance that wraps the supplied stream and optionally performs indentation of the generated JSON data. the indent argument is described in with-output. note that if the indent argument is nil, the original stream is returned in order to avoid the performance penalty of the indentation algorithm.
- no-json-output-context - this condition is signalled when one of the stream encoding functions is used outside the dynamic context of a with-output or with-output-to-string* body.
- null
- parse - input &key (object-key-fn *parse-object-key-fn*) (object-as *parse-object-as*) (json-arrays-as-vectors *parse-json-arrays-as-vectors*) (json-booleans-as-symbols *parse-json-booleans-as-symbols*) (json-nulls-as-keyword *parse-json-null-as-keyword*) => object - parses input, which must be a string or a stream, as JSON, and returns the lisp representation of the JSON structure parsed. the keyword arguments object-key-fn, object-as, json-arrays-as-vectors, json-booleans-as-symbols, and json-nulls-as-keyword may be used to specify different values for the parsing parameters from the current bindings of the respective special variables.
- true
- with-array - open a JSON array, then run body. inside the body, encode-array-element must be called to encode elements to the opened array. must be called within an existing JSON encoder context (see with-output and with-output-to-string*)
- with-object - open a JSON object, then run body. inside the body, encode-object-element or with-object-element must be called to encode elements to the object. must be called within an existing JSON encoder with-output and with-output-to-string*.
- with-object-element - open a new encoding context to encode a JSON object element. key is the key of the element. the value will be whatever body serializes to the current JSON output context using one of the stream encoding functions. this can be used to stream out nested object structures.
- with-output - macro: set up a JSON streaming encoder context on stream, then evaluate body. indent can be set to t to enable indentation with a default indentation width or to an integer specifying the desired indentation width. by default, indentation is switched off.
- with-output-to-string* - set up a JSON streaming encoder context on stream-symbol (by default a gensym), then evaluate body. return a string with the generated JSON output. see with-output for the description of the indent keyword argument.
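As promised in the *list-encoder* entry above, here is a hedged sketch of it in use, assuming the GitHub (0.8.2+) version that exports *list-encoder* and *symbol-key-encoder*. Binding *list-encoder* to yason:encode-alist makes a plain yason:encode call emit JSON objects for alists; *symbol-key-encoder* is bound as well because the keys here are symbols:
(let ((yason:*list-encoder* #'yason:encode-alist)
      (yason:*symbol-key-encoder* #'yason:encode-symbol-as-lowercase))
  (yason:encode '((:name . "Sabra") (:eye-colour . "brown"))))
{"name":"Sabra","eye-colour":"brown"}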
Helper Libraries
cl-json-helper
Library | Author | License | Website | Works With | Comments |
---|---|---|---|---|---|
cl-json-helper | Bob Felts | BSD | homepage | cl-json | Last updated 2018 |
cl-json-helper is a very small package that adds two functions to help cl-json encode JSON data and one function to help in processing decoded JSON data. I will use its nickname :xjson in the examples. At the end of the day, I see a slight benefit here, but given that it is really just a cl-json helper and the latest generation of JSON libraries have passed cl-json by, I think this is only for cl-json devotees.
cl-json-helper exported symbols
- json-bool - (json-bool val) returns an object that cl-json will decode to "true" or "false"
- json-empty - (json-empty) returns an object that cl-json will decode to '{}'
- json-key-value - (json-key-value key list) returns the value associated with key
- value-of
Json-bool
json-bool simply adds the ability for nil to be translated to JSON as false.
(cl-json:encode-json-to-string '(t nil)) "[true,null]" (cl-json:encode-json-to-string `(,(xjson:json-bool t) ,(xjson:json-bool nil))) "[true,false]"
json-empty
json-empty provides the ability to return an empty object. E.g.
(cl-json:encode-json-to-string `(("Empty" . ,(xjson:json-empty)))) "{\"Empty\":{}}"
Json-key-value
If the result of cl-json:decode-json-from-string on *nested-address-1* looks like this:
(cl-json:decode-json-from-string *nested-address-1*) ((:FIRST--NAME . "George") (:LAST--NAME . "Washington") (:BIRTHDAY . "1732-02-22") (:ADDRESS (:STREET--ADDRESS . "3200 Mount Vernon Memorial Highway") (:CITY . "Mount Vernon") (:STATE . "Virginia") (:COUNTRY . "United States")))
Then you can access the address using cl-json-helper
(xjson:json-key-value :address (cl-json:decode-json-from-string *nested-address-1*)) ((:STREET--ADDRESS . "3200 Mount Vernon Memorial Highway") (:CITY . "Mount Vernon") (:STATE . "Virginia") (:COUNTRY . "United States"))
The only difference between using this and using (assoc ..) is that assoc will also keep the key:
(assoc :address (cl-json:decode-json-from-string *nested-address-1*)) (:ADDRESS (:STREET--ADDRESS . "3200 Mount Vernon Memorial Highway") (:CITY . "Mount Vernon") (:STATE . "Virginia") (:COUNTRY . "United States"))
define-json-expander
Library | Author | License | Website | Works With | Comments |
---|---|---|---|---|---|
define-json-expander | Johan Sjölén | MIT | homepage | cl-json | Last updated 2014 |
Define-json-expander is, I think, something interesting to look at, but I do not see it adding a great deal of value. Given the last time it was updated was eight years ago, I think it is of historical interest only.
define-json-expander exported symbols
- *accessor-prefix*
- define-json-expander
Explanation
Let's take our simplest JSON object address example, which we keep in *address-1*:
"{ \"name\": \"George Washington\", \"birthday\": \"February 22, 1732\", \"address\": \"Mount Vernon, Virginia, United States\" }"
We now use define-json-expander to create an address class that can be used:
(define-json-expander:define-json-expander address-class ()
((name) (birthday) (address)))
DECODE-ADDRESS-CLASS
Now we have a decode-address-class function that we can apply to a cl-json decoded JSON string:
(decode-address-class (cl-json:decode-json-from-string *address-1*))
(describe (decode-address-class (cl-json:decode-json-from-string *address-1*)))
#<ADDRESS-CLASS {100D67F9E3}> [standard-object]
Slots with :INSTANCE allocation:
  REST = NIL
  NAME = "George Washington"
  BIRTHDAY = "February 22, 1732"
  ADDRESS = "Mount Vernon, Virginia, United States"
HOWEVER, as far as I can tell, the only accessor or reader method defined is for REST, which has a nil value. Net result: even after reading the docs and the test cases in the source files, I'm confused about whether this is fully baked. Given that the last update was eight years ago, I would give this a pass.
herodotus
Library | Author | License | Website | Works With | Comments |
---|---|---|---|---|---|
herodotus | Henry Steere | BSD (1) | homepage | yason | Wrapper around yason to handle CLOS classes more easily. Last updated 16 Jun 2021 |
Yason does the JSON parsing and serialisation for herodotus. The other dependencies are cl-ppcre and alexandria. I think it may be useful if you are a yason user.
herodotus exported symbols
- define-json-model
- to-json
Setup
Let's make an analogue of our simple person class with herodotus:
(herodotus:define-json-model herodotus-person (name eye-colour) :snake-case)
This creates a package named HERODOTUS-PERSON-JSON with a function for parsing JSON, HERODOTUS-PERSON-JSON:FROM-JSON, and a generic method for writing to JSON. It has accessors and initargs of name and eye-colour. The created class only has accessor and initarg specs for the slots; it does not have the ability to create default values. The last argument specifies that it expects JSON keys to use snake case. (The default is :camel-case; the other options are :snake-case, :kebab-case or :screaming-snake-case.)
From the README: "The define-json-model macro takes three arguments: name, slots and an optional argument for case-type. The name argument is the name of the generated CLOS class. The slots argument is a collection of slot descriptors and the case-type argument is a keyword.
Slot descriptors can be either symbols or lists. If a slot descriptor is a symbol then the value of the corresponding CLOS slot will be a deserialised JSON primitive in lisp form: a number, boolean, string, vector (for arrays), or hash-table (for objects).
If a slot descriptor is a list then first argument is the CLOS slot name, the second argument is either () or the name of a previously defined JSON model to deserialise the value of this field to. The optional third argument is a special case name for this field which can have custom formatting."
Decoding
At this point a simple JSON object could then be decoded into a herodotus-person instance.
(eye-colour (herodotus-person-json:from-json "{\"name\":\"Claudia\",\"eye_colour\":\"blue\"}")) "blue"
If the JSON object key was "eye-colour" instead of "eye_colour", the return value would have been nil because the case would have been wrong.
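To make that concrete, a sketch of the mismatch just described (the slot simply ends up nil):
(eye-colour (herodotus-person-json:from-json "{\"name\":\"Claudia\",\"eye-colour\":\"blue\"}"))
NIL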
If you have an array of JSON objects, the from-json function would return a vector of herodotus-person objects:
(name (aref (herodotus-person-json:from-json "[{\"name\":\"Rebecca\",\"eye_colour\":\"blue\"},{\"name\":\"Johann\",\"eye_colour\":\"brown\"}]") 1)) "Johann"
Encoding
Encoding usage looks much like yason:
(herodotus:to-json (make-instance 'herodotus-person :eye-colour "Green" :name "Persephone")) "{\"name\":\"Persephone\",\"eye_colour\":\"Green\"}"
Additional Info
From the README
Nested classes
You can also define classes that have class members using the type specifier syntax. This block defines two JSON models, tree and branch. A tree has branch members, and the branch members will be parsed from JSON using the parser defined for branch.
CL-USER> (herodotus:define-json-model branch (size)) CL-USER> (herodotus:define-json-model tree ((branches branch)))
The syntax (branches branch) declares that the field named branches must be parsed as the type branch. JSON models for nested classes need to be defined before the models for the classes they are nested in or an error will be thrown. The error is thrown at macro expansion time.
CL-USER> (herodotus:define-json-model test-no-parser ((things not-parseable))) CL-USER> (herodotus:define-json-model test-no-parser ((things not-parseable))) class-name TEST-NO-PARSER slots ((THINGS NOT-PARSEABLE)) ; Evaluation aborted on #<SIMPLE-ERROR "Could not find parser for ; class NOT-PARSEABLE. Please define a json model for it." ; {100599D903}>.
None, one or many semantics
Fields in class definitions are parsed as either nil (if missing from the JSON), a single instance if the field is not an array and isn’t empty or a vector if the JSON contains an array of elements.
CL-USER> (herodotus:define-json-model numbers (ns))
CL-USER> (ns (numbers-json:from-json "{ }"))
NIL
CL-USER> (ns (numbers-json:from-json "{ \"ns\": 1 }"))
1
CL-USER> (ns (numbers-json:from-json "{ \"ns\": [1, 2, 3] }"))
#(1 2 3)
Special case field names
Parsing specific field names can be done using the third argument of a field specifier. If a special field name is provided it doesn’t have to match the name of the slot in the CLOS class and can use any formatting convention.
(herodotus:define-json-model special-case ((unusual-format () "A_very-UniqueNAME")))
(unusual-format (special-case-json:from-json "{ \"A_very-UniqueNAME\": \"Phineas Fog\" }"))
"Phineas Fog"
(herodotus:to-json (special-case-json:from-json "{ \"A_very-UniqueNAME\": \"Phineas Fog\" }"))
"{\"A_very-UniqueNAME\":\"Phineas Fog\"}"
json-mop
Library | Author | License | Website | Works With | Comments |
---|---|---|---|---|---|
json-mop | Grim Schjetne | MIT | homepage | yason | Last updated 9 Mar 2021 |
Unlike herodotus which creates a new package, json-mop wants you to define your classes with the class option
(:metaclass json-serializable-class)
I quite like json-mop (assuming I was using yason), but you do need to watch for one issue: At the moment you cannot redefine a class. The issue has been flagged twice on GitHub and I ran into it in testing.
For slots that you want to appear in the JSON representation of your class, add the slot option :json-key with the string to use as the attribute name. The option :json-type defaults to :any, but you can control how each slot value is transformed to and from JSON with one of the following JSON type specifiers:
Type | Remarks |
---|---|
:any | Guesses the way to encode and decode the value |
:string | Enforces a string value |
:number | Enforces a number value |
:hash-table | Enforces a hash table value |
:vector | Enforces a vector value |
:list | Enforces a list value |
:bool | Maps T and NIL with true and false |
<symbol> | Uses a (:metaclass json-serializable-class) class definition to direct the transformation of the value |
So, taking our minimal person object and following these instructions:
(defclass mop-person ()
  ((name :initarg :name :initform "Sabra" :json-key "name" :json-type :string :accessor name)
   (eye-colour :initarg :eye-colour :json-key "eye-colour" :json-type :string :initform "brown" :accessor eye-colour))
  (:metaclass json-mop:json-serializable-class))
#<JSON-MOP:JSON-SERIALIZABLE-CLASS JSON-TESTS::MOP-PERSON>
(json-mop:encode (make-instance 'mop-person))
{"name":"Sabra","eye-colour":"brown"}
We can get a CLOS instance out of an appropriate json object with (json-to-clos data class-name)
(describe (json-mop:json-to-clos "{\"name\":\"Karla\",\"eye-colour\":\"green\"}" 'mop-person))
#<MOP-PERSON {1009BE24B3}> [standard-object]
Slots with :INSTANCE allocation:
  NAME = "Karla"
  EYE-COLOUR = "green"
It gets interesting when you create an instance with an invalid data type. Suppose we create a mop-person with a list for a name and an integer for an eye colour. The instance gets created that way, but when we try to encode it, we get an error:
(json-mop:encode (make-instance 'mop-person :name '(1 2 3) :eye-colour 12)) { There is no class named :STRING. [Condition of type SB-PCL:CLASS-NOT-FOUND-ERROR]
If the JSON object actually has a null value for eye-colour, that slot is treated as if it had not been supplied. Our mop-person class has an initform of "brown" for eye-colour, so the initform overrules the JSON null:
(describe (json-mop:json-to-clos "{\"name\":\"Andre\",\"eye-colour\":null}" 'mop-person))
#<MOP-PERSON {1009C0C693}> [standard-object]
Slots with :INSTANCE allocation:
  NAME = "Andre"
  EYE-COLOUR = "brown"
The README has an example of nested objects:
(defclass book ()
  ((title :initarg :title :json-type :string :json-key "title")
   (published-year :initarg :year :json-type :number :json-key "year_published")
   (fiction :initarg :fiction :json-type :bool :json-key "is_fiction"))
  (:metaclass json-serializable-class))

(defclass author ()
  ((name :initarg :name :json-type :string :json-key "name")
   (birth-year :initarg :year :json-type :number :json-key "year_birth")
   (bibliography :initarg :bibliography :json-type (:list book) :json-key "bibliography"))
  (:metaclass json-serializable-class))

(defparameter *author*
  (make-instance 'author
                 :name "Mark Twain"
                 :year 1835
                 :bibliography (list (make-instance 'book :title "The Gilded Age: A Tale of Today" :year 1873 :fiction t)
                                     (make-instance 'book :title "Life on the Mississippi" :year 1883 :fiction nil)
                                     (make-instance 'book :title "Adventures of Huckleberry Finn" :year 1884 :fiction t))))
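Encoding the nested instance then writes the books as JSON objects nested inside the author object. A hedged sketch of what that output looks like (key order follows the slot order in the class definitions and may vary):
(json-mop:encode *author*)
{"name":"Mark Twain","year_birth":1835,"bibliography":[{"title":"The Gilded Age: A Tale of Today","year_published":1873,"is_fiction":true},{"title":"Life on the Mississippi","year_published":1883,"is_fiction":false},{"title":"Adventures of Huckleberry Finn","year_published":1884,"is_fiction":true}]}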
json-mop exported symbols
- json-mop:encode
- json-mop:json-serializable
- json-mop:json-serializable-class
- json-mop:json-to-clos
- json-mop:json-type
- json-mop:json-type-error
- json-mop:no-values-class
- json-mop:no-values-hash-table
- json-mop:no-values-parsed
- json-mop:null-in-homogenous-sequence
- json-mop:null-value
- json-mop:slot-name
- json-mop:slot-not-serializable
- json-mop:to-json-value
- json-mop:to-lisp-value
cl-json-pointer
Library | Author | License | Website | Works With | Comments |
---|---|---|---|---|---|
cl-json-pointer | Yokota Yuki | MIT | homepage | cl-json, st-json, yason, jsown, jonathan, json-streams, com.gigamonkeys.json (1) | 40ants comments |
- (1) There is an outstanding issue asking for com.inuoe.jzon support opened on May 14, 2021, but no action.
I am going to borrow 40ants review here:
This library implements RFC 6901 - a format for accessing nested JSON data structures. In some sense, JSON pointer is similar to JSON path, but more suitable for use as part of a URL fragment.
cl-json-pointer's README provides many examples, but all of them are applied to an object which is almost flat. Let's try to reproduce an example from the JSON path site:
{
  "firstName": "John",
  "lastName": "doe",
  "age": 26,
  "address": {
    "streetAddress": "naist street",
    "city": "Nara",
    "postalCode": "630-0192"
  },
  "phoneNumbers": [
    { "type": "iPhone", "number": "0123-4567-8888" },
    { "type": "home", "number": "0123-4567-8910" }
  ]
}
Now we'll translate this JSON path: $.phoneNumbers[0].type into JSON pointer /phoneNumbers/0/type:
(defparameter *obj* (jsown:parse (alexandria:read-file-into-string "data.json")))
(cl-json-pointer:get-by-json-pointer *obj* "/phoneNumbers/0/type" :flavor :jsown)
"iPhone"
("type" . "iPhone")
NIL
It is also possible to add/set/delete elements using cl-json-pointer.
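For example, a hedged sketch using the *obj* defined above (argument order assumed from the docstrings listed below; set-by-json-pointer returns the modified object):
(cl-json-pointer:exists-p-by-json-pointer *obj* "/address/city" :flavor :jsown)
T
(cl-json-pointer:set-by-json-pointer *obj* "/age" 27 :flavor :jsown)
;; => the object with "age" now set to 27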
You will find more examples in the official docs.
Compared to JSON path, the pointer has clearer character escaping rules and is able to work with keys containing dots, slashes and other symbols. But it does not support slicing and some other features of JSON path.
cl-json-pointer exported symbols
- json-pointer-error -
- *json-object-flavor* - Default flavor of JSON library currently used. This value is used for the :FLAVOR argument of exported functions. Currently acceptable values are held by '*cl-json-pointer-supported-json-flavors*.
- *cl-json-pointer-supported-json-flavors* -
- parse-json-pointer - Parses a JSON pointer string into an internal representation.
- get-by-json-pointer - Traverses OBJ with POINTER and returns three values: the found value (`nil' if not found), a generalized boolean saying the existence of the place pointed by POINTER, and NIL.
- exists-p-by-json-pointer - Traverses OBJ with POINTER and returns the existence of the place pointed by POINTER.
- set-by-json-pointer - Traverses OBJ with POINTER, sets VALUE into the pointed place, and returns the modified OBJ
- add-by-json-pointer - Works the same as `set-by-json-pointer', except it tries to make a new list when setting to lists.
- delete-by-json-pointer - Traverses OBJ with POINTER, deletes the pointed place, and returns the modified OBJ
- remove-by-json-pointer - Works the same as `delete-by-json-pointer', except it tries to make a new list when deleting from lists.
- update-by-json-pointer - Modify macro of `set-by-json-pointer'. This sets results of `set-by-json-pointer' to the referred place.
- deletef-by-json-pointer - Modify macro of `delete-by-json-pointer'. This sets results of `delete-by-json-pointer' to the referred place.
cl-json-schema
Library | Author | License | Website | Works With | Comments |
---|---|---|---|---|---|
cl-json-schema | Mark Skilbeck | MIT | homepage | | |
Per https://json-schema.org/, JSON Schema is a vocabulary that allows you to annotate and validate JSON documents. json-schema is an attempt to go from a JSON schema to a CLOS class definition. The intended sequel, https://gitlab.com/Gnuxie/json-schema2/, has not really been developed. The author's description in a reddit post found here was "It is bad. It should only be used to learn and do things better. I don't feel comfortable providing any form of support for this hack. But if you have a directory on the filesystem with some JSON schema in it, you should be able to use the macro json-schema2:define-schema-spec and list the pathname in there. You might get lucky and find everything you need is supported when you view the macroexpansion."
The sum total of documentation for cl-json-schema is the following: The main entrypoint is (cl-json-schema:validate thing schema) where thing is a JSON-compatible value, and schema is a hash-table. (Alternatively, if you prefer, (json-schema:validate thing schema).)
For example
(let ((schema (yason:parse "{ \"type\": \"object\", \"propertyNames\": { \"pattern\": \"^[A-Za-z_][A-Za-z0-9_]*$\" } } ")))
  ;; NEAT!
  (json-schema:validate (yason:parse "{\"_a_proper_token_001\": \"value\"}") schema)
  ;; NO BUENO! Key does not match the required pattern
  (json-schema:validate (yason:parse "{\"001 invalid\": \"value\"}") schema))
Appendix
JSON Refresher
JSON (JavaScript Object Notation) is a lightweight data-interchange format, codified in RFC 8259. JSON has the following data structures:
- object: { "key1": "value1", "key2": "value2" }
- array: [ "first", "second", "third" ]
- number: 42, 3.1415926 (note that there is no separate type for integer or floating point).
- string: "This is a string"
- boolean: true, false
- null: null
JSON objects can be nested, potentially creating the need for a JSON schema. Taking an example from https://json-schema.org/understanding-json-schema/about.html, we can see two JSON objects containing information about a person:
{
  "name": "George Washington",
  "birthday": "February 22, 1732",
  "address": "Mount Vernon, Virginia, United States"
}

{
  "first_name": "George",
  "last_name": "Washington",
  "birthday": "1732-02-22",
  "address": {
    "street_address": "3200 Mount Vernon Memorial Highway",
    "city": "Mount Vernon",
    "state": "Virginia",
    "country": "United States"
  }
}
If you wanted to validate the JSON objects, you would need a schema against which you could check the objects. The first object would fail validation and the second object would pass validation if they were validated against the following schema:
{
  "type": "object",
  "properties": {
    "first_name": { "type": "string" },
    "last_name": { "type": "string" },
    "birthday": { "type": "string", "format": "date" },
    "address": {
      "type": "object",
      "properties": {
        "street_address": { "type": "string" },
        "city": { "type": "string" },
        "state": { "type": "string" },
        "country": { "type": "string" }
      }
    }
  }
}
Decoding Trivial-Benchmark Summary
The following summary table was generated with trivial-benchmark from the libraries parsing the countries.json file (a small 1.2 MB file) downloaded from https://github.com/mledoze/countries/blob/master/countries.json, over a cumulative 100 runs using SBCL version 2.3.2.
Library | Function | User Run Time (Sec) | Bytes Consed |
---|---|---|---|
boost-json | parse | 1.657614 | 418080864 |
cl-json | decode-json-from-string | 4.077347 | 2893155200 |
com.gigamonkeys.json | parse-json | 2.768985 | 1336311024 |
com.inuoe.jzon | parse | 1.330737 | 782284512 |
jonathan | parse | 194.9456 | 1568784712704 |
json-lib | parse | 5.738711 | 3438749120 |
json-streams | json-parse :duplicate-key-check nil | 10.231434 | 1850713696 |
jsown | parse | 1.123939 | 564034448 |
shasht | read-json | 1.993802 | 1586599792 |
st-json | read-json | 2.662635 | 429030832 |
yason | parse | 5.155174 | 1127308544 |
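For reference, summary numbers of this shape can be produced with trivial-benchmark along the following lines (a sketch; the pathname and the choice of jzon's parse are illustrative, not the exact harness used here):
(defparameter *countries-json*
  (alexandria:read-file-into-string #P"countries.json"))
(trivial-benchmark:with-timing (100)
  (com.inuoe.jzon:parse *countries-json*))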
The second summary table shows the same libraries parsing a larger 9.8 MB JSON file (see the per-library detail below), again over 100 runs.
Library | Function | User Run Time (Sec) | Bytes Consed |
---|---|---|---|
boost-json | parse | 15.844047 | 8121764192 |
cl-json | decode-json-from-string | 40.892357 | 28502164224 |
com.gigamonkeys.json | parse-json | 30.002583 | 16455712480 |
com.inuoe.jzon | parse | 7.821162 | 4366975664 |
jonathan | parse | 1325.759500 | 8950139810080 |
json-lib | parse | 58.481285 | 36938784784 |
json-streams | json-parse :duplicate-key-check nil | 80.72323 | 24938589616 |
jsown | parse | 4.989549 | 4919389104 |
shasht | read-json | 17.84457 | 16028649184 |
st-json | read-json | 17.739523 | 7805457392 |
yason | parse | 67.30084 | 13795069104 |
Decoding Trivial-Benchmark Detail By Library
1.2 MB JSON File
Boost-json
- SAMPLES TOTAL MINIMUM MAXIMUM MEDIAN AVERAGE DEVIATION REAL-TIME 100 1.619998 0.013333 0.033334 0.016667 0.0162 0.002539 RUN-TIME 100 1.619314 0.015628 0.033694 0.015754 0.016193 0.002224 USER-RUN-TIME 100 1.619338 0.015629 0.033679 0.015754 0.016193 0.002223 SYSTEM-RUN-TIME 100 0.000049 0 0.000013 0 0.0 0.000002 PAGE-FAULTS 100 0 0 0 0 0 0.0 GC-RUN-TIME 100 11.174 0 6.546 0 0.11174 0.793851 BYTES-CONSED 100 418099808 4117712 4216192 4182816 4180998.0 22944.854 EVAL-CALLS 100 0 0 0 0 0 0.0
Cl-json
- SAMPLES TOTAL MINIMUM MAXIMUM MEDIAN AVERAGE DEVIATION REAL-TIME 100 4.063327 0.036666 0.05 0.04 0.040633 0.003007 RUN-TIME 100 4.063163 0.039048 0.049219 0.039521 0.040632 0.002733 USER-RUN-TIME 100 4.046636 0.039002 0.0482 0.039525 0.040466 0.002406 SYSTEM-RUN-TIME 100 0.016857 0 0.003338 0 0.000169 0.000722 PAGE-FAULTS 100 0 0 0 0 0 0.0 GC-RUN-TIME 100 107.753 0 8.268 0 1.07753 2.532335 BYTES-CONSED 100 2893124064 28892784 28960032 28927472 28931240.0 16630.285 EVAL-CALLS 100 0 0 0 0 0 0.0
Com.gigamonkeys.json
- SAMPLES TOTAL MINIMUM MAXIMUM MEDIAN AVERAGE DEVIATION REAL-TIME 100 2.81333 0.026665 0.06 0.026667 0.028133 0.004277 RUN-TIME 100 2.810417 0.026351 0.060089 0.026769 0.028104 0.004258 USER-RUN-TIME 100 2.727645 0.019766 0.046832 0.026694 0.027276 0.003144 SYSTEM-RUN-TIME 100 0.083058 0 0.013258 0 0.000831 0.002216 PAGE-FAULTS 100 0 0 0 0 0 0.0 GC-RUN-TIME 100 102.407 0 32.956 0 1.02407 4.182565 BYTES-CONSED 100 1336330880 13308192 13407152 13374576 13363309.0 18525.904 EVAL-CALLS 100 0 0 0 0 0 0.0
Com.inuoe.jzon
- SAMPLES TOTAL MINIMUM MAXIMUM MEDIAN AVERAGE DEVIATION REAL-TIME 100 1.326664 0.009999 0.023334 0.013333 0.013267 0.00221 RUN-TIME 100 1.331578 0.012838 0.024069 0.012932 0.013316 0.001832 USER-RUN-TIME 100 1.331617 0.012839 0.024071 0.012933 0.013316 0.001832 SYSTEM-RUN-TIME 100 0.000039 0 0.000012 0 0.0 0.000002 PAGE-FAULTS 100 0 0 0 0 0 0.0 GC-RUN-TIME 100 35.118 0 10.7 0 0.35118 1.748899 BYTES-CONSED 100 782314080 7777888 7846992 7814368 7823141.0 20710.158 EVAL-CALLS 100 0 0 0 0 0 0.0
Jonathan
- SAMPLES TOTAL MINIMUM MAXIMUM MEDIAN AVERAGE DEVIATION REAL-TIME 100 227.00395 2.186708 2.359995 2.260044 2.27004 0.039442 RUN-TIME 100 227.33931 2.19705 2.351574 2.263612 2.273393 0.034106 USER-RUN-TIME 100 194.9456 1.876474 2.028554 1.948056 1.949456 0.034681 SYSTEM-RUN-TIME 100 32.394077 0.238007 0.417066 0.32814 0.323941 0.038221 PAGE-FAULTS 100 0 0 0 0 0 0.0 GC-RUN-TIME 100 25044.62 227.701 292.919 246.914 250.4462 12.838325 BYTES-CONSED 100 1568784712704 15687690976 15688001344 15687838672 15687847000.0 92185.85 EVAL-CALLS 100 0 0 0 0 0 0.0
Json-lib
- SAMPLES TOTAL MINIMUM MAXIMUM MEDIAN AVERAGE DEVIATION REAL-TIME 100 5.79335 0.053333 0.08 0.056667 0.057934 0.004157 RUN-TIME 100 5.785923 0.055158 0.078339 0.056172 0.057859 0.003757 USER-RUN-TIME 100 5.696394 0.050573 0.068746 0.056031 0.056964 0.002951 SYSTEM-RUN-TIME 100 0.089889 0 0.016569 0 0.000899 0.002956 PAGE-FAULTS 100 0 0 0 0 0 0.0 GC-RUN-TIME 100 131.514 0 18.926 0 1.31514 3.159485 BYTES-CONSED 100 3438649392 34298176 34431824 34395536 34386492.0 30572.027 EVAL-CALLS 100 0 0 0 0 0 0.0
Json-streams
- SAMPLES TOTAL MINIMUM MAXIMUM MEDIAN AVERAGE DEVIATION REAL-TIME 100 9.97336 0.093333 0.12 0.1 0.099734 0.00358 RUN-TIME 100 9.966192 0.094767 0.120993 0.099149 0.099662 0.003443 USER-RUN-TIME 100 9.893029 0.091745 0.111065 0.099069 0.09893 0.00257 SYSTEM-RUN-TIME 100 0.073451 0 0.010093 0 0.000735 0.002128 PAGE-FAULTS 100 0 0 0 0 0 0.0 GC-RUN-TIME 100 65.955 0 21.973 0 0.65955 2.640409 BYTES-CONSED 100 1850613600 18431904 18564096 18497264 18506136 22893.922 EVAL-CALLS 100 0 0 0 0 0 0.0
Jsown
- SAMPLES TOTAL MINIMUM MAXIMUM MEDIAN AVERAGE DEVIATION REAL-TIME 100 1.203332 0.009999 0.043333 0.01 0.012033 0.004025 RUN-TIME 100 1.200135 0.011014 0.04474 0.011151 0.012001 0.003781 USER-RUN-TIME 100 1.123939 0.005928 0.028398 0.011122 0.011239 0.002225 SYSTEM-RUN-TIME 100 0.076273 0 0.016343 0 0.000763 0.002229 PAGE-FAULTS 100 0 0 0 0 0 0.0 GC-RUN-TIME 100 52.572 0 32.378 0 0.52572 3.640431 BYTES-CONSED 100 564034448 5604656 5658336 5625824 5640344.5 16818.484 EVAL-CALLS 100 0 0 0 0 0 0.0
Shasht
- SAMPLES TOTAL MINIMUM MAXIMUM MEDIAN AVERAGE DEVIATION REAL-TIME 100 2.060005 0.016666 0.046666 0.02 0.0206 0.004253 RUN-TIME 100 2.05557 0.018955 0.048301 0.01916 0.020556 0.004146 USER-RUN-TIME 100 1.972519 0.008969 0.031692 0.019143 0.019725 0.00262 SYSTEM-RUN-TIME 100 0.083304 0 0.016611 0 0.000833 0.002577 PAGE-FAULTS 100 0 0 0 0 0 0.0 GC-RUN-TIME 100 105.303 0 28.432 0 1.05303 3.938279 BYTES-CONSED 100 1586639328 15796880 15930032 15864640 15866393.0 28376.613 EVAL-CALLS 100 0 0 0 0 0 0.0
St-json
- SAMPLES TOTAL MINIMUM MAXIMUM MEDIAN AVERAGE DEVIATION REAL-TIME 100 2.720009 0.023333 0.05 0.026667 0.0272 0.002737 RUN-TIME 100 2.718913 0.026276 0.048607 0.026637 0.027189 0.00243 USER-RUN-TIME 100 2.672309 0.023864 0.035445 0.026506 0.026723 0.001448 SYSTEM-RUN-TIME 100 0.046662 0 0.013164 0 0.000467 0.001615 PAGE-FAULTS 100 0 0 0 0 0 0.0 GC-RUN-TIME 100 33.976 0 21.379 0 0.33976 2.333153 BYTES-CONSED 100 428995424 4250288 4316112 4283488 4289954.0 23263.066 EVAL-CALLS 100 0 0 0 0 0 0.0
Yason
- SAMPLES TOTAL MINIMUM MAXIMUM MEDIAN AVERAGE DEVIATION REAL-TIME 100 5.169989 0.049999 0.08 0.05 0.0517 0.003844 RUN-TIME 100 5.165566 0.050289 0.079555 0.050581 0.051656 0.00362 USER-RUN-TIME 100 5.155174 0.05029 0.076295 0.050571 0.051552 0.003197 SYSTEM-RUN-TIME 100 0.010577 0 0.003258 0 0.000106 0.000532 PAGE-FAULTS 100 0 0 0 0 0 0.0 GC-RUN-TIME 100 50.982 0 10.973 0 0.50982 1.959192 BYTES-CONSED 100 1127308544 11219936 11286160 11284112 11273085.0 16248.54 EVAL-CALLS 100 0 0 0 0 0 0.0
9.8 MB File
Boost-json
- SAMPLES TOTAL MINIMUM MAXIMUM MEDIAN AVERAGE DEVIATION REAL-TIME 100 16.600002 0.143332 0.263332 0.159999 0.166 0.022166 RUN-TIME 100 16.591812 0.144675 0.262875 0.158953 0.165918 0.022023 USER-RUN-TIME 100 15.826161 0.132868 0.22961 0.150189 0.158262 0.015996 SYSTEM-RUN-TIME 100 0.766012 0 0.036679 0.000082 0.00766 0.01054 PAGE-FAULTS 100 0 0 0 0 0 0.0 GC-RUN-TIME 100 1628.348 0 116.271 0 16.28348 21.486185 BYTES-CONSED 100 8121726592 81135040 81310064 81211104 81217260.0 28616.291 EVAL-CALLS 100 0 0 0 0 0 0.0
Cl-json
- SAMPLES TOTAL MINIMUM MAXIMUM MEDIAN AVERAGE DEVIATION REAL-TIME 100 43.343338 0.39 0.500001 0.43 0.433433 0.028302 RUN-TIME 100 43.318657 0.389881 0.500517 0.430587 0.433187 0.028388 USER-RUN-TIME 100 40.892357 0.374546 0.457877 0.406374 0.408924 0.017006 SYSTEM-RUN-TIME 100 2.426699 0 0.083306 0.013432 0.024267 0.023222 PAGE-FAULTS 100 0 0 0 0 0 0.0 GC-RUN-TIME 100 5206.639 23.906 110.039 55.065 52.06639 20.941925 BYTES-CONSED 100 28502164224 284987600 285054944 285020384 285021630.0 18817.63 EVAL-CALLS 100 0 0 0 0 0 0.0
Com.gigamonkeys.json
- SAMPLES TOTAL MINIMUM MAXIMUM MEDIAN AVERAGE DEVIATION REAL-TIME 100 31.760002 0.256665 0.430001 0.316666 0.3176 0.03324 RUN-TIME 100 31.735737 0.254121 0.429785 0.31501 0.317357 0.033264 USER-RUN-TIME 100 30.002583 0.248119 0.396469 0.304214 0.300026 0.027704 SYSTEM-RUN-TIME 100 1.733597 0 0.063267 0.016155 0.017336 0.016309 PAGE-FAULTS 100 0 0 0 0 0 0.0 GC-RUN-TIME 100 6202.89 0 177.155 65.725 62.0289 36.170036 BYTES-CONSED 100 16455712480 164509344 164590064 164557408 164557120.0 16735.898 EVAL-CALLS 100 0 0 0 0 0 0.0
Com.inuoe.jzon
- SAMPLES TOTAL MINIMUM MAXIMUM MEDIAN AVERAGE DEVIATION REAL-TIME 100 7.853333 0.076665 0.083334 0.076667 0.078533 0.002227 RUN-TIME 100 7.850083 0.077052 0.083984 0.077664 0.078501 0.001689 USER-RUN-TIME 100 7.821162 0.064623 0.083986 0.077638 0.078212 0.002059 SYSTEM-RUN-TIME 100 0.029249 0 0.015565 0 0.000292 0.001729 PAGE-FAULTS 100 0 0 0 0 0 0.0 GC-RUN-TIME 100 64.397 0 4.54 0 0.64397 1.149026 BYTES-CONSED 100 4366975664 43620656 43707264 43666272 43669756.0 20102.645 EVAL-CALLS 100 0 0 0 0 0 0.0
Jonathan
- SAMPLES TOTAL MINIMUM MAXIMUM MEDIAN AVERAGE DEVIATION REAL-TIME 100 1876.5785 18.566694 19.183428 18.723421 18.765785 0.153013 RUN-TIME 100 1878.7534 18.609053 19.140911 18.75922 18.787535 0.12626 USER-RUN-TIME 100 1325.7595 13.067881 13.549653 13.239522 13.257596 0.103736 SYSTEM-RUN-TIME 100 552.99426 5.264558 5.807163 5.529135 5.529942 0.099494 PAGE-FAULTS 100 0 0 0 0 0 0.0 GC-RUN-TIME 100 201864.69 1951.218 2153.499 2010.102 2018.6469 39.038464 BYTES-CONSED 100 8950139810080 89501170640 89501514624 89501406768 89501400000.0 46350.645 EVAL-CALLS 100 0 0 0 0 0 0.0
Json-lib
- SAMPLES TOTAL MINIMUM MAXIMUM MEDIAN AVERAGE DEVIATION REAL-TIME 100 62.49986 0.583331 0.703331 0.616665 0.624999 0.03394 RUN-TIME 100 62.402306 0.582471 0.703751 0.616243 0.624023 0.034107 USER-RUN-TIME 100 58.481285 0.546558 0.641697 0.58024 0.584813 0.016765 SYSTEM-RUN-TIME 100 3.921454 0.00285 0.113307 0.029706 0.039215 0.028473 PAGE-FAULTS 100 0 0 0 0 0 0.0 GC-RUN-TIME 100 8449.81 63.201 148.833 69.068 84.4981 23.446356 BYTES-CONSED 100 36938784784 369336704 369447696 369389248 369387840.0 22078.822 EVAL-CALLS 100 0 0 0 0 0 0.0
Json-streams
- SAMPLES TOTAL MINIMUM MAXIMUM MEDIAN AVERAGE DEVIATION REAL-TIME 100 83.553085 0.773331 0.91333 0.833331 0.835531 0.03647 RUN-TIME 100 83.30883 0.770919 0.913009 0.832009 0.833088 0.036665 USER-RUN-TIME 100 80.72323 0.756486 0.8762 0.803182 0.807232 0.020206 SYSTEM-RUN-TIME 100 2.586007 0 0.092613 0.016669 0.02586 0.024769 PAGE-FAULTS 100 0 0 0 0 0 0.0 GC-RUN-TIME 100 6292.337 21.019 155.018 54.529 62.92337 29.172997 BYTES-CONSED 100 24938589616 249331392 249428080 249386576 249385890.0 18607.875 EVAL-CALLS 100 0 0 0 0 0 0.0
Jsown
- SAMPLES TOTAL MINIMUM MAXIMUM MEDIAN AVERAGE DEVIATION REAL-TIME 100 5.59332 0.043332 0.126666 0.046667 0.055933 0.01773 RUN-TIME 100 5.569105 0.044441 0.126507 0.045026 0.055691 0.017726 USER-RUN-TIME 100 4.989549 0.031232 0.100324 0.044751 0.049895 0.011466 SYSTEM-RUN-TIME 100 0.579924 0 0.03667 0.000061 0.005799 0.009229 PAGE-FAULTS 100 0 0 0 0 0 0.0 GC-RUN-TIME 100 855.405 0 78.454 0 8.55405 16.65085 BYTES-CONSED 100 4919389104 49168080 49204912 49204848 49193892.0 14703.879 EVAL-CALLS 100 0 0 0 0 0 0.0
Shasht
- SAMPLES TOTAL MINIMUM MAXIMUM MEDIAN AVERAGE DEVIATION REAL-TIME 100 17.919956 0.173333 0.19 0.179999 0.1792 0.002911 RUN-TIME 100 17.912256 0.173717 0.191011 0.178996 0.179123 0.002458 USER-RUN-TIME 100 17.84457 0.169846 0.18917 0.178604 0.178446 0.002719 SYSTEM-RUN-TIME 100 0.068081 0 0.00844 0.000001 0.000681 0.001528 PAGE-FAULTS 100 0 0 0 0 0 0.0 GC-RUN-TIME 100 365.91 0 10.242 3.984 3.6591 1.36922 BYTES-CONSED 100 16028649184 160244432 160334912 160286432 160286500.0 21336.906 EVAL-CALLS 100 0 0 0 0 0 0.0
St-json
- SAMPLES TOTAL MINIMUM MAXIMUM MEDIAN AVERAGE DEVIATION REAL-TIME 100 18.69662 0.163331 0.279999 0.176667 0.186966 0.027689 RUN-TIME 100 18.646624 0.162164 0.277236 0.177811 0.186466 0.027531 USER-RUN-TIME 100 17.739523 0.146125 0.251218 0.167254 0.177395 0.018997 SYSTEM-RUN-TIME 100 0.907483 0 0.073193 0.000403 0.009075 0.013883 PAGE-FAULTS 100 0 0 0 0 0 0.0 GC-RUN-TIME 100 1844.771 0 106.111 0 18.44771 25.936756 BYTES-CONSED 100 7805457392 78012704 78078784 78046272 78054580.0 19326.176 EVAL-CALLS 100 0 0 0 0 0 0.0
Yason
- SAMPLES TOTAL MINIMUM MAXIMUM MEDIAN AVERAGE DEVIATION REAL-TIME 100 69.51981 0.619999 0.796665 0.713333 0.695198 0.048038 RUN-TIME 100 69.38795 0.62071 0.794521 0.710142 0.693879 0.047837 USER-RUN-TIME 100 67.30084 0.620458 0.756178 0.674462 0.673008 0.033042 SYSTEM-RUN-TIME 100 2.087528 0 0.069886 0.01291 0.020875 0.022515 PAGE-FAULTS 100 0 0 0 0 0 0.0 GC-RUN-TIME 100 6011.343 0 156.623 77.148 60.11343 44.679024 BYTES-CONSED 100 13795069104 137923120 137990768 137958112 137950690.0 15643.376 EVAL-CALLS 100 0 0 0 0 0 0.0