I recently went through the exercise of testing
jc on several traditional and next-gen shells to document the integrations.
jc is a utility that converts the output of many commands and file types to JSON for easier parsing in scripts. I have typically highlighted the use of JSON with Bash in concert with
jq, but this is 2022 and there are so many more shells to choose from!
In this article I’d like to give a quick snapshot of what it’s like to work with JSON in various traditional and next generation shells. Traditional shells like Bash and Windows Command Prompt (
cmd.exe) don’t have built-in JSON support and require third-party utilities. Newer shells like NGS, Nushell, Oil, Elvish, Murex, and PowerShell have JSON serialization/deserialization and filtering capabilities built in for a cleaner experience.
Bash is still the automation workhorse of the Unix ecosystem and it’s not going away any time soon, but it’s good to see what capabilities are out there in more modern shells. Perhaps this will inspire you to try them out for yourself!
Bash is old. Bash is solid. Bash is ubiquitous. Bash isn’t going anywhere. I’ve done some crazy things with Bash in my career… Bash and I go way back. That being said, using JSON in Bash is not always very ergonomic. Tools like
jq, jello, jp, etc. help bridge the gap between 1970s-2000s POSIX line-based text manipulation and the modern-day JSON API reality.
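To illustrate that gap, here is a contrived sketch: the line-based approach addresses a field by position, while the JSON approach addresses it by name. The sample record and field names are made up for illustration and don’t come from any particular jc parser:

```shell
#!/usr/bin/env bash
# Line-based extraction: awk grabs field 3 by position; if the column
# order ever changes, the pipeline silently breaks.
line_result=$(echo 'eth0 1500 192.168.1.10' | awk '{print $3}')

# JSON extraction: jq addresses the field by name, so field order
# and whitespace no longer matter.
json_result=$(echo '{"name":"eth0","mtu":1500,"ipv4":"192.168.1.10"}' | jq -r '.ipv4')

echo "$line_result"   # → 192.168.1.10
echo "$json_result"   # → 192.168.1.10
```

Both pipelines print the same address, but only the second one survives a change in column order.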
Here’s a simple example of how to pull a value from JSON and assign it to a variable in Bash:
$ myvar=$(dig www.google.com | jc --dig | jq -r '.[0].answer[0].data')
$ echo $myvar
18.104.22.168
If you would like to see more complex examples of assigning multiple JSON values to Bash arrays, see:
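As a minimal sketch of the array case, here is one common pattern using readarray (a.k.a. mapfile) with jq. The JSON sample is hypothetical, shaped like jc --dig output (an array of responses, each with an "answer" array of records):

```shell
#!/usr/bin/env bash
# Hypothetical JSON standing in for `jc --dig` output.
json='[{"answer":[{"data":"10.0.0.1"},{"data":"10.0.0.2"}]}]'

# readarray fills a Bash array with one element per line of jq's
# output; `.answer[]` iterates over every record in the answer list.
readarray -t addresses < <(jq -r '.[0].answer[].data' <<< "$json")

echo "${addresses[0]}"    # → 10.0.0.1 (first address)
echo "${#addresses[@]}"   # → 2 (number of addresses)
```

From there the array can be looped over or sliced like any other Bash array.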
Elvish is a next-gen shell that uses structured data in pipelines. It has JSON deserialization built in, so you don’t need
jq et al. to convert the output into an Elvish data structure. You can explore structured data in a similar way to
jq or Python.
Here’s an example of loading a JSON object into a variable and displaying one of the JSON values using Elvish’s index syntax:
~> var myvar = (dig www.google.com | jc --dig | from-json)
~> put $myvar[0][answer][0][data]
▶ 22.214.171.124
See the Elvish documentation for more details.
Fish is similar to Bash in that it does not have built-in support for JSON, but it’s a more modern take on the shell that provides nice autosuggestions, tab completion, syntax-highlighting, and a clean syntax that is optimized for interactive use.
When working with JSON in Fish, you will typically use tools like
jq, jello, jp, etc. to filter and query the data. Here are some examples showing how to assign filtered JSON data to a variable so it can be used elsewhere in the script:
$ set myvar (dig www.google.com | jc --dig | jq -r '.[0].answer[0].data')
$ echo $myvar
126.96.36.199
$ set myvar (jc dig www.google.com | jello -r '_[0].answer[0].data')
$ echo $myvar
188.8.131.52
$ set myvar (jc dig www.google.com | jp -u '[0].answer[0].data')
$ echo $myvar
184.108.40.206
The Murex next generation shell is designed for DevOps productivity and includes native JSON capabilities. There are a couple ways to set JSON variables: you can use the
cast json builtin to convert a string to a JSON variable, or you can define the JSON type when setting the variable. (e.g.
set json myvar).
Here’s an example of setting a JSON variable and accessing a nested value using the Element syntax:
~ » jc dig www.google.com -> set json myvar
~ » $myvar[[.0.answer.0.data]] -> set mydata
~ » out $mydata
220.127.116.11
Check out the documentation for more information.
Next Generation Shell (NGS) is a modern shell that aims to be DevOps-friendly. To that end, it is no surprise that it has great JSON support out of the box. If you have Python experience, you will find yourself at home with many of the concepts.
Here is a quick example of how to pull a value from JSON into a variable and output a specific value to STDOUT:
myvar = ``jc dig www.google.com``[0].answer[0].data
echo(myvar)
# returns 18.104.22.168
The double-backtick syntax runs the command and parses the JSON output. Then you can use bracket and dot notation to access the key you would like.
There are many other ways to filter the objects, including
the_one(), etc. No external filtering tools are required.
Nushell’s website describes itself this way:
“Nu pipelines use structured data so you can safely select, filter, and sort the same way every time. Stop parsing strings and start solving problems.”
This is definitely a new take on the shell which works nicely with JSON data. In fact, Nushell has a
from json builtin function that deserializes JSON into a native structured object. Here’s a quick example of how to assign a JSON object to a variable and filter it down to a desired value:
> let myvar = (dig www.google.com | jc --dig | from json)
> echo $myvar | get 0.answer.0.data
22.214.171.124
Check out the Nushell documentation for more filtering options.
Oil comes with the
json read builtin that deserializes JSON into a native Oil object. You can use standard bracket notation or a unique
-> notation to access attributes within objects. Here’s an example:
$ dig www.google.com | jc --dig | json read myvar
$ var mydata = myvar[0]['answer'][0]['data']
$ echo $mydata
126.96.36.199
For more details on working with JSON in Oil, see the documentation.
They say you either love or hate PowerShell. I have to admit, coming from a Bash background, I wasn’t too hot on PowerShell the first time I needed to create a script for it. It seemed needlessly verbose. And what were these objects? Why can’t I just pipe text between processes!?
But I have to say it has grown on me because of its concept of passing structured objects between processes via pipes. Well, I neither love nor hate PowerShell… I like the concept, but I’m still not a huge fan of some of the execution. It does have pretty good native JSON support, though.
Here’s an example of loading JSON data from
jc into an object using the
ConvertFrom-Json utility and printing a specific property within the resulting object using bracket and dot notation:
PS C:\> $myvar = dig www.google.com | jc --dig | ConvertFrom-Json
PS C:\> Write-Output $myvar.answer.data
188.8.131.52
Here’s a good article with more detail on how to work with JSON in PowerShell.
Windows Command Prompt (
cmd.exe)
Wow, this is a blast from the past! I don’t think I’ve written a batch file since the ’90s. Back then there was no such thing as JSON. I do remember doing some crazy login scripts with batch files back in the day, and I’m sure there are many (not mine) still in use today.
At first I wasn’t sure if it was even practical to use JSON at the Windows Command Prompt, but I thought it would be fun to take on the challenge. Turns out, it wasn’t too terribly difficult, though I’m still not sure of the practicality.
When at the Command Prompt, you can use tools like
jq, jello, jp, etc. to filter and query the JSON:
C:\> dig www.google.com | jc --dig | jq -r .[0].answer[0].data
184.108.40.206
C:\> jc dig www.google.com | jello -r _[0].answer[0].data
220.127.116.11
C:\> jc dig www.google.com | jp -u [0].answer[0].data
18.104.22.168
That’s fine and all, but can you actually load JSON values into variables? Yes you can – with the trusty
FOR /F command!
C:\> FOR /F "tokens=* USEBACKQ" %i IN (`dig www.google.com ^| jc --dig ^| jq -r .[0].answer[0].data`) DO SET myvar=%i
C:\> ECHO %myvar%
22.214.171.124
Well, that’s a mouthful. But it does work. Batch files require doubled
%% prefixes on the FOR loop variable, so this is how you would do it in a batch file:
FOR /F "tokens=* USEBACKQ" %%i IN (`dig www.google.com ^| jc --dig ^| jq -r .[0].answer[0].data`) DO SET myvar=%%i
ECHO %myvar%
:: returns 126.96.36.199
I needed to make a visit to Stack Overflow to learn how to get this working. Was it worth it? I don’t know – maybe this will help some poor unfortunate soul someday searching “how to use json in batch file”. 🙂
That was fun – I’ve always enjoyed the command line and playing with different shells can spark inspiration for new ways of solving problems. There are lots of next-gen alternatives that are looking to take us to the 21st century shell experience. Did I leave out your favorite new shell?
Happy JSON parsing!