Working with JSON in Various Shells

I recently went through the exercise of testing jc on several traditional and next-gen shells to document the integrations. jc is a utility that converts the output of many commands and file types to JSON for easier parsing in scripts. I have typically highlighted the use of JSON with Bash in concert with jq, but this is 2022 and there are so many more shells to choose from!

In this article I’d like to give a quick snapshot of what it’s like to work with JSON in various traditional and next generation shells. Traditional shells like Bash and Windows Command Prompt (cmd.exe) don’t have built-in JSON support and require third-party utilities. Newer shells like NGS, Nushell, Oil, Elvish, Murex, and PowerShell have JSON serialization/deserialization and filtering capabilities built in for a cleaner experience.

Bash is still the automation workhorse of the Unix ecosystem and it’s not going away any time soon, but it’s good to see what capabilities are out there in more modern shells. Perhaps this will inspire you to try them out for yourself!


Bash

Bash is old. Bash is solid. Bash is ubiquitous. Bash isn’t going anywhere. I’ve done some crazy things with Bash in my career… Bash and I go back a long way. That being said, using JSON in Bash is not always very ergonomic. Tools like jq, jello, jp, etc. help bridge the gap between the 1970s–2000s world of POSIX line-based text manipulation and the modern-day JSON API reality.

Here’s a simple example of how to pull a value from JSON and assign it to a variable in Bash:

$ myvar=$(dig | jc --dig | jq -r '.[0].answer[0].data')
$ echo $myvar
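One thing worth noting (not shown in the original example): if any command in a pipeline like this fails, the assignment still succeeds and the variable is silently empty. Here is a minimal sketch of guarding with a default via Bash parameter expansion; fake_pipeline is a hypothetical stand-in for the real dig | jc --dig | jq command:

```shell
#!/usr/bin/env bash
# fake_pipeline is a hypothetical stand-in for a failing
# `dig | jc --dig | jq -r '.[0].answer[0].data'` pipeline:
# it prints nothing and exits non-zero.
fake_pipeline() { return 1; }

# The command substitution still "succeeds" from the assignment's point of view,
# so without a guard, myvar would just be empty.
myvar=$(fake_pipeline)
myvar=${myvar:-no-answer}   # parameter expansion: use a default if empty/unset
echo "$myvar"               # prints: no-answer
```

The same `${var:-default}` pattern works with any pipeline, not just jc/jq ones.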

For more complex scripts, you can also assign multiple JSON values to Bash arrays.
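For instance, multiple values can be collected into a Bash array with readarray (Bash 4+). This is a sketch: fake_answers is a hypothetical stand-in for a pipeline such as dig | jc --dig | jq -r '.[0].answer[].data', which would print one value per line:

```shell
#!/usr/bin/env bash
# fake_answers stands in for: dig | jc --dig | jq -r '.[0].answer[].data'
# which prints each answer's "data" field on its own line.
fake_answers() { printf '%s\n' '1.2.3.4' '5.6.7.8'; }

# readarray (a.k.a. mapfile) reads one array element per line; -t strips newlines
readarray -t answers < <(fake_answers)

echo "${#answers[@]}"   # prints: 2
echo "${answers[0]}"    # prints: 1.2.3.4
```

Each line of output becomes one array element, so "${answers[@]}" can then be looped over like any other Bash array.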


Elvish

Elvish is a next-gen shell that uses structured data in pipelines. It has JSON deserialization built in, so you don’t need jq et al. to convert output into an Elvish data structure. You can explore structured data in a similar way to jq or Python.

Here’s an example of loading a JSON object into a variable and displaying one of the JSON values using the from-json built-in:

~> var myvar = (dig | jc --dig | from-json)
~> put $myvar[0]['answer'][0]['data']

See the Elvish documentation for more details.


Fish

Fish is similar to Bash in that it does not have built-in JSON support, but it’s a more modern take on the shell that provides nice autosuggestions, tab completion, syntax highlighting, and a clean syntax optimized for interactive use.

When working with JSON in Fish, you will typically use tools like jq, jello, jp, etc. to filter and query the data. Here are some examples showing how to assign filtered JSON data to a variable so it can be used elsewhere in the script:

$ set myvar (dig | jc --dig | jq -r '.[0].answer[0].data')
$ echo $myvar

$ set myvar (jc dig | jello -r '_[0].answer[0].data')
$ echo $myvar

$ set myvar (jc dig | jp -u '[0].answer[0].data')
$ echo $myvar


Murex

The Murex next generation shell is designed for DevOps productivity and includes native JSON capabilities. There are a couple of ways to set JSON variables: you can use the cast json builtin to convert a string to a JSON variable, or you can define the JSON type when setting the variable (e.g. set json myvar).

There are also a couple of different ways to access nested attributes within the JSON: you can use Index syntax (single brackets, []) or Element syntax (double brackets, [[]]).

Here’s an example of setting a JSON variable and accessing a nested value using the Element syntax:

~ » jc dig -> set json myvar
~ » $myvar[[/0/answer/0/data]] -> set mydata
~ » out $mydata

Check out the documentation for more information.


NGS

Next Generation Shell (NGS) is a modern shell that aims to be DevOps-friendly. To that end, it is no surprise that it has great JSON support out of the box. If you have Python experience, you will find yourself at home with many of the concepts.

Here is a quick example of how to pull a value from JSON into a variable and output a specific value to STDOUT:

myvar = ``jc dig``[0].answer[0].data


The double-backtick syntax runs the command and parses the JSON output. Then you can use bracket and dot notation to access the key you would like.

There are many other ways to filter the objects, including map(), filter(), reject(), the_one(), etc. No jq required!


Nushell

Nushell’s website describes the shell this way:

“Nu pipelines use structured data so you can safely select, filter, and sort the same way every time. Stop parsing strings and start solving problems.”

This is definitely a new take on the shell which works nicely with JSON data. In fact, Nushell has a from json builtin function that deserializes JSON into a native structured object. Here’s a quick example of how to assign a JSON object to a variable and filter it down to a desired value:

> let myvar = (dig | jc --dig | from json)
> echo $myvar | get 0.answer.0.data

Check out the Nushell documentation for more filtering options.


Oil

If you are at home in JavaScript or Python then you should check out Oil. Oil started out being compatible with Bash, but has since grown into its own shell and scripting language that supports more robust structured objects.

Oil comes with the json read builtin that deserializes JSON into a native Oil object. You can use standard bracket notation or a unique -> notation to access attributes within objects. Here’s an example:

$ dig | jc --dig | json read myvar
$ var mydata = myvar[0]['answer'][0]['data']
$ echo $mydata

For more details on working with JSON in Oil, see the documentation.


PowerShell

They say you either love or hate PowerShell. I have to admit, coming from a Bash background, I wasn’t too hot on PowerShell the first time I needed to create a script for it. It seemed needlessly verbose. And what were these objects? Why can’t I just pipe text between processes!?

But I have to say it has grown on me because of its concept of passing structured objects between processes via pipes. Well, I neither love nor hate PowerShell… I like the concept, but I’m still not a huge fan of some of the execution. It does have pretty good native JSON support, though.

Here’s an example of loading JSON data from jc into an object using the ConvertFrom-Json utility and printing a specific property within the resulting object using bracket and dot notation:

PS C:\> $myvar = dig | jc --dig | ConvertFrom-Json
PS C:\> Write-Output $myvar[0].answer[0].data

Here’s a good article with more detail on how to work with JSON in PowerShell.

Windows Command Prompt (cmd.exe)

Wow, this is a blast from the past! I don’t think I’ve written a batch file since the ’90s. Back then there was no such thing as JSON. I do remember doing some crazy login scripts with batch files back in the day, and I’m sure there are many (not mine) still in use today.

At first I wasn’t sure if it was even practical to use JSON at the Windows Command Prompt, but I thought it would be fun to take on the challenge. Turns out, it wasn’t too terribly difficult, though I’m still not sure of the practicality.

When at the Command Prompt, you can use tools like jq, jello, jp, etc. to filter and query the JSON:

C:\> dig | jc --dig | jq -r .[0].answer[0].data

C:\> jc dig | jello -r _[0].answer[0].data

C:\> jc dig | jp -u [0].answer[0].data

That’s fine and all, but can you actually load JSON values into variables? Yes you can – with the trusty FOR /F command!

C:\> FOR /F "tokens=* USEBACKQ" %i IN (`dig ^| jc --dig ^| jq -r .[0].answer[0].data`) DO SET myvar=%i
C:\> ECHO %myvar%

Well, that’s a mouthful. But it does work. In a batch file, the FOR variable needs a double %% prefix, so the equivalent looks like this:

FOR /F "tokens=* USEBACKQ" %%i IN (`dig ^| jc --dig ^| jq -r .[0].answer[0].data`) DO SET myvar=%%i
ECHO %myvar%


I needed to make a visit to Stack Overflow to learn how to get this working. Was it worth it? I don’t know – maybe this will help some poor unfortunate soul someday searching “how to use json in batch file”. 🙂


Conclusion

That was fun – I’ve always enjoyed the command line, and playing with different shells can spark inspiration for new ways of solving problems. There are lots of next-gen alternatives looking to bring the shell experience into the 21st century. Did I leave out your favorite new shell?

Happy JSON parsing!

Published by kellyjonbrazil
