Time to continue the journey towards better CLI shells without the constraints of terminal emulation. In the previous post we looked at the new commands ‘list’ and ‘stash’: list provided an interactive, live-updated take on ‘ls’, while stash batched file operations.
This time we get contain for merging jobs into datasets, and each for batched command execution.
First a quick look at the latest UI convenience: view #job detach. This simply allows us to take any job and pop it out into its own window. The following clip shows how it works with ‘list’, initiated with a mouse drag on the job bar. List was chosen as it uses both keyboard and mouse navigation, and spawns new windows of its own.
Onwards to contain. In the following clip I create a new job container by typing contain new. I then spawn a few noisy jobs and tell contain to adopt them through contain add.
By default contain will show an overview of the contained jobs, coloured by their current run status. I can step through the job outputs either by clicking on their respective index in the job bar, or by typing contain show 1 (or any other valid index).
The container can also be set to automatically capture new jobs. In the following clip I spawn such a container and then run some commands. Those get added into the container automatically.
Contain meshes with commands like repeat, applying the action to all contained jobs at once. It gets spicier when I choose to merge the output of multiple contained jobs, either by right-clicking their entries in the job bar, or manually by running contain #0 show 1 2. These are then treated as a single dataset by any other command that operates on the data of a job, e.g. copy #0(1-100).
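To make the merge concrete, here is a toy Python model with made-up data; the dict, the merge helper and the line contents are all hypothetical, and only illustrate the idea of several jobs' outputs being addressed as one dataset:

```python
# Toy model: per-job output lines, keyed by the index shown in the job bar.
jobs = {1: ["a1", "a2"], 2: ["b1", "b2"]}

def merge(indices):
    # Concatenate the selected jobs into one dataset, so a slice reference
    # like #0(1-100) can address it as if it were a single job's data.
    merged = []
    for i in indices:
        merged.extend(jobs[i])
    return merged

dataset = merge([1, 2])
# dataset == ["a1", "a2", "b1", "b2"]
# dataset[0:3] slices across the original job boundary
```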
Contain even applies to interactive jobs. In the following clip I contain a ‘list’ in a detached window and show that mouse navigation is still working.
Moving on to each. Each is related to ‘for’ in Bash and similar shells, known locally as the syntax I can never recall when I need it, and which rarely does precisely what I want. Since we accumulate previous command outputs in discrete and typed contexts, we can avoid the ‘for i in file1 file2 file3; do xyz $i; done’ form and instead reference the data to operate on through our job and slicing syntax.
Starting simple, running this:
each #0(1,3,5-7) !! cat $arg
Anything before !! is treated as part of the each command, and anything after is reprocessed and parsed with $arg substituted for the sliced data. There is some special sauce, such as $dir, which checks if the reference is a file and substitutes its path, or otherwise uses the path of the referenced job.
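As a rough illustration of that split, here is a minimal Python sketch; expand_each is a made-up helper, and the real shell re-parses the tail properly rather than doing plain string substitution:

```python
# Illustrative sketch only: split an "each" line at "!!" and expand the
# tail once per sliced data item, substituting $arg.
def expand_each(line, items):
    head, _, template = line.partition("!!")
    return [template.strip().replace("$arg", item) for item in items]

cmds = expand_each("each #0(1,3,5-7) !! cat $arg", ["notes.txt", "todo.txt"])
# cmds == ["cat notes.txt", "cat todo.txt"]
```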
While it might at the quickest of glances look similar to the ‘for’ setup, the actual processing is anything but. Recall that everything we do here is asynchronous. If I were to swap out ‘cat $arg’ for ‘v! cat $arg’, each invocation would spawn a new vertically split window, attach a legacy terminal emulator to it, and run the cat command.
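A hypothetical asyncio sketch of that difference (the job names and logging are invented for illustration): every expanded invocation is dispatched as its own task up front, rather than the loop waiting for each one in turn:

```python
import asyncio

# Sketch of the asynchronous dispatch: all expanded commands are spawned
# as concurrent tasks, so their lifetimes overlap.
async def run_one(cmd, log):
    log.append(f"start {cmd}")
    await asyncio.sleep(0)          # stand-in for the job doing real work
    log.append(f"done {cmd}")

async def run_all(cmds):
    log = []
    await asyncio.gather(*(run_one(c, log) for c in cmds))
    return log

log = asyncio.run(run_all(["cat a.txt", "cat b.txt"]))
# both jobs have started before either finishes
```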
Each also supports processing arguments:
each (sequential) #0(1,3,5-7) !! open $arg
Would functionally make it into a playlist. In this clip you can see how the media in the stash opens, and each time I close the window it launches the next in line.
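The sequential mode can be sketched the same way (again with invented names); the next item is only launched once the previous window has closed:

```python
import asyncio

# Sketch of "(sequential)" mode: playlist-style chaining, where each
# item runs to completion before the next one is launched.
async def run_playlist(items, log):
    for item in items:
        log.append(f"open {item}")
        await asyncio.sleep(0)      # stand-in for waiting on the window to close
        log.append(f"closed {item}")

log = []
asyncio.run(run_playlist(["a.mp4", "b.mp4"], log))
# each item fully opens and closes before the next one starts
```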
Since we are not fighting for a single stdin/stdout pipeline, we have more interesting options:
each (merge) #0 !! cat $arg
This joins forces with the contain command by spawning a new container and attaching the new jobs to it automatically.
The contained set of jobs also interacts well with other commands, like trigger or repeat. In the following clip I repeat the previous form of running each on a stash of files. I then run the merge/cat command listed above, and you can see how the commands keep progressing in parallel. Running repeat on the container repeats the commands that have finished executing, merging the output with that of the previous run, while letting the ongoing ones continue until completion.
The container here also respects commands like trigger. With a stash of makefiles, running them through a contained each like this:
each (merge) #stash !! make -f $arg -C $dir
trigger #0 ok alert "completed successfully"
Would treat each entry in the stash as a makefile, dispatch make on it, merge the jobs into a container, associate a trigger with all merged jobs completing successfully, and fire a desktop notification.
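The trigger semantics can be sketched like so; the makefile names, run_job and contained_each are all hypothetical stand-ins, and the point is only that the "ok" action fires once every merged job has exited successfully:

```python
import asyncio

# Sketch: run one (fake) make job per makefile, and only fire the "ok"
# action once every job merged into the container succeeds.
async def run_job(makefile):
    await asyncio.sleep(0)   # stand-in for: make -f <makefile>
    return 0                 # exit status 0 == success

async def contained_each(makefiles, on_all_ok):
    statuses = await asyncio.gather(*(run_job(m) for m in makefiles))
    if all(s == 0 for s in statuses):
        on_all_ok("completed successfully")

fired = []
asyncio.run(contained_each(["a.mk", "b.mk"], fired.append))
# fired == ["completed successfully"]
```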
That is enough for this time; next time around we will (likely) see what we can do to assist developer tooling such as the venerable gdb.