Dealing with large output from long-running commands
Background #
Often, I have long-running commands where all of these things are true:
- the output has useful information that I might need to parse through after the command finishes
- the command has a large amount of output to stdout/stderr
- the command takes a fairly long time to execute
Note: this technique is still useful even if the 3rd thing isn't true (i.e., your command runs fairly quickly). However, it makes the tail -f step less useful.
The technique #
Let's say, for example, your command is yarn android. For me, that is a common command where this tactic is useful. You can replace it with any command you are using.
Run the command and log everything to a file:
yarn android > out.log 2>&1
Note: the above command combines stderr and stdout using 2>&1 (redirect stderr to stdout). If you wish to keep them separate, you can instead use:
yarn android > out.log 2> error.log
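If you rerun the command a lot, a small variation is to timestamp the log file name so earlier runs don't get overwritten. A minimal sketch, assuming you're fine keeping per-run logs in a logs/ directory:
mkdir -p logs  # keep one log per run instead of overwriting out.log
yarn android > "logs/android-$(date +%Y%m%d-%H%M%S).log" 2>&1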
Now, while that command is running, in another window (another great use case for tmux) tail the log:
tail -f out.log
Now, you can watch the progress of the command in this window. The tail -f command will continue printing each line of the output as it goes.
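If the output is especially noisy, you can also pipe the tail through grep to only watch the lines you care about. A minimal sketch (the error/warning pattern is just an example):
tail -f out.log | grep -iE "error|warn"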
Once the command is done running, you can look at out.log (and/or error.log if you used it) to find whatever information you need. Personally, I will simply open out.log in vscode. You can of course open it in any text editor, or just grep on the command line to find what you need.
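For example, a quick search for failures with line numbers (the pattern here is just a placeholder) might look like:
grep -n -iE "error|fail" out.log  # -n prints line numbers so you can jump to them in your editor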