I am trying to redirect all output from a command-line programme to a file. I am using Bash. Some of the output is directed to the file, but some still appears in the terminal and is not stored in the file.
Similar symptoms are described here: Redirect all output to file (https://stackoverflow.com/questions/6674327/redirect-all-output-to-file)
However, I have tried the proposed solution (capture stderr) without success:

> stdout.txt 2> stderr.txt

The file stderr.txt is created but is empty.
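For clarity, here is the full form of what I ran, with my_client as a placeholder for the actual client programme, plus the variant that merges stderr into stdout in a single file:

my_client > stdout.txt 2> stderr.txt   # placeholder name; stdout and stderr to separate files (what I ran)
my_client > output.txt 2>&1            # placeholder name; both streams merged into one file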
A possible clue is that the command-line programme is a client communicating with a server on the same machine. It may be that some of the output is coming from the server.
Is there a way to capture all the output from the terminal, irrespective of its origin?
EDIT: I've confirmed that the missing output is generated by the server. Running the command in a separate terminal produces some output in both terminals, and I can then pipe all the output from the command terminal to a file. This raises the question of how to capture the server output, but that's a different question.
If the server is started in the same terminal, then it is presumably the server's stderr that is being written to the terminal and not being captured.
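If you control how the server is started, one thing you could do (just a sketch; my_server stands in for your actual server command) is redirect its streams when launching it:

my_server > server.log 2>&1 &   # placeholder name; the server's stdout and stderr both go to server.log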
The best way to capture everything, though, would be to run:

script output.txt

before starting up either the server or the client. This will launch a new shell with all terminal output written to output.txt as well as to the terminal. Then start the server from within that new shell, and then the client. Everything that you see on the screen (both your input and the output of everything writing to the terminal from within that shell) will be written to the file.
When you are done, type "exit" to exit the shell run by the script command.
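Putting it together, a sketch of the whole session (my_server and my_client are placeholder names for your actual programmes):

script output.txt    # start recording; this launches a new shell
./my_server &        # placeholder: start the server from inside the recorded shell
./my_client          # placeholder: run the client; whatever appears on screen is recorded
kill %1              # stop the background server when finished
exit                 # leave the shell started by script; output.txt now holds everything seen on screen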