We’ll cover how to redirect input and output in Linux — a foundational skill for shell scripting, automation, and command-line workflows.
Why this matters: most Unix utilities read from stdin (standard input) and write to stdout (standard output). Redirecting these streams — and stderr (standard error) — lets you capture program output, suppress errors, chain commands, and feed files into programs that expect interactive input.

Basic example (sort)

Many commands accept a filename argument, but they also work with stdin/stdout. For example, sort reads text and prints sorted lines:
$ cat file.txt
6
5
1
3
4
2

$ sort file.txt
1
2
3
4
5
6

# Save sorted output to a new file
$ sort file.txt > sortedfile.txt
$ cat sortedfile.txt
1
2
3
4
5
6

Redirecting stdout: overwrite vs append

  • > overwrites the file (and creates it if it doesn’t exist).
  • >> appends to the file (and also creates it if it doesn’t exist).
Example — overwrite (only the last run remains):
$ date > file.txt
$ date > file.txt
$ cat file.txt
Mon Nov  8 18:50:30 CST 2021
Example — append (each timestamp preserved):
$ date >> file.txt
$ date >> file.txt
$ cat file.txt
Mon Nov  8 18:50:30 CST 2021
Mon Nov  8 18:50:31 CST 2021
Quick comparison:

  Operator   Behavior           Creates file if missing
  >          Overwrite stdout   Yes
  >>         Append stdout      Yes
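The overwrite-versus-append behavior can be verified in a throwaway directory (the filename log.txt is illustrative):

```shell
cd "$(mktemp -d)"        # work in a temporary directory
echo first  > log.txt    # '>' creates log.txt
echo second > log.txt    # '>' again: overwrites — only "second" remains
echo third >> log.txt    # '>>' appends after the existing line
cat log.txt              # prints "second", then "third"
```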

File descriptors and common redirections

Programs use three standard streams:
  Descriptor   Name     Purpose
  0            stdin    Input to the process (keyboard or redirected file)
  1            stdout   Normal program output
  2            stderr   Error messages and diagnostics
Common redirection operators:
  Syntax                          Meaning
  < file.txt                      Redirect stdin from file.txt
  > file.txt  (or 1> file.txt)    Redirect stdout to file.txt (overwrite)
  >> file.txt (or 1>> file.txt)   Append stdout to file.txt
  2> errors.txt                   Redirect stderr to errors.txt (overwrite)
  2>> errors.txt                  Append stderr to errors.txt
Because stdout and stderr are separate streams, 2> lets you capture error messages independently of normal output.
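A quick sketch of the two streams in action, using ls with one valid and one deliberately missing path (the path /no/such/path and the filenames are illustrative):

```shell
export LC_ALL=C          # pin the locale so the error text is predictable
cd "$(mktemp -d)"
# ls lists "/" on stdout; the complaint about the missing path goes to stderr
ls / /no/such/path > listing.txt 2> errors.txt || true  # non-zero exit: one path is missing
cat errors.txt           # contains the "No such file or directory" message
```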

Discard unwanted output: /dev/null

Send output you don’t want to see to /dev/null — a special sink that discards everything. Example: hide permission-denied messages from a recursive grep:
# Without suppression: stderr clutter
$ grep -r '^The' /etc/
grep: /etc/cups/ssl: Permission denied
...

# Suppress stderr by redirecting it to /dev/null
$ grep -r '^The' /etc/ 2>/dev/null
/etc/brltty/Input/tn/all.txt:The two keys at the left rear (2 columns, 1 row):
...

Redirect stdout and stderr separately

Capture normal output and errors in different files:
# Overwrite files
$ grep -r '^The' /etc/ 1>output.txt 2>errors.txt

# Append to files
$ grep -r '^The' /etc/ 1>>output.txt 2>>errors.txt

Redirect both stdout and stderr to the same file

To collect both streams into one file, redirect stdout first, then redirect stderr to stdout with 2>&1. The order matters:
# Correct: both streams go into all_output.txt
$ grep -r '^The' /etc/ > all_output.txt 2>&1

# Equivalent with explicit descriptor
$ grep -r '^The' /etc/ 1>all_output.txt 2>&1
Why order matters:
  • 1>all_output.txt sets stdout to the file.
  • 2>&1 then points stderr to wherever stdout is currently going (the file). If you reverse the order (2>&1 1>file), stderr is redirected to the original stdout (the terminal) before stdout is redirected, so errors still appear on-screen.
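The ordering difference can be demonstrated with the same missing-path trick (paths and filenames here are illustrative):

```shell
export LC_ALL=C          # pin the locale so the error text is predictable
cd "$(mktemp -d)"
# Correct order: stdout goes to the file first, then stderr follows it there
ls / /no/such/path > both.txt 2>&1 || true   # non-zero exit: one path is missing
grep "No such" both.txt                      # the error message is inside the file

# Reversed order: 2>&1 duplicates the terminal before stdout is moved,
# so the error still prints on-screen and never reaches the file
ls / /no/such/path 2>&1 > stdout_only.txt || true
```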

Input redirection (<) and feeding commands

Some programs read from stdin instead of accepting a filename. Use < to provide a file as stdin:
# Example: feed an email body from a file to a sendemail utility that reads stdin
$ sendemail someone@example.com < emailcontent.txt
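Standard utilities behave the same way. For instance, wc counts lines of whatever it reads: given a filename it opens the file itself, but with < it only sees an anonymous stream (nums.txt is an illustrative filename):

```shell
cd "$(mktemp -d)"
printf '1\n2\n3\n' > nums.txt
wc -l nums.txt       # wc opens the file itself and echoes its name: "3 nums.txt"
wc -l < nums.txt     # wc reads stdin and never sees a filename: just the count
```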

Here-documents and here-strings

Use here-documents (heredocs) for multi-line inline input. Terminate with the chosen delimiter (EOF is common):
$ sort <<EOF
6
3
2
5
1
4
EOF
1
2
3
4
5
6
Here-strings pass a single string to stdin using <<<:
$ bc <<< "1+2+3+4"
10
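Heredocs also combine naturally with output redirection, which is a common way to write a small file inline from a script (greeting.txt is an illustrative filename):

```shell
cd "$(mktemp -d)"
# cat reads the heredoc on stdin; '>' sends its stdout into the file
cat > greeting.txt <<EOF
Hello
World
EOF
cat greeting.txt
```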

Piping: chain small tools

Pipes (|) send the stdout of one command into the stdin of the next, which makes it easy to chain small tools into one-line workflows. Example — remove commented lines, sort, and column-format the file:
# Show non-comment lines
$ grep -v '^#' /etc/login.defs

# Pipe into sort
$ grep -v '^#' /etc/login.defs | sort

# Pipe into column for neat alignment
$ grep -v '^#' /etc/login.defs | sort | column -t
CREATE_HOME        yes
ENCRYPT_METHOD     SHA512
GID_MAX            60000
...
Pipes are essential for combining simple Unix tools into effective data-processing chains — searching, sorting, formatting, counting, and more.
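A classic counting pipeline illustrates the pattern on self-contained data (letters.txt is an illustrative filename): sort groups duplicate lines together so uniq -c can count them, and sort -rn ranks the counts.

```shell
cd "$(mktemp -d)"
printf 'b\na\nc\na\n' > letters.txt
# most frequent line first: "a" appears twice
sort letters.txt | uniq -c | sort -rn | head -n 1
```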

Quick reference: common patterns

  Goal                          Example
  Overwrite stdout to file      command > file.txt
  Append stdout to file         command >> file.txt
  Redirect stderr to file       command 2> errors.txt
  Append stderr to file         command 2>> errors.txt
  Redirect both to one file     command > all.txt 2>&1
  Suppress stderr               command 2>/dev/null
  Read stdin from file          command < input.txt
  Pipe several commands         cmd1 | cmd2 | cmd3

That’s all for this lesson.