Yesterday you learned how to properly clean up after yourself with IDisposable and Resource Management — using statements, finalizers, and the art of not leaving file handles dangling like forgotten laundry. Today we're going to put that knowledge to work, because we're reading and writing actual files. Welcome to the part of programming where your code finally interacts with the real world — the file system.
1. File vs Stream — Two Ways to Touch the Disk
Think of it like eating at a restaurant. The File class is the buffet — you walk up, grab everything at once, bring it back to your table, and eat. Simple. Convenient. But if the buffet has 4 GB of shrimp, you're going to need a bigger plate (and more RAM).
A Stream is like ordering courses one at a time. You get a little, process it, get some more. You never have to hold the entire meal in your hands at once. It's more work to manage, but it scales to any size.
Here's the rule of thumb:
- Small files (configs, logs under a few MB) → use `File.ReadAllText()` and friends. Quick, easy, done.
- Large files (multi-GB logs, binary data, video) → use `StreamReader`, `StreamWriter`, or `FileStream`. Your RAM will thank you.
- Binary data (images, PDFs, custom formats) → `FileStream` with byte arrays. Text-based readers won't help you here.
Both approaches live in the `System.IO` namespace, so slap a `using System.IO;` at the top of your file and let's get cooking.
2. Reading Files — The Buffet Approach
The File class gives you three dead-simple static methods for reading:
```csharp
// Read the entire file as one big string
string content = File.ReadAllText("notes.txt");
Console.WriteLine(content);

// Read every line into a string array
string[] lines = File.ReadAllLines("shopping-list.txt");
foreach (string line in lines)
{
    Console.WriteLine($"Buy: {line}");
}

// Read raw bytes (for binary files)
byte[] data = File.ReadAllBytes("photo.jpg");
Console.WriteLine($"File size: {data.Length} bytes");
```
File.ReadAllText() returns the entire file as a single string. Great for small configs, templates, or that one JSON file you keep meaning to parse properly.
File.ReadAllLines() splits the file on newlines and hands you a string[]. Perfect when you're processing line-by-line data like CSV files or log entries.
File.ReadAllBytes() gives you the raw byte[]. This is your go-to for anything that isn't plain text — images, executables, compressed archives.
All three methods read the entire file into memory at once. If the file is 2 GB, congratulations — you now have 2 GB less available RAM. For big files, keep reading.
3. Writing Files — Putting Data on Disk
Writing is just as straightforward:
```csharp
// Write a string to a file (overwrites if it exists!)
File.WriteAllText("output.txt", "Hello from C#!\nLine two.");

// Write an array of strings as separate lines
string[] logEntries = ["2024-01-01 INFO Started", "2024-01-01 ERROR Oops"];
File.WriteAllLines("log.txt", logEntries);

// Append to an existing file (doesn't overwrite)
File.AppendAllText("log.txt", "\n2024-01-02 INFO Still running");
```
A few things to watch out for:
| Method | Behavior | Use When |
|---|---|---|
| `File.WriteAllText()` | Overwrites the file completely | You want a fresh file every time |
| `File.WriteAllLines()` | Overwrites with one string per line | You've got a collection of lines |
| `File.AppendAllText()` | Adds to the end of the file | You're logging or accumulating data |
Warning: WriteAllText and WriteAllLines will nuke whatever was in the file before. There's no confirmation dialog, no "Are you sure?" prompt. The old content is just gone. If you want to add to a file, use AppendAllText.
Also — if the file doesn't exist yet, all three methods will create it for you. If the directory doesn't exist, though, you'll get a DirectoryNotFoundException. C# will create files, but it won't create folders on your behalf. We'll fix that in section 7.
4. StreamReader and StreamWriter — For When Files Get Big
Remember that using statement from Day 23? Here's where it earns its keep.
StreamReader reads a file line by line, so you only hold one line in memory at a time. Perfect for processing a 10 GB log file without melting your laptop:
```csharp
using StreamReader reader = new("huge-log.txt");
string? line;
int lineCount = 0;

while ((line = reader.ReadLine()) is not null)
{
    lineCount++;
    if (line.Contains("ERROR"))
    {
        Console.WriteLine($"Line {lineCount}: {line}");
    }
}
Console.WriteLine($"Total lines processed: {lineCount}");
```
StreamWriter is the mirror image — it writes data without buffering the entire output in memory first:
```csharp
using StreamWriter writer = new("output.txt");
writer.WriteLine("Timestamp,Level,Message");

for (int i = 0; i < 1_000_000; i++)
{
    writer.WriteLine($"2024-01-01,INFO,Event {i}");
}
// The using declaration flushes and closes the file automatically
```
A couple of StreamWriter tricks worth knowing:
```csharp
// Append mode — pass 'true' as the second argument
using StreamWriter appender = new("log.txt", append: true);
appender.WriteLine("This goes at the end");

// Auto-flush after every write (slower, but data is never stuck in a buffer)
using StreamWriter realTime = new("live-log.txt") { AutoFlush = true };
realTime.WriteLine("This hits the disk immediately");
```
That `using` at the start of each declaration isn't optional decoration — it's doing real work. When the variable goes out of scope, the stream gets flushed and closed. Skip the `using` and you might end up with half-written files or locked file handles. Day 23 called — it wants you to remember its lessons.
5. FileStream — Byte-Level Control
When you need to work with raw bytes — reading a custom binary format, copying files efficiently, or doing anything where text encoding isn't relevant — FileStream is your tool:
```csharp
using FileStream source = new("photo.jpg", FileMode.Open, FileAccess.Read);
using FileStream destination = new("photo-copy.jpg", FileMode.Create, FileAccess.Write);

byte[] buffer = new byte[8192]; // 8 KB buffer
int bytesRead;

while ((bytesRead = source.Read(buffer, 0, buffer.Length)) > 0)
{
    destination.Write(buffer, 0, bytesRead);
}
Console.WriteLine("File copied!");
```
The key parameters when creating a FileStream:
- `FileMode` — `Open`, `Create`, `Append`, `OpenOrCreate`, `CreateNew`, `Truncate`. Controls whether the file must exist, gets created, or gets overwritten.
- `FileAccess` — `Read`, `Write`, `ReadWrite`. Pretty self-explanatory.
- `FileShare` — `None`, `Read`, `Write`, `ReadWrite`. Controls whether other processes can access the file while you have it open. The constructors that don't take a `FileShare` default to `Read` — other processes can read the file while you hold it, but not write to it. Pass `FileShare.None` explicitly if you want exclusive access.
You probably won't use FileStream directly very often. For simple file copies, File.Copy("source.jpg", "dest.jpg") does the job in one line. But when you need fine-grained control — like reading only the first 100 bytes of a file, or writing to a specific position — FileStream is the only way.
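Here's a minimal sketch of that fine-grained control — reading only the first 100 bytes of a file, then jumping to a specific offset and overwriting a single byte. The file name `data.bin` and the offsets are made up for illustration:

```csharp
using System;
using System.IO;

// Create a small binary file to play with (256 zero bytes).
string path = "data.bin";
File.WriteAllBytes(path, new byte[256]);

// Read only the first 100 bytes — no need to load the whole file.
using (FileStream fs = new(path, FileMode.Open, FileAccess.Read))
{
    byte[] header = new byte[100];
    int read = fs.Read(header, 0, header.Length); // may return fewer bytes than requested
    Console.WriteLine($"Read {read} header bytes");
}

// Write at a specific position without touching the rest of the file.
using (FileStream fs = new(path, FileMode.Open, FileAccess.Write))
{
    fs.Seek(16, SeekOrigin.Begin); // jump to byte offset 16
    fs.WriteByte(0xFF);            // overwrite exactly one byte there
}
```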
One more thing: StreamReader and StreamWriter are actually wrappers around FileStream. When you create a new StreamReader("file.txt"), it opens a FileStream under the hood and adds text-decoding on top. So you've been using FileStream all along — you just had a friendlier face on it.
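You can see that layering yourself. This sketch (assuming a `notes.txt` exists) opens the `FileStream` explicitly and hands it to `StreamReader` — essentially what the one-argument `StreamReader` constructor does for you behind the scenes:

```csharp
using System;
using System.IO;
using System.Text;

// Open the raw byte stream ourselves...
using FileStream fs = new("notes.txt", FileMode.Open, FileAccess.Read);

// ...then layer UTF-8 text decoding on top of it.
using StreamReader reader = new(fs, Encoding.UTF8);
Console.WriteLine(reader.ReadToEnd());
```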
6. The Path Class — Your Cross-Platform Safety Net
Hardcoding file paths with backslashes is a great way to have your code break on macOS and Linux. The Path class handles path manipulation correctly on every operating system:
```csharp
// Combine paths safely (uses the right separator for the OS)
string fullPath = Path.Combine("Users", "farhad", "Documents", "notes.txt");
// Windows: Users\farhad\Documents\notes.txt
// Linux/macOS: Users/farhad/Documents/notes.txt

// Extract parts of a path
string file = Path.GetFileName("/logs/app/errors.log");        // "errors.log"
string name = Path.GetFileNameWithoutExtension("report.pdf");  // "report"
string ext = Path.GetExtension("data.json");                   // ".json"
string dir = Path.GetDirectoryName("/logs/app/errors.log");    // "/logs/app"

// Get special folders
string tempFile = Path.GetTempFileName(); // Creates a temp file and returns its path
string tempDir = Path.GetTempPath();      // System temp directory

// Change a file's extension
string newName = Path.ChangeExtension("report.txt", ".md"); // "report.md"
```
Golden rule: never concatenate paths with + and "\\". Always use Path.Combine(). Your future self — and anyone running your code on a Mac — will thank you.
Also handy: Path.GetFullPath("relative/path.txt") converts a relative path to an absolute one based on the current working directory. Useful for debugging "file not found" errors when you're not sure where your app is actually looking.
One more path trick that saves headaches — Path.DirectorySeparatorChar gives you \ on Windows and / on Linux/macOS. You shouldn't need it often if you're using Path.Combine(), but it's good to know it exists when you're parsing paths that came from user input or a config file.
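For example, one way to put `Path.DirectorySeparatorChar` to work is normalizing a path that arrived with mixed separators (the config-style path here is invented for illustration):

```csharp
using System;
using System.IO;

// A path from user input or a config file might mix separator styles.
string fromConfig = "logs/app\\errors.log";

// Normalize every separator to the current OS's preferred one.
string normalized = fromConfig
    .Replace('/', Path.DirectorySeparatorChar)
    .Replace('\\', Path.DirectorySeparatorChar);

Console.WriteLine(normalized);
// Windows: logs\app\errors.log — Linux/macOS: logs/app/errors.log
```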
7. Directory and DirectoryInfo — Managing Folders
Files don't float in the void — they live in directories. The Directory class (static methods) and DirectoryInfo class (instance-based) let you create, inspect, and navigate the folder structure:
```csharp
// Check if a directory exists
if (!Directory.Exists("logs"))
{
    // Create it (including nested subdirectories)
    Directory.CreateDirectory("logs/archive/2024");
    Console.WriteLine("Directories created!");
}

// List all .txt files in a directory
string[] textFiles = Directory.GetFiles("logs", "*.txt");
foreach (string file in textFiles)
{
    Console.WriteLine(file);
}

// List files recursively (all subdirectories)
string[] allFiles = Directory.GetFiles("logs", "*.*", SearchOption.AllDirectories);
Console.WriteLine($"Found {allFiles.Length} files total");

// List subdirectories
string[] subDirs = Directory.GetDirectories("logs");
foreach (string dir in subDirs)
{
    Console.WriteLine($"Subfolder: {dir}");
}
```
DirectoryInfo does the same things but as an object you can pass around and reuse:
```csharp
DirectoryInfo logsDir = new("logs");

if (logsDir.Exists)
{
    Console.WriteLine($"Created: {logsDir.CreationTime}");

    FileInfo[] files = logsDir.GetFiles("*.log");
    foreach (FileInfo file in files)
    {
        Console.WriteLine($"{file.Name} — {file.Length} bytes, modified {file.LastWriteTime}");
    }
}
```
Use Directory (static) for quick one-off checks. Use DirectoryInfo when you need to query multiple properties or pass a directory reference around your code. Same logic applies to File vs FileInfo, by the way.
Pro tip: Directory.CreateDirectory() is safe to call even if the directory already exists — it just does nothing. No need for the if (!Exists) check if you're just ensuring the folder is there.
One pattern you'll see a lot in real apps is ensuring an output directory exists before writing:
```csharp
string outputDir = Path.Combine("reports", "2024", "january");
Directory.CreateDirectory(outputDir); // Creates all missing directories in the path

string reportPath = Path.Combine(outputDir, "summary.txt");
File.WriteAllText(reportPath, "Monthly report data...");
```
This is safer than hoping the folder structure already exists. Create first, write second — that order will save you from a lot of DirectoryNotFoundException surprises.
8. Async File I/O — Don't Freeze the UI
Every File.ReadAllText() call blocks the current thread until the disk finishes reading. For a console app, that's fine. For a Blazor app, a MAUI app, or an API endpoint — that's a frozen UI or a thread-pool thread sitting around doing nothing.
The fix? Async versions of everything you've already learned:
```csharp
// Async reading
string content = await File.ReadAllTextAsync("config.json");
string[] lines = await File.ReadAllLinesAsync("data.csv");
byte[] bytes = await File.ReadAllBytesAsync("image.png");

// Async writing
await File.WriteAllTextAsync("output.txt", "Async writing!");
await File.AppendAllTextAsync("log.txt", "Another entry\n");

// Async StreamReader
using StreamReader reader = new("huge-file.txt");
string? line;
while ((line = await reader.ReadLineAsync()) is not null)
{
    // Process each line without blocking
    Console.WriteLine(line);
}

// Async StreamWriter
using StreamWriter writer = new("async-output.txt");
await writer.WriteLineAsync("Written asynchronously");
await writer.FlushAsync();
```
The pattern is dead simple — slap Async on the method name, add await in front, and make sure your method signature says async Task. If you survived Day 18's async/await lesson, this is just applying those same ideas to file operations.
One thing to be aware of: async file I/O in .NET doesn't always give you a true async disk read at the OS level. Under the hood, some of these methods use thread-pool threads to simulate async behavior. For most apps, the difference doesn't matter — you're still freeing up your calling thread. But if you're writing a high-performance server and squeezing every last microsecond, you'd pass useAsync: true when constructing a FileStream to get genuine OS-level async I/O.
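Here's what that looks like — a sketch using the `FileStream` constructor overload that takes `useAsync` (the file name and buffer size are arbitrary choices):

```csharp
using System;
using System.IO;

// Request genuine OS-level async I/O via the useAsync flag
// (equivalent to passing FileOptions.Asynchronous).
await using FileStream fs = new(
    "async-data.bin",
    FileMode.Create,
    FileAccess.Write,
    FileShare.None,
    bufferSize: 4096,
    useAsync: true);

byte[] payload = new byte[1024];
await fs.WriteAsync(payload); // completes without tying up a thread-pool thread
Console.WriteLine("Wrote 1 KB asynchronously");
```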
When to use async file I/O:
- Web APIs and Blazor apps — always. You don't want to block request threads.
- Console apps — honestly, synchronous is usually fine. Nobody's waiting on a UI.
- MAUI/desktop apps — always for any file that might take more than a blink to read.
9. Your Homework: Build a Log File Analyzer
Time to put it all together. Build a console app that reads a log file, counts how many lines are ERROR, WARNING, and INFO, then writes a summary report to a new file.
First, create a sample log file called app.log:
```
2024-01-15 08:00:01 INFO Application started
2024-01-15 08:00:02 INFO Loading configuration
2024-01-15 08:00:03 WARNING Config file is using deprecated format
2024-01-15 08:00:05 INFO Connected to database
2024-01-15 08:00:06 ERROR Failed to load user preferences: FileNotFoundException
2024-01-15 08:00:07 INFO Retrying with defaults
2024-01-15 08:00:10 WARNING Disk space below 10%
2024-01-15 08:00:15 ERROR Connection timeout after 30s
2024-01-15 08:00:16 INFO Reconnected successfully
2024-01-15 08:00:20 INFO Shutting down gracefully
```
Now write the analyzer:
```csharp
using System.IO;

string inputFile = "app.log";
string outputFile = "log-summary.txt";

if (!File.Exists(inputFile))
{
    Console.WriteLine($"File not found: {inputFile}");
    return;
}

int infoCount = 0;
int warningCount = 0;
int errorCount = 0;
int totalLines = 0;
List<string> errorMessages = [];

using (StreamReader reader = new(inputFile))
{
    string? line;
    while ((line = reader.ReadLine()) is not null)
    {
        totalLines++;

        if (line.Contains("ERROR"))
        {
            errorCount++;
            errorMessages.Add(line);
        }
        else if (line.Contains("WARNING"))
        {
            warningCount++;
        }
        else if (line.Contains("INFO"))
        {
            infoCount++;
        }
    }
}

// Build the summary
using StreamWriter writer = new(outputFile);
writer.WriteLine("=== Log Analysis Summary ===");
writer.WriteLine($"File analyzed: {Path.GetFullPath(inputFile)}");
writer.WriteLine($"Total lines: {totalLines}");
writer.WriteLine($"INFO: {infoCount}");
writer.WriteLine($"WARNING: {warningCount}");
writer.WriteLine($"ERROR: {errorCount}");
writer.WriteLine();

if (errorMessages.Count > 0)
{
    writer.WriteLine("=== Error Details ===");
    foreach (string error in errorMessages)
    {
        writer.WriteLine(error);
    }
}

Console.WriteLine($"Analysis complete! Summary written to {outputFile}");
Console.WriteLine($"  INFO: {infoCount} | WARNING: {warningCount} | ERROR: {errorCount}");
```
Bonus challenges:
- Add async I/O so the analyzer doesn't block.
- Use `Directory.GetFiles()` to analyze all `.log` files in a folder.
- Sort the error messages by timestamp.
- Add a percentage breakdown (e.g., "ERROR: 2 (20.0%)").
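For the percentage challenge, the arithmetic is just count divided by total — here's a tiny sketch with the counts from the sample log hardcoded:

```csharp
using System;

// Hardcoded counts matching the sample app.log above.
int errorCount = 2;
int totalLines = 10;

// Multiply by 100.0 (not 100) so the division happens in floating point.
double pct = totalLines > 0 ? errorCount * 100.0 / totalLines : 0;
Console.WriteLine($"ERROR: {errorCount} ({pct:F1}%)");
// Prints: ERROR: 2 (20.0%)
```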
Summary of Day 24
- The `File` class offers one-liner methods like `ReadAllText`, `WriteAllLines`, and `AppendAllText` — great for small files.
- `StreamReader` and `StreamWriter` process files line by line, keeping memory usage low for large files. Always wrap them in a `using` statement.
- `FileStream` gives you byte-level access for binary data and fine-grained control over file modes and access permissions.
- The `Path` class is your best friend for combining paths, extracting file names, and keeping your code cross-platform.
- `Directory` and `DirectoryInfo` let you create folders, list files, and check existence — the basic toolkit for navigating the file system.
- Every synchronous file method has an async counterpart ending in `Async` — use them in web apps, APIs, and desktop UIs to avoid blocking threads.
- When in doubt, use `File.Exists()` and `Directory.Exists()` before reading — a graceful error message beats an unhandled `FileNotFoundException`.
Tomorrow: we'll tackle JSON Serialization — turning your C# objects into JSON and back, because every API on the planet speaks JSON. 🚀
See you on Day 25!