Lesson 29: Working with Files — Streams, Buffers, Pipelines
Master high-performance file I/O with StreamReader, BufferedStream, MemoryStream, and async streaming patterns.
What You'll Learn
- StreamReader/StreamWriter for text and BinaryReader/BinaryWriter for raw data
- Buffering strategies and their performance impact
- MemoryStream for in-memory I/O
- IAsyncEnumerable for streaming large files without loading them into memory
🧠 Real-World Analogy
A stream is like a conveyor belt in a factory. Data flows past you piece by piece — you don't need to see the entire shipment at once. A buffer is like a loading dock: items accumulate there before being moved in bulk, which is far more efficient than carrying them one by one.
Text & Binary Streams
StreamWriter/StreamReader handle text with encoding (UTF-8 by default). BinaryWriter/BinaryReader work with raw bytes — perfect for compact file formats, game save files, or network protocols.
StreamWriter, StreamReader & Binary I/O
Write and read text files and binary data with proper encoding.
using System;
using System.IO;
using System.Text;

class Program
{
    static void Main()
    {
        string filePath = "demo.txt";

        // === Writing with StreamWriter ===
        using (var writer = new StreamWriter(filePath, false, Encoding.UTF8))
        {
            writer.WriteLine("Line 1: Hello from StreamWriter!");
            writer.WriteLine("Line 2: Streams handle encoding automatically.");
            writer.WriteLine("Line 3: UTF-8 is the default and recommended encoding.");
        }

        // === Reading with StreamReader ===
        using (var reader = new StreamReader(filePath, Encoding.UTF8))
        {
            string? line;
            while ((line = reader.ReadLine()) != null)
            {
                Console.WriteLine(line);
            }
        }
    }
}
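The binary counterparts mentioned above, BinaryWriter and BinaryReader, round-trip typed values as compact raw bytes. A minimal sketch (file name and values are illustrative; values must be read back in the exact order they were written):

```csharp
using System;
using System.IO;

class BinaryDemo
{
    static void Main()
    {
        string path = "save.dat";

        // Write raw typed values — compact, no text-encoding overhead
        using (var writer = new BinaryWriter(File.Open(path, FileMode.Create)))
        {
            writer.Write(42);        // Int32 (4 bytes)
            writer.Write(3.14);      // Double (8 bytes)
            writer.Write("player1"); // length-prefixed string
        }

        // Read back in the same order as written
        using (var reader = new BinaryReader(File.Open(path, FileMode.Open)))
        {
            int level = reader.ReadInt32();
            double score = reader.ReadDouble();
            string name = reader.ReadString();
            Console.WriteLine($"{name}: level {level}, score {score}");
        }
    }
}
```

Because there is no self-describing structure in the file, the read order is part of your format's contract.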
Buffering & Performance
Every disk I/O call has overhead. By buffering data (accumulating many small writes into fewer large ones), you can see 10-100x speedups. The default buffer is 4KB; for bulk operations, 64KB or larger typically performs better.
Buffered vs Unbuffered Performance
Measure the dramatic speed difference between buffered and unbuffered writes.
using System;
using System.Diagnostics;
using System.IO;

class Program
{
    static void Main()
    {
        const int iterations = 100_000;
        string path = "perf_test.txt";

        // === Unbuffered writes (slow) ===
        var sw1 = Stopwatch.StartNew();
        using (var fs = new FileStream(path, FileMode.Create,
            FileAccess.Write, FileShare.None, 1)) // 1-byte buffer!
        using (var writer = new StreamWriter(fs))
        {
            for (int i = 0; i < iterations; i++)
                writer.WriteLine($"Line {i}");
        }
        sw1.Stop();

        // === Buffered writes (fast) ===
        var sw2 = Stopwatch.StartNew();
        using (var fs = new FileStream(path, FileMode.Create,
            FileAccess.Write, FileShare.None, 64 * 1024)) // 64KB buffer
        using (var writer = new StreamWriter(fs))
        {
            for (int i = 0; i < iterations; i++)
                writer.WriteLine($"Line {i}");
        }
        sw2.Stop();

        Console.WriteLine($"Unbuffered: {sw1.ElapsedMilliseconds} ms");
        Console.WriteLine($"Buffered:   {sw2.ElapsedMilliseconds} ms");
    }
}
Async Streams & IAsyncEnumerable
For large files, File.ReadAllLines() loads everything into memory. With IAsyncEnumerable, you process one line at a time asynchronously — perfect for files that are gigabytes in size.
Async File Processing with IAsyncEnumerable
Stream large files line-by-line without loading into memory.
using System;
using System.Collections.Generic;
using System.IO;
using System.Threading.Tasks;

class Program
{
    // IAsyncEnumerable — stream data without loading everything into memory
    static async IAsyncEnumerable<string> ReadLinesAsync(string path)
    {
        using var reader = new StreamReader(path);
        string? line;
        while ((line = await reader.ReadLineAsync()) != null)
        {
            yield return line;
        }
    }

    static async Task ProcessLargeFile(string path)
    {
        long count = 0;
        await foreach (string line in ReadLinesAsync(path))
        {
            count++; // process each line here without holding the whole file
        }
        Console.WriteLine($"Processed {count} lines.");
    }

    static async Task Main()
    {
        await File.WriteAllLinesAsync("big.txt", new[] { "a", "b", "c" });
        await ProcessLargeFile("big.txt");
    }
}
Pro Tip
For ultra-high-performance scenarios (like web servers processing request bodies), look into System.IO.Pipelines. It provides zero-copy, back-pressure-aware streaming that's used internally by ASP.NET Core's Kestrel server.
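To give a feel for the Pipe API, here is a minimal single-process round-trip sketch (assumes the System.IO.Pipelines package is referenced; real usage runs producer and consumer concurrently with back-pressure, which this simplifies away):

```csharp
using System;
using System.Buffers;
using System.IO.Pipelines;
using System.Text;
using System.Threading.Tasks;

class PipeDemo
{
    public static async Task<string> RoundTripAsync(string text)
    {
        var pipe = new Pipe();

        // Producer: write directly into memory the pipe hands out (zero-copy)
        byte[] payload = Encoding.UTF8.GetBytes(text);
        Memory<byte> memory = pipe.Writer.GetMemory(payload.Length);
        payload.CopyTo(memory);
        pipe.Writer.Advance(payload.Length);
        await pipe.Writer.FlushAsync();
        pipe.Writer.Complete();

        // Consumer: read the buffered sequence, then mark it consumed
        ReadResult result = await pipe.Reader.ReadAsync();
        string decoded = Encoding.UTF8.GetString(result.Buffer.ToArray());
        pipe.Reader.AdvanceTo(result.Buffer.End);
        pipe.Reader.Complete();
        return decoded;
    }

    static async Task Main()
    {
        Console.WriteLine(await RoundTripAsync("hello pipelines"));
    }
}
```

The key difference from plain streams: the pipe owns the buffers, so the producer writes into memory the pipe supplies rather than allocating and copying its own.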
Common Mistakes
- Forgetting to flush/dispose StreamWriter — data stays in the buffer and is lost
- Not resetting MemoryStream.Position before reading — reads nothing
- Using File.ReadAllText for large files — causes OutOfMemoryException
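The MemoryStream pitfall above can be shown in a few lines (a minimal sketch):

```csharp
using System;
using System.IO;
using System.Text;

class MemoryStreamDemo
{
    static void Main()
    {
        var ms = new MemoryStream();
        byte[] data = Encoding.UTF8.GetBytes("hello");
        ms.Write(data, 0, data.Length);

        // Position now sits at the end — reading from here returns 0 bytes
        Console.WriteLine(ms.Read(new byte[8], 0, 8)); // prints 0

        // Rewind before reading
        ms.Position = 0;
        using var reader = new StreamReader(ms);
        Console.WriteLine(reader.ReadToEnd()); // prints "hello"
    }
}
```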
Lesson Complete!
You've mastered .NET file I/O from basic streams to async pipelines. Next, learn advanced JSON processing with System.Text.Json.