Week 3 was about making the pipeline survive messy input. Which pattern do you think you'll reach for most often?
The chunk pattern. range(0, len(items), size) feels universal — any batched I/O task uses the same three lines.
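Those three lines, sketched as a minimal helper (names match the recap's `chunk_list`; the doctest values are my own illustration):

```python
def chunk_list(items, size):
    # Slice the list into consecutive batches of at most `size` items.
    # The last chunk may be shorter when len(items) isn't a multiple of size.
    return [items[i:i + size] for i in range(0, len(items), size)]

chunk_list([1, 2, 3, 4, 5], 2)  # → [[1, 2], [3, 4], [5]]
```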
That's the one most people carry forward. The generator streaming pattern is the other — once you internalise yield, large inputs stop being intimidating.
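A minimal sketch of that streaming shape, assuming the week's `stream_lines` takes any iterable of raw lines (the sample input is illustrative):

```python
def stream_lines(lines):
    # Lazily yield each stripped, non-empty line.
    # Nothing is buffered, so a 100M-line file costs the same memory as a 100-line one.
    for line in lines:
        stripped = line.strip()
        if stripped:
            yield stripped

list(stream_lines(["  a \n", "\n", "b\n"]))  # → ['a', 'b']
```

Because it is a generator, the caller can stop early and the rest of the input is never touched.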
The windowed error rate on Day 21 actually tied together the window math from Day 14. Same loop shape, new metric.
That's the whole point of reusable shapes. Own the window loop once and every rolling metric — max, mean, rate, threshold — becomes a one-line swap.
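The one-line swap in practice: a sketch of `windowed_error_rate` where the rate computation is the only line that would change for max or mean (the `"error"` label is my assumption about the input format):

```python
def windowed_error_rate(statuses, size):
    # Slide a fixed-size window over the statuses; for each full window,
    # record the share of "error" entries, rounded to 2 decimal places.
    rates = []
    for i in range(len(statuses) - size + 1):
        window = statuses[i:i + size]
        errors = sum(1 for s in window if s == "error")
        rates.append(round(errors / size, 2))  # ← swap this line for max/mean/etc.
    return rates
```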
This week your pipeline learned to handle real load:
- `stream_lines` — yield each non-empty stripped line lazily
- `merge_log_streams` — concat + `sorted(key=lambda)` for chronological merge
- `chunk_list` — `[items[i:i+size] for i in range(0, len(items), size)]`
- `count_retries_needed` — `enumerate` with 1-based count + sentinel `-1`
- `windowed_error_rate` — rolling window + `round(errors/size, 2)`

Meta-skill: streaming, batching, and windowing are the three shapes that scale from a hundred rows to a hundred million.
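The two summarised helpers not shown elsewhere, sketched under stated assumptions: log entries start with a sortable timestamp field, and attempts are labelled `"success"` (both are my illustration, not the original spec):

```python
def merge_log_streams(a, b):
    # Concatenate both streams, then sort chronologically; the key pulls
    # the timestamp out of each entry (assumed to be the first field).
    return sorted(a + b, key=lambda entry: entry.split()[0])

def count_retries_needed(attempts):
    # 1-based position of the first success; -1 sentinel when nothing succeeds.
    for n, outcome in enumerate(attempts, start=1):
        if outcome == "success":
            return n
    return -1
```

Usage: `count_retries_needed(["fail", "fail", "success"])` returns `3`, and an all-failure list returns the `-1` sentinel rather than raising.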