Final lesson. The smallest reliable pipeline that exercises six production primitives on a tiny generic input. No new APIs.
Given two events (one valid, one with bad data) and a known-good signature for each:
the loop verifies each signature, skips ids already in the `processed` set, handles only `item.created`, and writes `state` only on success:

```python
import hmac, hashlib, json

secret = b"shh"

def sign(payload_bytes):
    return hmac.new(secret, payload_bytes, hashlib.sha256).hexdigest()

def process(event):
    """Apply event to a side state. Raises if invalid."""
    if event.get("data", {}).get("value", 0) < 0:
        raise ValueError("negative value")
    return event["data"]["value"] * 2

state = {}
processed = set()
dead_letter = []
results = []

events_in = [
    b'{"event":{"type":"item.created","id":"e_a","data":{"name":"a","value":5}}}',
    b'{"event":{"type":"item.created","id":"e_b","data":{"name":"b","value":-1}}}',
]

for payload_bytes in events_in:
    sig = sign(payload_bytes)  # stands in for the signature a real sender would attach
    if not hmac.compare_digest(sign(payload_bytes), sig):
        results.append("verify_failed"); continue
    payload = json.loads(payload_bytes)
    event = payload.get("event", {})
    eid = event.get("id")
    if eid in processed:
        results.append("skip"); continue
    if event.get("type") != "item.created":
        results.append("unhandled"); continue
    try:
        out = process(event)
        state[eid] = out
        processed.add(eid)
        results.append("done")
    except Exception as e:
        dead_letter.append({"id": eid, "error": str(e)})
        results.append("dead_letter")
```

Expected: `results == ["done", "dead_letter"]`, `state == {"e_a": 10}`, `len(dead_letter) == 1`.
Six primitives in 25 lines.
That's the whole point of the track. The lessons gave you primitives. The synthesis is a proof that they compose cleanly.
Could this run in production?
With a few additions — persistent state instead of in-memory, a real queue instead of a list, retry around the process call, structured logs per step, an alert when dead_letter exceeds threshold — yes. The shape is what you'd ship. The persistence and observability layers add bulk but don't change the structure.
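The retry around the process call is the smallest of those additions. A minimal sketch, assuming a fixed attempt budget and exponential backoff (the `max_attempts` and `base_delay` defaults are illustrative, not from the lesson):

```python
import time

def process_with_retry(process, event, max_attempts=3, base_delay=0.1):
    """Retry transient failures; re-raise after the last attempt so the
    caller's existing try/except can still dead-letter the event."""
    for attempt in range(max_attempts):
        try:
            return process(event)
        except Exception:
            if attempt == max_attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))  # exponential backoff
```

The caller's shape is unchanged: only the inner call gains retries, so a persistently bad event still lands in `dead_letter` after the attempts are exhausted.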
| Primitive | From | Used for |
|---|---|---|
| HMAC verification | day 4 | step 1 — signature |
| JSON parse + defensive read | day 3 | step 2 — payload |
| Event idempotency | day 6 | step 3 — dedupe |
| Dispatch by type | day 5 | step 4 — route |
| Dead-letter on failure | day 23 | step 5 — handle bad data |
| Replay-safe state writes | day 24 | step 6 — write only on success |
Six primitives. Each was a small lesson; combined they're a production-shaped pipeline.
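The pipeline routes with a single type check; with a second event type, dispatch by type becomes a handler registry. A sketch of that step in isolation, where the registry and the `item.deleted` handler are hypothetical extensions, not part of the lesson's code:

```python
def handle_created(event):
    # same rule as process() in the pipeline
    if event.get("data", {}).get("value", 0) < 0:
        raise ValueError("negative value")
    return event["data"]["value"] * 2

def handle_deleted(event):
    return None  # hypothetical: a real handler would remove state

HANDLERS = {
    "item.created": handle_created,
    "item.deleted": handle_deleted,
}

def dispatch(event):
    """Route by type; unknown types get the same 'unhandled' outcome."""
    handler = HANDLERS.get(event.get("type"))
    if handler is None:
        return "unhandled"
    return handler(event)
```

Adding an event type then means adding one entry to the dict, not another `if` branch in the loop.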
Deliberately out of scope to keep verification deterministic: persistence, queues, retries, structured logging, and alerting. If you wanted production, all of these would layer on. The synthesis stays thin so the composition shape is legible.
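As one example of what "persistent state instead of in-memory" would look like, the `processed` set can be rebuilt from an append-only journal so dedupe survives a restart. The JSON-lines file format here is an assumption:

```python
import json, os

def load_processed(path):
    """Rebuild the dedupe set from a JSON-lines journal, if it exists."""
    if not os.path.exists(path):
        return set()
    with open(path) as f:
        return {json.loads(line)["id"] for line in f if line.strip()}

def mark_processed(path, processed, eid):
    """Append to the journal first, then update the in-memory set."""
    with open(path, "a") as f:
        f.write(json.dumps({"id": eid}) + "\n")
    processed.add(eid)
```

Swapping these two calls in for `processed = set()` and `processed.add(eid)` changes the durability, not the loop's structure.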
```
# right — six primitives, in order, with structured pass/fail per event
verify -> parse -> dedupe -> dispatch -> try-process -> state-or-dlq
```

```python
# wrong — assumes everything succeeds, no dead-letter
try:
    state[event_id] = process(event)
except:
    pass  # silent loss
```

```python
# over-shape — every layer wrapped in try, retry on every step
try:
    for attempt in range(3):
        try:
            verify()
        except: time.sleep(2 ** attempt)
    for attempt in range(3):
        try:
            parse()
    ...
```

Production code does layer in retries. For learning the composition, the lean shape is right.
With six primitives composed, you can write the smallest reliable webhook handler. Your kit: defensive `.get` reads, `requests` for direct HTTP, rate-limit-aware calls. Any automation you'll read or build composes these. The remaining tracks (AI series) add the LLM layer on top — same shape, different tool.
After this track, write your own. Pick a real automation you've been meaning to build — a daily summary, a Sheet-driven workflow, a webhook receiver — and apply the kit. The first version will look like one of these synthesis lessons. That's correct.