Twenty lessons. Today you assemble five of them into the smallest webhook-driven pipeline.
Given a payload string + a known-good HMAC signature + a shared secret, run the pipeline:
```python
import hmac
import hashlib
import json

def log(level, event, **fields):
    print(json.dumps({"level": level, "event": event, **fields}))

secret = b"shh"
payload_bytes = b'{"event":{"type":"item.created","id":"e_42","data":{"name":"x","value":7}}}'
provided_sig = hmac.new(secret, payload_bytes, hashlib.sha256).hexdigest()  # known-good for the lesson

processed = set()
results = []

def on_created(event):
    return f"created:{event['id']}"

HANDLERS = {"item.created": on_created}

# step 1 — verify
computed = hmac.new(secret, payload_bytes, hashlib.sha256).hexdigest()
if not hmac.compare_digest(computed, provided_sig):
    log("error", "signature_invalid")
    raise RuntimeError("bad signature")
log("info", "step", n=1, action="verify", result="ok")

# step 2 — parse
payload = json.loads(payload_bytes)
event = payload.get("event", {})
log("info", "step", n=2, action="parse", event_id=event.get("id"), event_type=event.get("type"))

# step 3 — dedupe
eid = event.get("id")
if eid in processed:
    log("info", "step", n=3, action="skip", reason="duplicate")
else:
    processed.add(eid)
    # step 4 — dispatch
    fn = HANDLERS.get(event.get("type"))
    if fn:
        result = fn(event)
        results.append(result)
        log("info", "step", n=4, action="dispatch", result=result)
    else:
        log("warn", "step", n=4, action="dispatch", reason="unhandled")

log("info", "step", n=5, action="done", processed_count=len(processed))
```

No new primitives at all?
Zero. Verify (day 4), parse (day 3), dedupe (day 6), dispatch (day 5), structured log (day 16). Five primitives composed cleanly on one generic event.
Why doesn't it write to a Sheet?
This synthesis stays in-memory to keep the verification deterministic and the focus on composition shape. Day 7 already exercised the Sheet write on its own; combining all six (verify + parse + dedupe + dispatch + log + state) into one lesson would be too much. The week-4 final synthesis brings state back in.
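If you want a rough preview of what "bringing state back" could look like, here is a minimal sketch that persists the dedupe set to a local JSON file. This is not the day-7 Sheet write and not part of this lesson's code; the file name and both helper functions are assumptions made for illustration.

```python
# Sketch only: persist the dedupe set across runs with a local JSON file.
# STATE_FILE, load_state, and save_state are hypothetical names, not lesson code.
import json
from pathlib import Path

STATE_FILE = Path("processed_events.json")

def load_state():
    # Start from the ids already handled, or an empty set on the first run.
    if STATE_FILE.exists():
        return set(json.loads(STATE_FILE.read_text()))
    return set()

def save_state(processed):
    # Write the ids back out so replays are skipped on later runs too.
    STATE_FILE.write_text(json.dumps(sorted(processed)))
```

In the pipeline above, `processed = load_state()` would replace the empty set, and `save_state(processed)` would slot in just before the final step-5 log.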
| Primitive | From | Used for |
|---|---|---|
| `hmac.compare_digest` | day 4 | step 1 — verify the signature |
| `json.loads` + defensive `.get` | day 3 | step 2 — parse the payload |
| Dedup set | day 6 | step 3 — skip replays |
| Dispatch table | day 5 | step 4 — route by event type |
| `log(level, event, **fields)` | day 16 | each step's structured output |
Five primitives. Each was a 10-minute lesson on its own. Combined: a webhook handler shape you will recognize in production code.
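One way to make that shape explicit is to fold the five steps into a single function. This is a sketch, not lesson code: it reuses the names defined above (`secret`, `HANDLERS`, `processed`, `log`), and the function name and return convention are choices made for the example.

```python
def handle_webhook(payload_bytes, provided_sig):
    # step 1: verify
    computed = hmac.new(secret, payload_bytes, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(computed, provided_sig):
        log("error", "signature_invalid")
        raise RuntimeError("bad signature")
    # step 2: parse
    event = json.loads(payload_bytes).get("event", {})
    # step 3: dedupe
    eid = event.get("id")
    if eid in processed:
        log("info", "step", n=3, action="skip", reason="duplicate")
        return None
    processed.add(eid)
    # step 4: dispatch
    fn = HANDLERS.get(event.get("type"))
    if fn is None:
        log("warn", "step", n=4, action="dispatch", reason="unhandled")
        return None
    result = fn(event)
    # step 5: structured log
    log("info", "step", n=5, action="done", processed_count=len(processed))
    return result

processed.clear()                             # start fresh for the demo
handle_webhook(payload_bytes, provided_sig)   # "created:e_42"
handle_webhook(payload_bytes, provided_sig)   # same delivery again: step 3 logs a skip, returns None
```

Calling it twice with the same delivery exercises the dedupe branch: the second call logs the skip and returns None instead of dispatching.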
Deliberately not in scope for week 3:
The skill of week 3: recognize a 5-primitive shape and write it cleanly. Not predict every edge case.
```
# right — five primitives, in order, with structured logs
verify -> parse -> dedupe -> dispatch -> log
```
```python
# wrong — assumes the signature is always good
payload = json.loads(payload_bytes)   # no verify step
event = payload["event"]              # no .get()
fn = HANDLERS[event["type"]]          # KeyError on unhandled
```
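To see the difference the comments call out, feed both lookups an event type that isn't in the table (`item.deleted` is made up for this example):

```python
unknown = {"type": "item.deleted", "id": "e_99"}   # hypothetical unhandled event

# lean shape: .get returns None, so step 4 can log "unhandled" and move on
print(HANDLERS.get(unknown["type"]))               # None

# wrong shape: direct indexing raises before anything gets logged
try:
    HANDLERS[unknown["type"]]
except KeyError as exc:
    print("KeyError:", exc)                        # KeyError: 'item.deleted'
```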
```python
# over-shape — try/except every step, retry every call
try:
    try:
        verify(...)
    except Exception as e:
        ...  # retry
except Exception:
    ...  # alert
```

The over-shape belongs in production. For learning the composition, the lean shape is right.
A real webhook handler adds:

- … the process part (sending email, charging cards)

All of those compose with this synthesis — they don't replace it. The shape stays verify → parse → dedupe → dispatch → state → log.