A client brief references a market rate for a service you have never quoted before. You need current pricing before writing the proposal. Right now that is a browser tab, a search, thirty minutes of reading. What is the agent version?
An agent that searches the web? But I thought the models we have been using do not have live internet access.
Standard models do not. But lesson_type: "ai-search" switches to Perplexity's Sonar model — web search is built in. The call looks identical to Day 3:
```python
def search_the_web(query: str) -> str:
    return Agent(model).run_sync(query).output
```

The code is identical to Day 3. Where does the search happen?
The Sonar model decides when a query needs a web search and performs it automatically. You do not add extra parameters — the model type is selected by lesson_type in the sandbox. Compare the two lessons:
```python
# Day 3 — standard model, no search
def run_agent(prompt: str) -> str:
    return Agent(model).run_sync(prompt).output

# Day 24 — Sonar model, web search built in
def search_the_web(query: str) -> str:
    return Agent(model).run_sync(query).output
```

Same API, different capability. I point the same function at a different model and it gains web search.
That is model selection. The code is the same; the intelligence behind it changes based on which model is configured. Perplexity Sonar is tuned for search-grounded answers — it cites sources, stays current, and prefers factual over fluent.
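Model selection can be made concrete with a small sketch. The sandbox's actual lesson_type-to-model mapping is not shown in the lesson, so the model identifiers and mapping below are illustrative assumptions, not real configuration:

```python
# Hypothetical sketch of lesson_type-driven model selection.
# Model identifiers here are assumptions for illustration only,
# not the sandbox's actual configuration.
MODEL_BY_LESSON_TYPE = {
    "standard": "openai:gpt-4o",      # Day 3 style: no live web access
    "ai-search": "perplexity:sonar",  # Day 24 style: search-grounded answers
}

def select_model(lesson_type: str) -> str:
    """Return the model identifier configured for a lesson type."""
    return MODEL_BY_LESSON_TYPE[lesson_type]

print(select_model("ai-search"))  # perplexity:sonar
```

The point the lesson makes survives in the sketch: the calling code never changes, only the lookup result does.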
Current market rates, competitor pricing, industry benchmarks — all accessible from the same Agent(model).run_sync(query).output pattern.
Yes — but treat it as a first draft. Sonar can hallucinate sources. Use it to get starting context, then verify anything that goes into a client deliverable. Web access does not eliminate the judgment layer; it just makes research faster.
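The judgment layer can also be sketched in code. A minimal example (the function names are mine, not part of the lesson) that treats a search-grounded answer as an unverified draft and pulls its cited URLs into a checklist for manual review:

```python
import re

def extract_sources(answer: str) -> list[str]:
    """Pull cited URLs out of a search-grounded answer so each
    one can be checked by hand before it reaches a client."""
    return re.findall(r"https?://[^\s)]+", answer)

def research_draft(answer: str) -> dict:
    """Wrap a model answer as a draft plus a verification checklist."""
    return {
        "draft": answer,
        "verify": extract_sources(answer),  # every source needs a human check
    }

draft = research_draft(
    "Typical hourly rates are $90-140 (see https://example.com/rates)."
)
print(draft["verify"])  # ['https://example.com/rates']
```

Nothing here validates the sources automatically; the output is a to-do list for the human step the lesson insists on.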
lesson_type: "ai-search"

When lesson_type is "ai-search", the sandbox configures the Perplexity Sonar model. Web search is implicit: the model decides when to search.

```python
def search_the_web(query: str) -> str:
    return Agent(model).run_sync(query).output
```

The code is identical to Day 3's run_agent. The only difference is the model selected by the sandbox; no extra parameters are needed on your side.
Sonar can hallucinate sources. Use it for initial context; verify anything that goes into a client deliverable.