Before removing the subagent role system, migrate the useful parts to skills and role definitions. Each SubagentRole (WebCrawler, CodeReviewer, DataExtractor, Researcher) represents a useful pattern that should be preserved.
## WebCrawler
- Tools: web_search, read_webpages, search_codebase
- Model: claude-3-haiku (fast/cheap)
- Purpose: Quick web research
Migrate to: Omni/Agent/Skills/shared/web-research/SKILL.md
---
name: web-research
description: Research a topic using web search. Use when you need to find current information, verify facts, or explore a topic online.
---
# Web Research
## Process
1. Formulate search queries - be specific, try multiple phrasings
2. Use `web_search` to find relevant sources
3. Use `read_webpages` to extract content from top 3-5 results
4. Synthesize findings, always cite sources with URLs
## Guidelines
- Be EFFICIENT - extract key facts, don't save full page contents
- Limit to 3-5 most relevant sources
- Work iteratively: search → skim → read best 2-3 → synthesize
- ALWAYS cite sources - every claim needs a URL
- Stop when you have sufficient information
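The search → skim → read → synthesize loop above can be sketched in Python. The `web_search` and `read_webpages` calls below are stubs standing in for the real tools (their actual return shapes may differ); only the control flow is the point.

```python
# Sketch of the web-research loop: search, skim, read top sources, synthesize.
# web_search / read_webpages are STUBS for the real tools, not their actual APIs.

def web_search(query):
    # Stub: the real tool returns ranked results with titles and URLs.
    return [{"title": f"Result {i} for {query}", "url": f"https://example.com/{i}"}
            for i in range(10)]

def read_webpages(urls):
    # Stub: the real tool returns extracted page text keyed by URL.
    return {url: f"Content of {url}" for url in urls}

def research(topic, max_sources=3):
    results = web_search(topic)
    # Skim: keep only the 3-5 most relevant sources, per the guidelines.
    top = results[:max_sources]
    pages = read_webpages([r["url"] for r in top])
    # Synthesize: extract key facts, citing every claim with its URL.
    return [{"claim": f"Key fact from {r['title']}", "source": r["url"]}
            for r in top]

findings = research("example topic")
```

Note that the sketch never stores full page contents in its output, only the synthesized claims plus their source URLs.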
## CodeReviewer
- Tools: read_file, search_codebase, search_and_read, run_bash
- Model: claude-sonnet-4
- Purpose: Thorough code analysis
Migrate to: Omni/Agent/Roles/Reviewer.md (already planned in t-307)
The Reviewer role covers code review; the review process itself becomes a skill:
Omni/Agent/Skills/shared/code-review/SKILL.md
---
name: code-review
description: Review code changes for quality, correctness, and conventions. Use when asked to review a PR, diff, or code changes.
---
# Code Review
<role:reviewer>
Review the code with focus on:
1. **Correctness**: Does it do what it's supposed to?
2. **Testing**: Are there tests? Do they cover edge cases?
3. **Conventions**: Does it follow project patterns?
4. **Security**: Any obvious vulnerabilities?
5. **Performance**: Any obvious inefficiencies?
Use `search_codebase` to understand existing patterns.
Use `run_bash` to run tests if needed.
Provide specific, actionable feedback with file:line references.
</role>
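Specific, actionable feedback with file:line references could be carried in a small structure like the following. This is an illustrative sketch, not an existing API; the field names are assumptions.

```python
from dataclasses import dataclass

@dataclass
class ReviewComment:
    """One review finding, tied to a location and one of the five focus areas."""
    file: str
    line: int
    category: str  # correctness | testing | conventions | security | performance
    message: str

    def render(self) -> str:
        # Emit the file:line form the skill asks for.
        return f"{self.file}:{self.line} [{self.category}] {self.message}"

comment = ReviewComment("src/parser.py", 42, "correctness",
                        "Off-by-one in slice bound.")
```

Keeping the category field restricted to the five focus areas above makes it easy to check that a review covered all of them.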
## DataExtractor
- Tools: read_webpages, read_file, search_codebase
- Model: claude-3-haiku
- Purpose: Structured data extraction
Migrate to: Omni/Agent/Skills/shared/data-extraction/SKILL.md
---
name: data-extraction
description: Extract structured data from web pages or documents. Use when you need to pull specific information into a structured format.
---
# Data Extraction
## Process
1. Identify the source (URL or file path)
2. Determine what data to extract (fields, format)
3. Read the source content
4. Extract into structured format (JSON, table, etc.)
## Output Format
Return extracted data as JSON:
```json
{
  "source": "url or file",
  "extracted_at": "timestamp",
  "data": { ... }
}
```
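A helper that wraps extracted fields in this envelope might look like the following sketch; the ISO-8601 UTC timestamp is an assumed convention, since the format above only says "timestamp".

```python
import json
from datetime import datetime, timezone

def wrap_extraction(source, data):
    # Wrap extracted fields in the {source, extracted_at, data} envelope.
    # ISO-8601 UTC is an assumed timestamp convention.
    return {
        "source": source,
        "extracted_at": datetime.now(timezone.utc).isoformat(),
        "data": data,
    }

envelope = wrap_extraction("https://example.com/pricing",
                           {"plan": "pro", "price_usd": 20})
print(json.dumps(envelope, indent=2))
```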
## Researcher
- Tools: web_search, read_webpages, read_file, search_codebase, search_and_read
- Model: claude-sonnet-4
- Purpose: General research
Migrate to: Omni/Agent/Skills/shared/research/SKILL.md
---
name: research
description: Deep research on a topic combining web sources and local knowledge. Use for thorough investigation requiring multiple sources.
---
# Research
## Process
1. Clarify the research question
2. Search local codebase/files for existing knowledge
3. Search web for external sources
4. Cross-reference and synthesize findings
5. Document with citations
## Output Format
```json
{
  "summary": "Brief overall summary",
  "confidence": 0.85,
  "findings": [
    {
      "claim": "Key insight",
      "source_url": "https://...",
      "source_name": "Source"
    }
  ],
  "caveats": "Limitations or uncertainties"
}
```
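A minimal schema check for this output format — confidence in [0, 1], every finding citing a URL — could be sketched as follows. The validator is hypothetical, not part of any existing tooling.

```python
def validate_research_output(result):
    # Minimal check against the research output format above.
    assert isinstance(result.get("summary"), str) and result["summary"]
    assert 0.0 <= result.get("confidence", -1.0) <= 1.0
    for finding in result.get("findings", []):
        # Every claim needs a source URL.
        assert finding.get("claim")
        assert finding.get("source_url", "").startswith("http")
    return True

example = {
    "summary": "Brief overall summary",
    "confidence": 0.85,
    "findings": [
        {"claim": "Key insight",
         "source_url": "https://example.com",
         "source_name": "Example"},
    ],
    "caveats": "Limitations or uncertainties",
}
```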