#!/usr/bin/env agent

Find Prospects Skill

Search online communities for people who have problems your product solves.

Related Skills

- ParseReddit (Biz/Skills/ParseReddit.md): extracts a Reddit post and its top comments as JSON
- ParseHackerNews (Biz/Skills/ParseHackerNews.md): extracts a Hacker News item as JSON

Input

You need these parameters (provided in the task or ask the user):

- product: name and one-line description of the product
- problem: the problem it solves, phrased as prospects would describe it
- communities: which communities to search (e.g., reddit, hackernews)
- search_terms: phrases prospects use when describing the problem

Process

Phase 1: Search

  1. Construct 3-5 targeted search queries combining:

    • Site filter: site:reddit.com or site:news.ycombinator.com
    • Problem keywords from search_terms
    • Frustration/desire signals: “wish”, “looking for”, “need”, “anyone know”, “is there”
  2. Run each query with web_search and collect promising URLs
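The query construction in step 1 can be sketched as a shell loop. The site filters come from the skill itself; the search_terms here are borrowed from the Example section at the end and are only illustrative:

```shell
# Pair each search term with each site filter to get the query list.
sites=("site:reddit.com" "site:news.ycombinator.com")
search_terms=("newsletter text to speech" "too many newsletters to read")

queries=()
for site in "${sites[@]}"; do
  for term in "${search_terms[@]}"; do
    # Quote the term so the search engine treats it as a phrase
    queries+=("$site \"$term\"")
  done
done

printf '%s\n' "${queries[@]}"
```

Adding a frustration signal ("wish", "anyone know") to some queries is just another term in the inner loop.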

Phase 2: Deep Extraction

For each promising Reddit URL, use the ParseReddit skill (Biz/Skills/ParseReddit.md):

# REDDIT_URL is a post URL collected in Phase 1
curl -s -H "User-Agent: agentbot/1.0" "$REDDIT_URL.json" | jq '{
  author: .[0].data.children[0].data.author,
  title: .[0].data.children[0].data.title,
  body: .[0].data.children[0].data.selftext,
  score: .[0].data.children[0].data.score,
  num_comments: .[0].data.children[0].data.num_comments,
  created: .[0].data.children[0].data.created_utc,
  subreddit: .[0].data.children[0].data.subreddit,
  comments: [.[1].data.children[:3][] | .data | {author, body, score}]
}'
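If the collected URL carries a tracking query string, it would end up inside the `.json` path, so it helps to normalize first. A minimal sketch (the example URL, subreddit, and post ID are hypothetical):

```shell
# Normalize a Reddit post URL before appending .json:
# drop any query string, then any trailing slash.
REDDIT_URL="https://www.reddit.com/r/podcasts/comments/abc123/example_post/?utm_source=share"

url="${REDDIT_URL%%\?*}"   # strip from the first '?' onward
url="${url%/}"             # strip a trailing slash
json_url="$url.json"

echo "$json_url"
```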

For Hacker News URLs, use the ParseHackerNews skill (Biz/Skills/ParseHackerNews.md):

# Extract item ID from URL like news.ycombinator.com/item?id=41824973
ID=41824973
curl -s "https://hacker-news.firebaseio.com/v0/item/$ID.json" | jq '{
  author: .by,
  title: .title,
  text: .text,
  score: .score,
  time: .time,
  num_comments: .descendants
}'
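The ID extraction mentioned in the comment above can be automated rather than copied by hand; this sketch uses the example URL from the comment:

```shell
# Pull the numeric item ID out of an HN item URL.
HN_URL="https://news.ycombinator.com/item?id=41824973"

ID=$(printf '%s' "$HN_URL" | grep -oE 'id=[0-9]+' | cut -d= -f2)

echo "$ID"
```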

Phase 3: Qualify

For each extracted post, determine priority:

- High: recent, clearly describes the problem the product solves, and no good solution appears in the replies
- Medium: relevant but older, lower engagement, or only a partial match for the problem
- Skip: too old, already solved, off-topic, or deleted
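The priority decision can be sketched with the fields already extracted in Phase 2. The 180-day and score-10 cutoffs here are illustrative assumptions, not part of the skill:

```shell
# Rough priority from post age and engagement.
# created_utc and score would come from the Phase 2 jq output;
# here created_utc is set to 90 days ago for illustration.
created_utc=$(( $(date +%s) - 90 * 86400 ))
score=42

now=$(date +%s)
age_days=$(( (now - created_utc) / 86400 ))

if [ "$age_days" -le 180 ] && [ "$score" -ge 10 ]; then
  priority=high
elif [ "$age_days" -le 365 ]; then
  priority=medium
else
  priority=skip
fi

echo "$priority"
```

Relevance to the product still needs a judgment call on the post text; this only filters on age and engagement.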

Phase 4: Output

Report every qualified prospect using the Output Format below, including skipped URLs with reasons.

Output Format

## Prospects Found

### High Priority

1. **u/username** in r/subreddit
   - Posted: YYYY-MM-DD (X months ago)
   - Score: N points, M comments
   - Quote: "Exact quote from post..."
   - URL: https://reddit.com/...
   - Why: [Specific reason this is high priority]

### Medium Priority

2. **hn_user** on Hacker News
   - Posted: YYYY-MM-DD
   - Quote: "..."
   - URL: https://news.ycombinator.com/item?id=...
   - Why: [Reason]

### Skipped
- URL - Reason (too old, already solved, etc.)

### Search Queries Used
- query 1
- query 2

### Summary
- Queries run: X
- Posts examined: Y
- High priority: Z
- Medium priority: W

### Recommended Actions
1. [Specific suggestion for top prospect]
2. [Subreddit to monitor]

Quality Checklist

Before outputting, verify:

- Every quote is copied verbatim from the extracted post, not paraphrased
- Every URL resolves to the post being quoted
- Dates, scores, and comment counts come from the extracted JSON, not estimates
- Each skipped URL lists a reason
- Each high-priority entry explains why it is high priority

Example

Product: PodcastItLater - converts articles to podcasts
Problem: Too many articles/newsletters to read, want to listen instead
Communities: reddit, hackernews
Search terms:
  - "wish I could listen to articles"
  - "newsletter text to speech"
  - "too many newsletters to read"
  - "article backlog"