I needed a job board for Vancouver marketing roles. I didn't want to spend a week building it or pay $5,000 for a WordPress plugin that would break every time I updated something. So I sat down with Claude Code on a Tuesday afternoon and built one from scratch in about three hours.
The result is live at /marketing-jobs-vancouver/. It scrapes job postings from multiple sources, filters them by role and location, updates automatically, and serves everything as a static site. Zero database. Zero hosting costs beyond what I already pay for the domain. And it's faster than any job board I've used.
Here's exactly how I did it — the prompts I used, the structure I chose, and what I'd do differently if I built another one tomorrow.
Why Build a Job Board with Claude Code?
Before I walk through the process, it's worth explaining why Claude Code is particularly well-suited for this kind of project. A job board is fundamentally about three things: scraping data from external sources, structuring it consistently, and displaying it in a clean, searchable interface.
Claude Code can handle all three. It writes the scraping logic, builds the filtering and categorization layer, and generates the HTML with proper SEO structure. And because it's all static files, there's no backend to maintain — just a script that runs once a day to refresh the listings.
Compare that to hiring a developer or using a SaaS platform. A developer quote for this would start at $3,000 and take two weeks. A platform like Breezy or Workable charges $150–$400/month and locks you into their ecosystem. I wanted something I owned, could customize instantly, and could replicate for other niches if it worked.
Step 1: Defining the Structure
The first conversation I had with Claude Code wasn't about code — it was about what data fields each job posting needed and where the data would come from.
I decided on these core fields:
- Job title — exactly as posted
- Company name — standardized where possible
- Location — city-level, with a "Remote" flag
- Job type — Full-time, Part-time, Contract, Internship
- Category — SEO, Content, Paid Ads, Social Media, etc.
- Posted date — to sort by freshness
- Source URL — direct link to the original posting
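Concretely, one record in jobs.json might look like this (the field names and the company are illustrative, not the live schema):

```json
{
  "title": "Digital Marketing Specialist",
  "company": "Example Agency",
  "location": "Vancouver, BC",
  "remote": false,
  "type": "Full-time",
  "category": "Paid Ads",
  "posted": "2024-05-12",
  "url": "https://example.com/jobs/123"
}
```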
The hardest part here was deciding how to categorize roles. Job titles are inconsistent — one company's "Growth Marketer" is another's "Digital Marketing Specialist." I solved this by giving Claude Code a mapping table of keywords to categories. If the title contains "SEO" or "search," it goes in SEO. If it has "social" or "community," it goes in Social Media. Imperfect, but good enough for a first version.
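The mapping table can be sketched as a plain dictionary where the first keyword match wins. The categories and keywords below are hypothetical examples, not the exact table I used:

```python
# Hypothetical keyword-to-category mapping; first match wins.
CATEGORY_KEYWORDS = {
    "SEO": ["seo", "search"],
    "Social Media": ["social", "community"],
    "Content": ["content", "copywriter", "editor"],
    "Paid Ads": ["ppc", "paid", "ads", "sem"],
}

def categorize(title: str) -> str:
    """Map a raw job title to a board category by keyword match."""
    lowered = title.lower()
    for category, keywords in CATEGORY_KEYWORDS.items():
        if any(kw in lowered for kw in keywords):
            return category
    return "Other"  # fall back rather than dropping the listing
```

The substring match is deliberately crude — "search" would also match "Researcher" — which is exactly the "imperfect, but good enough" tradeoff described above.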
Step 2: Writing the Scraper
I chose three sources to start: LinkedIn Jobs, Indeed, and a few local Vancouver marketing agencies that post openings on their own sites. The scraper logic for each source is slightly different because they all structure their HTML differently, but the pattern is the same.
The prompt I used looked like this:
You are building a job scraper for marketing roles in Vancouver.
Target source: [URL]
Extract: job title, company, location, type, posting date, and apply URL.
Return a JSON array with consistent field names.
Handle missing fields gracefully (use null or "Not specified").
Respect robots.txt and add a 2-second delay between requests.
Claude Code wrote a Python script using BeautifulSoup that fetches the job listings page, parses the HTML, and outputs clean JSON. I ran it on each source separately, reviewed the output for accuracy, and then merged the results into a single jobs.json file.
One thing I learned: always validate the scraper output manually before trusting it at scale. Claude Code will follow the structure you ask for, but if the source site changes its HTML class names, the scraper breaks silently. I check the output once a week to catch issues early.
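A silently broken scraper usually shows up as zero rows or all-null fields, so even a cheap automated sanity check helps between manual reviews. A minimal sketch (thresholds are arbitrary assumptions):

```python
def validate(jobs: list[dict], min_count: int = 5) -> list[str]:
    """Return a list of problems that suggest the scraper broke
    silently, e.g. after a source changed its HTML class names."""
    problems = []
    if len(jobs) < min_count:
        problems.append(f"only {len(jobs)} jobs scraped (expected >= {min_count})")
    for field in ("title", "company", "url"):
        missing = sum(1 for j in jobs if not j.get(field))
        if jobs and missing / len(jobs) > 0.5:
            problems.append(f"{field} missing in {missing}/{len(jobs)} rows")
    return problems
```

Running this after each scrape and alerting on a non-empty result would turn the weekly spot-check into a daily automatic one.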
Step 3: Building the Job Board Interface
With the data in a structured JSON file, the next step was displaying it. I wanted a simple, fast interface with filters for category, job type, and location. No database queries. No server-side rendering. Just static HTML and a bit of JavaScript for the filtering.
I gave Claude Code the jobs.json file and asked it to generate an HTML page that:
- Loops through each job and renders a card with the title, company, location, and a link to apply
- Adds filter dropdowns for category, type, and location
- Uses JavaScript to hide/show cards based on the active filters
- Sorts jobs by posted date, with the most recent at the top
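Since the page is generated from jobs.json rather than rendered server-side, the build step reduces to a small template loop. This sketch shows the idea — card markup and the `data-*` attributes that a client-side filter script could key on are my assumptions, not the exact generated code:

```python
import html

def render_card(job: dict) -> str:
    """One job card; data-* attributes drive client-side filtering."""
    return (
        f'<article class="job" data-category="{html.escape(job["category"])}"'
        f' data-type="{html.escape(job["type"])}"'
        f' data-location="{html.escape(job["location"])}">'
        f'<h3>{html.escape(job["title"])}</h3>'
        f'<p>{html.escape(job["company"])} · {html.escape(job["location"])}</p>'
        f'<a href="{html.escape(job["url"])}">Apply</a>'
        f"</article>"
    )

def render_board(jobs: list[dict]) -> str:
    """Most recent first, matching the sort order described above.
    Assumes ISO YYYY-MM-DD dates, which sort correctly as strings."""
    ordered = sorted(jobs, key=lambda j: j["posted"], reverse=True)
    return "\n".join(render_card(j) for j in ordered)
```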
The output was cleaner than I expected. Claude Code wrote the card layout with proper semantic HTML, added ARIA labels for accessibility, and included schema.org markup for JobPosting so Google can index the listings properly.
I made a few tweaks to the styling to match the rest of alejandroarce.com, but the structure stayed intact. Total time from prompt to working page: about 40 minutes, including testing.
Step 4: Automating the Updates
A job board is only useful if it stays current. I didn't want to manually re-run the scraper every day, so I set up a cron job that runs the script at 6 AM Pacific and pushes the updated jobs.json file to the site.
The automation script does three things:
- Runs the scraper for each source and merges the results
- Deduplicates any listings that appear on multiple sources
- Removes jobs older than 30 days (to keep the board fresh)
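The deduplication and freshness steps can be sketched like this. The choice of (title, company) as the duplicate key is my assumption; ISO YYYY-MM-DD dates are assumed for the posted field:

```python
import datetime as dt

def refresh(jobs: list[dict], today: dt.date, max_age_days: int = 30) -> list[dict]:
    """Drop duplicate listings (same title + company) and anything
    posted more than max_age_days ago."""
    seen = set()
    fresh = []
    for job in jobs:
        key = (job["title"].lower(), job["company"].lower())
        if key in seen:
            continue  # same role found on another source
        posted = dt.date.fromisoformat(job["posted"])
        if (today - posted).days > max_age_days:
            continue  # stale listing, keep the board fresh
        seen.add(key)
        fresh.append(job)
    return fresh
```

A crontab line such as `0 6 * * *` pointed at the script would cover the 6 AM Pacific schedule, assuming the server clock is set to Pacific time.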
Because the site is static, the update process is just replacing one JSON file. No migrations. No downtime. It takes about 90 seconds to run and hasn't failed once in two months.
What This Cost Me
The total cost to build and run this job board:
- Development time: 3 hours
- Hosting: $0 (it's part of my existing site)
- Scraping cost: $0 (all public data, under API rate limits)
- Maintenance: ~15 minutes/week to spot-check accuracy
Compare that to a $3,000 developer build or $200/month for a SaaS tool, and the ROI is obvious. Even if I only use this board for my own hiring, it's already paid for itself in time saved.
What I'd Do Differently Next Time
If I were building this again — or a job board for a different niche — here's what I'd change:
- Add email alerts: Let users subscribe to new postings in a specific category. I didn't build this because I wanted to avoid managing a mailing list, but it's the #1 feature request I've gotten.
- Pull from more sources: Right now I'm scraping three. Adding Glassdoor, AngelList, and a few Vancouver startup job boards would give better coverage.
- Build a submission form: Let companies post directly. This would require moderation, but it would also make the board more comprehensive.
None of these are hard to implement. They're just features I didn't need on day one. That's the advantage of building your own tool — you add complexity only when it's justified.
The Bigger Lesson Here
This project reinforced something I've been saying to clients for months: if you can describe the logic clearly, Claude Code can probably build it. A job board isn't a trivial project, but it's also not unique. The structure is well-defined, the data sources are public, and the interface follows a pattern that's been proven thousands of times.
What made this fast wasn't that Claude Code is magic. It's that I knew what I wanted, broke it into clear steps, and validated the output at each stage. That's the skillset that matters now — knowing how to direct the tool, not how to write every line of code yourself.
If you're thinking about building something similar — a job board, a directory, a content aggregator — the process is repeatable. Start with the data structure, build the scraper, generate the interface, and automate the updates. You can have a working version in an afternoon.
And if you want to talk through how this would work for your specific use case, I'm happy to walk you through it. I've done this enough times now that I can usually sketch out the architecture in a 20-minute call. The FAQ page also covers a lot of the common questions about scoping and timelines.
The tools exist. The question is just what you're going to build with them.