Independent Project | Workflow Design

AI-Assisted Workflow for Job Discovery

A systems-focused case study showing how I designed an AI-assisted workflow to reduce noise in job discovery, classify role fit, and maintain a structured opportunity pipeline across fragmented job platforms.


Context

While searching for a new opportunity, I noticed that the job market was noisy and fragmented. There was no unified system to aggregate roles, evaluate fit, track application status, or identify stale or duplicate listings.

Instead, listings appear across multiple platforms with inconsistent titles and incomplete information. Some roles remain posted long after they are no longer active, creating ghost jobs. Because titles and job descriptions vary widely between companies, it's difficult to determine where legitimate opportunities exist and which roles are actually worth pursuing.

Maintaining a structured record of applications introduces additional complexity. Once an opportunity is identified, applications must often be tracked across multiple tools, making it difficult to maintain visibility into how the search is progressing.

As I tracked listings across multiple sources, a pattern emerged: the challenge wasn't simply discovering opportunities, but navigating a job market where the same roles appear repeatedly across platforms with small but critical variations.


How can a system reduce noise in job discovery so relevant opportunities can be identified, evaluated, and tracked through a reliable decision process?

Approaching the job search as an operations problem revealed an opportunity to design a lightweight system that could surface relevant opportunities and structure how they were evaluated and tracked.


Leadership

I designed and implemented a lightweight AI-assisted workflow to structure job discovery, role evaluation, and application tracking across multiple sources.

To create a consistent view of opportunities, roles surfaced through web searches and company job boards are entered into a dataset with core fields such as title, company, location, and posting link. The system then evaluates these roles against my resume variants, proposed titles, and companies I'm targeting. This surfaces both reasons a role may be a strong fit and caveats such as missing salary information or unclear hybrid requirements.

All opportunities initially appear as potential roles, to which I can assign a status: Interested, Applied, Not a fit, or Unavailable / inactive. These statuses help filter opportunities and reduce duplicate evaluation when listings reappear across search sources.
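The dataset and status model described above can be sketched as a minimal data structure. The field names, status labels, and example values here are illustrative assumptions, not the exact schema behind the workflow:

```python
from dataclasses import dataclass, field
from enum import Enum

class Status(Enum):
    POTENTIAL = "Potential"            # default for newly surfaced roles
    INTERESTED = "Interested"
    APPLIED = "Applied"
    NOT_A_FIT = "Not a fit"
    UNAVAILABLE = "Unavailable / inactive"

@dataclass
class Role:
    """One listing in the opportunity dataset."""
    title: str
    company: str
    location: str
    posting_link: str
    status: Status = Status.POTENTIAL
    notes: list[str] = field(default_factory=list)  # fit reasons and caveats

# A surfaced role enters the pipeline as a potential opportunity,
# annotated with caveats from the fit evaluation.
role = Role("Operations Analyst", "Example Co", "Seattle, WA (hybrid)",
            "https://example.com/jobs/123")
role.notes.append("Caveat: no salary range listed")
```

Keeping the evaluation output as notes on the record, rather than as a hard score, matches the design goal of leaving the final judgment to a human reviewer.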

AI assists with early triage before the application process begins: surfacing relevant roles, identifying potential matches with my experience, and flagging duplicate listings. Human oversight remains central to the process. The Unavailable status records that I've verified a listing is inactive or invalid, while Not a fit signals that the opportunity doesn't align with my experience or constraints. Both prevent the role from resurfacing in future searches.
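The suppression behavior above can be sketched as a filter over incoming listings: a normalization key on company and title lets near-identical listings collide, and anything already marked Not a fit or Unavailable is dropped before triage. The key function is an assumption about how duplicates could be matched, not the exact logic used:

```python
def dedup_key(company: str, title: str) -> tuple[str, str]:
    """Normalize company and title so near-identical listings collide."""
    norm = lambda s: " ".join(s.lower().split())
    return (norm(company), norm(title))

def filter_new_listings(listings, known_statuses):
    """Drop listings whose key is already closed out as Not a fit or Unavailable."""
    closed = {"Not a fit", "Unavailable / inactive"}
    return [l for l in listings
            if known_statuses.get(dedup_key(l["company"], l["title"])) not in closed]

# A role previously marked Not a fit...
known = {dedup_key("Example Co", "Operations Analyst"): "Not a fit"}

# ...is suppressed even when it reappears with different casing and spacing.
fresh = filter_new_listings(
    [{"company": "Example  Co", "title": "operations analyst"},
     {"company": "Other Inc", "title": "Program Manager"}],
    known,
)
```

A key this simple would miss reposts with reworded titles, which is one reason description-level evaluation still falls to a human.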

For roles marked Applied, the system surfaces them again when searches are run so their status can be updated to Application pending, Interviewing, or Not selected. This prevents duplicate applications while maintaining a clear record of search progress. Applied roles are also exported into a document for Washington unemployment reporting, automatically generating the list required for weekly submissions and eliminating the need to maintain separate documentation.
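The export step above can be sketched with the standard library's csv module. The columns are illustrative of the kind of record a weekly submission needs, not the exact format Washington requires:

```python
import csv
import io

def export_applied(roles, out):
    """Write Applied roles to CSV for weekly reporting.

    Column names are illustrative; the real report would use
    whatever fields the weekly submission requires.
    """
    writer = csv.writer(out)
    writer.writerow(["Company", "Title", "Date applied", "Posting link"])
    for r in roles:
        if r["status"] == "Applied":
            writer.writerow([r["company"], r["title"],
                             r["date_applied"], r["link"]])

# Only Applied roles reach the report; other statuses are skipped.
buf = io.StringIO()
export_applied(
    [{"status": "Applied", "company": "Example Co",
      "title": "Operations Analyst", "date_applied": "2024-05-06",
      "link": "https://example.com/jobs/123"},
     {"status": "Interested", "company": "Other Inc",
      "title": "Program Manager", "date_applied": "", "link": ""}],
    buf,
)
```

Generating the report from the same dataset that drives triage is what removes the need for separate documentation: the status field is the single source of truth.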


Findings

Rather than isolated issues with individual listings, the workflow revealed several recurring patterns in how job opportunities are distributed and presented across platforms.

  • The same roles appear in multiple places. Many roles appear across several job boards with small variations in title or description. Without a centralized view, this often leads to repeatedly evaluating the same opportunity.
  • Listings often omit key details. Important information such as salary ranges, remote or hybrid expectations, and location restrictions is frequently missing. This requires additional verification before determining whether a role is viable.
  • Relevant roles can hide behind unexpected titles. Roles with very different titles often describe similar responsibilities, while roles with similar titles can represent very different scopes. Evaluating the full description rather than relying on titles alone proved necessary to determine alignment.

Designing this workflow meant balancing automation with reliability. While automation helped reduce the time spent scanning listings across platforms, relying on it for final decisions risked surfacing stale postings or overlooking relevant roles due to inconsistent titles and descriptions. Instead, AI is used primarily for triage and pattern detection, while evaluation and application decisions remain human-driven.


Impact

Once the workflow was in place, the job search became much easier to manage. Bringing listings into a single dataset made it easier to see when the same role appeared across multiple platforms and avoid repeatedly evaluating the same opportunity. Classifying roles against my resumes and target titles also helped surface which opportunities were worth a closer look.

Tracking status inside the system created a clear view of current progress and automatically generated the documentation needed for weekly unemployment reporting.

More importantly, the workflow changed how opportunities were evaluated. Instead of scanning large numbers of listings and applying broadly, I could focus on roles that were active, aligned with my experience, and worth pursuing.
