Automate Your Job Search Using AI (Indeed, LinkedIn, ZipRecruiter)

How to Automate Your Job Application Process

Introduction to Job Application Automation

  • The video addresses individuals who dislike the job application process, offering a solution to automate and streamline it.
  • A free workflow will be provided that automatically pulls job listings from platforms like LinkedIn and ZipRecruiter daily.
  • Jobs will be scored on a scale of 1 to 10 based on user interest, allowing applicants to focus only on relevant opportunities.

Workflow Overview

  • The automated system generates cover letters using AI and compiles job listings into a Google spreadsheet for easy access.
  • Viewers are encouraged to follow along in building this workflow step-by-step, regardless of their prior experience with n8n (the automation tool).

Setting Up the Automation

  • Users need to create an n8n account and start by generating a new personal workflow.
  • The first step involves setting up a trigger that schedules job pulls at specific intervals, ideally every day.

Scheduling Job Pulls

  • Triggers initiate actions; in this case, jobs will be pulled daily at 8 AM for immediate review during morning routines.
  • The next phase involves scraping jobs from sites like Indeed, which serves as a model for scraping other job portals.
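In n8n, the Schedule Trigger can also be driven by a cron expression; a daily 8 AM run in standard five-field cron syntax looks like this (a sketch of the setting, not copied from the video):

```
0 8 * * *
```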

Scraping Jobs Using Apify

  • Apify.com is introduced as the tool for scraping jobs from multiple websites, including LinkedIn and Upwork.
  • Users can sign up for a free account with $5 in credits monthly, allowing them to scrape approximately 1,000 listings at no cost.

Navigating the Apify Store

  • After signing in, users should browse the Apify Store to find web scrapers (referred to as "actors") specifically designed for different job sites.

Web Scraping with Indeed Job Scraper

Choosing a Web Scraper

  • The speaker surveys the available web scrapers and chooses the Indeed Job Scraper (PPR) for this tutorial, noting that viewers watching later may find different alternatives, since these tools change over time.

Setting Up the Scraper

  • The initial step involves saving the scraper's configuration so it can be accessed later from n8n (the workflow tool). The free tier's $5 in monthly credits covers roughly 1,000 scraped jobs.
  • The speaker explains that n8n simplifies job scraping by automating steps that would otherwise be manual and time-consuming, such as data manipulation and exporting to Google Sheets.
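Outside of n8n, the same "start a scrape" step can be done against Apify's REST API directly. This is a minimal sketch: the actor ID, token, and input field names below are placeholder assumptions (check the actor's README on Apify for its real input schema); the video wires this up through n8n's Apify node instead.

```python
# Sketch of starting an Apify actor run over its REST API.
# ACTOR_ID, token, and input field names are assumptions -- the video
# uses n8n's Apify node rather than raw HTTP.
import json
import urllib.request  # used by the commented-out live call below

APIFY_BASE = "https://api.apify.com/v2"

def build_run_url(actor_id: str, token: str) -> str:
    """URL that starts a run of the given actor."""
    return f"{APIFY_BASE}/acts/{actor_id}/runs?token={token}"

def build_input(position: str, country: str, max_items: int) -> dict:
    """Input payload for a hypothetical Indeed-scraper actor."""
    return {"position": position, "country": country, "maxItems": max_items}

# Live call (needs a real actor ID and token):
# req = urllib.request.Request(
#     build_run_url("username~indeed-scraper", "YOUR_APIFY_TOKEN"),
#     data=json.dumps(build_input("SEO", "US", 10)).encode(),
#     headers={"Content-Type": "application/json"},
# )
# run = json.load(urllib.request.urlopen(req))["data"]  # includes the run ID
```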

Installing Necessary Tools

  • To begin using Apify within n8n, users must first install its node, since it is built by a third party. This installation unlocks the additional functionality.
  • Users then create credentials to connect their Apify account to n8n. This connection enables programmatic access to the scraped data.

Configuring Job Search Parameters

  • After saving the job scraper setup, users can specify search parameters like job type (e.g., SEO), location (e.g., US), and limit results (e.g., maximum of 10 jobs).
  • The process requires switching from manual input to JSON format for configuration. Users copy existing JSON text while making minor adjustments before pasting it into the designated area.
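The JSON the speaker pastes in looks roughly like this (the field names are assumptions based on typical Indeed-scraper actors; copy the exact keys from the actor's manual form):

```json
{
  "position": "SEO",
  "country": "US",
  "maxItems": 10
}
```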

Executing the Web Scrape

  • A wait time of 60 seconds is set before moving on in the workflow. This ensures all results are retrieved before proceeding with further actions like scoring or exporting jobs.
  • The output after scraping includes irrelevant data; however, users focus on extracting specific job listings rather than extraneous information returned by the scraper.

Retrieving Data Sets

  • To obtain relevant job listings post-scraping, an additional node is added in n8n where users enter the dataset ID linked to their scraped results.
  • Users can pin results during testing runs to save data without waiting each time for new scrapes. This feature significantly reduces testing time when running multiple iterations of the scraper.
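The "retrieve dataset" step corresponds to Apify's dataset-items endpoint. A sketch of that call, plus the trimming of extraneous scraper output mentioned above; the listing field names (`positionName`, `url`, …) are assumptions and should be checked against a real run:

```python
# Sketch of pulling scraped items from an Apify dataset and keeping
# only the job fields we care about. Field names are assumptions.
import json
import urllib.request  # used by the commented-out live call below

def dataset_items_url(dataset_id: str, token: str) -> str:
    """URL that returns a dataset's items as JSON."""
    return f"https://api.apify.com/v2/datasets/{dataset_id}/items?token={token}"

def keep_relevant(items: list[dict]) -> list[dict]:
    """Strip the scraper's extra metadata down to the fields we track."""
    wanted = ("positionName", "company", "location", "salary", "url")
    return [{k: it.get(k) for k in wanted} for it in items]

# Live call (needs a real dataset ID and token):
# raw = json.load(urllib.request.urlopen(dataset_items_url("DATASET_ID", "YOUR_TOKEN")))
# jobs = keep_relevant(raw)
```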

Job Application Process Using AI

Overview of Job Scraping and Analysis

  • The speaker discusses using sample data instead of a web scraper to pull job listings, specifically highlighting an SEO specialist position that is full-time and remote.
  • The details of the job listing are examined, emphasizing its relevance for candidates seeking SEO roles.

Implementing a Limiter in Job Processing

  • A limiter is introduced to restrict processing to one job at a time during testing, despite having scraped ten results. This approach aims to save time and resources.
  • The rationale behind limiting the number of jobs processed simultaneously is explained: it helps manage costs and speeds up the testing phase.
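n8n's Limit node simply truncates the incoming item list; the same idea in a couple of lines (a sketch of the behaviour, not the node's actual code):

```python
def limit(items: list, max_items: int = 1) -> list:
    """Mimic a Limit step: pass through at most max_items items."""
    return items[:max_items]

jobs = [{"title": "SEO Specialist"}, {"title": "SEO Manager"}, {"title": "Content Lead"}]
print(limit(jobs))  # during testing only the first job flows downstream
```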

Utilizing OpenAI for Job Scoring

  • The speaker transitions to using OpenAI's API for scoring job listings and writing cover letters, noting that an API key is required for this process.
  • Instructions on obtaining an API key from OpenAI are provided, including navigating through settings to generate a new key.

Structuring the AI Interaction

  • The speaker prepares to use pre-existing integration with OpenAI, aiming to streamline the process by copying necessary code snippets rather than retyping them.
  • Emphasis is placed on scoring jobs based on their suitability for candidates by evaluating various factors such as salary and remote work options.

Breakdown of Scoring Criteria

  • A detailed explanation of how jobs will be scored on a scale from 1 to 10 is given. Factors include whether the job is remote or full-time and its alignment with the candidate's skills.
  • Specific criteria are outlined: points awarded for being remote (1 point), full-time (1 point), in the SEO field (5 points), among others.

User Input and Resume Integration

  • The user message component involves inputting relevant job information like salary and benefits alongside a sample resume for AI evaluation.
  • Clarification on how AI will assess both the resume and job description ensures accurate scoring based on compatibility between them.
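Putting the scoring criteria and the user message together, the OpenAI call might be sketched as follows. The rubric wording and JSON keys are paraphrased from the video, not the blueprint's exact prompts, and the model name follows the video's mention of GPT-5 mini; the live call requires the `openai` package and an API key.

```python
# Sketch of the job-scoring call. Prompt wording, JSON keys, and model
# name are assumptions paraphrased from the video, not the exact blueprint.
SYSTEM_PROMPT = """You score job listings from 1-10 for this candidate.
Award points for: remote (+1), full-time (+1), SEO field (+5),
plus fit between the resume and the job description.
Reply as JSON: {"score": <int>, "reason": "<one sentence>"}"""

def build_messages(job: dict, resume: str) -> list[dict]:
    """Assemble the system rubric plus a user message carrying the
    job details and the candidate's resume."""
    user = f"Job listing:\n{job}\n\nResume:\n{resume}"
    return [
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": user},
    ]

# Live call (needs the openai package and OPENAI_API_KEY):
# from openai import OpenAI
# client = OpenAI()
# resp = client.chat.completions.create(
#     model="gpt-5-mini",
#     messages=build_messages(job, resume),
#     response_format={"type": "json_object"},
# )
```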

Job Fit and Application Process

Understanding Job Fit

  • The speaker emphasizes the importance of determining if both the candidate and the job are a good fit for each other, focusing on high-value jobs that align with personal interests rather than low-value ones.

Structuring Output with JSON

  • The speaker introduces JSON as a data formatting method, likening it to how Google Sheets organizes data. This format is used to score job applications.
  • It is suggested that users can request JSON data from ChatGPT without needing to memorize its structure, making it accessible for those unfamiliar with coding.
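A structured response that separates the score from its reason might look like this (keys are illustrative):

```json
{
  "score": 8,
  "reason": "Remote, full-time SEO role matching the resume's core skills."
}
```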

Scoring Jobs and Creating Cover Letters

  • The speaker mentions using the GPT-5 mini model for scoring jobs due to its cost-effectiveness and speed.
  • There’s a need to change output formats from merged text into structured JSON, separating scores from their corresponding reasons.

Generating Customized Cover Letters

  • Instructions are provided for creating customized cover letters based on job details and candidates' resumes.
  • The assistant message specifies how the cover letter should be formatted in the output, ensuring clarity in presentation.

Automating Job Applications

  • A specific job application example is discussed (SEO specialist role), highlighting challenges in reading dry job descriptions.
  • After generating a cover letter, there’s an emphasis on inserting this information into Google Sheets for organized tracking of multiple applications across various platforms like LinkedIn and ZipRecruiter.

Job Data Management Workflow

Unique Job Identification

  • The workflow requires a unique identifier for job entries, with the URL being the most reliable since it remains constant even if other details change.
  • The process involves mapping input data to a central interface, ensuring that all relevant job information is organized systematically.
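Using the URL as the unique key, duplicate listings picked up on later runs can update their existing row instead of creating a new one. A sketch of the idea behind that append-or-update behaviour:

```python
def dedupe_by_url(existing: list[dict], incoming: list[dict]) -> list[dict]:
    """Merge incoming jobs into existing rows, keyed on URL.
    A re-scraped listing updates its row instead of adding a duplicate."""
    rows = {job["url"]: job for job in existing}
    for job in incoming:
        rows[job["url"]] = job  # insert new, overwrite re-scraped
    return list(rows.values())

sheet = [{"url": "https://indeed.com/job/1", "title": "SEO Specialist"}]
new = [
    {"url": "https://indeed.com/job/1", "title": "SEO Specialist", "salary": "$70k"},
    {"url": "https://indeed.com/job/2", "title": "SEO Manager"},
]
merged = dedupe_by_url(sheet, new)
print(len(merged))  # 2 -- the re-scraped job was updated in place
```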

Data Input and Mapping

  • Key fields include job description, title, platform (hardcoded as Indeed), company name (Optimal 7), cover letter generated by AI, salary details, city (Miami), and whether the position is remote.
  • Not all jobs may have salary information available; discrepancies can occur when scraping multiple listings from Indeed.

Job Entry Process

  • Additional data points such as ratings and publication dates are also collected to provide comprehensive insights into each job listing.
  • The system allows users to add new job entries dynamically and check for errors in previously entered data.

Application Workflow

  • Users can quickly apply for jobs directly through the interface after reviewing essential details like date published and rating.
  • Filtering options enable users to exclude lower-rated jobs based on personal criteria, enhancing the efficiency of their job search.
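That filter step boils down to a score threshold, for example keeping only jobs rated 7 or higher (the threshold itself is a personal choice, not a value from the video):

```python
def keep_high_scores(jobs: list[dict], threshold: int = 7) -> list[dict]:
    """Drop listings the AI scored below the threshold."""
    return [job for job in jobs if job["score"] >= threshold]

jobs = [{"title": "SEO Specialist", "score": 9},
        {"title": "Data Entry", "score": 3}]
print(keep_high_scores(jobs))  # only the 9/10 job survives
```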

Expanding Job Search Capabilities

  • Users can duplicate workflows for different job types (e.g., SEO vs. digital marketing), allowing for tailored searches across various platforms.
  • The goal is to consolidate multiple job listings from various sources (LinkedIn, ZipRecruiter, Google Jobs, etc.) into one dashboard for streamlined access and application processes.

Community Engagement

  • Viewers are encouraged to subscribe for more content related to AI and automation skills that are increasingly valuable in today's business landscape.

AI Skills and Opportunities

Overview of Learning Outcomes

  • The speaker offers free tutorials on YouTube, encouraging viewers to explore these resources for foundational learning.
  • Three primary outcomes are highlighted:
      • Learn skills to save time and earn money in personal life.
      • Create an AI automation agency or pursue freelancing opportunities.
      • Receive guidance on finding, closing, and fulfilling deals in the business realm.

Video description

🚨🚨 Bug Fix 🚨🚨 n8n updated their ChatGPT node. Here's the fix:

  1. Inside the ChatGPT node there are 3 message roles: system, user and assistant.
  2. Copy all text from the "assistant" role and paste it into the "system" role.
  3. Delete the assistant message role by hitting the trash can.
  4. At the bottom of the ChatGPT node, click "Add option" then select "Output format".
  5. Change the "type" of the dropdown field to "JSON Object".
  6. Do this for both ChatGPT nodes.

🌍 COMMUNITY
https://www.skool.com/automatable/about

📝 BLUEPRINTS
https://www.skool.com/automatable-free/classroom/6ca29126?md=a7931b87fdf14cea9486bf2fd14b6aad

📚 SUMMARY
In this video, learn how to build a fully automated job application system — from scraping job listings to submitting applications automatically.

📣 SOCIAL MEDIA
  • Instagram → https://instagram.com/jono_catliff
  • TikTok → https://www.tiktok.com/@jonocatliff
  • LinkedIn → https://www.linkedin.com/in/jonocatliff/
  • X → https://twitter.com/@jonocatliff

📺 RELATED VIDEOS
  • Full crash course on Make.com → https://youtu.be/hinLebdX8aM
  • Full crash course on n8n → https://youtu.be/AURnISajubk
  • 11 Favourite Make.com automations → https://youtu.be/dIH1F1WlE84
  • 12 Favourite n8n automations → https://youtu.be/uQGT2K26W84

🎯 1:1 CONSULTING
Book a time → https://jonocatliff.com/consultation

🚀 AUTOMATION AGENCY
Get help with your business → https://www.automatable.co

💼 CAREERS
Work with me → https://jonocatliff.com/careers

🔗 LINKS (some of these make me money - thanks in advance!)
  • n8n → https://jonocatliff.com/n8n
  • Make.com → https://jonocatliff.com/make
  • Go High Level → https://jonocatliff.com/gohighlevel
  • Apify → https://jonocatliff.com/apify
  • Skool → https://jonocatliff.com/skool
  • Zapier → https://jonocatliff.com/zapier
  • PandaDoc → https://jonocatliff.com/pandadoc
  • Apollo → https://jonocatliff.com/apollo
  • ManyChat → https://jonocatliff.com/manychat
  • Vapi → https://jonocatliff.com/vapi
  • PhantomBuster → https://jonocatliff.com/phantombuster
  • ClickUp → https://jonocatliff.com/clickup
  • ElevenLabs → https://jonocatliff.com/elevenlabs
  • Upwork → https://jonocatliff.com/upwork
  • Instantly.ai → https://jonocatliff.com/instantly
  • Airtable → https://jonocatliff.com/airtable

👋 ABOUT ME
Hey everyone, my name is Jono. I run a 7-figure service business that offers DJ, photo and video services (#1 largest in Canada), and I spent years figuring out how to automate every part of it (and hired the roles that I couldn't). Conservatively, I used to work 80+ hours per week, before sunrise till long after sunset; missing gatherings, family events and everything in between. Through automation, though, I was able to replace my job. My goal is to share what worked for me, in the hope of helping others find true success with their passion. Please subscribe, like and comment below if you have any questions! Thank you 😊

⌛ TIMESTAMPS
0:00 Intro
1:39 Build Starts
27:39 Conclusion