This is a Node.js script I wrote to clean up my messy job application inbox. It scans my Gmail, uses a local AI (Ollama) to figure out what each email is about (rejection, interview, etc.), and saves everything into a clean SQLite database so I can actually analyze my job search.
Simple: I was applying to a lot of jobs and my inbox was a complete disaster. I wanted to answer basic questions but couldn't:
- How many rejections have I gotten? Bruh.
- Which companies are actually getting back to me for interviews?
- Am I getting more responses from LinkedIn or Indeed?
- What's my interview-to-application ratio?
Doing this manually was boring. This script automates it.
It's pretty straightforward:
- Connects to Gmail: It uses the official (and secure) Google OAuth method, so it only gets read-only access. It never sees your password.
- Finds Emails: It searches your Gmail for emails matching a query you set in the config (e.g., `(apply OR application OR interview)`). It processes them in batches so it doesn't crash.
- Asks the AI: It sends the important parts of each email (sender, subject, body) to a local LLM running on your machine (like Ollama). I gave it a detailed prompt to pull out structured info like the category (`rejection`, `interview`), company name, job title, etc., and return it as JSON.
- Saves the Results: It stores this clean, structured data in a local SQLite database file (`job_emails.db`). It's smart enough not to process the same email twice.
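The "Asks the AI" step boils down to one POST to Ollama. Here's a minimal sketch of the request it builds — the real prompt in the script is more detailed, and the JSON key names (`category`, `company`, `job_title`) are illustrative, not necessarily the script's exact schema:

```javascript
// Build a request body for Ollama's /api/generate endpoint, asking the
// model to answer with structured JSON instead of free text.
function buildAnalysisRequest(email, model = "llama3") {
  const prompt = [
    "You are classifying a job-search email. Respond with JSON only,",
    'using keys: category ("rejection", "interview", "offer", "other"),',
    "company, job_title.",
    `From: ${email.from}`,
    `Subject: ${email.subject}`,
    `Body: ${email.body}`,
  ].join("\n");

  return {
    model,
    prompt,
    stream: false,  // one complete response instead of a token stream
    format: "json", // Ollama constrains the output to valid JSON
  };
}

// Usage (against a running Ollama server):
// const res = await fetch("http://localhost:11434/api/generate", {
//   method: "POST",
//   body: JSON.stringify(buildAnalysisRequest(email)),
// });
// const parsed = JSON.parse((await res.json()).response);
```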
```mermaid
flowchart TD
    subgraph "On Your Computer"
        direction LR
        App("Node.js App")
        Ollama("Ollama (LLM)")
        DB("SQLite Database")
    end
    User("You") -- "Runs & Configures" --> App
    subgraph "Email Fetching"
        App -- "1. Request Emails" --> Gmail("Gmail API")
        Gmail -- "2. Returns Emails" --> App
        App -- "3. Feed Emails" --> Ollama
        Ollama -- "4. Returns Analysis" --> App
        App -- "5. Store Results" --> DB
    end
    User -- "Analyzes final data" --> DB
```
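The "don't process the same email twice" trick is mostly SQLite doing the work: use the Gmail message id as the primary key and `INSERT OR IGNORE` on reruns. A sketch of the idea — the column names here are my assumptions, not the script's exact schema:

```javascript
// Illustrative schema: the Gmail message id is the primary key, so
// re-inserting an already-stored email is a silent no-op.
const CREATE_TABLE_SQL = `
  CREATE TABLE IF NOT EXISTS job_emails (
    gmail_id    TEXT PRIMARY KEY,
    company     TEXT,
    job_title   TEXT,
    category    TEXT,
    received_at TEXT
  )`;

const INSERT_SQL = `
  INSERT OR IGNORE INTO job_emails
    (gmail_id, company, job_title, category, received_at)
  VALUES (?, ?, ?, ?, ?)`;

// Given the ids Gmail returned and the ids already stored, keep only
// the ones that still need an (expensive) LLM call.
function unseenIds(fetchedIds, storedIds) {
  const seen = new Set(storedIds);
  return fetchedIds.filter((id) => !seen.has(id));
}
```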
- Node.js (v18+)
- A Google Account
- A local AI server running, like Ollama. You'll need to have a model pulled (e.g., `ollama pull llama3`).
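If you want to sanity-check the Ollama prerequisite before running anything, its `/api/tags` endpoint lists the models you've pulled locally. A small sketch (the `hasModel` helper is mine, not part of the script):

```javascript
// Check whether a given model appears in Ollama's /api/tags response.
// Ollama names pulled models with a tag suffix, e.g. "llama3:latest",
// so match either the exact name or the name-prefix.
function hasModel(tagsResponse, model) {
  return tagsResponse.models.some(
    (m) => m.name === model || m.name.startsWith(model + ":")
  );
}

// Usage:
// const tags = await (await fetch("http://localhost:11434/api/tags")).json();
// if (!hasModel(tags, "llama3")) throw new Error("run: ollama pull llama3");
```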
First, get the code and install the dependencies.
```bash
git clone https://github.com/your-username/your-repo-name.git
cd your-repo-name
npm install
```

This part is a bit of a pain, but you only have to do it once.
- Go to the Google Cloud Console and create a new project.
- Enable the Gmail API for that project.
- Go to APIs & Services -> OAuth consent screen.
- Select External and create it.
- Fill out the app name (e.g., "My Job Search Analyzer") and your email.
- In the "Scopes" step, add the `.../auth/gmail.readonly` scope.
- In the "Test users" step, add your own Gmail address.
- Go to APIs & Services -> Credentials.
- Click + Create Credentials -> OAuth client ID.
- Choose Desktop app for the application type.
- After you create it, Google will give you a Client ID and a Client Secret. Copy these.
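For the curious: the consent URL that the token script opens in a later step is built from exactly these two credentials, using Google's standard OAuth 2.0 parameters. A hedged sketch — the `redirect_uri` and `prompt` values are assumptions about how the script is configured, not confirmed from its source:

```javascript
// Build the Google OAuth 2.0 consent URL for a Desktop-app client.
function buildAuthUrl(clientId, redirectUri = "http://localhost") {
  const params = new URLSearchParams({
    client_id: clientId,
    redirect_uri: redirectUri,
    response_type: "code",
    scope: "https://www.googleapis.com/auth/gmail.readonly",
    access_type: "offline", // ask for a refresh token so re-auth isn't needed
    prompt: "consent",
  });
  return `https://accounts.google.com/o/oauth2/v2/auth?${params}`;
}
```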
Copy the example environment file.
```bash
cp .env.example .env
```

Now open the new `.env` file and fill it out:
```bash
# Paste the credentials you got from Google
GOOGLE_CLIENT_ID="YOUR_CLIENT_ID.apps.googleusercontent.com"
GOOGLE_CLIENT_SECRET="YOUR_CLIENT_SECRET"

# Leave this empty for now, the next step will generate it
GOOGLE_TOKENS=''

# The Gmail search query to find job-related emails
GMAIL_SEARCH_QUERY="subject:(application OR apply OR interview OR career)"

# Your local Ollama server settings
LLM_BASE_URL="http://localhost:11434"
LLM_MODEL="llama3" # Or whatever model you downloaded
```

Now you need to generate a token so the script can log in on your behalf.
- Make sure your local Ollama server is running.
- Run the token script: `node src/get-token.js`
- It will print a URL. Copy it and paste it into your browser.
- Log in with your Google account and approve the permissions.
- It will redirect you to a blank page or an error page (that's normal). Copy the full URL from your browser's address bar.
- Paste the URL back into your terminal.
- The script will print out the `GOOGLE_TOKENS` line. Copy this entire line and paste it into your `.env` file, replacing the empty one.
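That copy-the-URL step is less weird than it sounds: the URL you paste carries the authorization code as a `?code=` query parameter, which is all the script needs to pull out before exchanging it for tokens. Roughly:

```javascript
// Extract the OAuth authorization code from the redirect URL the user
// pasted back into the terminal.
function extractAuthCode(redirectUrl) {
  const code = new URL(redirectUrl).searchParams.get("code");
  if (!code) throw new Error("No ?code= parameter found in that URL");
  return code;
}
```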
That's it for setup. Now just run the main script.
```bash
npm start
```

It will show you its progress as it goes through the batches of emails.
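The batching follows how Gmail's `messages.list` endpoint is paginated: each page of results can include a `nextPageToken`, and the script keeps fetching until there isn't one. A sketch — the URL shape is the real Gmail REST API, but the batch size and helper names are illustrative:

```javascript
// Build a Gmail messages.list URL for one page of search results.
function buildListUrl(query, pageToken, maxResults = 50) {
  const params = new URLSearchParams({ q: query, maxResults: String(maxResults) });
  if (pageToken) params.set("pageToken", pageToken);
  return `https://gmail.googleapis.com/gmail/v1/users/me/messages?${params}`;
}

// Usage: loop until the response has no nextPageToken.
// let token;
// do {
//   const res = await fetch(buildListUrl(QUERY, token), { headers: authHeaders });
//   const page = await res.json();
//   await processBatch(page.messages ?? []);
//   token = page.nextPageToken;
// } while (token);
```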
After the script finishes, you'll have a `job_emails.db` file in the project folder. You can open this with a tool like DB Browser for SQLite to look at the data and run queries.
For example, you can easily find out how many rejections you have from each company:
```sql
SELECT company, COUNT(*) AS rejection_count
FROM job_emails
WHERE category = 'rejection' AND company IS NOT NULL
GROUP BY company
ORDER BY rejection_count DESC;
```