jobvite

An ETL job for extracting candidate and job data from Jobvite's API to Google Cloud Storage.
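As a rough sketch of the pipeline's shape (not the repository's actual module layout), the example below pulls one page of candidates from Jobvite's v2 candidate endpoint and writes the raw records as newline-delimited JSON to the configured Cloud Storage bucket. The endpoint URL, its query parameters, and the response structure are assumptions about Jobvite's API; the environment variable names come from the .env file documented below.

import json
import os

import requests
from google.cloud import storage

# Hypothetical endpoint: Jobvite's v2 API is assumed to authenticate with an
# API key ("api") and secret ("sc") passed as query parameters.
JOBVITE_CANDIDATE_URL = "https://api.jobvite.com/api/v2/candidate"


def extract_candidates(start=0, count=100):
    """Fetch one page of candidate records from the Jobvite API."""
    response = requests.get(
        JOBVITE_CANDIDATE_URL,
        params={
            "api": os.environ["JOBVITE_API_KEY"],
            "sc": os.environ["JOBVITE_API_SECRET"],
            "start": start,
            "count": count,
        },
        timeout=60,
    )
    response.raise_for_status()
    # Assumed response shape: {"candidates": [...]}
    return response.json().get("candidates", [])


def load_to_gcs(records, blob_name):
    """Write records as newline-delimited JSON to the configured GCS bucket."""
    client = storage.Client(project=os.environ["GBQ_PROJECT"])
    bucket = client.bucket(os.environ["BUCKET"])
    blob = bucket.blob(f"{os.environ['CANDIDATES_CLOUD_FOLDER']}/{blob_name}")
    blob.upload_from_string("\n".join(json.dumps(r) for r in records))


if __name__ == "__main__":
    load_to_gcs(extract_candidates(), "candidates.json")

The same pattern presumably applies to job data, written under JOBS_CLOUD_FOLDER instead of CANDIDATES_CLOUD_FOLDER.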

Dependencies:

  • Python with Pipenv
  • Docker

Getting Started

Setup Environment

  1. Clone this repo
$ git clone https://github.com/kippnorcal/jobvite.git
  2. Install Pipenv and project dependencies
$ pip install pipenv
$ pipenv install
  3. Install Docker
  4. Build Docker Image
$ docker build -t jobvite .
  5. Create .env file with project secrets (a sketch of reading these values follows the variable list below)
JOBVITE_API_KEY=
JOBVITE_API_SECRET=

# Mailgun & email notification variables
MG_DOMAIN=
MG_API_URL=
MG_API_KEY=
FROM_ADDRESS=
TO_ADDRESS=

# Google Cloud Storage Settings
CANDIDATES_CLOUD_FOLDER=
JOBS_CLOUD_FOLDER=
CANDIDATE_TABLE=

# Google Cloud Credentials
GOOGLE_APPLICATION_CREDENTIALS=
GBQ_PROJECT=
BUCKET=

# dbt Credentials
DBT_ACCOUNT_ID=
DBT_JOB_ID=
DBT_BASE_URL=
DBT_PERSONAL_ACCESS_TOKEN=
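
For running the job outside Docker (for example, via pipenv run), the settings above could be loaded with python-dotenv. A minimal sketch, assuming these exact variable names (the repository may read them differently):

import os

from dotenv import load_dotenv

# Read .env from the current working directory into the process environment.
load_dotenv()

api_key = os.environ["JOBVITE_API_KEY"]      # required; raises KeyError if missing
bucket = os.getenv("BUCKET")                 # optional lookup; returns None if unset
folder = os.getenv("CANDIDATES_CLOUD_FOLDER", "jobvite/candidates")  # hypothetical fallback

print(f"Candidate extracts will land in gs://{bucket}/{folder}")

When running the Docker image, the same file can be passed to the container with docker run's --env-file .env flag if the variables are not already baked into the image.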

Running the Job

Regular run for new candidates. This queries the Jobvite API for records updated since the most recent timestamp in the data warehouse.

$ docker run -t jobvite 

Optionally, you can run it with start/end date arguments to pull candidates from a specific period of time.

$ docker run -t jobvite --start-date='2025-07-01' --end-date='2025-07-31'
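
A minimal sketch of how the date window might be resolved, assuming argparse-style flags matching the ones above and a hypothetical helper that looks up the latest candidate timestamp in the warehouse (the real job's query and defaults may differ):

import argparse
from datetime import datetime, timedelta


def parse_args():
    parser = argparse.ArgumentParser(description="Jobvite candidate extract")
    parser.add_argument("--start-date", help="YYYY-MM-DD; defaults to the last warehouse timestamp")
    parser.add_argument("--end-date", help="YYYY-MM-DD; defaults to today")
    return parser.parse_args()


def latest_warehouse_timestamp():
    """Hypothetical stand-in for a query against the candidate table in BigQuery."""
    return datetime.utcnow() - timedelta(days=7)


def resolve_window(args):
    start = (
        datetime.strptime(args.start_date, "%Y-%m-%d")
        if args.start_date
        else latest_warehouse_timestamp()
    )
    end = (
        datetime.strptime(args.end_date, "%Y-%m-%d")
        if args.end_date
        else datetime.utcnow()
    )
    return start, end


if __name__ == "__main__":
    start, end = resolve_window(parse_args())
    print(f"Pulling candidates updated between {start:%Y-%m-%d} and {end:%Y-%m-%d}")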

Maintenance

  • No annual maintenance is required.
  • This connector should NOT be turned off during the summer.
