core is my personal repository. Most of the code is related to providing data for a quantified self dashboard on Klipfolio. Data is ETL'd and sent to a PostgreSQL database hosted on Google Cloud SQL. Hibernate is used as the ORM and schema generator. Everything is scheduled with Quartz.
- Anki local SQLite database
- Calibre local SQLite database
- Fitbit API
- Goodreads API
- Google Analytics API
- Google Fit API
- Google Sheets API
- Habitica API
- HERE API
- Human API
- Indie Hackers scraping
- Kiva API
- Last.fm API
- LeetCode scraping
- LIFX API
- RescueTime API
- RottenTomatoes scraping
- Toodledo API & scraping
- Trello API
- WakaTime API
- Wikipedia: Wikimedia API, DBpedia scraping, MediaWiki API
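The overall flow — many per-source ETLs run on a fixed schedule — can be sketched roughly as below. The project schedules with Quartz, but the same pattern is illustrated here with only the JDK's `ScheduledExecutorService`; the `Etl` interface and `ExampleEtl` class are hypothetical names, not taken from this repository.

```java
import java.util.List;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class EtlRunner {

  // Each data source (Fitbit, Last.fm, ...) would implement a small ETL contract.
  interface Etl {
    void run(); // extract from the API/scrape, transform, load into the database
  }

  // Hypothetical stand-in for a real source such as the Fitbit API.
  static class ExampleEtl implements Etl {
    static final AtomicInteger RUNS = new AtomicInteger();

    @Override
    public void run() {
      RUNS.incrementAndGet(); // a real ETL would fetch, transform, and load here
    }
  }

  public static void main(String[] args) throws InterruptedException {
    List<Etl> etls = List.of(new ExampleEtl());
    CountDownLatch firstRun = new CountDownLatch(1);

    // The project uses Quartz; the JDK scheduler below just illustrates the
    // "run every ETL on a fixed schedule" pattern.
    ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
    scheduler.scheduleAtFixedRate(() -> {
      etls.forEach(Etl::run);
      firstRun.countDown();
    }, 0, 1, TimeUnit.DAYS);

    firstRun.await(5, TimeUnit.SECONDS); // wait for the initial run, then stop (demo only)
    scheduler.shutdown();
    scheduler.awaitTermination(1, TimeUnit.SECONDS);
  }
}
```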
- Install the gcloud SDK.
- Run `gcloud init` and enter your credentials in the browser.
- When prompted, select the project `z1lc-qs/arctic-rite-143002`.
- Run
- Set the environment variable `GOOGLE_APPLICATION_CREDENTIALS` to point to `z1lc-qs.json`. More info here.
- Install Anki, ideally a version ≥2.1.
- Log into Anki and sync.
- Install the AnkiConnect add-on.
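With the AnkiConnect add-on installed, a running Anki instance exposes a local HTTP endpoint (port 8765 by default) that accepts JSON-encoded actions. A minimal sketch of querying it — an illustration of the AnkiConnect protocol, not code from this repository:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class AnkiConnectExample {

  // AnkiConnect's default local endpoint.
  static final String ENDPOINT = "http://127.0.0.1:8765";

  // Build the minimal JSON payload for a parameterless AnkiConnect action.
  static String payload(String action) {
    return String.format("{\"action\": \"%s\", \"version\": 6}", action);
  }

  public static void main(String[] args) {
    try {
      // Ask Anki for the list of deck names.
      HttpClient client = HttpClient.newHttpClient();
      HttpRequest request = HttpRequest.newBuilder()
          .uri(URI.create(ENDPOINT))
          .POST(HttpRequest.BodyPublishers.ofString(payload("deckNames")))
          .build();
      HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
      System.out.println(response.body());
    } catch (Exception e) {
      // Anki (with AnkiConnect) must be running locally for the request to succeed.
      System.out.println("AnkiConnect not reachable: " + e.getMessage());
    }
  }
}
```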
- To avoid having passwords and API keys stored alongside code in Git, this project uses a file called `secrets.json`, which provides secrets to the application at runtime. Ensure you've provided a valid mapping for each `com.robertsanek.util.SecretType` within `secrets.json`, and that the file is located in the root directory. You can find out where this directory is for your platform by calling `com.robertsanek.util.platform.CrossPlatformUtils::getRootPathIncludingTrailingSlash`. You can refer to the `secrets.template.json` file for an example of what the real `secrets.json` should look like.
- If you plan on running the `ETL` command, ensure you've run the `ETL_SETUP` command once beforehand.
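The real key names come from the `com.robertsanek.util.SecretType` enum and the authoritative format lives in `secrets.template.json`; the fragment below only shows the general shape such a file might take, with entirely hypothetical keys and placeholder values:

```json
{
  "FITBIT_API_KEY": "xxxxxxxx",
  "LASTFM_API_KEY": "xxxxxxxx"
}
```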
Pass a command-line argument to select one of the below (documented in `Main.java`).
- `ETL` will run all ETLs and then run `DQ`.
- `DQ` will run data quality checks.
- `HABITICA` will generate an HTML document with a summary of Habitica dailies.
- `PASSIVE_KIVA` will generate an HTML document with short-duration Kiva loans from highly-rated field partners.
- `WIKI` will extract basic information about popular Wikipedia articles that refer to people, outputting a CSV file and images to import into Anki.
- `ETL_SETUP` needs to be triggered before ETLs are run. Idempotent (no downside to re-running).
- `DAEMON` will run some combination of the above commands on a specified schedule. See `Main.java` for the exact scheduling.
Example: `java -jar target/core-1.0-SNAPSHOT.jar -command etl_setup -type manual`