
Notes

TODO

Sooner

  • add more indexes to the database to improve performance of common queries.
  • cache data served from /api/data to improve performance/reduce queries; use a goroutine to refresh the cached data periodically.
  • [/] website request statistics tracking/logging
  • add a robots.txt?
  • import older "palmer" data (note logs are in Y-D-M format, not Y-M-D)

Later

Done

  • prevent button mashing in client with temporary disable of load photos button.
  • fall back to a 'default' astro or start/end time if the Sun/Moon server is inaccessible
  • resize image on height, not width (can do either or both. 0='auto' to maintain aspect ratio)
  • resize image only if bigger than height/width
  • simulate 'click' on info tab when hiding other tabs after user selects a camera
  • fix off-by-one error in timelapse counter (thought it was fixed before?)
  • limit the timespan of a request to 2 weeks, or do something else sensible, to prevent huge 58000-scrape requests
  • css styles for "flashing" tabs when they're first unhidden
  • server to use StaticRoot if available, fallback to embedded
  • HTTP strict transport? see: https://cheatsheetseries.owasp.org/cheatsheets/HTTP_Strict_Transport_Security_Cheat_Sheet.html
  • block (404) any HTTP requests not for specific domains (hosts) (and also anything that isn't for root ("/"))
  • "end" scrape param still needs to include that day for all cases.
  • shut down scraper on signal (requires a change to the scheduler to allow in-flight tasks to complete)

Ideas

scraping

  • scraping done via a scheduler 'daemon' (instead of using cron)
    • upon startup, tasks are enqueued for the scheduler
      • tasks: scrape, update ancillary data, enqueue next set of tasks, etc.
      • need function that figures out what tasks to enqueue for each mt/cam
  • scraping is performed as scheduled, using a set of 'rules' that are 'scripted' via Go text templates evaluating to true or false.
  • url to scrape is generated via go text template.
    • allows more 'dynamic' urls, such as ones containing a date/time
    • many urls will still just be static
  • image processing:
    • resize image (save disk space for large images)
    • check if scraped image is identical (or nearly so) to previously scraped (resized) image, and discard if so (to avoid duplicates/"frozen" cams)
  • get sunrise/set data from us navy api

config

  • program config in a file (ie config.json)
    • config system 'watches' file for updates and reloads settings live

binaries

  • scrape daemon
  • frontend server
  • cmd line tool to manipulate (CRUD) mountains and cams

front end

  • frontend changes
    • remove time from selection; dates only, midnight-midnight
    • show mt/cam info before fetching photos
      • add 'statistics' after fetch
      • changes when mt/cam selection changes
    • show 'log' time in mountain's local time

other

  • remove
    • weather? kinda sucks

internal packages

  • astro - gets sun/moon data from navy api
    • various constants for phenomena
  • db - database connection and queries
  • model - data structs
  • scheduler - executes tasks at pre-scheduled times
  • googletz - get tz location id (eg "America/Los_Angeles") for lat/lon
  • log - provides simple logging to systemd via stdout
  • config - suite wide config structure and helper functions for config file watching

Dependencies

  1. github.com/mattn/go-sqlite3 - for sqlite
  2. github.com/disintegration/imaging - for image resizing
  3. github.com/gorilla/mux - easier handling of api routes
  4. http://github.com/sirupsen/logrus - might have to make my own formatter for systemd
  5. github.com/shibukawa/configdir - don't really need if i assume linux (can just use os.Getenv())

Directories

  • binaries and cfg in /opt/mtcam
  • images and db in ~/mtcam

or

  • all files (bin, cfg, db) in /opt/mtcam
  • images in /opt/mtcam/img

or

  • binaries in /opt/mtcam
  • config files in /etc/mtcam
  • database in /var/opt/mtcam
  • images in /var/opt/mtcam/img

Migration

  1. stop old mtcam scraper (on pi)
    1. remove 'idle' scrapes from old db
  2. `$ sqlite3 new.db` - create the new database file
  3. `.read new_table.sql` - create the new tables
  4. `.read migration.sql` - pulls in old.db, sets new/updated fields on old data
  5. `go run cmd/convert_tz/main.go new.db` - converts all times in the db from PST to UTC
  6. move images from pi to nuc (~18GB)

API

Generally unchanged from the Python version

https://<whatever address>/ -> /
    homepage and static root

/img/
    root of image folders

/api/
    root of api. returns nothing.
/api/data/
    GET: returns json dict<id,obj> of mountains containing dict<id,obj> of cams
/api/mountains/<mt_id>/cams/<cam_id>/scrapes[?start=<datetime>&end=<datetime>]
    GET: returns json list of scrape records