Surf is a self-hosted search engine that provides relevant search results based on pages similar to the ones you've already visited. When you visit a new webpage, the Surf web extension sends it to the Surf back-end, which crawls the page's outbound links and ranks them using PageRank.
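The ranking step can be sketched as power-iteration PageRank over a small link graph. This is a minimal illustration, not Surf's actual implementation; the page names and damping factor are placeholders.

```python
def pagerank(links, damping=0.85, iters=50):
    """links: dict mapping page -> list of outbound pages."""
    pages = sorted(set(links) | {p for outs in links.values() for p in outs})
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1.0 - damping) / n for p in pages}
        for page, outs in links.items():
            if not outs:
                # dangling page: spread its rank evenly across all pages
                for p in pages:
                    new[p] += damping * rank[page] / n
            else:
                for out in outs:
                    new[out] += damping * rank[page] / len(outs)
        rank = new
    return rank

# hypothetical crawl of three pages
links = {
    "a.example": ["b.example", "c.example"],
    "b.example": ["c.example"],
    "c.example": ["a.example"],
}
ranks = pagerank(links)
```

Here `c.example` ends up ranked highest, since it receives the most inbound link weight.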
./surf_install_macos.sh
./surf_install_linux.sh
├── ext # Chrome / Firefox extensions for gathering searches
├── crawl # web crawler, to run in background
├── docs # Documentation (WIP)
├── ui # web / app UI (WIP)
├── data # data folder for web crawler output; can be moved elsewhere upon install
├── process_data # python scripts for processing data (WIP)
├── logging # logging for web scraping + data processing (WIP)
└── messaging # internal IPC messaging between components (WIP)
- The crawler_handler script (running in the background) calls the crawler: (data from crawler) -> format_data.py -> matrix for PageRank -> mat.csv; website metadata (later) -> meta.json, with schema.json describing the PageRank schema
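The formatting step above can be sketched as follows: turn crawler output (page -> outbound links) into a square adjacency matrix and write it to mat.csv for the PageRank stage. The input shape and the CSV layout (header row of pages, then one row per page) are assumptions for illustration; only the mat.csv filename comes from the pipeline description.

```python
import csv

# hypothetical crawler output: page -> list of outbound links
crawl_output = {
    "a.example": ["b.example"],
    "b.example": ["a.example", "c.example"],
    "c.example": [],
}

# collect every page seen as a source or a target, in a stable order
pages = sorted(set(crawl_output) | {p for outs in crawl_output.values() for p in outs})
index = {p: i for i, p in enumerate(pages)}

# matrix[i][j] = 1 if page i links to page j
matrix = [[0] * len(pages) for _ in pages]
for page, outs in crawl_output.items():
    for out in outs:
        matrix[index[page]][index[out]] = 1

with open("mat.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(pages)   # header: page order
    writer.writerows(matrix)
```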