- Execution time is linear in the number of posts on a cold start and logarithmic on a hot start.
- Memory usage is constant: it does not grow as the number of posts increases.
- Memory usage on a cold start can be tuned via the URL chunk size (the number of links requested in parallel, currently 5).
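The chunking mentioned above can be sketched with plain `array_chunk`; the URL pattern and count here are illustrative, only the chunk size (5) comes from the text:

```php
<?php
// Sketch: splitting post URLs into chunks for parallel requests.
// The URL pattern and the total of 12 URLs are assumptions for illustration.
$urls = array_map(
    fn (int $page): string => "https://api.example.com/posts?page={$page}",
    range(1, 12)
);
$chunks = array_chunk($urls, 5); // 12 urls -> chunks of 5, 5 and 2
```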
- NO use of existing frameworks (a NanoFramework was created just for this task)
- NO use of external modules/libraries
- Docker container for PHP 8 & Composer
- GitHub Actions CI/CD
- Composer & PSR-4 autoload
- cURL multi execution: URLs are queried in parallel
- Custom exceptions
- PSR-style logs/cache
- SPL SplFixedArray
- DB: a file/table read row by row via a generator (yield)
- Colored console output
- Psalm static analysis tool
- PHPUnit + Pest tests
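The "cURL multi execution" item can be sketched as a small parallel fetcher; function and variable names are illustrative, the project wraps this in its own HTTP client:

```php
<?php
// Sketch: fetch a list of URLs in parallel with curl_multi.
// Returns an array of response bodies keyed like the input array.
function fetchParallel(array $urls): array
{
    $multi = curl_multi_init();
    $handles = [];
    foreach ($urls as $key => $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_multi_add_handle($multi, $ch);
        $handles[$key] = $ch;
    }

    // Run all transfers until every handle has finished.
    do {
        $status = curl_multi_exec($multi, $active);
        if ($active) {
            curl_multi_select($multi); // wait for activity instead of busy-looping
        }
    } while ($active && $status === CURLM_OK);

    $results = [];
    foreach ($handles as $key => $ch) {
        $results[$key] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($multi, $ch);
        curl_close($ch);
    }
    curl_multi_close($multi);

    return $results;
}
```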
- Get a token for the user and store it in the user cache (for 1 hour)
- Generate URL chunks (5 links each)
- Send each chunk to the HTTP client
- For each chunk, fetch the results of its links in parallel
- Append each chunk's results to the DB (cached for 1 minute, since posts may change)
- Read the DB row by row (post by post)
- Send each post to the Statistic calculator, which aggregates the data
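The read/aggregate steps above can be sketched with a generator, which is what keeps memory constant: only one row is held at a time. The file format (one JSON-encoded post per line), the class name, and the `message` field are assumptions, not the project's actual API:

```php
<?php
// Sketch: yield posts one at a time from a file-based DB (one JSON object per line).
function readPosts(string $path): \Generator
{
    $fh = fopen($path, 'rb');
    try {
        while (($line = fgets($fh)) !== false) {
            yield json_decode($line, true); // one post per row
        }
    } finally {
        fclose($fh);
    }
}

// Sketch: an aggregator fed one post at a time.
final class StatisticCalculator
{
    private int $count = 0;
    private int $totalLength = 0;

    public function add(array $post): void
    {
        $this->count++;
        $this->totalLength += strlen($post['message'] ?? '');
    }

    public function averageMessageLength(): float
    {
        return $this->count > 0 ? $this->totalLength / $this->count : 0.0;
    }
}
```

Usage is a plain loop, e.g. `foreach (readPosts('db.jsonl') as $post) { $stats->add($post); }`, so peak memory is independent of the number of posts.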
$ git clone https://github.com/GeorgII-web/supermetrics
$ cp config.example.php config.php
$ composer install
Edit config.php and adjust the values.
$ php command statistics
$ php command help
$ php command cache clear
$ docker-compose up -d --build
$ docker exec -it supermetrics_php bash
$ cp config.example.php config.php
$ composer install
Edit config.php and adjust the values.
$ php command statistics
$ php command help
$ php command cache clear
$ php ./vendor/bin/psalm
$ php ./vendor/bin/pest