envoyproxy / toolshed
License: Apache License 2.0
When this repo was created from the tools in envoy, packages were created according to their existing namespace.
This has led to a proliferation of packages, some of which - especially the core aio packages - can be packaged together, which will make this repo, and dependency management, easier.
When adding async logging it was not possible to add it for the root logger, as the flake8 module that is called in envoy.code.check uses multiprocessing.pool, and the thread-safe queue that is used by default does not work in that situation. It's likely that using a multiprocessing.Queue would fix the problem, at the expense of some perf, so it may be desirable to choose the queue type according to whether debug logging is set.
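A minimal sketch of the queue-switching idea, assuming a QueueHandler/QueueListener pair; the helper name and the debug-based switch are assumptions:

```python
import logging
import logging.handlers
import multiprocessing
import queue

def setup_async_logging(debug: bool = False) -> logging.handlers.QueueListener:
    # queue.Queue is thread-safe but cannot cross process boundaries, so it
    # breaks when log records are emitted from a multiprocessing.pool.
    # multiprocessing.Queue works across processes, at some performance cost
    # - here (hypothetically) selected only when debug logging is on.
    log_queue = multiprocessing.Queue() if debug else queue.Queue()
    logging.getLogger().addHandler(logging.handlers.QueueHandler(log_queue))
    listener = logging.handlers.QueueListener(
        log_queue, logging.StreamHandler())
    listener.start()
    return listener
```

The listener drains the queue on a background thread, so the emitting code never blocks on the output stream.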
When the runners/checkers were first written for envoy, they were non-async, and async functionality was added subsequently.
Most of the existing non-async runners/checkers could benefit from async preloading etc
It will cut code and make things more manageable if we just make all runners/checkers async and remove the non-async code
I have started adding some notes about development on this repo as wiki pages.
Let's add more, and crystallize it into README pages etc. in the repo.
I think, because of the way the data model is structured for dependencies/releases, we only added sha checking for github deps, yet the check should work fine for the other deps also.
Partly because I hadn't realised that call_args have equality with their list representation, and partly because I sometimes dumped the call_args in pdb that way, I got into the habit of wrapping all call_args in list. We should clean that up.
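The equality mentioned above can be seen directly with a mock:

```python
from unittest.mock import MagicMock

m = MagicMock()
m("path", recurse=True)

# call_args is a `call` object - a 2-tuple of (args, kwargs) - so it
# compares equal both to a plain tuple and to its list representation.
assert m.call_args == (("path",), {"recurse": True})
assert list(m.call_args) == [("path",), {"recurse": True}]

# The list() wrapping is therefore redundant; the direct comparison
# (or assert_called_once_with) says the same thing more clearly.
m.assert_called_once_with("path", recurse=True)
```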
Previously it worked such that it ~always exited gracefully after doing any required cleanup. Since merging the base code with runner, it no longer does this correctly.
Currently we manually create releases, and any package that is not marked -dev in its VERSION file gets published. This creates a lot of unwanted commits and boilerplate tasks that are pretty annoying and potentially unnecessary.
One possible solution is to make it such that: on main, if the -dev version has changed for one or more packages... VERSION files and publishes the packages to the version indicated by the previous dev version/s.

Currently it says "No Github access token supplied" but doesn't tell you how to set it.
Most likely this is a bug with mypy-abstracts: I hit a bug where mypy wasn't able to see abstract implementations' *args and **kwargs. It might just be the way mypy works with args/kwargs though, so it requires some investigation.
If you try to run the dependency checker with bad credentials - ie a key that doesn't have permission to access some/all of the required API resources - it can throw an error, which currently appears to be unhandled.
As the repo is a monorepo and there are quite a few packages with differing versions, it doesn't make sense to have semantically versioned releases - I think instead we should start doing date.{i} for each release on a given date.
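A sketch of what a date.{i} scheme could look like; the tag format and helper name are assumptions:

```python
import datetime
from typing import List, Optional

def next_release_tag(
        existing: List[str],
        today: Optional[datetime.date] = None) -> str:
    # Hypothetical date.{i} scheme: the first release on a date gets
    # index 0, and the index increments for further same-day releases.
    today = today or datetime.date.today()
    prefix = today.strftime("%Y.%m.%d")
    count = sum(1 for tag in existing if tag.startswith(f"{prefix}."))
    return f"{prefix}.{count}"
```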
Once envoy.code.check lands, the extension metadata tool will be the only thing left in Envoy that requires the envoy_py_binary macro and related pytest/coverage/code. We can shift the metadata checker pretty trivially, so we probably want to do that quite quickly just for the benefit of removing so much code. There is currently also another tool that is billed as "dependency validation", but mostly it deals with extension dependencies; I'm thinking we can add that to the extension.check also.
Currently, in Envoy CI, when a failing CVE is found for a dep the job errors until the issue is resolved by the dep being updated, or the CVE being excluded.
Often this is not noticed immediately.
If we instead just warned in CI that there is a failing CVE, and - as we do with "Newer release" tickets - created a ticket that a "Dependency CVE issue" had been found, we would be more likely to notice, we would have something to close through resolution, and the checker wouldn't error unless something unexpected happened.
We can fairly easily repurpose the release-issues check to achieve this, although it will take a little refactoring to handle multiple issue trackers/issue types.
Some checkers provide fix functionality already.
It would be quite helpful if we had a way of easily marking fix utils as being associated with warnings/failures in specific checks.
Recently we added a @preload decorator that serves a similar purpose, so we can probably copy/repurpose that to do the same.
Currently envoy often gets out of sync with the toolchains and images from build-tools. It's also not always clear to people updating build-tools what needs to be updated, but it's not so difficult to figure out programmatically. Most simply, this could just issue a warning in envoy's CI - more helpfully, but with significant added complexity, it could raise an issue or even a PR to fix it.
It would be great, when updating dates/shas/etc, if you were able to check just one or a few deps at a time. code.checker has something to filter filepaths; we possibly want to do something similar with deps.
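A sketch of dep filtering along those lines, assuming glob patterns like the path filters; the function name is hypothetical:

```python
import fnmatch
from typing import Iterable, List

def filter_deps(dep_ids: Iterable[str], patterns: List[str]) -> List[str]:
    # With no patterns every dep is checked; otherwise only deps
    # matching one of the supplied glob patterns.
    if not patterns:
        return list(dep_ids)
    return [
        dep for dep in dep_ids
        if any(fnmatch.fnmatch(dep, pattern) for pattern in patterns)]
```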
We currently have 2 types of runner - a basic runner (as currently defined in aio.run.runner) and a checker (as in aio.run.checker). The checker is multi-stage in the sense that it has multiple checks, some of which will/won't run according to cli args. It would be helpful to add another runner type - builder - which, like checker, is multi-stage and responds to cli args to determine which stages to run. I'm thinking this would be useful in particular for integrating the current tools for building docs.
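A minimal sketch of what such a builder could look like; the stage names and the --stage flag are assumptions:

```python
import argparse
from typing import List

class Builder:
    # Like the checker, the builder is multi-stage, with cli args
    # selecting which stages run; stages run in declared order.
    stages = ("html", "pdf", "publish")

    def __init__(self, *args: str) -> None:
        parser = argparse.ArgumentParser()
        parser.add_argument("--stage", action="append", choices=self.stages)
        self.args = parser.parse_args(args)

    @property
    def stages_to_run(self) -> List[str]:
        selected = self.args.stage or self.stages
        return [stage for stage in self.stages if stage in selected]

    def run(self) -> None:
        # Dispatch to a build_<stage> method per selected stage.
        for stage in self.stages_to_run:
            getattr(self, f"build_{stage}")()
```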
The checker summary could do with a bit of UX attention. Also, there seems to be an issue where it displays summary notifications for a count of warnings/errors matching the least amount per check (or something like that).
Similar to having a framework for checkers (and preloaders), it would be helpful to have something to trigger advice on certain failing checks.
Some flags don't work correctly and there is a lot of extraneous noise. Known not to work as expected: multiple -c flags. There are some that could just work better - eg adding ! negation to some matchers. Some of the matchers use regexes, but they should probably use globs for the cli and convert them with fnmatch.
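A sketch of glob-based matching via fnmatch, with a leading ! as an assumed negation syntax:

```python
import fnmatch
import re

def glob_match(pattern: str, name: str) -> bool:
    # Globs on the cli, converted to the regexes used internally;
    # a leading "!" negates the match.
    if pattern.startswith("!"):
        return not glob_match(pattern[1:], name)
    return bool(re.match(fnmatch.translate(pattern), name))
```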
Currently Envoy has quite a few format checks of one sort or another.
Most could do with some optimization, and all could easily work as checks in an integrated checker. Initially I would suggest integrating all the current jobs in format_pre. After that we can look at the other bazel/proto format jobs.
Currently async_property only has an integration test, which is handy but lacks coverage; it needs some unit tests.
Especially shellcheck, as it's less likely to be available on the system - but we probably also need this for git, and maybe even grep.
Since version 0.17.0 of pytest-asyncio you no longer need the asyncio pytest markers. We should remove them all, and move to strict mode so that they are no longer allowed.
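The mode is set through pytest configuration; a minimal fragment, assuming pytest.ini is the config file in use:

```ini
[pytest]
asyncio_mode = strict
```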
We recently added a shell util for wrapping subprocess runners. Atm it runs the command, gathers the results and returns them - it would be good if it yielded results as they become available. The relevant code is here.
It would be very helpful if we had a check that compares the shas of the tarballs to the ones in repository_locations.bzl.

Broadly I think the required steps are:
- abstract/checker.py
- abstract/checker.py which downloads the tarballs
- abstract/dependency.py to download the tarballs and compare the shas

aio.core and envoy.code.check are both missing some unit tests.
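The comparison step above could be sketched as follows; the helper name is an assumption, and downloading is elided:

```python
import hashlib

def tarball_matches(data: bytes, expected_sha: str) -> bool:
    # Compare the sha256 of the (downloaded) tarball bytes with the
    # sha recorded in repository_locations.bzl.
    return hashlib.sha256(data).hexdigest() == expected_sha
```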
When this repo was initially set up it copied the local namespaces that were in envoy. As we consolidate/reduce packages, it would be good to post a message on the discontinued packages on pypi saying where the code moved to.
When #346 landed it didn't include tests - some should be added.
Currently in envoy's format_pre ci, if the glint check fails, advice about fixing your editor is added. This should be added in envoy.code.check.
Currently we have a Dependency class that may or may not be a github dep, and which raises accordingly in various places. If it is a github dep we can create a "release" object for it, which unlocks further functionality. I think we need to create a GithubDependency and separate it from Dependency - likewise with Release - so we can make some of the functionality available to other deps (cf #320).
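A minimal sketch of the proposed split; attribute and method names are assumptions:

```python
from typing import List, Optional

class Dependency:
    # Base class carrying only functionality valid for every dep,
    # rather than raising for non-github deps.
    def __init__(self, dep_id: str, urls: List[str]) -> None:
        self.id = dep_id
        self.urls = urls

class GithubDependency(Dependency):
    # Github-specific functionality - eg deriving a release object -
    # lives on the subclass.
    @property
    def github_url(self) -> Optional[str]:
        for url in self.urls:
            if url.startswith("https://github.com/"):
                return url
        return None
```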
The recent release of pytest-asyncio requires a config setting. As not all packages require this for their tests, they break on not recognizing the config. Ideally we need per-package configs to allow use of different pytest plugins.
It would be good to have a yaml linter/validator in code.check. A previous attempt at adding one can be seen here.
Currently the dep checker doesn't discriminate between PRs and issues, so it can close a PR if it looks like a duplicate of an issue. We probably want to filter out the PRs so that doesn't happen. We also may want to filter out any issues/PRs that were not created by the bot.
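A sketch of the filtering; the GitHub REST API includes PRs in issue listings but marks them with a pull_request key, which makes this straightforward (the function name is hypothetical):

```python
from typing import Iterator, List, Optional

def issues_only(
        items: List[dict],
        bot_login: Optional[str] = None) -> Iterator[dict]:
    # Drop PR payloads (identified by their "pull_request" key) and,
    # if a bot login is given, anything not created by the bot.
    for item in items:
        if "pull_request" in item:
            continue
        if bot_login and item.get("user", {}).get("login") != bot_login:
            continue
        yield item
```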
We recently added an ignored_dirs setting to ignore the requirements file in /tools/dev. It seems that bazel checks out -e pip packages in the src directory of the build, which breaks the pip check - eg, using the envoy.dependency.check from the PR fails the pip check, as a requirements file is found in /tools/dev/src/pytooling. One possible solution is to use regex matching for the directories, and match anything below /tools/dev.
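A sketch of the regex approach; the pattern and helper name are assumptions:

```python
import re

# Match the ignored directory and anything below it, so eg
# /tools/dev/src/pytooling is also excluded from the pip check.
IGNORED_DIRS_RE = re.compile(r"^/tools/dev(?:/|$)")

def is_ignored(path: str) -> bool:
    return bool(IGNORED_DIRS_RE.match(path))
```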
The glint code check in particular checks 1000s of files, so logging success for every one is a bit of a log hog. Ideally we limit this to a given number and then overwrite the last success line, or similar. This is made more complex because I don't think we want to do this for any other logging, and we need to properly accommodate both types of output.
Mostly, I think we don't want to pin pip hashes - because this repo is testing the latest builds - but I think at least we want dependabot to check our github actions, and maybe some repo pip deps.
There are quite a few patterns in aio.core that are basically experimental, and either don't quite work as expected or have better alternatives. For the most part the code using these routines has been shifted, so there is quite a bit that is now unused - a few items may be worth retaining.
Current version: 2.8.0rc0
Debugging blocking code in the new code checker, it seems as though writing so much output is blocking. I'm pretty sure I've seen something about async logging; I'll investigate further.
When you run pants check some.package:: it uses the dependencies that are specified in the package BUILD files. When you run pants check:: it ignores these and just uses the local versions if they are also being tested. I have raised this upstream, not least because you would expect the same results running against each package as running against all packages. I also tried to resolve this by running against each package separately, but the performance is terrible - I believe this is being looked at upstream.
The recommended way to run pants is with a script in the repo. Until now, just installing pantsbuild.pants has worked, but currently there is a version conflict with docker-compose which is breaking my local venv. Using the recommended pants script avoids this problem as it uses its own venv - we should add...
When #279 lands it probably won't include the integration tests that envoy.dependency.check currently has for the cve scanner, as the lib does not have a runner. These tests, although useful, are prone to breaking, as they test the checker output, which tends to change for unrelated reasons. Now that the cve scanner is in its own package we can probably add better integration tests (based on the originals), but it will take some time.
We need to test python 3.10, so it would be good to have a matrix setup to facilitate this (if it's not there already). We also need to consider whether we only target 3.10+ moving forward - I think yes. One implication is that if we go full 3.10 we can use newer language features and libs, whereas if we don't we will have to ensure compatibility. The other implication is re users and testing: if we go strict 3.10+ we don't need to maintain a test matrix for different versions going forward. However, it also means we mandate that devs using the tooling must use the correct python version. This will be a lot simpler once rules_python has proper toolchain/runtime env support.
For example, in envoyproxy/envoy#20057 the PR has the name that the checker matches when finding issues, so it thinks the PR is a duplicate. We can prevent this in a few ways.
Not sure why - seems like a github issue - I certainly didn't change anything, and it's still working on envoy.
As the initial PR is quite large (#143), this ticket tracks the follow-up tasks.
The latest update of pants seems to have changed something in how the packages are built. There is some customisation to use a setup.cfg file, which I guess is broken.
Currently you can sometimes get duplicate logs