
cookiecutter-poetry's Introduction



This is a modern Cookiecutter template that can be used to initiate a Python project with all the necessary tools for development, testing, and deployment, including Poetry for dependency management, CI/CD with GitHub Actions, pre-commit hooks, type checking with mypy, dependency checks with deptry, testing with pytest and tox, documentation with MkDocs, and publishing to PyPI.


Documentation - Example - PyPI


Quickstart

On your local machine, navigate to the directory in which you want to create a project directory, and run the following two commands:

pip install cookiecutter-poetry
ccp

Alternatively, install cookiecutter and pass the URL of this GitHub repository directly to the cookiecutter command:

pip install cookiecutter
cookiecutter https://github.com/fpgmaas/cookiecutter-poetry.git

Create a repository on GitHub, and then run the following commands, replacing <project_name> with the name that you gave the GitHub repository and <github_author_handle> with your GitHub username.

cd <project_name>
git init -b main
git add .
git commit -m "Init commit"
git remote add origin git@github.com:<github_author_handle>/<project_name>.git
git push -u origin main

Finally, install the environment and the pre-commit hooks with

make install

You are now ready to start development on your project! The CI/CD pipeline will be triggered when you open a pull request, merge to main, or when you create a new release.

To finalize the setup for publishing to PyPI or Artifactory, see here. To activate the automatic documentation with MkDocs, see here. To enable the code coverage reports, see here.

Acknowledgements

This project is partially based on Audrey Feldroy's great cookiecutter-pypackage repository.

cookiecutter-poetry's People

Contributors

aaccioly, bhjelmar, coinhexa, fpgmaas, kenibrewer, manimozaffar


cookiecutter-poetry's Issues

MkDocs not producing output for mkdocstrings

Describe the bug

The mkdocstrings configuration is not producing documentation for code docstrings.

To Reproduce

Steps to reproduce the behavior:

  1. Run the cookiecutter for an empty project with no uploads set
  2. Add some docstrings to basic code in a code file / test file
  3. Run make docs
  4. Look for the modules output

Thanks for not using Sphinx here!
I'm not sure if this is an issue with the cookiecutter, but if mkdocstrings works for you, what configuration is needed?

Expected behavior

Some MkDocs pages are produced and served for the docstrings.

System [please complete the following information]:

  • OS: e.g. [Ubuntu 22.04]
  • Language Version: [e.g. Python 3.10]
  • Virtual environment: [mkdocs 1.5.3, poetry 1.8.0]

I've just started using this template, and I'm a first-time cookiecutter user.
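A hedged suggestion, since the exact cause isn't confirmed here: mkdocstrings only documents modules that are explicitly referenced with a ::: identifier line in a docs page. Assuming a package named my_project and a docs/modules.md page (both hypothetical names), adding the directive and rebuilding should surface the docstrings:

echo '::: my_project' >> docs/modules.md
make docs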

Add Devcontainer/Codespaces Specifications

Describe the solution you would like

Devcontainer specifications provide a reproducible environment with pre-installed software dependencies. They make it easier for new contributors / collaborators to jump into a project and make contributions instead of spending time setting up their dev environment.

Additional context

If this feature is of interest, I would be willing to contribute the necessary code.

GHA fails with dependencies not found: cache issue?

I created a repo, did a few PRs which spawned GHA runs that worked fine. Then, I renamed the repo (typo!) and now the caching seems wrong. The "Set up the environment" step restores a cache which contains nothing: no isort, black, or anything at all. This, of course, causes the next step to fail horribly:

🚀 Checking code formatting: Running isort
make: isort: Command not found
make: *** [Makefile:13: check] Error 127

Any idea what could be going wrong there?
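One hedged workaround, assuming a reasonably recent GitHub CLI (the gh cache subcommand): delete the caches left over from before the rename so the next run rebuilds the environment from scratch.

gh cache list --repo <github_author_handle>/<project_name>
gh cache delete --all --repo <github_author_handle>/<project_name>

Caches can also be removed from the Actions page of the repository in the GitHub UI.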

Project has active maintenance ?

Question
The project looks really interesting, so I wonder why more issues haven't been raised.

E.g. Python 3.9 is pretty old nowadays.

The Renovate PR has already been raised.

Is the project actively being maintained?

Template does not work properly if two different templates are installed.

If two templates are installed in an environment at the same time (e.g. this one and cookiecutter-pdm), this causes an issue, since both try to add their hooks, {{cookiecutter.project_name}}, and cookiecutter.json to the site-packages directory.

This means that only one template is actually installed, and both the commands ccp and ccpdm will trigger the same template. E.g. when installing both packages:

pip install cookiecutter-poetry
pip install cookiecutter-pdm

and then running

ccp
ccpdm

both commands will create a Poetry project!

Solutions

Option 1

A possible solution would be to move the template to a subdirectory:

#91

However, this would mean that the template, if installed from the URL, would need to be installed as follows:

cookiecutter https://github.com/fpgmaas/cookiecutter-poetry.git \
  --directory templates/cookiecutter-poetry

which is less clean and less intuitive. This also breaks the existing functionality.

Option 2

Remove the possibility of installing the package, and remove publishing to PyPI. I don't really like this option, because currently it is easy for someone to pin the specific version of the template they want to use by simply installing that version of the template.

Option 3

Find another way to bundle the template with the Python package?

Feature Request: coverage report as GHA

Is your feature request related to a problem? Please describe.

I would like to see a coverage report generated by the unit tests. Clearly, this only has to happen for the latest (?) Python version.

Describe the solution you would like

We could use this GitHub action.

This takes a coverage.lcov report, which coverage can generate via coverage lcov. The latest (or a supported) version of Python within tox could generate it.
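For reference, a minimal sketch of generating the report locally, assuming coverage.py >= 6.3 (the release that added the lcov command) is installed in the project environment:

poetry run coverage run -m pytest
poetry run coverage lcov -o coverage.lcov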

Additional context

I am not attached to this GitHub action at all, so if there is something else, that would work too.

Is the gh-pages branch for MkDocs supposed to be created manually?

Question

I'm trying to set up a new project and noticed that a gh-pages branch wasn't created automatically at any point to make automatic MkDocs publishing possible. The documentation doesn't specify when/how the branch is created, so I'm a bit confused about whether I was supposed to create it myself and then run a specific make command or similar to get the docs published on the remote repo. What are the proper steps here? Thanks!
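For what it's worth, mkdocs can create and push the gh-pages branch itself. A hedged manual equivalent of what the generated release workflow is expected to do (check the generated on-release-main.yml for the exact step):

poetry run mkdocs gh-deploy --force

After the first deploy, the branch exists and GitHub Pages can be pointed at it in the repository settings.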

How to avoid tox failing to import modules during GitHub actions?

Question

During GitHub Actions tests after pushing to the main branch, tox for some reason can't import modules I've added with poetry add and imported into my source files.
I always get this error: ImportError: libEGL.so.1: cannot open shared object file: No such file or directory
I'm not sure if I need to add something to tox.ini, change how I'm importing in my Python files, or both, to fix this.
Also, when creating my project with ccp, I accepted all the options except for Docker. I'm not sure if that's playing a role here.

For example...

/my-project
  /my_project
    /main.py
  /tests
    /test_main.py

If I run poetry add pyside6,
then add something like from PySide6.QtWidgets import QApplication, QWidget to /my_project/main.py,
and then import that source file in /tests/test_main.py with from my_project.main import main, I get the following error from GitHub Actions when pushing to the remote main branch:

 ==================================== ERRORS ====================================
  _____________________ ERROR collecting tests/test_main.py ______________________
  tests/test_main.py:3: in <module>
      from my_project.main import main
  my_project/main.py:4: in <module>
      from PySide6.QtWidgets import QApplication, QWidget
  E   ImportError: libEGL.so.1: cannot open shared object file: No such file or directory

What's the proper way to fix tox not being able to import these modules?
All config files are at their defaults from when they were created by ccp, and I haven't run any special commands to change anything.
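One hedged explanation: libEGL.so.1 is a system library that PySide6 loads at import time, and it is not installed by default on the GitHub-hosted Ubuntu runners. Assuming an Ubuntu runner, adding a workflow step that installs the GL/EGL system packages before running tox usually resolves this kind of ImportError:

sudo apt-get update
sudo apt-get install -y libegl1 libgl1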

Pyright vs mypy

Just wanted to open up a discussion. Have you thought about giving pyright a try instead of using mypy?
I'm curious if you find one more favorable or effective than the other. Let's discuss! What are your thoughts?
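As a hedged way to compare the two without touching the existing mypy setup, pyright can be added as a dev dependency and run against the same code (the PyPI package wraps the Node-based checker):

poetry add --group dev pyright
poetry run pyright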

Bug Summary

Describe the bug

make docs fails on a new project with:

Error reading page 'modules.md': No module named 'mkdocstrings.handlers.python'
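A hedged fix, assuming the error simply means the Python handler is missing (it is distributed separately from mkdocstrings itself): install the handler extra into the dev dependency group (or whichever group holds the docs dependencies) and rebuild.

poetry add --group dev "mkdocstrings[python]"
make docs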

Prettier Pre-commit Hooks

Is your feature request related to a problem? Please describe.

Docs diffs become much easier to maintain when there is an opinionated code formatter for the Markdown files such as Prettier.
Prettier also allows opinionated formatting of other files relevant to python projects including YAML and JSON.
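For illustration, a hedged one-off invocation of the formatter that such a hook would wrap (assuming Node.js is available; the pre-commit integration would run the same tool automatically):

npx prettier --write "**/*.{md,yml,yaml,json}"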

Describe the solution you would like

An ideal solution would consist of the following:

Additional context

If this feature is of interest, feel free to assign it to me for completion.

Broken on-release-main.yml when publish_to = none and cookiecutter.mkdocs = n

As per title, when one initialises a project with

include_github_actions: y
cookiecutter.mkdocs: n
publish_to: none

Cookiecutter generates an on-release-main.yml file with no steps, which fails the build.

Suggested solutions:

Either remove the file altogether when publish_to: none, or execute all steps up to the actual publishing.
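As a hedged interim workaround in an already-generated project, the empty workflow can simply be removed (assuming the default file location):

git rm .github/workflows/on-release-main.yml
git commit -m "Remove empty release workflow"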

Should locally defined actions be moved from `.github/workflows/` to `.github/actions/`?

Hi there! 👋

I'm the maintainer of check-jsonschema, and a user recently reported the hooks failing on a project generated from this template: python-jsonschema/check-jsonschema#113

I see that there are actions defined in this repo in the workflows dir, which is what was confusing the pre-commit hook.
The hook behavior is to check any YAML files in .github/workflows/ against the workflow schema from schemastore. I'm not confident that the logic is correct, but it is consistent with the match rule in schemastore today.

By my reading of GitHub Workflows docs -- and the observed behavior of the service -- workflows are defined as .github/workflows/*.ya?ml, not .github/workflows/**/*.ya?ml. So I think there's a case to be made that check-jsonschema is not accurately describing github's behavior, but I'm continuing to study the situation a bit before making changes.

Schemastore defines a schema for github-actions, and check-jsonschema provides a hook. In the former case, it matches action.ya?ml in the repo root dir, and in the latter it matches action.ya?ml in the repo root or .github/actions/**/*.ya?ml. The rationale here is that GitHub's docs show use of that directory for local actions, so it seems somewhat common. Supporting validation of more local actions is therefore reasonable down this path.

It would therefore be nice, from my perspective, to move action definitions from subdirectories of .github/workflows/ to subdirectories of .github/actions/. The hooks would then run correctly without adjustments to check-jsonschema. Even if I make the workflow matching more strict (I probably will, but am taking time to think about it), actions won't be checked outside of the .github/actions/ dir.
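Concretely, the proposed move would look something like the following for each locally defined action (action-name is a placeholder; any uses: references in the workflows would need the matching path update):

git mv .github/workflows/<action-name> .github/actions/<action-name>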

Make badges work in private github repositories.

Is your feature request related to a problem? Please describe.

Badges do not work in private GitHub repositories, due to the use of shields.io.

Describe the solution you would like

Badges should be functional in private repositories as well.

Launch dev server with `use_exec`

Is your feature request related to a problem? Please describe.

I can't use my debugger (pudb) with poe api --dev because it is launched in the background. It triggers an error when the debugger is supposed to open:

  File "[...]/site-packages/pudb/debugger.py", line 2683, in event_loop
    keys = self.screen.get_input()
  File "[...]/site-packages/urwid/display/_raw_display_base.py", line 286, in get_input
    keys, raw = self.parse_input(None, None, self.get_available_raw_input())

  [...]

  File "[...]/site-packages/urwid/display/_posix_raw_display.py", line 298, in _getch
    return ord(os.read(fd, 1))
TypeError: ord() expected a character, but string of length 0 found

Describe the solution you would like

Launch the dev server with use_exec = true, cf. https://poethepoet.natn.io/tasks/options.html#defining-tasks-that-run-via-exec-instead-of-a-subprocess

For my project, I changed the task declarations like this (the changes are summarised after the snippet):

  [tool.poe.tasks.dev]
  help = "Serve a REST API for DEV purpose"
  use_exec = true
  cmd = """
    uvicorn \
      --host $host \
      --port $port \
      --reload \
      --log-level debug \
      network_manager.api:app
    """

    [[tool.poe.tasks.dev.args]]
    help = "Bind socket to this host (default: 0.0.0.0)"
    name = "host"
    options = ["--host"]
    default = "0.0.0.0"

    [[tool.poe.tasks.dev.args]]
    help = "Bind socket to this port (default: 8000)"
    name = "port"
    options = ["--port"]
    default = "8000"

  [tool.poe.tasks.api]
  help = "Serve a REST API"
  use_exec = true
  cmd = """
    gunicorn \
      --access-logfile - \
      --bind $host:$port \
      --graceful-timeout 10 \
      --keep-alive 10 \
      --log-file - \
      --timeout 30 \
      --worker-class uvicorn.workers.UvicornWorker \
      --worker-tmp-dir /dev/shm \
      --workers 2 \
      network_manager.api:app
    """

    [[tool.poe.tasks.api.args]]
    help = "Bind socket to this host (default: 0.0.0.0)"
    name = "host"
    options = ["--host"]
    default = "0.0.0.0"

    [[tool.poe.tasks.api.args]]
    help = "Bind socket to this port (default: 8000)"
    name = "port"
    options = ["--port"]
    default = "8000"
  In short, the changes were to:

  • remove the dev option and split the existing task in two
  • use cmd with use_exec instead of shell

This is because when using use_exec, it's not possible to use shell, and when using cmd, it's not possible to use a shell if statement... There might be a better way, but this seems to work.

Additional context

There might be a better solution... but I can create a pull request if you are happy with this.

Add mypy and deptry as pre-commit hooks

Currently mypy is not running automatically (neither locally nor as part of GitHub's pipeline).

A user has to manually run:

make check

if they want to trigger type checking.

deptry runs as part of the pipeline, but, IMO, it should also run as a pre-commit hook, as catching dependency problems after they have been introduced is not great.

Proposed solution

Add both tools to pre-commit

Additional context

I've just recently adopted cookiecutter-poetry (thanks a million for it) and butchered some type hints.

I was under the impression that mypy was running automatically, but unfortunately this wasn't the case. Later I realised that there's a dedicated make target for linting, but I had to read the Makefile to find it (see #79).

As both mypy and deptry run very fast, I would have no problem running both of them on every commit.
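For reference, these are the commands such hooks would wrap, as run by hand today (hedged; the exact targets depend on the generated project layout):

poetry run mypy .
poetry run deptry .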

Include MyPy install types when using make install command

Is your feature request related to a problem? Please describe.

When MyPy type checking is enabled, the install types (type stubs) are not installed after make install, unless we manually specify all the needed type-stub dependencies.

Describe the solution you would like

We should add something that includes all the install types automatically. A pragmatic solution would be to add @poetry run mypy --install-types to the Makefile, but there might be better solutions, such as poetry-types, since we would like an automatic way to include all used type stubs as (dev) dependencies.
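For reference, the manual equivalent of the proposed Makefile addition (hedged; --non-interactive skips the confirmation prompt so it also works unattended in make install):

poetry run mypy --install-types --non-interactive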

Additional context

Curious what you think about this?
