
Comments (7)

xXWarMachineRoXx commented on August 10, 2024

Updated the Makefile to help people running on Windows:

.PHONY: help install install_dev install_only_dev add add_dev export_requirements upload_dataset_to_beam dev_train_local train_local dev_train_beam train_beam dev_infer_local infer_local dev_infer_beam infer_beam lint_check lint_fix format_check format_fix

# === Install ===
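# Pointing PYTHON_KEYRING_BACKEND at the null backend stops Poetry from querying
# the system keyring, which can hang or error on Windows.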

install:
	@echo "Installing training pipeline..."
	
	set PYTHON_KEYRING_BACKEND=keyring.backends.null.Keyring & \
	poetry env use $(shell where python) & \
	poetry install & \
	poetry run pip install torch==2.0.1

install_dev: install
	set PYTHON_KEYRING_BACKEND=keyring.backends.null.Keyring & \
	poetry install --with dev

install_only_dev:
	set PYTHON_KEYRING_BACKEND=keyring.backends.null.Keyring & \
	poetry install --only dev

add:
	set PYTHON_KEYRING_BACKEND=keyring.backends.null.Keyring & \
	poetry add $(package)

add_dev:
	set PYTHON_KEYRING_BACKEND=keyring.backends.null.Keyring & \
	poetry add --group dev $(package)


# === Beam ===

export_requirements:
	@echo "Exporting requirements..."

	if exist requirements.txt del requirements.txt
	poetry export -f requirements.txt --output requirements.txt --without-hashes

upload_dataset_to_beam:
	@echo "Pushing data to the qa_dataset volume on Beam..."
	
	beam volume upload qa_dataset dataset


# === Training ===

dev_train_local:
	@echo "Running training pipeline locally using the development config..."
	
	poetry run python -m tools.train_run --config_file configs/dev_training_config.yaml --output_dir ./output --dataset_dir ./dataset

train_local:
	@echo "Running training pipeline locally using the production config..."
	
	poetry run python -m tools.train_run --config_file configs/training_config.yaml --output_dir ./output --dataset_dir ./dataset

dev_train_beam: export_requirements
	@echo "Running training pipeline on Beam using the development config..."

	set BEAM_IGNORE_IMPORTS_OFF=true & \
	beam run ./tools/train_run.py:train -d '{"config_file": "configs/dev_training_config.yaml", "output_dir": "./output", "dataset_dir": "./qa_dataset/dataset", "env_file_path": ".env", "model_cache_dir": "./model_cache"}'

train_beam: export_requirements
	@echo "Running training pipeline on Beam using the production config..."

	set BEAM_IGNORE_IMPORTS_OFF=true & \
	beam run ./tools/train_run.py:train -d '{"config_file": "configs/training_config.yaml", "output_dir": "./output", "dataset_dir": "./qa_dataset/dataset", "env_file_path": ".env", "model_cache_dir": "./model_cache"}'


# === Inference ===

dev_infer_local:
	@echo "Running inference pipeline locally using the development config..."

	poetry run python -m tools.inference_run --config_file configs/dev_inference_config.yaml --dataset_dir ./dataset

infer_local:
	@echo "Running inference pipeline locally using the production config..."

	poetry run python -m tools.inference_run --config_file configs/inference_config.yaml --dataset_dir ./dataset

dev_infer_beam: export_requirements
	@echo "Running inference pipeline on Beam using the development config..."

	set BEAM_IGNORE_IMPORTS_OFF=true & \
	beam run ./tools/inference_run.py:infer -d '{"config_file": "configs/dev_inference_config.yaml", "dataset_dir": "./qa_dataset/dataset", "env_file_path": ".env", "model_cache_dir": "./model_cache"}'

infer_beam: export_requirements
	@echo "Running inference pipeline on Beam using the production config..."

	set BEAM_IGNORE_IMPORTS_OFF=true & \
	beam run ./tools/inference_run.py:infer -d '{"config_file": "configs/inference_config.yaml", "dataset_dir": "./qa_dataset/dataset", "env_file_path": ".env", "model_cache_dir": "./model_cache"}'


# === PEP8 ===
# Be sure to install the dev dependencies first #

lint_check:
	@echo "Checking for linting issues..."

	poetry run ruff check .

lint_fix:
	@echo "Fixing linting issues..."

	poetry run ruff check --fix .

format_check:
	@echo "Checking for formatting issues..."

	poetry run black --check .

format_fix:
	@echo "Formatting code..."

	poetry run black .
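
With GNU Make installed and on PATH (for example via choco install make), the targets are then run from a plain cmd prompt as usual, e.g.:

make install_dev
make dev_train_local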


xXWarMachineRoXx commented on August 10, 2024

Now it says I need Python 3.10, but I have 3.11!!


xXWarMachineRoXx commented on August 10, 2024

Changed the Python constraint in pyproject.toml from:

python = ">=3.10,<3.11"

to:

python = ">=3.10,<3.12"
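After widening the constraint, the lock file and virtual environment need regenerating so Poetry actually resolves against 3.11. A sketch, assuming python on PATH points at 3.11:

poetry env use python
poetry lock
poetry install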


xXWarMachineRoXx commented on August 10, 2024

Error

Installing collected packages: mpmath, typing-extensions, sympy, networkx, MarkupSafe, filelock, jinja2, torch
ERROR: Could not install packages due to an OSError: [Errno 2] No such file or directory: 'C:\\Users\\User\\AppData\\Local\\packages\\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\\LocalCache\\Local\\pypoetry\\Cache\\virtualenvs\\training-pipeline-Twg6Ctcn-py3.11\\Lib\\site-packages\\sympy\\parsing\\autolev\\test-examples\\pydy-example-repo\\mass_spring_damper.al

Solution

Enable long-path support in the Windows registry (regedit).
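
One way to do this without clicking through regedit is from an elevated command prompt; it sets the same LongPathsEnabled value (a sign-out or reboot may be needed before it takes effect):

reg add "HKLM\SYSTEM\CurrentControlSet\Control\FileSystem" /v LongPathsEnabled /t REG_DWORD /d 1 /f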


xXWarMachineRoXx commented on August 10, 2024

Error

bitsandbytes was compiled with CPU support only, not GPU support

Log

UserWarning: The installed version of bitsandbytes was compiled without GPU support. 8-bit optimizers, 8-bit multiplication, and GPU quantization are unavailable.. Please update your configuration.

'NoneType' object has no attribute 'cadam32bit_grad_fp32'
2024-05-29 17:29:25,536 - INFO - Initializing env vars...
2024-05-29 17:29:25,537 - INFO - Loading environment variables from: .env
2024-05-29 17:29:37,786 - INFO - PyTorch version 2.0.1 available.
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "C:\Users\
    \Documents\coding\Work\GPT2\Fine_tune3.5\hands-on-llms\modules\training_pipeline\tools\train_run.py", line 83, in <module>
    fire.Fire(train)
  File "C:\Users\
    \AppData\Local\packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\Local\pypoetry\Cache\virtualenvs\training-pipeline-Twg6Ctcn-py3.11\Lib\site-packages\fire\core.py", line 141, in Fire
    component_trace = _Fire(component, args, parsed_flag_args, context, name)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\
    \AppData\Local\packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\Local\pypoetry\Cache\virtualenvs\training-pipeline-Twg6Ctcn-py3.11\Lib\site-packages\fire\core.py", line 475, in _Fire
    component, remaining_args = _CallAndUpdateTrace(
                                ^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\
    \AppData\Local\packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\Local\pypoetry\Cache\virtualenvs\training-pipeline-Twg6Ctcn-py3.11\Lib\site-packages\fire\core.py", line 691, in _CallAndUpdateTrace
    component = fn(*varargs, **kwargs)
                ^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\User\AppData\Local\packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\Local\pypoetry\Cache\virtualenvs\training-pipeline-Twg6Ctcn-py3.11\Lib\site-packages\beam\app.py", line 1346, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\
    \Documents\coding\Work\GPT2\Fine_tune3.5\hands-on-llms\modules\training_pipeline\tools\train_run.py", line 59, in train
    from training_pipeline.api import TrainingAPI
  File "C:\Users\
    \Documents\coding\Work\GPT2\Fine_tune3.5\hands-on-llms\modules\training_pipeline\training_pipeline\api\__init__.py", line 2, in <module>
    from training_pipeline.api.training import TrainingAPI
  File "C:\Users\
    \Documents\coding\Work\GPT2\Fine_tune3.5\hands-on-llms\modules\training_pipeline\training_pipeline\api\training.py", line 17, in <module>
    from trl import SFTTrainer
  File "C:\Users\
    \AppData\Local\packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\Local\pypoetry\Cache\virtualenvs\training-pipeline-Twg6Ctcn-py3.11\Lib\site-packages\trl\__init__.py", line 5, in <module>
    from .core import set_seed
  File "C:\Users\
    \AppData\Local\packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\Local\pypoetry\Cache\virtualenvs\training-pipeline-Twg6Ctcn-py3.11\Lib\site-packages\trl\core.py", line 23, in <module>
    from transformers import top_k_top_p_filtering
ImportError: cannot import name 'top_k_top_p_filtering' from 'transformers' (C:\Users\
    \AppData\Local\packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\Local\pypoetry\Cache\virtualenvs\training-pipeline-Twg6Ctcn-py3.11\Lib\site-packages\transformers\__init__.py)
make: *** [dev_train_local] Error 1

Attempted fix

Changed bitsandbytes to a CUDA-based build (bitsandbytes-cuda112, since my NVIDIA CUDA version is 12.1).
Turns out it's not supported on Windows 💢😤😡🤬

So I installed bitsandbytes-windows instead, and now I get another error: huggingface_hub has no module like generated.types.zero_shot_image_classification.

And I'm stuck.
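
Two quick checks on what the Poetry environment actually ended up with (the python -m bitsandbytes diagnostic is only available in more recent bitsandbytes releases):

poetry run python -c "import torch; print(torch.cuda.is_available())"
poetry run python -m bitsandbytes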


xXWarMachineRoXx commented on August 10, 2024

Current Error

ModuleNotFoundError: No module named huggingface_hub.inference._generated.types.zero_shot_image_classification
make: *** [dev_train_local] Error 1
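
That submodule only exists in newer huggingface_hub releases, so this usually means the installed huggingface_hub is older than what the rest of the stack now expects, likely left behind by the manual package swaps. A guess at a fix, not a verified one: upgrade huggingface_hub inside the same environment, or sync everything back to the lock file:

poetry run pip install --upgrade huggingface_hub
poetry install --sync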


MotiBaadror commented on August 10, 2024

Why don't you use the Windows Subsystem for Linux (WSL) to make things smoother?
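
On Windows 10 (2004 and later) and Windows 11, setting that up is a single command from an elevated prompt, followed by a reboot:

wsl --install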

