
Comments (18)

MikeDean2367 avatar MikeDean2367 commented on June 9, 2024 1

Hello, everything works fine in my local tests. Here are some troubleshooting suggestions:

  1. Make sure you are running the latest code from the repository; this bug was fixed in an earlier update;
  2. If the code is up to date, check your gradio version (run `pip list | grep gradio` on the command line); the version I am using is 3.50.2.

If the steps above still don't solve the problem, please let me know :)
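The version check can also be done from Python using only the standard library. This is a minimal sketch that compares the installed gradio against the 3.50.2 reported to work; the helper functions are illustrative, not part of the KnowLM repo:

```python
from importlib.metadata import version, PackageNotFoundError

def parse_version(v: str) -> tuple:
    """Split a dotted version string like '3.50.2' into a comparable tuple."""
    return tuple(int(part) for part in v.split(".") if part.isdigit())

def check_gradio(expected: str = "3.50.2") -> str:
    """Report whether the installed gradio matches the known-good version."""
    try:
        installed = version("gradio")
    except PackageNotFoundError:
        return "gradio is not installed in this environment"
    if parse_version(installed) == parse_version(expected):
        return f"gradio {installed} matches the known-good version"
    return f"gradio {installed} differs from the known-good {expected}"

print(check_gradio())
```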

from knowlm.

MikeDean2367 avatar MikeDean2367 commented on June 9, 2024 1

Hello, you can try commenting out line 8, `from peft import PeftModel`; that package is not used in the rest of the script.


keyilizzie avatar keyilizzie commented on June 9, 2024 1

> Hello, you can try commenting out line 8, `from peft import PeftModel`; that package is not used in the rest of the script.

Problem solved, thank you 🥂👏🏻


guihonghao avatar guihonghao commented on June 9, 2024 1

Since knowlm-ie has already been trained on a large number of information-extraction datasets, it has some general extraction ability. If you want to fine-tune triple extraction for a specific domain, you need at least 100 samples; the more, the better.


keyilizzie avatar keyilizzie commented on June 9, 2024 1

> Since knowlm-ie has already been trained on a large number of information-extraction datasets, it has some general extraction ability. If you want to fine-tune triple extraction for a specific domain, you need at least 100 samples; the more, the better.

Great 😄, thanks!


keyilizzie avatar keyilizzie commented on June 9, 2024

Hello, my gradio version is 3.50.2. To rule out any other details I might have missed, I re-downloaded the code and ran the same command; it fails with the error below:
```
(knowlm) [liukeyi@ibgpu09 ~]$ CUDA_VISIBLE_DEVICES=0,1,2 python /home/liukeyi/KnowLM/examples/generate_lora_web.py --base_model /home/liukeyi/zhixi --multi_gpu
╭─────────────────────────────── Traceback (most recent call last) ────────────────────────────────╮
│ /home/liukeyi/KnowLM/examples/generate_lora_web.py:8 in │
│ │
│ 5 import gradio as gr │
│ 6 import torch │
│ 7 import transformers │
│ ❱ 8 from peft import PeftModel │
│ 9 from transformers import GenerationConfig, LlamaForCausalLM, LlamaTokenizer │
│ 10 from multi_gpu_inference import get_tokenizer_and_model │
│ 11 from typing import List │
│ │
│ /home/liukeyi/miniconda3/envs/knowlm/lib/python3.9/site-packages/peft/__init__.py:22 in │
│ │
│ 19 │
│ 20 __version__ = "0.5.0" │
│ 21 │
│ ❱ 22 from .auto import ( │
│ 23 │ AutoPeftModel, │
│ 24 │ AutoPeftModelForCausalLM, │
│ 25 │ AutoPeftModelForSequenceClassification, │
│ │
│ /home/liukeyi/miniconda3/envs/knowlm/lib/python3.9/site-packages/peft/auto.py:30 in │
│ │
│ 27 │ AutoModelForTokenClassification, │
│ 28 ) │
│ 29 │
│ ❱ 30 from .config import PeftConfig │
│ 31 from .mapping import MODEL_TYPE_TO_PEFT_MODEL_MAPPING │
│ 32 from .peft_model import ( │
│ 33 │ PeftModel, │
│ │
│ /home/liukeyi/miniconda3/envs/knowlm/lib/python3.9/site-packages/peft/config.py:24 in │
│ │
│ 21 from huggingface_hub import hf_hub_download │
│ 22 from transformers.utils import PushToHubMixin │
│ 23 │
│ ❱ 24 from .utils import CONFIG_NAME, PeftType, TaskType │
│ 25 │
│ 26 │
│ 27 @dataclass │
│ │
│ /home/liukeyi/miniconda3/envs/knowlm/lib/python3.9/site-packages/peft/utils/__init__.py:22 in │
│ │
│ │
│ 19 │
│ 20 # from .config import PeftConfig, PeftType, PromptLearningConfig, TaskType │
│ 21 from .peft_types import PeftType, TaskType │
│ ❱ 22 from .other import ( │
│ 23 │ TRANSFORMERS_MODELS_TO_PREFIX_TUNING_POSTPROCESS_MAPPING, │
│ 24 │ TRANSFORMERS_MODELS_TO_LORA_TARGET_MODULES_MAPPING, │
│ 25 │ TRANSFORMERS_MODELS_TO_ADALORA_TARGET_MODULES_MAPPING, │
│ │
│ /home/liukeyi/miniconda3/envs/knowlm/lib/python3.9/site-packages/peft/utils/other.py:24 in │
│ │
│ │
│ 21 import accelerate │
│ 22 import torch │
│ 23 from accelerate.hooks import add_hook_to_module, remove_hook_from_module │
│ ❱ 24 from accelerate.utils import is_npu_available, is_xpu_available │
│ 25 │
│ 26 from ..import_utils import is_auto_gptq_available │
│ 27 │
╰──────────────────────────────────────────────────────────────────────────────────────────────────╯
ImportError: cannot import name 'is_npu_available' from 'accelerate.utils'
(/home/liukeyi/miniconda3/envs/knowlm/lib/python3.9/site-packages/accelerate/utils/__init__.py)
```


keyilizzie avatar keyilizzie commented on June 9, 2024

Hello, one more question: if I want to fine-tune triple extraction for a specific domain, how many training samples are needed at minimum?


vv521 avatar vv521 commented on June 9, 2024

> Since knowlm-ie has already been trained on a large number of information-extraction datasets, it has some general extraction ability. If you want to fine-tune triple extraction for a specific domain, you need at least 100 samples; the more, the better.
>
> Great 😄, thanks!

Hello, for domain-specific information extraction, how do I fine-tune on my own domain data?


guihonghao avatar guihonghao commented on June 9, 2024

> Since knowlm-ie has already been trained on a large number of information-extraction datasets, it has some general extraction ability. If you want to fine-tune triple extraction for a specific domain, you need at least 100 samples; the more, the better.
>
> Great 😄, thanks!
>
> Hello, for domain-specific information extraction, how do I fine-tune on my own domain data?

You can refer to https://github.com/zjunlp/DeepKE/blob/main/example/llm/InstructKGC/README_CN.md for fine-tuning.


vv521 avatar vv521 commented on June 9, 2024

> Since knowlm-ie has already been trained on a large number of information-extraction datasets, it has some general extraction ability. If you want to fine-tune triple extraction for a specific domain, you need at least 100 samples; the more, the better.
>
> Great 😄, thanks!
>
> Hello, for domain-specific information extraction, how do I fine-tune on my own domain data?
>
> You can refer to https://github.com/zjunlp/DeepKE/blob/main/example/llm/InstructKGC/README_CN.md for fine-tuning.

OK. Are there reference P, R, and F1 values for the output?


guihonghao avatar guihonghao commented on June 9, 2024

You can test with the script we use to compute P, R, and F1: https://github.com/zjunlp/DeepKE/blob/main/example/llm/InstructKGC/kg2instruction/evaluate.py
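The linked evaluate.py scores predictions at the triple level; the underlying precision/recall/F1 computation can be sketched as follows. This is a simplified illustration over exact-match triples, not the actual script, which also handles output parsing and normalization:

```python
def triple_prf(gold: set, pred: set) -> tuple:
    """Precision/recall/F1 over exact-match (head, relation, tail) triples."""
    correct = len(gold & pred)                 # predicted triples also in gold
    p = correct / len(pred) if pred else 0.0   # precision: correct / predicted
    r = correct / len(gold) if gold else 0.0   # recall: correct / gold
    f1 = 2 * p * r / (p + r) if p + r else 0.0
    return p, r, f1

gold = {("Paris", "capital of", "France"), ("Berlin", "capital of", "Germany")}
pred = {("Paris", "capital of", "France"), ("Rome", "capital of", "Spain")}
print(triple_prf(gold, pred))  # (0.5, 0.5, 0.5)
```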


vv521 avatar vv521 commented on June 9, 2024

> You can test with the script we use to compute P, R, and F1: https://github.com/zjunlp/DeepKE/blob/main/example/llm/InstructKGC/kg2instruction/evaluate.py

Great, thank you very much for your answer!


zxlzr avatar zxlzr commented on June 9, 2024

Do you have any other questions?


vv521 avatar vv521 commented on June 9, 2024

> Do you have any other questions?

Hello, if I want to do relation extraction and have compiled a dataset of 100 samples, how do I convert it into the format knowlm-13b-ie requires? Also, is fine-tuning knowlm-13b-ie based on the InstructKGC project?


guihonghao avatar guihonghao commented on June 9, 2024

> Do you have any other questions?
>
> Hello, if I want to do relation extraction and have compiled a dataset of 100 samples, how do I convert it into the format knowlm-13b-ie requires? Also, is fine-tuning knowlm-13b-ie based on the InstructKGC project?

Hello, our newly open-sourced repository IEPile releases a stronger extraction model, LLaMA2-IEPILE, along with tutorials:
https://github.com/zjunlp/IEPile/tree/main
You can follow
https://github.com/zjunlp/IEPile/blob/main/README_CN.md#4%E9%A2%86%E5%9F%9F%E5%86%85%E6%95%B0%E6%8D%AE%E7%BB%A7%E7%BB%AD%E8%AE%AD%E7%BB%83
to convert your 100 samples into the specified instruction data and continue training the model.
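The conversion step can be sketched roughly as below. Note this is only an illustration: the field names (`instruction`, `input`, `output`), the prompt wording, and the `to_instruction_record` helper are hypothetical placeholders; the authoritative record format is the one described in the IEPile README linked above.

```python
import json

def to_instruction_record(text: str, triples: list, schema: list) -> str:
    """Turn one annotated sentence into a JSON-lines instruction sample.

    Hypothetical sketch: field names and prompt wording are placeholders,
    not the format IEPile actually specifies.
    """
    record = {
        "instruction": (
            "Extract all (head, relation, tail) triples matching the "
            f"relation schema {schema} from the input text."
        ),
        "input": text,
        "output": json.dumps(triples, ensure_ascii=False),
    }
    return json.dumps(record, ensure_ascii=False)

sample = to_instruction_record(
    "Marie Curie was born in Warsaw.",
    [["Marie Curie", "place of birth", "Warsaw"]],
    ["place of birth"],
)
print(sample)
```

Writing one such record per annotated sentence yields a JSON-lines file of the kind instruction-tuning pipelines typically consume.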


vv521 avatar vv521 commented on June 9, 2024

> Do you have any other questions?
>
> Hello, if I want to do relation extraction and have compiled a dataset of 100 samples, how do I convert it into the format knowlm-13b-ie requires? Also, is fine-tuning knowlm-13b-ie based on the InstructKGC project?

> Hello, our newly open-sourced repository IEPile releases a stronger extraction model, LLaMA2-IEPILE, along with tutorials: https://github.com/zjunlp/IEPile/tree/main You can follow https://github.com/zjunlp/IEPile/blob/main/README_CN.md#4%E9%A2%86%E5%9F%9F%E5%86%85%E6%95%B0%E6%8D%AE%E7%BB%A7%E7%BB%AD%E8%AE%AD%E7%BB%83 to convert your 100 samples into the specified instruction data and continue training the model.

Thank you! Is LLaMA2-IEPILE better at domain-specific extraction than knowlm-13b-ie? And is a 24 GB 3090 enough VRAM to fine-tune on this data?


guihonghao avatar guihonghao commented on June 9, 2024

LLaMA2-IEPILE is much better than knowlm-13b-ie at domain-specific extraction. Its base model is LLaMA2-13B-Chat, and LLaMA2-IEPILE itself is a LoRA adapter. With quantization enabled during fine-tuning, a 24 GB 3090 should be enough.
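As a back-of-the-envelope check on the 24 GB figure (illustrative arithmetic only; real usage also depends on sequence length, batch size, and implementation details): the weights alone of a 13B model in fp16 nearly fill a 3090, while 4-bit quantization leaves room for the LoRA adapter, its optimizer state, and activations.

```python
def quantized_weight_gb(n_params_billion: float, bits: int) -> float:
    """Approximate memory for model weights alone at a given bit width."""
    bytes_total = n_params_billion * 1e9 * bits / 8
    return bytes_total / 1024**3  # convert bytes to GiB

fp16 = quantized_weight_gb(13, 16)  # ~24.2 GB: weights alone nearly fill a 3090
int4 = quantized_weight_gb(13, 4)   # ~6.1 GB: leaves headroom for LoRA training
print(f"13B fp16 weights: {fp16:.1f} GB, 4-bit weights: {int4:.1f} GB")
```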


vv521 avatar vv521 commented on June 9, 2024

Great, thank you so much!!! You're awesome.

