
rasa-faq-bot's Issues

rasa run actions resulted in the following error

2021-05-16 19:35:30 INFO rasa_sdk.endpoint - Starting action endpoint server...
2021-05-16 19:35:31 INFO root - Load pretrained SentenceTransformer: bert-base-nli-mean-tokens
2021-05-16 19:35:31 INFO root - Load SentenceTransformer from folder: bert-base-nli-mean-tokens
2021-05-16 19:35:34 INFO root - Use pytorch device: cpu
Standard question size 1000
Start to calculate encoder....
Batches: 0%| | 0/125 [00:00<?, ?it/s]Truncation was not explicitly activated but max_length is provided a specific value, please use truncation=True to explicitly truncate examples to max length. Defaulting to 'longest_first' truncation strategy. If you encode pairs of sequences (GLUE-style) with the tokenizer you can select this strategy more precisely by providing a specific strategy to truncation.
/opt/rasa/venv/lib/python3.6/site-packages/transformers/tokenization_utils_base.py:2143: FutureWarning: The pad_to_max_length argument is deprecated and will be removed in a future version, use padding=True or padding='longest' to pad to the longest sequence in the batch, or use padding='max_length' to pad to a max length. In this case, you can give a specific length with max_length (e.g. max_length=45) or leave max_length to None to pad to the maximal input size of the model (e.g. 512 for Bert).
FutureWarning,
Batches: 0%| | 0/125 [00:00<?, ?it/s]
Traceback (most recent call last):
  File "/opt/rasa/venv/bin/rasa", line 8, in <module>
    sys.exit(main())
  File "/opt/rasa/venv/lib/python3.6/site-packages/rasa/__main__.py", line 92, in main
    cmdline_arguments.func(cmdline_arguments)
  File "/opt/rasa/venv/lib/python3.6/site-packages/rasa/cli/run.py", line 52, in run_actions
    sdk.main_from_args(args)
  File "/opt/rasa/venv/lib/python3.6/site-packages/rasa_sdk/__main__.py", line 21, in main_from_args
    args.auto_reload,
  File "/opt/rasa/venv/lib/python3.6/site-packages/rasa_sdk/endpoint.py", line 137, in run
    action_package_name, cors_origins=cors_origins, auto_reload=auto_reload
  File "/opt/rasa/venv/lib/python3.6/site-packages/rasa_sdk/endpoint.py", line 80, in create_app
    executor.register_package(action_package_name)
  File "/opt/rasa/venv/lib/python3.6/site-packages/rasa_sdk/executor.py", line 250, in register_package
    self._import_submodules(package)
  File "/opt/rasa/venv/lib/python3.6/site-packages/rasa_sdk/executor.py", line 206, in _import_submodules
    package = self._import_module(package)
  File "/opt/rasa/venv/lib/python3.6/site-packages/rasa_sdk/executor.py", line 227, in _import_module
    module = importlib.import_module(name)
  File "/usr/local/lib/python3.6/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 994, in _gcd_import
  File "<frozen importlib._bootstrap>", line 971, in _find_and_load
  File "<frozen importlib._bootstrap>", line 955, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 678, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/opt/rasa/rasa-faq/actions.py", line 94, in <module>
    encode_standard_question(sentence_transformer_select, pretrained_model)
  File "/opt/rasa/rasa-faq/actions.py", line 86, in encode_standard_question
    standard_questions_encoder = torch.tensor(bc.encode(standard_questions)).numpy()
  File "/opt/rasa/venv/lib/python3.6/site-packages/sentence_transformers/SentenceTransformer.py", line 150, in encode
    out_features = self.forward(features)
  File "/opt/rasa/venv/lib/python3.6/site-packages/torch/nn/modules/container.py", line 119, in forward
    input = module(input)
  File "/opt/rasa/venv/lib/python3.6/site-packages/torch/nn/modules/module.py", line 889, in _call_impl
    result = self.forward(*input, **kwargs)
  File "/opt/rasa/venv/lib/python3.6/site-packages/sentence_transformers/models/BERT.py", line 33, in forward
    output_states = self.bert(**features)
  File "/opt/rasa/venv/lib/python3.6/site-packages/torch/nn/modules/module.py", line 889, in _call_impl
    result = self.forward(*input, **kwargs)
  File "/opt/rasa/venv/lib/python3.6/site-packages/transformers/models/bert/modeling_bert.py", line 912, in forward
    batch_size, seq_length = input_shape
ValueError: not enough values to unpack (expected 2, got 1)
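This crash usually indicates a version mismatch between the installed sentence-transformers and transformers packages: BERT's forward pass expects a 2-D (batch_size, seq_length) tensor of token ids but receives a 1-D one. The sketch below only illustrates the unpacking failure itself, not the library internals:

```python
def unpack_shape(input_shape):
    """Mimics `batch_size, seq_length = input_shape` in modeling_bert.py."""
    batch_size, seq_length = input_shape  # raises ValueError if 1-D
    return batch_size, seq_length

try:
    unpack_shape((5,))          # 1-D shape: one sequence, no batch axis
except ValueError as e:
    print(e)                    # not enough values to unpack (expected 2, got 1)

print(unpack_shape((1, 5)))     # with a batch dimension, unpacking succeeds
```

Pinning sentence-transformers and transformers to mutually compatible versions (as listed in the repository's requirements) is the usual way out, since newer tokenizers return inputs in a shape the older model wrapper does not expect.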

Bot in Rasa X doesn't answer

I tried to run a simple example with the default data. However, when I use the interactive bot, no matter what I write, the bot gets stuck after action_listen and doesn't return any response.

The steps I followed on a Windows laptop:

  1. Downloaded this repository
  2. Installed BERT-as-a-service and ran the .sh file

The command shows this:

I:WORKER-0:ready and listening!
I:WORKER-1:ready and listening!
I:VENTILATOR:all set, ready to serve request!
I:VENTILATOR:new config request req id: 1       client: b'8e0660af-7981-49c0-a317-b620f0c8fe91'
I:SINK:send config      client b'8e0660af-7981-49c0-a317-b620f0c8fe91'
  3. Then I installed both Rasa and Rasa X. For Rasa X, I had to do it with multipass, as the official documentation says.
  4. Then, in the project folder, I ran the command rasa run actions

I get this response

2020-05-14 11:33:38 INFO     rasa_sdk.endpoint  - Starting action endpoint server...
(1000, 3072)
2020-05-14 11:33:44 INFO     rasa_sdk.executor  - Registered function for 'action_get_answer'.

Then, in Rasa X, I added FAQ.md in the NLU tab and trained the model. I set it as active and tried the interactive bot. But as I said, there is no response; the dots just hang there as if someone were typing.
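When the bot hangs after action_listen, a common cause is that Rasa X cannot reach the action server at all (e.g. action_endpoint in endpoints.yml points at the wrong host or port). A minimal reachability probe, assuming the rasa-sdk default port 5055 and a /health route (adjust both if your setup differs):

```python
import urllib.request
from urllib.error import URLError

def action_server_reachable(url: str, timeout: float = 3.0) -> bool:
    """Return True if any HTTP response comes back from the given URL."""
    try:
        with urllib.request.urlopen(url, timeout=timeout):
            return True
    except (URLError, OSError):
        return False

# Probe the action server from the machine running Rasa X.
print(action_server_reachable("http://localhost:5055/health"))
```

If this prints False while `rasa run actions` is running, the problem is networking or configuration (particularly likely with the multipass setup, where "localhost" inside the VM is not the host machine), not the action code itself.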

Cannot install rasa x 0.20.1

There are lots of compatibility issues here.

I cannot install rasa x 0.20.1, so I had to install a newer version, which eventually requires installing TensorFlow 2.x; bert-as-service cannot run on TF 2.x and only works with TF < 1.15.

It's a mystery how you can install rasa x 0.20.1...

Updated: I no longer install Rasa X; instead I use rasa train normally, and then rasa shell to interact with the chatbot. However, a segmentation fault appears. I tried on both my Mac machines, with the same fault.

error after executing rasa run actions

(venv) root@rasadev:/opt/rasa/rasa-faq# rasa run actions
2021-05-13 20:01:39 INFO rasa_sdk.endpoint - Starting action endpoint server...
2021-05-13 20:01:39 ERROR rasa_sdk.executor - Failed to register package 'actions'.
Traceback (most recent call last):
  File "/opt/rasa/venv/lib/python3.8/site-packages/rasa_sdk/executor.py", line 262, in register_package
    self._import_submodules(package)
  File "/opt/rasa/venv/lib/python3.8/site-packages/rasa_sdk/executor.py", line 218, in _import_submodules
    package = self._import_module(package)
  File "/opt/rasa/venv/lib/python3.8/site-packages/rasa_sdk/executor.py", line 239, in _import_module
    module = importlib.import_module(name)
  File "/usr/lib/python3.8/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
  File "<frozen importlib._bootstrap>", line 991, in _find_and_load
  File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 783, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/opt/rasa/rasa-faq/actions.py", line 11, in <module>
    import torch
ModuleNotFoundError: No module named 'torch'
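This error simply means torch was never installed into the virtualenv that runs the action server. A small helper (hypothetical name `has_module`) confirms which interpreter is missing it before re-running `rasa run actions`:

```python
import importlib.util
import sys

def has_module(name: str) -> bool:
    """Return True if `name` can be imported by the current interpreter."""
    return importlib.util.find_spec(name) is not None

# Run this with the same interpreter the action server uses, e.g.
# /opt/rasa/venv/bin/python. If it prints False, install torch into
# that venv (pip install torch) rather than into the system Python.
print(sys.executable)
print(has_module("torch"))
```

Checking `sys.executable` matters here because `rasa run actions` picks up whatever environment is active; installing torch into a different Python leaves the error in place.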

installation issue with deep learning linux ami ec2

Hello Team,

I have to install rasa-faq-bot on an Amazon EC2 instance, but TensorFlow has issues installing while installing Rasa. Can you send some guidelines for properly installing this on the Deep Learning Linux AMI? Looking forward to hearing from you.

Operand/shape error

Sending a query from Rasa to the action server generates the following error:
score = np.sum((self.standard_questions_encoder * query_vector), axis=1) / (self.standard_questions_encoder_len * (np.sum(query_vector * query_vector) ** 0.5))
ValueError: operands could not be broadcast together with shapes (1000,3072) (768,)
The very similar code from the bert-as-service example,
score = np.sum(query_vec * doc_vecs, axis=1) / np.linalg.norm(doc_vecs, axis=1)
works as expected.
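The broadcast failure means the stored question matrix and the incoming query were produced by different encoders: 3072-dim rows suggest bert-as-service configured to concatenate several layers, while the 768-dim query looks like a single-layer BERT embedding. Once both sides use the same encoder, the scoring line works; a self-contained numpy sketch with made-up data and dimensions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Both arrays must come from the SAME encoder so the dimensions match.
doc_vecs = rng.random((1000, 768))   # one row per standard question
query_vec = rng.random(768)          # the incoming user query

# Cosine similarity of the query against every stored question.
scores = doc_vecs @ query_vec / (
    np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(query_vec)
)
best = int(np.argmax(scores))
print(scores.shape, best)            # a (1000,) score vector and the best index
```

Note that this divides by the norms of both sides, whereas the bert-as-service example above only normalizes doc_vecs; dividing by the query norm as well does not change the argmax, only the scale of the scores.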
