Comments (15)
@sayakpaul Hi, I will check out your notebook and see what I can find out. I will ping you once I have more info.
Yes. I think I am going to try to do that and see how it goes. Will keep you updated.
Thank you for finding this issue for us in BentoML @sayakpaul
@sayakpaul Hey, I created a quick solution that should fix this: #97
What I did was create a TF model wrapper that loads and predicts with the same session and graph. I am not sure this is a good long-term solution for this problem; a better long-term solution might be to separate the request-handling layer (Flask or another front end) from the inference server (TensorFlow Serving, etc.).
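(For illustration, a minimal sketch of that kind of wrapper, assuming TF 1.x graph/session semantics; the class and method names here are hypothetical and may not match what #97 actually does:)
import tensorflow as tf
from tensorflow import keras

class KerasModelWrapper:
    # Hypothetical wrapper: pin the model to one graph/session at load time
    # and reuse them for every predict call, regardless of the calling thread.
    def __init__(self, model_path):
        self.graph = tf.Graph()
        with self.graph.as_default():
            self.sess = tf.Session(graph=self.graph)
            with self.sess.as_default():
                self.model = keras.models.load_model(model_path)

    def predict(self, inputs):
        with self.graph.as_default():
            with self.sess.as_default():
                return self.model.predict(inputs)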
I tested the patch locally with your notebook. I will do a few more tests and then merge this PR. If you test your notebook with this branch and find any issues, please ping me.
Thank you for reporting the bug. Have a good weekend!
@sayakpaul Yes, and you can build BentoML locally like this:
git clone https://github.com/bentoml/BentoML.git
cd BentoML
# fetch the remote branch pull request #97
git fetch origin pull/97/head:pr-97
# switch to branch
git checkout pr-97
# install bentoML with local changes
pip install .
I will also add a local development document next week to make it easier for people who want to contribute to BentoML.
@yubozhao @parano it works as expected now. Here is the updated repository. Thank you very much to both of you for your sincere and generous help. Really appreciate it.
Thanks for reporting the issue, @sayakpaul. I ran your BentoService definition code with the model from our tf-keras example https://github.com/bentoml/BentoML/blob/master/examples/tf-keras-text-classification/tf-keras-text-classification.ipynb but could not reproduce this issue.
The only way I can reproduce the exact error is by changing @artifacts([TfKerasModelArtifact('model')]) to @artifacts(TfKerasModelArtifact('model')). Note that @artifacts takes a list of artifact definitions. We should add a better error message here, or just make it work when a single artifact is provided.
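(To make the difference concrete, here is the working form next to the one that reproduces the error; the import paths are assumed from the 0.x-era BentoML API used in this thread:)
from bentoml import BentoService, env, artifacts
from bentoml.artifact import TfKerasModelArtifact

# Works: @artifacts takes a list of artifact definitions.
@artifacts([TfKerasModelArtifact('model')])
@env(conda_dependencies=['tensorflow'])
class WorkingService(BentoService):
    pass

# Reproduces the reported error: a single artifact passed without a list.
@artifacts(TfKerasModelArtifact('model'))
@env(conda_dependencies=['tensorflow'])
class BrokenService(BentoService):
    pass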
On a related note, you can also put the vectorizer into the list of artifacts; that way you don't need to re-fit it on every API call. For example:
# Imports follow the 0.x-era BentoML API used in this thread.
from string import digits

import pandas as pd
from sklearn.feature_extraction.text import CountVectorizer
from tensorflow import keras

from bentoml import BentoService, api, artifacts, env
from bentoml.artifact import TfKerasModelArtifact, PickleArtifact
from bentoml.handlers import JsonHandler


@artifacts([
    TfKerasModelArtifact('model'),
    PickleArtifact('vectorizer')
])
@env(conda_dependencies=['tensorflow', 'pandas', 'scikit-learn'])
class TextClassificationService(BentoService):

    def remove_digits(self, s):
        remove_digits = str.maketrans('', '', digits)
        return s.translate(remove_digits)

    @api(JsonHandler)
    def predict(self, parsed_json):
        text = parsed_json['text']
        text = self.remove_digits(text)
        # transform expects an iterable of documents, so wrap the single string in a list
        features = self.artifacts.vectorizer.transform([text])
        prediction = self.artifacts.model.predict_classes(features)
        response = {'Sentiment': prediction}
        return response


model = keras.Sequential()
model.fit(...)

vectorizer = CountVectorizer(stop_words=None, lowercase=True,
                             ngram_range=(1, 1), min_df=2, binary=True)
train = pd.read_csv('https://raw.githubusercontent.com/Nilabhra/kolkata_nlp_workshop_2019/master/data/train.csv')
vectorizer.fit_transform(train['text'])

svc = TextClassificationService.pack(model=model, vectorizer=vectorizer)
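(A possible next step once the service is packed; the save call and serve command below are assumptions based on the 0.x-era workflow and the archive path visible in the traceback further down, not confirmed API:)
# In-process sanity check (same call shown later in this thread)
print(svc.predict({'text': 'Loved the chicken noodles, great ambience!'}))

# Assumed 0.x-era workflow: archive the service, then serve it over REST, e.g.
#   saved_path = svc.save('./text_classification')
#   $ bentoml serve ./text_classification/TextClassificationService/<version>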
Hi @parano, it works now. Thank you very much, especially for the second suggestion about adding the vectorizer as a PickleArtifact. I am able to run inference like this:
svc.predict({"text": "I had a wonderful experience eating their chicken noodles! Also loved the ambience."})
However, after I serve the model as a REST API using bentoml serve, it starts to behave weirdly:
File "/miniconda3/lib/python3.7/site-packages/flask/app.py", line 2292, in wsgi_app
response = self.full_dispatch_request()
File "/miniconda3/lib/python3.7/site-packages/flask/app.py", line 1815, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/miniconda3/lib/python3.7/site-packages/flask/app.py", line 1718, in handle_user_exception
reraise(exc_type, exc_value, tb)
File "/miniconda3/lib/python3.7/site-packages/flask/_compat.py", line 35, in reraise
raise value
File "/miniconda3/lib/python3.7/site-packages/flask/app.py", line 1813, in full_dispatch_request
rv = self.dispatch_request()
File "/miniconda3/lib/python3.7/site-packages/flask/app.py", line 1799, in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
File "/miniconda3/lib/python3.7/site-packages/bentoml/server/bento_api_server.py", line 104, in wrapper
response = api.handle_request(request)
File "/miniconda3/lib/python3.7/site-packages/bentoml/service.py", line 85, in handle_request
return self.handler.handle_request(request, self.func)
File "/miniconda3/lib/python3.7/site-packages/bentoml/handlers/json_handler.py", line 45, in handle_request
output = func(parsed_json)
File "./text_classification/TextClassificationService/0.0.2019_04_19_c1c58d4d/TextClassificationService/text_classification_service.py", line 22, in predict
prediction = self.artifacts.model.predict_classes(text)
File "/miniconda3/lib/python3.7/site-packages/tensorflow/python/keras/engine/sequential.py", line 311, in predict_classes
proba = self.predict(x, batch_size=batch_size, verbose=verbose)
File "/miniconda3/lib/python3.7/site-packages/tensorflow/python/keras/engine/training.py", line 1113, in predict
self, x, batch_size=batch_size, verbose=verbose, steps=steps)
File "/miniconda3/lib/python3.7/site-packages/tensorflow/python/keras/engine/training_arrays.py", line 195, in model_iteration
f = _make_execution_function(model, mode)
File "/miniconda3/lib/python3.7/site-packages/tensorflow/python/keras/engine/training_arrays.py", line 122, in _make_execution_function
return model._make_execution_function(mode)
File "/miniconda3/lib/python3.7/site-packages/tensorflow/python/keras/engine/training.py", line 1989, in _make_execution_function
self._make_predict_function()
File "/miniconda3/lib/python3.7/site-packages/tensorflow/python/keras/engine/training.py", line 1979, in _make_predict_function
**kwargs)
File "/miniconda3/lib/python3.7/site-packages/tensorflow/python/keras/backend.py", line 3201, in function
return GraphExecutionFunction(inputs, outputs, updates=updates, **kwargs)
File "/miniconda3/lib/python3.7/site-packages/tensorflow/python/keras/backend.py", line 2939, in __init__
with ops.control_dependencies(self.outputs):
File "/miniconda3/lib/python3.7/site-packages/tensorflow/python/framework/ops.py", line 5028, in control_dependencies
return get_default_graph().control_dependencies(control_inputs)
File "/miniconda3/lib/python3.7/site-packages/tensorflow/python/framework/ops.py", line 4528, in control_dependencies
c = self.as_graph_element(c)
File "/miniconda3/lib/python3.7/site-packages/tensorflow/python/framework/ops.py", line 3478, in as_graph_element
return self._as_graph_element_locked(obj, allow_tensor, allow_operation)
File "/miniconda3/lib/python3.7/site-packages/tensorflow/python/framework/ops.py", line 3557, in _as_graph_element_locked
raise ValueError("Tensor %s is not an element of this graph." % obj)
ValueError: Tensor Tensor("dense_2/Sigmoid:0", shape=(?, 1), dtype=float32) is not an element of this graph.
I have tried the suggestions mentioned in this and this issue, but they still do not seem to work. Here's the notebook for your convenience.
Hey @yubozhao sure. Thanks.
Hi @sayakpaul, this is pretty interesting. I was able to reproduce this error only in the REST API server. I ran both the bentoml.load method and the CLI predict command, and both work as you expected. I will research this problem a little more and then get back to you.
I think this is where the issue is:
Flask uses multiple threads. The problem you are running into is that the TensorFlow model is not loaded and used in the same thread. One workaround is to force TensorFlow to use the global default graph.
I will test a couple of things during TfKerasModelArtifact loading; if that works, it should solve this issue.
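(For reference, a common shape of that global-default-graph workaround, assuming TF 1.x; this is a sketch of the pattern, not the exact change that went into BentoML:)
import tensorflow as tf
from tensorflow import keras

# Load the model once and capture the graph it was built in.
model = keras.models.load_model('model.h5')  # placeholder path
graph = tf.get_default_graph()

def predict(features):
    # Run inference inside the graph captured at load time, so it still works
    # when Flask dispatches the request on a different worker thread.
    with graph.as_default():
        return model.predict(features)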
Link to the issue comment: tensorflow/tensorflow#14356 (comment)
Yeah, I read up on it yesterday. I think you will have to include the global-graph handling in the section where BentoML loads the model. Let me know how it goes. @yubozhao
You're always welcome, @yubozhao. You guys are doing a pretty wonderful job. Serving machine learning models as API endpoints has never been this easy.
@yubozhao on it. Will update once done. Just to confirm, do I need to build BentoML locally from that branch, or something else?