workshops's People

Contributors

abhinavsp0730, ageron, amygdala, andsteing, arjung, cesar-ilharco, charlesccychen, gunan, hanneshapke, irenegi, pmontelo, random-forests, rcrowe-google, sararob, serenalwang, sscardapane, tcmetzger


workshops's Issues

Dataflow fails on Transform component in "TFX_Pipeline_for_Bert_Preprocessing"

@hanneshapke

I tried out TFX_Pipeline_for_Bert_Preprocessing on GCP AI Platform Pipelines with the Dataflow option turned on.

However, it failed with the following error message:

tensorflow.python.framework.errors_impl.NotFoundError: Converting GraphDef to Graph has failed. The binary trying to import the GraphDef was built when GraphDef version was 440. The GraphDef was produced by a binary built when GraphDef version was 561. The difference between these versions is larger than TensorFlow's forward compatibility guarantee. The following error might be due to the binary trying to import the GraphDef being too old: Op type not registered 'CaseFoldUTF8' in binary running on beamapp-root-0218001007-4-02171610-z5bw-harness-3q6c. Make sure the Op and Kernel are registered in the binary running in this process. Note that if you are loading a saved graph which used ops from tf.contrib, accessing (e.g.) `tf.contrib.resampler` should be done before importing the graph, as contrib ops are lazily registered when the module is first accessed

Any thoughts?
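
The missing op ('CaseFoldUTF8') is registered by tensorflow_text, and the GraphDef version gap suggests the Dataflow workers run an older TensorFlow build than the environment that produced the graph. A minimal sketch of shipping matching dependencies to the workers through the Beam pipeline args (the project id, bucket, and file name below are illustrative):

  # 'CaseFoldUTF8' comes from tensorflow_text, so the Dataflow workers need the
  # same tensorflow / tensorflow-text / tfx versions as the machine that built
  # the pipeline graph.
  beam_pipeline_args = [
      "--runner=DataflowRunner",
      "--project=my-gcp-project",            # hypothetical project id
      "--region=us-central1",
      "--temp_location=gs://my-bucket/tmp",  # hypothetical bucket
      # Ship pinned dependencies (tensorflow, tensorflow-text, tfx) to workers:
      "--requirements_file=requirements.txt",
  ]

  # Then pass these args into the TFX pipeline definition:
  # pipeline.Pipeline(..., beam_pipeline_args=beam_pipeline_args)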

TFX Evaluator takes too long in "TFX_Pipeline_for_Bert_Preprocessing"

@hanneshapke

It seems like the Evaluator component takes too long (more than 2 hours, and it still hadn't finished) in the Kubeflow environment on GCP AI Platform Pipelines. This is very unexpected compared with the notebook version, which took less than 5 minutes with a GPU.

  • I have tried a number of different VM options with different CPU/memory configurations (but not GPU, because the GCP team didn't grant me a larger GPU quota)

I assume that environments with and without a GPU behave differently (since the Evaluator evaluates two models [the blessed one and the current one] by running inference on the inputs). If that is the case, the problem is that I want to allocate one GPU Kubernetes node to one specific TFX component; otherwise I would have to equip every single node with a GPU, which is not desirable.

Any thoughts?
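
One way to pin a single component to a GPU node pool, sketched under the assumption that the pipeline is compiled with TFX's KubeflowDagRunner on KFP v1 (the accelerator label value is illustrative):

  from tfx.orchestration.kubeflow import kubeflow_dag_runner

  def _gpu_for_evaluator(container_op):
      # Request a GPU, and a GPU node pool, only for the Evaluator pod.
      if "evaluator" in container_op.name.lower():
          container_op.set_gpu_limit(1)
          container_op.add_node_selector_constraint(
              "cloud.google.com/gke-accelerator", "nvidia-tesla-t4")  # hypothetical GPU type

  runner_config = kubeflow_dag_runner.KubeflowDagRunnerConfig(
      pipeline_operator_funcs=(
          kubeflow_dag_runner.get_default_pipeline_operator_funcs()
          + [_gpu_for_evaluator]))

With this, only the Evaluator pod is scheduled onto the GPU node pool, so the rest of the cluster can stay CPU-only.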

Very confused with the TensorFlow Java API

I have trained a model using Python 3.7 and TF 2.7 and saved it in the SavedModel format. Its signature looks like this:

MetaGraphDef with tag-set: 'serve' contains the following SignatureDefs:

signature_def['serving_default']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['examples'] tensor_info:
        dtype: DT_STRING
        shape: (-1)
        name: input_example_tensor:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['cf_1'] tensor_info:
        dtype: DT_INT64
        shape: (-1, 1)
        name: ParseExample/ParseExampleV2:0
    outputs['cf_2'] tensor_info:
        dtype: DT_INT64
        shape: (-1, 1)
        name: ParseExample/ParseExampleV2:1
    outputs['cf_label'] tensor_info:
        dtype: DT_INT64
        shape: (-1, 1)
        name: ParseExample/ParseExampleV2:2
    outputs['cf_id'] tensor_info:
        dtype: DT_INT64
        shape: (-1, 1)
        name: ParseExample/ParseExampleV2:3
    outputs['score'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 1)
        name: score:0
  Method name is: tensorflow/serving/predict

I want to load it and run predictions from Spark, but I'm really confused by the model input: a serialized Example object. I can load the model and predict from Python, like this:

def model_predict(example_proto):
    # Serialize the tf.train.Example proto and wrap it in a string tensor.
    exam_input = tf.constant([example_proto.SerializeToString()])
    return model.signatures['serving_default'](exam_input)

Here example_proto is my Example object; with the SerializeToString method it works. But when I do the same thing from Spark, an error is always reported. The code is:

val result = sparkEnv.spark.read.parquet(inputPath).map(item => {
  val example = convert2Example(schemaInfo, item)
  val map = new java.util.HashMap[String, Tensor]()

  val tensor = TString.vectorOf(new String(example.toByteArray, Charset.forName("UTF-8")))

  map.put("examples", tensor)
  val score = model.value.call(map).get("score")
  score.toString
}).rdd

Is there any way to serve an Estimator model whose input is an Example proto from Java?
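
A likely culprit, demonstrated in a minimal Python sketch (the feature name is illustrative): a serialized Example is raw protobuf bytes, not valid UTF-8 text, so round-tripping it through a UTF-8 String, as the Spark code above does, can corrupt it. The string tensor should carry the raw bytes; in TF Java that would mean building it from bytes (e.g. TString.tensorOfBytes) rather than from a java.lang.String.

  import tensorflow as tf

  example = tf.train.Example(features=tf.train.Features(feature={
      # hypothetical feature, just to get some non-ASCII bytes into the proto
      "cf_1": tf.train.Feature(int64_list=tf.train.Int64List(value=[1234567])),
  }))
  raw = example.SerializeToString()              # raw protobuf bytes
  round_tripped = raw.decode("utf-8", "replace").encode("utf-8")
  print(raw == round_tripped)                    # False when the bytes aren't valid UTF-8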

Workshop "File has moved"

Hi, the only working notebook in the master branch is the mnt one. I saw someone else raised this issue, but I was wondering whether the fix was merged, because the notebooks are still not working.

Thanks.

Real-time forecasting/prediction not working as expected

I am trying to apply your approach to a real-time scenario where I receive data and need to produce a forecast for it.
But I don't understand why the output I am getting changes over time.
Please guide me on how I should proceed for real-time deployment of the saved model.

Use LSTM

Hello, thank you very much for your work. I need to add an LSTM layer in workshops/extras/keras-bag-of-words/keras-bow-model.ipynb. Can you help me? I tried to insert
model.add(LSTM(activation='softmax', units=300, recurrent_activation='hard_sigmoid', return_sequences=True)) after model.add(Dropout(0.5)), but I get the error: Input 0 is incompatible with layer lstm_1: expected ndim=3, found ndim=2.
Sorry for my bad English.
Thank you very much. Bye.
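
The error arises because an LSTM expects 3-D input of shape (batch, timesteps, features), while the bag-of-words vector the notebook feeds into its Dense/Dropout layers is 2-D (batch, features). A minimal sketch of one way around it, feeding token-id sequences through an Embedding layer instead of the bag-of-words vector (vocabulary size, sequence length, and class count are illustrative):

  from tensorflow.keras.models import Sequential
  from tensorflow.keras.layers import Embedding, LSTM, Dropout, Dense

  vocab_size, max_len, num_classes = 10000, 100, 20  # hypothetical sizes

  model = Sequential([
      # Token-id sequences (batch, max_len) become 3-D here: (batch, max_len, 128)
      Embedding(vocab_size, 128, input_length=max_len),
      # The last recurrent layer before Dense should not return sequences
      LSTM(300, recurrent_activation='hard_sigmoid', return_sequences=False),
      Dropout(0.5),
      Dense(num_classes, activation='softmax'),
  ])
  model.compile(loss='categorical_crossentropy', optimizer='adam',
                metrics=['accuracy'])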

Could not parse example input in TFX_Pipeline_for_Bert_Preprocessing

Hi, could you please advise where I'm going wrong?
I don't have much experience and I'm trying to figure out how this works.

I tried to use a model built with your TFX_Pipeline_for_Bert_Preprocessing.ipynb, but when I try to serve it via TF Serving I receive: "error": "Could not parse example input, value: 'You are very good person'\n\t [[{{node ParseExample/ParseExampleV2}}]]"

My steps:

  1. Download the TFX_Pipeline_for_Bert_Preprocessing.ipynb notebook locally
  2. Change the "/content/..." folder to "/tmp/..."
  3. Change the dataset version from "0.1.0" to "1.0.0", since only 1.0.0 is available
  4. Install dependencies and build the model locally
  5. Run TF Serving via Docker and point it at the built model
  6. Make the request curl -d '{"instances": ["You are very good person"]}' -X POST --output - http://localhost:8501/v1/models/my_model:predict
    and receive { "error": "Could not parse example input, value: 'You are very good person'\n\t [[{{node ParseExample/ParseExampleV2}}]]" }

So I assume the model was trained with a serialized tf.Example tensor as its input. Also, at the end of your notebook there is a test exercising the model's "serving_default" signature, and there too a tensor is fed to the model.

How can I pass raw text in a request to TF Serving? Should TF Serving convert the string to a tensor?

Could you please advise where I'm wrong? I've spent more than a week trying to solve this.
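
The signature does parse serialized tf.train.Example protos, and the TF Serving REST API expects such binary inputs base64-encoded under a "b64" key rather than as raw text. A minimal sketch of building the request in Python (the feature name "text" is an assumption and must match the schema the notebook exports; "my_model" is the served model name from the curl command above):

  import base64
  import json
  import requests
  import tensorflow as tf

  # Wrap the raw text in a tf.train.Example, then serialize it.
  example = tf.train.Example(features=tf.train.Features(feature={
      "text": tf.train.Feature(bytes_list=tf.train.BytesList(
          value=[b"You are very good person"])),  # hypothetical feature name
  }))
  payload = {"instances": [
      {"examples": {"b64": base64.b64encode(
          example.SerializeToString()).decode("utf-8")}}
  ]}
  resp = requests.post("http://localhost:8501/v1/models/my_model:predict",
                       data=json.dumps(payload))
  print(resp.json())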
