
openrec's People

Contributors

christinatsan, christycui, dependabot[bot], ebagdasa, kellywang95, tayo, whongyi, ylongqi

openrec's Issues

Could not locate download_dataset.sh

Hi there. Thank you for making this tool. I'd like to try it out, but after I ran pip install openrec no new folder was created.
I then went to /usr/local/lib/python2.7/site-packages and found the openrec folder, but could not locate the download_dataset.sh file. Can you please show me where it is? Thanks.

How to get user/item embeddings?

I'm having trouble getting the user/item embeddings (or representations) out of this framework.
I'm very interested in your CML model. I want to use it with the MovieLens dataset and evaluate it against other recommendation algorithms like libMF. Is there an option, or any plan, to implement this feature?
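For what it's worth, once the user and item embedding matrices have been pulled out of a trained model as numpy arrays (for example, by evaluating the underlying TensorFlow variables), ranking under CML is just a nearest-neighbour query in the learned metric space. A minimal sketch, where the `user_emb`/`item_emb` matrices are hypothetical stand-ins for the extracted embeddings:

```python
import numpy as np

# Hypothetical embedding matrices standing in for ones extracted
# from a trained model (not openrec's actual API).
rng = np.random.RandomState(0)
user_emb = rng.rand(5, 8)   # 5 users, 8-dimensional embeddings
item_emb = rng.rand(10, 8)  # 10 items

def cml_rank(user_id, user_emb, item_emb, top_k=3):
    """Rank items for a user by Euclidean distance (smaller = better),
    as Collaborative Metric Learning does."""
    dists = np.linalg.norm(item_emb - user_emb[user_id], axis=1)
    return np.argsort(dists)[:top_k]

print(cml_rank(0, user_emb, item_emb))
```

Because CML learns a metric space, smaller Euclidean distance means a better match; a dot-product model like BPR would instead score with `item_emb @ user_emb[user_id]` and sort descending.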

Request for a Demo Code for Saving a Trained Model and Loading it for Testing

Hi, I am a PhD student with a computer architecture background, and I am using OpenRec's DLRM as a benchmark for my research. I followed all the previous issues to figure out a way, but those links are no longer available. I would be extremely thankful if you could kindly provide example code or show me how to save the trained model, restore it later, and run inference on a test dataset.

Thank You &
Warm Regards,
Piyumal

How to convert JSON data to numpy to train with openrec?

Hi there. I'm trying to convert a JSON dataset I got from Yelp into a numpy array in order to train a recommender with openrec, and I don't know how to do it correctly.

I printed out the numpy data from the dataset folder to see if I could mimic the structure, but I didn't quite get it. Can someone please give me a quick tutorial on how to set up my data for training?

For example, I have a small set of data like this:

[
    {
        "funny": 0, 
        "user_id": "msQe1u7Z_XuqjGoqhB0J5g", 
        "review_id": "x7mDIiDB3jEiPGPHOmDzyw", 
        "text": "The pizza was okay. Not the best I've had. I prefer Biaggio's on Flamingo / Fort Apache. The chef there can make a MUCH better NY style pizza. The pizzeria @ Cosmo was over priced for the quality and lack of personality in the food. Biaggio's is a much better pick if youre going for italian - family owned, home made recipes, people that actually CARE if you like their food. You dont get that at a pizzeria in a casino. I dont care what you say...", 
        "business_id": "iCQpiavjjPzJ5_3gPD5Ebg", 
        "stars": 2, 
        "date": "2011-02-25", 
        "useful": 0, 
        "cool": 0
    }, 
    {
        "funny": 0, 
        "user_id": "msQe1u7Z_XuqjGoqhB0J5g", 
        "review_id": "dDl8zu1vWPdKGihJrwQbpw", 
        "text": "I love this place! My fiance And I go here atleast once a week. The portions are huge! Food is amazing. I love their carne asada. They have great lunch specials... Leticia is super nice and cares about what you think of her restaurant. You have to try their cheese enchiladas too the sauce is different And amazing!!!", 
        "business_id": "pomGBqfbxcqPv14c3XH-ZQ", 
        "stars": 5, 
        "date": "2012-11-13", 
        "useful": 0, 
        "cool": 0
    }, 
    {
        "funny": 1, 
        "user_id": "msQe1u7Z_XuqjGoqhB0J5g", 
        "review_id": "LZp4UX5zK3e-c5ZGSeo3kA", 
        "text": "Terrible. Dry corn bread. Rib tips were all fat and mushy and had no flavor. If you want bbq in this neighborhood go to john mulls roadkill grill. Trust me.", 
        "business_id": "jtQARsP6P-LbkyjbO1qNGg", 
        "stars": 1, 
        "date": "2014-10-23", 
        "useful": 3, 
        "cool": 1
    }, 
    {
        "funny": 0, 
        "user_id": "msQe1u7Z_XuqjGoqhB0J5g", 
        "review_id": "Er4NBWCmCD4nM8_p1GRdow", 
        "text": "Back in 2005-2007 this place was my FAVORITE thai place EVER. I'd go here ALLLLL the time. I never had any complaints. Once they started to get more known and got busy, their service started to suck and their portion sizes got cut in half. I have a huge problem with paying MORE for way less food. The last time I went there I had the Pork Pad se Ew and it tasted good, but I finished my plate and was still hungry. I used to know the manager here and she would greet me with a \"Hello Melissa, nice to see you again, diet coke & pad thai or pad se ew?\" Now a days, I know she still knows me but she disregards my presence. Also, I had asked her what was up with the new portion sizes and she had no answer for me. Great food but not worth the money. I havent been back in over a year because I refuse to pay $10-15 for dinner and still be hungry after. Sorry PinKaow, you are not what you used to be!!", 
        "business_id": "elqbBhBfElMNSrjFqW3now", 
        "stars": 2, 
        "date": "2011-02-25", 
        "useful": 2, 
        "cool": 0
    }, 
    {
        "funny": 0, 
        "user_id": "msQe1u7Z_XuqjGoqhB0J5g", 
        "review_id": "jsDu6QEJHbwP2Blom1PLCA", 
        "text": "Delicious healthy food. The steak is amazing. Fish and pork are awesome too. Service is above and beyond. Not a bad thing to say about this place. Worth every penny!", 
        "business_id": "Ums3gaP2qM3W1XcA5r6SsQ", 
        "stars": 5, 
        "date": "2014-09-05", 
        "useful": 0, 
        "cool": 0
    }, 
    {
        "funny": 0, 
        "user_id": "msQe1u7Z_XuqjGoqhB0J5g", 
        "review_id": "pfavA0hr3nyqO61oupj-lA", 
        "text": "This place sucks. The customer service is horrible. They dont serve food unless you order a pizza from a neighboring restaurant. Who does that? They dont control their crowd. Many times I've gone I've seen fights. The bartenders suck - I've almost got in a fight with one because she was a complete bitch. Refused to serve me a drink because she was \"busy\" celebrating her friends birthday BEHIND THE BAR. This place is ridiculous. I will NEVER go there again.. EVER.", 
        "business_id": "vgfcTvK81oD4r50NMjU2Ag", 
        "stars": 1, 
        "date": "2011-02-25", 
        "useful": 2, 
        "cool": 0
    }, 
    {
        "funny": 0, 
        "user_id": "msQe1u7Z_XuqjGoqhB0J5g", 
        "review_id": "brokEno2n7s4vrwmmUdr9w", 
        "text": "If you like Thai food, you have to try the original thai bbq. Their pad se ew is to DIE for. Their thai egg rolls are delicious. Basil beef will not let you down (its not on the menu anymore, you have to ask for it!) \n\nYes, the building is not as fancy as some other places. Yes, i've batted a fly off my plate more than once. Yes, I do NOT go to the bathroom their because I dont even WANT to know what it looks like... \n\nBUT.. the thai food is the best in town. The service rocks. And you can get a $25 gift cert. on Restaurant.com for $2. Can you beat that? I think NOT.\n\nThis is the only place my husband and I go for anniversarys, date nights, birthdays.. anything!! I recommend it to everyone I know. If you KNOW good thai food, go here.", 
        "business_id": "AxeQEz3-s9_1TyIo-G7UQw", 
        "stars": 5, 
        "date": "2011-10-10", 
        "useful": 1, 
        "cool": 0
    }
]

I can link the ratings to the restaurants' properties, but I don't know how to turn them into a numpy array :(
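A minimal sketch of the conversion: map the string IDs to contiguous integer indices, then pack them into a structured numpy array. The `'user_id'`/`'item_id'` field names and `int32` dtype below are assumptions that should be checked against the numpy files shipped in openrec's dataset folder:

```python
import json
import numpy as np

# A few Yelp-style review records (fields trimmed for brevity).
raw = json.loads("""[
  {"user_id": "msQe1u7Z", "business_id": "iCQpiavj", "stars": 2},
  {"user_id": "msQe1u7Z", "business_id": "pomGBqfb", "stars": 5},
  {"user_id": "aaaa1111", "business_id": "iCQpiavj", "stars": 4}
]""")

# Map the string IDs to contiguous integer indices starting at 0.
user_index = {u: i for i, u in enumerate(sorted({r["user_id"] for r in raw}))}
item_index = {b: i for i, b in enumerate(sorted({r["business_id"] for r in raw}))}

# Build a structured array; the field names/dtypes are assumptions
# to be verified against openrec's bundled dataset files.
data = np.array(
    [(user_index[r["user_id"]], item_index[r["business_id"]]) for r in raw],
    dtype=[("user_id", np.int32), ("item_id", np.int32)],
)
print(data["user_id"], data["item_id"])
```

The index dictionaries also give you the reverse mapping from integer IDs back to Yelp business IDs, which you will want when interpreting recommendations.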

Feature to export the built servegraph into saved_model.pb

Hi ylongqi. Thanks for the project. I enjoyed playing with the framework.

I have noticed that the Recommender object does not have functionality to export to a saved_model.pb. This is a very important function for me, as I want to serve my model via TensorFlow Serving.

I managed to add the feature myself and would like to contribute it to your project. Do you have a CONTRIBUTING.md so that I can follow your guidelines, if any?

About performance

Can you tell me the AUC score and log loss of your DLRM implementation on the Criteo example?

dlrm_model.save('my_model') cannot save model after training

I added one line of code to save the model at the end of tf2_examples/dlrm_criteo.py, but the save function does not work. The output is attached. It looks like it requires some information from the dataset inside the TensorFlow function call. Any idea how to fix the bug? I tried TensorFlow GPU versions 2.0, 2.1, and 2.2, with the same output.
dlrm_model.save("/home/chi/test/my_model", save_format="tf")

Bug output:
~/test/openrec/tf2_examples(master*) » python dlrm_criteo.py
2020-07-07 17:53:00.942611: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcuda.so.1
2020-07-07 17:53:00.966263: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:1006] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2020-07-07 17:53:00.966536: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1618] Found device 0 with properties:
name: GeForce RTX 2080 major: 7 minor: 5 memoryClockRate(GHz): 1.59
pciBusID: 0000:01:00.0
2020-07-07 17:53:00.967845: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.0
2020-07-07 17:53:00.983018: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10.0
2020-07-07 17:53:00.991269: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10.0
2020-07-07 17:53:00.994296: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10.0
2020-07-07 17:53:01.107993: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10.0
2020-07-07 17:53:01.216752: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10.0
2020-07-07 17:53:01.223077: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
2020-07-07 17:53:01.223144: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:1006] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2020-07-07 17:53:01.223415: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:1006] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2020-07-07 17:53:01.223640: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1746] Adding visible gpu devices: 0
2020-07-07 17:53:01.228401: I tensorflow/core/platform/cpu_feature_guard.cc:142] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
2020-07-07 17:53:01.372497: I tensorflow/core/platform/profile_utils/cpu_utils.cc:94] CPU Frequency: 2899885000 Hz
2020-07-07 17:53:01.376047: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x4e7eca0 executing computations on platform Host. Devices:
2020-07-07 17:53:01.376115: I tensorflow/compiler/xla/service/service.cc:175] StreamExecutor device (0): Host, Default Version
2020-07-07 17:53:01.485098: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:1006] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2020-07-07 17:53:01.486519: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x54a7720 executing computations on platform CUDA. Devices:
2020-07-07 17:53:01.486588: I tensorflow/compiler/xla/service/service.cc:175] StreamExecutor device (0): GeForce RTX 2080, Compute Capability 7.5
2020-07-07 17:53:01.490833: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:1006] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2020-07-07 17:53:01.492068: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1618] Found device 0 with properties:
name: GeForce RTX 2080 major: 7 minor: 5 memoryClockRate(GHz): 1.59
pciBusID: 0000:01:00.0
2020-07-07 17:53:01.492146: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.0
2020-07-07 17:53:01.492187: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10.0
2020-07-07 17:53:01.492221: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10.0
2020-07-07 17:53:01.492253: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10.0
2020-07-07 17:53:01.492288: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10.0
2020-07-07 17:53:01.492323: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10.0
2020-07-07 17:53:01.492359: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
2020-07-07 17:53:01.492514: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:1006] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2020-07-07 17:53:01.493762: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:1006] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2020-07-07 17:53:01.494894: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1746] Adding visible gpu devices: 0
2020-07-07 17:53:01.498147: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.0
2020-07-07 17:53:01.504498: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1159] Device interconnect StreamExecutor with strength 1 edge matrix:
2020-07-07 17:53:01.504586: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1165] 0
2020-07-07 17:53:01.504610: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1178] 0: N
2020-07-07 17:53:01.507360: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:1006] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2020-07-07 17:53:01.508746: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:1006] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2020-07-07 17:53:01.509974: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1304] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 6724 MB memory) -> physical GPU (device: 0, name: GeForce RTX 2080, pci bus id: 0000:01:00.0, compute capability: 7.5)
2020-07-07 17:53:07.765291: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10.0
Iter: 0, Loss: 0.27, AUC: 0.4053
Traceback (most recent call last):
File "dlrm_criteo.py", line 72, in
dlrm_model.save('/home/chi/test/my_model', save_format="tf")
File "/home/chi/tf2_0/lib/python3.7/site-packages/tensorflow_core/python/keras/engine/network.py", line 975, in save
signatures, options)
File "/home/chi/tf2_0/lib/python3.7/site-packages/tensorflow_core/python/keras/saving/save.py", line 115, in save_model
signatures, options)
File "/home/chi/tf2_0/lib/python3.7/site-packages/tensorflow_core/python/keras/saving/saved_model/save.py", line 74, in save
save_lib.save(model, filepath, signatures, options)
File "/home/chi/tf2_0/lib/python3.7/site-packages/tensorflow_core/python/saved_model/save.py", line 870, in save
checkpoint_graph_view)
File "/home/chi/tf2_0/lib/python3.7/site-packages/tensorflow_core/python/saved_model/signature_serialization.py", line 64, in find_function_to_export
functions = saveable_view.list_functions(saveable_view.root)
File "/home/chi/tf2_0/lib/python3.7/site-packages/tensorflow_core/python/saved_model/save.py", line 141, in list_functions
self._serialization_cache)
File "/home/chi/tf2_0/lib/python3.7/site-packages/tensorflow_core/python/keras/engine/base_layer.py", line 2422, in _list_functions_for_serialization
.list_functions_for_serialization(serialization_cache))
File "/home/chi/tf2_0/lib/python3.7/site-packages/tensorflow_core/python/keras/saving/saved_model/base_serialization.py", line 91, in list_functions_for_serialization
fns = self.functions_to_serialize(serialization_cache)
File "/home/chi/tf2_0/lib/python3.7/site-packages/tensorflow_core/python/keras/saving/saved_model/layer_serialization.py", line 79, in functions_to_serialize
serialization_cache).functions_to_serialize)
File "/home/chi/tf2_0/lib/python3.7/site-packages/tensorflow_core/python/keras/saving/saved_model/layer_serialization.py", line 94, in _get_serialized_attributes
serialization_cache)
File "/home/chi/tf2_0/lib/python3.7/site-packages/tensorflow_core/python/keras/saving/saved_model/model_serialization.py", line 47, in _get_serialized_attributes_internal
default_signature = save_impl.default_save_signature(self.obj)
File "/home/chi/tf2_0/lib/python3.7/site-packages/tensorflow_core/python/keras/saving/saved_model/save_impl.py", line 206, in default_save_signature
fn.get_concrete_function()
File "/home/chi/tf2_0/lib/python3.7/site-packages/tensorflow_core/python/eager/def_function.py", line 776, in get_concrete_function
self._initialize(args, kwargs, add_initializers_to=initializer_map)
File "/home/chi/tf2_0/lib/python3.7/site-packages/tensorflow_core/python/eager/def_function.py", line 408, in _initialize
*args, **kwds))
File "/home/chi/tf2_0/lib/python3.7/site-packages/tensorflow_core/python/eager/function.py", line 1848, in _get_concrete_function_internal_garbage_collected
graph_function, _, _ = self._maybe_define_function(args, kwargs)
File "/home/chi/tf2_0/lib/python3.7/site-packages/tensorflow_core/python/eager/function.py", line 2150, in _maybe_define_function
graph_function = self._create_graph_function(args, kwargs)
File "/home/chi/tf2_0/lib/python3.7/site-packages/tensorflow_core/python/eager/function.py", line 2041, in _create_graph_function
capture_by_value=self._capture_by_value),
File "/home/chi/tf2_0/lib/python3.7/site-packages/tensorflow_core/python/framework/func_graph.py", line 915, in func_graph_from_py_func
func_outputs = python_func(*func_args, **func_kwargs)
File "/home/chi/tf2_0/lib/python3.7/site-packages/tensorflow_core/python/eager/def_function.py", line 358, in wrapped_fn
return weak_wrapped_fn().wrapped(*args, **kwds)
File "/home/chi/tf2_0/lib/python3.7/site-packages/tensorflow_core/python/keras/saving/saving_utils.py", line 143, in _wrapped_model
outputs_list = nest.flatten(model(inputs=inputs, training=False))
File "/home/chi/tf2_0/lib/python3.7/site-packages/tensorflow_core/python/keras/engine/base_layer.py", line 847, in call
outputs = call_fn(cast_inputs, *args, **kwargs)
File "/home/chi/tf2_0/lib/python3.7/site-packages/tensorflow_core/python/autograph/impl/api.py", line 292, in wrapper
return func(*args, **kwargs)
TypeError: call() missing 2 required positional arguments: 'sparse_features' and 'label'
(tf2_0) --------

By the way, the dlrm_model.save_weights function works well; I can get the checkpoints.
Also, for TF 2.0 and 2.1 you should not use from tensorflow.data import Dataset; use tf.data.Dataset instead, or it will cause this bug:
https://github.com/tensorflow/tensorflow/issues/33022

How to load model and test with serve?

Hi there. I also have another open ticket, but that's almost settled. To avoid confusion, I'm making a new ticket here about how to test the model:
After training, I have a couple of files named like model.ckpt-40000.data-00000-of-00001.
@ylongqi showed me how to test using the serve method. However, the only argument through which I can feed data to that method is batch_data. Right now, each entry in my numpy array looks like (143, 135, 5), denoting user_id, item_id, and rating.
Would someone mind providing an example of how to load the model and call the serve method for testing, as I couldn't find one? Thank you very much!
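For the data-shaping half of the question, a hedged sketch: the (user_id, item_id, rating) triples can be packed into a structured numpy array before being handed to serve. The field names here are assumptions that should be matched against the recommender's declared inputs, and the `model.serve` call is shown only as a commented hypothetical:

```python
import numpy as np

# Raw (user_id, item_id, rating) triples, as in the question.
triples = [(143, 135, 5), (12, 7, 4), (98, 135, 3)]

# serve() typically consumes a structured batch keyed by input name;
# the 'user_id'/'item_id' field names below are assumptions to verify
# against the recommender's input definitions. Ratings are dropped
# because scoring, not the ground truth, is what serve computes.
batch_data = np.array(
    [(u, i) for (u, i, r) in triples],
    dtype=[("user_id", np.int32), ("item_id", np.int32)],
)
# scores = model.serve(batch_data)  # hypothetical call on a restored model
print(batch_data)
```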

Getting ResourceExhaustedError

Hi, I am trying to run BPR on my test dataset.
max_users: 326608
max_items: 458334
I was following the tutorial "OpenRec Tutorial #1", with 12 GB of GPU memory and 64 GB of RAM.

But while trying to build the BPR model with batch size 100, I get a ResourceExhaustedError.

Here is the part of last stack trace:

File "/mnt/data/virtualenv/tensorflow/lib/python3.5/site-packages/tensorflow/python/framework/ops.py", line 3392, in create_op
    op_def=op_def)
  File "/mnt/data/virtualenv/tensorflow/lib/python3.5/site-packages/tensorflow/python/framework/ops.py", line 1718, in __init__
    self._traceback = self._graph._extract_stack()  # pylint: disable=protected-access

ResourceExhaustedError (see above for traceback): OOM when allocating tensor of shape [458334,20] and type float
	 [[Node: item/embedding/Adam/Initializer/zeros = Const[dtype=DT_FLOAT, value=Tensor<type: float shape: [458334,20] values: [0 0 0]...>, _device="/job:localhost/replica:0/task:0/device:GPU:0"]()]]

Does this library not yet support inputs of this size, or am I doing something wrong? Can you please help?
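A quick back-of-the-envelope check suggests the embedding tables themselves are nowhere near 12 GB, so the OOM more likely comes from other allocations (for example, evaluation scoring every user against every item at once) than from the model parameters alone:

```python
# Size of the tensor named in the OOM message: [458334, 20] float32.
n_items, dim = 458334, 20
tensor_mb = n_items * dim * 4 / 1024 ** 2
print(f"one embedding tensor: {tensor_mb:.1f} MB")

# Adam keeps two slot variables (m and v) per parameter, so each
# embedding table costs roughly 3x its own size; users add a
# similar amount on top of the items.
n_users = 326608
total_mb = (n_items + n_users) * dim * 4 * 3 / 1024 ** 2
print(f"embeddings + Adam slots: {total_mb:.0f} MB")
```

That is well under 1 GB, which points away from the model size and toward intermediate evaluation tensors or memory fragmentation as the culprit.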

AttributeError: Can't pickle local object 'RandomPairwiseSampler.<locals>.batch'

Hi, I got the error below when trying to run any of the tf1 models on TensorFlow 1.4.
Do you know what the problem is? Perhaps my TensorFlow version is not supported?

AttributeError                            Traceback (most recent call last)
<ipython-input-33-3c464690e82a> in <module>
      4                     train_sampler=train_sampler,
      5                     eval_samplers=[val_sampler,test_sampler],
----> 6                     evaluators=[auc_evaluator])

~\.conda\envs\tf14\lib\site-packages\openrec\tf1\model_trainer.py in train(self, total_iter, eval_iter, save_iter, train_sampler, start_iter, eval_samplers, evaluators)
     63         self._eval_manager = EvalManager(evaluators=evaluators)
     64 
---> 65         train_sampler.reset()
     66         for sampler in eval_samplers:
     67             sampler.reset()

~\.conda\envs\tf14\lib\site-packages\openrec\tf1\utils\samplers\sampler.py in reset(self)
     49             runner = _Sampler(self._dataset, self._q, self._generate_batch)
     50             runner.daemon = True
---> 51             runner.start()
     52             self._runner_list.append(runner)
     53         self._start = True

~\.conda\envs\tf14\lib\multiprocessing\process.py in start(self)
    103                'daemonic processes are not allowed to have children'
    104         _cleanup()
--> 105         self._popen = self._Popen(self)
    106         self._sentinel = self._popen.sentinel
    107         # Avoid a refcycle if the target function holds an indirect

~\.conda\envs\tf14\lib\multiprocessing\context.py in _Popen(process_obj)
    221     @staticmethod
    222     def _Popen(process_obj):
--> 223         return _default_context.get_context().Process._Popen(process_obj)
    224 
    225 class DefaultContext(BaseContext):

~\.conda\envs\tf14\lib\multiprocessing\context.py in _Popen(process_obj)
    320         def _Popen(process_obj):
    321             from .popen_spawn_win32 import Popen
--> 322             return Popen(process_obj)
    323 
    324     class SpawnContext(BaseContext):

~\.conda\envs\tf14\lib\multiprocessing\popen_spawn_win32.py in __init__(self, process_obj)
     63             try:
     64                 reduction.dump(prep_data, to_child)
---> 65                 reduction.dump(process_obj, to_child)
     66             finally:
     67                 set_spawning_popen(None)

~\.conda\envs\tf14\lib\multiprocessing\reduction.py in dump(obj, file, protocol)
     58 def dump(obj, file, protocol=None):
     59     '''Replacement for pickle.dump() using ForkingPickler.'''
---> 60     ForkingPickler(file, protocol).dump(obj)
     61 
     62 #

AttributeError: Can't pickle local object 'RandomPairwiseSampler.<locals>.batch'
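The error itself is reproducible without openrec or TensorFlow: the traceback goes through popen_spawn_win32, meaning multiprocessing is using the Windows "spawn" start method, which pickles the process target, and pickle cannot serialize a function defined inside another function (which is exactly what the `batch` closure inside `RandomPairwiseSampler` is). A minimal demonstration:

```python
import pickle

def make_sampler():
    # A function defined inside another function -- like the 'batch'
    # closure inside RandomPairwiseSampler -- has no importable module
    # path, so pickle (used by multiprocessing's 'spawn' start method
    # on Windows) cannot serialize it.
    def batch():
        return 42
    return batch

try:
    pickle.dumps(make_sampler())
except (AttributeError, pickle.PicklingError) as e:
    print(f"cannot pickle: {e}")
```

So the cause is the OS start method rather than the TensorFlow version; on platforms that use "fork" (for example, Linux), the closure never needs to be pickled and the sampler works.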

