Comments (8)
I'll add that there is now a dumpmodel binary built when you build gnina that will export the built-in Caffe model weights (but not the actual Caffe model file, as you can get that easily from the source code).
from gnina-torch.
I actually have these weights files for all of the built-in models easily available. The built-in model weights were pulled directly from the source code of GNINA and converted to .caffemodel files. Then https://github.com/vadimkantorov/caffemodel2pytorch was used to convert the .caffemodel files to .pt files for PyTorch.
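For anyone curious what the conversion involves: at its core it is extracting each Caffe blob and renaming it to the matching PyTorch state_dict key. A minimal sketch of that renaming step, with illustrative placeholder layer names (not GNINA's real ones, and not the actual gnina-torch/caffemodel2pytorch code):

```python
# Sketch of the Caffe -> PyTorch key renaming performed after extracting blobs.
# Layer and key names below are illustrative placeholders, not GNINA's real ones.

def rename_caffe_keys(caffe_blobs):
    """Map Caffe (layer, blob_index) entries to PyTorch state_dict keys.

    In Caffe, blob 0 of a convolution/inner-product layer holds the weights
    and blob 1 holds the biases.
    """
    suffix = {0: "weight", 1: "bias"}
    state_dict = {}
    for (layer, blob_index), array in caffe_blobs.items():
        state_dict[f"{layer}.{suffix[blob_index]}"] = array
    return state_dict

# Hypothetical example: two blobs from a layer named "unit1_conv"
blobs = {("unit1_conv", 0): [0.1, 0.2], ("unit1_conv", 1): [0.0]}
print(rename_caffe_keys(blobs))
# {'unit1_conv.weight': [0.1, 0.2], 'unit1_conv.bias': [0.0]}
```

The resulting dictionary can then be saved with torch.save and loaded into the PyTorch model with load_state_dict.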
I have created PR #34 to add these weights files.
Unfortunately, I never converted the original Caffe model weights to PyTorch. The examples folder contains some scripts to reproduce part of the results of your and Paul's work, but they are meant mostly for testing. However, I would be very interested in doing so (or incorporating PRs that allow it). It would be a great feature, and would serve as a double check that everything works as expected. Is there any recommended way of doing this Caffe-to-PyTorch conversion?
PS: For easier integration, I could push the package to PyPI and conda-forge, if it is of interest.
That's fantastic, thanks @drewnutt ! I'll incorporate PR #34 and start adding the functionality to use them easily (including model ensemble, which is currently not implemented).
Awesome! Thanks, @drewnutt.
I opened #35 as a follow-up. I'll link relevant PRs there. Please feel free to comment with specific needs and missing features from your point of view.
@mattragoza you can now easily load the pre-trained default2017 and default2018 models as follows:

from gninatorch.gnina import load_gnina_model

model = load_gnina_model(MODEL_NAME)

where MODEL_NAME corresponds to the name of one of the .pt files in gninatorch/weights (without the .pt extension); this is essentially equivalent to GNINA's --cnn option (but without model ensembles, which will follow).
The dense model is not yet supported because I haven't managed to obtain the same output as GNINA.
An ensemble of models is now also available:

from gninatorch.gnina import load_gnina_models

ensemble = load_gnina_models([MODEL_NAME_1, MODEL_NAME_2, ...])

The ensemble of models returns log_CNNscore, CNNaffinity, and CNNvariance.
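I haven't verified exactly how gnina-torch combines the members, but as a plausible sketch (purely an assumption, not the library's verified code): average the pose score and the affinity across members, and report the variance of the affinity predictions as CNNvariance.

```python
import math
from statistics import fmean, pvariance

# Assumed aggregation scheme, for illustration only; gnina-torch's actual
# ensemble combination may differ.
def aggregate(log_scores, affinities):
    """log_scores/affinities: one entry per ensemble member."""
    # Average the pose score in probability space, then return to log space.
    mean_score = fmean(math.exp(s) for s in log_scores)
    log_cnn_score = math.log(mean_score)
    cnn_affinity = fmean(affinities)      # mean predicted affinity (pK)
    cnn_variance = pvariance(affinities)  # spread across ensemble members
    return log_cnn_score, cnn_affinity, cnn_variance
```

For example, two members that both score a pose at probability 0.5 but predict affinities 5.0 and 7.0 would yield log(0.5), 6.0, and a variance of 1.0.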
Still no dense model for the time being (I suspect that there might be an issue with the BatchNorm layer weights, but I'm still investigating), so it is not currently possible to reproduce GNINA's default model (since it contains two dense models). However, the *_default2018_ensemble models work as expected.
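One classic pitfall with Caffe BatchNorm, offered here only as a hypothesis for the dense-model mismatch: Caffe stores the running mean and variance unnormalized, alongside a third blob holding a moving-average scale factor, and the stats must be divided by that factor before they match PyTorch's running_mean/running_var buffers. A sketch of that correction:

```python
# Illustration of the Caffe BatchNorm convention: the stored running
# statistics must be divided by the moving-average scale factor (blob 2)
# before they correspond to PyTorch's running_mean / running_var.
def normalize_caffe_bn_stats(raw_mean, raw_var, scale_factor):
    s = 1.0 / scale_factor if scale_factor != 0 else 0.0
    running_mean = [m * s for m in raw_mean]
    running_var = [v * s for v in raw_var]
    return running_mean, running_var

# If the scale factor (here 2.0) is forgotten, every statistic is off by
# exactly that factor:
mean, var = normalize_caffe_bn_stats([2.0, 4.0], [8.0], 2.0)
# mean == [1.0, 2.0], var == [4.0]
```

Skipping this division is the kind of bug that would leave every BatchNorm output systematically wrong while all other layers match.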