sktime / sktime-dl

DEPRECATED, now in sktime - companion package for deep learning based on TensorFlow

License: BSD 3-Clause "New" or "Revised" License

Python 98.95% Dockerfile 0.05% Makefile 0.31% Shell 0.69%
deep-learning time-series machine-learning scikit-learn panel-data longitudinal-data neural-networks time-series-classification time-series-regression time-series-forecasting

sktime-dl's Introduction

sktime-dl is currently being ported to mini-packages within sktime, and is no longer maintained as a separate package.

Most estimators formerly in sktime-dl are now available in the sktime.classification.deep_learning and sktime.regression.deep_learning modules, and maintained there.

Contributions are appreciated to port the rest!

To contribute, follow instructions in the umbrella planning issue sktime/sktime#3351 on the sktime repo.

sktime-dl's People

Contributors

abostrom, dependabot[bot], fkiraly, james-large, jnrusson1, mloning, nanashi06, riyabelle25, tonybagnall, withington


sktime-dl's Issues

[BUG] ModuleNotFoundError: No module named 'sktime.utils.data_container'

Running in Google colab:
from sktime_dl.deeplearning import ResNetClassifier

Gives this error:
/usr/local/lib/python3.7/dist-packages/sktime_dl/utils/_data.py in <module>()
4
5 import pandas as pd
----> 6 from sktime.utils.data_container import tabularise, nested_to_3d_numpy
7 from sktime.utils.validation.series_as_features import check_X, check_X_y
8

ModuleNotFoundError: No module named 'sktime.utils.data_container'

Additional context
Installed by running:
!git clone https://github.com/sktime/sktime-dl.git
cd sktime-dl
!git checkout dev
!git pull origin dev
!pip install .

Versions

Linux-5.4.104+-x86_64-with-Ubuntu-18.04-bionic
Python 3.7.10 (default, May 3 2021, 02:48:31)
[GCC 7.5.0]
NumPy 1.19.5
SciPy 1.4.1
sktime 0.6.1
sktime_dl 0.2.0

Loss function in CNNClassifier

Hi,

Thanks for the great work! I was looking at the implementation of CNNClassifier and noticed that the loss function used for this model appears to be MSE. Is there a specific reason for this choice of loss function? The other models seem to use cross-entropy, which should generally be better suited to classification problems.
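For illustration, a minimal sketch of how the compile step would differ if cross-entropy were used instead of MSE (hypothetical helper, not the actual sktime-dl code):

from tensorflow import keras

def compile_classifier(model: keras.Model) -> keras.Model:
    # "categorical_crossentropy" expects one-hot encoded labels; per the
    # observation above, CNNClassifier currently compiles with mean squared error.
    model.compile(
        loss="categorical_crossentropy",
        optimizer=keras.optimizers.Adam(),
        metrics=["accuracy"],
    )
    return model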

[BUG] ImportError: cannot import name 'tabularise' from 'sktime.utils.data_container'

Describe the bug
Looks like there's a divergence between sktime and sktime-dl around some deprecated methods in sktime.utils.data_container.

To Reproduce
A fresh installation of sktime-dl from the dev branch alongside a standard pip install sktime yielded the following error:

ImportError: cannot import name 'tabularise' from 'sktime.utils.data_container'

in sktime_dl.utils._data line 6

from sktime.utils.data_container import tabularise, nested_to_3d_numpy

Expected behavior
sktime.utils.data_container deprecates tabularize; there is no function named tabularise:

@deprecated("Please use `from_nested_to_2d_numpy` instead.")
def tabularize(X, return_array=False):
    return from_nested_to_2d_array(X, return_array)

There also seems to be a name change: nested_to_3d_numpy doesn't exist; the available function is from_nested_to_3d_numpy.
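For anyone hitting this in the meantime, a version-tolerant import sketch (my assumption, not part of sktime-dl; the alternative names and module paths are the ones quoted in this issue and in the related ModuleNotFoundError issue further down):

try:
    # names expected by sktime_dl/utils/_data.py
    from sktime.utils.data_container import tabularise, nested_to_3d_numpy
except ImportError:
    try:
        # sktime 0.4.x: same module, renamed functions
        from sktime.utils.data_container import (
            tabularize as tabularise,
            from_nested_to_3d_numpy as nested_to_3d_numpy,
        )
    except ImportError:
        # later sktime: module renamed to data_processing
        from sktime.utils.data_processing import (
            from_nested_to_2d_array as tabularise,
            from_nested_to_3d_numpy as nested_to_3d_numpy,
        )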

Additional context
I commented out the Python 3.8 block in setup.py. Why is it there? Python 3.8 seems to run OK.

Versions
macOS-10.15.6-x86_64-i386-64bit
Python 3.8.2 (default, Oct 14 2020, 17:06:12)
[Clang 11.0.3 (clang-1103.0.32.62)]
NumPy 1.18.5
SciPy 1.5.3
sktime 0.4.3
sktime_dl 0.2.0

Forecasting reduction capabilities/tests

Had a quick look into the forecasting wrapper problem from sktime 0.4, brought up in PR #59, referenced now in #47.

Ultimately, the problem is that the reduction strategy passes the data to the -dl regressor as numpy arrays (even if a pd.Series is given to ReducedRegressionForecaster.fit()). These arrays are flagged as illegal by sktime.utils.validation.series_as_features.check_X_y, which always runs because input_checks=True is the default in the -dl regressor's fit().

The test (currently commented out on dev) basically follows the code of the forecasting example from base sktime, which uses the KNN regressor from sklearn and of course doesn't run with the check_X_y.

This comes down to what the convention now is in base sktime in terms of data input/throughput. Looking at a couple of the classifiers in base sktime, the checks are applied unconditionally, so presumably regression reduced to classification runs into the same problem? Options:

  1. We relax the dataframe requirement (which technically under the current code is only a requirement if input_checks=True, which again it is by default) and allow np arrays
  2. We change the default to input_checks=False
  3. This is a problem in the reduction strategy in that it should be passing around dataframes
  4. This is a problem in the reduction strategy in that it should try to pass the keyword arg input_checks=False to the base regressor's fit, under the assumption that it produces data (via the splitter etc.) that is suitable anyway and doesn't need checking, e.g. here
  5. ??

Thoughts?
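To make options 1 and 2 concrete, a toy sketch (not sktime-dl code; fit here is a stand-in for the -dl regressor's fit) of validation that the reduction wrapper could skip once it has produced clean numpy windows:

import numpy as np
import pandas as pd

def fit(X, y, input_checks: bool = True):
    if input_checks:
        # current behaviour: only nested pandas DataFrames are accepted
        if not isinstance(X, pd.DataFrame):
            raise ValueError("X must be a nested pandas DataFrame")
    # ... network training would happen here ...
    return "fitted"

# what the reduction splitter hands over: plain numpy windows
X_windows = np.random.rand(20, 1, 10)
y_windows = np.random.rand(20)

fit(X_windows, y_windows, input_checks=False)  # options 2/4: skip the check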

Inheritance and initialisation

As discussed in PR #43, how best to handle the inheritance and initialisation of attributes:

  1. Leave it as it is
  2. Cooperative classes
  3. Initialise attributes when needed

With both 1 and 3, this test, as per sklearn's "rolling your own estimator" guidance, fails:

from sklearn.utils.estimator_checks import check_estimator
from sktime_dl.deeplearning import CNNClassifier

def test_api(network=CNNClassifier()):
    check_estimator(network)

Tensorflow 2 compatibility

Brought up initially in #15

TF2 compatibility is definitely something to look into. I'm not sure how big a job it would be for this repo in particular. Presumably the automated upgrade process could be applied, though from memory there's very little raw TensorFlow code used anyway.

Open question: whether to enforce version >=2, or whether there are still reasons related to implementation maturity/stability to allow 1.x.

[BUG] ImportError: cannot import name 'tabularise'

Describe the bug

I cannot import any classifier or regressor from sktime_dl.deeplearning! I get ImportError: cannot import name 'tabularise'
To Reproduce

conda create -n sktime-dl python=3.6
conda activate sktime-dl
git clone https://github.com/sktime/sktime-dl.git
cd sktime-dl
git checkout dev
git pull origin dev
pip install .

from sktime_dl.deeplearning import CNNClassifier

Versions

Darwin-19.6.0-x86_64-i386-64bit
Python 3.6.12 |Anaconda, Inc.| (default, Sep 8 2020, 17:50:39)
[GCC Clang 10.0.0 ]
NumPy 1.19.4
SciPy 1.5.4
sktime 0.4.3
sktime_dl 0.2.0

[BUG] No `NotFittedError` is raised

Bug description
sklearn's check_is_fitted() by default checks whether any attribute with a trailing underscore is present; if so, it assumes the estimator has been fitted. We initialise is_fitted_=False in the constructor, so the check passes simply because an attribute with a trailing underscore exists, regardless of the value of is_fitted_.

To Reproduce

from sktime_dl.deeplearning import MLPRegressor
from sktime.datasets import load_gunpoint

X_train, y_train = load_gunpoint("TRAIN", return_X_y=True)
r = MLPRegressor()
r.predict(X_train)

Expected behavior
A NotFittedError should be raised; instead we get
AttributeError: 'NoneType' object has no attribute 'predict'

Proposed solution

  • I'm not a big fan of scikit-learn's solution using the attributes-with-trailing-underscore convention. Should we simply implement a check_is_fitted() method on the base class which checks whether _is_fitted is True? (A sketch follows this list.)
  • add unit test
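A minimal sketch of the proposed check (hypothetical base-class code, not the current implementation):

from sklearn.exceptions import NotFittedError

class BaseDeepLearner:
    def __init__(self):
        self._is_fitted = False

    def check_is_fitted(self):
        # raise explicitly instead of relying on the trailing-underscore convention
        if not self._is_fitted:
            raise NotFittedError(
                f"This instance of {type(self).__name__} has not been fitted "
                "yet; please call fit() before predict()."
            )

    def fit(self, X, y):
        # ... build and train the network ...
        self._is_fitted = True
        return self

    def predict(self, X):
        self.check_is_fitted()  # would replace the current AttributeError
        ...  # call the underlying keras model here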

Checklist for Refactoring CI/CD of sktime-dl

Checklist of suggested bug-fixes

  • replace Travis with GitHub Actions
  • clean up build_tools/build.sh
  • use Miniconda GitHub Action
  • exclude TF version < 2
  • add pre-commit yaml config file and GitHub actions (similar to sktime)
  • rename master branch to main
  • replace maint_tools/linting.sh with pre-commit
  • replace stage: Deploy from azure-pipelines.yml with new release process similar to sktime
  • remove support for Python 3.6 (only run on 3.7 for now, once everything is set up we can add 3.8 and 3.9)

ImportError: cannot import name 'check_X_y' from 'sktime.utils.validation'

Hi,

the check_X_y is not found in sktime.

Traceback (most recent call last):
  File "[censored]", line 13, in <module>
    from sktime_dl.classifiers.deeplearning import CNNClassifier
  File "C:\Users\[censored]\Anaconda3\envs\untitled\lib\site-packages\sktime_dl\classifiers\deeplearning\__init__.py", line 1, in <module>
    from ._cnn import CNNClassifier
  File "C:\Users\[censored]\Anaconda3\envs\untitled\lib\site-packages\sktime_dl\classifiers\deeplearning\_cnn.py", line 23, in <module>
    from sktime.utils.validation import check_X_y
ImportError: cannot import name 'check_X_y' from 'sktime.utils.validation' (C:\Users\[censored]\Anaconda3\envs\untitled\lib\site-packages\sktime\utils\validation\__init__.py)

I guess this is supposed to use validate_X_y from https://github.com/alan-turing-institute/sktime/blob/master/sktime/utils/validation/supervised.py ?
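For reference, the guessed change in code form (unverified; the function name and module path are taken from the file linked above):

# old (fails on newer sktime):
# from sktime.utils.validation import check_X_y
# guessed replacement:
from sktime.utils.validation.supervised import validate_X_y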

sktime-dl examples?

Is your feature request related to a problem? Please describe.
Hi, are there any example notebooks or documentation specifically for sktime-dl?

Describe the solution you'd like
Any pointers or tutorials

Thank you :)

macOS and Windows environments on travis tests

For completeness, these test envs should be added too, however they would be non-trivial.

Versioning between operating systems is of course one issue. See here: as of writing, macOS supports up to TF 2.0 but not 2.1 (on CPU, that is).

Windows should be able to support any version, but Windows environments on Travis are less feature-rich. It should be possible in principle: https://docs.travis-ci.com/user/reference/windows/

Alternatively, docker or equivalent environments could be set up and used within the Travis tests to alleviate some of these problems. #27

Investigation into other open-source projects and how they do multi-os tests would be good

April sitrep

Hello all, hope everyone's happy and healthy during these times. In general, I think we're all aware that world and life events put projects like this on the back burner. This is just a faux issue to get back up to speed a bit on the project.

The general short/mid-term goal was (and still is) to get a master push out and the PyPI package updated with all the basic, solid, and compatible functionality we've added/are adding. That way, the project serves as a base of good core functionality that we can implement new algorithms/networks into, and update as needed.

Very broadly, things we still want in:

  • Ease of installation: the setup is pretty solid, I think. I had a play around with the docker PR #35 last week, and while I didn't get a GPU version working, I could use it with tensorflow-cpu (I believe anyone claiming to have successfully set up GPU usage for docker on a non-fresh Windows installation without nesting a Linux VM to run docker on (lol) is either lying or a god). I'm happy that this is working in principle. Perhaps we also add the option to build for CPU only, either as a separate script for a separate docker image, or as an option in the same script/image.

    On the binder plugin, that is all working in principle too, but it won't work on master until it's actually on there, obviously.

  • Ease of use: Examples/documentation. Example notebooks are ongoing under #35 also. A pass through the docstrings will be needed as one of the final steps before going dev -> master. The overall API is solid, however there are the ongoing backend issues and the question of how we want to handle them #45.

    On the API though, base sktime is going through its refactoring, where there'll likely be some knock-on updates needed for sktime-dl. A word on this @mloning? Is the refactor leading into a master push for base sktime? What vague timescale do you imagine for this given all that's going on, weeks/months?

  • Functionality: Few main points under here

    • Networks. As before, there are always going to be new networks from the literature to add; this is ongoing regardless of short-term goals. There are individual networks that need the generalisation process #18 though, see #46.
    • Tasks. Classification/regression in general are all good, and forecasting via reduction is fine as well, barring base sktime updates to this. Other tasks can wait, in my opinion.
    • Meta. Ensembling may need updates to generalise across tasks. Tuning the same, but networks may also need updates to expose tunable parameters. Callbacks are exposed, but there's no nice wrapping code for them at the moment; unsure whether that would be needed either.
    • Experimental/analysis. Any functionality wanted beyond what is already in base sktime. Network and training visualisations seem most obvious, whether we wrap some of the process or just expose it for now a la callbacks. We have actual keras model saving/loading if I recall correctly, but this could be expanded further to incorporate more of the sktime-dl model wrapper as well; will look into it.
  • Maintenance: Core tests and coverage are OK; obviously they can be improved. Test coverage for macOS and Windows on Travis would be good too. I don't think these are a must for the next push, but they are things to improve over time.

Anything major I've missed you think?

Question: Normalization required?

Hi, I am just taking a look at how to use sktime-dl and noticed that the example data already appears to be normalized/standardized. Is this a requirement or is there a normalization layer included in the classifiers?

Not sure if this is the right forum to be asking questions. If not, is there a better place? TIA

[BUG] Problem with sktime-dl installation; cannot import e.g. sktime_dl.deeplearning

Describe the bug

I create a new Conda env with conda create -n sktime-dl python=3.6, then run pip install sktime-dl, but when I then try from sktime_dl.deeplearning import CNNClassifier, Python cannot find sktime_dl.deeplearning.

I also tried to install the development version of sktime-dl, but the package cannot be installed because of NumPy and h5py conflicts (TensorFlow requires older numpy and h5py). Installing the required NumPy and h5py before running pip install . doesn't help either.

To Reproduce

conda create -n sktime-dl python=3.6
conda activate sktime-dl
pip install sktime-dl
from sktime_dl.deeplearning import CNNClassifier

Expected behavior

I can run: from sktime_dl.deeplearning import CNNClassifier
and I can locally run the Jupyter notebooks from examples.

Versions
Linux-4.19.128-microsoft-standard-x86_64-with-debian-buster-sid
Python 3.6.11 | packaged by conda-forge | (default, Aug 5 2020, 20:09:42)
NumPy 1.18.5
SciPy 1.5.4
sktime 0.4.3
sktime_dl 0.1.0

Use `check_random_state` for handling random seeds

Is your feature request related to a problem? Please describe.
Currently we do not have a consistent way of handling random seeds.

Describe the solution you'd like
I suggest using scikit-learn's check_random_state function and renaming all relevant input arguments to random_state for compatibility with scikit-learn.
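A minimal sketch of the suggested convention (generic estimator with hypothetical names, not actual sktime-dl code):

import numpy as np
from sklearn.utils import check_random_state

class SomeDeepLearner:
    def __init__(self, random_state=None):
        # store unmodified, per scikit-learn convention
        self.random_state = random_state

    def fit(self, X, y):
        rng = check_random_state(self.random_state)  # -> np.random.RandomState
        self._seed = rng.randint(np.iinfo(np.int32).max)  # e.g. to seed TF/Keras
        # ... build and train the network using self._seed ...
        return self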

Generalisation to regression

A reasonable extension that fits back into base sktime would be to generalise the networks to be usable for regression out of the box as well, in addition to any reduction techniques via base sktime.

Fundamentally, this should be a matter of changing the output layer and loss function, and maybe a couple of finer details.

Likely, this should involve a refactor separating the 'actual' network definitions (e.g. the model-building code minus the final layer, plus any bespoke fitting/training code) into their own private sections, with wrappers that add classification/regression (and likely other tasks in future) capabilities.

e.g. file structure (a rough sketch follows this list):

  • networks
    • _ResNetNetwork: defines build model, minus output layer and model compilation
    • ...
  • classifiers
    • ResNetClassifier: builds a _ResNetNetwork instance, adds output/compiles, ready for fit/predict
    • ...
  • regressors
    • ResNetRegressor: as above, but for regression
    • ...
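A rough sketch of that split (hypothetical layer choices and class names, not the final API; the network body is heavily abbreviated):

from tensorflow import keras

class _ResNetNetwork:
    def build_network(self, input_shape):
        # everything up to, but not including, the output layer
        inputs = keras.layers.Input(shape=input_shape)
        x = keras.layers.Conv1D(64, 8, padding="same", activation="relu")(inputs)
        x = keras.layers.GlobalAveragePooling1D()(x)  # stand-in for the real ResNet body
        return inputs, x

class ResNetClassifier(_ResNetNetwork):
    def build_model(self, input_shape, n_classes):
        inputs, x = self.build_network(input_shape)
        outputs = keras.layers.Dense(n_classes, activation="softmax")(x)
        model = keras.Model(inputs, outputs)
        model.compile(loss="categorical_crossentropy", optimizer="adam")
        return model

class ResNetRegressor(_ResNetNetwork):
    def build_model(self, input_shape):
        inputs, x = self.build_network(input_shape)
        outputs = keras.layers.Dense(1, activation="linear")(x)
        model = keras.Model(inputs, outputs)
        model.compile(loss="mean_squared_error", optimizer="adam")
        return model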

[BUG] Can't import the CNNClassifier from sktime_dl.deeplearning

Describe the bug

Hello,

When I try to import the sktime_dl module in my code as in your example, more precisely the CNNClassifier, it tells me that the module "sktime_dl.deeplearning" doesn't exist. However, I installed sktime-dl beforehand with "pip install sktime-dl".

To Reproduce

from sktime_dl.deeplearning import CNNClassifier

Versions

Linux-5.4.104+-x86_64-with-Ubuntu-18.04-bionic
Python 3.7.11 (default, Jul 3 2021, 18:01:19)
[GCC 7.5.0]
NumPy 1.19.5
SciPy 1.4.1
sktime 0.7.0
sktime_dl 0.1.0

[BUG] Examples don't run

Describe the bug
This happens when trying to run the examples in the /examples folder:


ModuleNotFoundError Traceback (most recent call last)
in
5 from sklearn.model_selection import GridSearchCV
6 from sktime.datasets import load_gunpoint, load_italy_power_demand
----> 7 from sktime_dl.deeplearning import CNNClassifier
8
9 sns.set_style('whitegrid')

ModuleNotFoundError: No module named 'sktime_dl.deeplearning'

To Reproduce

  1. Clone the repository
  2. Install the dependencies and install sktime-dl
  3. Open univariate_time_series_classification.ipynb
  4. Execute the first cell with code

Expected behavior
Apparently sktime_dl doesn't have this module anymore; it was replaced by
sktime_dl.classification and sktime_dl.regression.

Versions
absl-py-0.15.0 flatbuffers-1.12 gast-0.4.0 grpcio-1.34.1 h5py-3.5.0 keras-2.5.0rc0 keras-nightly-2.5.0.dev2021032900 opt-einsum-3.3.0 sktime-0.7.0 sktime-dl-0.2.0 tensorboard-2.7.0 tensorboard-data-server-0.6.1 tensorflow-2.5.0 tensorflow-addons-0.14.0 tensorflow-estimator-2.5.0 typeguard-2.13.0

Linux-4.15.0-151-generic-x86_64-with-debian-buster-sid
Python 3.7.6 | packaged by conda-forge | (default, Jun 1 2020, 18:57:50)
[GCC 7.5.0]
NumPy 1.21.3
SciPy 1.4.1
sktime 0.7.0
sktime_dl 0.2.0

ModuleNotFoundError: No module named 'sktime_dl.deeplearning.lstm'

Hi,

Describe the bug
I installed sktime-dl dev branch on my windows server:

git clone --single-branch --branch dev https://github.com/sktime/sktime-dl.git
cd sktime-dl
pip install .

and getting this error:

from sktime_dl.deeplearning import CNNClassifier
  ...: 
2020-08-07 10:03:06.513815: W tensorflow/stream_executor/platform/default/dso_loader.cc:59] Could not load dynamic library 'cudart64_101.dll'; dlerror: cudart64_101.dll not found
2020-08-07 10:03:06.514140: I tensorflow/stream_executor/cuda/cudart_stub.cc:29] Ignore above cudart dlerror if you do not have a GPU set up on your machine.
Traceback (most recent call last):
  File "D:\Anaconda3\envs\tipjar\lib\site-packages\IPython\core\interactiveshell.py", line 3417, in run_code
    exec(code_obj, self.user_global_ns, self.user_ns)
  File "<ipython-input-2-268c17ad5901>", line 18, in <module>
    from sktime_dl.deeplearning import CNNClassifier
  File "C:\Program Files\JetBrains\PyCharm Community Edition 2018.3.5\plugins\python-ce\helpers\pydev\_pydev_bundle\pydev_import_hook.py", line 21, in do_import
    module = self._system_import(name, *args, **kwargs)
  File "D:\Anaconda3\envs\tipjar\lib\site-packages\sktime_dl\deeplearning\__init__.py", line 34, in <module>
    from sktime_dl.deeplearning.lstm._regressor import LSTMRegressor
  File "C:\Program Files\JetBrains\PyCharm Community Edition 2018.3.5\plugins\python-ce\helpers\pydev\_pydev_bundle\pydev_import_hook.py", line 21, in do_import
    module = self._system_import(name, *args, **kwargs)
ModuleNotFoundError: No module named 'sktime_dl.deeplearning.lstm'

Versions


Windows-10-10.0.18362-SP0
Python 3.7.8 | packaged by conda-forge | (default, Jul 31 2020, 01:53:57) [MSC v.1916 64 bit (AMD64)]
NumPy 1.18.5
SciPy 1.4.1
sktime 0.4.1
sktime_dl 0.1.0

Thanks for your help

Migrate to tf.keras

Is your feature request related to a problem? Please describe.
Consider migrating from keras to TensorFlow keras (tf.keras).

Keras will only be maintained until April 2020 and the advice from keras-team is to migrate to tf.keras. Although keras 2.3 supports TensorFlow 2, it doesn't use tf2's new features, such as eager execution.

Migrating from keras to tf.keras means losing multi-backend support, i.e. Theano and CNTK, but the other backends are no longer in active development (ref. Theano; CNTK).

Describe the solution you'd like
Feel free to assign this issue to me.

  • Change import keras to from tensorflow import keras
  • Test
  • Update README

Describe alternatives you've considered
Do nothing: Not migrating means not being able to use future keras (now tf.keras) developments. Moreover, keras is only compatible with Python<=3.6 and this will restrict sktime-dl going forward.

Additional context
Additional benefit - migrating from keras to tf.keras will also enable migration from keras-contrib (which isn't on PyPi) to tensorflow-addons. This package is used in sktime-dl to provide keras_contrib.layers.InstanceNormalization()
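The import change in code form (the tensorflow-addons class is named here on the assumption that it mirrors the keras-contrib layer mentioned above):

# old, multi-backend keras:
# import keras
# from keras_contrib.layers import InstanceNormalization

# new, TensorFlow-bundled keras:
from tensorflow import keras
from tensorflow_addons.layers import InstanceNormalization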

Migration to `sktime`

sktime-dl is currently being ported to mini-packages within sktime, and is no longer maintained as a separate package.

Most estimators formerly in sktime-dl are now available in the sktime.classification.deep_learning and sktime.regression.deep_learning modules, and maintained there.

Contributions are appreciated to port the rest!

To contribute, follow instructions in the umbrella planning issue sktime/sktime#3351 on the sktime repo.

Examples/documentation

It's about time an example or two and perhaps some deeper documentation were made, I reckon.

Examples should likely just mirror https://github.com/alan-turing-institute/sktime/tree/master/examples in location/structure

Example 1: probably mirroring univariate_time_series_classification.ipynb, including regression too

Example 2: examples with more keras-focused things (separate from base sktime): model saving/loading, callbacks when those come back in, maybe some tensorboard things

etc.

Full-on documentation and e.g. GitHub pages are things that to me personally would be great if they existed, but I'm likely not going to do it any time soon. The majority of what you might expect for sktime-dl is covered by base sktime anyway.

Multivariate classification capabilities

Currently the networks only accept univariate input data; we should open up this restriction to allow multivariate data. For the most part this should just involve some numpy data wrangling and updating the input shape (see the sketch below).
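A toy sketch of the wrangling involved (not sktime-dl code; nested_to_3d is illustrative only):

import numpy as np
import pandas as pd

def nested_to_3d(X: pd.DataFrame) -> np.ndarray:
    # (n_instances, n_dimensions) nested frame -> (n_instances, series_length, n_dimensions)
    return np.stack(
        [np.column_stack([cell.to_numpy() for cell in row]) for _, row in X.iterrows()]
    )

# two instances, two dimensions, series length 4
X = pd.DataFrame({
    "dim_0": [pd.Series([1.0, 2.0, 3.0, 4.0]), pd.Series([2.0, 3.0, 4.0, 5.0])],
    "dim_1": [pd.Series([0.1, 0.2, 0.3, 0.4]), pd.Series([0.2, 0.3, 0.4, 0.5])],
})
X3d = nested_to_3d(X)
print(X3d.shape)             # (2, 4, 2)
input_shape = X3d.shape[1:]  # what the network's Input layer would then receive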

[BUG] Possible memory leak

Copied over/adapted from the initial PR to sktime:

There seems to be some kind of memory leak, specifically in those models that do some form of internal parameter tuning, MCNN and TLENET (perhaps the others too, however the rate of the leak seems negligible in those cases). Running a single model on a single dataset seems to be fine. Running many in a single execution eventually leads to an error with very little explanation, but which is almost certainly due to an unexpected/unintended OOM, i.e. a leak.

Some links so far into my investigation, but again this requires further investigation over time to fix:

  • Old, but might be a problem with keras itself and how it handles its tf sessions
    • 1, where downgrading Keras may fix it
  • Switching to theano or downgrading is an option...
  • Might also be a problem not with keras specifically, but running keras within pycharm...
    • 4
    • 5
    • leading to... 6
  • Running multiple MCNNs in pycharm leads to errors/exit codes, however running from e.g. anaconda prompt looks like it leaks memory much slower, but still eventually fails
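One common mitigation that may be worth trying alongside the above (an assumption on my part, not a confirmed fix): clear the Keras backend session between models so that graphs from previous fits can be garbage collected.

import gc
from tensorflow import keras  # or `import keras` for the multi-backend version

def run_models(model_builders, X, y):
    for build in model_builders:
        model = build()
        model.fit(X, y, epochs=1, verbose=0)
        # drop references and the old graph/session state before the next model
        del model
        keras.backend.clear_session()
        gc.collect()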

Travis setup

We need to figure out how to set up Travis to test the networks. I'm currently unfamiliar with how to actually define the tests, but presumably we'll set it up to test with TensorFlow as the only backend.

Will look for examples of other projects using Travis with keras dependencies.

Installation and distribution

We need to create

  • a new setup.py with sktime and the deep learning libraries as install dependencies,
  • a PyPI release similar to sktime's to distribute the package, allowing users to pip install it.

[BUG] ModuleNotFoundError: No module named 'sktime.utils.data_container'

I installed the development version of sktime alongside the pip version, which is 0.5.0.
The sktime-dl development version is installed.

I am using Colab with the following settings:
Linux-4.19.112+-x86_64-with-Ubuntu-18.04-bionic
Python 3.6.9 (default, Oct 8 2020, 12:12:24)
[GCC 8.4.0]
NumPy 1.19.4
SciPy 1.4.1
sktime 0.5.0
sktime_dl 0.2.0

I am getting the following error:
ModuleNotFoundError: No module named 'sktime.utils.data_container'

Here is the solution. In the file sktime_dl/utils/_data.py,

change
from sktime.utils.data_container import tabularise, nested_to_3d_numpy
to
from sktime.utils.data_processing import from_nested_to_2d_array, from_nested_to_3d_numpy
and change

from sktime.utils.validation.series_as_features import check_X, check_X_y
to
from sktime.utils.validation.panel import check_X, check_X_y

Creating a docker image for sktime-dl

This certainly seems a prudent next step in aiding usage/setup, particularly for non-expert users

Or any alternative/similar systems to use instead? I noticed while setting this issue up that sktime has a neat binder plugin on the readme, worth investigating for sktime-dl too @mloning?
