
Comments (6)

leonlan commented on May 29, 2024

Some random thoughts related to the question: how good is LNS compared to ALNS?

  • Christiaens and Vanden Berghe (2020) show that an LNS using a 1) slack-inducing string removal destroy operator and 2) greedy insert with blinks repair operator obtains state-of-the-art results on many variants of VRP.

  • In the scheduling literature, LNS goes under the name of Iterated Greedy, see Stützle and Ruiz (2018). It is currently the state-of-the-art for the permutation flow shop problem and parallel machine scheduling.

  • In Stützle and Ruiz (2018), two interesting conclusions are drawn in Chapter 4 where they perform numerical experiments with IG on the permutation flow shop problem:

    As a conclusion from this study, the most significant factor is the local search and the NEH reconstruction. Most other factors have less importance.

    • Here, the NEH reconstruction refers to greedy insertion using a specific ordering of the unassigned jobs (in short, largest total processing time first). Moreover, many iterated greedy papers also show that local search seems to be the most important aspect of IG for obtaining SOTA scheduling results.
    • Local-search post-optimization seems to not play such an important role in VRPs. E.g., Christiaens and Vanden Berghe (2020) do not use local search and François et al. (2019) mentions that the local search procedure only yields tiny improvements. Also the original ALNS papers by Ropke and Pisinger don't use local search.
  • There's little to no research on adaptive IG/ALNS in the literature. Even when multiple repair/destroy operators are considered, studies often test all possible combinations of a single destroy and repair operator.
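
The ruin-and-recreate loop that LNS and IG share is small enough to sketch on a toy problem. Below is a purely illustrative Python sketch (none of these names come from the alns package): the "problem" is sorting a permutation, the objective counts inversions, destroy removes random elements, and repair reinserts them greedily at the best position, mimicking the greedy-insertion repair discussed above.

```python
import random

def inversions(perm):
    """Toy objective: number of out-of-order pairs (0 when sorted)."""
    return sum(perm[i] > perm[j]
               for i in range(len(perm))
               for j in range(i + 1, len(perm)))

def destroy(perm, rng, k=2):
    """Remove k random elements from the solution."""
    removed = rng.sample(perm, k)
    partial = [x for x in perm if x not in removed]
    return partial, removed

def greedy_insert(partial, removed):
    """Repair: reinsert each removed element at its best position."""
    for x in removed:
        best_pos = min(range(len(partial) + 1),
                       key=lambda p: inversions(partial[:p] + [x] + partial[p:]))
        partial.insert(best_pos, x)
    return partial

def lns(initial, n_iters=500, seed=0):
    """Minimal LNS/IG loop: ruin, recreate, accept non-worsening moves."""
    rng = random.Random(seed)
    best = current = initial
    for _ in range(n_iters):
        partial, removed = destroy(list(current), rng)
        candidate = greedy_insert(partial, removed)
        if inversions(candidate) <= inversions(current):
            current = candidate
        if inversions(candidate) < inversions(best):
            best = candidate
    return best
```

With a simple acceptance rule like this, the only moving parts are the destroy and repair operators, which is exactly the setting in which Christiaens and Vanden Berghe (2020) obtain state-of-the-art results.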

from alns.

N-Wouda commented on May 29, 2024

There's also this paper about the A in ALNS: Turkes et al. 2021 (not sure if we've linked to this thing before). I read this as "it's probably not that beneficial in general, since it also adds complexity".


N-Wouda commented on May 29, 2024

We now have SISR as part of the CVRP example. We can add another example doing LNS with $\alpha$-UCB for a job shop problem later on: that ticks the IG box, and shows we're not just a one-trick-ALNS-pony.
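
The $\alpha$-UCB selection rule itself is just a bandit score. A generic sketch of one common UCB formulation (illustrative only; the exact formula and names in alns may differ):

```python
import math

def alpha_ucb_pick(rewards, counts, total, alpha=0.1):
    """Pick the operator index with the highest average reward plus an
    exploration bonus. Generic alpha-UCB sketch, not the alns API."""
    def score(i):
        if counts[i] == 0:
            return math.inf  # try every operator at least once
        avg = rewards[i] / counts[i]
        bonus = alpha * math.sqrt(math.log(total + 1) / counts[i])
        return avg + bonus
    return max(range(len(counts)), key=score)
```

The bonus term shrinks as an operator accumulates plays, so selection drifts from exploration toward exploiting the operator with the best observed reward.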

Another good direction might be to offer more diagnostics. Can we, for example, help users somehow with tuning parameters/providing tools to efficiently tune an ALNS instance?


leonlan commented on May 29, 2024

There's 3 "parameter groups" that we might want to tune in ALNS:

  • ALNS itself (the parameters such as destroy rate, which destroy and repair operators)
  • The operator selection scheme
  • The acceptance criterion

It would be nice to have a tune module that does some of the following:

  • Given the space of parameters, return a sampled instance/configuration.
    • E.g., suppose we have ALNS with 2 destroy operators and 2 repair operators. tune.alns should return $n$ ALNS configurations, each with a sampled combination of those destroy/repair operators.
    • E.g., suppose we use RecordToRecordTravel. tune.accept should return $n$ sampled configurations/instances of RRT.
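
A hypothetical sketch of what such a sampler could look like (sample_configs and the stand-in RRT class below are made up for illustration; they are not part of alns):

```python
import random

def sample_configs(cls, parameter_space, n, seed=0):
    """Hypothetical tune-style sampler: yield (idx, instance) pairs, with
    each parameter drawn uniformly at random from its candidate values."""
    rng = random.Random(seed)
    for idx in range(n):
        params = {name: rng.choice(values)
                  for name, values in parameter_space.items()}
        yield idx, cls(**params)

class RRT:
    """Stand-in for a RecordToRecordTravel-like criterion, for illustration."""
    def __init__(self, start_threshold, end_threshold, step):
        self.start_threshold = start_threshold
        self.end_threshold = end_threshold
        self.step = step
```

A grid-search variant would enumerate the Cartesian product of the candidate values instead of sampling; the yielded (idx, instance) pairs would plug into the workflow below either way.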

A simple workflow for tuning the acceptance criteria would look as follows:

```python
alns = make_alns(...)
init = ...
select = ...
stop = ...

data = {}
for idx, accept in tune.accept(RecordToRecordTravel, parameter_space, sampling_method):
    res = alns.iterate(init, select, accept, stop)
    data[idx] = res.best_state.objective()

# Best configuration (lowest objective)
print(min(data, key=data.get))
```

This could be extended to tuning ALNS itself and the operator selection schemes as well. I don't have much experience with tuning, so I don't know exactly what the tuning interface should look like.


N-Wouda commented on May 29, 2024

We probably shouldn't invent our own half-baked solution for this. The ML community has a lot of this already, with e.g. keras-tuner, ray.tune, etc. Those are used by a lot of people, apparently with some success. At some later point it could pay off to see how they work, and whether we can do something similar in terms of interface for our code.


N-Wouda commented on May 29, 2024

I'm closing this issue because tuning is now in #109, and the other ideas from last summer have (for the most part) already been implemented.

