
Comments (10)

kulcsarm commented on June 3, 2024

@Fa20, like @bernardbeckerman said, I'm doing the rejection-and-resampling method. I have a function that, given the parameters of a generator run, evaluates the constraints and returns whether the point is feasible. This is the part of my code that handles it in the main loop (my batch size is 1 for now, so I only select a single point):

    # Ask the model for extra candidates so infeasible ones can be rejected.
    candidates = model.gen(batch_size * 3)
    # Fallback: a quasi-random Sobol point, resampled until it is feasible.
    new_point = Models.SOBOL(search_space=mvg_search_space).gen(batch_size)
    while not is_feasible(new_point.arms[0].parameters):
        new_point = Models.SOBOL(search_space=mvg_search_space).gen(batch_size)
    # Take the first feasible model candidate, if any; otherwise keep the
    # feasible Sobol fallback chosen above.
    for i in range(batch_size * 3):
        if is_feasible(candidates.arms[i].parameters):
            new_point.arms[0]._parameters = candidates.arms[i].parameters
            print(f"parameters accepted: {candidates.arms[i].parameters}")
            break
        else:
            print(f"parameters rejected: {candidates.arms[i].parameters}")

The way I try to avoid sampling the same point repeatedly is this: if I don't find a feasible point among the top three candidates, I just evaluate a random (but feasible) point instead, which hopefully changes the acquisition function enough that the same points aren't re-suggested. Any help is greatly appreciated, as I am quite new to this.
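Stripped of the Ax-specific objects, the resample-until-feasible fallback above can be sketched with plain numpy. The `is_feasible` predicate and the uniform sampler below are hypothetical stand-ins for the real constraint check and the Sobol generator:

```python
import numpy as np

rng = np.random.default_rng(0)

def is_feasible(x):
    # Hypothetical nonlinear constraint: keep points inside the unit ball.
    return float(np.sqrt((x ** 2).sum())) <= 1.0

def sample_feasible(dim, max_tries=1000):
    """Rejection sampling: draw random candidates until one is feasible."""
    for _ in range(max_tries):
        x = rng.uniform(-1.0, 1.0, size=dim)  # stand-in for a Sobol draw
        if is_feasible(x):
            return x
    raise RuntimeError("no feasible point found; constraint may be too tight")

point = sample_feasible(dim=3)
```

The `max_tries` cap matters in practice: if the feasible region is a small fraction of the search space, pure rejection sampling can loop for a very long time.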


esantorella commented on June 3, 2024

> @kulcsarm Interesting! It sounds like you're doing rejection sampling, i.e., if the point suggested by Ax violates your nonlinear parameter constraint, you skip evaluation and just leave the point out, is that right? If so, I'd imagine that Ax might eventually keep re-suggesting the same constraint-violating points

Yeah, I'd expect so. If you want to keep doing this sort of manual rejection sampling, you could avoid that problem by attaching these trials with "Pending" status and never evaluating them. In the Developer API, that could be achieved by doing trial.run() but never trial.mark_completed(). Then in your evaluation code, you would need to do something to ensure that running such trials doesn't actually trigger the expensive evaluation. This is also doable and actually easier with the Service API, where you would run ax_client.get_next_trial() as usual but never do ax_client.complete_trial(...).
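An Ax-free mock of that bookkeeping can illustrate the intended control flow. The `MockClient` below is a toy stand-in for `ax_client` (same method names as the Service API, but none of the real machinery): suggest a point, check feasibility, and either complete the trial or leave it running forever so it stays pending:

```python
import random

class MockClient:
    """Toy stand-in for ax_client: suggests 1-D points, records statuses."""
    def __init__(self, seed=0):
        self.rng = random.Random(seed)
        self.status = {}  # trial_index -> "RUNNING" | "COMPLETED"

    def get_next_trial(self):
        idx = len(self.status)
        self.status[idx] = "RUNNING"
        return {"x": self.rng.uniform(-2.0, 2.0)}, idx

    def complete_trial(self, idx, raw_data):
        self.status[idx] = "COMPLETED"

def is_feasible(params):
    return abs(params["x"]) <= 1.0  # hypothetical nonlinear constraint

client = MockClient()
for _ in range(10):
    params, idx = client.get_next_trial()
    if not is_feasible(params):
        # Never complete the trial: it stays pending, so the model avoids
        # re-suggesting that point, and the expensive evaluation is skipped.
        continue
    client.complete_trial(idx, raw_data={"objective": params["x"] ** 2})
```

With the real `AxClient`, the loop body is the same: call `get_next_trial()` as usual, but only call `complete_trial(...)` for feasible parameterizations.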

For what it's worth, Ax actually does support nonlinear constraints, but only with BoTorch models, which do Bayesian optimization. By default, Ax starts with a batch of quasi-random Sobol points, and that step doesn't support nonlinear constraints.


bernardbeckerman commented on June 3, 2024

Hi @kulcsarm, Thanks for reporting this! While I look into this, is there anything preventing you from migrating to the Service API (tutorial)? This API tends to be the most robust and can serve a wide range of use cases.


kulcsarm commented on June 3, 2024

Hello @bernardbeckerman, thank you for your quick reply! I decided to use the Developer API because my problem has nonlinear constraints (which are very unlikely to be violated but would break the evaluation function), and as far as I understand the three built-in constraint classes can't handle them, so I filter out infeasible points "by hand" during the optimization loop.


Fa20 commented on June 3, 2024

@kulcsarm could you please explain how you handled these nonlinear constraints in Ax? I have the same problem.


bernardbeckerman commented on June 3, 2024

@kulcsarm Interesting! It sounds like you're doing rejection sampling, i.e., if the point suggested by Ax violates your nonlinear parameter constraint, you skip evaluation and just leave the point out, is that right? If so, I'd imagine that Ax might eventually keep re-suggesting the same constraint-violating points, since most Ax generation strategies try to sample yet-unsampled parts of the search space, which it doesn't know violate a constraint. I'm not sure if we have a good setup to handle this - let me loop in one of our researchers to help.


bernardbeckerman commented on June 3, 2024

@kulcsarm You can accomplish this in the Service API tutorial (link) by replacing the evaluate function with something like this:

    def evaluate(parameterization):
        x = np.array([parameterization.get(f"x{i+1}") for i in range(6)])
        l2norm = np.sqrt((x**2).sum())
        # Skip the (expensive) objective when the constraint is violated,
        # but still report the constraint metric so the model can learn from it.
        if l2norm > 1.25:
            return {"l2norm": (l2norm, 0.0)}
        # In our case, the standard error is 0, since we are computing a synthetic function.
        return {"hartmann6": (hartmann6(x), 0.0), "l2norm": (l2norm, 0.0)}

This will accomplish two things:

  1. It computes the constraint up front and skips the objective when the constraint is violated. This is useful when the constraint is cheap to compute relative to the objective (likely, if your constraint is a simple mathematical function of the parameters), since you avoid evaluating the objective in cases where you already know the result won't be useful.
  2. When the constraint is violated, the constraint metric values are still returned, so the model can learn to avoid such regions in the future. This should let you skip the manual step of throwing out trials.

Note that it may take more than the 25 trials the tutorial uses to produce the 12 completed trials that Ax needs before proceeding to the modeling stage (I've been using 50 trials, which usually does the trick). Once the optimization proceeds to the model-based "BoTorch" phase, Ax will use its internal understanding of the space to try to avoid bad parameterizations.

Let me know if this helps!
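A self-contained version of this skip-when-infeasible pattern can be tested directly; here a hypothetical cheap objective stands in for hartmann6 (numpy only, dictionary keys as in the snippet above except for the stand-in "objective"):

```python
import numpy as np

def evaluate(parameterization, bound=1.25):
    """Return only the constraint metric when it is violated, skipping
    the (possibly expensive) objective; otherwise return both metrics."""
    x = np.array([parameterization[f"x{i + 1}"] for i in range(6)])
    l2norm = float(np.sqrt((x ** 2).sum()))
    if l2norm > bound:
        return {"l2norm": (l2norm, 0.0)}
    # Hypothetical cheap objective standing in for hartmann6.
    objective = float(-np.exp(-((x - 0.2) ** 2).sum()))
    return {"objective": (objective, 0.0), "l2norm": (l2norm, 0.0)}

feasible = evaluate({f"x{i + 1}": 0.1 for i in range(6)})    # both metrics
infeasible = evaluate({f"x{i + 1}": 1.0 for i in range(6)})  # l2norm only
```

For this to work end to end, the `l2norm` metric must also be registered as an outcome constraint when the experiment is created, as in the tutorial, so the model treats it as a constraint rather than ignoring it.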


bernardbeckerman commented on June 3, 2024

@kulcsarm any luck with the above suggestions? I'm closing this out for now but please feel free to comment or reopen for further help!


Fa20 commented on June 3, 2024

@bernardbeckerman another question, if possible: besides this problem with nonlinear constraints on the parameters, which can be solved as explained, I have 3 other constraints on the objective functions (not on the search parameters) that can only be checked after evaluating the objectives. Should I add these constraints in the evaluation function, as outcome constraints, or what is the best way to handle this?


kulcsarm commented on June 3, 2024

> @kulcsarm any luck with the above suggestions? I'm closing this out for now but please feel free to comment or reopen for further help!

I'm sorry for not replying, I was away. The original issue still stands: I can't import the RegistryBundle class. I haven't tried the Service API yet. Do I understand correctly that to choose which model to use in the Service API, I have to set up a GenerationStrategy?

> For what it's worth, Ax actually does support nonlinear constraints, but only with BoTorch models, which do Bayesian optimization. By default, Ax starts with a batch of quasi-random Sobol points, and that step doesn't support nonlinear constraints.

@esantorella I am using a BoTorch model for the Bayesian part, since I switched to Ax mainly for its ability to save and resume experiments. Can I pass the inequality_constraints inside BoTorchModel(), and if so, how? I think my method should work fine for the initial points.

