
Comments (6)

41968 commented on July 21, 2024

The config file is as follows:

# This is an example configuration file that contains most useful parameter settings.
model:
  name: cnn7  # This model is defined in model_defs.py. Add your own model definitions there.
  path: ***  # Path to PyTorch checkpoint.
data:
  dataset: CIFAR  # Dataset name. This is just the standard CIFAR-10 test set defined in the "load_verification_dataset()" function in utils.py
  mean: [0.4914, 0.4822, 0.4465]  # Mean for normalization.
  std: [0.2023, 0.1994, 0.2010]  # Std for normalization.
  start: 0  # First example to verify in dataset.
  end: 100  # Last example to verify in dataset. We verify 100 examples in this test.
specification:
  norm: .inf  # Linf norm (can also be 2 or 1).
  # epsilon: 0.00784313725490196  # epsilon=2./255.
  epsilon: 0.03137254901960784 # epsilon=8./255
attack:  # Currently attack is only implemented for Linf norm.
  pgd_steps: 100  # Increase for a stronger attack. A PGD attack is run before verification to filter out non-robust examples.
  pgd_restarts: 30  # Increase for a stronger attack.
solver:
  batch_size: 1  # Number of subdomains to compute in parallel in bound solver. Decrease if you run out of memory.
  alpha-crown:
    iteration: 100   # Number of iterations for alpha-CROWN optimization. Alpha-CROWN is used to compute all intermediate layer bounds before branch and bound starts.
    lr_alpha: 0.1    # Learning rate for alpha in alpha-CROWN. The default (0.1) is typically ok.
  beta-crown:
    lr_alpha: 0.01  # Learning rate for optimizing the alpha parameters, the default (0.01) is typically ok, but you can try to tune this parameter to get better lower bound.
    lr_beta: 0.05  # Learning rate for optimizing the beta parameters, the default (0.05) is typically ok, but you can try to tune this parameter to get better lower bound.
    iteration: 20  # Number of iterations for beta-CROWN optimization. 20 is often sufficient, 50 or 100 can also be used.
bab:
  timeout: 0  # Timeout threshold for branch and bound. Increase for verifying more points.
  branching:  # Parameters for branching heuristics.
    reduceop: min  # Reduction function for the branching heuristic scores, min or max. Using max can be better on some models.
    method: kfsb  # babsr is fast but less accurate; fsb is slow but most accurate; kfsb is usually a balance.
    candidates: 3  # Number of candidates to consider in fsb and kfsb. More leads to slower but better branching. 3 is typically good enough.

batch_size is already 1.
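For readers unfamiliar with the attack stage in the config above: it runs an Linf-bounded PGD attack before verification to filter out non-robust examples. A minimal NumPy sketch of Linf PGD on a toy linear classifier (the model, function name, and step-size heuristic here are purely illustrative, not the verifier's actual implementation):

```python
import numpy as np

def pgd_linf(x0, w, y, eps, steps=100, step_size=None):
    """Linf-bounded PGD on a toy binary linear classifier.

    Score is w @ x with true label y in {-1, +1}. We shrink the margin
    y * (w @ x) by stepping against the gradient sign, projecting back
    into the Linf ball of radius eps after every step.
    """
    if step_size is None:
        step_size = 2.5 * eps / steps  # common heuristic step size
    x = x0.copy()
    for _ in range(steps):
        grad_margin = y * w                       # d/dx of y * (w @ x)
        x = x - step_size * np.sign(grad_margin)  # descend the margin
        x = np.clip(x, x0 - eps, x0 + eps)        # project into the ball
    return x

rng = np.random.default_rng(0)
x0 = rng.normal(size=8)
w = rng.normal(size=8)
eps = 8 / 255
x_adv = pgd_linf(x0, w, y=1, eps=eps)
print(np.max(np.abs(x_adv - x0)))  # never exceeds eps
```

If no attack within the eps ball flips the prediction, the example is passed on to the (much more expensive) complete verifier.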

from alpha-beta-crown.

shizhouxing avatar shizhouxing commented on July 21, 2024

@41968 It may depend on how many unstable ReLU neurons this model contains. If most ReLU neurons are unstable, the model may effectively be too large for complete verifiers. How did you obtain the CNN7 model?
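A ReLU neuron is "unstable" when its preactivation interval straddles zero, so the verifier cannot fix it to the active or inactive case and may have to branch on it. A small illustrative check (a hypothetical helper, not part of the verifier's code):

```python
import numpy as np

def count_unstable(lb, ub):
    """Count ReLU neurons whose preactivation bounds [lb, ub] contain 0.

    Each such neuron cannot be linearized exactly and may force a branch
    during branch and bound, so many of them make complete verification
    expensive regardless of model size.
    """
    lb, ub = np.asarray(lb), np.asarray(ub)
    unstable = (lb < 0) & (ub > 0)
    return int(unstable.sum())

# Example preactivation bounds for five neurons:
lb = [-1.0, 0.2, -0.5, -2.0, 1.0]
ub = [ 1.0, 0.8, -0.1,  3.0, 2.0]
print(count_unstable(lb, ub))  # → 2 (neurons 0 and 3 straddle zero)
```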


41968 commented on July 21, 2024

@shizhouxing Trained by myself using a slightly modified IBP.

  • dataset: CIFAR10
  • perturbation: 0.00784313725
  • accuracy: 0.6870
  • certified accuracy (by IBP): 0.3299


shizhouxing commented on July 21, 2024

@41968 I see in your configuration above epsilon: 0.03137254901960784 but the model is trained with eps=2/255?


41968 commented on July 21, 2024

@shizhouxing Sorry for the confusion; there was no epsilon mismatch in my experiment.

In practice, I trained with eps=2/255 and certified with epsilon=0.00784313725490196 (also 2/255); that is, the epsilon in the config is changed to 0.00784313725490196 when certifying.
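For reference, both epsilon values in this thread are just perturbation budgets in pixel units divided by 255 under the standard [0, 1] image scaling:

```python
# CIFAR-10 Linf budgets under [0, 1] pixel scaling
eps_train = 2 / 255    # value used for training and certification here
eps_config = 8 / 255   # value in the original config above
print(eps_train)       # 0.00784313725490196
print(eps_config)      # 0.03137254901960784
```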

Do you often encounter CUDA out-of-memory errors when verifying CNN7?


shizhouxing commented on July 21, 2024

@41968 I see. That may indicate there are too many unstable ReLU neurons in the model.

I have tried models trained with https://github.com/shizhouxing/Fast-certified-robust-training at eps=8/255, and those worked fine, but I'm not sure how much GPU memory I used. Most of those models can be verified by IBP alone, though.
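IBP (interval bound propagation) pushes an input box through the network layer by layer. A minimal NumPy sketch for one affine + ReLU layer using the usual center/radius form (illustrative only; the values below are made up):

```python
import numpy as np

def ibp_affine(lb, ub, W, b):
    """Propagate an elementwise interval [lb, ub] through x -> W @ x + b."""
    center = (ub + lb) / 2
    radius = (ub - lb) / 2
    new_center = W @ center + b
    new_radius = np.abs(W) @ radius  # worst case over the input box
    return new_center - new_radius, new_center + new_radius

def ibp_relu(lb, ub):
    """ReLU is monotone, so apply it to both interval endpoints."""
    return np.maximum(lb, 0), np.maximum(ub, 0)

# Input box: x in [x0 - eps, x0 + eps]
x0 = np.array([0.5, -0.2])
eps = 8 / 255
W = np.array([[1.0, -1.0], [0.5, 2.0]])
b = np.array([0.0, 0.1])
lb, ub = ibp_affine(x0 - eps, x0 + eps, W, b)
# Here the second neuron's preactivation interval straddles zero,
# i.e. it is an unstable ReLU in the sense discussed above.
lb, ub = ibp_relu(lb, ub)
print(lb, ub)
```

If the output bounds already separate the true class from all others, the example is certified without any branch and bound, which is why IBP-trained models are often cheap to verify.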

