
neural-art's Introduction

Not another neural style implementation!

Recreate photos in the style of famous artists with machine learning. CSCI3341 Artificial Intelligence Final Project.

Usage

Make sure to download models with the util/download_models.sh script first. Thanks @jcjohnson!

python art.py --help

Examples

Create a (low-quality) version of the image below. The first positional argument is the content image; the image(s) after it are the style images. The remaining options run the algorithm for 200 iterations and output an image with a 256px maximum width, which should finish relatively quickly on a personal computer.

python art.py examples/gasson.jpg examples/starry.jpg -n 200 -w 256
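Under the hood, neural style transfer of this kind (following Gatys et al.) represents a style image by the Gram matrices of its convolutional feature maps. A minimal pure-Python sketch of that computation — the `gram_matrix` helper here is illustrative and not taken from art.py:

```python
# Sketch of the Gram-matrix style representation used in neural style
# transfer. Illustrative only; art.py's actual implementation may differ.

def gram_matrix(features):
    """features: a list of C channels, each a flat list of H*W activations.
    Returns the C x C matrix of channel-wise inner products."""
    C = len(features)
    return [[sum(a * b for a, b in zip(features[i], features[j]))
             for j in range(C)]
            for i in range(C)]

# Two toy channels; the style loss compares Gram matrices like this one
# between the style image and the candidate image.
g = gram_matrix([[1.0, 2.0], [3.0, 4.0]])
print(g)  # [[5.0, 11.0], [11.0, 25.0]]
```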

Further options include using the entire oeuvre of an artist as a stylistic target (which never works well), scaling the stylistic features by a factor, specifying whether to initialize the candidate image from random noise or the content image, and more. See the script's help for details.
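The noise-versus-content initialization choice mentioned above can be sketched roughly as follows (a guess at the behavior; the actual flag in art.py may differ):

```python
import random

def init_candidate(content, from_noise, seed=0):
    """Return the starting image for optimization: either the content
    image itself or uniform random noise of the same size.
    A sketch, not art.py's actual code."""
    if not from_noise:
        return list(content)
    rng = random.Random(seed)
    return [rng.uniform(0.0, 255.0) for _ in content]

pixels = [10.0, 20.0, 30.0]
print(init_candidate(pixels, from_noise=False))  # [10.0, 20.0, 30.0]
```

Starting from the content image tends to converge faster; starting from noise gives the optimizer more freedom but usually needs more iterations.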

The best Gassongram ever:

[images: gasson.jpg + starry.jpg -> gasson_final]

Where do I get all this art?

util/wikiart-scraper.py is a script that automatically scrapes every painting by a given artist from WikiArt. Simply specify the URL component of the artist's name (e.g. pablo-picasso) in the ARTISTS array in the script.
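The scraping flow is roughly: build the artist's page URL from the slug, fetch the list of paintings, then download each image. A sketch of the slug-to-URL step — the URL pattern shown is an assumption about wikiart.org's layout, not verified against the live site or the script:

```python
def artist_url(slug):
    # Assumed URL pattern; check wikiart.org (and the script itself)
    # for the current layout before relying on this.
    return "https://www.wikiart.org/en/" + slug

ARTISTS = ["pablo-picasso"]
for a in ARTISTS:
    print(artist_url(a))  # https://www.wikiart.org/en/pablo-picasso
```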

neural-art's People

Contributors

jayelm


neural-art's Issues

Caffe module doesn't exist

Caffe isn't available on pip, so I installed Caffe2 instead, but the script still couldn't find the module after installation. I'm not sure how to fix this; I haven't seen a module laid out like this before.
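For context, pycaffe (the `caffe` module this project imports) generally isn't pip-installable and is distinct from Caffe2: it is built from source and its python directory added to the import path manually. A sketch of the usual workaround — the build path below is an example, not a known-good location:

```python
import sys

# Path to a from-source Caffe build's python bindings (example path;
# substitute wherever your build actually lives).
CAFFE_PYTHON = "/home/me/caffe/python"

if CAFFE_PYTHON not in sys.path:
    sys.path.insert(0, CAFFE_PYTHON)

# After this, `import caffe` should resolve if the build succeeded.
print(sys.path[0])  # /home/me/caffe/python
```

Setting the PYTHONPATH environment variable to the same directory achieves the same thing without editing code.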

Check failed: K_ == new_K (25088 vs. 16384) Input size incompatible with inner product parameters. *** Check failure stack trace: ***

I am trying to run this code but get the following error:

: Logging before InitGoogleLogging() is written to STDERR
W0712 10:26:14.632521 1784 _caffe.cpp:139] DEPRECATION WARNING - deprecated use of Python interface
W0712 10:26:14.632577 1784 _caffe.cpp:140] Use this instead (with the named "weights" parameter):
W0712 10:26:14.632582 1784 caffe.cpp:142] Net('./models/VGG_ILSVRC_19_layers_deploy.prototxt', 1, weights='./models/VGG_ILSVRC_19_layers.caffemodel')
I0712 10:26:14.633937 1784 upgrade_proto.cpp:53] Attempting to upgrade input file specified using deprecated V1LayerParameter: ./models/VGG_ILSVRC_19_layers_deploy.prototxt
I0712 10:26:14.634027 1784 upgrade_proto.cpp:61] Successfully upgraded file specified using deprecated V1LayerParameter
I0712 10:26:14.634066 1784 upgrade_proto.cpp:67] Attempting to upgrade input file specified using deprecated input fields: ./models/VGG_ILSVRC_19_layers_deploy.prototxt
I0712 10:26:14.634078 1784 upgrade_proto.cpp:70] Successfully upgraded file specified using deprecated input fields.
W0712 10:26:14.634083 1784 upgrade_proto.cpp:72] Note that future Caffe releases will only support input layers and not input fields.
I0712 10:26:14.634299 1784 net.cpp:51] Initializing net from parameters:
name: "VGG_ILSVRC_19_layers"
force_backward: true
state {
phase: TEST
level: 0
}
layer {
name: "input"
type: "Input"
top: "data"
input_param {
shape {
dim: 10
dim: 3
dim: 224
dim: 224
}
}
}
layer {
name: "conv1_1"
type: "Convolution"
bottom: "data"
top: "conv1_1"
convolution_param {
num_output: 64
pad: 1
kernel_size: 3
}
}
layer {
name: "relu1_1"
type: "ReLU"
bottom: "conv1_1"
top: "conv1_1"
}
layer {
name: "conv1_2"
type: "Convolution"
bottom: "conv1_1"
top: "conv1_2"
convolution_param {
num_output: 64
pad: 1
kernel_size: 3
}
}
layer {
name: "relu1_2"
type: "ReLU"
bottom: "conv1_2"
top: "conv1_2"
}
layer {
name: "pool1"
type: "Pooling"
bottom: "conv1_2"
top: "pool1"
pooling_param {
pool: MAX
kernel_size: 2
stride: 2
}
}
layer {
name: "conv2_1"
type: "Convolution"
bottom: "pool1"
top: "conv2_1"
convolution_param {
num_output: 128
pad: 1
kernel_size: 3
}
}
layer {
name: "relu2_1"
type: "ReLU"
bottom: "conv2_1"
top: "conv2_1"
}
layer {
name: "conv2_2"
type: "Convolution"
bottom: "conv2_1"
top: "conv2_2"
convolution_param {
num_output: 128
pad: 1
kernel_size: 3
}
}
layer {
name: "relu2_2"
type: "ReLU"
bottom: "conv2_2"
top: "conv2_2"
}
layer {
name: "pool2"
type: "Pooling"
bottom: "conv2_2"
top: "pool2"
pooling_param {
pool: MAX
kernel_size: 2
stride: 2
}
}
layer {
name: "conv3_1"
type: "Convolution"
bottom: "pool2"
top: "conv3_1"
convolution_param {
num_output: 256
pad: 1
kernel_size: 3
}
}
layer {
name: "relu3_1"
type: "ReLU"
bottom: "conv3_1"
top: "conv3_1"
}
layer {
name: "conv3_2"
type: "Convolution"
bottom: "conv3_1"
top: "conv3_2"
convolution_param {
num_output: 256
pad: 1
kernel_size: 3
}
}
layer {
name: "relu3_2"
type: "ReLU"
bottom: "conv3_2"
top: "conv3_2"
}
layer {
name: "conv3_3"
type: "Convolution"
bottom: "conv3_2"
top: "conv3_3"
convolution_param {
num_output: 256
pad: 1
kernel_size: 3
}
}
layer {
name: "relu3_3"
type: "ReLU"
bottom: "conv3_3"
top: "conv3_3"
}
layer {
name: "conv3_4"
type: "Convolution"
bottom: "conv3_3"
top: "conv3_4"
convolution_param {
num_output: 256
pad: 1
kernel_size: 3
}
}
layer {
name: "relu3_4"
type: "ReLU"
bottom: "conv3_4"
top: "conv3_4"
}
layer {
name: "pool3"
type: "Pooling"
bottom: "conv3_4"
top: "pool3"
pooling_param {
pool: MAX
kernel_size: 2
stride: 2
}
}
layer {
name: "conv4_1"
type: "Convolution"
bottom: "pool3"
top: "conv4_1"
convolution_param {
num_output: 512
pad: 1
kernel_size: 3
}
}
layer {
name: "relu4_1"
type: "ReLU"
bottom: "conv4_1"
top: "conv4_1"
}
layer {
name: "conv4_2"
type: "Convolution"
bottom: "conv4_1"
top: "conv4_2"
convolution_param {
num_output: 512
pad: 1
kernel_size: 3
}
}
layer {
name: "relu4_2"
type: "ReLU"
bottom: "conv4_2"
top: "conv4_2"
}
layer {
name: "conv4_3"
type: "Convolution"
bottom: "conv4_2"
top: "conv4_3"
convolution_param {
num_output: 512
pad: 1
kernel_size: 3
}
}
layer {
name: "relu4_3"
type: "ReLU"
bottom: "conv4_3"
top: "conv4_3"
}
layer {
name: "conv4_4"
type: "Convolution"
bottom: "conv4_3"
top: "conv4_4"
convolution_param {
num_output: 512
pad: 1
kernel_size: 3
}
}
layer {
name: "relu4_4"
type: "ReLU"
bottom: "conv4_4"
top: "conv4_4"
}
layer {
name: "pool4"
type: "Pooling"
bottom: "conv4_4"
top: "pool4"
pooling_param {
pool: MAX
kernel_size: 2
stride: 2
}
}
layer {
name: "conv5_1"
type: "Convolution"
bottom: "pool4"
top: "conv5_1"
convolution_param {
num_output: 512
pad: 1
kernel_size: 3
}
}
layer {
name: "relu5_1"
type: "ReLU"
bottom: "conv5_1"
top: "conv5_1"
}
layer {
name: "conv5_2"
type: "Convolution"
bottom: "conv5_1"
top: "conv5_2"
convolution_param {
num_output: 512
pad: 1
kernel_size: 3
}
}
layer {
name: "relu5_2"
type: "ReLU"
bottom: "conv5_2"
top: "conv5_2"
}
layer {
name: "conv5_3"
type: "Convolution"
bottom: "conv5_2"
top: "conv5_3"
convolution_param {
num_output: 512
pad: 1
kernel_size: 3
}
}
layer {
name: "relu5_3"
type: "ReLU"
bottom: "conv5_3"
top: "conv5_3"
}
layer {
name: "conv5_4"
type: "Convolution"
bottom: "conv5_3"
top: "conv5_4"
convolution_param {
num_output: 512
pad: 1
kernel_size: 3
}
}
layer {
name: "relu5_4"
type: "ReLU"
bottom: "conv5_4"
top: "conv5_4"
}
layer {
name: "pool5"
type: "Pooling"
bottom: "conv5_4"
top: "pool5"
pooling_param {
pool: MAX
kernel_size: 2
stride: 2
}
}
layer {
name: "fc6"
type: "InnerProduct"
bottom: "pool5"
top: "fc6"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
inner_product_param {
num_output: 4096
weight_filler {
type: "gaussian"
std: 0.01
}
bias_filler {
type: "constant"
value: 0
}
}
}
layer {
name: "relu6"
type: "ReLU"
bottom: "fc6"
top: "fc6"
}
layer {
name: "drop6"
type: "Dropout"
bottom: "fc6"
top: "fc6"
dropout_param {
dropout_ratio: 0.5
}
}
layer {
name: "fc7"
type: "InnerProduct"
bottom: "fc6"
top: "fc7"
inner_product_param {
num_output: 4096
}
}
layer {
name: "relu7"
type: "ReLU"
bottom: "fc7"
top: "fc7"
}
layer {
name: "drop7"
type: "Dropout"
bottom: "fc7"
top: "fc7"
dropout_param {
dropout_ratio: 0.5
}
}
layer {
name: "fc8"
type: "InnerProduct"
bottom: "fc7"
top: "fc8"
inner_product_param {
num_output: 1000
}
}
layer {
name: "prob"
type: "Softmax"
bottom: "fc8"
top: "prob"
}
I0712 10:26:14.634415 1784 layer_factory.hpp:77] Creating layer input
I0712 10:26:14.634429 1784 net.cpp:84] Creating Layer input
I0712 10:26:14.634435 1784 net.cpp:380] input -> data
I0712 10:26:14.646639 1784 net.cpp:122] Setting up input
I0712 10:26:14.646675 1784 net.cpp:129] Top shape: 10 3 224 224 (1505280)
I0712 10:26:14.646682 1784 net.cpp:137] Memory required for data: 6021120
I0712 10:26:14.646688 1784 layer_factory.hpp:77] Creating layer conv1_1
I0712 10:26:14.646703 1784 net.cpp:84] Creating Layer conv1_1
I0712 10:26:14.646711 1784 net.cpp:406] conv1_1 <- data
I0712 10:26:14.646718 1784 net.cpp:380] conv1_1 -> conv1_1
I0712 10:26:14.646912 1784 net.cpp:122] Setting up conv1_1
I0712 10:26:14.646936 1784 net.cpp:129] Top shape: 10 64 224 224 (32112640)
I0712 10:26:14.646942 1784 net.cpp:137] Memory required for data: 134471680
I0712 10:26:14.646961 1784 layer_factory.hpp:77] Creating layer relu1_1
I0712 10:26:14.646972 1784 net.cpp:84] Creating Layer relu1_1
I0712 10:26:14.646979 1784 net.cpp:406] relu1_1 <- conv1_1
I0712 10:26:14.646984 1784 net.cpp:367] relu1_1 -> conv1_1 (in-place)
I0712 10:26:14.646997 1784 net.cpp:122] Setting up relu1_1
I0712 10:26:14.647006 1784 net.cpp:129] Top shape: 10 64 224 224 (32112640)
I0712 10:26:14.647011 1784 net.cpp:137] Memory required for data: 262922240
I0712 10:26:14.647014 1784 layer_factory.hpp:77] Creating layer conv1_2
I0712 10:26:14.647022 1784 net.cpp:84] Creating Layer conv1_2
I0712 10:26:14.647025 1784 net.cpp:406] conv1_2 <- conv1_1
I0712 10:26:14.647030 1784 net.cpp:380] conv1_2 -> conv1_2
I0712 10:26:14.647205 1784 net.cpp:122] Setting up conv1_2
I0712 10:26:14.647217 1784 net.cpp:129] Top shape: 10 64 224 224 (32112640)
I0712 10:26:14.647222 1784 net.cpp:137] Memory required for data: 391372800
I0712 10:26:14.647228 1784 layer_factory.hpp:77] Creating layer relu1_2
I0712 10:26:14.647234 1784 net.cpp:84] Creating Layer relu1_2
I0712 10:26:14.647241 1784 net.cpp:406] relu1_2 <- conv1_2
I0712 10:26:14.647246 1784 net.cpp:367] relu1_2 -> conv1_2 (in-place)
I0712 10:26:14.647254 1784 net.cpp:122] Setting up relu1_2
I0712 10:26:14.647260 1784 net.cpp:129] Top shape: 10 64 224 224 (32112640)
I0712 10:26:14.647264 1784 net.cpp:137] Memory required for data: 519823360
I0712 10:26:14.647269 1784 layer_factory.hpp:77] Creating layer pool1
I0712 10:26:14.647274 1784 net.cpp:84] Creating Layer pool1
I0712 10:26:14.647281 1784 net.cpp:406] pool1 <- conv1_2
I0712 10:26:14.647289 1784 net.cpp:380] pool1 -> pool1
I0712 10:26:14.647315 1784 net.cpp:122] Setting up pool1
I0712 10:26:14.647323 1784 net.cpp:129] Top shape: 10 64 112 112 (8028160)
I0712 10:26:14.647328 1784 net.cpp:137] Memory required for data: 551936000
I0712 10:26:14.647332 1784 layer_factory.hpp:77] Creating layer conv2_1
I0712 10:26:14.647338 1784 net.cpp:84] Creating Layer conv2_1
I0712 10:26:14.647343 1784 net.cpp:406] conv2_1 <- pool1
I0712 10:26:14.647348 1784 net.cpp:380] conv2_1 -> conv2_1
I0712 10:26:14.647531 1784 net.cpp:122] Setting up conv2_1
I0712 10:26:14.647541 1784 net.cpp:129] Top shape: 10 128 112 112 (16056320)
I0712 10:26:14.647545 1784 net.cpp:137] Memory required for data: 616161280
I0712 10:26:14.647552 1784 layer_factory.hpp:77] Creating layer relu2_1
I0712 10:26:14.647558 1784 net.cpp:84] Creating Layer relu2_1
I0712 10:26:14.647562 1784 net.cpp:406] relu2_1 <- conv2_1
I0712 10:26:14.647568 1784 net.cpp:367] relu2_1 -> conv2_1 (in-place)
I0712 10:26:14.647573 1784 net.cpp:122] Setting up relu2_1
I0712 10:26:14.647578 1784 net.cpp:129] Top shape: 10 128 112 112 (16056320)
I0712 10:26:14.647583 1784 net.cpp:137] Memory required for data: 680386560
I0712 10:26:14.647588 1784 layer_factory.hpp:77] Creating layer conv2_2
I0712 10:26:14.647593 1784 net.cpp:84] Creating Layer conv2_2
I0712 10:26:14.647598 1784 net.cpp:406] conv2_2 <- conv2_1
I0712 10:26:14.647603 1784 net.cpp:380] conv2_2 -> conv2_2
I0712 10:26:14.647817 1784 net.cpp:122] Setting up conv2_2
I0712 10:26:14.647833 1784 net.cpp:129] Top shape: 10 128 112 112 (16056320)
I0712 10:26:14.647837 1784 net.cpp:137] Memory required for data: 744611840
I0712 10:26:14.647843 1784 layer_factory.hpp:77] Creating layer relu2_2
I0712 10:26:14.647850 1784 net.cpp:84] Creating Layer relu2_2
I0712 10:26:14.647855 1784 net.cpp:406] relu2_2 <- conv2_2
I0712 10:26:14.647859 1784 net.cpp:367] relu2_2 -> conv2_2 (in-place)
I0712 10:26:14.647864 1784 net.cpp:122] Setting up relu2_2
I0712 10:26:14.647871 1784 net.cpp:129] Top shape: 10 128 112 112 (16056320)
I0712 10:26:14.647874 1784 net.cpp:137] Memory required for data: 808837120
I0712 10:26:14.647878 1784 layer_factory.hpp:77] Creating layer pool2
I0712 10:26:14.647884 1784 net.cpp:84] Creating Layer pool2
I0712 10:26:14.647895 1784 net.cpp:406] pool2 <- conv2_2
I0712 10:26:14.647900 1784 net.cpp:380] pool2 -> pool2
I0712 10:26:14.647936 1784 net.cpp:122] Setting up pool2
I0712 10:26:14.647945 1784 net.cpp:129] Top shape: 10 128 56 56 (4014080)
I0712 10:26:14.647949 1784 net.cpp:137] Memory required for data: 824893440
I0712 10:26:14.647959 1784 layer_factory.hpp:77] Creating layer conv3_1
I0712 10:26:14.647965 1784 net.cpp:84] Creating Layer conv3_1
I0712 10:26:14.647972 1784 net.cpp:406] conv3_1 <- pool2
I0712 10:26:14.647977 1784 net.cpp:380] conv3_1 -> conv3_1
I0712 10:26:14.649175 1784 net.cpp:122] Setting up conv3_1
I0712 10:26:14.649201 1784 net.cpp:129] Top shape: 10 256 56 56 (8028160)
I0712 10:26:14.649205 1784 net.cpp:137] Memory required for data: 857006080
I0712 10:26:14.649214 1784 layer_factory.hpp:77] Creating layer relu3_1
I0712 10:26:14.649219 1784 net.cpp:84] Creating Layer relu3_1
I0712 10:26:14.649225 1784 net.cpp:406] relu3_1 <- conv3_1
I0712 10:26:14.649235 1784 net.cpp:367] relu3_1 -> conv3_1 (in-place)
I0712 10:26:14.649241 1784 net.cpp:122] Setting up relu3_1
I0712 10:26:14.649250 1784 net.cpp:129] Top shape: 10 256 56 56 (8028160)
I0712 10:26:14.649256 1784 net.cpp:137] Memory required for data: 889118720
I0712 10:26:14.649262 1784 layer_factory.hpp:77] Creating layer conv3_2
I0712 10:26:14.649271 1784 net.cpp:84] Creating Layer conv3_2
I0712 10:26:14.649278 1784 net.cpp:406] conv3_2 <- conv3_1
I0712 10:26:14.649286 1784 net.cpp:380] conv3_2 -> conv3_2
I0712 10:26:14.650760 1784 net.cpp:122] Setting up conv3_2
I0712 10:26:14.650781 1784 net.cpp:129] Top shape: 10 256 56 56 (8028160)
I0712 10:26:14.650785 1784 net.cpp:137] Memory required for data: 921231360
I0712 10:26:14.650792 1784 layer_factory.hpp:77] Creating layer relu3_2
I0712 10:26:14.650799 1784 net.cpp:84] Creating Layer relu3_2
I0712 10:26:14.650804 1784 net.cpp:406] relu3_2 <- conv3_2
I0712 10:26:14.650810 1784 net.cpp:367] relu3_2 -> conv3_2 (in-place)
I0712 10:26:14.650816 1784 net.cpp:122] Setting up relu3_2
I0712 10:26:14.650822 1784 net.cpp:129] Top shape: 10 256 56 56 (8028160)
I0712 10:26:14.650833 1784 net.cpp:137] Memory required for data: 953344000
I0712 10:26:14.650837 1784 layer_factory.hpp:77] Creating layer conv3_3
I0712 10:26:14.650843 1784 net.cpp:84] Creating Layer conv3_3
I0712 10:26:14.650848 1784 net.cpp:406] conv3_3 <- conv3_2
I0712 10:26:14.650853 1784 net.cpp:380] conv3_3 -> conv3_3
I0712 10:26:14.652269 1784 net.cpp:122] Setting up conv3_3
I0712 10:26:14.652293 1784 net.cpp:129] Top shape: 10 256 56 56 (8028160)
I0712 10:26:14.652298 1784 net.cpp:137] Memory required for data: 985456640
I0712 10:26:14.652304 1784 layer_factory.hpp:77] Creating layer relu3_3
I0712 10:26:14.652312 1784 net.cpp:84] Creating Layer relu3_3
I0712 10:26:14.652316 1784 net.cpp:406] relu3_3 <- conv3_3
I0712 10:26:14.652323 1784 net.cpp:367] relu3_3 -> conv3_3 (in-place)
I0712 10:26:14.652328 1784 net.cpp:122] Setting up relu3_3
I0712 10:26:14.652334 1784 net.cpp:129] Top shape: 10 256 56 56 (8028160)
I0712 10:26:14.652338 1784 net.cpp:137] Memory required for data: 1017569280
I0712 10:26:14.652343 1784 layer_factory.hpp:77] Creating layer conv3_4
I0712 10:26:14.652349 1784 net.cpp:84] Creating Layer conv3_4
I0712 10:26:14.652354 1784 net.cpp:406] conv3_4 <- conv3_3
I0712 10:26:14.652359 1784 net.cpp:380] conv3_4 -> conv3_4
I0712 10:26:14.653734 1784 net.cpp:122] Setting up conv3_4
I0712 10:26:14.653757 1784 net.cpp:129] Top shape: 10 256 56 56 (8028160)
I0712 10:26:14.653762 1784 net.cpp:137] Memory required for data: 1049681920
I0712 10:26:14.653769 1784 layer_factory.hpp:77] Creating layer relu3_4
I0712 10:26:14.653775 1784 net.cpp:84] Creating Layer relu3_4
I0712 10:26:14.653780 1784 net.cpp:406] relu3_4 <- conv3_4
I0712 10:26:14.653792 1784 net.cpp:367] relu3_4 -> conv3_4 (in-place)
I0712 10:26:14.653798 1784 net.cpp:122] Setting up relu3_4
I0712 10:26:14.653806 1784 net.cpp:129] Top shape: 10 256 56 56 (8028160)
I0712 10:26:14.653812 1784 net.cpp:137] Memory required for data: 1081794560
I0712 10:26:14.653820 1784 layer_factory.hpp:77] Creating layer pool3
I0712 10:26:14.653836 1784 net.cpp:84] Creating Layer pool3
I0712 10:26:14.653844 1784 net.cpp:406] pool3 <- conv3_4
I0712 10:26:14.653849 1784 net.cpp:380] pool3 -> pool3
I0712 10:26:14.653883 1784 net.cpp:122] Setting up pool3
I0712 10:26:14.653892 1784 net.cpp:129] Top shape: 10 256 28 28 (2007040)
I0712 10:26:14.653898 1784 net.cpp:137] Memory required for data: 1089822720
I0712 10:26:14.653904 1784 layer_factory.hpp:77] Creating layer conv4_1
I0712 10:26:14.653913 1784 net.cpp:84] Creating Layer conv4_1
I0712 10:26:14.653931 1784 net.cpp:406] conv4_1 <- pool3
I0712 10:26:14.653936 1784 net.cpp:380] conv4_1 -> conv4_1
I0712 10:26:14.656430 1784 net.cpp:122] Setting up conv4_1
I0712 10:26:14.656452 1784 net.cpp:129] Top shape: 10 512 28 28 (4014080)
I0712 10:26:14.656457 1784 net.cpp:137] Memory required for data: 1105879040
I0712 10:26:14.712179 1784 layer_factory.hpp:77] Creating layer relu4_1
I0712 10:26:14.712198 1784 net.cpp:84] Creating Layer relu4_1
I0712 10:26:14.712205 1784 net.cpp:406] relu4_1 <- conv4_1
I0712 10:26:14.712213 1784 net.cpp:367] relu4_1 -> conv4_1 (in-place)
I0712 10:26:14.712222 1784 net.cpp:122] Setting up relu4_1
I0712 10:26:14.712230 1784 net.cpp:129] Top shape: 10 512 28 28 (4014080)
I0712 10:26:14.712236 1784 net.cpp:137] Memory required for data: 1121935360
I0712 10:26:14.712241 1784 layer_factory.hpp:77] Creating layer conv4_2
I0712 10:26:14.712265 1784 net.cpp:84] Creating Layer conv4_2
I0712 10:26:14.712568 1784 net.cpp:406] conv4_2 <- conv4_1
I0712 10:26:14.712579 1784 net.cpp:380] conv4_2 -> conv4_2
I0712 10:26:14.718035 1784 net.cpp:122] Setting up conv4_2
I0712 10:26:14.718060 1784 net.cpp:129] Top shape: 10 512 28 28 (4014080)
I0712 10:26:14.718065 1784 net.cpp:137] Memory required for data: 1137991680
I0712 10:26:14.718072 1784 layer_factory.hpp:77] Creating layer relu4_2
I0712 10:26:14.718081 1784 net.cpp:84] Creating Layer relu4_2
I0712 10:26:14.718086 1784 net.cpp:406] relu4_2 <- conv4_2
I0712 10:26:14.718091 1784 net.cpp:367] relu4_2 -> conv4_2 (in-place)
I0712 10:26:14.718098 1784 net.cpp:122] Setting up relu4_2
I0712 10:26:14.718104 1784 net.cpp:129] Top shape: 10 512 28 28 (4014080)
I0712 10:26:14.718111 1784 net.cpp:137] Memory required for data: 1154048000
I0712 10:26:14.718114 1784 layer_factory.hpp:77] Creating layer conv4_3
I0712 10:26:14.718122 1784 net.cpp:84] Creating Layer conv4_3
I0712 10:26:14.718128 1784 net.cpp:406] conv4_3 <- conv4_2
I0712 10:26:14.718134 1784 net.cpp:380] conv4_3 -> conv4_3
I0712 10:26:14.723202 1784 net.cpp:122] Setting up conv4_3
I0712 10:26:14.723227 1784 net.cpp:129] Top shape: 10 512 28 28 (4014080)
I0712 10:26:14.723232 1784 net.cpp:137] Memory required for data: 1170104320
I0712 10:26:14.723237 1784 layer_factory.hpp:77] Creating layer relu4_3
I0712 10:26:14.723245 1784 net.cpp:84] Creating Layer relu4_3
I0712 10:26:14.723251 1784 net.cpp:406] relu4_3 <- conv4_3
I0712 10:26:14.723263 1784 net.cpp:367] relu4_3 -> conv4_3 (in-place)
I0712 10:26:14.723269 1784 net.cpp:122] Setting up relu4_3
I0712 10:26:14.723275 1784 net.cpp:129] Top shape: 10 512 28 28 (4014080)
I0712 10:26:14.723279 1784 net.cpp:137] Memory required for data: 1186160640
I0712 10:26:14.723284 1784 layer_factory.hpp:77] Creating layer conv4_4
I0712 10:26:14.723299 1784 net.cpp:84] Creating Layer conv4_4
I0712 10:26:14.723307 1784 net.cpp:406] conv4_4 <- conv4_3
I0712 10:26:14.723314 1784 net.cpp:380] conv4_4 -> conv4_4
I0712 10:26:14.728392 1784 net.cpp:122] Setting up conv4_4
I0712 10:26:14.728430 1784 net.cpp:129] Top shape: 10 512 28 28 (4014080)
I0712 10:26:14.728435 1784 net.cpp:137] Memory required for data: 1202216960
I0712 10:26:14.728440 1784 layer_factory.hpp:77] Creating layer relu4_4
I0712 10:26:14.728447 1784 net.cpp:84] Creating Layer relu4_4
I0712 10:26:14.728469 1784 net.cpp:406] relu4_4 <- conv4_4
I0712 10:26:14.728475 1784 net.cpp:367] relu4_4 -> conv4_4 (in-place)
I0712 10:26:14.728482 1784 net.cpp:122] Setting up relu4_4
I0712 10:26:14.728489 1784 net.cpp:129] Top shape: 10 512 28 28 (4014080)
I0712 10:26:14.728494 1784 net.cpp:137] Memory required for data: 1218273280
I0712 10:26:14.728498 1784 layer_factory.hpp:77] Creating layer pool4
I0712 10:26:14.728505 1784 net.cpp:84] Creating Layer pool4
I0712 10:26:14.728510 1784 net.cpp:406] pool4 <- conv4_4
I0712 10:26:14.728515 1784 net.cpp:380] pool4 -> pool4
I0712 10:26:14.728554 1784 net.cpp:122] Setting up pool4
I0712 10:26:14.728564 1784 net.cpp:129] Top shape: 10 512 14 14 (1003520)
I0712 10:26:14.728567 1784 net.cpp:137] Memory required for data: 1222287360
I0712 10:26:14.728571 1784 layer_factory.hpp:77] Creating layer conv5_1
I0712 10:26:14.728580 1784 net.cpp:84] Creating Layer conv5_1
I0712 10:26:14.728587 1784 net.cpp:406] conv5_1 <- pool4
I0712 10:26:14.728605 1784 net.cpp:380] conv5_1 -> conv5_1
I0712 10:26:14.733520 1784 net.cpp:122] Setting up conv5_1
I0712 10:26:14.733542 1784 net.cpp:129] Top shape: 10 512 14 14 (1003520)
I0712 10:26:14.733547 1784 net.cpp:137] Memory required for data: 1226301440
I0712 10:26:14.733553 1784 layer_factory.hpp:77] Creating layer relu5_1
I0712 10:26:14.733561 1784 net.cpp:84] Creating Layer relu5_1
I0712 10:26:14.733567 1784 net.cpp:406] relu5_1 <- conv5_1
I0712 10:26:14.733578 1784 net.cpp:367] relu5_1 -> conv5_1 (in-place)
I0712 10:26:14.733588 1784 net.cpp:122] Setting up relu5_1
I0712 10:26:14.733593 1784 net.cpp:129] Top shape: 10 512 14 14 (1003520)
I0712 10:26:14.733598 1784 net.cpp:137] Memory required for data: 1230315520
I0712 10:26:14.733603 1784 layer_factory.hpp:77] Creating layer conv5_2
I0712 10:26:14.733609 1784 net.cpp:84] Creating Layer conv5_2
I0712 10:26:14.733613 1784 net.cpp:406] conv5_2 <- conv5_1
I0712 10:26:14.733619 1784 net.cpp:380] conv5_2 -> conv5_2
I0712 10:26:14.738690 1784 net.cpp:122] Setting up conv5_2
I0712 10:26:14.738713 1784 net.cpp:129] Top shape: 10 512 14 14 (1003520)
I0712 10:26:14.738718 1784 net.cpp:137] Memory required for data: 1234329600
I0712 10:26:14.738724 1784 layer_factory.hpp:77] Creating layer relu5_2
I0712 10:26:14.738734 1784 net.cpp:84] Creating Layer relu5_2
I0712 10:26:14.738739 1784 net.cpp:406] relu5_2 <- conv5_2
I0712 10:26:14.738744 1784 net.cpp:367] relu5_2 -> conv5_2 (in-place)
I0712 10:26:14.738750 1784 net.cpp:122] Setting up relu5_2
I0712 10:26:14.738756 1784 net.cpp:129] Top shape: 10 512 14 14 (1003520)
I0712 10:26:14.738760 1784 net.cpp:137] Memory required for data: 1238343680
I0712 10:26:14.738765 1784 layer_factory.hpp:77] Creating layer conv5_3
I0712 10:26:14.738771 1784 net.cpp:84] Creating Layer conv5_3
I0712 10:26:14.738780 1784 net.cpp:406] conv5_3 <- conv5_2
I0712 10:26:14.738786 1784 net.cpp:380] conv5_3 -> conv5_3
I0712 10:26:14.743741 1784 net.cpp:122] Setting up conv5_3
I0712 10:26:14.743763 1784 net.cpp:129] Top shape: 10 512 14 14 (1003520)
I0712 10:26:14.743767 1784 net.cpp:137] Memory required for data: 1242357760
I0712 10:26:14.743774 1784 layer_factory.hpp:77] Creating layer relu5_3
I0712 10:26:14.743782 1784 net.cpp:84] Creating Layer relu5_3
I0712 10:26:14.743786 1784 net.cpp:406] relu5_3 <- conv5_3
I0712 10:26:14.743793 1784 net.cpp:367] relu5_3 -> conv5_3 (in-place)
I0712 10:26:14.743798 1784 net.cpp:122] Setting up relu5_3
I0712 10:26:14.743804 1784 net.cpp:129] Top shape: 10 512 14 14 (1003520)
I0712 10:26:14.743809 1784 net.cpp:137] Memory required for data: 1246371840
I0712 10:26:14.743814 1784 layer_factory.hpp:77] Creating layer conv5_4
I0712 10:26:14.743820 1784 net.cpp:84] Creating Layer conv5_4
I0712 10:26:14.743830 1784 net.cpp:406] conv5_4 <- conv5_3
I0712 10:26:14.743835 1784 net.cpp:380] conv5_4 -> conv5_4
I0712 10:26:14.748875 1784 net.cpp:122] Setting up conv5_4
I0712 10:26:14.748899 1784 net.cpp:129] Top shape: 10 512 14 14 (1003520)
I0712 10:26:14.748904 1784 net.cpp:137] Memory required for data: 1250385920
I0712 10:26:14.748910 1784 layer_factory.hpp:77] Creating layer relu5_4
I0712 10:26:14.748917 1784 net.cpp:84] Creating Layer relu5_4
I0712 10:26:14.748944 1784 net.cpp:406] relu5_4 <- conv5_4
I0712 10:26:14.748950 1784 net.cpp:367] relu5_4 -> conv5_4 (in-place)
I0712 10:26:14.748957 1784 net.cpp:122] Setting up relu5_4
I0712 10:26:14.748963 1784 net.cpp:129] Top shape: 10 512 14 14 (1003520)
I0712 10:26:14.748967 1784 net.cpp:137] Memory required for data: 1254400000
I0712 10:26:14.748971 1784 layer_factory.hpp:77] Creating layer pool5
I0712 10:26:14.748980 1784 net.cpp:84] Creating Layer pool5
I0712 10:26:14.748983 1784 net.cpp:406] pool5 <- conv5_4
I0712 10:26:14.748988 1784 net.cpp:380] pool5 -> pool5
I0712 10:26:14.749023 1784 net.cpp:122] Setting up pool5
I0712 10:26:14.749033 1784 net.cpp:129] Top shape: 10 512 7 7 (250880)
I0712 10:26:14.749037 1784 net.cpp:137] Memory required for data: 1255403520
I0712 10:26:14.749042 1784 layer_factory.hpp:77] Creating layer fc6
I0712 10:26:14.749049 1784 net.cpp:84] Creating Layer fc6
I0712 10:26:14.749056 1784 net.cpp:406] fc6 <- pool5
I0712 10:26:14.749071 1784 net.cpp:380] fc6 -> fc6
I0712 10:26:15.738207 1784 net.cpp:122] Setting up fc6
I0712 10:26:15.738247 1784 net.cpp:129] Top shape: 10 4096 (40960)
I0712 10:26:15.738252 1784 net.cpp:137] Memory required for data: 1255567360
I0712 10:26:15.738266 1784 layer_factory.hpp:77] Creating layer relu6
I0712 10:26:15.738281 1784 net.cpp:84] Creating Layer relu6
I0712 10:26:15.738288 1784 net.cpp:406] relu6 <- fc6
I0712 10:26:15.738296 1784 net.cpp:367] relu6 -> fc6 (in-place)
I0712 10:26:15.738306 1784 net.cpp:122] Setting up relu6
I0712 10:26:15.738314 1784 net.cpp:129] Top shape: 10 4096 (40960)
I0712 10:26:15.738319 1784 net.cpp:137] Memory required for data: 1255731200
I0712 10:26:15.738323 1784 layer_factory.hpp:77] Creating layer drop6
I0712 10:26:15.738330 1784 net.cpp:84] Creating Layer drop6
I0712 10:26:15.738337 1784 net.cpp:406] drop6 <- fc6
I0712 10:26:15.738343 1784 net.cpp:367] drop6 -> fc6 (in-place)
I0712 10:26:15.738368 1784 net.cpp:122] Setting up drop6
I0712 10:26:15.738376 1784 net.cpp:129] Top shape: 10 4096 (40960)
I0712 10:26:15.738380 1784 net.cpp:137] Memory required for data: 1255895040
I0712 10:26:15.738384 1784 layer_factory.hpp:77] Creating layer fc7
I0712 10:26:15.738392 1784 net.cpp:84] Creating Layer fc7
I0712 10:26:15.738399 1784 net.cpp:406] fc7 <- fc6
I0712 10:26:15.738404 1784 net.cpp:380] fc7 -> fc7
I0712 10:26:15.779806 1784 net.cpp:122] Setting up fc7
I0712 10:26:15.779850 1784 net.cpp:129] Top shape: 10 4096 (40960)
I0712 10:26:15.779855 1784 net.cpp:137] Memory required for data: 1256058880
I0712 10:26:15.779865 1784 layer_factory.hpp:77] Creating layer relu7
I0712 10:26:15.779878 1784 net.cpp:84] Creating Layer relu7
I0712 10:26:15.779886 1784 net.cpp:406] relu7 <- fc7
I0712 10:26:15.779902 1784 net.cpp:367] relu7 -> fc7 (in-place)
I0712 10:26:15.779934 1784 net.cpp:122] Setting up relu7
I0712 10:26:15.779943 1784 net.cpp:129] Top shape: 10 4096 (40960)
I0712 10:26:15.779947 1784 net.cpp:137] Memory required for data: 1256222720
I0712 10:26:15.779953 1784 layer_factory.hpp:77] Creating layer drop7
I0712 10:26:15.779999 1784 net.cpp:84] Creating Layer drop7
I0712 10:26:15.780005 1784 net.cpp:406] drop7 <- fc7
I0712 10:26:15.780011 1784 net.cpp:367] drop7 -> fc7 (in-place)
I0712 10:26:15.780059 1784 net.cpp:122] Setting up drop7
I0712 10:26:15.780067 1784 net.cpp:129] Top shape: 10 4096 (40960)
I0712 10:26:15.780071 1784 net.cpp:137] Memory required for data: 1256386560
I0712 10:26:15.780076 1784 layer_factory.hpp:77] Creating layer fc8
I0712 10:26:15.780082 1784 net.cpp:84] Creating Layer fc8
I0712 10:26:15.780089 1784 net.cpp:406] fc8 <- fc7
I0712 10:26:15.780095 1784 net.cpp:380] fc8 -> fc8
I0712 10:26:15.789294 1784 net.cpp:122] Setting up fc8
I0712 10:26:15.789330 1784 net.cpp:129] Top shape: 10 1000 (10000)
I0712 10:26:15.789335 1784 net.cpp:137] Memory required for data: 1256426560
I0712 10:26:15.789343 1784 layer_factory.hpp:77] Creating layer prob
I0712 10:26:15.789353 1784 net.cpp:84] Creating Layer prob
I0712 10:26:15.789361 1784 net.cpp:406] prob <- fc8
I0712 10:26:15.789367 1784 net.cpp:380] prob -> prob
I0712 10:26:15.789456 1784 net.cpp:122] Setting up prob
I0712 10:26:15.789465 1784 net.cpp:129] Top shape: 10 1000 (10000)
I0712 10:26:15.789469 1784 net.cpp:137] Memory required for data: 1256466560
I0712 10:26:15.789474 1784 net.cpp:200] prob does not need backward computation.
I0712 10:26:15.789487 1784 net.cpp:200] fc8 does not need backward computation.
I0712 10:26:15.789494 1784 net.cpp:200] drop7 does not need backward computation.
I0712 10:26:15.789499 1784 net.cpp:200] relu7 does not need backward computation.
I0712 10:26:15.789503 1784 net.cpp:200] fc7 does not need backward computation.
I0712 10:26:15.789507 1784 net.cpp:200] drop6 does not need backward computation.
I0712 10:26:15.789512 1784 net.cpp:200] relu6 does not need backward computation.
I0712 10:26:15.789516 1784 net.cpp:200] fc6 does not need backward computation.
I0712 10:26:15.789521 1784 net.cpp:200] pool5 does not need backward computation.
I0712 10:26:15.789526 1784 net.cpp:200] relu5_4 does not need backward computation.
I0712 10:26:15.789531 1784 net.cpp:200] conv5_4 does not need backward computation.
I0712 10:26:15.789535 1784 net.cpp:200] relu5_3 does not need backward computation.
I0712 10:26:15.789541 1784 net.cpp:200] conv5_3 does not need backward computation.
I0712 10:26:15.789544 1784 net.cpp:200] relu5_2 does not need backward computation.
I0712 10:26:15.789548 1784 net.cpp:200] conv5_2 does not need backward computation.
I0712 10:26:15.789553 1784 net.cpp:200] relu5_1 does not need backward computation.
I0712 10:26:15.789557 1784 net.cpp:200] conv5_1 does not need backward computation.
I0712 10:26:15.789562 1784 net.cpp:200] pool4 does not need backward computation.
I0712 10:26:15.789567 1784 net.cpp:200] relu4_4 does not need backward computation.
I0712 10:26:15.789572 1784 net.cpp:200] conv4_4 does not need backward computation.
I0712 10:26:15.789577 1784 net.cpp:200] relu4_3 does not need backward computation.
I0712 10:26:15.789582 1784 net.cpp:200] conv4_3 does not need backward computation.
I0712 10:26:15.789587 1784 net.cpp:200] relu4_2 does not need backward computation.
I0712 10:26:15.789590 1784 net.cpp:200] conv4_2 does not need backward computation.
I0712 10:26:15.789595 1784 net.cpp:200] relu4_1 does not need backward computation.
I0712 10:26:15.789599 1784 net.cpp:200] conv4_1 does not need backward computation.
I0712 10:26:15.789604 1784 net.cpp:200] pool3 does not need backward computation.
I0712 10:26:15.789608 1784 net.cpp:200] relu3_4 does not need backward computation.
I0712 10:26:15.789613 1784 net.cpp:200] conv3_4 does not need backward computation.
I0712 10:26:15.789618 1784 net.cpp:200] relu3_3 does not need backward computation.
I0712 10:26:15.789623 1784 net.cpp:200] conv3_3 does not need backward computation.
I0712 10:26:15.789628 1784 net.cpp:200] relu3_2 does not need backward computation.
I0712 10:26:15.789631 1784 net.cpp:200] conv3_2 does not need backward computation.
I0712 10:26:15.789636 1784 net.cpp:200] relu3_1 does not need backward computation.
I0712 10:26:15.789640 1784 net.cpp:200] conv3_1 does not need backward computation.
I0712 10:26:15.789645 1784 net.cpp:200] pool2 does not need backward computation.
I0712 10:26:15.789650 1784 net.cpp:200] relu2_2 does not need backward computation.
I0712 10:26:15.789654 1784 net.cpp:200] conv2_2 does not need backward computation.
I0712 10:26:15.789659 1784 net.cpp:200] relu2_1 does not need backward computation.
I0712 10:26:15.789664 1784 net.cpp:200] conv2_1 does not need backward computation.
I0712 10:26:15.789669 1784 net.cpp:200] pool1 does not need backward computation.
I0712 10:26:15.789672 1784 net.cpp:200] relu1_2 does not need backward computation.
I0712 10:26:15.789676 1784 net.cpp:200] conv1_2 does not need backward computation.
I0712 10:26:15.789681 1784 net.cpp:200] relu1_1 does not need backward computation.
I0712 10:26:15.789686 1784 net.cpp:200] conv1_1 does not need backward computation.
I0712 10:26:15.789691 1784 net.cpp:200] input does not need backward computation.
I0712 10:26:15.789700 1784 net.cpp:242] This network produces output prob
I0712 10:26:15.789737 1784 net.cpp:255] Network initialization done.
[libprotobuf WARNING google/protobuf/io/coded_stream.cc:604] Reading dangerously large protocol message. If the message turns out to be larger than 2147483647 bytes, parsing will be halted for security reasons. To increase the limit (or to disable these warnings), see CodedInputStream::SetTotalBytesLimit() in google/protobuf/io/coded_stream.h.
[libprotobuf WARNING google/protobuf/io/coded_stream.cc:81] The total number of bytes read was 574671192
I0712 10:26:16.366539 1784 upgrade_proto.cpp:53] Attempting to upgrade input file specified using deprecated V1LayerParameter: ./models/VGG_ILSVRC_19_layers.caffemodel
I0712 10:26:17.345669 1784 upgrade_proto.cpp:61] Successfully upgraded file specified using deprecated V1LayerParameter
I0712 10:26:17.361883 1784 upgrade_proto.cpp:67] Attempting to upgrade input file specified using deprecated input fields: ./models/VGG_ILSVRC_19_layers.caffemodel
I0712 10:26:17.361909 1784 upgrade_proto.cpp:70] Successfully upgraded file specified using deprecated input fields.
W0712 10:26:17.361914 1784 upgrade_proto.cpp:72] Note that future Caffe releases will only support input layers and not input fields.
2020-07-12 10:26:17.501478: Starting ./img/examples/gasson__starry-w227-227-1
2020-07-12 10:26:17.501643: Setting up content targets
F0712 10:26:18.383313 1784 inner_product_layer.cpp:64] Check failed: K == new_K (25088 vs. 16384) Input size incompatible with inner product parameters.
*** Check failure stack trace: ***

Could someone explain why I am getting this error?
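For context on what the check is comparing: VGG-19's fc6 weights are shaped for the flattened pool5 output of a 224×224 input (K = 512 × 7 × 7 = 25088), while new_K is the flattened pool5 size produced by the image actually fed in. The sketch below reproduces that arithmetic under the standard VGG-19 geometry (five 2×2/stride-2 pools, 512 channels after conv5); the 128×256 case is only one hypothetical input shape that yields 16384 — the point is that any non-224×224 input changes new_K and trips this check.

```python
# Sketch (assumes standard VGG-19 geometry): flattened pool5 size that
# the fc6 inner-product layer sees for an h x w input.

def vgg19_fc6_input_size(h, w):
    """Return 512 * (h // 32) * (w // 32): five max-pools each halve the map."""
    for _ in range(5):  # five 2x2/stride-2 pooling layers
        h //= 2
        w //= 2
    return 512 * h * w  # channels * height * width after pool5

print(vgg19_fc6_input_size(224, 224))  # 25088 -- the K baked into the weights
print(vgg19_fc6_input_size(128, 256))  # 16384 -- one geometry matching new_K
```

In other words, the fully connected layers pin the network to a fixed input size; a style-transfer script that only uses the conv layers can avoid fc6 entirely, but loading the full model and pushing a differently sized blob through it triggers exactly this failure.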
