
dory's Introduction

DORY: Deployment ORiented to memorY

DORY is an automatic tool for deploying DNNs on low-cost MCUs, which typically have less than 1 MB of on-chip SRAM memory.

Reference

If you use the DORY tool to deploy your models, please make sure to cite our paper: https://ieeexplore.ieee.org/document/9381618 (preprint available also at https://arxiv.org/abs/2008.07127)

@article{burrello2020dory,
  author={A. {Burrello} and A. {Garofalo} and N. {Bruschi} and G. {Tagliavini} and D. {Rossi} and F. {Conti}},
  journal={IEEE Transactions on Computers}, 
  title={DORY: Automatic End-to-End Deployment of Real-World DNNs on Low-Cost IoT MCUs}, 
  year={2021},
  volume={},
  number={},
  pages={1-1},
  doi={10.1109/TC.2021.3066883}
}

Highlights

DORY abstracts tiling as a Constraint Programming (CP) problem: it maximizes L1 memory utilization under the topological constraints imposed by each DNN layer. It then generates ANSI C code to orchestrate the off- and on-chip transfers and the computation phases. Layer tiling is depicted in Fig.1.
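As an illustration of the constraint problem (not DORY's actual solver, which uses the OR-Tools CP solver and models many more hardware constraints), a brute-force sketch for a single convolutional layer might look like this; the buffer model and all names are simplifications:

```python
# Illustrative brute-force version of the tiling constraint problem for
# one convolutional layer (DORY's real solver uses OR-Tools CP and models
# many more constraints; the buffer model here is a simplification).

def best_tile(h, w, c_in, c_out, ks, l1_budget):
    """Pick (tile_h, tile_w, tile_c_out) maximizing L1 utilization.

    L1 bytes for one tile set (8-bit data, single-buffered):
      input  : (tile_h + ks - 1) * (tile_w + ks - 1) * c_in
      weights: ks * ks * c_in * tile_c_out
      output : tile_h * tile_w * tile_c_out
    """
    best, best_mem = None, -1
    for th in range(1, h + 1):
        for tw in range(1, w + 1):
            for tc in range(1, c_out + 1):
                mem = ((th + ks - 1) * (tw + ks - 1) * c_in
                       + ks * ks * c_in * tc
                       + th * tw * tc)
                if best_mem < mem <= l1_budget:
                    best, best_mem = (th, tw, tc), mem
    return best, best_mem

tile, mem = best_tile(h=16, w=16, c_in=32, c_out=64, ks=3, l1_budget=44000)
print(tile, mem)
```

DORY formulates the same maximization declaratively as CP constraints rather than enumerating candidates, which scales to the full set of layer geometries.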


Fig.1 DORY L3-L2-L1 layer routine example. On the left, the I/O DMA copies the weight tiles when only Cy is L3-tiled; two different buffers are used for L2w. Then, the Cluster DMA manages the L2-L1 communication using double-buffering, while the cores compute a kernel on the current tile stored in one of the two L1 buffers.
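The double-buffered L2-L1 pipeline in the caption can be sketched as a toy simulation (hypothetical names; the real generated C code issues non-blocking Cluster DMA calls and waits on transfer IDs):

```python
# Toy simulation of the double-buffered L2-L1 pipeline: while the cores
# compute on one L1 buffer, the DMA fills the other with the next tile.
# (Hypothetical names; the real code uses non-blocking Cluster DMA calls.)

def run_layer(tiles):
    buffers = [None, None]
    outputs = []
    buffers[0] = f"dma_in({tiles[0]})"             # preload the first tile
    for i, tile in enumerate(tiles):
        cur, nxt = i % 2, (i + 1) % 2
        if i + 1 < len(tiles):                     # prefetch the next tile
            buffers[nxt] = f"dma_in({tiles[i + 1]})"
        outputs.append(f"kernel({buffers[cur]})")  # compute the current tile
    return outputs

print(run_layer(["t0", "t1", "t2"]))
```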

Platforms Supported

The currently supported platforms are GAP8, the Diana chip, and the Occamy chip (under maintenance). Each backend offers different options:

  • GAP8_board: inserts limitations for dory calls, which are blocking on the physical board.

  • GAP8_board_L2: as GAP8_board, but additionally removes the use of L3 memory.

  • GAP8_gvsoc: no limitations.

  • Diana_SoC: generates the whole executable application.

  • Diana_TVM: only generates the strings to be passed to TVM.

Limitations

The DORY framework is currently tested on feed-forward networks with single-wire residual connections. The input ONNX files are produced by NEMO or Quantlab.

To use GAP8 v1 or v2 boards, you have to set the "v2" chip flag in the DORY parameters. Furthermore, you have to flash the weights manually using the old pulpbridge.

Supported layer types

  • Pointwise Convolution (+ BatchNorm + Relu)
  • DepthWise Convolution (+ BatchNorm + Relu)
  • Convolution (+ BatchNorm + Relu)
  • Max Pooling (+ BatchNorm)
  • Average Pooling (+ BatchNorm)
  • Add (+ BatchNorm + Relu)
  • Linear Layer (+ BatchNorm + Relu)
  • Linear Layer with 32-bit output (final layer)

All layers are implemented for 8-bit integers, and also in mixed precision (2-, 4-, and 8-bit). Each specific layer is recognized by the frontend by searching for specific patterns in the .onnx graph.

Quantlab Frontend

  • Nodes that are accepted by DORY:

'Conv', 'Pad', 'Mul', 'Add', 'Div', 'Constant', 'AveragePool', 'GlobalAveragePool', 'MaxPool', 'Cast', 'Clip', 'Floor', 'Flatten', 'Gemm', 'MatMul', 'Shape', 'Gather', 'Unsqueeze', 'Concat', 'Reshape', 'Sigmoid', 'LogSoftmax'

  • Nodes that are accepted but neglected by DORY (their functionality is subsumed by other nodes; e.g., the output of a Conv is automatically flattened before a fully-connected layer):

'Cast', 'Floor', 'Flatten', 'Shape', 'Gather', 'Unsqueeze', 'Concat', 'Reshape', 'Sigmoid', 'LogSoftmax'

  • Nodes that are not merged and become individual nodes in the DORY graph:

'AveragePool', 'MaxPool', 'Conv', 'Gemm', 'MatMul', 'GlobalAveragePool', 'Add'

  • Rules that DORY searches for in the graph:

'Relu' = 'Mul-Div-Floor-Clip'
'BNRelu' = 'Mul-Add-Div-Floor-Clip'
'Pad' = 'Pad'

These node sequences are searched for as consecutive nodes in the ONNX graph.
BNRelu and Relu are always merged into the previous node of the DORY graph; Pad is always merged into the subsequent node.
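The matching-and-merging described above can be sketched on a linearized op sequence (a simplification; the real frontend walks the ONNX graph):

```python
# Sketch of the pattern rules above: scan a linear sequence of ONNX op
# types for the consecutive-node rules and collapse each match into the
# neighboring DORY node (simplified; the real frontend works on the graph).

RULES = {
    ('Mul', 'Add', 'Div', 'Floor', 'Clip'): 'BNRelu',
    ('Mul', 'Div', 'Floor', 'Clip'): 'Relu',
}

def fuse(ops):
    out, i = [], 0
    while i < len(ops):
        for pattern, name in RULES.items():
            if tuple(ops[i:i + len(pattern)]) == pattern:
                # BNRelu/Relu merge into the previous DORY node
                if out:
                    out[-1] += '+' + name
                else:
                    out.append(name)
                i += len(pattern)
                break
        else:
            if ops[i] == 'Pad' and i + 1 < len(ops):
                # Pad merges into the subsequent node
                out.append('Pad+' + ops[i + 1])
                i += 2
            else:
                out.append(ops[i])
                i += 1
    return out

print(fuse(['Pad', 'Conv', 'Mul', 'Add', 'Div', 'Floor', 'Clip', 'MaxPool']))
```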

Current Issues

  • 1D mixed-precision networks are not supported; the 2D mixed-precision kernels are used instead.

Topologies tested

  • MobilenetV1-128
  • Custom networks
  • MobilenetV1-224, 4-8 bits
  • MobilenetV2-224, 4-8 bits
  • MobilenetV1-224, 8 bits
  • MobilenetV2-224, 8 bits
  • Residual networks

Requirements

Backend

The DORY framework can be tested using the gvsoc of GAP8 from GreenWaves. A detailed guide on installing and setting up the latest version can be found at link. The DORY tool is tested with release 3.6 of the gap_sdk, commit c6494b97314470446674bb468d31e4391fb187e9.

Python

The framework has been developed using python 3.6.8. The following packages are needed:

  • Mako (1.0.12)
  • numpy (1.18.4)
  • onnx (1.10.0)
  • ortools (7.5.7466)

Input

The framework receives as input:

  1. an ONNX quantized network generated with the Nemo tool (refer to nemo for Nemo framework installation and execution), or
  2. an ONNX quantized network generated with the Quantlab tool.
    Note that only the standard 8-bit quantized format produced by NEMO/Quantlab can be read, given the specific node sequences recognized by DORY.
    Examples are given inside DORY examples.

Installation

The execution of DORY for 8-bit networks requires the following folders:

  1. dory: repository with the framework
  2. pulp-nn: repository with backend kernels developed for DORY flow execution

Execute the following commands to clone DORY and pulp-nn backend:

git clone https://github.com/pulp-platform/dory
cd dory
git submodule update --remote --init dory/dory_examples
git submodule update --remote --init dory/Hardware_targets/PULP/Backend_Kernels/pulp-nn
git submodule update --remote --init dory/Hardware_targets/PULP/Backend_Kernels/pulp-nn-mixed
python3 -m pip install -e .

Examples

To download the examples built on DORY, clone the internal dory_examples submodule (it should already have been downloaded automatically in the previous step). Then, you can run one example from the library with the following command:

python3 network_generate.py NEMO PULP.PULP_gvsoc ./dory/dory_examples/config_files/config_NEMO_MV1.json --app_dir ./application/

Here NEMO is the frontend used, PULP.PULP_gvsoc the backend (supporting GAP8), and ./dory/dory_examples/config_files/config_NEMO_MV1.json the config file. Note that all the intermediate .json and .onnx files are generated in the logs/ folder.

The power profiling on a GAP8 v3 of a 1.0-MobilenetV1-128 is reported in Fig.2.


Fig.2 On the left, the 1.0-MobileNet-128 power profile when running on GAP-8 at fc = cluster = 100 MHz and VDD = 1 V. On the right, the number of MAC operations, average power, and execution time for each layer of the network. Power was sampled at 64 kHz and then filtered with a 300-microsecond moving average.

Building and Using the Dockerfile

To build the docker image (mostly used to debug the CI), run the script in the docker_utils folder:

./docker_utils/build_docker.sh $IMAGE_NAME

$IMAGE_NAME is an optional argument specifying the target image name. If none is provided, the resulting image will be called "dory_docker".

Likewise, to run a docker container from the built image, run:

./docker_utils/run_docker.sh $IMAGE_NAME

Once you are in the docker container shell, you can use the setup scripts to set up your environment for pulp-sdk or gap-sdk:

source ./docker_utils/docker_pulp_sdk.sh

or

source ./docker_utils/docker_gap_sdk.sh

Now you can use DORY in the docker image as you wish, e.g., to run the tests:

python3 -m pytest test_GAP8.py --compat pulp-sdk

To mount additional folders, edit the run_docker.sh script and add more -v options.

Contributors

  • Alessio Burrello, University of Bologna, email
  • Francesco Conti, University of Bologna, email
  • Luka Macan, University of Bologna, email
  • Georg Ruetishauer, ETH Zurich, email
  • Thorir Mar Ingolfsson, ETH Zurich, email
  • Angelo Garofalo, University of Bologna, email
  • Nazareno Bruschi, University of Bologna, email
  • Giuseppe Tagliavini, University of Bologna, email
  • Davide Rossi, University of Bologna, email
  • Luca Benini, University of Bologna and ETH Zurich, email

License

DORY is released under Apache 2.0, see the LICENSE file in the root of this repository for details.


dory's Issues

Cannot recursively clone dory

There are private git submodules in dory which makes it impossible for others to recursively clone the repo. Could you please move these private modules outside dory? This would be very convenient.

Question about the width of Bn_Relu_Bits

Hi!
I noticed that PenguiNet has two versions with different widths of Bn_Relu_Bits: one is 32 and the other is 64.

I wonder what the difference between them is, and whether there is anything to watch for when generating an ONNX quantized network with the Nemo tool if I want Bn_Relu_Bits to be 32 instead of 64.

When I use DORY to deploy the yolo-v3 tiny backbone (a purely feed-forward network), the checksum fails if Bn_Relu_Bits is 32 and passes if it is 64. Is there something wrong with my quantization process?

Best regards,
Zhu Yuqing

Diana_TVM: element-wise sum output transfers are not tiled?

@ABurrello for Diana_TVM I noticed while going through both the convolution and the addition template that the output/last DMA call in the tiling loop does not seem to be tiled for the addition (it is tiled for the convolution), and as such it seems to me that it will always write to the same memory address in the output.

I.e. for the final DMA call, there's no dory_get_tile_3d() and l2_y is set directly in the DMA call field ( DMA_copy_y.ext = l2_y; instead of DMA_copy_y.ext = l2_y_tile;).
Is it possible that this will result in an error if the tensor addition gets tiled?

Can you double check this please? Thanks!
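For reference, the tile address arithmetic at stake can be sketched in Python (a simplified, hypothetical stand-in for dory_get_tile_3d(); the HWC layout and all names are assumptions). If the output DMA always uses the base l2_y instead of a per-tile address, every transfer targets the same location:

```python
# Simplified stand-in for DORY's dory_get_tile_3d(): compute the L2
# address of output tile (i, j, k) of size tile_h x tile_w x tile_c in an
# H x W x C feature map laid out HWC with 8-bit elements.
# (Layout and names are assumptions, not DORY's exact implementation.)

def tile_addr(base, i, j, k, tile_h, tile_w, tile_c, w, c):
    return base + ((i * tile_h) * w + (j * tile_w)) * c + k * tile_c

l2_y = 0x1C010000
# Correct behavior: each tile writes to its own offset.
addrs = [tile_addr(l2_y, i, 0, 0, tile_h=8, tile_w=16, tile_c=32, w=16, c=32)
         for i in range(4)]
print([hex(a) for a in addrs])
# The bug described above amounts to using `l2_y` for every tile, so all
# four transfers would target the same address and overwrite each other.
```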

Dory support for Bias add in linear layer?

Hi,
I pushed this simple NN through the Nemo quantization process with no problem:
[screenshot of the network]

and Nemo output this onnx file:
[screenshot of the ONNX graph]

However, when I tried to push this through the Dory process, I got an error:
DORY Frontend Check. Node Add is not accepted inside the DORY Frontend IR.
caused by the bias add after the matmul layer. I checked the code, and it is this line:

if self.name == 'Addition' and len(self.input_indexes) == 1:

that prevents the Add layer from becoming an "Addition" layer.

The other issue #41 mentioned that the pattern might not be supported, but the linear layer is said to be supported in the README. Could you help me identify the issue here? I am wondering if I could make a small change to make this work.

I am working on a publication for a top-tier journal and will cite these works in my work. Thank you so much for the help!

dory$ git submodule update --init --recursive

uie24011@ozd1161u:~/Desktop/SDK/pulp-sdk/dory$ git submodule update --init --recursive
Cloning into '/home/uie24011/Desktop/SDK/pulp-sdk/dory/pulp-nn-1d'...
Username for 'https://github.com': nagatejapallapu
Password for 'https://[email protected]':
remote: Support for password authentication was removed on August 13, 2021. Please use a personal access token instead.
remote: Please see https://github.blog/2020-12-15-token-authentication-requirements-for-git-operations/ for more information.
fatal: Authentication failed for 'https://github.com/pulp-platform/pulp-nn-1d/'
fatal: clone of 'https://github.com/pulp-platform/pulp-nn-1d' into submodule path '/home/uie24011/Desktop/SDK/pulp-sdk/dory/pulp-nn-1d' failed
Failed to clone 'pulp-nn-1d'. Retry scheduled

Modifications about Dory

Hi,
I noticed that you made some changes to the Dory repository and rebuilt a new repository for dory_example.
Will this improve the efficiency on the chip? By the way, I tried to deploy pulp-dronet-V2 to GAP8 with Dory. Will the modifications to Dory have any effect on the deployment of pulp-dronet-V2?
Regards,
Mathilda.

`05_DORY_Frontend_final_graph.onnx` seems to be the same as `Original_graph.onnx`

Dear dory developers,

I am using dory to generate my own dronet network, which is an integer-quantized onnx file generated by nemo (master branch). However, while dory can deal with dory_examples like dronet_complete very well, it cannot process my own network. I keep receiving the following issue:
DORY Frontend Check. Node Add is not accepted inside the DORY Frontend IR.
If I look inside the log folder, I find that the automatically generated 05_DORY_Frontend_final_graph.onnx is the same as Original_graph.onnx.
Any insights on the possible reasons for this problem? Thank you for your time!

How to get the .json as well as .txt files needed to deploy the model ?

I have successfully quantized the model with the NEMO tool and generated the .onnx file, but when generating the network with "network_generate.py", I get an error: "out_layer0.txt not found." How do I get the .txt files needed to deploy the model? Also, can the .json file be generated automatically, or do I have to fill it in manually? I would be very grateful for your reply; this is a great piece of work.
MV1-128

Failed to run dronet example

I used the example here: https://github.com/pulp-platform/dory_examples/tree/60dd66bf1d9ce81c0cdd5f70e51040a332dce8e5/examples/Nemo_examples/8-bits-2D/dronet_complete

and I ran
python3 network_generate.py NEMO PULP.PULP_gvsoc ./dory/dory_examples/config_files/config_NEMO_dronet.json

it gives this error:

Insert tiling parameters per layer inside graph nodes
RuntimeWarning: Unexpected end-group tag: Not all data was converted
return _pywrapcp.Solver_DefaultSolverParameters()
WARNING: Logging before InitGoogleLogging() is written to STDERR
F0308 13:22:36.521386 1 constraint_solver.cc:1408] Check failed: parameters.array_split_size() > 0 (0 vs. 0) Were parameters built using Solver::DefaultSolverParameters() ?
*** Check failure stack trace: ***
Abort trap: 6

Any idea what might have gone wrong? Thanks for the help!

dory_examples, make all

Hi,

In dory_examples, after running python3 network_generate.py to generate the files, I run make clean all run CORE=8 platform=gvsoc in the application directory and get the following error:

DORY_network/inc/mchan_test.h: In function 'mchan_transfer':
DORY_network/inc/mchan_test.h:105:54: error: 'MCHAN_CMD_CMD_TYPE_BIT' undeclared (first use in this function); did you mean 'MCHAN_CMD_CMD_LEN_BIT'?
  *(volatile int*) MCHAN_COMMAND_QUEUE = len | (type<<MCHAN_CMD_CMD_TYPE_BIT) | ( incr<<MCHAN_CMD_CMD_INC_BIT) | (twd_ext <<MCHAN_CMD_CMD__2D_EXT_BIT) | (ele<<MCHAN_CMD_CMD_ELE_BIT) | (ile <<MCHAN_CMD_CMD_ILE_BIT) | (ble <<MCHAN_CMD_CMD_BLE_BIT) | (twd_tcdm << MCHAN_CMD_CMD__2D_TCDM_BIT);
                                                      ^~~~~~~~~~~~~~~~~~~~~~
                                                      MCHAN_CMD_CMD_LEN_BIT
DORY_network/inc/mchan_test.h:105:54: note: each undeclared identifier is reported only once for each function it appears in
DORY_network/inc/mchan_test.h:105:88: error: 'MCHAN_CMD_CMD_INC_BIT' undeclared (first use in this function); did you mean 'MCHAN_CMD_CMD_LEN_BIT'?
  *(volatile int*) MCHAN_COMMAND_QUEUE = len | (type<<MCHAN_CMD_CMD_TYPE_BIT) | ( incr<<MCHAN_CMD_CMD_INC_BIT) | (twd_ext <<MCHAN_CMD_CMD__2D_EXT_BIT) | (ele<<MCHAN_CMD_CMD_ELE_BIT) | (ile <<MCHAN_CMD_CMD_ILE_BIT) | (ble <<MCHAN_CMD_CMD_BLE_BIT) | (twd_tcdm << MCHAN_CMD_CMD__2D_TCDM_BIT);
                                                                                        ^~~~~~~~~~~~~~~~~~~~~
                                                                                        MCHAN_CMD_CMD_LEN_BIT
DORY_network/inc/mchan_test.h:105:124: error: 'MCHAN_CMD_CMD__2D_EXT_BIT' undeclared (first use in this function); did you mean 'MCHAN_CMD_CMD_TYPE_BIT'?
  *(volatile int*) MCHAN_COMMAND_QUEUE = len | (type<<MCHAN_CMD_CMD_TYPE_BIT) | ( incr<<MCHAN_CMD_CMD_INC_BIT) | (twd_ext <<MCHAN_CMD_CMD__2D_EXT_BIT) | (ele<<MCHAN_CMD_CMD_ELE_BIT) | (ile <<MCHAN_CMD_CMD_ILE_BIT) | (ble <<MCHAN_CMD_CMD_BLE_BIT) | (twd_tcdm << MCHAN_CMD_CMD__2D_TCDM_BIT);
                                                                                                                            ^~~~~~~~~~~~~~~~~~~~~~~~~
                                                                                                                            MCHAN_CMD_CMD_TYPE_BIT
DORY_network/inc/mchan_test.h:105:159: error: 'MCHAN_CMD_CMD_ELE_BIT' undeclared (first use in this function); did you mean 'MCHAN_CMD_CMD_LEN_BIT'?
  *(volatile int*) MCHAN_COMMAND_QUEUE = len | (type<<MCHAN_CMD_CMD_TYPE_BIT) | ( incr<<MCHAN_CMD_CMD_INC_BIT) | (twd_ext <<MCHAN_CMD_CMD__2D_EXT_BIT) | (ele<<MCHAN_CMD_CMD_ELE_BIT) | (ile <<MCHAN_CMD_CMD_ILE_BIT) | (ble <<MCHAN_CMD_CMD_BLE_BIT) | (twd_tcdm << MCHAN_CMD_CMD__2D_TCDM_BIT);
                                                                                                                                                               ^~~~~~~~~~~~~~~~~~~~~
                                                                                                                                                               MCHAN_CMD_CMD_LEN_BIT
DORY_network/inc/mchan_test.h:105:191: error: 'MCHAN_CMD_CMD_ILE_BIT' undeclared (first use in this function); did you mean 'MCHAN_CMD_CMD_ELE_BIT'?
  *(volatile int*) MCHAN_COMMAND_QUEUE = len | (type<<MCHAN_CMD_CMD_TYPE_BIT) | ( incr<<MCHAN_CMD_CMD_INC_BIT) | (twd_ext <<MCHAN_CMD_CMD__2D_EXT_BIT) | (ele<<MCHAN_CMD_CMD_ELE_BIT) | (ile <<MCHAN_CMD_CMD_ILE_BIT) | (ble <<MCHAN_CMD_CMD_BLE_BIT) | (twd_tcdm << MCHAN_CMD_CMD__2D_TCDM_BIT);
                                                                                                                                                                                               ^~~~~~~~~~~~~~~~~~~~~
                                                                                                                                                                                               MCHAN_CMD_CMD_ELE_BIT
DORY_network/inc/mchan_test.h:105:223: error: 'MCHAN_CMD_CMD_BLE_BIT' undeclared (first use in this function); did you mean 'MCHAN_CMD_CMD_ILE_BIT'?
 ND_QUEUE = len | (type<<MCHAN_CMD_CMD_TYPE_BIT) | ( incr<<MCHAN_CMD_CMD_INC_BIT) | (twd_ext <<MCHAN_CMD_CMD__2D_EXT_BIT) | (ele<<MCHAN_CMD_CMD_ELE_BIT) | (ile <<MCHAN_CMD_CMD_ILE_BIT) | (ble <<MCHAN_CMD_CMD_BLE_BIT) | (twd_tcdm << MCHAN_CMD_CMD__2D_TCDM_BIT);
                                                                                                                                                                                                  ^~~~~~~~~~~~~~~~~~~~~
                                                                                                                                                                                                                               MCHAN_CMD_CMD_ILE_BIT
In file included from DORY_network/inc/dory.h:20:0,
                 from DORY_network/src/network.c:22:
DORY_network/inc/mchan_test.h:105:261: error: 'MCHAN_CMD_CMD__2D_TCDM_BIT' undeclared (first use in this function); did you mean 'MCHAN_CMD_CMD__2D_EXT_BIT'?
 TYPE_BIT) | ( incr<<MCHAN_CMD_CMD_INC_BIT) | (twd_ext <<MCHAN_CMD_CMD__2D_EXT_BIT) | (ele<<MCHAN_CMD_CMD_ELE_BIT) | (ile <<MCHAN_CMD_CMD_ILE_BIT) | (ble <<MCHAN_CMD_CMD_BLE_BIT) | (twd_tcdm << MCHAN_CMD_CMD__2D_TCDM_BIT);
                                                                                                                                                                                                  ^~~~~~~~~~~~~~~~~~~~~~~~~~
                                                                                                                                                                                                                                                                     MCHAN_CMD_CMD__2D_EXT_BIT
/home/kilic/Documents/gap_sdk/tools/rules/pulp_rules.mk:173: recipe for target '/home/kilic/Documents/thesis/gap/dory/dory_examples/application/BUILD/GAP8_V2/GCC_RISCV/DORY_network/src/network.o' failed
make: *** [/home/kilic/Documents/thesis/gap/dory/dory_examples/application/BUILD/GAP8_V2/GCC_RISCV/DORY_network/src/network.o] Error 1

Does anybody know about this issue? Thanks.

Best regards,
Beran

Repository pulp-nn-1d and pulp-nn-mixed not found!

When trying to clone dory with its submodules, I get the following error from git:

Cloning into '/home/quentin/dory/pulp-nn-1d'...
Username for 'https://github.com': ESEQU
Password for 'https://[email protected]':
remote: Repository not found.
fatal: repository 'https://github.com/pulp-platform/pulp-nn-1d/' not found
fatal: clone of 'https://github.com/pulp-platform/pulp-nn-1d' into submodule path '/home/quentin/dory/pulp-nn-1d' failed
Failed to clone 'pulp-nn-1d'. Retry scheduled

That is because the respective repositories do not exist. I also could not find them anywhere else on GitHub.
Can I still use DORY without them, or are they crucial for all functionality?
Also, how can I get them?
Thanks in advance

Can DORY be used on any Net?

Hi,
I noticed that in the readme file, the Limitations part mentions: The DORY framework is currently tested on feed-forward networks with single-wire residual connections.

Can DORY be used on any neural network or other deep learning networks?

Error when using Dory to deploy a modified DroNet

Hi there,
I made the following modifications to Dronet: I added a fully connected layer self.fc2 = nn.Linear(in_features=fc_size, out_features=3, bias=False), where fc_size = 128*7*7.

x = x.flatten(1)
x_copy = x       
x = self.fc1(x) 
steer = x[:, 0]
coll = self.sig(x[:, 1]) 
x_copy = self.fc2(x_copy)
x_copy = self.relu1(x_copy)
sign = self.softmax(x_copy)    # [a,b,c] #a+b+c = 1

return [steer,coll, sign]

which made the output of dronet look like this:
[image of the network outputs]
but I met the following error:

Traceback (most recent call last):
  File "network_generate.py", line 102, in <module>
    main()
  File "network_generate.py", line 81, in main
    PULP_Nodes_Graph = onnx_m('GAP8', args.chip, args.network_dir + net).parameters_from_onnx(100)
  File "../ONNX_management.py", line 471, in parameters_from_onnx
    new_node = self.create_node(node_element(), first_node, node_iterating, model, PULP_Nodes_Graph)
  File "../ONNX_management.py", line 247, in create_node
    temp = temp.reshape(temp.shape[0], PULP_Nodes_Graph[-1].output_channels, PULP_Nodes_Graph[-1].output_h, PULP_Nodes_Graph[-1].output_w)
ValueError: cannot reshape array of size 18816 into shape (3,2,1,1)

I added some prints for debugging and got:

Flatten_145
MatMul_146
         MatMul_146
(2, 6272)
2 128 7 7
         1 1
Constant_147
Gather_148
Constant_149
Gather_150
Sigmoid_151
MatMul_152
         MatMul_152
(3, 6272)
3 2 1 1

I guess MatMul_152 takes the output of MatMul_146 (output_w=1, output_h=1) as input, which results in this ValueError.
But according to the design of my network, it should use 128*7*7 as the input of MatMul_152.
So, how can I solve this error?
Hoping to hear from you.
Regards,
Mathilda

`build_docker.sh` is broken

Not sure whether you're accepting third-party contributions or not, but the fix is simple enough that hopefully an issue suffices:

In docker_utils/build_docker.sh, docker -t must be switched to docker build -t.

(bug found on Ubuntu 20.04, Docker 24.0.6)

mixed precision

Hi,

What happened to pulp-nn-mixed directory? I want to use a mixed-prec. quantized network if possible. Thanks.

Best regards,
Beran

json files don't get installed with `pip install .`

  File "/tvm-fork/python/tvm/relay/backend/contrib/soma_dory/codegen.py", line 201, in soma_dory_compiler
    converter = onnx_manager(codegen.dory_graph, config_file, '')
  File "/usr/local/lib/python3.8/dist-packages/dory/Hardware_targets/Diana/Diana_TVM/HW_Parser.py", line 42, in __init__
    with open(os.path.join(file_path, "pattern_rules.json")) as f:
FileNotFoundError: [Errno 2] No such file or directory: '/usr/local/lib/python3.8/dist-packages/dory/Hardware_targets/Diana/Diana_TVM/pattern_rules.json'

I installed this with pip install . from the root of the repository.
The current workaround is to install with pip install -e .

@ABurrello maybe this is due to an issue with setup.py?

CC @maartenvds
