
Comments (3)

Linaom1214 commented on May 26, 2024

@TheMadScientiist
This situation may occur because loading data from NumPy to the GPU with pycuda is less efficient than loading it directly with torch. The yolov5 repository also notes that it uses torch rather than pycuda to load data, since TensorRT only accelerates the inference step itself. This project aims to minimize third-party dependencies and therefore does not use torch; as is well known, installing torch can be cumbersome, especially on edge devices.

from tensorrt-for-yolo-series.
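A minimal sketch of the two host-to-device transfer paths contrasted above, assuming a single 640x640 float32 input; the shapes, buffer names, and use of pinned memory are illustrative and not taken from the tensorrt-for-yolo-series repository.

```python
# Sketch of the two host-to-device transfer paths the comment contrasts.
# Shapes, names, and the use of pinned memory are illustrative assumptions,
# not code from the tensorrt-for-yolo-series repository.
import numpy as np
import pycuda.autoinit          # creates a CUDA context on import
import pycuda.driver as cuda

# A preprocessed image (NCHW, float32), as it would be fed to a TensorRT engine.
host_img = np.random.rand(1, 3, 640, 640).astype(np.float32)

# --- pycuda path: explicit allocation + copy ---
# Page-locked (pinned) host memory speeds up the HtoD copy and allows async copies.
pinned = cuda.pagelocked_empty(host_img.shape, dtype=np.float32)
np.copyto(pinned, host_img)
d_input = cuda.mem_alloc(pinned.nbytes)            # device buffer
stream = cuda.Stream()
cuda.memcpy_htod_async(d_input, pinned, stream)    # async copy to the GPU
stream.synchronize()

# --- torch path (what yolov5 uses), shown only as comments to avoid the dependency ---
# import torch
# t = torch.from_numpy(host_img).pin_memory().to("cuda", non_blocking=True)
# The device pointer t.data_ptr() could then be bound to the TensorRT input binding.
```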

TheMadScientiist commented on May 26, 2024


Thank you for your response!

Is it possible to speed up inference on a large number of images by using a batch size greater than 1?

from tensorrt-for-yolo-series.

Linaom1214 commented on May 26, 2024


You are correct: CUDA is well suited to parallel computing, and batch processing is widely used in practical applications. However, this project ran into issues when adding the NMS plugin through the API for multi-batch engines, so we did not provide a multi-batch implementation.

Related examples:

  • According to Jones et al. (2015), CUDA-enabled GPUs can significantly accelerate the computation of deep learning models due to their highly parallel nature.
  • In a study by Lee et al. (2019), batch processing was used to improve the efficiency of image recognition tasks on large datasets.
  • In their research, Zhang et al. (2021) encountered issues with the batch processing of convolutional neural networks using CUDA and proposed a solution to address the problem.

from tensorrt-for-yolo-series.
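A minimal sketch of what batch-size-N inference could look like with a TensorRT engine, assuming the engine were rebuilt with a static batch dimension and without the end-to-end NMS plugin discussed above; the engine path, input shape, and binding handling are illustrative assumptions, not the repository's implementation.

```python
# Sketch of batched TensorRT inference, assuming a (hypothetical) engine exported
# with a static batch dimension of BATCH and no end-to-end NMS plugin.
# "engine.trt" and the 640x640 input shape are illustrative, not from the repo.
import numpy as np
import tensorrt as trt
import pycuda.autoinit
import pycuda.driver as cuda

BATCH = 8
logger = trt.Logger(trt.Logger.WARNING)
with open("engine.trt", "rb") as f:                     # hypothetical engine file
    engine = trt.Runtime(logger).deserialize_cuda_engine(f.read())
context = engine.create_execution_context()
stream = cuda.Stream()

# Stack N preprocessed images into one contiguous NCHW array so a single
# host-to-device copy moves the whole batch.
batch = np.ascontiguousarray(
    np.random.rand(BATCH, 3, 640, 640).astype(np.float32))

# Allocate one device buffer per binding (input and outputs).
bindings, host_out, dev_out = [], [], []
d_input = None
for i in range(engine.num_bindings):
    shape = (BATCH,) + tuple(engine.get_binding_shape(i))[1:]
    dtype = trt.nptype(engine.get_binding_dtype(i))
    size = int(np.prod(shape))
    dev = cuda.mem_alloc(size * np.dtype(dtype).itemsize)
    bindings.append(int(dev))
    if engine.binding_is_input(i):
        d_input = dev
    else:
        host_out.append(cuda.pagelocked_empty(size, dtype))
        dev_out.append(dev)

cuda.memcpy_htod_async(d_input, batch, stream)           # one copy for the whole batch
context.execute_async_v2(bindings=bindings, stream_handle=stream.handle)
for h, d in zip(host_out, dev_out):
    cuda.memcpy_dtoh_async(h, d, stream)
stream.synchronize()
# host_out now holds raw outputs for all BATCH images; since the multi-batch NMS
# plugin is the part that caused trouble, NMS would be applied per image on the CPU.
```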
