
Comments (12)

github-actions commented on June 25, 2024

πŸ‘‹ Hello @simoneangarano, thank you for your interest in Ultralytics YOLOv8 πŸš€! We recommend a visit to the Docs for new users where you can find many Python and CLI usage examples and where many of the most common questions may already be answered.

If this is a πŸ› Bug Report, please provide a minimum reproducible example to help us debug it.

If this is a custom training ❓ Question, please provide as much information as possible, including dataset image examples and training logs, and verify you are following our Tips for Best Training Results.

Join the vibrant Ultralytics Discord 🎧 community for real-time conversations and collaborations. This platform offers a perfect space to inquire, showcase your work, and connect with fellow Ultralytics users.

Install

Pip install the ultralytics package including all requirements in a Python>=3.8 environment with PyTorch>=1.8.

pip install ultralytics

Environments

YOLOv8 may be run in any of the following up-to-date verified environments (with all dependencies including CUDA/CUDNN, Python and PyTorch preinstalled):

Status

Ultralytics CI

If this badge is green, all Ultralytics CI tests are currently passing. CI tests verify correct operation of all YOLOv8 Modes and Tasks on macOS, Windows, and Ubuntu every 24 hours and on every commit.

from ultralytics.

samin50 commented on June 25, 2024

How are you displaying your bounding boxes? The results object that you get as a return value for predict has several bounding box coordinate types, for example results[0].obb.xyxyxyxy for 4 pairs of xy coords for each corner.
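If it helps to sanity-check those corner coordinates, here is a minimal pure-Python sketch (the function name is mine, not part of the Ultralytics API) that recovers center, size, and rotation angle from one `(4, 2)` set of corners, assuming the corners are listed consecutively around the box as in `results[0].obb.xyxyxyxy`:

```python
import math

def corners_to_cxcywh_angle(box):
    """Recover (cx, cy, w, h, angle) from four consecutive OBB corners.

    'box' is a sequence of four (x, y) pairs, e.g. one row of
    results[0].obb.xyxyxyxy converted to plain Python floats.
    """
    (x0, y0), (x1, y1), (x2, y2), (x3, y3) = box
    cx = (x0 + x1 + x2 + x3) / 4          # center is the corner mean
    cy = (y0 + y1 + y2 + y3) / 4
    w = math.hypot(x1 - x0, y1 - y0)      # first edge length
    h = math.hypot(x2 - x1, y2 - y1)      # second edge length
    angle = math.atan2(y1 - y0, x1 - x0)  # rotation of the first edge, radians

    return cx, cy, w, h, angle

# Axis-aligned box: angle comes out 0.0
print(corners_to_cxcywh_angle([(0, 0), (2, 0), (2, 1), (0, 1)]))
# Same box rotated 90 degrees: angle comes out pi/2 (~1.5708)
print(corners_to_cxcywh_angle([(0, 0), (0, 2), (-1, 2), (-1, 0)]))
```

If the recovered angles are all near zero while your objects clearly aren't, the problem is in the predictions rather than in the plotting.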

simoneangarano commented on June 25, 2024

I'm using results[0].plot(). Even the visualizations auto-generated during training (val_batch0_pred.jpg) show bboxes that are only slightly rotated.

glenn-jocher commented on June 25, 2024

@simoneangarano hello! It sounds like the model might be struggling to learn the angle variations effectively. This could be due to a variety of factors including the diversity of the training data in terms of angles or the specific characteristics of the loss function handling the angle predictions.

To better diagnose the issue, you could:

  • Check the distribution of angles in your training dataset to ensure there's enough variation.
  • Experiment with adjusting the loss weights for the angle component in the training configuration.
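For the first check, here is a small sketch that computes one rotation angle per label line, assuming DOTA/YOLO-OBB-style labels ("class x1 y1 x2 y2 x3 y3 x4 y4" with corners listed consecutively); the function name and the dataset path in the usage comment are illustrative, not Ultralytics API:

```python
import math
from collections import Counter
from pathlib import Path

def angles_from_label_lines(lines):
    """Return one rotation angle (degrees) per well-formed OBB label line."""
    angles = []
    for line in lines:
        parts = line.split()
        if len(parts) < 9:
            continue  # skip malformed or non-OBB lines
        x1, y1, x2, y2 = map(float, parts[1:5])
        # Rotation of the first box edge relative to the x-axis
        angles.append(math.degrees(math.atan2(y2 - y1, x2 - x1)))
    return angles

# Hypothetical usage: bin all training-set angles into 10-degree buckets
# all_angles = []
# for f in Path("datasets/my_obb/labels/train").glob("*.txt"):
#     all_angles.extend(angles_from_label_lines(f.read_text().splitlines()))
# print(Counter(round(a / 10) * 10 for a in all_angles))
```

A histogram dominated by a single bucket would explain a model that rarely predicts rotation.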

For visualizing the predictions with more clarity on rotations, you might consider manually plotting the bounding boxes using results[0].obb.xyxyxyxy to extract and visualize each box's corners. This can give you a clearer picture of how the model is predicting rotations.

Here's a quick example on how you can plot these manually for more detailed inspection:

import matplotlib.pyplot as plt

# Assuming 'result' is your prediction result for one image
boxes = result.obb.xyxyxyxy.cpu().numpy()  # (N, 4, 2) array of corner coordinates

fig, ax = plt.subplots(1)
ax.imshow(result.orig_img[..., ::-1])  # orig_img is BGR; reverse channels for matplotlib

# Plot each OBB as a closed polygon
for box in boxes:
    poly = plt.Polygon(box.reshape(-1, 2), closed=True, edgecolor='r', fill=None)
    ax.add_patch(poly)

plt.show()

This might help you visually confirm the model's performance on angle predictions more precisely.

simoneangarano commented on June 25, 2024
  • Experiment with adjusting the loss weights for the angle component in the training configuration.

How do I increase the weight for the angle component? I don't see any training hyperparameter that's specific for that.

glenn-jocher commented on June 25, 2024

@simoneangarano hello! In the current YOLOv8 implementation, direct hyperparameter adjustment for the angle component in the loss function isn't exposed via the training configuration. However, you can modify the source code where the loss is computed to manually increase the weight for the angle component.

Here’s a brief guide on how you might approach this:

  1. Locate the loss computation section in the model's code (usually found in the model definition files).
  2. Identify the part of the loss that corresponds to the angle prediction.
  3. Increase the multiplier for the angle loss to give it more weight during training.
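Step 3 amounts to re-weighting one term of a multi-term loss. A minimal sketch (the names box_loss, angle_loss, angle_gain are illustrative, not Ultralytics API):

```python
# Scaling one component of a multi-term loss re-balances gradients toward
# that objective, exactly like the existing box/cls/dfl gains in the
# training configuration.

def total_loss(box_loss, cls_loss, dfl_loss, angle_loss, angle_gain=2.0):
    return box_loss + cls_loss + dfl_loss + angle_gain * angle_loss

# With angle_gain=2.0, an angle error counts twice as much as before:
print(total_loss(1.0, 1.0, 1.0, 0.5))  # 4.0
```

In practice you would start with a modest gain and watch whether the angle component of the training loss actually decreases.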

If you need specific guidance on which file or line to edit, I can help you further if you provide more details about your setup or the version of YOLOv8 you are using. πŸš€

simoneangarano commented on June 25, 2024

Thanks!
I need more help figuring out where the rotation loss is explicitly computed.
My setup is:
Ultralytics YOLOv8.2.19 πŸš€ Python-3.9.18 torch-2.3.0+cu121 CUDA:0 (NVIDIA GeForce RTX 3090, 24245MiB)
Setup complete βœ… (24 CPUs, 62.5 GB RAM, 249.4/1832.2 GB disk)
I'm using YOLOv8n-OBB.

glenn-jocher commented on June 25, 2024

Hello! For adjusting the rotation loss in YOLOv8n-OBB, you'll need to dive into the source code where the model and its loss functions are defined. Typically, this would be in the files where the model's forward pass and loss calculations are implemented.

Since the exact location can vary with updates, I recommend checking files related to model definitions, possibly named something like models.py or within a models directory. Look for the section that computes losses and identify the parts handling bounding box coordinates, where you might find calculations involving angles or rotations.

If you're not familiar with navigating the codebase, using a search function in your IDE for keywords like "loss" and "angle" or "rotation" might speed things up. πŸš€
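If you'd rather script that search than click through an IDE, a small sketch (function name and example path are mine; point root at wherever your ultralytics checkout lives):

```python
from pathlib import Path

def find_keyword_lines(root, keywords=("angle", "rotation")):
    """Return (file, line_no, text) for lines mentioning any keyword in .py files."""
    hits = []
    for path in Path(root).rglob("*.py"):
        for no, text in enumerate(path.read_text(errors="ignore").splitlines(), 1):
            if any(k in text.lower() for k in keywords):
                hits.append((path, no, text.strip()))
    return hits

# Hypothetical usage against a local checkout:
# for path, no, text in find_keyword_lines("ultralytics/utils"):
#     print(f"{path}:{no}: {text}")
```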

Hope this helps! If you need more specific pointers, feel free to ask.

simoneangarano commented on June 25, 2024

I guess the angles were not being considered in the default loss, as manually adding an additional loss component ang_loss solves the issue. Now the model learns to predict rotated bounding boxes. Still, what is the best loss to be used here? I used MSE and it seems to be working, but maybe there is something better. Help?

This is the code I modified in ultralytics/utils/loss.py. If you want to include it in the next release, I can help by creating a pull request.

# Cls loss
# loss[1] = self.varifocal_loss(pred_scores, target_scores, target_labels) / target_scores_sum  # VFL way
loss[1] = self.bce(pred_scores, target_scores.to(dtype)).sum() / target_scores_sum  # BCE

# Bbox loss
if fg_mask.sum():
    target_bboxes[..., :4] /= stride_tensor
    loss[0], loss[2] = self.bbox_loss(
        pred_distri, pred_bboxes, anchor_points, target_bboxes, target_scores, target_scores_sum, fg_mask
    )
else:
    loss[0] += (pred_angle * 0).sum()

angle_diff = target_bboxes[fg_mask][:, -1] - pred_bboxes[fg_mask][:, -1]
angle_loss = (angle_diff ** 2).mean()  # MSE on the angle component

loss[0] *= self.hyp.box  # box gain
loss[1] *= self.hyp.cls  # cls gain
loss[2] *= self.hyp.dfl  # dfl gain
loss[3] = angle_loss * self.hyp.ang  # angle gain (new hyperparameter)

return loss.sum() * batch_size, loss.detach()  # loss(box, cls, dfl, angle)

glenn-jocher commented on June 25, 2024

Hey there! πŸš€ Great job on integrating the angle loss into the model! Using MSE for the angle loss is a solid choice as it emphasizes larger errors and is generally well-behaved during optimization. However, you might also consider experimenting with the Smooth L1 Loss, which is less sensitive to outliers than MSE. This can sometimes lead to better performance, especially in cases where the angle variation is large.

Here's how you could modify your code to use Smooth L1 Loss for the angle:

import torch.nn.functional as F

angle_loss = F.smooth_l1_loss(pred_bboxes[fg_mask][:, -1], target_bboxes[fg_mask][:, -1], reduction='mean')
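To see why Smooth L1 is more outlier-tolerant, here is a pure-Python comparison using the same piecewise definition PyTorch uses with its default beta=1.0 (a sketch for intuition, not the training code):

```python
def smooth_l1(err, beta=1.0):
    # Quadratic near zero, linear beyond beta (PyTorch's smooth_l1_loss form)
    err = abs(err)
    return 0.5 * err * err / beta if err < beta else err - 0.5 * beta

def mse(err):
    return err * err

# A batch of angle errors in radians, with one large outlier
errors = [0.1, -0.2, 0.15, 3.0]
mse_mean = sum(mse(e) for e in errors) / len(errors)
sl1_mean = sum(smooth_l1(e) for e in errors) / len(errors)
# The outlier contributes 9.0 to MSE but only 2.5 to Smooth L1,
# so one mislabeled angle distorts the gradient far less.
print(mse_mean, sl1_mean)
```

One more caveat worth considering for either loss: raw angle differences ignore periodicity, so depending on the angle convention you may want to wrap the difference into the valid range before computing the loss.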

This change might provide a more robust training process with respect to angle predictions. If you're seeing good results and want to contribute, a pull request would be fantastic! The community would definitely benefit from your insights and improvements. 😊

simoneangarano commented on June 25, 2024

Thanks for the help! I will.

glenn-jocher commented on June 25, 2024

@simoneangarano you're welcome! 😊 If you have any more questions or need further assistance, feel free to reach out. Looking forward to your pull request! πŸš€
