Dear Claudio,
I think we have correctly understood the situation. We need to look into this in more detail, because currently it is not easy to get the uncorrected picture if the corrections are already active in the camera: we receive the already corrected picture from the camera. We will also take a deep look into the code to evaluate whether this is really necessary, because we may be able to obtain the brightness directly from the camera (which we can assume is correct). What do you think?
Cheers,
Pablo
from pylon-ros-camera.
Hi Pablo, thanks for reaching out!
You may get the correct brightness from the camera, but given that you have the values of gamma, R_corrected, and R_max (and equivalently the green and blue counterparts), you can invert this formula (for the red channel, and equivalently for the other two) and evaluate the brightness in PylonCameraNode::calcCurrentBrightness()
😄
What do you think about it?
Hi Claudio,
your approach looks good. The only thing is that it may take some time to implement on our side. I thought of another option: just assume that the camera reached the desired brightness. Do you think this could be a good idea?
Cheers,
Pablo
Hi @pablo-quilez, not really, unfortunately. This is something that relates to the light conditions and that could potentially invalidate some working settings.
What we currently do is disable gamma, correct the brightness, and then re-enable gamma.
This is a bit annoying, but given that we can do this, I think it has lower priority with respect to the other issues 😄
I was working on this topic (branch brightness_improvements/23) but I am still fighting with it. Any input will be appreciated. I have tried both sampling the picture and considering all pixels for the calculations, which makes no difference. 16-bit encodings are reduced to 8 bits per channel for the calculations.
I think gamma only is not enough to get the correct calculation. Ideally we should get the brightness value from the camera instead of calculating it.
Mono
It depends on whether gamma is enabled or not. If gamma is activated, the values are 10-20% off. If it is not activated they are closer (e.g. target 200 --> 193, target 100 --> 92), but still off by enough to cause an error.
// MONO8, MONO16
// The mean brightness is calculated using a subset of all pixels
for ( const std::size_t& idx : sampling_indices_ )
{
    if (sensor_msgs::image_encodings::bitDepth(img_raw_msg_.encoding) == 8)
    {
        // 8-bit encodings can be accumulated directly
        sum += img_raw_msg_.data.at(idx);
    }
    else if (sensor_msgs::image_encodings::bitDepth(img_raw_msg_.encoding) == 16)
    {
        // Reduce 16 bits to 8 (shift right by 8). Note: Basler Mono12 is
        // converted to 16 bits for ROS using a 4-bit left shift.
        uint16_t pixel_16bits = (*(uint16_t *) &img_raw_msg_.data.at(idx * 2));
        sum += pixel_16bits >> 8;
    }
}
if ( sum > 0.0 )
{
    sum /= static_cast<float>(sampling_indices_.size());
}
RGB8
I have tried many different things but have not succeeded yet.
- Calculating the brightness directly from ROS data. Result: calculation != target.
float fred = (float) red;
float fgreen = (float) green;
float fblue = (float) blue;
sum += fgreen + fblue + fred;
rosservice call /pylon_camera_node/set_brightness "target_brightness: 200
brightness_continuous: true
exposure_auto: true
gain_auto: true"
reached_brightness: 223
reached_exposure_time: 36010.0
reached_gain_value: 0.471995949745
success: False
The current camera parameters reported by the driver:
offset_x: 0
offset_y: 0
reverse_x: False
reverse_y: False
black_level: 0
pgi_mode: -1
demosaicing_mode: -1
noise_reduction: -10000.0
sharpness_enhancement: 0.0
light_source_preset: 1
balance_white_auto: 2
sensor_readout_mode: -1
acquisition_frame_count: -10000
trigger_selector: 0
trigger_mode: 1
trigger_source: 0
trigger_activation: 0
trigger_delay: -10000.0
user_set_selector: 0
user_set_default_selector: 1
is_sleeping: False
brightness: 223.824417114
exposure: 36029.0
gain: 0.479579001665
gamma: 1.0
binning_x: 1
binning_y: 1
roi:
x_offset: 0
y_offset: 0
height: 1200
width: 1600
do_rectify: False
available_image_encoding: [RGB8, YCbCr422_8, BayerRG8, BayerRG12]
current_image_encoding: "RGB8"
current_image_ros_encoding: "rgb8"
sucess: True
message: ''
temperature: 0.0
- Using relative luminance = (r * 0.3) + (g * 0.59) + (b * 0.11) is even worse. The values are always too low and far from the target.
- Using gamma for the calculation. The results are also similar to not applying gamma.
rosservice call /pylon_camera_node/set_brightness "target_brightness: 200
brightness_continuous: true
exposure_auto: true
gain_auto: true"
reached_brightness: 220
reached_exposure_time: 35782.0
reached_gain_value: 0.0896415784955
success: False
// Undo the gamma correction on each channel before averaging
const float fmax = 255.0;
float fgamma = pylon_camera_->currentGamma();
/* Weighted (Rec. 601) variant, also tried:
float fred = 0.299 * (float) red;
float fgreen = 0.587 * (float) green;
float fblue = 0.114 * (float) blue; */
float fred = pow(((float) red) * pow(fmax, fgamma) / fmax, 1.0 / fgamma);
float fgreen = pow(((float) green) * pow(fmax, fgamma) / fmax, 1.0 / fgamma);
float fblue = pow(((float) blue) * pow(fmax, fgamma) / fmax, 1.0 / fgamma);
sum += fgreen + fblue + fred;
Code
for ( const std::size_t& idx : sampling_indices_ )
{
    uint8_t red, green, blue;
    // Byte index of the pixel: channel count times bytes per channel
    // (for the 16-bit encodings each channel occupies two bytes)
    const int bytes_per_channel = sensor_msgs::image_encodings::bitDepth(img_raw_msg_.encoding) / 8;
    const int index = idx * sensor_msgs::image_encodings::numChannels(img_raw_msg_.encoding) * bytes_per_channel;
    if (img_raw_msg_.encoding == sensor_msgs::image_encodings::RGB8)
    {
        red = img_raw_msg_.data.at(index);
        green = img_raw_msg_.data.at(index + 1);
        blue = img_raw_msg_.data.at(index + 2);
    }
    else if (img_raw_msg_.encoding == sensor_msgs::image_encodings::BGR8)
    {
        blue = img_raw_msg_.data.at(index);
        green = img_raw_msg_.data.at(index + 1);
        red = img_raw_msg_.data.at(index + 2);
    }
    else if (img_raw_msg_.encoding == sensor_msgs::image_encodings::RGB16)
    {
        // Reduce each 16-bit channel to 8 bits (shift right by 8)
        red = *(uint16_t *) &img_raw_msg_.data.at(index) >> 8;
        green = *(uint16_t *) &img_raw_msg_.data.at(index + 2) >> 8;
        blue = *(uint16_t *) &img_raw_msg_.data.at(index + 4) >> 8;
    }
    else if (img_raw_msg_.encoding == sensor_msgs::image_encodings::BGR16)
    {
        blue = *(uint16_t *) &img_raw_msg_.data.at(index) >> 8;
        green = *(uint16_t *) &img_raw_msg_.data.at(index + 2) >> 8;
        red = *(uint16_t *) &img_raw_msg_.data.at(index + 4) >> 8;
    }
    else
    {
        assert(false); // Unsupported encoding: cannot happen
    }
    // Undo the gamma correction on each channel before averaging
    const float fmax = 255.0;
    float fgamma = pylon_camera_->currentGamma();
    /* Weighted (Rec. 601) variant, also tried:
    float fred = 0.299 * (float) red;
    float fgreen = 0.587 * (float) green;
    float fblue = 0.114 * (float) blue; */
    float fred = pow(((float) red) * pow(fmax, fgamma) / fmax, 1.0 / fgamma);
    float fgreen = pow(((float) green) * pow(fmax, fgamma) / fmax, 1.0 / fgamma);
    float fblue = pow(((float) blue) * pow(fmax, fgamma) / fmax, 1.0 / fgamma);
    sum += fgreen + fblue + fred;
}
sum /= static_cast<float>(sampling_indices_.size()) * sensor_msgs::image_encodings::numChannels(img_raw_msg_.encoding);
YUV
It uses the UYVY encoding, which means that every odd byte contains the brightness (Y) component of a pixel. The brightness values are also different and far (10-40 units) from the target. For example, a target of 100 produces 140, and a target of 200 produces 214. Changing brightness_continuous or gain_auto has no effect on the obtained result. Theoretically, the brightness should be the average of the Y components. Do you know if I am missing something?
rosservice call /pylon_camera_node/set_brightness "target_brightness: 100
brightness_continuous: false
exposure_auto: true
gain_auto: false"
reached_brightness: 140
reached_exposure_time: 41487.0
reached_gain_value: 0.0
success: False
rosservice call /pylon_camera_node/set_brightness "target_brightness: 200
brightness_continuous: false
exposure_auto: true
gain_auto: true"
reached_brightness: 214
reached_exposure_time: 172767.0
reached_gain_value: 0.0
success: False
Code
for (int i = 0; i < pylon_camera_->imageCols() * pylon_camera_->imageRows(); i++)
{
    int index = i * 2; // UYVY: 2 bytes per pixel
    sum += img_raw_msg_.data.at(index + 1); // odd bytes hold the Y component
}
sum /= pylon_camera_->imageCols() * pylon_camera_->imageRows();
// UYVY encoding, just take the intensity of the first pixel
/* for ( const std::size_t& idx : sampling_indices_ )
{
    int index = idx * 2; // 2 bytes per pixel
    sum += (img_raw_msg_.data.at(index + 1) + img_raw_msg_.data.at(index + 2)) / 2.0;
}
sum /= static_cast<float>(sampling_indices_.size()); */
Bayer
Not started
Hi @pablo-quilez, sorry for not coming back to you earlier.
I'll try to have a look at your comment asap and let you know 👍
Thank you @claudiofantacci, please let me know if you have more information!
Hi @pablo-quilez, I'm finally back on this.
I think that
"Ideally we should get the brightness value from the camera instead of calculating it."
is the pain point here, and we should be doing that. Is this possible?
To complicate things even further, if you use sRGB as the gamma space mode you definitely won't be able to set the proper brightness in any way. I'm currently working with a Dart camera daA1280-54uc and I can't get the auto-brightness feature to work (checked out on the current master branch).
My expectation is to have the same behavior that the PylonViewer provides when setting TargetBrightness to a specific value in the range [0, 1], but this seems not to be mapped by the ROS driver, or, to the best of my understanding, the brightness in the PylonViewer differs from the brightness implemented in the ROS driver.
For example, on the PylonViewer I have these settings for my daA1280-54uc:
- Pixel format: RGB8
- Exposure Time: 20000
- Auto Exposure Time Lower Limit: 100
- Auto Exposure Time Upper Limit: 20000
- Exposure auto: Off
- Gain: 0.0
- Auto Gain Lower Limit: 0.0
- Auto Gain Upper Limit: 18.02
- Gain auto: Off
- Gamma: 1.0
- Gamma Space Mode: sRGB
- Balance White Auto: Off
- Auto Function Profile: Minimise Gain
- Target Brightness: 0.2
- Target Brightness Damping: 0.5
and everything is at an equilibrium: if I enable any auto feature I get no variation of the above parameters.
If I now run the Pylon ROS driver and have a look at rostopic echo /pylon_camera_node/currentParams I get, among other things, brightness: 107. And here, I think, lies the problem.
When you pass a target_brightness like:
rosservice call /pylon_wrist_front/set_brightness "target_brightness: 107
brightness_continuous: false
exposure_auto: true
gain_auto: true"
this will result in setting the Pylon Target Brightness to 0.42, i.e. 107 / 255 ≈ 0.42, which is different from what was set on the camera with the value 0.2, which according to the [0, 255] conversion should result in a brightness of 51.
This means that there is a substantial difference between how the brightness is calculated/used by the PylonViewer and by the ROS driver, and also that the Target Brightness you can set in the PylonViewer has a different interpretation.
In order for me to set the proper TargetBrightness I have to:
- Have Gamma Space Mode set to RGB and Gamma: ~0.4 to get the same result as sRGB with 1.0
- Set Gamma to 1 to disable it, because rosservice call /pylon_wrist_front/gamma_enable "data: false" is not supported, although technically it just needs to be set to 1 😄
- Trigger auto brightness
- Set Gamma back to ~0.4
Overall I would say:
- If we can get the brightness from the camera, let's go for it and keep everything else the same.
- If we can't, the driver should just disable gamma correction (either toggling it, or setting it to 1 for RGB, or whatever is needed with sRGB), trigger auto brightness, and enable gamma back.
What do you think?
@claudiofantacci @pablo-quilez Hello, just a short note on that: the pylon Viewer is only used to configure the camera. That is, on Basler cameras such features are implemented on the camera's FPGA and calculations are performed directly on the camera. This avoids delays on the host and reduces cpu load. Thus, Basler usually recommends using the camera features.
@m-binev I totally agree, and I think this ROS driver should make use of that by simply triggering those features on the camera and just giving back an ack on whether everything went fine or not. Am I right?
@claudiofantacci Hi, this is what I would recommend. It keeps things simple and reduces processing time, delays, and CPU load.
@pablo-quilez FYI.
@m-binev I'm with you here :-)
Let's see what @pablo-quilez thinks about it!
Hi,
I think I am struggling with the same problem (#82). However, I couldn't find a workaround for now. When I set gamma to 1.0, I still can't successfully configure auto brightness. @claudiofantacci Am I missing anything? Could you provide more information about your workaround? Thanks a lot for your time!
Hey @RGring, can you describe the process with which you try to set auto-brightness and what value you use?
Here are the steps I would recommend:
- Have the camera providing images at a quality that is ok for you
- Look at the current brightness that is published on (IINW) /pylon_camera_node/currentParams
- Write down that number or something close to it
Let's suppose you want to recalibrate the camera to that brightness after some time (and maybe something changed, like illumination)
- Disable Gamma correction. This differs from camera to camera: on ACE cameras you can disable it, on others like the DART you can't. If you can't, you need to have Gamma correction on RGB (not sRGB), set it to 1.0, and then back to the value it was (e.g. 0.4 if you want something similar to sRGB).
- Trigger auto-brightness
- Enable Gamma back
Note that depending on the brightness you want to achieve, you may be triggering the onboard camera feature (IIRC above 50) or the one implemented in this driver.
Hope this helps.
Hi all,
As it is an old issue, I am closing it for now.