Parsing a .bag file to obtain depth and RGB images, then aligning the depth image to the RGB image using the intrinsics and extrinsics to get the depth value at a given pixel: the result differs from the depth value shown when hovering the mouse over the depth image in Intel.RealSense.Viewer, and the Viewer's value seems more accurate #13629
Hello @weishiguan Have you written your own program script to read the depth from the bag file, please? If you have, then a difference between the RealSense Viewer and a program script is that the Viewer applies a range of filters to the depth information by default, but a script applies no filters unless you deliberately put the filters into the script. An absence of filters in a program script could cause the depth values to be different from those provided in the Viewer.

If you are performing depth to color alignment then it is important to use the color intrinsics or aligned intrinsics, and not the depth intrinsics. This is because when depth to color alignment is performed, the origin of depth changes from the center line of the left infrared sensor to the center line of the RGB sensor.

Also, looking at the '3204' distance value that you provided, I wonder if this is the raw pixel depth value. To get the real-world distance in meters, you would multiply the raw depth value by the depth scale value of the camera. For most RealSense 400 Series camera models this value will be '0.001'. For example, 3204 (raw pixel depth value) x 0.001 (camera depth scale) = 3.204 (real-world meters)
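The conversion described above can be sketched in Python. The `0.001` scale is the typical value for 400 Series cameras as stated; on a live device the actual scale should be queried with `depth_sensor.get_depth_scale()` rather than hard-coded.

```python
# Convert a raw 16-bit depth pixel value to real-world meters.
# 0.001 is the typical depth scale for RealSense 400 Series cameras;
# on a real device, query it with depth_sensor.get_depth_scale() instead.
def raw_depth_to_meters(raw_value, depth_scale=0.001):
    return raw_value * depth_scale

print(raw_depth_to_meters(3204))  # ~3.204 meters
```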
I am very glad to receive your reply. As you said, I did write my own program script to read depth from the depth image (parsed from the .bag file). Can I introduce the same filters as the Viewer into my code, and could you help with the relevant code? Also, for aligning depth to RGB, I used the RGB intrinsics and extrinsics, which is consistent with your advice; as for the distance value, the unit I provided was mm (millimeters), thank you for the reminder. In addition, I will provide my code as an alignment reference; if there are any problems, I would very much appreciate your guidance.
The RealSense Viewer applies multiple filters, including Decimation, Spatial, Disparity and Temporal. Each filter would have to be individually programmed into the script. There is a Python example of post-processing and alignment code at #11246
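A sketch of what reproducing the Viewer-style filter chain on a bag file might look like, under these assumptions: the file path `"recording.bag"` is a placeholder for your own recording, the pixel coordinates are illustrative, and the filter order (decimation, to disparity, spatial, temporal, back to depth, hole-filling) follows the ordering the SDK documentation recommends. Note that decimation changes the frame resolution, so pixel coordinates chosen on the original image will shift.

```python
import pyrealsense2 as rs

# Sketch: read frames from a bag file and apply a Viewer-style
# post-processing chain. "recording.bag" is a placeholder path.
pipeline = rs.pipeline()
config = rs.config()
config.enable_device_from_file("recording.bag")
pipeline.start(config)

align = rs.align(rs.stream.color)

# Filters are created once, outside the frame loop.
decimation = rs.decimation_filter()
depth_to_disparity = rs.disparity_transform(True)
spatial = rs.spatial_filter()
temporal = rs.temporal_filter()
disparity_to_depth = rs.disparity_transform(False)
hole_filling = rs.hole_filling_filter()

try:
    while True:
        frames = pipeline.wait_for_frames()
        aligned = align.process(frames)
        depth = aligned.get_depth_frame()
        if not depth:
            continue
        # Recommended order: decimate, filter in disparity space,
        # convert back to depth, then fill holes.
        depth = decimation.process(depth)
        depth = depth_to_disparity.process(depth)
        depth = spatial.process(depth)
        depth = temporal.process(depth)
        depth = disparity_to_depth.process(depth)
        depth = hole_filling.process(depth)
        # process() returns a generic rs.frame; cast back to a depth
        # frame before calling get_distance().
        depth = depth.as_depth_frame()
        dist_m = depth.get_distance(320, 240)  # illustrative pixel
finally:
    pipeline.stop()
```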
You said: The RealSense Viewer applies multiple filters including Decimation, Spatial, Disparity and Temporal. Each filter would have to be individually programmed into the script.
Hi @zhanyaoaaaaaa The effects of post-processing filters are not recorded to bag files, so you have to load in the bag file and then apply the filters in real-time. Applying filters in real-time to bag files uses the same code as when using a live camera, because the main difference between a bag file script and a live-camera script is just that the bag file is being used as the data source instead of the camera.

There is not a single all-in-one Python script that covers all of the filters. If there is a particular filter that you would like to use then I can assist in finding references for it. For example, Python code for the Threshold Filter that sets a minimum and maximum depth range can be found at #8170 (comment)

You may also find the RealSense post-processing filter tutorial for Python at the link below to be helpful.
https://github.com/IntelRealSense/librealsense/blob/jupyter/notebooks/depth_filters.ipynb
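As a rough sketch of the Threshold Filter mentioned above: the filter is created and configured once, then applied inside the frame loop. The 0.3 m and 4.0 m limits here are illustrative values, not recommendations from this thread.

```python
import pyrealsense2 as rs

# Sketch: a threshold filter that discards depth values outside
# an illustrative 0.3 m - 4.0 m range.
threshold = rs.threshold_filter()
threshold.set_option(rs.option.min_distance, 0.3)
threshold.set_option(rs.option.max_distance, 4.0)

# Then, inside the frame loop:
#   depth = threshold.process(depth_frame)
```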
I hope you can give some practical suggestions, because after trying to add a temporal filter to the code, I found that no matter how I change the code, it always reports an error. May I ask how to add the code for the temporal filter?
For example, when I change the code to the following form, an error is reported: AttributeError: 'pyrealsense2.pyrealsense2.frame' object has no attribute 'get_distance'.
@zhanyaoaaaaaa Usually when adding a new filter, before you insert the .process line to apply the filter, you have to first define the filter itself. See the script at #10078 (comment)
Can you explain why, when I apply temporal filtering to a depth frame, the type of the frame changes from `pyrealsense2.pyrealsense2.depth_frame` to `pyrealsense2.pyrealsense2.frame`, causing the `get_distance` function to fail with the error: AttributeError: 'pyrealsense2.pyrealsense2.frame' object has no attribute 'get_distance'. Your references are different from my question every time, and I can't learn from them. My current problem is an error, not that the filter is not being applied. I have changed your code many times and it still reports an error. After applying the temporal filter, even the type of the depth frame has changed, which is really unbelievable. The code is as follows:
Looking forward to your reply!
Hello @weishiguan When a bad frame occurs, the RealSense SDK will go back to the last known good frame and then progress onwards from the frame that it returned to. This can cause frames to be repeated. There are things that you can do in program scripts that will reduce the risk of frames repeating, but not much that you can do about it in the RealSense Viewer tool.

You could try disabling all post-processing filters to see whether the filters are placing a burden on your computer's CPU that is making frame skips more likely to occur, because filters are processed on the CPU and not in the camera hardware. The Spatial filter especially can place a heavy processing burden on the CPU.

The Temporal filter could affect the final depth values that you get. For example, increasing the value of the Alpha Smooth setting of the Temporal filter will make the depth values update more frequently (become more unstable), whilst reducing Alpha Smooth reduces fluctuations in the depth values and provides more stable readings by updating the depth values less frequently.

The depth values resulting from areas that have been filled in by the Hole-Filling filter may be less accurate because the values are estimated rather than being ones that were produced by the camera.
@zhanyaoaaaaaa I do not know the reason why adding a temporal filter is breaking your aligned-data script, unfortunately. What happens if you test the Python script at #13099 (comment) which uses aligned_frames, post-processing and get_distance, please? |
I succeeded! I modified the get_distance function code to the following and it worked successfully. Thank you very much for your guidance. Do the two methods of getting the distance give the same result? (The commented-out code is the original code.)
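The fix behind the error discussed above is that `process()` returns a generic `pyrealsense2.frame`, which must be cast back to a depth frame before `get_distance` is available. A minimal sketch, with `filtered_distance` as a hypothetical helper name:

```python
import pyrealsense2 as rs

temporal = rs.temporal_filter()

def filtered_distance(depth_frame, x, y):
    """Apply the temporal filter, then read the distance at pixel (x, y)."""
    filtered = temporal.process(depth_frame)
    # process() returns a generic pyrealsense2.frame, which has no
    # get_distance(); cast it back to a depth frame first.
    return filtered.as_depth_frame().get_distance(x, y)
```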
Additionally, I have another question: How should the parameters of the temporal filter be set for tracking the motion trajectory of moving objects? |
In regard to the temporal filter, I would recommend setting Filter Smooth Alpha to at least '0.1' instead of '0'. The normal default value if the Alpha is not customized is '0.4'. For tracking fast motion, '0.4' is likely to be a good setting so that the depth image updates frequently. The Smooth Delta parameter can usually be left on its default value, which is '20'.
For tracking slow-moving or objects with small amplitude of motion, should I set the smoothing Alpha value to ‘0.1’? Because for slow-moving objects, frequent updates of the depth image can lead to unnecessary errors. For example, for a stationary object, its depth value should remain largely unchanged over time, but when the smoothing Alpha value is set to ‘0.4’, its depth value shows significant changes over time, with fluctuations possibly ranging from 5 to 10 mm. In addition, how should the persistence_control be set for fast-moving objects and slow-moving objects, respectively? |
If the object is slow-moving then 0.1 should be okay. The choice of persistency index number for the temporal filter should not depend on the speed of the object. Instead, it sets how careful the filter is about replacing missing pixels with the last valid value, with values ranging from 0 to 8, based on the history of previous frames received. The default is 3. As the set value is increased, the filter becomes less strict about the validity of a pixel.

0 - Disabled - Persistency filter is not activated and no hole filling occurs.
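The settings discussed above could be applied in a script roughly as follows. This is a sketch: the option values mirror the advice in this thread, and the persistency index of the temporal filter is exposed through the `holes_fill` option in pyrealsense2.

```python
import pyrealsense2 as rs

temporal = rs.temporal_filter()

# Alpha: 0.4 (the default) updates depth frequently, which suits fast
# motion; 0.1 gives more stable readings for slow-moving objects.
temporal.set_option(rs.option.filter_smooth_alpha, 0.1)

# Delta: the default of 20 is usually fine.
temporal.set_option(rs.option.filter_smooth_delta, 20)

# Persistency index 0-8 (default 3); 0 disables persistency entirely.
# For the temporal filter this is set through the holes_fill option.
temporal.set_option(rs.option.holes_fill, 3)
```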
I got it! |
You are very welcome. :) |
Hi @weishiguan Do you require further assistance with this case, please? Thanks!
How can I obtain a more accurate depth value for a given pixel in each RGB frame by parsing the .bag file? How does Intel.RealSense.Viewer achieve this, and can I use the same method to accomplish my goal?