工控编程吧

Title: Introduction to the halcon operator proj_match_points_ransac_guided

Author: qq263946146    Time: 2019-5-28 20:37
Title: Introduction to the halcon operator proj_match_points_ransac_guided
proj_match_points_ransac_guided(Image1, Image2 : : Rows1, Cols1, Rows2, Cols2, GrayMatchMethod, MaskSize, HomMat2DGuide, DistanceTolerance,
MatchThreshold, EstimationMethod, DistanceThreshold, RandSeed : HomMat2D, Points1, Points2)


Given the coordinates (Rows1, Cols1) and (Rows2, Cols2) of a set of feature points in the input images Image1 and Image2,
together with a known approximate transformation HomMat2DGuide between Image1 and Image2,
proj_match_points_ransac_guided automatically determines the corresponding points and the homogeneous
projective transformation matrix HomMat2D that maps the corresponding points of the two images onto each other.
The feature points can be extracted, for example, with points_foerstner or points_harris.
The approximate transformation HomMat2DGuide can be computed, for example, by applying proj_match_points_ransac to lower-resolution versions of Image1 and Image2.
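
As a rough usage sketch (not taken from the HALCON reference): the numeric values for MaskSize, DistanceTolerance, MatchThreshold, DistanceThreshold and RandSeed below are placeholder assumptions, and HomMat2DGuide is assumed to have been obtained beforehand, e.g. from proj_match_points_ransac on downscaled copies of the images:

* Extract feature points in both images with the Harris detector
points_harris (Image1, 0.7, 2, 0.04, 50, Rows1, Cols1)
points_harris (Image2, 0.7, 2, 0.04, 50, Rows2, Cols2)
* Guided matching with the already known approximate transformation HomMat2DGuide
proj_match_points_ransac_guided (Image1, Image2, Rows1, Cols1, Rows2, Cols2, 'ncc', 10, HomMat2DGuide, 20, 0.5, 'gold_standard', 2.5, 42, HomMat2D, Points1, Points2)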


The transformation is determined in two steps.
First, gray-value correlations between mask windows around the input points of the first and the second image are computed,
and initial matches between the windows of the two images are generated based on their similarity.
The mask windows have a size of MaskSize x MaskSize.
Three correlation metrics can be selected: if GrayMatchMethod is 'ssd', the sum of squared gray-value differences is used,
'sad' uses the sum of absolute differences, and 'ncc' uses the normalized cross-correlation.


See binocular_differences for details. The metric is minimized ('ssd', 'sad') or maximized ('ncc') over all possible point pairs.
A match is therefore only accepted if the metric value is below the threshold MatchThreshold ('ssd', 'sad') or above it ('ncc').
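
For orientation, the usual textbook forms of these three window metrics are sketched below (the exact normalization used internally by HALCON may differ). Here g_1 and g_2 are the gray values of the two images, the sums run over the MaskSize x MaskSize window W around the candidate points (r_1, c_1) and (r_2, c_2), and \bar g_k, \sigma_{g_k} denote the mean and standard deviation inside the respective window:

\mathrm{ssd} = \sum_{(i,j)\in W} \left( g_1(r_1+i, c_1+j) - g_2(r_2+i, c_2+j) \right)^2

\mathrm{sad} = \sum_{(i,j)\in W} \left| g_1(r_1+i, c_1+j) - g_2(r_2+i, c_2+j) \right|

\mathrm{ncc} = \frac{1}{|W|} \sum_{(i,j)\in W} \frac{\left( g_1(r_1+i, c_1+j) - \bar g_1 \right)\left( g_2(r_2+i, c_2+j) - \bar g_2 \right)}{\sigma_{g_1}\, \sigma_{g_2}}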


To improve performance, the search area for the matches is restricted based on the approximate transformation HomMat2DGuide:
a point in Image2 is only considered as a possible match for a point in Image1 if its distance to that point,
after the point of Image1 has been mapped with HomMat2DGuide, is at most DistanceTolerance.
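
Conceptually, this restriction for one candidate point pair could be sketched as follows (a simplified illustration, not the actual implementation; it assumes HomMat2DGuide maps points of Image1 into Image2):

* Project the point (Row1, Col1) from Image1 into Image2 with the approximate transformation
projective_trans_pixel (HomMat2DGuide, Row1, Col1, RowProj, ColProj)
* Euclidean distance between the projection and a candidate point (Row2, Col2) in Image2
Dist := sqrt((RowProj - Row2) * (RowProj - Row2) + (ColProj - Col2) * (ColProj - Col2))
* The gray-value windows of the two points are compared only if Dist <= DistanceTolerance
IsCandidate := Dist <= DistanceTolerance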


After the initial matching is complete, a randomized search algorithm (RANSAC) is used to determine the transformation matrix HomMat2D.
It tries to find the matrix that is consistent with the largest number of correspondences.
For a point pair to be accepted, the distance of the point to the coordinates predicted by the transformation must not exceed the threshold DistanceThreshold.


Once this selection has been made, the matrix is further refined using all consistent points.
For this optimization, the estimation method can be chosen as the slow but mathematically optimal 'gold_standard' method
or as the faster 'normalized_dlt' method. Internally, vector_to_proj_hom_mat2d is used for this step.
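
The refinement step corresponds roughly to the following call (a sketch only: RowF/ColF and RowT/ColT are assumed to hold the coordinates of the consistent point pairs, the homogeneous w components are set to 1, the covariance inputs are left empty, and the exact parameter order should be checked against the HALCON reference of vector_to_proj_hom_mat2d):

* Homogeneous w components (all 1) for both point lists
tuple_gen_const (|RowF|, 1.0, W)
* Re-estimate the projective transformation from all consistent correspondences
vector_to_proj_hom_mat2d (RowF, ColF, W, RowT, ColT, W, 'normalized_dlt', [], [], [], [], [], [], HomMat2D, Covariance)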


Point pairs that still violate the consistency condition of the final transformation are removed,
and the matched points are returned as control values.
Points1 contains the indices of the matched input points of the first image,
and Points2 contains the indices of the corresponding points of the second image.
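
Because Points1 and Points2 are indices rather than coordinates, the matched coordinates are usually recovered with subset(), just as the example program further below does:

* Row/column coordinates of the matched points in image 1 and image 2
RowF := subset(Rows1, Points1)
ColF := subset(Cols1, Points1)
RowT := subset(Rows2, Points2)
ColT := subset(Cols2, Points2)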


The parameter RandSeed can be used to control the randomness of the RANSAC algorithm and thus to obtain reproducible results.
If RandSeed is set to a positive number, the operator returns the same result on every call with the same parameters,
because the internally used random number generator is initialized with this seed value.
If RandSeed = 0, the random number generator is initialized with the current time,
so in this case the results may not be reproducible.


Image1, Image2: the two input images.
Rows1, Cols1, Rows2, Cols2: row and column coordinates of the feature points in image 1 and image 2.
GrayMatchMethod: gray-value comparison metric ('ssd', 'sad', 'ncc').
MaskSize: size of the gray-value mask windows.
HomMat2DGuide: approximation of the homogeneous projective transformation between the two images.
DistanceTolerance: tolerance of the match search window.
MatchThreshold: threshold for the gray-value matching.
EstimationMethod: estimation algorithm for the transformation matrix ('gold_standard', 'normalized_dlt').
DistanceThreshold: threshold for the transformation consistency check.
RandSeed: seed for the random number generator.
HomMat2D: output parameter, the homogeneous projective transformation matrix.
Points1, Points2: output parameters, indices of the matched input points in image 1 and image 2.


Example program shipped with HALCON:
* This example program shows how images can be combined
* into a mosaic image using proj_match_points_ransac_guided
* and gen_projective_mosaic.
* It is shown how the calculation of the projection between two
* images can be accelerated using an image pyramid.
*
* Initializations
ImgPath := '3d_machine_vision/mosaic/'
ImgName := 'bga_r_'
Times := []
Colors := ['red','coral','yellow','lime green']
read_image (Images, ImgPath + ImgName + ['01','06'])
dev_update_off ()
dev_close_window ()
dev_open_window_fit_size (0, 0, 640, 980, 320, 490, WindowHandle)
dev_open_window_fit_size (0, 330, 490, 490, 1000, 490, WindowHandle1)
set_display_font (WindowHandle, 14, 'mono', 'true', 'false')
set_display_font (WindowHandle1, 14, 'mono', 'true', 'false')
* The internal camera parameters of the used camera
* (necessary to eliminate radial distortions)
gen_cam_par_area_scan_division (0.0121693, -2675.63, 7.40046e-006, 7.4e-006, 290.491, 258.887, 640, 480, CamParam)
change_radial_distortion_cam_par ('adaptive', CamParam, 0, CamParOut)
change_radial_distortion_image (Images, Images, Images, CamParam, CamParOut)
* To show the point matches that are used to compute the
* transformation between the images, we will show both images in a
* tiled image with some space between the images so that the extents
* of the images are easily visible.
tile_images_offset (Images, TiledImage, [0,500], [0,0], [-1,-1], [-1,-1], [-1,-1], [-1,-1], 640, 980)
*
* Now we can determine the transformations between the image pairs.
From := 1
To := 2
select_obj (Images, ImageF, From)
select_obj (Images, ImageT, To)
*
* Repeat the calculation 4 times with a different number of pyramid levels
for NumLevels := 1 to 4 by 1
    *
    dev_clear_window ()
    dev_set_window (WindowHandle)
    dev_clear_window ()
    dev_display (TiledImage)
    disp_message (WindowHandle, ['Calculate point matches','with ' + NumLevels + ' pyramid levels','Please wait ...'], 'window', 20, 10, 'black', 'true')
    *
    * Calculate the projection between the two images
    * Check the procedure's comments for details
    count_seconds (S1)
    proj_match_points_ransac_pyramid (ImageF, ImageT, NumLevels, RowFAll, ColFAll, RowTAll, ColTAll, ProjMatrix, Points1, Points2)
    count_seconds (S2)
    Times := [Times,S2 - S1]
    *
    * Display point correspondences
    gen_cross_contour_xld (PointsF, RowFAll, ColFAll, 6, rad(45))
    gen_cross_contour_xld (PointsT, RowTAll + 500, ColTAll, 6, rad(45))
    RowF := subset(RowFAll,Points1)
    ColF := subset(ColFAll,Points1)
    RowT := subset(RowTAll,Points2) + 500
    ColT := subset(ColTAll,Points2)
    gen_empty_obj (Matches)
    for K := 0 to |RowF| - 1 by 1
        gen_contour_polygon_xld (Match, [RowF[K],RowT[K]], [ColF[K],ColT[K]])
        concat_obj (Matches, Match, Matches)
    endfor
    dev_display (TiledImage)
    dev_set_color ('blue')
    dev_display (Matches)
    dev_set_color ('green')
    dev_display (PointsF)
    dev_display (PointsT)
    disp_message (WindowHandle, [|RowF| + ' point matches','Time used: ' + (S2 - S1)$'.3' + ' s'], 'window', 20, 10, 'black', 'true')
    *
    * Generate the mosaic image
    gen_projective_mosaic (Images, MosaicImage, 1, From, To, ProjMatrix, [2,1], 'false', MosaicMatrices2D)
    *
    * Display mosaic image
    get_image_size (MosaicImage, Width, Height)
    dev_set_window (WindowHandle1)
    dev_resize_window_fit_image (MosaicImage, 0, 330, [400,700], 700)
    dev_clear_window ()
    dev_display (MosaicImage)
    disp_message (WindowHandle1, 'Projective mosaic (used ' + NumLevels + ' pyramid levels)', 'window', 20, 10, 'black', 'true')
    disp_continue_message (WindowHandle1, 'black', 'true')
    stop ()
endfor
*
* Display execution times
dev_set_window (WindowHandle)
dev_close_window ()
MaxTime := max(Times)
BaseRow := 380
RectHeight := 300
disp_message (WindowHandle1, ['Time in s:','(#levels used)'], 'image', BaseRow + 20, 10, 'black', 'true')
for Index := 0 to |Times| - 1 by 1
    gen_rectangle1 (Rectangle, BaseRow - RectHeight * Times[Index] / MaxTime, 200 + Index * 100, BaseRow, 280 + Index * 100)
    disp_message (WindowHandle1, [Times[Index]$'.3','(' + (Index + 1) + ')'], 'image', BaseRow + 20, 200 + 100 * Index, 'black', 'true')
    dev_set_color (Colors[Index])
    dev_set_draw ('fill')
    dev_display (Rectangle)
endfor
disp_finished_message (WindowHandle1, 'black', 'true')

The local procedure proj_match_points_ransac_pyramid used above internally calls
proj_match_points_ransac_guided:


* This procedure uses an image pyramid to calculate
* the projective transformation between two images.
*
* If UseRigidTransformation is set to true,
* the results are restricted to rigid transformations
* (instead of projective transformations)
UseRigidTransformation := true
* Parameters for the Harris point detector
SigmaGrad := 0.7
SigmaSmooth := 2
Alpha := 0.04
Threshold := 50
*
* Generate image pyramids for both input images
gen_gauss_pyramid (ImageF, ImageFPyramid, 'constant', 0.5)
gen_gauss_pyramid (ImageT, ImageTPyramid, 'constant', 0.5)
* At the beginning, no approximated projection is known
HomMat2DGuide := []
*
* Calculate projective transformation on each pyramid level
for Level := NumLevels to 1 by -1
    * Select images from image pyramid
    select_obj (ImageFPyramid, ImageFLevel, Level)
    select_obj (ImageTPyramid, ImageTLevel, Level)
    * Extract interest points in both images
    points_harris (ImageFLevel, SigmaGrad, SigmaSmooth, Alpha, Threshold, RowsF, ColsF)
    points_harris (ImageTLevel, SigmaGrad, SigmaSmooth, Alpha, Threshold, RowsT, ColsT)
    * Calculate projection from point correspondences
    if (|HomMat2DGuide| == 0)
        * On the highest pyramid level, use proj_match_points_ransac
        get_image_size (ImageFLevel, Width, Height)
        proj_match_points_ransac (ImageFLevel, ImageTLevel, RowsF, ColsF, RowsT, ColsT, 'ncc', 10, 0, 0, Height, Width, [rad(-40),rad(40)], 0.5, 'gold_standard', 2.5 * pow(2,4 - Level), 42, ProjMatrix, Points1, Points2)
    else
        * On lower levels, use approximation from upper level as
        * input for proj_match_points_ransac_guided
        proj_match_points_ransac_guided (ImageFLevel, ImageTLevel, RowsF, ColsF, RowsT, ColsT, 'ncc', 10, HomMat2DGuide, 10 * pow(2.0,4.0 - Level), 0.5, 'gold_standard', 2.5 * pow(2.0,4.0 - Level), 42, ProjMatrix, Points1, Points2)
    endif
    if (UseRigidTransformation)
        * Use found point correspondences to calculate rigid transformation
        * with vector_to_rigid
        * Note, that the resulting transformation of proj_match_points_ransac_guided
        * is ignored in this case.
        RowF := subset(RowsF,Points1)
        ColF := subset(ColsF,Points1)
        RowT := subset(RowsT,Points2)
        ColT := subset(ColsT,Points2)
        vector_to_rigid (RowF + 0.5, ColF + 0.5, RowT + 0.5, ColT + 0.5, ProjMatrix)
        ProjMatrix := [ProjMatrix,0,0,1]
    endif
    * To be used on the next lower pyramid level, the projection has
    * to be adjusted to the new scale.
    hom_mat2d_scale_local (ProjMatrix, 0.5, 0.5, HomMat2DGuide)
    hom_mat2d_scale (HomMat2DGuide, 2, 2, 0, 0, HomMat2DGuide)
endfor
return ()