Research on measuring method of large-caliber gun muzzle vibration
Gang Zhao^{1} , Qiang Chen^{2} , YingJun Wang^{3}
^{1}PLA 73906, 210041, Nanjing, P. R. China
^{2, 3}Department of Control and System Engineering, Nanjing University, 210093, Nanjing, P. R. China
^{2}Corresponding author
Journal of Vibroengineering, Vol. 17, Issue 5, 2015, p. 2229-2235.
Received 27 May 2015; received in revised form 3 August 2015; accepted 10 August 2015; published 15 August 2015
In this paper, a measuring system based on binocular vision is developed for research on large-caliber gun muzzle vibration. The system consists of two high-speed cameras and supporting equipment. A two-step calibration method based on the radial alignment constraint is adopted to calibrate the cameras; an inter-frame differential multiplication method is used to detect the centroid of the moving target accurately; and an improved Kalman filter tracking algorithm is used to obtain the motion trajectory of the muzzle identification point. The system has been applied in practice to the vibration measurement of a type of gun muzzle, and meaningful original data were acquired.
Keywords: gun muzzle vibration measurement, binocular vision, target detection, Kalman filter.
1. Introduction
In the projectile firing process, the large expansion force produced by transient propellant combustion, the friction of the relative movement between the projectile and the barrel, and the inertial force of the gun all lead to muzzle vibration. This vibration response directly affects the lethality and precision of the gun. Because many factors limit the range and precision of measuring instruments, such as the instantaneity of firing, the spatial displacement of the muzzle, and interference from muzzle flame and smoke, non-contact methods are usually adopted to detect muzzle vibration responses. Conventional non-contact measurements, such as displacement sensors or laser vibrometers, can only be applied to small-caliber guns and one-dimensional motion testing; for large-caliber artillery with complex three-dimensional motion, accurate data are hard to obtain.
2. Theoretical background
A single camera loses the depth information of an object in the imaging process. To reconstruct the three-dimensional information of an object in space, two cameras are needed to image the same scene. In actual measurement, if the relative positions of the two cameras in the world coordinate system are known, the three-dimensional information of objects in the field of view can be measured according to the triangulation principle.
Fig. 1. Binocular vision
The binocular vision schematic is shown in Fig. 1, where $A\left({x}_{w},{y}_{w},{z}_{w}\right)$ is any point in space, the two camera coordinate systems are $({x}_{1},{y}_{1},{z}_{1})$ and $({x}_{2},{y}_{2},{z}_{2})$, the two image coordinate systems are $({X}_{1},{Y}_{1})$ and $({X}_{2},{Y}_{2})$, and the effective focal lengths of the two cameras are ${f}_{1}$ and ${f}_{2}$, so:
${X}_{1}=\frac{{f}_{1}{x}_{1}}{{z}_{1}},\mathrm{}\mathrm{}\mathrm{}{Y}_{1}=\frac{{f}_{1}{y}_{1}}{{z}_{1}},\mathrm{}\mathrm{}\mathrm{}{X}_{2}=\frac{{f}_{2}{x}_{2}}{{z}_{2}},\mathrm{}\mathrm{}\mathrm{}{Y}_{2}=\frac{{f}_{2}{y}_{2}}{{z}_{2}}.$
The 3-D coordinates of the point in the left camera coordinate system can then be expressed as Eq. (4):
The relationship between the world coordinate system and the two camera coordinate systems can be expressed as follows:
$\left[\begin{array}{l}{x}_{1}\\ {y}_{1}\\ {z}_{1}\end{array}\right]={R}_{1}\left[\begin{array}{l}{x}_{w}\\ {y}_{w}\\ {z}_{w}\end{array}\right]+{T}_{1},\mathrm{}\mathrm{}\mathrm{}\left[\begin{array}{l}{x}_{2}\\ {y}_{2}\\ {z}_{2}\end{array}\right]={R}_{2}\left[\begin{array}{l}{x}_{w}\\ {y}_{w}\\ {z}_{w}\end{array}\right]+{T}_{2},$
where ${R}_{i}$ and ${T}_{i}$ are the rotation matrix and translation vector of camera $i$.
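With the projection matrices of the two calibrated cameras, the triangulation principle above can be sketched in a few lines. This is a generic linear (DLT) triangulation, not necessarily the paper's exact implementation; the function name and the 3×4 projection-matrix layout are illustrative assumptions:

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Recover a 3-D world point from its two image projections,
    given the 3x4 projection matrices P1 and P2 of the two cameras."""
    # Each view contributes two linear equations in the homogeneous point X:
    #   u * (p3 . X) - (p1 . X) = 0,   v * (p3 . X) - (p2 . X) = 0,
    # where p1, p2, p3 are the rows of that view's projection matrix.
    A = np.vstack([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    # The least-squares solution is the right singular vector of A
    # associated with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # de-homogenize
```

In use, each projection matrix would be assembled from a camera's intrinsic matrix and its $R_i$, $T_i$ pose obtained from calibration.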
3. Methods
3.1. Experimental system
The measurement system is shown in Fig. 2. It includes two high-speed cameras, camera lenses, cables and triggers: the Vision Research Phantom v1610 camera and the Nikon 400 mm f/2.8D IF-ED AF-S II Nikkor lens are used to build the system. Considering the effects of the muzzle blast wave, the field illumination conditions and the terrain, the two high-speed cameras are placed on one side of the gun, 20 meters apart, with a perpendicular distance of 20 meters from the cameras to the gun. The cameras and other equipment are connected by cables.
3.2. Experiment procedure
3.2.1. Calibration
A self-made calibration model is fixed on one side of the muzzle by a fixing device, and the two cameras photograph the model from different perspectives at the same time. After the pictures of the model are acquired, the two-step calibration method based on the radial alignment constraint is adopted to determine the external and internal parameters of the two cameras.
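The two-step method refines radial distortion and pose beyond what is shown here; as a simplified sketch of the underlying idea, a direct linear transform (DLT) can estimate a camera's 3×4 projection matrix from six or more known, non-coplanar mark points on a calibration model. The function name and setup below are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def dlt_projection(world_pts, image_pts):
    """Estimate a 3x4 projection matrix P (up to scale) from >= 6
    correspondences between known 3-D calibration points and their
    2-D image projections, via the direct linear transform."""
    A = []
    for (X, Y, Z), (u, v) in zip(world_pts, image_pts):
        # Two linear equations per correspondence, from u = (p1.X)/(p3.X)
        # and v = (p2.X)/(p3.X) cleared of denominators.
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    # Null-space solution: right singular vector of the smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 4)
```

The intrinsic and extrinsic parameters can then be recovered by decomposing the estimated matrix; the radial-constraint refinement of the two-step method would follow as a separate stage.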
The calibration model is shown in Fig. 3: in a), ${A}_{1}=$ 75° and ${A}_{2}=$ 15°; in b), the size of each plate is 2400×4800 mm and the number of mark points is 20; in c), the diameter of each mark point is 40 mm and the center distance between points is 80 mm. Fig. 4 shows the pictures taken by the two cameras.
Fig. 2. Measurement system
Fig. 3. Calibration model
Fig. 4. Calibration pictures
a) Picture from left camera
b) Picture from right camera
3.2.2. Image acquisition
The Phantom Camera Control Application is used to control the two high-speed cameras. The resolution of the cameras is 1280×500, the sample rate is 10,000 frames per second, and the exposure time is 15 μs.
3.2.3. Target detection
In this paper, the inter-frame differential multiplication method is used to segment the video image sequences of the two cameras. The method produces a high correlation peak, which not only effectively suppresses false moving targets but also amplifies the difference between the target area and the background and strengthens the moving target area.
The inter-frame differential multiplication used in the paper takes the product of the absolute differences of adjacent frames:
$G\left(x,y\right)=\left|{f}_{2}\left(x,y\right)-{f}_{1}\left(x,y\right)\right|\cdot\left|{f}_{3}\left(x,y\right)-{f}_{2}\left(x,y\right)\right|\cdot\left|{f}_{4}\left(x,y\right)-{f}_{3}\left(x,y\right)\right|,$
where ${f}_{1}(x,y)$, ${f}_{2}(x,y)$, ${f}_{3}(x,y)$, ${f}_{4}(x,y)$ denote four adjacent frames and $G(x,y)$ is the resulting pixel value.
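A minimal numpy sketch of the inter-frame differential multiplication step, assuming the usual product-of-adjacent-absolute-differences construction; the normalization at the end is an added convenience, not from the paper:

```python
import numpy as np

def frame_diff_multiply(f1, f2, f3, f4):
    """Multiply the absolute differences of adjacent frames. Stationary
    background cancels in every difference, while a textured moving target
    survives all three factors and is reinforced by the product."""
    d12 = np.abs(f2.astype(float) - f1)
    d23 = np.abs(f3.astype(float) - f2)
    d34 = np.abs(f4.astype(float) - f3)
    g = d12 * d23 * d34
    # Normalize to [0, 1] so a fixed threshold can follow.
    return g / g.max() if g.max() > 0 else g
```

A pixel survives only if it changes across all three frame pairs, which is what suppresses false targets caused by a single noisy frame.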
3.2.4. Target tracking
An improved Kalman filter tracking algorithm is used to obtain the motion trajectory of the muzzle identification point, in which the centroid of the moving target is taken as the observed input. The algorithm tracks the moving target through the estimation and correction of its centroid position.
To restrain the divergence of the Kalman filter, an improved tracking algorithm based on a fading factor is proposed. The algorithm expands the a priori estimate covariance ${P}^{-}\left(k\right)$, offsetting the accumulated error and improving the estimation accuracy and robustness of the Kalman filter. The fading factor and the scalar factor are used:
$H{\left(k\right)}^{\mathrm{\text{'}}}={\alpha}_{k}H\left(k\right),$
where ${\eta}_{k}$ is the fading factor, ${\alpha}_{k}$ is the scalar factor, the new Kalman gain matrix is:
The optimal value of ${\alpha}_{k}$ can generally be obtained by the window estimation method, estimated as:
$H{\left(k\right)}^{\mathrm{\text{'}}}=\left\{\begin{array}{l}\frac{{\eta}_{k-1}{N}_{k}{{N}_{k}}^{T}}{1+{\eta}_{k-1}},\mathrm{}\mathrm{}\mathrm{}k>1,\\ \frac{1}{2}{N}_{0}{{N}_{0}}^{T},\mathrm{}\mathrm{}\mathrm{}\mathrm{}\mathrm{}\mathrm{}\mathrm{}\mathrm{}k=1.\end{array}\right.$
And the optimal value of the fading factor ${\eta}_{k}$ can be estimated as:
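A minimal sketch of a Kalman filter in which the a priori covariance is inflated by a fading factor, so that new measurements are weighted more heavily and divergence is suppressed. Here the factor is a fixed constant rather than the window-estimated optimal value; the constant-velocity model and all parameter values are illustrative assumptions, not the paper's settings:

```python
import numpy as np

def fading_kalman(zs, eta=1.05, dt=1.0):
    """Track a 1-D position sequence zs with a constant-velocity model.
    eta >= 1 is the fading factor applied to the predicted covariance."""
    F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition (pos, vel)
    H = np.array([[1.0, 0.0]])              # observe position only
    Q = 1e-4 * np.eye(2)                    # process noise covariance
    R = np.array([[1e-2]])                  # measurement noise covariance
    x = np.array([zs[0], 0.0])              # initial state
    P = np.eye(2)                           # initial covariance
    out = []
    for z in zs[1:]:
        # Predict, expanding the a priori covariance by the fading factor.
        x = F @ x
        P = eta * (F @ P @ F.T) + Q
        # Update with the new measurement.
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (np.array([z]) - H @ x)
        P = (np.eye(2) - K @ H) @ P
        out.append(x.copy())
    return np.array(out)
```

With `eta = 1.0` this reduces to the standard Kalman filter; values slightly above 1 prevent the gain from shrinking so far that old model errors dominate new observations.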
3.3. Results and analysis
3.3.1. Calibration information
The calibration results of two cameras are as follows:
Left camera:
Right camera:
3.3.2. Target detection
To extract the target region accurately, the inter-frame differential multiplication method is applied to the video image sequences of the two cameras from their different perspectives. To calculate the centroid coordinates, mathematical morphology filtering, connected domain analysis and feature extraction are used in the image processing. The results are shown in Fig. 5.
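A minimal sketch of the centroid computation on the differential image: threshold, then take the intensity-weighted mean position. The morphological filtering and connected-domain analysis used in the paper are omitted, so this single-target simplification and its threshold parameter are illustrative assumptions:

```python
import numpy as np

def target_centroid(g, thresh=0.5):
    """Sub-pixel centroid (row, col) of the target in differential image g:
    keep pixels above a fraction of the maximum, then compute their
    intensity-weighted mean position."""
    mask = g >= thresh * g.max()
    w = np.where(mask, g, 0.0)          # weights: intensities inside the mask
    total = w.sum()
    rows, cols = np.indices(g.shape)
    return (rows * w).sum() / total, (cols * w).sum() / total
```

The intensity weighting gives sub-pixel precision, which matters here because the centroid feeds directly into the tracking and triangulation stages.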
Fig. 5. Target detection
a) Picture from left camera
b) Picture from right camera
3.3.3. Target tracking
The centroids of the target are used as the input of the Kalman filter tracking model, and the fading factor is introduced to increase the weight of new information and avoid divergence of the Kalman filter. The results are shown in Fig. 6.
Fig. 6. Target tracking
a) Picture from left camera
b) Picture from right camera
3.3.4. Three dimensional measurement
In this paper, the world coordinate system is determined by the calibration model; the origin and coordinate axes are shown in Fig. 7. The $X$ axis is parallel to the direction of the muzzle, the $Y$ axis is parallel to the ground and points toward the cameras, and the $Z$ axis is perpendicular to the muzzle direction and points toward the sky.
Fig. 7. World coordinate system
The muzzle vibration displacement curves in two-dimensional coordinates are shown in Fig. 8, and the three-dimensional displacement curve is shown in Fig. 9.
According to the results, due to the recoil of the gun, the displacement curve along the $X$ axis rises continuously, reflecting the constant backward motion of the muzzle; the curve along the $Y$ axis is a continuous oscillation, representing the lateral swing; and the curve along the $Z$ axis rises, representing the vertical displacement, although the increase is far smaller than that along the $X$ axis.
Fig. 8. The muzzle vibration displacement curves in two-dimensional coordinates
Fig. 9. The muzzle vibration displacement curve in three-dimensional coordinates
4. Conclusions
To resolve the problems of conventional measurement methods, this paper researches and improves the measuring method for large-caliber gun muzzle vibration, applying the binocular vision measurement method in practice to the vibration measurement of a type of gun muzzle. The measurement results provide meaningful original data to support gun performance improvement and gun model development.
Acknowledgements
This work is supported by the National Key Scientific Instrument Development Projects, No. 2013YQ470765.