
Abstract—In this work, we evaluate autopicking algorithms for Ground Penetrating Radar (GPR) for applications in assessing mining room roof stability in potash mining. The algorithms reviewed include least squares fitting, phase assessment, histogram of oriented gradients, machine learning techniques, autoregressive modelling and hidden Markov methods. It is observed that preprocessing can play a very important role in the success of autopicking algorithms and can lead to efficient and robust results. The ability to apply an autopicking algorithm under multiple collection conditions means that it can be used across several geographical locations. The application of GPR to potash mine safety will require some modification to the operational characteristics of the algorithms reviewed.

Keywords—

Introduction

Seismic reflection imaging (SRI) is similar to GPR in that energy is transmitted through a medium as waves in order to interrogate or examine the geological characteristics of the medium. While SRI uses acoustic energy and GPR uses electromagnetic (EM) energy, both methodologies are based on wave propagation through the medium, and broadly similar algorithms can be applied to the recorded data (Forte and Pipan, 2017). EM wave propagation is sensitive to the dielectric constant of the medium, while acoustic waves are affected by the elastic properties of the medium. However, Harry M. (rep in proj) highlighted some key differences between GPR and SRI. GPR operates at higher frequencies, so there is much greater scattering and attenuation. As a result, many phase changes occur as the signal propagates through the medium. This nature of propagation makes the recorded data non-stationary and requires complex processing. The scattering that occurs in GPR signals and the high degree of attenuation mean that the signal-to-noise ratio can be quite poor. Other factors such as antenna configuration, ground coupling and surface material also affect the transmitted GPR wave.
Finally, “spatial variation in the strength and polarisation of the propagating energy is different between seismic and GPR.” These reasons make GPR signal behaviour different from that of SRI and, as such, require different advanced processing algorithms.

GPR has been applied in the archaeology, glaciology, mining, military, civil and utility industries. A critical aspect of GPR data analysis is the accurate picking of sub-surface objects seen in the recorded data; this type of GPR data analysis is called autopicking. A lot of research has gone into designing algorithms to obtain precise and accurate picking of subsurface targets. This work provides a summary of these algorithms with the goal of identifying the most appropriate autopicking algorithms for the novel application of assessing the condition of potash mining room roofs. It is of critical importance to know whether the roof of a potash mine is stable and to recognize conditions in which unstable geology exists. In this work we examine the features and mechanisms that are used in the picking process. We identify the most successful methodologies and indicate where further research into autopicking algorithms will be helpful. In the following sections, we group algorithms into the following areas of study. Section… discusses the effectiveness of each algorithm as well as highlighting which processing pattern is best suited to a particular situation. Section… is a summary of current research areas and places for improvement.

Algorithms

Autopicking using Least Squares Fitting (LSF) and Matched Filter / Threshold Detectors (ref)

Least squares fitting is used to minimize the error between the measured and modelled signal. It optimally retrieves the interface reflection signals after a detector has found all reflections in the received signal. LSF is an iterative method that works even when there are overlapping layer reflections, which makes it suitable for situations with closely spaced layers.
Since the LSF corrects false-alarm errors made by the detector, the detector can be designed to have a higher probability of detection. LSF can be used in both the time and frequency domains. The threshold and matched filter detectors are discussed below.

The Threshold Detector (TD) compares the level of the analyzed signal to a fixed threshold, with the threshold value selected based on the Neyman–Pearson criterion. An envelope detector applied first can help eliminate multiple peaks. The maximum value of the signal envelope is compared against the chosen threshold: if it is less than the threshold, processing skips to the next step; otherwise the time delay of the maximum value is taken as that of the detected pulse.

A Matched Filter Detector (MFD) has an impulse response h(t) that is the time reversal of the GPR signal x(t):

h(t) = x(T − t)

The absolute value of the filtered signal is compared against a threshold, and the time-delay value is subtracted from the duration T of the signal x(t) to obtain the time delay of the detected pulse. At low SNR, the MFD outperforms the threshold detector, while at higher SNR both perform similarly. The MFD assumes the detected signal is a known signal in noise; if this assumption does not hold at any point in time, performance will be affected. When the layers are thick, the MFD is better, but with thin layers the TD is preferred, because multiple thin layers distort the reflected pulse. With only one layer, irrespective of its thickness, the MFD is better. The output of interest is the two-way travel time (time delay), which can be used to calculate layer depth.

Autopicking and Phase Assessment by means of Attribute Analysis [2]

The behaviour of the signal gives information about the GPR reflection by considering the reflection strength and the phase attribute derived from the complex trace a(t). A discrete Hilbert Transform (DHT) is applied to the real GPR data x(t) to generate the quadrature (imaginary) trace x̂(t).
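The matched filter detector described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: filtering with the time-reversed pulse h(t) = x(T − t) is equivalent to cross-correlating the received trace with the known pulse, and the strongest correlation peak above the threshold gives the detected delay. The pulse shape, noise level and threshold below are illustrative assumptions.

```python
import numpy as np

def matched_filter_delay(rx, pulse, threshold):
    """Matched-filter detector: correlate the received trace with the
    known pulse and report the delay of the strongest match, provided
    it exceeds the detection threshold."""
    # h(t) = x(T - t): filtering with the time-reversed pulse equals
    # cross-correlation with the pulse itself.
    corr = np.correlate(rx, pulse, mode="valid")
    peak = int(np.argmax(np.abs(corr)))
    if np.abs(corr[peak]) < threshold:
        return None          # no pulse detected
    return peak              # sample index (time delay) of the pulse

# Synthetic check: bury a known wavelet at a known delay in noise.
rng = np.random.default_rng(0)
pulse = np.sin(2 * np.pi * np.arange(16) / 8)   # 16-sample wavelet
trace = 0.1 * rng.standard_normal(256)
trace[60:76] += pulse                            # true delay = 60 samples
delay = matched_filter_delay(trace, pulse, threshold=3.0)
```

Converting the recovered sample delay to depth then only requires the sampling interval and an estimate of the wave velocity in the medium.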
The combination of the real GPR data x(t) and the quadrature trace x̂(t) makes up the complex trace a(t) = x(t) + i x̂(t), from which we can derive the cosine of the phase, cos φ(t) = x(t)/|a(t)|, where |a(t)| is the time-varying modulus. Horizon picking is performed by taking the phase information and identifying the peaks and troughs as well as the zero-crossings between the phases. A threshold is set to filter out noise in the phases, and the remaining picks are automatically connected to form horizons. When two adjacent traces show the same polarity and close travel times, they are considered part of the same horizon [2]. To determine the position of each picked horizon in a GPR reflection, a phase assessment is carried out by analyzing the cosine phase of each selected horizon (a window of the B-scan). The “Averaged Cosine Trace (ACT)” is used to reconstruct the reflected wavelet. Two conditions must be met by the ACT while it iteratively investigates each A-scan in the selected horizon: 1. “The modulus of the peak central phase must reach a minimum threshold close to 1”; and 2. “cross-correlation of the current and previous wavelet has to reach a predefined threshold”.

Limitations: Low SNR can affect the picking of a horizon. Interference causes phases from unrelated events to be picked with a horizon.
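The attribute computation above can be sketched as follows, assuming scipy's Hilbert transform as a stand-in for the DHT; the single-frequency test trace is an illustrative assumption, not data from the paper.

```python
import numpy as np
from scipy.signal import hilbert

def cosine_phase(trace):
    """Attribute analysis on one A-scan: build the complex trace
    a(t) = x(t) + i*x_hat(t) via the Hilbert transform, then derive
    the reflection strength |a(t)| and the cosine of the phase."""
    a = hilbert(trace)                  # analytic (complex) trace
    strength = np.abs(a)                # time-varying modulus |a(t)|
    cos_phase = np.real(a) / np.maximum(strength, 1e-12)
    return strength, cos_phase

# Example: a pure sinusoid has constant unit reflection strength, and
# its cosine phase reproduces the normalized waveform.
t = np.linspace(0, 1, 512, endpoint=False)
x = np.cos(2 * np.pi * 25 * t)
strength, cphase = cosine_phase(x)
```

Because cos φ(t) is bounded in [−1, 1] regardless of amplitude decay, it is a convenient attribute for tracking horizons through attenuating material.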
Preprocessing such as band-pass filtering, background removal and migration helps. The algorithm is “sensitive to vertical resolution”. A thresholding and/or polarity assessment algorithm is needed in post-processing to limit the number of horizons picked.

Discussion and Observations about Reviewed Algorithms

From a review of these algorithms it may be observed that data with high Signal-to-Noise Ratios (SNR) produce better results. Further, appropriate pre-processing techniques can be used to improve the SNR and highlight the critical features in the data. However, pre-processing steps do add to the computational burden. It is noted that the window and analysis size for both 2D and 3D data sets have an impact. A single signal might be distorted and the horizon location hidden in the background, but the overall form of the target may still be observable in a bigger data window. Having the right window size, moving or fixed, which analyzes a large amount of data, is important in eliminating noise and maintaining continuity of the target. At the onset, the decision on what dimension of data to use has to be made in terms of A-scan, B-scan or C-scan; that informs the choice of every other process. It is also observed that some of the algorithms require the development of models and of a training system (e.g. ANN). The development of these models and the training system requires a level of expertise beyond the application of autopicking. Most models are developed offline, and it is important to carefully choose applicable training sets.
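As a concrete instance of the background-removal preprocessing mentioned above, a common step is to subtract the mean trace (globally, or over a moving window of traces) from every column of a B-scan, suppressing horizontal ringing while leaving localized reflections intact. This is a generic sketch, with traces stored as columns; the array shapes are illustrative assumptions.

```python
import numpy as np

def remove_background(bscan, window=None):
    """Background removal on a B-scan (rows = time samples,
    columns = traces). Subtract the mean trace globally, or over a
    moving window of +/- window//2 traces when `window` is given."""
    bscan = np.asarray(bscan, dtype=float)
    if window is None:
        return bscan - bscan.mean(axis=1, keepdims=True)
    out = np.empty_like(bscan)
    n = bscan.shape[1]
    for j in range(n):
        lo, hi = max(0, j - window // 2), min(n, j + window // 2 + 1)
        out[:, j] = bscan[:, j] - bscan[:, lo:hi].mean(axis=1)
    return out
```

The moving-window variant trades a little extra computation for robustness to slowly varying backgrounds, which matters when the antenna coupling drifts along the survey line.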
Some algorithms develop adaptive models; the construction of these models is done on-site and requires a priori knowledge of expected results. This approach can lead to some immunity to noise. Thresholds and other parameters chosen at various stages of some algorithms need to be selected by trial and error or by performing an analysis of training data. Few algorithms offer suggestions or guidelines for picking these; if these parameters are not appropriately selected, the results will be negatively impacted. In some applications, algorithms are chosen based on the features and characteristics of the output. For example, for landmine detection, a probability output may be more useful so that a human can evaluate the appropriate response based on other environmental issues and situational factors. Some processing techniques, such as the Hough transform, are suitable for indicating parametric targets but are not well suited to non-parametric targets. It is observed that analysis which considers the frequency content of signals can produce better results than analysis that relies only on spatial features in the data. Typically, 3D data sets contain more information than 2D (B-scan) or 1D (A-scan) data sets. However, the computational load of processing 3D data can be significant and impacts the algorithm's suitability for portable and real-time processing.

Application to Potash Mining

The autopicking algorithms reviewed above are examined for their applicability to potash mining room roof assessment. GPR systems have been installed on potash boring machines to image clay seams in the roof above mining rooms. The operator of the boring machine sees the real-time display of the GPR, which includes the output of an autopicking algorithm applied to the GPR data.
Using this information the operator monitors the status of the mining room roof and decides whether to proceed with mining activity, to provide rock-bolts for roof support, or even to abandon the mining room if excessive thinning of the salt layer in the roof (also known as the salt-back) is observed. While this system, which has been in place for several years, has served the mining industry well, there is a desire to improve the autopicking algorithm. The ultimate goal is to develop better methods to track the position of the clay seam so that the thickness of the salt-back may be known and potentially dangerous operating conditions for potash mining can be detected. The operational conditions in the potash mine have to be considered when choosing algorithms, to ensure the algorithm is accurate, interactive, robust, efficient and real-time.

One approach that may be useful to consider for the mining application is the use of the instantaneous amplitude and lateral phase continuity. In the event of low SNR, these algorithms can be augmented by the least squares fitting method [1] and additionally by the use of a matched filter detector. Another approach to consider is the B-scan of the GPR field data, where a moving window may be specified followed by the application of the HOG algorithm at each pixel to obtain magnitude and phase information. The HOG algorithm has been shown to be better than other algorithms (such as FROSAW, HMM, EHD (Ref)) for detection of magnitude and phase. However, the HOG algorithm cannot outperform an edge detector, but it can be integrated with the HMM (which performs better than an edge detector) to determine the confidence level (probability) of cracks or breaks from the GPR data. A third approach would be to implement a machine learning algorithm which would try to mimic the way a trained operator evaluates data.
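The gradient magnitude and orientation computation at the heart of the HOG approach above can be sketched for a single cell of a B-scan window. This is a minimal numpy illustration, not the reviewed implementation; the bin count and ramp-image test are illustrative assumptions.

```python
import numpy as np

def hog_cell(window, n_bins=9):
    """Histogram of Oriented Gradients for one cell of a B-scan window:
    a gradient-magnitude-weighted histogram of unsigned gradient
    orientations (0..180 degrees), L2-normalized."""
    gy, gx = np.gradient(np.asarray(window, dtype=float))
    magnitude = np.hypot(gx, gy)
    # Unsigned orientation in [0, 180), as in standard HOG.
    orientation = np.rad2deg(np.arctan2(gy, gx)) % 180.0
    bins = np.minimum((orientation / (180.0 / n_bins)).astype(int),
                      n_bins - 1)
    hist = np.zeros(n_bins)
    np.add.at(hist, bins.ravel(), magnitude.ravel())
    return hist / (np.linalg.norm(hist) + 1e-12)

# A horizontally layered window (intensity increasing downward, as a
# flat reflector would appear) concentrates energy in the 90-degree bin.
layered = np.tile(np.arange(8.0).reshape(-1, 1), (1, 8))
h = hog_cell(layered)
```

A horizon in a B-scan produces strong, laterally consistent vertical gradients, which is why these orientation histograms discriminate layer reflections from diffuse clutter.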
Gradient Boosted Trees (GBT) could be used to classify amplitudes on an A-scan, and the time-depth of the chosen amplitude would then be used to calculate the depth of the safety margin. One additional approach to consider is to model the potash roof environment using the simulation tool gprMax (Ref). The simulation tool can replicate real-life conditions in the mine geology and will be an effective tool for testing new or hybrid algorithms. Using gprMax and a formulated series of test scenarios will be an effective way to develop and improve mine safety autopicking algorithms.

Conclusion

In this work we have reviewed a number of leading autopicking algorithms in order to identify their main characteristics and their applicability to assessing mine roof conditions. The roofs of mines present a unique situation in that the actions of the boring machine have a significant impact on roof thickness; thus trends and patterns in the data may be connected to boring machine operations. In addition, an understanding of the process by which the salt beds were initially formed is also an important consideration. The salt beds essentially developed as prehistoric seas shifted and dried over millions of years. Thus the concept of layers of material whose thickness is fundamentally based on the processes involved in their formation can be considered. These factors are, of course, altered by the natural evolution of the earth's crust over the millions of years since the salt beds formed. Ultimately it is expected that a hybrid algorithm which captures all of the various facets of this problem domain will be developed. The expectation is that if an effective methodology can be developed it will find application in numerous potash mining environments.

Acknowledgment

We acknowledge the support of Nutrien Ltd. in collaboration with Sensors and Software Inc. for their vital support for this project.
Further, we recognize Mitacs and the University of Regina for financial support.

Summary of Reviewed Autopicking Algorithms

M. Dossi, et al. — Attribute-based automated picking using phase assessment (lateral phase continuity)
Applications: seismic data (oil exploration); GPR data (road inspection, glacier monitoring)
Advantages: independent of the interpreter; interactive, as performance can be evaluated periodically; works well with limited or no amplitude recovery
Limitations: interference or low signal-to-noise ratio can affect results; sensitive to vertical resolution

W. Al-Nuaimy — Automatic feature detection and interpretation: neural network for classification
Applications: solid target detection; liquid target detection
Advantages: approaches integrated together to give a more accurate result; flexible, robust, accurate, with noise immunity
Limitations: significant computational time due to the continuous information; no real-time data analysis

P. A. Torrione, et al. — Histograms of Oriented Gradients (HOG) feature extraction with a linear classifier (partial-least-squares discriminant analysis)
Applications: landmine detection (GPR data)
Advantages: implemented in real time; computationally inexpensive and efficient; overcomes EHD limitations (manual target localization, special-purpose classification)
Limitations: can be slow to analyze every pixel in the GPR data, depending on the type of classifier used (e.g. random forest)

P. D. Gader, et al. — Landmine detection using Hidden Markov Models
Applications: landmine detection (GPR data)
Advantages: can be implemented in real time despite its computational complexity; distinguishes between a mine and clutter
Limitations: computational complexity

U. Seiffert, et al. — Automated detection of targets in GPR data using machine learning techniques
Applications: GPR data (road survey)
Advantages: real-time operation can be obtained; a hybrid of the techniques gives a significant improvement in detection performance
Limitations: each technique has its own strengths and weaknesses; computation time can increase when the techniques are combined
