In this study, we have presented our findings on the deployment of machine learning (ML) techniques to enhance the performance of LTE applications employing a quasi-Yagi-Uda antenna in the 2100 MHz UMTS band. Several approaches, including full-wave simulation, measurement, and an RLC equivalent-circuit model, are discussed in this article as ways to assess the antenna's suitability for the intended applications. The CST simulation gives the proposed antenna a reflection coefficient of -38.40 dB at 2.1 GHz and a -10 dB bandwidth of 357 MHz (1.95 GHz to 2.31 GHz). With dimensions of 0.535λ0 × 0.714λ0, the antenna is not only compact but also exhibits a maximum gain of 6.9 dB, a maximum directivity of 7.67, a VSWR of 1.001 at the center frequency, and a maximum efficiency of 89.9%. The antenna is fabricated on a low-cost FR4 substrate. The RLC equivalent circuit, also referred to as the lumped-element model, exhibits characteristics that closely match those of the proposed Yagi antenna. Seven supervised regression ML models are then used to predict the antenna's resonant frequency and directivity. Their performance is evaluated using several metrics: the variance score, R-squared (R²), mean squared error (MSE), mean absolute error (MAE), root mean squared error (RMSE), and mean squared logarithmic error (MSLE). Among the seven models, linear regression (LR) yields the lowest error and highest accuracy when predicting directivity, whereas ridge regression (RR) performs best when predicting frequency. The simulation results from CST and ADS, together with the measured results and the ML predictions, show that the proposed antenna is a strong candidate for the intended UMTS LTE applications.
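For reference, the quoted -10 dB bandwidth and VSWR figures follow from the standard relations between the reflection coefficient expressed in decibels and the voltage standing-wave ratio:

$$
|\Gamma| = 10^{S_{11}(\mathrm{dB})/20}, \qquad \mathrm{VSWR} = \frac{1 + |\Gamma|}{1 - |\Gamma|},
$$

so a deeper reflection-coefficient dip at the match point corresponds to a VSWR approaching unity.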
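As an illustration of the lumped-element view, the following Python sketch computes the input impedance and reflection response of a parallel RLC resonator against a 50 Ω reference. The topology and component values here are hypothetical placeholders tuned near 2.1 GHz; the actual ADS model of the proposed antenna may use a different network.

```python
import numpy as np

# Minimal sketch of a lumped-element (RLC) antenna model, assuming a
# parallel RLC resonator. R, L, C are hypothetical values chosen only
# to place the resonance near 2.1 GHz; they are not the paper's values.
Z0 = 50.0                          # reference impedance (ohms)
R, L, C = 55.0, 3.0e-9, 1.9e-12    # hypothetical component values

f = np.linspace(1.8e9, 2.4e9, 601)   # frequency sweep around the UMTS band
w = 2 * np.pi * f
Y = 1 / R + 1 / (1j * w * L) + 1j * w * C   # admittance of the parallel RLC
Zin = 1 / Y                                  # input impedance

gamma = (Zin - Z0) / (Zin + Z0)              # reflection coefficient
s11_db = 20 * np.log10(np.abs(gamma))
vswr = (1 + np.abs(gamma)) / (1 - np.abs(gamma))

i = np.argmin(s11_db)
print(f"resonance ~ {f[i]/1e9:.3f} GHz, "
      f"S11 = {s11_db[i]:.1f} dB, VSWR = {vswr[i]:.3f}")
```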
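The regression comparison can be reproduced in outline with scikit-learn. The sketch below trains the two models named above (LR and RR) and scores them with the listed metrics on a synthetic stand-in dataset; the study's actual features, targets, and remaining five models are not reproduced here.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.model_selection import train_test_split
from sklearn.metrics import (explained_variance_score, r2_score,
                             mean_squared_error, mean_absolute_error,
                             mean_squared_log_error)

# Synthetic stand-in for the simulated antenna dataset (hypothetical:
# e.g. normalized geometry parameters as features, directivity as target).
rng = np.random.default_rng(0)
X = rng.uniform(0.3, 0.8, size=(200, 3))
y = 6.0 + 2.0 * X[:, 0] - 1.0 * X[:, 1] + rng.normal(0, 0.05, 200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          random_state=0)

# Two of the seven regressors compared in the study: LR and RR.
for name, model in [("LR", LinearRegression()), ("RR", Ridge(alpha=1.0))]:
    pred = model.fit(X_tr, y_tr).predict(X_te)
    mse = mean_squared_error(y_te, pred)
    print(f"{name}: VS={explained_variance_score(y_te, pred):.4f} "
          f"R2={r2_score(y_te, pred):.4f} MSE={mse:.5f} "
          f"MAE={mean_absolute_error(y_te, pred):.5f} "
          f"RMSE={np.sqrt(mse):.5f} "
          f"MSLE={mean_squared_log_error(y_te, pred):.6f}")
```

A lower MSE, MAE, RMSE, and MSLE together with a variance score and R² closer to one indicate a better-fitting model, which is the basis on which LR and RR were selected for directivity and frequency prediction, respectively.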