Numerical simulation of the behaviour of blood flow through a stenosed bifurcated artery in the presence of a single mild stenosis at the parent artery is investigated. The flow analysis applies the incompressible, steady, three-dimensional Navier-Stokes equations for non-Newtonian generalized power law fluids. The behaviour of blood flow is simulated numerically using COMSOL Multiphysics, which is based on the finite element method. The results show the effect of the severity of the stenosis on flow characteristics such as the axial velocity, and the streamline patterns exhibit flow recirculation zones.
This paper proposes a hybrid Wavelet-Autoregressive Integrated Moving Average (W-ARIMA) model to explore the advantage of the hybrid model over a conventional ARIMA model. It combines two methods, the Discrete Wavelet Transform (DWT) and the ARIMA model, using Standardized Precipitation Index (SPI) drought data for drought forecasting model development. SPI data from January 1954 to December 2008 were divided into two parts: 80% for training and 20% for testing. The results were compared with those of the conventional ARIMA model using the Mean Square Error (MSE) and Mean Absolute Error (MAE) as error measures. The proposed method achieved the best forecasting performance.
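The abstract does not specify which wavelet or decomposition depth the authors used; as a minimal illustration of the decomposition step, the sketch below implements a one-level Haar DWT in pure Python. Each resulting sub-series (smooth approximation and detail) could then be modelled by its own ARIMA and the forecasts recombined.

```python
# One-level Haar DWT: splits a series into approximation (trend) and
# detail (fluctuation) coefficients; the inverse transform reconstructs
# the original series exactly.
import math

def haar_dwt(series):
    """Return (approximation, detail) coefficients for an even-length series."""
    s = 1 / math.sqrt(2)
    approx = [(series[i] + series[i + 1]) * s for i in range(0, len(series), 2)]
    detail = [(series[i] - series[i + 1]) * s for i in range(0, len(series), 2)]
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse transform: perfectly reconstructs the original series."""
    s = 1 / math.sqrt(2)
    out = []
    for a, d in zip(approx, detail):
        out.extend([(a + d) * s, (a - d) * s])
    return out
```

In a hybrid scheme, forecasting each coefficient series separately and summing the reconstructed forecasts is one common design; the authors' exact recombination rule is not stated in the abstract.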
Optimization is central to any problem involving decision making. The area of optimization has received enormous attention for over 30 years and remains an active field of research to this day. In this paper, a global optimization method called the Improved Homotopy with 2-Step Predictor-Corrector Method is introduced. The method is able to identify all local solutions by converting non-convex optimization problems into piecewise convex optimization problems. A mechanism is applied that considers only the convex parts of a function, where minimizers exist; this allows the method to filter out concave parts and unrelated regions automatically. The identified convex parts are called trusted intervals. The descent property and the global convergence of the method are shown in this paper. Fifteen test problems are used to demonstrate the ability of the proposed algorithm to locate global minimizers.
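The abstract's idea of restricting the search to convex "trusted intervals" can be illustrated in one dimension. The sketch below is not the authors' homotopy predictor-corrector scheme; it is a simplified stand-in that detects convex sub-intervals by the sign of a second finite difference and then minimizes on each piece with golden-section search.

```python
# Partition [lo, hi] into "trusted" (convex) intervals by the sign of a
# second finite difference, then search each convex piece for its minimizer.
def trusted_intervals(f, lo, hi, n=400, h=1e-4):
    xs = [lo + i * (hi - lo) / n for i in range(n + 1)]
    convex = [(f(x - h) - 2 * f(x) + f(x + h)) > 0 for x in xs]
    intervals, start = [], None
    for i, c in enumerate(convex):
        if c and start is None:
            start = xs[i]
        elif not c and start is not None:
            intervals.append((start, xs[i]))
            start = None
    if start is not None:
        intervals.append((start, xs[-1]))
    return intervals

def golden_min(f, a, b, tol=1e-6):
    """Golden-section search for the minimizer of f on a convex piece [a, b]."""
    g = (5 ** 0.5 - 1) / 2
    while b - a > tol:
        c, d = b - g * (b - a), a + g * (b - a)
        if f(c) < f(d):
            b = d
        else:
            a = c
    return (a + b) / 2
```

Taking the best minimizer over all trusted intervals yields a global-minimum candidate while the concave hump between convex pieces is filtered out automatically.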
Aging is an important indicator in demographic and health studies as the lifespan of the elderly population increases. Based on the government's Economic Outlook 2019, an aging population would increase government pension payments, as pensioners and their beneficiaries have longer life expectancies. With mortality rates decreasing over time, life expectancy tends to increase in the future. The aims of this study are to forecast the mortality rates for the years 2020 and 2025 using the Heligman-Pollard model and then to analyse the effect of mortality improvement on the pension cost (annuity factor) for the Malaysian population. This study focuses on estimating the annuity factor using life annuities based on the forecasted mortality rates. The findings indicate that the pension cost is expected to increase if the life expectancy of the Malaysian population rises due to population aging in the near future. Thus, to reduce pension costs and protect pensioners from insufficient financial income, the government needs to consider extending the retirement age in the future.
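The link between mortality improvement and pension cost can be made concrete with the standard annuity-factor calculation. The sketch below (illustrative interest rate assumed, not from the paper) computes the expected present value of a life annuity of 1 per year from a vector of one-year death probabilities.

```python
# Annuity factor: expected present value of 1 paid at the end of each
# year of survival, given one-year death probabilities q_x and rate i.
def annuity_factor(qx, i=0.04):
    v = 1 / (1 + i)           # one-year discount factor
    a, surv = 0.0, 1.0
    for t, q in enumerate(qx, start=1):
        surv *= (1 - q)       # probability of surviving t years
        a += (v ** t) * surv  # discounted expected payment at time t
    return a
```

Lowering the death probabilities (mortality improvement) raises survival probabilities and hence the annuity factor, which is the abstract's point about rising pension costs.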
In a DNA splicing system, DNA molecules are cut and recombined in the presence of restriction enzymes and a ligase. The splicing system is analyzed via formal language theory, where the molecules resulting from the splicing system generate a language called a splicing language. In nature, DNA molecules can be read in two ways: forward and backward. A string that reads the same forward and backward is known as a palindrome. Palindromic and non-palindromic sequences can also be recognized by restriction enzymes. Research on splicing languages from DNA splicing systems with palindromic and non-palindromic restriction enzymes has been done previously. This research is motivated by the problem of DNA assembly, which requires reading millions of long DNA sequences; the concepts of automata and grammars are applied to DNA splicing systems to simplify the assembly of short-read sequences. The splicing languages generated from DNA splicing systems with palindromic and non-palindromic restriction enzymes are deduced from the grammars, visualised as automata diagrams, and presented as transition graphs whose transition labels represent the language of DNA molecules resulting from the respective DNA splicing systems.
Prediction analysis has drawn significant interest in numerous fields. Taguchi's T-Method is a prediction tool that was developed practically and is not limited to small-sample analysis. It was developed explicitly for multidimensional system prediction, relying on historical data as the baseline model and adopting the signal-to-noise ratio (SNR) as well as zero-proportional concepts to strengthen its robustness. The orthogonal array (OA) in the T-Method is a variable selection optimization technique for improving prediction accuracy and helping to eliminate variables that may deteriorate the overall performance. However, the limitation of the OA in dealing with higher multidimensionality restrains the optimization accuracy. Binary particle swarm optimization (BPSO) is used in this study to address the limitation of the OA and to optimize the variable selection process for better prediction accuracy. The results show that if the historical data consist of samples with a higher coefficient of determination (R2) for the model creation, the optimization process in reducing the number of variables is more reliable and accurate. Comparing T-Method+OA and T-Method+BPSO in four different case studies, T-Method+BPSO performs better, with higher R2 and better mean relative error (MRE) values than T-Method+OA.
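The variable-selection step can be sketched as a generic binary PSO over bit-vectors. This is a minimal sketch with assumed hyperparameters (inertia w, acceleration c1, c2), not the authors' T-Method-specific configuration; the error function to minimize would, in the paper's setting, be the T-Method prediction error of the selected variable subset.

```python
# Binary PSO: each particle is a 0/1 vector marking selected variables;
# velocities are mapped to bit probabilities via a sigmoid transfer function.
import math, random

def bpso(error, n_vars, n_particles=10, iters=50, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = random.Random(seed)
    pos = [[rng.randint(0, 1) for _ in range(n_vars)] for _ in range(n_particles)]
    vel = [[0.0] * n_vars for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=error)[:]
    for _ in range(iters):
        for i in range(n_particles):
            for j in range(n_vars):
                vel[i][j] = (w * vel[i][j]
                             + c1 * rng.random() * (pbest[i][j] - pos[i][j])
                             + c2 * rng.random() * (gbest[j] - pos[i][j]))
                prob = 1 / (1 + math.exp(-vel[i][j]))  # sigmoid transfer
                pos[i][j] = 1 if rng.random() < prob else 0
            if error(pos[i]) < error(pbest[i]):
                pbest[i] = pos[i][:]
        gbest = min(pbest + [gbest], key=error)[:]
    return gbest
```

Unlike a fixed orthogonal array, the swarm can explore arbitrary subsets of a high-dimensional variable space, which is the advantage the abstract claims.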
In this paper, we study a numerical method for solving second-order Fuzzy Differential Equations (FDEs) using Block Backward Differentiation Formulas (BBDF) under the generalized concept of higher-order fuzzy differentiability. Implementation of the method using Newton iteration is discussed. Numerical results obtained by BBDF are presented and compared with Backward Differentiation Formulas (BDF) and exact solutions. Several numerical examples are provided to illustrate our methods.
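As a crisp (non-fuzzy) scalar illustration of the corrector equation and the Newton iteration the abstract mentions, the sketch below advances the two-step BDF formula y_{n+1} - (4/3)y_n + (1/3)y_{n-1} = (2/3)h f(y_{n+1}) on a simple stiff test problem. The paper's block and fuzzy formulations are more involved; this is only the underlying implicit step.

```python
# One BDF2 corrector step per iteration, solving the implicit equation
# for y_{n+1} by Newton iteration; applied here to the test ODE y' = f(y).
def bdf2_solve(f, dfdy, y0, y1, h, steps):
    ys = [y0, y1]
    for _ in range(steps):
        yn, ynm1 = ys[-1], ys[-2]
        y = yn  # initial Newton guess
        for _ in range(20):
            g = y - (4 / 3) * yn + (1 / 3) * ynm1 - (2 / 3) * h * f(y)
            y -= g / (1 - (2 / 3) * h * dfdy(y))
            if abs(g) < 1e-12:
                break
        ys.append(y)
    return ys
```

For y' = -5y with y(0) = 1 the computed solution tracks the exact decay exp(-5t) closely, which is the kind of comparison against exact solutions the abstract describes.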
The commutativity degree is the probability that a pair of elements chosen randomly from a group commute. The concept of commutativity degree has been widely discussed by several authors in many directions. One of the important generalizations of the commutativity degree is the probability that a random element from a finite group G fixes a random element from a non-empty set S, which we call the action degree of groups. In this research, the concept of the action degree is further studied, and some inequalities and bounds on the action degree of finite groups are determined. Moreover, a general relation between the action degree of a finite group G and that of a subgroup H is provided. Next, the action degree for the direct product of two finite groups is determined. Previously, the action degree was only defined for finite groups; in this research, the action degree for finitely generated groups is defined and some bounds on it are determined.
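The base notion, the commutativity degree, is easy to compute by brute force for a small group. The sketch below counts commuting pairs in the symmetric group S3, represented as permutation tuples; this is only a numerical illustration of the definition, not part of the paper's results on action degrees.

```python
# Commutativity degree of a finite group by exhaustive pair counting,
# using permutations of {0,1,2} (the symmetric group S3) as tuples.
from itertools import permutations

def compose(p, q):
    """(p ∘ q)(i) = p[q[i]]: composition of permutations stored as tuples."""
    return tuple(p[q[i]] for i in range(len(p)))

def commutativity_degree(group):
    pairs = [(g, h) for g in group for h in group]
    commuting = sum(1 for g, h in pairs if compose(g, h) == compose(h, g))
    return commuting / len(pairs)

S3 = list(permutations(range(3)))
```

For S3 the degree equals k(G)/|G| = 3/6 = 1/2, matching the classical formula relating the commutativity degree to the number of conjugacy classes.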
Let G be a finite group. A random pair of elements x and y in G is said to be co-prime when the greatest common divisor of the orders of x and y is equal to one. Meanwhile, the co-prime graph of a group is defined as the graph whose vertices are the elements of G, in which two distinct vertices are adjacent if and only if the greatest common divisor of their orders is equal to one. In this paper, the co-prime probability and its graph, including the types and properties of the graph, are determined.
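The definition can be checked computationally for a small cyclic group. The sketch below builds the co-prime graph of Z_n, using the fact that the order of an element x in Z_n is n/gcd(x, n); it is an illustration of the definition, not a construction from the paper.

```python
# Co-prime graph of the cyclic group Z_n: two distinct elements are
# adjacent iff the gcd of their orders is 1.
from math import gcd

def element_order(x, n):
    """Order of x in Z_n under addition: n // gcd(x, n)."""
    return n // gcd(x, n)

def coprime_graph_edges(n):
    elems = range(n)
    return {(x, y) for x in elems for y in elems
            if x < y and gcd(element_order(x, n), element_order(y, n)) == 1}
```

For Z_4 only the identity (order 1) is adjacent to the other elements, so the graph is a star, one of the graph types such a study would classify.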
In this paper, convective boundary layer flow of a Maxwell fluid over a flat plate with a pressure gradient parameter is considered. The aim of this study is to compare and analyze the effects of the presence and absence of λ (relaxation time), as well as the effects of m (pressure gradient parameter) and Pr (Prandtl number), on the momentum and thermal boundary layer thicknesses. An approximation technique, namely the Homotopy Perturbation Method (HPM), has been used together with an implementation of the Adams and Gear method algorithms. The obtained results have been compared, for zero relaxation time and zero pressure gradient parameter, with the published work of Fathizadeh and Rashidi, and the current outcomes are found to be in good agreement with the published results. Physical interpretations are given for the effects of m, Pr and β (Deborah number) together with λ. This study will play an important role in industrial and engineering applications.
Monthly oil production data at several drilling wells are an example of spatio-temporal data. The aim of this research is to propose nonlinear spatio-temporal models, i.e. the Feedforward Neural Network - Vector Autoregressive (FFNN-VAR) and FFNN - Generalized Space-Time Autoregressive (FFNN-GSTAR) models, and to compare their forecast accuracy with that of linear spatio-temporal models, i.e. VAR and GSTAR. These spatio-temporal models are proposed and applied for forecasting monthly oil production data at three drilling wells in East Java, Indonesia. There are 60 observations, divided into two parts: the first 50 observations for training data and the last 10 observations for testing data. The results show that FFNN-GSTAR(11) and FFNN-VAR(1), as nonlinear spatio-temporal models, tend to give more accurate forecasts than VAR(1) and GSTAR(11) as linear spatio-temporal models. Moreover, further research on nonlinear spatio-temporal models based on neural networks and GSTAR is needed to develop new hybrid models that could improve forecast accuracy.
Demographers and actuaries are very conscious of the trend of mortality in their own country and in the world in general, because mortality is the basis for longevity risk evaluation. Mortality shows a declining trend and is expected to decline further in the future, leading to a continuous increase in life expectancy. Several stochastic models have been developed throughout the years to capture mortality and its variability, including the Lee-Carter (LC) model, which has been extended by various researchers. This paper focuses on comparing the LC model with another mortality model proposed by Cairns, Blake and Dowd (CBD). The LC model uses the log of the central rate of mortality, while the CBD model uses the logit of the mortality odds as the dependent variable. The comparison is carried out using several techniques, including the Akaike information criterion (AIC) and the Bayesian information criterion (BIC). From the overall results, neither model is better than the other in every aspect tested. We illustrate this via visual inspection and in-sample and out-of-sample analysis using Malaysian mortality data from 1980 to 2017.
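The two information criteria used for the comparison have standard closed forms. The sketch below computes them from a model's maximised log-likelihood ll, parameter count k, and sample size n (illustrative values only; the paper's fitted likelihoods are not given in the abstract).

```python
# Akaike and Bayesian information criteria; lower values indicate a
# better trade-off between fit and model complexity.
import math

def aic(ll, k):
    return 2 * k - 2 * ll

def bic(ll, k, n):
    return k * math.log(n) - 2 * ll
```

BIC penalises extra parameters more heavily than AIC once n exceeds about e^2 ≈ 7.4 observations, which is why the two criteria can rank the LC and CBD models differently.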
The well-known geostatistics approach (the variance-reduction method) is commonly used to determine the optimal rain gauge network. The main problem in the geostatistics method is to determine the best semivariogram model to be used in estimating the variance. An optimal choice of the semivariogram model is an important point for a good data evaluation process. Three different semivariogram models, namely the Spherical, Gaussian and Exponential models, are used and their performances are compared in this study. A cross-validation technique is applied to compute the errors of the semivariograms. Rainfall data for the period 1975-2008 from the existing 84 rain gauge stations covering the state of Johor are used in this study. The results show that the Exponential model is the best semivariogram model, and it is chosen to determine the optimal number and locations of rain gauge stations.
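The three candidate models have standard functional forms. The sketch below writes each as a function of separation distance h with nugget c0, partial sill c, and range a (parameter names are the usual conventions, not values from the paper); cross-validation would then score each fitted model by its leave-one-out kriging errors.

```python
# The three classical semivariogram models compared in the study.
import math

def spherical(h, c0, c, a):
    if h >= a:
        return c0 + c               # sill reached at the range
    return c0 + c * (1.5 * h / a - 0.5 * (h / a) ** 3)

def exponential(h, c0, c, a):
    return c0 + c * (1 - math.exp(-h / a))   # approaches the sill asymptotically

def gaussian(h, c0, c, a):
    return c0 + c * (1 - math.exp(-(h / a) ** 2))  # parabolic near the origin
```

The models differ mainly in their behaviour near the origin and in how quickly they approach the sill, which is what drives the cross-validation ranking.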
The incorporation of the non-linear pattern at early ages has led to new research directions on improving the existing stochastic mortality model structure. Several authors have outlined the importance of encompassing the full age range in dealing with longevity risk exposure, by not ignoring the dependence between young and old ages. In this study, we consider two extensions of the Cairns, Blake and Dowd model that incorporate the irregular profile seen in mortality at the lower ages, namely the Plat model and the O'Hare and Li model. The models' performances in terms of in-sample fitting and out-of-sample forecasts were examined and compared. The results indicate that the O'Hare and Li model performs better than the Plat model.
The modelling of splicing systems simulates the process of cleaving and recombining DNA molecules in the presence of a ligase and restriction enzymes, which are biologically known as endodeoxyribonucleases. The molecules resulting from DNA splicing systems are known as splicing languages. A palindrome is a string that reads the same forward and backward. In this research, the splicing languages resulting from DNA splicing systems with one non-palindromic restriction enzyme are determined using the notation of the Head splicing system. The generalisations of splicing languages for DNA splicing systems involving one cutting site and two non-overlapping cutting sites of one non-palindromic restriction enzyme are presented in the first and second theorems, respectively, which are proved using direct and induction methods. The result of the first theorem is a trivial string, namely the initial DNA molecule, while the second theorem determines a splicing language consisting of the set of DNA molecules resulting from the respective DNA splicing system.
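A single splicing step can be sketched on strings: both molecules are cut a fixed number of symbols into a recognition site and the right-hand fragments are exchanged. This is a simplified string model assuming the site occurs in both strings, not the Head-system notation or the double-stranded cutting the paper formalises.

```python
# One splicing step with a single restriction enzyme, modelled as cutting
# both strings inside the recognition site and cross-recombining.
def splice(u, v, site, cut):
    """Cut u and v `cut` symbols into the first occurrence of `site`,
    then exchange the right-hand fragments, giving the two hybrids."""
    i, j = u.index(site) + cut, v.index(site) + cut
    return u[:i] + v[j:], v[:j] + u[i:]
```

Iterating this operation over a starting set of molecules generates the splicing language studied in the paper.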
In this paper, we propose a method to manage the convergence of Newton's method when its iteration process encounters a local extremum. The idea establishes the osculating circle at the local extremum and uses its radius, also known as the radius of curvature, as an offset: the radius is added to the local extremum, and the resulting point is used as a new initial guess for finding a root near that local extremum. Several examples are provided which demonstrate that the proposed idea is successful and fulfills the aim of this paper.
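The idea above can be sketched directly: when an iterate lands where f'(x) ≈ 0 but f(x) ≠ 0, jump by the radius of curvature R = (1 + f'(x)²)^(3/2) / |f''(x)| and continue. The jump direction here is fixed to +R for simplicity; the paper's method of choosing the side is not stated in the abstract.

```python
# Newton's method with a curvature-radius jump past local extrema.
def newton_with_curvature_jump(f, df, ddf, x, tol=1e-10, max_iter=200):
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        dfx = df(x)
        if abs(dfx) < 1e-8:                    # stuck at a local extremum
            R = (1 + dfx ** 2) ** 1.5 / abs(ddf(x))
            x = x + R                          # hop past it by the radius
        else:
            x = x - fx / dfx                   # ordinary Newton step
    return x
```

Starting f(x) = x² - 4 at the extremum x = 0, plain Newton fails (division by f'(0) = 0), whereas the jump of R = 1/2 restarts the iteration and it converges to the root x = 2.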
Malaysia has been aiming to build its own nuclear power plant (NPP) for electricity generation in 2030 to diversify the national energy supply and resources. As part of the regulations for building an NPP, an environmental risk assessment, which includes an atmospheric dispersion assessment, has to be performed as required by the Malaysian Atomic Energy Licensing Board (AELB) prior to the commissioning process. The assessment investigates the dispersion of radioactive effluent from the NPP in the event of a nuclear accident. This article focuses on the current development of a locally developed atmospheric dispersion modeling code based on the Gaussian plume model. The code is written in the Fortran programming language and has been benchmarked against the readily available HotSpot software. The radionuclide release rate entering the Gaussian equation is approximated by the value found in the Fukushima NPP accident in 2011. Meteorological data for the Mersing District, Johor, for the year 2013 are utilized for the calculations. The results show that the dispersed radionuclide effluent can potentially affect areas around the Johor Bahru district, Singapore and some parts of Riau when the wind blows from the north-northeast direction. The results from our code were found to be in good agreement with those obtained from HotSpot, with less than 1% discrepancy between the two.
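The ground-level centreline form of the Gaussian plume equation underlying such a code is C(x) = Q / (π u σy σz) · exp(-H² / (2σz²)). The sketch below (Python, not the paper's Fortran) uses assumed power-law fits for the dispersion coefficients σy and σz; a real assessment would use Pasquill-Gifford stability-class curves.

```python
# Ground-level, centreline Gaussian plume concentration at downwind
# distance x, for release rate Q (Bq/s), wind speed u (m/s) and
# effective release height H (m). The sigma power laws are illustrative.
import math

def plume_concentration(Q, u, x, H, a=0.08, b=0.06):
    sigma_y = a * x ** 0.9    # assumed horizontal dispersion fit
    sigma_z = b * x ** 0.85   # assumed vertical dispersion fit
    return (Q / (math.pi * u * sigma_y * sigma_z)) * math.exp(-H ** 2 / (2 * sigma_z ** 2))
```

Beyond the near-field peak, concentration falls off with distance as the plume widens, which is the qualitative behaviour the dispersion maps in the study show.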
The effect of oil shocks on the global economy is evident from many studies. However, the effect is heterogeneous over time. One of the reasons for such different impacts is the oil source, that is, whether the oil shock is demand- or supply-driven. Applying the structural vector autoregressive (SVAR) model to generate three oil shocks based on the three oil sources (oil supply, oil demand and oil-specific demand), we extend the examination of the effect of oil shocks on the global economy using threshold regression. Our results reveal threshold effects of oil, both direct and indirect, on the global economy. The impacts of oil shocks differ across sectors, implying that oil intensity, as well as the oil source, determines the impact of oil shocks on the global economy. Overall, the oil-specific demand shock is the most influential of the three oil shocks; hence, the global economy is oil demand-driven. Besides that, the impact of oil is relatively large in the energy sector compared to the non-energy sector and the precious metals industry. Despite that, the impact of oil shocks is small compared to non-oil shocks such as exchange rate changes and global consumer price inflation shocks. Consequently, non-oil shocks are the main determinants of global economic fluctuations. The study leads to a better understanding of the transmission of oil shocks and their sources, the interaction between oil and economic indicators, and the policy implications of oil dependency/intensity.
In this paper we consider a harvesting model of predator-prey fishery in which
the prey is directly infected by some external toxic substances. The toxic infection is
indirectly transmitted to the predator during the feeding process. The model is a modified
version of the classic Lotka-Volterra predator-prey model. The stability and bifurcation
analyses are addressed. Numerical simulations of the model are performed and bifurcation
diagrams are studied to investigate the dynamical behaviours between the predator and
the prey. The effects of toxicity and harvesting on the stability of steady states found in
the model are discussed.
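A modified Lotka-Volterra system of the kind described can be simulated numerically. The sketch below uses an assumed illustrative form, with prey toxicity coefficient t1 and harvesting efforts e1, e2 (the paper's exact equations and parameters are not given in the abstract), integrated with a simple Euler scheme.

```python
# Euler simulation of a Lotka-Volterra model with prey toxicity t1 and
# harvesting efforts e1, e2 (illustrative form and parameters):
#   x' = x (a - b y) - t1 x^2 - e1 x     (prey)
#   y' = y (-c + d x) - e2 y             (predator)
def simulate(a=1.0, b=0.5, c=0.5, d=0.25, t1=0.05, e1=0.1, e2=0.1,
             x=2.0, y=1.0, dt=0.001, steps=20000):
    for _ in range(steps):
        dx = x * (a - b * y) - t1 * x * x - e1 * x
        dy = y * (-c + d * x) - e2 * y
        x, y = x + dt * dx, y + dt * dy
    return x, y
```

With moderate toxicity and harvesting the populations settle towards coexistence, while a large toxicity coefficient drives the prey towards collapse, the qualitative effects the bifurcation diagrams in such a study would trace.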
In this paper, the combined influences of biotic interactions, environmental components and harvesting strategy on the spread of Hantavirus are investigated. By employing a multi-species model consisting of (susceptible and infected) rodents and an alien species, we show that interspecific competition from the alien species has an effect in reducing the spread of infection, and this species could be employed as a potential biocontrol agent. Our analysis using numerical continuation and simulation also reveals the conditions under which Hantavirus infection occurs and disappears as the environmental conditions and the intensity of harvesting change. Without harvesting, infection emerges when environments are conducive. Inclusion of moderate harvesting in favourable environments can lead to the disappearance of infection among rodent species. However, as the intensity of harvesting increases, it can cause extinction of all rodent species and, consequently, jeopardise biodiversity. Overall, our results demonstrate how the interplay of different factors can combine to determine the spread of infectious diseases.