Volume 32, Issue 2 (2023)
Articles
Abstract: In this study, the bone status of women in an urban group was measured and compared with that of a rural group. The assessment comprised measurements of bone mineral content (BMC) and bone mineral density (BMD). A cross-sectional study was conducted at the DXA laboratory, Physiology Department, College of Medicine, University of Ninevah, Mosul, Iraq. 139 healthy females were enrolled through a college medical academic center assessment and split into two groups: rural (53 participants) and urban (86 participants). Detailed anthropometric data were collected from all participants. A DXA bone densitometer (STRATOS, DMS Group, France) was used to measure the T- and Z-scores. Participants ranged in age from 30 to 79 years and were divided into 10-year age subgroups. The results show that BMC and BMD values were higher in the rural group than in the urban group across all age categories, with a highly significant difference (p = 0.0001).
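For reference, the T- and Z-scores reported by DXA scanners are standardized deviations of a patient's BMD from reference populations; a sketch of the standard WHO-style definitions (the reference means and SDs come from the scanner's normative database, not from this study):

    % T-score: deviation from the young-adult reference population
    T = \frac{\mathrm{BMD}_{\mathrm{patient}} - \overline{\mathrm{BMD}}_{\mathrm{young\ adult}}}{\mathrm{SD}_{\mathrm{young\ adult}}}
    % Z-score: deviation from the age-matched reference population
    Z = \frac{\mathrm{BMD}_{\mathrm{patient}} - \overline{\mathrm{BMD}}_{\mathrm{age\ matched}}}{\mathrm{SD}_{\mathrm{age\ matched}}}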
Abstract: In this work, the bulk etch rate VB is calculated using various methods, including the removed-thickness, saturation-track, and track length-diameter methods. A 200 μm-thick CR-39 detector manufactured by Page Moldings (Pershore) in the United Kingdom was cut into several identical pieces (1 × 1 cm²). To obtain longitudinal track profiles, these sheets were exposed to alpha particles with an energy of 2.6 MeV emitted from a 241Am source, followed by 30 minutes of exposure to UV light. The CR-39 samples were etched in a 6.25 N NaOH solution at 70 °C. The tracks were etched for 15 minutes before being digitally photographed with an optical microscope. It was established that the predominant alpha-particle track lengths are those of tracks incident perpendicular to the detector surface. The bulk etch rates for CR-39 were found to be 1.227 μm/h for irradiation with alpha particles, and 2.035 μm/h for irradiation with UV followed by alpha particles.
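Of the routes to VB named above, the two most common relations can be sketched as follows (our summary of standard track-etch practice, assuming both detector faces are attacked by the etchant and tracks are measured at normal incidence; not the paper's exact derivation):

    % Removed-thickness method: \Delta x is the total thickness loss
    % after etching time t; the factor 2 assumes both faces are etched.
    V_B = \frac{\Delta x}{2t}
    % Saturation (over-etched) track method: once the etchant passes
    % the end of the particle's range, the track opening grows at the
    % bulk rate, so the diameter D obeys
    V_B \approx \frac{1}{2}\,\frac{dD}{dt}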
Abstract: With the advent of interactive systems, an urgent need for time-sharing systems emerged, and round-robin algorithms were developed to achieve time-sharing. The performance of time-sharing systems depends largely on the length of the time slice in the round-robin algorithm: the length of the time slice affects the criteria by which the performance of the algorithms is measured. Researchers have suggested, and continue to suggest, algorithms to obtain the best values for the time slice. Adopting one algorithm over another in a system, and for a class of applications, requires choosing the best-performing algorithm. This research is an attempt to develop an objective approach for accurate comparison between algorithms. For the sake of objectivity, five algorithms similar in their general characteristics were chosen: Modified Median Round Robin Algorithm (MMRRA), A New Median-Average Round Robin Scheduling Algorithm (NMARR), An Improved Round Robin Scheduling Algorithm with Varying Time Quantum (IRRVQ), A Modified Round Robin CPU Scheduling Algorithm with Dynamic Time Quantum (RRDT), and Improved Round Robin Algorithm with Progressive Dynamic Quantum (IRRPDQ). The results showed that the outperformance of one algorithm over the others on a specific criterion is neither permanent nor fixed in value, and that resorting to statistical measures is the best way to clarify the relative performance of the algorithms.
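As a minimal sketch of the family of schedulers compared here (an illustration in the spirit of the median-based variants above, not a reimplementation of any of the five published algorithms), the quantum can be recomputed each cycle from the remaining burst times:

    from statistics import median

    def median_round_robin(bursts):
        # Round robin where the quantum is recomputed each cycle as the
        # median of the remaining burst times (all jobs arrive at t = 0).
        # Returns per-process (waiting, turnaround) time lists.
        remaining = dict(enumerate(bursts))
        finish = {}
        t = 0
        while remaining:
            quantum = max(1, round(median(remaining.values())))
            for pid in list(remaining):
                run = min(quantum, remaining[pid])
                t += run
                remaining[pid] -= run
                if remaining[pid] == 0:
                    finish[pid] = t
                    del remaining[pid]
        turnaround = [finish[i] for i in range(len(bursts))]
        waiting = [turnaround[i] - bursts[i] for i in range(len(bursts))]
        return waiting, turnaround

    # Example workload: compare average waiting and turnaround times,
    # the usual measuring criteria for these algorithms.
    w, ta = median_round_robin([24, 3, 3, 12])
    print(sum(w) / len(w), sum(ta) / len(ta))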
Abstract: The software development process is closely tied to the processes of creation and evaluation. A recurring problem in software development is a lack of testing, which leads to software failures. To maintain a high-quality product in excellent working condition, testing becomes critical. Software can be tested using white-box, black-box, or gray-box testing techniques. In this investigation, the types of tests were reviewed. White-box testing uses a number of methodologies based on path testing, including the production of flowcharts, cyclomatic complexity assessment, and independent path testing. As a result, it is possible to implement a basis path testing technique within the white-box approach to testing. This review covered several axes: the definition of white-box testing tools, testing techniques in general, the benefits and gains of each of these techniques, the levels of testing, and finally the steps for conducting a test. The review then arrives at several conclusions, presented at the end of the paper.
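As a quick worked illustration of the basis-path idea (our example, not one from the paper): for a control-flow graph with E edges, N nodes, and P connected components, the cyclomatic complexity V(G) = E − N + 2P gives the number of independent paths a basis-path test suite must exercise.

    def classify(x):
        # Two decision points -> V(G) = 2 + 1 = 3, so a basis-path
        # suite needs three independent paths: x < 0, x == 0, x > 0.
        if x < 0:
            return "negative"
        if x == 0:
            return "zero"
        return "positive"

    # One test case per independent path:
    assert classify(-1) == "negative"
    assert classify(0) == "zero"
    assert classify(2) == "positive"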
Abstract: This research took advantage of the new features of the MCNPX 6.2 code to model the 3He(n,p) nuclear reaction. The neutron source for the simulation is Am-Be, which emits neutrons with an average energy of 5 MeV. The 3He gas was encapsulated in a thick layer of paraffin. The results obtained show that the count values can be predicted reliably. Additionally, the new version of MCNPX 6.2 allows the generation of the proton and triton reaction products, in addition to the estimated pulse-height spectrum of a 3He detector with a considerable detector wall effect. Moreover, it allows studying the dynamics of these reactions as a function of the structure of the 3He detector. The wall effects are small in our modeling design, so the size of the simulated detector is sufficient. The design of the 3He detector is intended to facilitate research into the behavior of these reactions.
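For context, the detection reaction being modeled and its standard energetics (textbook values, not results of this paper): the Q-value of 0.764 MeV is shared between the products in inverse proportion to their masses, so for slow neutrons the proton carries about 0.573 MeV and the triton about 0.191 MeV.

    n + {}^{3}\mathrm{He} \;\rightarrow\; p + {}^{3}\mathrm{H} + 0.764\ \mathrm{MeV}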
Abstract: Cyber-attacks have increased in number and severity, which has negatively affected businesses and their services. As such, cybersecurity is no longer merely a technological problem; it must also be considered critical to the economy and society. Existing solutions struggle to find indicators of unexpected risks, which limits their ability to make accurate risk assessments. This study presents a risk assessment method based on Machine Learning, an approach used to assess and predict companies' exposure to cybersecurity risks. For this purpose, four Machine Learning algorithms (Light Gradient Boosting, AdaBoost, CatBoost, Multi-Layer Perceptron) were implemented, trained, and evaluated using generated datasets representing the characteristics of different volumes of data (for example, number of employees, business sector, known vulnerabilities, and external advisors). The quantitative evaluation conducted in this study shows the high accuracy of the Machine Learning models; in particular, the Multi-Layer Perceptron achieved the best accuracy compared to previous work.
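A minimal sketch of the kind of pipeline described, using scikit-learn on synthetic data (the feature set and risk label below are illustrative assumptions, not the paper's dataset):

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(0)
    n = 1000
    # Illustrative company features: employees, sector (coded),
    # known vulnerabilities, use of an external advisor.
    X = np.column_stack([
        rng.integers(1, 5000, n),   # number of employees
        rng.integers(0, 10, n),     # business sector code
        rng.integers(0, 50, n),     # known vulnerabilities
        rng.integers(0, 2, n),      # external advisor (yes/no)
    ])
    # Toy label: "high risk" = many vulnerabilities and no advisor.
    y = ((X[:, 2] > 25) & (X[:, 3] == 0)).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    model = make_pipeline(StandardScaler(),
                          MLPClassifier(hidden_layer_sizes=(32, 16),
                                        max_iter=500, random_state=0))
    model.fit(X_tr, y_tr)
    print("test accuracy:", model.score(X_te, y_te))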
Abstract: Software engineering always strives to detect and identify software pitfalls and errors before the software product is released, through software testing. Bugs can appear during any stage of development or testing, even after the product has been released. This paper describes different methodologies for data flow testing. Since testing is the process of running a program to identify errors, we need to increase the accuracy of the coverage area by including data flow elements based on aliases, and to avoid useless elements that reduce overall coverage, in order to increase the applicability and effectiveness of data flow testing. This paper looks at data flow testing, which is a type of basic (white-box) testing. Data flow testing divides into two main strands: definition/use testing, and a set of tests embedding measurements; the program is also divided into parts according to its variables to make testing programming frameworks more straightforward. The paper also describes the steps for performing data flow testing, as well as how to design test suites that take anomalies into account. It also examines and discusses the methods used to date to perform data flow testing; these approaches include node-based design, trend-finding coverage, web application comparison, and analytical testing.
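As an illustrative example of the definition-use (def-use) pairs that data flow criteria cover (our example, not one from the paper):

    def discount(price, is_member):
        rate = 0.0                    # def of `rate` (d1)
        if is_member:
            rate = 0.25               # def of `rate` (d2)
        total = price * (1 - rate)    # use of `rate`: pairs (d1,u),(d2,u)
        return total

    # All-uses coverage needs both branches of the `if`:
    assert discount(100.0, False) == 100.0   # exercises pair (d1, u)
    assert discount(100.0, True) == 75.0     # exercises pair (d2, u)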
Abstract: In this work, we derive an estimate for an integral inequality in three variables over time scales. The result of this estimation is used as a tool to investigate some properties of solutions of a partial integrodifferential equation with initial boundary conditions on time scales, such as estimating the difference between two approximate solutions and the closeness between solutions. Such estimates have applications in various scientific fields, including branches of mathematics, physics, economics, electricity, and biology. We study the problem from two points of view: first, we estimate the difference between two ε-approximate solutions of a given nonlinear integrodifferential equation with initial boundary conditions, through which a suitable estimate for the approximate solutions can be obtained; second, we provide conditions for the closeness of solutions of the problem under study. The corresponding integral inequality in four variables over time scales is left as future work.
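For orientation, a sketch of the standard notion involved (stated here for a generic first-order dynamic equation, not the paper's full partial integrodifferential setting): a function u is an ε-approximate solution of u^Δ = f(t, u) on a time scale 𝕋 when

    \left| u^{\Delta}(t) - f\bigl(t, u(t)\bigr) \right| \le \varepsilon \quad \text{for all } t \in \mathbb{T},

and a typical closeness result bounds two such solutions by |u_1(t) − u_2(t)| ≤ (ε_1 + ε_2) C(t), with C(t) supplied by a Gronwall-type inequality on 𝕋.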
Abstract: The development of Information and Communication Technology (ICT), the globalization of software, the pursuit of cost and time savings, and the drive to improve the quality of the developed product have all helped to grow the use of Global Virtual Teams (GVTs) in Global Software Development (GSD). This enabled software companies to adopt the GSD approach, using GVTs as an alternative to a centralized development process. Despite the benefits and advantages of this approach, it is affected by a set of challenges that impair the performance of GVTs and that must be identified and considered. This paper aims to design a proposed model for developing the performance of GVTs in GSD by identifying the challenges affecting performance. The model will help workers in this field to work effectively by knowing the challenges they will face. The challenges are identified by reviewing the literature and analyzing the content of related research, then collecting them in the proposed model. To verify the validity of the components of the proposed model, an expert questionnaire was conducted with a target group of 13 experts in GSD using GVTs. The results were analyzed using the Statistical Package for the Social Sciences (SPSS) and came out positive, in favor of the proposed model, at 91.165%.
Abstract: In this paper, a numerical method for solving linear fractional differential equations using Chebyshev wavelet matrices is presented. Fractional differential equations have received great attention recently due to the expansion of their uses in many applications. It is difficult to find analytical solutions for them because of the presence of derivatives of fractional order, so we resort to numerical solutions. The use of wavelets in solving these equations is a relatively new method that has been found to give more accurate results than other methods. We created Chebyshev matrices using Chebyshev sequences; these matrices can be created in different sizes, and the larger the matrix, the more accurate the results. Chebyshev wavelet matrices are also characterized by their speed compared with other wavelet matrices. The algorithm converts fractional differential equations into algebraic equations by combining the Chebyshev matrices with an operational matrix of fractional integration derived from block pulse functions. The solution is then found by applying the algorithm and comparing it with the exact solution; the results converge with very small errors. Several examples are solved to prove the effectiveness and applicability of the algorithm, to validate it, and to show how close the results are to the exact solution.
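For reference, one common definition of the Chebyshev wavelet family used in such methods (a standard form from the literature; the paper's exact normalization may differ): on [0, 1), for n = 1, ..., 2^(k−1) and m = 0, ..., M − 1,

    \psi_{nm}(t) =
    \begin{cases}
      2^{k/2}\,\tilde{T}_m\!\left(2^{k} t - 2n + 1\right), & \frac{n-1}{2^{k-1}} \le t < \frac{n}{2^{k-1}},\\
      0, & \text{otherwise},
    \end{cases}

where the polynomials are rescaled to be orthonormal with respect to the Chebyshev weight: \tilde{T}_0 = 1/\sqrt{\pi} and \tilde{T}_m = \sqrt{2/\pi}\, T_m for m ≥ 1.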
Abstract: Web applications are subject to information security risks because of the data they collect from users. Online applications are the most effective way to retain data in the modern era. Cybersecurity is the process of providing data and data systems with appropriate procedural and technical security safeguards, and threats to cybersecurity are increasing all the time. A web vulnerability is a flaw or weakness in a computer system, security procedures, internal controls, design, or implementation that can compromise the security policy of a framework. A vulnerability on the Internet can disrupt the social, economic, and political spheres of government and thereby affect the state. In vulnerability research, an effort is made to identify these defects and weaknesses and to understand how they could be exploited. The aim of this study is to identify the types of intrusions, find vulnerabilities, and review regulations for vulnerable systems detected in online applications.
Abstract: This study presents new classes of open sets defined in ideal topological spaces, namely: i-I-open, weakly i-I-open, ii-I-open, and weakly ii-I-open. We also give new concepts of continuity of a mapping between ideal topological spaces based on these classes, such as i-I-continuity, weakly i-I-continuity, ii-I-continuity, and weakly ii-I-continuity. We establish their characteristics along with comparisons among these classes and concepts. We prove that all open, α-I-open, semi-I-open, ii-I-open, weakly semi-I-open, and weakly ii-I-open sets are weakly i-I-open in any ideal topological space. Additionally, we show that all α-I-continuous, semi-I-continuous, and ii-I-continuous mappings are i-I-continuous. Finally, for an ideal topological space (M, L, I) and D ⊂ M satisfying Int(D)^# = Int(D), we show that the following statements are equivalent: 1) D is open; 2) D is i-I-open and D ∩ H = Int(D) for some H ∈ L ∖ {M, ∅}; 3) D is semi-I-open. Similarly, we show that the following statements are equivalent: 1) D is a closed set; 2) D ∩ F = cl(D) for some F ∈ L^c; 3) D is semi-I-closed.
Abstract: Catalysts are involved in many industries, and because of their high cost this research was directed toward exploiting environmentally friendly materials, namely the natural raw ores available in large quantities in the Western Desert region of Anbar, to prepare the catalyst support material under study (alumina). The ore was processed through a series of treatments to prepare alumina as the support for a catalyst based on platinum and palladium. The catalyst was used in the gas-phase treatment of naphtha (boiling range 58-160 °C) under different conditions of temperature, reaction time, and percentage of catalyst added. After determining the optimal conditions for the catalytic treatment by analyzing infrared (IR) measurements under the initial conditions, it was found that the best treatment conditions were a temperature of 250 °C, a reaction time of 2 hours, and a catalyst ratio of 2%, at which the catalyst showed high effectiveness toward structural reforming and dehydrogenation reactions. In addition, nuclear magnetic resonance (1H NMR) spectroscopy and gas chromatography (GC) were performed on both untreated naphtha and naphtha treated under the optimal conditions, for the purpose of identifying the nature of the structural changes in the different hydrocarbon compounds (PONA) and estimating the ratios of methyl, methylene, and aromatic protons, respectively. The results demonstrated the effectiveness of the catalyst toward structural reforming and dehydrogenation reactions through the formation of aromatic and olefinic compounds.