https://www.lettersinhighenergyphysics.com/index.php/LHEP/issue/feedLetters in High Energy Physics2025-09-11T06:16:39+00:00AGSERpublisher@agser.orgOpen Journal Systems<h1 style="margin: 40px 0 5px;"><span style="text-decoration: underline;"><strong>Letters in High Energy Physics</strong></span></h1> <h2 style="margin: 5px 0 5px;"><strong>ISSN: 2632-2714</strong></h2> <hr> <h2 style="margin: 5px 0 5px;"><span style="text-decoration: underline;">Our Mission</span></h2> <p style="margin: 5px 0;">At <em>Letters in High Energy Physics</em>, our mission is to provide a dynamic platform for the rapid dissemination of high-impact research in the field. We are dedicated to fostering the exchange of knowledge and ideas among physicists and researchers worldwide, facilitating scientific progress and collaboration.</p> <h2 style="margin: 5px 0 5px;"><span style="text-decoration: underline;">Scope</span></h2> <p style="margin: 5px 0 5px;">Our journal covers a broad spectrum of topics within high energy physics, including but not limited to:</p> <ul> <li class="show">Particle Physics</li> <li class="show">Fundamental Particles and Interactions</li> <li class="show">Quarks, leptons, and gauge bosons</li> <li class="show">The Higgs boson and mechanism</li> <li class="show">Neutrino physics and oscillations</li> <li class="show">Exotic particles and states</li> <li class="show">Experimental Particle Physics</li> <li class="show">Results from high-energy particle colliders (e.g., LHC, future colliders)</li> <li class="show">Advances in particle detection technologies</li> <li class="show">Precision measurements and anomaly detection</li> <li class="show">Quantum Field Theory</li> <li class="show">Quantum Chromodynamics (QCD)</li> <li class="show">Electroweak theory</li> <li class="show">Effective field theories</li> <li class="show">Renormalization and perturbation theory</li> <li class="show">Cosmology and Astrophysics</li> <li class="show">Dark Matter and Dark Energy</li> <li 
class="show">Astroparticle Physics</li> <li class="show">High-energy cosmic rays and their sources</li> <li class="show">Gamma-ray astronomy</li> <li class="show">Neutrino astronomy</li> <li class="show">Development of novel particle detectors</li> <li class="show">Advances in accelerator technologies</li> <li class="show">High-precision measurement techniques</li> <li class="show">Data Analysis and Interpretation</li> <li class="show">Data mining and machine learning applications</li> <li class="show">Statistical methods for high-energy experiments</li> <li class="show">Simulation and reconstruction techniques</li> <li class="show">Computational Physics</li> <li class="show">Phenomenology</li> <li class="show">Astrophysical Cosmology</li> <li class="show">Interdisciplinary Research</li> </ul> <p>Our goal is to ensure that our readers stay at the forefront of the field by providing timely access to the most important and influential research findings.</p> <h2 style="margin: 5px 0 5px;"><span style="text-decoration: underline;">Aim</span></h2> <p style="margin: 5px 0 5px;">The primary aim of <em>Letters in High Energy Physics</em> is to accelerate the exchange of crucial scientific knowledge in the high energy physics community. 
We strive to:</p> <ul> <li class="show"><strong>Promote Cutting-Edge Research:</strong> Provide a venue for the swift publication of significant discoveries and innovations in high energy physics.</li> <li class="show"><strong>Foster Scientific Dialogue:</strong> Encourage the sharing of new ideas and methodologies that can shape the future direction of the field.</li> <li class="show"><strong>Support Scientific Collaboration:</strong> Facilitate connections among researchers from various subfields and geographic locations.</li> </ul> <h2 style="margin: 5px 0 5px;"><span style="text-decoration: underline;">Submission Criteria</span></h2> <p style="margin: 5px 0 5px;">We invite submissions that:</p> <ul> <li class="show"><strong>Demonstrate Significance:</strong> Present results that have substantial implications for the field or introduce new and promising research directions.</li> <li class="show"><strong>Show Originality:</strong> Offer novel insights or approaches that advance our understanding of high energy physics.</li> <li class="show"><strong>Ensure Clarity:</strong> Communicate findings clearly and effectively, making them accessible to a broad audience within the scientific community.</li> </ul> <p>We encourage researchers to contribute papers that reflect the vibrant and evolving nature of high energy physics. By adhering to these aims and scope, we aim to maintain a high standard of scientific excellence and relevance.</p> <hr>https://www.lettersinhighenergyphysics.com/index.php/LHEP/article/view/1227Strategic Management of Supply Warehouses during Critical Circumstances: Challenges, Innovations, and Policy Implications2025-01-10T00:34:57+00:00Moady Ahmed Asiri Et al.a@a.com<p>Supply warehouses play a critical role in responding to emergencies, ensuring the distribution of essential goods amidst disruptions. 
This study explores the multifaceted challenges of warehouse management during crises, such as demand surges, workforce disruptions, and logistical barriers. It evaluates innovative strategies, including the adoption of advanced technologies like artificial intelligence (AI), Internet of Things (IoT), and blockchain. Using case studies and data analysis, the study emphasizes the need for resilience, sustainability, and ethical decision-making. Finally, it provides actionable policy recommendations to improve warehouse efficiency and preparedness for future crises.</p>2025-01-08T00:00:00+00:00##submission.copyrightStatement##https://www.lettersinhighenergyphysics.com/index.php/LHEP/article/view/1230Optimization of Production Scheduling in Smart Manufacturing Environments Using Machine Learning Algorithms2025-01-10T00:34:57+00:00Faisal Bin Shaikat et al.a@a.com<p>The transition to Industry 4.0 has introduced smart manufacturing environments, where dynamic processes require real-time decision-making to optimize production scheduling and enhance operational efficiency. This study aims to develop and implement advanced machine learning (ML) algorithms for optimizing production scheduling in smart manufacturing environments, focusing on improving efficiency, resource allocation, and adaptability under dynamic conditions. A hybrid ML model combining reinforcement learning (RL) and genetic algorithms (GA) was developed. Historical and real-time data from a simulated smart factory were analyzed. The model was trained on 500 iterations of production scenarios involving dynamic demand, machine availability, and workforce constraints. Performance was benchmarked against traditional heuristic scheduling methods to validate improvements in key performance indicators. The hybrid ML model delivered significant improvements over traditional methods. Production efficiency increased by 39%, resource utilization reached 91% (a 14% improvement), and machine downtime was reduced by 34%.
The scheduling system achieved a 94% success rate in meeting delivery deadlines under varying scenarios, compared to 78% using heuristic methods. Energy consumption per task was reduced by 17%, reflecting enhanced sustainability. In large-scale tests involving 1,000 tasks, the model maintained over 96% operational efficiency, confirming its scalability and robustness. The integration of ML in production scheduling demonstrates transformative potential for smart manufacturing environments, offering enhanced efficiency, adaptability, and sustainability. The proposed hybrid ML model represents a scalable, data-driven solution tailored to Industry 4.0 requirements.</p>2025-01-09T00:00:00+00:00##submission.copyrightStatement##https://www.lettersinhighenergyphysics.com/index.php/LHEP/article/view/1257An In-Depth Study of the Central Sterile Supply Department (CSSD): Processes, Challenges, and Technological Advancements2025-02-28T07:20:43+00:00Farhan Ghalib Alsubaie et al.a@a.com<p>The Central Sterile Supply Department (CSSD) is the backbone of modern healthcare systems, ensuring sterile, functional, and ready-to-use medical instruments for patient care. This study provides an in-depth analysis of CSSD operations, focusing on workflow, operational challenges, and innovative advancements in sterilization technologies and processes. 
By presenting case studies, cost-benefit analyses, and statistical data, this paper highlights the significance of CSSD efficiency in improving patient outcomes, reducing hospital-acquired infections (HAIs), and optimizing resource allocation.</p>2025-01-09T00:00:00+00:00##submission.copyrightStatement##https://www.lettersinhighenergyphysics.com/index.php/LHEP/article/view/1258Management of Multiple Symptoms in the Common Cold: A Comprehensive Review2025-02-28T07:20:44+00:00Hisham Ali M Alzahrani et al.a@a.com<p>The common cold is a self-limiting viral infection of the upper respiratory tract that results in a constellation of symptoms, including nasal congestion, rhinorrhea, sore throat, cough, headache, and fatigue. Its management focuses on symptomatic relief, as no specific antiviral therapies are available for most causative agents. This paper reviews the pathophysiology of common cold symptoms and evidence-based approaches to their management, highlighting pharmacological and non-pharmacological interventions. A focus is placed on combination therapies and patient-centered approaches to address multiple symptoms concurrently.</p>2025-01-09T00:00:00+00:00##submission.copyrightStatement##https://www.lettersinhighenergyphysics.com/index.php/LHEP/article/view/1298Studies on Structural, Morphological, Electrical and Gas Sensing Properties of the Polyaniline/Bismuth Oxide (PANI/Bi2O3) Composites.2025-02-28T07:20:45+00:00B T Vijaykumar et al.a@a.com<p>By adding bismuth oxide to the polymerization mixture of aniline hydrochloride and ammonium persulphate using chemical oxidative polymerization, a conducting polyaniline/bismuth oxide (PANI/Bi<sub>2</sub>O<sub>3</sub>) composite was prepared. The effect of adding bismuth oxide powder during the polymerization process on the reaction time and temperature has been studied.
The structure and morphology of the composites were characterized by X-ray diffraction (XRD) patterns and scanning electron microscopy (SEM) micrographs. The dc electrical transport properties of the PANI/Bi<sub>2</sub>O<sub>3</sub> composites have been investigated within the temperature range 30&ndash;200 <sup>o</sup>C, and the ac electrical properties of the PANI/Bi<sub>2</sub>O<sub>3</sub> composites have been studied as a function of frequency. The gas-sensing performance of the PANI/Bi<sub>2</sub>O<sub>3</sub> composites was studied at room temperature for liquefied petroleum gas (LPG). The change in resistance of the composites as a function of time upon exposure to LPG was investigated.</p>2025-01-09T00:00:00+00:00##submission.copyrightStatement##https://www.lettersinhighenergyphysics.com/index.php/LHEP/article/view/1324Study on Urban Air Mobility: Overview of Ecosystem, Market Potential, and Challenges2025-02-28T07:20:46+00:00Arun Kumar Chaudhary et al.a@a.com<p>Since the 1910s, the concept of flying cars and air travel within cities has mesmerized inventors. Recent advancements in electrification, automation, and other related fields have provided new avenues for business models, aircraft engineering, and on-demand airborne mobility systems. The goal of Urban Air Mobility (UAM) is to develop safe, eco-friendly, cost-effective, and widely available aerial networks for passenger travel, goods transportation, and urgent care services in metropolitan areas. This study applies a mixed-methods research design by conducting 106 interviews with industry professionals and performing two co-creation workshops to assess UAM's past, present, and future. The development of UAM is divided into six phases: (1) early ‘flying car’ designs (1910s–1950s), (2) regular helicopter services (1950s–1980s), (3) on-demand aerial transport revival (2010s), (4) VTOL corridor integration (2020s), (5) hub-and-spoke expansion, (6) seamless point-to-point systems.
There are still significant adoption barriers such as legal restrictions, cultural acceptance, safety concerns, operational noise, social equity, and environmental damage. The lack of infrastructure, complex airspace management, and revenue uncertainty inhibit scaling as well. The paper aims to open up discussion of the most urgent research areas around UAM, such as its socioeconomic effects, its environmental impacts, and its relation to existing aviation systems.</p>2025-01-09T00:00:00+00:00##submission.copyrightStatement##https://www.lettersinhighenergyphysics.com/index.php/LHEP/article/view/1346Heat Transfer by Non-Newtonian-Based SWCNT-Nanofluids in the Presence of MHD2025-02-26T05:09:36+00:00Mohamed Azizou et. alauthor@email.com<p>This paper presents a numerical investigation of heat transfer involving non-Newtonian-based nanofluids within an inclined square porous medium under the influence of MHD. To analyze the behavior of non-Newtonian fluids, the power-law model, a widely used rheological model for studying flow phenomena in porous media, was employed. The Darcy model is utilized to describe the flow within the porous medium. The problem is characterized by a set of interrelated non-linear differential equations, known as governing equations, which consist of the mass conservation equation (also called the continuity equation), the momentum equation, and the energy equation. These core governing equations are solved numerically using the finite difference method, with convective flow in porous media modeled through Darcy's law and the Boussinesq approximation. The main parameters influencing the problem are the Rayleigh number (Ra), the power-law index (n), the volume fraction of nanoparticles (φ), the inclination angle of the applied magnetic field (γ), the Hartmann number (Ha), and the inclination angle of the cavity (Φ).
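For reference, the governing equations named above typically take the following form for Darcy convection of a power-law fluid under the Boussinesq approximation (a generic sketch; the paper's exact nondimensionalization, symbol choices, and MHD term may differ):

```latex
% Continuity (mass conservation)
\nabla \cdot \mathbf{u} = 0
% Modified Darcy momentum law for a power-law fluid of index n,
% with Boussinesq buoyancy, rho = rho_0 [1 - beta (T - T_0)];
% n = 1 recovers the Newtonian Darcy law
\nabla p = -\frac{\mu_{\mathrm{eff}}}{K}\,|\mathbf{u}|^{\,n-1}\,\mathbf{u}
           + \rho_0\bigl[\,1-\beta\,(T-T_0)\,\bigr]\mathbf{g}
% Energy equation (convection--diffusion of temperature)
(\mathbf{u}\cdot\nabla)\,T = \alpha\,\nabla^{2} T
```

In MHD studies of this type, the applied magnetic field enters through an additional Lorentz-force term in the momentum balance, whose strength is measured by the Hartmann number.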
The results show that the power-law index (n), the volume fraction of nanoparticles (φ), inclination angle (Φ), and Hartmann number (Ha) have a significant impact on the flow intensity, as well as on the heat transfer driven by natural convection within the enclosure.</p>2025-01-09T00:00:00+00:00##submission.copyrightStatement##https://www.lettersinhighenergyphysics.com/index.php/LHEP/article/view/1347Laboratory and Clinical Features of Tumor Lysis Syndrome in Children with Non-Hodgkin Lymphoma2025-05-19T10:38:02+00:00Anhar Hadad Alsulami et. alauthor@email.com<p>Tumor lysis syndrome (TLS) is a critical oncological emergency that can occur in children with non-Hodgkin lymphoma (NHL), particularly following the initiation of chemotherapy. This syndrome results from the rapid breakdown of malignant cells, leading to the release of intracellular components such as potassium, phosphate, and nucleic acids into the bloodstream. The subsequent metabolic derangements, including hyperuricemia, hyperkalemia, hyperphosphatemia, and hypocalcemia, can have severe clinical consequences, including acute kidney injury, cardiac arrhythmias, and neurological disturbances. The clinical presentation of TLS in pediatric patients can vary widely, ranging from asymptomatic laboratory abnormalities to life-threatening complications. Symptoms may include fatigue, nausea, palpitations, and seizures, often occurring within 24 to 48 hours after chemotherapy initiation. Laboratory findings are characterized by elevated serum levels of uric acid, potassium, and phosphate, alongside decreased calcium levels, necessitating prompt recognition and intervention. Risk factors for TLS in children with NHL include the type and stage of the lymphoma, the presence of a high tumor burden, and the specific chemotherapy regimen employed.
Preventive strategies, such as aggressive hydration, the use of allopurinol or rasburicase, and close monitoring of metabolic parameters, are essential in mitigating the risk of TLS. In cases where TLS develops, immediate management is critical to stabilize the patient and address metabolic abnormalities. Understanding the laboratory and clinical features of TLS is vital for healthcare providers involved in the care of children with NHL. Early recognition and intervention can significantly improve patient outcomes, highlighting the importance of vigilance in monitoring at-risk pediatric patients during chemotherapy. Continued research into TLS mechanisms and management strategies is essential for enhancing the care of affected children.</p>2025-01-09T00:00:00+00:00##submission.copyrightStatement##https://www.lettersinhighenergyphysics.com/index.php/LHEP/article/view/1352Implication of Cosmological Upper Bound on the Validity of Golden Ratio Neutrino Mixings Under Radiative Corrections2025-03-03T05:53:20+00:00Y Monitar Singh et al.a@a.com<p>We study the implication of the most recent cosmological upper bound on the sum of the three neutrino masses on the validity of the golden ratio (GR) neutrino mixings defined at the high-energy seesaw scale, considering the possibility of generating low-energy values of the neutrino oscillation parameters through radiative corrections in the minimal supersymmetric standard model (MSSM). The present study is consistent with the most stringent and latest Planck data on the cosmological upper bound, ∑|m<sub>i</sub>|&lt;0.12 eV.
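For context, the exact golden-ratio mixing pattern referred to here fixes the solar angle through the golden ratio (this is the standard GR1 form; the paper may use a variant):

```latex
\tan\theta_{12} = \frac{1}{\varphi}, \qquad
\varphi = \frac{1+\sqrt{5}}{2} \approx 1.618, \qquad
\theta_{13} = 0, \qquad \theta_{23} = 45^{\circ},
% which implies for the solar mixing
\sin^{2}\theta_{12} = \frac{1}{1+\varphi^{2}} \approx 0.276 .
```

Radiative running between the high seesaw scale and low energies then perturbs this pattern, which is how a nonzero sin θ<sub>13</sub> is generated in analyses of this kind.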
For the radiative generation of sinθ<sub>13</sub> from an exact form of the golden ratio (GR) neutrino mixing matrix defined at the high seesaw energy scale, we take opposite CP parity mass eigenvalues (m<sub>1</sub>, &minus;m<sub>2</sub>, m<sub>3</sub>) with a non-zero real value of m<sub>3</sub>, and a larger value of tanβ&gt;60 in order to include large effects of radiative corrections in the calculation. The present analysis, including the CP-violating Dirac phase and SUSY threshold corrections, shows the validity of golden ratio neutrino mixings defined at the high seesaw energy scale in the normal hierarchical (NH) model. The numerical analysis with the variation of four parameters, viz. M<sub>R</sub>, m<sub>s</sub>, tanβ, and the SUSY threshold-correction parameter, shows that the best result for the validity is obtained at M<sub>R</sub>=10<sup>15</sup> GeV, m<sub>s</sub>=1 TeV, tanβ=68, and a threshold-correction parameter of 0.01. However, the analysis based on the inverted hierarchical (IH) model does not conform with this latest Planck data on the cosmological bound, but it still conforms with the earlier Planck cosmological upper bound ∑|m<sub>i</sub>|&lt;0.23 eV, thus indicating a possible preference of NH over IH models.</p>2025-01-09T00:00:00+00:00##submission.copyrightStatement##https://www.lettersinhighenergyphysics.com/index.php/LHEP/article/view/1355Development of Writing Processes in Students with Intellectual Disabilities through Teacher Training2025-03-05T05:53:03+00:00Leidy Natalia Montes Arciniegasa@a.com<p><strong>Introduction</strong>: The research aims to enhance writing in students with intellectual disabilities through a teacher training program. Based on references of didactics, inclusive education and the Didactext model, it follows a qualitative and critical approach.
The results show that it is possible to develop inclusive projects that encourage the participation of all students, strengthening writing through narrative texts and the exploration of emotions.</p> <p><strong>Objective</strong>: Strengthen the development of writing skills in students with intellectual disabilities through a teacher training program.</p> <p><strong>Methods:</strong> From a qualitative approach, the research follows a systematic and rigorous process that allows reflection on educational practices in their historical and social dimension. It is part of the critical paradigm (Jiménez and Serón, 1992) and is based on the Didactext model, which highlights the importance of cognitive components and their relationship with culture, context, the individual and the didactic approach in the production of texts.</p> <p><strong>Results:</strong> The findings show that it is possible to develop inclusive projects in which all students participate according to their abilities, favoring the strengthening of writing through the proposal of narrative texts and the exploration of emotions.</p> <p><strong>Conclusions:</strong> The study shows that teacher training is key to improving writing in students with intellectual disabilities. In addition, it highlights the importance of didactic approaches that integrate culture and the context of learning. Finally, it is concluded that inclusion and creativity in the teaching of writing favor active participation and the development of communicative skills in all students.</p>2025-01-09T00:00:00+00:00##submission.copyrightStatement##https://www.lettersinhighenergyphysics.com/index.php/LHEP/article/view/1357Antidotes: Mechanisms, Applications, and Future Directions2025-03-06T06:00:24+00:00Eman Salim ALsharari et al.a@a.com<p>Antidotes are critical therapeutic agents used to counteract the effects of toxins, poisons, or overdoses.
This paper provides a comprehensive review of antidotes, including their mechanisms of action, clinical applications, challenges in development, and future directions in research and innovation. We present detailed tables summarizing key antidotes, their indications, mechanisms of action, and pharmacokinetic properties. The paper highlights the importance of antidotes in toxicology and emergency medicine, emphasizing the need for continued innovation and accessibility.</p> <p>Keywords: antidotes, poisoning, toxicology, mechanisms of action, chelation therapy, receptor antagonism, enzyme reactivation, nanomedicine, artificial intelligence, universal antidotes.</p>2025-01-09T00:00:00+00:00##submission.copyrightStatement##https://www.lettersinhighenergyphysics.com/index.php/LHEP/article/view/1453Advancing Machine Learning Operations (MLOps): A Framework for Continuous Integration and Deployment of Scalable AI Models in Dynamic Environments2025-06-10T06:51:24+00:00Smarth Behl et al.a@a.com<p>The rapid expansion of artificial intelligence (AI) applications has intensified the need for efficient and scalable Machine Learning Operations (MLOps) frameworks to streamline the deployment and lifecycle management of machine learning (ML) models. This study proposes a comprehensive MLOps framework that integrates continuous integration (CI), continuous deployment (CD), automated monitoring, and rollback mechanisms to support the scalable deployment of AI models in dynamic environments. Utilizing a cloud-native architecture built on tools such as Jenkins, Docker, Kubernetes, MLflow, and Airflow, the framework was tested across multiple model types and evaluated using both technical and operational performance metrics. Results show significant improvements in model accuracy, deployment latency, rollback speed, and drift detection compared to baseline systems and industry averages.
The framework achieved a 92.8% model accuracy, reduced deployment time by over 65%, and improved rollback efficiency by 95%. A comparative analysis of tool integration and pipeline performance further validated the system’s scalability, flexibility, and resilience. The findings demonstrate the framework’s ability to bridge the gap between experimentation and production, making it a practical and powerful solution for real-time, high-demand AI applications. This study offers valuable insights for researchers and practitioners seeking to enhance the robustness and efficiency of AI deployment in ever-evolving environments.</p>2025-01-09T00:00:00+00:00##submission.copyrightStatement##https://www.lettersinhighenergyphysics.com/index.php/LHEP/article/view/1461Cloud-Scale Data Engineering: Real-Time Streaming Pipelines and Intelligent Infrastructure2025-06-03T06:32:53+00:00Soumya Banerjee et al.a@a.com<p>In the era of big data and continuous digital transformation, real-time data processing has become a strategic imperative for modern enterprises. This study investigates the performance, scalability, and resilience of cloud-scale data engineering architectures by comparing four real-time streaming pipelines: Kafka + Flink, Pulsar + Spark, Pub/Sub + Dataflow, and Kinesis + Lambda. Each configuration was deployed across leading cloud platforms using Kubernetes-based orchestration and evaluated under controlled load simulations ranging from 10,000 to 500,000 events per second. Key metrics such as latency, throughput, message loss rate, resource utilization, and system recovery time were analyzed using ANOVA, multivariate regression, and survival analysis. The results reveal that Pub/Sub + Dataflow delivers the best overall performance with the lowest latency, highest throughput, and superior fault tolerance, while Kinesis + Lambda trails due to higher latency and resource strain under load. 
Regression analysis identifies CPU usage and input load as dominant performance predictors. Kaplan-Meier survival curves further emphasize the operational resilience of each architecture under stress. These findings offer valuable insights into building scalable, intelligent data pipelines that leverage cloud-native features such as autoscaling, serverless processing, and predictive infrastructure management. The study contributes a validated framework for designing and optimizing real-time streaming systems tailored to dynamic enterprise environments.</p>2025-01-09T00:00:00+00:00##submission.copyrightStatement##https://www.lettersinhighenergyphysics.com/index.php/LHEP/article/view/1466Retracted2025-08-25T07:25:58+00:00Retracted et al.aa@a.com<p>Retracted</p>2025-01-09T00:00:00+00:00##submission.copyrightStatement##https://www.lettersinhighenergyphysics.com/index.php/LHEP/article/view/1467Integrating Physiotherapy and Healthcare Service Management Enhancing Patient Outcomes in Hospital Settings2025-06-19T07:58:39+00:00Fadel Mohammed Alshehri et al.aa@a.com<p>The integration of physiotherapy into the governance of healthcare services is one of the most valuable practices for enhancing patient outcomes within hospitals. This paper explains some of the ways this integration can improve outcomes, including greater access to physiotherapy services, evidence-based practice, and patient-centeredness. Evidence indicates that early physiotherapy intervention can greatly reduce hospital stay and recovery time, thereby optimizing the use of resources. Interprofessional collaboration between physiotherapists and healthcare managers enhances communication and coordination among providers, leading to treatment plans tailored to individual patient needs.
In addition, keeping patients involved in their rehabilitation empowers and motivates them to adhere to treatment, maximizing the chance of success. The development of systematic quality-improvement frameworks supports the effectiveness of physiotherapy practice, leading to high rates of patient satisfaction. Despite challenges such as limited awareness and logistical barriers, integrating physiotherapy into the management of healthcare services can support more complete patient recovery. Future research should continue to investigate the implications of such integration and enable the creation of expert, patient-centered models of care.</p>2025-01-09T00:00:00+00:00##submission.copyrightStatement##https://www.lettersinhighenergyphysics.com/index.php/LHEP/article/view/1477Advancing Computational Modeling in Quantum Materials: A DFT-Based Approach to Electronic Structure and Material Properties2025-07-05T07:00:45+00:00Shabir Ahmad et al.a@a.com<p>Computational modeling is now a cornerstone of quantum materials research, where density functional theory (DFT) provides a key tool at the heart of predicting electronic structure and material properties. We evaluate DFT methods with a focus on exchange-correlation functionals, computational efficiency enhancements, and the incorporation of machine learning (ML). The study leverages first-principles calculations for the analysis of band structures, densities of states (DOS), and functional-dependent variations in electronic properties for quantum materials: 2D materials (graphene, MoS₂, and other TMDs), superconductors, and topological insulators. Additionally, time-dependent DFT (TDDFT) and orbital-free DFT (OF-DFT) increase accuracy for large-scale simulations, which are limited by computational resources in complex materials.
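As background to the functional dependence discussed here, DFT reduces the many-electron problem to the self-consistent Kohn–Sham equations, in which all of the approximation enters through the exchange-correlation potential (standard form, written here in atomic units):

```latex
% Kohn--Sham single-particle equations (atomic units)
\Bigl[-\tfrac{1}{2}\nabla^{2} + v_{\mathrm{ext}}(\mathbf r)
      + v_{\mathrm H}[n](\mathbf r)
      + v_{\mathrm{xc}}[n](\mathbf r)\Bigr]\,\psi_i(\mathbf r)
   = \varepsilon_i\,\psi_i(\mathbf r),
\qquad
n(\mathbf r) = \sum_{i}^{\mathrm{occ}} |\psi_i(\mathbf r)|^{2}
```

The choice of v<sub>xc</sub> (LDA, GGA, hybrids, and so on) is what drives the functional-dependent band structures and DOS that studies of this kind compare.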
In fact, these new ML-assisted DFT techniques greatly improve computational speed and predictive accuracy, thereby easing the trade-off between computational cost and precision. The study further presents critical perspectives on defect engineering in semiconductors, asserting its importance in tuning electronic properties for emerging fields such as nanoelectronics and quantum computing. Standard DFT functionals come with accuracy limitations, but combining them with AI-based surrogate models and many-body methods such as DFT+DMFT, GW, and quantum Monte Carlo offers powerful options on both fronts. Such innovations enable more efficient, scalable, and precise quantum-material simulations, facilitating advances in next-generation optoelectronic, spintronic, and superconducting devices. This work highlights the potential of AI-assisted computational modeling to revolutionize quantum materials science and ultimately enable advances in energy-efficient electronics and quantum technologies.</p>2025-01-09T00:00:00+00:00##submission.copyrightStatement##https://www.lettersinhighenergyphysics.com/index.php/LHEP/article/view/1487The Role of Thermal Modification and Nanofluids in Improving Energy Conversion Efficiency in Heat Exchangers2025-07-09T12:02:12+00:00Alwan Flayyih Husseinauthor@email.com<p>Enhancing energy conversion efficiency in heat exchangers is a critical objective in modern thermal engineering. Thermal modification techniques and the incorporation of nanofluids represent promising strategies to overcome conventional limitations such as low thermal conductivity and limited heat transfer rates. Thermal modification involves optimizing the design and material properties of heat exchangers to improve heat transfer coefficients, reduce fouling, and increase operational stability.
Meanwhile, nanofluids, engineered colloidal suspensions of nanoparticles in base fluids, demonstrate significantly improved thermal properties, enabling superior convective heat transfer performance. This paper explores the combined and individual roles of thermal modification and nanofluids in improving heat exchanger efficiency, examining the underlying mechanisms, practical implementation challenges, and potential for integration into advanced energy systems.</p>2025-01-09T00:00:00+00:00##submission.copyrightStatement##https://www.lettersinhighenergyphysics.com/index.php/LHEP/article/view/1488The Role of Nanofluids in Enhancing Energy Conversion Technologies: A Comparative Study of Different Types of Nanoparticles2025-07-10T06:20:57+00:00Alwan Flayyih Hussein et. alauthor@email.com<p>The integration of nanofluids into energy conversion systems has emerged as a promising strategy for enhancing heat transfer performance and overall efficiency. This study provides a comparative analysis of different types of nanoparticles—including metal oxides, carbon-based nanomaterials, and hybrid composites—dispersed in base fluids to evaluate their impact on thermophysical properties such as thermal conductivity, viscosity, and specific heat capacity. Experimental and computational investigations reveal that nanofluids significantly improve heat exchanger performance, solar thermal collectors, and cooling systems by facilitating superior thermal transport mechanisms. However, variations in stability, cost, and environmental impact across nanoparticle types necessitate careful material selection for specific applications. The findings underscore the importance of optimizing nanoparticle concentration, size, and morphology to achieve maximum efficiency gains while mitigating potential operational challenges. This comparative study aims to inform the development of next-generation energy conversion technologies that are both efficient and sustainable.
Studies on nanofillers show that the enhancement in the thermal conductivity of nanofluids depends on many variables, including the size and surface area of the nanoparticle filler, the filler loading, particle aggregation, viscosity, suspension stability, Brownian motion, and temperature. This article introduces these factors and how they affect the heat transfer of nanofluids.</p>2025-01-09T00:00:00+00:00##submission.copyrightStatement##https://www.lettersinhighenergyphysics.com/index.php/LHEP/article/view/1522Architecting Intelligent Financial Infrastructure: Scalable Machine Learning Systems for Real-Time Data Engineering in FinTech Applications2025-07-17T07:57:57+00:00Alex Chen et al.a@a.com<p>The increasing complexity and velocity of financial data in modern FinTech ecosystems necessitate a shift toward intelligent, scalable, and real-time infrastructures. This study proposes an integrated architecture that combines scalable machine learning systems with real-time data engineering to enable adaptive and high-throughput FinTech applications. Leveraging microservices, distributed processing frameworks, and MLOps practices, the architecture is designed to support diverse use cases such as fraud detection, high-frequency trading signal prediction, and personalized credit risk profiling. Performance benchmarks demonstrate that the system can sustain over 100,000 transactions per second under peak load, while maintaining sub-50 millisecond latency across streaming data pipelines. Machine learning models achieved high predictive accuracy (AUC up to 0.97 and RMSE as low as 0.028), validated through rigorous statistical analyses including PCA, VIF, t-tests, and ANOVA. Real-time stream processing engines ensured timely and accurate data transformation with >97% window completeness.
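The window-completeness figure quoted above can be illustrated with a toy tumbling-window calculation; the event times and expected rate below are invented for illustration, not the study's data:

```python
# Toy window-completeness metric for a streaming pipeline:
# fraction of expected events actually received per tumbling window.

def window_completeness(event_times, window_s, expected_per_window):
    """Map tumbling-window index -> completeness ratio (received / expected)."""
    counts = {}
    for t in event_times:
        w = int(t // window_s)            # which window this event falls into
        counts[w] = counts.get(w, 0) + 1
    return {w: c / expected_per_window for w, c in sorted(counts.items())}

# 10 events/s expected; one event is lost in the second 1-second window
events = [i / 10 for i in range(10)] + [1.0 + i / 10 for i in range(9)]
print(window_completeness(events, window_s=1.0, expected_per_window=10))
```

A production engine additionally handles late arrivals and watermarks, but the reported metric reduces to this ratio per window.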
Overall, this study offers a robust, scalable, and intelligent framework for powering next-generation FinTech platforms capable of delivering real-time, data-driven financial intelligence.</p>2025-01-09T00:00:00+00:00##submission.copyrightStatement##https://www.lettersinhighenergyphysics.com/index.php/LHEP/article/view/1523Designing Trustworthy Data Products for Scalable Enterprise Solutions Through GenAI Engineering Integrated with Financial Modeling and Intelligent Product Development2025-07-17T07:59:25+00:00Dilip Rachamalla et al.a@a.com<p>In the era of intelligent automation and data-driven decision-making, enterprises face growing pressure to develop trustworthy and scalable data products that can deliver actionable insights while maintaining transparency, security, and economic viability. This study proposes a strategic framework for designing such data products by integrating Generative AI (GenAI) engineering with financial modeling and intelligent product development. The methodology combines advanced AI architectures (e.g., GPT-3.5, BERT, TabTransformer) with scenario-based financial simulations and user-centered design practices to evaluate model performance, economic feasibility, scalability, and ethical compliance. Results demonstrate that GenAI models can achieve high accuracy and explainability while financial modeling ensures economic sustainability across market conditions. System testing confirms architectural resilience under enterprise-scale workloads, while user feedback highlights the success of intelligent feature adoption. Moreover, robust governance protocols reinforce trust through data privacy, auditability, and regulatory alignment. 
The study concludes that a unified approach merging AI innovation with economic rigor and ethical design enables the creation of enterprise-grade data products that are reliable, scalable, and trusted by stakeholders.</p>2025-01-09T00:00:00+00:00##submission.copyrightStatement##https://www.lettersinhighenergyphysics.com/index.php/LHEP/article/view/1529AI-Enabled Business Intelligence Platforms for SaaS Solutions: Securing Contracting Intelligence in the Life Sciences Industry2025-07-25T07:49:15+00:00Kalyan Kilaru et al.a@a.com<p>The increasing complexity of regulatory compliance, contract negotiation, and data security in the life sciences industry has created an urgent need for intelligent and scalable contracting solutions. This study investigates the integration of AI-enabled business intelligence (BI) platforms within Software-as-a-Service (SaaS) environments to enhance contracting intelligence in the life sciences sector. Utilizing a mixed-methods approach, the research evaluates 30 organizations, 15 using AI-BI SaaS platforms and 15 relying on traditional contract lifecycle management (CLM) systems, across performance, security, and compliance metrics. Results indicate that AI-enabled platforms significantly outperform traditional systems in contract approval time, clause-risk detection accuracy, renewal precision, and regulatory adherence. AI models demonstrated high reliability with F1-scores exceeding 0.90 and anomaly detection AUC values above 0.95. Security assessments reveal that AI-BI platforms implement more advanced measures, such as 256-bit encryption and federated learning, contributing to both enhanced protection and faster processing times. Statistical analyses, including ANOVA and correlation testing, confirm the significance of these improvements.
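Group comparisons of this kind rest on one-way ANOVA; a minimal sketch of the F-statistic on two synthetic samples (not the study's data) looks like this:

```python
# One-way ANOVA F-statistic from first principles:
# between-group variance over within-group variance.

def f_oneway(*groups):
    """Return the one-way ANOVA F statistic for k groups."""
    all_vals = [x for g in groups for x in g]
    grand = sum(all_vals) / len(all_vals)
    k, n = len(groups), len(all_vals)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

ai_bi       = [12, 11, 13, 10, 12]   # e.g., contract approval time in days (synthetic)
traditional = [20, 22, 19, 21, 23]
print(f"F = {f_oneway(ai_bi, traditional):.1f}")
```

A large F relative to the F-distribution critical value (here with 1 and 8 degrees of freedom) is what "statistically significant" means in the results above.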
The findings underscore the transformative potential of secure, AI-driven SaaS solutions in creating intelligent, compliant, and efficient contract ecosystems within the life sciences industry, offering valuable insights for digital transformation strategies in regulated domains.</p>2025-01-09T00:00:00+00:00##submission.copyrightStatement##https://www.lettersinhighenergyphysics.com/index.php/LHEP/article/view/1532Startup Security in Industrial IoT: AI-Driven Application Security for Smart Manufacturing Networks2025-08-11T06:32:04+00:00Rohith Narasimhamurthy et al.a@a.com<p>The rapid emergence of Industrial Internet of Things (IIoT) technologies has transformed smart manufacturing by enabling real-time monitoring, automation, and predictive decision-making. Startups play a crucial role in driving this transformation; however, their applications often lack robust security frameworks, making them vulnerable to cyber threats. This study investigates the effectiveness of AI-driven application security in enhancing the resilience of IIoT systems deployed by startups within smart manufacturing networks. A comparative evaluation of machine learning models including Random Forest, Deep Neural Networks, SVM, and Autoencoders was conducted across 30 IIoT startups, assessing detection accuracy, response latency, false-positive rates, and operational impact. Results demonstrate that AI-integrated security significantly improves threat detection (with Random Forest achieving 97.2% accuracy), reduces unpatched vulnerabilities by 75%, and minimizes system downtime by 69.3%. ANOVA and regression analyses confirmed the statistical significance of performance differences and the inverse relationship between model accuracy and latency. Furthermore, adaptive AI systems showed a continuous decline in intrusion attempts over a 30-day simulation, highlighting their real-time learning capabilities. 
The study also found that CPU overhead remained within acceptable limits, ensuring deployment feasibility even in resource-constrained environments. Overall, this research emphasizes the strategic necessity of integrating AI into application-layer security for IIoT startups, offering scalable, intelligent, and proactive protection that supports long-term sustainability and competitiveness in smart manufacturing ecosystems.</p>2025-01-09T00:00:00+00:00##submission.copyrightStatement##https://www.lettersinhighenergyphysics.com/index.php/LHEP/article/view/1533Modern Software Engineering: API Design and Production Engineering for Scalable Platforms2025-09-11T06:16:39+00:00Prithviraj Kumar et al.a@a.com<p>In the era of cloud-native systems and high-concurrency digital services, achieving scalability and resilience in software platforms requires a seamless integration of modern software engineering practices. This study investigates the dual role of API design and production engineering in enabling scalable platforms. Using a mixed-method approach, five enterprise-grade systems across the finance, healthcare, e-commerce, SaaS, and logistics sectors were analyzed through structural API assessments, production maturity evaluations, and performance benchmarking under peak loads. Key metrics included response time, throughput, observability coverage, CI/CD automation, availability, and latency under increasing user demands. The results revealed that platforms with well-structured APIs and advanced production engineering, such as ShopCart and FinanceCloud, demonstrated significantly superior scalability and operational reliability compared to those with weaker practices. A multiple regression model confirmed that factors like endpoint hierarchy and auto-scaling capability were statistically significant predictors of scalability (R² = 0.71, p < 0.05).
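A regression of that shape can be sketched with ordinary least squares; the ten observations below are synthetic stand-ins for the study's platform metrics, not its data:

```python
# Multiple regression of a scalability score on two predictors
# (endpoint-hierarchy depth and an auto-scaling indicator), with R^2.
import numpy as np

hierarchy = np.array([1, 2, 2, 3, 3, 4, 4, 5, 5, 6], dtype=float)
autoscale = np.array([0, 0, 1, 0, 1, 1, 1, 1, 0, 1], dtype=float)
# Synthetic response: known coefficients plus small noise
scalability = 2.0 * hierarchy + 5.0 * autoscale + np.array(
    [0.3, -0.4, 0.2, 0.5, -0.1, 0.4, -0.3, 0.1, -0.5, 0.2])

X = np.column_stack([np.ones_like(hierarchy), hierarchy, autoscale])
beta, *_ = np.linalg.lstsq(X, scalability, rcond=None)   # [intercept, b1, b2]
resid = scalability - X @ beta
r2 = 1 - (resid @ resid) / np.sum((scalability - scalability.mean()) ** 2)
print(f"coefficients: {beta.round(2)}, R^2 = {r2:.2f}")
```

The fitted coefficients recover the planted effects, and R² measures how much scalability variance the two predictors explain, as in the reported model.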
Principal Component Analysis and latency trend visualizations further supported the synergistic impact of design and operational workflows. This research concludes that scalable software systems are the result of coordinated architectural clarity and operational robustness. The findings provide actionable insights for engineering teams aiming to build resilient and future-ready platforms in increasingly demanding digital ecosystems.</p>2025-01-09T00:00:00+00:00##submission.copyrightStatement##https://www.lettersinhighenergyphysics.com/index.php/LHEP/article/view/1534Optimizing Production Engineering: Data Science and ML Solutions for Scalable Data Pipelines in Supply Chain Software2025-09-09T11:05:54+00:00Abhishek Gupta et al.a@a.com<p>In the era of Industry 4.0, optimizing production engineering through intelligent systems has become a strategic priority for supply chain-driven industries. This study investigates the integration of Data Science and Machine Learning (ML) solutions within scalable data pipelines to enhance production performance and decision-making in supply chain software platforms. A hybrid methodology was employed, combining real-time data pipeline engineering using Apache Kafka and Airflow with predictive modeling through algorithms such as Random Forest, XGBoost, ARIMA, and Prophet. Empirical analysis was conducted across multiple industrial case studies, evaluating the system on key performance indicators (KPIs) such as production throughput, machine downtime, and inventory turnover. The results revealed notable improvements in operational accuracy, with Prophet outperforming ARIMA in demand forecasting and Random Forest achieving 92.4% accuracy in equipment failure prediction. Scalable data pipelines ensured high throughput and low latency, supporting seamless real-time ML deployment. Statistical analysis confirmed the significance of performance gains, with production efficiency increasing by 9.3% and forecast error decreasing by over 38%. 
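Forecast comparisons like the Prophet-vs-ARIMA result above typically come down to an error metric such as MAPE; a minimal sketch on an invented demand series:

```python
# Mean absolute percentage error: the usual yardstick for comparing
# demand forecasts from two competing models.

def mape(actual, forecast):
    """MAPE in percent; assumes strictly positive actuals."""
    return 100 * sum(abs(a - f) / a for a, f in zip(actual, forecast)) / len(actual)

actual     = [100, 110, 120, 115, 130]   # synthetic weekly demand
forecast_a = [ 98, 112, 118, 116, 128]   # tighter model
forecast_b = [ 90, 120, 105, 125, 140]   # looser model

print(f"model A MAPE: {mape(actual, forecast_a):.2f} %")
print(f"model B MAPE: {mape(actual, forecast_b):.2f} %")
```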
This study provides a practical, data-driven framework for optimizing production workflows and establishes a foundation for AI-enabled supply chain transformation. The findings highlight the critical role of ML and data engineering in advancing modern production systems and driving digital resilience in industrial operations.</p>2025-01-09T00:00:00+00:00##submission.copyrightStatement##https://www.lettersinhighenergyphysics.com/index.php/LHEP/article/view/1535Reinforcement Learning for Distributed AI Systems: Scalable Indexing and LLM Integration in Cloud Architecture2025-07-29T07:17:44+00:00Prithviraj Kumar Dasari et al.a@a.com<p>This study proposes a unified framework for distributed artificial intelligence (AI) systems by integrating reinforcement learning (RL), scalable indexing, and large language models (LLMs) within a cloud-native architecture. The research investigates how advanced RL algorithms, particularly PPO and DQN, function under distributed workloads and how the inclusion of LLMs enhances system interpretability and user interaction. A multi-agent simulation was deployed in a cloud environment using Kubernetes for orchestration and Apache Cassandra for indexing, enabling horizontal scalability and low-latency performance. Results show that PPO outperforms DQN in convergence speed and reward optimization, while DQN integrated with LLMs improves interpretability and dynamic policy updates without compromising performance. Scalable indexing frameworks significantly enhanced throughput and reduced latency, with cache hit rates positively correlating with overall system efficiency. Statistical analyses, including ANOVA and Pearson correlations, confirmed the significance and strength of these improvements. This integrated approach demonstrates the effectiveness of combining learning, reasoning, and storage subsystems in distributed AI applications.
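The value-update machinery that DQN and PPO scale up can be shown with tabular Q-learning on a toy chain MDP; this environment is invented for illustration and is unrelated to the study's distributed workloads:

```python
# Tabular Q-learning on a 5-state chain: the agent must walk right to a goal
# state that pays reward 1. Q-values learned via the Bellman update.
import random

N_STATES = 5                  # states 0..4; state 4 is the terminal goal
ACTIONS = (0, 1)              # 0 = step left, 1 = step right
Q = [[0.0, 0.0] for _ in range(N_STATES)]
alpha, gamma, eps = 0.5, 0.9, 0.2
random.seed(0)

for _ in range(500):          # training episodes
    s = 0
    while s != N_STATES - 1:
        if random.random() < eps:                      # explore
            a = random.choice(ACTIONS)
        else:                                          # exploit; ties -> "right"
            a = max(ACTIONS, key=lambda act: (Q[s][act], act))
        s2 = max(0, s - 1) if a == 0 else s + 1
        r = 1.0 if s2 == N_STATES - 1 else 0.0
        Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])  # Bellman update
        s = s2

print([round(max(q), 2) for q in Q])   # state values rise toward the goal
```

DQN replaces the table with a neural network and PPO learns a policy directly, but both optimize essentially this discounted-return objective.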
It offers a scalable, interpretable, and efficient model suitable for real-time intelligent systems in domains such as autonomous operations, industrial automation, and federated learning.</p>2025-01-09T00:00:00+00:00##submission.copyrightStatement##https://www.lettersinhighenergyphysics.com/index.php/LHEP/article/view/1536Engineering Secure Software: Information Security Strategies for Modern Development Teams2025-07-31T09:44:58+00:00Rajiv Kishore Gadda et al.a@agmail.com<p>In an era where software systems form the backbone of digital transformation, securing applications from the ground up has become a strategic imperative. This study explores the engineering of secure software through the integration of comprehensive information security strategies by modern development teams. Utilizing a mixed-methods approach, the research involved quantitative surveys and qualitative interviews with 120 professionals across industries practicing Agile, DevOps, and hybrid development methodologies. Key strategies such as secure coding, threat modeling, DevSecOps pipeline integration, and automated testing (SAST, DAST, and SCA) were assessed for their implementation frequency, effectiveness, and integration complexity. Statistical analysis revealed strong positive correlations between the adoption of security practices and software robustness, alongside significant inverse relationships with security incident rates and time-to-market pressures. Regression modeling confirmed the Security Practice Index, team collaboration, and training frequency as significant predictors of software quality. Additionally, DevOps-based teams and larger organizations reported significantly lower incident rates, as evidenced by ANOVA results and comparative visualizations. The study concludes that engineering secure software requires not just technical tools but a cultural shift that aligns developers, security analysts, and operations teams around shared security goals. 
By embedding security into every phase of the SDLC, modern teams can mitigate risks, improve resilience, and sustain agile delivery in an increasingly hostile cyber landscape.</p>2025-01-09T00:00:00+00:00##submission.copyrightStatement##https://www.lettersinhighenergyphysics.com/index.php/LHEP/article/view/1537ML-Driven Application Security: Engineering Intelligent and Secure Software Solutions2025-07-31T09:46:28+00:00Sri Nitchith Akula et al.a@agmail.com<p>In the face of escalating cyber threats and complex software architectures, traditional security approaches often fall short of providing comprehensive protection. This study explores the integration of machine learning (ML) into application security to engineer intelligent and secure software solutions. A multi-layered methodology incorporating supervised, unsupervised, and reinforcement learning techniques was developed and applied across different stages of the Software Development Life Cycle (SDLC). Supervised models such as Random Forest and Gradient Boosting were used for vulnerability prediction, achieving high accuracy and precision. Unsupervised models like Autoencoders and Isolation Forests detected anomalies in real-time system behavior with low false-positive rates. Reinforcement learning agents were employed to automate threat mitigation in dynamic environments, optimizing access control and API usage with minimal latency. The ML modules were embedded into a secure engineering pipeline and evaluated on performance, detection capability, and operational overhead. Results revealed substantial improvements in threat prediction, a 73.8% reduction in real-world security incidents, and minimal impact on system resources. 
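In the spirit of the unsupervised detectors above, a minimal stand-in flags observations far outside a learned baseline; the 3-sigma rule and latency values below are illustrative substitutes for the study's Autoencoder and Isolation Forest models:

```python
# Baseline-deviation anomaly detector: fit mean/stdev on normal traffic,
# flag new observations beyond k standard deviations.
from statistics import mean, stdev

baseline = [102, 98, 101, 99, 100, 103, 97, 100, 99, 101]   # ms, normal API latencies
mu, sigma = mean(baseline), stdev(baseline)

def is_anomalous(latency_ms, k=3.0):
    """True if the observation lies outside the k-sigma band of the baseline."""
    return abs(latency_ms - mu) > k * sigma

print([x for x in (100, 104, 250, 96) if is_anomalous(x)])  # only the outlier
```

Autoencoders generalize this idea by learning a nonlinear notion of "normal" and scoring reconstruction error instead of raw deviation.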
This study affirms that ML-driven application security transforms conventional security practices by enabling intelligent, adaptive, and scalable solutions, marking a paradigm shift toward autonomous and proactive software protection.</p>2025-01-09T00:00:00+00:00##submission.copyrightStatement##https://www.lettersinhighenergyphysics.com/index.php/LHEP/article/view/1545AI-Driven Secure Smart Manufacturing: Integrating Database Indexing, Industrial Cybersecurity, and Real-Time IIoT Analytics in Wireless Automation Architectures2025-08-06T09:48:04+00:00Omkar Ashok Bhalekar et al.a@a.com<p>The rapid evolution of Industry 4.0 has propelled smart manufacturing into an era of AI-driven intelligence, requiring seamless integration of cybersecurity, data analytics, and real-time control within wireless architectures. This study proposes a comprehensive framework that integrates artificial intelligence, intelligent database indexing, industrial cybersecurity, and real-time IIoT analytics for secure and scalable smart manufacturing systems. An experimental simulation was conducted using AI models including LSTM, Random Forest, and autoencoders to optimize predictive maintenance and anomaly detection. Indexing strategies (B-Tree, Hash, and AI-adaptive) were evaluated for query latency and data throughput, while wireless protocols such as Zigbee, Wi-Fi 6, and private 5G were assessed for latency, packet loss, and encryption overhead. Results indicate that AI-adaptive indexing achieved the lowest query latency (10 ms) and highest throughput (3,200 QPS), while LSTM delivered superior predictive accuracy (F1 score 95.1%) and autoencoders demonstrated robust anomaly detection (97.5% accuracy, 2.3% false-positive rate). Private 5G emerged as the most reliable wireless medium with minimal latency (7 ms) and the highest data integrity.
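The two classical index families compared above can be contrasted in miniature: an ordered (B-Tree-like) index searched by bisection versus a hash index; the key set is synthetic, and real engines add paging, caching, and concurrency control on top:

```python
# Ordered index (binary search, O(log n), supports range scans) vs.
# hash index (O(1) average lookup, point queries only).
import bisect

keys = sorted(range(0, 100000, 7))           # ordered index: sorted key array
hash_index = {k: f"row-{k}" for k in keys}   # hash index: key -> row pointer

def btree_lookup(k):
    """Binary search in the ordered index; stand-in for a B-Tree descent."""
    i = bisect.bisect_left(keys, k)
    return f"row-{k}" if i < len(keys) and keys[i] == k else None

def hash_lookup(k):
    """Constant-time point lookup; cannot serve range queries."""
    return hash_index.get(k)

print(btree_lookup(49), hash_lookup(50))
```

An "AI-adaptive" index, as evaluated in the study, would pick between such access paths per workload; the structural trade-off it navigates is the one shown here.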
The integrated approach demonstrates strong statistical significance and operational viability, highlighting the potential of AI-driven solutions in enhancing resilience, efficiency, and security in next-generation smart manufacturing ecosystems.</p>2025-01-09T00:00:00+00:00##submission.copyrightStatement##