People with type 2 diabetes mellitus have an increased risk of chronic kidney disease and atherosclerotic cardiovascular disease. Improved care delivery and implementation of guideline-directed medical therapy have contributed to the declining incidence of atherosclerotic cardiovascular disease in high-income countries. By contrast, the global incidence of chronic kidney disease and its associated mortality have either plateaued or increased, leading to escalating direct and indirect medical costs. Given limited resources, better risk stratification approaches to identify people at risk of rapid progression to end-stage kidney disease can reduce therapeutic inertia, facilitate timely interventions and identify the need for early nephrologist referral. Among people with chronic kidney disease stage G3a and beyond, the kidney failure risk equations (KFRE) have been externally validated and have outperformed other risk prediction models. The KFRE can also guide the timing of preparation for kidney replacement therapy, with improved healthcare resource planning, and may prevent multiple complications and premature mortality among people with chronic kidney disease with and without type 2 diabetes mellitus. The present review summarizes the evidence on the KFRE to date and calls for future research to validate and evaluate its impact on cardiovascular and mortality outcomes, as well as healthcare resource utilization, in multiethnic populations and different healthcare settings.
Screen time (ST) on digital devices has increased in recent decades with the growth of digital technology. Furthermore, constant engagement with digital devices alters sleep patterns, leading to nocturnal eating behaviour among users. These phenomena are of great concern, as digital device addiction and night eating are associated with unhealthy food intake, increasing the risk of metabolic syndrome (MetS). The purpose of this review was to examine the evidence on the influence of ST and night eating behaviour (NEB) on dietary intake and its association with MetS, based on the previous literature. Prolonged ST and NEB are associated with excessive energy intake from overconsumption of high-sugar and high-fat foods. However, the relationship between digital content and its influence on food intake is inconsistent. A higher MetS risk was found in individuals with longer ST, attributable to a sedentary lifestyle, whereas a positive energy balance and a shift in circadian rhythm contributed to the risk in night eaters. ST and NEB significantly influence food intake in adults. Additionally, the unhealthy food intake that results from excessive consumption of empty-calorie foods, such as sweet and fatty foods, driven by addiction to electronic devices and eating at night has a detrimental effect on metabolic function. Therefore, improving food intake by reducing ST and night-time binges is essential to reduce the risk of MetS.
This work highlights the role of the clinical laboratory in the early detection of the use of substances prohibited for doping. This is because most people who practice sports today are non-professional athletes and amateurs, in particular young people. These individuals are not subject to anti-doping controls, yet their health is at risk. Endocrinologists and laboratory tests, by detecting evidence of such use, can help protect their health. Anti-doping testing requires specific instruments for qualitative and quantitative chemical analysis to meet the regulations of official competitions, but its high cost makes it impossible to apply to every individual. A particular role the clinical laboratory can acquire in the future is through its molecular biology sections, once genetic doping becomes a reality that quantitative chemistry is unable to detect. A brief history of doping is provided to explain the reasons for its spread. Although doping attracts great attention nowadays, it is not a recent problem: it was common among ancient Greek wrestlers and Romans, who used mixtures of herbs and stimulants. The Olympic Games began in ancient Greece, where winners were held in great esteem, akin to demi-god status; any attempt to improve athletic performance was therefore the norm, not least because the damage caused by the substances used was not known at the time. The practice became so widespread that soldiers used drugs to fight better during recent wars, and doping was adopted by athletes, actors and musicians in attempts to obtain better performance. Today, doping has been refined to escape detection, and there is a continuous race between those who promote new substances and bodies such as the World Anti-Doping Agency (WADA), created to defend the health of athletes and uphold the regulations of competitions. The clinical laboratory plays a fundamental role in identifying the use of prohibited substances, especially in competitions not classified as official, which are the majority and involve thousands of amateurs. In this paper, a series of low-cost laboratory tests is proposed for this purpose, without the need for the qualitative/quantitative chemical analyses required by sporting jurisdictions. Finally, a glance at genetic doping illustrates a likely and imminent future practice.
Artificial intelligence (AI) use in diabetes care is increasingly being explored to personalise care for people with diabetes and to adapt treatments for complex presentations. However, the rapid advancement of AI also introduces concerns such as potential biases, ethical considerations, and implementation challenges in ensuring that its deployment is equitable. Ensuring inclusive and ethical development of AI technology can empower both health-care providers and people with diabetes in managing the condition. In this Review, we explore and summarise the current state and future prospects of AI across the diabetes care continuum, from enhancing screening and diagnosis to optimising treatment and predicting and managing complications.
The concepts of sustainability and sustainable chemistry have attracted increasing attention in recent years and are of great importance to the younger generation. In this Viewpoint Article, we share how early-career chemists can contribute to the sustainable transformation of their discipline and identify ways in which they can engage to catalyse action for change. This article does not attempt to answer questions about the most promising or pressing areas driving research and chemical innovation in the context of sustainability. Instead, we aim to inspire and engage early-career chemists in pursuing sustainable actions by showcasing opportunities in education, outreach and policymaking, research culture and publishing, while highlighting existing challenges and the complexity of the topic. We want to empower early-career chemists by providing resources and ideas for engagement towards a globally sustainable future. While the article focuses on students and early-career chemists, it also provides insights to further stimulate the engagement of scientists from diverse backgrounds.
Deep learning (DL) in orthopaedics has gained significant attention in recent years. Previous studies have shown that DL can be applied to a wide variety of orthopaedic tasks, including fracture detection, bone tumour diagnosis, implant recognition, and evaluation of osteoarthritis severity. The utilisation of DL is expected to increase, owing to its ability to deliver accurate diagnoses more efficiently than traditional methods in many scenarios, reducing the time and cost of diagnosis for patients and orthopaedic surgeons. To our knowledge, no dedicated study has comprehensively reviewed all aspects of DL currently used in orthopaedic practice. This review addresses this knowledge gap using articles from ScienceDirect, Scopus, IEEE Xplore, and Web of Science published between 2017 and 2023. The authors begin with the motivation for using DL in orthopaedics, including its ability to enhance diagnosis and treatment planning. The review then covers various applications of DL in orthopaedics, including fracture detection, detection of supraspinatus tears using MRI, osteoarthritis assessment, prediction of arthroplasty implant types, bone age assessment, and detection of joint-specific soft tissue disease. We also examine the challenges of implementing DL in orthopaedics, including the scarcity of data for training DL models and the lack of interpretability, as well as possible solutions to these common pitfalls. Our work highlights the requirements for achieving trustworthiness in the outcomes generated by DL, including the need for accuracy, explainability, and fairness in DL models. We pay particular attention to fusion techniques as one way to increase trustworthiness; these have also been used to address the multimodal data common in orthopaedics. Finally, we review the approval requirements set forth by the US Food and Drug Administration for the use of DL applications. In this way, we intend this review to serve as a guide for researchers developing reliable, market-ready DL applications for orthopaedic tasks from scratch.