The objectives of this study are to assess pediatric radiation exposure in certain barium studies and to quantify the organ and effective doses and the radiation risk resulting from patient irradiation. A total of 69 pediatric barium studies of the upper and lower gastrointestinal tract were reviewed. Patient radiation dose was quantified in terms of entrance surface air kerma (ESAK) using the exposure parameters and DosCal software. Organ and effective doses (E) were extrapolated using National Radiological Protection Board software (NRPB-R279). The mean ± SD and range of patient doses per procedure were 3.7 ± 0.4 (1.0-13.0) mGy, 7.4 ± 1.7 (5.5-8.0) mGy and 1.4 ± 0.9 (0.5-3.6) mGy for barium meal, swallow and enema, respectively. The corresponding mean effective doses were 0.3 ± 0.03 (0.08-1.1) mSv, 0.2 ± 1.6 (0.44-0.7) mSv and 0.3 ± 0.9 (0.1-0.8) mSv. These radiation doses were higher than those reported in previous studies; pediatric patients are therefore receiving avoidable radiation exposure. Certain optimization measures are recommended, along with establishing a national diagnostic reference level (DRL), to reduce the radiation risk.
In the modern clinical practice of diagnostic radiology there is a growing demand for radiation dosimetry, it being recognized that increasing use of X-ray examinations adds to the population dose, accompanied by an additional, albeit low, potential for genetic consequences. At the doses typical of diagnostic radiology there is also a low statistical risk of cancer induction and, even in adhering to best practice, a low but non-negligible potential for deterministic responses in sensitive organs, including the skin and eyes. Risk reduction is important, in line with the ALARP principle, for staff and patients alike; for the latter, modern practice is usually guided by diagnostic reference levels (DRLs), while for the former and for members of the public, legislated controls (supported by safe working practices) apply. As such, effective, reliable and accurate means of dosimetry are required in support of these actions. Recent studies have shown that Ge-doped silica glass fibres offer several advantages over the well-established phosphor-based thermoluminescent dosimeters (TLDs), including excellent sensitivity at diagnostic doses as demonstrated herein, low fading, good reproducibility and re-usability, as well as representing a water-impervious, robust dosimetric system. In addition, these silica-based fibres show good linearity over a wide dynamic range of dose and dose-rate and are directionally independent. In the present study, we investigate tailor-made doped-silica glass thermoluminescence (TL) dosimeters for applications in medical diagnostic imaging. The aim is to develop a dosimeter of sensitivity greater than that of the commonly used LiF:Mg,Ti phosphor.
We examine the ability of such doped glass media to detect the typically low levels of radiation in diagnostic applications (from fractions of a mGy through to several mGy or more), including mammography and dental radiology, use being made of X-ray tubes located at the Royal Surrey County Hospital. We further examine dose linearity, energy response and fading.
A pacemaker, which resynchronizes the heart with electrical impulses, is used to manage many clinical conditions. Recently, the frequency of pacemaker implantation procedures has increased by more than 50% worldwide. During these procedures, patients can receive excessive radiation exposure. A wide range of doses has been reported in previous studies, suggesting that optimization of this procedure has not yet been achieved. The current study evaluated patient radiation exposure during cardiac pacemaker procedures and quantified the patient effective dose. A total of 145 procedures were performed for five pacemaker types (VVI, VVIR, VVD, VVDR, and DDDR) at two hospitals. Patient doses were measured using a kerma-area product meter. Effective doses were estimated using software based on Monte Carlo simulation from the National Radiological Protection Board (NRPB, now the Health Protection Agency), and the effective dose values were used to estimate cancer risk from the pacemaker procedure. Patient demographic data and exposure parameters for fluoroscopy and radiography were quantified. The mean patient doses ± SD per procedure (Gy·cm2) for VVI, VVIR, VVD, VVDR, and DDDR were 1.52 ± 0.13 (1.43-1.61), 3.28 ± 2.34 (0.29-8.73), 3.04 ± 1.67 (1.57-4.86), 6.04 ± 2.33 (3.29-8.58), and 8.8 ± 3.6 (4.5-26.20), respectively. The overall patient effective dose was 1.1 mSv per procedure. The DDDR procedure clearly exposed patients to the highest radiation dose. Patient dose variation can be attributed to procedure type, exposure parameter settings, and fluoroscopy time. The results of this study showed that patient doses during the different pacemaker procedures are lower than previously reported values. Patient risk from pacemaker procedures is low compared to other cardiac interventional procedures. Patient exposure was mainly influenced by the type of procedure and the clinical indication.
Patient effective doses and the associated radiation risks arising from particular computed tomography (CT) imaging procedures are assessed. The objectives of this research are to measure radiation doses for patients and to quantify the radiogenic risks from CT brain and chest procedures. Patient data were collected from five calibrated CT machines in Saudi Arabia. The results are from a study of a total of 60 patients examined using the calibrated CT units. For CT brain and chest, the mean patient effective doses were 1.9 mSv (range 0.6-2.5 mSv) and 7.4 mSv (range 0.5-34.8 mSv), respectively. The radiogenic risk to patients ranged between 10⁻⁵ and 10⁻⁴ per procedure. With 65% of the CT procedures diagnosed as normal, re-evaluation of the referral criteria is warranted. The establishment of diagnostic reference levels (DRLs) and the implementation of radiation dose optimisation measures would further help reduce doses to optimal values.
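The order of magnitude of the risk figures quoted above can be reproduced by multiplying the effective dose by a nominal risk coefficient. A minimal sketch, assuming the ICRP 103 nominal detriment-adjusted risk coefficient of 5.5 × 10⁻² per Sv for the whole population (the study's exact coefficients are not stated, so this value is an illustrative assumption):

```python
# Hedged sketch: stochastic radiogenic risk from effective dose, assuming
# the ICRP 103 nominal risk coefficient of 5.5e-2 per Sv (5.5e-5 per mSv).
RISK_PER_MSV = 5.5e-5  # detriment-adjusted nominal risk, whole population

def radiogenic_risk(effective_dose_msv: float) -> float:
    """Return the nominal stochastic risk for a given effective dose (mSv)."""
    return effective_dose_msv * RISK_PER_MSV

# Mean effective doses reported for CT brain and chest
for name, dose in [("CT brain", 1.9), ("CT chest", 7.4)]:
    print(f"{name}: {radiogenic_risk(dose):.1e} per procedure")
```

With these inputs the nominal risks land in the 10⁻⁴ range, consistent with the 10⁻⁵ to 10⁻⁴ span reported once the full dose range is considered.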
Patient radiation dose and image quality are primary issues in the conduct of nuclear medicine (NM) procedures. A range of protocols is currently used in image acquisition and in the analysis of quality control (QC) tests, with National Electrical Manufacturers Association (NEMA) methods and protocols widely accepted as providing an accurate description, measurement and report of γ-camera performance parameters. However, no standard software is available for image analysis. The present study compares vendor QC software analysis with three types of software freely downloadable from the internet: NMQC, NM Toolkit and ImageJ-NM Toolkit. These were used for image analysis of γ-camera QC tests based on NEMA protocols, including non-uniformity evaluation. Ten non-uniformity QC images were obtained using a dual-head γ-camera installed in Trieste General Hospital and then analyzed. Excel analysis was used as the baseline calculation for the non-uniformity test according to NEMA procedures. The results of the non-uniformity analysis showed good agreement between the independent types of software and the Excel calculations (average differences of 0.3%, 2.9%, 1.3% and 1.6% for the useful field of view (UFOV) integral, UFOV differential, central field of view (CFOV) integral and CFOV differential, respectively), while significant differences were detected for the vendor QC software when compared with the Excel analysis (average differences of 14.6%, 20.7%, 25.7% and 31.9%, in the same order). NMQC software was found to be in closest accord with the Excel calculations. Variation in the results obtained using the three types of software and the γ-camera QC software was due to the use of different pixel sizes. It is important to conduct independent analyses in addition to using the vendor QC software in order to determine the differences between values.
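The integral and differential non-uniformity figures compared above follow a simple definition: integral uniformity is 100 × (max − min)/(max + min) over the field of view, and differential uniformity takes the worst such value over short sliding windows (NEMA uses five pixels) along rows and columns. A minimal sketch of that calculation, in the spirit of the Excel baseline described here:

```python
import numpy as np

# Hedged sketch of NEMA-style uniformity metrics for a flood-field image;
# the 5-pixel window is the NEMA choice for differential uniformity.
def integral_uniformity(fov: np.ndarray) -> float:
    """100 * (max - min) / (max + min) over the whole field of view."""
    return 100.0 * (fov.max() - fov.min()) / (fov.max() + fov.min())

def differential_uniformity(fov: np.ndarray, window: int = 5) -> float:
    """Worst local uniformity over sliding windows along rows and columns."""
    worst = 0.0
    for arr in (fov, fov.T):  # rows first, then columns via the transpose
        for row in arr:
            for i in range(len(row) - window + 1):
                seg = row[i:i + window]
                worst = max(worst,
                            100.0 * (seg.max() - seg.min())
                            / (seg.max() + seg.min()))
    return worst

flood = np.full((8, 8), 1000.0)
flood[4, 4] = 900.0                # a single cold pixel
print(integral_uniformity(flood))  # 100*(1000-900)/1900, about 5.26
```

Because the metrics depend on per-pixel extrema, resampling an image to a different pixel size changes both values, which is consistent with the pixel-size explanation given for the vendor-software discrepancies.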
Medical exposure of the general population due to radiological investigations is the foremost source of all artificial ionising radiation. Here, we focus on a particular diagnostic radiological procedure, as only limited data have been published with regard to radiation measurements during urethrographic imaging. Specifically, this work seeks to estimate patient and occupational effective doses during urethrographic procedures at three radiology hospitals. Both staff and patient X-ray exposure levels were calculated in terms of entrance surface air kerma (ESAK), obtained by means of lithium fluoride thermoluminescent dosimeters (TLD-100 (LiF:Mg,Cu,P)) for 243 urethrographic examinations. Patient effective doses per procedure were estimated using conversion factors obtained from Public Health England computer software. In units of mGy, the median and range of ESAK per examination were found to be 10.8 (3.6-26.2), 7.0 (0.2-32.3), and 24.3 (9.0-32.0) in Hospitals A, B, and C, respectively. The overall mean and range of staff doses (in µGy) was 310 (4.0-1750) per procedure. With the exception of Hospital C, the present evaluations of radiation dose were found to be similar to those of previously published research. The wide range of patient and staff doses illustrates the need for radiation dose optimisation.
Graphite ion chambers and semiconductor diode detectors have been used to make measurements in phantoms, but these active devices represent a clear disadvantage when considered for in vivo dosimetry. In such circumstances, dosimeters with an atomic number similar to that of human tissue are needed. Carbon nanotubes have properties that potentially meet this demand, requiring low voltage in active devices and having an atomic number similar to that of adipose tissue. In this study, single-wall carbon nanotube (SWCNT) buckypaper has been used to measure the beta-particle dose deposited from a strontium-90 source, the medium displaying thermoluminescence at potentially useful sensitivity. As an example, the samples show a clear response for a dose of 2 Gy. This finding suggests that carbon nanotubes can be used as a passive dosimeter, specifically for the high levels of radiation exposure used in radiation therapy. Furthermore, the finding points towards further potential applications, such as space radiation measurements, not least because the medium satisfies the demand for light but strong materials of minimal capacitance.
The various technological advancements in computed tomography (CT) have resulted in remarkable growth in the use of CT imaging in clinical practice, not the least of which has been its establishment as the most valuable imaging examination for the assessment of cardiovascular system disorders. The objective of this study was to assess the effective radiation dose and radiation risk for patients during cardiac CT procedures, based on studies from four different hospitals equipped with 128-slice CT scanners. A total of 83 patients with different clinical indications were investigated. Effective doses were calculated using software based on Monte Carlo simulation. The mean patient age (years), weight (kg), and body mass index (BMI, kg/m2) were 49 ± 11, 82 ± 12, and 31 ± 6, respectively. The tube voltage (kVp) and tube current-exposure time product (mAs) ranged from 100 to 140 and from 50 to 840, respectively. The overall average values for the volume CT dose index (CTDIvol, in mGy) and the dose-length product (DLP, in mGy·cm) were 34.8 ± 15 (3.7-117.0) and 383.8 ± 354 (46.0-3277.0), respectively. The average effective dose (mSv) was 15.2 ± 8 (1.2-61.8). The radiation dose values showed wide variation between different hospitals and even within the same hospital. The results indicate the need to optimize radiation dose, to establish diagnostic reference levels (DRLs) for patients undergoing coronary computed tomography angiography (CCTA), and to harmonize the imaging protocols to ensure reduced radiation risk.
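A common quick cross-check for CT effective dose is the conversion E = k × DLP, with a region-specific coefficient k. A minimal sketch, assuming the generic adult-chest value k = 0.014 mSv/(mGy·cm); the study itself used Monte Carlo software, and cardiac-specific coefficients are known to be higher, so this underestimates the reported mean and is illustrative only:

```python
# Hedged sketch: effective dose from dose-length product, E = k * DLP.
# k = 0.014 mSv/(mGy*cm) is a generic adult-chest coefficient (assumption);
# cardiac-specific coefficients are larger, so treat this as a lower bound.
def effective_dose_msv(dlp_mgy_cm: float, k: float = 0.014) -> float:
    return k * dlp_mgy_cm

# Applied to the mean DLP reported above (383.8 mGy*cm):
print(round(effective_dose_msv(383.8), 1))  # -> 5.4
```

The gap between this generic-coefficient estimate and the Monte Carlo mean of 15.2 mSv illustrates why cardiac CT needs its own conversion coefficients rather than the chest default.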
177Lu-DOTATATE (177Lu; T1/2 = 6.7 days), a β− and Auger-electron emitter, is widely used in the treatment of neuroendocrine tumours. During performance of the procedure, staff and other patients can potentially receive significant doses from the gamma emissions [113 keV (6.4%) and 208 keV (11%)] that accompany the particle decays. While radiation protection and safety assessment are required to ensure that practices comply with international guidelines, only limited published studies are available. The objectives of the present study are to evaluate patient and occupational exposures, measuring ambient doses and estimating the radiation risk. The results, obtained from studies carried out in Riyadh over an 11-month period at King Faisal Specialist Hospital and Research Center, concern a total of 33 177Lu therapy patients. Patient exposures were estimated using a calibrated Victoreen 451P survey meter (Fluke Biomedical) at separations of 30 cm, 100 cm and 300 cm, and also behind the bed shield used during hospitalization of the therapy patients. Occupational and ambient doses were measured using calibrated thermoluminescent dosimeters and an automatic TLD reader (Harshaw 6600). The mean and range of administered activity (in MBq) was 7115.2 ± 917.2 (4329-7955). The ambient dose in the corridors outside the therapy isolation rooms was 1.2 mSv over the 11-month period, that at the nursing station was below the limit of detection, and annual occupational doses were below the annual dose limit of 20 mSv. Special attention needs to be paid to comforters (carers) and family members during the early stage of radioisotope administration.
With associated cure rates in excess of 90%, targeted 131I radioactive iodine therapy has clearly improved thyroid cancer survival. That said, potential radiation risks to staff represent a particular concern, the current study seeking to determine the radiation exposure of staff from 131I patients during hospitalization, and also to estimate the accumulated dose and related risk to staff during preparation of the radioactive iodine. In the present study, conducted over the three-month period from 1 February to 1 May 2017, a total of 69 patient treatments were investigated (comprising 46 females and 23 males), a treatment load typically reflective of the workload at the particular centre. The patients were administered sodium iodide 131I in capsules containing activities ranging from 370 to 5550 MBq at the time of calibration; the radioiodine activity depends on many factors, such as gender, clinical indication, body mass index and age. The staff radiation dose arising from each patient treatment was measured on three consecutive days following capsule administration. In units of µSv h-1, the mean dose-rates and ranges at distances of 5 cm, 1 m and 2 m from the patients were 209 ± 73 (165-294), 6.8 ± 2 (5.3-9.5) and 0.9 ± 0.3 (0.7-1.2), respectively. The annual doses, based on annual dose records, for medical physicists, technologists and nurses were 604, 680 and 1000 µSv, respectively. For the current practice and workload, staff exposures were all found to be below the annual dose limit for radiation workers.
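The distance dependence of the dose-rates above can be compared against the simple point-source inverse-square law. A minimal sketch; a patient is an extended, self-attenuating source, so measured values are expected to fall below the point-source prediction, as the 2 m reading indeed does:

```python
# Hedged sketch: point-source inverse-square scaling of a dose-rate.
# Real patient geometry and attenuation push measured values below this.
def dose_rate_at(d_ref_m: float, rate_ref: float, d_m: float) -> float:
    """Scale a reference dose-rate (at d_ref_m) to distance d_m."""
    return rate_ref * (d_ref_m / d_m) ** 2

# Measured mean at 1 m was 6.8 uSv/h; the point-source estimate at 2 m
# is 1.7 uSv/h, versus the measured 0.9 uSv/h.
print(dose_rate_at(1.0, 6.8, 2.0))  # -> 1.7
```

This kind of check is a quick sanity test when planning isolation-room layouts and staff positioning around radioiodine patients.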
This study has sought to evaluate patient exposures during particular diagnostic positron emission tomography/computed tomography (PET/CT) examinations. A total of 73 patients were examined using two radiopharmaceuticals: 18F-fluorocholine (FCH, 48 patients) and 68Ga-prostate-specific membrane antigen (PSMA, 25 patients). The mean and range of administered activity (AA, in MBq) and effective dose (mSv) for FCH were 314.4 ± 61.6 (216.8-462.5) and 5.9 ± 1.2 (4.1-8.8), respectively. In the same units, the mean and range of AA and effective dose for 68Ga-PSMA were 179.3 ± 92.3 (115.1-603.1) and 17.9 ± 9.2 (11.5-60.3). Patient effective doses from 68Ga-PSMA were about a factor of three greater than those from the 18F-FCH PET/CT procedures. CT accounted for some 84% and 23% of the total dose for the 18F-FCH and 68Ga-PSMA procedures, respectively; accordingly, optimization of the CT acquisition parameters is recommended. Patient doses were found to be slightly greater than in previous studies.
The objective of this study is to estimate the annual effective dose for cardiologists and nurses by measuring Hp(10) and Hp(0.07) during cardiac catheterization procedures. A total of 16 staff members working in interventional cardiology at a tertiary hospital were monitored over 1 year. The occupational dose was measured using calibrated thermoluminescent dosimeters (TLD-100, LiF:Mg,Ti). The overall mean and range of the annual Hp(10) and Hp(0.07) (mSv) for cardiologists were 3.7 (0.13-14.5) and 3.2 (0.21-14.7), respectively. Cardiologists were frequently exposed to higher doses than nurses and technologists. The exposure showed wide variations, depending on occupation and workload. Staff adhered to radiation protection guidelines regarding shielding of the trunk and use of thyroid shields, and were thus appropriately protected. Eye-lens dose measurement is recommended to ensure that the dose limit is not exceeded.
In comparison to adults, paediatric patients are more sensitive to ionizing radiation exposure. Computed tomography (CT) is now the dominant source of medical radiation exposure, accounting for more than 70% of the total dose to the general public from radiologic tests. Paediatric CT brain scans (with and without contrast) are routinely performed for a variety of clinical reasons; the associated radiation dose must therefore be quantified in order to determine the relative radiation risk. The goal of this study is to assess the radiation risk to children during CT brain diagnostic procedures. The radiation doses of 353 paediatric patients were assessed over the course of a year. The mean and range of the children's radiation doses were 40.6 ± 8.8 (27.8-45.8) CTDIvol (mGy) and 850 ± 230 (568.1-1126.4) DLP (mGy·cm) for CT brain with contrast medium. For CT brain without contrast, the patients' doses were 40.9 ± 9.4 (14.27-64.07) CTDIvol (mGy) and 866.1 ± 289.3 (203.6-2484.9) DLP (mGy·cm). The characteristics related to the radiation dose were retrieved by the participating physicians from the scan protocol generated by the CT system after each procedure. Optimizing the CT acquisition parameters is critical for increasing the benefit while lowering the procedure's radiogenic risk. The patients' radiation doses are comparable with most previously published studies and with international diagnostic reference levels (DRLs). Radiation dose optimization is recommended due to the high sensitivity of paediatric patients to ionizing radiation.
Breast cancer is a common malignancy in females (25% of female cancers) and also has a low incidence in males. It is estimated that 1% of all breast malignancies occur in males, with a mortality rate of about 20% and an annually increasing incidence. Risk factors include age, family history, exposure to ionizing radiation, and high estrogen and low androgen hormone levels. Diagnosis and screening are challenging due to the limited effectiveness of breast cancer screening in males. Patients may therefore be exposed to ionizing radiation that could contribute to breast cancer incidence in males. In the literature, few studies have been published regarding radiation exposure of males during mammography. The objective of this research is to quantify patient doses during male mammography and to estimate the projected radiogenic risk of the procedure. In total, 42 male patients underwent mammography for breast cancer diagnosis during two consecutive years. The mean and range of patient age (years) was 45 (23-80). The mean and standard deviation (SD) of the peak tube potential (kVp) and tube current-time product (mAs) were 28.64 ± 2 and 149 ± 35.1, respectively. The mean and range of the patients' entrance surface air kerma (ESAK, mGy) per single-breast procedure was 5.3 (0.47-27.5). Male patients received radiation doses per mammogram comparable to those of female procedures. With the increasing incidence of male breast cancer, proper guidelines for the mammographic procedure are necessary to reduce unnecessary radiation doses and radiogenic risk.
Computed radiography (CR) is widely used for planar imaging. Previous studies showed that CR systems involve higher patient radiation doses than direct digital systems. Assessing the patient dose and the CR system performance is therefore necessary to ensure that patients receive a minimal dose with the highest possible image quality. The study was performed at three medical diagnostic centers in Sudan: Medical Corps Hospital (MCH), Advance Diagnostic Center (ADC), and Advance Medical Center (AMC). The following tools were used: tape measure, adhesive tape, 1.5 mm copper filtration (>10 × 10 cm), TO 20 threshold contrast test object, resolution test object (e.g., Huttner 18), MI geometry test object or lead ruler, contact mesh, Piranha semiconductor detector, small lead or copper block (∼5 × 5 cm), and steel ruler. These supported a range of tests (dark noise, erasure cycle efficiency, sensitivity index calibration, sensitivity index consistency, uniformity, scaling errors, blurring, limiting spatial resolution, threshold contrast, and laser beam function). Entrance surface air kerma (ESAK, mGy) was calculated from patient exposure parameters using DosCal software for the three imaging modalities. A total of 199 patients were examined (112 chest X-rays, 77 lumbar spine). The mean and standard deviation (SD) of the patients' ESAK (mGy) were 2.56 ± 0.1 and 1.6 for the anteroposterior (AP) and lateral projections of the lumbar spine, respectively, and 0.1 ± 0.01 for the chest X-ray procedures. The CR system performance at the three medical diagnostic centers was evaluated, and all three centers were found to have well-functioning CR systems that satisfied all the criteria of the acceptable visual tests. The CR image quality and sensitivity were evaluated and found to be good, with good contrast and resolution.
All the CR systems available in the medical centers, which had been upgraded from older X-ray systems, were found to work well. Patient doses were comparable for the chest X-ray procedures, while doses from the lumbar spine showed variation of up to two-fold due to variation in patient weight and X-ray machine settings. Patient dose optimization is recommended to ensure patients receive a minimal dose while the diagnostic findings are still obtained.
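ESAK calculations of the kind performed with DosCal typically combine the tube output, the mAs, an inverse-square correction to the skin surface, and a backscatter factor. A minimal sketch of that relation; the output and backscatter values below are illustrative assumptions, not values from the study:

```python
# Hedged sketch of an ESAK estimate from exposure parameters, similar in
# spirit to what DosCal computes: ESAK = output(1 m) * mAs * (100/FSD)^2 * BSF.
# The tube output (mGy/mAs at 1 m) and backscatter factor are assumptions.
def esak_mgy(output_mgy_per_mas_at_1m: float, mas: float,
             fsd_cm: float, backscatter: float = 1.35) -> float:
    """Entrance surface air kerma from tube output, mAs, FSD and BSF."""
    inverse_square = (100.0 / fsd_cm) ** 2  # scale output from 1 m to the skin
    return output_mgy_per_mas_at_1m * mas * inverse_square * backscatter

# e.g. 0.05 mGy/mAs at 1 m, 20 mAs, 90 cm focus-to-skin distance
print(round(esak_mgy(0.05, 20, 90), 3))  # -> 1.667
```

Because ESAK scales linearly with mAs and inversely with the square of the focus-to-skin distance, the two-fold lumbar-spine variation noted above follows naturally from differences in patient thickness and machine settings.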
Worldwide, thyroid cancer accounts for some 10% of total cancer incidence, most markedly in females. Thyroid cancer radiotherapy, typically using 131I (T1/2 = 8.02 days; β− max energy 606 keV, branching ratio 89.9%), is widely adopted as an adjunct to surgery or to treat inoperable cancer and hyperthyroidism. With staff potentially receiving significant doses during source preparation and administration, radiation protection and safety assessment are required to ensure that practice complies with international guidelines. The present study, concerning a total of 206 patient radioiodine therapies carried out at King Faisal Specialist Hospital and Research Center over a 6-month period, seeks to evaluate patient and occupational exposures during hospitalization, measuring ambient doses and estimating radiation risk. Using calibrated survey meters, patient dose-rate estimates were obtained at distances of 30, 100 and 300 cm from the neck region of each patient. Occupational and ambient doses were measured using calibrated thermoluminescent dosimeters. The mean and range of administered activity (AA, in MBq) for the thyroid cancer and hyperthyroidism treatment groups were 4244 ± 2021 (1669-8066) and 1507.9 ± 324.1 (977.9-1836.9), respectively. The mean annual occupational dose was 1.2 mSv, ambient doses in the corridors outside the isolation rooms were 0.2 mSv, and ambient doses at the nursing station were below the lower limit of detection. Exposures to staff from patients treated for thyroid cancer were lower than those from hyperthyroidism patients. Although the protocol was well defined and complied with international safety requirements, occupational exposures were found to be relatively high, greater than most reported in previous studies.
In addition to generalised bone loss and a higher fracture risk, rheumatoid arthritis (RA) causes periarticular bone erosions. Improvements in bone density/erosion and turnover may not go hand in hand with a positive clinical response to biological anti-inflammatory drugs, as assessed by the disease activity score 28 (DAS28), in RA patients. This study aimed to understand how biologic anti-inflammatory drugs affect bone density, erosion, and turnover in RA patients. We examined bone mineral density (BMD) and bone turnover biomarkers. The study population consisted of 62 RA patients, 49 (79%) female and 13 (21%) male, ranging in age from 40 to 79 years. The patients' BMD was measured using a DEXA scan, and their plasma levels of the bone turnover biomarkers CTX and osteocalcin were quantified using ELISA. BMD of the hip and lumbar spine in responder patients rose after therapy by 0.001 g/cm2 (0.11%, p < 0.001 vs. before treatment) and 0.0396 g/cm2 (3.96%, p < 0.001 vs. before treatment), respectively. Patients who were clinical non-responders by DAS28 showed minor reductions in hip BMD of -0.008 g/cm2 (-0.78%, p < 0.001 vs. before therapy), as well as an improvement in lumbar spine BMD of 0.03 g/cm2 (3.03%, p < 0.001 vs. before treatment). After 12 weeks of therapy, CTX levels in responder patients dropped from 164 ± 125 pg/ml to 131 ± 129 pg/ml. Osteocalcin levels in non-responder patients increased substantially from 11.6 ng/ml to 14.9 ng/ml after 12 weeks of therapy compared to baseline (p = 0.01). Treatment with biologic anti-inflammatory medicines decreases widespread bone loss in the hip and lumbar spine of RA patients. The beneficial effects of therapy on BMD were not associated with changes in RA disease activity. Changes in plasma levels of bone turnover biomarkers such as sCTX and osteocalcin confirmed the treatment's beneficial effects.
Occupational radiation exposure can occur due to various human activities, including the use of radiation in medicine. Occupationally exposed personnel surpass 7.4 million, representing the biggest single group of employees exposed to artificial radiation sources at work. This study compares the occupational radiation dose levels for 145 workers in four different hospitals located in the Aseer region of Saudi Arabia. The occupational exposure was quantified using thermoluminescence dosimeters (TLD-100). The annual occupational exposures in the targeted hospitals were calculated and compared with the International Atomic Energy Agency (IAEA) Safety Standards, with average yearly cumulative doses obtained over two consecutive years. The average, highest and lowest occupational doses found in this work were 1.42, 3.9 and 0.72 mSv, respectively, for workers in various diagnostic radiology procedures. The resulting annual effective doses were within the IAEA-approved yearly dose limit for occupational exposure of workers over 18, which is 20 mSv. Staff should be monitored on a regular basis under current practice, because their annual exposure may surpass 15% of the annual effective dose limit.
The positron emitter 18F-sodium fluoride (NaF) and the X-rays used in positron emission tomography combined with computed tomography (PET/CT) imaging together deliver a relatively high radiation dose to the patient. The present research aims to determine the radiation dose and risks associated with 18F-NaF PET/CT examinations. The doses of 86 patients undergoing 18F-NaF PET/CT were investigated. Patient exposure parameters and ImPACT software were used to calculate mean effective doses. The nominal administered activity was 185 MBq (5.0 mCi) per procedure, with the actual mean and range depending on the patient's body mass index (BMI). The patient effective dose per procedure was found to range from 4 to 10 mSv, with a radiation risk of 1 × 10⁻⁵ per procedure. Patient doses are determined by the patient's size, the scanner type, the imaging protocol, and the reconstruction method. For further dose reduction, proper justification and radiation dose optimization are required.
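The PET component of the total dose above can be estimated by multiplying the administered activity by a radiopharmaceutical dose coefficient; the CT component is then added on top. A minimal sketch, assuming the ICRP adult dose coefficient for 18F-NaF of about 0.017 mSv/MBq (an illustrative assumption; the study itself used ImPACT software and patient-specific parameters):

```python
# Hedged sketch: internal effective dose from administered activity,
# E = AA * dose coefficient. The 0.017 mSv/MBq value for 18F-NaF is an
# assumed ICRP-style adult coefficient, used here for illustration only.
DOSE_COEFF_NAF = 0.017  # mSv per MBq (illustrative assumption)

def internal_dose_msv(administered_mbq: float) -> float:
    """PET (internal) component of the effective dose for 18F-NaF."""
    return administered_mbq * DOSE_COEFF_NAF

# For the nominal 185 MBq protocol; the CT component is added separately.
print(round(internal_dose_msv(185), 3))  # -> 3.145
```

With the PET component around 3 mSv, the reported 4-10 mSv total is consistent with a CT contribution of roughly 1-7 mSv, depending on the acquisition protocol.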
We have developed a radioluminescence-based survey meter for use in industries that involve naturally occurring radioactive material (NORM), and also in support of those needing to detect other weak emitters of radiation. The functionality of the system confronts particular shortcomings of the handheld survey meters currently in use. The device couples a LYSO:Ce scintillator to a photodetector via a polymer optical fibre waveguide, allowing "intrinsically safe" inspection within pipework, separators, valves and other such components. The small-diameter optical fibre probe is electrically passive, immune to electromagnetic interference, and chemically inert. The readout circuit is entirely incorporated within a handheld casing housing a silicon photomultiplier (SiPM) detection circuit and a microprocessor circuit connected to an LCD display. A 15 m long flexible PMMA optical fibre waveguide is butt-coupled to an ABS plastic probe that retains the LYSO:Ce scintillator. Initial tests have included the use of lab-based mixed gamma-ray sources, with measurements made in concert with a reference conventional GM survey meter. Characterization via NORM sources at a decontamination facility has shown useful sensitivity, covering the dose-rate range 0.10 to 28 µSv h-1 (R-squared 0.966), extending to 80 µSv h-1 as demonstrated using a Cs-137 source. The system is shown to provide an effective tool for the detection of radioactivity within hard-to-access locations, in particular for sources emitting at low radiation levels, down to values approaching background.
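The quoted R-squared of 0.966 comes from a linear calibration of the instrument response against a reference dose-rate. A minimal sketch of such a fit; the data points below are synthetic stand-ins, not the study's measurements:

```python
import numpy as np

# Hedged sketch: linear calibration of count-rate vs reference dose-rate,
# as used to characterize the survey meter. Data below are synthetic.
dose_rate = np.array([0.1, 1.0, 5.0, 10.0, 28.0])      # uSv/h (reference GM)
counts = np.array([12.0, 110.0, 540.0, 1080.0, 3010.0])  # counts/s (synthetic)

# Least-squares straight line and its coefficient of determination
slope, intercept = np.polyfit(dose_rate, counts, 1)
predicted = slope * dose_rate + intercept
ss_res = np.sum((counts - predicted) ** 2)
ss_tot = np.sum((counts - counts.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot
print(round(r_squared, 3))  # close to 1 for a near-linear response
```

Once slope and intercept are fixed, an unknown dose-rate is recovered from a measured count-rate as (counts − intercept)/slope, which is effectively what the microprocessor in the handheld unit must do before driving the display.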