Saturday, May 31, 2008

interactive digital games could be better designed to improve players' health.

and technologies," Lieberman said. "Together, the 12 studies funded in this round will help us better understand how people respond to various types of health games, and this will potentially lead to new game-based applications that can more effectively engage and motivate players to improve their health."
The 12 grantees were selected from 112 research organizations that applied for Health Games Research funding during the first funding call, which focused on games that engage players in physical activity and/or games that promote and improve players' self-care. In January 2009, Health Games Research will issue its next call for proposals, awarding up to an additional $2 million in grants.
As UNC and the other 11 grantees conduct their studies, Health Games Research will provide them with ongoing assistance and research resources.
------------------
For more information about the Health Games Research program,
visit www.healthgamesresearch.org or
contact healthgamesresearch@isber.ucsb.edu.
To learn more about the Robert Wood Johnson Foundation and the Pioneer Portfolio, visit www.rwjf.org/pioneer.
Note: Deborah Tate can be reached at (919) 966-7546 or dtate@unc.edu.
School of Public Health contact: Ramona DuBose, (919) 966-7467, ramona_dubose@unc.edu News Services contact: Patric Lane, (919) 962-8596, patric_lane@unc.edu

National Institute of Standards and Technology (NIST): Research measures movement of nanomaterials in simple model food chain


New research* shows that while engineered nanomaterials can be transferred up the lower levels of the food chain, from single-celled organisms to higher multicelled ones, the amount transferred was relatively low and there was no evidence of the nanomaterials concentrating in the higher-level organisms. The preliminary results observed by researchers from the National Institute of Standards and Technology (NIST) suggest that the particular nanomaterials studied may not accumulate in invertebrate food chains.

The same properties that make engineered nanoparticles attractive for numerous applications—biological and environmental stability, small size, solubility in aqueous solutions and lack of toxicity to whole organisms—also raise concerns about their long-term impact on the environment. NIST researchers wanted to determine if nanoparticles could be passed up a model food chain and if so, did the transfer lead to a significant amount of bioaccumulation (the increase in concentration of a substance in an organism over time) and biomagnification (the progressive buildup of a substance in a predator organism after ingesting contaminated prey).
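The distinction between these two quantities can be made concrete with a short calculation. The sketch below uses hypothetical concentration values (not figures from the NIST study) to illustrate the biomagnification factor, the predator-to-prey concentration ratio that indicates whether a substance concentrates as it moves up a food chain:

```python
# Illustrative sketch with hypothetical numbers (not data from the NIST
# study): bioaccumulation tracks concentration in one organism over time,
# while biomagnification compares predator to prey via the
# biomagnification factor BMF = C_predator / C_prey.

def biomagnification_factor(c_predator, c_prey):
    """Ratio of contaminant concentration in predator vs. prey.

    BMF > 1 suggests biomagnification up the food chain;
    BMF < 1 suggests the material is transferred but does not concentrate.
    """
    return c_predator / c_prey

# Hypothetical concentrations (e.g. ng of nanoparticles per mg dry mass):
c_prey = 12.0      # ciliate (T. pyriformis)
c_predator = 2.4   # rotifer (B. calyciflorus)

bmf = biomagnification_factor(c_predator, c_prey)
print(f"BMF = {bmf:.2f}")  # 0.20: transfer occurred, but no biomagnification
```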
In their study, the NIST team investigated the dietary accumulation, elimination and toxicity of two types of fluorescent quantum dots using a simple, laboratory-based food chain with two microscopic aquatic organisms—Tetrahymena pyriformis, a single-celled ciliate protozoan, and the rotifer Brachionus calyciflorus that preys on it. The process of a material crossing different levels of a food chain from prey to predator is called "trophic transfer."
Quantum dots are nanoparticles engineered to fluoresce strongly at specific wavelengths. They are being studied for a variety of uses including easily detectable tags for medical diagnostics and therapies. Their fluorescence was used to detect the presence of quantum dots in the two microorganisms.
The researchers found that both types of quantum dots were taken in readily by T. pyriformis and that they maintained their fluorescence even after the quantum dot-containing ciliates were ingested by the higher trophic level rotifers. This observation helped establish that the quantum dots were transferred across the food chain as intact nanoparticles and that dietary intake is one way that transfer can occur. The researchers noted that, "Some care should be taken, however, when extrapolating our laboratory-derived results to the natural environment."
"Our findings showed that although trophic transfer of quantum dots did take place in this simple food chain, they did not accumulate in the higher of the two organisms," says lead author David Holbrook. "While this suggests that quantum dots may not pose a significant risk of accumulating in aquatic invertebrate food chains in nature, additional research beyond simple laboratory experiments and a more exact means of quantifying transferred nanoparticles in environmental systems are needed to be certain."
--------------------------
* R.D. Holbrook, K.E. Murphy, J.B. Morrow and K.D. Cole. Trophic transfer of nanoparticles in a simplified invertebrate food chain. Nature Nanotechnology, June 2008 (advance online publication).

Wiley-Blackwell / Society for Academic Emergency Medicine: New Stem Cell Therapy

New Stem Cell Therapy May Aid the Repair of Damaged Brains

According to some experts, newly born neuronal stem cells in the adult brain may provide a therapy for brain injury. But if these stem cells are to be utilized in this way, the process by which they are created, neurogenesis, must be regulated.
A new study, led by Laurence Katz, Co-Director of the Carolina Resuscitation Research Group at the University of North Carolina School of Medicine, suggests a way in which this might be achieved.
According to the research, neurogenesis can be regulated through induced hypothermia. In rat subjects, a mild decrease in body temperature was found to substantially decrease the proliferation of newly-born neurons, a discovery that marks a major step forward for the development of neuronal stem cell-based brain therapies.
Since the 1930s, brain damage from stroke, head injury, near drowning and cardiac arrest was considered permanent because the brain lacks the repair mechanisms found in other parts of the body. However, the discovery of neuronal stem cells in the adult brain challenges that belief.
“Many questions remain before we adequately understand how to control these cells to repair a damaged brain,” says Katz. “However, the findings represent an important step in demonstrating that these cells can be controlled by simple external forces like hypothermia.”
The presentation entitled “Hypothermia Decreases Neurogenesis” will be given by Laurence Katz of The University of North Carolina School of Medicine. The paper will be presented at the 2008 SAEM Annual Meeting in Washington, D.C. on May 31, in the Neurovascular Emergencies forum beginning at 10 a.m. in Virginia Rooms A&B of the Marriott Wardman Park Hotel. Abstracts are published in Vol. 15, No. 5, Supplement 1, May 2008 of Academic Emergency Medicine, the official journal of the Society for Academic Emergency Medicine.
Contact:
Sean Wagner
Wiley-Blackwell
781-388-8550
swagner@wiley.com

Wednesday, May 28, 2008

European conference addresses increasing demand for EO data


28 May 2008

For more than 40 years, Earth observing satellites have delivered valuable data about our planet and have enabled a better understanding and improved management of the Earth and its environment. Demands for these data are increasing daily as decision-makers are faced with responding to environmental change, managing sustainable development and responding to natural disasters and civil security issues. In order to address these needs, ESA, the German Space Agency (DLR) and the German Aerospace Industries Association (BDLI) have jointly organised a conference aimed at identifying the challenges ahead and exploring specific needs for the future.
European experts from ESA and DLR are attending the conference being held on 27 and 28 May on the occasion of the ILA Berlin Air Show in Germany to provide an overview of existing Earth observation (EO) applications in the area of climate, environmental management and the civil security sector.
Representatives from public authorities, private companies and international organisations are attending the conference entitled ‘Earth observation: Solutions for Decision Making’ to explore specific demands for EO products.
Speaking at the conference, Dr Volker Liebig, ESA’s Director of Earth Observation, outlined ESA’s vigorous EO programmes, which include launching 17 satellites over the next seven years.
These include the family of Earth Explorers that will measure key Earth system processes to understand their role in climate change and the Sentinels that will provide operational information services for global monitoring of the environment and security.
ESA’s Head of Science, Applications and Future Technologies Department Dr Stephen Briggs introduced ESA’s Climate Change Initiative, a new Programme Proposal that will be presented to the ESA Ministerial Council in November 2008.
The objectives of the programme will focus on the delivery of satellite-based ‘Essential Climate Variables’ to support climate change modelling and prediction.
"Satellite data are critical in providing the basic information for modelling and predicting climate change," Briggs said. "The new initiative will ensure that ESA’s potential in this area is fully realised."
The fleet of ESA’s EO satellites has gathered enormous amounts of data relevant for providing this information. Archived over 30 years and increasing daily, these data will form the basis for extracting the variables most relevant to climate change.
ESA and its member states will process the information into a form readily usable by the scientific community and governmental bodies, in order to support their policies as well as the Intergovernmental Panel on Climate Change (IPCC) and United Nations conventions.

Magnetic nanoparticles: suitable for cancer therapy?

Atomic force microscopic image of magnetic nanoparticles of cobalt ferrite. The diagram shows the size distribution.
------------------------
PTB measuring procedure helps to investigate the characteristics of magnetic nanoparticles
27.5.2008
------------------------
A measuring procedure developed in the Physikalisch-Technische Bundesanstalt (PTB) can help to investigate in some detail the behaviour of magnetic nanoparticles which are used for cancer therapy.
Magnetic nanoparticles (ranging in size from a few to several hundred nanometres) are a new, promising means of fighting cancer. The particles serve as carriers for drugs: "loaded" with a drug, the nanoparticles are released into the blood stream, where they circulate until a targeting magnetic field captures them and holds them at the tumour until the drug has released its active agent. Besides this pharmaceutical effect, a physical action can also be applied: an electromagnetic a.c. field heats up the accumulated particles so much that they destroy the tumour. Both therapeutic concepts have the advantage of largely avoiding undesired side effects on the healthy tissue.
These procedures have already been successfully applied in animal models and have, in part, already been tested on patients. Here it is important to know before application whether the particles tend to aggregate and thus might occlude blood vessels. Information about this can be gained by magnetorelaxometry, developed at the PTB. In this procedure, the particles are briefly magnetised by a strong magnetic field, and their relaxation after the field is switched off is measured by means of superconducting quantum interferometers, so-called "SQUIDs". Conclusions on the aggregation behaviour of nanoparticles in serum or whole blood can be drawn from measurements of suspensions in these media. As an example, it could be shown in this way that certain nanoparticles in blood serum form clusters with a diameter of up to 200 nm - a clear indication of aggregation - so that these nanoparticles do not appear to be suitable for therapy.
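The aggregation sensitivity of magnetorelaxometry can be illustrated with a textbook relation (a simplified sketch, not the PTB analysis itself): the Brownian relaxation time of a particle rotating freely in a liquid grows with the cube of its hydrodynamic diameter, so 200 nm clusters relax far more slowly than isolated particles:

```python
import math

# Sketch of the physics that magnetorelaxometry exploits (textbook
# Brownian relaxation; the actual PTB analysis is more involved).
# The Brownian relaxation time of a magnetic particle rotating in a
# liquid is
#     tau_B = 3 * eta * V_h / (k_B * T)
# so aggregation (larger hydrodynamic volume V_h) slows relaxation
# by orders of magnitude - the signature seen with the SQUID sensors.

K_B = 1.380649e-23  # Boltzmann constant, J/K

def brownian_relaxation_time(d_hydro_m, eta_pa_s=1.0e-3, temp_k=300.0):
    """Brownian relaxation time for hydrodynamic diameter d_hydro_m (in water by default)."""
    v_h = math.pi / 6.0 * d_hydro_m ** 3  # hydrodynamic volume of a sphere
    return 3.0 * eta_pa_s * v_h / (K_B * temp_k)

tau_single = brownian_relaxation_time(30e-9)    # isolated ~30 nm particle
tau_cluster = brownian_relaxation_time(200e-9)  # ~200 nm cluster, as seen in serum
print(f"single: {tau_single:.1e} s, cluster: {tau_cluster:.1e} s")
```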
At present, the considerable technical effort associated with helium-cooled magnetic field sensors still stands in the way of routine practical use of this method. In a joint project with Braunschweig Technical University, supported by the Federal Ministry of Education and Research (BMBF), the procedure is currently being transferred to a simpler technology based on fluxgate magnetometers.
This text in the latest issue of PTB-news (08.2): http://www.ptb.de/en/publikationen/news/html/news081/artikel/0815.htm
Contact:
Dr. Lutz Trahms,
PTB Department 8.2 Biosignals, Phone +4930-3481-7213,
e-mail: lutz.trahms@ptb.de

Expansion of high tech materials


Thermal expansion coefficient (α) of a special glass with extremely low expansion. The progress of α shown here derives from a series of interferometrically measured lengths of a sample with parallel end surfaces as a function of temperature. In this example, there is a point of reversal at approximately 17.5 °C at which the thermal expansion disappears. Below this point, a fall in temperature causes an expansion of the material. The figure (upper left) shows a typical topography of the interference phase, which includes the front surface of the body and a wrung end plate. The averaged phase values within the rectangular areas on the end plate (left/right) and on the front surface of the body (middle) are the basis for the interferometrical length measurement.
-----------------------------
With a new precision interferometer developed in the PTB, changes in length can be determined with highest accuracy in an absolute measurement
27.5.2008
---------------------------------------

Industrial applications are ever more frequently demanding materials of highest thermal stability. A precision interferometer has been developed in the Physikalisch-Technische Bundesanstalt (PTB) to exactly measure this property. With this instrument, the change in length can be determined with highest accuracy in an absolute measurement as a function of temperature, time and - if necessary - ambient pressure.
Thermally stable materials play an important role in dimensional metrology and in precision manufacturing. The currently highest requirements on the thermal stability of critical components are made in EUV lithography on reflection masks and mirrors. These are, therefore, based on substrates made of high tech glass ceramics which are to exhibit an extremely low thermal expansion coefficient α.
For the precise characterization of gauge-block-shaped measuring objects made of high tech materials, a precision interferometer was developed with the aim of measuring samples of up to 400 mm length with uncertainties in the sub-nanometre range. From such exact measurements of length, it is possible to calculate the thermal expansion coefficient as a function of temperature with uncertainties down to 2 · 10⁻¹⁰ K⁻¹. Furthermore, it is possible to obtain quantitative statements regarding the homogeneity of the thermal expansion, the compressibility, length relaxations and also the long-term stability of samples.
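As a rough illustration of how the expansion coefficient follows from such length measurements, the sketch below differentiates a synthetic length-versus-temperature series numerically. The data are made up to mimic the zero crossing near 17.5 °C shown in the figure; they are not PTB measurements:

```python
# Minimal sketch: recovering alpha(T) = (1/L) * dL/dT from a series of
# interferometric length measurements.  The sample data are synthetic
# (a zero crossing near 17.5 degC, mimicking an ultra-low-expansion
# glass), not PTB measurements.

def expansion_coefficient(temps_c, lengths_m):
    """Central-difference estimate of alpha(T) at the interior points."""
    alphas = []
    for i in range(1, len(temps_c) - 1):
        dl_dt = (lengths_m[i + 1] - lengths_m[i - 1]) / (temps_c[i + 1] - temps_c[i - 1])
        alphas.append((temps_c[i], dl_dt / lengths_m[i]))
    return alphas

# Synthetic L(T) chosen so that alpha(T) = c * (T - 17.5), with
# c = 1e-9 / K^2 and a 0.4 m sample (hypothetical values):
L0, c, T_r = 0.4, 1.0e-9, 17.5
temps = [15.0, 16.0, 17.0, 18.0, 19.0, 20.0]
lengths = [L0 * (1.0 + 0.5 * c * (t - T_r) ** 2) for t in temps]

for t, a in expansion_coefficient(temps, lengths):
    print(f"T = {t:4.1f} degC  alpha = {a:+.2e} / K")
```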
Length measurements with sub-nm uncertainties demand, besides the use of frequency-stabilized lasers, the consideration of influences whose uncertainty contributions are difficult to minimize. For this purpose, various methods have been developed at the PTB in the last few years and integrated into the measuring process. One example is a new autocollimation process, which ensures that the light waves reach the surfaces of the measuring objects exactly perpendicularly; the so-called cosine error is thereby lowered to under 10⁻¹¹ · L. Furthermore, during the electronic evaluation of the interference pattern, the exact assignment of the sample position to the camera pixel coordinates is taken into account. This is particularly important for measuring objects whose end faces are non-parallel, and it allows the influence of small temperature-induced changes of the lateral sample position to be corrected. By also taking into account the temperature-related deflection of the end plate wrung to the back of the sample, the precision could be increased further. In thermal expansion measurements on typical samples, length measurement uncertainties of 0.25 nm are now achieved.
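The quoted cosine-error bound can be checked with a back-of-the-envelope calculation (an illustration only, not the autocollimation procedure itself): a tilt θ between the light and the sample normal shortens the measured length by roughly L·θ²/2, so staying below 10⁻¹¹·L requires alignment to within a few microradians:

```python
import math

# Back-of-envelope check of the cosine-error bound: a tilt theta between
# the light direction and the sample normal shortens the measured length
# by L * (1 - cos(theta)), approximately L * theta**2 / 2 for small
# angles.  (Illustration only, not the PTB procedure itself.)

def cosine_error(theta_rad):
    """Relative length error for tilt angle theta (radians)."""
    return 1.0 - math.cos(theta_rad)

# Tilt needed to stay below the quoted 1e-11 * L bound
# (small-angle inversion of theta**2 / 2 = 1e-11):
theta_max = math.sqrt(2.0 * 1e-11)
print(f"max tilt ~ {theta_max * 1e6:.1f} microradians")
print(f"error at that tilt: {cosine_error(theta_max):.2e}")
```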
In a recently completed international comparison measurement, the leading position of the PTB in the determination of thermal expansion coefficients was confirmed. The new possibilities for the precise characterization of high tech materials are already being used intensively by companies working in the fields of optics and precision manufacturing.
This text in the latest issue of PTB-news (08.2): http://www.ptb.de/en/publikationen/news/html/news081/artikel/0815.htm
Contact:
Dr. Rene Schödel,
PTB Working Group 5.44 Interferometry on Prismatic Bodies, Phone (0531) 592-5440, e-mail: rene.schoedel@ptb.de

Spectral emissivity measurements for radiation thermometry


Local variation of the directional spectral emissivity ελ of a car paint sample at a wavelength of 4 µm, measured using a thermography camera.
-----------------------------------------

Modern emissivity measuring facility for industry-orientated calibrations developed at PTB
27.5.2008
----------------------------
Industry and research are increasingly relying on non-contact temperature measurements with the aid of heat radiation, for example, for the reliable and reproducible drying of car paint. In order to attain exact and reliable results, the emissivity of the measured surface has to be known. It can only be determined precisely in complex measuring facilities. The Physikalisch-Technische Bundesanstalt (PTB) has developed a modern emissivity measuring facility for industry-oriented calibrations.

Today, the accuracy of industrial temperature measurements carried out with contact-free radiation thermometers is often no longer limited by the quality of the radiation thermometers, but rather by insufficient knowledge of the emissivity of the surface observed. Industrial radiation thermometers can furnish a resolution of up to 20 mK, with an uncertainty of 1 K, for temperature measurements at 100 °C. In contrast, the directional spectral emissivities of surfaces can often only be specified with standard measurement uncertainties of 5 %. When measuring a temperature of 100 °C in the spectral range around 10 µm, this corresponds to a temperature uncertainty of typically 5 K.
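The quoted 5 K figure can be roughly reproduced with the Wien approximation, in which spectral radiance varies as exp(−c₂/(λT)), so a relative emissivity error maps onto a temperature error of about λT²/c₂ times that relative error. This is an illustrative estimate, not the PTB uncertainty budget:

```python
# Rough check of the numbers above using the Wien approximation:
# spectral radiance goes as exp(-c2 / (lambda * T)), so a relative
# radiance (or emissivity) error d_eps/eps maps to a temperature error
#     dT ~ (lambda * T**2 / c2) * (d_eps / eps)
# Illustrative estimate only, not the PTB uncertainty budget.

C2 = 1.4388e-2  # second radiation constant, m*K

def temp_error_from_emissivity(wavelength_m, temp_k, rel_emissivity_err):
    """Approximate temperature error caused by a relative emissivity error."""
    return wavelength_m * temp_k ** 2 / C2 * rel_emissivity_err

dT = temp_error_from_emissivity(10e-6, 373.15, 0.05)  # 100 degC, 10 um, 5 %
print(f"temperature uncertainty ~ {dT:.1f} K")  # close to the quoted "typically 5 K"
```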
The emissivity is not a constant, but rather changes in general with changes of the surface (roughness, oxidation, impurities etc.), the observation angle, the observation wavelength as well as the temperature. Furthermore, it is often distributed inhomogeneously over the surface. Precise temperature measurements therefore demand exact knowledge of the emissivity. To determine the variety of dependencies of the emissivity on the above-mentioned parameters, complex measuring facilities are necessary.
The spectral emissivity is measured at the PTB by comparing, by means of a Fourier transform spectrometer, the radiance of a high-quality cavity radiator - resembling an almost ideal black body - with that of the sample to be investigated, whereby the radiation of the environment and the inherent radiation of the spectrometer are taken into consideration. Holding the sample in a temperature-regulated hemisphere guarantees a constant radiation exchange with the environment. The apparatus allows the determination of the directional spectral emissivity as well as of the total emissivity of opaque samples under ambient conditions, in a temperature range from 80 °C to 250 °C and a wavelength range from 4 µm to 40 µm, at emission angles of 5° to 70°, with a relative standard measurement uncertainty of better than 2 %. The extrapolation of the measured values of the directional spectral emissivity to emission angles above 70° then allows the hemispherical spectral emissivity, which is especially important for calculations of the net radiation exchange, as well as the total emissivity to be calculated. The homogeneity of the directional spectral emissivity at 4 µm is determined with the help of a thermography camera.
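The hemispherical spectral emissivity mentioned above follows from the directional values by weighting with the projected solid angle. The sketch below integrates a hypothetical angular profile numerically; measured data would replace the stand-in function:

```python
import math

# The hemispherical spectral emissivity follows from directional values
# by weighting with the projected solid angle:
#     eps_hemi = 2 * integral_0^{pi/2} eps(theta) cos(theta) sin(theta) d(theta)
# Below, a hypothetical angular profile stands in for measured data.

def hemispherical_emissivity(eps_of_theta, n=1000):
    """Trapezoidal integration of 2 * eps(theta) * cos(theta) * sin(theta)."""
    total = 0.0
    h = (math.pi / 2) / n
    for i in range(n):
        t0, t1 = i * h, (i + 1) * h
        f0 = 2 * eps_of_theta(t0) * math.cos(t0) * math.sin(t0)
        f1 = 2 * eps_of_theta(t1) * math.cos(t1) * math.sin(t1)
        total += 0.5 * (f0 + f1) * h
    return total

# Sanity check: a diffuse (angle-independent) surface must give eps_hemi = eps,
# since the angular weight integrates to exactly 1.
print(hemispherical_emissivity(lambda t: 0.9))
```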
The results of the first orders from customers have served, for example, to optimise the paint drying process in the automobile industry, the thermal design of furnaces as well as the monitoring of glass forming processes.
Another measuring facility is currently being set up in the PTB which will allow emissivity measurements to be performed under vacuum conditions in an extended temperature and wavelength range - in particular for space applications.
This text in the latest issue of PTB-news (08.2): http://www.ptb.de/en/publikationen/news/html/news081/artikel/0813.htm
Contact: Dr. Christian Monte, PTB Working Group 7.31 High-temperature Scale,Phone +4930-3481-7246, e-mail: christian.monte@ptb.de

New insights into cellular reprogramming revealed by genomic analysis

Research collaboration of Harvard, Whitehead Institute, and Broad Institute uncovers critical molecular events underlying reprogramming of differentiated cells to a stem cell state
The ability to drive somatic, or fully differentiated, human cells back to a pluripotent or “stem cell” state would overcome many of the significant scientific and social challenges to the use of embryo-derived stem cells and help realize the promise of regenerative medicine. Recent research with mouse and human cells has demonstrated that such a transformation (“reprogramming”) is possible, although the current process is inefficient and, when it does work, poorly understood. But now, thanks to the application of powerful new integrative genomic tools, a cross-disciplinary research team from Harvard University, Whitehead Institute, and the Broad Institute of MIT and Harvard has uncovered significant new information about the molecular changes that underlie the direct reprogramming process. Their findings are published online in the journal Nature.
“We used a genomic approach to identify key obstacles to the reprogramming process and to understand why most cells fail to reprogram,” said Alexander Meissner, assistant professor at Harvard University’s Department of Stem Cell and Regenerative Biology and associate member of the Broad Institute, who led the multi-institutional effort. “Currently, reprogramming requires infecting somatic cells with engineered viruses. This approach may be unsuitable for generating stem cells that can be used in regenerative medicine. Our work provides critical insights that might ultimately lead to a more refined approach.”
Previous work had demonstrated that four transcription factors — proteins that mediate whether their target genes are turned on or off — could drive fully differentiated cells, such as skin or blood cells, into a stem cell-like state, known as induced pluripotent stem (iPS) cells. Building off of this knowledge, the researchers examined both successfully and unsuccessfully reprogrammed cells to better understand the complex process.
“Interestingly, the response of most cells appears to be activation of normal ‘fail safe’ mechanisms”, said Tarjei Mikkelsen, a graduate student at the Broad Institute and first author of the Nature paper. ”Improving the low efficiency of the reprogramming process will require circumventing these mechanisms without disabling them permanently.”
The researchers used next-generation sequencing technologies to generate genome-wide maps of epigenetic modifications — which control how DNA is packaged and accessed within cells — and integrated this approach with gene expression profiling to monitor how cells change during the reprogramming process. Their key findings include:
- Fully reprogrammed cells, or iPS cells, demonstrate gene expression and epigenetic modifications that are strikingly similar, although not necessarily identical, to embryonic stem cells.
- Cells that escape their initial fail-safe mechanisms can still become ‘stuck’ in partially reprogrammed states. By identifying characteristic differences in the epigenetic maps and expression profiles of these partially reprogrammed cells, the researchers designed treatments using chemicals or RNA interference (RNAi) that were sufficient to drive them to a fully reprogrammed state. One of these treatments, involving the chemotherapeutic 5-azacytidine, could improve the overall efficiency of the reprogramming process by several hundred percent.
“A key advance facilitating this work was the isolation of partially reprogrammed cells,” said co-author Jacob Hanna, a postdoctoral fellow at the Whitehead Institute, who recently led two other independent reprogramming studies. “We expect that further characterization of partially programmed cells, along with the discovery and use of other small molecules, will make cellular reprogramming even more efficient and eventually safe for use in regenerative medicine.”
------------------------------------------
Paper cited:
Mikkelsen, et al. Dissecting direct reprogramming through integrative genomic analysis. Nature DOI: 10.1038/nature07056.
About the Whitehead Institute
Whitehead Institute for Biomedical Research is a nonprofit, independent research and educational institution. Wholly independent in its governance, finances and research programs, Whitehead shares a close affiliation with Massachusetts Institute of Technology through its faculty, who hold joint MIT appointments.
About the Broad Institute of MIT and Harvard
The Broad Institute of MIT and Harvard was founded in 2003 to bring the power of genomics to biomedicine. It pursues this mission by empowering creative scientists to construct new and robust tools for genomic medicine, to make them accessible to the global scientific community, and to apply them to the understanding and treatment of disease.
The Institute is a research collaboration that involves faculty, professional staff and students from throughout the MIT and Harvard academic and medical communities. It is jointly governed by the two universities.
Organized around Scientific Programs and Scientific Platforms, the unique structure of the Broad Institute enables scientists to collaborate on transformative projects across many scientific and medical disciplines.
For further information about the Broad Institute, go to http://www.broad.mit.edu/.
For more information, contact:
B.D. Colen, Harvard University bd_colen@harvard.edu 617-495-7821
Cristin Carr, Whitehead Institute carr@wi.mit.edu 617-324-0460

Getting better with a little help from our 'micro' friends

PASADENA, Calif.-- A naturally occurring molecule made by symbiotic gut bacteria may offer a new type of treatment for inflammatory bowel disease, according to scientists at the California Institute of Technology.
"Most people tend to think of bacteria as insidious organisms that only make us sick," says Sarkis K. Mazmanian, an assistant professor of biology at Caltech, whose laboratory examines the symbiotic relationship between "good" bacteria and their mammalian hosts. Instead, he says, "bacteria can be beneficial and actively promote health."
For example, the 100 trillion bacteria occupying the human gut have evolved along with the human digestive and immune systems for millions of years. Some harmful microbes are responsible for infection and acute disease, while "other bacteria, the more intelligent ones, have taken the evolutionary route of shaping their environment by positively interacting with the host immune system to promote health, which gives them an improved place to live; it's like creating bacterial nirvana," says Mazmanian.
If bacteria are actively modifying the gut, their work would have to be mediated by molecules. In their recent work, Mazmanian and his colleagues have identified one such molecule, a sugar called polysaccharide A, or PSA, which is produced by the symbiotic gut bacterium Bacteroides fragilis. They have termed this molecule a "symbiosis factor," and predict that many other bacterial compounds with diverse beneficial activities await discovery.
To identify the molecule and its action, the scientists used experimental mice and induced changes to their intestinal bacteria by exposing them to a pathogenic bacterium called Helicobacter hepaticus. This microbe causes a disease in the mice that is similar to Crohn's disease and ulcerative colitis. However, when the animals were co-colonized with B. fragilis, they were protected from the disease--as were animals that were given oral doses of just the PSA molecule.
In particular, Mazmanian and his colleagues found that PSA induced particular immune-system cells called CD4+ T cells to produce interleukin-10 (IL-10), a molecule that has previously been shown to suppress inflammation--and offer protection from inflammatory bowel disease. "Thus, bacteria help reprogram our own immune system to promote health," he says.
"The most immediate and obvious implication is that PSA may potentially be developed as a natural therapeutic for inflammatory bowel disease," says Mazmanian.
Inflammatory bowel disease, a constellation of illnesses that cause inflammation in the intestines, including ulcerative colitis and Crohn's disease, is estimated to affect one million Americans. The rates of inflammatory bowel diseases have skyrocketed in recent years; for example, the incidence of Crohn's disease, a condition that causes debilitating pain, diarrhea, and other gastrointestinal symptoms, has increased by 400 percent over the past 20 years.
The current research, along with other work by Mazmanian and June L. Round, a Caltech postdoctoral researcher, suggests that the interplay between various groups of bacteria living in the intestines has profound effects on human health.
This notion gels with the so-called "hygiene hypothesis." The hypothesis, first proposed two decades ago, links modern practices like sanitation, vaccination, a Western diet, and antibiotic use, which reduce bacterial infections, to the increased prevalence of a variety of illnesses in the developed world, including inflammatory bowel disease, asthma, multiple sclerosis, and Type 1 diabetes. However, it is now clear that increased living standards and antibacterial drugs affect not only infectious microbes, but all of the beneficial ones that we may depend on for our well-being.
"Through societal measures we have changed our association with the microbial world in a very short time span. We don't have the same contact with microbes as we have for millions of years--we just live too clean now," Mazmanian says. So while it is useful to eliminate disease-causing organisms, "perhaps disease results from the absence of beneficial bacteria and their good effects," he suggests. "This study is the first demonstration of that. What it hopefully will do is allow people to re-evaluate our opinions of bacteria. Not all are bad and some, maybe many, are beneficial."
--------------------------------------------------------------------------------------
The article, "A microbial symbiosis factor prevents intestinal inflammatory disease," will be featured on the cover of the May 29 issue of the journal Nature. Mazmanian's coauthors are June L. Round of Caltech and Dennis L. Kasper of Harvard Medical School.
Visit the Caltech Media Relations website at: http://pr.caltech.edu/media.
Public release date: 28-May-2008

X-rays Often Repeated for Patients in Developing Countries


The use of X-rays in medical care is growing in developing countries, and the IAEA is supporting efforts to strengthen quality assurance programmes for radiography at hospitals and clinics. (Credit: IAEA)
----------------------------------------------------
Patients in developing countries often need to have X-ray examinations repeated so that doctors get the image quality they need for a useful medical diagnosis, the IAEA is learning. The findings come from a survey involving thousands of patients in 45 hospitals across 12 countries in Africa, Asia and Eastern Europe.
"Poor image quality constitutes a major source of unnecessary radiation to patients in developing countries," emphasizes Dr. Madan Rehani of the IAEA Division of Radiation, Transport and Waste Safety, which carried out the survey under technical cooperation (TC) projects of the IAEA. "Fortunately, we're moving forward to help countries improve the situation and have shown definite improvements."
The survey was done in phases from August 2005 to December 2006 at hospitals in the Democratic Republic of the Congo, Ghana, Madagascar, Sudan, Tanzania, Zimbabwe, Iran, Saudi Arabia, Thailand, United Arab Emirates, Bosnia and Herzegovina, and Serbia. Project counterparts in these countries worked through IAEA-supported regional technical cooperation projects that aim to help countries implement quality assurance programmes for radiographic examinations, in line with international radiation safety standards.
"The use of X-rays in medical care is growing in developing countries," Dr. Rehani says. However, he adds, vital information about both the quality of X-ray images and patient doses is "grossly lacking" at many hospitals where the IAEA has helped launch quality assurance programmes.
The survey found that more than half (53%) of all X-ray images evaluated through the project were of poor quality, compromising diagnostic information, Dr. Rehani said. One consequence is that patients are then given repeat examinations, exposing them to X-rays again and entailing extra costs. The survey included patients receiving chest, pelvic, abdomen, skull, and spine X-ray examinations.
The good news is that efforts to improve quality through quality assurance (QA) appear to be paying off. In a paper just published in the June edition of the American Journal of Roentgenology, Dr. Rehani and colleagues report that considerable benefits were seen regionally after introduction of QA programmes. The quality of X-ray images improved up to 16% in Africa, 13% in Asia and 22% in Eastern Europe. At the same time, patient dose reductions ranging from 1.4% to 85% were achieved overall.
The IAEA-supported projects could help change the picture at more hospitals in developing countries by changing the approach to quality assurance in radiography. In the past, the QA approach has been dominated by testing the radiographic equipment primarily, Dr. Rehani points out.
"Our work shows that focusing on the machine is not enough," he says. "We're documenting that the evaluation of image quality and patient dose goes hand in hand with safe and effective medical radiography."
Background
The project on strengthening radiological protection of patients is designed to help countries apply the International Basic Safety Standards for the Protection Against Ionizing Radiation and for the Safety of Radiation Sources (BSS), developed by the IAEA, the World Health Organization and other partners. The standards require attention to image quality, with corrective actions to be considered when exposures neither provide useful diagnostic information nor yield medical benefits to patients.
Despite the finding that repeat X-ray examinations were often needed, patient doses in the 12 countries surveyed were in line with international diagnostic reference levels and similar to doses recorded in developed countries. Thousands of X-ray images were evaluated as part of the survey. (In its 2000 report, the United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR) estimated that nearly two billion medical X-ray examinations are performed worldwide each year, and indications are that this figure has nearly doubled by 2008.)
Altogether, 34 countries agreed to participate in the IAEA survey, though data are presently available for only 12; more countries are expected to provide data in the coming months.
------------------------------
Staff Report
28 May 2008
International Atomic Energy Agency

Kew provides climate for agricultural change


A device to help some of the most impoverished farmers in Africa maximise their crop yields is being tested at London’s Kew Gardens.
Developed by engineers at the University of Leeds, the sensor device gathers data on air temperature, humidity, air pressure, light, and soil moisture and temperature – information crucial to making key agricultural decisions about planting, fertilisation, irrigation, pest and disease control and harvesting.
It is being tested by Kew’s Diploma students and staff over the next four months in the School of Horticulture’s new student vegetable garden at the Royal Botanic Gardens, Kew. The sensors are monitoring conditions around some typical crops to test possible future applications.
The Leeds team has been working with two Kenyan villages to develop the technology as part of the Engineering and Physical Sciences Research Council (EPSRC) Village E-Science for Life (VESEL) project, a collaboration of key research groups in the UK and Kenya. The project aims to apply advanced digital technology to improve quality of life, both through its use in education and to optimise agricultural practices.
“In some areas of Kenya, localised variations in growing conditions can cause severe fluctuations in crop yields. Our part of the VESEL project is about providing the right information at the right time to farmers,” says Professor Jaafar Elmirghani from the School of Electronic and Electrical Engineering. “This means they can use available water more efficiently, minimising wastage and helping to optimise their harvests to feed their families.”
The information is fed back via a wireless network to a central hub, or server, which will be located at the village school, and is then sent to agriculture experts who will provide advice to assist farmers’ decisions. The ongoing data gathered will also feed into agricultural teaching at Kenyan schools, which forms a central part of the education system.
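The data path described above — field sensors reporting to a village hub, which relays readings on to agricultural experts — can be sketched in outline. The Python fragment below is purely illustrative: the payload field names, the moisture threshold, and the `irrigation_advice` helper are hypothetical and are not part of the VESEL project's actual software.

```python
import json
from datetime import datetime, timezone

def package_reading(node_id, air_temp_c, humidity_pct, soil_moisture_pct):
    """Bundle one sensor reading into a JSON payload for the village hub.
    (Illustrative only: the field names here are hypothetical.)"""
    return json.dumps({
        "node": node_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "air_temp_c": air_temp_c,
        "humidity_pct": humidity_pct,
        "soil_moisture_pct": soil_moisture_pct,
    })

def irrigation_advice(soil_moisture_pct, threshold_pct=30.0):
    """Toy decision rule: recommend irrigating when soil moisture
    falls below a (hypothetical) threshold."""
    return "irrigate" if soil_moisture_pct < threshold_pct else "hold"

# A reading from a test node, as might be logged during the Kew trial.
payload = package_reading("test-node-01", 21.5, 64.0, 22.0)
print(payload)
print(irrigation_advice(22.0))  # -> irrigate
```

In the deployed system the hub would forward such payloads over the wireless network rather than print them; the point of the sketch is only that each reading carries the environmental variables the article lists, plus a timestamp, so experts can base advice on conditions at a known time and place.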
During the tests at Kew, the data collected by the device will be sent back to the University of Leeds, but ultimately, the management of the system will be handed over to the University of Nairobi. “This information will also inform research at the University of Nairobi - and ultimately, we hope, inform agricultural policy in Kenya”, says Professor Elmirghani. “It’s crucial that the work of the project can be sustained long term to benefit future generations.”
“We’re pleased to put these devices through their paces and give feedback to the project. Our students are keen to learn about emerging technologies, especially with such clear sustainability goals as the VESEL project,” says Kew scientist Rowan Blaik.
The tests are expected to be complete by Autumn 2008, after which the devices will initially be trialled in the two Kenyan villages. “We hope that, during 2009 and beyond, the technology will be rolled out to other communities,” says Professor Elmirghani.
Further information from:
Clare Elsley, Campuspr: tel 0113 258 9880, mob: 07767 685168, email:clare@campuspr.co.uk
Simon Jenkins, University of Leeds Press Office: tel 0113 343 5764, email s.jenkins@leeds.ac.uk
Notes to editors:
1. Village e-Science for Life (VESEL) is a project funded by the Engineering and Physical Sciences Research Council under its Bridging the Digital Divide programme. The key aims of the project are to identify the needs of rural communities in developing countries and to design appropriate technologies to meet those needs. The collaboration includes the universities of Leeds and Bradford, Imperial College London, TVU, and the London Knowledge Lab – Birkbeck College, University of London.
2. The Faculty of Engineering at the University of Leeds comprises five Schools:
Civil Engineering; Computing; Electronic and Electrical Engineering; Mechanical Engineering and Process, Materials and Environmental Engineering. All schools in the Faculty have the highest 5 or 5* Research Assessment Exercise ratings, top teaching assessments and strong industrial connections. There are approximately 3,000 students in the Faculty, 80% undergraduates and 20% postgraduates. Two-thirds of our students are from the UK with the remainder representing over 90 different nationalities. http://www.engineering.leeds.ac.uk/
3. The University of Leeds is one of the largest higher education institutions in the UK with more than 30,000 students from 130 countries. With a total annual income of £422m, Leeds is one of the top ten research universities in the UK, and a member of the Russell Group of research-intensive universities. It was recently placed 80th in the Times Higher Education Supplement's world universities league table and the University's vision is to secure a place among the world's top 50 by 2015. http://www.leeds.ac.uk/
4. The Engineering and Physical Sciences Research Council (EPSRC) is the UK’s main agency for funding research in engineering and the physical sciences. EPSRC invests more than £500 million a year in research and postgraduate training to help the nation handle the next generation of technological change. The areas covered range from information technology to structural engineering, and from mathematics to materials science. This research forms the basis for future economic development in the UK and improvements in everyone’s health, lifestyle and culture. For more information visit www.epsrc.ac.uk/
5. The Royal Botanic Gardens, Kew is a world-famous scientific organisation, internationally respected for its outstanding living collection of plants and world-class herbarium as well as its scientific expertise in plant diversity, conservation and sustainable development in the UK and around the world. Kew Gardens is a major international visitor attraction and its 132 hectares of landscaped gardens attract over one million visitors per year. Kew, a UNESCO World Heritage Site, celebrates its 250th anniversary in 2009. There is a wealth of collections held at Kew that offer an opportunity to explore some of the lesser known aspects of RBG Kew’s rich history and heritage and its present day role. Members of the media interested in a behind-the-scenes look at RBG Kew should contact pr@kew.org.
6. Horticulture students come from around the world to study at Kew for the world’s foremost qualification in botanical horticulture – the three-year Kew Diploma. The course offers a broad-based training in amenity and botanical horticulture. The aim is to provide students with an opportunity to study scientific and technical subjects at first degree level, whilst gaining practical experience and responsibility working in this foremost botanic garden. Students are employees of the Royal Botanic Gardens, Kew and receive payment throughout the three-year course, including during the lecture block trimesters. For more information visit www.kew.org/education/diploma
7. Campuspr is a public relations company that specialises in promoting university research and knowledge transfer in the higher education sector. For more research press releases, see http://www.campuspr.co.uk/

New vaccine approach prevents/reverses diabetes in lab study at Children's Hospital of Pittsburgh

Results of study are published in Diabetes, a journal of the American Diabetes Association
Microspheres carrying targeted nucleic acid molecules fabricated in the laboratory have been shown to prevent and even reverse new-onset cases of type 1 diabetes in animal models. The results of these studies were reported by diabetes researchers at the John G. Rangos Sr. Research Center at Children’s Hospital of Pittsburgh of UPMC and Baxter Healthcare Corporation.
In a research study at Children’s Hospital, the scientists injected the microspheres under the skin near the pancreas of mice with autoimmune diabetes. The microspheres were then taken up by white blood cells known as dendritic cells, releasing the nucleic acid molecules inside them. The released molecules reprogrammed the dendritic cells, which then migrated to the pancreas. There, they turned off the immune system attack on insulin-producing beta cells. Within weeks, the diabetic mice were producing insulin again, with reduced blood glucose levels.
Results of the microsphere study are published in the June issue of Diabetes, the journal of the American Diabetes Association.
In type 1 diabetes, T cells from the immune system travel to the pancreas and destroy beta cells, which produce insulin. The scientists – led by Massimo Trucco, MD, and Nick Giannoukakis, PhD – found that the microspheres reprogram dendritic cells to block the signaling mechanism that sends T cells to destroy beta cells. The microsphere research builds on previous research by Drs. Giannoukakis and Trucco in which they used dendritic cells delivered to the pancreas in another method to turn off the immune system’s attack on insulin-producing beta cells, thereby allowing the cells of the pancreas to recover and begin producing insulin again.
Drs. Trucco and Giannoukakis anticipate that the latest research involving microspheres represents a significant improvement over their previous approach, in which dendritic cells had to be extracted from the patient (through a process known as leukapheresis) and reprogrammed outside the body.
“The microspheres prevented the onset of type 1 diabetes and, most importantly, exhibited a capacity to reverse hyperglycemia, suggesting a potential to reverse type 1 diabetes in new-onset patients,” said Dr. Trucco, chief of the Division of Immunogenetics at Children’s. “This novel microsphere approach represents for the first time a vaccine with the potential to suppress and reverse diabetes. This finding holds true promise for clinical testing in people with type 1 diabetes.”
Currently, Drs. Trucco and Giannoukakis are conducting a clinical trial of their leukapheresis-based dendritic cell approach in humans at Children’s. This Phase 1 clinical trial has been approved by the U.S. Food and Drug Administration (FDA).
“Our ultimate goal is to offer this dendritic cell vaccine or microsphere-based therapy to children at risk for or newly diagnosed with type 1 diabetes. We want to make the procedure as safe and comfortable as possible,” Dr. Giannoukakis said.
The trial began late last year and enrollment is ongoing. The study, which plans to enroll a total of 15 adults over age 18 with type 1 diabetes, is expected to conclude later this year.
If the leukapheresis-based approach continues to show exceptional safety, the researchers hope to launch a national clinical trial that will assess the effectiveness of the dendritic cells in pediatric patients to prevent diabetes or reverse the disease right after it is clinically confirmed. At a later date, it is anticipated that Baxter Healthcare will collaborate with Drs. Trucco and Giannoukakis in a clinical trial utilizing the unique microsphere-based approach.
Leukapheresis is a process, taking two to four hours, that allows dendritic cell precursors to be collected from the patients in the study. After the precursors are collected, they are treated in the lab with specific growth factors that turn them into dendritic cells. The growth factors are also combined with short DNA sequences that specifically block the expression of molecules found at the surface of dendritic cells, known as CD40, CD80 and CD86. Once these reprogrammed dendritic cells are tested in the lab, they are injected back into the patient. They then orchestrate an anti-diabetic effect by suppressing the activity of the T cells responsible for the impairment and destruction of the pancreatic insulin-producing cells.
“Using microspheres will be much less invasive for the patient and much more efficient for clinicians. We wouldn’t need to harvest a patient’s dendritic cells, and it would eliminate the need to genetically reprogram the dendritic cells in a sterile, off-site facility. Instead, the patient would receive the microsphere injection with a small needle in a clinic setting in a matter of minutes,” Dr. Giannoukakis said.
###
The microsphere molecule delivery technology being used is Baxter Healthcare Corporation’s PROMAXX microsphere technology. Larry Brown, ScD, Vice President, Research and Chief Technology Officer and Kimberly Gillis, PhD, Director of Research at Epic Therapeutics (Norwood, MA), part of Baxter’s Medication Delivery business, worked with Drs. Giannoukakis and Trucco to develop the specific diabetes vaccine microspheres. The genetic reprogramming of dendritic cells is an approach developed by Drs. Giannoukakis and Trucco.
Type 1 diabetes is regarded as an autoimmune disease because a person’s immune system’s T cells attack and destroy the beta cells in the pancreas that produce insulin. Symptoms of type 1 diabetes usually develop over a short period of time and include increased thirst, frequent urination, constant hunger, weight loss, blurred vision and extreme fatigue. People with type 1 diabetes require numerous daily injections of insulin to survive. Type 1 diabetes also is known as insulin-dependent diabetes mellitus or juvenile-onset diabetes. The National Institutes of Health (NIH) reports that more than 1 million children and teenagers (age 19 and younger) have type 1 diabetes. According to the NIH, 5 percent to 10 percent of diagnosed diabetes cases in the United States are type 1 diabetes.
Dr. Trucco, an international leader in the field of immunogenetics, has dedicated his life’s work to finding a cure for diabetes. He also is the Hillman Professor of Pediatric Immunology at Children’s Hospital and a professor of Pediatrics at the University of Pittsburgh School of Medicine. His laboratory team at Children’s John G. Rangos Sr. Research Center has pioneered numerous important studies and also maintains a federally funded national bone marrow HLA typing center. For more information about type 1 diabetes or Dr. Trucco’s research, please visit www.chp.edu.
----------------------------------------
Public release date: 28-May-2008
Contact: Marc Lukasiak, marc.lukasiak@chp.edu, 412-692-7919
Children's Hospital of Pittsburgh

Protein plays key role in transmitting deadly malaria parasite


John Adams, PhD, and his team study the complex life cycle of the malaria parasite (on computer screen) to try to find ways to block transmission of the deadly infection.
--------------------------------------------
Tampa, FL (May 28, 2008) — The protein MAEBL is critical for completing the life cycle of malaria parasites in mosquitoes, allowing the insects to transmit the potentially deadly infection to humans, a University of South Florida study has shown. The research may ultimately help provide a way to better control malaria by blocking development of the malaria parasite in the mosquito.
Researchers with the USF Global Health Infectious Diseases Research team found that the transmembrane protein MAEBL is required for the infective stage of the malaria parasite Plasmodium falciparum to invade the mosquito’s salivary glands. Their findings were published May 28 in the online journal PLoS ONE.
-----------------------------------
“The mosquito is the messenger of death,” said the study’s principal investigator John Adams, PhD, professor of global health at the USF College of Public Health. “If we could eliminate the parasite from the mosquito, people wouldn’t become infected.”
-----------------------------------
Plasmodium falciparum causes three-quarters of all malaria cases in Africa, and 95 percent of malaria deaths worldwide. It is transmitted to humans by the bite of an infected mosquito, which injects the worm-like, one-celled malaria parasites from its salivary glands into the person’s bloodstream.

Dr. Adams, center, with his team including, left to right, Steven Maher, Fabian Saenz, PhD, lead author of the PLoS ONE paper, and Sandra Kennedy.
----------------------------------------
The study was done by genetically modifying the malaria parasites and feeding them in a blood meal to uninfected mosquitoes. Parasites in which MAEBL was deleted were not harbored in the salivary glands of mosquitoes, even though an earlier form of these parasites was observed in the gut of the mosquitoes. The researchers concluded that the transmembrane form of MAEBL is essential for the parasite to enter the mosquito’s salivary glands.
-----------------------------------
While more studies are needed, lead author Fabian Saenz, PhD, said the finding suggests that silencing the receptor for MAEBL in the mosquito salivary gland might block passage of the parasite through the mosquito, thereby preventing human infection through mosquito bites.
---------------------------------------------
“Our study shows that MAEBL is a weak link in the parasite’s biology,” Dr. Adams said. “This could provide a potential way to block transmission in the mosquito, before the parasite ever has a chance to infect a new person. It is better to prevent the malaria infection from occurring in the first place than having to kill the parasite already inside humans with vaccines or drugs.”
The study was supported by a grant from the National Institute of Allergy and Infectious Diseases. Other study authors were Dr. Bharath Balu, Jonah Smith and Sarita Mendonca.
--------------------------------------------------

Microscopic view of an Anopheles mosquito infected with malaria parasites.
--------------------------------------------------
- USF Health -
USF Health is dedicated to creating a model of health care based on understanding the full spectrum of health. It includes the University of South Florida’s colleges of medicine, nursing, and public health; the schools of biomedical sciences as well as physical therapy & rehabilitation sciences; and the USF Physicians Group. With $308 million in research funding last year, USF is one of the nation’s top 63 public research universities and one of Florida’s top three research universities.
----------------------------------------------------
May 28, 2008 @ 8:09 am · Filed under USF Health News, Press Releases

New iron-based and copper-oxide high-temperature superconductors



The magnetic structure of the new iron-based superconductors was determined at the thermal triple-axis spectrometer at the National Institute of Standards and Technology Center for Neutron Research. Physicists Jeffrey Lynn...


-----------------------------------------------------


NIST's neutron facilities reveal intriguing similarities


GAITHERSBURG, MD—In the initial studies of a new class of high-temperature superconductors discovered earlier this year, research at the Commerce Department’s National Institute of Standards and Technology (NIST) has revealed that new iron-based superconductors share similar unusual magnetic properties with previously known superconducting copper-oxide materials. The research appears in the May 28 Advanced Online Publication of the journal Nature.
These superconductors may one day enable energy and environmental gains because they could significantly increase the efficiency of transmitting electricity over the electric grid or of storing electricity during off-peak hours for later use.
“While we still do not understand how magnetism and superconductivity are related in copper-oxide superconductors,” explains NIST Fellow Jeffrey Lynn at the NIST Center for Neutron Research (NCNR), “our measurements show that the new iron-based materials share what seems to be a critical interplay between magnetism and superconductivity.”

-------------------------------------------------------------------------------------------

Part of the team that determined the magnetic and crystal structure of the new iron-based superconductor with the NIST Center for Neutron Research instrument they used for the experiment. Pictured...

-------------------------------------------------------------
The importance of magnetism to high-temperature superconductors is remarkable because magnetism strongly interferes with conventional low-temperature superconductors. “Only a few magnetic impurities in the low-temperature superconductors sap the superconducting properties away,” says Lynn.
By contrast, copper-oxide superconductors, discovered in 1986, tolerate higher magnetic fields at higher temperatures. The highest-performance copper-oxide superconductors conduct electricity without resistance when cooled to "transition temperatures" below 140 Kelvin (-133 Celsius) and can be cooled simply and cheaply by liquid nitrogen to 77 Kelvin (-196 Celsius).
Japanese researchers discovered earlier this year that a new class of iron-based superconducting materials also had much higher transition temperatures than the conventional low-temperature superconductors. The discovery sent physicists and materials scientists into a renewed frenzy of activity reminiscent of the excitement brought on by the discovery of the first high-temperature superconductors over 20 years ago.
Earlier work on the copper-oxide superconductors revealed that they consist of magnetically active copper-oxygen layers, separated by layers of non-magnetic materials. By “doping,” or adding different elements to the non-magnetic layers of this normally insulating material, researchers can manipulate the magnetism to achieve electrical conduction and then superconductivity.
The group of scientists studying the iron-based superconductors used the NCNR, a facility that uses intense beams of neutral particles called neutrons to probe the atomic and magnetic structure of the new material.
As neutrons probed the iron-based sample supplied by materials scientists in Beijing, they revealed a magnetism that is similar to that found in copper-oxide superconductors, that is, layers of magnetic moments—like many individual bar magnets—interspersed with layers of nonmagnetic material. Lynn notes that the layered atomic structure of the iron-based systems, like the copper-oxide materials, makes it unlikely that these similarities are an accident.
One of the exciting aspects of these new superconductors is that they belong to a comprehensive class of materials where many chemical substitutions are possible. This versatility is already opening up new research avenues to understand the origin of the superconductivity, and should also enable the superconducting properties to be tailored for commercial technologies.
-----------------------------------------------------------------------------------------------
Researchers from the following institutions partnered with NIST in these studies: University of Tennessee, Knoxville; Oak Ridge National Laboratory; University of Maryland; Ames Laboratory; Iowa State University and the Chinese Academy of Sciences’ Beijing National Laboratory for Condensed Matter Physics.
*C. de la Cruz, Q. Huang, J.W. Lynn, J. Li, W. Ratcliff II, J.L. Zarestky, H.A. Mook, G.F. Chen, J.L. Luo, N.L. Wang and P. Dai. Magnetic order close to superconductivity in the iron-based layered La(O1-xFx)FeAs systems. Nature Advanced Online Publication, May 28, 2008.

physics : Getting warmer: UT Knoxville researchers uncover information on new superconductors

KNOXVILLE -- The world of physics is on fire about a new kind of superconductor, and a group of researchers at the University of Tennessee, Knoxville, and Oak Ridge National Laboratory led by physicist Pengcheng Dai are in the middle of the heat.
The excitement centers around a new class of high-temperature superconductors -- initially discovered in February and March by Japanese and Chinese researchers -- and the effort to learn more about them. Dai and his team published major new findings about the materials in this week's online edition of the leading scientific journal Nature.
For more than 20 years, scientists have struggled to understand the phenomenon of high-temperature superconductors. The materials move electricity with incredible efficiency -- something that, if fully understood and controlled, could have a major impact on energy use around the world. Their impact could be felt in a variety of ways, from how we transmit electricity into homes to how we power the massive machines used in industry.
Conventional superconductors only possess the property at incredibly cold temperatures -- far too cold for widespread practical use, which is what drives the search for materials that are superconductors at higher temperatures.
When research showed that the new materials could be superconductors at higher temperatures than any conventional superconductors recorded -- 43 Kelvin -- Dai shifted his research group into high gear, contacting colleagues in China to send samples to him.
"When I saw [the superconducting temperature] hit 43K," said Dai, a UT Knoxville-ORNL joint faculty member, "I called and said, 'Send them over.' The sample arrived at UT that Friday. Clarina [de la Cruz, the study's lead author] went to Maryland that night, and ORNL the next week."
De la Cruz, a postdoctoral researcher in Dai's lab and at ORNL, was at the campus of the National Institute of Standards and Technology (NIST) in less than 12 hours, using an instrument that bombards the material with neutrons to learn more about its characteristics. Part of the research also was conducted at ORNL's High Flux Isotope Reactor a few days later.
What de la Cruz and Dai found was that the new materials share a common trait with another class of high-temperature superconductors -- when the materials are doped to become superconducting, they lose their static magnetism.
It's a trait that Dai and his team have studied extensively in superconductors known as cuprates, and this finding is a step toward showing that there may be a broader significance to the tie between magnetism and superconductivity.
"In our view, it is extremely important to find another example," Dai said. "It is not exactly the same as the cuprates, but it is similar."
The speed with which their research was conducted reflects the competitive nature of superconductivity research, a field which already has led to two Nobel Prizes.
Dai and his research team will continue to analyze the new material, in hopes of finding the common threads that make materials superconductive.
"The hope, the dream of the research is to engineer the process to happen at higher and higher temperatures," he said. The end goal is to be able to harness the unique property at temperatures that do not require incredibly cold and incredibly controlled situations.
---------------------------------------------------------------------------------------------
Dai and de la Cruz were joined as authors on the paper by researchers from NIST, ORNL, the University of Maryland, Iowa State University and the Beijing National Laboratory for Condensed Matter Physics. The work was funded by the U.S. Department of Energy, the National Science Foundation of China, the Chinese Academy of Sciences and the Chinese Ministry of Science and Technology.
----------------------------------------------------------------------------------------------
Public release date: 28-May-2008
Contact: Jay Mayfield, jay.mayfield@tennessee.edu, 865-974-9409, University of Tennessee at Knoxville

Tuesday, May 27, 2008

Geoengineering could slow down the global water cycle

LIVERMORE, Calif. - As fossil fuel emissions continue to climb, reducing the amount of sunlight hitting the Earth would definitely have a cooling effect on surface temperatures.
However, a new study from Lawrence Livermore National Laboratory, led by atmospheric scientist Govindasamy Bala, shows that this intentional manipulation of solar radiation also could lead to a less intense global water cycle. Decreasing surface temperatures through “geoengineering” also could mean less rainfall.
The reduction in sunlight could be accomplished by geoengineering schemes, which fall into two classes: the so-called “sunshade” schemes, which would mitigate climate change by intentionally reducing the solar radiation reaching the earth's surface; and schemes that remove atmospheric CO2 and sequester it in terrestrial vegetation, the oceans or deep geologic formations.
In the new climate modeling study, which appears in the May 27-30 early online edition of the Proceedings of the National Academy of Sciences, Bala and his colleagues Karl Taylor and Philip Duffy demonstrate that the sunshade geoengineering scheme could slow down the global water cycle.
The sunshade schemes include placing reflectors in space, injecting sulfate or other reflective particles into the stratosphere, or enhancing the reflectivity of clouds by injecting cloud condensation nuclei in the troposphere. When CO2 is doubled as predicted in the future, a 2 percent reduction in sunlight is sufficient to counter the surface warming.
This new research investigated the sensitivity of the global mean precipitation to greenhouse and solar forcings separately to help understand the global water cycle in a geoengineered world.
While the surface temperature response is the same for CO2 and solar forcings, the rainfall response can be very different.
“We found that while climate sensitivity can be the same for different forcing mechanisms, the hydrological sensitivity is very different,” Bala said.
The global mean rainfall increased by approximately 4 percent for a doubling of CO2 and decreased by 6 percent for a reduction in sunlight in his modeling study.
“Because the global water cycle is more sensitive to changes in solar radiation than to increases in CO2, geoengineering could lead to a decline in the intensity of the global water cycle,” Bala said.
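As a back-of-the-envelope illustration of why a geoengineered world ends up with a weaker water cycle, the two sensitivities quoted above can be combined. The sketch below assumes the responses simply add, which the actual climate model does not guarantee; the percentages are the ones reported in this article.

```python
# Illustrative combination of the two rainfall responses quoted above.
# Assumption: the responses add linearly, which is only approximate.

dP_co2 = +4.0  # % change in global mean rainfall for a doubling of CO2
dP_sun = -6.0  # % change for the ~2% sunlight cut that offsets the warming

# In a sunshade-geoengineered world both forcings act at once, so the
# net change in the global water cycle would be roughly:
dP_net = dP_co2 + dP_sun
print(f"Approximate net rainfall change: {dP_net:+.1f}%")  # -2.0%
```

The roughly 2 percent net decline is consistent with the article's conclusion that offsetting CO2 warming with reduced sunlight leaves the water cycle less intense than before.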
A recent study showed that there was a substantial decrease in rainfall over land and a record decrease in runoff and discharge into the ocean following the eruption of Mount Pinatubo in 1991. The ash emitted from Pinatubo masked some of the sunlight reaching the earth and therefore decreased surface temperatures slightly, but it also slowed down the global hydrologic cycle.
“Any research in geoengineering should explore the response of different components of the climate system to forcing mechanisms,” Bala said.
For instance, Bala said, sunshade geoengineering would not limit the amount of CO2 emissions. CO2 effects on ocean chemistry, specifically, could have harmful consequences for marine biota because of ocean acidification, which is not mitigated by geoengineering schemes.
“While geoengineering schemes would mitigate the surface warming, we still have to face the consequences of CO2 emissions on marine life, agriculture and the water cycle,” Bala said.
------------------------------------------------------------------------------------------------
Founded in 1952, Lawrence Livermore National Laboratory is a national security laboratory, with a mission to ensure national security and apply science and technology to the important issues of our time. Lawrence Livermore National Laboratory is managed by Lawrence Livermore National Security, LLC for the U.S. Department of Energy's National Nuclear Security Administration.

Battling bird flu by the numbers

LOS ALAMOS, N.M., May 27, 2008
Los Alamos mathematical model gauges epidemic potential of emerging diseases
------------------------------------------------------------------------------------------------
A pair of Los Alamos National Laboratory researchers have developed a mathematical tool that could help health experts and crisis managers determine in real time whether an emerging infectious disease such as avian influenza H5N1 is poised to spread globally.
In a paper published recently in the Public Library of Science, researchers Luís Bettencourt and Ruy Ribeiro of Los Alamos’ Theoretical Division describe a novel approach to reading subtle changes in epidemiological data to gain insight into whether something like the H5N1 strain of avian influenza—commonly known these days as the “Bird Flu”—has gained the ability to touch off a deadly global pandemic.
“What we wanted to create was a mathematically rigorous way to account for changes in transmissibility,” said Bettencourt. “We now have a tool that will tell us in the very short term what is happening based on anomaly detection. What this method won’t tell you is what’s going to happen five years from now.”
Bettencourt and Ribeiro began their work nearly three years ago, at a time when the world was wondering whether avian influenza H5N1, with its relatively high human mortality rate, could become a frightening new pandemic. Health experts believe that right now the virus primarily infects humans who come in contact with infected poultry.
But some health experts fear the virus could evolve to a form that would become transmissible from human to human, the basis of a pandemic like the 1918 Spanish Flu that killed an estimated 50 million people.
The Los Alamos researchers set out to create a “smart methodology” to look at changes in disease transmissibility that did not require mounds of epidemiological surveillance data for accuracy. The ability to look at small disease populations in real time could allow responders and health experts to implement quarantine policies and provide medical resources to key areas early on in an emerging pandemic and possibly stem the spread.
Bettencourt and Ribeiro developed an extension of standard epidemiological models that describes the probability of disease spread among a given population. The model then takes into account actual disease surveillance data gathered by health experts like the World Health Organization and looks for anomalies in the expected transmission rate versus the actual one. Based on this, the model provides health experts actual transmission probabilities for the disease. Unlike other statistical models that require huge amounts of data for accuracy, the Los Alamos tool works on very small populations such as a handful of infected people in a remote village.
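The core of such an estimator can be sketched in a few lines. This is not the authors' actual model: the daily case counts, the recovery rate, and the grid over candidate reproduction numbers below are all hypothetical, and the growth relation b(t+1) ≈ b(t)·exp(γ(R−1)) is a standard simplification of the kind of stochastic epidemic model the paper extends. The sketch computes a posterior over the effective reproduction number R under a flat prior, which is the quantity that signals whether an outbreak can spread (R > 1).

```python
import numpy as np

# Hypothetical daily new-case counts for a small emerging outbreak.
cases = [1, 1, 2, 3, 4, 6, 9, 13, 19, 28]

gamma = 1 / 4.0                        # assumed recovery rate (1/infectious period, days)
R_grid = np.linspace(0.1, 5.0, 500)    # candidate reproduction numbers

# Expected next-day count under each R, from b(t+1) ~ b(t)*exp(gamma*(R-1)).
log_post = np.zeros_like(R_grid)       # flat prior over R
for b_now, b_next in zip(cases[:-1], cases[1:]):
    lam = b_now * np.exp(gamma * (R_grid - 1.0))   # Poisson mean for each R
    # Poisson log-likelihood, dropping the R-independent log(b_next!) term
    log_post += b_next * np.log(lam) - lam

post = np.exp(log_post - log_post.max())
post /= post.sum()                     # normalized posterior over R_grid

R_map = R_grid[np.argmax(post)]
print(f"MAP estimate of R: {R_map:.2f}  (R > 1 means the outbreak is growing)")
```

Because each day's count updates the posterior, the estimate sharpens in real time even with only a handful of cases, which is the property the Los Alamos tool exploits for small, remote outbreaks.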
After developing their Bayesian estimation of epidemic potential, Bettencourt and Ribeiro went back and looked at actual epidemiological surveillance data collected during Bird Flu outbreaks in certain parts of the world. Their model accurately portrayed actual transmission scenarios, lending confidence to its methodology.
In addition to its utility in understanding the transmissibility of emerging diseases, the new method is also advantageous because it allows public health experts to study outbreaks of more common ailments such as seasonal influenza early on. This can assist medical professionals in making better estimates of potential morbidity and mortality, along with assessments of intervention strategies and resource allocations that can help a population better cope with a developing seasonal outbreak.
“We are closing the loop on science-based prediction of transmission consequences in real time,” said Ribeiro. “A program of this type is something that needs to be implemented at a worldwide level to provide an integrated way to respond a priori to an emerging disease threat.”
Los Alamos National Laboratory is a multidisciplinary research institution engaged in strategic science on behalf of national security. The Laboratory is operated by a team composed of Bechtel National, the University of California, BWX Technologies, and Washington Group International for the Department of Energy's National Nuclear Security Administration.
Los Alamos enhances national security by ensuring the safety and reliability of the U.S. nuclear stockpile, developing technologies to reduce threats from weapons of mass destruction, and solving problems related to energy, environment, infrastructure, health and global security concerns.

Brown Chemists Create Cancer-Detecting Nanoparticles



Nanobonding: The illustration (top) shows how an RGD peptide-coated iron oxide nanoparticle binds with an integrin-rich tumor cell. At bottom left is an MRI of a mouse with the implanted U87MG tumor (red circle). At bottom right is an optical image that reveals iron oxide nanoparticles (blue) amassed in the tumor area (pink). Credit: Jin Xie, Brown University

------------------------------------------------------

A team led by a Brown University chemist has created the smallest iron oxide nanoparticles to date for cancer detection by magnetic resonance imaging (MRI). The magnetic nanoparticles operate like tiny guided missiles, seeking and attaching themselves to malignant tumor cells. Once they bind, the particles emit stronger signals that MRI scans can detect.

---------------------------------------------------------

PROVIDENCE, R.I. [Brown University] — Magnetic resonance imaging (MRI) can be a doctor's best friend for detecting a tumor in the body without resorting to surgery. MRI scans use pulses of magnetic waves and gauge the return signals to identify different types of tissue in the body, distinguishing bone from muscle, fluids from solids, and so on.
Scientists have found that magnetic nanoparticles can be especially helpful in locating cancerous cell clusters during MRI scans. Like tiny guided missiles, the nanoparticles seek out tumor cells and attach themselves to them. Once the nanoparticles bind themselves to these cancer cells, the particles operate like radio transmitters, greatly aiding the MRI's detection capability.
Now, Brown University chemist Shouheng Sun and a team of researchers have created the smallest magnetic nanoparticles to date that can be employed on such seek-and-find missions. With a thinner coating, the particles also emit a stronger signal for the MRI to detect.
The results have been published online this week in the Journal of the American Chemical Society. Brown graduate students Jin Xie, Chenjie Xu and Sheng Peng collaborated on the research, along with Professor Xiaoyuan Chen and his associates from Stanford University.
The team created peptide-coated iron oxide nanoparticles — particles billionths of a meter in size. The researchers injected the particles into mice and tested their ability to locate a brain tumor cell called U87MG. Sun and his collaborators concentrated specifically on the nanoparticle's size and the thickness of the peptide coating, which ensures the nanoparticle attaches to the tumor cell.
Size is important because the trick is to create a nanoparticle that is small enough to navigate through the bloodstream and reach the diseased area. Bigger particles tend to stack up, creating the circulatory system's version of a traffic jam. Sun's team developed a nanoparticle that is about 8.4 nanometers in overall diameter — some six times smaller than the size of particles currently used in medicine.
"We wanted to make (the nanoparticle) very small, so the body's immune system won't recognize it," Sun explained. "That way, you let more particles interact with and attach to the tumor cell."
Nanoparticles are important in MRI detection because they enhance what scientists refer to as the "contrast" between the background, such as water molecules in the body, and a solid mass, such as a tumor.
The coating, while integral to the nanoparticles' attachment to the tumor cell, also is crucial to establishing the "signal-to-noise" ratio that an MRI uses. The thinner the coating, the stronger the emitted signal and vice versa. Sun's team outfitted their nanoparticles with a two-nanometer-thick peptide coating — 10 times thinner than the coating available in popular MRI contrast agents such as Feridex. Sun's nanoparticles are like having a 50,000-watt radio transmitter versus a 150-watt station; it's easier for the MRI to "hear" the stronger signal and to home in on the signal's source.
Another important feature of the team's work is discovering that the RGD peptide coating binds almost seamlessly to the U87MG tumor cell. The team plans to test the particle's ability to bind with other tumor cells in further animal experiments.
The National Cancer Institute, part of the National Institutes of Health, and the Department of Energy's Experimental Program to Stimulate Competitive Research (EPSCoR) funded the research.

-------------------------------------------------------------------------------------------------
Editors: Brown University has a fiber link television studio available for domestic and international live and taped interviews, and maintains an ISDN line for radio interviews. For more information, call (401) 863-2476.

Monday, May 26, 2008

Warning: Using a mobile phone while pregnant can seriously damage your baby

Scientists found that mothers who did use the handsets were 54 per cent more likely to have children with behavioural problems and that the likelihood increased with the amount of potential exposure to the radiation


Study of 13,000 children exposes link between use of handsets and later behavioural problems


By Geoffrey Lean, Environment Editor
Sunday, 18 May 2008

Women who use mobile phones when pregnant are more likely to give birth to children with behavioural problems, according to authoritative research.
A giant study, which surveyed more than 13,000 children, found that using the handsets just two or three times a day was enough to raise the risk of their babies developing hyperactivity and difficulties with conduct, emotions and relationships by the time they reached school age. And it adds that the likelihood is even greater if the children themselves used the phones before the age of seven.
The results of the study, the first of its kind, have taken the top scientists who conducted it by surprise. But they follow warnings against both pregnant women and children using mobiles by the official Russian radiation watchdog body, which believes that the peril they pose "is not much lower than the risk to children's health from tobacco or alcohol".
The research – at the universities of California, Los Angeles (UCLA) and Aarhus, Denmark – is to be published in the July issue of the journal Epidemiology and will carry particular weight because one of its authors has been sceptical that mobile phones pose a risk to health.
UCLA's Professor Leeka Kheifets – who serves on a key committee of the International Commission on Non-Ionizing Radiation Protection, the body that sets the guidelines for exposure to mobile phones – wrote three and a half years ago that the results of studies on people who used them "to date give no consistent evidence of a causal relationship between exposure to radiofrequency fields and any adverse health effect".
The scientists questioned the mothers of 13,159 children born in Denmark in the late 1990s about their use of the phones in pregnancy, and their children's use of them and behaviour up to the age of seven. As they gave birth before mobiles became universal, about half of the mothers had used them infrequently or not at all, enabling comparisons to be made.
They found that mothers who did use the handsets were 54 per cent more likely to have children with behavioural problems and that the likelihood increased with the amount of potential exposure to the radiation. And when the children also later used the phones they were, overall, 80 per cent more likely to suffer from difficulties with behaviour. They were 25 per cent more at risk from emotional problems, 34 per cent more likely to suffer from difficulties relating to their peers, 35 per cent more likely to be hyperactive, and 49 per cent more prone to problems with conduct.
The scientists say that the results were "unexpected", and that they knew of no biological mechanisms that could cause them. But when they tried to explain them by accounting for other possible causes – such as smoking during pregnancy, family psychiatric history or socio-economic status – they found that, far from disappearing, the association with mobile phone use got even stronger.
They add that there might be other possible explanations that they did not examine – such as that mothers who used the phones frequently might pay less attention to their children – and stress that the results "should be interpreted with caution" and checked by further studies. But they conclude that "if they are real they would have major public health implications".
Professor Sam Milham, of the blue-chip Mount Sinai School of Medicine in New York, and the University of Washington School of Public Health – one of the pioneers of research in the field – said last week that he had no doubt that the results were real. He pointed out that recent Canadian research on pregnant rats exposed to similar radiation had found structural changes in their offspring's brains.
The Russian National Committee on Non-Ionizing Radiation Protection says that use of the phones by both pregnant women and children should be "limited". It concludes that children who talk on the handsets are likely to suffer from "disruption of memory, decline of attention, diminishing learning and cognitive abilities, increased irritability" in the short term, and that longer-term hazards include "depressive syndrome" and "degeneration of the nervous structures of the brain".

Army : Battlefield Weather for Command, Control, Communications, Computers, Intelligence, Surveillance, and Reconnaissance (C4ISR)






Battlefield Weather Research Program - Basic Research: The Army's Transformation Plan will require a greater degree of understanding of the atmospheric processes within the Army's battlespace. This project provides an in-depth understanding of the complex atmospheric boundary layer. The continued requirement for military operations in complex and urban terrains requires new approaches to measuring and modeling micro-scale atmospheric phenomena, particularly for fast response in near real time. This includes the detection and tracking of chemical and biological aerosols; the propagation of full-spectrum electromagnetic signals and acoustic signatures; and the delivery of accurate and timely weather intelligence for battlefield commanders. This project is the research leader in boundary layer meteorology over land and urban terrain. It supports Army objectives through enhanced acoustic modeling techniques for improved target detection and acquisition; the development of objective analysis tools that can assimilate on-scene weather observations and fuse these data with forecasts to provide immediate nowcast products; and research on novel imaging capabilities to enhance the detection and identification of biowarfare agents.
Battlefield Weather Research Program - Applied Research: The Army's Transformation Plan will require the capabilities for battlefield commanders to make decisions in near real time based on fusion of current tactical weather and forecast products provided from meteorological satellites and from the other services. Data must then be transformed into weather intelligence, including atmospheric effects and weather impacts on friendly and threat systems, and must be made accessible through effective planning tools and intuitive decision aids. These weather intelligence data will not only have to be timely and accurate, but also exchanged on various bandwidth channels between and across all echelons from the continental U.S. (CONUS) home station and strategic operations centers on down to the lowest levels of command in the field, to include the individual soldier. This project focuses on accomplishing this mission through the development and transition of technologies that collect, analyze, and integrate the weather data from a forecast/nowcast model with battlefield observations into a four-dimensional (4D) net-centric, distributed meteorological information space. Technologies are developed to automate the knowledge management and optimize bandwidth to meet the goal of providing the best actionable weather intelligence, even while en route, and translate that intelligence into specific weather decision aids for the digital battlefield commander. It is accomplished by applying advanced computing techniques; by incorporating technology in meteorological sensor joint data distribution architectures; by developing data assimilation and information fusion techniques to horizontally integrate disparate sources; by integrating weather and its effects into planning tools; and by enhancing combat power and effectiveness through improved decision aid technologies.

Tactical Communications and Networks
Signal Processing for Tactical Communications: In ARL's Signal Processing for Tactical Communications program, we are exploring fundamental aspects in the development of secure, jam-resistant, and adaptive mobile communications that will be effective in noisy, wireless, hostile battlefield environments. The goal of this basic research program is to investigate enhancements to current communications technologies in the areas of anti-jam and spectrally efficient modulation techniques; intelligent interference rejection; secure and jam-resistant multiple-access; robust wideband mobile receivers; adaptive spectrum reuse; laser communications technologies; channel propagation modeling; security and authentication; and jammer detection and mitigation.
Mobile Tactical and Sensor Networks: ARL's Mobile Tactical and Sensor Networks program is focused on providing the Army's fully mobile, fully communicating, agile, and situationally aware force with a highly dynamic, wireless, mobile networking environment for a force consisting of a heterogeneous mixture of individual soldiers, ground vehicles, airborne platforms, unmanned aerial vehicles (UAVs), robotics, and unattended microsensor networks. This program is developing networking technologies that can operate with full mobility, self-configuration, robustness, survivability, and scalability in the complex and demanding mobile environment of the future battlefield. Mobile ad hoc networks, which can rapidly change with the tactical situation, form the basis of our tactical networking research.
Communications & Networks: The Communications and Networks Collaborative Technology Alliance (C&N CTA) is a partnership between the Army Laboratories and Centers, private industry, and academia that is focused on rapid transition to the warfighter. These collaborative efforts seek to enable large, heterogeneous, wireless communications networks for the Future Force that can operate while on-the-move with a highly mobile network infrastructure; under severe bandwidth, energy, and processing constraints; and while providing secure, jam-resistant communications in noisy hostile battlefield environments. The research focuses on four technical thrusts: 1) survivable wireless mobile networks that ensure that tactical networks are self-configuring and self-maintaining, highly mobile, survivable, scalable, energy-efficient, performance-optimized, and interoperable with joint and coalition forces; 2) signal processing to support efficient comms-on-the-move that is effective in a noisy, cluttered, and hostile wireless environment; 3) secure jam-resistant communications to ensure reliable communications in environments which include dense, multiple-access interference that may be generated from within the network or from hostile interferers; and 4) tactical information protection that provides automated, scalable, efficient, adaptive security for wireless, multi-hop, self-configuring networks.
Tactical Information Protection: ARL's Tactical Information Protection program is developing automated, scalable, efficient, and adaptive information protection in tactical wireless, multi-hop, self-configuring networks. This program includes automated intrusion detection and assessment for tactical networks, security infrastructure for sensor networks, and energy-efficient tamper detection for mobile code.

Tactical Battlespace Information Processing
Battlefield Information Processing: ARL is exploring fundamental aspects in the development of a real-time, service-based software infrastructure to facilitate communication and information sharing among ad hoc heterogeneous assets, to include developing the processing, sensor networking, and packaging infrastructure for agents to populate both manned and autonomous platforms. ARL efforts in this area will result in the ability to collaboratively aggregate, fuse, and abstract data to information and information to knowledge in such a way that the military decision maker can easily assimilate knowledge. Providing battlefield decision makers with information in a highly visual and easily assessed form (including multi-lingual computing products) will significantly improve the soldier's ability to absorb information and make better decisions. This effort includes the evaluation of machine translation engines and approaches in order to develop and validate metrics in multi-lingual computing.
Intelligent Optics: In ARL's Intelligent Optics program, we are conducting research in the theoretical and experimental aspects of ground-to-ground imaging and intelligent and adaptive optics, and in the development of algorithms, techniques, and devices for advanced military imaging and image-processing systems. Adaptive image processing will provide images of outstanding clarity and unlimited depth of field. Intelligent optics methods and techniques will improve military devices and systems so that real-time images of targets can be obtained and presented to decision makers.
Command and Control (C2) in Complex and Urban Terrain: The joint ARL/ Communications-Electronics Research Development and Engineering Center (CERDEC)/ Cold Regions Research & Engineering Laboratory (CRREL) Command & Control in Complex & Urban Terrain (C2CUT) Army Technology Objective will develop a service-based software infrastructure to facilitate communication and information sharing among ad hoc heterogeneous assets and C2 decision aids for Future Force dismounted and mounted commanders, leaders, and soldiers to employ during close combat in complex and urban terrain. The decision aids will be used to integrate critical information day and night in any combat situation. This capability will enhance survivability and increase combat effectiveness by providing enhanced collaboration, information reach back, mixed asset management, and seamless situational understanding.
Fusion Based Knowledge for the Future Force: ARL is developing an advanced knowledge generation and explanation capability to answer the warfighting commanders' Priority Intelligence Requirements (PIRs). This capability will enable the commander to see and understand at a rate supporting the tactical agility concepts of the Future Force Unit of Action (17,000-170,000 reports per hour), and will provide the commander with automated enemy course-of-action and intent analysis with, at a minimum, the accuracy of a student analyst.

Advanced Decision Architectures: The purpose of the Advanced Decision Architectures (ADA) Collaborative Technology Alliance (CTA) program is to develop, validate, and transition new knowledge management and decision support technologies to facilitate soldier awareness and understanding of the tactical situation, thereby resulting in more rapid decisions, creating a tempo with which the enemy cannot compete.