
Wednesday, June 23, 2010

Semi-automated ontology generation within OBO-Edit

Ontologies and taxonomies have proven highly beneficial for biocuration. The Open Biomedical Ontology (OBO) Foundry alone lists over 90 ontologies mainly built with OBO-Edit. Creating and maintaining such ontologies is a labour-intensive, difficult, manual process. Automating parts of it is of great importance for the further development of ontologies and for biocuration.
Results: We have developed the Dresden Ontology Generator for Directed Acyclic Graphs (DOG4DAG), a system which supports the creation and extension of OBO ontologies by semi-automatically generating terms, definitions and parent–child relations from text in PubMed, the web and PDF repositories. DOG4DAG is seamlessly integrated into OBO-Edit. It generates terms by identifying statistically significant noun phrases in text. For definitions and parent–child relations it employs pattern-based web searches. We systematically evaluate each generation step using manually validated benchmarks. The term generation leads to high-quality terms also found in manually created ontologies. Up to 78% of definitions are valid and up to 54% of child–ancestor relations can be retrieved. There is no other validated system that achieves comparable results.
By combining the prediction of high-quality terms, definitions and parent–child relations with the ontology editor OBO-Edit we contribute a thoroughly validated tool for all OBO ontology engineers.
Availability: DOG4DAG is available within OBO-Edit 2.1 at http://www.oboedit.org
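
The core of the term-generation step, identifying noun phrases that are statistically over-represented in domain text, is easy to illustrate. Below is a minimal sketch of that general idea, not DOG4DAG's actual code; the toy corpora and the Dunning log-likelihood scoring are my assumptions:

```python
import math
from collections import Counter

def log_likelihood(k1, n1, k2, n2):
    """Dunning's G^2 for a phrase seen k1 of n1 times in the domain corpus
    and k2 of n2 times in the background corpus."""
    def ll(k, n, p):
        # the 0 < p < 1 guard handles phrases absent from one corpus
        return k * math.log(p) + (n - k) * math.log(1 - p) if 0 < p < 1 else 0.0
    p_pooled = (k1 + k2) / (n1 + n2)
    return 2 * (ll(k1, n1, k1 / n1) + ll(k2, n2, k2 / n2)
                - ll(k1, n1, p_pooled) - ll(k2, n2, p_pooled))

def rank_terms(domain_phrases, background_phrases):
    """Return candidate terms sorted by over-representation in the domain."""
    d, b = Counter(domain_phrases), Counter(background_phrases)
    n1, n2 = sum(d.values()), sum(b.values())
    scored = [(log_likelihood(d[t], n1, b.get(t, 0), n2), t) for t in d]
    return sorted(scored, reverse=True)

# toy corpora of pre-extracted noun phrases
domain = ["dna repair", "nucleotide excision repair", "dna repair",
          "cell", "nucleotide excision repair"]
background = ["cell", "cell", "protein", "dna repair", "protein"]
for score, term in rank_terms(domain, background):
    print(f"{score:6.2f}  {term}")
```

In DOG4DAG itself the candidate phrases come from PubMed abstracts, web pages or PDF repositories rather than toy lists, and definitions and parent-child relations are then sought with pattern-based web searches.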

Monday, June 21, 2010

How Lie Detectors Work

A polygraph instrument is basically a combination of medical devices that are used to monitor changes occurring in the body. As a person is questioned about a certain event or incident, the examiner looks to see how the person's heart rate, blood pressure, respiratory rate and electrodermal activity (sweatiness, in this case of the fingers) change in comparison to normal levels. Fluctuations may indicate that a person is being deceptive, but exam results are open to interpretation by the examiner.

Polygraph exams are most often associated with criminal investigations, but there are other instances in which they are used. You may one day be subject to a polygraph exam before being hired for a job: Many government entities, and some private-sector employers, will require or ask you to undergo a polygraph exam prior to employment.

Polygraph examinations are designed to look for significant involuntary responses going on in a person's body when that person is subjected to stress, such as the stress associated with deception. The exams are not able to specifically detect if a person is lying, according to polygrapher Dr. Bob Lee, former executive director of operations at Axciton Systems, a manufacturer of polygraph instruments. But there are certain physiological responses that most of us undergo when attempting to deceive another person. By asking questions about a particular issue under investigation and examining a subject's physiological reactions to those questions, a polygraph examiner can determine if deceptive behavior is being demonstrated.

SCOPE: a probabilistic model for scoring tandem mass spectra against a peptide database

Proteomics, or the direct analysis of the expressed protein components of a cell, is critical to our understanding of cellular biological processes in normal and diseased tissue. A key requirement for its success is the ability to identify proteins in complex mixtures. Recent technological advances in tandem mass spectrometry have made it the method of choice for high-throughput identification of proteins. Unfortunately, the software for unambiguously identifying peptide sequences has not kept pace with the recent hardware improvements in mass spectrometry instruments. Critical for reliable high-throughput protein identification, scoring functions evaluate the quality of a match between experimental spectra and a database peptide. Current scoring function technology relies heavily on ad hoc parameterization and manual curation by experienced mass spectrometrists. In this work, a two-stage stochastic model for the observed MS/MS spectrum, given a peptide, is proposed. The model explicitly incorporates fragment ion probabilities, noisy spectra and instrument measurement error, and the authors show how to compute the resulting probability-based score efficiently using a dynamic programming technique. A prototype implementation demonstrates the effectiveness of the model.
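
The phrase "probability-based score" is easier to grasp with a toy example. The sketch below is a deliberate simplification (SCOPE's actual model is two-stage and evaluated by dynamic programming; the masses, tolerance and probabilities here are illustrative assumptions): predict b- and y-ion masses for a candidate peptide, then accumulate log-odds over matched versus missing peaks.

```python
import math

# average amino acid residue masses in daltons (subset, for illustration)
MASS = {"A": 71.08, "D": 115.09, "E": 129.12, "G": 57.05, "K": 128.17,
        "L": 113.16, "P": 97.12, "S": 87.08, "T": 101.10, "V": 99.13}
PROTON, WATER = 1.01, 18.02

def fragment_mz(peptide):
    """m/z values of singly charged b- and y-ions for a peptide."""
    total = sum(MASS[a] for a in peptide) + WATER
    mz, prefix = [], 0.0
    for aa in peptide[:-1]:
        prefix += MASS[aa]
        mz.append(prefix + PROTON)          # b-ion
        mz.append(total - prefix + PROTON)  # y-ion
    return mz

def log_odds_score(peptide, peaks, tol=0.5, p_frag=0.7, p_noise=0.05):
    """Sum log-odds that each predicted fragment is explained by the spectrum
    rather than by noise; p_frag and p_noise are assumed constants here
    (SCOPE learns fragment ion probabilities from data)."""
    score = 0.0
    for mz in fragment_mz(peptide):
        observed = any(abs(mz - p) <= tol for p in peaks)
        score += (math.log(p_frag / p_noise) if observed
                  else math.log((1 - p_frag) / (1 - p_noise)))
    return score

peptide = "PEPTLDE"
peaks = fragment_mz(peptide)[:8]  # pretend only 8 fragments were observed
print(f"{log_odds_score(peptide, peaks):.2f}")
```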

Saturday, June 19, 2010


The Turritopsis nutricula species of jellyfish has been discovered to be the first, and possibly only, immortal creature. Once the creature reaches its adult form it can, apparently, use transdifferentiation to transform its cells backwards to the polyp stage of its life and begin the whole cycle again. There's not much more to say beyond that: consider me officially in awe.

Friday, June 18, 2010

Inferring combined CNV/SNP haplotypes from genotype data

Copy number variations (CNVs) are increasingly recognized as a substantial source of individual genetic variation, and hence there is a growing interest in investigating the evolutionary history of CNVs as well as their impact on complex disease susceptibility. CNV/SNP haplotypes are critical for this research, but although many methods have been proposed for inferring integer copy number, few have been designed for inferring CNV haplotypic phase and none of these are applicable at genome-wide scale. Here, we present a method for inferring missing CNV genotypes, predicting CNV allelic configuration and inferring CNV haplotypic phase from SNP/CNV genotype data. Our method, implemented in the software polyHap v2.0, is based on a hidden Markov model, which models the joint haplotype structure between CNVs and SNPs; the haplotypic phases of CNVs and SNPs are thus inferred simultaneously. A sampling algorithm is employed to obtain a measure of confidence/credibility of each estimate.

Results: We generated diploid phase-known CNV–SNP genotype datasets by pairing male X chromosome CNV–SNP haplotypes. We show that polyHap provides accurate estimates of missing CNV genotypes, allelic configuration and CNV haplotypic phase on these datasets. We applied our method to a non-simulated dataset—a region on Chromosome 2 encompassing a short deletion. The results confirm that polyHap's accuracy extends to real-life datasets.

Availability: Our method is implemented in version 2.0 of the polyHap software package and can be downloaded from http://www.imperial.ac.uk/medicine/people/l.coin
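
For readers unfamiliar with the machinery, the computational core of this kind of phasing is the standard hidden Markov model forward pass sketched below. The states and probabilities are hypothetical stand-ins; polyHap's real model is far richer, pairing haplotypes for diploids and sampling phase configurations from the posterior.

```python
import numpy as np

def forward(init, trans, emit, observations):
    """Total likelihood P(observations) under a discrete-emission HMM."""
    alpha = init * emit[:, observations[0]]
    for obs in observations[1:]:
        alpha = (alpha @ trans) * emit[:, obs]
    return alpha.sum()

# two hypothetical founder-haplotype states: state 0 carries (copy number 1,
# allele A), state 1 carries (copy number 2, allele B); observations are
# coded genotype classes along a stretch of CNV/SNP sites
init = np.array([0.5, 0.5])
trans = np.array([[0.95, 0.05],   # switching between haplotypes is rare
                  [0.05, 0.95]])
emit = np.array([[0.9, 0.1],      # state 0 mostly emits genotype class 0
                 [0.2, 0.8]])     # state 1 mostly emits genotype class 1
print(forward(init, trans, emit, [0, 0, 1, 1]))
```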

Tuesday, June 15, 2010

Why is it so difficult to find cancer cells?

Imagine you're standing in front of a sandbox. You've just been told that you'll receive a million dollars if you can find a certain piece of sand that's remarkable because it has a dot of black ink on it. How would you go about finding it? Do you even think you could?

Finding a single cancer cell in the human body is like looking for one grain of sand in a sandbox. You may think that it would be easier to find the cancer than the grain of sand; after all, hospitals are full of sophisticated diagnostic equipment that should help with the search. But there's no scan or test that can detect a cancer cell. The cell is simply too small, just one cell amidst the billions that make up the human body. Even small groups of cancerous cells are too small to see on test results, and sometimes larger groups of cells are hidden behind bodily organs so that they don't show up, either.

The undetected cancer cells have the opportunity to group together and form tumors. These tumors and cells also have a chance to spread throughout the body, a process known as metastasis. Large tumors and metastasized tumors are difficult to treat, which is why we hear so much about the importance of early detection. Unfortunately, it may be only after the cancer has grown or spread that symptoms start to occur. A tumor can grow so large that it affects organ function or causes bleeding or pain.

Metastasis can also cause the kind of symptoms that might send someone to the doctor; for example, cancer that has spread to the lungs can cause a persistent cough and chest infections, cancer that has spread to the liver can cause jaundice, and cancer that has spread to the lymph nodes can cause swelling. But the cancer may remain asymptomatic even then, which means it may do even more damage before it's found.

The difficulty of finding cancer cells not only presents delays in diagnosis, it can also create challenges to treatments. Cancer therapies such as radiation and chemotherapy are designed to target and kill cancerous cells in the midst of division. However, some cancerous cells lie dormant for periods of time, allowing them to survive several rounds of treatment. That's why cancer can come back years after a struggle with the disease.

Thursday, March 18, 2010

BRAT: bisulfite-treated reads analysis tool

BRAT is a new, accurate and efficient tool for mapping short reads obtained from the Illumina Genome Analyzer following sodium bisulfite conversion. BRAT supports single and paired-end reads and handles input files containing reads and mates of different lengths. BRAT is faster, maps more unique paired-end reads and has higher accuracy than existing programs. The software package includes tools to end-trim low-quality bases of the reads and to report nucleotide counts for mapped reads on the reference genome.

Availability: The source code is freely available for download at http://compbio.cs.ucr.edu/brat/ and is distributed as Open Source software under the GPLv3.0.
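
To give a flavor of the bundled utilities, here is a minimal sketch of 3'-end quality trimming. This is my own illustration, not BRAT's source, and the threshold is an assumed parameter:

```python
def trim_3prime(seq, quals, min_q=20):
    """Drop trailing bases whose Phred quality is below min_q (assumed
    threshold); real trimmers often use a running-sum variant instead."""
    end = len(seq)
    while end > 0 and quals[end - 1] < min_q:
        end -= 1
    return seq[:end], quals[:end]

read = "ACGTACGT"
quals = [30, 32, 31, 28, 25, 12, 8, 5]
trimmed, kept = trim_3prime(read, quals)
print(trimmed)  # ACGTA
```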

MARTA: A suite of Java-based tools for assigning taxonomic status to DNA sequences

A suite of Java-based software has been created to better provide taxonomic assignments to DNA sequences. It is anticipated that the program will be useful for protistologists, virologists, mycologists and other microbial ecologists. The program relies on NCBI utilities, including the BLAST software and the Taxonomy database, and is easily manipulated at the command line to specify a BLAST candidate's query-coverage or percent-identity requirements; other options include the ability to set minimal consensus requirements (%) for each of the eight major taxonomic ranks (Domain, Kingdom, Phylum, ...) and whether to consider lower-scoring candidates when the top hit lacks taxonomic classification.

Availability: http://bergelson.uchicago.edu/software/marta
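
The rank-by-rank consensus logic is simple to sketch. The snippet below is a hypothetical simplification of the approach (MARTA wraps BLAST searches and the NCBI Taxonomy database around it; the lineages and the 90% threshold are assumptions):

```python
from collections import Counter

RANKS = ["domain", "kingdom", "phylum", "class", "order",
         "family", "genus", "species"]

def assign(lineages, min_consensus=0.9):
    """Walk down the ranks, assigning while hit agreement stays above
    the threshold; stop at the first rank where consensus breaks."""
    assignment = {}
    for i, rank in enumerate(RANKS):
        names = Counter(l[i] for l in lineages if len(l) > i)
        if not names:
            break
        name, count = names.most_common(1)[0]
        if count / sum(names.values()) < min_consensus:
            break
        assignment[rank] = name
    return assignment

hits = [["Eukaryota", "Fungi", "Ascomycota"],
        ["Eukaryota", "Fungi", "Ascomycota"],
        ["Eukaryota", "Fungi", "Basidiomycota"]]
print(assign(hits))  # {'domain': 'Eukaryota', 'kingdom': 'Fungi'}
```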

Bisque: a platform for bioimage analysis and management

Advances in the field of microscopy have brought about the need for better image management and analysis solutions. Novel imaging techniques have created vast stores of images and metadata that are difficult to organize, search, process and analyze. These tasks are further complicated by conflicting and proprietary image and metadata formats that impede the analysis and sharing of images and any associated data. These obstacles have resulted in research resources being locked away in digital media and file cabinets. Current image management systems do not address the pressing needs of researchers who must quantify image data on a regular basis.
Results: We present Bisque, a web-based platform specifically designed to provide researchers with organizational and quantitative analysis tools for 5D image data. Users can extend Bisque with both data model and analysis extensions in order to adapt the system to local needs. Bisque's extensibility stems from two core concepts: a flexible metadata facility and an open, web-based architecture. Together these empower researchers to create, develop and share novel bioimage analyses. Several case studies using Bisque with specific applications are presented as an indication of how users can expect to extend Bisque for their own purposes.
Availability: Bisque is web based, cross-platform and open source. The system is also available as software-as-a-service through the Center of Bioimage Informatics at UCSB.

Beta Glucan MH3 - Immune System Video

Beta-glucans are known as "biological response modifiers" because of their ability to activate the immune system. Immunologists at the University of Louisville discovered that a receptor on the surface of innate immune cells called Complement Receptor 3 (CR3 or CD11b/CD18) is responsible for binding to beta-glucans, allowing the immune cells to recognize them as "non-self." However, it should be noted that the activity of beta-glucans differs from that of some pharmaceutical drugs, which have the potential to push the immune system to over-stimulation and hence are contraindicated in individuals with autoimmune diseases, allergies, or yeast infections. Beta-glucans seem to make the immune system work better without becoming overactive. In addition to enhancing the activity of the immune system, beta-glucans also reportedly lower elevated levels of LDL cholesterol, aid in wound healing, help prevent infections, and have potential as an adjuvant in the treatment of cancer.

Tuesday, February 23, 2010

Genome of Ancient Man Sequenced


When a man died some 4,000 years ago in what is now western Greenland, he probably had no idea that his remains would provide the first genetic portrait of people of his era. This man, known now as "Inuk" (a Greenlandic term for "human" or "man"), left for posterity just four hairs and a few small fragments of bone frozen in permafrost, but that is now all researchers need to assemble a thorough human genome. And Inuk has just had his code cracked. The researchers were able to sequence about 80 percent of the ancient genome, which is "comparable to the quality of a modern human genome," Eske Willerslev, director of the Center for Ancient Genetics at the University of Copenhagen, said at a press conference held in England on February 9. He and his team, led by Morten Rasmussen, an assistant professor at the university, were able to sequence about three billion base pairs (the human genome includes just over this amount), which is a finer resolution than that of previous genetic work on Neandertals and mammoths. At this level of resolution, the researchers noted, individual features and traits began to emerge. "The guy had most likely brown eyes, brown skin," as well as a genetic predisposition for baldness, Willerslev said. The presence of hair, then, might signal that he was rather young when he died and had yet to lose most of his hair, they noted. The genome also tells us Inuk had the recessive gene for dry earwax (as opposed to the more common wet form) and "a metabolism and body mass index commonly found in those who live in cold climates," David Lambert and Leon Huynen, both of the School of Biomolecular and Physical Sciences at Griffith University in Queensland, Australia, wrote in a commentary that accompanies the study.

Friday, February 19, 2010

COBALT: A new tool for Multiple Sequence Alignment

The simultaneous alignment of multiple sequences (multiple alignment) serves as a building block in several fields of computational biology, such as phylogenetic studies, detection of conserved motifs, prediction of functional residues and secondary structure, prediction of correlations and even quality assessment of protein sequences. An accurate multiple sequence alignment tool has therefore long been a major requirement.

COBALT (Constraint-based Multiple Alignment Tool) is a multiple sequence alignment tool that finds a collection of pairwise constraints derived from the conserved domain database, a protein motif database and sequence similarity, using RPS-BLAST, BLASTP and PHI-BLAST. Pairwise constraints are then incorporated into a progressive multiple alignment.

COBALT has a general framework that uses progressive multiple alignment to combine pairwise constraints from different sources into a multiple alignment. COBALT does not attempt to use all available constraints but uses only a high-scoring consistent subset that can change as the alignment progresses, where a set of constraints is called consistent if all of the constraints in the set can be simultaneously satisfied by a multiple alignment. Using the RPS-BLAST tool, COBALT can quickly search for domains in CDD that match regions of the input sequences. When the same domain matches multiple sequences, several potential pairwise constraints can be inferred from these domain matches. Furthermore, CDD also contains auxiliary information that allows COBALT to create partial profiles for input sequences before progressive alignment begins, avoiding computationally expensive procedures for building profiles.
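
To make the "consistent subset" idea concrete: for a single pair of sequences, two residue-position constraints cross when they map positions in opposite orders, and no single alignment can satisfy both. A toy greedy selection (my illustration, not COBALT's actual algorithm) might look like this:

```python
def consistent_subset(constraints):
    """Greedy pass: take constraints (i, j, score) by descending score and
    reject any that cross (or reuse a position of) an accepted constraint."""
    chosen = []
    for i, j, s in sorted(constraints, key=lambda c: -c[2]):
        if all((i - i2) * (j - j2) > 0 for i2, j2, _ in chosen):
            chosen.append((i, j, s))
    return sorted(chosen)

cons = [(3, 4, 9.0), (7, 10, 8.0), (5, 2, 7.5), (9, 12, 3.0)]
print(consistent_subset(cons))
# (5, 2, 7.5) crosses (3, 4, 9.0) and is dropped; the rest are kept
```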

COBALT is implemented in the NCBI C++ Toolkit. More information on COBALT can be found at: http://bioinformatics.oxfordjournals.org/cgi/content/full/23/9/1073

To access COBALT use this link: http://www.ncbi.nlm.nih.gov/tools/cobalt/cobalt.cgi?link_loc=BlastHomeAd

webMGR: An online tool for the multiple genome rearrangement problem

The algorithm MGR enables the reconstruction of rearrangement phylogenies based on gene or synteny block order in multiple genomes. Although MGR has been successfully applied to study the evolution of different sets of species, its utilization has been hampered by the prohibitive running time for some applications. In the current work, we have designed new heuristics that significantly speed up the tool without compromising its accuracy. Moreover, we have developed a web server (webMGR) that includes elaborate web output to facilitate navigation through the results.


webMGR can be accessed via http://www.gis.a-star.edu.sg/~bourque.

Tablet: Next-generation sequence assembly visualization

Tablet is a lightweight, high-performance graphical viewer for next-generation sequence assemblies and alignments. Supporting a range of input assembly formats, Tablet provides high-quality visualizations showing data in packed or stacked views, allowing instant access and navigation to any region of interest, and whole contig overviews and data summaries. Tablet is both multi-core aware and memory efficient, allowing it to handle assemblies containing millions of reads, even on a 32-bit desktop machine.


Availability: Tablet is freely available for Microsoft Windows, Apple Mac OS X, Linux and Solaris.

Thursday, February 18, 2010

A Drug Discovery Technique In The Battle Against Global Warming


Scientists in Texas are reporting that a technique used in the search for new drugs could also be used in the quest to discover new, environmentally friendly materials for fighting global warming. Such materials could be used to capture the greenhouse gas carbon dioxide from industrial smokestacks and other fixed sources before it enters the biosphere.

The new study appears in ACS' bimonthly journal Energy & Fuels.

Michael Drummond and colleagues Angela Wilson and Tom Cundari note that greener carbon-capture technologies are a crucial component in mitigating climate change. Existing technology is expensive and can generate hazardous waste.

They point out that proteins, however, can catalyze reactions with carbon dioxide, the main greenhouse gas, in an environmentally friendly way. That fact got the scientists interested in evaluating the possibility of using proteins in carbon capture technology.

In the study, they used the pharmacophore concept to probe how the 3-dimensional structure of proteins affects their ability to bind and capture carbon dioxide.

The German chemist and Nobel Laureate Paul Ehrlich, who originated the concept a century ago, defined a pharmacophore as the molecular framework that carries the key features responsible for a drug's activity. The scientists concluded that the approach could point the way to the development of next-generation carbon capture technologies.

Microbe produces biofuel from biomass


U.S. scientists say they have developed a microbe that can produce an advanced biofuel directly from biomass.

Scientists led by Jay Keasling from the U.S. Department of Energy's Joint BioEnergy Institute said using synthetic biology they engineered a strain of Escherichia coli bacteria to produce biodiesel fuel and other chemicals derived from fatty acids.

"The fact that our microbes can produce a diesel fuel directly from biomass with no additional chemical modifications is exciting and important," Keasling said. "Given that the costs of recovering biodiesel are nowhere near the costs required to distill ethanol, we believe our results can significantly contribute to the ultimate goal of producing scalable and cost effective advanced biofuels and renewable chemicals."

The scientists said they are now working on maximizing the efficiency and the speed by which their engineered strain of E. coli can directly convert biomass into biodiesel. They said they are also looking into ways of maximizing the total amount of biodiesel that can be produced from a single fermentation.

The study is reported in the journal Nature.

Retrograde Spin Of Supermassive Black Holes May Control Galaxy Evolution


Every galaxy has a collection of black holes, regions that gobble up matter and energy and can each be up to 10 times the Sun's mass. In addition to these black holes, there is a supermassive black hole embedded in the heart of each galaxy that is roughly one million to one billion times the mass of the Sun. About 10 percent of these giant black holes feature jets of plasma, or highly ionized gas, that extend in opposite directions.

By spewing huge amounts of mostly kinetic energy, or energy created by motion, from the black holes into the universe, the jets control how stars and other bodies form, and play a crucial role in the evolution of clusters of galaxies, the largest structures in the universe.

How the jets form remains one of the most important unsolved mysteries in extragalactic astrophysics, and Dan Evans, a postdoctoral researcher from MIT's Kavli Institute for Astrophysics and Space Research (MKI), may be one step closer to unlocking that mystery.

For two years, Evans has been comparing several dozen galaxies whose black holes host powerful jets (the galaxies are known as radio-loud active galactic nuclei, or AGN) to those galaxies with supermassive black holes that do not eject jets. All black holes - those with and without jets - feature accretion disks, the clumps of dust and gas rotating just outside the event horizon (the region from which nothing, not even light, can escape).

By examining the light reflected in the accretion disk of an AGN black hole, Evans concludes that jets may form right outside black holes that have a retrograde spin, or that are spinning in the opposite direction from their accretion disk, according to a paper appearing in the Feb. 10 issue of the Astrophysical Journal.

How They Did It
Evans and colleagues from the Harvard-Smithsonian Center for Astrophysics, Yale University, Keele University and the University of Hertfordshire in the United Kingdom analyzed spectral data, collected by the Suzaku observatory (a Japanese satellite launched in 2005 with collaboration from NASA), of a supermassive black hole with a jet located about 800 million light-years away in an AGN named 3C 33.

Although we can't see black holes, scientists can infer their size, location, and other properties by using sensitive telescopes to detect the heat they generate, which we see as X-rays.

They can also see signatures of X-ray emission from the inner regions of the accretion disk, which is located close to the edge of a black hole, as a result of a super hot atmospheric ring called a corona that lies above the disk and emits light that an observatory like Suzaku can detect.

In addition to this direct light, a fraction of light passes down from the corona onto the black hole's accretion disk and is reflected from the disk's surface, resulting in a spectral signature pattern called the Compton reflection hump also detected by Suzaku.

But Evans' team never found a Compton reflection hump in the X-ray emission given off by 3C 33, a finding the researchers believe provides crucial evidence that the accretion disk for a black hole with a jet is truncated, meaning that there is nothing to reflect the light from the corona.

They believe the absence may result from retrograde spin, which pushes out the orbit of the innermost portion of accretion material as a result of general relativity, or the gravitational pull between masses. This absence creates a gap between the disk and the center of the black hole that leads to a piling up of magnetic fields, which provide the force to fuel a jet.

Next Steps
According to Evans, this field of research will expand considerably in August 2011 with the planned launch of NASA's Nuclear Spectroscopic Telescope Array (NuSTAR) satellite, which is 10 to 50 times more sensitive to spectra and the Compton reflection hump than current technology.

NuSTAR will help researchers conduct a "giant census" of supermassive black holes that "will absolutely revolutionize the way we look at X-ray spectra of AGN," Evans said.

US Laser "Raygun" Plane Shoots Down Ballistic Missile


Feb 15 - The Airborne Laser Testbed (ALTB) transitioned from science fiction to directed energy fact Feb. 11 when it put a lethal amount of 'light on target' to destroy a boosting ballistic missile with help from a megawatt-class laser developed by Northrop Grumman.

While ballistic missiles like the one ALTB destroyed move at speeds of about 4,000 miles per hour, they are no match for a super-heated, high-energy laser beam racing toward them at 670 million mph. The basketball-sized beam was focused on the foreign military asset, as the missile is called officially, for only a few seconds before a stress fracture developed, causing the target to catastrophically split into multiple pieces.

"This experiment shows the incredible potential for directed energy as a premier element in early or ascent phase missile defense," said Steve Hixson, vice president of Space and Directed Energy Systems for Northrop Grumman's Aerospace Systems sector. "The demonstration leaves no doubt whatsoever about ALTB's unprecedented mobility, precision and lethality," he added. Hixson is a former ALTB program manager for the company.

Northrop Grumman executives said the high-energy Chemical Oxygen Iodine Laser the company provides - the most powerful laser ever developed for an airborne environment - performed reliably once again with other critical capabilities onboard the U.S. Missile Defense Agency's ALTB. This includes the low-power, solid-state Beacon Illuminator Laser for atmospheric compensation, a targeting laser Northrop Grumman also supplies for the ALTB system.

"The continued dependable and consistent performance of both laser systems is the result of our dedicated team and its unwavering commitment to develop game-changing technology for our military forces," said Guy Renard, Northrop Grumman's ALTB program manager. "The impressive progress made by the government and industry team during the last three-and-a-half years could not have culminated any more dramatically than this successful experiment."

The experiment, a proof-of-concept demonstration, was the first directed energy lethal intercept demonstration against a liquid-fuel boosting ballistic missile target from an airborne platform.

Northrop Grumman is under contract to The Boeing Company, ALTB's prime contractor, for the two laser systems. The ALTB is a modified Boeing 747-400F whose back half holds the high-energy laser. The front section of the aircraft contains the beam control/fire control system, developed by Lockheed Martin, and the battle management system, provided by Boeing.

Monday, February 15, 2010

Water Movements Can Shape Fish Evolution


Researchers from the University of Minnesota's Institute of Technology have found that the hydrodynamic environment of fish can shape their physical form and swimming style. The research, available on the Journal of Experimental Biology Web site, was sponsored by the National Science Foundation's National Center for Earth-surface Dynamics.

Catch a glimpse of a fish's body shape, and you can often guess how speedy it is. Tuna and mackerel look as if they should outpace frilly reef fish and eels. But how have all of these diverse body shapes evolved? Have fish bodies been shaped by the hydrodynamics of their environment or did they evolve for other reasons?

Turning to computational fish for answers, professor of Civil Engineering Fotis Sotiropoulos, along with postdoctoral researcher Iman Borazjani, from the university's St. Anthony Falls Laboratory decided to race hybrid and realistic fish in a massively parallel computer cluster to find out what influence the aquatic environment has had on fish shapes and swimming techniques.

But building the computational fish was far from straightforward. "We started this work over five years ago," says Sotiropoulos. "It was a challenge because we had never simulated anything living before."

Borazjani explains that the hydrodynamic forces exerted on swimmers vary enormously depending on their size and speed. Knowing that mackerel and eels swimming in water generate and thus experience different hydrodynamic environments, the duo simulated these different environments by varying tail beat frequencies and fluid viscosity (syrupiness).
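
One standard way to quantify this is the Reynolds number, the ratio of inertial to viscous forces acting on a swimmer. The values below are illustrative assumptions, not numbers from the study:

```python
def reynolds(rho, speed, length, mu):
    """Re = rho * U * L / mu: ratio of inertial to viscous forces."""
    return rho * speed * length / mu

water = dict(rho=1000.0, mu=1.0e-3)  # density (kg/m^3), viscosity (Pa s)
print(f"mackerel-sized: Re = {reynolds(speed=1.0, length=0.3, **water):,.0f}")
print(f"eel-sized:      Re = {reynolds(speed=0.3, length=0.6, **water):,.0f}")
print(f"fish larva:     Re = {reynolds(speed=0.01, length=0.005, **water):,.0f}")
# higher Re means inertia dominates; the same body behaves very differently
# at different sizes and speeds, which the simulations mimicked by varying
# tail-beat frequency and fluid viscosity
```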

Building two computational mackerels (one that beat its tail like a mackerel and a second that wriggled like an eel) and two eels (one that wriggled and another that beat its tail like a mackerel), the engineers set the fish racing from standing starts and noted how they performed.

The results showed clearly that all fish swam more efficiently if they had the body form or swimming style appropriate to the speeds at which they swam. For example, a lamprey that needed to swim faster could gain efficiency, which for a real fish would mean tiring less quickly, if it changed its shape or swimming style to mimic a mackerel.

And a mackerel that had to move slowly would be more efficient if it could change shape or swimming style to mimic a lamprey. This is evidence that a fish's optimal range of swimming speeds generates hydrodynamic forces that influence the shape and swimming style it will evolve.

"From these experiments, we can deduce that real mackerel and eel's swimming styles are perfectly adapted to the hydrodynamic environments that they inhabit," says Sotiropoulos. The method could be adapted to study how a fluid environment molds the evolution of other organisms and to design robots that would swim at different speeds or in water of different viscosities, the researchers say.

Foam Replacing Wax In Aerospace Casting Foundries


Funded in part by Air Force Research Laboratory Manufacturing Technology Small Business Innovation Research contracts, FOPAT Production is producing breakthrough foam patterns for casting foundries and other manufacturers of aerospace components. The advanced patterns will improve casting processes by replacing wax, a known problematic material, with foam.

Estimates indicate the development will generate $5 million in yearly energy savings, as well as $140 million in productivity, material savings, and scrap reduction.

The work also supports goals of ManTech's Advanced Manufacturing Propulsion Initiative, which seeks to transform the Air Force propulsion supplier base in order both to assure industrial capability and capacity in meeting production demands and to accelerate the transition of advanced technology.

Traditional casting processes and designs are severely limited due to the properties of wax, with high scrap rates resulting from shell cracking and material distortion historically plaguing the industry.

Under a 2-year Department of Energy "Inventions and Innovation" program grant and consecutive (Phase I and II) AF SBIR contracts, FOPAT Production created a new material comprising a proprietary mix of components that react and expand to form foam casting patterns with a smooth surface finish and excellent dimensional predictability.

These patterns are temperature-stable, energy-efficient, and cost-effective products. Further, they are easily processed using standard investment-casting techniques.

By eliminating wax pattern making--and, thus, wax melt cycles--the new foam patterning method provides significant warfighter benefits. Among these advantages are improved operational readiness for F135 and F136 aircraft engines and a cost savings of approximately $13 million for the Joint Strike Fighter.

The technology also supports spiral development, a process further enhancing industrial readiness and effectiveness. FOPAT Production has already moved into a 14,000 sq ft facility and is regularly producing foam patterns for casting foundries and component manufacturers.

The Asia-Pacific And Kyoto


President Obama's visit to China before December's Copenhagen conference underlined views that the international strategy to tackle climate change truly hinges on cooperation between the United States and the developing Asian economies. This relationship, as represented in the Asia-Pacific Partnership (APP), is controversial to environmental analysts.

In two papers published in WIREs Climate Change, analysts debate the significance of the APP and its role as an alternative to the Kyoto treaty.

Launched in 2006, the APP is a non-treaty agreement between the United States, Australia, Canada, India, Japan, South Korea and, perhaps most importantly, the People's Republic of China. It is increasingly seen as a viable agreement between the United States and the emerging Asian economies, yet is criticised for not being legally binding.

"[The APP] has been hailed as a new model for an international climate agreement and as an alternative to the Kyoto protocol," said Ros Taplin from Bond University in Australia. "However implementation has had challenges. As an opposing model to Kyoto it is a contravention of the United Nations Framework Convention on Climate Change's (UNFCCC) principle of common, but differentiated responsibilities."

The APP's significant difference to Kyoto with regard to greenhouse gas emissions is that it requires participation by developing nations. This is seen as crucial by both the United States and Australia, who contend that it would be economically untenable for their countries to significantly cut their emissions without all countries taking action.

"The APP is based around public private taskforces organised on a sectoral basis. In legal terms the partnership is a nonbinding, soft law" said Taplin, "It is a contributor to the crumbing of climate governance."

Australia and the United States have also attempted to divert debate away from targets and timetables by adopting this sectoral approach. Some 170 projects have been initiated by taskforces, yet by mid-2009 only seven had been completed.

However, according to Aynsley Kellow from the University of Tasmania, the APP is far from dead and is an improvement over the "failure" of the Kyoto Protocol, providing important lessons for future climate change negotiations.

"[Kyoto] has failed. It failed horizontally to secure commitments from important players and it failed vertically because of the lack of delivery of outcomes to those who did accede to it," said Kellow. "In comparison the APP represents a useful way forward."

The Kyoto Protocol launched what Kellow calls "a rush to targets and timetables", and promised clear reductions in greenhouse gas emissions of around 5% by industrialised nations. However, between 1997 and 2004, emissions from countries that ratified the protocol increased by 21.1%, whereas emissions from the United States increased by 6.6%.

"This is hardly a picture of policy success," said Kellow. "In contrast the APP is a non-binding initiative aimed at fostering technological development and transfer on a sectoral basis and sits alongside the G8+ 5, launched during the Gleneagles summit in 2006."

Expectations for international cooperation post-Copenhagen may now be modest, but, argues Kellow, initiatives such as the APP and G8+5 should be seen as helping rather than hindering these negotiations.

"We should be looking for silver buck-shot rather than a silver bullet in the quest for an adequate response to the risks of anthropogenic climate change." concludes Kellow, "The APP is one piece of shot, but a significant and helpful one nonetheless."

Bad News For Mosquitoes


Yale University researchers have found more than two dozen scent receptors in malaria-transmitting mosquitoes that detect compounds in human sweat, a finding that may help scientists to develop new ways to combat a disease that kills 1 million people annually.

These olfactory receptors in the mosquito Anopheles gambiae offer scientists potential new targets for repelling, confusing or attracting into traps the mosquitoes that spread a disease afflicting up to 500 million people across a broad swath of the world's tropical regions, according to authors of the article published online Feb. 3 in the journal Nature.

"The world desperately needs new ways of controlling these mosquitoes, ways that are effective, inexpensive, and environmentally friendly," said John Carlson, the Eugene Higgins Professor of Molecular, Cellular, and Developmental Biology at Yale and senior author of the study. "Some of these receptors could be excellent targets for controlling mosquito behavior."

While it has long been known that mosquitoes are attracted to human scents, just how the mosquito's olfactory system detects the different chemical elements of human odor has been unknown.

"Mosquitoes find us through their sense of smell, but we know very little about how they do this," Carlson said. "Here in the United States, mosquitoes are a source of annoyance, but in much of the world they're a source of death."

Carlson's lab identified the first insect odor receptors in 1999 in studies of the fruit fly. The Yale team then found an ingenious way to use the fruit fly to study how the mosquito olfactory system works: They used mutant flies that were missing an odor receptor. Under the leadership of Allison Carey, an M.D./Ph.D. candidate in Carlson's lab and lead author of the study, the researchers systematically activated genes of 72 mosquito odor receptors in fruit fly olfactory cells that lacked their own receptors. The engineered flies were then exposed to a battery of scent compounds, and the responses conferred by each receptor were analyzed. Over the course of the project, Carey recorded 27,000 electrical responses in the genetically engineered fly/mosquito olfactory system to the library of scents.

Particularly strong responses were recorded from 27 receptors - and most of these receptors responded to chemical compounds found in human sweat.

"We're now screening for compounds that interact with these receptors," Carlson said. "Compounds that jam these receptors could impair the ability of mosquitoes to find us. Compounds that excite some of these receptors could help lure mosquitoes into traps or repel them. The best lures or repellents may be cocktails of multiple compounds."

Carey says that more knowledge about mosquito behavior and odor reception will help develop more effective traps and repellents.

Friday, February 12, 2010

NASA to launch solar observatory


NASA made final preparations Tuesday for the launch of its Solar Dynamics Observatory (SDO) which promises to give space scientists the most detailed glimpse ever of our sun and its complicated workings.

The observatory and its Atlas V rocket were rolled out to the launchpad at Florida's Cape Canaveral base for a 10:26 am (1526 GMT) Wednesday liftoff on a five-year mission that scientists said would help unravel the mysteries of how the sun's magnetic field affects the rest of our solar system.

US scientists, who have targeted the sun as the next frontier for space research, said they hope the probe will be especially helpful in revealing how changes in the sun alter the levels of radiation and energy within our solar system.

Those changes, which scientists call space weather, can affect communications and satellite signals, electrical power lines, and other objects and energy transmissions in our atmosphere and beyond.

NASA said that there was a 40 percent chance that cloudy and windy conditions, as well as threatening showers, could cause a delay in Wednesday's liftoff.

Telescopes and other gear onboard the probe will scrutinize sunspots and solar flares using more pixels and colors than any other observatory in the history of solar physics.

"SDO is the cornerstone, the foundation mission for the next decade of solar research," said Richard Fisher, heliophysics division director at NASA.

The SDO will take continuous high definition images of the sun, transmitting the data back to Earth for processing.

Dean Pesnell, SDO project scientist at NASA's Goddard Space Flight Center in the Washington, D.C., suburb of Greenbelt, Maryland, said the findings gleaned from the research are potentially groundbreaking.

"The sun affects our life more and more as we come to depend more and more on technology," Pesnell said.

"Most of the effects come from the ever-changing magnetic field of the sun. SDO is designed to study that magnetic field from its creation to its destruction and how it then can affect the Earth."

SDO will orbit Earth once every 24 hours, sending solar scientists a continuous stream of high-resolution data throughout its five-year mission.

NASA said the spacecraft will send about 1.5 terabytes of data back to Earth each day -- the equivalent of streaming 380 full-length movies.

The data SDO collects will give scientists the most detailed look yet at the sun and its inner workings. It is hoped that this new information will give them insights into the mysteries of the solar cycle that can better protect us from the effects of space weather.

US space physicists said that if they can get a better understanding of the sun's magnetic field, they can predict how it affects the solar system and near-Earth space to create space weather.

NASA said that among the questions that the researchers hope to answer is how the sun's magnetic field is generated and how stored magnetic energy changes into kinetic energy, in the form of the solar wind and energetic particles.

Magnetism's Role In Superconductors


Neutron scattering experiments performed at the Department of Energy's Oak Ridge National Laboratory give strong evidence that, if superconductivity is related to a material's magnetic properties, the same mechanisms are behind both copper-based high-temperature superconductors and the newly discovered iron-based superconductors.

The work, published in a recent issue of Nature Physics, was performed at ORNL's Spallation Neutron Source (SNS) and High Flux Isotope Reactor (HFIR), along with the ISIS Facility at the United Kingdom's Rutherford Appleton Laboratory.

High-temperature superconducting materials, in which a material conducts electricity without resistance at a relatively high temperature, have potential for application in energy-efficient technologies where little electricity is lost in transmission.

The research community was stirred in 2008 when a Japanese team reported high-temperature superconductivity in an iron-based material. Previously, only copper-based, or cuprate, materials were known to have those properties. The discovery elicited widespread and intense analysis of the material's structure and properties.

"The pairing up of electrons is essential for the formation of the macroscopic quantum state giving rise to superconductivity," said lead researcher Mark Lumsden of ORNL. "One of the leading proposals for the pairing mechanism in the iron-based superconductors is that magnetic interactions, provide the glue that binds the electrons together."

Detailed studies of the magnetic excitations of materials are essential for understanding high-temperature superconductivity. Although superconductivity near absolute zero is common, only certain materials exhibit the property at "high" (i.e., the temperature of liquid nitrogen) temperatures.

The ORNL researchers subjected single crystals of an iron, tellurium and selenium material to neutron scattering analysis at the SNS, HFIR and ISIS.

"Even in comparison to cuprates, this is experimentally the best indication of what the spin excitations are doing. One of the prominent views is that spin excitations are a key ingredient. The first step in evaluating this proposal is understanding what the spin excitations are and what they are doing," Lumsden said.

Neutron scattering analysis is considered to be among the most powerful techniques to understand the molecular structure and interactions in advanced materials.

The newly available pulsed-neutron source intensity at the accelerator-based SNS, combined with steady-state neutron beams from the recently upgraded HFIR, give researchers at Oak Ridge valuable new tools for understanding these properties.

"Neutron scattering is the only way to study the full wave vector and energy dependence of the spin excitations that are believed to be behind these superconducting properties," said co-author Andrew Christianson, who with Lumsden is a member of ORNL's Neutron Scattering Science Division.

The researchers performed time-of-flight neutron scattering measurements with the Wide Angular Range Chopper Spectrometer at the SNS, which opened in 2006 and is currently the world's most powerful pulsed-beam neutron source.

Time-of-flight experiments were also conducted in England with the Merlin chopper spectrometer at ISIS, another leading neutron source. Triple-axis neutron scattering experiments were performed at HFIR in Oak Ridge. The single crystals were synthesized and characterized by ORNL's Correlated Electron Materials group.

Using Supercomputers To Explore Nuclear Energy


Ever wanted to see a nuclear reactor core in action? A new computer algorithm developed by researchers at the U.S. Department of Energy's (DOE) Argonne National Laboratory allows scientists to view nuclear fission in much finer detail than ever before.

A team of nuclear engineers and computer scientists at Argonne National Laboratory is developing the neutron transport code UNIC, which enables researchers for the first time to obtain a highly detailed description of a nuclear reactor core.

The code could prove crucial in the development of nuclear reactors that are safe, affordable and environmentally friendly. Modeling the complex geometry of a reactor core requires billions of spatial elements, hundreds of angles and thousands of energy groups, all of which lead to problem sizes with quadrillions of possible solutions.
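
The arithmetic behind that claim is worth seeing once; the counts below are illustrative, not an actual UNIC configuration:

```python
# illustrative counts, matching the orders of magnitude quoted above
spatial_elements = 2_000_000_000          # "billions"
angles = 500                              # "hundreds"
energy_groups = 5_000                     # "thousands"

unknowns = spatial_elements * angles * energy_groups
print(f"{unknowns:.1e} unknowns")         # 5.0e+15: quadrillions

# storing a single double-precision copy of that solution vector:
print(f"{unknowns * 8 / 1e15:.0f} PB")    # ~40 petabytes, hence approximations
```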

Such calculations exhaust computer memory of the largest machines, and therefore reactor modeling codes typically rely on various approximations. But approximations limit the predictive capability of computer simulations and leave considerable uncertainty in crucial reactor design and operational parameters.

"The UNIC code is intended to reduce the uncertainties and biases in reactor design calculations by progressively replacing existing multilevel averaging techniques with more direct solution methods based on explicit reactor geometries," said Andrew Siegel, a computational scientist at Argonne and leader of Argonne's reactor simulation group.

UNIC has run successfully at DOE leadership computing facilities, home to some of the world's fastest supercomputers, including the energy-efficient IBM Blue Gene/P at Argonne and the Cray XT5 at Oak Ridge National Laboratory. Although still under development, the code has already produced new scientific results.

In particular, the Argonne team has carried out highly detailed simulations of the Zero Power Reactor experiments on up to 163,840 processor cores of the Blue Gene/P and 222,912 processor cores of the Cray XT5, as well as on 294,912 processor cores of a Blue Gene/P at the Jülich Supercomputing Centre in Germany. With UNIC, the researchers have successfully represented the details of the full reactor geometry for the first time and have been able to compare the results directly with the experimental data.

Argonne's UNIC code provides a powerful new tool for designers of safe, environmentally friendly nuclear reactors - a key component of our nation's current and future energy needs. By integrating innovative design features with state-of-the-art numerical solvers, UNIC allows researchers not only to better understand the behavior of existing reactor systems but also to predict the behavior of many of the newly proposed systems having untested design characteristics.

Development of the UNIC code is funded principally by DOE's Office of Nuclear Energy through the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. The Argonne UNIC project is a key part of the NEAMS efforts to replace the traditional "test-based" approach to nuclear systems design with a new "science-based" approach in which advanced modeling and simulation play a dominant role.

Thursday, February 11, 2010


Biologists at the University of Pennsylvania studying the processes of evolution appear to have resolved a longstanding conundrum: How can organisms be robust against the effects of mutations yet simultaneously adaptable when the environment changes?

The short answer, according to University of Pennsylvania biologist Joshua B. Plotkin, is that these two requirements are often not contradictory and that an optimal level of robustness maintains the phenotype in one environment but also allows adaptation to environmental change.

Using an original mathematical model, researchers demonstrated that mutational robustness can either impede or facilitate adaptation depending on the population size, the mutation rate and a measure of the reproductive capabilities of a variety of genotypes, called the fitness landscape.

The results provide a quantitative understanding of the relationship between robustness and evolvability, clarify a significant ambiguity in evolutionary theory and should help illuminate outstanding problems in molecular and experimental evolution, evolutionary development and protein engineering.

The key insight behind this counterintuitive finding is that neutral mutations can set the stage for future, beneficial adaptation. Specifically, researchers found that more robust populations are faster to adapt when the effects of neutral and beneficial mutations are intertwined. Neutral mutations do not impact the phenotype, but they may influence the effects of subsequent mutations in beneficial ways.

To quantify this idea, the study's authors created a general mathematical model of gene interactions and their effects on an organism's phenotype. When the researchers analyzed the model, they found that populations with intermediate levels of robustness were the fastest to adapt to novel environments. These adaptable populations balanced genetic diversity and the rate of phenotypically penetrant mutations to optimally explore the range of possible phenotypes.

The researchers also used computer simulations to check their result under many alternative versions of the basic model. Although there is not yet sufficient data to test these theoretical results in nature, the authors simulated the evolution of RNA molecules, confirming that their theoretical results could predict the rate of adaptation.
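
The intuition can be reproduced in a toy simulation (my own construction, not the paper's model): if adaptation needs a neutral, potentiating mutation before a phenotype-changing one, the supply of the first step scales with robustness r and that of the second with 1 - r, so intermediate robustness should adapt fastest.

```python
import random

def generations_to_adapt(r, pop=200, mu=0.05, rng=random, limit=100_000):
    """r = fraction of mutations that are neutral (robustness).
    States: 0 wild type, 1 potentiated (neutral change), 2 adapted."""
    state = [0] * pop
    for gen in range(1, limit):
        for i in range(pop):
            if rng.random() < mu:          # this individual mutates
                if rng.random() < r:       # neutral mutation
                    if state[i] == 0:
                        state[i] = 1       # potentiating, phenotype unchanged
                elif state[i] == 1:
                    return gen             # penetrant + potentiated = adapted
                # penetrant mutations on the wild type are deleterious: the
                # carrier is replaced by a wild-type offspring (state stays 0)
    return limit

random.seed(7)
for r in (0.1, 0.5, 0.9):
    runs = [generations_to_adapt(r) for _ in range(200)]
    print(f"robustness {r}: mean {sum(runs) / len(runs):.1f} generations")
# in this toy, intermediate robustness (r = 0.5) tends to adapt fastest
```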

"Biologists have long wondered how can organisms be robust and also adaptable," said Plotkin, assistant professor in the Department of Biology in Penn's School of Arts and Sciences. "After all, robust things don't change, so how can an organism be robust against mutation but also be prepared to adapt when the environment changes? It has always seemed like an enigma."

Robustness is a measure of how genetic mutations affect an organism's phenotype, or the set of physical traits, behaviors and features shaped by evolution. It would seem to be the opposite of evolvability, preventing a population from adapting to environmental change. In a robust individual, mutations are mostly neutral, meaning they have little effect on the phenotype.

Since adaptation requires mutations with beneficial phenotypic effects, robust populations seem to be at a disadvantage. The Penn-led research team has demonstrated that this intuition is sometimes wrong.

Novel Studies Of Decomposition Shed New Light On Our Earliest Fossil Ancestry


Decaying corpses are usually the domain of forensic scientists, but palaeontologists have discovered that studying rotting fish sheds new light on our earliest ancestry.

The researchers, from the Department of Geology at the University of Leicester, devised a new method for extracting information from 500-million-year-old fossils: they studied the way fish decompose to gain a clearer picture of how our ancient fish-like ancestors would have looked.

Their results, published Sunday Jan 31, indicate that some of the earliest fossils from our part of the tree of life may have been more complex than has previously been thought.

Dr Rob Sansom, lead author of the paper, explains: "Interpreting fossils is in some ways similar to forensic analysis - we gather all the available clues to put together a scientific reconstruction of something that happened in the past. Unlike forensics, however, we are dealing with life from millions of years ago, and we are less interested in understanding the cause or the time of death.

"What we want to get at is what an animal was like before it died and, as with forensic analysis, knowing how the decomposition that took place after death altered the body provides important clues to its original anatomy."

This is something that palaeontologists sometimes overlook, according to Sansom, "probably because spending hundreds of hours studying the stinking carcasses of rotting fish is not something that appeals to everyone." But the rewards are worth the discomfort.

Fish-like fossils from half a billion years ago are recognised as being part of our evolutionary history because they possess characteristic anatomical features, such as a tail, eyes and the precursor of a backbone. Sansom continues: "It seems contradictory, but decomposition is an important part of the process by which animals become preserved and fossilized, so by knowing how these important anatomical features change as they rot, we are better able to correctly interpret the most ancient fossils representing the lowest branches of our part of the evolutionary tree."

"These fossils provide our only direct record of when and how our earliest vertebrate ancestors evolved" adds Dr Mark Purnell, one of the leaders of the study.

"Did they appear suddenly, in an evolutionary explosion of complexity, or gradually over millions of years? What did they look like? - in what ways did they differ from their worm-like relatives and how did this set the stage for later evolutionary events? Answers to these fundamental questions - the how, when and why of our own origins - remain elusive because reading the earliest vertebrate fossil record is difficult."

The scarcity of branches in this part of the evolutionary tree could reflect rapid, explosive evolution or the simple fact that, because they lacked bones or teeth, the earliest vertebrates left few fossils.

This is the area in which Dr Sarah Gabbott, who with Purnell conceived the Leicester study, is an expert: "Only in the most exceptional circumstances do soft-tissues, such as eyes, muscles and guts, become fossilized, yet it is precisely such remains that we rely on for understanding our earliest evolutionary relatives: half-a-billion years ago it's pretty much all our ancestors had."

Ecologists Discover Forests Are Growing Faster


Speed is not a word typically associated with trees; they can take centuries to grow. However, a new study to be published the week of Feb. 1 in the Proceedings of the National Academy of Sciences has found evidence that forests in the Eastern United States are growing faster than they have in the past 225 years. The study offers a rare look at how an ecosystem is responding to climate change.

For more than 20 years forest ecologist Geoffrey Parker has tracked the growth of 55 stands of mixed hardwood forest plots in Maryland. The plots range in size, and some are as large as 2 acres. Parker's research is based at the Smithsonian Environmental Research Center (SERC), 26 miles east of the nation's capital.

Parker's tree censuses have revealed that the forest is packing on weight at a much faster rate than expected. He and Smithsonian Tropical Research Institute postdoctoral fellow Sean McMahon discovered that, on average, the forest is growing an additional 2 tons per acre annually. That is the equivalent of a tree with a diameter of 2 feet sprouting up over a year.

Forests and their soils store the majority of the Earth's terrestrial carbon stock. Small changes in their growth rate can have significant ramifications for weather patterns, nutrient cycles, climate change and biodiversity. Exactly how these systems will be affected remains to be studied.

Parker and McMahon's paper focuses on the drivers of the accelerated tree growth. The chief culprit appears to be climate change: more specifically, rising levels of atmospheric CO2, higher temperatures and longer growing seasons.

Assessing how a forest is changing is no easy task. Forest ecologists know that the trees they study will most likely outlive them. One way they compensate for this is by creating a "chronosequence": a series of forest plots of the same type that are at different developmental stages.

At SERC, Parker meticulously tracks the growth of trees in stands that range from 5 to 225 years old. This allowed Parker and McMahon to verify that there was accelerated growth in forest stands young and old. More than 90% of the stands grew two to four times faster than predicted from the baseline chronosequence.
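To make "predicted from the baseline chronosequence" concrete, here is a minimal sketch in Python of how such a baseline could be built and applied. The stand ages, growth figures and the simple curve fit below are illustrative assumptions, not the study's actual data or method.

import numpy as np

# Hypothetical chronosequence: (stand age in years, annual biomass gain in tons/acre).
baseline_stands = [(5, 0.4), (20, 1.1), (60, 1.6), (120, 1.3), (225, 0.9)]
ages = np.array([age for age, _ in baseline_stands], dtype=float)
gains = np.array([gain for _, gain in baseline_stands])

# Fit a simple quadratic in log(age) as the expected growth-by-age baseline.
coeffs = np.polyfit(np.log(ages), gains, deg=2)

def expected_gain(age):
    """Baseline annual biomass gain (tons/acre) for a stand of a given age."""
    return np.polyval(coeffs, np.log(age))

# Compare one observed stand against the baseline.
observed_age, observed_gain = 80, 3.1  # hypothetical census result
ratio = observed_gain / expected_gain(observed_age)
print(f"Stand is growing {ratio:.1f}x the chronosequence prediction")

Repeating this kind of comparison across all the stands is, in the study's terms, what yields the "two to four times faster than predicted" figure.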

By grouping the forest stands by age, McMahon and Parker were also able to determine that the faster growth is a recent phenomenon. If the forest stands had been growing this quickly their entire lives, they would be much larger than they are.

Parker estimates that he, his colleague Dawn Miller and a cadre of citizen scientists have taken a quarter of a million measurements over the years. Parker began his tree census work on Sept. 8, 1987, his first day on the job. He measures all trees that are 2 centimeters or more in diameter, identifies the species, marks each tree's coordinates and notes whether it is dead or alive.

By knowing the species and diameter, McMahon is able to calculate the biomass of a tree. He specializes in the data-analysis side of forest ecology. "Walking in the woods helps, but so does looking at the numbers," said McMahon. He analyzed Parker's tree censuses but was hungry for more data.
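As an illustration of that species-plus-diameter step, here is a minimal sketch using a generic allometric power law, biomass = a × diameter^b. The coefficients below are hypothetical placeholders; in practice, published species-specific allometric equations would be used.

# Hypothetical allometric coefficients (a, b) for biomass_kg = a * dbh_cm ** b,
# where dbh is the trunk diameter at breast height in centimeters.
ALLOMETRY = {
    "Liriodendron tulipifera": (0.08, 2.45),  # tulip poplar (illustrative values)
    "Quercus alba": (0.06, 2.55),             # white oak (illustrative values)
}

def tree_biomass_kg(species: str, dbh_cm: float) -> float:
    """Estimate a tree's above-ground biomass from its species and diameter."""
    a, b = ALLOMETRY[species]
    return a * dbh_cm ** b

# A single census record: a white oak roughly 2 feet (61 cm) in diameter.
print(f"{tree_biomass_kg('Quercus alba', 61.0):.0f} kg")

Summing such estimates over every measured tree in a plot, census after census, is what turns the diameter records into the per-acre biomass figures discussed above.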

It was not enough to document the faster growth rate; Parker and McMahon wanted to know why it might be happening. "We made a list of reasons these forests could be growing faster and then ruled half of them out," said Parker. The ones that remained included increased temperature, a longer growing season and increased levels of atmospheric CO2.

During the past 22 years CO2 levels at SERC have risen 12%, the mean temperature has increased by nearly three-tenths of a degree and the growing season has lengthened by 7.8 days. The trees now have more CO2 and an extra week to put on weight. Parker and McMahon suggest that a combination of these three factors has caused the forest's accelerated biomass gain.

Ecosystem responses are one of the major uncertainties in predicting the effects of climate change. Parker thinks there is every reason to believe his study sites are representative of the Eastern deciduous forest, the regional ecosystem that surrounds many of the population centers on the East Coast.

He and McMahon hope other forest ecologists will examine data from their own tree censuses to help determine how widespread the phenomenon is.

Spiders snare water from the air


Fog-catching nets which provide precious water in rain-starved parts of the world may be poised for a high-tech upgrade thanks to the spider.

In a paper published in the journal Nature, Chinese scientists report on why spider silk, famous for its strength, is also terrific at collecting water from the air, sparing the creature a hunt for a drink.

The secret, revealed by scanning electron microscope, lies in the silk's tail-shaped protein fibres, which change structure in response to water.

Once in contact with humidity, tiny sections of the thread scrunch up into knots, whose randomly arranged nano-fibres provide a rough, knobbly texture.

In between these "spindle knots" are joints, which are smooth and slender, comprising neatly aligned fibres.

Small droplets then condense randomly on the spider's web. Once they reach a critical size, the droplets slide along the slick-surfaced joints thanks to surface tension.

The droplets then reach the spindle knots, where they coalesce with larger drops.

As a result, the joints are freed up to begin a new cycle of condensation and water collection.
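A back-of-the-envelope way to see the driving force (a standard argument for droplets on fibres of varying radius, not taken from the paper itself): the Laplace pressure inside a droplet scales inversely with its local radius of curvature, so

\[
\Delta P \;\approx\; 2\gamma\left(\frac{1}{r_{\text{joint}}} - \frac{1}{r_{\text{knot}}}\right) > 0,
\]

where \(\gamma\) is the surface tension of water and \(r_{\text{joint}} < r_{\text{knot}}\) are the droplet's radii of curvature over the slender joints and the fatter spindle knots. The tighter curvature over a joint means higher internal pressure there, so water is squeezed toward the knots, and the knots' rough, more wettable texture pulls in the same direction.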

The researchers, led by Lei Jiang of the Chinese Academy of Sciences in Beijing, looked at the silk made by the cribellate spider (Uloborus walckenaerius), which uses a little comb, or cribellum, to separate fibres and spin them into various kinds of thread.

After making their observations, they fabricated fibres aimed at replicating the silk's microscopic structure.

"Our artificial spider silk not only mimics the structure of wet-rebuilt spider silk but also its directional water collection capability," they claim.

The breakthrough will aid the development of man-made fibres for water collection, and could also be used in manufacturing processes to snare airborne droplets, they believe.

Fog collection entails stretching out nets or canvas on poles and using the mesh to catch moisture from the breeze. The runoff is collected in a pipe or a trough on the ground.

The technique, pioneered in the coastal Andes, is being encouraged in poor, dry parts of the world, such as Nepal. It is also being promoted by charities as a useful tool to offset water stress caused by global warming.

Tuesday, February 9, 2010

Turbine light concept uses wind to light highways


Ingenious, eco-friendly concepts are all around us; there's no denying that. This one caught our eye because it's pretty innovative, seemingly well thought out, and good looking to boot. The Turbine Light concept (which is going to be a part of the upcoming Greener Gadgets Conference in New York City at the end of this month) harnesses the power of the wind from cars rushing past to light up the ever-darkening roadways. The turbines use the collected wind to generate energy for the lighting, and while the concept lacks a lot of firm details so far, we're sure to find out more about it soon; we'll be sure to check it out at the conference on February 25th.

Bioactive Nanomaterial Helps Build New Cartilage


Northwestern University researchers are the first to design a bioactive nanomaterial that promotes the growth of new cartilage in vivo and without the use of expensive growth factors. Minimally invasive, the therapy activates the bone marrow stem cells and produces natural cartilage. No conventional therapy can do this.

The results will be published online by the Proceedings of the National Academy of Sciences (PNAS).

"Unlike bone, cartilage does not grow back, and therefore clinical strategies to regenerate this tissue are of great interest," said Samuel I. Stupp, senior author, Board of Trustees Professor of Chemistry, Materials Science and Engineering, and Medicine, and director of the Institute for BioNanotechnology in Medicine.

Countless people - amateur athletes, professional athletes and people whose joints have just worn out - learn this all too well when they bring their bad knees, shoulders and elbows to an orthopaedic surgeon.

Damaged cartilage can lead to joint pain and loss of physical function and eventually to osteoarthritis, a disorder with an estimated economic impact approaching $65 billion in the United States. With an aging and increasingly active population, this figure is expected to grow.

"Cartilage does not regenerate in adults. Once you are fully grown you have all the cartilage you'll ever have," said first author Ramille N. Shah, assistant professor of materials science and engineering at the McCormick School of Engineering and Applied Science and assistant professor of orthopaedic surgery at the Feinberg School of Medicine. Shah is also a resident faculty member at the Institute for BioNanotechnology in Medicine.

Type II collagen is the major protein in articular cartilage, the smooth, white connective tissue that covers the ends of bones where they come together to form joints.

"Our material of nanoscopic fibers stimulates stem cells present in bone marrow to produce cartilage containing type II collagen and repair the damaged joint," Shah said. "A procedure called microfracture is the most common technique currently used by doctors, but it tends to produce a cartilage having predominantly type I collagen which is more like scar tissue."

The Northwestern gel is injected as a liquid into the area of the damaged joint, where it then self-assembles and forms a solid. This extracellular matrix, which mimics what cells usually see, is designed at the molecular level to bind one of the most important growth factors for the repair and regeneration of cartilage. By keeping the growth factor concentrated and localized, the gel gives cartilage cells the opportunity to regenerate.

Together with Nirav A. Shah, a sports medicine orthopaedic surgeon and former orthopaedic resident at Northwestern, the researchers implanted their nanofiber gel in an animal model with cartilage defects.

The animals were treated with microfracture, where tiny holes are made in the bone beneath the damaged cartilage to create a new blood supply to stimulate the growth of new cartilage.

The researchers tested various combinations: microfracture alone; microfracture and the nanofiber gel with growth factor added; and microfracture and the nanofiber gel without growth factor added.

They found their technique produced much better results than the microfracture procedure alone and, more importantly, found that addition of the expensive growth factor was not required to get the best results. Instead, because of the molecular design of the gel material, growth factor already present in the body is enough to regenerate cartilage.

The matrix only needed to be present for a month to produce cartilage growth. The matrix, based on self-assembling molecules known as peptide amphiphiles, biodegrades into nutrients and is replaced by natural cartilage.

The PNAS paper is titled "Supramolecular Design of Self-assembling Nanofibers for Cartilage Regeneration." In addition to Stupp, Ramille Shah and Nirav Shah, other authors of the paper are Marc M. Del Rosario Lim, Caleb Hsieh and Gordon Nuber, all from Northwestern.