Sunday, December 11, 2011
Various approaches to constructing AI
Labels:
Computer Science,
Engineering,
Nanotech,
Neuroscience
Friday, June 24, 2011
Smarter car algorithm shows radio interference risk
An experiment at the Massachusetts Institute of Technology has highlighted some of the hidden risks inherent in (supposedly) smart cars that will depend on radio-based Intelligent Transport Systems (ITS) for extra safety on the road.
In an ITS system, in-car computers communicate with each other over vehicle-to-vehicle (V2V) microwave radio links, while the cars also communicate with traffic lights and roadside speed sensors over a vehicle-to-infrastructure (V2I) radio signalling system (the infrastructure transmits information about cars that are too old to have ITS systems fitted). When two cars are approaching a junction and the V2V/V2I speed signals suggest they are going to crash, a warning can be sounded or a software algorithm can choose to make one of the cars brake, for instance.
I tried this out on the Millbrook test track in Bedfordshire, UK, in 2007: as I sped towards a junction in a Saab, my brakes were automatically applied to allow a speeding Opel to pass in front of me. It was by turns scary and impressive. But if it hadn't worked, I'd have been toast.
But MIT engineer Domitilla Del Vecchio says such systems can be over-protective, taking braking action when there is no real threat. "It's tempting to treat every vehicle on the road as an agent that's playing against you," she says in an MIT research brief issued today.
So she and researcher Rajeev Verma set out to design an algorithm that doesn't over-react - and to test it with model vehicles in a lab. Their trick was simple: calculate not speed but acceleration and deceleration as cars approach a junction, allowing a much finer calculation of the risk. In 97 out of 100 circuits, the collision avoidance technology worked fine.
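The approach can be illustrated with a short sketch (a simplified illustration under a constant-acceleration assumption, not the MIT researchers' actual algorithm): predict each car's arrival window at the junction from its distance, speed and measured acceleration, and intervene only if the windows overlap.

# Simplified sketch in Python, not the MIT algorithm: predict when each car
# occupies the junction using constant-acceleration kinematics, and flag a
# conflict only if the two occupancy windows overlap.
def arrival_window(distance_m, speed_mps, accel_mps2, occupancy_s=1.0):
    """Return (entry, exit) times at the junction, or None if the car
    stops before reaching it (constant acceleration assumed)."""
    if abs(accel_mps2) < 1e-9:
        t = distance_m / speed_mps
    else:
        disc = speed_mps ** 2 + 2.0 * accel_mps2 * distance_m
        if disc < 0:
            return None                        # decelerates to a halt first
        t = (-speed_mps + disc ** 0.5) / accel_mps2
    return (t, t + occupancy_s)

def conflict(w1, w2):
    return w1 is not None and w2 is not None and w1[0] < w2[1] and w2[0] < w1[1]

# A coasting car and a braking car: speed alone (50 m at 18 m/s, about 2.8 s)
# would suggest a clash with the first car's 3.3-4.3 s window, but the measured
# deceleration pushes the second car's arrival late enough that no braking is needed.
car_a = arrival_window(50.0, 15.0, 0.0)      # roughly (3.3 s, 4.3 s)
car_b = arrival_window(50.0, 18.0, -3.0)     # roughly (4.4 s, 5.4 s)
print("brake" if conflict(car_a, car_b) else "no action")   # -> no action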
But in three cases, there were two near-misses and one collision. The reason? Nothing to do with the algorithm: it was due to delays in V2V and V2I radio communication. This highlights the risk of depending upon a complex safety system like ITS - especially a radio-based one which could easily be jammed or electromagnetically interfered with because of the wireless technologies which proliferate in our built environment.
There is only so much that researchers can do against a phenomenon as difficult to predict as radio interference.
The take-home message? ITS technology will doubtless do much to improve road safety - but sometimes it won't. It's never going to substitute for driver alertness.
Source New Scientist
Wednesday, June 22, 2011
University of Minnesota engineering researchers discover source for generating 'green' electricity
University of Minnesota engineering researchers in the College of Science and Engineering have recently discovered a new alloy material that converts heat directly into electricity. This revolutionary energy conversion method is in the early stages of development, but it could have wide-sweeping impact on creating environmentally friendly electricity from waste heat sources.
Researchers say the material could potentially be used to capture waste heat from a car's exhaust that would heat the material and produce electricity for charging the battery in a hybrid car. Other possible future uses include capturing rejected heat from industrial and power plants or temperature differences in the ocean to create electricity. The research team is looking into possible commercialization of the technology.
"This research is very promising because it presents an entirely new method for energy conversion that's never been done before," said University of Minnesota aerospace engineering and mechanics professor Richard James, who led the research team."It's also the ultimate 'green' way to create electricity because it uses waste heat to create electricity with no carbon dioxide."
To create the material, the research team combined elements at the atomic level to create a new multiferroic alloy, Ni45Co5Mn40Sn10. Multiferroic materials combine unusual elastic, magnetic and electric properties. The alloy Ni45Co5Mn40Sn10 achieves multiferroism by undergoing a highly reversible phase transformation where one solid turns into another solid. During this phase transformation the alloy undergoes changes in its magnetic properties that are exploited in the energy conversion device.
During a small-scale demonstration in a University of Minnesota lab, the new material created by the researchers begins as a non-magnetic material, then suddenly becomes strongly magnetic when the temperature is raised a small amount. When this happens, the material absorbs heat and spontaneously produces electricity in a surrounding coil. Some of this heat energy is lost in a process called hysteresis. A critical discovery of the team is a systematic way to minimize hysteresis in phase transformations. The team's research was recently published in the first issue of the new scientific journal Advanced Energy Materials.
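The induction step itself follows Faraday's law: a sudden change of magnetic flux through a coil produces a voltage proportional to its rate of change. The sketch below uses made-up illustrative numbers (the turns, coil area, flux jump and transformation time are assumptions, not values from the Minnesota experiment):

# Back-of-envelope sketch of the induction step (Faraday's law) with assumed,
# illustrative numbers -- not values measured by the Minnesota group.
N_TURNS = 2000        # assumed number of turns in the surrounding coil
AREA_M2 = 1.0e-4      # assumed coil cross-section, 1 cm^2
DELTA_B_T = 0.5       # assumed jump in flux density as the alloy turns magnetic, tesla
DELTA_T_S = 0.1       # assumed duration of the phase transformation, seconds

delta_flux = DELTA_B_T * AREA_M2                 # change in flux per turn, webers
emf_volts = N_TURNS * delta_flux / DELTA_T_S     # average induced voltage
print(f"average induced EMF ~ {emf_volts:.1f} V")   # -> ~1.0 V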
Watch a short research video of the new material suddenly become magnetic when heated: http://z.umn.edu/conversionvideo.
In addition to Professor James, other members of the research team include University of Minnesota aerospace engineering and mechanics post-doctoral researchers Vijay Srivastava and Kanwal Bhatti, and Ph.D. student Yintao Song. The team is also working with University of Minnesota chemical engineering and materials science professor Christopher Leighton to create a thin film of the material that could be used, for example, to convert some of the waste heat from computers into electricity.
"This research crosses all boundaries of science and engineering," James said. "It includes engineering, physics, materials, chemistry, mathematics and more. It has required all of us within the university's College of Science and Engineering to work together to think in new ways."
Source EurekaAlert!
Tuesday, June 21, 2011
Self-assembling Electronic Nano-components
Magnetic storage media such as hard drives have revolutionized the handling of information: we are used to dealing with huge quantities of magnetically stored data while relying on highly sensitive electronic components, and we hope to further increase data capacities through ever smaller components. Together with experts from Grenoble and Strasbourg, researchers of KIT’s Institute of Nanotechnology (INT) have developed a nano-component based on a mechanism observed in nature.
“Self-organization” of nano-devices: Magnetic molecules (green) arrange on a carbon nanotube (black) to build an electronic component.
What if the very tininess of a component prevented one from designing the tools necessary for its manufacture? One possibility could be to “teach” the individual parts to assemble themselves into the desired product. For the fabrication of an electronic nano-device, a team of INT researchers headed by Mario Ruben adopted a trick from nature: synthetic adhesives were applied to magnetic molecules in such a way that the molecules docked on to the proper positions on a nanotube without any outside intervention. In nature, green leaves grow through a similar self-organizing process, without any external direction. Adopting such principles for the manufacture of electronic components is a genuine paradigm shift.
The nano-switch was developed by a European team of scientists from the Centre National de la Recherche Scientifique (CNRS) in Grenoble, the Institut de Physique et Chimie des Matériaux at the University of Strasbourg, and KIT’s INT. One of the invention’s particular features is that, unlike conventional electronic components, the new component does not consist of materials such as metals, alloys or oxides, but entirely of soft materials such as carbon nanotubes and molecules.
Terbium, the only magnetic metal atom used in the device, is embedded in organic material and reacts highly sensitively to external magnetic fields. Information on how this atom aligns along such fields is efficiently passed on to the current flowing through the nanotube. The Grenoble CNRS research group headed by Dr. Wolfgang Wernsdorfer succeeded in electrically reading out the magnetism in the environment of the nano-component. The demonstrated possibility of electrically addressing single magnetic molecules opens a completely new world to spintronics, in which memory, logic and possibly quantum logic may be integrated.
The function of the spintronic nano-device is described in the July issue of Nature Materials (DOI: 10.1038/Nmat3050) for low temperatures of approximately one kelvin, about -272 degrees Celsius. The team is working to raise the component’s operating temperature in the near future.
Source KIT
Labels:
AI,
Computer Science,
Engineering,
Nanotech
Monday, June 20, 2011
Improving LED lighting
Researcher from the University of Miami helps create a smaller, flexible LED.
CORAL GABLES, FL (June 20, 2011) — University of Miami College of Engineering professor Jizhou Song has helped design a light-emitting diode (LED) light that uses an array of LEDs 100 times smaller than conventional LEDs. The new device is flexible, runs at a lower temperature and has a longer life-span than existing LEDs. The findings are published online by the "Proceedings of the National Academy of Sciences."
Incandescent bulbs are not very efficient: most of the power they use is converted into heat, and only a small fraction is converted to light. LEDs reduce this energy waste and present an alternative to conventional bulbs.
In this study, the scientists focused on improving certain features of LED lights, like size, flexibility and temperature. Song's role in the project was to analyze the thermal management and establish an analytical model that reduces the temperature of the device.
"The new model uses a silicon substrate, novel etching strategies, a unique layout and innovative thermal management method," says Song, co-author of the study. "The combination of these manufacturing techniques allows the new design to be much smaller and keep lower temperatures than current LEDs using the same electrical power."
In the future, the researchers would also like to make the device stretchable, so that it can be used on any surface, such as deformable display monitors and biomedical devices that adapt to the curvilinear surfaces of the human body.
Source EurekaAlert!
NIU scientists discover simple, green and cost-effective way to produce high yields of highly touted graphene
DeKalb, Ill. — Scientists at Northern Illinois University say they have discovered a simple method for producing high yields of graphene, a highly touted carbon nanostructure that some believe could replace silicon as the technological fabric of the future.
The focus of intense scientific research in recent years, graphene is a two-dimensional material, comprised of a single layer of carbon atoms arranged in a hexagonal lattice. It is the strongest material ever measured and has other remarkable qualities, including high electron mobility, a property that elevates its potential for use in high-speed nano-scale devices of the future.
In a June communication to the Journal of Materials Chemistry, the NIU researchers report on a new method that converts carbon dioxide directly into few-layer graphene (less than 10 atoms in thickness) by burning pure magnesium metal in dry ice.
“It is scientifically proven that burning magnesium metal in carbon dioxide produces carbon, but the formation of this carbon with few-layer graphene as the major product has neither been identified nor proven as such until our current report,” said Narayan Hosmane, a professor of chemistry and biochemistry who leads the NIU research group.
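The underlying reaction is textbook chemistry: 2 Mg + CO2 -> 2 MgO + C. As a rough illustration only (a theoretical maximum from the stoichiometry, not a yield reported by the NIU group), each gram of magnesium burned can free at most about a quarter of a gram of carbon:

# Theoretical carbon yield from 2 Mg + CO2 -> 2 MgO + C; an illustration only,
# real yields of few-layer graphene will be lower.
M_MG = 24.305   # molar mass of magnesium, g/mol
M_C = 12.011    # molar mass of carbon, g/mol

def max_carbon_grams(grams_mg):
    mol_mg = grams_mg / M_MG
    mol_c = mol_mg / 2.0        # two Mg atoms are consumed per carbon atom freed
    return mol_c * M_C

print(f"at most {max_carbon_grams(10.0):.2f} g of carbon from 10 g of Mg")   # ~2.47 g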
“The synthetic process can be used to potentially produce few-layer graphene in large quantities,” he said. “Up until now, graphene has been synthesized by various methods utilizing hazardous chemicals and tedious techniques. This new method is simple, green and cost-effective.”
Hosmane said his research group initially set out to produce single-wall carbon nanotubes. “Instead, we isolated few-layer graphene,” he said. “It surprised us all.”
“It’s a very simple technique that’s been done by scientists before,” added Amartya Chakrabarti, first author of the communication to the Journal of Materials Chemistry and an NIU post-doctoral research associate in chemistry and biochemistry. “But nobody actually closely examined the structure of the carbon that had been produced.”
Other members of the research group publishing in the Journal of Materials Chemistry include former NIU physics postdoctoral research associate Jun Lu, NIU undergraduate student Jennifer Skrabutenas, NIU Chemistry and Biochemistry Professor Tao Xu, NIU Physics Professor Zhili Xiao and John A. Maguire, a chemistry professor at Southern Methodist University.
The work was supported by grants from the National Science Foundation, Petroleum Research Fund administered by the American Chemical Society, the Department of Energy and Robert A. Welch Foundation.
Source Northern Illinois University
Saturday, June 18, 2011
NASA's James Webb Space Telescope completes first round of cryogenic mirror test
The first six of 18 segments that will form NASA's James Webb Space Telescope's primary mirror for space observations completed final cryogenic testing this week. The ten-week test series included two test cycles in which the mirrors were chilled to -379 degrees Fahrenheit, then returned to ambient temperature, to ensure the mirrors respond as expected to the extreme temperatures of space.
A second set of six mirror assemblies will arrive at Marshall in late July to begin testing, and the final set of six will arrive in the fall.
The X-ray and Cryogenic Facility at NASA's Marshall Space Flight Center in Huntsville, Ala. provides the space-like environment to help engineers measure how well the telescope will image infrared sources once in orbit.
Each mirror segment measures approximately 4.3 feet (1.3 meters) in diameter; together the 18 segments form the 21.3-foot (6.5-meter) hexagonal telescope mirror assembly critical for infrared observations. Each of the 18 hexagonal-shaped mirror assemblies weighs approximately 88 pounds (40 kilograms). The mirrors are made of beryllium, a light and strong metal, and coated with a microscopically thin layer of gold to enable the mirror to efficiently collect infrared light.
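A quick aggregate check (treating each segment as a regular hexagon whose quoted 1.3-meter "diameter" is its flat-to-flat width, an interpretation rather than something stated in the release) gives the scale of the finished mirror:

import math

# Rough aggregate figures for the 18-segment primary mirror, assuming the
# quoted 1.3 m "diameter" is the hexagon's flat-to-flat width.
SEGMENTS = 18
FLAT_TO_FLAT_M = 1.3
MASS_PER_SEGMENT_KG = 40

segment_area = (math.sqrt(3) / 2.0) * FLAT_TO_FLAT_M ** 2    # area of a regular hexagon
print(f"~{SEGMENTS * segment_area:.0f} m^2 of mirror area")          # ~26 m^2
print(f"~{SEGMENTS * MASS_PER_SEGMENT_KG} kg of mirror segments")    # 720 kg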
Source EurekaAlert!
Engineers and technicians guide six of the James Webb Space Telescope’s mirror segments off the rails after completing final cryogenic testing this week at Marshall.
Friday, June 17, 2011
Graphene may gain an 'on-off switch,' adding semiconductor to long list of material's achievements
College Park, MD (June 17, 2011)--A team of researchers has proposed a way to turn the material graphene into a semiconductor, enabling it to control the flow of electrons with a laser "on-off switch".
Graphene is the thinnest and strongest material ever discovered. It's a layer of carbon atoms only one atom thick, but 200 times stronger than steel. It also conducts electricity extremely well and conducts heat better than any other known material. It is almost completely transparent, yet so dense that not even atoms of helium can penetrate it. In spite of this impressive list of promising properties, however, graphene appears to lack a critical one -- it doesn't have a "band gap."
A band gap is the basic property of semiconductors, enabling materials to control the flow of electrons. This on-off property is the foundation of computers, encoding the 0s and 1s of computer languages.
Now, a team of researchers at the National University of Córdoba and CONICET in Argentina; the Institut Catala de Nanotecnologia in Barcelona, Spain; and RWTH Aachen University, Germany; suggest that illuminating graphene with a mid-infrared laser could be a key to switch off conduction, thereby improving the possibilities for novel optoelectronic devices.
In an article featured in Applied Physics Letters, the researchers report on the first atomistic simulations of electrical conduction through a micrometer-sized graphene sample illuminated by a laser field. Their simulations show that a laser in the mid-infrared can open an observable band gap in this otherwise gapless material.
"Imagine that by turning on the light, graphene conduction is turned off, or vice versa. This would allow the transduction of optical into electrical signals," says Luis Foa Torres, the researcher leading this collaboration. "The problem of graphene interacting with radiation is also of current interest for the understanding of more exotic states of matter such as the topological insulators."
Source EurekaAlert!
Thursday, June 16, 2011
Coming to TV Screens of the Future: A Sense of Smell
San Diego, CA, June 14, 2011 -- Today’s television programs are designed to trigger your emotions and your mind through your senses of sound and sight. But what if they could trigger a few more? What if you could smell or taste the cheesy slices of pizza being eaten by your favorite characters on TV? Is it possible? Would audiences enjoy the experience? Would advertisers jump on the opportunity to reach consumers in a new way?
These questions formed the basis of a two year experiment by researchers at the University of California, San Diego, conducted in collaboration with Samsung Advanced Institute of Technology (SAIT) in Korea. In a proof of concept paper published online today by the journal Angewandte Chemie, the researchers demonstrate that it is possible to generate odor, at will, in a compact device small enough to fit on the back of your TV with potentially thousands of odors.
“For example, if people are eating pizza, the viewer smells pizza coming from a TV or cell phone,” said Sungho Jin, professor in the departments of Mechanical and Aerospace Engineering and NanoEngineering at the UC San Diego Jacobs School of Engineering. “And if a beautiful lady walks by, they smell perfume. Instantaneously generated fragrances or odors would match the scene shown on a TV or cell phone, and that’s the idea.”
Jin and his team of graduate students used an X-Y matrix system in order to minimize the amount of circuitry that would be required to produce a compact device that could generate any odor at any time. The scent comes from an aqueous solution such as ammonia, which forms an odorous gas when heated through a thin metal wire by an electrical current. The solution is kept in a compartment made of non-toxic, non-flammable silicone elastomer. As the heat and odor pressure build, a tiny compressed hole in the elastomer is opened, releasing the odor.
Whether TV and cell phone audiences and advertisers will respond to such an idea is a question for another phase of the study. For now, the question was simply whether it’s possible.
“It is quite doable,” said Jin, who is also a world renowned researcher in materials science.
Without an X-Y matrix system, thousands of individual controllers would be needed to accommodate the range of odors required for a commercial system. “That’s a lot of circuitry and wires,” said Jin. By comparison, using the X-Y system, 200 controllers (100 on the X-axis multiplied by 100 on the Y- axis) would selectively activate each of the 10,000 odors.
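The arithmetic behind that saving is standard matrix addressing: a cell fires only when both its row line and its column line are driven, so 100 + 100 controllers can select any one of 100 x 100 cells. A minimal sketch (not the actual UCSD drive circuitry) looks like this:

# Minimal sketch of X-Y matrix addressing; not the actual UCSD hardware interface.
GRID = 100   # 100 row lines + 100 column lines = 200 controllers in total

def select_odor(odor_index):
    """Map an odor index (0..9999) to the row and column lines to energize;
    only the cell where both lines are active is heated."""
    if not 0 <= odor_index < GRID * GRID:
        raise ValueError("odor index out of range")
    return divmod(odor_index, GRID)

row, col = select_odor(4242)
print(f"energize row line {row} and column line {col}")   # row 42, column 42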
The UCSD team tested their device with two commercially available perfumes, “Live by Jennifer Lopez,” and “Passion by Elizabeth Taylor.” In both cases, a human tester was able to smell and distinguish the scents within 30 centimeters of the test chamber. When the perfumes were switched, the tester was exposed to coffee beans, which is the common practice for cleansing a tester’s sense of smell in perfume development.
“This is likely to be the next generation TV or cell phone that produces odors to match the images you see on the screen,” said Jin. The multi-odor concept was initiated by Samsung’s research and development group, headed by Jongmin Kim at SAIT. They came to UCSD with a request for a practical means of accomplishing such a vision.
The possible scenarios are endless. A romantic comedy opens with two harried people stopping by their favorite coffee shop to fuel up before work. They are about to meet in some impossibly adorable way. But you’re too distracted by the hazelnut latte that looks so good you think you can smell it. And you can. Thanks to the compact odor-generating device attached to the back of your TV set. Unless the scent is fading, in which case you just need to buy a new one like you would to replace the ink cartridge on your printer.
Next steps in the research would include developing a prototype and demonstrating that it is reliable enough to release odors on cue and scalable to the size needed for consumer electronics like TVs and cell phones. And there are a few other considerations. For example, perfume companies could let you sample new scents through TV, but your TV’s odor-generating device would have to carry that particular perfume, meaning the device probably needs to be upgradable, like software for your home computer. And TV producers will probably want scents that are tailored to match the personalities of their characters.
“That’s a logistics problem,” said Jin. “But in specific applications one can always think of a way.”
Source UCSD Jacobs
Hybrid cars give flywheels a spin
THE next generation of hybrid cars could get a boost from an old technology - the humble flywheel. By replacing hefty batteries in hybrid electric vehicles with a lightweight flywheel that uses a novel form of magnetic gearing, a British engineering company claims that the same fuel-efficiency savings can be achieved at a much lower cost.
The system, called Kinergy, is to be tested initially in airport buses, starting this week. It uses a carbon-fibre flywheel spinning at up to 60,000 revolutions per minute to store energy recovered from the engine and braking, which it then delivers back when needed. Engineers led by Andy Atkins of Ricardo, the Shoreham-by-Sea firm that developed the system, hope it will increase buses' energy efficiency by 13 per cent in urban driving conditions.
Storing vehicles' energy in flywheels has been tried in trains and buses for decades, but the devices have typically proven too large and heavy to be practical. Ricardo's design is just 23 centimetres in diameter and the wheel weighs only 4.5 kilograms, but can deliver 30 kilowatts of power to a vehicle's transmission.
Kinergy's high speeds are possible because it spins in a vacuum - air resistance and heat would otherwise rip it apart. To transfer energy in and out of an object spinning so fast in a vacuum-sealed chamber, the team had to devise a gearing system that makes no mechanical contact.
The system achieves this through magnetic gearing, in which an array of powerful permanent magnets is set into the flywheel's shaft and another array of magnets is mounted on an external shaft. In between, a ring of steel segments interferes with the magnetic fields in such a way that as one shaft turns, it causes the other to spin at a different rate. With a gearing ratio of 10 to 1, the speed of rotation becomes manageable for conventional transmissions, Atkins says.
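To get a feel for the numbers (a rough estimate that treats the flywheel as a uniform solid disc, an assumption the article does not make), the quoted size, mass and speed imply an energy store on the order of half a megajoule, and the 10-to-1 gearing brings the 60,000 rpm shaft down to a transmission-friendly 6,000 rpm:

import math

# Rough estimate only: the flywheel is modelled as a uniform solid disc, using
# the figures quoted in the article (23 cm diameter, 4.5 kg, 60,000 rpm, 30 kW).
MASS_KG = 4.5
RADIUS_M = 0.23 / 2
RPM = 60_000
GEAR_RATIO = 10
POWER_W = 30_000

omega = RPM * 2 * math.pi / 60                # angular speed in rad/s
inertia = 0.5 * MASS_KG * RADIUS_M ** 2       # moment of inertia of a solid disc
energy_j = 0.5 * inertia * omega ** 2         # stored kinetic energy

print(f"~{energy_j / 1000:.0f} kJ stored (~{energy_j / 3.6e6:.2f} kWh)")   # ~587 kJ
print(f"~{energy_j / POWER_W:.0f} s of delivery at 30 kW")                 # ~20 s
print(f"output shaft: ~{RPM // GEAR_RATIO} rpm after the magnetic gearing")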
If the bus tests are successful, they could hasten the broader adoption of flywheels to help power hybrid cars. "This is going to be very attractive in mobile applications", says Alan Ruddell of the Energy Research Unit at Rutherford Appleton Laboratory in Oxfordshire, UK.
Ricardo won't be alone. Volvo announced last week that it is developing flywheel hybrids. And Jaguar is looking at commercialising a system based on the Formula 1 kinetic-energy recovery system being used by the Hope Polevision team in Le Mans, France, this month, developed by Flybrid, in Silverstone, UK. Flywheels are coming, says Flybrid founder John Hilton. "It will make hybrid technology at between a quarter and a third of the cost of electric," which in turn will make for much more affordable hybrid cars, he says.
Source New Scientist
Wednesday, June 15, 2011
Penn Researchers Break Light-Matter Coupling Strength Limit in Nanoscale Semiconductors
PHILADELPHIA—New engineering research at the University of Pennsylvania demonstrates that polaritons have increased coupling strength when confined to nanoscale semiconductors. This represents a promising advance in the field of photonics: smaller and faster circuits that use light rather than electricity.
The research was conducted by assistant professor Ritesh Agarwal, postdoctoral fellow Lambert van Vugt and graduate student Brian Piccione of the Department of Materials Science and Engineering in Penn’s School of Engineering and Applied Science. Chang-Hee Cho and Pavan Nukala, also of the Materials Science department, contributed to the study.
Their work was published in the journal Proceedings of the National Academy of Sciences.
Polaritons are quasiparticles, combinations of physical particles and the energy they contribute to a system that can be measured and tracked as a single unit. Polaritons are combinations of photons and another quasiparticle, excitons. Together, they have qualities of both light and electric charge, without being fully either.
“An exciton is a combination of an electron, which has a negative charge, and an electron hole, which has a positive charge. Light is an oscillating electro-magnetic field, so it can couple with the excitons,” Agarwal said. “When their frequencies match, they can talk to one another; both of their oscillations become more pronounced.”
High light-matter coupling strength is a key factor in designing photonic devices, which would use light instead of electricity and thus be faster and use less power than comparable electronic devices. However, the coupling strength exhibited within bulk semiconductors had always been thought of as a fixed property of the material they were made of.
Agarwal’s team proved that, with the proper fabrication and finishing techniques, this limit can be broken.
“When you go from bulk sizes to one micron, the light-matter coupling strength is pretty constant,” Agarwal said. “But, if you try to go below 500 nanometers or so, what we have shown is that this coupling strength increases dramatically.”
The difference is a function of one of nanotechnology’s principal phenomena: the traits of a bulk material are different from those of structures of the same material on the nanoscale.
“When you’re working at bigger sizes, the surface is not as important. The surface to volume ratio — the number of atoms on the surface divided by the number of atoms in the whole material — is a very small number,” Agarwal said. “But when you make a very small structure, say 100 nanometers, this number is dramatically increased. Then what is happening on the surface critically determines the device’s properties.”
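A rough calculation makes the point (the 0.3-nanometer atomic-layer thickness below is an illustrative assumption, not a value from the Penn study): for a sphere, the fraction of atoms sitting within one atomic layer of the surface grows from a small fraction of a percent at micron sizes to a few percent at 100 nanometers.

# Illustrative estimate of the surface-to-volume effect for a sphere, assuming
# a 0.3 nm atomic layer thickness (not a value from the Penn study).
def surface_fraction(diameter_nm, layer_nm=0.3):
    r = diameter_nm / 2.0
    return 1.0 - (1.0 - layer_nm / r) ** 3    # volume fraction of the outermost shell

for d in (2000, 500, 100):
    print(f"{d:5d} nm structure: {100 * surface_fraction(d):.2f}% of atoms near the surface")
# 2000 nm -> ~0.09%, 500 nm -> ~0.36%, 100 nm -> ~1.79%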
Other researchers have tried to make polariton cavities on this small a scale, but the chemical etching method used to fabricate the devices damages the semiconductor surface. The defects on the surface trap the excitons and render them useless.
“Our cadmium sulfide nanowires are self-assembled; we don’t etch them. But the surface quality was still a limiting factor, so we developed techniques of surface passivation. We grew a silicon oxide shell on the surface of the wires and greatly improved their optical properties,” Agarwal said.
The oxide shell fills the electrical gaps in the nanowire surface, preventing the excitons from getting trapped.
“We also developed tools and techniques for measuring this light-matter coupling strength,” Piccione said. “We’ve quantified the light-matter coupling strength, so we can show that it’s enhanced in the smaller structures.”
Being able to quantify this increased coupling strength opens the door for designing nanophotonic circuit elements and devices.
“The stronger you can make light-matter coupling, the better you can make photonic switches,” Agarwal said. “Electrical transistors work because electrons care what other electrons are doing, but, on their own, photons do not interact with each other. You need to combine optical properties with material properties to make it work.”
This research was supported by the Netherlands Organization for Scientific Research Rubicon Programme, the U.S. Army Research Office, the National Science Foundation, Penn’s Nano/Bio Interface Center and the National Institutes of Health.
Source University of Pennsylvania
A computer simulation of a one-dimensional cavity wave in a 200nm nanowire.
Saturday, June 11, 2011
‘Artificial leaf’ moves closer to reality
MIT researchers develop a device that combines a solar cell with a catalyst to split water molecules and generate energy.
An important step toward realizing the dream of an inexpensive and simple “artificial leaf,” a device to harness solar energy by splitting water molecules, has been accomplished by two separate teams of researchers at MIT. Both teams produced devices that combine a standard silicon solar cell with a catalyst developed three years ago by professor Daniel Nocera. When submerged in water and exposed to sunlight, the devices cause bubbles of oxygen to separate out of the water.
The next step to producing a full, usable artificial leaf, explains Nocera, the Henry Dreyfus Professor of Energy and professor of chemistry, will be to integrate the final ingredient: an additional catalyst to bubble out the water’s hydrogen atoms. In the current devices, hydrogen atoms are simply dissociated into the solution as loose protons and electrons. If a catalyst could produce fully formed hydrogen molecules (H2), the molecules could be used to generate electricity or to make fuel for vehicles. Realization of that step, Nocera says, will be the subject of a forthcoming paper.
The reports by the two teams were published in the journals Energy & Environmental Science on May 12, and the Proceedings of the National Academy of Sciences on June 6. Nocera encouraged two different teams to work on the project so that each could bring their special expertise to addressing the problem, and says the fact that both succeeded “speaks to the versatility of the catalyst system.”
Ultimately, Nocera wants to produce a low-cost device that could be used where electricity is unavailable or unreliable. It would consist of a glass container full of water, with a solar cell with the catalysts on its two sides attached to a divider separating the container into two sections. When exposed to the sun, the electrified catalysts would produce two streams of bubbles — hydrogen on one side, oxygen on the other — which could be collected in two tanks, and later recombined through a fuel cell or other device to generate electricity when needed.
“These papers are really important, to show that the catalyst works” when bonded to silicon to make a single device, Nocera says, thus enabling a unit that combines the functions of collecting sunlight and converting it to storable fuel. Silicon is an Earth-abundant and relatively inexpensive material that is widely used and well understood, and the materials used for the catalyst — cobalt and phosphorus — are also abundant and inexpensive.
Putting it together
Marrying the technologies of silicon solar cells with the catalyst material — dubbed Co-Pi for cobalt phosphate — was no trivial matter, explains Tonio Buonassisi, the SMA Assistant Professor of Mechanical Engineering and Manufacturing, who was a co-author of the PNAS paper. That’s because the splitting of water by the catalyst creates a “very aggressive” chemical environment that would tend to rapidly degrade the silicon, destroying the device as it operates, he says.
In order to overcome this, both teams had to find ways to protect the silicon surface, while at the same time allowing it to receive the incoming sunlight and to interact with the catalyst.
Professor of Electrical Engineering Vladimir Bulović, who led the other team, says his team's approach was to form the Co-Pi material on the surface of the silicon cell, by first evaporating a layer of pure cobalt metal onto the cell electrode, and then exposing it to a phosphate buffer solution under an electrical charge to transform it into the Co-Pi catalyst. By using the layer of Co-Pi, now firmly bonded to the surface, “we were able to passivate the surface,” says Elizabeth Young, a postdoc who was the lead author of the E&ES paper — in other words, it acts as a protective barrier that keeps the silicon from degrading in water.
“Most people have been staying away from silicon for water oxidation, because it forms silicon dioxide” when exposed to water, which is an insulator that would hinder the electrical conductivity of the material, says Ronny Costi, a postdoc on Bulović’s team. “We had to find a way of solving that problem,” which they did by using the cobalt coating.
Buonassisi’s team used a different approach, coating the silicon with a protective layer. “We did it by putting a thin film of indium tin oxide on top,” explains Joep Pijpers, a postdoc who was the lead author of the PNAS paper. Using its expertise in the design of silicon devices, that team then concentrated on matching the current output of the solar cell as closely as possible to the current consumption by the (catalyzed) water-splitting reaction. The system still needs to be optimized, Pijpers says, to improve the efficiency by a factor of 10 to bring it to a range comparable to conventional solar cells.
“It’s really not trivial, integrating a low-cost, high-performance silicon device with the Co-Pi,” Buonassisi says. “There’s a substantial amount of innovation in both device processing and architecture.”
Both teams had to add an extra power source to the system, because the voltage produced by a single-junction silicon cell is not high enough to use for powering the water-splitting catalyst. In later versions, two or three silicon solar cells will be used in series to provide the needed voltage without the need for any extra power source, the researchers say.
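The arithmetic behind the two-or-three-cell figure is straightforward, though the operating voltages below are typical assumed values rather than numbers from the MIT papers: splitting water needs at least 1.23 volts thermodynamically, more like 1.6 to 1.8 volts once catalyst overpotentials are included, while a single silicon junction supplies only about 0.5 to 0.7 volts.

import math

# Illustrative sizing only; the voltages are assumed typical values,
# not figures taken from the MIT papers.
E_MIN_V = 1.23          # thermodynamic minimum for splitting water
E_PRACTICAL_V = 1.8     # assumed requirement once catalyst overpotentials are included
V_SI_JUNCTION = 0.6     # assumed operating voltage of a single silicon junction

cells_in_series = math.ceil(E_PRACTICAL_V / V_SI_JUNCTION)
print(f"thermodynamic floor {E_MIN_V} V, practical target {E_PRACTICAL_V} V")
print(f"about {cells_in_series} silicon junctions in series")   # -> 3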
One interesting aspect of these collaborations, says postdoc Mark Winkler, who worked with Buonassisi’s team, was that “materials scientists and chemists had to learn to talk to each other.” That’s trickier than it may sound, he explains, because the two disciplines, even when talking about the same phenomena, tend to use different terminology and even different ways of measuring and displaying certain characteristics.
Portable power?
Nocera’s ultimate goal is to produce an “artificial leaf” so simple and so inexpensive that it could be made widely available to the billions of people in the world who lack access to adequate, reliable sources of electricity. What’s needed to accomplish that, in addition to stepping up the voltage, is the addition of a second catalyst material to the other side of the silicon cell, Nocera says.
Although the two approaches to bonding the catalyst with a silicon cell appear to produce functioning, stable devices, so far they have only been tested over periods of a few days. The expectation is that they will be stable for long periods, but accelerated aging tests will need to be performed to confirm this.
Krishnan Rajeshwar, Distinguished University Professor of Chemistry and Biochemistry at the University of Texas at Arlington, says it remains to be seen “whether this ‘self-healing’ catalyst would hold up to several hours of current flow … under rather harsh oxidative conditions.” But he adds that these papers “certainly move the science forward. The state of the science in water photo-oxidation uses rather expensive noble metal oxides,” whereas this work uses Earth-abundant, low-cost materials. He adds that while there is still no good storage or distribution system in place for hydrogen, “it is likely that the solar photon-to-hydrogen technology will ultimately see the light of day — for transportation applications — with the hydrogen internal combustion engine.”
Meanwhile, Nocera has founded a company called Sun Catalytix, which will initially be producing a first-generation system based on the Co-Pi catalyst material, connected by wires to conventional, separate solar cells.
The “leaf” system, by contrast, is “still a science project,” Nocera says. “We haven’t even gotten to what I would call an engineering design.” He hopes, however, that the artificial leaf could become a reality within three years.
Bulović’s team was funded partly by the Chesonis Family Foundation and the National Science Foundation. Buonassisi’s team had support from the Netherlands Organization for Scientific Research (NWO-FOM), the National Science Foundation and the Chesonis Family Foundation. Nocera’s work was funded by the Chesonis Family Foundation, the Air Force Office of Scientific Research and the National Science Foundation.
Source MIT
Thursday, June 9, 2011
A NEW WAY TO MAKE LIGHTER, STRONGER STEEL – IN A FLASH
COLUMBUS, Ohio – A Detroit entrepreneur surprised university engineers here recently when he invented a heat treatment that makes steel 7 percent stronger than any steel on record – in less than 10 seconds.
In fact, the steel, now trademarked as Flash Bainite, has tested stronger and more shock-absorbing than the most common titanium alloys used by industry.
Now the entrepreneur is working with researchers at Ohio State University to better understand the science behind the new treatment, called flash processing.
What they’ve discovered may hold the key to making cars and military vehicles lighter, stronger, and more fuel-efficient.
In the current issue of the journal Materials Science and Technology, the inventor and his Ohio State partners describe how rapidly heating and cooling steel sheets changes the microstructure inside the alloy to make it stronger and less brittle.
The basic process of heat-treating steel has changed little in the modern age, and engineer Suresh Babu is one of the few researchers worldwide who still study how to tune the properties of steel in detail. He’s an associate professor of materials science and engineering at Ohio State, and Director of the National Science Foundation (NSF) Center for Integrative Materials Joining Science for Energy Applications, headquartered at the university.
“Steel is what we would call a ‘mature technology.’ We’d like to think we know most everything about it,” he said. “If someone invented a way to strengthen the strongest steels even a few percent, that would be a big deal. But 7 percent? That’s huge.”
Yet, when inventor Gary Cola initially approached him, Babu didn’t know what to think.
“The process that Gary described – it shouldn’t have worked,” he said. “I didn’t believe him. So he took my students and me to Detroit.”
Cola showed them his proprietary lab setup at SFP Works LLC, where rollers carried steel sheets through flames as hot as 1100 degrees Celsius and then into a cooling liquid bath.
Though the typical temperature and length of time for hardening vary by industry, most steels are heat-treated at around 900 degrees Celsius for a few hours. Others are heated at similar temperatures for days.
Cola’s entire process took less than 10 seconds.
He claimed that the resulting steel was 7 percent stronger than martensitic advanced high-strength steel. [Martensitic steel is so named because the internal microstructure is entirely composed of a crystal form called martensite.] Cola further claimed that his steel could be drawn – that is, thinned and lengthened – 30 percent more than martensitic steels without losing its enhanced strength.
If that were true, then Cola’s steel could enable carmakers to build frames that are up to 30 percent thinner and lighter without compromising safety. Or, it could reinforce an armored vehicle without weighing it down.
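As a rough illustration of what those percentages mean, the sketch below assumes a baseline tensile strength of about 1,500 MPa for martensitic advanced high-strength steel and a crude rule that a sheet's load capacity scales with strength times thickness; both the baseline figure and the scaling rule are assumptions for illustration, not numbers from the study.

BASELINE_MPA = 1500.0               # assumed typical martensitic AHSS strength, not from the paper
flash_mpa = BASELINE_MPA * 1.07     # "7 percent stronger"
print(f"Flash-processed sheet: ~{flash_mpa:.0f} MPa vs ~{BASELINE_MPA:.0f} MPa baseline")

# Under the crude load ~ strength x thickness assumption, extra strength alone
# buys only a modest thinning; it is the 30 percent extra drawability that
# makes much thinner sections practical to form.
thinning_from_strength = 1 - 1 / 1.07
print(f"Equal load at ~{thinning_from_strength:.0%} thinner gauge from strength alone")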
“We asked for a few samples to test, and it turned out that everything he said was true,” said Ohio State graduate student Tapasvi Lolla. “Then it was up to us to understand what was happening.”
Cola is a self-taught metallurgist, and he wanted help from Babu and his team to reveal the physics behind the process – to understand it in detail so that he could find ways to adapt it and even improve it.
He partnered with Ohio State to provide research support for Brian Hanhold, who was an undergraduate student at the time, and Lolla, who subsequently earned his master’s degree working out the answer.
Using an electron microscope, they discovered that Cola’s process did indeed form martensite microstructure inside the steel. But they also saw another form called bainite microstructure, scattered with carbon-rich compounds called carbides.
In traditional, slow heat treatments, steel’s initial microstructure always dissolves into a homogeneous phase called austenite at peak temperature, Babu explained. But as the steel cools rapidly from this high temperature, all of the austenite normally transforms into martensite.
“We think that, because this new process is so fast with rapid heating and cooling, the carbides don’t get a chance to dissolve completely within austenite at high temperature, so they remain in the steel and make this unique microstructure containing bainite, martensite and carbides,” Babu said.
Lolla pointed out that this unique microstructure boosts ductility – meaning that the steel can crumple a great deal before breaking – making it a potential impact absorber for automotive applications.
Babu, Lolla, Ohio State research scientist Boian Alexandrov, and Cola co-authored the paper with Badri Narayanan, a doctoral student in materials science and engineering.
Now Hanhold is working to carry over his lessons into welding engineering, where he hopes to solve the problem of heat-induced weakening during welding. High-strength steel often weakens just outside the weld joint, where the alloy has been heated and cooled. Hanhold suspects that bringing the speed of Cola’s method to welding might minimize the damage to adjacent areas and reduce the weakening.
If he succeeds, his discovery will benefit industrial partners of the NSF Center for Integrative Materials Joining Science for Energy Applications, which formed earlier this year. Ohio State’s academic partners on the center include Lehigh University, the University of Wisconsin-Madison, and the Colorado School of Mines.
Source Ohio State University
Labels:
Chemistry,
Engineering,
Solid state Physics
Tuesday, June 7, 2011
Where's my Holodeck? The latest interactive movie news
IT IS time for cinema to take its next step. 3D technology now fills our screens with beautifully rendered characters and virtual environments, but we could have so much more.
So says Dennis Del Favero, director of what he calls the world's first 3D interactive film, Scenario. Rather than having audience members sit back and enjoy the action, the interactive narrative has them drive the story.
Undoubtedly, the ultimate synthetic interactive environment must be the virtual worlds generated by Star Trek's "Holodeck". To date, steps in this direction have been restricted because computer-generated characters cannot yet understand and speak in natural language. One solution is to sidestep the need for language and interact with audience members using physical markers, like movement.
In Scenario, which is loosely based on the life of Elisabeth Fritzl, the Austrian woman imprisoned by her father in a cellar for 24 years, audience members' movements around the cinema are tracked using 16 near-infrared cameras. Treading the line between a movie and a game, audience members are introduced to the plot, and assigned avatars and a mission: to collect virtual body parts and return them to an oversized baby. But they must work against artificially intelligent sentinel avatars, which use information from the cameras and the position of objects in the film's virtual world to plan their actions. For example, the sentinels can only move the baby's head if they are next to it, but might better achieve their objective by pushing an audience member's avatar.
Makers of interactive films can also hook into physiological reactions. Earlier this year, Unsound debuted in Austin, Texas. In this horror film, the visuals, music score and sound effects change depending on the heart rate and skin response of its collective audience members. For example, something horrific happens to the lead character, but in one version of the scene the audience can see it happening, while in another they can only hear it. "If the audience is highly reactive we will not show the graphic scene, and if the audience is bored to tears we would," says Ben Knapp of BioControl Systems, a technology firm based in San Francisco that collaborated on the film. According to Knapp, amalgamating the average emotional response of an audience overcomes any "noise" in the data - such as an audience member thinking "I need to pee."
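A toy sketch of that pooling idea, with invented thresholds and readings (this is not BioControl's actual system): average the audience's heart rate and skin response, then branch the scene on the aggregate rather than on any single viewer.

from statistics import mean

def choose_scene_variant(heart_rates_bpm, skin_responses, hr_threshold=95.0, sr_threshold=8.0):
    # Averaging across viewers smooths out individual "noise" (one restless
    # viewer); the pooled arousal decides which cut of the scene plays.
    avg_hr = mean(heart_rates_bpm)
    avg_sr = mean(skin_responses)
    if avg_hr > hr_threshold or avg_sr > sr_threshold:
        return "implied version (horror kept off-screen)"
    return "graphic version (horror shown)"

# A jumpy audience gets the off-screen cut; a bored one sees everything.
print(choose_scene_variant([102, 98, 110, 95], [9.1, 7.4, 10.2, 8.8]))
print(choose_scene_variant([68, 72, 70, 66], [3.1, 2.8, 3.4, 2.9]))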
Approaching the hurdle of language recognition, Marc Cavazza at Teesside University, UK, has created a computer system that detects the emotional content of speech. While it takes no notice of the user's words per se, it categorises speech according to a range of attributes in their voice including pitch, duration and pauses. Using this technology, you can enjoy a conversation of sorts with a virtual character from Gustave Flaubert's Madame Bovary. The character's responses depend on predispositions based on their personality in the novel.
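The sketch below is only a caricature of that word-free approach (the features, thresholds and replies are invented, not Cavazza's), but it shows the principle: the character's response is keyed to how something is said, not to the words themselves.

def categorise_prosody(mean_pitch_hz, speech_rate_syll_per_s, pause_fraction):
    # Invented rules standing in for a real acoustic-emotion model.
    if pause_fraction > 0.4:
        return "hesitant"
    if mean_pitch_hz > 220 and speech_rate_syll_per_s > 5:
        return "agitated"
    if mean_pitch_hz < 140 and speech_rate_syll_per_s < 3:
        return "subdued"
    return "neutral"

def character_reply(category):
    # A predisposed character reacts to the tone alone, never the words.
    replies = {"agitated": "Calm yourself; nothing here is worth such fuss.",
               "subdued": "You sound tired of this provincial life, as am I.",
               "hesitant": "Speak plainly, if you can bring yourself to.",
               "neutral": "Go on."}
    return replies[category]

print(character_reply(categorise_prosody(250, 6.0, 0.1)))   # fast, high-pitched speech
print(character_reply(categorise_prosody(120, 2.5, 0.2)))   # low, slow speech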
The Holodeck is still some way off, then. According to Cavazza, for complex plots to be driven by the audience, virtual characters must understand words.
Even once this is achieved, another issue appears: confining the narrative. Computer-memory constraints mean that an interactive movie cannot allow for unlimited plot choices. One work-around is to build a movie like a choose-your-own-adventure book, where audiences only influence the plot at specific points. But Michael Mateas at the University of California, Santa Cruz, reckons it would be more interesting if interactive films were developed as virtual worlds, where the plot is directed by the characters and their relationships.
Along these lines, Mateas has created Prom Week, which follows a fictional group of high school students in the week before their final dance. To be released on Facebook in August, Prom Week will immerse the player in hallway politics as they dictate the future of the virtual students through a range of options. Each interaction between characters is recorded into a database, the sum of these interactions driving the plot's direction by evolving the characters' sentiments towards each other. "We want it to feel like these characters are alive," says Mateas.
Source New Scientist
Friday, May 20, 2011
China Admits Problems With Three Gorges Dam
BEIJING — The Three Gorges Dam, the world’s largest hydroelectric project and a symbol of China’s confidence in risky technological solutions, is troubled by urgent pollution and geologic problems, a high-level government body acknowledged Thursday.
The statement came as technicians were certifying the very last of the dam’s array of generators as suitable for hydroelectric generation, the final step in a contentious 19-year effort to complete the project in defiance of domestic and international concerns over its safety as well as threats to the environment, displaced people, historical areas and natural beauty.
According to official figures, the venture cost China about $23 billion, but outside experts estimate it may have cost double that amount. The dam has been plagued by reports of floating archipelagoes of garbage, carpets of algae and landslides on the banks along the vast expanse of still water since the 600-foot-tall dam on the Yangtze River was completed in 2006. Critics also have complained that the government has fallen far short of its goals in helping to resettle the 1.4 million people displaced by the rising waters behind the dam.
China’s State Council, a coordinating body often likened to the United States president’s cabinet, said in a vague statement that the project suffered from a wide range of serious problems. “Although the Three Gorges project provides huge comprehensive benefits, urgent problems must be resolved regarding the smooth relocation of residents, ecological protection and geological disaster prevention,” the statement said.
The huge dam is meeting the government’s goal of producing pollution-free electric power, the government said, generating 84 billion kilowatt-hours of electricity last year. But critics say the sheer weight of water backed up in the 410-mile-long reservoir behind the dam has increased the danger of earthquakes and landslides. The government has acknowledged that risk, but denied that the project played any role in China’s powerful May 2008 quake in Sichuan Province, in which at least 87,000 people died.
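For scale, a quick arithmetic check on that generation figure (the 22.5-gigawatt nameplate capacity used for comparison is the commonly cited figure for the fully equipped dam, not a number from the statement):

ANNUAL_GENERATION_KWH = 84e9            # "84 billion kilowatt-hours of electricity last year"
HOURS_PER_YEAR = 365 * 24

avg_power_gw = ANNUAL_GENERATION_KWH / HOURS_PER_YEAR / 1e6
print(f"Average output: about {avg_power_gw:.1f} GW")                 # roughly 9.6 GW
print(f"Capacity factor vs a ~22.5 GW nameplate: {avg_power_gw / 22.5:.0%}")

That gap between average and rated output reflects both the river's seasonal flow and the fact that the final generators were still being brought on line.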
Environmentalists say the lake has become a repository for the waste dumped by cities and industries.
Even the dam’s ability to regulate the notoriously changeable flow of the 3,900-mile-long Yangtze, one of China’s two major rivers, has been called into question. Faced with a historic drought this spring, cities downstream of the dam have been unable to accommodate oceangoing vessels that usually visit their ports, and about 400,000 residents of Hubei Province lost access to drinking water this month.
Although no link has been proved, critics say the dam has changed regional water tables, contributing to the shortage.
The government statement on the dam was released after a meeting led by Prime Minister Wen Jiabao, seen by many outsiders as more responsive to average citizens’ complaints than many others in the nation’s leadership. The statement said that some problems were anticipated during the dam’s construction, but that others “arose because of new demands posed by economic and social development.”
China’s rulers may be most concerned by the impact of the dam on the displaced masses, many of whom appear to have failed to rebuild their lives after being evicted from the land covered by the reservoir. By 2020, the statement promised, displaced residents would enjoy living standards equal to those who had not been displaced.
The Three Gorges project has been dogged by skeptics, even within China’s bureaucracy, since it was approved in 1992. Environmentalists said it would destroy a stunning landscape of limestone cliffs regarded as one of the world’s most scenic sites, and skeptics warned that the new lake would lead to geologic and pollution problems.
Orville Schell, an environmental expert who leads the Asia Society’s Center on United States-China Relations, said he hoped that the government’s statement signaled a commitment to address the dam’s problems.
“There’s a kind of a balance sheet of benefits and liabilities that have come out of this project,” Mr. Schell said. “My sense is that the Chinese government is getting better and better at collecting information about things like this.” He added, “They know if they don’t fix these problems there will be dire consequences.”
Source The New York Times
Record efficiency of 18.7 percent for flexible CIGS solar cells on plastics
It's all about the money. To make solar electricity affordable on a large scale, scientists and engineers worldwide have long been trying to develop a low-cost solar cell that is both highly efficient and easy to manufacture with high throughput. Now a team at Empa's Laboratory for Thin Film and Photovoltaics, led by Ayodhya N. Tiwari, has made a major step forward. "The new record value for flexible CIGS solar cells of 18.7% nearly closes the 'efficiency gap' to solar cells based on polycrystalline silicon (Si) wafers or CIGS thin film cells on glass", says Tiwari. He is convinced that "flexible and lightweight CIGS solar cells with efficiencies comparable to the 'best-in-class' will have excellent potential to bring about a paradigm shift and to enable low-cost solar electricity in the near future."
One major advantage of flexible high-performance CIGS solar cells is the potential to lower manufacturing costs through roll-to-roll processing while at the same time offering a much higher efficiency than the ones currently on the market. What's more, such lightweight and flexible solar modules offer additional cost benefits in terms of transportation, installation, structural frames for the modules etc., i.e. they significantly reduce the so-called "balance of system" costs. Taken together, the new CIGS polymer cells exhibit numerous advantages for applications such as facades, solar farms and portable electronics. With high-performance devices now within reach, the new results suggest that monolithically-interconnected flexible CIGS solar modules with efficiencies above 16% should be achievable with the recently developed processes and concepts.
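A rough sense of why each efficiency point matters for those balance-of-system costs: the back-of-the-envelope sketch below treats the quoted cell efficiencies as if they were module efficiencies, and assumes standard test-condition sunlight of 1,000 watts per square metre.

IRRADIANCE_W_PER_M2 = 1000.0   # standard test conditions

def area_per_kwp(efficiency):
    # Module area (m^2) needed for 1 kW of peak output at standard test conditions.
    return 1000.0 / (IRRADIANCE_W_PER_M2 * efficiency)

for label, eta in [("18.7% flexible CIGS (new record)", 0.187),
                   ("14.1% flexible CIGS (2005 record, mentioned below)", 0.141)]:
    print(f"{label}: ~{area_per_kwp(eta):.1f} m^2 per kWp")

Less area per kilowatt means less substrate, mounting and handling per kilowatt, which is where the balance-of-system savings come from.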
At the forefront of efficiency improvements
In recent years, thin film photovoltaic technology based on glass substrates has gained sufficient maturity for industrial production; flexible CIGS technology, however, is still an emerging field. The recent improvements in efficiency in research labs and pilot plants – among others by Tiwari's group, first at ETH Zurich and, for the past couple of years, at Empa – are contributing to performance improvements and to overcoming manufacturability barriers.
Working closely with scientists at FLISOM, a start-up company that is scaling up and commercializing the technology, the Empa team made significant progress in the low-temperature growth of CIGS layers, yielding ever more efficient flexible CIGS cells – up from a record value of 14.1% in 2005 to the new high of 18.7%, a record for any type of flexible solar cell grown on polymer or metal foil. The latest improvements in cell efficiency were made possible by reducing recombination losses: improving the structural properties of the CIGS layer, refining the proprietary low-temperature deposition process for growing the layers, and doping in situ with Na during the final stage. With these results, polymer films have for the first time proven superior to metal foils as a carrier substrate for achieving the highest efficiencies.
Record efficiencies of up to 17.5% on steel foils covered with impurity diffusion barriers have so far been achieved with CIGS growth processes at temperatures exceeding 550°C. However, when applied to steel foil without any diffusion barrier, the proprietary low-temperature CIGS deposition process developed by Empa and FLISOM for polymer films easily matched the performance achieved with the high-temperature procedure, resulting in an efficiency of 17.7%. The results suggest that the barrier coatings commonly used against detrimental impurities on metal foils would not be required. "Our results clearly show the advantages of the low-temperature CIGS deposition process for achieving highest efficiency flexible solar cells on polymer as well as metal foils", says Tiwari. The projects were supported by the Swiss National Science Foundation (SNSF), the Commission for Technology and Innovation (CTI), the Swiss Federal Office of Energy (SFOE), EU Framework Programmes, as well as by the Swiss companies W. Blösch AG and FLISOM.
Scaling up production of flexible CIGS solar cells
The continuous improvement in energy conversion efficiencies of flexible CIGS solar cells is no small feat, says Empa Director Gian-Luca Bona. "What we see here is the result of an in-depth understanding of the material properties of layers and interfaces combined with an innovative process development in a systematic manner. Next, we need to transfer these innovations to industry for large scale production of low-cost solar modules to take off." Empa scientists are currently working together with FLISOM to further develop manufacturing processes and to scale up production.
Source EurekAlert!
Tuesday, May 17, 2011
Sharpening the Nanofocus: Berkeley Lab Researchers Use Nanoantenna to Enhance Plasmonic Sensing
Such highly coveted technical capabilities as the observation of single catalytic processes in nanoreactors, or the optical detection of low concentrations of biochemical agents and gases are an important step closer to fruition. Researchers with the U.S. Department of Energy (DOE)’s Lawrence Berkeley National Laboratory (Berkeley Lab), in collaboration with researchers at the University of Stuttgart in Germany, report the first experimental demonstration of antenna-enhanced gas sensing at the single particle level. By placing a palladium nanoparticle on the focusing tip of a gold nanoantenna, they were able to clearly detect changes in the palladium’s optical properties upon exposure to hydrogen.
Top figure shows hydrogen (red) absorbed on a palladium nanoparticle, resulting in weak light scattering and barely detectable spectral changes. Bottom figure shows gold antenna enhancing light scattering and producing an easy to detect spectral shift. (Image courtesy of Alivisatos group)
“We have demonstrated resonant antenna-enhanced single-particle hydrogen sensing in the visible region and presented a fabrication approach to the positioning of a single palladium nanoparticle in the nanofocus of a gold nanoantenna,” says Paul Alivisatos, Berkeley Lab’s director and the leader of this research. “Our concept provides a general blueprint for amplifying plasmonic sensing signals at the single-particle level and should pave the road for the optical observation of chemical reactions and catalytic activities in nanoreactors, and for local biosensing.”
Alivisatos, who is also the Larry and Diane Bock Professor of Nanotechnology at the University of California, Berkeley, is the corresponding author of a paper in the journal Nature Materials describing this research. The paper is titled “Nanoantenna-enhanced gas sensing in a single tailored nanofocus.” Co-authoring the paper with Alivisatos were Laura Na Liu, Ming Tang, Mario Hentschel and Harald Giessen.
One of the hottest new fields in technology today is plasmonics – the confinement of electromagnetic waves in dimensions smaller than half the wavelength of the incident photons in free space. Typically this is done at the interface between metallic nanostructures, usually gold, and a dielectric, usually air. The confinement of the electromagnetic waves in these metallic nanostructures generates electronic surface waves called “plasmons.” A matching of the oscillation frequency between plasmons and the incident electromagnetic waves gives rise to a phenomenon known as localized surface plasmon resonance (LSPR), which can concentrate the electromagnetic field into a volume of less than a few hundred cubic nanometers. Any object brought into this locally confined field – referred to as the nanofocus – will influence the LSPR in a manner that can be detected via dark-field microscopy.
“Nanofocusing has immediate implications for plasmonic sensing,” says Laura Na Liu, lead author of the Nature Materials paper, who was a member of Alivisatos’ research group when the work was done and is now with Rice University. “Metallic nanostructures with sharp corners and edges that form a pointed tip are especially favorable for plasmonic sensing because the field strengths of the electromagnetic waves are so strongly enhanced over such an extremely small sensing volume.”
Scanning electron microscopy image showing a palladium nanoparticle with a gold antenna to enhance plasmonic sensing. (Image courtesy of Alivisatos group)
Plasmonic sensing is especially promising for the detection of flammable gases such as hydrogen, where the use of sensors that require electrical measurements poses safety issues because of the potential threat of sparking. Hydrogen, for example, can ignite or explode at concentrations of only four percent. Palladium was seen as a prime candidate for the plasmonic sensing of hydrogen because it readily and rapidly absorbs hydrogen, which alters its electrical and dielectric properties. However, the LSPRs of palladium nanoparticles yield broad spectral profiles that make detecting changes extremely difficult.
“In our resonant antenna-enhanced scheme, we use double electron-beam lithography in combination with a double lift-off procedure to precisely position a single palladium nanoparticle in the nanofocus of a gold nanoantenna,” Liu says. “The strongly enhanced gold-particle plasmon near-fields can sense the change in the dielectric function of the proximal palladium nanoparticle as it absorbs or releases hydrogen. Light scattered by the system is collected by a dark-field microscope with attached spectrometer and the LSPR change is read out in real time.”
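The readout itself is straightforward spectral bookkeeping. The sketch below uses invented spectra as stand-ins for the dark-field measurements, simply to show what reading out the LSPR change amounts to: locating the scattering peak before and after hydrogen exposure and reporting the shift (real analyses typically fit the resonance line shape or track its centroid for finer precision).

def peak_wavelength(wavelengths_nm, intensities):
    # Wavelength of maximum scattered intensity (argmax is enough for a sketch).
    best = max(range(len(intensities)), key=lambda i: intensities[i])
    return wavelengths_nm[best]

# Invented Lorentzian-shaped spectra standing in for measured dark-field scattering.
wl = list(range(600, 701, 5))
before = [1.0 / (1 + ((w - 650) / 15.0) ** 2) for w in wl]   # resonance before hydrogen exposure
after  = [1.0 / (1 + ((w - 658) / 15.0) ** 2) for w in wl]   # resonance after hydrogen uptake

shift = peak_wavelength(wl, after) - peak_wavelength(wl, before)
print(f"Apparent LSPR shift on hydrogen uptake: {shift:+d} nm")
# Note: the 5 nm sampling grid quantizes the answer, one reason real analyses fit the line shape.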
Alivisatos, Liu and their co-authors found that the antenna enhancement effect could be controlled by changing the distance between the palladium nanoparticle and the gold antenna, and by changing the shape of the antenna.
“By amplifying sensing signals at the single-particle level, we eliminate the statistical and average characteristics inherent to ensemble measurements,” Liu says. “Moreover, our antenna-enhanced plasmonic sensing technique comprises a noninvasive scheme that is biocompatible and can be used in aqueous environments, making it applicable to a variety of physical and biochemical materials.”
For example, by replacing the palladium nanoparticle with other nanocatalysts, such as ruthenium, platinum, or magnesium, Liu says their antenna-enhanced plasmonic sensing scheme can be used to monitor the presence of numerous other important gases in addition to hydrogen, including carbon dioxide and the nitrous oxides. This technique also offers a promising plasmonic sensing alternative to the fluorescent detection of catalysis, which depends upon the challenging task of finding appropriate fluorophores. Antenna-enhanced plasmonic sensing also holds potential for the observation of single chemical or biological events.
“We believe our antenna-enhanced sensing technique can serve as a bridge between plasmonics and biochemistry,” Liu says. “Plasmonic sensing offers a unique tool for optically probing biochemical processes that are optically inactive in nature. In addition, since plasmonic nanostructures made from gold or silver do not bleach or blink, they allow for continuous observation, an essential capability for in-situ monitoring of biochemical behavior.”
This research was supported by the DOE Office of Science and the German ministry of research.
Lawrence Berkeley National Laboratory addresses the world’s most urgent scientific challenges by advancing sustainable energy, protecting human health, creating new materials, and revealing the origin and fate of the universe. Founded in 1931, Berkeley Lab’s scientific expertise has been recognized with 12 Nobel prizes. The University of California manages Berkeley Lab for the U.S. Department of Energy’s Office of Science. For more, visit www.lbl.gov.
Additional information:
For more information about the research of Paul Alivisatos, visit the Website at http://www.cchem.berkeley.edu/pagrp/
Source Berkeley Lab
Control Desk for the Neural Switchboard
Treating anxiety no longer requires years of pills or psychotherapy. At least, not for a certain set of bioengineered mice.
Optogenetics, tested in rodents, can control electrical activity in a few carefully selected neurons, and may hold new insights into our disorders. (Image: Stanford)
In a study recently published in the journal Nature, a team of neuroscientists turned these high-strung prey into bold explorers with the flip of a switch.
The group, led by Dr. Karl Deisseroth, a psychiatrist and researcher at Stanford, employed an emerging technology called optogenetics to control electrical activity in a few carefully selected neurons.
First they engineered these neurons to be sensitive to light. Then, using implanted optical fibers, they flashed blue light on a specific neural pathway in the amygdala, a brain region involved in processing emotions.
And the mice, which had been keeping to the sides of their enclosure, scampered freely across an open space.
While such tools are very far from being used or even tested in humans, scientists say optogenetics research is exciting because it gives them extraordinary control over specific brain circuits — and with it, new insights into an array of disorders, among them anxiety and Parkinson’s disease.
Mice are very different from humans, as Dr. Deisseroth (pronounced DICE-er-roth) acknowledged. But he added that because “the mammalian brain has striking commonalities across species,” the findings might lead to a better understanding of the neural mechanisms of human anxiety.
David Barlow, founder of the Center for Anxiety and Related Disorders at Boston University, cautions against pushing the analogy too far: “I am sure the investigators would agree that these complex syndromes can’t be reduced to the firing of a single small neural circuit without considering other important brain circuits, including those involved in thinking and appraisal.”
But a deeper insight is suggested by a follow-up experiment in which Dr. Deisseroth’s team directed their light beam just a little more broadly, activating more pathways in the amygdala. This erased the effect entirely, leaving the mouse as skittish as ever.
This implies that current drug treatments, which are far less specific and often cause side effects, could also in part be working against themselves.
David Anderson, a professor of biology at the California Institute of Technology who also does research using optogenetics, compares the drugs’ effects to a sloppy oil change. If you dump a gallon of oil over your car’s engine, some of it will dribble into the right place, but a lot of it will end up doing more harm than good.
“Psychiatric disorders are probably not due only to chemical imbalances in the brain,” Dr. Anderson said. “It’s more than just a giant bag of serotonin or dopamine whose concentrations sometimes are too low or too high. Rather, they likely involve disorders of specific circuits within specific brain regions.”
So optogenetics, which can focus on individual circuits with exceptional precision, may hold promise for psychiatric treatment. But Dr. Deisseroth and others caution that it will be years before these tools are used on humans, if ever.
For one, the procedure involves bioengineering that most people would think twice about. First, biologists identify an “opsin,” a protein found in photosensitive organisms like pond scum that allows them to detect light. Next, they fish out the opsin’s gene and insert it into a neuron within the brain, using viruses that have been engineered to be harmless — “disposable molecular syringes,” as Dr. Anderson calls them.
There, the opsin DNA becomes part of the cell’s genetic material, and the resulting opsin proteins conduct electric currents — the language of the brain — when they are exposed to light. (Some opsins, like channelrhodopsin, which responds to blue light, activate neurons; others, like halorhodopsin, activated by yellow light, silence them.)
Finally, researchers delicately thread thin optical fibers down through layers of nervous tissue and deliver light to just the right spot.
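The logic of the toolkit can be caricatured in a few lines of simulation: an ordinary model neuron that receives an extra depolarizing current only while the light is on, the way a channelrhodopsin-expressing cell would. The leaky integrate-and-fire model and every parameter value below are textbook illustrations of the idea, not a model of the Stanford experiments.

```python
# Caricature of optogenetic control: a leaky integrate-and-fire neuron that
# receives an extra inward current only while the "blue light" is on.
# All parameters are illustrative; this is not a model of the actual experiments.
import numpy as np

dt, t_end = 0.1, 300.0                                        # ms
tau_m, v_rest, v_thresh, v_reset = 20.0, -70.0, -50.0, -70.0  # ms, mV
r_m = 10.0                         # membrane resistance, MOhm
i_opsin = 2.5                      # light-driven current while illuminated, nA

time = np.arange(0.0, t_end, dt)
light_on = (time > 100.0) & (time < 200.0)   # light pulse from 100 to 200 ms

v = v_rest
spikes = []
for t, lit in zip(time, light_on):
    i_in = i_opsin if lit else 0.0           # channelrhodopsin-like drive
    v += (-(v - v_rest) + r_m * i_in) / tau_m * dt
    if v >= v_thresh:                        # spike, then reset
        spikes.append(t)
        v = v_reset

print(f"{len(spikes)} spikes, all during illumination: "
      f"{all(100.0 < s < 200.0 for s in spikes)}")
```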
Thanks to optogenetics, neuroscientists can go beyond observing correlations between the activity of neurons and an animal’s behavior; by turning particular neurons on or off at will, they can prove that those neurons actually govern the behavior.
“Sometimes before I give talks, people will ask me about my ‘imaging’ tools,” said Dr. Deisseroth, 39, a practicing psychiatrist whose dissatisfaction with current treatments led him to form a research laboratory in 2004 to develop and apply optogenetic technology.
“I say: ‘Interestingly, it’s the complete opposite of imaging, which is observational. We’re not using light to observe events. We’re sending light in to cause events.’ ”
In early experiments, scientists showed that they could make worms stop wiggling and drive mice around in manic circles as if by remote control.
Now that the technique has earned its stripes, laboratories around the world are using it to better understand how the nervous system works, and to study problems including chronic pain, Parkinson’s disease and retinal degeneration.
Some of the insights gained from these experiments in the lab are already inching their way to the clinic.
Dr. Amit Etkin, a Stanford psychiatrist and researcher who collaborates with Dr. Deisseroth, is trying to translate the findings about anxiety in rodents to improve human therapy with existing tools. Using transcranial magnetic stimulation, a technique that is far less specific than optogenetics but has the advantage of being noninvasive, Dr. Etkin seeks to activate the human analog of the amygdala circuitry that reduced anxiety in Dr. Deisseroth’s mice.
Dr. Jaimie Henderson, their colleague in the neurosurgery department, has treated more than 600 Parkinson’s patients using a standard procedure called deep brain stimulation. The treatment, which requires implanting metal electrodes in a brain region called the subthalamic nucleus, improves coordination and fine motor control. But it also causes side effects, like involuntary muscle contractions and dizziness, perhaps because turning on electrodes deep inside the brain also activates extraneous circuits.
“If we could find a way to just activate the circuits that provide therapeutic benefit without the ones that cause side effects, that would obviously be very helpful,” Dr. Henderson said.
Moreover, as with any invasive brain surgery, implanting electrodes carries the risk of infection and life-threatening hemorrhage. What if you could stimulate the brain’s surface instead? A new theory of how deep brain stimulation affects Parkinson’s symptoms, based on optogenetics work in rodents, suggests that this might succeed.
Dr. Henderson has recently begun clinical tests in human patients, and hopes that this approach may also treat other problems associated with Parkinson’s, like speech disorders.
In the building next door, Krishna V. Shenoy, a neuroscience researcher, is bringing optogenetics to work on primates. Extending the success of a similar effort by an M.I.T. group led by Robert Desimone and Edward S. Boyden, he recently inserted opsins into the brains of rhesus monkeys. They experienced no ill effects from the viruses or the optical fibers, and the team was able to control selected neurons using light.
Dr. Shenoy, who is part of an international effort financed by the Defense Advanced Research Projects Agency, says optogenetics has promise for new devices that could eventually help treat traumatic brain injury and equip wounded veterans with neural prostheses.
“Current systems can move a prosthetic arm to a cup, but without an artificial sense of touch it’s very difficult to pick it up without either dropping or crushing it,” he said. “By feeding information from sensors on the prosthetic fingertips directly back into the brain using optogenetics, one could in principle provide a high-fidelity artificial sense of touch.”
Some researchers are already imagining how optogenetics-based treatments could be used directly on people if the biomedical challenge of safely delivering novel genes to patients can be overcome.
Dr. Boyden, who participated in the early development of optogenetics, runs a laboratory dedicated to creating and disseminating ever more powerful tools. He pointed out that light, unlike drugs and electrodes, can switch neurons off — or as he put it, “shut an entire circuit down.” And shutting down overexcitable circuits is just what you’d want to do to an epileptic brain.
“If you want to turn off a brain circuit and the alternative is surgical removal of a brain region, optical fiber implants might seem preferable,” Dr. Boyden said. Several labs are working on the problem, even if actual applications still seem far off.
For Dr. Deisseroth, who treats patients with autism and depression, optogenetics offers a more immediate promise: easing the stigma faced by people with mental illness, whose appearance of physical health can cause incomprehension from family members, friends and doctors.
“Just understanding for us, as a society, that someone who has anxiety has a known or knowable circuitry difference is incredibly valuable,” he said.
Source The New York Times
Labels:
Biology,
Computer Science,
Engineering,
Medicine,
Mind,
Neuroscience
Monday, May 16, 2011
New solar product captures up to 95 percent of light energy
MU engineer plans to make solar panels more effective in collecting energy.
Efficiency is a problem with today's solar panels; they only collect about 20 percent of available light. Now, a University of Missouri engineer has developed a flexible solar sheet that captures more than 90 percent of available light, and he plans to make prototypes available to consumers within the next five years.
Patrick Pinhero, an associate professor in the MU Chemical Engineering Department, says energy generated using traditional photovoltaic (PV) methods of solar collection is inefficient and neglects much of the available solar electromagnetic (sunlight) spectrum. The device his team has developed – essentially a thin, moldable sheet of tiny antennas called nantennas – can harvest the heat from industrial processes and convert it into usable electricity. Their ambition is to extend this concept to a direct solar-facing nantenna device capable of collecting solar irradiation in the near-infrared and optical regions of the solar spectrum.
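Pinhero's point about the neglected part of the spectrum is easy to quantify with Planck's law. The sketch below is a rough estimate of my own, not a figure from the MU team: it treats sunlight as a 5778 K blackbody and asks what fraction of the power arrives at wavelengths longer than silicon's roughly 1.1-micrometre absorption edge, light that a conventional PV cell cannot convert but that a near-infrared nantenna could in principle collect.

```python
# Rough estimate of how much solar power lies beyond a conventional PV cell's
# reach, using a 5778 K blackbody as a stand-in for the solar spectrum.
# The 1.1 um silicon band-gap cutoff and the blackbody idealization are my
# assumptions for illustration, not figures from the MU team.
import numpy as np

h, c, k_B = 6.626e-34, 2.998e8, 1.381e-23     # SI units
T_sun = 5778.0                                # K

def planck(wavelength_m, T):
    """Blackbody spectral radiance per unit wavelength."""
    return (2 * h * c**2 / wavelength_m**5 /
            (np.exp(h * c / (wavelength_m * k_B * T)) - 1.0))

wl = np.linspace(0.2e-6, 10e-6, 200_000)      # 0.2 to 10 micrometres, uniform grid
spectrum = planck(wl, T_sun)
step = wl[1] - wl[0]

total = spectrum.sum() * step
beyond_si = spectrum[wl > 1.1e-6].sum() * step
print(f"Fraction of solar power beyond 1.1 um: {beyond_si / total:.0%}")
```

Under this idealization roughly a quarter of the incident power lies beyond the silicon cutoff, which is the slice a heat- or infrared-harvesting device is aiming at.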
Working with his former team at the Idaho National Laboratory and Garrett Moddel, an electrical engineering professor at the University of Colorado, Pinhero and his team have now developed a way to extract electricity from the collected heat and sunlight using special high-speed electrical circuitry. This team also partners with Dennis Slafer of MicroContinuum, Inc., of Cambridge, Mass., to immediately port laboratory bench-scale technologies into manufacturable devices that can be inexpensively mass-produced.
"Our overall goal is to collect and utilize as much solar energy as is theoretically possible and bring it to the commercial market in an inexpensive package that is accessible to everyone," Pinhero said. "If successful, this product will put us orders of magnitudes ahead of the current solar energy technologies we have available to us today."
As part of a rollout plan, the team is securing funding from the U.S. Department of Energy and private investors. The second phase features an energy-harvesting device for existing industrial infrastructure, including heat-process factories and solar farms.
Within five years, the research team believes they will have a product that complements conventional PV solar panels. Because it's a flexible film, Pinhero believes it could be incorporated into roof shingle products, or be custom-made to power vehicles.
Once the funding is secure, Pinhero envisions several commercial product spin-offs, including infrared (IR) detection. These include improved contraband-identifying products for airports and the military, optical computing, and infrared line-of-sight telecommunications.
Source EurekaAlert!
Sunday, May 15, 2011
'Computer synapse' analyzed at the nanoscale
Researchers at Hewlett Packard and the University of California, Santa Barbara, have analysed in unprecedented detail the physical and chemical properties of an electronic device that computer engineers hope will transform computing.
Memristors, short for memory resistors, are newly understood circuit elements for the development of electronics and have inspired experts to seek ways of mimicking the behaviour of our own brains inside a computer.
Research, published today, Monday, 16 May, in IOP Publishing's Nanotechnology, explains how the researchers have used highly focused x-rays to map out the nanoscale physical and chemical properties of these electronic devices.
It is thought that memristors, with their ability to 'remember' the total electronic charge that passes through them, will be of greatest benefit when they can act like synapses within electronic circuits, mimicking the complex network of neurons in the brain that enables our ability to perceive, think and remember.
Mimicking biological synapses - the junctions between two neurons where information is transmitted in our brains – could lead to a wide range of novel applications, including semi-autonomous robots, if complex networks of neurons can be reproduced in an artificial system.
In order for the huge potential of memristors to be utilised, researchers first need to understand the physical processes that occur within the memristors at a very small scale.
Memristors have a very simple structure – often just a thin film made of titanium dioxide between two metal electrodes – and have been extensively studied in terms of their electrical properties.
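The 'memory' in a titanium dioxide memristor of this kind is often described with the linear ion-drift model proposed by the HP group (Strukov et al., Nature, 2008): a state variable tracks how far the conducting, doped region has spread through the film, and the instantaneous resistance interpolates between an ON value and an OFF value accordingly. The sketch below simulates that textbook model under a sinusoidal drive, producing the pinched hysteresis loop that is the memristor's signature; the parameter values are illustrative, not measurements of the devices studied here.

```python
# Textbook linear ion-drift memristor model (after Strukov et al., 2008),
# driven by a sinusoidal voltage to show the pinched hysteresis loop.
# Parameter values are illustrative only, not those of the devices in the paper.
import numpy as np

R_on, R_off = 100.0, 16e3        # ohms: fully doped vs. fully undoped film
D = 10e-9                        # film thickness, m
mu_v = 1e-14                     # dopant mobility, m^2 s^-1 V^-1
w = 0.1 * D                      # initial width of the doped region

dt = 1e-4
t = np.arange(0.0, 2.0, dt)      # two 1 Hz cycles
v = np.sin(2 * np.pi * 1.0 * t)  # 1 V amplitude drive

i_hist, m_hist = [], []
for v_now in v:
    x = w / D
    M = R_on * x + R_off * (1.0 - x)          # instantaneous memristance
    i = v_now / M
    w = min(max(w + mu_v * (R_on / D) * i * dt, 0.0), D)  # dw/dt = mu_v*(R_on/D)*i
    i_hist.append(i)
    m_hist.append(M)

print(f"Memristance swings between {min(m_hist):.0f} and {max(m_hist):.0f} ohms")
# The i-v trace is a loop 'pinched' at the origin: the current is zero whenever
# the voltage is zero, but the slope (resistance) depends on the charge history.
```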
For the first time, researchers have been able to study the physical properties of memristors non-destructively, allowing a more detailed insight into the chemical and structural changes that occur while the device is operating.
The researchers were able to study the exact channel where the resistance switching of memristors occurs by using a combination of techniques.
They used highly focused x-rays to locate and image the roughly 100-nanometer-wide channel where the switching of resistance takes place; this map could then be fed into a mathematical model of how the memristor heats up.
John Paul Strachan of the nanoElectronics Research Group, Hewlett-Packard Labs, California, said: "One of the biggest hurdles in using these devices is understanding how they work: the microscopic picture for how they undergo such tremendous and reversible change in resistance.
"We now have a direct picture for the thermal profile that is highly localized around this channel during electrical operation, and is likely to play a large role in accelerating the physics driving the memristive behavior."
This research appears as part of a special issue on non-volatile memory based on nanostructures.
Source EurekaAlert!
Labels:
Computer Science,
Engineering,
Neuroscience