
Saturday, December 3, 2011

Entangled diamonds blur quantum-classical divide


Two diamonds as wide as earring studs have been made to share the spooky quantum state known as entanglement. The feat, performed at room temperature, blurs the divide between the classical and quantum worlds, since typically the quantum link has been made with much smaller particles at low temperatures.
One laser pulse entangled two diamonds and the next measured the entanglement
Entanglement is one of the weird aspects of quantum mechanics, where the fates of two or more particles are intertwined – even when they are physically far apart. Electrons, for example, have been entangled, so that changing the quantum spin of one affects the spins of its entangled partners.
Macroscopic objects, on the other hand, are supposed to mind their own business – flipping one coin shouldn't force a neighbouring flipped coin to come up heads.
But that's just what happened with two 3-millimetre-wide diamonds on a lab bench at the University of Oxford. Physicists there, led by Ka Chung Lee and Michael Sprague, were able to show that the diamonds shared one vibrational state between them.
Other researchers had previously shown quantum effects in a supercooled 0.06-millimetre-long strip of metal, which was set in a state where it was vibrating and not vibrating at the same time. But quantum effects are fragile. The more atoms an object contains, the more they jostle each other about, destroying the delicate links of entanglement.

Fleeting link

Cooling an object down to fractions of a degree above absolute zero was thought to be the only way to keep atoms from doing violence to each other.
"In our case we said, let's not bother doing that," says Ian Walmsley of Oxford, head of the lab where the diamonds were entangled. "It turns out all you need to do is look on a very short timescale, before all that jostling and mugging around has a chance to destroy the coherence."
The team placed two diamonds in front of an ultrafast laser, which zapped them with a pulse of light that lasted 100 femtoseconds (or 10⁻¹³ seconds).
Every so often, according to the classical physics that describes large objects, a photon from the pulse should set the atoms in one of the diamonds vibrating. That vibration saps some energy from the photon. The less energetic photon would then move on to a detector, and each diamond would be left either vibrating or not vibrating.
But if the diamonds behaved as quantum mechanical objects, they would share one vibrational mode between them. It would be as if both diamonds were both vibrating and not vibrating at the same time. "Quantum mechanics says it's not either/or, it's both/and," Walmsley says. "It's that both/and we've been trying to prove."

Same state

To show that the diamonds were truly entangled, the researchers hit them with a second laser pulse just 350 femtoseconds after the first. The second pulse picked up the energy the first pulse left behind, and reached the detector as an extra-energetic photon.
If the system were classical, the second photon should pick up extra energy only half the time – only if it happened to hit the diamond where the energy was deposited in the first place. But in 200 trillion trials, the team found that the second photon picked up extra energy every time. That means the energy was not localised in one diamond or the other, but that they shared the same vibrational state.
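The logic of that comparison can be captured in a toy simulation. The sketch below (our illustration, not the Oxford group's analysis, and ignoring detector losses) contrasts a classical model, in which the vibration sits in one diamond or the other, with an idealised entangled model in which the single vibrational quantum is shared between both:

```python
import random

def classical_trial():
    """Toy classical model: the first pulse deposits the vibration in ONE diamond,
    chosen at random; the second pulse regains that energy only if it happens to
    probe the same diamond."""
    diamond_with_vibration = random.choice(["A", "B"])
    diamond_probed = random.choice(["A", "B"])
    return diamond_probed == diamond_with_vibration

def entangled_trial():
    """Toy entangled model: the single vibrational quantum is shared by both
    diamonds, so the second pulse always finds it (idealised, lossless)."""
    return True

N = 100_000
print(f"classical success rate ~ {sum(classical_trial() for _ in range(N)) / N:.2f}")   # ~0.50
print(f"idealised entangled rate = {sum(entangled_trial() for _ in range(N)) / N:.2f}")  # 1.00
```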
Entangled diamonds could some day find uses in quantum computers, which could use entanglement to carry out many calculations at once.
"To actually realise such a device is still a way off in the future, but conceptually that's feasible," Walmsley says. He notes that the diamonds were entangled for only 7000 femtoseconds, which is not long enough for practical applications.

Quantum limit

The real value of the experiment may be in probing the boundary between quantum mechanics and classical physics. "We think that it is the first time that a room-temperature, solid-state system has been demonstrably put in this entangled quantum state," Walmsley says. "This is an interesting avenue for thinking about how quantum mechanics can emerge into the classical world."
Erika Andersson of Heriot-Watt University in Edinburgh, UK, agrees.
"We want to push and see how far quantum mechanics goes," she says. "The reported work is a major step in trying to push quantum mechanics to its limits, in the sense of showing that larger and larger physical systems can behave according to the 'strange' predictions of quantum mechanics."

Wednesday, June 22, 2011

Quantum magic trick shows reality is what you make it

Conjurers frequently appear to make balls jump between upturned cups. In quantum systems, where the properties of an object, including its location, can vary depending on how you observe them, such feats should be possible without sleight of hand. Now this startling characteristic has been demonstrated experimentally, using a single photon that exists in three locations at once.

Despite quantum theory's knack for explaining experimental results, some physicists have found its weirdness too much to swallow. Albert Einstein mocked entanglement, a notion at the heart of quantum theory in which the properties of one particle can immediately affect those of another regardless of the distance between them. He argued that some invisible classical physics, known as "hidden-variable theories", must be creating the illusion of what he called "spooky action at a distance".

A series of painstakingly designed experiments has since shown that Einstein was wrong: entanglement is real and no hidden-variable theories can explain its weird effects.
But entanglement is not the only phenomenon separating the quantum from the classical. "There is another shocking fact about quantum reality which is often overlooked," says Aephraim Steinberg of the University of Toronto in Canada.

No absolute reality

In 1967, Simon Kochen and Ernst Specker proved mathematically that even for a single quantum object, where entanglement is not possible, the values that you obtain when you measure its properties depend on the context. So the value of property A, say, depends on whether you chose to measure it with property B, or with property C. In other words, there is no reality independent of the choice of measurement.

It wasn't until 2008, however, that Alexander Klyachko of Bilkent University in Ankara, Turkey, and colleagues devised a feasible test for this prediction. They calculated that if you repeatedly measured five different pairs of properties of a quantum particle that was in a superposition of three states, the results would differ for the quantum system compared with a classical system with hidden variables.
That's because quantum properties are not fixed, but vary depending on the choice of measurements, which skews the statistics. "This was a very clever idea," says Anton Zeilinger of the Institute for Quantum Optics, Quantum Nanophysics and Quantum Information in Vienna, Austria. "The question was how to realise this in an experiment."
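The Klyachko test is usually written as the KCBS inequality: five observables, each taking the value +1 or -1, are measured in compatible pairs around a pentagon, and any noncontextual hidden-variable assignment obeys ⟨A1A2⟩ + ⟨A2A3⟩ + ⟨A3A4⟩ + ⟨A4A5⟩ + ⟨A5A1⟩ ≥ -3, whereas quantum mechanics for a three-state system can reach 5 - 4√5 ≈ -3.94. A minimal brute-force check of the classical bound (our sketch, not the Vienna group's analysis):

```python
from itertools import product
import math

# KCBS test: five observables A1..A5 taking values +1 or -1, measured in the
# compatible pairs (A1,A2), (A2,A3), ..., (A5,A1). Any noncontextual assignment
# of values obeys  <A1A2> + <A2A3> + <A3A4> + <A4A5> + <A5A1> >= -3.
noncontextual_min = min(
    sum(a[i] * a[(i + 1) % 5] for i in range(5))
    for a in product([-1, +1], repeat=5)
)
print("noncontextual (hidden-variable) minimum:", noncontextual_min)      # -3
print("quantum minimum for a three-level system:", 5 - 4 * math.sqrt(5))  # about -3.94
```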

Now he, Radek Lapkiewicz and colleagues have realised the idea experimentally. They used photons, each in a superposition in which they simultaneously took three paths. Then they repeated a sequence of five pairs of measurements on various properties of the photons, such as their polarisations, tens of thousands of times.

A beautiful experiment

They found that the resulting statistics could only be explained if the combination of properties that was tested was affecting the value of the property being measured. "There is no sense in assuming that what we do not measure about a system has [an independent] reality," Zeilinger concludes.
Steinberg is impressed: "This is a beautiful experiment." If previous experiments testing entanglement shut the door on hidden-variable theories, the latest work seals it tight. "It appears that you can't even conceive of a theory where specific observables would have definite values that are independent of the other things you measure," adds Steinberg.

Kochen, now at Princeton University in New Jersey, is also happy. "Almost a half century after Specker and I proved our theorem, which was based on a [thought] experiment, real experiments now confirm our result," he says.
Niels Bohr, a giant of quantum physics, was a great proponent of the idea that the nature of quantum reality depends on what we choose to measure, a notion that came to be called the Copenhagen interpretation. "This experiment lends more support to the Copenhagen interpretation," says Zeilinger.

Source New Scientist

Quantum leap: Magnetic properties of a single proton directly observed for the first time

Most important milestone in the direct measurement of the magnetic moment of the proton and its anti-particle has been achieved / Focusing the matter-antimatter symmetry.

Researchers at Johannes Gutenberg University Mainz (JGU) and the Helmholtz Institute Mainz (HIM), together with their colleagues from the Max Planck Institute for Nuclear Physics in Heidelberg and the GSI Helmholtz Center for Heavy Ion Research in Darmstadt, have observed spin quantum-jumps with a single trapped proton for the first time. The fact that they have managed to procure this elusive data means that they have overtaken their research competitors at the elite Harvard University and are now the global leaders in this field.

The result is a pioneering step forward in the endeavor to directly measure the magnetic properties of the proton with high precision. The measuring principle is based on the observation of a single proton stored in an electromagnetic particle trap. As it would also be possible to observe an anti-proton using the same method, the prospect that an explanation for the matter-antimatter imbalance in the universe could be found has become a reality. It is essential to be able to analyze antimatter in detail if we are to understand why matter and antimatter did not completely cancel each other out after the Big Bang - in other words, if we are to comprehend how the universe actually came into existence.

The proton has an intrinsic angular momentum or spin, just like other particles. It is like a tiny bar magnet; in this analogy, a spin quantum jump would correspond to a flip of the magnetic poles. However, detecting the proton spin is a major challenge. While the magnetic moments of the electron and its anti-particle, the positron, were already being measured and compared in the 1980s, this has yet to be achieved in the case of the proton. "We have long been aware of the magnetic moment of the proton, but it has thus far not been observed directly for a single proton but only in the case of particle ensembles," explains Stefan Ulmer, a member of the work group headed by Professor Dr Jochen Walz at the Institute of Physics at the new Helmholtz Institute Mainz.

The real problem is that the magnetic moment of the proton is 660 times smaller than that of the electron, which means that it is considerably harder to detect. It has taken the collaborative research team five years to prepare an experiment that would be precise enough to pass the crucial test. "At last we have successfully demonstrated the detection of the spin direction of a single trapped proton," says an exultant Ulmer, a stipendiary of the International Max Planck Research School for Quantum Dynamics in Heidelberg.
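The "660 times smaller" figure can be checked directly from textbook CODATA values for the two magnetic moments (our quick calculation, not a number from the Mainz experiment):

```python
# Quick check of the "660 times smaller" figure, using textbook CODATA values
# (not numbers from the Mainz experiment).
mu_electron = 9.2847647e-24   # J/T, magnitude of the electron magnetic moment
mu_proton   = 1.4106068e-26   # J/T, proton magnetic moment
print(f"mu_electron / mu_proton ~ {mu_electron / mu_proton:.0f}")   # ~658
```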

This opens the way for direct high-precision measurements of the magnetic moments of both the proton and the anti-proton. The latter is likely to be undertaken at CERN, the European laboratory for particle physics in Geneva, or at FLAIR/GSI in Darmstadt. The magnetic moment of the anti-proton is currently only known to three decimal places. The method used at the laboratories in Mainz aims at a millionfold improvement of the measuring accuracy and should represent a new highly sensitive test of the matter-antimatter symmetry. This first observation of the spin quantum jumps of a single proton is a crucial milestone in the pursuit of this aim.
Matter-antimatter symmetry is one of the pillars of the Standard Model of elementary particle physics.

According to this model, particles and anti-particles should behave identically once inversions of charge, parity and time - referred to as CPT transformation – are applied simultaneously. High-precision comparisons of the fundamental properties of particles and anti-particles make it possible to accurately determine whether this symmetrical behavior actually occurs, and may provide the basis for theories that extend beyond the Standard Model. Assuming that a difference between the magnetic moments of protons and anti-protons could be detected, this would open up a window on this "new physics".

The results obtained by the Mainz cooperative research team were published online in the leading specialist journal Physical Review Letters on Monday. The article is presented as an "Editor's Suggestion," and the American Physical Society (APS) is additionally featuring it with a "Viewpoint" commentary.
The research work carried out by the team of Professor Dr Jochen Walz on anti-hydrogen and the magnetic moment of protons forms part of the "Precision Physics, Fundamental Interactions and Structure of Matter" (PRISMA) Cluster of Excellence, which is currently applying for future sponsorship under the German Federal Excellence Initiative.

Source Johannes Gutenberg Universitat

New test for elusive fundamental particle - anyon - proposed

In quantum physics there are two classes of fundamental particles. Photons, the quanta of light, are bosons, while the protons and neutrons that make up atomic nuclei belong to the fermions. Bosons and fermions differ in their behavior at a very basic level. This difference is expressed in their quantum statistics. In the 1980s a third species of fundamental particle was postulated, which was dubbed the anyon. In their quantum statistics, anyons interpolate between bosons and fermions.

"They would be a kind of missing link between the two known sorts of fundamental particles," says LMU physicist Dr. Tassilo Keilmann. "According to the laws of quantum physics, anyons must exist – but so far it hasn't been possible to detect them experimentally."

An international team of theorists under Keilmann's leadership has now taken an in-depth look at the question of whether it is possible to create anyons in the context of a realistic experiment. Happily for experimentalists, the answer is yes. The theoreticians have come up with an experimental design in which conventional atoms are trapped in a so-called optical lattice. Based on their calculations, it ought to be possible to manipulate the interactions between atoms in the lattice in such a way as to create and detect anyons. In contrast to the behavior of bosons and fermions, the exotic statistics of anyons should be continuously variable between the endpoints defined by the other two particle types.
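The interpolation can be stated very compactly: swapping two identical particles multiplies the two-particle wavefunction by a phase factor e^(iθ), with θ = 0 for bosons, θ = π for fermions, and any value in between for (Abelian) anyons. A minimal illustration of that statement (ours, not code from the LMU paper):

```python
import cmath

def exchange_phase(theta):
    """Phase factor picked up by the two-particle wavefunction when two identical
    particles are swapped: +1 for bosons (theta = 0), -1 for fermions (theta = pi),
    and a complex phase in between for Abelian anyons."""
    return cmath.exp(1j * theta)

for name, theta in [("boson", 0.0), ("anyon, theta = pi/2", cmath.pi / 2), ("fermion", cmath.pi)]:
    p = exchange_phase(theta)
    print(f"{name:20s} exchange factor = {p.real:+.2f} {p.imag:+.2f}i")
```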

"These novel quantum particles should be able to hop between sites in the optical lattice," says Keilmann. "More importantly, they and their quantum statistics should be continuously adjustable during the experiment." In that case, it might even be feasible to transmute bosons into anyons and then to turn these into fermions. Such a transition would be equivalent to a novel "statistically induced quantum phase transition", and would allow the anyons to be used for the construction of quantum computers that would be far more efficient than conventional electronic processors. "We have pointed to the first practical route to the detection of anyons," says Keilmann. "Experimentalists should be able to implement the set-up in the course of experiments that are already underway."

Source  EurekaAlert!

Monday, June 20, 2011

Improving LED lighting

Researcher from the University of Miami helps create a smaller, flexible LED.

CORAL GABLES, FL (June 20, 2011) — University of Miami professor at the College of Engineering, Jizhou Song, has helped design a light-emitting diode (LED) light that uses an array of LEDs 100 times smaller than conventional LEDs. The new device has flexibility, maintains a lower temperature and has an increased life-span over existing LEDs. The findings are published online by the Proceedings of the National Academy of Sciences.

Incandescent bulbs are not very efficient: most of the power they use is converted into heat, and only a small fraction is converted to light. LEDs reduce this energy waste and so present an alternative to conventional bulbs.
In this study, the scientists focused on improving certain features of LED lights, like size, flexibility and temperature. Song's role in the project was to analyze the thermal management and establish an analytical model that reduces the temperature of the device.

"The new model uses a silicon substrate, novel etching strategies, a unique layout and innovative thermal management method," says Song, co-author of the study. "The combination of these manufacturing techniques allows the new design to be much smaller and keep lower temperatures than current LEDs using the same electrical power."
In the future, the researchers would also like to make the device stretchable, so that it can be used on any surface, such as deformable display monitors and biomedical devices that adapt to the curvilinear surfaces of the human body.

Source  EurekaAlert!

Thursday, June 16, 2011

LSU researchers see an indication of a new type of neutrino oscillation at the T2K experiment

T2K experiment has first results

BATON ROUGE – LSU Department of Physics Professors Thomas Kutter and Martin Tzanov, and Professor Emeritus William Metcalf, along with graduate and undergraduate students, have been working for several years on an experiment in Japan called T2K, or Tokai to Kamioka Long Baseline Neutrino Oscillation Experiment, which studies the most elusive of fundamental subatomic particles – the neutrino. The team announced they have an indication of a new type of neutrino transformation or oscillation from a muon neutrino to an electron neutrino.

In the T2K experiment in Japan, a beam of muon neutrinos – one of the three types of neutrinos, which also include the electron and tau – was produced in the Japan Proton Accelerator Research Complex, or J-PARC, located in Tokai village, Ibaraki prefecture, on the east coast of Japan. The beam was aimed at the gigantic Super-Kamiokande underground detector in Kamioka, near the west coast of Japan, 295 km, or 185 miles away from Tokai. An analysis of the detected neutrino-induced events in the Super-Kamiokande detector indicated that a small number of muon neutrinos traveling from Tokai to Kamioka transformed themselves into electron neutrinos.
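The probability of such a transformation is described by the standard oscillation formula. To leading order (a textbook vacuum-oscillation expression, not T2K's full analysis), P(νμ → νe) ≈ sin²θ23 · sin²2θ13 · sin²(1.27 Δm² L/E), with the baseline L in kilometres, the neutrino energy E in GeV and the mass-squared splitting Δm² in eV². The sketch below evaluates it for the 295 km Tokai-to-Kamioka baseline with illustrative parameter values (our assumptions, not numbers from the experiment):

```python
import math

def p_mu_to_e(E_GeV, L_km=295.0, dm2_eV2=2.4e-3, sin2_2theta13=0.1, sin2_theta23=0.5):
    """Leading-order nu_mu -> nu_e appearance probability in vacuum (no CP phase)."""
    return sin2_theta23 * sin2_2theta13 * math.sin(1.27 * dm2_eV2 * L_km / E_GeV) ** 2

for E in (0.4, 0.6, 1.0, 2.0):   # GeV; the off-axis T2K beam peaks near 0.6 GeV
    print(f"E = {E:.1f} GeV : P(nu_mu -> nu_e) ~ {p_mu_to_e(E):.3f}")
```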

As part of the experiment, high energy protons were directed onto a carbon target, where their collisions produced charged particles called pions, which travelled through a helium-filled volume where they decayed to produce a beam of the elusive neutrinos. These neutrinos then flew about 200 meters through the earth to a sophisticated detector system capable of making detailed measurements of their energy, direction and type.
"It took the international collaboration about ten years to realize the project and bring it from first idea to first results," said Kutter, leader of the T2K project at LSU. "The entire LSU team is honored to be part of the collaboration and proud to contribute to the experiment. We expect many more results in the near future and look forward to the new research opportunities which are likely to arise from the tantalizing indication of this new neutrino oscillation."

LSU physicists have been part of a number of measurements over the last decade, including Super-Kamiokande, SNO and KamLAND, that have shown that neutrinos possess the strange property of neutrino oscillations – one flavor of neutrino can transform into another as it travels through space. This is significant because neutrinos were first predicted theoretically in 1930, first actually detected in 1956 and for 50 years were assumed to have zero mass. But neutrino oscillations require mass.
With mysterious linkage between the three types, neutrinos challenge the understanding of the fundamental forces and basic constituents of matter. They may be related to the mystery of why there is more matter than anti-matter in the universe, and are the focus of intense study worldwide.

Precision measurements of neutrino oscillations can be made using artificial neutrino beams. This was pioneered in Japan by the K2K neutrino experiment in which neutrinos were produced at the KEK accelerator laboratory near Tokyo and were detected using the Super-Kamiokande neutrino detector, a 50,000 ton tank of ultra-pure water located more than half a mile underground in a laboratory 183 miles away near Toyama.
T2K is a more powerful and sophisticated version of the K2K experiment, with a more intense neutrino beam derived from the newly-built main ring synchrotron at the J-PARC accelerator laboratory. The beam was built by physicists from KEK in cooperation with other Japanese institutions and with assistance from American, Canadian, UK and French T2K institutes. The beam is aimed once again at Super-Kamiokande, which has been upgraded for this experiment with new electronics and software.

Before the neutrinos leave the J-PARC facility, their properties are determined by a sophisticated "near" detector, partly based on a huge magnet donated from the CERN accelerator laboratory in Geneva. The CERN magnet was earlier used for the UA1 experiment, which won the Nobel Prize for the discovery of the W and Z bosons, the carriers of the weak force that governs neutrino interactions. The LSU team was responsible for building major components of the "near" detector, which provided an important ingredient to the oscillation analysis.
During the next several years, the search will be improved, with the hope that the three-mode oscillation will allow a comparison of the oscillations of neutrinos and anti-neutrinos, probing the asymmetry between matter and anti-matter in the universe.

Source  EurekaAlert!

Wednesday, June 15, 2011

Penn Researchers Break Light-Matter Coupling Strength Limit in Nanoscale Semiconductors

PHILADELPHIA—New engineering research at the University of Pennsylvania demonstrates that polaritons have increased coupling strength when confined to nanoscale semiconductors. This represents a promising advance in the field of photonics: smaller and faster circuits that use light rather than electricity.
The research was conducted by assistant professor Ritesh Agarwal, postdoctoral fellow Lambert van Vugt and graduate student Brian Piccione of the Department of Materials Science and Engineering in Penn’s School of Engineering and Applied Science. Chang-Hee Cho and Pavan Nukala, also of the Materials Science department, contributed to the study.

 A computer simulation of a one-dimensional cavity wave in a 200nm nanowire.

Their work was published in the journal Proceedings of the National Academy of Sciences.
Polaritons are quasiparticles, combinations of physical particles and the energy they contribute to a system that can be measured and tracked as a single unit. Polaritons are combinations of photons and another quasiparticle, excitons. Together, they have qualities of both light and electric charge, without being fully either.
“An exciton is a combination of an electron, which has a negative charge, and an electron hole, which has a positive charge. Light is an oscillating electromagnetic field, so it can couple with the excitons,” Agarwal said.  “When their frequencies match, they can talk to one another; both of their oscillations become more pronounced.”

High light-matter coupling strength is a key factor in designing photonic devices, which would use light instead of electricity and thus be faster and use less power than comparable electronic devices.  However, the coupling strength exhibited within bulk semiconductors had always been thought of as a fixed property of the material they were made of.
Agarwal’s team proved that, with the proper fabrication and finishing techniques, this limit can be broken.
“When you go from bulk sizes to one micron, the light-matter coupling strength is pretty constant,” Agarwal said. “But, if you try to go below 500 nanometers or so, what we have shown is that this coupling strength increases dramatically.”

The difference is a function of one of nanotechnology’s principal phenomena: the traits of a bulk material differ from those of structures made of the same material at the nanoscale.
“When you’re working at bigger sizes, the surface is not as important. The surface to volume ratio — the number of atoms on the surface divided by the number of atoms in the whole material — is a very small number,” Agarwal said. “But when you make a very small structure, say 100 nanometers, this number is dramatically increased. Then what is happening on the surface critically determines the device’s properties.”
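Agarwal's point about the surface-to-volume ratio is easy to make quantitative with a rough geometric estimate (ours, not the paper's): treat the nanowire as a long cylinder and ask what fraction of its atoms sits within roughly one atomic layer of the surface.

```python
# Rough estimate: treat the nanowire as a long cylinder of diameter d and count
# the fraction of its volume (hence, roughly, of its atoms) lying within about
# one atomic layer (~0.3 nm) of the surface.
def surface_fraction(diameter_nm, layer_nm=0.3):
    radius = diameter_nm / 2.0
    return 1.0 - ((radius - layer_nm) / radius) ** 2

for d in (1000, 500, 100, 20):   # nm
    print(f"d = {d:5d} nm : ~{100 * surface_fraction(d):.1f}% of atoms near the surface")
```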
Other researchers have tried to make polariton cavities on this small a scale, but the chemical etching method used to fabricate the devices damages the semiconductor surface. The defects on the surface trap the excitons and render them useless.

“Our cadmium sulfide nanowires are self-assembled; we don’t etch them. But the surface quality was still a limiting factor, so we developed techniques of surface passivation. We grew a silicon oxide shell on the surface of the wires and greatly improved their optical properties,” Agarwal said.
The oxide shell fills the electrical gaps in the nanowire surface, preventing the excitons from getting trapped.
“We also developed tools and techniques for measuring this light-matter coupling strength,” Piccione said. “We’ve quantified the light-matter coupling strength, so we can show that it’s enhanced in the smaller structures.”
Being able to quantify this increased coupling strength opens the door for designing nanophotonic circuit elements and devices.
“The stronger you can make light-matter coupling, the better you can make photonic switches,” Agarwal said. “Electrical transistors work because electrons care what other electrons are doing, but, on their own, photons do not interact with each other. You need to combine optical properties with material properties to make it work.”

This research was supported by the Netherlands Organization for Scientific Research Rubicon Programme, the U.S. Army Research Office, the National Science Foundation, Penn’s Nano/Bio Interface Center and the National Institutes of Health.

Source University of Pennsylvania

Neutrinos caught 'shape shifting' in new way

Neutrinos have been caught spontaneously flip-flopping from one type to another in a way never previously seen. Further observations of this behaviour may shed light on how matter came to dominate over antimatter in the universe.

Neutrinos are among the most slippery particles known to physics. They rarely interact with ordinary matter, but massive experiments have been set up to detect the flashes of light produced when they do.
There are three known types, or flavours, of neutrino: electron, muon, and tau. Several experiments have found evidence that some flavours can spontaneously change into others, a phenomenon called neutrino oscillations. For example muon neutrinos can change into tau neutrinos.

 The first T2K neutrino event seen in the Super-Kamiokande in 2010. Each dot is a photomultiplier tube that has detected light (Image: T2K experiment)

Now, results from a Japanese experiment called T2K have tentatively added a new kind of transformation to the list of allowed types – the metamorphosis of muon neutrinos into electron neutrinos.
T2K generates muon neutrinos at the J-PARC accelerator in Tokai, Japan, and sends them in a beam towards the Super-Kamiokande neutrino detector in Kamioka, 295 kilometres away. It began operating in February 2010 and stopped gathering data in March, when Japan was rocked by the magnitude-9 megaquake.

Still tentative

On Wednesday, the team announced that six of the muon neutrinos that started off at J-PARC appear to have transformed into electron neutrinos before reaching Super-Kamiokande, where they were detected. This is the first time anyone has seen electron neutrinos show up in a beam of particles that started off as muon neutrinos.
"It shows the power of our experimental design that with only 2 per cent of our design data we are already the most sensitive experiment in the world for looking for this new type of oscillation," says T2K spokesperson Takashi Kobayashi of Japan's KEK particle physics laboratory.
However, the result is still tentative because of the small number of events seen and because of the possibility – considered rare – that muon neutrinos could be misidentified as electron neutrinos. Still, the researchers say experimental errors should give only 1.5 false events in the amount of data they analysed. There is only a 0.7 per cent chance of producing six false events.
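The quoted 0.7 per cent comes from the collaboration's full analysis, which folds in systematic uncertainties; a naive Poisson estimate with a background expectation of 1.5 events lands in the same ballpark. This back-of-envelope check is ours, not T2K's:

```python
import math

def poisson_p_at_least(k_obs, mean):
    """Probability of seeing k_obs or more events from a Poisson background."""
    return 1.0 - sum(math.exp(-mean) * mean**k / math.factorial(k) for k in range(k_obs))

# 6 observed events against an expected background of 1.5 false events.
print(f"P(N >= 6 | background = 1.5) ~ {poisson_p_at_least(6, 1.5):.3%}")   # roughly 0.4%
```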

Antimatter counterparts

The transformations appear to be happening relatively frequently. That means researchers will be able to quickly accumulate more events – once the experiment begins running again. The earthquake threw the accelerator used to make the neutrinos out of alignment. After adjustments are made, researchers hope to restart the experiment by year's end.
The researchers may eventually rerun the experiment with a beam of muon antineutrinos to see if their behaviour differs from their normal-matter counterparts.
If differences are found, it could help explain why there is a preponderance of matter in the universe. Standard theories say that matter and antimatter were created in equal amounts in the universe's first instants, but for unknown reasons, matter prevailed.

Skew the balance

Reactions involving neutrinos and antineutrinos in the early universe could have skewed the ratio of matter and antimatter production, leading to our matter-dominated universe. "You need some new laws of physics that aren't the same for matter and antimatter, and neutrino physics is one place you could put such laws," says David Wark of Imperial College London, who is a member of the T2K collaboration.
The US-based MiniBooNE experiment recently found hints of an antimatter version of the oscillation seen by T2K. MiniBooNE found signs that muon antineutrinos sometimes change into electron antineutrinos.
But physicists are still puzzling over the MiniBooNE results. Based on the experiment's design, it should not have seen oscillations unless there are one or more extra types of neutrino that are sterile, meaning they are even more averse to interacting with matter than regular neutrinos.
By contrast, the T2K result can be accommodated without invoking sterile neutrinos.

Source New Scientist

Why the universe wasn't fine-tuned for life

IF THE force of gravity were a few per cent weaker, it would not squeeze and heat the centre of the sun enough to ignite the nuclear reactions that generate the sunlight necessary for life on Earth. But if it were a few per cent stronger, the temperature of the solar core would have been boosted so much the sun would have burned out in less than a billion years - not enough time for the evolution of complex life like us.

In recent years many such examples of how the laws of physics have been "fine-tuned" for us to be here have been reported. Some religious people claim these "cosmic coincidences" are evidence of a grand design by a Supreme Being. In The Fallacy of Fine-tuning, physicist Victor Stenger delivers a devastating demolition of such arguments.


A general mistake made in the search for fine-tuning, he points out, is to vary just one physical parameter while keeping all the others constant. Yet a "theory of everything" - which alas we do not yet have - is bound to reveal intimate links between physical parameters. A change in one may be compensated by a change in another, says Stenger.

In addition to general mistakes, Stenger deals with specifics. For instance, British astronomer Fred Hoyle discovered that vital heavy elements can be built inside stars only because a carbon-12 nucleus can be made from the fusion of three helium nuclei. For the reaction to proceed, carbon-12 must have an energy level equal to the combined energy of the three helium nuclei, at the typical temperature inside a red giant. This has been touted as an example of fine-tuning. But, as Stenger points out, in 1989, astrophysicist Mario Livio showed that the carbon-12 energy level could actually have been significantly different and still resulted in a universe with the heavy elements needed for life.

The most striking example of fine-tuning appears to be the dark energy - or energy of the vacuum - that is speeding up the expansion of the universe. Naive calculations from quantum theory predict a value some 10¹²⁰ times bigger than what is observed. But Stenger stresses that this prediction is made in the absence of a quantum theory of gravity, when gravity is known to orchestrate the universe.
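That mismatch comes from comparing the observed dark-energy density with a naive vacuum-energy estimate in which the quantum theory is cut off at the Planck scale. Depending on exactly where the cutoff is placed, the ratio comes out somewhere around 10¹²⁰ to 10¹²³, which is why it is usually quoted simply as "about 10¹²⁰". A back-of-envelope version of the comparison (our numbers, not Stenger's):

```python
import math

# Naive vacuum-energy estimate: one Planck energy per Planck volume, compared
# with the observed dark-energy density (roughly 0.7 of the critical density).
E_planck = 1.96e9            # joules
l_planck = 1.62e-35          # metres
rho_vacuum_naive = E_planck / l_planck**3   # J/m^3, about 5e113
rho_dark_observed = 6e-10                   # J/m^3, approximate observed value
ratio = rho_vacuum_naive / rho_dark_observed
print(f"naive prediction / observation ~ 10^{math.log10(ratio):.0f}")   # ~10^123
```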
Even if some parameters turn out to be fine-tuned, Stenger argues this could be explained if ours is just one universe in a "multiverse" - an infinite number of universes, each with different physical parameters. We would then have ended up in the one where the laws of physics are fine-tuned to life because, well, how could we not have?

Religious people say that, by invoking a multiverse, physicists are going to extraordinary lengths to avoid God. But physicists have to go where the data lead them. And, currently, there are strong hints from string theory, the standard picture of cosmology and fine-tuning itself to suggest that the universe we can see with our biggest telescopes is only a small part of all that is there.

Source New Scientist

Researchers record two-state dynamics in glassy silicon

CHAMPAIGN, Ill. — Using high-resolution imaging technology, University of Illinois researchers have answered a question that had confounded semiconductor researchers: Is amorphous silicon a glass? The answer? Yes – until hydrogen is added.

Led by chemistry professor Martin Gruebele, the group published its results in the journal Physical Review Letters.

Amorphous silicon (a-Si) is a semiconductor popular for many device applications because it is inexpensive and can be created in a flexible thin film, unlike the rigid, brittle crystalline form of silicon. But the material has its own unusual qualities: It seems to have some characteristics of glass, but cannot be made the way other glasses are.

Most glasses are made by rapidly cooling a melted material so that it hardens in a random structure. But cooling liquid silicon simply results in an orderly crystal structure. Several methods exist for producing a-Si from crystalline silicon, including bombarding a crystal surface so that atoms fly off and deposit on another surface in a random position.

To settle the debate on the nature of a-Si, Gruebele’s group, collaborating with electrical and computer engineering professor Joseph Lyding’s group at the Beckman Institute for Advanced Science and Technology, used a scanning tunneling microscope to take sub nanometer-resolution images of a-Si surfaces, stringing them together to make a time-lapse video.

The video shows a lumpy, irregular surface; each lump is a cluster about five silicon atoms in diameter. Suddenly, between frames, one bump seems to jump to an adjoining space. Soon, another lump nearby shifts neatly to the right. Although few of the clusters move, the action is obvious.

Such cluster “hopping” between two positions is known as two-state dynamics, a signature property of glass. In a glass, the atoms or molecules are randomly positioned or oriented, much the way they are in a liquid or gas. But while atoms have much more freedom of motion to diffuse through a liquid or gas, in a glass the molecules or atom clusters are stuck most of the time in the solid. Instead, a cluster usually has only two adjoining places that it can ferry between.
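A toy kinetic Monte Carlo sketch (our illustration, not the Illinois group's analysis) makes the picture concrete: at each imaging frame, a cluster either stays put or hops to its single alternative site with a small probability.

```python
import random

def simulate_hops(n_frames, hop_probability=0.02):
    """Toy two-state dynamics: at each imaging frame the cluster either stays put
    or hops to its single alternative site with a small probability."""
    site, trajectory = 0, []
    for _ in range(n_frames):
        if random.random() < hop_probability:
            site = 1 - site          # jump to the only other available position
        trajectory.append(site)
    return trajectory

traj = simulate_hops(500)
print("hops seen in 500 frames:", sum(a != b for a, b in zip(traj, traj[1:])))
```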

“This is the first time that this type of two-state hopping has been imaged in a-Si,” Gruebele said. “It’s been predicted by theory and people have inferred it indirectly from other measurements, but this is the first time we’ve been able to visualize it.”

The group’s observations of two-state dynamics show that pure a-Si is indeed a glass, in spite of its unorthodox manufacturing method. However, a-Si is rarely used in its pure form; hydrogen is added to make it more stable and improve performance.

Researchers have long assumed that hydrogenation has little to no effect on the random structure of a-Si, but the group’s observations show that this assumption is not quite correct. In fact, adding hydrogen robs a-Si of its two-state dynamics and its categorization as a glass. Furthermore, the surface is riddled with signs of crystallization: larger clusters, cracks and highly structured patches.

Such micro-crystalline structure has great implications for the properties of a-Si and how they are studied and applied. Since most research has been conducted on hydrogenated a-Si, Gruebele sees a great opportunity to delve into the largely unknown characteristics of the glassy state.

“In some ways, I think we actually know less about the properties of glassy silicon than we think we do, because a lot of what’s been investigated of what people call amorphous or glassy silicon isn’t really completely amorphous,” Gruebele said. “We really need to revisit what the properties of a-Si are. There could yet be surprises in the way it functions and the kind of things that we might be able to do with it.”

Next, the group hopes to conduct temperature-dependent studies to further establish the activation barriers, or the energy “humps” that the clusters must overcome to move between positions.
The National Science Foundation supported this work.
Source University of Illinois

Monday, June 13, 2011

Under pressure, sodium, hydrogen could undergo a metamorphosis, emerging as superconductor

BUFFALO, N.Y. -- In the search for superconductors, finding ways to compress hydrogen into a metal has been a point of focus ever since scientists predicted many years ago that electricity would flow, uninhibited, through such a material.

Liquid metallic hydrogen is thought to exist in the high-gravity interiors of Jupiter and Saturn. But so far, on Earth, researchers have been unable to use static compression techniques to squeeze hydrogen under high enough pressures to convert it into a metal. Shock-wave methods have been successful, but as experiments with diamond anvil cells have shown, hydrogen remains an insulator even under pressures equivalent to those found in the Earth's core.

To circumvent the problem, a pair of University at Buffalo chemists has proposed an alternative solution for metallizing hydrogen: Add sodium to hydrogen, they say, and it just might be possible to convert the compound into a superconducting metal under significantly lower pressures.
The research, published June 10 in Physical Review Letters, details the findings of UB Assistant Professor Eva Zurek and UB postdoctoral associate Pio Baettig.

Using an open-source computer program that UB PhD student David Lonie designed, Zurek and Baettig looked for sodium polyhydrides that, under pressure, would be viable superconductor candidates. The program, XtalOpt <http://xtalopt.openmolecules.net>, is an evolutionary algorithm that incorporates quantum mechanical calculations to determine the most stable geometries or crystal structures of solids.
In analyzing the results, Baettig and Zurek found that NaH9, which contains one sodium atom for every nine hydrogen atoms, is predicted to become metallic at an experimentally achievable pressure of about 250 gigapascals -- about 2.5 million times the Earth's standard atmospheric pressure, but less than the pressure at the Earth's core (about 3.5 million atmospheres).
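A quick unit check on the figures quoted above (taking roughly 360 gigapascals for the pressure at the centre of the Earth, a standard textbook value rather than one from the paper):

```python
GPA = 1e9          # pascals per gigapascal
ATM = 101_325      # pascals per standard atmosphere
print(f"250 GPa ~ {250 * GPA / ATM / 1e6:.1f} million atmospheres")                        # ~2.5
print(f"360 GPa ~ {360 * GPA / ATM / 1e6:.1f} million atmospheres (centre of the Earth)")  # ~3.6
```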

"It is very basic research," says Zurek, a theoretical chemist. "But if one could potentially metallize hydrogen using the addition of sodium, it could ultimately help us better understand superconductors and lead to new approaches to designing a room-temperature superconductor."
By permitting electricity to travel freely, without resistance, such a superconductor could dramatically improve the efficiency of power transmission technologies.
Zurek, who joined UB in 2009, conducted research at Cornell University as a postdoctoral associate under Roald Hoffmann, a Nobel Prize-winning theoretical chemist whose research interests include the behavior of matter under high pressure.

In October 2009, Zurek co-authored a paper with Hoffmann and other colleagues in the Proceedings of the National Academy of Sciences predicting that LiH6 -- a compound containing one lithium atom for every six hydrogen atoms -- could form as a stable metal at a pressure of around 1 million atmospheres.
Neither LiH6 nor NaH9 exists naturally as a stable compound on Earth, but under high pressure their structures are predicted to be stable.

"One of the things that I always like to emphasize is that chemistry is very different under high pressures," Zurek says. "Our chemical intuition is based upon our experience at one atmosphere. Under pressure, elements that do not usually combine on the Earth's surface may mix, or mix in different proportions. The insulator iodine becomes a metal, and sodium becomes insulating. Our aim is to use the results of computational experiments in order to help develop a chemical intuition under pressure, and to predict new materials with unusual properties."

Source  EurekaAlert!

Sunday, June 12, 2011

First 'living' laser made from kidney cell

It's not quite Cyclops, the sci-fi superhero from the X-Men franchise whose eyes produce destructive blasts of light, but for the first time a laser has been created using a biological cell.
The human kidney cell that was used to make the laser survived the experience. In future such "living lasers" might be created inside live animals, which could potentially allow internal tissues to be imaged in unprecedented detail.

 A human cell, alive and lasing (Image: Malte Gather)

It's not the first unconventional laser. Other attempts include lasers made of Jell-O and powered by nuclear reactors. But how do you go about giving a living cell this bizarre ability?
Typically, a laser consists of two mirrors on either side of a gain medium – a material whose structural properties allow it to amplify light. A source of energy such as a flash tube or electrical discharge excites the atoms in the gain medium, releasing photons. Normally, these would shoot out in random directions, as in the broad beam of a flashlight, but a laser uses mirrors on either end of the gain medium to create a directed beam.

As photons bounce back and forth between the mirrors, repeatedly passing through the gain medium, they stimulate other atoms to release photons of exactly the same wavelength, phase and direction. Eventually, a concentrated single-frequency beam of light erupts through one of the mirrors as laser light.
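That balance between round-trip amplification and mirror losses is what sets the lasing threshold. A minimal sketch of the textbook threshold condition R1·R2·exp(2gL) = 1, with purely illustrative numbers for a cell-sized cavity (not values from the Harvard experiment):

```python
import math

def threshold_gain(R1, R2, cavity_length_m):
    """Gain coefficient (per metre) at which round-trip amplification exactly
    balances mirror transmission losses: R1 * R2 * exp(2 * g * L) = 1."""
    return -math.log(R1 * R2) / (2.0 * cavity_length_m)

# Purely illustrative numbers: two highly reflective mirrors spaced by ~20 micrometres.
g = threshold_gain(R1=0.99, R2=0.95, cavity_length_m=20e-6)
print(f"threshold gain ~ {g:.0f} per metre")
```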

Alive and well

Hundreds of different gain media have been used, including various dyes and gases. But no one has used living tissue. Mostly out of curiosity, Malte Gather and Seok-Hyun Yun of Harvard University decided to investigate with a single mammalian cell.
They injected a human kidney cell with a loop of DNA that codes for an enhanced form of green fluorescent protein (GFP). Originally isolated from jellyfish, GFP glows green when exposed to blue light and has been invaluable as a biological beacon, tracking the path of molecules inside cells and lighting up when certain genes are expressed.

After placing the cell between two mirrors, the researchers bombarded it with pulses of blue light until it began to glow. As the green light bounced between the mirrors, certain wavelengths were preferentially amplified until they burst through the semi-transparent mirrors as laser light. Even after a few minutes of lasing, the cell was still alive and well.
Christopher Fang-Yen of the University of Pennsylvania, who has worked on single-atom lasers but was not involved in the recent study, says he finds the new research fascinating. "GFP is similar to dyes used to make commercial dye lasers, so it's not surprising that if you put it in a little bag like a cell and pump it optically you should be able to get a laser," he says. "But the fact that they show it really works is very cool."

Internal imaging?

Yun's main aim was simply to test whether a biological laser was even possible, but he has also been mulling over a few possible applications. "We would like to have a laser inside the body of the animal, to generate laser light directly within the animal's tissue," he says.
In a technique called laser optical tomography, laser beams are fired from outside the body at living tissues. The way the light is transmitted and scattered can reveal the tissues' size, volume and depth, and produce an image. Being able to image from within the body might give much more detailed images. Another technique, called fluorescence microscopy, relies on the glow from living cells doped with GFP to produce images. Yun's biological laser could improve its resolution with brighter laser light.

To turn cells inside a living animal into lasers, they would have to be engineered to express GFP so that they were able to glow. The mirrors in Yun's laser would have to be replaced with nanoscale-sized bits of metal that act as antennas to collect the light.
"Previously the laser was considered an engineering material, and now we are showing the concept of the laser can be integrated into biological systems," says Yun.

Source New Scientist

Friday, June 10, 2011

UGA researcher leads discovery of a new driving force for chemical reactions

Athens, Ga. – New research just published in the journal Science by a team of chemists at the University of Georgia and colleagues in Germany shows for the first time that a mechanism called tunneling control may drive chemical reactions in directions unexpected from traditional theories.
The finding has the potential to change how scientists understand and devise reactions in everything from materials science to biochemistry.

The discovery was a complete surprise and came following the first successful isolation of a long-elusive molecule called methylhydroxycarbene by the research team. While the team was pleased that it had "trapped" the prized compound in solid argon through an extremely low-temperature experiment, they were surprised when it vanished within a few hours. That prompted UGA theoretical chemistry professor Wesley Allen to conduct large scale, state-of-the-art computations to solve the mystery.
"What we found was that the change was being controlled by a process called quantum mechanical tunneling," said Allen, "and we found that tunneling can supersede the traditional chemical reactivity processes of kinetic and thermodynamic control. We weren't expecting this at all."
What had happened? Clearly, a chemical reaction had taken place, but only inert argon atoms surrounded the compound, and essentially no thermal energy was available to create new molecular arrangements. Moreover, said Allen, "the observed product of the reaction, acetaldehyde, is the least likely outcome among conceivable possibilities."

Other authors of the paper include Professor Peter Schreiner and his group members Hans Peter Reisenauer, David Ley and Dennis Gerbig of the Justus-Liebig University in Giessen, Germany. Graduate student Chia-Hua Wu at UGA undertook the theoretical work with Allen.
Quantum tunneling isn't new. It was first recognized as a physical process decades ago in early studies of radioactivity. In classical mechanics, molecular motions can be understood in terms of particles roaming on a potential energy surface. Energy barriers, visualized as mountain passes on the surface, separate one chemical compound from another.


For a chemical reaction to occur, a molecular system must have enough energy to "get over the top of the hill," or it will come back down and fail to react. In quantum mechanics, particles can get to the other side of the barrier by tunneling through it, a process that seemingly requires imaginary velocities. In chemistry, tunneling is generally understood to provide secondary correction factors for the rates of chemical reactions but not to provide the predominant driving force.

(The strange world of quantum mechanics has been subject to considerable interest and controversy over the last century, and Austrian physicist Erwin Schrödinger's thought-experiment called "Schrödinger's Cat" illustrates how perplexing it is to apply the rules and laws of quantum mechanics to everyday life.)
"We knew that the rate of a reaction can be significantly affected by quantum mechanical tunneling," said Allen. "It becomes especially important at low temperatures and for reactions involving light atoms. What we discovered here is that tunneling can dominate a reaction mechanism sufficiently to redirect the outcome away from traditional kinetic control. Tunneling can cause a reaction that does not have the lowest activation barriers to occur exclusively."

Allen suggests a vivid analogy between the behavior of methylhydroxycarbene and Schrödinger's iconic cat.
"The cat cannot jump out of its box of deadly confinement because the walls are too high, so it conjures a Houdini-like escape by bursting through the thinnest wall," he said.
The fact that new ideas about tunneling came from the isolation of methylhydroxycarbene was the kind of serendipity that runs through the history of science. Schreiner and his team had snagged the elusive compound, and that was reason enough to celebrate, Allen said. But the surprising observation that it vanished within a few hours raised new questions that led to even more interesting scientific discoveries.

"The initiative to doggedly follow up on a 'lucky observation' was the key to success," said Allen. "Thus, a combination of persistent experimentation and exacting theoretical analysis on methylhydroxycarbene and its reactivity led to the concept I dubbed tunneling control, which may be characterized as `a type of nonclassical kinetic control wherein the decisive factor is not the lowest activation barrier'."
While the process was unearthed for the specific case of methylhydroxycarbene at extremely low temperatures, Allen said that tunneling control "can be a general phenomenon, especially if hydrogen transfer is involved, and such processes need not be restricted to cryogenic temperatures."

Source EurekaAlert!

Monday, June 6, 2011

Erase entangled memory to cool a computer

Imagine cooling a supercomputer not with fans or freezers, but by deleting some of its memory. New calculations show that this is possible, provided some of the bits that make up the computer's memory are "entangled" – a spooky property that can link two quantum systems, no matter how far apart they sit in physical space.
The notion of cooling by erasure seemingly violates a principle articulated by physicist Rolf Landauer in 1961. He showed that erasing information is akin to a decrease in entropy or disorder. As entropy overall must always increase, the deletion of bits must therefore be accompanied by an increase in the entropy of the surroundings, which manifests itself as heat.
The heat produced by a computer today is mainly due to processing inefficiencies, and while these can be reduced, Landauer's insight implies a fundamental limit on how much you can reduce the heat generated by computing.
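Landauer's bound puts that floor at kT·ln 2 of heat per erased bit. At room temperature it is a tiny number, far below what real chips dissipate today, as a quick calculation shows (ours, for illustration):

```python
import math

k_B = 1.380649e-23    # J/K, Boltzmann constant
T = 300.0             # K, roughly room temperature
print(f"Landauer minimum at 300 K: {k_B * T * math.log(2):.2e} J per erased bit")  # ~2.9e-21 J
```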
Now, Lídia del Rio of the Swiss Federal Institute of Technology in Zurich and colleagues have shown that quantum entanglement provides a way to sneak around Landauer's law.

Entropy dip

To understand how this might work, consider two people who are each trying to erase a string of bits in computer memory, which can exist either as 1s or 0s. One of the pair has no knowledge of the stored bits, so to ensure they get erased, he or she must always reset them to "0", regardless of their original content. The second person, however, knows the content of the string and so need only reset those bits that are 1s.
In this situation, the first person has to do more work on average to erase the string than the second. As a result, the "conditional entropy" of the memory is said to be lower for the second person than for the first.
Now imagine that the memory bits to be erased are entangled with other objects. In such a system, observing or determining the state of one part immediately fixes the state of the other. So an observer who has access to the entangled objects could know even more about the memory than would be possible otherwise, causing the conditional entropy of the system to dip and become negative when the memory is erased.
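For a maximally entangled pair of qubits the conditional von Neumann entropy S(A|B) = S(AB) - S(B) works out to -1 bit, which is the "negative entropy" the erasure argument exploits. A small numerical check of that statement (our sketch, not the authors' calculation):

```python
import numpy as np

def entropy_bits(rho):
    """Von Neumann entropy of a density matrix, in bits."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log2(evals)) + 0.0)

# Maximally entangled Bell state (|00> + |11>) / sqrt(2), basis ordered as |a b>.
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2.0)
rho_ab = np.outer(psi, psi)
rho_b = rho_ab.reshape(2, 2, 2, 2).trace(axis1=0, axis2=2)   # trace out subsystem A
S_ab, S_b = entropy_bits(rho_ab), entropy_bits(rho_b)
print("S(AB) =", S_ab)                         # 0.0  (pure state)
print("S(B)  =", S_b)                          # 1.0  (maximally mixed qubit)
print("S(A|B) = S(AB) - S(B) =", S_ab - S_b)   # -1.0
```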

Weirdness at work

Del Rio and colleagues have shown mathematically that this negative conditional entropy is the equivalent of extracting heat from the surroundings, or cooling.
The team envisages future computers containing entangled systems of this kind. Deletion of some of a computer's memory should lead to cooling. "If you go to this entangled level of operations, then you will be at the limit of what physics allows you to do," says team member Vlatko Vedral of the University of Oxford.
This does not violate the laws of thermodynamics: there is still an overall increase in entropy because energy is needed to create the entangled system initially.
At the moment, entangled states are not easy to work with: they require extreme cooling and are notoriously fragile. Still, Robert Prevedel of the University of Waterloo, Ontario, Canada, who was not involved in the work, is impressed by the idea. "This demonstrates that the weird features of the quantum world are not only useful as an informational resource, but can actually be used to generate some real, physical work," he says.

Source  New Scientist

Saturday, June 4, 2011

University of Toronto scientist leads international team in quantum physics first

TORONTO, ON - Quantum mechanics is famous for saying that a tree falling in a forest when there's no one there doesn't make a sound. Quantum mechanics also says that if anyone is listening, it interferes with and changes the tree. And so the famous paradox: how can we know reality if we cannot measure it without distorting it?
An international team of researchers, led by University of Toronto physicist Aephraim Steinberg of the Centre for Quantum Information and Quantum Control, has found a way to do just that by applying a modern measurement technique to the historic two-slit interferometer experiment, in which a beam of light shone through two slits produces an interference pattern on a screen behind.
That famous experiment, and the 1927 debates between Niels Bohr and Albert Einstein, seemed to establish that you could not watch a particle go through one of two slits without destroying the interference effect: you had to choose which phenomenon to look for.
 Patterns emerging from the famous double-slit experiment.
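The interference pattern itself is standard textbook optics: a single-slit envelope multiplying a cosine-squared fringe term. The sketch below computes that far-field pattern for illustrative slit parameters (not the ones used in the Toronto interferometer); the new experiment reconstructs the average photon trajectories that build up exactly this kind of pattern.

```python
import numpy as np

def two_slit_intensity(x_m, wavelength_m=500e-9, slit_sep_m=250e-6,
                       slit_width_m=10e-6, screen_dist_m=1.0):
    """Fraunhofer two-slit pattern: single-slit envelope times cos^2 fringes,
    normalised to 1 at the centre of the screen."""
    theta = x_m / screen_dist_m                         # small-angle approximation
    beta = np.pi * slit_width_m * theta / wavelength_m
    envelope = np.sinc(beta / np.pi) ** 2               # np.sinc(x) = sin(pi x)/(pi x)
    fringes = np.cos(np.pi * slit_sep_m * theta / wavelength_m) ** 2
    return envelope * fringes

x = np.linspace(-4e-3, 4e-3, 9)                         # screen positions in metres
for xi, intensity in zip(x, two_slit_intensity(x)):
    print(f"x = {1e3 * xi:+.1f} mm : I = {intensity:.3f}")
```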

"Quantum measurement has been the philosophical elephant in the room of quantum mechanics for the past century," says Steinberg, who is lead author of Observing the Average Trajectories of Single Photons in a Two-Slit Interferometer, to be published in Science on June 2. "However, in the past 10 to 15 years, technology has reached the point where detailed experiments on individual quantum systems really can be done, with potential applications such as quantum cryptography and computation."

With this new experiment, the researchers have succeeded for the first time in experimentally reconstructing full trajectories which provide a description of how light particles move through the two slits and form an interference pattern. Their technique builds on a new theory of weak measurement that was developed by Yakir Aharonov's group at Tel Aviv University. Howard Wiseman of Griffith University proposed that it might be possible to measure the direction a photon (particle of light) was moving, conditioned upon where the photon is found. By combining information about the photon's direction at many different points, one could construct its entire flow pattern, i.e. the trajectories it takes to a screen.

"In our experiment, a new single-photon source developed at the National Institute for Standards and Technology in Colorado was used to send photons one by one into an interferometer constructed at Toronto. We then used a quartz calcite, which has an effect on light that depends on the direction the light is propagating, to measure the direction as a function of position. Our measured trajectories are consistent, as Wiseman had predicted, with the realistic but unconventional interpretation of quantum mechanics of such influential thinkers as David Bohm and Louis de Broglie," said Steinberg.

The original double-slit experiment played a central role in the early development of quantum mechanics, leading directly to Bohr's formulation of the principle of complementarity. Complementarity states that observing particle-like or wave-like behaviour in the double-slit experiment depends on the type of measurement made: the system cannot behave as both a particle and wave simultaneously. Steinberg's recent experiment suggests this doesn't have to be the case: the system can behave as both.

"By applying a modern measurement technique to the historic double-slit experiment, we were able to observe the average particle trajectories undergoing wave-like interference, which is the first observation of its kind. This result should contribute to the ongoing debate over the various interpretations of quantum theory," said Steinberg. "It shows that long-neglected questions about the different types of measurement possible in quantum mechanics can finally be addressed in the lab, and weak measurements such as the sort we use in this work may prove crucial in studying all sorts of new phenomena.

"But mostly, we are all just thrilled to be able to see, in some sense, what a photon does as it goes through an interferometer, something all of our textbooks and professors had always told us was impossible."
Research partners include the University of Toronto's Centre for Quantum Information and Quantum Control, Department of Physics and Institute for Optical Sciences, the National Institute of Standards and Technology in Boulder, Colorado, the Institute for Quantum Computing at the University of Waterloo, Griffith University, Australia, and the Laboratoire Charles Fabry in Orsay, France. Research was funded by the Natural Sciences and Engineering Research Council of Canada, the Canadian Institute for Advanced Research, and Quantum Works.

Source  EurekaAlert!

Thursday, May 26, 2011

Quantum sensor tracked in human cells could aid drug discovery

Groundbreaking research has shown that an atom acting as a quantum sensor can be tracked inside a living human cell, an advance that may lead to improvements in the testing and development of new drugs.

Professor Lloyd Hollenberg of the University of Melbourne's School of Physics, who led the research, said it is the first time a single atom encased in nanodiamond has been used as a sensor to explore the nanoscale environment inside a living human cell.

“It is exciting to see how the atom experiences the biological environment at the nanoscale,” he said.
“This research paves the way towards a new class of quantum sensors used for biological research into the development of new drugs and nanomedicine.”

The sensor is capable of detecting biological processes at a molecular level, such as the movement of chemicals into and out of the cell, which is critical to understanding how drugs work.
The paper has been published in the journal Nature Nanotechnology.

Funded by the ARC Centre of Excellence for Quantum Computation and Communication Technology, the research was conducted by a cross-disciplinary team from the University of Melbourne’s Physics, Chemistry, and Chemical and Biomolecular Engineering departments.
The researchers developed state-of-the-art technology to control and manipulate the atom in the nanodiamond before inserting it into the human cells in the lab.

Biologist Dr Yan Yan of the University's Department of Chemical and Biomolecular Engineering, who works in the field of nanomedicine, said the sensor provides critical information about the movement of the nanodiamonds inside the living cell.
"This is important for the new field of nanomedicine, where drug delivery is dependent on the uptake of similar-sized nanoparticles into the cell."

Quantum physicist and PhD student Liam McGuinness from the University’s School of Physics said that monitoring the atomic sensor in a living cell was a significant achievement.  “Previously, these atomic level quantum measurements could only be achieved under carefully controlled conditions of a physics lab,” he said.
It is hoped that in the next few years, following these proof-of-principle experiments, the researchers will be able to develop the technology into a new set of tools for drug discovery and nanomedicine.

Source The University of Melbourne

Wednesday, May 25, 2011

Ultracold measurements reveal shape of the electron

WHAT shape is an electron? The standard model of particle physics predicts that electrons are egg-shaped, but that the amount of distortion from a perfect sphere is so tiny that no existing experiment could possibly detect it. However, a rival theory called supersymmetry predicts that this egg-shaped distortion should be large enough to be detectable.

Jony Hudson and colleagues at Imperial College London set out to crack the problem. They used ultracold molecules of ytterbium fluoride in which the centres of positive and negative charge differ, creating a dipole. The shape of this dipole reflects any asymmetry in the electron's shape, and the team measured this by placing the molecules in an electric and a magnetic field and observing how they spin as the fields are changed. Variations in the rate of spin reveal any asymmetry. Their experiment measured the shape to within a few parts in 10¹⁸, but as far as they could tell, rather than being oval, the electron is spherical (Nature, DOI: 10.1038/nature10104).
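The logic of the measurement follows the standard electric-dipole-moment spin-precession scheme; as a hedged illustration (the symbols below are generic and not taken from the paper), any electron dipole moment shifts the molecules' spin-precession frequency by an amount that reverses sign when the electric field is reversed:

```latex
% Generic EDM spin-precession relation, for illustration only (not quoted from the paper).
% An electron electric dipole moment d_e coupling to the large effective internal
% field E_eff of the polarised YbF molecule shifts the two spin levels by
\Delta E \;=\; \mp\, d_e\, E_{\mathrm{eff}}
% so the spin-precession frequency changes by
\Delta \nu \;=\; \frac{2\, d_e\, E_{\mathrm{eff}}}{h}
% Crucially, this shift flips sign when the applied electric field is reversed,
% which separates a tiny d_e (and hence any asphericity of the electron's charge
% distribution) from the much larger magnetic contributions to the precession.
```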

The result is a challenge to supersymmetry: while the standard model suggests the electron is egg-shaped by only one part in 10²⁸, supersymmetry sets the range at between one part in 10¹⁴ and one part in 10¹⁹.
"We cannot rule out supersymmetry but we're certainly putting pressure on the theory," says Hudson.
An improvement of one order of magnitude could either confirm supersymmetry or rule it out, something the Imperial team now aims to achieve.

Source New Scientist

Monday, May 16, 2011

Toward faster transistors

MIT physicists discover a new physical phenomenon that could eventually lead to the first increases in computers’ clock speed since 2002.

In the 1980s and ’90s, competition in the computer industry was all about “clock speed” — how many megahertz, and ultimately gigahertz, a chip could boast. But clock speeds stalled out almost 10 years ago: Chips that run faster also run hotter, and with existing technology, there seems to be no way to increase clock speed without causing chips to overheat.

The researchers’ experimental setup consisted of a sample of the lanthanum aluminate-strontium titanate composite, which looks like a slab of thick glass, with thin electrodes deposited on top of it. 

In this week’s issue of the journal Science, MIT researchers and their colleagues at the University of Augsburg in Germany report the discovery of a new physical phenomenon that could yield transistors with greatly enhanced capacitance, a measure of how much charge they can store for a given voltage. And that, in turn, could lead to the revival of clock speed as the measure of a computer’s power.

In today’s computer chips, transistors are made from semiconductors, such as silicon. Each transistor includes an electrode called the gate; applying a voltage to the gate causes electrons to accumulate underneath it. The electrons constitute a channel through which an electrical current can pass, turning the semiconductor into a conductor.

Capacitance measures how much charge accumulates below the gate for a given voltage. The power that a chip consumes, and the heat it gives off, are roughly proportional to the square of the gate’s operating voltage. So lowering the voltage could drastically reduce the heat, creating new room to crank up the clock.
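For reference, the standard textbook relations behind that argument (these are generic circuit-physics formulas, not results from the MIT paper) are:

```latex
% Generic relations, included for illustration only.
Q \;=\; C\,V
\qquad\qquad
P_{\mathrm{dynamic}} \;\approx\; \alpha\, f\, C\, V^{2}
% Q : channel charge induced under the gate        C : gate capacitance
% V : operating voltage                             f : clock frequency
% \alpha : switching-activity factor
% A higher-capacitance gate induces the same channel charge at a lower voltage,
% and because switching power scales as V^2, halving the voltage cuts the heat
% by roughly a factor of four, leaving thermal headroom to raise the clock f.
```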

Shoomp!

MIT Professor of Physics Raymond Ashoori and Lu Li, a postdoc and Pappalardo Fellow in his lab — together with Christoph Richter, Stefan Paetel, Thilo Kopp and Jochen Mannhart of the University of Augsburg — investigated the unusual physical system that results when lanthanum aluminate is grown on top of strontium titanate. Lanthanum aluminate consists of alternating layers of lanthanum oxide and aluminum oxide. The lanthanum-based layers have a slight positive charge; the aluminum-based layers, a slight negative charge. The result is a series of electric fields that all add up in the same direction, creating an electric potential between the top and bottom of the material.

Ordinarily, both lanthanum aluminate and strontium titanate are excellent insulators, meaning that they don’t conduct electrical current. But physicists had speculated that if the lanthanum aluminate gets thick enough, its electrical potential would increase to the point that some electrons would have to move from the top of the material to the bottom, to prevent what’s called a “polarization catastrophe.” The result is a conductive channel at the juncture with the strontium titanate — much like the one that forms when a transistor is switched on. So Ashoori and his collaborators decided to measure the capacitance between that channel and a gate electrode on top of the lanthanum aluminate.

They were amazed by what they found: Although their results were somewhat limited by their experimental apparatus, it may be that an infinitesimal change in voltage will cause a large amount of charge to enter the channel between the two materials. “The channel may suck in charge — shoomp! Like a vacuum,” Ashoori says. “And it operates at room temperature, which is the thing that really stunned us.”

Indeed, the material’s capacitance is so high that the researchers don’t believe it can be explained by existing physics. “We’ve seen the same kind of thing in semiconductors,” Ashoori says, “but that was a very pure sample, and the effect was very small. This is a super-dirty sample and a super-big effect.” It’s still not clear, Ashoori says, just why the effect is so big: “It could be a new quantum-mechanical effect or some unknown physics of the material.”

“For capacitance, there is a formula that was assumed to be correct and was used in the computer industry and is in all the textbooks,” says Jean-Marc Triscone, a professor of physics at the University of Geneva whose group has published several papers on the juncture between lanthanum aluminate and strontium titanate. “What the MIT team and Mannhart showed is that to describe their system, this formula has to be modified.”

There is one drawback to the system that the researchers investigated: While a lot of charge will move into the channel between materials with a slight change in voltage, it moves slowly — much too slowly for the type of high-frequency switching that takes place in computer chips. That could be because the samples of the material are, as Ashoori says, “super dirty”; purer samples might exhibit less electrical resistance. But it’s also possible that, if researchers can understand the physical phenomena underlying the material’s remarkable capacitance, they may be able to reproduce them in more practical materials.

Triscone cautions that wholesale changes to the way computer chips are manufactured will inevitably face resistance. “So much money has been injected into the semiconductor industry for decades that to do something new, you need a really disruptive technology,” he says.

“It’s not going to revolutionize electronics tomorrow,” Ashoori agrees. “But this mechanism exists, and once we know it exists, if we can understand what it is, we can try to engineer it.”

Source  MIT News

Sunday, May 15, 2011

Scientists looking to burst the superconductivity bubble

Bubbles are blocking the current path of one of the most promising high temperature superconducting materials, new research suggests.

In a study published today, Monday, 16 May, in IOP Publishing's journal Superconductor Science and Technology, researchers examined bismuth strontium calcium copper oxide (Bi2Sr2CaCu2Ox, Bi2212) – one of the most promising superconducting materials, capable of creating large magnetic fields well beyond the limit of existing magnets – and found that its capabilities are limited by the formation of bubbles during its fabrication process.

Bi2212 is the only high temperature superconductor capable of being made into round wire, providing the preferred flexibility in magnet construction, and giving it potential uses in medical imaging and particle accelerators, such as the Large Hadron Collider in Switzerland.

For magnet applications, these wires must exhibit a high critical current density – the current density at which electrical resistance develops – and sustain it under large magnetic fields. This remains a stumbling block for utilising the huge potential of Bi2212 in magnet technology, as compellingly high critical current densities have not yet been achieved.

Previous studies have shown that the critical current varies widely with Bi2212 wire length – in wires 50 to 200 m long it was 20 to 50% lower than in 5 to 10 cm samples. This led the researchers, from the Applied Superconductivity Centre and the National High Magnetic Field Laboratory, Florida State University, to conclude that this variability must be caused by the connectivity of Bi2212 grains within the wires.

Bi2212 wires, made up of multiple filaments, are fabricated using the powder-in-tube (PIT) method, in which Bi2212 powder is packed inside silver tubes and drawn down to the desired size. The filaments of Bi2212 powder must first be melted inside their silver sheath and then slowly cooled to allow the Bi2212 to reform, greatly enhancing the critical current density.

As the processes between the critical melt and regrowth steps are still largely unknown, the researchers decided to rapidly cool samples at different times in the melting process in order to get a snapshot of what occurs inside Bi2212 wires.

Using a scanning electron microscope and synchrotron X-ray microtomography, the researchers observed that the small powder pores, inherent to the PIT process, agglomerate into large bubbles on entering the melting stage.

The consequences are major: the Bi2212 filaments become divided into discrete, well-connected segments separated by residual bubbles, greatly reducing the long-range filament connectivity and strongly suppressing the flow of current.

The new findings suggest that a key approach to improve the critical current density of the material would be to make it denser before melting.
Lead author Dr Fumitake Kametani, of the Applied Superconductivity Centre, Florida State University, said, "Our study suggested that a large portion of the bubbles originates from the 30-40% of empty space, inevitable in any powder-in-tube process, which requires particle rolling to allow deformation of the metal-powder composite wire."

"Densification of the filaments at final size - increasing the powder-packing density from 60-70% to greater than 90% - is an excellent way to reduce or eliminate the bubble formation. Various densification processes are now being tested."

Source EurekaAlert!