Wednesday, December 16, 2015

Scientists watch a planet being born for the first time in history

For the first time ever, scientists have been able to see planets as they are born. In photographs obtained with the Large Binocular Telescope and the Magellan Adaptive Optics System, astronomers watched a ring of material around a young star forming into planets. The finding could lead to the detection of other forming exoplanets and help answer how planets form and then evolve into solar systems such as ours.

The telescope and optics system were able to photograph LkCa 15, a young star about 450 light years from Earth around which a gas-giant exoplanet is forming. The LkCa 15 system, according to Space.com, features a “disk of dust and gas” around a “sun-like” star that is just two million years old. The scientific team, led by Stephanie Sallum, a graduate student at the University of Arizona, used the Large Binocular Telescope, an observatory in southeastern Arizona that has two 27-foot-wide mirrors.
The scientists confirmed that one giant protoplanet (not quite a planet yet) exists there and named it LkCa 15b. They were able to see it in hydrogen-alpha photons, a type of light emitted when “superheated material accretes onto a newly forming world.” Essentially, the new planets are surrounded by “feeder” material.
Another newborn planet, LkCa 15c, also sits inside the gap between the star and the dust ring, and it is possible that a third, LkCa 15d, is there as well. “We’re seeing sources in the clearing,” Sallum said. “This is the first time that we’ve been able to connect a forming planet to a gap in a protoplanetary disk.”

Google, NASA: Our Quantum Computer Is 100 Million Times Faster Than A Normal PC

But only for very specific optimisation problems.
Two years ago Google and NASA went halfsies on a D-Wave quantum computer, mostly to find out whether there are actually any performance gains to be had when using quantum annealing instead of a conventional computer. Recently, Google and NASA received the latest D-Wave 2X quantum computer, which the company says has “over 1,000 qubits.”
At an event yesterday at the NASA Ames Research Center, where the D-Wave computer is kept, Google and NASA announced their latest findings—and for highly specialised workloads, quantum annealing does appear to offer a truly sensational performance boost. For an optimisation problem involving 945 binary variables, the D-Wave 2X is up to 100 million times (10⁸) faster than the same problem running on a single-core classical (conventional) computer.
Google and NASA also compared the D-Wave 2X’s quantum annealing against Quantum Monte Carlo, an algorithm that emulates quantum tunnelling on a conventional computer. Again, a speed-up of up to 10⁸ was seen in some cases.

Hartmut Neven, the head of Google’s Quantum Artificial Intelligence lab, said these results are “intriguing and very encouraging” but that there’s still “more work ahead to turn quantum enhanced optimization into a practical technology.”
As always, it’s important to note that D-Wave’s computers are not capable of universal computing: they are only useful for a small number of very specific tasks—and Google, NASA, and others are currently trying to work out what those tasks might be. D-Wave’s claim of “over 1,000 qubits” is also unclear. In the past, several physical qubits were clustered to create a single computational qubit, and D-Wave doesn’t make that distinction clear.
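
To make the benchmark concrete: quantum annealers attack binary optimisation problems of the QUBO form, minimising xᵀQx over x ∈ {0,1}ⁿ, and the single-core classical baseline in comparisons like this is typically simulated annealing. Below is a minimal simulated-annealing sketch in Python; the cost matrix, problem size, and cooling schedule are illustrative stand-ins, not Google's actual 945-variable benchmark instance.

```python
import math
import random

def simulated_annealing(Q, n, steps=20000, t_start=2.0, t_end=0.01):
    """Classical simulated annealing for a QUBO: minimise x^T Q x, x in {0,1}^n."""
    x = [random.randint(0, 1) for _ in range(n)]

    def energy(x):
        return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

    e = energy(x)
    best_x, best_e = x[:], e
    for step in range(steps):
        # Exponential cooling from t_start down to t_end.
        t = t_start * (t_end / t_start) ** (step / steps)
        i = random.randrange(n)            # propose flipping one bit
        x[i] ^= 1
        e_new = energy(x)
        # Always accept downhill moves; accept uphill with Boltzmann probability.
        if e_new <= e or random.random() < math.exp((e - e_new) / t):
            e = e_new
            if e < best_e:
                best_x, best_e = x[:], e
        else:
            x[i] ^= 1                      # reject: undo the flip
    return best_x, best_e

# Toy 8-variable instance with random couplings (illustrative only).
random.seed(0)
n = 8
Q = [[random.uniform(-1, 1) for _ in range(n)] for _ in range(n)]
solution, cost = simulated_annealing(Q, n)
print(solution, cost)
```

The 10⁸ claim is about how this kind of stochastic search scales on one CPU core versus the annealer on particular problem instances, not about general-purpose speed.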

Scientists explain origin of heavy elements in the Universe (new.huji.ac.il)

In a letter published in the prestigious journal Nature Physics, a team of scientists from The Hebrew University of Jerusalem suggests a solution to the Galactic radioactive plutonium puzzle.
All the plutonium used on Earth is artificially produced in nuclear reactors, yet it turns out that it is also produced in nature.
“The origin of heavy elements produced in nature through rapid neutron capture (‘r-process’) by seed nuclei is one of the current nucleosynthesis mysteries,” Dr. Kenta Hotokezaka, Prof. Tsvi Piran and Prof. Michael Paul from the Racah Institute of Physics at the Hebrew University of Jerusalem said in their letter.
Plutonium is a radioactive element. Its longest-lived isotope is plutonium-244, with a lifetime of 120 million years.
Detecting plutonium-244 in nature would imply that the element was synthesized in astrophysical phenomena not so long ago (at least on Galactic time scales), and hence that its origin cannot be too far from us.
Several years ago it was discovered that the early Solar system contained a significant amount of plutonium-244. Given its relatively short lifetime, the plutonium-244 that existed over four billion years ago, when Earth formed, has long since decayed, but its daughter elements have been detected.
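That decay argument is just the exponential-decay law N(t) = N₀·e^(−t/τ). A quick sketch using the 120-million-year lifetime quoted above:

```python
import math

TAU = 120e6  # mean lifetime of plutonium-244 in years, as quoted above

def surviving_fraction(t_years, tau=TAU):
    """Fraction of an initial Pu-244 sample left after t_years: exp(-t/tau)."""
    return math.exp(-t_years / tau)

# Over the ~4.6 billion years since the Solar system formed, essentially
# none of the original Pu-244 survives (fraction ~ 2e-17):
print(f"{surviving_fraction(4.6e9):.2e}")

# But material deposited within the last 100 million years would still
# largely survive (fraction ~ 0.43), so today's low flux is meaningful:
print(f"{surviving_fraction(100e6):.2f}")
```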
But recent measurements of the deposition of plutonium-244, including analyses of Galactic debris that fell to Earth and settled in the deep sea, suggest that only a very small amount of plutonium has reached Earth from outer space over the past 100 million years. This is in striking contradiction to its abundance when the Solar system was formed, which is why the Galactic radioactive plutonium remained a puzzle.
The Hebrew University team has shown that these conflicting observations can be reconciled if the source of the radioactive plutonium (as well as of other rare elements, such as gold and uranium) is the merger of binary neutron stars. These mergers are extremely rare events but are expected to produce large amounts of heavy elements.
The model implies that such a merger happened, by chance, in the vicinity of our Solar system less than a hundred million years before it was born, which accounts for the relatively large amount of plutonium-244 observed in the early Solar system.
On the other hand, the relatively small amount of plutonium-244 reaching Earth from interstellar space today is simply accounted for by the rarity of these events: no such event has occurred in the vicinity of our Solar system in the last 100 million years.

Computing with time travel

Why send a message back in time, but lock it so that no one can ever read the contents? Because it may be the key to solving currently intractable problems. That’s the claim of an international collaboration who have just published a paper in npj Quantum Information.
It turns out that an unopened message can be exceedingly useful. This is true if the experimenter entangles the message with some other system in the laboratory before sending it. Entanglement, a strange effect only possible in the realm of quantum physics, creates correlations between the time-travelling message and the laboratory system. These correlations can fuel a quantum computation.
Around ten years ago, researcher Dave Bacon, now at Google, showed that a time-travelling quantum computer could quickly solve a class of problems known as NP-complete, which mathematicians have lumped together as being hard.
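To get a feel for that hardness: a brute-force solver for subset sum, a textbook NP-complete problem used here purely as an illustration, may have to examine all 2ⁿ subsets, so every extra input element doubles the worst-case work.

```python
from itertools import combinations

def subset_sum(numbers, target):
    """Brute force over every subset: O(2^n) candidates in the worst case."""
    for r in range(len(numbers) + 1):
        for combo in combinations(numbers, r):
            if sum(combo) == target:
                return combo
    return None

# 20 numbers -> up to 2**20 (~a million) subsets; 60 numbers would
# already mean up to 2**60 (~10^18), far beyond any classical brute force.
print(subset_sum([3, 34, 4, 12, 5, 2], 9))   # -> (4, 5)
```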
The problem was, Bacon’s quantum computer was travelling around ‘closed timelike curves’. These are paths through the fabric of spacetime that loop back on themselves. General relativity allows such paths to exist through contortions in spacetime known as wormholes.
Physicists argue something must stop such opportunities arising because it would threaten ‘causality’ – in the classic example, someone could travel back in time and kill their grandfather, negating their own existence.
And it’s not only family ties that are threatened. Breaking the causal flow of time has consequences for quantum physics too. Over the past two decades, researchers have shown that foundational principles of quantum physics break in the presence of closed timelike curves: you can beat the uncertainty principle, an inherent fuzziness of quantum properties, and the no-cloning theorem, which says quantum states can’t be copied.
However, the new work shows that a quantum computer can solve otherwise insoluble problems even if it is travelling along “open timelike curves”, which don’t create causality problems. That’s because they don’t allow direct interaction with anything in the object’s own past: the time travelling particles (or the data they contain) never interact with themselves. Nevertheless, the strange quantum properties that permit “impossible” computations are left intact. “We avoid ‘classical’ paradoxes, like the grandfather paradox, but you still get all these weird results,” says Mile Gu, who led the work.
Gu is at the Centre for Quantum Technologies (CQT) at the National University of Singapore and Tsinghua University in Beijing. His eight other coauthors come from these institutions, the University of Oxford, UK, Australian National University in Canberra, the University of Queensland in St Lucia, Australia, and QKD Corp in Toronto, Canada.
“Whenever we present the idea, people say no way can this have an effect,” says Jayne Thompson, a co-author at CQT. But it does: quantum particles sent on a timeloop could gain super computational power, even though the particles never interact with anything in the past. “The reason there is an effect is because some information is stored in the entangling correlations: this is what we’re harnessing,” Thompson says.
There is a caveat – not all physicists think that these open timelike curves are any more likely to be realisable in the physical universe than the closed ones. One argument against closed timelike curves is that no-one from the future has ever visited us. That argument, at least, doesn’t apply to the open kind, because any messages from the future would be locked.

Provided by: National University of Singapore

A fundamental quantum physics problem has been proved unsolvable

For the first time a major physics problem has been proved unsolvable, meaning that no matter how accurately a material is mathematically described on a microscopic level, there will not be enough information to predict its macroscopic behaviour.
The research, by an international team of scientists from UCL, the Technical University of Munich and the Universidad Complutense de Madrid – ICMAT, concerns the spectral gap, a term for the energy required for an electron to transition from a low-energy state to an excited state.
Spectral gaps are a key property in semiconductors, among a multitude of other materials, in particular those with superconducting properties. It was thought possible to determine whether a material is superconducting by extrapolating from a sufficiently complete microscopic description of it; however, this study has shown that determining whether a material has a spectral gap is what is known as “an undecidable question”.
“Alan Turing is famous for his role in cracking the Enigma code, but amongst mathematicians and computer scientists, he is even more famous for proving that certain mathematical questions are ‘undecidable’ – they are neither true nor false, but are beyond the reach of mathematics,” said co-author Dr Toby Cubitt, from UCL Computer Science.
“What we’ve shown is that the spectral gap is one of these undecidable problems. This means a general method to determine whether matter described by quantum mechanics has a spectral gap, or not, cannot exist. Which limits the extent to which we can predict the behaviour of quantum materials, and potentially even fundamental particle physics.”

The research, which was published today in the journal Nature, used complex mathematics to determine the undecidable nature of the spectral gap, which they say they have demonstrated in two ways:
“The spectral gap problem is algorithmically undecidable: there cannot exist any algorithm which, given a description of the local interactions, determines whether the resulting model is gapped or gapless,” wrote the researchers in the journal paper.
“The spectral gap problem is axiomatically independent: given any consistent recursive axiomatisation of mathematics, there exist particular quantum many-body Hamiltonians for which the presence or absence of the spectral gap is not determined by the axioms of mathematics.”
In other words, no algorithm can decide the spectral gap question in general, and no matter how the mathematics is axiomatised, information about a system’s microscopic energies does not always settle whether a gap is present.
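“Undecidable” here is meant in exactly Turing’s sense. As a loose illustration (the classic diagonalisation argument, not the paper’s Hamiltonian construction), here is why no universal halting decider can exist:

```python
# Sketch of Turing's diagonalisation argument (illustration only; the Nature
# paper encodes this kind of undecidability into quantum many-body models).

def halts(program, arg):
    """Hypothetical universal decider: returns True iff program(arg) halts.
    No correct implementation can exist; 'contrarian' below defeats any candidate."""
    raise NotImplementedError

def contrarian(program):
    # Do the opposite of whatever the decider predicts about program(program).
    if halts(program, program):
        while True:      # predicted to halt -> loop forever
            pass
    return               # predicted to loop -> halt immediately

# contrarian(contrarian) halts if and only if halts() says it doesn't -- a
# contradiction, so no such algorithm exists. The spectral gap result shows
# that "gapped or gapless?" is a question of this same kind.
```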

The research has profound implications for the field, not least for the Clay Mathematics Institute’s famous $1m prize to prove whether the standard model of particle physics, which underpins the behaviour of the most basic particles of matter, has a spectral gap using the standard model equations.
“It’s possible for particular cases of a problem to be solvable even when the general problem is undecidable, so someone may yet win the coveted $1m prize. But our results do raise the prospect that some of these big open problems in theoretical physics could be provably unsolvable,” said Cubitt.
“We knew about the possibility of problems that are undecidable in principle since the works of Turing and Gödel in the 1930s,” agreed co-author Professor Michael Wolf, from the Technical University of Munich.
“So far, however, this only concerned the very abstract corners of theoretical computer science and mathematical logic. No one had seriously contemplated this as a possibility right in the heart of theoretical physics before. But our results change this picture. From a more philosophical perspective, they also challenge the reductionists’ point of view, as the insurmountable difficulty lies precisely in the derivation of macroscopic properties from a microscopic description.”
“It’s not all bad news, though,” added Professor David Pérez-García, from the Universidad Complutense de Madrid and ICMAT. “The reason this problem is impossible to solve in general is because models at this level exhibit extremely bizarre behaviour that essentially defeats any attempt to analyse them.
“But this bizarre behaviour also predicts some new and very weird physics that hasn’t been seen before. For example, our results show that adding even a single particle to a lump of matter, however large, could in principle dramatically change its properties. New physics like this is often later exploited in technology.”

Wednesday, October 22, 2014

Double stars

Double (or binary) stars are very common. A double star is a pair of stars held together by the force of gravity, orbiting around their common centre of mass.
Their orbital periods, which range from minutes for very close pairs to thousands of years for distant ones, depend on the separation between the stars and on their respective masses.
There are also multiple stars, systems in which three or four stars move along complex paths. Lyra looks like it holds a double star, but through a telescope each of the two components is seen to be a binary system in itself.
Observing the orbits of double stars is the only direct method astronomers have for weighing stars.
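The weighing works through Kepler’s third law: in solar units the combined mass of the pair follows directly from the orbital period and separation, M₁ + M₂ = a³/P², with a in astronomical units and P in years. A minimal sketch with illustrative numbers:

```python
def combined_mass_solar(a_au, period_years):
    """Kepler's third law in solar units: M1 + M2 = a^3 / P^2
    (a = semi-major axis in AU, P = orbital period in years,
    result in solar masses)."""
    return a_au**3 / period_years**2

# Illustrative values, roughly like the bright binary Sirius A/B:
# a ~ 20 AU and P ~ 50 years give about 3.2 solar masses in total.
print(combined_mass_solar(20, 50))
```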
For very close pairs, their gravitational attraction can distort the shape of the stars, and gas may flow from one star to the other in a process called “mass transfer”.
Through the telescope, many stars that appeared single turn out to be double. When the two stars are very close together, however, they can only be detected by studying their light with spectroscopy: the spectra of two stars then appear, and their motion can be deduced from the Doppler effect in both spectra. Such pairs are called spectroscopic binaries.
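The Doppler deduction rests on the non-relativistic relation Δλ/λ = v/c: the measured shift of a known spectral line gives each star’s line-of-sight velocity, and tracking that shift over time traces out the orbit. A small sketch with an illustrative shift:

```python
C = 299_792.458  # speed of light in km/s

def radial_velocity_kms(lambda_observed_nm, lambda_rest_nm):
    """Non-relativistic Doppler: v = c * (observed - rest) / rest.
    Positive means the star is receding, negative that it is approaching."""
    return C * (lambda_observed_nm - lambda_rest_nm) / lambda_rest_nm

# Illustrative: the hydrogen-alpha line (rest 656.28 nm) observed at
# 656.35 nm implies the star is receding at about 32 km/s.
print(radial_velocity_kms(656.35, 656.28))
```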
Most of the stars we see in the sky are double or even multiple. Occasionally, one of the stars in a double system may hide the other when observed from Earth, giving rise to an eclipsing binary.
In most cases, the components of a double system are thought to have formed at the same time, although at other times one star may be captured by the gravitational field of another in regions of high stellar density, such as star clusters, giving rise to a double system.

The James Webb telescope passes extreme temperature testing

James Webb Telescope
Photo: NASA
MADRID, Oct. 22 (@CIENCIAPLUS) -
   The 'heart' of the largest space telescope ever to exist, the James Webb, has survived a new phase of testing ahead of its 2018 launch. According to NASA, the apparatus was subjected to 116 days of extremely cold temperatures, like those of space.
   Specifically, the most sensitive instruments, installed in the Integrated Science Instrument Module (ISIM), were put to the test, and all of them emerged unscathed from the thermal vacuum chamber at the US space agency's Goddard Space Flight Center.
   Mike Drury, one of those responsible for ISIM, said that this module is "a very important part of the observatory, since it will provide all the images that Webb delivers during its mission". These photographs will reveal the first galaxies, which formed 13.5 billion years ago. The telescope will also pierce through clouds of interstellar dust to capture the stars and planets forming in the Milky Way.
   In this regard, the expert noted that the operations James Webb must carry out have to be compatible with exposure to very cold temperatures, of about -233ºC. Since no place on Earth can have this temperature, the scientists used a thermal vacuum chamber to recreate this environment.
   It is a cylindrical space, about 27 metres in diameter and 40 metres tall, that uses liquid nitrogen and helium to lower the temperature and simulate the space environment. "We completed these tests to make sure that when this telescope cools down, the four parts inside it are properly positioned," the scientist explained.
   Paul Geithner, another participant in these tests, said that "the greatest stress for this telescope will come when it cools down". "As the telescope's structure goes from room temperature to its very cold operating temperature, it will be under more strain," he added.
   The James Webb telescope is a NASA project in which the European Space Agency (ESA) and the Canadian Space Agency also collaborate.

Human travel to Mars is complicated by the increase in cosmic rays

CRaTER telescope
Photo: CHRIS MEANEY/NASA
MADRID, Oct. 22 (@CIENCIAPLUS) -
   A team of scientists at the University of New Hampshire has studied the Sun's recent "strange behaviour", which has caused an increase in radiation in deep space. For the experts, if this was already a problem to overcome for future crewed missions, those projects could now be even further away.
   In their paper, lead author Nathan Schwadron notes that there is a very abnormal and prolonged lack of solar activity. As a consequence, the solar wind is exhibiting extremely low densities and magnetic field strengths, which lets dangerous levels of radiation permeate the space environment.
   "The Sun's behaviour has changed recently, and it is now in a state that had not been observed for almost 100 years," Schwadron said. He explained that for most of the space age solar activity has followed an 11-year cycle: of those years, between six and eight are a lull in activity, known as solar minimum, while the following two to three years are more active.
   "However, starting around 2006, we observed a longer solar minimum and weaker activity, probably the lowest ever detected in the space age," the scientist noted.
   These conditions bring higher intensities of galactic cosmic rays and worsen the radiation risks that potentially threaten future astronaut missions into deep space. "While these conditions are not necessarily dangerous for missions to the Moon or an asteroid, they are indeed a significant factor with regard to mission duration," Schwadron acknowledged.
   In addition, the data provide critical information about the radiation hazards that astronauts would face on long-duration missions into deep space, such as a journey to Mars.

A MAJOR CHALLENGE

   Ionizing radiation from galactic cosmic rays and solar energetic particles remains "a major challenge" for long-duration crewed missions into deep space, the experts said.
   Humans face a range of consequences, from acute effects (radiation sickness) to long-term effects, including the induction of cancer and damage to organs, including the heart and the brain.
   The high radiation levels seen during the last solar minimum limit the number of days allowed for astronauts, even with the shielding a spacecraft would provide. Given the declining trend in solar activity, the number of days astronauts are allowed in space is shrinking, and is estimated to be 20 percent lower in the next solar cycle.