Monday 4 January 2016

The main problems of canonical quantum cosmology are the following:
The singularity. The Wheeler-DeWitt equation does not remove the singularity, although in some cases the solutions avoid it because of the effective potential, as we have seen. In short, canonical quantum gravity is not well defined: the models of canonical quantum cosmology apply only in a semiclassical approximation, and there is no way to make predictions about the dynamics at the singularity.
Decoherence. If the wavefunction of the universe is a linear combination of basis states, then these can interfere with each other. However, we observe a classical universe, without interference in the macroscopic world. This requires a mechanism that allows the wavefunction of the universe to decohere.
The problem of time. This is a general problem of the canonical approach to quantum gravity, but its implications are important in cosmology. The Hamiltonian constraint indicates that the theory has no predefined notion of time. Reparametrizations of the time evolution are gauge transformations without physical content. To single out the real time evolution of the universe one has to fix a specific gauge. This breaks the original symmetry of the classical equations under diffeomorphisms and, moreover, different choices of time variable lead to different quantum theories. The question that arises is related to the next issue: are we allowed to select the scale factor as the parameter that tracks time evolution already in the quantum regime?
The minisuperspace approximation. The minisuperspace approximation simultaneously fixes canonically conjugate variables, violating the uncertainty principle (by symmetry, the dynamical variables and their conjugate momenta are both set to zero). It leaves the scale factor as the single degree of freedom of the models. Although our universe is currently homogeneous and isotropic, this assumption need not have been valid at the beginning. Note that the inflationary phase, already in the classical regime, might have created homogeneity out of an inhomogeneous and random initial state.
Inhomogeneities. There exist models that describe inhomogeneities, but they are poorly understood.
Initial conditions. Initial conditions imposed on the universe cannot be derived from any principle and thus acquire the same status as fundamental laws. Bryce DeWitt envisioned a theory in which the requirement of mathematical consistency alone would suffice to guarantee a unique solution of the Wheeler-DeWitt equation. However, this cannot be realized in canonical quantum cosmology.
Formalism. The theory has diverse mathematical and consistency problems, such as factor ordering and other ambiguities, and the definition of a proper notion of probabilities in a single universe.
Nevertheless, one would expect some aspects of these canonical quantum cosmology models to survive in a complete and consistent theory of quantum cosmology. There is no clear reason to expect their description of the origin of the universe beyond the singularity to be correct, but the calculations of transition probabilities and of the onset of inflation near the classical regime may be a good approximation to reality.
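The semiclassical picture above can be illustrated with a toy numerical sketch. The potential, the boundary condition, and the value of Lambda below are all illustrative assumptions (a closed FRW minisuperspace model with a cosmological constant, with units and operator-ordering prefactors dropped), not a model taken from the text:

```python
import numpy as np

# Toy minisuperspace Wheeler-DeWitt equation for the scale factor a:
#     psi''(a) = U(a) * psi(a),   U(a) = a**2 * (1 - Lambda * a**2)
# Under the barrier (small a) the solution is exponential; beyond the
# classical turning point it oscillates, which is the semiclassical regime.

Lambda = 0.1                        # assumed toy value
a_turn = 1.0 / np.sqrt(Lambda)      # classical turning point, where U changes sign

def U(a):
    return a**2 * (1.0 - Lambda * a**2)

# March psi outward from a ~ 0 with a simple second-order finite difference.
da = 1e-3
a_grid = np.arange(da, 6.0, da)
psi = np.zeros_like(a_grid)
psi[0], psi[1] = 0.0, da            # psi(0) = 0 boundary choice (an assumption)
for i in range(1, len(a_grid) - 1):
    psi[i + 1] = 2 * psi[i] - psi[i - 1] + da**2 * U(a_grid[i]) * psi[i]

print(f"classical turning point: a = {a_turn:.3f}")
```

Past the turning point the numerically integrated wavefunction oscillates, which is exactly the regime where the semiclassical approximation mentioned above is expected to hold.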
(Sharing Credits with Herr von Bradford)

Cosmic Seeding

Most of the universe's heavy elements, including the iron central to life itself, formed early in cosmic history and spread throughout the universe, according to a new study of the Perseus Galaxy Cluster using Japan's Suzaku satellite.
Suzaku study points to early cosmic 'seeding': http://phy.so/302434001



Graviton

I have been refreshing myself on quantum gravity, renormalization, and the idea of experimentally confirming gravitons all morning. This is an interesting article I found on how physicists could find the highly theoretical particle called the graviton: they could use a particle accelerator to create proton-antiproton collisions, which would produce gravitons decaying into Standard Model particles. http://www-d0.fnal.gov/Run2Physics/WWW/results/final/NP/N05C/N05C.html

Quantum Tunneling

Quantum Tunneling: A World Stranger Than Science Fiction. Think for a moment about a ball rolling up a hill. If you don't push it hard enough, it will not roll over the hill. This makes sense classically. However, in quantum mechanics, an object does not behave like a classical object (such as a ball, or you, or I, or any other matter in the universe). Learn about the strange world we live in at:
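The ball-and-hill analogy can be made quantitative with a standard WKB-style estimate of the transmission probability through a rectangular barrier, T ~ exp(-2κL) with κ = sqrt(2m(V−E))/ħ. The particle (an electron) and the barrier numbers below are illustrative assumptions:

```python
import math

hbar = 1.054571817e-34   # reduced Planck constant, J*s
m_e  = 9.1093837015e-31  # electron mass, kg
eV   = 1.602176634e-19   # J per eV

def tunneling_probability(barrier_height_eV, energy_eV, width_m, mass=m_e):
    """WKB-style transmission estimate for E < V; returns 1.0 if E >= V."""
    dE = (barrier_height_eV - energy_eV) * eV
    if dE <= 0:
        return 1.0                      # no barrier to tunnel through
    kappa = math.sqrt(2 * mass * dE) / hbar
    return math.exp(-2 * kappa * width_m)

# Classically T = 0 whenever E < V; quantum mechanically a small but
# nonzero transmission survives, e.g. an electron 1 eV short of a
# 1-nm-wide barrier:
print(tunneling_probability(5.0, 4.0, 1e-9))
```

Doubling the barrier width roughly squares the (already tiny) probability's suppression, which is why tunneling matters only at nanometre scales.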

Sunday 3 January 2016

Scientists reveal cosmic roadmap to galactic magnetic field

Scientists reveal cosmic roadmap to galactic magnetic field: http://phy.so/311514082
Scientists on NASA's Interstellar Boundary Explorer (IBEX) mission, including a team leader from the University of New Hampshire, report that recent, independent measurements have validated one of the mission's signature findings—a mysterious "ribbon" of energy and particles at the edge of our solar system that appears to be a directional "roadmap in the sky" of the local interstellar magnetic field.

Nuclear-atomic overlap for the isotope thorium-229

Nuclear-atomic overlap for the isotope thorium-229: http://phy.so/311851967
More than 99.9% of the mass of any atom is concentrated into a quadrillionth of its volume, the part occupied by the nucleus. Unimaginably small, dense and energetic, atomic nuclei are governed by laws quite distinct from those that regulate atomic electrons, which constitute the outer part of atoms and which are immediately responsible for light, chemistry and thus life. Yet there are sporadic regions of contact between these disparate realms. JQI Adjunct Fellow Marianna Safronova and her collaborators (1) have been exploring one area of nuclear-atomic overlap for the isotope thorium-229. This isotope is a candidate for a new type of atomic clock and quantum information processor.

"The Basic Elements of String Theory"


Five key ideas are at the heart of string theory. Become familiar with these key elements of string theory right off the bat. Read on for the very basics of these five ideas of string theory in the sections below.

1. Strings and membranes

When the theory was originally developed in the 1970s, the filaments of energy in string theory were considered to be 1-dimensional objects: strings. (One-dimensional indicates that a string has only one dimension, length, as opposed to say a square, which has both length and height dimensions.)

These strings came in two forms — closed strings and open strings. An open string has ends that don’t touch each other, while a closed string is a loop with no open end. It was eventually found that these early strings, called Type I strings, could go through five basic types of interactions, as shown in the figure.

Type I strings can go through five fundamental interactions, based on different ways of joining and splitting.
The interactions are based on a string’s ability to have ends join and split apart. Because the ends of open strings can join together to form closed strings, you can’t construct a string theory without closed strings.

This proved to be important, because the closed strings have properties that make physicists believe they might describe gravity. Instead of just being a theory of matter particles, physicists began to realize that string theory may just be able to explain gravity and the behavior of particles.

Over the years, it was discovered that the theory required objects other than just strings. These objects can be seen as sheets, or branes. Strings can attach at one or both ends to these branes. A 2-dimensional brane (called a 2-brane) is shown in this figure.
In string theory, strings attach themselves to branes.

2. Quantum gravity
Modern physics has two basic scientific laws: quantum physics and general relativity. These two scientific laws represent radically different fields of study. Quantum physics studies the very smallest objects in nature, while relativity tends to study nature on the scale of planets, galaxies, and the universe as a whole. (Obviously, gravity affects small particles too, and relativity accounts for this as well.) Theories that attempt to unify the two theories are theories of quantum gravity, and the most promising of all such theories today is string theory.

3. Unification of forces
Hand-in-hand with the question of quantum gravity, string theory attempts to unify the four forces in the universe — electromagnetic force, the strong nuclear force, the weak nuclear force, and gravity — together into one unified theory. In our universe, these fundamental forces appear as four different phenomena, but string theorists believe that in the early universe (when there were incredibly high energy levels) these forces are all described by strings interacting with each other.

4. Supersymmetry
All particles in the universe can be divided into two types: bosons and fermions. String theory predicts that a type of connection, called supersymmetry, exists between these two particle types. Under supersymmetry, a fermion must exist for every boson and vice versa. Unfortunately, experiments have not yet detected these extra particles.

Supersymmetry is a specific mathematical relationship between certain elements of physics equations. It was discovered outside of string theory, although its incorporation into string theory transformed the theory into supersymmetric string theory (or superstring theory) in the mid-1970s.

Supersymmetry vastly simplifies string theory’s equations by allowing certain terms to cancel out. Without supersymmetry, the equations result in physical inconsistencies, such as infinite values and imaginary energy levels.

Because scientists haven’t observed the particles predicted by supersymmetry, this is still a theoretical assumption. Many physicists believe that the reason no one has observed the particles is because it takes a lot of energy to generate them. (Energy is related to mass by Einstein’s famous E = mc2 equation, so it takes energy to create a particle.) They may have existed in the early universe, but as the universe cooled off and energy spread out after the big bang, these particles would have collapsed into the lower-energy states that we observe today. (We may not think of our current universe as particularly low energy, but compared to the intense heat of the first few moments after the big bang, it certainly is.)
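As a concrete instance of the E = mc² point above, here is a short sketch of the minimum energy needed to create an electron together with its antiparticle (the lightest matter-antimatter pair):

```python
# Rest energy from E = m*c^2: the threshold for creating an
# electron-positron pair out of pure energy.

c   = 2.99792458e8       # speed of light, m/s
m_e = 9.1093837015e-31   # electron mass, kg
J_per_MeV = 1.602176634e-13

rest_energy_MeV = m_e * c**2 / J_per_MeV    # one electron: ~0.511 MeV
pair_threshold_MeV = 2 * rest_energy_MeV    # particle + antiparticle

print(f"electron rest energy: {rest_energy_MeV:.3f} MeV")
print(f"pair-creation threshold: {pair_threshold_MeV:.3f} MeV")
```

Supersymmetric partners, if they exist, would be far heavier than the electron, so the same relation implies correspondingly enormous creation energies.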

Scientists hope that astronomical observations or experiments with particle accelerators will uncover some of these higher-energy supersymmetric particles, providing support for this prediction of string theory.

5. Extra dimensions
Another mathematical result of string theory is that the theory only makes sense in a world with more than three space dimensions! (Our universe has three dimensions of space — left/right, up/down, and front/back.) Two possible explanations currently exist for the location of the extra dimensions:

The extra space dimensions (generally six of them) are curled up (compactified, in string theory terminology) to incredibly small sizes, so we never perceive them.

We are stuck on a 3-dimensional brane, and the extra dimensions extend off of it and are inaccessible to us.
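The compactification picture has a well-known quantitative consequence that can be sketched: momentum around a circular extra dimension of radius R is quantized, producing a "Kaluza-Klein tower" of effective energy scales, roughly E_n ~ n·ħc/R. The radius used below is an arbitrary assumption, chosen only to show how a tiny R pushes the tower far beyond the reach of current accelerators:

```python
hbar = 1.054571817e-34   # reduced Planck constant, J*s
c    = 2.99792458e8      # speed of light, m/s
GeV  = 1.602176634e-10   # J per GeV

R = 1e-19                # assumed compactification radius, metres

# First three Kaluza-Klein modes: E_n = n * hbar * c / R
modes = [n * hbar * c / R for n in range(1, 4)]
for n, E_n in zip(range(1, 4), modes):
    print(f"n={n}: ~{E_n / GeV:.0f} GeV")
```

Even at this (already tiny) radius the lightest mode sits around the TeV scale, which is why larger-than-expected extra dimensions would be the easiest ones to detect.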

A major area of research among string theorists is mathematical models of how these extra dimensions could be related to our own. Some recent results predict that scientists may soon be able to detect these extra dimensions (if they exist) in upcoming experiments, because they may be larger than previously expected.

From: String Theory For Dummies
-Dr. B


theoretical gravitation particle

I will start this post off by saying that this is in the realm of theoretical physics. There's another theoretical gravitation particle, and it is quite intriguing. The graviphoton is a particle that would be created when the gravitational field is excited in a fifth dimension. It comes from Kaluza-Klein theory, which proposes that electromagnetism and gravitation can be unified into a single force, provided that spacetime has more than four dimensions. A graviphoton would have the characteristics of a graviton, but it would also carry the properties of a photon and create what physicists call a "fifth force" (there are currently four fundamental forces). Other theories state that a graviphoton would be a superpartner (a sparticle) of the graviton, but that it would attract and repel at the same time; in that way, graviphotons could theoretically create anti-gravity. And that's only in five dimensions: the theory of supergravity also posits the existence of graviphotons, but allows for eleven dimensions.

The Successes of String Theory


String theory has gone through many transformations since its origins in 1968 when it was hoped to be a model of certain types of particle collisions. It initially failed at that goal, but in the 40 years since, string theory has developed into the primary candidate for a theory of quantum gravity. It has driven major developments in mathematics, and theorists have used insights from string theory to tackle other, unexpected problems in physics. In fact, the very presence of gravity within string theory is an unexpected outcome!

Predicting gravity out of strings
The first and foremost success of string theory is the unexpected discovery of objects within the theory that match the properties of the graviton: a specific type of closed string that behaves as a massless particle with spin 2, exactly like the graviton. To put it another way, under string theory the graviton, a massless spin-2 particle, arises as a certain type of vibrating closed string. String theory wasn't created to contain gravitons; they are a natural and required consequence of the theory.

One of the greatest problems in modern theoretical physics is that gravity seems to be disconnected from all the other forces of physics that are explained by the Standard Model of particle physics. String theory solves this problem because it not only includes gravity, but it makes gravity a necessary byproduct of the theory.

Explaining what happens to a black hole (sort of)
A major motivating factor for the search for a theory of quantum gravity is to explain the behavior of black holes, and string theory appears to be one of the best methods of achieving that goal. String theorists have created mathematical models of black holes that appear similar to predictions made by Stephen Hawking more than 30 years ago and may be at the heart of resolving a long-standing puzzle within theoretical physics: What happens to matter that falls into a black hole?

Scientists’ understanding of black holes has always run into problems, because to study the quantum behavior of a black hole you need to somehow describe all the quantum states (possible configurations, as defined by quantum physics) of the black hole. Unfortunately, black holes are objects in general relativity, so it’s not clear how to define these quantum states.

String theorists have created models that appear to be identical to black holes in certain simplified conditions, and they use that information to calculate the quantum states of the black holes. Their results have been shown to match Hawking’s predictions, which he made without any precise way to count the quantum states of the black hole.

This is the closest that string theory has come to an experimental prediction. Unfortunately, there’s nothing experimental about it because scientists can’t directly observe black holes (yet). It’s a theoretical prediction that unexpectedly matches another (well-accepted) theoretical prediction about black holes. And, beyond that, the prediction only holds for certain types of black holes and has not yet been successfully extended to all black holes.

Explaining quantum field theory using string theory
One of the major successes of string theory is something called the Maldacena conjecture, or the AdS/CFT correspondence. Developed in 1997 and soon expanded on, this correspondence appears to give insights into gauge theories, such as those at the heart of quantum field theory.

The original AdS/CFT correspondence, written by Juan Maldacena, proposes that a certain 3-dimensional (three space dimensions, like our universe) gauge theory, with the most supersymmetry allowed, describes the same physics as a string theory in a 4-dimensional (four space dimensions) world. This means that questions about string theory can be asked in the language of gauge theory, which is a quantum theory that physicists know how to work with!

Like John Travolta, string theory keeps making a comeback
String theory has suffered more setbacks than probably any other scientific theory in the history of the world, but those hiccups don’t seem to last that long. Every time it seems that some flaw comes along in the theory, the mathematical resiliency of string theory seems to not only save it, but to bring it back stronger than ever.

When extra dimensions came into the theory in the 1970s, the theory was abandoned by many, but it had a comeback in the first superstring revolution. It then turned out there were five distinct versions of string theory, but a second superstring revolution was sparked by unifying them. When string theorists realized a vast number of solutions of string theories (each solution to string theory is called a vacuum, while many solutions are called vacua) were possible, they turned this into a virtue instead of a drawback. Unfortunately, even today, some scientists believe that string theory is failing at its goals.


About the nature of time and time travel


Think first about the nature of time!
Time is related to motion and duration. There is no time as such; time, in the sense in which we are used to using it, does not exist: clocks do not measure time per se.
Time is the highest form of energy, represented by spin, charge and oscillation.

There is NO preserved state of the past, and the future is a probability distribution over a sample space, not something defined. Given some ceteris paribus assumptions, and setting aside specific relativistic effects: there is no past at all; the covariance of time with respect to motion is related to the photon as a state of space at c.
Wormholes, in case they exist, cannot be configured to a specific polarization and state of the photon for a given locality; but wormholes also refer to a state of space with respect to c or FTL, by the same principle. You cannot measure the "past" and attribute to it a preserved past state, yet for a time-travel process you would need to do exactly that.

There is no time travel at all, short of being able to change the state of space itself with all its properties!

String Theory.................Particle Physics

(Phys.org) —Scientists at Towson University in Towson, Maryland, have identified a practical, yet overlooked, test of string theory based on the motions of planets, moons and asteroids, reminiscent of Galileo's famed test of gravity by dropping balls from the Tower of Pisa.
String theory is infamous as an eloquent theoretical framework to understand all forces in the universe — a so-called "theory of everything" — that can't be tested with current instrumentation, because the energy level and size scale needed to see the effects of string theory are too extreme.
Yet inspired by Galileo Galilei and Isaac Newton, Towson University scientists say that precise measurements of the positions of solar-system bodies could reveal very slight discrepancies in what is predicted by the theory of general relativity and the equivalence principle, or establish new upper limits for measuring the effects of string theory.
The Towson-based team presents its finding today, January 6, 2014, between 10 a.m. and 11:30 a.m., at the 223rd meeting of the American Astronomical Society, in Washington, D.C. The work also appears in the journal Classical and Quantum Gravity.
String theory hopes to provide a bridge between two well-tested yet incompatible theories that describe all known physics: Einstein's general relativity, our reigning theory of gravity; and the standard model of particle physics, or quantum field theory, which explains all the forces other than gravity.
String theory posits that all matter and energy in the universe is composed of one-dimensional strings. These strings are thought to be a quintillion times smaller than the already infinitesimal hydrogen atom and thus too minute to detect directly. Similarly, finding signs of strings in a particle accelerator would require millions of times more energy than what was needed to identify the famous Higgs boson.
"Scientists have joked about how string theory is promising...and always will be promising, for the lack of being able to test it," said Dr. James Overduin of the Department of Physics, Astronomy and Geosciences at Towson University, first author on the paper. "What we have identified is a straightforward method to detect cracks in general relativity that could be explained by string theory, with almost no strings attached."


New model shows light can be captured in a Bose-Einstein condensate state

New model shows light can be captured in a Bose-Einstein condensate state.
This research was published in Physical Review A (http://journals.aps.org/pra/abstract/10.1103/PhysRevA.89.033862).
If anybody wants to read the paper then here's the link: http://arxiv.org/abs/1401.0520

New State Of Light Offered Up By Physicist Overseas

A theoretical physicist has explained a way to capture particles of light called photons, even at room temperature, a feat thought only possible at bone-chillingly cold temperatures. Alex Kruchkov, a doctoral student at the Swiss Federal Institute...

visualizing complex electronic state

Team visualizes complex electronic state
Multidisciplinary group solves mystery of how a potential battery electrode material behaves.

The new findings are reported this week in the journal Nature Materials, in a paper by MIT postdoc Xin Li, professors Young Lee and Gerbrand Ceder, also of MIT, and 12 others.


Observation of geometric scaling of Efimov states in a Fermi-Bose Li-Cs mixture


Shih-Kuang Tung, Karina Jimenez-Garcia, Jacob Johansen, Colin V. Parker, Cheng Chin
(Submitted on 24 Feb 2014)
The emergence of scaling symmetry in physical phenomena suggests a universal description that is insensitive to microscopic details. A well-known example is critical phenomena, which are invariant under continuous scale transformations and classify second-order phase transitions into distinct universality classes. Equally intriguing are systems with discrete scaling symmetry, which are invariant under scaling transformation with a specific scaling constant, resulting in log-periodic behavior in the observables. A classic example is the self-similar growth of crystals, as in snowflakes. In few-body physics, Vitaly Efimov predicted in 1970 the existence of an infinite series of three-body bound states that obey universal discrete scaling symmetry when pair-wise interactions are resonantly enhanced. Despite abundant reports of Efimov states in cold atom experiments, direct observation of discrete scaling symmetry remains an elusive goal. Here we report the observation of three Efimov resonances in a heteronuclear Li-Cs mixture near a broad interspecies Feshbach resonance. The positions of the resonances provide a model-independent confirmation of discrete scaling symmetry with an averaged scaling constant λexp=4.85(44), in good agreement with the predicted value of 4.88. Our findings also provide new insight into the possible discrete scaling symmetry in Bose gases and mixtures with resonant interactions.
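The discrete scaling symmetry described in the abstract can be illustrated in a few lines: successive Efimov resonance positions form a geometric series with ratio λ ≈ 4.88 for this system. The starting position below is an arbitrary illustrative value, not a number from the paper:

```python
# Discrete scaling symmetry of Efimov states: successive three-body
# resonance positions satisfy a_(n+1) = lambda * a_n, a geometric series.

lam_theory = 4.88     # predicted scaling constant for the Li-Cs mass ratio
a0 = -300.0           # assumed position of the first resonance (arbitrary units)

positions = [a0 * lam_theory**n for n in range(3)]          # three resonances
ratios = [positions[i + 1] / positions[i] for i in range(2)]

# The experiment extracts the scaling constant from ratios of measured
# resonance positions; here the ratios recover lambda by construction.
print(positions, ratios)
```

The measured λexp = 4.85(44) agrees with 4.88 well within its quoted uncertainty, which is what "model-independent confirmation" refers to.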


Question


Gravity is the force that is constantly working to centralize quantum harmonic oscillators into the unified field. Black holes disappear once oscillators are centralized into the unified field by its gravity. What do you say?

Cosmic Inflation Is ‘Fantasy’

Sir Roger Penrose calls string theory a "fashion," quantum mechanics "faith," and cosmic inflation a "fantasy."

Coming from an armchair theorist, these declarations might be dismissed. But Penrose is a well-respected physicist who co-authored a seminal paper on black holes with Stephen Hawking. What's wrong with modern physics—and could alternative theories explain our observations of the universe?

Sir Roger Penrose: Cosmic Inflation Is ‘Fantasy’


Neutron Decay

If pairs of neutrons in a nucleus were observed to decay via the emission of two beta particles alone (as opposed to two beta particles and two antineutrinos), it would constitute evidence of Majorana neutrinos - that is, neutrinos that are their own antiparticles. The Enriched Xenon Observatory-200 has not detected such decays, but it has placed a more stringent lower limit on their half-life (equivalently, an upper limit on their rate).

EXO-200 narrows its search for Majorana neutrinos - physicsworld.com

Most stringent limit on rate of neutrinoless double β decay set
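The relation between a half-life limit and a rate limit is simple enough to sketch: the expected decay rate in a sample is N·ln2/T½, so a longer half-life lower limit means fewer allowed decays. The half-life value below is a representative round number, not the exact EXO-200 result:

```python
import math

# Convert a half-life lower limit into an upper limit on decays per kg
# per year for Xe-136 (the isotope used by EXO-200).

N_A = 6.02214076e23             # Avogadro's number
molar_mass_xe136 = 0.136        # kg/mol
T_half_limit_yr = 1e25          # assumed lower limit on the half-life, years

atoms_per_kg = N_A / molar_mass_xe136
max_decays_per_kg_yr = atoms_per_kg * math.log(2) / T_half_limit_yr
print(f"< {max_decays_per_kg_yr:.2f} decays per kg per year")
```

Even with about 4×10^24 atoms per kilogram, such a half-life permits well under one decay per kilogram per year, which is why these searches need large, ultra-quiet detectors.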

Alan Guth's Course

MIT's Alan Guth is one of the originators of the inflation theory of the early universe. He also teaches an introductory course on cosmology, which you can take for free, thanks to MIT's Open Courseware project. Click on the link to find lecture videos, slides, assignments and other online course materials.

via- Physics Today


The Early Universe

The Early Universe provides an introduction to modern cosmology. The first part of the course deals with classical cosmology, and the later part with modern particle physics and its recent impact on cosmology.

Quantum phase transitions

Quantum phase transitions take place at 0 K - that is, without the help of thermal fluctuations. A theory that describes how such transitions behave at higher temperatures has just been vindicated by Anne Kinross of McMaster University and her collaborators. Martin Klanjšek of the Jožef Stefan Institute in Ljubljana explains the experiment's significance.

A Critical Test of Quantum Criticality

Theoretically predicted quantum critical behavior in a model magnetic material has been experimentally confirmed at a quantitative level.

EPR paradox

The EPR paradox (or Einstein–Podolsky–Rosen paradox) is a topic in quantum physics and the philosophy of science regarding measurements of microscopic systems (such as individual photons, electrons or atoms) and the description of those systems by the methods of quantum physics. It refers to a dichotomy, where either the measurement of a physical quantity in one system affects the measurement of a physical quantity in another, spatially separated system or the description of reality given by a wave function is not complete.

This challenge to the Copenhagen interpretation originated from the consequences of a thought experiment introduced in 1935 by Einstein, Podolsky, and Rosen and resulted in what seemed to be a contradiction in the interpretation. The thought experiment involves two systems that interact with each other and are then separated so that they presumably interact no longer. Then, the position or momentum of one of the systems is measured, and due to the known relationship between the measured value of the first particle and the value of the second particle, the observer is aware of the value in the second particle. A measurement of the second value is made on the second particle, and again, due to the relationship between the two particles, this value can then be known in the first particle. This outcome seems to violate the uncertainty principle, since both the position and momentum of a single particle would be known with certainty.

Einstein never accepted quantum mechanics as a "real" and complete theory, because he could not believe that a measurement on the second particle could invalidate a measurement on the first, as this requires "information" to travel faster than light between the two (although it is not actually possible to transfer information in this manner, since the results are uncontrollable). Einstein struggled to the end of his life for a theory that could better comply with causality, protesting against the view that there exists no objective physical reality other than that which is revealed through measurement and interpreted in terms of the quantum mechanical formalism.
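The EPR-type correlations discussed above were later sharpened into a quantitative test, the Bell/CHSH inequality. A minimal sketch of the quantum prediction for a spin singlet, for which the correlation between measurements along directions a and b is E(a, b) = −cos(a − b), while any local hidden-variable account is bounded by |S| ≤ 2:

```python
import math

# CHSH combination S for a spin singlet at the standard optimal angles.
# Local hidden variables require |S| <= 2; quantum mechanics reaches 2*sqrt(2).

def E(a, b):
    """Quantum correlation for a singlet, analyzers at angles a and b."""
    return -math.cos(a - b)

a, ap = 0.0, math.pi / 2          # Alice's two settings
b, bp = math.pi / 4, 3 * math.pi / 4   # Bob's two settings

S = E(a, b) - E(a, bp) + E(ap, b) + E(ap, bp)
print(abs(S), 2 * math.sqrt(2))   # quantum value vs. its theoretical maximum
```

At these angles |S| = 2√2 ≈ 2.83 > 2, which is the quantitative sense in which experiments have come down on the side of quantum mechanics in the EPR debate.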

Here's a simple video on the EPR paradox brought to you by DrPhysicsA on YouTube: https://www.youtube.com/watch?v=0x9AgZASQ4k


Scientists find a practical test for string theory

Scientists find a practical test for string theory: http://phy.so/308225369


(Phys.org) —Scientists at Towson University in Towson, Maryland, have identified a practical, yet overlooked, test of string theory based on the motions of planets, moons and asteroids, reminiscent of Galileo's famed test of gravity by dropping balls from the Tower of Pisa.

Could Particle 'Spooky Action' Define The Nature Of Gravity?

Could Particle 'Spooky Action' Define The Nature Of Gravity?

Quantum physics is a fascinating yet complicated subject to understand, and one of the things that freaks out physics students is the concept of entanglement. That occurs when physicists measure the state of a particle and the measurement instantly affects the state of another particle. (In reality, the particles are in multiple states — spinning in multiple directions, for example — and can only be said to be in one state or another when they are measured.)


How to Observe Schrodinger's Cat Without Killing it

How to Observe Schrodinger's Cat Without Killing it:
Schrödinger’s cat is by far the best known example of demonstrating quantum superpositions. Now, scientists might be able to observe the cat without risking killing it.
First, let's quickly recap Schrödinger's cat. To quote from the NewScientist picture, "Erwin Schrödinger proposed that a cat in a closed, booby-trapped box with a random quantum trigger could be simultaneously dead and alive." From here, the only way to know whether the cat is dead or alive is to open the box and look. By opening the box, you risk killing the cat. So, back to the headline: scientists have proposed that we can 'look inside the box' without the risk of killing our feline companion.

MEASUREMENT OF THE GRAVITATIONAL MASS OF ANTIHYDROGEN

To oversimplify it, antimatter is the opposite of matter. The two are mirror images with opposite electric charges. But, do the two respond in opposite ways to gravity? Instead of falling down, would antimatter “fall up”? If antimatter does go up instead of down, it will cause scientists to reevaluate how the universe works.
These questions have existed for over 50 years, but now ALPHA collaborators at CERN have been able to approach the problem; an incredible feat in itself. There have been no definitive conclusions yet, but when the experiment resumes in 2014 with a new antimatter trap, researchers hope to achieve more significant results.
More info: http://bbc.in/18dBHmj

Paper in Nature: http://bit.ly/ZWKOV1

leviton

Researchers produce the first experimental pulse-generation of a single electron—a leviton
A team of researchers in France has produced the first experimental pulse-generation of a single electron—they've named it a leviton, in honor of physicist Leonid Levitov and its resemblance to a soliton. In their paper published in the journal Nature, the team describes how they caused the leviton to come about and how it might be used in future applications.
Seventeen years ago, Levitov and colleagues suggested that if a voltage was applied to a nanocircuit and varied over time according to the mathematical expression of a Lorentzian distribution, it should be possible to excite a single electron peak in a sea of electrons. In this new effort, the researchers in France have proved the theory to be true and in the process have opened the door to a whole new subfield of physics involving the use of quantum excitations.
The researchers achieved their goal by building a nanocircuit, essentially a nanoscale electrode, to create what is known as a Fermi sea, where electrons are held in a tiny device. They then cooled the circuit to near absolute zero and applied a voltage that varied in time to stir up the electrons, creating chaotic peaks and valleys. Adjusting the time variations to fit the Lorentzian distribution allowed for creating just one single peak: the long-sought leviton. The researchers compare their experiment to a tub of water: stirring causes chaotic waves to form, adding more water (electrons) causes the water level to rise, while letting some out causes the level to fall. But if the water is stirred in just the right way, as the Scottish engineer John Scott Russell observed back in 1834, a soliton can form, a tsunami-type wave whose energy is not dispersed as it moves across the surface.
This is not the first time single-electron excitations have been produced, but it is the first time it has been done without having to build a special nanostructure. Because of that, the researchers say, it now appears possible to scale up such a circuit to allow for building larger systems that could conceivably carry quantum information.
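The "quantized flux" condition behind the Lorentzian pulse can be checked numerically: for a pulse V(t) = (ħ/e)·2τ/(t² + τ²), the injected charge q = (e/h)·∫eV(t)dt works out to exactly one electron charge. A minimal sketch (the pulse width is an arbitrary illustrative value):

```python
import numpy as np

# Charge injected by a voltage pulse, in units of e: q/e = (1/2*pi) * integral of (e/hbar)*V(t) dt.
# For a flux-quantized Lorentzian pulse, (e/hbar)*V(t) = 2*tau / (t**2 + tau**2),
# whose integral over all time is 2*pi, i.e. exactly one electron charge.
tau = 1.0                                       # pulse width (arbitrary units)
t = np.linspace(-20_000 * tau, 20_000 * tau, 4_000_001)
dt = t[1] - t[0]
eV_over_hbar = 2 * tau / (t**2 + tau**2)        # (e/hbar) * V(t)
q_over_e = eV_over_hbar.sum() * dt / (2 * np.pi)
print(f"injected charge = {q_over_e:.4f} e")    # close to 1
```

The same sum for a non-Lorentzian pulse of equal area also gives one unit of charge on average; what distinguishes the Lorentzian shape (and is not captured by this simple integral) is that no accompanying hole excitations are created.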
Abstract -
The on-demand generation of pure quantum excitations is important for the operation of quantum systems, but it is particularly difficult for a system of fermions. This is because any perturbation affects all states below the Fermi energy, resulting in a complex superposition of particle and hole excitations. However, it was predicted nearly 20 years ago that a Lorentzian time-dependent potential with quantized flux generates a minimal excitation with only one particle and no hole. Here we report that such quasiparticles (hereafter termed levitons) can be generated on demand in a conductor by applying voltage pulses to a contact. Partitioning the excitations with an electronic beam splitter generates a current noise that we use to measure their number. Minimal-excitation states are observed for Lorentzian pulses, whereas for other pulse shapes there are significant contributions from holes. Further identification of levitons is provided in the energy domain with shot-noise spectroscopy, and in the time domain with electronic Hong–Ou–Mandel noise correlations. The latter, obtained by colliding synchronized levitons on a beam splitter, exemplifies the potential use of levitons for quantum information: using linear electron quantum optics in ballistic conductors, it is possible to imagine flying-qubit operation in which the Fermi statistics are exploited to entangle synchronized electrons emitted by distinct sources. Compared with electron sources based on quantum dots, the generation of levitons does not require delicate nanolithography, considerably simplifying the circuitry for scalability. Levitons are not limited to carrying a single charge, and so in a broader context n-particle levitons could find application in the study of full electron counting statistics. 
But they can also carry a fraction of a charge if they are implemented in Luttinger liquids or in fractional quantum Hall edge channels; this allows the study of Abelian and non-Abelian quasiparticles in the time domain. Finally, the generation technique could be applied to cold atomic gases, leading to the possibility of atomic levitons.
Source: phys.org

Saturday 2 January 2016

All About Neutrinos:

What is this thing, anyways?

Neutrinos are elementary subatomic particles produced by the decay of radioactive elements; they lack an electric charge, or, as F. Reines put it, they are "...the most tiny quantity of reality ever imagined by a human being".
"The name neutrino was coined by Enrico Fermi as a word play on neutrone, the Italian name of the neutron."
Of all high-energy particles, only the weakly interacting neutrinos can directly convey astronomical information from the edge of the universe, and from deep inside the most cataclysmic high-energy processes. As far as we know, there are three different types of neutrinos, each paired with a charged particle: the electron neutrino with the electron (e), the muon neutrino with the muon (µ), and the tau neutrino with the tau (τ).

Copiously produced in high-energy collisions, travelling essentially at the speed of light, and unaffected by magnetic fields, neutrinos meet the basic requirements for astronomy. Their unique advantage arises from a fundamental property: they are affected only by the weakest of nature's forces (but for gravity) and are therefore essentially unabsorbed as they travel cosmological distances between their origin and us.
Where are they coming from?
From what we know today, a majority of the neutrinos floating around were born around 14 billion years ago, soon after the birth of the universe. Since then, the universe has continuously expanded and cooled, and the neutrinos have just kept going. Theoretically, there are now so many neutrinos that they constitute a cosmic background radiation whose temperature is 1.9 kelvin (-271.2 degrees Celsius). Other neutrinos are constantly being produced by nuclear power stations, particle accelerators, nuclear bombs, general atmospheric phenomena, and during the births, collisions, and deaths of stars, particularly the explosions of supernovae.

The neutrino was first postulated in December, 1930 by Wolfgang Pauli to explain the energy spectrum of beta decays, the decay of a neutron into a proton and an electron. Pauli theorized that an undetected particle was carrying away the observed difference between the energy and angular momentum of the initial and final particles. Because of their "ghostly" properties, the first experimental detection of neutrinos had to wait until about 25 years after they were first discussed. In 1956 Clyde Cowan, Frederick Reines, F. B. Harrison, H. W. Kruse, and A. D. McGuire published the article "Detection of the Free Neutrino: a Confirmation" in Science, a result that was rewarded with the 1995 Nobel Prize.
In 1962 Leon M. Lederman, Melvin Schwartz and Jack Steinberger showed that more than one type of neutrino exists by first detecting interactions of the muon neutrino. When a third type of lepton, the tau, was discovered in 1975 at the Stanford Linear Accelerator, it too was expected to have an associated neutrino. First evidence for this third neutrino type came from the observation of missing energy and momentum in tau decays analogous to the beta decay that had led to the discovery of the neutrino in the first place. The first detection of actual tau neutrino interactions was announced in summer of 2000 by the DONUT collaboration at Fermilab, making it, at the time, the most recent particle of the Standard Model to have been directly observed.

A practical method for investigating neutrino masses (that is, flavour oscillation) was first suggested by Bruno Pontecorvo in 1957 using an analogy with the neutral kaon system; over the subsequent 10 years he developed the mathematical formalism and the modern formulation of vacuum oscillations. In 1985 Stanislav Mikheyev and Alexei Smirnov (expanding on 1978 work by Lincoln Wolfenstein) noted that flavour oscillations can be modified when neutrinos propagate through matter. This so-called MSW effect is important for understanding neutrinos emitted by the Sun, which pass through the Sun's dense interior on their way to detectors on Earth.
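In the two-flavour case, the vacuum-oscillation formalism mentioned above boils down to the standard formula P(ν_a → ν_b) = sin²(2θ)·sin²(1.27·Δm²[eV²]·L[km]/E[GeV]). A minimal sketch (the mixing angle and mass splitting below are illustrative round numbers, not fitted values):

```python
import math

def oscillation_probability(theta, dm2_ev2, L_km, E_GeV):
    """Two-flavour vacuum oscillation probability P(nu_a -> nu_b).

    theta: mixing angle in radians; dm2_ev2: mass-squared splitting in eV^2;
    L_km: baseline in km; E_GeV: neutrino energy in GeV.
    """
    return math.sin(2 * theta) ** 2 * math.sin(1.27 * dm2_ev2 * L_km / E_GeV) ** 2

# Illustrative numbers: maximal mixing, atmospheric-scale mass splitting.
theta = math.pi / 4     # maximal mixing
dm2 = 2.5e-3            # eV^2
E = 1.0                 # GeV
# At the first oscillation maximum (1.27 * dm2 * L / E = pi/2) the probability
# reaches its peak value sin^2(2*theta), which is 1 for maximal mixing.
L_max = (math.pi / 2) / (1.27 * dm2) * E
print(oscillation_probability(theta, dm2, L_max, E))  # ≈ 1.0
```

Note this is the vacuum formula only; the MSW matter effect modifies the effective mixing angle and splitting as neutrinos traverse dense matter.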


It is the feeble interaction of neutrinos with matter that makes them uniquely valuable as astronomical messengers. Unlike photons or charged particles, neutrinos can emerge from deep inside their sources and travel across the universe without interference. They are not deflected by interstellar magnetic fields and are not absorbed by intervening matter. However, this same trait makes cosmic neutrinos extremely difficult to detect; immense instruments are required to find them in sufficient numbers to trace their origin.
Neutrinos can interact via the neutral current (involving the exchange of a Z boson) or charged current (involving the exchange of a W boson) weak interactions.
In a neutral current interaction, the neutrino leaves the detector after having transferred some of its energy and momentum to a target particle. All three neutrino flavors can participate regardless of the neutrino energy. However, no neutrino flavor information is left behind.
In a charged current interaction, the neutrino transforms into its partner lepton (electron, muon, or tau). However, if the neutrino does not have sufficient energy to create its heavier partner's mass, the charged current interaction is unavailable to it. Solar and reactor neutrinos have enough energy to create electrons. Most accelerator-based neutrino beams can also create muons, and a few can create taus. A detector which can distinguish among these leptons can reveal the flavor of the incident neutrino in a charged current interaction. Because the interaction involves the exchange of a charged boson, the target particle also changes character (e.g., neutron to proton).
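The energy condition in the last paragraph can be sketched with a toy check: a charged-current interaction is only open if the neutrino energy exceeds (roughly) the rest mass of its charged partner. The helper below is purely illustrative; real thresholds depend on the full reaction kinematics, not just the lepton mass.

```python
# Toy check of charged-current availability: compare the neutrino energy with the
# charged lepton's rest-mass energy (MeV). This ignores target recoil and
# reaction kinematics, so it is only a rough illustration of the flavor thresholds.
LEPTON_MASS_MEV = {"electron": 0.511, "muon": 105.7, "tau": 1776.9}

def open_cc_channels(E_nu_mev):
    """Return the charged leptons a neutrino of energy E_nu_mev could create."""
    return [lepton for lepton, m in LEPTON_MASS_MEV.items() if E_nu_mev > m]

print(open_cc_channels(5.0))     # reactor-scale energy: only the electron channel
print(open_cc_channels(2000.0))  # multi-GeV energy: all three channels open
```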


Thermomagnetism: Using Heat to Make Magnets-

Thermomagnetism: Using Heat to Make Magnets-

EPFL scientists have provided the first evidence ever that it is possible to generate a magnetic field by using heat instead of electricity. The phenomenon is referred to as the Magnetic Seebeck effect or ‘thermomagnetism’.

A temperature difference across an electric conductor can generate an electric field. This phenomenon, called the Seebeck effect, lies at the root of thermoelectricity (heat turned into electricity), and is used to drive space probes and power thermoelectric generators, and could be implemented for heat-harvesting in power plants, wrist-watches and microelectronics. In theory, it is also possible to generate a magnetic field by using a temperature difference across an electrical insulator (‘thermomagnetism’). This has been referred to as the Magnetic Seebeck effect, and has enormous applications for future electronics such as solid-state devices and magnetic-tunnel transistors. In a breakthrough Physical Review Letters publication that has been promoted to “Editors’ Suggestion”, EPFL scientists have for the first time predicted and experimentally verified the existence of the Magnetic Seebeck effect.

Thermoelectricity and ‘thermomagnetism’

The Seebeck effect (thermoelectricity) — named after Thomas Johann Seebeck who first observed it in 1821 — is generated when electrons in an electric conductor move as a response to a temperature gradient. On average, the electrons on the hot side of the conductor have more kinetic energy and subsequently move at higher speeds than the electrons on the cold side. This causes them to diffuse from the hot to the cold side, generating an electric field that is directly proportional to the temperature gradient along the conductor.
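The proportionality at the end of that paragraph is just V = S·ΔT, where S is the material's Seebeck coefficient. A minimal sketch (the coefficient below is an assumed, illustrative order of magnitude, not a measured value):

```python
def seebeck_voltage(S_volts_per_kelvin, delta_T_kelvin):
    """Thermoelectric voltage from the Seebeck effect: V = S * dT."""
    return S_volts_per_kelvin * delta_T_kelvin

# Illustrative: ~200 microvolts/K (a typical order of magnitude for good
# thermoelectric materials) across a 50 K temperature difference.
S = 200e-6   # V/K (assumed value for illustration)
dT = 50.0    # K
print(f"{seebeck_voltage(S, dT) * 1e3:.1f} mV")  # 10.0 mV
```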

Using an electrical insulator rather than a conductor, researchers led by Jean-Philippe Ansermet at EPFL have shown that a Magnetic Seebeck effect also exists. Because an insulator does not allow electrons to flow, a temperature gradient does not cause electrons to diffuse. Instead, it affects another property of electrons that forms the basis of magnetism and is referred to as ‘spin.’

In an insulator, a temperature gradient alters the orientation of electrons’ spin. Under certain conditions, this generates a magnetic field that is perpendicular to the direction of the temperature gradient. Similar to thermoelectricity described above, the intensity of the thermomagnetic field is directly proportional to the temperature gradient along the insulator.

First evidence for the Magnetic Seebeck effect

Using an insulating material called YIG (yttrium iron garnet), co-author Antonio Vetrò examined the propagation of magnetization waves along it. He found that the direction in which the magnetic waves propagated along the insulator affected the degree of magnetization loss, a phenomenon called magnetic damping. When the direction of the waves matched the orientation of the temperature gradient along the YIG, the magnetization damping was reduced; when they propagated in the opposite direction, the damping increased.

The Magnetic Seebeck effect combines three distinct fields of physics: thermodynamics, continuum mechanics and electromagnetism. The difficulty lies in the fact that, until now, no-one had found a way to consistently unify them. Pursuing this, first author Sylvain Bréchet built upon the work of Ernst Stückelberg (1905-1984), a renowned Swiss physicist who had previously developed a thermodynamical formalism for his teaching. Out of the hundreds of equations that Bréchet produced, one predicted that a temperature gradient should generate a magnetic field.

Although at an early stage, this discovery opens new approaches for addressing magnetization damping. This could have a tremendous impact on future devices based on spintronics (Nobel Prize 2007), an emergent technological field that offers an alternative to traditional electronics. In spintronic devices, signal transmission relies on the spin of electrons rather than their charge and movement. For example, the spintronics field is now considering harvesting heat waste coming from microprocessors like those used in personal computers.


Schwarzschild Black Hole :


A few months after Einstein published his work on general relativity (in 1915), Karl Schwarzschild (1916) found one solution to Einstein's equations: the curvature due to a massive non-rotating spherical object.

Using Einstein's equations, Schwarzschild determined how spacetime is curved by the presence of a nonrotating, spherically symmetric mass.


A black hole with zero charge (Q = 0) and no angular momentum (J = 0) is described by the Schwarzschild solution (or Schwarzschild metric), an exact, unique solution of the Einstein field equations of general relativity for the general static isotropic metric.
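The characteristic scale of the Schwarzschild solution is the Schwarzschild radius, r_s = 2GM/c²: an object compressed inside this radius becomes a black hole. A quick check for the Sun:

```python
# Schwarzschild radius r_s = 2*G*M / c^2
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
M_sun = 1.989e30     # solar mass, kg

r_s = 2 * G * M_sun / c**2
print(f"r_s(Sun) ≈ {r_s / 1000:.2f} km")  # about 2.95 km
```

In other words, the entire Sun would have to be squeezed into a sphere about 3 km in radius to become a Schwarzschild black hole.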

The Lepton - By: Rabbit


“Why The Name Lepton”

Leptons: In 1897 the first lepton, the electron, was discovered by J.J. Thomson and his team of British physicists. Physicists took the name "lepton" from the Greek, meaning "slender". It wasn't until 1975 that Martin Lewis Perl and his colleagues in the SLAC-LBL group discovered the tau through a series of experiments. Up until then it had been thought that leptons were the lightest subatomic particles; then came the tau in 1975. Although the name "lepton" is no longer strictly appropriate, we still use it today to describe all spin-1/2 particles that are not affected by the strong force.

“The Muons & Carl David Anderson”

In 1936 the American physicists Carl David Anderson and Seth Neddermeyer discovered the muon (formerly called the "mu-meson"), nearly 40 years after the discovery of the electron. That same year Anderson was awarded the Nobel Prize in Physics for his 1932 discovery of the positron. While experimenting with cosmic rays, he had unexpectedly observed the creation of a particle with the same mass as the electron but the opposite electrical charge. This validated Paul Dirac's theoretical prediction of the existence of the positron. Shortly after, Anderson produced conclusive evidence by blasting gamma rays into other materials, resulting in the creation of positron-electron pairs.

After Anderson and Neddermeyer discovered the muon, they quickly noticed that this subatomic particle was 207 times more massive than the electron, but that it still had the same negative electric charge and the same spin 1/2 as the electron. Due to its sheer mass (207 times that of the electron), it was at first categorized as a meson rather than a lepton; this was corrected when physicists noticed it shared more commonalities with electrons than with mesons. Since muons do not undergo the strong interaction, they were reclassified, together with the electron and the neutrinos, into a new group of particles now known as leptons. At first glance, physicists thought they had seen the pion, but they were mistaken. Nevertheless, the muon was the first of a long list of new subatomic particles, so its discovery alone was a great achievement.

“The Tau & Martin Lewis Perl”

The tau lepton is a superheavy cousin of the electron, the familiar carrier of electrical current in household appliances. The two particles are identical in all respects except that the tau is approximately 3,500 times heavier than the electron, and while the electron is stable, the tau survives for less than a trillionth of a second. In the mid-1970s Martin Lewis Perl and his team of colleagues in the SLAC-LBL group conducted a series of experiments that detected the presence of the tau; before that, it was believed that only two quark-lepton families existed. Their equipment consisted of SLAC, a new type of colliding-ring facility called SPEAR, and the LBL magnetic detector.

With this equipment they could also detect and distinguish between leptons, hadrons, and photons. They did not detect the tau directly, but rather observed anomalous events. After more than a year's worth of careful analysis, Martin Lewis Perl was able to convince his research team that they were in fact observing a new type of elementary particle, which he named the "tau". This was significant because the tau turned out to be the first member of a third quark-lepton family. It wasn't until 1995 that Fermilab scientists discovered the top quark, completing that family; the same year, Martin Lewis Perl was awarded the Nobel Prize in Physics for his discovery of the tau lepton.

“How Important is The Tau”

The discovery of the tau was very important because it's the only lepton that can decay into hadrons. As with its other decay modes, the hadronic decay of the tau proceeds through the weak interaction. Because tauonic lepton number is conserved in weak decays, a tau neutrino is created whenever a tau decays to a muon or an electron.

The branching ratios of the common purely leptonic tau decays are:

-17.82% for decay into a tau neutrino, electron and electron antineutrino.
-17.39% for decay into a tau neutrino, muon and muon antineutrino.

The similarity of the two branching ratios is a consequence of lepton universality. Like other charged subatomic particles, the tau lepton is predicted to be able to form exotic atoms. One example is tauonium which, by analogy with muonium, consists of an antitau and an electron. Another possibility is an onium atom called true tauonium; however, it is very difficult to detect.
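Lepton universality makes the near-equality of those two branching ratios quantitative: the muon channel is suppressed only by the standard phase-space factor f(x) = 1 - 8x + 8x³ - x⁴ - 12x²·ln(x) with x = (m_lepton/m_tau)², so the predicted ratio f(x_μ)/f(x_e) ≈ 0.973 sits close to the measured 17.39/17.82 ≈ 0.976. A quick check (lepton masses in MeV):

```python
import math

def phase_space(x):
    """Phase-space suppression factor for tau -> lepton + two neutrinos."""
    return 1 - 8*x + 8*x**3 - x**4 - 12*x**2 * math.log(x)

m_e, m_mu, m_tau = 0.511, 105.66, 1776.86   # MeV

# Under lepton universality the couplings are equal, so the ratio of the two
# branching ratios is set purely by phase space (the muon's larger mass).
ratio_predicted = phase_space((m_mu / m_tau) ** 2) / phase_space((m_e / m_tau) ** 2)
ratio_measured = 17.39 / 17.82

print(f"predicted {ratio_predicted:.3f}, measured {ratio_measured:.3f}")
```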

“Leptons & The 6 Different Flavors”

A lepton is a member of a family of subatomic particles that respond only to the electromagnetic force, the weak force, and the gravitational force; they are not affected by the strong force. Leptons appear to be point-like particles without an internal structure: they are considered elementary particles because they do not appear to be made up of smaller units of matter. A lepton can either carry one unit of electric charge or be neutral.

Each charged lepton has an associated neutral partner, or neutrino, that has no electric charge and no significant mass. There are six different flavors of leptons. The most familiar is the electron; the other charged leptons are the muon and the tau, and there is a neutrino associated with each of them: the electron neutrino, the muon neutrino, and the tau neutrino. Of the six flavors, three carry an electrical charge (the electron, muon, and tau) while the other three do not. The muon and tau are charged like the electron, but they carry larger amounts of mass.

As for the three types of neutrinos, they have no electrical charge and very little mass; not to mention, they are very difficult to find. Electrons are the lightest charged leptons, carrying a mass of only about 1/1,840 that of a proton. Muons are much heavier, with more than 200 times as much mass as the electron, and taus are approximately 3,500 times more massive than the electron. Each lepton, charged or neutral, has an antiparticle, commonly known as an antilepton.

Although the mass of an antilepton is identical to that of the corresponding lepton, all the other properties are reversed; each lepton has a corresponding antimatter antilepton. Note that all elementary particles have corresponding antiparticles, known collectively as antimatter, and antimatter particles have charges opposite to those of their matter counterparts. For example, positrons have a +1 charge while electrons have a -1 electric charge. Protons have antiprotons, neutrons have antineutrons, and electrons have "anti-electrons".

The "anti-electrons" are common enough to have their own special name: positrons. Mathematically speaking, the total lepton number L is constant; the number of leptons minus the number of antileptons always remains the same. Moreover, a conservation law for leptons of each different type seems to hold: the number of electrons and electron neutrinos, for example, is conserved separately from the number of muons and muon neutrinos. The current experimental limit on violation of this conservation law is about one part per million.

Leptons have other characteristic features in addition to their charge and mass, notably their intrinsic angular momentum, or spin. Leptons are classified within a larger group of subatomic particles, the fermions, which are characterized by half-integer values of their spin. The total number of leptons appears to be the same in every particle reaction.

“The Effects & Influence of The Lepton in Nuclear Physics”

What’s most intriguing about leptons is that they’re involved in several different processes, like beta decay in nuclear physics. Beta decay is a type of radioactive decay in which a beta particle (an electron or a positron) is emitted from an atomic nucleus; the process allows an atom to move toward the optimal ratio of protons and neutrons. There are two different types of beta decay, both mediated by what’s known as the weak force: beta minus, in which an electron is emitted, and beta plus, in which a positron is emitted.

Sincerely,

Rabbit

Klein Tunneling: Coupled Particles Cross Energy Wall:


Model demonstrates that it is possible for two particles to cross an energy barrier together, where a single particle could not.

For the first time, a new kind of so-called Klein tunnelling, representing the quantum equivalent of crossing an energy wall, has been presented in a model of two interacting particles. This work by Stefano Longhi and Giuseppe Della Valle from the Institute of Photonics and Nanotechnology in Milan, Italy, is about to be published in The European Physical Journal B.

Klein tunnelling is a quantum phenomenon referring to the fact that a high-potential barrier can be transparent to a particle moving at a speed nearing that of light, referred to as relativistic. Most of the previous Klein tunnelling models describe the phenomenon for a single particle. However, when two particles are involved, tunnelling can be modified as a result of their mutual interaction. This means, for example, that two electrons hopping on a lattice, or two ultra-cold atoms trapped in an optical lattice can exchange energy when they occupy the same lattice site.
The authors relied on an analytical and numerical study of a landmark model of interacting particles, called the Hubbard model. It is typically used to describe particle pairs in condensed matter such as in semiconductors and in so-called matter-wave physics, used for instance to describe microscopic particles oscillating between their material and wave-like characteristics. Longhi and Della Valle predict a new type of Klein tunnelling for a couple of interacting particles confronted by an energy barrier. Even though the barrier is impenetrable for single particles, it becomes transparent when the two particles cross the energy barrier together.


They expect these predictions to be confirmed experimentally in ultra-cold atoms trapped in optical lattices. If this is the case, similar quantum simulation could be a tool for emulating multiple-particle systems that cannot be modelled using classical computations.

Non-locality & Quantum Theory

Locality means that an object is influenced directly only by its immediate surroundings and not by remotely located objects.

The fact that quantum mechanics is at odds with this principle is not new. Back in the 1930s, Einstein, Podolsky, Rosen, and Schrödinger, first spotted this striking aspect of quantum theory as they pointed out the possibility of quantum entanglement.

This led them to think that something was not quite right with quantum mechanics and that the theory had to be completed in a way that would restore locality.

But this hope was to fall apart years later, when Bell proved that quantum mechanics is an inherently nonlocal theory that is simply incompatible with any physical theory in which the principle of locality holds.

Bell argued that in any local theory there are limits to how distant events are correlated. Mathematically, the correlation statistics must obey certain inequalities, now called Bell inequalities. Bell showed that two observers measuring states that are entangled would observe strong correlations between the results of their measurements: such correlations can be so strong that they violate Bell’s inequalities.
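The CHSH form of Bell's inequality makes this concrete: for any local theory the quantity S = |E(a,b) - E(a,b') + E(a',b) + E(a',b')| is bounded by 2, while for a spin singlet quantum mechanics gives E(a,b) = -cos(a - b), which at the standard angle choices reaches 2√2 ≈ 2.83. A quick numerical check:

```python
import math

def E(a, b):
    """Quantum correlation of spin measurements on a singlet at analyzer angles a, b."""
    return -math.cos(a - b)

# Standard CHSH angle choices (radians) that maximize the quantum value.
a, a_p = 0.0, math.pi / 2
b, b_p = math.pi / 4, 3 * math.pi / 4

S = abs(E(a, b) - E(a, b_p) + E(a_p, b) + E(a_p, b_p))
print(f"S = {S:.3f}  (local bound: 2, quantum maximum: {2 * math.sqrt(2):.3f})")
```

Any value of S above 2, as here, is impossible for a local hidden-variable theory; 2√2 is the Tsirelson bound, the largest value quantum mechanics allows.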

This phenomenon, called quantum nonlocality, has been repeatedly observed experimentally. For instance, in quantum optics experiments, it is now standard to create pairs of photons entangled in polarization. Then, each photon of the pair is sent (e.g., via an optical fiber) to a distant observer holding a polarization analyzer. The observed correlations between the results of both polarization measurements can lead to Bell-inequality violations, providing evidence that the physical world can be nonlocal.

Entanglement and nonlocality were first considered as two facets of the same physical effect, but they are now recognized as two different concepts, as research carried out in our group has demonstrated.

However, the relation between them is yet to be fully understood.

"Einstein's Special Relativity" for dummies

"Einstein's Special Relativity"

By: Andrew Zimmerman Jones and Daniel Robbins from String Theory for Dummies


In 1905, Albert Einstein published the theory of special relativity, which explains how to interpret motion between different inertial frames of reference — that is, places that are moving at constant speeds relative to each other.

Einstein explained that what matters when two objects move at constant speed is the relative motion between them, instead of appealing to the ether as an absolute frame of reference that defines what is going on. If you and some astronaut, Amber, are moving in different spaceships and want to compare your observations, all that matters is how fast you and Amber are moving with respect to each other.

Special relativity includes only the special case (hence the name) where the motion is uniform: it applies only if you’re traveling in a straight line at a constant speed. As soon as you accelerate or curve, or do anything that changes the nature of the motion in any way, special relativity ceases to apply. That’s where Einstein’s general theory of relativity comes in, because it can explain the general case of any sort of motion.

Einstein’s theory was based on two key principles:

The principle of relativity: The laws of physics don’t change, even for objects moving in inertial (constant speed) frames of reference.

The principle of the speed of light: The speed of light is the same for all observers, regardless of their motion relative to the light source. (Physicists write this speed using the symbol c.)

The genius of Einstein’s discoveries is that he looked at the experiments and assumed the findings were true. This was the exact opposite of what other physicists seemed to be doing. Instead of assuming the theory was correct and that the experiments failed, he assumed that the experiments were correct and the theory had failed.

In the latter part of the 19th century, physicists were searching for the mysterious thing called ether — the medium they believed existed for light waves to wave through. The belief in ether had caused a mess of things, in Einstein’s view, by introducing a medium that caused certain laws of physics to work differently depending on how the observer moved relative to the ether. Einstein just removed the ether entirely and assumed that the laws of physics, including the speed of light, worked the same regardless of how you were moving — exactly as experiments and mathematics showed them to be!

Unifying space and time
Einstein’s theory of special relativity created a fundamental link between space and time. The universe can be viewed as having three space dimensions — up/down, left/right, forward/backward — and one time dimension. This 4-dimensional space is referred to as the space-time continuum.

If you move fast enough through space, the observations that you make about space and time differ somewhat from the observations of other people, who are moving at different speeds.

You can picture this for yourself by understanding the thought experiment depicted in this figure. Imagine that you’re on a spaceship and holding a laser so it shoots a beam of light directly up, striking a mirror you’ve placed on the ceiling. The light beam then comes back down and strikes a detector.

(Top) You see a beam of light go up, bounce off the mirror, and come straight down. (Bottom) Amber sees the beam travel along a diagonal path.
Now suppose the spaceship is traveling at a constant speed of half the speed of light (0.5c, as physicists would write it). According to Einstein, this makes no difference to you — you can’t even tell that you’re moving. However, if astronaut Amber were spying on you, as in the bottom of the figure, it would be a different story.

Amber would see your beam of light travel upward along a diagonal path, strike the mirror, and then travel downward along a diagonal path before striking the detector. In other words, you and Amber would see different paths for the light and, more importantly, those paths aren’t even the same length. This means that the time the beam takes to go from the laser to the mirror to the detector must also be different for you and Amber so that you both agree on the speed of light.

This phenomenon is known as time dilation: time on a ship moving very quickly appears to pass more slowly than it does for an observer at rest, such as Amber.

As strange as it seems, this example (and many others) demonstrates that in Einstein’s theory of relativity, space and time are intimately linked together. If you apply the Lorentz transformation equations, the numbers work out so that both observers measure exactly the same speed of light.
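To put rough numbers on the light-clock story, here’s a short Python sketch (mine, not from the text). It computes the Lorentz factor γ = 1/√(1 − v²/c²) at 0.5c and compares the round-trip time you measure on the ship with the longer time Amber measures; the 3-metre mirror height is an arbitrary assumption for illustration.

```python
import math

c = 299_792_458.0   # speed of light in m/s
v = 0.5 * c         # the ship's speed: half the speed of light
height = 3.0        # assumed laser-to-mirror distance in metres (illustrative)

# Lorentz factor: the ratio by which the moving clock appears to run slow
gamma = 1.0 / math.sqrt(1.0 - (v / c) ** 2)

# Round-trip time you measure on the ship: straight up and straight down
t_ship = 2.0 * height / c

# Round-trip time Amber measures: her diagonal path is longer, and since
# light travels at the same speed c for her too, the trip must take longer
t_amber = gamma * t_ship

print(f"gamma   = {gamma:.4f}")   # ≈ 1.1547 at 0.5c
print(f"t_ship  = {t_ship:.3e} s")
print(f"t_amber = {t_amber:.3e} s")
```

At half the speed of light the effect is only about 15 percent; it grows without bound as v approaches c.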

This strange behavior of space and time is only evident when you’re traveling close to the speed of light, so no one had ever observed it before. Experiments carried out since Einstein’s discovery have confirmed that it’s true — time and space are perceived differently, in precisely the way Einstein described, for objects moving near the speed of light.

Unifying mass and energy
The most famous work of Einstein’s life also dates from 1905 (a busy year for him), when he applied the ideas of his relativity paper to come up with the equation E = mc², which represents the relationship between mass (m) and energy (E).

In a nutshell, Einstein found that as an object approached the speed of light, c, the mass of the object increased. The object goes faster, but it also gets heavier. If it were actually able to move at c, the object’s mass and energy would both be infinite. A heavier object is harder to speed up, so it’s impossible to ever actually get the object up to a speed of c.
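This runaway growth can be made concrete with a short Python sketch (my own illustration, not from the text), using the total relativistic energy E = γmc² for an assumed 1-kilogram object:

```python
import math

c = 299_792_458.0   # speed of light in m/s
m = 1.0             # assumed rest mass: 1 kg (illustrative)

def energy(v):
    """Total relativistic energy E = gamma * m * c^2 at speed v."""
    gamma = 1.0 / math.sqrt(1.0 - (v / c) ** 2)
    return gamma * m * c ** 2

# The closer v gets to c, the faster the required energy blows up
for fraction in (0.5, 0.9, 0.99, 0.999, 0.99999):
    print(f"v = {fraction}c  ->  E = {energy(fraction * c):.3e} J")
```

Each step toward c multiplies the energy bill; at v = c the denominator hits zero and the energy would be infinite, which is exactly why nothing with mass can reach light speed.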

Until Einstein, the concepts of mass and energy were viewed as completely separate. He proved that the principles of conservation of mass and conservation of energy are part of the same larger, unified principle: conservation of mass-energy. Matter can be turned into energy and energy can be turned into matter because a fundamental connection exists between the two.
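A quick back-of-the-envelope figure (my own, not from the text) shows how much energy E = mc² locks into even a small mass — converting a single gram of matter entirely into energy releases roughly 9 × 10¹³ joules:

```python
c = 299_792_458.0        # speed of light in m/s
m = 0.001                # one gram, expressed in kilograms

E = m * c ** 2           # rest energy equivalent of that mass
print(f"E = {E:.3e} J")  # ≈ 8.988e13 J, on the order of a large nuclear blast
```

The factor c² is so enormous that tiny amounts of mass correspond to staggering amounts of energy, which is why nuclear reactions dwarf chemical ones.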