When we look for life elsewhere in the universe, we often focus on planets like our own: not too hot, not too cold ... warm enough for liquid water. But this model has one glaring problem: In the early days of our solar system, when life on Earth first developed, our sun emitted only about 70 percent of the energy it does today. That might not sound like a huge gap, but it's the difference between our planet being the beautiful blue marble we know and a frozen ice world.

In other words, life shouldn't have been able to develop here — yet it somehow did. This problem is sometimes referred to as the "faint young sun paradox," and it has puzzled scientists for generations. There are theories, however.
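To put rough numbers on the size of that problem, here is a quick back-of-the-envelope sketch (not from the article) using the standard planetary equilibrium-temperature formula. The solar-constant and albedo values are assumed round figures, and the calculation deliberately ignores any greenhouse effect.

```python
# Rough back-of-the-envelope sketch (not from the article): how much colder
# Earth's no-greenhouse equilibrium temperature gets at 70% of today's sunlight.
# Assumed round numbers: present-day solar constant ~1361 W/m^2, Bond albedo ~0.3.

SIGMA = 5.670e-8       # Stefan-Boltzmann constant, W m^-2 K^-4
SOLAR_CONSTANT = 1361  # W/m^2 at Earth's orbit today (assumed value)
ALBEDO = 0.3           # fraction of sunlight reflected back to space (assumed)

def equilibrium_temp(solar_constant, albedo=ALBEDO):
    """Planetary equilibrium temperature in kelvin, with no greenhouse effect."""
    absorbed = solar_constant * (1 - albedo) / 4  # sunlight averaged over the whole sphere
    return (absorbed / SIGMA) ** 0.25

t_now = equilibrium_temp(SOLAR_CONSTANT)
t_young = equilibrium_temp(0.70 * SOLAR_CONSTANT)

print(f"Equilibrium temperature today:      {t_now:.0f} K")
print(f"Equilibrium temperature at 70% sun: {t_young:.0f} K")
print(f"Difference:                         {t_now - t_young:.0f} K")
# Roughly 255 K versus 233 K: both are below freezing because the greenhouse
# effect is left out, but the young Earth starts a further ~20 K colder,
# which is why a faint sun plus evidence of liquid water is such a puzzle.
```

The scaling is the point: because temperature goes as the fourth root of the incoming sunlight, 70 percent of today's energy leaves the early Earth's baseline roughly 20 kelvin colder still, deep in frozen territory unless something else was warming the planet.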

One leading theory posits an idea we're all familiar with today: a greenhouse effect. Perhaps the young Earth had a huge amount of atmospheric carbon dioxide, which would have trapped the faint sun's heat and warmed the planet enough to make up for the weaker sunlight. The only problem with this theory is that it lacks evidence. In fact, the geological record and computer modeling suggest the opposite: that carbon dioxide levels were too low to make a big enough difference.

Another theory suggests that Earth could have been kept warm by a surplus of radioactive material, but the calculations don't quite pan out here either: the young Earth would have needed far more radioactive material than it actually had.

Some scientists have hypothesized that perhaps the moon could have warmed us, since in the planet's early days the moon would have been much closer to Earth and thus would have exerted a stronger tidal influence. Tidal heating would have had a warming effect, but again, the calculations don't add up: it wouldn't have been enough to melt ice on a large scale.

But now NASA scientists have a new theory, one that has held up to scrutiny so far, reports Quartz. Perhaps, they hypothesize, the sun was weaker but far more volatile than it is today. Volatility is the key: it means the sun may once have experienced more frequent coronal mass ejections (CMEs), scorching eruptions that spew plasma out into the solar system.

If CMEs were frequent enough, they might have poured enough energy into our atmosphere to warm it to the point where the chemical reactions important for life could occur. This theory has a two-pronged advantage: it explains how liquid water might have existed on the young Earth, and it also supplies the energy to drive the chemical reactions that produce the molecules life needs to get started.

“A rain of [these molecules] onto the surface would also provide fertilizer for a new biology,” explained Monica Grady of the Open University.

If this theory does hold up to scrutiny — a big "if" that will need to be investigated — it might finally offer a solution to the faint young sun paradox. It's also a theory that might help us to better understand how life began here on Earth, as well as how it might have gotten started elsewhere.