The Dangerous Relationship Between Hurricanes and the Moon

by Rebecca Boyle – source: medium

As Hurricane Florence makes landfall, scientists’ eyes are looking to the skies

MYRTLE BEACH, SC — Sept. 14: Storm clouds seen over the 2nd Ave. pier. Photo by Joe Raedle/Getty Images News

It may not be visible on the sodden Eastern Seaboard, but far above Hurricane Florence’s lashing winds and rain, the crescent moon is waxing—and that’s bad news for flooding along the Carolina coast.
Through its gravitational tug, the moon is the primary driver of the Earth’s tides—the daily rise and fall of the oceans. High tides are higher than usual in the days around a new or full moon. As wind and rain from Hurricane Florence began pelting North Carolina, the moon’s cycle was just four days old, which means higher tides than average for America’s eastern shores. It also means the storm tide—the term for the combined rise in water level from the storm surge and the astronomical tide—will be higher too.
“If you could choose when to get hit by a hurricane, you would want it to be at a low tide,” said Brian McNoldy, a hurricane researcher at the University of Miami, who explained that to reduce the storm tide as much as possible, you want hurricanes to hit—if they must—during the first quarter or third quarter moon, “as far opposite of a new or full moon as you can get,” he said. Unfortunately, the recent new moon is still influencing tides in the Carolinas and in the Miami area.

The moon’s influence may not be obvious to most people, especially those who don’t live near coasts. But it shouldn’t be underestimated.

The moon’s shape appears to change in the sky from night to night because of its alignment with respect to the Earth and sun. When the moon sidles in between the two, we can’t see light reflected off the moon’s surface, and it is invisible to us. That’s a new moon. During this phase, the sun and moon pull on Earth’s oceans from the same direction, so the sun’s gravity reinforces the moon’s. The water on our planet bulges toward the moon more dramatically, and tides can be significantly higher.
Here’s what that means for Florence: As the effects of the hurricane slammed into the coast Thursday, Sept. 13 (the eye of the storm reached the coast on Friday morning), the moon was four days past new and 240,100 miles away, pretty close to its average distance from Earth. The scenario is not great news for the storm surge and storm tide, according to McNoldy.
The last megastorm to besiege the Carolinas was Hurricane Hazel in 1954. That storm had terrible timing, making landfall during a full moon high tide, bringing an 18-foot storm surge. Hurricane Sandy made landfall in New York City in 2012, also during a full moon, which may have contributed to higher than usual storm-surge levels. Hurricane Florence poses an even greater threat because it is expected to stall along the coast, dumping rain on the Carolinas for days on end. If the storm is as slow-moving as predicted, “it’s going to be there for two or three high tides,” McNoldy said.
Scientists are still trying to work out just how high the water will rise, where it will rise, and for how long. Storm surges—the abnormal rise in sea levels and, often, the deadliest effect of a hurricane—depend on numerous things, like a storm’s strength, speed, and direction; the terrain, coastline shape, and inlet size; the warmth of the water; and even the continental shelf on the ocean floor.

If you could choose when to get hit by a hurricane, you would want it to be at a low tide.

While the moon is partly to blame for high tides, there is one very human reason for Florence’s heightened storm-surge risk. Average sea levels are higher now than in the past because of human-caused global warming. This ensures that Florence’s storm surge will be correspondingly higher and reach farther inland than it otherwise would. The effect is already evident in sunny-day flooding: high-tide flooding projections for the Carolinas were about 25 percent above average for 2017–2018, according to the National Oceanic and Atmospheric Administration (NOAA). In 2016, Wilmington, North Carolina, saw 84 days of high-tide flooding and Charleston, South Carolina, saw 54 days. By contrast, in 1966, Charleston saw only four days of high-tide floods.
“There are a lot of things that affect this baseline upon which the storm travels,” said Ben Hamlington, an atmospheric scientist at NASA’s Jet Propulsion Laboratory, who studies sea-level rise. “The higher that baseline is, the worse your storm surge is going to be.”
Hamlington recently moved to California from Norfolk, Virginia, where municipal authorities are trying to build street-level flood prediction maps. “There’s a compounding effect with all the rainfall and the storm surge. When all this water gets dumped into an area, where does that water go? How does the land respond to that? All those things are very difficult to understand, and it’s a very active research area,” he said.
McNoldy, who was an astronomy major in college and switched to atmospheric science in graduate school, said the moon’s influence may not be obvious to most people, especially those who don’t live near coasts. But it shouldn’t be underestimated.
“It’s uncommon for people to ask about the tides, but it’s a huge factor, honestly,” he said. “It can make all the difference.”

Are Moon Colonies Possible or the Fantasy of Billionaires?

by Eric Niiler – source: medium

The economy of lunar mining and tourism

Credit: Education Images/Universal Images Group/Getty

Growing up, my brother and I couldn’t get enough of Space: 1999, a mid-’70s series that hypnotized us with cool special effects, the crush-worthy Barbara Bain, who acted alongside her real-life husband Martin Landau, and its portrayal of the Moon as the main character in an action-packed 48-minute weekly episode. The premise of the show is a bit far-fetched: an explosion at a moon base knocks the Moon out of Earth’s orbit and into a voyage to explore strange new worlds across the galaxy. The show was set a mere 15 years in the future.
It’s a reminder that in those post-Apollo years, we fully expected NASA or some international space force to be working on space bases in real life. More than four decades later, we’re still waiting for our Moonbase Alpha — though that’s not for a lack of interest. Ex-astronauts, entrepreneurial dreamers and short-lived sci-fi shows like Space: 1999 have kept alive the dream of a moon colony, and now, the confluence of technology, money, and political interest is pushing this idea out of the realm of sci-fi and closer to reality.
In my interviews with space scientists, industry officials, and futurists, an unofficial blueprint for moon colonization slowly takes shape. First, private space companies find ways to reduce the cost of launch. Right now, SpaceX says that it costs $62 million every time its Falcon 9 rocket is launched, while the more powerful Falcon Heavy costs an estimated $90 million per launch. Satellite companies and others wanting to get something into orbit get a discount for bulk purchases. SpaceX already brings food and supplies to the International Space Station, and it hopes to ferry U.S. astronauts there by sometime in late 2019.
Then come fly-arounds and orbiting platforms. The Chinese plan to launch an Earth-orbiting space station by 2020, while NASA has asked private companies to develop a “Lunar Orbital Platform — Gateway” near the Moon by 2022. This could be NASA’s launch pad of sorts for future expeditions and settlements on both the Moon and Mars.
At the same time, private firms like Moon Express, as well as China, India and the European Space Agency are moving forward with robotic landers and rovers. The final step, supporters say, will be a permanent human presence on the surface. Maybe a government base first, followed by a private Moon resort.
NASA’s long-term involvement in the Moon is key to getting private companies to build on the lunar surface, according to Chris Lewicki, CEO of Planetary Resources, a Redmond, Washington-based startup that plans to mine asteroids for rocket fuel and water.
“Government programs are like anchor tenants in a shopping mall,” Lewicki says about NASA and a future moonbase. “Without those big leaseholders, all smaller businesses don’t have a way to make a living. Without NASA it was hard to do itself.”
Some say all this could happen in the next 10 years. Others say it will take at least 20 years before both the technology for routine lunar launches is developed, and the cost comes down so that there’s actually consumer demand.

Single-planet species don’t survive. Living off the planet is probably not a bad strategy for survival. Sooner or later it will be one of the motivations of having bases on the moon.

And while the pace seems glacial, one Moon expert likens it to the establishment of New World colonies, which didn’t happen overnight. “There’s a lag between discovery, exploration in detail, and exploitation,” says James W. Head III, a planetary scientist at Brown University who started his career at NASA selecting lunar landing sites for the Apollo missions.

So why go?

Supporters believe a moon colony will allow us to better understand how to reach farther out into the solar system. It also might be fun to visit for a once-in-a-lifetime vacation. The Moon is also a whole lot closer than Mars — it would take three days to get there, versus nine months — making it more of a destination for lunar car campers than for the hardcore astro-backpackers who would eventually reach Mars.
There’s also the idea of Moon mining.
Some Chinese and European researchers believe that the surface contains large quantities of helium-3, a rare isotope that could be used as a future energy source to fuel rockets traveling from Earth even further out into space. (The downside is that processing helium-3 into something useable takes an enormous amount of energy.) There’s also frozen water in the Moon’s polar regions: split it into hydrogen and oxygen via electrolysis and you have air to breathe — plus another rocket fuel source. That might be a long way off, but leaders of European and Chinese moon programs have said they have plans to explore it on upcoming lunar missions.
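For reference, the water-splitting step is ordinary electrolysis, which yields two parts hydrogen to one part oxygen by volume:

$$ 2\,\mathrm{H_2O} \;\longrightarrow\; 2\,\mathrm{H_2} + \mathrm{O_2} $$

The oxygen can be breathed or burned with the hydrogen as propellant.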
There’s another good case for colonies: our survival. James Head remembers what Apollo Cmdr. John Young, who flew in space during the Gemini, Apollo and shuttle programs, often told him when asked why humans should return to the Moon: “Single-planet species don’t survive,” Head remembers him saying. “Living off the planet is probably not a bad strategy for survival. Sooner or later it will be one of the motivations of having bases on the moon.”

How would a Moon economy work?

To make a moon base work, an economic foundation would be necessary. There’s already a growing “low-Earth orbit” economy (LEO for short) of U.S. companies that are putting satellites into space, servicing them, and preparing to build places for people to live and work in Earth orbit.
The LEO economy numbers keep growing. Since 2000, more than 180 startup space ventures have attracted over $18.4 billion of investment, according to a May 2018 report by Bryce Space and Technology, an Alexandria, Virginia-based consulting firm. At $28 billion in valuation, SpaceX is the behemoth of the commercial space industry, and CEO Elon Musk wants to do it all: launch a constellation of Earth-orbiting satellites, send people around the Moon, and eventually establish a Mars base.
Musk has a history of missing deadlines, whether it’s delivery of the Tesla Model 3 or his ambitious space plans. But the frequency of SpaceX rocket launches — 28 since the beginning of 2017 — has made it one of the world’s most successful commercial space launch companies.
As it figures out how to reuse its rockets, SpaceX is driving down the cost of launches. That might open the door to a new set of zero-G pit stops around the Earth and perhaps the Moon. These private rest stops would eventually replace the International Space Station, a $100 billion, 20-year-old NASA-led mission that is now on its last legs. The White House says it wants someone else to run it after 2024 so that NASA can set its sights on putting people back on the Moon and Mars, although for now, Congress doesn’t agree.
The transition from a low-Earth economy to a moon economy is realistic, says Jeffrey Manber, CEO of Nanoracks, a Houston-based firm that operates its own laboratory space on the space station and launches tiny satellites roughly 10 centimeters on a side, known as “CubeSats,” for commercial and university clients from the ISS.
“We will have LEO hotels within five years and a decade from now you will see a growing infrastructure,” says Manber. “There will be hotels scattered throughout the frontier along with warehouses, fuel depots, then commercial modules or lunar colonies.”

“The key to having a Heinlein-esque future is that it has to get cheaper to get out of Earth’s gravity, once you do that, it all comes into place.”

Call Manber crazy, but many of the things he’s talking about are actually happening. Bigelow Aerospace, a space tech startup, built an inflatable astronaut work module on the space station in 2016 and plans to put one in orbit around the moon by 2022. The company is owned by Robert Bigelow, the billionaire founder of Budget Suites of America hotels and an avowed believer in UFO visitations to Earth. Bigelow is one of several billionaires competing in the race for the Moon, including Amazon founder Jeff Bezos, with Blue Origin; Musk, with SpaceX; and Richard Branson, with Virgin Galactic.
Their deep pockets and freedom from quarterly earnings reports are helping push technology forward in decadal leaps. They are building rockets that can get companies like Bigelow and Nanoracks off Earth and over to the Moon. Only NASA during the go-go Apollo years could match the burn rate of folks like Bezos, who said recently that he sells a billion dollars a year in Amazon stock to keep Blue Origin going.
Blue Origin is developing the Blue Moon lander, which could carry cargo to the lunar surface in preparation for a base, as well as its own New Glenn rocket, which had a successful test in July.

How realistic are the colonies?

Getting the economics of rocket launching just right may be the tipping point for landing people — and things — on the Moon, according to Andy Weir. He wrote The Martian, a sci-fi novel about a stranded astronaut that became the blockbuster 2015 Matt Damon movie. As a follow-up, Weir wrote Artemis, about a moon colony that becomes the target of a heist and extortion plot involving competing mining interests. Weir places Artemis in the 2080s. He believes a real-life moon base is possible.
“The key to having a Heinlein-esque future,” says Weir, referring to the 1950s-era sci-fi author Robert Heinlein, “is that it has to get cheaper to get out of Earth’s gravity, once you do that, it all comes into place.”
Weir crunched the numbers to figure out what it would take to get well-heeled space tourists to take a $70,000 vacation at a Moon resort. His rough estimate is that the cost of a rocket launch has to drop from its current rate of $4,635 per kilogram (in 2015 dollars) to about $35 per kilogram. That’s a big drop, but it may not take as long as we think for the figures to add up.
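One way to make those figures concrete (the per-tourist mass budget here is my own illustrative assumption, not a number Weir has published):

$$ \frac{\$70{,}000}{\$35\ \text{per kg}} \approx 2{,}000\ \text{kg of launched mass per tourist}, \qquad \frac{\$4{,}635\ \text{per kg}}{\$35\ \text{per kg}} \approx 132. $$

In other words, the ticket price works out if each visitor accounts for roughly two tonnes of launched mass, and launch costs fall by a factor of about 130 from the 2015 figure.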
Once that problem is solved, Weir believes, the moon already has the natural resources available for building a city.
“Even in a world when you’ve driven the price to LEO down, you still need to use resources locally,” Weir said. “The early pioneers didn’t bring pallets of lumber to build their houses.” The Moon, he said, is extremely rich in exactly the things you need to build a Moon base: anorthite rock, for example, which covers vast areas of the lunar surface, can be separated into aluminum, oxygen, calcium, and silicon (used in glass).
But after all his research, Weir realized that the seafloor, the Earth’s polar regions, and the Sahara are all easier to colonize than the Moon. You have to bring breathable oxygen, protection against space radiation and all your own food and water, he points out.
“The problem is that you still don’t want to send humans to the moon,” Weir said. “You want to send robots. Humans are soft and squishy and they die. Robots are hard and nobody gets upset when they die.”
China is already working on this, with plans to launch a lander and rover to the far side of the moon in December. The country’s leaders have also talked about putting astronauts on the Moon by 2036, while the White House says it wants NASA to return to the moon but hasn’t given a firm commitment date.
Head, the planetary scientist who works with the Chinese, believes the Chinese government has the ultimate deep pockets needed for such an expensive technological enterprise. China’s space program doesn’t have to worry about justifying its expenses to Congress, as NASA does, or about running out of company stock to sell to finance money-losing space ventures.
“For them, the ultimate goal is to have astronauts on the moon, and they are definitely moving in that direction,” Head said. “It’s possible they could get back to the Moon with humans before we do.”

The Path to Galactic Colonization

by Tony Deller – source: medium

The foundation for humanity’s future must be built by our generation

ESA/Gaia/DPAC

Using the newest data from the European Space Agency’s GAIA spacecraft, the ESA created the above map of our galaxy that pinpoints the brightness and positions of nearly 1.7 billion stars.
Our Milky Way is roughly 100,000 light years across. To cross our galaxy end-to-end would take 100K years moving at light speed. Ridiculous. That’s just to cross it in a straight line. Covering the actual volume of space enclosed by the Milky Way in our fictional “U.S.S. Enterprise” would require hundreds of millions of years.
And yet, despite our galaxy’s immense scale, the future of life on Earth ultimately depends on the human race taking to the stars and colonizing other worlds.
If we don’t undertake this fateful mission, it is only a matter of time before man-made or cosmic threats kill most or all of humanity and all other living things on Earth. Whether nuclear war or biological plague, major asteroid impact, supernova explosion or gamma-ray burst, some disaster WILL befall our planet sometime between now and a few thousand years in the future.
Humanity needs to prepare for this inevitability now, before something terrible occurs that reduces our capacity to do so, like a high energy solar flare or series of supervolcano eruptions.

One Small Step

We began, with that first “small step,” on July 20th, 1969, when the Apollo 11 lunar module fell into our Moon’s gravity well. Buzz Aldrin and Neil Armstrong became the first human beings to land on another celestial body. Today, several plans are underway to take the next steps in our journey outward into the cosmos. These primarily involve setting up bases, and eventually cities, on Mars.
Colonizing Mars will be the test case that allows us to prove we can live on a world significantly different than Earth. The two organizations that are likely to be the first to send humans to Mars are NASA and the private corporation SpaceX. SpaceX, led by entrepreneur Elon Musk, plans to place the first humans on Mars by 2030, with the hope that a permanent colony will be thriving by 2100.

Photo by SpaceX on Unsplash

Looking to the stars for humanity’s ultimate salvation has many detractors, people who believe we should focus solely on solving the problems we face on Earth before we consider the immensity of outer space. While these problems are indeed terrible and need the attention of great minds and philanthropists (such as the work of Bill and Melinda Gates), we cannot entirely ignore the long-term promise of expanding the presence of humanity across multiple locations in the galaxy.
For those who believe we should be stewards of the rest of Earth’s life, since we have presided over so much of its devastation, colonizing the galaxy offers a chance to also save ALL other life on Earth. Humans will not go alone into the depths of space. We will take with us many, if not most, of our companions on Earth, in the form of DNA samples, embryos and seeds. Some day, perhaps 100,000 years from now, there may be a dozen other very Earth-like planets safely distributed throughout the Milky Way, locations chosen specifically to ensure that any single cosmic catastrophe could never devastate them all at once.
In order to reach such a point, we will need to drastically advance our space technology. This will take money and time. As we proceed, every step of the way must result in the creation of a new economy at each place we colonize, since trading actual material goods back and forth between planets will be incredibly costly in every way.

How Do We Go Further?

As new bases are established, ever farther from the Solar System, only information will be relatively cheap to transfer. New inventions and ideas that result from our explorations and adaptations will benefit everyone, but every successive colony will need to be able to survive on its own.
One way we can help ourselves in this course is to build and program a fleet of smart robots to pave a path for us into the stars. These would be similar to the idea of von Neumann machines, named after the 20th-century Hungarian-American mathematician John von Neumann. Such machines would be robust AI-driven creations whose programming sends them to worlds which have the highest possible compatibility with Earth life. For now, we refer to such worlds as being located in a star’s habitable, or “Goldilocks,” zone: orbiting at just the right distance from its sun to allow the existence of liquid water.
After arrival on such a world, the von Neumann machine would begin mining the planet for raw materials and then use the most advanced 3D printing technology to turn those materials into fuels and more machines. These machines, in turn, would go about building bases for humans and starting the long process of terraforming the planet.
Later, perhaps a century or a millennium after the Von Neumann machine had begun its work, a starship containing humans held in cryonic suspension, or perhaps just human embryos, would arrive to populate the new world.

www.pexels.com

How long might a process like this take?
Based on data from NASA’s Kepler spacecraft, there may be as many as 40 billion Earth-sized planets orbiting inside their stars’ habitable zones in our galaxy.
If exponential growth could be applied to colonizing the galaxy, complete colonization of every habitable exoplanet might happen within the next 500,000 to 10 million years. However, according to Seth Baum, the current director of the Global Catastrophic Risk Institute:

The problem is that this kind of growth may not be possible, and they look at Earth as an example. For any expansion to be sustainable, the growth in resource consumption cannot exceed the growth in resource production. And since Earth’s resources are finite, and it has a finite mass and receives solar radiation at a constant rate, human civilization cannot sustain an indefinite, exponential growth.
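To see how an exponential-growth estimate of this kind arises, here is a minimal back-of-the-envelope sketch in Python; the hop distance, cruise speed, and settlement time are illustrative assumptions of mine, not figures from the article:

```python
import math

HABITABLE_PLANETS = 40e9   # Kepler-based estimate cited above
HOP_LIGHT_YEARS = 20       # assumed distance of each colonization "hop"
SPEED_FRACTION_C = 0.1     # assumed cruise speed, 10% of light speed
SETTLE_YEARS = 1_000       # assumed time for a colony to mature and launch its own ships

# If every established colony eventually founds two daughter colonies,
# the number of colonized worlds doubles each generation.
doublings = math.log2(HABITABLE_PLANETS)                              # ~35 doublings for 40 billion worlds
generation_years = HOP_LIGHT_YEARS / SPEED_FRACTION_C + SETTLE_YEARS  # travel plus settling per generation

print(f"doublings needed: {doublings:.0f}")
print(f"years per generation: {generation_years:,.0f}")
print(f"total: {doublings * generation_years:,.0f} years")  # ~42,000 years with these toy numbers
```

With slower ships or settlement times of tens of thousands of years, the same arithmetic lands in the article’s 500,000-to-10-million-year range; Baum’s point is that sustaining the growth itself, not the travel time, is the binding constraint.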

The better question may be: How long will it take for humanity to place a stable population on enough planets to ensure the existence of Earth-originated life for a million years?

Preserving Humanity and Earth Life for Eternity

Outside of Mars, the best bets for sustainable colonies in our own solar system are long shots at best: our Moon, and various moons of Jupiter and Saturn. However, those locations are not amenable to terraforming, which is what is needed to truly preserve human and other Earth life.
The nearest rocky exoplanet that orbits within the habitable zone of its star is Proxima Centauri B, at 4.2 light years away. This world would be our first candidate for colonization and terraforming outside of our Solar System. If we can reach speeds of even 1/10th C (one-tenth the speed of light, which travels at about 186,000 miles per second), getting any spacecraft to Proxima Centauri B would take 42 years.
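The 42-year figure is simply distance divided by speed:

$$ t = \frac{d}{v} = \frac{4.2\ \text{light years}}{0.1\,c} = 42\ \text{years}. $$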

The Breakthrough Starshot project is an initiative to propel a small fleet of one-gram nanospacecraft the size of a postage stamp — what they’re calling a “starchip” that is really just a super-miniature computer — to 1/5th C (100 million mph) on a journey to Proxima Centauri B and other planets in the Alpha Centauri system. This acceleration would be accomplished by directing powerful light beams at light sails, allowing those speeds to be reached without any onboard engine of any kind.
We are currently very limited in how we get into space and keep accelerating. Propellant technology only gets us so far. At our present pace, it would take roughly 30,000 years to reach Proxima Centauri B. The trifecta of fusion, fission and antimatter propulsion is the focus at this point, all believed to be capable of getting us to the 10% C mark. As speeds increase, so do the inherent dangers of space travel, such as radiation and micrometeoroids, which makes 10% of light speed both a safety limit and a practical limit on how fast we might be able to travel in the near future.
There are just under 50 known exoplanet candidates right now that sit in their stars’ habitable zones and could therefore be sites for colonies and, eventually, terraforming. The distances from Earth run the gamut from Proxima Centauri B’s 4.2 light years to Kepler 443b’s 2,540 light years. And these are only the worlds we’ve discovered so far: there is no doubt we will find more potentially habitable exoplanets on the nearer side of that range within the next few decades.
If we want to make certain humanity cannot be completely wiped out in any one cosmic disaster, we need to eventually have colonized worlds that are at a sufficient distance from one another to avoid such a possibility. To this end, we look at the largest-scale cosmic event we know of: a supernova explosion. According to this article, a minimum safe distance from a supernova is 30 light years, though the size and effects of such an event can vary so much that, to be certain, we should scale that up by 50 percent…so let’s say 45 light years.
Aside from a catastrophe on a galactic scale that we could never defend against, this means that we need to reach a point where humanity has colonized worlds at least 45 light years from Earth. The closest candidate world beyond that threshold is Gliese 163c, at 49 light years away. A journey of 49 light years, at an average speed of 10% C, would take approximately 490 years.
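A small helper function checks the travel times used in this section; the back-solved “present pace” speed at the end is my own inference from the 30,000-year figure above, not a number stated in the article:

```python
LIGHT_YEAR_KM = 9.461e12   # kilometres in one light year
YEAR_SECONDS = 3.156e7     # seconds in one year
C_KM_S = 299_792           # speed of light in km/s

def travel_years(distance_ly: float, speed_fraction_c: float) -> float:
    """Coast time in years to cover distance_ly at a constant fraction of light speed."""
    return distance_ly / speed_fraction_c

print(travel_years(4.2, 0.10))   # Proxima Centauri B at 10% c -> 42 years
print(travel_years(49.0, 0.10))  # Gliese 163c at 10% c        -> 490 years

# Supernova safety margin quoted above: 30 light years x 1.5 = 45 light years.

# The ~30,000-year "present pace" figure for Proxima implies a cruise speed of:
implied_km_s = 4.2 * LIGHT_YEAR_KM / (30_000 * YEAR_SECONDS)
print(f"{implied_km_s:.0f} km/s ({implied_km_s / C_KM_S:.2%} of c)")  # ~42 km/s, a few times Voyager 1's speed
```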
Out of our list of ~50 known potentially habitable exoplanets, about 20 of them are within that 45 light year range of Earth. A few are contained in the same star system, most notably 4 worlds in the Trappist-1 system 39 light years away.

Could this be a scene on a world 50 light years away and a thousand years in the future? Photo by Shot by Cerqueira on Unsplash

Without a doubt, our first goal must be to create a second home for humanity and Earth life on Mars. The world’s close proximity, gravity and water offer our best bet for a sustainable colony and eventual terraforming. Establishing a stable colony on Mars may take until 2100, while a truly thriving population that is no longer dependent on Earth might take another century to develop.
During the middle centuries of this millennium, humanity will be focused on building smaller human stations on moons of Jupiter and Saturn, as well as perhaps on Ceres, the largest body in the Asteroid Belt. This time will be one of advancement in our ability to survive for long periods in space without many negative side effects, learning how to terraform Mars, and also developing true starships and medical technology that will empower us to make the jump to Alpha Centauri.
By 3000 AD, humanity should be established on Proxima Centauri B, and dozens, if not hundreds, of other ships should be on their way to the other 50-odd potential colony worlds, all much farther away, containing either humans and other Earth life in suspended animation or in the form of embryos and seeds.
By 12000 AD, humanity should be firmly rooted on a dozen worlds, some of them separated by enough distance from Earth to ensure that a single supernova could never drive Earth life into complete extinction.
At this point, the children of Earth, all life born here, will have a true chance at immortality: The rest of the Milky Way galaxy awaits.
Thank you for reading and sharing!

The story of Jocelyn Bell Burnell

Jocelyn Bell at Cambridge’s Mullard Radio Astronomy Observatory in 1968. (Daily Herald Archive/SSPL/Getty Images)

via Timeline

In the winter of 1967, Jocelyn Bell Burnell pored over the near-frozen dials of a radio telescope. Between curses, she breathed on the instruments hoping to thaw them when, suddenly, the telescope’s recording chart sputtered to life and began transmitting a series of regularly spaced ticks.
This was the second time Bell Burnell had observed the puzzling metronomic space signals as a doctoral student working with the Cambridge astronomer Antony Hewish. Initially unsure what could cause such a measured celestial blink, Bell Burnell and her colleagues jokingly called the beating emissions “LGM” for Little Green Men.
The second time the telescope picked up a similar signal, she knew it wasn’t a quirk in the equipment or an extraterrestrial invitation. Bell Burnell had discovered pulsars—and astrophysics would never be the same.
In 1974, however, it was Antony Hewish whose “decisive role in the discovery of pulsars” would be honored with a Nobel Prize. In later years, Hewish would diminish, with defensive bluster, Bell Burnell’s contribution. “It’s a bit like an analogy I make — who discovered America? Was it Columbus or was it the lookout? Her contribution was very useful, but it wasn’t creative,” Hewish told interviewers in 2007.
But Bell Burnell was always more than a lookout. Susan Jocelyn Bell was born in Northern Ireland in 1943 and encouraged by her parents to pursue a clear propensity for understanding things. She and her family protested fiercely when, on the first Wednesday of secondary school, the girls were segregated for training in the art of “domestic science,” while their male peers pored over Bunsen burners and beakers.
She went on to study at the University of Glasgow, where she again found herself defined by her gender rather than her brain. For two years, whenever Bell Burnell entered a lecture hall her male peers whooped, cat-called, and banged their desks. “It was a little isolating. I had to work very much on my own,” she recalled during a TEDx talk in 2013.
After enduring years of the simian ritual, Bell Burnell made haste for Cambridge in 1965 to pursue a PhD studying under the radio astronomer Antony Hewish. Clad in cat-eye glasses, she spent two years constructing a radio telescope of Hewish’s design — a four-acre affair consisting of wires and pylons with galactic radiation receptors. This vineyard-like tessellation was originally built to study quasars — scintillating deep-space objects discovered in the early 1960s.
The first time the telescope’s radio-frequency needle recorded a regularly timed radiation signal, the team was convinced a glitch had befallen their equipment. What aside from human interference or some intelligent messenger could account for the clockwork pulses of energy? The Cambridge researchers were plagued by the Little Green Men mystery for weeks until Bell Burnell detected a second — and later a third and fourth — percussive signal from separate corners of the heavens.
As the probability of detecting multiple galactic dispatches from distant, intelligent civilizations was near zero, the scientists sought a solution consistent with the laws of physics and the scope of the universe. Hewish interpreted the data as the result of neutron stars or pulsars: superdense dead stars that emit radiation from their magnetic poles like strobe lights.
Before Bell Burnell divined the cosmic transmissions, it was believed that when stars died they simply exploded, releasing their energy in volatile displays we call supernovae. But her discovery suggested that a supernova may not lead to the wholesale destruction of a star — that something might stick around. Pulsars, Hewish and Bell Burnell would establish, were the neutron-rich cores of dead stars emitting radio waves as they rotated around a highly magnetized axis. Pandora’s box was open to all sorts of stellar post-mortem possibilities, most notably the theories of a young astrophysicist named Stephen Hawking, whose ravings about black holes were suddenly taken seriously.

Burnell’s pulsar discovery changed the field of astrophysics forever. SXP 1062 (right) is between 10,000 and 40,000 years old. (NASA)

Bell Burnell would go on to receive her PhD in 1968, sans Nobel, despite co-authoring the article in Nature that would lead to Hewish’s nomination. But it wasn’t just the Nobel committee in Stockholm who were guilty of a double standard. Following the discovery of pulsars, Bell Burnell faced casual sexism from the media and public as well.
“When the press found out I was a woman, we were bombarded with inquiries,” she said. “My male supervisor was asked the astrophysical questions while I was the human interest,” she recalled in an interview with the Belfast Telegraph in 2015. “Photographers asked me to unbutton my blouse lower, whilst journalists wanted to know my vital statistics and whether I was taller than Princess Margaret.”
In the years since the discovery of pulsars Bell Burnell has been a vocal critic of the traditional white male power structure that dominates Western scientific thought and academia. When she was appointed the chair of the physics department at Open University in 1991, Bell Burnell was one of only two female physics professors in the U.K. “Throughout my working life, I’ve been either one of very few women or the most senior woman in the place,” she told the TEDx audience.
After obtaining her PhD, Bell Burnell worked part time for many years while raising a family and following the career of a “peripatetic” husband. “I am very conscious that having worked part time, having had a rather disrupted career, my research record is a good deal patchier than any man’s of a comparable age,” she said in a 1996 interview with the Institute of Physics.
Still, Bell Burnell has continued to advance, earning visiting professorships at Oxford and Princeton. She is currently the president of the Royal Society of Edinburgh, Scotland’s national academy of science and the arts.
In public forums, she often repeats the fundamental truth so many people fail to grasp: that the small number of women in STEM in the West is the result of social restrictions and expectations. “The limiting factor,” she points out, “is culture, not women’s brains, and I regret that it’s still necessary to say that.”

Extragalactic origin confirmed

Cosmic rays — fast-moving, high-energy nuclei — pervade the Universe. We know that the lower-energy variety that we detect on Earth is funnelled by the solar wind. However, higher-energy cosmic rays have an isotropic distribution due to scattering that makes it difficult to identify their source, although they are likely to be generated by high-energy phenomena like supernova explosions and jets from active galactic nuclei. By looking at the ultrahigh-energy end of the cosmic ray spectrum (on the order of exa-electron volts and higher, where cosmic rays are not scattered by solar-scale magnetic fields), the Pierre Auger Collaboration detected an anisotropy in their arrival directions that indicates an extragalactic origin.
Ultrahigh-energy cosmic rays are rare: typically one cosmic ray with an energy > 10 EeV hits each square kilometre of the Earth’s surface per year. The Pierre Auger Observatory in Argentina detects cosmic rays using two combined techniques: telescopes to detect fluorescence from cosmic-ray-generated air showers, and a network of 12-tonne containers of ultrapure water, spread over an area of 3,000 square kilometres. Photomultiplier detectors in the containers observe the faint Cherenkov radiation generated when cosmic-ray-generated muons pass through the water faster than light travels in that medium. By reconstructing the cone of emission of the muon (analogous to an aircraft’s sonic boom) an incident direction can be derived. By analysing 32,187 cosmic rays detected over 12.75 years, a map of the sky was produced (pictured), showing evidence of an enhancement (5.2 σ significance) in a region away from the Galactic Centre (marked with an asterisk; the dashed line indicates the Galactic plane). The distance of this hotspot from the Galactic Centre (~125°) points towards an extragalactic origin of ultrahigh-energy cosmic rays, reinforcing previous (less conclusive) results from the Collaboration at lower energies.
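As a rough consistency check (mine, not part of the Collaboration’s analysis), the quoted flux multiplied by the array area and observing time gives roughly the number of events analysed:

```python
# Rough consistency check of the Pierre Auger numbers quoted above.
flux_per_km2_per_year = 1.0   # ~1 cosmic ray with E > 10 EeV per km^2 per year
area_km2 = 3_000              # surface-detector array area
exposure_years = 12.75        # duration of the analysed data set

expected = flux_per_km2_per_year * area_km2 * exposure_years
print(f"expected ~{expected:,.0f} events")  # ~38,000
# The 32,187 events actually analysed are of the same order; the difference
# plausibly reflects detector duty cycle, quality cuts and the exact energy threshold.
```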

Paul Woods, doi:10.1038/s41550-017-0304-0

Space inspires people

Four Eyes of Tatooine by Stefan Lines (created whilst working on his PhD thesis in Dr Leinhardt’s Planet Formation Group at the University of Bristol). This computer-generated image, based on data taken from actual supercomputer simulations, shows the formation of a planet around a binary star — a so-called ‘circumbinary planet’. Tiny unit vectors show the magnitude (colour) and direction (orientation) of the acceleration of millions of tiny rocky ‘planetesimals’ that eventually coalesce to form a planet.

Why do you think space inspires people so much?

I guess that it’s our human nature to explore and we see it, at least most of us can see it, at night and I think that it’s in our nature to ask why things look the way they do, or why a process happens. And since you can look up in the sky and see a bunch of lights, it’s natural to question what that is and want to be able to explain it and go there. So I think it’s just our natural instinct to want to explain what we don’t understand, especially if we can see it.

(Zoë Leinhardt; continue to read the interview)

Titan brighter at twilight than in daylight

Sketch showing the definition of phase angle

Muñoz, Antonio García, Panayotis Lavvas, and Robert A. West. “Titan brighter at twilight than in daylight.” Nature Astronomy 1, Article number: 0114 (2017) doi:10.1038/s41550-017-0114 (arXiv)

Investigating the overall brightness of planets (and moons) provides insights into their envelopes and energy budgets. Phase curves (a representation of the overall brightness versus the Sun–object–observer phase angle) for Titan have been published over a limited range of phase angles and spectral passbands. Such information has been key to the study of the stratification, microphysics and aggregate nature of Titan’s atmospheric haze and has complemented the spatially resolved observations showing that the haze scatters efficiently in the forward direction. Here, we present Cassini Imaging Science Subsystem whole-disk brightness measurements of Titan from ultraviolet to near-infrared wavelengths. The observations show that Titan’s twilight (loosely defined as the view at phase angles ≳150°) outshines its daylight at various wavelengths. From the match between measurements and models, we show that at even larger phase angles, the back-illuminated moon will appear much brighter than when fully illuminated. This behaviour is unique in our Solar System to Titan and is caused by its extended atmosphere and the efficient forward scattering of sunlight by its atmospheric haze. We infer a solar energy deposition rate (for a solar constant of 14.9 W m⁻²) of (2.84 ± 0.11) × 10¹⁴ W, consistent to within one to two standard deviations with Titan’s time-varying thermal emission from 2007 to 2013. We propose that a forward scattering signature may also occur at large phase angles in the brightness of exoplanets with extended hazy atmospheres and that this signature has a valuable diagnostic potential for atmospheric characterization.
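As a rough plausibility check (my own, using illustrative values for Titan’s Bond albedo, roughly 0.27, and an effective haze height of a few hundred kilometres above the 2,575 km solid body, neither of which is quoted above), the deposition rate is about what an absorbing disk slightly larger than the solid moon would soak up:

$$ P_{\mathrm{abs}} \approx (1 - A_{\mathrm{B}})\,\pi (R + h)^2 S \approx 0.73 \times \pi \times (2.875 \times 10^{6}\ \mathrm{m})^{2} \times 14.9\ \mathrm{W\,m^{-2}} \approx 2.8 \times 10^{14}\ \mathrm{W}, $$

which is consistent with the measured (2.84 ± 0.11) × 10¹⁴ W.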

Sink holes and dust jets on comet 67P

by Paul Weissman from Nature 523, 42–43 (02 July 2015) doi:10.1038/523042a


Analyses of images taken by the Rosetta spacecraft reveal the complex landscape of a comet in rich detail. Close-up views of the surface indicate that some dust jets are being emitted from active pits undergoing sublimation.
When do 18 holes not make for a pleasant afternoon playing golf? When the 18 holes are located on the surface of a comet speeding through the Solar System. Vincent et al.(1) describe the holes, also called pits, that comprise one of the many discoveries of the European Space Agency’s Rosetta mission to comet 67P/Churyumov-Gerasimenko (67P). The Rosetta spacecraft went into orbit around 67P in August 2014, and the surprises have been coming fast since then. Vincent et al. propose a mechanism for the formation of the pits and identify them as one of the sources of active dust jets.
Comets are the most primitive bodies in the Solar System; they are the remnants of its formation process. Comets therefore retain a physical and chemical record of the conditions and materials in the solar nebula — the gas and dust cloud out of which the Sun and planets formed 4.56 billion years ago. Conveniently, comets have spent most of that time in two very cold storage locations: the Kuiper belt beyond the orbit of Neptune and the spherical Oort cloud outside the planetary region, stretching halfway to the nearest stars. The distant Oort cloud is the source of the long-period comets that have orbital periods ranging up to millions of years. The Kuiper belt is the source of the Jupiter-family comets, such as 67P, which typically have periods of less than 20 years and orbital dynamics that are strongly affected by Jupiter.
As a comet approaches the Sun and warms up, the central solid part, known as the cometary nucleus (comprised of volatile ices and primitive meteoritic material), begins to sublimate and becomes enveloped by a freely outflowing atmosphere called the coma. One of the first surprises for Rosetta, the first ever comet-rendezvous mission, was the odd shape of the target comet’s nucleus (Fig. 1a)(2). Although some nuclei comprised of two large pieces and looking like a bowling pin had been observed before by fly-by missions to other comets, the two lobes of 67P sit on top of each other, with a narrow ‘neck’ in between. There is intense speculation as to how this odd configuration may have formed. Did two cometary nuclei gently collide randomly in the solar nebula, or is the nucleus a single piece that has been oddly sculpted by sublimation processes? Although the former is the more likely scenario, some scientists on the mission suspect the latter.

Vincent et al.(1) analysed images of comet 67P taken by the Optical, Spectroscopic and Infrared Remote Imaging System cameras on the Rosetta spacecraft. a, The complex nucleus topography includes large, flat-floored basins (indicated by white arrows). A large, circular pit is visible just above the centre of the image (red arrow). b, A string of pits dot the surface of the cometary nucleus. In active pits such as these, bright jets of dust are seen being emitted from the sunlit walls. The contrast of this image has been enhanced to highlight the interiors of the pits and the jets. As a result, the cometary surface looks very bright, but in reality it reflects only about 6% of the incoming sunlight — roughly the same as the black toner particles in a laser printer cartridge. ESA/Rosetta/MPS for OSIRIS Team MPS/UPD/LAM/IAA/SSO/INTA/UPM/DASP/IDA

Rosetta’s camera system, the Optical, Spectroscopic and Infrared Remote Imaging System (OSIRIS), comprises narrow-angle and wide-angle digital cameras. As the OSIRIS team of scientists(2) began to map the surface of the nucleus using the cameras, they discovered 18 pits on the surface, which Vincent et al. now describe more thoroughly. The cometary nucleus has a diameter of approximately 4 kilometres. The pits are typically about 200 metres in diameter and about 180 metres deep. Pit-like features have been observed on other cometary nuclei, but the morphology of the pits on 67P has not been seen before. They typically have cylindrical shapes with circular openings and near-vertical walls (although at least one pit seems to be lying at a steep angle). And some of the pits are clearly active: images of pits that are illuminated by sunlight show dust jets emanating from their walls and/or floors (Fig. 1b).
How did the pits form? Vincent et al. suggest that they are ‘sink holes’, which formed when material near the surface of the nucleus collapsed into the low-density interior. Rosetta’s Radio Science Investigation team has found(2) that the nucleus has an average bulk density of only 470 ± 45 kilograms per cubic metre, about half the density of solid water ice. But the Grain Impact Analyser and Dust Accumulator instrument has measured(3) a dust-to-ice mass ratio of 4 ± 2, suggesting that silicates and organics, rather than ices, make up about 80% of the mass of the nucleus. This in turn implies that 75–85% of the nucleus interior is empty space, a parameter known as porosity. A high porosity is predicted by the leading scenarios for the internal structure of cometary nuclei, which suggest that they are aggregates(4) of smaller, icy bodies that gently came together in the solar nebula. These aggregates are also referred to as rubble piles(5). This concept has provided insights into the behaviour of comets, such as random and other splitting events.
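The porosity figure follows directly from the measured bulk density and dust-to-ice ratio. Here is a minimal sketch of that arithmetic; the grain densities assumed for the dust and the ice are typical literature values, not numbers from this article:

```python
# Porosity of 67P's nucleus from the bulk density and dust-to-ice mass ratio.
rho_bulk = 470.0    # kg/m^3, measured bulk density of the nucleus
rho_dust = 3000.0   # kg/m^3, assumed grain density of silicates/organics ("dust")
rho_ice = 920.0     # kg/m^3, density of water ice
dust_to_ice = 4.0   # measured dust-to-ice mass ratio

# Mass fractions of the pore-free solid material.
f_dust = dust_to_ice / (dust_to_ice + 1.0)   # 0.8
f_ice = 1.0 - f_dust                         # 0.2

# Density the solid mixture would have with no pores (mass-weighted harmonic mean).
rho_solid = 1.0 / (f_dust / rho_dust + f_ice / rho_ice)

porosity = 1.0 - rho_bulk / rho_solid
print(f"pore-free density: {rho_solid:.0f} kg/m^3")  # ~2,070 kg/m^3
print(f"porosity: {porosity:.0%}")                   # ~77%, inside the quoted 75-85% range
```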
The morphology of 67P’s surface is dominated in some areas by large, flat-floored basins, similar to features seen on the nucleus of comet Wild 2(6). It has been suggested that these are sublimation basins that slowly widen as the walls sublimate, leaving large, non-volatile particles that cover the basin floor. The basins cannot be impact craters because they have the wrong size distribution (there are too many large ones), and because not many impact craters are expected on a small cometary nucleus such as 67P.
Could the pits described by Vincent et al. be the precursors of the basins, slowly widening as their walls sublimate? Many of the pits found by OSIRIS are located in the same region on the nucleus where many of the large sublimation basins are found. Both comet 67P and comet Wild 2 are relatively young — that is, they have only recently (within the past 60 years) been perturbed by the gravitational field of Jupiter to perihelion distances (the point in their orbit closest to the Sun) at which it is warm enough for water ice in the nucleus to sublimate, and at which the activity that manifests itself as the bright cometary coma and tails begins. If this is so, why are sublimation basins not observed on other, perhaps older, Jupiter-family comets such as Tempel 1 and Hartley 2? Older nuclei may have accumulated thicker layers of non-volatile materials that have buried the sublimation basins and substantially lowered the activity levels of those comets.
Rosetta has already indicated that it has more surprises for us. On 13 June 2015, the orbiter began receiving signals from the Philae lander, which is on the surface of the comet nucleus and was last heard from in November 2014. With its batteries recharging, Philae probably has much more information to transmit about its final landing location. Also, the activity of the nucleus is expected to reach a maximum soon after the comet passes through perihelion at 1.25 astronomical units from the Sun (a point about 25% farther from the Sun than Earth’s orbit) in mid-August 2015. Rosetta will then follow 67P away from the Sun as cometary activity begins to wane. What changes will we see on the nucleus surface? And how will this alien golf course look from Rosetta’s vantage point then?


(1) Vincent, J.-B. et al. Nature 523, 63–66 (2015).
(2) Sierks, H. et al. Science 347, aaa1044 (2015).
(3) Rotundi, A. et al. Science 347, aaa3905 (2015).
(4) Donn, B. & Hughes, D. in 20th ESLAB Symp. Exploration of Halley’s Comet (eds Battrick, B. et al.) 523–524 (ESA, 1986).
(5) Weissman, P. R. Nature 320, 242–244 (1986).
(6) Kirk, R. et al. in 46th Lunar and Planetary Science Conf. Abstr. 2244 (2015).

Bounty of dark galaxies found

from Nature 523, 9 (02 July 2015) doi:10.1038/523009b


Astronomers have discovered more than 850 faint galaxies in a galaxy cluster that could be made mostly of dark matter.
Using archived images from the Subaru Telescope in Hawaii, a team led by Jin Koda at Stony Brook University in New York searched for observations of the Coma galaxy cluster, which is roughly 101 million parsecs (330 million light years) away. The team found 854 ultra-diffuse galaxies, a class of faint galaxy that can be as large as the Milky Way, but which has only 0.1% the number of stars. For these galaxies to remain gravitationally bound together, the researchers show that more than 99% of their mass must be dark matter.
This suggests that the crowded environment sucks gas away from these galaxies, leaving them largely unable to form stars.
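The “more than 99% dark matter” inference comes from requiring these diffuse galaxies to hold together against the cluster’s tidal field. A toy version of that argument, with illustrative values of my own for the Coma cluster mass, orbital radius, galaxy size and stellar mass (none of which appear above):

```python
# Toy tidal-stability argument for why ultra-diffuse galaxies in Coma must be
# dark-matter dominated. All numbers are illustrative, not from the Nature summary.
M_cluster = 1e15    # assumed Coma cluster mass, in solar masses
R_orbit_kpc = 300.0  # assumed distance of the galaxy from the cluster centre
r_galaxy_kpc = 3.0   # assumed radius of the ultra-diffuse galaxy
M_stars = 1e8        # assumed stellar mass, ~0.1% of the Milky Way's stars

# Roche-like criterion: to resist tidal disruption, the mass enclosed within the
# galaxy's radius must exceed roughly 3 * M_cluster * (r / R)^3.
M_required = 3.0 * M_cluster * (r_galaxy_kpc / R_orbit_kpc) ** 3

dark_fraction = 1.0 - M_stars / M_required
print(f"required enclosed mass: {M_required:.1e} Msun")      # ~3e9 Msun
print(f"implied dark-matter fraction: {dark_fraction:.1%}")  # ~97% here; galaxies deeper in the cluster need >99%
```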


Paper: Approximately a thousand ultra-diffuse galaxies in the Coma cluster, by Jin Koda, Masafumi Yagi, Hitomi Yamanoi, and Yutaka Komiyama

Dust-poor galaxies at early times

by Veronique Buat from Nature 522, 422–423 (25 June 2015) doi:10.1038/522422a


Observations of galaxies that formed early in the Universe’s history reveal much lower dust levels than are found in sources from a slightly later era. It seems that galaxies underwent rapid change during a relatively short period.


The study of the most distant galaxies, observed as they were about 1 billion years after the Big Bang, is crucial for our understanding of the star-forming activity and the physical processes at work in these young systems. Capak et al.(1) present a study of nine such galaxies using a linked-up telescope array. They find that the dust and gas properties in these systems hint at an interstellar medium (ISM) that is much less evolved than in galaxies about 2 billion years older. This suggests that there was a rapid change in the overall properties of galaxies during the early life of the Universe.
The expansion of the Universe shifts the ultraviolet (UV) light emitted by newly formed stars in remote systems to longer (visible and near-infrared) wavelengths that, unlike UV light, can be observed by ground-based telescopes. The most distant objects known today are detected as a result of a break in the continuum of their redshifted spectra at wavelengths of around 0.1 micrometres; this is due to the absorption of UV photons by neutral hydrogen in the intergalactic medium. The absorption occurs for photons with energies corresponding to wavelengths shorter than the Lyman-α line of hydrogen (121.6 nm), and galaxies whose distances have been estimated by this method are known as Lyman break galaxies (LBGs). The most comprehensive surveys undertaken so far have led to detections of very young LBGs that formed approximately 0.5 billion years after the Big Bang(2).
The presence of interstellar dust complicates the study of galaxies, and affects measurements of fundamental, observationally derived properties such as the star-formation rate. This is because dust is efficient at absorbing the energetic UV photons (a proxy for the star-formation rate) that are emitted by young stars and at re-emitting their energy in the infrared domain, at wavelengths longer than 5 μm. This is a complex process that depends not only on the amount of dust present, but also on its distribution relative to the stars and on its composition(3). Overall, dust substantially reduces the intensity of stellar light reaching the telescopes(4).
A straightforward method to account for the UV light produced in galaxies involves observing the radiant energy that is absorbed by dust, is re-emitted and is then redshifted into the far-infrared and submillimetre domains. However, the low sensitivity of detectors, combined with the poor spatial resolution achieved by single-dish telescopes, makes surveys of high-redshift galaxies at these wavelengths less efficient than those at optical or near-infrared wavelengths. Even the Herschel Space Observatory, which detected(5) the infrared emission from dust in galaxies at redshifts of up to 2–3 (corresponding to a time roughly 2 billion to 3 billion years after the Big Bang), was able to detect only hyper-luminous sources at much larger distances(6).
Given that directly measuring the long-wavelength emission from dust is so challenging, astronomers resort to empirical relations to derive dust’s infrared luminosity. One such relation links this luminosity to the stellar UV luminosity(7) for a representative sample of nearby, actively star-forming galaxies, for which both luminosities have been accurately measured. Unfortunately, this recipe is not universally applicable because it depends on the properties of the ISM (such as the composition of dust and its distribution relative to the stars), as well as on the stellar populations in the galaxies(8). Despite these caveats, however, it is extensively used to estimate the level of obscuration of the stellar UV light by ISM dust for galaxies across a wide redshift range. It will therefore be important to check the validity of this relationship — especially for young, high-redshift systems. A critical evaluation could also yield clues to the properties of the ISM at those early times.
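The empirical recipe in question is the ‘infrared excess versus UV slope’ relation of ref. 7 (Meurer, Heckman and Calzetti), which in its widely used form estimates the dust attenuation of the UV continuum at 1,600 Å from the observed UV spectral slope β:

$$ A_{1600} \approx 4.43 + 1.99\,\beta, $$

so a redder (less negative) UV slope is read as heavier dust obscuration. Capak and colleagues’ results imply that calibrations of this kind, anchored to nearby star-forming galaxies, overpredict the dust emission of their redshift 5–6 sources.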
Capak et al. used the Atacama Large Millimetre Array (ALMA), which was designed to overcome both the resolution and sensitivity problems (Fig. 1). Being an interferometer (a series of telescopes linked up to combine astronomical observations), ALMA has a small field of view that is suitable for observing well-centred sources, and it can detect the weak submillimetre emission originating from dust in ordinary galaxies at high redshifts. The authors used 20 of ALMA’s antennas in unison to observe the dust and gas emissions of 9 typical LBGs located at redshifts 5–6; these correspond to a time when the Universe was about 1 billion years old.

Capak et al.(1) used 20 of ALMA’s antennas to study the interstellar medium (ISM) of 9 galaxies that were present when the Universe was only about 1 billion years old. The authors found that their sources contain a smaller amount of dust than expected. Some of the galaxies in the sample may have an ISM similar to that of the Small Magellanic Cloud (a satellite galaxy of the Milky Way), which is visible here (right of centre) as the smaller of the two Magellanic Clouds above the antennas.

Capak and colleagues selected their sample from the Cosmic Evolution Survey field, a two-square-degree area that has been extensively observed by most of the major telescopes, from the ground and from space. ALMA detected the thermal dust emission in four galaxies, and an ISM spectral line emitted by ionized carbon at a wavelength of 158 μm in all nine of them; the carbon feature is the dominant ISM emission line of galaxies in the far-infrared domain. Such a high detection rate is outstanding, because previous attempts failed to simultaneously detect the carbon feature and thermal dust emission(9).
The authors’ study argues for a very low dust content and stellar-light obscuration in these systems. The four galaxies whose thermal dust emission was detected may harbour an ISM similar to that of the Small Magellanic Cloud (a satellite galaxy of the Milky Way), which is characterized by a low abundance of elements heavier than helium. The upper limits put on the dust emission of the remaining five sources call for an even more extreme situation with a much lower infrared emission. That seems to be at odds with the observed UV luminosity of these systems. The enhanced carbon emission-line intensities also suggest low dust levels relative to the gas present in these early galaxies, although other explanations cannot be excluded.
An immediate consequence of these findings is that the classical calculations used to derive obscuration due to dust from the observed UV continuum luminosity are unlikely to be valid for LBGs in the early Universe. As a result, the star-formation rate considered to be appropriate for these galaxy types is likely to have been overestimated by factors of between two and four in previous studies. Last but not least, this pioneering work paves the way for future observational campaigns. Although observing the low levels of dust emission from large samples of high-redshift galaxies may prove challenging even for ALMA, Capak and co-workers’ finding of enhanced carbon emission lines should become a useful tool in the study of star-forming galaxies at those early epochs.


(1) Capak, P. L. et al. Nature 522, 455–458 (2015).
(2) Bouwens, R. J. et al. Astrophys. J. 803, 34 (2015).
(3) Witt, A. N. & Gordon, K. D. Astrophys. J. 528, 799 (2000).
(4) Burgarella, D. et al. Astron. Astrophys. 554, A70 (2013).
(5) Gruppioni, C. et al. Mon. Not. R. Astron. Soc. 436, 2875–2876 (2013).
(6) Riechers, D. A. et al. Nature 496, 329–333 (2013).
(7) Meurer, G. R., Heckman, T. M. & Calzetti, D. Astrophys. J. 521, 64 (1999).
(8) Boquien, M. et al. Astron. Astrophys. 539, A145 (2012).
(9) Maiolino, R. et al. Mon. Not. R. Astron. Soc. (in the press); Preprint at arXiv