Why should we go to the stars?
We should do both; they are not mutually exclusive goals. Realizing interstellar travel may help address problems on Earth and ensure the long-term survival of Earth-born life, just as spaceflight within our solar system has already done for us.
This question has been asked about space exploration in our own solar system since the dawn of the Space Age. NASA, the world’s largest space agency by budget, represents less than 0.5% of U.S. federal spending; this level of activity does not come at the expense of addressing other human needs or concerns. One of the more eloquent responses comes from Ernst Stuhlinger (a figure whose legacy is complicated by his involvement in Germany’s World War II rocket program) in a 1970 reply to a letter from Sister Mary Jucunda, a nun working in Zambia, who asked how spending in space can be justified when there is so much human suffering on Earth. In a reply written with grace and humility, Stuhlinger emphasized how Earth-observing spacecraft would come to help combat food scarcity. Today, few people question the societal good of satellites for weather forecasting and more efficient agriculture. The tangible benefits of 20th-century space programs offer compelling evidence that investing in interstellar exploration is a fitting and forward-looking priority for the 21st century.
The value of humanity’s expansion beyond Earth as an insurance policy is now widely recognized. New concerns regarding AI-driven existential risks only sharpen this imperative: the fundamental light-travel-time delays between stars become safeguards against such threats. Only interstellar travel, with self-sufficient human communities separated by light-years, can ensure that no single catastrophe extinguishes our entire species.
Perhaps a longer perspective is healthy. There is no shortage of talented young people who have been inspired to study STEM by the vision of our expansion into the galaxy, and the skills they acquire are foundational to our technological future, interstellar or not.
As a professor of engineering, I have taught thousands of students and have spent considerable time understanding what motivates them. Although anecdotal, my findings suggest that few pursue STEM out of guilt or despair; most are inspired by a radiant vision of the future. I have found studying the challenges of interstellar travel a particularly effective means of challenging students to think and work from fundamentals. While they may not go directly from graduation to building starships, the essential STEM skills they acquire have been well utilized across today’s deep-technology sector after they leave my research group.
Technology is built on physics and physics on mathematics, the latter of which presaged much of modern science. To advance, we need to lay a foundation today; by tackling these challenges now, we can incrementally hasten the arrival of interstellar capabilities. Many in the interstellar community draw inspiration from medieval cathedrals, where builders devoted their lives to projects they would never see completed. Those builders remind us that laying a solid foundation today can yield achievements beyond our own lifetimes. Such thinking could supply a healthier perspective amid the anxieties brought on by our era of instant gratification.
Getting to the stars is impossible
There are many proposed solutions to the Fermi Paradox, with the impossibility of interstellar travel being just one of them. Known physics permits several viable methods for traveling between stars, and a sufficiently advanced civilization should be able to master at least one of them.
Webb has compiled 75 solutions to the Fermi Paradox in his book If the Universe Is Teeming with Aliens…Where Is Everybody? The notion that interstellar travel is not feasible (“Solution 11: The Stars Are Far Away”) is just one possible solution, and it is not favored by that author. There are several options for interstellar propulsion that are consistent with the physics already in our textbooks: laser-driven lightsails, antimatter rockets, fission fragment rockets, and particle-stream-propelled spacecraft. While all of these approaches face significant challenges and potential technological roadblocks, it is difficult to argue that every one of them is infeasible or cannot be made to work by a civilization that is thousands or millions of years more advanced than our own.
The Fermi Paradox remains a fascinating problem but has little bearing on whether we should start planning for interstellar flight.
Relativistic effects only become significant when traveling near the speed of light. Most currently conceived concepts for interstellar missions do not propose to exceed 30% light speed, where relativistic effects are only a minor correction.
The relativistic mass of an object with a rest mass $m_0$ is given by $m = \gamma m_0$, where the Lorentz (gamma) factor is $\gamma = 1/\sqrt{1 - v^2/c^2}$, $v$ is the spacecraft velocity, and $c$ is the speed of light. At 30% the speed of light, the Lorentz factor is only 1.05. As viewed from the frame of a stationary observer, this corresponds to a 5% increase in effective mass, with acceleration for a given amount of thrust being reduced by a similar amount. Although detailed mission designs must include this correction, it is not a first-order effect and does not determine the feasibility of interstellar travel.
As velocity approaches the speed of light, the Lorentz factor increases sharply. At 87% the speed of light, $\gamma$ is about 2, effectively doubling the spacecraft’s apparent inertial mass. This rapid increase in relativistic mass ultimately prevents any object from reaching or exceeding light speed. The extreme energy required to accelerate a spacecraft to velocities approaching the speed of light places near-light-speed travel outside near-term mission planning.
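As a quick numerical check (a minimal sketch in plain Python, standard library only), the Lorentz factor at these two speeds works out as stated:

```python
import math

def lorentz_gamma(beta):
    """Lorentz factor for a speed given as a fraction of c."""
    return 1.0 / math.sqrt(1.0 - beta**2)

print(lorentz_gamma(0.30))  # ~1.05: a 5% effective-mass correction
print(lorentz_gamma(0.87))  # ~2.03: apparent inertial mass roughly doubles
```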
Although no object can exceed light speed, faster-than-light travel is not required for reaching the stars. Achieving 10% of light speed is feasible with several propulsion methods, and such a mission to the nearest stars would take 50 years, comparable to the currently ongoing Voyager missions.
While relativistic effects prevent a spacecraft from exceeding the speed of light, relativity does allow a spacecraft to travel effectively faster than light as measured onboard (see Sagan's presentation of these calculations). If a vastly advanced spacecraft could accelerate to 92% the speed of light at the midpoint of its journey and then decelerate to rest at its destination, relativistic time dilation means the crew experiences a duration equal to the distance traveled in light-years. For example, traveling to a star 10 light-years away would take 10 years from the crew’s perspective; observers back on Earth, however, would see the journey take longer, since the ship never exceeds light speed in their frame. At peak speeds above 92% of light speed, the crew experiences even less time than it takes light to cross the same distance. From the reference frame of the crew, they are effectively traveling faster than the speed of light, without violating physical laws. Such a journey would only be feasible in our far future.
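A short script makes the 92% figure concrete. It is a sketch assuming a constant-proper-acceleration, accelerate-then-decelerate profile (the standard relativistic-rocket relations, with c = 1 so times are in years and distances in light-years; the acceleration magnitude cancels out of the ratio):

```python
import math

def crew_years_per_light_year(beta_peak):
    """Crew (proper) time per light-year for a journey that accelerates at
    constant proper acceleration to beta_peak at the midpoint, then
    decelerates to rest. With c = 1 and unit proper acceleration:
    proper time = 2*phi, distance = 2*(cosh(phi) - 1), phi = rapidity."""
    phi = math.atanh(beta_peak)
    return (2 * phi) / (2 * (math.cosh(phi) - 1))

print(crew_years_per_light_year(0.92))  # ~1.02: one crew year per light-year
print(crew_years_per_light_year(0.99))  # ~0.43: "effectively faster than light"
```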
Can we get to the stars using…?
The atomic-bomb-propelled Orion as originally proposed would not have been capable of interstellar flight; it was intended for transportation in the solar system. Freeman Dyson later proposed an interstellar version of Orion, but it was really just a sketch of an idea.
Project Orion was incredibly forward thinking for its day but was intended for missions within the solar system. The working goal for the project was “Saturn by 1970.” While fission and fusion have incredible energy density, the devices that can contain fission and fusion reactions (reactors) are heavy, limiting their acceleration and ultimate velocity. By using nuclear reactions exterior to the ship, or external propulsion, Project Orion overcame this limitation.
The atomic-bomb-propelled Orion as originally proposed would not have been capable of interstellar flight; it was intended for transportation in the solar system, with a specific impulse on the order of 10,000 s, comparable to advanced electric propulsion but not sufficient for interstellar travel within a human lifetime. Freeman Dyson did later propose an interstellar version of Orion, but it would require tens of thousands of thermonuclear bombs, ideally designed without a fission trigger. While Dyson’s article on this topic is fascinating and provides insight into a first-rate mind grappling with the interstellar challenge, it was really just a sketch of an idea.
Later, better versions of nuclear-bomb-driven propulsion were developed: Solem’s Medusa and Zubrin and Andrews’s MagOrion. These concepts addressed deficiencies associated with Orion’s pusher plate. Medusa derives from the engineering principle that it is better to absorb the impulse of a nuclear explosion with a parachute connected to the vehicle by a tether in tension than to deposit the energy into a solid pusher plate supporting the vehicle via a column in compression. MagOrion would use a magnetic parachute rather than a solid pusher plate or physical parachute to absorb the bomb energy. Even with these improved approaches, it is still difficult for a bomb-driven vehicle to reach the stars within a human lifetime. Spacecraft propelled by nuclear pulses are still rockets and obey the rocket equation. With an exhaust velocity of only about 4% of light speed, nuclear fusion, even in the form of nuclear bombs, is difficult to use to propel a spacecraft past 10% of light speed.
Project Daedalus and follow-on Project Icarus pioneered studies of interstellar travel but assumed inertial confinement fusion with advanced aneutronic fuels was an available technology. Even if fusion is available, rockets powered by fusion struggle to exceed 10% of light speed.
Project Daedalus, from the early 1970s, was remarkable work and greatly stimulated people’s thinking about interstellar flight. It came about at a time when researchers thought laser inertial confinement fusion (ICF) would be relatively easy, perhaps achievable using a benchtop laser facility. With the breakthrough result from the National Ignition Facility in 2022, we now know ICF can be made to work, but the NIF laser facility is the size of a football stadium. The scale of NIF reveals the enormous challenge of making an ICF-propelled rocket. The original Daedalus study proposed mining the atmospheres of the gas giant planets to extract thousands of tons of helium-3, which could be used in fusion reactions that generate far fewer neutrons than the deuterium-tritium fuel used at NIF, but such a fuel is much more difficult to ignite. Even with these forward-leaning assumptions, Daedalus would have required 50 years to reach its target star system (Barnard’s star) and would not have been able to decelerate.
The follow-on, updated Project Icarus initially considered rendezvous missions but rejected them as being too difficult. The Firefly design that emerged from the Project Icarus study assumed an advanced Z-pinch-based fusion rocket that would enable deceleration at the target solar system, but with flight times of approximately 100 years.
Even if it could be made to work with advanced fusion reactions, a fusion rocket has an exhaust velocity of about 4% of the speed of light, dictated by the energy of the products of the fusion reaction. This means that it would only ever be able to reach about 8% of the speed of light due to the rocket equation (note that Daedalus was a two-stage design, wherein a much larger vehicle was used to launch a smaller vehicle, doubling its velocity at the expense of vast quantities of fusion fuel).
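A minimal sketch of the rocket-equation arithmetic (classical Tsiolkovsky form; relativistic corrections are modest below about 20% of c) shows how steeply propellant demand grows with a 4%-of-c exhaust:

```python
import math

V_E = 0.04  # fusion-product exhaust velocity, as a fraction of c

# Tsiolkovsky: initial/final mass ratio = exp(delta_v / v_e)
for dv in (0.04, 0.08, 0.10, 0.20):
    print(f"delta-v = {dv:0.2f} c -> mass ratio = {math.exp(dv / V_E):8.1f}")

# delta-v = 0.04 c -> ~2.7; 0.08 c -> ~7.4; 0.10 c -> ~12.2; 0.20 c -> ~148.4
# Staging compounds the ratios, which is how Daedalus's two stages bought
# a doubled velocity at the cost of vast quantities of fusion fuel.
```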
The Bussard Ramjet is a stimulating idea but faces a number of unsolved challenges: we cannot build magnetic scoops that are larger than their own field generators, and we have no practical way to achieve efficient proton–proton fusion, to name just two problems.
The Bussard ramjet is an attractive idea but has so many challenges that interstellar researchers get into arguments over which is the most important reason it cannot work. While the Bussard ramjet is not impossible (i.e., it does not violate conservation laws), there is no known technological implementation. To elaborate upon two flaws in the concept: We have no idea how to fuse the interstellar medium (which is mostly ordinary hydrogen, not deuterium). The sun can do this, but it requires millions of years to “cook” the hydrogen in a complex sequence of reactions. We do not know how to make that H+H reaction go on command in any machine.
The fusion reactions currently being pursued technologically for fusion energy (D+T, D+D, D+He3, p+B11) were all successfully performed using primitive particle accelerators in the 1930s, although those experiments required more energy input than was generated from fusion. The remaining engineering challenge for fusion is to make a practical, net-gain device using these reactions. Proton-proton fusion has never been done by technological means of any kind.
Another issue is the scoop. A mechanical scoop big enough to collect the tenuous interstellar media would mass too much, so it would need to be a magnetic scoop that can be projected outward from the spacecraft. Magnetic scoops do not work: Magnetic fields in the far field always present as a dipole, and dipole fields scatter approaching charged particles away rather than collect them together. There may be a way to make an efficient magnetic scoop, but we have not been able to figure one out. You could make a mechanical structure that generates a magnetic scoop inside the structure, but such a structure would mass too much. There are more problems with the Bussard Ramjet (see discussions of Opik and Fishback), but these are two major ones.
Still, the concept of spacecraft that extract reaction mass or energy from the interstellar medium remains a compelling avenue to explore. In recent years, Bussard’s idea has inspired promising propulsion approaches that, while distinct from the ramjet, leverage interaction with the interstellar medium. See Greason's Q-drive and wind–pellet shear sailing.
While electric propulsion can accelerate reaction mass (propellant) to the speeds necessary for interstellar travel, the constraint is the mass of the required power supply.
Ion propulsion (and electric propulsion in general) is a wonderful technology that uses propellant very efficiently by accelerating it with electric fields to very high speeds. However, the energy used to accelerate the propellant must come from somewhere, and if this energy source is carried onboard the spacecraft, the spacecraft will become massive, limiting its acceleration.
Advanced electric propulsion concepts exist that could generate a specific impulse of 50,000 s or more, and in principle there is no reason why a specific impulse of one million seconds could not be achieved using techniques developed for particle accelerators. The issue becomes: where do you get the power to run the engine? Even if you used a very advanced nuclear reactor or fusion reactor, the reactor would have too much mass, and the spacecraft would only be able to accelerate at a tiny fraction of a gee. If the power supply is limited, as is usually the case, increasing thrust necessitates using more propellant, reducing the specific impulse (or efficiency of propellant utilization).
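A rough estimate shows the scale of the problem. The sketch below assumes (hypothetically) a power system delivering 100 W of jet power per kilogram of total spacecraft mass, generous for a space reactor, and a specific impulse of one million seconds:

```python
G0 = 9.81        # m/s^2
C = 3.0e8        # m/s
YEAR = 3.15e7    # seconds

specific_power = 100.0       # W of jet power per kg of spacecraft (assumed)
v_e = 1.0e6 * G0             # exhaust velocity for Isp = 1,000,000 s

# Jet power P = 0.5 * thrust * v_e, so acceleration a = 2*(P/m)/v_e.
accel = 2.0 * specific_power / v_e
print(f"acceleration: {accel:.1e} m/s^2 ({accel / G0:.1e} gee)")
print(f"time to 10% of c: {0.1 * C / accel / YEAR:,.0f} years")  # ~47,000
```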
Antimatter is real: the positron (the anti-electron) was discovered in 1932. Antimatter is unrelated to dark matter; the nature of dark matter remains unknown. Antimatter is used in PET scans. Even bananas emit tiny amounts of antimatter through natural radioactive decay.
Antimatter is a mirror of normal matter in which the electric charge is the opposite of its usual value. Every time a new charged particle is created in a particle accelerator, its opposite particle is also produced, since the overall charge must be kept constant. Thus, every particle accelerator that makes new particles from energy (i.e., from $E = mc^2$) is an antimatter factory. Presently, there is no controversy around the existence of antimatter. While the question of why there should be a prevalence of normal matter in comparison to antimatter in the observable universe is an outstanding issue in cosmology, it does not pertain to the use of antimatter for interstellar travel. Readers wishing to learn more about antimatter are encouraged to search on “antiprotons,” which is the more usual scientific term.
Antimatter is used in medical PET (Positron Emission Tomography) scans. In a PET scan, a patient is injected with a radioactive tracer that collects in tissues and organs, such as cancer cells. The radioactive tracer emits positrons via radioactive decay, and when the positron annihilates with an ordinary electron, it emits a gamma ray that passes out of the patient’s body. The PET scanner detects these gamma rays and can triangulate to their source. Positrons are also emitted by anything that contains potassium, since about 0.01% of natural potassium is potassium-40, which undergoes positron emission with a half-life of about a billion years. A banana is often offered as a source of positrons; a typical banana emits a positron about once an hour.
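The banana figure can be checked with a back-of-the-envelope script; all inputs are approximate (roughly 0.5 g of potassium per banana, 0.0117% K-40 abundance, a 1.25-billion-year half-life, and a ~0.001% positron branch):

```python
import math

K_GRAMS = 0.5               # potassium per banana, g (approximate)
K40_FRACTION = 1.17e-4      # natural abundance of K-40
POSITRON_BRANCH = 1e-5      # ~0.001% of K-40 decays emit a positron
HALF_LIFE_S = 1.25e9 * 3.15e7

atoms_k40 = (K_GRAMS * K40_FRACTION / 40.0) * 6.022e23
decays_per_s = atoms_k40 * math.log(2) / HALF_LIFE_S
print(f"{decays_per_s:.0f} Bq")                             # ~15 Bq
print(f"{decays_per_s * POSITRON_BRANCH * 3600:.2f}/hour")  # ~0.5 positrons/hour
```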
Antimatter is of interest for interstellar travel because matter-antimatter annihilation is the one propellant combination that can generate exhaust velocities at a significant fraction of the speed of light and thus is the only propellant option for an interstellar mission that could reach its destination in a single human lifetime.
The LHC is not used primarily to produce antimatter. A much smaller CERN accelerator (the Proton Synchrotron, about 200 m in diameter) is used to produce antiprotons for experiments. An accelerator optimized for antimatter production could be made smaller.
Antimatter has been produced at CERN (the European Organization for Nuclear Research) in Switzerland and at Fermilab near Chicago. The accelerators used for antiproton production are smaller accelerators, hundreds of meters in diameter. An accelerator optimized for antiproton production would have similar dimensions and beam energies (tens of GeV) but operate with much greater beam currents. Such an accelerator could be built in space or on the moon. See the discussion of antimatter safety below.
The total antimatter produced so far at CERN and Fermilab is about 25 nanograms. Interstellar propulsion concepts typically require kilogram amounts of antimatter even for a small robotic probe, so production would need to increase by a factor of 10 billion or more. Current accelerator methods of producing antiprotons are extremely inefficient, simply because they have no need for large quantities of antimatter to do the intended science. Optimistic “antimatter factory” designs using recirculating beams suggest conversion efficiencies on the order of 0.01% of input electrical energy to stored antiparticles. Using those optimistic figures, and an electricity price of $0.01/kWh, the electricity-only cost to produce 1 kg of antimatter would be on the order of $2.5 trillion, which explains why cost and power supply remain the primary barriers to antimatter propulsion. Still, industrialization of space—where we would prefer to produce and store antimatter for safety reasons—using in‑situ resources could exploit abundant sunlight to lower energy costs by one to two orders of magnitude, bringing the cost of antimatter for an interstellar mission in line with other major science projects (e.g., the International Space Station cost $100 billion).
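The trillion-dollar figure follows directly from $E = mc^2$ and the stated assumptions (0.01% wall-plug-to-antimatter conversion efficiency and $0.01/kWh electricity); a minimal sketch:

```python
C = 3.0e8                     # m/s
mass = 1.0                    # kg of antimatter
efficiency = 1e-4             # 0.01% of input electricity stored (assumed)
price_per_kwh = 0.01          # dollars (assumed)

energy_in = mass * C**2 / efficiency          # joules of electricity
cost = energy_in / 3.6e6 * price_per_kwh      # J -> kWh -> dollars
print(f"${cost:.1e}")                         # ~$2.5e12, i.e., $2.5 trillion
```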
When a single atom of matter contacts antimatter, it annihilates, but this does not trigger the destruction of the entire antimatter sample. The energy released by one-atom annihilation is negligible, comparable to the natural radioactive decay occurring all around us.
Just as a positron emitted in a banana soon meets an electron and annihilates, individual atoms leaking into an antimatter trap will annihilate one by one. Although each proton–antiproton annihilation releases roughly a thousand times more energy than a typical decay event, such events remain imperceptible at macroscopic scales. Losses from imperfect vacuum resemble a slow evaporation of antimatter, not a catastrophic explosion.
Antimatter ions have been trapped in vacuum by electromagnetic fields for months with negligible loss. For macroscopic antimatter, vacuum levitation can be validated using ordinary matter and then the field polarity reversed. Compared to the difficulty of producing antimatter, trapping and storing it is comparatively simple.
Antiprotons have been contained in electromagnetic traps (Penning traps) for periods of months without measurable loss. For the quantities of antimatter needed for interstellar travel (hundreds of grams to kilograms), the antimatter would need to be stored in condensed form, likely antihydrogen ice encapsulated in anti-lithium. This frozen ball would be slightly charged via an excess or deficit of positrons and levitated electromagnetically. Maglev trains are an example of how an object can be suspended against one gee of acceleration. Similar techniques can be developed and tested to suspend a snowball of conventional hydrogen ice in vacuum, with the polarity reversed when the system is used for antimatter storage.
Solar sails are a promising technology that uses the photons of sunlight for thrust. Even for a solar sail launched from very near the sun, it is difficult to achieve more than 1 or 2% of the speed of light.
Extreme solar sailing for interstellar missions necessitates going very close to the sun, so high-temperature materials that can withstand the intense sunlight (e.g., carbon, beryllium, etc.) are of interest. This fundamental material limitation caps solar sails leaving the solar system at less than 2% the speed of light. An interstellar-bound solar sail can be useful as a precursor mission to explore the interstellar medium or as a centuries-long mission to another solar system, but solar sails cannot realize an interstellar mission in a human lifetime.
The sun is a diffuse source of light. Unlike a laser, there is a fundamental limit to how much sunlight can be focused.
The sun is a finite-sized disk emitting diffuse light. This fact sets a limit on how much you can focus sunlight: the best a concentrator can do is form an image of the sun, so the flux arriving at the sail cannot exceed the flux leaving the sun’s surface. Such concentrators would need to be much larger than the sail being accelerated, and a large number of them would need to be spaced along the path. Examination of this concept concluded it would be preferable to convert sunlight into laser light, which can be much more effectively focused onto a lightsail.
Warp drives and worm holes are an active area of investigation, but research is presently confined to examining mathematical solutions to the field equations of general relativity. These concepts are not at the level where experimental tests can be performed.
Concepts like the Einstein–Rosen bridge (wormhole) and the Alcubierre metric are legitimate science, and work continues to be published in top-tier physics journals. These remain purely theoretical constructs confined to mathematical solutions of general relativity, requiring exotic energy conditions and negative-mass distributions. No experiment or observation has yet validated these ideas in practice, nor do we presently have ideas for how such experiments could be performed.
Experimental work has been performed with laboratory analogs, such as fluids, condensed matter, metamaterials, etc. For example, Bose-Einstein condensates permit states comparable to negative energy to be created, but this should not be interpreted as an experimental path to creating worm holes or warp drives. While enthusiasts for interstellar travel would welcome a warp drive or worm hole, real experimental progress toward flight to the stars must be grounded in the physics of our current textbooks and the experimental tools in our labs and workshops.
Starting and stopping a spacecraft is energy demanding, and objects in the Oort cloud are very sparse, cold, and difficult to find.
It has been suggested that perhaps we can use Oort cloud objects as waypoints to expand into interstellar space (a bit like how the Polynesians expanded across the entire Pacific, island by island, using primitive technology). The Polynesian model has been explored for expansion both into our solar system and into interstellar space. While an evocative idea, we may never find enough Oort cloud objects to use. We know the Oort cloud exists, since there must be a pool of objects from which long-period comets originate, but objects in the Oort cloud would be near absolute zero in temperature, making them nearly impossible to detect. They are also very widely spaced: on average, there is an empty volume of space the size of Saturn’s orbit around each Oort cloud object.
Next, it is horribly expensive to start and stop a spacecraft at a waypoint. The propellant-to-payload mass ratio needed for a rendezvous mission is the square of that needed for a flyby. That is, if you need 10 tons of fusion fuel to fly 1 ton of payload past an Oort cloud object, you would need roughly 100 tons of fusion fuel to make the same journey and decelerate to a stop at that target. So, even if you could extract more fusion fuel from the Oort cloud (e.g., deuterium from icy comets), it would probably not be worthwhile to stop and do so. You would arrive at your final destination sooner by not stopping to refuel.
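Because the rocket-equation mass ratio is exponential in delta-v, doubling the delta-v squares the ratio, which is the 10-to-100 jump above; a two-line check with illustrative numbers:

```python
import math

v_e = 0.04                     # exhaust velocity, fraction of c (assumed)
dv = v_e * math.log(10)        # delta-v chosen so the flyby mass ratio is 10
print(math.exp(dv / v_e))      # 10.0  (flyby)
print(math.exp(2 * dv / v_e))  # 100.0 (rendezvous: the flyby ratio, squared)
```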
What can stop us from going?
Most interstellar dust grains are smaller than a micron. At 20% light speed, a collision with a micron dust grain releases ~1 J of energy, comparable to a camera flash or starter pistol shot. We do not know if larger grains exist; verifying this requires an interstellar precursor mission.
There are two kinds of particles that will be encountered: The interstellar gas, mainly hydrogen and other atomic ions, which make up 99% of the interstellar medium, and dust, mostly carbon and silicates. The impact of gas is dealt with in the next section (Section 2.4.2).
Interstellar dust grains, composed of silicates and carbon compounds, are believed to be micron-sized and smaller and make up about 1% of the interstellar medium. Alas, we cannot test these kinds of impacts (at 20% the speed of light) in the lab, but we can calculate how much energy they could explosively release on impact: at 20% the speed of light, the kinetic energy is about 1 J for a micron-sized dust grain. This is the equivalent of a firecracker or small camera flash going off. Such an impact will certainly do some local damage, but the spacecraft can be designed with this in mind, including shielding and self-healing materials. Such an impact will not vaporize the spacecraft, but there will be a lot of these impacts, based on estimates of dust in the interstellar medium: every second, there will be little flashes going off on every square meter of forward-facing surface.
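The 1 J figure is simple kinetic energy; a sketch assuming a 1-micron silicate-like grain (density ~2000 kg/m³; at 20% of c the classical formula is accurate to a few percent):

```python
import math

RHO = 2000.0                   # grain density, kg/m^3 (assumed)
radius = 0.5e-6                # 1-micron-diameter grain
v = 0.2 * 3.0e8                # 20% the speed of light

mass = (4.0 / 3.0) * math.pi * radius**3 * RHO
print(f"{0.5 * mass * v**2:.1f} J")  # ~2 J: camera-flash scale, as stated
```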
We presently do not know if there are dust grains larger than micron-sized in interstellar space. Astronomers do not believe there is a significant mass of material in this size range (in comparison to the gas and micron-size dust they do know something about); otherwise, they would observe its effect on starlight attenuation. Based on models of dust evolution, there is no known mechanism for large dust grains to form in interstellar space, and larger grains should be broken up by shock waves driven by supernovae.
To verify the population of interstellar dust grains over the range of sizes, it will be necessary to send precursor missions to determine the composition of the interstellar medium. Fortunately, just such a mission (the Interstellar Probe) is under development at the Johns Hopkins Applied Physics Lab. This mission would not be a mission to another solar system, but rather to explore out to 1000 AU (ten times farther than the Voyager 1 and 2 spacecraft) to measure fields and particles out to this distance. This mission could be selected to fly in the 2030s.
Even if mm-sized sand grains are encountered, this is not necessarily an insurmountable barrier to interstellar flight. If impacting a thin structure (lightsail, etc.), studies suggest that the grain will pass through the sail and damage an area no larger than the grain itself. It would not release the complete kinetic energy of the sand grain (the impact of which is equivalent to several hundred kilograms of TNT, still not an atomic bomb).
If impacting a solid spacecraft structure, a study from the Lawrence Livermore National Laboratory suggested that the dust grains will be stopped within millimeters. For a larger, sand-sized grain, an impact releasing hundreds of kilograms of TNT-equivalent energy would be catastrophic, but this short stopping distance suggests a shielding strategy: a sacrificial shield could fly far in front of the spacecraft. The dust would be approaching the spacecraft straight-on with essentially no sideways motion (like driving into a snowstorm at high speed), so only the forward projected area of the spacecraft needs to be shielded. The impact upon the shield would turn the dust into plasma, which would largely dissipate before reaching the spacecraft. A magnetic field generated by the spacecraft would deflect any residual plasma.
For a larger spacecraft with sufficient power, it is possible an onboard laser could identify (via Light Detection and Ranging, LiDAR) and deflect (via pulsed laser) larger grains. A 1-mm size grain could easily be identified by LiDAR, and again, you only need to search a very small area ahead of the spacecraft since such grains would be approaching head-on.
Much work remains to be done and researchers are actively working on these problems. It would be premature to conclude interstellar travel is unfeasible due to the dust grain impact problem.
Gas impact will erode forward facing surfaces by sputtering. These impacts have been tested in the lab and the amount of material removed has been measured. For a trip to the nearest stars at 20% of light speed, less than a millimeter of surface would be removed.
Interstellar gas, mainly hydrogen and other atomic ions, makes up 99% of the interstellar medium. The remainder is interstellar dust, which is discussed above. The density of gas in interstellar space is about one atom for every five cubic centimeters of space, so atoms will be continuously striking forward-facing spacecraft surfaces.
An impact at 5% the speed of light corresponds to an energy of about 1 MeV for a hydrogen nucleus (proton). The impact of MeV ions is something that can be, and is, tested in the lab. The field of research called sputtering studies how ions impacting a surface knock off (or sputter) surface atoms. At the highest energies for which sputtering is studied (MeV levels), a typical impacting ion will knock as many as a hundred atoms off the surface. We can draw upon this result to calculate that, over a light year of travel through the interstellar medium, erosion will claim a few microns of material from the forward-exposed surfaces.
The interstellar gas results in negligible heating of the spacecraft. In a sputtering process, the energy of the impact goes into knocking ions out of the target surface, so not all the impact energy is deposited as heat. But even if all of it were, assuming one ion in every 5 cm$^3$ and a spacecraft traveling at 20% the speed of light, the energy flux to a forward-facing surface (heating power) is about 40 W/m$^2$. Sunlight is 1400 W/m$^2$, so the heating of forward-facing surfaces is less than 3% of sunlight. The impact of gas might even be useful as a source of heating in deep space.
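Both numbers, the few microns of erosion and the ~40 W/m² of heating, follow from the gas density; a sketch assuming 100 sputtered atoms per impact and an aluminum-like target atom density of 6 × 10²⁸ atoms/m³:

```python
LIGHT_YEAR = 9.46e15   # m
n = 0.2e6              # protons per m^3 (one per 5 cm^3)
v = 0.2 * 3.0e8        # spacecraft speed, m/s
m_p = 1.67e-27         # proton mass, kg

# Erosion: ions swept up per m^2 over one light-year, times the
# sputtering yield, divided by the target's atom density (assumed).
ions_per_m2 = n * LIGHT_YEAR
depth = ions_per_m2 * 100 / 6e28
print(f"erosion: {depth * 1e6:.1f} microns per light-year")    # ~3 microns

# Heating: kinetic-energy flux of the oncoming protons (an upper bound,
# since sputtering carries some energy away).
flux = n * v * (0.5 * m_p * v**2)
print(f"heating: {flux:.0f} W/m^2 (sunlight is ~1400 W/m^2)")  # ~36 W/m^2
```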
The interstellar medium is very thin. Even if all the gas and dust encountered comes to a stop on the spacecraft, its deceleration is negligible.
99% of the interstellar medium is gas, with a density of about one hydrogen nucleus (proton) in every 5 cm$^3$ of space. If all this gas were to come to a stop on the forward-facing surface of a spacecraft only 1 mm thick traveling at 20% the speed of light, then, from conservation of momentum, the spacecraft would lose less than 0.0005% of its initial speed to drag per light year of travel. A larger, heavier spacecraft would lose even less of its speed.
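The drag estimate is a one-line momentum balance; a sketch assuming a 1-mm aluminum-like sheet (2700 kg/m³):

```python
LIGHT_YEAR, n, m_p = 9.46e15, 0.2e6, 1.67e-27

swept_mass = n * m_p * LIGHT_YEAR        # kg of gas per m^2 per light-year
areal_mass = 1e-3 * 2700.0               # 1-mm aluminum-like sheet, kg/m^2
print(f"{swept_mass / areal_mass:.1e}")  # ~1e-6: fractional speed loss per ly
```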
Dust is, from a momentum loss perspective, even less of a concern. An anecdote that is told about the American astronomer Harlow Shapley illustrates this point: In classroom lectures, Shapley would start rubbing a piece of chalk back and forth on the chalkboard, asking his class to tell him when to stop when the thickness of dust equaled the total integrated dust from earth to the center of the galaxy. The answer was to stop almost immediately; just a few layers of chalk dust is all there is between us and the center of the galaxy.
The Oort cloud is sparse. Around each Oort cloud object is a volume of space comparable to the size of our solar system inside Saturn’s orbit. A spacecraft does not need to take any special precautions to avoid hitting an Oort cloud object.
Multiple spacecraft have passed through our asteroid belt without colliding with an asteroid, including four missions in the 1970s (Pioneer 10 and 11 and Voyager 1 and 2) at a time when knowledge of objects in the asteroid belt was only one thousandth of what it is today. Movies (e.g., The Empire Strikes Back) give a misleading impression of what an asteroid belt actually looks like. If standing on the surface of a typical asteroid, no other independently orbiting asteroid is visible to the naked eye. A spacecraft could pass through the asteroid belt millions of times, taking no special precautions, without striking an asteroid, although micrometeoroids remain a valid concern. And the Oort cloud is far more sparse than our solar system’s asteroid belt.
The radiation environment of interstellar space is not significantly different from that within the solar system or even in low Earth orbit. While it is a concern, the natural radiation environment is not a showstopper for interstellar travel.
Interstellar space is subjected to galactic cosmic rays (GCRs), consisting mainly of hydrogen and helium nuclei (protons and alpha particles), with the remainder being heavier ions, accelerated to near light speed by supernova explosions and other astrophysical phenomena. GCRs have a power-law distribution, similar to the Richter scale for earthquakes: there are more low-energy GCRs than high-energy GCRs. The solar system’s heliosphere provides some shielding against lower-energy GCRs, but the highest-energy GCRs, with energies of GeV and greater (some as great as 10$^{20}$ eV), enter the solar system unimpeded. The Earth’s magnetosphere and the Earth itself provide some additional shielding in low Earth orbit, but any mission above the Earth’s atmosphere will be exposed to GCRs, which is a significant concern for a crewed mission. Spacecraft engineering has largely learned to deal with this issue for hardware via redundancy, namely, having backup flight computers running in parallel in case one is damaged by a cosmic ray, combined with error-correcting memory and fault-tolerant software. Thus, while radiation exposure is a concern for an interstellar mission, it can be addressed using the same strategies deployed in today’s spacecraft.
Note that the fact that an interstellar spacecraft might be traveling at a significant fraction of the speed of light does not change the cosmic radiation exposure. Even if traveling at 50% of light speed, the spacecraft would be effectively “standing still” in comparison to GCRs with GeV energies or greater. A common 10 GeV cosmic-ray proton is already traveling at more than 99% the speed of light, so by relativistic velocity addition, the spacecraft’s motion makes a negligible change to the proton’s speed relative to the ship. These GCRs would still strike the spacecraft from all directions.
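Relativistic velocity addition shows why: even a head-on 0.5 c closing speed barely moves a GCR proton’s velocity. A sketch, with the 10 GeV proton’s speed computed from its Lorentz factor:

```python
import math

def add_velocities(u, v):
    """Relativistic velocity addition, speeds as fractions of c."""
    return (u + v) / (1.0 + u * v)

gamma = 1.0 + 10.0 / 0.938           # 10 GeV kinetic energy, 0.938 GeV rest mass
beta_gcr = math.sqrt(1.0 - 1.0 / gamma**2)
print(beta_gcr)                      # ~0.9963
print(add_velocities(beta_gcr, 0.5)) # ~0.9988: still just below c
```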
When traveling at a significant percentage of the speed of light, the gas of the interstellar medium becomes an additional radiation concern, as it presents itself to the spacecraft as a directional flux of protons at MeV energy levels (an ion at 5% light speed carries about 1 MeV of energy per nucleon). The flux of these protons striking the spacecraft’s leading surfaces will be greater than the flux of GCRs: in interstellar space, there is about one proton in every 5 cubic centimeters, while there is only one GCR per 10-m-on-a-side cube. However, 1 MeV protons can easily be shielded by a layer one-tenth of a millimeter thick of nearly any material (e.g., water, aluminum, plastic, etc.). We presently do not have a low-mass technology to shield the highest-energy cosmic rays, which require a meter or more of solid material to stop, although this is an area of active research.
For navigation, an interstellar spacecraft will have to rely on onboard autonomy (artificial intelligence) and onboard cameras (star trackers). Pulsars can be used like a galaxy-wide GPS system.
An early preview of how an interstellar mission might be conducted was given by the New Horizons spacecraft, which used its onboard cameras to search for potential Kuiper belt objects it could target for a flyby. Although not yet successful in finding a new target, this strategy would be used by a robotic spacecraft as it approached the target solar system. New Horizons has also demonstrated interstellar navigation by precision observation of the parallax of Alpha Centauri and other nearby stars as it travels outward from the sun.
Another interesting possibility is X-ray pulsar-based navigation and timing (XNAV for short), which has already been demonstrated in Earth orbit. This is like a GPS system for our corner of the galaxy and could be used onboard the spacecraft to determine its absolute position.
For a robotic mission, accelerations as great as 10,000 gees can be tolerated by electronics and optics found in g-hardened equipment; 10 minutes of such acceleration is sufficient to reach 20% the speed of light. For an eventual crewed mission, one gee of acceleration for 10 weeks will be sufficient to reach 20% the speed of light.
Artillery shells with laser-guidance optics and GPS receivers can survive 30,000 gees of acceleration upon being launched from a cannon. Even today’s smartphones usually survive a fall onto a hard surface, which is equivalent to a thousand gees of deceleration on impact.
For eventual human missions, a convenient fact to remember is that one gee of acceleration for one year brings you very close to the speed of light (neglecting relativistic effects). So, two months of acceleration at two gees, for example, results in a speed of one-third the speed of light.
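The rule of thumb is easy to verify (non-relativistic, as the text notes):

```python
G0, C, YEAR = 9.81, 3.0e8, 3.156e7

print(G0 * YEAR / C)                 # ~1.03: one gee for one year is roughly c
print(2 * G0 * (2 / 12 * YEAR) / C)  # ~0.34: two gees for two months -> c/3
```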
What is the point in going if…?
Exoplanet astronomy is making enormous advances, but no proposed exoplanet observatory would be able to resolve the disk of an exoplanet.
While there is much excitement around direct imaging of exoplanets, these observations refer to separating the light reflected from an exoplanet from the light of its star. Direct imaging would not permit the disk of an exoplanet to be resolved; the exoplanet would remain like Carl Sagan’s “Pale Blue Dot,” a sub-pixel image. An enormous amount of information can be extracted from a pixel-width image of an exoplanet via spectroscopy, and there is potential to detect continents, seasonal variations, and perhaps glints of starlight off seas and oceans, but we would not be able to resolve the disk of the planet.
The wave nature of light sets a fundamental limit to how well a target can be resolved optically. As a quick estimate, the formula $L\lambda \approx dD$ can be used, where $L$ is the distance to the target, $\lambda$ is the wavelength of light used, $d$ is the size of the smallest feature to be resolved, and $D$ is the diameter of the optical element used to observe. For a megapixel image of an Earth-sized planet at a distance of 10 light years at optical wavelengths ($\lambda$ = 500 nm), resolving features about 13 km across, the telescope would need to be 4000 km in diameter, covering a significant fraction of the surface of the Earth. As discussed below, the telescope would need to be a solid array; just combining widely spaced telescopes interferometrically would not provide sufficient signal-to-noise ratio.
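Plugging in the numbers (a sketch; 1000 pixels across an Earth-diameter disk gives ~13 km features):

```python
LIGHT_YEAR = 9.46e15
L = 10 * LIGHT_YEAR        # distance to the exoplanet, m
wavelength = 500e-9        # visible light, m
feature = 12.7e6 / 1000    # Earth's diameter split across 1000 pixels, m

D = L * wavelength / feature
print(f"{D / 1e3:.0f} km aperture")   # ~4000 km
```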
While exoplanet astronomy will continue to make amazing discoveries—with or without interstellar travel—consider how our knowledge of planets in the solar system was advanced by robotic interplanetary missions, compared to if our planetary observations had been limited to just earth or space-based telescopes.
No, a distributed array of optical telescopes cannot be combined to image an exoplanet, because they would not have sufficient light-gathering ability. The EHT could combine widely spaced telescopes because it imaged a bright radio source against a dark radio background.
The Event Horizon Telescope’s direct imaging of the black hole at the center of galaxy Messier 87 was a remarkable achievement, done by combining observations made by eight separate radio telescopes spaced around the world. By interferometrically combining signals, reinforcing the peaks and troughs of incoming light wavefronts, astronomers synthesized a virtual telescope with the resolving power of an Earth-sized aperture.
This same technique cannot be used to make an image of an exoplanet. For the EHT, a very bright radio source from a black hole was imaged against a sky that is relatively quiet at the radio frequencies used. The EHT used an interferometric array of telescopes to increase its ability to resolve the black hole, meaning finer features could be seen.
For imaging an exoplanet, a similar technique might be attempted, although combining optical wavelengths interferometrically over distances of thousands of kilometers has never been done. However, the more significant challenge is that the exoplanet is a weak source of light against a bright background. An earth-like planet reflects only one ten-billionth of its parent star’s light, and there would be other sources of light in the telescope field of view. While the planet’s star could be blocked out using a coronagraph, other sources of light cannot be blocked, like the zodiacal light of the exoplanet’s solar system. “Exo-zodi” is the light reflected from the dust located in the target solar system. Achieving sufficient signal-to-noise to separate the light of the exoplanet from the exo-zodi might require too long an integration time. For an array of several 10-m-sized telescopes spaced around the globe, estimates of the required integration time range from days to years, meaning features on the rotating planet would be smeared out during the integration and not resolved.
Yes. Although there are technical challenges, sending a space telescope to 550 AU would let us use the Sun’s gravitational field as an incredibly powerful lens, enabling extreme light-gathering and resolution by focusing distant starlight through gravitational lensing.
The solar gravitational focus (or lens, SGL) is an exciting prospect for imaging exoplanets with megapixel resolution. Starlight (or light from an exoplanet) passing near the Sun is concentrated along a line extending from the Sun in the direction opposite to that of the star or exoplanet—a phenomenon first described by Einstein. By sampling the Einstein ring formed as the gravitationally focused light from an exoplanet, an enormous amplification of light gathering and resolution capability can be realized.
As the solar gravitational lens is a spherical lens, there is not a focal point, but rather a line that extends away from the sun. While this makes reconstructing the image more challenging, this fact means that the spacecraft does not need to stop upon reaching the SGL location; it can continue to fly along this line and continue imaging the target.
The solar gravitational focus mission still represents an enormous technological and specifically propulsion challenge. It would likely need to travel to 650 AU (to avoid image contamination from the sun’s corona), a distance almost four times further than Voyager 1—the fastest object to ever leave the solar system—has traveled in almost 50 years. With current propulsion technology, such a mission would require 200 years, so we would need to go faster. The JPL team that studied this mission concept in the greatest detail selected a solar sail which might be able to reach the SGL in 30 years. This mission forms a great catalyst for the development of advanced propulsion while being much more realizable in the near term than a true interstellar mission.
The Wait Calculation is sensitive to the assumptions of how technology advances.
“The Wait Calculation” by Kennedy is a thought-provoking and worthwhile-to-read paper. The premise that an interstellar mission will be overtaken by a later mission launched with more advanced technology is well recognized in the interstellar travel community: It is the subject of a science fiction story, “Far Centaurus” by A. E. van Vogt, first published in Astounding Science Fiction in 1944, wherein an interstellar crew awakens from hibernation upon arrival at Alpha Centauri to find the solar system there already settled by missions launched later that overtook their earlier-launched-but-slower ship. The calculation performed by Kennedy depends sensitively on the assumed rate of progress in propulsion technology and does not take into account potential breakthroughs. Heller’s more recent revisit of “The Wait Calculation” arrived at a more optimistic outcome, suggesting the optimal time to launch the first interstellar probe is around 2050.
Projecting technological trends across several orders of magnitude carries significant risk. The somewhat mythologized “Great Horse-Manure Crisis of 1894” exemplifies this danger: had contemporary trends continued unchecked, predictions suggested that London and other major 19th-century cities would soon be inundated by horse manure. This episode illustrates the potential pitfalls of technological extrapolation.
Finally, it is worth pointing out that the “Wait Calculation” also assumes that propulsion technology makes continuous, incremental progress in increasing the speeds that our technology can realize. For this calculation to be a valid reason not to launch an interstellar mission, ironically, we need to be continuously working on advancing our propulsion technology.
Proxima Centauri may not be a habitable system (by our standards), but it is still worth investigating. It is of interest because it is close, but if we have the capability to reach Proxima, other exoplanets are also within reach with a bit more patience.
Proxima Centauri b is a confirmed exoplanet orbiting Proxima Centauri, the nearest star to our solar system. Proxima b lies within the habitable zone, meaning its equilibrium temperature permits liquid water to exist. However, Proxima Centauri is an M-dwarf (red dwarf) star known to be a flare star: it bathes the planet in ultraviolet and X-rays and could potentially strip away an atmosphere, suggesting the planet might not be capable of sustaining life as we know it.
These facts do not mean Proxima b is unworthy of study. In our own solar system, the planet Mercury is not believed to be capable of supporting life and lies outside the habitable zone, but it is still of scientific interest and has been targeted by several robotic missions. All exoplanets are of intense scientific interest. Thus, due to its nearness, Proxima b would very likely be the subject of the first robotic interstellar mission. We will not know for certain whether Proxima b hosts life unless we go there, and as our present knowledge of the genesis of life is limited to n = 1, we should not assume too much. As the fictional Ian Malcolm pointed out, “Life finds a way,” and there is always the possibility of a discovery that, as a starship doctor famously put it, “is not life as we know it.” We will not know unless we go.
If Proxima b turns out to be uninteresting, there are another 10 stars within 10 light years of Earth, and with the same technology used to send a probe to Proxima b, we can visit all of them, with transit times proportional to distance. Alpha Centauri A is very similar to our sun (both are G2V-class stars) and only a bit farther than Proxima; we presently do not know whether it has planets, but it is a subject of intense scrutiny for this reason. Epsilon Eridani (at 10.5 ly) and Tau Ceti (at 11.9 ly) are both similar to the sun and exhibit strong evidence of having planets. They will take nearly three times as long to reach as the Alpha Centauri system, but if we are three times as patient, we can go there as well.
Planets have been discovered in binary and multi-star systems, demonstrating planets can exist in such systems. We presently do not know if Alpha Centauri A or B host planets; the system is the subject of intense scrutiny.
We presently do not know if there are planets orbiting Alpha Centauri A or B, although there have been tentative reports of detections, including via direct imaging using the James Webb Space Telescope [43]. The Alpha Centauri system continues to be the subject of concentrated observation by astronomers due to its nearness. Simulations of orbital stability have shown that planets in tight orbits around A or B (typically <3–4 AU, a range that includes the habitable zone) can be stable against the perturbations of the other star.
If no planets are discovered in the Alpha Centauri A/B system, it is still an attractive target for an interstellar probe mission due to its relative nearness. With the same technology used to reach the Alpha Centauri system, other solar systems can be reached, with transit times proportional to distance. The discovery of planets around Alpha Centauri A or B would be an enormous catalyst for further study of interstellar travel, but their absence does not mean interstellar travel is pointless.
Deceleration is difficult. The first interstellar missions will almost certainly be flybys, just as were the first missions in our solar system. There are concepts to decelerate by braking against the interstellar medium using a magnetic parachute.
The first mission to another solar system will almost certainly be a flyby. A mission using rocket propulsion that decelerates upon arrival has to square the propellant-to-payload mass ratio in comparison to a flyby mission. The first missions to the Moon, Venus, Mars, Jupiter, Saturn, and the first comet and asteroid missions were all flybys, with orbiters only coming later. Flybys are still the only type of missions that have gone to Uranus, Neptune, and Pluto.
Promising techniques that could be used for deceleration have been proposed that do not require fuel. Magnetic sails (or, perhaps better described as “magnetic parachutes”) can decelerate by interacting with an enormous volume of the interstellar medium (much larger than the spacecraft itself) so that the spacecraft could go into orbit around the star in the solar system to be explored. The deceleration would need to start years or even a decade or more before arriving in the target system by using drag against the charged particles that comprise the interstellar medium.
Even traveling at 20% the speed of light, it takes several hours to cross a solar system similar to ours. This should permit sufficient time to image targets of interest. Follow-on missions can then examine the most interesting features found in greater detail.
A simulated flyby of Proxima Centauri by a swarm of laser-launched lightsails traveling at 20% the speed of light was produced by Eubanks et al. under a NASA NIAC-funded study. When viewed in real time, the exoplanet flyby lasts about one minute, evoking the sped-up animations of Voyager and New Horizons encounters that originally spanned several days. Although surface features are visible for only a few seconds, this brief window is enough to capture detailed imagery.
A 1-watt transmitter onboard the spacecraft would be able to send megapixel images back to Earth as it passed through the Alpha Centauri system. Please see the detailed calculations in Section 5 of the NASA-supported study by Lubin et al.
Missions using laser-driven sails tend to favor launching small femtosatellite spacecraft or chipsats with a mass on the order of a gram. How can such a tiny spacecraft return images from many light years away? If the spacecraft is equipped with a 10 W laser transmitter (comparable to what is used in laser light shows), potentially powered by a radioisotope thermoelectric generator, data could be encoded in a laser beam directed back to Earth. The beam would spread out due to diffraction, becoming much wider than the Earth, but would still be detectable. One option would be to use the optics of the multi-kilometer-wide laser emitter used to launch the lightsail as the receiver. This would enable kilobits per second of data to be transmitted, comparable to the data return from the Voyager 1 and 2 spacecraft.
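A rough photon-budget sketch suggests why kilobit-class rates are plausible. All values here are illustrative assumptions: a 10 W, 1-µm transmitter, a 1-m-class transmit aperture (perhaps the sail itself), and a 5-km-wide receiving array at Alpha Centauri’s distance:

```python
H, C = 6.63e-34, 3.0e8
wavelength = 1.0e-6        # m, near-infrared (assumed)
power = 10.0               # W transmitter (assumed)
d_transmit = 1.0           # m transmit aperture (assumed)
d_receive = 5.0e3          # m receiving array (assumed)
L = 4.25 * 9.46e15         # distance to Alpha Centauri, m

beam_radius = 1.22 * wavelength / d_transmit * L   # diffraction-limited spot
photons_sent = power / (H * C / wavelength)        # ~5e19 photons/s
fraction = (d_receive / 2)**2 / beam_radius**2     # receiver area / beam area
print(f"{photons_sent * fraction:.1e} photons/s")  # ~1e5: kbps-class rates
                                                   # with photon-efficient coding
```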
The ongoing NASA Psyche mission is demonstrating laser communication across inner solar system distances, with superconducting detectors sensitive enough to register single photons. By the time we are ready to launch an interstellar probe, laser communication technology should be capable of sending the data back.
Why we should not develop interstellar technology
Antimatter production is both extraordinarily costly and technically demanding; any organization attempting to develop an antimatter weapon would almost certainly bankrupt itself. Still, prudence dictates that all manufacturing and long-term storage occur off Earth to mitigate the extreme safety and containment challenges.
The exact behavior when a macroscopic chunk of antimatter meets matter has not been fully characterized, but annihilation at the contact interface would push the two masses apart. This effect would be analogous to the Leidenfrost phenomenon, where a liquid droplet levitates on its own vapor cushion above a surface far hotter than its boiling point. As discussed above, a single atom of matter contacting an antimatter reservoir triggers only localized annihilation; it does not result in the immediate destruction of the entire antimatter mass.
Anyone seeking to make a weapon of mass destruction would not select antimatter, since there are much easier and lower-cost methods available. Studies have shown that a clandestine effort to make a nuclear bomb could be realized for as little as $100 million. A quantity of antimatter with energy release equivalent to a 20 kT weapon would, using the optimistic, forward-leaning estimates above, cost billions of dollars per gram. There is also no limit to how powerful a hydrogen bomb can be made: during the Cold War, hydrogen bombs of 500-megaton yield were designed (equivalent to roughly 30,000 Hiroshima bombs). Such a weapon would be much less expensive than making antimatter. Of course, a weapon-of-mass-destruction effort using biological weapons could be cheaper still; one study of bioterrorism threats suggests that only a few hundred dollars (e.g., a gene-sequencing machine purchased via eBay) might be sufficient to produce a virulent pathogen.
The crucial difference between antimatter and nuclear weapons is that every antiatom must be manufactured from pure energy via $E = mc^2$, whereas a nuclear weapon liberates far more energy than was consumed in its assembly. The 1960s Project Plowshare and 1970s Project PACER explored underground fission- and fusion-bomb detonations to heat water into steam for power generation; although perhaps objectionable on other grounds, the concept was technically feasible. For this reason, namely that nuclear explosives yield net energy out, anyone seeking to use matter-to-energy conversion as a weapon would opt for nuclear explosives over antimatter.
Still, it is true that storing even micrograms of antimatter would demand unprecedented safeguards. Much like handling highly radiotoxic isotopes (e.g., polonium-210 or cobalt-60) or virulent biothreats (e.g., smallpox, botulinum toxin, etc.) and the ability to manipulate them (e.g., CRISPR), antimatter would force our civilization to elevate our containment protocols and safety culture to a new level. Producing the antimatter required for interstellar travel off Earth—whether in deep space or on the Moon—offers clear advantages. Those locations provide a naturally pristine vacuum ideal for both generating and storing antimatter, and they bypass the major safety and security risks of Earth-based production.
By design, the laser would only be able to focus in the immediate vicinity of the lightsail and could not be focused back onto the Earth.
A laser array designed to push a lightsail to the speeds necessary for interstellar travel would need to be very carefully constructed to focus on the sail as it accelerates (out to ≈0.1 AU), far beyond cis-lunar space. Note that a mirror could not be used to redirect the focus back at the planet: the reflected laser beam would undergo diffraction and be spread out by the time it arrived back on Earth.
There are valid reasons for building the laser array needed for interstellar travel off Earth, as doing so would alleviate these concerns. Whether an enormous laser is built for interstellar travel or not, it is likely that high-power laser arrays will be developed for other applications in any case.
The wavelength selected for the laser is dictated by being a wavelength for which the atmosphere is transparent. The amount of laser power deposited in the atmosphere is negligible. The intensity of the beam as it leaves the earth is no stronger than sunlight; a bird flying through the beam would be unaffected.
The primary challenge in designing a laser for interstellar propulsion is transmitting its beam through Earth’s atmosphere. Consequently, the laser’s wavelength must lie within one of the atmosphere’s transparency windows. There would be a negligible amount of laser energy absorbed by the atmosphere.
To accelerate a lightsail to interstellar speeds, you need a laser array several kilometers across. A 100 GW beam emitted by a 10 km × 10 km phased array corresponds to an exit flux of about 1 kW/m², lower than the sun’s irradiance at Earth’s surface (~1.4 kW/m²). Both the UCSB Starlight concept and Breakthrough Starshot propose using near-infrared lasers at roughly 1 µm wavelength, which is non-ionizing and carries no cancer risk. In fact, one could safely walk through the entire 10-km-wide beam without adverse effects; a bird flying through the beam would be unharmed.
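The exit-flux arithmetic is a one-line check:

```python
print(100e9 / (10e3 * 10e3))   # 1000 W/m^2: below sunlight's ~1400 W/m^2
```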
The fact that the laser flux leaving the laser array is comparable to sunlight suggests that the laser array could be solar powered. It is possible to think of the laser array as a giant mirror reflecting sunlight, but rather than direct reflection, the array converts incoming solar energy into a coherent laser beam that can be focused onto the lightsail. The overall power density remains the same until it is focused onto the lightsail.
True, but humanity is already contending with several technologies that could be species-ending. Uniquely, however, interstellar travel would allow humanity to establish a presence in multiple solar systems, safeguarding our species against existential threats that endanger any single system.
It is true that propulsion technology that can reach a significant fraction of the speed of light could be turned into a weapon of mass destruction. If you can accelerate an arbitrary mass to a significant fraction of the speed of light, you can fly it toward the Earth, starting from deep space, and do unlimited damage on impact. This fact was well expressed by the science fiction author Larry Niven in 1970 as “The Kzinti Lesson” from his novel Ringworld: “A reaction drive’s efficiency as a weapon is in direct proportion to its efficiency as a drive.” As our civilization continues to increase in technological capability, our capacity for self-destruction will increase as well, and the ethical application of technology is always a relevant consideration. This should not preclude us from someday exploring the stars.
Establishing human outposts across the solar system is now widely accepted as an insurance policy for our species’ survival. Yet ever-emerging and ubiquitous technological advances, such as advanced AI, suggest that planetary separation alone may prove insufficient. Although interstellar travel capability could be weaponized (though inefficiently, as discussed above), mastering interstellar travel offers an unmatched safeguard: a buffer measured in light-years, placing branches of humanity far beyond the reach of any solar-system-based existential threat.
A civilization even slightly more advanced than ours (on the timescale of our galaxy) would be able to detect the passive transmissions from our civilization (radio, television, etc.), and the wavefront of these signals will always be farther into our galaxy than any mission.
This solution to the Fermi Paradox, namely civilizations hide from each other, was suggested by Brin in 1983 and explored in science fiction earlier than that, but has recently received renewed attention due to Liu Cixin’s novel The Dark Forest.
Our biosphere has been signaling its presence for billions of years via biosignatures in our atmosphere that are detectable spectroscopically. As a technological civilization, we have been emitting high-power electromagnetic transmissions (radio and television signals) for more than a century. These signals would be detectable by another civilization with our present level of technology from a distance on the order of 4 light years. Intentionally directed signals could be detected by another Earth-level civilization halfway across our galaxy (e.g., Arecibo-to-Arecibo communication). A more advanced civilization, and any technological civilization is almost certainly more advanced than ours, would undoubtedly be able to detect our transmissions from farther away. Thus, we have already announced, loudly, our presence in the Dark Forest; launching interstellar spacecraft is unlikely to add to our visibility. Drawing from science fiction tropes, it might be argued that acquiring interstellar travel capability is our ticket into the club of advanced civilizations.