Doctors who devise treatment plans for cancer patients face two significant obstacles: time and data. Analyzing a patient’s genome takes time—and so does reviewing medical literature, available clinical trials, and other sources of data. Meanwhile, the cancer may be metastasizing. But powerful new computing platforms could help physicians develop treatment plans much more quickly, with potentially life-saving consequences.
In a recent study, researchers at the New York Genome Center (NYGC), in collaboration with IBM, used a genome-analyzing version of IBM’s famous supercomputer “Watson” to develop a treatment plan for a 76-year-old brain cancer patient in just 10 minutes. By comparison, a human team took “160 person hours” to come up with treatment recommendations.
Watson’s natural-language-processing abilities allowed it to review millions of journal articles in the medical literature, along with other data sources, according to IEEE Spectrum. The supercomputer also scanned the patient’s entire genome for possible mutations: Typically, doctors review only a subset of the genes known to play a role in cancer. The study, published in Neurology Genetics, concluded that scanning a patient’s entire genome provided useful information to doctors by identifying mutations overlooked in smaller “panel” tests.
The researchers noted, though, that Watson’s treatment plan, while faster, might not have been better than the human one. The human clinicians at NYGC were able to identify a relevant clinical trial that Watson missed. But the study authors believe that cognitive computing could help deal with the massive amounts of data needed to effectively treat cancer patients.
“In my view, having doctors cope with the avalanche of data that is here today, and will get bigger tomorrow, is not a viable option,” Dr. Robert Darnell, director of the NYGC, told IEEE Spectrum. “Time is a key variable for patients, and machine learning and natural-language-processing tools offer the possibility of adding something qualitatively different than what is currently available.”
The idea of using a nuclear fission reaction to power a rocket engine has intrigued scientists since the dawn of the “Atomic Age” in the 1940s. Now that concept may become a reality.
NASA’s Marshall Space Flight Center in Huntsville, Ala., recently signed a contract with BWXT Nuclear Energy of Lynchburg, Va., to develop new concepts for nuclear thermal propulsion. A nuclear thermal rocket engine, with its higher exhaust velocities and greater propulsion efficiency, would be ideal for sending large payloads deep into the solar system. With that goal in mind, the U.S. government funded atomic engine test programs from 1955 to 1972 but abandoned them when plans for a crewed Mars mission fizzled.
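The payoff of higher exhaust velocity can be sketched with the Tsiolkovsky rocket equation, which relates a rocket’s achievable change in velocity to its engine’s specific impulse and its propellant mass ratio. The figures below (roughly 450 seconds for a chemical hydrogen/oxygen engine, roughly 900 seconds for a nuclear thermal one, and a mass ratio of 4) are generic textbook-style assumptions for illustration, not numbers from the NASA contract:

```python
import math

G0 = 9.81  # standard gravity, m/s^2


def delta_v(isp_seconds: float, mass_ratio: float) -> float:
    """Tsiolkovsky rocket equation: delta-v (m/s) from specific
    impulse (s) and the ratio of initial to final (dry) mass."""
    return isp_seconds * G0 * math.log(mass_ratio)


# Assumed round-number specific impulses for comparison:
chemical = delta_v(450, mass_ratio=4.0)  # chemical hydrogen/oxygen stage
nuclear = delta_v(900, mass_ratio=4.0)   # nuclear thermal stage

print(f"chemical: {chemical / 1000:.1f} km/s")
print(f"nuclear:  {nuclear / 1000:.1f} km/s")
```

Because delta-v scales linearly with specific impulse, doubling the exhaust efficiency doubles the velocity change available from the same propellant fraction—the margin that makes faster Mars transits plausible.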
But with renewed interest in sending humans to Mars, nuclear thermal engine concepts are back on the table. Such a system could cut the voyage time to Mars from six months to four, reducing astronauts’ exposure times to cosmic radiation. And once on the Martian surface, the nuclear engine could also generate electrical power for the mission.
Under the three-year, $18.8 million contract, BWXT will help NASA assess the feasibility of nuclear thermal engine concepts. The company will also evaluate a specialized, low-enriched uranium fuel, which is considered safer than highly enriched uranium and has fewer security requirements, according to tech website New Atlas. —M.C.