Building a spacecraft is not a job for mass production. Aerospace workers painstakingly assemble each spacecraft one at a time, traditionally following thousand-plus-page technical manuals. But Lockheed Martin—the prime contractor building NASA’s next generation Orion spacecraft—is ditching the paper manuals and equipping its technicians with augmented reality (AR) headsets.
The Microsoft HoloLens headsets allow workers to view their section of the spacecraft overlaid with holographic models based on the engineering design drawings. The models display parts and labels right on top of the partially assembled spacecraft, including detailed instructions for tasks such as torquing bolts, positioned directly over the relevant holes.
Technicians have embraced the new technology, but the current generation of AR headsets is still too bulky to wear for more than about three hours at a time.
“At the start of the day, I put on the device to get accustomed to what we will be doing in the morning,” spacecraft technician Decker Jory told MIT Technology Review. Jory and his team take the headsets off when they are ready to start drilling.
Lockheed expanded its use of augmented reality after tests showed that technicians needed much less time to familiarize themselves with new tasks and to correctly execute processes such as drilling holes and twisting fasteners.
Indoor urban farming is, you could say, a growing trend. Indoor farm startups want to provide pesticide-free, locally grown vegetables directly to urban stores and restaurants that otherwise would buy produce that may have been shipped thousands of miles. And one California startup is taking indoor farming a step further: The company is offering a completely autonomous farm with no human workers.
In October, Iron Ox opened its first operational indoor robotic farm in an 8,000-square-foot hydroponic facility in San Carlos, Calif. The company hopes to grow about 26,000 heads of leafy greens each year without soil, a production rate typical of an outdoor farm five times bigger, according to MIT Technology Review.
“We designed the entire process, from the beginning, around robotics,” Iron Ox co-founder and CEO Brandon Alexander told Fast Company. “It required us pretty much going back to the drawing board to see what we could do if robots were in the loop.”
At the farm, robotic arms plant the crops, add nutrients, and transplant the plants to larger containers as they grow—maximizing health and yield—and, according to Alexander, will eventually harvest and package the greens for market. Another mobile robot autonomously navigates the room carrying the 800-pound trays that hold the crops. An artificial intelligence system nicknamed “The Brain” controls the entire operation, which still requires some minimal human involvement until the farm is fully automated.
Alexander plans to sell initially to restaurants. “The next step,” he told Fast Company, “is to be working with chefs and say, ‘Hey, we’re your neighborhood robotic farm,’ and we want to supply probably the freshest produce they’ll ever have access to.”
Iron Ox hopes soon to begin working with grocery stores as well as restaurants, and next year plans to expand to other locations throughout the country.
One agricultural analyst suggested to Technology Review that, while the large investment needed for robotic farming might leave smaller family-owned farms behind, automation is needed across the industry to solve long-standing labor shortages.
Mental health clinicians typically use a patient’s answers to specific questions about lifestyle, mood, or past mental illness to diagnose depression. But in a September conference paper, researchers at the Massachusetts Institute of Technology described a new artificial intelligence model they say can predict whether a person is depressed, based solely on raw text and audio from patient interviews, regardless of the topic of conversation.
“The first hints we have that a person is happy, excited, sad, or has some serious cognitive condition, such as depression, is through their speech,” said study co-author Tuka Alhanai, a researcher in MIT’s Computer Science and Artificial Intelligence Laboratory, in a press release.
The researchers trained the artificial intelligence model on a series of 142 audio, text, and video interviews of patients, not all of whom were depressed. The model gradually learned to associate certain speech patterns with depression. Its key innovation, according to the researchers, is the ability to detect patterns associated with depression in new individuals without any other diagnostic information.
“The model sees sequences of words or speaking style, and determines that these patterns are more likely to be seen in people who are depressed or not depressed,” Alhanai said. “Then, if it sees the same sequences in new subjects, it can predict if they’re depressed too.”
The MIT scientists hope technology like theirs could lead to computer apps that help people monitor their own mental health. But they believe it could also be effective in helping doctors identify mental distress during regular conversations with patients.
“Every patient will talk differently, and if the model sees changes maybe it will be a flag to the doctors,” said co-author James Glass, a senior research scientist at MIT.