By msmash from Slashdot's how-about-that department
A startup founded in Palo Alto, California, by a trio of doctors, including the former director of the US National Institute of Mental Health, is trying to prove that our obsession with the technology in our pockets can help treat some of today's most intractable medical problems: depression, schizophrenia, bipolar disorder, post-traumatic stress disorder, and substance abuse. MIT Technology Review: Mindstrong Health is using a smartphone app to collect measures of people's cognition and emotional health as indicated by how they use their phones. Once a patient installs Mindstrong's app, it monitors things like the way the person types, taps, and scrolls while using other apps. This data is encrypted and analyzed remotely using machine learning, and the results are shared with the patient and the patient's medical provider. The seemingly mundane minutiae of how you interact with your phone offer surprisingly important clues to your mental health, according to Mindstrong's research -- revealing, for example, a relapse of depression. With details gleaned from the app, Mindstrong says, a patient's doctor or other care manager gets an alert when something may be amiss and can then check in with the patient by sending a message through the app (patients, too, can use it to message their care provider).
By msmash from Slashdot's marching-together department
Rival semiconductor giants ARM and Intel have agreed to work together to manage networks of connected devices from both firms, clearing a major stumbling block to market growth of the so-called Internet of Things (IoT). From a report: Britain's ARM, a unit of Japan's Softbank, said on Monday it had struck a strategic partnership with Intel to use common standards developed by Intel for managing IoT devices, connections and data. The IoT involves connecting simple chips that detect distance, motion, temperature, pressure and images to be used in an ever wider range of electronics such as lights, parking meters or refrigerators. Some of the world's dumbest electronic devices get smarter by becoming connected into cloud networks, but also harder to protect. ARM's agreement to adopt Intel standards for securely managing such networks marks a breakthrough that promises to drive the spread of IoT across many industries, the two companies said.
By msmash from Slashdot's how-about-that department
An anonymous reader shares a Wired report: When Dan Riconda graduated with a master's degree in genetic counseling from Sarah Lawrence College in 1988, the Human Genome Project was in its very first year, DNA evidence was just beginning to enter the courts, and genetic health tests weren't yet on the market. He found one of the few jobs doing fetal diagnostics for rare diseases, which often meant helping young families through the worst time in their lives. What a difference 30 years makes. Today, with precision medicine going mainstream and an explosion of apps piping genetic insights to your phone from just a few teaspoons of spit, millions of Americans are having their DNA decoded every year. That deluge of data means that genetic counselors -- the specialized medical professionals trained to help patients interpret genetic test results -- are in higher demand than ever. With two to three job openings for every new genetic counseling graduate, the profession is facing a national workforce shortage. [...] Pharmaceutical and lab testing firms are routinely hiring genetic counselors to make sure new screening technologies for these targeted drugs are developed in an ethical way. According to a 2018 survey conducted by the National Society of Genetic Counselors, a quarter of the workforce now works in one of these non-patient-facing jobs. A smaller study, published in August, found that one-third of genetic counselors had changed jobs in the past two years, nearly all of them from a hospital setting to a laboratory one.
By msmash from Slashdot's for-what-it-is-worth department
Penelope Wang, writing for Consumer Reports: If you are investing in stocks, bonds, or mutual funds, you have a wide range of options to help manage your portfolio -- everything from traditional brokerages to mutual fund companies to online financial firms. But as consumers search for an investment company, many pay little attention to the fees they're being charged, according to a just-released Consumer Reports survey of more than 46,000 CR members. Four out of 10 surveyed said they weren't sure what they paid in fees. And of those who knew the costs, only 60 percent rated their investment company in our survey as Excellent or Very Good on the amount charged. "Hidden and confusing fees are proliferating across the marketplace, making it hard for consumers to know what they're getting for their money, and to comparison shop across providers," says Anna Laitin, director of financial policy at Consumers Union, the advocacy division of Consumer Reports. "It is concerning that so many investors don't know how much they are paying in fees and that many of those who do understand the fees don't appear to think they are getting their money's worth," she says.
By msmash from Slashdot's tussle-continues department
Two U.S. senators have called on Indian Prime Minister Narendra Modi to soften India's stance on data localization, warning that measures requiring it represent "key trade barriers" between the two nations. From a report: In a letter to Modi dated Friday and seen by Reuters, U.S. Senators John Cornyn and Mark Warner -- co-chairs of the Senate's India caucus that comprises over 30 senators -- urged India to instead adopt a "light touch" regulatory framework that would allow data to flow freely across borders. The letter comes as relations between Washington and New Delhi are strained over multiple issues, including an Indo-Russian defense contract, India's new tariffs on electronics and other items, and its moves to buy oil from Iran despite upcoming U.S. sanctions. Global payments companies including Mastercard, Visa and American Express have been lobbying India's finance ministry and the Reserve Bank of India to relax proposed rules that require all payment data on domestic transactions in India be stored inside the country by October 15. The letter is most likely a last-ditch effort after the RBI told officials at top payment firms this week that the central bank would implement, in full, its data localization directive without extending the deadline, or allowing data to be stored both offshore as well as locally -- a practice known as data mirroring. "We see this (data localization) as a fundamental issue to the further development of digital trade and one that is crucial to our economic partnership," the U.S. senators said in the letter that has not been previously reported.
By msmash from Slashdot's minute-details department
Earlier this week, Microsoft announced that it was joining the open-source patent consortium Open Invention Network (OIN). The press release the two shared this week was short on details about how the two organizations intend to work together and what the move means for, for instance, the billions of dollars Microsoft earns each year from its Android patents (since Google is a member of OIN, too). Software Freedom Conservancy (SFC), a non-profit organization that promotes open-source software, has weighed in on the subject: While [this week's] announcement is a step forward, we call on Microsoft to make this just the beginning of their efforts to stop their patent aggression against the software freedom community. The OIN patent non-aggression pact is governed by something called the Linux System Definition. This is the most important component of the OIN non-aggression pact, because it's often surprising what is not included in that Definition, especially when compared with Microsoft's patent aggression activities. Most importantly, the non-aggression pact only applies to the upstream versions of software, including Linux itself. We know that Microsoft has done patent troll shakedowns in the past on Linux products related to the exfat filesystem. While we at Conservancy were successful in getting the code that implements exfat for Linux released under GPL (by Samsung), that code has not been upstreamed into Linux. So, Microsoft has not included any patents they might hold on exfat into the patent non-aggression pact. We now ask Microsoft, as a sign of good faith and to confirm its intention to end all patent aggression against Linux and its users, to submit the exfat code upstream themselves under GPLv2-or-later.
This would provide two important protections to Linux users regarding exfat: (a) it would include any patents that read on exfat as part of OIN's non-aggression pact while Microsoft participates in OIN, and (b) it would provide the various benefits that GPLv2-or-later provides regarding patents, including an implied patent license and those protections provided by GPLv2 (and possibly other GPL protections and assurances as well).
By msmash from Slashdot's this-week-in-history department
With 95% of Americans owning a cellphone, it can feel like we've been calling, texting, and tweeting on the go forever. But the infrastructure supporting our cellphones has actually not been around that long. From a report: While we're now on 4G networks, it was only 35 years ago this week that Ameritech (now part of AT&T) launched 1G, the first commercial cell phone network. That network, called the Advanced Mobile Phone System (AMPS), went online on October 13, 1983, allowing people in the Chicago area to make and receive mobile calls for the first time. Ameritech president Bob Barnett, who made the first call, decided to make the historic moment count by ringing Alexander Graham Bell's grandson. A little more than a year later, the UK's Vodafone hosted its first commercial call on New Year's Day. Israel's Pelephone followed in 1986, and Australia in 1987. Cellphone technology had been around for quite a while before that. AMPS was in development for around 15 years, and engineers made the first mobile call on a prototype network a decade before the first commercial network call. It took that long to troubleshoot the various hardware, software, and radio frequency issues associated with setting up a fully functional commercial network.
The Magic Leap Con
Posted by News Fetcher on October 14 '18 at 11:33 AM
By msmash from Slashdot's topsy-turvy-world department
Reader merbs shares a report about Magic Leap, a US-based startup valued at north of $6 billion and which counts Google, Alibaba, Warner Bros, AT&T, and several top Silicon Valley venture capital firms as its investors. The company, which held its first developer conference this week, announced that it is making its $2,295 AR headset available in more states in the United States. Journalist Brian Merchant attended the conference and shares the other part of the story. From a story: After spending two days at LEAPcon, I feel it is my duty -- in the name of instilling a modicum of sanity into an age where a company that has never actually sold a product to a consumer can be worth a billion dollars more than the entire GDP of Fiji -- to inform you that it is not. Magic Leap clearly wants its public launch to appear huge -- who wouldn't? In decidedly Magic Leapian fashion, the company covered an entire side of LA Mart, the 12-story building in downtown Los Angeles where the conference was to be held, with a psychedelic image of an astronaut and the tagline 'Free Your Mind'. In similarly Leapian fashion, the actual demos and keynote took place in the basement, where a wrong turn could land you in shipping and receiving and cell reception was nil. [...] You know that weird sensation when it feels like everyone around you is participating in some mild mass hallucination, and you missed the dosing? The old 'what am I possibly missing here' phenomenon? That's how I felt at LEAP a lot of the time, amidst crowds of people dropping buzzwords and acronym soup at light speed, and then again while I was reading reviews of the device afterwards -- somehow, despite years of failing to deliver anything of substance, lots of the press is still in Leap's thrall. Demo after demo, I felt like, sure, that was kind of neat. The games were charming, if often glitchy and simplistic, and yes, it might be helpful for architects to be able to blow up and walk around their designs. 
I liked the developers, who were smart and funny. Some of the graphics and interactions were very nicely rendered. But there wasn't anything -- besides a single demo, which I'll get to in a second -- that I'd feel compelled to ever do again. It felt genuinely crazy to me that people could get so excited about this, especially after years of decent VR and the HoloLens, without having a distinct monetary incentive to do so. As many have noted, the hardware is still extremely limiting. The technology underpinning these experiences seems genuinely advanced, and if it were not for a multi-year blitzkrieg marketing campaign insisting a reality where pixels blend seamlessly with IRL physics was imminent, it might have felt truly impressive. (Whether or not it's advanced enough to eventually give rise to Leap's prior promises is an entirely open question at this point.) For now, the field of vision is fairly small and unwieldy, so images are constantly vanishing from view as you look around. If you get too close to them, objects will get chopped up or move awkwardly. And if you do get a good view, some objects appear low res and transparent; some looked like cheap holograms from an old sci-fi film. Text was blurry and often doubled up in layers that made it hard to read, and white screens looked harsh -- I loaded Google on the Helio browser and immediately had to shut my eyes. Further reading: Magic Leap is Pushing To Land a Contract With US Army To Build AR Devices For Soldiers To Use On Combat Missions, Documents Reveal.
By msmash from Slashdot's moral-dilemma department
An anonymous reader shares a report: Somewhere in the United States, someone is getting into an Uber en route to a WeWork co-working space. Their dog is with a walker whom they hired through the app Wag. They will eat a lunch delivered by DoorDash, while participating in several chat conversations on Slack. And, for all of it, they have an unlikely benefactor to thank: the Kingdom of Saudi Arabia. Long before the dissident Saudi journalist Jamal Khashoggi vanished, the kingdom had sought influence in the West -- perhaps intended, in part, to make us forget what it is. A medieval theocracy that still beheads by sword, doubling as a modern nation with malls (including a planned mall offering indoor skiing), Saudi Arabia has been called "an ISIS that made it." Remarkably, the country has avoided pariah status in the United States thanks to our thirst for oil, Riyadh's carefully cultivated ties with Washington, its big arms purchases, and the two countries' shared interest in counterterrorism. But lately the Saudis have been growing their circle of American enablers, pouring billions into Silicon Valley technology companies. While an earlier generation of Saudi leaders, like Prince Alwaleed bin Talal, invested billions of dollars in blue-chip companies in the United States, the kingdom's new crown prince, Mohammed bin Salman, has shifted Saudi Arabia's investment attention from Wall Street to Silicon Valley. Saudi Arabia's Public Investment Fund has become one of Silicon Valley's biggest swinging checkbooks, working mostly through a $100 billion fund raised by SoftBank (a Japanese company), which has swashbuckled its way through the technology industry, often taking multibillion-dollar stakes in promising companies. The Public Investment Fund put $45 billion into SoftBank's first Vision Fund, and Bloomberg recently reported that the Saudi fund would invest another $45 billion into SoftBank's second Vision Fund.
SoftBank, with the help of that Saudi money, is now said to be the largest shareholder in Uber. It has also put significant money into a long list of start-ups that includes Wag, DoorDash, WeWork, Plenty, Cruise, Katerra, Nvidia and Slack. As the world fills up car tanks with gas and climate change worsens, Saudi Arabia reaps enormous profits -- and some of that money shows up in the bank accounts of fast-growing companies that love to talk about "making the world a better place."
By msmash from Slashdot's closer-look department
A new study in Frontiers in Psychology examined why people struggle so much to solve statistical problems, particularly why we show a marked preference for complicated solutions over simpler, more intuitive ones. Chalk it up to our resistance to change. From a report: The study concluded that fixed mindsets are to blame: we tend to stick with the familiar methods we learned in school, blinding us to the existence of a simpler solution. Roughly 96 percent of the general population struggles with solving problems relating to statistics and probability. Yet being a well-informed citizen in the 21st century requires us to be able to engage competently with these kinds of tasks, even if we don't encounter them in a professional setting. "As soon as you pick up a newspaper, you're confronted with so many numbers and statistics that you need to interpret correctly," says co-author Patrick Weber, a graduate student in math education at the University of Regensburg in Germany. Most of us fall far short of the mark. Part of the problem is the counterintuitive way in which such problems are typically presented. Meadows presented his evidence in the so-called "natural frequency format" (for example, 1 in 10 people), rather than in terms of a percentage (10 percent of the population). That was a smart decision, since 1-in-10 is a more intuitive, jury-friendly approach. Recent studies have shown that performance rates on many statistical tasks increased from four percent to 24 percent when the problems were presented using the natural frequency format.
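The conversion between the two formats is mechanical, which is what makes the performance gap so striking. A minimal sketch of re-expressing a percentage as a "1 in N" natural frequency (the function name and rounding choice are my own, purely for illustration):

```python
from fractions import Fraction

def natural_frequency(percent, population=100):
    """Re-express a percentage as a '1 in N' style natural frequency.

    E.g. 10 percent -> '1 in 10'. limit_denominator keeps the
    fraction no finer-grained than the reference population.
    """
    frac = Fraction(percent, 100).limit_denominator(population)
    return f"{frac.numerator} in {frac.denominator}"

print(natural_frequency(10))  # 1 in 10
print(natural_frequency(25))  # 1 in 4
```

The study's point is that readers parse "1 in 10 people" far more reliably than "10 percent of the population", even though the two statements carry identical information.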
By FirehoseFavorites from Slashdot's stable-releases department
Long-time Slashdot reader pikester has a friend running a museum "looking to make it more interactive for visitors."
To make this happen, the museum is going to need to have good WiFi connectivity throughout the premises. The good news is that the museum is pretty small. The bad news is that it is located in an old horse barn with many metal walls. I'm hoping to put in a mesh network for him, but most solutions I've seen are pretty bulky. I'm looking for recommendations for a solution that is easily mountable in the building.
Long-time Slashdot reader Spazmania suggests it's "not terribly complicated." After setting access points to the same SSID but different channels (and with the transmit power down), "walk around with a piece of free software such as Wifi Analyzer and tweak the positions and transmit power on the access points until the signal levels look good in wifi analyzer." But are there other solutions? Leave your own best answers in the comments.
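For readers running Linux-based access points, Spazmania's recipe (same SSID on every AP, non-overlapping channels, reduced transmit power) can be sketched as a hostapd configuration. This is only an illustrative fragment; the interface name, SSID, and passphrase are placeholders, and your APs may use a vendor UI instead:

```ini
# hostapd.conf for access point 1 of 3.
# Every AP broadcasts the same SSID so clients roam freely;
# each AP sits on a different non-overlapping 2.4 GHz channel
# (use channel 6 and 11 on the other two APs).
interface=wlan0
ssid=MuseumWiFi
hw_mode=g
channel=1
wpa=2
wpa_key_mgmt=WPA-PSK
rsn_pairwise=CCMP
wpa_passphrase=changeme-placeholder
```

Transmit power is then turned down per-AP (e.g. with `iw dev wlan0 set txpower`) and tuned while walking the floor with a signal-survey app, as Spazmania describes.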
Can you install a wifi mesh network in a barn?
By EditorDavid from Slashdot's animal-tracking-planet department
An anonymous reader quotes New York magazine:
Salmon are just the latest entry in a growing cornucopia of animal faces loaded into databases. For some animals, the biometric data gathered from them is being used to aid in conservation efforts. For others, the resulting AI could help ward off poachers. While partly creepy and partly very cute, monitoring of these animals can both help protect their populations and ensure safe, traceable livestock for developing communities...
U.K. researchers are using online resources like Flickr and Instagram to help build and strengthen a database that will eventually help track global tiger populations in real time. Once collected, the photos are analyzed by everyday people in a free app called Wildsense... The mighty lion is being surveilled too. Conservationists and wildlife teachers are using facial recognition to keep tabs on a database of over 1,000 lions... Wildlife experts are tracking elephants to protect them from encroaching poachers. Using Google's Cloud AutoML Vision machine learning software, the technology will uniquely identify elephants in the wild. According to the Evening Standard, the tech will even send out an alert if it detects poachers in the same frame.
The story of whale facial tracking is one of crowdsourcing success. After struggling to distinguish specific whales from one another on his own, marine biologist Christian Khan uploaded the photos to data-competition site Kaggle and, within four months, data-science company Deepsense was able to detect individual whale faces with 87% accuracy. Since then, detection rates have steadily improved and are helping conservationists track and monitor the struggling aquatic giant.
By EditorDavid from Slashdot's type-righter department
MIT Technology Review recently discussed new attempts to replace the standard 'QWERTY' keyboard layout, including Tap, "a one-handed gadget that fits over your fingers like rubbery brass knuckles and connects wirelessly to your smartphone."
It's supposed to free you from clunky physical keyboards and act as a go-anywhere typing interface. A promotional video shows smiling people wearing Tap and typing with one hand on a leg, on an arm, and even (perhaps jokingly) on some guy's forehead... But when I tried it, the reality of using Tap was neither fun nor funny. Unlike a conventional QWERTY keyboard, Tap required me to think a lot, because I had to tap my fingers in not-very-intuitive combinations to create letters: an A is your thumb, a B is your index finger and pinky, a C is all your fingers except the index.
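The chord alphabet the reviewer describes can be sketched as a simple lookup table. This is a toy illustration built only from the three chords named above; the finger numbering and the `decode` helper are inventions for this sketch, not Tap's actual encoding or API:

```python
# Fingers numbered 0=thumb, 1=index, 2=middle, 3=ring, 4=pinky.
# A chord is the set of fingers tapped simultaneously; frozenset
# makes the chord hashable so it can key the lookup table.
CHORDS = {
    frozenset({0}): "A",           # thumb alone
    frozenset({1, 4}): "B",        # index finger + pinky
    frozenset({0, 2, 3, 4}): "C",  # every finger except the index
}

def decode(tapped_fingers):
    """Return the letter for a chord, or None if the chord is unmapped."""
    return CHORDS.get(frozenset(tapped_fingers))

print(decode({0}))     # A
print(decode({1, 4}))  # B
```

Laying the mapping out this way makes the reviewer's complaint concrete: unlike QWERTY, where each key is one letter, every character requires recalling an arbitrary finger combination.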
The article also acknowledges the Dvorak Simplified Keyboard layout and other alternatives like the one-handed Twiddler keyboard, but argues that "neither managed to dent QWERTY's dominance."
By EditorDavid from Slashdot's pessimistic-predictions department
"Dire as it is, the latest IPCC report is actually too optimistic," writes Slashdot reader Dan Drollette. "It ignores the risk of self-reinforcing climate feedbacks pushing the planet into chaos beyond human control. So says a team of climate experts, including the winner of the 1995 Nobel for his work on depletion of the ozone layer." From their article:
These cascading feedbacks include the loss of the Arctic's sea ice, which could disappear entirely in summer in the next 15 years. The ice serves as a shield, reflecting heat back into the atmosphere, but is increasingly being melted into water that absorbs heat instead. Losing the ice would tremendously increase the Arctic's warming, which is already at least twice the global average rate. This, in turn, would accelerate the collapse of permafrost, releasing its ancient stores of methane, a super climate pollutant 30 times more potent in causing warming than carbon dioxide.
By largely ignoring such feedbacks, the IPCC report fails to adequately warn leaders about the cluster of six similar climate tipping points that could be crossed between today's temperature and an increase to 1.5 degrees -- let alone nearly another dozen tipping points between 1.5 and 2 degrees. These wildcards could very likely push the climate system beyond human ability to control. As the UN Secretary General reminded world leaders last month, "We face an existential threat. Climate change is moving faster than we are. [...] If we do not change course by 2020, we risk missing the point where we can avoid runaway climate change, with disastrous consequences."