By manishs from Slashdot's humans-need-not-apply department
An anonymous reader writes: It is no secret that machines have largely replaced humans in physical labor, and that computers surpass human beings at processing data. But in the future, the development of artificial intelligence may render humans obsolete even in the realm of emotional intelligence (warning: annoying popup adverts), according to Yuval Harari, a renowned professor of history. Harari said: AI today is able to diagnose your personality and emotional state by looking at your face and recognizing tiny muscle movements. It can tell whether you are tired, excited, angry, joyful, in love ... it can tell these things even though AI itself doesn't feel anger or love. In the future, therefore, AI could drive humans out of the job market and make many humans completely useless from an economic perspective, even in areas where human interaction was previously considered crucial. Humans have only two basic abilities -- physical and cognitive. When machines replaced us in physical abilities, we moved on to jobs that require cognitive abilities. ... If AI becomes better than us at that, there is no third field humans can move to.
By manishs from Slashdot's curious-case-of-bitcoin,-and-whoever-created-it department
Australian entrepreneur Craig Wright has put an end to the years-long speculation about the creator of Bitcoin. In interviews with the BBC, The Economist (could be paywalled), and GQ, Wright claimed that he is indeed the person who developed the concepts on which the Bitcoin cryptocurrency is built. According to the BBC, Mr. Wright provided "technical proof to back up his claim using coins known to be owned by Bitcoin's creator." Wright writes in a blog post: [A]fter many years, and having experienced the ebb and flow of life those years have brought, I think I am finally at peace with what he meant. If I sign Craig Wright, it is not the same as if I sign Craig Wright, Satoshi[...] Since those early days, after distancing myself from the public persona that was Satoshi, I have poured every measure of myself into research. I have been silent, but I have not been absent. I have been engaged with an exceptional group and look forward to sharing our remarkable work when they are ready. Satoshi is dead. But this is only the beginning. According to Wright's website, he is a "computer scientist, businessman and inventor" born in Brisbane, Australia, in October 1970. Some have questioned the authenticity and importance of the "technical proof" Wright has provided. Nik Cubrilovic, an Australian former hacker and leading internet security blogger, wrote, "I don't believe for a second Wright is Satoshi. I know two people who worked with Wright, characterized him as crazy and schemer/charlatan." Michele Spagnuolo, an Information Security Engineer at Google, added, "He's not Satoshi. He just reused a signed message (of a Sartre text) by Satoshi with block 9 key as 'proof.'"
By EditorDavid from Slashdot's school-principles department
theodp writes: Last week, Microsoft and some of the biggest names in tech and corporate America threw their weight behind a Change.org petition that urged Congress to fund K-12 Computer Science education. The petition, started by the tech-backed CS Education Coalition (btw, 901 K Street NW is Microsoft's DC HQ) in partnership with tech-backed Code.org, now has 90,000+ supporters. But three years ago, Microsoft backed a very different Change.org petition that called for corporate America to foot the STEM education bill. "While the need to expand high-skilled immigration is immediate," read the letter to Congress, "we also need to expand STEM opportunities in U.S. education. A positive proposal has emerged in Washington to create a national STEM education fund, paid for only by businesses using green cards and visas. This fund will help prepare Americans for 21st-century STEM jobs. The proposal is supported by a broad coalition [PDF] that includes Microsoft, GE, the National Council of La Raza, the National Association of Manufacturers, and the National Science Teachers Association, to name a few." The earlier petition, which wound up with 41,009 supporters, was started by Voices for Innovation, a self-described "Microsoft supported community" that says it's now "proud to support the Computer Science Education Coalition" as part of its efforts to "shape public policies for our 21st century digital economy and society." So, what changed? Well, Mother Jones did warn that what Microsoft promises and what it delivers for education isn't necessarily the same...
By EditorDavid from Slashdot's brain-wars department
An anonymous reader writes: OpenAI, a billion-dollar research non-profit backed by Elon Musk and other Silicon Valley executives, just released a public beta of a new open-source "gym" for computer programmers working on artificial intelligence. "Nothing beats a competitive environment to motivate developers," says Patrick Moorhead, an analyst at Moor Insights & Strategy. "It's like a monster truck rally for AI programmers."
The gym lets developers run tests in a standardized environment and share their results; OpenAI built it to develop algorithms for the non-profit's own research, according to the Christian Science Monitor. "The gym's exercises range from robot simulations to Atari games and are designed to develop reinforcement learning, the type of computer skill needed for motor control and decision-making. 'Long-term, we want this curation to be a community effort rather than something owned by us,' Greg Brockman and John Schulman wrote in an OpenAI blog post. 'We'll necessarily have to figure out the details over time, and we'd love your help in doing so.'"
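The "standardized environment" the gym provides boils down to a common reset/step interface that any agent can be run against. A minimal sketch of that idea, using a hypothetical ToyEnv whose dynamics are invented for illustration (only the reset()/step() method names and the observation/reward/done/info return convention follow OpenAI's published interface):

```python
import random

class ToyEnv:
    """A hypothetical environment exposing the reset()/step() interface
    that OpenAI Gym standardizes. The dynamics are invented for
    illustration: the episode ends after 10 steps, each worth +1 reward."""

    def reset(self):
        self.t = 0
        return self.t  # initial observation

    def step(self, action):
        self.t += 1
        observation = self.t
        reward = 1.0
        done = self.t >= 10      # episode terminates after 10 steps
        info = {}                # diagnostic info, unused here
        return observation, reward, done, info

# The standard agent loop: reset, then step until the episode ends.
env = ToyEnv()
obs = env.reset()
total_reward = 0.0
done = False
while not done:
    action = random.choice([0, 1])  # random policy for illustration
    obs, reward, done, info = env.step(action)
    total_reward += reward
print(total_reward)  # 10.0
```

Because every environment speaks the same interface, the same loop works unchanged whether the environment is a toy like this one, an Atari game, or a robot simulation, which is what makes shared benchmark results meaningful.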
By EditorDavid from Slashdot's out-brief-candle department
HughPickens.com writes: Robinson Meyer writes in The Atlantic that in its annual report on "global catastrophic risk," the Global Challenges Foundation estimates the risk of human extinction due to climate change or accidental nuclear war at 0.1 percent every year. That may sound low, but compounded over a century it comes to a 9.5 percent chance of human extinction within the next hundred years. The report ranks catastrophic climate change and nuclear war far above other potential causes, and for good reason, citing multiple occasions when the world stood on the brink of atomic annihilation. While most of these occurred during the Cold War, another took place during the 1990s, the most peaceful decade in recent memory. The closest call may have come on September 26, 1983, when a bug in the U.S.S.R.'s early-warning system reported that five NATO nuclear missiles had been launched and were bound for Russian targets. The officer watching the system, Stanislav Petrov, had also designed the system, and he decided that any real NATO first strike would involve hundreds of ICBMs. Therefore, he resolved that the computers must be malfunctioning, and did not fire a response.
By manishs from Slashdot's surprising-changes-to-Maps'-cartography department
Google Maps has reduced the number of cities it shows by up to 83% over the past few years, according to Justin O'Beirne. At the same time, Maps has increased the number of roads it shows. O'Beirne, who writes about digital maps, outlines in a blog post the changes Google has made to its mapping and navigation service over the years. The side-by-side screenshot comparisons in his post show that Google has largely abandoned labelling towns and cities in favor of showing as many roads as it can. He has also examined several elements of Maps from a design standpoint and questioned Google's decisions. He writes: If these roads were so important that they deserved to be upgraded in appearance, why weren't they also given shield icons? After all, an unlabeled road is only half as useful as a labeled one. [...] [Comparing Google Maps to a paper map] Even though it's from the early 1960s, the old print map has so much more information than the Google Map. So many more cities. So many more road labels. And the text size is comparable between the two. O'Beirne believes that Google made these changes to better serve mobile users. "Unfortunately, these 'optimizations' only served to exacerbate the longstanding imbalances already in the maps," he writes. "As is often the case with cartography: less isn't more. Less is just less."
By manishs from Slashdot's setting-precedent department
An anonymous reader shares a Guardian report: A professor at Princeton University has published a CV listing his career failures (PDF), in an attempt to "balance the record" and encourage others to keep trying in the face of disappointment. Johannes Haushofer, who is an assistant professor of psychology and public affairs at the university in New Jersey, posted his unusual CV on Twitter last week. The document contains sections titled "Degree programs I did not get into," "Research funding I did not get," and "Paper rejections from academic journals." Haushofer writes: Most of what I try fails, but these failures are often invisible, while the successes are visible. I have noticed that this sometimes gives others the impression that most things work out for me. As a result, they are more likely to attribute their own failures to themselves, rather than the fact that the world is stochastic, applications are crapshoots, and selection committees and referees have bad days. This CV of Failures is an attempt to balance the record and provide some perspective. He later added another section called "Meta-Failures" to his resume, writing, "This darn CV of Failures has received way more attention than my entire body of academic work."
By EditorDavid from Slashdot's not-all-watches-tell-time department
Apple's share of the smartwatch market started declining in 2016, dropping to just 52.4% (down from 63%), according to Business Insider. And following up on Apple's first drop in earnings in over 10 years, Slashdot reader Zanadou shares Gizmodo's latest story about the Apple Watch.
"I stopped wearing it two months ago, and I'm not sure if I'll ever wear it again. That's because it doesn't really do anything that anyone needs, and even when it does, it doesn't always work like it's supposed to. Here are some things I learned over the past year of strapping the screen vibrator to my wrist."
The article describes wanting to try a new form factor, but ending up confused by the watch's two-button interface (where the buttons perform multiple functions). Gizmodo's writer complains that "there's literally no comfortable way to actually use it," and while he did appreciate things like the time-of-sunrise feature and the ability to read text messages on his wrist, most Apple Watch apps "just end up being a shell of the iPhone app." Worst of all, it was difficult to use the watch to actually tell time, since "the screen doesn't always turn on when you raise your wrist like it's supposed to."
By EditorDavid from Slashdot's It's-my-chip-on-a-box department
An anonymous reader writes: "Pretty much any device with a USB port will be able to use advanced neural networks," reports PC Magazine, announcing the new Fathom Neural Compute Stick from chip-maker (and Google supplier) Movidius. "Once it's plugged into a Linux-powered device, it will enable that device to perform neural network functions like language comprehension, image recognition, and pattern detection," and without even requiring an external power supply. Device manufacturers could now move AI-level processing from the cloud down to end users, PC Magazine reports, with one New York computer science professor saying the technology means that "every robot, big and small, can now have state-of-the-art vision capabilities." The article argues that this standalone, ultra-low-power neural network device could spark the creation of a whole new category of next-generation consumer technologies.