By msmash from Slashdot's what's-happening department
From a report: Instacart shoppers and drivers -- the people who gather your groceries and deliver them to you after you order via the Instacart app -- are on strike. While independent contractors can't technically strike, via a Facebook group some of the company's thousands of workers have organized a "no delivery day" in the hopes of getting higher wages, the San Francisco Chronicle reports. The strike is only taking place in a few of the 154 cities nationwide that Instacart operates in. The action may be small, but the grievances are big. While Instacart, the 5-year-old San Francisco startup, is valued at $3.4 billion, it allegedly pays its workers as little as $1 per order. Ars Technica has a great breakdown of all the issues surrounding how Instacart workers get paid, and it's complex, with three different income streams coming together Voltron-like to form a wage. The result, though, is that some shoppers are being paid less than the federal minimum wage, like a worker in Jackson, Mississippi, who put in a 19-hour week that paid out $37.75 (roughly $2/hour). That's far below the $14/hour wage that Ars Technica says Instacart is targeting.
By msmash from Slashdot's it's-here department
Pete Warden, engineer and CTO of Jetpac, shares his view on how deep learning is already starting to change how some programming is done. From a blog post, shared by a reader last week: The pattern is that there's an existing software project doing data processing using explicit programming logic, and the team charged with maintaining it find they can replace it with a deep-learning-based solution. I can only point to examples within Alphabet that we've made public, like upgrading search ranking, data center energy usage, language translation, and solving Go, but these aren't rare exceptions internally. What I see is that almost any data processing system with non-trivial logic can be improved significantly by applying modern machine learning. This might sound less than dramatic when put in those terms, but it's a radical change in how we build software. Instead of writing and maintaining intricate, layered tangles of logic, the developer has to become a teacher, a curator of training data and an analyst of results. This is very, very different than the programming I was taught in school, but what gets me most excited is that it should be far more accessible than traditional coding, once the tooling catches up. The essence of the process is providing a lot of examples of inputs, and what you expect for the outputs. This doesn't require the same technical skills as traditional programming, but it does need a deep knowledge of the problem domain. That means motivated users of the software will be able to play much more of a direct role in building it than has ever been possible. In essence, the users are writing their own user stories and feeding them into the machinery to build what they want.
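The "teacher, not programmer" workflow Warden describes boils down to supplying labeled input/output examples and letting a learner infer the logic. A toy sketch of that shift (the data and the trivial one-dimensional threshold learner here are my own illustration, standing in for a real deep-learning model):

```python
# Traditional approach: a programmer hard-codes the decision logic.
def explicit_rule(x):
    return 1 if x >= 5.0 else 0

# "Teacher" approach: infer the decision boundary from labeled
# (input, expected output) examples instead of writing it by hand.
def learn_threshold(examples):
    positives = [x for x, label in examples if label == 1]
    negatives = [x for x, label in examples if label == 0]
    # Place the boundary midway between the two classes.
    return (min(positives) + max(negatives)) / 2.0

# Hypothetical training data: inputs paired with expected outputs.
training_data = [(1.0, 0), (2.5, 0), (4.0, 0), (6.0, 1), (7.5, 1), (9.0, 1)]
threshold = learn_threshold(training_data)

def learned_rule(x):
    return 1 if x >= threshold else 0
```

The point is the division of labor: the training data, not the if-statement, encodes the domain knowledge, which is why Warden argues motivated users could contribute examples directly.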
By msmash from Slashdot's secret-sauce department
Christopher Mims, writing for the Wall Street Journal: A funny thing is happening to the most basic building blocks of nearly all our devices. Microchips, which are usually thin and flat, are being stacked like pancakes (Editor's note: the link could be paywalled). Chip designers -- now playing with depth, not just length and width -- are discovering a variety of unexpected dividends in performance, power consumption and capabilities. Without this technology, the Apple Watch wouldn't be possible. Nor would the most advanced solid-state memory from Samsung, artificial-intelligence systems from Nvidia and Google, or Sony's crazy-fast next-gen camera. Think of this 3-D stacking as urban planning. Without it, you have sprawl -- microchips spread across circuit boards, getting farther and farther apart as more components are needed. But once you start stacking chips, you get a silicon cityscape, with everything in closer proximity. The advantage is simple physics: When electrons have to travel long distances through copper wires, it takes more power, produces heat and reduces bandwidth. Stacked chips are more efficient, run cooler and communicate across much shorter interconnections at lightning speed, says Greg Yeric, director of future silicon technology for ARM Research, part of microchip design firm ARM.
Spam Is Back
Posted by News Fetcher on November 20 '17 at 09:42 AM
By msmash from Slashdot's closer-look department
Jon Christian, writing for The Outline: For a while, spam -- unsolicited bulk messages sent for commercial or fraudulent purposes -- seemed to be fading away. The 2003 CAN-SPAM Act mandated unsubscribe links in email marketing campaigns and criminalized attempts to hide the sender's identity, while sophisticated filters on what were then cutting-edge email providers like Gmail buried unwanted messages in out-of-sight spam folders. In 2004, Microsoft co-founder Bill Gates told a crowd at the World Economic Forum that "two years from now, spam will be solved." In 2011, cybersecurity reporter Brian Krebs noted that increasingly tech savvy law enforcement efforts were shutting down major spam operators -- including SpamIt.com, alleged to be a major hub in a Russian digital criminal organization that was responsible for an estimated fifth of the world's spam. These efforts meant that the proportion of all emails that are spam has slowly fallen to a low of about 50 percent in recent years, according to Symantec research. But it's 2017, and spam has clawed itself back from the grave. It shows up on social media and dating sites as bots hoping to lure you into downloading malware or clicking an affiliate link. It creeps onto your phone as text messages and robocalls that ring you five times a day about luxury cruises and fictitious tax bills. Networks associated with the buzzy new cryptocurrency system Ethereum have been plagued with spam. Facebook recently fought a six-month battle against a spam operation that was administering fake accounts in Bangladesh, Indonesia, Saudi Arabia, and other countries. Last year, a Chicago resident sued the Trump campaign for allegedly sending unsolicited text message spam; this past November, ZDNet reported that voters were being inundated with political text messages they never signed up for. Apps can be horrid spam vectors, too. 
Repeated mass data breaches that include contact information, such as the Yahoo breach in which 3 billion user accounts were exposed, surely haven't helped. Meanwhile, you, me, and everyone we know is being plagued by robocalls.
By msmash from Slashdot's what-Facebook-really-thinks department
schwit1 shares an op-ed in the New York Times by Sandy Parakilas, a former operations manager on the platform team at Facebook: Sandy Parakilas led Facebook's efforts to fix privacy problems on its developer platform in advance of its 2012 initial public offering. What I saw from the inside was a company that prioritized data collection from its users over protecting them from abuse. As the world contemplates what to do about Facebook in the wake of its role in Russia's election meddling, it must consider this history. Lawmakers shouldn't allow Facebook to regulate itself. Because it won't (Editor's note: the link could be paywalled; alternative source). Facebook knows what you look like, your location, who your friends are, your interests, if you're in a relationship or not, and what other pages you look at on the web. This data allows advertisers to target the more than one billion Facebook visitors a day. It's no wonder the company has ballooned in size to a $500 billion behemoth in the five years since its I.P.O. The more data it has on offer, the more value it creates for advertisers. That means it has no incentive to police the collection or use of that data -- except when negative press or regulators are involved. Facebook is free to do almost whatever it wants with your personal information, and has no reason to put safeguards in place. The company just wanted negative stories to stop. It didn't really care how the data was used. Facebook took the same approach to this investigation as the one I observed during my tenure: react only when the press or regulators make something an issue, and avoid any changes that would hurt the business of collecting and selling data. This makes for a dangerous mix: a company that reaches most of the country every day and has the most detailed set of personal data ever assembled, but has no incentive to prevent abuse. Facebook needs to be regulated more tightly, or broken up so that no single entity controls all of its data. 
The company won't protect us by itself, and nothing less than our democracy is at stake.
By EditorDavid from Slashdot's neurodiversity-training department
James Damore "wants you to know he isn't using autism as an excuse," reports a Silicon Valley newspaper, commenting on the fired Google engineer's new interview with the Guardian. But they also note that "he says being on the spectrum means he 'sees things differently'," and the weekend editor at the entertainment and "geek culture" site The Mary Sue sees a problem in the way that interview was framed.
It's the author of this Guardian article, not James Damore himself, who makes the harmful suggestion that Damore's infamous Google memo and subsequent doubling-down are somehow caused by his autism... It frames autism as some sort of basic decency deficiency, rather than a neurological condition shared by millions of people... This whole article is peppered with weird suggestions like this, suggestions which detract from an otherwise interesting piece... All these weird suggestions that autism and misogyny/bigotry are somehow tied (as if autistic feminists didn't exist) do unfortunately detract from one of the article's great points.
Having worked at a number of companies large and small, I can at least anecdotally confirm that their diversity training rarely includes a discussion of neurodiversity, and when it does, it's not particularly empathetic or helpful... Many corporate cultures are plainly designed for neurotypical extroverts and no one else -- and that should change. I really do think Lewis meant well in pointing that out. But the other thing that should change? The way the media scapegoats autism as a source of anti-social behavior.
By EditorDavid from Slashdot's earth-shaking-developments department
"Scientists say the number of severe quakes is likely to rise strongly next year because of a periodic slowing of the Earth's rotation," reports the Guardian. "They believe variations in the speed of Earth's rotation could trigger intense seismic activity, particularly in heavily populated tropical regions. Although such fluctuations in rotation are small -- changing the length of the day by a millisecond -- they could still be implicated in the release of vast amounts of underground energy, it is argued."
The theory goes that the slowdown creates a shift in the shape of the Earth's solid iron and nickel "inner core" which, in turn, impacts the liquid outer core on which the tectonic plates that form the Earth's crust rest. The impact is greater on the tectonic plates near some of the Earth's most populous regions along the Equator, home to about a billion people. Scientists from the University of Colorado looked at all earthquakes registering 7 and up on the Richter scale since the turn of the 20th century. In this timeframe, the researchers discovered five periods of significantly greater seismic activity.
The seismic activity follows a five-year period of slowing in the Earth's rotation. "This link is particularly important because Earth's rotation began one of its periodic slowdowns more than four years ago," according to the article.
"The Earth is offering us a five-year heads-up on future earthquakes," says one of the researchers, adding "The inference is clear. Next year we should see a significant increase in numbers of severe earthquakes."
By EditorDavid from Slashdot's brains-of-billionaires department
An anonymous reader quotes Yahoo Finance:
The billionaire media mogul behind such popular sites as Expedia, Match.com and HomeAdvisor has a one-word forecast for traditional media conglomerates concerned about being replaced by tech giants: serfdom. "They, like everyone else, are kind of going to be serfs on the land of the large tech companies," IAC chairman Barry Diller said... That's because Google and Facebook not only have such massive user bases but also dominate online advertising. "Google and Facebook are consolidating," Diller said. "They are the only mass advertising mediums we have..." He expects Facebook, Google and maybe Amazon to face government regulation, simply because of their immense size. "At a certain point in size, you must," he said. "It's inevitable."
He did, however, outline one positive for Big Tech getting so gargantuan. Big Telecom no longer has the economic leverage to roll back today's net-neutrality norms, in which internet providers don't try to charge sites extra for access to their subscribers. "I think it's hard to overturn practically," he said. "It is the accepted system."
Even if the U.S. government takes moves to fight net neutrality, Diller told CNBC that "I think it is over... It is [the] practice of the world... You're still going to be able to push a button and publish to the world, without anybody in between asking you for tribute. I think that is now just the way things are done. I don't think it can be violated no matter what laws are back."
By EditorDavid from Slashdot's old-McDonald-had-a-farm department
"This is one of the first clear-cut genetic mutations in human beings that acts upon aging and aging-related disease," Dr. Douglas Vaughan, a medical researcher at Northwestern University, told Newsweek. schwit1 quotes Science Alert:
As far as we know, it looks like the only community in the world known to harbour it is an Old Order Amish community living in Indiana... Vaughan's team tested 177 people from the Amish community of Berne, Indiana, and found 43 people with one mutated SERPINE1 gene copy. Compared to the general Amish population, these 43 people had a 10 percent longer lifespan, and 10 percent longer telomeres (the DNA-protecting structures at the ends of our chromosomes that unravel when the cells reach the end of their lifespans). They also showed lower incidence of diabetes and lower fasting insulin levels. On top of that, the study showed a small indication of lower blood pressure and potentially more flexible blood vessels.
"For the first time we are seeing a molecular marker of aging (telomere length), a metabolic marker of aging (fasting insulin levels) and a cardiovascular marker of aging (blood pressure and blood vessel stiffness) all tracking in the same direction in that these individuals were generally protected from age-related changes," said Vaughan. These people also had 50 percent lower PAI-1 levels than average. It's not known exactly how PAI-1 contributes to aging, but it does play a role in a process called cellular senescence. This is when cells are no longer able to replicate, so they just go dormant. This contributes to the effects of aging.
By EditorDavid from Slashdot's seeing-the-CO2 department
Countries are scrambling to limit the rise in the earth's temperature to just two degrees by the end of this century. But Slashdot reader dryriver shares an article titled "What They Don't Tell You About Climate Change."
No, it is not that Climate Change is a hoax or that the climate science gets it all wrong and Climate Change isn't happening. According to the Economist, it is rather that "Fully 101 of the 116 models the Intergovernmental Panel on Climate Change uses to chart what lies ahead assume that carbon will be taken out of the air in order for the world to have a good chance of meeting the 2C target."
In other words, reducing carbon emissions around the world, creating clean energy from wind farms, driving electrical cars and so forth is not going to suffice to meet agreed upon climate targets at all. Negative emissions are needed. The world is going to overshoot the "maximum 2 degrees of warming" target completely unless someone figures out how to suck as much as 810 Billion Tonnes of carbon out of Earth's atmosphere by 2100 using some kind of industrial scale process that currently does not exist.
That breaks down to 1,785,742,000,000,000 pounds of CO2, "as much as the world's economy produces in 20 years," according to the Economist.
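A quick sanity check of the conversion quoted above, assuming it is a straight metric-tonnes-to-pounds conversion at roughly 2,204.62 pounds per tonne:

```python
# Convert the 810 billion tonnes cited above into pounds.
TONNES = 810e9                   # tonnes to be removed by 2100
LB_PER_TONNE = 2204.62           # pounds per metric tonne
pounds = TONNES * LB_PER_TONNE   # ~1.7857e15 lb, matching the quoted figure
```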
"Putting in place carbon-removal schemes of this magnitude would be an epic endeavour even if tried-and-tested techniques existed. They do not."
By EditorDavid from Slashdot's meetings-with-meltdowns department
An anonymous reader quotes Gizmodo:
Earlier this year, remotely piloted robots transmitted what officials believe was a direct view of melted radioactive fuel inside Fukushima Daiichi Nuclear Power Plant's destroyed reactors [YouTube] -- a major discovery, but one that took a long and painful six years to achieve... Japanese officials are now hoping that they can convince a skeptical public that the worst of the disaster is over, the New York Times reported, but it's not clear whether it's too late despite the deployment of 7,000 workers and massive resources to return the region to something approaching normal. Per the Times, officials admit the recovery plan -- involving the complete destruction of the plant, rather than simply building a concrete sarcophagus around it as the Russians did in Chernobyl -- will take decades and tens of billions of dollars. Currently, Tepco plans to begin removing waste from one of the three contaminated reactors at the plant by 2021, "though they have yet to choose which one"... Currently, radiation levels are so high in the ruined facility that it fries robots sent in within a matter of hours, which will necessitate developing a new generation of droids with even higher radiation tolerances.
Friday a group of Japanese businesses and doctors sued General Electric on behalf of 150,000 Japanese citizens, saying its designs for the Fukushima reactors were reckless and negligent.
By EditorDavid from Slashdot's battle-of-the-benchmarks department
Firefox Quantum was the winner here, with a score of 491 (from an average of five runs, with the highest and lowest results tossed out) to Chrome's 460 -- but that wasn't quite the whole story. Whereas Firefox performed noticeably better on the Organize Album and Explore DNA Sequencing workloads, Chrome proved more adept at Photo Enhancement and Local Notes, demonstrating that the two browsers have different strengths...
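The scoring method described above (an average of five runs with the highest and lowest results tossed out) is a trimmed mean; a minimal sketch, with the function name my own:

```python
def trimmed_mean(runs):
    """Average a list of benchmark runs after discarding the single
    highest and single lowest result, as in the testing described above."""
    ordered = sorted(runs)
    kept = ordered[1:-1]          # drop one low and one high outlier
    return sum(kept) / len(kept)
```

Dropping the extremes this way keeps one unusually slow or fast run from skewing a small sample of five.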
In a series of RAM-usage tests, Chrome's average score showed it used "marginally" less memory, though the average can be misleading. "In two of our three tests, Firefox did finish leaner, but in no case did it live up to Mozilla's claim that Quantum consumes 'roughly 30 percent less RAM than Chrome,'" reports Laptop Mag.
Both browsers launched within 0.302 seconds, and the article concludes that "no matter which browser you choose, you're getting one that's decently fast and capable when both handle all of the content you're likely to encounter during your regular surfing sessions."
By EditorDavid from Slashdot's as-told-by-a-product-manager department
mikeatTB writes: Many Slashdotters weighed in on Steven A. Lowe's post, "Is Project Management Killing Good Products, Teams and Software?", where he slammed project management and called for product-centrism. Many commenters pushed back, but one PM, Yvette Schmitter, has fired back with a scathing response post, noting: "As a project manager, I'm saddened to see that project management and project managers are getting a bad rap from both ends of the spectrum. Business tends not to see the value in them, and developers tend to believe their own 'creativity' is being stymied by them. Let's set the record straight: Project management is a prized methodology for delivering on leadership's expectations.
"The success of the methodology depends on the quality of the specific project manager..." she continues. "If the project is being managed correctly by the project manager/scrum master, that euphoric state that developers want to get to can be achieved, along with the project objectives -- all within the prescribed budget and timeline. Denouncing an entire practice based on what appears to be a limited, misaligned application of the correct methodology does not make all of project management and all project managers bad."
How do Slashdot readers feel about project management for software teams?