By BeauHD from Slashdot's full-steam-ahead department
Last night, the city council in Fort Collins, Colorado, voted to move ahead with a municipal fiber broadband network providing gigabit speeds, two months after the cable industry failed to stop the project. Ars Technica reports: Last night's city council vote came after residents of Fort Collins approved a ballot question that authorized the city to build a broadband network. The ballot question, passed in November, didn't guarantee that the network would be built because city council approval was still required, but that hurdle is now cleared. Residents approved the ballot question despite an anti-municipal broadband lobbying campaign backed by groups funded by Comcast and CenturyLink. The Fort Collins City Council voted 7-0 to approve the broadband-related measures, a city government spokesperson confirmed to Ars today.
While the Federal Communications Commission has voted to eliminate the nation's net neutrality rules, the municipal broadband network will be neutral and without data caps. "The network will deliver a 'net-neutral' competitive unfettered data offering that does not impose caps or usage limits on one use of data over another (i.e., does not limit streaming or charge rates based on type of use)," a new planning document says. "All application providers (data, voice, video, cloud services) are equally able to provide their services, and consumers' access to advanced data opens up the marketplace." The city will also be developing policies to protect consumers' privacy. The city intends to provide gigabit service for $70 a month or less, along with a cheaper Internet tier.
By BeauHD from Slashdot's handy-measurements department
An anonymous reader quotes a report from TechCrunch: The feature is arriving later this month on the iRobot app, making it possible for WiFi-enabled Roombas to create a map of indoor signals. The map exists alongside the existing Clean Map feature, letting users toggle between the two, like they would, say, satellite and standard imagery in Google Maps. The maps themselves won't go into too much detail -- no upload and download speeds like you see on many mobile speed test apps. Instead, the information will show up as decibel readings. Really, it's intended as a handy way of showing off where you might want to toss a range extender, to help get rid of dead spots. All of Roomba's vacuums, save for the lowest-end model, will support the feature. The beta program launches January 23rd and appears to only be available for U.S. users.
By BeauHD from Slashdot's location-services department
chicksdaddy quotes a report from The Security Ledger: Security researchers say that serious security vulnerabilities linger in GPS software by the China-based firm ThinkRace more than two years after the holes were discovered and reported to the firm, The Security Ledger reports. Data including a GPS-enabled device's location, serial number, assigned phone number, and model and type of device can be accessed by any user with access to the GPS service. In some cases, other information is available, including the device's location history going back one week. In some cases, malicious actors could also send commands to the device via SMS, including those used to activate or deactivate geofencing alarm features, such as those used on child-tracking devices.
The vulnerabilities affect hundreds of thousands of connected devices that use the GPS services, from smart watches, to vehicle GPS trackers, fitness trackers, pet trackers and more. At issue are security holes in back-end GPS tracking services that go by names like amber360.com, kiddo-track.com, carzongps.com and tourrun.net, according to Michael Gruhn, an independent security researcher who noted the insecure behavior in a location tracker he acquired and has helped raise awareness of the widespread flaws. Working with researcher Vangelis Stykas, Gruhn discovered scores of seemingly identical GPS services, many of which have little security, allowing low-skill hackers to directly access data on GPS tracking devices.
Spotify Files To Go Public
Posted by News Fetcher on January 03 '18 at 02:12 PM
By BeauHD from Slashdot's initial-public-offering department
According to Bloomberg, Spotify filed to go public on the New York Stock Exchange, "in the highest-profile test yet of a technique that lets companies list shares without raising money through a traditional stock offering." From the report: With steady cash from more than 60 million paying subscribers, the world's largest paid music-streaming service doesn't need more funding. Instead of an initial public offering, it's trying a direct listing, which essentially lets private stakeholders start trading their shares on a public exchange. That avoids underwriting fees and restrictions on stock sales by current owners, and doesn't dilute the holdings of executives and investors. Spotify, which has been valued at about $15 billion, would be the most prominent company by far to attempt a direct listing, a method that until now has been used by small issuers and real estate investment trusts. It would also be a first for the New York Stock Exchange, which has sought permission from the Securities and Exchange Commission to change its rules for the occasion.
By BeauHD from Slashdot's slow-and-steady department
schwit1 shares a report from U.S. News & World Report: In October, Tesla reported that it produced 220 Model 3 vehicles in the third quarter. CEO Elon Musk had previously said the company would produce more than 1,600 Model 3s by September. Loup Ventures analyst Gene Munster isn't the only analyst to doubt Tesla's fourth-quarter Model 3 production. KeyBanc analyst Brad Erickson reduced his fourth-quarter Model 3 production target by two-thirds, cutting it from 15,000 to only 5,000. According to Munster, Tesla investors may need to wait several more quarters for the Model 3 story to play out. "We predict a breakout year for the Model 3 in 2019 which means, until then, other elements like solid Model S and X production numbers, increasing energy deployments like the South Australia installation, and future vehicles (Roadster, Semi, Model Y, and pickup truck) will stoke investor optimism," he says.
schwit1 adds: "Elon Musk promised Tesla would produce 500,000 Model 3 sedans in 2018 and has accepted refundable $1,000 deposits on nearly that many. At current production rates, it will be years before pre-orders are filled. The Model 3's goodwill and good reviews won't matter much if Tesla can't ramp up production, which even bulls like Munster believe is running at least a year late."
By msmash from Slashdot's check-mate department
Last month, a notification that YouTube would no longer be available through Fire TV and Fire TV Stick devices starting Jan. 1 popped up, threatening to leave a huge hole in Amazon's streaming lineup. But just last week, Amazon added the ability to surf the web and get to YouTube via a browser. But does it work? GeekWire thinks so: The result is a simple path to YouTube, circumventing Google's move to pull it from Fire TV. Web browsing probably wasn't a direct response to Amazon's issues with Google, which owns YouTube, but it provides a convenient alternative to keep the service accessible for Fire TV users. The first step is downloading one or both of the web browsers. Opening Firefox leads to a home screen with easy-access tiles to both Google and YouTube. On Silk, the home screen defaults to Bing search. But as I poked around, I noticed that YouTube for TV showed up in my bookmarks even though this was the first time I opened the browser. A YouTube interface optimized for TV, the same one you would see on other streaming devices, pops up on both browsers. To sign in, YouTube prompted me to activate YouTube for TV through a phone or computer. Once that process was complete, YouTube showed the same personalized recommendations as my phone and computer.
By msmash from Slashdot's take-note department
A new study, published on Wednesday, states that drinking alcohol produces a harmful chemical in the body which can lead to permanent genetic damage in the DNA of stem cells, increasing the risk of cancer developing. From a report: The research, using genetically modified mice, provides the most compelling evidence to date that alcohol causes cancer by scrambling the DNA in cells, eventually leading to deadly mutations. During the past decade, there has been mounting evidence of the link between drinking and the risk of certain cancers. "How exactly alcohol causes damage to us is controversial," said Prof Ketan Patel, who led the work at the MRC Laboratory of Molecular Biology in Cambridge. "This paper provides very strong evidence that an alcohol metabolite causes DNA damage [including] to the all-important stem cells that go on to make tissues." The study builds on previous work that had pinpointed a breakdown product of alcohol, called acetaldehyde, as a toxin that can damage the DNA within cells. However, these earlier studies had relied on extremely high concentrations of acetaldehyde and used cells in a dish rather than tracking its effects within the body.
By msmash from Slashdot's road-ahead department
A first-of-its-kind genetic treatment for blindness will cost $850,000, less than the $1 million price tag that had been expected, but still among the most expensive medicines in the world. Several readers have shared an Associated Press report: Spark Therapeutics said Wednesday it decided on the lower price for Luxturna (Lux-turn-a) after hearing concerns from health insurers about their ability to cover the injectable treatment. Consternation over skyrocketing drug prices, especially in the U.S., has led to intense scrutiny from patients, Congress, insurers and hospitals. "We wanted to balance the value and the affordability concerns with a responsible price that would ensure access to patients," said CEO Jeffrey Marrazzo, in an interview with The Associated Press. Luxturna is still significantly more expensive than nearly every other medicine on the global market, including two other gene therapies approved earlier last year in the U.S. Approved last month, Luxturna is the nation's first gene therapy for an inherited disease. It can improve the vision of those with a rare form of blindness that is estimated to affect just a few thousand people in the U.S. Luxturna is an injection -- one for each eye -- that replaces a defective gene in the retina, tissue at the back of the eye that converts light into electric signals that produce vision. The therapy will cost $425,000 per injection.
By msmash from Slashdot's closer-look department
A reader shares a blog post that talks about why Macs running High Sierra 10.13.2 (and other versions near it) refuse to let users uninstall some third-party applications easily. For instance, when users attempt to uninstall BlueStacks, an Android emulator, the Finder shows this warning: "The operation can't be completed because you don't have the necessary permission." The blog post looks into the subject: The moment that we see the word permission, all becomes clear: it's a permissions problem. So the next step is to select the offending item in the Finder, press Command-I to bring up the Get Info dialog, and change the permissions. It does, though, leave the slight puzzle as to why the Finder didn't simply prompt for authentication instead of cussedly refusing. Sure enough, after trying that, the app still won't go and the error message is unchanged. Another strange thing about this 'app' is that it's not an app at all. Tucked away in a mysterious folder, new to High Sierra, at /Library/StagedExtensions/Applications, its icon is defaced to indicate that the user can't even run it. Neither did the user install it there. Trying to remove it with the conventional Terminal command 'sudo rm -rf /Library/StagedExtensions/Applications/BlueStacks.app' also fails, with the report 'Operation not permitted.'
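An "Operation not permitted" error that survives sudo is the signature of macOS's System Integrity Protection, which marks protected files with the BSD "restricted" file flag rather than relying on ordinary POSIX permissions. As a minimal sketch, assuming a macOS-like BSD stat interface (the SF_RESTRICTED constant below is the value from macOS's <sys/stat.h>; `st_flags` does not exist on Linux), one could check for that flag from Python:

```python
import os

# SF_RESTRICTED is the BSD file flag macOS uses to mark files guarded by
# System Integrity Protection (value taken from macOS <sys/stat.h>; this
# flag is a BSD/macOS feature and is absent on Linux stat results).
SF_RESTRICTED = 0x00080000

def is_sip_restricted(path):
    """Return True if `path` carries the 'restricted' flag (SIP-protected)."""
    st = os.lstat(path)
    # getattr with a default of 0 keeps this safe on platforms
    # whose stat results have no st_flags attribute at all.
    return bool(getattr(st, "st_flags", 0) & SF_RESTRICTED)
```

On an affected Mac, `ls -lO /Library/StagedExtensions` would show the same `restricted` flag in its output, which is why neither the Get Info dialog nor `sudo rm -rf` can touch the staged copy: the kernel refuses the operation regardless of who asks.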
By msmash from Slashdot's closer-look department
Ellen Nakashima and Aaron Gregg, reporting for the Washington Post: The National Security Agency is losing its top talent at a worrisome rate as highly skilled personnel, some disillusioned with the spy service's leadership and an unpopular reorganization, take higher-paying, more flexible jobs in the private sector (Editor's note: the link may be paywalled; alternative source). Since 2015, the NSA has lost several hundred hackers, engineers and data scientists, according to current and former U.S. officials with knowledge of the matter. The potential impact on national security is significant, they said. Headquartered at Fort Meade in Maryland, the NSA employs a civilian workforce of about 21,000 there and is the largest producer of intelligence among the nation's 17 spy agencies. The people who have left were responsible for collecting and analyzing the intelligence that goes into the president's daily briefing. Their work also included monitoring a broad array of subjects including the Islamic State, Russian and North Korean hackers, and analyzing the intentions of foreign governments, and they were responsible for protecting the classified networks that carry such sensitive information.
By BeauHD from Slashdot's under-review department
An anonymous reader quotes a report from TechCrunch: You may think, from the pomp accompanying the FCC's vote in December to repeal the 2015 net neutrality rules, that the deed was accomplished. Not so -- in fact, the order hasn't even reached its final form: the Commission is still working on it. But while it may be frustrating, this is business as usual for regulations like this, and concerned advocates should conserve their outrage for when it's really needed. The "Restoring Internet Freedom" rule voted on last month was based on a final draft circulated several weeks before the meeting at which it would be adopted. But as reports at the time noted, significant edits (i.e. not fixing typos) were still going into the draft the day before the FCC voted. Additional citations, changes in wording and more serious adjustments may be underway. It may sound like some serious shenanigans are being pulled, but this is how the sausage has always been made, and it's actually thanks to one of Chairman Ajit Pai's handful of commendable efforts that the process is, in some ways at least, more open to the public. As for exactly what is being changed, we will have ample time to investigate: the rules will soon be entered into the Federal Register, at which point they both come into effect and come under intense scrutiny and legal opposition.
By BeauHD from Slashdot's finger-pointing department
In 2003, the predominant view in the scientific community was that there was no way to determine the exact influence of climate change on any individual event. "There are just too many other factors affecting the weather, including all sorts of natural climate variations," reports Scientific American. But Myles Allen, a climate expert at the University of Oxford, believes scientists can blame individual natural disasters on climate change. Scientific American reports on how extreme event attribution has become one of the most rapidly expanding areas of climate science: Over the last few years, dozens of studies have investigated the influence of climate change on events ranging from the Russian heat wave of 2010 to the California drought, evaluating the extent to which global warming has made them more severe or more likely to occur. The Bulletin of the American Meteorological Society now issues a special report each year assessing the impact of climate change on the previous year's extreme events. Interest in the field has grown so much that the National Academy of Sciences released an in-depth report last year evaluating the current state of the science and providing recommendations for its improvement. And as the science continues to mature, it may have ramifications for society. Legal experts suggest that attribution studies could play a major role in lawsuits brought by citizens against companies, industries or even governments. They could help reshape climate adaptation policies throughout a country or even the world. And perhaps more immediately, the young field of research could be capturing the public's attention in ways that long-term projections for the future cannot. In 2004, Allen, Oxford colleague Daithi Stone, and Peter Stott of the Met Office co-authored a report that is widely regarded as the world's first extreme event attribution study.
The paper, which examined the contribution of climate change to a severe European heat wave in 2003 -- an event which may have caused tens of thousands of deaths across the continent -- concluded that "it is very likely that human influence has at least doubled the risk of a heat wave exceeding this threshold magnitude." Before this point, climate change attribution science had existed in other forms for several decades, according to Noah Diffenbaugh, a Stanford University climate scientist and attribution expert. Until 2004, much of the work had focused on investigating the relationship between human activity and long-term changes in climate elements like temperature and precipitation. More recently, scientists had been attempting to understand how these changes in long-term averages might affect weather patterns in general.