By msmash from Slashdot's shape-of-things-to-come department
An anonymous reader shares a CNBC report: Amazon announced new tools on Friday that will allow gadget-makers to include the smart voice assistant in a whole array of new products. Alexa is Amazon's smart voice assistant, and it has slowly made its way from the Amazon Echo into third-party speakers, refrigerators and, soon, even microwaves. Now, with Amazon's Alexa Mobile Accessory Kit, device makers will be able to build Alexa into headphones, smart watches, fitness trackers and more. That means you may soon be able to look down at your wrist and ask Alexa for the weather, or to remind you to pick up eggs at the grocery store. CNET reports that Kohler, a company that makes plumbing products, wants to bring Alexa to your bathroom as well.
By msmash from Slashdot's things-to-note department
HP announced this week that it is recalling the lithium-ion batteries in more than 50,000 laptops because of the danger of fire in cases of battery malfunction. From a report: "These batteries have the potential to overheat, posing a fire and burn hazard to customers," the company said in a statement. "For this reason, it is extremely important to check whether your battery is affected." The recall affects the battery, not the entire computer. Consumers should run HP's Validation Utility software to determine if their battery has been recalled. If the battery needs to be replaced, they should then install an update that will put the device in Battery Safe Mode, which will discharge the battery and prevent it from being charged until it's replaced. This update will allow consumers to continue using the computers safely with AC power while they wait for a new battery. The recall affects batteries sold with, or as accessories for, the following models: HP ProBook 640 G2, HP ProBook 640 G3, HP ProBook 645 G2, HP ProBook 645 G3, HP ProBook 650 G2, HP ProBook 650 G3, HP ProBook 655 G2, HP ProBook 655 G3, HP ZBook 17 G3, HP ZBook 17 G4, HP ZBook Studio G3, HP x360 310 G2, HP Pavilion x360, HP ENVY m6, and HP 11 Notebook PC.
By msmash from Slashdot's blast-from-the-past department
troublemaker_23 writes: A little more than 20 years ago, Intel faced a problem with its processors, though not as big an issue as the speculative execution bugs revealed this week. The 1997 bug, which came to be known as the F00F bug, allowed a malicious person to freeze up Pentium MMX and "classic" Pentium computers: any Intel Pentium/Pentium MMX could be caused to hang, even by an unprivileged or remote user, merely by executing the byte sequence "F0 0F C7 C8". At the time, Intel said it learnt about the bug on 7 November 1997, but a report said that at least two people had indicated on an Intel newsgroup that the company knew about it earlier. The processor firm confirmed the bug's existence on 10 November. But, says veteran Linux sysadmin Rick Moen, the company's reaction to that bug was quite similar to the way it has reacted to this week's disclosures. "Intel has a long history of trying to dissemble and misdirect their way out of paying for grave CPU flaws," Moen said in a post to the Linux Users of Victoria mailing list. "Remember the 'Pentium Processor Invalid Instruction Erratum' of 1997, exposing all Intel Pentium and Pentium MMX CPUs to remote security attack, stopping them in their tracks if they could be induced to run the processor instruction 'F0 0F C7 C8'? "No, of course you don't. That's why Intel gave it the mind-numbingly boring official name 'Pentium Processor Invalid Instruction Erratum', hoping to replace its popular names 'F00F bug' and 'Halt-and-Catch-Fire bug'."
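For the curious, the crash sequence decodes cleanly: it is a LOCK-prefixed CMPXCHG8B with a register operand, an encoding that is architecturally invalid (CMPXCHG8B requires a memory operand) and that affected Pentiums hung on instead of faulting. A minimal sketch of the breakdown (standard x86 encoding facts, not from the article):

```python
# Decoding the F00F byte sequence: a LOCK prefix on CMPXCHG8B with a
# register operand. CMPXCHG8B only accepts a memory operand, so the
# encoding is invalid; a correct CPU raises an invalid-opcode fault,
# but affected Pentiums locked up instead.
F00F = bytes([0xF0, 0x0F, 0xC7, 0xC8])

# 0xF0        LOCK prefix
# 0x0F 0xC7   CMPXCHG8B opcode
# 0xC8        ModRM byte: mod=11 (register operand), r/m=000 (EAX)
modrm = F00F[3]

assert F00F.hex() == "f00fc7c8"
assert modrm >> 6 == 0b11  # register-direct mode: the invalid part
```

The ModRM mode bits being `11` (register) rather than a memory mode is what makes the instruction illegal, and the LOCK prefix is what tripped the flawed lockup path.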
By BeauHD from Slashdot's harder-better-faster department
An anonymous reader quotes a report from DSLReports: Under Section 706 of the Telecommunications Act, the FCC is required to consistently measure whether broadband is being deployed to all Americans uniformly and "in a reasonable and timely fashion." If the FCC finds that broadband isn't being deployed quickly enough to the public, the agency is required by law to "take immediate action to accelerate deployment of such capability by removing barriers to infrastructure investment and by promoting competition in the telecommunications market." Unfortunately, whenever the FCC is stocked with revolving-door regulators all too focused on pleasing the likes of AT&T, Verizon and Comcast, this dedication to expanding coverage and competition tends to waver.
What's more, regulators beholden to regional duopolies often take things one step further -- by trying to manipulate data to suggest that broadband is faster, cheaper, and more evenly deployed than it actually is. We saw this under former FCC boss Michael Powell (now the top lobbyist for the cable industry), and more recently when the industry cried incessantly as the base definition of broadband was bumped to 25 Mbps downstream, 3 Mbps upstream. We're about to see this effort take shape once again as the FCC prepares to vote in February on a new proposal that would dramatically weaken the definition of broadband. How? Under the new proposal, any area able to obtain wireless speeds of at least 10 Mbps down, 1 Mbps up would be deemed good enough for American consumers, pre-empting any need to prod industry to speed up or expand broadband coverage.
By BeauHD from Slashdot's anti-climactic department
An analysis by more than 200 astronomers shows that the mysterious dimming of star KIC 8462852 --
nicknamed Tabby's star -- is not being produced by an alien megastructure. "The evidence points most strongly to a giant cloud of dust occasionally obscuring the star," reports The Guardian. From the report: KIC 8462852 is approximately 1,500 light years away from the Earth and hit the headlines in October 2015 when data from Nasa's Kepler space telescope showed that it was dimming by inexplicably large amounts. The star's light dropped by 20% first and then 15%, making it unique. Even a large planet passing in front of the star would have blocked only about 1% of the light. For an object to block 15-20%, it would have to be approaching half the diameter of the star itself. With this realization, a few astronomers began whispering that such a signal would be the kind expected from a gigantic extraterrestrial construction orbiting in front of the star -- and the idea of the alien megastructure was born.
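The size estimate follows from transit geometry: an opaque object of radius r crossing the face of a star of radius R blocks roughly (r/R)^2 of the light. A back-of-the-envelope check of the figures quoted above (my own arithmetic, not from the article):

```python
import math

# Transit depth scales with the square of the size ratio, so the
# object size needed to produce a given dip is its square root:
def size_ratio(dip_fraction):
    return math.sqrt(dip_fraction)

# A ~1% dip corresponds to an object about a tenth the star's size,
# i.e. a large Jupiter-class planet around a Sun-like star:
assert round(size_ratio(0.01), 2) == 0.10

# The 15-20% dips at Tabby's star would need something roughly
# 40-45% of the star's diameter, "approaching half":
assert round(size_ratio(0.15), 2) == 0.39
assert round(size_ratio(0.20), 2) == 0.45
```

This is why the dips were so startling: nothing planet-sized comes close to explaining them, while a diffuse dust cloud can cover an arbitrarily large fraction of the stellar disk.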
In the case of Tabby's star, the new observations show that it dims more at blue wavelengths than red. Thus, its light is passing through a dust cloud, not being blocked by an alien megastructure in orbit around the star. The new analysis of KIC 8462852 showing these results is to be published in The Astrophysical Journal Letters. It reinforces the conclusions reached by Huan Meng of the University of Arizona, Tucson, and collaborators in October 2017. They monitored the star at multiple wavelengths using Nasa's Spitzer and Swift missions, and the Belgian AstroLAB IRIS observatory. These results were published in The Astrophysical Journal.
By BeauHD from Slashdot's short-lived department
SpaceX has reportedly worked with the Air Force to develop a GPS-equipped on-board computer, called the "Automatic Flight Safety System," that will safely and automatically detonate a Falcon 9 rocket in the sky if the launch threatens to go awry. Previously, an Air Force range-safety officer was required to be in place, ready to transmit a signal to detonate the rocket. Quartz reports: No other U.S. rocket has this capability yet, and it could open up new advantages for SpaceX: The U.S. Air Force is considering launches to polar orbits from Cape Canaveral, but the flight path is only viable if the rockets don't need to be tracked for range-safety reasons. That means SpaceX is the only company that could take advantage of the new corridor to space. Rockets at the Cape normally launch satellites eastward over the Atlantic into orbits roughly parallel to the equator. Launches from Florida into orbits traveling from pole to pole would generally send rockets too close to populated areas for the Air Force's liking. The new rules allow them to thread a safe path southward, past Miami and over Cuba.
SpaceX pushed for the new automated system for several reasons. One was efficacy: the on-board computer can react more quickly than human beings relying on radar data and radio transmissions to signal across miles of airspace, which gives the rocket more time to correct its course before blowing up in the event of an error. Just as important, the automated system means SpaceX no longer needs the full use of the Air Force's radar installations on launch day, nor to pay for some 160 U.S. Air Force staff to be on duty for its launches, saving the company and its customers money. Most impressively, the automated system will make it possible for SpaceX to fly multiple boosters at once in a single launch.
By BeauHD from Slashdot's cease-and-desist department
French President Emmanuel Macron has a rather extreme approach to combating fake news: banning entire websites. In a speech to journalists on Wednesday, Macron said he planned to introduce new legislation to strictly regulate fake news during online political campaigns. Gizmodo reports: His proposal included a number of measures, most drastically "an emergency legal action" that could enable the government to either scrap "fake news" from a website or even block a website altogether. "If we want to protect liberal democracies, we must be strong and have clear rules," Macron said. "When fake news are spread, it will be possible to go to a judge... and if appropriate have content taken down, user accounts deleted and ultimately websites blocked."
Macron, himself a target of election interference, also outlined some less extreme measures in his speech yesterday. He proposed more rigid requirements around transparency, specifically in relation to online ads during elections. According to the Guardian, Macron said the legislation would force platforms to publicly identify who their advertisers are, as well as limit how much they can spend on ads over the course of an election campaign.
By BeauHD from Slashdot's matter-of-opinion department
An anonymous reader quotes a report from The Verge: According to a new Pew Research Center survey, defining online harassment is just as complicated for the average American user as it is for huge social media companies -- and the line gets even fuzzier when gender or race comes into the picture. The survey polled 4,151 respondents on various scenarios and asked them whether each one crossed the threshold for online harassment. In one hypothetical, a private disagreement between a man and his friend David is forwarded to a third party and posted online, which escalates to David receiving "unkind" messages, "vulgar" messages, and eventually being doxxed and threatened. When asked whether or not David was harassed, 89 percent of respondents agreed that he was. However, opinions on exactly when the harassment began varied widely: 5 percent considered it harassment when David offends his friend; 48 percent said it's when the friend forwards the conversation; 54 percent said it's when the conversation is shared publicly. Others agreed it crossed the line when David receives the unkind messages (72 percent), the vulgar messages (82 percent), is doxxed (85 percent), and is threatened (85 percent). There was little difference in responses by gender.
By BeauHD from Slashdot's on-the-clock department
In a blog post, personal analytics service RescueTime revealed exactly which days and times we were most productive in 2017, by studying anonymized data on how people spent their time on their computers and phones over the past 12 months. From the report: Simply put, our data shows that people were the most productive on November 14th. In fact, that entire week ranked as the most productive of the year. Which makes sense. With American Thanksgiving the next week and the mad holiday rush shortly after, mid-November is a great time for people to cram in a few extra work hours and get caught up before gorging on turkey dinner. On the other side of the spectrum, we didn't get a good start to the year. January 6th -- the first Friday of the year -- was the least productive day of 2017.
One of the biggest mistakes so many of us make when planning out our days is to assume we have 8+ hours to do productive work. This couldn't be further from the truth. What we found is that, on average, we only spend 5 hours a day working on a digital device. And with an average productivity pulse of 53% for the year, that means we only have 12.5 hours a week to do productive work. Our data showed that we do our most productive work between 10 a.m. and noon and then again from 2 to 5 p.m. each day. However, breaking it down to the hour, we do our most productive work on Wednesdays at 3 p.m. RescueTime has a separate blog post detailing how they calculate their productivity scores.
By BeauHD from Slashdot's it's-what's-on-the-inside-that-matters department
Popular repair site iFixit has acquired an iMac Pro and opened it up to see what's inside. They tore down the base iMac Pro with an 8-core processor, 32GB of RAM, and a 1TB SSD. Mac Rumors reports the findings: iFixit found that the RAM, CPU, and SSDs in the iMac Pro are modular and can potentially be replaced following purchase, but most of the key components "require a full disassembly to replace." Standard 27-inch iMacs have a small hatch in the back that allows easy access to the RAM for post-purchase upgrades, but that's missing in the iMac Pro. Apple has said that iMac Pro owners will need to get RAM replaced at an Apple Store or Apple Authorized Service Provider. iFixit says that compared to the 5K 27-inch iMac, replacing the RAM in the iMac Pro is indeed "a major undertaking."
Apple is using standard 288-pin DDR4 ECC RAM sticks with standard chips, which iFixit was able to upgrade using its own $2,000 RAM upgrade kit. A CPU upgrade is "theoretically possible," but because Apple uses a custom-made Intel chip, it's not clear if an upgrade is actually feasible. The same goes for the SSDs -- they're modular and removable, but custom-made by Apple. Unlike the CPU, the GPU is BGA-soldered into place and cannot be removed. The internals of the iMac Pro are "totally different" from other iMacs, which is unsurprising as Apple said it introduced a new thermal design to accommodate the Xeon-W processors and Radeon Pro Vega GPUs built into the machines. The new thermal design includes an "enormous" dual-fan cooler, what iFixit says is a "ginormous heat sink," and a "big rear vent." Overall, iFixit gave the iMac Pro a repairability score of 3/10 since it's difficult to open and tough to get to internal components that might need to be repaired or replaced.
By BeauHD from Slashdot's good-news-for-chipmakers department
In a post on Google's Online Security Blog, two engineers described a novel software-based mitigation that has been deployed across the company's entire infrastructure, resulting in only minor declines in performance in most cases. "The company has also posted details of the new technique, called Retpoline, in the hopes that other companies will be able to follow the same technique," reports The Verge. "If the claims hold, it would mean Intel and others have avoided the catastrophic slowdowns that many had predicted." From the report: "There has been speculation that the deployment of KPTI causes significant performance slowdowns," the post reads, referring to the company's "Kernel Page Table Isolation" technique. "Performance can vary, as the impact of the KPTI mitigations depends on the rate of system calls made by an application. On most of our workloads, including our cloud infrastructure, we see negligible impact on performance." "Of course, Google recommends thorough testing in your environment before deployment," the post continues. "We cannot guarantee any particular performance or operational impact." Notably, the new technique only applies to one of the three variants involved in the new attacks. However, it's the variant that is arguably the most difficult to address. The other two vulnerabilities -- "bounds check bypass" and "rogue data cache load" -- would be addressed at the program and operating system level, respectively, and are unlikely to result in the same system-wide slowdowns.
By BeauHD from Slashdot's behind-the-scenes department
Reuters tells the story of how Daniel Gruss, a 31-year-old information security researcher and post-doctoral fellow at Austria's Graz Technical University, hacked his own computer and exposed a flaw in most of the Intel chips made in the past two decades. Prior to his discovery, Gruss and his colleagues Moritz Lipp and Michael Schwarz had thought such an attack on the processor's "kernel" memory, which is meant to be inaccessible to users, was only theoretically possible. From the report: "When I saw my private website addresses from Firefox being dumped by the tool I wrote, I was really shocked," Gruss told Reuters in an email interview, describing how he had unlocked personal data that should be secured. Gruss, Lipp and Schwarz, working from their homes on a weekend in early December, messaged each other furiously to verify the result. "We sat for hours in disbelief until we eliminated any possibility that this result was wrong," said Gruss, whose mind kept racing even after powering down his computer, so he barely caught a wink of sleep. Gruss and his colleagues had just confirmed the existence of what he regards as "one of the worst CPU bugs ever found." The flaw, now named Meltdown, was revealed on Wednesday and affects most processors manufactured by Intel since 1995. Separately, a second defect called Spectre has been found that also exposes core memory in most computers and mobile devices running on chips made by Intel, Advanced Micro Devices (AMD) and ARM Holdings, a unit of Japan's SoftBank.
By msmash from Slashdot's next-up department
Google is accepting "prophylactic" takedown requests to keep pirated content out of its search results, an anonymous reader writes, citing a TorrentFreak report. From the article: Over the past year, we've noticed on a few occasions that Google is processing takedown notices for non-indexed links. While we assumed that this was an 'error' on the sender's part, it appears to be a new policy. "Google has critically expanded notice and takedown in another important way: We accept notices for URLs that are not even in our index in the first place. That way, we can collect information even about pages and domains we have not yet crawled," Caleb Donaldson, copyright counsel at Google writes. In other words, Google blocks URLs before they appear in the search results, as some sort of piracy vaccine. "We process these URLs as we do the others. Once one of these not-in-index URLs is approved for takedown, we prophylactically block it from appearing in our Search results, and we take all the additional deterrent measures listed above." Some submitters are heavily relying on the new feature, Google found. In some cases, the majority of the submitted URLs in a notice are not indexed yet.
By msmash from Slashdot's breakthrough department
chalsall writes: Persistence pays off. Jonathan Pace, a GIMPS volunteer for over 14 years, discovered the 50th known Mersenne prime, 2^77,232,917 - 1, on December 26, 2017. The prime number is calculated by multiplying together 77,232,917 twos and then subtracting one. It weighs in at 23,249,425 digits, making it the largest known prime number. It bests the previous record prime, also discovered by GIMPS, by 910,807 digits. You can read a little more in the press release.
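The digit counts are easy to verify without computing the full number: the decimal length of 2^p - 1 is floor(p * log10 2) + 1, since subtracting one from 2^p never shortens it (a power of two is never a power of ten). A quick sanity check against the figures above, using the previous record exponent 74,207,281 (a detail not in the summary):

```python
import math

def mersenne_digits(p):
    # Decimal digit count of 2**p - 1. Subtracting 1 from 2**p never
    # changes the length, because 2**p is never a power of 10.
    return math.floor(p * math.log10(2)) + 1

# The new record, M77232917:
assert mersenne_digits(77_232_917) == 23_249_425

# Margin over the previous record, M74207281 (found by GIMPS in 2016):
assert mersenne_digits(77_232_917) - mersenne_digits(74_207_281) == 910_807
```

The logarithm trick is why digit counts for record primes are announced instantly, even though verifying primality of the number itself takes days of computation.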
By msmash from Slashdot's needs-attention department
Ocean dead zones with zero oxygen have quadrupled in size since 1950, scientists have warned, while the number of very low oxygen sites near coasts has multiplied tenfold. From a report: Most sea creatures cannot survive in these zones and current trends would lead to mass extinction in the long run, risking dire consequences for the hundreds of millions of people who depend on the sea. Climate change caused by fossil fuel burning is the cause of the large-scale deoxygenation, as warmer waters hold less oxygen. The coastal dead zones result from fertiliser and sewage running off the land and into the seas. The analysis, published in the journal Science, is the first comprehensive analysis of the areas and states: "Major extinction events in Earth's history have been associated with warm climates and oxygen-deficient oceans." Denise Breitburg of the Smithsonian Environmental Research Center in the US, who led the analysis, said: "Under the current trajectory that is where we would be headed. But the consequences to humans of staying on that trajectory are so dire that it is hard to imagine we would go quite that far down that path." "This is a problem we can solve," Breitburg said. "Halting climate change requires a global effort, but even local actions can help with nutrient-driven oxygen decline." She pointed to recoveries in Chesapeake Bay in the US and the Thames river in the UK, where better farm and sewage practices led to dead zones disappearing.