By msmash from Slashdot's fascinating-questions department
Adam Frank, writing for The Atlantic: We're used to imagining extinct civilizations in terms of sunken statues and subterranean ruins. These kinds of artifacts of previous societies are fine if you're only interested in timescales of a few thousand years. But once you roll the clock back to tens of millions or hundreds of millions of years, things get more complicated. When it comes to direct evidence of an industrial civilization -- things like cities, factories, and roads -- the geologic record doesn't go back past what's called the Quaternary period, 2.6 million years ago. For example, the oldest large-scale stretch of ancient surface lies in the Negev Desert. It's "just" 1.8 million years old -- older surfaces are mostly visible in cross section via something like a cliff face or rock cuts. Go back much farther than the Quaternary and everything has been turned over and crushed to dust. And, if we're going back this far, we're not talking about human civilizations anymore. Homo sapiens didn't make their appearance on the planet until just 300,000 years or so ago. [...] Given that all direct evidence would be long gone after many millions of years, what kinds of evidence might then still exist? The best way to answer this question is to figure out what evidence we'd leave behind if human civilization collapsed at its current stage of development. Mr. Frank, along with Gavin Schmidt, director of the NASA Goddard Institute for Space Studies, has published research on the subject [PDF].
By msmash from Slashdot's security-woes department
Caroline Haskins, writing for The Outline: Hundreds of multi-ton liabilities -- soaring faster than the speed of sound, miles above the surface of the Earth -- are operating on Windows 95. They're satellites, responsible for everything from GPS positioning, to taking weather measurements, to carrying cell signals, to providing television and internet. For the countries that own these satellites, they're invaluable resources. Even though they're old, it's more expensive to take satellites down than it is to just leave them up. So they stay up. Unfortunately, these outdated systems make old satellites prime targets for cyber attacks. [...] A malicious actor could fake their IP address, which gives information about a user's computer and its location. This person could then get access to the satellite's computer system and manipulate where the satellite goes or what it does. Alternatively, an actor could jam the satellite's radio transmissions with Earth, essentially disabling it. The cost of such an attack could be huge. If a satellite doesn't work, life-saving GPS or online information could be withheld from people on Earth when they need it most. What's worse, if part of a satellite -- or an entire satellite -- is knocked out of its orbit by an attack, the debris could create a domino effect and cause extreme damage to other satellites.
By msmash from Slashdot's major-bet department
After making smart speakers a household product (at least for some), Amazon seems to have found its next big consumer product: robots. Amazon is building smart robots equipped with cameras that let them drive around homes, Bloomberg reported Monday. These robots could launch as soon as next year. From the report: Codenamed "Vesta," after the Roman goddess of the hearth, home and family, the project is overseen by Gregg Zehr, who runs Amazon's Lab126 hardware research and development division based in Sunnyvale, California. Lab126 is responsible for Amazon devices such as the Echo speakers, Fire TV set-top boxes, Fire tablets and the ill-fated Fire Phone. The Vesta project originated a few years ago, but this year Amazon began to aggressively ramp up hiring. There are dozens of listings on the Lab126 jobs page for openings like "Software Engineer, Robotics" and "Principle Sensors Engineer." People briefed on the plan say the company hopes to begin seeding the robots in employees' homes by the end of this year, and potentially with consumers as early as 2019, though the timeline could change, and Amazon hardware projects are sometimes killed during gestation.
By BeauHD from Slashdot's new-and-improved department
"After years of phones, laptops, tablets, and TV screens converging on 16:9 as the 'right' display shape -- allowing video playback without distracting black bars -- smartphones have disturbed the universality recently by moving to even more elongated formats like 18:9, 19:9, or even 19.5:9 in the iPhone X's case," writes Amelia Holowaty Krales via The Verge. "That's prompted me to consider where else the default widescreen proportions might be a poor fit, and I've realized that laptops are the worst offenders." Krales makes the case for why a 16:9 screen of 13 to 15 inches in size is a poor fit: Practically every interface in Apple's macOS, Microsoft's Windows, and on the web is designed by stacking user controls in a vertical hierarchy. At the top of every MacBook, there's a menu bar. At the bottom, by default, is the Dock for launching your most-used apps. On Windows, you have the taskbar serving a similar purpose -- and though it may be moved around the screen like Apple's Dock, it's most commonly kept as a sliver traversing the bottom of the display. Every window in these operating systems has chrome -- the extra buttons and indicator bars that allow you to close, reshape, or move a window around -- and the components of that chrome are usually attached at the top and bottom. Look at your favorite website (hopefully this one) on the internet, and you'll again see a vertical structure.
By BeauHD from Slashdot's impending-doom department
While parts of the FCC's new plan will go into effect on Monday, the majority of the order still doesn't have a date for when it will be official. Specific rules that modify data collection requirements still have to be approved by the Office of Management and Budget, and the earliest that can happen is on April 27. Tech experts and consumer policy advocates don't expect changes to happen right away, as ISPs will likely avoid any large-scale changes in order to convince policymakers that the net neutrality repeal was no big deal after all.
By BeauHD from Slashdot's hot-seat department
Facebook may be in the hot seat right now for its collection of personal data without our knowledge or explicit consent, but as The Wall Street Journal points out, "Google is a far bigger threat by many measures: the volume of information it gathers, the reach of its tracking and the time people spend on its sites and apps." From the report (alternative source): It's likely that Google has shadow profiles (data the company gathers on people without accounts) on at least as many people as Facebook does, says Chandler Givens, CEO of TrackOff, which develops software to fight identity theft. Google allows everyone, whether they have a Google account or not, to opt out of its ad targeting, though, like Facebook, it continues to gather your data. Google Analytics is far and away the web's most dominant analytics platform. Used on the sites of about half of the biggest companies in the U.S., it has a total reach of 30 million to 50 million sites. Google Analytics tracks you whether or not you are logged in. Meanwhile, the billion-plus people who have Google accounts are tracked in even more ways. In 2016, Google changed its terms of service, allowing it to merge its massive trove of tracking and advertising data with the personally identifiable information from our Google accounts.
By BeauHD from Slashdot's end-is-nigh department
Researchers with Netlab 360 warn that attackers are mass-exploiting "Drupalgeddon2," the name of an extremely critical vulnerability Drupal maintainers patched in late March. The exploit allows them to take control of powerful website servers. Ars Technica reports: Formally indexed as CVE-2018-7600, Drupalgeddon2 makes it easy for anyone on the Internet to take complete control of vulnerable servers simply by accessing a URL and injecting publicly available exploit code. Exploits allow attackers to run code of their choice without having to have an account of any type on a vulnerable website. The remote-code vulnerability harkens back to a 2014 Drupal vulnerability that also made it easy to commandeer vulnerable servers.
Drupalgeddon2 "is under active attack, and every Drupal site behind our network is being probed constantly from multiple IP addresses," Daniel Cid, CTO and founder of security firm Sucuri, told Ars. "Anyone that has not patched is hacked already at this point. Since the first public exploit was released, we are seeing this arms race between the criminals as they all try to hack as many sites as they can." China-based Netlab 360, meanwhile, said at least three competing attack groups are exploiting the vulnerability. The most active group, Netlab 360 researchers said in a blog post published Friday, is using it to install multiple malicious payloads, including cryptocurrency miners and software for performing distributed denial-of-service attacks on other domains. The group, dubbed Muhstik after a keyword that pops up in its code, relies on 11 separate command-and-control domains and IP addresses, presumably for redundancy in the event one gets taken down.
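For defenders doing triage, the question reduces to whether an installed Drupal core version predates the patched releases from the SA-CORE-2018-002 advisory (7.58, 8.3.9, 8.4.6, and 8.5.1). A minimal sketch of that version comparison, assuming the version string has already been extracted from the site:

```python
# Minimum patched releases per branch for CVE-2018-7600 (SA-CORE-2018-002).
# Drupal 8.0.x-8.2.x received no patched release and should be assumed exposed.
PATCHED_MIN = {
    (7,): (7, 58),
    (8, 3): (8, 3, 9),
    (8, 4): (8, 4, 6),
    (8, 5): (8, 5, 1),
}

def is_vulnerable(version: str) -> bool:
    """Return True if this Drupal core version predates its branch's fix."""
    v = tuple(int(part) for part in version.split("."))
    for branch, patched in PATCHED_MIN.items():
        if v[:len(branch)] == branch:
            return v < patched
    return True  # unknown or unsupported branch: treat as vulnerable

print(is_vulnerable("7.57"))   # True
print(is_vulnerable("8.5.1"))  # False
```

Given that Sucuri considers every unpatched site already compromised, a version check like this is only the first step; patched-but-previously-exposed sites still need to be audited for backdoors.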
By BeauHD from Slashdot's rest-assured department
According to a survey of over 350 Tesla owners, Tesla batteries retain over 90 percent of their charging capacity after 160,000 miles. The EVs drop only 5 percent of their capacity over the first 50,000 miles, but lose capacity at a much slower rate after that. Most Tesla vehicles will still have over 90 percent of their capacity after around 185,000 miles, and 80 percent after 500,000. Engadget reports: Tesla has no battery degradation warranty on its Model S and X luxury EVs, but guarantees that the Model 3 will retain 70 percent battery capacity after 120,000 miles (long-range battery) and 100,000 miles (shorter-range battery). That's a bit more generous than the one Nissan offers on the Leaf (66 percent over 100,000 miles), for instance. According to the survey data, Tesla will easily be able to meet this mark.
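The survey numbers trace a simple degradation curve: fast loss over the first 50,000 miles, then a long, shallow tail. As an illustration only -- the piecewise-linear shape between the quoted data points is an assumption, not something the survey claims -- the points can be interpolated like this:

```python
# Survey figures quoted above: (miles driven, remaining capacity in percent).
POINTS = [(0, 100.0), (50_000, 95.0), (185_000, 90.0), (500_000, 80.0)]

def remaining_capacity(miles: float) -> float:
    """Linearly interpolate remaining battery capacity between survey points."""
    for (x0, y0), (x1, y1) in zip(POINTS, POINTS[1:]):
        if x0 <= miles <= x1:
            return y0 + (y1 - y0) * (miles - x0) / (x1 - x0)
    return POINTS[-1][1]  # beyond the last data point, hold at 80 percent

# Loss rate falls from ~1% per 10k miles early on to ~0.4% per 10k afterward.
print(round(remaining_capacity(120_000), 1))  # 92.4
```

Even this crude model shows why the Model 3 warranty (70 percent at 100,000-120,000 miles) leaves a wide margin against the surveyed fleet.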
By BeauHD from Slashdot's banned-material department
Pornhub said in February that it was banning AI-generated deepfake videos, but BuzzFeed News found that it's not doing a very good job at enforcing that policy. The media company found more than 70 deepfake videos -- depicting graphic fake sex scenes with Emma Watson, Scarlett Johansson, and other celebrities -- were easily searchable from the site's homepage using the search term "deepfake." From the report: Shortly after the ban in February, Mashable reported that there were dozens of deepfake videos still on the site. Pornhub removed those videos after the report, but a few months later, BuzzFeed News easily found more than 70 deepfake videos using the search term "deepfake" on the site's homepage. Nearly all the videos -- which included graphic and fake depictions of celebrities like Katy Perry, Scarlett Johansson, Daisy Ridley, and Jennifer Lawrence -- had the word "deepfake" prominently mentioned in the title of the video, and many of the names of the videos' uploaders contained the word "deepfake." Similarly, a search for "fake deep" returned over 30 of the nonconsensual celebrity videos. Most of the videos surfaced by BuzzFeed News had view counts in the hundreds of thousands -- one video featuring the face of actor Emma Watson garnered over 1 million views. Some accounts posting deepfake videos appeared to have been active for as long as two months and have racked up over 3 million video views. "Content that is flagged on Pornhub that directly violates our Terms of Service is removed as soon as we are made aware of it; this includes non-consensual content," Pornhub said in a statement. "To further ensure the safety of all our fans, we officially took a hard stance against revenge porn, which we believe is a form of sexual assault, and introduced a submission form for the easy removal of non-consensual content."
The company also provided a link where users can report any "material that is distributed without the consent of the individuals involved."
By BeauHD from Slashdot's hand-selected department
The Netherlands Gaming Authority has published a study it conducted of 10 video games that reward players with loot boxes -- packages players can sometimes buy with real money that contain random in-game rewards -- and found that four of the ten games it studied violated the Dutch Gaming Act. "It determined that loot boxes are, in general, addictive and that four of the games allowed players to trade items they'd won outside of the game, which means they've got a market value," reports Motherboard. From the report: According to the study, the authorities picked games "based on their popularity on a leading Internet platform that streams videos of games and players." Motherboard has reached out to the Gaming Authority for clarification on both the games it picked (the study doesn't name them) and the method by which it picked them, but did not receive an immediate reply. However, Twitch is the most popular way gamers watch others play, and it's a good bet that Twitch is how the Gaming Authority focused its attention. Six of the ten games the Gaming Authority studied aren't in violation of Dutch law. "With these games, there is no opportunity to sell the prizes won outside of the game," the press release said. "This means that the goods have no market value and these loot boxes do not satisfy the definition of a prize in Section 1 of the Betting and Gaming Act."
The four other games, though, offer the opportunity for players to trade items outside of the game and therefore meet the Netherlands' definition of gambling. To come into compliance, those games need to make their loot boxes less interesting to open. The Gaming Authority wants the companies to "remove the addiction-sensitive elements ('almost winning' effects, visual effects, ability to keep opening loot boxes quickly one after the other and suchlike)...and to implement measures to exclude vulnerable groups or to demonstrate that the loot boxes on offer are harmless."
By BeauHD from Slashdot's open-book department
Apple's FoundationDB company announced on Thursday that the FoundationDB core has been open sourced, with the goal of building an open community with all major development done in the open. The database company was purchased by Apple back in 2015. As described in the announcement, FoundationDB is a distributed datastore that's been designed from the ground up to be deployed on clusters of commodity hardware. Mac Rumors reports: By open sourcing the project to drive development, FoundationDB is aiming to become "the foundation of the next generation of distributed databases": "The vision of FoundationDB is to start with a simple, powerful core and extend it through the addition of 'layers'. The key-value store, which is open sourced today, is the core, focused on incorporating only features that aren't possible to write in layers. Layers extend that core by adding features to model specific types of data and handle their access patterns. The fundamental architecture of FoundationDB, including its use of layers, promotes the best practices of scalable and manageable systems. By running multiple layers on a single cluster (for example, a document store layer and a graph layer), you can match your specific applications to the best data model. Running less infrastructure reduces your organization's operational and technical overhead." The source for FoundationDB is available on GitHub, and those who wish to join the project are encouraged to visit the FoundationDB community forums, submit bugs, and make contributions to the core software and documentation.
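The "layers" idea can be sketched in miniature: the core exposes only a key-value API, and each layer encodes its own data model into namespaced keys on top of it. The sketch below is purely illustrative -- a Python dict stands in for the distributed core, and the class names are invented for this example, not FoundationDB's actual API:

```python
import json

class KeyValueCore:
    """Stand-in for the key-value core (a real cluster would be distributed)."""
    def __init__(self):
        self._data = {}

    def set(self, key: bytes, value: bytes) -> None:
        self._data[key] = value

    def get(self, key: bytes):
        return self._data.get(key)

class DocumentLayer:
    """A toy document-store layer: JSON docs encoded under namespaced keys."""
    def __init__(self, core: KeyValueCore):
        self._core = core

    def put(self, collection: str, doc_id: str, doc: dict) -> None:
        key = f"doc/{collection}/{doc_id}".encode()
        self._core.set(key, json.dumps(doc).encode())

    def get(self, collection: str, doc_id: str):
        raw = self._core.get(f"doc/{collection}/{doc_id}".encode())
        return None if raw is None else json.loads(raw)

# Multiple layers can share one core cluster, each under its own key prefix.
core = KeyValueCore()
docs = DocumentLayer(core)
docs.put("users", "42", {"name": "Ada"})
print(docs.get("users", "42"))  # {'name': 'Ada'}
```

A graph layer would follow the same pattern, mapping nodes and edges to its own key prefix on the same core, which is the sense in which one cluster can serve several data models at once.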
By BeauHD from Slashdot's last-ditch-effort department
An anonymous reader quotes a report from Ars Technica: According to reports from Bloomberg and E&E News, the Trump Administration has been exploring another way to help coal and nuclear generators: the Defense Production Act of 1950. Passed under President Truman during the Korean War, the Act gives the president broad authority to boost U.S. industries that are considered a priority for national security. On Thursday, E&E News cited sources saying "an interagency process is underway" at the White House to examine possible application of the act to the energy industry. The goal would be to give some form of preference to coal and nuclear plants that are struggling to compete with cheap natural gas.
If the DOE decides not to invoke Section 202(c), the president may turn to the Defense Production Act. According to a 2014 summary report (PDF) from the Congressional Research Service (CRS), the act would allow the president to "demand priority for defense-related products," "provide incentives to develop, modernize, and expand defense productive capacity," and establish "a voluntary reserve of trained private sector executives available for emergency federal employment," among other powers. (Some even more permissive applications of the Act were terminated in 1957.) Using the Act to protect coal and nuclear facilities would almost certainly be more controversial, as the link between national defense and keeping uneconomic coal generators running is not well-established. The Administration could apply the Act to "provide or guarantee loans to industry" for material-specific deliveries and production. "The president may also authorize the purchase of 'industrial items or technologies for installation in government or private industrial facilities,'" reports Ars.