By msmash from Slashdot's how-to-put-things-to-rest department
An anonymous reader shares a report: Amazon's taxes have become a campaign issue. In last week's Democratic debates, two different candidates (Cory Booker and Andrew Yang) called out Amazon for paying $0 in federal income taxes last year, even after reporting $4 billion in profits. Joe Biden, Elizabeth Warren, and President Trump himself have raised the same issue at various points on the campaign trail, always directed at Amazon. In a CNN interview after the second debate, Bernie Sanders singled the company out as an example of a broken tax code, saying simply, "I'm going to tax them."
"We pay every penny we owe in corporate taxes including $2.6 billion over the past three years," Amazon said when reached for comment. "We've invested $270 billion in the US since 2010 and created more than 275,000 jobs." But there's an awkward truth behind the political back-and-forth: we don't know what Amazon's tax bill really is. Like those of every other company in America, Amazon's tax returns are private, legally considered a trade secret. We don't know which tax breaks it's taking, or how it has structured its finances to avoid various taxes in favor of others. If Amazon says its tax bill was lower because of investments, we simply have to take the company at its word.
By msmash from Slashdot's up-next department
Germany's cyber-security agency is working on a set of minimum rules that modern web browsers must comply with in order to be considered secure. From a report: The new guidelines are currently being drafted by the German Federal Office for Information Security (the Bundesamt für Sicherheit in der Informationstechnik -- BSI), and they'll be used to advise government agencies and companies from the private sector on which browsers are safe to use. A first version of this guideline was published in 2017, but a new standard is being put together to account for improved security measures added to modern browsers, such as HSTS, SRI, CSP 2.0, telemetry handling, and improved certificate handling mechanisms -- all mentioned in a new draft released for public debate last week. According to the BSI's new draft, to be considered "secure," a modern browser must meet the following requirements, among others: it must support TLS; must have a list of trusted certificates; must support extended validation (EV) certificates; must verify loaded certificates against a Certificate Revocation List (CRL) or via the Online Certificate Status Protocol (OCSP); must use icons or color highlights to show whether communication with a remote server is encrypted or in plaintext; must allow connections to remote websites running on expired certificates only after specific user approval; and must support HTTP Strict Transport Security (HSTS) (RFC 6797). Further reading: Germany and the Netherlands To Build the First Ever Joint Military Internet.
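Of the mechanisms the BSI draft names, HSTS is easy to see concretely: a server sends a `Strict-Transport-Security` header and the browser must honor it. Below is a minimal parser for that header, an illustrative sketch only -- not the BSI's requirement text and not any browser's actual implementation:

```python
# Minimal parser for the HTTP Strict-Transport-Security header (RFC 6797),
# one of the mechanisms the BSI draft requires browsers to support.

def parse_hsts(header: str) -> dict:
    """Parse an HSTS header value into its directives.

    Returns a dict with 'max_age' (seconds, or None if absent) and
    'include_subdomains' (bool). Unknown directives are ignored,
    as RFC 6797 requires.
    """
    result = {"max_age": None, "include_subdomains": False}
    for directive in header.split(";"):
        directive = directive.strip()
        if directive.lower().startswith("max-age="):
            result["max_age"] = int(directive.split("=", 1)[1].strip('"'))
        elif directive.lower() == "includesubdomains":
            result["include_subdomains"] = True
    return result

print(parse_hsts("max-age=31536000; includeSubDomains"))
```

A conforming browser would, on seeing such a header, refuse plaintext HTTP connections to that host for the next `max-age` seconds.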
By msmash from Slashdot's what-America-will-dare,-America-will-do department
Gene Kranz may be the most famous flight director in NASA's history. He directed the actual landing portion of the first mission to put men on the moon, Apollo 11, and led Mission Control in saving the crew of Apollo 13 after an oxygen tank exploded on the way to the lunar surface. Now Kranz, 85, has completed another undertaking: the reopening of Mission Control at NASA's Johnson Space Center in Houston. From a report: The room where Kranz directed some of NASA's most historic missions, heralding U.S. exploration of space, was decommissioned in 1992. Since then, it had become a stop on guided tours of the space center but had fallen into disrepair. Kranz led a $5 million multiyear effort to restore Mission Control in time for the 50th anniversary of the first moon landing on July 20.
"I walked into that room last Monday for the first time when it was fully operational, and it was dynamite. I literally wept," Kranz said in an interview with NPR. "The emotional surge at that moment was incredible. I walked down on the floor, and when we did the ribbon-cutting the last two days, believe it or not, I could hear the people talking in that room from 50 years ago. I could hear the controllers talking." The room also brought back memories for Kranz of a shared sense of purpose. "That group of people united in pursuit of a cause, and basically the result was greater than the sum of the parts. There was a chemistry that was formed," Kranz said. "[The room] also has a meaning related to the American psyche, that what America will dare, America will do," Kranz said.
By msmash from Slashdot's how-about-that department
New documents obtained by Motherboard using a Freedom of Information request show how Amazon, Ring, a GPS tracking company, and the U.S. Postal Inspection Service collaborated on a package sting operation with the Aurora, Colorado Police Department in December. From the report: The operation involved equipping fake Amazon packages with GPS trackers, and surveilling doorsteps with Ring doorbell cameras in an effort to catch someone stealing a package on tape. The documents show the design and implementation of a highly elaborate public relations stunt, designed both to endear Amazon and Ring to local law enforcement and to make local residents fear the place they live. The parties were disappointed when the operation didn't result in any arrests. The Aurora Police Department received 25 Amazon boxes, Amazon-branded tape, and Amazon lithium-ion stickers as part of the operation. It also received 15 Ring doorbell cameras and 15 GL300W GPS trackers from 7P Solutions. "Operation Grinch Grab," as it was called internally, involved seven Aurora zip codes. These companies spent days with the Aurora Police Department preparing it for the operation, discussing local news coverage, and rewriting press releases.
By msmash from Slashdot's never-before department
The hospital technology, typically used to identify human ailments, captured perhaps the world's smallest magnetic resonance image. weiserfireman shares a report: Different microscopy techniques allow scientists to see the nucleotide-by-nucleotide genetic sequences in cells down to the resolution of a couple atoms as seen in an atomic force microscopy image. But scientists at the IBM Almaden Research Center in San Jose, Calif. and the Institute for Basic Sciences in Seoul, have taken imaging a step further, developing a new magnetic resonance imaging technique that provides unprecedented detail, right down to the individual atoms of a sample [Editor's note: the link may be paywalled; alternative source]. The technique relies on the same basic physics behind the M.R.I. scans that are done in hospitals. When doctors want to detect tumors, measure brain function or visualize the structure of joints, they employ huge M.R.I. machines, which apply a magnetic field across the human body. This temporarily disrupts the protons spinning in the nucleus of every atom in every cell. A subsequent, brief pulse of radio-frequency energy causes the protons to spin perpendicular to the pulse. Afterward, the protons return to their normal state, releasing energy that can be measured by sensors and made into an image.
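The scan physics summarized above can be made concrete with the Larmor relation: protons in a magnetic field precess at a frequency f = γB, which is why a "brief pulse of radio-frequency energy" is what disturbs them. The figures below are textbook values, not numbers from the article:

```python
# Protons precess at the Larmor frequency, f = gamma * B, where gamma is
# the gyromagnetic ratio. For hydrogen (1H), gamma is about 42.577 MHz
# per tesla -- a standard reference value, used here for illustration.

GAMMA_H = 42.577e6  # Hz per tesla, gyromagnetic ratio of 1H (approx.)

def larmor_frequency(field_tesla: float) -> float:
    """Precession frequency (Hz) of hydrogen nuclei in a given field."""
    return GAMMA_H * field_tesla

# A typical 3 T clinical scanner operates squarely in the radio band:
print(f"{larmor_frequency(3.0) / 1e6:.1f} MHz")  # roughly 127.7 MHz
```

The single-atom technique detects the same kind of resonance, but with a sensor tip close enough to resolve individual nuclei rather than a body-sized coil.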
By msmash from Slashdot's marching-forward department
From a blog post: For 25 years, the Robots Exclusion Protocol (REP) was only a de-facto standard. This sometimes had frustrating implications. On one hand, for webmasters, it meant uncertainty in corner cases, such as when their text editor included BOM characters in their robots.txt files. On the other hand, for crawler and tool developers, it also brought uncertainty; for example, how should they deal with robots.txt files that are hundreds of megabytes in size? Today, we announced that we're spearheading the effort to make the REP an internet standard. While this is an important step, it means extra work for developers who parse robots.txt files.
We're here to help: we open sourced the C++ library that our production systems use for parsing and matching rules in robots.txt files. This library has been around for 20 years and contains pieces of code that were written in the '90s. Since then, the library has evolved; we learned a lot about how webmasters write robots.txt files and the corner cases we had to cover for, and, where it made sense, we added what we learned over the years to the internet draft.
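Developers who don't want to adopt the C++ library can see the kind of matching the REP involves with Python's standard-library parser; the rules below are made up for illustration:

```python
# urllib.robotparser implements REP matching in the Python stdlib.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Paths under /private/ are blocked for all user agents; others are allowed.
print(parser.can_fetch("MyCrawler", "https://example.com/private/data"))  # False
print(parser.can_fetch("MyCrawler", "https://example.com/index.html"))    # True
```

The corner cases the blog post mentions (BOM characters, enormous files, malformed lines) are exactly where implementations like this one and Google's diverge, which is what a formal standard is meant to settle.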
By msmash from Slashdot's growing-adoption department
An anonymous reader shares a report: Three and a half years ago, Mark Russinovich, CTO of Azure, Microsoft's cloud, said, "One in four [Azure] instances are Linux." Next, in 2017, Microsoft revealed that 40% of Azure virtual machines (VMs) were Linux-based. Then in the fall of 2018, Scott Guthrie, Microsoft's executive VP of the cloud and enterprise group, told me in an exclusive interview, "About half Azure VMs are Linux." Now, Sasha Levin, a Microsoft Linux kernel developer, in a request that Microsoft be allowed to join a Linux security list, revealed that "the Linux usage on our cloud has surpassed Windows." Shocking, you say? Not really. Linux is largely what runs enterprise computing, both on in-house servers and in the cloud. Windows Server has been declining for years. In the most recent IDC Worldwide Operating Systems and Subsystems Market Shares report, covering 2017, Linux had 68% of the market. Its share has only increased since then.
By msmash from Slashdot's closer-look department
Border Gateway Protocol has served the internet well for decades. But when it goes wrong, you notice it. From a report: In a weeks-long stretch in 2014, hackers stole thousands of dollars a day in cryptocurrency from their owners. In 2017, internet outages cropped up around the United States for hours. Last year, Google Cloud suffered hours of disruptions. Earlier this month, a large swath of European mobile data was rerouted through the state-backed China Telecom. And on Monday, websites and services around the world -- including the internet infrastructure firm Cloudflare -- experienced hours of outages. These incidents may sound different, but they actually all resulted from problems -- some accidental, some malicious -- with a fundamental internet routing system called the Border Gateway Protocol. The web is distributed, but it's also interconnected. It needs to be so that data can move around worldwide without all being controlled by a single entity. So every time you load a website or send an email, BGP is the system responsible for optimizing the route that data takes across these sprawling, intertwined networks. And when it goes wrong, the whole internet feels it.
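The failure mode behind several of these incidents is that routers prefer the most specific (longest) matching prefix, so a bogus, more-specific announcement silently attracts traffic. Here is a toy illustration of longest-prefix selection using only the standard library; the AS numbers and prefixes are invented, and real BGP best-path selection also weighs AS-path length, local preference, and other attributes:

```python
# Longest-prefix-match route selection: a hijacker's /25 beats the
# legitimate /24 because it is more specific. (Illustrative only.)
import ipaddress

routes = {
    ipaddress.ip_network("203.0.113.0/24"): "AS64500 (legitimate)",
    ipaddress.ip_network("203.0.113.0/25"): "AS64511 (hijacker)",
}

def best_route(addr: str) -> str:
    """Return the route whose prefix most specifically covers addr."""
    ip = ipaddress.ip_address(addr)
    matches = [net for net in routes if ip in net]
    return routes[max(matches, key=lambda net: net.prefixlen)]

print(best_route("203.0.113.10"))   # falls in the /25, so the hijacker wins
print(best_route("203.0.113.200"))  # outside the /25, legitimate route wins
```

That is why a single mis-announced prefix, whether from a typo or an attack, can pull a whole service's traffic somewhere it was never meant to go.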
By msmash from Slashdot's growing-challenge department
Internships have long been an opportunity for inexperienced workers to try out different industries and build valuable contacts. For companies, it is a way to attract future talent. But increasingly interns are being asked to sign noncompete, nondisclosure and forced arbitration agreements, restrictions once reserved for higher-ranking employees [Editor's note: the link may be paywalled]. From a report: Advocates say legal covenants for interns help safeguard trade secrets such as customer lists in an era when it is easy to download information and share it, for instance on social media or with a competitor. But critics argue the agreements hamper young people's job opportunities and mobility even before they get a foot on the career ladder. [...] The noncompete agreement signed by Ms. Dunne [an intern profiled in the story] stated that she couldn't work for a competitor in software or banking within 15 miles of Wilmington for a year after leaving TekMountain. Ms. Dunne said she was given the agreement on her first day. "I had no idea what I signed, they didn't explain it to me."
After leaving TekMountain, she did a separate three-month internship with nCino, a financial technology company in Wilmington. In a May 7 letter, TekMountain's parent, CastleBranch, laid out her obligations under the noncompete agreement, described the confidentiality of its proprietary information as "very serious," and asked for details about her relationship with nCino. Ms. Dunne said she didn't respond. The noncompete "eliminated a good portion of the companies in town in the industry I wanted to be in," said Ms. Dunne, who is relocating to the Washington, D.C., area for a new job. "I have to leave all of my friends behind and start over."
By EditorDavid from Slashdot's warnings-for-Windows department
Consumer tech reporter Gordon Kelly describes "an important new Windows 10 warning (and the failure behind it)" for all 800 million of Microsoft's Windows 10 users:
What Microsoft confirms it did was quietly switch off Registry backups in Windows 10 eight months ago, despite giving users the impression this crucial safeguarding system was still working. As Ghacks spotted at the time, Registry backups would show "The operation completed successfully", despite no backup file being created...
Microsoft has now spelt out what was actually happening: "Starting in Windows 10, version 1803, Windows no longer automatically backs up the system registry to the RegBack folder. If you browse to the Windows\System32\config\RegBack folder in Windows Explorer, you will still see each registry hive, but each file is 0kb in size...."
So why has Microsoft done this? In the company's own words: "to help reduce the overall disk footprint size of Windows." And how big is a registry backup? Typically 50-100MB.
The article notes that this issue was flagged in Microsoft's Feedback Hub -- last October -- but "only now is the company coming clean about what happened."
The Ghacks blog points out that the Registry backup option "has been disabled but not removed according to Microsoft. Administrators who would like to restore the functionality may do so by changing the value of a Registry key."
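For reference, Microsoft's support documentation describes re-enabling the periodic backups with a single registry value (`EnablePeriodicBackup`); the command below is a sketch of that documented change, to be run from an elevated prompt and verified against Microsoft's support article before relying on it:

```
reg add "HKLM\SYSTEM\CurrentControlSet\Control\Session Manager\Configuration Manager" /v EnablePeriodicBackup /t REG_DWORD /d 1
```

After the change, Windows is documented to resume writing the hives to `%SystemRoot%\System32\config\RegBack` during automatic maintenance.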
By EditorDavid from Slashdot's playing-monopoly department
An anonymous reader quotes Reuters:
Google appears to have misused its dominant position in India and reduced the ability of device manufacturers to opt for alternate versions of its Android mobile operating system, Indian officials found before ordering a wider probe in an antitrust case. A 14-page order from the Competition Commission of India (CCI), reviewed by Reuters this week, found Google's restrictions on manufacturers seemed to amount to imposition of "unfair conditions" under India's competition law....
The Indian case is similar to one Google faced in Europe, where regulators imposed a $5 billion fine on the company for forcing manufacturers to pre-install its apps on Android devices. Google has appealed against the verdict.
By making pre-installation of Google's proprietary apps conditional, Google "reduced the ability and incentive of device manufacturers to develop and sell devices operated on alternate versions of Android", the CCI said in the order. "It amounts to prima facie leveraging of Google's dominance".
By EditorDavid from Slashdot's better-at-blocking department
The Brave web browser "claims to have delivered a '69x average improvement' in its ad-blocking technology using Rust in place of C++" reports ZDNet.
They cite a blog post by Brave performance researcher Dr. Andrius Aucinas and Brave's chief scientist Dr. Ben Livshits:
The improvements can be experienced in its experimental developer and nightly channel releases... "We implemented the new engine in Rust as a memory-safe, performant language compilable down to native code and suitable to run within the native browser core as well as being packaged in a standalone Node.js module," the two Brave scientists said. The new engine means the Chromium-based browser can cut the average request classification time down to 5.6 microseconds, a unit of time that's equal to a millionth of one second.
Aucinas and Livshits argue that the micro-improvements in browser performance might not seem significant to end users but do translate to good things for a computer's main processor. "Although most users are unlikely to notice much of a difference in cutting the ad-blocker overheads, the 69x reduction in overheads means the device CPU has so much more time to perform other functions," the pair explain.
Their blog post notes that loading a web page today can be incredibly complex. "Since loading an average website involves 75 requests that need to be checked against tens of thousands of rules, it must also be efficient."
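Back-of-the-envelope arithmetic from the article's own figures shows what those microseconds add up to per page load; this is a rough illustration of the claimed numbers, not a benchmark:

```python
# Using the article's figures: 75 requests per average page, each
# classified in about 5.6 microseconds by the new Rust engine.
requests_per_page = 75
per_request_us = 5.6

total_us = requests_per_page * per_request_us
print(f"ad-block overhead per page load: {total_us:.0f} microseconds")

# An engine 69x slower (the claimed improvement factor) would spend
# roughly this much on the same page instead:
print(f"old engine, same page: {total_us * 69 / 1000:.1f} milliseconds")
```

Which is the pair's point: half a millisecond per page is imperceptible either way, but the saved CPU time is freed for everything else the browser has to do.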
By EditorDavid from Slashdot's renting-a-robot department
"Robotics-as-a-service (RaaS) is about to eat the world of work," argues Hooman Radfar, a partner at the startup studio Expa who's been "actively investing in and looking for new companies" catalyzing the change.
Companies buy massive robots and software solutions that are customized -- at great cost -- to their specific needs. The massive conglomerates that sell these robots have dominated the field for decades, but that is about to change. One major factor driving this change is how dramatically globalization has reduced hardware production costs and capabilities. At the same time, cheap and powerful computing and cloud infrastructure are now also readily available and easy to spin up. As a result, vertical-specific, robotic-powered, solutions can today be offered as variable cost services versus being sold at a fixed cost. Just as cable companies include the costs of set-top boxes in their monthly bill, robots and their associated software will be bundled together and sold in a subscription package. This change to the robotics business model will have profound implications, radically transforming markets and at the same time changing the future of work.
With a new variable cost model in place as a result of subscription packages, it's simple to calculate when a market is about to tip to favor RaaS. A market has hit its automation tipping point when an RaaS solution is introduced with a unit cost that is less than or equal to the unit cost for humans-in-the-loop to conduct the same task... One market that has already reached its automation tipping point is the enterprise building security market... Crop dusting ($70 billion), industrial cleaning ($78 billion), warehouse management ($21 billion), and many more service markets are tipping. When these sectors hit their automation tipping point, we will see the same level of industry disruption currently taking place in the building security market.
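The "automation tipping point" described above reduces to a unit-cost comparison, which can be stated as a one-line check; the dollar figures here are hypothetical, for illustration only:

```python
# A market has tipped once the RaaS unit cost is at or below the unit
# cost of humans-in-the-loop performing the same task.
def has_tipped(raas_unit_cost: float, human_unit_cost: float) -> bool:
    """True once the per-task-unit RaaS subscription cost is <= human cost."""
    return raas_unit_cost <= human_unit_cost

# e.g. a (hypothetical) security robot billed at $14/hour of coverage
# versus a $20/hour human guard:
print(has_tipped(14.0, 20.0))  # True: this market has tipped
print(has_tipped(25.0, 20.0))  # False: humans still cheaper per unit
```

The subscription model matters because it moves the robot from a large fixed capital expense to a variable cost, making this comparison direct.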
By EditorDavid from Slashdot's Plan-B department
Slashdot's gotten over 17,000 votes in its poll about which web browser people use on their desktop. (The current leader? Firefox, with 53% of the vote, followed by Chrome with 30%.)
But Slashdot reader koavf asks an interesting follow-up question: "What's everyone's go-to Plan B browser and why?"
To start the conversation, here's how James Gelinas (a contributor at Kim Komando's tech advice site) recently reviewed the major browsers:
He calls Chrome "a safe, speedy browser that's compatible with nearly every page on the internet" but also says that Chrome "is notorious as a resource hog, and it can drastically slow your computer down if you have too many tabs open."
"Additionally, the perks of having your Google Account connected to your browser can quickly turn into downsides for the privacy-minded among us. If you're uncomfortable with your browser knowing your searching and spending behaviors, Chrome may not be the best choice for you."
He calls Firefox "the choice for safety".
"Predating Chrome by 6 years, Firefox was the top choice for savvy Netizens in the early Aughts. Although Chrome has captured a large segment of its user base, that doesn't mean the Fox is bad. In fact, Mozilla is greatly appreciated by fans and analysts for its steadfast dedication to user privacy... Speedwise, Firefox isn't a slouch either. The browser is lighter weight than Chrome and is capable of loading some websites even faster."
He calls Apple's Safari and Microsoft Edge "the default choice...because both of these browsers come bundled with new computers."
By EditorDavid from Slashdot's changes-in-the-weather-forecasting department
NPR notes today's "supercomputer-driven" weather modelling can crunch huge amounts of data to accurately forecast the weather a week in advance -- pointing out that "a six-day weather forecast today is as good as a two-day forecast was in the 1970s."
Here are some highlights from their interview with Andrew Blum, author of The Weather Machine: A Journey Inside the Forecast:
One of the things that's happened as the scale in the system has shifted to the computers is that it's no longer bound by past experience. It's no longer, the meteorologists say, "Well, this happened in the past, we can expect it to happen again." We're more ready for these new extremes because we're not held down by past expectations...
The models are really a kind of ongoing concern. ... They run ahead in time, and then every six hours or every 12 hours, they compare their own forecast with the latest observations. And so the models in reality are ... sort of dancing together, where the model makes a forecast and it's corrected slightly by the observations that are coming in...
It's definitely run by individual nations -- but individual nations with their systems tied together... It's a 150-year-old system of governments collaborating with each other as a global public good... The positive example from last month was with Cyclone Fani in India. And this was a very similar storm to one 20 years ago, in which tens of thousands of people had died. This time around, the forecast came far enough in advance and with enough confidence that the Indian government was able to move a million people out of the way.
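The forecast-and-correction loop Blum describes, where the model "dances" with incoming observations every six or twelve hours, is data assimilation. Here is a toy scalar version of that correction step; the blending weight is purely illustrative, and real schemes (variational and ensemble methods) are far more sophisticated:

```python
# Toy data-assimilation step: nudge the model forecast toward a fresh
# observation. weight=0 trusts the model entirely; weight=1 trusts the
# observation entirely. (Illustrative sketch, not a real scheme.)
def assimilate(forecast: float, observation: float, weight: float = 0.2) -> float:
    """Blend the model state with an observation."""
    return forecast + weight * (observation - forecast)

# model says 21.0 C at a grid point, a station reports 24.0 C:
print(assimilate(21.0, 24.0))  # corrected slightly upward, to 21.6
```

Repeating this cycle keeps the model from drifting away from reality while still letting its physics carry the forecast forward between observations.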
By EditorDavid from Slashdot's durable-DRAM department
Lancaster University has announced a "universal computer memory" breakthrough combining the fast, low-energy storage of DRAM memory with the robustness of flash memory. They're now envisioning ultra-low energy consumption computers which would never need to boot up -- and can "instantaneously and imperceptibly" slip into an energy-saving sleep mode.
Long-time Slashdot reader Hrrrg pointed us to this announcement:
A U.S. patent has been awarded for the electronic memory device with another patent pending, while several companies have expressed an interest or are actively involved in the research. The inventors of the device used quantum mechanics to solve the dilemma of choosing between stable, long-term data storage and low-energy writing and erasing... [Specifically, "by exploiting the quantum-mechanical properties of an asymmetric triple resonant-tunnelling barrier."]
Physics Professor Manus Hayne of Lancaster University said, "Our device has an intrinsic data storage time that is predicted to exceed the age of the Universe, yet it can record or delete data using 100 times less energy than DRAM."
The announcement predicts the technology could reduce peak power consumption in data centers by 20%.
By EditorDavid from Slashdot's breach-this department
An anonymous reader quotes CNET:
A former Equifax executive who sold his stock in the consumer credit reporting firm before it announced a massive data breach has been sentenced to four months in federal prison for insider trading. Jun Ying, former chief information officer for the company's US Information Solutions, was also ordered to pay about $117,000 in restitution and a $55,000 fine, the US Attorney's Office said Thursday... Ying sold all his shares in Equifax, making more than $950,000. Ying's insider trading happened 10 days before Equifax publicly announced its breach.
Ying, 44, is the second Equifax employee convicted of insider trading related to the data breach. Sudhakar Reddy Bonthu, a former Equifax software development manager, pleaded guilty in 2018 to using the insider information to make more than $75,000 on the stock market. Bonthu was ordered to serve eight months home confinement, pay a $50,000 fine and forfeit the proceeds from the stock sale.
In announcing the sentence, U.S. Attorney Byung J. Pak said that Ying had "thought of his own financial gain before the millions of people exposed in this data breach even knew they were victims."
By EditorDavid from Slashdot's surging-silicon department
Slashdot reader MojoKid writes:
AMD announced its 3rd Gen Ryzen 3000 series processors at Computex earlier this month, and the company promises its Zen 2 architecture will bring single-threaded performance parity with Intel and markedly better multithreaded throughput in content creation and other high-end workloads.
Intel has obviously taken notice of AMD's Zen 2 advancements and nowhere is its renewed keen focus more evident than in an internal memo that just leaked out to public venues. The memo was originally posted on Intel's internal "Circuit News" employee portal and it's quite revealing. The memo, which is entitled, "AMD competitive profile: Where we go toe-to-toe, why they are resurgent, which chips of ours beat theirs", is a surprisingly frank look at how AMD has managed to get the best of Intel, at least currently, and how the company should manage this renewed or "resurgent" competitive threat.
What's most surprising about the memo, which was penned by Circuit News Managing Editor Walden Kirsch, is how flattering it is in general to AMD, pointing out that AMD was the best-performing stock on the S&P 500 for 2018. In terms of Zen 2 and AMD's Ryzen 3000 series, the memo notes: "Intel 9th Gen Core processors are likely to lead AMD's Ryzen-based products on lightly threaded productivity benchmarks as well as many gaming benchmarks," Kirsch writes. "For multi-threaded workloads, such as heavy content creation workloads, AMD's Matisse is expected to lead." All in all, the internal memo is a rather insightful and well-reasoned look at the threat that AMD poses to Intel and how the company might respond.