By EditorDavid from Slashdot's days-in-court department
As America's antitrust investigators eye Google, Apple, Facebook, and Amazon for possible government intervention, Bloomberg offers nine "lessons learned" from the way Microsoft handled its own antitrust investigation:
Don't deny the obvious... In the app-store business, Google and iPhone maker Apple together control more than 95 percent of all US mobile app spending by consumers, according to Sensor Tower data. It could be more effective for these companies not to start by denying that leadership position -- if you have 80 or 90 percent of a market, arguing that you don't really dominate isn't the hill you want your legal reasoning to die on...
At the height of Microsoft's hubris (or carelessness, or both), the company sent Windows chief Jim Allchin to the stand with a doctored video that purported to show how computing performance would be degraded when the browser was removed from Windows on a single PC. It was actually done on several different computers and was an illustration of what might happen rather than a factual test, as the company initially claimed -- a fact that came to light only after several days of the government picking through every inconsistency in the video. Microsoft remade the simulation several times in an effort to save the testimony. The company seemed to think it could get away with baldly stating a technological claim and mocking up something that backed it up, perhaps reasoning that no one would know the difference, but it miscalculated badly...
In an interview last year at the Code Conference, Microsoft President and Chief Legal Officer Brad Smith lamented the distraction the case caused, and cited it as a reason the company missed out on the search market -- the business that fueled the runaway success of Google, now under the microscope itself. Others have pinned Microsoft's abysmal performance in mobile computing partially on constraints and distractions from the case...
Consider settling early.
By EditorDavid from Slashdot's face-that-launched-a-thousands-flights department
"Delta Air Lines announced it will give passengers who fly out of Minneapolis-St. Paul International Airport the option to use facial recognition to board their flight instead of a standard boarding pass," reported a CBS affiliate this week.
The facial scanners will be installed this week at 16 gates, with availability on all international Delta flights beginning in July. The airline is working with Customs and Border Protection on the process. Gate agents use the facial scans to board passengers, so they no longer need to manually compare each traveler's face with their passport photo. Delta says the process saves about two seconds per passenger, or about nine minutes for a plane carrying 270 people.
Delta says 72% of its customers have said they prefer facial recognition to standard boarding procedures. But James Lileks, a columnist for the Star Tribune, explains some of the ways this makes him uncomfortable:
Here's the thing. You don't sign up for the facial recognition. You don't send them your face. They already have it. This part is just... glided over in the news reports, waved away like a minor detail you needn't worry your silly little head about.
The picture they probably have is my passport photo, taken in 2010... So I guess I'll have to stuff my cheeks with cotton before I lean into the machine that connects to a database of everyone's mug, and hope it doesn't go off
"I don't know what they do with people who grew a beard," Lileks adds, "but there's probably the option to shave on the spot."
By EditorDavid from Slashdot's making-big-money department
The Hustle tells the story of a mysterious legend who "produced thousands of the ugliest counterfeit $1 bills ever made...so poorly done that the Secret Service thought the perpetrator was intentionally mocking them" -- using a small hand-driven printing press in his kitchen:
It was printed on cheap bond paper that could be found at any stationery store. The serial numbers were "fuzzy" and misaligned, the Secret Service later said. George Washington's likeness was "clumsily retouched, murky and deathlike," with black blotches for eyes. And just for good measure, the ex-president's name was misspelled "Wahsington"...
He also never spent money in the same place twice: His "hits" spanned subway stations, dime stores, and tavern owners all over Manhattan. Investigators set up a map of New York in their office, marking each $1 counterfeit location with a red thumbtack. They handed out some 200,000 warning placards at 10,000 stores. They tracked down dozens of folks who'd spent the bills. But 10 years came and went, and the search for Mister 880 turned into the largest and most expensive counterfeit investigation in Secret Service history. By 1947, the Secret Service had documented some $7,000 of the distinctively terrible fake $1 bills -- about 5% of the $137,318 of fake currency estimated to be in circulation nationwide. As it turned out, the worst counterfeiter in history was also the most elusive...
Agents busted into the brownstone, expecting to find a criminal mastermind. Instead, they were greeted by a jovial 73-year-old -- "5'3" tall, [with a] lean hard muscled frame, a healthy pink face, bright blue eyes, a shiny bald dome, a fringe of snowy hair over his ears, a wispy white mustache, and hardly any teeth." It was Emerich Juettner, the old junk collector. Juettner seemed unfazed and endearingly aloof. When answering questions, he'd pause and offer a toothless grin...
By EditorDavid from Slashdot's boxes-with-smiles department
theodp writes: "Amazon Future Engineer students across the country are graduating from high school," reports the Amazon Day One blog, "and to celebrate, Amazonians visited select classrooms to meet some of the students and to check out their impressive computer science progress and end-of-year projects" [TV coverage of an 'Amazon graduation'].
Amazon Future Engineer "is a four-part, childhood-to-career program aimed at inspiring and educating 10 million students from underrepresented and underserved communities each year to try computer science and coding. Amazon strives to achieve this by inspiring millions of children through coding camps and Code.org's Hour of Code program, funding computer science courses in high schools across the country, providing 100 students with four-year college scholarships in computer science, and offering Amazon internships to scholarship recipients."
The importance of CS education to Amazon is highlighted in a new Washingtonian story, The Real Story of How Virginia Won Amazon's HQ2, which reports, "Northern Virginia's ultimate proposal was centered around an effort to provide Amazon -- or any other tech firm that wanted to come -- with all the educated workers it needed, now and in the future. [Virginia Economic Development Partnership CEO Stephen] Moret's team proposed increasing tech education from kindergarten through 12th grade, expanding university offerings to produce up to 17,500 new bachelor's degrees in computer science and related fields, and building a tech campus that could produce the same number of master's degrees."
By EditorDavid from Slashdot's our-own-worst-enemies department
Long-time Slashdot reader chicksdaddy shares news of a recent report from cybersecurity company Forcepoint's X-Labs, examining how cybersecurity decision-making is affected by six common biases:
For instance, Forcepoint found that older generations are typically characterized by information security professionals as "riskier users based on their supposed lack of familiarity with new technologies." However, studies have found the opposite to be true: younger people are far more likely to engage in risky behavior like sharing their passwords to streaming services. The presumption that older workers pose more of a risk than younger workers is an example of so-called "aggregate bias," in which subjects make inferences about an individual based on a population trend. Biases like this misinform security professionals by directing their focus toward individual users based on their supposed group membership; as a result, analysts end up scrutinizing the wrong individuals as sources of security issues.
Availability bias may skew cybersecurity analysts' decision-making toward hot topics in the news, which crowd out other information the analysts know but encounter less frequently, leading them to make less well-rounded decisions. People encounter "confirmation bias" most frequently during research: by neglecting the bigger picture, analysts make assumptions and then tailor their research specifically to confirm those assumptions. When looking for issues, analysts can often find themselves looking for confirmation of what they already believe to be the cause, as opposed to searching for all possible causes.
The fundamental attribution error also plays a significant role in misleading security analysts, Forcepoint found. This is manifested when information security analysts or software developers place blame on users being inept instead of considering that their technology may be faulty or that internal factors contributed to a security lapse.
By EditorDavid from Slashdot's Electronic-Numerical-Integrator-and-Computer department
On Princeton's "Freedom to Tinker" site, the founder of the ENIAC Programmers Project summarizes 20 years of its research, remembering the "incredible acts of computing innovation during and just after WWII" that "established the foundation of modern computing and programming."
Commissioned in 1942, and launched in 1946, the ENIAC computer, with its 18,000 vacuum tubes, was the world's very first modern computer (all-electronic, programmable, and general-purpose). "Key technologists of the time, of course, told the Army that the ENIAC would never work."
Slashdot reader AmiMoJo quotes Cory Doctorow:
The ENIAC programmers had to invent programming as we know it, working without programming codes (these were invented a few years later for UNIVAC by Betty Holberton): they "broke down the differential calculus ballistics trajectory program" into small steps the computer could handle, then literally wired together the program by affixing cables and flicking the machine's 3,000 switches in the correct sequences. To capture it all, they created meticulous flowcharts that described the program's workings.
From the site:
Gunners needed to know what angle to shoot their artillery to hit a target 8 to 10 miles away.... The Army's Ballistics Research Labs (BRL) located women math graduates from schools nearby [who] worked day and night, six days a week, calculating thousands of ballistics trajectories which were compiled into artillery firing tables and sent to soldiers in the battlefields. It was a tremendous effort. Second, the Army and BRL agreed to commission a highly-experimental machine... [Six] women studied ENIAC's wiring and logical diagrams and taught themselves how to program it...
By EditorDavid from Slashdot's money-matters department
An assistant professor of finance at Stony Brook University criticizes the argument that technology "is quickly displacing a large number of workers, and the pace will only increase as automation and other forms of artificial intelligence become more advanced," specifically calling out Universal Basic Income proponents Elon Musk, Andrew Yang, and YCombinator Chairman Sam Altman:
The problem is, there's no indication that automation is going to make human workers redundant anytime soon. Technologists probably tend to believe in automation-induced job loss because they're familiar with the inventions that are constantly forcing people to change what they do for a living. But even as these new technologies have been rolled out, the fraction of Americans with jobs has remained about the same over time. Meanwhile, evidence that automation causes job losses throughout the economy is slim... [Some studies] fail to say how many new jobs will be created in the process, so they don't give any picture of technology's overall impact on the labor market.
Thus, when UBI proponents make the dubious claim that basic income is necessary to save people from the rise of the robots, they undermine their case. They also send the message that they think a huge percent of American workers are simply too useless to be gainfully employed in the future -- hardly an appealing message.
The second dubious reason to support UBI is the idea that it can replace traditional forms of welfare spending, like food stamps and housing vouchers. Libertarian economist Milton Friedman supported a negative income tax for this reason, and modern-day libertarians often espouse this view as well. But there are reasons UBI will never be a one-size-fits-all solution. First, it's expensive. Giving all Americans $12,000 a year costs a lot more than giving money to poor people only.
By EditorDavid from Slashdot's regrets-I've-had-a-few department
Bill Gates "clearly hasn't got over his biggest mistake," writes Inc. columnist Chris Matyszczyk.
Speaking at a recent VC firm event, Gates told the audience:
The greatest mistake ever is... whatever mismanagement I engaged in that caused Microsoft not to be what Android is. That is, Android is the standard phone platform -- non-Apple form -- phone platform. That was a natural thing for Microsoft to win...
There's room for exactly one non-Apple operating system, and what's that worth? $400 billion that would be transferred from company G to company M.
"You see? He couldn't even utter the word Google," quips Inc.'s columnist. "That's how much it hurts him."
The column also notes that Google "didn't create Android. It bought it in 2005," and "being open-source meant that Google could offer it to so many phone manufacturers around the world.... Would Microsoft have been so generous of spirit?"
By EditorDavid from Slashdot's nation-state-actors department
This year America's National Security Agency (NSA) is once again "developing a cyber challenge and daring more than 330 schools and 2,600 students to solve it," writes Federal News Network.
Slashdot reader eatvegetables shares their report:
Kathy Hutson, the senior strategist for industry and academic engagement at the NSA, said the Codebreaker Challenge has become one of the best ways to attract the next generation of talent to the federal government... NSA launched the Codebreaker Challenge in 2013 as a way to further connect with students and professors, who are focused on technology and cyber issues. Over the last six years, the annual initiative has become a much-anticipated challenge with professors making it a part of their classes and students testing their mettle against NSA's cyber experts...
The initiative provides students, professors and anyone else who is interested "with a hands-on opportunity to develop their reverse-engineering/low-level code analysis skills while working on a realistic problem set centered around the NSA's mission," said Eric Bryant, a technical director in the crypto analysis organization at the NSA. The 2018 challenge focused on ransomware and blockchain, requiring participants to solve eight separate, but related challenges... Bryant said a group of NSA cyber experts develop the challenge each year on top of their regular duties. He said they try to focus on areas that are either up-and-coming or current cyber threats and attack vectors. For the 2019 Codebreaker Challenge, Bryant said it likely will focus on mobile security threats, probably using an Android operating system...
By EditorDavid from Slashdot's saddle-seats department
schwit1 quotes SFGate: Airlines are squeezing as many passengers as they can onto their jets, but one seat manufacturer believes its product can help carriers push capacity to the absolute limit. And it may help push down fares.
Say goodbye to whatever personal space you had left.
At this week's Paris Air Show, lots of curious convention-goers eagerly wanted to try out Aviointeriors' "SkyRider" saddle-like airplane seat, but that's probably not the reception it would get if people found it installed on their next flight.
SkyRider passengers would lean on a bicycle-seat type cushion that sits higher than your traditional airline seat. Legs sort of hang off the saddle, as they would if you were riding a horse. The seat back sits straight up, forcing good posture. A knee cut-out provides another precious few inches of legroom.
You're neither sitting nor standing — you're sort of leaning.
Airplanes can install the seats in part of their planes as an alternative to more expensive seating options, the article points out. But it also notes that the company "is still looking for its first buyer...and has been for nearly 10 years."
By EditorDavid from Slashdot's permanent-mute department
PolygamousRanchKid shares a report:
GeekWire obtained an internal Microsoft list of prohibited and discouraged technology -- software and online services that the company doesn't want its employees using as part of their day-to-day work. We first picked up on rumblings of the prohibition from Microsoft employees who were surprised that they couldn't use Slack at work, before tracking down the list and verifying its authenticity. While the list references the competitive nature of these services in some situations, the primary criteria for landing in the "prohibited" category are related to IT security and safeguarding company secrets.
Slack is in the "prohibited" category on the internal Microsoft list, along with tools such as the Grammarly grammar checker and Kaspersky security software. Services in the "discouraged" category include Amazon Web Services, Google Docs, PagerDuty and even the cloud version of GitHub, the popular software development hub and community acquired by Microsoft last year for $7.5 billion...
"It's not just the risk that Google will try to find trade secrets from data stored on their servers," said Christopher Budd, who has worked in security technology for 20 years, including past roles in Microsoft security and privacy communications. "When you're at Microsoft, you're at risk of state sponsored industrial espionage."
The article notes that in the past Microsoft adopted an even harsher stance to employees using competing products. "At a company meeting during his tenure as CEO, Steve Ballmer once famously snatched an iPhone from an employee and pretended to stomp on it..."
But GeekWire also argues that Microsoft's prohibiting of a popular chat tool "can have implications in a competitive recruiting environment."
By EditorDavid from Slashdot's American-air department
The Grim Reefer shared this report from the Associated Press:
Over the last two years America had more polluted air days than just a few years earlier, federal data shows. While it remains unclear whether this is the beginning of a trend, health experts say it's troubling to see air quality progress stagnate. There were 15% more days with unhealthy air in America both last year and the year before than there were on average from 2013 through 2016, the four years with the fewest such days since at least 1980...
Air quality is affected by a complex mix of factors, both natural and man-made. Federal regulations that limit the emissions of certain chemicals and soot from factories, cars and trucks have helped dramatically improve air quality over recent decades. In any given year, however, air quality can be affected by natural variations... Air pollution experts agree wildfires likely have had a role, along with random variation, a stronger economy which leads to more consumption of fuels, and a changing climate. Higher temperatures increase the chances for fires and smog.
Even with the recent stagnation, there are far fewer bad air days now than in the early 2000s, 1990s and 1980s.
They also report that "about 100,000 Americans each year die prematurely because of polluted air, studies show."
By EditorDavid from Slashdot's losing-controls department
Technology writer Matthew MacDonald began writing QuickBASIC code back in 1988 on the DOS operating system, sharing it on a 3.5-inch floppy disk. "I still remember writing code in white text on its cheery blue background..."
At the same time that Microsoft released Windows 3.0 -- the first version that was truly successful -- they also launched Visual Basic 1.0. Here was something entirely new. You could create buttons for your programs by drawing them on the surface of a window, like it was some kind of art canvas. To make a button do something, all you had to do was double-click it in the design environment and write some code. And you didn't use cryptic C++ code, with piles of classes, complex memory management, and obscure calls into the Windows API. Instead, you wrote friendly-looking VB code, like a civilized person.
All the graphical pizzazz was impressive, but the real secret to VB's success was its practicality. There was simply no other tool that a developer could use to sketch out a complete user interface and get coding as quickly as VB... By the release of VB 6 -- the last version of classic Visual Basic -- it was estimated that there were ten times more coders writing in VB than in the unforgiving C++ language. And they weren't just mocking up toy applications. Visual Basic wormed its way into company offices and even onto the web through ASP (Active Server Pages), another monstrously popular technology. Now you could create web pages that talked to VB components, called databases, and wrote HTML on the fly...
By EditorDavid from Slashdot's non-aggression-pacts department
"Some businesses, such as pharmaceuticals, still spend enormous amounts of time and money on intellectual property (IP) fights," reports ZDNet. But "thanks to the Open Invention Network (OIN), the largest patent non-aggression community in history, Linux and related open-source technologies have become mostly free of these expensive entanglements."
And now they're reporting that the OIN's membership has grown to over 3,000 licensees:
OIN's mission is to enable Linux, its related software, and its programmers to develop and monetize without being hogtied by patent fights. In Linux's early years, this was a constant threat. Now, thanks largely to the OIN's efforts to get everyone to agree on the basic open-source principle -- that it's better and more profitable to share than to cling to proprietary property -- open-source software has taken off in the marketplace... The OIN, which has grown by 50% in the last two years, has turned patent non-aggression into policy for thousands of companies. By agreeing to the OIN license, members gain access to patented inventions worth hundreds of millions of dollars while promoting a favorable environment for Linux and related open source software.
The license works by everyone agreeing to patent non-aggression in core open-source technologies, cross-licensing Linux System patents to one another on a royalty-free basis. OIN-owned patents are similarly licensed royalty-free to any organization that agrees not to assert its patents against the Linux System. While it started out covering just the Linux operating system, the Linux System definition has evolved to address adjacent Linux-related open-source technologies. It now covers open-source programs spanning mobile communications, mobile payments, computing, blockchain, cloud, Internet of Things, and embedded and automotive technologies.
By EditorDavid from Slashdot's damn-statistics department
Remember that mouse study which concluded gut bacteria may contribute to autism symptoms?
Jon Brock, a cognitive scientist with 18 years research experience on neurodevelopmental conditions, including autism, has posted a Medium post summarizing new critiques of the research emerging online. (For example, from Professor Thomas Lumley, a statistical researcher who has concluded that the study's analysis "is wrong," and "arguably due in part to a poor GUI design.")
Soon after publication, scientists began expressing concerns about the paper on social media. These were echoed in a blogpost by drug discovery chemist Derek Lowe and then in a series of comments on the PubPeer website. Looking more closely at the data, the results are a whole lot less compelling than the media coverage, the press releases, and even the paper itself suggest...
The differences between mice with autistic and non-autistic donors are subtle if they exist at all. And there are reasons to be skeptical about even these small effects. Mice are not tiny humans with tails. Autism is defined in terms of human behaviour. And so the claim that mice showed "autism-like" behaviour relies on an assumption that the mouse behaviours under investigation are in some sense equivalent to the behaviours that define autism in humans...
By EditorDavid from Slashdot's nation-state-actors department
An anonymous reader quotes MarketWatch:
Iran has increased its offensive cyberattacks against the U.S. government and critical infrastructure as tensions have grown between the two nations, cybersecurity firms say.
In recent weeks, hackers believed to be working for the Iranian government have targeted U.S. government agencies, as well as sectors of the economy, including oil and gas, sending waves of spear-phishing emails, according to representatives of cybersecurity companies CrowdStrike and FireEye, which regularly track such activity. It was not known if any of the hackers managed to gain access to the targeted networks...
"Both sides are desperate to know what the other side is thinking," said John Hultquist, director of intelligence analysis at FireEye. "You can absolutely expect the regime to be leveraging every tool they have available to reduce the uncertainty about what's going to happen next, about what the U.S.'s next move will be...."
According to the article, one of the phishing emails "appeared to come from the Executive Office of the President and seemed to be trying to recruit people for an economic adviser position.
"Another email was more generic and appeared to include details on updating Microsoft Outlook's global address book."
By EditorDavid from Slashdot's blistering-benchmarks department
Slashdot reader dcblogs quotes TechTarget:
Ten years ago, China had 21 systems on the Top500 list of the world's largest supercomputing systems. It now has 219, according to the biannual listing, which was updated just this week. At its current pace of development, China may have half of the supercomputing systems on the Top500 list by 2021.... U.S. supercomputers make up 116 of the latest Top500 list.
Despite being well behind China in total system count, the U.S. leads in overall performance, as measured by the High Performance Linpack (HPL) benchmark. The HPL benchmark is used to solve linear equations. The U.S. has about 38% of the aggregate Top500 list performance. China is second, at nearly 30% of the performance total. But this performance metric has flip-flopped between China and the U.S., because it's heavily weighted by the largest systems. The U.S. owns the top two spots on the latest Top500 list, thanks to two IBM supercomputers at U.S. national laboratories. These two systems, Summit and Sierra, alone represent 15.6% of the HPL performance measure.
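For readers curious what the HPL metric actually measures: it times a dense linear solve and reports floating-point operations per second. Here's a minimal, illustrative sketch in Python with NumPy -- not the real HPL code, which is a distributed C/MPI program run with matrix dimensions in the millions -- using HPL's standard (2/3)n³ + 2n² operation count:

```python
import time
import numpy as np

# Illustrative sketch of an HPL-style measurement (not the real benchmark):
# solve a dense system Ax = b and estimate FLOP/s from the elapsed time.
n = 500  # real Top500 runs use vastly larger matrices
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))
x_true = rng.standard_normal(n)
b = A @ x_true

start = time.perf_counter()
x = np.linalg.solve(A, b)  # LU factorization plus triangular solves
elapsed = time.perf_counter() - start

# HPL scores systems by FLOP/s, counting (2/3)n^3 + 2n^2 operations
flops = (2 / 3) * n**3 + 2 * n**2
print(f"residual: {np.linalg.norm(A @ x - b):.2e}")
print(f"approx {flops / elapsed / 1e9:.2f} GFLOP/s")
```

Summit's roughly 148 petaflop HPL score corresponds to the same calculation scaled across tens of thousands of GPUs and CPU cores.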
Nathan Brookwood, principal analyst at Insight 64, says China is concerned the U.S. may limit its x86 chip imports, and while China may look to ARM, they're also investigating the RISC-V processor architecture.
Paresh Kharya, director of product marketing at Nvidia, tells TechTarget "We expect x86 CPUs to remain dominant in the short term. But there's growing interest in ARM for supercomputing, as evidenced by projects in the U.S., Europe and Japan. Supercomputing centers want choice in CPU architecture."
By EditorDavid from Slashdot's asleep-at-the-wheel department
"In the last week, two different people have captured video of Tesla vehicles traveling down a freeway with an apparently sleeping driver behind the wheel," writes Ars Technica.
Iwastheone shares their report:
Both incidents happened in California. Last week, local television stations in Los Angeles aired footage from viewer Shawn Miladinovich of a Tesla vehicle driving on LA's 405 freeway. The driver "was just fully sleeping, eyes were shut, hands nowhere near the steering wheel," said Miladinovich, who was a passenger in a nearby car, in an interview with NBC Channel 4. Miladinovich said he saw the vehicle twice, about 30 minutes apart, as both cars traveled along the 405 freeway. The driver appeared to be asleep both times...
Another video of an apparently sleeping Tesla driver was posted to Reddit over the weekend -- this one from the San Francisco Bay Area. The Reddit user who posted the video, MiloWee, said that she tried "several times" to wake him up by honking. "It worked, but he fell back asleep," she wrote....
Last month, police in the Netherlands pulled over a Tesla driver who appeared to be asleep and intoxicated. Another video posted in January appeared to show Tesla drivers asleep at the wheel. In an incident last November, it took police in Silicon Valley seven miles to pull over a Tesla car with an apparently sleeping driver. He was arrested for driving under the influence. Another driver in early 2018 was discovered passed out behind the wheel of his stopped Tesla vehicle on the San Francisco-Oakland Bay Bridge.
According to the San Francisco Chronicle, the man "attempted to reassure arresting CHP officers onsite that the car was 'on autopilot.'"