By BeauHD from Slashdot's not-on-my-watch department
A recently discovered hole in Valve's API allowed observers to generate extremely precise and publicly accessible data for the total number of players for thousands of Steam games. While Valve has now closed this inadvertent data leak, Ars can still provide the data it revealed as a historical record of the aggregate popularity of a large portion of the Steam library. From the report: The new data derivation method, as ably explained in a Medium post from The End Is Nigh developer Tyler Glaiel, centers on the percentage of players who have accomplished developer-defined Achievements associated with many games on the service. On the Steam web site, that data appears rounded to two decimal places. In the Steam API, however, the Achievement percentages were, until recently, provided to an extremely precise 16 decimal places.
This added precision means that many Achievement percentages can only arise from specific whole-number fractions of players. (This is useful since each game's player count must be a whole number.) With multiple Achievements to check against, it's possible to find a common denominator that works for all the percentages with high reliability. This process allows for extremely accurate reverse engineering of the denominator representing the total player base behind each Achievement percentage. As Glaiel points out, for instance, an Achievement earned by 0.012782207690179348 percent of players on his game translates precisely to 8 players out of 62,587 without any rounding necessary (once some vagaries of floating-point representation are ironed out). Ars has shared the Achievement-derived player numbers in its report, along with a handy CSV file. Some of the titles with the most total unique players include Team Fortress 2 (50,191,347 player estimate), Counter-Strike: Global Offensive (46,305,966), PLAYERUNKNOWN'S BATTLEGROUNDS (36,604,134), Unturned (27,381,399), and Left 4 Dead 2 (23,143,723).
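Glaiel's trick can be sketched in a few lines: given one or more high-precision Achievement percentages, search for the smallest total player count that turns every percentage into a whole number of players. This is a minimal illustration of the idea, not Glaiel's actual code, and the tolerance constant is an assumption of ours to absorb floating-point noise.

```python
def infer_player_count(percentages, max_players=20_000_000, tol=1e-6):
    """Smallest total player count for which every Achievement percentage
    maps to a whole number of players (within floating-point tolerance)."""
    for total in range(1, max_players + 1):
        for pct in percentages:
            players = pct / 100.0 * total
            if abs(players - round(players)) > tol:
                break  # this total leaves a fractional player; try the next
        else:
            return total
    return None

# The example from Glaiel's post: one 16-decimal percentage is enough here.
print(infer_player_count([0.012782207690179348]))  # 62587 (8 players earned it)
```

The rounded two-decimal percentages shown on the Steam website are far too coarse for this to work; it was the 16-decimal API values that made the denominator effectively unique, and checking several Achievements at once guards against coincidental matches.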
By BeauHD from Slashdot's signed-into-law department
In early May, Hawaii lawmakers passed a bill that would prohibit the sale of over-the-counter sunscreens containing chemicals that contribute to the destruction of the state's coral reefs and other ocean life. Hawaii Governor David Ige signed the bill this week, making the ban official. Popular Mechanics reports: Hawaii is the first U.S. state to pass legislation banning the sale of sunscreen containing [oxybenzone and octinoxate]. The bill will go into effect on January 1, 2021. "We are blessed in Hawaii to be home of some of the most beautiful natural resources on the planet," Ige said at the bill signing, according to The Huffington Post. "But our natural environment is fragile and our own interaction with the Earth can have everlasting impacts, and this bill is a small first step worldwide to really caring about our corals and our reefs in a way that no one else anywhere in the world has done."
A 2015 study conducted by scientists at the University of Central Florida found that oxybenzone, a common UV-filtering compound, kills the coral, causes DNA damage in the coral's adult stage, and deforms the DNA in the larval stage, hindering its development. A separate 2015 study, published in the Archives of Environmental Contamination and Toxicology and conducted by biologist Craig Downs, also found that the chemicals produced water pollution and had damaging effects on the coral reefs. In 2012, Women's Health reported that oxybenzone and octinoxate may actually be harmful to humans as well, not just coral reefs. According to the publication, when the skin absorbs oxybenzone, it can cause an eczema-like allergic reaction and disrupt hormone levels. Octinoxate may damage skin cells and lead to premature aging.
By msmash from Slashdot's pushing-the-limits department
Victorien Erussard, an experienced ocean racer from the city of Saint-Malo in the north of France, was halfway through a dash across the Atlantic when he lost all power. Never again, he thought. "I came up with the idea to create a ship that uses different sources of energy," he says. The plan was bolstered by the pollution-happy cargo ships he saw while crossing the oceans. "These are a threat to humanity because they use heavy fuel oil." Five years on, that idea has taken physical form in the Energy Observer, a catamaran that runs on renewables. From a report: In a mission reminiscent of the Solar Impulse 2, the solar-powered plane that Bertrand Piccard and Andre Borschberg flew around the world a few years back, Erussard and teammate Jerome Delafosse are planning to sail around the planet, without using any fossil fuel. Instead, they'll make the fuel they need from sea water, the wind, and the sun. The Energy Observer started life as a racing boat but now would make a decent space battle cruiser prop in a movie. Almost every horizontal surface on the white catamaran is covered with solar panels (1,400 square feet of them in all), which curve gently to fit the aerodynamic contours. Some, on a suspended deck that extends to the sides of the vessel, are bi-facial panels, generating power from direct sunlight as well as light reflected off the water below. The rear is flanked by two vertical, egg whisk-style wind turbines, which add to the power production. Propulsion comes from two electric motors, driven by all that generated electrical energy, but it's the way that energy is stored that's clever. The Energy Observer carries just 106 kWh of batteries (roughly the capacity of a top-end Tesla) as an immediate buffer for short-term energy demands. The bulk of the excess electricity generated when the sun is shining or the wind is blowing is stored as hydrogen gas.
By BeauHD from Slashdot's beginning-of-the-end department
HTC has been struggling to stay competitive for years now with its Android handsets and virtual-reality headsets, and it still can't seem to get any relief. As BGR reports, the latest ominous headline points to a nearly 68-percent sales slump in June, marking HTC's worst results in more than two years. From the report: Even beyond all that, the company has had a tough go of it lately. There have been a few rounds of layoffs this year alone, the most recent being the company's culling of 1,500 workers from its Taiwan manufacturing division. After HTC president of smartphone and connected devices Chialin Chang resigned in February, the company also gave pink slips to several U.S. workers in the wake of combining its smartphone and VR units. Those 1,500 workers being axed, it also should be noted, comprise almost a quarter of the company's worldwide workforce.
Reuters on Friday quoted an unnamed analyst at market research firm Trendforce who lays part of the blame at HTC's own feet, citing unexciting products. "In the high-end segment, the sales of their flagship phone this year has been lower than expected, leading to lower market share," the analyst notes. "As for HTC's middle-end and entry-level series, the new models feature neither new specs nor high performance-price ratio, influencing the sales."
By BeauHD from Slashdot's slow-as-molasses department
dryriver writes: If you are a user of a popular professional desktop software program, it is not uncommon for that program to get anywhere from 5 to 20 major or minor new features and functions about once a year to stay desirable and competitive. But it seems that hugely popular internet-based sites and services like Instagram, Facebook, YouTube, Google Search, Gmail, Outlook, WhatsApp, Telegram and others get major new features and changes much, much more slowly than desktop software. Quite often you'll come across a barrage of breathless news articles that say "Popular Internet Service X will add Y feature starting from April 1st." It is often one single and very obvious feature or functionality being added that people have wanted for years, not a cluster of 5 or 10 funky new functions at the same time. Why is this the case? How is it that desktop software with just a few hundred thousand users and no more than a few dozen coders working on it can add 5 to 20 major new functions in just one year, and do this year after year, while a major internet-based service with tens or hundreds of millions of users and presumably hundreds or thousands of techies working behind the curtain keeps everyone waiting three years or longer for a much requested feature, and then rolls out that one desired feature to great fanfare as if it were a huge achievement? Is it really that much harder to code major new features into an internet/cloud service than into desktop software, or is this a deliberate business model that has become popular?
By msmash from Slashdot's closer-look department
An anonymous reader shares a report: A&R, or "artists and repertoire," are the people who look for new talent, convince that talent to sign to the record label and then nurture it: advising on songs, on producers, on how to go about the job of being a pop star. It's the R&D arm of the music industry. [...] What the music business doesn't like to shout about is how inefficient its R&D process is. The annual global spend on A&R is $2.8bn, according to the International Federation of the Phonographic Industry, and all that buys is the probability of failure: "Some labels estimate the ratio of commercial success to failure as 1 in 4; others consider the chances to be much lower -- less than 1 in 10," observes its 2017 report. Or as Mixmag magazine's columnist The Secret DJ put it: "Major labels call themselves a business but are insanely unprofitable, utterly uncertain, totally rudderless and completely ignorant." In the golden age of the music industry, none of that really mattered. So much money was flowing in that mistakes could be ignored. There was no way to hear most music other than to buy a record, and when CDs entered the market in the 1980s -- costing little to produce, but selling for a fortune -- the major labels were more or less printing their own money. But then came the internet: first file-sharing, then streaming slashed sales of physical music so deeply that the record business became a safety-first game. Every label executive has always wanted hits, but these days the people who run the big imprints want guaranteed hits. The rise of digital music brought with it a huge amount of data which, industry executives realized, could be turned to their advantage. In his first public speech as CEO of Sony, in May 2017, Rob Stringer asserted: "All our business units must now leverage data and analytics in innovative ways to dig deeper than ever for new talent. The modern day talent-spotter must have both an artistic ear and analytical eyes." 
Earlier this year, in the same week as Warner announced its acquisition of Sodatone, a company that has developed a tool for talent-spotting via data, another data company, Instrumental, secured $4.2m of funding. The industry appeared to have reached a tipping point -- what the website Music Ally called "A&R's data moment." Which is why, wherever the music industry's great and good gather, the word "moneyball" has become increasingly prevalent.
By msmash from Slashdot's gift-that-keeps-giving department
To celebrate this week's holiday, The Vindicator, a small newspaper in Texas, posted sections of the Declaration of Independence. "We hold these truths to be self-evident." "The history of the present King of Great Britain is a history of repeated injuries and usurpations, all having in direct object the establishment of an absolute Tyranny over these States." Yadda, yadda. You get the idea. But a section of the text containing the phrase "Indian Savages" set off Facebook's hate-speech flags. The post was then temporarily taken down by Facebook, Business Insider reports. From a report: He has excited domestic insurrections amongst us, and has endeavored to bring on the inhabitants of our frontiers, the merciless Indian Savages, whose known rule of warfare, is an undistinguished destruction of all ages, sexes and conditions. After The Vindicator ran a story on the censorship, Facebook corrected the mistake. "The post was removed by mistake and restored as soon as we looked into it. We process millions of reports each week, and sometimes we get things wrong," a Facebook spokesperson said. And honestly, as far as Facebook getting things wrong, this is an ideal "mistake."
By msmash from Slashdot's shape-of-things-to-come department
Japan's computer giant Fujitsu and RIKEN, the country's largest research institute, have begun field-testing a prototype CPU for a next-generation supercomputer they believe will take the country back to the leading position in global rankings of supercomputer might. From a report: The next-generation machine, dubbed the Post-K supercomputer, follows the two collaborators' development of the 8 petaflops K supercomputer that commenced operations for RIKEN in 2012, and which has since been upgraded to 11 petaflops in application processing speed. Now the aim is to "create the world's highest performing supercomputer," with "up to one hundred times the application execution performance of the K computer," Fujitsu declared in a press release on 21 June. The plan is to install the souped-up machine at the government-affiliated RIKEN around 2021. If the partners achieve those execution speeds, that would place the Post-K machine in exascale territory (one exaflops being a billion billion floating point operations a second). To do this, they have replaced the SPARC64 VIIIfx CPU powering the K computer with an Armv8-A SVE (Scalable Vector Extension) 512-bit architecture that's been enhanced for supercomputer use, and which both Fujitsu and RIKEN had a hand in developing. The new design runs on CPUs with 48 cores plus 2 assistant cores for the computational nodes, and with 48 cores plus 4 assistant cores for the I/O and computational nodes. The system structure uses 1 CPU per node, and 384 nodes make up one rack.
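The performance and layout figures above reduce to simple arithmetic; the sketch below is a back-of-the-envelope check of ours, not Fujitsu's published numbers.

```python
# 100x the upgraded K computer's 11 petaflops lands just past one exaflops.
K_FLOPS = 11e15                  # 11 petaflops
EXAFLOPS = 1e18                  # one billion billion FLOPS
post_k_target = 100 * K_FLOPS    # 1.1e18: exascale territory
assert post_k_target >= EXAFLOPS

# Rack layout: 1 CPU per node, 384 nodes per rack, 48 compute cores per CPU.
compute_cores_per_rack = 384 * 1 * 48
print(compute_cores_per_rack)    # 18432 compute cores per rack
```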
By msmash from Slashdot's how-about-that department
Catalin Cimpanu, writing for BleepingComputer: A recent study conducted by academics from the University of Hertfordshire in the UK has revealed that almost two-thirds of second-hand memory cards still contain remnants of personal data from previous owners. For their study, researchers analyzed 100 second-hand SD and micro SD memory cards purchased from eBay, conventional auctions, second-hand shops, and other sources over a four-month period. All in all, researchers say the memory cards they recovered were previously used in smartphones and tablets, but some cards were also used in cameras, SatNav systems, and even drones. The research team says the analysis process consisted of creating a bit-by-bit image of the card and then using freely available software to see if they could recover any data from the card. Their efforts were successful and worrisome at the same time, as the team says it managed to recover data from the memory cards, including intimate photos, selfies, passport copies, contact lists, navigation files, pornography, resumes, browsing history, identification numbers, and other personal documents.
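The workflow the researchers describe (a bit-for-bit image, then off-the-shelf recovery software) works because deleting a file usually only removes its directory entry; the bytes stay on the card until they are overwritten. A toy "file carving" pass over a raw image might look like the sketch below. Real tools such as PhotoRec are far more robust; this marker-scanning version is a simplified illustration of ours, for JPEGs only.

```python
def carve_jpegs(image: bytes):
    """Scan a raw card image for JPEG start-of-image / end-of-image markers
    and return each candidate file as a bytes object (naive carving)."""
    SOI, EOI = b"\xff\xd8\xff", b"\xff\xd9"
    recovered, pos = [], 0
    while True:
        start = image.find(SOI, pos)
        if start == -1:
            break
        end = image.find(EOI, start + len(SOI))
        if end == -1:
            break
        recovered.append(image[start:end + len(EOI)])
        pos = end + len(EOI)
    return recovered

# A "deleted" photo survives in the raw image even with no filesystem entry.
raw = b"\x00" * 32 + b"\xff\xd8\xff\xe0...photo bytes...\xff\xd9" + b"\x00" * 32
print(len(carve_jpegs(raw)))  # 1
```

This is also why the standard advice holds: a quick format does not erase data, and only a full overwrite (or physical destruction) makes a card safe to resell.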
By BeauHD from Slashdot's record-breaking department
hackingbear writes: Researchers at the University of Science and Technology of China have demonstrated stable quantum entanglement with 18 qubits, surpassing the previous world record of 10, also held by the same team. This represents a step toward realizing large-scale quantum computing, according to a recent study published in the journal Physical Review Letters. Physicist Pan Jianwei and his colleagues achieved the new record by simultaneously exploiting three different degrees of freedom (paths, polarization, and orbital angular momentum) of six photons, the fundamental particles of light. The combination resulted in a stable 18-qubit state. Full control over the number of entangled particles determines the fundamental ability for quantum information processing, according to the study. There are early-stage quantum computers out there that boast more qubits -- such as IBM's 50-qubit machine and Google's 72-qubit Bristlecone -- but in those cases, the individual quantum states of the qubits aren't (fully) controllable. "The team's next step will be to realize a 50-qubit entanglement and manipulation," according to Wang Xilin, a member of the team. The same research team also holds the world record for quantum communication distance and operates the world's first quantum communication satellite.
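The headline qubit count is just the product of particles and degrees of freedom; here is a quick sanity check of the arithmetic (ours, not the paper's).

```python
# 6 photons, each carrying 3 controllable degrees of freedom
# (path, polarization, orbital angular momentum) -> 18 qubits.
photons, dof_per_photon = 6, 3
qubits = photons * dof_per_photon
assert qubits == 18

# Fully controlling 18 entangled qubits means steering a quantum state
# described by 2^18 complex amplitudes.
state_dimension = 2 ** qubits
print(state_dimension)  # 262144
```

That exponential growth in the state space is why each added fully controlled qubit is hard-won, and why raw qubit counts alone (as with the IBM and Google chips mentioned above) don't tell the whole story.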