By EditorDavid from Slashdot's Pow!-Biff!-Whack! department
Adam West, star of the 1960s TV series Batman, has died at age 88. An anonymous reader shares a memory of that time the 53-year-old actor wrote an op-ed for a 1982 issue of Videogame and Computer Gaming Illustrated.
"I've been playing with computers longer than most," West wrote on page 6. [PDF] "I had onboard computers in Robinson Crusoe on Mars, having learned in an episode of TV's The Outer Limits that you can't survive on the Red Planet without them. Then, of course, I was up to my cowl in computers as television's Batman... In 1966, when the series began its three season run, all of that was science fiction. Computers were playthings of the researchers at MIT... Today, a lot of the apparatus we had in Batman -- dressed, of course, in less imposing names -- is fact. And we're lucky this is so."
West called videogames "an ideal means to broaden the imaginations of young people," saying the medium "can expand our awareness of the world as it is, was, or might be. The medium is still in its infancy, but read this again in a few years and see if this prediction hasn't come true: as videogaming grows, we will grow."
My favorite story is how West was cast as Batman after the show's producer spotted his performance as the James Bond-style spy Captain Q in a commercial for Nestle Quik. And CNN also remembers that "later in life, West made appearances on the animated series 'Family Guy' as Mayor Adam West, the oddball leader of Quahog, Rhode Island."
By EditorDavid from Slashdot's cellular-service department
An anonymous reader writes:
Eighty-six cancer patients were enrolled in a trial of a drug that helps the immune system attack tumors. Though they had different kinds of tumors -- pancreas, prostate, uterus or bone -- they all shared a genetic mutation, found in 4% of all cancer patients, that disrupts their cells' ability to fix damaged DNA. But tumors vanished and didn't return for 18 patients in the study, reports the New York Times, while 66 more patients "had their tumors shrink substantially and stabilize, instead of continuing to grow." The drug trial results were "so striking that the Food and Drug Administration already has approved the drug, pembrolizumab, brand name Keytruda, for patients whose cancers arise from the same genetic abnormality. It is the first time a drug has been approved for use against tumors that share a certain genetic profile, whatever their location in the body."
The researchers say that in the U.S. alone there are 60,000 new patients every year who could benefit from the new drug.
By EditorDavid from Slashdot's tumbling-techs department
An anonymous reader quotes CNBC:
The so-called "big five" -- Apple , Alphabet Class A shares, Microsoft , Facebook and Amazon -- lost more than $97.5 billion in market value between the close on Thursday and the close on Friday, according to FactSet, dragging the Nasdaq to its worst week of the year. Shares of Apple fell nearly 4 percent on Friday, while the other four companies fell more than 3 percent. For most of the day, only 3 stocks in the S&P 500 tech sector were in the green: IBM , Teradata and Western Union . Apple, Facebook, Amazon, Netflix, and Alphabet all traded more than 2 times their 30-day average volume... "They're just plain overbought," said David Bahnsen founder, managing director and chief investment officer of The Bahnsen Group, a private wealth management firm. "They are extremely stretched from a valuation standpoint."
CNN notes the drop occurred "after a Goldman Sachs analyst questioned this year's run-up in the industry's five biggest names." They also added that "The top five techs today account for 13% of the market value weighting in the S&P 500, even though they are only 1% of the companies in the index."
By BeauHD from Slashdot's invisible-ink department
An anonymous reader writes: "Gabor Szathmari, a security researcher for CryptoAUSTRALIA, is working on a method of improving the security of leaked documents by removing hidden dots left behind by laser printers, which are usually used to watermark documents and track down leakers," reports Bleeping Computer. "Szathmari's work was inspired by the case of a 25-year-old woman, Reality Leigh Winner, who was recently charged with leaking top-secret NSA documents to a news outlet." According to several researchers, Winner might have been caught after The Intercept had shared some of the leaked documents with the NSA. These documents had the invisible markings left behind by laser printers, which included the printer's serial number and the date and time when the document was printed. This allowed the NSA to track down Winner and arrest her even before she was able to publish the leaked documents. Now, Szathmari has submitted a pull request to PDF Redact Tools, a project for securely redacting and stripping metadata from documents before publishing. Szathmari's pull request adds a code routine to the PDF Redact Tools project that would allow app operators to convert documents to black and white before publishing. "The black and white conversion will convert colors like the faded yellow dots to white," Szathmari said in an interview. Ironically, the project is managed by First Look Media, the parent company behind The Intercept news outlet.
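The idea behind the conversion is simple enough to sketch in a few lines of Python. This is an illustrative per-pixel model, not the actual pull request (which works on whole page images in PDF Redact Tools' pipeline); the pixel values are hypothetical examples of a faded yellow tracking dot and ordinary printed text.

```python
# Sketch: why a black-and-white conversion erases faded yellow tracking
# dots. Pixel values below are illustrative assumptions, not values from
# the actual PDF Redact Tools patch.

def to_black_and_white(pixel, threshold=128):
    """Map an (R, G, B) pixel to pure black or pure white.

    Luminance above the threshold becomes white; anything darker becomes
    black. Faded yellow dots are very light, so they land on the white
    side and vanish into the page background, while dark text survives.
    """
    r, g, b = pixel
    # Standard luma approximation (ITU-R BT.601 weights).
    luminance = 0.299 * r + 0.587 * g + 0.114 * b
    return (255, 255, 255) if luminance >= threshold else (0, 0, 0)

faded_yellow_dot = (250, 245, 160)   # near-invisible printer dot
black_text = (20, 20, 20)            # ordinary printed text

print(to_black_and_white(faded_yellow_dot))  # -> (255, 255, 255): dot removed
print(to_black_and_white(black_text))        # -> (0, 0, 0): text preserved
```

The same thresholding principle is what makes the conversion destructive to the watermark: once every pixel is forced to one of two values, the subtle color channel that carried the serial number and timestamp no longer exists.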
By BeauHD from Slashdot's there's-always-next-year department
Due to Apple's complicated way of managing the supply of the components embedded in its flagship devices, the company's upcoming iPhones may miss out on the higher-speed data links that many rival smartphones employ. "One of Apple's suppliers, Qualcomm, sells a modem capable of 1 gigabit download speeds," reports Bloomberg. "Another supplier, Intel, is working on a modem with the same capability, but it won't be ready for the iPhone's introduction, according to people familiar with Apple's decision." From the report: Apple could in theory just use Qualcomm's chips, but it has an aversion to being dependent on a single supplier, and its relationship with San Diego-based Qualcomm is particularly thorny. Cupertino, California-based Apple is embroiled in a bitter legal fight with the chipmaker, accusing the supplier of maintaining an illegal monopoly, and it's seeking to loosen Qualcomm's grip on the market for high-end smartphone modems. That's why Apple will stick with Qualcomm modems for some of its new iPhones while relying on Intel for others. Until Intel is able to offer its chips with matching features, Apple won't enable some of the capabilities of the phones running with Qualcomm modems, said the people, who asked not to be identified because the plan isn't public. Apple, Qualcomm and Intel declined to comment. Apple's decision clashes with the marketing plans of a cellular industry desperate to show off faster network speeds to grab market share. The top U.S. wireless carriers -- Verizon, AT&T, T-Mobile US Inc. and Sprint Corp. -- have declared 2017 the year of 1 gigabit speeds.
By BeauHD from Slashdot's broom-and-dustpan department
Mar Masson Maack reports via The Next Web: At its inception, the internet was a beautifully idealistic and equal place. But the world sucks and we've continuously made it more and more centralized, taking power away from users and handing it over to big companies. And the worst thing is that we can't fix it -- we can only make it slightly less awful. That was pretty much the core of Pirate Bay co-founder Peter Sunde's talk at tech festival Brain Bar Budapest. TNW sat down with the pessimistic activist and controversial figure to discuss how screwed we actually are when it comes to decentralizing the internet. In Sunde's opinion, people focus too much on what might happen, instead of what is happening. He often gets questions about what a digitally bleak future could look like, but the truth is that we're living it: "Everything has gone wrong. That's the thing, it's not about what will happen in the future; it's about what's going on right now. We've centralized all of our data to a guy called Mark Zuckerberg, who's basically the biggest dictator in the world, as he wasn't elected by anyone. Trump is basically in control of this data that Zuckerberg has, so I think we're already there. Everything that could go wrong has gone wrong and I don't think there's a way for us to stop it." One of the most important things to realize is that the problem isn't a technological one. "The internet was made to be decentralized," says Sunde, "but we keep centralizing everything on top of the internet."
By BeauHD from Slashdot's off-the-grid department
Only about half a dozen of the more than 800 Supercharger stations have solar arrays and batteries, but that may be about to change. Elon Musk said Tesla plans to deploy more battery and solar systems with the upcoming "Version 3" of the Supercharger, adding that "almost all Superchargers will disconnect from the electricity grid." Electrek reports: Previously, Musk said that Tesla's new Powerpack and solar arrays will power some Supercharger stations in sunny regions to go off-grid -- adding that "the grid won't be needed for moderate use Superchargers in non-snowy regions." While it makes sense to add solar arrays and battery packs, it's not clear why there would be a need to completely disconnect from the grid, which is often still useful -- especially if net metering is available. Even in regions where coal dominates electricity generation, electric cars are still more efficient than some of the most efficient gas-powered cars. Therefore, the argument could have ended here, but Musk apparently wants to take Tesla's Supercharger network off-grid as part of the company's mission to accelerate the advent of sustainable energy. Depending on the size and popularity of a Supercharger station, which generally varies from 6 partly used stalls to 20 stalls in almost constant use, Tesla would need some significantly large solar arrays at some stations -- almost a football field in size. Unless there are some impressive advancements in efficiency, it's not clear how they would make it happen.
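A back-of-envelope calculation shows where the football-field figure comes from. Every number below is an illustrative assumption (stall count, daily energy per stall, panel efficiency, sun hours), not a Tesla specification:

```python
# Rough sizing of an off-grid Supercharger solar array.
# All inputs are assumed, illustrative figures.

STALLS = 10                  # mid-sized station
ENERGY_PER_STALL_KWH = 250   # assumed energy delivered per stall per day
PANEL_EFFICIENCY = 0.20      # assumed solar panel efficiency
SUN_HOURS = 5.0              # assumed peak-sun hours per day (sunny region)
IRRADIANCE_KW_M2 = 1.0       # standard peak solar irradiance
FOOTBALL_FIELD_M2 = 5350     # approx. area of an American football field

daily_demand_kwh = STALLS * ENERGY_PER_STALL_KWH
# Energy harvested per square meter of panel per day.
yield_per_m2_kwh = IRRADIANCE_KW_M2 * SUN_HOURS * PANEL_EFFICIENCY
array_area_m2 = daily_demand_kwh / yield_per_m2_kwh

print(f"Daily demand: {daily_demand_kwh:.0f} kWh")
print(f"Array area needed: {array_area_m2:.0f} m^2 "
      f"({array_area_m2 / FOOTBALL_FIELD_M2:.1f} football fields)")
```

Under these assumptions a 10-stall station already needs roughly 2,500 square meters of panels, about half a football field; a busier 20-stall station would plausibly need a full field or more, which is the scale the article questions.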
By BeauHD from Slashdot's non-von-Neumann department
The Defense Advanced Research Projects Agency (DARPA) is funding a completely new kind of non-von-Neumann processor called a HIVE -- Hierarchical Identify Verify Exploit. According to EE Times, the funding is to the tune of $80 million over four-and-a-half years, and Intel and Qualcomm are participating in the project, along with a national laboratory, a university and defense contractor Northrop Grumman. From the report: Pacific Northwest National Laboratory (Richland, Washington) and Georgia Tech are involved in creating software tools for the processor, while Northrop Grumman will build a Baltimore center that uncovers and transfers the Defense Department's graph analytic needs for what is being called the world's first graph analytic processor (GAP). Graph analytic processors do not exist today, but they theoretically differ from CPUs and GPUs in key ways. First of all, they are optimized for processing sparse graph primitives. Because the items they process are sparsely located in global memory, they also involve a new memory architecture that can access randomly placed memory locations at ultra-high speeds (up to terabytes per second). Together, the new arithmetic-processing-unit (APU) optimized for graph analytics plus the new memory architecture chips are specified by DARPA to use 1,000-times less power than today's supercomputers. The participants, especially Intel and Qualcomm, will also have the rights to commercialize the processor and memory architectures they invent to create a HIVE. The graph analytics processor is needed, according to DARPA, for Big Data problems, which typically involve many-to-many rather than many-to-one or one-to-one relationships for which today's processors are optimized. A military example, according to DARPA, might be the first digital missives of a cyberattack.
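To see what a "sparse graph primitive" looks like in practice, here is a tiny sparse graph in compressed-sparse-row (CSR) form, a common layout for graph analytics. The graph itself is a made-up example; the point is the data-dependent, scattered reads that conventional cache hierarchies handle poorly and that HIVE's memory architecture is meant to accelerate:

```python
# A 4-vertex directed graph stored in CSR (compressed sparse row) form.
# Edges: 0->1, 0->3, 1->2, 3->0, 3->2 (illustrative example graph).

row_ptr = [0, 2, 3, 3, 5]   # row_ptr[v]..row_ptr[v+1] spans v's edge list
col_idx = [1, 3, 2, 0, 2]   # all neighbor lists, concatenated

def neighbors(v):
    """Return the out-neighbors of vertex v.

    Each lookup chases row_ptr into an arbitrary slice of col_idx, so
    traversals hop unpredictably through memory -- the access pattern
    graph analytic processors are being designed around.
    """
    return col_idx[row_ptr[v]:row_ptr[v + 1]]

print(neighbors(0))  # -> [1, 3]
print(neighbors(2))  # -> []  (vertex 2 has no outgoing edges)
print(neighbors(3))  # -> [0, 2]
```

In a many-to-many workload like tracing a cyberattack, an algorithm fans out across millions of such neighbor lists at once, which is why DARPA specifies random-access memory bandwidth in terabytes per second rather than raw arithmetic throughput.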
By msmash from Slashdot's killing-things department
Microsoft will close its file storage and sharing service Docs.com on Dec. 15, it said today. As a result of its $26 billion acquisition of LinkedIn, Microsoft also got SlideShare, a more popular place for sharing presentations, infographics, and other materials with an audience of 70 million. SlideShare represents a better platform for storing and publishing Microsoft documents, the company said. From a report: Microsoft is advising users to migrate and/or delete content they shared on Docs.com as soon as possible. As of today, June 9, creating new Docs.com accounts is no longer supported. Those with existing accounts can still view, edit, publish, download, and delete their existing content. As of August 1, publishing and editing content on Docs.com will no longer be supported.
By BeauHD from Slashdot's question-and-answer department
Qbertino writes: I'm about to move on in my career after having a "short rethink and regroup break" and have for quite some time now been thinking about perhaps getting into a new programming language and technology, like NodeJS or Java/Kotlin or something. But I have the seriously growing suspicion that artificial intelligence is coming for us programmers and IT experts faster than we might want to admit. Just last weekend I heard myself saying to a friend who was a pioneer on the web, "AI is today what the web was in 1993" -- I think that to be very true. So just 20 minutes ago I started thinking and wondering about what types of jobs there are in AI. Is anything popping up in the industry from the AI hype, and what are these positions called, what do they precisely do and what are the skills needed to do them? I suspect something like an "AI Architect" for planning AI setups and clearly defining the boundaries of what the AI is supposed to do and explore. Then I presume the requirements for something like an "AI Maintainer" and/or "AI Trainer," which would probably resemble something like an admin of a big data storage, looking at statistics and making educated decisions on which "AI Training Paths" the AI should continue to explore to gain the skill required, and deciding when the "AI" is ready to be let go on to the task. As you can see, we -- AFAIK -- don't even have names for these positions yet, but I suspect, just as in the internet/web boom 20 years ago, that is about to change *very* fast. And what about TensorFlow? Should I toy around with it, or are we past that stage already and will others do AI setup and installation better than me before I know how this thing really works? Because I also suspect most of the AI work for humans will be closely tied to services and providers such as Google. You know, renting "AI" as you rent webspace or subscribe to bandwidth today. Any services and industry vendors I should look into -- besides the obvious Google, that is?
In a nutshell, what work is there in the field of AI that can be done, and how do I move into that? Like now. And what should I maybe get a degree in if I want to be on top of this AI thing? And how would you go about gaining skill and knowledge on AI today -- and I mean literally, today? I know, tons of questions, but insightful advice is requested from an educated Slashdot crowd. And I bet I'm not the only one interested in this topic. Thanks.
By BeauHD from Slashdot's six-of-one-half-a-dozen-of-the-other department
An anonymous reader quotes a report from Ars Technica: In a study out this week, about 70 percent of home blood-pressure devices tested were off by 5 mmHg or more. That's enough to throw off clinical decisions, such as stopping or starting medication. Nearly 30 percent were off by 10 mmHg or more, including many devices that had been validated by regulatory agencies. The findings, published in The American Journal of Hypertension, suggest that consumers should be cautious about picking out and using such devices -- and device manufacturers need to step up their game. Lead author Raj Padwal and his colleagues set out to test the accuracy of the devices themselves. Funded by the University of Alberta Hospital Foundation, they compared the home blood-pressure monitors of 85 patients with a gold-standard blood-pressure measurement technique. The patients' monitors varied by type, age, and validation status. But they all used an automated oscillometric method, which measures oscillations in the brachial artery and uses an algorithm to calculate blood pressure. The gold-standard method was the old-school auscultatory method, which involves the arm-squeezing sphygmomanometer and a clinician listening for thumps with a stethoscope. Of the 85 home devices, 59 (69 percent) were inaccurate by 5 mmHg or more in either their systolic reading (the top number, the maximum pressure of a heartbeat) or diastolic reading (the bottom number, the minimum between-beat pressure). Twenty-five devices (29 percent of the total) were off by 10 mmHg or more, and six devices (7 percent) were off by 15 mmHg or more.
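The study's percentages are easy to verify from the counts the article gives (all three fractions are out of the full 85 devices, not of the 59 inaccurate ones):

```python
# Sanity-check of the percentages reported in the Ars Technica summary.
total_devices = 85
off_by_5 = 59    # off by >= 5 mmHg (systolic or diastolic)
off_by_10 = 25   # off by >= 10 mmHg
off_by_15 = 6    # off by >= 15 mmHg

print(f"{off_by_5 / total_devices:.0%} off by >= 5 mmHg")    # -> 69%
print(f"{off_by_10 / total_devices:.0%} off by >= 10 mmHg")  # -> 29%
print(f"{off_by_15 / total_devices:.0%} off by >= 15 mmHg")  # -> 7%
```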