Between an industry-wide push to encrypt all web traffic and the newfound popularity of secure chat apps, it’s been a boom time for online privacy. Virtual private networks, which shield your web traffic from prying eyes, have rightly garnered more attention as well. But before you use a VPN to hide your online shopping from the IT department at your company—or help protect yourself from state surveillance—know that not all mobile VPNs are created equal. In fact, some are actively harmful.
“These days, many people know what a VPN is and what they can do with one,” says Kevin Du, a computer security researcher at Syracuse University and IEEE senior member. “Not many people know what a bad or flawed VPN can do to their devices, because they don’t know how VPN works.”
VPNs have been around for years, as have their attendant trust issues. But while VPN enthusiasts were once mostly a core base of desktop users, the mobile boom and easy app-store distribution have created an explosion in mobile VPN offerings. And while some genuinely aim to offer security and privacy services, plenty do more harm than good.
In a recent in-depth analysis of 283 mobile VPNs on the Google Play Store from Australia’s Commonwealth Scientific and Industrial Research Organisation, researchers found significant privacy and security limitations in a majority of the services. Eighteen percent of the mobile VPNs tested created private network “tunnels” for traffic to move through but didn’t encrypt them at all, exposing user traffic to eavesdropping or man-in-the-middle attacks. Put another way, almost a fifth of the apps in the sample didn’t offer the level of security that is basically the entire point of a VPN.
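The failure mode the researchers describe can be illustrated with a toy sketch. The frame format and names below are hypothetical, not taken from any of the apps studied; the point is only that encapsulation by itself is not encryption — a “tunnel” that merely wraps packets in its own framing leaves the payload readable to anyone on the path.

```python
# Toy illustration: a "VPN tunnel" that encapsulates but does not encrypt.
# TUNNEL_MAGIC and the frame layout are invented for this sketch.

TUNNEL_MAGIC = b"TUNL"  # hypothetical 4-byte tunnel header

def encapsulate(payload: bytes) -> bytes:
    """Wrap a packet in a tunnel frame -- note: no encryption at all."""
    return TUNNEL_MAGIC + len(payload).to_bytes(4, "big") + payload

def eavesdrop(frame: bytes) -> bytes:
    """What an on-path observer sees: strip the framing, read everything."""
    assert frame[:4] == TUNNEL_MAGIC
    length = int.from_bytes(frame[4:8], "big")
    return frame[8:8 + length]

request = b"GET /login?user=alice&password=hunter2 HTTP/1.1\r\n"
captured = encapsulate(request)

# The "tunnelled" traffic still contains the secret verbatim.
print(b"hunter2" in captured)          # True
print(eavesdrop(captured) == request)  # True
```

A real VPN would encrypt the payload (and authenticate it) before framing, so the captured bytes would be indistinguishable from noise to an eavesdropper.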
Read the rest at wired.com
Amid complaints that Google Play is always switching on GPS, it appears Google has made it impossible to prevent the app store from tracking your whereabouts unless you completely kill off location tracking for all applications.
If you’re not keen on this, the options are not great: you can either delete Google Maps and/or Google Play, or repeatedly turn your phone’s location services on and off as needed throughout the day, which is extremely irritating.
“Kind of defeats the purpose of fine-grained privacy controls,” Al-Bassam noted, adding: “Google is encouraging developers to use the Play location API instead of the native Android API, making an open OS dependent on proprietary software.”
Google was not available for comment.
Google, it seems, is very, very interested in knowing where you are at all times. Users have reported battery life issues with the latest Android build, with many pointing the finger at Google Play – Google’s app store – and its persistent, almost obsessive need to check where you are.
The company keeps defending data-gathering features that some people don’t want instead of just making them optional.
Microsoft has been called to task for the practice by privacy advocate the Electronic Frontier Foundation. A blog post by EFF staffer Amul Kalia criticizes the company not just for collecting information for Cortana, but also for collecting telemetry data. Kalia writes: “A significant issue is the telemetry data the company receives. While Microsoft insists that it aggregates and anonymizes this data, it hasn’t explained just how it does so. Microsoft also won’t say how long this data is retained, instead providing only general timeframes. Worse yet, unless you’re an enterprise user, no matter what, you have to share at least some of this telemetry data with Microsoft and there’s no way to opt-out of it.”
Microsoft keeps making news on the privacy front, and not in a good way. Much has been made of the way Cortana in Windows 10 may invade your privacy by collecting data such as the words you speak and the keys you strike.
Katherine W was seven when her third-grade teacher issued Chromebooks to her class. Her dad, Jeff, is a serious techie, but the school’s tech choices didn’t sit well with him. He was able to get Katherine an exception that let her use a more private, non-cloud computer for the year, but the next year, Katherine’s school said she would have to switch to a laptop that would exfiltrate everything she did to Google’s data-centers.
The rules around data-collection and kids are complicated and full of loopholes. Though they seem, on the surface, to forbid Google from creating an advertising profile of kids using school-issued laptops, the reality is that kids are profiled as soon as they click outside of the Google education suite — so when a kid watches a Youtube video, her choice is added to an advertising profile that’s attached to her school ID.
Jeff worked with the Electronic Frontier Foundation to negotiate Katherine’s right to keep using non-cloud computers in school, with better privacy protections for her.
EFF has published a guide for students on improving Chromebook privacy settings, too — so if your school makes you (or your kids) use Chromebooks, you can make good choices about keeping your data private.
Windows 10 is amazing. Windows 10 is fantastic. Windows 10 is glorious. Windows 10 is faster, smoother and more user-friendly than any Windows operating system that has come before it. Windows 10 is everything Windows 8 should have been, addressing nearly all of the major problems users had with Microsoft’s previous-generation platform in one fell swoop.
But there’s something you should know: As you read this article from your newly upgraded PC, Windows 10 is also spying on nearly everything you do.
Privacy campaigners and open source developers are up in arms over the secret installation of Google software capable of listening in on conversations held in front of a computer.
The default behavior of hotword, a new, black-box module in Chrome (and its free/open cousin, Chromium) causes it to silently switch on your computer’s microphone and send whatever it hears to Google.
Verizon advertising partner Turn has been caught using Verizon Wireless’s UIDH tracking header to resurrect deleted tracking cookies and share them with dozens of major websites and ad networks, forming a vast web of non-consensual online tracking. Explosive research from Stanford security expert Jonathan Mayer shows that, as we warned in November, Verizon’s UIDH header is being used as an undeletable perma-cookie that makes it impossible for customers to meaningfully control their online privacy.
Mayer’s research, described in ProPublica, shows that advertising network and Verizon partner Turn is using the UIDH header value to re-identify and re-cookie users who have taken careful steps to clear their cookies for privacy purposes. This contradicts standard browser privacy controls, users’ expectations, and Verizon’s own claim that the UIDH header won’t be used to track users because it changes periodically.
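The mechanism is simple to sketch. The class and field names below are illustrative, not Verizon’s or Turn’s actual code: because the carrier injects the identifying header at the network level, clearing browser cookies accomplishes nothing — on the very next request, the ad network looks up the injected header and re-issues the same tracking profile.

```python
# Toy sketch of header-injection tracking (a "perma-cookie").
# All names here are illustrative; this is not real carrier or ad-network code.

class CarrierProxy:
    """Simulates a carrier inserting an identifying header into every
    outbound HTTP request -- the browser never sees or controls it."""
    def __init__(self, subscriber_id: str):
        self.subscriber_id = subscriber_id

    def forward(self, request: dict) -> dict:
        request = dict(request)
        request["X-UIDH"] = self.subscriber_id  # injected in transit
        return request

class AdNetwork:
    """Maps the injected header back to a tracking profile, so any
    cookie-clearing by the user is undone on the next request."""
    def __init__(self):
        self.profiles = {}

    def handle(self, request: dict) -> str:
        uidh = request["X-UIDH"]
        # First sight of this header creates a profile; afterwards the
        # same profile is returned no matter what the browser deleted.
        return self.profiles.setdefault(uidh, f"track-{len(self.profiles)}")

proxy = CarrierProxy("subscriber-42")
ads = AdNetwork()

first = ads.handle(proxy.forward({"path": "/news"}))
# User carefully clears all browser cookies... then browses again:
second = ads.handle(proxy.forward({"path": "/weather"}))

print(first == second)  # True: same profile resurrected via the header
```

In the real incident the header value rotated periodically, but slowly enough that it still served as a stable identifier — which is why Mayer’s work contradicted Verizon’s reassurance.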