Is "Security" Sometimes Just An Excuse?

Post by shanaya on August 24th 2010, 7:07 pm

Can security go too far? Just ask someone who's been in protective custody for a prolonged length of time. Too much security can definitely make your life miserable. Of course, if that level of security is necessary to protect your life, you'll put up with the misery. But what about the times when "security" is just an excuse to deprive you of your liberty?

Security mechanisms, like any other technology, can be misused. That's the reason many of us greet the news of newer and better security technologies with mixed emotions. Video cameras on every street corner can help police catch criminals, but they also intrude into the privacy of law-abiding citizens, and those tapes could be used to blackmail a person who has committed no crime but is caught in an embarrassing situation. Full body scanners at airports can help detect weapons and prevent terrorist hijackings, but can also be misused for the amusement of bored employees operating the systems.

Almost everyone is concerned about security today, and that goes double for those who have sensitive or confidential information stored on mobile electronic devices (which includes a large portion of the population). It's difficult enough to protect our desktop computers from intruders. Today's portable devices - laptops, netbooks, tablets, smart phones - are even more vulnerable because not only do you have to worry about them being accessed across the network, you also have the concern that the machines themselves can easily be picked up and carried away. A thief with physical access has more ways to get at your information than someone who is attacking from a remote location.

Those concerns have led to the development of technologies such as tracking software that can tell you exactly where your laptop is at all times, and remote wipe features that let you delete all the data from a stolen or lost smart phone. Now Apple has applied for a patent on technology that will automatically identify whether the user of a device is "authorized" and remotely disable the device if that user is deemed not to be authorized.

Certainly that's a good thing, if your phone or tablet has fallen into the hands of the wrong person. But this also has some folks wondering about other ways that such technology could be used, for purposes that have little to do with data security and everything to do with vendor control. Microsoft was soundly excoriated for the so-called "kill switch" in the original release of Windows Vista, whereby the Windows Genuine Advantage (WGA) technology would reduce the functionality of the OS if it determined that the operating system license wasn't valid. Security was cited as an important reason for the aggressive anti-piracy measures, on the premise that seeking out and using pirated software exposes you to increased security risk. The company sponsored an investigation by tech analysis firm IDC that resulted in a white paper titled "The Risks of Obtaining and Using Pirated Software."

Customers were less than appreciative, and Microsoft changed WGA's behavior with Service Pack 1 for Vista, in response to numerous complaints about false positives.

Is it completely unreasonable to wonder if the technology Apple is patenting could be used not only to disable phones or other systems that are stolen from or lost by their legitimate owners, but could also be used to enable a new form of digital rights management (DRM)? It's interesting that the patent application mentions detecting particular activities "that indicate suspicious behavior" including "hacking, jailbreaking, and unlocking" (item 4 in the article).

Now that "jailbreaking" of phones has been exempted from the Digital Millennium Copyright Act (DMCA) by the 2010 Librarian of Congress ruling, Apple no longer has the threat of criminal prosecution to use against those who want to gain administrative access to their phone operating systems so they can install third-party applications not approved through the App Store. Is the company looking for stronger technological means to prevent you from doing that? Could be.

It appears this technology, upon detecting "suspicious behavior," would "gather screenshots, keylogs and communications served to the electronic device" and then "restrict at least one function of the device." It could then send notifications, take photos of the vicinity of the device and geotag those photos with location information, and even photograph the person using the device. It could also sense motion and vibrations and compare the "vibration" profile with a library of such profiles to figure out how the device is being transported.

All of these are very cool features from a security standpoint. I'd love to have them in place if I had a lost or stolen device. However ... do I really want Apple to have the ability to activate such features whenever they want? Do I want them to be able to take pictures of me without my knowledge, to determine exactly where I am and whether I'm in a car, on a bicycle, or walking down the street? Do I want them to be able to take screenshots and key logs of what I'm doing on the device? Sure, the questions are mostly moot. They can do a lot of that already. We all know that our cell phones can be used to trace our locations, and can be activated to listen in on what we're saying (even when we're not in a call). I'm all in favor of better security. But I worry about the development of better ways to spy on users and/or lock them down even more in their use of their legitimately purchased devices, in the name of "security."

As for copy protection, if we must have DRM (and I don't think we're going to convince the vendors to do away with it completely anytime soon), here's what I think is really needed in terms of new DRM technology: Many DRM schemes tie a copy-protected file to one or a handful of specific devices. This can be a problem if you buy a new computer or handheld device to replace your old one. The new device isn't recognized and, if you've reached the limit on the number of devices allowed, you may not be able to use the file on the new device.

What we need instead is user authentication technology that would tie the file to a specific user, instead of a specific device. That way, you could play your song or read your ebook on any device you want. Some of the user identification methods mentioned in Apple's patent sound pretty cool - for instance, recording of the user's voice and comparison to a database of voice prints from authorized users (Item 12). Item 13 even describes a heartbeat sensor to detect the heartbeat of the current user and compare it with the heart signatures of authorized users. Wow. But what's the accuracy rate for these technologies? If I have a cold that changes the sound of my voice, or if a medical condition alters the pattern of my heartbeat, will I be misidentified as an unauthorized user?
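To make the contrast concrete, here's a minimal sketch of the two licensing models described above. All of the names and the three-device cap are hypothetical, purely for illustration, and don't reflect any vendor's actual DRM implementation:

```python
# Hypothetical sketch: device-bound vs. user-bound DRM license checks.
# Names and the activation cap are illustrative, not any real vendor's API.

MAX_DEVICES = 3  # typical activation limit in device-bound schemes


def device_bound_check(activated_devices, device_id):
    """Allow playback only on activated devices, up to a fixed cap."""
    if device_id in activated_devices:
        return True
    if len(activated_devices) < MAX_DEVICES:
        activated_devices.add(device_id)  # activate the new device
        return True
    return False  # new device, but the activation limit is reached


def user_bound_check(license_holder, authenticated_user):
    """Allow playback wherever the *user* can prove their identity,
    regardless of which device they happen to be holding."""
    return authenticated_user == license_holder


# A buyer replaces an old laptop after already hitting the device cap:
devices = {"laptop-old", "phone", "tablet"}
print(device_bound_check(devices, "laptop-new"))  # False: locked out
print(user_bound_check("shanaya", "shanaya"))     # True: any device works
```

The point of the second function is that the binding follows the person, so replacing hardware never triggers a lockout; the hard part, as the patent's voice-print and heartbeat items suggest, is the "prove their identity" step.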

Tell us what you think. Am I being paranoid to think these new "security" features might be used against the consumers they're ostensibly designed to protect? Do the benefits outweigh the risks? Are you excited about the possibilities inherent in new methods of user authentication and prevention of unauthorized access, wary of them, or a little of both? How much security is too much security - or is there such a thing? We invite you to discuss this topic in our forum.
Admin is da shiznit!
