Tuesday, April 14, 2009

With its new 7.0 release, RSA has improved its PII scanning capabilities and reduced the overall TCO of maintaining its DLP solution: http://www.indiaprwire.com/pressrelease/information-technology/2009041423363.htm

Monday, March 23, 2009

Heartland reveals in its annual report that last year's data breach is currently under investigation by the SEC, the FTC, the DOJ, the Federal Financial Institutions Examination Council, and the Office of the Comptroller of the Currency. This is in addition to the attorneys general of several states and Canadian authorities.

This breach is going to be a costly affair for the company if attrition numbers continue to grow. Even more costly will be the loss of sponsorship from its primary sponsor bank. Last week, Visa removed Heartland from its list of processors compliant with the Payment Card Industry (PCI) data-security standards.

Wednesday, March 11, 2009

California State Sen. Joe Simitian has introduced new legislation to expand the state's data breach notification law.

According to Wired (March 06, 2009), California State Sen. Joe Simitian has introduced legislation that would require companies to provide more information in their data breach notification letters to consumers and to send notices to state authorities.

Tuesday, March 10, 2009

According to a Ponemon Institute study reported by BBC News, 6 out of 10 US employees stole company data when they left their employer: http://news.bbc.co.uk/2/hi/technology/7902989.stm

This is really a wake-up call to introduce digital rights management into corporations to protect customer data, intellectual property, and business secrets. Coupling identity management practices with DRM will ensure that sensitive information is adequately protected even when it walks out the door with a departing employee.

Monday, February 16, 2009

Three Florida men have been arrested for using stolen credit card information stemming from the Heartland breach. The value of attempted and actual fraud committed by these three alone exceeds $100,000: http://computerworld.com/action/article.do?command=viewArticleBasic&articleId=9127984&intsrc=hm_list

Monday, February 09, 2009

The FAA got unwanted visitors into its computer systems last week, according to a union leader; the intruders accessed the names and national identification numbers of 45,000 employees and retirees. View article...

Wednesday, February 04, 2009

Microsoft and EMC announce an extension of their partnership, and Ballmer talks about the DLP collaboration between RSA and Microsoft in this article: http://news.cnet.com/8301-10805_3-10156015-75.html?tag=newsLeadStoriesArea.1

Tuesday, February 03, 2009

Search, SharePoint, tagging of sites and documents for classification purposes

How would you improve the security of SharePoint? One approach would be to classify sites and tag sensitive documents. The problem is the static nature of a search. A DLP pattern or fingerprint is really nothing more than a search. It is more specialized than a search conducted by a user, but it is still a search, using regular expressions and fingerprints in addition to keywords.
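
To make that concrete, here is a minimal sketch in Python (the patterns and keywords are invented examples, not a real policy) of a DLP filter as nothing more than a specialized search combining regular expressions and keywords:

```python
import re

# Hypothetical example patterns; a real DLP policy would be far more refined.
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),           # e.g. 123-45-6789
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),  # loose card-number shape
}
KEYWORDS = {"confidential", "salary", "patient"}

def scan_document(text: str) -> dict:
    """Return which patterns and keywords a document matches."""
    hits = {name: bool(rx.search(text)) for name, rx in PATTERNS.items()}
    words = {w.strip(".,;:").lower() for w in text.split()}
    hits["keywords"] = bool(KEYWORDS & words)
    return hits

print(scan_document("Confidential: employee SSN 123-45-6789 on file."))
```

The point is simply that the scan is static: it only ever finds what the expressions and keyword lists already describe.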

How can search be improved for security purposes? I believe it is best done by placing more enabling tools in the hands of users. What is needed are improved feedback loops and a better understanding of the system's users. In other words, can SharePoint security be improved by using the playbook from the semantic web movement? I believe it can.

Here is how I envision it working. The SharePoint sites are scanned for sensitive information using rules and patterns with a high accuracy rate, and the matching documents are tagged and classified. This result set should then be visible to the users who have access to the site, whether directly when visiting the site or when the site appears in a search result.

Because documents of the same type tend to be clustered, the users of the site should be asked about the sensitivity of the documents on the site that have not yet been tagged. According to research done at Microsoft, users with similar interests tended to rank their search results similarly. The assumption I would make is that high-frequency users of a specific SharePoint site would classify its documents the same way. If these users are also asked to supply more information about the documents than just the classification level, you can start building richness into the tagging, such as document type: health information, financial information, HR information, etc.

This could also be done automatically if you know what department the most frequent users belong to. If the automated tag turns out to be wrong, users should be presented with an opportunity to give feedback and correct it. Something similar is done for searches on Ask.com, where users are presented with information about the trustworthiness of the site they are about to visit, using tools from Symantec.
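
As a rough sketch of the automated part of this idea (the names, departments, and tag vocabulary below are hypothetical, not taken from any product), the department of a site's most frequent users can seed an initial tag, which user feedback can later override:

```python
from collections import Counter

# Hypothetical data: the department each frequent user of a site belongs to.
site_users = {"alice": "HR", "bob": "HR", "carol": "Finance", "dave": "HR"}

def suggest_tag(user_departments: dict) -> str:
    """Seed a document tag from the most common department among frequent users."""
    dept, _ = Counter(user_departments.values()).most_common(1)[0]
    return {"HR": "hr information",
            "Finance": "financial information"}.get(dept, "unclassified")

tags = {"policy.docx": suggest_tag(site_users)}   # automatic first guess
tags["policy.docx"] = "financial information"     # user feedback overrides the guess
print(tags)
```
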
Study Finds Consumers Want Control over Data
Consumers try to protect their privacy, but don't fully understand how privacy and security technologies work or what protection is being provided, according to a new study.

Monday, February 02, 2009

I believe the issues surrounding compliance will follow us into the cloud. Here is a great link that explains cloud taxonomy and cloud ontology: http://news.cnet.com/8301-19413_3-10152106-240.html

Thursday, January 29, 2009

NIST and DLP vendor opportunities

NIST has published a draft guide for protecting PII, and it will affect best practices and technology choices for years to come once the draft becomes a full standard. The guide gives organizations guidance on how they should manage PII stored or processed in their systems, based on its level of sensitivity.

If the draft becomes a released standard, organizations will use it to prove or disprove their ability to comply with best practices. Therefore, mapping technology and policies to the standard is important, and it is important to understand that no single product can solve all of the issues; a set of complementary products can. DLP products do help in many ways, and it would be good for DLP vendors to start defining best practices that span beyond DLP to include identity management, storage, policy, policy management, encryption, and risk management. NIST's statement that not all PII is to be treated the same is very telling: classifying and tagging the data would help apply the right set of controls to the high-value items without overdoing the controls for lower-value data.
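
To illustrate the point that not all PII should be treated the same (the sensitivity levels and controls below are made up for the sketch, not drawn from the NIST draft), a classification tag can drive which set of controls is applied:

```python
# Hypothetical mapping of sensitivity level to controls; the real choices would
# come from the organization's own policy and the final NIST guidance.
CONTROLS_BY_LEVEL = {
    "high":     ["encryption at rest", "access certification", "audit logging"],
    "moderate": ["access certification", "audit logging"],
    "low":      ["audit logging"],
}

def controls_for(record_tags: dict) -> list:
    """Pick a proportionate control set from a record's sensitivity tag."""
    level = record_tags.get("sensitivity", "low")
    return CONTROLS_BY_LEVEL.get(level, CONTROLS_BY_LEVEL["low"])

print(controls_for({"type": "ssn", "sensitivity": "high"}))
```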

One observed issue with the NIST publication is that it defines PII but does not provide an exhaustive list. For example, the Census Bureau may specify additional types of PII that are subject to stricter treatment.

NIST recommends that each organization Create Policies and Procedures, Conduct Training, De-identify PII, Employ Proper Access Enforcement, Ensure Transmission Confidentiality, and Audit Events.

So, similar to PCI, DLP might not be the full answer, but it can provide insight that helps enable compliance in some of these areas. For de-identifying PII, DLP helps by discovering the PII; it is then up to the organization to de-identify it. This is of course not a straightforward process and will need some thought before being implemented. With DLP, the organization gains an understanding of which business units or groups have the most issues and can focus training activities accordingly. Likewise for creating policies and procedures: this falls into the realm of understanding the PII inventory and its priority levels.
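
As a very small sketch of what the de-identification step could look like once DLP has discovered where the PII lives (assuming SSN-shaped values as the example; real de-identification involves far more than masking):

```python
import re

SSN = re.compile(r"\b(\d{3})-(\d{2})-(\d{4})\b")

def mask_ssn(text: str) -> str:
    """Replace all but the last four digits of SSN-shaped values."""
    return SSN.sub(lambda m: "XXX-XX-" + m.group(3), text)

print(mask_ssn("Employee 123-45-6789 approved."))  # Employee XXX-XX-6789 approved.
```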

The new collaboration between RSA and Microsoft for DLP solutions coupled with DRM is clearly a step in the right direction.

Regular search does not consider the fact that sensitive documents are typically found in clusters. If your DLP search engine has found one sensitive document in a location such as a file share or a laptop, the probability that there are more is very high; the others, however, tend to end up as false negatives. For example, if a sensitive document is found in a file share, there is a high likelihood that other documents of equal sensitivity are not covered. The usage scenario could be an HR professional storing documents in a folder for a specific task. If the filter only finds one of them, the current assumption with DLP is that no other files in this folder are sensitive. Based on my observations of real incidents, this is a false assumption.

How can this be remedied?
A manual review can be done for the rest of the folder and the folders in the tree
The folder can be marked sensitive, and all documents in the folder are then considered sensitive
The folder can be automatically reviewed with a broader capture filter (filters are usually tuned to reduce false positives, which leads to a higher number of false negatives)
Fingerprinting (full or partial) can be used to see if these documents reside elsewhere
Pattern creation can be used to improve the search patterns
Etc.

The true solution is a combined approach: use manual inspection and machine learning, assume that the likelihood of a single sensitive document residing alone in a repository is low and the likelihood of more than one document being sensitive is high, and mitigate the risk by classifying, tagging, and protecting the cluster instead of a single document.
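
Here is a minimal sketch of that combined approach (the two regular expressions stand in for a tuned DLP filter and a broader capture filter; both are invented for illustration): one confirmed hit from the strict filter escalates the whole folder to the broad filter instead of trusting the single match.

```python
import re
from pathlib import Path

STRICT = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")                 # tuned to avoid false positives
BROAD = re.compile(r"salary|ssn|social security|dob", re.I)    # wider, noisier net

def scan_folder(folder: Path) -> dict:
    """Treat the folder as a cluster if any file matches the strict filter."""
    texts = {p: p.read_text(errors="ignore") for p in folder.glob("*.txt")}
    if any(STRICT.search(t) for t in texts.values()):
        # One confirmed hit: assume a cluster and escalate to the broad filter.
        return {p.name: ("sensitive" if STRICT.search(t) or BROAD.search(t)
                         else "needs manual review")
                for p, t in texts.items()}
    return {p.name: "unclassified" for p in texts}

# Example: scan_folder(Path("/shares/hr/q1-review"))
```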

Monday, January 26, 2009

Content protection should be tied into access certification. It seems that companies are now improving their compliance by implementing provisioning technologies, according to the Burton Group. Considering how hard it is to control who should have access to what, I believe that coupling provisioning tools with DLP is the next logical step.

The DLP system should notify the content custodian of the type of content found and present the custodian with choices of protection measures; marrying this with provisioning systems would lessen the burden on the custodian.
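
A rough sketch of that flow (every name here is hypothetical glue code, not any vendor's API): the DLP system reports the content type, the custodian picks a protection measure, and the decision is handed off to the provisioning layer.

```python
# Hypothetical workflow glue; a real integration would call the DLP and
# provisioning products' own APIs.
PROTECTION_CHOICES = ["restrict access", "encrypt", "apply DRM", "accept risk"]

def notify_custodian(finding: dict) -> str:
    """Show the finding to the custodian and return the chosen measure."""
    print(f"{finding['path']} contains {finding['content_type']}.")
    print("Choices:", ", ".join(PROTECTION_CHOICES))
    return "restrict access"  # stand-in for the custodian's selection

def apply_via_provisioning(path: str, decision: str) -> None:
    """Placeholder for a provisioning-system call that would change entitlements."""
    print(f"Provisioning request: {decision} on {path}")

finding = {"path": "//share/hr/salaries.xlsx", "content_type": "HR information"}
apply_via_provisioning(finding["path"], notify_custodian(finding))
```
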
Not long after the public notification of the breach at Heartland Payment Systems, law firms such as Girard Gibbs LLP started investigating the breach and are soliciting individuals who may be affected by it. This may have been the largest breach ever; the numbers may reach tens of millions of credit and debit card transactions, according to this article in the Washington Post.

Sunday, January 25, 2009

President Obama is embarking on wide-reaching changes in the regulatory environment, according to the New York Times. This should translate into busy times for any IT department managing regulatory and compliance issues for its company.

Friday, January 23, 2009

Researchers have found a relationship between word choices in communications and how well a relationship is functioning. In other words, content in communications can be used to establish the overall health of a relationship: http://www.msnbc.msn.com/id/28814669/

If word choice can be used to determine the strength of a relationship based on the frequency of certain words, it is not a stretch to conclude that foul play could be detected by the same type of analysis. The choice of words would be different, of course, but given a large collection of communications between criminals that could be mined, it should be possible to use pattern recognition to ferret out such communications in network traffic.
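
As a toy illustration only (the watchword list and scoring are invented; real work would require a labeled corpus and proper statistical modeling rather than a hand-written list), the frequency of certain terms in a message could be turned into a crude score:

```python
from collections import Counter

# Invented watchword list for illustration; a real model would be learned
# from labeled data, not hand-written.
WATCHWORDS = {"wire", "offshore", "untraceable", "burner"}

def suspicion_score(message: str) -> float:
    """Fraction of words in the message that are watchwords."""
    words = Counter(w.strip(".,!?").lower() for w in message.split())
    hits = sum(words[w] for w in WATCHWORDS)
    return hits / max(sum(words.values()), 1)

msg = "Move the funds to the offshore account, use the burner phone."
print(round(suspicion_score(msg), 2))
```
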
According to Network World, Forrester Research predicts big opportunities for tech firms from the Obama cybersecurity plan.

Wednesday, January 21, 2009

Interesting link to coverage of data breaches in 2008: http://www.insideidtheft.info/breaches.aspx?gclid=CNrSt4epnpgCFSMSagodjCXQmg

According to an article in the Washington Post, Heartland Payment Systems may have had the largest breach ever. The numbers may reach tens of millions of credit and debit card transactions.

Friday, January 16, 2009

Two areas of concern for '09 will be sensitive information moving into virtualized environments and into the cloud.

According to this article in the WSJ, a report from the Center for Strategic and International Studies points out the trend toward greater industrial espionage. Quote from the WSJ article: "Supposedly confidential corporate information, the report warns, is almost certainly being hacked. As more individuals and companies rely on 'cloud computing' -- storing information and services such as email remotely on supposedly secure servers -- foreign intelligence agencies and commercial snoops may have access." This is a troubling statement.

According to CIO magazine, CIOs are looking toward virtualization and the cloud in '09 to reduce operating and capital expenses. If these are the areas of investment, this is also where criminals will spend their resources to wrest valuable information from its rightful owners.

Internet News is running an article on this subject today.