Thursday, January 31, 2008

Bold and commendable move by IBM

IBM is deploying PGP to more than 350,000 employees, enabling all of them to keep their sensitive information confidential even if they lose their laptops: http://techworld.com/security/news/index.cfm?newsID=11272&pagtype=samechan

Wednesday, January 30, 2008

Blogging and DLP

Should you worry about losing IP or sensitive information through your employees' use of blogs? According to intellectual property attorney Stephen M. Nipper, employees are more likely to leak closely held data through casual e-mails than through carefully thought-out blog entries. The quote is from the book Naked Conversations.

I assume it is the same Mr. Nipper who runs these blogs, and there are some really interesting reads on IP there: http://inventblog.com/, http://www.rethinkip.com/ and http://www.shapeblog.com/

On the other hand, it would also be prudent to search your public presence for sensitive information if you have the capability.
Why companies and organizations need to think about SharePoint and protecting information within SharePoint.

According to Forrester, SharePoint leads the way to Enterprise 2.0. With its capabilities for storing documents and providing a collaborative environment with wikis and other social networking features come issues around content sensitivity, entitlement management, protection, discovery, retention and audits.

Independent Software Vendors are also jumping on the bandwagon, enabling increased use of SharePoint in enterprises and other organizations and further driving the need for securing content on SharePoint: http://www.crn.com/software/205801189
Here are links to several good thoughts on various laws and their implications from a DLP perspective: http://hack-igations.blogspot.com/search/label/credit%20card%20law (credit card laws), http://hack-igations.blogspot.com/search/label/data%20breach%20notification (breach notification)

Tuesday, January 29, 2008

In the data loss news:

A 58-year-old Greek mathematics professor steals data, causing losses of $361 million:
http://news.smh.com.au/greek-authorities-accuse-man-of-selling-stolen-dassault-software/20080126-1o9j.html

ChoicePoint settles to the tune of $10 million:
http://www.consumeraffairs.com/news04/2008/01/choicepoint_settle.html

Seven Citibank employees in Singapore are arrested after taking customer data with them to their new employer, UBS:
http://www.ft.com/cms/s/0/83d71216-caab-11dc-a960-000077b07658,dwp_uuid=e8477cc4-c820-11db-b0dc-000b5df10621.html

Laptop theft leaves unencrypted healthcare information and customer data unaccounted for at an HMO and a retailer.

Friday, January 25, 2008

Context and uniqueness: a method for finding the proverbial needle in the haystack. How do you find IP amongst large quantities of content that closely resembles IP but is not IP? This is the false positive problem when searching content for IP; in other words, how do you avoid a million-plus false positives across a large body of content containing both?

The problem around context can easily be seen in this example: "He wanted more chips." Without knowing his location or situation, it is impossible to determine whether he wanted potato chips, wood chips, or poker chips. Only when you know the situation in which this want exists do you know which type of chips was wanted. Now look at this example: "He was outside cooking meat in his smoker. He wanted more chips." It becomes apparent what type of chips he wanted.
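To make the idea concrete, here is a minimal sketch (my own toy illustration, not taken from any DLP product) of scoring an ambiguous term against lists of context keywords; the senses and word lists are made up for the example:

import re

# Which co-occurring words point to which sense. Invented for this example.
CONTEXT_KEYWORDS = {
    "wood chips": {"smoker", "grill", "barbecue", "hickory"},
    "potato chips": {"snack", "bag", "salted", "crunchy"},
    "poker chips": {"casino", "bet", "dealer", "cards"},
}

def disambiguate(text):
    # Score each sense by how many of its context words appear in the text.
    words = set(re.findall(r"[a-z]+", text.lower()))
    scores = {sense: len(words & keywords) for sense, keywords in CONTEXT_KEYWORDS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "unknown"

print(disambiguate("He wanted more chips."))
# -> unknown
print(disambiguate("He was outside cooking meat in his smoker. He wanted more chips."))
# -> wood chips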

Context copied from Merriam-Webster OnLine: http://www.m-w.com/dictionary/context
Etymology:
Middle English, weaving together of words, from Latin contextus connection of words, coherence, from contexere to weave together, from com- + texere to weave — more at technical
Date:
circa 1568
1 : the parts of a discourse that surround a word or passage and can throw light on its meaning
2 : the interrelated conditions in which something exists or occurs : environment, setting

Consider a piece of source code such as this (randomly chosen) sample on MSDN: http://msdn2.microsoft.com/en-us/library/ak5wyby1(VS.80).aspx

// attr_implements.idl
import "docobj.idl";
[ version(1.0), uuid(0ed71801-a1b6-3178-af3b-9431fc00185e) ]
library odod
{
   importlib("stdole2.tlb");
   importlib("olepro32.dll");

   [
      object,
      uuid(1AECC9BB-2104-3723-98B8-7CC54722C7DD)
   ]
   interface IBar1 {
      [id(1)] HRESULT bar1();
   };

   [
      dual,
      uuid(1AECCABB-2104-3723-98B8-7CC54722C7DD)
   ]
   interface IBar2 {
      [id(1)] HRESULT bar2();
   };

   [
      uuid(1AECC9CC-2104-3723-98B8-7CC54722C7DD)
   ]
   dispinterface ISna {
   properties:

   methods:
      [id(1)] HRESULT sna();
   };

   [
      uuid(159A9BBB-E5F1-33F6-BEF5-6CFAD7A5933F),
      version(1.0)
   ]
   coclass CBar {
      interface IBar1;
      interface IBar2;
      dispinterface ISna;
   };
}

What could you use to distinguish this piece of code, which is publicly available, from internal code? Can you read context out of this text? What is unique? Well, if you have a bunch of these kinds of files, they all start to look very much the same: C++ is a very structured language, and most developers reuse code, which means that a segment can be found in many files. So the real question becomes: how can you find something unique in the internally developed code you want to protect, versus something that is publicly available? The same is of course true if you want to scan your internal code for open source code or copyrighted code.

One thing this piece of code is almost devoid of is comments. The problem is that when you look at some source code, it will also contain boilerplate comments, which are of course useless for identifying source code IP. So you have to do some intelligent searching to find something that can identify what is truly unique.

The method I propose is as follows (a rough code sketch follows the steps):

1. Search through each file

2. Extract the comments

3. Query a commercially available search engine for the number of hits

If the number of hits is zero, the probability is high that the text combination is unique: mark with value 9

If the number of hits is low, the probability is medium to high depending on the number of hits and the closeness to the initial search term: mark with value 5

If the number of hits is high, the probability is low: mark with value 1

If there is more than one comment, add the values

4. Create a search term including the high-value terms

5. Test against corpus of known publicly available source code

6. Count false positives

If the number of false positives is greater than X, then discard the search term

7. Test against corpus of known internal code

8. Count false negatives

If the number of false negatives is greater than X, then go to 4 and add to the search term
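Here is a rough sketch of steps 1 through 3 (my own illustration; comment extraction is simplified to C/C++ style comments, search_hit_count is a placeholder for whatever commercial search engine you license, and the thresholds are arbitrary example values):

import re

def extract_comments(source):
    # Step 2: pull // and /* ... */ comments out of a C/C++/IDL source file.
    line_comments = re.findall(r"//(.*)", source)
    block_comments = re.findall(r"/\*(.*?)\*/", source, re.DOTALL)
    return [c.strip() for c in line_comments + block_comments if c.strip()]

def search_hit_count(phrase):
    # Step 3: placeholder; wire this up to the search engine you license
    # and return the number of hits for the phrase.
    raise NotImplementedError

def uniqueness_score(comment):
    # Map hit counts to the 9 / 5 / 1 values described above.
    hits = search_hit_count(comment)
    if hits == 0:
        return 9          # no hits: likely unique to your code base
    if hits < 50:         # "low" threshold is an arbitrary example value
        return 5          # medium to high, depending on closeness to the term
    return 1              # widely published, probably boilerplate

def score_file(source):
    # Steps 1 through 3: extract the comments and add the values.
    return sum(uniqueness_score(c) for c in extract_comments(source))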

This method could also be used for identifying other types of IP or business intelligence. I believe it would also be very helpful in identifying pre-release marketing material as well as financial or legal documents. For marketing material, you would have to look for unique phrases, words, or combinations thereof. For legal documents, you would have to look for what makes the document unique; since legal documents can also contain large amounts of boilerplate text, this method would work well here too.

Thursday, January 24, 2008

DLP and eDiscovery

DLP lends itself well to eDiscovery. The biggest obstacle today is the lack of machine learning that could easily enable the process of discovering documents in vast repositories. With email the situation is the same, but the discovery process is more mature, as emails have been sought by litigants for a while now.

DLP would also facilitate a broader eDiscovery search than is possible without it. If DLP is deployed for network traffic (Data in Motion, DIM), across repositories (Data at Rest, DAR), and on endpoints, it becomes possible to search for any pertinent document across the entire organization.

For DIM, with most DLP vendors you only have a record of the incidents caught, so an archive of all traffic is not maintained. The most notable exception is the DIM product Reconnex provides: they capture all traffic and store it, which enables eDiscovery on all captured traffic, not just the traffic that caused incidents.

For DAR, most DLP vendors scan the information in a central system, causing the information to move over your network, which can drive network utilization higher than desired. With DAR, a new scan is needed whenever the rules change, as most DAR vendors do not utilize archived and indexed material. Tablus, now RSA/EMC, uses a notably different process, as most of the information is scanned locally instead of being transported across the network.

For endpoints, it is clearly an advantage to do the scanning locally if you have a large number of desktops/laptops deployed, unless you can target which systems to scan. However, each time you do a targeted scan, you increase the number of man hours needed for the process.

With DLP and eDiscovery, it is important to have good retention policies and to actually enforce the remediation policies. Merely relying on policies without enforcement does not effectively address the risks posed by storing un-needed information. It is my belief that DLP systems and general search systems will be increasingly used for discovery purposes, and the dredging capabilities enabled by such systems can turn out to be very costly for organizations that do not enforce their retention policies.

Good risk management should drive the adoption of DLP in the organization, and it should pinpoint in what areas the highest return on the investment can be found.

Tuesday, January 22, 2008

EU says IP addresses are personal information

Peter Scharr, Germany's data protection commissioner, told a European Parliament hearing that IP addresses must be considered personally identifiable information. If this becomes part of the EU privacy laws, it will have a wide impact. The article can be found here: http://www.cbsnews.com/stories/2008/01/21/tech/main3734904.shtml?source=RSSattr=SciTech_3734904
What do Roman aqueducts have in common with security and compliance?

Here is a link to a very good article on the Roman aqueducts and the impact that lacking security had on the demise of Roman society. What does this have to do with DLP, you might ask? Well, what was initially a convenience for the Roman populace became critical to their society, and a clear parallel can be drawn to today's convenience of collaboration tools such as SharePoint and others. These tools enable users to become more productive, and as they gain a foothold in the organization, other tools and business processes are built on top; finally you have an infrastructure that, if attacked, can severely disrupt your business.

Proper prior planning with this in mind is important when deploying these systems. As the business processes in the environment change, they should be mapped, and risks and compliance requirements should be re-evaluated.
PCI and DLP

How can you use DLP to protect the flow of credit card data outside of your main credit card processing?

Data in Motion should be used to validate that credit card information does not traverse your network in the clear, and it should provide audit reports to verify this. DIM should also integrate with the database as well as the web applications so that all flows are understood and mapped. The problem is that over time, tables may come to contain more information than expected, and these tables may be used by applications in ways not fully understood. Data flow mapping is paramount, and in order to do it, content inspection and classification are needed. You should be able to understand the entire lifecycle of the credit card information, irrespective of which system it resides on and whether it stays within your own organization or traverses to a business partner.
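To illustrate the kind of content inspection a DIM sensor performs (this is my own minimal sketch, not any vendor's detection logic), a pattern match plus the standard Luhn checksum cuts down the false positives from random digit runs:

import re

# Candidate runs of 13-16 digits, possibly separated by spaces or dashes.
CARD_PATTERN = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def luhn_valid(number):
    # Standard Luhn checksum; weeds out most random digit runs.
    total = 0
    for i, digit in enumerate(reversed([int(d) for d in number])):
        if i % 2 == 1:
            digit *= 2
            if digit > 9:
                digit -= 9
        total += digit
    return total % 10 == 0

def find_card_numbers(text):
    hits = []
    for match in CARD_PATTERN.finditer(text):
        digits = re.sub(r"[ -]", "", match.group())
        if 13 <= len(digits) <= 16 and luhn_valid(digits):
            hits.append(digits)
    return hits

# 4111 1111 1111 1111 is a well-known test number that passes the Luhn check.
print(find_card_numbers("Order confirmed, card 4111 1111 1111 1111, ship today"))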

Data at Rest should be used to evaluate repositories to ensure that the flows of credit card information to and from them are understood, and that retention policies are followed by destroying credit card data in repositories where PCI data is no longer needed.

Agents should be used on applications and end user systems to ensure that only users entitled to the information have access, and that the information is always encrypted. They should also enforce the data retention policies so that when the information is no longer needed, it is removed. However, you cannot just remove information without notifying users, so the solution must take into account how to notify users about expiring material and the different actions they can take. One way I envision the solution is that users are notified that they must either delete the data or ask for an exception in order to keep the information.

This system should be distributed, and should provide compliance metrics and risk management metrics for the organization. It should also be able to tie into other security and compliance systems, so the organization has a full view of the security issues, compliance and risk being managed.

Monday, January 21, 2008

An interesting report on the state of PCI, conducted by Forrester Research for RSA, points out that encryption and access control are the top challenges for organizations trying to become PCI compliant. Furthermore, organizations keep too much data; however, the PCI controls are used to drive compliance and improve security in the same organizations.

The report found that companies are concerned about data classification and access control policies. Data classification can of course be achieved with DLP, but in most PCI systems I know of, the data traverses the network, resides in databases and file shares, and is processed and made available through web applications.

This means that the future of DLP will have to answer the challenge of identifying PCI data throughout all its use, be able to identify business processes that do not adhere to the standard, and provide audit capabilities that drive down the cost of maintaining PCI compliance.
Why classification and protection matter for internal sensitive information is clearly shown in this article: http://www.portfolio.com/news-markets/national-news/portfolio/2008/01/14/Media-Defenders-Profile?print=true. Excerpt: internal, unpublished information belonging to MediaDefender was stolen by a hacker who had penetrated their network.

If you are hacked, and you have sensitive information unprotected and available to any user with a logon ID, you are liable to lose this information to the hacker who has penetrated your firewall. It goes to show that the only way to truly protect your information is to deploy a defense in depth solution where the sensitivity of your documents is taken into account along with your traditional perimeter defenses.

If the documents had been encrypted, it is not likely that they would have been of any use to this hacker, and the impact to the organization would have been smaller.

What does such an attack, and the subsequent loss of sensitive information, mean for this organization's reputation and future revenue stream?

Then there is of course the loss of a laptop in Britain by the Royal Navy, containing unencrypted information on 600,000 individuals. Quote: "The HMRC data leak happened two months prior to this theft, but apparently the personal data on the Royal Navy laptop was not encrypted despite the easy availability of such software." The article can be found here: http://www.vnunet.com/vnunet/news/2207687/royal-navy-laptop-stolen. The junior officer may face court martial, but I believe the real failure is organizational, not so much personal. What type of awareness and training in protecting information is given? Are the right tools available to help personnel identify the sensitive information they are in charge of? Do personnel have tools available that allow them to easily protect the information?

For organizations, I believe it is paramount that they start looking at ways to support their users through easy-to-use classification and protection schemes. What happens if a laptop is lost or stolen? What happens if a laptop is breached over the internet? Being able to answer these questions with a statement such as "yes, we had an unfortunate loss, but the information was protected with encryption" would be quite different, and probably would not make headlines damaging your organization's reputation.
I know this is old news by now, but it illustrates the need for protecting your databases.
Hacker breaches UGA system, stealing social security numbers. If your database tables containing social security numbers are encrypted, hackers will not be able to use the data even if they gain access to it, unless your entire database is compromised. Do you know which tables contain this information, do you know whether it is protected, and are these tables exposed to the internet?

Friday, January 18, 2008

Health care organizations are now facing reviews of HIPAA compliance from the Centers for Medicare and Medicaid Services, increasing the need for protection of patient records in these organizations. The first to be reviewed are the large hospitals and hospitals that have received complaints. CMS is not going to publicly announce who is being reviewed, but will release a report after the review. It will be interesting to see CMS' findings. The CMS also holds the power to fine these organizations or levy other punishments. We will see what the outcome will be. http://www.govhealthit.com/online/news/350176-1.html?type=pf
A tape containing financial information on 650,000 people is missing. The tape belongs to GE Money and was stored at Iron Mountain. It contains information from JC Penney shoppers and maybe as many as one hundred other retailers: http://www.msnbc.msn.com/id/22718442/
Corporate espionage and other threats to companies

SANS has determined that B2B corporate espionage is the next big worry for companies, according to their latest research: http://www.infoworld.com/article/08/01/15/Cyber-espionage-moves-into-B2B_1.html. With that in mind, I see a further need for proper protection of information in organizations, and most organizations will have to scramble to meet these new challenges. I believe a combination of DLP, classification, protection, and audit will increasingly be needed. The game has moved from being regulatory compliant to being able to protect your valuables. The last thing you want is to be notified by the federal government that your information may have been compromised. It is time to start thinking about how to safeguard your information.

It is also interesting to see that SANS has identified the insider threat as growing: http://www.sans.org/2008menaces/?utm_source=web-sans&utm_medium=text-ad&utm_content=text-link_2008menaces_homepage&utm_campaign=Top_10__Cyber_Security_Menaces_-_2008&ref=22218. Ensuring that insiders do not have access to more information than they need is paramount. This threat is compounded by new search technologies such as Google for the enterprise, FAST, Microsoft Search Server and others, which give the malicious insider an easy way to find and steal sensitive information.

The three tenets of safeguarding information are:
Identify:
Where it is, what it is, who has access, whether the information is still needed, and whether it is protected
Protect: Ensure proper classification and proper access rights, and encrypt sensitive information
If information is found that is not properly protected:
Remediate:
Classify information appropriately according to policy and/or regulatory requirements
Remove excess access rights and force periodic re-validation of access rights
Apply encryption to sensitive information found un-encrypted
Remove unneeded information
Audit:
Prove that the information is in its appropriate location
Prove that only required personnel have access
Prove that information is properly classified
Prove that sensitive information is encrypted
Prove that retention policies are adhered to

What does this mean for the different verticals?

Financial sector
Insurance
Health Care
Pharmaceuticals
Manufacturing
Technology

For the financial sector, the most important aspect will still be regulatory compliance, but being able to protect business secrets will become more important. An example from Wall Street is the use of DRM to keep information from being forwarded when opportunities for financial gain are sent out to prospective investors. Of course, irate customers who have lost money or have been inconvenienced will also be a driving factor.

For the insurance industry, I believe the most important aspect will still be the public's perception of how well their information is protected, as well as regulatory compliance. However, the loss of business data such as marketing plans and customer databases to competitors should also rank high.

For health care, I believe the most important task is to protect patients' health information. See the previous blog entry on California's new law. HIPAA has been around for a while, but with this new law, regulatory compliance becomes even more important.
For pharmaceutical companies, I believe B2B espionage will become an even more troublesome area to address, especially considering the heightened competition from smaller companies abroad.

For manufacturing, manufacturing processes and blueprints are the crown jewels that need protection. Contract negotiations are also an area of concern: if information is lost that improves the other party's position, profit loss and a lost competitive edge ensue.
For technology companies, customer information, regulatory requirements, intellectual property protection and business intelligence are areas of concern, along with contract negotiations.

Thursday, January 10, 2008

Sears sued over privacy breach. They posted customer purchase information on their website Managemyhome.com: http://www.infoworld.com/article/08/01/08/Sears-sued-over-privacy-breach_1.html

Tuesday, January 08, 2008

DLP solutions for data bases

The three major categories are of course Data in Motion, DIM, Data at Rest, DAR, and Data in Use. Data in Motion was the first area the DLP vendors addressed, followed by Data at Rest, Data in Use, or both. What the major DLP vendors have not addressed is databases. There are companies such as Exeros that provide partial solutions for discovering sensitive information in databases, but there is no comprehensive solution in place today that I know of.

Looking at SQL Server 2005, it enables encryption, but this comes at a price (increased storage). It enables the database administrator, DBA, to choose between different symmetric and asymmetric encryption algorithms. This allows for encryption of the sensitive information within the database, assuming you know where the sensitive information resides, as the cost of encrypting the entire database is too high.

The trouble comes, of course, when you need to expose this information to end users or other applications. Setting up connections that keep the information encrypted the entire time is complex and requires key management and authorization management, with federation if you communicate outside of your organization, which is why database encryption is not widely deployed.

So how can this be solved?

A DLP-style search across databases can and should be built (a rough sketch of the first two items follows this list).
The resulting information should be held in a metadata database for classification purposes.
Where regulatory requirements or internal policies require encryption, database encryption should be turned on automatically. In order to do this, a common encryption scheme must be put in place across the organization's database systems, including federation where needed.
If applications transport the information, SSL or IPSEC (with encryption) should be used.
If applications expose information to the end user, a DRM solution should be built that enables the IIS server to serve the documents to the end user with the appropriate permissions.
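A very rough sketch of the first two items (my own illustration; sqlite3 stands in for whatever production databases and drivers you actually run, and the patterns are placeholder examples):

import re
import sqlite3

# Example patterns only; a real rule set would be far richer.
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def scan_table(conn, table, sample_rows=100):
    # Item 1: sample rows from a table and report which patterns appear.
    found = set()
    for row in conn.execute(f"SELECT * FROM {table} LIMIT {sample_rows}"):
        for value in row:
            if not isinstance(value, str):
                continue
            for label, pattern in PATTERNS.items():
                if pattern.search(value):
                    found.add(label)
    return found

def record_classification(meta, table, labels):
    # Item 2: the metadata store holding the discovered classification.
    meta.execute("CREATE TABLE IF NOT EXISTS classification (table_name TEXT, labels TEXT)")
    meta.execute("INSERT INTO classification VALUES (?, ?)", (table, ",".join(sorted(labels))))
    meta.commit()

# Tiny demonstration with in-memory databases standing in for production.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE customers (name TEXT, note TEXT)")
db.execute("INSERT INTO customers VALUES ('Jane Doe', 'SSN 123-45-6789 on file')")
meta = sqlite3.connect(":memory:")
record_classification(meta, "customers", scan_table(db, "customers"))
print(meta.execute("SELECT * FROM classification").fetchall())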

These five items sound easy enough to implement, but the real challenge is not just to build the core technology, but also to build a workflow that enables the business to keep operating while remediating incidents found where business processes are either broken or not fully documented.
DRM and ILP

When you have identified your sensitive information, you need to do something about it. The worst position to be in is to have sensitive information identified in areas where it does not belong, without a solid business plan for how to remediate it.

Unprotected sensitive information is also available to the malicious insider, and it is important to balance security needs and productivity needs. Finding a balanced solution can be done with a combination of DLP solutions, Classification Solutions, Entitlement Solutions, and Protection Solutions.

The best way to protect sensitive information is to manage the entitlements to the information as well as placing security controls on it. Entitlement management is hard to achieve unless you can enlist the owner or custodian of the information to manage entitlements. However, if you ask them to manage both entitlements and encryption, you will not be successful unless you automate the solution for the custodian.

If you put a classification system in place, you can manage both via automation. However, it is also important to evaluate the encryption technology you want to use. The Encrypting File System, EFS, may not be the best solution, as the encryption is lost when information is copied from one repository to the next, thereby exposing the document.

Digital Rights Management, DRM, is a better solution, as the document is protected the entire time: at rest, in transit, and in use. One drawback is of course that DRM solutions may not be able to protect all document types.

One DRM solution to consider is Information Rights Management, IRM, which is capable of protecting Microsoft Office documents. IRM works in conjunction with Rights Management Services, RMS. With such a solution you can protect Excel spreadsheets, Word documents, PowerPoint decks and other document types. IRM also works on documents retrieved from SharePoint when IRM is enabled on SharePoint.

A comprehensive solution will use the information from the DLP solution and apply the correct classification level to the repository where the documents are found. The entitlement solution will then restrict the number of users allowed access and require re-validation of entitlements on a periodic basis. Finally, the protection solution will place IRM protection on the documents.

IRM protection can prevent the unauthorized copying, printing and forwarding of documents. In addition, it can be used to control the lifecycle of a document by setting an expiration date on it.
California SB 1386 is extended to cover unencrypted health information. AB 1292 adds medical and health insurance information to the breach disclosure laws: http://www.scmagazineus.com/California-data-breach-disclosure-law-extended-to-cover-medical-records/PrintArticle/100459/. It will be interesting to see which organizations will have to notify after a breach in the upcoming year.
Swedish secret military information was left on a USB stick in a public library in Stockholm, Sweden: http://www.thelocal.se/9560/20080104/. This shows that users in all sorts of organizations need more awareness training and easier access to security technologies that enable them to better protect their information. I will write about digital rights management, as I believe it can alleviate some of these issues.

Friday, January 04, 2008

Follow up to why classification matters (seems that others believe the same):

"It's not just data. You have to classify everything from a risk perspective," said Brian Cleary, vice president of marketing at access governance firm Aveksa. "Once you have those controls in place, the likelihood of losing that data goes down exponentially." Copied from an article about information loss prevention published here:http://www.crn.com/security/205207370

If you don't keep tabs (meta data) on the types of data you have, who has access (and who should have access), its sensitivity, and its age, you are running blind.

A natural extension to a DLP implementation and solution is to implement a classification solution, an entitlement solution (which helps with SOX), and a retention solution.
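As a trivial illustration of the kind of meta data record I have in mind (the field names and values are my own invention, not taken from any product):

from dataclasses import dataclass
from datetime import date

@dataclass
class RepositoryMetadata:
    location: str              # where the data lives (share, site, database)
    data_types: list           # what it is (PII, PCI, IP, ...)
    owner: str                 # who is accountable for it
    entitled_groups: list      # who has (and should have) access
    sensitivity: str           # classification label
    last_reviewed: date        # age / when entitlements were last re-validated
    still_needed: bool = True  # feeds the retention decision

record = RepositoryMetadata(
    location=r"\\fileserver\finance\q4",
    data_types=["PCI"],
    owner="finance-data-owner",
    entitled_groups=["finance-analysts"],
    sensitivity="confidential",
    last_reviewed=date(2008, 1, 4),
)
print(record)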
Virginia Governor Kaine announces new legislation to protect consumers from identity theft: http://www.govtech.com/gt/print_article.php?id=242006. The proposal includes breach notification and a credit report freeze.

Thursday, January 03, 2008

How to tie it all together?
The way to tie it together is to find a vendor who is willing to expose their APIs, so that you can call those APIs with the information you need to support the workflow required for a full compliance solution for your sensitive information.

When you go through your initial investigation, you should map out the business processes that create sensitive information and develop a business process for compliance management and remediation. The compliance workflow should include reporting based on business groups, regional reporting, and roll-ups for each business manager so that problem areas can be pinpointed. Furthermore, as you start scanning, you must be ready for business processes that are not documented, or that work differently from how they are documented. Your solution must be able to facilitate on-boarding of new business processes as they are discovered.

For data at rest, a good way to manage this information is to first detect the sensitive information in its repositories. Second, notify the owner of the share or site that sensitive information resides in the repository. Third, enable the owner to set the appropriate classification level and accompanying protection. This step has to be intuitive and easy for the end user to perform, and must not impact the business process the repository is a part of. For this reason, the user should be given ample time to classify and protect the site, and a roll back function must enable roll back in case of an adverse effect on the business process.
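A skeletal sketch of that data-at-rest workflow (my own illustration; every helper below is a placeholder for calls into your DLP scanner, mail system, and classification or protection products, not any vendor's API):

def detect_sensitive_content(repository):
    # Step one: the DLP scan of a share or SharePoint site (placeholder result).
    return ["PCI data in /finance/q4-report.xls"]

def notify_owner(repository, findings, deadline_days):
    print(f"Owner of {repository}: please classify within {deadline_days} days: {findings}")

def apply_classification(repository, label):
    print(f"{repository} classified as {label}")

def remediate(repository, owner_label):
    findings = detect_sensitive_content(repository)       # detect
    if not findings:
        return
    notify_owner(repository, findings, deadline_days=30)  # notify with ample time
    previous_label = "unclassified"                       # remember the prior state
    try:
        apply_classification(repository, owner_label)     # the owner's chosen level
    except Exception:
        # roll back if the change has an adverse effect on the business process
        apply_classification(repository, previous_label)
        raise

remediate(r"\\fileserver\finance", owner_label="confidential")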

Measure everything – Determine what the key performance criteria are, then constantly measure and analyze them, and integrate your findings into your operations. The best way to achieve this goal is to be able to create good reports from all the data you are gathering. Combining information from the DLP solution with your directory services, network information, repository owner information, and the remediation efforts made by your compliance workflow yields very detailed information, which allows you to redirect your efforts towards your bottlenecks and highlight high-risk areas you were not previously aware of.
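For instance, a minimal roll-up of open incidents per business group (the incident records and field names here are hypothetical examples of DLP findings joined with directory and owner information):

from collections import Counter

incidents = [
    {"repository": r"\\fs1\finance", "owner": "alice", "business_group": "Finance", "status": "open"},
    {"repository": r"\\fs2\hr",      "owner": "bob",   "business_group": "HR",      "status": "remediated"},
    {"repository": r"\\fs1\finance", "owner": "alice", "business_group": "Finance", "status": "open"},
]

open_by_group = Counter(i["business_group"] for i in incidents if i["status"] == "open")
for group, count in open_by_group.most_common():
    print(f"{group}: {count} open incidents")   # roll-up per business manager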

Drive Awareness – It is my belief that without awareness in your organization of your policies and of how to protect sensitive information, you will have inadvertent loss of sensitive information, possibly leaving your company non-compliant with regulatory requirements, or losing information that enables other organizations to capitalize on what your company worked so hard to obtain.

Wednesday, January 02, 2008

A good overview of how to detect credit card data (or how difficult it can be) can be found in the blog entry by Ofer found here: http://www.modsecurity.org/blog/archives/2008/01/detecting_credi.html
An interesting blog on DLP vendor selection from Rich Mogull: Understanding and Selecting a Data Loss Prevention (DLP/CMF/CMP) Solution: Part 1 (of a seven part series)
Necessary additions for an enterprise running a DLP solution:

It is my belief that in order to run a successful DLP solution in your enterprise, you need to address remediation of any incidents that you find. The success of your implementation does not rely solely on how effective the solution is at finding incidents. It also relies on how well you can use the toolset to remediate these incidents.

I feel that most of the solutions available in the marketplace lack one significant item: permanent classification of the information found. If you don't classify the sensitive information in a manner visible to the end user, you will end up playing whack-a-mole with less than optimal risk reduction. Please read the previous post about why classification matters.

Furthermore, I believe it is important to tie your incident management into the incident management deployments already in place in the organization. That allows complete tracking and reporting on all your incidents, whether they stem from a DLP incident or a missing patch on a critical server. It also allows you to measure the effectiveness of your service and whether you are meeting your SLAs.

Maintaining strong metrics is an integral part of Service Management, and helps strengthen your ability to prove compliance to regulatory requirements and other requirements. It should also reduce your cost of audits.

Since most of the vendors are still venture capital, VC, funded, they are not able to meet all the requirements of an enterprise. Some of the vendors have now been purchased by larger companies, most notably Vontu, purchased by Symantec, and Tablus, purchased by RSA/EMC. Even when a vendor has been acquired by a larger player, it takes time to integrate their solutions, so for a while I believe it is important to spend some development resources building a comprehensive solution that serves the enterprise's needs.

To be successful, Information Loss Prevention should be part of your overall compliance strategy and overall compliance service. This again should be part of your overall Governance, Risk and Compliance strategy, GRC.
If you are an engineer, or especially interested in pattern matching technologies, I would highly recommend this book:

It is well worth reading

Tuesday, January 01, 2008

Proof of Concept of an ILP solution:

Before embarking on a full-fledged information loss prevention program, you should conduct proof of concept testing of the vendors you want to evaluate.

Before you start your proof of concept, you need to gather information from your lines of business on what they consider to be sensitive, as well as information from your legal counsel in regards to what regulatory compliance areas you should be concerned about.

With this information, you should do a thorough risk analysis. If you don't have all of the necessary knowledge inside your organization, I would recommend hiring an independent consultant to help you in this phase, as well as in the execution of the proof of concept.

You should also have a good understanding of how you want to run an ongoing ILP process in your organization after you have purchased the solution that best fits your needs.

The way I prefer to start off the process is to assemble both a core team and a virtual team so that you have the resources you need to succeed. The core team typically consists of a PM, an architect, maybe a developer if needed, and a system administrator.

The plan for implementation begins with a project plan detailing the steps necessary. It should contain the Request for Information, RFI, process, Request for Proposal, RFP, process, actual testing both in a lab environment and some select production areas, as well as deployment tasks and hand off to service management.

The inception phase of the project is used to create the business requirement document, BRD, and project plan.

The planning phase is also the beginning of the POC phase where testing happens in a controlled environment within a lab.

The Development phase is also used for contract negotiations as well as developing any needed processes and code for the DLP solution to work optimally within the organization.

The testing phase is used to test any custom code and any additional deliverables needed from the DLP vendor, and to see whether the established processes need any polishing.

The Deployment phase then follows which should contain alternate plans for deployment in case you face any obstacles.

The final phase is hand off to Service Management. I prefer upfront planning and collaboration with the Service Management team, with RACI (responsible, accountable, consulted, informed) in place along with service management documentation such as Service Level Agreements, SLA, Operational Level Agreements, OLA, and Independent Contracts, IC. This makes the hand off much easier.
Worrisome findings:
Companies Clueless about unsecured data loss