In the data loss news:
Symantec, which owns Vontu (a data loss prevention solution), lost sensitive information in a laptop theft. The same fate befell HP, according to this PC World article:
http://www.pcworld.com/article/155372/hp_symantec_warn_employees_after_laptop_thefts.html
Sony disclosed children's private information without parents' consent: http://www.iht.com/articles/2008/12/11/technology/sony.php
Monday, December 15, 2008
Friday, December 12, 2008
Looks like DLP is becoming a commodity. Now a firewall vendor, Palo Alto Networks, will offer limited DLP for free as part of its firewall technology: http://www.darkreading.com/security/perimeter/showArticle.jhtml?articleID=212300545
It has been quite some time since I have updated this blog. However, the collaboration between Microsoft and RSA does need mentioning: http://www.microsoft.com/presspass/press/2008/dec08/12-04EMCRSAPR.mspx It marries Data Loss Prevention with Digital Rights Management. Not to be outdone, Liquid Machines and McAfee follow with this announcement: http://www.pcworld.com/businesscenter/article/155185/liquid_machines_mcafee_partner_on_dataloss_prevention.html
Thursday, July 31, 2008
Government risk for loss of sensitive information still high:
According to an article in Computerworld (http://computerworld.com/action/article.do?command=viewArticleBasic&articleId=9110983&intsrc=hm_list), only 30% of the laptops containing sensitive information are encrypted.
Since it is taking the government such a long time to encrypt, I would suggest deploying encryption based on the sensitivity of the documents stored on each laptop: start by searching with DLP, make some assumptions based on employee roles, and mandate encryption accordingly.
Monday, July 28, 2008
What were they thinking? I understand the need to provide the court with adequate evidence (I am not a lawyer), but you would think the prosecutor would at least ask the court to conceal the information when it exposes an entire city's network.
San Francisco DA exposes the city's network passwords: http://computerworld.com/action/article.do?command=viewArticleBasic&articleId=9110758&intsrc=hm_list
Maybe it is time to run documents through a review for sensitivity before actually submitting them to court? In my own experience, I know that documents containing health information and information about children become sealed, and the court has the discretion to seal any information it finds necessary to seal as long as it does not violate the public's right of access to information. Clearly, the public does not need to know San Francisco's network passwords, and the taxpayers clearly do not need to see their hard-earned money being used to reset all these passwords.
Friday, July 18, 2008
I have earlier described pattern matching and "smart" information retrieval: first look at broad groupings of information to create a set, then search the resultant set with finer-grained search terms.
If we use neocortex processing as an example, lower-level information is detected by our sensory organs and processed at a lower level, and only a fraction of this information is actually processed at a higher level. If we were to process information this way, we could do the following: for each search term, keywords being the lowest level, we would assign a probability of the document's relevance, and then search the resultant set with bigrams. That result set would then be searched with trigrams and assigned a probability of relevance. The finest search, using complex patterns, would be run only on the final set.
For each of these searches, a registry (database) would serve as the index of this information, and it should correlate to a taxonomy. The taxonomy would then be used to create metadata assigned to the document. With this in place, searching for hidden patterns becomes possible via data mining techniques.
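To make the cascade concrete, here is a minimal sketch in Python, assuming plain-text documents and using purely illustrative keyword lists, patterns and scoring weights (none of these come from any particular DLP product):

```python
import re

# Illustrative terms and patterns only; a real deployment would tune these.
KEYWORDS = {"confidential", "ssn", "salary"}
BIGRAMS = {("social", "security"), ("account", "number")}
TRIGRAMS = {("social", "security", "number")}
COMPLEX_PATTERNS = [re.compile(r"\b\d{3}-\d{2}-\d{4}\b")]  # e.g. an SSN-like pattern

def tokens(text):
    return re.findall(r"[a-z0-9]+", text.lower())

def ngrams(toks, n):
    return set(zip(*(toks[i:] for i in range(n))))

def cascade(documents):
    """Score documents in progressively more expensive passes; each pass only
    sees the documents that survived the previous one."""
    scored = {}
    # Pass 1: cheap keyword screen assigns an initial relevance probability.
    survivors = []
    for doc_id, text in documents.items():
        toks = tokens(text)
        if KEYWORDS & set(toks):
            scored[doc_id] = 0.3
            survivors.append((doc_id, toks))
    # Pass 2: bigrams raise confidence on the reduced set.
    pass2 = [(d, t) for d, t in survivors if BIGRAMS & ngrams(t, 2)]
    for doc_id, _ in pass2:
        scored[doc_id] += 0.2
    # Pass 3: trigrams narrow the set further.
    pass3 = [(d, t) for d, t in pass2 if TRIGRAMS & ngrams(t, 3)]
    for doc_id, _ in pass3:
        scored[doc_id] += 0.2
    # Final pass: the expensive patterns run only on the smallest set.
    for doc_id, _ in pass3:
        if any(p.search(documents[doc_id]) for p in COMPLEX_PATTERNS):
            scored[doc_id] += 0.3
    return scored  # doc_id -> accumulated relevance score

docs = {"a.txt": "Confidential: social security number 123-45-6789",
        "b.txt": "Lunch menu for Friday"}
print(cascade(docs))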
Monday, July 14, 2008
One concern I have heard raised against using the CLR regex support in SQL Server 2005 is performance. One way to overcome the cost of expensive regex queries is to do the search a bit smarter. You could start with the LIKE operator, or its equivalent in other systems, and then take a sample of the rows that the LIKE operation returned. After obtaining a sample rather than the entire table, you could then perform the regex operation on a separate system, or in a separate thread on the same system. With this approach, very complex patterns can be searched for, and you could build a separate repository from which chunking could be used. This works not only for text, but also for images and other information, as long as the parser can read and understand the format.
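As a rough sketch of this two-stage idea, the example below uses Python with SQLite standing in for the database; in SQL Server the first stage would be the LIKE predicate and the expensive regex pass could run in a separate thread or on a separate system. The table, column and pattern are invented for illustration:

```python
import random
import re
import sqlite3

SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")  # illustrative "expensive" pattern

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE notes (id INTEGER PRIMARY KEY, body TEXT)")
conn.executemany("INSERT INTO notes (body) VALUES (?)",
                 [("SSN on file: 123-45-6789",), ("Call customer re: 123 Main St",)])

# Stage 1: a cheap LIKE filter runs inside the database engine.
candidates = conn.execute(
    "SELECT id, body FROM notes WHERE body LIKE '%-%-%'").fetchall()

# Stage 2: sample the candidate rows and run the costly regex outside the
# engine, e.g. in a separate worker thread or on a separate system.
sample = random.sample(candidates, min(len(candidates), 100))
hits = [(row_id, body) for row_id, body in sample if SSN_PATTERN.search(body)]
print(hits)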
Symantec releases database support for sensitive information: http://biz.yahoo.com/iw/080624/0409691.html
For further thinking about DAM (database access management) and scanning databases for sensitive information, see: http://securosis.com/
How to go about creating a better mousetrap (DLP)
If we go through the questions to ask (where is it, what is it, who has access, and how is it protected), we can see that the answer to each of these four questions can be used to answer the others with high probability.
Think first about the "where is it". If we look at one particular user, that user will use a limited number of resources to create and store information. She might have a local laptop used for daily work, a handful of SharePoint sites she visits, a few file shares, maybe two or three databases typically accessed via a line-of-business application, and finally, and very importantly, instant messaging and email.
If we expand the view of this person and try to place her in a network, we can look at organizational and hierarchical views and observe the frequency of her communication via SharePoint, file shares, email and IM. With that information, we can build a social network of nodes linking her to her co-workers and contacts. If we know that she frequently uses highly sensitive information, we can assign a higher probability that her network also works on highly sensitive information, or has a greater opportunity to receive it. Each node further out will have a reduced opportunity of receiving sensitive information, unless it also works on sensitive information. Of course, a highly connected node will have a higher probability than a less connected node.
With this, we can create network models and a base probability that each node accesses, or has the potential to access, sensitive information. This network diagram would be built by correlating information from email systems, logon events and so on, and then correlating it with known repositories of sensitive information. Of course, this approach will take several iterations, as one would assume that in the beginning few of the repositories are classified and catalogued.
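A minimal sketch of that propagation, assuming a toy graph derived from email and IM logs and purely illustrative decay and degree weights:

```python
from collections import deque

# Hypothetical edges derived from email, IM and SharePoint logs.
GRAPH = {
    "alice": {"bob", "carol", "dave"},
    "bob": {"alice", "erin"},
    "carol": {"alice"},
    "dave": {"alice", "erin"},
    "erin": {"bob", "dave"},
}

def propagate(seed, seed_prob=0.9, decay=0.5):
    """Assign each node a probability of handling sensitive information:
    the probability decays with every hop away from the seed, and
    well-connected nodes get a modest boost (illustrative weights)."""
    # Hop distance from the seed via breadth-first search.
    dist = {seed: 0}
    queue = deque([seed])
    while queue:
        node = queue.popleft()
        for neighbour in GRAPH.get(node, ()):
            if neighbour not in dist:
                dist[neighbour] = dist[node] + 1
                queue.append(neighbour)
    prob = {}
    for node, hops in dist.items():
        base = seed_prob * (decay ** hops)
        boost = 1 + 0.1 * (len(GRAPH.get(node, ())) - 1)
        prob[node] = min(1.0, base * boost)
    return prob

# Alice is known to work with highly sensitive information.
print(propagate("alice"))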
Now, if we start looking at Alice and what information she receives, we could chunk the sensitive information she receives from, let's say, a database, and then look for hits on these chunks in email, IM, or the documents she creates. If there are hits, we can assign a probability that the information is sensitive. If the probability is higher than a preset threshold, we could automatically assign the appropriate classification, annotate the information with the appropriate metadata, and apply the correct protection: for example DRM or other encryption technologies, or simply the appropriate access control list permissions on the document.
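Here is a small sketch of the chunk-and-threshold idea, assuming word-based chunks hashed into a fingerprint index; the chunk size and threshold are invented for illustration:

```python
import hashlib

CHUNK_WORDS = 8    # illustrative chunk size
THRESHOLD = 0.5    # illustrative classification threshold

def fingerprints(text, n=CHUNK_WORDS):
    """Hash every overlapping n-word window of the text."""
    words = text.lower().split()
    return {hashlib.sha256(" ".join(words[i:i + n]).encode()).hexdigest()
            for i in range(max(1, len(words) - n + 1))}

# Chunks derived from a hypothetical sensitive database export.
sensitive_index = fingerprints(
    "customer 4411 jane roe balance 120000 ssn 123-45-6789 account flagged")

def sensitivity_score(outbound_text):
    """Fraction of the outbound message's chunks that hit the index."""
    chunks = fingerprints(outbound_text)
    return len(chunks & sensitive_index) / len(chunks) if chunks else 0.0

msg = ("fyi: customer 4411 jane roe balance 120000 ssn 123-45-6789 "
       "account flagged please review")
score = sensitivity_score(msg)
if score >= THRESHOLD:
    print(f"classify as sensitive (score={score:.2f}); apply DRM, encryption or ACLs")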
Assigning rights to a document or repository then becomes a bit easier, as you can glean information from previous transactions. With entitlement monitoring on repositories and in AD, you can then see whether Alice should still have this access. A further development would be a view into the social network showing whether communication between nodes is increasing or decreasing. If it has decreased, the organizational chart may not have been updated, but the node's work may have changed, and it may no longer need access to this information. In this case, if Alice owns one or more of these repositories, she could be notified and asked whether this node, Bob, still needs access. The system could of course also be used to monitor for abnormalities and anomalies.
We can also make assumptions about the sensitivity of information based on the protections on the system hosting it (this may not hold true for end systems, but will generally hold true for financial systems, HR systems and the like). If the system is encrypted, or has other security measures in place, its probability of containing sensitive information may be higher; however, this is a weak assumption in many cases, especially before a program to safeguard sensitive information has been put in place in the organization.
Thursday, July 10, 2008
It has been a while since I have updated the blog, but here is an article from MSNBC I found stressing the need for inspection of information leaving your network: "Last year, a Virginia investment firm employee decided to trade music or a movie on the file-sharing network LimeWire on a company computer. He inadvertently shared his firm's files, including personal data of clients, one of them Supreme Court Justice Stephen Breyer." It seems that no one, including our Supreme Court justices, is safe from loss of PII.
Thursday, June 19, 2008
The DLP industry is adding encryption capabilities to their offering: http://www.darkreading.com/document.asp?doc_id=156738&WT.svl=news1_1
I have long been a proponent of adding encryption to sensitive information. I believe the best approach is to not only enable encryption, but also apply digital rights management to sensitive documents, as you then have much fuller control of the document lifecycle.
Furthermore, DLP should be used in conjunction with a retention policy in the business, and become part of the overall information management of the organization. A tighter integration into storage systems for retention is the next logical step.
Friday, June 06, 2008
Has credit card information been exposed at CompUSA stores?
I picked up a copy of 2600 magazine today, and lo and behold, on page 23 is an article on how to log on to systems in the stores to retrieve credit card information. The article describes logon procedures using credentials not tied directly to a user, but rather a common name (the store name), with the password the same as the logon ID.
If this is truly the case, it might be a breach of PCI that could potentially impact many of the customers who have shopped at CompUSA. Maybe the bargain-price equipment came with a hidden price: the loss of customer information?
Wednesday, June 04, 2008
Link to the Gartner event: http://agendabuilder.gartner.com/sec14/webpages/SessionDetail.aspx?EventSessionId=914
Tuesday, June 03, 2008
Monday, April 07, 2008
Information Loss at Antioch University:
Failure to patch a Solaris server caused 60,000 user records, including Social Security numbers, to be exposed at Antioch University: http://www.computerworld.com/action/article.do?command=viewArticleBasic&articleId=9075098&intsrc=hm_list
Go skiing, lose your PII: http://www.computerworld.com/action/article.do?command=viewArticleBasic&articleId=9074339&intsrc=hm_list Credit card information was stolen as cards were swiped. Maybe it is time to revisit credit cards with a built-in smart card chip? In this instance, 46,000 cards were exposed at the Okemo Mountain Resort ski area in Vermont.
Tuesday, April 01, 2008
PCI compliant, what does that mean?
Does an organization's compliance with PCI mean that credit card information is safe? According to a news article from InformationWeek (http://www.informationweek.com/security/showArticle.jhtml?articleID=206904986), this might not be the case: Hannaford Bros. lost 4.2 million credit and debit card numbers while stating on its website that it is compliant with the industry PCI standard.
Thursday, March 27, 2008
What strategies to follow after you have implemented a DLP solution
If you have deployed a DLP strategy, you have probably deployed it in your high-risk areas, and once your current DLP deployment has matured somewhat, the next question is how to grow the deployment so that you can secure more areas. As you become more successful, your management, or clients within business groups who are not currently enjoying the protection a DLP solution can give, will ask you to protect their areas as well.
So the question becomes: how do you grow both horizontally and vertically? You can grow horizontally by putting more monitors in place, but you will quickly find yourself in a situation where your current rules and policies do not meet the needs of the additional areas you are now scanning, or where the business model you deployed for the corporate rollout does not meet the needs of the business unit you are now supporting in addition to that rollout.
Do you invest in data in motion along with data at rest? Do you invest in endpoint protection? And how do you manage different departments, ranging from HR to credit card processing to research and development? For each of these, different business problems arise, and different solutions must be put in place. For HR, your main concern is probably the loss or disclosure of personnel data; for your sales organization, customer PII; and for your R&D department, the loss of your future bread and butter.
So the discussion becomes one of headcount, and centralized versus decentralized. Which model is right, and how do you ensure comparable results between them? It is a discussion that will be had in many organizations in the coming years. Many IT security shops will take the position that you should have a centralized approach. This will become increasingly difficult for several reasons. First, only the users and business owners in the respective areas understand what is valuable and needs protection, and what doesn't. Then you have the issue of different IT departments controlling collaboration and messaging, each of which is important for securing your information. I think the right answer is a mix of centralized and decentralized, where information security runs the majority of the tools, but the business owners and IT collaborate on how to identify IP and business secrets, and create and manage policies based on roles.
There is one undeniable fact: the amount of information is growing. According to IDC, it is growing by 60% a year, and combined with new regulatory requirements this means that IT will have to invest more in managing information for disclosure, protection and retention.
Demand for storage capacity has grown by 60% per year and shows no signs of slowing down, according to research company IDC. New disclosure laws, which require more data to be preserved and retrievable, also are making storage management a bigger job. http://www.networkworld.com/news/2008/032108-storage-revolution-jobs.html
Thursday, March 13, 2008
New concerns regarding health care information misuse: in an article from MSNBC (http://www.msnbc.msn.com/id/23392229/), they highlight the impact an impostor can have on your health when your information is abused.
This should bring attention to the need for medical facilities, and anyone keeping medical information, to be prepared to ensure the accuracy and integrity of the information, as well as to protect it from loss.
A shift from paper-based information management to electronic management enables greater efficiencies, including sharing of information, but also enables loss of information on a much greater scale than at any time before in history.
Organizations which have not moved to encrypted storage for sensitive information should do so as soon as possible, and improved authentication and authorization models must be put in place where they are lacking.
Systems must be put in place that ensure that the identity used is that of the person receiving health care, and that only the information needed is available to the personnel who provide care or otherwise handle the information.
According to the FTC, 3% of identity theft victims have had someone else use their medical benefits. With identity theft growing, medical care becoming more expensive and leaving more people out, and the move towards electronic health information management, we are poised for the perfect storm.
Monday, March 10, 2008
Information loss prevention and operational risk management
An operational risk framework that takes input from across the organization, and that also manages exceptions to policy, would be a huge benefit to overall risk management. The need only grows as business users demand Web 2.0 applications and easy-to-use cell phones with dual-use capabilities (read: used as an email client for work purposes, and to view video and listen to music for personal use), and as exceptions are granted to systems regarding patch level and security reviews.
Rolled-up operational risk summaries would be the only way to measure the aggregate operational risk in the organization. Married with information flow views, which outline which objects access which information, this would make risk decisions easier. If you knew who had access to what information, where, when, and on what device, it would be easy to see what the true risk was, and if a request for an exception came in, it would be easy to determine whether the additional risk was substantial or minimal. It is also easy to envision a self-service model, where the user would be allowed to accept some risk, but if the risk moved above a threshold, a manager or security operator would have to grant it. Each business leader could then set an acceptable threshold within the organization, and that policy would flow down to the individual users.
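A sketch of how that self-service threshold could work, with invented per-business-unit thresholds and risk scores:

```python
# Illustrative per-business-unit risk thresholds set by each business leader.
UNIT_THRESHOLDS = {"sales": 3.0, "finance": 1.5, "engineering": 2.5}

def route_exception(unit, current_risk, added_risk):
    """Self-service model: the user may accept small increments of risk;
    anything that pushes aggregate risk above the unit's threshold is
    escalated to a manager or security operator."""
    threshold = UNIT_THRESHOLDS.get(unit, 1.0)  # conservative default
    projected = current_risk + added_risk
    if projected <= threshold:
        return "auto-approved: user accepts the residual risk"
    return f"escalate: projected risk {projected:.1f} exceeds threshold {threshold:.1f}"

print(route_exception("finance", current_risk=1.2, added_risk=0.6))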
It has been a busy few days, and for information loss prevention a few areas are worth highlighting. It is still questioned whether iPhone 2.0 meets the regulatory requirements for data protection: http://computerworld.com/action/article.do?command=viewArticleBasic&articleId=9067319&intsrc=hm_list. Furthermore, here is a link to an article discussing how BlackBerry servers are ripe for hacking, yet another concern for IT security personnel who need to protect sensitive information on all devices serviced by the organization: http://techworld.com/security/news/index.cfm?newsID=11663&pagtype=samechan.
Sunday, March 09, 2008
Information control
Maybe I should rename the blog from information protection, as it is just as much about information control. DLP products, along with DRM products, firewalls and other security controls, are simply solutions put in place to control the flow of information: to prevent information from flowing to systems or personnel who should not have it, and to allow it to flow to systems or personnel who should have access.
DLP tries to identify the type of content and, based on rules, apply various protection mechanisms to the information. In some areas, context is also evaluated. However, one area DLP has not fully gone into is mapping social graphs to ensure that information does not flow downward in the trust hierarchy, from a highly trusted source to a less trusted one, and from there toward an untrusted one.
Clear cases of such downward flow can be stopped by reducing the access granted to broad access groups; however, human nature is such that obstacles to sharing information are usually overcome, especially if it is easier to circumvent the control than to obey it.
Preventing willful loss of information is only possible if technology, processes and people (the majority of them) are aligned. The processes must be such that they enable secure sharing with the proper objects, and people must buy into the idea that the value of protecting certain types of information is higher than the cost caused by reduced sharing.
This can seem contrary to many, as we want to communicate, and we will fail in most of our endeavors if we do not collaborate, at least within the groups we belong to. The problem, of course, is that most people belong to many groups, based on work, ideology, hobbies, neighbourhoods, etc. This means that just looking at the objects that have, had, or can gain access to the information is not enough. You also need to look at who these objects are connected to, and who those are in turn connected to. You need to map out which objects form hubs versus spokes (a power-law distribution), and where these again lead.
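A minimal sketch of separating hubs from spokes by connection count, using hypothetical sender/recipient pairs pulled from mail and IM logs:

```python
from collections import Counter

# Hypothetical sender/recipient pairs pulled from mail and IM logs.
EDGES = [("alice", "bob"), ("alice", "carol"), ("alice", "dave"),
         ("bob", "erin"), ("carol", "alice"), ("dave", "alice"),
         ("alice", "erin"), ("erin", "bob")]

degree = Counter()
for src, dst in EDGES:
    degree[src] += 1
    degree[dst] += 1

# In a power-law network a handful of nodes carry most of the connections;
# flag the top of the distribution as hubs worth extra attention.
HUB_CUTOFF = 4  # illustrative
hubs = [node for node, d in degree.most_common() if d >= HUB_CUTOFF]
print("hubs:", hubs)
print("degrees:", degree.most_common())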
One trick used to track such information is a 1x1 pixel, to see who receives certain information. This is, however, not embedded in most information as it traverses networks, storage areas, endpoints, databases, applications and so on. Only when you can marry a map of all objects and their interrelatedness with where the information actually moves to and from can you truly understand the risks and possibilities the organization has in sharing information within and across boundaries.
Today's DLP solutions create classifications to varying degrees; some store the result set in a database, while others persist the information within the metadata of the document, either directly within the document or in an alternate stream. This metadata can of course be stripped off, and until DRM becomes pervasive, it will not solve this issue either. DRM actually has another problem: if information is presented on a screen, it can be copied, and the controls are stripped off as a consequence. However, DRM will increase the effort necessary to improperly distribute information to objects that should not have access.
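As an illustration of the alternate-stream approach (and of how easily the label can be stripped), here is a minimal sketch that persists a classification label in an NTFS alternate data stream; it is Windows/NTFS-only, and the stream name is an invented convention:

```python
# Windows/NTFS only: alternate data streams are addressed as "path:stream".
# The stream name "classification" is an invented convention for this sketch.
LABEL_STREAM = "classification"

def write_label(path, label):
    with open(f"{path}:{LABEL_STREAM}", "w") as stream:
        stream.write(label)

def read_label(path):
    try:
        with open(f"{path}:{LABEL_STREAM}") as stream:
            return stream.read()
    except FileNotFoundError:
        return None  # unlabeled, or the stream was stripped along the way

# write_label(r"C:\docs\q3-forecast.xlsx", "highly sensitive")
# read_label(r"C:\docs\q3-forecast.xlsx")  -> "highly sensitive"
```

Copying the file to a FAT-formatted drive or sending it as an email attachment drops the stream, which is exactly the stripping problem noted above.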
To support better protection, identity management is another dimension that must be solved. I will not go into much depth in this posting, other than to say that role-based identity management is hard, identity management between organizations is even harder, and both contribute to the problem.
Wednesday, March 05, 2008
According to a survey by Pursuant (http://www.pursuantresearch.com/), 32% of surveyed government IT personnel do not think they will become compliant with requirements such as HSPD-12, FIPS 201, and FISMA. This article (http://www.informationweek.com/news/showArticle.jhtml?articleID=206901345) states that government IT personnel believe national security trumps privacy.
Sunday, March 02, 2008
Confluence of HIPAA security audits and increasing attacks from the Internet creates pressure on health care organizations to protect their patient information: http://www.networkworld.com/news/2008/022708-healthcare-cyberattacks.html
The four important questions for any custodian of sensitive information to ask are:
What information exists on my systems
Where is it located
Who has access
How is it protected
I believe that to find out what information exists, cataloguing and classification are a necessity. To find out where it is, the repositories containing information must be scanned, and the content then classified based on this scan. To ensure that only users who need access have access, entitlement management is key. The information that has been classified should then be protected.
This cannot be achieved with technology alone. People, Process and Technology all go hand in hand to solve this problem.
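As a minimal sketch of the technology part of the "what" and "where" steps, the example below walks a repository and builds a classification catalogue; the paths and patterns are illustrative, and the access and protection questions would be answered by joining this catalogue with ACL and encryption inventories:

```python
import re
from pathlib import Path

PATTERNS = {  # illustrative classification rules, most specific first
    "highly sensitive": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),  # SSN-like
    "sensitive": re.compile(r"confidential", re.IGNORECASE),
}

def classify(text):
    for label, pattern in PATTERNS.items():
        if pattern.search(text):
            return label
    return "public"

def build_catalogue(root):
    """Answers 'what' and 'where': maps every text file under root to a class.
    'Who has access' and 'how it is protected' would be answered by joining
    this catalogue with ACL and encryption inventories."""
    catalogue = {}
    for path in Path(root).rglob("*.txt"):
        try:
            catalogue[str(path)] = classify(path.read_text(errors="ignore"))
        except OSError:
            continue  # unreadable files are skipped rather than misclassified
    return catalogue

# Usage: build_catalogue("/shares/finance")
# -> {"/shares/finance/q3-report.txt": "sensitive", ...}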
Selection Criteria for an ILP solution
Here are the high-level selection criteria I would use when selecting a DLP solution:
· Accuracy (I would be willing to trade speed for accuracy if needed)
· Speed (can all high risk areas be scanned efficiently without a high bandwidth cost)
· Scalability (can all high risk areas be scanned efficiently)
· Remediation capabilities (if a scanning solution is deployed without proper remediation, it leaves the organization with a much higher risk than prior to scanning)
· Upfront cost of application
· Upfront cost of services needed to deploy application
· Cost of ownership
o How many staff are needed to manage incidents and systems
o What is the annual support cost
o What is the total lifetime cost of the application (3 years)
· Risk reduction provided by application
o How is it measured
o Will result set stand up in court (can I prove due diligence when using these tools)
o Can new regulatory requirements or new corporate policy be set up within a standard framework
o Does the reporting meet the following needs
§ Overall risk reduction
§ Specific risk reduction for business unit/regulatory compliance/regional compliance
§ Can ROI be demonstrated
§ Are executive reports easy to understand
§ Can executive reports be rolled into a CIO scorecard
§ Do the reports for the operations team allow for improving the efficiency of the team and the rules (this drives TCO)
Thursday, February 28, 2008
Can you buy PCI compliance? A good article from InformationWeek: http://informationweek.com/security/showArticle.jhtml?articleID=206800868
Of course, you can get solid advice from vendors, but technology is just one part of the equation. First, you should evaluate whether you have the right skill set in your organization; then you should evaluate your current processes and re-engineer them if needed. Only when you have evaluated both people and processes should you start evaluating technology.
Database of stolen passwords found by Finjan: http://www.eweek.com/c/a/Security/Finjan-Finds-Database-of-8700-Stolen-FTP-Credentials/
Passwords should be treated as highly sensitive information: they are often reused by users, and their loss can lead to the loss of all types of sensitive information within information systems. However, passwords can be hard to search for unless you already have a database of passwords. In cases where passwords have to be stored electronically, they should stay encrypted at all times.
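For the storage side, here is a minimal sketch of keeping a stored credential encrypted at rest using the third-party cryptography package's Fernet recipe; key management (a vault or HSM) is the hard part and is out of scope here:

```python
from cryptography.fernet import Fernet  # third-party package: cryptography

# In practice the key would live in a key vault or HSM, never next to the data.
key = Fernet.generate_key()
vault = Fernet(key)

# Encrypt before the credential ever touches disk or a database column.
stored_token = vault.encrypt(b"example-ftp-password")

# Decrypt only at the moment of use, inside the trusted process.
plaintext = vault.decrypt(stored_token)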
Sunday, February 17, 2008
New study from Symantec
IT organizations responding to Symantec's survey now report that work on regulatory compliance is either comparable to other projects or more important than risk mitigation efforts: http://www.infoworld.com/article/08/01/31/Study-reframes-IT-risk-management_1.html
This should be good news for information loss prevention programs, as PCI is definitely a driver for improved controls on how and when information is shared and to whom.
I believe the future trend will be divestment from some security strategies historically undertaken by organizations, such as extranet solutions, firewall deployments and so on, and that the major investments of the future will be in a blend of identity management and entitlement management. If you look at current encryption solutions, they usually stop at the enterprise egress point, as most organizations are not able to convince their partners to agree on a federation model.
It is time to divest from underperforming security initiatives and invest in areas where you can find a better return on your investment. Today, investment in compliance can provide better ROI than merely investing in security controls. If you combine your investments so that you improve uptime, enable the business, and can prove compliance, you find much more value than in security controls alone.
Saturday, February 09, 2008
Databases and DLP
Quote from an article in eWeek (http://www.eweek.com/c/a/Security/DLP-DAM-Share-Common-Data-Security-Objectives/): "Most every security monitoring technology would benefit from DLP content awareness, which is the ability to recognize sensitive content on the fly," said Paul Proctor, an analyst with Gartner.
I completely agree. I believe DLP vendors need to address databases along with repositories, email and endpoints. Furthermore, such solutions should also protect any sensitive information leaving the database.
Amendments to the Federal Rules of Civil Procedure (FRCP) are creating opportunities for content management solutions: http://www.byteandswitch.com/document.asp?doc_id=144806&WT.svl=news1_6
Some solutions sit on email and use keywords and phrases; others enable retrieval from tapes and other media.
At some point in the not-so-distant future, eDiscovery solutions and ILP solutions will probably merge, as they are both solving much the same problem.
Friday, February 08, 2008
Eli Lilly legal documents wrongfully sent to The New York Times in a billion-dollar lawsuit
Eli Lilly could probably have been better protected if it had had a federated trust in place with its law firm, Pepper Hamilton, and the ability to protect its confidential communication with outside counsel. This is truly a case where Digital Rights Management could really protect information.
http://news.cnet.co.uk/software/0,39029694,49295453,00.htm
This case of information leakage is enlightening in several respects.
One, Eli Lilly could potentially have lost ground in a serious legal matter.
Two, this is an understandable mistake by the outside counsel, though one could argue that more care should have been taken. Awareness is key, and an awareness program can reduce the risk of such incidents.
Three, when conducting business with partners, just having legal agreements in place on how information is to be handled is not good enough; compliance with those contractual obligations should be audited. This email could potentially have been stopped at the email server if an information loss prevention solution had been in place.
Thursday, February 07, 2008
An interesting book from the CEO of Kaiser Permanente, George Halvorson: http://www.healthcarereformnow.org/
In the second hard truth, Mr. Halvorson discusses care linkage deficiencies, describing how medical doctors create paper-based medical records for their patients.
It is commendable that a person like Mr. Halvorson, who has so much influence, is actively driving the digitization of health care records. If these records are made easily available to care providers as well as care recipients, great efficiencies can be created.
Digitizing medical records does come with security concerns, which should be addressed. Only authorized personnel should have access. Anecdotal evidence I have seen and heard points to the need to improve the culture in the health care industry with regard to safeguarding patient information. An awareness campaign is needed among caregivers to educate them on how best to secure such information. Furthermore, tools need to be made available to health care professionals that allow them to continue to provide care without being bogged down by security measures that hinder their work.
These tools should address the who, what, when and where of access to highly sensitive information such as patient records, while enabling health care professionals to spend more time caring for patients. They must enable secure collaboration so that each professional who needs access to information readily has it, yet is restricted to only that information and not all information about all patients.
An article in SC Magazine by Joel Christner of Reconnex discussing learning to address High Business Impact (HBI) information in the enterprise: http://www.scmagazineus.com/Learning-applications-Revolutionizing-data-loss-prevention/article/105073/
Entitlement management
Entitlement management is important not only for your security posture but also for your compliance efforts for SOX and PCI.
The problem with entitlement management is, of course, knowing who has access to what. You probably know who, unless you have too broad an access policy on your information. How would you know if access is too broad? You need to scan for large user groups and global groups; these groups should not be allowed on sensitive and highly sensitive information. Do you know all the instances of sensitive and highly sensitive information within your organization? You can of course use DLP to scan for these information types. The problem is that DLP solutions do not map back to who had access when.
With these questions/problems, what are you to do?
One, you should scan all your information, and identify where you have highly sensitive and sensitive information.
When this has been identified, you need to keep a persistent classification of the information, so a classification solution must be deployed and implemented.
When you have applied the classification, you need to ensure that the large groups and or global groups do not have access to this information.
For information where you need to validate that users who should have access have access, and users who should not have access does not have access, custodians of the sensitive information are required to validate the users who has access. By forcing the validation at the lowest level possible, you can effectively address the biggest problem in organizations today, which is entitlement creep. Entitlement creep happens when employees move from one job to another, or the job changes over time, and access needs change with them. Most often, when this happens, the employee gets access to the new areas needed for their job, but the old entitlements are not removed. By clearly assigning custodianship at as low of a level as possible, this can be taken care of if the custodians are reminded periodically to validate who should have access, and that they are aware that they are also audited agaist their accountability
In other words, the full solution is to map your scanning of sensitive information to your identity management systems, as well as a classification and remediation solution
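As a minimal sketch of the broad-group check just described, assume you have already exported an access report (for example from an icacls dump or your identity management system) that maps each classified share to the groups granted access. The share names, group names, and report format are all assumptions for illustration:

# Flag sensitive locations where broad groups have been granted access.
# The access report format and the list of "broad" groups are assumptions.
BROAD_GROUPS = {"Everyone", "Authenticated Users", "Domain Users"}

# share -> (classification, set of groups with access), exported beforehand
access_report = {
    r"\\fileserver\finance":   ("highly sensitive", {"Finance-Team", "Domain Users"}),
    r"\\fileserver\marketing": ("internal",          {"Marketing-Team"}),
    r"\\fileserver\hr":        ("sensitive",          {"HR-Team", "Everyone"}),
}

def find_broad_access(report):
    """Return sensitive shares that are open to broad groups."""
    findings = []
    for share, (classification, groups) in report.items():
        if classification in {"sensitive", "highly sensitive"}:
            offending = groups & BROAD_GROUPS
            if offending:
                findings.append((share, classification, sorted(offending)))
    return findings

for share, classification, offending in find_broad_access(access_report):
    print(f"{share} ({classification}) exposed to broad groups: {', '.join(offending)}")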
Wednesday, February 06, 2008
According to the Ponemon Institute, 69% of employees have access to too much information. This validates the need for tighter entitlement management: http://www.informationweek.com/news/showArticle.jhtml?articleID=206104613&subSection=News
Monday, February 04, 2008
An article by Rich Mogull about selecting the right DLP solution for your needs: http://www.networkworld.com/columnists/2008/020408insider.html
Among his most important criteria are identifying your key stakeholders within the organization, then agreeing on what problems to solve and how, and then choosing the right solution. I very much agree with this approach.
Vontu gets coverage on CNNMoney.com for winning a contract protecting health information at The Mount Sinai Medical Center. It seems that Mount Sinai chose a desktop/laptop solution to protect their information: http://money.cnn.com/news/newsfeeds/articles/marketwire/0356611.htm
Agent versus no agent what is the right answer?
When looking at DLP as well as other security products such as patch management and anti-virus, the question comes to mind: is an agent on the endpoint the answer?
The answer is neither a simple yes nor no. Agents have two main problems: reach and failure rate. For reach, you have to push an agent out via a systems management solution, GPO, script, or distribution via a portal. Achieving 100% reach in a large organization usually becomes either too expensive or outright impossible if your network is segmented into areas under different management, such as lab versus production.
The right answer is a mix: agents on high-risk desktops, laptops, and mobile devices; applications or appliances on email servers and data center servers such as repositories, line-of-business applications, and databases; and applications or appliances on network ingress/egress points. It is also important to note that if you need to transport sensitive information between organizations, you need to ensure contractual obligations are put in place and met between the organizations.
There are several good ways to deploy agents. One is Group Policy Software Installation (GPSI); another is System Center, Tivoli, or other agent management systems. You could also use a portal, so that if a user went to a line-of-business portal to retrieve sensitive information, they would have to download and install an agent before being allowed access. The benefit of an agent management system is, of course, the breadth of information it provides on installation metrics, health metrics, reach, and so on.
In my opinion, the perfect agent would be installed seamlessly via an agent management system and would control what the user can do with the information without impeding productivity. For example, blocking USB might not be the answer if the user has a genuine need to transport information on a USB key. A better approach is to ask: if information goes on a USB device, is it sensitive and is it protected? If the information is sensitive and not protected, the solution should interact with the user and make it easy to do the right thing. The same goes for email and any other channel where the user may divulge sensitive information. For file transfers, a transfer would be approved or disapproved based on the content's sensitivity and destination, and protected appropriately.
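Here is a minimal sketch of the kind of endpoint decision described above; the sensitivity labels, channels, and actions are assumptions rather than any vendor's actual policy engine:

from enum import Enum

class Action(Enum):
    ALLOW = "allow"
    PROMPT_TO_PROTECT = "prompt user to encrypt/protect"
    BLOCK = "block and explain why"

def decide(channel: str, sensitivity: str, protected: bool) -> Action:
    """Decide what an endpoint agent should do when content leaves the machine.

    channel: "usb", "email", "file-transfer", ...
    sensitivity: "public", "sensitive", "highly sensitive"
    protected: True if the content is already encrypted/rights-protected.
    """
    if sensitivity == "public":
        return Action.ALLOW
    if protected:
        return Action.ALLOW                      # sensitive but already protected
    if channel in ("usb", "email"):
        return Action.PROMPT_TO_PROTECT          # help the user do the right thing
    if channel == "file-transfer":
        return Action.BLOCK                      # unprotected sensitive bulk transfer
    return Action.PROMPT_TO_PROTECT

# Example: a sensitive document copied to USB without protection
print(decide("usb", "sensitive", protected=False))  # Action.PROMPT_TO_PROTECT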
Friday, February 01, 2008
Omnibank customer information stolen, leading to the creation of counterfeit ATM cards which criminals then used to obtain cash: http://breachblog.com/2008/01/28/omni.aspx
According to news stories, the amounts lost were small, but there was clearly an inconvenience to the customers of the bank.
Unfortunately, these types of attacks will continue to occur.
Britons working for the UK government are now banned from removing laptops from their offices unless they are encrypted: http://www.personneltoday.com/articles/2008/01/22/44056/laptops-containing-protected-data-banned-from-leaving-public-sector-offices.html
The real question is: is this an enforceable policy? It might be, but then it begs the question: can the current productivity of civil servants be sustained? The answer is no, unless an effort is put in place to enable civil servants to encrypt their laptop content easily.
This issue highlights two important areas for compliance. First, do you have effective policies addressing your areas of risk? By effective, I mean: are they clear and understandable, and are the users governed by the policies aware of them? The second important area is, of course, compliance to policy. How do you effectively enforce, monitor, and audit for compliance with your policy?
The hard part is, of course, to balance policy and compliance with business needs. If your policy and compliance efforts impede your business, then you face loss of productivity and probably profits. So a balance between business needs and your security and compliance needs must be found.
The best way to achieve this is to evaluate your current risk profile and decide whether the current risk is something you are willing to accept. If you are not, then you must put in place mitigations that move the risk level to where you are comfortable.
Gartner predicts that users will move to pocketable devices by 2012: http://gartner.com/it/page.jsp?id=593207
This means that we need to start thinking about content management on mobile devices. There are already systems available that enable full encryption on these devices, but they are not broadly deployed yet.
A different way of controlling sensitive information on these types of devices would be to give information persistent protection at the aggregation points, such as email servers, and for line-of-business applications at the web interface. DRM is a good solution for this space: you can control who has access, when, and to a certain extent where, compared to just letting the information reach the devices unprotected.
Here is a link to what Symantec has to say about mobile phone security: http://www.symantec.com/about/news/release/article.jsp?prid=20060404_01
Thursday, January 31, 2008
Bold and commendable move by IBM
IBM is deploying PGP to more than 350,000 employees, enabling all of them to keep their sensitive information confidential even if they should lose their laptop: http://techworld.com/security/news/index.cfm?newsID=11272&pagtype=samechan
Wednesday, January 30, 2008
Blogging and DLP
Should you worry about losing IP or sensitive information through your employees' use of blogs? According to intellectual property attorney Stephen M. Nipper, employees are more likely to leak closely held data through casual e-mails than through carefully thought-out blog entries. The quote is from the book Naked Conversations.
I assume it is the same Mr. Nipper who runs these blogs, and there are some really interesting reads on IP there: http://inventblog.com/ , http://www.rethinkip.com/ and http://www.shapeblog.com/
On the other hand, it would also be prudent to search your public presence for sensitive information if you have the capability.
Why companies and organizations need to think about SharePoint and protecting information within SharePoint.
According to Forrester, SharePoint leads the way to Enterprise 2.0. With its capabilities for storing documents and providing a collaborative environment with wikis and other social networking features come issues around content sensitivity, entitlement management, protection, discovery, retention, and audits.
Independent software vendors are also jumping on the bandwagon, enabling increased use of SharePoint in enterprises and other organizations and further driving the need for securing content on SharePoint: http://www.crn.com/software/205801189
Here is a link to several good thoughts on various laws and their implications from a DLP perspective: http://hack-igations.blogspot.com/search/label/credit%20card%20law (credit card laws), http://hack-igations.blogspot.com/search/label/data%20breach%20notification (breach notification)
Tuesday, January 29, 2008
In the data loss news:
A 58-year-old Greek mathematics professor steals data, causing losses of $361 million:
http://news.smh.com.au/greek-authorities-accuse-man-of-selling-stolen-dassault-software/20080126-1o9j.html
ChoicePoint settles to the tune of $10 million:
http://www.consumeraffairs.com/news04/2008/01/choicepoint_settle.html
Seven Citibank employees in Singapore are arrested after taking customer data with them to their new employer, UBS:
http://www.ft.com/cms/s/0/83d71216-caab-11dc-a960-000077b07658,dwp_uuid=e8477cc4-c820-11db-b0dc-000b5df10621.html
Laptop theft leaves unencrypted healthcare information and customer data unaccounted for at an HMO and a retailer.
Friday, January 25, 2008
Context and uniqueness: a method for finding the proverbial needle in the haystack. How do you find IP amongst large quantities of content that closely resembles IP but is not IP? This is the false positive problem when searching content for IP; in other words, avoiding a million-plus false positives across a large body of content containing both.
The problem around context can easily be seen when you think about this example: "He wanted more chips". Without knowing his location or situation, it is impossible to determine if he wanted potato chips, wood chips, or poker chips. Only when you know in what situation this want exists would you know which type of chips was wanted. If you look at this example: "He was outside cooking meat in his smoker. He wanted more chips." Now it becomes apparent what type of chips he wanted.
Context copied from Merriam-Webster OnLine: http://www.m-w.com/dictionary/context
Etymology:
Middle English, weaving together of words, from Latin contextus connection of words, coherence, from contexere to weave together, from com- + texere to weave — more at technical
Date:
circa 1568
1 : the parts of a discourse that surround a word or passage and can throw light on its meaning
2 : the interrelated conditions in which something exists or occurs : environment setting
When you are looking at a piece of source code, such as this (randomly chosen) example on MSDN: http://msdn2.microsoft.com/en-us/library/ak5wyby1(VS.80).aspx
// attr_implements.idl
import "docobj.idl";
[ version(1.0), uuid(0ed71801-a1b6-3178-af3b-9431fc00185e) ]
library odod
{
importlib("stdole2.tlb");
importlib("olepro32.dll");
[
object,
uuid(1AECC9BB-2104-3723-98B8-7CC54722C7DD)
]
interface IBar1 {
[id(1)] HRESULT bar1();
};
[
dual,
uuid(1AECCABB-2104-3723-98B8-7CC54722C7DD)
]
interface IBar2 {
[id(1)] HRESULT bar2();
};
[
uuid(1AECC9CC-2104-3723-98B8-7CC54722C7DD)
]
dispinterface ISna {
properties:
methods:
[id(1)] HRESULT sna();
};
[
uuid(159A9BBB-E5F1-33F6-BEF5-6CFAD7A5933F),
version(1.0)
]
coclass CBar {
interface IBar1;
interface IBar2;
dispinterface ISna;
};
}
What could you use to distinguish this piece of code, which is publicly available, from internal code? Can you read context out of this text? What is unique? Well, if you have a bunch of these kinds of files, they all start to look very much the same: C++ is a very structured language, and most developers reuse code, which means that a segment can be found in many files. So the real question becomes: how can you find something unique in the internally developed code you want to protect versus something that is publicly available? The same is of course true if you want to scan for open source code, or copyrighted code, inside your internal code.
One thing this piece of code is almost devoid of is comments. The problem is that real source code also contains boilerplate comments, which are useless for identifying source code IP. So you have to do some intelligent searching to find something that truly identifies what is unique.
The method I propose is as follows (a minimal sketch of the scoring step follows the list):
1. Search through each file.
2. Extract the comments.
3. Query a commercially available search engine for the number of hits for each comment.
If the number of hits is zero, the probability is high that the text combination is unique: assign a value of 9.
If the number of hits is low, the probability is medium to high, depending on the number of hits and their closeness to the initial search term: assign a value of 5.
If the number of hits is high, the probability is low: assign a value of 1.
If there is more than one comment, add the values.
4. Create a search term that includes the high-value comments.
5. Test against a corpus of known publicly available source code.
6. Count false positives.
If the number of false positives is greater than X, discard the search term.
7. Test against a corpus of known internal code.
8. Count false negatives.
If the number of false negatives is greater than X, go back to step 4 and add to the search term.
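Here is a minimal sketch of steps 2 and 3 under stated assumptions: comments are extracted with a simple regular expression (which will miss edge cases), the search-engine lookup is passed in as a callable you would replace with a real web-search API, and the "low hits" threshold of 50 is an arbitrary value to tune against your own corpus. The scoring mirrors the 9/5/1 values above:

import re

# Crude extraction of // and /* */ comments from C/C++/IDL-style source.
# This is an approximation; "//" inside string literals would be caught too.
COMMENT_RE = re.compile(r"//[^\n]*|/\*.*?\*/", re.DOTALL)

def extract_comments(source: str) -> list:
    comments = []
    for raw in COMMENT_RE.findall(source):
        text = raw.strip("/*").strip()
        if text:
            comments.append(text)
    return comments

def uniqueness_score(comment: str, hit_count) -> int:
    """Score one comment using the 9/5/1 values from the method above.

    hit_count is any callable returning the number of public search hits
    for a phrase (a real web-search API in practice).
    """
    hits = hit_count(comment)
    if hits == 0:
        return 9
    if hits < 50:
        return 5
    return 1

def score_file(source: str, hit_count) -> int:
    """Sum per-comment scores ("if there is more than one comment, add the values")."""
    return sum(uniqueness_score(c, hit_count) for c in extract_comments(source))

# Example with a fake lookup: pretend every comment is found nowhere publicly.
print(score_file("// attr_implements.idl\nimport \"docobj.idl\";", lambda phrase: 0))  # 9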
This method could also be used to identify other types of IP or business intelligence. I believe it would also be very helpful in identifying pre-release marketing material as well as financial or legal documents. For marketing material, you would have to look for unique phrases, words, or combinations thereof. For legal documents, you would have to look for what makes the document unique; legal documents can also contain large amounts of boilerplate text, so this method would work well there too.
Thursday, January 24, 2008
DLP and eDiscovery
DLP lends itself well to eDiscovery. The biggest obstacle today is the lack of machine learning that could ease the process of discovering documents across vast repositories. With email, the situation is the same, but the discovery process is more mature, as emails have been sought by litigants for a while now.
DLP would also facilitate a broader eDiscovery search than is possible without it. If DLP is deployed for network traffic (Data in Motion, DIM), across repositories (Data at Rest, DAR), and on endpoints, it would be possible to search for any pertinent document across the entire organization.
For DIM, most DLP vendors keep only a record of the incidents caught, so an archive of all traffic is not maintained. The most notable exception is the DIM product from Reconnex: they capture and store all traffic, which would enable eDiscovery on all captured traffic, not just the traffic causing incidents.
For DAR, most DLP vendors scan the information in a central system, causing the information to move over your network, which can drive utilization higher than desired. With DAR, a new scan is needed whenever the rules change, as most DAR vendors do not use archived and indexed material. Tablus, now RSA/EMC, uses a notably different process, as most of the information is scanned locally instead of being transported across the network.
For endpoints, it is clearly an advantage to do the scanning locally if you have a large number of desktops and laptops deployed, unless you can target which systems to scan. However, each time you do a targeted scan, you increase the number of man-hours needed for the process.
With DLP and eDiscovery, it is important to have good retention policies and to actually enforce the remediation policies. As anyone is aware, merely relying on policies without enforcement does not effectively address the risks posed by storing unneeded information. It is my belief that DLP systems and general search systems will be increasingly used for discovery purposes, and the dredging capabilities enabled by such systems can turn out to be very costly for organizations that do not enforce their retention policies.
Good risk management should drive the adoption of DLP in the organization, and it should pinpoint in what areas the highest return on the investment can be found.
Tuesday, January 22, 2008
EU says IP addresses are personal information
Peter Scharr, Germany's data protection commissioner, stated at a European Parliament hearing that IP addresses must be considered personally identifiable information. If this becomes part of EU privacy law, it will have a wide impact. The article can be found here: http://www.cbsnews.com/stories/2008/01/21/tech/main3734904.shtml?source=RSSattr=SciTech_3734904
What do Roman aqueducts have in common with security and compliance?
Here is a link to a very good article on the Roman aqueducts and the impact that lacking security had on the demise of Roman society. What does this have to do with DLP, you might ask? Well, what was initially a convenience for the Roman populace became critical to their society, and a clear parallel can be made to today's convenience of collaboration tools such as SharePoint and others. These types of tools enable users to become more productive, and as they gain a foothold in the organization, other tools and business processes are built on top, until you have an infrastructure that, if attacked, can severely disrupt your business.
Proper prior planning when deploying these systems, with this in mind, is important. As business processes change in the environment, they should be mapped, and risks and compliance requirements should be re-evaluated as the environment changes.
PCI and DLP
How can you use DLP to protect the flow of credit card data outside of your main credit card processing?
Data in Motion should be used to validate that credit card information does not traverse your network in the clear, and should provide audit reports to verify this. DIM should also integrate with the database as well as the web applications, such that all flows are understood and mapped. The problem is that over time, tables may contain more information than expected, and these tables may be used by applications in ways not fully understood. Data flow mapping is paramount, and in order to do it, content inspection and classification are needed. You should be able to understand the entire lifecycle of credit card information irrespective of which system it resides on and whether it stays within your own organization or traverses to a business partner. A minimal detection sketch follows.
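As a minimal sketch of the kind of content inspection a DIM sensor performs, here is a candidate-number check combining a digit-pattern scan with the Luhn checksum. Real products use far more context (BIN ranges, proximity keywords, volume thresholds) to keep false positives down; this is only an illustration:

import re

# Candidate PANs: 13-16 digits, optionally separated by spaces or dashes.
CANDIDATE_RE = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def luhn_valid(number: str) -> bool:
    """Return True if the digit string passes the Luhn checksum."""
    digits = [int(d) for d in number][::-1]
    total = 0
    for i, d in enumerate(digits):
        if i % 2 == 1:          # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def find_card_numbers(text: str) -> list:
    """Return substrings that look like card numbers and pass the Luhn check."""
    hits = []
    for match in CANDIDATE_RE.finditer(text):
        digits = re.sub(r"[ -]", "", match.group())
        if 13 <= len(digits) <= 16 and luhn_valid(digits):
            hits.append(match.group())
    return hits

# Example: the classic Visa test number 4111 1111 1111 1111 passes the check.
print(find_card_numbers("order ref 12345, card 4111 1111 1111 1111, exp 01/09"))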
Data at Rest should be used to evaluate repositories to ensure that the flow of credit card information to and from them is understood, and that retention policies are followed in destroying credit card data in repositories where PCI data is no longer needed.
Agents should be used on applications and end-user systems to ensure that only entitled users have access to the information and that the information is always encrypted. Agents should also enforce the data retention policies, so that when the information is no longer needed it is removed. However, you cannot just remove information without notifying users, so the solution must take into account how to notify users about expiring material and the different actions they can take. One way I envision the solution is that users are notified that they must either delete the data or ask for an exception to keep the information. A minimal sketch of such an expiry workflow follows.
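A minimal sketch of that notification step, under the assumption that each file already carries a classification and a retention deadline in some metadata store; the record format, paths, and messages are illustrative assumptions:

from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class TrackedFile:
    path: str
    owner: str
    classification: str      # e.g. "pci", "internal"
    retain_until: date
    exception_granted: bool = False

def expiring_actions(files, today=None, warn_days=30):
    """Yield (owner, path, message) for files at or near their retention deadline."""
    today = today or date.today()
    for f in files:
        if f.exception_granted:
            continue
        if f.retain_until <= today:
            yield f.owner, f.path, "Retention expired: delete this file or request an exception."
        elif f.retain_until <= today + timedelta(days=warn_days):
            yield f.owner, f.path, f"Retention expires {f.retain_until}: plan to delete or request an exception."

# Example inventory with a single tracked file approaching its deadline.
inventory = [TrackedFile(r"\\share\sales\card-report.xls", "alice", "pci", date(2008, 2, 1))]
for owner, path, msg in expiring_actions(inventory, today=date(2008, 1, 25)):
    print(owner, path, msg)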
This system should be distributed, and should provide compliance metrics and risk management metrics for the organization. It should also be enabled to tie into other security and compliance systems, so the organization has a full view of security issues, compliance and risk being managed.
Monday, January 21, 2008
An interesting report on the state of PCI by Forrester Research, conducted for RSA, points out that encryption and access control are the top challenges for organizations seeking to become PCI compliant. Furthermore, organizations keep too much data; however, the PCI controls are being used to drive compliance and improve security in these same organizations.
The report found that companies are concerned about data classification and access control policies. Data classification can of course be achieved with DLP, but in most PCI systems I know of, the data traverses the network, resides in databases and file shares, and is processed and made available through web applications.
This means that the future of DLP will have to answer the challenge of identifying PCI data throughout its use, be able to identify business processes that do not adhere to the standard, and provide audit capabilities that drive down the cost of maintaining PCI compliance.
Why classification and protection matter for internal sensitive information is clearly shown in this article: http://www.portfolio.com/news-markets/national-news/portfolio/2008/01/14/Media-Defenders-Profile?print=true. Internal, unpublished information belonging to MediaDefender was stolen by a hacker who had penetrated their network.
If you are hacked and you have sensitive information unprotected and available to any user with a logon ID, you are susceptible to losing this information to the hacker who has penetrated your firewall. It goes to show that the only way to truly protect your information is to deploy a defense-in-depth solution where the sensitivity of your documents is taken into account alongside your traditional perimeter defenses.
If the documents had been encrypted, it is unlikely that they would have been of any use to this hacker, and the impact to the organization would have been smaller.
What does such an attack, and subsequent loss of sensitive information mean to this organization's reputation and future revenue stream?
Then there is of course the loss of a laptop in Britain by the Royal Navy, containing unencrypted information on 600,000 individuals. Quote: "The HMRC data leak happened two months prior to this theft, but apparently the personal data on the Royal Navy laptop was not encrypted despite the easy availability of such software." The article can be found here: http://www.vnunet.com/vnunet/news/2207687/royal-navy-laptop-stolen. The junior officer may face court martial, but I believe the real failure is organizational, not personal. What type of awareness and training in protecting information is given? Is the right set of tools available to personnel to help them identify the sensitive information they are in charge of? Do personnel have tools available that allow them to easily protect the information?
For organizations, I believe it is paramount that they start looking at the ways they can support their users through easy to use classification and protection schemes. What happens if a laptop is lost or stolen? What happens if a laptop is breached through the internet? Being able to answer these questions with a statement such as, yes we had an unfortunate loss, but the information was protected with encryption, would be quite different, and probably wouldn’t make headlines impacting your organization's reputation.
I know this is old news by now, but it illustrates the need for protecting your databases.
Hacker breaches UGA system stealing social security numbers. If your database tables containing social security numbers are encrypted, hackers will not be able to use the data even if they gain access to it, unless your entire database is compromised. Do you know which tables contain this information, do you know if it is protected, and are these tables exposed to the internet?
Friday, January 18, 2008
Health care organizations are now facing reviews of HIPAA compliance from The Centers for Medicare and Medicaid Services, increasing the need for protection of patient records in these organizations. The first to be reviewed are large hospitals and hospitals that have received complaints. CMS is not going to publicly announce which organizations are being reviewed, but will release a report after each review. It will be interesting to see CMS's findings. CMS also holds the power to fine these organizations or levy other punishments. We will see what the outcome will be. http://www.govhealthit.com/online/news/350176-1.html?type=pf
A tape containing financial information of 650,000 people is missing. The tape belongs to GE Money, and was stored at Iron Mountain. It contains information from JC Penny shoppers and maybe as many as one hundred other retailers: http://www.msnbc.msn.com/id/22718442/
Corporate espionage and other threats to companies
SANS has determined that B2B corporate espionage is the next big worry for companies, according to their latest research: http://www.infoworld.com/article/08/01/15/Cyber-espionage-moves-into-B2B_1.html. With that in mind, I see a further need for proper protection of information in organizations, and most organizations will have to scramble to meet these new challenges. I believe a combination of DLP, classification, protection, and audit will be increasingly needed. The game has moved from being regulatory compliant to being able to protect your valuables. The last thing you want is to be notified by the federal government that your information may have been compromised. It is time to start thinking about how to safeguard your information. It is also interesting to see that SANS has identified the insider threat as growing: http://www.sans.org/2008menaces/?utm_source=web-sans&utm_medium=text-ad&utm_content=text-link_2008menaces_homepage&utm_campaign=Top_10__Cyber_Security_Menaces_-_2008&ref=22218. Ensuring that insiders do not have access to more information than they need is paramount. Combine this threat with new search technologies such as Google for the enterprise, FAST, Microsoft Search Server, and others, and a malicious insider has an easy way to find and steal sensitive information.
The three tenets of safeguarding information are:
Identify:
Where the information is, what it is, who has access, whether it is still needed, and whether it is protected.
Protect: Ensure proper classification and proper access rights, and encrypt sensitive information.
If information is found that is not properly protected:
Remediate:
Classify information appropriately according to policy and/or regulatory requirements.
Remove excess access rights and force periodic re-validation of access rights.
Apply encryption to sensitive information found unencrypted.
Remove unneeded information.
Audit (a minimal sketch of these checks follows the list):
Prove that the information is in its appropriate location.
Prove that only required personnel have access.
Prove that information is properly classified.
Prove that sensitive information is encrypted.
Prove that retention policies are adhered to.
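A minimal sketch of the audit step, assuming each piece of tracked content already has metadata recorded during the identify and protect steps; the field names, group names, and rules are assumptions for illustration:

from dataclasses import dataclass
from datetime import date

@dataclass
class ContentRecord:
    path: str
    classification: str        # "public", "sensitive", "highly sensitive"
    approved_location: bool    # stored where policy says it may live
    access_groups: set
    encrypted: bool
    retain_until: date

BROAD_GROUPS = {"Everyone", "Domain Users", "Authenticated Users"}

def audit(record: ContentRecord, today: date):
    """Return a list of audit findings for one item (empty list means compliant)."""
    findings = []
    if not record.approved_location:
        findings.append("stored outside its approved location")
    if not record.classification:
        findings.append("not classified")
    if record.classification != "public" and record.access_groups & BROAD_GROUPS:
        findings.append("broad groups have access")
    if record.classification in {"sensitive", "highly sensitive"} and not record.encrypted:
        findings.append("sensitive but not encrypted")
    if record.retain_until < today:
        findings.append("past its retention date")
    return findings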
What does this mean for the different verticals?
Financial sector
Insurance
Health Care
Pharmaceuticals
Manufacturing
Technology
For the financial sector, the most important aspect will still be regulatory compliance, but the ability to protect business secrets will become more important. An example from Wall Street is the use of DRM to protect information when investment opportunities are sent out to prospective investors. Of course, irate customers who have lost money or been inconvenienced will also be a driving factor.
For the insurance industry, I believe the most important aspects will still be the public's perception of how well their information is protected, as well as regulatory compliance. However, the loss of business data such as marketing plans and customer databases to competitors should also rank high.
For health care, I believe the most important task is to protect patients' health information. See the previous blog entry on California's new law. HIPAA has been around for a while, but with this new law, regulatory compliance becomes even more important.
For pharmaceutical companies, I believe B2B espionage will become an even more troublesome area to address, especially considering the heightened competition from smaller companies abroad.
For manufacturing, manufacturing processes and blueprints are the crown jewels that need protection. Contract negotiations are also an area of concern: if information is lost that can improve the other party's position, profit loss and lost competitive edge ensue.
For technology companies, customer information, regulatory requirements, intellectual property protection, and business intelligence are areas of concern, along with contract negotiations.
Thursday, January 10, 2008
Sears sued over privacy breach. They posted customer purchase information on their website Managemyhome.com: http://www.infoworld.com/article/08/01/08/Sears-sued-over-privacy-breach_1.html
Tuesday, January 08, 2008
DLP solutions for data bases
The three major categories are, of course, Data in Motion (DIM), Data at Rest (DAR), and data in use. Data in Motion was the first area the DLP vendors addressed, followed by Data at Rest, data in use, or both. What the major DLP vendors have not addressed is database solutions. There are companies such as Exeros who provide partial solutions for discovering sensitive information in databases, but there is no comprehensive solution in place today that I know of.
Looking at SQL Server 2005, it enables encryption, but this comes at a price (increased storage). It lets the database administrator (DBA) choose between different symmetric and asymmetric encryption algorithms. This allows for encryption of the sensitive information within the database, assuming you know where the sensitive information resides, as the cost of encrypting the entire database is too high.
The trouble is of course when you need to expose this information to end users or other applications. Setting up connections ensuring that the information is encrypted the entire time is complex, and would either need key management and authorization management with federation if you communicate outside of your organization, which is why data base encryption is not widely deployed.
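As an illustration of the key-management problem, here is a minimal Python sketch of application-side column encryption, assuming the third-party cryptography package. SQLite stands in for the real database; SQL Server 2005's own cell-level encryption would instead be configured in T-SQL. The point is that every consumer of the column now needs access to the key.

```python
# A minimal sketch of application-side column encryption, assuming the third-party
# "cryptography" package is installed. SQLite stands in for the real database.
import sqlite3
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # in practice the key lives in a key-management system
cipher = Fernet(key)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, ssn BLOB)")

# Encrypt only the sensitive column; encrypting everything costs too much storage and CPU.
conn.execute("INSERT INTO customers (name, ssn) VALUES (?, ?)",
             ("Jane Doe", cipher.encrypt(b"078-05-1120")))

# Every consumer of the data now needs access to the key, which is the hard part.
row = conn.execute("SELECT ssn FROM customers WHERE name = ?", ("Jane Doe",)).fetchone()
print(cipher.decrypt(row[0]).decode())
```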
So how can this be solved?
DLP-style search of databases can and should be built.
The findings should be held in a metadata database for classification purposes (see the sketch after this list).
Where regulatory requirements or internal policies require encryption, database encryption should be turned on automatically. To do this, a common encryption scheme must be put in place across the organization's database systems, including federation where needed.
If applications transport the information, SSL or IPsec (with encryption) should be used.
If applications expose information to the end user, a DRM solution should be used that enables the IIS server to serve the documents to the end user with the appropriate permissions.
These five items sound easy enough to implement, but the real challenge is not just building the core technology; it is also building a workflow that lets the business keep operating while you remediate incidents found where business processes are either broken or not fully documented.
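A minimal Python sketch of the first two items above, a DLP-style scan over database columns feeding a metadata catalog, might look like the following. The regex patterns, table names, and SQLite usage are illustrative assumptions, not a vendor's design.

```python
# A minimal sketch: scan database columns for sensitive patterns and record the
# findings in a small metadata catalog. Patterns and schemas are illustrative only.
import re
import sqlite3

PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def scan_table(source: sqlite3.Connection, table: str, catalog: sqlite3.Connection) -> None:
    """Scan every column of `table` for sensitive patterns and log hits to the catalog."""
    catalog.execute("""CREATE TABLE IF NOT EXISTS findings
                       (db_table TEXT, db_column TEXT, category TEXT, hits INTEGER)""")
    columns = [row[1] for row in source.execute(f"PRAGMA table_info({table})")]
    for column in columns:
        values = [row[0] for row in source.execute(f"SELECT {column} FROM {table}")
                  if isinstance(row[0], str)]
        for category, pattern in PATTERNS.items():
            hits = sum(len(pattern.findall(value)) for value in values)
            if hits:
                catalog.execute("INSERT INTO findings VALUES (?, ?, ?, ?)",
                                (table, column, category, hits))
    catalog.commit()

source = sqlite3.connect(":memory:")
source.execute("CREATE TABLE contacts (name TEXT, note TEXT)")
source.execute("INSERT INTO contacts VALUES ('Jane Doe', 'SSN 078-05-1120 on file')")
catalog = sqlite3.connect(":memory:")
scan_table(source, "contacts", catalog)
print(catalog.execute("SELECT * FROM findings").fetchall())
```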
DRM and ILP
When you have identified your sensitive information, you need to do something about it. The worst position to be in is to have sensitive information identified in places it does not belong, without a solid business plan for how to remediate it.
Unprotected sensitive information is also available to the malicious insider, and it is important to balance security needs against productivity needs. Finding a balanced solution can be done with a combination of DLP solutions, classification solutions, entitlement solutions, and protection solutions.
The best way to protect sensitive information is to manage entitlements to it as well as placing security controls on it. Entitlement management is hard to achieve unless you can enlist the owner or custodian of the information to manage the entitlements. However, if you ask them to manage both entitlements and encryption, you will not be successful unless you automate the solution for the custodian.
If you put a classification system in place, you can manage both via automation. However, it is important to also evaluate the encryption technology you want to use. Encrypting File System (EFS) may not be the best solution, as the encryption is lost when information is copied from one repository to the next, exposing the document.
Digital Rights Management (DRM) is a better solution, as the document is protected the entire time: at rest, in transit, and in use. One drawback, of course, is that DRM solutions may not protect all document types.
One DRM solution to consider is Information Rights Management (IRM), which is capable of protecting Microsoft Office documents. IRM works in conjunction with Rights Management Services (RMS). With such a solution, you can protect Excel spreadsheets, Word documents, PowerPoint decks, and other document types. IRM also works on documents retrieved from SharePoint when IRM is enabled in SharePoint.
A comprehensive solution will use the information from the DLP solution and apply the correct classification level to the repository where the documents are found. The entitlement solution will then restrict the set of users allowed access and require periodic re-validation of entitlements. Finally, the protection solution will place IRM protection on the documents.
IRM protection can prevent unauthorized copying, printing, and forwarding of documents. In addition, it can be used to control the lifecycle of a document by setting an expiration date.
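A hedged sketch of this classify, entitle, and protect chain is shown below in Python. The three hook functions are hypothetical placeholders standing in for a classification tool, an entitlement-management system, and an RMS/IRM API; they are not real product calls.

```python
# A hedged sketch of the classify -> entitle -> protect chain described above.
# The three hook functions are hypothetical placeholders, not real product APIs.
from typing import Iterable

def classify_repository(path: str, level: str) -> None:
    print(f"tagging {path} as {level}")                     # placeholder: classification tool

def restrict_access(path: str, allowed: Iterable[str]) -> None:
    print(f"limiting {path} to {sorted(allowed)}")          # placeholder: entitlement management

def apply_irm(path: str, template: str) -> None:
    print(f"applying IRM template '{template}' to {path}")  # placeholder: RMS/IRM protection

def remediate(finding: dict) -> None:
    """Run the full chain for one DLP finding."""
    classify_repository(finding["path"], finding["classification"])
    restrict_access(finding["path"], finding["business_owners"])
    apply_irm(finding["path"], template="Confidential - Do Not Forward")

remediate({"path": r"\\fileserver\finance\q4-forecast.xlsx",
           "classification": "Confidential",
           "business_owners": {"jdoe", "asmith"}})
```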
California SB 1386 extended to cover unencrypted health information. AB 1292 adds medical and health insurance information to the breach disclosure laws: http://www.scmagazineus.com/California-data-breach-disclosure-law-extended-to-cover-medical-records/PrintArticle/100459/. It will be interesting to see which organizations will have to notify after a breach in the upcoming year.
Swedish secret military information left in a public library in Stockholm Sweden on a USB stick: http://www.thelocal.se/9560/20080104/. This shows that users in all sorts of organizations need more awareness training, and easier access to security technologies enabling users to better protect their information. I will write about digital rights management as I believe it can alleviate some of these issues.
Friday, January 04, 2008
Follow up to why classification matters (seems that others believe the same):
"It's not just data. You have to classify everything from a risk perspective," said Brian Cleary, vice president of marketing at access governance firm Aveksa. "Once you have those controls in place, the likelihood of losing that data goes down exponentially." Copied from an article about information loss prevention published here:http://www.crn.com/security/205207370
If you don't keep tabs (metadata) on the types of data you have, who has access (and who should have access), its sensitivity, and its age, you are running blind.
A natural extension to a DLP implementation is a classification solution, an entitlement solution (which helps with SOX), and a retention solution.
Virginia Governor Kaine announces new legislation for consumer protection from Identity loss: http://www.govtech.com/gt/print_article.php?id=242006. The proposal includes breach notification and credit report freeze.
Thursday, January 03, 2008
How to tie it all together?
The way to tie it all together is to find a vendor willing to expose their APIs, so that you can call those APIs with the information you need to support the workflow required for a full compliance solution for your sensitive information.
During your initial investigation, you should map out the business processes that create sensitive information and develop a business process for compliance management and remediation. The compliance workflow should include reporting by business group, regional reporting, and roll-ups for each business manager so problem areas can be pinpointed. Furthermore, as you start scanning, you must be ready for business processes that are not documented, or that work differently than documented. Your solution must be able to facilitate onboarding of new business processes as they are discovered.
For data at rest, a good way to manage this information is to first detect the sensitive information in its repositories. Second, notify the owner of the share or site that sensitive information resides in the repository. Third, enable the owner to set the appropriate classification level and the accompanying protection. This step has to be intuitive and easy for the end user to perform, and must not impact the business process the repository is a part of. For this reason, the user should be given ample time to classify and protect the site, and a roll-back function must enable roll back in case of an adverse effect on the business process (a minimal sketch of such a roll-back follows).
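Here is a minimal sketch of applying a classification change with a roll-back path, using an in-memory snapshot. A real implementation would persist the previous classification and access control list, but the shape of the idea is the same.

```python
# A minimal sketch of a classification change that can be rolled back if it breaks
# a business process. The in-memory snapshot is a simplification for illustration.
from copy import deepcopy

class RepositoryState:
    def __init__(self, classification: str, acl: set):
        self.classification = classification
        self.acl = acl

def apply_with_rollback(state: RepositoryState, new_classification: str, new_acl: set):
    """Apply the change and return a callable that undoes it."""
    snapshot = deepcopy(state)          # keep what we need in order to undo the change
    state.classification = new_classification
    state.acl = new_acl
    def rollback() -> None:
        state.classification = snapshot.classification
        state.acl = snapshot.acl
    return rollback

repo = RepositoryState("Internal", {"everyone"})
undo = apply_with_rollback(repo, "Confidential", {"finance-team"})
# ...if the owner reports that the business process broke, roll back:
undo()
print(repo.classification, repo.acl)    # back to: Internal {'everyone'}
```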
Measure everything – Determine what the key performance criteria are, then constantly measure and analyze them. Integrate your findings into your operations. The best way to achieve this is to create good reports from all the data you are gathering. Combining information from the DLP solution with your directory services, network information, repository owner information, and the remediation efforts made by your compliance workflow yields very detailed reporting, which allows you to redirect your efforts toward bottlenecks and highlight high-risk areas you were not previously aware of (see the roll-up sketch below).
Drive Awareness – It is my belief that without awareness in your organization of your policies and of how to protect sensitive information, you will have inadvertent loss of sensitive information, possibly leaving your company non-compliant with regulatory requirements, or losing information that enables other organizations to capitalize on what your company worked so hard to obtain.
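As a small illustration of the reporting roll-up, the sketch below joins a simplified DLP incident export with a directory extract to count open incidents per business group. The data shapes are assumptions for illustration only.

```python
# A small roll-up sketch: join DLP incidents with directory data per business group.
# The data shapes here are illustrative assumptions, not a real export format.
from collections import Counter

directory = {            # user -> business group, e.g. from an AD/LDAP export
    "jdoe": "Finance",
    "asmith": "Finance",
    "bkumar": "Engineering",
}

incidents = [            # simplified export from the DLP solution
    {"owner": "jdoe", "severity": "high"},
    {"owner": "bkumar", "severity": "low"},
    {"owner": "asmith", "severity": "high"},
]

rollup = Counter(directory.get(incident["owner"], "Unknown") for incident in incidents)
for group, count in rollup.most_common():
    print(f"{group}: {count} open incidents")
```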
Wednesday, January 02, 2008
A good overview of how to detect credit card data (or how difficult it can be) can be found in this blog entry by Ofer: http://www.modsecurity.org/blog/archives/2008/01/detecting_credi.html
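To illustrate one reason detection is hard, the sketch below pairs a loose regex for candidate numbers with a Luhn checksum to weed out false positives. This is a common technique and a simplification, not the detection logic from the linked article.

```python
# A simplified illustration: a candidate match from a loose regex still has to pass
# the Luhn checksum before it counts as a likely credit card number.
import re

CANDIDATE = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def luhn_valid(number: str) -> bool:
    digits = [int(d) for d in re.sub(r"\D", "", number)]
    checksum = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        checksum += d
    return checksum % 10 == 0

text = "card 4111 1111 1111 1111 on file, reference 4111 1111 1111 1112"
hits = [m.group().strip() for m in CANDIDATE.finditer(text) if luhn_valid(m.group())]
print(hits)   # only the Luhn-valid candidate survives
```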
An interesting blog on DLP vendor selection from Rich Mogull: Understanding and Selecting a Data Loss Prevention (DLP/CMF/CMP) Solution: Part 1 (of a seven part series)
Necessary additions for an enterprise running a DLP solution:
It is my belief that in order to run a successful DLP solution in your enterprise, you need to address remediation of any incidents that you find. The success of your implementation does not rely solely on how effective the solution is at finding incidents. It also relies on how well you can use the toolset to remediate these incidents.
I feel that most of the solutions available in the marketplace lack one significant item: persistent classification of the information found. If you don't classify the sensitive information in a manner visible to the end user, you will end up with a whack-a-mole game and less than optimal risk reduction. Please read the previous post about why classification matters.
Furthermore, I believe it is important to tie your DLP incidents into the incident management deployments already in place in the organization. That allows complete tracking and reporting on all your incidents, whether it is a DLP incident or a missing patch on a critical server. It also allows you to measure the effectiveness of your service and whether you are meeting your SLAs.
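A minimal sketch of such an integration is shown below: a DLP finding is mapped onto a generic ticket and handed to the service desk. The ticket fields and the REST endpoint are hypothetical assumptions; substitute the real API of whatever incident-management system is already in place.

```python
# A minimal sketch of forwarding a DLP finding into an existing incident-management
# system. The ticket fields and the REST endpoint are hypothetical placeholders.
import json
from urllib import request

def dlp_finding_to_ticket(finding: dict) -> dict:
    """Map a DLP finding onto the generic ticket schema the service desk already uses."""
    return {
        "summary": f"DLP: {finding['policy']} violation at {finding['location']}",
        "severity": finding.get("severity", "medium"),
        "category": "data-loss-prevention",
        "details": finding,
    }

def submit_ticket(ticket: dict,
                  endpoint: str = "https://servicedesk.example.com/api/tickets") -> None:
    """Prepare a POST to the (hypothetical) service-desk API."""
    body = json.dumps(ticket).encode()
    req = request.Request(endpoint, data=body, headers={"Content-Type": "application/json"})
    # request.urlopen(req)   # left disabled here because the endpoint is a placeholder
    print(f"would POST to {endpoint}: {body.decode()}")

submit_ticket(dlp_finding_to_ticket({"policy": "PCI", "severity": "high",
                                     "location": r"\\fs01\sales\orders.xls"}))
```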
Maintaining strong metrics is an integral part of Service Management, and helps strengthen your ability to prove compliance with regulatory and other requirements. It should also reduce the cost of audits.
Since most of the vendors are still venture capital (VC) funded, they are not able to meet all the requirements of an enterprise. Some of the vendors have now been purchased by larger companies, most notably Vontu, purchased by Symantec, and Tablus, purchased by RSA/EMC. Even when a vendor has been acquired by a larger player, it takes time to integrate the solutions, so for a while I believe it is important to spend some development resources building a comprehensive solution that serves enterprise needs.
To be successful, Information Loss Prevention should be part of your overall compliance strategy and overall compliance service. This, in turn, should be part of your overall Governance, Risk and Compliance (GRC) strategy.
Tuesday, January 01, 2008
Proof of Concept of an ILP solution:
Before embarking on a full-fledged information loss prevention program, you should conduct proof-of-concept testing of the vendors you want to evaluate.
Before you start your proof of concept, you need to gather information from your lines of business on what they consider sensitive, as well as information from your legal counsel regarding which regulatory compliance areas you should be concerned about.
With this information, you should do a thorough risk analysis. If you don't have all of the necessary knowledge inside your organization, I would recommend hiring an independent consultant to help you in this phase, as well as in the execution of the proof of concept.
You should also have a good understanding of how you want to run an ongoing ILP process in your organization after you have purchased the solution that best fits your needs.
The way I prefer to start the process is to assemble both a core team and a virtual team so that you have the resources you need to succeed. The core team typically consists of a PM, an architect, a developer if needed, and a system administrator.
The implementation begins with a project plan detailing the necessary steps. It should contain the Request for Information (RFI) process, the Request for Proposal (RFP) process, actual testing both in a lab environment and in some select production areas, as well as deployment tasks and hand-off to service management.
The inception phase of the project is used to create the business requirements document (BRD) and the project plan.
The planning phase is also the beginning of the POC phase, where testing happens in a controlled lab environment.
The development phase is also used for contract negotiations, as well as for developing any processes and code needed for the DLP solution to work optimally within the organization.
The testing phase is used to test any custom code and any additional deliverables needed from the DLP vendor, as well as the processes established, to see if they need any polishing.
The deployment phase then follows; it should contain alternate deployment plans in case you face any obstacles.
The final phase is hand-off to Service Management. I prefer upfront planning and collaboration with the Service Management team, with RACI (responsible, accountable, consulted, informed) assignments in place along with service management documentation such as Service Level Agreements (SLA), Operational Level Agreements (OLA), and Independent Contracts (IC). This makes the hand-off much easier.