Information Loss at Antioch University:
Failure to patch a Solaris server exposed 60,000 user records at Antioch University, including Social Security numbers: http://www.computerworld.com/action/article.do?command=viewArticleBasic&articleId=9075098&intsrc=hm_list
Monday, April 07, 2008
Go skiing, lose your PII: http://www.computerworld.com/action/article.do?command=viewArticleBasic&articleId=9074339&intsrc=hm_list Credit card information was stolen as cards were swiped. Maybe it is time to revisit credit cards with a built-in smart card chip? In this instance, 46,000 cards were exposed at the Okemo Mountain Resort ski area in Vermont.
Tuesday, April 01, 2008
PCI compliant: what does that mean?
Does compliance by an organization with PCI mean that credit card information is safe? According to a news article by InformationWeek (http://www.informationweek.com/security/showArticle.jhtml?articleID=206904986), this might not be the case: Hannaford Bros. lost 4.2 million credit and debit card numbers while stating on their website that they are compliant with the PCI industry standard.
Thursday, March 27, 2008
What strategies should you follow after you have implemented a DLP solution?
If you have deployed a DLP strategy, you have probably deployed it in your high-risk areas, and if you have become somewhat mature in your current DLP deployment, the next question is how to grow the deployment so that you can secure more areas. As you become more successful, your management, or clients within business groups who are not currently enjoying the protection a DLP solution can give, will ask you to protect their areas as well.
So the question becomes: how do you grow both horizontally and vertically? You can grow horizontally by putting in place more monitors, but you will quickly find yourself in a situation where your current rules/policies do not meet the needs of the additional areas you are now scanning, or where the business model you deployed for the corporate rollout does not meet the needs of the business unit you are now supporting in addition to the corporate rollout.
Do you invest in data in motion along with data at rest? Do you invest in endpoint protection? How about managing different departments, ranging from your HR department to your credit card processing department to your research and development arm? For each one of these, different business problems arise, and different solutions must be put in place. For HR, your main concern is probably the loss or disclosure of personnel data; for your sales organization, customer PII; and for your R&D department, the loss of your future bread and butter.
So the discussion becomes one of headcount, and centralized versus decentralized. Which model is right, and how do you ensure comparable results between them? It is a discussion that will be had in many organizations in the upcoming years. Many IT security shops will have the idea that you should have a centralized approach. This will become increasingly difficult for several reasons. One, only the users/business owners in the respective areas have an understanding of what is valuable and needs protection, and what doesn't. Then you have the issue of different IT departments controlling collaboration and messaging; each is important for securing your information. I think the right answer is a mix of centralized and decentralized, where information security runs the majority of the tools, but the business owners and IT collaborate on how to identify IP and business secrets, and create and manage policies dependent on roles.
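A minimal sketch in Python of what such business-unit-dependent policy resolution could look like; the unit names, content types and actions below are hypothetical, not taken from any particular product:

# Central security defines the corporate baseline; business units
# override or extend it for their own information types (hypothetical names).
CORPORATE_BASELINE = {
    "pii": "block_external",
    "credit_card": "block_external",
}

BUSINESS_UNIT_OVERRIDES = {
    "hr": {"personnel_data": "block_external"},
    "r_and_d": {"design_docs": "encrypt_and_log"},
}

def resolve_policy(business_unit, content_type):
    """Prefer the unit's own rule, fall back to the corporate baseline."""
    unit_rules = BUSINESS_UNIT_OVERRIDES.get(business_unit, {})
    return unit_rules.get(content_type,
                          CORPORATE_BASELINE.get(content_type, "monitor_only"))

print(resolve_policy("hr", "personnel_data"))  # unit-specific rule
print(resolve_policy("sales", "credit_card"))  # falls back to the baseline

The design point is simply that the baseline stays centrally owned while the units own only their overrides, which mirrors the mixed model argued for above.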
There is one undeniable fact: the amount of information is growing. In fact, according to IDC, it is growing by 60% a year, and new regulatory requirements mean that IT will have to invest more in managing information for disclosure, protection and retention.
Demand for storage capacity has grown by 60% per year and shows no signs of slowing down, according to research company IDC. New disclosure laws, which require more data to be preserved and retrievable, also are making storage management a bigger job. http://www.networkworld.com/news/2008/032108-storage-revolution-jobs.html
Thursday, March 13, 2008
New concerns regarding health care information misuse. An article from MSNBC highlights the impact an impostor can have on your health when your information is abused: http://www.msnbc.msn.com/id/23392229/
This should bring attention to the need for medical facilities, and anyone else keeping medical information, to be prepared to ensure the accuracy and integrity of the information, as well as to protect it from loss.
The shift from paper-based information management to electronic management enables greater efficiencies, including the sharing of information, but also enables loss of information on a much greater scale than at any time before in history.
Organizations that have not moved to encrypted storage for sensitive information should do so as soon as possible, and improved authentication and authorization models must be put in place where they are lacking.
Systems must be put in place that ensure that the identity used is that of the person receiving health care, and that only the information needed is available to the personnel who provide care or otherwise handle the information.
According to the FTC, 3% of identity theft victims have had someone else use their medical benefits. With identity theft growing, medical care becoming more expensive and leaving more people out, and the move towards electronic health information management, we are poised for the perfect storm.
Monday, March 10, 2008
Information loss prevention and operational risk management
An operational risk framework that takes input from across the organization and also manages exceptions to policy would be a huge benefit to overall risk management, especially as business users demand Web 2.0 applications and easy-to-use cell phones with dual-use capabilities (read: used as an email client for work purposes and to view video and listen to music for personal use), and as exceptions are granted to systems regarding patch level and security reviews.
Rolled-up operational risk summaries would be the only way to measure the aggregate operational risk in the organization. These, married with information flow views outlining which objects access what information, would make risk decisions easier to make. If you knew who had access to what information, where, when and on what device, it would be easy to see what the true risk was; and if a request for an exception came in, it would be easy to determine whether the additional risk was substantial or minimal. It would also be easy to envision a self-service model, where the user would be allowed to accept some risk, but if the risk moved above a threshold, a manager or security operator would have to grant it. Each business leader could then set an acceptable threshold within the organization, and that policy would then flow down to the individual users.
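A minimal sketch of that threshold-based routing; the risk scores and thresholds are hypothetical stand-ins for values the risk framework and business leaders would supply:

from dataclasses import dataclass

@dataclass
class ExceptionRequest:
    user: str
    description: str
    risk_score: float  # hypothetical 0-100 score from the risk framework

USER_SELF_APPROVE_LIMIT = 20.0  # assumed threshold set by the business leader
MANAGER_APPROVE_LIMIT = 60.0

def route_exception(req):
    """Decide who must approve a policy exception based on its risk score."""
    if req.risk_score <= USER_SELF_APPROVE_LIMIT:
        return "self_approved"           # the user accepts the residual risk
    if req.risk_score <= MANAGER_APPROVE_LIMIT:
        return "needs_manager_approval"  # above the user's own threshold
    return "needs_security_approval"     # substantial additional risk

print(route_exception(ExceptionRequest("alice", "delay patch one week", 12.5)))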
It has been a busy few days, and for information loss prevention a few areas are worth a highlight. It is still questioned whether iPhone 2.0 meets the regulatory requirements for data protection: http://computerworld.com/action/article.do?command=viewArticleBasic&articleId=9067319&intsrc=hm_list. Furthermore, here is a link to an article discussing how BlackBerry servers are ripe for the hacking; yet another concern for IT security personnel who need to protect sensitive information on all devices serviced by the organization: http://techworld.com/security/news/index.cfm?newsID=11663&pagtype=samechan.
Sunday, March 09, 2008
Information control
Maybe I should rename the blog from information protection, as it is just as much about information control. DLP products, along with DRM products, firewalls and other security controls, are merely solutions in place to control the flow of information. They are put in place to prevent the flow of information to systems or personnel who should not have it, and to allow the flow to systems or personnel who should have access.
DLP tries to identify the type of content and, based on rules, apply various protection mechanisms to the information. In some areas, context is also evaluated. However, one area DLP has not fully gone into is mapping social graphs to ensure that information does not flow downward in the trust hierarchy: from a highly trusted source, to a trusted (albeit less trusted than the first) source, and on towards an untrusted source.
Clear cases of such downward flow can be stopped by reducing access for broad access groups; however, human nature is such that obstacles to sharing information are usually overcome, especially if it is easier to circumvent the control than it is to obey it.
Willful loss of information can only be prevented if technology, processes and people (the majority of them) are aligned. The processes must be such that they enable secure sharing with the proper objects, and people must buy into the idea that the value of protecting certain types of information is higher than the cost incurred by reduced sharing.
This can seem contrary to many, as we want to communicate, and we will fail in most of our endeavors if we do not collaborate, at least within the groups we belong to. The problem is of course that most people belong to many groups, based on work, ideology, hobbies, neighbourhoods, etc. This means that just looking at the objects who have, had, or can gain access to the information is not enough. You also need to look at who these objects are connected to, and who they in turn are connected to. You need to map out which objects form hubs versus spokes (a power law distribution), and where these again lead.
One trick used to track such information is to use a 1x1 pixel, to see who receives certain information. This is however not included in most information as it traverses networks, storage areas, end points, data bases, applications etc. Only when you can marry a map of all objects, and their interrelatedness, and where the information actually moves to and from can you truly understand the risks and or possibilities the organization have in sharing information within and across boundaries.
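A minimal sketch of that mapping, built from made-up (sender, recipient) observations; the cutoff for calling an object a hub is an assumption:

from collections import Counter, defaultdict

# (sender, recipient) pairs observed in mail logs or on the wire (made up)
flows = [
    ("alice", "bob"), ("alice", "carol"), ("alice", "dave"),
    ("bob", "eve"), ("carol", "eve"), ("dave", "mallory"),
]

graph = defaultdict(set)
out_degree = Counter()
for sender, recipient in flows:
    graph[sender].add(recipient)
    out_degree[sender] += 1

# Objects with unusually high out-degree are the hubs: information given
# to them can quickly reach many spokes.
HUB_THRESHOLD = 3  # assumed cutoff
hubs = [obj for obj, degree in out_degree.items() if degree >= HUB_THRESHOLD]
print("hubs:", hubs)

def two_hop_reach(obj):
    """Who can information reach via one intermediary?"""
    reach = set()
    for neighbor in graph[obj]:
        reach |= graph.get(neighbor, set())
    return reach

print("alice's two-hop reach:", two_hop_reach("alice"))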
Today's DLP solutions create classifications in varying degrees; some store the result set in a database, while others persist the information within the metadata of the document, either directly within the document or in an alternate stream. These can of course be stripped off, and until DRM becomes pervasive, it will not solve this issue either. Actually, DRM has another problem: if information is presented on a screen, it can be copied, and the controls are stripped off as a consequence. However, DRM will increase the effort necessary to improperly distribute information to objects who should not have access.
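As an illustration of persisting a label alongside a document, here is a minimal sketch; the stream and sidecar names are my own invention, and as noted above, either can be stripped off by a simple copy:

import os

def persist_label(path, label):
    if os.name == "nt":
        # On NTFS, an alternate data stream keeps the label attached to the file
        with open(path + ":classification", "w") as stream:
            stream.write(label)
    else:
        # Elsewhere, fall back to a sidecar metadata file next to the document
        with open(path + ".classification", "w") as sidecar:
            sidecar.write(label)

def read_label(path):
    for candidate in (path + ":classification", path + ".classification"):
        try:
            with open(candidate) as f:
                return f.read()
        except OSError:
            continue
    return None

persist_label("design_doc.txt", "highly_sensitive")
print(read_label("design_doc.txt"))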
In order to support better protection, identity management is another dimension that must be solved. I will not go into much depth in this posting, other than to say that role-based identity management is hard, identity management between organizations is even harder, and this is a contributor to the problem.
Wednesday, March 05, 2008
According to a survey by Pursuant, http://www.pursuantresearch.com/, 32% of surveyed government IT personnel do not think they will become compliant with requirements such as HSPD-12, FIPS 201, and FISMA. This article, http://www.informationweek.com/news/showArticle.jhtml?articleID=206901345, states that government IT personnel believe national security trumps privacy.
Sunday, March 02, 2008
Confluence of HIPAA security audits and increasing attacks from the Internet creates pressure on health care organizations to protect their patient information: http://www.networkworld.com/news/2008/022708-healthcare-cyberattacks.html
The four important questions for any custodian of sensitive information are:
What information exists on my systems?
Where is it located?
Who has access?
How is it protected?
I believe that to find out what information exists, cataloguing and classification are a necessity. To find out where it is, the repositories containing information must be scanned, and the content then classified based on this scan. To ensure that only users who need access have access, entitlement management is key. The information that is classified should then be protected.
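A minimal sketch of the scan-and-classify step; the patterns are simplified illustrations rather than production rules, and the repository root is hypothetical:

import os
import re

PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),           # dashed US SSN form only
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),  # crude card-number shape
}

def classify_file(path):
    """Return the set of labels whose patterns appear in the file."""
    labels = set()
    try:
        with open(path, errors="ignore") as f:
            text = f.read()
    except OSError:
        return labels
    for label, pattern in PATTERNS.items():
        if pattern.search(text):
            labels.add(label)
    return labels

def scan_repository(root):
    """Walk a repository and catalogue where sensitive content lives."""
    catalogue = {}
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            labels = classify_file(path)
            if labels:
                catalogue[path] = labels
    return catalogue

# The catalogue answers "what exists" and "where it is located";
# entitlement review (who has access) is the next step.
print(scan_repository("/data/shares"))  # hypothetical repository root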
This cannot be achieved with technology alone. People, Process and Technology all go hand in hand to solve this problem.
Selection Criteria for an ILP solution
Here are the high-level selection criteria I would use for selecting a DLP solution; a simple weighted-scoring sketch follows the list:
· Accuracy (I would be willing to trade speed for accuracy if needed)
· Speed (can all high risk areas be scanned efficiently without a high bandwidth cost)
· Scalability (can all high risk areas be scanned efficiently)
· Remediation capabilities (if a scanning solution is deployed without proper remediation, it leaves the organization with a much higher risk than prior to scanning)
· Upfront cost of application
· Upfront cost of services needed to deploy application
· Cost of ownership
o How many headcount are needed to manage incidents and systems
o What is the annual support cost
o What is the total lifetime cost of the application (over 3 years)
· Risk reduction provided by application
o How is it measured
o Will the result set stand up in court (can I prove due diligence when using these tools)
o Can new regulatory requirements or new corporate policy be set up within a standard framework
o Does the reporting meet the following needs
§ Overall risk reduction
§ Specific risk reduction for business unit/regulatory compliance/regional compliance
§ Can ROI be demonstrated
§ Are executive reports easy to understand
§ Can executive reports be rolled into a CIO scorecard
§ Do the reports for the operations team allow for improving the efficiency of the team and its rules (this drives TCO)
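The weighted-scoring sketch mentioned above, for comparing candidates against these criteria; the weights and the vendor scores are illustrative only:

CRITERIA_WEIGHTS = {
    "accuracy": 0.30,  # weighted highest, per the note on trading speed for accuracy
    "speed": 0.10,
    "scalability": 0.15,
    "remediation": 0.15,
    "upfront_cost": 0.10,
    "cost_of_ownership": 0.10,
    "risk_reduction": 0.10,
}

def score_vendor(scores):
    """Scores run 0-10 per criterion; returns the weighted total."""
    return sum(CRITERIA_WEIGHTS[c] * scores.get(c, 0.0) for c in CRITERIA_WEIGHTS)

vendor_a = {"accuracy": 9, "speed": 5, "scalability": 7, "remediation": 8,
            "upfront_cost": 4, "cost_of_ownership": 6, "risk_reduction": 8}
vendor_b = {"accuracy": 6, "speed": 9, "scalability": 8, "remediation": 5,
            "upfront_cost": 7, "cost_of_ownership": 7, "risk_reduction": 6}

print("A:", score_vendor(vendor_a), "B:", score_vendor(vendor_b))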
Thursday, February 28, 2008
Can you buy PCI compliance? A good article from InformationWeek: http://informationweek.com/security/showArticle.jhtml?articleID=206800868
Of course, you can get solid advice from vendors, but technology is just one part of the equation. First, you should evaluate whether you have the right skill set in your organization; then you should evaluate your current processes, and re-engineer them if needed. Only when you have evaluated both people and processes should you start evaluating technology.
Database of 8,700 stolen FTP credentials found by Finjan: http://www.eweek.com/c/a/Security/Finjan-Finds-Database-of-8700-Stolen-FTP-Credentials/
Passwords should be treated as highly sensitive information: they are often reused by users, and their loss can lead to the loss of all types of sensitive information within information systems. However, passwords can be hard to search for unless you already have a database of passwords. If passwords have to be stored electronically, they should stay encrypted at all times.
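Where passwords only need to be verified rather than recovered, a salted, slow hash is the usual way to avoid storing them in the clear; a minimal sketch, with an illustrative work factor:

import hashlib
import hmac
import os

ITERATIONS = 100_000  # illustrative work factor

def hash_password(password):
    """Return (salt, digest) suitable for storage instead of the password."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password, salt, digest):
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("a guess", salt, digest))                       # False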
Sunday, February 17, 2008
New study from Symantec
IT organizations responding to Symantec's survey now report that work on regulatory compliance is either comparable to other projects or more important than risk mitigation efforts: http://www.infoworld.com/article/08/01/31/Study-reframes-IT-risk-management_1.html
This should be good news for information loss prevention programs, as PCI is definitely a driver for improved controls on how and when information is shared and to whom.
I believe the future trend will be divestment from some security strategies historically undertaken by organizations, such as extranet solutions, firewall deployments, etc., and that the major investments of the future will be in a blend of identity management and entitlement management. If you look at current encryption solutions, they usually stop at the enterprise egress point, as most organizations are not able to convince their partners to agree on a federation model.
It is time to divest from underperforming security initiatives and invest in areas where you can find a better return on your investment. Today, investment in compliance can provide better ROI than merely investing in security controls. If you combine your investments so that you improve uptime, enable business, and can prove compliance, you will find much more value than in investing in security controls alone.
Saturday, February 09, 2008
Databases and DLP
Quote from an article in eWeek (http://www.eweek.com/c/a/Security/DLP-DAM-Share-Common-Data-Security-Objectives/): "Most every security monitoring technology would benefit from DLP content awareness, which is the ability to recognize sensitive content on the fly," said Paul Proctor, an analyst with Gartner.
I completely agree. I believe DLP vendors need to address databases along with repositories, email and endpoints. Furthermore, such solutions should also protect any sensitive information leaving the database.
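A minimal sketch of that kind of content awareness applied to a database, with sqlite3 standing in for any database and hypothetical table and column names:

import re
import sqlite3

CARD_PATTERN = re.compile(r"\b(?:\d[ -]?){13,16}\b")  # crude card-number shape

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, note TEXT)")
conn.execute("INSERT INTO orders VALUES (1, 'paid with 4111 1111 1111 1111')")
conn.execute("INSERT INTO orders VALUES (2, 'paid by invoice')")

def flag_sensitive_rows(conn, table, column):
    """Return ids of rows whose text column looks like it holds a card number."""
    hits = []
    for row_id, value in conn.execute(f"SELECT id, {column} FROM {table}"):
        if value and CARD_PATTERN.search(value):
            hits.append(row_id)
    return hits

print(flag_sensitive_rows(conn, "orders", "note"))  # [1]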
Amendments to the Federal Rules of Civil Procedure (FRCP) are creating opportunities for content management solutions: http://www.byteandswitch.com/document.asp?doc_id=144806&WT.svl=news1_6
Some solutions sit on email and use keywords and phrases; others enable retrieval from tapes and other media.
At some point in the not so distant future, eDiscovery solutions and ILP solutions will probably merge, as they are both solving much the same problem.
Friday, February 08, 2008
Eli Lilly legal documents wrongfully sent to the New York Times in a billion-dollar lawsuit
Eli Lilly could probably have been better protected if they had had in place a federated trust with their law firm, Pepper Hamilton, and with it the opportunity to protect their confidential communication with their outside counsel. This is truly a case where Digital Rights Management could really have protected their information.
http://news.cnet.co.uk/software/0,39029694,49295453,00.htm
This case of information leakage is enlightening in several respects.
One, Eli Lilly could potentially have lost ground in a serious legal matter.
Two, this was an understandable mistake by the outside counsel, albeit one could argue that more care should have been taken. Awareness is key, and an awareness program can reduce the risk of such incidents.
Three, when conducting business with partners, just having legal agreements in place on how information is to be handled is not good enough; compliance with contractual obligations should be audited. This email could potentially have been stopped at the email server if an information loss prevention solution had been in place.
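A minimal sketch of such a gateway check, with hypothetical classification labels and approved counsel domains:

APPROVED_DOMAINS = {"lilly.com", "pepperlaw.com"}  # hypothetical approved domains

def should_block(message):
    """Hold confidential mail if any recipient domain is not on the approved list."""
    if message.get("classification") != "confidential":
        return False
    return any(
        rcpt.rsplit("@", 1)[-1].lower() not in APPROVED_DOMAINS
        for rcpt in message["recipients"]
    )

msg = {
    "classification": "confidential",
    "recipients": ["counsel@pepperlaw.com", "reporter@nytimes.com"],
}
print(should_block(msg))  # True: one recipient domain is not approved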
Thursday, February 07, 2008
An interesting book from the CEO of Kaiser Permanente, George Halvorson: http://www.healthcarereformnow.org/
In his second hard truth, Mr. Halvorson discusses care linkage deficiencies, in which he describes how medical doctors create paper-based medical records for their patients.
It is commendable that a person like Mr. Halvorson, who has so much influence, is actively driving the digitization of health care records. If these records are made easily available to care providers as well as care recipients, great efficiencies can be created.
Digitizing medical records does come with some security concerns, which should be addressed. Only authorized personnel should have access. Anecdotal evidence I have seen and heard points to the need to improve the culture in the health care industry with regard to safeguarding patient information. An awareness campaign is needed among caregivers to educate them on how best to secure such information. Furthermore, tools need to be made available to health care professionals that allow them to continue to provide health care without being bogged down by security measures that hinder their work.
These tools should address the who, what, when and where of access to highly sensitive information such as patient records, while enabling health care professionals to spend more time caring for patients. They must enable secure collaboration so that each professional who needs access to information readily has it, yet is restricted to only that information and not the records of all patients.
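A minimal sketch of such a who/what/when/where check; the treatment relationships, hospital subnets and working hours below are hypothetical stand-ins for a real authorization system:

from datetime import datetime

TREATMENT_TEAM = {"patient-123": {"dr_lee", "nurse_kim"}}  # who treats whom
ON_SITE_PREFIXES = ("10.20.",)                             # where: hospital subnets

def may_access(user, patient_id, source_ip, when):
    on_team = user in TREATMENT_TEAM.get(patient_id, set())  # who + what
    on_site = source_ip.startswith(ON_SITE_PREFIXES)         # where
    working_hours = 6 <= when.hour < 22                      # when (assumed)
    return on_team and on_site and working_hours

now = datetime(2008, 3, 5, 10, 30)
print(may_access("dr_lee", "patient-123", "10.20.1.7", now))  # True
print(may_access("dr_lee", "patient-999", "10.20.1.7", now))  # False: not on team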
An article in SC Magazine, written by Joel Christner of Reconnex, discusses learning applications for addressing High Business Impact (HBI) information in the enterprise: http://www.scmagazineus.com/Learning-applications-Revolutionizing-data-loss-prevention/article/105073/
Entitlement management
Entitlement management is important not only for your security posture; it is also important for your compliance efforts for SOX and PCI.
The problem with entitlement management is, of course, knowing who has access to what. You probably know who has access, unless your access policy on your information is too broad. How would you know if access is too broad? You need to scan for large user groups and global groups; these groups should not be allowed access to sensitive and highly sensitive information. Do you know all the instances of sensitive and highly sensitive information within your organization? You can of course use DLP to scan for these information types. The problem is that DLP solutions do not map back to who had access when.
With these questions/problems, what are you to do?
One, you should scan all your information and identify where you have highly sensitive and sensitive information.
Two, when this has been identified, you need to keep a persistent classification of the information, so a classification solution must be deployed and implemented.
Three, when you have applied the classification, you need to ensure that large and/or global groups do not have access to this information.
Finally, for information where you need to validate that users who should have access do have access, and users who should not have access do not, custodians of the sensitive information are required to validate the users who have access. By forcing the validation at the lowest level possible, you can effectively address the biggest problem in organizations today: entitlement creep. Entitlement creep happens when employees move from one job to another, or the job changes over time, and access needs change with them. Most often, when this happens, the employee gets access to the new areas needed for the job, but the old entitlements are not removed. By clearly assigning custodianship at as low a level as possible, this can be taken care of, provided the custodians are reminded periodically to validate who should have access, and are aware that they are also audited against their accountability.
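A minimal sketch combining the two checks above: flagging sensitive locations exposed to overly broad groups, and generating periodic attestation tasks for custodians. The group sizes, ACL data and threshold are all hypothetical:

GROUP_SIZES = {"all_employees": 52000, "finance_team": 35, "hr_core": 12}
BROAD_GROUP_THRESHOLD = 100  # assumed cutoff for a "large" group

# folder -> (classification, groups granted access, custodian)
ACLS = {
    "/shares/payroll":  ("highly_sensitive", ["hr_core", "all_employees"], "hr_lead"),
    "/shares/invoices": ("sensitive",        ["finance_team"],             "fin_lead"),
}

def broad_access_findings():
    """Sensitive locations readable by groups above the size threshold."""
    findings = []
    for folder, (label, groups, _custodian) in ACLS.items():
        if label in ("sensitive", "highly_sensitive"):
            for group in groups:
                if GROUP_SIZES.get(group, 0) > BROAD_GROUP_THRESHOLD:
                    findings.append((folder, group))
    return findings

def attestation_tasks():
    """One periodic review task per custodian, to counter entitlement creep."""
    return [(custodian, folder) for folder, (_label, _groups, custodian) in ACLS.items()]

print(broad_access_findings())  # [('/shares/payroll', 'all_employees')]
print(attestation_tasks())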
In other words, the full solution is to map your scanning of sensitive information to your identity management systems, together with a classification and remediation solution.