Cybersecurity risk management is a crucial aspect of an organization’s overall risk management strategy. It involves identifying, assessing, and controlling threats to an organization’s digital assets, including data, networks, and computer systems[1][3][7][9][13].
Importance of Cybersecurity Risk Management
Cybersecurity risk management is essential for several reasons:
Compliance with Regulations: Many organizations are required to maintain certain levels of cybersecurity standards to comply with regulations such as HIPAA and PCI DSS[1].
Improved Decision-making: Understanding potential risks and their associated consequences allows organizations to make more informed decisions, leading to more effective allocation of resources and system design[1].
Increased Security: Risk management processes are designed to reduce the likelihood of a cyberattack and help protect the organization’s digital assets[1].
Improved Visibility: Risk management can provide organizations with greater visibility into their cybersecurity posture, helping them better understand their security landscape and be better prepared to respond to threats[1].
Cybersecurity Risk Management Process
The cybersecurity risk management process typically involves several key steps:
Identifying Critical Assets: This involves recognizing the digital assets that are most crucial to the organization’s operations and could be potential targets for cyber threats[4].
Evaluating Vulnerabilities: This step involves assessing the weaknesses in the organization’s systems that could be exploited by cyber threats[4].
Understanding Potential Threats: This involves identifying the various forms of cyber threats that the organization could face, such as hackers, malware, insider threats, and social engineering attacks[6].
Building Robust Defenses: This involves investing in industry-leading technologies and implementing measures to protect the organization’s digital assets[4].
Implementing Incident Response Plans: This involves developing strategies to respond effectively to potential cyber threats[4].
Continuous Monitoring: This involves regularly monitoring the organization’s systems for potential threats and making necessary adjustments to the risk management strategy[4].
Role of Cybersecurity Risk Managers
Cybersecurity risk managers are professionals responsible for managing cyber risks within an organization. They identify, analyze, and evaluate vulnerabilities and work to add value to the company’s strategy by ensuring the organization’s digital assets are protected[11].
Conclusion
Cybersecurity risk management is a critical discipline in today’s digital world. It involves anticipating, evaluating, and managing the cyber risks that an organization may face. By implementing effective cybersecurity risk management strategies, organizations can protect their digital assets, ensure compliance with regulations, make informed decisions, and improve their overall security posture[1][3][7][9][13].
This script is used in the video available online on YouTube here:
Context information:
I’m a senior cybersecurity risk analyst. I’m working for an IT and cybersecurity consulting firm in Montreal (Quebec, Canada). I was hired by a customer to assist them with their cybersecurity governance, risk management and compliance activities.
Initial data provided:
Here is what you need to know about the target organization we will be using to perform a cybersecurity risk assessment: The Integrated Health and Social Services University Network for West-Central Montreal (CIUSSS West-Central Montreal) is committed to providing healthcare recipients with timely access to a seamless continuum of care that focuses on individuals’ particular needs. As well, consider the reports and other documents provided as attachments about the organization that will be used in the risk assessment.
(remember to add annual reports and other information available online about the organization)
Estimate the Risk appetite:
Estimate the Risk appetite for the Integrated Health and Social Services University Network for West-Central Montreal (CIUSSS West-Central Montreal). On a scale of 0 to 1, estimate a quantitative Risk Appetite value. Provide a detailed justification of the Risk appetite. Proceed to save this Risk appetite value to use in later risk assessments and calculations for this organization.
Consider the following information:
Highly Risk Averse or a very low Risk appetite = 0.1
Risk Averse or a low Risk appetite = 0.3
Risk neutrality or a Risk Neutral Risk appetite = 0.5
Risk Seeking or a high Risk appetite = 0.7
Highly Risk Seeking or a very high Risk appetite = 0.9
Summary scenario generation query:
Using the information provided about the organization, the results generated in the previous queries, as well as your accumulated knowledge, create 20 cybersecurity risk scenarios for the Integrated Health and Social Services University Network for West-Central Montreal (CIUSSS West-Central Montreal).
Intermediate query:
Consider the following cybersecurity risk scenario, Ransomware Attack on Healthcare Systems: Malware encrypts critical patient data and systems, demanding a ransom to restore access, at the Integrated Health and Social Services University Network for West-Central Montreal (CIUSSS West-Central Montreal).
Detailed scenario generation query:
Create a detailed description of the summarily described cybersecurity risk scenario.
Consider the following cybersecurity risk scenario, Ransomware Attack on Healthcare Systems: Malware encrypts critical patient data and systems, demanding a ransom to restore access, at the Integrated Health and Social Services University Network for West-Central Montreal (CIUSSS West-Central Montreal).
Using the information generated in the previous intermediate query, expand the cybersecurity risk scenario into a detailed cybersecurity risk scenario. In your answer provide the following details:
The Scenario Name,
a List of the stakeholders involved,
some Background information about the scenario,
a detailed Description of the scenario or the incident leading to the undesired outcome,
a bullet list of the sequence of events leading to the scenario,
a description of the Consequences,
some Historical data, and
proposed Mitigation measures, internal controls, and prevention mechanisms.
Provide the following metrics and measures:
On a scale of 0 to 1, include the Probability that the threat will be present.
On a scale of 0 to 1, include the Probability of exploitation.
On a scale of 0 to 1, include the Estimated expected damages.
On a scale of 0 to 1, include the Maximal damages.
On a scale of 0 to 1, include the Level of organizational resilience.
On a scale of 0 to 1, include the Expected utility.
All these previous metrics need to be on a scale of 0 to 1. Then, calculate the CVSS version 3.1 score of the vulnerabilities. Use the CVSS scale of 0 to 10. Use the information you already generated in the previous queries to calculate the CVSS score. However, if information is missing, make your best estimate of the required specific details about the vulnerability, including its attack vector, complexity, privileges required, user interaction, scope, confidentiality impact, integrity impact, availability impact, and more. Provide the detailed metrics used in the CVSS calculation. Present the CVSS score on the CVSS scale of 0 to 10 and indicate the severity level on the following scale:
None = 0
Low = 0.1 to 3.9
Medium = 4.0 to 6.9
High = 7.0 to 8.9
Critical = 9.0 to 10.0
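To make the CVSS step concrete, here is a minimal sketch of a CVSS v3.1 base-score calculation in Python, using the standard metric weights from the FIRST specification. The example vector (network attack vector, low complexity, no privileges required, user interaction required, scope unchanged, high confidentiality/integrity/availability impact) is an assumption chosen to resemble a typical phishing-delivered ransomware entry point, not a value produced by the assessment above.

```python
import math

# CVSS v3.1 base metric weights (FIRST specification).
AV = {"N": 0.85, "A": 0.62, "L": 0.55, "P": 0.20}
AC = {"L": 0.77, "H": 0.44}
PR_UNCHANGED = {"N": 0.85, "L": 0.62, "H": 0.27}
PR_CHANGED = {"N": 0.85, "L": 0.68, "H": 0.50}
UI = {"N": 0.85, "R": 0.62}
CIA = {"H": 0.56, "L": 0.22, "N": 0.0}

def roundup(x: float) -> float:
    """CVSS v3.1 Roundup: smallest one-decimal value >= x (integer math avoids float drift)."""
    i = int(round(x * 100000))
    return i / 100000 if i % 10000 == 0 else (i // 10000 + 1) / 10.0

def cvss31_base(av, ac, pr, ui, scope, c, i, a):
    iss = 1 - (1 - CIA[c]) * (1 - CIA[i]) * (1 - CIA[a])
    if scope == "U":
        impact = 6.42 * iss
    else:  # scope changed
        impact = 7.52 * (iss - 0.029) - 3.25 * (iss - 0.02) ** 15
    pr_w = (PR_UNCHANGED if scope == "U" else PR_CHANGED)[pr]
    exploitability = 8.22 * AV[av] * AC[ac] * pr_w * UI[ui]
    if impact <= 0:
        return 0.0
    if scope == "U":
        return roundup(min(impact + exploitability, 10))
    return roundup(min(1.08 * (impact + exploitability), 10))

# Assumed vector for the ransomware scenario: AV:N/AC:L/PR:N/UI:R/S:U/C:H/I:H/A:H
print(cvss31_base("N", "L", "N", "R", "U", "H", "H", "H"))  # 8.8 -> High on the scale above
```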
Present a detailed list of proposed Mitigation measures, internal controls and prevention mechanisms that can be used. Include a detailed budgetary estimate in Canadian dollars ($) for the costs of implementing the proposed Mitigation measures, internal controls, and prevention mechanisms. Indicate the impact reduction and probability reduction on a scale of 0 to 1 for the proposed Mitigation measures, internal controls, and prevention mechanisms.
Proceed to save all these values to use in later calculations.
Calculate the KRI:
Consider the following cybersecurity risk scenario, Ransomware Attack on Healthcare Systems: Malware encrypts critical patient data and systems, demanding a ransom to restore access, at the Integrated Health and Social Services University Network for West-Central Montreal (CIUSSS West-Central Montreal).
Using the results generated and presented from the previous queries, calculate the Key Risk Indicators (KRI) for the cybersecurity risk scenario. Use the formulas that are provided below:
Estimated risk = (Probability that the threat will be present × Probability of exploitation × CVSS Score × Expected utility × ((Estimated expected damages + Maximal damages) / 2)) / Level of organizational resilience.
Calculate the Estimated risk
Include detailed information about the calculations made to produce these results.
Show in your answer the formula that was used.
Save this value to use in later calculations.
Tolerated risk = (Probability that the threat will be present × Probability of exploitation × CVSS Score × Expected utility × Risk appetite) / Level of organizational resilience.
Calculate the Tolerated risk
Include detailed information about the calculations made to produce these results.
Show in your answer the formula that was used.
Save this value to use in later calculations.
Mitigated risk = Estimated risk × (impact reduction for the proposed Mitigation measures, internal controls, and prevention mechanisms × probability reduction for the proposed Mitigation measures, internal controls, and prevention mechanisms).
Calculate the Mitigated risk
Include detailed information about the calculations made to produce these results.
Show in your answer the formula that was used.
Save this value to use in later calculations.
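To illustrate how the three KRI formulas combine the saved metrics, here is a minimal worked sketch in Python. Every input value below (probabilities, damages, resilience, CVSS score, reduction factors, risk appetite) is an illustrative assumption, not a figure produced by the assessment.

```python
# Illustrative inputs only, on the scales defined above (all assumptions).
p_threat = 0.8               # Probability that the threat will be present (0-1)
p_exploit = 0.6              # Probability of exploitation (0-1)
cvss = 8.8                   # CVSS v3.1 score (0-10)
utility = 0.9                # Expected utility (0-1)
exp_damages = 0.7            # Estimated expected damages (0-1)
max_damages = 0.9            # Maximal damages (0-1)
resilience = 0.6             # Level of organizational resilience (0-1)
risk_appetite = 0.3          # Saved earlier for the organization (0-1)
impact_reduction = 0.5       # From the proposed mitigation measures (0-1)
probability_reduction = 0.6  # From the proposed mitigation measures (0-1)

estimated_risk = (p_threat * p_exploit * cvss * utility
                  * ((exp_damages + max_damages) / 2)) / resilience
tolerated_risk = (p_threat * p_exploit * cvss * utility * risk_appetite) / resilience
mitigated_risk = estimated_risk * (impact_reduction * probability_reduction)

print(f"Estimated risk: {estimated_risk:.2f}")  # ~5.07
print(f"Tolerated risk: {tolerated_risk:.2f}")  # ~1.90
print(f"Mitigated risk: {mitigated_risk:.2f}")  # ~1.52
```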
Explain and justify the results and KRI:
Using the results generated and presented from the previous queries, and the Key Risk Indicators (KRI) for the cybersecurity risk scenario Ransomware Attack on Healthcare Systems (malware encrypts critical patient data and systems, demanding a ransom to restore access, at the Integrated Health and Social Services University Network for West-Central Montreal (CIUSSS West-Central Montreal)), answer the following questions:
Compare the Residual risk (the Mitigated risk calculated above) to the Tolerated risk.
Explain the results.
In the case where the Residual risk is higher than the Tolerated risk: make recommendations for the organization on what could be done to remediate the risks further.
In the case where the Residual risk is lower than the Tolerated risk: make recommendations for the organization on what could be done to avoid unnecessary spending while keeping the risk levels acceptable.
Create a table of the results:
Create a table of the results that can be exported to Excel with the following rows:
Probability that the threat will be present
Probability of exploitation
CVSS Score
Expected utility
Estimated expected damages
Maximal damages
Level of organizational resilience
Risk appetite
Impact reduction for the proposed Mitigation measures, internal controls, and prevention mechanisms
Probability reduction for the proposed Mitigation measures, internal controls, and prevention mechanisms
Budgetary estimate of the proposed mitigation measures
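As a sketch of this final step, the results table can be assembled and exported to Excel with pandas (the openpyxl package is assumed as the writer backend). The values below are the same illustrative assumptions used in the KRI sketch earlier, not assessment outputs.

```python
import pandas as pd

# Illustrative values only; replace with the figures saved during the assessment.
results = {
    "Probability that the threat will be present": 0.8,
    "Probability of exploitation": 0.6,
    "CVSS Score": 8.8,
    "Expected utility": 0.9,
    "Estimated expected damages": 0.7,
    "Maximal damages": 0.9,
    "Level of organizational resilience": 0.6,
    "Risk appetite": 0.3,
    "Impact reduction": 0.5,
    "Probability reduction": 0.6,
    "Budgetary estimate (CAD $)": 1_250_000,  # hypothetical placeholder
}

table = pd.DataFrame(list(results.items()), columns=["Metric", "Value"])
table.to_excel("kri_results.xlsx", index=False)
```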
This text accompanies a video available on YouTube:
In this detailed video, we dive into the heart of cybersecurity risk analysis with the help of ChatGPT and CyberRisk Guardian: https://chat.openai.com/g/g-ZBqngfK39-cyberrisk-guardian. This example uses an organization from the healthcare sector. Follow our step-by-step approach to identifying and mitigating cybersecurity risks within a large healthcare organization, the CIUSSS du Centre-Ouest-de-l’Île-de-Montréal. We cover fundamental concepts such as risk identification, assessment of probabilities and impacts, and the implementation of mitigation strategies to strengthen organizational resilience.
Here is the data used in the video:
Context information:
I am a senior cybersecurity risk analyst. I work for an IT and cybersecurity consulting firm in Montreal (Quebec, Canada). I was hired by a client to assist them with their cybersecurity governance, cybersecurity risk management, and compliance activities.
Initial data provided:
Here is what you need to know about the target organization we will use to perform a cybersecurity risk assessment: The Centre intégré universitaire de santé et de services sociaux du Centre-Ouest-de-l’Île-de-Montréal (CIUSSS) is committed to providing healthcare recipients with timely access to a seamless continuum of care focused on individuals’ particular needs. Also, consider the reports and other documents provided as attachments about the organization, which will be used in the risk assessment.
(Remember to add annual reports and other information available online to provide as much information as possible about the organization.)
Estimate the risk appetite:
Estimate the risk appetite of the Centre intégré universitaire de santé et de services sociaux du Centre-Ouest-de-l’Île-de-Montréal (CIUSSS Centre-Ouest-de-l’Île-de-Montréal). On a scale of 0 to 1, estimate a quantitative risk appetite value. Provide a detailed justification of the risk appetite. Save this risk appetite value to use in later risk assessments and calculations for this organization.
Consider the following information:
Highly risk averse or a very low risk appetite = 0.2
Risk averse or a low risk appetite = 0.3
Risk neutrality or a risk-neutral risk appetite = 0.5
Risk seeking or a high risk appetite = 0.7
Highly risk seeking or a very high risk appetite = 0.8
A value of 0.1 or 0.9 is an extreme value that should not normally be used.
Summary scenario generation query:
Create 20 cybersecurity risk scenarios for the Centre intégré universitaire de santé et de services sociaux du Centre-Ouest-de-l’Île-de-Montréal (CIUSSS).
Intermediate query:
Create a detailed description of the cybersecurity risk scenario Ransomware Attack: An attacker encrypts critical patient data and demands a ransom for the decryption key, at the Centre intégré universitaire de santé et de services sociaux du Centre-Ouest-de-l’Île-de-Montréal (CIUSSS). In summary, this involves unauthorized access to, and exfiltration of, patient health information due to inadequate security controls.
Detailed scenario generation query:
Expand the cybersecurity risk scenario Ransomware Attack: An attacker encrypts critical patient data and demands a ransom for the decryption key, at the Centre intégré universitaire de santé et de services sociaux du Centre-Ouest-de-l’Île-de-Montréal. Create a detailed cybersecurity risk scenario. For the scenario, provide:
the Scenario name,
a list of the stakeholders involved,
background information about the scenario,
a detailed description of the scenario or the incident leading to the undesired outcome,
a bullet list of the sequence of events leading to the undesired outcome,
a description of the anticipated consequences if the scenario occurs, historical data, and
proposed mitigation measures, internal controls, and prevention mechanisms likely to reduce the probability of the scenario occurring or to reduce its impacts if it does occur.
On a scale of 0 to 1, include the probability that the threat will be present.
On a scale of 0 to 1, include the probability of exploitation.
On a scale of 0 to 1, include the estimated expected damages.
On a scale of 0 to 1, include the maximal damages.
On a scale of 0 to 1, include the level of organizational resilience.
On a scale of 0 to 1, include the expected utility. All of these metrics must be on a scale of 0 to 1.
Calculate the CVSS version 3.1 score of the vulnerabilities; make your best estimate of the required specific details about the vulnerability, including its attack vector, complexity, privileges required, user interaction, scope, confidentiality impact, integrity impact, availability impact, and more. Provide the details of the metrics used in the CVSS calculation.
Present the CVSS score.
Present a detailed list of mitigation measures, internal controls, and prevention mechanisms that can be used. Include a detailed budgetary estimate in Canadian dollars ($) of the costs of implementing the proposed mitigation measures, internal controls, and prevention mechanisms.
Indicate the impact reduction and the probability reduction on a scale of 0 to 1 for the proposed mitigation measures, internal controls, and prevention mechanisms.
Save all these values to use in later calculations.
Calculate the KRI:
Using the results generated and presented in the previous queries, calculate the Key Risk Indicators (KRI) for the cybersecurity risk scenario Ransomware Attack: An attacker encrypts critical patient data and demands a ransom for the decryption key, at the Centre intégré universitaire de santé et de services sociaux du Centre-Ouest-de-l’Île-de-Montréal, using the following formulas:
Estimated risk = (Probability that the threat will be present × Probability of exploitation × CVSS Score × Expected utility × ((Estimated expected damages + Maximal damages) / 2)) / Level of organizational resilience.
Save this value to use in later calculations.
Tolerated risk = (Probability that the threat will be present × Probability of exploitation × CVSS Score × Expected utility × Risk appetite) / Level of organizational resilience.
Save this value to use in later calculations.
Mitigated risk = Estimated risk × (impact reduction of the proposed mitigation measures, internal controls, and prevention mechanisms × probability reduction of the proposed mitigation measures, internal controls, and prevention mechanisms).
Save this value to use in later calculations.
Corporate compliance is the cornerstone of business operations, ensuring adherence to legal obligations and regulatory standards. It encompasses conformity to established rules and guidelines that govern various aspects of organizational conduct.
Implementing robust risk management strategies is paramount for achieving corporate compliance. By identifying potential risks and vulnerabilities, organizations can proactively mitigate adverse events, thereby safeguarding their operations and reputation.
Furthermore, anti-corruption measures play a crucial role in upholding corporate compliance. These measures aim to prevent unethical behavior, bribery, and fraud within the organization, fostering an environment of transparency and integrity.
Compliance training is another integral component of maintaining corporate compliance. Educating employees on legal requirements and ethical standards contributes to a culture of observance and reduces the risk of violations.
In addition to this, data protection is fundamental in ensuring corporate compliance. Safeguarding sensitive information through encryption and access controls is essential for conforming to data protection regulations.
It's essential for businesses to prioritize corporate compliance as it not only mitigates legal risks but also fosters a culture of integrity within the organization.
Corporate Governance
Corporate governance is the framework of rules, practices, and processes by which a company is directed and controlled. It is essential for ensuring transparency, accountability, and ethical conduct within an organization. Effective corporate governance not only promotes integrity but also enhances stakeholder confidence in the company's operations and decision-making processes.
The board of directors plays a pivotal role in the oversight of corporate governance. They are responsible for providing strategic direction, monitoring executive management, and ensuring that the organization's activities align with its objectives. Their leadership and guidance are crucial in upholding the principles of corporate governance.
Furthermore, corporate governance frameworks guide decision-making processes and organizational behavior. These frameworks establish guidelines for how decisions are made, how authority is exercised, and how performance is monitored. By adhering to these frameworks, organizations can ensure that their operations are conducted with integrity and in alignment with ethical standards.
Effective corporate governance is fundamental for building trust among stakeholders and fostering a culture of ethical conduct within an organization.
Key Elements of Corporate Governance
The board of directors: Responsible for overseeing the company's management and operations.
Decision-making processes: Guided by established frameworks to ensure ethical conduct and adherence to regulations.
Organizational behavior: Governed by principles that promote transparency, accountability, and integrity.
The proper administration of these key elements ensures that corporate governance serves as a robust foundation for ethical business practices while maintaining stakeholder confidence in the organization's operations.
Ethical Conduct
In the realm of business, ethical conduct serves as the cornerstone of a positive corporate culture and plays a pivotal role in strengthening the reputation of an organization. Upholding strong ethical principles is essential for fostering trust and integrity within the workplace.
Ethics in Business
Ethics in business encompass a set of moral principles that guide decision-making and behavior within an organization. These principles revolve around integrity, honesty, and fairness, serving as the guiding light for all operations. By adhering to ethical standards, businesses can create an environment where employees feel valued and respected, leading to increased productivity and loyalty.
Benefits of Ethical Conduct
Embracing ethical conduct brings forth a multitude of benefits for businesses. Firstly, it promotes trust among stakeholders including employees, customers, and investors. When stakeholders have confidence in an organization's commitment to ethical practices, it paves the way for long-term relationships built on mutual respect and transparency.
Moreover, companies that prioritize ethical conduct often outperform their competitors. This is because ethical practices contribute to enhanced brand reputation and customer loyalty. Consumers are more inclined to support businesses that demonstrate a strong sense of morality and responsibility towards society.
In essence, integrating ethics into business operations not only fosters a positive work environment but also leads to sustainable growth and success.
Regulatory Standards
Regulatory standards serve as the bedrock for ensuring compliance with laws and industry regulations, providing a framework within which businesses must operate. These standards encompass a set of rules and guidelines that dictate the conduct and operations of organizations, aiming to uphold ethical practices and legal adherence.
Non-compliance with regulatory standards can have severe repercussions, leading to legal consequences and reputational damage. It is imperative for businesses to understand and adhere to these standards to avoid penalties and maintain their integrity within the industry.
Staying updated with evolving regulatory requirements is essential for organizations to ensure ongoing compliance. As regulations may change over time, businesses must continuously monitor and adapt their practices to align with the latest standards. This demonstrates a commitment to ethical business conduct and a willingness to operate within the boundaries of the law.
Risk Management
In the realm of business operations, risk management plays a pivotal role in identifying potential threats and vulnerabilities to an organization. By conducting thorough risk assessments, businesses can effectively evaluate and mitigate potential hazards that may impact their operations.
Risk Assessment and Mitigation
Risk assessment involves the systematic identification and analysis of potential risks that could jeopardize the organization's objectives. This process enables businesses to prioritize risks based on their likelihood and potential impact. Once risks are identified, effective risk mitigation strategies can be implemented to minimize the adverse effects of these events on business operations.
By proactively addressing potential threats, organizations can safeguard their assets, reputation, and financial stability. Effective risk mitigation not only reduces the likelihood of negative outcomes but also enhances overall resilience in the face of unforeseen challenges.
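As a minimal illustration of that prioritization step, the sketch below scores each risk as likelihood times impact and sorts the register so the highest-scoring risks are treated first; the register entries are invented placeholders.

```python
# Hypothetical risk register: (risk name, likelihood 0-1, impact 0-1).
register = [
    ("Ransomware outage", 0.6, 0.9),
    ("Vendor data leak", 0.4, 0.7),
    ("Payroll fraud", 0.2, 0.5),
]

# Score = likelihood x impact; the highest scores are mitigated first.
for name, likelihood, impact in sorted(register, key=lambda r: r[1] * r[2], reverse=True):
    print(f"{name}: {likelihood * impact:.2f}")
```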
Integration of Risk Management
Integrating risk management into the overall corporate compliance framework is essential for ensuring comprehensive governance and ethical conduct within an organization. By aligning risk management practices with compliance protocols, businesses can proactively identify areas of vulnerability and implement measures to address them.
Continuous monitoring and evaluation are fundamental components of proactive risk management. Regular assessments enable organizations to adapt to evolving risks and ensure that mitigation strategies remain effective in mitigating potential hazards. This ongoing process empowers businesses to maintain a robust culture of compliance while effectively managing potential threats.
Compliance Training
Compliance training is a vital component of an organization's efforts to uphold corporate compliance and ethical standards. It serves as a foundational tool for educating employees on legal requirements and industry-specific regulations, thereby fostering a culture of observance and integrity within the workplace.
Importance of Compliance Training
Educating Employees: Compliance training provides employees with essential knowledge about legal obligations, ethical standards, and regulatory requirements. This education empowers them to make informed decisions in their daily tasks while ensuring that their actions align with the organization's compliance protocols.
Culture of Compliance: Well-trained staff members contribute to the cultivation of a culture of compliance within the organization. By understanding and adhering to the prescribed guidelines, employees actively participate in upholding ethical conduct and minimizing the risk of regulatory violations.
Implementing Effective Compliance Training Programs
Customized compliance training programs are essential for addressing specific industry regulations and organizational policies. These programs are tailored to provide targeted education on relevant compliance areas, ensuring that employees understand how these regulations apply to their roles and responsibilities.
Regular updates to compliance training programs are crucial for keeping employees informed about the latest protocols and regulatory changes. By continuously refreshing their knowledge base, employees remain well-equipped to navigate evolving compliance requirements effectively.
Data Protection
In today's digital landscape, securing sensitive data is of paramount importance for organizations. Data protection measures are essential to safeguard sensitive information from unauthorized access and potential data breaches. Compliance with data protection regulations not only ensures legal adherence but also plays a critical role in maintaining customer trust and loyalty.
Securing Sensitive Data
Implementing robust encryption protocols is crucial for protecting sensitive data from unauthorized access. Encryption converts data into a code that can only be accessed with the appropriate decryption key, adding an extra layer of security to the information.
Access controls, such as multi-factor authentication and role-based access, further fortify the security of sensitive data. These controls restrict unauthorized individuals from gaining access to confidential information, minimizing the risk of data breaches.
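As a small illustration of these two points, the sketch below uses the Python cryptography library's Fernet recipe (symmetric, authenticated encryption) together with a toy role-based access check; the key handling and role table are simplified assumptions, not a production design.

```python
from cryptography.fernet import Fernet

# Generate a symmetric key (in practice, store it in a secrets manager, never in code).
key = Fernet.generate_key()
cipher = Fernet(key)

# Encrypt a sensitive record; only holders of the key can decrypt it.
token = cipher.encrypt(b"record-id=4521; status=confidential")
print(cipher.decrypt(token))  # b'record-id=4521; status=confidential'

# A toy role-based access check (hypothetical roles, for illustration only).
ROLE_CAN_DECRYPT = {"clinician": True, "auditor": False}

def read_record(role: str) -> bytes:
    if not ROLE_CAN_DECRYPT.get(role, False):
        raise PermissionError(f"role '{role}' may not decrypt this record")
    return cipher.decrypt(token)
```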
Data Privacy Best Practices
Data protection policies should align with international data privacy standards and regulations to ensure comprehensive privacy protection. By adhering to these best practices, organizations can demonstrate their commitment to upholding the privacy rights of individuals and complying with global privacy standards.
Anti-Corruption Measures
In the realm of corporate governance and ethical conduct, anti-corruption measures play a pivotal role in upholding integrity and preventing unethical behavior within organizations. These measures encompass a range of initiatives aimed at combating bribery, fraud, and other forms of corruption, thereby fostering an environment of transparency and accountability.
Combatting Corruption
The implementation of anti-corruption measures is essential for promoting ethical conduct and maintaining the integrity of business operations. By establishing clear guidelines and principles, organizations can actively prevent instances of bribery, fraud, and other corrupt practices. Transparency in financial transactions and accountability in decision-making processes are fundamental principles that underpin the fight against corruption.
Enforcing Anti-Corruption Policies
To effectively combat corruption, organizations should establish robust anti-corruption policies and procedures. These policies outline the expected standards of conduct and provide guidance on identifying and addressing potential instances of corruption. Regular audits and investigations serve as proactive measures to detect any irregularities or unethical behavior, enabling organizations to take appropriate actions to address such issues.
The integration of integrity initiatives into everyday business practices reinforces the commitment to ethical conduct while deterring corrupt behavior within the organization.
Legal Obligations
In the realm of business operations, organizations are entrusted with the responsibility of upholding various legal obligations, encompassing local, national, and international regulatory responsibilities. Compliance with these legal duties is fundamental for preserving the integrity of businesses and avoiding potential penalties that may arise from non-compliance.
Compliance with Legal Requirements
Adhering to legal requirements is a cornerstone of ethical corporate conduct. Organizations must navigate a complex landscape of statutory obligations to ensure that their operations align with established laws and regulations. By fulfilling these compliance mandates, businesses demonstrate their commitment to upholding ethical standards and operating within the boundaries of the law.
Navigating Complex Legal Frameworks
Navigating complex legal frameworks requires a comprehensive understanding of statutory obligations and regulatory responsibilities. Legal departments play a pivotal role in interpreting and implementing these intricate legal requirements within the organizational context. Their expertise enables businesses to navigate nuanced legal landscapes while ensuring adherence to evolving regulations.
Furthermore, consulting legal experts provides organizations with invaluable insights into ever-changing legal frameworks. By seeking guidance from professionals well-versed in statutory obligations, businesses can stay abreast of dynamic regulatory environments and maintain compliance with evolving legal standards.
The seamless integration of legal expertise into business practices empowers organizations to uphold ethical conduct while navigating multifaceted legal landscapes effectively.
Ensuring Compliance
Adhering to corporate compliance standards is imperative for safeguarding organizations from potential legal and reputational risks. By prioritizing the observance of rules and regulations, businesses can establish a foundation of integrity and ethical conduct within their operations.
Continuous education and training play a pivotal role in maintaining a culture of compliance within organizations. Regular training programs not only reinforce the importance of conformity to regulations but also empower employees to make informed decisions aligned with the organization's adherence to standards.
Ethical conduct and regulatory adherence serve as fundamental pillars of sustainable corporate compliance. By integrating these principles into the fabric of business practices, organizations can cultivate an environment where ethical behavior is valued, and regulatory standards are consistently upheld.
It is through the collective commitment to ethical conduct and regulatory observance that businesses can ensure sustainable compliance, fostering trust among stakeholders and mitigating potential risks effectively.
Blockchain technology is a foundational innovation that underpins various forms of cryptocurrencies and has applications in numerous fields beyond finance, including supply chain management, healthcare, and cybersecurity. At its core, a blockchain is a distributed ledger or database, shared across a network of computers, which records transactions in a secure, transparent, and tamper-resistant manner.
Fundamental Concepts of Blockchain:
Blocks: Each block in a blockchain contains a list of transactions. Every new block created is linked to the previous block, thus forming a chain. This linkage is achieved through a cryptographic hash, a unique identifier representing the data in the previous block. This ensures that once a block is added to the chain, the data it contains is immutable and cannot be altered without changing all subsequent blocks, which requires consensus from the network majority.
Decentralization: Unlike traditional centralized systems where a single entity has control, blockchains are decentralized and distributed across a network of computers, known as nodes. Each node has a copy of the entire blockchain, and any changes or additions to the blockchain must be verified and agreed upon by consensus mechanisms, ensuring no single entity can control or alter the data unilaterally.
Transparency and Anonymity: Transactions on the blockchain are visible to everyone within the network, ensuring transparency. However, the identities of the individuals making the transactions are encrypted and represented by complex addresses, providing a level of anonymity.
Consensus Mechanisms: For a transaction to be added to the blockchain, it must be validated by the nodes in the network. Different blockchains use various consensus mechanisms to agree on the validity of transactions. The most common are Proof of Work (PoW) and Proof of Stake (PoS).
Proof of Work: This is used by Bitcoin and involves solving complex mathematical puzzles to validate transactions and create new blocks. This process is known as mining and requires significant computational power.
Proof of Stake: This is a more energy-efficient method where validators are chosen to create new blocks based on the number of coins they hold and are willing to “stake” as collateral.
Smart Contracts: Smart contracts are self-executing contracts with the terms of the agreement between buyer and seller being directly written into lines of code. They run on the blockchain and automatically execute actions when predetermined conditions are met, without the need for intermediaries. This can revolutionize various industries by providing a secure, automatic method of enforcing contracts.
How Blockchain Works in Practice:
When a new transaction is made, it is broadcast to a peer-to-peer network of computers scattered across the world. The nodes in this network then run known algorithms to validate the transaction and the user’s status. Upon validation, the transaction is combined with other transactions to create a new block of data for the ledger. The new block is then added to the existing blockchain in a way that is permanent and unalterable. In a Proof of Work system, this process is known as “mining”.
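To make the hash linkage and mining ideas concrete, here is a toy Python sketch of a hash-chained ledger with a simplified Proof-of-Work nonce search; the block fields, JSON encoding, and difficulty are illustrative choices, far simpler than a real chain such as Bitcoin's.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    # Hash the block's canonical JSON form, which includes the previous block's hash.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def mine(transactions, prev_hash, difficulty=4):
    # Toy Proof of Work: find a nonce so the block hash starts with `difficulty` zeros.
    nonce = 0
    while True:
        block = {"transactions": transactions, "prev_hash": prev_hash, "nonce": nonce}
        h = block_hash(block)
        if h.startswith("0" * difficulty):
            return block, h
        nonce += 1

genesis, genesis_hash = mine(["coinbase -> alice"], prev_hash="0" * 64)
block1, block1_hash = mine(["alice -> bob: 5"], prev_hash=genesis_hash)

# Tampering with the genesis block changes its hash, breaking the link to every later block.
genesis["transactions"] = ["coinbase -> mallory"]
assert block_hash(genesis) != block1["prev_hash"]
```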
Implications for Information Technology and Cybersecurity:
Security: Once data has been recorded onto a blockchain, it is extremely difficult to change. To alter any one record, one would need to alter all subsequent records, which requires the consensus of the network majority.
Decentralization: By removing the need for a central authority, blockchain technology greatly reduces the risk of centralized corruption or failure.
Transparency: All transactions are visible to users, which can provide a high level of trust and security as all changes are publicly audited.
Innovation in Contracts: Smart contracts automate and enforce the terms of agreement, which can lead to a new era of secure and decentralized contract management.
Despite its potential, blockchain isn’t without its drawbacks, such as significant energy consumption for certain consensus mechanisms like Proof of Work, the potential for complex security issues, and a current lack of regulation and standardization that can lead to instability and misuse.
The applications of blockchain could be particularly transformative in many fields. For instance, in cybersecurity, blockchain could enhance the integrity and confidentiality of data, reduce fraudulent activities, and ensure the authenticity of information. In academia, blockchain could be leveraged for secure, transparent, and efficient sharing of academic works, verification of credentials, and even in the peer review process, potentially revolutionizing the academic publishing industry.
Understanding blockchain’s workings, its potential, and its challenges is crucial for researchers and professionals in IT and cybersecurity to innovate and build more secure, efficient, and transparent systems.
Generative AI, such as ChatGPT, is revolutionizing fraud detection by leveraging advanced technologies like artificial intelligence and machine learning. Fraud examiners can now harness the power of ChatGPT to enhance their investigations and stay ahead of evolving fraud schemes. This comprehensive guide explores the applications, benefits, and limitations of generative AI in fraud detection, providing valuable insights for fraud examiners, AI enthusiasts, and technology professionals.
ChatGPT, a cutting-edge conversational AI language model developed by OpenAI, has the ability to generate human-like text responses based on given prompts. Its sophisticated algorithms enable it to understand context and generate coherent and relevant responses. In the context of fraud detection, ChatGPT can be utilized as a powerful tool for detecting fraudulent activities and implementing effective anti-fraud measures.
With its vast knowledge base and ability to analyze large volumes of data quickly, ChatGPT can assist fraud examiners in identifying patterns indicative of fraudulent behavior. By analyzing transactional data, customer interactions, and other relevant information, ChatGPT can help detect anomalies that may indicate potential fraud. Its AI-generated insights provide valuable support to fraud examiners in their efforts to prevent financial losses due to fraudulent activities.
The integration of generative AI like ChatGPT into existing fraud detection systems enhances efficiency by automating certain processes that were previously done manually. This not only saves time but also reduces the risk of human error. Fraud examiners can focus their expertise on investigating complex cases while relying on ChatGPT’s capabilities for initial analysis and identification of suspicious activities.
In addition to its analytical capabilities, ChatGPT also enables real-time monitoring of transactions and activities. Its continuous monitoring capabilities allow for instant alerts when potential fraudulent behavior is detected. This proactive approach helps organizations take immediate action to prevent further damage or financial loss.
As with any technological advancement in the field of fraud detection, there are limitations that need to be considered when utilizing generative AI like ChatGPT. While it excels at pattern recognition and analysis based on historical data, it may struggle with detecting emerging or unknown types of fraud that do not fit established patterns. Therefore, it is important for fraud examiners to continuously update their knowledge base and collaborate with other experts in the field to stay ahead of evolving fraudulent schemes.
Key Challenges Faced by Fraud Examiners
Fraud examiners, also known as fraud analysts, financial crime investigators, or anti-fraud professionals, play a critical role in detecting and preventing illicit activities. However, they face significant challenges due to the increasing complexity of fraud schemes and the voluminous data they need to analyze.
Increasing Complexity of Fraud Schemes
Fraudsters are constantly evolving their tactics to stay one step ahead of detection. They employ sophisticated techniques and deceptive practices that can be difficult to identify. From identity theft and account takeovers to money laundering and insider fraud, fraud schemes have become more intricate and harder to detect.
Fraud examiners must continuously update their knowledge and skills to keep pace with these evolving fraud schemes. They need to stay informed about the latest trends in fraudulent tactics and understand the underlying mechanisms behind them. This requires ongoing training, collaboration with other experts in the field, and staying up-to-date with industry best practices.
Voluminous Data to Analyze
The digital age has brought an explosion of data, making it challenging for fraud examiners to sift through vast amounts of information effectively. Transactional data, customer records, communication logs, social media feeds – all contribute to the ever-growing pool of data that needs analysis.
Manual analysis of such voluminous data is not only time-consuming but also prone to errors. Fraud examiners may miss crucial patterns or indicators amidst the overwhelming amount of information they have to process manually. Additionally, human biases can inadvertently influence their decision-making process.
To overcome this challenge, fraud examiners are increasingly turning towards technology-driven solutions like generative AI. By leveraging advanced machine learning algorithms and natural language processing capabilities, generative AI can help automate data analysis tasks. It can quickly identify patterns and anomalies within large datasets that may indicate fraudulent activities.
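Alongside generative models, classical unsupervised learning is a common first pass over voluminous transaction data. As a hedged illustration, the sketch below runs scikit-learn's IsolationForest over synthetic transaction amounts and hours; the data and contamination rate are invented for the example.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)
# Synthetic transactions: [amount, hour of day], plus a few injected outliers.
normal = np.column_stack([rng.normal(80, 20, 500), rng.normal(14, 3, 500)])
outliers = np.array([[9500, 3], [7200, 2], [8800, 4]])
X = np.vstack([normal, outliers])

model = IsolationForest(contamination=0.01, random_state=42).fit(X)
flags = model.predict(X)  # -1 = anomaly, 1 = normal
print(X[flags == -1])     # should include the injected high-value, late-night transactions
```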
Leveraging Generative AI for Fraud Detection
Generative AI, with its advanced capabilities in pattern recognition and analysis, has emerged as a powerful tool for fraud detection. By leveraging generative AI, such as ChatGPT, fraud examiners can enhance their ability to identify potential fraudulent activities and implement effective anti-fraud measures.
Automated Fraud Pattern Recognition
Generative AI excels at analyzing patterns in data to identify potential fraud. With its ability to process large volumes of data quickly and accurately, it can detect anomalies that may indicate fraudulent behavior. By training on historical data and learning from past instances of fraud, generative AI models like ChatGPT can recognize patterns that human analysts might miss.
ChatGPT’s sophisticated algorithms enable it to flag suspicious activities based on the patterns it detects. This automated pattern recognition not only saves time but also improves the efficiency of fraud detection efforts. It allows fraud examiners to focus their expertise on investigating complex cases rather than spending valuable time manually analyzing vast amounts of data.
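As a hedged sketch of how ChatGPT might be wired into such a pipeline, the snippet below sends a suspicious-transaction summary to the OpenAI Chat Completions API and asks for a triage label; the model name, prompt, and labels are assumptions, and a production system would add validation, logging, and human review of every output.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

transaction = (
    "Card 4521: 14 purchases of $49.99 at the same merchant within "
    "6 minutes, from an IP in a country the cardholder has never visited."
)

resp = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model; any chat-capable model works
    messages=[
        {"role": "system",
         "content": "You are a fraud-triage assistant. Reply with one label: "
                    "LIKELY_FRAUD, SUSPICIOUS, or LIKELY_LEGITIMATE, "
                    "followed by one sentence of rationale."},
        {"role": "user", "content": transaction},
    ],
)
print(resp.choices[0].message.content)
```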
Real-time Fraud Monitoring
Generative AI enables real-time monitoring of transactions and activities, providing instant alerts for potential fraudulent behavior. By continuously analyzing incoming data streams, ChatGPT can identify suspicious patterns or deviations from normal behavior in real-time. This proactive approach allows organizations to take immediate action to prevent further damage or financial loss.
Real-time fraud monitoring powered by generative AI enhances the effectiveness of anti-fraud measures by enabling quick response times. Organizations can implement automated systems that integrate with ChatGPT to monitor transactions, customer interactions, and other relevant data sources. Any detected anomalies trigger instant alerts, allowing fraud examiners to intervene promptly and mitigate potential risks.
The combination of automated fraud pattern recognition and real-time monitoring provided by generative AI significantly strengthens the overall fraud detection capabilities of organizations. It complements the expertise of fraud examiners by augmenting their efforts with advanced machine learning algorithms and artificial intelligence technologies.
Ethical Implications and Bias in Generative AI for Fraud Detection
As generative AI, including ChatGPT, becomes increasingly integrated into fraud detection systems, it is essential to consider the ethical implications and potential biases associated with its use.
Ensuring Ethical Use of Generative AI
Ethical considerations are crucial when implementing generative AI in fraud detection. Organizations must ensure that the use of ChatGPT and other generative AI models aligns with legal and regulatory requirements. This includes obtaining proper consent, protecting user privacy, and ensuring transparency in how the technology is used.
One critical aspect of ethical implementation is training generative AI models on unbiased data. Biases present in training data can be inadvertently learned by the model, leading to biased outputs or decisions. To avoid perpetuating existing biases, it is important to carefully curate training datasets that represent diverse populations and avoid discriminatory patterns.
Addressing Potential Biases
Generative AI models like ChatGPT can inadvertently learn biases present in the training data. These biases may arise from historical imbalances or societal prejudices reflected in the data. It is crucial to address these biases to ensure fair and equitable outcomes.
Continuous monitoring and evaluation are necessary to identify and mitigate potential biases in generative AI for fraud detection. Regularly reviewing model outputs, analyzing performance across different demographic groups, and soliciting feedback from diverse stakeholders can help uncover any unintended biases. By actively addressing these issues, organizations can work towards developing more reliable and unbiased fraud detection systems.
Additionally, ongoing research and advancements in machine learning techniques aim to reduce bias in generative AI models. Techniques such as debiasing algorithms or using adversarial training can help mitigate bias by explicitly accounting for fairness during model development.
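One simple, concrete monitoring check is to compare flag rates across groups, a demographic-parity style metric. The sketch below computes it over hypothetical triage outputs; the groups and outcomes are invented placeholders.

```python
from collections import defaultdict

# Hypothetical triage results: (group, was_flagged).
results = [("A", True), ("A", False), ("A", False), ("A", False),
           ("B", True), ("B", True), ("B", False), ("B", False)]

counts = defaultdict(lambda: [0, 0])  # group -> [flagged, total]
for group, flagged in results:
    counts[group][0] += int(flagged)
    counts[group][1] += 1

rates = {g: flagged / total for g, (flagged, total) in counts.items()}
print(rates)  # {'A': 0.25, 'B': 0.5}

# A large gap between groups is a signal to audit the model and its training data.
gap = max(rates.values()) - min(rates.values())
print(f"flag-rate gap: {gap:.2f}")
```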
By proactively addressing ethical considerations and potential biases associated with generative AI for fraud detection, organizations can ensure responsible implementation of this technology while upholding fairness, transparency, and trustworthiness.
The Future of ChatGPT in Fraud Detection
As technology continues to advance, the future of ChatGPT in fraud detection looks promising. Ongoing advancements in ChatGPT’s natural language understanding capabilities are expected to enhance its effectiveness in detecting and preventing fraudulent activities.
Advancements in ChatGPT Technology
ChatGPT is continuously evolving, with improvements being made to its underlying algorithms and training methodologies. These advancements enable better comprehension of complex fraud-related scenarios and enhance the accuracy of its responses. As a result, future iterations of ChatGPT may have even more robust fraud detection capabilities.
The continuous development of ChatGPT technology opens up new possibilities for fraud prevention. By leveraging its conversational AI and language modeling capabilities, organizations can deploy more sophisticated anti-fraud measures that go beyond traditional rule-based systems. ChatGPT’s ability to understand context and generate human-like responses makes it a valuable asset in identifying and addressing fraudulent activities.
Collaboration between AI and Fraud Examiners
Human-AI collaboration holds great potential for improving fraud detection outcomes. While ChatGPT provides powerful analytical capabilities, human fraud examiners bring domain expertise and contextual understanding to the table. By working together, they can refine ChatGPT’s performance by providing feedback, validating results, and fine-tuning the system based on their knowledge of fraudulent behaviors.
Collaboration between AI and fraud examiners enables a symbiotic relationship where each party complements the strengths of the other. Fraud examiners can leverage ChatGPT’s analytical capabilities to process large volumes of data quickly while focusing their expertise on investigating complex cases or interpreting nuanced patterns that require human judgment.
The future integration of generative AI like ChatGPT with human expertise has the potential to revolutionize fraud prevention strategies. By combining the power of advanced technologies with human insight, organizations can achieve more effective fraud detection outcomes while adapting to evolving tactics employed by fraudsters.
Conclusion
Generative AI, exemplified by ChatGPT, holds immense potential in revolutionizing fraud detection. Its advanced capabilities in analyzing patterns, detecting anomalies, and providing real-time monitoring make it a valuable tool for organizations combating fraudulent activities.
However, responsible implementation is crucial. Ethical considerations and bias mitigation must be at the forefront of utilizing generative AI for fraud detection. By ensuring that ChatGPT is trained on unbiased data and continuously monitoring for potential biases, organizations can maintain fairness and avoid perpetuating existing prejudices.
Furthermore, collaboration between AI and fraud examiners is key to unlocking the full potential of generative AI in fraud prevention. Human expertise combined with the analytical power of ChatGPT leads to more effective fraud detection outcomes.
As technology continues to evolve, ChatGPT will likely see advancements in its natural language understanding capabilities. These improvements will further enhance its ability to detect fraudulent activities accurately.
In conclusion, generative AI like ChatGPT has the capacity to transform the field of fraud detection. With ethical implementation, collaboration between human experts and AI systems, and ongoing technological advancements, organizations can stay one step ahead in the fight against fraud.
The Chief Information Security Officer (CISO) plays a pivotal role in the modern organization, embodying the confluence of technological expertise, strategic planning, and executive leadership. This article draws upon three comprehensive documents to elaborate on the evolving role of the CISO, focusing on the nuances of reporting structures, the multifaceted challenges they face, and their integral role in shaping an organization’s cybersecurity posture.
The Evolving Role of the CISO
In an era where cyber threats are omnipresent and increasingly sophisticated, the role of the Chief Information Security Officer has evolved significantly. Once regarded primarily as a technical position, the CISO is now a strategic partner at the executive level, balancing technical acumen with an acute understanding of business processes and risks. This evolution reflects the growing recognition of information security as a critical component of overall business strategy, necessitating a shift in how organizations structure and perceive the CISO function.
Reporting Structures and Organizational Dynamics
Variability in Reporting Structures
One of the most notable aspects of the CISO role is the variability in its reporting structure within organizations. This lack of uniformity stems from multiple factors, including industry type, organizational culture, and the maturity of an organization’s cybersecurity practices. Studies and surveys reveal a diverse range of reporting lines, with some CISOs reporting directly to the CEO, while others are positioned under the CIO, COO, or other executive roles. This variance is not merely a matter of hierarchical positioning but reflects deeper organizational values and perceptions regarding the role of cybersecurity.
Influence of Organizational Complexity and Decision Making
Organizational complexity plays a significant role in determining the CISO’s position. Factors like organizational size, industry-specific risks, and the complexity of the IT infrastructure influence where the CISO is placed within the hierarchy. Decision-making dynamics within the organization also significantly impact the reporting structure. In some cases, fear of accountability in the event of a security breach might lead to resistance against elevating the CISO to higher executive levels. In contrast, a rational analysis of the cybersecurity threats and their business implications might prompt a more prominent role for the CISO.
The Impact of Cybersecurity Maturity
The maturity of an organization’s cybersecurity practices is a critical determinant of the CISO’s role and stature. In organizations where cybersecurity is seen as integral to business strategy and risk management, the CISO is more likely to be a part of the executive team, directly influencing key decisions. Conversely, in organizations where cybersecurity maturity is low, the CISO may struggle to gain visibility and influence, often relegated to a more technical and less strategic role.
The Multifaceted Responsibilities of CISOs
Beyond Technical Expertise: Business Acumen and Strategic Leadership
The modern CISO transcends the traditional boundaries of IT security, requiring a blend of technical knowledge, business acumen, and strategic foresight. This expanded role involves understanding and articulating the business implications of cyber risks, educating the board and workforce, and aligning security strategies with overall business objectives. The CISO is also expected to be an effective communicator, capable of translating technical risks into business language and influencing decision-making processes at the highest organizational levels.
Addressing the Challenge of Perception and Communication
One of the significant challenges faced by CISOs is overcoming perceptions that view them primarily as technical experts, rather than strategic business partners. This challenge is compounded by the need to communicate complex cybersecurity concepts in a manner that is accessible and relevant to non-technical stakeholders. Effective CISOs are those who can bridge this gap, demonstrating not only technical proficiency but also a deep understanding of the business landscape and its inherent risks.
Navigating Organizational Politics and Power Dynamics
The CISO’s role is often at the intersection of various organizational power dynamics and politics. Navigating these complex relationships is crucial for the CISO to be effective. This includes managing conflicts of interest, particularly when reporting to roles such as the CIO, where operational priorities might conflict with security imperatives. The ability to manage these dynamics, aligning security goals with broader organizational objectives, is a key aspect of the CISO’s effectiveness.
Strategic Integration of CISOs in Business Processes
Alignment with Business Goals
The CISO’s responsibility extends to ensuring that cybersecurity strategies are not only about protecting assets but also about enabling business operations. This alignment involves understanding the organization’s business goals and tailoring cybersecurity measures to support these objectives. The CISO’s role includes identifying and mitigating risks that may impede the organization’s ability to achieve its goals, ensuring that cybersecurity measures do not hinder business agility and innovation.
Risk Management and Compliance
In the context of increasing regulatory demands, the CISO plays a vital role in ensuring compliance with laws and regulations related to data security and privacy. This responsibility involves staying abreast of legal changes, understanding their implications for the organization, and implementing strategies to ensure compliance. By integrating compliance into the cybersecurity strategy, the CISO helps protect the organization from legal and reputational risks.
Crisis Management and Incident Response
The ability to manage crises effectively is a critical component of the CISO’s role. This includes the development and implementation of comprehensive incident response plans that outline procedures for addressing security breaches. Effective crisis management not only minimizes the damage caused by security incidents but also helps maintain stakeholder trust and organizational reputation.
Enhancing the Organizational Cybersecurity Culture
Fostering a Culture of Security Awareness
A key aspect of the CISO’s role is to cultivate a culture of cybersecurity awareness throughout the organization. This involves regular training and education programs for employees, emphasizing the importance of security in everyday operations. By making cybersecurity a shared responsibility, CISOs can strengthen the organization’s overall security posture.
Championing Cybersecurity Initiatives
The CISO should act as a champion for cybersecurity initiatives, advocating for the necessary resources and support from the executive team and the board. This includes justifying investments in security technologies and personnel, as well as ensuring that cybersecurity is considered in all major business decisions.
Technology Leadership and Innovation
Keeping Pace with Technological Advancements
Given the rapidly evolving nature of cyber threats, the CISO must stay abreast of technological advancements and emerging threats. This requires a commitment to continuous learning and adaptation, ensuring that the organization’s cybersecurity measures remain effective against current and future threats.
Integration of Advanced Technologies
The CISO plays a crucial role in evaluating and integrating advanced security technologies such as artificial intelligence, machine learning, and blockchain into the organization’s security architecture. This involves not only technical expertise but also a strategic vision to foresee how these technologies can enhance the organization’s cybersecurity capabilities.
Future Challenges and Opportunities for CISOs
Evolving Cyber Threat Landscape
The cyber threat landscape is constantly evolving, presenting new challenges for CISOs. This includes the rise of sophisticated cyber-attacks such as ransomware, phishing, and state-sponsored attacks. CISOs must continuously adapt their strategies to address these emerging threats.
The Expanding Scope of Cybersecurity
As technology becomes increasingly integrated into all aspects of business operations, the scope of the CISO’s role is expanding. This includes ensuring the security of cloud services, Internet of Things (IoT) devices, and remote work infrastructures. The CISO must develop strategies that address the security challenges associated with these diverse and distributed technological ecosystems.
The Growing Importance of Data Privacy
With increasing global attention on data privacy and consumer rights, the CISO’s role in protecting personal data is more critical than ever. This involves not only implementing technical measures to secure data but also ensuring that data handling practices comply with privacy regulations and ethical standards.
Conclusion
The role of the Chief Information Security Officer is multifaceted and continually evolving, encompassing responsibilities that range from technical expertise and crisis management to strategic planning and business alignment. As organizations navigate an increasingly complex and risky digital environment, the CISO's dual role as guardian of information assets and business enabler becomes ever more critical.
As cyber threats grow more sophisticated, the need for skilled, strategic, and business-savvy CISOs has never been greater. The effective CISO must navigate a demanding landscape of technical challenges, organizational dynamics, and executive leadership, safeguarding the organization's digital assets while ensuring that cybersecurity strategies remain aligned with, and integral to, overall business success.
In today's digital landscape, achieving cyber maturity is crucial for businesses to protect their assets and maintain a competitive edge. Cyber maturity refers to the level of cybersecurity governance and management within an organization. It represents the organization's ability to effectively identify, assess, and mitigate cyber risks. By implementing robust cybersecurity governance practices, businesses can enhance their security posture and minimize the likelihood of cyber threats.
Cybersecurity governance encompasses the policies, processes, and controls that guide an organization's approach to managing information security. It involves establishing clear roles and responsibilities, defining risk management strategies, and ensuring compliance with relevant regulations. A strong cybersecurity governance framework provides a structured approach to safeguarding sensitive data, maintaining customer trust, and preventing financial losses.
The Significance of Effective Cybersecurity Governance
Effective cybersecurity governance is of utmost importance in today's digital landscape. It plays a vital role in establishing a robust security posture for businesses, enabling them to mitigate risks and protect their valuable assets.
Establishing a Robust Security Posture
A strong cybersecurity governance framework provides organizations with the necessary tools and strategies to establish a robust security posture. By implementing effective security measures, businesses can safeguard their networks, systems, and data from potential threats. This includes implementing firewalls, intrusion detection systems, encryption protocols, and access controls. A robust security posture not only protects against external cyberattacks but also helps prevent internal vulnerabilities and insider threats.
Maintaining Compliance and Regulatory Requirements
Compliance with industry regulations and legal requirements is essential for businesses operating in today's digital landscape. Effective cybersecurity governance ensures that organizations meet these compliance and regulatory requirements. By adhering to standards such as the General Data Protection Regulation (GDPR) or the Payment Card Industry Data Security Standard (PCI DSS), businesses can avoid penalties, legal consequences, reputational damage, and loss of customer trust. A strong cybersecurity governance framework includes regular audits, risk assessments, and incident response plans to ensure ongoing compliance.
Key Components for a Strong Cybersecurity Governance Framework
A strong cybersecurity governance framework consists of key components that are essential for effective management of cybersecurity risks.
Risk Management
Risk management is a crucial component of a strong cybersecurity governance framework. It involves identifying and assessing potential risks to an organization's information assets, systems, and networks. By conducting thorough risk assessments, businesses can understand their vulnerabilities and prioritize mitigation efforts. This includes implementing controls and safeguards to reduce the likelihood and impact of cyber threats. Regular monitoring and updating of risk management strategies ensure ongoing protection against emerging threats.
Policy and Procedure Development
Developing comprehensive policies and procedures is essential for effective cybersecurity governance. Clear guidelines provide employees with a roadmap for implementing security measures consistently throughout the organization. Policies should cover areas such as data classification, access control, incident response, employee training, and third-party vendor management. Procedures outline step-by-step instructions on how to carry out specific security tasks or respond to incidents. Regular review and updates to policies and procedures ensure alignment with evolving cyber threats and regulatory requirements.
Implementing Best Practices for Cybersecurity Governance
Implementing best practices for cybersecurity governance is crucial to ensure the effectiveness of security measures within an organization.
Employee Training and Awareness
Providing regular training and raising awareness among employees is a best practice for cybersecurity governance. Educated employees are more likely to follow security protocols and identify potential threats. Training programs should cover topics such as password hygiene, phishing awareness, social engineering, and safe browsing habits. By fostering a culture of cybersecurity awareness, businesses can empower their employees to become the first line of defense against cyberattacks. Regularly updating training materials and conducting simulated phishing exercises can further enhance employee preparedness.
Continuous Monitoring and Evaluation
Continuous monitoring and evaluation of security measures are essential for effective cybersecurity governance. Regular assessments help identify vulnerabilities and take necessary actions to address them promptly. This includes implementing intrusion detection systems, log monitoring tools, and network traffic analysis solutions. By continuously monitoring systems and networks, organizations can detect any suspicious activities or anomalies that may indicate a potential breach. Ongoing evaluation allows for the identification of gaps in security controls or emerging threats that require immediate attention.
Achieving Cyber Maturity through Governance
Effective cybersecurity governance is the key to achieving cyber maturity. By implementing a strong cybersecurity governance framework and following best practices, businesses can protect their assets and mitigate risks. A comprehensive approach to cybersecurity governance ensures that organizations have the necessary policies, procedures, and controls in place to safeguard against cyber threats. It involves continuous monitoring, risk management, employee training, and compliance with regulatory requirements. By prioritizing cybersecurity governance, businesses can enhance their security posture, maintain customer trust, and stay ahead of evolving cyber threats.
In 2009, ISACA launched its first information risk framework: Risk IT. Risk IT relies on COBIT 4, the IT governance framework that, according to ISACA, provides the missing link between traditional business risk management and information risk management and control.
One of the main ideas behind ISACA's approach is that companies earn a return on investment (ROI) by taking risks, yet they sometimes try to eliminate risks that actually contribute to profit. Risk IT was designed to help companies maximize their return on opportunities by managing risks effectively rather than trying to eliminate them entirely.
In April 2012, ISACA released COBIT 5, a new version of its framework, presenting it as a major evolution of its IS governance and management framework. One of the main novelties of COBIT 5 is to approach the information system (IS), beyond the processes already put forward by COBIT 4.1, through complementary themes as part of a holistic (systemic) approach. As part of this upgrade, the COBIT 5 processes were adapted to converge better with other frameworks such as ISO 27002, CMMI and ITIL.
With COBIT 5, Risk IT was integrated into COBIT 5 for Risk (referred to as COBIT 5GR in this text). In the spirit of COBIT, this version defines IT risk as an element of business risk, specifically the business risk related to the use, ownership, operation and adoption of IT in a company. COBIT 5GR aims to:
enable stakeholders to gain a better understanding of the current state and effects of risk across the enterprise;
advise on how to manage risk at all levels, including a broad set of risk mitigation measures;
advise on how to put an appropriate risk culture in place;
provide risk assessment guidance that allows stakeholders to consider the cost of mitigation measures and the resources needed to counter the risk of loss;
identify opportunities to integrate IT risk management with enterprise risk management;
improve communication and understanding between all internal and external stakeholders.
In this text, I propose a technique for applying COBIT 5GR to perform a risk analysis in an organization. The technique is based on generic key risk indicators (KRIs) that must be adapted for use in a specific context. This text is presented here for training purposes and to stimulate discussion with ISACA members and information risk managers. The full text will appear in the next edition of my book on information risk management. If you have comments: marcandre@leger.ca
Application of the method
As mentioned in my risk management courses and publications, from a theoretical point of view, risk management is accomplished through an IPM process. Identification (I) and prioritization (P) are risk analysis processes. The third phase, mobilization (M), is the implementation of the decisions made during the identification (I) and prioritization (P) phases. An audit phase is added to these to complete the process. The first actions to take are in the identification phase (I); these tasks are performed during the risk analysis.
This text presents the stages of applying ISACA's COBIT 5GR information risk analysis methodology. The elements presented here should be considered the list of activities that the organization in general, and the risk analyst in particular, must perform to use the COBIT 5 for Risk methodology. This activity list can be used to create a project plan or checklist for carrying out the risk analysis.
As with any risk analysis process, the main organizational benefit of using it, even in a context different from the one for which it was produced, is the improvement in organizational maturity that results from introspection and thinking about risk.
The activity EDM03 Ensure risk optimization of COBIT 5GR (EDM03 Area: Governance, Domain: Evaluate, Direct and Monitor) must be carried out before starting the risk analysis. In particular, the organization must put in place a risk governance framework that will be used in the risk analysis. In the context of COBIT 5GR, this means completing tasks EDM03.1 and EDM03.2. This exercise can be done by calling on an information risk governance committee.
EDM03.1 Evaluate risk management: The organization must continually review and make judgments about the effect of risk on the current and future use of IT in the organization. It must ask whether the organization's risk appetite is appropriate and whether its risks related to the use of IT are identified and managed.
EDM03.2 Direct risk management: The organization shall direct the implementation of best risk management practices that can provide reasonable assurance that its information risk management practices are appropriate, ensuring that actual information risk does not exceed the organization's risk tolerance as determined by the board of directors.
Before the risk analysis begins, it is also necessary to identify the analyst who will be responsible for conducting the study. The analyst will be responsible for completing the study, creating the documents, communicating with the participants and managing the project. The risk analysis is thus managed as a project using project management techniques, which are not addressed in this course. The analyst must have the necessary skills and training to carry out the risk analysis (APO12.02.4.5).
If you have not done so already, it is necessary to complete the following steps before you can do the risk analysis:
Inventory of information assets : Before starting, it is necessary to carry out an inventory of the organization's information assets. This corresponds to activity APO12.03.1: the organization must make an inventory of business processes, including supporting processes for human resources, applications, infrastructures, facilities, critical registries, suppliers and subcontractors. It must understand and document the interdependencies between IT service management processes, information systems and technology infrastructure (APO12.03.1.1).
Categorization of information assets : Information assets must be categorized, a step not defined in this course. Categorization should be done using an appropriate measurement scale based on the organization's security objectives. At a minimum, it is recommended that each information asset be characterized by its availability, integrity and confidentiality (DIC) attributes using an ordinal scale (for example, low, medium, high); a minimal sketch of such an inventory follows this list.
Setting up a project committee : In order to monitor the risk analysis, it is recommended to set up a project monitoring committee. In addition, once the risk analysis has been completed, in order to successfully implement the risk mitigation measures and related actions, the organization will have to set up a project committee. The project committee members can be the same as the project governance committee with the addition of an experienced project manager and trained IT staff on the mitigation measures selected.
Project Plan Creation and Approval : To successfully complete an Information Risk Management (IRM) Master Plan, the organization should manage this exercise as a project. In this way, it is possible to use project management expertise and techniques to maximize the chances of success.
Determination of management frameworks : This step consists of identifying the information risk management frameworks, standardized and other, that the organization wishes to use for the management of its information assets. In this step, the use of standardized management frameworks, such as COBIT or ISO 27002, is recommended.
Current status : This step consists of identifying, for each risk mitigation measure associated with the standard management frameworks that the organization wishes to put in place or that are already in place, its status (in place or not), its cost (money and human resources or effort, measured in full-time equivalents), the management controls that ensure the effectiveness of the risk mitigation measure and support audits, and the presumed effectiveness of the measure. It is also possible to attach notes on the risk mitigation measure or its implementation, as well as attachments, such as network diagrams, PDF documents or other documents useful for the understanding of users, managers or, possibly, auditors.
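To make the inventory and categorization steps concrete, here is a minimal sketch in Python of what an asset inventory entry with DIC categorization might look like. The class names, example assets and the ranking convention are illustrative assumptions, not prescribed by COBIT 5GR.

```python
# A minimal sketch of an information-asset inventory with DIC
# (availability, integrity, confidentiality) categorization on a
# three-level ordinal scale. Asset names and owners are hypothetical.
from dataclasses import dataclass
from enum import IntEnum

class Level(IntEnum):
    LOW = 1
    MEDIUM = 2
    HIGH = 3

@dataclass
class InformationAsset:
    name: str
    owner: str
    availability: Level      # D: disponibilite
    integrity: Level         # I: integrite
    confidentiality: Level   # C: confidentialite

inventory = [
    InformationAsset("Patient admission system", "Clinical IT",
                     Level.HIGH, Level.HIGH, Level.HIGH),
    InformationAsset("Public website", "Communications",
                     Level.MEDIUM, Level.MEDIUM, Level.LOW),
]

# One simple convention: rank assets by their highest DIC attribute.
for asset in sorted(inventory,
                    key=lambda a: max(a.availability, a.integrity, a.confidentiality),
                    reverse=True):
    print(asset.name)
```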
Subsequently, the analyst can begin by identifying the organization and setting the objectives for information security. The analyst will then proceed with the steps of APO12 (risk-specific process practices, inputs/outputs, activities and detailed activities), starting with APO12.01.
Identification phase
APO12.01 Collect data
In the identification phase, according to the IPM risk management model, the organization must identify and collect the data needed to perform risk analysis and effective reporting of information risks. More specifically, it is necessary to carry out the steps APO12.01.1, APO12.01.2, APO12.01.3 and APO12.01.4:
APO12.01.1. Establish and maintain a method for the collection, classification and analysis of data related to information risks, accommodating several types of hazards, several categories of information risk and multiple risk factors.
APO12.01.1.1 Establish and maintain a model for the collection, classification and analysis of information risk data. The aim is to identify the hazards and vulnerabilities that must be considered in the scope of the risk analysis, to define a formal approach that will allow them to be grouped into categories and to determine how they will be analyzed.
APO12.01.1.2 Predict the presence of several types of hazards and multiple categories of information risk. The analyst must put in place an approach that will ensure that there is a good coverage of the different types of risks that may influence the information risk within the scope of the analysis. The analyst must plan an approach to ensure that the risk scenarios that will be created are within the scope of the risk analysis that was determined by the Governance Committee.
APO12.01.1.3 Include filters and views to help determine how specific risk factors may affect risk. In order to limit the bias introduced by the analyst and the participants in the risk analysis, the organization must put in place a systematic approach. Only through a systematic approach can it hope to bring scientific rigour to the exercise. Where possible, the organization should seek reliable, evidence-based sources as inputs to the process. One such source may be incident logs and other records already in place. Logs, servers or detection equipment can also be sources of evidence.
APO12.01.1.4 Establish criteria to ensure that the model can support the measurement and assessment of risk attributes across information risk domains and provide useful data to promote a risk-aware organizational culture.
Several strategies can be implemented to accomplish these activities. Depending on whether the problem is approached through hazards or through vulnerabilities, it is possible to set up a watch strategy that identifies sources of hazards from literature, industry journals, incident registers, standards, management frameworks, focus groups or other sources. In a vulnerability-driven approach, it is the results of the vulnerability analysis, producing evidence based on a real situation, that will generate the scenarios. The important thing is to choose a systematic approach that is well documented and can be justified later, during an evaluation of the risk analysis and its results.
APO12.01.2. Identify relevant data on the organization’s internal and external operating environment that could play an important role in informational risk management.
APO12.01.2.1 Record data on the organization’s operating environment that could play an important role in information risk management.
APO12.01.2.2 Consult sources within the organization: the legal department, audit, compliance and the CIO's office.
APO12.01.2.3 Identify major revenue sources, external computer systems, product-related legal liability, the regulatory landscape, industry competition, trends in the IT industry, alignment with competitors on key metrics, the relative maturity of core business and IT capabilities, and geopolitical issues.
APO12.01.2.4 Identify and organize historical information risk data and the loss experience of industry peers through industry incident registers, databases and industry sources regarding the disclosure of common hazards.
This work will result from analysis based on the chosen data collection methods. It can and should be done in collaboration with the finance department, IT teams and managers of the organization. Here again, the important thing is to choose a systematic approach that is well documented and can be justified later, during an evaluation of the risk analysis and its results.
APO12.01.3. Identify and analyze data on historical information risks and the organization's experience with data loss, available trends, and peers' experience through hazard registers, industry incident databases and other industry sources regarding the disclosure of known hazards.
APO12.01.3.1 Using the data collection model, record data on hazards that have caused damage or may affect the benefit/value ratio of information assets, activities, projects, and the organization's IT operations and service delivery.
APO12.01.3.2 Enter relevant information on issues related to information asset management. In particular, keep information about incidents, problems and investigations involving information assets.
This involves doing a literature review, researching available documents from sources like Gartner Group or industry journals. Other sources of data may be internal records of incidents or risks. Finally, IT teams can be consulted for historical data.
APO12.01.4. Record data on the hazards that caused or may cause impacts to the benefit/value ratio of information assets, the delivery of IT programs and projects, IT operations and the organization's service delivery. Enter relevant data on related issues, incidents, problems and investigations.
APO12.01.4.1 Organize collected data and highlight contributing factors.
APO12.01.4.2 Determine what specific conditions existed or did not exist when the hazards occurred and how the conditions might have affected the frequency of the hazards and the extent of the loss.
APO12.01.4.3 Determine the common factors that contribute across multiple hazards. Conduct periodic vulnerability analysis to identify new or emerging risks and to gain an understanding of the associated internal and external vulnerabilities.
This work (APO12.01.3 and APO12.01.4) is, here again, essentially carried out by the risk analyst, using the data from steps APO12.01.1 and APO12.01.2. As always, the work follows a systematic approach that is well documented and can be justified later, during an evaluation of the risk analysis and its results.
Risk analysis
The previous steps involved the preparation and identification of the risk analysis framework. Once these steps are completed, the risk analysis with COBIT 5GR actually begins, in activities APO12.02 to APO12.04.
APO12.02 Risk Analysis
The organization needs to develop the information required to support risk decisions, taking into account the relevance of vulnerabilities to the organization.
APO12.02.1. Define the appropriate depth (scope) of risk analysis efforts, taking into account all vulnerabilities and the criticality of information assets in achieving business objectives. Define the scope of the risk analysis after performing a cost/benefit analysis.
APO12.02.1.1 Define the scope of the risk analysis. The organization must decide on the expected depth of the risk analysis efforts. It is necessary to consider a wide range of options that will allow the organization to have in hand all the elements that will enable it to make decisions on risk, given its level of maturity in information risk management.
APO12.02.1.2 Identify relevant vulnerabilities, the criticality of the information assets for the organization and the triggers of the hazards in the field.
APO12.02.1.3 Set objectives to optimize risk analysis efforts by fostering an expanded view based on the organization’s business processes and outputs (products and services offered) and internal structures that are not directly related to the results.
APO12.02.1.4 Define the scope of the risk analysis after reviewing criticality for the organization, the cost of the measures against the expected value of the information assets, the reduction of uncertainty, and the organization's overall regulatory requirements.
APO12.03.2. Define and obtain an organizational consensus on IT services and IT infrastructure resources that are critical to support the smooth running of the organization’s business processes. Analyze dependencies and identify weak links.
APO12.03.2.1 Determine which IT services and IT infrastructure resources are required to maintain the functioning of the critical services and critical processes of the organization.
APO12.03.2.2 Analyze IT dependencies and weak links in all business processes and process flows.
APO12.03.2.3 Obtain consensus of business units and IT managers on the organization’s most valuable information and related technology assets.
Creating risk scenarios
APO12.02.2. Create and regularly update risk scenarios, including scenarios of hazard sequences or threat coincidences, expectations for specific controls, detection capabilities, and other incident management measures. Start with the generic risk scenarios of COBIT 5.
APO12.02.2.1 Estimate the likely frequency and probable magnitude of loss or gain associated with each of the information risk scenarios. Consider the influence of scenario vulnerabilities.
APO12.02.2.2 Estimate the maximum amount of damages that may be suffered or gains from opportunities.
APO12.02.2.3 Consider scenarios composed of hazard sequences and threat coincidences.
APO12.02.2.4 Based on the most important scenarios, identify organizational expectations for specific controls, the ability to detect hazards, and other incident management measures.
APO12.02.2.5 Evaluate known operational controls and their effect on frequency (probability), likely magnitude of damage and applicable vulnerabilities.
APO12.02.2.6 Estimate exposure levels and residual risk. Compare the residual risk with the risk tolerance of the organization and the level of acceptable risk. This exercise will help the organization identify risks that may require special treatment.
APO12.02.3. Estimate the frequency, probability and magnitude of losses or gains associated with information risk scenarios. Take into account all applicable vulnerabilities, evaluate known operational controls and estimate residual risk levels for each scenario.
APO12.02.3.1 Identify risk response options. Examine the range of risk response options (risk mitigation measures), for example: avoid, mitigate (reduce, attenuate), transfer (outsourcing, insurance), or accept the risk.
APO12.02.3.2 Document the rationale and potential tradeoffs across the range of risk response options.
APO12.02.3.3 Specify high level requirements and parameters for projects or programs that, based on risk appetite, mitigate risks to acceptable levels. Identify costs, benefits and shared responsibility for project execution.
APO12.02.3.4 Develop in greater detail the organizational requirements and expectations for appropriate controls. Determine where and how they should be implemented to be effective.
The organization should create a sufficient number of scenarios to carry out its risk analysis. There is no ideal number: it will depend on several factors, such as the scope of the risk analysis, the budget and time allocated, the organization's level of maturity in information risk management, and many others. As a first step, it is suggested that a brainstorming meeting be held with the participants in the risk analysis to identify candidate scenarios. Scenarios from a scenario bank or those included in COBIT 5 can also be used.
It should be noted that what is presented here is a reference model that can be used as a basis for risk analysis. In an application in a real situation, this model will have to be adjusted or improved to take into account the actual situation of the organization.
For each scenario, it is first identified in summary form. For example, the risk scenario Zn(A, ψ, δ), where n is a unique sequential integer, includes a brief description of the hazard (A) and the random events or sequences, actions, decisions and related factors that made it possible to exploit a vulnerability (ψ) whose outcome is damage (δ). For example, scenario number Z301, which deals with the hazard (A) Virus and the vulnerability (ψ) CVE1999-233, and whose damage (δ) is a loss of confidentiality, would be identified as Z301(Virus, CVE1999-233, Confidentiality). These summary descriptions are then enriched.
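As an illustration, the summary identifier described above can be represented as a small data structure. This is a sketch only; the field names are assumptions, while the Zn(A, ψ, δ) notation follows the text.

```python
# A minimal sketch of the Zn(A, psi, delta) summary identifier.
from dataclasses import dataclass

@dataclass
class RiskScenario:
    n: int              # unique sequential integer
    hazard: str         # A
    vulnerability: str  # psi
    damage: str         # delta

    def identifier(self) -> str:
        return f"Z{self.n}({self.hazard}, {self.vulnerability}, {self.damage})"

z301 = RiskScenario(301, "Virus", "CVE1999-233", "Confidentiality")
print(z301.identifier())  # Z301(Virus, CVE1999-233, Confidentiality)
```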
Once the scenarios have been identified and briefly described during the group meeting with the participants, the analyst will have to analyze and document each of the scenarios. To this end, it is proposed to use a standard form for documenting information risk scenarios. The purpose of this analysis and documentation work is to bring a greater level of detail. The minimum information required for each scenario is:
Scenario name: A name that describes the scenario. For example, a risk scenario for identity theft of an organization’s customer might be called identity theft.
Organization Name: The name of the organization for which the scenario is created.
Scenario creation date: The creation date of the scenario.
Owners concerned: the individuals involved in the scenario, who should include the owners of the information assets involved and those involved in the asset-related business processes.
Description of the risk scenario: a detailed description of the hazard (A) and the hazard sequences, actions, decisions and related factors that made it possible to exploit a vulnerability (ψ) whose result is damage (δ). This describes in more detail what was developed with the participants in the previous step.
Vulnerability: a description of the vulnerability, flaw or weakness that makes this scenario possible.
Historical Data: Documentation of historical data available on situations similar to what is described in the scenario and sources of such data, such as an incident log or customer support reports.
Target of this scenario: availability, integrity, confidentiality, continuity, other.
Impacts of the realization of the scenario: descriptions of the impacts and damages that would result from the realization of the hazard described in the scenario.
Mitigation measures in place or envisaged: description of the risk mitigation measures envisaged.
Management controls in place or proposed: description of the management controls in place or proposed.
Scenario change history: Track changes to the scenario document.
It is likely, once the scenarios are detailed, that similarities between some of them will allow the number of scenarios to be reduced by combining similar ones. In general, combining similar scenarios reduces their number by about 20%. The analyst will then have to meet the participants individually to validate the detailed scenarios and make adjustments based on their comments. Scenario creation ends with the identification of the data that will allow the organization to measure the level of risk and, more specifically, to create risk indicators based on, among other things, available evidence (incident logs and other sources of evidence) or estimates from participants in the risk analysis. In particular, it will be necessary to identify:
the probability of realization of the hazard: Pb(A), a value between 0.01 and 0.99
the presence of the vulnerability: Pb(ψ), usually 0 (vulnerability absent) or 1 (vulnerability present)
the probability of exploitation of the vulnerability by the hazard: Pb(ψ, A), a value between 0.01 and 0.99
the estimated damage and the maximum damage in this scenario: δ(ψ, A), a value between 0.01 and 0.99 (qualitative) or a real number (scientific approach and evidence)
the resilience level of the organization in this scenario: θ(ψ, A), a value between 0.01 and 0.99
the expected utility (the contribution to the organization's profits) of the business processes or information assets involved in the risk scenario: μ(ψ, A), a value between 0.01 and 0.99 (qualitative) or a real number (scientific approach and evidence)
See also the section on KRIs for examples of indicators.
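To illustrate how these indicators might be combined into a per-scenario estimate, here is a minimal sketch in Python. COBIT 5GR does not prescribe a single combination formula; the multiplicative expression below, with resilience reducing the damage term, is an assumption made for illustration only.

```python
# A minimal sketch of a per-scenario risk estimate built from the
# indicators above. The multiplicative combination is an assumption,
# not a formula prescribed by COBIT 5GR.

def scenario_risk(pb_a: float, pb_psi: float, pb_psi_a: float,
                  delta: float, theta: float, mu: float) -> float:
    """Estimated risk for one scenario Zn(A, psi, delta).

    pb_a      Pb(A): probability the hazard materializes (0.01-0.99)
    pb_psi    Pb(psi): presence of the vulnerability (0 or 1)
    pb_psi_a  Pb(psi, A): probability the hazard exploits the vulnerability
    delta     estimated damage (0.01-0.99 qualitative, or a real number)
    theta     resilience level of the organization (0.01-0.99)
    mu        expected utility of the assets or processes involved
    """
    likelihood = pb_a * pb_psi * pb_psi_a   # chance the scenario plays out
    impact = delta * (1 - theta) * mu       # damage tempered by resilience
    return likelihood * impact

# Example with qualitative consensus values from a group meeting:
print(round(scenario_risk(0.6, 1, 0.4, 0.7, 0.3, 0.8), 4))  # 0.0941
```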
Prioritization phase
APO12.02.4. Compare the residual risk to the organization’s risk tolerance and identify exposures that may require a risk response.
APO12.02.4.1 Conduct a Peer Review of Information Risk Analysis.
APO12.02.4.2 Confirm that the analysis is adequately documented according to the needs of the organization.
APO12.02.4.3 Review the basis of estimates of probabilities, impacts, damages and opportunities (gains).
APO12.02.4.4 Verify that the risk analysis participants who took part in estimating probabilities and quantifying metrics were not influenced by bias (if necessary, ensure that mechanisms to control bias are in place). Check that there has been no manipulation of the process to obtain a predetermined result. Verify that, where possible, a search for evidence was conducted.
APO12.02.4.5 Verify that the level of experience and qualifications of the risk analyst were appropriate for the magnitude and complexity of the risk analysis.
APO12.02.4.6 Provide an opinion on the risk analysis process, the expected reduction of unacceptable risks, and whether the cost of the risk analysis process is reasonable relative to the cost of the risk mitigation measures and the foreseeable risk reduction.
The risk scenarios created during activity APO12.02.3 must then be quantified. This can be done in different ways, as discussed in the course (interviews, focus groups, group meetings, etc.). The results must then be validated by all participants in the risk analysis. It is essential to conduct a peer review exercise (with the participants) on the results of the risk analysis before sending them to management for approval (risk governance committee) and before using them in the decision-making process. This revision process reduces the bias introduced into the risk analysis and increases the reliability and scientific rigour of the results.
APO12.04.1. Transmit the results of the risk analysis to all parties involved to support the organization’s decisions. Include estimates of probabilities and damage or gain with confidence levels.
APO12.04.1.1 Coordinate additional risk analysis activities as required by managers (e.g., reports of non-compliance or changes in the scope of the risk analysis).
APO12.04.1.2 Clearly communicate context and results to assess cost / benefit ratios.
APO12.04.1.3 Identify the negative impacts of hazards and scenarios that should guide risk mitigation decisions, and the positive effects of hazards and scenarios that represent opportunities which may affect the organization's strategy and objectives.
APO12.04.2. Provide decision makers with the data to understand worst-case and most likely scenarios, due diligence risks, significant reputational risks, and legal or regulatory considerations.
APO12.04.2.1 Include in this effort:
Key risk elements (e.g., frequency, magnitude, impact), vulnerabilities and their estimated effects
The magnitude of the estimated probable loss or probable future gain
The maximum estimated losses and the most likely losses, weighed against the potential gains for a scenario
Additional relevant information to support the conclusions and recommendations of the analysis
APO12.03 Maintain a risk profile
The organization should maintain an inventory or register of known risks and risk components, i.e., hazards (threats), vulnerabilities and impacts (damage). These must include the estimation of their probability, the expected impact and the risk mitigation measures in place. The organization should document the associated resources, the organizational capabilities for information risk management, and the controls in place.
APO12.03.3. Aggregate the current risk scenarios (which have materialized) by category, business sector and functional area.
APO12.03.3.1 Inventory and evaluate the process capacity, skills and knowledge of the individuals in the organization. Evaluate results and performance across the information risk spectrum (e.g., ROI, OCL, delivery costs, project costs, IT operations costs, and IT service delivery).
APO12.03.3.2 Determine whether the normal execution of processes can provide the right controls and the ability to take acceptable risks.
APO12.03.3.3 Identify where the variability of results associated with a process can contribute to a more robust internal control structure, improve information and performance of the organization, and help seize business opportunities.
APO12.03.4. On a regular basis, the organization should identify and record all relevant information about its risk profile. The organization must then consolidate this information into a global risk profile. This work is often done by the risk analyst in conjunction with the organization's risk management group, in a risk governance context.
APO12.03.4.1 Examine the collection of attributes (variables) and values (metrics) through which the components of the risk scenario are quantified. Examine their interconnections inherent in the impact categories of the organization.
APO12.03.4.2 Adjust data according to evolving risk conditions and emerging threats to maximize the benefits and competitive advantages of IT, considering the total cost of ownership (TCO), the implementation effort, the delivery of IT programs and projects, and the cost of operating and managing IT operations and service delivery.
APO12.03.4.3 Evaluate the cost of updating information systems and information assets based on asset criticality, operating environment data and hazard data. Link similar risks to the organization's risk categories and impact categories.
APO12.03.4.4 Catalog and aggregate hazard types by category, business sector and functional area of the organization.
APO12.03.4.5 At a minimum, update the information risk scenarios in response to significant internal or external changes and revise them annually.
APO12.03.5. Based on all risk profile data, define a set of key risk indicators (KRIs) that enable rapid identification and monitoring of risks and trends.
APO12.03.5.1 Capture the risk profile within tools such as an information risk register and enterprise risk mapping (ERM).
APO12.03.5.2 Enrich the risk profile with the results of the IT portion of the enterprise risk assessment (ERM), risk scenario components, hazard data collection, continuous risk analysis, and the results of the assessment of interdependencies.
APO12.03.5.3 For individual elements of the information risk register, update key attributes such as name, description, owner, stakeholders, actual and potential frequency, magnitude of associated scenarios, potential and real impact, and risk mitigation measures.
APO12.03.6. Gather information on the hazards of IT that have materialized, for inclusion in the information risk profile of the organization.
APO12.03.6.1 Create metrics and key risk indicators (KRIs) that can target IT hazards and incidents that can significantly affect the organization’s bottom line.
APO12.03.6.2 Base these indicators on a model that provides an understanding of the variables that may impact exposure and the organization’s capabilities for risk management in general and information risks in particular.
APO12.03.6.3 Ensure understanding of Key Risk Indicators (KRIs) by all stakeholders in the organization.
APO12.03.6.4 Regularly review the KRIs used and recommend adjustments to keep track of internal and external conditions.
A selection of generic KRIs that can serve as a starting point for implementing COBIT 5GR is presented in the Metrics and KRIs section below; these KRIs must be enriched, adapted or modified to take into account the particularities of each organization.
APO12.04.4. Review the results of objective third-party risk assessments, internal audit, and quality assurance reviews, and map them to the organization's risk profile. Identify gaps and exposures to determine the need for additional risk analysis.
APO12.04.4.1 Use the organization's gaps and exposures to assess risk transfer requirements or the need for additional or deeper risk analysis.
APO12.04.4.2 Help the organization understand how corrective action plans will affect the overall risk profile.
APO12.04.4.3 Identify opportunities for integration with ongoing risk management projects and activities.
APO12.04.5. Periodically identify, for areas of high relative risk and taking into account the organization's risk appetite, opportunities that would allow higher risk acceptance and increased growth.
APO12.04.5.1 Look for opportunities to:
Use the organization's resources as leverage to create a competitive advantage.
Reduce coordination costs.
Take advantage of economies of scale by using strategic resources common to several business sectors.
Take advantage of structural differences with competitors.
Integrate activities between business units or components of the organization's value chain.
Mobilization phase
APO12.05 Define a Portfolio of Risk Management Projects
It is through the implementation of risk mitigation actions, in the form of projects, that the organization will be able to manage its risks. The aim is to reduce unacceptable risks to an acceptable level, taking into account the organization's expressed risk tolerance. The identification of a set of projects for the reference period under consideration (the next budget year, for example) takes the form of a portfolio of projects.
APO12.04.3. Communicate the risk profile to all stakeholders, including the effectiveness of risk management processes, the effectiveness of controls, gaps, inconsistencies, risk acceptance, mitigation measures and their impact on the risk profile.
APO12.04.3.1 Identify the needs of different stakeholders for risk change reporting by applying the principles of relevance, effectiveness, frequency and accuracy of reporting.
APO12.04.3.2 Include the following in the report: effectiveness and performance, issues and deficiencies, the status of mitigation measures, hazards and incidents and their impact on the risk profile, and the performance of risk management processes.
APO12.04.3.3 Contribute to integrated enterprise risk management reporting.
APO12.05.1. Maintain an inventory of control activities that are in place to manage risks in line with the organization’s risk appetite and tolerance. Classify control activities and match them to specific hazards or scenarios and aggregations of information risks. Use COBIT 5 or other standards (ITIL, ISO, etc.) as a guide to determine management controls that are relevant or useful to your organization.
APO12.05.1.1 Across the entire risk intervention area, inventory the controls in place to manage risk and enable risk taking in line with risk appetite and tolerance.
APO12.05.1.2 Categorize controls (e.g., predictive, preventive, detective, corrective) and map them to specific information risk statements (scenarios and hazards) and to aggregated information risks.
APO12.05.2. Determine whether each organizational unit monitors risk and accepts responsibility for operations within its individual tolerance levels and portfolio.
APO12.05.2.1 Monitor operational alignment with risk tolerance thresholds.
APO12.05.2.2 Ensure that each line of business accepts responsibility for operations within its individual and portfolio tolerance levels and for the integration of monitoring tools into key business processes.
APO12.05.2.3 Monitor the performance of each control, and measure the variance of thresholds against objectives.
APO12.05.3. Define a balanced set of project proposals for risk reduction and projects that provide strategic opportunities, taking into account the cost/benefit ratio, the effect on the risk profile and the current risk (a project-ranking sketch follows the detailed activities below).
APO12.05.3.1 Respond to discovered risk exposures and opportunities.
APO12.05.3.2 Choose candidate IT controls based on specific threats, the degree of risk exposure, the probable loss and the mandatory requirements specified in the IT standards.
APO12.05.3.3 Monitor the evolution of the underlying operational business risk profiles and adjust the ranking of risk response projects.
APO12.05.3.4 Communicate with key stakeholders early in the process.
APO12.05.3.5 Conduct pilot testing and review of performance data to verify operation against design.
APO12.05.3.6 Tie new and updated operational controls to mechanisms that will measure control performance over time, and prompt management with corrective actions when monitoring indicates the need.
APO12.05.3.7 Identify and train staff on new procedures as they are deployed.
APO12.05.3.8 Report IT risk action plan progress. Monitor IT risk action plans at all levels to ensure the effectiveness of required actions and determine whether residual risk acceptance has been achieved.
APO12.05.3.9 Ensure that actions initiated are owned by the affected process owner and any discrepancies are reported to senior management.
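As announced above, here is a minimal sketch of one way to rank candidate risk-reduction projects, using expected risk reduction per unit cost as a simple reading of the cost/benefit criterion in APO12.05.3. The projects, figures and scoring rule are illustrative assumptions.

```python
# A minimal sketch: rank candidate mitigation projects by expected
# risk reduction per dollar. All figures are hypothetical examples.
projects = [
    # (name, expected risk reduction, estimated cost in dollars)
    ("Deploy endpoint protection", 0.25, 80_000),
    ("Staff phishing training", 0.15, 20_000),
    ("Network segmentation", 0.30, 150_000),
]

# Higher reduction-per-dollar first; ties and strategic factors would
# still need human judgment by the project committee.
for name, reduction, cost in sorted(projects, key=lambda p: p[1] / p[2],
                                    reverse=True):
    print(f"{name}: {reduction / cost * 1_000_000:.1f} risk points per M$")
```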
APO12.06 Risk Mitigation
APO12.06 is resolutely in phase M of the IPM process. It consists of implementing the risk mitigation measures adopted following the identification and prioritization of risks, within a project framework, which will allow the organization to respond in a timely manner to risks beyond its tolerance threshold, with effective measures to limit the extent of damage. A standard such as ISO 27002:2013 can be used to find risk mitigation measures to implement.
Other elements of COBIT 5 will also be useful or even necessary for the risk management that must take place. Among others:
EDM03.03 Monitor risk management.
APO13.01 Establish and maintain an ISMS.
APO13.02 Define and manage an information security risk treatment plan.
APO13.03 Monitor and review the ISMS.
BAI01.10 Manage program and project risk.
BAI02.03 Manage requirements risk.
BAI04.04 Monitor and review availability and capacity.
BAI06.02 Manage emergency changes.
DSS02.02 Record, classify and prioritize requests and incidents.
DSS02.03 Verify, approve and fulfill service requests.
DSS02.04 Investigate, diagnose and allocate incidents.
DSS02.05 Resolve and recover from incidents.
DSS02.06 Close service requests and incidents.
DSS03.01 Identify and classify problems.
DSS03.02 Investigate and diagnose problems.
DSS03.03 Raise known errors.
DSS03.04 Resolve and close problems.
DSS03.05 Perform proactive problem management.
DSS04.01 Define the business continuity policy, objectives and scope.
DSS04.02 Maintain a continuity strategy.
DSS04.03 Develop and implement a business continuity response.
DSS04.04 Exercise, test and review the BCP.
DSS04.05 Review, maintain and improve the continuity plan.
DSS04.06 Conduct continuity plan training.
DSS04.07 Manage backup arrangements.
DSS04.08 Conduct post-resumption review.
DSS05.01 Protect against malware.
DSS05.02 Manage network and connectivity security.
DSS05.03 Manage endpoint security.
DSS05.04 Manage user identity and logical access.
DSS05.05 Manage physical access to IT assets.
DSS05.06 Manage sensitive documents and output devices.
DSS05.07 Monitor the infrastructure for security-related events.
The set of Monitor, Evaluate and Assess (MEA) controls.
The organization will have to put in place an audit process to retroactively validate the results of previous risk analyses. It will also have to repeat the risk analysis process when there are major changes in its environment, situation or information assets, or when it otherwise becomes necessary. At a minimum, it should perform a risk analysis every budget cycle, and at least once a year.
Metrics and KRIs
Here is a selection of KRIs that can be used as a starting point for the implementation of COBIT 5GR. These KRIs must be enriched, adapted or modified to take into account the particularities of each organization.
Risk appetite of the organization: Ar(organization)
Risk scenario: Zn(A, ψ, δ)
Element at risk: In
Probability of realization of the hazard: Pb(A)
Presence of the vulnerability: Pb(ψ)
Probability of exploitation of the vulnerability by the hazard: Pb(ψ, A)
Estimated damage: δe(ψ, A)
Maximum damage: δm(ψ, A)
Resilience level: θ(ψ, A)
Expected utility: μ(E), a value between 0.01 and 0.99 (qualitative) or a real number (scientific approach and evidence)
Mitigation measures: MMn(Zn)
Damage reduction from exploitation of the vulnerability by the hazard with the mitigation measure in place: δr(ψ, A, MMn)
Reduction of the probability of exploitation of the vulnerability by the hazard with the mitigation measure in place: Pb(ψ, A, MMn)
Using these indicators, the organization can make a qualitative risk estimate by estimating each indicator in collaboration with stakeholders and risk analysis participants. In such a case, the choice of measurement scales and the data collection method are likely to affect the scientific rigour of the results. In the best cases, the organization will have evidence that can be used instead.
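To illustrate the mitigation indicators δr(ψ, A, MMn) and Pb(ψ, A, MMn), here is a minimal sketch comparing the risk of a scenario before and after a mitigation measure is put in place. The combination formula is an assumption made for illustration; the text does not fix one.

```python
# A minimal sketch comparing scenario risk before and after a
# mitigation measure MMn. All values are hypothetical examples,
# and the simple likelihood-times-damage formula is an assumption.

def risk(pb_a: float, pb_psi_a: float, damage: float) -> float:
    # Simplified: probability of the hazard, times probability it
    # exploits the vulnerability, times the damage.
    return pb_a * pb_psi_a * damage

pb_a = 0.6          # Pb(A)
pb_psi_a = 0.5      # Pb(psi, A) without the measure
delta_e = 0.7       # estimated damage without the measure

pb_psi_a_mm = 0.2   # Pb(psi, A, MMn): reduced exploitation probability
delta_r = 0.3       # delta_r(psi, A, MMn): reduced damage with the measure

print("before:", round(risk(pb_a, pb_psi_a, delta_e), 3))     # 0.21
print("after: ", round(risk(pb_a, pb_psi_a_mm, delta_r), 3))  # 0.036
```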
Risk appetite of the organization
Symbol: Ar (organization)
Description: Risk appetite represents the aggregate level of risk that an organization agrees to take in order to continue its business and achieve its strategic objectives. The organization's risk appetite must be identified in order to adjust damage and utility and obtain the expected utility as perceived by the organization. A low appetite means risk aversion; the result is an increase in the expected utility of the element at risk, that is, it is worth more to the organization and its decision-makers than its real or book value. Conversely, a propensity for risk, which signifies a high risk appetite, results in a decrease in the expected utility.
Qualitative value: between 0.01 and 0.99
Qualitative data source: A slider scale can be used to assess risk appetite. The neutral value is 0.5; a value above 0.5 indicates risk aversion, while a value between 0.01 and 0.49 represents a propensity for risk.
Quantitative value: This variable cannot be measured quantitatively.
Quantitative data sources: None
Risk scenario
Symbol: Zn (A, ψ, δ)
Description: The risk scenario is a document that tells a story. It describes, in a structured way, the story of a hazard or sequence of hazards and threats that exploits a vulnerability of an information asset, causing damage.
Risk element
Symbol: In
Description: Information asset that is subject to a risk scenario
Qualitative value: Unique nominative code that identifies each information asset or risk element that is included in a risk analysis.
Qualitative data source: Determined by the risk analyst or assigned during an inventory of information assets.
Probability of realization of the hazard
Symbol: Pb (A)
Description: During the meetings, the analyst will also have to evaluate, with the participants, the probability that a scenario will materialize. To do this, he will use slider scales to evaluate the damage and the probability of scenario realization. Slider scales are printed and distributed to participants.
The slider scale presented is used to evaluate the probability of realization of the presented scenario.
Qualitative value: The values assigned to this variable are between 0.01 and 0.99, or between 1% (low) and 99% (high). The central, neutral or average value is located at the center of the measurement scale, which represents 0.5, or 50%.
Qualitative data source: The evaluation of the probability of occurrence of hazards can be done during a group meeting (focus group, brainstorming, etc.). In this case, the value used must result from the consensus of the participants in the meeting. It is also possible to use a white board or other means with the participants. The main thing is to let the participants indicate the estimated level according to the element evaluated, on a continuous line between the lowest and the highest.
Quantitative value : real number
Quantitative data sources: historical data or evidence from research or data collection.
Presence of the vulnerability
Symbol: Pb (ψ)
Description: The objective of this indicator is to assess the presence (Pb (ψ) = 1) or the absence (Pb (ψ) = 0) of the vulnerability considered in a risk scenario.
Value: 0 (absence) or 1 (presence)
Qualitative Data Source: Analysis by Specialist
Quantitative Data Source: Scan Analysis (NVAS) or another tool
Probability of exploiting vulnerability by hazard
Symbol: Pb (ψ, A)
Description: This indicator is used to estimate the probability that the hazard considered in the risk scenario could exploit the vulnerability.
Qualitative value: between 0.01 and 0.99
Qualitative data source: The probability assessment can be done during a group meeting (focus group, brainstorming, etc.). In this case, the value used must result from the consensus of the participants in the meeting. It is also possible to use a white board or other means with the participants. The main thing is to let the participants indicate the estimated level according to the element evaluated, on a continuous line between the lowest and the highest.
Quantitative value: real number
Quantitative data sources: historical data or evidence from research or data collection.
Estimated damage
Symbol: δe (ψ, A)
Description: Measurement of the impact of the realization of the most likely scenario. If sources of evidence or historical data are available, these data should be preferred. Otherwise, the slider scale can be used. The slider scale presented is used to assess the impact of the presented scenario.
Qualitative value: The values assigned to this variable are between 0.01 and 0.99, or between 1% (low) and 99% (high). The central, neutral or average value is located at the center of the scale of measurement, which represents 0.5, or 50%.
Qualitative data source: The impact evaluation can be done during a group meeting (focus group, brainstorming, etc.). In this case, the value used must result from the consensus of the participants in the meeting. It is also possible to use a white board or other means with the participants. The main thing is to let the participants indicate the estimated level according to the element evaluated, on a continuous line between the lowest and the highest.
Quantitative value: real number
Quantitative data sources: historical data or evidence from research or data collection.
Maximum damage
Symbol: δm (ψ, A)
Description: Measurement of the impact of scenario realization in the worst case. If sources of evidence or historical data are available, these data should be preferred. Otherwise, the slider scale can be used. The slider scale presented is used to assess the impact of the presented scenario.
Qualitative value: The values assigned to this variable are between 0.01 and 0.99, or between 1% (low) and 99% (high). The central, neutral or average value is located at the center of the measurement scale, which represents 0.5, or 50%.
Qualitative data source: The evaluation can be done during a group meeting (focus group, brainstorming, etc.). In this case, the value used must result from the consensus of the participants in the meeting. It is also possible to use a white board or other means with the participants. The main thing is to let the participants indicate the estimated level according to the element evaluated, on a continuous line between the lowest and the highest.
Quantitative value: real number
Quantitative data sources: historical data or evidence from research or data collection.
Resilience level
Symbol: θ (ψ, A)
Description: The level of individual or organizational resilience in relation to the risk scenario presented.
Qualitative value: The values assigned to this variable are between 0.01 and 0.99, or between 1% (low) and 99% (high). The central, neutral or average value is located at the center of the scale, which represents 0.5, or 50%.
Qualitative data source: The evaluation can be done during a group meeting (focus group, brainstorming, etc.). In this case, the value used must result from the consensus of the participants in the meeting. It is also possible to use a white board or other means with the participants. The main thing is to let the participants indicate the estimated level according to the element evaluated, on a continuous line between the lowest and the highest. The slider scale presented is used to assess the level of resilience for the presented scenario.
Quantitative value: real number
Quantitative data sources: historical data or evidence from research or data collection.
Here again, the slider scale can be used.
Expected utility
Symbol: μ (E)
Description: The value of the risk element, its contribution to the business objectives of the organization or its replacement value.
Qualitative value: between 0.01 and 0.99
Qualitative data source: The evaluation can be done during a group meeting (focus group, brainstorming, etc.). In this case, the value used must result from the consensus of the participants in the meeting. It is also possible to use a white board or other means with the participants. The main thing is to let the participants indicate the estimated level according to the element evaluated, on a continuous line between the lowest and the highest.
Quantitative value: real number
Quantitative data sources: historical data or evidence from research or data collection.
Mitigation measures
Symbol: MMn (Zn)
Description: Risk mitigation measures.
Qualitative value: between 0.01 and 0.99
Qualitative data source: The evaluation can be done during a group meeting (focus group, brainstorming, etc.). In this case, the value used must result from the consensus of the participants in the meeting. It is also possible to use a white board or other means with the participants. The main thing is to let the participants indicate the estimated level according to the element evaluated, on a continuous line between the lowest and the highest.
Quantitative value: real number
Quantitative data sources: historical data or evidence from research or data collection.
Reduction of damage caused by exploitation of vulnerability by hazard with the mitigation measure in place
Symbol: δr (ψ, A, MMn)
Description: The reduction in damage caused by the exploitation of the vulnerability by the hazard when the mitigation measure is in place.
Qualitative value: between 0.01 and 0.99
Qualitative data source: The evaluation can be done during a group meeting (focus group, brainstorming, etc.). In this case, the value used must result from the consensus of the participants in the meeting. It is also possible to use a white board or other means with the participants. The main thing is to let the participants indicate the estimated level according to the element evaluated, on a continuous line between the lowest and the highest.
Quantitative value: real number
Quantitative data sources: validation by an information security expert, historical data or evidence from research or data collection.
Reduced likelihood of exploitation of vulnerability by hazard with the mitigation measure in place
Symbol: Pb (ψ, A, MMn)
Description: The effect of the risk mitigation measure that reduces the likelihood of exploitation of vulnerability by the hazard with the mitigation measure in place.
Qualitative value: between 0.01 and 0.99
Qualitative data source: The evaluation can be done during a group meeting (focus group, brainstorming, etc.). In this case, the value used must result from the consensus of the participants in the meeting. It is also possible to use a white board or other means with the participants. The main thing is to let the participants indicate the estimated level according to the element evaluated, on a continuous line between the lowest and the highest.
Quantitative value: real number
Quantitative data sources: validation by an information security expert, historical data or evidence from research or data collection.
Example of a qualitative risk calculation
Here is a model for estimating the risk index based on the use of qualitative data. In a real context, the model must be adjusted to take into account the peculiarities of each organization.
The KRIs used in this example:
Risk appetite of the organization: Ar (organization)
Risk scenario: Zn (A, ψ, δ)
Element at risk: In
Probability of realization of the hazard: Pb (A)
Presence of the vulnerability: Pb (ψ)
Probability of exploiting vulnerability by hazard: Pb (ψ, A)
Estimated damage: δe (ψ, A)
Maximum damage: δm (ψ, A)
Resilience level: θ (ψ, A)
Expected utility: μ (E), a value between 0.01 and 0.99 (qualitative) or a real number (scientific approach and evidence)
Mitigation measures: MMn (Zn)
Damage reduction caused by exploitation of the vulnerability by the hazard with the mitigation measure in place: δr (ψ, A, MMn)
Reduction of the probability of exploitation of the vulnerability by the hazard with the mitigation measure in place: Pb (ψ, A, MMn)
Calculation of the estimated risk for risk scenario Z001 (virus, e-mail, loss of reputation), that is, the risk of a loss of reputation through the disclosure of private client information, caused by a computer virus sent by e-mail and opened by a careless employee:
Ar (organization) = 0.3 (low)
Pb (A) = 0.7 (high)
Pb (ψ) = 1 (presence)
Pb (ψ, A) = 0.5 (medium)
δe (ψ, A) = 0.4 (medium)
δm (ψ, A) = 0.9 (very high)
θ (ψ, A) = 0.5 (medium)
μ (E) = 0.6 (medium)
MM001 (Z001) = $45,000 (DLP system)
δr (ψ, A, MM001) = 0.82
Pb (ψ, A, MM001) = 0.75
Qualitative estimate of the estimated risk index:
Re (Z001) = (Pb (A) * Pb (ψ) * Pb (ψ, A) * δe (ψ, A) * μ (E)) / θ (ψ, A)
Re (Z001) = (0.7 * 1 * 0.5 * 0.4 * 0.6) / 0.5
Re (Z001) = 0.084 / 0.5
Re (Z001) = 0.168
Qualitative estimate of the maximum risk index:
Rm (Z001) = (Pb (A) * Pb (ψ) * Pb (ψ, A) * δm (ψ, A) * μ (E)) / θ (ψ, A)
Rm (Z001) = (0.7 * 1 * 0.5 * 0.9 * 0.6) / 0.5
Rm (Z001) = 0.189 / 0.5
Rm (Z001) = 0.378
Qualitative estimate of the tolerated risk index:
Rt (Z001) = (Pb (A) * Pb (ψ) * Pb (ψ, A) * Ar (org) * μ (E)) / θ (ψ, A)
Rt (Z001) = (0.7 * 1 * 0.5 * 0.3 * 0.6) / 0.5
Rt (Z001) = 0.063 / 0.5
Rt (Z001) = 0.126
Qualitative estimate of the mitigated risk index with risk mitigation measure MM001 (Z001) in place, a data loss prevention (DLP) system that costs $45,000:
Rmm001 = Re (Z001) * δr (ψ, A, MM001) * Pb (ψ, A, MM001)
Rmm001 = 0.168 * 0.82 * 0.75
Rmm001 = 0.103
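The qualitative calculation above can be reproduced in a few lines of R, the language used for the analysis scripts later in this document. The sketch below is illustrative only: the function and argument names are ours, not part of COBIT 5GR, and the formulas simply restate the indices defined above.

# Illustrative sketch of the qualitative risk indices; names are ours, not COBIT's
risk_indices <- function(pb_A, pb_psi, pb_psiA, d_est, d_max, mu, theta, ar,
                         d_red = 1, pb_red = 1) {
  re <- (pb_A * pb_psi * pb_psiA * d_est * mu) / theta # estimated risk index Re
  rm <- (pb_A * pb_psi * pb_psiA * d_max * mu) / theta # maximum risk index Rm
  rt <- (pb_A * pb_psi * pb_psiA * ar * mu) / theta    # tolerated risk index Rt
  rmm <- re * d_red * pb_red                           # mitigated risk index Rmm
  c(Re = re, Rm = rm, Rt = rt, Rmm = rmm)
}
# Scenario Z001 with the values used above
risk_indices(pb_A = 0.7, pb_psi = 1, pb_psiA = 0.5, d_est = 0.4, d_max = 0.9,
             mu = 0.6, theta = 0.5, ar = 0.3, d_red = 0.82, pb_red = 0.75)
# Re = 0.168, Rm = 0.378, Rt = 0.126, Rmm = 0.10332

Since the mitigated index Rmm (0.103) falls below the tolerated index Rt (0.126), one reading of this example is that the mitigated risk would be within the level tolerated by the organization.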
Example of quantitative risk calculation
Here is a model for estimating the risk index based on the use of quantitative data. In a real context, the model must be adjusted to take into account the peculiarities of each organization.
The KRIs used in this example:
Risk appetite of the organization: Ar (organization)
Risk scenario: Zn (A, ψ, δ)
Element at risk: In
Probability of realization of the hazard: Pb (A)
Presence of the vulnerability: Pb (ψ)
Probability of exploiting vulnerability by hazard: Pb (ψ, A)
Estimated damage: δe (ψ, A)
Maximum damage: δm (ψ, A)
Resilience level: θ (ψ, A)
Expected utility: μ (E), a value between 0.01 and 0.99 (qualitative) or a real number (scientific approach and evidence)
Mitigation measures: MMn (Zn)
Damage reduction caused by exploitation of the vulnerability by the hazard with the mitigation measure in place: δr (ψ, A, MMn)
Reduction of the probability of exploitation of the vulnerability by the hazard with the mitigation measure in place: Pb (ψ, A, MMn)
Calculation of the estimated risk for risk scenario Z001 (virus, e-mail, loss of reputation), that is, the risk of a loss of reputation through the disclosure of private client information, caused by a computer virus sent by e-mail and opened by a careless employee:
Ar (organization) = 0.3 (low)
Pb (A) = 0.6 (this happened 3 times in the last 5 years in our organization and the industry figures are similar for companies like ours)
Pb (ψ) = 1 (the vulnerability is present in our organization)
δe (ψ, A) = $1,000,000 (average of 3 known incidents)
δm (ψ, A) = $4,000,000 (the worst case)
θ (ψ, A) = 1 (the current resilience has no effect)
μ (E) = $10,000,000 (contribution of the information asset to organizational objectives)
MM001 (Z001) = $45,000 (DLP system)
δr (ψ, A, MM001) = 0.82
Pb (ψ, A, MM001) = 0.75
Quantitative estimate of the estimated risk index:
Re (Z001) = (Pb (A) * Pb (ψ) * δe (ψ, A)) / θ (ψ, A)
Re (Z001) = (0.6 * 1 * 1,000,000) / 1
Re (Z001) = $600,000
Quantitative estimate of the tolerated risk index:
Rt (Z001) = (Pb (A) * Pb (ψ) * Ar (org) * μ (E)) / θ (ψ, A)
Rt (Z001) = (0.6 * 1 * 0.3 * 10,000,000) / 1
Rt (Z001) = $1,800,000
Quantitative estimate of the mitigated risk index with risk mitigation measure MM001 (Z001) in place, a data loss prevention (DLP) system that costs $45,000:
Rmm001 = Re (Z001) * δr (ψ, A, MM001) * Pb (ψ, A, MM001)
Rmm001 = $ 600,000 * 0.82 * 0.75
Rmm001 = $ 369,000
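Under the same assumptions as the qualitative sketch above, the quantitative case can be reproduced by dropping the Pb (ψ, A) term and expressing damage and utility in dollars, as the formulas above do. The variable names are again ours, for illustration only.

# Quantitative variant of the Z001 calculation (illustrative sketch)
pb_A <- 0.6; pb_psi <- 1; theta <- 1; ar <- 0.3
d_est <- 1000000   # estimated damage, in dollars
mu <- 10000000     # expected utility, in dollars
re <- (pb_A * pb_psi * d_est) / theta    # estimated risk: $600,000
rt <- (pb_A * pb_psi * ar * mu) / theta  # tolerated risk: $1,800,000
rmm <- re * 0.82 * 0.75                  # mitigated risk with MM001: $369,000

Read this way, MM001 reduces the estimated risk by $231,000 ($600,000 - $369,000) for a $45,000 investment, which is one rough way to judge whether the measure is worthwhile.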
Bibliography
Léger, Marc-André (2013). Introduction to Information Risk Management. Hochelaga-Maisonneuve Research Center, Montreal, Quebec, Canada.
This article presents a simple strategy to accelerate literature reviews. The approach was developed for new graduate students wishing to engage in scientific research with little knowledge of how to perform a systematic search using academic sources and scientific journals on a particular topic. However, it may be useful for many others. The approach was used successfully by a research team to perform literature reviews supported by tools such as Zotero and LitMap, and specialized websites, such as Scopus, Web of Science and Engineering Village.
Many years ago, I was very fortunate to have a high school teacher with a Ph.D. who taught us about methodological approaches. At the time, I had no idea that this would be of any importance to me, nor did I have any inclination of doing a Ph.D. myself one day. Of course, today, I understand how important it is to take a systematic approach to resolving problems and how a scientific method can be used to build up some form of proof as to the validity of the answers to these questions. Since then, I have taken many research methodology courses, written a few dissertations, articles and other papers, and introduced students to the scientific method.
Of the many steps in getting started on this path for those new to scientific research, I have noticed that many struggle with how to get started on their initial literature review. This is a critical early step in scientific enquiry that is used to get a grasp on the current state of knowledge on a topic. It is also when many researchers define the scope of the project, identify initial hypotheses, and determine an initial research question. Of course, hypotheses and research questions may evolve further at a later stage in the process, but at least with this initial work, researchers have a starting point for discussions with colleagues or a research director, material to use in funding requests, and peer-reviewed sources to start writing a research proposal. Hence, this article is particularly intended to help students get started on their path into scientific research, with the hope that they can rely less on Wikipedia, blogs, and Google when they write the papers that they submit.
The strategy proposed for conducting an initial literature review is to use available tools and take a simple systematic approach. Using databases and resources available in most university libraries, they can identify reliable, peer-reviewed sources to document the current state of scientific knowledge on their research topic. The next sections present the proposed steps. This starts with choosing a research topic.
A scientific research project starts with a subject to be explored. There are many ways in which this subject can be chosen: a personal interest of the researcher, a course assignment, or a funding opportunity. Nearly all subjects can be valid opportunities for scientific enquiry. In a new or innovative area, the subject can be relatively vague or ill-defined. However, as the field matures and the researcher gains expertise on the topic, it can become quite narrow.
Since this article needs examples, it is necessary to determine a topic. As the principal field of research of the authors of this article is cybersecurity, this formed the basis for the topic of the first example. Therefore, for the first example, the search presented in this article is on the general topic of information security. This is based on a personal interest. Since this is a very broad topic, to make it a bit more realistic, the article investigates information security applied to operational technologies, those used for the management of critical infrastructures. For readers more familiar with information technologies (IT), the technologies used in organizations to help them manage information, operational technologies (OT) are technologies used to manage infrastructure and industrial equipment by monitoring and controlling them. These OTs include, but are not limited to, critical infrastructures such as the electrical grid of a country, region, or province, in the case of Canada. In the project used as an example, we focus on their use in monitoring and controlling a particular critical infrastructure: the electrical grid providing electrical power to cities in Canada.
At this point it is possible to create a concept map, to help better define the topic before going on to the next step. Concept maps, such as mind maps, are very helpful to get a better grasp on a topic and decompose it into core concepts. They are not presented in this article, but there are many good tutorials on how to create them. Concept mapping is also an exercise performed in class with students to help them. Therefore, the topic is:
Information security of operational technologies in Canada
This is what is used for the next steps as an example.
As mentioned, information security of operational technologies in Canada is selected as the topic for the project described. Computer security and cybersecurity are also used as synonyms for information security and are added to the initial search. In a real-world scenario, the input of advisors, professors and research team members can contribute to defining the initial keywords. From this topic, three main concepts are identified:
Information security, with the synonyms computer security and cybersecurity
Operational technologies, with the synonym critical infrastructures
Canada
The first element will help to identify articles in the general topic. The second and third elements will help to narrow down the results to be more relevant. As well as the recommendations of co-researchers, Google and online thesauruses can be used to identify synonyms, which can help in the initial searches. This may require some trial and error to refine, as is explained later. Table 1 presents an overview of the search results in Polytechnique Montreal's and Concordia University's online library search engines for the selected keywords, as well as the results from Google Scholar. Identifying and validating the appropriate keywords, operators (AND, OR, NOT) and combinations thereof may require multiple trials, errors and improvements. While this will become easier as the researcher gains experience, it may be a long, and relatively tedious, process. There is no magic number of articles needed, as research projects differ. In this case, for a relatively small project with a relatively small team, an initial number below 1000 articles is targeted. Again, readers need to keep in mind that this is at a very early stage in the project and in the literature search. At the end of the process, the number should be much lower, well under 100 in most cases.
Search expression | Results Polytechnique | Results Concordia | Google Scholar
Information security OR computer security OR cybersecurity | 430 719 | 536 309 | 4 110 000
"Information security" OR "computer security" OR cybersecurity | 68 820 | 87 133 | 426 000
"Operational technologies" OR "critical infrastructures" | 605 | 3836 | 17 100
"Operational technologies" AND "critical infrastructures" | 0 | 5 | 224
("Information security" OR "computer security" OR cybersecurity) AND ("Operational technologies" OR "critical infrastructures") | 790 | 878 | 17 800
Table 1: Initial searches
While the results are helpful, the sheer number of articles remains too large to be useful for an initial literature review within the scenarios identified in this article. However, the last query could be appropriate for the intended purpose once further limits are added, as described in the next step. Therefore, the next step proceeds with the following query:
("Information security" OR "computer security" OR cybersecurity) AND ("Operational technologies" OR "critical infrastructures")
Starting from what is done in the previous steps, restrictions are added to limit the results to scientific articles published in peer-reviewed journals during the last ten years, in English. This is done as the intention is, at least in part, to assess the current state of the art of knowledge in the research domain of the study. The definition of current is initially set at going back only 10 years. Since the number of results may still be high, the restriction can also be tightened to 5 years. Therefore, the final search expression from the previous step, ("Information security" OR "computer security" OR cybersecurity) AND ("Operational technologies" OR "critical infrastructures"), is used with different restrictions, as shown in Table 2.
Search restrictions | Results Polytechnique | Results Concordia | Google Scholar
No restrictions | 791 | 878 | 17 800
Limited to last 10 years | 587 | 652 | 16 600
Limited to last 5 years | 380 | 415 | 10 800
Limited to articles from the last 5 years | 291 | 304 | 639
Limited to articles in scientific journals from the last 5 years | 89 | 96 | N/A
Limited to articles in English, in scientific journals from the last 5 years | 55 | 59 | 629
Table 2: Searches with restrictions
The results are at a volume that appears more reasonable for an initial search. It would seem appropriate to use this as the focus of the literature review for the project. In the next step, further research is done to try to identify the most influential and most cited scientific articles on the topic at hand. On this basis, the search will continue using the following query:
("Information security" OR "computer security" OR cybersecurity) AND ("Operational technologies" OR "critical infrastructures"), limited to articles in English, in scientific journals from the last 5 years.
In this example, adding English as a limit could have been omitted, since the previous limits already resulted in a number just below the 100 articles identified as a workable limit. However, at a later stage, articles that are not in English will still have to be eliminated if the reviewers are not able to read them. Anecdotal evidence also shows that publication language is not always reliably recorded in the databases.
To help students who might be doing this for the first time, arbitrary limits are mentioned, as students like to have specific numbered goals. What would be recommended for them is a minimum of 50 articles for master's level research, 100 for Ph.D. level work, and 200 for a Ph.D. dissertation, multiplied by the number of individuals in the team. These highly subjective limits should only be used as guidance for inexperienced researchers; experienced researchers should set their own limits in accordance with experience, resources, and the time available for the project.
Retrieving the documents is done using databases available on the Polytechnique Montréal library and the Concordia University library websites. They are selected as these are the libraries available to the authors of this article. As both universities have different database subscriptions, this allows for additional sources to be identified. However, it may result in many duplicates. The duplicates can be easily resolved later. This is a good strategy for research teams or post-graduate students that often have affiliations to different institutions. As shown in table 2, this resulted in 55 and 59 articles. They are all exported directly into the Zotero reference manager, using a web navigator plugin.
In this next step, databases that aggregate scientific articles or that offer larger data samples are used. This casts a wide net to increase the likelihood of including important literature in the project. In particular, the following databases are used, as they are among the best known and most used databases for citation counts:
Scopus.com
Web of Science
Engineering Village
Google Scholar
Scopus
As described online (https://www.elsevier.com/solutions/scopus), Scopus provides an abstract and citation database for scholarly literature. It provides metrics that are useful to determine the impact of peer-reviewed scientific journal articles. In the context discussed in this article, Scopus is used to identify influential articles on the topic at hand.
Using the query identified in the previous section, ("Information security" OR "computer security" OR cybersecurity) AND ("Operational technologies" OR "critical infrastructures"), the search in Scopus produced 267 results, sorted by number of citations. The top 50 references are exported to a file in BibTeX format. As well, the complete Scopus results can be exported to be used later to perform a bibliometric analysis, described in Strategy 2, later in this article. As described in that later section, this bibliometric analysis can serve as validation of the relevance of the process and the results. The bibliometric analysis can also provide additional insights into the domain.
Web of Science
Web of Science is similar to Scopus. It provides access and queries to a large citation database. However, as it is managed by a different media group, it offers different results from Scopus. The objective of using both is to catch the most cited articles on the topic. As duplicates are removed in a later step, this should limit the effect of any biases created by the different databases. Using the same query as in previous steps produced 103 results, sorted by number of citations. The top 50 are exported to a file in BibTeX format. Here as well, the complete Web of Science results can be exported to be used later for a bibliometric analysis.
Engineering Village
Engineering Village is a database of scientific journal articles that specializes in the fields of applied science and engineering. It is used to complement the previous searches. The search in this database produced 222 results, sorted by relevance. The top 50 are exported to a file in BibTeX format.
Google Scholar
Google Scholar is a service of the Google search engine that specializes in scientific publications. It is used in this search strategy to complement the previous searches with additional material. The search in this database produced 649 results, sorted by relevance. The top 40, which correspond to the first two pages of results, are exported using the web browser plugin.
Using the previous queries, the results from the library searches are imported into Zotero using the Google Chrome plugin. For this purpose, a library named Critical Infrastructure is created in the Zotero account. Zotero is chosen due to the familiarity of the research team with the product and because it is a tool recommended by librarians. However, there are many other similar tools that can be used to achieve the same result. For the Scopus search, it is necessary to export the results from the Scopus website in BibTeX format, adding Abstracts to the default export settings in the export menu on Scopus. This generates a file named Scopus.bib that can then be imported into Zotero, using the File – Import menu. A similar process is used for Web of Science and Engineering Village, but with the different default filenames created by those sites. For Google Scholar, the Chrome web browser plugin is used. In this example, only the first 40 Google Scholar entries, sorted by relevance, are imported. The number of Google Scholar results that are saved may vary based on the available time and resources involved in the project, but a maximum of 100 would be more than sufficient in most cases.
After all the references are imported into Zotero, it is necessary to remove duplicates in Zotero. This is required as the results from the different queries will overlap from one database to the other. It is done using a specific remove duplicates function in Zotero. In the example, once this is complete, there are 181 documents remaining.
The results are then submitted to a first review, looking only at the title of the articles and the abstract. This is done to quickly ensure articles are relevant and ready to be used for the next steps. In the example described here, 20 articles are removed as they did not indicate a link to the research subject. This step is also an opportunity to help identify terms and keywords that may become useful later in the process. These should be noted, as is done in this project in the list presented here:
critical infrastructure
public utilities (power grid, electricity production, nuclear power generation plant, wind turbines, gas distribution network, drinking water production and distribution)
From there, a reference mapping tool is used to again try to ensure that all the important references are found and included in the project. The web tool LitMap was chosen for this project (https://app.litmaps.co/), and a BibTeX export of the articles remaining after the triage step is imported.
Figure 1: LitMap graph
The LitMap tool then suggests new articles, which are potentially relevant based on what is already there. It also allows the research team to get a visual overview of the literature, helping them better understand what is there and identify the evolution of knowledge in the field, the connections in the literature, and the significant articles that are most connected to the corpus of knowledge.
Using LitMap, it is possible to identify additional relevant articles that are connected to the journal articles resulting from the previous steps. Several factors may come into play in determining relevance, such as shared references, keywords, authors, and themes. Using the Run Explore feature of LitMap produces a list of suggested articles. By looking at their title and abstract, it can be determined whether they should be added. Generally, articles that appear most likely to add value to the work should be added at this point. Articles published earlier than the date limit determined at step 3 should also be added if they are highly cited and relevant, as they may be key sources with a high impact in the research domain. Figure 2 gives an example of the Explore function of LitMap.
Figure 2: Explore function of LitMap
By using this feature of LitMap and refreshing after adding a few articles to the map, it is possible to add many other relevant articles that are highly connected to this map. An example is presented in figure 3. In the search performed in this article, after a few cycles of adding and refreshing, the map grew to 171 articles.
Following the previous step, the group of articles to be used in the literature review is identified. Using the export map to BibTeX functionality in LitMap produces a file that can then be imported into a new library in Zotero. This library will contain all the articles. Depending on the options selected in the import, and whether a new library collection is used, it may be necessary to remove duplicates. A good reason not to delete previous references, and to remove duplicates instead, is to take advantage of the full-text copies of the articles already attached to the existing Zotero references. Keeping these will save time in the next step, when all the articles are retrieved.
As mentioned, some of the articles are linked directly in Zotero, as they were already included in previous steps. However, for the next steps, it is necessary to have copies of all the articles available. Performing a literature review involves reading the articles, so having a copy of them is an obvious necessity. The process of retrieving the articles may be done in different ways. The recommended way is to add a copy of the article directly into Zotero, using the Add a copy of a file option. This requires a local copy of the PDF file of the article to be downloaded, which can usually be done by selecting the reference and double-clicking on it in Zotero, which opens the article from the library's database. Do note that finding all the articles may take some time, depending on their number; a ballpark estimate would be 5 to 10 minutes per missing reference. A PDF icon will appear next to articles that are added. In some cases, it may be necessary to copy the article title into a Google search, which generally makes it possible to find a link to access and download the article.
When saving a local copy of an article, using a standardized naming pattern will make later identification possible. Any scheme is fine as long as it is applied consistently throughout the project and the team. After a thorough search, articles that cannot be located should be removed, provided they are a small number, as they will not be useful for the next steps.
Once all the articles have been downloaded, a more in-depth review can be made to assess their relevance. This step could be done by a team of researchers with the assistance of students. It requires that inclusion and exclusion criteria be identified; at this point, there should be enough domain knowledge to make this feasible. If the number of articles is not too large, it might be acceptable to omit this step. As presented later in this article, a strategy that might be considered is to perform a review by looking at the abstracts only to assess relevance for inclusion or exclusion. Then a review can be performed by reading more comprehensive machine-generated summaries. This would be followed by the reading of the full articles that make it through the process.
Using the final selection of articles, this step requires one of two strategies: read or automate. Reading involves, as the name implies, that the articles be read, that key information be highlighted, and that notes be taken, using Zotero, Microsoft OneNote, or another tool that team members are familiar with. Automation would involve using natural language processing (NLP), perhaps by writing Python code for this purpose. Much analysis can also be done with RStudio, applying document analysis capabilities that are well documented online. Other strategies involve using specialized off-the-shelf document analysis or bibliometric tools, which can be purchased. This article makes no specific recommendation here, as there are too many factors at play in determining the best strategy, but further steps that can be used are presented in an upcoming section. Students or new researchers would be better off reading the articles and preparing notes, to learn and experiment with the process.
Once all the steps are completed, it becomes possible to use the material for the intended purpose, such as writing a literature review or performing further analysis, as presented in an upcoming section of this article. The next section demonstrates an example of the application of this method on a different topic: IT compliance management.
This section presents an example of the application of the process that is described in the previous part of this article. It is done by applying the various steps for a research project on IT compliance management.
Removed unrelated documents: tax compliance, customs risk and compliance, environmental compliance, healthcare-related compliance (such as medication or treatment), and Occupational Safety and Health Administration (OSHA) compliance
Removed: articles not in English
Removed: articles that did not appear to be at an appropriate level, or that were too basic to be considered scientific.
At this point all the documents are merged into a single Zotero collection, a last review for duplicates is performed, all PDF files are located, and article summaries are added when not present. A few articles are removed as it is not possible, after many attempts using different sources, to locate a PDF version of the article. In the end, 107 articles remained for analysis. A list of the remaining articles is presented in appendix A.
The first part of this article presented a method that can be used to identify journal articles and scientific literature for use in scientific research. The second part presented an example that concerned IT compliance management. As mentioned, getting started on a literature review for a research project can often be a difficult task for individuals starting in empirical research and for new graduate students; these are the prime audience for this article. In the next sections, we propose different tools and strategies to accelerate the process. It should be mentioned that some of the proposed approaches could be misused and produce results that could be considered plagiarism or academic fraud. Any use of the material in a dissertation needs to be discussed with research advisors, and ethical concerns investigated. However, the authors of this article believe that using tools to assist in research can be very beneficial when done appropriately.
The final part presents a few strategies that can be used to assist in the literature search process. The first strategy proposes tools to automate the creation of expanded text summaries that may be helpful to evaluate the usefulness of documents in more depth than what is provided by author-provided abstracts. The second and third strategies use RStudio to perform bibliometric analyses of the documents, to help gather initial insights into the corpus of knowledge that was assembled and accelerate the initial phases of research.
To accelerate the review of many articles, tools can be used, as mentioned in step 11. In this article, Wordtune Read (https://app.wordtune.com/read) is used to produce this initial analysis. A similar result can also be achieved by using Python code with machine learning libraries; however, a quicker approach using an off-the-shelf solution is preferred here. With this tool, once all the PDF versions of the articles have been located, as presented in step 10, they can simply be dragged and dropped from Zotero onto Wordtune Read to generate an initial summary. This summary can then be copied back into the notes section of Zotero, associated with the article. While an initial selection is made by reading the abstracts, this summary can then be used to perform a further review and selection. Of course, readers need to be reminded that this summary should not be used as-is to create an assignment, an article, or material intended for publication.
The process to generate an automated summary:
Select a citation in Zotero
Open the tab
Right-click on the PDF file and select Locate document
Drag-and-drop the PDF file on wordtune read
Wait for the summary to be created
Use the Copy All option in wordtune
Create a new note
Give the note a title, such as wordtune summary, to avoid misuse later
Paste the summary into the note
Once summaries of the articles are produced, they can be used to perform a second level of review, remembering that the first review is done by reading the author's abstract, available from the publisher. The Wordtune-produced summary provides further material to determine the relevance of an article for the study. As well, at this stage, a checklist of inclusion and exclusion criteria can be created to help the process. Eventually, Python and NLP could be used to perform a selection based on the summaries, should there be too many articles to review manually with the human resources available in the project.
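As a rough illustration of what such an automated pre-selection could look like, the sketch below flags summaries matching simple inclusion and exclusion keyword lists. It stays in R for consistency with the scripts in this article; the folder name, file layout and criteria are hypothetical.

# Hypothetical sketch: keyword screening of article summaries (one text file per article)
summary_files <- list.files("summaries", pattern = "txt$", full.names = TRUE)
summaries <- sapply(summary_files, function(f) paste(readLines(f, warn = FALSE), collapse = " "))
include <- c("compliance", "information security", "governance") # inclusion criteria (assumed)
exclude <- c("tax compliance", "customs", "environmental")       # exclusion criteria (assumed)
hits <- function(text, words) any(sapply(words, grepl, x = tolower(text), fixed = TRUE))
keep <- sapply(summaries, hits, words = include) & !sapply(summaries, hits, words = exclude)
data.frame(file = basename(summary_files), keep = keep) # shortlist for manual review

The result is only a shortlist; flagged articles still need to be reviewed by a person against the project's criteria.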
There are many different bibliometric approaches that can be useful to help get started. Keeping in mind that the primary audience of the authors of this article is in the Science, Technology, Engineering and Math (STEM) fields, the use of a statistical analysis tool, RStudio, is proposed. Using text analysis tools can help identify the more significant references that emerge from the documents identified previously. An example, with sample code, is presented. The article does not go into the installation and configuration of RStudio, which can easily be performed using information found online.
Statistical article analysis
The first analysis presented in this article consists of using RStudio to investigate the most significant keywords that can be found in the corpus of documents put together from the process described earlier.
The code used is:
# This R script is used to analyse large volumes of PDF files
# Created by Dr Marc-André Léger
# This version 28 June 2022

# This is the output Excel file name
excel_out <- "words_analysis_102.xlsx"

# load the required libraries
library("xlsx")
require(pdftools) # reads pdf documents
require(tm)       # text mining analysis

# get all the files
files <- list.files("documents", pattern = "pdf$", full.names = TRUE, recursive = TRUE)
opinions <- lapply(files, pdf_text)
length(opinions)         # make sure how many files are loaded
lapply(opinions, length) # and the length in pages of each PDF file

# create a PDF database for the wordcloud and the stemmed analysis
pdfdatabase <- Corpus(URISource(files), readerControl = list(reader = readPDF))
pdfdatabase <- tm_map(pdfdatabase, removePunctuation, ucp = TRUE)
opinions.tdm <- TermDocumentMatrix(pdfdatabase, control = list(removePunctuation = TRUE,
    stopwords = TRUE, tolower = TRUE, stemming = FALSE, removeNumbers = TRUE,
    bounds = list(global = c(3, Inf))))
inspect(opinions.tdm[10:20, ]) # examine 10 words at a time across documents
opinionstemmed.tdm <- TermDocumentMatrix(pdfdatabase, control = list(removePunctuation = TRUE,
    stopwords = TRUE, tolower = TRUE, stemming = TRUE, removeNumbers = TRUE,
    bounds = list(global = c(3, Inf))))
inspect(opinionstemmed.tdm[10:20, ]) # examine 10 words at a time across documents

# prepare the word matrix
ft <- findFreqTerms(opinions.tdm, lowfreq = 100, highfreq = Inf)
ft.tdm <- as.matrix(opinions.tdm[ft, ])
df <- sort(apply(ft.tdm, 1, sum), decreasing = TRUE)

# prepare the word matrix for the stemmed word analysis
ft2 <- findFreqTerms(opinionstemmed.tdm, lowfreq = 100, highfreq = Inf)
ft2.tdm <- as.matrix(opinionstemmed.tdm[ft2, ])
df2 <- sort(apply(ft2.tdm, 1, sum), decreasing = TRUE)

# print(ft.tdm) # this might be used for debugging
# print(df)     # this might be used for debugging

# save the results
output1 <- data.frame(df)
output2 <- data.frame(ft.tdm)
output3 <- data.frame(df2)
output4 <- data.frame(ft2.tdm)

# then export them to an Excel file
tmp1 <- write.xlsx(output1, excel_out, sheetName = "Articles", col.names = TRUE, row.names = TRUE, append = FALSE)
tmp2 <- write.xlsx(output2, excel_out, sheetName = "Words", col.names = TRUE, row.names = TRUE, append = TRUE)
tmp3 <- write.xlsx(output3, excel_out, sheetName = "Articles_Stemmed", col.names = TRUE, row.names = TRUE, append = TRUE)
tmp4 <- write.xlsx(output4, excel_out, sheetName = "Words_Stemmed", col.names = TRUE, row.names = TRUE, append = TRUE)
This example makes it possible to produce an Excel file with the results from the documents that have been identified. Table x presents the ten most frequent words from the documents.
Word | Occurrences
compliance | 7770
information | 3753
security | 3475
management | 3195
business | 3185
process | 3185
data | 2343
can | 2284
research | 2218
model | 2087
Table x: The ten most frequent words from the documents
From there, further analysis in Excel, selecting the most relevant words in stemmed form, makes it possible to identify documents that are likely to bring significant insights to the project. The results of this inquiry are presented in table x.
Reference | complianc | secur | risk | control | audit | govern | noncompli | cybersecur | Relevance
Hashmi2018d | 682 | 29 | 43 | 83 | 50 | 22 | 25 | 0 | 934
Akhigbe2019 | 253 | 32 | 2 | 2 | 0 | 19 | 0 | 0 | 308
Ali2021 | 249 | 476 | 9 | 43 | 0 | 7 | 119 | 4 | 907
Rinderle.Ma2022 | 234 | 3 | 19 | 20 | 6 | 1 | 1 | 0 | 284
Castellanos2022 | 231 | 15 | 9 | 21 | 4 | 9 | 2 | 6 | 297
Hashmi2018c | 227 | 4 | 3 | 9 | 11 | 2 | 11 | 0 | 267
Haelterman2020 | 222 | 9 | 82 | 55 | 9 | 11 | 1 | 0 | 389
Yazdanmehr2020 | 220 | 90 | 5 | 39 | 0 | 2 | 13 | 0 | 369
Cabanillas2020 | 200 | 3 | 2 | 61 | 5 | 5 | 1 | 0 | 277
Usman2020 | 198 | 7 | 3 | 0 | 0 | 3 | 1 | 0 | 212
Mustapha2018 | 193 | 2 | 19 | 32 | 2 | 2 | 0 | 0 | 250
Mustapha2020 | 190 | 7 | 1 | 26 | 1 | 3 | 1 | 0 | 229
Meissner2018 | 187 | 6 | 30 | 8 | 3 | 14 | 7 | 0 | 255
Kim2020 | 187 | 4 | 19 | 11 | 1 | 4 | 4 | 0 | 230
Konig2017 | 173 | 71 | 2 | 26 | 3 | 0 | 5 | 0 | 280
Mubarkoot2021 | 166 | 38 | 7 | 5 | 7 | 16 | 2 | 0 | 241
Gorgon2019 | 159 | 7 | 145 | 20 | 8 | 30 | 6 | 0 | 375
Donalds2020 | 149 | 295 | 13 | 7 | 0 | 3 | 1 | 77 | 545
Cheng2018 | 143 | 40 | 6 | 38 | 21 | 5 | 2 | 0 | 255
Chen2018 | 138 | 197 | 2 | 44 | 0 | 2 | 1 | 0 | 384
Lin2022 | 132 | 0 | 28 | 2 | 1 | 45 | 17 | 0 | 225
Huising2021 | 129 | 3 | 21 | 18 | 18 | 85 | 2 | 0 | 276
Alqahtani2021 | 129 | 197 | 6 | 2 | 0 | 14 | 4 | 10 | 362
Jin2021 | 125 | 4 | 72 | 4 | 2 | 0 | 1 | 0 | 208
Banuri2021 | 123 | 2 | 11 | 8 | 21 | 4 | 3 | 0 | 172
Alotaibi2019 | 119 | 199 | 3 | 6 | 0 | 3 | 44 | 0 | 374
Pathania2019 | 118 | 19 | 1 | 4 | 0 | 0 | 0 | 0 | 142
Asif2019 | 117 | 1 | 6 | 4 | 14 | 27 | 3 | 0 | 172
Hendra2021 | 112 | 5 | 8 | 3 | 1 | 3 | 3 | 0 | 135
Pand2020 | 112 | 4 | 10 | 0 | 5 | 3 | 72 | 0 | 206
Hashmi2018b | 110 | 3 | 6 | 3 | 0 | 2 | 1 | 0 | 125
Arogundade2020 | 109 | 2 | 13 | 27 | 1 | 1 | 0 | 0 | 153
Niedzela2021 | 108 | 7 | 18 | 13 | 9 | 1 | 9 | 0 | 165
Petersson2021 | 93 | 158 | 9 | 4 | 0 | 4 | 31 | 0 | 299
Rahmouni2021 | 84 | 30 | 5 | 35 | 60 | 6 | 3 | 3 | 226
Nietsch2018 | 84 | 4 | 28 | 20 | 3 | 26 | 8 | 0 | 173
Wang2020 | 82 | 2 | 25 | 12 | 284 | 6 | 11 | 0 | 422
Hanrahan2021 | 78 | 51 | 93 | 9 | 0 | 20 | 1 | 0 | 252
Moody2018 | 77 | 245 | 9 | 79 | 0 | 1 | 9 | 0 | 420
DArcy2019 | 75 | 94 | 2 | 23 | 0 | 1 | 6 | 0 | 201
Corea2020 | 73 | 0 | 0 | 3 | 5 | 3 | 1 | 0 | 85
Cunha2021 | 69 | 8 | 3 | 6 | 1 | 0 | 1 | 1 | 89
Dai2021 | 68 | 0 | 1 | 38 | 4 | 7 | 5 | 0 | 123
Koohang2020 | 67 | 104 | 4 | 9 | 0 | 2 | 0 | 0 | 186
Bussmann2019 | 67 | 0 | 7 | 27 | 0 | 2 | 1 | 0 | 104
Asif2020 | 64 | 1 | 24 | 4 | 15 | 35 | 13 | 0 | 156
Koohang2020 | 61 | 112 | 4 | 9 | 0 | 0 | 0 | 1 | 187
Winter2020 | 60 | 3 | 0 | 1 | 0 | 0 | 1 | 0 | 65
Torre2019 | 57 | 9 | 2 | 24 | 1 | 0 | 1 | 0 | 94
Cammelli2022 | 50 | 5 | 10 | 5 | 16 | 22 | 19 | 0 | 127
Ragulina2019 | 48 | 1 | 1 | 3 | 1 | 10 | 0 | 0 | 64
Barlow2018 | 46 | 224 | 5 | 9 | 0 | 1 | 11 | 0 | 296
Scope2021 | 45 | 8 | 2 | 4 | 2 | 13 | 1 | 0 | 75
Salguero.Caparros2020 | 45 | 0 | 17 | 6 | 3 | 1 | 14 | 0 | 86
Gaur2019 | 44 | 2 | 36 | 10 | 2 | 12 | 3 | 0 | 109
Becker2019 | 43 | 2 | 27 | 3 | 2 | 2 | 6 | 0 | 85
Hashmi2018 | 40 | 128 | 8 | 16 | 4 | 6 | 0 | 0 | 202
Lembcke2019 | 38 | 74 | 2 | 9 | 0 | 0 | 1 | 0 | 124
Painter2019 | 38 | 0 | 23 | 9 | 13 | 17 | 0 | 0 | 100
Norimarna2021 | 34 | 0 | 70 | 6 | 2 | 44 | 0 | 1 | 157
Ophoff2021 | 34 | 106 | 2 | 0 | 2 | 4 | 56 | 15 | 219
Becker2020 | 30 | 0 | 28 | 9 | 0 | 3 | 0 | 2 | 72
Sackmann2018 | 23 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 24
Pudijanto2021 | 22 | 3 | 21 | 6 | 126 | 17 | 0 | 0 | 195
Culot2021 | 19 | 112 | 23 | 28 | 7 | 33 | 0 | 16 | 238
Pankowska2019 | 13 | 7 | 40 | 32 | 0 | 26 | 0 | 0 | 118
Johannsen2020 | 10 | 33 | 8 | 2 | 0 | 8 | 0 | 1 | 62
Mukhopadhyay2019 | 10 | 172 | 99 | 16 | 6 | 10 | 1 | 1 | 315
Widjaya2019 | 9 | 23 | 4 | 7 | 3 | 28 | 0 | 0 | 74
Hofman2018 | 6 | 0 | 1 | 1 | 0 | 5 | 3 | 0 | 16
Al.Anzi2014 | 6 | 62 | 7 | 14 | 2 | 5 | 0 | 0 | 96
Na2019 | 2 | 141 | 19 | 4 | 0 | 7 | 0 | 1 | 174
Jensen1976 | 1 | 17 | 28 | 29 | 2 | 6 | 0 | 0 | 83
Offerman2017 | 1 | 0 | 0 | 1 | 0 | 6 | 0 | 0 | 8
Costa2016 | 0 | 0 | 0 | 3 | 0 | 0 | 0 | 0 | 3
Alshehri2019 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Total | 7760 | 3723 | 1321 | 1179 | 769 | 747 | 569 | 139 | 16207
Document count | 74 | 64 | 70 | 71 | 47 | 65 | 53 | 14 | 76
What the data from table x reveals is the significance of certain articles in relation to the research subject, as well as in relation to the different terms of interest for the project. In the table, the first column of counts contains the occurrences of the stem variations on compliance in each article; this includes compliance and the many variations on that word stem. As this is the main topic of our inquiry, it is quite logical that it is the most frequent term. As well, the document with the highest count of the word compliance also has a high frequency of other keywords that are highly correlated to our research subject. The sum of the occurrences of the significant keywords is given in the last column, relevance, which indicates the relative importance of a particular article for the research subject. The combination of a high count of the most important keyword for our project and the highest overall relevance gives this document a high potential of being very relevant for our project. It should be a high priority on our reading list.
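For readers who want to reproduce the relevance column of the table, it can be computed directly from the stemmed term-document matrix built by the script above. This is a minimal sketch, assuming the opinionstemmed.tdm object from that script is still in memory.

# Compute the per-document relevance score from the stemmed term-document matrix
stems <- c("complianc", "secur", "risk", "control", "audit",
           "govern", "noncompli", "cybersecur")
m <- as.matrix(opinionstemmed.tdm[stems, ])      # keyword stems x documents
relevance <- sort(colSums(m), decreasing = TRUE) # total keyword occurrences per document
head(relevance, 10)                              # the ten most relevant documents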
Wordcloud
Wordclouds present a graphical representation of the most significant words that appear in the corpus of documents, with the relative size of each word representing its frequency across all the documents. The code used is:
# This R script is used to create a wordcloud from PDF files
# Created by Dr Marc-André Léger
# This version 28 June 2022

# uncomment this section if the packages are not already installed
# install.packages("wordcloud")
# install.packages("RColorBrewer")
# install.packages("wordcloud2")

# load the required libraries
library("wordcloud")
library("wordcloud2")
library("RColorBrewer")
require(pdftools) # reads pdf documents
require(tm)       # text mining analysis

# get all the files
files <- list.files("documents", pattern = "pdf$", full.names = TRUE, recursive = TRUE)
opinions <- lapply(files, pdf_text)
length(opinions)         # make sure how many files are loaded
lapply(opinions, length) # and the length in pages of each PDF file

# create a PDF database for the wordcloud and the stemmed analysis
pdfdatabase <- Corpus(URISource(files), readerControl = list(reader = readPDF))
pdfdatabase <- tm_map(pdfdatabase, removePunctuation, ucp = TRUE)
opinions.tdm <- TermDocumentMatrix(pdfdatabase, control = list(removePunctuation = TRUE,
    stopwords = TRUE, tolower = TRUE, stemming = FALSE, removeNumbers = TRUE,
    bounds = list(global = c(3, Inf))))

# prepare the word matrix for the wordcloud
ft <- findFreqTerms(opinions.tdm, lowfreq = 100, highfreq = Inf)
ft.tdm <- as.matrix(opinions.tdm[ft, ])
freq1 <- apply(ft.tdm, 1, sum)

# finally the wordcloud
set.seed(1234)
wordcloud(words = ft, freq = freq1, min.freq = 10, max.words = 200,
    random.order = FALSE, rot.per = 0.35, colors = brewer.pal(8, "Dark2"))
In this example, the un-stemmed version of the words is used to provide more readable results. This can be helpful in presenting the research or for communications on the research topic. Another use is to confirm the choices made in identifying the keywords used for the literature review, or to help validate the corpus in relation to the research topic: the wordcloud should show that the more frequent words align with the research topic. The result can be seen in figure 4.
Further analysis of the corpus of documents can be performed to gather additional insights into the research subject. Bibliometric analysis helps to better understand the links between the documents, the authors, and the research field. What is proposed is the use of a bibliometric analysis tool called Bibliometrix, available online at https://www.bibliometrix.org/home/. Other tools, such as Quanteda, can also be used for this purpose. For novice researchers, Bibliometrix has a graphical user interface, called Biblioshiny, which is documented online. Some examples of the information that can be extracted from this tool are presented in this article, and more information is available online on how to get all the benefits from it.
In the example below, the scopus.bib and wos.bib files from the earlier step are used. Starting RStudio, the following instructions are used to start Biblioshiny:
library(bibliometrix) # load bibliometrix package
biblioshiny() # start the graphical user interface
Figure 5 shows the graphical user interface with the Scopus data loaded from the earlier example on compliance. This is done via Data – Load Data. The data can then be used to help validate the information identified. The Overview – Main Information menu provides a better overview of the data, as shown in figure 6.
Figure 5: Bibliometrix Shiny graphical interface
Figure 6 shows that there are 530 different sources, covering a timespan from 1973 to 2022. In the earlier analysis, data from 2018 to 2022 were used to focus on recent sources of scientific data on the research topic. What this shows is that Scopus contains articles going back to 1973 on this topic. Further investigation, using Google Scholar, will show the evolution of the domain.
Figure 6: overview of the data
A scan of Google Scholar, Scopus and Web of Science citations is presented in figure 7. It indicates a surge in publications around 1999-2000. This would make sense to those familiar with the domain, as anecdotal evidence suggests a significant increase in interest in the topic of compliance since that period, when a few well-known financial scandals brought it to the forefront. As well, there has been a significant increase in Governance, Risk and Compliance issues since then.
By exploiting the data and using the different reports, further insights can be gathered. We can identify the most cited authors, presented in table x. In using the source material, these authors should be included. Of course, the articles need to be reviewed and taken in context, but material from these authors should be prioritized.
Author | Citations
GOVERNATORI G | 172
HINA S | 162
DOMINIC DD | 161
HASHMI M | 146
SOMMESTAD T | 103
HALLBERG J | 92
KUMAR A | 89
RINDERLE-MA S | 86
BUKSA I | 78
RUDZAJS P | 78
Table x shows the most cited articles. Here as well, these articles have a high potential of being very important to this field of enquiry. This should be confirmed by reading the articles, but they need to be included in the next phases of the literature review.
Document | Year | Local Citations | Global Citations
BULGURCU B, 2010, MIS QUART MANAGE INF SYST | 2010 | 161 | 1076
HERATH T, 2009, EUR J INF SYST | 2009 | 120 | 788
HERATH T, 2009, DECIS SUPPORT SYST | 2009 | 83 | 534
IFINEDO P, 2012, COMPUT SECUR | 2012 | 78 | 435
VANCE A, 2012, INF MANAGE | 2012 | 71 | 417
SIPONEN M, 2014, INF MANAGE | 2014 | 61 | 290
SADIQ S, 2007, LECT NOTES COMPUT SCI | 2007 | 60 | 306
PAHNILA S, 2007, PROC ANNU HAWAII INT CONF SYST SCI | 2007 | |
Aria M, Cuccurullo C (2017). “bibliometrix: An R-tool for comprehensive science mapping analysis.” Journal of Informetrics, 11(4), 959-975. https://doi.org/10.1016/j.joi.2017.08.007.
Abbasipour, M., Khendek, F., & Toeroe, M. (2018). Trigger correlation for dynamic system reconfiguration. Proceedings of the ACM Symposium on Applied Computing, 427‑430. https://doi.org/10.1145/3167132.3167383
Afrifah, W., Epiphaniou, G., Ersotelos, N., & Maple, C. (2022). Barriers and opportunities in cyber risk and compliance management for data-driven supply chains.
Akhigbe, O., Amyot, D., & Richards, G. (2019). A systematic literature mapping of goal and non-goal modelling methods for legal and regulatory compliance. Requirements Engineering, 24(4), 459‑481. https://doi.org/10.1007/s00766-018-0294-1
Ali, R. F., Dominic, P. D. D., Ali, S. E. A., Rehman, M., & Sohail, A. (2021). Information security behavior and information security policy compliance : A systematic literature review for identifying the transformation process from noncompliance to compliance. Applied Sciences, 11(8), 3383.
Alotaibi, M. J., Furnell, S., & Clarke, N. (2019). A framework for reporting and dealing with end-user security policy compliance. 27(1), 2‑25. https://doi.org/10.1108/ICS-12-2017-0097
Alqahtani, M., & Braun, R. (2021). Reviewing influence of UTAUT2 factors on cyber security compliance: A literature review. Journal of Information Assurance & Cyber Security.
Alshammari, S. T., Alsubhi, K., Aljahdali, H. M. A., & Alghamdi, A. M. (2021). Trust Management Systems in Cloud Services Environment: Taxonomy of Reputation Attacks and Defense Mechanisms. IEEE Access, 9. https://doi.org/10.1109/ACCESS.2021.3132580
Alshehri, F., Kauser, S., & Fotaki, M. (2019). Muslims’ View of God as a Predictor of Ethical Behaviour in Organisations: Scale Development and Validation. Journal of Business Ethics, 158(4), 1009-1027. https://doi.org/10.1007/s10551-017-3719-8
Antonucci, Y. L., Fortune, A., & Kirchmer, M. (2021). An examination of associations between business process management capabilities and the benefits of digitalization: All capabilities are not equal. Business Process Management Journal, 27(1), 124-144. https://doi.org/10.1108/BPMJ-02-2020-0079
Arsenijević, O., Podbregar, I., Šprajc, P., Trivan, D., & Ziegler, Y. (2018). The Concept of Innovation of User Roles and Authorizations from View of Compliance Management. Organization and Uncertainty in the Digital Age, 747.
Asif, M. (2020). Supplier socioenvironmental compliance: A survey of the antecedents of standards decoupling. Journal of Cleaner Production, 246, 118956.
Asif, M., Jajja, M. S. S., & Searcy, C. (2019). Social compliance standards: Re-evaluating the buyer and supplier perspectives. Journal of Cleaner Production, 227, 457-471.
Banuri, S. (2021). A Behavioural Economics Perspective on Compliance.
Barlow, J. B., Warkentin, M., Ormond, D., & Dennis, A. R. (2018). Don’t Even Think About It! The Effects of Antineutralization, Informational, and Normative Communication on Information Security Compliance. Journal of the Association for Information Systems. https://doi.org/10.17705/1JAIS.00506
Becker, M., Merz, K., & Buchkremer, R. (2020). RegTech—the application of modern information technology in regulatory affairs: Areas of interest in research and practice. Intelligent Systems in Accounting, Finance and Management, 27(4), 161-167. https://doi.org/10.1002/isaf.1479
Brandis, K., Dzombeta, S., Colomo-Palacios, R., & Stantchev, V. (2019). Governance, risk, and compliance in cloud scenarios. Applied Sciences (Switzerland), 9(2). https://doi.org/10.3390/app9020320
Bussmann, K. D., & Niemeczek, A. (2019). Compliance through company culture and values: An international study based on the example of corruption prevention. Journal of Business Ethics, 157(3), 797-811.
Cabanillas, C., Resinas, M., & Ruiz-Cortes, A. (2020). A Mashup-based Framework for Business Process Compliance Checking. https://doi.org/10.1109/TSC.2020.3001292
Castellanos Ardila, J. P., Gallina, B., & Ul Muram, F. (2022). Compliance checking of software processes: A systematic literature review. Journal of Software: Evolution and Process, 34(5), e2440.
Chen, X., Chen, L., & Wu, D. (2018). Factors That Influence Employees’ Security Policy Compliance: An Awareness-Motivation-Capability Perspective. Journal of Computer Information Systems. https://doi.org/10.1080/08874417.2016.1258679
Cheng, D. C., Villamarin, J. B., Cu, G., & Lim-Cheng, N. R. (2018). Towards end-to-end continuous monitoring of compliance status across multiple requirements. 9(12), 456‑466. https://doi.org/10.14569/IJACSA.2018.091264
Cheng, D. C., Villamarin, J. B., Cu, G., & Lim-Cheng, N. R. (2019). Towards Compliance Management Automation thru Ontology mapping of Requirements to Activities and Controls. In Proceedings of the 2018 Cyber Resilience Conference, CRC 2018. Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/CR.2018.8626817
Coglianese, C., & Nash, J. (2020). Compliance Management Systems: Do they make a difference? Cambridge Handbook of Compliance (D. Daniel Sokol & Benjamin van Rooij eds., Cambridge University Press, Forthcoming), U of Penn, Inst for Law & Econ Research Paper, 20-35.
Corea, C., & Delfmann, P. (2020). A Taxonomy of Business Rule Organizing Approaches in Regard to Business Process Compliance. Enterprise Modelling and Information Systems Architectures (EMISAJ). https://doi.org/10.18417/EMISA.15.4
Culot, G., Nassimbeni, G., Podrecca, M., & Sartor, M. (2021). The ISO/IEC 27001 information security management standard: Literature review and theory-based research agenda. The TQM Journal.
Cunha, V. H. C., Caiado, R. G. G., Corseuil, E. T., Neves, H. F., & Bacoccoli, L. (2021). Automated compliance checking in the context of Industry 4.0: From a systematic review to an empirical fuzzy multi-criteria approach. Soft Computing, 25(8), 6055-6074.
Dai, F. (2021). Labor control strategy in China: Compliance management practice in the socialist workplace. 21(3), 86-101.
Danielis, P., Beckmann, M., & Skodzik, J. (2020). An ISO-Compliant Test Procedure for Technical Risk Analyses of IoT Systems Based on STRIDE. In Proceedings of the 2020 IEEE 44th Annual Computers, Software, and Applications Conference, COMPSAC 2020 (pp. 499-504). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/COMPSAC48688.2020.0-203
D’Arcy, J., & Teh, P.-L. (2019). Predicting employee information security policy compliance on a daily basis: The interplay of security-related stress, emotions, and neutralization. Information & Management. https://doi.org/10.1016/J.IM.2019.02.006
Donalds, C. M., & Osei-Bryson, K.-M. (2020). Cybersecurity compliance behavior: Exploring the influences of individual decision style and other antecedents. International Journal of Information Management. https://doi.org/10.1016/J.IJINFOMGT.2019.102056
Ekanoye, F., & James, O. (2018). Global Market Access Regulations, Compliance Management in Developing Countries: A Brief Case Study of Three African Countries. 2018 IEEE Symposium on Product Compliance Engineering (SPCEB-Boston), 1-6.
Fdhila, W., Rinderle-Ma, S., Knuplesch, D., & Reichert, M. (2020). Decomposition-based Verification of Global Compliance in Process Choreographies. Proceedings – 2020 IEEE 24th International Enterprise Distributed Object Computing Conference, EDOC 2020, 77‑86. https://doi.org/10.1109/EDOC49727.2020.00019
Gallina, B. (2020). A Barbell Strategy-oriented Regulatory Framework and Compliance Management. Communications in Computer and Information Science, 1251 CCIS, 696‑705. https://doi.org/10.1007/978-3-030-56441-4_52
Gaur, A., Ghosh, K., & Zheng, Q. (2019). Corporate social responsibility (CSR) in Asian firms: A strategic choice perspective of ethics and compliance management. 13(4), 633-655. https://doi.org/10.1108/JABS-03-2019-0094
Ghiran, A.-M., Buchmann, R. A., & Osman, C.-C. (2018). Security requirements elicitation from engineering governance, risk management and compliance. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 10753 LNCS, 283‑289. https://doi.org/10.1007/978-3-319-77243-1_17
Gorgoń, M., Raczkowski, K., & Kraft, F. (2019). Compliance Risk Management in Polish and German Companies. Journal of Intercultural Management, 11(4), 115‑145.
Haelterman, H. (2020). Breaking Silos of Legal and Regulatory Risks to Outperform Traditional Compliance Approaches. European Journal on Criminal Policy and Research, 28(1), 19‑36. https://doi.org/10.1007/s10610-020-09468-x
Hanrahan, P., & Bednall, T. (2021). From Stepping-Stones to Throwing Stones: Officers’ Liability for Corporate Compliance Failures after Cassimatis. Federal Law Review, 49(3), 380-409.
Hashmi, A., Ranjan, A., & Anand, A. (2018). Security and Compliance Management in Cloud Computing. International Journal of Advanced Studies in Computer Science and Engineering, 7(1), 47‑54.
Hashmi, M., Casanovas, P., & de Koker, L. (2018). Legal compliance through design: Preliminary results of a literature survey. TERECOM2018@JURIX: Technologies for Regulatory Compliance, CEUR Workshop Proceedings, 2309, 6. http://ceur-ws.org
Hashmi, M., & Governatori, G. (2018). Norms modeling constructs of business process compliance management frameworks: A conceptual evaluation. Artificial Intelligence and Law, 26(3), 251-305. https://doi.org/10.1007/s10506-017-9215-8
Hashmi, M., Governatori, G., Lam, H.-P., & Wynn, M. T. (2018). Are we done with business process compliance: State of the art and challenges ahead. Knowledge and Information Systems: An International Journal, 57(1), 79-133. https://doi.org/10.1007/s10115-017-1142-1
Hendra, R. (2021). Comparative Review of the Latest Concept in Compliance Management & The Compliance Management Maturity Models. RSF Conference Series: Business, Management and Social Sciences, 1(5), 116‑124.
Hofmann, A. (2018). Is the Commission levelling the playing field? Rights enforcement in the European Union. Journal of European Integration, 40(6), 737‑751. https://doi.org/10.1080/07036337.2018.1501368
Huising, R., & Silbey, S. S. (2021). Accountability infrastructures: Pragmatic compliance inside organizations. Regulation & Governance, 15, S40-S62.
Javed, M. A., Muram, F. U., & Kanwal, S. (2022). Ontology-Based Natural Language Processing for Process Compliance Management. Communications in Computer and Information Science, 1556 CCIS, 309‑327. https://doi.org/10.1007/978-3-030-96648-5_14
Jin, L., He, C., Wang, X., Wang, M., & Zhang, L. (2021). The effectiveness evaluation of system construction for compliance management in the electricity market. IOP Conference Series: Earth and Environmental Science. https://doi.org/10.1088/1755-1315/647/1/012024
Kavitha, D., & Ravikumar, S. (2021). Software Security Requirement Engineering for Risk and Compliance Management.
Koohang, A., Nord, J. H., Sandoval, Z. V., & Paliszkiewicz, J. (2020). Reliability, Validity, and Strength of a Unified Model for Information Security Policy Compliance. Journal of Computer Information Systems. https://doi.org/10.1080/08874417.2020.1779151
Koohang, A., Nowak, A., Paliszkiewicz, J., & Nord, J. H. (2020). Information Security Policy Compliance: Leadership, Trust, Role Values, and Awareness. Journal of Computer Information Systems. https://doi.org/10.1080/08874417.2019.1668738
Labanca, D., Primerano, L., Markland-Montgomery, M., Polino, M., Carminati, M., & Zanero, S. (2022). Amaretto: An Active Learning Framework for Money Laundering Detection. IEEE Access, 10. https://doi.org/10.1109/ACCESS.2022.3167699
Lahann, J., Scheid, M., & Fettke, P. (2019). Utilizing machine learning techniques to reveal VAT compliance violations in accounting data. In Proceedings of the 21st IEEE Conference on Business Informatics, CBI 2019 (Vol. 1, pp. 1-10). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/CBI.2019.00008
Lembcke, T.-B., Masuch, K., Trang, S., Hengstler, S., Plics, P., & Pamuk, M. (2019). Fostering Information Security Compliance: Comparing the Predictive Power of Social Learning Theory and Deterrence Theory. Americas Conference on Information Systems.
Liu, B. (2021). Construction of enterprise compliance management and supervision system based on ADR mechanism in Internet Environment. Proceedings – 2021 International Conference on Management Science and Software Engineering, ICMSSE 2021, 314‑317. https://doi.org/10.1109/ICMSSE53595.2021.00073
Luo, M., Wu, C., & Chen, Y. (2019). Construction of Ping An Airport’s total risk monitoring indicator system. ICTIS 2019 - 5th International Conference on Transportation Information and Safety, 829-832. https://doi.org/10.1109/ICTIS.2019.8883586
Meissner, M. H. (2018). Accountability of senior compliance management for compliance failures in a credit institution. Journal of Financial Crime.
Mohamed, A. A., El-Bendary, N., & Abdo, A. (2021). An Essential Intelligent Framework for Regulatory Compliance Management in the Public Sector: The Case of Healthcare Insurance in Egypt. Proceedings of the Computational Methods in Systems and Software, 397-409.
Moody, G. D., Siponen, M. T., & Pahnila, S. (2018). Toward a Unified Model of Information Security Policy Compliance. Management Information Systems Quarterly. https://doi.org/10.25300/MISQ/2018/13853
Mubarkoot, M., & Altmann, J. (2021a). Software Compliance in different Industries: A Systematic Literature Review.
Mubarkoot, M., & Altmann, J. (2021b). Towards Software Compliance Specification and Enforcement Using TOSCA. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 13072 LNCS, 168‑177. https://doi.org/10.1007/978-3-030-92916-9_14
Mukhopadhyay, A., Chatterjee, S., Bagchi, K. K., Kirs, P. J., & Shukla, G. K. (2019). Cyber Risk Assessment and Mitigation (CRAM) Framework Using Logit and Probit Models for Cyber Insurance. Information Systems Frontiers: A Journal of Research and Innovation, 21(5), 997-1018. https://doi.org/10.1007/s10796-017-9808-5
Mustapha, A. M., Arogundade, O. T., Misra, S., Damasevicius, R., & Maskeliunas, R. (2020). A systematic literature review on compliance requirements management of business processes. International Journal of System Assurance Engineering and Management, 11(3), 561‑576.
Mustapha, A. M., Arogundade, O. T., Vincent, O. R., & Adeniran, O. J. (2018). Towards a compliance requirement management for SMSEs: A model and architecture. 16(1), 155-185. https://doi.org/10.1007/s10257-017-0354-y
Na, O., Park, L. W., Yu, H., Kim, Y., & Chang, H. (2019). The rating model of corporate information for economic security activities. Security Journal, 32(4), 435‑456. https://doi.org/10.1057/s41284-019-00171-z
Niedzela, L., Kuehnel, S., & Seyffarth, T. (2021). Economic Assessment and Analysis of Compliance in Business Processes: A Systematic Literature Review and Research Agenda.
Nietsch, M. (2018). Corporate illegal conduct and directors’ liability: An approach to personal accountability for violations of corporate legal compliance. Journal of Corporate Law Studies, 18(1), 151-184. https://doi.org/10.1080/14735970.2017.1365460
Norimarna, S. (2021). Conceptual Review: Compatibility of regulatory requirements of FSA to Insurance industry in Indonesia for Integrated GRC. RSF Conference Series: Business, Management and Social Sciences, 1(5), 105-115.
Oosthuizen, A., van Vuuren, J., & Botha, M. (2020). Compliance or management: The benefits that small business owners gain from frequently sourcing accounting services. The Southern African Journal of Entrepreneurship and Small Business Management, 12(1). https://doi.org/10.4102/sajesbm.v12i1.330
Ophoff, J., & Renaud, K. (2021). Revealing the Cyber Security Non-Compliance “Attribution Gulf”. Hawaii International Conference on System Sciences. https://doi.org/10.24251/HICSS.2021.552
Ozeer, U. (2021). ϕ comp: An Architecture for Monitoring and Enforcing Security Compliance in Sensitive Health Data Environment. Proceedings - 2021 IEEE 18th International Conference on Software Architecture Companion, ICSA-C 2021, 70-77. https://doi.org/10.1109/ICSA-C52384.2021.00017
Packin, N. G. (2018). Regtech, Compliance and Technology Judgement Rule. Chicago-Kent Law Review, 93(1).
Painter, M., Pouryousefi, S., Hibbert, S., & Russon, J.-A. (2019). Sharing Vocabularies: Towards Horizontal Alignment of Values-Driven Business Functions. Journal of Business Ethics, 155(4), 965-979. https://doi.org/10.1007/s10551-018-3901-7
Pang, S., Yang, J., Li, R., & Cao, J. (2020). Static Game Models and Applications Based on Market Supervision and Compliance Management of P2P Platform. 2020. https://doi.org/10.1155/2020/8869132
Pankowska, M. (2019). Information technology outsourcing chain : Literature review and implications for development of distributed coordination. Sustainability, 11(5), 1460.
Pathania, A., & Rasool, G. (2019). Investigating power styles and behavioural compliance for effective hospital administration: An application of AHP. International Journal of Health Care Quality Assurance.
Petersson, J., Karlsson, F., & Kolkowska, E. (2021). Information Security Policy Compliance—Eliciting Requirements for a Computerized Software to support Value-Based Compliance Analysis. Computers & Security. https://doi.org/10.1016/J.COSE.2021.102578
Petkevičienė, M. (2021). Compliance management development for C-level management in Lithuanian companies [Master’s Thesis].
Prakash, A. M., He, Q., & Zhong, X. (2019). Incentive-driven post-discharge compliance management for chronic disease patients in healthcare service operations. IISE Transactions on Healthcare Systems Engineering, 9(1), 71‑82. https://doi.org/10.1080/24725579.2019.1567630
Pudjianto, W. (2021). Process mining in governance, risk management, compliance (GRC) and auditing: A systematic literature review. Journal of Theoretical and Applied Information Technology, 99(18).
Ragulina, J. V. (2019). Compliance Approaches and Practices for Increasing Competitiveness of Industrial Enterprises: Current Research and Future Agenda. The International Scientific and Practical Forum “Industry. Science. Competence. Integration”, 903-909.
Rahmouni, H., Munir, K., Essefi, I., Mont, M., & Solomonides, T. (2021). An Ontology-based Compliance Audit Framework for Medical Data Sharing across Europe. International Arab Journal of Information Technology, 18(2), 158-169. https://doi.org/10.34028/iajit/18/2/4
Ramachandran, G. S., Deane, F., Malik, S., Dorri, A., & Jurdak, R. (2021). Towards Assisted Autonomy for Supply Chain Compliance Management. Proceedings – 2021 3rd IEEE International Conference on Trust, Privacy and Security in Intelligent Systems and Applications, TPS-ISA 2021, 321‑330. https://doi.org/10.1109/TPSISA52974.2021.00035
Riehle, D. M. (2019). Checking Business Process Models for Compliance – Comparing Graph Matching and Temporal Logic. Lecture Notes in Business Information Processing, 342, 403‑415. https://doi.org/10.1007/978-3-030-11641-5_32
Rinderle-Ma, S., & Winter, K. (2022). Predictive Compliance Monitoring in Process-Aware Information Systems: State of the Art, Functionalities, Research Directions. arXiv preprint arXiv:2205.05446. https://doi.org/10.48550/arXiv.2205.05446
Sackmann, S., Kuehnel, S., & Seyffarth, T. (2018). Using business process compliance approaches for compliance management with regard to digitization: Evidence from a systematic literature review. International Conference on Business Process Management, 409-425.
Salguero-Caparrós, F., Pardo-Ferreira, M. del C., Martínez-Rojas, M., & Rubio-Romero, J. C. (2020). Management of legal compliance in occupational health and safety. A literature review. Safety Science, 121, 111-118.
Schneider, A., & Mauve, M. (2018). Compliance management for P2P systems. 2017 23rd Asia-Pacific Conference on Communications: Bridging the Metropolitan and the Remote, APCC 2017, 2018-January, 1‑6. https://doi.org/10.23919/APCC.2017.8303961
Scope, N., Rasin, A., Heart, K., Lenard, B., & Wagner, J. (2021). The Life of Data in Compliance Management.
Sothilingam, R., Pant, V., Shahrin, N., & Yu, E. (2021). Towards a Goal-Oriented Modeling Approach for Data Governance. CEUR Workshop Proceedings, 3045, 69‑77.
Sumaryadi, S., & Kusnadi, K. (2021). The influence of strategic planning and personnel competence on organizational performance of the TNI Material Feasibility Service mediated by compliance management. Journal of Economics, Management, Entrepreneurship, and Business (JEMEB), 1(2), 128-145.
Surridge, M., Meacham, K., Papay, J., Phillips, S. C., Pickering, J. B., Shafiee, A., & Wilkinson, T. (2019). Modelling compliance threats and security analysis of cross border health data exchange. Communications in Computer and Information Science, 1085, 180‑189. https://doi.org/10.1007/978-3-030-32213-7_14
Tanaka, Y., Kodate, A., & Bolt, T. (2018). Data sharing system based on legal risk assessment. ACM International Conference Proceeding Series. https://doi.org/10.1145/3227696.3227715
Timm, F. (2018). An application design for reference enterprise architecture models. Lecture Notes in Business Information Processing, 316, 209‑221. https://doi.org/10.1007/978-3-319-92898-2_18
Timm, F., & Sandkuhl, K. (2018a). A reference enterprise architecture for holistic compliance management in the financial sector.
Timm, F., & Sandkuhl, K. (2018b). Towards a reference compliance organization in the financial sector. Banking and information technology/Deutsche Ausgabe, 19(2), 38‑48.
Torre, D., Soltana, G., Sabetzadeh, M., Briand, L. C., Auffinger, Y., & Goes, P. (2019). Using Models to Enable Compliance Checking Against the GDPR: An Experience Report. Model Driven Engineering Languages and Systems. https://doi.org/10.1109/MODELS.2019.00-20
Usman, M., Felderer, M., Unterkalmsteiner, M., Klotins, E., Méndez, D., & Alégroth, E. (2020). Compliance Requirements in Large-Scale Software Development: An Industrial Case Study. Product-Focused Software Process Improvement.
Van Rooij, B., & Fine, A. D. (2019). Preventing corporate crime from within: Compliance management, whistleblowing, and internal monitoring. The Handbook of White-Collar Crime, 229-245.
Wang, D., Yang, R., & Gao, X. (2021). Data security compliance management and control technology based on scene orchestration. Proceedings – 2021 13th International Conference on Measuring Technology and Mechatronics Automation, ICMTMA 2021, 401‑408. https://doi.org/10.1109/ICMTMA52658.2021.00093
Widjaya, W., Sutedja, I., & Hartono, A. W. (2019). Key aspects of data management framework for early adopter: A systematic literature review.
Winter, K., Aa, H. van der, Rinderle-Ma, S., & Weidlich, M. (2020). Assessing the compliance of business process models with regulatory documents. International Conference on Conceptual Modeling. https://doi.org/10.1007/978-3-030-62522-1_14
Wu, X., & Liang, H. (2020). Exploration Research on the Model of Government Regulation Based on Compliance Management System. 2020 6th International Conference on Information Management (ICIM), 117‑121.
Yazdanmehr, A., Wang, J., & Yang, Z. (2020). Peers matter: The moderating role of social influence on information security policy compliance. Information Systems Journal. https://doi.org/10.1111/ISJ.12271