Risk Management

IT risk management (RM) is the application of the principles of risk management to an IT organization. It aims to manage the risks that come with the ownership, operation, influence, adoption and use of IT within a larger enterprise.

It is a component of a larger enterprise RM system, encompassing not only the risks and negative effects of services and operations that can degrade organizational value, but also the potential benefits of risky ventures.

IT risk management is a process carried out by IT managers to balance the economic and operational costs of protective measures against the gains in capability achieved by protecting the data and information systems that support an organization's operations.

As a general rule, risk is defined as the product of the likelihood of occurrence and the impact an event could have. In IT, however, risk is defined as the product of the asset's value, the system's vulnerability and the threat it faces.
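The two definitions above can be sketched as simple scoring functions. This is a minimal illustration, assuming a conventional 1–5 rating scale; the function names and example values are not from any particular standard:

```python
def general_risk(likelihood: float, impact: float) -> float:
    """General risk: likelihood of occurrence x impact of the event."""
    return likelihood * impact


def it_risk(asset_value: float, vulnerability: float, threat: float) -> float:
    """IT risk: asset value x vulnerability x threat level."""
    return asset_value * vulnerability * threat


# Example: a high-value asset (5) with a moderate vulnerability (3)
# facing a likely threat (4), all on an assumed 1-5 scale:
print(it_risk(5, 3, 4))  # -> 60
```

Because both scores are simple products, a zero in any factor (no vulnerability, or no threat) drives the overall risk to zero, which matches the intuition that all factors must be present for risk to exist.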

IT risks are managed according to the following steps:

  • Assessment: Each risk is discovered and assessed for severity
  • Mitigation: Countermeasures are put in place to reduce the impact of particular risks
  • Evaluation and Assessment: At the end of a project, the effectiveness of each countermeasure (along with its cost-effectiveness) is evaluated. Based on the results, actions are taken to improve, change or retain the current plans.
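The three steps above can be sketched as operations on a simple risk register. This is an illustrative sketch, not a prescribed implementation; the class, field names and the acceptable-severity threshold are all assumptions:

```python
from dataclasses import dataclass, field


@dataclass
class Risk:
    name: str
    severity: float                     # set during assessment
    countermeasures: list = field(default_factory=list)


def assess(risks):
    """Assessment: rank discovered risks by severity, worst first."""
    return sorted(risks, key=lambda r: r.severity, reverse=True)


def mitigate(risk, countermeasure, reduction):
    """Mitigation: record a countermeasure and reduce the risk's impact."""
    risk.countermeasures.append(countermeasure)
    risk.severity = max(0.0, risk.severity - reduction)


def evaluate(risk, acceptable=5.0):
    """Evaluation: decide whether to keep or revise the current plan."""
    return "keep" if risk.severity <= acceptable else "revise"


# Walk the cycle on two assumed risks (severity on a 0-10 scale):
register = assess([Risk("data breach", 9.0), Risk("downtime", 6.0)])
mitigate(register[0], "encrypt data at rest", 5.0)
print(evaluate(register[0]))  # -> keep
```

The point of the sketch is the loop itself: assessment orders the work, mitigation changes the severity, and evaluation feeds the result back into the decision to keep or revise the plan.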