Risk1, perhaps the most common term in cybersecurity, is a measure of the potential for loss due to a vulnerability. Risk is often expressed as the probability of a vulnerability being exploited multiplied by the impact of the exploit.
Due to the unpredictable nature of software, and of computing in general, it is impossible to accurately calculate the probability of a vulnerability being discovered and exploited in the wild. Risk is the nearest proxy we have for measuring the potential for loss.
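The formula above can be sketched in a few lines of code. This is a minimal illustration, not a real risk model: the probability and impact values are invented estimates, since, as just noted, true exploitation probabilities cannot be calculated precisely.

```python
def risk_score(probability: float, impact: float) -> float:
    """Return a risk score: estimated probability of exploitation (0-1)
    multiplied by the impact of the exploit (e.g., expected loss in dollars)."""
    if not 0.0 <= probability <= 1.0:
        raise ValueError("probability must be between 0 and 1")
    return probability * impact

# Example (hypothetical numbers): a vulnerability with an estimated 5%
# chance of exploitation and a $200,000 impact yields a $10,000 risk score.
print(risk_score(0.05, 200_000))  # 10000.0
```

In practice the inputs are rough estimates drawn from threat intelligence and asset valuation, which is why risk is a proxy rather than a precise measurement.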
In cybersecurity, risk is the potential for loss, damage, or destruction of assets or data. A threat is a negative event, such as the exploitation of a vulnerability. And a vulnerability is a weakness that exposes you to threats, and therefore increases the likelihood of a negative event.
As John mentions, risk is not the same as threat or vulnerability, though the terms are often used interchangeably: risk is the potential for loss, a threat is a potential negative event such as the exploitation of a vulnerability, and a vulnerability is a weakness that exposes you to threats.
The importance of understanding risk cannot be overstated, as it is the primary driver behind security controls and security budgets. The more risk, the more controls are implemented and the more money is spent on security; the lower the risk, the fewer controls and the smaller the budget.
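The idea that risk drives the number of controls can be sketched with a simple qualitative risk matrix. The tiers, thresholds, and 1-5 rating scale below are hypothetical choices for illustration, not a standard.

```python
def control_tier(likelihood: int, impact: int) -> str:
    """Map 1-5 likelihood and impact ratings to a control tier,
    illustrating 'more risk, more controls'."""
    if not (1 <= likelihood <= 5 and 1 <= impact <= 5):
        raise ValueError("ratings must be between 1 and 5")
    score = likelihood * impact  # simple qualitative risk score
    if score >= 15:
        return "high"    # maximum controls, largest budget share
    if score >= 8:
        return "medium"  # standard controls
    return "low"         # baseline controls only

print(control_tier(5, 4))  # high
print(control_tier(2, 2))  # low
```

Real organizations use more nuanced matrices, but the principle is the same: the risk score determines how much control effort and budget is applied.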
NIST developed a risk management framework2 that is widely used in the industry and is the basis for many security standards.