
A History of Workplace Safety

October 14, 2014

The battle to keep workers safe in the workplace has been an ongoing campaign for more than four decades; however, there is still much left to achieve to ensure that staff are exposed only to minimal risks. This is especially apparent for healthcare professionals, who are at risk of suffering occupational needlestick and other percutaneous injuries.

It was back in 1970 that the US Congress passed the Occupational Safety and Health Act, which first established that every worker has the right to a "safe and healthy workplace". The Act also created the Occupational Safety and Health Administration (OSHA) and placed the employer at the heart of this landmark legislation, with the "primary responsibility" to provide workplaces "free of recognised hazards".

Healthcare workers are among the most at-risk professionals, facing many hazards during their working day. Although needlestick injuries are commonly thought to be one of the most dangerous risks for these staff members, healthcare professionals also face back injuries, workplace violence, and exposure to chemicals and infectious diseases.

Campaigning in the late 1980s raised awareness of the specific dangers of working with contaminated instruments, and in 1991 OSHA published its "Bloodborne Pathogens Standard". This had a significant impact on worker safety and helped to reduce the number of employees injured in the workplace.

The creation of this document prompted a number of further studies into the dangers of contaminated sharps and the spread of blood-borne pathogens. After much research and development, the first pilot safety devices appeared on the healthcare market in 1993.

During this period, much progress was made in other industries, especially for workers who may come into contact with asbestos, with legislation being passed to protect people and the general public from the material.

However, it wasn't until November 2000 that the US passed its "Needlestick Safety and Prevention Act", signed into law by then-president Bill Clinton. This was a landmark move: it was the first time safety-engineered devices were made a legal requirement for healthcare organisations in the US. Soon after, in 2001, OSHA revised its Bloodborne Pathogens Standard to reflect the new law, providing guidance on how it could be practically applied by employers in the healthcare industry.

A significant part of this law required front-line workers to be involved in evaluating and selecting safety devices, and in identifying where more could be introduced. Over the next few years, manufacturers continually designed, created and refined new safety devices, improving on their previous instruments. With more products on the market, the devices became more economical, which had a significant impact on the number of organisations and nations willing to adopt them.

In 2006, the European Parliament called for a proposal to protect healthcare workers across member states, which led to the EU introducing its sharps safety legislation in 2010.

This gave member states three years to implement the rules, and in May 2013 the EU Sharps Directive officially came into force. It sets out a framework of measures covering risk assessment, risk prevention, training and information, awareness raising, monitoring, and response and follow-up procedures in relation to sharps injuries.

However, there is still much progress to be made. Research suggests that a number of organisations could be falling below the standards outlined by the relevant legislation in some nations.

There is still a strong campaigning force, led by organisations such as Safe in Common, to ensure that workers are only exposed to minimal risks and that needlestick injuries are a "never event".
