Friday, May 8, 2020
Is a Work Injury Really the End of Your Career?
Being injured is never an enjoyable experience, regardless of where or when it happens. However, the circumstances surrounding the injury can make it even more devastating to one's life. A common fear people have after suffering an injury in the workplace, be it an office or a construction site, is that they'll lose their job if they take their employer to court. Thankfully, this is against the law: employers can't terminate an employment contract because you've made a claim for compensation for a workplace injury.

Your employer is responsible for taking care of their employees, and this extends to ensuring the workplace is safe for all staff. After your claim, your employer should be taking measures to ensure that other staff are not injured in the future. This means your employer should be more than willing to accept responsibility for your injury. In addition, your colleagues will be thankful that something is being done to improve their own safety.

So no, an injury is not the end of your career, and claiming compensation doesn't mean you'll be at risk of losing your job. It also doesn't mean you'll develop a negative relationship with your employer. If anything, it will help, because your employer will be compelled to improve the general health and safety of its employees.

To help you out, we've included an infographic below on personal injury lawyers and how you can choose the best one to represent you.

Infographic: Choosing a Personal Injury Lawyer