SAGE Social Science Thesaurus

Concept information

Preferred term

workplace violence in the United States  

Definition

  • Workplace violence has always existed in the United States—indeed, during some periods of our history, fear, intimidation, and physical violence were commonplace in work settings. Contemporary expectations in industrialized democracies, however, are that all workers are entitled to a workplace free from recognized hazards. [Source: Encyclopedia of Victimology and Crime Prevention; Workplace Violence, United States]

URI

http://data.loterre.fr/ark:/67375/N9J-VMTGNBQJ-7
