Answer:
The term work ethic refers to how people view their job, what they expect from it, and how they approach their profession. Ethics in the workplace, by contrast, refers to the positive qualities that strengthen a company's workforce, such as honesty, integrity, dedication, determination, and commitment.