2 votes
Please :( Is it better to always tell the truth, or is it sometimes acceptable to tell a lie? Be sure to support your response with evidence from stories, movies, or real-world events. At least three paragraphs, or four if you want.

asked by CrazyTim (3.0k points)

1 Answer

15 votes

Answer:

The answer to this depends on both your morals and the situation. We all learned when we were younger that we should "do the right thing even when no one is watching." But what about a situation where someone gets hurt either way? Say you just found out that your best friend's significant other cheated on them, and you are the only one who knows. What should you do? Should you be the one to break the painful news to your friend, or should you stay silent while they remain in the relationship? In my opinion, it would be better to tell the truth than to keep that secret from them.

Telling the truth will benefit you in the end. It is better to be honest than to lie and build a reputation that leads others to steer clear of you. Telling the truth also spares you the string of lies that follows the first one: when you lie, you have to remember every lie you have told, or you end up in deeper trouble than you were already in.

In school, we see students making up excuses and lying about missing assignments. Those lies can get them into serious trouble: detention, suspension, even expulsion. Sometimes it is better to own up to your actions and take responsibility, even when you don't want to.


answered by MitchEff (3.0k points)