Had a sudden thought about this topic recently, because hubby thinks that due to the nature of some jobs, the work will somehow change one's character or view of life. I do somewhat agree with this; some people change into a better person, while others change into a totally different person you can hardly recognise.
So sometimes I wonder: is it really a good idea to work in a job that actually changes the real you?
As for me, I did change quite a bit ever since I started working. I used to have a very, very bad temper (even now I am still a bad-tempered person lah), but I have learned to control myself better. I also started to be more "fake", putting on a mask, but only with the people around me at work. In recent years, though, I think I started to learn how to accept the people I dislike and turn them into my "friends". I used to wonder how my ex-manager was able to do that, and without noticing it, I think I have recently learned to treat the people I used to hate so much at work better. To my surprise, I think I did manage to get them standing on the same side as me. Although I don't see any real improvement in the work itself, things have become easier and smoother when it comes to liaising with that person. I no longer pull a face when I hear their names, or dread picking up their calls. I guess that's a good sign.
So have you ever wondered if your character has changed after you took up your current job?