I have always wondered what it would be like to go to the United States and practice nursing. After all, is it not the place to be if you are an RN? As Canadians, we have always heard about the 'brain drain'--people study in Canada and, when they finish, end up taking jobs in the States. Why not? They are dying for nurses down there (not that we are not short of them here), but they pay more money and offer all kinds of incentives. As well, I think there are more full-time nursing job opportunities in the States.
I chose Texas because I have heard that the cost of living there is very low. Let us face it: one reason we get educated is to land a position where we do not have to work more than one job to make ends meet. I want job satisfaction plus enough income to provide for my family without being taxed to death. Could Texas be that place?