I sit back and observe our society and think to myself, “What matters anymore?” It feels like we have lost our morals and standards. The only principles we live by are “making money” and “getting attention.” Nothing else matters to us anymore. As long as we can get paid for something or become famous from it, we will do it. Reality shows are the perfect example. On these shows, people make themselves look like complete idiots, fighting and cursing at each other for a small amount of money and some popularity on social media. These people put their personal business on display for attention. We don’t even value the truth anymore. Many websites and magazines are popular because they make up stories about celebrities, and we just accept that as part of the business. Every week there is a new lie about a “famous” person, and we don’t care as long as it is entertaining. I can’t think of anything with morals or a set of standards that people still follow anymore.
In the black community we are losing our self-respect and our love for one another. I have read about how, decades ago, we would protect each other and care for the well-being of one another. Now we just fight each other and put each other down. Decades ago, when there were even fewer opportunities in America for colored people, we still always had each other, but those days are long gone. Does our history even matter anymore? We do things that are simply disrespectful to the people who fought for us to get the little bit of freedom we have. We no longer have morals as a whole. I’m sure there are standards within our families that we follow and live by, but we need better than that as a country. Thank you.