Drinking the Covid Kool Aid: Algorithm Reality Tunnels & The Psychology of Perception

In the dominant expressions of the religious traditions, certainty had a pretty high value. Certainty that I know who the right god is, and I didn’t get the wrong god, and I’m dedicated to that… and I’m certain enough that we can burn women as witches, and do crusades, and whatever we need to do because we have enough certainty about rightness… it’s actually an emotional structure, of associating certainty with security and value, and associating uncertainty with insecurity, fear, and lack of value, which is an emotional and existential bias, leading to a cognitive bias. So then Postmodernism saw these flaws, and it said shit — look at all the things that we fucked up in the name of over-certainty. All the things we were certain about that were wrong later, even within the sciences when we were pretty certain we had it all figured out, around the time of Maxwell’s equations, and modern physics fucked it all up, and this just happened so many times. We really shouldn’t ever be certain about anything, because historically we’ve been wrong about all of it. And whenever we were over-certain, we could do really terrible stuff.

Daniel Schmachtenberger

Do you trust me? 



Trust is something to be earned, and once broken, it is very difficult to get back. 

A good metaphor for trust is that it’s like a plate. If you break someone’s trust, you might be able to glue it back together. It may even be stronger than before. Break it again though, and chances are it can never be repaired to its original state.

Why are we quick to withdraw trust from individuals, but not from institutional bodies and groups such as companies, NGOs, corporations, and governments?

What caused us to accept a global lockdown culture simply because we were told that it was in our best interest? 

With practically all the knowledge in the world literally at our fingertips, how do we become so divided from each other that we can justify violence when our worldview is challenged?

Comedian attacked by cyclist during peaceful protest in Vancouver, Canada on March 28, 2021 (Lasarev, 2021)

This article seeks to answer these questions, observing that:

  • Paradoxically, having access to unprecedented amounts of information can make us less knowledgeable 
  • The current divide in society is not one of principles, but of perception — we want the best for each other, but can’t agree on what ‘the best’ is 
  • The collective response to covid is rooted in faith in the authorities and cultural stewards of society, resembling a religious cult mindset 

For the general population, the dialectic that’s been created around covid and its vaccines is not born from a deep understanding of facts; it’s a fear-based divide rooted in misplaced trust, propaganda, and modern forms of religion and mind control.

You might find this article triggering, and feel the need to school me with facts. That’s great — I’m not a doctor or scientist, I just understand media tactics that play us against ourselves and each other. 

Keep an open mind, read the whole article, and maybe in the end we can all learn a thing or two. 

The Algorithm Reality Tunnel

A big part of the way people view the world is through their social media lens. 

Akira the Don

If there is one thing I learned from getting a degree in digital media studies and marketing, it’s the importance of balancing your media intake with a heavy dose of reality. Another way of putting it is, if you’re dipping your toes in the ocean of mass media, be sure to keep one foot grounded on the shore of real life experience. Otherwise, you might get sucked in and carried away by the tides.

Engaging with digital or mass media is like playing the telephone game, where a simple sentence is slightly manipulated by each person in a lineup, and by the end it’s completely distorted. It’s hard enough to communicate something from one person to another, let alone through the countless filters and biases and agendas that information is subject to as it gets passed around. This example from the TV series King of the Hill illustrates how information gets distorted as it travels from one person to the next.

A series of events from King of the Hill S1E1 illustrates how easily we become misinformed (Daniels & Judge, 1997)

You might have heard the saying that ‘a little bit of knowledge is a dangerous thing.’ We have a tendency to read one book on a subject and fancy ourselves an expert, then overestimate our understanding and abilities. “In many cases, incompetence does not leave people disoriented, perplexed, or cautious,” wrote David Dunning in an article for Pacific Standard. “Instead, the incompetent are often blessed with an inappropriate confidence, buoyed by something that feels to them like knowledge” (Cherry, 2019). Consider the teenager who thinks they have life all figured out, and already knows everything they need to know. Was this you at some point? 

source: (Orac, 2018)

Take the TV show example above — the little bit of information that the child protective worker received led him to take actions that were based in a heavily distorted, literal telephone-game perception of reality. If it weren’t for his boss taking time to confirm the facts, the results could have been disastrous. 

That misinformed character represents all the times we simply accept something as true and proceed to take actions that we think are moral and just, but in reality create serious harm. Good intentions are not enough; to take right action, it is essential that we get as close to the truth as possible.

Note that the child services worker didn’t actively acquire knowledge of the situation, he passively received information. This reflects how we get most of our information today — we sit back and let the information come to us via apps and news feeds. That might not be a problem if these tools were designed to promote truly holistic education, but in reality they’re designed to capture as much of our time and attention as possible to earn advertising dollars. And what’s the best way to get someone’s time and attention?

Tell them what they want to hear!

If you’re a Spotify user, you might have noticed that when you put your whole library on shuffle, it tends to play songs that you’ve already been listening to lately. It’s gotten to know your tastes and preferences, and wants to give you more of what it knows you like so you’ll keep using it. Social media, YouTube, news apps, search engines, etc. use algorithms that work the same way — they get to know what you like when it comes to news, opinions, politics, humor, etc. and give you as much of that as you’re willing to consume. As we click and comment and like what we agree with, these algorithms function as confirmation-bias machines, constantly re-affirming our perceptions and making us overconfident in our beliefs. 

Learning theorists observe that most human behaviour is learned. Learning occurs through the interplay of drives, stimuli, cues, responses, and reinforcement (Armstrong et al., 2011). Beyond early childhood, we learn mainly through reinforcement, or repetition. As these algorithms constantly feed us more of what we already think, they become self-reinforcing feedback loops, ultimately giving us a kind of tunnel vision that locks us into our preconceived perceptions of reality. This is what I call ‘the algorithm reality tunnel.’
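The self-reinforcing loop described above can be sketched as a toy simulation. To be clear, nothing here reflects any real platform’s algorithm; the weights, update rule, and numbers are purely illustrative assumptions:

```python
import random

random.seed(42)  # fixed seed so the toy run is repeatable

# The platform's model of the user: a relative weight for two viewpoints.
weights = {"A": 0.5, "B": 0.5}
# The user's actual lean: only a slight preference for viewpoint A.
user_preference = {"A": 0.6, "B": 0.4}

shown_counts = {"A": 0, "B": 0}
for step in range(1000):
    # The feed shows whichever viewpoint it currently weights higher.
    shown = max(weights, key=weights.get)
    shown_counts[shown] += 1
    # The user engages in proportion to their preference...
    if random.random() < user_preference[shown]:
        weights[shown] += 0.01  # ...and engagement reinforces the weight.
    else:
        weights[shown] = max(0.01, weights[shown] - 0.01)

share_a = shown_counts["A"] / 1000
print(f"Viewpoint A filled {share_a:.0%} of the feed")
```

Even though the simulated user’s lean is only 60/40, an engagement-maximizing rule like this ends up filling the feed almost entirely with the viewpoint they already favour: a modest preference, repeatedly reinforced, becomes tunnel vision.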

Social media, search engines and similar algorithm-based software lead us away from holistic knowledge and toward extreme confirmation bias.

Perceive, Receive, Believe

Indeed, the line between perceiving and hallucinating is not as crisp as we like to think. In a sense, when we look at the world, we are hallucinating all the time. One could almost regard perception as the act of choosing the one hallucination that best fits the incoming data.

Vilayanur S. Ramachandran

A quick review of the psychology shines some light on how reinforced perceptions lead to rigid beliefs and attitudes. 

Perception is the process by which people select, organize, and interpret information to form a meaningful picture of the world. People can form different perceptions of the same thing because of three perceptual processes: selective attention, selective distortion, and selective retention.

  • Selective attention: the tendency for people to screen out most of the information to which they are exposed
  • Selective distortion: the tendency of people to interpret information in a way that will support what they already believe
  • Selective retention: the tendency of people to remember good points about what they favour and to forget good points about what they don’t favour 

Through doing and learning, people form perceptions that become beliefs, which in turn influence their behaviour. A belief is a descriptive thought that a person has about something. Beliefs may be based on real knowledge, opinion, or faith, and may carry an emotional charge.

Reinforced beliefs become attitudes. An attitude consists of a person’s relatively consistent evaluations, feelings, and tendencies toward an object or an idea. Attitudes put people into a binary frame of mind of liking or disliking things, of moving toward or away from them. A person’s attitudes fit into a pattern, or belief system, and changing one attitude may require difficult adjustments in many others. 

Changing our attitudes is difficult and complex because attitudes are a significant part of what comprises our overall sense of identity. We often define ourselves by our collection of cultural badges and labels derived from our attitude toward things like political parties, social issues, musical genres, etc. Ask someone to tell you about themselves and they’ll likely rattle off a list of things they enjoy, things they do or have done, titles they have, and beliefs they hold. Once our belief systems become our identity, we take it personally when something we believe in is questioned, disrespected, attacked, whatever. This is why people get offended when someone insults their favourite politician, band, religion, clothing brand, etc. 

These concepts are used in marketing and public relations to manufacture consent and create demand. Demand for a product, service, politician, etc. can be built up by associating it with a strong internal drive (like the desire to act morally and help others), using motivational cues (e.g. distressing visuals or statistics), and by providing positive reinforcement (a reward for taking the desired action, for example).  

To sum things up: 

  • As we go about the world doing and learning, we make perceptions about reality, and seek to confirm and validate those perceptions. We use media to do this, by communicating and finding out if others share our perceptions or have evidence that supports them 
  • By design, many digital media tools we use confirm our perceptual biases and beliefs. The repeated reinforcement we get from algorithms makes us think our beliefs are accurate and knowledge-based, when they are actually rooted in selective perception 
  • This reinforcement, supported by the triggering of internal drives and external motivational cues, creates emotionally charged attitudes
  • Once our beliefs become our attitude, we get even more selective with information, close our minds to conflicting possibilities, and make the liking or disliking of something a part of our identity

Many of us fall victim to what I call the ‘perceive, receive, believe’ way of acquiring a sense of security in our understanding of the world around us. This is how a person can get to the point where they are unable to explain and validate the beliefs they hold, or engage in civil discourse about them, but will get violently defensive when those beliefs are challenged. 

If we are dogmatic and unwavering in our attitudes and belief systems, we are said to be ‘religious about it.’ The thing about belief systems and religion is, they need have no basis in reality so long as they consistently provide explanations that we deem adequate. 

Doctors & Mechanics

For many people science has become a belief system — a world view. This is sometimes called ‘scientism’ — where people take the dogmas of science to be a kind of religious belief system… And it’s this dogmatic belief system which I believe is now constricting and holding science back in a very serious way. 

Rupert Sheldrake

When we outsource our perception of reality, because we are unwilling or unable to verify the facts, the actions we proceed with are based not in knowledge and understanding, but in belief and faith. Since no one can know everything, of course we sometimes have no choice but to trust others. The harder something is for us to understand or evaluate, the more we are forced to put our faith in sources outside ourselves, i.e. ‘the experts.’ 

How Product Characteristics Affect Ease of Evaluation (Wirtz & Lovelock, 2018)

In marketing, we learn that the quality of a product or service can be difficult to evaluate. For example, if you don’t know how to fix your car, when a mechanic tells you that it needs a new part, you have to trust that they aren’t taking advantage by replacing it prematurely, or overcharging you for time and labour.  

These types of services are considered high in credence attributes, meaning they require trust derived from something other than personal knowledge, i.e. your belief in the integrity of the service provider. This is why you find their ‘credentials’ hanging on the wall: to create a perception of credibility.

The graph above shows that complex surgery is one of the most difficult things to evaluate. The doctor tells us we need something, and even if we don’t perceive or understand the problem, and won’t experience any noticeable difference before and after, we do it based on trust. 

Western cultures place an inordinate amount of trust in medical practitioners, stemming from a high level of faith in the institutions of education, medicine, and government. We think, ‘this person has a license, spent so many years in school, studied the science, is highly recommended, so they must be trustworthy.’ Your trust is not in the doctor, but in the government license, the educational certificate, and the media or opinion leaders that recommend them. 

In general, our attitudes toward doctors and mechanics don’t come from experience and knowledge, but rather from reinforced perception and belief. Both offer a service that most people don’t fully understand, but we’re conditioned to trust one and distrust the other. Consider the portrayal of doctors in movies and medical shows as virtuous life-saving heroes on the frontlines, whereas mechanics are usually depicted as dim-witted, blue collar ‘grease monkeys.’ This perception of doctors might change if people knew that studies have estimated medical errors may account for as many as 251,000 deaths annually in the United States, making them the third leading cause of death, while less than 10 percent of medical errors are ever reported (Anderson & Abrahamson, 2017).

The fact is, as a human being, a doctor is no less capable of error or unethical behaviour than a mechanic, and because the position demands the highest level of trust, the temptation to abuse it and take advantage of people is even greater. The abuse could be deliberate, or it could be structural: a doctor may not have enough time to adequately assess every patient, because the practice requires quick turnover to cover expenses and pay off debt.

Even if a doctor truly believes they’re acting in their client’s best interest, abuse of trust can occur unconsciously as a result of following orders dictated by the institutions or companies they work for. The education systems and legal frameworks required for medical practice can take well-meaning people and mislead, coerce or force them to prescribe medications, surgeries, etc. that don’t heal the cause of sickness, but rather treat the symptoms and keep the patient, i.e. customer, coming back. The medical industry is just that — an industry, where a key factor for financial success is to cash in on customer lifetime value, or the amount of profit that a loyal customer will generate with their repeat business over the course of a lifetime.

Suffice it to say, our overconfidence and socio-cultural conditioning around health and medicine has most of us far more concerned about who we trust with our car than who we trust with the well-being of our mind and body.

This is what’s keeping people totally trapped in a mind controlled state… believing their position from a complete state of imbalance. Not from looking at any facts — from believing in a religion. Whether that religion be traditionalist, religion-type thinking, of any given cultural religion, or whether it be from the religion called scientism, which is also a religion. You know, it’s a religion where your priest class is the guys in the white lab coats in the laboratories and research facilities, who are telling you exactly how reality is and you’re buying into it wholesale and eating it up with a spoon and straw. And, what’s leading to all of their conclusions isn’t truth, it isn’t real research, it isn’t real science, it’s ‘what did the government grant money allow us to do this week? What did big daddy say we were approved, and given funding to do, because he wants the outcome to be skewed in that direction.’ You know, that’s what science is in the modern day.

Mark Passio

Trust the Plan(demic)

It’s amazing how desperately we cling to our beliefs. As history shows, the fastest way to reduce otherwise decent people to a state of savagery is by tampering with their belief system. The word for someone who does so is heretic, and historically the punishments reserved for him are more brutal than for any other class of offender… No one approaches me and asks to have their hard-won beliefs demolished. They come to build upon what they already have and to continue along the course they’ve already begun. Demolition, though, is exactly what they need. If, that is, they want to wake up.

Jed McKenna

What is a virus? How does covid-19 spread and infect people? What will happen if you get it? By what metric does a pandemic officially begin and end? 

Most people are not able to answer these questions with clarity or certainty. When you ask someone for their opinion on covid and the pandemic, the general response is that they heard it’s a problem somewhere, so we should all be afraid of it everywhere, until we’re told not to be. 

There is so much confusion and uncertainty around this thing we are afraid of that we don’t understand it: we aren’t clear on what it is, how it operates, or even what its effects are. 

It is also so imperceptible and invisible that most of us have no firsthand experience with the actual illness. We might have a friend or relative who was diagnosed, a positive test with no symptoms, or case numbers heard on the news, but more than likely no direct experience with it. 

If we don’t understand the problem, don’t see the problem, and don’t experience the problem, why would we be so confident in the solutions presented that we find it offensive when others do not share our view, or act in discordance with that view? To put it in context, if there were a math equation that you didn’t understand and couldn’t even read, would you be certain that the answer given to you was correct? Not unless you simply accepted it to be true, based purely on trust and faith. 

We’re not told to understand the science, we’re told to trust the science.

People who simply accept the coercive lockdown culture and abide by its impositions are just following orders, doing so based on social pressure and faith in the government, media, and medical establishment, which time and time again have proven themselves untrustworthy. Why do you trust a vaccine produced by Johnson & Johnson, a company that knowingly exposed its own customers to a carcinogen for decades (Girion, 2018)? How about the scientists and doctors who kept it under wraps, and a government that then takes your money and gives it to them? If it were a person and not a corporation going around poisoning people, they’d be labeled a criminal and you’d never trust them again. 

The Covid Cult

People go to a rally, or join some kind of cult, or they join some kind of radical political movement, and they get really high on it. Aldous Huxley said that the greatest religion of the 19th century was nationalism. People lose themselves in the intoxication of the crowd, they lose themselves in the intoxication of the crowd rally… and you can see that today coming back can’t you, in the kind of return of far left and far right politics, and people getting caught up in this almost religious fervor at political rallies… Huxley would say this is a toxic substitute for religion.

Jules Evans

We tend to think religion is a thing of the past, but it is alive and well. In the words of David Foster Wallace, “You get to consciously decide what has meaning and what doesn’t… Everybody worships. The only choice we get is what to worship” (2020). Worship, i.e. reverence of a person, object, idea, etc., isn’t exclusive to the classical religions, gods and deities. In the modern day, we are far more likely to worship money, celebrity, status, beauty, authority, etc. 

If the response to covid is rooted in belief and faith, it’s valuable to frame this pandemic in a religious context. Are we worshipping at the altar of corona as we partake in the mask wearing ritual, undergo ‘baptism’ by vaccination, and condemn each other for not living according to the covid commandments? Let’s look at some of the parallels or hallmarks of religion applied to the pandemic.

Just as those who didn’t understand god and feared his wrath would turn to the black-coated middlemen for protection and knowledge, we now turn to the white-coated priests of scientism to save us. And just like organized religion, we’ve been convinced that our fate rests in an invisible entity outside of ourselves and we need to seek protection against it. 

It’s almost like a quick fix, ‘flash mob religion’ that isn’t meant to endure over time, but rather pops up to serve a specific purpose and then disappears once the purpose is achieved. People dedicate their lives and fight wars over religion — imagine being able to trigger and harness this kind of fervour over a short term period to serve your agenda.


I keep trying to explain to people, religion, false religion, is that which holds us back from where we say we want to go. False religion is what holds back consciousness, from the term ‘re-ligare’ in Latin, ‘to bind, to tie back, to hold back, or to thwart from forward progress.’ Most of all in consciousness — to thwart from forward progress in consciousness, in awareness, in growing in morality. In understanding of natural law, in understanding of moral law.

Mark Passio

How much safety is too much? Where should we draw the line? You’ve probably heard that high risk equals high reward, which is to say that low risk equals low reward. If your precautions are diminishing the quality of life for yourself and those around you, you’re probably taking it too far. This isn’t to say that we should be reckless, but are we here to live in fear, blaming each other for our problems and running away from every potential threat? Fuck no! We’re here to live this miraculous life, rise up to our infinite potential, and let nothing stand in our way. 

You might be wondering — if knowledge and facts are so important, then why is this all about perception and belief? Because no amount of factual evidence can overcome a dogmatic belief system, understanding belief is essential. If someone is convinced that Jesus is in the room, you can’t logically prove that he is not. 

Logic can’t override belief because human beings are super adept at self-deception. ‘Cognitive dissonance’ has become a widely used term in recent years; it describes the phenomenon of simultaneously holding contradictory beliefs, values, or ideas, and is usually expressed as psychological stress or anxiety. We often take this stress out on others when they bring us face to face with our inner contradictions. Our trickster brains want to avoid acknowledging our cognitive dissonance, because it threatens our ego identity, so instead of facing the truth we drive ourselves further into self-deception by ridiculing opposing beliefs and engaging in arguments. These arguments are a pressure release valve and a form of reinforcement — we start fights so we can blow off steam, write off the others and their perspective as rude, stupid, or wrong, and reassure ourselves that our perception is correct.

It’s not hard to understand how we get into this frame of mind. Let’s say that you’ve started to question the integrity of the medical system, but several of your family members are doctors. They’ve supported you your whole life, have always been kind and loving, and serve as a model of what it means to be successful. You’re now at odds with yourself — on the one hand you love and respect your family, but on the other hand you’re no longer confident in who they work for and what they do for a living. You might have feelings of guilt or shame come up about being a benefactor of unethical activity. Following your intuition at this point would require a whole shift in your beliefs, attitudes and identity, basically shattering your very idea of reality. You would have to face the fact that you can’t trust your culture and society, you can’t trust your loved ones who always told you it’s all good, and you can’t even trust yourself since you believed it all in the first place. This is where you have a proverbial red pill / blue pill moment, and decide whether to open or close your mind. 

A good example of this is the Mrs. Farmer character from the movie Donnie Darko. She’s so enamored with the Tony Robbins-esque, new age guru character Jim Cunningham that she refuses to believe he’s a pedophile after child pornography is found on his property. Admitting the truth is just too painful, so she refuses to accept reality and goes a little crazy. Søren Kierkegaard once said, “There are two ways to be fooled. One is to believe what isn’t true; the other is to refuse to believe what is true.” When something threatens our sense of identity, we refuse truths and accept untruths to maintain a sense of security.

Most people who support lockdown culture, regardless of how well-meaning their intentions are, are doing so based on perception and faith rather than true knowledge and understanding. They’re trying to preserve the sense of safety and security that comes with attachment to ego identity. And it’s fine to think however you want when it comes to your own life, but it becomes a deep ethical issue when you start to impose your beliefs on others. If there were only two people on earth, would it be moral for one to restrict and coerce the other because they believed it would make themselves safer? 

To be clear, this article is not meant to vilify all science and vaccines, or completely deny the existence of covid-19. Yes I take a stance on this pandemic, but the lessons here can be applied to staunch believers on either side of the debate. Both sides of a dialectic typically contain some truth, so if you are vehemently pro or anti anything, or frequently attach these labels to others, chances are you’re heavily involved emotionally, and caught in a selectively perceived reality tunnel mindset. 

So, what is the way forward? How can we reconcile our differences and build unity instead of creating more division?

We can start by admitting that we don’t know everything, and that our beliefs could be flawed or incorrect. 

We can embrace uncertainty, which I feel was the prevailing lesson of 2020. 

We can let go of preconceptions, check our biases and blind spots, and relinquish the need to prove ourselves right. 

We can actively seek knowledge from an eclectic array of sources, instead of just receiving information. 

Am I biased? Hell yea I am. Thanks to this whole lockdown situation I lost my job, the quality of education at my university deteriorated severely, and my hobbies and social life became non-existent in an instant. Luckily I still have my mental health, which is more than I can say for a woman I met at a hospital in Calgary, Alberta, who had a stroke as a result of her medical procedure. I was there to be assessed for the same brain surgery she had, and I’m very grateful that I ended up not getting it. That experience was the culmination of 10+ years of following the orders of doctors who had little to offer aside from drugs laden with side effects and dangerous surgery. The root of my problem turned out to be not a physical defect but a psychological issue, which was never even considered by the doctors who wanted to sell me their products. 

So, I would say that I have a pretty good reason to be biased. Western medicine is great for a broken leg or a gunshot wound, and not so great for complex issues requiring emotional and psychological healing. But, I realize that my perception is not the only view of this thing called reality, and I can’t expect someone to consider my perspective if I’m not even open-minded and kind enough to hold space for theirs. 

Thanks for reading!

Acid Raindrops is about cultivating and celebrating the individual. On that note, here are some of the most courageous people out there in the alternative media community speaking up about these topics. If it wasn’t for these individuals, this website wouldn’t exist. If anything in this article resonated, you definitely want to check out the work that these guys are doing.

Panic, Profit & Power: Decoding the COVID-19 Conspiracy (Full Length Presentation)


World View Violence & The REAL Pandemics


Inoculum of Truth 2 💉🦠🧬 Cov🆔 2021 ☣️ CoronaVirus Documentary



If you found this information valuable, please consider a donation.

Works Cited

Anderson, J. G., & Abrahamson, K. (2017). Your health care may kill you: Medical errors. Studies in Health Technology and Informatics. https://pubmed.ncbi.nlm.nih.gov/28186008/. 

Armstrong, G., Kotler, P., Trifts, V., & Buchwitz, L. (2011). Marketing: An introduction (5th ed.). Boston: Prentice Hall.

Cherry, K. (2019, June 14). Dunning-Kruger effect: Why incompetent people think they are superior. Retrieved from https://www.verywellmind.com/an-overview-of-the-dunning-kruger-effect-4160740

Daniels, J., & Judge, M. (1997). Pilot [Television series episode]. King of the Hill.

Girion, L. (2018, December 14). J&J knew for decades that asbestos lurked in its baby powder. Retrieved from https://www.reuters.com/investigates/special-report/johnsonandjohnson-cancer/

Lasarev, A. (Producer). (2021, March 29). Cyclist flips out, smashes car window [Video file]. Retrieved from https://www.youtube.com/watch?v=cGRY4EZIkWw

McKenna, J. (2002). Spiritual Enlightenment: The Damnedest Thing. Wisefool Press.

Orac. (2018, July 31). The Dunning-Kruger effect, antivaxers, and the arrogance of ignorance. Retrieved from https://respectfulinsolence.com/2018/07/31/antivaxers-dunning-kruger-effect/

Schmachtenberger, D., Evans, J., & Akira the Don. (n.d.). The HighExistence Podcast (Episodes 7, 19, 20). HighExistence LLC.

Passio, M. (n.d.). What On Earth Is Happening [Audio podcast] (Episode 154). Retrieved from https://www.whatonearthishappening.com/podcast

Wallace, D. F. (2020, January 28). After Skool (Producer). Your mind is an excellent servant, but a terrible master – David Foster Wallace. Retrieved from https://www.youtube.com/watch?v=OsAd4HGJS4o

Wirtz, J., & Lovelock, C. H. (2018). Essentials of Services Marketing (3rd ed.). Pearson.
