Talons Philosophy

An Open Online Highschool Philosophy Course


Infants Can Cry and I Can Write a Midterm – And Nothing May Be True

The mind gains knowledge through processing information in stimuli and internally rationalizing it. This I know to be true, but it cannot stand alone. Therefore, the following propositions must also be taken into account for us all to accept this statement as true:

If the brain is a blank slate aside from instinctual qualities

And if those qualities include rational thought

And if knowledge does not have to be true to be known

If those statements are all true, then our final statement on how we gain knowledge also holds. Therefore, rather than prove my statement directly, we can prove the propositions that come before it, and the statement will logically follow.
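
Put in bare logical form (a minimal sketch; the labels P1, P2, P3, and C are mine and not part of the original argument), the structure relied on here is:

$$ (P_1 \wedge P_2 \wedge P_3) \rightarrow C, \qquad P_1,\ P_2,\ P_3 \ \therefore\ C $$

where $P_1$ is the blank slate claim, $P_2$ is the claim that our instincts include rational thought, $P_3$ is the claim that knowledge need not be true to be known, and $C$ is the opening statement about how the mind gains knowledge. Establish the three premises, and the conclusion follows by conjunction and a single application of modus ponens.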

The brain is a blank slate, aside from instinctual qualities.

This statement serves as two ideas in one, two ideas that at face value would contradict each other, but that can live in a balanced harmony to explain how the brain works. First, we can define what the blank slate is. Although the idea has been cited many times throughout history, the theory was popularized in John Locke’s Essay Concerning Human Understanding.

The idea behind the Blank Slate theory is that at birth, an infant emerges with a mind blank of anything – thoughts, personality, instincts, and even the ability to process information. From there, processing, personality, thoughts, and all other basic brain functions are learned through sensory experience.

This theory obviously stands as undeniable, pure empiricism, and because my statement does not, we are simply going to modify Locke’s theory, as so many others have. Locke wrote his Essay Concerning Human Understanding in the late 1600s, and by the late 1800s Wilhelm Wundt had characterized all repeated human behaviour as instinct, the most basic definition of the term. From there, many psychologists and philosophers alike have toyed with the idea of instincts. For this statement, we’re going to use the criteria outlined in the book Instinct: An Enduring Problem in Psychology. The criteria go as follows:

To be considered instinctual, a behavior must:

a) be automatic
b) be irresistible
c) occur at some point in development
d) be triggered by some event in the environment
e) occur in every member of the species
f) be unmodifiable
g) govern behavior for which the organism needs no training (although the organism may profit from experience and to that degree the behavior is modifiable)


In layman’s terms, an instinct must be a behaviour that can occur in every human being when stimulated in a certain way, and it must be a behaviour that overrides reason and rational thought, therefore requiring no prior skill. Think fight or flight, a popularly cited and discussed human instinct. As for infant instincts, there are quite a few on record, cited by psychologists and parenting websites alike.

The instinctual qualities we are born with include rational thought.

Once again, we must address two things. The first is to define what rational thought is (and the purpose it plays in this statement on epistemology), and the second is to show that we are born with that rational thought.

Given the nature of the word rational and the number of people who have studied, defined, and warped its definition, we need to settle on one meaning. In this case, rational thought is the ability to process information through reasoning, in line with rationalism, the theory that reasoning is the main source of our knowledge. Of course, because of our reliance on empiricism for the blank slate theory, we’ve reached a point here where rationalism and empiricism play an equal part in the gaining of knowledge.

With rational thought defined as the ability to process information through reasoning, we can safely assume infants are born with the ability to reason at the most basic level. It’s undeniable that infants cry when they require attention, and in this case we can assume that the following basic reasoning is occurring:

“I’m hungry, so I will call for my mother.”
“My diaper is soiled, so I will call for an adult.”
“Something has startled me, so I will call for help.”

We can also apply the instinctual qualities defined earlier to rationalizing, further cementing the idea. Infant rationalizing is automatic: an infant will cry immediately after being startled. It’s irresistible: babies cannot resist crying when they need help, unless serious trauma has rendered them silent. It occurs immediately at birth, a point in development. It is triggered by stimuli in the environment, such as fear, discomfort, and hunger. It occurs in all infants who are born healthy. It does not vary or change. And, finally, it needs no prior training. In fact, quite the opposite, as most healthy infants come out into the world screaming.

Knowledge does not have to be true to be known.

This is perhaps the hardest statement to prove, if only because once we define knowledge and truth, we are left with something that must still be believed with perhaps a little bit of faith. Or perhaps not, because even if it’s untrue, it is still known.

Either way, let us use the most literal dictionary definition of knowledge.

noun knowl·edge \ˈnä-lij\
: information, understanding, or skill that you get from experience or education
Although the dictionary is often not the best source for defining words in depth, in this case I’ve chosen the most basic definition for a very basic reason – this definition is the one most people recognize and subscribe to. Since humans created language, humans can define language, and in this case knowledge is understood as information, understanding, and skills that are gained through experience.

As for truth… Well, truth is unknown. That is to say, there is no giant checklist that will tell us whether what we know is really true or not, and when so many things are either subjective or wholly based on perspective, we may never know. Because of that, humans have the potential to be knowledgeless if knowledge MUST be true to be known, so we will simply say that it is fair to treat knowledge as potentially untrue.

The mind gains knowledge through processing information in stimuli and internally rationalizing it.

Finally, we’ve gone through our propositional statements and defined them to the point where we can say that this statement is true.

The mind gains knowledge (which does not need to be true) through processing information in stimuli (empiricism) and internally rationalizing it (an instinct all humans are born with, and also rationalism).

With this statement, many (if not all) schools of epistemology can argue their case. After all, as long as the stimulus is there and as long as the brain is functional enough to rationalize it, then it can be known. It can be known as competence and acquaintance, it can be argued as a true belief or not, it can serve foundationalism or anti-foundationalism, and it can do almost any conceivable mixture of these schools.
 


The Hivemind Continues

Last time my words were on this website, it was to talk about the concept of the Collective Unconscious, where I posed a few questions in relation to the topic and how it shapes what we know, what we create, and who we are. They looked like this:

But, how do we prove it? [The collective unconscious.]
Can we let it define us?
Are we our own people, let alone capable of originality, if our ideas all come from before?

And, unsurprisingly, the people who can help answer these questions the most are the two philosophers/psychologists that I mentioned in my last post, Carl Jung, and following him, Joseph Campbell.

Through the past week and a half or so, I’ve been reading up on Jung’s idea of the Collective Unconscious, specifically his ideas on archetypes. I’ve also been reading Joseph Campbell’s book, The Hero With a Thousand Faces, and when that proved to be too dense, I accompanied it with summaries of the book to make better sense of it. Like adding milk to coffee.

And, through my research, I’ve come to learn a few things. I’ve learned that we can’t understand the collective unconscious, let alone prove its existence. For some, it may be nothing more than a distant conspiracy theory, but for others it’s simply an idea that they live with. Yet, despite the fact that I can say that now, earlier in the week I was struggling to grasp a way to define it. I asked anyone I could, hoping the answer would come to me.


A past Philosophy 12 student told me to make a metaphor about breakfast, as if that would somehow bring the idea of the collective unconscious more within the grasp of our minds. As if toast and eggs would help define something that our brains really aren’t quite able to define. And I had to face facts, then and there. The same way the brain can’t process the vast size of the universe, or comprehend large numbers quite right, it also cannot comprehend the idea that somewhere in the deepest reaches of our minds we are all connected. As our brains are now, with the limitations they possess, no matter what we do it will always be out of our grasp to fully understand the idea that inside our brains there are ideas fed to us that are not our own, but are not someone else’s either. They are everybody’s ideas, from the beginning of time to the end of it.

Can we let it define us, though? From what I can really understand, yes. But also, no. That is to say, from a metaphysical standpoint, if the collective unconscious is part of what makes humans what we are, then it’s a part of our identities. But it’s also a constant in all humans, which means that it’s as much a part of an identity as any other constant is. Just as we all have a voice in our heads, we all have a collective unconscious. Just as we all have senses, whether real or not, we all have a collective unconscious.

In that way, we do have our own identities. Or, at the very least, those concerned with being lost within the rat race, with being just another face in the crowd, need not worry. If it is something we all share, then it’s not something you need in order to distinguish yourself. Although identity as a whole is a topic I’m only touching the surface of, I know that much is true.

Which leads to my final question, the one that relates the most to Joseph Campbell and his idea of the Monomyth. Are we truly capable of being original if all our ideas come down to one isolated framework for writing, and if, furthermore, that framework comes from a collective unconscious?

At the risk of coming off as a preacher, I want to say it does not matter, and that it’s better to write a good story than to strive for something original. 

But on a more metaphysical note, we really just have to ask ourselves whether there can be such a thing as originality. Take, for example, this website. Essentially, it contains mostly random gibberish, but because of how randomness works, it creates strings of text that can write out, word for word, your name, birth, location, the entire Harry Potter series, and hypothetically anything that has ever been or will ever be said.

The point I’m making with this is that originality cannot truly exist, because that website, somewhere on the internet, has already said whatever you wanted to say. Going back to the idea of a Monomyth, then, it simply shows that a compelling story is one that touches the archetypes we know within us. It groups together all myths that have been written and will be written and says that they all follow the same idea. Yet, as we read them, we do not see that without analyzing them. The Monomyth is a skeleton upon which stories become fluid beings.

At this point, I feel like I have a good understanding of what the collective unconscious is, how the Monomyth ties in, and why we as a people are not truly as separate as we might wish we were. Yet, now I wonder: is there a reason why we, at least in the Western World, cling so tightly to our identities?

And, as a final note, feel free to comment any fictional stories you’ve read that follow the Monomyth, as they come from all places and all genres.

 


Come Join the Hivemind

(Okay, don’t really.)

So, consciousness and unconsciousness. In theory, we’ve all got both. They’re the defining traits of what makes us who we are, and, in the words of Descartes, “thought exists, it alone cannot be separated from me. I am; I exist” – the idea being that nothing can be confirmed except ourselves, except the presence of our own thoughts and consciousness. That concept, then, is that we are alone.

But, we cannot perceive all of our unconscious. For the most part, it’s unknown to us, coming out in the form of passive dreams, hidden desires, and, for some, intrusive thoughts that we never chose to have. Nobody knows all there is to know about the thoughts they have, no matter how much they may claim they do.

So, if we cannot perceive all of the consciousness that lets us Be, then who’s to say that it functions autonomously? What if Descartes was right, and the only thing that can be proven is our own thought? But also, what if our thought was not only ours? Taking a page out of Carl Jung’s book, what if we shared a collective unconscious?

It’s not a new idea. Archetypes, the concept the collective unconscious relies on most heavily, were first touched on by Plato in relation to his Theory of Forms. However, it was Jung who refined the idea the most.

His idea was that we are all collectively aware of archetypes as concepts, and as history and culture move forwards, we experience people and moments that display these archetypes, whether real or fictional. (In the case of fiction specifically, try The Hero With a Thousand Faces by Joseph Campbell.)

So, through the collective unconscious and the archetypes within, we see reflections of concepts such as the mother, the devil, the child, the trickster, the wise old man, and others. While broad terms, they’re seen reflected throughout history and throughout all cultures. These archetypes touch our myths and define the heroes of media even today. Play them straight or juxtapose them, but they come out all the same regardless.

A collective unconscious, a shared reality. These are ideas that have been touched on by both psychologists and philosophers, due to the very distinct nature of consciousness and our understanding of ourselves, which makes this a very rewarding topic to broach.

But, how do we prove it?
Can we let it define us?
Are we our own people, let alone capable of originality, if our ideas all come from before?

We might not ever be able to tell, but we might as well ask anyway.

 


Technology is ruining our society???

I think most people have heard this sort of argument, or at least something similar to it. There are a lot of people who look at how attached we are to our technology and immediately come to the worst of conclusions: technology is making us all anti-social, and as we stare down at the blinding screen we dismiss the presence of others to stay in our own technological bubble, and it’s ruining our society.

Take this extreme argument, for example. One way to pull it apart is as follows:

Premise One: Technology is becoming an addiction in our society.

Premise Two: Our addiction is pulling us apart from each other.

Conclusion: Therefore, we need to pull away from technology and speak face-to-face.

Or, in the words of the author themselves, “We need to try and go back to the good old days of people sitting down on porches talking to their neighbours.”

Now, let’s actually look at the argument for Factual Correctness.

  • Premise One: Yes, and no. At least, according to the DSM-5. Under the DSM, Internet Gaming Disorder is categorized as a non-substance addiction, specifically separate from gambling addiction due to the lack of money on the line. (That is, money would be spent on games and micro-transactions – not on luck. Internet gambling is simply diagnosed as part of gambling as a whole.) However, there are three very specific points listed in the DSM itself that refute the idea that technology is becoming an addiction for our society at large, and they are as follows:

“Note: Only non-gambling Internet games are included in this disorder. Use of the Internet for required activities in a business or profession is not included; nor is the disorder intended to include other recreational or social Internet use. Similarly, sexual Internet sites are excluded.”

“They typically devote 8-10 hours or more per day to this activity and at least 30 hours per week. If they are prevented from using a computer and returning to the game, they become agitated and angry. They often go for long periods without food or sleep. Normal obligations, such as school or work, or family obligations are neglected.”

“Excessive use of the Internet not involving playing of online games (e.g., excessive use of social media, such as Facebook; viewing pornography online) is not considered analogous to Internet gaming disorder, and future research on other excessive uses of the Internet would need to follow similar guidelines as suggested herein.”

  • From these quotes, it’s clear that the DSM only defines internet gaming as an addiction (at least at the moment), as there is not yet sufficient data to say that excessive use of social media or other forms of internet entertainment can be classified as addictive, despite the common myth that there are agreed-upon criteria for diagnosing and evaluating them. There is also the criterion that normal obligations must be neglected. That is, there needs to be a lack of proper social functioning present to differentiate between what is merely a strong passion and an addiction. There also need to be continued symptoms to count as an addiction – merely engaging in extended internet use once (say, for a friendly get-together) does not count, even if it was for over ten hours per day. And that does not even touch on the growing population of people who have turned gaming into their job, through Let’s Play commentaries, tutorials, speedruns, and other gaming-based tournaments.

Gamebreaking has never been this profitable!

  • Context must also be taken into consideration when agitation and anger are being assessed. Is this anger always present, and paired with an inability to complete obligations? Or is it fair to assume that many people would be angry upon being forcibly removed from an activity they were engaged with? All in all, premise one is factually incorrect, as technology as a whole is specifically excluded from the DSM, and even if it were not, there are currently no consistent and factual studies to say that the majority of people are addicted – and going by the criteria for what an addiction must entail (irritability when separated from the object of addiction, an increasing need to be exposed to more of it, and an inability to keep up with other obligations), most people in our society are not facing addiction at all.
  • Premise Two: Technology, and social media especially, are designed to be social. Their literal intent is to give people a means to communicate rapidly and comfortably. They give users an outlet to connect with those who have similar interests, and give them a social circle that understands them on a level those in their “real” life may not. The internet is also full of information, and with so much knowledge at the tips of our fingers, it’s not at all surprising that one would turn to technology during everyday life to enrich conversations or back up arguments. After all, that’s what we are doing right now with this project.
  • Conclusion: The conclusion is as limp as a wilted piece of lettuce when faced with all the things that keep people apart beyond technology. Work and school have people using the internet to do their work, people are enjoying their alone time by playing video games, people are using the internet to be social, and, frankly, if the people around you do not treat you fairly, there’s someone somewhere in the world who will. You don’t even have to buy a plane ticket to see them.

It’s still up in the air whether we as a society really are “addicted” to the internet, or whether it has just become a part of everyday life. Frankly, I feel like it’s just something that has become an integral part of life, and that’s not a bad thing. I mean, remember that large-scale power outage in August? There was no internet, no power, no technology. And yet, people managed to entertain themselves. They went shopping, they went on drives or walks or hikes, they read and painted and did puzzles. Unless it was too dark to do things, people were functioning just fine. But if the choice is there to do something online, then why not do it? It’s fun!

It’s not like going hiking or reading a book is something that makes you an inherently better person. (And let’s not even get into how much reading the average person actually does online – there’s a good reason people are taking note of fanfics: they are LONG.)

Besides that, a lot of what technology is becoming is interactive. That is to say, where TV was once a soapbox for those lucky enough to get onto it, it’s much more viewer-focused now. Social media has as much of an influence on a show’s popularity as simple ratings do, and people have HUGE meta conversations about the things they love. And people have power online! Anybody can say anything, which gives oppressed groups that were never given a chance on the soapbox the same grounds to speak on what matters to them as the historic oppressors do.

So because of all of that, and even more things, I really can’t bring myself to consider the internet (and technology in general) something we’re addicted to. Because it isn’t a problem; it isn’t stopping us from living life. It’s just changing how we live life, and I don’t believe for a moment that it’s changed for the worse.

(Edited on 19/10/15)

 


Welcome to the Big Kids Club

Yes, Philosophy is that kid. That kid that’s got some hardcore Peter Pan syndrome and just can’t accept the fact that the rest of the world is quite ready to grow up – or is it the other way around?

This is a nice little paraphrase of the presentation I gave on September 23rd in class, kind of a way to get down what I was trying to say without all the messy stuff involved in actually presenting.

Philosophy is a treehouse.
It’s a place of wonder and curiosity for children all around, who spend their time hanging out in it together and enjoying their carefree and youthful ways.

And as these children grow up, they abandon the treehouse they once knew to embrace ‘bigger and better’ sciences, and they look back on the treehouse they once loved as a thing of the past that should really be gotten rid of. It’s rotting and dirty and as much as kids love to hang out in it, is anything good happening?

A lot of people who have left this treehouse don’t believe so, and think it’s time to move on. But, obviously, considering we’re in this class and people still major in philosophy, there’s something going on that is still worth it to someone. We just have to decide what.

 


If I’m memeing and you’re memeing then who is driving the plane?

Hi, my name is Sam, Samson, Samburger, any of those work. Pronoun-wise, I use she/her and/or they/them. This is a blog post about me, and what I think I’m like, and what I hope to accomplish here! ( ´ ▽ ` )ノ

Sometimes I play video games. Sometimes I play too many video games!! It is really a bit of an issue. (Please buy me Splatoon) I like writing, drawing, baking, and the occasional meme aged like a fine wine.

But I mean, aside from that I really do care about the direction the world is heading in. It feels like it’s going downhill at a rapid pace, and for every step forwards we as a collective people make, we take two steps back.

 

Seriously, this is just what I could find using Google and the most current topics, which isn’t always the best way to find things. But if you were to ask me what’s grating painfully on my mind regarding the state of the world, I can assure you it can go on for hours. It’s just… Exhausting. Sometimes the only thing I can do is numb my brain to the hardships of the world and take some time for myself.

Maybe it’s a sign of privilege, that I can push aside the problems of the world because they aren’t constantly hurting me. Maybe it’s just a form of self-preservation for whatever sliver of comfort I still have in my brain that the world is a good and loving place.

Also, I care about Psychology a lot. It’s a big part of why I ended up taking this course, really. It’s the logical extension of the more social aspects of psychology, and being able to tackle the big questions in life can never hurt when I want to spend the rest of my life helping other people cope with theirs. It’s a big picture thing.

As a result, my Philosophy will heavily focus on psychological and social justice aspects and world views. This isn’t any sort of ‘I’m here to say THIS and I will say it, I promise’ deal; I just know my own tells and personality well enough to know it’ll sort of pop out that way.

So far, honestly, Philosophy really seems to agree with me. If we’re going to tackle the hard things in life, we need to discuss them. We can talk about what draws people in…

This gladiatorial element is a challenge for many young philosophers…

Or pushes them out.

…but it seems to put more women off than it does men.

And really, it’s only when we’re willing to talk about the tough stuff (exhausting as it may be) that we can start asking ourselves why what one person considers moral is considered extremely immoral by others, and where we as a functioning society draw that line.

Plus, we get to talk about the cool and strange stuff, like aesthetics, and the uncanny valley, and all of that other fun stuff.

And finally, a few testimonials from my friends and family.

“Gay” – Nikki Salandong

“A homosexual meme” – Kiuko Notoya

“2.5 boats” – Ashley Smith

(●ↀωↀ●) See you later fellow philosophers.


 

 

 