Thursday, April 18, 2019

Adaptive Overreaction

Arianna's post considered the asset of overreaction as a result of poorly calculated risk.  If my perception of a risk is exaggerated beyond the actual reality of the threat, then my exaggerated response may turn out to be beneficial.  This is in contrast to perceiving a risk, judging it to be relatively low, responding to that level of risk, and then finding that the response is insufficient or ineffective.

At first I was thinking that overreaction does not actually exist.  People are just reacting according to their perceptions.  Some people may judge a risk as low, while others judge that same threat as a high-risk situation.  Therefore, the second individual may appear to be overreacting, when in reality they are just reacting in accordance with their perception.  This may be true.  Or maybe people learn overreaction as an adaptive behavior.

I tend to have a heightened response to perceived threats.  Some may consider this overreaction.  For example, when I calculate how many tasks I am responsible for in a week and weigh that against the finite amount of time that I have, a common conclusion that I come to is: I should definitely panic.  This panic calls me to full alert: I make lists, I create plans, I start implementing those plans.  I cut out what is actually not necessary to do this week, or some things get cancelled and I get bonus time that I didn't account for.  And, inevitably, I get through the week.

It would be a great disappointment if I could not work Andy into another SE discussion.  Andy tends to have a very dampened response to perceived threats.  Or maybe he does not perceive anything to be that threatening.  Because we live in the exact same house, it seems unreasonable that we would come to such divergent conclusions.  He has many demands and responsibilities and the same finite amount of time that I do, but the result is quite literally never panic.  In fact, the result might be that the Masters Tournament is a logical task to add to the list of responsibilities.

After 13 years of marriage to Andy, I believe what I may be experiencing is adaptive overreaction.  If part of my process in making sense of a mismatch between demands and time is delegating responsibilities to Andy, then I must try to evoke a reaction within him that calls him to action, moves him away from watching Tiger revive a once-thought-dead career, and toward landscape beautification in preparation for hosting Easter celebrations.  In overreacting, I create a reaction that benefits my steps toward self-regulation.  Relative to his natural inclination, reacting at all is likely an adaptive overreaction.  At any rate, the landscape looks lovely.

Categorization and Experiences

Kahneman's dichotomous framework in Shantanu's blog makes me think about how people start to categorize things, physically or abstractly, and how those categories in mind influence their ways of perceiving the world. If a person only knows two kinds of color, white and black, then how would he or she think about blue? Whenever a person encounters this problem, it seems to involve assimilation or accommodation processes. But the underlying mechanisms remain mysterious. If that person accepts the new color as blue just because other people tell him that is the truth, is this a kind of fast thinking? If so, many ideas or categories already in our minds would probably be revised if we applied slow thinking in the assimilation or accommodation processes.

In the current educational context, much knowledge is learned through secondhand experiences in which ideas and concepts are assimilated or accommodated into our thinking systems. Slow thinkers probably hold more sophisticated thinking systems; it takes them more time to fit a new situation into a suitable category, or they may try to recreate categories in order to make sense of new problems. But for fast thinkers, what are the reasons they wouldn't be willing to slow down and consider other possibilities? Comparing firsthand experiences with secondhand experiences, I'm thinking about how they shape people's thinking systems differently. Since self-efficacy originates from firsthand successful experiences, does it account for the slow thinking process, if there is any link between the two?

Wednesday, April 17, 2019

Asking what Mezirow would say and trying to link it to Kahneman

Reading through last week's posts really made me ask some questions about how we may tend to use System 1 thinking to label things as worthy or unworthy of our attention even when we read for class in a controlled setting, with more time on our hands to practice System 2.  I think that we really need to look at our everyday lives as students, rather than generalizing our experiences at a conference to our daily lives, to realize how we tend to brush off so many things just because they "seem irrelevant". I don't know about you guys, but sometimes I've come across reading that I say I have problems with, but then refuse to reconcile my view even when others unearth its strengths in class, because I've already made my evaluation using System 1: "I don't like it", or it "lies outside the theory I read". Like Ziye said in her last post, abstract academic concepts give us the option to use System 2 effectively. Why do we so often use System 1 instead? The answer could lie in self-efficacy, or in something as simple as convenience.

Something I'd like to bring in here, because I've been studying it extensively, is Mezirow's conception of true learning. I believe that the ways in which we have come to be 'reared' as students have made us look at existing academic systems in an instrumental manner. Habermas and Mezirow would label our reticence to accept the possible strengths in something we don't favor as instrumental reflection, because we look at systems we don't agree with and try to break them down by hook or by crook, using strong words and broken arguments, rather than understanding how to use them to transform and mold our own ideologies. Essentially, this can be pictured as pulling the trigger on something, shooting it till it dies, and waiting for it to be reborn as something deformed, molded to fit what we think. This type of reflection lies at the boundary between legality and legitimacy, and could lead to civil disobedience and the annihilation of civil order within a learning context.  In Habermas' opinion, the lifeworld, or context, is filled with "incalculable presuppositions" that need to be united through a bridging of social capital to incite true communicative learning.

The true conception that is imbued within academia (at least from my perspective) is that of transformation. Transformative learning is something that is rarely seen within the academic context, because we've been told that things are black and white, and that's how the system is.  The go-getters with instrumental opinions who refuse to budge get it all (as we were discussing last class), and those who dip their fingers into many ponds for the sake of social good are suddenly told it's quicksand and are left to slowly sink to their scholarly demise. As academicians, don't we need to adopt this transformative approach to thrive and cram things into our minds, rather than an instrumental one to cram just what's needed to merely survive the semester, or even academia? We often have the tendency to say that we need to "survive", but the point is, we have the "privilege" and luxury to work hard and thrive, just like the undergraduate subjects and students whom we often judge and call spoilt. The truth is, we're all in the same boat. We just don't want to thrive, because of how the system has reared us (much like cattle).

To conclude, I think that the links between Systems 1 and 2 and the instrumental and transformative patterns of thinking seem undeniable, and I'm wondering if there are any journal articles on Kahneman's dichotomous framework that can be cited and linked to Mezirow's work.

Thursday, April 11, 2019

The interaction between Slow Thinking and Fast Thinking

Through our discussion in the last class, I've been thinking about what slow thinking and fast thinking are. Are they domain specific or domain general? Academic learning, such as science and literacy, probably helps people develop logical thinking, which takes more time to generate deep thoughts on certain ideas. But for people engaged in artistic creation, do they have the same kinds of learning approaches? I ask because I'm considering the role of intuition, which is highly valued by many artists. In other words, I don't quite understand how we should distinguish these two different thinking systems and what their functions are in different learning contexts.

As for the linkage with self-efficacy, I'm considering it as a belief system that is formed by different thinking systems. As we discussed before, it is closely related to successful learning experiences. However, both slow thinking and fast thinking could create different levels or kinds of success. For example, a child could learn how to plant a flower very quickly by following peers' or parents' behaviors. Is this based on fast thinking, which might also increase their self-efficacy? College students who study the sciences, on the other hand, may rely more on slow thinking to achieve success. How would people's belief systems change based on the different thinking systems at different cognitive development stages or in different learning contexts?

Is self-efficacy fast or slow thinking?

At the end of the last class I was pretty convinced that what we call self-efficacy could be categorized as slow thinking (System 2). But now that I am thinking more about it, I am not sure; I think I used System 1 to form my first opinion. I think that most of our thoughts, emotions, and behaviors associated with self-efficacy can be classified under System 1. But what could explain that? Based on the discussions in previous classes and my (limited) understanding of Kahneman’s work, my answer to that question is: our concrete experiences. In the first sessions we discussed the relevance of the consequences of people’s behaviors in the formation of their self-efficacy, and why this form of learning is more effective than vicarious experience and verbal persuasion.
So, when people behave in a particular way and, as a consequence, receive something that they expect (reinforcement), their self-efficacy is strengthened. But this experience also implies that people learn associations among the events (including emotions, behaviors, and thoughts) that take place during those experiences. As people are exposed to the same experiences again because they have become more self-efficacious, the learned associations among events are also strengthened. Consequently, people’s future decisions in similar contexts will be highly dominated by System 1. One of Kahneman’s examples could help me illustrate what I am saying. If I ask you what the result of 2 + 2 is, the answer comes to your mind immediately, without your thinking about it. In fact, you can “see” this math operation in your mind! But if I ask you what the result of 345 × 56 is, the answer does not come to mind immediately; most of us will use rules to find the solution. The first case happens not only because our memory capacity allows us to perform that kind of operation, but also because we have constantly been exposed to “2 + 2”, right?
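To make the contrast concrete (my own worked sketch, not Kahneman’s): the second problem has to be solved step by step, by rule, something like 345 × 56 = 345 × 50 + 345 × 6 = 17,250 + 2,070 = 19,320, with each intermediate result held deliberately in working memory, whereas “2 + 2 = 4” simply appears, fully formed, from the stored association.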

The availability cascade


What I gather from Kahneman in relation to self-efficacy is that we feel very efficacious about things we are actually rather bad at. That being said, I feel like this can be advantageous; if we knew how bad we were at decision making or risk calculation or probability, it would be crippling, and we might not try to accomplish anything. Although Kahneman points out some pitfalls, like Love Canal and the Alar scare, I’m not convinced that these were necessarily unwarranted overreactions. The availability cascade may have led to extreme responses, but being adamant about not accepting toxic water and potentially harmful chemicals on your food seems rational to me. I think overreactions can serve a purpose: to hold people accountable and enact change. Had these events not engaged the public so drastically, how much more might the responsible parties have tried to get away with? Should we wait for something more risky or detrimental to occur before demanding change? The cost of not reacting in these types of situations is potentially greater than that of overreacting. A false sense of efficacy for weighing the risk of an event or outcome seems evolutionarily necessary. Though we might be technically bad at weighing risks, maybe that makes us good at other, more important things, such as engaging in collective efficacy and demanding change before something bad has to happen.

Wednesday, April 10, 2019

What leads to self-contradiction


In the chapter “Less Is More”, the notion of the conjunction fallacy reveals that people’s thinking is not strictly logical and is easily biased. From my perspective, the “less” refers to the information provided about an event, while the “more” refers to the uncertainty of the probability and the mental effort needed to draw a conclusion. People’s minds work much more slowly when processing abstract problems than when dealing with concrete ones. Abstract problems, such as calculating probability, are not natural for people to think about. Thus people tend to make them more concrete, for example by recasting them as picking out people in red clothes from a crowd, which is more understandable. When people confront abstract questions, if they rely on intuition, they are more likely to make mistakes. That might explain one of the reasons people contradict themselves in decision making. Compared with computers, which follow a predetermined, overarching algorithm, people’s priorities change depending on the present conditions and scenario.
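If I recall the chapter correctly, Kahneman’s illustration there is the Linda problem, and it makes the logic explicit: the probability of a conjunction can never exceed the probability of either of its components, P(A and B) ≤ P(A), yet many readers judge “bank teller who is active in the feminist movement” to be more probable than “bank teller” alone. The added detail feels more representative even though, logically, it can only make the statement less probable.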

However, as the ultimate form of rationality, logic is just a tool. System 1, which basically depends on intuition, fits the needs of survival and a creature’s nature to seek profit and avoid loss. It is faster and more efficient, but more likely to produce wrong decisions than lazy logical thinking. Intuition is suitable for events where we wouldn’t lose much if we made a mistake. The reason people make decisions off the top of their heads is a wrong estimation of the results, along with the conceit of the decision maker.


It might be better if we do not make judgments when we don’t have enough evidence to support them. With only limited clues provided, the possible assumptions could be numerous; however, that means the possibility of making mistakes would be larger. This is noticeable in the decision-making process: in most cases we are under considerable urgency to draw a conclusion and move on to the next step, employing information from the mental account of System 1. When deducing and making decisions, be on your guard: slow down and collect sufficient factual evidence. If people get impatient and try to speed things up, that will lead to self-contradiction when they confront similar cases.