Sunday, July 22, 2018

Surrogacy

https://www.facebook.com/or.hadar.or/posts/10156494210863158


Let's play a game:
What is the first thing that comes to mind when you hear the word "surrogacy"?
Well, for quite a few people, it turns out, the first thing that pops up is: exploitation.
Exploitation, objectification, poor and uneducated women doing it for lack of any other choice, selling their bodies, giving away their child. More than once I have even heard it compared to prostitution.
So let me shatter a few myths for you -
I am Hadar and I am a proud surrogate.
No one exploited me, no one hurt me, no one forced me.
I am an educated, intelligent woman with a wonderful, supportive family.
I am not destitute and I did not sell my body.
I am a mother, a woman,
who went through fertility treatments herself and experienced the longing, the disappointment of failure, and then the long-awaited success. I am a woman who thought about all those yearning parents, the ones who are disappointed again and again and again and never reach success, or who have no chance to begin with,
and decided to help them fulfill the dream.
Surrogacy law *in Israel* is regulated, rigorous, fair, and protects, first and foremost, the surrogates.
Honestly, after the whole journey, the examinations, the screenings, and the assessments I went through on my way to becoming a surrogate,
I could head straight into the admissions track for NASA.
Yes, it is that thorough.
And here are a few more facts you probably didn't know about the surrogacy process in Israel:
* A woman goes through a long process before she is approved to be a surrogate.
She undergoes physical examinations, a psychological assessment, meetings with a lawyer, detailed explanations from a fertility doctor, a personal interview for her and for her partner, and more. No one enters such a process with her eyes closed or in a way that would allow her to be exploited.
* A surrogate is accompanied throughout the entire process by a lawyer, is insured with nearly every possible insurance, is entitled to consult a third-party doctor on any medical dilemma, and is accompanied by a psychologist, for herself and for her children, if she wishes.
* A woman is not approved to be a surrogate if she went through a crisis (for example, a divorce or a death in the family) in the year before the surrogacy, not if she turns to the process out of poverty and is doing it only for the money with no other choice, not if she had gestational diabetes in her previous pregnancy, and not if she has had two cesarean sections.
She is not approved if a year has not yet passed since her last birth, or if she is over 38. Not if she has had more than three births, and not if her partner does not support her 100%.
All these conditions exist, of course, to look after and protect the physical and mental health of surrogates.
* The final stage, just before the surrogacy agreement is signed, is a probing conversation with the members of the embryo-carrying agreements committee: a doctor, a psychologist, a social worker, a lawyer. They are all there to make sure the woman is choosing this path with a clear head and not out of a lack of options, that she is aware of all the risks and consequences, and that she has support.
They make sure she has received explanations and fully understands her rights.
* Hormones known to increase the risk of cancer are not involved in the surrogacy process, at least not for the surrogate. Those are the hormones given for intensive ovarian stimulation to produce as many eggs as possible in an IVF procedure. It is the intended mother who goes through that procedure, not the surrogate (for whom, in most cases, the medication involved amounts to pregnancy-support hormones).
* There is no issue here of "class" or power relations, not even of money,
and no, not everyone who turns to surrogacy to bring a child into the world is of "high" status, economically or otherwise.
I also know intended parents who take out loans, sell their home, and scrape together shekel after shekel to have a child.
* There is no employer-employee relationship here,
a surrogacy process is built on mutual respect and trust, and in most cases also on a friendship that forms over time.
* No one "buys" the surrogate or "rents" her womb,
she has full freedom of choice.
Protecting her privacy and her right over her own body is of the utmost importance.
* The payment involved in a surrogacy process is *compensation*! Not profit. It is not a life-changing sum, and no surrogate has gotten rich from the process.
* I do not know a single surrogate (and I know more than 100) who came to the process only "for the money".
Even if the main or initial motive is money, it cannot be the *only* motive. A woman for whom it is the only motive simply will not hold up emotionally, and she will be screened out at the psychological assessment stage, or at the latest at the committee interview.
* A surrogate can absolutely stay in touch with the family, if all sides want that. No one snatches the baby from between her legs and prevents her from seeing him; in most cases, continued contact is something the two sides agree on between themselves in advance.
* A surrogate does not give away her child, she simply *returns* him to his parents.
In Israel, the law forbids the surrogate from also being the egg donor, in cases where such a donation is needed.
The embryo must have a biological connection to at least one of the intended parents and no biological connection to the surrogate.
On a personal note -
the knowledge that I turned a couple into a family,
that moment when they held, for the first time, the son they had waited for for so long -
these were the most worthwhile "pay" I could have received.
I felt joy, I felt pride, I felt I had been granted a great privilege,
and I most certainly did not feel exploited.
#פונדקאית_מבחירה
And now to the matter at hand -
surrogacy processes abroad are different from those in Israel.
I do not know enough about the subject, and I avoid writing categorically about areas I do not know enough about, but I do believe that in at least some countries exploitation exists.
So here is what I don't understand:
people who oppose the LGBT struggle because they are worried about exploitation;
where is the logic in denying same-sex couples surrogacy here in Israel, where the law is regulated and rigorous so that no exploitation exists,
and by doing so pushing them toward surrogacy abroad, where, with no other choice, they become involved in what is almost certainly exploitation??
P.S.
Here are a few more voices of amazing surrogates who feel the way I do
https://m.facebook.com/story.php?story_fbid=10155394998505124&id=666010123

Spider Webs


The balloons: exaggerating the severity of what is happening serves our enemies. Afterwards they say the Jews are spoiled, whiny, unable to withstand any harm to their civilian population. That is what Nasrallah said in 2006, and that is what they are building on. We have stood up to much harder times. Let's even "forget" the big wars: the War of Independence (6,000 dead), the Six-Day War (700 dead), the War of Attrition (1,000 dead), the Yom Kippur War (2,500 dead), the Lebanon War and the whole stay in Lebanon (1,000 dead). Remember the era of the bus bombings, the suicide bombers, the Second Intifada. In 1940 London went through the Blitz. Hitler's planes bombed London, then a city of 8 million residents, night after night without mercy; 35,000 civilians were killed as their homes burned. Every morning after the bombing, the people of London got up and went to work in the munitions factories as if nothing had happened. Enough with this whining. Are you serious? Is the nation of Israel really losing it over balloons, after the destruction of two Temples, 2,000 years of exile and pogroms, the Holocaust, and everything we have suffered from the Arabs over the past 100 years?

Friday, July 20, 2018

Dress

Some of the young women these days who dress provocatively or revealingly know the feminist rule
that a woman is allowed to dress however she wants and that nothing justifies harassment (a rule I also consider correct), but they also know how that rule can be abused and exploited manipulatively to gain attention, to vent a sense of victimhood, or to win compensation money. They simply walk around like that and wait for the idiot of the day to say some word, stupidly and deliberately, or to let one slip unintentionally, and then they can launch into the show of tears and the cries of anguish: oh, I've been harassed, and what a heavy price the "victim" will pay. Some of them lead him on, with words or in a suggestive way, signaling through hinted verbal messages or body language that free conversation is legitimate as far as they are concerned and that they consent to it. He said some word, and now my life is ruined and the world is on the brink of destruction.
That is why, these days, every cautious man must elegantly cut off contact with any woman who walks around dressed revealingly. Obviously you must not tell her openly that you are keeping your distance because of this; simply keep your distance without saying anything, do it elegantly, and if she asks, give an excuse. In any case, do not get into friendships, work relationships, or acquaintanceships with women who dress revealingly; that is a fairly big risk. True, there are women who dress revealingly and whose manner and speech signal that they do not keep their distance, do not treat the strict PC rules as a commandment for life, and are not like that, are not setting a manipulative trap, but there is no way to tell the trap-setters from the innocent. One should keep one's language restrained and one's behavior correct and dry, to avoid the danger of getting entangled over nonsense.
Of course, insults that have nothing to do with sexuality, or even systematic verbal abuse that is not sexual, are legal, and no one will be moved if you complain that such a thing was done to you; only sexual remarks earn society's condemnation.

Tuesday, July 17, 2018

Thinking Errors


Biases in decision-making, beliefs, and behavior; social biases:



https://en.wikipedia.org/wiki/List_of_cognitive_biases


https://en.wikipedia.org/wiki/Emotional_bias


https://en.wikipedia.org/wiki/List_of_fallacies


https://en.wikipedia.org/wiki/Fallacy


http://rationalwiki.org/wiki/Bias


----------------------- ----------------------------------- 

https://he.wikipedia.org/wiki/%D7%90%D7%A4%D7%A7%D7%98_%D7%93%D7%90%D7%A0%D7%99%D7%A0%D7%92-%D7%A7%D7%A8%D7%95%D7%92%D7%A8


The Dunning-Kruger effect

The Dunning-Kruger effect is a cognitive bias in which unskilled people suffer from illusory superiority. The bias is attributed to the unskilled person's failure to use metacognition to recognize their own mistakes.[1] Conversely, actual skill may lower self-confidence, since competent people may mistakenly assume that others have an equivalent understanding.[2]
A further reason for the phenomenon is that a person with little knowledge of a field is unaware of its scope and may therefore overestimate what little they know, whereas a person highly educated in the field, who is aware of how broad it is, may underestimate their knowledge relative to its wide boundaries.


The curse of knowledge

The curse of knowledge is a concept expressing the idea that someone who possesses a particular piece of knowledge will find it very difficult to think about what they know from the point of view of someone who lacks that knowledge.
The concept, used among other places in educational psychology and in economics, was apparently coined in 1989 in an article in the Journal of Political Economy.[1] The brothers Chip Heath, a professor of organizational behavior at Stanford University, and Dan Heath, a publisher of education books, used the term in two books: "The Curse of Knowledge" and "Made to Stick". In the latter they argued that ideas that manage to penetrate and stay in the public consciousness are characterized by not being tainted with the curse of knowledge.
Another definition refers to the inability to perceive another person's mind as different from one's own (theory of mind). The brain region involved in perceiving another person's mind is the orbitofrontal cortex.


Cognitive biases are tendencies to think in certain ways that can lead to systematic deviations from a 
standard of rationality or good judgment, and are often studied 
in psychology and behavioral economics.



In economics, hyperbolic discounting is a time-inconsistent model of discounting.

The discounted utility approach: Intertemporal choices are no different from other choices, except that some consequences are delayed and hence must be anticipated and discounted (i.e., reweighted to take into account the delay).
Given two similar rewards, humans show a preference for one that arrives sooner rather than later. Humans are said to discount the value of the later reward, by a factor that increases with the length of the delay. This process is traditionally modeled in form of exponential discounting, a time-consistent model of discounting. A large number of studies have since demonstrated that the constant discount rate assumed in exponential discounting is systematically being violated.[1] Hyperbolic discounting is a particular mathematical model devised as an alternative to exponential discounting.
According to hyperbolic discounting, valuations fall relatively rapidly for earlier delay periods (as in, from now to one week), but then fall more slowly for longer delay periods (from ten weeks to 21). This contrasts with exponential discounting, in which valuation falls by a constant factor per unit delay. The standard experiment used to reveal a test subject's hyperbolic discounting curve is to compare short-term preferences with long-term preferences. For instance: "Would you prefer a dollar today or three dollars tomorrow?" or "Would you prefer a dollar in one year or three dollars in one year and one day?" It has been claimed that a significant fraction of subjects will take the lesser amount today, but will gladly wait one extra day in a year in order to receive the higher amount instead.[2] Individuals with such preferences are described as "present-biased".
Individuals using hyperbolic discounting reveal a strong tendency to make choices that are inconsistent over time – they make choices today that their future self would prefer not to have made, despite using the same reasoning. This dynamic inconsistency happens because the value of future rewards is much lower under hyperbolic discounting than under exponential discounting.[3]
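To make the preference reversal concrete, here is a minimal Python sketch of the two discounting models described above. It is my own illustration, not something from the quoted text, and the parameter values (a per-day exponential factor delta = 0.3 and a hyperbolic rate k = 3) are arbitrary choices picked only to make the effect visible.

```python
# Sketch: exponential vs. hyperbolic discounting and the resulting preference reversal.
# The parameter values (delta, k) are arbitrary and chosen only for illustration.

def exponential_value(amount, delay_days, delta=0.3):
    """Exponential (time-consistent) discounting: value = amount * delta**delay."""
    return amount * delta ** delay_days

def hyperbolic_value(amount, delay_days, k=3.0):
    """Hyperbolic discounting: value = amount / (1 + k * delay)."""
    return amount / (1 + k * delay_days)

def choice(value_fn, near_offer, far_offer):
    """Return which of two (amount, delay_in_days) offers the discounter prefers."""
    return "sooner" if value_fn(*near_offer) > value_fn(*far_offer) else "later"

today_pair = ((1, 0), (3, 1))       # $1 now vs. $3 tomorrow
year_pair = ((1, 365), (3, 366))    # the same choice pushed one year into the future

for name, fn in (("exponential", exponential_value), ("hyperbolic", hyperbolic_value)):
    print(f"{name:12s} today: {choice(fn, *today_pair)}   in a year: {choice(fn, *year_pair)}")
```

Run as written, the exponential discounter gives the same answer in both framings, while the hyperbolic one takes the dollar today yet is happy to wait the extra day when both payoffs are a year away, which is exactly the dynamic inconsistency described above.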

The ambiguity effect is a cognitive bias where decision making is affected by a lack of information, or "ambiguity".[1] The effect implies that people tend to select options for which the probability of a favorable outcome is known, over an option for which the probability of a favorable outcome is unknown. 
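As a concrete illustration of the ambiguity effect (again my own sketch, not part of the quoted definition), consider the classic two-urn setup: one urn holds 50 red and 50 black balls, the other holds 100 balls in an unknown red/black mix. Under the assumption that every possible composition of the unknown urn is equally likely, a bet on red wins with probability 0.5 from either urn, yet most people still prefer the urn with the known mix.

```python
# Two-urn illustration of the ambiguity effect.
# Urn A: 50 red, 50 black (the probability of red is known to be 0.5).
# Urn B: 100 balls, red/black mix unknown; here we assume, purely for illustration,
# a uniform prior over the 101 possible compositions (0..100 red balls).

from fractions import Fraction

p_red_known = Fraction(50, 100)

# Average P(red | k red balls in urn B) over all 101 equally likely compositions.
p_red_unknown = sum(Fraction(k, 100) for k in range(101)) / 101

print("P(red) from the known urn:  ", float(p_red_known))    # 0.5
print("P(red) from the unknown urn:", float(p_red_unknown))  # 0.5
```

The two probabilities come out identical, so a preference for the known urn is driven by the ambiguity itself rather than by the odds.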




Confusing facts with theses, hypotheses, assumptions, guesses, beliefs, opinions, conjectures, values, and morals.

Believing that whatever I want to be the case is the case.


Assuming causality without any evidence or logic.

An earthquake happens because of people's sins.


Preferring an explanation that covers everything over a partial one: because the explanation is complete, it must be correct.

Scientists sometimes don't know, and every scientist says something different. The explanation in the Torah is comprehensive, covers everything, and has an answer for everything, so that proves it is correct.

A partial match cannot be coincidental; it is proof of a connection, while whatever does not match is ignored.

In the 20th century scientists found things that are written in the Bible. What about the million things they found that are not mentioned in the Bible?


How lucky that I visited a kabbalist rabbi this morning. On the way back from receiving the blessing I was wounded in a terror attack. Had I not gone to the rabbi, I would have been killed.

People tend to assess the future optimistically.
People tend to think that others think the way they do.

wishful thinking.


magical thinking.


The bias of assuming a conspiracy.


The bias of dismissing the assumption of a conspiracy.



The optimism bias.




------------------------------ --------------

Poe's law.
Poe's law, named after its originator Nathan Poe, is an internet adage reflecting the fact that online, in the absence of the facial expressions or tone of voice of the person making a statement, parody may be taken seriously if it is not properly explained.
https://he.wikipedia.org/.../%D7%97%D7%95%D7%A7_%D7%A4%D7%95




------------------------------- ----------------------------------------

List of cognitive biases





Name
Description
The tendency to avoid options for which missing information makes the probability seem "unknown".[9]
Anchoring or focalism
The tendency to rely too heavily, or "anchor", on one trait or piece of information when making decisions (usually the first piece of information acquired on that subject)[10][11]
Anthropomorphism or personification
The tendency to characterize animals, objects, and abstract concepts as possessing human-like traits, emotions, and intentions.[12]
The tendency of our perception to be affected by our recurring thoughts.[13]
The tendency to depend excessively on automated systems which can lead to erroneous automated information overriding correct decisions.[14]
The tendency to overestimate the likelihood of events with greater "availability" in memory, which can be influenced by how recent the memories are or how unusual or emotionally charged they may be.[15]
A self-reinforcing process in which a collective belief gains more and more plausibility through its increasing repetition in public discourse (or "repeat something long enough and it will become true").[16]
The reaction to disconfirming evidence by strengthening one's previous beliefs.[17] cf. Continued influence effect.
The tendency to do (or believe) things because many other people do (or believe) the same. Related to groupthink and herd behavior.[18]
Base rate fallacy or Base rate neglect
The tendency to ignore base rate information (generic, general information) and focus on specific information (information only pertaining to a certain case).[19]
An effect where someone's evaluation of the logical strength of an argument is biased by the believability of the conclusion.[20]
A person who has performed a favor for someone is more likely to do another favor for that person than they would be if they had received a favor from that person.
The tendency to misinterpret statistical experiments involving conditional probabilities.
The tendency to see oneself as less biased than other people, or to be able to identify more cognitive biases in others than in oneself.[21]
The tendency for people to appear more attractive in a group than in isolation.[22]
The tendency to remember one's choices as better than they actually were.[23]
The tendency to overestimate the importance of small runs, streaks, or clusters in large samples of random data (that is, seeing phantom patterns).[11]
The tendency to search for, interpret, focus on and remember information in a way that confirms one's preconceptions.[24]
The tendency to test hypotheses exclusively through direct testing, instead of testing possible alternative hypotheses.[11]
The tendency to assume that specific conditions are more probable than general ones.[25]
The tendency to revise one's belief insufficiently when presented with new evidence.[4][26][27]
The tendency to believe previously learned misinformation even after it has been corrected. Misinformation can still influence inferences one generates after a correction has occurred.[28] cf. Backfire effect
The enhancement or reduction of a certain perception's stimuli when compared with a recently observed, contrasting object.[29]
Courtesy bias
The tendency to give an opinion that is more socially correct than one's true opinion, so as to avoid offending anyone.[30]
When better-informed people find it extremely difficult to think about problems from the perspective of lesser-informed people.[31]
The belief that a society or institution is tending towards decline. Particularly, it is the predisposition to view the past favourably (rosy retrospection) and future negatively.[32]
Preferences for either option A or B change in favor of option B when option C is presented, which is similar to option B but in no way better.
The tendency to spend more money when it is denominated in small amounts (e.g., coins) rather than large amounts (e.g., bills).[33]
The tendency to sell an asset that has accumulated in value and resist selling an asset that has declined in value.
The tendency to view two options as more dissimilar when evaluating them simultaneously than when evaluating them separately.[34]
The tendency for unskilled individuals to overestimate their own ability and the tendency for experts to underestimate their own ability.[35]
The neglect of the duration of an episode in determining its value
The tendency to underestimate the influence or strength of feelings, in either oneself or others.
The tendency for people to demand much more to give up an object than they would be willing to pay to acquire it.[36]
Exaggerated expectation
Based on the estimates, real-world evidence turns out to be less extreme than our expectations (conditionally inverse of the conservatism bias).[unreliable source?][4][37]
The tendency for experimenters to believe, certify, and publish data that agree with their expectations for the outcome of an experiment, and to disbelieve, discard, or downgrade the corresponding weightings for data that appear to conflict with those expectations.[38]
The tendency to place too much importance on one aspect of an event.[39]
Forer effect or Barnum effect
The observation that individuals will give high accuracy ratings to descriptions of their personality that supposedly are tailored specifically for them, but are in fact vague and general enough to apply to a wide range of people. This effect can provide a partial explanation for the widespread acceptance of some beliefs and practices, such as astrology, fortune telling, graphology, and some types of personality tests.
Drawing different conclusions from the same information, depending on how that information is presented
Frequency illusion
The illusion in which a word, a name, or other thing that has recently come to one's attention suddenly seems to appear with improbable frequency shortly afterwards (not to be confused with the recency illusion or selection bias).[40] This illusion may explain some examples of the Baader-Meinhof Phenomenon, whereby someone hears a new word or phrase repeatedly in a short span of time.
Limits a person to using an object only in the way it is traditionally used.
The tendency to think that future probabilities are altered by past events, when in reality they are unchanged. The fallacy arises from an erroneous conceptualization of the law of large numbers. For example, "I've flipped heads with this coin five times consecutively, so the chance of tails coming out on the sixth flip is much greater than heads."
Based on a specific level of task difficulty, the confidence in judgments is too conservative and not extreme enough[4][41][42][43]
Sometimes called the "I-knew-it-all-along" effect, the tendency to see past events as being predictable[44] at the time those events happened.
The "hot-hand fallacy" (also known as the "hot hand phenomenon" or "hot hand") is the fallacious belief that a person who has experienced success with a random event has a greater chance of further success in additional attempts.
Discounting is the tendency for people to have a stronger preference for more immediate payoffs relative to later payoffs. Hyperbolic discounting leads to choices that are inconsistent over time – people make choices today that their future selves would prefer not to have made, despite using the same reasoning.[45] Also known as current moment bias, present-bias, and related to Dynamic inconsistency.
The tendency to respond more strongly to a single identified person at risk than to a large group of people at risk.[46]
The tendency for people to place a disproportionately high value on objects that they partially assembled themselves, such as furniture from IKEA, regardless of the quality of the end result.
The tendency to overestimate one's degree of influence over other external events.[47]
Belief that furtherly acquired information generates additional relevant data for predictions, even when it evidently does not.[48]
Inaccurately perceiving a relationship between two unrelated events.[49][50]
A tendency to believe that a statement is true if it is easier to process, or if it has been stated multiple times, regardless of its actual veracity. These are specific cases of truthiness.
The tendency to overestimate the length or the intensity of the impact of future feeling states.[51]
The tendency to seek information even when it cannot affect action.[52]
The tendency to under-expect variation in small samples.
The phenomenon where people justify increased investment in a decision, based on the cumulative prior investment, despite new evidence suggesting that the decision was probably wrong. Also known as the sunk cost fallacy.
"If all you have is a hammer, everything looks like a nail."
The tendency to prefer a smaller set to a larger set judged separately, but not jointly.
The disutility of giving up an object is greater than the utility associated with acquiring it.[53] (see also Sunk cost effects and endowment effect).
The tendency to express undue liking for things merely because of familiarity with them.[54]
The tendency to concentrate on the nominal value (face value) of money rather than its value in terms of purchasing power.[55]
The tendency of a track record of non-prejudice to increase subsequent prejudice.
Negativity bias or Negativity effect
Psychological phenomenon by which humans have a greater recall of unpleasant memories compared with positive memories.[56] [57] (see also actor-observer bias, group attribution error, positivity effect, and negativity effect).[58]
The tendency to completely disregard probability when making a decision under uncertainty.[59]
The refusal to plan for, or react to, a disaster which has never happened before.
Aversion to contact with or use of products, research, standards, or knowledge developed outside a group. Related to IKEA effect.
When a researcher expects a given result and therefore unconsciously manipulates an experiment or misinterprets data in order to find it (see also subject-expectancy effect).
The tendency to judge harmful actions as worse, or less moral, than equally harmful omissions (inactions).[60]
The tendency to be over-optimistic, overestimating favorable and pleasing outcomes (see also wishful thinking, valence effect, positive outcome bias).[61][62]
Ignoring an obvious (negative) situation.
The tendency to judge a decision by its eventual outcome instead of based on the quality of the decision at the time it was made.
Excessive confidence in one's own answers to questions. For example, for certain types of questions, answers that people rate as "99% certain" turn out to be wrong 40% of the time.[4][63][64][65]
A vague and random stimulus (often an image or sound) is perceived as significant, e.g., seeing images of animals or faces in clouds, the man in the moon, and hearing non-existent hidden messages on records played in reverse.
The tendency for some people, especially those suffering from depression, to overestimate the likelihood of negative things happening to them.
The tendency to underestimate task-completion times.[51]
The tendency to persuade oneself through rational argument that a purchase was good value.
The tendency to have an excessive optimism towards an invention or innovation's usefulness throughout society, while often failing to identify its limitations and weaknesses.
The tendency to overestimate how much our future selves share one's current preferences, thoughts and values, thus leading to sub-optimal choices.[66] [67][57]
The tendency to make risk-averse choices if the expected outcome is positive, but make risk-seeking choices to avoid negative outcomes.[68]
The urge to do the opposite of what someone wants you to do out of a need to resist a perceived attempt to constrain your freedom of choice (see also Reverse psychology).
Devaluing proposals only because they purportedly originated with an adversary.
The illusion that a word or language usage is a recent innovation when it is in fact long-established (see also frequency illusion).
Regressive bias
A certain state of mind wherein high values and high likelihoods are overestimated while low values and low likelihoods are underestimated.[4][69][70][unreliable source?]
The tendency to overestimate one's ability to show restraint in the face of temptation.
Rhyming statements are perceived as more truthful. A famous example is its use in the O.J. Simpson trial, with the defense's phrase "If the gloves don't fit, then you must acquit."
Risk compensation / Peltzman effect
The tendency to take greater risks when perceived safety increases.
The tendency for expectations to affect perception.
The tendency to reject new evidence that contradicts a paradigm.[27]
The tendency, when making hiring decisions, to favour potential candidates who don't compete with one's own particular strengths.[71]
The tendency to over-report socially desirable characteristics or behaviours in oneself and under-report socially undesirable characteristics or behaviours.[72]
The tendency to like things to stay relatively the same (see also loss aversion, endowment effect, and system justification).[73][74]
Expecting a member of a group to have certain characteristics without having actual information about that individual.
The tendency to judge probability of the whole to be less than the probabilities of the parts.[75]
Perception that something is true if a subject's belief demands it to be true. Also assigns perceived connections between coincidences.
Concentrating on the people or things that "survived" some process and inadvertently overlooking those that didn't because of their lack of visibility.
Underestimations of the time that could be saved (or lost) when increasing (or decreasing) from a relatively low speed and overestimations of the time that could be saved (or lost) when increasing (or decreasing) from a relatively high speed.
Belief that mass communicated media messages have a greater effect on others than on themselves.
The tendency to give disproportionate weight to trivial issues. Also known as bikeshedding, this bias explains why an organization may avoid specialized or complex subjects, such as the design of a nuclear reactor, and instead focus on something easy to grasp or rewarding to the average participant, such as the design of an adjacent bike shed.[76]
Unit bias
The tendency to want to finish a given unit of a task or an item. Strong effects on the consumption of food in particular.[77]
Difficulty in comparing small differences in large quantities.
Underestimation of the duration taken to traverse oft-traveled routes and overestimation of the duration taken to traverse less familiar routes.
Preference for reducing a small risk to zero over a greater reduction in a larger risk.
A bias whereby a situation is perceived to be like a zero-sum game (i.e., one person gains at the expense of another).
Social biases
Most of these biases are labeled as attributional biases.
Name
Description
The tendency for explanations of other individuals' behaviors to overemphasize   the influence of their personality and underemphasize the influence of their situation (see also Fundamental attribution error), and for explanations of one's own behaviors to do the opposite (that is, to overemphasize the influence of our situation and underemphasize the influence of our own personality).
The tendency to attribute greater accuracy to the opinion of an authority figure (unrelated to its content) and be more influenced by that opinion.[78]
Attributing more blame to a harm-doer as the outcome becomes more severe or as personal or situational similarity to the victim increases.
Occurs when people claim more responsibility for themselves for the results of a joint action than an outside observer would credit them with.
An exception to the fundamental attribution error, when people view others as having (situational) extrinsic motivations and (dispositional) intrinsic motivations for oneself
The tendency for people to overestimate the degree to which others agree with them.[79]
Forer effect (aka Barnum effect)
The tendency to give high accuracy ratings to descriptions of their personality that supposedly are tailored specifically for them, but are in fact vague and general enough to apply to a wide range of people. For example, horoscopes.
The tendency for people to over-emphasize personality-based explanations for behaviors observed in others while under-emphasizing the role and power of situational influences on the same behavior[57] (see also actor-observer bias, group attribution error, positivity effect, and negativity effect).[80]
The biased belief that the characteristics of an individual group member are reflective of the group as a whole or the tendency to assume that group decision outcomes reflect the preferences of group members, even when information is available that clearly suggests otherwise.
The tendency for a person's positive or negative traits to "spill over" from one personality area to another in others' perceptions of them (see also physical attractiveness stereotype).[81]
People perceive their knowledge of their peers to surpass their peers' knowledge of them.[82]
When people view self-generated preferences as instead being caused by insightful, effective and benevolent agents
People overestimate others' ability to know them, and they also overestimate their ability to know others.
Overestimating one's desirable qualities, and underestimating undesirable qualities, relative to other people. (Also known as "Lake Wobegon effect", "better-than-average effect", or "superiority bias".)[83]
The tendency for people to give preferential treatment to others they perceive to   be members of their own groups.
The tendency for people to want to believe that the world is fundamentally just, causing them to rationalize an otherwise inexplicable injustice as deserved by the victim(s).
The tendency for people to ascribe greater or lesser moral standing based on the outcome of an event.
Expecting more egocentric bias in others than in oneself.
The belief that we see reality as it really is – objectively and without bias; that the facts are plain for all to see; that rational people will agree with us; and that those who don't are either uninformed, lazy, irrational, or biased.
Individuals see members of their own group as being relatively more varied than members of other groups.[84]
The tendency to claim more responsibility for successes than failures. It may also manifest itself as a tendency for people to evaluate ambiguous information in a way beneficial to their interests (see also group-serving bias).[85]
Known as the tendency for group members to spend more time and energy discussing information that all members are already familiar with (i.e., shared information), and less time and energy discussing information that only some members are aware of (i.e., unshared information).[86]
Sociability bias of language
The disproportionately higher representation of words related to social interactions, in comparison to words related to physical or mental aspects of behavior, in most languages. This bias is attributed to the nature of language as a tool facilitating human interactions. When verbal descriptors of human behavior are used as a source of information, the sociability bias of such descriptors emerges in factor-analytic studies as a factor related to pro-social behavior (for example, the Extraversion factor in the Big Five personality traits).[57]
The tendency to defend and bolster the status quo. Existing social, economic, and political arrangements tend to be preferred, and alternatives disparaged, sometimes even at the expense of individual and collective self-interest. (See also status quo bias.)
The tendency for people to view themselves as relatively variable in terms of personality, behavior, and mood while viewing others as much more predictable.
Similar to the fundamental attribution error, in this error a person is likely to make an internal attribution to an entire group instead of the individuals within the group.
A tendency to believe ourselves to be worse than others at tasks which are difficult.[87]
Memory errors and biases
Main article: List of memory biases
In psychology and cognitive science, a memory bias is a cognitive bias that either enhances or impairs the recall of a memory (either the chances that the memory will be recalled at all, or the amount of time it takes for it to be recalled, or both), or that alters the content of a reported memory. There are many types of memory bias, including:
Name
Description
Bizarre material is better remembered than common material.
In a self-justifying manner retroactively ascribing one's choices to be more informed than they were when they were made.
Change bias
After an investment of effort in producing change, remembering one's past performance as more difficult than it actually was[88][unreliable source?]
The retention of few memories from before the age of four.
Conservatism or Regressive bias
Tendency to remember high values and high likelihoods/probabilities/frequencies as lower than they actually were and low ones as higher than they actually were. Based on the evidence, memories are not extreme enough[69][70]
Consistency bias
Incorrectly remembering one's past attitudes and behaviour as resembling present attitudes and behaviour.[89]
That cognition and memory are dependent on context, such that out-of-context memories are more difficult to retrieve than in-context memories (e.g., recall time and accuracy for a work-related memory will be lower at home, and vice versa)
The tendency for people of one race to have difficulty identifying members of a race other than their own.
A form of misattribution where a memory is mistaken for imagination, because there is no subjective experience of it being a memory.[88]
Recalling the past in a self-serving manner, e.g., remembering one's exam grades as being better than they were, or remembering a caught fish as bigger than it really was.
A bias in which the emotion associated with unpleasant memories fades more quickly than the emotion associated with positive events.[90]
A form of misattribution where imagination is mistaken for a memory.
Generation effect (Self-generation effect)
That self-generated information is remembered best. For instance, people are better able to recall memories of statements that they have generated than similar statements generated by others.
The tendency to forget information that can be found readily online by using Internet search engines.
The inclination to see past events as being more predictable than they actually were; also called the "I-knew-it-all-along" effect.
Humor effect
That humorous items are more easily remembered than non-humorous ones, which might be explained by the distinctiveness of humor, the increased cognitive processing time to understand the humor, or the emotional arousal caused by the humor.[91]
That people are more likely to identify as true statements those they have previously heard (even if they cannot consciously remember having heard them), regardless of the actual validity of the statement. In other words, a person is more likely to believe a familiar statement than an unfamiliar one.
Inaccurately remembering a relationship between two events.[4][50]
Lag effect
The phenomenon whereby learning is greater when studying is spread out over time, as opposed to studying the same amount of time in a single session. See also spacing effect.
Memory distortions introduced by the loss of details in a recollection over time, often concurrent with sharpening or selective recollection of certain details that take on exaggerated significance in relation to the details or aspects of the experience lost through leveling. Both biases may be reinforced over time, and by repeated recollection or re-telling of a memory.[92]
That different methods of encoding information into memory have different levels of effectiveness.[93]
List-length effect
A smaller percentage of items are remembered in a longer list, but as the length of the list increases, the absolute number of items remembered increases as well.[94][further explanation needed]
Memory becoming less accurate because of interference from post-event information.[95]
That memory recall is higher for the last items of a list when the list items were received via speech than when they were received through writing.
The improved recall of information congruent with one's current mood.
Next-in-line effect
That a person in a group has diminished recall for the words of others who spoke immediately before himself, if they take turns speaking.[96]
That being shown some items from a list and later retrieving one item causes it to become harder to retrieve the other items.[97]
That people seem to perceive not the sum of an experience but the average of how it was at its peak (e.g., pleasant or unpleasant) and how it ended.
Persistence
The unwanted recurrence of memories of a traumatic event.[citation needed]
The notion that concepts that are learned by viewing pictures are more easily and frequently recalled than are concepts that are learned by viewing their written word form counterparts.[98][99][100][101][102][103]
That older adults favor positive over negative information in their memories.
That items near the end of a sequence are the easiest to recall, followed by the items at the beginning of a sequence; items in the middle are the least likely to be remembered.[104]
Processing difficulty effect
That information that takes longer to read and is thought about more (processed with more difficulty) is more easily remembered.[105]
The recalling of more personal events from adolescence and early adulthood than personal events from other lifetime periods[106]
The remembering of the past as having been better than it really was.
Self-relevance effect
That memories relating to the self are better recalled than similar information relating to others.
Confusing episodic memories with other information, creating distorted memories.[107]
That information is better recalled if exposure to it is repeated over a long span of time rather than a short one.
The tendency to overestimate the amount that other people notice your appearance or behavior.
Stereotypical bias
Memory distorted towards stereotypes (e.g., racial or gender), e.g., "black-sounding" names being misremembered as names of criminals.[88][unreliable source?]
Suffix effect
Diminishment of the recency effect because a sound item is appended to the list that the subject is not required to recall.[108][109]
A form of misattribution where ideas suggested by a questioner are mistaken for memory.
The tendency to displace recent events backward in time and remote events forward in time, so that recent events appear more remote, and remote events, more recent.
The fact that you more easily remember information you have read by rewriting it instead of rereading it.[110]
Tip of the tongue phenomenon
When a subject is able to recall parts of an item, or related information, but is frustratingly unable to recall the whole item. This is thought to be an instance of "blocking" where multiple similar memories are being recalled and interfere with each other.[88]
Travis Syndrome
Overestimating the significance of the present.[111] It is related to the Enlightenment idea of progress and to chronological snobbery, with possibly an appeal-to-novelty fallacy being part of the bias.
Verbatim effect
That the "gist" of what someone has said is better remembered than the verbatim wording.[112] This is because memories are representations, not exact copies.
That an item that sticks out is more likely to be remembered than other items[113]
That uncompleted or interrupted tasks are remembered better than completed ones.



List of cognitive biases

Decision-making and behavioral biases

Many of these biases are studied for how they affect belief formation and business decisions and scientific research.
  • Bandwagon effect — the tendency to do (or believe) things because many other people do (or believe) the same. Related to groupthink, crowd psychology, herd behaviour, and manias.
  • Bias blind spot — the tendency not to compensate for one's own cognitive biases.
  • Choice-supportive bias — the tendency to remember one's choices as better than they actually were.
  • Confirmation bias — the tendency to search for or interpret information in a way that confirms one's preconceptions.
  • Congruence bias — the tendency to test hypotheses exclusively through direct testing, in contrast to tests of possible alternative hypotheses.
  • Contrast effect — the enhancement or diminishment of a weight or other measurement when compared with recently observed contrasting object.
  • Déformation professionnelle — the tendency to look at things according to the conventions of one's own profession, forgetting any broader point of view.
  • Endowment effect — "the fact that people often demand much more to give up an object than they would be willing to pay to acquire it".[2]
  • Exposure-suspicion bias — a knowledge of a subject's disease in a medical study may influence the search for causes.
  • Extreme aversion — most people will go to great lengths to avoid extremes. People are more likely to choose an option if it is the intermediate choice.
  • Focusing effect — prediction bias occurring when people place too much importance on one aspect of an event; causes error in accurately predicting the utility of a future outcome.
  • Framing — drawing different conclusions from the same information, depending on how that information is presented.
  • Hyperbolic discounting — the tendency for people to have a stronger preference for more immediate payoffs relative to later payoffs, the closer to the present both payoffs are.
  • Illusion of control — the tendency for human beings to believe they can control or at least influence outcomes that they clearly cannot.
  • Impact bias — the tendency for people to overestimate the length or the intensity of the impact of future feeling states.
  • Information bias — the tendency to seek information even when it cannot affect action.
  • Irrational escalation — the tendency to make irrational decisions based upon rational decisions in the past or to justify actions already taken.
  • Loss aversion — "the disutility of giving up an object is greater than the utility associated with acquiring it".[3] (see also sunk cost effects and Endowment effect).
  • Neglect of probability — the tendency to completely disregard probability when making a decision under uncertainty.
  • Mere exposure effect — the tendency for people to express undue liking for things merely because they are familiar with them.
  • Obsequiousness bias — the tendency to systematically alter responses in the direction they perceive desired by the investigator.
  • Omission bias — the tendency to judge harmful actions as worse, or less moral, than equally harmful omissions (inactions).
  • Outcome bias — the tendency to judge a decision by its eventual outcome instead of based on the quality of the decision at the time it was made.
  • Planning fallacy — the tendency to underestimate task-completion times. Also formulated as Hofstadter's Law: "It always takes longer than you expect, even when you take into account Hofstadter's Law."
  • Post-purchase rationalization — the tendency to persuade oneself through rational argument that a purchase was a good value.
  • Pseudocertainty effect — the tendency to make risk-averse choices if the expected outcome is positive, but make risk-seeking choices to avoid negative outcomes.
  • Reactance — the urge to do the opposite of what someone wants you to do out of a need to resist a perceived attempt to constrain your freedom of choice.
  • Selective perception — the tendency for expectations to affect perception.
  • Status quo bias — the tendency for people to like things to stay relatively the same (see also Loss aversion and Endowment effect).[4]
  • Survivorship bias — a form of selection bias focusing on what has survived to the present and ignoring what must have been lost.
  • Unacceptability bias — questions that may embarrass or invade privacy are refused or evaded.
  • Unit bias — the tendency to want to finish a given unit of a task or an item with strong effects on the consumption of food in particular
  • Von Restorff effect — the tendency for an item that "stands out like a sore thumb" to be more likely to be remembered than other items.
  • Zero-risk bias — the preference for reducing a small risk to zero over a greater reduction in a larger risk. It is relevant e.g. to the allocation of public health resources and the debate about nuclear power.

Biases in probability and belief

Many of these biases are often studied for how they affect business and economic decisions and how they affect experimental research.
  • Ambiguity effect — the avoidance of options for which missing information makes the probability seem "unknown".
  • Anchoring — the tendency to rely too heavily, or "anchor," on a past reference or on one trait or piece of information when making decisions.
  • Anthropic bias — the tendency for one's evidence to be biased by observation selection effects.
  • Attentional bias — neglect of relevant data when making judgments of a correlation or association.
  • Availability heuristic — a biased prediction, due to the tendency to focus on the most salient and emotionally-charged outcome.
  • Clustering illusion — the tendency to see patterns where actually none exist.
  • Conjunction fallacy — the tendency to assume that specific conditions are more probable than general ones.
  • Frequency illusion — the phenomenon in which people who just learn or notice something start seeing it everywhere. Also known as the Baader-Meinhof Phenomenon.[5]
  • Gambler's fallacy — the tendency to assume that individual random events are influenced by previous random events. For example, "I've flipped heads with this coin five times consecutively, so the chance of tails coming out on the sixth flip is much greater than heads."
  • Hindsight bias — sometimes called the "I-knew-it-all-along" effect: the inclination to see past events as being predictable, based on knowledge of later events.
  • Hostile media effect — the tendency to perceive news coverage as biased against your position on an issue.
  • Illusory correlation — beliefs that inaccurately suppose a relationship between a certain type of action and an effect.
  • Ludic fallacy — the analysis of chance related problems with the narrow frame of games. Ignoring the complexity of reality, and the non-gaussian distribution of many things.
  • Neglect of prior base rates effect — the tendency to fail to incorporate prior known probabilities which are pertinent to the decision at hand.
  • Observer-expectancy effect — when a researcher expects a given result and therefore unconsciously manipulates an experiment or misinterprets data in order to find it (see also subject-expectancy effect).
  • Optimism bias — the systematic tendency to be over-optimistic about the outcome of planned actions. Found to be linked to the "left inferior frontal gyrus" section of the brain, and disrupting this section of the brain removes the bias.
  • Overconfidence effect — the tendency to overestimate one's own abilities.
  • Positive outcome bias — a tendency in prediction to overestimate the probability of good things happening to them (see also wishful thinking, optimism bias and valence effect).
  • Primacy effect — the tendency to weigh initial events more than subsequent events.
  • Recency effect — the tendency to weigh recent events more than earlier events (see also 'peak-end rule').
  • Reminiscence bump — the effect that people tend to recall more personal events from adolescence and early adulthood than from other lifetime periods.
  • Rosy retrospection — the tendency to rate past events more positively than they had actually rated them when the event occurred.
  • Subadditivity effect — the tendency to judge probability of the whole to be less than the probabilities of the parts.
  • Telescoping effect — the effect that recent events appear to have occurred more remotely and remote events appear to have occurred more recently.
  • Texas sharpshooter fallacy — the fallacy of selecting or adjusting a hypothesis after the data are collected, making it impossible to test the hypothesis fairly.

Social biases

Most of these biases are labeled as attributional biases.
  • Actor-observer bias — the tendency for explanations for other individual's behaviors to overemphasize the influence of their personality and underemphasize the influence of their situation. This is coupled with the opposite tendency for the self in that one's explanations for their own behaviors overemphasize their situation and underemphasize the influence of their personality. (see also fundamental attribution error).
  • Dunning-Kruger effect — "...when people are incompetent in the strategies they adopt to achieve success and satisfaction, they suffer a dual burden: Not only do they reach erroneous conclusions and make unfortunate choices, but their incompetence robs them of the ability to realize it. Instead, ...they are left with the mistaken impression that they are doing just fine."[6] (See also the Lake Wobegon effect, and overconfidence effect).
  • Egocentric bias — occurs when people claim more responsibility for themselves for the results of a joint action than an outside observer would.
  • Forer effect (aka Barnum Effect) — the tendency to give high accuracy ratings to descriptions of their personality that supposedly are tailored specifically for them, but are in fact vague and general enough to apply to a wide range of people. For example, horoscopes.
  • False consensus effect — the tendency for people to overestimate the degree to which others agree with them.
  • Fundamental attribution error — the tendency for people to over-emphasize personality-based explanations for behaviors observed in others while under-emphasizing the role and power of situational influences on the same behavior (see also actor-observer bias, group attribution error, positivity effect, and negativity effect).
  • Halo effect — the tendency for a person's positive or negative traits to "spill over" from one area of their personality to another in others' perceptions of them (see also physical attractiveness stereotype).
  • Herd instinct — a common tendency to adopt the opinions and follow the behaviors of the majority to feel safer and to avoid conflict.
  • Illusion of asymmetric insight — people perceive their knowledge of their peers to surpass their peers' knowledge of them.
  • Illusion of transparency — people overestimate others' ability to know them, and they also overestimate their ability to know others.
  • Ingroup bias — the tendency for people to give preferential treatment to others they perceive to be members of their own groups.
  • Just-world phenomenon — the tendency for people to believe that the world is "just" and therefore people "get what they deserve."
  • Lake Wobegon effect — the human tendency to report flattering beliefs about oneself and believe that one is above average (see also worse-than-average effect, and overconfidence effect).
  • Notational bias — a form of cultural bias in which a notation induces the appearance of a nonexistent natural law.
  • Outgroup homogeneity bias — individuals see members of their own group as being relatively more varied than members of other groups.
  • Projection bias — the tendency to unconsciously assume that others share the same or similar thoughts, beliefs, values, or positions.
  • Self-serving bias — the tendency to attribute successes to internal characteristics while blaming failures on outside forces. It may also manifest itself as a tendency for people to evaluate ambiguous information in a way beneficial to their interests (see also group-serving bias).
  • Modesty bias — The tendency to blame failures on oneself while attributing successes to situational factors. Opposite of self-serving bias.
  • Self-fulfilling prophecy — the tendency to engage in behaviors that elicit results which will (consciously or subconsciously) confirm our beliefs.
  • System justification — the tendency to defend and bolster the status quo, i.e. existing social, economic, and political arrangements tend to be preferred, and alternatives disparaged sometimes even at the expense of individual and collective self-interest.
  • Trait ascription bias — the tendency for people to view themselves as relatively variable in terms of personality, behavior and mood while viewing others as much more predictable.
  • Ultimate attribution error — A sub-type of the fundamental attribution error above, the ultimate attribution error occurs when negative behavior in one's own group is explained away as circumstantial, but negative behavior among outsiders is believed to be evidence of flaws in character.

Memory errors

  • Beneffectance — perceiving oneself as responsible for desirable outcomes but not responsible for undesirable ones. (Term coined by Greenwald (1980))
  • Consistency bias — incorrectly remembering one's past attitudes and behaviour as resembling present attitudes and behaviour.
  • Cryptomnesia — a form of misattribution where a memory is mistaken for imagination.
  • Egocentric bias — recalling the past in a self-serving manner, e.g. remembering one's exam grades as being better than they were, or remembering a caught fish as being bigger than it was
  • Confabulation or false memory — Remembering something that never actually happened.
  • Hindsight bias — filtering memory of past events through present knowledge, so that those events look more predictable than they actually were; also known as the 'I-knew-it-all-along effect'.
  • Selective Memory and selective reporting
  • Suggestibility — a form of misattribution where ideas suggested by a questioner are mistaken for memory. Often a key aspect of hypnotherapy.

Common theoretical causes of some cognitive biases

  • Attribution theory, especially:
    • Salience
  • Cognitive dissonance, and related:
    • Impression management
    • Self-perception theory
  • Heuristics, including:
    • Availability heuristic
    • Representativeness heuristic
  • Adaptive Bias



Logical fallacy

Formal fallacies

Main article: Formal fallacy
A formal fallacy is an error in logic that can be seen in the argument's form.[1] All formal fallacies are specific types of non sequiturs.
  • Anecdotal fallacy – using a personal experience or examples to extrapolate without a statistically significant number of cases that could form scientifically compelling evidence.
  • Appeal to probability – is a statement that takes something for granted because it would probably be the case (or might be the case).[2][3]
  • Argument from fallacy – also known as fallacy fallacy, assumes that if an argument for some conclusion is fallacious, then the conclusion is false.[4]
  • Base rate fallacy – making a probability judgment based on conditional probabilities, without taking into account the effect of prior probabilities.[5]
  • Conjunction fallacy – assumption that an outcome simultaneously satisfying multiple conditions is more probable than an outcome satisfying a single one of them.[6]
  • Masked-man fallacy (illicit substitution of identicals) – the substitution of identical designators in a true statement can lead to a false one.[7]

Propositional fallacies

A propositional fallacy is an error in logic that concerns compound propositions. For a compound proposition to be true, the truth values of its constituent parts must satisfy the relevant logical connectives that occur in it (most commonly: <and>, <or>, <not>, <only if>, <if and only if>). The following fallacies involve inferences whose correctness is not guaranteed by the behavior of those logical connectives, and hence, which are not logically guaranteed to yield true conclusions.
Types of propositional fallacies:

Quantification fallacies

A quantification fallacy is an error in logic where the quantifiers of the premises are in contradiction to the quantifier of the conclusion.
Types of Quantification fallacies:

Formal syllogistic fallacies

Syllogistic fallacies – logical fallacies that occur in syllogisms.

Informal fallacies

Main article: Informal fallacy
Informal fallacies – arguments that are fallacious for reasons other than structural (formal) flaws and usually require examination of the argument's content.[12]
  • Appeal to the stone (argumentum ad lapidem) – dismissing a claim as absurd without demonstrating proof for its absurdity.[13]
  • Argument from ignorance (appeal to ignorance, argumentum ad ignorantiam) – assuming that a claim is true because it has not been or cannot be proven false, or vice versa.[14]
  • Argument from incredulity (appeal to common sense) – "I cannot imagine how this could be true, therefore it must be false."[15]
  • Argument from repetition (argumentum ad nauseam, argumentum ad infinitum) – signifies that it has been discussed extensively until nobody cares to discuss it anymore;[16][17] sometimes confused with proof by assertion
  • Argument from silence (argumentum ex silentio) – where the conclusion is based on the absence of evidence, rather than the existence of evidence.[18][19]
  • Argument to moderation (false compromise, middle ground, fallacy of the mean, argumentum ad temperantiam) – assuming that the compromise between two positions is always correct.[20]
  • Argumentum verbosium – See Proof by verbosity, below.
  • Begging the question (petitio principii) – providing what is essentially the conclusion of the argument as a premise.[21][22][23][24]
  • Shifting the burden of proof (see onus probandi, below) – I need not prove my claim, you must prove it is false.
  • Circular reasoning (circulus in demonstrando) – when the reasoner begins with what he or she is trying to end up with; sometimes called assuming the conclusion.
  • Circular cause and consequence – where the consequence of the phenomenon is claimed to be its root cause.
  • Continuum fallacy (fallacy of the beard, line-drawing fallacy, sorites fallacy, fallacy of the heap, bald man fallacy) – improperly rejecting a claim for being imprecise.[25]
  • Correlative-based fallacies
  • Divine fallacy (argument from incredulity) – arguing that, because something is so incredible/amazing/incomprehensible, it must be the result of superior, divine, alien or paranormal agency.[28]
  • Double counting – counting events or occurrences more than once in probabilistic reasoning, which leads to the sum of the probabilities of all cases exceeding unity.
  • Equivocation – the misleading use of a term with more than one meaning (by glossing over which meaning is intended at a particular time).[29]
    • Ambiguous middle term – a common ambiguity in syllogisms in which the middle term is equivocated.[30]
    • Definitional retreat – changing the meaning of a word to deal with an objection raised against the original wording.[31]
  • Ecological fallacy – inferences about the nature of specific individuals are based solely upon aggregate statistics collected for the group to which those individuals belong.[32]
  • Etymological fallacy – which reasons that the original or historical meaning of a word or phrase is necessarily similar to its actual present-day usage.[33]
  • Fallacy of accent – a specific type of ambiguity that arises when the meaning of a sentence is changed by placing an unusual prosodic stress, or when, in a written passage, it's left unclear which word the emphasis was supposed to fall on.
  • Fallacy of composition – assuming that something true of part of a whole must also be true of the whole.[34]
  • Fallacy of division – assuming that something true of a thing must also be true of all or some of its parts.[35]
  • False attribution – an advocate appeals to an irrelevant, unqualified, unidentified, biased or fabricated source in support of an argument.
  • False authority (single authority) – using an expert of dubious credentials or using only one opinion to sell a product or idea. Related to the appeal to authority fallacy.
  • False dilemma (false dichotomy, fallacy of bifurcation, black-or-white fallacy) – two alternative statements are held to be the only possible options, when in reality there are more.[37]
  • False equivalence – describing a situation of logical and apparent equivalence, when in fact there is none.
  • Fallacy of many questions (complex question, fallacy of presupposition, loaded question, plurium interrogationum) – someone asks a question that presupposes something that has not been proven or accepted by all the people involved. This fallacy is often used rhetorically, so that the question limits direct replies to those that serve the questioner's agenda.
  • Fallacy of the single cause (causal oversimplification[38]) – it is assumed that there is one, simple cause of an outcome when in reality it may have been caused by a number of only jointly sufficient causes.
  • Furtive fallacy – outcomes are asserted to have been caused by the malfeasance of decision makers.
  • Gambler's fallacy – the incorrect belief that separate, independent events can affect the likelihood of another random event. If a fair coin lands on heads 10 times in a row, the belief that the next toss is "due" to land on tails is incorrect (see the short simulation after this list).[39]
  • Historian's fallacy – occurs when one assumes that decision makers of the past viewed events from the same perspective and had the same information as those subsequently analyzing the decision.[40] (Not to be confused with presentism, which is a mode of historical analysis in which present-day ideas, such as moral standards, are projected into the past.)
  • Historical fallacy – where a set of considerations holds good only because a completed process is read into the content of the process which conditions this completed result.[41]
  • Homunculus fallacy – where a "middle-man" is used for explanation, this sometimes leads to regressive middle-men. Explains without actually explaining the real nature of a function or a process. Instead, it explains the concept in terms of the concept itself, without first defining or explaining the original concept. Explaining thought as something produced by a little thinker, a sort of homunculus inside the head, merely explains it as another kind of thinking (as different but the same).[42]
  • Inflation of conflict – The experts of a field of knowledge disagree on a certain point, so the scholars must know nothing, and therefore the legitimacy of their entire field is put to question.[43]
  • If-by-whiskey – an argument that supports both sides of an issue by using terms that are selectively emotionally sensitive.
  • Incomplete comparison – in which insufficient information is provided to make a complete comparison.
  • Inconsistent comparison – where different methods of comparison are used, leaving one with a false impression of the whole comparison.
  • Intentionality fallacy – the insistence that the ultimate meaning of an expression must be consistent with the intention of the person from whom the communication originated (e.g. a work of fiction that is widely received as a blatant allegory must necessarily not be regarded as such if the author intended it not to be so.)[44]
  • Ignoratio elenchi (irrelevant conclusion, missing the point) – an argument that may in itself be valid, but does not address the issue in question.[45]
  • Kettle logic – using multiple, jointly inconsistent arguments to defend a position.
  • Ludic fallacy – the belief that the outcomes of non-regulated random occurrences can be encapsulated by a statistic; a failure to take into account unknown unknowns in determining the probability of events taking place.[46]
  • McNamara fallacy (quantitative fallacy) – making a decision based only on quantitative observations, discounting all other considerations.
  • Moral high ground fallacy – in which one assumes a "holier-than-thou" attitude in an attempt to make oneself look good to win an argument.
  • Moralistic fallacy – inferring factual conclusions from purely evaluative premises, in violation of the fact–value distinction. For instance, inferring "is" from "ought" is an instance of the moralistic fallacy. The moralistic fallacy is the inverse of the naturalistic fallacy defined below.
  • Moving the goalposts (raising the bar) – argument in which evidence presented in response to a specific claim is dismissed and some other (often greater) evidence is demanded.
  • Naturalistic fallacy – inferring evaluative conclusions from purely factual premises,[47] in violation of the fact–value distinction. For instance, inferring "ought" from "is" (sometimes referred to as the is–ought fallacy) is an instance of the naturalistic fallacy. The naturalistic fallacy in the stricter sense defined in the section "Conditional or questionable fallacies" below is also an instance of it. The naturalistic fallacy is the inverse of the moralistic fallacy.
  • Naturalistic fallacy fallacy[48] (anti-naturalistic fallacy[49]) – inferring that no instance of "ought" can ever be derived from "is", on the strength of the general invalidity of the is–ought inference mentioned above. For instance, "is P ∨ ¬P" does imply "ought P ∨ ¬P" for any proposition P, although the naturalistic fallacy fallacy would falsely declare such an inference invalid. The naturalistic fallacy fallacy is an instance of the argument from fallacy.
  • Nirvana fallacy (perfect solution fallacy) – when solutions to problems are rejected because they are not perfect.
  • Onus probandi – from the Latin "onus probandi incumbit ei qui dicit, non ei qui negat": the burden of proof is on the person who makes the claim, not on the person who denies (or questions) the claim. It is a particular case of the argumentum ad ignorantiam fallacy; here the burden is shifted onto the person defending against the assertion.
  • Petitio principii – see begging the question.
  • Post hoc ergo propter hoc Latin for "after this, therefore because of this" (faulty cause/effect, coincidental correlation, correlation without causation) – X happened, then Y happened; therefore X caused Y. The Loch Ness Monster has been seen in this loch. Something tipped our boat over; it's obviously the Loch Ness Monster.[50]
  • Proof by assertion – a proposition is repeatedly restated regardless of contradiction; sometimes confused with argument from repetition (argumentum ad infinitum, argumentum ad nauseam)
  • Proof by verbosity (argumentum verbosium, proof by intimidation) – submission of others to an argument too complex and verbose to reasonably deal with in all its intimate details. (See also Gish Gallop and argument from authority.)
  • Prosecutor's fallacy – a low probability of false matches does not mean a low probability of some false match being found.
  • Proving too much – using a form of argument that, if it were valid, could be used to reach an additional, undesirable conclusion.
  • Psychologist's fallacy – an observer presupposes the objectivity of his own perspective when analyzing a behavioral event.
  • Red herring – a speaker attempts to distract an audience by deviating from the topic at hand by introducing a separate argument the speaker believes is easier to speak to.[51]
  • Referential fallacy[52] – assuming all words refer to existing things and that the meaning of words resides within the things they refer to, as opposed to words possibly referring to no real object, or their meaning often coming from how we use them.
  • Regression fallacy – ascribes cause where none exists. The flaw is failing to account for natural fluctuations. It is frequently a special kind of the post hoc fallacy.
  • Reification (concretism, hypostatization, or the fallacy of misplaced concreteness) – a fallacy of ambiguity, when an abstraction (abstract belief or hypothetical construct) is treated as if it were a concrete, real event or physical entity. In other words, it is the error of treating as a "real thing" something that is not a real thing, but merely an idea.
  • Retrospective determinism – the argument that because an event has occurred under some circumstance, the circumstance must have made its occurrence inevitable.
  • Shotgun argumentation – the arguer offers such a large number of arguments for a position that the opponent can't possibly respond to all of them. (See "Proof by verbosity" and "Gish Gallop", above.)
  • Special pleading – where a proponent of a position attempts to cite something as an exemption to a generally accepted rule or principle without justifying the exemption.
  • Wrong direction – cause and effect are reversed. The cause is said to be the effect and vice versa.[53]
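The gambler's fallacy entry above lends itself to a quick check. The sketch below (Python; the coin bias, streak length and sample size are arbitrary assumptions, not anything from the source) estimates the chance of heads on the toss that immediately follows a run of heads; the run makes no difference, which is the point of the entry.

# Minimal sketch: a streak of heads does not make tails "due" (illustrative parameters only).
import random

def p_heads_after_streak(streak_len=5, tosses=1_000_000, seed=0):
    rng = random.Random(seed)
    flips = [rng.random() < 0.5 for _ in range(tosses)]   # True = heads, fair coin
    followers = [flips[i] for i in range(streak_len, tosses)
                 if all(flips[i - streak_len:i])]          # tosses preceded by a full run of heads
    return sum(followers) / len(followers)

print(round(p_heads_after_streak(), 3))  # ~0.5

With a fair, independent coin the conditional frequency stays near one half, however long the preceding streak.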

Faulty generalizations

Faulty generalizations – reach a conclusion from weak premises. Unlike fallacies of relevance, in fallacies of defective induction, the premises are related to the conclusions yet only weakly buttress the conclusions. A faulty generalization is thus produced.
  • Accident – an exception to a generalization is ignored.[54]
    • No true Scotsman – makes a generalization true by changing the generalization to exclude a counterexample.[55]
  • Cherry picking (suppressed evidence, incomplete evidence) – act of pointing at individual cases or data that seem to confirm a particular position, while ignoring a significant portion of related cases or data that may contradict that position.[56]
    • Survivorship bias – when a small number of survivors of a given process are actively promoted while completely ignoring a large number of failures
  • False analogy – an argument by analogy in which the analogy is poorly suited.[57]
  • Hasty generalization (fallacy of insufficient statistics, fallacy of insufficient sample, fallacy of the lonely fact, leaping to a conclusion, hasty induction, secundum quid, converse accident) – basing a broad conclusion on a small sample.[58]
  • Inductive fallacy – a more general name for a class of fallacies, such as hasty generalization. It occurs when a conclusion is drawn from premises that only weakly support it.
  • Misleading vividness – involves describing an occurrence in vivid detail, even if it is an exceptional occurrence, to convince someone that it is a problem.
  • Overwhelming exception – an accurate generalization that comes with qualifications that eliminate so many cases that what remains is much less impressive than the initial statement might have led one to assume.[59]
  • Thought-terminating cliché – a commonly used phrase, sometimes passing as folk wisdom, used to quell cognitive dissonance, conceal lack of thought-entertainment, move on to other topics etc. but in any case, end the debate with a cliché—not a point.

Red herring fallacies

A red herring fallacy, one of the main subtypes of fallacies of relevance, is an error in logic where a proposition is, or is intended to be, misleading in order to make irrelevant or false inferences. In the general case, it is any logical inference based on fake arguments, intended to make up for the lack of real arguments or to implicitly replace the subject of the discussion.[60][61][62]
Red herring – argument given in response to another argument, which is irrelevant and draws attention away from the subject of argument. See also irrelevant conclusion.
  • Ad hominem – attacking the arguer instead of the argument.
    • Poisoning the well – a subtype of ad hominem presenting adverse information about a target person with the intention of discrediting everything that the target person says.[63]
    • Abusive fallacy – a subtype of ad hominem that verbally abuses the opponent rather than arguing about the originally proposed argument.[64]
    • Appeal to motive – a subtype of ad hominem that dismisses an idea by questioning the motives of its proposer.
    • Tone policing – a subtype of ad hominem focusing on emotion behind a message rather than the message itself as a discrediting tactic.
    • Traitorous critic fallacy (ergo decedo) – a subtype of ad hominem where a critic's perceived affiliation is seen as the underlying reason for the criticism and the critic is asked to stay away from the issue altogether.
  • Appeal to authority (argumentum ab auctoritate) – where an assertion is deemed true because of the position or authority of the person asserting it.[65][66]
  • Appeal to consequences (argumentum ad consequentiam) – the conclusion is supported by a premise that asserts positive or negative consequences from some course of action in an attempt to distract from the initial discussion.[68]
  • Appeal to emotion – where an argument is made due to the manipulation of emotions, rather than the use of valid reasoning.[69]
    • Appeal to fear – a specific type of appeal to emotion where an argument is made by increasing fear and prejudice towards the opposing side[70][71]
    • Appeal to flattery – a specific type of appeal to emotion where an argument is made due to the use of flattery to gather support.[72]
    • Appeal to pity (argumentum ad misericordiam) – an argument attempts to induce pity to sway opponents.[73]
    • Appeal to ridicule – an argument is made by presenting the opponent's argument in a way that makes it appear ridiculous.[74][75]
    • Appeal to spite – a specific type of appeal to emotion where an argument is made through exploiting people's bitterness or spite towards an opposing party.[76]
    • Wishful thinking – a specific type of appeal to emotion where a decision is made according to what might be pleasing to imagine, rather than according to evidence or reason.[77]
  • Appeal to nature – wherein judgment is based solely on whether the subject of judgment is 'natural' or 'unnatural'.[78] (Sometimes also called the "naturalistic fallacy", but is not to be confused with the other fallacies by that name)
  • Appeal to novelty (argumentum novitatis, argumentum ad antiquitatis) – where a proposal is claimed to be superior or better solely because it is new or modern.[79]
  • Appeal to poverty (argumentum ad Lazarum) – supporting a conclusion because the arguer is poor (or refuting because the arguer is wealthy). (Opposite of appeal to wealth.)[80]
  • Appeal to tradition (argumentum ad antiquitatem) – a conclusion supported solely because it has long been held to be true.[81]
  • Appeal to wealth (argumentum ad crumenam) – supporting a conclusion because the arguer is wealthy (or refuting because the arguer is poor).[82] (Sometimes taken together with the appeal to poverty as a general appeal to the arguer's financial situation.)
  • Argumentum ad baculum (appeal to the stick, appeal to force, appeal to threat) – an argument made through coercion or threats of force to support a position.[83]
  • Argumentum ad populum (appeal to widespread belief, bandwagon argument, appeal to the majority, appeal to the people) – where a proposition is claimed to be true or good solely because many people believe it to be so.[84]
  • Association fallacy (guilt by association and honor by association) – arguing that because two things share (or are implied to share) some property, they are the same.[85]
  • Bulverism (psychogenetic fallacy) – inferring why an argument is being used, associating it to some psychological reason, then assuming it is invalid as a result. It is wrong to assume that if the origin of an idea comes from a biased mind, then the idea itself must also be a falsehood.[43]
  • Chronological snobbery – where a thesis is deemed incorrect because it was commonly held when something else, clearly false, was also commonly held.[86][87]
  • Fallacy of relative privation ("not as bad as") – dismissing an argument or complaint due to the existence of more important problems in the world, regardless of whether those problems bear relevance to the initial argument. For example, First World problem.
  • Genetic fallacy – where a conclusion is suggested based solely on something or someone's origin rather than its current meaning or context.[88]
  • Judgmental language – insulting or pejorative language to influence the recipient's judgment.
  • Naturalistic fallacy (is–ought fallacy,[89] naturalistic fallacy[90]) – claims about what ought to be on the basis of statements about what is.
  • Pooh-pooh – dismissing an argument perceived unworthy of serious consideration.[91]
  • Straw man fallacy – an argument based on misrepresentation of an opponent's position.[92]
  • Texas sharpshooter fallacy – improperly asserting a cause to explain a cluster of data.[93]
  • Tu quoque ("you too", appeal to hypocrisy, I'm rubber and you're glue) – the argument states that a certain position is false or wrong or should be disregarded because its proponent fails to act consistently in accordance with that position.[94]
  • Two wrongs make a right – occurs when it is assumed that if one wrong is committed, an "equal but opposite" wrong will cancel it out.[95]
  • Vacuous truth – A claim that is technically true but meaningless, in the form of claiming that no A in B has C, when there are no A in B. For example, claiming that no mobile phones in the room are on when there are no mobile phones in the room at all.
  • Appeal to self-evident truth - A claim that a proposition is self-evidently true, so needs no further supporting evidence. If self-evidence is actually the basis for the claim, it is arbitrary and the opposite (a contradictory or contrary statement) is equally true. In many cases, however, the basis is really some kind of unstated and unexamined observation or assumption.

Conditional or questionable fallacies

  • Broken window fallacy – an argument that disregards lost opportunity costs (typically non-obvious, difficult to determine or otherwise hidden) associated with destroying property of others, or other ways of externalizing costs onto others. For example, an argument that states breaking a window generates income for a window fitter, but disregards the fact that the money spent on the new window cannot now be spent on new shoes.
  • Definist fallacy – involves the confusion between two notions by defining one in terms of the other.[96]
  • Naturalistic fallacy – attempts to prove a claim about ethics by appealing to a definition of the term "good" in terms of either one or more claims about natural properties (sometimes also taken to mean the appeal to nature) or God's will.[78]
  • Slippery slope (thin edge of the wedge, camel's nose) – asserting that a relatively small first step inevitably leads to a chain of related events culminating in some significant impact/event that should not happen, thus the first step should not happen. It is, in its essence, an appeal to probability fallacy. (e.g. if person x does y then z would [probably] occur, leading to q, leading to w, leading to e.)[97] This is also related to the reductio ad absurdum.

See also








https://en.wikipedia.org/wiki/Emotional_bias



Decision making

Emotions have a small to large impact on the decisions we make depending on the type of emotion.[2] Some of the most influential emotions for decision-making are sadness, disgust, and guilt.[2] Anger differs the most from fear and sadness in both judgment and decision-making contexts.[2] Fear is associated with uncertainty, while sadness is associated with a perception that outcomes are due to the situation.[2] Angry decision-makers tend to make choices quickly and are unlikely to analyze their decisions.[3] Stress can play a role in decision-making. Acute stress can alter the response to moral dilemmas.[4] On the other hand, stress does not always alter everyday, moral decision-making.[5] One study looked at the role emotions play in adolescents' moral decision-making. In a hypothetical, prosocial behavioral context, positively charged self-evaluative emotions most strongly predict moral choice.[6] In anti-social behaviors, negatively charged, critical emotions most strongly predict moral choice.[6] Regret and disappointment are emotions experienced after a decision. In some cases, regret has created a stronger desire to switch choices than disappointment.[7]
Emotions affect different types of decisions. Emotions have a strong influence on economic behavior and decision-making.[8] In some behavioral anomalies, certain emotions related to some tasks can have an increased impact.[9] In one experiment, researchers looked at what emotions manifest the disposition effect, where individuals sell winning shares and hold losing ones.[9] They found that elation for winners and regret for losers are necessary emotions that can cause the effect to occur.[9] With regard to patients making a medical decision, emotions and one's motivational goals play a part as well.[10] One study looked at the elements of coping behaviors.[10] The first two elements have to do with the need to control the cognitive and emotional elements of the health threat; the second pair of elements relate to the management of cognitive and emotional aspects of the decision itself.[10]
Brain damage can cause changes in normal decision-making processes. The amygdala is an area in the brain involved in emotion. Studies have found that patients with bilateral amygdala damage, which is damage in both hemispheres of the amygdala region in the brain, are deficient in decision-making.[11] When an initial choice is made in decision-making, the result of this choice has an emotional response, which is controlled by the amygdala.[11]


Types

There are several methods of grouping fallacies. RationalWiki categorizes fallacies into four groups:
  1. Formal fallacy: An argument in which the conclusion is not guaranteed to be true even when its premises are correct, because it does not follow a valid logical structure.
  2. Informal fallacy: An argument in which the conclusion would be true if the premises were true, but those premises are almost always incorrect.
  3. Conditional fallacy: An argument which may or may not be fallacious, conditional on whether or not one of its premises is true. For example, an argument from authority can be fallacious when the authority isn't authoritative, but can be valid when there's reason to trust the source.
  4. Fallacious argument style: An argument in which one speaker uses unfair, manipulative, or disruptive tactics to prevent actual discussion of the issue.

Formal

See the main article on this topic: Formal fallacy
  1. Affirmative conclusion from a negative premise: Asserting some positive fact from negative premises.
  2. Affirming a disjunct: A or B, A, therefore not-B.
  3. Affirming the consequent: If A implies B, and B, therefore A. Compare the valid affirming the antecedent (modus ponens): from A -> B and A, infer B. A short truth-table sketch follows this list.
  4. Chewbacca Defense: A logically invalid Gish Gallop intended to confuse.
  5. Confusion of the inverse: Confusing the probability of a set of data given a hypothesis, and the probability of a hypothesis given a set of data.
  6. Denying the antecedent: If A implies B, and not-A, therefore not-B.
  7. Enthymeme: When an unstated premise is necessary for logical validity.
  8. Existential assumption: When the conclusion of a syllogism requires that a class has at least one member, but one or more of the premises do not.
  9. Fallacy fallacy: The meta-fallacious argument that your opponent (or someone arguing for a similar point) has used logical fallacies, therefore the argument is wrong.
  10. False dilemma: When two opposing views are presented as the only options, but are not.
  11. Four-term fallacy: Any syllogism in which four terms are present, instead of the mandatory three.
  12. Illicit process: Incorrectly concluding for all of a set when the premises apply to only some of a set. Specifically, the illicit major and illicit minor.
  13. Negative conclusion from affirmative premises: Asserting some negative fact from positive premises.
  14. Negative proof: Arguing that something must exist because there is no evidence it does not exist.
  15. Not even wrong: An answer that is utterly unrelated to the question.
  16. Self-refuting idea: A claim that on closer inspection disagrees with itself.
  17. Substituting explanation for premise: Asserting that, because a given explanation for why something occurs is bad, the thing does not occur.
  18. Syllogistic fallacy: Any instance in which a syllogism with incorrect structure is used.
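Since several of the formal fallacies in this list (affirming the consequent, affirming a disjunct, denying the antecedent) are claims about just two propositions, a brute-force truth-table check is enough to expose them. The sketch below is a small illustration in Python, not part of the original list: it searches the four assignments of A and B for cases where every premise is true but the conclusion is false.

# Minimal truth-table sketch: list counterexamples to a two-variable argument form.
from itertools import product

def implies(p, q):
    return (not p) or q

def counterexamples(premises, conclusion):
    """Assignments (A, B) where all premises hold but the conclusion fails."""
    return [(a, b) for a, b in product([True, False], repeat=2)
            if all(p(a, b) for p in premises) and not conclusion(a, b)]

# Affirming the consequent: A -> B, B, therefore A  -> one counterexample (A=False, B=True)
print(counterexamples([lambda a, b: implies(a, b), lambda a, b: b], lambda a, b: a))
# Modus ponens (affirming the antecedent): A -> B, A, therefore B  -> no counterexamples
print(counterexamples([lambda a, b: implies(a, b), lambda a, b: a], lambda a, b: b))

An empty list means the form is valid; any non-empty list is a concrete counterexample, which is what makes the fallacious forms fallacious.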

Informal

See the main article on this topic: Informal fallacy
  1. General:
    1. Appeal to ancient wisdom: It's right because the Maya/Chinese/Hebrews said it thousands of years ago!
    2. Appeal to tradition: Because it's always been that way, it's absolutely the right way!
    3. Balance fallacy — Giving equal weighting to both sides of an argument, even if one really doesn't deserve the time.
    4. My enemy's enemy — Supporting someone because you share a mutual enemy with them.
    5. Reductio ad absurdum — Following a chain of thought to its absurd or contradictory conclusion. Sometimes worthwhile, but oftentimes fallacious.
    6. Appeal to novelty — Arguing that a claim is valid because it is novel.
    7. Appeal to nature — Arguing that something is good because it is "natural" (see also Moralistic fallacy).
  2. Ad hoc: When some idea is asserted purely to shore up some other idea.
    1. Argumentum ex culo: When some fact is cited to defend something, but is entirely fictional.
    2. Escape hatch: When some rhetorical technique is used to evade the burden of proof.
    3. Handwave: When some fact that defeats an argument is ignored.
    4. Moving the goalposts: When truth is redefined on the spot.
    5. Nirvana fallacy: Claiming that a realistic solution is useless because it is not as good as an idealized perfect solution.
    6. No True Scotsman: When groups are redefined on the spot.
    7. Slothful induction: Ignoring the strongest conclusion of an inductive argument to focus on a weaker one.
    8. Special pleading: When universal rules no longer apply in this specific instance.
  3. Ad hominem: When an arguer is attacked, rather than their idea.
    1. Ad iram: When an arguer's supposed anger is used to disprove their ideas.
    2. Appeal to bias: When an arguer's supposed mental bias is used to disprove their ideas.
      1. Shill gambit: Asserting an arguer is working for someone and spreading disinformation.
    3. Argumentum ad cellarium: Accusing the arguer of still being in "mom's basement".
    4. Association fallacy: When someone's associations are used as evidence against their ideas.
    5. Blaming the victim: When a victim's actions are used as proof that some offense against them was justified.
    6. Damning with faint praise: When someone is attacked through praise of an achievement that isn't praiseworthy or isn't significantly praiseworthy, suggesting that no achievements worthy of praise exist.
    7. Demonization: When one's opponent is depicted as unequivocally terrible.
      1. Poisoning the well: Where an opponent is pre-painted as terrible.
      2. Dixiecrat fallacy: Dems supported segregation! Dems are racist!
    8. Fallacy of opposition: When someone's opposition to your opinion is taken as proof of their incorrectness.
      1. Bulverism: Arguing about how someone got such bad ideas, rather than that the ideas are bad.
      2. Jonanism: Considering all people who disagree with you as the same person (or following the same ideas).
    9. Genetic fallacy: When the source of an argument is attacked, rather than its merits.
    10. Nutpicking: When a few extremists from a group are taken as representative of the group.
    11. Tu quoque: Where a criticism is falsely dismissed because its author is also guilty of the charge.
      1. Whataboutism: Tu quoque, USSR style!
    12. Scapegoat: Using someone to take the blame.
  4. Argument from ignorance: When it is claimed that the truth of a premise is based on the fact that it has not been proven false or proven true.
    1. Argument from incredulity: Literally "that's unbelievable = that's obviously not real". This kind of thinking would quickly put an end to virtually all quantum physics.
    2. Argument from silence: The lack of response to my point makes my point correct!
      1. Argument by censorship: I have created silence; this shows that my point cannot be responded to!
    3. Science doesn't know everything: And therefore it knows nothing.
    4. Science was wrong before: And therefore it can never be right.
    5. Toupee fallacy: I have never seen a toupee that I could not recognize as a toupee.
    6. Willful ignorance: *fingers in ears* nah-nah-nah-nah-nah, can't heeeeaaaarrrr youuuu!
  5. Causalation: When something is not truly the cause of an effect.
    1. Confusing correlation for causation — The number of pirates on the seas has gone down; this correlates with global temperatures rising. So, do pirates cause global cooling?
    2. Gambler's fallacy: I lost the last twenty dice rolls -- I'm due for a win, so I better double down!
    3. Post hoc, ergo propter hoc: Because event A happened before B, A must have caused B.
    4. Wrong direction: When a cause is mistakenly considered an effect.
  6. Emotional appeal: Evaluating an argument based on its emotional consequences rather than logical ones.
    1. Appeal to confidence: Trust me, I know what I'm doing.
    2. Argument from consequences: Arguing against a point's truth based on expected negative outcome.
      1. Argumentum ad baculum: A subset of arguing from adverse consequences, the negative outcomes are being actualised by the one making the argument.
    3. Appeal to fear: We're surrounded by logical fallacies! RUN!!
    4. Appeal to flattery: What a lovely fallacy you have there! You must be a smart person, someone who'd find quantum healing quite fascinating.
    5. Appeal to gravity: I'm the only one up here who takes this seriously. Disregard these jokers -- I have the truth.
    6. Appeal to money: I'm telling the truth. How else could I have gotten all this money?
    7. Appeal to shame: Would you say that in front of your mother?
    8. Argumentum ad fastidium: Ugh, that's so gross -- it must be false.
    9. Deepity: something something children of the stars, something something love is more than just chemicals
    10. Tone argument: If you can't keep it civil, you clearly can't make truthful statements!
  7. Circular reasoning: Ironclad and impermeable self-supporting logic.
    1. Argumentum ad dictionarium: If the dictionary says what I think something means, the dictionary is right. If not, find a new dictionary.
    2. Argument by assertion: If you say something enough times, it eventually becomes true and therefore you win the argument.
      1. Argumentum ad nauseam: If you say something often enough to make people vomit, you win.
      2. Infinite regress: If I say X, and then say that X proves that X is true, then I win!
    3. Appeal to faith
    4. Self-refuting idea: If I need to explain why my theory is wrong, I'll just introduce a new theory that entirely cancels out my old theory.
  8. Fallacy of ambiguity: Purposefully making something unclear, to allow its misuse.
    1. Etymology: Confusing the original meaning of a word and its current meaning.
    2. Equivocation: Deliberately substituting the meaning of a given word in one context for another context that is inappropriate in order to make your argument.
    3. Fallacy of accent: When the meaning of a text is changed by what word or words are stressed, and stress is unclear.
    4. Fallacy of amphiboly: When a sentence, because of its grammar, structure, or punctuation, can be interpreted in multiple ways.
    5. Loaded language — Asking a question, which has an assumption built into it, so that it can't be answered without appearing to agree to the assumption, or else appearing evasive by questioning the assumption. To be distinguished from a Leading question, which is not a fallacy, but is a way of suggesting the desired answer by how the question is phrased. "Are you still beating your wife?" is a loaded question, for it assumes that at one time you did beat your wife; while "You weren't beating your wife, were you?" is a leading question, for it suggests the simple answer no.
    6. Masked man fallacy: When Leibniz's law is used improperly.
    7. Moral equivalence: Arguing that two things are morally equal, even though they are different things.
      1. Not as bad as: A moral equivalence fallacy that says because B is worse than A, A is justified. Also known as the 'not as bad as' fallacy.
    8. Phantom distinction: When someone spends time arguing for the superiority of one term over another (rather than the intended debate), yet there is no effective difference.
    9. Quote mining: Misquoting someone to gain the appearance of authority.
    10. Scope fallacy: When the scope of a logical operator (e.g., "not", "some", or "all") is left vague, allowing misinterpretation and incorrect conclusions.
    11. Style over substance fallacy: Using language or rhetoric (ethos or pathos) to enhance the appeal of an argument, but not its validity, or arguing the method of presentation affects the truth of a claim.
    12. Suppressed correlative: Attempting to redefine two mutually exclusive options so that one encompasses the other.
    13. Wronger than wrong: The fallacy of assuming that different degrees of "wrong" are the same.

Conditional

See the main article on this topic: Conditional fallacy
  1. Appeal to authority: incorrectly asserts that some authority's assertion proves a point. It has several subfallacies:
    1. Anonymous authority: When a source is quoted (or supposedly quoted), but no name is given.
    2. Appeal to celebrity: When a source is supposedly authoritative because of the respect people give them.
    3. Appeal to confidence: When a source is supposedly authoritative because of their confidence.
    4. Argumentum ad populum: When a source is supposedly authoritative because of their popularity.
      1. Silent Majority: When a source is supposedly authoritative because of the popularity of their views, yet there is no evidence of the popularity of their views.
    5. Invincible authority: When a source is the entirety of an argument.
    6. Ipse dixit: When a source is the person making the argument.
    7. Linking to authority: When a source is "cited" in-text yet the reference doesn't exist / is irrelevant / says something else.
    8. Ultracrepidarianism: When a source is quoted outside their expertise, as if expertise in one field extended to another.
      1. Professor of nothing: When a source is introduced as "Prof." or "Dr.", yet they aren't.
    9. Quote-mined authority: When an authority is selectively quoted to distort their views.
  2. Slippery slope: But if we legalize rabbit hunting, then pretty soon we may legalize human hunting!
    1. Continuum fallacy: Asserting that a continuum of possibilities between two distinct states renders the states identical.
  3. Special pleading: Your evidence might disprove my example, but my example is special.
  4. What's the harm: It's just some water (and your payment of $50); what harm can it do?
  5. Imprecision fallacy: When data is specific to something that doesn't apply to what is claimed.
    1. Anecdotal evidence: Using anecdotal evidence to make a general point.
      1. Generalization from fictional evidence: Using a fake story to make a general point.
    2. Apex fallacy: Using the best/worst group to generalize to the whole group.
    3. Argument from analogy: Using an ill-fitting analogy to generalize a group.
      1. Extended analogy: Treating an opponent's analogy that A is like B in one particular way as a claim that A and B are directly comparable to one another.
    4. Category mistake: Confusing what is true of a part with what is true of the whole.
    5. Cherry picking: Using examples that support your viewpoint.
    6. Nutpicking: Using examples that are batshit insane to represent a group.
    7. Overgeneralization: Taking a few specifics and making a general rule out of them, without the few specifics adequately representing the entire group.
    8. Overprecision: Assuming a prediction is exactly correct for any given point.
    9. Pragmatic fallacy: It worked for me -- it'll work for everyone!
    10. Selection bias: Bias inherent in selecting data.
    11. Spotlight fallacy: Assuming aspects of a group from aspects from a smaller observed part of the group.
    12. Texas sharpshooter fallacy: A data mining fallacy and pattern recognition error where the arguer makes an ad hoc conclusion from a set of unrelated data without looking for corroborating data.

Argumentative

  1. Galileo gambit — If someone is going against the tide of popular thinking, they must be right because the likes of Galileo were right, while in reality, Galileo was right because he had evidence (see also Argumentum ad martyrdom).
  2. Intuition pump — Deliberately abusing a thought experiment to prove a fallacious point.
  3. One single proof — Dismissing all circumstantial evidence in favor of a single "smoking gun" that may not (and may not need to) exist.
  4. Red herring — A group of fallacies which bring up a fact which is irrelevant to the issue, in an attempt to distract the opponent and/or audience.
  5. Straw man — Distorting an opponent's position for greater rhetorical flexibility.
  6. Argumentum ad Hitlerum — Saying something is bad because Hitler did it.

Fallacy collections

There are lots of fallacy collections on the Web. Some of them promote a particular agenda, but most fallacies listed in them are real and present in everyday arguments. Unfortunately, many of the collections are deprecated.
Here is a list of websites, ordered roughly by usefulness:
  1. Fallacy Files
    1. Taxonomy of Logical Fallacies
    2. Glossary
    3. What is a logical fallacy?
  2. Your Logical Fallacy Is
  3. International Encyclopedia of Philosophy
  4. Secular Web
  5. Nizkor Project
  6. Skeptic's Dictionary
  7. About.com: Agnosticism/Atheism
  8. Arthur Schopenhauer
  9. Stephen's Guide to the Logical Fallacies
  10. Dr. Michael LaBossiere
  11. Free Dictionary
  12. Bruce Thompson
  13. Don Lindsay
  14. Art of Debate
  15. George Boeree
  16. Philosophy in Action
  17. Daniel Kies
  18. L. Van Warren
  19. Agent Orange
  20. Humanist Discussion Group
Deprecated ones, listed ad hoc:
  1. Sinclair Community College
  2. Global Tester
  3. Anti-Mormon Illogic
  4. Objectivism
  5. Evolution_V_Creation forums
  6. Peter A. Angeles
  7. Sine Wave
  8. Carleton University
  9. P5
  10. Mathenomicon
  11. Vanessa Hall
  12. J. P. Craig
  13. Informal Fallacies
  14. Autonomist
  15. Gordon, Hanks, & Zhu
  16. Freemasonry
  17. Taking Sides
  18. Jeff Richardson
  19. Chisnell.com

See also

For those of you in the mood, RationalWiki has a fun article about Justification generator.


https://en.wikipedia.org/wiki/Fallacy


Articles:




Milgram

Conformity



From Wikipedia, the free encyclopedia


Conformity is the act of matching attitudes, beliefs, and behaviors to group norms.[1] Norms are implicit, unsaid rules, shared by a group of individuals, that guide their interactions with others. This tendency to conform occurs in small groups and/or society as a whole, and may result from subtle unconscious influences, or direct and overt social pressure. Conformity can occur in the presence of others, or when an individual is alone. For example, people tend to follow social norms when eating or watching television, even when alone.





Peer pressure (or social pressure) is influence a peer group, observers, or an individual exerts that encourages others to change their attitudes, values, or behaviors to conform to those of the influencing group or individual. Social groups affected include both membership groups, in which individuals are "formally" members (such as political parties and trade unions), and cliques, in which membership is not clearly defined. However, a person does not need to be a member or be seeking membership of a group to be affected by peer pressure. One may also recognize dissociative groups, with which one wishes to avoid associating, and thus behave counter to that group's norms.







The concentration-camp sketch on "Eretz Nehederet": a wave of complaints to the Second Authority










Power (sociology)





Power, in the sociological sense, is the ability to cause or compel individuals and groups to act in a certain way, or to refrain from acting in another way, usually by imposing various sanctions or by threatening to impose them. Power can be expressed as physical force or, by contrast, as political power, which is expressed in threats of economic, social and similar sanctions. Power is a very important explanatory factor in every sociological theory, and with its help one can explain, for example, how norms and values find expression in behavior.




abc news Primetime Milgram








Asch Conformity Experiment





https://www.amazon.com/Crimes-Obedience-Psychology-Authority-Responsibility/dp/0300048130/159-9554110-1957546?ie=UTF8&*Version*=1&*entries*=0








Psychological harassment

Marie-France Hirigoyen





Violence at work: when your boss makes you sick



http://www.haaretz.co.il/hasite/pages/ShArt.jhtml?contrassID=1&subContrassID=5&sbSubContrassID=0&itemNo=701405

The abuse that is hidden from view





https://he.wikipedia.org/wiki/%D7%AA%D7%90%D7%95%D7%A8%D7%99%D7%99%D7%AA_%D7%94%D7%91%D7%97%D7%99%D7%A8%D7%94_%D7%94%D7%A6%D7%99%D7%91%D7%95%D7%A8%D7%99%D7%AA
Public choice theory is a branch of economics that studies the decision-making processes of voters, politicians and public officials, from the perspective of economic theory.





Lord of the Flies





Milgram





Zimbardo




http://www.cognetica.co.il/symptoms-of-ptsd

Symptoms of post-traumatic stress





http://www.bechar.co.il/ptsd.html

Symptoms of post-traumatic stress




Post-traumatic stress syndrome: what is post-trauma?
http://www.natal.org.il/?CategoryID=234









https://www.youtube.com/watch?v=6-nMA8qqhKQ&feature=related

This is how a confession to his sister's murder is extracted from a child





http://www.hebpsy.net/store.asp?scr=product&typ=738
Words Kill – Nini Gotesfeld-Manoach



http://www.mouse.co.il/CM.television_articles_item,789,209,59813,.aspx

The concentration-camp sketch on "Eretz Nehederet": a wave of complaints to the Second Authority









http://abcnews.go.com/Primetime/story?id=2765416&page=2

Basic Instincts: The Science of Evil





Milgram – the film
http://www.imdb.com/title/tt1466523/






http://www.ifeel.co.il/%D7%93%D7%99%D7%9B%D7%90%D7%95%D7%9F-%D7%A7%D7%9C%D7%99%D7%A0%D7%99/

Clinical depression





http://home.netcom.com/~workfam1/unders1.htm
WORK ABUSE:
HOW AND WHY IT HAPPENS



http://www.macom.org.il/definitions/topic_emotional_abuse/emotional-abuse-links/

A page of links on the subject of emotional or psychological abuse





http://www.mako.co.il/news-israel/local/Article-05d8fe2684a9531018.htm

A police officer threatened to give a driver a ticket because the driver had commented on his parking





Social recognition of trauma is so important that people who have experienced a trauma that society ignores sometimes take their own lives, because by ignoring it, society sends the victim a message: not only have you experienced trauma, but from now on your rights will be violated. The disregard means that the victim has no right to his life story, to the unique way in which he organizes his view of the world. As a result, the victim understands that in effect he has no right to protection of his body and mind; he is condemned to live the trauma again and again, because by ignoring his narrative the surroundings signal to him that he has no chance of receiving protection, and so the traumatic environment, the one within which the trauma became possible, never ceases. Facing such a threat, which is perceived as a concrete one, some trauma victims attempt suicide; sadly, some of them succeed.









http://www.calcalist.co.il/local/articles/0,7340,L-3398926,00.html

How much evil can one swallow

A journey to the roots of human evil with Prof. Philip Zimbardo, the man behind the research disaster that became a landmark in the history of the psychology of evil





http://www.ynet.co.il/articles/0,7340,L-3666015,00.html
Years of imprisonment for a paranoid man who harassed a lawyer







http://en.wikipedia.org/wiki/Situationism_(psychology)




https://www.osh.org.il/uploadfiles/t_134_bullying_f.pdf
Workplace bullying



https://www.osh.org.il/uploadfiles/ccohs_Bullying.html
Workplace bullying



The law of inversion: the less a person knows the victim, the easier it is for the scoundrel who maliciously spreads false rumors to get that person to believe the false rumors. The closer a person is to the victim of the libels and the better he knows him, the more evidence and proof he will demand, and the harder it will be for him to believe the stories.




https://www.youtube.com/watch?v=0gPYCRfHg3Q

BBC - How to Make You Torture Someone to Death





https://www.youtube.com/watch?feature=endscreen&v=jmFur5Cr288&NR=1

Social Psychology - TV Causes a Deadly Break in Reality





abc news Primetime Milgram





https://www.youtube.com/watch?feature=endscreen&v=TYIh4MkcfJA&NR=1

Asch Conformity Experiment





http://www.hayadan.org.il/on-milgram-experiment-0307096

Psychological experiments at McDonald's – on Milgram's chilling experiment





https://en.wikipedia.org/wiki/Strip_search_phone_call_scam

Strip search phone call scam









Lord of the Flies in Jerusalem

http://www.haaretz.co.il/magazine/1.1708946







The harassment method is the passive-aggressive method. Those who suffer from passive-aggressive personality disorder will not tell you openly and honestly that they are angry at you and why, and will not express anger openly; instead they look for ways to hurt your feelings, insult or humiliate you in a manner that is invisible to those around you and goes unnoticed, while they remain protected behind the excuse of "he's imagining it / he's crazy". They play innocent and declare in public that they have nothing against you, while covertly engaging in needling, harassment and small acts of sabotage, using venomous double-speak and mockingly ambiguous remarks. These are people full of bitterness and venom who busy themselves with poisoning their surroundings. The only real complaint they have against you is that you operate at a high professional level and they are jealous, and that, of course, is not a complaint one can admit to publicly or openly; hence the game of playing innocent.





http://www.sahar.org.il/?categoryId=63113&itemId=132807
Psychological violence



https://www.youtube.com/watch?v=wAgg32weT80

Workplace Bullying





http://www.eip.co.il/?key=3763
How to plant thoughts? How do subliminal messages work? How to make someone do what you want? Flirting, teasing, double messages, multiple meanings, subliminal messages. How to influence people? How to persuade people?




Social pressure

Confirmation bias

A tendency toward conformism

A bias in favor of self-interest


Obedience to authority
.
--------------- ---------------------------------

Putting on a normative front

Believing too easily, persuasive ability, manipulative incitement.

Ordinary people

Tailoring the message to the listener

Double meaning



Planting thoughts.
Passive-aggressive.















http://psycnet.apa.org/journals/abn/109/4/602/
Induced emotional interpretation bias and anxiety.



https://www.sciencedaily.com/terms/confirmation_bias.htm

Confirmation bias





https://wiki.lesswrong.com/wiki/Narrative_fallacy

Narrative fallacy




The narrative fallacy addresses our limited ability to look at sequences of facts without weaving an explanation into them, or, equivalently, forcing a logical link, an arrow of relationship upon them. Explanations bind facts together. They make them all the more easily remembered; they help them make more sense. Where this propensity can go wrong is when it increases our impression of understanding.
Nassim Nicholas Taleb, The Black Swan











http://www.overcomingbias.com/2008/12/the-bad-guy-bia.html

The Bad Guy Bias







https://en.wikipedia.org/wiki/Doublespeak

Doublespeak is language that deliberately obscures, disguises, distorts, or reverses the meaning of words. Doublespeak may take the form of euphemisms (e.g., "downsizing" for layoffs, "servicing the target" for bombing[1]), in which case it is primarily meant to make the truth sound more palatable. It may also refer to intentional ambiguity in language or to actual inversions of meaning. In such cases, doublespeak disguises the nature of the truth. Doublespeak is most closely associated with political language



https://en.wikipedia.org/wiki/Euphemism


A euphemism /ˈjufəˌmɪzəm/ is a generally innocuous word or expression used in place of one that may be found offensive or suggest something unpleasant.[1] Some euphemisms are intended to amuse; while others use bland, inoffensive terms for things the user wishes to downplay. Euphemisms are used to refer to taboo topics (such as disability, sex, excretion, and death) in a polite way, or to mask profanity.[2]
There are three antonyms of euphemism: dysphemism, cacophemism, and loaded language. Dysphemism can be either offensive or merely ironic; cacophemism is deliberately offensive. Loaded language evokes a visceral response beyond the meaning of the words.

https://en.wikipedia.org/wiki/Malapropism

A malapropism (also called a malaprop or Dogberryism) is the use of an incorrect word in place of a word with a similar sound, resulting in a nonsensical, often humorous utterance. An example is the statement by baseball player Yogi Berra, "Texas has a lot of electrical votes", rather than "electoral votes".[1] Malapropisms also occur as errors in natural speech and are often the subject of media attention, especially when made by politicians or other prominent individuals. Philosopher Donald Davidson has noted that malapropisms show the complex process through which the brain translates thoughts into language.

https://en.wikipedia.org/wiki/Gibberish#Other_terms_and_usage

Gibberish, jibberish, jibber-jabber and gobbledygook refer to speech or other use of language that is nonsense, or that appears to be nonsense. It may include speech sounds that are not actual words,[1] or forms such as language games or highly specialized jargon that seems nonsensical to outsiders.[2] Gibberish should not be confused with literary nonsense such as that used in the poem "Jabberwocky" by Lewis Carroll.
The word gibberish is more commonly applied to informal speech, while gobbledygook (sometimes gobbledegook, gobbledigook or gobbledegoo) is more often applied to writing or language that is meaningless or is made unintelligible by excessive use of abstruse technical terms. "Officialese", "legalese", or "bureaucratese" are forms of gobbledygook. The related word jibber-jabber refers to rapid talk that is difficult to understand.[3]


https://en.wikipedia.org/wiki/Bloviation

Bloviation is a style of empty, pompous political speech particularly associated with Ohio due to the term's popularization by United States President Warren G. Harding, who, himself a master of the technique, described it as "the art of speaking for as long as the occasion warrants, and saying nothing".[1] The verb "to bloviate" is the act of creating bloviation. In terms of its etymology, according to one source, the word is a "compound of blow, in its sense of 'to boast' (also in another typical Americanism, blowhard), with a mock-Latin ending to give it the self-important stature implicit in its meaning".[2]


