Ego Depletion: Accepted Theory in Trouble
This story from Slate was cited by Retraction Watch (http://retractionwatch.com/).
It is not that "all these pseudo-scientists on government money are frauds." For one thing, the original research was carried out at Case Western Reserve University, a private school, in fact, the "Patrick Henry University" of Atlas Shrugged. Rather, it speaks to the sociology of science. Science, no less than religion, sports, or business, is an artifact of human society. It is subject to the same kinds of personal failings that are the equal and opposite of heroic achievements.
Identifying these failures is integral to the process, no different than a business dropping an unprofitable product. That product had champions who cited research before being able to show at least some market response. We all carry smart phones now, but how many failed PDAs (personal digital assistants) can you name from the 1990s?
Ego depletion may be real: it seems intuitively obvious that we can get worn down. Quantifying that may be intractable with our current paradigms. Objectivism might suggest a more robust psycho-epistemological model.
For the fifth year in a row, I judged our regional science fairs for senior high, junior high, and elementary schools. My area is Behavioral and Social Science. Across all of the categories in the Intel International, we always give the highest ratings to "original research." We never reward replication studies.
When I lived in Michigan and my wife worked at the U of M (Flint), I delivered two "Super Science Friday" sessions to middle schoolers. The second year, my theme was "CSI: Flint." Centered on junk science in the courtroom and police laboratory misconduct, I suggested to the kids that anyone with a head for science who wants to go into police work should consider working for an office of research integrity. Every major university has one. The federal government has several because they fund so much research. I have never heard of such a thing in the private sector.
Here is the original Slate article:
http://www.slate.com/articles/health_...
Ratcliffe claims that science in general is at a critical juncture due to the effects of belief biases carried into popular literature and even professional materials. I think he's right, and as a result, I question nearly all 'new' science and claims.
What I mean is that I read Atlas Shrugged in high school and took the Basic Principles class in the 12th grade. I read Anthem first, then The Fountainhead and then FNI and VOS before AS. But that did not provide me with an insightful application of objective epistemology in physics classes. We got rewarded for completing the lab in 90 minutes or two hours, from set-up to write-up. We knew the intended outcome. We got as close to the curve as possible and identified some sources of error. That's it. We never were asked how we would test this claim or that from our textbooks. We just cooked from the recipe.
The one chemistry class I had was like that. The one geology class I had was like that. And geology, in particular, is a science in constant upheaval. We never were asked for alternative explanations of the physical evidence in front of us.
Changing that deeply reinforced cultural pattern is a serious challenge.
Likewise, lectures and texts also omit the process by which scientific principles were attained. The most we got was that so-and-so discovered this, with or without a crude one-liner about the experiment. To understand what the principles mean and how they were found, you have to go back on your own and find the best conceptual explanations and histories you can to try to trace what happened.
As I have pointed out, in the sociology classes I took while majoring in criminology and finishing a master's in social science, we actually did study the scientific method, at every level, at least at the beginning of the semester. (In the world-standard textbook by Anthony Giddens, architect of "New Labour," it is discussed twice: once at the outset and again in more detail at the end of the book.) Moreover, in sociology, we study the origins and development of the science, sometimes to our own dismay.
Many in sociology accept physics as the gold standard of science. They point out that in Social Forces and other journals, you can find many papers published today that cite Max Weber. No one in Physics Letters A or B or Physics Today cites James Clerk Maxwell in support of an argument. Conversely, few sociology papers cite research less than five years old. Physics is always about citing the latest research in your paper.
But that speaks exactly to the point. We do not bury our story of development, the false paths, the overturned assumptions, the backpedaling, and even the curious, if not hypocritical, yet highly rewarding research of Marxists who set up a consumer polling business. (Read about Paul Lazarsfeld on Wikipedia.)
I had a 200-level class in Research Methods, required not only for sociology majors but also for social workers. Every week, we chose and criticized two peer-reviewed papers of our own selection. "Can undergraduates meaningfully criticize peer-reviewed papers?" I asked. "Start with the math," the professor said. You don't get that in physics.
Social sciences obviously are plagued by many problems, fundamental conceptual problems. They appear as basic ideological problems. The reason that I did not pursue a master's in criminology was that there was nothing more to learn: race, gender, and capitalist oppression pretty much defined the sources of all of our problems. By choosing an open program in "social science" I put together an approved study of transnational white collar crime by taking graduate classes in criminology, U.S. foreign policy, economics, and geography.
But by then - as opposed to when I was first a freshman in 1967 - my understanding of Objectivism was better integrated.
See also here:
http://utexas.academia.edu/HiltonRatc...
And his blog here:
"Scientists are neither superhuman nor divinely privileged. Scientists, let me tell you right now, are simply plodding bricklayers in the wall of knowledge."
http://www.hiltonratcliffe.com/
I must insist, however, that the social sciences actually do teach science. It is the physical sciences that are remiss. In sociology classes from the freshman to the senior level, we study the study: how sociology came to be, how it grew and changed by criticism within and without. We also study the scientific method and statistical methods. That is not true in physics. They just dump the thing in the lap of the freshman as if it were from the head of Zeus. They assume that you know the scientific method. But is that the three-step, nine-step, or 14-step version?
No, truthfully, the social sciences adhere more closely to what we expect from "real" science.
I took one 'soft' science course in college: Sociology. It was obvious to me that the high grades went to the people who knew the right 'cant' phrases to use. It was a type of techno-Babel competition, not a science class.
Pity. It could have been interesting otherwise. I stuck to the hard sciences after that - at least they had answers that could be measured in grams.
Jan
But you can run into the same troubles in the physical sciences, also. Again, consider Hilton Ratcliffe. He can say those things now that he is old and tenured and independent. Follow stories about him and you will find that his Christian faith attracts others who use his doubts about mainstream science for other purposes. Here is a story about a Christian paleontologist who was fired because he had another explanation for his objectively supported empirical discovery. http://losangeles.cbslocal.com/2014/0...
We might dismiss Christian fundamentalists from Patrick Henry University, but the deeper question needs to be addressed, because here-and-now, the physical sciences have their own mantras, liturgies, and cant. It may have been only that you were able to learn the creole of physical science.
Here is one example of that: Did you gather lots of data? Did you build a curve from a robust array of measurements? That is the fallacy of induction. What if you had offered David Harriman's "objectivist" theory that a single example is enough, if it is well integrated with the foundation of truth? "Oh, no, professor, I just needed one example..." Right...
“In the context of the Lesson of Voelvlei, what emerges is this: To get a coherent mental reconstruction of external reality, we must use logic. There is no other way to consistently produce a proper result. We are simply not equipped—dare I say intended?—to deal with the world irrationally as a survival mechanism. It would be counter-productive and unnatural. Whatever we think, the only audit we have is comparison with external reality. No matter how convinced I might be that by simply flapping my arms I could fly like a bird, if I were to test my faith by jumping off the Empire State, gravity would win. If I can predict gravity’s victory, that’s logic."
“So yes, I am a chattering African ape (a notion which does not offend me in the slightest), and I babble on unashamedly in ape-talk, thinking and developing opinions in the fashion of the monkey that I am. For every yin, there is a yang. Intelligence is a mastery of logic and an appreciation of the aesthetic. Hedonism is tempered by ethics. Rational is shadowed by the irrational, and we, creatures of the little blue planet, must cope with that. It’s how well we harmonise with the laws of nature that will determine in the broadest terms the location of that seminal line drawn in the sands of time that separates success from failure. We won’t win by fighting it.” (emphasis added)
And:
"My father was agnostic, and I was brought up without religious prejudice. That really was an advantage, because when I went into science I had no philosophical or theological baggage to worry about. It was great. Eventually, my journey took me into the infinite universe of astronomy, and what I came across, what I saw with my own eyes, absolutely blew me away." (emphasis added)
And in the introduction to his latest book, 'Stephen Hawking Smoked My Socks,' he writes:
"I do not question Islam because I am Christian; I do not critique Christianity because I am a Jew; I do not deny Mayan Doomsday ideas because they conflict with my belief in Nostradamus; I do not challenge Greenpeace because I am a member of the National Rifle Association; and I do not attack the 9/11 conspiracy theories because I am Republican. In every case, I assess those belief systems using the objective scientific method, and in every case they are found wanting, and I remain in all cases an agnostic." (emphasis added)
I've seen nothing in his writings or self-description to verify your description.
My major was bio; minor in chem (and physics...by accident), so my tests were a lot more functional. Here is a tube full of bacteria: identify genus and species. Or: here is a tube full of some substance; what is it? These had pleasantly real answers.
Jan
That said, just because a scientist is wrong does not justify firing him after his work has been lauded and his career endorsed every step along the way.
Moreover, as for those cross-checks: long ago, Durk Pearson told me to subscribe to Industrial Research magazine. It was mostly about the nitty-gritty of industrial chemistry, but it had a contrarian columnist whose wild claims caused people to cancel their subscriptions. One of his columns argued that because carbon-14 is continually produced in the upper atmosphere by cosmic rays striking nitrogen-14, we cannot be sure that our steady-state assumption is correct: maybe more or less of it was produced in the past, as, for instance, when the magnetic poles shifted.
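To make the arithmetic concrete, here is the textbook age calculation in Python (a minimal sketch, assuming the conventional 5,730-year half-life). Note that it bakes in exactly the steady-state assumption the columnist questioned; that is why modern labs correct for past fluctuations with calibration curves.

    import math

    HALF_LIFE = 5730.0  # years; the conventional carbon-14 half-life

    def radiocarbon_age(fraction_remaining):
        # Solve N = N0 * (1/2)**(t / HALF_LIFE) for t, assuming the
        # starting C-14 level N0 was the same as today's (steady state).
        return HALF_LIFE * math.log(1.0 / fraction_remaining) / math.log(2.0)

    print(radiocarbon_age(0.5))   # 5730.0  -- one half-life
    print(radiocarbon_age(0.25))  # 11460.0 -- two half-lives

If past production varied, the mapping from remaining fraction to age shifts, and the naive formula quietly reports the wrong date.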
On that note, I recently read In Suspect Terrain by John McPhee. Although plate tectonics had a hard birth, today it is used to explain everything. McPhee's subject matter expert was Anita Harris. Again, what was once heresy is now orthodoxy, resistant to challenge.
Also, you did not answer my question about your acceptance of the inductive fallacy of positivism in mainstream science.
With respect to Plate Tectonics, pre-Clovis New World occupation, epigenetics: As with Global Warming, the important part is that all sides of the question be heard and that no side be vilified - though stern criticisms of scientific data are expected.
I do not care that the professor is Christian (or even Young Earth Christian). The leap from "it has soft tissue" to "therefore it proves that dinosaurs are only a few thousand years old" is a huge and totally unsupported one - a leap of Faith, not a scientific progression. There are many more ways (over a dozen) of testing the age of an object than just C-14, and most recent paleontological work uses C-14 dates that are calibrated to take the fluctuating amounts of C-14 into account.
Jan
What may be called the scientific method is, as you say, not overtly taught in 'hard science' classes; it is by and large practiced. The reverse is the case in social science, where sensitive cultural beliefs are defended according to the orthodoxy of the time.
As Samuel Johnson said-
"Example is more efficacious than precept".
It has been observed before on this site (was it jbrenner?) that any field of study containing the word science is not science.
political, social, climate...
At my alma mater, Eastern Michigan University, the psychology department was very proud of being in the physical sciences building, rather than with the social sciences.
Oh... and computer science! (Not a science?)
Quips and zingers are a lot of fun, but the tragedy of Orwell's 1984 was the destruction of thought with "duckspeak," the recitation of one-liners. We call the best of them aphorisms or proverbs.
Computer science, supposedly the science underlying computing, is, however, more a collection of rules followed at the time by those writing programs. Computer science is further from science than engineering and medicine are; it relates to science much as the study of law does.
It is hard to see how economics, as currently understood and propagated, is a science. Economics today incorporates some of the greatest con-tricks and sleights of hand of the age. As for psychology, this is a branch of the social sciences (oops), as distinguished from psychiatry, which is part of medicine; the theme of this thread suggests how common it is for myths to be accepted as fact in psychology.
I am a lover of one-liners so I am quite upset (sob) to see them condemned. I wish I had the writing skill to condense my thoughts to one-liners instead of having to quote them.
Mostly what we study is probably best called "software engineering".
Most "computer science" that I've encountered has been more the study of techniques rather than underlying truths. There have been a few forays into that realm, such as the study of Turing machines and computability but in general we are talking about algorithms and techniques.
I guess to me the idea of science is that you are discovering a truth of nature, something about objective reality that you can understand. To my mind, engineering involves taking tools and components and building things.
When you are developing an algorithm to perform a task, there is no 'scientifically correct' way to do it; we have a bunch of tools that can be applied, but you can always come up with some weird way of accomplishing it. You don't need 'rigorous application of studied principles.' Of course, using things out of the standard tool set is easier.
"I guess to me the idea of science is that you are discovering a truth of nature..."
Okay. So your arguments are #1 that you feel computer science isn't as directed towards research and focuses more on real-world application when compared to other disciplines such as chemistry and #2 that in many cases there is no "one source of truth" regarding how to solve a particular problem.
I can see where you are coming from. Here's what I consider:
To me, any discipline, to be useful, must focus on application. Theory is a start, but it holds little value until it can be applied in the real world to a task at hand. The study of the lever or the inclined plane is of much greater value when applied as a hammer or a screw. We study the basics of chemistry so that we can understand and derive consistent processes of input leading to output, whether it be the combination of vinegar and baking soda to create carbon dioxide bubbles or the breakdown of various -ane molecules when combined with oxygen and a catalyst to generate heat. But there again, how much of the study of chemistry is geared towards the examination of the molecules themselves rather than their uses?
With regard to the "single source of truth" aspect, much of computer science is oriented around how people interact with information. With twenty years in the field, I can tell you that it is very rare indeed when two different people see and use the same bit of information exactly the same way. Your argument there is acknowledged; however, it presupposes that there is a single use of information which is "proper" or "best." There is no question that most informational requests center on one aspect of interest, but it is the linking of that information to secondary and tertiary uses that is one of the reasons a field called business intelligence is such a hot place right now.
I think that one of the things you are attempting to differentiate is what some call pure science as compared to what others call applied science. I would argue that there are aspects of every field which are purely theoretical (dark matter, AI) and others which are applied (reduction of bauxite to aluminum, coding an operating system). That one determines a particular discipline to be more or less applied vs research oriented is more a matter of perception than anything.
"Applied science" I would consider more of an engineering discipline. You don't necessarily find any new truths, but figure out how to use what is known to produce things.
I remember in the early 1970s when colleges were first introducing "computer science." I attended classes at Illinois Institute of Technology, one of the first schools to offer an undergraduate degree in computer science. At the time there was much debate as to whether there was an actual computer science, and the study of things like Turing machines and "proof of correctness" was pushed to make it more "sciency." They still remain a part of the curriculum but have little to do with what we actually do.
As to how we interact with information, that is an evolving art (see, not a science!). I remember when I first saw a presentation of the Xerox Star with the mouse and Graphical Interface. I was gobsmacked and was instantly convinced that it was how we would interact with computers in the future. Of course I don't think that it is the final solution. There will be other UI techniques that let us more directly interface and people will look back with scorn at having to move a mouse.
I guess to me the distinction between science and engineering is that science is like The X-Files, where "the truth is out there," while in engineering we make it ourselves.
In my comment above to jlc, I pointed to the positivist influence in physical science. It is why they pursue so many data points, eternally chasing one more example to build an inductive proof. And when a paradigm shift invalidates their inductions, they just start all over again, working within the new paradigm.
Science is in trouble, as it always has been. The Scylla and Charybdis of rationalism and realism were never navigated successfully. Even so, science is not as hopeless as many here seem to want to believe. An objective foundation will prove itself the cause of a renaissance in science. And that is being established. It is no accident that the works of Ayn Rand are popular with so many professionals in STEM.
http://retractionwatch.com/2016/03/14...
"Academics ‘regularly lie to get research grants’"
"Scholars in the UK and Australia contemptuous of impact statements and often exaggerate them, study suggests"
"...Academics routinely lie and exaggerate when telling funding agencies what impact their research will have, a series of candid interviews with scholars in the UK and Australia has suggested..."
"Respondents said that future projections of impact were 'charades' or 'made-up stories'. As one UK professor put it: 'would I believe it? No, would it help me get the money – yes.'
"Academics felt pushed into lying on their impact statements by the logic of ferocious academic competition, the paper found.
“'If you can find me a single academic who hasn’t had to bullshit or bluff or lie or embellish in order to get grants, then I will find you an academic who is in trouble with [their] head of department,' said one professor in Australia.
"Another Australia-based academic said that embellishment was about 'survival' in the research grant game."
This is the corrupt ethics and political system side of the influence of bad philosophy in science as in everything else.
"[R]esearchers in several scientific areas have consistently struggled to reproduce major results of prominent studies. By some estimates, at least 51%—and as much as 89%—of published papers are based on studies and experiments showing results that cannot be reproduced.
"Researchers have recreated prominent studies from several scientific fields and come up with wildly different results...
"No one is accusing the psychologists behind the initial experiments of intentionally manipulating their results. But some of them may have been tripped up by one or more of the various aspects of academic science that inadvertently encourage bias.
"For example, there’s massive academic pressure to publish in journals, and these journals tend to publish exciting studies that show strong results.
“Journals favor novelty, originality, and verification of hypotheses over robustness, stringency of method, reproducibility, and falsifiability,” Hagger tells Quartz. “Therefore researchers have been driven to finding significant effects, finding things that are novel, testing them on relatively small samples.”
"[I]n cases where researchers have access to large amounts of data, there’s a dangerous tendency to hunt for significant correlations. Researchers can thus convince themselves that they’ve spotted a meaningful connection, when in fact such connections are totally random."
But these are not the primary causes. They are the result of a lack of commitment to scientific objectivity, and of not knowing how to achieve it when the dominant methods of thinking are rationalistic "model building."
http://www.jclinepi.com/article/S0895...
(The link to the PDF is at the right of that.)
From your first link, I note this core:
Ivan Oransky, co-founder of the blog Retraction Watch, which tracks retractions printed in journals, tells Quartz that ultimately, the alarm will lead to increased rigor.
“There’s going to be some short-term and maybe mid-term pain as all of this shakes out, but that’s how you move forward,” he says. “It’s like therapy—if you never get angry in therapy, you’re probably not pushing hard enough. If you never find mistakes, or failures to reproduce in your field, you’re probably not asking the right questions.”
Your rant above about the corrupt practices of research in the UK and Australia is easy to read and easy to agree with. However, it buries reports such as the one I cite here, from the ASA. That paper is being widely cited, "going viral" as we say now. I found out about it from reading Retraction Watch, which conducted its own interview with the authors. When I googled for it now, one of the top hits was the Science story about the ASA press release.
We can wring our hands over these revelations, but the fact is that this is the self-correcting nature of science.
Not from you, ewv, but from others here, I perceive the same "flavor" or "tenor" of anti-scientific, anti-intellectual deconstruction that comes from the post-modernists. When I was completing my masters in social science in 2010, I had a criminology theory class that presented post-modernism. We read nonsense about crime being a torus and a strange attractor (citing Jacques Lacan). The post modernists say that the practice of science is just a social fraud to perpetuate oppression. You can find that thinking here, also.
Again, for myself: as interesting, if not amusing, as these revelations are in the moment, the more important fact is that they reveal the true nature of science as depending on evidence and being tested by falsification.
It does happen that results are buried for financial reasons, and that just fuels the fires of anti-capitalists who want all research to be funded by the government.
Jan
Tobacco: when the companies paid, one kind of conclusion came out. Now there is little company money but much from government, and only the other kind appears.
Climate: 'research is bought and paid for on both sides. And that is good.'
But when the money ratio is about 3000:1, it is no wonder which way the research goes. Stopping all the government money would solve that; there will always be altruistic sheep who want to save the planet, so let them use their own money.
Rather than government ensuring fair play (expletive deleted), I'd rely on Galbraith's countervailing power, which says that other interests always exist and will do their own research, with the public deciding.
Baumeister is so wedded to his original theory that he can't let go. It is more rewarding, however, to discover a mistake and rectify it than to falsely believe in one's own correctness. It's more satisfying to BE right than be thought to be right. That’s a good principle to keep in mind when confronted by loss of acceptance.
About the radishes vs. cookies theory: In a culture where sweets are rewards, those receiving cookies will have a higher level of self-esteem, albeit artificially induced. They will have a surplus stock of performance motivation. Those not so rewarded will have a diminished sense of self-worth (ego has nothing to do with it) and thus shrink from seeking to excel.
My personal preference would actually be to eat the radishes rather than the cookies; I love radishes and am not crazy about the effect all that sugar has on me.
Those with a lesser sense of self will tend to lack the confidence to succeed and prevail and will give up sooner. It may well have been that the early versions of these experiments brought correct results, within the context of the culture of that time. But little by little the infusion of knowledge about previous tests contaminated the new recruits. There is a great temptation to tailor results to the desired goals, and to rationalize outcomes.
The disparate results from differently treated or rewarded groups have their parallel also in kids not getting enough love and growing up maladjusted. A similar experiment decades ago with baby monkeys showed that babies left alone in a cold, empty cage, with only a metal fixture from which to get their food, fared much worse and did not thrive; monkeys that had a fur covering over the fixture, which gave them some small comfort, fared better; and babies with their live mothers flourished best. Being mistreated is demoralizing and ill prepares a creature, animal or human, to cope with confidence with the problems of life.
It’s not ego depletion; it’s self-confidence destruction. In extreme cases the intent is, in fact, to break a person’s spirit, as in prisons. Timid individuals don’t assert themselves and are drained of the energy to take initiative on their own behalf. It’s like their batteries are not fully charged.
Quite aside from that, each person has a motivation program running that can be triggered in ways unique to each individual. A timid boy may get courage to defend a friend, or a smart person may judge correctly when further effort is useless. Even a lion knows when to stop running after a gazelle she can’t catch. And any individual who volunteers for an experiment is already tainted by the sense of importance of his or her role. That’s more nearly an ego involvement and would tend to override the radish rebellion.
I once made a graphic for my office wall: Long ago, people believed that the mind resided in the heart. Then, they placed it in the liver. Today, we say that the mind is in the brain. Some people will believe anything.
Objectively, ego is identical with self-esteem. That contradicts the Freudian definition of ego. In mainstream psychology, you can have a strong ego by knowing who you are, yet not be happy with yourself or your station in life. That is not true for a self-actualizing psychology. By the standard of human potential, if you are resigned to your fate, then you lack ego.
I wish I knew what they used or how they shielded it.
That knowledge could be put to many other uses.
Part of my response to a mentor's musings that we really don't know what gravity and another force are: I suggested that these two forces are one and the same, only manifested differently. Care to guess what the other force is, and how it is manifest? (We have to be careful with this one.) The lefty loosies would be all agush!
His work on ego and self-esteem, written well before this recent test, would have indicated the same result, even though that was not specifically what he was aiming at. I love debunkers, don't you? From the '70s fear of global freezing to today's fear of global warming. Only in those cases you might get arrested in the near future.
I understand the flaw of the original study. I was referring to the several books published by Branden regarding ego and self-esteem. Upon reading those books, I would have come to the same conclusion as the recent study, not because that was Branden's intent, but because, if one understands his work, in my opinion it also negates the original study.
Also, I used this thread to tout Branden because he is one of my heroes, warts and all, and I found his work to be very valuable, especially as it substantiates much of Rand's work.
Those who live mostly in the mind no longer need an ego for identity...however, it's rarely that clear-cut, and it's not pretty. Due to many distractions, including the memes of awareness and the paradigm in which we live, everyone finds themselves vacillating back and forth between the bicameral brain and our minds. [excepting those that do not have a connection to a mind, e.g., liberal progressives...the very young or those so challenged]
Gaining an identity, an "Iness" within the mind is actually quantum physical...the mind being part of the ether which pervades creation or the cosmos.
As for Otto W., I really don't get a sense of his hold on reality...it would seem that even for one fleeting moment, living in his head got him in trouble...how often do we question ourselves on a daily basis..."why did I do that?" "What was I thinking?"
I agree that separation from the herd can cause the construction of a new identity -- actually, I believe, it can be the true construction of identity. It separates city people from country people in many ways.
Here in the Gulch and on other Objectiv-ish discussion boards, some people extol the virtues of their small towns and close-knit communities where "everyone knows everyone." They tend to be political conservatives, traditionalists. But even the very idea of a "Gulch" is an expression of that. Some weeks back, khalling complained about the "loss" of the "sense of life" in the Gulch, how people here just are not the same ones anymore.
In the city, where everyone is a stranger, people do build new identities. In fact, that is also an aspect of America as a "large medieval town." Immigrants drop their Old World names and take new ones. In frontier days, as people moved West, they left problems behind by taking new names.
I forget who wrote about the fact that it is not the culture that defines the city; it is the concept of cities that defines the culture, and it's usually a dependent one, because your life and its problems are dependent upon someone else most of the time...
I am not sure that you can go back and forth from the bicameral brain to the mind. I think that once you discover (or create) your mind, there is no going back. That is one reason that so many fail to understand others: we are using different brains. Ayn Rand, in her journals but seldom in print, touched on the distinctions between fully formed people and those who do not think. It is a slippery slope, because once you define other people as "not human," morality ceases.
I've seen some amazing stuff...
While your example certainly speaks to the problem, we all have stories. I worked for a software firm that scammed investors with promises of Reaganomic investment credits. One of the officers went to prison, but not enough of them did. That proves nothing about the software industry "being in real trouble" or the fraudulent core of investment tax credits. It is just one case.
I do agree with you (I believe) that in order to improve (if not rescue) science in America, we need to create a new educational model in science. As I said above, we do not reward replication studies, but they are the touchstone of science.
What is more ego-depleting than false praise? The subject finds out after leaving the warm, fuzzy cocoon of daily attaboys and entering the real world.
I'm referring to the self-esteem monster, which demands that every child, from kindergarten on up to the universities, be barraged with phrases such as "good try," or be told: it's all right if you answer that two plus three is seven; we'll still pass you to the next higher grade level and even put you on the honor roll.
So my poor little contribution is: YES, I can see where that phrase might have a useful realism, but useful only to another grant seeker.
Your hyperbole about being rewarded for asserting that 2+3=7 does raise another, deeper issue. The problem requires an objective solution, and all we have is one-liners and platitudes. I refer to whether, and to what extent, you reward a learner for having the right method but getting the wrong answer.
Feynman was adamant: The right answer is all that matters; and there are many ways to find it. He would not reward a child for the wrong answer. On the other hand, earlier this month I checked by hand the physics, algebra, and arithmetic of a 19th century scientist (John Leonard Riddell). In this case, he dropped a minus sign. It happens... How would you grade a child (university student) who calculated an Earth-Moon Lagrange Point and dropped a minus sign?
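For what it is worth, here is a back-of-envelope version of that calculation in Python, using the standard Hill-radius approximation for the Earth-Moon L1 distance (rounded constants; a sketch, not a three-body solution), plus what a flipped sign does to the answer:

    # Hill-radius approximation: r ~ R * (m / (3 * M)) ** (1/3)
    M_EARTH = 5.972e24  # kg
    M_MOON = 7.342e22   # kg
    R = 3.844e8         # mean Earth-Moon distance, meters

    r = R * (M_MOON / (3 * M_EARTH)) ** (1 / 3)
    print(round(r / 1e3), "km from the Moon")  # roughly 61,500 km

    # Drop a "minus sign" -- take R + r instead of R - r -- and the point
    # lands on the far side of the Moon: a physically different place.
    print(round((R - r) / 1e3), "km from Earth (right side)")
    print(round((R + r) / 1e3), "km from Earth (wrong side)")

The student's algebra can be flawless up to that one sign; the question is what the grade should measure.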
On another Objectiv-ish board, a mother with an Autism-spectrum child wrote about one particular story problem. "A spaceship can carry two aliens. If six aliens want to travel to Earth, how many spaceships do they need?" The kid wanted to know if they all have to travel in pairs... Just sayin'... Me, I'd praise the kid just for asking.
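The intended arithmetic is a plain ceiling division, and the kid's question targets the hidden assumption in it. A short sketch in Python (the function name is mine):

    import math

    def ships_needed(aliens, seats_per_ship=2):
        # Assumes every alien flies and a partly full ship still counts.
        return math.ceil(aliens / seats_per_ship)

    print(ships_needed(6))  # 3 -- the expected answer
    print(ships_needed(7))  # 4 -- one ship flies half empty

Asking whether they have to travel in pairs is a challenge to the assumptions baked into that function, which is better scientific instinct than the problem itself rewards.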
Finally, to open another discussion entirely: why do we have this emphasis on arithmetic and reading, science and history? Yes, they are important. However, I refer to the fact that schools are praised for dropping music and art in order to keep to "core" studies. Those are not "core" studies if you are a musician or artist. Why not make everyone take four years of high school music and art? Just asking...
Consider Steve Ditko. He used his GI Bill to study comic book art under Jerry Robinson at the Cartoonists and Illustrators School. Float that idea here in the Gulch today. Remove the famous references and just ask: "Should schools offer career classes in cartoon art?" Imagine the outrage at the liberal progressive post-modernist destruction of education...
If the method is right and the answer is wrong, it shows something is not right, be it dyslexia or some other problem. The teacher must find the shortcoming and fix it. Better yet, help the student find the shortcoming and teach him or her how to fix it. Shortcuts in mental math, which used to be our only computer or calculator, are an example. And I agree with Feynman, to the point that the final grade should be based on the ability of the student at the end of the course of instruction. The grades before that were judgments on the teacher, not the student.
But I repeat: the words "self-esteem," which we read in Rand constantly, go hand in hand with self-respect. That is one reason I harp constantly on not using PC-era dictionaries, which pander to the lowest levels of attainment and often have deeper objectives. So we graduate from hyperbole to serious discussion.
My objective has been met. But the generations subjected to "Good try, here's a gold star," when the student knows it wasn't a good try, will find themselves behind the old eight ball (another one-liner), unless they can find an employer willing to act the way a public school teacher is currently trained to act.
Easy to say when the taxpayer foots the bill; harder when it's a profit-conscious employer. As for me, I didn't include the words "high school graduate" on applications for employees. "Home school," "GED," and "JC" were present.
Over to you.
As for Michael Faraday, the only biography that I read (as a child) revealed the abuse he took while working for Sir Humphry Davy. So Davy, too, then, would be suspect as a scientist acting below his station, if you will.
Your slight toward this unnamed collective on behalf of four exemplars does not explain much: "The kind of errors they made and understanding they missed were not like the kinds of rationalistic theorizing and promotions being criticized today."
Only two paths to error exist: rationalism and realism. Either you fail to find empirical evidence for your (logically consistent) assertions or you fail to provide a logically consistent explanation for your (empirically verified) perceptions. Truth comes from the integration of the rational and the real, i.e., the objective.
Ultimately, of course, they are integrated: no dichotomy exists, except in our erroneous understandings, our misunderstandings. It is the difference between an accused criminal being "really" guilty and the prosecution being able to prove it. Obviously, reality is real: the accused committed the act or not. What we know, however, may be severely limited. And, as we know all too well, often the prosecution "proves" an innocent person "guilty." So, too, in science, we all often go astray, though with far less consequence.
As I recall having read, long after their experiment discredited the concept of the "luminiferous aether," Edward Morley continued to search for it, attempting to refine his instruments to remove the "error" of their famous experiment. Do you condemn him for that? Was he a hopeless "rationalist"?
When you dismiss everyone except four names you know, you prevent the study and understanding of important work in the past in many fields far beyond physics.
"The work of those who did that are virtually unknown today because their methods failed." I never heard of Robert Hooke until I took a class in civil engineering (1978). But Hooke anticipated Newton's pronouncements with his own work in the same fields. Newton's "shoulders of giants" remark was meant as a thinly-veiled insult to Hooke, who was short of stature, as many were then and there. When I researched Newton's life and work for an award-winning biography, I came upon Hooke several times. If not for the insistence - and generosity - of Edmond Halley, Newton would have been an interesting footnote in the history of science: a London savant, Master of the Mint, parliamentary representative from Cambridge, dabbler in optics, and a religious fanatic who narrowly escaped the laws against heresy.
Objectively, Newton's work would have been great, only that we would not know about it; and he would have been relegated to "those who are virtually unknown because their methods failed."
There were many at the same time who used improper methods or a mixture and who either accomplished nothing or not much, and so they have not been heard of since or are little known for good reason. That includes rationalists claiming to anticipate Newton but whose speculations did not accomplish what Newton in fact did.
I pointed out above, that even at the elementary school level, at science fairs we reward originality, not replication, testing, and falsification. This is the culture of science, to promote new ideas and new discoveries.
Imagine what it would be like in a world operating on the opposite paradigm, one where only replication and verification were highly rewarded and novelty was always suspect. It would take a long time for the candle to replace the torch.
Clearly, the proper path is not rigid adherence to one Kantian "idea" over another. I do not know how to encapsulate that "middle of the road" except to use the word "objective."
How do you properly integrate the predictive truth of mathematics with the empirical evidence of our experiential world?
The answer to that requires a full course in epistemology, a study which itself is still under construction. Harriman's Logical Leap was a new addition to that body of knowledge. But it, too, is only an indication, not a cookbook.
Objectivity does not mean 'middle of the road' or 'moderate'.
Mathematics is often misused in rationalistic speculation, counting on the technical nature of the mathematics to appear 'scientific'. That does not mean that all mathematics is rationalizing.