Agenda-driven data is one of the hardest obstacles to overcome - whether in the Orthodox world or the world at large. For all the obfuscation surrounding the Iraq War, the most legitimate criticism is not whether the Bush administration had the data to justify an invasion, but whether the gathering and weighing of that data were agenda-driven in such a way that officials were more prone to give it credence than they otherwise would or should have been. More recently, it was striking to see three liberal publications discuss how liberal and other agenda-driven biases are causing major problems in extremely important scientific fields by placing ideology above science.
About two weeks ago, a New York Times article on political bias among social psychologists told an eye-opening story:
[Dr. Jonathan Haidt] polled his audience at the San Antonio Convention Center, starting by asking how many considered themselves politically liberal. A sea of hands appeared, and Dr. Haidt estimated that liberals made up 80 percent of the 1,000 psychologists in the ballroom. When he asked for centrists and libertarians, he spotted fewer than three dozen hands. And then, when he asked for conservatives, he counted a grand total of three.

Prior to that, in Lies, Damned Lies, and Medical Science, The Atlantic noted that the structure of academia pushes research to emphasize the sensational over the factual:
To get funding and tenured positions, and often merely to stay afloat, researchers have to get their work published in well-regarded journals, where rejection rates can climb above 90 percent. Not surprisingly, the studies that tend to make the grade are those with eye-catching findings. [...] Imagine, though, that five different research teams test an interesting theory that’s making the rounds, and four of the groups correctly prove the idea false, while the one less cautious group incorrectly “proves” it true through some combination of error, fluke, and clever selection of data. Guess whose findings your doctor ends up reading about in the journal, and you end up hearing about on the evening news?

Around the same time, The New Yorker published a fascinating piece entitled The decline effect and the scientific method, discussing how studies that were (and are) used to build numerous theories and ideas - on everything from the effectiveness of antidepressants to the workings of memory - are proving increasingly difficult to replicate. This raises huge questions about the efficacy and accuracy not only of the studies themselves but of everything that has been built on their findings. Time and again, the more a study is replicated, the less true it seems to be.
If we as a Jewish community wish to begin fixing our problems, examples such as these show that we cannot rely on agenda-driven studies to reach conclusions. If we do, we will ultimately rush in whatever direction looks good, only to find again and again that it simply isn't working the way it's supposed to - and by then it will be too late. When the JES began two years ago, its primary purpose was to put together a simple guide to help young singles and couples as they began living independently or started their marriages. As the data came in, however, the findings taught us a great deal about what problems existed and how people viewed those problems and the community's economics as a whole - and pointed to, but did not prove, what might be able to help.
It is high time we put in the effort to collect and truly understand as much data as possible about the Jewish community - its various underlying problems and their causes, as well as its strengths and what allows them to thrive. Perhaps (!) a data-driven approach will ultimately allow us to help the Jewish community, rather than simply push off what seems an increasingly close and inevitable collapse of the structure currently in place.