Misconceptions about IBL (or the CCSS, or education reform efforts in general) are in part due to what I call "selective reporting." When news breaks about evidence supporting active, student-centered methods of teaching, it rarely receives much fanfare, so most people miss it. On the other hand, if a school district or academic institution is struggling to implement effective teaching, the story makes it to mainstream news, creating a bias. Put another way, the full picture isn't being reported in a reasonable way, so the takeaway message is a distorted one.
Here's an analogy that conveys the depth of the difference. In photography, cropping is a way to change or enhance an image; what we exclude can significantly alter its impact.
Picture #1: The subject is a boy who appears pensive or annoyed. Presented by itself, the image leaves the viewer with a very specific interpretation. It's definitely not about playful, youthful themes.
Picture #2: What was really going on? The subject was engaged in a variety of silly poses and only pretended to be annoyed in the image above. A look at the frames below tells the broader story and gives a completely different perspective on what is happening.
In a similar way, news about education is reported in bits and pieces that leave the broader story out of the picture. The oversimplification presented below is just to get the point across. The diagram is not how I actually think about the education system, but it's good enough for these purposes as a "back-of-the-envelope calculation."
Let's assume the main groups of topics in education reform are listed in the diagram below.
What happens is that the media employs selective reporting: the emphasis falls on, say, problems with implementation, or on highlighting a small subgroup's opinions far out of proportion to their earned merit. So it's easy for the non-expert to draw incorrect or limited conclusions.
For instance, with respect to the CCSS, the media emphasis has been on implementation struggles. The public could then be swayed to think the entire CCSS idea is flawed, as opposed to seeing the problem for what it is -- early struggles with the transition. Reporting rarely (if ever?) asks natural follow-up questions or provides a broader view to put the issues into context. That's especially unfortunate, because education is a complex, long-term issue. Context is fundamentally important to understanding what is going on in education, and context and framing are exactly what media reporting excludes.
The math profession isn't entirely free of this. The AMS fairly recently published a Doceamus article in the Notices (http://www.ams.org/notices/201010/) in which a narrow study on "worked examples" was extrapolated to imply that constructivist and minimally-guided approaches were invalid. That article made the rounds, while the article by Freeman et al., based on a meta-analysis of 225 research studies across STEM disciplines, appeared to get less fanfare, despite being a vastly stronger body of work in scale, quality, and value to society.
The dynamics of misinformation are subtle, because it's hard to know about things you don't already know about. Undoing misconceptions is harder than informing people correctly the first time, so the problem can snowball and lead to unproductive or even harmful resolutions.
The misinformation gap is here and is real.