The Media and Educational Research:
What We Know vs. What the Public Hears

 

By
Alex Molnar

Presented at the 2001 AERA annual meeting,
Seattle, Washington, April 11, 2001

 

Center for Education Research, Analysis, and Innovation
School of Education
University of Wisconsin-Milwaukee
PO Box 413
Milwaukee WI 53201
414-229-2716


 

 

CERAI-01-14

 

Abstract

News reports of education research frequently do not appear to take account of whether such research is peer reviewed. Various education research journals and research organizations that release research on education topics were surveyed to determine whether they subject their research to external peer review. While the journals responding consistently employed peer review processes, only some research organizations did. Of four organizations widely perceived as having an ideological stance, only one reported subjecting its research to outside peer review. Research organizations, whether their work was peer reviewed or not, employed a variety of media strategies to draw attention to their work. Journals with rigorous peer review processes did not report sophisticated strategies to draw media attention to their findings. The absence of consistent peer review in education research that succeeds in winning public attention creates a risk that sound policy may be subverted by the promotion of priorities that are not founded on solid social science research or that do not rely on the best available research knowledge.

Introduction

In February 2001, The Manhattan Institute and researcher Jay P. Greene released a report on Florida’s “A-Plus” education reform program. In the report, “An Evaluation of the Florida A-Plus Accountability and School Choice Program,”[1] Greene calculated that “schools receiving a failing grade from the state in 1999 and whose students would have been offered tuition vouchers if they failed a second time achieved test score gains more than twice as large as those achieved by other schools.”[2] He concluded that the threat of vouchers motivated the poorest-performing schools to improve.[3] The report was written under a contract with Florida State University as part of a grant from the Florida Department of Education to evaluate the A-Plus Program.[4]

Greene’s report received extensive publicity.[5] At least one account, published in the nationally circulated daily USA Today, presented no dissenting point of view.[6] Within a few weeks, however, two articles were published challenging the initial report’s methodology and conclusions. Both articles called into question the statistical methodology of Greene’s original study, and presented alternative explanations for his findings.[7] Unlike the Greene report, the two critical articles were published in a refereed academic journal. Those articles received no publicity in the general-interest press.[8]

The Greene report and its reception in the media illustrate a larger challenge that education researchers, policymakers and the general public face. Over the past two decades, reports produced by national and state private think tanks and policy organizations have played an increasingly important role in shaping education policy. Examples of reports on education policy issued by private organizations over the past couple of years include: “Report Card on American Education: A State by State Analysis, 1976-1999” (American Legislative Exchange Council, April 2000); “The State of State Standards” (Thomas B. Fordham Foundation, January 2000); “Choice and Community: The Racial, Economic, and Religious Context of Parental Choice in Cleveland” (Buckeye Institute, November 1999); and “Why a ‘Super’ Ed-Flex Program Is Needed to Boost Academic Achievement” (Heritage Foundation, March 1999).

To increase their impact, private policy organizations often disseminate and promote each other’s work. For example, the “Super Ed-Flex” program (later renamed “Straight A’s”) advanced by the Heritage Foundation in its March 1999 report was followed by “Why We Need ‘Straight A’s’,” issued by the Thomas B. Fordham Foundation in May 1999. The Fordham document is the text of testimony by Chester E. Finn, Jr., given before the U.S. House of Representatives Committee on Education and the Workforce on May 20, 1999. The testimony subsequently provided the basis for op-ed articles that appeared in the New York Times, the Los Angeles Times, and the Washington Times, among other publications.

Yet amid the welter of such reports and media coverage of them, there seem to be few guideposts by which readers can gauge the quality of the information they are receiving.

To publish the results of research in academic journals, social scientists must subject their findings to outside review by others in their discipline. This system of external review is generally regarded as an important tool for quality control in scientific research. Certainly, peer-reviewed work may become subject to disputes over conclusions or methodology. Even so, subjecting research to outside review can help ensure that it is well founded and that its findings are the result of sound, reliable and tested methods. Indeed, external review and subsequent professional publication can generally be regarded as prerequisites for a piece of research to be taken seriously enough for others to follow up and attempt to replicate its findings.

Because much of the work produced by public policy organizations is not published in recognized, refereed academic journals, however, the general public and policy makers alike lack any sort of yardstick by which they can measure its quality. This fact raises the question of whether the research produced by these organizations, so successful in gaining media attention, is in fact subject to objective outside review at all.

The question is important because the distribution of policy reports not subject to a peer review process carries with it a risk that sound policy may be subverted. Such reports may promote priorities that are not founded on solid social science research or that do not rely on the best available research knowledge. Often, policy reports issued by private think tanks are prominently featured in media stories, whereas any scholarly response normally occurs much later and is commonly little noticed by either the press or by policy makers.

This report summarizes the findings of two brief studies. The first is a case study of the media treatment of a particular research report on education policy, the Greene report on Florida’s voucher program noted above. The second is a brief survey of the peer review procedures employed by representative organizations and journals.

The Florida Voucher Report Case Study

Searches on Lexis-Nexis and Education Abstracts were conducted for the period between Feb. 15, 2001, when the Greene study was released, and April 11, 2001, when the searches were conducted, using various search terms appropriate to the Greene report on Florida’s plan. In addition, articles were retrieved through links from the Web site of the Manhattan Institute, which published the Greene paper. Details of the search methods and findings are set forth in Appendix A.

Articles retrieved in the search were read and categorized by the author as follows: Items that reported the Greene study’s findings uncritically, with little or no comment from authoritative sources calling its findings into question, were designated “Pro.” Items that appeared to balance discussion of the study’s findings with other points of view, including critical assessments, were designated “Neutral.” Items that consisted largely of comments or arguments questioning the findings were designated “Con.”

 

As might be expected, the largest volume of reports on the study appeared in the first few days after its publication, which was promoted by a press release distributed nationally by a public relations news service and funded by an organization openly established to promote private school vouchers. News coverage tapered off significantly thereafter.

 

Among news stories that turned up during the search, 11 were rated as “Pro” and 10 as “Neutral” using the above criteria. A single news story was rated “Con.” Among commentaries, six were rated “Pro” and two “Con.” It should be noted that all three of the “Con” items appeared in a single publication, a newspaper in Palm Beach, Fla.
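The figures above are simple tallies over the coded items listed in Appendix A. For readers who wish to audit or extend the coding, the following minimal Python sketch shows the bookkeeping; the entries below are an illustrative subset, not the full Appendix A list.

    from collections import Counter

    # Illustrative subset of coded items from Appendix A: (genre, rating).
    # The full timeline in Appendix A supplies the complete list.
    coded_items = [
        ("news story", "Pro"),      # e.g., USA Today, Feb. 16, 2001
        ("news story", "Neutral"),  # e.g., The New York Times, Feb. 16, 2001
        ("news story", "Con"),      # The Palm Beach Post, Feb. 16, 2001
        ("commentary", "Pro"),      # e.g., Reason Magazine, Feb. 15, 2001
        ("commentary", "Con"),      # e.g., The Palm Beach Post, Feb. 20, 2001
    ]

    # Count ratings within each genre; run over the full Appendix A list,
    # this reproduces the totals reported above (e.g., 11 "Pro" news stories).
    tallies = Counter(coded_items)
    for (genre, rating), count in sorted(tallies.items()):
        print(f"{genre:<12} {rating:<8} {count}")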

 

As already noted, none of the mainstream print media that turned up in the search covered either of the academic articles that questioned the Greene report’s methodology and conclusions. Both articles were published after the wave of publicity surrounding the Greene report.


The Peer Review Survey

To examine the quality of research currently produced by various organizations and publications, the Center for Education Research, Analysis, and Innovation undertook a survey of peer review procedures employed by several such entities. The survey also inquired about media strategies and methods of making the public aware of research findings produced or published by these organizations and journals.

Methodology

Sixteen organizations or journals were selected for structured interviews. The content of each interview was the same, with appropriate changes made depending on whether the interview subject was the editor of a journal or a designated representative of a research organization. (See interview form, Appendix B.) Varying follow-up questions were asked to clarify responses to the initial questions.

Interview subjects were chosen to reflect a cross-section of publications and research organizations whose work bears on issues in education and education policy. (See lists of respondents and non-respondents, Appendix C.) Subjects included academic journals; academically based research organizations; and independent research organizations established primarily to influence policy-makers.

Results

Of the 16 organizations or publications selected for the survey, nine had provided responses by April 2, 2001.

The remaining seven did not respond to at least three contacts by telephone or e-mail, or else indicated they would not participate in the survey. It is hoped that this report can be updated in the future with responses from previously non-responding organizations or publications, and by extending the survey to other organizations or publications not previously contacted.

A table of the survey results from organizations that responded follows:[9]

Education Policy Analysis Archives (journal)
Peer review? Yes. Blind copies of submissions go to editorial board members for review and comments.
Media strategy: None. Some education journalists are subscribers.

Journal of Educational Evaluation and Policy Analysis (journal)
Peer review? Yes. Blind copies of submissions are sent to outside reviewers, selected by subject matter.
Media strategy: None. Education journalists are assumed to be subscribers.

Rand Corp. (research organization)
Peer review? Yes. Copies of submissions are sent to at least two outside reviewers for technical review and are then reviewed by two internal officials of the institution.
Media strategy: Varies with work product. May range from simply publishing on the Web to more formal press-notification strategies.

Consortium for Policy Research and Education (research organization)
Peer review? Yes. Reports are reviewed internally, then submitted to two or three outside reviewers for evaluation of technical matters, methodology, assumptions and other matters.
Media strategy: None.

Brookings Institution (research organization)
Peer review? Yes. Each submission is reviewed by three people selected on the basis of topic from inside or outside the institution.
Media strategy: Varies with work product. Includes public forums for press and policymakers, press releases, and paid advertising.

Heritage Foundation (research organization)
Peer review? No. Submissions are reviewed internally but not sent to outside reviewers.
Media strategy: Varies with work product. Some reports become the basis of events or press conferences; all reports are delivered to media lists kept by individual departments in the foundation, as well as to members of Congress and others.

Hoover Institution (research organization)
Peer review? No, except for work submitted by an outside researcher (a rarity).
Media strategy: Press releases, book tours and a public relations contractor.

Thomas B. Fordham Foundation (research organization)
Peer review? No, although sponsored research may be submitted to peer-reviewed journals after release.
Media strategy: Quarterly mailings to lists that include journalists; press events as warranted by specific projects.

Economic Policy Institute (research organization)
Peer review? Yes. Research is sent to at least three outside reviewers chosen for technical expertise in the subject, and may be sent to organizations and individuals whose interests bear on the subject.
Media strategy: Individual strategies are constructed for all research projects and may include press releases, conferences, and book tours.

Of the nine organizations responding, two were journals and seven were independent research organizations. Of the research organizations, three are generally viewed as non-ideological in nature (Rand, Brookings, and the Consortium for Policy Research and Education [CPRE]), while four are generally considered to reflect a political or ideological outlook (the Heritage Foundation, the Hoover Institution, the Thomas B. Fordham Foundation, and the Economic Policy Institute [EPI]). Heritage, Hoover and Fordham all may be fairly characterized as conservative, while the EPI may be fairly characterized as liberal.

Both journals, as well as the three non-ideological organizations, Rand, Brookings, and CPRE, all reported having mechanisms for peer review or outside review of research. Of the other four research organizations, only one, the Economic Policy Institute, reported mechanisms for outside peer review.

Media strategies could be found both in organizations with peer review and in those lacking peer review. The two journals responding to the survey lacked media strategies beyond simply assuming that education journalists received the publications. CPRE also reported no strategy targeting the general media. Other research organizations’ strategies varied, but whether categorized as ideological or non-ideological, and whether or not they required external review, their media strategies were considerably more sophisticated than those of the peer-reviewed journals.

Discussion

While this initial survey was limited in its scope, its results were not surprising. It is clear that at least three organizations whose reports on education subjects attract media attention – and in many cases have produced broad-based declarations of what does and does not work in education reform – have no process for conducting the kind of peer review and expert scrutiny of their work that is an important benchmark of social science research and that is a requirement for publication in journals such as Educational Evaluation and Policy Analysis or the American Educational Research Journal.

Meanwhile, the journals where the most rigorous policy analyses are published may convey their findings to the education press routinely, but they lack any formal strategies to draw their findings to the attention of opinion leaders and the wider media.

The danger posed by widely publicized non-peer-reviewed reports has been the subject of considerable discussion. Recent articles and essays addressing the issue include: “Oops, Sorry: Seems That My Pie Chart Is Half-Baked,” Patricia Cohen, New York Times, April 8, 2000; “Reckonings: How to Be a Hack,” Paul Krugman, New York Times, April 23, 2000; “They Blinded Me with Political Science: On the Use of Non-Peer-Reviewed Research in Education Policy,” Ed Muir, PS, December 1999; and “Without Peer,” Christopher Shea, Lingua Franca, April 2000.

This state of affairs might best be understood by analogy. Each week, news organizations publish articles about some of the newest findings of medical researchers. Among the principal sources for these articles are the two most influential medical journals, the Journal of the American Medical Association and the New England Journal of Medicine. Like virtually all other scientific journals, both publications require submissions to undergo peer review.

We do not routinely see the mainstream media publish unscrutinized claims for novel or “fringe” medical treatments or practices that have not been subject to such peer review. When such treatments are covered in news or feature stories, it is almost always in a context that includes at least some reference to the findings of peer-reviewed research pertaining to the topic at hand.

Yet such discrimination does not seem to be in evidence in coverage of education research. Its absence is reflected in the case study of coverage of the Greene paper. “Peer review” of the paper came in the form of two academic articles that were themselves subject to peer review. These were published three and four weeks, respectively, after the Greene report’s release. This is extraordinarily quick for an academic response; nevertheless, the two follow-up reports did not themselves receive media attention, except for a single article in Education Week.

There remain several avenues for further inquiry. First, the survey of peer review practices would benefit from a larger sample, extending its reach to additional respondents and encouraging non-respondents to participate. Second, these findings suggest two other potentially fruitful surveys, both directed at the news media themselves. One would be a review of all articles in select news publications, on broadcast news shows, or both, over a representative time period to establish the relative volume of coverage of education research from peer-reviewed and non-peer-reviewed sources. Another would be to survey editors and journalists to learn what standards they employ in assessing whether to cover particular research reports, and whether the presence of peer-review standards enters into that consideration.

Appendix A

Press Coverage of
“An Evaluation of the Florida A-Plus Accountability and School Choice Program” by Jay P. Greene and of the Scholarly Response to Greene

February 15, 2001 – April 11, 2001

Greene, Jay. (Feb. 15, 2001). “An Evaluation of the Florida A-Plus Accountability and School Choice Program.” Manhattan Institute. Program on Educational Policy and Governance. Available: http://www.manhattan-institute.org/html/cr_aplus.htm.

 

Feb. 15, 2001

Public Relations Service

“New Research Reveals Vouchers Motivate Florida Public Schools to Improve Academic Achievement.” PR Newswire Association, Inc.*** (Press release from the Milton and Rose D. Friedman Foundation, an organization supporting private school vouchers.)

Newspaper/Magazine Article

“Journal: Proof Competition Aids Public Schools.” The Atlanta Journal-Constitution (daily newspaper) Pro

Newspaper/Magazine Commentary

Lynch, Michael W. “How Vouchers Passed Their Florida Test.” Reason Magazine (monthly magazine)* Pro

Feb. 16, 2001

News Service

“Manhattan Institute Study Finds Voucher Threat Spurred Florida’s Worst Performing Schools to Do Better.” The Bulletin’s Frontrunner*** Neutral

Royse, David. “Study: Failing Fla. Schools Improved.” [Associated Press - AP Online]** Pro

Newspaper/Magazine Article

Billups, Andrea. “Study Backs Jeb Bush’s Voucher Program.” The Washington Times (daily newspaper)** Pro

Brown, Marilyn. “Study Says Fear of Vouchers Drives Fla. Schools to Improve.” The Tampa Tribune (daily newspaper)** Neutral

Flannery, Mary Ellen. “Educators: Vouchers Threat Didn’t Spur Schools’ Progress.” The Palm Beach Post (daily newspaper)** Con

Greenberger, Scott S. “Voucher Backers Tout Fla. Scores.” The Boston Globe (daily newspaper)** Neutral

Hegarty, Stephen. “Study Finds Voucher Threat Effective.” St. Petersburg Times (daily newspaper)** Neutral

Henry, Tamara. “Florida Schools Shape Up Amid Voucher Threat, Findings Could Boost Bush’s National Plan.” USA Today (daily newspaper)** Pro

Kelly, Pat. “Voucher Program Boosting Scores.” Bradenton (Florida) Herald (daily newspaper)*** Pro

Newcomb, Amelia & Rowan, Robin. “In Florida, Lessons on Vouchers.” The Christian Science Monitor (daily newspaper)** Neutral

Royse, David. “Study: Failing Schools Improved Twice as Much as Other Schools.” Tampa Bay Tribune (daily newspaper)** Pro

 

Royse, David. “Study Shows F Schools Make Greater Progress.” Tallahassee Democrat (daily newspaper) Pro (AP)

Schemo, Diana Jean. “Threat of Vouchers Motivates Schools to Improve, Study Says.” The New York Times (daily newspaper)** Neutral

Schouten, Fredreka. “Threat of Vouchers Does Work: State-Funded Survey Evaluates Failing Schools.” Pensacola News Journal (daily newspaper) Pro

February 18, 2001

Newspaper/Magazine Article

Schouten, Fredreka. “Study Says Voucher Threat Motivated Schools To Improve.” The Indianapolis Star (daily newspaper)*** Neutral

Manuel, Marlon. “Voucher Schools Fighting to Survive; Being at the Forefront of a National Debate Over Education Takes a Toll on Florida Community.” The Atlanta Journal-Constitution (daily newspaper)*** Neutral

February 20, 2001

Newspaper/Magazine Commentary

VerSteeg, Jac Wilder. “A Stretch to Vouch for Vouchers.” The Palm Beach Post (daily newspaper)*** Con

February 21, 2001

Newspaper/Magazine Commentary

 

Flicker, Aaron. “Good News From… Florida?” The Post (daily newspaper published by Ohio University, Athens, OH) via University Wire*** Pro

Greene, Jay P. “Bush’s School Plan: Why We Know It Will Work.” New York Post (daily newspaper) Pro

Education Press Article

Sandham, Jessica L. “Study Finds ‘Voucher Effect’ in Florida Test Gains.” Education Week, Vol. 20, No. 23, pp. 12, 15*** Neutral

February 22, 2001

Newspaper/Magazine Article

Lopez, Kathryn Jean. “Bush Ed Plan Rates an ‘A’: An Interview with Jay Greene.” National Review (bi-weekly magazine) Pro

Newspaper/Magazine Commentary

“Voucher Study Doesn’t Make Case for Vouchers.” The Palm Beach Post (daily newspaper)*** Con

February 23, 2001

Newspaper/Magazine Commentary

“Voucher Victories: Florida Program Motivates Public Schools.” The Daily Oklahoman (daily newspaper)*** Pro

February 27, 2001

Newspaper/Magazine Article

Henry, Tamara & Kasindorf, Martin. “Testing Could Be the Test for Bush Plan. The Debate: Are Exams the Way to Hold Schools Accountable for Student Achievement?” USA Today (daily newspaper)*** (Section on Greene voucher research) Pro

February 28, 2001

Newspaper/Magazine Commentary

Lynch, Michael W. “Vouchers Raise Scores: Challenging Florida’s Failing Schools.” The Washington Times (daily newspaper)*** Pro

March 1, 2001

Newspaper/Magazine Article

 

“The Wonderful Voucher Threat.” The New York Post (daily newspaper)*** Pro

March 4, 2001

Academic Publication

Camilli, Gregory & Bulkley, Katrina. “Critique of ‘An Evaluation of the Florida A-Plus Accountability and School Choice Program.’” Education Policy Analysis Archives, Vol. 9, No. 7. Available at: http://epaa.asu.edu/epaa/v9n7/

March 5, 2001

Academic Publication

Greene, Jay P. “A Reply to ‘Critique of “An Evaluation of the Florida A-Plus Accountability and School Choice Program”’ by Gregory Camilli and Katrina Bulkley in Education Policy Analysis Archives, Vol. 9, No. 7, March 4, 2001.” Available at: http://www.manhattan-institute.org/html/greenes_reply.htm

March 12, 2001

Newspaper/Magazine Commentary

Finn, Chester E., Jr. “Bush’s Education Semi-Reform: Don’t Open the Champagne Bottles Yet.” The Weekly Standard (weekly magazine)*** Pro

March 16, 2001

Newspaper/Magazine Commentary

“Senate Falls Short on School Reform.” The Atlanta Journal-Constitution (daily newspaper)*** Pro

March 19, 2001

Academic Publication

Kupermintz, Haggai. “The Effects of Vouchers on School Improvement: Another Look at the Florida Data.” Education Policy Analysis Archives, Vol. 9, No. 8. Available at: http://epaa.asu.edu/epaa/v9n8/

March 23, 2001

Academic Publication

 

Camilli, Gregory & Bulkley, Katrina. Review of: “A Reply to ‘Critique of “An Evaluation of the Florida A-Plus Accountability and School Choice Program”’ by Gregory Camilli and Katrina Bulkley in Education Policy Analysis Archives, Vol. 9, No. 7, March 4, 2001,” by Jay P. Greene. Available: http://www.uwm.edu/Dept/CERAI/edpolicyproject/cerai-01-11.html

March 28, 2001

Education Press Article

Sandham, Jessica L. “Second Study Questions Research Linking Voucher Threat to Gains.” Education Week, Vol. 20, No. 28, p. 22.*** Neutral

* Article not found through Lexis-Nexis/Education Abstracts but listed on the Manhattan Institute for Policy Research Web site

** Article found through Lexis-Nexis/Education Abstracts and listed on the Manhattan Institute for Policy Research Web site

*** Article found through Lexis-Nexis/Education Abstracts but not listed on the Manhattan Institute for Policy Research Web site

How the Time Line Was Constructed

A search was performed for relevant articles using the Lexis-Nexis database, Education Abstracts, and the Manhattan Institute for Policy Research Web site. Articles from Education Policy Analysis Archives and those posted on the CERAI Web site (http://www.uwm.edu/Dept/CERAI/) were known about prior to searching and were used to help form search terms.

Lexis-Nexis, All News Database

Period Searched: February 15, 2001 – April 11, 2001

Search Terms:
(Greene) and (voucher*) and (date > Feb. 14, 2001)
(Greene) and (date > Feb. 14, 2001)
(Camilli) and (date > Feb. 14, 2001)
(Bulkley) and (date > Feb. 14, 2001)
(Kupermintz) and (date > Feb. 14, 2001)

Education Abstracts

Search Terms:
(Greene) and (voucher) and publication year = 2001
(Greene) or (voucher) and publication year = 2001
(Camilli) and publication year = 2001
(Bulkley) and publication year = 2001
(Kupermintz) and publication year = 2001

The Manhattan Institute for Policy Research Web Site, “An Evaluation of the Florida A-Plus Accountability and School Choice Program - What the Press Said…” Available: http://www.manhattan-institute.org/html/cr_aplus_press.htm [last visited April 5, 2001]
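In effect, the timeline is the union of the date-bounded keyword searches above and the items listed on the Manhattan Institute page, with duplicates collapsed. The following minimal Python sketch illustrates the date-and-keyword filtering step; the record layout and the sample data are hypothetical stand-ins for results exported from the databases.

    from datetime import date

    # Hypothetical exported records: (publication date, source, searchable text).
    records = [
        (date(2001, 2, 16), "AP Online",
         "Jay Greene study says voucher threat spurred failing Florida schools"),
        (date(2001, 2, 14), "Example Paper",
         "A voucher story published before the Greene report's release"),
    ]

    TERMS = ("greene", "voucher", "camilli", "bulkley", "kupermintz")
    START, END = date(2001, 2, 15), date(2001, 4, 11)

    # Keep records inside the search window whose text matches any search term;
    # the actual searches combined terms as shown in the queries above.
    timeline = sorted(r for r in records
                      if START <= r[0] <= END
                      and any(term in r[2].lower() for term in TERMS))

    for pub_date, source, text in timeline:
        print(pub_date.isoformat(), source, "-", text)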

Each mass media article in the search was evaluated for the tone and slant of its account of the Greene study.

 

·         Items designated Pro reported the study’s findings uncritically, with little or no comment from authoritative sources calling its findings into question.

·         Items designated Neutral appeared to balance discussion of the study’s findings with other points of view, including critical assessments.

·         Items designated Con consisted largely of comments and arguments calling the findings into question.

Academic articles, including the original report, the two responses to it, Greene’s reply to one response, and the subsequent rejoinder, are listed but were not rated as Pro, Con, or Neutral.

Appendix B

The interview instrument

In the questions below, paired wordings separated by a slash were used according to the interview subject: the first alternative (set in bold in the original instrument) when the subject was a research organization, and the second (set in italic) when the subject was a journal. Otherwise, questions were phrased identically.

1) What are your organization's/publication's written guidelines pertaining to methodology and standards for research going out under your imprimatur/in your publication?

2) Topic selection

Institutes/Foundations: How do you decide on, and set priorities for, those topics you will research?

Publications: How do you decide whether a topic falls into your publication's sphere of interest?

3) Qualifying researchers

What sort of credentials, backgrounds, or qualifications do you look for in people submitting research for your consideration (or, where research is done by staff, in people whom you hire to conduct research)? (For journals only: Would you accept well-done research that passed peer review from someone without a doctorate or without an academic affiliation?)

4) Evaluating research design

What are your standards for sound research -- what must be present in the research design for it to be published under your organization's name/in your publication's pages?

5) What is your process for external or peer review of research?

6) How do you disseminate the findings that you're reporting? Do you have amedia strategy? What is it?

Appendix C

Interview documentation

Organizations and publications that participated (in alphabetical order):

Brookings Institution

Interview March 15, 2001, with Tom Loveless, director of the Brown Center on Education Policy.

Consortium for Policy Research and Education

Interview March 29, 2001, with Robb Sewell, communications manager.

Economic Policy Institute

Interview March 21, 2001, with Eileen Appelbaum, director of research.

Education Policy Analysis Archives

Interview March 21, 2001, with Gene V Glass, Editor.

Heritage Foundation

Interview March 29, 2001, with Thomas Timmons, Director of Publishing Services.

Hoover Institution

Interview March 29, 2001, with Richard Sousa, Associate Director.

Journal of Educational Evaluation and Policy Analysis

Interview March 19, 2001, with Barbara Schneider, Editor.

RAND Corporation

Interview March 20, 2001, with Dominic Brewer, Director of Education Research.

Thomas B. Fordham Foundation

Interview March 21, 2001, with Marci Kanstoroom, Research Director.

Organizations and publications that did not participate (in alphabetical order):

American Educational Research Association

E-mail on March 19, 2001, from William J. Russell, Executive Director, declining to respond before the AERA national meeting.

Center for Education Reform

Interview scheduled for March 29, 2001, with Chris Braunlich, vice president for policy and communications, who declined to continue after hearing the first question, and said Director Jeanne Allen would call; no further response as of April 5, 2001.

Educational Researcher

Voice mail message March 29, 2001, from feature editor Evelyn Jacob declining to participate unless a copy of the survey was sent by mail; copy of survey sent March 29, 2001, by e-mail; no response as of April 5, 2001.

Education Matters: A Journal of Opinion and Research

No response to e-mail request sent March 20, 2001, and again on March 28, 2001.

Evaluation Center at Western Michigan University

No response to e-mail request sent March 19, 2001; telephone call March 28, 2001, was referred to Editor Sally Vietor; no responses to voice mail messages left March 28 and March 29, 2001.

Manhattan Institute

No responses to voice-mail messages left March 21, March 22, and March 28, 2001, with Henry Olson, Executive Director, Center for Civic Innovation.

Urban Institute

Survey requested for Public Affairs Director Susan Brown via e-mail March 14, 2001, and sent the same date.

E-mail message March 30, 2001, from Karen McKenzie, public affairs assistant, declining to participate.


NOTES


[1] Greene, J.P., An Evaluation of the Florida A-Plus Accountability and School Choice Program. New York: The Manhattan Institute, Feb. 15, 2001. Available at: http://www.manhattan-institute.org/html/cr_aplus.htm

[2] Ibid., Executive Summary.

[3] Ibid.

[4] Ibid.

[5] See Appendix A.

[6] Henry, Tamara, “Florida schools shape up amid voucher threat: Findings could boost Bush’s national plan,” USA Today, Feb. 16, 2001.

[7] Camilli, G., and Bulkley, K., “Critique of ‘An Evaluation of the Florida A-Plus Accountability and School Choice Program,’” Education Policy Analysis Archives, Volume 9, Number 7, March 4, 2001. Available at: http://epaa.asu.edu/epaa/v9n7/

Kupermintz, H., “The Effects of Vouchers on School Improvement: Another Look at the Florida Data,” Education Policy Analysis Archives, Volume 9, Number 8, March 19, 2001. Available at: http://epaa.asu.edu/epaa/v9n8/

[8] See Appendix A.

[9] Data compiled from structured interviews conducted between March 13, 2001, and March 30, 2001. See Appendices for details.