
Saxa Politica: Life report lacks data

March 15, 2012


Last April, the GUSA Executive commissioned the 2012 Report on Student Life and allocated a large portion of their budget to the project. Although they call the resulting 73-page report “rigorous” and “empirical,” the report’s findings and methods are dubious at best. From withheld data to undisclosed conflicts of interest to a misjudged scope, there are many reasons why the report should be read with severe reservations.

GUSA charged a three-member Student Life Report Committee with the research, with nine additional co-authors writing individual sections. The committee sponsored nine comparative trips to other universities, had access to numerous university surveys, solicited comments by school-wide blast email, and supposedly conducted numerous interviews with stakeholders on campus. The report, published in February, says the committee engaged in a “rigorous and sweeping research process” to outline student engagement on campus. Yet their year-long research process still yielded a report riddled with flawed depictions of student life.

The authors of the report do not publish the data they used. While they do analyze student life and draw conclusions from their analysis, very few of these conclusions are supported by data. The report contains no data tables from their surveys, no summaries of their comparative trips, and no transcripts of their interviews. If you, the reader, want to take anything away from this report, you’ll have to trust the authors that their methods were rigorous and that their conclusions are warranted.

And there are two more reasons why you shouldn’t.

First, the report does not disclose any conflicts of interest. For instance, there is a section on why the Hoya going independent is so critically important for the University as a whole, but no disclosure that co-author Lauren Weber sits on the Hoya’s board of directors. And although the co-authors are credited as section authors, the report never says who wrote which section.

Second, if the authors are going to call the report empirical (which they do), then on principle no one should have to trust them personally; the methods should speak for themselves. What’s the point of an official report if it consists primarily of the unsubstantiated perceptions of a group of students? I can get that from a campus editorial or Vox comments (oh yeah, and the report cites Hoya editorials, not news stories, twice). In my perfect world, a “report” would have more data than recommendations, but apparently the authors have the opposite perspective.

As a case study, let’s take the report’s most basic claim: that student engagement and happiness go hand in hand. “Using [rudimentary empirical models], we found evidence of a positive, statistically significant relationship between student satisfaction and various measures of engagement—number and range of activities that a student is involved in, the administration’s responsiveness, faculty advising outside the classroom, etc” (page 9). Since none of the data behind this claim appear in the report, I can’t tell whether the authors are misinterpreting something or making it up. Either way, I think they’re wrong. In my experience both reporting on and participating in many of the organizations discussed in the report, the relationship between engagement and happiness is more complicated than the positive correlation they allege. Based on the number of student-life burnouts I’ve met on campus (call it my own anecdotal survey), the most jaded and cynical students tend to have been the most involved. So how do I reconcile my perceptions with the conclusions in the report? I can’t, because the authors withhold their data. The discussion ends there.

The report has more problems than withheld data and undisclosed conflicts of interest. Even its scope is misjudged: the authors become so bogged down in short-sighted bureaucratic reform that they neglect both the basics and the big picture. First, they never define their terms: what exactly is engagement? Nor do they attempt to establish a baseline picture of student life as it exists. Some helpful information would have been, “How many students are engaged in student life?” or “To what extent are they involved?” A report that claims to discuss student engagement but does not address these questions cannot be taken seriously.

Regarding the big picture, the authors say they tried to view student life through an institutional lens, but they miss several key institutional policies. For instance, access to benefits is mentioned only once, in passing. Access to benefits is a prerequisite for student engagement on campus, so it deserves real discussion.

As I’ve said in previous editions of this column, I am a huge proponent of self-study and evidence-based decision-making. But trying to pass off a report that is severely lacking in data, replete with undisclosed conflicts of interest, and confused about its own scope does not help. A good report should be transparent, rigorous, and filled with data. The 2012 Report on Student Life is none of these.

Check Ryan’s data at rbellmore@georgetownvoice.com



1 Comment
Commenter

I have to agree with Ryan here. But this is not a fatal flaw; I see no reason why the authors can’t release the data (expunged of any sensitive identifying material). In fact, it could help others in the community come to even better insights that the team might not have thought of.