ALTA Report Published

We are proud to present the report of the Academic Life Total Assessment, the culmination of one year of work. Please email feedback to yaro@princeton.edu.

Click here to download the ALTA Report optimized for quick web viewing (2.3MB)

Click here for a high-quality printable copy (5.7MB)

Want a summary of our recommendations?  See this post.

Highlights from the Presentation

On Monday, March 26th, 2012, ALTA presented to the Council of the Princeton University Community (CPUC).

ALTA Presenting at the March 26 CPUC Meeting. Photo courtesy of The Daily Princetonian.

Here is the presentation (build-by-build):

Below are the suggestions presented:

Beginning of the Semester

Suggestion: Registrar and Committee on the Course of Study permit faculty to publish additional information about classes on Course Offerings nearer to the start of the semester, and professors to make syllabi available to students on Blackboard before the semester begins.
Benefits: Faculty can more easily reach interested students, and students gain access to more reliable and more complete information about classes and professors. Decreases the need to shop classes (which is disruptive for students and faculty).

Suggestion: Registrar make official course evaluations more accessible by adding direct search functionality (search by course number and by professor).
Benefits: Better meet student needs so students do not rely on less reliable, unofficial resources outside of the University’s purview.

Suggestion: Dean of the College establish and facilitate a university-wide common deadline for selective classes’ applications.
Benefits: Reduce confusion among students about application deadlines for selective classes. Create potential for efficiency.

Suggestion: Professors of 3-hour seminars coordinate a universal 10-minute break at 2:50 p.m. during shopping period.
Benefits: Give students shopping courses an appropriate opportunity to join or leave lengthy seminars without being disruptive.

During the Semester

Suggestion: Professors giving exams limit new material covered immediately before the exam, and professors not giving exams limit assignments, if possible, during the common “midterm exam” period in week 6.
Benefits: Reduce factors that lead to “cramming.” Give students time to take advantage of review resources (e.g. office hours) and re-engage with past material.

Suggestion: Professors give students specific goals for reading and control the amount of assigned reading to maximize completion.
Benefits: Goals or guidance that give a purpose for reading help students focus and engage with the material. More students will be prepared for class and precept. Reduces “cramming” of unread material later.

Suggestion: Departments take corrective and proactive measures to reduce variability in precepts of the same course.
Benefits: Create a more equitable experience across precepts, which improves student satisfaction.

Suggestion: Instructors design precepts to focus on current material more than on learning new material.
Benefits: Students indicate that they prefer this use of precept time, so adjusting the focus can help increase student engagement and satisfaction.

Suggestion: Professors allow collaboration on assignments where possible.
Benefits: Increase student engagement with material by teaching to or learning from peers. Expand opportunities for students to discuss material outside classes or precepts.

Suggestion: Professors make collaboration policies explicit (e.g. on the syllabus).
Benefits: Avoid confusion, which may lead to violations of professors’ expectations or to students not making use of peer resources.

Suggestion: McGraw facilitate pairing of students into study groups.
Benefits: Students do not need to have friends in the course to take advantage of the benefits of collaborative learning.

Suggestion: Departments provide more preparation for students conducting research for independent work.
Benefits: Students indicated in the survey that they wanted more help learning how to conduct research for independent work, so this will meet a demonstrated student need.

Suggestion: Departments help students select their advisers for independent work.
Benefits: Students indicated in the survey that they wanted more help learning about professors in order to select an adviser for independent work, so this will meet a demonstrated student need.

Suggestion: McGraw, the Writing Center, and Residential College Tutoring use our data to improve and add services to meet student needs.
Benefits: Meet student needs that were revealed or explained by this project.

End of the Semester

Suggestion: USG create a web application to relay student feedback to professors and instructors early in the semester.
Benefits: Formally connect students and faculty during the semester so instructors have better information about their students.

Suggestion: Registrar and Dean of the College expand the parameters evaluated by the official course evaluations.
Benefits: Discourage students from using outside resources the University does not control. Make official course evaluations more helpful for students who rely on them to select courses. Improve the response rate.

Suggestion: Registrar prominently inform students how course evaluations will be used.
Benefits: Let students better tailor their reviews to the intended audiences. Improve the response rate.

Suggestion: Faculty design curricula to distribute evaluation (e.g. exams, papers) more evenly throughout the semester and decrease the weight of final exams.
Benefits: Less incentive for students to “cram” for midterm and final exams or papers, meaning more student engagement throughout the semester. Improve student learning and retention.

Suggestion: Dean of the College revise the final exam overcrowding policy so students with multiple exams in 24 hours are also eligible to reschedule.
Benefits: Reduce the problem of overcrowded exam schedules by making use of existing mechanisms.

Suggestion: Faculty and Dean of the College change the Pass/D/Fail policy so students can see their final grades and choose to rescind the P/D/F status to keep grades they earned.
Benefits: Give students more feedback on P/D/F classes. Provide more incentive to earn top marks in classes students P/D/F. Increase student engagement in classes.

Suggestion: Faculty provide feedback to students without citing grading policies as justification for the grades given.
Benefits: Provide more helpful feedback so students can improve in the future. Avoid stoking anxiety about the grading policy among students.

Gearing up for our March 26 Presentation

The ALTA group is locked and loaded for our presentation the Monday after spring break.  Please join us at 4:30 p.m. on Monday, March 26th in Friend 101 for the presentation.  We are excited to share our suggestions and hear the community’s feedback.

Ironically, if some of our suggestions seem obvious to students, we’ve done our job of representing student interests.  We care more about being rigorous in supporting our suggestions than about being original for its own sake.  Our goal is to make suggestions backed by data and vetted by faculty and administrators that have a good chance of happening and bearing positive results for all stakeholders.

In addition, we will document student opinion and behavior on a variety of topics that have not previously or recently been explored.  How much reading do students actually do?  What matters most to students when selecting classes?  Why do students select the Pass/D/Fail option?  This is information that can lead to better-informed decisions by students, faculty, and administrators.

This presentation will be an abridged overview of our findings, but we hope it will jump-start the campus conversation about academics.  We are committed to an iterative process whereby we tailor our suggestions to the feedback we receive from the campus community.  Therefore, we will incorporate the feedback prompted by this presentation into our written report, which we will publish about  3 weeks after the presentation.  In the presentation, we will also discuss how these suggestions will be seen through to fruition.

Here are a few teaser slides from the presentation:

Almost 1400 responses on the first email

By noon today, we’ve had almost 1400 responses.  Before the survey launched, I felt that 1200 responses would be the bare minimum we would need for our data to be credible, so we’ve definitely exceeded that.  The graph below shows the comparison with other surveys again; I think it is useful because it puts our numbers in perspective.

ALTA has nearly eclipsed COMBO II’s entire response total on the first email alone.  I expect that today we’ll see the response rate flatten out significantly, though, which would make tomorrow’s scheduled reminder email well-timed.  If the rate of responses doesn’t decrease significantly by tomorrow, it might make sense to postpone the reminder by a day or two.  Perhaps even more people are still planning to take the survey over the weekend but are waiting to finish an exam.  Either way, a reminder email might be convenient so students do not have to dig back for the email with the link.

If you apply the geometric series pattern with a common ratio of 1/2 to the next three reminder emails, it “predicts” that the emails will bring in 700, 350, and 175 responses, respectively.  This would mean a total of 2625 responses!  That’s about 50% of the school.
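
For anyone who wants to check that arithmetic, here is a minimal sketch of the projection, assuming the halving pattern holds exactly and rounding the first-email total to 1400 (the variable names are just for illustration):

```python
# Sketch of the projection above: assume each reminder email brings in about
# half as many responses as the previous email (a geometric series, ratio 1/2).
first_email = 1400    # approximate responses from the initial email
ratio = 0.5           # assumed drop-off between successive emails
reminders = 3         # reminder emails still scheduled

estimates = [round(first_email * ratio ** k) for k in range(1, reminders + 1)]
total = first_email + sum(estimates)

print(estimates)      # [700, 350, 175]
print(total)          # 2625, about 50% of the school as noted above
```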

I don’t think any survey, whether administered by the USG or the university administration, has achieved such a response rate before.  Therefore, perhaps it’s too optimistic to expect that 1/2 of campus will take the survey (that would be amazing, though!), but I think it’s safe to expect over 2000 responses based on what we’ve seen.  That would be a very strong response rate and contribute significantly to the survey’s credibility and the ALTA project’s effectiveness.

Strong student support for the ALTA project, as implied by the record-breaking response rate to the survey, shows that students want to be proactive about their academic experience.  When ALTA brings forward suggestions, having this student support (“mandate,” if you want to be adventurous with your wording) will make our report much more compelling.

About 1400 responses on the first email! The two plateaus in the ALTA data are from the overnight periods. Responses are starting to pick up again for the day, as you can see from the curve hooking upward again.

1200 Responses in 24 Hours!

Wow!  That’s all I can say.  The response rate for ALTA has broken records.

Within the first 15 minutes of launching the survey, 180 students completed it, and about 130 more completed it within the next 15 minutes.

Check out the following graph:

The response rate for ALTA within 24 hours of launch. If previous survey response patterns hold true, about 2200 students may have responded by the end of the 2-week collection period.

We reached 1200 responses in 24 hours.  For COMBO III, it took 11 days to reach this number of responses (this survey was, however, released over the summer — so this is not surprising).  For a slightly better comparison, COMBO II was released in April 2009 and attained 1600 total responses.  It took 3 days for COMBO II to reach 1200 responses.  This is very exciting!

Here is how this response rate compares to the other surveys:

ALTA has had the fastest response rate of any USG survey.

Reminders are scheduled for Saturday (1/21), Wednesday (1/25), and Monday (1/30).  These will only be sent to those who have not completed the survey yet.  We expect that each reminder will net about half of the number of responses of the previous email.  Therefore, we expect to receive 600, 300, and 150 responses, respectively, for the upcoming reminders.  This would put the total number of responses over 2200, the highest of any survey the USG has ever conducted, not to mention one of this length.

Thank you to everyone who’s taken time to take the survey!  We’re eager to make it worth your while.  We’ll begin analyzing the data after we stop collecting responses, and then we’ll be hard at work talking through our proposals and conclusions with our advisory board and other stakeholders.

If you have any feedback that you did not or could not contribute through the survey, please feel free to get in contact with us via the “Contribute” page.

For comparison, here are the response rates for COMBO II and COMBO III by themselves. Notice how each email reminder garners about half the number of responses as the previous reminder.  For COMBO II, this meant 1000 and then 600 responses for each email.  For COMBO III, this meant 1100, 500, and 270 responses for each email.

COMBO II was administered in April 2009. There was only one reminder email sent for this survey. The data is "lumpy" because kiosks set up around campus served food to those who completed the survey, meaning there were publicity methods other than email.

COMBO III collected data in June and July of 2011. Because it was administered over the summer, responses came in more slowly, but the overall response count surpassed that of its predecessor. There were two email reminders, the results of which are clearly visible on this graph.

Profiles Complete, Survey Underway

It’s been a long time since the last post, but we’ve been working steadily over the last semester on the profiles section, which is now complete.

We profiled 9 students.  These “profilees” represent five classes, from graduates in the class of 2011 through freshmen in the class of 2015.  The profilees are/were involved in many different academic and extracurricular pursuits on campus.  It is impossible to claim to represent the full breadth of students on campus with just 9 students, but we hope these students will complement the anonymous data we are about to collect.  It’s difficult to relate to anonymous data, and it’s impossible to draw conclusions from anecdotes.  Together, the profiles and the survey data (see below) should make the final ALTA report relatable and actionable.

The students wrote the profiles themselves using general guidelines we provided.  Each profile underwent weeks or even months of revision, with ALTA members and profilees exchanging drafts.  In the profiles, they describe their academic experiences, including issues such as how they made major decisions and how they balance their demanding schedules. For example, one profilee explains how she intentionally chose to fall behind in her reading so she could spend more time making friends in her first semester on campus.  Another profilee explains how living in Forbes made missing lecture sometimes unavoidable.  Yet another explains how she breaks her to-do list into items that take only 15 minutes each so she can make the best use of small gaps in her schedule.

A snapshot of one of the profiles

Beyond the written profiles, each student provided us with a minute-by-minute record of their activities for a week, including everything from when they worked on assignments to when they slept.  This information is graphically represented on a calendar, giving the reader a chance to understand how each profilee spends his or her time.

An example calendar from the profiles section

Now that the profiles document is complete, the survey is moving forward in full force.  It will launch on Friday or Saturday of this week, after we have a small test group take the survey and give us feedback.

Student Profiles Underway

The ALTA committee has begun one of our major undertakings — profiling undergraduate students.  We have two goals for this portion of the project:

  1. Provide faculty and administrators with a candid, inside perspective on undergraduate academic life that will be helpful for making decisions that affect undergraduates
  2. Based on what we learn from profiling these students, motivate the other sections of the report

We selected a handful of students pseudorandomly — using a simple random sample of students at first to avoid overrepresenting those in our friend circles, and then by hand to ensure breadth and diversity in our selection.  In terms of diversity, some of our criteria included class year, degree, department, gender, hometown (and home country), activities, and affiliations.
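
As a rough illustration of that first, random stage, here is a minimal sketch; the roster file, column names, and sample size are hypothetical, and the real selection also included the hand-curated second pass described above:

```python
import csv
import random

# Hypothetical sketch: draw a simple random sample from a student roster so the
# initial pool of profile candidates is not biased toward our own friend circles.
# The file name, column names, and sample size are made up for illustration.
with open("undergraduate_roster.csv", newline="") as f:
    students = list(csv.DictReader(f))

random.seed(0)                            # fixed seed makes the draw reproducible
candidates = random.sample(students, 30)  # simple random sample, no replacement

for s in candidates:
    print(s["name"], s["class_year"])     # second stage: curate by hand for breadth
```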

Through these profiles, we will identify commonalities among Princetonians and also comment on the differences.  As for the commonalities, we will explain how our profiled students manage the pressures of Princeton and make important academic decisions.  In considering the individual differences, we will demonstrate how different everyone’s experience at Princeton can be and how this impacts a student’s academic experience.
