This Blog is Ranked #1

Once again, college ranking season is upon us.  Timed for when high school juniors and their parents are starting to focus on college admissions for the following year, a whole host of magazines, newspapers, and websites publish their annual “rankings” of the “top” institutions of higher learning.

What are you – parent or student – supposed to do with all these opinions?  How do you know which college is the best, or, more important, the best for your student?

When buying a washing machine, many consumers start with Consumer Reports magazine.  Is there a “Consumer Reports” rating authority for colleges?

Alas, no.  Instead of one possibly authoritative source, there are well over a dozen contenders.

Start with traditional news media organizations that publish college rankings, including:

U.S. News and World Report:  https://www.usnews.com/best-colleges (the 800 lb. gorilla of college ratings)

The Wall Street Journal/Times Higher Education of London:  http://graphics.wsj.com/image-grid/college-rankings-2018/?mod=djmc_wsjcollegeranking_search_092617&ef_id=WTGbrgAAAH26bSoC:20171028182719:s

The Economist magazine:  https://www.economist.com/blogs/graphicdetail/2015/10/value-university

Forbes magazine:  https://www.forbes.com/top-colleges/#3fb042601987

Money magazine:  http://time.com/money/best-colleges/rankings/best-colleges/

Kiplinger:  http://www.kiplinger.com/article/college/T014-C000-S002-kiplinger-s-best-college-values-2017.html

Washington Monthly magazine:  https://washingtonmonthly.com/2017college-guide


You can also consult websites that publish their own rankings, such as:

Payscale:  https://www.payscale.com/college-salary-report

The Alumni Factor:  https://www.alumnifactor.com/node/5836

College Factual:  https://www.collegefactual.com/rankings/

College Confidential:  http://www.collegeconfidential.com/admit/college-rankings/

Niche:  https://www.niche.com/colleges/rankings/methodology/


Let us not forget guidebooks.  Some are specialized, such as “Colleges That Change Lives,” which profiles 40 schools.  Here, however, I am referring to the largest publishers, those whose guidebooks profile 300 or 400 of the “top” colleges in the nation.  Although these guidebooks, such as Fiske and the Princeton Review, do not rank colleges, a college’s inclusion in their pages is an endorsement of sorts, or at least a culling from the herd of over 2,500 four-year institutions of higher learning.  These guidebooks also include specialized lists, such as private and public university “best buys”.

Why are there so many lists?  For the most part, money is the motivator.   The magazines gain subscribers and, for those readers who find them on the Internet, advertising dollars.  The guidebook companies sell more books.  Which book would you buy:  “382 Colleges” or “The Best 382 Colleges”?  Princeton Review chose the latter title for its book:  https://www.princetonreview.com/college-rankings/best-colleges.

Another reason for the plethora of lists is that rating any institution, including a college, depends upon what is deemed most important to the reader.  Consider the large number of criteria available:

  1. School resources, including size of the endowment, availability of research equipment and funding for undergraduates.
  2. Selectivity – how difficult it is to win admission.
  3. Affordability – usually a combination of price (tuition, room and board, fees) and availability and generosity of financial aid.
  4. Academic record of accepted students.
  5. Graduation rates, at 4 and 6 years.
  6. Student retention rates after freshman year.
  7. Reputation of the faculty.
  8. Faculty’s teaching ability.
  9. Student-professor ratio.
  10. Alumni contributions, both in percentage giving and amounts donated.
  11. Student “engagement” with professors.
  12. Student satisfaction with the college.
  13. Projected earnings for graduates; sometimes expressed as a ratio with tuition to arrive at an ROI (return on investment) for each college.
  14. Recruitment of disadvantaged students (e.g., race, income).
  15. College emphasis on service by its students (e.g., community service requirements).
  16. Student loan repayment rate.

Thus, determining “the #1 college” depends upon what information is deemed important.
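Even a single, seemingly objective criterion hides judgment calls.  Take the ROI in criterion 13: the number can double depending on whether “price” means the sticker price or the net price after financial aid.  Here is a minimal sketch in Python – all figures are invented for illustration:

```python
# Minimal sketch of the ROI idea in criterion 13: early-career earnings
# relative to the total cost of a degree. All figures are invented.

def simple_roi(median_earnings: float, annual_cost: float, years: int = 4) -> float:
    """Return the ratio of median early-career earnings to total cost."""
    return median_earnings / (annual_cost * years)

# Same hypothetical school, two defensible definitions of "cost":
print(round(simple_roi(60_000, 65_000), 2))  # sticker price: 0.23
print(round(simple_roi(60_000, 30_000), 2))  # net price after aid: 0.5
```

A ranking built on one definition of “cost” will disagree sharply with a ranking built on the other – and both definitions are defensible.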

The key to using these lists is to understand:

  1. What criteria are used, and how those criteria are weighted, in arriving at the ranking.
  2. Whether the data used is sufficient to support the conclusion drawn from it.
  3. Whether the ranking relies on data which is subject to misinterpretation or even fraud.
  4. Whether the criteria are relevant to your student’s interests.


A few examples of “best practices” and, well, “less than best practices,” follow.


What criteria are used, and how are they weighted?

The most serious problem arises when the ranking is done in a “black box”, where only the ranking service knows what it considers important.

Consider the explanation offered by Niche (formerly College Prowler) concerning how it creates college rankings (https://www.niche.com/colleges/rankings/methodology/):

The Niche 2018 College Rankings are based on rigorous analysis of academic, admissions, financial, and student life data from the U.S. Department of Education along with millions of reviews from students and alumni. Because we have the most comprehensive data in the industry, we’re able to provide a more comprehensive suite of rankings across all school types.

Very impressive, if a bit vague concerning exactly what that data is, and how much of it is relevant to ranking colleges.  However, we can break it down.  The Department of Education collects a trove of data from colleges – the complete dataset is over 200 megabytes – and makes it public.  Most ranking services use this data, and sites such as www.collegedata.com (highly recommended) compile and present it in usable form.  See https://www.collegedata.com/cs/main/main_choose_tmpl.jhtml.

Niche’s claim to fame appears to be that it combines some of that data with its proprietary student survey data.  Confusion results when it explains how it uses that data (emphasis added in bold).

With clean and comparable data, we then assigned weights for each factor. The goal of the weighting process was to ensure that no one factor could have a dramatic positive or negative impact on a particular school’s final score and that each school’s final score was a fair representation of the school’s performance. Weights were carefully determined by analyzing:

How different weights impacted the distribution of ranked schools;

Niche student user preferences and industry research;

After assigning weights, an overall score was calculated for each college by applying the assigned weights to each college’s individual factor scores. This overall score was then assigned a new standardized score (again a z-score, as described in step 3). This was the final score for each ranking.

Yes, but what factors – criteria – were used, and how were they weighted?  Niche does not say.  For example, if Niche weighted “reputation of faculty” at 90%, then the rankings would be skewed heavily in favor of prestigious schools.
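To make that concrete, here is a minimal sketch of the weighted z-score mechanics Niche describes (standardize each factor, apply weights, sum).  The schools, factors, and weights below are invented for illustration; the point is that shifting weight from one factor to another reverses the rank order:

```python
from statistics import mean, stdev

# Hypothetical raw factor scores for three schools (invented for illustration).
schools = {
    "Alpha College":   {"faculty_reputation": 95, "affordability": 40},
    "Beta University": {"faculty_reputation": 70, "affordability": 85},
    "Gamma State":     {"faculty_reputation": 60, "affordability": 90},
}

def z_scores(values):
    """Standardize values to z-scores (mean 0, standard deviation 1)."""
    m, s = mean(values), stdev(values)
    return [(v - m) / s for v in values]

def rank(weights):
    """Order schools by the weighted sum of their per-factor z-scores."""
    names = list(schools)
    standardized = {f: z_scores([schools[n][f] for n in names]) for f in weights}
    totals = {
        n: sum(weights[f] * standardized[f][i] for f in weights)
        for i, n in enumerate(names)
    }
    return sorted(totals, key=totals.get, reverse=True)

# Weight faculty reputation at 90% and Alpha wins; flip the weights and it finishes last.
print(rank({"faculty_reputation": 0.9, "affordability": 0.1}))
print(rank({"faculty_reputation": 0.1, "affordability": 0.9}))
```

Same data, same arithmetic, opposite “#1” – which is why undisclosed weights matter so much.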

In contrast, many ranking sites are transparent about how they use such data in arriving at their rankings.  See e.g., https://www.usnews.com/education/best-colleges/articles/how-us-news-calculated-the-rankings (U.S. News and World Report – listing factors and the weight assigned to each); https://www.economist.com/blogs/graphicdetail/2017/08/graduate-earnings (the Economist magazine factors and weights used in its ratings for British universities).  If we cannot discern what information underpins rankings, then rankings are not very helpful.

Further, Niche makes explicit what I suspect other rankings purveyors do quietly – it takes a “second look” at the data to make sure that it looks “right” before publishing its rankings.  Here is the quote from above, with different emphasis added in bold:

With clean and comparable data, we then assigned weights for each factor. The goal of the weighting process was to ensure that no one factor could have a dramatic positive or negative impact on a particular school’s final score and that each school’s final score was a fair representation of the school’s performance. Weights were carefully determined by analyzing:

How different weights impacted the distribution of ranked schools;

Niche student user preferences and industry research;

That certainly looks like Niche “tried out” different weightings for each of the unnamed criteria, and then changed those weights if the resulting rankings did not look “right”.  The last sentence even suggests that it adjusts its ranking to conform to what other ranking services report (Niche analyzes student preferences and “industry research”, i.e., how other college insiders rank the schools).  That level of subjectivity is understandable – sort of applying a “smell test” to the results – but it does not add confidence that the weighting is completely objective.  It also limits the possibility of “uncovering hidden gems”.  What is the point of rankings if a school cannot score at the top unless it “looks the part”?

The lesson here is that before relying upon a list, understand exactly what criteria are being used, and how they are being weighted.


Is the data sufficient to support the measurement? 

When you step on a scale, the datum that stares you in the face, no matter how unpleasant, is almost certainly sufficient to determine how much you weigh.  However, if three people step on the scale – one at a time – the average of their weights will be insufficient to determine the average weight of the population of the United States.

Some ranking criteria require complete datasets to be relevant, and those datasets are often very difficult to obtain.  For example, many surveys measure “student outcomes,” usually through a proxy such as average earnings after 1 year, 5 years, etc.  Unfortunately, when schools survey graduates about their employment, often only the graduates with “good news” to report answer.  Would you be eager to tell your alma mater that you are unemployed?  And even many of those with good news to report simply ignore the surveys – perhaps out of fear that a letter from the development department asking for funds will follow.  (On a personal note, for the last 38 years, UCLA has been able to track me to a new address within six months of my arrival – very impressive.)

For example, only 19% of University of Kansas graduates responded to an outcome survey.  See https://blog.naceweb.org/2014/09/09/an-insiders-look-at-first-destination-surveys/.  The university then took the unusual step of looking up LinkedIn profiles to supplement responses.
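To see why a 19% response rate is so corrosive, consider a toy simulation – all numbers invented, and chosen so the overall response rate happens to come out at 19%:

```python
# Toy illustration of nonresponse bias in outcome surveys (numbers invented).
# 1,000 graduates: 800 employed, 200 not. Employed graduates answer the
# survey far more often than unemployed ones.

employed, unemployed = 800, 200
resp_employed = int(employed * 0.22)       # 176 responses
resp_unemployed = int(unemployed * 0.07)   # 14 responses

survey_rate = resp_employed / (resp_employed + resp_unemployed)
true_rate = employed / (employed + unemployed)

print(f"Response rate:              {(resp_employed + resp_unemployed) / 1000:.0%}")  # 19%
print(f"Survey-reported employment: {survey_rate:.0%}")                               # 93%
print(f"Actual employment:          {true_rate:.0%}")                                 # 80%
```

A school could truthfully report 93% employment from that survey even though the real figure is 80%.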

Take any exemplary “placement rate” with a large grain of salt.  One item to look for when evaluating individual colleges’ placement rates is whether they follow the NACE standard for survey responses.  See  http://www.naceweb.org/job-market/graduate-outcomes/first-destination/standards-and-protocols/.  Even then, be careful with any statistics about “full-time employment” – many colleges count Starbucks baristas as fully employed.

The college rankings that rely on graduates’ salaries also suffer from incomplete datasets.  See https://www.nytimes.com/2017/12/01/your-money/college-graduation-earnings-privacy.html?hpw&rref=business&action=click&pgtype=Homepage&module=well-region&region=bottom-well&WT.nav=bottom-well (“Want to Search Earnings for English Majors?  You Can’t.”  New York Times, 12/1/17).

In addition, graduates tend to find work close to their colleges.  Unless adjusted for cost of living, student earnings surveys will skew in favor of West and East Coast schools.

Other criteria are easy to compute, but relatively meaningless.  The percentage of students graduating after four or six years is susceptible to sample bias.  It is normal for many engineering students to take more than four years to graduate.  Unless you know the sample (MIT, small liberal arts college), low four-year graduation rates coupled with high six-year rates may be “normal”.  Obviously, any school where both rates are low will drop out of any ranking without ceremony.  A better measure is freshman retention – if students are not returning after freshman year, the odds are higher that your student will not, either.


Some data is subject to “gaming,” or even fraud

Selectivity is the percentage of applicants who are admitted.  In 2016, Stanford admitted only 1 in 20 applicants, giving it a selectivity of 5%.  That means Stanford must be the best, because only the best students get in, correct?

Well, a lousy school will not attract enough students unless it relaxes admission standards.  But be careful about putting too much weight on that correlation, because many colleges “game” selectivity.  Methods include:

  1. Using “VIP applications”. Colleges send out these one-page, no-fee applications to thousands of students.  Because they do not require essays or application fees, many students fill them out and return them.  See https://www.propublica.org/article/the-admission-arms-race-six-ways-colleges-can-game-their-numbers.  I commented on this in my “You’ve Got Mail” post (6/22/15).

  2. Adopting the Common Application. The Common Application is used by over 500 colleges.  Students need only fill out the application once to apply to all member colleges.  Most colleges require students using the Common Application to respond to one or more essay prompts unique to the school.  Stanford has 11 such prompts, although many of them are short.  Colleges that wish to encourage applications do not require any supplemental essays.

  3. Going test-optional. A growing “fair test” movement in college admissions eschews standardized tests (SAT, ACT) as merely reflecting affluence or penalizing students who do not perform well on them.  These schools allow students to apply without scores.  Although this may be laudable, it can also increase applications.

Colleges using these tools can lower their selectivity number, which, remember, is the percentage of applicants admitted, simply by inducing more students to apply while not increasing the number of students accepted.  Why would they do this?  U.S. News and World Report includes a college’s selectivity number in its ranking system, and guidebooks list colleges’ selectivity ratings prominently.  Students use selectivity as a proxy for “desirability,” which results in even more applications; a vicious cycle for students ensues.
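The arithmetic of the gaming is simple.  A hypothetical sketch, with invented figures:

```python
# How a college can cut its selectivity number without admitting a
# different class (figures invented for illustration).

def selectivity(applicants: int, admitted: int) -> float:
    """Fraction of applicants admitted; lower reads as 'more selective'."""
    return admitted / applicants

admits = 2_000
print(f"{selectivity(10_000, admits):.1%}")  # 20.0% admit rate
# Harvest 6,000 extra applications via no-fee mailers; admit the same class:
print(f"{selectivity(16_000, admits):.1%}")  # 12.5% -- "more selective" overnight
```

Nothing about the entering class changed – only the denominator.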

The U.S. News rankings also factor in the SAT and ACT scores of enrollees.  You might ask how colleges could possibly “game” such numbers.  Well, five colleges since 2012 have been caught reporting higher scores than students actually earned – we call that fraud.  And there may be more who have not been caught.  See https://www.propublica.org/article/the-admission-arms-race-six-ways-colleges-can-game-their-numbers.


The criteria are not useful to students

U.S. News and World Report assigns 22.5% of its college ranking to “undergraduate academic reputation”.  What does that phrase mean?  Here is the explanation from U.S. News:

For the Best Colleges 2013 rankings in the National Universities category, 842 top college officials were surveyed in the spring of 2012 and 53 percent of those surveyed responded. Each individual was asked to rate peer schools’ undergraduate academic programs on a scale from 1 (marginal) to 5 (distinguished).

Those individuals who did not know enough about a school to evaluate it fairly were asked to mark “don’t know.” A school’s score is the average score of all the respondents who rated it. Responses of “don’t know” counted neither for nor against a school. 

The problem is that “reputation” is intangible – the reader must ask: “reputation according to whom?”  At least U.S. News makes the “who” clear:  college officials and high school counselors.  But should students care how college administrators regard their colleagues (some of whom will be rivals)?  Unless the school is nearby (in which case it may well be a rival), most college administrators are unable to make fine distinctions between colleges.  And high school counselors rarely follow up with their charges to determine their experiences after enrollment.

Indeed, shouldn’t students care a lot more about what employers think about the schools?  The Wall Street Journal published just such a list:  https://www.wsj.com/articles/SB10001424052748704554104575435563989873060 — in 2010.

And some ranking services reject traditional criteria as merely a proxy for wealth.  For example, The Washington Monthly publishes its list as “our answer to U.S. News & World Report, which relies on crude and easily manipulated measures of wealth, exclusivity, and prestige to evaluate schools.”

Alas, not many career-oriented students (and parents) will be interested in its alternative:

We rate schools based on their contribution to the public good in three broad categories: Social Mobility (recruiting and graduating low-income students), Research (producing cutting-edge scholarship and PhDs), and Service (encouraging students to give something back to their country). https://washingtonmonthly.com/2017college-guide

Kudos to those who are.

The largest failing of college rankings is that they are usually so general as to be meaningless.  Students in STEM fields may not value small class sizes as much as a school’s facilities and labs; liberal arts students are just the opposite.  Students planning professional careers may care less about general “outcome measures” than about acceptance rates at professional schools (which the ranking industry tends not to measure).  Wealthy families may not worry about financial aid – for less wealthy parents, the inquiry may begin – and end – with that criterion.


College rankings can be useful

With the caveats expressed above, college rankings do have their uses.  Here is how to use them:

  1. Understand what is being measured, and the weights assigned to each criterion.
  2. Understand what criteria are omitted, and determine if the omission is important to your student’s needs.
  3. Choose the most specific ranking possible. U.S. News and World Report, along with other services, publishes “sub-lists”, e.g., best undergraduate engineering programs.  Prospective engineers should start with that list.
  4. Consult more than one ranking. Schools that are consistently ranked highly may be more likely to deserve their rankings.
  5. Prepare to dig deeper. Sometimes variations in formulas make little difference in the rankings they produce.  Eight of the top 10 universities on the U.S. News list were also in the Journal/THE top 10.  The other two were right behind.  https://www.washingtonpost.com/news/grade-point/wp/2016/10/20/whos-no-1-as-college-rankings-proliferate-it-depends/?utm_term=.4f57fb7feafc.  You will have to do your own research to tease out the differences among high-ranked schools, or make decisions based on other factors (e.g., geography, cost).

How do I use rankings?  First, I use them to ascertain which schools my clients will value highly.  I may have to overcome preconceived notions with research.  Second, I use them like a string around a finger – when I am composing a list of candidate schools, I check the rankings to make sure that I have evaluated all of the commonly considered schools.

I also use college matching services, such as BigFuture (https://bigfuture.collegeboard.org/college-search) to create lists of schools, but that is grist for another article.

Congratulations! You’ve Been Admitted to the University of California – For Now

Universities reserve the right to rescind admission in extreme circumstances, such as a student’s commission of criminal acts, failure to graduate, and the like.  Indeed, I commented on just such a situation involving Harvard a few months ago in this blog.  However, the University of California (“UC”) is taking this a step further, and not in a helpful way.

The UC uses one application for its nine campuses that offer undergraduate instruction.  Students “check off” boxes for the UC campuses they are applying to, and each campus makes its own admissions decisions.

The application instructs students not to send transcripts; students self-report their grades.  To ensure that the reported grades are accurate, all offers of admissions are made provisional upon the UC receiving official final transcripts verifying those grades.

Those provisional offers also include additional conditions which must be satisfied before enrollment.  The UC refers to these provisional offers as “contracts.”  (As noted below – in a paragraph only a lawyer could love – it is not clear that these are true “contracts”.)

Unfortunately, some of those additional conditions protect the UC against more than fraud, and students may face unwelcome surprises.  For example, this summer, the University of California, Irvine (“UCI”) rescinded 499 admissions because students had failed to arrange for their high schools to deliver their final transcripts by July 1, as required by those agreements.  This seemed like an overreaction to a trivial deadline, but the news soon broke that there was more at stake for UCI.

At the time, UCI was over-enrolled by approximately 800 students.  The university had already tried to convince some accepted students to enroll instead in a separate “academy” with a 50% tuition break.  However, UCI failed to mention that those new students “would have to cancel their enrollment as regular freshmen, take a more limited menu of classes in the adult education division and give up access to campus housing and financial aid.” (LA Times, 8/2/17; see also https://www.insidehighered.com/admissions/views/2017/08/07/essay-lessons-controversy-over-university-california-irvine-revoking.)

Not surprisingly, the offer was not taken up by enough students to solve UCI’s over-enrollment problem.  So UCI turned to its contracts and rescinded the admissions of those students who had failed to submit transcripts timely.

On Friday, campus spokesman Tom Vasich conceded that the admissions office was more stringent than usual about checking requirements “as a result of more students accepted admissions to UCI than it expected.”

The vice chancellor of student affairs also fessed up:

“I acknowledge that we took a harder line on the terms and conditions this year and we could have managed that process with greater care, sensitivity and clarity . . . ”

In other words, the University of California reserves the right to punish some breaches of its “contracts” more than others depending upon whether doing so is to its advantage.  Ultimately the public outcry – and examples where the UC had indeed received the transcripts timely but mistakenly rescinded admission – forced the university to back down.

But questions remain because some of the terms in these “contracts” are vague.  For example, to discourage “senioritis”, the UCs require students to maintain a certain standard of excellence in their senior years.  But what standard?

The general rule is that students must maintain a minimum weighted 3.0 GPA in college prep classes (the “a-g” requirements in UC lingo), with no “D” or “F” grades.  However, for some of the most prestigious campuses, the contract may specify higher GPAs or test scores.  See e.g., https://talk.collegeconfidential.com/university-california-los-angeles/1990196-ucla-provisional-contract-for-ib-student.html.
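As a rough sketch of what that baseline condition asks a student to verify – my simplification for illustration, not the UC’s official method (the real rules cap honors points and define eligible courses in detail):

```python
# Rough check of the baseline senior-year condition: weighted GPA of at
# least 3.0 in a-g courses, with no "D" or "F" grades. A simplification
# for illustration only; the UC's actual GPA rules are more involved.

GRADE_POINTS = {"A": 4, "B": 3, "C": 2, "D": 1, "F": 0}

def meets_baseline(courses):
    """courses: list of (letter_grade, is_honors) tuples for a-g classes."""
    points = []
    for grade, is_honors in courses:
        if grade in ("D", "F"):
            return False  # a single D or F breaches the condition
        points.append(GRADE_POINTS[grade] + (1 if is_honors else 0))
    return sum(points) / len(points) >= 3.0

senior_year = [("A", True), ("B", True), ("B", False), ("C", False)]
print(meets_baseline(senior_year))  # True -- weighted GPA 3.5, no D or F
```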

And then there are campuses that add “expectations”.  Take this language from the University of California, Santa Cruz:

In accepting admission at UCSC, you agree that you will:

  1.  Earn a level of academic achievement in your fall and spring courses (as you listed on your UC application) consistent with your previous coursework. Earn a grade of C or higher in those courses (or equivalent for other grading systems).

Wait.  Look at the text again: “[e]arn a level of academic achievement in your fall and spring courses (as you listed on your UC application) consistent with your previous coursework.”

UC Davis appears to have similar language in its “contract”.  See https://www.ucdavis.edu/admissions/freshman/admitted/.

Do we now have two requirements?  Must a student not only avoid “Ds” and “F”s, but produce the same grades (mostly “A”s) that got them admitted in the first place?  That is a lot harder to accomplish, and essentially extends the application period until high school graduation.  Say it ain’t so, UCSC!

Regrettably, it is so, at least for UCSC.

UCSC addresses the question in its FAQ (https://admissions.ucsc.edu/apply/conditions-faq.html):

FAQ 1A: My contract indicates “Earn a level of academic achievement in your fall and spring courses consistent with your previous coursework, with no grade lower than a C (or equivalent for other grading systems).” What do you mean by “consistent?”

Answer 1A: We expect that the grades you will earn in your senior year will look similar to the grades you earned in the first three years of your high school career; for instance, if you were a straight-A student for three years, we would expect A’s in your senior year. Consistency in your level of achievement must be carried through your senior year coursework.

FAQ 1E: I earned a C- in a course. Does that mean my admission will be cancelled?

Answer 1E: The University of California does not compute pluses or minuses in high school coursework. Therefore, a C- is considered equivalent to a C grade. Remember, however, that we also expect a consistent level of academic achievement in your coursework.  (Emphasis added.)

Well, so much for students pausing during their last semester to enjoy the high school they attend in a way they previously had to defer for 3 ½ years of grueling competition.  Of course, we do not know whether UCSC (or UC Davis) routinely enforces this standard.

But this requirement is worse than draconian – it is confusing.  When analyzing a contract, lawyers look for the consequences of a violation of its terms, what we call a “breach.”  The UC’s “contracts” take a sweeping approach – any violation of any of its terms is considered grounds for canceling the contract and rescinding the admission.

UCSC is no exception.  From that same linked document:

Failure to meet your Conditions of Admission Contract will result in the cancellation of your admission. It is your sole responsibility to meet all conditions. Read each of the six conditions below and ensure that you meet all of them. Accepting your offer of admission signifies that you understand these conditions and agree to all of them.  (Emphasis in original.)

That appears clear, but for one problem.  What does it mean for UCSC to “expect” students to have grades that are “consistent”?  This formulation appears vague: how many “B”s can a straight-A student sustain before being in breach?  And what do we make of the verb “expect”?  It is weaker than “require”, and lacks the commanding “shall”.  Is such an expectation enforceable?

I am not practicing law anymore, so I leave this question to those who are.  But the fact that this is a reasonable question is a huge problem – both for UCSC and our students.

Looking at the big picture does not yield a pretty sight.  The University of California requires students to sign “contracts,” even though minors are generally considered to lack the capacity to do so.  (Not a small point for lawyers – see https://www.mnscu.edu/system/ogc/docs/HANDBOOK%20Minors%20on%20C.pdf; I suggest that a university’s authority to enforce a policy, as opposed to the letter of a contract, may be subject to heightened due process concerns).

The UC reserves the right to enforce those “contracts” differently depending upon whether it is to the UC’s advantage to do so.  And it is sometimes vague about what it wants a student to do (earn “consistent” or “strong” grades through all four years) and what conduct constitutes a breach of contract allowing cancellation of admission.

Worst of all, by the time the UC decides whether to exercise its contractual remedies by rescinding admission, students have already declined all other offers in accordance with the May 1 rule used nationwide by colleges.

The UC has a sterling reputation, and I am a proud graduate of UCLA.  However, as I’ve said previously in other contexts, caveat emptor.

You and your students should read any admission letters and accompanying materials very carefully before accepting an offer from the University of California.  Upon accepting that offer, labor diligently to make sure that your student’s transcript arrives at their campus (and confirm in writing that it did), and that all other terms of the contract/agreement are satisfied.

Finally, of course, senioritis can be a serious threat to any student’s college aspirations, but it is a particularly dangerous malady for those planning to attend the UC.  If it looks like grades will be a concern, call Admissions and warn them; they may be more inclined to be lenient if notified early than the cold, hard text of their provisional “contracts” suggests.

A New Addition to College Curricula:  Preparing to Fail

The New York Times generally covers education issues in its “Fashion and Style” section.  Make of that what you will (including my reading habits), but one of their recent articles caught my eye: “On Campus, Failure is on the Syllabus” (June 24).

The lede is illustrative:

Last year, during fall orientation at Smith College, and then again recently at final-exam time, students who wandered into the campus hub were faced with an unfamiliar situation: the worst failures of their peers projected onto a large screen.

“I failed my first college writing exam,” one student revealed.

. . . .

The faculty, too, contributed stories of screwing up.

“I failed out of college,” a popular English professor wrote. “Sophomore year. Flat-out, whole semester of F’s on the transcript, bombed out, washed out, flunked out.”

“I drafted a poem entitled ‘Chocolate Caramels,’ ” said a literature and American studies scholar, who noted that it “has been rejected by 21 journals … so far.”

It is now a meme (see my entry on “Watch What You Post!” for a definition) that millennials are unusually fragile creatures who require cosseting against the vicissitudes of the real world.  In other words, they are weak creatures who have been raised in a bubble.

This is an odd idea on its face when applied to students admitted to elite colleges and universities.  These students have won an extraordinarily competitive race for four years to demonstrate that they are superbly equipped for any academic challenge.  Why would these students, of all people, suddenly crumble when they arrive at college?

Late adolescence is an uncertain, and even dangerous, moment.  At the extreme, this is the time when some forms of mental illness are more likely to emerge, such as bipolar disorder, schizophrenia, and depression.  See https://nami.org/collegeguide (claiming that one in five adults will experience a form of mental illness in college).  Students are also experimenting with adult life, including entering relationships which can end badly.

Further, many of these students have never experienced academic failure before they arrive at college.  As winners of a grueling race that penalizes even a “C” quite heavily, these students are deeply invested in succeeding, and are likely to be blindsided by failure.

Colleges are becoming alarmed at the frequency of students buckling under the new – for these students – experience of failure.  Smith College reminds students that 64% of students will receive a B-minus or lower during their time there.

And it goes a step further:

[W]hen students enroll in [Smith’s] program, they receive a certificate of failure upon entry, a kind of permission slip to fail. It reads: “You are hereby authorized to screw up, bomb or fail at one or more relationships, hookups, friendships, texts, exams, extracurriculars or any other choices associated with college … and still be a totally worthy, utterly excellent human.”

A number of students proudly hang it from their dormitory walls.

Smith is just one of many elite colleges rolling out such programs.

A consortium of academics soon formed to share resources, and programs have quietly proliferated since then: the Success-Failure Project at Harvard, which features stories of rejection; the Princeton Perspective Project, encouraging conversation about setbacks and struggles; Penn Faces at the University of Pennsylvania, a play on the term used by students to describe those who have mastered the art of appearing happy even when struggling.

Some of this can be attributed to the newly fashionable idea that “grit” is an important ingredient for success.  As with most such revelations, a grain of truth can be puffed up into a silo full of grant-funded excesses seemingly devoted to accentuating the obvious.

However, such movements can also be useful, and this is one of them.  The “lesson” for students and the college counselors who work with them is that students should be made aware of the challenges ahead.  Most important, students should be told before they leave the nest that colleges have resources available to help students in distress.  Students should know the location of the Counseling Office on campus.  They should be instructed to seek help, and informed that doing so will not result in any social or parental stigma.

We all know that failure is part of life.  Make sure that your student knows that, too.  For my part, my students who just graduated high school and thought that they had heard the last of me until my Christmas break “check-in” are about to receive an e-mail with the New York Times article attached.

It Could Be Denver

The Independent Educational Consultants Association (IECA) held its semi-annual conference in May in Denver.  As part of that group, I visited several colleges of interest.


University of Colorado at Boulder

The University of Colorado at Boulder (“CU-B”) is located 30 miles northwest of Denver in the city of Boulder (107,000).  The city is self-contained, which is very helpful because the promised light rail line to Denver remains a mirage.

Fortunately, Boulder is in a beautiful valley within two hours of some of the finest skiing in the world.  The weather is advertised as providing about 300 days of sun each year.  Boulder itself is a foodie, microbrew, green town that is one of the most sought-after suburbs in Colorado.

CU-B’s strengths make it an attractive choice for students from every part of the country.  It should be on the shortlist of every student who wants to become an astronaut or aerospace engineer; you can view CU-B’s alumni list here:  http://www.colorado.edu/aerospace/about-us/astronauts-affiliated-cu.

U.S. News ranks its aerospace engineering program at #12; the programs ahead of it are much more selective.  And CU-B is not shy about reporting that the same ranking service chose its graduate physics department as #1 in the nation in Atomic, Molecular, and Optical Physics, ahead of MIT, Harvard, and Stanford.

STEM in general is a priority on campus.  See http://www.colorado.edu/csl/pdfs/csl_materials/CSLBrochure_3-8-13.pdf for information on the university’s attempt to improve teaching and learning.  Of course, many other universities are trumpeting similar initiatives, but CU-B’s appears to be ambitious in scope.

CU-B also has interesting programs outside of STEM.  One of them is the environmental design program, which repackages the existing architecture program within a larger School of the Environment and Sustainability.  Here is the university’s description:

Students enroll in studios, lectures, and seminars taught by 30 faculty with both academic and professional expertise. They design innovative “green” buildings and infrastructure and they work directly with cities to figure out how to integrate social, ecological, and economic needs to support a sustainable future. Students apply state-of-the-art educational technology including computing tools, digital image databases, fabrication equipment, and advanced media to make a persuasive case and bring their ideas into light. Layer on top of all this the resources of the Boulder campus—from sciences, social sciences, humanities, arts, and technology fields—and we offer an educational opportunity like no other.

Students interested in architecture, landscape architecture, urban planning, and design generally may find this approach intriguing.

Are there downsides to the CU-B experience?  Two come to mind.  First, CU-B may become a victim of its own success.  For example, the engineering school is planning to decrease enrollment next year to alleviate overcrowding – applicants may find it harder to win admission.  Second, the popularity of the school contributes to a high cost of living:  Boulder is a very expensive place to live, and students who leave the dormitories for rental housing (71% of them this year) will need to budget accordingly.

How difficult is it to get in?  Admission requirements vary by program.  The numbers for the College of Arts and Sciences for the middle cohort (25% to 75%) are GPA 3.37 to 4.0, SAT 1170-1350, and ACT 24-30.  The numbers are similar for admission to the environmental design program.  However, the engineering program raises the bar:  GPA 3.87-4.0, SAT 1290-1470, ACT 29-33.

The male/female ratio is 56/44, perhaps fueled by the popularity of STEM at the university.  The OOS number (percentage of out-of-state students) is 39%, suggesting that out-of-state students may need to post better numbers than those cited above.

CU-B is a typical large public university with some uncommon strengths, set in one of the most beautiful places in the country.  It merits your attention.


Colorado School of Mines

They still tote the rock.

Decades ago, when your correspondent was applying to college, Colorado School of Mines was known as the finest engineering school in the West.  It may still deserve that title, but it is not nearly as well known.  This is a shame, because for certain STEM fields tied to the earth, there are few better in the country.

“Mines”, as it is known, is a public university serving 4,500 undergraduates and 1,300 graduate students.  Located in Golden, Colorado, a town of 20,000 people – with light rail access to Denver and Boulder – the school was founded in 1874, two years before Colorado achieved statehood.

The school notes that Mines was originally devoted to the study of – wait for it – mining.

Courses offered to students during the early years of Colorado School of Mines included chemistry, metallurgy, mineralogy, mining engineering, geology, botany, math and drawing. The focus of the early academic programs was on gold and silver, and the assaying of those minerals.

According to Wikipedia, in 1906 the school opened the first experimental mine in the nation for teaching purposes.

Its mission has broadened considerably today:

The nexus between the earth, the environment and society’s need to generate and distribute energy in an economic and sustainable way is central to Mines’ specialized mission. Faculty and students at Mines research new frontiers in resource exploration, extraction and processing, renewable energy production and distribution, advanced materials, and environmental impact, mitigation and remediation.

Mines welcomes students who wish to study engineering, computer science (the school has its own supercomputer), biochemistry, applied math and statistics, geoscience, or closely related fields.  A complete list can be found here:  https://www.mines.edu/Undergraduate-Academic.

Only STEM students need apply; those who are looking for a college with a robust liberal arts offering should look elsewhere.  Indeed, Mines’ own application – it does not accept the Common Application – has no essay requirement.

Our tour guide gave us a window on the sort of students who go to Mines.  She said that she decided on Mines after attending a summer day program where they built catapults.  She said that it was the only place where she did not have to explain to her peers what a catapult was, and how to build it.  She felt at home at Mines.

Indeed, teamwork and inclusion are important at the school.  Professors teach all classes, which is unusual for an engineering school.  The atmosphere emphasizes teamwork over competition.  This should not be surprising, as almost every student has a job upon graduation.  According to staff, 220 companies interview at Mines every year, and more are put on a waiting list in case a recruiting firm can’t make it.  The gender ratio is 75:25, but the school also fields the largest collegiate section of the Society of Women Engineers.

Traditions are prized here.  Each year about 1,000 students “pull” an ore cart from Golden to Denver, where someone at the Capitol reads from a proclamation declaring it Engineering Days in Colorado.  Every student is given a hard hat upon entry.  Although these usually just sit on a closet shelf, they can come in handy.  We were hit with a hailstorm during our tour.  Some students who really had to cross campus could be seen running across one of the quads, with the hail bouncing off their hard hats.  So Colorado offers 300 days of sunshine, plus occasional hailstorms – it could be worse.

And what about the rock, you ask?  Incoming freshmen are expected to bring along a 10-pound rock from their hometowns.  (I’d like to see students explain this to airport security.)  In a ceremony aided by upperclassmen, the students haul their rock up the mountain overlooking their campus, and place it in the whitewashed “M” at the top.  Upon graduation, they are invited to go back to the “M” and retrieve a rock to take with them on their journeys, which probably ends up sitting on a shelf next to the hard hat and the Mines’ silver-plated diploma.  The whole place has a bit of “old school” atmosphere that is hard to resist.

As for getting in, the numbers for the middle cohort (25% to 75%) are GPA 3.74 to 4.0, SAT 1370-1470, and ACT 29-32.  The OOS percentage is 35%.  These numbers are typical for good engineering schools.

Two additional facts stand out.  First, admissions are rolling, and they open early – in September.  Although students can still get priority admission status through November 15, they should aim to submit their application the moment admissions open.  This is easier than it appears because, as noted above, there is no essay requirement.  Second, Mines will give AP credit sparingly, and in many cases, only after the student passes a “challenge exam” in the subject.

Mines is a hard-core (sorry) engineering school with a small and cohesive class, professors who teach students, a few unusual majors (explosives engineering, anyone?), and an excellent location with the mountains nearby.  For the right student, this place could be a perfect fit.


Colorado College

We travel 75 miles south from Denver for our final college, but the trip is worth it.

It is unusual to find a college which differs from others, not just in academic offerings, but in the structure of the curriculum itself.  Colorado College (“CC”), in Colorado Springs, serving 2,100 undergraduates, is one of a handful of colleges in the U.S. where the academic year is based on a Block System – students take one course at a time, for three and a half weeks, before proceeding to the next.  Students take eight Blocks each year, plus optional summer sessions and a half-Block over the winter break.

CC’s pitch for its Plan is compelling:

Want to study for your biology midterm without worrying about filming your documentary, reading 72 pages of The Odyssey, or training your psychology rat?

Why not take just one class at a time? 

Do not assume that CC is a trendy experiment.  The College was established in 1874, and is graced by a large and stunning Norman Romanesque chapel built in 1931.  A glance around various dedications on buildings reveals an endowment funded by the Packards (the “P” in “H-P”), the Waltons, and other luminaries.  The campus is pretty, and the setting near Pike’s Peak is hard to beat.

The Block Plan was started in 1970, and has become the best-known feature of a strong liberal arts college.  Students take one course Monday through Friday, usually from 9 a.m. to noon, with labs in the afternoon.  Unless enrolled in a lab course, students have afternoons and evenings to themselves and each other.

Each Block period runs for 3 ½ weeks, ending the Wednesday of the fourth week; students get a four-day weekend before the start of the next Block.

One major advantage of this schedule is that professors can schedule trips into the field without conflicting with other courses.  Students pursuing outdoor fields of study (e.g., archeology, geology, environmental studies, wildlife biology) are particularly well-served by this arrangement.  CC mentions film students traveling to Hollywood and art students going to Paris – you get the idea.  And with a 10:1 student/teacher ratio, education can be personal, to the point where professors often invite their class to dinner at their homes.

By the way, CC’s outdoor location at the foot of the Rockies is perfect for its plan.  As at the other Colorado schools, there is plenty of sunshine and opportunity for winter sports.

CC attracts many students who are frustrated by having to multitask constantly in high school.  These are students who prefer to “dig in” to a topic.  They can immerse themselves fully in one subject.  If they hate the class, they can either exchange it for another (on the first or second day) or grin and bear it, knowing that each Block only lasts for just over three weeks.

Students and counselors alike should read the College’s definition of the “right fit” student here:  https://www.coloradocollege.edu/lifeatcc/different/.  The school tends to attract “intense” students.  They may not be quirky, but they are definitely looking for something different.  For most of them, that difference is the opportunity to fashion their own education, block by block.  For such students, CC represents a wonderful opportunity.

Of course, great opportunities are not available for everyone.  Only 17% of applicants are admitted, although the SAT/ACT numbers are not overly demanding:  averages of 1340 and 31, respectively.

Note that the annual cost of attendance is a hefty $65,000; students should investigate financial aid opportunities (most need-based) carefully.

Watch What You Post!

This blog does not usually cover breaking news, but Harvard’s recent move to rescind admissions for ten students makes this a timely topic.

The Washington Post is just one of many news outlets carrying the story:  https://www.washingtonpost.com/news/morning-mix/wp/2017/06/05/harvard-withdraws-10-acceptances-for-offensive-memes-in-private-chat/?hpid=hp_hp-morning-mix_mm-harvard%3Ahomepage%2Fstory&utm_term=.64e28d839ba4#comments

Like many schools, Harvard sponsors a Facebook page for admitted students.  The page allows incoming students – admitted but yet to matriculate – to exchange information.  The topics are typically mundane, including information about student groups, orientation, students seeking others with similar interests (e.g., incoming international students), and the like.

In this case, the Post reports that about 100 members of the freshman class created a messaging group which would “share memes about popular culture – a growing trend on the Internet among students at elite colleges.”  For anyone older than 30, this may require translation.  A complete explanation can be found here:  https://studybreaks.com/2016/12/19/meme-culture/; my summary follows, quoting liberally from the link.

A meme in this context refers to a “humorous piece of online content, usually in the form of an image with text or video, that is copied and rapidly disseminated by internet users across all platforms.”  Many college Facebook pages have sub-groups that are devoted to communicating, and commenting on, memes related to college life.  These range from tweaking the administration to edgier topics.

At this point, a small alarm bell should start ringing in your head.  It is one thing to exchange information, but views about potentially sensitive topics?  This may be dangerous.

And so it became on the Harvard Facebook page, as a group of students formed a separate group to exchange messages.  Then an even smaller number of that group formed an “offshoot” page to exchange “off-color” or “R-rated” communications, including graphics.  This “offshoot” page is where the trouble began.  The Harvard Crimson reports:

A handful of admitted students formed the [offshoot] messaging group—titled, at one point, “Harvard memes for horny bourgeois teens”—on Facebook in late December, according to two incoming freshmen. 

In the group, students sent each other memes and other images mocking sexual assault, the Holocaust, and the deaths of children, according to screenshots of the chat obtained by The Crimson. Some of the messages joked that abusing children was sexually arousing, while others had punchlines directed at specific ethnic or racial groups. One called the hypothetical hanging of a Mexican child “piñata time.”

Yes, this is very ugly.  However, what happened next made news.  Harvard admissions administrators learned about the offshoot group, and began monitoring it.  Eventually, the administrators reached out to individuals and demanded explanations for particularly offensive posts.  Harvard ultimately rescinded ten students’ admissions, well after the deadline for those students to enroll at other schools where they had been admitted.

Per the Washington Post and other news outlets, the university invoked a sort of “moral code” from its admissions policy that was included on the Harvard Facebook page:

As a reminder, Harvard College reserves the right to withdraw an offer of admission under various conditions including if an admitted student engages in behavior that brings into question his or her honesty, maturity, or moral character. 

To recap, students congregated on a public web page created by Harvard, then created a “private” chat group, where they made off-color, and hateful, jokes among themselves.  Harvard monitored that “private” offshoot of its sponsored Facebook page, reviewed the communications, and rescinded admissions.

Aside from whether Harvard’s response was appropriate, the episode points to an emerging reality of the 21st century.  Whatever you write or post on the Internet is available to all to see, forever.  It may, or may not, result in consequences.

On the Internet, typing something in “all caps” is the equivalent of (rude) shouting.  This point needs to be shouted:

STUDENTS SHOULD NOT PLACE ANYTHING ON THE INTERNET THAT THEY WOULD NOT WANT THEIR RELATIVES, FRIENDS, PROSPECTIVE EMPLOYERS, AND COLLEGE ADMINISTRATORS TO SEE.

As college students would say:  rant off.

A surprising number of college administrators are looking for such material.  According to Kaplan Test Prep, 35 percent of admissions officers said they check social media sites like Facebook, Twitter and Instagram to learn more about applicants. About 42 percent of those officials said what they found had a negative impact on prospective students.  See http://press.kaptest.com/press-releases/kaplan-test-prep-survey-college-admissions-officers-say-social-media-increasingly-affects-applicants-chances.
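If those two figures compose, roughly 0.35 × 0.42 ≈ 15 percent of all admissions officers surveyed found social media material that hurt an applicant.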

Even if Kaplan only surveyed “elite” colleges (my guess), those numbers are still eye-opening.

Students need to understand that their social media activity is “fair game” in the college admissions process, and act accordingly.

University of Washington Switches Gears on Admitting Students to its Engineering Programs

Faithful readers who have stuck with me since the beginning (okay, you can put your hand down), will remember one of my first posts (“The Data Is Out There, But You Need to Look For It”, April 15, 2015) about the need for counselors to find current and detailed information to help their students make informed choices.

One of my examples concerned the University of Washington, and the data I found was surprising.

Consider the University of Washington, a top-ranked engineering school.  The Engineering Department only admits 10-20% of applicants directly.  The rest are required to apply later, as juniors.  

What happens to [them]?  It isn’t pretty [link omitted]:

. . . .

To summarize, the majority of these very bright and talented students fail to get into their engineering sub-specialty.  (Thus, the would-be aeronautical engineer may have to settle for industrial engineering – much like a would-be brain surgeon ending up as a general practitioner.)  Some students get into none of the available sub-specialties, and either switch majors or transfer to other schools to finish their engineering degrees.

Well, not that my blog had anything to do with it, but the University of Washington is changing its policy, effective next year.  It will now admit about half its class, 650 students, directly.  After shrinkage (about 40% of engineering students nationwide switch out of the major), those direct admits will account for about half of the engineering degrees awarded.  The rest will come from students who transfer into the major, from other programs at the university, or from other schools.
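The arithmetic, roughly: if about 60% of the 650 direct admits persist, that is some 390 graduates per year by that route – implying, if they are half the total, on the order of 780 engineering degrees annually, with the balance coming from transfers.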

However, there is an interesting footnote to this change in policy.  The department plans to enroll approximately 75% of its students from in-state, and the remaining 25% from “out of state or out of the country.”  If international students make up half of that 25%, your out-of-state student is going to find getting into the University of Washington engineering program difficult, indeed.  But at least they will be rejected at the start, as opposed to waiting two years for the hammer to fall and then not being eligible for the financial aid afforded to four-year students at UW.

And that’s cause for celebration.