You May Be the True Victim of the College Scandal

Why do I mention that I am an Associate Member of IECA, the Independent Educational Consultants Association?

IECA is the gold standard of college consulting organizations. Members are expected to adhere to the highest ethical standards; those who do not are denied membership.

This brings us to the college admissions scandal currently in the news, a tale of avarice and cunning that will shake up some corners of the college admissions world. I think there are a few points worth noting.

First, the consultant at the center of the scheme was not a member of IECA. This is not surprising – IECA does not look kindly on consultants charging outrageous fees for college admissions. Frankly, the amount of work involved for a reputable consultant (in my view, 100-200 hours) does not justify tens of thousands of dollars in fees. (And the much smaller fee I charge is not enough to bribe a school mascot, never mind a college coach. For information about my fees, see Why Hire Me?)

IECA also demands that college consultants not “guarantee” admissions, simply because there is no honest way to do that. In addition, college consulting is about finding the right school for each client. It is hard to accomplish this goal when your clients are fixated on doing whatever it takes to get into a “prestige school.”

Second, there has been no evidence presented that college admissions staff were bribed. I suspect that this is because admissions officers generally make decisions in groups – bribery would be cumbersome, more expensive, and risky (it would take only one admissions officer talking to end the scheme). White-collar criminals generally find the “soft spot” in the victim organization; in this case, altering the records on which the admissions officers rely was simpler and cheaper. (Ask me how I know – see About Me.) Whatever the reason, I find the absence of admissions officers as defendants reassuring.

Third, and most important, the victims in this scandal are not really the universities, however they spin it. Yes, their employees accepted bribes, and some undeserving students were admitted. Some colleges – most notably USC – may suffer undeserved reputational damage.

But the number of students involved is a drop in the bucket compared to the thousands of students enrolled in those colleges legitimately. More important, there remain other, legal, ways for those with money to oil the gears of the admissions process. Legacy admits (students whose parents attended the college) have a huge advantage in Ivy League admissions. And then there are contributions – for the reported $6.5 million one parent invested in bribery, that parent probably could have achieved the same result by simply donating to the university.

College counselors will be largely unaffected.  Perhaps those charging outrageous fees will draw suspicion about exactly what services justify them, but consultants belonging to reputable associations (e.g., IECA, HECA, NACAC) may even benefit from all of the discussion about what reputable counselors do.

The victims are students. To be precise, they are high school students with learning disabilities. If your student falls into this category, then you and that student are potential victims. This is because one of the tentacles of this scandal may be particularly far-reaching:  the college consultant claimed that he was able to bribe therapists to provide false documentation of a disability to be used to obtain accommodations on the ACT or SAT.

From the New York Times, March 13, 2019:

The conspiracy relied on the parents getting medical documentation that would entitle their children to extra time on the test, an accommodation normally made for students with disabilities. Students who need extra time generally take the test alone, supervised only by a proctor — providing the opportunity for the bribed proctor to rig the outcome. Mr. Singer advised parents on how to get the medical documentation needed to qualify.

According to court filings, in a conversation with one of the parents, Gordon Caplan, Mr. Singer explained that for $4,000 or $5,000, a psychologist he worked with would write a report saying Mr. Caplan’s daughter had disabilities and required special accommodations. He assured Mr. Caplan that many parents did this for their children.

“What happened is, all the wealthy families that figured out that if I get my kid tested and they get extended time, they can do better on the test,” Mr. Singer said in the conversation. “So most of these kids don’t even have issues, but they’re getting time. The playing field is not fair.”

This is a potential disaster for many of you. It can be tough for students with learning disabilities to obtain accommodations. Now we can expect the ACT and the College Board (the SAT) to make the process even more difficult. If psychologists (possibly a neuropsychologist, but the story does not say) can be bought, then how will the testing companies identify students who really do need these accommodations?

The initial reaction from the College Board was encouraging and helpful:


The College Board considers all reasonable requests for accommodations — such as large print, Braille, or extended time — needed by students with documented disabilities.

The board asks for documentation in some cases, Mr. Goldberg said, but in the “vast majority” of cases, the modifications are granted through the schools that students attend, where they are evaluated and given an individualized education program.

It appears that the best defense is to have a history of accommodations. Parents should be prepared to show current and past IEPs and 504 Plans. A long history of documentation is likely to allay suspicions. However, for those students whose diagnosis is too recent to have obtained an IEP or 504 Plan, or whose need for minor accommodations does not justify the hassle and expense of obtaining same, this scandal may prove troublesome.

This is another reason for parents of students with learning disabilities to consider paying for an assessment by a neuropsychologist and, if appropriate, obtaining an IEP or 504 Plan.

What Juniors Should Be Doing This Spring

For some intensely motivated families, the college hunt started a long time ago.  Other high school juniors have not wanted – and still do not want – to think about college yet.  But there are steps juniors should be taking right now to prepare for the college application season later this year.

 

Summer plans

A few very selective universities (e.g. Stanford, Princeton) ask students how they spent the summer after their junior year.  (Stanford asks about the last two summers.)  Many more ask about extra-curricular activities or work experiences – summer is prime-time for both.  For the record, the summer after my junior year, I slept 12 hours a night and “hung out” with friends doing nothing important whatsoever.  Alas, this is no longer a viable college strategy for juniors.

The “rules” about what constitutes a worthy summer experience are in flux.  The accepted wisdom has been that travel, internships, and summer courses on college campuses are the best ways for juniors to show that they have used their time to learn more about the world and themselves.

However, stung by charges of elitism, colleges admissions officers are proclaiming that local volunteer work or even “merely” having a job will suffice.  And as I have written about previously in Mission Accomplished?  Maybe Not Anymore, overseas service trips are losing their luster because they are perceived as available only to the wealthy.

Each student’s circumstances will determine their best “summer strategy.”  Exploring a potential medical career by “shadowing” in a hospital, or a science career by working in a lab as a research assistant, or any career by working in a related internship, is a common activity.  Students leaning more toward humanities and the social sciences often participate in writing workshops, fine arts experiences, and travel.  Students may also study college-level material in a formal academic program.

Of course, plenty of students needing to support their families or save for tuition have only one choice:  work all summer.  As noted above, colleges are slowly accepting this reality, and many fine essays come out of the most mundane of work experiences.  Families unwilling or unable to pay for summer experiences will favor this option, along with encouraging their student to read for enrichment; Wake Forest currently asks for the student’s five favorite books.

The only rule that matters for now is that families should start working on summer plans while it is still cold out.

 

Recommendations

Most colleges will accept one to two teacher recommendations; some colleges will also accept a recommendation from a non-teacher.

Stay tuned for a post about how to secure the best recommendations.  For now, note that most recommendations come from teachers who are currently teaching your student.  Teachers in your student’s senior year will have too little time in which to learn about the student before applications are due; sophomore teachers may not remember much about the student when it comes time to write recommendations two years later.  Much the same is true about non-teachers, such as employers, coaches, and the like.

Juniors who have impressed teachers should redouble efforts in those classes.  They should  keep notes of their achievements (best papers, projects, and tests) for later use when asking for a recommendation.  Note:  doing so may be essential because a few colleges are now requiring students to submit a graded paper with their application – this may well become a trend.

Juniors should also increase their participation in class, review their teacher’s comments on papers and projects with them after class, and generally gain their teacher’s regard.  Some may call this “sucking up.”  Why yes, that is exactly right.  Welcome to the world of college admissions – and life.

Finally, all colleges require a letter from students’ “guidance counselor.”  This can be the most important recommendation of all.  We will take up the reasoning supporting this assertion in a subsequent post; suffice to say at this juncture that juniors should get to know their guidance counselors.  In large schools, the guidance counselors serving as college advisers have huge caseloads – even identifying them and introducing yourself can be a challenge. Nonetheless, juniors should begin planning to make an appointment with their guidance counselors to discuss college plans.

If you are a parent or student living in Tucson, please note that I am one of a handful of independent consultants who live and work here.  Please see The Tucson Advantage for why that matters. 

 

The Invasion of the Reps!

Tis the season when college admissions representatives (“reps”) descend on high schools. Prepare to take advantage of this opportunity.

Not all colleges send reps to high schools to recruit students. Some elite and/or small universities combine forces to present sessions off-campus in the evening. Here is an example:  https://apps.admissions.yale.edu/portal/events?id=91fe4029-9573-4852-8e3e-546e7aa5c037. Other colleges offer information sessions in various cities – if you are not in a large city, you probably will not see them.

However, a significant number of colleges send reps directly to high schools. These colleges generally assign reps to assess applications from different regions and states; part of their job is to visit schools in their assigned area. This means that reps who visit your school are likely to be part of the team evaluating your application.

The typical rep (not rap) session is devoted to an informal presentation on the merits of the college, followed by questions. Students are usually allowed to skip class to attend. The entire session usually lasts 30-60 minutes.

Most reps confine their visits to the “strongest high schools” in your area. If your school does not typically host college reps, then consider examining the websites of the strongest schools in your area and choosing one or two information sessions with colleges to which you intend to apply. If you can swing the logistics (traveling from one school to the other, losing class time, etc.), seek permission from the host school (and your own) to attend the information session at the other school. Of course, you will identify yourself to the rep and explain the situation. Your initiative will seize the rep’s attention, which may bolster your admissions chances.

You have several objectives when attending a college information session:

  1. Learn information about the school that will help you decide whether to apply.
  2. Learn information about admissions that is not available elsewhere.
  3. Register your interest in the college and, if possible, impress the rep.

 

Here are some tips to accomplish these objectives.

  1. Determine which colleges are sending reps to your area.

In addition to checking with your high school about which colleges are sending reps, consult each college’s website – under “Admissions” – for announcements of when and where reps will be visiting your area. This may take some digging, but it’s there. If all else fails, call the college’s Admissions office.

 

  2. Choose your sessions carefully.

Attending sessions often requires skipping classes or extra-curricular activities. Unless you are seriously interested in the college, skip the session. As one rep notes:

I always felt one of the great ironies of the high school visit was that I was there exhorting students to take the toughest classes, do as well as possible…and then skip them when I came to school. That never made sense to me.

 

  3. Research the college.

This is another reason to choose your sessions carefully:  you should do some homework on the college before the session.

Determine whether the college cares about “demonstrated interest.”  College sessions generally include a “guestbook” where students sign in to register attendance. Reps also hand out their cards, giving the student the opportunity to send “follow-up questions”.  Most colleges keep track of students who attended the session as an indicator that the student “demonstrated interest” in the school. But a few do not.

Fortunately, there is data out there from which you can determine which colleges care about demonstrated interest. Begin by checking the college’s website. A few colleges, such as Carnegie Mellon, make it abundantly clear that they do not track this information. See https://admission.enrollment.cmu.edu/pages/eliminating-demonstrated-interest.

Next, check www.collegedata.com, which compiles data submitted by colleges to the U.S. Department of Education. Find your college, and then click on the “Admissions” tab. About halfway down the page, you will find “Selection of Students.”

Here is the data for one elite university.

 

Factor Very Important Important Considered Not Considered
Rigor of Secondary School Record X
Academic GPA X
Standardized Tests X
Class Rank X
Recommendations X
Essay X
Interview X
Level of Applicant’s Interest X
Extracurricular Activities X
Volunteer Work X
Particular Talent/Ability X
Character/Personal Qualities X
First Generation to Attend College X
State Residency X
Geographic Residence X
Relation with Alumnus X
Religious Affiliation/ Commitment X
Ethnicity X
Work Experience X

 

As you can see above, the ratings are: “Very Important”; “Important”; “Considered”; and “Not Considered.”  One of the attributes is “Level of Applicant’s Interest.”  Most colleges – including the one above – state that such interest is “Considered.”  A few label it as “Important.”  That is often a code word for “you’d better go visit that college if you want to win admission.”  (Yes, Rice University, I’m looking at you.)

Colleges that state that the Level of Interest is “Not Considered” should be taken at their word. These are often elite colleges that do not want anxious students flooding onto campus simply because they believe they must. Feel free to attend their high school presentation sessions, but do not assume that you are bolstering your admissions chances by doing so.

Note:  the “Interview” attribute refers to interviews offered by the college after receiving your application. Most of the colleges which request interviews will offer to have an alumnus interview you close to your location; there is usually no need to visit the school for that purpose. For purposes of deciding whether to attend a rep session, you can ignore this attribute.

We’re just getting started with research here. Know the basics:  the college’s location, size (and perhaps gender/racial composition) of its student body, and courses of study available. A quick review of the college’s web site should yield this information.

Then dig a bit deeper:  determine the GPA and SAT/ACT scores necessary to be competitive, the schools, departments, and majors for which the school is best known, and some of the colleges for which it typically competes for students. There are several resources where you can find this information, but for a “quick look”, I consult www.collegedata.com. Type in the school name, and choose the “overview”, “admissions”, and “student life” tabs.

Then run the college’s name through a search engine and browse the links which appear, from Wikipedia to various ranking sites (e.g., Niche, Princeton Review). Hard-core readers with plenty of time might look up the school in “College Confidential”, as well.

Finally, you should examine the Common Application (or the Coalition Application or, in a few cases, the college’s own application) and determine what supplemental essays and short-answer questions are contained in the application for the college visiting your high school. Be patient – there is a very good reason for doing so.

Now you are ready to evaluate the rep’s presentation and ask an intelligent question.

Here is a step-by-step guide to getting the most out of the session itself.

  1. Arrive 5 minutes early. That is enough time to snag a front row seat – you want to be seen – and not so early that you are stuck talking to the rep with nothing to say. Talk to the rep after the session, not before.
  2. Sign the guestbook.
  3. Turn off your cell phone and put it away.
  4. Listen carefully to the presentation; take notes.

The mere act of taking notes marks you as a serious applicant. And you may use the information later.

What should you be listening for?

What does the college consider its strengths?  Colleges compete for your attendance. Let them sell the benefits their school offers. Take notes, because at some point in your application to that school you will want to express interest in those benefits. In other words, you will want to tell them – at least briefly – what they want to hear. This is where you connect the information and sales pitch with your answer to their “Why Our College” question.

What information is the rep revealing about admissions policies and/or priorities?  Although most colleges simply repeat the information on their websites, sometimes they will deliver a nugget of information you can use. Be alert for suggestions that you apply early, or that the college is looking for certain extracurricular activities. (Although you cannot invent an activity, this information helps you decide which activities to emphasize.)  One year a rep from a top college responsible for my student’s region emphasized that he likes to read humorous essays. My client obliged – she was admitted. (Of course, her academic record might have had something to do with it.)

  5. Do not ask more than one question during the session.

Asking more than one question may give the impression that you are trying to dominate the session. Of course, if you have a follow-up question, do not be afraid to ask it.

Avoid questions whose answers are readily available on the website. For example, asking about admissions requirements that are set forth on the website makes you look lazy. This is one reason to do the research I suggested above.

Ask more general questions that may pertain to your interests, such as:

Question: “I am interested in exploring both sciences and the liberal arts. How difficult is that to do at ________?”  “Is it common for students to double-major?”

Question: “How popular is undergraduate research [presuming your interest is in STEM] at ______?”  “What are the requirements for students who are interested in performing research?”

Question: “Is there much interaction between the students on campus and the surrounding community?”  “What are the internship (or co-op – where you work full-time) opportunities in the area?”  [This question can be tweaked depending on whether you want to know about social, business, or academic opportunities – if you are a budding social worker, you may care more about the demographics of the area than other students do.]

I list two questions for each topic because another attendee might ask the first one.

After the session ends:

  1. If you haven’t done so already, sign the guestbook.
  2. Engage the rep.  Start by asking for a business card.

There will probably be a line of students waiting to talk to the rep. If you have plenty to say, then you might want to linger in the back of the line in hopes of being the last student who talks to the rep. You want to be memorable, in a good way.

When you get home after school, if you are still interested in the college, write a quick paragraph (based on your notes) and put it into a file for that college (electronic or physical) for use when drafting your application. Remember to save (or scan) that business card.

And write a thank-you note and e-mail it to the rep.  Something simple will do:  “Thank you for your information session today at [“X high school”].  I found it very informative and useful.”  Noting the high school is important, because reps often visit more than one high school each day.

 

A Test That Is Too Easy Results in Hard Feelings

A not so funny thing happened on the June 2018 SAT test – many students were surprised by their low marks. In fact, the entire testing community appears to have been taken aback, and not in a good way.

Here is an article that explains all of this in detail:  https://www.compassprep.com/when-up-is-down-june-2018-sat/. Other test-prep companies have also discussed this issue. See https://www.applerouth.com/blog/2018/07/12/june-sat-scores-much-lower-than-expected/.

The takeaway for students and parents is that the June 2018 SAT test was too easy. As a result, the College Board psychometricians (a vocabulary word that will probably never appear on the SAT – it refers to experts on testing) used a very steep curve to avoid giving the same scores to everyone. This curve punished mistakes harshly. As the Compass article linked above points out:

Compare this to how the June SAT 2018 Math fits in among its fellow new SATs. A 650 could be achieved with 50 correct answers. That’s the lowest scaled score the new SAT has ever produced for 50 correct answers. The highest score it has produced for 50 correct answers on an actual, released exam is 740 points — a 90-point swing! So in its first two years, the new SAT has approximately doubled the extremes seen on the old SAT over 10 years and 4 times as many exams.

What will happen to those scores?  The College Board remains committed to the results of the test, going so far as to insist that the results were not “curved,” but “equated.”  Technically this may be correct, but the impact for many is that their scores on that test do not reflect their ability to score well on the “typical” SAT.
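To make the equating mechanics concrete, here is a small sketch of how raw-to-scaled conversion tables work. Only the two values for 50 correct answers (740 vs. 650) come from the Compass figures quoted above; the other table entries are invented for illustration.

```python
# Hypothetical raw-score-to-scaled-score tables for two SAT Math forms.
# On a harder form, each wrong answer costs fewer scaled points; on an
# easier form (like June 2018), the curve is steeper so that scaled
# scores stay comparable across forms.  Only the 740/650 pair for
# 50 correct answers reflects the Compass article; the rest is invented.
harder_form = {58: 800, 55: 770, 52: 750, 50: 740}
easier_form = {58: 800, 55: 720, 52: 680, 50: 650}

raw = 50  # a student answering 50 of 58 questions correctly
gap = harder_form[raw] - easier_form[raw]
print(gap)  # prints 90 – the "90-point swing" for the same raw score
```

Same number of correct answers, two very different scaled scores – that gap is the whole controversy.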

However, the furor surrounding this exam changes the usual calculation concerning when students should re-take a standardized test. Normally, students with two or more SAT (or ACT) results should not retake the exam except under two circumstances: 1) students are very confident that they will be able to improve their scores because they have now received extra time for a learning disability, were ill previously, or have put in more time in studying for the test (perhaps with help from a test-prep company); or 2) students absolutely need a higher score to stand a chance of being admitted to their “reach” schools.

The risk of taking the exam repeatedly is that some colleges require students to send all test results. For those colleges, there are several dangers. First, the student may get unlucky and receive lower scores on the retake. Second, some of those colleges (e.g., the Ivy League) frown on students taking the SAT or ACT many times. Finally, colleges tend to discount later scores as reflecting the “practice effect”, i.e., students who are more familiar with the test typically post higher scores.

Some of these risks may be reduced for students who scored below their expectations on the June 2018 exam. Although colleges will probably not discard the June 2018 scores, the furor over the test means that they will consider those scores with an asterisk – they are more likely to accept later scores favorably.

If you are dissatisfied with your score on the June 2018 SAT, you should seriously consider retaking the test.

SAT Score? No, But Check Out How I Did on the Gaokao!

A couple of news items grabbed my attention in June.  First, the universe of test-optional schools expanded with the addition of a top-ten institution, the University of Chicago.

 

Just say no to standardized tests?

Test-optional schools are what the label implies:  students are not required to submit SAT/ACT scores when applying.  The “test-optional” practice has become increasingly popular, with 100 colleges adopting the practice during the last five years.  Why is this becoming popular?  From my vantage point I see good – and perhaps not so good – reasons.

Some students just do not test well.  Students with learning disabilities who need extra time sometimes do not receive accommodations, or do not benefit from them.  Standardized tests stand accused of cultural bias – asking questions which presuppose knowledge that students not raised in “mainstream” America lack.  And the “test-prep” industry – which often only the wealthy can afford – can boost some students’ scores at the expense of their less financially endowed competitors.  In other words, in some cases the tests simply do not measure students’ academic potential.

However, some colleges adopting this practice are being strategic rather than altruistic.  Colleges are businesses, and once you get beyond the top 100 colleges, demand for places starts to ebb.  Many of those “test-optional” institutions are small private colleges.

“Going test-optional” enlarges the pool of interested students.  See https://www.nacacnet.org/globalassets/documents/publications/research/defining-access-report-2018.pdf.  In a few cases, those extra students can be essential for financial survival of the institution.

Schools that go this route can obtain another benefit – selective reporting.  Students with low standardized test scores who stand to be admitted for other reasons (e.g., legacy admissions, athletic scholarships) are most likely to not submit those scores.  When the school reports its admission statistics, its average standardized test scores for admitted students will rise commensurately.  This makes the college appear more selective, which can affect its rating in reference lists such as U.S. News and World Report.  These colleges get the best of all possible worlds:  more applicants, higher reported test scores, and a few rungs up the “prestige” ladder for schools (which in turn prompts more students to apply).
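The arithmetic behind selective reporting is easy to see. A minimal sketch, using invented scores for ten hypothetical admitted students:

```python
# Hypothetical SAT scores for ten admitted students (invented numbers).
admitted = [1510, 1480, 1450, 1430, 1400, 1380, 1200, 1150, 1100, 1050]

# Test-required policy: every admitted student's score is reported.
avg_all = sum(admitted) / len(admitted)

# Test-optional policy: suppose the four lowest scorers (admitted for
# other reasons, e.g., legacy or athletics) choose not to submit.
submitted = [s for s in admitted if s >= 1380]
avg_reported = sum(submitted) / len(submitted)

print(round(avg_all))       # 1315 – the true average
print(round(avg_reported))  # 1442 – the average the college reports
```

The class has not changed at all, yet the reported average jumps by more than 100 points – the selectivity boost described above, achieved purely through who submits scores.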

A few “test-optional” colleges have “refined” this concept by requiring submission of standardized test scores to obtain financial aid.  This appears to be an obvious “pay to play” plan, where students with poor test scores are not required to report them if all they seek is admission.  If those students are admitted, they subsidize their better testing peers by paying full tuition.

Many – but not all – top colleges have resisted adopting a “test-optional” policy because of the stigma attached.  The SAT and ACT are marks of quality (however imperfect); announcing that they are no longer required for admission suggests a lack of rigor.  This is increasingly the case because widespread grade inflation is making GPAs less reliable for distinguishing among students.

This is what makes the decision by the University of Chicago so surprising.  It is an elite institution with perhaps the most “cerebral” reputation of them all.  This was where the Manhattan Project helped win WWII, and the Chicago School of Economics changed economic policy around the world.  This is the university famous for its admissions essay questions – a few years back, applicants were invited to explain “what is odd about odd numbers?”  The university is known as the place “where fun goes to die,” because everyone is so busy studying.  These students run an intellectual gauntlet second to none.

Perhaps most important, the University does not need more prestige – it already ranks #3 on the all-important U.S. News and World Report list of top colleges.  Thus, when the University of Chicago goes “test-optional”, the world of higher education pays attention.

So why did the University do it?  It cites some of the “good” reasons stated above; it is coupling the move with an increase in financial aid for families earning under $125,000.  If there were also strategic motives behind the plan, they were not announced.  (No surprise there.)

Two questions remain.  First, will this change the demographic of admitted students at the University?  Will more minority and students with learning disabilities apply, and will they be accepted?  Second, will other schools follow the University’s lead?

We will have to wait a few years to find the answers to both questions.  In the meantime, the “test-optional” movement just gained a significant boost.

Students considering test-optional schools should carefully evaluate the testing policies of individual colleges – they can differ in important respects.  College counselors can add value here.

 

The rise of the gaokao

Meanwhile, there was another standardized test in the news this week – the gaokao.  This one dwarfs the SAT and ACT in almost every respect.  About 3 million U.S. students take the SAT or ACT, while approximately 9 million Chinese students sit for the gaokao.  As in some European countries, the exam is the most important – if not the only – factor used by Chinese colleges for admission.

The gaokao is a nine-hour exam given over two days.  Compulsory subjects include Chinese, mathematics, and, usually, English (students can substitute other languages); students also sit for an additional subject depending upon whether they are pursuing STEM or other careers.

It is hard to overestimate the importance of the gaokao to Chinese students.  The results determine which colleges students can attend and the majors they may pursue there once admitted.

Many students compress their high school careers so that they can graduate early and spend the next year cramming for the exam.

The hype around the exam makes stories about test anxiety in the United States seem tame.  Per the South China Morning Post:

[At] Hengshui Middle School in Hebei province, where more than 100 students earned admission to the prestigious Peking and Tsinghua universities, students have been given IV drips as they study, believing that it will help them with concentration and focus. Girls are given contraceptive pills to delay their periods until after the exam.

Similar stories abound.  See https://www.theatlantic.com/education/archive/2015/01/what-students-in-china-have-taught-me-about-us-college-admissions/384212/ (referred to as “the Atlantic article”).

The Atlantic article suggests that the growing number of Chinese students seeking to attend college overseas are driven by worry about – and disdain for – the gaokao.  It notes that Chinese students taking the SAT and ACT are under similar pressure, and links that pressure with the recent wave of cheating scandals on all three exams in Asia.  (The penalties are a bit stiffer for cheating on the gaokao – cheaters are banned from retaking the test for years, and those caught facilitating cheating face prison sentences of up to seven years.  See http://time.com/4360968/china-gaokao-examination-university-entrance-cheating-jail-prison/.)

The gaokao and the SAT/ACT share one justification – identifying talent.  The archetypical example in China is the student in the rural provinces who would otherwise have been consigned to a life of farming but did well enough on the gaokao to change her life.  It is ironic that the SAT/ACT are threatened in the U.S. while the gaokao remains dominant in China.

One reason for the dominance of the gaokao is that for many Chinese students, the test is the only guaranteed authentic mark of achievement and talent.

From the Atlantic article:

Guessing the percentage of fraudulent transcripts in applications from China is a popular parlor game among educators over here. Unscientific estimates abound: One prominent agent who works with students at some of the best high schools in China recently estimated to me that at least half of the transcripts in China are doctored to look like the students have done well in a robust high school curriculum, when the reality is one of almost constant memorization and practice tests. Unfortunately, no one in the college prep industry in China would be surprised if the actual percentage was significantly higher.

The Chinese system poses a challenge for U.S. colleges that tout their “holistic college admissions” processes.  How can they distinguish among foreign students who spend all of their time studying for one exam, and whose transcripts, even if produced, may be fraudulent?

The obvious answer is to consider the results of the exam.  After all, colleges rely on the TOEFL exam to assess competency in English.

And so it begins.  Newspapers this week trumpeted the decision by the University of New Hampshire to consider the gaokao.  But UNH is only the first public university to do so – the University of San Francisco (“USF”) started accepting the gaokao in 2015.  Dozens of universities in Australia, Canada, and Europe accept it.

This article from Inside Higher Ed reporting on USF’s program is skeptical that many U.S. universities will follow, mostly because the timing of the gaokao conflicts with the admissions cycle:  https://www.insidehighered.com/news/2015/05/20/u-san-francisco-gives-gaokao-based-admissions-try-china.

We shall see.  When 9 million test-takers sit for an exam, the number of “underperformers” is in the seven-digit range.  It is therefore no surprise that U.S. universities are after some of those test-takers, preferably those who will pay full tuition in the United States.

Per the USF administrator in charge of Chinese admissions:

He anticipates that USF will set gaokao cutoff scores equivalent to the marks needed to get into a first-tier Chinese university in each province, plus or minus a few points. Students who are admitted based on their gaokao scores will pay their own way, though Nel said they could be eligible for merit scholarships of up to $20,000 per year.

Brave words, but I doubt that the high standard will be maintained.  After all, the goal of almost every student who takes the gaokao is to snag a spot in a first-tier Chinese university.  Very few will trade that for a spot at most U.S. colleges.  Expect lower, less publicized, standards as this practice grows.

And it will grow.  With 337,000 Chinese students currently enrolled in U.S. colleges, and financial pressures on those colleges increasing as public funding continues to lag, do not be surprised when this practice spreads throughout our American system of higher education.

Ken Rosenblatt — Tucson College Counselor

TEN-HUT! Applying to Our Nation’s Military Service Academies

The summer before senior year is usually quiet for high school students.  Not so for those seeking to attend our Nation’s military service academies.  This is because of the importance of seeking – and obtaining – a nomination for an appointment.

An appointment is the equivalent of a college acceptance.  A nomination is a request by a designated person or institution that an applicant receive an appointment.

If this looks complicated, it is.  However, rest assured that the service academies will walk you through the process (the Coast Guard Academy does not require nominations, but is listed below for reference):

Army (West Point, NY):  https://www.usma.edu/admissions/SitePages/Apply_Nominations.aspx

Navy (Annapolis, MD):  https://www.usna.edu/Admissions/Apply/index.php#fndtn-panel1-Steps-for

Air Force (Colorado Springs, CO):  https://www.academyadmissions.com/#

U.S. Coast Guard (New London, CT):  https://www.uscga.edu/admissions/

U.S. Merchant Marine (Kings Point, NY):  https://www.usmma.edu/admissions

Most students seeking nominations apply to their local Representatives and Senators.  The Vice-President and President also nominate candidates.

Applicants are encouraged to ask each of their local Representatives and Senators for nominations because each of these office-holders can make only a limited number of nominations.  As the Air Force puts it:

We recommend attaining a nomination from as many sources as you are eligible for, President, Vice-president, Senators and Representatives. This will improve your chances for success if one source does not nominate you.

Certain military officials, JROTC units (discussed below), and even some private military academies can also nominate applicants.  Consult service academy websites for more information on those options.

The nomination is only part of the application, but it is the one with which most students are least familiar.  Applicants who have military connections (e.g., JROTC, relatives in the services) should reach out to each and every one of them for help.

 

Planning further ahead

For those with the luxury of time – students about to enter their sophomore or junior years – now is the time to lay the groundwork for your application and request for nomination.  This is because the service academies have some unique requirements for admission that require sustained effort.

First and foremost is fitness for duty.  Before you get your hopes up, determine as best you can whether you have any medical problems which could pose a barrier to admission.  A medical examination, administered through the Department of Defense Medical Examination Review Board (DoDMERB), is required for admission.

The Air Force is particularly stringent (see https://www.academyadmissions.com/admissions/the-application-process/medical-evaluation/#medicalstandards), but all of the academies have medical criteria.  Waivers may be obtained for certain conditions.

The academies also require evidence of physical fitness.  The Army includes a guide for aspiring cadets:  https://www.usma.edu/dpe/SitePages/Cadet%20Candidates.aspx.

The Army also requires students to complete a fitness test, the Candidate Fitness Assessment (CFA), before applying:

You will schedule your CFA with your physical education (PE) teacher. The Candidate Kit has the test form and instructions you can forward in PDF format. Your performance in six events will be judged:

  • Basketball throw (from a kneeling position)
  • Cadence pull-ups or flexed-arm hang (women’s option)
  • 40-yard shuttle run (for time)
  • Abdominal crunches (number completed in 2 minutes)
  • Push-ups (number completed in 2 minutes)
  • 1-mile run (for time)

 

Leadership

Colleges proclaim that they seek students who have demonstrated leadership.  The service academies “walk the walk.”

Consider this advice from the Air Force:

So how do you prepare for a future at the Academy?

  • Study hard. Get the best grades you can in all subjects — especially English, math and science.
  • Join a sports team. If your school does not have an after-school sports program, you can usually find one at your local community park or recreation center.
  • Become a leader. Join a scouting program like Girl Scouts, Boy Scouts or Civil Air Patrol. Or join another school or local club and go for a leadership position like club president or secretary.
  • Demonstrate character. Consider activities that help others. Get involved with church groups or other organizations that may be helping members of your community.

 

Commitment to military life

Service academies are looking for leaders who will take well to military discipline.  One good way to demonstrate that quality is to join a local JROTC unit and, if possible, attend a summer program at a service academy.

Wikipedia offers a general (if lengthy) overview of the JROTC and a similar program, the NDCC:  https://en.wikipedia.org/wiki/Junior_Reserve_Officers%27_Training_Corps

These programs are intensive preparation for military life; evaluations and recommendations (and even nominations for appointment) from commanders are key.  The downside is that the commitment required to join such a unit may extend to attending very early morning sessions at a different location from the student’s high school.  Some parents even home-school their students as a way of accommodating the conflicting demands.

Another way of demonstrating a commitment to the academies is to attend one of their summer programs:

Army:  https://www.usma.edu/admissions/SitePages/FAQ_SLS.aspx

Navy:  https://www.usna.edu/Admissions/Programs/index.php

Air Force:  https://www.academyadmissions.com/admissions/outreach-programs/summer-seminar/

Coast Guard:  https://www.uscga.edu/aim/ (prospective merchant mariners also attend this program – the U.S. Merchant Marine Academy does not offer its own).

These camps are a pathway to an appointment.  As the Navy puts it: “Summer Seminar gives you a taste of life at the Academy and kick-starts your application journey for an appointment to the Academy.”

Finally, the Academies offer the same way to demonstrate interest as many colleges do – sign up for their mailing lists.  For example, prospective applicants to the Air Force Academy should sign up to be “Future Falcons” (https://www.cvent.com/events/new-future-falcon/registration-c93c8b84f6ef4435995663d7f90b7174.aspx?fqp=true).

 

Academics

Just because the services look for fitness and leadership potential, do not assume that academics are at the bottom of their list of priorities.  Academic performance makes up 50% of the assessment for admission to the Air Force Academy.  The Naval Academy emphasizes STEM coursework, with many midshipmen graduating from the Academy with engineering degrees.  Standardized testing is also important.  For example, the published mean SAT score for the Air Force Academy is in the mid-1200s, with a mean ACT of 30.

 

Work on Plan B

Our nation’s service academies are competitive; admission is the exception, not the rule.  Therefore, students also need to apply to colleges in case they do not win an appointment.  The good news is that academic and leadership ability, along with extracurricular activities, make for a winning application to both service academies and colleges.  As you might expect, I can assist students with both endeavors.

This Blog is Ranked #1

Once again, college ranking season is upon us.  Timed for when high school juniors and their parents are starting to focus on college admissions for the following year, a whole host of magazines, newspapers, and websites publish their annual “rankings” of their “top” institutions of higher learning.

What are you – parent or student – supposed to do with all these opinions?  How do you know which college is the best, or, more important, the best for your student?

When buying a washing machine, many consumers start with Consumer Reports magazine.  Is there a “Consumer Reports” rating authority for colleges?

Alas, no.  Instead of one possibly authoritative source, there are well over a dozen contenders.

Start with traditional news media organizations that publish college rankings, including:

U.S. News and World Report:  https://www.usnews.com/best-colleges (the 800 lb. gorilla of college ratings)

The Wall Street Journal/Times Higher Education of London:  http://graphics.wsj.com/image-grid/college-rankings-2018/?mod=djmc_wsjcollegeranking_search_092617&ef_id=WTGbrgAAAH26bSoC:20171028182719:s

The Economist magazine:  https://www.economist.com/blogs/graphicdetail/2015/10/value-university

Forbes magazine:  https://www.forbes.com/top-colleges/#3fb042601987

Money magazine:  http://time.com/money/best-colleges/rankings/best-colleges/

Kiplinger:  http://www.kiplinger.com/article/college/T014-C000-S002-kiplinger-s-best-college-values-2017.html

Washington Monthly magazine:  https://washingtonmonthly.com/2017college-guide

 

You can also consult web sites which publish their own rankings, such as:

Payscale:  https://www.payscale.com/college-salary-report

The Alumni Factor:  https://www.alumnifactor.com/node/5836

College Factual:  https://www.collegefactual.com/rankings/

College Confidential:  http://www.collegeconfidential.com/admit/college-rankings/

Niche:  https://www.niche.com/colleges/rankings/methodology/

 

Let us not forget guidebooks.  Some are specialized, such as “Colleges That Change Lives,” which profiles 40 schools.  Here, however, I am referring to the largest publishers, those whose guidebooks profile 300 or 400 of the “top” colleges in the nation.  Although these guidebooks, such as Fiske and the Princeton Review, do not rank colleges, their inclusion of a college is an endorsement of sorts for that institution, or at least a culling from the herd of over 2,500 four-year institutions of higher learning.  These guidebooks also include specialized lists, such as private and public university “best buys”.

Why are there so many lists?  For the most part, money is the motivator.   The magazines gain subscribers and, for those readers who find them on the Internet, advertising dollars.  The guidebook companies sell more books.  Which book would you buy:  “382 Colleges” or “The Best 382 Colleges”?  Princeton Review chose the latter title for its book:  https://www.princetonreview.com/college-rankings/best-colleges.

Another reason for the plethora of lists is that rating any institution, including a college, depends upon what is deemed most important to the reader.  Consider the large number of criteria available:

  1. School resources, including size of the endowment, availability of research equipment and funding for undergraduates.
  2. Selectivity – how difficult it is to win admission.
  3. Affordability – usually a combination of price (tuition, room and board, fees) and availability and generosity of financial aid.
  4. Academic record of accepted students.
  5. Graduation rates, at 4 and 6 years.
  6. Student retention rates after freshman year.
  7. Reputation of the faculty.
  8. Faculty’s teaching ability.
  9. Student-professor ratio.
  10. Alumni contributions, both in percentage giving and amounts donated.
  11. Student “engagement” with professors.
  12. Student satisfaction with the college.
  13. Projected earnings for graduates; sometimes expressed as a ratio with tuition to arrive at an ROI (return on investment) for each college.
  14. Recruitment of disadvantaged students (e.g., race, income).
  15. College emphasis on service by its students (e.g., community service requirements).
  16. Student loan repayment rate.

Thus, determining “the #1 college” depends upon what information is deemed important.

The key to using these lists is to understand:

  1. What criteria are used, and how those criteria are weighted, in arriving at the ranking.
  2. Whether the data used is sufficient to support the conclusion drawn from it.
  3. Whether the ranking relies on data which is subject to misinterpretation or even fraud.
  4. Whether the criteria are relevant to your student’s interests.

 

A few examples of “best practices” and, well, “less than best practices,” follow.

 

What criteria are used, and how are they weighted?

The most serious problem arises when the ranking is done in a “black box”, where only the ranking service knows what it considers important.

Consider the explanation offered by Niche (formerly College Prowler) concerning how it creates college rankings (https://www.niche.com/colleges/rankings/methodology/):

The Niche 2018 College Rankings are based on rigorous analysis of academic, admissions, financial, and student life data from the U.S. Department of Education along with millions of reviews from students and alumni. Because we have the most comprehensive data in the industry, we’re able to provide a more comprehensive suite of rankings across all school types.

Very impressive, if a bit vague concerning exactly what that data is, and how much of it is relevant to ranking colleges.  However, we can break it down.  The Department of Education collects a trove of data from colleges – the complete dataset is over 200 megabytes – and makes it public.  Most ranking services use this data, and sites such as www.collegedata.com (highly recommended) compile and present it in usable form.  See https://www.collegedata.com/cs/main/main_choose_tmpl.jhtml.

Niche’s claim to fame appears to be that it combines some of that data with its proprietary student survey data.  Confusion results when it explains how it uses that data (emphasis added in bold).

With clean and comparable data, we then assigned weights for each factor. The goal of the weighting process was to ensure that no one factor could have a dramatic positive or negative impact on a particular school’s final score and that each school’s final score was a fair representation of the school’s performance. Weights were carefully determined by analyzing:

How different weights impacted the distribution of ranked schools;

Niche student user preferences and industry research;

After assigning weights, an overall score was calculated for each college by applying the assigned weights to each college’s individual factor scores. This overall score was then assigned a new standardized score (again a z-score, as described in step 3). This was the final score for each ranking.

Yes, but what factors – criteria – were used, and how were they weighted?  Niche does not say.  For example, if Niche weighted “reputation of faculty” at 90%, then the rankings would be skewed heavily in favor of prestigious schools.

In contrast, many ranking sites are transparent about how they use such data in arriving at their rankings.  See e.g., https://www.usnews.com/education/best-colleges/articles/how-us-news-calculated-the-rankings (U.S. News and World Report – listing factors and the weight assigned to each); https://www.economist.com/blogs/graphicdetail/2017/08/graduate-earnings (the Economist magazine factors and weights used in its ratings for British universities).  If we cannot discern what information underpins rankings, then rankings are not very helpful.
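For readers curious about the mechanics, here is a minimal sketch of the weighted z-score approach Niche describes: each factor is standardized across schools, then combined using disclosed weights.  The schools, factor names, and weights below are entirely hypothetical illustrations, not any service’s actual data:

```python
from statistics import mean, pstdev

# Hypothetical factor scores for three invented schools (not real data).
schools = {
    "College A": {"graduation_rate": 0.92, "student_reviews": 4.1, "net_price": 18000},
    "College B": {"graduation_rate": 0.78, "student_reviews": 4.6, "net_price": 12000},
    "College C": {"graduation_rate": 0.85, "student_reviews": 3.9, "net_price": 25000},
}

# Hypothetical weights -- exactly the numbers this section argues should be
# disclosed.  Note net_price gets a negative weight: cheaper is better.
weights = {"graduation_rate": 0.5, "student_reviews": 0.3, "net_price": -0.2}

def z_scores(values):
    """Standardize raw values to mean 0, standard deviation 1."""
    mu, sigma = mean(values), pstdev(values)
    return [(v - mu) / sigma for v in values]

names = list(schools)
final = {name: 0.0 for name in names}
for factor, w in weights.items():
    standardized = z_scores([schools[n][factor] for n in names])
    for name, z in zip(names, standardized):
        final[name] += w * z

ranking = sorted(names, key=lambda n: final[n], reverse=True)
```

Change the weights dictionary and the ranking changes with it – which is why an undisclosed weighting scheme makes a ranking impossible to evaluate.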

Further, Niche makes explicit what I suspect other rankings purveyors do quietly – it takes a “second look” at the data to make sure that it looks “right” before publishing its rankings.  Here is the quote from above, with different emphasis added in bold:

With clean and comparable data, we then assigned weights for each factor. The goal of the weighting process was to ensure that no one factor could have a dramatic positive or negative impact on a particular school’s final score and that each school’s final score was a fair representation of the school’s performance. Weights were carefully determined by analyzing:

How different weights impacted the distribution of ranked schools;

Niche student user preferences and industry research;

That certainly looks like Niche “tried out” different weightings for each of the unnamed criteria, and then changed those weights if the resulting rankings did not look “right”.  The last sentence even suggests that it adjusts its ranking to conform to what other ranking services report (Niche analyzes student preferences and “industry research”, i.e., how other college insiders rank the schools).  That level of subjectivity is understandable – sort of applying a “smell test” to the results – but it does not add confidence that the weighting is completely objective.  It also limits the possibility of “uncovering hidden gems”.  What is the point of rankings if a school cannot score at the top unless it “looks the part”?

The lesson here is that before relying upon a list, understand exactly what criteria are being used, and how they are being weighted.

 

Is the data sufficient to support the measurement? 

When you step on a scale, the datum that stares you in the face, no matter how unpleasant, is almost certainly sufficient to determine how much you weigh.  However, if three people step on the scale – one at a time – the average of their weights will be insufficient to determine the average weight of the population of the United States.

Some ranking criteria require complete datasets to be relevant, and those datasets are often very difficult to obtain.  For example, many surveys measure “student outcomes,” usually through a proxy such as average earnings after 1 year, 5 years, etc.  Unfortunately, when schools survey graduates about their employment, often only the graduates with “good news” to report answer.  Would you be eager to tell your alma mater that you are unemployed?  And even some of those with good news simply ignore the surveys – perhaps out of fear that a letter from the development department asking for funds will follow.  (On a personal note, for the last 38 years, UCLA has been able to track me to a new address within six months of my arrival – very impressive.)

For example, only 19% of University of Kansas graduates responded to an outcome survey.  See https://blog.naceweb.org/2014/09/09/an-insiders-look-at-first-destination-surveys/.  The university then took the unusual step of looking up LinkedIn profiles to supplement responses.

Take any exemplary “placement rate” with a large grain of salt.  One item to look for when evaluating individual colleges’ placement rates is whether they use the NACE standard for survey responses.  See http://www.naceweb.org/job-market/graduate-outcomes/first-destination/standards-and-protocols/.  Even then, be careful with any statistics about “full-time employment” – many colleges count Starbucks baristas as fully employed.

The college rankings that rely on graduates’ salaries also suffer from incomplete datasets.  See https://www.nytimes.com/2017/12/01/your-money/college-graduation-earnings-privacy.html?hpw&rref=business&action=click&pgtype=Homepage&module=well-region&region=bottom-well&WT.nav=bottom-well (“Want to Search Earnings for English Majors?  You Can’t.”  New York Times, 12/1/17).

In addition, graduates tend to find work close to their colleges.  Unless adjusted for cost of living, student earning surveys will skew in favor of West and East Coast schools.

Other criteria are easy to compute, but relatively meaningless.  The percentage of students graduating after four or six years is susceptible to sample bias.  It is normal for many engineering students to take more than four years to graduate.  Unless you know the sample (MIT, small liberal arts college), low four-year graduation rates coupled with high six-year rates may be “normal”.  Obviously, any school where both rates are low will drop out of any ranking without ceremony.  A better measure is freshman retention – if students are not returning after freshman year, the odds are higher that your student will not, either.

 

Some data is subject to “gaming,” or even fraud

Selectivity is the percentage of applicants who are admitted.  In 2016, Stanford admitted only 1 in 20 applicants, leaving it with a selectivity of 5%.  That means Stanford must be the best, because only the best students get in, correct?

Well, a lousy school will not attract enough students unless it relaxes admission standards.  But be careful with putting too much weight on that correlation, because many colleges “game” selectivity.  Methods include:

  1. Using “VIP applications”. Colleges send out these one-page, no-fee applications to thousands of students.  Because they do not require essays or application fees, many students fill them out and return them.  See https://www.propublica.org/article/the-admission-arms-race-six-ways-colleges-can-game-their-numbers.  I commented on this in my “You’ve Got Mail” post (6/22/15).

  2. Adopting the Common Application. The Common Application is used by over 500 colleges.  Students need only fill out the application once to apply to all member colleges.  Most colleges require students using the Common Application to respond to one or more essay prompts unique to the school.  Stanford has 11 such prompts, although many of them are short.  Colleges that wish to encourage applications do not require any supplemental essays.

  3. Going test-optional. A growing “fair test” movement in college admissions rejects standardized tests (SAT, ACT) as merely reflecting affluence or penalizing students who do not test well.  These schools allow students to apply without test scores.  Although this may be laudable, it can also increase applications.

 

Colleges using these tools can lower their selectivity number, which, remember, is the percentage of applicants admitted, simply by inducing more students to apply while not increasing the number of students accepted.  Why would they do this?  U.S. News and World Report includes a college’s selectivity number in its ranking system, and guidebooks list colleges’ selectivity ratings prominently.  Students use selectivity as a proxy for “desirability,” which results in even more applications; a vicious cycle for students ensues.
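The arithmetic is worth seeing once.  In this sketch (with invented numbers), the admitted class stays fixed while no-fee mailers and dropped essay requirements double the applicant pool:

```python
def selectivity(applicants: int, admitted: int) -> float:
    """Selectivity (admit rate): the percentage of applicants admitted."""
    return 100 * admitted / applicants

# Hypothetical college: the number admitted never changes, yet the
# published selectivity figure is cut in half.
before = selectivity(applicants=20_000, admitted=2_000)  # 10.0
after = selectivity(applicants=40_000, admitted=2_000)   # 5.0
```

The college looks twice as “selective” without admitting a single stronger student.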

The U.S. News rankings also factor in the SAT and ACT scores of enrollees.  You might ask how colleges could possibly “game” such numbers.  Well, five colleges since 2012 have been caught reporting higher scores than students actually earned – we call that fraud.  And there may be more who have not been caught.  See https://www.propublica.org/article/the-admission-arms-race-six-ways-colleges-can-game-their-numbers.

 

The criteria are not useful to students

U.S. News and World Report assigns 22.5% of its college ranking to “undergraduate academic reputation”.  What does that phrase mean?  Here is the explanation from U.S. News:

For the Best Colleges 2013 rankings in the National Universities category, 842 top college officials were surveyed in the spring of 2012 and 53 percent of those surveyed responded. Each individual was asked to rate peer schools’ undergraduate academic programs on a scale from 1 (marginal) to 5 (distinguished).

Those individuals who did not know enough about a school to evaluate it fairly were asked to mark “don’t know.” A school’s score is the average score of all the respondents who rated it. Responses of “don’t know” counted neither for nor against a school. 
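The scoring described above is simple averaging, with “don’t know” responses excluded.  A toy illustration (the ratings are invented):

```python
# Hypothetical peer ratings of one school on the 1 (marginal) to
# 5 (distinguished) scale; None stands in for a "don't know" response.
ratings = [5, 4, None, 3, None, 4]

# "Don't know" counts neither for nor against the school,
# so those responses are dropped before averaging.
valid = [r for r in ratings if r is not None]
reputation_score = sum(valid) / len(valid)  # 4.0
```

Note that a school rated by only a handful of respondents is scored by the same formula as one rated by hundreds.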

The problem is that “reputation” is intangible – the reader must ask: “reputation according to whom?”  At least U.S. News makes the answer clear:  college officials and high school counselors.  But should students care how college administrators regard their colleagues (some of whom will be rivals)?  Unless the school is nearby (in which case it may well be a rival), most college administrators are unable to make fine distinctions between colleges.  And high school counselors rarely follow up with their charges to determine their experiences after enrollment.

Indeed, shouldn’t students care a lot more about what employers think about the schools?  The Wall Street Journal published just such a list:  https://www.wsj.com/articles/SB10001424052748704554104575435563989873060 — in 2010.

And some ranking services reject traditional criteria as merely a proxy for wealth.  For example, The Washington Monthly publishes its list as “our answer to U.S. News & World Report, which relies on crude and easily manipulated measures of wealth, exclusivity, and prestige to evaluate schools.”

Alas, not many career-oriented students (and parents) will be interested in its alternative:

We rate schools based on their contribution to the public good in three broad categories: Social Mobility (recruiting and graduating low-income students), Research (producing cutting-edge scholarship and PhDs), and Service (encouraging students to give something back to their country). https://washingtonmonthly.com/2017college-guide

Kudos to those who are.

The largest failing of college rankings is that they are usually so general as to be meaningless.  Students in STEM fields may not value small class sizes as much as a school’s facilities and labs.  Liberal arts students are just the opposite.  Students who are planning professional careers may care less about general “outcome measures” than about acceptance rates at professional schools (which the ranking industry tends not to measure).  Wealthy families may not worry about financial aid – for less wealthy parents, the inquiry may begin – and end – with that criterion.

 

College rankings can be useful

With the caveats expressed above, college rankings do have their uses.  Here is how to use them:

  1. Understand what is being measured, and the weights assigned to each criterion.
  2. Understand what criteria are omitted, and determine if the omission is important to your student’s needs.
  3. Choose the most specific ranking possible. U.S. News and World Report, along with other services, publishes “sub-lists”, e.g., best undergraduate engineering programs.  Prospective engineers should start with that list.
  4. Consult more than one ranking. Schools that are consistently ranked highly may be more likely to deserve their rankings.
  5. Prepare to dig deeper. Sometimes variations in formulas make little difference in the rankings they produce.  Eight of the top 10 universities on the U.S. News list were also in the Journal/THE top 10.  The other two were right behind.  https://www.washingtonpost.com/news/grade-point/wp/2016/10/20/whos-no-1-as-college-rankings-proliferate-it-depends/?utm_term=.4f57fb7feafc.  You will have to do your own research to tease out the differences among high-ranked schools, or make decisions based on other factors (e.g., geography, cost).

How do I use rankings?  First, I use them to ascertain which schools my clients will value highly.  I may have to overcome preconceived notions with research.  Second, I use them like a string around a finger – when I am composing a list of candidate schools, I check the rankings to make sure that I have evaluated all of the commonly considered schools.

I also use college matching services, such as BigFuture (https://bigfuture.collegeboard.org/college-search) to create lists of schools, but that is grist for another article.

Congratulations! You’ve Been Admitted to the University of California – For Now

Universities reserve the right to rescind admission in extreme circumstances, such as a student’s commission of criminal acts, failure to graduate, and the like. Indeed, I commented on just such a situation involving Harvard a few months ago in this blog. However, the University of California (“UC”) is taking this a step further, and not in a helpful way. Students may find that the UC will treat mild cases of “senioritis,” or even something beyond the student’s control, such as a lost transcript, as grounds for rescinding their admissions.

The UC uses one application for its nine campuses that offer undergraduate instruction. Students “check off” boxes for the UC campuses they are applying to, and each campus makes its own admissions decisions.

The application instructs students not to send transcripts; students self-report their grades. To ensure that the reported grades are accurate, all offers of admission are made provisional upon the UC receiving official final transcripts verifying those grades.

Those provisional offers also include additional conditions which must be satisfied before enrollment. The UC refers to these provisional offers as “contracts.”  (As noted below – in a paragraph only a lawyer could love – it is not clear that these are true “contracts”.)

Unfortunately, some of those additional conditions protect the UC against more than fraud, and students may face unwelcome surprises in the form of the University rescinding admissions based on seemingly trivial grounds, including “senioritis.” For example, this summer, the University of California, Irvine (“UCI”) rescinded 499 admissions because students had failed to arrange for their high schools to deliver their final transcripts by July 1, as required by those agreements. This seemed like an overreaction to a trivial deadline, but the news soon broke that there was more at stake for UCI.

At the time, UCI was over-enrolled by approximately 800 students. The university had already tried to convince some accepted students to enroll instead in a separate “academy” with a 50% tuition break. However, UCI failed to mention that those new students “would have to cancel their enrollment as regular freshmen, take a more limited menu of classes in the adult education division and give up access to campus housing and financial aid.” (LA Times, 8/2/17; see also https://www.insidehighered.com/admissions/views/2017/08/07/essay-lessons-controversy-over-university-california-irvine-revoking.)

Not surprisingly, the offer was not taken up by enough students to solve UCI’s over-enrollment problem. So UCI turned to its contracts and rejected those students who had failed to submit transcripts timely.

On Friday, campus spokesman Tom Vasich conceded that the admissions office was more stringent than usual about checking requirements “as a result of more students accepted admissions to UCI than it expected.”

The vice chancellor of student affairs also fessed up:

I acknowledge that we took a harder line on the terms and conditions this year and we could have managed that process with greater care, sensitivity and clarity . . . .

In other words, the University of California reserves the right to punish some breaches of its “contracts” more than others depending upon whether doing so is to its advantage. Ultimately the public outcry – and examples where the UC had indeed received the transcripts timely but mistakenly rescinded admission – forced the university to back down.

But questions remain because some of the terms in these “contracts” are vague. For example, to discourage “senioritis” – failure to maintain good grades in the student’s senior year of high school – the UC requires students to maintain a certain standard of excellence in their senior years. But what standard?

The general rule is that students must maintain a minimum weighted 3.0 GPA in college prep classes (the “a-g” requirements in UC lingo), with no “D” or “F” grades. However, for some of the most prestigious campuses, the contract may specify higher GPAs or test scores. See, e.g., https://talk.collegeconfidential.com/university-california-los-angeles/1990196-ucla-provisional-contract-for-ib-student.html.
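For readers who like to see the arithmetic, here is a rough sketch of that baseline condition in code. This is purely illustrative – the course weighting and honors-bump rules are simplified, the function name is my own invention, and the actual terms are whatever your student’s contract says – but it captures the two-part test: no “D” or “F” grades, and a weighted GPA of at least 3.0.

```python
# Illustrative sketch (not the UC's actual formula) of the baseline
# provisional-contract condition: a weighted 3.0 GPA in "a-g" courses
# with no "D" or "F" grades. Honors weighting rules are simplified here.

GRADE_POINTS = {"A": 4, "B": 3, "C": 2, "D": 1, "F": 0}

def meets_baseline(courses):
    """courses: list of (letter_grade, is_uc_honors) tuples.

    Pluses and minuses are ignored, because the UC does not compute
    them for high school coursework (so a C- counts as a C).
    """
    points = []
    for grade, honors in courses:
        base = GRADE_POINTS[grade.strip("+-").upper()]
        if base <= 1:  # any "D" or "F" is an automatic breach
            return False
        points.append(base + (1 if honors else 0))  # simplified honors bump
    return sum(points) / len(points) >= 3.0

# Example: three B's in regular classes plus one A in an honors class
# yields a weighted GPA of 3.5, which satisfies the baseline rule.
print(meets_baseline([("B", False), ("B", False), ("B", False), ("A", True)]))  # True
```

Note that a single “D” fails the test no matter how high the average is – which is exactly why a bout of second-semester senioritis can be fatal.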

And then there are campuses that add “expectations.” Consider “senioritis” in light of this language from the University of California, Santa Cruz:

In accepting admission at UCSC, you agree that you will:

  1.  Earn a level of academic achievement in your fall and spring courses (as you listed on your UC application) consistent with your previous coursework. Earn a grade of C or higher in those courses (or equivalent for other grading systems).

Wait. Look at the text again: “[e]arn a level of academic achievement in your fall and spring courses (as you listed on your UC application) consistent with your previous coursework.”

UC Davis appears to have similar language in its “contract”. See https://www.ucdavis.edu/admissions/freshman/admitted/.

Do we now have two requirements?  Must a student not only avoid “D”s and “F”s, but also produce the same grades (mostly “A”s) that got them admitted in the first place?  That is a lot harder to accomplish, and essentially extends the application period until high school graduation. “Senioritis” could be fatal under such conditions. Say it ain’t so, UCSC!

Regrettably, it is so, at least for UCSC.

UCSC addresses the point in an FAQ (https://admissions.ucsc.edu/apply/conditions-faq.html):

FAQ 1A: My contract indicates “Earn a level of academic achievement in your fall and spring courses consistent with your previous coursework, with no grade lower than a C (or equivalent for other grading systems).” What do you mean by “consistent?”

Answer 1A: We expect that the grades you will earn in your senior year will look similar to the grades you earned in the first three years of your high school career; for instance, if you were a straight-A student for three years, we would expect A’s in your senior year. Consistency in your level of achievement must be carried through your senior year coursework.

FAQ 1E: I earned a C- in a course. Does that mean my admission will be cancelled?

Answer 1E: The University of California does not compute pluses or minuses in high school coursework. Therefore, a C- is considered equivalent to a C grade. Remember, however, that we also expect a consistent level of academic achievement in your coursework. (Emphasis added.)

Well, so much for students pausing during their last semester to enjoy the high school they attend in a way they previously had to defer for 3 ½ years of grueling competition. Of course, we do not know whether UCSC (or UC Davis) routinely enforces this standard.

But this requirement is worse than draconian – it is confusing. When analyzing a contract, lawyers look for the consequences of a violation of its terms, what we call a “breach.”  The UC’s “contracts” take a sweeping approach – any violation of any of its terms is considered grounds for canceling the contract and rescinding the admission.

UCSC is no exception. From that same linked document:

Failure to meet your Conditions of Admission Contract will result in the cancellation of your admission. It is your sole responsibility to meet all conditions. Read each of the six conditions below and ensure that you meet all of them. Accepting your offer of admission signifies that you understand these conditions and agree to all of them. (Emphasis in original.)

That appears clear, but for one problem. What does it mean for UCSC to “expect” students to have grades that are “consistent”?  The formulation is vague: how many “B”s can a straight-A student sustain before being in breach?  And what do we make of the verb “expect”?  It is weaker than “require,” and lacks the commanding “shall.”  Is such an expectation enforceable?

I am not practicing law anymore, so I leave this question to those who are. But the fact that this is a reasonable question is a huge problem – both for UCSC and our students.

Looking at the big picture does not yield a pretty sight. The University of California requires students to sign “contracts,” even though minors are generally considered to lack the capacity to do so. (Not a small point for lawyers – see https://www.mnscu.edu/system/ogc/docs/HANDBOOK%20Minors%20on%20C.pdf; I suggest that a university’s authority to enforce a policy, as opposed to the letter of a contract, may be subject to heightened due process concerns).

The UC reserves the right to enforce those “contracts” differently depending upon whether it is to the UC’s advantage to do so. And it is sometimes vague about what it wants a student to do (earn “consistent” or “strong” grades through all four years) and what conduct constitutes a breach of contract allowing cancellation of admission.

Worst of all, by the time the UC decides whether to exercise its contractual remedies by rescinding admission, students have already declined all other offers in accordance with the May 1 rule used nationwide by colleges.

The UC has a sterling reputation, and I am a proud graduate of UCLA. However, as I’ve said previously in other contexts, caveat emptor.

You and your students should read any admission letters and accompanying materials very carefully before accepting an offer from the University of California. Upon accepting that offer, labor diligently to make sure that your student’s transcript arrives at their campus (and confirm in writing that it did), and that all other terms of the contract/agreement are satisfied.

Finally, of course, “senioritis” can be a serious threat to any student’s college aspirations, but it is a particularly dangerous malady for those planning to attend the UC. If it looks like grades will be a concern, call Admissions and warn them; they may be more inclined to be lenient if notified early than the cold, hard text of their provisional “contracts” suggests.

A New Addition to College Curricula:  Preparing to Fail

The New York Times generally covers education issues in its “Fashion and Style” section.  Make of that what you will (including my reading habits), but one of their recent articles caught my eye: “On Campus, Failure is on the Syllabus” (June 24).

The lede is illustrative:

Last year, during fall orientation at Smith College, and then again recently at final-exam time, students who wandered into the campus hub were faced with an unfamiliar situation: the worst failures of their peers projected onto a large screen.

“I failed my first college writing exam,” one student revealed.

. . . .

The faculty, too, contributed stories of screwing up.

“I failed out of college,” a popular English professor wrote. “Sophomore year. Flat-out, whole semester of F’s on the transcript, bombed out, washed out, flunked out.”

“I drafted a poem entitled ‘Chocolate Caramels,’ ” said a literature and American studies scholar, who noted that it “has been rejected by 21 journals … so far.”

It is now a meme (see my entry on “Watch What You Post!” for a definition) that millennials are unusually fragile creatures who require cosseting against the vicissitudes of the real world.  In other words, they are weak creatures who have been raised in a bubble.

This is an odd idea on its face when applied to students admitted to elite colleges and universities.  These students have won an extraordinarily competitive race for four years to demonstrate that they are superbly equipped for any academic challenge.  Why would these students, of all people, suddenly crumble when they arrive at college?

Late adolescence is an uncertain, and even dangerous, moment.  At the extreme, this is the time when some forms of mental illness are more likely to emerge, such as bipolar disorder, schizophrenia, and depression.  See https://nami.org/collegeguide (noting that one in five adults experiences a form of mental illness).  Students are also experimenting with adult life, including entering relationships that can end badly.

Further, many of these students have never experienced academic failure before they arrive at college.  The winners of a grueling race that penalizes even a “C” quite heavily, these students are heavily invested in succeeding, and are likely to be blindsided by failure.

Colleges are becoming alarmed at how often students buckle under the – for them – new experience of failure.  Smith College reminds students that 64% of them will receive a B-minus or lower during their time there.

And it goes a step further:

[W]hen students enroll in [Smith’s] program, they receive a certificate of failure upon entry, a kind of permission slip to fail. It reads: “You are hereby authorized to screw up, bomb or fail at one or more relationships, hookups, friendships, texts, exams, extracurriculars or any other choices associated with college … and still be a totally worthy, utterly excellent human.”

A number of students proudly hang it from their dormitory walls.

Smith is just one of many elite colleges rolling out such programs.

[A] consortium of academics soon formed to share resources, and programs have quietly proliferated since then: the Success-Failure Project at Harvard, which features stories of rejection; the Princeton Perspective Project, encouraging conversation about setbacks and struggles; Penn Faces at the University of Pennsylvania, a play on the term used by students to describe those who have mastered the art of appearing happy even when struggling.

Some of this can be attributed to the newly fashionable idea that “grit” is an important ingredient for success.  As with most such revelations, a grain of truth can be puffed up into a silo full of grant-funded excesses seemingly devoted to accentuating the obvious.

However, such movements can also be useful, and this is one of them.  The “lesson” for students and the college counselors who work with them is that students should be made aware of the challenges ahead.  Most important, students should be told before they leave the nest that colleges have resources available to help students in distress.  Students should know the location of the Counseling Office on campus.  They should be instructed to seek help, and informed that doing so will not result in any social or parental stigma.

We all know that failure is part of life.  Make sure that your student knows that, too.  For my part, my students who just graduated high school and thought that they had heard the last of me until my Christmas break “check-in” are about to receive an e-mail with the New York Times article attached.