The Invasion of the Reps!

’Tis the season when college admissions representatives (“reps”) descend on high schools. Prepare to take advantage of this opportunity.

Not all colleges send reps to high schools to recruit students. Some elite and/or small universities combine forces to present sessions off-campus in the evening. Other colleges offer information sessions in various cities – if you are not in a large city, you probably will not see them.

However, a significant number of colleges send reps directly to high schools. These colleges generally assign reps to assess applications from different regions and states; part of their job is to visit schools in their assigned area. This means that reps who visit your school are likely to be part of the team evaluating your application.

The typical rep (not rap) session is devoted to an informal presentation on the merits of the college, followed by questions. Students are usually allowed to skip class to attend. The entire session usually lasts 30-60 minutes.

Most reps confine their visits to the “strongest high schools” in your area. If your school does not typically host college reps, then you should consider examining the websites of the strongest schools in your area and choosing one or two information sessions with colleges to which you intend to apply. If you can swing the logistics (traveling from one school to the other, losing class time, etc.), seek permission from the host school (and your own) to attend the information session at the other school. Of course, you will identify yourself to the rep and explain the situation. Your initiative will seize the rep’s attention, which may bolster your admissions chances.

You have several objectives when attending a college information session:

  1. Learn information about the school that will help you decide whether to apply.
  2. Learn information about admissions that is not available elsewhere.
  3. Register your interest in the college and, if possible, impress the rep.


Here are some tips to accomplish these objectives.

  1. Determine which colleges are sending reps to your area.

Apart from checking with your high school about which colleges are sending reps to the school, most college websites – under “Admissions” – will announce when and where reps will be visiting your area. This may take some digging, but it’s there. If all else fails, call the college’s Admissions office.


  2. Choose your sessions carefully.

Attending sessions often requires skipping classes or extra-curricular activities. Unless you are seriously interested in the college, skip the session. As one rep notes:

I always felt one of the great ironies of the high school visit was that I was there exhorting students to take the toughest classes, do as well as possible…and then skip them when I came to school. That never made sense to me.


  3. Research the college.

This is another reason to choose your sessions carefully:  you should do some homework on the college before the session.

Determine whether the college cares about “demonstrated interest.”  College sessions generally include a “guestbook” where students sign in to register attendance. Reps also hand out their cards, giving the student the opportunity to send “follow-up questions”.  Most colleges keep track of students who attended the session as an indicator that the student “demonstrated interest” in the school. But a few do not.

Fortunately, there is data out there from which you can determine which colleges care about demonstrated interest. Begin by checking the college’s website. A few colleges, such as Carnegie Mellon, make it abundantly clear that they do not track this information.

Next, consult the database that compiles data submitted by colleges to the U.S. Department of Education. Find your college, and then click on the “Admissions” tab. About halfway down the page, you will find “Selection of Students.”

Here is the data for one elite university.


Each factor is rated as “Very Important,” “Important,” “Considered,” or “Not Considered”:

  • Rigor of Secondary School Record
  • Academic GPA
  • Standardized Tests
  • Class Rank
  • Recommendations
  • Essay
  • Interview
  • Level of Applicant’s Interest
  • Extracurricular Activities
  • Volunteer Work
  • Character/Personal Qualities
  • First Generation to Attend College
  • State Residency
  • Geographic Residence
  • Relation with Alumnus
  • Religious Affiliation/Commitment
  • Ethnicity
  • Work Experience

As you can see above, the ratings are: “Very Important”; “Important”; “Considered”; and “Not Considered.”  One of the attributes is “Level of Applicant’s Interest.”  Most colleges – including the one above – state that such interest is “Considered.”  A few label it as “Important.”  That is often a code word for “you’d better go visit that college if you want to win admission.”  (Yes, Rice University, I’m looking at you.)

Colleges that state that the Level of Interest is “Not Considered” should be taken at their word. These are often elite colleges that do not want anxious students flooding onto campus simply because they believe they must. Feel free to attend their high school presentation sessions, but do not assume that you are bolstering your admissions chances by doing so.

Note:  the “Interview” attribute refers to interviews offered by the college after receiving your application. Most of the colleges that request interviews will offer to have an alumnus interview you close to your location; there is usually no need to visit the school for that purpose. For purposes of deciding whether to attend a rep session, you can ignore this attribute.

We’re just getting started with research here. Know the basics:  the college’s location, the size (and perhaps gender/racial composition) of its student body, and the courses of study available. A quick review of the college’s website should yield this information.

Then dig a bit deeper:  determine the GPA and SAT/ACT scores necessary to be competitive, the schools, departments, and majors for which the school is best known, and some of the colleges with which it typically competes for students. There are several resources where you can find this information, but for a “quick look”, I consult one site in particular:  type in the school name, and choose the “overview”, “admissions”, and “student life” tabs.

Then run the college’s name through a search engine and browse the links that appear, from Wikipedia to various ranking sites (e.g., Niche, Princeton Review). Hard-core readers with plenty of time might look up the school on College Confidential as well.

Finally, you should examine the Common Application (or the Coalition Application or, in a few cases, the college’s own application) and determine what supplemental essays and short-answer questions are contained in the application for the college visiting your high school. Be patient – there is a very good reason for doing so.

Now you are ready to evaluate the rep’s presentation and ask an intelligent question.

Here is a step-by-step guide to getting the most out of the session itself.

  1. Arrive 5 minutes early. That is enough time to snag a front row seat – you want to be seen – and not so early that you are stuck talking to the rep with nothing to say. Talk to the rep after the session, not before.
  2. Sign the guestbook.
  3. Turn off your cell phone and put it away.
  4. Listen carefully to the presentation; take notes.

The mere act of taking notes marks you as a serious applicant. And you may use the information later.

What should you be listening for?

What does the college consider its strengths?  Colleges compete for your attendance. Let them sell the benefits their school offers. Take notes, because at some point in your application to that school you will want to express interest in those benefits. In other words, you will want to tell them – at least briefly – what they want to hear. This is where you connect the information and sales pitch with your answer to their “Why Our College” question.

What information is the rep revealing about admissions policies and/or priorities?  Although most colleges simply repeat the information on their websites, sometimes they will deliver a nugget of information you can use. Be alert for suggestions that you apply early, or that the college is looking for certain extracurricular activities. (Although you cannot invent an activity, this information helps you decide which activities to emphasize.)  One year a rep from a top college responsible for my student’s region emphasized that he likes to read humorous essays. My client obliged – she was admitted. (Of course, her academic record might have had something to do with it.)

  5. Do not ask more than one question during the session.

Asking more than one question may give the impression that you are trying to dominate the session. Of course, if you have a follow-up question, do not be afraid to ask it.

Avoid questions where the answers are apparent on the website. For example, asking about admissions requirements that are set forth on the website makes you look lazy. This is one reason to do the research I suggested above.

Ask more general questions that may pertain to your interests, such as:

Question: “I am interested in exploring both sciences and the liberal arts. How difficult is that to do at ________?”  “Is it common for students to double-major?”

Question: “How popular is undergraduate research [presuming your interest is in STEM] at ______?”  “What are the requirements for students who are interested in performing research?”

Question: “Is there much interaction between the students on campus and the surrounding community?”  “What are the internship (or co-op – where you work full-time) opportunities in the area?”  [This question can be tweaked depending on whether you want to know about social, business, or academic opportunities – if you are a budding social worker, you may care more about the demographics of the area than other students.]

I list two questions for each topic because another attendee might ask the first one.

After the session ends:

  1. If you haven’t done so already, sign the guestbook.
  2. Engage the rep.  Start by asking for a business card.

There will probably be a line of students waiting to talk to the rep. If you have plenty to say, then you might want to linger in the back of the line in hopes of being the last student who talks to the rep. You want to be memorable, in a good way.

When you get home after school, if you are still interested in the college, write a quick paragraph (based on your notes) and put it into a file for that college (electronic or physical) for use when drafting your application. Remember to save (or scan) that business card.

And write a thank-you note and e-mail it to the rep.  Something simple will do:  “Thank you for your information session today at [“X high school”].  I found it very informative and useful.”  Noting the high school is important, because reps often visit more than one high school each day.


A Test That Is Too Easy Results in Hard Feelings

A not so funny thing happened on the June 2018 SAT test – many students were surprised by their low marks. In fact, the entire testing community appears to have been taken aback, and not in a good way.

An article from Compass explains all of this in detail, and other test-prep companies have also discussed this issue.

The takeaway for students and parents is that the June 2018 SAT test was too easy. As a result, the College Board psychometricians (a vocabulary word that will probably never appear on the SAT – it refers to experts on testing) used a very steep curve to avoid giving the same scores to everyone. This curve punished mistakes harshly. As the Compass article linked above points out:

Compare this to how the June SAT 2018 Math fits in among its fellow new SATs. A 650 could be achieved with 50 correct answers. That’s the lowest scaled score the new SAT has ever produced for 50 correct answers. The highest score it has produced for 50 correct answers on an actual, released exam is 740 points — a 90-point swing! So in its first two years, the new SAT has approximately doubled the extremes seen on the old SAT over 10 years and 4 times as many exams.

What will happen to those scores?  The College Board remains committed to the results of the test, going so far as to insist that the results were not “curved,” but “equated.”  Technically this may be correct, but the impact for many is that their scores on that test do not reflect their ability to score well on the “typical” SAT.

However, the furor surrounding this exam changes the usual calculation concerning when students should re-take a standardized test. Normally, students with two or more SAT (or ACT) results should not retake the exam except under two circumstances: 1) students are very confident that they will be able to improve their scores because they have now received extra time for a learning disability, were ill previously, or have put in more time in studying for the test (perhaps with help from a test-prep company); or 2) students absolutely need a higher score to stand a chance of being admitted to their “reach” schools.

The risk of taking the exam repeatedly is that some colleges require students to send all test results. For those colleges, there are several risks. First, the student may get unlucky and receive lower scores on the retake. Second, some of those colleges (e.g., Ivy League) frown on students taking the SAT or ACT multiple times. Finally, colleges tend to discount later scores as being due to the “practice effect”, i.e., students who are more familiar with the test typically post higher scores.

Some of these risks may be reduced for students who scored below their expectations on the June 2018 exam. Although colleges will probably not discard the June 2018 scores, the furor over the test means that they will consider those scores with an asterisk – they are more likely to accept later scores favorably.

If you are dissatisfied with your score on the June 2018 SAT, you should seriously consider retaking the test.

SAT Score? No, But Check Out How I Did on the Gaokao!

A couple of news items grabbed my attention in June.  First, the universe of test-optional schools expanded with the addition of a top-ten institution, the University of Chicago.


Just say no to standardized tests?

Test-optional schools are what the label implies:  students are not required to submit SAT/ACT scores when applying.  The “test-optional” practice has become increasingly popular, with 100 colleges adopting the practice during the last five years.  Why is this becoming popular?  From my vantage point I see good – and perhaps not so good – reasons.

Some students just do not test well.  Students with learning disabilities who need extra time sometimes do not receive accommodations, or do not benefit from them.  Standardized tests stand accused of cultural bias – asking questions which presuppose knowledge that students not raised in “mainstream” America lack.  The “test-prep” industry – which often only the wealthy can afford – can boost some students’ scores at the expense of their less financially endowed competitors.  In other words, in some cases the tests simply do not measure students’ academic potential.

However, some colleges adopting this practice are being strategic rather than altruistic.  Colleges are businesses, and once you get beyond the top 100 colleges, demand for places starts to ebb.  Many of those “test-optional” institutions are small private colleges.

“Going test-optional” enlarges the pool of interested students.  In a few cases, those extra students can be essential for the financial survival of the institution.

Schools that go this route can obtain another benefit – selective reporting.  Students with low standardized test scores who stand to be admitted for other reasons (e.g., legacy admissions, athletic scholarships) are most likely to not submit those scores.  When the school reports its admission statistics, its average standardized test scores for admitted students will rise commensurately.  This makes the college appear more selective, which can affect its rating in reference lists such as U.S. News and World Report.  These colleges get the best of all possible worlds:  more applicants, higher reported test scores, and a few rungs up the “prestige” ladder for schools (which in turn prompts more students to apply).

A few “test-optional” colleges have “refined” this concept by requiring submission of standardized test scores to obtain financial aid.  This appears to be an obvious “pay to play” plan, where students with poor test scores are not required to report them if all they seek is admission.  If those students are admitted, they subsidize their better testing peers by paying full tuition.

Many – but not all – top colleges have resisted adopting a “test-optional” policy because of the stigma attached.  The SAT and ACT are marks of quality (however imperfect); announcing that they are no longer required for admission suggests a lack of rigor.  This is increasingly the case because widespread grade inflation is making GPAs less reliable for distinguishing among students.

This is what makes the decision by the University of Chicago so surprising.  It is an elite institution with perhaps the most “cerebral” reputation of them all.  This was where the Manhattan Project helped win WWII, and the Chicago School of Economics changed economic policy around the world.  This is the university famous for its admissions essay questions – a few years back applicants were invited to explain “What is odd about odd numbers?”  The university is known as the place “where fun goes to die,” because everyone is so busy studying.  These students run an intellectual gauntlet second to none.

Perhaps most important, the University does not need more prestige – it already ranks #3 on the all-important U.S. News and World Report list of top colleges.  Thus, when the University of Chicago goes “test-optional”, the world of higher education pays attention.

So why did the University do it?  It cites some of the “good” reasons stated above; it is coupling the move with an increase in financial aid for families earning under $125,000.  If there were also strategic motives behind the plan, they were not announced.  (No surprise there.)

Two questions remain.  First, will this change the demographic of admitted students at the University?  Will more minority and students with learning disabilities apply, and will they be accepted?  Second, will other schools follow the University’s lead?

We will have to wait a few years to find the answers to both questions.  In the meantime, the “test-optional” movement just gained a significant boost.

Students considering test-optional schools should carefully evaluate the testing policies of individual colleges – they can differ in important respects.  College counselors can add value here.


The rise of the gaokao

Meanwhile, there was another standardized test in the news this week – the gaokao.  This one dwarfs the SAT and ACT in almost every respect.  About 3 million U.S. students take the SAT or ACT, while approximately 9 million Chinese students sit for the gaokao.  As in some European countries, a single exam dominates admissions: the gaokao is the most important, if not the only, criterion used by Chinese colleges.

The gaokao is a nine-hour exam given over two days.  Compulsory subjects include Chinese, mathematics, and, usually, English (students can substitute other languages); students also sit for an additional subject depending upon whether they are pursuing STEM or other careers.

It is hard to overestimate the importance of the gaokao to Chinese students.  The results determine which colleges students can attend and the majors they may pursue there once admitted.

Many students compress their high school careers so that they can graduate early and spend the next year cramming for the exam.

The hype around the exam makes stories about test anxiety in the United States seem tame.  Per the South China Morning Post:

At Hengshui Middle School in Hebei province, where more than 100 students earned admission to the prestigious Peking and Tsinghua universities, students have been given IV drips as they study, believing that it will help them with concentration and focus. Girls are given contraceptive pills to delay their periods until after the exam.

Similar stories abound.  See the article in The Atlantic (referred to below as “the Atlantic article”).

The Atlantic article suggests that the growing number of Chinese students seeking to attend college overseas is driven by worry about – and disdain for – the gaokao.  It notes that Chinese students taking the SAT and ACT are under similar pressure, and links that pressure with the recent wave of cheating scandals on all three exams in Asia.  (The penalties are a bit stiffer for cheating on the gaokao – cheaters are banned from retaking the test for years, and those caught facilitating cheating face prison sentences of up to seven years.)

The gaokao and the SAT/ACT share one justification – identifying talent.  The archetypical example in China is the student in the rural provinces who would otherwise have been consigned to a life of farming but did well enough on the gaokao to change her life.  It is ironic that the SAT/ACT are threatened in the U.S. while the gaokao remains dominant in China.

One reason for the dominance of the gaokao is that for many Chinese students, the test is the only guaranteed authentic mark of achievement and talent.

From the Atlantic article:

Guessing the percentage of fraudulent transcripts in applications from China is a popular parlor game among educators over here. Unscientific estimates abound: One prominent agent who works with students at some of the best high schools in China recently estimated to me that at least half of the transcripts in China are doctored to look like the students have done well in a robust high school curriculum, when the reality is one of almost constant memorization and practice tests. Unfortunately, no one in the college prep industry in China would be surprised if the actual percentage was significantly higher.

The Chinese system poses a challenge for U.S. colleges that tout their “holistic college admissions” processes.  How can they distinguish among foreign students who spend all of their time studying for one exam, and whose transcripts, even if produced, may be fraudulent?

The obvious answer is to consider the results of the exam.  After all, colleges rely on the TOEFL exam to assess competency in English.

And so it begins.  Newspapers this week trumpeted the decision by the University of New Hampshire to consider the gaokao.  But UNH is only the first public university to do so – the University of San Francisco (“USF”) started accepting the gaokao in 2015.  Dozens of universities in Australia, Canada, and Europe accept it.

An article from Inside Higher Ed reporting on USF’s program is skeptical that many U.S. universities will follow, mostly because the timing of the gaokao conflicts with the admissions cycle.

We shall see.  When 9 million test-takers sit for an exam, the number of “underperformers” is in the seven-digit range.  It is therefore no surprise that U.S. universities are after some of those test-takers, preferably those who will pay full tuition in the United States.

Per the USF administrator in charge of Chinese admissions:

He anticipates that USF will set gaokao cutoff scores equivalent to the marks needed to get into a first-tier Chinese university in each province, plus or minus a few points. Students who are admitted based on their gaokao scores will pay their own way, though Nel said they could be eligible for merit scholarships of up to $20,000 per year.

Brave words, but I doubt that the high standard will be maintained.  After all, the goal of almost every student who takes the gaokao is to snag a spot in a first-tier Chinese university.  Very few will trade that for a spot at most U.S. colleges.  Expect lower, less publicized, standards as this practice grows.

And it will grow.  With 337,000 Chinese students currently enrolled in U.S. colleges, and financial pressures on those colleges increasing as public funding continues to lag, do not be surprised when this practice spreads throughout our American system of higher education.

Ken Rosenblatt — Tucson College Counselor

TEN-HUT! Applying to Our Nation’s Military Service Academies

The summer before senior year is usually quiet for high school students.  Not so for those seeking to attend our Nation’s military service academies.  This is because of the importance of seeking – and obtaining – a nomination for an appointment.

An appointment is the equivalent of a college acceptance.  A nomination is a request by a designated person or institution that an applicant receive an appointment.

If this looks complicated, it is.  However, rest assured that the service academies will walk you through the process (the Coast Guard Academy does not require nominations, but is listed below for reference):

Army (West Point, NY)

Navy (Annapolis, MD)

Air Force (Colorado Springs, CO)

U.S. Coast Guard (New London, CT)

U.S. Merchant Marine (Kings Point, NY)

Most students seeking nominations apply to their local Representatives and Senators.  The Vice-President and President also nominate candidates.

Applicants are encouraged to ask each of their local Representatives and Senators for nominations because each of these office-holders can make only a limited number of nominations.  As the Air Force puts it:

We recommend attaining a nomination from as many sources as you are eligible for, President, Vice-president, Senators and Representatives. This will improve your chances for success if one source does not nominate you.

Certain military officials, JROTC units (discussed below), and even some private military academies can also nominate applicants.  Consult service academy websites for more information on those options.

The nomination is only part of the application, but it is the one with which most students are least familiar.  Applicants who have military connections (e.g., JROTC, relatives in the services) should reach out to each and every one of them for help.


Planning further ahead

For those with the luxury of time – students about to enter their sophomore or junior years – now is the time to lay the groundwork for your application and request for nomination.  This is because the service academies have some unique requirements for admission that require sustained effort.

First and foremost is fitness for duty.  Before you get your hopes up, determine as best you can whether you have any medical problems which could pose a barrier to admission.  A medical examination, reviewed by the Department of Defense Medical Examination Review Board (DoDMERB), is required for admission.

The Air Force is particularly stringent, but all of the academies have medical criteria.  Waivers may be obtained for certain conditions.

The academies also require evidence of physical fitness.  The Army publishes a guide for aspiring cadets.

It also requires students to complete a fitness test before applying:

You will schedule your CFA with your physical education (PE) teacher. The Candidate Kit has the test form and instructions you can forward in PDF format. Your performance in six events will be judged:

  • Basketball throw (from a kneeling position)
  • Cadence pull-ups or flexed-arm hang (women’s option)
  • 40-yard shuttle run (for time)
  • Abdominal crunches (number completed in 2 minutes)
  • Push-ups (number completed in 2 minutes)
  • 1-mile run (for time)



Colleges proclaim that they seek students who have demonstrated leadership.  The service academies “walk the walk.”

Consider this advice from the Air Force:

So how do you prepare for a future at the Academy?

  • Study hard. Get the best grades you can in all subjects — especially English, math and science.
  • Join a sports team. If your school does not have an after-school sports program, you can usually find one at your local community park or recreation center.
  • Become a leader. Join a scouting program like Girl Scouts, Boy Scouts or Civil Air Patrol. Or join another school or local club and go for a leadership position like club president or secretary.
  • Demonstrate character. Consider activities that help others. Get involved with church groups or other organizations that may be helping members of your community.


Commitment to military life

Service academies are looking for leaders who will take well to military discipline.  One good way to demonstrate that quality is to join a local JROTC unit and, if possible, attend a summer program at a service academy.

Wikipedia offers a general (if lengthy) overview of the JROTC and a similar program, the NDCC.

These programs are intensive preparation for military life; evaluations and recommendations (and even nominations for appointment) from commanders are key.  The downside is that the commitment required to join such a unit may extend to attending very early morning sessions at a different location from the student’s high school.  Some parents even home-school their students as a way of accommodating the conflicting demands.

Another way of demonstrating a commitment to the academies is to attend one of their summer programs:



Air Force

Coast Guard (prospective merchant mariners also attend this program – the U.S. Merchant Marine Academy does not offer its own)

These camps are a pathway to an appointment.  As the Navy puts it: “Summer Seminar gives you a taste of life at the Academy and kick-starts your application journey for an appointment to the Academy.”

Finally, the Academies have the same way to demonstrate interest as do many colleges – sign up for their mailing lists.  For example, prospective applicants to the Air Force Academy should sign up to be “Future Falcons.”



Just because the services look for fitness and leadership potential, do not assume that academics are at the bottom of their list of priorities.  Academic performance makes up 50% of the assessment for admission to the Air Force Academy.  The Naval Academy emphasizes STEM coursework, with many midshipmen graduating with engineering degrees.  Standardized testing is also important.  For example, the published mean SAT score for the Air Force Academy is in the mid-1200s, with a mean ACT of 30.


Work on Plan B

Our nation’s service academies are competitive; admission is the exception, not the rule.  Therefore, students also need to apply to colleges in case they do not win an appointment.  The good news is that academic and leadership ability, along with extracurricular activities, make for a winning application to both service academies and colleges.  As you might expect, I can assist students with both endeavors.

This Blog is Ranked #1

Once again, college ranking season is upon us.  Timed for when high school juniors and their parents are starting to focus on college admissions for the following year, a whole host of magazines, newspapers, and websites publish their annual “rankings” of their “top” institutions of higher learning.

What are you – parent or student – supposed to do with all these opinions?  How do you know which college is the best, or, more important, the best for your student?

When buying a washing machine, many consumers start with Consumer Reports magazine.  Is there a “Consumer Reports” rating authority for colleges?

Alas, no.  Instead of one possibly authoritative source, there are well over a dozen contenders.

Start with traditional news media organizations that publish college rankings, including:

U.S. News and World Report (the 800-lb. gorilla of college ratings)

The Wall Street Journal/Times Higher Education of London

The Economist magazine

Forbes magazine

Money magazine

Washington Monthly magazine

You can also consult websites which publish their own rankings, such as:

The Alumni Factor

College Factual

Let us not forget guidebooks.  Some are specialized, such as “Colleges That Change Lives,” which profiles 40 such schools.  Here, however, I am referring to the largest publishers, those whose guidebooks profile 300 or 400 of the “top” colleges in the nation.  Although these guidebooks, such as Fiske and the Princeton Review, do not rank colleges, the inclusion of each college in their books is an endorsement of sorts for that institution, or at least a culling from the herd of over 2,500 four-year institutions of higher learning.  These guidebooks also include specialized lists, such as private and public university “best buys”.

Why are there so many lists?  For the most part, money is the motivator.  The magazines gain subscribers and, for those readers who find them on the Internet, advertising dollars.  The guidebook companies sell more books.  Which book would you buy:  “382 Colleges” or “The Best 382 Colleges”?  The Princeton Review chose the latter title for its book.

Another reason for the plethora of lists is that rating any institution, including a college, depends upon what is deemed most important to the reader.  Consider the large number of criteria available:

  1. School resources, including size of the endowment, availability of research equipment and funding for undergraduates.
  2. Selectivity – how difficult it is to win admission.
  3. Affordability – usually a combination of price (tuition, room and board, fees) and availability and generosity of financial aid.
  4. Academic record of accepted students.
  5. Graduation rates, at 4 and 6 years.
  6. Student retention rates after freshman year.
  7. Reputation of the faculty.
  8. Faculty’s teaching ability.
  9. Student-professor ratio.
  10. Alumni contributions, both in percentage giving and amounts donated.
  11. Student “engagement” with professors.
  12. Student satisfaction with the college.
  13. Projected earnings for graduates; sometimes expressed as a ratio with tuition to arrive at an ROI (return on investment) for each college.
  14. Recruitment of disadvantaged students (e.g., race, income).
  15. College emphasis on service by its students (e.g., community service requirements).
  16. Student loan repayment rate.

Thus, determining “the #1 college” depends upon what information is deemed important.

The key to using these lists is to understand:

  1. What criteria are used, and how those criteria are weighted, in arriving at the ranking.
  2. Whether the data used is sufficient to support the conclusion drawn from it.
  3. Whether the ranking relies on data which is subject to misinterpretation or even fraud.
  4. Whether the criteria are relevant to your student’s interests.


A few examples of “best practices” and, well, “less than best practices,” follow.


What criteria are used, and how are they weighted?

The most serious problem arises when the ranking is done in a “black box”, where only the ranking service knows what it considers important.

Consider the explanation offered by Niche (formerly College Prowler) concerning how it creates college rankings:

The Niche 2018 College Rankings are based on rigorous analysis of academic, admissions, financial, and student life data from the U.S. Department of Education along with millions of reviews from students and alumni. Because we have the most comprehensive data in the industry, we’re able to provide a more comprehensive suite of rankings across all school types.

Very impressive, if a bit vague concerning exactly what that data is, and how much of it is relevant to ranking colleges.  However, we can break it down.  The Department of Education collects a trove of data from colleges – the complete dataset is over 200 megabytes – and makes it public.  Most ranking services use this data, and some (highly recommended) sites compile and present it in usable form.

Niche’s claim to fame appears to be that it combines some of that data with its proprietary student survey data.  Confusion results when it explains how it uses that data (emphasis added in bold).

With clean and comparable data, we then assigned weights for each factor. The goal of the weighting process was to ensure that no one factor could have a dramatic positive or negative impact on a particular school’s final score and that each school’s final score was a fair representation of the school’s performance. Weights were carefully determined by analyzing:

How different weights impacted the distribution of ranked schools;

Niche student user preferences and industry research;

After assigning weights, an overall score was calculated for each college by applying the assigned weights to each college’s individual factor scores. This overall score was then assigned a new standardized score (again a z-score, as described in step 3). This was the final score for each ranking.

Yes, but what factors – criteria – were used, and how were they weighted?  Niche does not say.  For example, if Niche weighted “reputation of faculty” at 90%, then the rankings would be skewed heavily in favor of prestigious schools.
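To make the weighting point concrete, here is a toy sketch of the kind of formula Niche describes: standardize each factor as a z-score, then combine the z-scores with weights.  The schools, factor names, and values below are all invented for illustration; the point is that shifting weight between factors reorders the list.

```python
# Toy example of weighted z-score ranking. All data here is invented.
from statistics import mean, pstdev

schools = {
    "College A": {"faculty_reputation": 4.8, "affordability": 2.1},
    "College B": {"faculty_reputation": 3.0, "affordability": 4.5},
    "College C": {"faculty_reputation": 3.9, "affordability": 3.8},
}

def rank(weights):
    factors = list(weights)
    # Standardize each factor across all schools (z-score).
    stats = {f: (mean(s[f] for s in schools.values()),
                 pstdev(s[f] for s in schools.values())) for f in factors}
    scores = {name: sum(weights[f] * (vals[f] - stats[f][0]) / stats[f][1]
                        for f in factors)
              for name, vals in schools.items()}
    return sorted(scores, key=scores.get, reverse=True)

# Weighting reputation at 90% puts prestigious College A on top...
print(rank({"faculty_reputation": 0.9, "affordability": 0.1}))
# ...while weighting affordability at 90% puts College B on top instead.
print(rank({"faculty_reputation": 0.1, "affordability": 0.9}))
```

Same data, same arithmetic, opposite “#1” – which is why an undisclosed weighting scheme tells you very little.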

In contrast, many ranking sites are transparent about how they use such data in arriving at their rankings.  See, e.g., U.S. News and World Report (listing factors and the weight assigned to each) and the Economist magazine (the factors and weights used in its ratings for British universities).  If we cannot discern what information underpins rankings, then rankings are not very helpful.

Further, Niche makes explicit what I suspect other rankings purveyors do quietly – it takes a “second look” at the data to make sure that it looks “right” before publishing its rankings.  Here is the quote from above, with different emphasis added in bold:

With clean and comparable data, we then assigned weights for each factor. The goal of the weighting process was to ensure that no one factor could have a dramatic positive or negative impact on a particular school’s final score and that each school’s final score was a fair representation of the school’s performance. Weights were carefully determined by analyzing:

How different weights impacted the distribution of ranked schools;

Niche student user preferences and industry research;

That certainly looks like Niche “tried out” different weightings for each of the unnamed criteria, and then changed those weights if the resulting rankings did not look “right”.  The last sentence even suggests that it adjusts its ranking to conform to what other ranking services report (Niche analyzes student preferences and “industry research”, i.e., how other college insiders rank the schools).  That level of subjectivity is understandable – sort of applying a “smell test” to the results – but it does not add confidence that the weighting is completely objective.  It also limits the possibility of “uncovering hidden gems”.  What is the point of rankings if a school cannot score at the top unless it “looks the part”?

The lesson here is that before relying upon a list, understand exactly what criteria are being used, and how they are being weighted.


Is the data sufficient to support the measurement? 

When you step on a scale, the datum that stares you in the face, no matter how unpleasant, is almost certainly sufficient to determine how much you weigh.  However, if three people step on the scale – one at a time – the average of their weights will be insufficient to determine the average weight of the population of the United States.

Some ranking criteria require complete datasets to be relevant, and those datasets are often very difficult to obtain.  For example, many surveys measure “student outcomes,” usually through a proxy such as average earnings after 1 year, 5 years, etc.  Unfortunately, when schools survey graduates about their employment, often only the graduates with “good news” to report answer.  Would you be eager to tell your alma mater that you are unemployed?  And even those with good news to report often simply ignore the surveys – perhaps out of fear that a letter from the development department asking for funds will follow.  (On a personal note, for the last 38 years, UCLA has been able to track me to a new address within six months of my arrival – very impressive.)

For example, only 19% of University of Kansas graduates responded to an outcome survey.  The university then took the unusual step of looking up LinkedIn profiles to supplement responses.

Take any exemplary “placement rate” with a large grain of salt.  One item to look for when evaluating individual colleges’ placement rates is whether they follow the NACE standard for survey responses.  Even then, be careful with any statistics about “full-time employment” – many colleges count Starbucks baristas as fully employed.

The college rankings that rely on graduates’ salaries also suffer from incomplete datasets.  See “Want to Search Earnings for English Majors?  You Can’t.” (New York Times, 12/1/17).

In addition, graduates tend to find work close to their colleges.  Unless adjusted for cost of living, student earning surveys will skew in favor of West and East Coast schools.

Other criteria are easy to compute, but relatively meaningless.  The percentage of students graduating after four or six years is susceptible to sample bias.  It is normal for many engineering students to take more than four years to graduate.  Unless you know the sample (MIT, small liberal arts college), low four-year graduation rates coupled with high six-year rates may be “normal”.  Obviously, any school where both rates are low will drop out of any ranking without ceremony.  A better measure is freshman retention – if students are not returning after freshman year, the odds are higher that your student will not, either.


Some data is subject to “gaming,” or even fraud

Selectivity is the ratio of students admitted to applicants.  In 2016, Stanford admitted only 1 in 20 applicants, leaving it with a selectivity of 5%.  That means Stanford must be the best, because only the best students get in, correct?

Well, a lousy school will not attract enough students unless it relaxes admission standards.  But be careful with putting too much weight on that correlation, because many colleges “game” selectivity.  Methods include:

  1. Using “VIP applications”. Colleges send out these one-page, no-fee applications to thousands of students.  Because they do not require essays or application fees, many students fill them out and return them.  I commented on this in my “You’ve Got Mail” post (6/22/15).


  2. Adopting the Common Application. The Common Application is used by over 500 colleges.  Students need only fill out the application once to apply to all member colleges.  Most colleges require students using the Common Application to respond to one or more essay prompts unique to the school.  Stanford has 11 such prompts, although many of them are short.  Colleges that wish to encourage applications do not require any supplemental essays.


  3. Going test-optional. A growing “fair test” movement in college admissions eschews standardized tests (SAT, ACT) as merely reflecting affluence or penalizing students who do not test well.  These schools allow students to apply without test scores.  Although this may be laudable, it can also increase applications.


Colleges using these tools can lower their selectivity number, which, remember, is the percentage of applicants admitted, simply by inducing more students to apply while not increasing the number of students accepted.  Why would they do this?  U.S. News and World Report includes a college’s selectivity number in its ranking system, and guidebooks list colleges’ selectivity ratings prominently.  Students use selectivity as a proxy for “desirability,” which results in even more applications; a vicious cycle for students ensues.
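The arithmetic behind this gaming is simple.  A hypothetical sketch (the numbers below are invented) shows how a no-fee application campaign makes a college look twice as selective without changing who gets in:

```python
# Selectivity (admit rate) = students admitted / applicants.
# All figures below are invented for illustration.

def selectivity(admits, applicants):
    return admits / applicants

# Before a "VIP application" campaign: 2,000 admits from 10,000 applicants.
before = selectivity(2_000, 10_000)   # 0.20, i.e., a 20% admit rate

# The campaign doubles applications; the college admits the same 2,000.
after = selectivity(2_000, 20_000)    # 0.10, i.e., a 10% admit rate

print(f"{before:.0%} -> {after:.0%}")
```

The college’s class is identical in both scenarios; only its “desirability” number has changed.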

The U.S. News rankings also factor in the SAT and ACT scores of enrollees.  You might ask how colleges could possibly “game” such numbers.  Well, five colleges since 2012 have been caught reporting higher scores than students actually earned – we call that fraud.  And there may be more who have not been caught.


The criteria are not useful to students

U.S. News and World Report assigns 22.5% of its college ranking to “undergraduate academic reputation”.  What does that phrase mean?  Here is the explanation from U.S. News:

For the Best Colleges 2013 rankings in the National Universities category, 842 top college officials were surveyed in the spring of 2012 and 53 percent of those surveyed responded. Each individual was asked to rate peer schools’ undergraduate academic programs on a scale from 1 (marginal) to 5 (distinguished).

Those individuals who did not know enough about a school to evaluate it fairly were asked to mark “don’t know.” A school’s score is the average score of all the respondents who rated it. Responses of “don’t know” counted neither for nor against a school. 

The problem is that “reputation” is intangible – the reader must ask: “reputation according to whom?”  At least U.S. News makes the “who” clear:  college officials and high school counselors.  But should students care how college administrators regard their colleagues (some of whom will be rivals)?  Unless the school is nearby (in which case it may well be a rival), most college administrators are unable to make fine distinctions between colleges.  And high school counselors rarely follow up with their charges to determine their experiences after enrollment.

Indeed, shouldn’t students care a lot more about what employers think about the schools?  The Wall Street Journal published just such a list in 2010.

And some ranking services reject traditional criteria as merely a proxy for wealth.  For example, The Washington Monthly publishes its list as “our answer to U.S News & World Report, which relies on crude and easily manipulated measures of wealth, exclusivity, and prestige to evaluate schools.”

Alas, not many career-oriented students (and parents) will be interested in its alternative:

We rate schools based on their contribution to the public good in three broad categories: Social Mobility (recruiting and graduating low-income students), Research (producing cutting-edge scholarship and PhDs), and Service (encouraging students to give something back to their country).

Kudos to those who are.

The largest failing of college rankings is that they are usually so general as to be meaningless.  Students in STEM fields may not value small class sizes as much as a school’s facilities and labs.  Liberal arts students are just the opposite.  Students who are planning professional careers may not care as much about “outcome measures”, as opposed to acceptance rates at professional schools (which the ranking industry tends not to measure).  Wealthy families may not worry about financial aid – for less wealthy parents, their inquiry may begin – and end – with that criterion.


College rankings can be useful

With the caveats expressed above, college rankings do have their uses.  Here is how to use them:

  1. Understand what is being measured, and the weights assigned to each criterion.
  2. Understand what criteria are omitted, and determine if the omission is important to your student’s needs.
  3. Choose the most specific ranking possible. U.S. News and World Report, along with other services, publishes “sub-lists”, e.g., best undergraduate engineering programs.  Prospective engineers should start with that list.
  4. Consult more than one ranking. Schools that are consistently ranked highly may be more likely to deserve their rankings.
  5. Prepare to dig deeper. Sometimes variations in formulas make little difference in the rankings they produce.  Eight of the top 10 universities on the U.S. News list were also in the Journal/THE top 10.  The other two were right behind.  You will have to do your own research to tease out the differences among high-ranked schools, or make decisions based on other factors (e.g., geography, cost).

How do I use rankings?  First, I use them to ascertain which schools my clients will value highly.  I may have to overcome preconceived notions with research.  Second, I use them like a string around a finger – when I am composing a list of candidate schools, I check the rankings to make sure that I have evaluated all of the commonly considered schools.

I also use college matching services, such as BigFuture, to create lists of schools, but that is grist for another article.

Congratulations! You’ve Been Admitted to the University of California – For Now

Universities reserve the right to rescind admission in extreme circumstances, such as a student’s commission of criminal acts, failure to graduate, and the like. Indeed, I commented on just such a situation involving Harvard a few months ago in this blog. However, the University of California (“UC”) is taking this a step further, and not in a helpful way. Students may find that the UC will treat mild cases of “senioritis,” or even something beyond the student’s control, such as a lost transcript, as grounds for rescinding their admissions.

The UC uses one application for its nine campuses that offer undergraduate instruction. Students “check off” boxes for the UC campuses they are applying to, and each campus makes its own admissions decisions.

The application instructs students not to send transcripts; students self-report their grades. To ensure that the reported grades are accurate, all offers of admissions are made provisional upon the UC receiving official final transcripts verifying those grades.

Those provisional offers also include additional conditions which must be satisfied before enrollment. The UC refers to these provisional offers as “contracts.”  (As noted below – in a paragraph only a lawyer could love – it is not clear that these are true “contracts”.)

Unfortunately, some of those additional conditions protect the UC against more than fraud, and students may face unwelcome surprises in the form of the University rescinding admissions based on seemingly trivial grounds, including “senioritis.” For example, this summer, the University of California, Irvine (“UCI”) rescinded 499 admissions because students had failed to arrange for their high schools to deliver their final transcripts by July 1, as required by those agreements. This seemed like an overreaction to a trivial deadline, but the news soon broke that there was more at stake for UCI.

At the time, UCI was over-enrolled by approximately 800 students. The university had already tried to convince some accepted students to enroll instead in a separate “academy” with a 50% tuition break. However, UCI failed to mention that those new students “would have to cancel their enrollment as regular freshmen, take a more limited menu of classes in the adult education division and give up access to campus housing and financial aid.” (LA Times, 8/2/17.)

Not surprisingly, the offer was not taken up by enough students to solve UCI’s over-enrollment problem. So UCI turned to its contracts and rejected those students who had failed to submit transcripts timely.

On Friday, campus spokesman Tom Vasich conceded that the admissions office was more stringent than usual about checking requirements “as a result of more students accepted admissions to UCI than it expected.”

The vice-chancellor of student affairs also fessed up:

“I acknowledge that we took a harder line on the terms and conditions this year and we could have managed that process with greater care, sensitivity and clarity . . . ”

In other words, the University of California reserves the right to punish some breaches of its “contracts” more than others depending upon whether doing so is to its advantage. Ultimately the public outcry – and examples where the UC had indeed received the transcripts timely but mistakenly rescinded admission – forced the university to back down.

But questions remain because some of the terms in these “contracts” are vague. For example, to discourage “senioritis” – failure to maintain good grades in the student’s senior year of high school – UC requires students to maintain a certain standard of excellence in their senior years. But what standard?

The general rule is that students must maintain a minimum weighted 3.0 GPA in college prep classes (the “a-g” requirements in UC lingo), with no “D” or “F” grades. However, for some of the most prestigious campuses, the contract may specify higher GPAs or test scores.
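For readers who like their rules concrete, here is a hedged sketch of that baseline condition. The grade values follow the standard 4-point scale, and the one-point honors bump follows the usual UC convention for approved honors/AP courses; the sketch omits the UC’s cap on honors points, and the course list is invented for illustration.

```python
# Sketch of the UC's baseline condition: weighted 3.0 GPA in "a-g"
# courses, no D or F grades. Simplified; omits the UC honors-point cap.

POINTS = {"A": 4, "B": 3, "C": 2, "D": 1, "F": 0}

def meets_uc_condition(courses, min_gpa=3.0):
    """courses: list of (grade, is_honors) pairs for a-g classes."""
    if any(grade in ("D", "F") for grade, _ in courses):
        return False  # a single D or F breaches the condition outright
    # Approved honors/AP courses earn one extra point each.
    total = sum(POINTS[g] + (1 if honors else 0) for g, honors in courses)
    return total / len(courses) >= min_gpa

# Hypothetical senior year: honors B, regular C, regular B, AP A.
senior_year = [("B", True), ("C", False), ("B", False), ("A", True)]
print(meets_uc_condition(senior_year))  # weighted GPA 3.5, no D/F -> True
```

Note that this only captures the bright-line rule; as discussed below, some campuses layer vaguer “consistency” expectations on top of it.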

And then there are campuses that add “expectations”. Consider “senioritis” in light of this language from the University of California, Santa Cruz:

In accepting admission at UCSC, you agree that you will:

  1.  Earn a level of academic achievement in your fall and spring courses (as you listed on your UC application) consistent with your previous coursework. Earn a grade of C or higher in those courses (or equivalent for other grading systems).

Wait. Look at the text again: “[e]arn a level of academic achievement in your fall and spring courses (as you listed on your UC application) consistent with your previous coursework.”

UC Davis appears to have similar language in its “contract”.

Do we now have two requirements?  Must a student not only avoid “D”s and “F”s, but also produce the same grades (mostly “A”s) that got them admitted in the first place?  That is a lot harder to accomplish, and essentially extends the application period until high school graduation. “Senioritis” could be fatal under such conditions. Say it ain’t so, UCSC!

Regrettably, it is so, at least for UCSC.

UCSC addresses the point in an FAQ:

FAQ 1A: My contract indicates “Earn a level of academic achievement in your fall and spring courses consistent with your previous coursework, with no grade lower than a C (or equivalent for other grading systems).” What do you mean by “consistent?”

Answer 1A: We expect that the grades you will earn in your senior year will look similar to the grades you earned in the first three years of your high school career; for instance, if you were a straight-A student for three years, we would expect A’s in your senior year. Consistency in your level of achievement must be carried through your senior year coursework.

FAQ 1E: I earned a C- in a course. Does that mean my admission will be cancelled?

Answer 1E: The University of California does not compute pluses or minuses in high school coursework. Therefore, a C- is considered equivalent to a C grade. Remember, however, that we also expect a consistent level of academic achievement in your coursework. (Emphasis added.)

Well, so much for students pausing during their last semester to enjoy the high school they attend in a way they previously had to defer for 3 ½ years of grueling competition. Of course, we do not know whether UCSC (or UC Davis) routinely enforce this standard.

But this requirement is worse than draconian – it is confusing. When analyzing a contract, lawyers look for the consequences of a violation of its terms, what we call a “breach.”  The UC’s “contracts” take a sweeping approach – any violation of any of its terms is considered grounds for canceling the contract and rescinding the admission.

UCSC is no exception. From that same linked document:

Failure to meet your Conditions of Admission Contract will result in the cancellation of your admission. It is your sole responsibility to meet all conditions. Read each of the six conditions below and ensure that you meet all of them. Accepting your offer of admission signifies that you understand these conditions and agree to all of them. (Emphasis in original.)

That appears clear, but for one problem. What does it mean for UCSC to “expect” students to have grades that are “consistent”?  This formulation appears vague: how many “B”s can a straight-A student sustain before being in breach?  And what do we make of the verb “expect”?  It is weaker than “require”, and lacks the commanding “shall”. Is such an expectation enforceable?

I am not practicing law anymore, so I leave this question to those who are. But the fact that this is a reasonable question is a huge problem – both for UCSC and our students.

Looking at the big picture does not yield a pretty sight. The University of California requires students to sign “contracts,” even though minors are generally considered to lack the capacity to do so. (Not a small point for lawyers; I suggest that a university’s authority to enforce a policy, as opposed to the letter of a contract, may be subject to heightened due process concerns.)

The UC reserves the right to enforce those “contracts” differently depending upon when it is to the UC’s advantage to do so. And it is sometimes vague about what it wants a student to do (earn “consistent” or “strong” grades through all four years) and what conduct constitutes a breach of contract allowing cancellation of admission.

Worst of all, by the time the UC decides whether to exercise its contractual remedies by rescinding admission, students have already declined all other offers in accordance with the May 1 rule used nationwide by colleges.

The UC has a sterling reputation, and I am a proud graduate of UCLA. However, as I’ve said previously in other contexts, caveat emptor.

You and your students should read any admission letters and accompanying materials very carefully before accepting an offer from the University of California. Upon accepting that offer, labor diligently to make sure that your student’s transcript arrives at their campus (and confirm in writing that it did), and that all other terms of the contract/agreement are satisfied.

Finally, of course, “senioritis” can be a serious threat to any student’s college aspirations, but it is a particularly dangerous malady for those planning to attend the UC. If it looks like grades will be a concern, call Admissions and warn them; they may be more inclined to be lenient if notified early than the cold, hard text of their provisional “contracts” suggests.

A New Addition to College Curricula:  Preparing to Fail

The New York Times generally covers education issues in its “Fashion and Style” section.  Make of that what you will (including my reading habits), but one of their recent articles caught my eye: “On Campus, Failure is on the Syllabus” (June 24).

The lede is illustrative:

Last year, during fall orientation at Smith College, and then again recently at final-exam time, students who wandered into the campus hub were faced with an unfamiliar situation: the worst failures of their peers projected onto a large screen.

“I failed my first college writing exam,” one student revealed.

. . . .

The faculty, too, contributed stories of screwing up.

“I failed out of college,” a popular English professor wrote. “Sophomore year. Flat-out, whole semester of F’s on the transcript, bombed out, washed out, flunked out.”

“I drafted a poem entitled ‘Chocolate Caramels,’ ” said a literature and American studies scholar, who noted that it “has been rejected by 21 journals … so far.”

It is now a meme (see my entry on “Watch What You Post!” for a definition) that millennials are unusually fragile creatures who require cosseting against the vicissitudes of the real world.  In other words, they are weak creatures who have been raised in a bubble.

This is an odd idea on its face when applied to students admitted to elite colleges and universities.  These students have won an extraordinarily competitive race for four years to demonstrate that they are superbly equipped for any academic challenge.  Why would these students, of all people, suddenly crumble when they arrive at college?

Late adolescence is an uncertain, and even dangerous, moment.  At the extreme, this is the time when some forms of mental illness are more likely to emerge, such as bipolar disorder, schizophrenia, and depression; by one estimate, one in five adults will experience a form of mental illness in college.  Students are also experimenting with adult life, including entering relationships which can end badly.

Further, many of these students have never experienced academic failure before they arrive at college.  The winners of a grueling race that penalizes even a “C” quite heavily, these students are heavily invested in succeeding, and are likely to be blindsided by failure.

Colleges are becoming alarmed at the frequency of students buckling under the new – for these students – experience of failure.  Smith College reminds students that 64% of students will receive a B-minus or lower during their time there.

And it goes a step further:

[W]hen students enroll in [Smith’s] program, they receive a certificate of failure upon entry, a kind of permission slip to fail. It reads: “You are hereby authorized to screw up, bomb or fail at one or more relationships, hookups, friendships, texts, exams, extracurriculars or any other choices associated with college … and still be a totally worthy, utterly excellent human.”

A number of students proudly hang it from their dormitory walls.

Smith is just one of many elite colleges rolling out such programs.

A consortium of academics soon formed to share resources, and programs have quietly proliferated since then: the Success-Failure Project at Harvard, which features stories of rejection; the Princeton Perspective Project, encouraging conversation about setbacks and struggles; Penn Faces at the University of Pennsylvania, a play on the term used by students to describe those who have mastered the art of appearing happy even when struggling.

Some of this can be attributed to the newly fashionable idea that “grit” is an important ingredient for success.  As with most such revelations, a grain of truth can be puffed up into a silo full of grant-funded excesses seemingly devoted to accentuating the obvious.

However, such movements can also be useful, and this is one of them.  The “lesson” for students and the college counselors who work with them is that students should be made aware of the challenges ahead.  Most important, students should be told before they leave the nest that colleges have resources available to help students in distress.  Students should know the location of the Counseling Office on campus.  They should be instructed to seek help, and informed that doing so will not result in any social or parental stigma.

We all know that failure is part of life.  Make sure that your student knows that, too.  For my part, my students who just graduated high school and thought that they had heard the last of me until my Christmas break “check-in” are about to receive an e-mail with the New York Times article attached.

It Could Be Denver

The Independent Educational Consultants Association (IECA) held its semi-annual conference in May in Denver. As part of that group, I visited several colleges of interest.


University of Colorado at Boulder

The University of Colorado at Boulder (“CU-B”) is located 30 miles northwest of Denver in the city of Boulder (107,000). The city is self-contained, which is very helpful because the promised light rail line to Denver remains a mirage.

Fortunately, Boulder is in a beautiful valley within two hours of some of the finest skiing in the world. The weather is advertised as providing about 300 days of sun each year. Boulder itself is a foodie, microbrew, green town that is one of the most sought-after suburbs in Colorado.

CU-B’s strengths make it an attractive choice for students from every part of the country. It should be on the shortlist of every student who wants to become an astronaut or aerospace engineer; CU-B publishes a list of its astronaut alumni.

U.S. News ranks its aerospace engineering program at #12; the programs ahead of it are much more selective. And CU-B is not shy about reporting that the same ranking service chose its graduate physics department as #1 in the nation in Atomic, Molecular, and Optical Physics, ahead of MIT, Harvard, and Stanford.

STEM in general is a priority on campus. See the university’s website for information on its attempt to improve teaching and learning. Of course, many other universities are trumpeting similar initiatives, but CU-B’s appears to be ambitious in scope.

CU-B also has interesting programs outside of STEM. One of them is the environmental design program, which repackages the existing architecture program within a larger School of the Environment and Sustainability. Here is the university’s description:

Students enroll in studios, lectures, and seminars taught by 30 faculty with both academic and professional expertise. They design innovative “green” buildings and infrastructure and they work directly with cities to figure out how to integrate social, ecological, and economic needs to support a sustainable future. Students apply state-of-the-art educational technology including computing tools, digital image databases, fabrication equipment, and advanced media to make a persuasive case and bring their ideas into light. Layer on top of all this the resources of the Boulder campus—from sciences, social sciences, humanities, arts, and technology fields—and we offer an educational opportunity like no other.

Students interested in architecture, landscape architecture, urban planning, and design generally may find this approach intriguing.

Are there downsides to the CU-B experience?  Two come to mind. First, CU-B may become a victim of its own success. For example, the engineering school is planning to decrease enrollment next year to alleviate overcrowding – applicants may find it harder to win admission. Second, the school’s popularity contributes to a high cost of living:  Boulder is a very expensive place to live, and students who leave the dormitories for rental housing (71% of them this year) will need to budget accordingly.

How difficult is it to get in?  Admission requirements vary by program. The numbers for the College of Arts and Sciences for the middle cohort (25% to 75%) are GPA 3.37 to 4.0, SAT 1170-1350, and ACT 24-30. The numbers are similar for admission to the environmental design program. However, the engineering program raises the bar:  GPA 3.87-4.0, SAT 1290-1470, ACT 29-33.

The male/female ratio is 56/44, perhaps fueled by the popularity of STEM at the university. The OOS number (percentage of out-of-state students) is 39%, suggesting that out-of-state students may need to post better numbers than those cited above.

CU-B is a typical large public university with some uncommon strengths, set in one of the most beautiful places in the country. It merits your attention.


Colorado School of Mines

They still tote the rock.

Decades ago, when your correspondent was applying to college, Colorado School of Mines was known as the finest engineering school in the West. It may still deserve that title, but it is not nearly as well known. This is a shame, because for certain STEM fields tied to the earth, there are few better in the country.

“Mines”, as it is known, is a public university serving 4,500 undergraduate and 1,300 graduate students. Located in Golden, Colorado, a town of 20,000 people – with light rail access to Denver and Boulder – the school was founded in 1874, two years before Colorado achieved statehood.

The school notes that Mines was originally devoted to the study of – wait for it – mining.

Courses offered to students during the early years of Colorado School of Mines included chemistry, metallurgy, mineralogy, mining engineering, geology, botany, math and drawing. The focus of the early academic programs was on gold and silver, and the assaying of those minerals.

According to Wikipedia, in 1906 the school opened the first experimental mine in the nation for teaching purposes.

Its mission has broadened considerably today:

The nexus between the earth, the environment and society’s need to generate and distribute energy in an economic and sustainable way is central to Mines’ specialized mission. Faculty and students at Mines research new frontiers in resource exploration, extraction and processing, renewable energy production and distribution, advanced materials, and environmental impact, mitigation and remediation.

Mines welcomes students who wish to study engineering, computer science (the school has its own supercomputer), biochemistry, applied math and statistics, geoscience, or closely related fields. A complete list can be found here:

Only STEM students need apply; those who are looking for a college with a robust liberal arts offering should look elsewhere. Indeed, Mines’ own application – it does not accept the Common Application – has no essay requirement.

Our tour guide gave us a window on the sort of students who go to Mines. She said that she decided on Mines after attending a summer day program where they built catapults. She said that it was the only place where she did not have to explain to her peers what a catapult was, and how to build it. She felt at home at Mines.

Indeed, teamwork and inclusion are important at the school. Professors teach all classes, which is unusual for an engineering school. The atmosphere emphasizes teamwork over competition. This should not be surprising, as almost every student has a job upon graduation. According to staff, 220 companies interview at Mines every year, and more are put on a waiting list in case a recruiting firm can’t make it. The gender ratio is 75:25, but the school also fields the largest collegiate section of the Society of Women Engineers.

Traditions are prized here. Each year about 1,000 students “pull” an ore cart from Golden to Denver, where someone at the Capitol reads from a proclamation declaring it Engineering Days in Colorado. Every student is given a hard hat upon entry. Although these usually just sit on a closet shelf, they can come in handy. We were hit with a hailstorm during our tour. Some students who really had to cross campus could be seen running across one of the quads, with the hail bouncing off their hard hats. So Colorado offers 300 days of sunshine, plus occasional hailstorms – it could be worse.

And what about the rock, you ask?  Incoming freshmen are expected to bring along a 10-pound rock from their hometowns. (I’d like to see students explain this to airport security.)  In a ceremony aided by upperclassmen, the students haul their rocks up the mountain overlooking the campus and place them in the whitewashed “M” at the top. Upon graduation, they are invited to go back to the “M” and retrieve a rock to take with them on their journeys, where it probably ends up sitting on a shelf next to the hard hat and the silver-plated Mines diploma. The whole place has a bit of “old school” atmosphere that is hard to resist.

As for getting in, the numbers for the middle cohort (25% to 75%) are GPA 3.74 to 4.0, SAT 1370-1470, and ACT 29-32. The OOS percentage is 35%. These numbers are typical for good engineering schools.

Two additional facts stand out. First, admissions are rolling, and they open early – in September. Although students can still get priority admission status through November 15, they should aim to submit their application the moment admissions open. This is easier than it appears because, as noted above, there is no essay requirement. Second, Mines will give AP credit sparingly, and in many cases, only after the student passes a “challenge exam” in the subject.

Mines is a hard-core (sorry) engineering school with a small and cohesive class, professors who teach students, a few unusual majors (explosives engineering, anyone?), and an excellent location with the mountains nearby. For the right student, this place could be a perfect fit.


Colorado College

We travel 75 miles south from Denver for our final college, but the trip is worth it.

It is unusual to find a college which differs from others, not just in academic offerings, but in the structure of the curriculum itself. Colorado College (“CC”), in Colorado Springs, serving 2,100 undergraduates, is one of a handful of colleges in the U.S. where the academic year is based on a Block Plan – students take one course at a time, for three and a half weeks, before proceeding to the next. Students take eight Blocks each year, plus optional summer sessions and a half-Block over the winter break.

CC’s pitch for its Plan is compelling:

Want to study for your biology midterm without worrying about filming your documentary, reading 72 pages of The Odyssey, or training your psychology rat?

Why not take just one class at a time? 

Do not assume that CC is a trendy experiment. The College was established in 1874, and is graced by a large and stunning Norman Romanesque chapel built in 1931. A glance at the dedications on various buildings reveals an endowment funded by the Packards (the “P” in “H-P”), the Waltons, and other luminaries. The campus is pretty, and the setting near Pikes Peak is hard to beat.

The Block Plan was started in 1970, and has become the best-known feature of a strong liberal arts college. Students take one course Monday through Friday, usually from 9 a.m. to noon, with labs in the afternoon. Unless enrolled in a lab course, students have afternoons and evenings to themselves and each other.

Each Block period runs for 3 ½ weeks, ending the Wednesday of the fourth week; students get a four-day weekend before the start of the next Block.

One major advantage of this schedule is that professors can schedule trips into the field without conflicting with other courses. Students pursuing outdoor fields of study (e.g., archeology, geology, environmental studies, wildlife biology) are particularly well-served by this arrangement. CC mentions film students traveling to Hollywood and art students going to Paris – you get the idea. And with a 10:1 student/teacher ratio, education can be personal, to the point where professors often invite their class to dinner at their homes.

By the way, CC’s outdoor location at the foot of the Rockies is perfect for its plan. Like the other Colorado schools, it offers plenty of sunshine and opportunity for winter sports.

CC attracts many students who are frustrated by having to multitask constantly in high school. These are students who prefer to “dig in” to a topic and immerse themselves fully in one subject. If they hate a class, they can either exchange it for another (on the first or second day) or grin and bear it, knowing that each Block lasts just over three weeks.

Students and counselors alike should read the College’s definition of the “right fit” student here: The school tends to attract “intense” students. They may not be quirky, but they are definitely looking for something different. For most of them, that difference is the opportunity to fashion their own education, block by block. For such students, CC represents a wonderful opportunity.

Of course, great opportunities are not available to everyone. Only 17% of applicants are admitted, although the SAT/ACT numbers are not overly demanding:  averages of 1340 and 31, respectively.

Note that the annual cost of attendance is a hefty $65,000; students should investigate financial aid opportunities (most need-based) carefully.

Watch What You Post!

This blog does not usually cover breaking news, but Harvard’s recent move to rescind admissions for ten students based on their posts to social media makes this a timely topic.

The Washington Post is just one of many news outlets carrying the story:

Like many schools, Harvard sponsors a Facebook page for admitted students. The page allows current students and admitted students who have yet to matriculate to exchange information. The topics are typically mundane, including information about student groups, orientation, students seeking others with similar interests (e.g., incoming international students), and the like.

In this case, the Post reports that about 100 members of the freshman class created a messaging group which would “share memes about popular culture – a growing trend on the Internet among students at elite colleges.”  For anyone older than 30, this may require translation. A complete explanation can be found here; my summary follows, quoting liberally from the link.

A meme in this context refers to a “humorous piece of online content, usually in the form of an image with text or video, that is copied and rapidly disseminated by internet users across all platforms.”  Many college Facebook pages have sub-groups that are devoted to communicating, and commenting on, memes related to college life. These range from tweaking the administration to edgier topics.

At this point, a small alarm bell should begin ringing in your head. It is one thing to exchange information, but views about potentially sensitive topics?  This may be dangerous.

And so it proved on the Harvard Facebook page, where a group of students formed a separate group to exchange messages. Then an even smaller subset of that group formed an “offshoot” page to exchange “off-color” or “R-rated” communications, including graphics. This “offshoot” page is where the trouble began. The Harvard Crimson reports:

A handful of admitted students formed the [offshoot] messaging group—titled, at one point, “Harvard memes for horny bourgeois teens”—on Facebook in late December, according to two incoming freshmen.

In the group, students sent each other memes and other images mocking sexual assault, the Holocaust, and the deaths of children, according to screenshots of the chat obtained by The Crimson. Some of the messages joked that abusing children was sexually arousing, while others had punchlines directed at specific ethnic or racial groups. One called the hypothetical hanging of a Mexican child “piñata time.”

Yes, this is very ugly. However, what happened next made news. Harvard admissions administrators learned about the offshoot group, and began monitoring it. Eventually, the administrators reached out to individuals and demanded explanations for particularly offensive posts. Harvard ultimately rescinded ten students’ admissions, well after the deadline for those students to enroll at other schools where they had been admitted.

Per the Washington Post and other news outlets, the university invoked a sort of “moral code” from its admissions policy that was included on the Harvard Facebook page:

As a reminder, Harvard College reserves the right to withdraw an offer of admission under various conditions including if an admitted student engages in behavior that brings into question his or her honesty, maturity, or moral character. 

This apparently includes posts on social media. To recap:  students congregated on a public web page created by Harvard, then created a “private” chat group, where they made off-color, and hateful, jokes among themselves. Harvard monitored this private group on its sponsored Facebook page, reviewed the communications, and rescinded admissions.

Aside from whether Harvard’s response was appropriate, the episode points to an emerging reality of the 21st century. Whatever you write or post on social media – indeed, the Internet in general – is available for all to see, forever. It may, or may not, result in consequences.

On the Internet, typing something in “all caps” is the equivalent of (rude) shouting. This point needs to be shouted:

WATCH WHAT YOU POST!


As college students would say:  rant off.

A surprising number of college administrators are looking for such material. According to Kaplan Test Prep, 35 percent of admissions officers said they check social media sites like Facebook, Twitter and Instagram to learn more about applicants. About 42 percent of those officials said what they found had a negative impact on prospective students. See

Even if Kaplan only surveyed “elite” colleges (my guess), those numbers are still eye-opening.

Students need to understand that their social media activity is “fair game” in the college admissions process, and act accordingly.