Content marketing for higher education: Six steps for effectively packaging content you already have



Content marketing is a great tactic for engaging students and prompting them to offer their contact information in exchange for valuable content from your campus.

My prior blog post on content marketing and paid interactive marketing focused on two different approaches to using e-brochures as a content marketing “carrot” to prompt prospective students to share their contact information in order to download the brochures. In this post, we will examine some “guerrilla” approaches you can use to create an effective e-brochure and share some tips on finding good content that your campus already has available.

Before we start, note that these tips are intended for campuses where the resources to produce high-quality, professional e-brochures are not readily available. If you do have access to professional production, count yourselves fortunate and begin building your content marketing production partnerships right away. But, as is often the case in higher education, you don’t always have a robust budget or skilled publications staff who can craft a polished publication.

Even if you don’t have access to professional design and production, the good news is that campuses often already have the content they need to create effective e-brochures. It’s just a matter of tracking that content down, assembling and perhaps lightly updating it, and then repackaging it in an electronically deliverable form (a PDF is often easy and effective).

So how do you create a compelling e-brochure that students will want to download, even if you’re operating on a modest budget and don’t have a ton of resources to devote to it? These six steps can help you repurpose existing content you already have available into an effective e-brochure.

Step 1: Go on a scavenger hunt for content across your website and in your marketing, admissions, and academic departments.

Look and ask for anything and everything that was created to market or simply inform students about a given program. Targets for your hunt can include print materials, e-communications, recruiting event invitations, program brochures, program one-sheets, admissions requirements, student videos, and so on.

Step 2: Think about what your marketing campaign will be seeking to achieve (likely this will be inquiry/lead generation) and the informational expectations of the audience you’ll be targeting.

Discuss with your campus colleagues the most frequent issues or pain points prospective students cite during their early discussions in the recruitment process. Are they concerned about program quality? Do they want more detail on the classes they will take or about their instructors? Is it something driven by emotion or fear, such as “Can I get into this program?” or “Can I afford it?”

Step 3: Make a prioritized list of all of the pain points and then do an audit of the items you found in your scavenger hunt from step one.

Match up your top three or four highest priority pain points with bits and pieces of content you found during your collection process, and then begin deconstructing the prior content. Borrow from and adapt that content until you develop a narrative flow that answers those student pain points. Good news: No one need worry about “plagiarism” here because you are taking content from your own campus resources. Go crazy with what you use from these other institutional assets.

Step 4: Edit the narrative into a lean, focused set of content, then give it a clear and punchy title that describes what key questions it answers.

As you edit and title the content you collected, ask yourself what problems this information solves for prospective students. If it’s more of a laundry list, that’s okay; just be clear that’s what it is, and use formatting such as bullets to signal that this is a collection of important information from multiple sources bundled into one convenient package.

Step 5: Work with a talented member of your team (or do it yourself) to assemble the content into a three- or four-page PDF document with a few images that go with the content.

This document does not need to win any awards for graphic design, but it should not look “ugly” or unprofessional. It should simply answer the questions your prospective students have, packaged so the content is easy to read.

Step 6: Offer up your new e-deliverable with an accurate call-to-action that is honest, straightforward, and that gives a clear sense of the value it has for prospective students.

Example: “Request our e-brochure today and learn the answers to the seven questions most commonly asked by students considering Program X at Y University.”

Now that I’ve shared some of my favorite “guerrilla” tactics for producing content marketing assets, I should mention that, while these approaches are useful in a pinch, there is also great potential value in partnering with your colleagues in marketing and communications to generate a more polished e-deliverable, video, or infographic. If that department is game to work with you, terrific! If, on the other hand, they hear your request and worry that the work will take too long, perhaps you can share the steps above to put their minds at ease.

Whatever path you end up walking, the key is to get started with content marketing as soon as possible so you can gain momentum and lay a foundation on which you can then build a proper, professional, and strategic program.

Final thought: Stay focused on the objective—to produce valuable content that students will find useful—and don’t put yourself in a position where the perfect becomes the enemy of the good!

As always, I invite your questions and opinions in the comments below, or email me and we can discuss strategies for creating content marketing that will satisfy prospective students and create opportunities for engagement. I will also be conducting a free webinar on paid interactive marketing, a valuable strategy for getting your content marketing seen and downloaded by prospective students.

Predicting student retention at community colleges



Enrollment at community colleges has been a hot topic lately, especially following President Obama’s announced plan to provide tuition-free classes for students at two-year institutions. In a recent blog, my colleague Mari Normyle shared some reactions based on assumptions about community college students and the quality of their educational experience. The data she shared may counteract some common assumptions people have about community college students (especially their commitment to academics), which illustrates why it is important to study and analyze the data about college student attitudes and behaviors.

One assumption many have is that when college students are satisfied, they are more likely to persist and complete their educations. Noel-Levitz has investigated this topic in recent years. We published a study by Dr. Laurie Schreiner called Linking Student Satisfaction and Retention, which found a significant link between satisfaction and persistence at four-year institutions. In another study last year, The Relationship of Student Satisfaction to Key Indicators for Colleges and Universities, my colleague Scott Bodfish and I reviewed institutional graduation rates, including those at community colleges, and found that colleges with higher graduation rates were also more likely to have higher student satisfaction scores.

However, while many community colleges have long had a commitment to assessing student satisfaction, there has been little definitive evidence that satisfaction with the experience was linked to individual student persistence at two-year institutions, until now.

Dr. Karen Miller, vice president for access and completion at Cuyahoga Community College (OH), in cooperation with Noel-Levitz, recently completed a national study of 22 institutions and more than 22,000 student records to examine student satisfaction and spring-to-spring persistence. The study, Predicting Student Retention at Community Colleges, is the first study of its kind with a national scope. Dr. Miller looked at student satisfaction and importance data from the Student Satisfaction Inventory (SSI) as well as additional institutional and student demographic variables to see how they predict student retention at community colleges.

P + P = R is the basic formula for student retention



Many of you have heard me recommend a basic formula for student retention which combines the leading indicators of retention with the actual retention outcome. That formula is P + P = R or persistence plus progression equals retention. While most colleges and universities have policies that allow students to persist from their first term to their second term, those same students may not have progressed, i.e., successfully completed their courses in the first term. I have had conversations with many student success professionals about the above formula and many of us believe that progression indicators are probably more predictive of first year retention than is the persistence indicator.

Let’s take a closer look at progression. Once grades for the first term are posted, many of you may begin to think about the progression indicator of GPA, which may have placed many first-year students on warning, probation, or suspension, depending upon your policy. At this point, you may ask yourself: Are our probation rates and our students’ attempted-to-earned credit hour ratios “normal” compared to similar schools? To find out, Noel-Levitz conducts a poll of leading persistence, progression, and retention indicators every other year. Many of you may have participated in the past or in the latest study. See the latest benchmark report to compare your rates with other schools that participated. Once you have compared your first-term outcomes, you may want to consider more intensive academic recovery strategies to improve progression rates among your students, which, in turn, affect your retention rate.
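The P + P = R formula and the attempted-to-earned credit hour ratio lend themselves to a simple calculation. Here is a minimal Python sketch; the field names, GPA cutoff, and completion-ratio threshold are illustrative assumptions, not a standard schema or policy:

```python
# A minimal sketch of the P + P = R indicators for a first-term cohort.
# Field names, the GPA cutoff, and the completion-ratio threshold are
# illustrative assumptions; substitute your institution's own policy values.

def progressed(record, min_gpa=2.0, min_ratio=0.67):
    """A student 'progresses' if first-term GPA and the earned-to-attempted
    credit-hour ratio both meet the institution's thresholds."""
    ratio = record["credits_earned"] / record["credits_attempted"]
    return record["gpa"] >= min_gpa and ratio >= min_ratio

def cohort_rates(records):
    """Persistence = returned for term two; progression = met the thresholds."""
    n = len(records)
    persisted = sum(1 for r in records if r["enrolled_term2"])
    prog = sum(1 for r in records if progressed(r))
    return {"persistence_rate": persisted / n, "progression_rate": prog / n}

cohort = [
    {"gpa": 3.1, "credits_attempted": 15, "credits_earned": 15, "enrolled_term2": True},
    {"gpa": 1.6, "credits_attempted": 15, "credits_earned": 6,  "enrolled_term2": True},
    {"gpa": 2.4, "credits_attempted": 12, "credits_earned": 9,  "enrolled_term2": False},
]
print(cohort_rates(cohort))
```

Even this toy cohort shows how the two indicators diverge: the second student persisted without progressing, and the third progressed without persisting.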

One must-do intervention for progression

To improve progression rates, I recommend implementing programs that require students to participate in developing their own academic recovery plans at the end of term one and/or the beginning of term two. These programs can take the form of courses, individual counseling, academic support, TRIO programs, or a combination of these services. If a student isn’t earning the required GPA or credit hours expected at the end of term one, immediate participation in such academic recovery programs must be expected.

Examples from campuses

I encourage you to discuss the following progression models with your retention committee or task force:

  • My friends at High Point University in North Carolina use an extensive review process (what one might call a 360-degree review) of each student’s first term and the student’s improvement plan for term two. The plan is developed with a success coach, referrals are made, and progress is monitored.
  • Academic recovery at Montana State University Billings comes in the form of a workshop in which all students must participate. Group and individual meetings are held with follow-up and monitoring by success coaches.
  • Albion College requires a course taught by counseling staff which has in-depth assessments, appropriate referrals, and ongoing monitoring as its key elements. To learn more about this model, join us at the Noel-Levitz National Conference (NCSRMR) this July in Boston, where Dr. Barry Wolf will deliver his very engaging workshop describing just how Albion manages academic recovery.

No matter what form your academic recovery strategy takes, please try to be timely in your delivery. Many of you might be on break when or shortly after grades are posted in December. This is the critical time to begin to assess and respond to the progression indicators.

Share your ideas and strategies
I would love to hear from you to learn more about your academic recovery strategies. Please post your ideas so that others might learn from you. Your ideas never cease to amaze us, and we’re all about helping one another strategize.

If you have questions about P + P = R, or if you’d like to discuss your strategies with me, please e-mail or contact me at 1-800-876-1117, ext. 5602.


Paid interactive marketing—a tactical example of content marketing for colleges



Content marketing allows campuses to cast a wide net and attract the interest of students who may not yet be specifically interested in that school.

In a prior blog post on content marketing for higher education, I focused on defining what content marketing approaches for prospective college student lead generation look like from a macro perspective. With this post, we’ll dive into the details to explore some specific and actionable approaches you might try on your campus.

As a reminder, and to frame this post for those who may have missed the previous blog, here’s the working definition for content marketing I shared in that post: “Using brochures or other media to provide those viewing online ads with value-added information that we send them in exchange for sharing their contact information.”

With a content marketing ad approach, we use a value-added information resource—an e-brochure, a video, or infographic—as a “carrot” offered in exchange for a prospective student’s sharing their contact information with us via a form embedded in the campaign’s landing page.

As a specific example, assume a school is using a Google AdWords search engine advertising campaign for marketing a master’s level education program. A click on an ad could lead a prospective student who is searching for education programs to visit a landing page where they can request a free brochure, “10 Keys to Advancing Your Career as an Educator.” All the visitor has to do is provide a few bits of information (name, email, maybe a qualifying question or two) and submit the form. Once they submit the form, they then see a “thank you” confirmation page where they can download the brochure and, ideally, an email message to their provided address will immediately land in their inbox with a link to that same brochure.


College website SEO: Tracking your results in Google Analytics



Co-written with Jennifer Croft and Alan Etkin. Jennifer Croft is an SEO consultant with 30 years of marketing experience who has worked on more than 500 websites, including 50 higher education websites. Alan Etkin has extensive experience using web analytics to manage large transactional websites, including 10 years in higher education.

Search engine optimization (SEO) is the single most cost-effective way to drive qualified traffic from prospective students to your campus website. Yet many campuses do not track key search metrics effectively, and either have an incomplete assessment of their search performance or are completely in the dark about it.

Skeptical about how important this is? Try these tips on using web analytics to measure the return on SEO, and then see how SEO compares to your other promotional efforts.

If you do not already have a web analytics tool in place, you’ll need to install one first. Google Analytics is the most popular application for web analytics tracking. It’s a free program that can provide most of the metrics you’ll need.  We’ll use this as our reference tool for the strategies we’ll discuss.

Keep track of organic traffic

Organic traffic refers to visitors who arrive at your site after clicking on a search result (not one of the paid search ads you see at the top or to the side of your search results). Google Analytics provides an Organic Search Channel Report that tracks how many visits you get from search engines. For an additional level of insight, you can combine this source information with behavior data—for example, how long search visitors stayed on your site, how many pages they clicked, and so on.
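One low-tech way to track organic traffic over time is to summarize an exported report. The sketch below assumes a hypothetical CSV export with medium, session, and duration columns; the column names and figures are invented for illustration, so adapt them to your own Google Analytics export:

```python
# Sketch: summarizing organic search traffic from a (hypothetical) analytics
# CSV export. Column names and numbers are illustrative assumptions.
import csv
import io

# Stand-in for a file exported from your analytics tool.
export = io.StringIO("""medium,sessions,avg_duration_sec,pages_per_session
organic,1200,95,3.4
cpc,400,60,2.1
referral,250,110,2.8
organic,800,102,3.9
""")

organic_sessions = 0
weighted_duration = 0.0
for row in csv.DictReader(export):
    if row["medium"] == "organic":
        s = int(row["sessions"])
        organic_sessions += s
        weighted_duration += s * float(row["avg_duration_sec"])

print(organic_sessions)                      # total organic sessions -> 2000
print(weighted_duration / organic_sessions)  # session-weighted avg duration
```

Tracking these totals month over month gives you the trend line that matters for SEO, even without keyword-level data.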

Getting around the “not provided” block of keyword phrases

Back in the good old days, Google Analytics tracked the keyword phrases that visitors used to get to your site, and it was easy to separate branded terms from non-branded, and qualified phrases from non-qualified. Beginning in November 2011, however, Google started withholding keyword phrases from users who were logged into a Google account. Then in September 2013, Google completely shut off this flow of information, citing privacy concerns. Yahoo followed suit in January 2014, and as a result, little information remains in this section of Google Analytics.

There are two silver linings to this otherwise dark cloud. First, you can recapture some of that keyword data in Google Webmaster Tools, by looking in the Search Traffic/Search Queries section. Second, if your Google Analytics account has been active for years, you still have a trove of keyword data on hand from before we entered the “not provided” era. You want to be careful about relying on old search data, but it can still provide useful insights into your current SEO strategies.

This is a sample of an SEO dashboard that tracks: the total count of new visitors who used organic search to enter directly to course pages; the % of returning visitors and new visitors; entrances from organic search; page “values” (based on pre-determined dollar values for conversion goals); and overall values, broken down by channel.


Use “Page Views” + “Organic Traffic” + “Entrances” + “New Visits” to measure progress

Given that so much keyword phrase data has been eliminated from Google Analytics, you’ll need to focus your measurement efforts in a new direction.


How did you react to President Obama’s proposal to make community colleges tuition-free?



Recently, President Obama drew strong reactions—both positive and negative—when he announced a plan to provide tuition-free classes for students at community colleges who attend at least half time, maintain a minimum GPA of 2.5, and make steady progress toward a degree.

Whether you reacted positively or negatively, I suspect your reactions, like mine, were based on some assumptions about community college students and the quality of their educational experience. So, together, let’s look at some data to learn more about these students and about this vital segment of American higher education.

Do your assumptions about community college students match up with these four facts?
The 2014 National Freshman Attitudes Report compared survey responses from more than 25,000 first-year students at two-year institutions across the country in 2013 to responses from a representative pool of first-year students at four-year institutions. Here are four points that jump out at me:

  1. Students at two-year institutions report that they are more likely to have found a potential career that strongly attracts them, with 82.2 percent of these students reporting this versus 77.8 percent of students at four-year public institutions and versus 81.5 percent of students at four-year private institutions.
  2. The data show that students at two-year institutions are just as likely as students at four-year institutions to report that they are strongly committed to their educational goals and willing to make the sacrifices needed to achieve them. This attitude was held by 90.2 percent of students at two-year institutions versus 90.9 percent of students at four-year public institutions and versus 91.6 percent of students at four-year private institutions.
  3. The data show that students at two-year institutions are just as likely as students at four-year institutions to report that they have strong study habits, as six in 10 students at both two-year and four-year institutions report they study hard for all their classes, even those they don’t like. This attitude was held by 63.2 percent of students at two-year institutions versus 59.1 percent and 61.3 percent of students at four-year public and private institutions, respectively.
  4. Students at two-year institutions appear to be more tolerant than their peers at four-year institutions, as the data show that 62.2 percent report being comfortable relating to someone who thinks quite differently about major social issues compared with 60.5 percent and 60.9 percent of students at four-year public and private institutions, respectively.

More strengths of community college students

When I look at the data from students midway through their first year of college, based on a mid-year report on freshman attitudes, I also see additional strengths and gains made by students at two-year institutions:

  • 61.4 percent now report having a “very good grasp” of scientific ideas they have studied—an improvement of 24.1 percentage points from the beginning of the year;
  • 74.6 percent now report they have developed a solid system of self-discipline—an improvement of 16.4 percentage points from the beginning of the year; and
  • 74.9 percent now report they are capable of writing clear and well-organized papers—up 20.3 percentage points from the beginning of the year.

Students at two-year institutions are also open to receiving assistance. The top five requests for institutional assistance from freshmen at two-year institutions midway through their first year include:

  • 56.4 percent would like help improving math skills;
  • 52.0 percent would like to discuss qualifications needed for certain occupations;
  • 51.5 percent would like help with an educational plan to get a good job;
  • 50.6 percent would like help in improving writing skills; and
  • 48.6 percent would like to discuss salaries and outlooks for various occupations.

These requests are very similar to those expressed by students at four-year institutions, both public and private, suggesting that students at community colleges may have more in common with their peers at four-year institutions than we often assume.

Certainly, students at two-year institutions have some challenges, too. For example, they report having less confidence in their math and science abilities and in their verbal skills. Also, a lower percentage report that they have the financial resources they’ll need to finish their college programs. In addition, they are more likely to be first-generation college students and 43.6 percent are working 20 hours per week or more at a job.

In summary
As we explore the potential of President Obama’s proposal and focus on the value of community colleges and the critical role they play in our communities and our nation, we should also focus on the data that is available to help us understand the students who enroll at community colleges. Moreover, campuses of all types—two-year and four-year, public and private—need to examine the academic motivation, general coping skills, and receptivity to institutional assistance that their particular students bring to support their achievement of academic goals (certificate, degree, or transfer) and to build on their many strengths.

For more information on community colleges and their students, I encourage you to refer to our entire series of National Freshman Attitudes Reports, as well as the survey instruments on which these reports are based, which are available for any college to use. Let’s keep working to unmask our assumptions. If you have questions about the reports or assessments, or if you would like to discuss your assumptions about community college students, please contact me at 1-800-876-1117.

Why you need to track source code performance in your college student pool



Proper source code analysis can help campuses understand the origin of their inquiries, applicants, admits, and enrolled students.

I had a recent discussion with several colleagues regarding the need in higher education to better understand source code performance and its impact on enrollment success, as well as how it informs and affects future recruitment strategies and effectively projects enrollment. This conversation triggered a realization that seems so simple yet eludes so many of us: A major issue in enrollment management today begins at…the beginning.

What I mean by this is that a growing number of institutions lack a fundamental understanding of which initial source codes result in the most inquiries, applications, admits, and enrolled students. Many enrollment managers have little understanding of source code performance, and others lack the time, resources, or know-how to track the effectiveness of the events that bolster results at the top of the funnel as well as through the rest of the funnel.

In addition, there is a need for enrollment professionals to become reacquainted with which sources to deliberately track. Although many schools have been quite effective in knowing the who, what, when, and where of tracking data to inform strategy, others have continued to follow the path of “this is what we have always done” instead of using data to guide decision making as much as they should. By no means am I insinuating that this comes from poor leadership in most cases; instead, I see that this is a product of antiquated technologies, few professional development opportunities, a lack of data analysts on campus who also understand enrollment management, and the cumulative effect of poorly collected data over many years.

So what is wrong, and how do we begin to fix it? I believe the first steps toward getting on the right track come from answering three basic questions:

  • What sources currently produce any inquiries in our pool?
  • Are we effectively tracking these sources, and if so, how are they converting through the entire funnel?
  • How are we using the data to help inform and affect recruitment strategies?

Here’s how to answer each one of those questions in order to make your college student source code analysis more strategic and illuminating.
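As a rough illustration of the first two questions, the sketch below tallies cumulative funnel counts by initial source code; the source codes, stage names, and records are hypothetical:

```python
# Sketch: tracking funnel conversion by initial source code.
# Source codes, stage names, and records are hypothetical examples.
from collections import Counter

# (initial_source_code, furthest_stage_reached) for each student record
records = [
    ("CAMPUS_VISIT", "enrolled"), ("CAMPUS_VISIT", "applied"),
    ("PURCHASED_LIST", "inquired"), ("PURCHASED_LIST", "inquired"),
    ("COLLEGE_FAIR", "admitted"), ("CAMPUS_VISIT", "enrolled"),
]

ORDER = ["inquired", "applied", "admitted", "enrolled"]

def stage_counts(records, source):
    """Cumulative counts: a student who enrolled also counts at every
    earlier stage of the funnel."""
    counts = Counter()
    for src, stage in records:
        if src == source:
            for s in ORDER[: ORDER.index(stage) + 1]:
                counts[s] += 1
    return counts

visit = stage_counts(records, "CAMPUS_VISIT")
print(visit["inquired"], visit["enrolled"])   # 3 2
print(visit["enrolled"] / visit["inquired"])  # inquiry-to-enrollment yield
```

Running the same tally for each source code answers question one (which sources produce inquiries) and question two (how each source converts through the funnel); question three is then a matter of reallocating effort toward the sources that convert.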

Eight ways to get the most from college recruitment funnel tracking


Accurate funnel tracking for college student recruitment remains critical. Why? Because accurate college student funnel data remain one of the best resources available to project enrollment for today’s colleges and universities.

The following are some specific suggestions for how colleges and universities can get the most value from funnel tracking efforts in today’s higher education environment. For further information or discussion, consider arranging a complimentary telephone consultation with a Noel-Levitz enrollment consultant.

1) Use multiple funnels when you track your institution’s funnel data. As Noel-Levitz’s latest data demonstrate in the 2014 Recruitment Funnel Benchmarks Report, different types of students convert and yield at different rates, so it is no longer possible to use a “one-size-fits-all” funnel. We recommend that most four-year public and private campuses should, at minimum, be using separate funnels for traditional and nontraditional-age freshmen, transfers, in-state, out-of-state, international, and paper vs. online applicants. In addition, separate funnels should be used for those who enter at the application stage (secret shoppers) vs. those who enter at the inquiry stage.

2) Fine-tune your enrollment predictions by comparing your current funnel data to your institution’s funnel data from previous years. It is essential that every institution look back at its own internal benchmarks first, even before examining external benchmarks such as those from Noel-Levitz. By examining your institution’s historic conversion rates at each stage of the admissions cycle and for each type of applicant, you can better predict where your future enrollment will end up as each day and week of the admissions cycle unfolds. For effective internal benchmark comparisons, we advise our client institutions to store and analyze three to five years of comparative data. To more fully understand how to use this historical trend data to predict and influence enrollment, see the table and illustrations on pages 2 and 3 of the Noel-Levitz white paper, 7 Categories of Admissions Data to Guide Decision-Making.
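The internal benchmarking described above can be sketched as a simple projection: average your historic stage-to-stage conversion rates and apply them to the current pool. All counts and rates below are made up for illustration:

```python
# Sketch: projecting enrollment from historic stage conversion rates.
# All counts below are made-up illustrations, not benchmark data.

history = {  # year -> (applicants, admits, enrolled)
    2012: (2000, 1400, 560),
    2013: (2100, 1450, 580),
    2014: (2200, 1500, 600),
}

# Average historic rate at each funnel stage
admit_rate = sum(adm / apps for apps, adm, _ in history.values()) / len(history)
yield_rate = sum(enr / adm for _, adm, enr in history.values()) / len(history)

# Apply the historic rates to this year's applicant pool
current_applicants = 2300
projected_enrollment = current_applicants * admit_rate * yield_rate
print(round(projected_enrollment))
```

A real model would run one projection per funnel (freshman, transfer, in-state, and so on, as in suggestion 1) rather than a single blended rate, and would update the projection as the cycle unfolds.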


Why we need to hear about college student retention programs that are working



Excellent retention programs make a difference in student success every day on college campuses. All across the country, campus retention professionals go the extra mile to help as many students as possible succeed, persist, and complete their educations. It’s important to recognize these remarkable efforts, not just to acknowledge the fantastic work of those dedicated colleagues, but so that others can learn from their examples and make vital changes in their own retention efforts.

So, are you ready to share your story and be recognized for your efforts? Then you should apply for a Retention Excellence Award (REA).

Lee Noel and Randi Levitz started these awards in 1989 in order to celebrate exceptional retention programs and promote awareness of effective retention practices. The REAs honor the retention achievements of postsecondary institutions throughout North America. More than 165 colleges and universities have received an REA since the program began. As a result of this national exposure, these award-winning programs have served as models of retention excellence to stimulate the creativity and energy of hundreds of two-year and four-year institutions.

These success stories have been compiled in The Compendium of Successful, Innovative Retention Programs and Practices, a valuable resource that provides descriptions of the programs recognized over the years. You can download the compendium to find new ideas that may be just what your campus needs to spark additional retention improvement. It includes many examples, such as:

  • Madonna University (MI) developing the Bridging Lost Gaps (BLG) initiative in 2011 to increase the recruitment and retention of African American male students.
  • Paul Smith’s College (NY) implementing a Comprehensive Student Support Program as part of a strategically driven change in focus to a holistic student success model.
  • The Six Pillars of Retention that Seward County Community College/Area Technical School (KS) followed to improve Hispanic student retention.
  • Virginia Commonwealth University developing a proactive advising program for undeclared students.

These programs and many others are shared so you don’t have to re-invent the wheel. You just need to follow their lead and make it work for your own campus.

The satisfaction and priorities of online college learners


Are online students satisfied with instruction?

As we discussed in a previous blog, faculty interactions are a pivotal point of the college student academic experience. This is true for students in online courses as well. The quality of instruction, clearly defined assignments, and faculty responsiveness are three key elements that influence student perceptions about the academic quality of their online experience. So how do online learners nationally rate their experience with faculty?

The 2014-2015 National Online Learners Priorities Report provides insight into student perceptions in this area. The report reflects responses from more than 122,000 college students enrolled in undergraduate and graduate online courses between the fall of 2011 and the spring of 2014. The tables below show three key ratings that illustrate student perceptions of the instruction they receive:

  • Importance: The percentage of online learners who said that an issue was important to them.
  • Satisfaction: The percentage of students who were satisfied with that issue.
  • Gap: The difference between the importance and satisfaction scores. Higher gap scores indicate that students are not as satisfied on issues of high importance.

On instruction-related items, online learners had high gap scores on four out of five key issues:


Two of these issues, faculty being responsive and faculty providing timely feedback, while also important to students in traditional programs, take on an even greater priority for students in online courses.  The nature of the 24/7 online learning environment may create an expectation of constant availability of faculty.  It is important for online faculty to communicate guidelines on what students can expect for responsiveness, and then stick to those promised timeframes.

The perception of the quality of instruction is a challenge item (high importance, large performance gap) and a priority for improvement. Note that while 72 percent of students indicated that they were satisfied or very satisfied in this area, the 95 percent of students who labeled it important creates a gap of 23 points, which is why it is a challenge. Online learning institutions can influence perceptions of the quality of instruction through their marketing and orientation efforts, as well as with appropriate training of their online faculty.
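The gap arithmetic above is straightforward to compute. In this sketch, the quality-of-instruction figures come from the discussion above, while the other items’ scores and the “challenge” cutoffs are invented for illustration:

```python
# Sketch: computing importance-satisfaction gap scores and flagging
# "challenge" items (high importance, large gap). The quality-of-instruction
# scores are from the text; the other items and the cutoffs are assumptions.

items = {
    "Quality of instruction": {"importance": 95, "satisfaction": 72},
    "Faculty are responsive": {"importance": 94, "satisfaction": 75},
    "Registration is convenient": {"importance": 90, "satisfaction": 85},
}

def gap(scores):
    """Gap = importance minus satisfaction; bigger means a bigger problem."""
    return scores["importance"] - scores["satisfaction"]

challenges = [
    name for name, s in items.items()
    if s["importance"] >= 90 and gap(s) >= 15   # illustrative cutoffs
]

print(gap(items["Quality of instruction"]))   # -> 23
print(challenges)
```

The same two-threshold test (is it important, and is the gap large?) is what separates a genuine challenge item from an issue that merely scores imperfectly on satisfaction.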

The priority of tuition paid being a worthwhile investment is an issue for online learners as well as traditional students.

However, while the perception of tuition value is a challenge for online learners, they had considerably higher levels of satisfaction than students at four-year institutions on this item, where satisfaction ranged from 46-52 percent. (See the 2014 National Student Satisfaction and Priorities Report for more details on student satisfaction among traditional college students).

Download the 2014-2015 National Online Learners Priorities Report to see more satisfaction and priorities responses on issues such as enrollment services, academic services and student services.  The full report also includes a list of the top factors influencing online learners’ decisions to enroll in their program.

You can also learn more about the survey instrument used for the report, the Priorities Survey for Online Learners.  And as always, I am happy to answer any of your questions too.  Just send me an email or connect with me on Twitter.


Michael Lofstead