ORLANDO, Fla. — Talk to presidents of liberal arts colleges and they are proud of how their institutions educate graduates and prepare them for life. But ask the presidents to prove that value, and many get a little less certain. Some cite surveys of alumni satisfaction or employment. Others point to famous alumni.
And, privately, many liberal arts college presidents admit that their arguments haven’t been cutting it of late with prospective students and their parents (not to mention politicians), who are more likely to be swayed by the latest data on first-year salaries of graduates, surveys that seem to suggest that engineering majors will find success and humanities graduates will end up as baristas.
Richard A. Detweiler believes he has evidence — quantifiable evidence — that attending a liberal arts college is likely to yield numerous positive results in graduates’ lifetimes, including but not limited to career and financial success. He has been giving previews of his findings for the last year. On Friday, at a gathering here of presidents of the Council of Independent Colleges, he presented details and said he believes the results have the potential to change the conversation about liberal arts colleges. He said his findings show that the key characteristics of liberal arts colleges — in and out of the classroom — do matter.
At the meeting, Detweiler described his project. He started by examining the mission statements of 238 liberal arts colleges, looking at what the colleges say they are trying to accomplish with regard to their students. Among the common goals given for graduates were to produce people who would continue to learn throughout their lives, make thoughtful life choices, be leaders, be professionally successful and be committed to understanding cultural life.
Then Detweiler and colleagues conducted interviews with 1,000 college graduates — about half from liberal arts colleges and half from other institutions. The graduates were not asked about the value of their alma maters or of liberal arts education, but were asked a series of very specific questions about their experiences in college and then their experiences later in life. The graduates ranged from 10 to 40 years past graduation, and conclusions comparing liberal arts graduates with other graduates were drawn only when there was statistical significance for both relatively recent and older alumni. Some of the findings may be relevant to liberal arts disciplines at institutions other than liberal arts colleges, but the comparison was between those who attended liberal arts colleges and those who did not.
What Detweiler found was that graduates who reported key college experiences associated with liberal arts colleges had greater odds of achieving the measures of life success associated with the goals of those colleges. Here are some of the findings:
Graduates who reported that in college they talked with faculty members about nonacademic and academic subjects outside class were 25 to 45 percent more likely (depending on other factors) to have become leaders in their localities or professions. Those who reported discussions on issues such as peace, justice and human rights with fellow students outside class were 27 to 52 percent more likely to become leaders.
Graduates who reported that students took a large role in class discussions were 27 to 38 percent more likely to report characteristics of lifelong learners than others were. Students who reported most of their classwork was professionally oriented were less likely to become lifelong learners.
Graduates who reported that as students they discussed philosophical or ethical issues in many classes, and who took many classes in the humanities, were 25 to 60 percent more likely than others to have characteristics of altruists (volunteer involvement, giving to nonprofit groups, etc.).
Graduates who reported that as students most professors knew their first names, and that they talked regularly with faculty members about academic subjects outside class, were 32 to 90 percent more likely to report that they felt personally fulfilled in their lives. Those who reported that professors encouraged them to examine the strengths and weaknesses of their own views, and whose course work emphasized questions on which there is not necessarily a correct answer, were 25 to 40 percent more likely to report that they felt personally fulfilled.
But What About Money?
Detweiler saved for last the characteristic that gets so much attention these days, and that liberal arts college leaders fear hurts them: money.
Since the beginning of the Great Recession in 2007, the history major has lost significant market share in academia, declining from 2.2% of all undergraduate degrees to 1.7%. The graduating class of 2014, the most recent for which there are national data, included 9% fewer history majors than the previous year’s cohort, compounding a 2.8% decrease the year before that. The drop is most pronounced at large research universities and prestigious liberal arts colleges.
This is unfortunate — not just for those colleges, but for our economy and polity.
Of course it’s not just history. Students also are slighting other humanities disciplines including philosophy, literature, linguistics and languages. Overall, the core humanities disciplines constituted only 6.1% of all bachelor’s degrees awarded in 2014, the lowest proportion since systematic data collection on college majors began in 1948.
Conventional wisdom offers its usual facile answers for these trends: Students (sometimes pressured by parents paying the tuition) choose fields more likely to yield high-paying employment right after graduation — something “useful,” like business (19% of diplomas), or technology-oriented. History looks like a bad bet.
A historian, however, would know that it is essential to look beyond such simplistic logic. Yes, in the first few years after graduation, STEM and business majors have more obvious job prospects — especially in engineering and computer science. And in our recession-scarred economic context, of course students are concerned with landing that first job.
Over the long run, however, graduates in history and other humanities disciplines do well financially. Sen. Marco Rubio, who recently disparaged philosophy majors, would be surprised to learn that after 15 years, those philosophy majors have more lucrative careers than college graduates with business degrees. History majors’ mid-career salaries are on par with those holding business bachelor’s degrees. Notably, these salary findings exclude those who went on to attain a law or other graduate degree.
The utility of disciplines that prepare critical thinkers escapes personnel offices, pundits and politicians (some of whom perhaps would prefer that colleges graduate more followers and fewer leaders). But it shouldn’t. Labor markets in the United States and other countries are unstable and unpredictable. In this environment — especially given the expectation of career changes — the most useful degrees are those that can open multiple doors, and those that prepare one to learn rather than do some specific thing.
All liberal arts degrees demand that kind of learning, as well as the oft-invoked virtues of critical thinking and clear communication skills. History students, in particular, sift through substantial amounts of information, organize it, and make sense of it. In the process they learn how to infer what drives and motivates human behavior from elections to social movements to board rooms.
Employers interested in recruiting future managers should understand (and many do) that historical thinking prepares one for leadership because history is about change — envisioning it, planning for it, making it last. In an election season we are reminded regularly that success often goes to whoever can articulate the most compelling narrative. History majors learn to do that.
Everything has a history. To think historically is to recognize that all problems, all situations, all institutions exist in contexts that must be understood before informed decisions can be made. No entity — corporate, government, nonprofit — can afford not to have a historian at the table. We need more history majors, not fewer.
There is no reason to unduly limit our students’ horizons. Following your interests does not doom you to a life of poverty and struggle.
by Michael W. Clune in the Chronicle Review
I was nearly 30 the first time I met an example of the new breed — a University of Michigan graduate who knew nothing beyond what was necessary to pursue his trade. It was my first job out of graduate school, and Michigan had one of the highest-ranked engineering schools in the country.
Let’s call him Todd. He’d graduated a few years before. I met him at a party. He had a good job at a local engineering firm and drove a nice car. Talk turned to intellectual matters, and I soon learned that he was a creationist. He didn’t seem to be aware of arguments for the other side.
He was surprised to learn that Russia had fought in World War II. He’d done well in AP high-school English, which had gotten him out of having to take literature classes, and he hadn’t read a book since graduating from college. “Most manuals nowadays are online,” he said.
Learning that I was an English professor, he asked me if I’d be willing to help him with a self-assessment document he had to write for his job. I was curious, and when a few days later his draft landed in my inbox, I discovered that his writing suffered from basic flaws.
I think even those most committed to putting vocational training at the center of higher education will agree that Michigan had failed Todd. The key Todd-prevention mechanism, which had somehow malfunctioned in this case, is known as general education. This set of courses required for all majors is designed to transmit the rudiments of critical thinking, writing, science, history, and cultural literacy to the students whom our universities are training — as Wisconsin’s Gov. Scott Walker memorably put it — to meet our “work-force needs.”
To begin to illustrate the threats that gen ed now faces, let me introduce another figure. We’ll call him Donald…
“The power of ‘the Eye of the Heart,’ which produces insight, is vastly superior to the power of thought, which produces opinions,” the great British economic theorist and philosopher E.F. Schumacher wrote in his 1973 meditation on how we know what we know. He was responding to the Persian poet and philosopher Rumi who, seven centuries earlier, extolled “the eye of the heart” as seventy-fold more seeing than the “sensible eyes” of the intellect. To the intellectually ambitious, this might sound like a squishy notion — or a line best left to The Little Prince. But as contemporary scientists continue to shed light on how our emotions affect our susceptibility to disease, it is becoming increasingly clear that our emotional lives are equipped with a special and non-negligible kind of bodily and cognitive intelligence.
The nature of that intelligence and how we can harness its power is what Martha Nussbaum, whom I continue to consider the most compelling and effective philosopher of our time, examines in her magnificent 2001 book Upheavals of Thought: The Intelligence of Emotions (public library). Titled after Proust’s conception of the emotions as “geologic upheavals of thought,” Nussbaum’s treatise offers a lucid counterpoint to the old idea that our emotions are merely animal energies or primal impulses wholly separate from our cognition. Instead, she argues that they are a centerpiece of moral philosophy and that any substantive theory of ethics necessitates a substantive understanding of the emotions.
A lot is at stake in the decision to view emotions in this way, as intelligent responses to the perception of value. If emotions are suffused with intelligence and discernment, and if they contain in themselves an awareness of value or importance, they cannot, for example, easily be sidelined in accounts of ethical judgment, as so often they have been in the history of philosophy. Instead of viewing morality as a system of principles to be grasped by the detached intellect, and emotions as motivations that either support or subvert our choice to act according to principle, we will have to consider emotions as part and parcel of the system of ethical reasoning. We cannot plausibly omit them, once we acknowledge that emotions include in their content judgments that can be true or false, and good or bad guides to ethical choice. We will have to grapple with the messy material of grief and love, anger and fear, and the role these tumultuous experiences play in thought about the good and the just.
Emotions are not just the fuel that powers the psychological mechanism of a reasoning creature, they are parts, highly complex and messy parts, of this creature’s reasoning itself.
One of Nussbaum’s central points is that the complex cognitive structure of the emotions has a narrative form — that is, the stories we tell ourselves about who we are and what we feel shape our emotional and ethical reality, which of course is the great psychological function of literature and the reason why art can function as a form of therapy. What emerges is an intelligent manifesto for including the storytelling arts in moral philosophy.
A wonderful piece from one of my favorite historians
Our mission is simple. And it means death to one of our greatest lusts.
by Allen Guelzo, PhD in Christianity Today
It is very nearly four decades since, as a terribly callow graduate student with an interest in philosophy, I made a pilgrimage with a friend to the home of a professor of Christian apologetics. I was looking for direction, and even though Cornelius Van Til had been retired for many years, he was known to welcome inquirers—whom he often greeted on his front porch with a rake in hand, suggesting that perhaps they could pile up his leaves for him before they talked.
I was hoping to hear an intimidating, intellectually convoluted, scholastic, metaphysical strategy for blowing the philosopher’s version of Gideon’s trumpet. Van Til, then pushing 80, stood with his hard white comb of hair brushed back from his cliff-like brow, and the smile of an old Dutch dairy farmer (which his father had been). I asked, “Dr. Van Til, why did you decide to devote your life to the study of philosophy and the teaching of apologetics?”
And I then sat back to allow the metaphysics free room to roll. Van Til never blinked.
“Why,” he said, “to protect Christ’s little ones.”
The surprise that could have dropped me to the floor that afternoon has never quite evaporated. Why, to protect Christ’s little ones. Not only because those words express a great nobility in a few syllables, but because, remembering them, they cast down every castle of intellectual folly I erect, or am tempted to erect. And because, at the end, I am not worthy of them, and because anyone who understands that the kingdom of God is our true home, that God’s people are truly our people, and that this is a world by turns indifferent and hostile to both, must see those words as a true reminder of what we owe to each other as Christians, and in what relation we stand to each other.
I recall those words—Why, to protect Christ’s little ones—with tears, both because I have not always lived according to them, and because it is precisely the world of the scholar and historian that encourages me to ignore them…
Allen Guelzo is the Henry R. Luce III Professor of the Civil War Era at Gettysburg College, where he serves as Director of the Civil War Era Studies Program. He is widely recognized as one of the leading scholars of the Civil War in general and Abraham Lincoln in particular. He is also a committed Christian and churchman. Though far better known for his work on Lincoln and the Civil War, he has also written noteworthy studies on Jonathan Edwards and free will and on the Reformed Episcopalians, as well as co-editing a book on the New England Theology. (Bio by Justin Taylor, whom I gratefully thank for pointing me to Allen’s article.)
The status of victim has been weaponized at campuses across the nation, but there is at least one encouraging sign.
By Roger Kimball
For more than a week now, the country has been mesmerized, and appalled, by the news emanating from academia. At Yale the insanity began over Halloween costumes. Erika Christakis, associate master of a residential college at Yale, courted outrage by announcing that “free speech and the ability to tolerate offense are the hallmarks of a free and open society” and that it was not her business to police Halloween costumes.
To people unindoctrinated by the sensitivity training that is de rigueur on most campuses today, these sentiments might seem unobjectionable. But to the delicate creatures at Yale’s Silliman College they were an intolerable provocation. What if students dressed as American Indians or Mexican mariachi musicians? Angry, hysterical students confronted Nicholas Christakis, Erika’s husband and the master of Silliman, screaming obscenities and demanding that he step down because he had failed to create “a place of comfort, a home” for students. The episode was captured on video and went viral.
At the University of Missouri, Jonathan Butler, the son of a wealthy railroad executive (2014 compensation: $8.4 million), went on a hunger strike to protest what he called “revolting” acts of racism at Mizzou. Details were scanty. Nevertheless, black members of the university football team threatened to strike for the rest of the season unless Tim Wolfe, Mizzou’s president, stepped down. A day or two later, he did.
I am now always searching for safety, and I appreciate safe spaces — the ones I create for my students in a classroom, the ones I create with my writing and the ones others create, too — because there is so much unsafe space in this world.
This past week, the news media has energetically discussed student unrest at Yale and at the University of Missouri, where students are protesting administrative insensitivity or inaction in the face of troubled racial climates. At Mizzou, in particular, student activists have demanded safe space. A student journalist, Tim Tai, was denied access to the protesters’ tent city in a public area of the campus. The protesters didn’t want to be photographed or interviewed, possibly not trusting journalists to tell their story accurately.
The next day, they rightly changed their stance, opened their space to the media, and a debate on free speech and safe spaces found new life. Quickly, the student protesters were accused of not tolerating free speech in regard not only to Mr. Tai, but also to those who use racial epithets and otherwise engage in hate speech. They were accused of being weak, of being whiny for having the audacity to expect to attend college without being harassed for their blackness.
As a writer, I believe the First Amendment is sacred. The freedom of speech, however, does not guarantee freedom from consequence. You can speak your mind, but you can also be shunned. You can be criticized. You can be ignored or ridiculed. You can lose your job. The freedom of speech does not exist in a vacuum.
More fundamentally, though, the protests are a result of longer-term frustration and anger over persistent shortcomings in the inclusiveness of the campus culture and of the daily interactions that enable students to feel fully accepted and embraced for who they are. Inclusiveness is an important value, but especially so on a highly residential campus and in a tight-knit campus community like that of Ithaca College.
In the wake of decisions by the president of the University of Missouri system and the chancellor of its Columbia campus to step down in the face of similar protests, many people wonder if such resignations will become a trend. At Ithaca College, as well, the focus of student protest has been on me as president, not because of anything prejudicial I have done but due to a belief that the campus climate is not what it should be and that the buck stops with the president.
It is impossible to know whether the current wave of campus activism will increase leadership turnover. So much depends on institution-specific circumstances such as the extent of board support for a given president and whether there are any hidden issues in play in addition to the public issues that engender the protests.
But it is highly likely that we have entered a new era of student activism focused on inclusivity and bias.
The protest movement that started at the University of Missouri at Columbia has outlasted the president of the University of Missouri System, who resigned on Monday. While Missouri had some unique factors, in particular a boycott started by the black members of the football team, campuses nationwide are seeing protests by students over racial tensions without the benefit of support from football teams. Some of the protests are expressions of solidarity with the black students at Missouri, but many go beyond that to talk about racial conditions on their own campuses, which many describe as poor.
The movement is showing up at large campuses and small, at elite and not-so-elite institutions, at campuses with strong histories of student activism and at those without. On some campuses the focus is on issues experienced by black students, while on others the discussion is about many minority groups. More protests are planned for this week.
As the weekend ended, students who had been staging a sit-in in the library at Amherst College, with a long list of demands, agreed to leave, despite not getting their demands met. At the University of Kansas, minority students are demanding the resignation of student government leaders who they say haven’t done enough for all students, and an alumnus has started a hunger strike on campus — all on a campus that last week held a lengthy open forum on race relations with the university president, who on Friday issued a statement of support for minority students.
The protests are also provoking considerable backlash — with online threats appearing at many campuses. The threats have led to several arrests, and while there is no indication that those posting them actually intended to carry them out, they have caused fear at many campuses.
Headlines scream … Ex-Christians, Young Adults Leaving the Faith, A Generation of Dropouts, Quitting Church, the Rise of the Nones. We are on the verge of a crisis with faith and the faithful in retreat. Could we be the last Christian generation? We must rally the troops, cut our losses, and tackle the problem.
Jonathan P. Hill, assistant professor of sociology at Calvin College, recently published a short book, Emerging Adulthood and Faith, that explores the change (or absence of change) in the religious faith of young adults — the so-called Generation Y or Millennials, born roughly between 1980 and 2000. Much has been written of late decrying the loss of a generation, with the explicit or implicit assumption that the experience and trajectory of Millennials is unique.
But do we have such a catastrophic problem?
Hill, like many other sociologists, argues that we need to take a closer look at hard data. Only then will we make good decisions for the right reasons. Bluntly: we need the facts. Hill’s book is a valuable fact-based resource for anyone involved in church leadership, from lead pastor to youth pastor to elder. It runs less than a hundred pages and is very readable.
First, we need to frame the problem.
At any given moment, age differences between people can be the result of what demographers refer to as age, period, and cohort effects. (p. 7)
Some changes depend on stage of life. In many respects, those who are 20 today reflect the same trends and attitudes as those who were 20 in the ’50s, ’70s, and ’90s. For instance, the elderly pray more than the young: in 1983, 42% of those 18-29 reported daily prayer, compared with 68% of those 70 and above; in 2012 the numbers were 39% and 71%. This isn’t a generational change; it depends more on biological age and stage of life.
Some changes depend on period — historical shifts and cultural transformations that cut across generations. These events have a significant effect on everyone alive at the time, both old and young. The Reformation was one such historical shift; the Great Depression and World War II are other examples.
Other changes are generational or cohort effects. These can be significant, but it is important to separate them from effects depending on stage of life and from larger cultural transformations. Bad information will lead to bad responses.
Second, we need to consider the source. There is power in stories, but anecdotal data will deliver only part of the picture. Individual stories are always rooted in a specific social context and experience.
Good social science allows us to take a step back from our own experiences, providing data that can give us glimpses of powerful social forces we would otherwise be blinded to. These forces are the background to our lives, though we are often unaware of their influence. Social science can help us see these contexts. (p. 10)
It is important to consider both stories and survey data. In his book Hill focuses on survey data to consider three big issues: the religious identity of young people, the influence of higher education, and the influence of science on faith. He doesn’t disregard stories, but looks at the data that provide a larger context for the individual stories.
Is the religious identity of young adults changing?
Certainly the data tell us that “somewhere between 50 and 70 percent of teenagers who were regularly in church on Sunday morning are no longer attending by their early twenties.” (p. 15) In fact, the drop in attendance begins before age 15 and levels out around 18 or 19. But much of this is a trend based on biological age, not a cohort or generational effect. About 12-13 percent of 18-29 year olds attend a Protestant church every week, a number that has been remarkably steady for the last forty years. Similarly, about 42% of 18-29 year olds pray at least daily, also steady over that period. There has been a significant change with the rise of the “nones,” but it comes from those who were less committed to begin with: a significant drop in those who attend services or pray occasionally, and a concomitant rise in those who never attend and never pray.
At least among Protestants, every concerned parent and pastor should know this: The percentage of young people with a strong Protestant identity, and the percentage who regularly practice their faith publicly and privately has barely budged over the past forty years. They are in your churches, youth groups, and Bible studies. Yes, something has been happening at the margins, but the center has held. (p. 20)
To me this suggests that we don’t need massive institutional transformations. We need to continue engaging people, families, young and old, in strong Christian community.
There is no evidence that Millennials are “drastically different from previous generations” except that those loosely affiliated with religious identity are now willing to drop it altogether. There is little social advantage to maintaining this kind of religious identity.
Is secular college education a threat to faith?
Bluntly, the answer is no, but there is a generational element to this one.
While stories of individual faith loss are real and deserve our attention, there is little evidence of widespread disaffection from the faith as a result of secular higher education. (p. 29)
When college graduates are compared to the general population, a different trend appears.
College graduates are actually more likely to practice their faith and say it is important in their daily life. They are no more likely to disaffiliate from a religious faith, although they do appear more likely to shy away from exclusivist claims about the Bible and more prone to switch to a mainline Protestant denomination. (p. 30)
Hill digs deeper and compares those with and without a college education, controlling for religious affiliation as younger teenagers. In the past, college education did have a secularizing influence, but this isn’t true for Millennials. At last a distinction, but it isn’t a negative one.
Overall college graduation has a positive correlation with regular participation in a church, perhaps because college graduates have more faith in institutions than those who have never attended college.
Hill also finds that evangelical colleges and universities are doing a good job of nurturing faith. “On every measure of religious practice and identity, individuals attending evangelical Protestant colleges decline considerably less than their counterparts at other schools.” (p. 50) For example, at evangelical Protestant schools 73% of students who prayed regularly continue to do so, a number that drops to 56% at public schools. This probably results both from the curricular and social environment of the schools and from the family commitments and support experienced by these students.
Does education in science drive young people from the faith?
Again the aggregate answer is no. Surprisingly, education in science in general, and evolution in particular, seems to have little effect on faith commitments. (As a science professor I cringe at this one – students form their views on “touchy” issues based on things other than the evidence.) And certainly there are plenty of stories of people struggling with these issues and changing their views. Hill doesn’t disregard or discount these (after all, the NSYR survey indicated that some 40 percent of young people with creationist views in 2002-3 had changed their views by 2007-8, whether they went to college or not). But science education seems to have only a minor influence on the positions most people take.
People come to accept or reject evolution not as a result of poring over the details and evidence, but as a symbolic gesture to indicate to others where they belong in the socio-political landscape. This requires understanding that religious beliefs, and beliefs about other contentious public issues, are intertwined with identity and social relationships. Formal science education, for most young people, is unlikely to change this. (p. 57)
Those who enter college with particular views concerning evolutionary biology are likely to retain them throughout the experience. The more tightly these views are coupled to social identity, the more tightly they will be held. This can be a wrenching experience for those whose views do change, bringing them into conflict with their social community, but it isn’t a widespread crisis for the church.
Why do doomsday scenarios carry such appeal?
Hill suggests three reasons:
“First, our own social experience strongly colors how we frame the problem and interpret the data, yet our personal experience can frequently be an unreliable guide.” (p. 61)
There is a distortion in perception that arises from the ways we receive data. Personal stories are powerful and true, but the ones that “stick” are the stories from the margins, not the mainstream. Hill points out that many Americans feel that violent crime is up, while the data show that homicide, for example, is markedly down since 1990 and at about the same level as in the 1950s. “All violent crime has been declining, yet public perception of crime—largely filtered through mass media and politicians—is systematically in error.” (p. 61-62) The stories told in the Christian community carry the same kind of bias.
Second, generating crises in the Church can be an efficient and effective way to mobilize the faithful to action. (p. 62)
Sociological study of social movements can identify techniques that succeed in building a community.
Elites must work on generating an interpretational framework that identifies a specific problem, then identifies the source(s) of the problem, and finally provides a potential solution in the form of collective action. Further, the entire interpretational framework must align with existing grievances in recruits, otherwise collective action will fail. The temptation, then, will be for leaders in the Church to frame concerns about the next generation of Christians in such a way as to result in action by the rank and file, even if these interpretations are not entirely accurate. (p. 62, emphasis added)
Grabbing onto a crisis works to build a cohesive following.
Last, there is a fairly large gap between the ideal concept of Christian faithfulness and the “lived religion” of ordinary believers. (p. 62)
There is a tendency to idealize the past and see current trends as a significant deviation. One thing the longitudinal studies show is that this is not really the case. While there is a clear generational loss of youth from the Catholic church, the same cannot be said for Protestants, particularly “evangelical” Protestants. Beyond these longitudinal studies, there is a temptation to believe that the current state is the result of a serious religious decline, and the historical evidence (although sketchy) doesn’t really support this. A so-called “Christian” nation didn’t necessarily mean deeper, or more widespread, devotion.
“I hope to be a catalyst for innovation in the future of seminary education, integrating the best of the arts into the church, seeing cities as classrooms for that integration, and helping the church to become the leading practitioner of culture care.” —Mako Fujimura
by Fuller Theological Seminary
Fuller Theological Seminary is pleased to announce that Makoto (“Mako”) Fujimura will join the seminary as director of the Brehm Center for Worship, Theology, and the Arts. Fujimura’s appointment, effective September 1, 2015, follows a yearlong international search process. Master painter Mako Fujimura is a respected leader in the conversation between Christian faith and art. A devoted believer, world-renowned artist, and cultural influencer, he has had a profound impact around the world through his leadership.
Fujimura is the craftsperson of a movement toward renewal called “culture care.” This magnum opus work, his alternative to “culture wars,” is born from the integration of his work as an artist and his commitment to his Christian faith. He says it is worship that integrates all of his endeavors, acting as the heartbeat of a legacy that dovetails beautifully with the task before Fuller and with the original vision of the Brehm Center.
“I am filled with gratitude and joy at the rich opportunity I have to welcome Mako as the new director of the Brehm Center for Worship, Theology, and the Arts,” says Fuller President Mark Labberton. “This role ‘shaping culture shapers,’ as Bill and Dee Brehm noted in the early days of their vision, is key for Fuller, and Mako beautifully fulfills our commitment to innovation, collaboration, faithfulness, fruitful risk-taking, and courageous creativity. Bill has always said of the brainstorming process: ‘start with the universe!’ The appointment of Makoto Fujimura to the directorship of the center that bears the Brehm name lives up to that robust challenge.”
“Our relationship with Mako,” says Nate Risdon, Brehm Center program director, “has been one of mutual admiration for nearly 10 years: his lectures, work, and writings have inspired much of our work to date. To have Mako serve as our director brings an amazing convergence that will energize and magnify the movement of ‘culture care.’”
Fujimura will be a “vision director” for Fuller’s Brehm Center, working directly with President Labberton toward a “robust, imaginative experience for Fuller students,” says Fujimura. “My goal for all of us is to experience our God as the author of beauty. My studio is where I experience the presence of my calling the most, and is also where I can offer an integrated experience to others.” Through his studio practice he has mentored many, written several books, and had significant conversation with the church and the world. Fujimura says, “This Fuller appointment is intended to amplify those connections to share with many more.”
Fujimura argues for the importance of creating and conserving beauty as antidote to cultural brokenness by asserting a need for cultural “generativity” in public life. Beauty is vital to “soul care,” he believes, offering a vision of the power of artistic generosity to inspire, edify, and heal the church and culture. (See his speech on culture care at the National Press Club.) Fujimura’s book Culture Care is a support volume to the personal gatherings and international speaking engagements in which he shares that vision with like-minded artists, supporters, and creatives…
In the name of emotional well-being, college students are increasingly demanding protection from words and ideas they don’t like. Here’s why that’s disastrous for education—and mental health.
Editor’s note: I do believe trigger warnings can be appropriate in some situations (I have used them in class and on this site for years), and I also believe that becoming sensitized to microaggressions can be an important part of discussions regarding gender, race, and sexuality. However, as Lukianoff and Haidt point out as mental health professionals, things have gotten out of hand. -GDS
Something strange is happening at America’s colleges and universities. A movement is arising, undirected and driven largely by students, to scrub campuses clean of words, ideas, and subjects that might cause discomfort or give offense. Last December, Jeannie Suk wrote in an online article for The New Yorker about law students asking her fellow professors at Harvard not to teach rape law—or, in one case, even use the word violate (as in “that violates the law”) lest it cause students distress.
In February, Laura Kipnis, a professor at Northwestern University, wrote an essay in The Chronicle of Higher Education describing a new campus politics of sexual paranoia—and was then subjected to a long investigation after students who were offended by the article and by a tweet she’d sent filed Title IX complaints against her.
In June, a professor protecting himself with a pseudonym wrote an essay for Vox describing how gingerly he now has to teach. “I’m a Liberal Professor, and My Liberal Students Terrify Me,” the headline said. A number of popular comedians, including Chris Rock, have stopped performing on college campuses (see Caitlin Flanagan’s article in this month’s issue). Jerry Seinfeld and Bill Maher have publicly condemned the oversensitivity of college students, saying too many of them can’t take a joke.
Two terms have risen quickly from obscurity into common campus parlance. Microaggressions are small actions or word choices that seem on their face to have no malicious intent but that are thought of as a kind of violence nonetheless. For example, by some campus guidelines, it is a microaggression to ask an Asian American or Latino American “Where were you born?,” because this implies that he or she is not a real American. Trigger warnings are alerts that professors are expected to issue if something in a course might cause a strong emotional response.
For example, some students have called for warnings that Chinua Achebe’s Things Fall Apart describes racial violence and that F. Scott Fitzgerald’s The Great Gatsby portrays misogyny and physical abuse, so that students who have been previously victimized by racism or domestic violence can choose to avoid these works, which they believe might “trigger” a recurrence of past trauma.
In a sharp-elbowed opinion piece in The New York Times this week (entitled, “Stop Universities From Hoarding Money,”) Victor Fleischer, a law professor at the University of San Diego, took several big-name schools to task for the ways that they handle their endowments.
Fleischer cited Harvard, the University of Texas, Stanford and Princeton — but he reserved his harshest criticism for Yale University, which he says pays private equity firms $480 million a year to handle its endowment. Meanwhile, he says the school spends only $170 million on financial aid for students — while tuition often rises.
“As some of these endowments grow larger and larger, the group that benefits the most is not students; it’s not faculty. It’s the fund managers who manage the money,” Fleischer says. “The point is: What is the endowment there to serve? The point is to advance teaching and research and scientific inquiry today.”
He points to the ways the universities managed their money during the heavy losses of the financial crisis.
“It’s striking that, in those circumstances where you would expect the universities to tap into the endowment for a lot of support, they didn’t. Instead, the focus was on growing the endowment back to the previous size.”
Fleischer’s argument moved Malcolm Gladwell, the author and New Yorker writer, to fire off a barrage of tweets excoriating Yale and the other schools featured in Fleischer’s article.
“I’ve gotten increasingly incensed at the inequality in American higher ed,” Gladwell tells NPR’s Scott Simon. “There’s a handful of schools that just have too much money.”
Yale supplied NPR with a statement saying its endowment provides $1 billion a year — fully a third — of Yale’s budget, which the university says is much greater than the revenue it gets from student tuition. And it says that more than half of its students receive financial aid.
But that argument doesn’t persuade Gladwell, who finds fault not with how the money is spent, but the sheer amount the institution has accumulated.
“Do you need $26 billion to provide that level of service?” he says. “Is our educational system better or worse off for having a small number of schools with a massive amount of money, and a very large number of schools who are hurting?”
Since the Enlightenment, Europe has observed the slow divorce of the church from the university. What does it look like to be an educated Christian in an age in which the intellectual elite have written off faith as bad scholarship?
Why is it important that religious institutions exist in American public life? Gordon College President Michael Lindsay discusses principled pluralism and the future of religious educational institutions.
Is newer always better? Most people don’t know that current educational practice is less than a century old. Paradoxically, the harder we try to produce great thinkers similar to those of the past, the further we move from the style of education that produced them.
Building upon Erikson’s developmental theory, Dr. Patterson theorizes two stages of emerging adulthood – Incarnation and Impudence:
Incarnation is seen when young people accept responsibility (particularly financial responsibility) for their actions and make decisions in regard to, but not as a result of, parental guidance. The primary reason given for returning to the parental home was finances. Simply put, those emerging adults felt they did not have the resources to continue to maintain their residential independence and took advantage of an opportunity to live with their parents and save money.
For impudent emerging adults, there is typically less thought toward the future. They may not consider how expensive it is to maintain a household or live in school housing, instead adding their living expenses into the cost of college student loans. What used to be paid in rent is now rolled into loans that can be paid later.
In my work with emerging adults and their parents, I often use the analogy of running a cross country race versus running on a treadmill. In both activities your body is performing the same motions, but in only one case are you actually getting somewhere. Patterson’s work reveals that “Impudent” boomerang kids are using their parents’ home only to avoid adult responsibilities, whereas the boomerang experience helps others mature into responsible adults. They might look the same to outside observers, but the Incarnational emerging adults are actually getting somewhere.
A Generally Positive and even Beneficial Experience
One of the key findings in Dr. Patterson’s research is that attitudes toward boomerang children need to be adjusted. She discovered that boomeranging back home does not negatively impact the adult child’s development into adulthood. She writes:
The negativity surrounding popular views of intergenerational coresidence casts a pall on what can be (and usually is) a rewarding experience, which is not only tolerated well, but also desirable in many families.
Research indicated that boomerang children do not suffer developmental setbacks in emerging adulthood due to living in their parental homes. Instead, it is more likely that as emerging adults mature and move toward incarnation, they have a greater likelihood of returning to live with their parents.
Misperceptions and Popular Culture
Despite this, both parents and emerging adults tend to view boomeranging quite negatively. Emerging adults are haunted by feelings of inadequacy (even though 40% of their peers are in the same situation). Parents feel that they have failed in their role if their children are not successfully launched from the nest immediately. The feeling that boomerang children are dead weight is often unfounded and may place unnecessary stress on the family and the adult child.
Interestingly, Dr. Patterson found that adult children who live at home are often more mature and responsible than those who are attempting to make it on their own. She believes this is because it is not very satisfying to live at home while abdicating decision-making to one’s parents or refusing to accept responsibility. Those who are in this “impudence” phase find living at home, with rules and responsibilities imposed upon them, to be quite stressful.
By removing the stigma of young adults living at home, both parents and emerging adults can work together to make the most of this transitional time so they can get the best start possible into their adult life.