Reimagining Spiritual Formation and Cultural Engagement
Author: Two Handed Warriors
The occasional musings of Gary David Stratton and friends, designed for educators, filmmakers, artists, philanthropists, faith leaders, students, and entrepreneurs seeking to reimagine the relationship between spiritual formation and cultural engagement.
“A writer for Emmy magazine is on the phone for you.” At first I thought our PR director was pulling my leg. College professors don’t get calls from Emmy magazine, even if they are moonlighting as the Executive Director of a community of Christian entertainment industry professionals seeking to train and equip storytellers to enter mainstream Hollywood. Act One had been in existence for over a decade and even though we had graduates writing, producing, and directing on numerous TV shows and more than a few feature films, no entertainment industry press had ever called our offices before.
Kurt Schemper changed all that. A producer for A&E’s critically acclaimed reality program, Intervention, Kurt had just become the first Act One graduate to win a Primetime Emmy Award. The writer on the phone, Libby Slate, was fascinated by Kurt’s connection to a Hollywood Christian community. But what really impressed her was how the Act One community had lived out our faith by rallying to aid former staff member Rosario Rodriguez after she was wounded in a gang-related shooting while walking in the tony L.A. neighborhood Libby called home. (Read story here.)
Libby wanted to know if Emmy could do an article highlighting Kurt and Act One’s unique mission in Hollywood. Kurt and I readily agreed, and director Korey Scott Pollard (House, Grey’s Anatomy, Monk, Nashville, Rizzoli and Isles, Lie to Me, The Middle, Jack Ryan) signed on to represent the Act One faculty perspective.
As Kurt, Korey and I prepared for our interview, Korey pushed for us to be ‘really ready’ to express exactly what we wanted to say. Our conversations turned to how difficult it is to thrive spiritually in Hollywood, and interviewer Libby Slate graciously picked up on this theme.
In the course of our conversations Kurt mentioned that one of his college professors at Judson College encouraged him to pursue his calling to Hollywood by quoting Frederick Buechner:
“The place God calls you to is the place where your deep gladness and the world’s deep hunger meet.”
Kurt’s response was, “My deep gladness is Jesus. The entertainment industry is no different than any other place with lonely people searching for gladness.”
The idea of finding “deep gladness” in Hollywood really resonated with me, especially as I contemplated what a “soul-deadening” place Hollywood can be for many industry insiders. So in my interview, I told Emmy, “We’ve found that the spirituality taught by Jesus is an ideal starting place for guiding industry professionals on a soul-nourishing spiritual journey.”
That language resonated with Emmy readers as well, and soon opened doors all over Hollywood. Now it leads to this new series, entitled “Soul-nourishing Practices for a Soul-deadening World: Finding the Voice of Your Own Gladness in Hollywood and Beyond.”
My hope is that these posts will help filmmakers, educators, and other culture makers find their own “deep gladness” through the soul-nourishing practices Jesus taught his first followers over twenty centuries ago: not mere religious practices aimed at greater self-righteousness, but spiritual practices aimed at nurturing a deeper connection to God.
We officially launched the series earlier, but today I thought you might want to read the original Emmy article. (I couldn’t figure out how to post it directly, so you’ll have to download the article as a pdf.) Enjoy!
There are probably few things we do more poorly than relaxing. When they try, workaholics feel guilty, controllers get anxious, the lazy get bored, fun-lovers become disillusioned, the responsible get uncomfortable and the diligent feel awkward. We’ve got to make it happen; if we don’t who will? We’d rather burn out than rust out. We’ve got to be proactive, hard-working, productive, energetic, and busy. How in the world can we relax when there’s so much to do?
I’m using the word relax. But Jesus, in Matthew 11:28, uses the word “rest.” His word is superior and more satisfactory since it goes to the core of our being – our souls (“you shall find rest for your souls”). Relaxing is primarily a physical thing which humans try to make happen through their own efforts. Rest is an inner serenity, a calm trust that is realized even in the midst of outer turmoil. True rest, Jesus tells us, is a gift given to those who are with Him, accept His yoke, and learn from Him. Therefore, we can experience rest no matter our circumstances, our energy levels, or our productivity.
I’d like to focus on finding rest by “taking his yoke” as a way of exploring why we have such a hard time resting (or relaxing). Typically, when a Bible teacher gets to the “yoke,” he or she begins by explaining what a yoke in Jesus’ day may have looked like, then concludes that we, too, are connected (“yoked”) to Jesus so that He guides our lives and shares our burdens. That’s all well and good as far as it goes. Yet there may be something richer and deeper here into which Jesus is inviting us when he says, “take my yoke upon you.” It is in this richer and deeper place that we will find real rest.
Pause a while on the word “my” that Jesus used in designating this yoke. Could it be that Jesus is inviting us into the same intimate relationship he has with the Father? First, he explains that certain truths are hidden from the wise and clever but revealed to the childlike (11:25-26). Then, he opens the door for us to glimpse his experience of life with the Father as the context for understanding “his” yoke (11:27). Now is the perfect time to offer those willing to accept it (the childlike) the invitation to take hold of (to enter and embrace fully) the intimate relationship he has with the Father and now wants to share with us. In other words, “my yoke” is the kind of yoke he has with the Father, a yoke that connects them in loving intimacy. Astonishing! You and I are invited into an intimate relationship with the Father similar to the one Jesus has with Him.
Are you facing turmoil? Are you weighed down by genuine concerns? Are you exhausted from trying to make your marriage, family, job, or ministry work? Are you carrying burdens that are crushing you? Take hold of Jesus’ “yoke,” put it on and learn what it is to be intimate with your Father. Within this deepening intimacy with the Father you will discover fresh ways to manage life’s burdens and weariness.
Does the prospect of intimacy with the Father like Jesus has stir something deep within you? Is that kind of intimacy something your heart and soul longs for? Does the prospect of genuine rest in connection with your Father resonate in your soul? Come to Jesus…his relationship with the Father can be your relationship with the Father…and find rest.
“Don’t you think it’s a little odd to give up something for Lent in order to worship a Savior who told us to remember him by eating carbs and drinking alcohol?”
That’s the question a brilliant young writer confronted me with after an intense conversation covering Ash Wednesday, Lent, fasting, and dieting (there is a difference, right?).
To her, fasting made about as much sense as the head-bonking monks in “Monty Python and the Holy Grail,” whose unspoken motto appears to be: “Painfulness is next to godliness.”
She had a point. We find the head-bonking monks so funny precisely because we know that the extreme asceticism of the Middle Ages, no matter how sincere, was profoundly flawed.
But are all ascetic practices flawed? She suspected they were. I desperately tried to offer an alternative perspective.
After talking it through for nearly an hour, I finally gave her the best answer I could: fasting is more like Moana than Monty Python.
Let me explain…
A Brief History of Lent
Fasting for 40 days before Easter was originally established as a time of spiritual preparation for new converts to Christianity before they were all baptized together each Easter. In AD 325, however, the Council of Nicaea made Lent an official season of fasting for the entire church as it prepared to receive the new members.
This was normally practiced as eating only one meal per day for the entire 40 days. (Note: While many modern Catholics give up something for Lent, the Vatican only prescribes Ash Wednesday and Good Friday as official fast days.)
In the ensuing centuries many Christ-followers found Lent a helpful practice in their walk with God. Fasting is often connected with repentance in Scripture. Using fasting and repentance to help “prepare the way for the Lord” in one’s heart for the celebration of Christ’s death, burial, and resurrection can be a very helpful and instructive practice.
Entering this season of repentance through the imposition of ashes on one’s forehead on “Ash Wednesday” can create a strong connection to the biblical practice of repenting in sackcloth and ashes. (Traditionally, the ashes are made from palm fronds from the previous year’s Palm Sunday to remind us how quickly our cries of “Hosanna!” can turn to “Crucify him!”) Skipping a meal, a favorite food, or a favorite activity can help underscore our words of repentance with our bodies. Our “hunger” allows us to more closely identify with Jesus’ missional commitments: “My food is to do the will of him who sent me and to finish his work” (John 4:34).
However, the practice of Lent also has a dark side in church history. As the human tendency toward hyper-control began to infiltrate the church, the practice of Lent became more and more prescribed and restrictive with each passing year. By the Middle Ages the compulsory practice of Lent had become genuinely oppressive, not unlike Monty Python’s head-bonking monks. With the advent of the Reformation, Protestant leaders began to distance themselves from the practice.
Martin Luther saw nothing wrong with Lent in theory, but feared that most Lenten fasting had become a dead, compulsory religious ritual aimed at earning God’s favor that amounted to “fasting to Satan instead of a fasting unto holiness.” Ulrich Zwingli and later John Calvin were just as harsh. They all but outlawed what they called the “gross delusion” of the “superstitious observance of Lent.”
Soon, Lent-keeping became a shibboleth defining which side of the Reformation you were on. Take ashes and the “Anti-Lent” crowd called you an enemy of the gospel. Refuse them, and the “Pro-Lent” gang condemned you to hell.
John Wesley, founder of the Methodist church, was perhaps the first Protestant to swing the pendulum back toward a more balanced approach. Wesley broke with the Church of England’s ban on Lent by listing it among his approved fast days of the Church. In fact, he thought it was “deplorable” that many Methodists neglected such fasting.
Wesleyan Methodist churches eventually reinstated Lent as an official church practice. Anglicans, Lutherans, and later Presbyterians eventually reinstated Lent as well (which probably caused Zwingli and Calvin to roll over in their graves).
Dallas Willard and the Spiritual Formation Movement
In recent years, Lent has enjoyed something of a revival among younger Christians, especially those influenced by the contemporary spiritual formation movement. Dallas Willard, Richard Foster, James Houston, and a growing chorus of “Willard for Dummies” advocates are helping contemporary Christians recapture the positive elements of “spiritual discipline” in general, and Lent in particular.
Willard warns that Protestantism’s emphasis upon grace all too often draws believers into the heresy of passivism. A proper understanding of grace rightly emphasizes our inability to “earn” our own salvation. However, passivism mistakenly emphasizes our inability to play any role in our own transformation (Renovation of the Heart, p. 82). Fasting in general, and Lenten fasting in particular, can help counteract this passive, “I’ll wait around for God to change me” approach to faith.
While we are saved by grace through faith alone (Eph. 2:8-9), we are transformed by the “interactive presence” of the Holy Spirit in our lives (p. 23). God could transform us instantly and unilaterally, but he has chosen to transform us largely by working with us (p. 10). We participate in our own transformation indirectly, through the shaping of our thoughts and feelings by the rigorous and skillful application of spiritual discipline (p. 248).
In other words, while we cannot instantly or immediately transform our character by sheer force of will, we can will to practice the kind of disciplines that put us in a place where God’s grace can transform us into the image of Christ.
This is why Lent can be both used and abused. To practice Lent out of a sense of compulsion (say, fearing that God will smite me if I eat chocolate), or in hopes of earning brownie points with God for my good behavior, is anathema to the gospel of Christ and the true spirit of the disciplines.
However, to give up something for Lent in hopes of using your body (your whole being) to express your prayer of repentance can be very powerful. It can put you in a position to better cooperate with the movements of the Spirit in your own soul. And, of course, if we also “take up” a spiritual discipline for Lent—say, Scripture meditation or centering prayer—then we are in a position to catch even more of God’s grace.
Catch the Wave
This is where Moana comes in. A wave rider doesn’t create the wave; her board (or, in Moana’s case, her canoe) simply helps her catch the energy the ocean provides. In the same way, a spiritual discipline (such as a Lenten fast) doesn’t create the transforming power of God, but it does help us to catch it.
The spiritual discipline of fasting creates a space of faith that God is only too glad to fill. When practiced in this way, the spiritual discipline of Lent helps people “catch the wave” of God’s ever-available power. (For ideas, see The Lent Project, sponsored by Biola University’s Institute for Spiritual Formation.)
A Personal Note
That’s the way it has worked for me. I didn’t grow up in a tradition that emphasized Lent. Yet for some reason, as a young Christ-follower, it just seemed like a good idea to prepare my heart for Easter by following Christ into a 40-day fast. Since I wanted my fast to be ‘to’ Christ and not just ‘from’ something, I decided to give up television and use the time I freed for prayer and Bible reading.
It turned out to be a profound spiritual experience. I discovered that God’s power and presence had been fully available to me, but night after night I had not been available to him. Once I began using the time previously devoted to mindless entertainment to seek him, I began to catch the supernatural resources that had always been at my disposal. The spiritual discipline of Lent became a surfboard God used to propel me forward in my faith. I’ve since witnessed corporate Lenten fasts impact entire churches and academic communities.
Alcohol, Carbs, and the Presence of God
And that is why Lent is more like Moana’s majestic wave riding than the Monty Python monks’ pointless head-bonking.
So, Arielle, there’s my answer. Enjoy the blessings of God found in food, drink, carbs, and the arts. “Whether you eat or drink or whatever you do, do it all for the glory of God” (1 Cor. 10:31). But sometimes an intense season of spiritual discipline such as Lent is just what we need to re-examine our hearts and catch the wave of Christ’s ever-present help.
Okay, the official Lenten season from Ash Wednesday to Easter is actually 46 days. Why? Because medieval church leaders decided that fasting on Sunday (a Christian ‘feast’ day) was hypocritical. They deducted the six Sundays of Lent from the season of repentance, making Lent an awkward 46 days long. This has always seemed more like a loophole than an actual spiritual discipline to me. I normally just fast the whole 46 days, but having a break once a week can be nice and even help prevent legalism from creeping in.
John Ortberg’s self-professed job description.
Don’t take this as a slam on TV viewing in general. I still love television, and many of my friends and students work in the TV industry. I think moderate viewing of excellent shows can be a very helpful spiritual discipline. In fact, my DVR and streaming services have helped me nearly eliminate the kind of mindless channel-surfing that often thwarted my early spiritual development. Since then I have given up Facebook and other social media instead, as these tend to be the major time-wasters in my current lifestyle.
More than a century before the New Deal, Public Education, or the Civil Rights movements, the Second Great Awakening fostered a nation-wide “benevolent empire” of care for the poor, freedom for the oppressed and education for all.
By Gary David Stratton, Professor, Johnson University (TN) and James L. Gorman, Assistant Professor, Johnson University
Generally regarded as a second groundswell of evangelical Protestant religious interest following the Revolutionary War, the Second Great Awakening was more extensive and enduring than the Great Awakening of the 1730s-1740s. The Second Great Awakening began as a rural movement in the 1790s and gained national prominence in the Cane Ridge Revival (1801), led by Barton Stone in the South, and the Yale College revival (1802), led by Timothy Dwight in the North. The movement was marked by great educational and social reform, culminating in the ministry and Oberlin College presidency of Charles Grandison Finney, who published one of revivalism’s most influential works, Revival Lectures, in 1835.
Kidd (2007) asserts that dividing the early American awakening into two distinct timelines may “obscure the fact that the evangelical movement continued to develop after 1743 and before 1800” (p. xix). No certain or obvious stopping point for the Great Awakening exists; the same is true for the Second Great Awakening. For instance, Scots-Irish Presbyterianism was crucial to the story of evangelicalism’s development during the Revolutionary period and provides a direct link from the colonial Great Awakening to the early-republic Second Great Awakening (Schmidt). Similarly, New Divinity ministers kept Jonathan Edwards’ vision of the outpouring of the Holy Spirit alive in Congregational churches across New England and into New York, while Pietist revivals in Pennsylvania and New Jersey never completely died out. The same could be said for developments among Baptists, Methodists, Anglicans, etc., who each sought the outpouring of the Holy Spirit upon their ministries.
Noll (2003) notes that while awakenings may be works of the Holy Spirit, such movements can also be studied as the effect of human leadership. “By taking note of the agents who, whether perceived as servants of God or merely adept shapers of culture, historical explanation adds the sphere of human responsibility to realms of theological principle, religious conviction or social tectonics” (p. 141). By following three key exemplars of the movement, it is possible to sketch out many of the key characteristics of the Second Great Awakening.
Barton Stone and the Cane Ridge Revival (1801)
If one were to mark the “beginning” of the Second Great Awakening based on the criteria of numerical size and geographical extent, the best starting point would be the “Great Revival in the West” (1797-1805). Its leaders were revivalist Presbyterians who had followed Jonathan Edwards’ balanced approach to stoking the fires of awakening through the Revolutionary era and who found particularly fertile ground in Kentucky. The rapid expansion of the fledgling nation across the Appalachians created a vast territory with little or no rule of law, where settlers and outlaws battled their way to uneasy seasons of peace, leaving a spiritual vacuum that revivalists rushed in to fill.
One of the best known of these revivalists was Barton Stone, a “discontented Calvinist” and pastor of two Presbyterian churches in Bourbon County, Kentucky. After witnessing revival in Scottish-style “sacramental meetings” in Logan County under the preaching of James McGready (whom Stone knew and trusted from his academy days), Stone became convinced that God could grant the gift of faith without an extensive season of “seeking” God. He returned to Bourbon County determined to preach that his hearers could “believe now, and be saved” (Alvarez, p. 45). After his growing success at Concord began to attract large crowds, Stone called for a weeklong sacramental meeting at Cane Ridge. The meetings attracted between 10,000 and 20,000 people, with many “falling” under the power of the Spirit and coming to faith in a matter of hours (Conkin).
Denominational ties began to lose their meaning in meetings where as many as seven pastors from four denominations were preaching in various parts of the camp simultaneously. Calling themselves simply “Christians,” the movement spread throughout the Ohio and Tennessee Valleys, where Stone eventually joined forces with Alexander Campbell in 1832, forming a denomination with a handshake. Denominational unity (a strong ideal of Jonathan Edwards’ revivalism) and innovation (first modeled by George Whitefield in the Great Awakening) became hallmarks of the Stone-Campbell movement and the entire Second Great Awakening. “The Disciples, Christian Churches, and Churches of Christ founded by these leaders effectively evangelized the Upper South and opening West because they had translated the Christian message into an effective American idiom” (Noll, p. 51).
Timothy Dwight and Yale College
When the faculties at Harvard and Yale rejected the (First) Great Awakening, entrenching these institutions as “Old Light” bastions, “New Light” friends of the awakening were quick to take up the charge, founding a flurry of new colleges with a revival bent. Some New Divinity colleges, such as Dartmouth and Amherst, were founded directly on Jonathan Edwards’ principles of revival. Others, like Williams and Rutgers, were later captured by followers of Edwards’ educational vision. In the end, nearly all colleges of the era were eventually influenced by the Edwards/Dwight project of integrating revivalism with Scottish Common Sense Realism, due in no small degree to the influence of Edwards’ grandson, Timothy Dwight, who was named to the presidency of Yale in 1795 in a striking Edwardsean takeover of what had once been an “Old Light” institution.
Like his grandfather, Dwight made preaching central to his approach to preparing the way for spiritual awakening, and presidential sermons were the core of the college curriculum. Dwight preached twice each Sunday in mandatory college church services: a morning sermon addressed to a doctrinal topic, and an afternoon discourse on more practical and experiential applications of faith, using Scripture and Common Sense Realism (Thomas Reid and John Witherspoon) to defend his theology. Still, revival eluded Dwight for his first seven years at Yale, as students committed to ‘French infidel philosophy’ often outnumbered those committed to Christian faith.
It wasn’t until students who had been touched by revivals in the rural churches of the Connecticut River Valley instituted a Jonathan Edwards-style concert of prayer–a weekly meeting of “united and fervent prayer that God might pour out his Spirit upon the college”–that the Second Great Awakening finally came to Yale. By the end of the summer term, no fewer than eighty of 230 students had been “hopefully converted to God and admitted to the college church,” thirty-five of whom became preachers of the gospel.
Yale experienced three further revivals under Dwight and these outpourings of the Spirit became a welcomed and promoted aspect of the president’s educational program. When students petitioned to cancel classes in a season of spiritual awakening, Dwight refused and instead carefully guided them back to a biblical holism committed to fostering the life of the Spirit in the day-to-day life of the college; an approach that eventually spread to many if not most of America’s colleges.
Under Dwight’s presidency Yale College grew into the largest and most influential college in the Americas, and higher education itself became a hallmark of the Second Great Awakening. At one point 35 of the 150 college presidents in the United States were graduates of Dwight’s Yale. Marsden notes that Dwight’s emphasis upon “revival and moral philosophy, were the chief collegiate supplements to traditions of regulated worship…” and laid the foundation for nearly a century of academic ascendancy that “may be called with justice the great age of Christian higher education in the history of the country” (p. 58).
Noll notes that Dwight and these “revival colleges” were instrumental in effecting a “surprising intellectual synthesis” of evangelicalism and common-sense moral reasoning that dominated the nation’s thinking and led to the remarkable “Christianization” of American society (Noll, 2005, p. 9).
Charles Grandison Finney
Regarded as the father of modern revivalism, Charles G. Finney was the human catalyst for some of the most impressive urban revivals in United States history, and in the process he created the methodology for virtually all evangelists who followed. Converted in 1821 in the early stages of the Second Great Awakening, he left his law studies with the declaration, “I have a retainer from the Lord.” After brief theological training, the local Presbytery licensed Finney as an itinerant home missionary in upstate New York. Bright, athletic, unusually tall, and musically gifted, he drew enthusiastic crowds with his theatrical preaching and produced numerous converts. The largely “New School Presbyterian” New York Presbytery embraced his methods and published a pamphlet on his revival efforts in the tradition of Jonathan Edwards’ Faithful Narrative.
Finney considered himself a theological descendant of Jonathan Edwards’ revivalism. However, his highly voluntaristic theology of conversion led him to reject Calvinistic views and preach “man’s duty to change his own heart.” Rather than pressing his audience to begin the long process of seeking a salvation granted only by God, Finney called sinners to make an instantaneous decision to repent and believe. His view of conversion as a “free decision” led him to adopt and popularize a highly “democratic practice” of evangelism known as New Measures (Smith, 2007, pp. 2-8), including dramatic and colloquial preaching, an extensive time of singing before preaching, the inclusion of women as leaders, the use of an anxious seat (precursor to the altar call), the use of celebrity, novelty, and story to persuade, and public prayer meetings for God to pour out his Spirit upon particular sinners.
In 1830 Finney moved his efforts into urban settings with tremendous success, leading a great revival in Rochester, NY, that is still regarded as “the greatest revival in American history” (Cross, p. 13). The experience launched Finney into national prominence, and after brief pastorates in New York and Boston, he eventually settled at Oberlin College (OH) as a faculty member and later president. It was during this era that Finney delivered and published his wildly popular Revival Lectures, one of the most widely read books in American religious history. Rather than instructing evangelicals to wait passively for God to send revival, Finney’s great confidence in God’s willingness to grant the awakening gift of the Spirit in answer to prayer led him to declare, “A revival is no more a miracle than a crop of wheat.”
William G. McLoughlin’s interpretation that Finney believed revivals were ‘worked up’ while Edwards believed revivals were ‘prayed down’ (p. 11) misses Finney’s remarkable emphasis upon prayer and the sophisticated nuance of divine and human interaction in both revivalists’ theologies. Still, the distinction seems a fitting epitaph for much evangelism after Finney, as “revival meetings” became standard practice in virtually every Christian denomination in the United States and beyond.
Finney’s emphasis on the filling of the Holy Spirit as the key to a perfectionistic holiness, evidenced in self-sacrificing love for the lost, the disadvantaged, and the oppressed, became the impetus for his version of the Second Great Awakening’s vision of a “benevolent empire” of “good government, Christian education, temperance reform, relief for the poor and the abolition of slavery” (T. L. Smith, pp. 60-61).
Oberlin was one of the first colleges in the nation to admit blacks and women as students in full standing and was the clear leader of the anti-slavery movement in the Midwest. Due to the enduring popularity of Finney’s Memoirs and Revival Lectures, his influence upon revivalist evangelicalism eventually rivaled and even eclipsed that of Jonathan Edwards. Noll contends,
“[A] good case can be made that Finney should be ranked with Andrew Jackson, Abraham Lincoln, and Andrew Carnegie […] as one of the most important public figures in nineteenth-century America” (Noll, 2002, p. 176).
While it is as difficult to find a clear ending point to the Second Great Awakening as it is to find a clear beginning, its impact was felt deep into the nineteenth century and beyond. More than a century before the New Deal, Public Education, or the Civil Rights movements, the Second Great Awakening fostered a nation-wide “benevolent empire” of care for the poor, freedom for the oppressed, and education for all.
Religiously, the awakening left enduring practices of concerted prayer, revival/camp meetings, anxious seats/altar calls, and new measures that still influence nearly every evangelical Protestant denomination today. Theologically, the Second Great Awakening marked the end of what Guelzo calls one hundred years of “theological bungee-jumping” between divine and human roles in conversion, as the idea of gradually seeking salvation was replaced, in increments, by immediate conversion.
Politically, it is difficult to miss the connection to the democratization of American society and the democratization of the church. However, the direction of that influence is difficult to measure. Globally, the Second Great Awakening birthed the beginning of a massive evangelical missionary movement, first to the Native American communities and eventually to foreign missions. Culturally, the awakening contributed to a sense of national cohesion at a time of profound social change, but most likely also fueled a sense of manifest destiny that deeply wounded the very Native American populations the revivalists most wanted to evangelize.
Gary David Stratton (Ph.D. Biola University) is University Professor of Spiritual Formation and Cultural Engagement and Dean of the School of Arts and Sciences at Johnson University (TN). James L. Gorman (Ph.D. Baylor University) is Assistant Professor of History at Johnson University. Based upon Stratton and Gorman’s “The Great Awakening [1730s to 1740s]” in the “Encyclopedia of Christianity in the United States” (Rowman and Littlefield, 2016).
It is terrifically hard to wait at the foot of the mountain for the Word of the Lord. Will we wait in the dangerous silence for who He truly is, or slowly grow desperate enough to worship a golden calf?
When I first heard Andrew Peterson’s song “The Silence of God,” I was stunned. It was so bare. I wondered if it was even heretical…
I’ve since read thoughts by theologians about the growth value of long spans in which God leaves us in silence, but if I remember correctly, the first time I ever encountered someone wrestling with the concept wasn’t in a book, but in Andrew’s song.
He was the first person I heard admit, “I can’t hear God’s voice right now, and that’s terrible and it’s scary.”
It’s enough to drive a man crazy
It’ll break a man’s faith
It’s enough to make him wonder
If he’s been sane
When he’s bleating for comfort
From Thy staff and Thy rod
And the heavens’ only answer
Is the silence of God
And it’ll shake a man’s timbers
When he loses his heart
When he has to remember
What broke him apart
And this yoke may be easy
But this burden is not
And the crying fields are frozen
By the silence of God
If a man has got to listen
To the voices of the mob
Who are reeling in the throes
Of all the happiness they’ve got
When they tell you all their troubles
Have been nailed up to that cross
What about the times when even
Followers get lost
‘Cause we all get lost sometimes
If you know this song, you know these last stanzas don’t finish it off. But even hearing this much, I felt a strange sort of relief wash over me. Until he verbalized it, I hadn’t realized that all those years of religious-speak, all those appeals for God to “show up” had made me feel pressure to find continual signs of His engagement.
I didn’t realize how badly I needed to hear someone I trusted say, “When God is silent–and that’s often enough for me to write a song about it–I feel disappointed and lost.”
Martin Scorsese’s Silence
Martin Scorsese’s film, Silence, was another one of those moments for me. Among other things, this is a film about faith attempting to survive long expanses of Divine quiet. The film reveals how we expect God to show up, how He does show up instead, and the human weaknesses that appear in the massive gaps between those two realities.
Unlike Christian movies in which God provides some sort of “I have arrived” moment, God does not show up here with a new pickup truck, a much-desired pregnancy, or a restored marriage. The God of this film lets His children wrestle with years of suffering in relative silence. Because of this, we watch people who are trying to obey Him strain and grieve, desperate for confirmation during impossible times.
There are so many angles to this film, but I’m just going to focus on the one most personal to me in this post: the traumatic impact of watching an older follower of Christ abandon his once-pure faith.
The film opens by describing the work of Cristóvão Ferreira, a legendary Jesuit priest who has spent 15 years attempting to evangelize Japan. Ferreira was iconic to believers at the time. Your denomination’s equivalent might be N.T. Wright, Billy Graham, John Piper, or Francis Chan—but whoever that hero is, Ferreira was this sort of leader. He was so solid, so certain, so strong that every young priest knew that he would not sell out for any reason.
When news hits Portugal that Ferreira has apostatized, a young priest named Rodrigues and a fellow priest believe the news is a dirty rumor. So, the two leave home to scour Japan in an attempt to dispel the disheartening story. It is a dangerous mission, likely to lead to death, but the two young men are idealistic and devoted, and they know how important it is to the global church to reclaim Ferreira’s reputation.
After arriving in Japan, the two young priests grieve to see believers tortured and slaughtered. As they experience emotional and spiritual torment, they stumble; they fail. But over and again, they rise up in their faith to try to follow God once more.
When Rodrigues is captured by Japanese officials, his captors repeatedly try to break his faith. The young priest’s heart crumbles, and he teeters on the edge of insanity, but he continues to hold fast. At last, the Japanese leaders bring his suffering to a climax — a meeting with Ferreira.
In this meeting, Rodrigues finds that Ferreira has truly apostatized. His hero is now a Buddhist, writing a book about the great lie of Christianity, and he begins to discourage Rodrigues from his own belief, arguing against the gospel and its ability to saturate Japan.
Ferreira urges Rodrigues to give up his faith, to compromise, to conform. Rodrigues is devastated, but he holds fast.
The Japanese could kill Rodrigues, but for strategic purposes, they want him to abandon his faith instead. So, they place Rodrigues in a holding cell where he can hear the gasps and wails of other believers being tortured. He is told that these Christians will be persecuted until Rodrigues denies his faith.
As he prays for strength and wisdom, he finds words of praise carved into his cell wall: Laudate Eum (Praise Him). He runs his fingers along the grooves and appeals desperately to the Lord for courage and fortitude. At this moment, Ferreira enters the cell and explains to Rodrigues that he himself carved those praises before his denial of the faith.
It is a hellish scene of betrayal and temptation. Ferreira urges Rodrigues to see how selfish it is to maintain an idealistic belief that causes others to suffer. He urges Rodrigues to see that apostasy is altruistic. He builds a case for joining with the leaders of the world out of love of the masses.
Of all the torment Rodrigues endures, this betrayal of a former hero is the worst. This man who had once led him in steadfast belief is now leading him to abandon it. It is more than Rodrigues can bear.
Abandoned by Our Heroes
As I sat in the theater watching all of this, I was blown away. The timing was more than a little ironic.
Just a few moments before watching this film, I had been talking with a friend about how distraught we have felt this past year. So many people my age feel abandoned by our own older faith heroes. In dire national circumstances, we have watched several of our evangelical heroes abandon the ideals they have taught us–urging us to make alliances with forces hostile to our faith.
They have told us that this is loving. They have told us to do this for the good of the people.
Values they once encouraged us to embrace in the face of all opposition have now been discarded for what they now claim to be a greater cause. They mock us for being too committed to impractical standards. They tell us to wake up, to open our eyes, to give up our old, innocent way of looking at the world.
But before our very eyes, some of these men seem to have changed into different sorts of beings. We recognize their faces, but we no longer recognize their hearts. Their language is different, soured, horrifying. They twist the stories of our Scripture to suit their new causes.
Watching this has taken our knees out from under us.
I’m not going to get more specific than that, nor am I going to dig into what happens at the end of the film. But I will say that this movie (among other things) helped me to understand why the last few months have broken my heart so deeply. Watching my heroes conform to the ideals of the world has been too much for my heart to bear.
These men ask us to “leave well enough alone” and move on. But we aren’t sulking. We aren’t pouting. We feel like we have watched people we trusted and imitated trample on the gospel. And we feel like they have called out and asked us to do the same.
So many people claim to know exactly what God is doing these days, but I will tell you the truth. I don’t. My perceptions might be all wrong…
Time will tell, I suppose.
I do know that I’m profoundly disappointed in some of my old heroes. I know that I no longer recognize our strange, new evangelical America. And even though scores of people around me believe that I am too sensitive, I think it is right to be disappointed. Watching your heroes distort truth is no small thing. God holds leaders to a higher standard because a hero’s fall creates aftershocks that can ripple through an entire generation of young believers.
Waiting on a Silent God
God did not show up in a huge lightning bolt at the end of this film, but I left the theater feeling as I did when I first heard Andrew Peterson’s lyric. I walked away affirmed that it was not wrong to be sincere, not wrong to be sad, and that it was even okay to sit alone in the quiet and wait for an honest manifestation of God’s presence instead of letting immediate needs force me to rush in to claim what He isn’t and what He hasn’t done.
God’s name is holy, even when He seems silent. In those expanses, I do not want to use it in vain. It is terrifically hard to wait at the foot of the mountain for the Word of the Lord, but I would rather wait in the dangerous quiet for what He truly is than grow desperate enough to worship a golden calf.
There’s a statue of Jesus
On a monastery knoll
In the hills of Kentucky
All quiet and cold
And He’s kneeling in the garden
Silent as a stone
And all His friends are sleeping
And He’s weeping all alone
And the man of all sorrows
He never forgot
What sorrow is carried
By the hearts that He bought
So when the questions dissolve
Into the silence of God
The aching may remain
But the breaking does not
The aching may remain
But the breaking does not
In the holy, lonesome echo
Of the silence of God
“What was really easy was falling in love with this person, was falling in love with Jesus Christ. That was the most surprising thing.” -Andrew Garfield
‘Grace Enough’ by Brendan Busse in America
People make the Spiritual Exercises of St. Ignatius Loyola for a variety of reasons. Preparing to play a featured role in a Martin Scorsese film is not one you hear often, but it’s probably not the worst reason. Men and women often make retreats to find some clarity about who they are or who they’re called to be. I suppose it was so for Andrew Garfield when he asked America’s James Martin, S.J., to guide him through the Exercises as he prepared to play the lead role in Mr. Scorsese’s new film, “Silence.”
Father Martin was hesitant at first. But Garfield was looking for something. Or someone. And that’s not a bad reason at all. In the end, it was enough for Jim. And more than enough for God.
Andrew Garfield was, for lack of a better word, successful in the Exercises. “There were so many things in the Exercises that changed me and transformed me, that showed me who I was…and where I believe God wants me to be,” he told me. That’s about as good a retreat outcome as one can hope for. And his success should not surprise us.
His training as an actor prepared him well for the dynamics of Ignatian prayer, whereby one imagines oneself within a series of biblical scenes in order to attain “interior knowledge” of God and to articulate that knowledge in a life of compassionate action and generous service. What was more surprising, what surprises him still, was falling in love.
When I asked what stood out in the Exercises, he fixed his eyes vaguely on a point in the near distance, wandering off into a place of memory. Then, as if the question had brought him back into the experience itself, he smiled widely and said: “What was really easy was falling in love with this person, was falling in love with Jesus Christ. That was the most surprising thing.”
He fell silent at the thought of it, clearly moved to emotion. He clutched his chest, just below the sternum, somewhere between his gut and his heart, and what he said next came out through bursts of laughter: “God! That was the most remarkable thing—falling in love, and how easy it was to fall in love with Jesus.”
The experience of falling in love with Jesus was most surprising, perhaps, because Garfield, like many people, came to the Exercises asking for something else…
On August 28, 1963, Martin Luther King, Jr. rose to the top of the steps of the Lincoln Memorial during the March on Washington for Jobs and Freedom and delivered his legendary “I Have a Dream” speech before 250,000 civil rights supporters. It would go on to reverberate through the nation, reaching millions more, and through history, inspiring generations and forever changing the course of culture. But how can sixteen minutes of human speech have the power to move millions and steer history?
Duarte notes that Dr. King spoke in short bursts more reminiscent of poetry than of long-winded lecture-speak and highlights his most powerful rhetorical devices — repetition, metaphors, visual words, references to political documents, citations from sacred texts and spiritual songs — in a fascinating visualization of the speech, demonstrating how it embodies the core principles of her book.
Metaphors are a powerful literary device. In Dr. Martin Luther King Jr.’s “I Have a Dream” speech, about 20% of what he said was metaphorical. For example, he likened his lack of freedom to “a bad check that America has given the Negro people … a check that has come back marked ‘insufficient funds.’” King introduced his metaphor three minutes into his 16-minute talk, and it was the first time the audience roared and clapped.
ORLANDO, Fla. — Talk to presidents of liberal arts colleges and they are proud of how their institutions educate graduates and prepare them for life. But ask the presidents to prove that value, and many get a little less certain. Some cite surveys of alumni satisfaction or employment. Others point to famous alumni.
And, privately, many liberal arts college presidents admit that their arguments haven’t been cutting it of late with prospective students and their parents (not to mention politicians), who are more likely to be swayed by the latest data on first-year salaries of graduates, surveys that seem to suggest that engineering majors will find success and humanities graduates will end up as baristas.
Richard A. Detweiler believes he has evidence — quantifiable evidence — that attending a liberal arts college is likely to yield numerous positive results in graduates’ lifetimes, including but not limited to career and financial success. He has been giving previews of his findings for the last year. On Friday, at a gathering here of presidents of the Council of Independent Colleges, he presented details and said he believes the results have the potential to change the conversation about liberal arts colleges. He said his findings show that the key characteristics of liberal arts colleges — in and out of the classroom — do matter.
At the meeting, Detweiler described his project. He started by examining the mission statements of 238 liberal arts colleges, looking at what the colleges say they are trying to accomplish with regard to their students. Among the common goals given for graduates were to produce people who would continue to learn throughout their lives, make thoughtful life choices, be leaders, be professionally successful and be committed to understanding cultural life.
Then Detweiler and colleagues conducted interviews with 1,000 college graduates — about half from liberal arts colleges and half from other institutions. The graduates were not asked about the value of their alma maters or of liberal arts education, but were asked a series of very specific questions about their experiences in college and then their experiences later in life. The graduates were a mix of those 10 to 40 years after graduation, and conclusions were drawn on liberal arts graduates vs. other graduates only when there was statistical significance for both relatively recent and older alumni. Some of the findings may be relevant to liberal arts disciplines at institutions other than liberal arts colleges, but the comparison point was for those who attended the colleges.
What Detweiler found was that graduates who reported key college experiences associated with liberal arts colleges had greater odds of measures of life success associated with the goals of liberal arts colleges. Here are some of the findings:
Graduates who reported that in college they talked with faculty members about nonacademic and academic subjects outside class were 25 to 45 percent more likely (depending on other factors) to have become leaders in their localities or professions. Those who reported discussions on issues such as peace, justice and human rights with fellow students outside class were 27 to 52 percent more likely to become leaders.
Graduates who reported that students took a large role in class discussions were 27 to 38 percent more likely to report characteristics of lifelong learners than others were. Students who reported most of their classwork was professionally oriented were less likely to become lifelong learners.
Graduates who reported that as students they discussed philosophical or ethical issues in many classes, and who took many classes in the humanities, were 25 to 60 percent more likely than others to have characteristics of altruists (volunteer involvement, giving to nonprofit groups, etc.).
Graduates who reported that as students most professors knew their first names, and that they talked regularly with faculty members about academic subjects outside class, were 32 to 90 percent more likely to report that they felt personally fulfilled in their lives. Those who reported that professors encouraged them to examine the strengths and weaknesses of one’s views, and whose course work emphasized questions on which there is not necessarily a correct answer, were 25 to 40 percent more likely to report that they felt personally fulfilled.
But What About Money?
Detweiler saved for last the characteristic that gets so much attention these days, and that liberal arts college leaders fear hurts them: money.
From Sausage Party to Silence, it was a banner year for religion onscreen.
by Alissa Wilkinson in Vox
I started 2016 as chief film critic at Christianity Today and ended it on staff here at Vox. Religion and pop culture has been my beat for a long while. So it’s not surprising I spot it around every corner.
But even by my heightened radar’s standards, 2016 feels like a banner year for onscreen treatments of religion. I don’t mean what we’ve come to consider “Christian movies,” though there were a few of those, most notably the moderately commercially successful God’s Not Dead 2 and the crashing box office failure Ben-Hur (executive produced, by the way, by Mark Burnett of The Apprentice). “Christian films” are made for a sizable but still niche market and bent to the tastes of that segment: biblical or inspirational tales, or (in the case of the God’s Not Dead franchise) legends of the culture wars. They’re meant to preach to — or shore up — the choir.
“Christian movies” had their most recent heyday in 2014 and 2015 and seem to be tapering off, at least in terms of box office returns. But 2016 belonged to a different kind of onscreen religion, aimed at mainstream audiences. In 2016, films and TV shows that portrayed religion — organized or not — were less interested in preaching or caricaturing and more in exploring how faith and (especially) doubt fit into the frameworks of people’s lives today.
Religion showed up onscreen in everything from dark, gritty dramas to dirty animated fables
2016 started with the Coen brothers’ Hail, Caesar, a comedy about competing ideologies (Hollywood capitalism, Marxism, and Christian faith) that is explicitly modeled on a passion play.
And now the year is ending with Martin Scorsese’s Silence, perhaps the most stirring, perceptive film about belief and doubt in decades.
In A Quiet Passion (which played at festivals in 2016 and will open in theaters next year), Terence Davies uses Emily Dickinson’s life to plumb the space that might best be described as believing unbelief. The Witch artfully poses a conflict between stringent Puritan faith and witchcraft in colonial New England. Knight of Cups positioned its narrator on the road to faith (modeled explicitly on both tarot and Pilgrim’s Progress). The documentary The Illinois Parables reads the complicated matter of religion and historical conflict into the landscape of Illinois. In Queen of Katwe, a Christian missionary brings opportunity to illiterate children in the slums he came from.
Beyoncé’s magnum opus Lemonade explicitly drew on religious imagery in its proclamation of freedom for its creator and women like her. The Innocents, like Silence, grapples with faith cracked by doubt in the face of unthinkable violence to the bodies of the devout — in this case, the brutal rape of nuns.
That Christianity is the organized belief system of interest in most of these projects isn’t surprising. They’re mostly American productions, and Christianity is still the dominant religion practiced in America — though I suspect that onscreen organized religion will expand in the next few years to include a higher number of serious treatments of Judaism, Islam, and other religions.
Still, attentive moviegoers could have caught Under the Shadow, a stellar Iranian political horror film, which draws on concepts from Islamic folklore to explore the fallout from the Iran-Iraq War. And Tikkun, an Israeli horror film, navigated the complexities of bodies and souls in contemporary Orthodox Judaism.
Meanwhile, on TV, Rectify (about guilt, forgiveness, and redemption in small-town America) and The Americans (about religion as a competitor to nationalist ideologies) topped critics’ lists, while The Path and Unbreakable Kimmy Schmidt looked at the complicated reasons people enter, leave, remain in, and recover from oppressive systems of belief.
For a pretty goofy show, Lucifer featured a surprisingly nuanced account of evil and fate, while on Daredevil, Matt Murdock’s Catholicism is a central part of his character. Preacher took as its starting point the conflation of pastorly authority and possession by something evil. On both Jane the Virgin and The Jim Gaffigan Show, Catholic faith is also part of characters’ identities and influences the decisions they make.
This isn’t even an exhaustive list — that would be impossible to compile — and leaves out a lot of what’s happening in genres like horror and in independent and niche film. But as close watchers of the industry can attest, these certainly constitute an observable uptick in religiously oriented content for mainstream audiences.
Religion is part of characters’ identities in 2016, but not their only defining feature
It’s too early in this groundswell to sort out exactly why or how this happened in 2016. It can’t really be attributed to the US election — most of these movies and shows were finished or in development before the race even took shape. But there are some commonalities worth noting.
One notable trend is a growing interest in taking religious belief to be part of, but not the entirety of, a character’s identity. In other words, religious characters are growing more complex.
Religion has at times operated as a negative character-defining trait in onscreen stories: Sometimes the religious character’s faith is played off as just a quirk or an outright flaw, a writing shorthand for being bad, weak, hypocritical, or strange. (Think of Angela in the early seasons of The Office, Shirley Bennett on Community, or Vice President Sally Langston on Scandal.) But Rectify, The Americans, Daredevil, Jane the Virgin, The Night Of, Transparent, and The Jim Gaffigan Show, among others, all have characters who are religious, but who say and do lots of things that aren’t explicitly tied to their faith. They aren’t trotted on to be the token clergy or judgy friend; they’re just people who go to church and believe in God, and also have other interests, views, and friends. Their faith is one among many defining traits, but one that is ever-present (as opposed to, for instance, Agent Dana Scully on The X-Files, whose Catholicism seemed to crop up only when it suited the story).
These sorts of characters can be hard to write for a mainstream audience, because fleshing them out often requires personal experience that busts up easy stereotypes. A sort of prototypical religious character, Studio 60 on the Sunset Strip’s Harriet Hayes, was based on creator Aaron Sorkin’s real-life ex-girlfriend, the outspoken Christian Kristin Chenoweth; another Sorkin character, the very Catholic President Jed Bartlet from West Wing, also belongs to this category. We especially see this in TV — most writers’ rooms aren’t noted for their diversity, and sometimes religion has been pretty one-note on screen. And flatly written secondary characters have a way of standing out more starkly in TV’s longform storytelling.
But perhaps recognizing that a myopic view of religious people results in underwritten characters, many shows have developed the sense that, as with gender or race, a character’s religion is part of their identity, one in a series of overlapping layers. A white Southerner’s Christianity looks different from that of a Catholic comedian living in New York City or a black marketing executive in Southern California. Not all Muslims look like they do on Homeland. Religious people look, sound, and act differently from one another. Their political and social views may differ. Even if they belong to an organized religion, the way they express and live that faith is unique.
In her film The Innocents, French director Anne Fontaine elected to dramatize a spectrum of faith in characters that on first blush look very much alike: a group of Polish nuns who, during World War II, are raped by a group of passing Russian soldiers. When the film begins, a French Red Cross doctor (who is an avowed atheist, a fact that does not change throughout the film) is called to the convent, where she discovers that many of the nuns are in advanced stages of pregnancy.
It’s tempting to see a convent full of nuns as a homogeneous group: all Polish women, living together, having taken the same vows, following the same rituals together every day, professing the same belief, experiencing the same violence. But The Innocents recognizes that women have individual responses to severe trauma, and their responses are complex and different from one another. It’s a remarkable exploration of shades of belief and doubt rooted in the different ways that different people internalize and express faith.
Some religious storylines incorporate the supernatural, to great effect
Another striking trend in onscreen religion showed up in two places in 2016: Jeff Nichols’s film Midnight Special and the Hulu TV series The Path, something echoed in the HBO drama The Leftovers (which aired the final episodes of its second season in December 2015 and will premiere its third season in 2017)…
To think historically is to recognize that all problems, all situations, all institutions exist in contexts that must be understood before informed decisions can be made. No entity — corporate, government, nonprofit — can afford not to have a historian at the table.
Since the beginning of the Great Recession in 2007, the history major has lost significant market share in academia, declining from 2.2% of all undergraduate degrees to 1.7%. The graduating class of 2014, the most recent for which there are national data, included 9% fewer history majors than the previous year’s cohort, compounding a 2.8% decrease the year before that. The drop is most pronounced at large research universities and prestigious liberal arts colleges.
This is unfortunate — not just for those colleges, but for our economy and polity.
Of course it’s not just history. Students also are slighting other humanities disciplines including philosophy, literature, linguistics and languages. Overall, the core humanities disciplines constituted only 6.1% of all bachelor’s degrees awarded in 2014, the lowest proportion since systematic data collection on college majors began in 1948.
Conventional wisdom offers its usual facile answers for these trends: Students (sometimes pressured by parents paying the tuition) choose fields more likely to yield high-paying employment right after graduation — something “useful,” like business (19% of diplomas), or a technology-oriented field. History looks like a bad bet.
A historian, however, would know that it is essential to look beyond such simplistic logic. Yes, in the first few years after graduation, STEM and business majors have more obvious job prospects — especially in engineering and computer science. And in our recession-scarred economic context, of course students are concerned with landing that first job.
Over the long run, however, graduates in history and other humanities disciplines do well financially. Marco Rubio, who famously suggested America needs more welders and fewer philosophers, would be surprised to learn that after 15 years, those philosophy majors have more lucrative careers than college graduates with business degrees. History majors’ mid-career salaries are on par with those holding business bachelor’s degrees. Notably, these salary findings exclude those who went on to attain a law or other graduate degree.
The utility of disciplines that prepare critical thinkers escapes personnel offices, pundits and politicians (some of whom perhaps would prefer that colleges graduate more followers and fewer leaders). But it shouldn’t. Labor markets in the United States and other countries are unstable and unpredictable. In this environment — especially given the expectation of career changes — the most useful degrees are those that can open multiple doors, and those that prepare one to learn rather than do some specific thing.
All liberal arts degrees demand that kind of learning, as well as the oft-invoked virtues of critical thinking and clear communication skills. History students, in particular, sift through substantial amounts of information, organize it, and make sense of it. In the process they learn how to infer what drives and motivates human behavior, from elections to social movements to boardrooms.
Employers interested in recruiting future managers should understand (and many do) that historical thinking prepares one for leadership because history is about change — envisioning it, planning for it, making it last. In an election season we are reminded regularly that success often goes to whoever can articulate the most compelling narrative. History majors learn to do that.
Everything has a history. To think historically is to recognize that all problems, all situations, all institutions exist in contexts that must be understood before informed decisions can be made. No entity — corporate, government, nonprofit — can afford not to have a historian at the table. We need more history majors, not fewer.
In a stinging rebuke of the legal profession’s governing body, the B.C. Court of Appeal said the regulators abrogated their responsibilities and acted unreasonably — infringing the school’s right to freedom of religion and associative rights.
“This case demonstrates that a well-intentioned majority acting in the name of tolerance and liberalism, can, if unchecked, impose its views on the minority in a manner that is in itself intolerant and illiberal,” concludes the 66-page unanimous decision signed by Chief Justice Robert Bauman and four other justices.
The court upheld the B.C. Supreme Court ruling that the law society had not given the Langley-based Evangelical institution a fair shake in rejecting its proposal to open a law school.
In July, the Nova Scotia Court of Appeal similarly repudiated that province’s barristers’ society for the way it dealt with the proposed law school and the clash between freedom of religion and sexual discrimination.
Only in Ontario has a provincial appellate bench supported the manner in which the charter rights were balanced.
The Law Society of Upper Canada’s refusal to accredit the school was endorsed by the Ontario courts, which said: “The part of TWU’s community covenant in issue in this appeal is deeply discriminatory to the LGBTQ community, and it hurts.”
The school said it was seeking to appeal that decision to the Supreme Court of Canada.
TWU first proposed the law school in June 2012 and was later granted preliminary approval by the Federation of Law Societies of Canada and the B.C. Ministry of Advanced Education.
Law societies in the three provinces, however, opposed accrediting the school because of the university’s controversial Bible-inspired community covenant that staff and students must sign.
Although members of the lesbian, gay, bisexual, transgender and queer communities are welcome to apply to TWU, they cannot attend without signing the creed that prohibits sexual intimacy except between heterosexual married couples.
The school, founded in 1962 and made a degree-granting institution in 1979, insists it doesn’t go looking for violations of that code, but discipline for breaking it can include expulsion.
In B.C., the law society originally endorsed the law school but that decision triggered a backlash among lawyers standing up for the rights of those who identify as LGBTQ.
As a result, the law society conducted a referendum on Oct. 29, 2014, and a majority of the lawyers who voted rejected the TWU proposal.
The benchers accepted that outcome on Oct. 31, and the minister of advanced education subsequently revoked his consent for the school.
TWU appealed and the B.C. Supreme Court found that the benchers acted improperly.
Chief Justice Christopher Hinkson said the benchers delegated their authority and failed to do their job under the Legal Profession Act.
The society infringed the school’s freedom of religion and “allowed the members to dictate,” he said.
In a complicated reasoning process, the court of appeal disagreed with some of Hinkson’s ruling but not the thrust of it — that the benchers “abdicated their duty.”
“Where charter values are implicated in an administrative decision, and the decision might infringe a person’s charter rights, the administrative decision-maker is required to balance, or weigh, the potential charter infringement against the objectives of the administrative regime,” the appeal judges said.
“In making their Oct. 31, 2014 declaration, the benchers did not engage in any exploration of how the charter values at issue in this case could best be protected in view of the objectives of the Legal Profession Act. They made no decision at all, instead deferring to the vote of the majority in the referendum.”
They added: “TWU is a relatively small community of like-minded persons bound together by their religious principles. It is not for everyone. For those who do not share TWU’s beliefs, there are many other options …
“The majority must not, however, be allowed to subvert the rights of the minority TWU community to pursue its own values. Members of that community are entitled to establish a space in which to exercise their religious freedom.”
The case is expected to be ultimately decided by the Supreme Court of Canada, whose view would prevail across the nation.
Legal regulators in Alberta, Saskatchewan, Manitoba, New Brunswick, Prince Edward Island and Newfoundland and Labrador have gone along with the TWU proposal.
At the 2012 Republican National Convention, Senator Lindsey Graham noted the shifting national demographics and commented, “We’re not generating enough angry white guys to stay in business for the long term.”1
Hundreds of pieces will be published as a postmortem on those Americans, particularly evangelicals, who supported Donald Trump in the 2016 presidential election. Robert P. Jones has recently referred to them as “nostalgia voters . . . culturally and economically disaffected voters that are anxious to hold on to a white conservative Christian culture that’s passing from the scene.”
Rod Dreher says this bloc holds the paradoxical view that the future is rightfully theirs and that the space for them in the United States is shrinking. This “dispossession,” as Dreher calls it, is “psychologically traumatic to certain whites who expected the world to work in a different way—a way that favored them.”2 Trump and his ilk offer a temporary balm to the damaged psyche of the dispossessed by making them feel good about who they are (i.e., real Americans) and what they could be (i.e., great again), all of which is tied directly to who they are not (i.e., immigrants, Muslims, etc.).
Marquez Ball further complicates things by suggesting that the term evangelical evokes racist undertones, saying, “Very few African American Christians would consider themselves to be evangelical, because for many the term often implies a white racist. . . . The 2016 presidential campaign of Donald Trump is challenging white evangelicals to prove that evangelical is not a code word for ‘white racist.’” As Michael Horton says, “many who call themselves evangelicals today find their ultimate loyalty in preserving or regaining a lost socio-political and cultural, perhaps even racial, hegemony.”3 Both Ball and Horton identify the significant baggage the term evangelical now carries, reinforced by those who support Trump’s candidacy; that baggage is not new, but an undercurrent of what evangelicalism has always been in America. If white evangelicals wish to be reconciled with people of color, then they should confess precisely how they have been possessed by something other than the faith they proclaim, irrespective of the repercussions that will befall the penitent and their structures of power.