
Missionary media has played an important role in shaping world news. Pamla J. Eisenberg/Flickr.com, CC BY-SA

    The Christian Broadcasting Network, founded over 50 years ago by evangelist Pat Robertson, has now launched the first 24-hour Christian television news channel.

    Robertson said that the channel would help viewers understand how current events both in the United States and abroad affect them. The Christian Broadcasting Network has considerable influence among evangelicals, and President Trump, at times, has used the outlet to reach this support base.

    But this is not the first time Christians have shared and shaped the content of world news and information through a distinctly Christian viewpoint.

    Christian missionary publications

    For much of the 19th century, Christian missionaries served as informal foreign correspondents for a broadly Christian public in the eastern United States and Western Europe.

    They kept churches and missionary societies up to date on the societies in which they lived through regular letters and – by the late 19th century – photographs. Their letters were often reprinted in pamphlets and newsletters, or shared informally through extensive church networks.

One of the most notable examples of the use of missionary networks in bridging the imagined distance between a Western Christian public and distant people comes from the Congo Free State, which was established in 1885 and ruled solely by King Leopold II of Belgium.

    Leopold’s rule was characterized by widespread atrocities. Some estimates of the death toll of Leopold’s policies exceed 10 million people. Leopold used his reign to extract natural resources from the region. Following a boom in rubber prices, his agents were quick to use violence against the local population to make them harvest and process rubber.

In 1904, Alice Harris, a Protestant missionary with the Congo Balolo Mission, which was organized and supported by British Baptists, took what would become an iconic image of the horrors. Her photograph shows a Congolese father sitting in a kind of stupor, gazing at his daughter’s severed hand and foot, which lie in front of him on the missionary’s porch.

A Congolese man looks at the severed hand and foot of his daughter. From a photograph taken at Baringa, Congo State, May 15, 1904.

    Harris’s image was reproduced in a host of pamphlets, books and newspapers in both Britain and the United States. Along with other images and reports, it helped foment an international reaction against Leopold’s brutal reign.

    Armenian genocide

    At around the same time, missionaries also highlighted the pogroms and genocidal violence committed against Assyrian and Armenian Christians in the eastern Ottoman Empire.

    When Assyrian and Armenian Christians experienced systematic mass violence at the hands of the Ottoman Empire in 1915, evangelical missionaries from the American Board of Commissioners for Foreign Missions were among the first to report on the atrocities.

An Armenian woman kneeling beside her dead child. American Committee for Relief in the Near East

Their dispatches motivated the formation of an unprecedented international relief effort for the persecuted Christians. With support from the Woodrow Wilson administration, the effort raised approximately US$116 million in aid.

    Global awareness

Missionaries believed that God worked through them – through religious conversions, moral reform and material and economic progress – to spread the truth of Christianity. Missionary media thus became foundational in providing information and images of suffering around the world.

This role often pushed missionaries into ever more remote territories. The information that they sent enabled many Christians in the West to more easily imagine the world as a globally connected community.

    Scholars in a wide range of emerging academic disciplines consulted missionary newsletters and updates for knowledge about the world. These networks also established a model for creating public humanitarian campaigns on behalf of those who were suffering on the other side of the globe – one that continues to shape contemporary humanitarian efforts.

CBN News’ insistence that “God is everywhere – even in the news” echoes these sentiments. It places the network in a long tradition of creating a global Christian identity through knowledge production, of which news is an essential component.

    The Conversation

Jason Bruner receives funding from the Louisville Institute.



Pakistani religious groups protest against a Supreme Court decision that acquitted Asia Bibi, who was accused of blasphemy, in Islamabad, Pakistan. AP Photo/B.K. Bangash

    The citizens of Ireland voted recently, in a nationwide referendum, to remove a clause from their constitution that had made blasphemy a criminal offense.

    Ireland’s now-defunct Defamation Act of 2009 prohibited the “publication or utterance of blasphemous matter.” Just last year, in fact, Irish police opened a brief investigation into whether comedian Stephen Fry had broken the law when he described God as “capricious, mean-minded, stupid” and “an utter maniac” during a televised interview. The case was closed, however, as the police said they had been “unable to find a substantial number of outraged people.”

The overturning of Ireland’s blasphemy law stands in stark contrast to recent news out of Pakistan – where the release from prison of Asia Bibi, a Christian woman accused of blasphemy, has led to widespread protests. In Indonesia, too, many people have been jailed for speaking irreverently against Islam.

    Despite its recent defeat, Ireland’s 2009 blasphemy law is an important reminder that laws against blasphemy have hardly been unique to the Muslim world – even in the 21st century.

    Understanding the Muslim world

    As of 2014, according to the Pew Research Center, nearly one-fifth of European countries and a third of countries in the Americas, notably Canada, have laws against blasphemy.

    In my research for a literary study of blasphemy, I found that these laws may differ in many respects from their more well-known counterparts in Muslim nations, but they also share some common features with them.

    In particular, they’re all united in regarding blasphemy as a form of “injury” – even as they disagree about what, exactly, blasphemy injures.

    In the Muslim world, such injured parties are often a lot easier to find. Cultural anthropologist Saba Mahmood said that many devout Muslims perceive blasphemy as an almost physical injury: an intolerable offense that hurts both God himself and the whole community of the faithful.

For Mahmood, that perception was brought powerfully home in 2005, when a Danish newspaper published cartoons depicting the prophet Muhammad. Interviewing a number of Muslims at the time, Mahmood was “struck,” she wrote, “by the sense of personal loss” they conveyed. People she interviewed were very clear on this point:

    “The idea that we should just get over this hurt makes me so mad.”

    “I would have felt less wounded if the object of ridicule were my own parents.”

    The intensity of this “hurt,” “wounding” and “ridicule” helps to explain how blasphemy can remain a capital offense in a theocratic state like Pakistan. The punishment is tailored to the enormity of the perceived crime.

    Blasphemy and Christians

    That may sound like a foreign concept to secular ears. The reality, though, is that most Western blasphemy laws are rooted in a similar logic of religious offense.

As historians like Leonard Levy and David Nash have documented, these laws – dating, mostly, from the 1200s to the early 1800s – were designed to protect Christian beliefs and practices from the sort of “hurt” and “ridicule” that animates Islamic blasphemy laws today. But as the West became increasingly secular, religious injury gradually lost much of its power to provoke. By the mid-20th century, most Western blasphemy laws had become virtually dead letters.

    That’s certainly true of the U.S., where such laws remain “on the books” in six states but haven’t been invoked since at least the early 1970s. They’re now widely held to be nullified by the First Amendment.

    Yet looking beyond the American context, one will find that blasphemy laws are hardly obsolete throughout the West. Instead, they’re acquiring new uses for the 21st century.

    Religious offense in a secular world

    Consider the case of a Danish man who was charged with blasphemy, in February 2017, for burning a Quran and for posting a video of the act online.

    In the past, Denmark’s blasphemy law had only ever been enforced to punish anti-Christian expression. (It was last used in 1946.) Today it serves to highlight an ongoing trend: In an increasingly pluralist, multicultural West, blasphemy laws find fresh purpose in policing intolerance between religious communities.

    In other words, the real question for the 21st century has not been whether blasphemy counts as a crime. Instead it’s been about who, or what – God or the state, religion or pluralism – is the injured party. Instead of preventing injury to God, these laws now seek to prevent injury to the social fabric of avowedly secular states.

    That’s true not only of the West’s centuries-old blasphemy laws but also of more recent ones. Ireland’s Defamation Act, for instance, targeted any person who “utters matter that is grossly abusive or insulting in relation to matters held sacred by any religion, thereby causing outrage among a substantial number of the adherents of that religion.”

    With its emphasis on the “outrage” blasphemy may cause among “any religion,” the measure was clearly aimed less at protecting the sacred than at preventing intolerance among diverse religious groups.

Illustrations of prophecy: particularly the evening and morning visions of Daniel, and the apocalyptical visions of John (1840). Internet Archive Book Images. Image from page 371.

    The law itself caused outrage of a different sort, however. Advocacy organizations, such as Atheist Ireland, mounted fierce opposition to the law and to the example it set internationally. In late 2009, for instance, Pakistan borrowed the exact language of the Irish law in its own proposed statement on blasphemy to the United Nations’ Human Rights Council.

    Thus, Atheist Ireland warned on its website that “Islamic States can now point to a modern pluralist Western State passing a new blasphemy law in the 21st century.”

    Blasphemy in modernity

    That warning resonates with the common Western view of blasphemy as an antiquated concept, a medieval throwback with no relevance to “modern,” “developed” societies. Atheist Ireland’s chairperson, Michael Nugent, drew on this tradition when he touted the significance of the recent referendum victory:

    “It means that we’ve got rid of a medieval crime from our constitution that should never have been there.”

As Columbia University professor Gauri Viswanathan puts it, blasphemy is often used “to separate cultures of modernity from those of premodernity.” Starting from the assumption that blasphemy can exist only in a backward society, critics point to blasphemy as evidence of the backwardness of entire religious cultures.

    I would argue, however, that this eurocentric view is growing increasingly difficult to sustain. If anything, blasphemy has in recent years enjoyed a resurgence in many corners of the supposedly secular West – including prosecutions in Austria, Finland, Germany, Greece, Switzerland and Turkey. Perhaps the fate of Ireland’s Defamation Act forecasts a broader reversal of that trend.

    This piece incorporates elements of an earlier article published on May 1, 2017.

    The Conversation

    Steve Pinkerton does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.




When Loren Jacobs, a member of the Shma Yisrael Congregation, offered a prayer for the victims of the Tree of Life synagogue shooting at a campaign rally attended by Mike Pence, it left many Jews feeling very upset. The vice president’s office later denied inviting Jacobs to the event.

    Jacobs is a messianic Jew and part of a group called Jews for Jesus. Here is why their relationship with Jews is so fraught.

    Messianic Jews

Messianic Jews consider themselves Jewish Christians. Specifically, they believe, as do all Christians, that Jesus is the son of God, as well as the Messiah, and that he died in atonement for the sins of mankind.

There are approximately 175,000 to 250,000 messianic Jews in the U.S. and 350,000 worldwide. About 10,000 to 20,000 live in Israel. According to Dan Juster, a theologian who founded a major messianic Jewish congregation, there are currently about 300 congregations in the United States, and about half of the attendees are Gentiles, or ethnically non-Jewish.

    And most of these groups consider the conversion of ethnic Jews specifically – that is, people with at least one Jewish parent – to messianic Judaism a central part of their mission.

    Messianic Jews and Jewish messianism

    Belief in a Messiah who will redeem the Jewish people and thereby usher in a new, more humane era is very much a Jewish concept. However, there are deep theological differences between Jews and Christians regarding exactly who is a Messiah, what a Messiah should do and even how central a Messiah should be to their traditions.

    According to both the Hebrew Bible and Jewish oral tradition, a Messiah is a king, a warrior, a political figure or a revolutionary whose mission is divine and specific to the Jews. But the leader is neither divine nor a savior concerned with the afterlife of humanity. Neither is a Messiah worshiped as a deity.

According to the Hebrew Bible and Jewish oral tradition, a Messiah could be a king, a warrior, a political figure or a revolutionary. tomertu/Shutterstock.com

    This leader’s job is to facilitate the return of the Jews to the land of Israel, not in the afterlife but in the temporal world. Therefore, redemption does not entail atonement for sins, but is a liberation from exile and a return to self-rule in Israel.

    One doesn’t need to be Jewish to be a Messiah. The Persian King Cyrus is referred to as a “Messiah” in the Hebrew Bible because he allowed the Jews to return to the land of Israel, signaling the end of what is known as the Babylonian Exile in the sixth century B.C.

And Cyrus is not the only figure to be called a Messiah. Bar Kokhba, the warrior and revolutionary who led the Jewish revolt against Rome from A.D. 132 to 135, was also believed to be a possible Messiah because he sought to eject Roman rule from Israel and return the Jews to Jerusalem. The fact that Bar Kokhba did not successfully defeat Rome ultimately meant he did not turn out to be a Messiah – but he certainly took on the job of a Jewish Messiah.

    There are multiple forms of Jewish messianism, but none of them believe that a messianic figure – if such a person exists – will be divine.

Contemporary Judaism’s many branches do not agree on when or if a Messiah will appear at all, especially since the creation of the Israeli state in 1948. This is in large part because the traditional job of a Messiah – the restoration of the Jewish state – has already been accomplished. Some Jews do believe that a Messiah will come, but the signs that would foretell that coming have not yet appeared.

Also, many Jews have rejected the idea of an individual Messiah in favor of the idea that humans themselves, through acts of social justice or tikkun olam, will mend the world and bring about a “messianic age” in which life improves for Jews and, indeed, for all of humanity.

    Christianity’s redefinition of the nature and role of a Messiah is its most important point of departure from Judaism, and has accounted for much of the tension between Jews and Christians historically.

    Jews do not share the Christian belief that Jesus was divine. This difference in belief is grounded in the Jewish assertion that there is only one God, who can never be human, even though God may reveal himself in multiple ways. Historically, this created an insurmountable theological barrier between Jews and Christians.

    Conversion of Jews

    Although Jewish Christians have technically been around since the death of Jesus, the more modern form of the movement has its roots in late 19th-century Europe, when anti-Semitic persecution was on the rise in Russia and large numbers of Jews immigrated to the United States.

The sole focus of some missions based in England and the U.S. was the conversion of the Jews to Christianity. One such mission, the London Society for Promoting Christianity Amongst the Jews, writes scholar Patricia A. Power, met in Boston in 1816. Its objective, as she says, “was to encourage Gentiles to take the task of Jewish evangelism seriously.”

Jews for Jesus is an inheritor of this objective. It began, as Power explains, as a small group with dedicated followers and became “a multimillion dollar evangelistic machine that aggressively, and with savvy, marketed Jesus as the Jewish Messiah to an astonished and often hostile Jewish community.”

Jews for Jesus’s controversial founder, Moishe Rosen, who died in 2010, adopted some of the practices of the “Jesus People” movement – a religious movement of the 1960s that sought to return to the original life of early Christians – for the conversion of Jews. While appearing to reject anti-Semitism, he portrayed Judaism as an incomplete tradition practiced by people who misunderstood their own scriptures and needed to be saved through conversion to Christianity.

    Misinterpreting scriptures?

    According to Jews for Judaism, an organization that provides support and education for Jews who have been targeted for conversion, missionaries like Jews for Jesus are often aggressive and manipulative in their pursuit of Jewish conversions to Christianity.

Jews for Judaism alerts Jews to misinformation that might take scriptures out of context. ungvar/Shutterstock.com

On its website, Jews for Judaism alerts Jews to the most common form of misinformation, which involves taking the Jewish scriptures out of context – tactics that have been denounced by Jews and Christians alike. Loren Jacobs himself was defrocked by the Union of Messianic Jewish Congregations over accusations of libel 15 years ago, “after becoming involved in a bitter theological debate with other members of the group.”

What makes the targeting of Jews for conversion to Christianity particularly painful and damaging is that for over a thousand years, Jews were persecuted, first at the hands of a Christian Roman Empire and then the Church, for refusing to accept that their own scriptures contained the truths claimed by another religion.

    Prayers like the one said by Loren Jacobs are a powerful reminder of that long and violent history.

    The Conversation

    Ingrid Anderson does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.



'Protest chaplains' came together to pledge their support to the Occupy Wall Street protests at Judson Memorial Church in New York in 2011. AP Photo/Andrew Burton

    More and more institutions across the United States are hiring chaplains and other spiritual care providers. Some are places that have long employed chaplains, but others may come as a surprise.

    The Massachusetts Institute of Technology, for example, recently installed a new chaplain. Various police departments are adding additional chaplains, as are horse racing tracks. At the same time, chaplaincy positions continue to exist in the U.S. House and Senate.

    Given the growing numbers of Americans who describe themselves as atheistic, agnostic or “nothing in particular,” this can appear puzzling.

    Why is chaplaincy growing when institutional religious affiliation is on the decline?

    History of chaplaincy

    The presence of chaplains in American institutions goes back to the Revolutionary War, when they served the American military. Chaplains helped perform many rituals and were present for patriotic ceremonies and events. Military chaplains have long been uniformed, noncombatant, commissioned officers with rank.

A chaplain from Iowa presides over Passover service for Jewish soldiers during World War II. AP Photo

    Later, prisons and hospitals came to employ them to provide spiritual care. In federal prisons, chaplains provide a ministry to prisoners, along with support for behavior modification.

    In earlier eras, chaplains, like the American population in general, were overwhelmingly Protestant, Catholic or Jewish. They mostly cared for people from their own faith traditions.

    Changing role

    These traditional roles are changing. In our research we have come across some unique examples of organizations and people providing support to individuals and communities in a variety of situations.

    Allay Care Services, a newly launched venture, for example, provides chaplains who, for a fee, help individuals and families clarify their wishes at the end of life and prepare the necessary legal documents. While religious leaders have long worked around these issues, Allay links chaplains to people they do not know. The work takes place by phone.

Chaplains provide care for weary travelers. Donna Mote at Atlanta’s Hartsfield-Jackson International Airport is just one chaplain among those working in over 170 countries who provide support to people they mostly see only once as they pass through such busy spaces. At the New England Seafarers’ Mission in Boston, chaplain Steve Cushing greets the foreign-born staff of container and cruise ships every week.

    Chaplains are currently deployed with every Red Cross disaster team in the United States and with many fire departments across the country. In these and other examples, they are present with people in crisis and help connect them to other resources. Mote and Cushing, for example, help travelers transfer money to their families, shop for basic necessities or even call home.

A chaplain for disaster and spiritual care with the American Red Cross at a memorial service following the Boston Marathon explosions in 2013. AP Photo/Boston Globe, Dina Rudick, Pool

    While some of the people chaplains serve have relationships with local clergy, growing numbers do not. This means that chaplains are, in many cases, the only theologically educated people that these members of the public have a connection with.

    Religious studies scholar Winnifred Sullivan describes chaplains today as “secular priests” or “ministers without portfolios.” Their work, increasingly called “spiritual care,” she argues, is understood by many as required by the First Amendment of the U.S. Constitution.

    Chaplaincy without religion?

    What is most interesting is the presence of chaplains in places not typically thought to be “religious.”

    For example, chaplains are increasingly present in social movements including Occupy, Black Lives Matter and Standing Rock. They provide a steady presence to protesters grappling with existential questions amid deep tensions that characterize such situations.

    An interesting example is that of Laura Everett, executive director of the Massachusetts Council of Churches. Everett serves as a bicycle chaplain. When cyclists are killed in traffic in the greater Boston area, she places white bicycles on the sites and leads services of remembrance for community members.

The point is that even when people are skeptical of or distant from religious organizations, many remain personally spiritual. Millennials, especially, are gathering in athletic groups and activist organizations – not congregations – to build community and support personal growth. And they, too, are being joined by chaplains who accompany them through life in ways traditional clergy have done in the past.

    In view of this trend, a quarter of theological schools are focusing attention directly on chaplaincy as their overall enrollment numbers continue to decline. Might this reflect a long-term shift in American religious life?

    The Conversation

Wendy Cadge receives funding from FISH.

Michael Skaggs receives funding from FISH.



Vaccines work because they help create herd immunity. JPC-PROD/Shutterstock.com

Across the country, billboards are popping up suggesting that vaccines can kill children, even though the science behind vaccination is crystal clear – vaccinations are extremely safe.

Researchers who study the beliefs of anti-vaxxers have found many different reasons – not just religious or political – why some parents refuse to get their children vaccinated.

As a bioethicist who investigates how societal values impact medicine, I consider such decisions downright indefensible. Here are three reasons why.

    1. Failure to contribute to the public good

Public goods benefit everyone. Take the example of roads, clean drinking water or universal education. Public health – the health of the overall population as a result of society-wide policies and practices – also falls into this category.

    Many ethicists argue that it is unfair to take advantage of such goods without doing one’s own part in contributing to them.

    Years of research involving hundreds of thousands of people have proven vaccines to be safe and effective. One reason why they are so effective – to the point of complete eradication of certain diseases – is because of what scientists call “herd immunity.”

    What this means is that once a certain percentage of a population becomes immunized against a disease through public health programs, it provides general protection for everyone. Even if a few people get sick, the disease won’t spread like wildfire.
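As a rough illustration of why a high immunization percentage protects everyone, here is a minimal sketch – not from the article, and assuming the standard simplified model in which each case infects R0 others in a fully susceptible population (an R0 of roughly 12 to 18 is commonly cited for measles):

```python
# A minimal sketch of the arithmetic behind herd immunity, assuming the
# textbook model in which one case infects R0 others when everyone is
# susceptible. Illustrative only; the numbers are standard estimates.

def herd_immunity_threshold(r0: float) -> float:
    """Fraction of a population that must be immune so that each case
    infects, on average, fewer than one new person."""
    return 1 - 1 / r0

def effective_r(r0: float, immune_fraction: float) -> float:
    """Average number of new infections per case once part of the
    population is immune."""
    return r0 * (1 - immune_fraction)

r0 = 15  # measles is highly contagious; R0 of roughly 12-18 is often cited
print(f"Coverage needed: {herd_immunity_threshold(r0):.0%}")  # ~93%
print(effective_r(r0, 0.95))  # 0.75 -> below 1, so outbreaks fizzle out
print(effective_r(r0, 0.80))  # 3.0  -> above 1, so the disease can spread
```

In this simplified picture, once coverage passes the threshold, each infection leads on average to fewer than one new case, and outbreaks die out on their own.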

    Those avoiding vaccination are aware that their children might nonetheless benefit from protection on account of herd immunity. This is unfair. For if everyone acted in that way, herd immunity would disappear.

    Indeed, this happened in California, where measles made a comeback because so many parents chose not to vaccinate their children.

    These parents not only failed in their duty to contribute to the public good, they also actively undermined it, hurting others and also costing the economy millions of dollars.

    2. Impact of health choices on the vulnerable

People with weakened immune systems are likely to get sick more easily. Kaspars Grinvalds/Shutterstock.com

Viruses do not affect everyone equally. Oftentimes, it is the elderly, infants and people with weakened immune systems who are most at risk.

In my family, my brother, Jason, often had to be rushed to a hospital because he caught infections easily. So, when we had visitors, my family would ask them to let us know if they had any infections.

Often the answers were not truthful. Some would say that it was merely an “allergy,” and others would be downright offended. My brother would end up catching their germs and, more than once, nearly lost his life due to their lack of concern for his health.

    Ethicists have long argued for special obligations towards the most vulnerable. And we need to be mindful of the impact of individual health choices on others, particularly the vulnerable.

3. Health is communal

Political philosophers like John Dewey have argued that democratic public institutions necessarily rely upon belief in scientific evidence and facts. People can hold different personal beliefs, but there are some truths that are irrefutable, such as the fact that the Earth is round and revolves around the sun.

    Anti-science attitudes are dangerous because they undermine our ability to make decisions together as a society, whether about education, infrastructure or health. For example, if too many people treat the scientific consensus on climate change as just “one perspective,” that will hinder our ability to respond to the massive changes already underway. In a similar manner, treating the science on vaccines as just “one perspective” negatively impacts everyone.

    In the face of overwhelming scientific evidence concerning the efficacy, safety and importance of vaccines, citizens have a duty to support vaccination and encourage others to do so as well.

    At the foundation of each of these duties lies a simple and powerful truth: Health is communal. Health-related ethical obligations do not stop at our own doorstep. To think that they do is both empirically misguided and ethically indefensible.

    The Conversation

Joel Michael Reynolds does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.



The largest public housing complex in the country, Queensbridge Houses, is located near the spot where Amazon plans to put a new headquarters. AP Photo/Mark Lennihan

    When large companies move into an area, politicians often proclaim how the new business will create jobs, increase tax revenues, and thus lead to economic growth. This is one reason local governments offer tax incentives to businesses willing to move in.

    Amazon’s decision to locate offices in Long Island City across the East River from Manhattan, and in Crystal City on the outskirts of Washington, D.C., follows this pattern. The New York location borders the largest low-income housing area in the United States, with mostly African-American and Hispanic residents whose median household income is well below the federal poverty level. These people, local politicians claim, will benefit from Amazon’s move to the neighborhood.

    However, when large companies with an upscale and specialized workforce move into an area, the result is more often gentrification. As economic development takes place and prices of real estate go up, the poorer residents of the neighborhood are forced out and replaced by wealthier ones.

    Is such a market-driven approach that accepts displacement ethically justifiable? And how do we even measure its costs?

    Can gentrification ever be ethical?

    Although politicians don’t typically frame gentrification as a question of ethics, in accepting the displacement of poor residents in favor of better-off residents they are, in effect, making an argument based on ideas of utilitarianism.

Utilitarianism, developed as a modern theory of ethics by the 19th-century philosophers Jeremy Bentham and John Stuart Mill, seeks the greatest balance of happiness over suffering in society as a whole – the greatest net benefit in any situation. In economics, this net benefit is often expressed in monetary terms.

A classic example is a new dam that will generate electricity, irrigate crops and provide a new lake for recreation. But it might also displace people and flood land that is used for other purposes.

    Economists might calculate the dollar cost of the dam itself, the monetary value of the land lost, and the cost to relocate displaced people. They would weigh these monetary costs against the value of the electricity gained, the increased food production, and added income from recreation.

    What economists miss in these calculations are the social costs. For example, they do not count the lives disrupted through displacement, nor do they determine if the benefits of the dam are equally available to all.
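As a purely illustrative sketch – with invented figures, not anything from the article – the economists’ tally might look like the following; note that the social costs just described never enter the calculation:

```python
# A toy utilitarian tally for the dam example. Every dollar figure here is
# invented for illustration only; none of these numbers come from the article.

costs = {
    "building the dam": 500_000_000,
    "monetary value of land lost": 120_000_000,
    "relocating displaced people": 30_000_000,
}

benefits = {
    "electricity generated": 400_000_000,
    "increased food production": 200_000_000,
    "added income from recreation": 80_000_000,
}

net_benefit = sum(benefits.values()) - sum(costs.values())
print(f"Net monetary benefit: ${net_benefit:,}")  # $30,000,000, so the dam "wins"

# Absent from this tally: the lives disrupted through displacement, and
# whether the dam's benefits are equally available to all. These social
# costs never appear as line items, which is precisely the objection.
```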

Gentrification, as an economic and social phenomenon, is not limited to cities in the United States; it has become a global issue. In cities as geographically dispersed as Amsterdam, Sydney, Berlin and Vancouver, gentrification has been linked to free-market economic policies. Put another way, when governments decide to let housing and property markets exist with little or no regulation, gentrification typically flourishes.

    When neighborhoods gentrify, politicians and policymakers often point to physical and economic improvements and the better quality of life for residents in an area after gentrification. For example in 1985, during a period of intense urban renewal in New York City, the Real Estate Board of New York took out advertisements in The New York Times to claim that “neighborhoods and lives blossom” under gentrification.

Through the lens of utilitarianism, one could say that the population living in neighborhoods after gentrification experiences greater happiness than before.

    The fallacy of this argument is, of course, that these “happier” populations are overwhelmingly not the same people as were there before gentrification. As a scholar who works on questions of ethics in the built environment, I have studied how we, as the concerned public, can better equip ourselves to see through such arguments.

Economic development in an area leads to less poverty in that area, not because the personal economic situation of poor people who live there has improved, but because the poor people have quite simply been erased from the picture.

    Erasing the working class

    Urban geographer Tom Slater points to a similar disappearing act within gentrification research.

Tenants being pushed out on account of rising rents in Harlem in 2007. AP Photos/Bebeto Matthews

    Researchers once focused on the experiences of those negatively affected by gentrification. For example, one study of the Williamsburg neighborhood of Brooklyn found that gentrification commonly removed manufacturing from inner city areas, leading to blue-collar workers losing urban job opportunities.

    Another study found that gentrification was associated with increased social hardships for residents. Not only did their housing expenses rise, social networks disintegrated as neighbors were forced to move elsewhere. In an examination of seven New York neighborhoods, for example, the researchers found that half of the poor households who had remained in gentrifying areas were paying more than two-thirds of their income for rent.

    Where gentrification research once focused on evictions of low-income and working class residents, housing affordability problems, and torn social fabrics caused through changing neighborhoods, the talk has since turned to the experiences of the middle classes who are doing the gentrifying.

    Terms like “competitive progress” and “regeneration, revitalization and renaissance” of urban neighborhoods are commonly used to describe a process whereby physically distressed areas of a city have their buildings renovated and updated.

Urban planner and best-selling author Richard Florida also focuses on the gentrifiers. In his much-discussed 2002 book, “The Rise of the Creative Class,” Florida maintains that cities with a large gay and “bohemian” population of artists and intellectuals tend to thrive economically.

    He calls this group of hip and affluent urbanites the “creative class,” and states that they are responsible for a city’s economic success. When Florida’s book came out, city leaders throughout the United States quickly seized on his ideas to promote their own urban renewal projects.

    When researchers and urban leaders focus on the gentrifiers, the displaced poor and working class are doubly erased – from the gentrifying areas they once called home, and with few exceptions, from the concerns of urban policymakers.

    The need to restore happiness

    Amazon’s move to Washington and New York along with an influx of well-paid employees brings us back to the question of how we might apply the ethical concept of utilitarianism to understand the greatest balance of happiness over suffering for the greatest number of people.

In my view, this number must include the poor and working class. In an area threatened by gentrification, the economic and social costs for displaced residents are typically high.

    To make ethical decisions, we must consider the people who suffer the consequences of rapidly rising costs in the area they call home as part of the ethical equation.

    The Conversation

    Alexandra Staub does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.



Researchers have found people use the 'like' button on social media posts for many reasons. Worawee Meepian

    Jack Dorsey, Twitter’s founder and CEO, was recently reported to have questioned how the site “incentivizes people to want (the number of likes on their posts) to go up.” He also said that “he was not a fan of the heart-shaped (‘like’) button and that Twitter would be getting rid of it ‘soon.’” Twitter has since released a statement indicating that there are no immediate plans to remove the “like” button.

    Whatever the future of Twitter’s “like” button may be, as a scholar of social media and religion I’d argue that the cute little heart-shaped button on Twitter and Facebook is far more impactful than it appears.

    How people use the ‘like’ button

    The “like” button is not there by accident. Instead, this one-click feature exists as an intentional design decision.

Like most big tech companies, Twitter has an entire department dedicated to understanding its users. Based on that research, every feature and font is there to maximize the overall user experience.

These design decisions subtly influence users’ feelings and behaviors. Chris Nodder, a user experience researcher and the author of “Evil by Design,” explains how designers must always ask the question, “How do we influence behavior through the medium of software?”

In other words, design decisions are made not only to improve a user’s experience but also to influence their behavior. The “like” button is one example.

    While the heart-shaped button is seemingly only for expressing appreciation for the content of a social media post, researchers have determined that people use the button for many other reasons.

    A team of researchers, for example, found that users in the United States often chose to like something for bonding purposes rather than simply liking the content.

    Another study of Facebook users found that the “like” button is used to maintain relationships with existing friends or to develop new relationships. People may use the “like” button as a way to publicly show closeness to another person, or even as an effort toward dating someone.

The point is that the “like” button does far more than just express how much a person likes a particular picture or post.

    Not so harmless

    Research has also shown that the “like” button is not entirely harmless.

    Social networking sites are powerful tools for building relationships. Nevertheless, research has shown that certain social media features can adversely affect users.

    For instance, a study found that impersonal gestures such as the one-click “like” communication may not promote user well-being. According to Facebook researcher Moira Burke and Robert Kraut, an emeritus professor at Carnegie Mellon University, “simply reading about friends, receiving text communication from weak ties, and receiving one-click communication did not affect well-being.” On the other hand, more personal and direct communication such as a direct message or personalized comment can have an impact on user well-being.

The number of ‘likes’ on a post can generate feelings of envy. 13_Phunkod

    A particularly harmful byproduct of the “like” button is found in the way social networking sites foster negative social comparisons. A review of research on the topic has found that social media use correlates with measurable increases in envy and depression.

    These feelings of envy can take two different forms: malicious envy and benign envy. Malicious envy involves resentment and a desire to harm the other person. Benign envy involves admiration and a desire to obtain what the other person possesses.

    One of the studies in the review involved 194 college-aged Facebook users in Germany. In this study, researchers found that “the closer the relationship, the more a Facebook user will experience benign envy.”

Social networking sites are digital showrooms that allow people to present the best version of themselves for everyone else to see. Often, people use the number of likes to judge others and themselves. That little heart-shaped button becomes a publicly quantifiable measure of social support.

According to this research, the “like” button works as a “mechanism to compare oneself with others.” The number of “likes” makes social support quantifiable, and it can then be easily used for making social comparisons.

    Ripple effects of social networking sites

    Given the impact of social networking sites on the feelings and behaviors of billions of users, I believe there needs to be an ethical component to designing these technologies.

    As Twitter is busy “rethinking everything,” the company would do well to think about how the platform is shaping the feelings and behaviors of its users.

    Michigan Tech humanities scholar Robert Johnson, in his book, “User-Centered Technology,” writes how technologies have “ripple effects” that “shape culture in defining ways.”

    The same argument is true for social networking sites. As such, every design feature – even that little heart-shaped button – must be carefully scrutinized.

    The Conversation

A. Trevor Sutton does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.



The Salvation Army is among the top few U.S. charities. CityOfFortWorth, CC BY-NC-ND

    Tinseled trees and snowy landscapes are not the only signs of the upcoming holiday season. Red kettles, staffed by men and women in street clothes, Santa suits and Salvation Army uniforms also telegraph Christmastime.

    The Army is among America’s top-grossing charities. In 2015, its 25,000 bell-ringers helped raise an all-time high of US$149.6 million. That was part of the year’s almost $3 billion revenue from bequests, grants, sales, in-kind donations and investments as well as direct contributions.

    William Booth, an English evangelist, founded the Salvation Army in 1878 as a religious outreach to London’s poor. How a British evangelical church became an American icon is an ongoing interest of mine.

    Entry into the United States

William Booth, founder of the Salvation Army. AP Photo

    Booth, who called himself “The General,” fashioned his Army on Britain’s military. From the start, his “soldiers” wore uniforms and they described their mission in martial terms. Salvationists marched through the streets of London’s East End, a neighborhood of poor immigrants, with brass bands and female preachers. Booth and his followers also pursued “sinners” and frequently preached in bars, brothels and theaters.

    Booth’s plan was to send his army worldwide, and his first stop was the United States. One of his early recruits had migrated to Philadelphia, and wrote to Booth about the residents’ need for salvation. In 1880, a small party of British Salvationists debarked at Castle Garden, New York’s first immigration center. The group immediately started singing hymns set to popular melodies and marching through lower Manhattan.

    During the next few days, the English “soldiers” tacked up posters, similar to ads for commercial entertainments, for a prayer service at Harry Hill’s, a popular dance hall, theater and saloon. The venue was not only thick with drunkards, prostitutes and pleasure seekers, its unlikeliness as a religious meeting place guaranteed press attention.

    Such unexpected behavior did bring the Army to the public’s attention. Their boisterousness, even in service of saving souls, was criticized by New York’s clergy and ridiculed in newspapers and magazines. That the Army featured female preachers at a time when most Protestant groups did not ordain women only added to its notoriety.

    But the Army did not surrender. Pressing their “invasion” beyond New York City, the soldiers traveled first to Philadelphia and later nationwide. Their exuberance attracted young people and women to the cause.

Young people liked the notion of a military crusade for religious purposes, and women joined because the Army offered them positions of leadership and authority. In fact, William Booth’s daughter-in-law, Maud Ballington Booth, followed by his two daughters, Emma and Evangeline, headed the American Salvation Army from 1887 to 1950.

    Kettles for Christmas dinner

    In both Britain and the U.S., Salvationists saw their mission as twofold: converting sinners and assisting the needy.

    In the Army’s perspective, the two went hand in hand, which is why members opened shelters for addicts, alcoholics and prostitutes. Yet they also sought to aid “down and outers,” their name for the needy. Among their early outreaches were Christmas dinners for the urban poor. But finding funds for food and gifts was difficult.

Salvation Army mini Red Kettle and bell at Delnor-Wiggins Pass State Park. Robin Wendell/Flickr.com, CC BY

    By 1891, Salvationists had outposts nationwide. In San Francisco, Salvation Army Captain Joseph McFee was eager to serve a Christmas feast for a thousand of the city’s poorest residents. Frustrated by his lack of success, he decided to improvise. Grabbing a crab pot from the local wharf, he hung it from a tripod at a busy intersection. Above the pot was a sign: “Fill the Pot for the Poor – Free Dinner on Christmas Day.” McFee’s campaign was a success.

    Word spread and the kettles soon provided Christmas dinners for thousands nationwide.

    The kettle also helped rehabilitate the Army’s image. Instead of seeing Salvationists as an unruly pack of religious rebels, many Americans recognized their work with the poor. At a time when neither state nor federal governments provided a social safety net, the Army offered meals, beds, work and medical facilities to destitute men and women.

    But it was the Salvationists’ service in World War I that sealed the deal. Eager to support the American war effort, Salvation Army leaders sent “Sallies,” the popular nickname for Army women, to the French front. The Sallies set up huts where they fried donuts, sewed buttons, wrote letters and otherwise “mothered” the troops.

    The women’s faith, fortitude and friendship touched many young soldiers. One wrote in his letter home:

    “These good women create an atmosphere that reminds us of home, and out of the millions of men over there not one ever dreams of offering the slightest sign of disrespect or lack of consideration to these wonderful women.”

    By the war’s end, the Army had become a symbol of American humanitarianism, and fundraising was much easier. But after the 1920s, the Army’s evangelical crusade took a back seat to social service delivery, at least in their public relations. It was easier to raise money for helping the poor than for converting them.

    Despite challenges, an American icon

    Today, many contributors do not realize the Army is a church, a fact that has caused many Army leaders consternation.

And, much like other churches, its growth has stalled: Since 2000, its membership has hovered around 90,000. Nonetheless, it continues to deliver social services nationwide. In 2017, according to its own records, the Army served over 50 million meals, operated 141 rehabilitation centers and provided shelter for almost 10 million people. It also provided adult and child day care, job assistance, disaster relief, medical care and community centers.

But like any other long-established institution, the Army has its challenges. Most recently, LGBT groups alleged discrimination in service provision and in hiring.

    The Army has responded with its own statements of how it is “open and inclusive to all people.”

    It also faces new problems ranging from a shortage of bell ringers in some cities to fewer kettle contributions as people carry less cash.

‘Guys and Dolls’ musical. Poughkeepsie Day School/Flickr.com, CC BY-NC-ND

Yet the Army remains a familiar symbol for religious and philanthropic outreach. Each year, when high school and college actors perform “Guys and Dolls,” the Army graces American stages. This popular musical, inspired by a real-life Salvationist, captures the missionaries’ zealous dedication. And this holiday season, Grammy Award-winning singer Meghan Trainor kicked off the 2018 Red Kettle Campaign during the halftime show of the Dallas Cowboys’ Thanksgiving Day game.

    Salvation Army Captain Joseph McFee’s legacy lives on – providing inspiration to millions of Americans, whether they care about religion or not.

    The Conversation

    Diane Winston does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.



Dorothy Day, founder of the Catholic Worker Movement, back to camera, speaks with black and white southerners about the problems of segregation during a radio show. AP Photo/H.B. Littell

    Dorothy Day died 38 years ago. Her life followed an unorthodox path – moving from rejecting religion in favor of activism to embracing Catholicism and integrating it with social action through the Catholic Worker Movement.

    A hero of the Catholic left, Day found an unlikely champion for her canonization in New York’s conservative archbishop, Cardinal Timothy M. Dolan, who hailed her as “the saint for our times.” At their November 2012 meeting, the U.S. bishops unanimously supported her cause, and the Vatican accepted the recommendation, naming her “Servant of God.” If an investigation proves her life to be exceptionally virtuous, she will be declared “venerable.”

    However, to declare her a saint, two miracles through her intercession will need to be proven. The process is long and complex, and only three other American-born Catholics, all women, have been canonized. The Catholic Church remembers the life of saints at daily Mass on their feast day, usually the day of death.

    What most appeals to me, as a scholar of Dorothy Day, is her ability to discern beauty in the midst of her harsh and demanding life. In that, she has a lesson for the times we live in.

    An early radical life

    The arc of her early life followed an unconventional path. In her 1952 autobiography, “The Long Loneliness,” Day reveals her lifelong attraction to the radical life among anarchists, socialists and communists.

Dropping out of the University of Illinois in 1916, she followed her family to New York City and found work as a journalist and freelance writer. Living on her own, she spent much of her time among radicals like Max Eastman, editor of the socialist newspaper “The Masses” and a communist. As a journalist, she took up the cause of striking workers. She loved to read in her spare time and found especially inspiring the work of Russian novelist Fyodor Dostoevsky.

She was also an activist. In 1917, Day joined a friend in a suffragette protest, which led to their arrest and incarceration at the notorious Occoquan work farm in Virginia. Day describes in vivid detail the guards’ brutality as they grabbed her and dragged her to her cell. She subsequently participated in a hunger strike with her companions to protest against such treatment.

After her release, she returned to New York, working odd jobs and drinking until dawn with an assortment of friends in a bar nicknamed “Hell Hole.” She recalls with fondness the playwright Eugene O'Neill reciting Francis Thompson’s “The Hound of Heaven.” As she wrote in her autobiography, the hound’s relentless pursuit fascinated her and caused her to wonder about her own life’s ultimate end.

She went through times of deep personal sorrow. Her granddaughter, Kate Hennessy, reveals in “Dorothy Day: The World Will Be Saved by Beauty” Dorothy’s heartache over failed love affairs, one of which led her to procure an illegal abortion. The trauma contributed to her strong opposition to abortion after she became Catholic.

    The highs and lows of this life left Day unsettled, and she recalls slipping into the back of St. Joseph’s Church, on Sixth Avenue, taking solace in watching Mass as dawn broke over the cityscape.

    Becoming a Catholic

    Then, in 1925, Dorothy Day fell in love with Forster Batterham, the brother of a friend’s wife, a transplanted southerner, a lover of nature and, like Day, of opera. They shared her Staten Island cottage and conceived a child, Tamar Therese, born in 1926.

She describes in loving detail her life with Forster, “walking on the beach, resting on the pier beside him while he fished, rowing with him in the calm of the bay, walking through fields and woods.”

    It was the birth of her daughter that connected her to the beauty of the divine in a deeply personal way. She wrote,

    “The final object of this love and gratitude is God.”

She was moved to worship God with others. Even though the man she loved rejected all institutions, especially religious ones, Day had her daughter baptized a Catholic, and was baptized herself less than six months later.

Dorothy Day. A photo from 1934. New York World-Telegram & Sun Collection, via Wikimedia Commons

This ended her common law marriage, though in her book, Hennessy makes abundantly clear that her grandfather, Forster, remained a constant presence throughout her grandmother’s life.

    About five years later, Day met Peter Maurin, a French immigrant who taught her about Catholic radicalism. They founded the Catholic Worker Movement and began publishing a newspaper by the same name in May 1933 to disseminate their radical Catholic vision as a counter to Communism.

That same summer a Catholic Worker Movement community formed and lived in what Maurin called a “house of hospitality,” a place of welcome to every person, especially the poor. Day explains the gospel inspiration for these houses of hospitality:

    “The mystery of the poor is this: That they are Jesus, and what you do for them you do for Him. It is the only way of knowing and believing in our love.”

    The Catholic Worker Movement continues to thrive through its newspapers and houses of hospitality.

    Saving beauty

    For Day, beauty appeared wherever God was present. This meant Day came to see beauty everywhere and in everything.

    She believed Christ’s saving beauty appeared not only on the altar at Mass but also around every Catholic Worker Movement table. Jesus identified with the least, and so, for Day, Christ appeared in every poor person who came to share a meal at a house of hospitality.

    Her writings make clear that she never wavered in this conviction.

    This attentiveness to beauty translated to everything commonplace in her daily life. Another Day scholar told me of his vivid memory of an elderly Dorothy gazing intently at a jar of unkempt wild flowers that were quite unremarkable in their abundance and fleeting in their beauty.

    Day’s keen sense of wonder at commonplace beauty remained a hallmark of being a witness to God’s love. Three years before her death, she wrote:

“What samples of His love in creation all around us! Even in the city, the changing sky, the trees, frail though they be, which prisoners grow on Riker’s Island to be planted around the city, bear witness. People – all humankind, in some way.”

    In sharing with her readers the view from her Staten Island cottage, she wrote:

    "the bay, the gulls, the ‘paths in the sea,’ the tiny ripples stirring a patch of water here and there, the reflections of the cloud on the surface – how beautiful it all is.”

    Dorothy Day surrounded herself with the beauty of a loving God made manifest in the least – something contemporary culture could learn from.

    The Conversation

    Sandra Yocum does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.


    How Hanukkah came to America

    In the United States, Hanukkah has gained much significance.Tercer Ojo Photography/Shutterstock.com

    Hanukkah may be the best-known Jewish holiday in the United States. Yet it ranks as one of Judaism’s minor festivals, and nowhere else does it garner such attention. The holiday is mostly a domestic celebration, although special holiday prayers also expand synagogue worship.

    So how did Hanukkah attain its special place in America?

    Hanukkah’s back story

    The word “Hanukkah” means dedication. It commemorates the rededication of the ancient Temple in Jerusalem in 165 B.C., when Jews – led by a band of brothers called the Maccabees – tossed out statues of Hellenic gods that had been placed there by King Antiochus IV when he conquered Judea. Antiochus aimed to plant Hellenic culture throughout his kingdom, and that included worshipping its gods.

    Legend has it that during the dedication, as people prepared to light the Temple’s large oil lamps to signify the presence of God, only a tiny bit of holy oil could be found. Yet that little bit of oil remained alight for eight days until more could be prepared. Thus, for eight nights, Jews light candles each evening, adding an additional candle as the holiday progresses.

    Hanukkah’s American story

    Today, America is home to almost 7 million Jews. But Jews did not always find it easy to be Jewish in America. Until the late 19th century, America’s Jewish population was very small, reaching only about 250,000 by 1880. The basic goods of Jewish religious life – such as kosher meat and candles, Torah scrolls, and Jewish calendars – were often hard to find.

    Until the late 19th century, basic goods of Jewish life were hard to find in the U.S.Zoltan Kluger

    In those early days, major Jewish religious events took special planning and effort, and minor festivals like Hanukkah often slipped by unnoticed.

    My own study of American Jewish history has recently focused on Hanukkah’s development.

    It began with a simple holiday hymn written in 1840 by Penina Moise, a Jewish Sunday school teacher in Charleston, South Carolina. Her evangelical Christian neighbors worked hard to bring the local Jews into the Christian fold. They urged Jews to agree that only by becoming Christian could they attain God’s love and ultimately reach Heaven.

    Moise, a famed poet, saw the holiday celebrating dedication as an occasion to inspire Jewish commitment despite Christian challenges. Her congregation, Beth Elohim, publicized the hymn by including it in their hymnbook.

    This English language hymn expressed a feeling common to many American Jews living as a tiny minority. “Great Arbiter of human fate whose glory ne'er decays,” Moise began the hymn, “To Thee alone we dedicate the song and soul of praise.”

    It became a favorite among American Jews and could be heard in congregations around the country for another century.

    Shortly after the Civil War, Cincinnati Rabbi Max Lilienthal learned about special Christmas events for children held in some local churches. To adapt them for children in his own congregation, he created a Hanukkah assembly where the holiday’s story was told, blessings and hymns were sung, candles were lighted and sweets were distributed to the children.

    His friend, Rabbi Isaac M. Wise, created a similar event for his own congregation. Wise and Lilienthal edited national Jewish magazines where they publicized these innovative Hanukkah assemblies, encouraging other congregations to establish their own.

    Lilienthal and Wise also aimed to reform Judaism, streamlining it and emphasizing the rabbi’s role as teacher. Because they felt their changes would help Judaism survive in the modern age, they called themselves “Modern Maccabees.” Through their efforts, special Hanukkah events for children became standard in American synagogues.

    20th-century expansion

    By 1900, industrial America produced the abundance of goods exchanged each Dec. 25. Christmas’ domestic celebrations and gifts to children provided a shared religious experience to American Christians otherwise separated by denominational divisions. As a home celebration, it sidestepped the theological and institutional loyalties voiced in churches.

    For the 2.3 million Jewish immigrants who entered the U.S. between 1881 and 1924, providing their children with gifts in December proved they were becoming American and obtaining a better life.

    But by giving those gifts at Hanukkah, instead of adopting Christmas, they also expressed their own ideals of American religious freedom, as well as their own dedication to Judaism.

    A Hanukkah religious service and party in 1940.Center for Jewish History, NYC

    After World War II, many Jews relocated from urban centers. Suburban Jewish children often comprised small minorities in public schools and found themselves coerced into participating in Christmas assemblies. Teachers, administrators and peers often pressured them to sing Christian hymns and assert statements of Christian faith.

    From the 1950s through the 1980s, as Jewish parents argued for their children’s right to freedom from religious coercion, they also embellished Hanukkah. Suburban synagogues expanded their Hanukkah programming.

    As I detail in my book, Jewish families embellished domestic Hanukkah celebrations with decorations, nightly gifts and holiday parties to enhance Hanukkah’s impact. In suburbia, Hanukkah’s theme of dedication to Judaism shone with special meaning. Rabbinical associations, national Jewish clubs and advertisers of Hanukkah goods carried the ideas for expanded Hanukkah festivities nationwide.

    In the 21st century, Hanukkah accomplishes many tasks. Amid Christmas, it reminds Jews of Jewish dedication. Its domestic celebration enhances Jewish family life. In its similarity to Christmas domestic gift-giving, Hanukkah makes Judaism attractive to children and – according to my college students – relatable to Jews’ Christian neighbors. In many interfaith families, this shared festivity furthers domestic tranquility.

    In America, this minor festival has attained major significance.

    The Conversation

    Dianne Ashton received funding for her research on Hanukkah from the National Endowment for the Humanities, the Gilder Lehrman Institute, the American Jewish Archives, and Rowan University.



    Hanukkah demands fewer religious rituals than most other Jewish observances.Golden Pixels LLC

    When I was growing up in suburban New York, Hanukkah was not grounded in religious observance. Having no clue that there are traditional Hebrew blessings that accompany the kindling of the Hanukkah candles, we invented our own wishes, awkwardly voiced out loud, for happiness and peace.

    Then again, the festival of Hanukkah demands the performance of fewer religious rituals than most other Jewish observances. Even the most pious Jews do not take off from work during the eight-day festival. After all, the holiday is never mentioned in the Bible, since the events that it commemorates occurred hundreds of years after the Bible was written.

    Today, this minor festival of Hanukkah has become supersized into a Jewish version of Christmas – a time for family gatherings, gift-giving and festivity. And it is through pop culture that American Jews have carved out a holiday identity in which they can take pride.

    Hanukkah in America

    The true story of Hanukkah is of a conflict between two different groups of Jews – those who were eager to become part of the Hellenistic culture represented by the Syrian-Greeks, and a band of zealots called the Maccabees, who sought to maintain Jewish rites.

    Today, in the U.S., however, only 15 percent of American Jews view their Jewish identity as rooted in religion. And for many American Jews, aspects of Hanukkah that are most attractive tend to be those that mirror what many other Americans are doing at this time of year – such as celebrating Christmas.

    As some economists have pointed out, Hanukkah is the only Jewish holiday that is celebrated much more widely among American Jews who have children. Notably, Jews who live in Christian-majority areas end up spending more on Hanukkah gifts than those who reside in mostly Jewish neighborhoods. By contrast, Hanukkah in Israel is not as significant.

    Hanukkah in pop culture

    Nonetheless, American Jews have carved out a place for Hanukkah in pop culture.

    Seeing their own group depicted in pop culture has been an important source of pride for American Jews throughout the last century, as I observed in my book on Jewish vaudeville, theater and film.

    Jewish comedians over the last few decades have mined humor from the need that Jews have to feel that their minority identity is still a meaningful and salient one, even while poking gentle fun at Christmas.

    An example is comedian Jon Lovitz’s Hanukkah Harry, who premiered on “Saturday Night Live” in 1989.

    Hanukkah Harry.

    A gray-bearded, ultra-Orthodox Jewish character, Hanukkah Harry fills in for an ailing Santa to deliver presents on Christmas Eve – only to face disappointment from Christian children when they receive chocolate coins and dreidels, the Hanukkah spinning tops, which seem paltry and foreign to them.

    And another comedian, Adam Sandler, whose “Hanukkah Song” was first performed on “Saturday Night Live” in 1994, reminds Jews that they have their own holiday in which they can take pride. “When you feel like the only kid in town without a Christmas tree,” the song starts off, “here’s a list of people who are Jewish just like you and me,” and then provides a humorous list of celebrities who are at least partly Jewish in ancestry, from Kirk Douglas to Dinah Shore.

    Adam Sandler’s Hanukkah song.

    The song has been watched almost 5 million times on YouTube.

    Jewish role in secularizing Christmas

    Some scholars suggest that before making Hanukkah into an essentially non-religious celebration, Jews had already “secularized” Christmas.

    Music scholar David Lehman, for example, writes that Christmas “became a secular holiday” thanks to the efforts of composer Irving Berlin, a Russian Jewish immigrant whose “White Christmas” unified Americans during the Second World War. Its lyrics about “sleigh bells in the snow” appealed to common feelings of nostalgia toward hearth and home.

    Indeed, a new documentary from Canadian filmmaker Larry Weinstein also shows the role of Jewish songwriters in recreating Christmas as a secular holiday. The majority of iconic Christmas carols, from “The Christmas Song,” about chestnuts roasting on an open fire, to “Silver Bells,” were written by Jews. These songs de-emphasized the religious aspects of the holiday and turned it into a celebration of cold weather, family and simple pleasures.

    Even “Rudolph the Red-Nosed Reindeer” can be seen as a song about an outsider who, without losing what makes him distinct, manages to join the in-crowd, just as Jews themselves did in America.

    Connecting to other Jews

    In the end, the contemporary celebration of Hanukkah does not tend to hinge on the need to reclaim a distinctive religious practice. Instead, it centers on recapturing a sense of connection to other Jews.

    This Hanukkah, I will celebrate the holiday with my wife and children by lighting the menorah and chanting the Hebrew blessings – which I finally learned.

    The real highlight, however, will not be the religious aspects, which are pretty thin, but the gustatory pleasure of the thick, sizzling potato latkes, waiting to be covered with sour cream or apple sauce.

    The Conversation

    Ted Merwin does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.



    A sculpture of Meister Eckhart in Germany.Lothar Spurzem, CC BY-SA

    The percentage of Americans who do not identify with any religious tradition continues to rise annually. Not all of them, however, are atheists or agnostics. Many of these people believe in a higher power, if not organized religion, and their numbers too are steadily increasing.

    The history of organized religion is full of schisms, heresies and other breakaways. What is different at this time is a seemingly indiscriminate mixing of diverse religious traditions to form a personalized spirituality, often referred to as “cafeteria spirituality.” This involves picking and choosing the religious ideas one likes best.

    At the heart of this trend is the general conviction that all world religions share a fundamental, common basis, a belief known as “perennialism.” And this is where the unlikely figure of Meister Eckhart, a 14th-century Dominican friar famous for his popular sermons on the direct experience of God, is finding new appeal.

    Who was Meister Eckhart?

    I have studied Meister Eckhart and his ideas of mysticism. The creative power that people address as “God,” he explained, is already present within each individual and is best understood as the very force that infuses all living things.

    He believed this divinity to be genderless and completely “other” from humans, accessible not through images or words but through a direct encounter within each person.


    The method of direct access to the divine, according to Eckhart, depended on an individual letting go of all desires and images of God and becoming aware of the “divine spark” present within.

    Seven centuries ago, Eckhart embraced meditation and what is now called mindfulness. Although he never questioned any of the doctrines of the Catholic Church, Eckhart’s preaching eventually resulted in an official investigation and papal condemnation.

    Significantly, it was not Eckhart’s overall approach to experiencing God that his superiors criticized, but rather his decision to teach his wisdom. His inquisitors believed the “unlearned and simple people” were likely to misunderstand him. Eckhart, on the other hand, insisted that the proper role of a preacher was to preach.

    He died before his trial was complete, but his writings were subsequently censured by a papal decree.

    The modern rediscovery of Eckhart

    Meister Eckhart thereafter remained relatively little known until his rediscovery by German romantics in the 19th century.

    Since then, he has attracted many religious and non-religious admirers. Among the latter were the 20th-century philosophers Martin Heidegger and Jean-Paul Sartre, who were inspired by Eckhart’s beliefs about the self as the sole basis for action. More recently, Pope John Paul II and the current Dalai Lama have expressed admiration for Eckhart’s portrayal of the intimate relationship between God and the individual soul.

    During the second half of the 20th century, the overlap of his teachings with many Asian practices played an important role in making him popular with Western spiritual seekers. Thomas Merton, a monk of the Trappist order who began an exploration of Zen Buddhism later in his life, for example, discovered much of the same wisdom, embodied in Eckhart, within his own Catholic tradition. He called Eckhart “my life raft” for opening up the wisdom about developing one’s inner life.

    Richard Rohr, a friar from the Franciscan order and a contemporary spirituality writer, views Eckhart’s teachings as part of a long and ancient Christian contemplative tradition. Many in the past, not just monks and nuns, have sought the internal experience of the divine through contemplation.

    Among them, as Rohr notes, were the apostle Paul, the fifth-century theologian Augustine, and the 12th-century Benedictine abbess and composer Hildegard of Bingen.

    In the tradition of Eckhart, Rohr has popularized the teaching that Jesus’ death and resurrection represents an individual’s movement from a “false self” to a “true self.” In other words, after stripping away all of the constructed ego, Eckhart guides individuals in finding the divine spark, which is their true identity.

    Eckhart and contemporary perennials

    Novelist Aldous Huxley frequently cited Eckhart in his book “The Perennial Philosophy.”RV1864/Flickr.com, CC BY-NC-ND

    This subjective approach to experiencing the divine was also embraced by Aldous Huxley, best known for his 1932 dystopia, “Brave New World,” and for his later embrace of LSD as a path to self-awareness. Meister Eckhart is frequently cited in Huxley’s best-selling 1945 spiritual compendium, “The Perennial Philosophy.”

    More recently, the mega-best-selling New Age celebrity Eckhart Tolle, born Ulrich Tolle in 1948 in Germany and now based in Vancouver, has taken the perennial movement to a much larger audience. Tolle’s books, drawing from an eclectic mix of Western and Eastern philosophical and religious traditions, have sold millions. His teachings encapsulate the insights of his adopted namesake Meister Eckhart.

    While many Christian evangelicals are wary of Eckhart Tolle’s non-religious and unchurched approach, the teachings of the medieval mystic Eckhart have nonetheless found support among many contemporary Catholics and Protestants, both in North America and Europe.

    Fully understanding a new spiritual icon

    The cautionary note, however, concerns the risk of too simplistic an understanding of Eckhart’s message.

    Eckhart, for instance, did not preach an individualistic, isolated kind of personal enlightenment, nor did he reject as much of his own faith tradition as many modern “spiritual but not religious” seekers are wont to do.

    The truly enlightened person, Eckhart argued, naturally lives an active life of neighborly love, not isolation – an important social dimension sometimes lost today.

    Meister Eckhart has some important lessons for those of us trapped amid today’s materialism and selfishness, but understanding any spiritual guide – especially one as obscure as Eckhart – requires a deeper understanding of the context.

    The Conversation

    Joel Harrington does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.



    What might have motivated the young missionary killed on a remote island in India?AP Photo/Sarah Prince

    The recent killing of a 26-year-old U.S. missionary, John Allen Chau, on a remote island in India has raised many questions about global evangelical Protestant missions.

    Chau was on a personal mission to convert the Sentinelese, a protected tribe who have avoided contact with the rest of the world. Indian ships monitor the waters to stop outsiders from approaching them. Chau, however, is reported to have asked fishermen to take him illegally to the island where the Sentinelese live. The Sentinelese reportedly shot and killed him with arrows.

    As my research on missionaries shows, this often unwise haste to evangelize the world was the founding characteristic of evangelical missions in the late 19th century.

    Faith missions

    From the beginning of the 19th century, Protestants sent missionaries abroad under mission boards that required seminary education and full funding for prospective recruits. By the end of the 19th century, however, some mission leaders believed that the established missions were evangelizing the world at much too slow a pace.

    Evangelicals believe in a hell where the souls of those who don’t convert to Christianity will burn forever.

    Missionaries are motivated by Christ’s words in the “Great Commission” to “make disciples of all nations.” In these biblical verses, the risen Christ commands his disciples to go into all the world and preach the gospel. This command has motivated the missionary enterprise for centuries.

    These leaders founded what became known as “faith” missions to greatly expand the missionary force. As I write in my book, the new missions began sending out highly committed but lightly educated and ill-prepared missionaries. Many had not even finished high school. Just a bit of Bible training was considered enough.

    There were dozens of such missions by the early 20th century, each founded to Christianize a specific section of the globe, such as the China Inland Mission, the Sudan Interior Mission and the Central American Mission.

    Hundreds of young men and women, often with families, were sent overseas with little to no training in anything beyond the Bible and no promise of funding.

    While doing archival research for my book, I also found evidence of the conversations between the missionary leaders and their young recruits. For example, the first question that R. D. Smith, a leader of the Central American Mission, posed to new recruits was:

    “Do you understand that you are to trust the Lord for the supply of all your needs and not rely upon the Central American Mission or any other human agency?”

    He did not want new recruits to even have one year’s funding in place. “I am not exercising faith,” Smith groused, “if I have a year’s supply in hand.”

    In digging into records in the Townsend archives in Waxhaw, North Carolina, I also found that recruits were told simply to pray, and God would provide the necessary funds: hence the name “faith missions.”

    These records show that mission leaders believed such a plan would result in a more “spiritual” missionary force, as missionaries were forced to rely only on God for their needs. Cameron Townsend, a Central American Mission recruit, wrote in 1927: “We want entire dependence on God, and not on anyone’s bank account.”

    Missionary zeal

    A missionary, playing a small accordion, visits village women in the 1900s.Unknown photographer via Wikimedia Commons

    Hundreds of untrained, unfunded, largely unsupervised young people got on boats to Africa or Asia knowing little about where they were going and with no language training other than what they might pick up from the missionary in the cabin next door.

    That tragedy often resulted should come as no surprise. A. T. Pierson, an early promoter of faith missions, argued to the leaders of the Africa Inland Mission in the late 1890s, after all of the initial 16 recruits the mission sent to Africa either died or quit,

    “The hallmark of God on any work is death. God has given us that hallmark. Now is the time to go forward.”

    While no official estimates are available, the death of missionaries was accepted as a common occurrence. The board of directors of an evangelical mission, the Nazarene Missions International, once sent a letter to two of its sick missionaries saying “we presume you will already be dead,” to explain why no funds were sent for the month – perhaps referring to funds that came through family or friends.

    In this case, the missionaries recovered. But how long they survived afterwards is not clear.

    Linguistic training

    By the 1930s some mission leaders were recognizing that such a plan was untenable.

    The Wycliffe Bible Translators – the largest and most influential faith mission of the 20th century, founded by Cameron Townsend, who had earlier advocated dependence on God – began providing recruits with much more thorough training and support. This included linguistic training through its Summer Institute of Linguistics.

    The goal was to teach missionaries how to, first, learn languages that had never been written down and second, create a written language. The ultimate goal was to translate the Bible into every language in the world. By midcentury these linguistic training institutes were recognized on secular university campuses as ranking with the best in the world. My own parents spent 30 years with Wycliffe translating the Bible into the Mansaka language in the jungles of the Philippines, where I grew up.

    At the same time, many of the Bible institutes, which had sent many graduates to the faith missions, were becoming Bible colleges and universities, and missionary training was becoming more thorough and academic.

    The discipline of anthropology made inroads into evangelical colleges by the 1960s and dramatically changed missionary education.

    Today’s evangelizing

    Most evangelical missionaries today are at least college-educated, and many have graduate degrees in anthropology or linguistics. John Allen Chau was a college graduate who had briefly attended a Canadian linguistic institute affiliated with Wycliffe’s Summer Institute of Linguistics, but he was not trained to be a missionary. He had majored in sports medicine.

    And while evangelicals continue to send missionaries as educators or business people to places where proselytizing is not welcome, such as China, few established missions today would sponsor Chau’s amateurish and obviously illegal approach.

    Today’s missions understand that passionate faith like Chau’s is not enough.

    The Conversation

    William Svelmoe does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.



    Santa Claus will soon be coming to town, bringing gifts to children.

    Santa has several aliases, depending on the part of the world you live in. The English call him Father Christmas, the French know him as Père Noël, and Kris Kringle seems to be a version of the Christkind, or Christ Child, who leaves treats for good German Lutherans.

    In the Netherlands, he arrives in town on a steamboat or horse from Spain. On the night of Dec. 5, Dutch children put their shoes on the hearth – these days near the central heating duct – hoping that he will fill them with sweet rewards rather than a reprimand for poor behavior. The Dutch call him Sinterklaas – which has come into American English as ‘Santa Claus’ – short for Sint Nicolaas or St. Nicholas.

    St. Nicholas and Santa Claus are historically the same man. But unlike the jolly figure who purportedly flies on a sleigh from the North Pole, the saint came originally from the balmy Mediterranean coast.

    Who was St. Nicholas really?

    As a historian of religions who has written books about ancient saints, I caution against reading accounts of saints’ lives as factual history. However, the earliest stories of St. Nicholas seem to correlate with histories and church documents of the period.

    According to these early medieval texts, Nicholas was born around 260 A.D. into a Christian family. His birthplace was near the town of Myra, now called Demre, on the southwest coast of modern Turkey. At the time, Christianity was illegal under the Roman Empire.

    He studied to be a priest and spent time in prison for his beliefs. However, after Emperor Constantine legalized Christianity, Nicholas was elected Bishop of Myra.

    During his lifetime, he became famous for defending his people against imperial taxes and other forms of oppression. According to the earliest document about Nicholas, from the fifth century, he prevented three loyal generals from unjust execution for treason.

    A ninth-century Greek legend claims he revived three scholars who had been murdered and stashed in a pickling tub. He also saved three girls whose poverty-stricken father wanted to sell them into prostitution.

    After his death, people believed that Nicholas continued to work miracles. His burial place, below the floor of his church, became a popular destination for pilgrims who begged Nicholas to relay their petitions to God.

    Proof that Nicholas was listening, they believed, was in the “manna” – holy oil or water – that dripped from the tomb. Pilgrims took this manna home in little bottles or used rags to sop up the moisture from the saint’s tomb in its subterranean crypt. This was a common pilgrimage practice at Christian shrines.

    Visitors to the coastal town of Myra spread Nicholas’ fame along sea routes across the Mediterranean. From there, word passed to the Latin West, and upriver to Russia. Soon, pilgrims from all over Christendom were journeying to Myra to seek the gifts of protection and healing from the saint, who was said to be especially attentive to children.

    Italians steal the body

    This pilgrimage was disrupted in the 11th century when Seljuk Turks invaded Anatolia. Christians feared that the Muslims who now governed Demre would disregard the saint’s tomb. So, one crew of pious Italian Christians decided to take action.

    In 1087, three ships laden with grain set out from Bari, on Italy’s southeast coast, bound for Antioch. However, according to a monk named Nicephorus who wrote immediately after the event, their real mission was to steal St. Nicholas’ body.

    In Antioch they heard a rumor that the Venetians too were planning a similar heist. The Barian sailors hastily sold off their grain and headed for Myra in search of St. Nicholas’ church. Priestly custodians there became suspicious when the sailors asked to see the saint’s body.

    The Barians claimed that the pope had a vision directing him to fetch Nicholas to Italy. When the priests refused, the sailors offered gold for the relics, but the offer “was tossed aside like dung.” Done with arguing, the Barians caught and bound the priests. Suddenly, a phial of manna fell to the pavement and broke. It seemed that St. Nicholas spoke to them: “It is my will that I leave here with you.”

    So, the Barians broke through the marble floor with picks and hammers. A delicious aroma filled the church as they opened the tomb. They found the bones swimming in a small sea of manna. They carefully wrapped the relics in a silk case brought for the purpose.

    Nicephorus describes how they fled to their ship, pursued by outraged priests and a howling crowd of citizens demanding that they “give back the father who has by his protection kept us safe from visible foes.”

    Yet the crew made it back to the harbor at Bari, where the townsfolk and clergy processed, singing joyous hymns, to greet the saint.

    St. Nicholas gets a reputation

    A new church was built for Nicholas in the court of the governor of Bari. A few years later, Pope Urban II – the one who would preach the First Crusade – formally enshrined the relics of the saint.

    A view of the interior of the church of St. Nicholas, built in the 11th century, at Bari.AP Photo

    The Barians believed that manna continued to ooze from Nicholas’ coffin. And going by the claim on the basilica’s website, the belief persists to this day.

    Within a decade of the saint’s arrival in 1087, the Basilica di San Nicola was one of Europe’s most popular pilgrimage destinations. May 9 is still celebrated as the day that Nicholas moved shrines or was “translated.”

    For at least five centuries, the region that includes Bari and its saint was caught in constant wars for possession of southern Italy. In 1500, Bari fell into the hands of King Ferdinand of Aragon, whose marriage to Queen Isabella of Spain created a global naval power.

    Because Nicholas was a patron saint of sailors, Spanish sailors and explorers carried stories of the saint wherever they went: Mexico, the Caribbean, Florida and other ports around the world.

    St. Nicholas around the world: Russian Orthodox believers line up to kiss the relics of St. Nicholas that were brought from an Italian church where they have lain for 930 years.AP Photo/Alexander Zemlianichenko

    Even the Dutch, who rebelled against Catholic Spain and formed a Calvinist republic in 1581, somehow maintained their devotion to Sinterklaas. In other parts of Europe, St. Nicholas lost his feast day, but his concern for children helped link him to the gift-giving tradition of another December feast day: Christmas.

    How true is this story?

    In the 1950s, Italian scientists examined the bones enshrined in the Basilica di San Nicola, seeking evidence of authenticity.

    They found the skull and incomplete skeleton of a man, dating to around the fourth century. More recent technology has allowed experts to use the bones to reconstruct Nicholas’ face – he looks like an old Greek man with a broad, worn face. He lacks the rosy cheeks and Anglo-Germanic features of modern Christmas decorations, but like the Santa Claus of greeting cards, he was probably bald.

    Turkish archaeologists now claim that the Italians stole the wrong body and that Nicholas’ remains never left Demre. They have discovered another sarcophagus dating to the fourth century in the same church, which they claim contains the saint.

    Meanwhile, historians have suggested that the story of Nicholas’ translation is a fiction purposely created to advertise a new pilgrimage center in the 11th century. Although relic theft was common in the Middle Ages, grave-robbers often made mistakes or lied about the authenticity and source of their bones. Nothing in the shrine at Bari proves that the bones inside belong to the fourth-century Bishop Nicholas.

    Which Santa story will you tell this holiday season?Delta News Hub/Flickr.com, CC BY

    Still, this holiday season, when you tell your children about Santa Claus, why not include the tale of Santa’s well-traveled bones? And don’t forget the manna, which is believed to still flow in Bari.

    The Conversation

    Lisa Bitel does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.



    The Blackfeet always faced their tipis towards the rising sun, including on winter solstice.Beinecke Library via Wikimedia Commons, CC BY

    On the day of winter solstice, many Native American communities will hold religious ceremonies or community events.

    The winter solstice is the day of the year when the Northern Hemisphere has the fewest hours of sunlight and the Southern Hemisphere has the most. For indigenous peoples, it has been a time to honor their ancient sun deity. They passed their knowledge down to successive generations through complex stories and ritual practices.

    As a scholar of environmental and Native American religion, I believe there is much to learn from ancient religious practices.

    Ancient architecture

    For decades, scholars have studied the astronomical observations that ancient indigenous people made and sought to understand their meaning.

    One such place was Cahokia, near the Mississippi River in what is now Illinois, across from St. Louis.

    The Cahokia mounds.Doug Kerr, CC BY-SA

    In Cahokia, over a thousand years ago, indigenous people built numerous temple pyramids or mounds, similar to the structures built by the Aztecs in Mexico. Among their constructions, what most stands out is an intriguing structure made up of wooden posts arranged in a circle, known today as “Woodhenge.”

    To understand the purpose of Woodhenge, scientists watched the sun rise from this structure on winter solstice. What they found was telling: The sun aligned with both Woodhenge and the top of a temple mound – a temple built on top of a pyramid with a flat top – in the distance. They also found that the sun aligns with a different temple mound on summer solstice.

    Archaeological evidence suggests that the people of Cahokia venerated the sun as a deity. Scholars believe that ancient indigenous societies observed the solar system carefully and wove that knowledge into their architecture.

    Clip from ‘Cahokia’s Celestial Calendar (Woodhenge)’ episode of PBS’ ‘Native America.’

    Scientists have speculated that the people of Cahokia held rituals to honor the sun as a giver of life and for the new agricultural year.

    Complex understandings

    Zuni Pueblo, in western New Mexico, is a contemporary example of an indigenous agricultural society. They grow corn, beans, squash, sunflowers and more. They hold annual harvest festivals and numerous religious ceremonies, including at the winter solstice.

    At the time of the winter solstice they hold a multiday celebration, known as the Shalako festival. The days for the celebration are selected by the religious leaders. The Zuni are intensely private, and most events are not for public viewing.

    But what is shared with the public is near the end of the ceremony, when six Zuni men dress up and embody the spirit of giant bird deities. These men carry the Zuni prayers for rain “to all the corners of the earth.” The Zuni deities are believed to provide “blessings” and “balance” for the coming seasons and agricultural year.

    As religion scholar Tisa Wenger writes, “The Zuni believe their ceremonies are necessary not just for the well-being of the tribe but for ‘the entire world.’”

    Winter games

    Not all indigenous peoples ritualized the winter solstice with a ceremony. But that doesn’t mean they didn’t find other ways to celebrate.

    The Blackfeet tribe in Montana, of which I am a member, historically kept a calendar of astronomical events. They marked the time of the winter solstice and the “return” of the sun, or “Naatosi,” on its annual journey. They also faced their tipis – portable conical tents – east toward the rising sun.

    They rarely held large religious gatherings in the winter. Instead the Blackfeet viewed the time of the winter solstice as a time for games and community dances. As a child, my grandmother enjoyed attending community dances at the time of the winter solstice. She remembered that each community held their own gatherings, with unique drumming, singing and dance styles.

    Later, in my own research, I learned that the Blackfeet moved their dances and ceremonies during the early reservation years from times on their religious calendar to times acceptable to the U.S. government. The dances held at the time of the solstice were moved to Christmas Day or to New Year’s Eve.

    The solstice.Divad, via Wikimedia Commons

    Today, my family still spends the darkest days of winter playing card games and attending the local community dances, much like my grandmother did.

    Although some winter solstice traditions have changed over time, they are still a reminder of indigenous peoples’ understanding of the intricate workings of the solar system – and, as the Zuni Pueblo’s rituals for all peoples of the earth demonstrate, of an ancient understanding of the interconnectedness of the world.

    The Conversation

    Rosalyn R. LaPier does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.


    Who are Yemen's Houthis?

    Supporters of Shiite Houthi rebels attend a rally in Sanaa, Yemen, in 2017.AP Photo/Hani Mohammed, File

    Fully half of Yemen’s population – 14 million people – are on the brink of starvation. Some analysts blame their inability to access basic foodstuffs on the escalating conflict between two religious factions: the country’s Sunni Muslims and its Houthis. The Houthis belong to the Shiite branch of Islam.

    Saudi Arabia, which shares a border with Yemen and is predominantly Sunni, has been helping Yemen’s government forces try to regain control over Houthi-held parts of the country. For several weeks, a Saudi-led coalition has unleashed near-continuous airstrikes on Houthi strongholds, including access points for the majority of humanitarian aid coming into the country.

    What are the Houthis’ religious beliefs?

    Roots of Houthi movement

    Just as the Protestant tradition is subdivided into Methodists, Presbyterians, Congregationalists and others, Shiite Islam is also subdivided. Houthis belong to the Zaydi branch.

    From the ninth century onward, or for a thousand years, a state ruled by Zaydi religious leaders and politicians existed in northern Yemen. Then, in 1962, Egyptian-trained Yemeni military officers toppled the Zaydi monarchy and replaced it with a republic. Because of their ties to the ancient regime, Zaydis were perceived as a threat to the new government and were subjected to severe repression.

    Nearly three decades later, in 1990, the region known as south Yemen merged with north Yemen to become the Republic of Yemen. Zaydis remained a majority in the north and west of the country, and also in the capital city of Sanaa. However, in terms of the overall population, they became a minority.

    According to a 2010 CIA estimate, 65 percent of Yemen’s people are Sunnis and 35 percent are Shiites. The majority of those Shiites are Zaydis. Jews, Bahais, Hindus and Christians make up less than 1 percent of inhabitants, many of whom are refugees or temporary foreign residents.

    Yemen: 2015 Civil War map. The section in green is controlled by the Houthis.0ali1, via Wikimedia Commons, CC BY-SA

    To reduce the dominance of Zaydis in the north, government authorities encouraged Muslims belonging to two Sunni branches with links to Saudi Arabia – Salafis and Wahhabis – to settle in the heart of the Zaydis’ traditional territories.

    Start of Houthi insurgency

    Contributing to this trend, in the early 1990s, a Yemeni cleric founded a teaching institute in the Zaydis’ heartland. This cleric, educated in Saudi Arabia, developed a version of Salafi Islam.

    His institute proselytized with the goal of reforming Muslims through conversion. It educated thousands of Yemeni students and, in less than three decades, the new religious group grew large enough to compete with older groups such as the Zaydis.

    According to scholar Charles Schmitz, the Houthi insurgency began in the early 1990s, spurred, in part, by Zaydi resistance to growing Salafi and Wahhabi influence in the north.

    Hussein Badreddin al-Houthi, son of a prominent Zaydi cleric, gave the grassroots movement its name. He coalesced support among his followers around a narrative of Houthis as defenders and revivers of Zaydi religion and culture.

    Sunni vs. Zaydi Shiite beliefs

    What beliefs set Zaydis apart from Sunni Muslims? That is an old story, dating back to the seventh century when the Prophet Muhammad died.

    Shiites and Sunnis disagree about who should have been selected to succeed Muhammad as head of the Muslim community. Two groups emerged after his death. One group of the Prophet’s followers – later called Sunnis – recognized four of his companions as “rightly guided” leaders. In contrast, another group – later called Shiites – recognized only Ali, the fourth of these leaders, as legitimate.

    Ali was the Prophet’s first cousin and closest male blood relative. He was also married to Fatima, Muhammad’s youngest daughter. For these and other reasons, Shiites believe that Ali was uniquely qualified to lead. In support of this claim, they cite sources describing Muhammad’s wish that Ali succeed him. Shiites consider Ali second in importance only to the Prophet.

    Over time, further divisions took place. Allegiances to different descendants of Ali and his two sons, Hassan and Hussein, split Shiites into sub-branches. A grandson of Hussein called Zayd gave the Zaydis their name. To them, he is the fifth imam after Muhammad, giving the Zaydis their other name: “Fivers.”

    The family genealogy of the Zaydi Shiites’ first five imams.The Conversation, CC BY-ND

    Zayd earned the respect of his followers when he rose up against the powerful Muslim rulers of his time, whom he believed to be tyrannical and corrupt. Though his rebellion was ill-fated, his fight against oppression and injustice inspires Zaydis to actively resist.

    A key Zaydi belief is that only blood relatives of Ali and Fatima are eligible to serve as religious leaders, or imams. In Yemen, these relatives form a notable class of people called Sada. Hussein al-Houthi, the first leader of the Houthis, came from a prestigious clan of Sada.

    Impact of sectarian differences

    Not all Zaydis have a favorable view of Sada elites. When north and south Yemen merged in 1990, the republican government, led by a Zaydi president, sought to reduce their outsized influence and limit their privileges.

    Some members of the Sada reacted to the country’s changing political landscape by joining electoral politics to secure honor and exercise power. This path was initially followed by Hussein al-Houthi but, after he decided it was ineffective, he abandoned it.

    Other members of the Sada, particularly the youth, reacted by pledging to teach and promote Zaydism among their peers who had forgotten their ancestors’ religion. To accomplish this, they founded the Believing Youth organization and set up a cultural education program based on a network of summer camps in the north. Hussein al-Houthi joined this organization in the early 2000s and later transformed it into a political movement critical of the Yemeni government’s ties to the West.

    Security forces sent to arrest Hussein al-Houthi touched off the first war with the Houthis. Hussein was killed during the conflict and leadership passed to Hussein’s father and then to Hussein’s youngest brother, Abdul-Malik Badreddin al-Houthi. Abdul-Malik helped transform the Houthi movement into a powerful fighting force.

    Five additional wars followed over the next six years until, in 2010, the rebels had grown strong enough to repel a ground and aerial offensive launched against them by Saudi Arabia. During these wars, the Houthis pushed beyond their traditional base and captured vast sections of territory.

    Yemeni women and children at a camp in north Yemen.IRIN Photos/Flickr.com, CC BY-NC-ND

    Many Yemenis, according to one expert, believe that the Houthis are fighting to restore a state like the one prior to 1962, led by imams who came exclusively from Sada families.

    Complex factors today

    Houthis continue to focus on protecting the Zaydi region of north Yemen from state control. However, they have also forged coalitions with other groups – some of them Sunni – unhappy with Yemen’s persistent high unemployment and corruption.

    A 2015 U.N. Security Council report estimates that the Houthi movement includes 75,000 armed fighters. However, if unarmed loyalists are taken into account, they could number between 100,000 and 120,000.

    Sectarian tension is only one of a complex set of interlocking factors responsible for violence and starvation in Yemen. But it is, without a doubt, a contributing factor.

    The Conversation

    Myriam Renaud is affiliated with the Parliament of the World's Religions.



    A painting showing Saint Francis Borgia, a 16th-century saint, performing an exorcism.Francisco Goya

    “The Exorcist,” a horror film released 45 years ago, is a terrifying depiction of supernatural evil. The film tells the story of a young American girl who is possessed by a demon and eventually exorcised by a Catholic priest.

    Many viewers were drawn in by the film’s portrayal of exorcism in Christianity. As a scholar of Christian theology, my own research into the history of Christian exorcisms reveals how the notion of engaging in battle against demons has been an important way that Christians have understood their faith and the world.

    Early and medieval Christianity

    The Bible’s account of the life of Jesus features several exorcism stories. The Gospels, reflecting views common in Judaism in the first century A.D., portray demons as spirits opposed to God that haunt, possess or tempt people to evil.

    Exorcism by St. Exupere, Bishop of Toulouse, France, at the beginning of the fifth century.Philippe Alès, CC BY-SA

    Possessed individuals are depicted as displaying bizarre and erratic behaviors. In the Gospel of Luke, for example, a boy is possessed by a demon that makes him foam at the mouth and experience violent spasms. Jesus is shown to have a unique power to cast out demons and promises that his followers can do the same.

    In the centuries that followed, accounts of using Jesus’ name for casting out demons are common. Origen, an early Christian theologian, writing in the second century, explains how the name of Jesus is used by Christians to expel “evil spirits from … souls and bodies.”

    Over the years exorcism came to be associated more widely with the Christian faith. Several Christian writers mention exorcisms taking place publicly as a way to convince people to become Christians. They argued that people should convert because the exorcisms Christians performed were more effective than those of “pagans.”

    Early Christian texts mention various exorcism methods that Christians used, including making the sign of the cross over possessed persons or even breathing on them.

    Minor exorcism

    Beginning sometime in the early Middle Ages, specific priests were uniquely trained and sanctioned for exorcism. This remains the case today in Roman Catholicism, while Eastern Orthodox traditions allow all priests to perform exorcisms.

    Early Christians also practiced what is sometimes called a “minor exorcism.” This type of exorcism is not for those considered to be acutely possessed.

    This took place before or during the ritual of baptism, a ceremony whereby someone officially joins the Church. The practice emerges from the assumption that all people are generally susceptible to evil spiritual forces. For this reason, some sort of prayer or statement against the power of the devil would often be recited during catechesis – a period of preparation prior to baptism – during baptism itself, or both.

    Demons and Protestants

    Between the 15th and 17th centuries, there was an increased concern about demons in Western Europe. From this time period there are abundant accounts of priests exorcising not only individuals but also animals, inanimate objects and even land.

    A woodcut from 1598 shows an exorcism performed on a woman by a priest and his assistant, with a demon emerging from her mouth.Pierre Boaistuau, et al., Histoires prodigieuses et memorables, extraictes de plusieurs fameux autheurs, Grecs, & Latins, sacrez & prophanes (Paris, 1598), vol. 1.

    The narratives are also much more detailed. When someone possessed by a demon was confronted by an exorcist priest, it was believed that the demon would be aggravated and cause the individual to engage in more intense and violent behavior. There are reports of physical altercations, floating around the room, and speaking or screaming loudly and angrily during the exorcism process.

    Protestants, who were skeptical of many Catholic rituals, combated demonic possession with more informal practices such as impromptu prayer for the afflicted individual.

    During the Enlightenment, between the 17th and 19th centuries, Europeans began to cast doubt on so-called “superstitious” elements of religion. Many intellectuals and even church leaders argued that people’s experiences of demons could be explained away by psychology and other sciences. Exorcism began to be viewed by many as unnecessary or even dangerous.

    Exorcism today

    Many Christian denominations still practice some form of minor exorcism. Before people are baptized in the Episcopal Church, for example, they are asked: “Do you renounce Satan and all the spiritual forces of wickedness that rebel against God?”

    Exorcism is practiced by Christians across the world.Lutsenko_Oleksandr/Shutterstock.com

    The Catholic Church still has an active ministry devoted to performing exorcisms of possessed individuals. The current practice includes safeguards that require, among other things, that persons suspected of being possessed undergo medical and psychiatric evaluation before an exorcism takes place.

    Exorcism is particularly common in Pentecostalism, a form of Christianity that has grown rapidly in recent decades. This branch of Christianity emphasizes spiritual experience in everyday life. Pentecostals practice something akin to exorcism, typically called “deliverance.” Pentecostals maintain that possessed persons can be delivered through prayer by other Christians or a recognized spiritual leader. Pentecostalism is an international Christian tradition, and specific deliverance practices can vary widely around the world.

    In the United States belief in demons remains high. Over half of all Americans believe that demons can possess individuals.

    So, despite modern-day skepticism, exorcism remains a common practice of Christians around the world.

    The Conversation

    S. Kyle Johnson does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.



    Catnip induces changes in cat behavior.Anna Hoychuk

    As the Christmas season gets underway, thoughts turn to buying gifts for the entire family. For some, companion animals are on the gift list, particularly the cats and dogs who share our homes and hearts.

    Whether they’ve been naughty or nice matters not, as the more than US$1 billion pet toy industry has everything from the whimsical to the practical to keep Fluffy and Spot occupied and their caretakers entertained. Many of the go-to items for cats contain catnip.

    This herb, which goes by the botanical name of Nepeta cataria, induces changes in cat behavior. In my view, it’s worth considering whether giving a mood-altering substance to a pet is ethical.

    Kitty crack?

    Catnip is sold in small packets and toys as well as in highly concentrated forms such as oils and sprays. These concentrated forms differ from what cats encounter in nature: in the wild, catnip takes the form of leafy greens growing on plants, not a concentrate.

    Not all cats are affected by the drug, but for some it can have a five- to 15-minute marijuana-like effect.

    About 30 percent do not respond at all – which means 70 percent do – and it doesn’t have an impact on kittens until they are about 6 months old, the age at which they attain sexual maturity.

    When under the influence, some cats roll around, salivate, and at times, fight with other cats. It is not clear if there are any medicinal benefits. Cat owners often laugh at this behavior of their feline friends as being “high.”

    This video explains how catnip works.

    Babes and beasts

    As an animal media studies scholar, I argue that laughing at a cat who has been given a drug – even if the cat seems happy – should raise questions about human power and animal autonomy.

    Several philosophers have made an argument for giving the same moral consideration to animals as we would give to humans. Philosopher Jan Narveson, for example, asked, in the context of eating meat, whether animals suffer and whether that is sufficient reason not to eat them.

    One animal ethics theory denies moral standing to other animals, stating they lack characteristics that only humans are thought to possess, such as rationality, autonomy and consciousness. But another theory of moral equality argues that there are parallels in mental capabilities between humans and other animals and that moral consideration should not be limited to only our own species.

    Philosopher Peter Singer calls for “equal consideration of interests.” Singer argues that we should not use our species as a measure of the worth or abilities of others, or their worthiness of ethical consideration. Other philosophers, too, have argued that simply because dogs or other animals don’t have the same vocal structure as humans doesn’t mean they should be treated with less compassion.

    Furthermore, humans share many traits – empathy, ability to communicate, eating habits, sociability – with other species. For example, the capacity to love one’s young, the need to have food, water and to spend time with others of one’s own species are not exclusively human traits. According to philosopher Julia Tanner, “It would be arbitrary to deny animals with similar capacities a similar level of moral consideration.”

So, if it is unethical to drug a child and laugh at how he or she responds, should we unthinkingly do the same with our cats?

    Consider animal ethics

The question of whether giving catnip is ethical has been an ongoing discussion on social media and other websites.

    Should you reconsider giving catnip to your cat?Tanya Plonka

On Reddit, for example, one person commented, “think of it as your cat going out for a few beers after work.” To that, another reader from an Alcoholics Anonymous family responded by asking whether it was ethical to give someone a drug in an otherwise substance-free home.

    I asked the nonprofit People for the Ethical Treatment of Animals where they stand on this issue. Media Officer Sophia Charchuk responded:

    “PETA is all for treating cat companions to reasonable amounts of high-quality catnip – and for keeping them indoors, where they’ll be safe from cars, contagious diseases, predators, and cruel humans and able to enjoy toys (including those filled with catnip) for years to come.”

    However, my point here is not only about whether cats feel pleasure or pain. It’s about taking responsibility for our actions towards our pets and giving them the same moral consideration as we do to humans.

We rarely notice how advertising, television programs, movies and photographs often present a one-dimensional view of animals, using them to say something about us but very little about them. Wolves, for example, are widely shown in advertising and film as intent solely on harming us, rather than as the complex, multidimensional pack animals that they are.

    This has an impact on how we view animals. I agree with scholars who have pointed out that we need to view animals as subjects of their own lives rather than objects in ours. I believe we need to reconsider the ethics of “catnipping” them.

    The Conversation

    Debra Merskin does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.


    0 0

    More and more Americans are choosing to be single.mimagephotography/Shutterstock.com

As the holidays transition to the New Year, singles may face questions from friends and family: “When are you getting serious about dating?”

In many families, seasonal festivities draw lines between who’s coupled and who’s not. Romantic partners are invited to holiday meals, included in family photographs and seen as potential life mates, while “mere” friends are not. These practices mark a divide between relationships seen as significant and those that aren’t.

    As I’ve argued in my research on the ethics and politics of the family, these practices reflect widespread assumptions. One is that everyone is seeking a romantic relationship. The second is more value-laden: living in a long-term romantic, sexual partnership is better than living without one. This fuels beliefs that those living solo are less happy, or lonelier, than couples.

    These assumptions are so prevalent that they guide many social interactions. But research shows they’re false.

    Why more Americans are living single

The truth is that more Americans are living unmarried and without a romantic partner. In 2005, the census for the first time recorded a majority of women living outside of marriage, although, of course, some unmarried women have romantic partners.

    By 2010, married couples became a minority in the United States. The percentage of unmarried adults is at an all-time high, with more young adults choosing to live unmarried and without a romantic partner.

Personal finances likely play a role in such choices. Millennials are worse off than earlier generations. There is a proven connection between economic resources and marriage rates – what legal scholar Linda McClain calls “the other marriage equality problem.” Lower incomes correlate with lower rates of marriage.

But changing family patterns are not simply the result of financial instability. They reflect choices: Not everyone wants a romantic partnership, and many singles see solo life as more conducive to flourishing and autonomy.

    Single by choice

    Many people enjoy being single.Jelena Danilovic/Shutterstock.com

    As I show in my book “Minimizing Marriage,” people have many different political or ethical reasons for preferring singlehood.

    Some women become single mothers by choice. As sociologist Arlie Hochschild has argued, marriage brings extra work for women, making it less attractive than single life for some.

    For other people, being single is simply a relationship preference or even an orientation. For example, there are those, referred to as “asexuals” and “aromantics,” who lack interest in sexual and romantic relationships.

    Who are asexuals and aromantics?

    Data from a 1994 British survey of more than 18,000 people showed 1 percent of the respondents to be asexual. Because asexuality is still little-known, some asexuals might not identify as such. And so, it’s possible that the true numbers could be higher.

    Asexuals are people who do not feel sexual attraction. Asexuality is not simply the behavior of abstaining from sex, but an orientation. Just as straight people feel sexual attraction to members of a different sex, and gays and lesbians feel attraction to members of the same sex, asexuals simply do not feel sexual attraction. Asexuals can have romantic feelings, wanting a life partner to share intimate moments with and even cuddle – but without sexual feelings.

    But some asexuals are also aromantic, that is, not interested in romantic relationships. Like asexuality, aromanticism is an orientation. Aromantics may have sexual feelings or be asexual, but they do not have romantic feelings. Both asexuals and aromantics face a lack of understanding.

Angela Chen, a journalist writing a book about asexuality, reports that her asexual interview subjects suffered from a lack of information about asexuality. When they failed to develop sexual attraction during puberty while their classmates did, they asked themselves, “Am I normal? Is something wrong with me?”

    But while asexuality is sometimes misunderstood as a medical disorder, there are many differences between an asexual orientation and a medical disorder causing a low sex drive. When asexuals are treated as “abnormal” by doctors or therapists, it does them a disservice.

    Since the early 2000s, asexuals have exchanged ideas and organized through online groups. One such group, The Asexual Visibility and Education Network, for example, promotes the understanding that lack of sexual attraction is normal for asexuals, and lack of romantic feelings is normal for aromantics.

    Asexuals, like aromantics, challenge the expectation that everyone wants a romantic, sexual partnership. They don’t. Nor do they believe that they would be better off with one.

    Single and alone – or lonely?

    Far from the stereotype of the lonely single, lifelong singles are less lonely than other older people, according to psychologist Bella DePaulo, the author of “Singled Out.” Nor are singles alone.

    Many singles have close friendships which are just as valuable as romantic partnerships. But assumptions that friendships are less significant than romantic partnerships hide their value.

    Understanding the reasons people have for remaining single might help to handle family stresses. If you’re single, you could take unwanted questioning as a teachable moment. If you’re the friend or family member of someone who tells you they’re happily single – believe them.

    The Conversation

    Elizabeth Brake has received funding from the Social Sciences and Humanities Research Council (Canada).


    0 0

    A woman holds up a quilt with photos of people who say they were abused as children by priests, in San Diego, 2007.AP Photo/Denis Poroy

    Pope Francis started the new year criticizing some Catholic bishops for their role in the church’s sexual abuse crisis. In a letter to bishops gathered at Mundelein Seminary in Illinois for a spiritual retreat, the pope said that the “disparaging, discrediting, playing the victim” had greatly undermined the Catholic Church. This followed the pope’s earlier remarks asking clergy guilty of sexual assault to turn themselves over to law enforcement.

Stories of clergy sex abuse have continued to mount. Among the more recent revelations, a Catholic diocese released the names of Jesuit priests who face “credible or established” accusations of abuse of minors. Church members learned that many priests accused of sexual abuse on Indian reservations had been retired to the Gonzaga University campus in Spokane. And another external investigation revealed that the Catholic Church failed to disclose abuse accusations against 500 priests and clergy.

Church attendance has been on the decline for some time, with the steepest fall, to an average of 45 percent, between 2005 and 2008. And with these latest scandals, as a theologian recently wrote, the Catholic Church is in the midst of its “biggest crisis since the Reformation.”

    But what many do not realize is that staying in the church does not mean agreeing with its policies. In the past, Catholics have challenged the church through multiple forms of resistance – at times discreet and at other times quite dramatic.

    Pacifist protesters

    I had already begun my training as a scholar of religion and society when I learned that the priest from whom I took my first communion was a known predator in the Boston Archdiocese. I have since then researched and written about the Catholic clergy abuse cover-up.

Back in the 1960s, some radical American Catholics were at the forefront of challenging U.S. involvement in the war in Vietnam. Perhaps the most famous among them were the Berrigan brothers. Rev. Daniel Berrigan, the older brother, was an American Jesuit priest who, along with other religious leaders, expressed public concern over the war.

    Daniel Berrigan marching with about 40 others outside of the Riverside Research Center in New York.AP Photo/Marty Lederhandler, File

    In New York, Daniel Berrigan joined hands with a group called the Catholic Workers, in order to build a “decent non-violent society” – what they called “a society of conscience.” Among their protests was a public burning of draft cards in Union Square in 1965.

Months earlier, the U.S. Congress had passed legislation making the mutilation of draft cards a felony. A powerful commentary by the editors of the Catholic “Commonweal” magazine described the event as a “liturgical ceremony” backed by a willingness to risk five years of freedom.

    But some in the Catholic leadership were concerned that Daniel Berrigan’s peace activism was going too far. Soon after another Catholic protester set himself on fire in front of the United Nations in an act of protest, Berrigan disappeared from New York. He’d been sent to Latin America on an “assignment” by his superiors.

The word among Catholics was that Cardinal Francis Spellman had Berrigan expelled from the U.S. The accuracy of this account is disputed. However, the narrative had great power. The public outcry among Catholics was immense. University students took to the streets.

    The New York Times carried a vehement objection that was signed by more than a thousand Catholic practitioners and theological leaders. The repression of free speech, they said, was “intolerable in the Roman Catholic Church.”

    Catholic symbols of protest

In May 1967, Berrigan returned to the United States, only to renew his protest against the draft. He and his brother Philip broke into a draft board office in Baltimore and poured vials of their own blood on paper records.

    A 1973 photo shows Rev. Daniel Berrigan and others participating in a fast and vigil to protest the bombing in Cambodia, on the steps of St. Patrick’s Cathedral in New York City.AP Photo/Ron Frehm

In pouring vials of their own blood on draft records, they were extending the Catholic symbolism of Christ’s sacrificial blood to the cause of peace.

The next year they were joined by seven other Catholic protesters in a protest action in Catonsville, Maryland. The group used homemade napalm to destroy 378 draft files in the parking lot of a draft board. Daniel Berrigan was put on the FBI’s most wanted list. Both brothers later served time in federal prisons.

After the Vietnam War, their protests continued under a group called Plowshares. The name came from the injunction in the book of Isaiah to “beat swords into plowshares.” The Berrigan brothers put their energy into anti-nuclear protests around the country. At a nuclear missile facility in King of Prussia, Pennsylvania, they hammered on nuclear warheads and once again poured their own blood upon them, joining Catholic symbols to political protest.

    Church leadership, they said, was too cozy with a heavily militarized America.

    Protests inside the church

Around the same time, another group of Roman Catholics was challenging the leadership of the church using different tactics. In 1969, a group of Chicano Catholic student activists that called itself Católicos Por La Raza objected to the money that the Archdiocese of Los Angeles was spending on building a new cathedral, St. Basil’s. They believed the money could be better spent on improving the social and economic conditions of Catholic Mexican-Americans.

Católicos Por La Raza presented the Catholic Church with a list of demands that included the use of church facilities for community work, providing housing and educational assistance, and developing health care programs.

    On Christmas Eve, 300 people marched to protest at St. Basil’s. Outside, they chanted “Que viva la raza” and “Catholics for the people.” Some members also planned to bring the protest across the threshold of the cathedral and into the Christmas Eve Mass.

    The church locked its front doors. The marchers were met at side doors by undercover county sheriffs.

    Later, the protesters publicly burned their baptismal certificates. Catholic teaching maintains that, once baptized, Catholic identity cannot be divested. By burning these symbols of Roman Catholic belonging, members of Católicos Por La Raza were making a powerful statement of their renunciation of the religion that they perceived could not be reformed.

    A priest steps over a protester, who deliberately fell to the floor in front of him as the priest was giving communion at St. Patrick’s Cathedral in New York in 1989.AP Photo/Mario Suriani

    Back in New York, a generation later, Catholics also organized confrontations with Church leadership. At the height of the AIDS crisis, in 1989, the American Catholic Bishops drafted an explicit condemnation of the use of condoms to stop the spread of the AIDS virus. “The truth is not in condoms or clean needles,” said Cardinal John O'Connor. “These are lies … good morality is good medicine.”

    In response, AIDS activists organized an action called “Stop the Church” to protest against the “murderous AIDS policy” at St. Patrick’s Cathedral in Manhattan. Thousands of people gathered to protest. Outside, activists distributed condoms and safer-sex information to passers-by. Inside, some protesters staged a die-in.

    And this does not even get into waves of protests over women’s ordination since 1976.

In all these protests, Roman Catholics were insisting that powerful members of the hierarchy answer their demands about the ethics of the church.

    Bringing change in the church

    Catholics have challenged the church through multiple forms of resistance.AP Photo/Patrick Semansky

    Similar resistance continued in 2002, when the Boston Globe Spotlight investigation team exposed the systematic cover-up of child sexual abuse in the Boston Archdiocese, under Cardinal Bernard Law.

On Sundays, Catholics came out to protest in front of the Cathedral of the Holy Cross in Boston, where the cardinal said Mass. They shouted and held up signs calling for his resignation. Other Catholics created pressure to have the cardinal removed by cutting off lay financial support for the Archdiocese.

They encouraged continued giving to the poor or to the local parish. But until the cardinal was held accountable, those in the pews were encouraged to abstain from institutional giving. Before the next New Year, enough financial and legal pressure had built that Cardinal Law was removed from the Archdiocese.

February 2019 will bring a crucial meeting between the pope and the cardinals. Catholics today could well ask what their own way of showing resistance might be. After all, there is a rich Catholic heritage showing that members of the church who put their bodies on the line can make a difference.

    The Conversation

    Mara Willard does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.


    0 0

    Elizabeth Ann Seton shrine.Pam Broviak, CC BY-NC-ND

    January marks the feast of Saint Elizabeth Ann Seton. Born in New York City in 1774, Seton became the first person born in what would soon become the United States to be canonized as a saint in the Roman Catholic Church. Since then, she has been celebrated as an “American saint.”

As the author of a recent biography of Seton, I believe her life and legacy transcend national boundaries. Seton drew inspiration from other cultures, and the religious community she created continues to serve and learn from immigrants.

    Early years

    When the American Revolution began, Seton’s family, like many other colonists, remained loyal to the Crown. After the war she witnessed the difficulties that defeated Loyalists faced.

    As she grew to womanhood in New York City, Seton educated herself through an intellectual and social world that went beyond national boundaries. She was fascinated by French philosophy and English theology.

    She married a transatlantic merchant, William Seton, son of an English immigrant, who had lived in Italy. The Setons socialized with other cosmopolitan merchant families, some of them immigrants.

    If there was anything distinctly American about Seton’s experience of religion, it was that she saw around her many different faiths practiced openly. An Episcopalian by birth, she loved the Methodist hymns she overheard on Manhattan’s streets. She also admired the plain bonnets of Quaker women – “pretty hats,” as she called them – that they wore to demonstrate their humility.

    New Yorkers worshiped in any number of ways, and Seton believed they all had value.

    Converting to Catholicism

    Seton’s discovery of Catholicism emerged from her willingness to appreciate, as she once wrote, “many different customs and manners.” A chance visit to Italy introduced her to the faith that would transform her life.

    In 1804, William Seton’s health and business failed. The Setons traveled to Italy, hoping that the climate would cure William’s tuberculosis and that Italian merchant friends would resuscitate his business. William died, bankrupt, weeks after their arrival.

    In Italy, Elizabeth visited Catholic churches, moved by the same interest in other faiths that characterized her New York life. She was first dazzled by the beauties of Florence, and then moved by the Catholic doctrine of transubstantiation, a belief that God was present during the sacrament of communion.

    Back home in New York, Seton wavered in the face of her friends’ and family’s mistrust of a faith they did not consider appropriate for the United States. Among Protestant Americans, anti-Catholic attitudes were deeply rooted. Many believed Catholics were loyal only to Rome and untrustworthy.

After an agonizing deliberation, Seton formally converted. But, weary of her family’s distaste for her new faith, she hoped to emigrate to Quebec, home to French-speaking Catholics and many churches, where she expected to find a unified, Catholic society.

    Founding a new community

    Emigration proved impractical, and Seton instead moved to Maryland. Over the next 15 years, she developed a new understanding of how to live a faithful life in a diverse nation. Her beliefs did not change, but while earlier she had tried to persuade relatives to convert, she no longer did so.

    National Shrine of St. Elizabeth Ann Seton in Emmitsburg, Maryland.Jon Dawson, CC BY-ND

    In Maryland, Seton founded the American Sisters of Charity, an apostolic women’s religious community. The Sisters of Charity began orphanages and schools in Philadelphia, New York and beyond. Many of those cared for were newcomers to the United States or their children. The sisters were laying the groundwork for a Church that drew strength from immigrants in American cities and towns.

    Seton also founded a school for girls. She insisted that non-Catholic children be welcome, and that they not be pressed to change their beliefs.

    Seton was canonized in 1975. Pope Paul VI declared she had performed posthumous miracles, led a holy life and entered heaven. There are now 11 men and women who have been canonized for their work in the United States or colonies that would become part of the United States.

    Some of those who advocated Seton’s canonization emphasized her status as a native-born citizen. The reason lies not in Seton’s life but in the later history of Catholicism.

    In the decades after Seton’s death in 1821, large numbers of Irish and German Catholics immigrated to the United States. The cultural antipathy and economic competition that resulted revived anti-Catholic sentiments that had begun to recede.

    The heavily immigrant Church was often anxious in the face of anti-Catholicism. Seton’s canonization was meant to be the ringing affirmative answer to the question of whether one could be both a good American and a good Catholic.

    Seton’s legacy

    Statue and relic of St Elizabeth Ann Seton in the Basilica where she’s buried in Emmitsburg, Maryland.Lawrence OP, CC BY-NC-ND

    Today, the religious communities Seton inspired, the Sisters and Daughters of Charity, honor her as an American and a faithful Catholic. Yet they interpret Seton’s legacy as a commitment to human community that extends beyond national boundaries.

    Members of the Sisters of Charity Federation aid immigrants in a variety of ways, including working with the legal system and offering homes to refugee families.

    The Federation works with the United Nations to “give voice to those living in poverty,” and has joined other religious communities in a statement on behalf of “our Muslim brothers and sisters.”

As 2019 begins, and issues about stopping immigrants from entering the United States loom large, it is worthwhile to remember that Elizabeth Seton belonged to many communities during her life. The nation was just one of them.


    The Conversation

    Catherine O'Donnell does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.


    0 0

    A smarter use of social media can improve your sense of well-being.Rawpixel.com/Shutterstock.com

    This past year, many people deleted their social media accounts following revelations about privacy violations on social media platforms and other concerns related to hate speech.

    As people adopt their resolutions for the year, it is likely that many more will reconsider their social media use.

However, as a scholar of social media and religion, I’d argue that rather than simply quitting social media, people could use it to improve their overall well-being. Here are three ways to do so.

    1. Be active

    Studies have shown that there is a big difference between passive social media use and active use. Scrolling through a newsfeed and merely looking at what others have posted is considered passive social media use.

    Conversely, commenting on posts, sharing articles and creating posts constitute active social media use. Research has found that actively using social networking sites can contribute to feelings of social connectedness. This can contribute to a sense of overall well-being.

On the other hand, a study found that passive Facebook use increases feelings of envy. Researchers asked participants to sit in a laboratory and use Facebook passively, only browsing and not commenting, sharing or liking content. These passive users reported an increase in their feelings of envy.

    2. Focus on meaningful engagement

Social media sites allow users to engage in various types of communication. There are impersonal forms of communication, such as the single-click “Like” button, and more personal forms, such as direct messaging and comments.

    Direct messaging and comments can help with a deeper level of engagement.Denys Prykhodov/Shutterstock.com

    Research has found that direct communication on Facebook can have a positive psychological impact on individuals. A direct message can often lead to feelings of social support and encouragement. It has been found to be particularly helpful when people already share a connection. Direct messaging and personalized comments can provide a deeper level of engagement.

    One of these studies showed that commenting on a post, instead of pressing the like button, could improve the mood of the person who made the original post. In one such example, a respondent in the study described how personalized comments, even trivial ones about funny cat videos, can result in feelings of support.

    Similarly, research has shown that social networking sites can provide social support to those who have recently lost a job.

    3. Use social media for professional purposes

    According to researchers in Germany, Sonja Utz and Johannes Breuer, using social networking sites for professional purposes can result in “informational benefits” such as knowing what is happening in one’s field and developing professional connections.

    For example, these scholars found that people who use social networking sites for professional purposes report having greater access to information about timely innovations in their field than nonusers. A similar study of academics in the United Kingdom found that 70 percent of participants had gained valuable professional information through Twitter.

    Researchers, however, have found that these professional benefits require active use of social networking sites. “Frequent skimming of posts,” as Utz and Breuer explain, can lead to “short time benefits.” What is more important, however, are “active contributions to work-related discussions.”

    Indeed, there are those who recommend curtailing use of social media and focusing instead on real-world relationships. But, as with everything else, moderation is vital.

    The Conversation

    A. Trevor Sutton does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.


    0 0

    Your phone tracks your movements all the time.grapestock/Shutterstock.com

    Technology companies have been pummeled by revelations about how poorly they protect their customers’ personal information, including an in-depth New York Times report detailing the ability of smartphone apps to track users’ locations. Some companies, most notably Apple, have begun promoting the fact that they sell products and services that safeguard consumer privacy.

    Smartphone users are never asked explicitly if they want to be tracked every moment of each day. But cellular companies, smartphone makers, app developers and social media companies all claim they have users’ permission to conduct near-constant personal surveillance.

    The underlying problem is that most people don’t understand how tracking really works. The technology companies haven’t helped teach their customers about it, either. In fact, they’ve intentionally obscured important details to build a multi-billion-dollar data economy based on an ethically questionable notion of informed consent.

    How consumers are made to agree

    Most companies disclose their data protection practices in a privacy policy; most software requires users to click a button saying they accept the terms before using the program.

    But people don’t always have a free choice. Instead, it’s a “take-it-or-leave-it” agreement, in which a customer can use the service only if they agree.

    Consumers often do not have a free choice when it comes to privacy agreements.Marta Design/Shutterstock.com

Anyone who actually wants to understand what the policies say finds the details buried in long legal documents unreadable by nearly everyone, except perhaps the lawyers who helped create them.

    Often, these policies will begin with a blanket statement like “your privacy is important to us.” However, the actual terms describe a different reality. It’s usually not too far-fetched to say that the company can basically do whatever it wants with your personal information, as long as it has informed you about it.

    U.S. federal law does not require that a company’s privacy policy actually protect users’ privacy. Nor are there any requirements that a company must inform consumers of its practices in clear, nonlegal language or provide consumers a notice in a user-friendly way.

    Theoretically, users might be able to vote with their feet and find similar services from a company with better data-privacy practices. But take-it-or-leave-it agreements for technologically advanced tools limit the power of competition across nearly the entire technology industry.

    Data sold to third parties

    There are a few situations where mobile platform companies like Apple and Google have let people exercise some control over data collection.

    For example, both companies’ mobile operating systems let users turn off location services, such as GPS tracking. Ideally, this should prevent most apps from collecting your location – but it doesn’t always. Further, it does nothing if your mobile provider resells your phone’s location information to third parties.

    App makers are also able to persuade users not to turn off location services, again with take-it-or-leave-it notifications. When managing privileges for iOS apps, users get to choose whether the app can access the phone’s location “always,” “while using the app” or “never.”

    But changing the setting can trigger a discouraging message: “We need your location information to improve your experience,” says one app. Users are not asked other important questions, like whether they approve of the app selling their location history to other companies.
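To see how this works in practice, here is a minimal sketch, in Swift, of the mechanism an iOS app uses to ask for location access. The class name and comments are illustrative assumptions, not code from any real app, though the CoreLocation calls are the standard ones. Notably, the explanatory sentence the user sees in the system prompt is written by the developer and stored in the app’s Info.plist under the NSLocationWhenInUseUsageDescription key, which is how apps get to frame the request persuasively.

import CoreLocation

// A hypothetical illustration; not taken from any real app.
// The persuasive sentence shown in the system permission dialog is a
// developer-written string stored in Info.plist under the key
// "NSLocationWhenInUseUsageDescription".
final class LocationPrompter: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    func askForLocation() {
        manager.delegate = self
        // Presents the system permission dialog. Its buttons are fixed
        // by iOS, but the justification text belongs to the developer.
        manager.requestWhenInUseAuthorization()
    }

    // Called when the user grants or changes the permission (iOS 14+).
    func locationManagerDidChangeAuthorization(_ manager: CLLocationManager) {
        switch manager.authorizationStatus {
        case .authorizedWhenInUse, .authorizedAlways:
            manager.startUpdatingLocation() // tracking begins here
        default:
            break // the app, not the user, decides how it degrades
        }
    }
}

Nothing in this flow obliges the app to say what happens to the data after collection, such as whether a location history is resold; the prompt governs access, not use.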

    And many users don’t know that even when their name and contact information is removed from location data, even a modest location history can reveal their home addresses and the places they visit most, offering clues to their identities, medical conditions and personal relationships.
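To illustrate why such “anonymized” histories are identifying, here is a toy sketch in Swift, using a made-up data structure and deliberately simplified logic: the coarse position where a phone spends most of its nights is almost always its owner’s home.

// Toy example with hypothetical data; real re-identification research
// uses far richer signals, but the core idea is this simple.
struct Ping {
    let latitude: Double
    let longitude: Double
    let hour: Int // local hour of day, 0-23
}

// Guesses a "home" by taking the most frequent coarsely rounded
// position among overnight pings.
func guessHome(from pings: [Ping]) -> (Double, Double)? {
    let overnight = pings.filter { $0.hour >= 22 || $0.hour <= 6 }
    var counts: [String: (lat: Double, lon: Double, n: Int)] = [:]
    for p in overnight {
        // Rounding to three decimal places keeps ~city-block precision.
        let lat = (p.latitude * 1000).rounded() / 1000
        let lon = (p.longitude * 1000).rounded() / 1000
        let key = "\(lat),\(lon)"
        let prior = counts[key] ?? (lat, lon, 0)
        counts[key] = (prior.lat, prior.lon, prior.n + 1)
    }
    let best = counts.values.max { $0.n < $1.n }
    return best.map { ($0.lat, $0.lon) }
}

Once a likely home, and by the same method applied to working hours a likely workplace, falls out of the data, linking the trace to a named individual is often a matter of consulting public records.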

    Why people don’t opt out

    Websites and apps make it difficult, and sometimes impossible, for most people to say no to aggressive surveillance and data collection practices. In my role as a scholar of human-computer interaction, one issue I study is the power of defaults.

    When companies set a default in a system, such as “location services set to on,” people are unlikely to change it, especially if they are unaware there are other options they could choose.

Further, when it is inconvenient to change the location services, as is the case on both iOS and Android systems today, it’s even less likely that people will opt out of location collection, even when they dislike it.

    Companies’ take-it-or-leave-it privacy policies and default choices for users’ privacy settings have created an environment where people are unaware that their lives are being subjected to minute-by-minute surveillance.

    They’re also mostly not aware that information that could identify them individually is resold to create ever-more-targeted advertising. Yet the companies can legally, if not ethically, claim that everyone agreed to it.

    Overcoming the power of defaults

    Monitor your phone’s default settings.Georgejmclittle/Shutterstock.com

    Privacy researchers know that people dislike these practices, and that many would stop using these services if they understood the extent of the data collection. If invasive surveillance is the price of using free services, many would rather pay or at least see companies held to stronger data collection regulations.

    The companies know this too, which is why, I argue, they use a form of coercion to ensure participation.

    Until the U.S. has regulations that, at a minimum, require companies to ask for explicit consent, individuals will need to know how to protect their privacy. Here are my three suggestions:

    • Start by learning how to turn off location services on your iPhone or Android device.

    • Turn location on only when using an app that clearly needs location to function, such as a map.

• Avoid apps, such as Facebook Mobile, that dig deeply into your phone for as much personal information as possible; instead, use a browser with a private mode, like Firefox.

    Don’t let default settings reveal more about you than you want.

    The Conversation

    The Center for Internet and Society receives funding from multiple organizations; information is available here: http://cyberlaw.stanford.edu/about-us


    0 0

    For many Muslim women, a hijab is a way of expressing resistance.AP Photo/Robert F. Bukaty

    Nazma Khan, who immigrated to the United States from Bangladesh at age 11, faced years of shaming over wearing a headscarf in New York.

    So, in 2013, she started World Hijab Day – a day for both Muslim and non-Muslim women to experience wearing a headscarf.

    Celebrated on Feb. 1, the day is an expression of solidarity and support for religious freedom.

    As a scholar of Muslim immigrants, I have also long argued for women’s right to religious expression in their clothing choices. The hijab is not simply about religion – women wear it for a variety of reasons that can change, depending on the time and social context.

    Is the veil an Islamic requirement?

    Muslim religious writings are not entirely clear on the question of veiling.

    Various passages in the Quran, the Muslim holy book, and the Hadiths, statements attributed to the Prophet Mohammad, make reference to veiling by the prophet’s wives. But scholars disagree about whether these statements apply only to the prophet’s wives or to all Muslim women.

    According to some, the veil has been used as a way of curbing male sexual desire. Yet covering the head and body predated Islam. Jewish, Christian and Hindu women have also covered their head at various times in history and in different parts of the world.

Certainly, the headscarf is tied to religion. Many women who cover talk about it as a way of demonstrating their submission to God and as a constant reminder to hold fast to Islamic beliefs such as being honest and generous to those in need.

    Asserting identity

    However, there are other reasons for adopting the hijab.

    French and British colonizers encouraged Muslim women to remove the veil and emulate European women. Consequently, in North African and Middle Eastern countries, the veil became a symbol of national identity and opposition to the West during independence and nationalist movements.

Today, some women wear the hijab to signal pride in their ethnic identity. This is especially so for immigrants in Europe and the United States, where there has been an increase in Islamophobia.

    In a Facebook post for World Hijab Day 2018 that went viral, Columbia College student Toqa Badran wrote,

    “I wear this scarf because when I was a child I was socialized to be embarrassed, even ashamed, of my religion and my culture. I was told that to be a Muslim was to be a terrorist and that to be outwardly Muslim was to endorse violence and oppression … I understood that I would be unwelcome as long as I wore symbols of my heritage and chose to, in however modern a way, embrace my ancestors.”

Muslim African-American women in the U.S. sometimes wear a hijab to signal their religious affiliation. They also want to dispel the assumption that all African-Americans are Christians, and that only people with origins abroad can be Muslim. In fact, 13 percent of adult Muslims in the U.S. are black Americans born in the country.

    Different reasons for wearing a hijab

The newly elected Congresswoman Ilhan Omar wears a hijab.AP Photo/Carolyn Kaster

    For many other women, the headscarf has become a means of resistance to standards of feminine beauty that demand more exposure. Proponents of this view argue that removing clothing for the benefit of the male gaze does not equal liberation.

    According to researchers, women in hijabs note that employers must interact with them based on their qualifications rather than their appearance and that, therefore, the hijab levels the playing field. In Western countries, however, women find that wearing a head covering makes it harder to get hired.

    Finally, for some women, the headscarf is a convenience. It can reduce comments from others about women being out in public and lessen incidents of harassment on the street and at work.

    Despite the multiple, complicated reasons behind wearing a hijab, there are those who routinely assert that women who wear a headscarf are necessarily oppressed.

    Examples of hijab-wearing women in the government, such as newly elected Congresswoman Ilhan Omar, or athletes such as Olympian fencer Ibtihaj Muhammad, may help dispel these stereotypes.

    The Conversation

    Caitlin Killian does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.


    0 0

    Wounded Knee Memorial at Wounded Knee, South Dakota.AP Photo/Russell Contreras

    President Trump evoked the Wounded Knee massacre in a recent tweet. He was reacting to an Instagram video that Sen. Elizabeth Warren posted on New Year’s Eve.

    There’s been considerable criticism of the president’s inaccurate portrayal of Native American history, including from members of his own party. Two Republican senators from South Dakota, Mike Rounds and John Thune, spoke out against the tweet.

    Wounded Knee is among the worst massacres in Native American history. It was also one of the most violent examples of the repression of indigenous religion in American history.

    Religious suppression

    Religion historian Tisa Wenger explains that before the 20th century, many Americans believed that “indigenous practices were by definition savage, superstitious and coercive.” They did not consider them to be religion.

In part because of this belief, the U.S. government decided not to recognize Native Americans as citizens of sovereign governments in the 19th century, but as colonial subjects. In 1883, the Department of the Interior enacted the first “Indian Religious Crimes Code,” making the practice of Native American religions illegal. These codes remained in place until 1934.

In response, Wenger writes, some Native American groups tried to convince government agents that their gatherings were places of “prayer and worship” similar to Christian churches. Others claimed that their gatherings were “social,” not religious.

    But this kind of masking of religious practices did not stop the U.S. government from using violence to suppress these Native American ceremonies.

    In 1890, the U.S. military shot and killed hundreds of unarmed men, women and children at Wounded Knee, South Dakota, in an effort to suppress a Native American religious ceremony called the “ghost dance.”

    Historian Louis Warren explains that the ghost dance developed as a religious practice in the late 19th century after Native Americans witnessed the devastating environmental change of their homelands from American settlement. The dance envisioned a return to their unspoiled natural world.

    The U.S. military, however, viewed it differently. They believed the Native Americans at Wounded Knee were gathering for war.

    Survivors of the 1890 Wounded Knee Creek massacre in South Dakota arrived in Washington on March 4, 1938.AP Photo

    The darkest moment

    The U.S. government changed its policies of openly suppressing indigenous religions in 1934. But it would take another 44 years before the U.S. fully committed “to protect and preserve” religious rights of American Indians through the American Indian Religious Freedom Act in 1978.

As a Native American scholar of religion and environmental history, I agree with Republican Sen. Mike Rounds – the Wounded Knee massacre “should never be used as a punchline.”

    The Conversation

    Rosalyn R. LaPier does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.