Are multinational corporations undermining freedom in poor countries?

Whether on a subway, a sidewalk, or a living room couch, life for millennials is lived in two worlds. There is the physical world around them, and also the online world that exists just out of sight yet must remain squarely in view for political professionals looking to capture and keep their attention.

For anyone aiming to truly move the needle this election cycle, the trick is to greet the largest and most diverse generation of voters in our nation’s history in a language they understand, using technology that already feels like home. This means advocating for and facilitating digital opportunities for voter registration and reminders, and using digital engagement to urge action in the physical world of ballot boxes and election events.

Thought the U.S. was divided already? Just watch as elections go digital.

Stephen Ansolabehere is a professor in the Department of Government at Harvard University.

2016 will be noted as the year in which the Internet came into its own in U.S. political campaigns. While it has long been the technology for the back-end operations of political campaigns, this year online media have broken through as one of the primary sources for information about the election.

Throughout the primaries, social media often eclipsed paid media and coverage by traditional news organizations. Donald Trump’s tweets often became the story, drawing all media coverage to his message and disarming his opponents without relying on significant fundraising. As the general election begins, some analysts project that Internet political advertising will exceed $1 billion.

The U.S. doesn’t have a strong third party, and it hurts labor unions the most

Barry Eidlin is an Assistant Professor of Sociology at McGill University. His forthcoming book, “Labor and the Class Idea in the United States and Canada,” explains why union strength diverged in the U.S. and Canada over the course of the 20th century.

U.S. unions haven’t declined because they’re unpopular. Today, a majority of Americans approve of unions, and far more U.S. workers would like to join unions than are able to.

Nor is the decline of unions due to the loss of unionized manufacturing jobs. Plenty of other countries have shifted to a service economy without as much of a fall in membership. Labor laws — another common explanation for union weakness — are certainly stacked against U.S. workers. But this is more a symptom than a cause.

Facebook may soon have more power over elections than the FEC. Are we ready?

Nathaniel Persily is the James B. McClatchy Professor of Law at Stanford Law School.

For political advertising, like so much else, the digital revolution inspires both utopian and apocalyptic predictions. And as in many other arenas where Internet-based “disruption” looms, the optimists and pessimists both have a point.

For those of us who study campaign and election regulation, however, new technology poses a serious challenge to the existing ways of thinking about and addressing the campaign finance problem. Government regulation becomes increasingly difficult once communication moves online; thus, large Internet platforms such as Google, Facebook and Twitter will become the primary regulators of political campaigns. They need to recognize their new role and use their power responsibly.

How the Internet could democratize campaign spending

Bob Biersack is a senior fellow at the Center for Responsive Politics, a nonpartisan research group tracking money in U.S. politics. He spent 30 years on the staff of the Federal Election Commission, including five years as FEC Press Officer.

The prospect of billions of dollars moving opaquely through the Internet and aimed at influencing our votes can sound ominous, especially as the web is increasingly a tool that Americans use to communicate both personally and politically. In the broadest sense, allowing this flood of ad money feels contrary to our usual efforts to ensure that we know who is spending large sums to try to influence the makeup and actions of our government.

But the world of campaign finance regulation has always been fuzzy and complex, especially as we try to balance the values of fairness and transparency with the equally fundamental principle of freedom of expression. These new advertising tools, from email and websites to Facebook and YouTube and more, clearly offer both opportunities and challenges in an already fraught landscape.

How will the Internet change political advertising?

What are the potential dangers of political media transitioning to the Internet? How will this impact how players fund campaigns, and how will it change the way candidates frame their messaging to supporters? Should the FEC create new rules to make online advertising more transparent — and is it even possible?

This election could decide the fate of American labor — and that’s the problem

Joseph A. McCartin is professor of history and director of the Kalmanovitz Initiative for Labor and the Working Poor at Georgetown University. He is the author of “Collision Course: Ronald Reagan, the Air Traffic Controllers and the Strike that Changed America.”

One of the many questions to be decided this election is the future of U.S. labor policy. Unions entered the race with high hopes, having recently made big gains. They had won rising support for a $15 minimum wage, reformed overtime rules and dodged a potentially devastating blow from the Supreme Court, which, had it not been for the death of Justice Antonin Scalia, undoubtedly would have crippled labor’s ability to collect fees from millions of public sector workers who benefit from union contracts.

But any hope of translating such victories into a broad union revival hinges on the outcome of the election.

Business and labor don’t have to be enemies

Cathie Jo Martin is professor of political science at Boston University and co-chair of the American Political Science Association presidential task force on political negotiation. Her recent books include “The Political Construction of Business Interests: Coordination, Growth and Equality” with Duane Swank and “Political Negotiation: A Handbook” with Jane Mansbridge.

In America, the image of a powerful union connotes rapacious groups of workers, jockeying to get perks and salaries beyond what they rightfully deserve. In this zero-sum world, union gains — if not matched by productivity growth — become public losses. So why should we think that strong unions are ever a good idea?

In reality, stronger and more involved unions could help the U.S. develop better public policy. Elsewhere in the world, unions enjoy much higher levels of support from the public — in many countries, they cover most workers and play a crucial role in forging public policies. Paradoxically, they do this in conjunction with equally strong employers’ associations.

Unions could make a comeback — if we help them

Michael Wasser is a senior policy analyst at Jobs With Justice.

You won’t hear opponents admit it, but unions are popular and have been for a while. Last year Gallup found that 58 percent of Americans approved of unions. Since Gallup first asked people about their support for unions in 1936, approval dipped below 50 percent just once — when it dropped to 48 percent at the height of the Great Recession in 2009.

Anti-union advocates prefer to focus on the long-term decline of union membership in the United States, which can suggest that unions are unnecessary or in an inevitable decline. It is true that union density has shrunk from its peak of 35.4 percent of the workforce in 1945 to 11.1 percent in 2015. But the erosion in union membership is not a natural, pre-ordained outcome — the reality is that intentional policy choices significantly contributed to fewer people becoming union members.

How employers broke unions by creating a culture of fear

Kimberly Phillips-Fein is the author of “Invisible Hands: The Businessmen’s Crusade Against the New Deal.” She teaches American history at New York University’s Gallatin School of Individualized Study.

Why are there no labor unions in America? This is, of course, an overstatement — millions of Americans still belong to unions. But the size of the unionized workforce has declined every year for 40 years. And even at its mid-20th-century peak, it was lower than in most European countries.

Many explanations for low union density turn on the distinctiveness of American culture. Americans are deemed individualists, with self-interest trumping any sense of the common good. They are driven wild with consumer longings, willing to do anything for low prices. They are entrepreneurial, identifying with their employers and always dreaming of upward mobility or striking it rich rather than claiming solidarity via working-class identity.

Why are unions in the U.S. so weak?

Labor in the United States simply never seemed to have the same political power that it did in other countries. But why not? What does this say about our political and economic systems? Why didn’t we see corporatism take root here as it did in Europe? What factors have contributed to relatively weak unions in the U.S., and can they be changed?

The culture of college is an asset, not a flaw

Robert Pondiscio is a senior fellow at the Thomas B. Fordham Institute, an education think tank in Washington, DC.

A few weeks from now, my wife and I will pack our only child and all her gear into the back of our rapidly aging Ford Escape with a big new college decal on the back window. We will drive her across five states and 500 miles to Chapel Hill, North Carolina. I-95 will henceforth be known in our home as The Trail of Tears.

There is nothing novel about my family’s bittersweet road trip. And despite how anachronistic and inefficient college has become, I expect and hope the sight of cars packed to the windows with bedding, clothes and the trappings of teenage life will remain a late summer fixture on America’s highways for generations to come.

U.S. colleges need a new age of experimentation

Wilfred M. McClay is the G.T. and Libby Blankenship Chair in the History of Liberty at the University of Oklahoma.

Is it time to retire the classic ideal of the four-year, residential university? Is our notion of “college” — complete with leafy campuses, crisp autumn football weekends and resplendent commencement addresses — hopelessly outdated? Could it be possible that, after becoming a near-universal rite of passage for middle-class Americans, the classic collegiate experience is an obsolete relic — or at least, an unaffordable model that no longer serves our needs?

Many intelligent observers answer “yes” to these questions. And they may be right. There is no doubt that the current system is financially unsustainable for a great number of people. There are plenty of reasons to doubt whether our colleges and universities — even the most selective ones — can consistently deliver what they promise. There’s also plenty of reason to wonder if they have become little more than social sorting mechanisms, rather than bastions of higher learning. The oft-invoked promise of higher lifetime earnings for college graduates rings hollow in the ears of underemployed and debt-burdened millennials who have had trouble launching careers in the last decade’s stagnant economy.

Here’s what American universities can learn from Germany

Joseph Parilla and Martha Ross are fellows at the Brookings Metropolitan Policy Program.

Education and skills are the currency for workplace success in our advanced economy, but many young Americans struggle to launch their careers. Unemployment rates among teens and young adults are typically much higher than they are for workers over age 25, and over 3 million U.S. youth — nearly 8 percent of all young people — are disconnected from both work and school.

This represents an unfortunate case of American exceptionalism. We should look to places like Germany to inspire new ways to align government, business and educational institutions to prepare young people for economic success.

One size doesn’t fit all in higher education

Donald E. Heller is provost and vice president of Academic Affairs at the University of San Francisco. His research focuses on higher education economics, finance and public policy.

We often hear stories about student loan debt topping $1 trillion, college graduates who struggle to find jobs in their chosen careers and the rising price of college.

As college prices have risen — and as the nation suffered through the worst recession since the Great Depression — there has understandably been much focus on whether students will get a good job after graduation. Students want to know whether they will earn enough to pay back loans and make their time spent in college worthwhile.

Do we need to remodel our university system?

Are the expensive, research-focused academic institutions in the U.S. the best outlets to offer the job training needed in our rapidly shifting economy? Should “job training” even be the main goal of higher education? And what about alternative systems that are less costly and easier to access — whether community college, vocational training or apprenticeships? Should we be taking lessons from other countries as we attempt to reform our higher education system?

The latest threat to the American cowboy: Environmentalists

Blake Hurst is a farmer in Northwest Missouri, and president of the Missouri Farm Bureau.

Pity the poor cowboy. Once the hero of hundreds of movies, television shows, and pulp novels, he’s no longer an icon. Instead, he’s a social pariah, spending his days caring for cows, the worst environmental villain that modern man can imagine.

True, he’s finally free of the diet mavens who bedeviled him for a generation (four meta-analyses published since 2009 have failed to find any connection between consumption of saturated fat and heart disease). But now our lonely cowpuncher is attacked by global climate change warriors, who have decided that only the hamburger stands between modern man and Eden.

We need to eat less meat. Should the government step in?

Laura Wellesley is a research associate in the Energy, Environment and Resources department at Chatham House, where she focuses on issues relating to climate change and food […]

We can save the earth without giving up bacon. Here’s how.

Brian Kateman is co-founder and president of The Reducetarian Foundation.

You’d have to be living under a rock to have missed the memo: Too much meat is bad for your health, responsible for immense animal cruelty on factory farms, to blame for speeding up climate change, at fault for squeezing out rare species and more.

Yet despite increased pressure to eliminate meat from our diet, consumption rates have been slow to budge. Americans are eating less beef, but chicken sales are on the rise. China has committed to cutting its meat consumption in half, but other developing nations are expected to eat more meat and dairy in coming years. With rising incomes, population growth and the spread of Western diets, the trend towards heavily animal-based diets looks like it’s here to stay.

Is reducing meat consumption in the U.S. possible?

Last month, the Chinese government announced plans to cut the country’s meat consumption in half — partly for public health reasons, but also to reduce the carbon footprint left by livestock. But China is far from the only place moving in this direction. Experts across the world, including one of the top nutritional panels in the U.S., have recommended similar meat reductions for the same reasons.

But what’s the next step? Will social pressure and dietary guidelines alone have a legitimate impact on meat consumption in the long run? As an ecological issue, will we naturally reduce our meat intake, or will we have to follow China’s example and use government policy to force it to happen?

Live chat with the editors: In the age of Trump and terrorism, do media ethics need to evolve?

On Thursday we sat down with Erik Wemple and Margaret Sullivan — media columnists at The Post — to discuss media ethics in political campaigns. If you missed the conversation, you can watch it at facebook.com/washingtonpost.

We weren’t able to get through anywhere near all of the questions from readers during the live chat. As there were a number of interesting ones left over, we thought we’d respond to a few here.

Confirmed: Echo chambers exist on social media. So what do we do about them?

Last month, three scholars confirmed what we already knew about social media — or at least had suspected. In a draft paper called “Echo Chambers on Facebook,” social scientists Walter Quattrociocchi, Antonio Scala, and Cass Sunstein found quantitative evidence of how users tend to promote their favorite narratives, form polarized groups and resist information that doesn’t conform to their beliefs.

The study focused on how Facebook users interacted with two narratives involving conspiracy theories and science. Users belonging to different communities tended not to interact and tended to be connected only with “like-minded” friends, creating closed, non-interacting communities centered around different narratives — what the researchers called “echo chambers.” Confirmation bias accounted for users’ decisions to share certain content, creating informational cascades within their communities.
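To make that dynamic concrete, here is a minimal sketch, in Python, of how confirmation-bias-driven sharing inside like-minded friend groups can produce the closed communities the researchers describe. It is a toy illustration, not the study’s actual model; the network size, belief scores and sharing threshold are assumptions chosen purely for the example.

```python
# Toy illustration of confirmation-bias-driven sharing (not the study's model).
# Users hold a belief in [0, 1]; a piece of content carries a score in [0, 1].
# A user reshares content only when it sits close to their own belief, so items
# cascade within like-minded clusters and rarely cross between them.
import random

random.seed(42)

N_USERS = 200
SHARE_THRESHOLD = 0.2  # assumed: reshare only if |belief - content| < 0.2
users = [random.random() for _ in range(N_USERS)]  # each user's belief

def neighbors(i):
    """Homophilous 'friendship': users are connected if their beliefs are close."""
    return [j for j in range(N_USERS) if j != i and abs(users[i] - users[j]) < 0.1]

def cascade(seed_user, content):
    """Spread one item from seed_user; return the set of users who reshared it."""
    seen, frontier = {seed_user}, [seed_user]
    while frontier:
        nxt = []
        for u in frontier:
            for v in neighbors(u):
                if v not in seen and abs(users[v] - content) < SHARE_THRESHOLD:
                    seen.add(v)
                    nxt.append(v)
        frontier = nxt
    return seen

# One item aligned with each narrative, each seeded by its closest believer.
for label, content in [("narrative A", 0.1), ("narrative B", 0.9)]:
    seed = min(range(N_USERS), key=lambda i: abs(users[i] - content))
    reached = cascade(seed, content)
    avg_belief = sum(users[i] for i in reached) / len(reached)
    print(f"{label}: reached {len(reached)} users, mean belief {avg_belief:.2f}")
```

Run as written, each item reaches only users whose beliefs already sit near it, so the two cascades touch almost entirely disjoint groups, a crude analogue of the echo chambers the paper documents.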

Donald Trump is crashing the system. Journalists need to build a new one.

Jay Rosen teaches journalism at New York University and writes the blog PressThink. Follow him on Twitter: @jayrosen_nyu

Journalists commonly divide information from persuasion, as when they […]

The debates gave Donald Trump the nomination, and it’s the media’s fault

Dan Kennedy is an associate professor of journalism at Northeastern University in Boston and a panelist on WGBH-TV’s “Beat the Press.” He blogs at Media Nation.

What could be more open and democratic than a debate? For all the rending of garments and gnashing of teeth now taking place over the massive amounts of free media bestowed upon Donald Trump, it was his dominating performance in the televised debates that allowed him to separate himself from the pack.

Yet the debates themselves were an exercise in faux democracy. What really mattered, especially early on, was who got invited, who got to stand where and who was allowed to speak the most. Unfortunately, the media organizations that ran the debates (along with the Republican National Committee) relied on polls to make those decisions right from the very first encounter in August.

In the age of Donald Trump, is it time to revisit media ethics?

The media’s failings regarding primary coverage will undoubtedly continue to be discussed throughout the next year — and of course, that’s welcome. These criticisms come at a pivotal moment for journalism: The majority of the public has a smartphone and, consequently, instant access to the news. Given this backdrop, is it time to revisit journalism’s ethical responsibilities when it comes to issues of national importance?

Conservatives will probably lose the legal fight for small government. Here’s why.

Stop me if you’ve heard this one before: The 2016 election matters because the next president might be nominating two or three justices to the Supreme Court.

It’s one of those statements you hear all the time because, well, it’s true. The next Supreme Court appointment may throw the liberal vs. conservative balance of the court into disarray, which is why pundits have been issuing foreboding proclamations like “we’re only one vote away from losing (insert constitutional right here).”

One of the legal issues that’s less often discussed is the role that the next Supreme Court justice will play in conservatives’ long-running legal fight to limit the size of the federal government. For decades, conservatives on the bench have been losing that war, giving way to a system of administrative law that is written, for the most part, by bureaucratic agencies. Without putting a justice on the bench who can build consensus on how to rework the rules on agency regulations, it’s likely conservatives will lose this fight in the long run.

How the ‘Serial’ podcast is challenging the criminal-justice system

Last week, a Baltimore City Circuit Court judge delivered a bombshell development in the case of Adnan Syed, who was convicted in 2000 of strangling his ex-girlfriend and classmate, Hae Min Lee. The judge vacated Syed’s conviction and ordered a new trial because his attorney had failed to challenge unreliable evidence. […]

Here’s what happens when payday loans are banned

Deyanira Del Rio is co-director of New Economy Project in New York City. Andy Morrison is campaigns coordinator at New Economy Project.

Imagine a land with no payday loans. Flashing neon signs advertising “Fast Cash” no longer dot the landscape in low income neighborhoods and communities of color, and nary a lender is permitted to extract interest rates of 400 percent and up. This is not a far-fetched fairy tale or some long-forgotten history. It was the reality across most of the United States until the 1990s, when financial deregulation and the gutting of state usury laws enabled the payday lending industry to proliferate.

Today, 14 states and the District of Columbia are payday loan-free, thanks to strong usury laws that cap interest rates on loans. It’s a swath of the country we have dubbed PaydayFreeLandia – representing 90 million Americans, or one-third of the U.S. population. Experiences in these diverse states belie the notion that usurious, short-term loans are a necessary evil. In fact, these states have demonstrated that the best way to address abusive payday lending is to end it once and for all.
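For readers wondering how a payday loan reaches the 400 percent figure cited above, the arithmetic comes from annualizing a small fixed fee over a very short term. The sketch below is purely illustrative; the $15-per-$100 fee and the 14-day term are assumed, typical-sounding values, not numbers taken from the piece.

```python
# Illustrative only: how a small per-period payday-loan fee annualizes
# into a triple-digit APR. The fee and term are assumed example values.
principal = 100.00    # dollars borrowed
fee = 15.00           # finance charge for one loan period
term_days = 14        # length of one loan period

period_rate = fee / principal                  # 0.15, i.e. 15% per two weeks
simple_apr = period_rate * (365 / term_days)   # annualized, no compounding

print(f"Simple APR: {simple_apr:.0%}")         # roughly 391%
```

Rolling the loan over instead of repaying it repeats that charge each period, which is how a short-term advance turns into long-running debt.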

Want to eliminate payday lending? Raise the minimum wage.

Valerie R. Wilson is an economist and director of the Program on Race, Ethnicity and the Economy at the Economic Policy Institute in Washington, DC.

Payday loans are advertised as a quick and easy solution to temporary cash flow problems. In reality, for most borrowers the payday lending experience rarely ends up being easy or temporary.

By severely limiting the payday loan industry, the CFPB’s proposed new rules are intended to put an end to the “debt trap” far too many economically vulnerable borrowers struggle to escape. This is a responsible and appropriate response from the agency charged with protecting the best interests of American consumers, but Congress has responsibilities as well.

The problem is bigger than payday loans

Nathan Fiala is an assistant professor at the University of Connecticut. He has worked closely with governments, international aid organizations and microfinance institutes across Africa.

Payday loans are intended to help people who want or need access to credit that banks are unwilling to provide. However, the difference between wanting credit and needing credit can be important. When high-interest credit is used for wants instead of needs, it can be a terrible idea.

Payday loans are essentially the U.S. version of microfinance: they have very short repayment periods, the interest rates are very high, and they target those who can’t access normal credit channels. Small-scale, high-interest rate loans can lead people to remain impoverished when the additional debt they accrue becomes too burdensome — a problem that often occurs in the field of microfinance and, we have found, with payday loans. That’s not to say that short-term loans in the U.S. are always a bad idea. On the contrary, they fulfill a need. But what is that need, exactly?

We can’t get rid of payday loans just because we don’t like them

Tim Worstall is a senior fellow at the Adam Smith Institute in London and a regular contributor to Forbes.

As the Federal Reserve Bank of New York has taken pains to point out, everyone hates payday loans except the 10 million people a year who take them out. It’s no surprise, then, that the Consumer Financial Protection Bureau has taken up arms against the payday lending industry.

Banning things simply because we disapprove of other people’s choices is not a mark of a free society. We might do so if we consider that people are being ripped off, and we might if we think the activity is harmful to those not making the actual choice themselves. Thus laws banning pollution are fine, and laws restricting the capitalists’ ability to gouge the worker are fine too.

Think there’s no good alternative to payday loans? Think again.

Mike Calhoun is the president of the Center for Responsible Lending.

The CFPB’s proposed rule requiring payday and car title lenders to assess borrowers’ ability to repay will, by all projections, reduce the number of these loans being made. The question often comes up: What will those consumers who might have taken out a payday or car title loan do instead?

A review of the credit market for households with lower credit scores and the experience of states that have addressed payday lending show that many alternatives are available. They are all far better than the long strings of repeated high-cost loans that trap payday borrowers in unmanageable and ultimately devastating debt.

Payday lending isn’t helping the poor. Here’s what might.

Mehrsa Baradaran is the J. Alton Hosch Associate Professor of Law at the University of Georgia School of Law and author of “How the Other Half Banks: Exclusion, Exploitation, and the Threat to Democracy.”

A public interest lawyer once explained that “poverty creates an abrasive interface with society; the poor are always bumping into sharp legal things.” Indeed, the poor are also constantly bumping into sharp financial things.

Without a financial cushion, every mistake, unexpected problem or minor life change can quickly turn into a financial disaster. Half of the U.S. population has less than $500 in savings, living paycheck to paycheck and sometimes relying on payday lenders in a pinch. The reality is that people need short-term loans and we have to find a way to provide credit that is safe and accessible.

Why the Supreme Court is right to overturn McDonnell’s corruption conviction

Former Virginia governor Robert F. McDonnell won a major victory Monday at the Supreme Court, where the justices unanimously vacated his bribery conviction, which stemmed from his acceptance of more than $175,000 in loans, gifts and other benefits from a Virginia businessman. The win doesn’t necessarily end McDonnell’s high-profile fight to stay out of prison, but […]

What’s the alternative to payday loans?

As the CFPB moves its proposed rules through the public review process, the debate for further action around payday loans will continue. Is federal regulation the answer? Or should government take a greater role in offering emergency finance for the poor?

The Orlando shooter Googled my name. I wish he had reached out to me.

Hamza Yusuf is president of Zaytuna College, the first Muslim liberal arts college in the United States. He also serves as the vice president of The Forum for Promoting Peace in Muslim Societies.

A few days ago, a colleague from the college where I teach and serve as president called to let me know the FBI had just paid a visit. They wanted to inform me that the Orlando shooter, Omar Mateen, had been Googling me on his computer and iPhone. They were concerned that he may have had malevolent intent, given that the Islamic State had called for my death twice in the last year, presumably due to my strong condemnation of the utterly anti-Islamic nature of their “state” in a sermon that went viral in the Middle East.

After some deliberation, I thought of another explanation for Omar Mateen googling me: Perhaps he had been conflicted about what he was contemplating and wanted to seek advice. I may be guilty of wishful thinking, but I would like to believe that in his heart he knew something was not right. We cannot determine what motivated this young man — who had a wife and a child — to infamously propel himself into American history by means of one of our bloodiest massacres. But my sense, given what we’ve learned about him since then, is that it cannot be reduced to a simple equation.

For trans people like me, pronouns are about more than grammatical correctness

Noah Lewis is a Staff Attorney at Transgender Legal Defense & Education Fund, a national New York City-based organization committed to achieving equality for transgender people.

Does it take a little bit of effort to break out of old habits and refer to someone by a new pronoun? Sure. But given the stark realities facing transgender people who are excluded from jobs and education, it’s the very least we can do. The simple act of calling me “Mr. Lewis” and “he” adds up to nothing less than creating a society in which transgender people don’t merely survive, but thrive.

It’s foolish to try to simplify the motives of terrorists

David Sterman is a Senior Program Associate at New America’s International Security Program. Find him on Twitter: @DSterms.

In the wake of the shooting in Orlando, the deadliest terror attack in the United States since 9/11, the discussion of how people become radicalized has often sought a master narrative, one that can explain each case and assign a single motivation. Pundits decry any supposed tendency to cite factors other than radical Islamist ideology when explaining an attack, charging that it erases the role of ideology.

Yet it is precisely the ability to avoid either-or explanations that is essential to understanding the threat the United States faces from homegrown terrorists. As Enrique Marquez, who is accused of buying the weapons for 2015’s San Bernardino attack, said: “No one really knows me. I lead multiple lives.” Marquez’s self-analysis was flawed in many respects, but in one respect he is correct: A single motive for violent actions is seldom sufficient. Jihadist ideology, with its peculiar mix of religious, foreign policy and socio-political views, is important, but it should not overshadow the investigation and analysis of other explanations.

Radical Islam tells a story. We must tell a better one.

Abigail R. Esman is a freelance writer and award-winning journalist specializing in Islamic extremism in the West. She is author of “Radical State: How Jihad Is Winning Over Democracy In the West” and is currently writing a book about domestic abuse and terrorism.

The attacks of 9/11 changed the world as we knew it. Yet still we struggle to understand, asking first, “Why do they hate us?” and later, in ever more urgent voices, “How do we make it stop?”

The conspicuous fact is that there are no simple explanations or straightforward solutions. If we are going to defeat Islamist terrorism, we need to change the ways that we approach the threat. This means recognizing that the core impulse behind radical Islam is less about hate and more about honor; less a rage against others and more a personal, self-directed shame.

War drives terrorism

Daniel Byman is a professor at Georgetown University and a senior fellow in the Center for Middle East Policy at Brookings. His latest book is “Al Qaeda, the Islamic State, and the Global Jihadist Movement: What Everyone Needs to Know.” Find him on Twitter: @dbyman.

Study after study has found no common profile of a terrorist. It is hard to explain why Sweden, say, has sent almost four times as many recruits to fight with the Islamic State as neighboring Norway, despite having only twice the population. Pundits and politicians alike speculate on why individuals might embrace terrorism, but such a question risks missing one obvious point: Having a terrorist group around in the first place is one of the most important factors influencing an individual’s choice to join.

Terrorist groups don’t emerge out of nowhere, and wars are perhaps the richest soil for seeding and growing violent groups of all stripes. Without the wars in the Middle East there would be no Islamic State, and it is not the only one: Al-Qaeda, Hezbollah and other extremist groups all emerged out of regional civil wars. The formation of such groups is a political phenomenon, and so too is the radicalization of foreign fighters from the West.

How do domestic terrorists become radicalized?

How best should we characterize the factors that lead people to become violent? What can we learn from these theories to stem radicalization or, at least, to better monitor those who might become threats?

Donald Trump’s claims about radical jihadists are very wrong

Martha Crenshaw is a senior fellow at the Center for International Security and Cooperation at Stanford University.

What do we mean when we talk about “homegrown extremism” or “radicalization” in the United States? Donald Trump claims that the threat of “radical Islam” is imported by immigrants from abroad, from regions where there is a history of terrorism against us and our allies. He refers to “thousands upon thousands of people” entering the United States “many of whom have the same thought process” as the Orlando shooter. He asserts that they are forming “large pockets” of people who want to “slaughter us.”
