Month: June 2019


Conservatives should have sympathy for millennial borrowers, who did everything their parents and culture told them to do to be successful, only to become the most debt-laden generation in history. Countering a culture of credentialism mania with apprenticeships and trade alternatives is a positive step, but the first rule of finding yourself in a hole should be to stop digging. There’s no reason for the average American to subsidize the elite sorting mechanism universities have become.
► Inez Feltscher Stepman

No, it’s not another term for “marriage,” though that does offer an indication of just how involved this can get.

The use of “share” and “agreement” gives the phrase an uplifting, almost cheery, sound, but the underlying consideration here is debt – the mountains of nearly unpayable debt that many, though not all, college students face. After mortgages (which most of the time are a manageable and ordinary part of middle-class living), the largest mass of debt in the United States is higher education debt, more than $1.5 trillion. In many cases that debt is so large that final payoff seems unseeably far into the future.

This is a new development. During my college days in the 1970s, I took out a couple of student loans, but they were small, and I paid them off without difficulty in four or five years. The loans were small because the costs were too. In those days, finances were rarely the reason a person could not go to college (at least, some decent college) if they chose to.

Conditions have changed. The situation is not good for anyone involved, but especially for those buried under all this debt. One theoretical advantage in a search for solutions is that, increasingly, student debt is not scattered among endless numbers of private lenders but under the umbrella of the federal government; the advantage is not that the federal government is any better as a lender but that it is just one unit to deal with, and susceptible to congressional action.

One approach for dealing with it, a method that seems to be gaining in popularity, is the “income share agreement,” which is a variation on how a loan will be repaid. Instead of imposing a set amount due every month (depending presumably in part on the size of the loan), the ISA is more flexible: It would vary in size depending on the income the former student receives once employed. A law student who goes to work for a top white-shoe firm might kick in more, while one who works as a public defender might pay less. An in-demand physician would pay more dollars per month than, say, an elementary school teacher.
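The mechanics described above can be illustrated with a minimal sketch. The numbers here are purely hypothetical – an 8 percent income share and a $20,000 income floor are assumptions for illustration, not terms of any actual ISA contract:

```python
# A minimal sketch of income share agreement (ISA) repayment mechanics.
# The share percentage and income floor below are hypothetical values,
# chosen only to illustrate how payments scale with income.

def isa_monthly_payment(annual_income, share=0.08, income_floor=20_000):
    """Return a monthly ISA payment for a given annual income.

    share        -- fraction of income owed (hypothetical: 8 percent)
    income_floor -- income below which no payment is due (hypothetical)
    """
    if annual_income <= income_floor:
        return 0.0  # low earners owe nothing under this sketch's terms
    return annual_income * share / 12

# A public defender's payment is smaller than a white-shoe associate's:
public_defender = isa_monthly_payment(55_000)
big_firm_lawyer = isa_monthly_payment(190_000)
```

The point of the structure is visible even in this toy version: the obligation tracks earnings rather than the size of the original loan, which is why the same degree can produce very different repayment streams.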

The idea, which is about 40 years old, has some appeal as a way of matching ability to pay with liability. But the story could get more complicated. The debt in many cases is so enormous that it might not plausibly be repaid in a working lifetime – and what then? (The law makes discharging student loans very difficult, albeit not impossible under some conditions.)

That’s only one of the questions.

There’s a financial-structural question, which is beginning to arise as private lenders gradually move back into the business. As writer Malcolm Harris put it, “If you can convince investors you’re going to be rich for the rest of your life, why spend your college years poor?

“I.S.A.s bridge the gap. It’s hard to think up a better advertisement for free-market capitalism. But I.S.A.s are premised on the idea of discriminating among individuals. Once the high-achieving poor and working-class students have been nabbed by I.S.A.s, the default rate for federal loans starts to rise, which means the interest rates for these loans have to go up to compensate. A two-tiered borrowing system emerges, and the public half degrades.”

This leads to developments that could even “reshape childhood,” encouraging K-12 students to redraw their K-12 learning and activities to suit not only college admissions offices but also lenders – to persuade them that they’d be a good lending risk.

Where this might lead next – not least in discouraging students from even considering much-needed but less profitable careers – could be a dark path.

University of Chicago economist Gary Becker said in one study that “Economists have long emphasized that it is difficult to borrow funds to invest in human capital because such capital cannot be offered as collateral and courts have frowned on contracts which even indirectly suggest involuntary servitude.” But under enough financial pressure – we’re talking about really big money here, past the trillion-dollar mark – how long will courts continue to look at it that way?

Which takes us back to “share” and “agreement,” and the question of how such a fine-sounding concept can turn into something so dark.



State lines can make quite a difference, which is one reason a gaggle of Oregon Republican legislators are – as this was written – hiding out in Idaho.

They might re-cross the state line soon, but the reasons they’re in Idaho and why the timing matters reflect several differences between the states – procedural, more than philosophical.

Not to mention the substantive issue that got it started.

That issue is climate change, not – to say the least – a high priority at the Idaho Legislature. At the Oregon Legislature, where Democrats control both houses, climate change is a bigger deal. Democrats there have been trying for some years to pass a strict “cap and trade” bill, with some tax increases included, with climate change in mind. For years those efforts fell short because in Oregon – unlike Idaho – a three-fifths majority vote in each chamber is needed to pass many fiscal bills. Idaho has no such requirement. (Even if it did, the minority Democrats wouldn’t have enough votes to stop a measure by themselves.) For many years in Oregon, up until 2018, Republicans held more than two-fifths of each chamber, so they were able to (and often did) block a number of bills Democrats proposed.

In 2018 Oregon Democrats won supermajorities – meaning 60 percent of the seats – in both chambers, so they were freed to push harder. They did, finally teeing up a cap and trade bill for passage.

That was the prompt that caused Senate Republicans to walk out. Disagreement on a single bill is the sole reason they gave for walking out and declining to participate in legislating at all.

Since the Senate Republicans occupy just 11 out of 30 seats, that would seem to give them little room to stop the bill. In Idaho, they wouldn’t have any room at all. In Idaho, a legislative chamber’s quorum – the number of members who must be present for business, any business, to be transacted – is any number over half. In Oregon, it takes two-thirds. With 11 senators out, everything ground to a halt.

The senators, you may have heard, have fled the state and some of them at least are said to be holed up at an undisclosed location, or more than one, in Idaho.

They probably are watching the calendar, too, because here’s another difference between the states: Oregon legislative sessions are required to end by a specific date. Idaho’s can in theory go on and on, and a few have gone past 100 days though most last about three months or so. Typically, Oregon has one session lasting about five months in odd-numbered years and one little more than a month long in even years. But unlike in Idaho, they do have deadlines. For this year, the state constitution requires adjournment by June 30.

At the time the Senate Republicans walked out, a number of key pieces of legislation, including the state budget, had not yet passed. Most of these measures were not especially controversial, but they do have to be done, and can’t be while a quorum is lacking. Maybe the senators will return now that the Senate Democrats have agreed to take the cap and trade legislation off the table.

There are workarounds. A special session to get the budgets passed, for example, could be called, and there are other approaches too.

But at some point, Oregon legislators, especially those who didn’t take an Idaho vacation this year, might take a look across the state line at some of the procedural handcuffs that aren’t in place in Idaho, and start thinking about whether a few changes in their own procedures might be helpful. The idea that a third of one half of the legislature could hijack the overall work of the state probably isn’t something most Oregonians or Idahoans would see as a good idea.



Our first president said that virtue or morality is a necessary spring of popular government. He asked who that is a sincere friend to it can look with indifference on attempts to shake the foundation of the fabric [of society].

► Senate candidate Roy Moore in a September 2017 debate with Republican primary opponent Sen. Luther Strange

Well, yes. But what morality are we talking about here?

Are we talking about some of it or all of it?

After all, young George Washington was “a man on the make. He wanted to get rich. He bought, sold and traded slaves, raffling off some in a lottery and permanently dividing families. After arranging to marry the richest widow in Virginia, Martha Dandridge Custis, he wrote a series of passionate love letters to the wife of one of his best friends. And then there was his insatiable craving for land, which led him to cheat some of the men he had commanded in the French and Indian War out of acreage they had been offered as an incentive to join the fight. As biographer Ron Chernow put it, Washington ‘exhibited a naked, sometimes clumsy ambition.’”

Of course, he matured with time, but who’s perfect?

What’s really called for here is more specificity; in fact, that seems inherent. Like a number of words in this list, the problem isn’t that the word isn’t significant; it’s that the word’s real, broad scope has been cast aside and redefined to include only a tiny piece of the original.

The Oxford English Dictionary defines morality as “Principles concerning the distinction between right and wrong or good and bad behaviour” – which may be a reasonable enough definition, although it offers little guidance: What exactly is “right” and “wrong”?

The author C.S. Lewis took a stab at this, suggesting three components to morality: “(1) to ensure fair play and harmony between individuals; (2) to help make us good people in order to have a good society; and (3) to keep us in a good relationship with the power that created us.” That suggests what the purpose of morality might be, but still doesn’t help answer the question of what it is – the practical nature of the moral.

Because our code of ethics (ethical philosophy covers roughly the same territory as “morality”) eventually covers everything we do, including many or most of the choices we make in our lives, that becomes an awful lot of territory for us to cope with as a matter of public life. Inevitably, nearly all of us wind up paying more attention to some parts of this vast territory than to others, and those choices we make say as much (probably more) about us as about those whom we would judge.

The Wikipedia entry on morality includes this useful paragraph:
“If morality is the answer to the question ‘how ought we to live’ at the individual level, politics can be seen as addressing the same question at the social level, though the political sphere raises additional problems and challenges. … Moral foundations theory (authored by Jonathan Haidt and colleagues) has been used to study the differences between liberals and conservatives, in this regard. Haidt found that Americans who identified as liberals tended to value care and fairness higher than loyalty, respect and purity. Self-identified conservative Americans valued care and fairness less and the remaining three values more. Both groups gave care the highest over-all weighting, but conservatives valued fairness the lowest, whereas liberals valued purity the lowest. Haidt also hypothesizes that the origin of this division in the United States can be traced to geo-historical factors, with conservatism strongest in closely knit, ethnically homogenous communities, in contrast to port-cities, where the cultural mix is greater, thus requiring more liberalism.”

In the book The Hidden Agenda of the Political Mind, researchers Jason Weeden and Robert Kurzban argued that morality is often based in selfishness: “we often perceive our own beliefs as fair and socially beneficial, while seeing opposing views as merely self-serving. But in fact most political views are governed by self-interest, even if we usually don’t realize it … we engage in unconscious rationalization to justify our political positions, portraying our own views as wise, benevolent, and principled while casting our opponents’ views as thoughtless and greedy.”

Or, when socially broader, morality can be used as a lever, an ideological tool to stop other people from doing what (we think) is harmful to them.

So what can we say of morality that people across our society can accept and understand in a common way? Not much, apparently. “Morality” has become a code word, with provisions that would be commonly understood only in split-off – and often conflicting – elements of society. It’s a brickbat, not a standard of conduct. It will not mean more until people in America reach beyond it and come to some common agreements – which they seem not to do at present – about what actually is good and bad.

Our conceptions of morality, evidently, are flying apart, and some seemingly logical center is failing to hold.



The legitimate object of government is to do for a community of people whatever they need to have done, but cannot do at all, or cannot do as well for themselves, in their separate and individual capacities.
► Abraham Lincoln

an attack phrase on centralized federal authority and massive taxation and expenditure.
► William Safire, Safire’s Political Dictionary

Safire’s assessment was largely right: a statement including the phrase “the proper role of government” is apt to be an attack on same, with the idea that whatever the main subject at hand is, is not within the “proper role of government.” (Ironically, the supporters of government activism tend not to talk much about the proper role of government as such.)

The attack tends to bundle three interrelated problems.
First, all of government is smeared as needing limitation in a way no other aspect of society is; did all those government officials, high and low, drink some special Kool-Aid?

Second, it forgets that government is interactive with the rest of us. (You don’t think businesses affect what governments do? Churches? Even, from time to time, individuals? Activist groups?)

Third, and in connection with that, obsession with government’s proper “role” diverts attention from other sources of power in our society – which can gain more power in turn, as long as we’re warned not to look in their direction.

One of the foundations of “proper role” rhetoric in recent years is an article written by Ezra Taft Benson, a secretary of agriculture in the Eisenhower Administration, called “Proper Role of Government.” It cleanly isolated a central kernel of the limited-government argument:

“… the proper function of government is limited only to those spheres of activity within which the individual citizen has the right to act. By deriving its just powers from the governed, government becomes primarily a mechanism for defense against bodily harm, theft and involuntary servitude. It cannot claim the power to redistribute the wealth or force reluctant citizens to perform acts of charity against their will. Government is created by man. No man possesses such power to delegate. The creature cannot exceed the creator. In general terms, therefore, the proper role of government includes such defensive activities, as maintaining national military and local police forces for protection against loss of life, loss of property, and loss of liberty at the hands of either foreign despots or domestic criminals.”

It’s a thoughtful contention – if government is created out of the authority of the people, how can it do anything you or I individually cannot do? – but it doesn’t hold up to scrutiny.

You and I as individuals lack the authority to do even the basic things Benson argues governments must do, such as protect people from harm and provide for the common defense. Accomplishing those things means commanding people to do certain things; we, as atomized individuals, have no such authority. Nor do we have authority to raise money – other than by begging for it, which as a practical matter wouldn’t work – to accomplish those things.

Authority to do anything that government does, including those things the staunchest libertarians would endorse, means governments have authority beyond that of individuals.

If we want to get into theorizing about how this might be justifiable, simple enough answers are available. These grow mostly out of the concept of the social contract: By living in and benefiting from our society, we give up some of our absolute liberty in the interest of gaining other compensating advantages. It’s transactional; a tradeoff. We can (and many of us do) disagree about the precise terms of that deal, but such a deal is what people who live in any society – whether one like the United States or one drastically different – officially or unofficially agree to. If you don’t, you leave, or you might be punished by the other people who stay.

And there are people who try to withdraw to some extent, sometimes in minor and subtle ways and sometimes in ways more obvious. But most of us take the tradeoff, knowingly or unknowingly, sometimes grumbling as we do. But nonetheless we do.

Benson said in his article that he draws much of his philosophical inspiration from the United States Constitution. But the Constitution itself disagrees with his reductionist view. Its first sentence says this: “We the People of the United States, in Order to form a more perfect Union, establish Justice, insure domestic Tranquility, provide for the common defence, promote the general Welfare, and secure the Blessings of Liberty to ourselves and our Posterity, do ordain and establish this Constitution for the United States of America.”

What it takes for government in America to accomplish these things has varied with time, often has been debated, and sometimes has moved into unpredictable areas.

Few members of the founding generation were as fierce in their calls for limited government as Thomas Jefferson, but in 1803 he doubled the size of the United States with a purchase of land from France, a purchase for which neither he nor Congress had any clear constitutional authority. (Analysts of that day twisted themselves into pretzels in attempts to find it.) Jefferson did it because like many other people he had become convinced that it was in the interest of the larger aims of the United States. Despite its uncertain constitutionality, it hasn’t been significantly questioned since.



He came close to becoming governor of Idaho.

Ray Rigby was for a time a candidate for governor of Idaho, as a Democrat, in 1970. But only for a while; he didn’t file, and wasn’t on the ballot, and didn’t get very close to the office that year. But in the process he and the eventual nominee and governor, Cecil Andrus, became good friends, and four years later he was Andrus’ top choice for lieutenant governor.

In 1974 the Democratic nomination for lieutenant governor had some value, both evident and as yet unseen. It was a good year for Idaho Democrats – one of the two best in the last half-century – and the nomination drew four serious candidates. Rigby had strong support but came in second, outdone by fellow state senator John Evans, who had pulled in most of the still-strong labor backing. But that four-way race was highly competitive.

And that was appropriate, because two years later Andrus was tapped for interior secretary, and Evans became governor. It could have been Rigby.

If there was some significance to that, Ray Rigby, who died last week at his lifetime home in Rexburg, never seemed to dwell on it. He had a busy life in his profession and his church as well as in politics. His political adventures back in the 60s and 70s, when he was one of the leading figures in the Idaho Legislature and a serious prospect for higher offices, hardly even figure now in many of the recollections of him.

The biographical article about Rigby in the Idaho Falls Post Register, for example, focused more on water.

And that’s not a mistake. Ray Rigby was a water lawyer and one of the leaders in shaping Idaho state water regulation – his son Jerry has followed in those footsteps – and he was one of the people who helped create Idaho’s water regime when it was in formative stages half a century ago, turning it into something like what the state has now. Rigby was a practicing lawyer who represented clients, which included many of the larger water operations around eastern Idaho, and he had clear points of view about how things should be. But he was willing to compromise, willing to work with a wide range of people, and willing to experiment.

In the oral history book Through the Waters (disclosure: I published and helped edit it), which tracks the story and history of the Snake River Basin Adjudication, Rigby emerges as a major figure in setting up the state’s water structure after the Swan Falls Dam court decision in 1982. (In comments after his death, two of the other major figures in that work, then-Attorney General Jim Jones and his resources division chief, Clive Strong, reaffirmed that.) He was the practical source of the idea of trust waters – a key concept in putting the Snake River water rights agreements in place – and also important because he was so widely trusted, across party lines and across a range of interest groups.

Idaho has one of the best water management systems in the country. The most important ingredient allowing that to happen has been trust, a willingness for people with varying interests to work in good faith with each other. Ray Rigby epitomized that, and he helped make that happen.

He could set a good example for policy makers and political people in Idaho today.



How much is a college education worth?

How much is it worth to have attended, or obtained a degree from, the “right” college?

Rounding out the trio of questions: How much should it be worth?

We can get at such questions through the doorway of credentialism, a term almost begging for widespread use on either the political left or right, or maybe both.

Let’s put this into context first. As human society has developed, more information, and more specialized skills and understanding, have been needed to cope and prosper. A person in the 1600s had to understand far more than a counterpart in the 1100s. Someone living in 1800 simply did not need to know as much, to function effectively, as someone living in 2000. Education has helped create the progress, and it also makes itself more necessary as progress continues. More education helps; higher quality education helps more. Of course, let us not forget this, either: Education can come from many sources (a person educated at a fine college who never picks up another book after graduation likely will be far less well educated than a high school grad who continues to learn). And let us not forget that reputation does not necessarily equal actual quality or performance.

Which is to say, most of the academics I’ve met over the years have struck me as highly intelligent people, but I’ve met a few Ph.D.s I wouldn’t trust to park my car.

Next stop, credentialism: “a concept coined by social scientists in the 1970s, is the reduction of qualifications to status conferring pieces of paper. It’s an ideology which puts formal educational credentials above other ways of understanding human potential and ability.”

In the 1960s, amid the rethinking of many social institutions, an approach (fostered by critics such as Ivan Illich) “proceeded from the assumption that most if not all of the skills needed to competently perform the work tasks carried out by many professionals could be acquired through practical experience and with much less in the way of formal schooling than is usually needed to obtain the ‘required’ credentials. From this perspective, the disguised purpose of much formal schooling (its ‘hidden curriculum’) is to impart a particular disciplinary paradigm, ideological orientation, or set of values to those seeking formal credentials to work in prestigious or ‘high-status’ fields such as medicine, law, and education. Furthermore, the credential systems developed in a number of occupational areas are part of the ‘collective mobility projects’ of practitioners to achieve a ‘professional status’ that brings with it greater material and symbolic rewards. Thus credentialism is closely associated with strategies of ‘social closure’ (to use Max Weber’s expression) that permit social groups to maximize rewards ‘by restricting access to resources and opportunities to a limited circle of eligibles.’”

The concept has been pushed much further since then – as well as the pushback against it.

So, for example, we get employers who hire only from Ivy League schools (including many parts of the federal government in Washington), no matter the demonstrated knowledge, background, skills and other assets other applicants might bring. The dynamic reaches out on the other end to parents frantic to get their kids into top-rank schools (leading to such corruption as the 2019 college admissions scandal), and to the exploding cost of higher education.

As a character on the TV show The Sopranos said, back around 2000 (on the subject of college admissions), “It’s not about grades any more. It’s all, who you know and how many buildings you give.”

Young adults will find they need more education than did their predecessors, but turning the process into a game of extreme musical chairs will lead to social disaster – not least because many students do not go on to college, do not finish or do not attend prestige schools; and there aren’t nearly enough spaces for everyone even if all chose to attend. Better answers are needed.

Joseph Fuller, a Harvard University academic, is among those giving the matter some thought, and serving as a critic (ironically maybe, given his professional perch) of over-credentialism in the job market. In a study called “Dismissed by Degrees,” he and co-author Manjari Raman reported that “Degree inflation – the rising demand for a four-year college degree for jobs that previously did not require one – is a substantive and widespread phenomenon that is making the U.S. labor market more inefficient. Postings for many jobs traditionally viewed as middle-skills jobs (those that require employees with more than a high school diploma but less than a college degree) in the United States now stipulate a college degree as a minimum education requirement, while only a third of the adult population possesses this credential.”

As a matter of politics, this tendency leads to anger at the educated elites (mainly on the right) and a socially restrictive push toward income inequality (on the left), among other results.

Credentialism is a term and an issue in land mine status.