Culture

Web gives volume to whispers of assault

When I was in college, in the bygone days of typewriters and corded phones, there was a rumor of a gang rape on campus. A "town" girl had gone back to a fraternity house with a boy, and several others ended up having sex with her against her will.

Or so the story went. Many on campus fumed, avoided the suspected rapists and waited for the college administration or the police to act. Months went by. Nothing happened.

We graduated and went our separate ways. I suspect that the officials involved -- not to mention the young men -- were relieved. But regardless of what really happened that night at the frat house, the way it went unaddressed instilled distrust in me, and perhaps in thousands of others who were on campus at the time: Would people in charge stand up for women's safety and dignity?

Having to ask ourselves that question meant we lost some innocence about the world we were about to fully enter. And it raised the possibility that, maybe, ignoring ugly realities is right. The smart thing to do.

But now, that sort of official privilege has gone the way of the typewriter and corded phone -- as two recent stories of rape illustrate. Hundreds of protesters gathered in eastern Ohio last Saturday to "Occupy Steubenville." They called for justice in the case of a 16-year-old girl, who was allegedly drunk to the point of unconsciousness last August, carried around to parties and sexually assaulted while others watched. The girl was from across the Ohio River in Weirton, W.Va., and the accused are Steubenville High School football players.

The alleged assault became public in the days afterward, when partygoers posted photos and reports to Instagram and Twitter. Two 16-year-old boys were arrested and charged, and they face trial on Feb. 13. They maintain they are innocent.

However, some in the community were not finished with this case. They became convinced that the investigating sheriff wasn't taking it seriously enough. Online, a branch of the hacker collective known as Anonymous accused the sheriff of deleting video evidence, noted his friendship with the high school football coach, and began leaking information on people it believed were covering up the full extent of the assault.

Last weekend, protesters arrived from around the country -- like Occupy Wall Streeters, many wearing Guy Fawkes masks. Some speakers told their stories of being raped.

Steubenville city and police officials have been forced to respond by establishing their own website about the case, which they say is intended to sort fact from fiction.

Teenagers are obsessed with documenting their lives online, and oversharing and even sexual cyberbullying are real problems. But without social media, this case never would have gotten such broad attention. And it's all but certain that Steubenville officialdom would not be trying to explain itself online to a bewildered international audience.

The Ohio story has parallels to the horrific alleged gang rape and fatal beating last month of a 23-year-old physiotherapy student on a bus in Delhi, India. Outraged, thousands of people took to the city's streets, only to be met by police with tear gas and long sticks. The government closed roads to discourage protests, but instead, as word spread through social media, protests sprang up around the country. Six men have been arrested.

No doubt, there is danger in rushing to judgment and in anonymous online reports. People's reputations and lives are at stake. But social media are proving useful in calling for justice in rape cases, where justice is still too often uncertain, inconvenient and easily avoided.

This essay was first published in Newsday.

Can mommy bloggers harness their political power?

When weighing the good and bad technology has brought us, here's one to add to the plus column: mommy blogs.

The cutesy name is deceptive. These online diaries reveal the messy reality of raising children American-style - a job that has long been relatively isolated within each family's home. By chronicling the ups and downs of parenthood, these web writers have fashioned community support for millions.

Starting small in the late 1990s, the mommy-blog phenomenon has exploded to about 4 million writers in North America, according to online marketers, and many times more readers. One of the most popular writers, Heather Armstrong of Dooce.com, has over a million followers on Twitter. Mommy blogs have multiplied so rapidly that parent website Babble.com expanded its annual Top 50 ranking last year to the Top 100 Mom Blogs. The 2012 list came out last week.

Of course, the profit motive being what it is, companies with products to sell began wooing the bloggers a half-dozen years ago. Disney, Walmart and Procter & Gamble, among others, recognized them as "influencers" of buying decisions. And lately, they've been attracting political attention as well.

In August, seeking re-election, President Barack Obama opened an annual conference of female bloggers in New York City by live videoconference. Last month, the premier of British Columbia, Canada, Christy Clark - who is polling badly among female voters - invited blogging moms to her Vancouver office for a chat.

Overtly courting women's votes dates back at least to the soccer moms - married, middle-class suburban women with school-age children - in the 1996 American presidential campaign. Women have cast between 4 million and 7 million more votes than men in recent elections, according to the Center for American Women and Politics at Rutgers University. And this year, for the first time since the Gallup Organization began keeping this sort of record in 1952, the candidate that men overwhelmingly preferred lost.

So, are mom bloggers exercising political power? As it turns out, they don't blog about much that you'd call political. They're generally not endorsing candidates or advocating for legislation. Instead, their topics are often mundane - recipes, shopping, cute things the kids did, pets, frustrations - and also personal: depression, sex, drinking, rage, boredom, self-doubt.

Catherine Connors wrote on her top mommy blog, HerBadMother.com: "I am a bad mother according to many of the measurements established by the popular Western understanding of what constitutes a good mother. I use disposable diapers. I let my children watch more television than I'd ever publicly admit. I let them have cookies for breakfast. ... I have thought that perhaps I am not at all cut out for this motherhood thing."

She goes on to reject the idea of a "community consensus" about what makes for a good mother. After 50-plus years in which child care experts have judged whether mothers are good enough - based on employment, sleeping arrangements, grocery choices, self-abnegation and 1,001 other criteria - having mothers confess who they are and receive the acceptance of a vast online community may be among the more political acts of our time.

Perhaps if we can get past the artificial barriers of who's a good-enough mom - call a cease-fire in the so-called Mommy Wars - we could begin to act collectively and exercise some real political power. We could harness those millions of readers to advocate against cuts to child care subsidies and in favor of paid leave to care for infants.

The Internet has given mothers this platform. It will be interesting to see what they do with it.

This essay was first published in Newsday.

Embracing the new normal

There's nothing like a life-shaking storm to make people appreciate normal. Usually, normal is ho-hum. But when life is turned upside down, normal is the most welcome feeling.

Normal didn't return for me, after superstorm Sandy, when we got our power back or refilled the refrigerator. It was when I saw faces I hadn't seen since before the storm - about two weeks after it knocked our Island around. There we were, smiling, most of us showered, and whole. Normal returned when I realized that people in my community were, for the most part, going to be OK.

That's not the same as saying life will be the same as it was before the storm, or before this long recession. Instead, we're living with a "new normal" - a sense that we must permanently lower our material expectations. Maybe the new normal will define our moment in history.

Some day, years from now, we may think of these times the way people recall the Great Depression. People who lived through it went on to stash away money - sometimes in places far away from banks they no longer trusted. They hoarded food; waste became a sin. Our recollections of 2012 may be that this was the year we acknowledged how much we depend on each other.

Our country has weathered a long series of blows. The banking crisis of 2008 diminished or zeroed out our home equity. High school graduates applied to cheaper colleges, and college graduates couldn't find jobs. Stretches of unemployment lengthened, people couldn't pay their mortgages, and then ... Sandy.

It's fair to say that many of us are feeling wiped out. The storm destroyed thousands of homes on Long Island and took more than a dozen lives. It's the sort of thing that makes normal seem miraculous.

You probably think I'm going to say that we should be grateful for normal. It is Thanksgiving Day, after all. Children's smiles, purring kittens, dry basements and the smell of coffee. Yes, all of that.

But another point is worth remembering: As the winds have receded, it's impossible to miss the compassion going around. We heard about the occasional tempers flaring as people waited in hours-long gasoline lines. But for the most part, we were patient with one another. Those with generators opened their homes. A friend cooked all the chicken from her neighbor's powerless freezer and fed the neighborhood. An out-of-state tree cutter returned to one woman's home, after his shift was over, to make sure she had lights and heat. Fire departments set up cots for utility workers who were far from home.

Everyone has storm stories like this.

During this recession, unlike those of the past, volunteerism has been on the rise, according to Wendy Spencer, chief executive of the federal Corporation for National and Community Service. What motivates volunteers, she says, is connection to community, and a sense that we are all going to have to contribute if we are going to achieve community and national goals.

This year's re-election of President Barack Obama seemed to me to be an affirmation of depending on each other, with a vision of prosperity for the broadest number.

I don't hear people talking now about what they can get out of the government. They are discussing buying generators when the price goes down and how long food will keep in a freezer if you leave it sealed. They're vowing to fill the gas tank at the next storm warning.

People aren't acting like victims. They're adjusting. They're finding a new normal. It's one of the things we as a people do best.

This essay was first published in Newsday.

Individualism vs. collectivism is a false choice

Some people say the 2012 presidential race was a contest between worldviews. On one side is the collective view (represented by President Barack Obama), and on the other, the idea that the individual succeeds on his or her own (promoted by Mitt Romney).

Think of the sound bites we had on these themes - from Rep. Paul Ryan's admiration for ultra-individualist Ayn Rand to Obama's reminder that business people didn't "build that" by themselves. They had a country behind them.

Superstorm Sandy, as if on cue, blew in to provide us with daily reminders of how we need each other. Driving past a recently bisected tree that had been blocking my daily commute, I know: I didn't cut that.

Neighbors have been checking on one another's well-being. Even in the heat of the close presidential contest, leaders of opposite parties returned to civility. Perhaps New Jersey Gov. Chris Christie, a Republican who considered challenging Obama, understood that he might need the White House - whether it's inhabited by an R or a D.

In "The Social Conquest of Earth," published earlier this year, naturalist Edward O. Wilson argues that humans evolved as we did precisely because we have strains of both individualism and collectivism. Wilson, who has spent years studying ant colonies, updates the idea that the fittest individuals survive. In fact, groups in which individuals sacrifice for the good of the collective have, over millions of years, won out. "Selfish individuals beat altruistic individuals," Wilson writes, "while groups of altruists beat groups of selfish individuals."

Groups that are willing to share, to withhold individual rewards in order to further the growth of the collective, emerged from the evolutionary contest to become modern humans.

But we've retained characteristics of both, Wilson says. We are forever stuck in between selfishness and generosity. If we were all-out collectivists, we would cooperate robotically, like ants. As extreme individualists, humans wouldn't have formed societies where we specialize in healing, finding food and building shelters.

It's that tension of being stuck in between that played out in the presidential election - and will continue to bedevil us. What's the right place on the spectrum? Does it change after a hurricane?

Individualists say that when people are free to act in their own self-interest, society benefits. This philosophy promotes hard work and worries about creeping totalitarianism.

Collectivists point out the many things we accomplish together that we wouldn't do singly - efforts that spread the cost over many people and even many generations: medicine, the university system, roads and airports, our judicial system, arming a military, fighting fires.

People who hold the collectivist view fear that job creators want an excuse for greed and special tax treatment.

The great thing is that we don't have to choose between these views - no matter what you heard on the presidential campaign trail. Science argues for some of each.

So, what will it take to lift us out of the Great Recession? Other catastrophes, like the Great Depression, have catalyzed collective solutions. We emerged with the Social Security Act and the GI Bill and a sense that we're in this together.

It's been difficult over the past several months to feel a sense of fellowship, however. I purposely didn't vote for either candidate in my Assembly district yesterday because I thought they made false, destructive claims about each other.

So, I'm glad it's the day after Election Day. Let's set aside fake choices and use all of our abilities to move on.

This essay was first published in Newsday.

What's up with the U.S.'s declining birth rate?

End-of-the-world scenarios have been circulating forever. Some think the world will end with the Mayan calendar later this year. But I believe I've seen the real doomsday. Our species will simply fail to reproduce.

That's my conclusion from two news items. The first is from the U.S. Census Bureau, which announced a baby "bust" last fall. The census shows that, in 95 percent of counties across the United States, the share of the population younger than 18 was smaller than in 2000.

There are now more households with dogs than with children.

The other piece of evidence is a book published this month by feminist author and blogger Jessica Valenti: "Why Have Kids?" A new mother herself at 33, she looks at the unhappiness among parents with young children and asks this very relevant question: Why do it?

Based on her interviews, Valenti concludes that it's the chasm between the idealized parental life and reality that causes so much woe. Americans glorify the mother alone at home raising kids.

It may be tempting to tut-tut Valenti and tell her that she'll get used to the lack of adult conversation and the jobs that require either 24/7 commitment or unemployment, with nothing in between. But her perspective may well spring not so much from her phase of life as from our time in history. Or, as we've begun to say about this economy that refuses to improve, her complaint is the new normal.

Raising children well has become increasingly difficult. I blame it on my generation - those of us who have teenagers, as I do, and older kids. Instead of banding together to wrest better policies from government and employers - or to create strong communities to assist one another - we've indulged ourselves in divisive "mommy wars." We have bickered about which is better: attachment parenting or free-range? Stay-at-home mothers or moms with paychecks? Opting out or having it all?

In 1996, we heard that it takes a village to raise a child, and we looked the other way.

Now, Americans are having fewer children. In 2007, according to the census, the average number of births per American woman was 2.1. That's just enough to hold the population steady. Last year, however, the birthrate fell to 1.9, the lowest in decades.

Have we decided that it's too difficult to go on - at least in the United States? France is still reporting somewhat higher birthrates. Perhaps the French crèche system of universal day care - which, by the way, supports an employment rate of 80 percent among French mothers - has a lot to do with providing young families with the resources they need to feel happy and hopeful enough to keep having children.

The reasons for the decreasing U.S. birthrate are many. The financial crisis of 2008 made parents fearful of another bill. The annual cost of center-based day care for an infant in 35 states - New York among them - is higher than a year's in-state tuition and fees at a four-year public college.

Wages have been falling for 40 years, which means that many household budgets require two, three or more jobs. Forget about quality family time with that schedule. One New Jersey town recently hired soccer coaches because it could no longer count on parents having the leisure to volunteer. Not only will we have fewer kids in the future, it looks like we can forget about fielding a team for the World Cup!

We could reverse these trends, if we believed that saving the species were important enough. We could fight for better policies. Or we could accept the situation and look on the bright side: It will be a lot easier to navigate store aisles without all those annoying baby strollers.

This essay was first published in Newsday.

Candidates must give their version of moonshot

Mitt Romney's mention of the late Neil Armstrong during the Republican National Convention on Thursday raised cherished images for Americans of a certain age. Those of us who remember the Apollo 11 days can still recall that excitement and sense of purpose. We're nostalgic for it now.

No one would look to the 1960s as a united decade in our history. But as Armstrong took those first steps on the moon in 1969, it became clear that a bold commitment by President John F. Kennedy had driven us forward.

Today, we are drifting through a prolonged economic valley and a divisive presidential race. Commitment to another bold goal would target our energies and revive our faith.

In his convention speech, Romney presented his version of shooting for the moon: creating 12 million new jobs. His five-point plan to reach that goal includes North American energy independence by 2020, school choice, rewritten trade agreements, a reduced deficit, and lower taxes and costs for small businesses.

It will be crucial for President Barack Obama to similarly paint his vision of the path forward during the Democratic National Convention, which opens this week.

It's hard to overestimate what a gamble Kennedy took, as a new president in May 1961, to promise a man on the moon "before this decade is out." At the time, many of the necessary metal alloys and technologies hadn't even been invented.

He intended to prove the United States' cultural and military superiority to the Soviet Union. Just a month earlier, the first cosmonaut, Yuri Gagarin, had orbited the Earth. But the machismo of beating an opponent to conquer this so-called last frontier wasn't the only thing that was so important about Kennedy's promise. It was also having a clear goal that for many years inspired our imagination with a sense of national mission - and, after 1969, with a national identity.

We had done it first.

Where is our national identity today? U.S. astronauts must now hitch rides on Russian spacecraft to get to the International Space Station, and the United States may be outraced toward certain space goals by the Chinese.

But these developments should be cause for celebration. The United States has matured enough in space exploration to share frontiers with scientists from around the world. If globalization has its faults, then shared scientific advancement is among its bright promises.

Obama's goals for NASA are probably too far distant in time to offer much of a unifying purpose. He wants to send a crew to a near-Earth asteroid by 2025 and have an astronaut on Mars by the 2040s. Far-off deadlines won't force the sort of compressed technological advancement we achieved from the original space race.

Among the side benefits of that era is the ability to screen for breast tumors, defibrillate hearts, track hurricanes and ocean fish, grow higher-yielding crops and pay at the gas pump with an ATM card. A nearer, more tangible goal is needed to propel similar innovation.

It's not enough for Americans to come together around a negative, as we did after the tragic Sept. 11 attacks or the hunt for Osama bin Laden. We need to agree on what we want to accomplish.

We could commit to making our public schools so good that we stem the flight to private schools. To building big infrastructure projects to create jobs. To reducing mortgages to reflect the current market and prevent foreclosure. To matching young people with careers that allow them to become productive and independent.

The list goes on, and we won't all agree what should be on it. But it's certain that the prize of the next presidency depends on how each candidate imagines the next footprint on the moon.

This essay was first published in Newsday.

Less homework is a good thing

As school doors swing open, it will be time once again to engage the homework battles.

A major front, every year, is the parents' complaint that schools give too much homework. This campaign has received recent reinforcement with the publication of "Teach Your Children Well" by Madeline Levine, a psychologist who treats adolescents in affluent Marin County, Calif. Levine says that high-pressure parenting with Ivy League goals can leave kids feeling empty inside. Family rituals that generate enthusiasm and contentment are being lost.

Canada has gotten this message. Education officials there have directed schools to make sure students are not overloaded. Toronto schools, with nearly 300,000 kids, have limited elementary school homework to reading, eliminated holiday homework and adopted language endorsing the value of family time.

U.S. schools are also experimenting with reduced homework, but there is no comparable directive here.

The Banks County Middle School in Homer, Ga., stopped assigning regular homework in 2005. Grades are up, and so are results on statewide tests.

The Kino School, a private K-12 school in Tucson, Ariz., allows time for homework during the school day. Kids can get help with the work if they need it, or spend the time socializing and do their homework later. Giving kids this choice teaches them to manage their time.

Not all the experiments are positive, though. In the 2010-2011 school year, the schools in Irving, Texas, stopped counting homework as part of a student's grade. After six weeks, more than half the high school students were failing a class - a huge increase. The kids seemed to lack the judgment and experience to know on their own when additional studying or work outside class is needed in order to pass tests and complete projects.

There ought to be a middle ground. Mandating "no homework" days or weekends, or setting guidelines for how much time children should spend on homework according to their age, seems reasonable.

One leading researcher, Harris Cooper at Duke University, recommends 10 minutes of homework a night for each grade a child is in. In other words, 10 minutes in first grade, 30 minutes in third grade, etc. For middle school and high school students, Cooper found no academic gains after one-and-a-half to two hours of homework a night.

Couldn't teachers assign homework only when the work really can't be accomplished in school? Say, for a project where kids are interviewing various people on a topic?

Cutting back on homework can make the difference in whether some students even attempt the assignment. And teachers who assign large amounts of homework are often unable to do anything more than spot-check it. Shouldn't teachers have the time to read homework closely, so they can see whether kids are learning?

One problem is that parents have trouble even finding out what the assignment is. This sounds straightforward, but parents for the most part only know what kids tell them. In this digital age, schools should communicate better.

Poorly thought-out assignments can make students cynical about school and crush their love of learning. I'm sure you've heard the perennial question, "Am I really going to use this after I graduate?" Some countries teach their children well without much homework. In Finland, for example, which ranks near the top in science worldwide, a half-hour of homework in high school is the norm.

Like many things in life, homework may be a case where less really is more.

This essay was first published in Newsday.

Springsteen's public talk about his depression

With its 140-character limit, Twitter is anything but subtle. A tweet from a recent 19-page New Yorker profile of Bruce Springsteen focused on just one thing: his admission that he's suffered depression.

It's startling that a person so fabulously successful could have been depressed. Even more surprising, depression hit after his breakthrough commercial album "Born to Run" in 1975.

The profile recounted a 1982 drive Springsteen made from the East Coast to California, and then back again. "He was feeling suicidal," his friend and biographer Dave Marsh is quoted as saying. The New Yorker's David Remnick, who wrote the profile, says Springsteen could not let go of "a sense that he had inherited his father's depressive self-isolation." He began seeing a psychotherapist.

Because of his status, Springsteen's revelation may reduce the persistent stigma surrounding depression. Kay Redfield Jamison, a professor of psychiatry who recounted her manic-depression in the 1997 book "An Unquiet Mind," talks about the "silent successful" - people who get well from psychiatric illness but are afraid to speak about it.

"This reluctance is very understandable, very human, but it is unfortunate because it perpetuates the misperception that mental illness cannot be treated," she has written. "What remains visible in the public eye are the newspaper accounts of violence, the homeless mentally ill, the untreated illness in friends, family and colleagues."

People can recover and lead normal lives, or even achieve stardom like Springsteen and, as I recounted in this space upon his death in April, TV journalist Mike Wallace. Springsteen's admission speaks to a new audience and generation.

But without role models for healthy recovery, individuals may reject treatment for fear they'll be mistaken for "crazies" who dye their hair orange and allegedly hunt movie patrons. They rob themselves of the chance for recovery.

Silence also lets decision-makers remain blindered to the prevalence of depression and other mental illnesses. Our society is in denial, to use the psychiatric term. One result is that we place too little importance on adequate funding for mental health offices at universities. Mass murderers keep emerging from campuses where they've had at least some contact with psychiatric counselors - the University of Colorado (suspect James Holmes), Pima (Ariz.) Community College (suspect Jared Loughner) and Virginia Tech (Seung-Hui Cho). Shouldn't we be asking why they walked out of the counseling offices to kill dozens of people? Perhaps we fail to take campus counseling seriously because we believe mental illness cannot be controlled. The contrary truth is another casualty of the silent successful.

We should applaud those who speak openly. Chicago Rep. Jesse Jackson Jr., who took a leave from Congress on June 10, recently disclosed that he is being evaluated at the Mayo Clinic in Rochester, Minn., for gastrointestinal issues and depression. Will he be accepted as a functioning member of Congress - as much as any congressman warrants that description nowadays - if he returns to his job? Will a political opponent use his need for treatment against him? I'm afraid you can count on it.

It's less risky for Springsteen, now 62 and apparently long past his darkness, to recount it at this point. No one is going to skip a concert because The Boss sees a shrink. Still, his owning up will color how we view the art he has created. It alters his legacy. So coming out about depression took courage.

Only time will tell if his courage helps a Jesse Jackson Jr. return to a full life, or inspires a fan to get help. Each act against silence inspires hope.

This essay was first published in Newsday.

Deny mass killers the celebrity they crave

This is a column about the man accused of killing a dozen innocent people in a movie theater in Aurora, Colo. But you won't read his name here.

I won't degrade our conversation with it. He doesn't deserve the attention, which apparently he craved. And potential copycats don't need the encouragement that might come from giving any more notoriety to this guy.

It's time we deprived such people of the fame they too easily achieve with their horrific acts. Maybe if we cut off the oxygen of attention, the murderous flames would die down.

I'm not the only one who thinks so. In 2007, a Nebraska man opened fire in a shopping mall and killed eight people before turning the gun on himself. He left a suicide note saying, "I'm going out in style. I'm going to be famous." Do you remember his name? Me neither.

Forget for a moment how sad a commentary that is on the state of our country - we have so many mass murders that eight in Omaha doesn't stand permanently in memory. The point is that, after that shooting, radio host Paul Harvey refused to say the killer's name on the air. It was his personal protest against a twisted being who would inflict so much damage for a place in the history books.

Some are taking a similar tack today. A Dallas columnist, Jeffrey Weiss, wrote: "You may have noticed I've never mentioned the accused shooter's name ... it's pretty obvious he wants to be famous. I can't stop that, but I don't have to be a party to it."

Jordan Ghawi, the brother of slain sports journalist Jessica Ghawi, 24, asked for people to focus on how his sister lived. He tweeted, "remember the names of the victims and not the name of the coward who committed this act."

President Barack Obama, after visiting with victims and families at the University of Colorado Hospital on Sunday evening, gave a televised speech saying that it will be the good people and the heroic acts that will be remembered - not the alleged gunman. The president didn't mention the suspect by name, assuring listeners that though there was a lot of focus on him now, eventually that notoriety will fade.

"In the end, after he has felt the full force of our justice system, what will be remembered are the good people who were impacted by this tragedy," Obama said.

Exactly so.

It's possible that studying the shooter's life - his education, his family, his habits - could help us learn how to avoid another tragic massacre. But the thousands of hours of reporting and writing about the Columbine and Virginia Tech killings didn't prevent the Aurora tragedy. The attention may instead have encouraged escalating horrors.

But we would rather not accept that randomness and chaos can reach us at benign events like a movie premiere. We study the motives of a killer to calm our fears and find "reasons" for his actions - the way, when we hear of cancer, we hope to hear of smoking, or when we hear of an auto fatality, we're reassured if seat belts weren't used. We buckle up.

If I were in charge of the world, I'd withdraw at least one motivation for evil: the thrill of national attention. Take the cameras out of the courtroom. Let editors withhold photos from their publications. Let the justice system work its way to a result.

The ancient Athenians and the Quakers used to ostracize or shun people who refused to live by their rules. In baseball broadcasts, cameras often turn away from fans running out onto the field bent on mischief. Why reward that behavior with instant celebrity?

Celebrity, at least 15 minutes of it, is what Andy Warhol predicted was the fate of every person. Let's rob the Aurora killer of his remaining time on the clock.

This essay was first published in Newsday.

Must the state drag parents into piercings?

The New York State Assembly passed a bill last week requiring parents to sign a consent form for their kids younger than 18 who want to have a body part pierced.

I don't normally react badly to nanny state imperatives; I don't miss the trans fats in my New York City restaurant meals one bit. But the body-piercing age limit struck me as intrusive.

It happens that the week before this bill passed, my 14-year-old told me she might like to pierce her upper ear or navel. Those seemed pretty tasteful to me, and more reversible than a tattoo.

"I suppose I should act shocked so you won't take your rebellion phase any further," I joked.

But this is serious. What right does the state have to insert itself into my job as a parent? Forcing my kids to ask permission would turn casual discussions about boundaries and style into high-stakes negotiations.

As a mother of teens, I see how important it is to them to develop their identities. And if everything they do to express themselves has to have a parental sanction - well, that's no longer self-expression, is it? At least, not a self-expression they are in charge of. It takes the freedom of choice out of the teen's hands and puts parents in the role of censor.

Would I be more concerned if my daughter wanted something awful, like nipple or genital piercing? Or an ear gauge? Absolutely. But then, she wouldn't be likely to talk to me about it. Let's face it, this bill could pretty much put an end to body piercings under age 18.

The bill is in the State Senate now, and it looks likely to pass before the scheduled adjournment on June 21. Legislators are under pressure after news stories in April revealed that kids as young as 12 were able to get body piercings for $20 in the East Village.

Some shops won't do the procedures on anyone who can't prove they're 18, and local laws in some places back them up. But there's no statewide minimum age, and if one parlor refuses to honor a young customer's wish, he or she can always shop around.

There is a certain logic in the body-piercing bill, since teens younger than 18 cannot get a tattoo, even with parental permission. A tattoo artist who breaks this law faces a class B misdemeanor charge, meaning a fine of up to $500 or as much as three months in jail. The body-piercing bill would carry the same penalties.

It's certainly hard to argue against parents being informed about body piercing, since it comes with health risks: allergic reactions, infections and scarring. Piercings can be easy to hide, but parents can watch for health problems if they know about them.

And piercing-shop owners may welcome the law. Who wants the legal liability for maiming or sickening a young client? They would probably be glad to be rid of the pressure to give a 12-year-old a tongue stud.

Katie Ragione at Tattoo Lou's in Selden says the shop already requires notarized parental permission for body piercing; as for shops that don't, "we tell people to watch out for them." She's concerned that shops that cut parents out could be taking other shortcuts.

But it could also drive body piercing underground. Some piercers would still perform the work without parental permission, maybe at a far higher cost. Or, kids could simply grab a needle and an ice cube and do it themselves. If teenagers are determined to pierce something, they'll find a way.

Most other states have passed laws restricting body piercing for minors. Some Canadian provinces set the age at 16.

That lower age may just strike the right balance, and New York's legislators should consider that compromise. That would keep the younger kids out of the piercing parlors - and prevent the nanny state from treating older teens like babies.

Column first published in Newsday.

Mom and dad's electronic tether to campus

As college students return home this month for the summer break, their parents might not notice much of a difference. In a sense, for many of them, their kids never really left.

That's because some parents and college students keep in touch several times a day through cellphones, email, Skype and other technological marvels. A horrified English literature professor writes about this constant communication in a recent issue of The Chronicle of Higher Education, in "Don't Pick Up: Why kids need to separate from their parents."

"One student - a delightful young woman whom I know to be smart and levelheaded - confesses that she talks to her mother on the cellphone at least five, maybe six, even seven times a day," writes Terry Castle, who teaches at Stanford University. The student says she calls her mom whenever she gets out of class to tell her about the professors, the exam - whatever's going on at the moment.

"I'm stunned; I'm aghast," Castle writes. When she was an undergraduate, from 1971 to 1975, "all we wanted to do was get away from our parents! We only had one telephone in our whole dorm - in the hallway - for 50 people! If your parents called, you'd yell, 'Tell them I'm not here!'"

Castle never says whether her current students are different from those she taught in the past - more docile, perhaps? More obedient? But she does say that the willingness to defy or just disappoint one's parents is essential to emotional and intellectual freedom. Is the Class of 2012 at risk of remaining in mental chains?

The online responses to her essay are fascinating. One says that with parents paying as much as $55,000 a year for college, you bet they are going to check in. Another says this is probably a problem only at elite universities - the implication being that you needed to be a helicopter parent in the first place to get your kid into a top school. Another says parents are anxious because of the recession and feel they need to try extra hard to help kids find their place in the world.

Melissa Bares, who just finished her junior year at Stony Brook University, says she has friends with too-concerned parents, friends she describes as "babied." "They can't even make their own schedule without checking with their parents first," she says in an email.

Bares, a psychology major, speaks with her parents about school a couple of times a week - which seems normal to me. Both Bares and James Kim, another Class of 2013 student at SBU, defend parental involvement - but not over-involvement. "If a parent nags, it brings a lot of pressure," says Kim, a double major in chemistry and Asian-American Studies. He calls home about once a week. "If it's the right amount of nagging, you see students excel more."

He mentions a friend - a slacker - who could use a lot more parental oversight.

Jenny A. Hwang, who heads up mental health services at SBU, says parental involvement is crucial and can protect against alcohol and drug abuse, as well as depression and even suicide. But technology has made it so easy for parents to reach out that one of her roles is to counsel moms and dads about a healthy amount of communication.

"Parents can remain available and help students problem-solve," Hwang says, "without responding to that pull that's always there to make it all better."

Castle, the English professor, cites fictional orphans - Dorothy Gale, Harry Potter, Frodo Baggins - who are heroes of their own stories, to argue that psychological distance from parents is essential for kids to grow up.

That may be true, but distance, like many things, is better in moderation.

Column first published in Newsday.

Study: More young women than men consider career important

It looks like Supermom is here to stay. In a new survey, women ages 18 to 34 rated "high-paying career" high on their list of life priorities. For the first time, women in this age group were more likely than men to consider it important - 66 percent of women, compared with 59 percent of men. The last time this question was asked of this age group, in 1997, the sexes ranked "career" roughly equal in importance (56 percent of women and 58 percent of men).

At the same time, being a good parent and having a successful marriage continued to rank significantly high on everyone's list. "They haven't given any ground on marriage and parenthood," said researcher Kim Parker of the Pew Research Center, which conducted the study. "In fact, there is even more emphasis [on home life] than 10 to 15 years ago."

The story line over the past couple of decades has been that, for the most part, women would prefer to stay home with children. Those who could afford it were "opting out" of the workplace for home. The recent stir over Ann Romney's stay-at-home motherhood reawakened culturally conservative voices claiming that her choice is superior for women, and certainly better for kids.

But Parker believes that young women's expectations about the need to earn a paycheck are changing their attitudes. They were surveyed as the damage of the 2008 recession - dubbed the "mancession" for how men lost jobs disproportionately - was still playing out. "The reality is hitting women that they cannot rely on a male breadwinner," Parker says.

On a brighter note, she adds, young women have seen older women reap the fruits of workplace success and "are motivated to take on big roles." Women have been outpacing men for some time in earning college and graduate degrees. There are now three women on the Supreme Court, women play major roles in government, they're running large companies and building media empires - all of this inspires.

Pew also surveyed men and women ages 35 to 64; roughly equal shares (43 percent and 42 percent) said that being successful in a high-paying career or profession is important. In 1997, middle-aged men outpaced women by a wide margin: 41 percent to 26 percent.

The big rise in middle-aged women who care about their careers probably reflects both opportunity and necessity, Parker says. But, you'll notice that young women are more positive about work than their middle-aged counterparts. Parker believes that the allure of "having it all" wears off once women are faced with the reality of supermotherhood. In fact, moms who work full time have told numerous pollsters that they would prefer part-time employment if it were available to them.

Often, scaling back from full-time work means a loss of health benefits, seniority, security and status. Employers as a whole could be doing a better job to help moms cope - and as the women in the 18-to-34 age group move up and have children, perhaps there will be more reason for employers to do so.

Governments could also be doing more to raise the quality of child care and birth leave support for both fathers and mothers.

Finally, individuals need to do a better job of thinking through their competing desires, and choose careers that accommodate parenthood well. Doctors, lawyers and accountants - and people who are willing to shift into lower-paying nonprofit or government sectors - often find more flexibility in their schedules.

Supermom is great as a concept - using all of your human abilities in a lifetime. But there's a lot more that can be done to take the risk and stress off parents' shoulders.

Essay first published in Newsday.

Mike Wallace left his mark on awareness of depression

Mike Wallace, the groundbreaking TV newsman who died Saturday at 93, worked hard at earning his tough-guy image. During some of the most volatile events of our times, he asked pointed questions of powerful people: members of the Nixon administration, cigarette manufacturers, the Ayatollah Khomeini, Louis Farrakhan, champions of the Vietnam War. He tossed aside his nervy image, though, to highlight a problem that many men have difficulty admitting: depression. This revelation by a highly visible tough guy has encouraged untold numbers to seek help.

Wallace spoke publicly about his depression for the first time during a "60 Minutes" retrospective of his career in 2006. He told the camera that he had tried to commit suicide.

Before Wallace went public, his doctor advised him against owning up to the illness. "'That's bad for your image,'" Wallace quoted his doctor as saying, in an interview with the Saturday Evening Post. "But finally, I had to face up to it."

Although it's more common for women to suffer from depression, men with this affliction more often end their lives, according to research published in the journal "Suicide" in 2008. Because families and the press are reluctant to make suicides public, it's not widely known that suicides are far more common in the United States than homicides - an estimated 30,000 to 35,000 each year.

The majority of men who kill themselves have not asked for help before their deaths, according to Ciaran Mulholland, a psychiatrist and international expert from Queen's University in Belfast, Northern Ireland, whose findings hold for men all over the world. The reasons are poorly understood. Perhaps men are less likely to recognize that they are under stress or unhappy, Mulholland says, and more reluctant to consult their doctor about their distress.

Exacerbating the problem, health professionals are often less likely to consider a diagnosis of mental illness in men, Mulholland adds. This is true also for seniors, and especially African-American seniors, according to Charles Reynolds, a professor of psychiatry, neurology and neuroscience at the University of Pittsburgh School of Medicine.

Depression is hard to admit to. Even with everything we've learned about mental illness, it's often viewed as a moral or personal failing rather than as a medical problem. There's no blood test to show that someone is depressed, just a list of grim symptoms. Similarly, there's no clear understanding of what causes depression, just its risk factors: unemployment, social isolation, chronic illness.

Wallace was in his mid-60s when he plunged into a depression that put him in the hospital. He had been fighting a courtroom battle for his journalistic credibility, after being sued by Gen. William C. Westmoreland, who commanded the U.S. military in Vietnam from 1964 to 1968. Westmoreland alleged that he had been libeled by the Wallace documentary "The Uncounted Enemy: A Vietnam Deception," which claimed that U.S. leaders had deliberately underestimated enemy troop strength to prop up domestic support for the war. Westmoreland eventually dropped the suit.

In the midst of the trial, Wallace remembered, "I couldn't sleep, couldn't think straight, was losing weight, and my self-esteem was disappearing."

When Wallace spoke publicly about his depression, his message was that it is treatable - so much so that, at 88, he said the decades since he had begun taking antidepressants had been the best of his life. He lobbied for better health insurance coverage for mental illness.

When he embraced his depression, Wallace was motivated by a cause larger than himself. Just as he was in his journalism. This was one tough guy who could inspire us at our weakest.

Essay first published in Newsday.

Adrienne Rich: A pioneer in writing about motherhood

The world knew Adrienne Rich, who died last week at 82, as a poet - influential, political, feminist, lesbian, anti-war, Jewish.

But her profound impact on my life came in the form of prose: a 1976 book called "Of Woman Born: Motherhood as Experience and Institution." Rich, who was a wife until her 40s and the mother of three boys, trained her rebel's eye on the mixed feelings that come with caring for babies and young children.

To be sure, Rich had her predecessors on this ground: Betty Friedan, even the humorist Erma Bombeck. And Rich inspired thousands who came after, from Susan Maushart, who wrote "The Mask of Motherhood," to the many parent-lit moms and dads writing and blogging today.

It's not that parenthood is awful, of course. It's that mothers were to an excessive degree expected to be "beneficent, sacred, pure, asexual and nourishing," as Rich described it, or they would risk disapproval. Rich was instrumental in shattering these public myths that made women feel privately inadequate and unnatural if they discovered any forbidden feelings in the nursery.

More important, this long march away from the perfect angel mother toward a more nuanced - if darker - portrait of parenting paved the way for recognition of postpartum depression so that women and their families could get help. Even the impossibly perfect Brooke Shields published an account, in 2005, of her postpartum depression, "Down Came the Rain."

Rich wrote looking back. She was 46 when "Of Woman Born" was published, and her eldest son was 21. "I only knew that I had lived through something which was considered central to the lives of women, fulfilling even in its sorrows, a key to the meaning of life; and that I could remember little except anxiety, physical weariness, anger, self-blame, boredom, and division within myself: a division made more acute by the moments of passionate love, delight in my children's spirited bodies and minds, amazement at how they went on loving me in spite of my failures to love them wholly and selflessly."

She included journal entries from her days with babies; at one time her three sons were all younger than 7. The entries are startlingly candid: "Degradation of anger. Anger at a child. How shall I learn to absorb the violence and make explicit only the caring?"

"Of Woman Born" is sometimes overlooked amid Rich's 30 books of poetry and prose published over six decades. Its radical take on women's domination in a patriarchy is and was controversial. But the beautifully rendered descriptions of the inner life of this one mother, a poet, is what makes the book so reassuring to parents who can relate to the loss of independent identity and the isolation that comes with caring for a child.

What parent taking a phone call wouldn't recognize this passage? "As soon as [my son] felt me gliding into a world which did not include him, he would come to pull at my hand, ask for help. ... And I would feel [it] ... as an attempt moreover to defraud me of living even for fifteen minutes as myself."

Rich was born in Baltimore, and her father, a pathologist, encouraged her to read poetry from childhood. Her mother was a concert pianist. After graduating from Radcliffe College in 1951, Rich published her first book and soon after, married Alfred Conrad, a Harvard University professor.

They moved to New York in 1966. Four years later, Rich left her marriage, and within several months, Conrad took his own life. It's tempting to see the negative aspect of her writing as a product of this unhappy biography.

But most parents will recognize Rich's ambivalence as truth-telling.

Essay first published in Newsday.

'Hunger Games,' young adult films, reflect a grimmer culture

"Kids killing kids." That's how critics of the forthcoming film, premiering March 23, sum up the trilogy "The Hunger Games." And it's not an inaccurate description. That arrow absolutely hits its mark.

As much as I'm a values-enforcing mother of two teenage girls, I have to admit, I love "The Hunger Games." I've read two and a half of the three books, partly in an effort to have conversations with my 14-year-old. But it may be easier to accept the violent story line on the page than it will be to see it come to life on the big screen.

In an age when Columbine is still much more than a Colorado high school and, just three weeks ago, a student emptied his handgun in a school in Ohio, killing three students, should we ever be sanguine about kids killing kids? The idea makes you want to pop in an escapist Disney DVD - you know, the one with the happy ending. Oh, right, that's every Disney film.

In fact, we've spent generations feeding kids happy endings. Fairy tale characters may face grim obstacles, but they almost always prevail in the end. More recently, our culture has been walking up to darker themes. Voldemort tried to kill the hero in the "Harry Potter" series. The birth scene toward the end of "Twilight" was gruesome.

Are kids ready for all this?

The "Hunger Games" series is set in a post-apocalyptic future, in which the country of Panem is divided into 12 fenced-in manufacturing or agricultural districts, ruled by a hyper-powerful Capitol. Capitol residents obsess about their attractive bright pink hair or sequined skin, while district dwellers are often desperate for medical care or enough to eat.

Each year, two district representatives - a teenage boy and girl - are chosen by lottery to fight in the Hunger Games, a futuristic "American Idol" in which the 24 "tributes" fight to the death. Kids killing kids. Capitol and district residents alike watch the Hunger Games televised. It is their chief entertainment - like the brutal Roman games of history.

The series is imaginative and well-written, and the protagonist is a cunning and brave teenage girl, Katniss Everdeen. Clearly adults everywhere are impressed by the books: The series is assigned reading in eighth grade in my school district.

Katniss wrestles with all the moral questions the plot implies. Why is there an exempt class of Capitol residents who are never required to compete in the games? How can tributes be allies and friends, and then be required to turn on one another? Katniss' love triangle raises further questions of loyalty.

Loyalty is an overarching issue for middle-schoolers, who are often breaking old elementary school bonds and discovering new packs. So it's easy to see why the books were chosen for an eighth-grade audience. If the film portrays these issues well, it will be worth watching.

But morality is harder to convey on screen than gore. If filmmakers go the blood-and-guts route, emphasizing the considerable violence, "Hunger Games" will have failed its fans. Movies with PG-13 ratings, like this one, often push right up against an R - and no ratings system seems adequate to prevent plain bad taste. Will Ferrell has convinced me of that.

At some point we have to trust our kids to understand the difference between reality and dystopian fantasy, and I believe most of them can. In some parts of the world, in the Lord's Resistance Army in northern Uganda, for example, leader Joseph Kony forces children to murder - a real-life "kids killing kids."

It's not as though this idea has never entered the human imagination.

Essay first published in Newsday.

Uneasy about Chris Brown and the Grammys

Within minutes of singer Chris Brown's appearance on the 2012 Grammy Awards - as he moved liquidly to his new single, "Turn Up the Music" - the phrases #womanbeater and #chrisbrownbeatswomen began trending worldwide on Twitter.

What that means is that people with Twitter accounts sent those phrases to their followers, in enough numbers that they showed up on every Twitter user's home page.

To achieve "trending" was a victory for those who wanted to protest Brown's appearance on stage. They said his brutality three years earlier should have disqualified him from a Grammy platform; he performed twice during the show - clearly a favorite of the show's producers.

On the eve of the 2009 Grammys, news broke about Brown beating his then-girlfriend and fellow pop star, Rihanna. The images of her beautiful, badly bruised face were heart-rending. The incident later led to felony assault charges against Brown, who pleaded guilty and accepted community service, probation and counseling - a light-seeming sentence.

At the 2012 awards show, Brown won his first Grammy, for best R&B album. Afterward, the 22-year-old took to Twitter to tell off his critics: "Hate all u want becuz I got a Grammy now! That's the ultimate --- off!"

But Brown's was not the most disturbing reaction of the night. That came from at least 25 women on Twitter: "chris brown can punch me whenever he wants." And, "chris brown can beat me all he wants ... I'd do anything to have him, oh my."

This is really disturbing. Do these women understand what they are saying? Could they have been in abusive relationships before and be volunteering for more? That seems unlikely. More probably, they are making the age-old mistake of confusing emotional intensity with love and passion.

But the problem, of course, is that it seldom ends with one blow. U.S. government statistics covering 1976-2005 show that 30 percent of all murders of women are the result of "intimate partner violence." And what doesn't kill women - or men - in abusive relationships can cripple them for life. Think of Whitney Houston, recently dead of an apparent drug overdose, who became hooked on drugs during an allegedly abusive 15-year marriage. Abuse, drugs, self-loathing - they can be a toxic mix.

Before the tweets from these young women, we could fool ourselves into believing that they had more self-respect. At one time, women were believed to stay in abusive relationships for financial reasons, or out of fear. The women's movement - with its push for access to paychecks - and the greater availability of women's shelters, were supposed to have won our freedom.

Now, the Grammys, and the Chris Brown twitterati, are glorifying a man who put his then-girlfriend in the hospital.

More disturbing still are the rumors that Rihanna herself is seeing him again. Gossip columns report that they spent Valentine's night together. This is a woman who found the strength to leave him once.

Of course, Brown could be a changed man. He was only 19 in 2009, and the court ordered him into counseling. But if his anger and narcissism have eased, there is no public sign of it. He seems unrepentant.

The only public message is that the Grammys organization rewards batterers - and so do their fans, and perhaps, their ex-girlfriends. These are horrific lessons for our daughters and sons.

Essay first published in Newsday.

First couple Obamas make time for family

We've come a long way, baby, when the president of the United States is worried about getting home in time for dinner at least five nights a week.

That's my takeaway from "The Obamas," a new book by journalist Jodi Kantor that promised a close-up account of the first couple's marriage. The book has stirred a number of tempers, including that of first lady Michelle Obama, who told talk-show host Gayle King that she hadn't read it, but that what she had heard of it portrayed her as an "angry black woman."

The Obamas probably have had to defend themselves against that stereotype their entire public lives. But anger has several welcome cousins: determination, strong will, commitment. The first couple, in Kantor's tale, employ these to wrestle with the time binds that constrict many modern families: how to have two careers, raise "normal" kids and find together time.

We're a busy country. The average American has added around a month's worth of work - 164 more hours per year - in the past two decades. The number of dual-income households has risen, as has the number of people working multiple jobs.

Glimpses of life with "The Obamas" give us a comparison we Americans seem to love: Our celebrities' struggles are somewhat like ours.

Except in special circumstances, Kantor reports, the president turns down cross-country trips, dinner parties, gala invitations, and fund-raising or working dinners that would keep him from the family table more than two evenings a week. By 6:30, he makes the few minutes' walk from the Oval Office upstairs to have dinner with Michelle, Sasha and Malia.

Barack Obama is one of several parent coaches for Sasha's basketball team - not at the games, where his presence would be a distraction, but at practice drills. Kantor writes, "Finally, he was what his own father had never been, what he had never been, what his wife had always wanted: the kind of dad who was around to coach basketball."

Michelle Obama is the parent who keeps standards high. If the girls take a trip, they are required to write a report about it for their parents. When they ask for a snack, the first lady questions whether they are really hungry - or just bored. That's probably a tip she picked up from her campaign against childhood obesity.

The girls are not allowed to surf the Internet or watch TV during the week. And they are very active: swimming, tennis, soccer, lacrosse, basketball.

Certainly it's easier to keep such a busy schedule with the help of a White House staff. But they could also be sitting on the couch watching "Big Bang Theory" or "Two Broke Girls."

To be sure, the good father image works for President Obama politically. When he was running for president, polls said that was one of the things voters liked best about him. At the time, scandals were engulfing other men in public life: Think Eliot Spitzer and John Edwards.

Other presidents have been doting fathers. George W. Bush certainly comes to mind. But it's rare to sneak such a peek behind the scenes while a first family is still in the White House. Kantor is to be commended for looking through the eyes not only of a political reporter but also of a wife and mother.

The Obamas of this book are good role models, not only making time for each other but determined to create a rich, rounded childhood for their girls - even in the extraordinary circumstance of growing up in the White House.

The rest of us have our own circumstances - and for most of us, far fewer resources - that make family life challenging. But we shouldn't give up on these ideals, either.

Essay first published in Newsday.

Another mother leaves a great job

People leaving jobs for reasons they don't want to discuss often say something hackneyed about spending more time with family. But it appears that Michèle Flournoy literally means it.

Flournoy, 50, is a top Pentagon policy adviser and potential first female defense secretary. She announced this week that she will quit after the New Year to have more time with her three children, ages 14, 12 and 9. Her work for the Defense Department often runs from 7 a.m. to 8 p.m., and over many weekends.

Flournoy's work sounds fascinating. She testifies before Congress and strategizes troop levels in Iraq and Afghanistan. That's a lot to give up for three kids.

Which is why I love that she stated her reason so baldly: The work of being a mother is important, too.

It's possible there's more to her story -- who knows? But her public affirmation of motherhood is brave. It risks the anger of those who argue women can "have it all." Flournoy invites the envy of parents who have to work for financial reasons; she's married to a top deputy at the Department of Veterans Affairs. She risks instilling doubt in the junior women -- perhaps also mothers -- whom she sought to mentor and inspire. And she courts ridicule by the ignorant. Remember when talk show host Mike Gallagher called Fox News anchor Megyn Kelly's three-month maternity leave "a racket"?

Highly visible women should keep talking about the importance of parenting, because their words can have repercussions for working moms and dads who aren't among the power elite. There are many parents who don't have the protections of money or status to assert something so basic as the need for time away from a job to raise children.

And working people have ever less leverage now, as the depressed economy has "excessed" so many into the unemployment line. In the spring of 2009, a House subcommittee on Workforce Protections, chaired by Rep. Lynn Woolsey (D-Calif.), heard testimony from advocates that the dismal economy was pushing parents out of the workforce because their opportunities for flexible work schedules were drying up. Parents who had worked a four-day week, for example, found their employers suddenly requiring five days.

Sometimes, employers were simply stretching to make do with their existing workforce because they didn't want to hire anyone new. But the result was often to upset a delicate balance and force the parents out.

Flexible schedules are rarely set down in writing and can disappear when an accommodating manager is replaced by someone less family-friendly. Another possibility -- and the one that most concerned Congress -- was that employers could be using the bad economy to discriminate against pregnant workers and parents.

Recognizing how precarious the work-family balance continues to be, some companies have begun making flexible work arrangements more formal. For example, KPMG, the global audit firm with offices in Melville, has a flexibility website where employees can explore compressed workweeks, telecommuting, job sharing and more.

Of course, accounting firms like KPMG battle notoriously high turnover, so they look for ways to retain employees. At other kinds of jobs, many workers don't even have paid sick days -- 47 percent of private-sector workers lack them, according to the Department of Labor. We have a long way to go as a country that supports parents.

People like Flournoy should keep up the drumbeat about the importance of child-raising, reminding employers that parents have important work off the job, too.

First published in Newsday.

Time for a 'living wage' for the middle class?

With millions out of work, complaints about the decline in middle-class wages may seem misplaced. But without some shoring up, the middle class will remain dispirited -- and our economy, which is 70 percent dependent on consumer spending, will remain in the dumper.

It may be that there's a role for government to play in buttressing these eroding wages, which result not only in a declining standard of living, but also in a family life so pressure-filled that it leads to its own problems: angry homes, fast-food diets, dependence on alcohol and drugs.

Calling for any sort of government role during these tea party times can raise charges of socialism. But the idea of a wage that supports some minimum standard of living -- shelter, clothing, food -- has been broached on and off for more than a century.

In the late 1800s, social activists began protesting wages earned by a working-class man that were not sufficient to sustain his family, without the additional wages of working children and mothers. The Catholic Church published a fundamental social teaching, "Rerum Novarum" (on capital and labor), that read, "Wealthy owners of the means of production and employers must never forget that both divine and human law forbid them to squeeze the poor and wretched for the sake of gain or to profit from the helplessness of others."

Shortly afterward, Australia's courts ruled that an employer must pay a wage that guaranteed a standard of living that was reasonable for "a human being in a civilized community" for a family of four to live in "frugal comfort."

In the United States, these ideas led to laws forbidding child labor, making education compulsory and protecting women from exploitive labor conditions. The campaign to establish a "family wage" was defeated, but in 1938, a lower standard, the federal minimum wage, was passed.

The Rev. Martin Luther King Jr., Daniel Patrick Moynihan and, in 1968, a group of 1,200 economists including Paul Samuelson and John Kenneth Galbraith have all supported some kind of minimum income guarantee.

Echoes of this debate are being heard now, in the Vatican's critique last week of the global financial system, and in places where labor unions still have some sway: In the New York City Council, which at the urging of retail workers may require employers in commercial developments built with public subsidies to pay at least $10 an hour, a "living wage" higher than the minimum wage of $7.25; and in Albany, where the State Legislature in April passed an increase to $9 an hour for home health aides, who are represented by the influential 1199 SEIU United Health Care Workers East. That increase takes effect on Long Island in 2013.

It's easy to see why the lowest-paid workers would need a boost from someone powerful enough to argue on their behalf. But to make the argument for the middle class, one has to believe that this great swath of America, nearly half the country, has special value. And it does: The stability and upward mobility of the middle class not only underpin the U.S. economy but give America its famously optimistic and innovative spirit.

That spirit is on display as the middle class makes the best of things today: The average American has added around a month's worth of work, 164 hours per year, in the last two decades. One-third of American families have reduced their savings for college, according to a 2010 Sallie Mae/Gallup poll, and another 15 percent are not saving at all. Retirement savings are in similar decline.

How much more can the middle class cinch in its belt, before we lose what's precious about this way of life?

First published in Newsday.

Bullets are wrong way out of a marriage

As the facts stand, it seems wrong to allow Barbara Sheehan to get away with killing her husband. Sheehan, 50, is the Howard Beach, Queens, woman who was just acquitted of murder by reason of self-defense, based on her claims of physical and psychological abuse by Sgt. Raymond Sheehan, a retired cop and her husband of 24 years.

She shot him 11 times on a February morning in 2008, leaving him dead in their bathroom, where he had been shaving. She got off 11 rounds -- and he? Zero. Considering the circumstances, this doesn't seem as much like a woman who fired in self-defense as someone who was shooting to kill.

And yet, a jury on Thursday found her not guilty of murder. It's troubling that, with as many social and legal supports as we've erected for abused partners in the past 40 years, Barbara Sheehan still felt she had to resort to killing to escape her marriage, no matter how nightmarish.

Up until the 1970s, domestic violence, and especially violence against women, was dismissed by the criminal justice system as "a family matter." Perpetrators were often not arrested or charged with crimes. Police gave a low priority to "domestic" calls.

But much has changed. Many states have enacted mandatory arrest laws for reports of violence. Some states have set up special courts and treatment programs for batterers. Victims can seek restraining orders and take refuge in clandestine emergency shelters. The U.S. Department of Justice created an Office on Violence Against Women in 1994, and estimates that this crime fell by more than 50 percent in the subsequent decade.

Sheehan testified that she feared her husband would kill her in one of his rages. He kept at least one gun with him at all times, had smashed her head into a cement wall, and had often held a gun to her head. She said he insinuated that his past as a police officer would make it difficult for her to report him and escape his orbit. She claimed that his threats had been growing more serious.

Sheehan told the court that on that final morning, Feb. 18, she took her husband's revolver and tried to sneak out of their home. But he allegedly confronted her with his 9-mm Glock pistol, which he had taken into the bathroom. She fired five shots from the revolver, retrieved his pistol, and then emptied that into his body too.

Acquitted of murder, Sheehan faces sentencing Nov. 10 on a conviction of gun possession, which could carry three to 15 years behind bars.

What she apparently did not do, before resorting to this lethal act, was call 9-1-1. During Sheehan's monthlong trial, she produced no record of reports to police. She didn't claim, as women often did when their customary role was housewife, that she couldn't afford to leave; she had a job, as a school secretary. Nor could she have been afraid of leaving her children behind: Their daughter and son were grown.

Granted, it may have been dangerous for Sheehan to inform on one of their own to the police. And domestic violence victims are said to enter a kind of mental paralysis and passivity after years of domination, humiliation and torture. Statistics argue that Sheehan had good reason to fear for her life; of those killed by an intimate partner each year, three-quarters are female.

The prosecution argued that she stayed in the marriage to collect her husband's life insurance money. But there should have been some half step she could have taken. Remaining passive in the face of abuse and then nailing someone with 11 bullets shouldn't have felt like her only option. Raymond Sheehan was probably a monster. But society has worked hard to ensure that battered women don't have to resort to violence, too.

First published in Newsday.