Advice of Historians

Live-Tweeting #AHA2014

January 08, 2014
Craig Gallagher

In anticipation of going to my first American Historical Association conference this past weekend in Washington D.C., I sought out a range of senior colleagues who had attended past AHA meetings for advice on what to expect. As a third-year Ph.D. candidate who is about to start writing a dissertation, I was regularly advised that many aspects of the AHA meeting did not yet apply to me, such as the Job Center, where interviews for academic positions are conducted, or the Book Exhibit, where publishers meet with scholars and teachers to discuss manuscripts or books for use in the classroom.

My first AHA, therefore, was largely confined to the scholarly panels (and, I should add as a brief aside, various receptions, where I shamelessly handed out business cards and tried to score five minutes of chat with some of my favorite scholars. I was mostly successful). I attended six different panels over the four days, enjoying some immensely and others not-so-much. On the whole, I was impressed with the range of questions posed by various luminaries in my field, and – especially in the Atlantic History panels I was most interested in – the sweeping state-of-the-field discussions most papers engendered.

But it turned out that physically attending those panels and listening to the presenters was only scratching the surface of what was offered by this year’s AHA. I made an early decision this year to live-tweet the panels I was attending so that scholars who couldn’t attend the conference could get a sense, at least, of what issues were being raised. For this purpose, as the AHA themselves recommended, I tweeted with the hashtag #AHA2014 (which remains searchable).

But as I sat there with my laptop, oftentimes the only person in the room not using pen and paper, frantically summarizing points raised via the game-changing medium of Tweetdeck (if you’re not acquainted with it, become so), I discovered that I was also attending almost every other panel happening concurrently with the one I was listening to.

Where I had intended to provide a flavor of the conference proceedings for scholars unable to attend, I quickly found myself in dialogue with interested scholars actually present in D.C. but at another panel or professional development workshop (or even waiting for a job interview to start). Although they were mainly reading my shorthand summaries of points raised (far more eloquently) by the presenters I was listening to, they nonetheless sent me pertinent questions and comments, or asked for clarifications, which I did my best to provide.

Indeed, it seemed as though every panel or workshop across the AHA program had a dedicated tweeter, whether on classical Rome or modern China, and especially the various digital history (#dhist) workshops run this year by many younger scholars. This was fitting, since this year’s meeting also featured the inaugural Reception for History Bloggers and Twitterstorians, which was well attended and stimulated very lively discussions.

Could live-tweeting panels be the answer to the long-suffering conference-goer’s gripe about too many similar panels scheduled at the same time? I’d offer a cautious yes, as long as the tweeter is willing to be the only person in the room typing away furiously and is prepared to spend the intervening time between panels desperately searching for an outlet to charge their laptop or phone!

But at the very least, it suggests that history conferences like the AHA in future are likely to take place in two separate but intimately related spaces: the real world of Washington D.C. on the verge of a major snowstorm, and the ethereal, abbreviated, but undeniably lively world of the Twittersphere.

Craig Gallagher is a Ph.D. Candidate at Boston College, who is writing a dissertation about religion, trade and empire in the early modern British Atlantic world. He tweets at @Gallacticos87.

Will Blog Posts and Tweets Hurt Junior Scholars? Part 2

October 02, 2013
Heather Cox Richardson

Untenured scholars are in a funny place: that gap between the old world and the new. Ten years ago, yes, blogging would have convinced many senior scholars that a junior person was not a serious academic because s/he was catering to a popular audience. Since then, the old world of the academy has been crumbling, and while many departments have not yet caught up, others are aware they must move into the twenty-first century.

So will blog posts and tweets hurt your career? Maybe. But they can also help your career in very practical ways.

The first has to do with publishing. The gold standard for employment and for tenure remains a published book. When most senior scholars finished their doctorates, it was almost guaranteed that their dissertations would find academic publishers. In those days, university presses had standing contracts with university libraries that guaranteed automatic sales of a few thousand copies of each monograph that came out from a reputable press. Budget cuts over the last twenty years killed this system. No longer can an academic press be certain that libraries will buy their monographs. This means that they can’t accept everything that comes over the transom, making it harder than ever to get a book contract.

But you still need one to get tenure.

One of the ways to improve your chances of landing that contract is to make sure you have written a book of wider interest than to those in your immediate area of specialization, one that a press thinks it will be able to sell. How can you do that? Engage with a wider audience on-line. Listen to questions. See which of your posts get a strong response. Are they written differently than your other work? Are they asking different questions? What does this tell you about your argument and your writing style? How can you speak more clearly to what is, after all, a self-selected audience of interested people?

Contracts also depend increasingly on your own networks. Do you have standing because you contribute to a popular blog? Are there lots of people who like to follow what you have to say? That will help convince a publisher that you’re worth a hearing.

An on-line presence might speak to an employer more directly. Blogging gives you an opportunity to present yourself on your own terms. Any diligent search committee will google you. A series of interesting blog posts about teaching, for example, will never hurt your profile.

There are pitfalls to an on-line presence, of course. First of all, and above all, it’s important to remember that the very nature of on-line work often means your opponents can’t respond, and it’s unsporting, at best, to launch a tirade against someone who can’t answer. For the job market, this means it’s crazy to write intemperately about anyone or anything. This is a very small profession, and even if XYZ’s work infuriates you, there is no reason to call it out. XYZ will certainly have good friends at any institution at which you might interview, and they will not forget you have taken a potshot (they googled you, remember?).

The exception to this rule, of course, is that if you feel strongly that you must take a stand either for or against something on principle, do it proudly and openly. And be prepared to defend your stand against opponents. Just don’t pick fights gratuitously.

On Twitter, the rules are the same as on Facebook. Don’t be an idiot. Don’t post about how much you hate your students, or your colleagues, or any of the obvious rants that will ruin you with a committee. Don’t post endless self-absorbed pieces about what you’re eating or drinking or saying or thinking. But Twitter and Facebook are not just danger zones; they can also reflect well on you. I follow a number of junior scholars on Twitter who are obviously tightly linked to their communities and to new scholarship, and who are struggling with really interesting intellectual issues. If one of them applied to my school, their Twitter presence would make them stand out.

The other major pitfall is that you cannot let your on-line presence keep you from producing more traditional scholarship. Blog and tweet, yes, but make sure those contributions to knowledge reflect and/or point back to your larger body of work. No search committee is going to consider a blog equivalent to a manuscript, but it very well might like to see a blog that augments the rest of what you do. Just don’t let on-line work suck all your time.

Here’s a newsflash: The internet is here to stay. The profession hasn’t yet caught up with its implications, but it must, and soon. Today’s junior scholars are in a vague zone between the past and the present, but that same vagueness offers them a great opportunity to shape the way historians use the world’s revolutionary new technologies.

Should Historians Use Twitter? Part 1

September 26, 2013
Heather Cox Richardson

Yes.

But since I still have more than 136 characters left, here’s my take on the Twitter question:

I have had many conversations lately with historians based in America about whether or not they should use Twitter. There are three complaints about it. First of all, there is a general impression that Twitter users are narcissists who feel obliged to inform the world every time they eat a bagel. Second, there is a sense that it is a waste of valuable time.  Third, younger scholars are concerned that presence on social media might hurt them on the job market.

These are valid concerns, but they are, to my mind, vastly outweighed by the advantages of Twitter both for individual historians and for the profession.

Let’s start with the profession. Yes, there are plenty of people who use Twitter to issue a play-by-play recap of their most mundane activities. But there is no law that says that’s the only way to use the medium. Twitter works best for historians when participants use it to direct followers to content. This works in two ways. Tweets can mention a new archive, a recently discovered source, or the significance of a date. They can also be used to call attention to a longer blog post or article—or even a book—on a historical topic. Embedded links make the longer format instantly available.

It is striking how few established historians in America use Twitter this way. Historians in Canada and the UK are all over Twitter, claiming history for professionals either within or outside the academy, while established historians in America are simply not claiming any territory. There are exceptions, of course. William Cronon posts great links.  So do Henry Louis Gates, Jr.; David Armitage; Tera Hunter; Kevin Levin; and certainly many others I’ve missed. But their heroic ranks can’t compare to the sheer numbers of Twitter users based in Canada and the UK.

If American-based historians don’t take up more oxygen in public spaces, their expertise will continue to be ignored, and the importance of their work discounted. A good way to combat that denigration is simply to show up.

A second benefit to the profession is that Twitter offers a place where people on both sides of the tenure divide can exchange ideas. Another thing I have found striking about Twitter is how many junior people are active there, and how few senior people are. Following as many junior scholars as I do suggests to me that they have a completely different set of concerns and skills than people safely ensconced in the academy. Early career historians are all over digital technologies and new archives. They are terribly worried about the rise of adjuncts and MOOCs. And they have no idea how they will find permanent employment in academia . . . or even if they want it. In my experience, these are not conversations that happen often among tenured folks. We worry much more about research, narrative techniques, and dealing with administrators. People on both sides of this divide have a great deal to offer each other; indeed, it seems to me fatal to the rapidly-changing historical profession NOT to be talking.

Finally, Twitter offers historians outside perspectives. It’s an opportunity to stand in the same virtual world as a whole bunch of really smart people and hear what they think is important. Yesterday alone, I saw posts about the problems of transnational history and about teaching American history in a diverse classroom; Bruce Bartlett pointed me toward Bloomberg’s recap of the five years of America’s financial crisis; and NPR announced that Alan Lomax’s archive is now on-line. All of this information will inform my teaching and scholarship, and none of it would have come to my attention if it hadn’t flitted across my Twitter feed.

Henry Steele Commager on America during the Cold War

August 29, 2013
Randall Stephens

The November 24, 1954 episode of Longines Chronoscope featured Henry Steele Commager (video embedded here). That was not unusual for the news and views program, which regularly featured heads of state, intellectuals, novelists, and other notables. But the subject of the discussion is particularly interesting all these years later, and it is all the more poignant because Commager was one of America's foremost historians at the time. Here he weighs in on American identity, the pressures of conformity, the post-war economic boom, and freedom of expression.

This was filmed in the wake of the Korean War and the hydrogen bomb test on Bikini Atoll, and not long after the historic Brown v. Board of Education of Topeka Supreme Court decision. Red Scare paranoia remained strong. The coming month of December would see the US Senate reprimand Joseph McCarthy, by a vote of 67–22, for "conduct that tends to bring the Senate into dishonor and disrepute."

Here are some of the questions posed to Commager by hosts Larry LeSueur (CBS News correspondent) and August Heckscher (chief editorial writer for the New York Herald-Tribune):

LeSueur: So, professor Commager, we'd like to ask you: Do you think that this country when it was smaller and less powerful, but when we had less responsibilities, do you think we were happier then than we are now?

LeSueur: Professor Commager, do you feel that our freedoms such as speech are more circumscribed now than they have been in the past?

LeSueur: Surely professor Commager there's less conformity now than in the days of the Puritans?

Heckscher: Would you say that . . . we exaggerate our standard of living in comparison to the standard of living of foreign countries, for example?

LeSueur: Do you think our country is more or less unified in some areas, on foreign policy for example, now than it has been in the past during some of our crises?

How are historians reflecting on the pressing issues of our day?  What will the opinions of contemporary historians look like more than 50 years from now?

Mitch Daniels’ Email Criticizing Howard Zinn Roundup

August 08, 2013
Chris Beneke
 
Tom LoBianco, “Daniels Looked to Censor Opponents,” The Associated Press, July 16, 2013
“Emails obtained by The Associated Press through a Freedom of Information Act request show Daniels requested that historian and anti-war activist Howard Zinn's writings be banned from classrooms and asked for a "cleanup" of college courses. In another exchange, the Republican talks about cutting funding for a program run by a local university professor who was one of his sharpest critics. … The emails are raising eyebrows about Daniels' appointment as president of a major research university just months after critics questioned his lack of academic credentials and his hiring by a board of trustees he appointed.”

The Mitch Daniels email, February 9, 2010
“This terrible anti-American academic finally passed away. The obits and commentaries mentioned that his book ‘A People’s History of the United States’ is ‘the textbook of choice in high schools and colleges around the country.’ It is a truly execrable, anti-factual piece of disinformation that misstates American history on every page. … Can someone assure me that it is not in use anywhere in Indiana? If it is, how do we get rid of it before any more young people are force-fed a totally false version of our history?”

92 Purdue faculty members, “An open letter to Mitch Daniels,” July 22, 2013
“We trust our colleagues to introduce young people to the facts of history, but also to the much more difficult, much more essential practices of critical thinking. We trust our K-12 colleagues to know how and when to present challenges to received knowledge and how to encourage their students to judge such challenges for themselves. And we trust them to decide how and when to use controversial scholarship such as Zinn’s in their classrooms. This kind of academic freedom is essential to all levels of education, whether within a tenure system or not.”

American Historical Association, “AHA Releases Statement,” AHA Today, July 19, 2013
“The American Historical Association would consider any governor’s action that interfered with an individual teacher’s reading assignments to be inappropriate and a violation of academic freedom.   Some of the relevant facts of this case remain murky, and it is not entirely clear what in the end happened, or did not happen, in Indiana. Nonetheless, the AHA deplores the spirit and intent of former Governor Daniels’s e-mails of 2010 …. Whatever the strengths or weaknesses of Howard Zinn’s text, and whatever the criticisms that have been made of it, we believe that the open discussion of controversial books benefits students, historians, and the general public alike. Attempts to single out particular texts for suppression from a school or university curriculum have no place in a democratic society.”

Robert Cohen and Sonia Murrow, “Who’s Afraid of Radical History,” The Nation, August 5, 2013
“Innovative history teachers across the United States have for decades used A People’s History at the high school level in similarly comparative and rigorous ways. High school teachers desperate to breathe some life into their classes have distributed Xerox copies of Zinn’s most provocative chapters to offer a contrast to state-mandated textbooks, seeking to engage students in historical debate so they learn that history involves sorting out competing interpretations of the past rather than mere memorization of names and dates. These teachers have been drawn to Zinn because he offered their students a uniquely accessible introduction to the new social history, which revolutionized historical scholarship beginning in the 1960s.”

Rich Lowry, “Daniels vs. Zinn,” The National Review Online, July 30, 2013
“The caterwauling in the Daniels controversy about the importance of academic inquiry is particularly rich, given that Zinn didn’t believe in it. He had no use for objectivity and made history a venture in rummaging through the historical record to find whatever was most politically useful, without caring much about strict factual accuracy. ‘Knowing history is less about understanding the past than changing the future,’ he said. He joined his propagandistic purpose to a moral obtuseness that refused to distinguish between the United States and its enemies, including Nazi Germany.”

Sam Wineburg, “In Indiana, history meets politics,” Los Angeles Times, August 2, 2013.
“The Purdue faculty dismissed criticisms of Zinn's scholarship by Handlin and presidential historian Arthur M. Schlesinger Jr. as coming from the ‘consensus school of U.S. history.’ But their dismissal ignored the searing criticisms of historians with impeccable leftist credentials, such as [Michael] Kazin and Princeton historian Sean Wilentz, who wrote that for Zinn, ‘everyone who was president was always a stinker and every left-winger was always great.’ … His [Daniels’] view of history, presented in his 2011 book "Keeping the Republic," is as one-sided from the right as Zinn's was from the left. … What bothers me most about the whole flap — about Daniels' emails and about the Purdue faculty's reaction to them — is the way nuance was sacrificed to politics. We've come to expect politicians under fire to engage in spin. But when academics respond in kind, they reduce education to a game of politics. The loser in this game is truth and the students we are supposed to teach about the value of pursuing it.”

Memo to America, Re: Welfare in the Olden Days

July 24, 2013
Gabriel Loiacono 
 
One evening, chatting with friends from church, one asked me what kind of history I focused on. I told him: the history of welfare in early America.  He said: what welfare in early America?

"The drunkard's progress, or the direct
road to poverty, wretchedness & ruin," 1826.
Courtesy of the Library of Congress.
I find myself having a conversation like that one more and more these days.  Whether on the left or the right politically, high school grads or Ph.D.s, most Americans I talk to assume that welfare is a creation of the twentieth century: midwifed by Franklin D. Roosevelt or Lyndon B. Johnson.  Those hearty, independent minutemen of the Revolutionary period, they assume, either made the poor find work or relied only on churches for charity. 

Occasionally, this assumption is voiced explicitly in national, political discourse.  For example, in a famous September 12, 2011 Republican Presidential Primary debate, Representative Ron Paul described assistance to the poor in the past thus: “Our neighbors, our friends, our churches would do it.”  Less off-the-cuff, respectable-looking websites will tell you that charity was almost entirely private before FDR, aside from a few dark and dingy poorhouses, which were more effective at driving inmates out than keeping them comfortable.  And it is not only critics of welfare who think this; one can find defenders of welfare describing the U.S.A. as essentially without welfare before FDR.[1]

More often, this assumption is implicit.  You can see this in recent discussions of food stamp policy and the Farm Bill.  When both critics and defenders of welfare policy bring history into the argument, they usually head back to the 1960s.  Occasionally, they reach back to the 1930s.  They almost never go further back in time.  On both sides, the assumption is that prior to the New Deal, there was no welfare to discuss.  Thus, these are the good old days or the bad old days depending on what you think about welfare today. 

It is for this reason that I fantasize about writing the following memo:

MEMORANDUM

TO: The American People
FROM: A Historian
CC: Candidates, Think Tanks, Warriors of the Internet Comment Boards 
SUBJECT: Um, actually there was welfare when the United States was founded


I would go on, of course, to flesh this statement out with some background, evidence, and precision.  I would point out that poor laws came to North America almost with the first British settlers, and that a large welfare state developed in almost every English municipality.  I would cite figures showing that poor relief comprised more than half of most municipalities’ budgets before the 1820s, when school and road costs grew large enough to match poor relief.  I would feel compelled to mention that poor relief could mean a poorhouse, but more often some combination of cash, food, clothes, firewood, doctor’s attention, medicine, or even full-time nursing care.  I would highlight how significant local taxes were to most early Americans, compared with much lower state taxes and almost non-existent federal taxes. 
"Publicly-owned poor houses like the Dexter Asylum
in Providence, Rhode Island did not come cheaply."
Courtesy of the New York Public Library.

This would lead to the obvious comparisons.  Americans spent more than half of their taxes on poor relief when George Washington was president, compared to 12% on the federal “safety net” today, or 55% if you include Social Security, Medicare, Medicaid, and the Children’s Health Insurance Program.  Unlike today’s contributors to Social Security and Medicare, however, most taxpayers (read: property owners) in 1789 would not have expected to benefit from poor relief in their lifetimes.  They could depend on it, though, if they ever met with a financial catastrophe.  I would almost certainly quote historian Elna Green’s witticism, that so many grocers, doctors, wood-hewers, etcetera made money from the town by helping the poor that the poor law system should be called the “welfare/industrial complex.” [2]

Finally, I would point out one big difference between early America and the present: Today’s welfare is largely federal while early America’s was largely municipal.  In fact, I think the local nature of early American welfare is the reason why so many policy analysts overlook welfare’s past.  They just don’t look at the state and local levels of government. 

My fantasy memo is not a prelude to some specific policy prescription for the present day.  I just wish that when we do bring history to the argument, we use a reasonably correct version.  As an historian writing about pre-Civil War poor relief, I find myself cringing almost every time the history of welfare surges into public discourse.  Usually, there is a 300-year hole in the story.  For colonial American and U.S. history, that is a pretty big hole!

Surely, though, I am not alone among historians.  What about the rest of you? What makes you cringe?  You historians who see your subjects of expertise routinely misrepresented, what do you do?  What is your responsibility?  How do you lend your expertise in a helpful way?

_____________________

[1] On respectable-looking websites, see “The Poor in America Before the Welfare State,” at Intellectual Takeout: Feed Your Mind www.intellectualtakeout.org/library/sociology-and-culture/poor-america-welfare-state.  For a defender of welfare on the non-existent past of welfare, see Charles Michael Andres Clark, “The Truth Deficit: Four Myths About Deficit Spending,” in Commonweal July 12, 2011. 

[2] Elna C. Green, This Business of Relief: Confronting Poverty in a Southern City, 1740-1940, p. 1.

PhD Applicant Beware

July 19, 2013
Randall Stephens

The July 11-17 issue of Times Higher Education includes a must-read article for the grad school bound. In "10 truths a PhD supervisor will never tell you" (11 July 2013), Tara Brabazon writes: "As a prospective PhD student, you are precious. Institutions want you – they gain funding, credibility and profile through your presence. Do not let them treat you like an inconvenient, incompetent fool. Do your research. Ask questions." Some of her ten tips apply more to the UK setting, but most are right on target for students in the US as well.

Prospective PhD students in history should think long and hard about who they want to work with. Ask around. Get to know something about the scholar you'd like to be your mentor. Has this individual shepherded other PhDs? Do his/her students land good jobs? What is your prospective mentor's publishing record like? Is he/she a good fit for your project? What will it be like to work with him/her? Will he/she lend a hand or remain aloof and reclusive?

Brabazon offers some dos and don'ts and, most of all, warns, "don't let the supervisors grind you down." Here's one of her particularly helpful pieces of advice:

The key predictor of a supervisor’s ability to guide a postgraduate to completion is a good record of having done so. Ensure that at least one member of your supervisory team is a very experienced supervisor. Anyone can be appointed to supervise. Very few have the ability, persistence, vision, respect and doggedness to move a diversity of students through the examination process. Ensure that the department and university you are considering assign supervisors on the basis of intellectual ability rather than available workload. Supervising students to completion is incredibly difficult.   

 
Read more here.

Mothers in the Academy: How to Do It All*

June 19, 2013
Heather Cox Richardson

Well, first you need a good household staff.

HAHAHAHAHAHAHAHAHA….

OK, now that we’ve got the hilarity out of the way, how really can mothers take on teaching, research and writing, and children—three incredibly labor-intensive jobs—at the same time?

Let’s start with teaching. Here are a few things I picked up along the way, largely by the seat of my pants as I jumped into a job when my first child (of three) was just shy of three months old. Nothing I learned was intentional, but some of it has stood me in good stead.

The key concept for enabling mothers to survive in the academy is efficiency. And here are some things that helped me to achieve it:

Teach big courses with a wide scope. That sounds counterintuitive, I know. Most people think it takes less energy for junior scholars—and most people with small children will be junior scholars—to teach smaller classes in their specialty. The problem with such specific classes is that they tend to be under-enrolled, which means you will constantly have to come up with new courses to keep your numbers up. Repeatedly writing courses from scratch is a huge time-sink.

The initial time investment for writing a survey course is undoubtedly large, but it’s a one-time thing. From there, if you need to, you can pivot that material into a number of other classes. (My History of the American West stands alone, but has also spawned Race, Riots, and Rodeos, as well as The Plains Indians.) But the chances are good you won’t need to. Surveys tend to fill every semester. So do other sweeping courses that cover large periods, major events, important themes, and so on. Yearly tweaking will be enough to keep the courses up-to-date and interesting until you have the time to invest in creating new ones.

Another way to teach efficiently is to use grading rubrics. Instead of writing line-by-line comments on essays and exams—which takes forever—develop your own set of categories to evaluate. I grade essays with six categories (thesis, argument, style, evidence, grammar, and errors), plus an overall comment. Under those headings, I write anything from a line to a paragraph about what worked and what didn't.

You could, of course, break down your rubrics even further.

This takes about a third as long as my old style of line-by-line comments, and students love it. I started it not to save time, but because I read an education study showing that students were overwhelmed by unorganized comments and learned very little from them. So I gritted my teeth and tried a rubric. The first time I did it, grading took so much less time than usual I felt like I was cheating, but that year I got the best student reviews I’d ever gotten for my essay comments. In this case, it appears that less really is more.

I did both of these things for reasons that had nothing to do with mothering, but they ended up being very helpful ways to organize my teaching time most efficiently. They also were great for my own research, but that will be a different post.
____________

*I’m writing another post on the theoretical argument behind this series of posts. Until then, once again, I am defining “mothers” as a different mindset, not as a biological identification.

Mothers in the Academy

June 09, 2013
Heather Cox Richardson

A recent study shows that having children hurts women in academia at every stage of the profession. This will not be news to any woman who has had to sneak out of a meeting to pick up a child before the daycare fine system kicks in, who has had to explain to an older male colleague that his insistence on scheduling his pet seminar from 4 to 6 guarantees she can never attend no matter how angry it makes him, who has worked all night in the office because search files could not leave the premises and there was no time during the day to read them all, and who has heard those chilling words: “You can have tenure or children, but not both.”

There is a push to change the mechanics of university life to address this problem, offering maternity leave to graduate students, for example, and extending tenure clocks for mothers. (More first-floor bathrooms wouldn’t come amiss either, by the way; two flights to a bathroom when you’re eight months pregnant is no picnic.) These steps are important, to be sure. But for historians, even more important to remember is that, by cutting more than half the population from the study of humanity, we are skewing our scholarship so badly it threatens to lose all meaning.

This is not a theoretical argument; it has real-world meaning for the study of history. Having a family—and nurturing it—is crucial for historians. (And this does not seem to me to have to be birth or adopted children, by the way. Investing in community does not require youngsters who actually live in your home.)


Here’s why: history divorced from the real concerns of everyday people is so rarified it is often meaningless. And when it comes to distinguishing important issues from theoretical fancies, there is nothing like having to explain to a five-year-old what mommy is doing. “Mommy is trying to figure out why sometimes white people aren’t very nice to black people,” was my age-appropriate explanation when writing The Death of Reconstruction, and the constant reminder that my discoveries had real-world implications for today that mattered to my kids made it a much stronger book than it would have been had I emphasized instead the theoretical implications of my argument.

Similarly, constant interactions with kids—your own and others’—help you to recognize which of the many issues that grab you is actually of interest to anyone who isn’t bowled over by the beauty of history for its own sake. Kids actually love real stories (which is, after all, what we discover) and chewing over their meaning. But which story you’re telling matters. The general history of mass protests in the U.S. in the 1960s and 1970s that led to liberation for a number of groups that had previously borne much discrimination . . . not so much. The story that mid-20th-century New York City laws deliberately targeted drag queens by making it a crime to wear articles of clothing associated with the opposite gender; that this forced gays and lesbians to frequent bars owned by the Mafia, which could afford to pay off the police; and that a bunch of well-oiled gays and lesbians had finally had enough when police raided the Stonewall Inn in late June 1969 and took to the streets to demand equal rights . . . that story rings true to young adults. It’s personal. It taps into their own sensitivity about discriminatory rules, and offers not just a lesson about historical change but also the example of people who stood up for their principles.

Your colleagues will happily argue the theoretical implications of mass movements for months. Your students will pretend to listen when you expound on the importance of movement theory. Your non-academic friends will nod as if they’re interested in what you have to say about theory, the same way you pretend to care about the insurance market. But your kids and their friends will always remind you that there’s a reason they are called “theoretical underpinnings.”

The perspective of kids, who are not yet sophisticated enough to pretend interest in anything for appearances’ sake, also helps one’s writing. “How can I explain what I learned today in such a way that it would interest my fifteen-year-old?” is a much better guide to narrative structure than “This is so cool, in all its intricate detail!” I can see the second my kids begin to glaze over as I tell them about a recent discovery, and try to remember that moment of disconnect when my prose gets hijacked by the intricacies of historical events that are so deeply fascinating . . . to me and about three other people.

It’s great to see discussion of the problems of motherhood in academia, but the discussion is hardly new: it was in full swing when my first son was born, twenty-one years ago. While our discussion seems to use more sophisticated words now, the actual world of the academy seems pretty much the same, if not worse than it was in 1992. So how can we actually create change? Part of the problem might well be that the drive to include mothers in the academy tends to focus on how unfair discrimination is to those excluded. That angle is painfully obvious, but it offers nothing to those doing the excluding except the chance to be noble. Evidence suggests that nobility doesn’t interest authorities enough to make much of a difference. But the discipline of history—and, I daresay, all other fields—needs to find a way to keep mothers in academia not so we can pat ourselves on the back for our generosity, but because without them the field runs the risk of becoming so insular it makes itself entirely irrelevant to the real world. It is not a mother’s battle, or even a women’s battle. It is a battle for the relevance of history itself and, as such, should be waged by every historian who thinks our scholarship matters.

Encouraging mothers to stay in the academy might be good for mothers, but it is imperative for the academy.

Accurate History for Activists

May 27, 2013
Dan Allosso

I spent last weekend in the Twin Cities, doing a radio interview about my book and giving a talk on freethought history at the monthly meeting of the Minnesota Atheists. At roughly the same time, Susan Jacoby was a featured speaker at the second annual Women in Secularism conference in Washington, DC. A couple of people live-blogged Jacoby’s talk (here and here). Reading these transcripts and thinking about my own weekend as a presenter has changed my perspective on the role of historians in public discourse.

According to a bio produced for Bill Moyers’ website on PBS, Susan Jacoby

began her writing career as a reporter for THE WASHINGTON POST, is the author of five books, including WILD JUSTICE, a Pulitzer Prize finalist. Awarded fellowships by the Guggenheim Foundation, the National Endowment for the Humanities and the New York Public Library's Dorothy and Lewis B. Cullman Center for Scholars and Writers, she has been a contributor to THE NEW YORK TIMES, THE WASHINGTON POST, THE NATION, TomPaine.com and the AARP BULLETIN, among other publications. She is also director of the Center for Inquiry-Metro New York and lives in New York City.

Although she’s not a professional historian, Jacoby has tons of credibility in the literary world, as well as in the secularist world and the liberal intellectual world. Her recent books, Freethinkers, The Age of American Unreason, and The Great Agnostic: Robert Ingersoll and American Freethought, are all required reading for in-the-know secularists. I’ve read the first and third; the second is on my to-read list. Jacoby has done a lot to remind contemporary readers of the existence of freethinkers in American history (especially Robert Ingersoll). So I was a little surprised when I saw that the live-bloggers recorded Jacoby saying something like this:

2:04: There have been no secular activists who have made women’s rights an issue, except insofar as they are threatened by radical Islam. Telling the truth about radical Islam and women is important, but we need secularists to understand that discrimination and violence against women are hardly confined to the Islamic world...Robert Ingersoll is the only male secularist who is an exception to this. 

While Jacoby’s point that secularists need to extend their understanding of oppression is undoubtedly correct, her historical example couldn’t be more incorrect.  Throughout history, freethinkers have more often than not linked secularism with women’s and family issues.  In addition to the many women freethinkers (Mary Wollstonecraft, Frances Wright, Eliza Sharples Carlile, Ernestine Rose, Elizabeth Cady Stanton, etc.) there have been many male freethinkers who worked for women’s rights.  In America, Dr. Charles Knowlton, Robert Dale Owen, and Abner Kneeland come easily to my mind (because I was talking to the MN Atheists about them while Jacoby was talking to the Secularist Women); in England Richard Carlile, Francis Place, and John Stuart Mill are also easy choices.  Dig beneath the surface layer of famous names, and there are many more.

The point is that Jacoby’s credibility and authority (and the audience’s sympathy with her point about understanding oppression) allowed her to insert bad history into the conference’s stream of consciousness.  It resurfaced later, in discussions like the one about whether Ingersoll would have accepted an invitation to speak at the conference, and in a general impression that secularism has generally NOT been particularly friendly to women and their issues.  The inaccuracy of this view hinders contemporary secular feminists in their efforts to identify freethought with the rights of women and oppressed minorities, and not just the “Rights of Man.”  But the authority of historical expertise (Heather Cox Richardson recently referred to it as the “oxygen”) belongs to the person at the podium — and all too often that person is not a historian.

I’m sure misleading her audience was the opposite of Susan Jacoby’s intent.  She seems to have been arguing that today’s secular women need to push beyond the movement’s history and win new victories of their own.  And this is good advice.  But pushing forward might not seem as difficult, if women were aware of the efforts and sacrifices made by earlier secularists in the same cause.  Today’s secular women might gain valuable information as well as inspiration, if the story of earlier secular feminists was better known.  So I’ve signed on with Secular Woman to tell the stories of secular feminists in the past.  I’ll be writing a monthly series of short biographies of secular women.  Secular Woman is an activist organization, so hopefully these stories will be useful to the women Jacoby was urging to continue the fight.

Staying Positive

May 17, 2013
Craig Gallagher*

It's likely that if you have already applied and been accepted to graduate school to study history, you’ve heard it at least once. You’ll hear it plenty more times before you get that masters or Ph.D. in history you’re putting aside a lot of time and/or money to acquire.

In fact, if your decision to continue your education isn’t just about putting off the working world for a few years and is driven by a desire to change direction and start a new career, you’ll hear it so often that it will feel as though everyone thinks you’re running away to join the circus instead of pursuing another professional qualification.

I’m talking, of course, about that constant refrain that hangs over graduate school like a surly cloud at the moment: “There are no jobs!” Now, I don’t wish to debunk this statement with a much rosier picture of the job market than has hitherto been offered, because I can’t do that.

Not when respected publications like the Atlantic and the Chronicle of Higher Education have lined up to inform us that even seeking a degree in the sciences offers little economic advantage anymore in these straitened times, adding as an afterthought that the outlook for those who hold humanities degrees is downright bleak.

What I suggest, however, is that staying positive in the face of such bleak prospects is essential.

The fact remains that universities and colleges are still admitting graduate students to study history and are still training them to read, research, and write to a very high standard. Graduate students still get teaching experience, we still learn how to organize our time effectively, how to argue cogently and coherently and to condense vast amounts of information into digestible bites fit for any palate. We also learn to speak foreign languages.

It needs to be kept in mind that these are transferable skills applicable to a variety of jobs. Sure, we can’t change overnight the fact that some potential employers will see “M.A. in History” and immediately move on to the next CV. But we can embrace the abilities we develop in such a way as to help ourselves, first and foremost, compete in a fallow economy.

And, let’s not forget, these are just the classic skills any apprentice historian will develop. As the discipline broadens and deepens to accommodate technological changes, new opportunities arise in the burgeoning subfield known as the Digital Humanities.

My point is, when you hear “There are no jobs!” don’t translate this as “I am hopelessly unemployable.” Hear instead, “What can I do, say, and work on to make sure I get one of those ‘no jobs’?”

*Craig Gallagher is a PhD Candidate in History at Boston College. His dissertation project focuses on transnational Scottish Presbyterian merchants, ministers, settlers and soldiers in the late-17th century Atlantic World. In spring 2013 he received a research grant from the Society of Antiquaries of Scotland.

Stressed Much? You’re In Good Company

March 07, 2013
Eric Schultz
Earlier this month, the American Psychological Association released a study called Stress in America, concluding that Millennials (18- to 33-year-old Americans), along with Gen Xers (34-47), are the most stressed generations in America. On a scale of 1-10, the average American defines a healthy level of stress as 3.6 but feels a level of 4.9. Millennials and Gen Xers are at 5.4, a level the study concludes is “far higher than Boomers’ average stress level of 4.7 and Matures’ [67 and over] of 3.7.”

[Image: From "Science Nation - Teens and Stress" (NSF)]


Thirty-nine percent of Millennials say their stress has increased in the last year, while 52 percent report having lain awake at night in the past month due to stress.  “Millennials and Gen Xers are most likely to say that they are stressed by work, money and job stability, while Boomers and Matures are more likely to be concerned with health issues affecting their families and themselves,” the study concluded.

All of which left me wondering: Is this degree of stress in America something new? A recent epidemic? A product of fast times and too much fast food? Or perhaps a cultural peculiarity, a kind of national trait—maybe even an irksome downside to achieving progress, or a byproduct of what time-management guru David Allen would call “getting things done.”

One of my very favorite 19th-century books, both for its passion and unintended humor, was written by Dr. George Miller Beard, a fellow of the New York Academy of Medicine.  Beard was known for having defined “neurasthenia,” a medical condition that arose in the 19th century and produced fatigue, anxiety, and depression, which he attributed to nothing less than American civilization.  His 1881 American Nervousness: Its Causes and Consequences concluded that steam-power, the periodical press, the telegraph, the sciences, and (perhaps Beard’s reaction to suffrage?) the mental activity of women were the primary contributors.

The signs of American nervousness, Beard said, were everywhere: susceptibility to narcotics and drugs, rapid decay of the teeth, premature baldness, the unprecedented beauty of American women (indeed, at some point we might need Dr. Freud to fully understand Dr. Beard), the strain of puberty and change of life; American oratory, speech, and language; and the greater intensity of animal life on this continent.  Fortunately, Beard remained optimistic, saying that wealth and invention could bring calm.  After all, he concluded (in a classic line we might better attribute to a Monty Python sketch), “The Greeks were certainly civilized, but they were not nervous.”

Another poignant reminder of American anxiety came in Henry Adams’s brilliant The Education of Henry Adams, published posthumously in 1918 (and a Pulitzer Prize-winner the following year). Henry’s dilemma clearly resonated with Americans when he wrote that “the old universe was thrown into the ash-heap” and a new one created by the crush of technology: the opening of the Boston and Albany Railroad, the appearance of the first Cunard steamers in the bay, and the telegraphic messages which carried from Baltimore to Washington the news that Henry Clay and James K. Polk were nominated for the Presidency. Later in Adams’s life, of course, came electricity, the telephone and the automobile. He remembered the time of “the flint-and-steel with which his grandfather [John Quincy] Adams used to light his own fires in the early morning”; now the world had bathrooms, water, lighting and modern heat—“the whole array of domestic comforts.”

Adams’s anxiety was caused by the fact that, despite a life-long education like few in America would ever experience, he was completely unprepared for the world of the 20th century. “At the rate of progress since 1800,” Adams wrote, “every American who lived into the year 2000 would know how to control unlimited power.”

(I wonder if he was referring to my iPhone?)

Need further proof, or perhaps a wider lens? In her superb Inheriting the Revolution: The First Generation of Americans (Belknap Press of Harvard University Press, 2000), Joyce Appleby discusses a book called Peter Rugg, The Missing Man. Written by William Austin (1778-1841) and published in 1824, it was the most popular story of the early Republic. It reads like a bad, confusing dream, a kind of 19th-century “Charlie on the MTA.” Peter sets out by carriage from Concord to Boston in a thunderstorm in 1770 and simply rides forever. He stops repeatedly to ask directions and finds there is no more King; the old road has become a turnpike; the city has grown beyond anything he could imagine. Indeed, Appleby points out, Austin and his generation would see Boston triple in size and New York grow to six times its size from 1776 to 1820; it was an unprecedented “destruction of their elders’ world.”

Just ask Rip Van Winkle, Peter Rugg’s contemporary.

By all accounts, stress in America is real, uncomfortable, potentially destructive, and something we need to work to control. But a little history indicates we’re not setting any precedents. Just ask John D. Rockefeller (1839-1937), the country’s first billionaire and, adjusted for time, considered the richest American ever. Rockefeller once confided “how often I had not an unbroken night’s sleep, worrying about how it was all coming out.”

Well, things didn’t turn out so badly for Mr. Rockefeller. American history suggests that, with heads down and a little optimism (and maybe a little less fast food), the Millennials and Gen Xers will be alright, too.