
Friday Follow-Ups

Updates on two posts from earlier this year:

  • And on a related note, two weeks earlier the CBC ombudsman issued a ruling that Lang violated the CBC’s conflict of interest policy by not revealing personal connections to the Royal Bank of Canada before she interviewed the bank’s CEO. The text of that ruling is here.

Peer Review Gone Wrong, Again

The anonymous peer review process used to determine whether academic research articles are published or presented is supposed to be neutral. But research on peer review has revealed many problems with the process, such as biased outcomes and excessively long waits before articles are accepted. This week brought a stunning example of yet another problem: sexist reviews. (more…)

Evaluating Historical Research in Business

I started doing research in organizational and business history for no other reason than I like to try to figure out why things are the way they are. I have no formal training in historical research – I’ve learned what I’ve learned mostly from experience, and from very helpful suggestions from more experienced researchers along the way. But I’m also working within an academic discipline that doesn’t have a strong record of historical research, and that only considers certain kinds of historical research to be legitimate or worthwhile.

That background made me very interested in Jeffrey Smith’s recent article “Writing Media History Articles: Manuscript Standards and Scholarly Objectives”, published in Journalism and Mass Communication Quarterly. While Smith is specifically discussing research in media history, I found that a lot of the issues he raises apply to research in business history as well, and many of them resonated with my own experiences of trying to get business history research published in academic journals. (more…)

When “Best Practices” Aren’t Best

Anyone who went to business school around the same time I did remembers “excellence”. Specifically, that was Tom Peters and Robert Waterman’s book In Search of Excellence, which described how companies could improve by copying what great companies did well. That book sparked a management fad of benchmarking, which then morphed into the idea of “best practices”. But now, unfortunately, it looks like the very sound ideas behind “best practices” are being lost and corrupted by corporate doublespeak.

In the last couple of weeks, I’ve come across more than a few examples of organizations using “best practices” as a reason to reduce or cancel services. The explanation usually goes something like this: the organization has “benchmarked” itself against similar organizations, or looked at other organizations’ “best practices”, and allegedly found that other organizations are doing less of a certain thing, or doing that thing less expensively. This then becomes a justification for the organization to downgrade its own offerings.

This use of “best practices” is not what was originally envisioned. Although Peters has admitted that his investigation of “excellence” was not as rigorous as it could have been, his book had a powerful practical message. (more…)

Public Sector Pay, Private Sector Pay, and the Fraser Institute

Last year, some of the research produced by the Vancouver-based Fraser Institute received serious criticism. The Institute claims its work is based on “careful, accurate, rigorous measurement”. But the International Labour Organization, a United Nations agency, released a report that outlined extensive calculation errors and questionable methodologies in the Institute’s Economic Freedom of the World database. And it was also discovered that data for the Institute’s “survey of mining companies” were being collected through a website that was open to anyone, regardless of whether they knew anything about mining.

You would think that criticism like this would make the Institute look more carefully at how it conducts its studies. But judging by its new report, Comparing Government and Private Sector Compensation in British Columbia, the Institute isn’t being any more careful with its work. The research presented in this report has numerous problems that contradict the Institute’s claims of “rigorous” and “transparent” methodologies, and that make the results unreliable, to say the least. (more…)

Predatory Journals: An Experiment

In my occupation, tenure and promotion are big deals. University professors who want to get tenure or be promoted are usually expected not only to conduct research, but also to publish that research in academic journals. And in the last decade or so, the traditional model of academic journal publishing has been disrupted by the emergence of online-only and open-access journals.

This disruption has resulted in some good changes. It has led to alternatives to anonymous peer review of journal submissions – a process that is supposed to be objective, but often isn’t. It can shorten the often lengthy time between the submission of a manuscript and the publication of the finished article. And it has provided wider access to information that might formerly have been subscription-only or password-protected.

But the disruption has also led to the rise of so-called “predatory journals”. These are primarily online journals with little or no academic legitimacy. They exist solely to make money for their owners, and they make that money by charging excessive “article processing fees”. Unfortunately, these journals prey on vulnerable researchers. That includes researchers who are desperate for publications to put on their resumes; researchers who are not confident in their writing ability; and researchers who can’t identify journals where a publication will hurt, not help, their careers. (Jeffrey Beall, who blogs about predatory journals, has an excellent list of criteria that he uses to define a predatory journal; you can find the list here.)

Predatory journals regularly send out spam emails soliciting manuscripts. I receive at least three of these emails every week. Other than being annoyed by the spam, I had never really thought too much about how these journals work. But at the end of last year, two astounding stories made the rounds. One was about a predatory journal accepting a manuscript that consisted of nothing but the words “Get me off your f***ing mailing list”. The other was about a predatory journal accepting a manuscript of computer-generated nonsense that was allegedly co-authored by two characters from The Simpsons.

These stories blew me away. How could this happen? Wouldn’t even a disreputable journal at least try to appear legitimate by rejecting blatantly fake papers? How could it miss such obvious signs of fakery? So I decided to conduct an experiment of my own.

The outcome: Two journals accepted a manuscript for publication that was not only nonsense, but also plagiarized nonsense.

Here’s how it happened.

(more…)

Tipping Points? Malcolm Gladwell Could Use a Few

An excellent investigation of Malcolm Gladwell’s questionable use of uncredited secondary sources. Here are some of my earlier posts on other problems with Gladwell’s work:

Malcolm Gladwell’s 10,000 Hour Rule Doesn’t Add Up

Malcolm Gladwell’s Weak Defense of the 10,000 Hour Rule

Who’s David, and Who’s Goliath?: Malcolm Gladwell and His Critics

Malcolm Gladwell and His Critics, Round Two

 

Our Bad Media

In the summer of 2012, just days before a certain columnist was found to have plagiarized from The New Yorker, a staff writer at the prominent magazine itself resigned in the wake of a widespread plagiarism scandal. The journalist, famous for pop-science works that generated scathing reviews, had been using unattributed quotations taken from other people’s interviews. He had copied-and-pasted from his peers. Generally, he had faked his credentials as an original researcher and thinker.

The New Yorker itself had a doozy on its hands. The scandal had tarred the magazine’s famed fact-checking department, despite claims that its procedure was “geared toward print, not the Web.” Editor-in-chief David Remnick was embarrassed. He’d initially kept the writer on board, distinguishing one bout of self-plagiarism from the more serious offense of “appropriating other people’s work.” Now, his magazine was losing a star that had been groomed as “Malcolm Gladwell 2.0.”

That…

View original post 2,608 more words

Bill C-377: New Information on “The Bill That Nobody Wants”

Two researchers have uncovered some new and very troubling information about Bill C-377, the proposed Canadian law that would impose exceptionally rigorous financial reporting requirements on unions. “The bill that nobody wants”, as it was called in the researchers’ lecture last week, is now the center of an even more appalling story of misinformation and deception – a story that should concern not only anyone who cares about Canadian unions and workers, but also anyone who cares about the integrity of Canada’s democratic legislative process.

The first version of this bill was introduced in the House of Commons in 2011 as Bill C-317, but the Speaker of the House ruled it out of order. The bill was then re-introduced in the House as Bill C-377 – a private member’s bill sponsored by Member of Parliament Russ Hiebert. It was approved in the House of Commons and sent to the Senate. The Senate refused to vote on it, and returned a heavily amended version of the bill to the House in mid-2013. The House returned the original, unamended bill to the Senate, where it is currently being debated again. It’s extremely unusual for private members’ bills to make it this far in the federal legislative process, or to be on Parliament’s agenda for so long. So what’s really going on here? (more…)

The Fraser Institute’s (Not So) Rigorous Data Collection Methods

I’ve written a couple of posts about the questionable research and data collection methodologies of the notoriously right-wing Fraser Institute. But today I have to take my hat off to the researchers over at Press Progress, who discovered that the Institute was (more…)

The Return of Jonah Lehrer

When we last heard about writer Jonah Lehrer – whose career self-destructed after his writing was found to have numerous instances of plagiarism and factual inaccuracies – he had been paid $20,000 to give a much-criticized speech about journalistic ethics. A few months after that, he was reported to be circulating a book proposal, which also allegedly included plagiarized content. Then…nothing.

And now, very quietly, he’s back.

At the end of March, (more…)