At the SLA conference, Kris Fowler presented results of a survey of mathematicians that she conducted. She used Web of Science to find people who had published in mathematics journals and contacted a random sample of them for an online survey.
She had many findings about how mathematicians use online tools and collaborate. One of her big findings was about open access journals coming of age. The mathematicians' reasons for choosing one journal over another were basically the same regardless of whether the journal was open access or subscription-based. Open access journals weren't just for open access die-hards. About a third of the mathematicians indicated that they had published in an open access journal. The main reasons for choosing to submit to a particular journal had to do with reputation and audience, with open access far down the list in importance.
The survey included a follow-up question asking people who published in open access journals to write the name(s) of the open access journal(s). About a quarter of the journals the mathematicians listed were not open access journals. If I heard correctly, this was much better than a previous survey (not necessarily of mathematicians), in which two thirds of the "open access" journals that people listed were not actually open access.
I liked her presentation a lot. She gathered information beyond just her own library or own institution and did it without having a big grant. Apparently other people also liked what she did. Aside from the article she's preparing to submit to a peer-reviewed journal, she's slated to provide a write-up in Notices of the AMS.
Wednesday, June 22, 2011
Tuesday, June 21, 2011
Research from Morris Library: We're Grrreat ... or at Least Pretty Good
On June 13, SIU was honored at the SLA conference as one of the top 50 university libraries for research productivity in special librarianship.
From the email announcement that I got from Tony Stankus, "...we are announcing the roll call of honor of the top 50 universities which contributed the most papers to the literature of special librarianship from 2000-2010, based on an analysis of the author affiliations of over 2,000 papers of substantive research or professional commentary in the eleven most cited journals in these areas, using procedures adapted from Wiberley, Hurd & Weller (College & Research Libraries 46(4): 334-342; 2006)
Using the US News & World Report listings of over 1400 institutions of higher education as our baseline, we calculate that your presence on the top 50 list places you in the highest 4% of the country in terms of scholarly productivity in this area, something I am sure would not greatly surprise you."
I missed the award ceremony, so I don't know where in the top fifty we fell. In other rankings of research in library and information science, SIU hasn't made it to the top ten for published articles but made it to the top ten for conference presentations.
I would quibble with the claim that being in the top 50 puts us in the highest 4% for scholarly productivity. The 1,400 institutions in the US News listings include about a thousand colleges and universities with no publications in these journals at all. Nevertheless, even being in the top quarter says something about the faculty in Library Affairs.
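Roughly, here is the arithmetic behind my quibble. (The thousand-institution figure is my own rough guess, not anything from the announcement.)

```python
# Back-of-the-envelope check of the "top 4%" claim
total_listed = 1400      # institutions in the US News listings
rank_cutoff = 50         # size of the top-50 honor roll
non_publishing = 1000    # my rough guess at schools with no papers in these journals

# Percentile against everyone US News lists
naive_percentile = rank_cutoff / total_listed * 100

# Percentile against only the institutions that actually publish here
adjusted = rank_cutoff / (total_listed - non_publishing) * 100

print(f"Against all {total_listed} institutions: top {naive_percentile:.1f}%")
print(f"Against the ~{total_listed - non_publishing} that publish: top {adjusted:.1f}%")
```

Even with the smaller denominator, top 12.5% still lands comfortably inside the top quarter, so the honor holds up; it's just less dramatic than "top 4%."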
Tuesday, June 7, 2011
Citation Counts Part 1 of many...
A couple of years ago, one of my ideas for an article was to collect tricks for citation counting in one place and publish it. I worked on it for a while, but it just didn't seem worthy of publication in a journal. In addition, whenever I reread my outlines, I was bothered by how cynical they sounded. One of my colleagues recently suggested that I put together a workshop on the topic instead.
To get started on planning a workshop, I’m going to put the pieces together here.
To start, there was a news article this week that bugged me. There isn’t enough detail in the article to know for sure what happened, but I have my suspicions.
Jeffrey Litwin took the amount of money that universities received for research and divided it by the number of journal articles from those universities. That gave him average cost per paper. Then he ranked the universities.
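A minimal sketch of that calculation, using made-up numbers (the university names and figures here are purely illustrative, not Litwin's data):

```python
# Litwin-style "cost per paper" ranking: research funding / journal article count
# All figures below are invented for illustration.
universities = {
    "University A": {"funding": 500_000_000, "papers": 4_000},
    "University B": {"funding": 300_000_000, "papers": 1_500},
    "University C": {"funding": 800_000_000, "papers": 7_500},
}

cost_per_paper = {
    name: data["funding"] / data["papers"]
    for name, data in universities.items()
}

# Rank from most "productive" (cheapest paper) to least
for name, cost in sorted(cost_per_paper.items(), key=lambda kv: kv[1]):
    print(f"{name}: ${cost:,.0f} per paper")
```

The sketch makes the weakness obvious: anything that consumes funding but never shows up in the paper count (patents, classified work) inflates the apparent cost per paper.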
Some of the "least productive" universities in dollars per publication were Texas A&M, Carnegie Mellon, North Carolina State, and MIT. These schools are known for their engineering programs. Patents and classified research conducted for the military don't get counted in Litwin's paper tally, so his method effectively penalizes universities that do a lot of that kind of work.
Litwin's data came from Thomson Reuters; the news articles don't say exactly how he obtained them. Assuming he used Web of Science data, though, there may be a hidden penalty in Web of Science's choice of which journals to include or exclude. Web of Science's journal coverage is not perfectly even across disciplines, and I'm not sure it could be.