Wikipedia:Scholarly journal: Difference between revisions

From Wikipedia, the free encyclopedia
{{Notability essay|WP:SJ}}


Independent, peer-reviewed publications such as academic journals or specialist trade magazines are important places to find [[WP:RS|reliable sources]] on Wikipedia, particularly for new concepts or cutting-edge research that may not yet appear in textbooks. There is serious disagreement among scholars about the validity and/or applicability of measurements such as the [[Science Citation Index]] (SCIndex) and [[Impact Factor]] which are [[bibliometrics]] calculated for academic journals to gauge how often a journal's articles are independently cited.


== Notability of individual journals ==
The journal's impact factor is not a decisive factor on whether it should be the subject of a standalone article in Wikipedia.

Sometimes an article about a scholarly journal or specialist trade magazine will appear at [[WP:AFD|Articles for Deletion]]. Users citing this page believe there should be a '''presumption to keep''' such articles provided they can be established as [[WP:V|verifiable]] and independent. In other words, we believe the [[WP:N|notability]] standards for such publications should be relatively inclusive, even if the journal is a new startup, and even if the organisation or company responsible for the publication are the (main) writers and editors of the article, maintaining of course the usual behavioural standards set in [[WP:COI]].


Editors should be able to establish notability of these journals based on adequate citations in [[WP:RS|reliable sources]] from articles published '''in''' the journal as well as references '''to''' the journal in other independent journals, or through citations found by the usual searches, particularly [[JSTOR]], [[Google Books]], or [[Google Scholar]].


Wikipedia articles are largely built on inline references that cite journals, etc. In the Wikipedia citation, the name of the journal often is [[Help:Wikilinks#Wikilinks|internally wikilinked]] (e.g., doubled square brackets <nowiki>[[ ]]</nowiki> are put around the journal name). If a scholarly journal is widely used within Wikipedia as a source in articles, then for utilitarian reasons that should be taken into account in determining whether Wikipedia should have at least a stub article on the journal in order to provide more information to users about the cited reference. Even a low-quality publication, such as [[Homeopathy (journal)]], may be notable if it has attracted sufficient notoriety. High-quality scholarly journals are rarely controversial, and so may be boring as subjects for Wikipedia articles, yet they are important for their publication of reliable sources. Helping users locate journal metadata (information about the journal) is part of helping them to make their own assessment of the reliability of content in those journals and hence to [[Wikipedia:Verification|verify]] statements for which our articles cite them in support.

== Deciding whether the journal is a good source ==
There are multiple metrics for evaluating whether a journal is reputable. The most important factor, as always, is [[WP:RSCONTEXT|whether the source fits the statement]]. Even the best physics journal is an unreliable source for statements about politics, and top-quality political journals are poor sources for statements about physics.

=== No magic number ===
An impact factor measures how many times a journal's articles are cited in other journal articles. There is no "magic" or "good" number for impact factors.
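
The calculation behind the headline number is simple. Here is a minimal sketch in Python of the standard two-year impact factor (citations received this year to articles from the previous two years, divided by the number of citable items published in those two years); all figures are invented for illustration:

```python
# Sketch of the standard two-year journal impact factor.
# All numbers below are invented for illustration.

def impact_factor(citations_to_prior_two_years: int,
                  citable_items_prior_two_years: int) -> float:
    """Citations received this year to articles the journal published in
    the previous two years, divided by the citable items from those years."""
    return citations_to_prior_two_years / citable_items_prior_two_years

# e.g. 300 citations in 2023 to the 150 citable items from 2021-2022:
print(impact_factor(300, 150))  # 2.0
```

Nothing in this ratio measures quality directly; it only counts citations, which is why averages cannot be compared across fields.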

In particular, citation patterns, and therefore impact factors, differ significantly across academic disciplines. The average impact factor for fields such as mathematics, history, and education is around 0.5.<ref>{{cite journal |last1=Althouse |first1=Benjamin M. |last2=West |first2=Jevin D. |last3=Bergstrom |first3=Carl T. |last4=Bergstrom |first4=Theodore |title=Differences in impact factor across fields and over time |journal=Journal of the American Society for Information Science and Technology |date=2009 |volume=60 |issue=1 |pages=27–34 |doi=10.1002/asi.20936|arxiv=0804.3116 }}</ref> The average impact factor for information science, business, crop science, dentistry, and orthopedics is around 1. The average impact factor for environmental studies and law is around 1.5. The average impact factor for physics, pathology, ophthalmology, and medical imaging is around 2. The average impact factor for general medicine and neuroscience is around 3. The only fields with an average impact factor above 4 are astronomy and molecular and cell biology (MCB).

This doesn't mean that MCB journals are eight times better than mathematics and history journals, or that general medicine is three times better than orthopedic surgery. It only means that mathematicians and historians generally find it necessary to cite fewer journal articles in each paper than biologists.

In general, impact factors are higher if:
* the journal focuses on general subjects (e.g., medicine) instead of niche specialties (e.g., orthopedics) or region-specific information (e.g., agricultural research or law in a specific country)
* the journal publishes in a (currently) fashionable or well-funded area
* the journal publishes deliberately provocative papers, in the expectation that they will be cited repeatedly by authors who want to disagree with or object to them. This causes the impact factor to rise, because all citations are counted equally.
* articles accepted by the journal tend to have more authors than average (which means more people who will cite that article in their future work)
* authors think their articles are more likely to be accepted if they sprinkle needless citations to previous publications by possible reviewers, or to previous publications in the journal they submit the paper to
* the journal publishes in a field that uses journal articles as its primary form of formal communication (e.g., biology), rather than books (e.g., history).

Other metrics take these patterns into account, and journal rankings within a field can be particularly useful. A journal that ranks near the middle of its field (or higher) is almost always going to be an acceptable journal. Journals that rank towards the bottom of their field may have quality problems, or there may be factors irrelevant to quality that lower their ranking (such as not publishing in English).

=== Searching for good sources ===

For scientific journals, you can find good journals at https://www.scopus.com/sources. This free service allows you to enter the subject area of research (e.g., genetics) and see a list of ranked journals. You can also check the [[CiteScore]] percentiles and [[SCImago Journal Rank]] ("prestige") of any already-cited journal by switching the search to title or [[International Standard Serial Number|ISSN]] and searching for the journal you want to review. (For these metrics, bigger numbers are better, so a journal with a CiteScore percentile of 60% is cited more often than 60% of journals in that subject area.)
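
The parenthetical above can be made concrete. The following is a hypothetical sketch of how a citation-rate percentile works; the scores and the simple counting rule are illustrative, not Scopus's exact methodology:

```python
# Illustrative percentile rank: the share of journals in the same subject
# area that score below the given journal. Not Scopus's exact formula.

def percentile_rank(journal_score: float, field_scores: list) -> float:
    below = sum(1 for s in field_scores if s < journal_score)
    return 100 * below / len(field_scores)

# Made-up citation-rate scores for five journals in one subject area:
field = [0.5, 1.0, 1.2, 2.0, 3.5]
print(percentile_rank(2.0, field))  # 60.0 -> cited more than 60% of the field
```

Because the rank is computed within a subject area, it sidesteps the cross-field differences in raw impact factors described above.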

Remember: Even the most prestigious and highly reputable journals have published embarrassingly bad papers, and many disreputable journals have published good quality papers by reputable researchers. Finding journals with good reputations is only part of the work in deciding what sources to use when you are building articles.

== Further reading ==

* https://sfdora.org/ (San Francisco Declaration on Research Assessment)
* {{cite journal|title=Use of the Journal Impact Factor in academic review, promotion, and tenure evaluations|date=2019-04-09|first1=Erin C.|last1=McKiernan|journal=eLife |volume=8 |doi=10.7287/peerj.preprints.27638v2 |pmid=31364991 |pmc=6668985 |doi-access=free }}


== See also ==

Latest revision as of 06:34, 27 November 2023
