
Lee, Kim, and Lee: Publication Delay of Korean Medical Journals

Abstract

Publication lag is a determinant of journal efficiency that has not yet been studied for Korean medical journals. To measure publication lag, we investigated the publication timestamps of 4,762 articles published since 2013 in 10 Korean medical journals indexed in the Scopus database and randomly selected from the KoreaMed Synapse. The total publication lag was 246.5 (Q1, Q3; 178.0, 347.0) days. The overall acceptance lag was 102.0 (65.0, 149.0) days, and the overall lead lag was 123.0 (63.0, 236.0) days. The year of publication did not significantly affect the acceptance lag (P = 0.640), shortening it by only about 1.4 (97.5% confidence interval [CI], −5.2 to 8.0) days/year, whereas it did affect the lead lag (P = 0.028), shortening it by about 12.9 (1.3 to 24.5) days/year. Korean medical journals have reduced the total publication delay entirely by reducing the lead lag, not the acceptance lag.


INTRODUCTION

Publication delay of scientific papers is a well-known problem. Although actual measurements have rarely been reported in scholarly publications, we could refer to Björk and Solomon's work (1) in 2013. From ‘Satoshi Village,’ a well-known blog on science in general, we could also refer to measurements of publication lag at PLOS and 3,475 other journals (2), and in 3 million or more PubMed (http://www.pubmed.gov/; National Library of Medicine, Bethesda, MD, USA) articles (3). However, these resources do not provide specific insight into the situation in Korea.
Publication delay is not only an inconvenience for authors but also an impediment to the timeliness of science and a burden that editors must overcome to improve journal efficiency. For example, we identified an anthropological paper that was published 40 months after the start of the study (4) and a molecular biology paper that was published after 4 years and 7 months, adding an additional 3 years to the research (5). Such time lags make us skeptical about how much these papers will contribute to the advancement of science. Publication delays are a problem for both authors and editors.
The journals of the Korean Association of Medical Journal Editors (KAMJE) have been rapidly globalizing in recent years despite their long history. The KoreaMed Synapse is the front-end gateway and reference-linking platform established by the KAMJE as part of this globalization, opening the journals to worldwide readers. An editor of one member journal published the first data summary of its publication lag (6) as an announcement of a journal reform through online-first publication. For the other KAMJE journals, however, no tangible report on publication delay has been available.
In this study, the authors investigated the timestamps of KAMJE journals over the past 4 years. By dividing total publication delay into acceptance lag and lead lag, the authors aimed to give the KAMJE journals insight into their efficiency and to motivate them to take proper measures to reduce publication delay.

MATERIALS AND METHODS

Data acquisition

Seventeen journal names indexed in Scopus (a bibliographic database; Elsevier, Amsterdam, the Netherlands) were randomly selected from the official list of KoreaMed Synapse journals (https://synapse.koreamed.org). A total of 8,037 PubMed IDs (PMIDs) were retrieved from PubMed for articles published in 2013–2016 by the 17 journals (Table 1). Starting with these 8,037 PMIDs, the dataset was refined to 7,503 PMIDs by requiring the PubMed Article Type to be one of the following: “journal article,” “case reports,” “comparative study,” “controlled clinical trial,” “clinical trial,” “clinical trial, phase I–IV,” “clinical study,” and “evaluation studies,” all of which designate the category of “Journal Articles” in a broad sense. Each PMID was converted to its digital object identifier (DOI), and the resulting dataset included 7,486 DOIs; some PMIDs were, for several reasons, ineligible for DOI conversion.
Table 1

Articles published by 17 journals in 2013–2016 (n = 8,037)

Journals (No. of articles by year) 2013 2014 2015 2016 2017 Total
ALLERGY ASTHMA IMMUNOL RES 68 92 85 76 0 321
ANN DERMATOL 131 196 187 186 0 700
ANN LAB MED 88 94 135 117 0 434
ANN SURG TREAT RES 0 115 115 107 0 337
J ADV PROSTHODONT 72 77 70 70 0 289
J BREAST CANCER 69 59 58 49 0 235
J CLIN NEUROL 44 61 67 90 17 279
J GYNECOL ONCOL 56 58 58 74 10 256
J KOREAN ACAD NURS 81 73 93 70 0 317
J KOREAN MED SCI 320 323 333 346 0 1,322
J KOREAN NEUROSURG SOC 188 189 202 113 0 692
KOREAN J ANESTHESIOL 313 226 108 119 0 766
KOREAN J ORTHOD 42 46 46 49 0 183
KOREAN J PHYSIOL PHARMACOL 75 73 72 76 0 296
NUTR RES PRACT 69 98 92 82 0 341
PSYCHIATRY INVESTIG 65 76 86 95 0 322
YONSEI MED J 232 243 243 229 0 947
Total 1,913 2,099 2,050 1,948 27 8,037
PubMed IDs were retrieved using the PubMed search string “2013:2016 [DP] AND journal name [TA]”. Note that there is some discrepancy in the year of publication between the PubMed search terms we applied and the values inferred from the DOIs. The analyses included 8,010 articles, excluding those with 2017 DOIs.
DOI = digital object identifier.
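The authors' acquisition code is not published with the article, but the retrieval and conversion steps described above can be illustrated with a minimal R sketch using the rentrez package and the NCBI ID Converter API (7). The search string follows the Table 1 footnote; the API endpoint URL, the batch size, and the JSON field names are assumptions drawn from the API documentation, not the authors' code.

```r
# Minimal sketch: batch-query PubMed for one journal, then convert PMIDs to DOIs.
# Assumptions (not the authors' code): ID Converter endpoint and JSON field names.
library(rentrez)   # Entrez in R
library(jsonlite)  # to read the ID Converter JSON response

# 1) PMIDs for one journal (repeat over the 17 journal names)
res <- entrez_search(db = "pubmed",
                     term = '2013:2016 [DP] AND "J Korean Med Sci" [TA]',
                     retmax = 2000)
pmids <- res$ids

# 2) PMID-to-DOI conversion via the NCBI ID Converter API (batches of <= 200 ids)
idconv <- paste0("https://www.ncbi.nlm.nih.gov/pmc/utils/idconv/v1.0/",
                 "?format=json&ids=", paste(head(pmids, 200), collapse = ","))
conv <- fromJSON(idconv)
dois <- conv$records$doi   # NA where a PMID has no registered DOI
```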
Using the DOIs as key variables, we scraped the Crossref article pages, parsed and mined them, and read the Crossref metadata containing the publication history, obtaining 4,783 articles that possessed all 3 timestamps (‘received date,’ ‘accepted date,’ and ‘epublished date’) per article (Fig. 1).
Fig. 1
Schematic diagram of the research. It begins with 8,037 PMIDs from 17 Korean medical journals and finally reduces the dataset to 4,762 DOIs from 10 journals. Of the 3 publication lags graphed for exploration, the 2 essential publication lags are modeled for further understanding. Crossref is a registered trademark of the Publishers International Linking Association. DOI is a registered trademark of the International DOI Foundation. PubMed is a registered trademark of the National Library of Medicine, United States.
PMID = PubMed identifier, DOI = digital object identifier, XML = eXtensible Markup Language.
Three publication lags were calculated: the total publication lag (days between the ‘received date’ and the ‘epublished date’), the acceptance lag (days between the ‘received date’ and the ‘accepted date’), and the lead lag (days between the ‘accepted date’ and the ‘epublished date’). The year of publication was derived from the DOI names, which, for articles from a KoreaMed Synapse journal, always contain a field for the year of publication. To simplify the year-effect, we recoded the year of publication (2013 through 2017) as 5 discrete numbers (0, 1, 2, 3, and 4) by subtracting 2013 from the year of publication. The dataset finally shrank to 4,762 articles from 10 KAMJE journals after excluding journals with fewer than 100 records, records with suspicious timestamps disclosed in the Crossref eXtensible Markup Language (XML) metadata that yielded negative lags, and records with 2017 DOIs.
The PMIDs were retrieved on December 31, 2016, the DOIs were converted on January 6, 2017, and the timestamps were acquired on January 7, 2017.
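As a concrete illustration of the lag definitions, the year recoding, and the exclusion rules described above, a short R sketch follows. The data frame `d` and its column names are hypothetical stand-ins for the curated dataset, not the authors' actual objects.

```r
# Hypothetical data frame `d`: one row per article, Date-class columns
# `received`, `accepted`, `epublished`, plus `journal` and `doi_year`.
d$total_lag      <- as.numeric(d$epublished - d$received)  # received -> epublished
d$acceptance_lag <- as.numeric(d$accepted   - d$received)  # received -> accepted
d$lead_lag       <- as.numeric(d$epublished - d$accepted)  # accepted -> epublished
d$year_c         <- d$doi_year - 2013                      # 2013..2017 -> 0..4

# Exclusion criteria described in the text
d <- d[d$acceptance_lag >= 0 & d$lead_lag >= 0, ]  # drop suspicious (negative) lags
d <- d[d$doi_year != 2017, ]                       # drop records with 2017 DOIs
keep <- names(which(table(d$journal) >= 100))      # journals with >= 100 records
d <- d[d$journal %in% keep, ]
```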

Exploration and modeling

To gain an understanding of the overall distributions of publication lag, the 3 lags were visualized by year of publication and by journal using violin plots with overlapping boxplots. Because the authors did not intend to compare the lags between journals, the plots used different ordinate ranges for different journals. Medians (Q1, Q3) were also displayed.
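The plotting package is not named in the text; assuming ggplot2 and the hypothetical data frame `d` from the previous sketch, a violin plot with an overlapping boxplot, faceted by journal with free ordinate ranges, could be drawn roughly as follows.

```r
library(ggplot2)

# Violin plus overlapping boxplot of total publication lag by year of publication,
# one panel per journal; free y-scales give each panel its own ordinate range.
ggplot(d, aes(x = factor(doi_year), y = total_lag)) +
  geom_violin(trim = FALSE) +
  geom_boxplot(width = 0.15, outlier.shape = NA) +
  facet_wrap(~ journal, scales = "free_y") +
  labs(x = "Year of publication", y = "Total publication lag, days")
```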
To estimate the year-effect on the 2 publication lags (acceptance and lead lags), the wide inter-journal variability shown in Figs. 2-4 necessitated mixed-effects modeling. The acceptance and lead lags were modeled using linear mixed-effects models (maximum likelihood method) that estimated the year-effect as both a fixed and a random effect (random intercept and slope model). The fixed-effect estimates displayed are the mean estimate and associated confidence intervals (CIs), along with the χ2 statistics at the given degrees of freedom (df). The variance components of the random effects are displayed and were used to calculate the intraclass correlation coefficient (ICC). Individual coefficients are presented separately by journal identity.
Fig. 2
Total publication lag (days) per publication date (year) in 4,762 articles of 10 Korean medical journals. Panels are arranged in order of the median publication lag per journal. The varying width of each violin indicates the kernel density estimate. In the boxplots, boxes indicate Q1 and Q3, and the dot in the box indicates Q2. Whiskers extend to the highest (lowest) value within 1.5 times the interquartile range of the hinge. Note that the ordinate of each panel has a different range.
Fig. 3
Acceptance lag (days) per publication date (year) in 4,762 articles of 10 Korean medical journals. Panels are arranged in order of the median acceptance lag per journal. The varying width of each violin indicates the kernel density estimate. Light blue violins indicate journals whose 2013 track record was below 100 days (J CLIN NEUROL is coerced into the blue group), and red violins indicate those over 100 days. Blue lines and gray shading indicate the loess smooth fit and its associated standard error; the line type (solid or broken) indicates an increasing or decreasing pattern. In the boxplots, boxes indicate Q1 and Q3, and the dot in the box indicates Q2. Whiskers extend to the highest (lowest) value within 1.5 times the interquartile range of the hinge. Note that the ordinate of each panel has a different range.
Fig. 4
Lead lag (days) per publication date (year) in 4,762 articles of 10 Korean medical journals. Panels are arranged in order of the median lead lag per journal. The varying width of each violin indicates the kernel density estimate. Light blue violins indicate journals whose 2013 track record was below 100 days (ANN LAB MED is coerced into the blue group), and red violins indicate those over 100 days. Blue lines and gray shading indicate the loess smooth fit and its associated standard error; the line type (solid or broken) indicates an increasing or decreasing pattern. In the boxplots, boxes indicate Q1 and Q3, and the dot in the box indicates Q2. Whiskers extend to the highest (lowest) value within 1.5 times the interquartile range of the hinge. Note that the ordinate of each panel has a different range.

Tools for data import and analysis

A total of 8,037 PMIDs were batch-queried from PubMed using the rentrez package (rentrez: Entrez in R; David Winter; R package version 1.0.4). Matched DOIs were then queried, using the PMIDs as key variables, through the ID Converter application programming interface (API) (7).
The 3 article-level timestamps (‘received date,’ ‘accepted date,’ and ‘epublished date’) were obtained by XML parsing and mining of the custom metadata field of the Crossref article pages, which were scraped as XML using author-developed R code that utilized the API calling procedures provided by Crossref (8).
Randomness in selecting journal names was assured using the dplyr package (dplyr: A Grammar of Data Manipulation; Hadley Wickham and Romain Francois; R package version 0.5.0). Throughout the data acquisition and analyses, the API procedures for web scraping, data handling, graphing, and statistical analyses were powered by R software version 3.3.2 (R: A language and environment for statistical computing; R Foundation for Statistical Computing, Vienna, Austria) run within GNU Emacs version 25.1.1 (Free Software Foundation, Inc., Boston, MA, USA; 2016). Linear mixed-effects models were constructed with the maximum likelihood method using the lme4 package (lme4: R package for linear mixed-effects models; Douglas Bates, Martin Mächler, Ben Bolker, and Steve Walker; R package version 1.0.+) (9). Because the authors planned 2 separate inferential tests on the 2 dependent variables (acceptance and lead lags), each inference targeted an α value of 0.025, keeping the overall α value at 0.05; accordingly, the CIs in this report are 97.5% CIs. Subsidiary P values were obtained by performing a likelihood ratio test against a null model. Journal names are set in italicized International Organization for Standardization (ISO) abbreviation format.
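A minimal lme4 sketch consistent with the modeling plan described above (random intercept and slope for the year by journal, maximum likelihood fit, likelihood ratio test against a null model, and 97.5% Wald CIs) is shown below for the acceptance lag; the lead lag is handled analogously. Variable names follow the hypothetical data frame used earlier; this is a sketch of the stated approach, not the authors' code.

```r
library(lme4)

# Random intercept and slope model, fitted by maximum likelihood (REML = FALSE)
fit_accept <- lmer(acceptance_lag ~ year_c + (1 + year_c | journal),
                   data = d, REML = FALSE)

# Null model without the fixed year effect, for the likelihood ratio test
fit_null <- lmer(acceptance_lag ~ 1 + (1 + year_c | journal),
                 data = d, REML = FALSE)
anova(fit_null, fit_accept)   # chi-square statistic with df = 1

# 97.5% CI for the fixed year effect (overall alpha 0.05 split over 2 planned tests)
confint(fit_accept, parm = "year_c", method = "Wald", level = 0.975)

# Variance components and journal-specific coefficients
VarCorr(fit_accept)
coef(fit_accept)$journal
```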

RESULTS

Exploration

The total publication lag of articles published by the 10 KAMJE journals from 2013 to 2016 was 246.5 (Q1, Q3; 178.0, 347.0) days (Fig. 2). Between the slowest journal (436.0 [371.8, 537.0] days) and the fastest (145.5 [87.0, 203.5] days), there was a 3-fold difference, although most journals have reduced the publication lag since 2013.
The overall acceptance lag was 102.0 (65.0, 149.0) days. Between the slowest journal (157.5 [124.0, 195.3] days) and the fastest (60.5 [30.8, 106.0] days), there was a 2.5-fold difference (Fig. 3). Six journals managed to reduce the acceptance lag, while the other 4 failed to do so. The longer the acceptance lag a journal harbored in 2013, the more inclined it was to reduce the lag. Conversely, the journals that were faster in 2013 (acceptance lag < 100 days) failed to reduce the acceptance lag (J GYNECOL ONCOL, YONSEI MED J, ALLERGY ASTHMA IMMUNOL RES), with the exception of KOREAN J ANESTHESIOL and J CLIN NEUROL.
The overall lead lag was 123.0 (63.0, 236.0) days. Between the slowest journal (323.0 [269.0, 372.0] days) and the fastest (44.0 [27.0, 62.0] days), there was an 8-fold difference (Fig. 4). Eight journals managed to reduce the lead lag, while the other 2 failed to do so. As with the acceptance lag, changes in the lead lag seemed to be linked to the track record of 2013. Among the 5 journals with a shorter lead lag in 2013 (< 100 days), 3 (ANN LAB MED, J GYNECOL ONCOL, and J KOREAN MED SCI) managed to reduce the lead lag.

Modeling

The year of publication did not significantly affect the acceptance lag (χ2 [df = 1] = 0.22, P = 0.640), shortening it by only about 1.4 (97.5% CI, −5.2 to 8.0) days/year, whereas the year did affect the lead lag (χ2 [df = 1] = 4.86, P = 0.028), significantly shortening it by about 12.9 (1.3 to 24.5) days/year.
In the model for the acceptance lag, the random-effects components consisted of an inter-journal variance of 1,471 and a total error variance of 4,477; in the model for the lead lag, they consisted of an inter-journal variance of 12,231 and a total error variance of 6,915. The ICC for the acceptance lag model was estimated as:
ICC[acceptance] = τ_interjournal / (τ_interjournal + τ_error) = 1,471/(1,471 + 4,477) = 0.25
and for the lead lag:
ICC[lead] = τ_interjournal / (τ_interjournal + τ_error) = 12,231/(12,231 + 6,915) = 0.64
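For completeness, the ICCs above can be recovered from the fitted models' variance components. The short sketch below assumes that the reported inter-journal variance corresponds to the random-intercept component of the model fitted in the earlier sketch.

```r
# Share of inter-journal (random-intercept) variance in the total variance
vc <- as.data.frame(VarCorr(fit_accept))
tau_journal <- vc$vcov[vc$grp == "journal" & vc$var1 == "(Intercept)" & is.na(vc$var2)]
tau_error   <- vc$vcov[vc$grp == "Residual"]
icc_accept  <- tau_journal / (tau_journal + tau_error)   # reported as 1,471/(1,471 + 4,477) = 0.25
```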
The individual coefficients for journals suggested that 4 out of 10 journals have failed to shorten the acceptance lag (Table 2) and 2 out of 10 journals have failed to shorten the lead lag (Table 3).
Table 2

Annual decrease in acceptance lags (days) of individual journal

Journals No. of articles Mean days in 2013 Annual decrease, days
J CLIN NEUROL 191 134.5 15.8
J ADV PROSTHODONT 286 169.8 8.6
ANN LAB MED 292 177.0 6.3
ANN DERMATOL 696 135.6 4.8
KOREAN J ANESTHESIOL 510 83.1 2.2
J KOREAN MED SCI 1,188 121.2 1.3
J BREAST CANCER 228 126.4 −0.4
ALLERGY ASTHMA IMMUNOL RES 312 96.5 −2.3
J GYNECOL ONCOL 191 57.1 −10.6
YONSEI MED J 868 81.9 −11.9
The annual decrease is the individual coefficient from the linear mixed-effects model with its sign reversed, so that positive values indicate a shortening lag. Table rows are arranged in order of the annual decrease.
Table 3

Annual decrease in lead lags (days) of individual journals

Journals No. of articles Mean days in 2013 Annual decrease, days
ANN DERMATOL 696 413.7 44.5
J CLIN NEUROL 191 232.3 26.6
J GYNECOL ONCOL 191 110.8 21.5
KOREAN J ANESTHESIOL 510 247.6 21.3
ANN LAB MED 292 74.0 11.9
YONSEI MED J 868 227.0 8.8
ALLERGY ASTHMA IMMUNOL RES 312 146.3 7.0
J KOREAN MED SCI 1,188 75.9 0.2
J BREAST CANCER 228 59.6 −3.5
J ADV PROSTHODONT 286 58.0 −9.4
The annual decrease is the individual coefficient from the linear mixed-effects model with its sign reversed, so that positive values indicate a shortening lag. Table rows are arranged in order of the annual decrease.

DISCUSSION

At a glance, the present results show one detail that is encouraging to KAMJE editors: the total publication delay of the KAMJE journals decreased during the period covered by this study. The reduction in the total publication lag, however, was due solely to the shortening of the lead lag. Because the acceptance lag has, on average, become only negligibly shorter in recent years, we devote a good deal of space in this manuscript to arguing that the stalled effort on the acceptance lag will be the next hurdle for the KAMJE journals.
We adopted a seemingly sophisticated method to analyze the yearly trend of the lags. Mixed-effects modeling treated journals as a source of random variation and allowed the inter-journal variability to be properly accounted for, which makes it safer to extend the discussion to all KAMJE journals (10). We preferred to view the inter-journal difference as random and avoided debating why, and by how much, the lags differed according to journal identity. The estimated slopes (annual changes) for the lags are meant as a population-level prediction; they do not indicate forecasting but rather the collective outcome of the policies the journals adopted and the circumstances they coped with.
The lead lag, the days between acceptance and publication, showed wide variability both between journals and between years of publication. The reduction of the total publication lag relied decisively on the reduction of the lead lag. A sound speculation for this reduction is that the change was technical, including the recent introduction of online-first publication and the redesign of manuscript workflows in many KAMJE journals. Thus, to shorten the lead lag, every journal can adopt the same plan irrespective of its identity or discipline and is urged to reform the weak segments of its process; this offers a decent opportunity to reduce the lead lag.
The failure to reduce the acceptance lag is intriguing to KAMJE editors; it was the finding they would most deplore in these results. Editors should pay attention to the fact that the acceptance lag and the lead lag contribute to the total publication lag with comparable weight. The average acceptance lag of the KAMJE journals drifted over the same range as that of PubMed journals, 100–150 days (3). It occurs to us that medical journals may already have reached a limit on the efficiency of traditional peer review, a ‘glass ceiling,’ so to speak. Scheer (11) raised credible concerns about the incompleteness and slowness of the traditional review system, based on an author survey conducted by the Nature Publishing Group in 2014 (12).
Setting aside the P value, the narrower CI for the acceptance lag suggests that our estimate for the acceptance lag was much more precise than that for the lead lag (13.2 vs. 21.2 days). Moreover, the small inter-journal variability in the acceptance lag (ICC[acceptance] = 0.25) also suggests that journal identity-dependent factors count for little in reducing that lag. The narrow variability in itself strongly indicates that the actual differences in peer review systems between the KAMJE journals are small, in contrast to the case of the lead lag.
The prolongation of peer review is associated with the increasing number of papers worldwide and the growing amount of interdisciplinary research, and responsibility for improvement should be shared by authors, reviewers, and editors (13). We, however, emphasize the role of the editors, not only because of its importance but also because it is our primary focus. Whereas the lead lag depends on technical aspects of journal publication, the acceptance lag is subject to the elementary conditions of an efficient peer review system: triaging manuscripts, identifying good reviewers, educating, encouraging, and rewarding them (14,15), publishing guidelines, improving the review process, and more. An editorial by Gasparyan (16) summarized a short list of strategic efforts by which editors can make a big difference in delivering successful editing and publishing.
It is evident that KAMJE editors have dedicated themselves to reducing the lead lag to boost journal efficiency, and it is likewise clear that now is the time to attempt to reduce the acceptance lag. Although the traditional peer review system faces much doubt and challenge, no single most efficient peer review system exists. The KAMJE journals are administered by purely academic societies of medicine, which confers a unique status on the journals compared with other medical journals published abroad. Thus, KAMJE journal editors can use their initiative to change anything about the journal quickly, based on the journal's needs and regardless of profit generation. Until now, they have probably spent much time changing page layouts, designing covers, and revamping the flow of manuscripts from authors to the XML depositor. They now have to spend much more time advancing the peer review process.
In a sense, our approach was not entirely fair. The results were drawn from journals with different publication models: online-first publication, online publication after print, and the recently commenced online-only publication. According to Alves-Silva et al. (17), publication delay is significantly linked to journal identity, the number of publications per journal, and, inversely, the journal impact factor, none of which were included in this research. Another limitation lurked in the raw dataset. The Crossref metadata had some errors in the timestamps, whatever their origin. Although the only issue we could address was falsely negative lag values, which were ruled out, other issues might still be hidden in the dataset; we found it hard to design code for locating all types of errors. Furthermore, submitting the publication history to Crossref is voluntary, which is the main reason why the final dataset contained just 10 of the 17 journals; this also raises a question of selection bias between journals that did or did not provide timestamps.
We limited the subject of this research to so-called ‘journal articles,’ categorized as either original articles or case reports. Article types not included in the research, such as editorials, reviews, and opinions, are, to our knowledge, not eligible for regular peer review, or are at least published through a dedicated process other than regular review in some journals. Including them in the analysis would have undermined our purpose and contaminated the results. Indeed, we easily found that some journals omitted the timestamps of those articles.
Moreover, throughout the manuscript, we refer to the timestamps of ‘the KAMJE journals’ rather than ‘the KAMJE Scopus journals.’ While developing and debugging the R code, we attempted an enormous number of PMIDs obtained from random KoreaMed Synapse articles, which yielded a poor return because the timestamps were often blank in the Crossref pages. Limiting the query to Scopus journals yielded a good return, so we ultimately scraped the articles of KoreaMed Synapse journals indexed in Scopus. This was inevitable to acquire good timestamps.
We have reported a cross-section of the publication delay of the last 4 years by investigating 10 KAMJE journals. The KAMJE journals have reduced the total publication delay entirely by reducing the lead lag, not the acceptance lag. Along with their struggle over the lead lag, editors must also push for an accelerated peer review process. This finding warrants consistent follow-up, driven by the editors.

ACKNOWLEDGMENT

We thank Ms. Jang H (XMLink, Co.) for verifying and cross-validating some erroneous date fields in the Crossref XML pages.

Notes

Funding This research was supported by Dongguk University Research Fund.

DISCLOSURE Younsuk Lee was the editor-in-chief of the Korean Journal of Anesthesiology. He had no influence on this work in relation to that journal. The other authors have no potential conflicts of interest to disclose.

Author Contributions

  • Conceptualization: Lee Y.

  • Data curation: Lee Y.

  • Formal analysis: Lee Y, Kim K.

  • Funding acquisition: Lee Y.

  • Investigation: Lee Y.

  • Writing - original draft: Lee Y, Kim K, Lee Y.

  • Writing - review & editing: Lee Y, Kim K, Lee Y.

References

1. Björk BC, Solomon D. The publishing delay in scholarly peer-reviewed journals. J Informetrics. 2013; 7:914–923.
2. Himmelstein D. Publication delays at PLOS and 3,475 other journals [Internet]. accessed on 31 December 2016. Available at http://blog.dhimmel.com/plos-and-publishing-delays/.
3. Himmelstein D. The history of publishing delays [Internet]. accessed on 31 December 2016. Available at http://blog.dhimmel.com/history-of-delays/.
4. Powell K. Does it take too long to publish research? Nature. 2016; 530:148–151.
5. Else H. Scholar complains of how long it can take to publish interdisciplinary science [Internet]. accessed on 17 January 2017. Available at https://www.insidehighered.com/news/2016/12/02/scholar-complains-how-long-it-can-take-publish-interdisciplinary-science/.
6. Lee Y. Time for something different: the Korean Journal of Anesthesiology commences EPUB ahead of print. Korean J Anesthesiol. 2016; 69:315–316.
7. National Center for Biotechnology Information, U.S. National Library of Medicine. ID Converter API: the backend web service [Internet]. accessed on 6 January 2017. Available at https://www.ncbi.nlm.nih.gov/pmc/tools/id-converter-api/.
8. Crossref (US). Open API of Crossref text and data mining for researchers [Internet]. accessed on 6 January 2017. Available at http://tdmsupport.crossref.org/researchers/.
9. Bates D, Mächler M, Bolker B, Walker S. Fitting linear mixed-effects models using lme4. J Stat Softw. 2015; 67:i01.
10. Lee Y. What repeated measures analysis of variances really tells us. Korean J Anesthesiol. 2015; 68:340–345.
11. Scheer R. Further experiments in peer review [Internet]. accessed on 21 December 2016. Available at http://blogs.nature.com/ofschemesandmemes/2015/03/27/further-experiments-in-peer-review/.
12. Nature Publishing Group (GB). Peer review statements from 2014 author survey [Internet]. accessed on 21 December 2016. Available at https://figshare.com/articles/Peer_Review_statements_from_2014_Author_survey/1362178.
13. Vosshall LB. The glacial pace of scientific publishing: why it hurts everyone and what we can do to fix it. FASEB J. 2012; 26:3589–3593.
14. Hong ST. Peer review in 2014: more supports than neglects. J Korean Med Sci. 2015; 30:120–125.
15. Gasparyan AY, Gerasimov AN, Voronov AA, Kitas GD. Rewarding peer reviewers: maintaining the integrity of science communication. J Korean Med Sci. 2015; 30:360–364.
16. Gasparyan AY. Peer review in scholarly biomedical journals: a few things that make a big difference. J Korean Med Sci. 2013; 28:970–971.
17. Alves-Silva E, Porto AC, Firmino C, Silva HV, Becker I, Resende L, Borges L, Pfeffer L, Silvano M, Galdiano MS, et al. Are the impact factor and other variables related to publishing time in ecology journals? Scientometrics. 2016; 108:1445–1453.
ORCID iDs

Younsuk Lee
https://orcid.org/0000-0003-2488-5926

KyoungOk Kim
https://orcid.org/0000-0001-7509-3668

Yujin Lee
https://orcid.org/0000-0003-1043-6214
