February 11th, 2020 | by Sharon Johnatty
As scientists we are not all gifted writers, nor is writing a manuscript a particularly enjoyable task compared with other aspects of research, where the thrill of discovery drives your career. Very few researchers actually ‘enjoy’ writing a scientific manuscript, partly because of the time involved, and partly because of the possibility of rejection by a reviewer who doesn’t consider the months or years of hard work summarized in ~5,000 words. Sometimes the ‘getting started’ stage can be the most difficult. This article is an overview of the process of writing a manuscript; I will expand on these points in upcoming articles.
In general, you are ready to write your manuscript when you have reached the point in your research that triggers the decision to stop your experiments and start writing. For some research questions that point is fairly self-evident, particularly if you have closely followed an a priori hypothesis. Other types of research can go on indefinitely (or as long as you have funding) if the answers to the research question generate more questions.
I have often heard senior researchers comment on the very clever researcher or collaborator who cannot write to save her/his life (or career). Getting into the necessary headspace to write can be the hardest part of research.
Here are some tips, gleaned from over three decades in research, on getting started with that manuscript that you’ve put on hold.
- Choose your journal. Create a list of journals that are suitable for your subject area and review their aims and scope. Use published articles that your research builds upon, if relevant, and see where they are published. For more on journal selection, see my earlier blog titled ‘Choosing the right journal – think before you submit’.
- Review your results. Thinking about the ‘design’ of your manuscript as you review your results will help you decide what other experiments you need, and what questions potential reviewers may ask. Plan your tables and figures before the write-up stage. These can be anything from flow charts to graphs or illustrations that can be updated as more data come in. You will know you are ready for the write-up stage when the tables and figures tell a story. ‘A picture is worth a thousand words’, so use them to your best advantage.
- Start with an outline. Create a blueprint that will guide your manuscript. This can be as detailed or as basic as you wish, and helps to ensure flow and logic. With your journal in mind, lay out the different sections in your document, i.e. Title page, Abstract, Introduction, Methods, Results, Discussion. Include any relevant formatting or word count requirements outlined by the journal. You may even fill in subheadings in the Results section. It is not necessary to list these chronologically, but I have found that it does help with outlining the logical thought processes that guided the work. The ultimate goal should be to tell the story as simply as possible without the day-by-day diary of events that led to the final result. It may in fact be better to exclude experiments that did not substantively contribute to the result. It is a judgment call whether a failed experiment (I don’t mean negative results – more on that in another blog) adds to the current manuscript, or is a distraction from the main finding.
- Order of writing. Choose the order according to what inspires you most. Like a jigsaw puzzle, you can start at the top or bottom, left or right. I personally find it helps to write the Methods first, as this should be the least difficult section. Where you have complicated experiments, keeping a good lab book (a requirement of most research institutes) will help avoid frustration and ‘memory loss’ about what you did months (or years) ago. In some situations it may be helpful to write up results bit by bit as you do your experiments or analyses, rather than waiting until the final experiment.
- Write now, edit later. Very often my first draft will be full of ‘notes to self’ and ‘stream of consciousness’ text. There will necessarily be several rounds of editing before you are ready to share it with collaborators and co-authors. At the editing stage, aim for clarity and conciseness. Avoid overuse of transition phrases like ‘Next’ or ‘We then showed that’. Any sentence that forces the reader to do a double-take needs to go! Break long sentences into shorter ones. Don’t get caught up with formatting requirements at this stage; that should be the absolute last thing you do prior to submission, along with ensuring you have met the required word count. When you send your co-authors the first draft, remind them that you need substantive comments back — not formatting!
- Keep a close eye on potential plagiarism. Most journals use plagiarism software, as do most reputable research institutions, so invest in plagiarism checks with your institute. Automated software will almost always detect some level of plagiarism. I recently got a report back from a fairly reputable journal, and was somewhat amused that standard phrases in my manuscript, like “associated with an increased risk” and “as a risk factor for”, were flagged as plagiarism! Reasonable journals will not ask for changes to every phrase that the software picks up on, but will highlight the ones that require your attention. So at all costs, avoid lifting entire paragraphs or sentences from published manuscripts or online resources. ‘Self-plagiarism’ also needs to be avoided. If the topic of your paper is one you have published on before, it can be tempting to save time and recycle sections that you legitimately wrote in a previous publication. Avoid this too: regardless of who originally crafted the words, the software will flag it as plagiarism.
- The nitty-gritty.
- The title can be written at any stage of the process, and crafting one is not a minor element of the paper. Avoid long, drawn-out titles; some journals put character limits on them. As a general guide, titles that run to 2–3 lines are too long. Try to find the phrase that ‘sells’ your work and encourages the reader to want to know more. Shorter titles that are easy to understand and informative are also more likely to reach a wider audience and be shared on social media than those laden with scientific jargon (reported in the Journal of Clinical Epidemiology, 2017). More on crafting a title can be found in my blog titled “It matters how you write”.
- The Abstract may be one of the last sections you write. The word count limit forces you to focus once again on the main message. Where there is a lot of information in the Results, prioritizing what to include in the Abstract can be daunting. Work your way back to the a priori hypothesis and the main result. Remember to include key words and search terms in your Abstract that other researchers may use in PubMed or Google Scholar searches.
- The Results section is what reviewers will focus on. Take the time to properly format your tables and figures, and avoid sloppiness. As a reviewer, I have seen manuscripts with tables that look like they were just pasted into a Word document from Excel without any attempt to reformat them. The same applies to images. Be pedantic about data presentation. You run the risk of a reviewer expecting the rest of the paper to be sloppy if they cannot make sense of your data.
Planning for a manuscript takes time and commitment. My best advice to doctoral students and early career researchers is to start writing early in your career, as there is no avoiding it. Pay attention to the writing styles of your mentors, and be open to their advice, edits and comments on your drafts. Good mentors will also provide guidance in developing writing skills. Allocate time at the end of each day to actually write up your results as your research progresses. Writing skills take practice, so start early.
Finally, celebrate your achievements!! Acceptance of a manuscript for publication is a major personal milestone worth celebrating with your colleagues. So even if it is one small step in the advancement of your career, make it memorable!
At SugarApple Communications we are experienced in all stages of the publication process, and can advise on data presentation and analysis. If you are time-poor and need to get that manuscript out, get in touch today and let’s talk.
July 30th, 2018 | by Sharon Johnatty
Writing a scientific paper or thesis for most of us can be a daunting task. It reminds me of a classic poem I learnt as a child “Maria intended a letter to write, but could not begin, as she thought to indite…”. The rest of the poem was advice from her mother to think of it as though speaking to the person, but with her pen.
Oh that it could be so simple getting your manuscripts written and published! In this article I will outline some general principles that I’ve gleaned over the course of 30 years in academic research and publishing in a range of scientific journals, editing student theses and research grants, and as a peer reviewer for various biomedical journals.
In the simplest of terms, the entire paper should consist of the context set out in the Introduction, the content presented in the Results, and the conclusion brought together in the Discussion.
An important over-riding principle of scientific writing is clarity. Keep the message clear and accessible. Think about your driving hypothesis, and phrase it in the simplest of terms without compromise to accuracy.
“If you write in a way that is accessible to non-specialists, you are not only opening yourself up to citations by experts in other fields, but you are also making your writing available to laypeople, which is especially important in the biomedical fields.” (Stacy Konkiel in ‘The write stuff’ Nature 2018)
Some journals require a brief statement of the main findings written in language that is accessible to all readers. This is an opportunity to encourage the reader to want to know more about your work. This also applies to the Abstract, which should focus on the study question, why it is important, how you have addressed the question, and the broader implications of your work.
Keep in mind that PubMed searches and e-alerts will bring up only the Title and the Abstract, which is all that most people will ever read. The Abstract should be written so that those outside your field get the big picture and are enticed to access the full paper.
The Introduction should not be a comprehensive, long-winded overview of the topic. It should be concise, capturing just enough of the relevant aspects of the topic to clarify why you undertook your research, giving the reader the broad scope of the subject and noting possible deficiencies in current knowledge.
Choose references that are fairly recent, scientifically sound, and published in reputable journals. Finish the Introduction with a brief paragraph on your hypothesis and what you are about to present, which should logically flow from the information outlined in the preceding paragraphs.
The Methods section should be quite straightforward. For some it is the easiest section to write, and can be written up prior to obtaining results. Experiments can take months to complete, and analysing data and writing up your results require good record keeping. For this reason, lab books are not only critical to your manuscript, they are also legal documents in both academic and industry research.
The same applies to large-scale data analysis; you may spend weeks or months cleaning and organizing large datasets and performing quality checks before beginning the analysis. Keeping a log of what you did will pay dividends when you begin to write up your methods.
It is often a good exercise to review how methods are written for other publications in the target journal, and the level of detail that is acceptable. This is often where acronyms abound and can lead to statements that are incomprehensible. I recently read a thesis where a simple two-word expression used only three times in the entire document was turned into an acronym. It reminded me of the scene in the movie Good Morning Vietnam where Adrian Cronauer (Robin Williams) said “Seeing as how the VP is such a VIP, shouldn’t we keep the PC on the QT? ‘Cause if it leaks to the VC he could end up MIA, and then we’d all be put on KP” (I just about fell out of my seat laughing!).
Acronyms have their place, but use them sparingly and only where it helps to avoid verbosity. It’s a good idea to stick with acronyms already in use in the published literature, and avoid confusion by developing new ones, particularly for terms that are commonly used as keywords in PubMed.
The Results section should describe the results as factually and as clearly as possible. Avoid the temptation to insert justification or interpretation in the Results section. Some journals allow a combined Results and Discussion section, which helps to avoid repeating the main findings at the start of the Discussion. However, the same principle applies: the Results should describe the main findings based on the stated aims detailed in the Methods, while the Discussion allows interpretation of the results in the broader context of the topic and how they fit with the existing literature.
Avoid introducing results that are not specifically outlined as part of your Methods, or making claims that are not consistent with the evidence obtained, especially if so-called ‘exploratory analyses’ were undertaken. Likewise, any analysis outlined in the Methods should be reported in the Results. Supplementary Material is usually a good place to provide additional data or analyses, as long as it is part of the research undertaken. Large-scale genome-wide association studies often utilise Supplementary Information to provide effect estimates that reach a certain significance threshold, even if they are not all discussed in the main paper.
The Discussion can be the most challenging section to write, and requires considerable knowledge of the existing literature. Good reviewers will be well informed on the topic, and will highlight deficiencies in the Discussion or aspects of the topic that should be considered. Conclusions should be confidently stated and evidence-based. In general, a good Discussion leaves no loose ends; it shows that you have considered alternative explanations for your findings, and addresses strengths and weaknesses of the research and the reasons for them. As your research is unlikely to be the final chapter on the topic, include a statement about unanswered questions that may form the basis of ongoing work.
Finally, give serious consideration to crafting a Title that stands out. ‘Punchy’ titles have their place in scientific writing, but for manuscripts published in scientific journals, aim for one that provides a clear informative statement highlighting the most important finding of the study, and that sets it apart from others in the field. Avoid boring titles that begin with “Studies of XYZ in ABC…” or “Characterization of crumple-horned snorkaks in…”. A recent published study assessing title characteristics of health care articles concluded that easy-to-understand, declarative titles were more likely to be picked up by the popular press compared to those with more uncommon words.
The cover letter is the final document you will draft before submitting. When you consider the months or years it took to produce your paper―an important stepping stone in your scientific career―what level of importance should you place on the cover letter? I would say very high: the cover letter is arguably the most important part of your written submission, and I cannot over-emphasize the value of crafting it well. Regardless of how amazing your work is, it could determine whether your paper is considered for review or rejected outright. The cover letter should be formally written by the corresponding author and addressed directly to the Editor-in-Chief, so take the time to look up his/her name. It should be no more than a single page, briefly summing up in a few sentences the main highlight of the study, how it fills an existing gap, and why it warrants publication in your target journal. Most journals provide guidelines for what to include in a cover letter, such as suggestions for reviewers and their contact information, or reviewers to exclude. Careful thought should be given to this.
As a final pre-submission step, check the literature one last time for relevant papers that have come online while you were preparing the final draft. As the lead author, you are responsible for EVERYTHING in the paper. Read it again (and again!) for grammar and spelling errors, formatting, references, line spacing, and the multitude of non-scientific content like acknowledgements, affiliations, and referencing style. Check that tables and figures are correct and correctly formatted. Scan references for any glitches introduced by your referencing software—it sometimes has a mind of its own! Careless mistakes are a major annoyance to reviewers, suggesting that the paper was rushed out and sloppily done. Your aim should be to make life easier for a potential reviewer and your paper a pleasure to read.
There is a lot to be said for avoiding dry, boring traditional writing styles that are accessible only to a select few. Scientific writing should be factual and evidence-based, yet creative enough to draw the reader in. Strike a balance: avoid emotive language that sensationalizes the science, while still engaging the reader with uncomplicated, accessible language.
It’s great to be able to say my grandmother who is not a scientist can tell her friends about my work. Scientific writing is not about ‘dumbing it down’; it’s about telling a great story with clarity, confidence and conviction. And yes! You can!
At SugarApple Communications our writers are long-standing authors with experience in all stages of the publication process, including data management, statistical analysis, and ethical publication practice. Get in touch today and let us discuss your next publication.
January 31st, 2018 | by Sharon Johnatty
The unethical practice of ‘ghost-writing’ or non-disclosure of medical or freelance writers employed to write journal articles led to the development of guidelines for ethical and transparent publication practices by the International Committee of Medical Journal Editors (ICMJE). These were first published as early as 1979, and were recently updated in the GPP3 guidelines. ICMJE defines authorship according to the following criteria:
- Substantial contributions to the conception or design of the work; or the acquisition, analysis, or interpretation of data for the work; AND
- Drafting the work or revising it critically for important intellectual content; AND
- Final approval of the version to be published; AND
- Agreement to be accountable for all aspects of the work in ensuring that questions related to the accuracy or integrity of any part of the work are appropriately investigated and resolved.
“In their current state, the ICMJE authorship criteria also generally preclude medical writers from being authors because they usually cannot (or are unwilling to) satisfy criteria 3 and 4.” (Phil Leventhal, 2016 Medical Writing Vol 25:1)
In the context of scientific or medical publishing, the roles of professional writers and authors are not the same. Professional writers are accredited by various professional societies, mainly for expertise in technical writing. But the question of who is responsible for the accuracy of the manuscript, in terms of the data that are analysed and reported, can be a ‘grey area’ in publications by commercial entities.
Inaccurate data can be written up just as clearly and concisely as accurate data, and few will know the difference. The fact remains, however, that publications of inaccurate data are essentially ‘fake news’ and misleading to the medical and scientific community.
Although listed authors are accountable for all aspects of the work, they are generally known as key opinion leaders or KOLs — influential researchers and physicians selected by the commercial entity to participate in the study. Some KOLs are academics who may have authored hundreds of publications. Others are busy clinicians with less requirement or interest in publishing. Once the study is completed, decisions are made by relevant stakeholders to publish the findings, and a freelance or professional writer is engaged to work directly with the KOLs to create a submission-ready manuscript.
The criteria of accountability can easily be lost when a freelancer drafts a manuscript reporting on data that she has obtained from an external source. For example, a study may be commissioned by Acme Pharmaceuticals, who has selected certain KOLs to lead the study, and has engaged the services of Emca Data Systems to collect and curate their data. This data may then be analysed by Roadrunner Analytics, which provides summaries and study estimates that are given to the professional writer, Ms Kayotte, who liaises with the KOLs to draft the manuscript.
Who then is accountable for the accuracy of the published findings? Ms. Kayotte may assume that it is not her responsibility to do any accuracy checks and that what she got from Roadrunner Analytics was ready to be written up. When none of the previous entities are transparent in this process, or named in the final publication, and reputation, time frames, and costs are a primary concern, it becomes incredibly easy for matters of accuracy and accountability to be lost in this complex mix of players, protocols and data confidentiality concerns. The real question that relates to GPP3 guidelines, as well as those developed by Medicines Australia, is who ultimately is responsible for the integrity and accuracy of what is published?
As in the above scenario, a freelancer may be commissioned to write a manuscript for a client, and a preliminary analysis may be provided for review. As many of us who work with data know, the process of exporting data from one format to another can often introduce errors. It is fundamental for anyone involved in data analysis never to assume that the first round of analysis is ready for write-up. In the above scenario, the freelancer unfortunately makes this assumption!
It is simply a matter of good ethics in science to perform multiple data integrity and accuracy checks. It is advisable to have someone else on your team independently review the manuscript, even at the submission stage, to ensure that your publication will stand up to scrutiny and that, hand on heart, you can attest to its accuracy.
If errors are discovered after a manuscript is published, the ethical thing to do is submit a corrigendum or erratum to the journal. This not only attests to your integrity as a scientist, but acknowledges that errors can be discovered even after publication. Better still, before proceeding with the analysis, do some preliminary checks on the exported data, e.g. are the demographics of the study population what you are expecting? These checks can be done very quickly by those intimately involved in the study protocol. But as any author would know, you never assume a preliminary analysis is final, NEVER!!
The important question for those who hire freelancers is: are your freelancers as concerned about the quality, clarity and accuracy of manuscripts reporting your important proprietary data as an author would be? When it comes to work that really matters, it is important to employ professional writers with a long publishing history, and whose high ethical standards are reflected in the manuscripts they are responsible for.
I have written previously on the challenges of reproducible research, as this continues to generate much discussion in the academic scientific community. Given the complicated process of getting the data to the stage where a manuscript is drafted, it is essential the writer liaise with the authors and KOLs involved in commercial research to verify the accuracy of the reported data.
Clearly there is a considerable difference in expectation and responsibility attributed to authorship vs freelance medical writers. However, it should be emphasized that, for the sake of her reputation, the professional writer who puts ‘pen to paper’ (fingers to keyboard) to draft the manuscript, needs to take full responsibility for ensuring that what she is writing is an accurate representation of the data. It should be a part of her role to conduct data checks and insist that this is non-negotiable in a scientific environment that is plagued by retractions, even if the manuscript is held up in order to ensure its integrity.
The value of working with professional writers who have sufficient experience as authors, and who also have expertise in data analysis, cannot be overemphasized. Writers should at the very least be aware that several rounds of data checks are necessary before entering the submission process. In other words, before you hire a freelancer or contract with an agency, check them out on PubMed to see if they have actually published anything, and when. Junior scientists employed by medical communications agencies, and freelancers with little or no experience writing up their own work for publication, will not appreciate the nuances of data manipulation and the importance of data accuracy; nor, as novices, will they have sufficient experience to understand the implications of publishing incorrect data.
“Properly trained and experienced writers can help authors with the development of publications in a compliant, complete, and timely manner, particularly when authors have limited time… Professional medical writers have a responsibility to ensure that findings are presented clearly, accurately, and without any intent of misleading readers.” (Battisti et al 2015, GPP3)
Additionally, freelancers are under the same requirement to observe ethical practices as authors. Before drafting the manuscript, they should do their due diligence to ensure that the data they are reporting on are indeed accurate, and that sufficient data checks have been done by those commissioned to do so.
By ensuring data accuracy, you avoid the inevitable disappointments and frustrations incurred when a manuscript is withdrawn or held up because it was drafted on the basis of a preliminary analysis.
At SugarApple Communications our writers are also long-standing authors with experience in all stages of the publication process, including data management, statistical analysis, and ethical publication practice. We will liaise with all stakeholders to ensure that analyses are accurate and correctly interpreted, and follow it through to the final publication stages. Don’t risk putting your important research in the hands of multiple entities, or novices with no actual authorship experience, when your costly efforts matter!
We can help you find the best way to communicate with your intended audience and assist with writing, editing and statistics. Get in touch today and let’s talk.
June 20th, 2017 | by Sharon Johnatty
Deciding on the journal best suited to your research can be difficult, particularly for interdisciplinary research. Reputable journals are increasingly rejecting sound, well-written manuscripts without review, and with the recent explosion in new journals, deciding where to publish is an ongoing challenge.
Only you and your collaborators can assess the value of your research findings and decide where to submit your work, but some basic journal sleuthing will be a good start. It can be difficult to discriminate the legitimate journals from the predatory ones, and this could take some time, but it is well worth the effort.
My last blog outlined strategies to avoid predatory journals. In this article I continue the theme of journal selection with additional thoughts that also apply to avoiding predatory journals. A combination of these approaches will yield the best results.
- Develop a list of journals suitable for your subject area. This should ideally be done at the start of the write-up process. Research the journals relevant to your specialty area and develop a list of candidate journals for submission. I usually start by looking at articles that I have referenced or used to generate my hypothesis. Check with your co-authors, collaborators or colleagues and ask about their experiences with journals on your list and their response time. In many large collaborative groups this information is often readily shared. Update your list as you progress, or if your work is not accepted by your first choice journal.
- Read the aims and scope of the journals on your list. Beyond the evidence of transparent and accessible peer-review editorial and fee policies, the question of whether the journal is a good fit for your topic should be considered. Check that the Aims and Scope of the journal fits with your research, particularly if your topic is narrowly focused on a specific scientific question. Likewise, if you notice that a journal’s aims are so broad that they will publish just about anything, then look at some of their recently published articles before making a decision.
- Think about your target audience. Journals you find interesting and relevant to your field are also likely to be accessed by your peers or other researchers in your field. Compare articles that are similar in design and methodology to yours, and check if your target journal publishes such articles. This helps if you wish to refute a claim by a journal that ‘we do not publish’ these types of articles. I recently had the experience where a journal’s Editor-in-Chief took 15 weeks to respond, stating that they did not publish our particular methodology — which was actually spelled out in the title of the article! Consider a presubmission inquiry if in doubt, as their response may also highlight their legitimacy and save you precious time.
- Impact factor isn’t everything. Publishing in journals with a high impact factor was the second most common reason for journal selection, according to a recent opinion poll. Despite concerns that they can be manipulated, impact factors continue to play a key role in academic career progression. Journal impact factors and other journal metrics are available from Thomson Reuters Web of Science and can be used in conjunction with SCImago Journal Rankings based on Scopus. Changes in journal rankings and metrics over the past few years should be reviewed, as a drop in journal metrics may signal potential problems, but an increase may simply be the result of decreasing numbers of articles published by the journal. A recent systematic analysis explored contributors to changes in journal impact factors and recommends caution in interpretation, as the number of articles published by the journal is also part of the impact factor calculation.
- Check with your university or institution whether they have an approved list of journals. Many institutions, in cooperation with the Directory of Open Access Journals (DOAJ) and other publishers of ‘whitelists’, may already have resources in place as part of their career advancement scheme, and to encourage researchers to publish in credible journals. In the past three to four years, a number of online forums and websites that aim to protect against deceptive publication practices have been initiated. For additional guidance and advice visit the Think Check Submit website.
As part of large international consortia, I have been fortunate enough in my career to have co-authored articles published in high-impact journals, as well as a number of respectable journals with lower impact factors. In my experience, an honest appraisal of your research, both in terms of interpreting your results and assessing methodological flaws, should be considered in determining the appropriate repository for this work.
It is important to find the right balance when deciding whether your research is ready for publication. Submitting weak or incomplete findings for the sake of your publication record, or holding out too long for what you hope will be Nobel-prize-winning results, can prove demoralising. A ‘Goldilocks’ approach to deciding when and where to publish can save time and money, and spare you frustration and disappointment.
Please feel free to contact us with additional thoughts on this topic. We welcome your comments and suggestions.
June 14th, 2017 | by Sharon Johnatty
Scientific publishing has become increasingly contaminated by ‘noise’ from the explosion of new journals eager to gain market share; in 2014 alone, 1,000 new journals were launched. Even with the rise of investigative reports in reputable journals such as Science and Nature highlighting this problem, just distinguishing unethical journals from legitimate ones has become a daunting task.
In this first of a two-part series on strategies for selecting journals that are appropriate for your work while avoiding predatory journals, I have listed a number of approaches (in no particular order of importance) that collectively will yield the best results.
- Read the journal policies and articles they publish
- Legitimate journals should have clear and transparent peer-review, editorial and fee policies that are accessible on their website.
- Publication fees are not uncommon for open access journals but should be paid only when the article is accepted for publication.
- You should not need to pay ‘submission’ or ‘handling’ fees.
- Check publication guidelines on whether the author or the journal retains copyright. Reputable open access journals typically let authors retain copyright under a Creative Commons licence, and you should not be expected to transfer copyright before the article is accepted for publication.
- Look at the quality of articles published by the journal in question. If there are obvious typos, the writing is poor, or the science does not stack up, avoid it like the plague! As a writer and editor, I have real problems with journals that publish poorly written or poorly edited manuscripts. You should too.
- Verify the claimed journal impact factor. Some predatory journals have been known to use contrived impact factors on their websites or in emails to appear credible. Journal impact factors published by Thomson Reuters Web of Science are the most widely accepted journal metrics. Most universities and research institutes make this available through their online library access.
- Check the journal’s membership of recognized best-practice professional organisations. The Committee on Publication Ethics (COPE) was established in 1997 as a forum for editors and peer-reviewed journals to exchange views and advice on how to deal with research and publication misconduct; it currently has over 10,000 members worldwide. Other professional organisations include the International Association of Scientific, Technical, & Medical Publishers (STM) and the Open Access Scholarly Publishers Association (OASPA). However, no resource is completely error-free, so use this information judiciously.
- Check whether the journal is listed with the Directory of Open Access Journals (DOAJ). Although predatory journals have been found on their ‘whitelists’, DOAJ implemented a stringent review process two years ago and has delisted over a third of their journal listings in an effort to crack down on this scam. As with ‘blacklists’, a comprehensive approach including your own review of ‘whitelists’ will yield the best result, as these resources are dynamic in nature.
- Check online journal comparison tools and resources. These allow researchers not only to select the best journals for their research, but also to filter on various criteria, including performance and prestige, and to report their publishing experiences. Many are free to use, while some require a small fee. A review of this can be found here. Another new resource, scheduled to launch on 15th June 2017, requires a paid subscription to a ‘blacklist’ of 3,900 predatory journals, developed using 65 criteria that the curators say will be reviewed quarterly. The utility of this resource, and how many institutions will sign up for it, remains to be seen.
- Are the listed members of the editorial board contactable by phone or email? Cross-check claims regarding editorial board members with their own or affiliated institutional websites regarding their activities and involvement with the journal. A recent sting operation uncovered major retractions because of fabricated editors.
Many of us in academia are bombarded almost daily with emails from journals inviting us to submit manuscripts for publication or abstracts for scientific conferences. These journal names are deceptively similar to ones we are familiar with, and it is easy to fall prey to their invitations, particularly for academics in developing countries, where competition for research funding and pressure to publish can be intense. Many legitimate start-up journals, including those focused on sub-specialties important to regional research in developing countries, can be unfairly tarnished by this trend. However, some common sense when reviewing journal policies, along with the guidelines above, will go a long way towards allaying concerns about legitimacy.
As part of our publications management service at SugarApple Communications, we provide advice to our clients on journal selection as well as submission management and follow-up. We understand that publishing in recognised credible journals is critical to your success.
Regarding those pesky unsolicited email invitations to submit an article or sign up for a conference: if a quick look reveals poor grammar, spelling mistakes, or overly flattering language about your research, hit the ‘Delete’ button!
June 5th, 2017 | by Sharon Johnatty
Yes, predatory journals are out there, and their aim is to cash in on a potentially lucrative publication market by extracting fees from authors anxious to publish. Jeffrey Beall, professor and librarian at the University of Colorado, began his investigations into “predatory open access publishing” (his term) in 2008, when he started receiving emails from unfamiliar journals asking him to serve on their editorial boards. A major tip-off was that these emails contained numerous grammatical errors. He developed a list of “potential, possible, or probable predatory scholarly open-access publishers”, which grew from 18 entries in 2011 to 923 in 2016. He estimated that such journals publish about 5-10 percent of all open access articles.
The entire content of his website was removed on 15th January 2017, along with his faculty page. After much speculation on social media as to why the list was taken down, the University of Colorado stated that the decision was a personal one by Beall himself. Academics regarded this as a disaster, because the list had been an extremely important resource. Nevertheless, Beall made a considerable mark on the scientific community and on publishing houses, which undertook their own investigations. For many years open access was regarded as a thorn in the side of publishers, while researchers welcomed the concept, since it seems exorbitant to pay for access to articles funded by research grants. However, this has been the arena in which predatory journals have flourished.
In 2013 Science published the results of a sting operation in which a paper of essentially ‘fake’ science, written by a fictitious author at a non-existent research institute, was submitted to 304 open-access journals. More than half of the journals accepted the paper without noticing glaring experimental flaws, including material that a high-school chemistry student would spot. Beyond this, the operation uncovered the convoluted schemes used to conceal the identity, location and financial paper trails of publishing companies that prey upon researchers. More worrying was the fact that some reputable journals hosted by industry giants such as Sage and Elsevier also accepted the fake paper!
The question of fake editors and reviewers has also been investigated recently, with results published in Nature in March 2017. A fake application for the position of editor was submitted to 360 journals, both legitimate and suspected predatory. Alongside the application, the sting operation created accounts for the applicant on academia.edu, Google+ and Twitter, and a profile and CV listing experience and interests hopelessly inadequate for the role of an editor. The operation applied sufficiently stringent criteria for coding each journal’s response to the fake application as ‘accepted’, ‘rejected’ or ‘no response’. None of the 120 journals with an official impact factor indexed in Journal Citation Reports accepted the fake application, compared with 7% of journals listed in the Directory of Open Access Journals and 33% of predatory journals on Beall’s list.
In May 2017 Science published some astounding statistics on papers retracted by Tumor Biology, a former Springer journal, citing evidence that the journal’s peer review had relied on fake reviewers. Springer tried to scapegoat agencies specializing in manuscript editing by suggesting that these agencies were proposing fabricated reviewer names. But as was later discovered, seven members of the editorial board could not be contacted, some did not work at their listed institutes, and one who recognized the scam and tried to remove his name as a board member continued to be listed until the journal was recently taken over by Sage. Even if an author or agency proposes reviewers for a manuscript, which is not uncommon, it is the journal’s responsibility to evaluate potential reviewers and, if necessary, engage a ‘real’ one, as most reputable journals will. Springer has since publicly stated that it will develop tools for its remaining journals to make the peer-review process more robust.
At SugarApple Communications we believe ethical publication practice is of paramount importance to scientific endeavours in all fields. We make journal recommendations to our clients based on whether journal policies and author fees are available on the journal website. We also check that editorial board members are listed with their affiliations, and are recognised experts in the relevant scientific discipline.
In our next installment in this series, we will provide additional details on this ongoing problem and ways to identify and avoid these predatory journals.