Previous posts were about what a grassroots scientific journal could look like, but if it all works out we will have thousands of them. They can strengthen each other: sharing reviews reduces the workload and also creates a network of trust that shows which journals are reliable.
Peer review is a relatively new invention. It is often seen as gatekeeping, but it actually helps new and fringe authors gain the initial credibility that makes others willing to invest time in understanding their work. This has become more important with the increasing scale and internationalization of science, which make it harder to know who is doing good work. Even people working in related fields find it harder to assess the quality of studies and benefit from peer review as readers. As science becomes more interdisciplinary, this role of peer review grows as well.
Credibility of journals
Just like articles, scientific journals need credibility. Again this helps authors and readers find worthwhile studies, and again it matters most for new journals run by ordinary scientists, which in the beginning would be the typical situation for grassroots journals.
In today’s age of science micromanagement, the credibility of journals also matters for measuring the output of researchers and for deciding which articles are listed in the big databases such as the [[Web of Science]], [[Scopus]] or Google Scholar.
Journals sharing reviews shows that these scientists trust each other and weaves a network of reliable science. Showing how well the editors’ assessments of the general importance of papers correlate with future citations can also demonstrate that the editors are doing a good job. We could additionally ask the authors of the reviewed articles, who are the relevant community, how they assess the quality of the journal.
Caveats of credibility metrics
People could build up a credible journal in their field and occasionally publish ideologically motivated bunk science. That could hurt the trust of colleagues, but the penalty would be limited if those articles were in another field, and a few low-quality reviews would be hard to spot in future citation statistics. People have built up [[large networks of fake websites]] linking to each other to get better rankings in Google. That could theoretically also happen here, but it is something we could detect in the citation scores of the journals.
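One signal of such a fake network would be a group of journals that cite almost exclusively each other. A minimal sketch of that check, with invented journal names, citation counts and threshold, could look like this:

```python
# Hypothetical citation counts between journals:
# (citing journal, cited journal) -> number of citations.
citations = {
    ("ring_a", "ring_b"): 50, ("ring_b", "ring_a"): 45,
    ("ring_a", "ring_c"): 40, ("ring_c", "ring_a"): 48,
    ("ring_b", "ring_c"): 52, ("ring_c", "ring_b"): 44,
    ("ring_a", "outside"): 2,
    ("honest", "outside"): 30, ("outside", "honest"): 25,
    ("honest", "ring_a"): 3,
}

def insularity(group, citations):
    """Fraction of the group's outgoing citations that stay inside it."""
    internal = sum(n for (src, dst), n in citations.items()
                   if src in group and dst in group)
    total = sum(n for (src, dst), n in citations.items() if src in group)
    return internal / total if total else 0.0

ring = {"ring_a", "ring_b", "ring_c"}
print(round(insularity(ring, citations), 2))        # near 1.0 for a ring
print(round(insularity({"honest"}, citations), 2))  # low for one journal
```

In practice one would scan candidate groups (for example tightly connected clusters in the citation graph) rather than know them in advance, and a high insularity score would only be a flag for human editors to investigate, not a verdict.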
Some journals may legitimately not have any (or many) ties to other journals. Maybe the editors are new or young, maybe there are no other journals (yet) in related fields, maybe there are conflicts even though both sides are scientifically credible. Such cases would make it hard to assess the credibility of the journal. The only information would be metrics based on future citations of their articles, which take years to become informative.
Google likes to make everything automatic. (And makes it impossible to reach a human when things go wrong.) Automation is good for reducing the workload, but I feel human judgement is key. That is also why I propose grassroots journals rather than one big database where every scientist can vote articles up and down. It is important that editors assess the expertise of the reviewers and make sure every manuscript is reviewed.
When a traditional journal stops functioning, the publisher would step in to save its reputation. Maybe editors of several journals could team up and, together with science societies in their field, build some sort of accreditation organisation that replaces this role of publishers. Or editors could elect the members of such a group. A traditional journal has only one publisher, but a grassroots journal could be a member of several such accreditation groups, reducing the danger of power abuse by the “publisher”. Such a group could also help mediate conflicts among the editors of a journal.
How to create such a network of trust among the journals may be the most difficult part of grassroots publishing. So here especially I welcome feedback in the comments below and fresh ideas on how to make this work.