Content of review 1, reviewed on February 21, 2024

The manuscript "Interactions among anthropogenic stressors in freshwater ecosystems: a systematic review of 2,396 multiple-stressor experiments" uses a systematic review framework to assess interactions among multiple stressors in freshwater ecosystems. It is a nicely written and well put together manuscript. I do, however, have a number of questions and comments. Apologies that some of the comments below are quite long-winded in an attempt to explain what I mean; most of the paragraphs require only one or two actions!

Methods:
For the machine learning algorithm, from what I can see there was no testing of the influence of the training dataset on the ranking of relevance, and the training dataset itself is only briefly described in the methods. It would be good to explain it upfront, as the training data has significant potential to influence the results of the study. Can the authors be sure that the selection and training of the algorithm has not imposed a systematic bias on the results of the paper? Some further explanation of this, or some form of sensitivity analysis, would be a nice addition to the paper. Along the same lines, does including the 1,000 papers for the accuracy assessment influence the process? For example, if those 1,000 papers have an overall greater or lesser relevance (i.e., proportion of relevant vs. not relevant records), might this influence the results across the different subsets? I appreciate the 1,000 papers are there to combat this to some degree, but it was just something that crossed my mind! Splitting the process into subsets could also introduce variability in results between subsets, and I think it would be good to provide some assessment of this - i.e., running the algorithm for different random combinations of subsets to check for the influence of the splitting of the records.
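To illustrate the kind of check I mean, here is a minimal sketch of how the variability introduced by splitting the records into subsets could be assessed: repeat the random split many times and summarise how much the proportion of relevant records varies between subsets. All names and figures below are hypothetical, not taken from the manuscript.

```python
import random
import statistics

def subset_relevance_spread(labels, n_subsets, n_repeats, seed=0):
    """For repeated random splits of the records into subsets, measure
    how much the proportion of relevant records varies across subsets.
    Returns the mean within-split standard deviation of that proportion."""
    rng = random.Random(seed)
    spreads = []
    for _ in range(n_repeats):
        shuffled = labels[:]
        rng.shuffle(shuffled)
        size = len(shuffled) // n_subsets
        proportions = [
            sum(shuffled[i * size:(i + 1) * size]) / size
            for i in range(n_subsets)
        ]
        spreads.append(statistics.stdev(proportions))
    return statistics.mean(spreads)

# Simulated screening labels: 20% relevant overall (illustrative figures only)
labels = [1] * 200 + [0] * 800
spread = subset_relevance_spread(labels, n_subsets=5, n_repeats=100)
print(f"mean SD of relevant proportion across subsets: {spread:.3f}")
```

A small spread would reassure readers that the split itself is not driving differences between subsets; the same repeat-and-summarise logic could be applied to the relevance rankings produced by the algorithm.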

I think the methodology has the potential to be used across other systems and stressors, and it would therefore be good to make it as clearly described and open as possible in the methods section. At the moment most of the 'useful' information is in the Supplementary Materials and a GitHub repo - which is great, but including a flow diagram in the methods would be nice. Also, in terms of the GitHub repo, it would be good to store an archived version on Zenodo or another platform - this way any further changes or adaptations can be solidly recorded as a new version. There are some help pages here on releasing versions of a repository: https://docs.github.com/en/repositories/releasing-projects-on-github/managing-releases-in-a-repository

Results:
This is a relatively descriptive section, and it would be good to highlight the novel findings from this process; at the moment these are not particularly clear. Some form of further analysis of the data (i.e., a meta-analysis) might provide interesting ecological insights of wider interest to the ecological community.
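As one example of the kind of further analysis I have in mind, the dataset could be used to quantify how observed combined-stressor effects deviate from an additive null expectation - the standard basis for classifying interactions as synergistic, antagonistic, or additive. A minimal sketch (the response values below are purely illustrative, not from the manuscript):

```python
def interaction_deviation(control, stressor_a, stressor_b, combined):
    """Deviation of the observed combined effect from the additive
    expectation (control + individual effect of A + individual effect of B).
    A non-zero deviation indicates a non-additive interaction."""
    effect_a = stressor_a - control
    effect_b = stressor_b - control
    additive_expectation = control + effect_a + effect_b
    return combined - additive_expectation

# Purely illustrative response means (e.g., invertebrate abundance)
deviation = interaction_deviation(control=10.0, stressor_a=8.0,
                                  stressor_b=7.0, combined=6.5)
print(deviation)  # 1.5: combined impact weaker than additive (antagonistic)
```

Aggregating such deviations (suitably standardised) across the compiled experiments would turn the descriptive review into a quantitative synthesis.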

Framework for detecting stressor interactions:
This is where the really novel and interesting aspect of the manuscript comes in - but apart from an initial link to the systematic review, there is nothing to suggest that these suggestions and recommendations are uniquely derived from the above analysis. Focusing the manuscript more on these sections, and providing more underlying detail, would in my opinion strengthen the manuscript.

Overall, I enjoyed reading this manuscript, but I think more could be made of the novel findings from the review; this would enhance the reach and impact of the paper - especially for the readers of Ecology Letters, who are not only freshwater or multiple-stressor researchers. I think there are two clear options:

  1. An additional meta-analysis of the dataset - this might be better suited to this journal, providing some key ecological insights as well as the descriptive review.
  2. Focusing on the latter sections (4-5) and providing a clearer link between the systematic review and these findings (which are novel and insightful).

Source

    © 2024 the Reviewer.

Content of review 2, reviewed on May 23, 2024

Having reviewed a previous version of this manuscript, it is a pleasure to see so many of the suggestions taken on board. The work is much improved: there is a lot more analysis and synthesis of the dataset, and it now really shows off the power of such approaches. It is very nicely written, covers a lot of ground, and is now looking really polished.

Source

    © 2024 the Reviewer.