🚨 New Video Alert! 🚨 Curious about the reliability of a specific #evidencesynthesis? Join us in exploring Criterion 7: Data Synthesis as part of CEE's guide to interpreting the CEESAT criteria. In this latest instalment, CEE Chief Executive Andrew Pullin delves into the nuances of CEESAT Criterion 7. Discover how review authors conducted and reported data synthesis, along with the constraints posed by their available primary data.

📺 Watch Part 7 now: https://lnkd.in/e6eZbqQA

CEE's CEESAT tool, integral to the CEE Database of Evidence Reviews (CEEDER), evaluates the confidence in review findings. Interested in other CEESAT criteria? Dive into the complete playlist here:
🎥 https://lnkd.in/e-53RkUn

#EvidenceSynthesis #CEESAT #SystematicReview #CEEDER
Collaboration for Environmental Evidence
Research Services
Reliable evidence, informed decisions, better environment
About us
We are an open community of stakeholders working towards a sustainable global environment and the conservation of biodiversity. CEE seeks to promote and deliver evidence syntheses on issues of greatest concern to environmental policy and practice as a public service. Our services to both researchers and users are provided by an international group of volunteers, so we are always looking for people to join us! You can find a list of open positions on our website at https://environmentalevidence.org/opportunities/. If you are interested in any of these, please contact us by email at [email protected]
- Website: http://www.environmentalevidence.org
- Industry: Research Services
- Company size: 2-10 employees
- Headquarters: Bangor
- Type: Nonprofit
- Founded: 2008
- Specialties: Evidence Synthesis, Systematic Review, Systematic Map, Evidence Service, Environmental Management, Climate Change, Agriculture, Conservation, Forestry, Land Use, Marine Resource Management, and Water Management
Employees at Collaboration for Environmental Evidence
- Simon Gardner, Head of Digital Environment at NERC: Natural Environment Research Council
- Barbara LIVOREIL, PhD in Biology & Ethology (1994), consultant in systematic reviews in environmental science; occupational psychologist since 2020
- Gerry Post, Chief Medical Officer at OneHealth/FidoCure
- Andrew Pullin, CEE Chief Executive Officer
Updates
-
Collaboration for Environmental Evidence reposted this
A new AI & Automation Methods Group is launching on 3rd June! The group is jointly run between the Cochrane Collaboration, The Campbell Collaboration, JBI and the Collaboration for Environmental Evidence (CEE), and will help define and support responsible AI use across these four leading evidence synthesis organizations.

💡 What are the aims?
▪️ Spearhead methods research and development and act as a bridge between evidence synthesis organizations and the wider research community.
▪️ Define best practice and ensure guidance for accepted methods is up to date.
▪️ Support the implementation of new or amended methods by acting as an advisor or through involvement in methods implementation in our respective evidence synthesis organizations.

Want to find out more?
👩💻 Join our webinar on 3rd June on 'Recommendations and guidance on responsible AI in evidence synthesis' -- https://buff.ly/3FbsYtV
🔗 Visit our AI Methods Group website -- https://buff.ly/4h1uTPe
📚 Read the news item on our website -- https://buff.ly/41zHsfT

#AIinResearch #EvidenceSynthesis #ResponsibleAI #Cochrane
-
Over three years ago, the Global Commission on Evidence to Address Societal Challenges was launched to improve how research evidence is used—not only in crises but also in shaping routine decision-making. Today, as highlighted in their Update 2025, the need for a systematic and transparent approach to using evidence remains urgent.

Collaboration for Environmental Evidence is proud to have been among the very few environmental evidence partners in this important global effort. Environmental and #conservation challenges—from #biodiversity loss to #climate adaptation—require rigorous, timely, and accessible evidence to drive effective action. The work of the Commission reinforces our commitment to advancing evidence-informed environmental policy and practice.

📖 Read more in Update 2025 (available in multiple languages): https://lnkd.in/dtVRrr5D
🎙 Hear highlights from Evidence Commission co-lead John Lavis: https://lnkd.in/gqRPJajt

#EvidenceForImpact #EnvironmentalEvidence #SystematicReviews #EvidenceCommission #GlobalChallenges #ESIC
-
Collaboration for Environmental Evidence reposted this
Finally, we start to see these types of statements appear: "Most impact claims rest on evidence that wouldn't pass basic scientific scrutiny. When millions in funding and countless lives are at stake, 'we think it's working' isn't good enough." That is the state of the evidence on evaluation - never mind the state of the evidence being used to decide to develop and then test the intervention in the first place.
Facilitator | Founder, Monitoring & Evaluation Academy | Gender & Inclusion Advocate | Follow me for quality content
Want to hear an uncomfortable truth? Most impact claims rest on evidence that wouldn't pass basic scientific scrutiny. When millions in funding and countless lives are at stake, "we think it's working" isn't good enough.

A rigorous impact evaluation ensures that development interventions truly create change—rather than just appearing to. This is a great document to help you design impact evaluations. It helps you to:

➤ Define a Strong Counterfactual
Impact evaluation must compare what actually happened with what would have happened without the intervention. Methods like randomised controlled trials (RCTs), difference-in-differences, and regression discontinuity designs provide credible counterfactuals.

➤ Beware of Selection Bias
If the people who receive an intervention are already different from those who don't, your results may be misleading. Matching techniques and instrumental variables help correct for bias.

➤ Use the Right Experimental and Non-Experimental Methods
RCTs are considered the gold standard but aren't always feasible. Quasi-experimental approaches like propensity score matching and synthetic control methods can also provide strong evidence.

➤ Move Beyond Outputs to Real Impact
Many Monitoring and Evaluation (M&E) systems mistake activities for impact—e.g., counting the number of trainings rather than measuring changes in behaviour, policy, or well-being.

Ready to measure real change? Attend this Saturday's webinar to learn how to have an M&E system that captures impact (and not just activities). We were sold out, but I added 5 additional spots! And once these go... that's it! 🏃

🔥 Sign up now: https://lnkd.in/eZtv9pmA

#MonitoringAndEvaluationSystem #Impact
-
We definitely agree with the broad message below that funders should expect #evidence of effectiveness when considering #funding #conservation action, or indeed any #environmental management actions. We also emphasise that stronger evidence is provided by #systematicreviews rather than individual studies that may be cherry-picked to support the applicant's case.
Conservation funders urged to embrace evidence-based approaches for greater biodiversity recovery. 🌱 ⬆️ New research by 18 global wildlife conservation funders, in collaboration with Conservation Evidence, outlines the need to transform conservation action by encouraging evidence-based decision making when awarding grants. To read the full paper, visit: https://lnkd.in/ewxUuYJd (Image: An arboreal wildlife bridge, designed for native dormice after seeing successes in Japan, on the Isle of Wight. It helps to reconnect fragmented woodland patches. Credit: People's Trust for Endangered Species)
-
🚨 Make Your Voice Heard! 🚨 The Evidence Synthesis Infrastructure Collaborative (ESIC) is seeking feedback on six reports that will shape its efforts to deliver reliable, timely, and accessible research evidence to those who need it most.

🗓️ Deadline for comments: March 19, 1700 UTC
📖 Read the reports & provide feedback: https://lnkd.in/e52qiNsG

Providing feedback is easy! The feedback survey asks 3 questions:
1. What aspects of the report/findings do you find most valuable?
2. What areas of the report/findings could be strengthened or improved?
3. Any suggestions on additional interest holders we should engage with or resources we should consider?

🌱 Share the reports with your #EnvironmentalEvidence network and get involved!
🌍 With support from the Wellcome Trust, ESIC is engaged in developing global frameworks for a more coordinated and collaborative approach to evidence synthesis.
👉 Learn more about ESIC: https://lnkd.in/epQTkeA6

#EvidenceSynthesis #ResearchImpact #SciPol
-
Really interesting study by Darren Rajit evaluating the effectiveness of automated citation searching across 27 #systematicreviews in the health, social science and #environmental sectors (using our #CEEDER database).
✨ New Paper Alert: Feels good to finally get some of my evidence synthesis automation work out :) A few highlights from our new paper simulating and evaluating automated citation searching within evidence retrieval:

🔍 We evaluated automated citation searching across 27 systematic reviews spanning health (from the Cochrane Library), environmental management (from the Collaboration for Environmental Evidence database), and social policy (from The Campbell Collaboration), using the OpenAlex and Semantic Scholar APIs.

📊 Key findings: Automated methods had poor recall, though better precision and F1 scores.

🌿 Performance varied by discipline! Environmental management reviews showed better results than social policy reviews, possibly due to the greater concentration of grey literature within the social policy reviews in our dataset.

💡 Practical takeaway: Automated citation searching works best as a supplementary strategy rather than a standalone method, particularly in contexts where precision matters as much as recall. I'd probably use it as a quick scoping / sanity search, or to assemble a gold set to validate a search strategy later on. YMMV.

📖 Full open access paper here: https://lnkd.in/gZ6JPXCW
🛠️ Try out automated citation searching for yourself (note: experimental): https://lnkd.in/g9_53fzr

Huge thanks to my supervisors: Joanne Enticott, Helena Teede and Lan Du. Also - big thanks to the OpenAlex team at Our Research and the Semantic Scholar team at Ai2 for API access 💖
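The recall / precision / F1 trade-off reported above can be made concrete with a small sketch. This is not code from the paper; the function name and inputs are hypothetical, and it simply shows how the three scores are computed when an automated method's retrieved records are compared against a review's gold-standard set of included studies:

```python
def retrieval_scores(retrieved, relevant):
    """Score an automated citation search against a gold standard.

    `retrieved` is the set of record IDs the automated method found;
    `relevant` is the set of studies the published review included.
    """
    retrieved, relevant = set(retrieved), set(relevant)
    true_positives = len(retrieved & relevant)
    precision = true_positives / len(retrieved) if retrieved else 0.0
    recall = true_positives / len(relevant) if relevant else 0.0
    # F1 is the harmonic mean of precision and recall
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1
```

A method with high precision but poor recall, as reported above, mostly returns relevant records yet misses many of a review's includes, which is why it is framed as a supplement rather than a standalone search.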
-
Collaboration for Environmental Evidence reposted this
What are some "zombie" 🧟♀️ ideas in ecological (or other fields') meta-analysis? I have started this list:

(high heterogeneity makes these unreliable)
1) 🧟♀️ Trim and Fill 🧟♀️
2) 🧟♀️ Egger's Regression 🧟♀️
3) 🧟♀️ Fail-safe N 🧟♀️

(low power makes these unreliable)
4) 🧟♀️ Vote Counting 🧟♀️
5) 🧟♀️ Over-reliance on p-values 🧟♀️
-
🚨 Deadline Extended: 31 May 2025 🚨 #CallForPapers Environmental Evidence is still accepting submissions for our special collection on AI for Systematic Evidence Synthesis in Environmental Management! 🌍 🤖

If your research explores how AI can enhance evidence synthesis, don't miss this chance to contribute to the collection edited by Biljana Macura, Shinichi Nakagawa, and Samantha Cheng!

🔗 Learn more & submit: https://lnkd.in/dfj6FjgS

#AI #EnvironmentalScience #EvidenceSynthesis #SystematicReviews #Research #MachineLearning #Sustainability
-
👏 👏 to CEE Board member Matthew Grainger for his work with ESHackathon in developing new tools for #evidencesynthesis!
Looking for tools to support Systematic Evidence Synthesis? We’ve developed several open-access (completely free!) tools to streamline systematic reviews, meta-analyses, and other evidence synthesis methods—plus, more are in development! 🔗 Explore them here: https://lnkd.in/dtP9npPN Do you have an idea for a new tool? Let’s collaborate! Reach out, and we might be able to help bring it to life. #EvidenceSynthesis #SystematicReview #MetaAnalysis