Year Five introductory webinar - questions and answers

Read through the questions asked at an introductory webinar for Year Five held in November 2025.

How can the cohort view their ratings?

The foundations in the cohort are initially sent the data from the research done by Giving Evidence in September or October and have three weeks to reply. That data is then turned into ratings: one for each domain, plus an overall rating for each foundation in the cohort. Ratings are published in March of the following year – there is an online event, and results are published on the Foundation Practice Rating website.

How many foundations do you rate each year?

Around 100.

Would you have heard by now if you were in the Year Five cohort?

Yes. All foundations have been notified if they are in the Year Five cohort. You will also have received your data for review. 

How did you decide the sample for this year's Foundation Practice Rating?
The sample comprises:
  1. all the foundations funding this project. That is because this project is not about anybody pointing the finger at anybody else: the funding foundations are all being assessed as part of their own work on improving. The foundations funding this project are listed on the partners page [1].
  2. the five largest foundations in the UK (by grant budget). This is because they are so large relative to the overall size of grant-making: the UK’s ten largest foundations give over a third of the total given by the UK’s largest 300 or so foundations. Giving by the Wellcome Trust alone is 12% of that.
  3. a stratified random subset of other foundations. We took the list of the UK’s largest foundations as published in the Foundation Giving Trends 2019 report from the Association of Charitable Foundations[2], plus the UK’s community foundations listed by UK Community Foundations[3] for which financial information is given. That gave 383 foundations. We then took a random sample: a fifth from the top quintile (by annual giving budget), a fifth from the second quintile, and so on (a rough illustration of this sampling step follows the footnotes below).

 

[1] The foundations funding this project include the Joseph Rowntree Reform Trust and Power to Change, neither of which is a registered charity. They are the only two non-charities included.

[2] https://www.acf.org.uk/policy-practice/research-publications/

[3] https://www.ukcommunityfoundations.org/
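
For readers interested in the mechanics of point 3 above, here is a minimal sketch in Python of that kind of stratified quintile sampling. It is illustrative only: the data structure, the per-quintile sampling fraction of one fifth, the optional seed, and the handling of uneven quintile sizes are our assumptions for illustration, not the exact procedure used by Giving Evidence.

```python
import random

def stratified_sample(foundations, fraction=0.2, seed=None):
    """Draw a random sample of roughly `fraction` from each annual-giving quintile.

    `foundations` is assumed to be a list of (name, annual_giving) pairs.
    """
    rng = random.Random(seed)
    # Rank foundations from largest to smallest annual giving budget.
    ranked = sorted(foundations, key=lambda f: f[1], reverse=True)
    quintile_size = len(ranked) // 5
    sample = []
    for i in range(5):
        # The last quintile absorbs any remainder from uneven division.
        end = (i + 1) * quintile_size if i < 4 else len(ranked)
        quintile = ranked[i * quintile_size:end]
        if not quintile:
            continue
        k = max(1, round(fraction * len(quintile)))
        sample.extend(rng.sample(quintile, k))
    return sample

# Hypothetical usage:
# cohort = stratified_sample([("Foundation A", 5_000_000), ("Foundation B", 750_000)], seed=42)
```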

We have used the criteria list as our own checklist - had you anticipated that organisations would do that, and do you have any longer-term plans for introducing a self-assessment checklist?

Foundations are very welcome to use the criteria as a self-assessment tool, but we don’t have any plans to create a formal self-assessment checklist. We know that 50+ criteria can be overwhelming, so we suggest working through the list in chunks if you do plan to use it this way.

You can also opt in for a fee (£1,200 in Year Five), which pays for researcher time. You receive your results, but they aren’t included in the published results for the cohort.

Has the Charity Commission shown any interest in the outputs?

[For context, the UK has three charity regulators: the Charity Commission for England and Wales; the Charity Commission for Northern Ireland; and the Office of the Scottish Charity Regulator.] 

The Foundation Practice Rating doesn’t relate to regulatory activity, so there is no need for formal involvement of the regulators.  

That said: 

  • We do use foundations’ annual reports and accounts, and some other information, as published by the regulators;  
  • The Charity Commission for England and Wales has been supportive, e.g. it has attended our events.

How many foundations opt in?

We introduced this option in Year Two after a foundation that had been in the Year One cohort asked to redo the assessment. Each year, a handful of foundations has opted in; this year, five have done so.

How can foundations opt in if they don't know that the Foundation Practice Rating exists?

We have done a lot of comms to raise awareness of the FPR. 

From the beginning, we’ve had media coverage in Civil Society Media and Third Sector Magazine; for the first three years, for example, the report was covered in a special supplement published by Civil Society Media.

Internationally, we have also written about it in the Stanford Social Innovation Review.  

In terms of sector events, we’ve spoken at many conferences: national ones and some regional ones, e.g., in Yorkshire.  

We use two social media channels (Bluesky and LinkedIn) plus our website. We used to use Twitter. Giving Evidence (which produces the research) also writes about it on their social media and website. 

In addition, the Association of Charitable Foundations comes to our Funders Group meetings and has had material about FPR on its website and communications channels. The ~12 foundations which fund FPR also talk about FPR via their own channels. We are thinking about how we can communicate more throughout the year and will do more where we can. We are open to suggestions.

Foundations have lots of requests for their time and resources - the Foundation Practice Rating is one of many things we’re asked to respond to. What should a small foundation team focus on?

Responding to the data you receive shouldn’t be too onerous – as a funder of the Foundation Practice Rating, Friends Provident Foundation is assessed every year, and reviewing our data isn’t too time intensive.

If a foundation doesn’t accept unsolicited grant applications, it doesn’t publish grant criteria. What then?

This would mean the foundation is exempt from the criteria which relate to that. Details of our exemption system are published on our website.
