Nadia Makarevich has some great numbers from a realistic benchmark. (h/t to Chris Coyier for putting it in my feed.) The whole writeup is worth reading, since it goes in depth on several data-loading strategies. The summary is excellent:

Server Components alone don't improve performance if the app is a mix of Client and Server components. They don't reduce the bundle size enough to have any measurable performance impact. Streaming and Suspense are what matter. The main performance benefit comes from completely rewriting data fetching to be Server Components-first.
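
For a concrete picture of what "Server Components-first" data fetching with streaming looks like, here's a minimal Next.js App Router-style sketch (the `fetchPosts` helper and the endpoint URL are hypothetical stand-ins for a slow backend call):

```tsx
import { Suspense } from "react";

// Hypothetical fetcher standing in for a slow backend call.
async function fetchPosts(): Promise<{ id: number; title: string }[]> {
  const res = await fetch("https://api.example.com/posts");
  return res.json();
}

// An async Server Component: it fetches on the server, and its
// rendered HTML streams to the client once the data resolves.
async function Posts() {
  const posts = await fetchPosts();
  return (
    <ul>
      {posts.map((post) => (
        <li key={post.id}>{post.title}</li>
      ))}
    </ul>
  );
}

// The shell renders immediately; Suspense streams <Posts /> in
// when it's ready instead of blocking the whole page on the fetch.
export default function Page() {
  return (
    <main>
      <h1>Posts</h1>
      <Suspense fallback={<p>Loading posts…</p>}>
        <Posts />
      </Suspense>
    </main>
  );
}
```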

React Router proved that you don't need React Server Components to stream data from the server in React (and pretty much every other frontend framework has its own streaming rendering solution these days), so what's the point? A novel component architecture that spans two computers? It's technically very cool, but the practicality and the tradeoffs aren't self-evident.
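
For comparison, here's roughly the same streaming behavior using React Router's deferred-data API (available since 6.4), no Server Components required. Again, `fetchPosts` is a hypothetical stand-in:

```tsx
import { Suspense } from "react";
import { Await, defer, useLoaderData } from "react-router-dom";

// Hypothetical fetcher for the same slow data as before.
async function fetchPosts(): Promise<{ id: number; title: string }[]> {
  const res = await fetch("https://api.example.com/posts");
  return res.json();
}

// Returning the promise without awaiting it lets the initial
// response stream while the data is still in flight.
export function loader() {
  return defer({ posts: fetchPosts() });
}

export default function PostsRoute() {
  const { posts } = useLoaderData() as {
    posts: Promise<{ id: number; title: string }[]>;
  };
  return (
    <Suspense fallback={<p>Loading posts…</p>}>
      <Await resolve={posts}>
        {(resolved: { id: number; title: string }[]) => (
          <ul>
            {resolved.map((post) => (
              <li key={post.id}>{post.title}</li>
            ))}
          </ul>
        )}
      </Await>
    </Suspense>
  );
}
```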

React's entire MO is "What cool things happen when you build UIs with a pure functional paradigm?" Async React and the React Compiler are amazing, groundbreaking demonstrations of what's possible, but with so many other valid ways of doing the same thing, is the juice worth the squeeze?

And I want to be clear — if RSCs work for you, more power to you. Again, they're really cool. I had a little 🤯 moment when the Next.js team announced "use cache". Even the Remix team has enabled React Server Components in React Router.
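
If you haven't seen it, "use cache" is Next.js's experimental directive (still behind a config flag as of this writing) for marking a function as cacheable so its result gets reused instead of re-fetched. A rough sketch, with a hypothetical fetcher:

```tsx
// Marking the function body with "use cache" tells Next.js to
// cache its result rather than re-running it on every request.
async function getProducts(): Promise<{ id: number; name: string }[]> {
  "use cache";
  const res = await fetch("https://api.example.com/products");
  return res.json();
}

export default async function ProductsPage() {
  const products = await getProducts();
  return (
    <ul>
      {products.map((p) => (
        <li key={p.id}>{p.name}</li>
      ))}
    </ul>
  );
}
```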

What I'm pushing back against is the idea that RSCs are universally better than every other way of fetching data in React, much like React isn't necessarily better than any other frontend framework (or no frontend framework).

The part that goes unsaid in Nadia's exploration is that the best way to improve a site's performance is to fix the actual bottleneck: the slow backend responses. I know, easier said than done, but streaming and novel data-fetching techniques are ultimately band-aids over slow source data.