It will depend on the game, but this has always been the strategy deployed with consoles.
The PS4 and XB1 aren't running 1080p@60fps natively in almost any game. Even in games that boast it, the most graphically intense scenes resort to "dynamic scaling": the game temporarily renders natively at 720p or 900p, then upscales that to 1080p for output. Many games still run at 30fps, and most peak at 720p@60fps or 900p@60fps across the board.
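For anyone curious how dynamic scaling works under the hood, here's a minimal sketch of the idea (all names and thresholds are made up for illustration, not any console's actual implementation): the engine watches frame times and steps the internal render resolution down when it blows the 60fps budget, back up when there's headroom.

```python
# Hypothetical sketch of dynamic resolution scaling: drop the internal
# render resolution when a frame goes over budget, raise it back toward
# native when there's headroom. The output is always upscaled to 1080p.

TARGET_FRAME_MS = 1000 / 60          # 60fps budget: ~16.7 ms per frame
RESOLUTIONS = [(1280, 720), (1600, 900), (1920, 1080)]  # 720p, 900p, 1080p

def choose_resolution(current_idx: int, last_frame_ms: float) -> int:
    """Step the render resolution index up or down from the last frame time."""
    if last_frame_ms > TARGET_FRAME_MS and current_idx > 0:
        return current_idx - 1       # over budget: render at a lower res
    if last_frame_ms < TARGET_FRAME_MS * 0.8 and current_idx < len(RESOLUTIONS) - 1:
        return current_idx + 1       # comfortable headroom: step back up
    return current_idx               # otherwise hold steady

# Example: a heavy 20 ms frame at native 1080p drops the game to 900p.
idx = 2                              # start at 1080p
idx = choose_resolution(idx, 20.0)
print(RESOLUTIONS[idx])              # (1600, 900)
```

Real engines are fancier (they scale continuously and smooth over several frames), but this is the basic feedback loop.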
Keep in mind the XSX has inferior raw processing power to the RTX 2080 Ti. Assuming the leaks about the RDNA 2.0 GPU architecture are true, it may actually boast equal or even superior synthetic/gaming performance to the RTX 2080 Ti despite that raw disadvantage, but we're still talking about a system that roughly equates to an R7-2700 CPU + RTX 2080 Ti GPU. And the XSX is the more powerful of the two new consoles.
That isn't sufficient to run the most demanding current games at a 4K@60fps minimum (not average) framerate on the highest settings, and we're talking about $2K+ computers bought in the past few years. This, plus the fact that high-refresh 4K monitors aren't great yet (not affordable, and even the best suffer from lag), is why all the best PC gaming websites still recommend 1440p@144Hz as the zenith of PC gaming right now.
Example, with a 9900K CPU (much stronger than the CPUs in the new consoles):
NVIDIA GeForce RTX 2080Ti runs Star Wars Jedi: Fallen Order with 44-65fps in 4K/Epic Settings
Roundup here. Notice Ghost Recon Wildlands, Shadow of the Tomb Raider, and Final Fantasy XV brought it to its knees, too. And these are average framerates. For the stable FPS consoles like to run at, you have to keep the minimum above 60fps, which means you're shooting for closer to an 85fps average. You can examine this in the charts:
https://www.eurogamer.net/articles/...07-nvidia-geforce-rtx-2080-ti-benchmarks-7001
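To make the average-vs-minimum point concrete, here's a rough illustration with made-up frame times (not real benchmark data): a run can average well above 60fps while its worst frames still dip under the 60fps floor.

```python
# Made-up frame times (ms) with one heavy spike, showing how an average
# above 60fps can hide a minimum well below it.

frame_times_ms = [10, 11, 12, 11, 13, 22, 12, 11, 18, 12]

avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
min_fps = 1000 / max(frame_times_ms)   # the slowest frame sets the minimum fps

print(f"average: {avg_fps:.0f}fps, minimum: {min_fps:.0f}fps")
# -> average: 76fps, minimum: 45fps
```

That's why a chart showing a 65fps average on a 2080 Ti doesn't mean a locked 60: the spikes are what you feel.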