This topic has been locked
8=========D~ Jan 20, 2023 @ 3:33pm
RTX 4090 is not for Native 4K gaming with the latest game engine
With a huge price tag of $3000 (if you're lucky $2000; with tax, approximately $2400), the card is only barely able to touch 4K 60 fps on the latest game engine, Unreal Engine 5.1. So is it safe to assume ALL GPUs are designed to run games at 1440p, and we should all be OK paying $3000 just to be satisfied at 1080p and 1440p? It was not too long ago that a GTX 1080 Ti was able to run games at 4k 60. Yes, that was based on games from that generation, but shouldn't a GPU released now be able to run all games at at least 4k 120 fps on the latest game engine, if it's the top-end GPU of the current generation?

Sure, if you use old-generation engines it does reach that fps, but what about future games using the latest engine? Are we expected to pay $3000 every year for the next GPU?

Here is the benchmark, and keep in mind I am only referring to NATIVE 4K, not DLSS: https://www.youtube.com/watch?v=dr7LpP7Dm4E


Update, January 27, 2023: the newly released game Forspoken runs at 43 FPS at native 4K. It's not even been a year since the 4090's release, and this is its performance: https://www.youtube.com/watch?v=U0u9l4Wkh9s

Update, February 7, 2023

The newly released (as of this date) Hogwarts Legacy doesn't even get 60 fps: https://www.youtube.com/watch?v=5dKUpcMckBg

Let's make sure to blame every single game developer from every company and pretend the RTX 4090 is actually a 4k card, and that all developers from all studios and all companies are doing everything wrong. It's DEFINITELY NOT Nvidia pretending and lying publicly about the RTX 4090 being a true "native 4k" GPU.

Lets all keep pretending.
Last edited by 8=========D~; Feb 7, 2023 @ 3:16pm
Showing 151-165 of 616 comments
emoticorpse Jan 25, 2023 @ 9:59am
Originally posted by Komarimaru:
Originally posted by emoticorpse:

I'm not going to say that Nvidia claimed something like that since obviously it's too direct a claim that could easily be construed as absolute truth and then lead to backlash from consumers when it doesn't work out.

However, since businesses create confusion and open debate by being ambiguous at times, I would say that right now Nvidia is hyping up the 4090 to the point where you would think this card can easily do 4k for the next couple of years.

The reason I say this is the line on the official 4090 page where "8k gaming" is basically stamped as a major feature. 8k is nowhere near mainstream; it's not even a standard yet, and is years away. So I would have to deduce that if this card can do 8k right now, it should be able to handle 4k easily.

But this is where, as I mentioned just now about ambiguity, they are still covered. They could (or you could) argue they never said "max settings" and come back with "well, what we meant was 8k low settings, RT disabled, and 30 fps", and they'd be right.
Well, if you read the site, they say this for it:
7680x4320 Resolution, Highest Game Settings, DLSS Super Resolution Highest Performance Mode When Applicable, DLSS Frame Generation When Applicable, i9-12900K, 32GB RAM, Win 11 x64.

Not once did they claim 8K native, ever. 4K native, easily, in many games. Better than AMD's BS about 8K gaming, which the 7900 XTX can't even do with FSR, whereas the 4090 can, under the settings listed, at 80+ FPS.

How did I imply the 8k performance was native? That wouldn't make any sense. If the 8k performance were native, that would mean it could do an even higher resolution with DLSS.


If that were the case, 4k would be a cakewalk. Why would you think I was claiming a 4090 could do 8k native? That's just crazy. If it could, this thread wouldn't even exist.
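The "if it can do 8k, 4k is a cakewalk" deduction comes down to pixel counts. A quick sanity check in Python (a simplification: real GPU load does not scale purely with pixel count, since geometry, shading, and CPU costs don't all scale with resolution):

```python
# Pixel counts for common gaming resolutions, and their ratio to 4K.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}

for name, count in pixels.items():
    print(f"{name}: {count:,} pixels ({count / pixels['4K']:.2f}x 4K)")

# 8K pushes 4x the pixels of 4K, which in turn is 4x the pixels of 1080p.
```

So a card genuinely comfortable at 8K would have roughly 4x the pixel throughput needed for 4K, which is why the marketing claim invites that inference.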
Azza ☠ Jan 25, 2023 @ 10:35am
Originally posted by 8==========D:
Originally posted by r3d33m3r:
Dumbest topic ever. The 4090 is very much a card suited for 4K gaming. It runs all current games at high framerates, even with ray tracing enabled.

If you want even more fps turn off RTX and/or use dynamic resolution.
Keyword "current": it expires in approximately 11-13 months, once new games use new game engines. Now it's a 30 fps card at 4k.

False statement.

I have an RTX 4090 and have yet to have a game drop below 56 FPS at 4K resolution, using very high to ultra quality settings. Either use an optimized game or the correct game settings. Also, get the latest Nvidia drivers.

With DLSS 3.0, and with new games added monthly, it doubles the FPS to 120 or so. Ideal for my 120Hz monitor's refresh rate. That includes ray tracing and HDR quality.

Cyberpunk is quite a rough game, but this video shows up close the difference real-time ray tracing makes...

https://www.youtube.com/watch?v=Xf2QCdScU6o

https://www.youtube.com/watch?v=cJ1ubgaCl6I

Most people don't understand or respect the amount of processing power needed to do that, even having seen it. So if you are happy at 1440p resolution with preloaded shaders, stick to that and you won't need as crazy a beast of a graphics card to drive it. If you want 4K + HDR + ray tracing, then what do you expect? You're still comparing today's game engines to ones from the early 2000s, which had none of that quality or real-time processing.

https://www.youtube.com/watch?v=oCsgTrGLDiI

That is what the RTX cards are trying to drive towards in real time; hence the marketing abbreviation RTX, meaning "Ray Tracing Texel eXtreme". If you don't want to use ray tracing, why would you buy a graphics card labelled as such? It also helps with virtual reality 3D object environments and with calculating 3D surround soundscapes bouncing off those objects.

We want to be able to do that for a more immersive gaming experience, and even for creating CGI in movies, etc. Shadows have a huge impact here, especially in adventure or horror games. Consider going through a dark cave lit by a flaming torch, with sun rays coming through, and the soft shadow of a monster cast on the wall as it roams past. All with ray tracing and HDR quality, with rich whites and pure black darkness. Previously that would have all just been a dark, greyed-out blur. It's the right step towards next-gen PC gaming.

However, turn off DLSS 3.0 and Ray Tracing if desired:

https://www.youtube.com/watch?v=ELQ1hG7mOv0

The RTX 4090 still gets a minimum of 70 FPS / average 95 FPS / high of 126 FPS at ultra settings in Tomb Raider and other games...
Last edited by Azza ☠; Jan 25, 2023 @ 10:57am
Komarimaru Jan 25, 2023 @ 11:00am
Originally posted by emoticorpse:
Originally posted by Komarimaru:
Well, if you read the site, they say this for it:
7680x4320 Resolution, Highest Game Settings, DLSS Super Resolution Highest Performance Mode When Applicable, DLSS Frame Generation When Applicable, i9-12900K, 32GB RAM, Win 11 x64.

Not once did they claim 8K native, ever. 4K native, easily, in many games. Better than AMD's BS about 8K gaming, which the 7900 XTX can't even do with FSR, whereas the 4090 can, under the settings listed, at 80+ FPS.

How did I imply the 8k performance was native? That wouldn't make any sense. If the 8k performance were native, that would mean it could do an even higher resolution with DLSS.


If that were the case, 4k would be a cakewalk. Why would you think I was claiming a 4090 could do 8k native? That's just crazy. If it could, this thread wouldn't even exist.
4k is a cakewalk for it, better than any other GPU can manage right now; literally twice the 4k performance of any other GPU. Compare the 4090 to any other GPU, especially once ray tracing is enabled, and doubly so with ray tracing + Lumen in Fortnite.

A 3090 Ti, even with DLSS Quality enabled, barely keeps up with a 4090 running native 4K; DLSS Quality means it's rendering at 2560x1440.
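That "DLSS Quality means 2560x1440" remark follows from DLSS's per-axis render scale factors. A quick sketch (scale factors as commonly documented; exact values can vary by title and DLSS version, so treat these as assumptions):

```python
# Approximate per-axis DLSS render scale factors (commonly documented values).
DLSS_SCALES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def render_resolution(out_w, out_h, mode):
    """Internal resolution DLSS renders at before upscaling to the output."""
    s = DLSS_SCALES[mode]
    return round(out_w * s), round(out_h * s)

# 4K output with DLSS Quality renders internally at 1440p:
print(render_resolution(3840, 2160, "Quality"))      # (2560, 1440)
print(render_resolution(3840, 2160, "Performance"))  # (1920, 1080)
```

So a "4K DLSS Quality" result is really a 1440p render upscaled, which is exactly why it's not comparable to native 4K numbers.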

Yeah, I think the entire thread is a joke, since they posted garbage, won't buy anything anyway, and would rather complain without knowing any facts.
Last edited by Komarimaru; Jan 25, 2023 @ 11:04am
emoticorpse Jan 25, 2023 @ 11:03am
Originally posted by Azza ☠:
Originally posted by 8==========D:
Keyword "current": it expires in approximately 11-13 months, once new games use new game engines. Now it's a 30 fps card at 4k.

False statement.

I have an RTX 4090 and have yet to have a game drop below 56 FPS at 4K resolution, using very high to ultra quality settings. Either use an optimized game or the correct game settings. Also, get the latest Nvidia drivers.

With DLSS 3.0, and with new games added monthly, it doubles the FPS to 120 or so. Ideal for my 120Hz monitor's refresh rate. That includes ray tracing and HDR quality.

Cyberpunk is quite a rough game, but this video shows up close the difference real-time ray tracing makes...

https://www.youtube.com/watch?v=Xf2QCdScU6o

https://www.youtube.com/watch?v=cJ1ubgaCl6I

Most people don't understand or respect the amount of processing power needed to do that, even having seen it. So if you are happy at 1440p resolution with preloaded shaders, stick to that and you won't need as crazy a beast of a graphics card to drive it. If you want 4K + HDR + ray tracing, then what do you expect? You're still comparing today's game engines to ones from the early 2000s, which had none of that quality or real-time processing.

https://www.youtube.com/watch?v=oCsgTrGLDiI

That is what the RTX cards are trying to drive towards in real time; hence the marketing abbreviation RTX, meaning "Ray Tracing Texel eXtreme". If you don't want to use ray tracing, why would you buy a graphics card labelled as such? It also helps with virtual reality 3D object environments and with calculating 3D surround soundscapes bouncing off those objects.

We want to be able to do that for a more immersive gaming experience, and even for creating CGI in movies, etc. Shadows have a huge impact here, especially in adventure or horror games. Consider going through a dark cave lit by a flaming torch, with sun rays coming through, and the soft shadow of a monster cast on the wall as it roams past. All with ray tracing and HDR quality, with rich whites and pure black darkness. Previously that would have all just been a dark, greyed-out blur. It's the right step towards next-gen PC gaming.

However, turn off DLSS 3.0 and Ray Tracing if desired:

https://www.youtube.com/watch?v=ELQ1hG7mOv0

The RTX 4090 still gets a minimum of 70 FPS / average 95 FPS / high of 126 FPS at ultra settings in Tomb Raider and other games...

So if your 4090 gets you 4k/120fps/highest settings/raytracing/hdr with DLSS, what do you think it will give you at 8k?
Azza ☠ Jan 25, 2023 @ 11:20am
Originally posted by emoticorpse:
Originally posted by Azza ☠:

False statement.

I have an RTX 4090 and have yet to have a game drop below 56 FPS at 4K resolution, using very high to ultra quality settings. Either use an optimized game or the correct game settings. Also, get the latest Nvidia drivers.

With DLSS 3.0, and with new games added monthly, it doubles the FPS to 120 or so. Ideal for my 120Hz monitor's refresh rate. That includes ray tracing and HDR quality.

Cyberpunk is quite a rough game, but this video shows up close the difference real-time ray tracing makes...

https://www.youtube.com/watch?v=Xf2QCdScU6o

https://www.youtube.com/watch?v=cJ1ubgaCl6I

Most people don't understand or respect the amount of processing power needed to do that, even having seen it. So if you are happy at 1440p resolution with preloaded shaders, stick to that and you won't need as crazy a beast of a graphics card to drive it. If you want 4K + HDR + ray tracing, then what do you expect? You're still comparing today's game engines to ones from the early 2000s, which had none of that quality or real-time processing.

https://www.youtube.com/watch?v=oCsgTrGLDiI

That is what the RTX cards are trying to drive towards in real time; hence the marketing abbreviation RTX, meaning "Ray Tracing Texel eXtreme". If you don't want to use ray tracing, why would you buy a graphics card labelled as such? It also helps with virtual reality 3D object environments and with calculating 3D surround soundscapes bouncing off those objects.

We want to be able to do that for a more immersive gaming experience, and even for creating CGI in movies, etc. Shadows have a huge impact here, especially in adventure or horror games. Consider going through a dark cave lit by a flaming torch, with sun rays coming through, and the soft shadow of a monster cast on the wall as it roams past. All with ray tracing and HDR quality, with rich whites and pure black darkness. Previously that would have all just been a dark, greyed-out blur. It's the right step towards next-gen PC gaming.

However, turn off DLSS 3.0 and Ray Tracing if desired:

https://www.youtube.com/watch?v=ELQ1hG7mOv0

The RTX 4090 still gets a minimum of 70 FPS / average 95 FPS / high of 126 FPS at ultra settings in Tomb Raider and other games...

So if your 4090 gets you 4k/120fps/highest settings/raytracing/hdr with DLSS, what do you think it will give you at 8k?

I haven't tested any 8K monitors, nor do I wish to.

The point is that 1440p resolution is already maxed out, and those who wish can sit happy on it. Just like 1080p was maxed out in the past, we are now touching on 4K; on top of that, there are new features generated in real time.

Basically put: it's like someone complaining about the original cost of a colour TV when they had just arrived, then comparing that colour TV with the old black-and-white and saying it's not as good. Remember when we used to play 8-bit games, and they're so cheap now, blah, blah, blah. It's an illogical argument, living in the past.
emoticorpse Jan 25, 2023 @ 12:00pm
Originally posted by Azza ☠:
I haven't tested any 8K monitors, nor do I wish to.

Convenient that you don't want to talk about 8k performance and Nvidia's 8k marketing when you have a 4090. I get it; I was under the wrong impression.


Originally posted by Azza ☠:
The point is that 1440p resolution is already maxed out, and those who wish can sit happy on it.

I understand this point, but that isn't the point of this thread. I'll repeat it again, though: your point is valid. I say the same thing at times and genuinely believe it. But I'll also point out that just because a gamer should appreciate and be happy with a full 1440p experience doesn't give Nvidia the right to mislead people into thinking the 4090 is an 8k card.

I already noted the ambiguity of Nvidia's claims (even though Komarimaru helped my case by further specifying Nvidia's details on 8k performance). The way I see it, they have "marketed" the 4090 as an 8k card, even if that's not 8k/144hz/rtx/ultra settings. My whole point is that, by that claim, a 4k experience should be easily achievable.

I mean, it says right there in big bold 3d text "8k HDR GAMING", and yet I'm not seeing 4k anywhere; I do see "3840x2160 Resolution" in tiny letters, and not even an fps figure is given in the performance section. All they did was put some bars side by side comparing it to a 3090 Ti.

Originally posted by Azza ☠:
Just like 1080p was maxed out in the past, we are now touching on 4K; on top of that, there are new features generated in real time.

I also feel this helps my case. If 1440p is maxed out and we are "now touching on 4K", then why is the 8k hype bigger on the page than the 4k?


Originally posted by Azza ☠:
Basically put: it's like someone complaining about the original cost of a colour TV when they had just arrived, then comparing that colour TV with the old black-and-white and saying it's not as good. Remember when we used to play 8-bit games, and they're so cheap now, blah, blah, blah. It's an illogical argument, living in the past.

In the OP's case, he probably is an example of the people you're talking about. But just because he's the one who started the thread, and you don't like his take on Nvidia and 4k performance, doesn't mean another claim should be swept under the rug. My whole thing started with someone asking where Nvidia marketed the card as a 4k card, and so far I've been giving my answer to that.

Also, I'm not angry about the cards not reaching 4k. I doubt you'll believe this though.
Last edited by emoticorpse; Jan 25, 2023 @ 12:33pm
8=========D~ Jan 25, 2023 @ 12:44pm
Originally posted by Ulfrinn:
Originally posted by 8==========D:
I personally think 4k gaming is definitely possible, but it's not quite there in terms of native performance at high fps; as of now it's only there for single-player games.

It isn't, not unless you want to stick with last-gen games, just at a higher resolution. Sure, you could run an N64 game at 4K and still get the same frame rates; you might even be able to do it for Gamecube-era stuff. But do those games still look good to you just because the resolution is higher? No, they don't. You want things like better lighting, more realistic physics, smoother animations, and that's going to cost hardware resources. Whenever a new GPU comes out that could theoretically make it possible, the performance costs of new games bring you right back to where you were.

And let's be real, 4K isn't that much noticeably sharper at normal viewing distances; you would have to lean in to see a difference. And you're usually not going to maintain a stable 60, but even if you do, 1440p and 1080p will be running at over 120fps. And guess what? The jump in smoothness from 60fps to 120fps has a much more noticeable effect on gaming than going to higher resolutions.

It's a gimmick to sell you new hardware and new monitors. And it's not the last useless gimmick that will only exist to make you buy more hardware. If everyone bought 4K crap today, they'd just go all-in on making you think 8K was the future immediately after.
To be fair, that's exactly what I'm trying to tell others: we don't have 4k native performance yet... unless we all go back to making games on 10-20 year old engines, that's the only way. If we want optimal performance at 4k with the newest game engines, it's like 30-50 fps average at 4k high settings with the most expensive card.
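The 60-vs-120 fps smoothness point above is easier to reason about in frame times, since each doubling of fps buys a smaller absolute reduction in per-frame latency. A quick sketch:

```python
# Frame-time view of fps jumps: how many milliseconds each doubling saves.
def frame_time_ms(fps):
    """Milliseconds spent on each frame at a given frame rate."""
    return 1000.0 / fps

for lo, hi in [(30, 60), (60, 120), (120, 240)]:
    saved = frame_time_ms(lo) - frame_time_ms(hi)
    print(f"{lo} -> {hi} fps: saves {saved:.1f} ms per frame")

# 30->60 saves ~16.7 ms per frame, 60->120 only ~8.3 ms, 120->240 just ~4.2 ms.
```

This is why the first jump to 60 fps feels dramatic while further doublings show diminishing returns, which cuts both ways in the 4K-vs-high-refresh argument.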
Dr.Shadowds 🐉 Jan 25, 2023 @ 12:55pm
The RTX 4090 released in Oct 2022.

Unreal Engine 5.1 released in Nov 2022, a month after the RTX 4090.

Fortnite's engine update to 5.1 was less than a month ago...

Again, this card was not marketed around UE 5.1, and it wasn't even marketed with the promise that you will get 120 FPS in all games at 4k from its release and into the future. What are you smoking?
Last edited by Dr.Shadowds 🐉; Jan 25, 2023 @ 1:08pm
8=========D~ Jan 25, 2023 @ 12:56pm
Originally posted by emoticorpse:
Originally posted by Komarimaru:
Well, if you read the site, they say this for it:
7680x4320 Resolution, Highest Game Settings, DLSS Super Resolution Highest Performance Mode When Applicable, DLSS Frame Generation When Applicable, i9-12900K, 32GB RAM, Win 11 x64.

Not once did they claim 8K native, ever. 4K native, easily, in many games. Better than AMD's BS about 8K gaming, which the 7900 XTX can't even do with FSR, whereas the 4090 can, under the settings listed, at 80+ FPS.

How did I imply the 8k performance was native? That wouldn't make any sense. If the 8k performance were native, that would mean it could do an even higher resolution with DLSS.


If that were the case, 4k would be a cakewalk. Why would you think I was claiming a 4090 could do 8k native? That's just crazy. If it could, this thread wouldn't even exist.
You are correct. They advertised 8K as a feature it can do, when the reality is it runs 4k at 30 fps bare minimum, struggling to keep up at 4k 30 fps on Unreal Engine 5; yet they advertised "8k". We aren't even close to 4k 60 fps on that card, and that's the truth, while people here pretend "well, actually, it runs this game, which is on a 13-year-old engine, smoothly at 4k 60 fps", forgetting the engine is old and we aren't talking about old titles. Even a 1070 Ti can run a 20-year-old game at 4k 30 fps; Linus even made a video on a 1070 running all the old games at 4k 60 fps... keyword OLD TITLES. The moment a new game with a new engine comes out, within a few months after you built your high-end system, it can no longer support 4k 120 fps at high settings.
Last edited by 8=========D~; Jan 25, 2023 @ 1:01pm
8=========D~ Jan 25, 2023 @ 12:58pm
Originally posted by Dr.Shadowds 🐉:
RTX 4090 release in Oct 2020

Unreal engine 5.1 release in Nov 2022 a month later after RTX 4090 released.

Fortnite Engine update to 5.1 was less than a month ago...

Again this was not marketed towards UE 5.1, this wasn't even marketed to promote that you will get 120FPS in all games at 4k from it release and in the future like what you smoking?
That's the whole point of a NEW gpu, isn't it? And don't you think game engine developers have back-end connections with hardware manufacturers, to the point where they would synchronize their system requirements knowing what the hardware can or cannot do? The engine's public release was in November 2022, but that doesn't mean Nvidia or other GPU manufacturers didn't know about the engine's launch beforehand, internally...
emoticorpse Jan 25, 2023 @ 1:00pm
Originally posted by Dr.Shadowds 🐉:
RTX 4090 release in Oct 2020

Originally posted by Dr.Shadowds 🐉:
Unreal engine 5.1 release in Nov 2022 a month later after RTX 4090 released
8=========D~ Jan 25, 2023 @ 1:01pm
Originally posted by Dr.Shadowds 🐉:
RTX 4090 release in Oct 2020

Unreal engine 5.1 release in Nov 2022 a month later after RTX 4090 released.

Fortnite Engine update to 5.1 was less than a month ago...

Again this was not marketed towards UE 5.1, this wasn't even marketed to promote that you will get 120FPS in all games at 4k from it release and in the future like what you smoking?
Just checked: the RTX 4090 DID NOT release in 2020. Lying is illegal in 189 countries across the planet
emoticorpse Jan 25, 2023 @ 1:03pm
Originally posted by 8==========D:
Originally posted by Dr.Shadowds 🐉:
RTX 4090 release in Oct 2020

Unreal engine 5.1 release in Nov 2022 a month later after RTX 4090 released.

Fortnite Engine update to 5.1 was less than a month ago...

Again this was not marketed towards UE 5.1, this wasn't even marketed to promote that you will get 120FPS in all games at 4k from it release and in the future like what you smoking?
That's the whole point of a NEW gpu, isn't it? And don't you think game engine developers have back-end connections with hardware manufacturers, to the point where they would synchronize their system requirements knowing what the hardware can or cannot do? The engine's public release was in November 2022, but that doesn't mean Nvidia or other GPU manufacturers didn't know about the engine's launch beforehand, internally...

Well, the wording you used, "new gpu", should be changed to "more modern gpu", if I get what you're saying. Someone could get a new gpu that isn't more modern, technically.

I was actually thinking the exact same thing you put down, though. Nvidia would know everything about Unreal Engine 5.1. Release dates only matter for the public, I would think.
8=========D~ Jan 25, 2023 @ 1:07pm
Originally posted by emoticorpse:
Originally posted by 8==========D:
That's the whole point of a NEW gpu, isn't it? And don't you think game engine developers have back-end connections with hardware manufacturers, to the point where they would synchronize their system requirements knowing what the hardware can or cannot do? The engine's public release was in November 2022, but that doesn't mean Nvidia or other GPU manufacturers didn't know about the engine's launch beforehand, internally...

Well, the wording you used, "new gpu", should be changed to "more modern gpu", if I get what you're saying. Someone could get a new gpu that isn't more modern, technically.

I was actually thinking the exact same thing you put down, though. Nvidia would know everything about Unreal Engine 5.1. Release dates only matter for the public, I would think.
The release date he typed, claiming the "RTX 4090 came out in 2020", is a lie, by the way. Also, I'm confident hardware makers and software/game developers work together in the back end before public releases of new game engines or titles. Sure, they might not be perfectly synced, but I would expect the highest-end retail consumer GPU to hold up with high performance for at least 2 years at best settings on the newest game engines, with high fps.
Dr.Shadowds 🐉 Jan 25, 2023 @ 1:08pm
Originally posted by 8==========D:
Originally posted by Dr.Shadowds 🐉:
RTX 4090 release in Oct 2022

Unreal engine 5.1 release in Nov 2022 a month later after RTX 4090 released.

Fortnite Engine update to 5.1 was less than a month ago...

Again this was not marketed towards UE 5.1, this wasn't even marketed to promote that you will get 120FPS in all games at 4k from it release and in the future like what you smoking?
That's the whole point of a NEW gpu, isn't it?
You fail to understand: if they were going to claim something like that, they would have marketed it and made the claim, which is the whole point you ignored. They never marketed or claimed it would do what you're ranting about.

Originally posted by 8==========D:
and dont you think game engine developers have back end connection with hardware manufacturers to a point where they would synchronize their system requirements knowing what it can or cannot do ? the game engine Public release was in November 2022 but it doesnt mean Nvidia or other GPU manufacturer didnt know about this game engine launch before hand internally...
Game engine developers are not obligated to bow to GPU manufacturers; they only do what benefits them, or what they want for themselves. GPU sales have no bearing on them, so why would what GPU makers do matter, beyond new tech features? That's what I'm saying. Some games are not optimized well and some games are optimized very well, and for all you know Fortnite might not be fully optimized for UE 5.1 yet. But again, you shouldn't expect 120 FPS in all future games and updates; that's not how the real world works. If devs had to limit themselves, then UE 5.1 basically wouldn't exist, or it would be delayed until it met the demands of the latest GPU; hence why they don't bow to GPU manufacturers.

Yeah, they may know about it, but that doesn't mean they get hands-on time with future game engines, let alone that they would do with those engines what you expect them to, as NOT ALL GAMES are made the same. There are more factors you keep ignoring.


Originally posted by emoticorpse:
Originally posted by Dr.Shadowds 🐉:
RTX 4090 release in Oct 2020
Originally posted by Dr.Shadowds 🐉:
Unreal engine 5.1 release in Nov 2022 a month later after RTX 4090 released
Typo on year.

Originally posted by 8==========D:
Just checked RTX 4090 DID NOT release in 2020. Lying is illegal in 189 countries across the planet
Yeah it's a typo, get over it.
Last edited by Dr.Shadowds 🐉; Jan 25, 2023 @ 1:10pm

Posted on: Jan 20, 2023 @ 3:33pm
Posts: 616