Tech Gaming Hardware discussion (& Hardware Sales) thread

just bought a 3070. finally.

thinking about getting a liquid CPU cooler, but I don't think I have room in my case
 
Launch MSRP for the 3090 Ti was $1,999, and that MSI card first appeared on Amazon on May 23 at $1,949. At $1,700, you got it for $249 off the real market launch price.
Do you remember when you could not get anything near MSRP? Most of them were $2,500 and up, even from established dealers.

It says it in the Amazon ad.

"

List Price: $2,159.99 Details
Price: $1,469.99$1,469.99 & FREE Returns
 
Yeah, I got an LG C1 and about a month or two later the prices on those TVs dropped by like $400. Was kind of annoyed.
Is it a 60 Hz TV? I bought a 120 Hz TV for $950 and everyone was like, "I bought a 75-inch TV for $495." What they don't understand, no matter how many times I tell them, is that it's a 60 Hz TV. I am sure next month my new set will be selling for 500 bucks as the Neos and the newest micro QLED stuff come out; the technology is changing so fast.
 
Is it a 60 Hz TV? I bought a 120 Hz TV for $950 and everyone was like, "I bought a 75-inch TV for $495." What they don't understand, no matter how many times I tell them, is that it's a 60 Hz TV. I am sure next month my new set will be selling for 500 bucks as the Neos and the newest micro QLED stuff come out; the technology is changing so fast.

Yeah, it's 120 Hz. Supposed to be like the best or one of the best for gaming.
 
any truth to this? thoughts?



A little hyperbolic, but computing companies love the idea, especially Apple: remember the Apple IIc, the original Mac, the iMac, and on and on. Apple has never been big on expansion, and the M1, now M2, is trying to upend the Apple cart again. Dell tried it and largely failed because everyone was running the same computing hardware. That is why Apple dropped AMD and Intel: it was hard to make a case for a locked box unless everything was done in-house, outperforming pretty much everything in most common office and science applications (not so much gaming). AMD, NVIDIA, and now Intel stand to make more money on upgradable hardware, so I doubt this will become a thing.

If people cannot upgrade, they will hold on to their hardware longer, drying up the well for chip suppliers, and Apple's M2 really was not a significant enough advancement to make M1 owners run out and upgrade. The M1 Ultra got the jump because, I think, it was the first chip built on a 5 nm process, letting Apple and TSMC squeeze an insane number of transistors into the die (114 billion with the Ultra). AMD, Intel, and Nvidia will close that gap with their own 5 nm chips and advanced chiplet designs, as well as new higher-performance buses that scale depending on workload. I have seen rumored benchmarks, at least from AMD, showing their newest chips significantly outperforming the M2 in single-core (the key metric) and multi-core application tests. Apple will continue to throw boatloads of cash at its own chip development, meaning TSMC will prioritize whoever writes the biggest check (i.e., Apple).
 
This is an interesting take on getting high-end GPUs sold, given the current state of the market.

 
This is interesting news on the Intel GPUs.

"According to a leaked document making the rounds online, the flagship GPU of Intel Arc A-Series GPUs(opens in new tab), the Arc A770, won't cost more than $399. This confirms our suspicions that Intel's upcoming slate of GPUs will be pretty competitively priced when they release later this year. "

They say the top-of-the-line A770 should fall between a 3070 and a 3080.
https://www.pcgamer.com/intel-arc-a...il-for-less-than-dollar399-according-to-leak/

"
That would be nice; the cheapest used 3070 on eBay is $450 with free shipping (FINALLY below the $500 MSRP).
Hopefully it's better than the last time Intel made video cards, with the dog shit i740 in the late '90s.
 
That would be nice; the cheapest used 3070 on eBay is $450 with free shipping (FINALLY below the $500 MSRP).
Hopefully it's better than the last time Intel made video cards, with the dog shit i740 in the late '90s.
Well, here are the first benchmarks. Looks like its performance puts it around the 3070's level at a slightly lower entry price. I don't think it will cause people to jump for it, though it does have more RAM than the 3070 (12 GB vs. 8 GB). They are picking a hard time to launch this card, with prices on higher-end cards dropping like stones.

"
The Intel Arc A750 test rig consisted of an Intel Core i9-12900K, Asus ROG Maximus Z690 Hero motherboard, 32 GB of 4,800 MT/s memory, and a Corsair MP600 Pro NVMe SSD along with Windows 11. A total of five games were tested, including F1 2021, Cyberpunk 2077, Control, Borderlands 3 and Fortnite, where the Arc A750 managed to outperform the Nvidia GeForce RTX 3060. All titles were tested at 2,560 x 1,440 at the 'High' preset. The difference in performance was the most apparent in F1 2021, where the Intel GPU managed to output 192 FPS compared to the RTX 3060's 164 FPS.

Little is known about the Intel Arc A750's specifications, but previous leaks suggested that it would pack 12 GB of 16 Gbps VRAM on a 192-bit bus and consume 225 Watts of power. These specs should, on paper, make it a contender for the GeForce RTX 3060, provided that Intel can sort out the numerous driver issues. There is no word on when the Arc A750 will hit shelves, but it shouldn't be long, given that Intel is confident enough to showcase its performance numbers."
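On the "on paper" point, the leaked 16 Gbps / 192-bit figures put the A750 in roughly the same memory-bandwidth ballpark as the RTX 3060 (15 Gbps on a 192-bit bus). Quick back-of-envelope arithmetic, nothing official:

// GB/s = per-pin rate (Gbps) * bus width (bits) / 8 bits per byte
#include <stdio.h>

int main(void) {
    double arc_a750 = 16.0 * 192 / 8.0;   // leaked A750 specs  -> 384 GB/s
    double rtx_3060 = 15.0 * 192 / 8.0;   // RTX 3060 reference -> 360 GB/s
    printf("A750 ~%.0f GB/s vs RTX 3060 ~%.0f GB/s\n", arc_a750, rtx_3060);
    return 0;
}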
 
More crazy rumors about the RTX 4090: it will supposedly be at least 50 to 90 percent faster in benchmarks than the RTX 3090 Ti, but will pull substantially more power, like 50 percent more. You'll need at least an 850 watt to 1000 watt power supply (I picked up a 1200 watt one); who knows what monster of a PSU you will really need to run the RTX 4090? This is getting nuts with the power requirements; I really think Nvidia and AMD have to address this issue.
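As a rough sanity check on those numbers, here's a back-of-envelope sketch, not anything official: the 3090 Ti's ~450 W board power plus the rumored ~50% bump; the CPU/platform draw and the transient-spike headroom are just my assumptions.

// Rough PSU estimate for a rumored "3090 Ti + ~50%" card; the platform
// draw and spike headroom below are assumptions, not measurements.
#include <stdio.h>

int main(void) {
    double gpu_3090ti   = 450.0;                     // 3090 Ti board power (W)
    double gpu_next     = gpu_3090ti * 1.5;          // rumored ~50% more -> ~675 W
    double cpu_platform = 250.0;                     // assumed CPU, board, drives, fans
    double sustained    = gpu_next + cpu_platform;   // ~925 W sustained
    double with_spikes  = sustained * 1.3;           // ~30% headroom for transients

    printf("Sustained ~%.0f W, with spike headroom ~%.0f W\n", sustained, with_spikes);
    return 0;
}

That lands right around the 850 to 1000 watt sustained range, and the spike headroom is why a 1200 watt unit doesn't sound crazy.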
 
This is interesting news on the Intel GPUs.

"According to a leaked document making the rounds online, the flagship GPU of Intel Arc A-Series GPUs(opens in new tab), the Arc A770, won't cost more than $399. This confirms our suspicions that Intel's upcoming slate of GPUs will be pretty competitively priced when they release later this year. "

They say the top-of-the-line A770 should fall between a 3070 and a 3080.
https://www.pcgamer.com/intel-arc-a...il-for-less-than-dollar399-according-to-leak/
That would be nice; the cheapest used 3070 on eBay is $450 with free shipping (FINALLY below the $500 MSRP).
Hopefully it's better than the last time Intel made video cards, with the dog shit i740 in the late '90s.

"
There's little point in wasting time on Intel GPU rumor leaks. Flagship is the A770, now? It's been leaked to be the A780 previously. Who cares until Intel actually shows they can matter? All these articles are pointless sponsored content Intel is indirectly funding to try to build hype for what will almost certainly be a de facto paper launch just like the Arctic Sound GPUs from last year. This crap with "Arc is coming!" has been going on for nearly six years, now. I'm getting a little exhausted with the vaporware bloviations. Put up or shut up, Intel. NVIDIA and AMD actually produce.
 
It seems like Arc might be a really good option for Linux users. From what I understand, Intel's graphics drivers on Linux have been open source for 6+ years now, and Arc is optimized for Vulkan and DX12, which is good because Proton translates DX calls to Vulkan. Not that that will really matter much for total market share, but to the people it matters to, it matters a lot.
 
It seems like Arc might be a really good option for Linux users. From what I understand, Intel's graphics drivers on Linux have been open source for 6+ years now, and Arc is optimized for Vulkan and DX12, which is good because Proton translates DX calls to Vulkan. Not that that will really matter much for total market share, but to the people it matters to, it matters a lot.
I heard that, because the drivers are open source, developers found issues, like major issues, with Intel's code. They corrected some of them and squeezed out a nice performance boost. It could end up helping Intel with their performance issues. Who knows, but it's an example of how the community can help.
 
I heard that, because the drivers are open source, developers found issues, like major issues, with Intel's code. They corrected some of them and squeezed out a nice performance boost. It could end up helping Intel with their performance issues. Who knows, but it's an example of how the community can help.
This may be hyperbolic, but it's part of the Linux connection.

"
Intel Linux GPU driver developers have released an update that results in a massive 100X boost in ray tracing performance. This is something to be celebrated, of course. However, on the flip side, the driver was 100X slower than it should have been because of a memory allocation oversight. The news comes amid reports that Intel's shipping drivers for its Arc GPUs are fraught with issues in Windows that are akin to "[...]living in the middle of a minefield - mind you, while playing drunk." The company has also admitted that Arc performance is sub-par with older APIs, like DX11, in Windows.

Linux-centric news site Phoronix reports that a fix merged into the open-source Intel Mesa Vulkan driver was implemented by Intel Linux graphics driver engineering stalwart Lionel Landwerlin on Thursday. The developer wryly commented that the merge request, which already landed in Mesa 22.2, would deliver “Like a 100x (not joking) improvement.” Intel has been working on Vulkan raytracing support since late 2020, but this fix is better late than never."

https://www.tomshardware.com/news/intel-gpu-100x-performance-ray-tracing
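To be clear, the sketch below isn't Intel's actual Mesa/ANV patch, just generic Vulkan in C showing the kind of allocation decision involved: if a scratch buffer the GPU hammers during ray traversal lands in host/system memory instead of VRAM because the memory type was picked without the device-local flag, every access crosses the PCIe bus and performance craters. The function name here is mine, not Mesa's.

/* Generic Vulkan sketch (not the actual Mesa/ANV fix): picking a memory
 * type for a GPU-only scratch buffer. Requesting DEVICE_LOCAL explicitly is
 * what keeps the allocation in VRAM; lose that and the driver may hand back
 * host-visible system memory, which is the class of mistake behind the
 * 100x ray-tracing fix described above. */
#include <stdint.h>
#include <vulkan/vulkan.h>

/* Return the first memory type index allowed by type_bits that has all the
 * requested property flags, or UINT32_MAX if none matches. */
static uint32_t find_memory_type(VkPhysicalDevice phys,
                                 uint32_t type_bits,
                                 VkMemoryPropertyFlags wanted)
{
    VkPhysicalDeviceMemoryProperties props;
    vkGetPhysicalDeviceMemoryProperties(phys, &props);

    for (uint32_t i = 0; i < props.memoryTypeCount; i++) {
        int allowed = (type_bits >> i) & 1u;
        int matches = (props.memoryTypes[i].propertyFlags & wanted) == wanted;
        if (allowed && matches)
            return i;
    }
    return UINT32_MAX;
}

/* Usage sketch: for ray-tracing scratch data only the GPU touches, ask for
 * VK_MEMORY_PROPERTY_DEVICE_LOCAL_BIT when calling find_memory_type, then
 * pass the returned index as memoryTypeIndex in VkMemoryAllocateInfo. */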
 
just bought a 3070. finally.

thinking about getting a liquid CPU cooler, but I don't think I have room in my case

I don't know what kind you are talking about, but they don't take up much space. I have a Corsair something-or-other and the radiator sits on top of the case. I don't know what else to say about liquid cooling. When I test it and overclock to 5.1 GHz, it only gets up to like 87C. The CPU starts glitching on some other issue at higher clocks.
 
I got a 2070 Super. I think I'll skip the 3000 series, even though it's tempting to get one right now that they're affordable again, for who knows how long.
 
I got a 2070 Super. I think I'll skip the 3000 series, even though it's tempting to get one right now that they're affordable again, for who knows how long.
The 2070 S is still legit; it's almost as good as a 3060 Ti.
 
Oh boy, rumors are saying Arc is on the chopping block, with significant hardware-level issues, and now the software is totally fucked.
 