Tech Gaming Hardware discussion (& Hardware Sales) thread

Crucial is getting rid of its Ballistix line of RAM.

Micron Kills Off Crucial's Iconic Ballistix Gaming RAM

Through a press release today, Micron announced that it has decided to readjust its business strategy for Crucial's consumer memory products. As a result, Micron has discontinued the Ballistix, Ballistix Max, and Ballistix Max RGB lineups.
It's unclear why Micron has abandoned its only enthusiast-class memory family. However, the company stated in the communique that it "will intensify its focus on the development of Micron's DDR5 client and server product roadmap, along with the expansion of the Crucial memory and storage product portfolio."
"We remain focused on growing our NVMe and Portable SSD product categories, which both offer storage solutions for PC and console gamers. Additionally, Crucial JEDEC standard DDR5 memory provides mainstream gamers with DDR5-enabled computers with better high-speed performance, data transfers, and bandwidth than previously available with Crucial Ballistix memory," said Teresa Kelley, Vice President and General Manager of the Micron Commercial Products Group.
 
Well, AMD's latest laptop CPUs are out, Zen 3+, and they're firmly behind Intel and Apple. AMD hasn't announced any plans for a Zen 3+ desktop series, and now it would appear even a stopgap before the debut of Zen 4 would be somewhat pointless. If Zen 3+ isn't beating Alder Lake in mobile, it won't in the desktop space, either.

AMD Ryzen 6000 Mobile CPU Benchmarked: The Bronze Medal Ain't Bad
The Ryzen 9 6900HS clocks in behind Intel's "Alder Lake" Core i9 and Apple's M1...but not by much.

Meanwhile, Intel is preparing its own launch of new hardware in the laptop space.
Intel's First-Gen Discrete Laptop GPUs Are Just Weeks Away
The chipmaker also teased a third-gen GPU, which it claims will be for the "ultra-enthusiast segment."
Intel just announced that its first discrete graphics cards in over a decade will start shipping soon. We didn’t get a specific release date, but the long-awaited Intel Arc graphics for laptops will be ready in Q1, so no later than April. That’s right around the time 12th-gen Intel H-series notebooks are slated to be released, and we expect some of these upcoming systems will turn to Intel instead of AMD or Nvidia.

Intel has already committed to releasing future iterations of its Arc processors. The first-gen version arriving in the coming months is called Alchemist, while a second Battlemage edition is scheduled for 2023-2024. Alchemist is said to be more of a mainstream product while Battlemage will take on Nvidia RTX and AMD Radeon as a high-end graphics option.

Intel also announced a third-gen product called Celestial, a sign the company is confident in its journey to take on current market leaders. Celestial GPUs will supposedly be made for the “ultra-enthusiast segment.” That implies a chip designed for e-sports gamers that can run certain competitive titles at high refresh rates—one that could be a direct competitor to Nvidia’s RTX 3090 or AMD’s RX 6900 XT. A fourth-gen chip called Druid is rumored to arrive in 2025.
Intel has kept a tight lid on the specs of the Alchemist card, but rumor has it that it will support ray tracing, carry 6GB of VRAM, and compete with the GTX 1650 Super.

It also sounds like Intel intends to make a splash in the cloud gaming race with its "Project Endgame". Its closest competitor there would seem to be NVIDIA's GeForce Now rather than Microsoft's Cloud Gaming (with Game Pass Ultimate).
 
Intel's First-Gen Discrete Laptop GPUs Are Just Weeks Away
Sooner than expected, Intel has already shed some daylight on these.

Intel Demonstrates Arc Alchemist Desktop Graphics Card
Running Tomb Raider with XeSS.
Intel demonstrated a desktop Arc Alchemist graphics card up and running at its Investor Meeting 2022 event on Thursday, as shown in a picture posted by Raja Koduri, Intel's graphics chief. The demonstration further supports that Intel's DG2 family of desktop graphics cards is getting ready for a Q2 2022 launch and that the line-up will include boards that fit into Intel's enthusiast-grade NUC 11 Extreme 'Beast Canyon.'
The fact it demo'd a 2018 title would appear to confirm that it's probably a performance peer to the GTX 1650 Super, not the RTX 3070 as originally thought. It also doesn't really tell us anything about how sleek its form factor is, or whether it might power devices as small as ultrabooks, since the NUC they used to debut it supports most of the largest desktop cards.

However, that still isn't clear: Ars Technica recently reported allegedly leaked specs, and if the leak is accurate, the top SKU ought to contend with the RTX 3070 after all, or even the 3070 Ti:
Latest Intel Arc GPU leaks: 3070 Ti-ish speeds, 5 different options for laptops

But at least it will be good for consumers to have a bit more supply in the GPU space to relieve the inflation of prices.
 
Intel Arc DG2 Lineup Specs Revealed

The A380 regularly comes in third place in SiSoftware's tests, beaten by cards such as the GTX 1660 Ti and RX 6500 XT, putting it in roughly the same place as an earlier leak that placed the A380 neck-and-neck with the GTX 1650 Super....

The full specs of the A380 card tested are 128 compute units capable of processing 1,024 threads and running at 2.45 GHz, 16 tensor cores, 6GB of GDDR6 with a bandwidth of 192 GB/s, 32 render output units, 64 texture mappers, and a power draw of just 75W (i.e. the same as the GTX 1050). The memory bandwidth, in particular, is poor compared to Nvidia cards - the 1660 Ti and 3050 manage 288 and 224 GB/s respectively - while the power draw is low compared to the two Nvidia cards' pull of 120 and 130W.

Transfer of data across the PCIe bus also appears slow, with the A380 managing a download figure of 3.06GB/s and an upload of 2.88GB/s, while the Nvidia cards’ figures are much closer to 12GB/s in both directions.

This is a low-end card, however, and the modest power draw may attract builders of small form-factor, low-powered systems who aren’t looking for remarkable hashing abilities as long as the price is competitive.
They will be built on TSMC's 6nm process technology.
[Chart: Intel Arc A380 overall performance]

[Chart: Intel Arc A380 performance vs. power]

As the article states, the A380 should be on par with the GTX 1650 Super. That confirms the original rumors from before the confusion created by the revelation that there would be multiple product lines, and it's the most solidly established estimate yet, since SiSoftware tested this specific GPU. Although, as the article notes, in gaming it may come in very close to the 6500 XT (or even ahead of it) under benchmark parameters that don't penalize it for its lack of double-precision floating-point performance.

However, the most exciting spec? The power draw is just 75W. This could become a highly prized card for those looking to cheaply upgrade office builds or other machines whose PSU lacks the connector required to feed more powerful GPUs. And as a peer to the cards above, it would flatten the best card currently on the market with a true ≤75W draw (i.e. the GTX 1050, not the GTX 1650). It also bested both NVIDIA's and AMD's latest and greatest architectures in performance per watt. That's nothing to sneeze at.

The A500 series is expected to perform on par with the RTX 3060 and RX 6700 XT.

The A700 series is expected to perform on par with the RTX 3070 Ti and RX 6800.
 
Not sure if anyone has talked about this yet but I just modded my 3DS:



Normally, I like to keep my systems stock but with the eShop closing, just screw it. It's not a quick or easy process (it took hours) but it's done, and I now have all of my 3DS and DS games backed up digitally.

I'm a big physical media guy when it comes to movies because discs are still the best quality and have the most features. To me, it's just quicker and easier to buy a movie and have it on my shelf forever in the best possible quality. With games, digital and physical are identical as far as content goes. If I can dump the game (or download it) and it's exactly the same content, I'd rather go digital. Especially if I have access to the file and can back it up, transfer it, etc. That's exactly what I'm doing here. Now that my games are dumped, I'm just going to sell off my physical copies. It's just a case on a shelf at this point.

If you have a 3DS, it's worth taking a look at.
 
Not sure if anyone has talked about this yet but I just modded my 3DS:

Mod it to emulate SNES games.

<{jackyeah}>
 
Well, AMD's latest laptop CPUs are out, Zen 3+, and they're firmly behind Intel and Apple.

But the new iGPUs on the AMD chips aren't bad, and they're more power efficient, which has advantages for ultrabooks.
 
Well, AMD's latest laptop CPUs are out, Zen 3+, and they're firmly behind Intel and Apple.


Hello there. Do you think that AMD as a whole is overrated and that Intel has a better long-term future?
 
Hello there. Do you think that AMD as a whole is overrated and that Intel has a better long-term future?
I tend to reserve speculation on market matters for this thread:
https://forums.sherdog.com/threads/intels-f.4083915/

As briefly as possible: yes, I think AMD's stock is massively overinflated now, even though I was the first person on this board to predict their meteoric rise years and years ago. I think Intel is very well positioned to have a phenomenal future; I am confident in saying that. I'm not confident predicting who will be in a better place five years from now, but I'm inclined to say Intel because their pockets are so deep and they have so much more invested in first-party manufacturing capacity. I do think the future of computing will focus more on energy efficiency and integrated chipsets, like the M1 processors, and AMD's edge in GPU sophistication, along with its experience supplying such chipsets for the gaming consoles and the Steam Deck, gives it an advantage there.

But semiconductors are about much more than just the gaming processors I tend to know intimately. Gaming is a relatively insignificant slice of the overall picture.
 
I thought a CompUSA had reopened close by. But then I looked it up and it closed in 2008.

<DCrying>
There was a CompUSA near where I lived in El Paso that turned into a TigerDirect. I'm assuming Tiger bought them out.
 
I swear I’ve seen this headline every month since this shit first got out of hand. I just looked and 3080s are still double the price on Amazon. So yeah.

Yeah, that's what I'm thinking, until proven otherwise.

It absolutely disgusts me how scalpers are now a long-term issue, for graphics cards and consoles.
 
Why are Apple's CPUs so far ahead of the curve here, and what's stopping companies like Intel and AMD from matching them?
Because they're on the cutting edge in every facet.

First, they're on the 5nm process. Second, they use the hybrid performance+efficiency cores design strategy that Intel finally implemented in the desktop space. Those two combined account for a large portion of their efficiency advantage.

Third, per my comment about its ridiculous system memory speeds, the M1 was the first processor on the market to use LPDDR5, and it remains one of the only chips capable of that. In fact, the latest Alder Lake mobile chips that just dropped don't quote native support above LPDDR5-5200, while the M1 supported 6400 out of the gate. In addition, the M1 Max is simply monstrous, incorporating four dedicated 128-bit LPDDR5 banks. Its real-world throughput isn't quite as high as it would appear on paper, but it's still absurd at 224.0 GB/s actual. That's just nuts. That's the same bandwidth as the VRAM in the GTX 1060!

Yet this is memory bandwidth fed to the CPU, not just the GPU. For comparison, the specified throughput of the Steam Deck is 88.0 GB/s, and it's only that high because Valve doubled the channels to four specifically to give games more throughput. It's over double the ceiling throughput fed to the iGPU on the latest AMD Zen 3+ mobile processors, and nearly triple what's fed to the older Iris Xe iGPU in the Alder Lake mobile processors, and for either, that's assuming they're put in a laptop with a quad-channel design. For context, say you have DDR4-3600 RAM running in dual channel in your desktop, which is what most builders target today for bang-for-your-buck. Your CPU is fed memory data at a ceiling of 57.6 GB/s. Sure, your discrete GPU's VRAM is way, way faster, but your CPU is sitting there waiting to catch up. That's a big reason that laptop is schooling even the most powerful desktops in real-world video editing tasks.

Fourth, these chips aren't designed for video games. Like server CPUs, part of their superior processing power and efficiency derives from a more parallel, multi-cluster design. That's why the GPU clocks are so far below what you see in desktop GPUs, for example. This is another reason they're so efficient, but it's also because Apple wasn't concerned with catering to gamers, where games sometimes depend on those higher frequencies.

After that, Apple just does a fantastic job of organizing the layout of the chip. It's not a Frankenstein of mixed-and-matched specs designed to be interchangeable with other processors. This is consistent with Apple vs. PC writ large: Windows laptops assemble a bunch of different potatohead designs from the same parts on the market, while Apple designs each laptop individually as a whole unit. Each M1 processor is built from the ground up as a singular unit intended to operate only as itself. Meanwhile, recall how I told you these Alder Lake mobile processors are being integrated with the same Iris iGPU that goes into the Rocket Lake chips, and that iGPU is married to half a dozen processors from each generation. The layout maximizes the efficiency of each portion of the processor working with the rest, and it also maximizes usage of the die space. It's a thing of beauty.
I do think the future of computing will focus more on energy efficiency and integrated chipsets, like the M1 processors...
Apple obviously also thinks this is the future of processors because it's where they're taking us.

Apple Announces M1 Ultra: Combining Two M1 Maxes For Workstation Performance
20-Core Apple M1 with 64-Core GPU Rivals RTX 3090
AnandTech and Tom's Hardware said:
Apple has thrown the industry a fresh curveball by not just combining two M1 Max dies into a single chip package, but by making the two dies present themselves as a single, monolithic GPU, marking yet another first for the chipmaking industry.

Back when Apple announced the M1 Pro and the ridiculously powerful M1 Max last fall, we figured Apple was done with M1 chips. After all, how would you even top a single 432mm2 chip that’s already pushing the limits of manufacturability on TSMC’s N5 process? Well, as the answer turns out to be, Apple can do one better. Or perhaps it would be more accurate to say twice as better.

According to Apple, the M1 Ultra is formed by physically connecting two M1 Max chips together using a heretofore "hidden feature" on the M1 Max -- a silicon interposer capable of achieving up to 2.5TB/s interprocessor bandwidth.

The net result is a chip that, without a doubt, manages to be one of the most interesting designs I've ever seen for a consumer SoC. As we'll touch upon in our analysis, the M1 Ultra is not quite like any other consumer chip currently on the market. And while the double-die strategy benefits sprawling multi-threaded CPU and GPU workloads far more than it does single-threaded tasks - an area where Apple is already starting to fall behind - in the process they're breaking new ground on the GPU front. By enabling the M1 Ultra's two dies to transparently present themselves as a single GPU, Apple has kicked off a new technology race for placing multi-die GPUs in high-end consumer and workstation hardware...
That interprocessor bandwidth is nearly 10x what AMD's EPYC is capable of sharing between server CPUs clustered together.

The quote below exaggerates the multicore advantage over the 12900K, as you'll see from the Geekbench scores further down, but the power consumption claims are not exaggerated. The M1 Ultra effectively trails only Alder Lake's best in single-core gaming potential, it exceeds all processors in existence in multicore gaming and video/photo editing potential, and it rivals the RTX 3090 or RX 6900 XT in terms of raw GPU rasterization. It does this while consuming just 110W peak in real-world testing, compared to a total peak draw of ~700W for the 12900K + RTX 3090.
Regarding relative performance, Apple says that the M1 Ultra can operate at the same performance level as the Alder Lake-based Core i5-12600K (paired with DDR5 memory) while using 65 percent less power. When running full-bore, the M1 Ultra allegedly delivers 90 percent higher multi-core performance compared to the flagship Core i9-12900K while consuming just a third of the power.



M1 Ultra With 20-Core CPU, 128GB Unified RAM, Beats Desktop 64-Core Ryzen Threadripper 3990X CPU in Single-Core Results, Nearly Matches in Multi-Thread
Geekbench 5 CPU Scores (Single-core / Multi-core)
  • TR-3990X (64-core) = 1,213 / 25,133
  • Apple M1 Ultra (20-core) = 1,793 / 24,055
  • i9-12900K (16-core) = 1,997 / 17,201
  • R9-5950X (16-core) = 1,686 / 16,564
 
@Madmick I have an audio question.
I have a Samsung UN55F6300AF and an SMSL SA50 amp.
When I connect the amp to the TV, I have to set the sound to TV out, which makes the volume control on the remote no longer work. So I have to get up and walk to the amp to manually adjust the volume.
I've tried a 3.5mm-to-RCA adapter as well as an optical-to-RCA DAC. Neither option allows the remote's volume control to work. I get a "your tv is set to use external speakers. Adjust the volume on the connected speakers directly" message.
I get the same message with an Xbox One and Fire TV Stick.
Do you have any idea how I can control the volume with my remote?
 
I have an i7-7700K, 16 GB of RAM, and a 1080 Ti.
Still, Windows Update says I'm not ready for Windows 11.
Do you guys know why this is?
 