Nvidia RTX Pro 6000 Blackwell appears online with an eye-watering price tag of over $11,000

Nvidia RTX Pro 6000 Blackwell GPUs
(Image credit: Tom's Hardware)

Nvidia’s upcoming RTX Pro 6000 Blackwell workstation GPU has started appearing in more online listings, particularly in Japan and Europe. According to Twitter/X user @jisakuhibi, a Japanese retailer has listed the GPU for ¥1,630,600, which is approximately $11,326.

We also spotted the GPU at UK-based online retailer Scan, which is accepting pre-orders for a PNY-branded RTX Pro 6000 graphics card at £7,859.99 (around $10,433). Notably, the Scan listing currently displays an image of an RTX 5000 Founders Edition card, likely serving as a placeholder.

Last month, the RTX Pro 6000 Blackwell was spotted at a US-based IT retailer serving enterprise customers, just days after the official announcement at GTC. The retailer listed the GPU at $8,565, roughly a 26% increase over the previous-generation RTX 6000 Ada.
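That 26% figure lines up with the RTX 6000 Ada's widely reported $6,800 launch price (an assumption here, not stated above); a back-of-the-envelope check:

```python
# Sanity check of the quoted ~26% generational price increase.
# Assumes the RTX 6000 Ada launched at $6,800 (not stated in the article).
rtx_6000_ada = 6800
rtx_pro_6000 = 8565
increase = (rtx_pro_6000 - rtx_6000_ada) / rtx_6000_ada
print(f"{increase:.0%}")  # prints "26%"
```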

The RTX Pro 6000 GPU primarily targets professionals working in high-performance computing, AI development, data science, content creation, and engineering visualization. It's designed for workstation users who need extreme levels of GPU acceleration for tasks such as complex simulations, large-scale AI model training, real-time ray tracing, and advanced 3D rendering.

Based on the GB202 chip, the RTX Pro 6000 workstation-class GPU features 24,064 CUDA cores spread across 188 streaming multiprocessors with 128 CUDA cores each. With a boost clock speed of 2,617MHz, the GPU comes with a massive 96GB of GDDR7 memory. For comparison, the RTX 5090, which is currently the most powerful consumer-grade GPU from Nvidia, also uses the GB202 chip but with a reduced core count of 21,760 CUDA cores, a peak clock of 2,410MHz, and 32GB of GDDR7 memory.
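The quoted core count follows directly from the SM configuration; as a quick sanity check (the RTX 5090's 170-SM count is Nvidia's published spec, included for comparison):

```python
# RTX Pro 6000 Blackwell: full-fat GB202 with 188 SMs of 128 CUDA cores each.
cores_per_sm = 128
pro_6000_sms = 188
rtx_5090_sms = 170  # cut-down GB202 in the consumer flagship

print(pro_6000_sms * cores_per_sm)  # prints 24064
print(rtx_5090_sms * cores_per_sm)  # prints 21760
```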

Just last week, the RTX Pro 6000 Blackwell was spotted in a leaked benchmark listing where it scored 368,219 points in Geekbench's OpenCL test, trailing the RTX 5090's 376,858. The gap is narrow, but any deficit at all is surprising given the Pro GPU's superior hardware, including its massive 96GB of memory. However, as a pre-release product running early drivers, the card likely hasn't reached its full potential; power limits and restricted memory access from unfinished software would further explain the shortfall.
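Taken at face value, those leaked scores imply only a small margin; a quick check:

```python
# Margin implied by the leaked Geekbench OpenCL scores.
rtx_pro_6000 = 368_219
rtx_5090 = 376_858
deficit = (rtx_5090 - rtx_pro_6000) / rtx_5090
print(f"{deficit:.1%}")  # prints "2.3%"
```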

While raw specifications of the RTX Pro 6000 Blackwell are impressive, buying one, especially at these sky-high prices, seems impractical for typical enthusiasts. The GPU is clearly going to end up being a niche solution for enterprise and specialized professional workloads. Potential buyers will likely need to weigh whether the incremental performance gains and expansive memory can justify the steep premium over previous-gen models and consumer-grade alternatives like the RTX 5090.


Kunal Khullar
News Contributor

Kunal Khullar is a contributing writer at Tom’s Hardware. He is a long-time technology journalist and reviewer specializing in PC components and peripherals, and welcomes any and every question about building a PC.

  • RUSerious
    Hmm, do I really need two kidneys??
  • Eximo
    Walk into bank, tell them you want an auto loan, and when they ask for the model just say RTX stands for Road Track Extreme.
  • Mattzun
    This card shows the real reason that NVidia doesn't care about gamers at all and does everything it can to keep gaming cards from being used for AI professional tasks.

    If a RTX Pro 6000 sells for 8600 dollars and a 5090 sells for 3000, NVidia is probably making 5-10 times more money on slightly better silicon (the extra VRAM is only a couple hundred dollars)

    That is just the profit on a pro card - I can't even imagine the margin on silicon used for a datacenter product.

    The pricing is much better than I expected if you can really get it for the 8600 dollar US price mentioned in the article.
    Assuming a 20 percent tariff, that is only 5 percent more expensive than the previous pro model and the card is significantly better for AI work.
  • jp7189
    Mattzun said:
    This card shows the real reason that NVidia doesn't care about gamers at all and does everything it can to keep gaming cards from being used for AI professional tasks.

    If a RTX Pro 6000 sells for 8600 dollars and a 5090 sells for 3000, NVidia is probably making 5-10 times more money on slightly better silicon (the extra VRAM is only a couple hundred dollars)

    That is just the profit on a pro card - I can't even imagine the margin on silicon used for a datacenter product.

    The pricing is much better than I expected if you can really get it for the 8600 dollar US price mentioned in the article.
    Assuming a 20 percent tariff, that is only 5 percent more expensive than the previous pro model and the card is significantly better for AI work.
    I completely agree, and I also have my doubts the price will land at $8600. It may for the Max-Q variant.

    It looks like PNY will be the official board partner again this gen. I signed up on their site for availability notifications. We'll see where it lands soon.
  • valthuer
    Running Red Dead Redemption 2 at maximum Resolution Scale requires over 37GB of VRAM, so there's at least one good reason to buy this card.
  • blppt
    I think, if the finished drivers show any kind of a performance advantage over a 5090 in games, the rich kids will be pouncing all over these for their "uber gaming rig".

    In the past, IIRC, the pro-grade cards usually ended up doing worse in games, didn't they?
  • abufrejoval
    When I made my first money programming on the side while I was studying computer science, I had a hard choice to make:
    In 1985 I could spend the equivalent of $30,000 in today's money on
    a brand new compact car,
    a used Porsche or
    an IBM PC-AT clone with an 8 MHz 80286, 640KB of RAM, 20MB of hard disk, an EGA graphics card (640x350 pixels, 16 colors), a matching monitor, keyboard and mouse. Only one of them was going to help me make more money, so it wasn't much of a contest.

    You guys simply don't appreciate how cheap things have become.

    Yes, the top range is still growing upward, and things there cost more than a few pennies, but at the lower end you can get an insane amount of computing power for little more than a few lunches.

    That IBM-PC was more than 50% of my entire income for a year!

    Just because there are now 4000hp Porsches out there doesn't mean everyone should be able to afford one on their hobby budget.
  • coromonadalix
    This is what the actual cards should be, not the messed-up 8GB or 16GB, or even the 24GB ones.
  • dipluz
    abufrejoval said:
    When I made my first money programming on the side while I was studying computer science, I had a hard choice to make:
    In 1985 I could spend the equivalent of $30,000 in today's money on
    a brand new compact car,
    a used Porsche or
    an IBM PC-AT clone with an 8 MHz 80286, 640KB of RAM, 20MB of hard disk, an EGA graphics card (640x350 pixels, 16 colors), a matching monitor, keyboard and mouse. Only one of them was going to help me make more money, so it wasn't much of a contest.

    You guys simply don't appreciate how cheap things have become.

    Yes, the top range is still growing upward, and things there cost more than a few pennies, but at the lower end you can get an insane amount of computing power for little more than a few lunches.

    That IBM-PC was more than 50% of my entire income for a year!

    Just because there are now 4000hp Porsches out there doesn't mean everyone should be able to afford one on their hobby budget.
    I totally agree computer hardware is insanely cheap. My only question is: if I work with machine learning, why would I buy that card instead of just waiting until summer and spending $10K on the new Nvidia DGX Station, which comes with over 280GB of GPU memory and 400GB+ of system memory, instead of paying $11K for one GPU?
  • abufrejoval
    dipluz said:
    I totally agree computer hardware is insanely cheap. My only question is: if I work with machine learning, why would I buy that card instead of just waiting until summer and spending $10K on the new Nvidia DGX Station, which comes with over 280GB of GPU memory and 400GB+ of system memory, instead of paying $11K for one GPU?
    I agree that that is a tough one.

    I've never really played with the truly big online models, only done extensive tests with what I could run in the company lab (V100) and the home lab (up to RTX 4090).

    But while vendors keep claiming that quality is not only improving but reaching GAI levels, my experiments have been very disappointing throughout.

    I've gone and tried everything that I could somehow fit into my hardware, nearly every model (family) available from HuggingFace and then gone through the various sizes and weight precisions to see how they'd influence the speed and quality.

    And in some cases that meant having to wait a very long time to get answers even from 70B models, which clearly won't fit into an RTX 4090 even at the smallest quantizations, so a lot of layers wind up running on my 16-core and its 128GB of DRAM.

    My main takeaway: the garbage they produce is so bad, they are just not useful. And within 1-70B weights and FP16-INT4 quantization that changes remarkably little. Yes, they gain depth and seem to become much more knowledgeable, but even the reasoning models never know when they fall off their knowledge cliff and fall to hallucinations that defy very basic human reality.

    I've never been interested in GAI, I'd have been perfectly happy for these LLMs to have as good an understanding of the world as any servant would have, but they must be reliable with regards to any information for the domain/household they are working in. I'd have been happy with a peasant with manners who sticks to my orders: context and RAG data needs to be interpreted with precision and strict obedience.

    Alas, when these models are smart enough to know Marie Antoinette as the wife of Louis XVI, but claim that she died in obscurity ten years after being executed and didn't have a biological mother, you obviously can't trust them to even toggle a light switch, because they might as well just electrocute you, let alone take control of family logistics as a domestic with control over sharp blades or foodstuff that can be turned into poison.

    The IBM PC-AT represented a value return that was basically guaranteed for years. Today buying AI hardware is like crypto mining: hard to tell if you'll even break even.

    For the RTX 4090 it was still easy, it sees dual use in after-hours gaming (far too little, actually). The stuff you mention: no meaningful alternate use that I can see. So even if I could afford that, I wouldn't jump, especially since I no longer have a career riding on it.

    If you get around having your DGX station, I'd love to see your results!