GPU Recommendation



#1 rboone2020 (Members, 69 posts)

Posted 02 January 2018 - 07:29 PM

Folks,

 

I'm going to build a computer system, but I have questions regarding a GPU.  This is a production system used in scientific work: editing, spatial analyses, and the like.  Gaming is not a focus.  It will have a Threadripper 1950X processor, 32 GB or 64 GB of RAM, probably an ASUS motherboard, AIO cooling on the CPU, and lots of airflow.

 

I will be using a 4K monitor (I own the LG 43UD79-B, which can display four screens at once).  So frame rates are not critical to me; 60 Hz would be fine, and a bit of stuttering would probably go unnoticed.  But I do need the 4K resolution.

 

Do you have suggestions for a graphics card that won't break the bank but will serve me well?

 

Thank you,

R Boone


Edited by hamluis, 03 January 2018 - 05:54 PM.
Moved from Internal Hardware to Building/Upgrading - Hamluis.



#2 Platypus (Global Moderator, 15,194 posts, Australia)

Posted 02 January 2018 - 07:49 PM

Do you know if the software you will be using has specific requirements or recommendations for a physics engine? I'd think that would be an important consideration in establishing your range of choices.



#3 rboone2020 (Topic Starter, 69 posts)

Posted 02 January 2018 - 11:46 PM

Good question, but no.  I use GIS and simulation software that isn't graphically intensive.  No special requirements that I am aware of.

 

Thanks for the response,

Randy



#4 Platypus (Global Moderator, 15,194 posts, Australia)

Posted 03 January 2018 - 01:11 AM

It's not about the graphical requirements: modern scientific and analytical software may be able to use the GPU to perform calculations, and a suitable GPU can be considerably more powerful than the CPU for that kind of work. If the software you use can do this, you might be able to save a good amount on the CPU cost by choosing the appropriate GPU/CPU combination. Conversely, if your software could, unknown to you, use NVIDIA CUDA and you bought an AMD Radeon card, you'd be missing out on that performance enhancement.
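As an illustration, here's a quick way to check whether a given environment can see a CUDA device. This is a minimal sketch assuming a Python setup with PyTorch installed (other stacks such as Numba or CuPy have similar checks); your GIS/simulation tools may expose this differently, if at all.

    # Minimal check: can this environment offload work to a CUDA GPU?
    # Assumes PyTorch is installed; prints the device name if one is found.
    import torch

    if torch.cuda.is_available():
        print("CUDA device found:", torch.cuda.get_device_name(0))
    else:
        print("No CUDA device - computation will stay on the CPU.")

If this reports no device (as it would on an AMD card), any CUDA-only acceleration path in your software simply won't be used.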

#5 rboone2020 (Topic Starter, 69 posts)

Posted 03 January 2018 - 08:38 AM

I've considered experimenting with massively parallel simulation on a GPU, but I haven't done so yet, and the software I use doesn't take advantage of the GPU either.

 

Randy



#6 RecursiveNerd (Malware Study Hall Junior, 237 posts, Louisville, KY)

Posted 03 January 2018 - 09:35 AM

While I can't give a specific recommendation without knowing what kind of technology you're working with (libraries, applications, etc.), back when I was in school we used several NVIDIA Titan Xs in parallel for a lot of simulation, as well as machine learning and neural networks. CUDA appears to be the most widely supported technology for these types of things, via frameworks such as TensorFlow, PyTorch, and Caffe.

Since the focus is going to be scientific work, I'm assuming you'll be running the same simulations or calculations over and over. Therefore, I would recommend whichever NVIDIA GPU you can afford at the time. Since your budget seems pretty high (Threadripper, 64 GB RAM), I would suggest looking at the GTX 1070 Ti or 1080 Ti.
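To make the "same calculations over and over" point concrete, below is a minimal sketch (assuming PyTorch built with CUDA support) of the kind of dense, repeated math where an NVIDIA card pulls ahead. The only change needed to move the work to the GPU is the device the tensors live on.

    # Same matrix multiply on CPU or GPU; only the device differs.
    # Assumes PyTorch; falls back to the CPU if no CUDA card is present.
    import torch

    device = "cuda" if torch.cuda.is_available() else "cpu"

    a = torch.rand(4096, 4096, device=device)
    b = torch.rand(4096, 4096, device=device)

    c = a @ b  # runs on whichever device the tensors were created on
    print("Computed", tuple(c.shape), "matmul on", device)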



#7 jonuk76 (Members, 2,180 posts, Wales, UK)

Posted 03 January 2018 - 11:22 AM


Quoting rboone2020:

    I've considered experimenting with massively parallel simulation on a GPU, but I haven't done so yet, and the software I use doesn't take advantage of the GPU either.

    Randy

If the software doesn't use the GPU for computation, and there is no intention to do so with this hardware in future, then it really just comes down to finding a GPU that will drive 4K monitors.

 

A low-to-mid-range consumer card equipped with HDMI 2.0 and/or DisplayPort (for example, a GeForce GTX 1050 or Radeon RX 550) would do that comfortably.  For a bit more, entry-level workstation cards like the Radeon Pro WX 4100 or Quadro P600 can run three or four 4K displays at once, and aren't hugely expensive in the context of a system like this.
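For a rough sense of why HDMI 2.0 or DisplayPort matters for 4K at 60 Hz, the back-of-the-envelope arithmetic below (a minimal sketch assuming standard 8-bit RGB; real links add blanking and encoding overhead, so actual requirements are somewhat higher) shows the raw pixel data alone exceeds what HDMI 1.4 can carry.

    # Uncompressed pixel data rate for 3840x2160 at 60 Hz, 8-bit RGB.
    width, height = 3840, 2160
    refresh_hz = 60
    bits_per_pixel = 24  # 8 bits per channel x 3 channels

    gbit_per_s = width * height * refresh_hz * bits_per_pixel / 1e9
    print(f"~{gbit_per_s:.1f} Gbit/s")  # ~11.9 Gbit/s; HDMI 1.4 tops out around 8-10 Gbit/s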


Edited by jonuk76, 03 January 2018 - 11:25 AM.



#8 rboone2020 (Topic Starter, 69 posts)

Posted 04 January 2018 - 02:30 PM

Great, thanks, folks.  I will check out each of these.  I appreciate your time.

 

Randy





