!!! Tesla V100 !!!!

chriscambridge
Active UBT Contributor 1+ yr
Posts: 2178
Joined: Mon Aug 08, 2016 1:56 pm
Location: UK

!!! Tesla V100 !!!!

Post by chriscambridge »

Image
damienh
UBT Contributor
Posts: 1685
Joined: Mon Aug 26, 2019 2:16 pm

Re: !!! Tesla V100 !!!!

Post by damienh »

You have a Tesla V100?
chriscambridge
Active UBT Contributor 1+ yr
Posts: 2178
Joined: Mon Aug 08, 2016 1:56 pm
Location: UK

Re: !!! Tesla V100 !!!!

Post by chriscambridge »

https://milkyway.cs.rpi.edu/milkyway/ho ... id=1181878

It's a cloud account with the V100 that I'm testing out.

Unfortunately, for some reason Milkyway is refusing to give me any work units for it.
damienh
UBT Contributor
Posts: 1685
Joined: Mon Aug 26, 2019 2:16 pm

Re: !!! Tesla V100 !!!!

Post by damienh »

Ah, I see. Much nicer to use them temporarily. They are rather expensive!
damienh
UBT Contributor
Posts: 1685
Joined: Mon Aug 26, 2019 2:16 pm

Re: !!! Tesla V100 !!!!

Post by damienh »

Do you have OpenCL installed? I can't see it in your screenshot.
UBT - Timbo
UBT Forum Admin
Posts: 9673
Joined: Mon Mar 13, 2006 12:00 am
Location: NW Midlands
Contact:

Re: !!! Tesla V100 !!!!

Post by UBT - Timbo »

chriscambridge wrote: Mon Nov 30, 2020 12:34 pm https://milkyway.cs.rpi.edu/milkyway/ho ... id=1181878

It's a cloud account with the V100 that I'm testing out.

Unfortunately, for some reason Milkyway is refusing to give me any work units for it.
Hi Chris

Nice !!

I've been interested in these Tesla GPUs for a while and was almost tempted to get one of the early Fermi or Kepler models (as you can buy them really cheap, for less than £50...).

But then I looked on the Wiki and saw how much power they draw :-( ...and some of the PCIe versions don't even have built-in fan-assisted cooling (as they were designed to rely on the chassis cooling system).

And as time has gone on, newer versions have come out such as the Maxwell, Pascal and Volta models; the performance has improved, but the power consumption is still about the same (200-300W). The Turing versions, though, seem to need less than 100W, which makes them more viable (IMHO). And the latest Ampere versions seem to show great promise (but they should, at the price they are going for!!).

regards
Tim
chriscambridge
Active UBT Contributor 1+ yr
Posts: 2178
Joined: Mon Aug 08, 2016 1:56 pm
Location: UK

Re: !!! Tesla V100 !!!!

Post by chriscambridge »

Working backwards...

This is why I created the FP16/32/64/TDP table, as I wanted to see the differences between all the GPUs. If you check out the most recent version, I have added loads more GPUs, including older FP64-capable cards as well as the most powerful/expensive Teslas, and added TDP values.

http://www.ukboincteam.org.uk/newforum/ ... =18&t=6757

The reason I got this V100 is that I wanted to blast Milkyway, and I wanted to test whether it would be cheaper to use a cloud-based V100 rather than the 1080 Ti on this project. The FP64 figures are:

(1080 Ti) 0.354 TFLOPS vs (V100) 7.006 TFLOPS

That's obviously a huge difference (nearly 20x).
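A rough sketch of the sums in Python (only the two FP64 figures above are real; the 1080 Ti power draw, electricity price and the resulting break-even figure are illustrative assumptions):

```python
# Rough FP64 comparison and break-even rental price.
# Only the two TFLOPS figures come from the post above;
# the power draw and electricity price are assumptions.

fp64_1080ti = 0.354  # TFLOPS, GTX 1080 Ti (FP64)
fp64_v100 = 7.006    # TFLOPS, Tesla V100 (FP64)

speedup = fp64_v100 / fp64_1080ti
print(f"V100 vs 1080 Ti at FP64: {speedup:.1f}x")  # ~19.8x

# Hypothetical running cost of the local 1080 Ti:
power_kw = 0.25        # assumed ~250 W under load
price_per_kwh = 0.15   # assumed electricity price, GBP/kWh
local_cost_per_hour = power_kw * price_per_kwh

# On raw FP64 alone, renting breaks even if the V100's hourly
# price is below the local card's hourly cost times the speedup.
break_even = local_cost_per_hour * speedup
print(f"Break-even rental price: ~{break_even:.2f} GBP/hour")
```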

Yeah, commercial GPU servers blast so much air through the chassis that they can cool not only the CPU but also the GPUs, even when both are passive versions. This makes sense in terms of power usage, as it saves having loads of little fans on the CPUs and GPUs.

--

I just realised I am missing OpenCL... I assumed that because it's an Nvidia card it would be using CUDA.
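(A minimal sketch of one way to check which OpenCL platforms/devices the host can actually see, assuming the pyopencl package is installed via pip install pyopencl; none of this is from the thread, just an illustration. If the Nvidia OpenCL ICD isn't installed alongside the driver, nothing will show up even though CUDA works.)

```python
# List the OpenCL platforms and devices visible on this machine.
# Milkyway's GPU app uses OpenCL, so if nothing appears here the
# project has nothing to send work for, even with CUDA installed.
import pyopencl as cl

try:
    platforms = cl.get_platforms()
except cl.Error:
    platforms = []  # no ICD loader / no platforms registered

if not platforms:
    print("No OpenCL platforms found - the OpenCL ICD is probably missing")

for platform in platforms:
    print(f"Platform: {platform.name}")
    for device in platform.get_devices():
        print(f"  Device: {device.name} "
              f"({cl.device_type.to_string(device.type)})")
```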

--

Milkyway Benchmark:

1080 Ti = 1 min 60
RX 580 = 1 min 30
V100 = 10-12 seconds!

(and that's with the CPU 90% loaded also)
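A quick sketch of what that works out to in throughput terms (only the 10-12 second range comes from the run above; the rest is plain arithmetic):

```python
# Convert per-task run times into tasks-per-hour throughput.
def tasks_per_hour(seconds_per_task: float) -> float:
    return 3600.0 / seconds_per_task

# V100 range from the benchmark above: 10-12 seconds per task
for t in (10, 12):
    print(f"{t:>2} s/task -> {tasks_per_hour(t):.0f} tasks/hour")
# roughly 300-360 Milkyway tasks per hour on the V100
```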
UBT - Timbo
UBT Forum Admin
Posts: 9673
Joined: Mon Mar 13, 2006 12:00 am
Location: NW Midlands
Contact:

Re: !!! Tesla V100 !!!!

Post by UBT - Timbo »

chriscambridge wrote: Mon Nov 30, 2020 6:10 pm Milkyway Benchmark:

1080 Ti = 1 min 60
RX 580 = 1 min 30
V100 = 10-12 seconds!

(and that's with the CPU 90% loaded also)
WHAT !!!!

That's impressive... though I'm not THAT surprised, as the whole idea of these GPUs is to crunch really hard, and since they have no display to "drive", all the hardware can be dedicated just to crunching numbers.

It's just a shame that the commercial side of the "Tesla" division is geared up for business users and not hobbyists... and of course these products earn Nvidia some big bucks!!


What does renting the use of a V100 cost? And how long do you have to rent it for?

regards
Tim
chriscambridge
Active UBT Contributor 1+ yr
Posts: 2178
Joined: Mon Aug 08, 2016 1:56 pm
Location: UK

Re: !!! Tesla V100 !!!!

Post by chriscambridge »

Image

It's about 75p an hour. It's billed per minute I think, or maybe by the hour.

www.exoscale.com

I am interested in how long Damien's Titan V takes to do MW tasks, as on paper they have very similar FP specs.
Last edited by chriscambridge on Tue Dec 01, 2020 4:49 am, edited 1 time in total.
damienh
UBT Contributor
Posts: 1685
Joined: Mon Aug 26, 2019 2:16 pm

Re: !!! Tesla V100 !!!!

Post by damienh »

That's Richard's, not mine (the Titan V).
chriscambridge
Active UBT Contributor 1+ yr
Posts: 2178
Joined: Mon Aug 08, 2016 1:56 pm
Location: UK

Re: !!! Tesla V100 !!!!

Post by chriscambridge »

Oh, I thought you had a Titan... Richard, is that homefarm? I don't remember seeing anyone called Richard...
damienh
UBT Contributor
Posts: 1685
Joined: Mon Aug 26, 2019 2:16 pm

Re: !!! Tesla V100 !!!!

Post by damienh »

Homefarm is Richard, yes. I have a couple of Titan Xs, but Pascal generation; they are basically 1080 Tis. Richard had a Titan V, not sure if he has sold it?
chriscambridge
Active UBT Contributor 1+ yr
Posts: 2178
Joined: Mon Aug 08, 2016 1:56 pm
Location: UK

Re: !!! Tesla V100 !!!!

Post by chriscambridge »

Thanks, hopefully he will see and visit this post. A friend who also has a Titan V did say his tasks were taking about 20 secs to complete.

In terms of billing, I just asked the (excellent) support team how much I am being billed per hour:

"It looks like you are using a single GPU2-small which is priced at 1.25011 EUR/Hour (or 900.08 EUR/Month). In addition to this you are charged for 200GB of local storage attached to the instance at 0.00013 EUR / GB hour (i.e. 0.026 / hour). You may also be charged for bandwidth but at the moment your usage is within the free tier."

I make that a total of about 1.276 EUR (roughly £1.14) / hour.
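For anyone who wants to double-check that, a quick sketch of the arithmetic using the rates quoted above (bandwidth ignored as it's in the free tier; the EUR-to-GBP rate is an assumption):

```python
# Hourly cost of the GPU2-small instance from the quoted rates.
gpu_rate = 1.25011      # EUR per hour
storage_gb = 200        # GB of attached local storage
storage_rate = 0.00013  # EUR per GB-hour

total_eur_per_hour = gpu_rate + storage_gb * storage_rate
print(f"Total: {total_eur_per_hour:.3f} EUR/hour")  # ~1.276 EUR/hour

eur_to_gbp = 0.895  # assumed exchange rate
print(f"Approx: {total_eur_per_hour * eur_to_gbp:.2f} GBP/hour")  # ~1.14
```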
If anyone wants to have a play, you can sign up for around £5. As soon as your usage goes over that, the instance will suspend. The most it has gone over my balance before suspending was £1.25, which I had to pay to get the instance back online.

If you sign up and have any issues with the firewall, give me a shout and I'll tell you what rules you need to set up for BOINC. To get the V100 added as a plan (so you can sign up for it), just contact support and ask them to add it.
damienh
UBT Contributor
Posts: 1685
Joined: Mon Aug 26, 2019 2:16 pm

Re: !!! Tesla V100 !!!!

Post by damienh »

Which cloud provider is it? I have run a GPU instance before, but only within GCP.
chriscambridge
Active UBT Contributor 1+ yr
Posts: 2178
Joined: Mon Aug 08, 2016 1:56 pm
Location: UK

Re: !!! Tesla V100 !!!!

Post by chriscambridge »

This is the cheapest I have found for V100s:

Exoscale:

www.exoscale.com

https://www.exoscale.com/pricing

--

For large core-count CPU servers, the cheapest is clearly:

Hetzner

https://www.hetzner.com/dedicated-rootserver/matrix-ax