
!!! Tesla V100 !!!!

Posted: Mon Nov 30, 2020 11:53 am
by chriscambridge
[screenshot]

Re: !!! Tesla V100 !!!!

Posted: Mon Nov 30, 2020 12:17 pm
by damienh
You have a Tesla V100?

Re: !!! Tesla V100 !!!!

Posted: Mon Nov 30, 2020 12:34 pm
by chriscambridge
https://milkyway.cs.rpi.edu/milkyway/ho ... id=1181878

It's a cloud account with the V100 that I'm testing out.

Unfortunately, for some reason Milkyway is refusing to give me any work units for it.

Re: !!! Tesla V100 !!!!

Posted: Mon Nov 30, 2020 4:03 pm
by damienh
Ah, I see. Much nicer to use them temporarily. They are rather expensive!

Re: !!! Tesla V100 !!!!

Posted: Mon Nov 30, 2020 4:05 pm
by damienh
Do you have OpenCL installed? I can't see it in your screenshot.

Re: !!! Tesla V100 !!!!

Posted: Mon Nov 30, 2020 5:34 pm
by UBT - Timbo
chriscambridge wrote: Mon Nov 30, 2020 12:34 pm https://milkyway.cs.rpi.edu/milkyway/ho ... id=1181878

Its a cloud account with the V100 that I'm testing out.

Unfortunately for some reason Milkyway is refusing to give me any work units for it
Hi Chris

Nice !!

I've been interested in these Tesla GPUs for a while and was almost tempted to get one of the early Fermi or Kepler models (as you can buy them really cheap, for less than £50...).

But then I looked on the Wiki and saw how much power they draw :-( ...and some of the PCIe versions don't even have built-in fan-assisted cooling (as they were designed to rely on the chassis cooling system).

And as time has gone on, newer generations have come out (Maxwell, Pascal and Volta) and the performance has improved, but the power consumption is still about the same (200-300 W). The Turing versions, though, seem to need less than 100 W, which makes them more viable (IMHO). And the latest Ampere versions seem to show great promise (but they should at the price they are going for !!).

regards
Tim

Re: !!! Tesla V100 !!!!

Posted: Mon Nov 30, 2020 6:10 pm
by chriscambridge
Working backwards...

This is why I created the FP16/32/64/TDP table, as I wanted to see the differences between all the GPUs. If you check out the most recent version, I have added loads more GPUs, including older FP64-capable models as well as the most powerful/expensive Teslas, and added TDP values.

http://www.ukboincteam.org.uk/newforum/ ... =18&t=6757

The reason I got this V100 is that I wanted to blast Milkyway, and I wanted to test whether it would be cheaper to use a cloud-based V100 rather than the 1080 Ti on this project, given the FP64 figures:

(1080 Ti) 0.354 TFLOPS vs (V100) 7.006 TFLOPS

That's obviously a huge difference.
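To put a number on that difference, here is a quick back-of-envelope ratio using the two FP64 figures quoted above (nothing else assumed):

```python
# FP64 throughput figures quoted above, in TFLOPS.
gtx_1080_ti_fp64 = 0.354   # GeForce 1080 Ti
v100_fp64 = 7.006          # Tesla V100

speedup = v100_fp64 / gtx_1080_ti_fp64
print(f"V100 / 1080 Ti FP64 ratio: ~{speedup:.1f}x")  # ~19.8x
```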

Yeah, commercial GPU servers blast so much air through the chassis that they can cool not only the CPU but also the GPUs, even though both are passive (heatsink-only) parts. This makes sense in terms of power usage, as it saves having loads of little fans on the CPU and GPUs.

--

I just realised I am missing OpenCL... I assumed that because it's an Nvidia card it would be using CUDA.
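For anyone else hitting this on a cloud instance: on Linux, OpenCL implementations register themselves as `.icd` files under `/etc/OpenCL/vendors`, which the ICD loader reads. A small sketch to check whether any are present (the path is the usual Linux location; if it's empty, installing the OpenCL loader package and the Nvidia driver's OpenCL component usually fixes it, and `clinfo` can confirm):

```python
from pathlib import Path

def opencl_icds(vendor_dir="/etc/OpenCL/vendors"):
    """Return the OpenCL ICD files registered with the loader, if any."""
    d = Path(vendor_dir)
    return sorted(p.name for p in d.glob("*.icd")) if d.is_dir() else []

if __name__ == "__main__":
    icds = opencl_icds()
    # On a working Nvidia setup you would expect to see e.g. nvidia.icd here.
    print("Registered ICDs:", icds or "none - OpenCL loader/driver ICD missing")
```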

--

Milkyway Benchmark:

1080 TI = 1 min 60
RX 580 = 1 min 30
V100 = 10-12 seconds!

(and that's with the CPU 90% loaded as well)
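For a rough sense of throughput, those per-task times convert into tasks per hour like this (taking ~11 s as the midpoint of the V100's 10-12 s, and ignoring scheduling overhead):

```python
def tasks_per_hour(seconds_per_task):
    """Rough Milkyway tasks per hour for a given per-task time."""
    return 3600 // seconds_per_task

# Times from the benchmark above.
for gpu, secs in [("RX 580", 90), ("V100", 11)]:
    print(f"{gpu}: ~{tasks_per_hour(secs)} tasks/hour")
# RX 580: ~40 tasks/hour, V100: ~327 tasks/hour
```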

Re: !!! Tesla V100 !!!!

Posted: Mon Nov 30, 2020 6:17 pm
by UBT - Timbo
chriscambridge wrote: Mon Nov 30, 2020 6:10 pm Milkyway Benchmark:

1080 TI = 1 min 60
RX 580 = 1 min 30
V100 = 10-12 seconds!

(and thats with the CPU 90% loaded also)
WHAT !!!!

That's impressive... though I'm not THAT surprised, as the whole idea of these GPUs was to really crunch hard: they have no display to "drive", so all the hardware can be dedicated to crunching numbers.

It's just a shame that the commercial side of the "Tesla" division is geared up for business users and not hobbyists... and of course, as such, these products earn Nvidia some big bucks !!


What does renting a V100 cost? And how long do you have to rent it for?

regards
Tim

Re: !!! Tesla V100 !!!!

Posted: Mon Nov 30, 2020 6:20 pm
by chriscambridge
[screenshot]

It's about 75p an hour. It's billed per minute I think, or maybe by the hour.

www.exoscale.com

I am interested in how long Damien's Titan V takes to do MW tasks, as on paper they have very similar FP specs.

Re: !!! Tesla V100 !!!!

Posted: Mon Nov 30, 2020 7:00 pm
by damienh
That's Richard's, not mine (the Titan V).

Re: !!! Tesla V100 !!!!

Posted: Mon Nov 30, 2020 7:29 pm
by chriscambridge
Oh, I thought you had a Titan... Richard, is that homefarm? I don't remember seeing anyone called Richard...

Re: !!! Tesla V100 !!!!

Posted: Mon Nov 30, 2020 8:36 pm
by damienh
Homefarm is Richard, yes. I have a couple of Titan Xs, but Pascal generation; they are basically 1080 Tis. Richard had a Titan V; not sure if he has sold it?

Re: !!! Tesla V100 !!!!

Posted: Tue Dec 01, 2020 9:33 am
by chriscambridge
Thanks, hopefully he will see and visit this post. Although a friend who also has a Titan V did say his tasks were taking about 20 secs to complete.

In terms of billing, I just asked the (excellent) support how much I am being billed per hour:
Support wrote: It looks like you are using a single GPU2-small which is priced at 1.25011 EUR/hour (or 900.08 EUR/month). In addition to this you are charged for 200GB of local storage attached to the instance at 0.00013 EUR/GB-hour (i.e. 0.026 EUR/hour). You may also be charged for bandwidth but at the moment your usage is within the free tier.

I make that a total of 1.27611 EUR (about £1.14) / hour.
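Sanity-checking that hourly total from the two rates support quoted (GPU instance plus the 200 GB of attached storage):

```python
# Rates quoted by Exoscale support.
gpu_rate = 1.25011        # EUR/hour for the GPU2-small instance
storage_gb = 200          # GB of attached local storage
storage_rate = 0.00013    # EUR per GB-hour

total = gpu_rate + storage_gb * storage_rate
print(f"Total: {total:.5f} EUR/hour")  # Total: 1.27611 EUR/hour
```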
If anyone wants to have a play, you can sign up for around £5. As soon as your usage goes over your balance, the instance will suspend. The most it has gone over before suspending was £1.25, which I had to pay to get the instance back online.

If you sign up and have any issues with the firewall, give me a shout and I'll tell you what rules you need to set up for BOINC. To get the V100 added as a plan (so you can sign up for it), just contact support and ask them to add it.

Re: !!! Tesla V100 !!!!

Posted: Tue Dec 01, 2020 10:58 am
by damienh
Which cloud provider is it? I have had a GPU instance live before, but only within GCP.

Re: !!! Tesla V100 !!!!

Posted: Tue Dec 01, 2020 11:00 am
by chriscambridge
This is the cheapest I have found for V100s:

Exoscale:

www.exoscale.com

https://www.exoscale.com/pricing

--

For high-core-count CPU servers, the cheapest is clearly

Hetzner

https://www.hetzner.com/dedicated-rootserver/matrix-ax