New applications for GPU released
Message boards : News : New applications for GPU released

Previous · 1 · 2 · 3 · 4 · 5 · 6 . . . 8 · Next
Author Message
Profile HA-SOFT, s.r.o.
Project developer
Project tester
Send message
Joined: 21 Dec 12
Posts: 176
Credit: 104,979,360
RAC: 17,428
Message 2297 - Posted: 30 Dec 2013, 17:30:57 UTC - in response to Message 2296.

Great news for the project, but to be honest I am somewhat disappointed by the performance of mid-range cards. I guess I will be sticking with my CPU somewhat longer...


These are expected results. GPUs are very specialized and it's hard to fit code to every hardware architecture.

Jan Vaclavik
Send message
Joined: 26 Jan 13
Posts: 25
Credit: 1,013,760
RAC: 1,706
Message 2303 - Posted: 30 Dec 2013, 18:00:15 UTC - in response to Message 2297.

Well, maybe it's just me expecting too much performance from those mid-range cards.
I usually use low-end or mid-range graphics cards because of the lower price and reasonable power consumption, but while those are fine gaming-wise, I guess GPGPU shows their true colours.

Off-topic: Speaking of gaming, are there any long-term plans to look at the PS4 and XBO?

Profile (retired account)
Send message
Joined: 3 Jan 13
Posts: 30
Credit: 1,705,200
RAC: 0
Message 2311 - Posted: 30 Dec 2013, 19:03:28 UTC - in response to Message 2303.

Well, maybe it's just me expecting too much performance from those mid-range cards.


The performance on a high-end card is also a bit disappointing. My GTX Titan seems to need nearly one hour for a workunit. This is with high load (98% GPU, 33% memory controller, 1099 MB memory load), running in SP mode, limited by temperature and with some noticeable lags in the Windows GUI. On the bright side: CPU usage is low.

Will also give it a try on my GT 650M now that the last GPUGrid short run is finished, but I guess it will be slower than the CPU there...
____________

Profile HA-SOFT, s.r.o.
Project developer
Project tester
Send message
Joined: 21 Dec 12
Posts: 176
Credit: 104,979,360
RAC: 17,428
Message 2312 - Posted: 30 Dec 2013, 19:07:10 UTC - in response to Message 2311.
Last modified: 30 Dec 2013, 19:08:46 UTC

The Titan's times are around 36 min. Not too much compared to Intel CPUs, even though we have 100% card usage.

Profile (retired account)
Send message
Joined: 3 Jan 13
Posts: 30
Credit: 1,705,200
RAC: 0
Message 2313 - Posted: 30 Dec 2013, 19:13:09 UTC - in response to Message 2312.
Last modified: 30 Dec 2013, 20:10:49 UTC

The Titan's times are around 36 min.


I have activated the manual fan control now, to avoid the temperature limitation (don't want to exceed 80 °C here), so the GPU clock is now higher, let's see...

EDIT: Not quite. 46 min. @ 980 MHz GPU and ~ 74°C now.

Not too much compared to Intel CPUs, even though we have 100% card usage.


Yes, agreed.

WTBroughton
Send message
Joined: 19 Jun 12
Posts: 7
Credit: 7,372,836
RAC: 7,783
Message 2316 - Posted: 30 Dec 2013, 21:05:33 UTC

GTX 460: ~9350 secs.

Profile HA-SOFT, s.r.o.
Project developer
Project tester
Send message
Joined: 21 Dec 12
Posts: 176
Credit: 104,979,360
RAC: 17,428
Message 2317 - Posted: 30 Dec 2013, 21:23:37 UTC - in response to Message 2313.



EDIT: Not quite. 46 min. @ 980 MHz GPU and ~ 74°C now.


Mine runs at 1094 MHz, ~60 °C, with the fan at full speed.

see

http://asteroidsathome.net/boinc/result.php?resultid=27987109

But there may be different WUs, too.

Profile Kyong
Project administrator
Project developer
Project tester
Project scientist
Avatar
Send message
Joined: 9 Jun 12
Posts: 570
Credit: 52,629,744
RAC: 0
Message 2318 - Posted: 30 Dec 2013, 21:27:53 UTC - in response to Message 2303.
Last modified: 30 Dec 2013, 21:28:28 UTC

Well, maybe it's just me expecting too much performance from those mid-range cards.
I usually use low-end or mid-range graphics cards because of the lower price and reasonable power consumption, but while those are fine gaming-wise, I guess GPGPU shows their true colours.

Off-topic: Speaking of gaming, are there any long-term plans to look at the PS4 and XBO?


Well, I really don't know what operating system runs on the PS4 and Xbox, nor the hardware specifications. But if there were a possibility to get Linux on the PS4 and Xbox, then it should be no problem to compile the application for them. But I don't have access to this hardware.

Profile (retired account)
Send message
Joined: 3 Jan 13
Posts: 30
Credit: 1,705,200
RAC: 0
Message 2320 - Posted: 30 Dec 2013, 21:48:15 UTC - in response to Message 2317.

Mine runs at 1094 MHz, ~60 °C, with the fan at full speed.

see

http://asteroidsathome.net/boinc/result.php?resultid=27987109

But there may be different WUs, too.


I see, nicely overclocked. Just curious: Is this reference design cooling? (mine is)

Profile HA-SOFT, s.r.o.
Project developer
Project tester
Send message
Joined: 21 Dec 12
Posts: 176
Credit: 104,979,360
RAC: 17,428
Message 2321 - Posted: 30 Dec 2013, 21:51:55 UTC - in response to Message 2320.

I see, nicely overclocked. Just curious: Is this reference design cooling? (mine is)


Yes. Reference design from Zotac.

cyrusNGC_224@P3D
Send message
Joined: 1 Apr 13
Posts: 37
Credit: 144,471,720
RAC: 145,026
Message 2324 - Posted: 30 Dec 2013, 22:56:28 UTC - in response to Message 2311.

The performance on a highend card is also a bit disappointing. My GTX Titan seems to need nearly one hour for a workunit.

That's why I'm looking forward to the OpenCL version with the much stronger AMD GPUs.

Profile HA-SOFT, s.r.o.
Project developer
Project tester
Send message
Joined: 21 Dec 12
Posts: 176
Credit: 104,979,360
RAC: 17,428
Message 2325 - Posted: 30 Dec 2013, 23:07:33 UTC - in response to Message 2324.
Last modified: 30 Dec 2013, 23:20:43 UTC

The performance on a highend card is also a bit disappointing. My GTX Titan seems to need nearly one hour for a workunit.

That's why I'm looking forward to the OpenCL version with the much stronger AMD GPUs.


I'm afraid it will be the same problem. Only the 79xx cards look promising.

Jan Vaclavik
Send message
Joined: 26 Jan 13
Posts: 25
Credit: 1,013,760
RAC: 1,706
Message 2326 - Posted: 31 Dec 2013, 0:02:50 UTC - in response to Message 2318.

Well, I really don't know what operating system runs on the PS4 and Xbox, nor the hardware specifications. But if there were a possibility to get Linux on the PS4 and Xbox, then it should be no problem to compile the application for them. But I don't have access to this hardware.
I was asking about a native app.
I believe it's custom hardware running a custom OS. The hardware is nothing to write home about (2x4 Jaguar cores with a beefed-up GPU). While it won't get better over time like PCs do, the installed base is going to increase.
I would only expect one of you guys to look into the thing if you already planned to pick one of the consoles for gaming purposes.

xyzzy
Send message
Joined: 21 Apr 13
Posts: 4
Credit: 6,438,240
RAC: 0
Message 2327 - Posted: 31 Dec 2013, 2:49:57 UTC

Lots of diplomatic talk about GPU vs. CPU credit...
To the point:

1. The GPU credit is ridiculously low (by 5x to 10x).
2. You are missing a very big opportunity (due to low GPU credit) to use the new GPU app to attract the big GPU crunchers into an important project.

xyzzy

Dagorath
Send message
Joined: 16 Aug 12
Posts: 293
Credit: 1,116,280
RAC: 0
Message 2328 - Posted: 31 Dec 2013, 5:25:28 UTC - in response to Message 2327.

The GPU tasks are the same tasks as the CPU tasks. Why should a GPU get more credits for doing the same work?

It's true, they could attract more GPU crunchers by paying more credits, but since a GPU performs only a little better than a CPU on their tasks, buying credit whores with credits would be a shameful waste of resources. It's better for GPUs to stay at projects that use their full potential. By doing the morally responsible thing and not squandering our resources, the A&H admins are demonstrating that they are responsible citizens of the BOINC community.
____________
BOINC FAQ Service
Official BOINC wiki
Installing BOINC on Linux

Dagorath
Send message
Joined: 16 Aug 12
Posts: 293
Credit: 1,116,280
RAC: 0
Message 2329 - Posted: 31 Dec 2013, 5:36:09 UTC

HA-SOFT,

The time you reported for your Titan has a note in parentheses beside it... (recommended CUDA double precision disabled). That confuses me because I thought A@H GPU tasks would benefit from DP, but it appears you recommend disabling it. I was planning on hacking my GTX 660 Ti to unlock its DP capability, as was discussed about a month ago, but now it seems like that would be a waste of time.




Profile (retired account)
Send message
Joined: 3 Jan 13
Posts: 30
Credit: 1,705,200
RAC: 0
Message 2330 - Posted: 31 Dec 2013, 7:04:37 UTC - in response to Message 2311.
Last modified: 31 Dec 2013, 7:12:18 UTC

Will also give it a try on my GT 650M now that the last GPUGrid short run is finished, but I guess it will be slower than the CPU there...


Yes, indeed: GT 650M ~18700 s (@ 950 MHz, DDR3 @ 900 MHz)
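For a rough side-by-side view, here is a small Python sketch comparing the per-workunit runtimes reported in this thread (workunits may differ between hosts, so treat the ratios as approximate):

```python
# Per-workunit runtimes reported in this thread, in seconds.
# WUs may differ between hosts, so this is only a rough comparison.
times_s = {
    "GTX Titan (SP, 1094 MHz)": 36 * 60,   # ~36 min
    "GTX Titan (SP, 980 MHz)":  46 * 60,   # ~46 min
    "GTX 460":                  9350,
    "GT 650M":                  18700,
}

baseline = times_s["GTX Titan (SP, 1094 MHz)"]
for card, t in times_s.items():
    # Show each card's runtime in minutes and as a multiple of the
    # fastest reported Titan run.
    print(f"{card}: {t / 60:.0f} min ({t / baseline:.1f}x Titan)")
```

On these numbers the mobile GT 650M is almost an order of magnitude slower than the overclocked Titan, which matches the "slower than the CPU" expectation above.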

Profile (retired account)
Send message
Joined: 3 Jan 13
Posts: 30
Credit: 1,705,200
RAC: 0
Message 2331 - Posted: 31 Dec 2013, 9:09:19 UTC - in response to Message 2313.

EDIT: Not quite. 46 min. @ 980 MHz GPU and ~ 74°C now.


In comparison, a first workunit in DP mode on the Titan: 52 min. @ 823 MHz GPU and ~ 71°C. Keeping in mind that the shader clock is only 84%, this does indicate some speedup in double precision mode on a clock-for-clock basis. No significant changes in GPU and memory load.
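The clock-for-clock claim can be checked with a quick calculation using the two runtimes from these posts, assuming runtime scales inversely with shader clock (only roughly true for a partly memory-bound kernel):

```python
# SP run: 46 min at 980 MHz; DP run: 52 min at 823 MHz (from this thread).
sp_time_min, sp_clock_mhz = 46.0, 980.0
dp_time_min, dp_clock_mhz = 52.0, 823.0

# Scale the SP runtime to what it would be at the DP run's clock,
# assuming runtime is inversely proportional to shader clock.
sp_time_at_dp_clock = sp_time_min * (sp_clock_mhz / dp_clock_mhz)

print(round(sp_time_at_dp_clock, 1))       # ~54.8 min
print(dp_time_min < sp_time_at_dp_clock)   # True: DP slightly faster per clock
```

So at equal clocks the DP run (52 min) edges out the scaled SP estimate (~55 min), a modest clock-for-clock speedup consistent with the post above.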

Profile HA-SOFT, s.r.o.
Project developer
Project tester
Send message
Joined: 21 Dec 12
Posts: 176
Credit: 104,979,360
RAC: 17,428
Message 2332 - Posted: 31 Dec 2013, 11:55:00 UTC - in response to Message 2327.

Lots of diplomatic talk about GPU vs. CPU credit.....
To the point:

1. The GPU credit is ridiculously low (by 5x to 10x).
2. You are missing a very big opportunity (due to low GPU credit) to use the new GPU app to attract the big GPU crunchers into an important project.

xyzzy


1. I agree with Dagorath. Same work = same credit. Yes, I know that the power consumption of GPUs compared to the credit gained is disadvantageous.

2. I think things will get better if we improve the GPU app, but we can never beat DistrRtGen or GPUGrid in credit.

Profile HA-SOFT, s.r.o.
Project developer
Project tester
Send message
Joined: 21 Dec 12
Posts: 176
Credit: 104,979,360
RAC: 17,428
Message 2333 - Posted: 31 Dec 2013, 11:57:28 UTC - in response to Message 2329.
Last modified: 31 Dec 2013, 11:57:49 UTC

The time you reported for your Titan has a note in parentheses beside it... (recommended CUDA double precision disabled). That confuses me because I thought A@H GPU tasks would benefit from DP, but it appears you recommend disabling it. I was planning on hacking my GTX 660 Ti to unlock its DP capability, as was discussed about a month ago, but now it seems like that would be a waste of time.


It's because the calculation time is divided half-and-half between memory load and DP calculations. The second point is that we mostly use multiply-add ops, which are not so aggressive.
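One way to see why higher DP throughput helps so little is a back-of-the-envelope arithmetic-intensity (roofline) estimate. The Titan figures below are rounded public specs, and the bytes-per-FMA figure is an illustrative assumption, not a measurement of the A@H kernel:

```python
# Approximate GTX Titan peak figures (rounded public specs):
peak_dp_gflops = 1500.0   # ~1.5 TFLOP/s double precision (full-DP mode)
bandwidth_gbs  = 288.0    # ~288 GB/s memory bandwidth

# Arithmetic intensity (FLOPs per byte) needed to keep the DP units busy:
balance_point = peak_dp_gflops / bandwidth_gbs
print(round(balance_point, 1))  # ~5.2 FLOPs/byte

# Illustrative assumption: a kernel doing one fused multiply-add (2 FLOPs)
# per 8-byte double loaded has an intensity of only 2/8 = 0.25 FLOPs/byte,
# far below the balance point -- it is memory-bound, so extra DP
# throughput barely changes the runtime.
kernel_intensity = 2.0 / 8.0
print(kernel_intensity < balance_point)  # True: memory-bound
```

Under such an assumption the memory bus, not the DP units, sets the pace, which fits the observed 33% memory-controller load alongside 98% GPU load.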



Copyright © 2020 Asteroids@home