New applications for GPU released


Message boards : News : New applications for GPU released

Previous · 1 · 2 · 3 · 4 · 5 · 6 . . . 8 · Next
AuthorMessage
Jan Vaclavik

Joined: 26 Jan 13
Posts: 31
Credit: 1,550,275
RAC: 232
Message 2303 - Posted: 30 Dec 2013, 18:00:15 UTC - in response to Message 2297.  
Well, maybe it's just me expecting too much performance from those mid-range cards.
I usually use low-end or mid-range graphics cards because of the lower price and reasonable power consumption, but while those are fine gaming-wise, I guess GPGPU shows their true colours.

Offtopic: Speaking of gaming, are there any long term plans to look at PS4 and XBO?
ID: 2303
Profile (retired account)

Joined: 3 Jan 13
Posts: 30
Credit: 1,705,200
RAC: 0
Message 2311 - Posted: 30 Dec 2013, 19:03:28 UTC - in response to Message 2303.  
Well, maybe it's just me expecting too much performance from those mid-range cards.


The performance on a high-end card is also a bit disappointing. My GTX Titan seems to need nearly one hour for a workunit. This is with high load (98% GPU, 33% memory controller, 1099 MB memory load), running in SP mode, limited by temperature, and with some noticeable lag in the Windows GUI. On the bright side: CPU usage is low.

Will also give it a try on my GT 650M now that the last GPUGrid short run is finished, but I guess it will be slower than the CPU there...
ID: 2311
Profile HA-SOFT, s.r.o.
Project developer
Project tester

Joined: 21 Dec 12
Posts: 176
Credit: 136,462,135
RAC: 8
Message 2312 - Posted: 30 Dec 2013, 19:07:10 UTC - in response to Message 2311.  

Last modified: 30 Dec 2013, 19:08:46 UTC
Titan times are around 36 min. Not much compared to Intel CPUs, even though we have 100% card usage.
ID: 2312
Profile (retired account)

Joined: 3 Jan 13
Posts: 30
Credit: 1,705,200
RAC: 0
Message 2313 - Posted: 30 Dec 2013, 19:13:09 UTC - in response to Message 2312.  

Last modified: 30 Dec 2013, 20:10:49 UTC
Titan times are around 36 min.


I have activated manual fan control now to avoid the temperature limit (I don't want to exceed 80 °C here), so the GPU clock is now higher; let's see...

EDIT: Not quite. 46 min. @ 980 MHz GPU and ~ 74°C now.

Not much compared to Intel CPUs, even though we have 100% card usage.


Yes, agreed.
ID: 2313
WTBroughton

Joined: 19 Jun 12
Posts: 7
Credit: 10,008,492
RAC: 481
Message 2316 - Posted: 30 Dec 2013, 21:05:33 UTC
GTX460 ~9350 secs.
ID: 2316
Profile HA-SOFT, s.r.o.
Project developer
Project tester

Joined: 21 Dec 12
Posts: 176
Credit: 136,462,135
RAC: 8
Message 2317 - Posted: 30 Dec 2013, 21:23:37 UTC - in response to Message 2313.  


EDIT: Not quite. 46 min. @ 980 MHz GPU and ~ 74°C now.


Mine runs at 1094 MHz, ~60 °C, fan at full speed.

see

http://asteroidsathome.net/boinc/result.php?resultid=27987109

But it may also have been a different WU.
ID: 2317
Profile Kyong
Project administrator
Project developer
Project tester
Project scientist

Joined: 9 Jun 12
Posts: 584
Credit: 52,667,664
RAC: 0
Message 2318 - Posted: 30 Dec 2013, 21:27:53 UTC - in response to Message 2303.  

Last modified: 30 Dec 2013, 21:28:28 UTC
Well, maybe it's just me expecting too much performance from those mid-range cards.
I usually use low-end or mid-range graphics cards because of the lower price and reasonable power consumption, but while those are fine gaming-wise, I guess GPGPU shows their true colours.

Offtopic: Speaking of gaming, are there any long term plans to look at PS4 and XBO?


Well, I really don't know what operating system runs on the PS4 and Xbox, nor the hardware specifications. But if there were a possibility to get Linux on the PS4 and Xbox, then it should be no problem to compile the application for them. But I don't have access to this hardware.
ID: 2318
Profile (retired account)

Joined: 3 Jan 13
Posts: 30
Credit: 1,705,200
RAC: 0
Message 2320 - Posted: 30 Dec 2013, 21:48:15 UTC - in response to Message 2317.  
Mine runs at 1094 MHz, ~60 °C, fan at full speed.

see

http://asteroidsathome.net/boinc/result.php?resultid=27987109

But it may also have been a different WU.


I see, nicely overclocked. Just curious: Is this reference design cooling? (mine is)
ID: 2320
Profile HA-SOFT, s.r.o.
Project developer
Project tester

Joined: 21 Dec 12
Posts: 176
Credit: 136,462,135
RAC: 8
Message 2321 - Posted: 30 Dec 2013, 21:51:55 UTC - in response to Message 2320.  
I see, nicely overclocked. Just curious: Is this reference design cooling? (mine is)


Yes. Reference design from Zotac.
ID: 2321
cyrusNGC_224@P3D

Joined: 1 Apr 13
Posts: 37
Credit: 153,496,537
RAC: 0
Message 2324 - Posted: 30 Dec 2013, 22:56:28 UTC - in response to Message 2311.  
The performance on a highend card is also a bit disappointing. My GTX Titan seems to need nearly one hour for a workunit.

That's why I'm looking forward to the OpenCL version with the much stronger AMD GPUs.
ID: 2324
Profile HA-SOFT, s.r.o.
Project developer
Project tester

Joined: 21 Dec 12
Posts: 176
Credit: 136,462,135
RAC: 8
Message 2325 - Posted: 30 Dec 2013, 23:07:33 UTC - in response to Message 2324.  

Last modified: 30 Dec 2013, 23:20:43 UTC
The performance on a highend card is also a bit disappointing. My GTX Titan seems to need nearly one hour for a workunit.

That's why I'm looking forward to the OpenCL version with the much stronger AMD GPUs.


I'm afraid it will be the same problem. Only the 79xx cards look promising.
ID: 2325
Jan Vaclavik

Joined: 26 Jan 13
Posts: 31
Credit: 1,550,275
RAC: 232
Message 2326 - Posted: 31 Dec 2013, 0:02:50 UTC - in response to Message 2318.  
Well, I really don't know what operating system runs on the PS4 and Xbox, nor the hardware specifications. But if there were a possibility to get Linux on the PS4 and Xbox, then it should be no problem to compile the application for them. But I don't have access to this hardware.
I was asking about a native app.
I believe it's custom hardware running a custom OS. The hardware is nothing to write home about (2x4 Jaguar cores with a beefed-up GPU). While it won't get better over time like PCs do, the installed base is going to increase.
I would only expect one of you guys to look into it if you had already planned to pick one of the consoles for gaming purposes.
ID: 2326
xyzzy

Joined: 21 Apr 13
Posts: 4
Credit: 6,438,240
RAC: 0
Message 2327 - Posted: 31 Dec 2013, 2:49:57 UTC
Lots of diplomatic talk about GPU vs. CPU credit.....
To the point:

1. The GPU credit is ridiculously low (by 5x to 10x).
2. You are missing a very big opportunity (due to low GPU credit) to use the new GPU app to attract the big GPU crunchers into an important project.

xyzzy
ID: 2327
Dagorath

Joined: 16 Aug 12
Posts: 293
Credit: 1,116,280
RAC: 0
Message 2328 - Posted: 31 Dec 2013, 5:25:28 UTC - in response to Message 2327.  
The GPU tasks are the same tasks as the CPU tasks. Why should a GPU get more credits for doing the same work?

It's true, they could attract more GPU crunchers by paying more credits, but since a GPU performs only a little better than a CPU on their tasks, buying credit whores with credits would be a shameful waste of resources. It's better for GPUs to stay at projects that use their full potential. By doing the morally responsible thing and not squandering our resources, A&H admins are demonstrating that they are responsible citizens of the BOINC community.
BOINC FAQ Service
Official BOINC wiki
Installing BOINC on Linux
ID: 2328
Dagorath

Joined: 16 Aug 12
Posts: 293
Credit: 1,116,280
RAC: 0
Message 2329 - Posted: 31 Dec 2013, 5:36:09 UTC
H.A. Soft,

The time you reported for your Titan has a note in parentheses beside it... (recommended CUDA double precision disabled). That confuses me, because I thought A@H GPU tasks would benefit from DP, but it appears you recommend disabling it. I was planning on hacking my GTX 660 Ti to unlock its DP capability, as was discussed about a month ago, but now it seems like that would be a waste of time.



ID: 2329
Profile (retired account)

Joined: 3 Jan 13
Posts: 30
Credit: 1,705,200
RAC: 0
Message 2330 - Posted: 31 Dec 2013, 7:04:37 UTC - in response to Message 2311.  

Last modified: 31 Dec 2013, 7:12:18 UTC
Will also give it a try on my GT 650M now that the last GPUGrid short run is finished, but I guess it will be slower than the CPU there...


Yes, indeed: GT 650M ~18700 s (@ 950 MHz, DDR3 @ 900 MHz)
ID: 2330
Profile (retired account)

Joined: 3 Jan 13
Posts: 30
Credit: 1,705,200
RAC: 0
Message 2331 - Posted: 31 Dec 2013, 9:09:19 UTC - in response to Message 2313.  
EDIT: Not quite. 46 min. @ 980 MHz GPU and ~ 74°C now.


In comparison, a first workunit in DP mode on the Titan: 52 min @ 823 MHz GPU and ~71 °C. Keeping in mind that the shader clock is only 84%, this does indicate some speedup in double-precision mode on a clock-for-clock basis. No significant changes in GPU and memory load.
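The clock-for-clock claim above can be sanity-checked with a little arithmetic. A minimal sketch, using only the run times and clocks quoted in this thread, and assuming runtime scales inversely with GPU clock (a simplification that ignores the memory-bound share of the workload):

```python
# Clock-normalized comparison of the Titan run times quoted above.
# Figures are from the posts; the inverse-clock scaling model is an
# assumption for illustration, not the project's own analysis.

sp_minutes, sp_mhz = 46, 980   # single-precision run
dp_minutes, dp_mhz = 52, 823   # double-precision run

# Work per run in "clock-minutes" (runtime * clock rate).
sp_clock_minutes = sp_minutes * sp_mhz
dp_clock_minutes = dp_minutes * dp_mhz

# A ratio below 1 means the DP run needed fewer clock cycles overall,
# i.e. a small per-clock speedup in DP mode.
ratio = dp_clock_minutes / sp_clock_minutes
print(f"DP/SP clock-minute ratio: {ratio:.2f}")  # → 0.95
```

The ~5% fewer clock-minutes in DP mode is consistent with the "some speedup... on a clock-for-clock basis" observation, though with only one workunit per mode it is at best suggestive.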
ID: 2331
Profile HA-SOFT, s.r.o.
Project developer
Project tester

Joined: 21 Dec 12
Posts: 176
Credit: 136,462,135
RAC: 8
Message 2332 - Posted: 31 Dec 2013, 11:55:00 UTC - in response to Message 2327.  
Lots of diplomatic talk about GPU vs. CPU credit.....
To the point:

1. The GPU credit is ridiculously low (by 5x to 10x).
2. You are missing a very big opportunity (due to low GPU credit) to use the new GPU app to attract the big GPU crunchers into an important project.

xyzzy


1. I agree with Dagorath. Same work = same credit. Yes, I know that the power consumption of GPUs compared to the credit gained is disadvantageous.

2. I think things will get better once we improve the GPU app, but we can never beat DistrRtGen or GPUGrid in credit.
ID: 2332
Profile HA-SOFT, s.r.o.
Project developer
Project tester

Joined: 21 Dec 12
Posts: 176
Credit: 136,462,135
RAC: 8
Message 2333 - Posted: 31 Dec 2013, 11:57:28 UTC - in response to Message 2329.  

Last modified: 31 Dec 2013, 11:57:49 UTC
The time you reported for your Titan has a note in parentheses beside it... (recommended CUDA double precision disabled). That confuses me, because I thought A@H GPU tasks would benefit from DP, but it appears you recommend disabling it. I was planning on hacking my GTX 660 Ti to unlock its DP capability, as was discussed about a month ago, but now it seems like that would be a waste of time.


It's because the calculation time is divided half-and-half between memory loads and DP calculations. The second point is that we mostly use multiply-add ops, which are not so aggressive.
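A half-and-half split like this puts a hard ceiling on what faster DP hardware can buy, per Amdahl's law. A quick sketch (the 0.5 compute fraction is the developer's estimate above; the speedup factors are illustrative):

```python
def overall_speedup(compute_fraction, compute_speedup):
    """Amdahl's law: only the compute share of the runtime shrinks."""
    return 1.0 / ((1.0 - compute_fraction) + compute_fraction / compute_speedup)

# With half the runtime spent on memory traffic, even infinitely
# fast DP units can at most double the throughput.
for s in (2, 4, 1e9):
    print(f"DP {s:g}x faster -> overall {overall_speedup(0.5, s):.2f}x")
```

This is why unlocking extra DP capability (as discussed for the GTX 660 Ti above) would yield less benefit than the raw DP throughput numbers suggest.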
ID: 2333
Profile HA-SOFT, s.r.o.
Project developer
Project tester

Joined: 21 Dec 12
Posts: 176
Credit: 136,462,135
RAC: 8
Message 2334 - Posted: 31 Dec 2013, 12:09:52 UTC - in response to Message 2333.  

Last modified: 31 Dec 2013, 12:10:42 UTC
To all GPU fans:

This is the first public version, and our development does not stop here.

The next version will:

1. Use less memory (for example, Titan: 1100 MB -> 688 MB). A small speedup.
2. I will try to move some memory arrays to textures with int2 texels. This may speed up the app.

PS: It looks like we made the CPU apps too fast :-D
ID: 2334
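The int2-texel idea in point 2 is a standard CUDA workaround: textures of that era could not hold 64-bit texels, so a double is stored as two 32-bit halves and reassembled after the fetch (on the device via the __hiloint2double intrinsic). A host-side Python sketch of the round trip, purely illustrative and not the project's actual code:

```python
import struct

def pack_double(d):
    """Split a double into (lo, hi) 32-bit halves, as stored in an int2 texel."""
    bits, = struct.unpack("<Q", struct.pack("<d", d))
    return bits & 0xFFFFFFFF, bits >> 32

def unpack_double(lo, hi):
    """Reassemble the double after the fetch (CUDA: __hiloint2double(hi, lo))."""
    bits = (hi << 32) | lo
    return struct.unpack("<d", struct.pack("<Q", bits))[0]

pi = 3.141592653589793
lo, hi = pack_double(pi)
assert unpack_double(lo, hi) == pi  # lossless round trip
```

Because the split and reassembly only move bits, no precision is lost; the payoff on the GPU is that the data can then go through the cached texture path instead of plain global-memory loads.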