New applications for GPU released
Author | Message |
---|---|
Send message Joined: 26 Jan 13 Posts: 31 Credit: 1,549,710 RAC: 244 |
Well, maybe it's just me expecting too much performance from those mid-range cards. I usually use low-end or mid-range graphics cards because of the lower price and reasonable power consumption, but while those are fine gaming-wise, I guess GPGPU shows their true colours. Off-topic: speaking of gaming, are there any long-term plans to look at the PS4 and XBO? |
Send message Joined: 3 Jan 13 Posts: 30 Credit: 1,705,200 RAC: 0 |
Well, maybe it's just me expecting too much performance from those mid-range cards. The performance on a high-end card is also a bit disappointing. My GTX Titan seems to need nearly one hour for a workunit. This is with high load (98% GPU, 33% memory controller, 1099 MB memory load), running in SP mode, limited by temperature, and with some noticeable lag in the Windows GUI. On the bright side: CPU usage is low. I will also give it a try on my GT 650M now that the last GPUGrid short run is finished, but I guess it will be slower than the CPU there... |
Send message Joined: 21 Dec 12 Posts: 176 Credit: 136,462,135 RAC: 11 |
Last modified: 30 Dec 2013, 19:08:46 UTC |
Send message Joined: 3 Jan 13 Posts: 30 Credit: 1,705,200 RAC: 0 |
Last modified: 30 Dec 2013, 20:10:49 UTC Times for the Titan are somewhere around 36 min. I have activated manual fan control now to avoid the temperature limit (I don't want to exceed 80 °C here), so the GPU clock is now higher; let's see... EDIT: Not quite. 46 min @ 980 MHz GPU and ~74 °C now. Not too much compared to Intel CPUs, even when we have 100% card usage. Yes, agreed. |
Send message Joined: 19 Jun 12 Posts: 7 Credit: 10,008,113 RAC: 596 |
|
Send message Joined: 21 Dec 12 Posts: 176 Credit: 136,462,135 RAC: 11 |
Mine at 1094 MHz, ~60 °C, fan at full speed; see http://asteroidsathome.net/boinc/result.php?resultid=27987109 But there may be different WUs also. |
Send message Joined: 9 Jun 12 Posts: 584 Credit: 52,667,664 RAC: 0 |
Last modified: 30 Dec 2013, 21:28:28 UTC Well, maybe it's just me expecting too much performance from those mid-range cards. Well, I really don't know what operating system runs on the PS4 and Xbox, nor the hardware specification. But if it were possible to get Linux on the PS4 and Xbox, then it should be no problem to compile the application for them. But I don't have access to this hardware. |
Send message Joined: 3 Jan 13 Posts: 30 Credit: 1,705,200 RAC: 0 |
Mine at 1094 MHz, ~60 °C, fan at full speed I see, nicely overclocked. Just curious: is this reference-design cooling? (mine is) |
Send message Joined: 21 Dec 12 Posts: 176 Credit: 136,462,135 RAC: 11 |
I see, nicely overclocked. Just curious: Is this reference design cooling? (mine is) Yes. Reference design from Zotac. |
Send message Joined: 1 Apr 13 Posts: 37 Credit: 153,496,537 RAC: 0 |
|
Send message Joined: 21 Dec 12 Posts: 176 Credit: 136,462,135 RAC: 11 |
Last modified: 30 Dec 2013, 23:20:43 UTC The performance on a high-end card is also a bit disappointing. My GTX Titan seems to need nearly one hour for a workunit. I'm afraid it will be the same problem. Only the 79xx cards look promising. |
Send message Joined: 26 Jan 13 Posts: 31 Credit: 1,549,710 RAC: 244 |
Well, I really don't know what operating system runs on the PS4 and Xbox, nor the hardware specification. But if it were possible to get Linux on the PS4 and Xbox, then it should be no problem to compile the application for them. But I don't have access to this hardware. I was asking about a native app. I believe it's custom hardware running a custom OS. The hardware is nothing to write home about (2x4 Jaguar cores with a beefed-up GPU). While it won't get better over time like PCs do, the installed base is going to increase. I would only expect one of you to look into it if you already planned to pick up one of the consoles for gaming purposes. |
Send message Joined: 21 Apr 13 Posts: 4 Credit: 6,438,240 RAC: 0 |
Lots of diplomatic talk about GPU vs. CPU credit..... To the point: 1. The GPU credit is ridiculously low (by 5x to 10x). 2. You are missing a very big opportunity (due to the low GPU credit) to use the new GPU app to attract the big GPU crunchers into an important project. xyzzy |
Send message Joined: 16 Aug 12 Posts: 293 Credit: 1,116,280 RAC: 0 |
The GPU tasks are the same tasks as the CPU tasks. Why should a GPU get more credits for doing the same work? It's true, they could attract more GPU crunchers by paying more credits, but since a GPU performs only a little better than a CPU on their tasks, buying credit whores with credits would be a shameful waste of resources. It's better for GPUs to stay at projects that use their full potential. By doing the morally responsible thing and not squandering our resources, A&H admins are demonstrating that they are responsible citizens of the BOINC community. BOINC FAQ Service Official BOINC wiki Installing BOINC on Linux |
Send message Joined: 16 Aug 12 Posts: 293 Credit: 1,116,280 RAC: 0 |
H.A. Soft, The time you reported for your Titan has a note in parentheses beside it... (recommended CUDA double precision disabled). That confuses me because I thought A@H GPU tasks would benefit from DP, but it appears you recommend disabling it. I was planning on hacking my GTX 660Ti to unlock its DP capability, as was discussed about a month ago, but now it seems like that would be a waste of time. BOINC FAQ Service Official BOINC wiki Installing BOINC on Linux |
Send message Joined: 3 Jan 13 Posts: 30 Credit: 1,705,200 RAC: 0 |
Last modified: 31 Dec 2013, 7:12:18 UTC |
Send message Joined: 3 Jan 13 Posts: 30 Credit: 1,705,200 RAC: 0 |
EDIT: Not quite. 46 min @ 980 MHz GPU and ~74 °C now. In comparison, a first workunit in DP mode on the Titan: 52 min @ 823 MHz GPU and ~71 °C. Keeping in mind that the shader clock is only 84%, this does indicate some speedup in double-precision mode on a clock-per-clock basis. No significant changes in GPU and memory load. |
Send message Joined: 21 Dec 12 Posts: 176 Credit: 136,462,135 RAC: 11 |
Lots of diplomatic talk about GPU vs. CPU credit..... 1. I agree with Dagorath. Same work = same credit. Yes, I know that the power consumption of GPUs compared to the credit gained is disadvantageous. 2. I think if we improve the GPU app things will get better, but we can never beat DistrRtGen or GPUGrid in credit. |
Send message Joined: 21 Dec 12 Posts: 176 Credit: 136,462,135 RAC: 11 |
Last modified: 31 Dec 2013, 11:57:49 UTC The time you reported for your Titan has a note in parentheses beside it... (recommended CUDA double precision disabled). That confuses me because I thought A@H GPU tasks would benefit from DP, but it appears you recommend disabling it. I was planning on hacking my GTX 660Ti to unlock its DP capability, as was discussed about a month ago, but now it seems like that would be a waste of time. It's because the calculation time is divided roughly half-and-half between memory loads and DP calculations. The second point is that we mostly use multiply-add ops, which are not so aggressive. |
Send message Joined: 21 Dec 12 Posts: 176 Credit: 136,462,135 RAC: 11 |
Last modified: 31 Dec 2013, 12:10:42 UTC To all GPU fans: this is the first public version, and our development does not stop here. The next version will: 1. Use less memory (for example, on the Titan: 1100 MB -> 688 MB), with a little speedup. 2. Try to move some memory arrays to textures with int2 texels, which may speed up the app. PS: It looks like we made the CPU apps too fast :-D |