Support for ATI Radeon GPUs


Un4Seen
Joined: 7 Dec 12
Posts: 87
Credit: 3,206,012
RAC: 127
Message 2973 - Posted: 5 May 2014, 21:15:55 UTC - in response to Message 2971.  
Yep, you are right. Unfortunately I've never measured the electricity draw of my 8-core PC, but I suspect that it consumes between 150 and 200 watts. Of course it can be used for a lot of things the TV stick cannot, so if we look at it that way, the normal PC is the better choice.

I think the good old rule that higher integration leads to lower computation cost still holds, meaning that if you buy a muscular desktop CPU, computation will be cheaper. But in my case I managed to get the TV sticks really cheaply from China, so their cost efficiency is probably quite close to that of the desktop PC if we think of running them for only a few years (electricity is surprisingly cheap compared to other energy sources, so it will take a long time for the desktop PC's somewhat better power efficiency to offset the ARM devices' lower purchase price).

Anyway, my point was that ARM devices are getting strong these days and that they are a viable option for running BOINC :)
ID: 2973
[TA]Assimilator1
Joined: 24 Aug 13
Posts: 111
Credit: 31,239,773
RAC: 7,452
Message 2978 - Posted: 6 May 2014, 16:44:58 UTC
Fair enough :)
Incidentally, electricity might be cheap in your country, but it certainly isn't here in the UK. How much do you pay per unit?

Btw, despite what BOINC says, your CPU isn't an 8-core CPU, it's a 4-core CPU with Hyper-Threading (i7 2600 info: http://ark.intel.com/products/52213). Still, it's a great CPU & much quicker than mine! ;)
Team AnandTech - SETI@H, Muon1 DPAD, Folding@H, MilkyWay@H, Asteroids@H, LHC@H, POGS, Rosetta@H, Einstein@H,DHPE & CPDN

Main rig - Ryzen 3600, 32GB DDR4 3200, RX 580 8GB, Win10
2nd rig - i7 4930k @4.1 GHz, 16GB DDR3 1866, HD 7870 XT 3GB(DS), Win7
ID: 2978
Un4Seen
Joined: 7 Dec 12
Posts: 87
Credit: 3,206,012
RAC: 127
Message 2979 - Posted: 6 May 2014, 18:00:56 UTC

Last modified: 6 May 2014, 18:01:44 UTC
I pay about 0.12 Euros per kWh. Just out of curiosity, how much does it cost in the UK?

Yes, indeed, my strong CPU has 4 physical cores and only appears to have 8 because of Hyper-Threading.

I re-checked my calculations and my strong PC can complete fewer tasks than I originally thought: about 100 tasks in 42 hours. The ARM cluster completes 24 tasks in 42 hours. But the ARM cluster runs 24/7 while the strong PC runs about 8 hours a day, so, in a sense, the ARM cluster will generate about 72% as much credit as my strongest PC (once the TV sticks arrive from China, that is) :)
http://iqjar.com
ID: 2979
[TA]Assimilator1
Joined: 24 Aug 13
Posts: 111
Credit: 31,239,773
RAC: 7,452
Message 2985 - Posted: 7 May 2014, 17:25:28 UTC - in response to Message 2979.  
I pay 14p/kWh, which at today's exchange rate is €0.17/kWh, so that's nearly 42% more! :(

Btw I'm curious, why did you pick a 42-hour period? ;)
It would be interesting to see how much your PC draws; you know you can get plug-in wall power meters for not too much. Mine cost £25, but that was 8 yrs ago ;).
Team AnandTech - SETI@H, Muon1 DPAD, Folding@H, MilkyWay@H, Asteroids@H, LHC@H, POGS, Rosetta@H, Einstein@H,DHPE & CPDN

Main rig - Ryzen 3600, 32GB DDR4 3200, RX 580 8GB, Win10
2nd rig - i7 4930k @4.1 GHz, 16GB DDR3 1866, HD 7870 XT 3GB(DS), Win7
ID: 2985
Un4Seen
Joined: 7 Dec 12
Posts: 87
Credit: 3,206,012
RAC: 127
Message 2988 - Posted: 7 May 2014, 18:19:34 UTC - in response to Message 2985.  
I picked 42 hours as the reference period because that's how long it takes a TV stick to finish 4 tasks running simultaneously, no other reason :)

I'll try to measure the power draw of the strong PC one day. I do have the measuring tool, I just have to take it to work, because that's where the strong PC is.
ID: 2988
Un4Seen
Joined: 7 Dec 12
Posts: 87
Credit: 3,206,012
RAC: 127
Message 2992 - Posted: 8 May 2014, 7:12:29 UTC

Last modified: 8 May 2014, 7:13:15 UTC
OK, I've done the measurements. My strong PC, which can process about 100 A@H tasks in 42 hours, consumes about 170-175 W when running BOINC at full load and 95-100 W when idle. On the other hand, the ARM cluster, which completes 24 tasks in 42 hours, consumes 30-35 W. Assuming I had 4 such ARM clusters, processing 96 tasks in 42 hours (let's consider that equal to the 100 tasks of the strong PC), the total ARM power consumption would be 120-140 W, so it's actually more energy efficient than the strong PC! Also less noisy, cheaper (assuming you catch some really good deals from China, like I did) and more fascinating :)

An interesting side calculation is that running BOINC on the strong PC adds about 75 watts of draw. According to the measuring device's estimate, that's a difference of 10-12 euros per month :) But given that I only run it for 8 hours per day, it's at most 4 euros per month. I consider this an investment in the future of mankind, so I don't mind :)
ID: 2992
[TA]Assimilator1
Joined: 24 Aug 13
Posts: 111
Credit: 31,239,773
RAC: 7,452
Message 2994 - Posted: 8 May 2014, 18:48:27 UTC

Last modified: 8 May 2014, 18:49:43 UTC
Cool! :) Btw, have you got a link describing those TV sticks? I've got no idea what they are, lol.

So your PC does ~57 tasks/day whilst the ARM cluster does ~55/day (no need to round up :P), which with A@H's bizarre fixed credit per task regardless of its length (on the same h/w) makes it easy to work out watts per unit of credit. Taking the average power draw of ~173 W for the PC & ~130 W for the ARM cluster, that's 3.035 W per task/day for the PC & 2.363 W per task/day for the ARM cluster, so yep, it's much more efficient! Forgot to say at the start btw, cool project :).
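For anyone who wants to re-run that arithmetic, here is a quick Python sketch using the round figures quoted in this thread (the task counts and wattages are the posted values, not fresh measurements; swap in your own numbers):

# Rough throughput/power comparison from the numbers posted in this thread.
HOURS = 42.0  # reference period used throughout the thread

def tasks_per_day(tasks_per_period):
    return tasks_per_period / HOURS * 24.0

pc_rate = tasks_per_day(100)   # i7 PC: ~100 tasks per 42 h -> ~57 tasks/day
arm_rate = tasks_per_day(96)   # 4 ARM clusters: ~96 tasks per 42 h -> ~55 tasks/day
pc_watts, arm_watts = 173.0, 130.0   # average draw reported in the posts above

print(f"PC : {pc_rate:.1f} tasks/day, {pc_watts / pc_rate:.2f} W per daily task")
print(f"ARM: {arm_rate:.1f} tasks/day, {arm_watts / arm_rate:.2f} W per daily task")
# Prints roughly 3.0 W per daily task for the PC and 2.4 W for the ARM setup,
# consistent with the figures worked out above.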

One question though: are those TV sticks designed to run fully loaded 24/7? I doubt it, and I wonder how long they'd last like that. Keep us posted! ;)

Btw, your PC is pretty efficient in my book anyway; my old PC (the main rig) draws 190 W running A@H on all 4 cores!

So yeah, your ARM cluster is more efficient, at least whilst GPU crunching isn't big in A@H. I think if you did a comparison in MW@H with the ARM cluster vs mid-to-high-end AMD GPUs, you might find a different picture.

(There, I ended that by going back to GPUs ;) )
Team AnandTech - SETI@H, Muon1 DPAD, Folding@H, MilkyWay@H, Asteroids@H, LHC@H, POGS, Rosetta@H, Einstein@H,DHPE & CPDN

Main rig - Ryzen 3600, 32GB DDR4 3200, RX 580 8GB, Win10
2nd rig - i7 4930k @4.1 GHz, 16GB DDR3 1866, HD 7870 XT 3GB(DS), Win7
ID: 2994
Un4Seen
Joined: 7 Dec 12
Posts: 87
Credit: 3,206,012
RAC: 127
Message 2995 - Posted: 8 May 2014, 20:20:09 UTC - in response to Message 2994.  

Last modified: 8 May 2014, 20:23:29 UTC
Search for Android TV sticks such as the MK908, MK908II, CX-919, etc. They are really cool small ARM computers that you hook up to your TV through the HDMI port, basically turning the TV into a "smart TV" by having a Wi-Fi-enabled computer connected to it. They really are cool for the purpose they were designed for, but the trick is that some newer ones (like those mentioned above) are based on the RK3188 chipset, which is really strong for an ARM SoC (quad-core, 1.4-1.8 GHz, 2 GB RAM). So that's how I use their computing power in BOINC :)

Those TV sticks are only partially designed for constant load, in the sense that they must be able to play full-HD movies for hours without problems. They've got some quite strong GPUs in them, btw (nothing compared to those in x86 PCs, of course). So what happens if you try to run BOINC on them 24/7? They start heating up. Some models behave better, others get very hot. The worst that can happen is that when the chipset reaches around 80 degrees Celsius, thermal throttling kicks in and shuts down some cores until they cool down a bit. That ruins BOINC efficiency, so what you want to do is cool them.

You can try taking off the small passive heat sink and replacing it with a huge one, but that obviously does not fit in the USB-dongle-sized stick, so you would have to cut the case open. What I do instead is attach a small VGA cooler to the outside, which constantly pulls fresh air through the TV stick's ventilation holes. Works like a charm. I use resistors to slow down the VGA cooler, so it rotates very slowly and is virtually noiseless, but it still keeps air flowing through the TV stick's case and BOINC crunches away happily :)
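If you want to catch throttling before it eats into your BOINC throughput, you can watch the SoC temperature yourself. Here is a minimal Python sketch that polls the standard Linux /sys/class/thermal interface; it assumes you have a Python interpreter and shell access on the stick (e.g. via a terminal app or adb), and the zone paths and the ~80 °C threshold are device-dependent guesses, not facts about any particular model:

# Rough temperature watcher for an ARM SoC that exposes the standard Linux
# thermal_zone interface. Paths and the throttling threshold vary by device,
# so treat the values below as placeholders.
import glob
import time

THROTTLE_GUESS_C = 80.0  # the throttling point mentioned above, roughly

def read_temps():
    temps = {}
    for path in glob.glob("/sys/class/thermal/thermal_zone*/temp"):
        try:
            with open(path) as f:
                raw = int(f.read().strip())
        except (OSError, ValueError):
            continue
        # Most kernels report millidegrees Celsius; some report whole degrees.
        temps[path] = raw / 1000.0 if raw > 1000 else float(raw)
    return temps

if __name__ == "__main__":
    while True:
        for zone, celsius in read_temps().items():
            warning = "  <-- near the throttle point?" if celsius >= THROTTLE_GUESS_C else ""
            print(f"{zone}: {celsius:.1f} C{warning}")
        time.sleep(30)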

Check out this video when you have time:
https://www.youtube.com/watch?v=9c-5mO5Gmjc
The narration is in Romanian, but you'll get the idea anyway.


I must agree that the best performance (and probably also power efficiency) would be obtained with super-strong GPUs. It's a great thing that A@H supports NVidia, but, as luck would have it, I have ATI Radeon cards and built-in Intel graphics :) And that's how this whole thread was born :P
ID: 2995
mikey
Joined: 1 Jan 14
Posts: 300
Credit: 32,274,317
RAC: 7,252
Message 2996 - Posted: 9 May 2014, 10:59:23 UTC - in response to Message 2995.  
Have you looked at one of these yet:


It will let you plug up to 3 devices into a single HDMI port.
ID: 2996
Un4Seen
Joined: 7 Dec 12
Posts: 87
Credit: 3,206,012
RAC: 127
Message 2997 - Posted: 9 May 2014, 11:07:45 UTC - in response to Message 2996.  
Nice!

I personally don't need such a cable. I set up the TV sticks in a matter of minutes, install a VNC server on them, and from that moment on I run them in headless mode. The only cable connected to them is the power cord :)

BTW, I'll take this moment to thank Kyong for implementing the Android support for A@H! Without that there would be no A@H ARM cluster for me :)
ID: 2997
mikey
Joined: 1 Jan 14
Posts: 300
Credit: 32,274,317
RAC: 7,252
Message 3008 - Posted: 10 May 2014, 11:00:48 UTC - in response to Message 2997.  
Don't you have to plug them into an HDMI port for them to work, though?

And yes, several project admins have worked hard to implement Android support; they all deserve our thanks!
ID: 3008
Un4Seen
Joined: 7 Dec 12
Posts: 87
Credit: 3,206,012
RAC: 127
Message 3011 - Posted: 10 May 2014, 13:30:59 UTC - in response to Message 3008.  
Nope, the HDMI port is only needed if you want to see what it does on some kind of screen (computer monitor or TV). It works without it too; you just use VNC to log onto it. An exception is the MK908II, on which VNC does not work (a bug).
ID: 3011
mikey
Joined: 1 Jan 14
Posts: 300
Credit: 32,274,317
RAC: 7,252
Message 3019 - Posted: 11 May 2014, 10:53:18 UTC - in response to Message 3011.  
That's pretty cool, thanks!
ID: 3019
Un4Seen
Joined: 7 Dec 12
Posts: 87
Credit: 3,206,012
RAC: 127
Message 3028 - Posted: 15 May 2014, 21:16:30 UTC
I was just thinking that soon I might have about $150 to invest in new hardware. I was wondering what would be worth buying from the perspective of running A@H tasks.

I could either upgrade one of my dual-core second-generation Intel CPUs (currently clocked at 2.6 GHz) to a third-generation Core i5 (4 cores at 3 GHz), or I could buy a lower-end NVidia GPU, for example a GeForce GTX 650, GTX 650 Ti, GTX 750, GTX 660 DirectCU, etc.

Would one of these GPUs be faster at crunching than the slightly faster CPU and the additional 2 CPU cores? If yes, how much faster (approximately)?

A@H works on NVidia GPUs with CUDA 5.5 or greater, but I have no idea how to tell whether a GPU supports CUDA 5.5 or greater...
Also, it might be worth waiting for the AMD GPU support. Some say that AMD GPUs are much better, and if the A@H support is implemented well, it will leave NVidia GPUs biting the dust.
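For what it's worth, CUDA 5.5 is a driver/toolkit version rather than a property of the card itself, so the practical check is whether the installed NVIDIA driver is recent enough and the GPU appears on NVIDIA's CUDA GPU list. Here is a small Python sketch that simply reports what nvidia-smi sees (it assumes the NVIDIA driver, and therefore nvidia-smi, is installed; it does not decide CUDA 5.5 support on its own):

# Report the detected NVIDIA GPUs and the installed driver version.
# Whether that driver supports CUDA 5.5 still has to be checked against
# NVIDIA's release notes -- this only shows what is installed.
import subprocess

def nvidia_info():
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=name,driver_version", "--format=csv,noheader"],
        text=True,
    )
    for line in out.strip().splitlines():
        name, driver = (part.strip() for part in line.rsplit(",", 1))
        print(f"GPU: {name}, driver: {driver}")

if __name__ == "__main__":
    try:
        nvidia_info()
    except (OSError, subprocess.CalledProcessError):
        print("nvidia-smi not found or no NVIDIA GPU detected")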

Any opinions?
Thanks!
http://iqjar.com
ID: 3028
alexander
Joined: 28 Apr 13
Posts: 87
Credit: 26,716,176
RAC: 9
Message 3032 - Posted: 15 May 2014, 21:40:36 UTC - in response to Message 3028.  
I did some WUs on my GTX 750 Ti, with runtimes between 4,038 and 5,081 sec.
Runtimes of CPU WUs:
avx 5,337
sse3 6,054
sse2 5,081
(i7 3770 @ 3.4 GHz, Win7 x64, 8 GB, GTX 750 Ti)

So for A@H, a CPU with more cores will help you more for the money you have.
ID: 3032
Un4Seen
Joined: 7 Dec 12
Posts: 87
Credit: 3,206,012
RAC: 127
Message 3033 - Posted: 15 May 2014, 21:45:26 UTC - in response to Message 3032.  
Thank you!
I'm a bit surprised... I thought GPUs were much better at this.
ID: 3033
mikey
Joined: 1 Jan 14
Posts: 300
Credit: 32,274,317
RAC: 7,252
Message 3039 - Posted: 16 May 2014, 10:51:02 UTC - in response to Message 3028.  

Last modified: 16 May 2014, 10:51:45 UTC
If it were me, I would buy the quad-core CPU, then buy the better GPU later on. There are LOTS of BOINC projects but only a small handful that can use GPUs, so getting a quad-core will make you compatible with 99% of the BOINC projects out there. Yes, a CPU is slower than a GPU in most cases, but no project is likely to announce 'the cure' for anything right now. Nor are they likely to announce a newly found 'planet-destroying asteroid' either. That means we are in it for the long term, and while it is fun and very helpful to crunch faster, more power means they can do more with our hardware, not just search faster. Crunching fast means they could have seen the asteroid that whizzed by the Earth last week; crunching with better hardware means they KNEW FOR CERTAIN that it would miss us.
ID: 3039
mikey
Joined: 1 Jan 14
Posts: 300
Credit: 32,274,317
RAC: 7,252
Message 3040 - Posted: 16 May 2014, 10:57:33 UTC - in response to Message 3033.  

Last modified: 16 May 2014, 10:58:14 UTC
Thank you!
I'm a bit surprised... I thought GPUs were much better at this.


They are, but at your price range the pickings are slim.

Here is a small chart I made up for myself:
Nvidia GTX 560Ti-yes 384
Nvidia 650Ti-yes 768
Nvidia 660-yes 1344
Nvidia 670-yes 1344
Nvidia 680-yes 1536
Nvidia 690-yes 3072
Nvidia 750-yes 512
Nvidia 750Ti-yes 640
Nvidia 760-yes 1152
Nvidia 770-yes 1536
Nvidia 780-yes 2880
Nvidia 790-yes 3072

AMD 5770-no 800
AMD 5870-yes 1600
AMD 6770-no 800
AMD 6850-no 960
AMD 6870-no 1120
AMD 6950-yes 1408
AMD 6970-yes 1536
AMD 7750-yes 512
AMD 7770-yes 640
AMD 7790-yes 896
AMD 7850-yes 1024
AMD 7870-yes 1280
AMD 7950-yes 1792
AMD 7970-yes 2048
AMD 7990-yes 4096
AMD R7 260x 896
AMD R9 270x 1280
AMD R9 280x 2048
AMD R9 290x 2816

'Yes' means double-precision capable, while the number is the count of stream processors or CUDA cores. The numbers are NOT comparable between AMD and Nvidia, ONLY within each brand. Obviously the better the card, the more it costs; I just got an Nvidia 760 on sale for 250 US dollars, and the better cards can easily cost $400 and up.
ID: 3040
Un4Seen
Joined: 7 Dec 12
Posts: 87
Credit: 3,206,012
RAC: 127
Message 3041 - Posted: 16 May 2014, 11:06:50 UTC - in response to Message 3039.  

Last modified: 16 May 2014, 11:08:00 UTC
Mikey, I must say that you are right! Very right!

We're in it for the very long term. As hardware gets better over the years, projects like A@H will be able to implement more complex algorithms that do more than the one running today. I just hope they can get enough developers for the job. It seems to me that the weak point of A@H today is the scarcity of manpower. There are many people willing to crunch, but only a few extremely busy developers and admins who can make new things happen. Or at least that's the feeling I get... Hopefully they will be able to implement more complex algorithms in the future and make better use of stronger hardware. Most likely the CPU apps will be the first to change; GPUs are too diverse, and the more different hardware you support, the harder it is to roll out changes across all of the apps.

I'm not really considering running other BOINC projects in the near future; A@H is the one I believe in most. Perhaps MilkyWay@Home or Einstein@Home could have a chance to run on my machines, but I support A@H more because it needs my support the most.

Asteroids@Home operates at around 130-140 GFLOPS,
MilkyWay@Home operates at around 500 GFLOPS,
Einstein@Home operates at around 1,100,000 GFLOPS,

so yes, A@H needs my help most :)
ID: 3041
Un4Seen
Joined: 7 Dec 12
Posts: 87
Credit: 3,206,012
RAC: 127
Message 3042 - Posted: 16 May 2014, 11:10:12 UTC - in response to Message 3040.  
Thanks for the GPU list, Mikey! Really helpful, not just for me, but for other readers as well!
ID: 3042