Posts by w1hue

1) (Message 7275)
Posted 26 Oct 2022 by w1hue
Post:
Message sorting does not seem to work -- this thread always starts with 2021 posts regardless of what sorting
option is selected -- at least on my Windows 11 system running Firefox.
2) (Message 6818)
Posted 3 Jul 2020 by w1hue
Post:
Unfortunately, this project seems to be using the Mushroom Theory of Management!

For those who don't know about the Mushroom Theory of Management, it describes a style of management where employees or users (like mushrooms) are kept in the dark and periodically given a load of manure.

And when they start to grow, chop their heads off . . .
3) (Message 6751)
Posted 24 May 2020 by w1hue
Post:
Received a replacement GTX 1050 Ti -- running fine alongside my old GTX 750 Ti.
4) (Message 6659)
Posted 10 May 2020 by w1hue
Post:
I have decided that the board is defective and will return it for a replacement.
5) (Message 6654)
Posted 7 May 2020 by w1hue
Post:
I am running a Windows 10 machine with an NVIDIA GTX 750 Ti and recently added a second GPU, an NVIDIA GTX 1050 Ti. It is not recognized by BOINC, and Windows Device Manager shows a "Code 43" error. Windows also does not detect it as a second display. I installed a new NVIDIA driver -- no change. Tried running it with and without a "dummy" HDMI plug -- no change. It seems to be recognized by GPU-Z, but GPU-Z shows the same GPU usage for it as for the 750, which I don't believe is correct. I don't see any BIOS settings dealing with multiple GPU cards.

Here is a copy of my cc_config.xml file:
<cc_config>
  <options>
    <use_all_gpus>1</use_all_gpus>
    <exclude_gpu>
      <url>asteroidsathome.net/boinc</url>
      <device_num>1</device_num>
    </exclude_gpu>
  </options>
</cc_config>
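For reference, BOINC's exclude_gpu block also accepts optional <type> and <app> elements, which can narrow an exclusion to one GPU vendor or to a single application. The sketch below is only a guess at how that could look here -- it reuses the same project URL and device number, and the application short name is a placeholder, not necessarily the project's real one:

<cc_config>
  <options>
    <!-- Use every detected GPU, not just the one BOINC considers "best" -->
    <use_all_gpus>1</use_all_gpus>
    <exclude_gpu>
      <!-- Project this exclusion applies to -->
      <url>asteroidsathome.net/boinc</url>
      <!-- BOINC device number to exclude; the event log lists each GPU's
           device number when the client starts up -->
      <device_num>1</device_num>
      <!-- Optional: limit the exclusion to one vendor (NVIDIA, ATI, intel_gpu) -->
      <type>NVIDIA</type>
      <!-- Optional: limit the exclusion to a single application; "period_search"
           is only a placeholder for the app's short name -->
      <app>period_search</app>
    </exclude_gpu>
  </options>
</cc_config>

The file lives in the BOINC data directory, and the client picks up changes after a restart or after being told to re-read its config files.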
I don't want Asteroids to run on the 750 because it impacts the display too much.

Any suggestions??
6) (Message 4240)
Posted 24 Mar 2015 by w1hue
Post:
No need to abort 'em, just click on "Update" and they will go 'way . . .
7) (Message 4239)
Posted 24 Mar 2015 by w1hue
Post:
You didn't break the Internet, your Computer arrived at http://hmpg.net/ and is now Downloading.

Really?? My downloads are still ending in error. . .
8) (Message 4196)
Posted 21 Mar 2015 by w1hue
Post:
The last three WUs that I tried to download resulted in "Download Error". Guess others are seeing the same thing...
9) (Message 3932)
Posted 8 Jan 2015 by w1hue
Post:
Milkyway does floating point, but insists on a card that can do it. I am sure that if single-precision or integer maths was good enough, Asteroids would have used it in their apps.

My GT 520 does a good job on Milkyway tasks (they use DOUBLE PRECISION FP, BTW). Check my stats below -- all Milkyway credits were earned with the GPU.

Edit -- well forget looking at my stats in the signature line; BOINC Stats must be screwing around with user IDs again!!
10) (Message 3926)
Posted 7 Jan 2015 by w1hue
Post:
If you abort them, they will just be sent out to someone else. If they time out, they will also be sent to someone else. So you might as well abort a few, since they will probably time out eventually.
11) (Message 3924)
Posted 6 Jan 2015 by w1hue
Post:

But yet your very first post in this thread included this:
"My GT 520 ain't the fastest horse in the race, but gee. . ." I am confused! If it's about the Science, then who cares how long it takes or how many points we get, as long as GOOD Science is getting done? Science is a LIFETIME endeavor, not an overnight thing; some of these projects will NEVER end!!

I don't really care how long it takes -- if I did, I'd invest in faster hardware! I was just surprised that there seemed to be no advantage in using the GPU over the CPU for this project.
12) (Message 3922)
Posted 5 Jan 2015 by w1hue
Post:
I am currently running the projects that are of most interest to me. The ones you mentioned are of little or no interest to me. I'm not just doing it for the "points". . .

Thanks anyway.
13) (Message 3919)
Posted 5 Jan 2015 by w1hue
Post:
To be honest, I haven't really compared my GPU and CPU tasks on this project too closely. With some projects the size of each task can vary a lot, and many give GPUs larger tasks since they are able to process normal ones much quicker. However, I'm just speaking generally--don't have enough experience with this particular project to say whether that is the case or not, since it seems a lot of GPU tasks get validated against CPU ones.

Since the GPU tasks are given the same credits as the CPU tasks, I assume that they are both doing the same sort of computations.

If you're interested in increasing your GPU's output, you might have a look at some of the other GPU projects.

I am running SETI@home, Einstein@Home and Milkyway@Home tasks on the GPU because they run considerably faster than on the CPU -- except for some of the Einstein tasks, which I run (CPU versions only) on a couple of machines that don't have BOINC-compatible GPUs.
14) (Message 3916)
Posted 4 Jan 2015 by w1hue
Post:
Most of us found that out 6-7 years ago :P

Good for you. . .

This is obviously an example where using a GPU to do the calculations is of no particular advantage. So why bother. . .
15) (Message 3911)
Posted 3 Jan 2015 by w1hue
Post:
I recently completed four Asteroids@home WUs using an NVIDIA GT 520 GPU (overclocked 15%) and the average run time was 38,353 seconds. The previous four WUs were executed on the AMD Athlon 64 X2 4600+ CPU (overclocked 10%) in the same machine and had an average run time of 30,878 seconds -- so the GPU was actually about 24% slower. The GPU executes SETI, Einstein and Milkyway WUs between 6 and 8 times faster than equivalent WUs on the CPU, so I will continue running those WUs on the GPU and this project's WUs on the CPU.

My GT 520 ain't the fastest horse in the race, but gee. . .
16) (Message 3899)
Posted 29 Dec 2014 by w1hue
Post:
Why not just abort them?
17) (Message 3665)
Posted 2 Oct 2014 by w1hue
Post:
The same thing has been posted in the "News" forum. I'm wondering if anyone associated with the project has time to read what's posted here. . .
18) (Message 3646)
Posted 27 Sep 2014 by w1hue
Post:
My computers have been receiving project-initiated scheduling requests every five minutes for the past few days. What's going on??
19) (Message 3603)
Posted 12 Sep 2014 by w1hue
Post:
I just completed my first NVIDIA work unit -- it took 14 hours to run and I got 480 points credit -- the SAME credit I have been getting for CPU WUs that take 9-10 hours to run!! My GPU (GT 520) ain't the fastest horse in the race, but it executes SETI and Einstein WUs at least 8 times faster than similar WUs running on my AMD dual-core CPU. So ... I will let the current GPU WU complete and not run any more!
20) (Message 3230)
Posted 13 Jun 2014 by w1hue
Post:
I see very low CPU usage & high GPU load. Perfect!

GPU usage on my GT 520 is indeed high -- 99% -- which is good.

But -- it appears to me that CPU usage is actually much higher than the "0.01 CPUs" indicated by BOINC Manager. When the GPU task is running, Windows Task Manager shows high "kernel times" -- mostly due, apparently, to explorer.exe running and using on the order of 20-30% CPU time (on a dual-core AMD). I see the same sort of thing happening with Einstein@Home BRP GPU tasks but NOT with SETI@home or Milkyway@home GPU tasks.

Hmmm...