Posts by Shadow.SETI.USA [TopGun]

1) Message boards : News : Switched to granting static credit (Message 283)
Posted 12 May 2011 by Shadow.SETI.USA [TopGun]
Post:
Yep, being penalized for having fairly new, fairly top-of-the-line hardware doesn't sit well.
2) Questions and Answers : Windows : Long run times (Message 190)
Posted 9 May 2011 by Shadow.SETI.USA [TopGun]
Post:
Running win7x64 on dual 6970's, working great.
3) Message boards : Wish list : BOINC stats (Message 178)
Posted 9 May 2011 by Shadow.SETI.USA [TopGun]
Post:
I emailed Willy this morning about getting Moo! added to BoincStats. He said he will add it as soon as he gets everything back in working order again. His server crashed yesterday.
4) Questions and Answers : Windows : Multi-GPU task takes longer than single-gpu task (Message 158)
Posted 7 May 2011 by Shadow.SETI.USA [TopGun]
Post:
I haven't had the long work units like others have had. My dual 6970's are doing them in 8 minutes and 14 seconds. It's just weird how some duals are working fine and others are taking hours to do a work unit.
5) Questions and Answers : Windows : Multi-GPU task takes longer than single-gpu task (Message 112)
Posted 5 May 2011 by Shadow.SETI.USA [TopGun]
Post:
As someone mentioned, we would DEARLY love to have the project crunch on individual GPUs rather than using all of them in the machine. It is more efficient for the computers, and more work would get done for the project.


Hi,

I understand that request but unfortunately this is not up to me. Distributed.net Client doesn't allow selecting which GPU they use and they detect and use all of them. Until they support this switch, there's nothing much I can do. Sorry. :(

-w

Is there a way to change the number of packets in each work unit? If they all had an even number of packets, it would at least eliminate one of the GPUs idling in a multi-GPU setup.
6) Questions and Answers : Windows : Multi-GPU task takes longer than single-gpu task (Message 84)
Posted 4 May 2011 by Shadow.SETI.USA [TopGun]
Post:
Assuming this project is built on DNETC sources: was something changed in the ATI application?


I'm using Distributed.net Client without any changes just like they are on their download site. I don't have their sources so I can't build a custom client (there's also no need for such thing).

One of the two GPUs only gets back up to 50% usage after it finishes the first segment (block of work). Previously, both GPUs ran at over 90% usage the whole time, except at the end, when the faster GPU has to wait for the slower one to finish its last segment.


Interesting, as it should behave like the latter. That's also how it works on my own system (and on many others too, I would hope). This could be a bug in the dnet client, but there are also some changes I can make in my wrapper code to see if things improve.

I believe this is also something dnetc@home battled with in some earlier versions of their apps. Not sure if they ever got it working, since I obviously didn't have that problem. :)

-w

I think the problem with Dnetc was that some work units had an odd number of packets, so one GPU would still be working while the other had nothing to do. Then it took a few work units for the other one to kick back in again. So a WU with 12 packets would run normally with 2 GPUs, yet one with 11 or 13 packets would cause at least one GPU (if not both) to hang.
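
To make the parity issue concrete, here is a minimal sketch of the arithmetic. It is purely illustrative: the two equally fast GPUs, the 60-second packet time, and the round-robin assignment are assumptions for the example, not the actual Distributed.net scheduler.

    # Illustrative only: assumes two equally fast GPUs, one packet per GPU at a
    # time, and round-robin assignment. This is NOT the real dnetc scheduler.
    def finish_times(packets, gpus=2, seconds_per_packet=60):
        busy = [0] * gpus                      # total seconds of work per GPU
        for i in range(packets):
            busy[i % gpus] += seconds_per_packet
        return busy

    for packets in (11, 12, 13):
        times = finish_times(packets)
        print(packets, "packets ->", times, "| idle tail:", max(times) - min(times), "s")

    # 12 packets: both GPUs finish together.
    # 11 or 13 packets: one GPU sits idle for a full packet's worth of time
    # at the end of every work unit.

With an even packet count the two GPUs finish together; with an odd count one of them idles at the tail of every work unit, which matches the behaviour described above.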
7) Questions and Answers : Windows : High CPU usage (Message 71)
Posted 3 May 2011 by Shadow.SETI.USA [TopGun]
Post:
I just ran into the same type of problem. Work units were running for 45 minutes with no GPU activity, and the CPU was getting hammered hard. Did a reboot and started a fresh WU; same thing. It was working fine until an hour ago or so.
8) Questions and Answers : Windows : Low credits on 6970's (Message 24)
Posted 2 May 2011 by Shadow.SETI.USA [TopGun]
Post:
I'm seeing other participants get between 2K and 4K credits per work unit. I'm running 6970's on Windows 7 64-bit with Catalyst 11.3 and BOINC version 6.10.58, and I'm only getting between 300 and 400 credits per work unit.

Copyright © 2011-2024 Moo! Wrapper Project