Cannot ignore 2nd ATI on Moo


Questions and Answers : Windows : Cannot ignore 2nd ATI on Moo

Profile Mad Matt
Joined: 7 May 11
Posts: 4
Credit: 61,768,083
RAC: 0
Message 755 - Posted: 3 Jul 2011, 21:39:55 UTC

I set BOINC to ignore ATI 0 (ATI 5450) and crunch only on ATI 1 (ATI 5870), via <ignore_ati_dev>0</ignore_ati_dev> in the cc_config file. This works wonderfully on MW, but lately I've noticed strange crunching times and CPU usage when running Moo! on that rig.

So I checked back and found that ATI 0 gets load from Moo. The DNETC benchmark also seems to check both cards and gives the usual #3 recommendation for dual cards.

Is there any way to fix this and truly exclude device 0 from running on Moo?

Cheers
Profile Zydor
Joined: 5 May 11
Posts: 233
Credit: 351,414,150
RAC: 0
Message 756 - Posted: 3 Jul 2011, 23:10:16 UTC - in response to Message 755.  

It may be possible, but I doubt it. Moo is a multi-threaded GPU app and will therefore use all available GPUs. The application's instructions for its threads sit a layer (or more) below the BOINC setting, so the BOINC directive regarding device 0 will have no effect.

Regards
Zy
Profile Mad Matt
Joined: 7 May 11
Posts: 4
Credit: 61,768,083
RAC: 0
Message 767 - Posted: 6 Jul 2011, 12:48:20 UTC - in response to Message 756.  
Last modified: 6 Jul 2011, 12:49:01 UTC

It may be possible, but I doubt it. Moo is a multi-threaded GPU app and will therefore use all available GPUs. The application's instructions for its threads sit a layer (or more) below the BOINC setting, so the BOINC directive regarding device 0 will have no effect.

Regards
Zy


Cheers, Zydor.

I may try excluding the GPU for the project explicitly, but I'm afraid this happens at the same level and will have no effect. So probably the only way to keep one ATI idle is to pair it with an Nvidia GPU.
Profile Teemu Mannermaa
Project administrator
Project developer
Project tester
Joined: 20 Apr 11
Posts: 388
Credit: 822,356,221
RAC: 0
Message 773 - Posted: 8 Jul 2011, 9:19:09 UTC

Hi,

Unfortunately, the Distributed.net Client that we use is designed to detect and use every card it can find. At least it's limited to either ATI or nVidia cards, depending on the type of the app.

The client does have a -numcpu command line option and related ini-file settings. However, I'm not sure whether those are effective for GPUs or only for CPU crunching. Regardless, there is no way to select which device it runs on, so one can only "ignore" devices at the end of the detection chain.

-w
Profile Beyond
Joined: 18 May 11
Posts: 46
Credit: 1,254,302,893
RAC: 0
Message 792 - Posted: 13 Jul 2011, 16:43:02 UTC - in response to Message 773.  

The client does have a -numcpu command line option and related ini-file settings. However, I'm not sure whether those are effective for GPUs or only for CPU crunching. Regardless, there is no way to select which device it runs on, so one can only "ignore" devices at the end of the detection chain. -w

I tried the -numcpu switch a while ago and it didn't work for GPUs. Is the code available for the distributed.net client? If so would it be possible to remove the "use all GPUs" part?
Ageless
Joined: 2 May 11
Posts: 4
Credit: 75,588
RAC: 0
Message 805 - Posted: 16 Jul 2011, 5:44:47 UTC - in response to Message 755.  

I have been testing this and I cannot reproduce it. The <ignore_ati_dev>0</ignore_ati_dev> does ignore my only ATI GPU on this project; I only request, and actually do get, work for the CPU.

Perhaps your BOINC version doesn't fully support this; you should try 6.12.33 to see whether that one properly honours your ignore preference. I'm not sure at what level 6.10 ignored the GPU, but 6.12 does it immediately at the detection level.

Like so:
16/07/2011 07:24:00 | | ATI GPU 0 (ignored by config): ATI Radeon HD 4700/4800 (RV740/RV770) (CAL version 1.4.1332, 1024MB, 1000 GFLOPS peak)

Jord

Used to be a single voice that vanished in a crowd. Vague just like a distant sun when hidden by the clouds.
Found a way to surface and to speak my truth aloud. Be powerful. Stand fast and proud.
Profile Beyond
Joined: 18 May 11
Posts: 46
Credit: 1,254,302,893
RAC: 0
Message 806 - Posted: 16 Jul 2011, 12:08:30 UTC - in response to Message 805.  

I have been testing this and I cannot reproduce it. The <ignore_ati_dev>0</ignore_ati_dev> does ignore my only ATI GPU on this project; I only request, and actually do get, work for the CPU.

To completely ignore ATI, NVidia or CPU, all you have to do is uncheck the appropriate box in the project preferences. His problem is that once the dnet client starts an ATI WU, it detects and grabs both GPUs at the client level. Since you're ignoring your only ATI, the BOINC manager isn't downloading ATI WUs at all, so there's no problem in your case. It's very different.
Ageless
Joined: 2 May 11
Posts: 4
Credit: 75,588
RAC: 0
Message 807 - Posted: 16 Jul 2011, 13:38:58 UTC - in response to Message 806.  

Ah, OK, that explains that. Thanks.
But really, if other projects such as DNETC can use the distributed.net programs while honouring BOINC's preferences, why can't Moo! Wrapper do that?

Just here to learn, not to yell.
Jord

Used to be a single voice that vanished in a crowd. Vague just like a distant sun when hidden by the clouds.
Found a way to surface and to speak my truth aloud. Be powerful. Stand fast and proud.
Profile Beyond
Joined: 18 May 11
Posts: 46
Credit: 1,254,302,893
RAC: 0
Message 809 - Posted: 16 Jul 2011, 23:03:38 UTC - in response to Message 807.  

But really, if other projects such as DNETC can use the distributed.net programs while honouring BOINC's preferences, why can't Moo! Wrapper do that?

AFAIK no one has been able to run WUs on 1 ATI GPU in a machine with 2 or more ATIs at DNETC either. It was also a big issue over there. The difference is that Teemu has been forthcoming with good explanations on the subject rather than trying to cover up or ignore it.
Extra Ball
Joined: 12 Jun 11
Posts: 2
Credit: 50,013,129
RAC: 0
Message 829 - Posted: 23 Jul 2011, 22:43:11 UTC - in response to Message 755.  

Is there any way to fix this and truly exclude device 0 from running on Moo?

See my old post at DNETC forum here

Try method 2/ (load as many WUs as possible with only 1 video card plugged in before you start BOINC Manager), then restart BOINC Manager with both GPUs connected/recognized, together with the exclusion tag in the config file.
This should work fine...

At least, it works for me (just checked) with the following config file:

<cc_config>
  <log_flags>
  </log_flags>
  <options>
    <ignore_ati_dev>0</ignore_ati_dev>
    <no_gpus>0</no_gpus>
    <use_all_gpus>1</use_all_gpus>
  </options>
</cc_config>
Profile Mad Matt
Joined: 7 May 11
Posts: 4
Credit: 61,768,083
RAC: 0
Message 831 - Posted: 24 Jul 2011, 11:14:06 UTC - in response to Message 829.  

Is there any way to fix this and truly exclude device 0 from running on Moo?

See my old post at DNETC forum here

Try method 2/ (load as many WUs as possible with only 1 video card plugged in before you start BOINC Manager), then restart BOINC Manager with both GPUs connected/recognized, together with the exclusion tag in the config file.
This should work fine...

At least, it works for me (just checked) with the following config file:

<cc_config>
  <log_flags>
  </log_flags>
  <options>
    <ignore_ati_dev>0</ignore_ati_dev>
    <no_gpus>0</no_gpus>
    <use_all_gpus>1</use_all_gpus>
  </options>
</cc_config>


Cheers, Extra Ball, this sounds like a plan. I'm just curious how this affects CPU load and core selection. For now I've removed the 2nd card, but I'll test this when an opportunity arises.
Profile Beyond
Joined: 18 May 11
Posts: 46
Credit: 1,254,302,893
RAC: 0
Message 832 - Posted: 24 Jul 2011, 12:48:29 UTC - in response to Message 829.  

Try method 2/ (load as many WUs as possible with only 1 video card plugged in before you start BOINC Manager), then restart BOINC Manager with both GPUs connected/recognized, together with the exclusion tag in the config file.
This should work fine...

2/ If you start BM with only 1 GPU plugged THEN load as many WUs as desired, each of them will only require 0.4 CPU + 1 GPU in the future (even if BM is restarted with 2 cards installed)
In this case, you can process 2 WUs in parallel without problem and the CPU usage is less than 1% per task. You can even mix DNETC WUs with Collatz ones (didn't try with Milky but I guess this should work as well)

So: shut down and remove one of the video cards every time we need to download work, get the WUs, then shut down again, put the 2nd GPU back and restart. Ouch. It might work, but it's a PITA, and over time it can't be good for the machine either.
Extra Ball
Joined: 12 Jun 11
Posts: 2
Credit: 50,013,129
RAC: 0
Message 833 - Posted: 24 Jul 2011, 16:35:28 UTC - in response to Message 832.  

So: shut down and remove one of the video cards every time we need to download work, get the WUs, then shut down again, put the 2nd GPU back and restart. Ouch. It might work, but it's a PITA, and over time it can't be good for the machine either.

Just to be clear: you don't have to physically remove the video card from the PC; just disconnect the video cable (VGA, HDMI or DVI) before launching BOINC Manager, so only 1 card is detected at startup.
Profile Beyond
Joined: 18 May 11
Posts: 46
Credit: 1,254,302,893
RAC: 0
Message 836 - Posted: 24 Jul 2011, 20:07:20 UTC - in response to Message 833.  

So: shut down and remove one of the video cards every time we need to download work, get the WUs, then shut down again, put the 2nd GPU back and restart. Ouch. It might work, but it's a PITA, and over time it can't be good for the machine either.

Just to be clear: you don't have to physically remove the video card from the PC; just disconnect the video cable (VGA, HDMI or DVI) before launching BOINC Manager, so only 1 card is detected at startup.

Ah, that makes it a lot easier. Might give it a try. Thanks for the explanation.
Profile Beyond
Joined: 18 May 11
Posts: 46
Credit: 1,254,302,893
RAC: 0
Message 855 - Posted: 29 Jul 2011, 17:43:07 UTC - in response to Message 833.  

Just to be clear: you don't have to physically remove the video card from the PC; just disconnect the video cable (VGA, HDMI or DVI) before launching BOINC Manager, so only 1 card is detected at startup.

Tried it, and the Moo! client still grabbed both cards. I downloaded the WUs with only 1 GPU detected by BOINC and restarted after replugging the 2nd GPU. Unfortunately, no go.
Profile Teemu Mannermaa
Project administrator
Project developer
Project tester
Joined: 20 Apr 11
Posts: 388
Credit: 822,356,221
RAC: 0
Message 860 - Posted: 31 Jul 2011, 17:43:11 UTC - in response to Message 792.  

I tried the -numcpu switch a while ago and it didn't work for GPUs. Is the code available for the distributed.net client? If so would it be possible to remove the "use all GPUs" part?


Actually, you can't use the command line switch, since you'd need to give it to the client part, but you can only give it to the wrapper part, which ignores it. Sorry about adding to your confusion there. :(

However, you can use the config file option to make the distributed.net client ignore cards. Unfortunately, it can only ignore cards from the end of the list, so ignoring only the first card is impossible with the current code.

The way to do this is to use the Anonymous platform and a custom .ini file for the client. Edit the downloaded config file to contain the following block:

[processor-usage]
priority=4
max-threads=2


The processor-usage group and the priority setting should already be there, so you'd only add the max-threads setting. The example makes the client use only the first two cards detected (and ignore a third and fourth, if such things exist). Please remember to tell the BOINC Client to require the same number of ATI resources/cards, or better yet, make it ignore any extra cards too.

You should also note that the client will always use the first two cards, even if the BOINC Client thinks it allocated cards 2 and 3 to our application, so funky things can happen if they get out of sync. So it's best to ignore any "extra" cards. The new "ignore card x for project y" config in the development BOINC Client might help here (to run the extra cards with other projects), though.
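For reference, that per-project exclusion in the development BOINC Client could look roughly like this in cc_config.xml. This is only a sketch of the 7.x-style <exclude_gpu> option; the project URL shown is an assumption:

```xml
<cc_config>
  <options>
    <!-- Sketch: keep ATI device 0 away from Moo! Wrapper only;
         other projects may still use it. The URL is assumed. -->
    <exclude_gpu>
      <url>http://moowrap.net/</url>
      <device_num>0</device_num>
      <type>ATI</type>
    </exclude_gpu>
  </options>
</cc_config>
```

Other projects could then pick up the excluded card, which would match the "run the extra cards with other projects" idea.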

The source (or part of it, anyway) is available from Distributed.net, and I've been thinking about contributing a "select specific cards for crunching" feature to them. No idea when, or if, that's even possible (I've never contributed to their source, so I don't know whether they accept patches from outsiders). And I still have my hands full with the wrapper part. :)

-w
Profile Teemu Mannermaa
Project administrator
Project developer
Project tester
Joined: 20 Apr 11
Posts: 388
Credit: 822,356,221
RAC: 0
Message 861 - Posted: 31 Jul 2011, 17:49:04 UTC - in response to Message 807.  

But really, if other projects such as DNETC can use the distributed.net programs while honouring BOINC's preferences, why can't Moo! Wrapper do that?


Yeah, it's a limitation of the Distributed.net Client we wrap into the BOINC world. It does its own card detection and defaults to using everything it finds. Like I said, it has a way to limit the number of cards but not to select them. It started out as a pure CPU application, so in that sense these "limits" are understandable. (The OS thread scheduler deals with allocating CPU cores and time slices.)

In the future I'll parse the device params from the BOINC Client and set the numcpu param accordingly, to automate this. That's the best we can do for now, and it does have its limits.. :(

-w
Profile Teemu Mannermaa
Project administrator
Project developer
Project tester
Joined: 20 Apr 11
Posts: 388
Credit: 822,356,221
RAC: 0
Message 868 - Posted: 1 Aug 2011, 3:58:29 UTC - in response to Message 860.  

Actually, you can't use the command line switch, since you'd need to give it to the client part, but you can only give it to the wrapper part, which ignores it. Sorry about adding to your confusion there. :(


Err, no, wait. You actually can, if you use the Anonymous platform and modify the job XML file, which has the d.net client command line right there. More confusion. :)
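For anyone wanting to try this: under the Anonymous platform the job XML names the client binary and its command line, so the switch would go there. A rough sketch only, using the standard BOINC wrapper job.xml layout; the binary name, and whether Moo!'s own wrapper reads exactly these tags, are assumptions:

```xml
<job_desc>
  <task>
    <!-- Binary name is assumed; use whatever client executable
         your app_info.xml actually references. -->
    <application>dnetc-win32-x86.exe</application>
    <!-- Ask the client to use only the first two devices
         (it limits the count; it still cannot select devices). -->
    <command_line>-numcpu 2</command_line>
  </task>
</job_desc>
```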

Did you try this way and it didn't work?

-w
Profile Beyond
Joined: 18 May 11
Posts: 46
Credit: 1,254,302,893
RAC: 0
Message 1078 - Posted: 23 Sep 2011, 14:58:38 UTC - in response to Message 868.  
Last modified: 23 Sep 2011, 14:59:06 UTC

Actually, you can't use the command line switch, since you'd need to give it to the client part, but you can only give it to the wrapper part, which ignores it. Sorry about adding to your confusion there. :(

Err, no, wait. You actually can, if you use the Anonymous platform and modify the job XML file, which has the d.net client command line right there. More confusion. :)

Did you try this way and it didn't work? -w

Yep, that's what I did. It didn't work at DNETC either. I'm running my multi-ATI boxes on MW/Collatz, and the ATI card on the 4 machines with mixed ATI/NVidia here. One GPU per WU works way better than two (for me at least).
sparkler99
Joined: 14 Dec 11
Posts: 1
Credit: 15,772
RAC: 0
Message 1732 - Posted: 14 Dec 2011, 20:14:14 UTC

I've just found this out. I was supposed to use my HD 5750 with its new cooler, but instead it's using both that and my main HD 5850 that I was using for Terraria, grrrrr, and it's currently struggling with its not-so-great stock cooler.

Copyright © 2011-2024 Moo! Wrapper Project