Questions and Answers :
Windows :
Cannot ignore 2nd ATI on Moo
Author | Message |
---|---|
Send message Joined: 7 May 11 Posts: 4 Credit: 61,768,083 RAC: 0 |
I set BOINC to ignore ATI 0 (ATI 5450) and crunch only on ATI 1 (ATI 5870), with <ignore_ati_dev>0</ignore_ati_dev> in the cc_config file. Works wonderfully on MW, but lately I noticed strange crunching times and CPU usage when running Moo! on that rig. So I checked back and found that ATI 0 gets load from Moo. The DNETC benchmark also seems to check both cards and gives the usual #3 recommendation for dual cards. Is there any way to fix this and truly exclude device 0 from running on Moo? Cheers |
Send message Joined: 5 May 11 Posts: 233 Credit: 351,414,150 RAC: 0 |
Maybe possible, but I doubt it... Moo is a multi-threaded GPU app and therefore will use all available GPUs. The application instructions for the threads are a layer (or layers) below the BOINC statement, so the BOINC statement re device 0 will have no effect. Regards Zy |
Send message Joined: 7 May 11 Posts: 4 Credit: 61,768,083 RAC: 0 |
Maybe possible, but I doubt it... Moo is a multi-threaded GPU app and therefore will use all available GPUs. The application instructions for the threads are a layer (or layers) below the BOINC statement, so the BOINC statement re device 0 will have no effect. Cheers, Zydor. I may try excluding the GPU for the project explicitly, but I'm afraid this happens at the same level and will have no effect. So probably the only way of getting one idle ATI is pairing it with an Nvidia GPU. |
Send message Joined: 20 Apr 11 Posts: 388 Credit: 822,356,221 RAC: 0 |
Hi, Unfortunately, the Distributed.net client that we use is designed to detect and use every card it can find. At least it's limited to ATI or nVidia cards, depending on the type of the app. The client does have a -numcpu command line option and also related ini-file settings. However, I'm not sure if those are effective for GPUs or only for CPU crunching. Regardless, there is no way to select which device it runs on, so one can only "ignore" devices at the end of the detection chain. -w |
Send message Joined: 18 May 11 Posts: 46 Credit: 1,254,302,893 RAC: 0 |
The client does have a -numcpu command line option and also related ini-file settings. However, I'm not sure if those are effective for GPUs or only for CPU crunching. Regardless, there is no way to select which device it runs on, so one can only "ignore" devices at the end of the detection chain. -w I tried the -numcpu switch a while ago and it didn't work for GPUs. Is the code available for the distributed.net client? If so, would it be possible to remove the "use all GPUs" part? |
Send message Joined: 2 May 11 Posts: 4 Credit: 75,588 RAC: 0 |
I have been testing this and I cannot reproduce it. The <ignore_ati_dev>0</ignore_ati_dev> does ignore my only ATI GPU on this project; I only request, and actually get, work for the CPU. Perhaps it's your BOINC version that doesn't fully support this. You should try 6.12.33 to see whether that one still disregards your ignore preference. I'm not sure at what level 6.10 ignored the GPU, but 6.12 does it immediately at the detection level. Like so:

16/07/2011 07:24:00 | | ATI GPU 0 (ignored by config): ATI Radeon HD 4700/4800 (RV740/RV770) (CAL version 1.4.1332, 1024MB, 1000 GFLOPS peak)

Jord Used to be a single voice that vanished in a crowd. Vague just like a distant sun when hidden by the clouds. Found a way to surface and to speak my truth aloud. Be powerful. Stand fast and proud. |
Send message Joined: 18 May 11 Posts: 46 Credit: 1,254,302,893 RAC: 0 |
I have been testing this and I cannot reproduce it. The <ignore_ati_dev>0</ignore_ati_dev> does ignore my only ATI GPU on this project. I only request and actually do get work for the CPU. To completely ignore ATI, NVidia, or CPU, all you have to do is uncheck the appropriate box in the project preferences. His problem is that once the dnet client starts an ATI WU, it detects and grabs both GPUs at the client level. Since you're ignoring your only ATI, BOINC Manager isn't downloading the ATI WUs at all, so there's no problem in your case. It's very different. |
Send message Joined: 2 May 11 Posts: 4 Credit: 75,588 RAC: 0 |
Ah, OK, that explains that. Thanks. But really, if other projects such as DNETC can use the distributed.net programs while honouring BOINC's preferences, why can't Moo! Wrapper do that? Just here to learn, not to yell. Jord |
Send message Joined: 18 May 11 Posts: 46 Credit: 1,254,302,893 RAC: 0 |
But really, if other projects such as DNETC can use the distributed.net programs while honouring BOINC's preferences, why can't Moo! Wrapper do that? AFAIK no one has been able to run WUs on 1 ATI GPU in a machine with 2 or more ATIs at DNETC either. It was a big issue over there too. The difference is that Teemu has been forthcoming with good explanations on the subject rather than trying to cover up or ignore it. |
Send message Joined: 12 Jun 11 Posts: 2 Credit: 50,013,129 RAC: 0 |
Is there any way to fix this and truly exclude device 0 from running on Moo? See my old post at the DNETC forum here. Try using method 2/ (load as many WUs as possible with 1 video card plugged in before you start BOINC Manager) before restarting BOINC Manager with 2 GPUs connected/recognized, plus the exclusion tag in the config file. This should work fine... At least, it works for me (just checked) with the following config file:

<cc_config>
  <log_flags>
  </log_flags>
  <options>
    <ignore_ati_dev>0</ignore_ati_dev>
    <no_gpus>0</no_gpus>
    <use_all_gpus>1</use_all_gpus>
  </options>
</cc_config> |
Send message Joined: 7 May 11 Posts: 4 Credit: 61,768,083 RAC: 0 |
Is there any way to fix this and truly exclude device 0 from running on Moo? Cheers, Extra Ball, this sounds like a plan. Just curious to see how this affects CPU load and core selection. Right now I've removed the 2nd card, but I will test this when an opportunity arises. |
Send message Joined: 18 May 11 Posts: 46 Credit: 1,254,302,893 RAC: 0 |
Try using method 2/ (load as many WUs as possible with 1 video card plugged in before you start BOINC Manager) before restarting BOINC Manager with 2 GPUs connected/recognized, plus the exclusion tag in the config file. 2/ If you start BM with only 1 GPU plugged in, THEN load as many WUs as desired; each of them will only require 0.4 CPU + 1 GPU in the future (even if BM is restarted with 2 cards installed). So shut down and remove one of the video cards every time we need to download work. Download the WUs, then shut down, put the 2nd GPU back, and restart. Ouch. Might work, but a PITA. Over time this can't be good for the machine either. |
Send message Joined: 12 Jun 11 Posts: 2 Credit: 50,013,129 RAC: 0 |
So shut down and remove one of the video cards every time we need to download work. Download the WUs, then shut down, put the 2nd GPU back, and restart. Ouch. Might work, but a PITA. Over time this can't be good for the machine either. Just to be clear: you don't have to physically remove the video card from the PC; just disconnect the video cable (VGA, HDMI, or DVI) before launching BOINC Manager so only 1 card is detected at startup. |
Send message Joined: 18 May 11 Posts: 46 Credit: 1,254,302,893 RAC: 0 |
Just to be clear: you don't have to physically remove the video card from the PC; just disconnect the video cable (VGA, HDMI, or DVI) before launching BOINC Manager so only 1 card is detected at startup. Ah, that makes it a lot easier. Might give it a try. Thanks for the explanation. |
Send message Joined: 18 May 11 Posts: 46 Credit: 1,254,302,893 RAC: 0 |
Just to be clear: you don't have to physically remove the video card from the PC; just disconnect the video cable (VGA, HDMI, or DVI) before launching BOINC Manager so only 1 card is detected at startup. Tried it, and the Moo! client still grabbed both cards. I downloaded the WUs with only 1 GPU detected by BOINC and restarted after replugging the 2nd GPU. Unfortunately, no go. |
Send message Joined: 20 Apr 11 Posts: 388 Credit: 822,356,221 RAC: 0 |
I tried the -numcpu switch a while ago and it didn't work for GPUs. Is the code available for the distributed.net client? If so, would it be possible to remove the "use all GPUs" part? Actually, you can't use the command line switch, since you'd need to give it to the client part and there are only ways to give it to the wrapper part, which ignores it. Sorry about adding to your confusion there. :( However, you can use the config file option to make the distributed.net client ignore cards. Unfortunately, it can only ignore cards from the end of the list, so ignoring only the first card is impossible with the current code. The way to do this is to use the anonymous platform and a custom .ini file for the client. Edit the downloaded config file to have the following block:

[processor-usage]
priority=4
max-threads=2

The processor-usage group and the priority setting should already be there, so you'd only add the max-threads setting. The example makes the client use only the first two cards detected (and ignore the third and fourth, if such cards exist). Please remember to tell the BOINC Client to require the same number of ATI resources/cards. Or better yet, make it also ignore any extra cards. You should also note that the client will always use the first two cards, even if the BOINC Client thinks it allocated cards 2 and 3 to our application, so funky things can happen if things get out of sync. So it's best to ignore any "extra" cards. The new "ignore card x for project y" config in the development BOINC Client may help here (to run the extra cards with other projects), though. The source (or part of it, anyway) is available from Distributed.net, and I've been thinking about contributing a "select specific cards for crunching" feature to them. No idea when, or if, that's even possible (I've never contributed to their source, so I don't know if they accept patches from outsiders). And I still have my hands full with the wrapper part. :) -w |
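As an illustration of the "tell the BOINC Client to require the same number of cards" step above: with the anonymous platform, the card count can be declared in app_info.xml via a <coproc> block. This is only a rough sketch; the app name, executable file name, and version number below are placeholders, not the project's actual values:

```xml
<app_info>
  <app>
    <name>dnetc</name>
  </app>
  <file_info>
    <!-- placeholder executable name -->
    <name>dnetc-client.exe</name>
    <executable/>
  </file_info>
  <app_version>
    <app_name>dnetc</app_name>
    <version_num>100</version_num>
    <coproc>
      <!-- keep this count in sync with the client's max-threads setting -->
      <type>ATI</type>
      <count>2</count>
    </coproc>
    <file_ref>
      <file_name>dnetc-client.exe</file_name>
      <main_program/>
    </file_ref>
  </app_version>
</app_info>
```

With a matching count here and in the .ini file, BOINC should at least schedule the same number of GPUs that the client actually grabs.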
Send message Joined: 20 Apr 11 Posts: 388 Credit: 822,356,221 RAC: 0 |
But really, if other projects such as DNETC can use the distributed.net programs while honouring BOINC's preferences, why can't Moo! Wrapper do that? Yeah, it's a limitation of the Distributed.net client we wrap into the BOINC world. It does its own card detection and defaults to using everything it finds. Like I said, it does have a way to limit the number of cards but not to select them. This started out as a pure CPU application, so in that sense these "limits" are understandable. (The OS thread scheduler deals with allocating CPU cores and time slices.) In the future I'll parse the device params from the BOINC Client and set the numcpu param accordingly to automate this. This is the best we can do for now, and it does have its limits.. :( -w |
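For what it's worth, the per-project "ignore card x for project y" option mentioned earlier in the thread takes roughly this shape in cc_config.xml on development BOINC Client builds (the project URL here is an assumption; use the master URL your BOINC Manager shows for the project):

```xml
<cc_config>
  <options>
    <exclude_gpu>
      <!-- assumed project URL; use the one shown in your BOINC Manager -->
      <url>http://moowrap.net/</url>
      <!-- device number to exclude for this project only -->
      <device_num>0</device_num>
    </exclude_gpu>
  </options>
</cc_config>
```

Note this only keeps BOINC from scheduling that device for the project; as discussed above, the distributed.net client's own detection may still grab every card it sees.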
Send message Joined: 20 Apr 11 Posts: 388 Credit: 822,356,221 RAC: 0 |
Actually, you can't use the command line switch, since you'd need to give it to the client part and there are only ways to give it to the wrapper part, which ignores it. Sorry about adding to your confusion there. :( Err, no, wait. You actually can, if you use the anonymous platform and modify the job XML file, which has the d.net client command line right there. More confusion. :) Did you try it this way and it didn't work? -w |
Send message Joined: 18 May 11 Posts: 46 Credit: 1,254,302,893 RAC: 0 |
Actually, you can't use the command line switch, since you'd need to give it to the client part and there are only ways to give it to the wrapper part, which ignores it. Sorry about adding to your confusion there. :( Yep, that's what I did. It didn't work at DNETC either. I'm running my multi-ATI boxes at MW/Collatz, and the ATI card on the 4 machines with mixed ATI/NVidia here. One GPU per WU works way better than two (for me at least). |
Send message Joined: 14 Dec 11 Posts: 1 Credit: 15,772 RAC: 0 |
I've just found this out now. It's supposed to use my HD 5750 with its new cooler, but instead it's using both that and my main HD 5850 that I was using for Terraria. Grrrrr. And it's currently struggling with its not-so-great stock cooler.