Whole Tomato Software Forums
 Parsing in large solutions impacts VS usability

dinkai
Senior Member

27 Posts

Posted - Jan 21 2013 :  12:18:56 PM
Hi, we regularly work in solutions with 250+ projects. We've tried working with "Parse all files when opening project" turned off, which gave us the IDE back but made Visual Assist frustrating to use. With the option turned on, VS hovers at 95%+ CPU and a full parse takes roughly 30-60 minutes. Our solutions are also dynamically regenerated, so we don't just go through this once.

Can we either decrease the amount of resources used by "parse all", or improve how the parser operates when that option is disabled? It's frustrating having to open files in the IDE to get VA to parse them just so I can get IntelliSense back on classes in other projects within the solution.

Thanks

feline
Whole Tomato Software

United Kingdom
17256 Posts

Posted - Jan 21 2013 :  4:00:34 PM
Which IDE are you using?

How many files do you have in your solution?

If you open VA's Open File in Solution dialog (Alt-Shift-O) the title bar contains two numbers. The first number is the number of files currently listed, which changes as you filter the list. The second number is the total number of files in the list, which is normally the number of files in your solution. What is this second number?

250+ is a lot of projects, but not unheard of. Still, this is a very long parsing time. If you are using VS2008 or earlier this thread describes a registry flag that can be used to stop the VA parser and the IDE parser running at the same time:

http://forums.wholetomato.com/forum/topic.asp?TOPIC_ID=8464

which might help.

zen is the art of being at one with the two'ness

dinkai

Posted - Jan 22 2013 :  2:26:50 PM
Hi, VS2010 and approximately 22000 files. Visual Studio's IntelliSense/parser is disabled.

feline

Posted - Jan 23 2013 :  6:26:27 PM
That's a very large solution. How often does your solution get regenerated?

Have you tried turning off the IDE intellisense? This might make a difference, since you will only have one parser scanning your system, not two:

IDE tools menu -> Options -> Text Editor -> C/C++ -> Advanced -> Disable Database = True

We work hard to make VA's parser run as fast as possible, so there is not much we can do to simply make the parser faster. Reparsing less often, or using a smaller solution, are the possible approaches.


dinkai

Posted - Jan 25 2013 :  10:34:55 AM
It varies; we learn not to sync/build until we absolutely have to, but let's say once every day or two, with some people doing it much more or less often. The VS IDE parser is disabled; we might as well go home if it wasn't. Limiting the parser's CPU usage would be great, because then we get the advantage of an eventually completed full parse plus a responsive system, as opposed to a parse going full speed and taking our system and IDE down with it.

feline

Posted - Jan 28 2013 :  12:46:53 PM
This might be a silly question, but why does the solution have to be recreated twice a day? I would normally expect the solution file itself to change only when you add or remove files.

I am not sure that slowing down the parsing is going to work too well here. If you are looking at nearly 60 minutes to parse everything at full speed, and a new solution twice a day, then you have to start parsing everything again every 4 hours; slow the parsing down by a factor of 4 and you effectively never finish parsing.
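To put numbers on that, here is a rough sketch using the figures from this thread (a 60 minute full parse and a regeneration roughly every 4 hours; both are estimates, not measurements):

```python
# Rough numbers from this thread: a 60 minute full parse at full speed,
# and a solution regenerated about every 4 hours (twice per working day).
FULL_PARSE_MIN = 60
REGEN_INTERVAL_MIN = 4 * 60

def throttled_parse_min(slowdown_factor):
    """Parse time once the parser is slowed down by the given factor."""
    return FULL_PARSE_MIN * slowdown_factor

# At full speed the parse finishes with 3 hours to spare; slowed down by
# a factor of 4 it takes the entire interval, so it never catches up.
print(throttled_parse_min(1))  # 60
print(throttled_parse_min(4))  # 240, equal to REGEN_INTERVAL_MIN
```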

Would making a set of small, stable solutions that only include, say, half a dozen projects be an option, using these for editing but not building? Since each solution would include far fewer files, VA would be able to parse it much more quickly.

When doing an update from source control, do you see a lot of new files? Or mainly edits to existing files?

I am wondering if trying:

VA Options -> Performance -> watch for externally modified files and reparse when necessary = On
VA Options -> Performance -> Parse all files when opening a project = Off

and making sure you had the IDE and solution loaded before doing the source control sync would help. This should minimise the amount of parsing that has to happen, but does it keep you sufficiently up to date?


dinkai

Posted - Jan 29 2013 :  06:52:04 AM
We have a lot of auto-generated code.

The update from source control is less the issue; it's more the regeneration of the solution due to potentially auto-generated code. Even if that code has not "changed", the file is new. Which brings us to another question: is there some way to minimize the number of files parsed? Is there some way of improving how the parser decides whether to reparse a file, or does it work that way already? i.e. a new time stamp should not necessarily be cause for a reparse.

With regards to slowing down the parsing, can we at least get a registry entry to set the maximum number of CPU cores the parser can use?

We're aware of those two VA options; some people are disabling the second one, but the consequence is a less robust experience.

feline

Posted - Jan 29 2013 :  4:12:45 PM
I am not sure what method the parser uses to work out when to reparse a file, but a new time stamp seems a likely guess. The only real way around that would be to store a hash of every file, and to check the hash before deciding whether to reparse the file. This would help here, but it is a very specific solution; it's not going to help other people much.

What we really need is a simple way to exclude just those files. Is there any chance your generated code files use a separate file extension, to make it easier to filter them out?

Is there any simple rule or pattern that could be used to determine which files are the auto generated ones? Do they live in a particular set of directory trees? Do they have a detectable pattern to the file names?
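For example, if the generated files did follow a pattern, a filter along these lines could pick them out (the patterns here are invented; yours would depend on your generator):

```python
import fnmatch

# Hypothetical patterns; substitute whatever your code generator produces.
GENERATED_PATTERNS = ["*/generated/*", "*.gen.h", "*.gen.cpp"]

def is_generated(path):
    """True if the path matches any pattern for auto-generated files."""
    path = path.replace("\\", "/")  # normalise Windows separators
    return any(fnmatch.fnmatch(path, pat) for pat in GENERATED_PATTERNS)
```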

As a temporary work around for the CPU usage, I have found this page:

http://stackoverflow.com/questions/827754/how-to-set-processor-affinity-from-batch-file-for-windows-xp

which uses the SysInternals tool PsExec to set the processor affinity, to specify which CPU cores the launched process is running on. So the command:

PsExec -a 1,2 "C:\Program Files\Microsoft Visual Studio 10.0\Common7\IDE\devenv.exe"

launches VS2010 using core 1 and 2 on my 4 core system, leaving 2 cores free. This should help a bit, but obviously won't help much with hard drive throughput while parsing.
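An alternative that avoids downloading PsExec is cmd's built-in start /affinity command, which takes a hexadecimal core bitmask rather than a list of core numbers. A sketch of how such a mask is built (core 0 is bit 0):

```python
def affinity_mask(cores):
    """Build the bitmask used by `start /affinity` (core 0 = bit 0)."""
    mask = 0
    for core in cores:
        mask |= 1 << core
    return mask

# Restricting a process to cores 0 and 1 on a 4-core machine:
#   start /affinity 3 devenv.exe
print(hex(affinity_mask([0, 1])))  # 0x3
```

So `start /affinity 3 devenv.exe` would run VS2010 on the lowest two cores, leaving the other two free.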


dinkai

Posted - Jan 30 2013 :  08:48:24 AM
I'm afraid we don't want to exclude those files from being parsed, especially if they've changed.

As for the core affinity solution: while it prevents devenv from taking the system down, it doesn't help with continuing to use the IDE, and would just render it unusable for longer.

I'll talk to our partners; they have a significant number of VA licenses, and it blows my mind that they might not have initiated a similar conversation in the past. The only thing I can think of is that people have moved away from using VS as their editing IDE and have just accepted the situation.

Thanks for your support.

sean
Whole Tomato Software

USA
2817 Posts

Posted - Jan 30 2013 :  11:09:18 AM
The next build will support a registry entry to limit the number of processors hit by our use of parallel operations. case=72066

feline

Posted - Jan 30 2013 :  3:31:33 PM
I am not sure what else to suggest. If I understand correctly, you need all of the files in the solution to be parsed, but also change a large proportion of these files every few hours.

The only solution I can see so far is to parse fewer files, or to change the files less often. Or have I misunderstood something here?


dinkai

Posted - Jan 31 2013 :  06:47:38 AM
Sean's suggestion is one option; another would be to reduce the priority of the parsing VAX threads. Not everyone works the same way or regenerates as frequently, but knowing that the parse is running without visibly taking my system or VS down is huge. Again, thank you both for the support.

support
Whole Tomato Software

5566 Posts

Posted - Feb 25 2013 :  12:05:47 PM
case=72066 is implemented in build 1929

Create a DWORD registry value named "MaxConcurrency" at HKCU\Software\Whole Tomato. Note this location is different from the IDE-specific registry customization locations.
Delete the value to restore default behavior.
Set the value to the maximum number of hardware threads that parallel algorithms can use.
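For anyone who prefers a .reg file over editing the registry by hand, something like this should apply the setting (2 here is just an example value):

```reg
Windows Registry Editor Version 5.00

; Limit VA's parallel algorithms to 2 hardware threads (example value)
[HKEY_CURRENT_USER\Software\Whole Tomato]
"MaxConcurrency"=dword:00000002
```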

sean

Posted - Feb 25 2013 :  12:25:02 PM
In other words, MaxConcurrency sets the maximum number of logical cores that the parallel algorithms are allowed to use.
© 2019 Whole Tomato Software, LLC