TOPIC REVIEW
HateDread |
Posted - Aug 04 2017 : 09:26:27 AM
It sounds a little crazy, but hear me out.
At work we have a good hundred high-spec machines, and at home I have a smaller network (currently 2 machines, soon to be 4). We distribute our compilation via Incredibuild at work, and I use FASTBuild to do the same at home.
When I open a massive solution like UE4, or some of our stuff at work, Visual Assist takes a while to parse the solution, pegged at 100% CPU usage, but hardly any disk.
In other situations where CPU is the limitation (like the above distributed compilation stuff), it makes sense to distribute the load. What do you think of the idea of splitting parsing the solution into pieces and offloading those to other PCs running some Visual Assist-branded service? The limitation is all CPU, and I am surrounded by CPUs that could be put to better use!
It would be preferable to split below the project level since something like UE4 is one big project, but I could see that as an interesting first step/experiment.
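To make the proposal concrete, here is a minimal sketch of what partitioning parse work might look like: the solution's translation units are dealt out round-robin to N worker machines, each of which would parse its share and return a partial symbol database for merging. The function name and the whole scheme are hypothetical illustrations, not any real Visual Assist or Incredibuild API.

```cpp
#include <cassert>
#include <string>
#include <vector>

// Hypothetical sketch: split a solution's source files across N
// worker machines. Round-robin keeps chunk sizes balanced even when
// the file list is not evenly divisible by the worker count.
std::vector<std::vector<std::string>>
PartitionFiles(const std::vector<std::string>& files, size_t workers) {
    std::vector<std::vector<std::string>> chunks(workers);
    for (size_t i = 0; i < files.size(); ++i)
        chunks[i % workers].push_back(files[i]);
    return chunks;
}
```

In a real system the hard part would not be the split itself but merging the per-worker symbol databases and handling headers shared across chunks; the sketch only shows the distribution step.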
Anyway, I thought I'd float the idea out there. Parsing a big solution is one of the last bottlenecks in my programming experience, and I (like most other developers) have the hardware to change that, but not the software to use it.
Cheers :)
2 LATEST REPLIES (Newest First)
accord |
Posted - Aug 21 2017 : 08:25:08 AM
"Initial parse of large solutions, especially those that include Unreal Engine 4 (UE4), requires less time. Initial parse times for UE4 have dropped by up to 75%."
From the What's New page for build 2231: https://www.wholetomato.com/features/whats-new.asp
accord |
Posted - Aug 07 2017 : 3:49:28 PM
We are currently investigating ways to speed up the parsing of UE4. Parsing a file is relatively quick, much quicker than compilation. However, we found some bottlenecks in unexpected places; most notably, we are looking into ways to speed up include resolution, because UE4 uses a large number of include paths. We can re-evaluate this question after case=109296 is fixed.
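To illustrate why a large number of include paths can dominate parse time, here is a simplified model of how include resolution typically works in C++ tooling: for each `#include`, the parser probes each include directory in order until the header is found, so the worst case costs one filesystem check per directory per include. The function and data structures below are an illustrative model, not Visual Assist's actual implementation.

```cpp
#include <cassert>
#include <string>
#include <unordered_set>
#include <vector>

// Illustrative model of include resolution. filesOnDisk stands in
// for the filesystem; each probe corresponds to one stat() call.
// Total work grows roughly with
// (number of includes) x (number of include paths),
// which is why many include paths slow parsing even when each
// individual lookup is cheap.
struct ProbeResult {
    std::string resolved;  // empty if the header was not found
    int probes;            // how many directories were checked
};

ProbeResult ResolveInclude(const std::string& header,
                           const std::vector<std::string>& includeDirs,
                           const std::unordered_set<std::string>& filesOnDisk) {
    int probes = 0;
    for (const auto& dir : includeDirs) {
        ++probes;  // one filesystem check per candidate directory
        std::string candidate = dir + "/" + header;
        if (filesOnDisk.count(candidate))
            return {candidate, probes};
    }
    return {"", probes};  // not found: every directory was probed
}
```

A header that lives in the last of N include directories (or does not exist at all) costs N probes, so caching resolution results or pruning the search path are natural optimizations.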
Thank you for the feedback; we appreciate it.