Closed
Bug 687388
Opened 13 years ago
Closed 11 years ago
Visual Studio Project Generation
Categories: Firefox Build System :: General, defect
Tracking: (Not tracked)
Status: RESOLVED FIXED
Target Milestone: mozilla30
People
(Reporter: gps, Assigned: gps)
References
(Depends on 3 open bugs, Blocks 1 open bug)
Details
(Whiteboard: [mach build && mach build-backend -b VisualStudio])
Attachments
(1 file, 5 obsolete files)
37.19 KB, patch (mshal: review+)
Many developers love developing in Visual Studio. Unfortunately, we don't currently maintain Visual Studio project files for mozilla-central.
I would like to see Mozilla officially support Visual Studio project files at some level. By providing Visual Studio support, we will likely make Windows developers feel more at home. And this will help Mozilla attract (and hopefully retain) more Windows developers.
As for implementation, it will be a long road. Given the current state of the build system, the only feasible approach I see for constructing Visual Studio files is to parse the makefiles and extract useful metadata. (The alternatives are: 1) reproduce everything in the makefiles in Visual Studio or nmake and keep the two in sync for all of time; 2) switch to a build system that can generate Visual Studio projects (like CMake, though even CMake excludes header files from produced projects, which is very annoying). Both of these alternatives are a lot more effort than extracting from the existing build files.)
The biggest drawback to parsing makefiles is that we will need to codify the rules for extracting metadata. This means that for every different style of building things (e.g. minor variable name differences), we'll need to code a rule. The good news is that our tree today is very consistent: checking for specific variables (such as MODULE, LIBRARY, and CPPSRCS) lets us extract data from most makefiles. Going forward, we could probably create audit tools that ensure all Makefiles follow an allowed syntax; enforcing compliance would ensure that the Visual Studio generator tool continues to work.
Anyway, I put together a proof-of-concept showing all this is possible. I created a Python module that extracts metadata from makefiles using PyMake and another module to convert this metadata into Visual Studio projects. In its current form, you point it at a pre-configured objdir and it produces VS2010 projects for the C++ libraries, plus a solution that includes all of them. Currently, the projects just contain lists of files. Build steps are not configured. But, you can still use the IDE for file searching, IntelliSense, etc.
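To make the extraction idea concrete, here is a minimal sketch (my illustration only; the actual proof-of-concept evaluates Makefiles through PyMake rather than scanning text) of pulling the common variables out of a Makefile.in:

import re
from pathlib import Path

# Variables the tree assigns consistently enough to pull out naively.
INTERESTING = ("MODULE", "LIBRARY_NAME", "CPPSRCS", "CSRCS", "EXPORTS")

ASSIGN_RE = re.compile(r"^(?P<name>[A-Z][A-Z0-9_]*)\s*[:+?]?=\s*(?P<value>.*)$")

def extract_metadata(makefile_path):
    """Return {variable name: [values]} for the variables above.

    Handles `VAR = a b c` and backslash continuations only; anything
    fancier (conditionals, $(shell ...), includes) needs a real make
    evaluator such as PyMake, which is what the prototype uses.
    """
    metadata = {}
    lines = Path(makefile_path).read_text().splitlines()
    i = 0
    while i < len(lines):
        line = lines[i].strip()
        # Glue backslash-continued assignments back together.
        while line.endswith("\\") and i + 1 < len(lines):
            i += 1
            line = line[:-1] + " " + lines[i].strip()
        m = ASSIGN_RE.match(line)
        if m and m.group("name") in INTERESTING:
            metadata.setdefault(m.group("name"), []).extend(
                m.group("value").split())
        i += 1
    return metadata

# e.g. extract_metadata("netwerk/base/src/Makefile.in")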
Code can be found at https://github.com/indygreg/mozilla-central/commits/visual-studio-generation. The initial commit is surprisingly simple: https://github.com/indygreg/mozilla-central/commit/16432d63109083afbc248b3338e15d20a3d60513
I'll continue to update this bug with progress.
Comment 1•13 years ago
Greg, I am very interested in this project. Please let me know what I can do to help.
I think there is an important reason NOT to create separate projects for each module: To support a future in which we can *build* in Visual Studio (including edit-and-continue support), we have to adhere to Visual Studio's system where each project has exactly one output DLL/EXE/LIB. We cannot have projects generate static libraries (LIB) because incremental linking doesn't work when the linker encounters a static library that changed. This leaves us with one project per EXE and DLL produced. This means that most of Gecko should be in a single libxul project.
I have hacked together a VS2010 project file that supports my usage of VS2010 for navigating the parts of Gecko that I work on every day. Generally, almost everything does (or will soon) get linked into libxul. I have found I could just throw all the *.cpp;*.c;*.h;*.idl files under most of the source directories into a single VS2010 "libxul" project and things worked well. I think you should design this system for this future, simpler world where there are no separate DLLs for NSPR, NSS, and mozjs.
I created a "mozilla" directory outside of $(OBJDIR) and $(SRCDIR); then I hard-linked Mutex.h (et al.) and other headers that get #included as "mozilla/Mutex.h" (et al.) to their locations under the source folder. This allows Visual Studio's indexer to find these headers.
It is also important to include all the _xpidlgen folders under $(OBJDIR) individually in the include path, instead of just adding $(OBJDIR)/dist/include in the include directory. Otherwise, when you use source code navigation (e.g. F12), you will end up editing $(OBJDIR)/dist/include/$(header).h instead of the version under the source directory. IIRC, I had to do something similar for NSPR's prcpucfg.h.
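For anyone who wants to reproduce the hard-link trick above, a rough sketch in Python (paths and the header list are hypothetical; adjust them to your own tree):

import os

# Hypothetical locations; substitute your own srcdir and mirror directory.
SRCDIR = r"f:\Mozilla\mc"
MIRROR = r"f:\Mozilla\vsextra\mozilla"  # outside $(SRCDIR) and $(OBJDIR)

# Headers that get #included as "mozilla/Foo.h"; illustrative list only.
EXPORTED = [
    r"xpcom\glue\Mutex.h",
    r"xpcom\glue\Monitor.h",
]

if not os.path.isdir(MIRROR):
    os.makedirs(MIRROR)

for rel in EXPORTED:
    src = os.path.join(SRCDIR, rel)
    dst = os.path.join(MIRROR, os.path.basename(rel))
    if not os.path.exists(dst):
        # Hard link so the IDE's indexer resolves mozilla/Mutex.h to the
        # real source file rather than to an exported copy.
        os.link(src, dst)

The directory that contains the mirror (f:\Mozilla\vsextra in this sketch) then goes on the project's include search path so that "mozilla/Mutex.h" resolves to the linked files.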
Comment 2•13 years ago
(In reply to Brian Smith (:bsmith) from comment #1)
> I think you should design this system for this future simpler
> world where there are not separate DLLs for NSPR and NSS, and mozjs.
By the way, there are already efforts to do this; see bug 648407, bug 609976, bug 561842.
I think many DEFINES which are used only in specific modules could be made global to all modules (in libxul); however, I am sure there are some DEFINEs that will break things when set globally for all of libxul. I suggest we find and document all these DEFINEs and work on making the project build when they are all set globally.
Assignee
Comment 3•13 years ago
(In reply to Brian Smith (:bsmith) from comment #1)
> we have to adhere to Visual Studio's system where each project has exactly
> one output DLL/EXE/LIB. We cannot have projects generate static libraries
> (LIB) because incremental linking doesn't work when the linker encounters
> a static library that changed.
I'm going to challenge this assertion.
First, I've configured Visual Studio to output a static AND shared library (.lib and .dll) in one project and configuration. I know because it took me hours to figure out how to do this. Unfortunately, I can't remember the exact settings, but I do have project files saved (e.g. https://github.com/indygreg/zippylog/blob/master/msvc/libzmq.vcproj). I'm pretty sure you need to enable some "edit and continue" flag which has the side-effect of not deleting the .lib which is always produced during a shared library build but is deleted if it isn't needed (or something like that).
To be sure I'm not on crack, I performed a clean build of the aforementioned project, verified a .lib and .dll were both being produced, entered a debugger, hit a breakpoint, changed a source file, hit continue, and watched an incremental build only rebuild the file I touched.
Maybe my scenario is different from what would be required for m-c, but things are possible under the right conditions.
> I have hacked together a VS2010 project file that supports my usage of
> VS2010 for navigating the parts of Gecko that I work on every day.
Is there any value in throwing up these files somewhere for me to look at?
> I have found I could just through all the *.cpp;*.c;*.h;*.idl files under most
> of the source directories into a single VS2010 "lixul" project and things
> worked well. I think you should design this system for this future simpler
> world where there are not separate DLLs for NSPR and NSS, and mozjs.
Manually crawling directories for *.cpp, *.h, etc files and populating projects from the results had crossed my mind. The only reason I haven't done this yet is that CPPSRCS, EXPORTS, etc are pretty well-defined throughout the tree. That being said, there are a number of places that don't define them, so it is inevitable we'll have to fall back to tree searching at some point.
> It is also important to include all the _xpidlgen folders under $(OBJDIR)
> individually in the include path, instead of just adding
> $(OBJDIR)/dist/include in the include directory. Otherwise, when you use
> source code nagivation (e.g. F12), you will end up editing
> $(OBJDIR)/dist/include/$(header).h instead of the version under the source
> directory. IIRC, I had to do something similar for NSPR's prcpucfg.h.
Yeah, supporting IDLs properly is going to be a pain point if we wish to build natively from within Visual Studio. I hope to delay this pain as long as possible by calling out to PyMake for as long as possible. That means we won't get edit and continue in Visual Studio, but we get the IDE and debugger, so that's a net win.
Assignee
Comment 4•13 years ago
Since the initial commit, I've made the following progress:
- hooked up build event to call out to PyMake
- added stub projects for all Makefiles which don't seem to define a module or library. These projects are empty but have the build event plumbing, so they can theoretically build
- changed project names to be more human friendly
- added preprocessor defines (so IDE can populate properly)
I can see builds invoking PyMake, but none of the dependencies are defined, so it is full of build failures. But, IntelliSense is working pretty well across the tree!
Source is pushed to the 'visual-studio-generation' branch of git://github.com/indygreg/mozilla-central.git
Comment 5•13 years ago
FWIW, Visual Studio has an option somewhere called something like "don't output intermediate static libs", which is basically the same concept as our fakelibs stuff, it just links the object files directly to the final binaries instead of wasting time producing static libs.
Comment 6•13 years ago
(In reply to Gregory Szorc [:gps] from comment #3)
> (In reply to Brian Smith (:bsmith) from comment #1)
> > we have to adhere to Visual Studio's system where each project has exactly
> > one output DLL/EXE/LIB. We cannot have projects generate static libraries
> > (LIB) because incremental linking doesn't work when the linker encounters
> > a static library that changed.
>
> I'm going to challenge this assertion.
>
> First, I've configured Visual Studio to output a static AND shared library
> (.lib and .dll) in one project and configuration. I know because it took me
> hours to figure out how to do this.
If, like Ted suggested, there is a way to cause Visual Studio to generate a bunch of *.obj files without linking them into *.libs, then separate projects would work fine. I could not figure out how to make it do this when I investigated it late last year, so I just created a single libxul project.
> > I have hacked together a VS2010 project file that supports my usage of
> > VS2010 for navigating the parts of Gecko that I work on every day.
> Is there any value in throwing up these files somewhere for me to look at?
My VS2010 projects were built by hand, iteratively via the IDE. I did write some find(1)/xargs(1)/sed(1)-based scripts that generate the XML that gets pasted into the solution/project file but I think your automation of this is much further along than any automation I did.
> Yeah, supporting IDLs properly is going to be a pain point if we
> wish to build natively from within Visual Studio. I hope to
> delay this pain as long as possible by calling out to PyMake for
> as long as possible. That means we won't get edit and continue
> in Visual Studio, but we get the IDE and debugger, so that's a
> net win.
IMO, compile/link libxul and edit-and-continue are the main things that are missing from Visual-Studio-based Windows development right now. The debugger already works fine. And, the IDE works fine even now if you do one build with pymake on the command line before opening the project in VS. (FWIW, I use VS2010 all day every day this way.)
I suggest breaking the build into three stages: (1) all code generation needed to build libxul, (2) compiling and linking libxul, and (3) everything else. Doing (2) would get us edit-and-continue functionality and useful compiler error navigation within the IDE; either would be a huge productivity improvement (for me). Being able to do stages (1) and/or (3) without going to the command line would also be great, but IMO (2) is the big win by far.
Assignee
Comment 7•13 years ago
I have successfully compiled some C++ libraries *inside Visual Studio* using automatically generated Visual Studio projects! Visual Studio *does not* call out to any external process (PyMake, msys, cl.py, etc). Instead, I've converted the compilation arguments to native Visual Studio Project features. I just load the solution in Visual Studio and click build and it just works.
Granted, it is still very hacky and lots of things don't work (IDLs and shared libraries for starters). But, it is better than it was before.
As always, code is at https://github.com/indygreg/mozilla-central/tree/visual-studio-generation. I also updated the README.txt file with current status. In that file is also my dream vision of how the build system could one day evolve to get us out of this Makefile hell.
Assignee
Comment 8•13 years ago
And IDLs now compile within Visual Studio! Code on GitHub.
Unfortunately, the IDLs won't work with Visual Studio's built-in IDL tool, so I have to call out to Python to produce the .h files. Visual Studio is smart enough to recompile the .h file when the source .idl changes. You can also list additional dependencies on the source files. But, I don't have that hooked up yet.
Pretty much the entire netwerk tree now builds within Visual Studio! I love progress.
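For reference, the mechanism described above maps onto MSBuild's CustomBuild items in a generated .vcxproj. A sketch of how a generator might emit one (the header.py arguments and the $(TopSrcDir)/$(TopObjDir) macros are illustrative assumptions, not the actual command line):

# Sketch: emit a <CustomBuild> item for a single .idl file. Visual Studio
# re-runs Command whenever the inputs are newer than Outputs. The
# header.py invocation shown here is illustrative only.
CUSTOM_BUILD_TEMPLATE = """\
    <CustomBuild Include="{idl}">
      <Command>python $(TopSrcDir)\\xpcom\\idl-parser\\header.py -I $(TopObjDir)\\dist\\idl -o {header} {idl}</Command>
      <Outputs>{header}</Outputs>
      <AdditionalInputs>{deps}</AdditionalInputs>
      <Message>Generating {header}</Message>
    </CustomBuild>
"""

def idl_custom_build(idl_path, out_header, extra_deps=()):
    # extra_deps: .idl files this one #includes, so edits to them also
    # trigger regeneration.
    return CUSTOM_BUILD_TEMPLATE.format(
        idl=idl_path, header=out_header, deps=";".join(extra_deps))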
Updated•13 years ago
Assignee: nobody → gps
Comment 9•13 years ago
Greg, this is great. Here is some feedback:
> GOALS
>
> * Build firefox.exe from Visual Studio without requiring a shell
IMO, building libxul.dll specifically is the most urgent need. Other things (including NSPR and NSS) would be great (especially for me, as I work on them all the time), but just getting libxul to work well to start would be a HUGE win.
> * Allow developers to see every relevant source file from Visual Studio
I have found that VS2010 will get VERY slow if you add all the tests, HTML documents, etc. It might make sense to get just the source code (without the tests) working first.
> * Likely ship the Visual Studio generation files. Maybe ship
> pre-generated project files.
I think we should just ship with the generation scripts, not with pre-generated project files, unless/until we are willing to require building with the project files on Windows (using msbuild and/or the IDE instead of gmake).
> NON-GOALS
>
> Emulating the exact behavior of the build system
> * We don't need every file to be in dist/
I mostly agree. But, all tests should be runnable (with correct results).
> * We don't need every library to be linked exactly the same way
I disagree here. We should eventually be able to link (and compile!) the same way. But, we can target our future linking state (everything linked into libxul except softoken and freebl) even if we aren't there today.
> * Supporting every combination of build via project configurations (e.g.
> Debug vs Release, jemalloc vs non-jemalloc)
This seems pretty reasonable to me, but it would be great if we could have both a PGO configuration that is exactly like the release, and a debug configuration without crash reporter but with other things like the release builds.
> I want to emphasize the importance of having a Makefile "style"
> convention. What I mean by this is having all the Makefiles have
> the same typical pattern of declaring the same variables.
+1. This would also make it easier to enable non-recursive make (bug 167254, bug 623617, etc.)
Comment 10•13 years ago
> FWIW, Visual Studio has an option somewhere called something like "don't output
> intermediate static libs", which is basically the same concept as our fakelibs
> stuff, it just links the object files directly to the final binaries instead of
> wasting time producing static libs.
Did we ever find a way to do this?
> # TODO fix directories causing us hurt
> ignore_dirs = [
> 'js/src/xpconnect', # hangs
> 'modules/libbz2', # somehow forks and calls itself recursively
> 'security/manager', # hangs
> ]
Do you have any more details on the security/manager hangs? Feel free to ask for me to help fix them, since I work on security/manager.
> # NSPR also has a set of per-platform .cfg files. We copy these to
> # the output directory.
...
> # NSPR takes one of these config files and renames it to prcpucfg.h
If we ship the generation scripts instead of the project files in mozilla-central, then I imagine the build process will go something like this:
export MOZCONFIG=<whatever>
autoconf-2.13
cd $OBJDIR
../src/configure
../src/build/generate-msvc.py
One option, then, is to move the above NSPR logic into the configure step, ahead of the generate-msvc step, to avoid the custom processing in generate-msvc.py. Logically, this makes more sense as part of configure anyway.
Assignee
Comment 11•13 years ago
(In reply to Brian Smith (:bsmith) from comment #10)
Oh no, you are reading the source code ;) It isn't the most readable Python code I could produce (many of the nested method definitions need to go, for example). Tread carefully and without much scrutiny for coding practices for now, please.
> > FWIW, Visual Studio has an option somewhere called something like "don't output
> > intermediate static libs", which is basically the same concept as our fakelibs
> > stuff, it just links the object files directly to the final binaries instead of
> > wasting time producing static libs.
>
> Did we ever find a way to do this?
I have not found a way to do this. However, I haven't really looked yet. I know you can tell Visual Studio to link the .obj files instead of the static libs, but I think that is different.
http://msdn.microsoft.com/en-us/library/19z1t1wy.aspx, http://msdn.microsoft.com/en-us/library/ms168837.aspx, http://msdn.microsoft.com/en-us/library/microsoft.visualstudio.vcprojectengine.vclibrariantool.aspx, and http://msdn.microsoft.com/en-us/library/microsoft.visualstudio.vcprojectengine.vclinkertool.aspx are excellent resources, if you want to beat me to it.
I have a feeling that the solution will be to change the project type from a library to Utility (or something like that) and rely on the default builder for .cpp files to kick in and produce only the .obj files.
>
> > # TODO fix directories causing us hurt
> > ignore_dirs = [
> > 'js/src/xpconnect', # hangs
> > 'modules/libbz2', # somehow forks and calls itself recursively
> > 'security/manager', # hangs
> > ]
>
> Do you have any more details on the security/manager hangs? Feel free to ask
> for me to help fix them, since I work on security/manager.
I haven't really investigated this yet. I'm guessing PyMake is dynamically calling out to something that never really returns when attempting to evaluate a variable or something. It'll probably be a while before I tackle these, as they don't currently block me. Whatever the root cause, there will likely be bugs filed.
> If we ship the generation scripts instead of the project files in
> mozilla-central, then I imagine the build process will go something like
> this:
>
> export MOZCONFIG=<whatever>
> autoconf-2.13
> cd $OBJDIR
> ../src/configure
> ../src/build/generate-msvc.py
>
> One option then, is to move the above NSPR logic above the generate-mscv
> step into the configure step, to avoid the custom processing on
> generate-msvc.py. Logically, this makes more sense as part of configure
> anyway.
I support this idea.
That being said, I'm currently trying to be as non-invasive as possible to m-c. I'm thinking that once things are somewhat stable, it will be pretty obvious where the pain points are and bugs can be filed to deal with them then. I also don't want to be blocked on progress by other bugs I file along the way. If you think I should be more proactive in filing "obvious" bugs giving me pain, I can do this. I'm just not sure how receptive people will be to making changes which effectively amount to supporting an experiment.
Assignee
Comment 12•13 years ago
(In reply to Brian Smith (:bsmith) from comment #9)
> > * Build firefox.exe from Visual Studio without requiring a shell
>
> IMO, building libxul.dll specifically is the most urgent need. Other things
> (including NSPR and NSS) would be great (especially for me, as I work on
> them all the time), but just getting libxul to work well to start would be a
> HUGE win.
After tackling project generation for NSPR, it does kinda make sense to exclude these one-off projects (NSPR, NSS, and possibly JS, jemalloc, and others) from project generation for the time being, as they are a real time sink. I could certainly change the instructions to include the steps to manually build these one-off projects before running generate-msvc.py.
> I have found that VS2010 will get VERY slow if you add all the tests, HTML
> documents, etc. It might make sense to get just the source code (without the
> tests) working first.
This is very good to know. I haven't seen any obvious performance issues yet. But, I don't have it loading test files yet either.
> > * We don't need every library to be linked exactly the same way
>
> I disagree here. We should eventually be able to link (and compile!) the
> same way. But, we can target our future linking state (everything linked
> into libxul except softoken and freebl) even if we aren't there today.
Is it really as simple as "everything linked into libxul?" What about NSPR, NSS, JS? I don't suppose there is a document stating this vision with more technical details?
> > * Supporting every combination of build via project configurations (e.g.
> > Debug vs Release, jemalloc vs non-jemalloc)
>
> This seems pretty reasonable to me, but it would be great if we could have
> both a PGO configuration that is exactly like the release, and a debug
> configuration without crash reporter but with other things like the release
> builds.
I would love to do this as well. In my grand vision, we'd produce projects with multiple configurations (Debug, Release, PGO, with x86 and AMD64 variants of each). The path from a single configuration (the current approach) should be relatively straightforward, so I'm focusing on the simple solution now.
Assignee
Comment 13•13 years ago
(In reply to Gregory Szorc [:gps] from comment #12)
> Is it really as simple as "everything linked into libxul?" What about NSPR,
> NSS, JS? I don't suppose there is a document stating this vision with more
> technical details?
Brian is so good, he answered this question in comment #2, I now see.
Assignee
Comment 14•13 years ago
I took a step back and realized that I was effectively designing two systems: 1) a Mozilla build system data extraction API and 2) Visual Studio project generation.
So, I've been spending a lot of time recently formalizing #1 and focusing less on #2. I'm anticipating this being one of those times where you put a lot of legwork in the low-level APIs and the high-level pieces just kind of fall into place.
In today's push [1], I completely broke Visual Studio generation (it didn't really work that well anyway). However, I now have a Mozilla-specific build system "parser" built on top of the PyMake APIs. It reads in all the Makefiles and converts them to a set of Python classes [2]. You get data structures for libraries, tests, file exports, IDLs, programs, etc.
The next step is to feed these data structures into something useful. For example, I could send them to a Visual Studio project outputter. Or, I could output a single, derecursified Makefile. Or, I could construct a monolithic data structure of ALL THE DATA. Now that I've isolated the data representation of the build system, I can effectively do whatever I want with that data. I think that's pretty damn cool.
I've updated the README [3] with the new state of the world. As always, I welcome feedback. I recognize I'm getting side-tracked here. But, I feel that if I do this right, others can build on top of it and some splendid build improvements can come out of it (mainly drastic speed improvements).
[1] https://github.com/indygreg/mozilla-central/compare/299014529e...8d5cef27c6
[2] https://github.com/indygreg/mozilla-central/blob/visual-studio-generation/build/buildparser/data.py
[3] https://github.com/indygreg/mozilla-central/blob/visual-studio-generation/README.txt
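To illustrate the split being described, a minimal sketch (the class and method names below are hypothetical stand-ins, not the actual buildparser API; the real data classes live in build/buildparser/data.py):

class Library(object):
    def __init__(self, name, srcdir, cpp_sources, defines):
        self.name = name
        self.srcdir = srcdir
        self.cpp_sources = cpp_sources
        self.defines = defines

class IdlModule(object):
    def __init__(self, name, idl_files):
        self.name = name
        self.idl_files = idl_files

class Backend(object):
    """Consumes extracted build data. One subclass writes Visual Studio
    projects; another could write a single derecursified Makefile."""

    def consume(self, objects):
        for obj in objects:
            if isinstance(obj, Library):
                self.add_library(obj)
            elif isinstance(obj, IdlModule):
                self.add_idl_module(obj)

    def add_library(self, lib):
        raise NotImplementedError

    def add_idl_module(self, module):
        raise NotImplementedError

The point is that the extraction layer only produces these objects; each backend decides what to do with them.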
Comment 15•13 years ago
Instead of focusing on Visual Studio project files, why not focus on producing GYP files?
Assignee
Comment 16•13 years ago
(In reply to ben turner [:bent] from comment #15)
> Instead of focusing on visual studio project files why not focus on
> producing gyp files?
This is certainly a tempting idea, as production of GYP files would theoretically yield Visual Studio, XCode, Makefile, SCons, etc files, saving the hassle of manually coding these transformations. I will definitely explore this option.
One thing that does worry me is that (assuming the docs [1] are correct) GYP suffers the same "flaw" as CMake in that it produces {Visual Studio, XCode, etc} files that are the minimum to compile the project, nothing more. For example, a "target" in GYP neglects to allow the definition of header files (.h, .hpp, etc.), only allowing source files (.c, .cpp, etc.) to be defined. The resulting Visual Studio projects are arguably barely usable (at least as an IDE), as they would be missing a number of files! (Of course, the produced projects could be consumed by the MSVC tools, which would probably yield faster compilation times than Makefiles on Windows, so all is not lost.)
I have no experience with GYP, so I could be totally off-base here. And, even if this is a current limitation, we could certainly patch or extend GYP to do what we want/need.
Exploring GYP is definitely on my road map. If anyone has any experience with it, I would love to hear details. Otherwise, I guess I'll learn all about it once I start focusing less on the data extraction component.
[1] https://code.google.com/p/gyp/wiki/GypLanguageSpecification
Comment 17•13 years ago
Interesting! Did you handle all the corner-cases in the build system, or are you just glossing over them for now? (Or do we need to fix those to make this actually work properly?)
I'm on the fence about gyp. It seems like they designed something that's good enough for Chromium, but I'm not sure that it meets our needs particularly well.
Assignee
Comment 18•13 years ago
(In reply to Ted Mielczarek [:ted, :luser] from comment #17)
> Interesting! Did you handle all the corner-cases in the build system, or are
> you just glossing over them for now? (Or do we need to fix those to make
> this actually work properly?)
I have handled most of the common cases (variables appearing > 10 times). But, there is a long tail of ~50 uppercase variables only appearing once or twice. I'd like to produce a script making analysis of this tail easier, as I'm sure there are a bunch that could be fixed today without much effort.
For my approach to work reliably and completely, this long tail needs to be handled, reduced, and controlled (via automation) to ensure the tree is in compliance tomorrow as well as today. In other words, we should get an orange if build data is introduced that we don't know how to handle or is a known bad form.
Assignee
Comment 19•13 years ago
(In reply to Brian Smith (:bsmith) from comment #10)
> > # TODO fix directories causing us hurt
> > ignore_dirs = [
> > 'js/src/xpconnect', # hangs
> > 'modules/libbz2', # somehow forks and calls itself recursively
> > 'security/manager', # hangs
> > ]
>
> Do you have any more details on the security/manager hangs? Feel free to ask
> for me to help fix them, since I work on security/manager.
This appears to be a side-effect of the particular (improper) environment from which I was invoking the PyMake Python APIs. However, the behavior of apparent hangs is buggy, so I filed bug 698529.
Assignee
Comment 20•13 years ago
I recently pushed a script (build/parse-tree.py) that prints some metadata extracted from the build tree. It is pretty basic right now, only printing variable counts for variables that exist in the Makefiles and counts of variables currently unhandled by my parsing code.
http://gps.pastebin.mozilla.org/1370358 contains the list of all variables. http://gps.pastebin.mozilla.org/1370361 contains all variables unhandled by my parser. You can see the long tail I've been talking about.
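A rough approximation of what that analysis does, as a sketch (the real parse-tree.py walks evaluated Makefiles through PyMake rather than scanning the raw text, and the HANDLED set below is illustrative):

import re
from collections import Counter
from pathlib import Path

# Variables the parser already understands; everything else is "the tail".
HANDLED = {"MODULE", "LIBRARY_NAME", "CPPSRCS", "CSRCS", "EXPORTS",
           "XPIDLSRCS", "DEFINES", "DIRS"}
ASSIGN_RE = re.compile(r"^([A-Z][A-Z0-9_]*)\s*[:+?]?=")

counts = Counter()
for makefile in Path(".").rglob("Makefile.in"):
    for line in makefile.read_text(errors="replace").splitlines():
        m = ASSIGN_RE.match(line)
        if m:
            counts[m.group(1)] += 1

# Print the unhandled long tail, most common first.
for name, count in counts.most_common():
    if name not in HANDLED:
        print("%5d  %s" % (count, name))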
Assignee
Comment 21•13 years ago
I was playing around with IDLs tonight and was able to integrate my code with the Python IDL parser in xpcom/idl-parser, which conveniently has an API to find file dependencies. I parsed all the IDLs during my data extraction phase then fed this in to a Makefile generator. The end result is a single Makefile which explicitly lists each target with individualized dependencies. I just derecursified IDL generation!
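A sketch of what emitting such a Makefile looks like, given a dependency map already computed with the IDL parser (the function and the header.py invocation are illustrative, not the actual generator code):

# deps maps an .idl path to the .idl files it #includes, as reported by
# the parser in xpcom/idl-parser. Emit one explicit header rule per IDL
# into a single non-recursive Makefile.
def write_idl_makefile(deps, out_dir, fh):
    headers = []
    for idl, includes in sorted(deps.items()):
        stem = idl.rsplit("/", 1)[-1][:-len(".idl")]
        header = "%s/%s.h" % (out_dir, stem)
        headers.append(header)
        fh.write("%s: %s %s\n" % (header, idl, " ".join(includes)))
        fh.write("\tpython xpcom/idl-parser/header.py -o $@ $<\n\n")
    # Aggregate target so `make idl-headers` builds everything.
    fh.write("idl-headers: %s\n" % " ".join(headers))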
Assignee
Comment 22•13 years ago
2 updates from this weekend:
1) Logic for creating a single Makefile for IDL and file exporting now works much better than at the time of the last comment. https://gist.github.com/1363034#file_optimized.mk should run on Linux and hopefully OS X (not tested there, however). Just change the paths at the top. It runs about 2x faster than the current `make export` phase (12.5s vs 25s). Incremental builds are 0.5s vs 5.5s and don't actually evaluate any targets (because the dependencies are fully proper).
2) I wrote a tool to make cross-referencing and analyzing the build system easier. A snapshot is at http://gregoryszorc.com/mozilla/build.html. I'll give it UX love later. If anyone has requests, I can add them in there easily enough. Or, things are on GitHub, so feel free to submit a pull request :)
If you grab my code from Git, you can produce the same file via
$ ./build/parse-tree.py --generate-html=./build.html ~/src/mozilla-central-git/obj-ff-debug/
Makefile-aware functionality could certainly be added to MXR or DXR using this work. Everything I've written is a reusable Python module, so it's just a matter of plumbing.
Assignee
Comment 23•12 years ago
http://gregoryszorc.com/blog/2012/08/28/visual-studio-project-generation-for-mozilla-central/
It's basically IntelliSense only at this point, but it's better than nothing.
Assignee
Comment 24•11 years ago
Not actively working on this. But it's not out of mind.
Assignee: gps → nobody
Assignee
Comment 25•11 years ago
A new moz.build-based build backend for Visual Studio project generation
has been added. The build backend can be used by specifying
'VisualStudio' to the backend option of config.status or mach
build-backend, e.g. `mach build-backend -b VisualStudio`. Run that
command, then open objdir/msvc/mozilla.sln in Visual Studio.
The Visual Studio solution consists of a project for each static library
plus a "binaries" project. Each library project is dependent on
"binaries" and the "binaries" project has a build rule that will shell
out to make to build via `mach build binaries`. If you debug the
"binaries" project, firefox.exe (or whatever the configured application
is) should start.
Generated projects have includes and defines set, so IntelliSense
*should* work. Although, it's not auto-completing for me locally. It
might be my machine.
At this point, I'm interested in collecting feedback to see how useful
this approach is. Does IntelliSense work for you? Do you like the
solution layout? What features is it lacking for this to be usable by
you?
Before this lands, I'll likely:
* Add basic unit tests
* Enable Visual Studio project generation automatically as part of
config.status
* Advertise the existence of Visual Studio files in the output of the
build system
Keep in mind that perfect is the enemy of done. I argue that some Visual
Studio is better than no Visual Studio. Let's try to capture a minimum
required feature set for Gecko developers, land it, and refine later.
Attachment #8376921 -
Flags: feedback?(ehsan)
Attachment #8376921 -
Flags: feedback?(brian)
Assignee
Updated•11 years ago
Assignee: nobody → gps
Status: NEW → ASSIGNED
Comment 26•11 years ago
*awesome*
Comment 27•11 years ago
(In reply to Jim Mathies [:jimm] from comment #26)
> *awesome*
I'll have more feedback beyond this once I finish a build :)
Comment 28•11 years ago
Hmm, I ran into one problem, the src path seems to be broken. In my build I have the src and obj dirs at the same root. Trying to open a src file results in a file not found error, the path displayed shows that the project references the src file in the wrong location. For example:
f:\Mozilla\mc\(src)
f:\Mozilla\relobj (obj)
Trying to open something in widget\windows error path:
f:\Mozilla\widget\windows\(file)
^ missing the 'mc' root directory.
Comment 29•11 years ago
Before I start, this is a great effort, thanks for doing this! I'm going to note the problems I hit below but I don't want the comment to sound too negative because of that, so, great work! :-)
I tried running this command three times, once immediately after I applied your patch on a dirty objdir (had merged with origin/master right before), and got this error:
$ ./mach build-backend -b VisualStudio
0:00.17 c:\moz\src\obj-ff-dbg\_virtualenv\Scripts\python.exe c:\moz\src\obj-ff-dbg\config.status --backend=VisualStudio
Reticulating splines...
Traceback (most recent call last):
File "c:\moz\src\obj-ff-dbg\config.status", line 881, in <module>
config_status(**args)
File "c:\moz\src\python\mozbuild\mozbuild\config_status.py", line 130, in conf
ig_status
summary = the_backend.consume(definitions)
File "c:\moz\src\python\mozbuild\mozbuild\backend\base.py", line 186, in consu
me
for obj in objs:
File "c:\moz\src\python\mozbuild\mozbuild\frontend\emitter.py", line 98, in em
it
for out in output:
File "c:\moz\src\python\mozbuild\mozbuild\frontend\reader.py", line 726, in re
ad_mozbuild
raise bre
mozbuild.frontend.reader.BuildReaderError: ==============================
ERROR PROCESSING MOZBUILD FILE
==============================
The error occurred while processing the following file:
c:/moz/src/js/src/moz.build
The error was triggered on line 21 of this file:
LIBRARY_NAME = CONFIG['JS_LIBRARY_NAME']
The underlying problem is an attempt to write an illegal value to a special variable.
The variable whose value was rejected is:
LIBRARY_NAME
The value being written to it was of the following type:
NoneType
This variable expects the following type(s):
unicode
Change the file to write a value of the appropriate type and try again.
Then I tried ./mach clobber and ./mach build-backend -b VisualStudio again, and got this error this time:
$ ./mach build-backend -b VisualStudio
0:00.19 c:\moz\src\obj-ff-dbg\_virtualenv\Scripts\python.exe c:\moz\src\obj-ff-dbg\config.status --backend=VisualStudio
Error running mach:
['build-backend', '-b', 'VisualStudio']
The error occurred in code that was called by the mach command. This is either
a bug in the called code itself or in the way that mach is calling it.
You should consider filing a bug for this issue.
If filing a bug, please include the full output of mach, including this error
message.
The details of the failure are as follows:
WindowsError: [Error 267] The directory name is invalid
File "c:\moz\src\python/mozbuild/mozbuild/mach_commands.py", line 546, in buil
d_backend
ensure_exit_code=False)
File "c:\moz\src\python/mozbuild\mozbuild\base.py", line 518, in _run_command_
in_objdir
return self.run_process(cwd=self.topobjdir, **args)
File "c:\moz\src\python/mach\mach\mixin\process.py", line 121, in run_process
status = subprocess.call(args, cwd=cwd, env=use_env)
File "c:\mozilla-build\python\lib\subprocess.py", line 524, in call
return Popen(*popenargs, **kwargs).wait()
File "c:\mozilla-build\python\lib\subprocess.py", line 711, in __init__
errread, errwrite)
File "c:\mozilla-build\python\lib\subprocess.py", line 948, in _execute_child
startupinfo)
And then I ran ./mach build, let it run past configure scripts, and this time the command worked. VC++2012 asked me to update the project files to use the Visual C++ 2012 compiler and libraries, to which I said no.
On the support for building, I had aborted the ./mach build command above after the configure step, so did not have a fully built directory. When I used the Build Solution option in Visual Studio, it seemed like it's trying to build the binaries target, which of course won't work:
1>------ Build started: Project: binaries (Visual Studio 2010), Configuration: Build Win32 ------
1> C:/mozilla-build/msys/bin/sh.exe -c c:/mozilla-build/mozmake/mozmake.EXE -C c:/moz/src/obj-ff-dbg -j6 -s backend.RecursiveMakeBackend
1> C:/mozilla-build/msys/bin/sh.exe -c c:/mozilla-build/mozmake/mozmake.EXE -j6 -s binaries
1> From dist/public: Kept 0 existing; Added/updated 0; Removed 0 files and 0 directories.
1> From dist/sdk: Kept 0 existing; Added/updated 0; Removed 0 files and 0 directories.
1> From dist/private: Kept 0 existing; Added/updated 0; Removed 0 files and 0 directories.
1> From dist/bin: Kept 49 existing; Added/updated 0; Removed 0 files and 0 directories.
1> From dist/idl: Kept 1151 existing; Added/updated 0; Removed 0 files and 0 directories.
1> From dist/include: Kept 3809 existing; Added/updated 0; Removed 0 files and 0 directories.
1> From _tests: Kept 13322 existing; Added/updated 0; Removed 0 files and 0 directories.
1> mfbt.lib.desc
1> mozz.lib.desc
1> module.res
1> module.res
1> Creating Resource file: module.res
1> module.res
1> Creating Resource file: module.res
1> Creating Resource file: module.res
1> mozsqlite3.lib.desc
1> Microsoft (R) Windows (R) Resource Compiler Version 6.2.9200.16384
1>
1> Copyright (C) Microsoft Corporation. All rights reserved.
1>
1>
1> crashinject.exe
1> Microsoft (R) Windows (R) Resource Compiler Version 6.2.9200.16384
1>
1> Copyright (C) Microsoft Corporation. All rights reserved.
1>
1>
1> Microsoft (R) Windows (R) Resource Compiler Version 6.2.9200.16384
1>
1> Copyright (C) Microsoft Corporation. All rights reserved.
1>
1>
1> mozglue.dll
1> mozalloc.dll
1> bz2.lib.desc
1> media_libjpeg.lib.desc
1> hostbz2.lib
1> Executing: link -NOLOGO -DLL -OUT:mozalloc.dll -PDB:mozalloc.pdb -SUBSYSTEM:WINDOWS -MACHINE:X86 @c:\moz\src\obj-ff-dbg\memory\mozalloc\tmpuukof5.list module.res -LARGEADDRESSAWARE -NXCOMPAT -DYNAMICBASE -SAFESEH -DEBUG -DEBUGTYPE:CV -DEBUG -OPT:REF c:/moz/src/obj-ff-dbg/dist/lib/mozglue.lib kernel32.lib user32.lib gdi32.lib winmm.lib wsock32.lib advapi32.lib secur32.lib netapi32.lib
1> c:\moz\src\obj-ff-dbg\memory\mozalloc\tmpuukof5.list:
1> msvc_raise_wrappers.obj
1> msvc_throw_wrapper.obj
1> Unified_cpp_memory_mozalloc0.obj
1>
1>LINK : fatal error LNK1181: cannot open input file 'c:/moz/src/obj-ff-dbg/dist/lib/mozglue.lib'
1>
1> c:/moz/src/config/rules.mk:886: recipe for target 'mozalloc.dll' failed
1> mozmake.EXE[2]: *** [mozalloc.dll] Error 1181
1> c:/moz/src/config/recurse.mk:100: recipe for target 'memory/mozalloc/binaries' failed
1> mozmake.EXE[1]: *** [memory/mozalloc/binaries] Error 2
1> mozmake.EXE[1]: *** Waiting for unfinished jobs....
1> Creating library mozglue.lib and object mozglue.exp
1>
1> c:/moz/src/config/recurse.mk:39: recipe for target 'binaries' failed
1> mozmake.EXE: *** [binaries] Error 2
1> 2
1>C:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\V110\Microsoft.MakeFile.Targets(38,5): error MSB3073: The command "mozilla.bat" exited with code 1.
2>------ Build started: Project: AccessibleMarshal (Visual Studio 2010), Configuration: Build Win32 ------
3>------ Build started: Project: CNG (Visual Studio 2010), Configuration: Build Win32 ------
4>------ Build started: Project: G711 (Visual Studio 2010), Configuration: Build Win32 ------
5>------ Build started: Project: IA2Marshal (Visual Studio 2010), Configuration: Build Win32 ------
6>------ Build started: Project: NetEq (Visual Studio 2010), Configuration: Build Win32 ------
7>------ Build started: Project: NetEq4 (Visual Studio 2010), Configuration: Build Win32 ------
2>C:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\V110\Microsoft.MakeFile.Targets(37,5): warning MSB8005: The property 'NMakeBuildCommandLine' doesn't exist. Skipping...
I don't see why we want the binaries target here, and not just a regular build. The latter seems much less error prone. But I guess this is a limitation that we can live with for starters if there is a good reason.
And as you can see towards the end of this log, it's trying to build all of those libraries. It would be nice if we could avoid that. (Definitely not a show-stopper!)
I then tried to test the IntelliSense support, and let Visual Studio run for long enough to index everything in the project. I then tried editing various files across the tree and did not manage to get IntelliSense to work even once. I'm not exactly sure why that is. I know that we don't have the full build information in these projects (it seems like we only have a list of preprocessor symbol definitions, perhaps coming from the DEFINES variable?) so perhaps that is a reason. Anyway, I think BenWa managed to get this to work in his generated project files with hacky.mk so he may have better ideas here.
On the structure of the generated project, this is perhaps my biggest disappointment with the generated project. Can we please avoid leaking the notion of our "libraries" into this project, and instead just use folders and files which map where these source files are in the file system? Besides the fact that these libraries don't really mean anything to most people (what's "CNG" for example? first time I'm hearing of it, and I have hacked on the build system!), a whole bunch of information is actually missing from the project: all headers, IDLs, test files, etc are missing from the project and there is no way to edit them short of opening the file from the disk manually. Also, this way of categorizing things makes things such as the Class View pretty much useless since it groups the classes it finds across individual projects.
The other problem I noticed is that sometimes Visual Studio seems to be confused on where files are, for example, I never managed to open any of the NSPR headers (e.g. prlog.h). The Visual Studio Include Search Paths seems to be set up per project, so not sure what's going wrong there...
Flags: needinfo?(bgirard)
Comment 30•11 years ago
Comment on attachment 8376921 [details] [diff] [review]
Visual Studio project generation
I'm not sure if I'm going to have much to say besides the last comment, redirecting the request to Vlad who actually used BenWa's generated Visual Studio project when he was working on it, and Bas who I think will be interested in testing this in his daily work.
Attachment #8376921 -
Flags: feedback?(vladimir)
Attachment #8376921 -
Flags: feedback?(ehsan)
Attachment #8376921 -
Flags: feedback?(bas)
Comment 31•11 years ago
With hacky.mk I declared the DEFINEs for each file (and a bit of code to group them to make VS' life easier). You can see how I did this in the hacky.mk source. This is useful for auto-completion, but I'm convinced we don't want to build from MSBuild.
Flags: needinfo?(bgirard)
Comment 32•11 years ago
(In reply to comment #31)
> With hacky.mk I declared the DEFINE for each file (and a big of code to group
> them to make VS' life easier). You can find how I did this from the hacky.mk
> source.
Thanks, that confirms what I was remembering. Since we have (most of) DEFINES in moz.build this should be possible to do, and with my upcoming work to port CFLAGS over we'd be in even a better spot.
> This is useful for auto-completion but I'm convinced we don't want to
> build from MSBuild.
Yes, absolutely agreed.
Assignee
Comment 33•11 years ago
(In reply to Jim Mathies [:jimm] from comment #28)
> Hmm, I ran into one problem, the src path seems to be broken. In my build I
> have the src and obj dirs at the same root. Trying to open a src file
> results in a file not found error, the path displayed shows that the project
> references the src file in the wrong location. For example:
Bah. I left some hard-coded paths in visualstudio.py around line 250. Uncomment the commented out lines and paths should be sane. Will be fixed in the next patch.
Assignee
Comment 34•11 years ago
(In reply to :Ehsan Akhgari (needinfo? me!) (slow responsiveness, emailapocalypse) from comment #29)
> I tried running this command three times, once immediately after I applied
> your patch on a dirty objdir (had merged with origin/master right before),
> and got this error:
> Then I tried ./mach clobber and ./mach build-backend -b VisualStudio again,
> and got this error this time:
> And then I ran ./mach build, let it run past configure scripts, and this
> time the command worked. VC++2012 asked me to update the project files to
> use the Visual C++ 2012 compiler and libraries, to which I said no.
Yeah, you need to run configure first before generating this backend. The failures you encountered are partly the failure of config.status to validate state, partly the failures of this patch. I'll try to make things better.
> On the support for building, I had aborted the ./mach build command above
> after the configure step, so did not have a fully built directory. When I
> used the Build Solution option in Visual Studio, it seemed like it's trying
> to build the binaries target, which of course won't work:
>
> I don't see why we want the binaries target here, and not just a regular
> build. The latter seems much less error prone. But I guess this is a
> limitation that we can live with for starters if there is a good reason.
Speed. Speed. Speed.
Currently the VS projects are targeted at Gecko developers. No JS. No jar.mn, etc.
The binaries target exists for Gecko developers. Unfortunately, it doesn't work unless the tree has already been built.
> And as you can see towards the end of this log, it's trying to build all of
> those libraries. It would be nice if we could avoid that. (Definitely not
> a show-stopper!)
Everything except the "binaries" project is listed as a make file based project but has no build command. Building these projects should be a no-op and fast. If you remove the build "rule" from the project, MSBuild complains about there being no build rule. I haven't yet discovered a way to suppress this. I think the current solution is the best we can do.
> I then tried to test the IntelliSense support, and let Visual Studio run for
> long enough to index everything in the project. I then tried editing
> various files across the tree and did not manage to get IntelliSense to work
> even once. I'm not exactly sure why that is. I know that we don't have the
> full build information in these projects (it seems like we only have a list
> of preprocessor symbol definitions, perhaps coming from the DEFINES
> variable?) so perhaps that is a reason. Anyway, I think BenWa managed to
> get this to work in his generated project files with hacky.mk so he may have
> better ideas here.
I populate includes and defines from moz.build. Most of the info should be there. I'd expect some IntelliSense to work.
I guess my failing setup is indicative of others. I'll bang on it until it works.
> On the structure of the generated project, this is perhaps my biggest
> disappointment with the generated project. Can we please avoid leaking the
> notion of our "libraries" into this project, and instead just use folders
> and files which map where these source files are in the file system?
> Besides the fact that these libraries don't really mean anything to most
> people (what's "CNG" for example? first time I'm hearing of it, and I have
> hacked on the build system!),
As of a few months ago, each folder with a compilation rule corresponds to a static library of the name folder.replace('/', '_').
As much as I would like to not leak our concept of libraries into Visual Studio, defines and includes are specified at the project level. Not the directory level. Not the file level. As these vary between directories/libraries, it is necessary to represent each directory/library as its own project. If we tried to collapse things, I think IntelliSense (presumably I'll get it working) will fail hard.
If it makes you feel better, I was as surprised by "CNG" as you were :)
> a whole bunch of information is actually
> missing from the project: all headers, IDLs, test files, etc are missing
> from the project and there is no way to edit them short of opening the file
> from the disk manually. Also, this way of categorizing things makes things
> such as the Class View pretty much useless since it groups the classes it
> finds across individual projects.
I plan to throw those in.
> The other problem I noticed is that sometimes Visual Studio seems to be
> confused on where files are, for example, I never managed to open any of the
> NSPR headers (e.g. prlog.h). The Visual Studio Include Search Paths seems
> to be set up per project, so not sure what's going wrong there...
The include paths are currently defined as $(TopObjDir)\dist\include + LOCAL_INCLUDES. It's missing a few common location. Will fix that.
I /think/ that no matter how much I try, Visual Studio will always be confused about file locations. It will see exported files in both topsrcdir and dist\include, for example. I'm not sure if there's a way to tell it that file X in srcdir is really file Y in objdir. Maybe if I add a MSBuild rule to copy the file the IDE will realize that? There's a lot of functionality hidden in the bowels of MSBuild and the .targets/.props files included as part of Visual Studio. I might be able to cobble something together. I'd appreciate any pointers people may have.
Comment 35•11 years ago
I'm not sure how people are testing intellisense support so quickly, my database is still getting updated. This is after doing a full command line build with vc11, then opening the project in the vs11 ide, converting all projects, and then right-click searching for a class declaration in a cpp. That kicked off a big indexing that's still running.
Assignee
Comment 36•11 years ago
My initial index takes maybe 5 minutes. I'm on an i7-2600K with everything sitting on an SSD.
I believe the IntelliSense database is a Microsoft SQL database backed by a file or two next to the solution. If you are swapping or have slow I/O, you're gonna have a bad time.
Comment 37•11 years ago
(In reply to Jim Mathies [:jimm] from comment #35)
> I'm not sure how people are testing intellisense support so quickly, my
> database is still getting updated. This is after doing a full command line
> build with vc11, then opening the project in the vs11 ide, converting all
> projects, and then right-click searching for a class declaration in a cpp.
> That kicked off a big indexing that's still running.
My indexing finished in a matter of minutes too, and this was running under a VM. But like I said, I couldn't get IntelliSense to work at all!
Comment 38•11 years ago
My initial scan was quick, but once I started trying to find stuff it kicked off a new indexing process. The sql database isn't that large, about 400kb. My src, obj, and sdk are all on a fast ocz-vertex3 ssd. The machine is pretty good too, 4-core intel xeon w3540 @ 2.9GHz.
I think I'll let it finish and see what I get as a result for intellisense support.
Comment 39•11 years ago
(In reply to Gregory Szorc [:gps] from comment #34)
> (In reply to :Ehsan Akhgari (needinfo? me!) (slow responsiveness,
> emailapocalypse) from comment #29)
>
> > I tried running this command three times, once immediately after I applied
> > your patch on a dirty objdir (had merged with origin/master right before),
> > and got this error:
>
> > Then I tried ./mach clobber and ./mach build-backend -b VisualStudio again,
> > and got this error this time:
>
> > And then I ran ./mach build, let it run past configure scripts, and this
> > time the command worked. VC++2012 asked me to update the project files to
> > use the Visual C++ 2012 compiler and libraries, to which I said no.
>
> Yeah, you need to run configure first before generating this backend. The
> failures you encountered are partly the failure of config.status to validate
> state, partly the failures of this patch. I'll try to make things better.
Fair enough. If we have proper checking in place, requiring a configure as a prerequisite for this should not be an issue.
> > On the support for building, I had aborted the ./mach build command above
> > after the configure step, so did not have a fully built directory. When I
> > used the Build Solution option in Visual Studio, it seemed like it's trying
> > to build the binaries target, which of course won't work:
> >
> > I don't see why we want the binaries target here, and not just a regular
> > build. The latter seems much less error prone. But I guess this is a
> > limitation that we can live with for starters if there is a good reason.
>
> Speed. Speed. Speed.
What about correctness? ;-)
> Currently the VS projects are targeted at Gecko developers. No JS. No
> jar.mn, etc.
>
> The binaries target exists for Gecko developers. Unfortunately, it doesn't
> work unless the tree has already been built.
OK, I'm not going to debate this here. I think my position on the binaries target is very clear: it should have proper fallbacks to at least detect when a full rebuild is required. Without that it will keep on causing surprises, but that is only partially relevant to the issue at hand.
> > And as you can see towards the end of this log, it's trying to build all of
> > those libraries. It would be nice if we could avoid that. (Definitely not
> > a show-stopper!)
>
> Everything except the "binaries" project is listed as a make file based
> project but has no build command. Building these projects should be a no-op
> and fast. If you remove the build "rule" from the project, MSBuild complains
> about there being no build rule. I haven't yet discovered a way to suppress
> this. I think the current solution is the best we can do.
OK, that's fair. The problem here isn't so much speed as it was the clutter in the output pane, but if we can avoid generating all of these projects then that should largely stop being an issue.
[snipped]
> > On the structure of the generated project, this is perhaps my biggest
> > disappointment with the generated project. Can we please avoid leaking the
> > notion of our "libraries" into this project, and instead just use folders
> > and files which map where these source files are in the file system?
> > Besides the fact that these libraries don't really mean anything to most
> > people (what's "CNG" for example? first time I'm hearing of it, and I have
> > hacked on the build system!),
>
> As of a few months ago, each folder with a compilation rule corresponds to a
> static library of the name folder.replace('/', '_').
There are way too many exceptions to this rule based on a quick glance over the project.
> As much as I would like to not leak our concept of libraries into Visual
> Studio, defines and includes are specified at the project level. Not the
> directory level. Not the file level.
This may be a by-product of making these Makefile projects. For "regular" Visual Studio projects, you can definitely adjust all of these per file (I'm pretty sure not per-directory though!).
> As these vary between
> directories/libraries, it is necessary to represent each directory/library
> as its own project. If we tried to collapse things, I think IntelliSense
> (presumably I'll get it working) will fail hard.
Well, we should first figure out why IntelliSense doesn't work. Perhaps there is a bigger problem which doesn't interfere here?
> If it makes you feel better, I was as surprised by "CNG" as you were :)
It makes me feel much worse actually! ;-)
[snipped]
> > The other problem I noticed is that sometimes Visual Studio seems to be
> > confused on where files are, for example, I never managed to open any of the
> > NSPR headers (e.g. prlog.h). The Visual Studio Include Search Paths seems
> > to be set up per project, so not sure what's going wrong there...
>
> The include paths are currently defined as $(TopObjDir)\dist\include +
> LOCAL_INCLUDES. It's missing a few common location. Will fix that.
Sounds good.
> I /think/ that no matter how much I try, Visual Studio will always be
> confused about file locations. It will see exported files in both topsrcdir
> and dist\include, for example. I'm not sure if there's a way to tell it that
> file X in srcdir is really file Y in objdir. Maybe if I add a MSBuild rule
> to copy the file the IDE will realize that? There's a lot of functionality
> hidden in the bowels of MSBuild and the .targets/.props files included as
> part of Visual Studio. I might be able to cobble something together. I'd
> appreciate any pointers people may have.
I think Vlad and BenWa were experimenting with symlinks to address the dist/include duplication problem, perhaps check with them on the details? Really I think you should chat with them about this, they learned a lot from their experiments and I think a lot of those lessons would be useful here too!
Comment 40•11 years ago
|
||
On the issue of IntelliSense, please see this if you haven't already: <http://msdn.microsoft.com/en-us/library/vstudio/ms173379%28v=vs.100%29.aspx>
Assignee | ||
Comment 41•11 years ago
|
||
Changes since last upload:
* Assumptions about topsrcdir and topobjdir relative relationship have
been removed.
* mozilla-config.h is now listed as a force include and its defines
have been removed from DEFINES
* More include paths are present and logic should somewhat resemble
what's in config.mk
* .h files added to projects
This is definitely better. IntelliSense still isn't working, though. I think
I've isolated that to the VS default include paths not being pulled in.
Hopefully I'll have a fix in the next hour or so...
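(A minimal sketch of what the force-include change can look like in a makefile-type .vcxproj. The NMake* properties are what Visual Studio reads for IntelliSense in makefile projects, but the defines and paths shown here are placeholders, not the patch's actual output.)

from xml.etree import ElementTree as ET

MSBUILD_NS = 'http://schemas.microsoft.com/developer/msbuild/2003'
ET.register_namespace('', MSBUILD_NS)

project = ET.Element('{%s}Project' % MSBUILD_NS)
props = ET.SubElement(project, '{%s}PropertyGroup' % MSBUILD_NS)

# Force-include mozilla-config.h instead of duplicating its #defines.
ET.SubElement(props, '{%s}NMakeForcedIncludes' % MSBUILD_NS).text = \
    r'$(TopObjDir)\dist\include\mozilla-config.h'
# Remaining defines and search paths still go through the NMake* properties.
ET.SubElement(props, '{%s}NMakePreprocessorDefinitions' % MSBUILD_NS).text = 'XP_WIN'
ET.SubElement(props, '{%s}NMakeIncludeSearchPath' % MSBUILD_NS).text = \
    r'$(TopObjDir)\dist\include'

print(ET.tostring(project).decode('utf-8'))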
Assignee | ||
Updated•11 years ago
|
Attachment #8376921 -
Attachment is obsolete: true
Attachment #8376921 -
Flags: feedback?(vladimir)
Attachment #8376921 -
Flags: feedback?(brian)
Attachment #8376921 -
Flags: feedback?(bas)
Assignee | ||
Comment 42•11 years ago
|
||
IntelliSense now works \o/
Turns out the system default include paths ($(VCInstallDir) and
friends) weren't getting set in the project. I may have to dig into the
bowels of MSBuild to solve this better. But you can't argue with
results.
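(For what it's worth, one way to express this kind of fix -- a guess at the mechanism, not necessarily what the patch does -- is to let the IntelliSense search path fall back to the stock $(IncludePath), which expands to the VC and SDK header directories once the standard C++ props are imported.)

# Illustrative only: build the NMakeIncludeSearchPath value so the tree's own
# headers come first and the compiler/SDK defaults still get appended.
search_path = ';'.join([
    r'$(TopObjDir)\dist\include',
    '$(TopSrcDir)',
    '$(IncludePath)',  # $(VCInstallDir)\include and friends
])
print(search_path)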
Assignee | ||
Updated•11 years ago
|
Attachment #8377233 -
Attachment is obsolete: true
Assignee | ||
Comment 43•11 years ago
|
||
This version disables the building of the various library projects. If
you build the solution, it will only build the "binaries" target. You
can still build an individual library. But it's a no-op/warning.
FWIW, fixing IntelliSense now means that scanning takes longer. It's now
taking ~9 minutes on my machine. The final database weighs in at around 500
MB.
I'm reasonably happy with the state of things right now. Aside from a
few locations where includes aren't getting picked up b/c they are in
Makefile.in (Ehsan has been writing patches to fix those - thanks
Ehsan), parsing and IntelliSense seem to "just work" and my ability to
author C++ code (I don't know the code base too well) is boosted as a
result.
Requesting feedback from interested parties so I have some assurance
others get decent results with this patch.
Attachment #8377254 -
Flags: feedback?(vladimir)
Attachment #8377254 -
Flags: feedback?(jmathies)
Attachment #8377254 -
Flags: feedback?(ehsan)
Attachment #8377254 -
Flags: feedback?(brian)
Attachment #8377254 -
Flags: feedback?(bgirard)
Assignee | ||
Updated•11 years ago
|
Attachment #8377249 -
Attachment is obsolete: true
Comment 44•11 years ago
|
||
Comment on attachment 8377254 [details] [diff] [review]
Visual Studio project generation
Not sure if I have much else to add... Please re-request feedback if you have anything in particular in mind.
Attachment #8377254 -
Flags: feedback?(ehsan)
Comment 45•11 years ago
|
||
Comment on attachment 8377254 [details] [diff] [review]
Visual Studio project generation
Review of attachment 8377254 [details] [diff] [review]:
-----------------------------------------------------------------
I'm using Visual Studio 2013 update 1.
This is great.
Using Ctrl+, to navigate works. Name completion for names that the project is aware of works. Ctrl+F12 works. At least, those things work in the few instances I tried for Necko and PSM. Ctrl+Shift+B for rebuilding "binaries" (after I set "binaries" to be the startup project) worked. When build errors are detected, they are properly highlighted in the build output (note: I have VSColorOutput installed). Also, the error list view was populated.
It seems like the NSS header files aren't indexed, so many NSS symbols are not found. Note that, ideally, these symbols would be found in the headers under $srcdir/security/nss, not the ones that get generated under $OBJDIR/dist/include. This is the biggest thing that would block me from using this project instead of my own hand-made project, AFAICT.
If at all possible, please generate the project outside of $OBJDIR. I often rm -Rf $OBJDIR because "mach clobber" sometimes fails on Windows, but I don't want to blow away the project.
I also think "binaries" should be set as the startup project by default.
When a build fails, the error is properly shown in the output and highlighted in red (with VSColorOutput). But, the error is obscured due to a huge amount of output at the end that looks like this:
366>------ Skipped Build: Project: xpcomsample, Configuration: Build Win32 ------
366>Project not selected to build for this solution configuration
367>------ Skipped Build: Project: xpconnect_s, Configuration: Build Win32 ------
367>Project not selected to build for this solution configuration
...
It would be great to prevent that extra output from showing up at the end.
Again, great work.
Attachment #8377254 -
Flags: feedback?(brian) → feedback+
Comment 46•11 years ago
|
||
Also, it would be great to be able to create a target for "mach xpcshell-test security/manager/ssl/tests/unit" so that I can run my unit tests from within the IDE and then use the error list view and error navigation shortcut keys to jump directly to the source of the error (as reported in JS call stacks).
Comment 47•11 years ago
|
||
In my hand-made project I added security/manager/ssl/tests/unit/*.js so that I can navigate to my xpcshell tests with Ctrl+,. This is another thing that is missing here and would be a regression from my hand-made project.
Again, though, overall I am very impressed and excited about this!
Assignee | ||
Comment 48•11 years ago
|
||
Adding NSS includes to the search path is doable. I just need to expose a make variable to Visual Studio.
I'm going to punt adding .js to the solution to a follow-up primarily because I think it's scope bloat. Furthermore, when I experimented with VS projects over a year ago, adding all the extra .js/.css files to the project made Visual Studio extremely slow. I was using 2008 at the time. Not sure if 2010 and newer can handle it better.
Thanks for the great feedback.
Comment 49•11 years ago
|
||
Comment on attachment 8377254 [details] [diff] [review]
Visual Studio project generation
Review of attachment 8377254 [details] [diff] [review]:
-----------------------------------------------------------------
::: python/mozbuild/mozbuild/backend/visualstudio.py
@@ +96,5 @@
> +
> + if p.startswith('/'):
> + includes.append(os.path.join('$(TopSrcDir)', p[1:]))
> + else:
> + includes.append(os.path.join('$(TopSrcDir)', reldir, p))
I feel like most things here could be shared by a general IDEBackend or something like that.
I'm looking at the .cproject file and it isn't as bad as I thought. Let's refactor the pieces that aren't specific to VS so that we can share them with a CppEclipse backend.
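To make that a bit more concrete, something along these lines (class names invented for illustration; not the real mozbuild classes) would let the path/include munging live in one place while each IDE backend only handles serialization:

import os

class IDEProjectBackend(object):
    """Shared logic that is not specific to any one IDE."""

    def normalize_includes(self, reldir, local_includes):
        # Same normalization as in the hunk quoted above: absolute paths are
        # rooted at topsrcdir, relative ones at the directory being processed.
        includes = []
        for p in local_includes:
            if p.startswith('/'):
                includes.append(os.path.join('$(TopSrcDir)', p[1:]))
            else:
                includes.append(os.path.join('$(TopSrcDir)', reldir, p))
        return includes

    def write_project(self, project):
        raise NotImplementedError

class VisualStudioProjectBackend(IDEProjectBackend):
    def write_project(self, project):
        pass  # emit a .vcxproj

class CppEclipseProjectBackend(IDEProjectBackend):
    def write_project(self, project):
        pass  # emit a .cproject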
Attachment #8377254 -
Flags: feedback?(bgirard) → feedback+
Assignee | ||
Comment 50•11 years ago
|
||
I've added some basic tests and advertisement to config.status.
I believe I've fixed the NSS symbol discovery issue.
This patch also adds projects to the solution for:
* Performing a full build
* Performing a binaries build
* Performing an export build
* Running MOZ_BUILD_APP.exe
* Running js.exe
* Running xpcshell.exe
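(For illustration, the extra projects listed above can be plain makefile-type projects whose build command shells out to mach. The helper and command lines below are assumptions about the shape, not the patch's actual code.)

def utility_project_properties(topsrcdir, mach_args):
    # Makefile-type project: Visual Studio runs the command verbatim.
    command = 'cd /d %s && mach %s' % (topsrcdir, mach_args)
    return {
        'ConfigurationType': 'Makefile',
        'NMakeBuildCommandLine': command,
        'NMakeReBuildCommandLine': command,
    }

# e.g. a project that performs a binaries-only build
print(utility_project_properties(r'c:\mozilla-central', 'build binaries'))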
There are nearly infinite ways this patch can be improved. But, I think
they are all follow-up worthy. Perfect is the enemy of done. Something
is better than nothing. Let's get this into the tree and see what
happens. It's completely opt-in, so it's not like we're forcing a
workflow change on people. The upside is we increase our test audience
and can iterate on improvements faster and easier. Who knows, perhaps we
even increase productivity of Gecko developers while we're at it.
https://tbpl.mozilla.org/?tree=Try&rev=d4b5a40963f8
Attachment #8377316 -
Flags: review?(mshal)
Assignee | ||
Updated•11 years ago
|
Attachment #8377254 -
Attachment is obsolete: true
Attachment #8377254 -
Flags: feedback?(vladimir)
Attachment #8377254 -
Flags: feedback?(jmathies)
Updated•11 years ago
|
Whiteboard: [mach build && mach build-backend -b VisualStudio]
Assignee | ||
Comment 51•11 years ago
|
||
Fix a unit test failure due to Unicode foo.
Attachment #8377346 -
Flags: review?(mshal)
Assignee | ||
Updated•11 years ago
|
Attachment #8377316 -
Attachment is obsolete: true
Attachment #8377316 -
Flags: review?(mshal)
Assignee | ||
Comment 52•11 years ago
|
||
Comment 53•11 years ago
|
||
Comment on attachment 8377346 [details] [diff] [review]
Visual Studio project generation
Congrats on getting this done! Since this is alpha & optional I'm inclined to r+ and hope people can try it out.
>diff --git a/python/mozbuild/mozbuild/config_status.py b/python/mozbuild/mozbuild/config_status.py
>--- a/python/mozbuild/mozbuild/config_status.py
>+++ b/python/mozbuild/mozbuild/config_status.py
>@@ -13,16 +13,17 @@ import os
> import sys
>
> from optparse import OptionParser
>
> from mach.logging import LoggingManager
> from mozbuild.backend.configenvironment import ConfigEnvironment
> from mozbuild.backend.android_eclipse import AndroidEclipseBackend
> from mozbuild.backend.recursivemake import RecursiveMakeBackend
>+from mozbuild.backend.visualstudio import VisualStudioBackend
Does it make sense to move the import (and other Backend imports) down to inside the "elif options.backend == 'VisualStudio'" block? Might help as we start adding more backends to avoid slowing down config_status.py, since presumably we only ever need one active at a time.
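Roughly the shape being suggested (the surrounding config_status.py structure is simplified here): only import the backend that was actually requested, so unused backends never get imported.

def get_backend_class(backend_name):
    # Delay the imports so e.g. a plain `mach build` never pays for
    # importing the Visual Studio or Eclipse backends.
    if backend_name == 'VisualStudio':
        from mozbuild.backend.visualstudio import VisualStudioBackend
        return VisualStudioBackend
    elif backend_name == 'AndroidEclipse':
        from mozbuild.backend.android_eclipse import AndroidEclipseBackend
        return AndroidEclipseBackend
    else:
        from mozbuild.backend.recursivemake import RecursiveMakeBackend
        return RecursiveMakeBackend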
Attachment #8377346 -
Flags: review?(mshal) → review+
Assignee | ||
Comment 54•11 years ago
|
||
Changed to delay imports.
https://hg.mozilla.org/integration/mozilla-inbound/rev/0b1c1795142e
Please test and file follow-ups.
No longer depends on: 487182
Flags: in-testsuite+
Assignee | ||
Comment 55•11 years ago
|
||
https://hg.mozilla.org/integration/mozilla-inbound/rev/33d272d4ad6c
I had to disable the test because of inconsistent failures in automation. Likely cause is os.path.relpath failing due to srcdir and objdir on different drives. Yay for inconsistent slave configuration.
I'll file a follow-up.
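For reference, on Windows os.path.relpath raises ValueError when the two paths live on different drives. A defensive fallback (illustrative only, not necessarily what the follow-up will do) could look like:

import os

def safe_relpath(path, start):
    try:
        return os.path.relpath(path, start)
    except ValueError:
        # e.g. srcdir on c:\ and objdir on d:\ -- no relative path exists,
        # so fall back to an absolute path.
        return os.path.abspath(path)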
https://hg.mozilla.org/mozilla-central/rev/0b1c1795142e
https://hg.mozilla.org/mozilla-central/rev/33d272d4ad6c
Status: ASSIGNED → RESOLVED
Closed: 11 years ago
Resolution: --- → FIXED
Target Milestone: --- → mozilla30
Updated•11 years ago
|
Comment 57•11 years ago
|
||
It works great, thanks for doing this.
Updated•11 years ago
|
Comment 58•10 years ago
|
||
Minor nit from a noob at building Firefox: when the solution generation completes, it outputs this line:
Generated Visual Studio solution at c:/mozilla-source/mozilla-central/obj-i686-pc-mingw32\msvc\mozilla.sln
When I naively copy-pasted that path into the Visual Studio "load solution" dialog, it failed because the path mixes forward and backward slashes. It worked fine once I manually changed the slashes to Windows backslashes.
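A small fix on the printing side would probably do it (a sketch, assuming the path is available as a string named solution_path at the point where the message is printed):

import os

solution_path = 'c:/mozilla-source/mozilla-central/obj-i686-pc-mingw32\\msvc\\mozilla.sln'
# On Windows, os.path.normpath collapses the mixed separators into backslashes,
# so the printed path can be pasted straight into the open-solution dialog.
print('Generated Visual Studio solution at %s' % os.path.normpath(solution_path))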
Updated•7 years ago
|
Product: Core → Firefox Build System