
Visual Studio battles


xwb (Programmer)
I maintain a library with different vcproj files for the different compilers. So typically for project xxx, I'll have

VS2005 xxxv8.sln, xxxv8.vcproj
VS2008 xxxv9.sln, xxxv9.vcproj
VS2010 xxxv10.sln, xxxv10.vcxproj

Battle 1: To go from xxx to xxxv8, I rename xxx.vcproj to xxxv8.vcproj. I then open the project from xxxv8.vcproj and VS creates xxxv8.sln for me. The only problem is that I don't want the project to be called xxxv8, I want it to be called xxx. If I rename the project in VS, it renames the vcproj file back to xxx.vcproj. The only way I've found around this is to let it do everything as xxxv8.vcproj and then go in with vi or notepad and change xxxv8 to xxx in xxxv8.sln. I have to do this for v9 and v10 too. Is there a better way?
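For reference, the bit being edited in xxxv8.sln is the project line, sketched below with a placeholder project GUID. The first quoted string is the name VS shows in Solution Explorer and the second is the project file path; they don't have to match, so changing just the first string from xxxv8 to xxx is the whole edit:

    Microsoft Visual Studio Solution File, Format Version 9.00
    # Visual Studio 2005
    Project("{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942}") = "xxx", "xxxv8.vcproj", "{00000000-0000-0000-0000-000000000000}"
    EndProject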

Battle 2: VS allows you to put the binaries in a particular place but not the obj files; they always go into the default object directory. I put the binaries into bin\v8, bin\v9, bin\v10 or bin\x86\v8 etc. Some of the code uses 32-bit DLLs, so it cannot be platform agnostic. I'd like the objects to go into obj\v8, obj\v9, obj\v10, otherwise the build falls over if I compile for both VS2005 and VS2010. If I do 2005 first, there is a link error in 2010; if I do 2010 first, there is a link error in 2005. I basically have to wipe out the entire object directory before I start a build with a different version of VS. Again, the only way I've found is to go into the vcproj files with notepad and change the object directory. Is there a better way?
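For reference, the per-configuration intermediate directory sits right next to the output directory in the project file (and the same setting is exposed in the IDE under Configuration Properties > General > Intermediate Directory), so the notepad edit amounts to something like the sketch below. The directory names follow the scheme above; everything else is a placeholder:

    VS2005/VS2008 (.vcproj):
        <Configuration
            Name="Release|Win32"
            OutputDirectory="bin\v8"
            IntermediateDirectory="obj\v8"
            ...>

    VS2010 (.vcxproj):
        <PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Release|Win32'">
          <OutDir>bin\v10\</OutDir>
          <IntDir>obj\v10\</IntDir>
        </PropertyGroup>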

Battle 3: When moving between the compilers, Clean Solution doesn't work. It doesn't clear away anything. I actually have to come out of VS and delete both the bin and object directories (this was before I changed the object directory to obj\vx); then it builds properly and Clean works again. Any idea why Clean Solution doesn't work when moving between compilers?



 
Battle 1:
Have you considered separating your sln files from prj files?
In a prior life, we had a Swiss army knife loadout. You'd create a new SLN in the Solutions directory, and then just add the necessary "tool" projects. Note this works best if you are not actually editing the various project files, but rather using them as "pre-built" components. If you needed to do any editing, you had a special single-project sln for it.

Battle 2: You can add pre-build and post-build steps.
You could manually clear out the directory based upon your need.
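As a minimal sketch, a pre-build event command along these lines would wipe the intermediate directory before each build ($(IntDir) is the standard VS macro for the object directory; note that this throws away incremental builds):

    if exist "$(IntDir)" rmdir /s /q "$(IntDir)"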

Lodlaiden

You've got questions and source code. We want both!
Oh? That? That's not an important password. - IT Security Admin (pw on whiteboard)
 
Battle 1: Creating new solutions is OK if you only have to do it once. Where I'm working, they've been sitting on VS8 until now. Now they want support for VS9, VS10 and VS11. Having to create solutions 3 times and adding about 50 projects to each is quite error prone. I've got another one to do with about 150 projects made up of a mix of C#, C++ and C. They are all equally badly behaved when it comes to solutions and projects. C++ is a lot more controllable from VS than C#. If you have to go in and out of 8 sets of dialogs (in C++ we do builds for MT, MTD, ML, MLD on win32 and x64) you soon lose the will to live. Going in with vi and using its macro facility is a lot easier. Management is now asking whether we can port this to Mono and gcc.

Battle 2: I've tried pre-build events before but got into an absolute mess when we were doing parallel builds. It works quite well on sequential builds, but when running parallel builds each compiler needs to write to a different place, otherwise they just mess each other up.

Battle 4: Deployment. Since you can't use Depends (Dependency Walker) on C# programs, you never really know whether you have all the DLLs it needs. I tried going through the GAC and then running Depends on the C++ DLLs, but still managed to miss out the late-loaded ones. If nobody uses that functionality and QA missed it, you will never know. We only found that we had a missing DLL when a customer complained that some feature didn't work. This was after the product had been out in the field for 3 years!

C# is great for coding but building and deployment is a different ball game.
 
I can't comment on Battle 1 because I don't have an answer.

Battle 2: Create your own build file and have it control how things are built. MSBuild is well documented and pretty easy to control.
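A minimal sketch of such a build file, saved as, say, build.proj - the solution name, configurations and platforms are placeholders rather than anything from the projects discussed above. The %(...) metadata makes the MSBuild task batch, i.e. run once per configuration:

    <Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003" DefaultTargets="BuildAll">
      <ItemGroup>
        <BuildConfig Include="win32-release">
          <Configuration>Release</Configuration>
          <Platform>Win32</Platform>
        </BuildConfig>
        <BuildConfig Include="x64-release">
          <Configuration>Release</Configuration>
          <Platform>x64</Platform>
        </BuildConfig>
      </ItemGroup>
      <Target Name="BuildAll">
        <!-- task batching: the MSBuild task runs once per BuildConfig item -->
        <MSBuild Projects="xxx.sln"
                 Properties="Configuration=%(BuildConfig.Configuration);Platform=%(BuildConfig.Platform)" />
      </Target>
    </Project>

Run it with "msbuild build.proj" from a VS command prompt.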

Battle 3: Stop doing manual deploys. You have a really complicated architecture. Time to script deployment. Use something like Thoughtworks Go or Octopus Deploy. It will make life much, much easier. It also sounds like you aren't using Continuous Integration to manage your builds. Put that in place before automated deployment. If your build isn't right, it doesn't matter how automated the deployment is.

Craig Berntson
MCSD, Visual C# MVP,
 
Battle 2: At the moment, this is just a batch script that runs the solution file with the different configurations for all the different versions of VS. What you are talking about is deploying MSBuild inside the solution. Doesn't it already do that? I'll have a look at VS8 and VS9. I know it uses MSBuild on VS10.
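Roughly, that kind of driver script looks like the sketch below; the solution names, configurations and the devenv/msbuild split are assumptions here, not the actual script. The VS2005/VS2008 .vcproj solutions go through devenv, while the VS2010 .vcxproj solution builds natively with MSBuild 4.0 (the VSxxCOMNTOOLS variables are set by the respective VS installers):

    @echo off
    rem VS2005 / VS2008: build the old-style .vcproj solutions with devenv
    call "%VS80COMNTOOLS%vsvars32.bat"
    devenv xxxv8.sln /Build "Release|Win32"
    if errorlevel 1 exit /b 1

    call "%VS90COMNTOOLS%vsvars32.bat"
    devenv xxxv9.sln /Build "Release|Win32"
    if errorlevel 1 exit /b 1

    rem VS2010: .vcxproj solutions build directly with MSBuild 4.0
    call "%VS100COMNTOOLS%vsvars32.bat"
    msbuild xxxv10.sln /p:Configuration=Release /p:Platform=Win32
    if errorlevel 1 exit /b 1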

Battle 3: Continuous Integration is great when the product is new and everyone is still around. When something is 10 years old and has been dropped in your lap, it is a bit more difficult. I have introduced CI on the new parts, but for the old parts that are no longer being developed I have to invent the tests (it is a small dept so I am part QA too) to put some regression testing in after the package has been built. The missing DLLs come from the old bits.

I don't quite understand the difference between manual deployment and automated deployment. At the moment, everything goes into a batch file. It does the build, runs the tests and creates the delivery package. If a new DLL comes along, I just add it to the batch file. When a new test comes along, I add it to the batch file. With the fancy tools, I'll still have to add the DLL to the script: they are not capable of figuring out which bits to deploy automatically. I'll also have to add the test. So what is the difference? In both cases, I still have to add the relevant DLLs and tests to the script, whether it is a batch file, an XML file or clicking somewhere on a UI.

At the moment, when a new version of the compiler comes along, it is just a one-line change in the deployment batch file (unless MS makes a change like it did from VS7.1 to VS8). I'll have to see whether it is just as simple with Octopus Deploy. I suspect it isn't. I've looked at several tools in this area: a batch file is the simplest. It allows very fine control over what goes where, with simple loops. The great thing about batch files is that they are the only tool guaranteed to work across all versions (past and future) of Windows. With other tools, there are all sorts of compatibility issues, e.g. VS8 SP2 is for Vista and W7 only, some tools need the latest version of the .NET Framework, etc. Life is a lot easier when you don't have to fight the tools you use and can deploy them in any environment.
 
Batch files are easy, but eventually you run into issues. Doing builds via a batch file is considered an anti-pattern. How do you detect that the build failed? Or the unit tests?

I don't need to add new tests to the build script. It uses the test project and runs the tests listed there. It also knows what projects are in the solution and builds them. So, if I add a new project or test, I don't do anything with the build script. And, the CI server knows if the build or tests fails and reports that back.

You don't deploy MSBuild inside the solution. MSBuild is part of the .Net Framework, thus it should already be installed on the build server.

In best practice, compile (build) and deployment should be separated. They are tightly coupled in your example. But, they are different things.

Life is a lot easier when you follow best practices and don't have to fight home-grown solutions that eventually fail and that you then have to support.

Craig Berntson
MCSD, Visual C# MVP,
 
Basically what we do is

1) do the build (battles 1, 2 and 3)
2) package it (battle 4)
3) unpackage it into fresh VMs
4) send tests to the various VMs to test the build
5) check the spreadsheet at the end of the run. If none of the tests have failed and the build hasn't failed, then it is a release candidate.
6) at the end, the VMs are wiped so there is no residue of previous tests.

Build failures are quite easy to spot: the build script checks whether the relevant files have been created and writes the result out to the spreadsheet. The first line is titles, the second line is totals. If the total in the Not Created column is more than 0, then something has failed.
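A sketch of that kind of existence check - the file names and CSV layout are made up for illustration:

    @echo off
    setlocal
    set MISSING=0
    rem count deliverables that the build should have produced but didn't
    for %%F in (bin\v10\xxx.dll bin\v10\xxx.lib) do (
        if not exist "%%F" set /a MISSING+=1
    )
    rem redirection comes first so the numeric value isn't parsed as a file handle
    >> results.csv echo v10,%MISSING%
    endlocal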

This is a library that works both in Windows and on Linux (Mono). As such the tests are test programs that use various features of the library.

You've gotten me interested - can your CI server run multi-process tests? That is one of the reasons why we run batch scripts: most automated tools can only cope with single processes. The tests involve from 1 to 4 processes or machines (VMs). We had to write our own tools to send commands to both Windows and Linux VMs and get results back. The results are recorded on a spreadsheet so checking for failures isn't a problem. Just look at the failure column: if it is more than 0, then something failed.

Some of the tests are between Linux and Windows. The batch script will launch the relevant VMs and send the test source files over to get them built using Mono and various releases of VC# Express on 32- and 64-bit platforms. It synchronizes the runs between the Linux and Windows VMs. The VMs write the results to a shared drive and these are recorded on a spreadsheet. I don't know whether there is a CI server that will launch VMs, because the library needs to be tested on 32/64-bit XP, Vista, W7 and Linux. It used to be 2K too at one time.

I haven't even looked at off-the-shelf solutions for these tests recently. The last time I looked (2002), it was £50K with an annually renewable licence, and it didn't even do a quarter of what the batch scripts did. It needed some pretty high-spec hardware, everything had to be pre-built, and it was a Windows Vista-only solution. The best bit was that popup windows, like the ones we used to send using net send, would crash it. We had a demo, the lunch van came, and the receptionist did her usual net send * and crashed the demo.

Yes, it is a home grown solution and yes it does not follow best practice. If I can find a best practice that actually works across several compilers and operating systems, then I am willing to follow it.
 
Most modern CI servers can use multiple processors and even multiple machines. There are also very good tools to handle pushing binaries to VMs (both Windows and Linux), installing, testing, gathering results, and resetting everything. It's all done with scripts.

Craig Berntson
MCSD, Visual C# MVP,
 
Just downloaded Zed to have a play. I just want to see if I'll be writing the same number of scripts, more scripts or fewer, and how difficult or easy it is to use with multiple platforms and processes.

Hopefully it is not one of those "we do it, but not in the community edition you have downloaded - you have to pay $mega for that version" situations. That is the answer I got at British Airways when I asked whether there was an option to keep the menu on the screen long enough for the process to click on it, bring up the submenu and select an item from it. The base version was 50K; it was another 20K for the module with that option.
 