Any tips?

B

Balog Pal

Ok, back to my original comment about the futility of these discussions.
You don't trust me, and I don't trust you. Neither of us is going to
change.

You missed the point even though I described it in an earlier post.
It's not a pissing or dick-measuring contest.

And I'm still interested in *how* you do the wasteless navigation --
can you tell us, or is it top secret?
 
B

Balog Pal

Thanks for the tip; it had not occurred to me to use the arrow keys in the
Ctrl-Tab list, though I have seen it occasionally (for navigating through
the open files I use Alt-W digit or Alt-W W instead).

Now the next step would be to get the arrow keys working in the Alt-Tab
list as well...

The other navigation feature I use a lot is Alt-minus to go to
previous spot(s) (and forward with Shift). That is something I doubt
single-window editors could hope to match.

I find it useful despite having plenty of ways to get information
without leaving the edited spot (like tooltips, or the code definition window).

In the workflow of doing review -- either interactively or just
inspecting patches -- all the navigation support is also critical to
productivity IME. It affects not only raw time but precision. A
feature that shows the documentation of a function just by hovering the
mouse over it can save you from overlooking some precondition or responsibility.
 
N

Nick Keighley

I am coming to C++ from python. Do you guys have any tips for me? Examples: How to learn it effectively, what not to do, etc.

"Accelerated C++" might suit you. It tends to start with the powerful
high-level stuff and only grubs around with pointers and the like
later on.

I liked the original Stroustrup (the ToC could be better!) and he's now
written a book for beginners (no idea what it's like).

I like the Effective books but again I'm a bit out of date on where
these are now.

I can't function without Josuttis on the standard library, but you may
be able to replace it with some of the better online stuff.

Oh and read the comp.lang.c++ FAQ (ignore his rants about "business
value")

Write Lots Of Code.


Happy programming!
 
N

Nick Keighley

    [...]
2. You should never learn "Modern" C++ with a book that was written before
1997. My recommendations:
  1) Andrew Koenig & Barbara Moo. Accelerated C++, Addison-Wesley, 2000.
  2) Bjarne Stroustrup. Programming: Principles and Practice in C++,
     Addison-Wesley, 2009.
  3) Bjarne Stroustrup. The C++ Programming Language, Addison-Wesley,
     2013. (Coming soon)
  4) Stanley Lippman, Josee Lajoie & Barbara Moo. C++ Primer, 2012.
  5) Bjarne Stroustrup. The C++ Programming Language, Addison-Wesley,
     special edition, 2000.
At this time, the point is: use C++98 or C++11. Books
1, 2 and 5 are based on C++98; 3 and 4 are based on C++11.
Of course there are other good books. I believe books 1 and 2
are really good: 1 is concise, and 2 is very detailed about
programming, not just C++.

The choice between 1 and 2 will depend partially on how much
experience you have programming, in general.  If you know no
programming, I would recommend 2, regardless of what language
you ultimately want to program in, because that's what it
teaches.  If you're already an experienced programmer, 1 is
a lot shorter, and will still present everything you need to
know that isn't general programming skills.

    [...]
4. Use a modern C++ programming environment.
My recommendations:
  - Visual Studio 2012, Visual Studio 2010 (Windows)
  - Code::Blocks (Linux/Ubuntu)

I don't know.  I use Visual Studio 2012 (at present) under
Windows, because that's my employer's standard; I've always used
vim, bash and makefiles under Unix.  And the vim, bash and
makefiles environment is far more productive than the Visual
Studio environment.

really? I've used both (well not vim). I've also used Qt Creator. I'd
like to see how you measure "productivity"
 If you're just starting programming,
something like Visual Studio is probably a pretty good idea, so
you don't have to learn everything at once just to compile
hello world.

even compiling hello world is non-trivial with VS!
 But if you are already an experienced programmer,
it's probably worth your while to learn how to use more powerful
tools; there's just so much you can't do in Visual Studio (or
in any of the IDEs I've used under Linux, but I've not tried
any new ones recently).

like? If I want to process text I use Perl and tend to use it as a
scripting language as well (bash syntax drives me nuts) but VS seems
fine for most stuff.
 
W

woodbrian77

In my opinion Solaris (and the numerous OpenSolaris derivatives) have
better developer tools than Linux, especially for analysing applications
(and the OS) in a production environment.

I'm thinking about OpenSolaris again. Currently my Linux
server forks several copies of itself and the children
do an accept on the same socket. IIRC, that doesn't work
on Solaris. If that's right, what's the alternative on
Solaris?

I'm using Arch Linux right now and like it for the
most part. It seems easier to administer than the
version of Linux I was using previously. I haven't
used Solaris in a number of years, so I'm not sure
how the administration of Solaris compares with Linux.
Solaris might be right for me at some point, but I'm not
sure that time has come yet.

Brian
 
N

Nick Keighley

Hopefully he did, or I'd have to start an Emacs-vs-Vim war ;-)

Unix versus (other) IDEs has been discussed continuously for as long
as I can remember, and it never leads anywhere. It's near impossible
to find someone who knows both worlds well enough to have a useful
general opinion, and if such a person exists she can't prove it to us.

Personally: my workflow combined with a decent Unix environment
doesn't feel limiting. And I don't think it's because my brain is
wired in an unusual way. Can I switch to an IDE and change my
workflow to become even /less/ limited? No idea; it would take years
to find out, and I'm not prepared to make that huge investment with
an uncertain outcome.

Also, I've never seen anyone do anything with an IDE that I couldn't
do as well or better. (That's not proof, because most people seem not
to use their tools efficiently at all, no matter what the tools are.)

I tend to find GUIs slicker at what they're designed to do, and they fail
miserably when you want to do something different. Unix GUIs are getting
better but still seem a little clunky compared with Windows (I've no
modern experience of Macs, apart from trying to display a few photos on
one and finding it a nightmare!). Unix shells are nasty until you try
to get anything done on Windows, when you realise how great they are!
Hence my choice of Perl as a scripting language. Its syntax is slightly
less shit than other shells' and it's sort of portable.

I suppose the ultimate would be a GUI with a scripting language (yes,
I've heard of Emacs). I understand Macs have something like this, and I've
used tools that tried to do this (Rational Rose, Understand) but never
really grokked them.
 
J

Jorgen Grahn

You missed the point even though I described it in an earlier post.
It's not a pissing or dick-measuring contest.

It feels a lot like one ... and this will be my last posting in this
part of the thread.
And I'm still interested in *how* you do the wasteless navigation --
can you tell us, or is it top secret?

I didn't understand you were asking a question. Do you mean this from
upthread?

|> It probably fits some workflow when you concentrate on a few-line
|> change. I'm currently reshaping an old codebase that's over 1MLOC in
|> size. Would be dead in the water if could not navigate error
|
|I do these tasks too, and I can assure you am not dead in the water!
|
|> or search
|> locations at once anywhere in the ~3300 files.
|
|Did I say I couldn't do that? I don't use IDEs, but I don't write my
|code in Notepad, either! This is a vital feature.

The last thing I assumed was obvious: tags. AIUI it was the first text
editor feature to target programmers -- it was supported in vi
back around 1980.

If you're asking about the larger "reshaping a large codebase" part, I
have no short, specific answer. I use a mix of techniques and
standard tools.

/Jorgen
 
W

woodbrian77

Are you implying that some fundamental OS-level feature is missing on

Solaris, making this platform unusable for some purposes? That's kind of

hard to believe.

I was unsure if I could find the link again, but did
find it with https://duckduckgo.com

http://plumeria.vmth.ucdavis.edu/info/oreilly/perl/cookbook/ch17_13.htm

That says, "On some operating systems, notably Solaris, you cannot have multiple children doing an accept on the same socket. You have to use file locking to ensure that only one child can call accept at any particular moment."

Is that still the case with Solaris 11?


Brian
Ebenezer Enterprises - Proverbs 3:5,6.
http://webEbenezer.net
 
J

James Kanze

On 3/7/2013 9:10 AM, Jorgen Grahn wrote:
[...]
Yeah, I was pretty shocked to observe that, with all those supposedly
excellent unix-based tools, my colleagues (who were linux-guru level)
could not do as much as walk a list of compiler errors with F4 --
instead they had to take several actions to open the related sources and
navigate to the lines. :-((( I thought we left that behind in the last century.

That would shock me too, since I've been doing it under Unix
(and Windows) for over 20 years. (Of course, it's not F4 but
some other key sequence. But I'm assuming that's not your
point.)
 
J

James Kanze

[...]
I don't know.  I use Visual Studio 2012 (at present) under
Windows, because that's my employer's standard; I've always used
vim, bash and makefiles under Unix.  And the vim, bash and
makefiles environment is far more productive than the Visual
Studio environment.
really? I've used both (well not vim). I've also used Qt Creator. I'd
like to see how you measure "productivity"

Getting working code out of the door. Actually, creating and
maintaining working code effectively, for a complete definition
of working code (i.e. maintainable, tested, documented...).
even compiling hello world is non-trivial with VS!

You mean because you have to create a solution, with a project?

For anything more complex, the fact that you don't have to
write a makefile is a win for a beginner.

For production code, of course, the fact that you can't really
create arbitrary rules, like you can in a makefile, is
a problem. As is sharing "projects" between different
solutions, which use different compiler options. I've done it,
but it involves editing the project files by hand; at that
point, you're better off using makefiles, because the higher
level makefile can pass explicit information down to the project
file. (With VC++, you have to create rules conditioned on the
solution name.)

If you limit yourself to the IDE, just about anything useful.
Try creating a project in which several different sources are
generated from other programs. There's a special mechanism for
the case where a single source and header are generated by
a single tool (e.g. lex and yacc), and you can have one (but
only one) pre-build step. But the build system doesn't
understand any dependencies created by the pre-build, and if you
want two or more operations, you have to wrap them into some
sort of script. (The fact that the build system decides what
needs recompiling *before* doing the pre-build is a serious
error, since the purpose of the pre-build is normally to
regenerate some files.)
If I want to process text I use Perl and tend to use it as a
scripting language as well (bash syntax drives me nuts) but VS seems
fine for most stuff.

Bash syntax is a bit special, but the real problem is that the
individual tools aren't always coherent. You need to learn
several different variations of regular expressions, for
example. It's still an order of magnitude better than Perl, but
I tend to use Python for anything non-trivial today (unless it's
non-trivial enough to justify using C++).
 
I

Ian Collins

James said:
If you limit yourself to the IDE, just about anything useful.
Try creating a project in which several different sources are
generated from other programs. There's a special mechanism for
the case where a single source and header are generated by
a single tool (e.g. lex and yacc), and you can have one (but
only one) pre-build step. But the build system doesn't
understand any dependencies created by the pre-build, and if you
want two or more operations, you have to wrap them into some
sort of script. (The fact that the build system decides what
needs recompiling *before* doing the pre-build is a serious
error, since the purpose of the pre-build is normally to
regenerate some files.)

One advantage of popular cross-platform IDEs like Eclipse and NetBeans is
that they generate makefiles under the hood, and (at least with NetBeans)
those makefiles are designed to support hand editing. The IDE-generated
makefiles can also be used to run builds on remote hosts where running an
IDE is impractical.
 
B

Balog Pal

For production code, of course, the fact that you can't really
create arbitrary rules, like you can in a makefile, is
a problem. As is sharing "projects" between different
solutions, which use different compiler options. I've done it,
but it involves editing the project files by hand; at that
point, you're better off using makefiles, because the higher
level makefile can pass explicit information down to the project
file. (With VC++, you have to create rules conditioned on the
solution name.)

I wonder where this comes from -- MS switched to MSBuild a few years ago, and
in it you can do just about anything you could in a makefile. Yes, those
who have already mastered makefiles may be better off with make. Others
probably are not, as for the sensible stuff MSBuild is way simpler, and plays
naturally with the IDE.

Certainly you CAN, and definitely should, share most rules -- that is done
by creating .props and .targets files, so the .vcxproj has only
the file lists. Certainly any really specific options can still be set (either
per project or per file).

So everything you write about is possible. I don't say it is trivial or
nice, but the same applies to make and probably most other build systems
for big real-life projects.
Bash syntax is a bit special, but the real problem is that the
individual tools aren't always coherent. You need to learn
several different variations of regular expressions, for
example. It's still an order of magnitude better than Perl, but
I tend to use Python for anything non-trivial today (unless it's
non-trivial enough to justify using C++).

Yeah, me too -- IMO Python wins. Why bother with the old scripting stuff
when it provides all the power, is portable, and reads like a real
language instead of emulating old modem output before NO CARRIER. :->
 
I

Ian Collins

Please wrap your lines!
I was unsure if I could find the link again but did find it with
https://duckduckgo.com ..

http://plumeria.vmth.ucdavis.edu/info/oreilly/perl/cookbook/ch17_13.htm

That says, "On some operating systems, notably Solaris, you cannot
have multiple children doing an accept on the same socket. You have
to use file locking to ensure that only one child can call accept at
any particular moment."

Is that still the case with Solaris 11?

It doesn't appear to have been the case for a long time:

https://blogs.oracle.com/pgdh/entry/how_not_to_load_balance1
 
J

James Kanze

James Kanze wrote:
[...]
One advantage of popular cross-platform IDEs like Eclipse and NetBeans is
that they generate makefiles under the hood, and (at least with NetBeans)
those makefiles are designed to support hand editing. The IDE-generated
makefiles can also be used to run builds on remote hosts where running an
IDE is impractical.

I'll have to give Eclipse another try. I did try it, and found
that while it might be OK for Java, it didn't handle C++ very
well, and it didn't integrate foreign editors (like vim) very
well (and, like in most IDEs, its native editor was shit). That
was a long time ago, however, so those problems might be
solved.

Of course, when you've been using makefiles for a while, it
doesn't take much to write a new one; in the simplest cases, you
just define a macro with your sources, and include your master
makefile. And the ones generated by the IDE probably aren't
much use in the more complicated cases, where you have to build
the tool which generates some of your sources. (And of course,
this isn't even possible under VS; a single project can't build
an intermediate .exe needed to build the final .dll.)
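The simplest case mentioned above might look like this (file and variable names are illustrative, assuming a shared master.mk that holds the pattern rules):

```make
# Makefile for one component: just list the sources and pull in shared rules.
SOURCES = parser.cc codegen.cc main.cc
TARGET  = mytool

include ../master.mk    # compile, link, dependency and clean rules live here
```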
 
J

James Kanze

I wonder where this comes from -- MS switched to MSBuild a few years ago, and
in it you can do just about anything you could in a makefile. Yes, those
who have already mastered makefiles may be better off with make. Others
probably are not, as for the sensible stuff MSBuild is way simpler, and plays
naturally with the IDE.

That's simply not true. There are so many things that one can
easily do in makefiles, but which are extremely complicated, if
not impossible, with MS's project and solution files (which is
what the IDE uses). Some examples of things you cannot do
in a Microsoft project:

- Build a local .exe to generate the code you need to compile
for the .dll the project is supposed to build. You have to
create a separate project for the .exe (even if it is built from
only a single .cpp). Which, of course, exposes an internal
detail of the project to everyone working on the solution.

- Use the same project in different solutions, with the
solution passing down information telling the project how to
configure itself. You can work around this somewhat, by using
conditionals in the project file, but it's awkward, since the
support for conditionals only supports full equality. You
can't make something conditional on a regular expression match
on the solution name, for example. And conditions on the
solution name are the only way you can make anything dependent
on the actual solution file.

Add to this that the VS compiler's error messages can be totally
useless when templates are involved. Not always, not even
usually, but from time to time, you'll get an error message
without the slightest indication of where the error is in *your*
sources. And the fact that you can't debug into code which was
compiled elsewhere, even if you have the sources. (I've
recently worked on a Python interface, and at times, it was
useful to step into the Python code, to see what Python was
expecting of me, and what I was doing wrong. The VS debugger
simply refused to do it; luckily, I had access to a Linux box as
well, with gdb.)
Certainly you CAN, and definitely should, share most rules -- that is done
by creating .props and .targets files, so the .vcxproj has only
the file lists. Certainly any really specific options can still be set (either
per project or per file).

Certainly. That's what we do, and I've written Python scripts
to generate new projects, to ensure that every one stays on the
same page. (There are a few things, normally at the top, which
don't seem to work if you put them in a .props: the list of
project configurations, for example. Which was a pain when
I had to add support for 64 bits, but that's not something you
do every day.) But the problem remains: you cannot directly
push down information from the solution file (which in addition
uses GUIDs everywhere, rather than the names of anything, which
makes editing it by hand a real pain). Our .props files are
full of things like:

<attribute_name Condition=" '$(SolutionName)' == 'OneSolution' ">value</attribute_name>
<attribute_name Condition=" '$(SolutionName)' == 'DifferentSolution' ">different_value</attribute_name>
...

Every time we use the project in a new solution, we have to come
back and add it here.
So everything you write about is possible. I don't say it is trivial or
nice, but the same applies to make and probably most other build systems
for big real-life projects.

It's true that most build systems are pretty bad: I've been
using GNU make, largely because I know it, but the make syntax
is horrible. Of the other systems I've tried, however, either
they don't have the support for some feature I need, or they're
simply broken.

The dependency checking in both Jam and VS is seriously broken:
VS fails when pre-build steps modify sources or headers (and
only allows one pre-build step per project, so you have to write
scripts to chain several together), and Jam fails when you use
things like:

#include GB_dependentInclude(syst,dependentSource.cc)

(where, of course, the underlying macros will change the path so
that dependentSource.cc will be found in a system dependent
directory---instead of syst, you could do arch, for architecture
dependent, or comp, for compiler dependent).
Yeah, me too -- IMO Python wins. Why bother with the old scripting stuff
when it provides all the power, is portable, and reads like a real
language instead of emulating old modem output before NO CARRIER. :->

So you've tried Perl as well. :)
 
I

Ian Collins

James said:
James Kanze wrote:
[...]
One advantage of popular cross-platform IDEs like Eclipse and NetBeans is
that they generate makefiles under the hood, and (at least with NetBeans)
those makefiles are designed to support hand editing. The IDE-generated
makefiles can also be used to run builds on remote hosts where running an
IDE is impractical.

I'll have to give Eclipse another try. I did try it, and found
that while it might be OK for Java, it didn't handle C++ very
well, and it didn't integrate foreign editors (like vim) very
well (and, like in most IDEs, its native editor was shit). That
was a long time ago, however, so those problems might be
solved.

NetBeans' C and C++ support improved significantly once it became the
basis of Sun's Studio IDE.
Of course, when you've been using makefiles for a while, it
doesn't take much to write a new one; in the simplest cases, you
just define a macro with your sources, and include your master
makefile. And the ones generated by the IDE probably aren't
much use in the more complicated cases, where you have to build
the tool which generates some of your sources. (And of course,
this isn't even possible under VS; a single project can't build
an intermediate .exe needed to build the final .dll.)

My point was that with NetBeans (and possibly Eclipse) the top-level
makefile is only generated once and is designed for hand editing. I
often do exactly what you mention on Solaris -- build the object files
used by dtrace probes -- as part of my build process.
 
B

Balog Pal

That's simply not true.

'True' is never so simple. :) Well, reading my words, it is possible to
take them too literally -- by "anything" I meant a practical approach. We
have many build systems tuned for different things and ways of working,
each with some weaknesses, but *able* to carry out processing any kind
of inputs into any kind of outputs, using any kind of translators in
between. In actual work we normally bend to what our tools like rather
than waste much time fighting a weakness. And IME it's way less terrible
than it looks, unless build systems are changed every few months.

Yes, VS has plenty of problems writing the project files (at least
VS2010, which I use): the property manager is just FUBAR due to bugs, and
the regular editor is obsessed with writing the attributes for every config.
But the former is hardly ever needed and the latter is easy to overcome
-- and before check-in you look at the diff anyway.
There are so many things that one can
easily do in makefiles, but which are extremely complicated, if
not impossible, with MS's project and solution files (which is
what the IDE uses). Some examples of things you cannot do
in a Microsoft project:

- Build a local .exe to generate the code you need to compile
for the .dll the project is supposed to build. You have to
create a separate project for the .exe (even if it is built from
only a single .cpp). Which, of course, exposes an internal
detail of the project to everyone working on the solution.

You just described how to solve the *build*. So it is not impossible,
right? All tools normally impose some way of organization. So you use
two projects instead of one. Actually I can think of several ways to
work around it, but they all look like way too much hassle for an excessive
requirement (and one pretty moot given the original comparison).
- Use the same project in different solutions, with the
solution passing down information telling the project how to
configure itself. You can work around this somewhat, by using
conditionals in the project file, but it's awkward, since the
support for conditionals only supports full equality.

Well, it's indeed an oversight that you can't define macros and params
directly in the .sln, but you can define them in several ways. And having a
.props file with a condition on the solution name defining them seems
like not such a big difference to me.
You
can't make something conditional on a regular expression match
on the solution name, for example.

I'm not proficient enough in MSBuild to provide actual counterexample code,
but am 99% positive that claim is false. You can use a wide range of
string manipulation and other functions -- I do in practice. (It was a
year ago that I had to fix the build of our project.) I recall I could
use anything you can do with a C# String.
And conditions on the
solution name are the only way you can make anything dependent
on the actual solution file.

I bet in practice you use the project in just a handful of solutions and
can figure out groups with just basic string ops, if regex is really not
accessible -- which may well be the case.
Add to this that the VS compiler's error messages can be totally
useless when templates are involved. Not always, not even
usually, but from time to time, you'll get an error message
without the slightest indication of where the error is in *your*
sources.

Yeah, that is a plain pure PITA and shame on the MS folks. Actually templates
are the better-covered part -- just have an error in a .h file: the
message will not tell you how it was included or even which .cpp was being
compiled. (And it's braindead that /showIncludes is an option that modifies
the project rather than being part of the environment. It can be worked
around somewhat, but it's still a major WTF, especially as it would need
minimal work on the implementation.)

I wish MSVC had diagnostics on the level of gcc. But that's a compiler
issue and has little to do with the IDE or the build.

(For template problems, remember that the 'error list' contains only one
line of info -- if you switch to the output window you (may) find more
diagnostics explaining the case, with navigation and whatnot.)

And the fact that you can't debug into code which was
compiled elsewhere, even if you have the sources.

You what? You can attach to any process, and if one crashes you're
offered a JIT attach. You can load symbols anytime and anywhere from
remote servers, or collect them in your cache, or just ad hoc -- provided
you have them, certainly. (Did you try right-clicking the gray lines in
the stack trace and using 'Load symbols'?)

Without symbols you're really limited to assembly, but it may still be
better than nothing. And IIRC you can force it to use a source file even
if it is not the same as during the compile, at your own risk.
(I've
recently worked on a Python interface, and at times, it was
useful to step into the Python code, to see what Python was
expecting of me, and what I was doing wrong. The VS debugger
simply refused to do it; luckily, I had access to Linux box as
well, with gdb.)

Did you install the Python extension for VS? I never tried a mix like you
describe, but it can definitely walk Python code separately.
Certainly. That's what we do, and I've written Python scripts
to generate new projects, to ensure that every one stays on the
same page. (There are a few things, normally at the top, which
don't seem to work if you put them in at .props: the list of
project configurations, for example.

I didn't have to mess with that part. IMO the configurations should work
if you import the props early enough, but the IDE would likely not
recognize them. (It looks for certain labels to decipher the files, and
if you stray from what it expects, it misreports some elements -- but
they still work fine in the build.)
Which was a pain when
I had to add support for 64 bits, but that's not something you
do every day.)

My observation is also that configurations are a PITA -- and if you try to
mix C# with C++ in a solution it gets out of hand. Fortunately I could
weed everything out and keep just a Debug and a Release from the initial
mess. It's especially "nice" how the solution creates a full matrix of
config and target.

Do I guess right that you tried to sidestep the mess by creating a copy
of the .sln, and use one for 32 bit and the other for 64? I'd probably
try something like that.
But the problem remains: you cannot directly
push down information from the solution file (which in addition
uses GUIDs everywhere, rather than the names of anything, which
makes editing it by hand a real pain). Our .props files are
full of things like:

<attribute_name Condition=" '$(SolutionName)' == 'OneSolution' ">value</attribute_name>
<attribute_name Condition=" '$(SolutionName)' == 'DifferentSolution' ">different_value</attribute_name>
...

Every time we use the project in a new solution, we have to come
back and add it here.

If you use some naming schema that fits the available string
functions you can make it much simpler. I'd use .EndsWith("64") or
.IndexOf to locate substrings. That IMO could save most of the edits in
a practical system.
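A sketch of what that could look like (the property and suffix are made up; it relies on MSBuild 4.0 property functions, which forward .NET String methods):

```xml
<!-- Shared .props: derive a setting from the solution name with a string
     function instead of enumerating every solution in a Condition. -->
<PropertyGroup>
  <PlatformBits Condition="$(SolutionName.EndsWith('64'))">64</PlatformBits>
  <PlatformBits Condition="!$(SolutionName.EndsWith('64'))">32</PlatformBits>
</PropertyGroup>
```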

But even with the raw form, compared to all the crap I face every day,
I would consider that no more than a minor nuisance. Yes, you have to
configure the stuff somewhere -- so have one file that maps solution
names to a set of attribute values, have the projects work with the
latter, and include the mapper up front.

Sure in my practice there's no proliferation of .sln files, others may
not be so fortunate.

http://msdn.microsoft.com/en-us/library/dd633440.aspx

And from hits like these
http://www.msbuildextensionpack.com/help/4.0.5.0/html/9c5401ed-6f55-089e-3918-2476c186ca66.htm
I'd think you can even extend the msbuild expressions.

It's true that most build systems are pretty bad: I've been
using GNU make, largely because I know it, but the make syntax
is horrible. Of the other systems I've tried, however, either
they don't have the support for some feature I need, or they're
simply broken.

I have worked with several build systems and recall none that could be
figured out without massive reading, or in which it was simple to track
down a problem when one surfaced. So my general conclusion is that the
build is a general life-sucker, and I feel lucky if there's a guy around
who can manage it. :) Well, currently I'm out of that luck.
The dependency checking in both Jam and VS is seriously broken:
VS fails when pre-build steps modify sources or headers (and

Well, that one I would not call 'broken'; it's the specified way it does
things. IMO it is possible to make the build fit that model.
only allows one pre-build step per project, so you have to write
scripts to chain several together),

All those steps are defined in some MSBuild script file that can be
rewritten to your liking. Yes, I agree that the stock pre- and post-build
steps are extremely limited, and are just dropped in for simple things.
But it is not an MSBuild restriction; it's rather as if I claimed make is
broken because I didn't like some of the stock rules.

IIRC the stock steps DO have attributes to describe inputs and outputs,
and if you use them correctly, the dependencies will be considered. If
you just modify files without telling the system, that will go unnoticed.
However, I recall that in VS2010 the thing is bugged -- while the
specification and even the UI allow a list of outputs, only the first
one is actually considered. :-(((


My practical problem is not related to the theory of the work at all -- I
consider source generation and other translation solvable.

But I encounter buggy behavior with the dependency output itself. (In
the intermediate folder, files are created with the inputs, outputs,
command lines, etc.) On my machine I see some weird entries listed as
dependencies -- some nvidia drs.bin file belonging to the video driver.
Other times some .h files are missing from the list, with a small but
noticeable probability. I guess it's due to race conditions like those
prevalent in the msdev IDE. Now THAT is something really nasty.

If you called this system unusable for that reason, I'd not say a word
of objection.
 
J

James Kanze

On 3/14/2013 6:34 PM, James Kanze wrote:
[...]
There are so many things that one can easily do in
makefiles, but which are extremely complicated, if not
impossible, with MS's project and solution files (which is
what the IDE uses). Some examples of things you cannot do
in a Microsoft project:
- Build a local .exe to generate the code you need to compile
for the .dll the project is supposed to build. You have to
create a separate project for the .exe (even if it is built from
only a single .cpp). Which, of course, exposes an internal
detail of the project to everyone working on the solution.
You just described how to solve the *build*. So it is not impossible,
right? All tools normally impose some way of organization. So you use
two projects instead of one. Actually I can think of several ways to
work around it, but they all look like way too much hassle for an
excessive requirement (and one pretty moot given the original comparison).

Adding projects right and left breaks encapsulation.
Well, it's indeed an oversight that you can't define macros and params
directly in the .sln, but you can define them in several ways. And
having a .props file with a condition on the solution name defining them
seems not such a big difference to me.
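A sketch of what such a .props file might look like -- the solution
name "ClientA" and the macro are made up for illustration:

```xml
<!-- shared.props: define per-solution macros via a condition -->
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup Condition="'$(SolutionName)' == 'ClientA'">
    <ExtraDefines>CLIENT_A</ExtraDefines>
  </PropertyGroup>
  <ItemDefinitionGroup>
    <ClCompile>
      <!-- Feed the macro into the compiler for every file -->
      <PreprocessorDefinitions>$(ExtraDefines);%(PreprocessorDefinitions)</PreprocessorDefinitions>
    </ClCompile>
  </ItemDefinitionGroup>
</Project>
```

Each .vcxproj that should react to the solution then just imports this
file near the top.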

Yes. Project files seem to lack a few features, but the
solution files are literally from the middle ages.
I'm not proficient enough in msbuild to provide actual counterexample
code, but I am 99% positive that claim is false. You can use a wide
range of string manipulation and other functions -- I do in practice.
[It was a year ago when I had to fix the build of our project.] I
recall I could use anything you can do with a C# String.

I'm basing my statement on the reference
(http://msdn.microsoft.com/en-us/library/7szfhaft.aspx). If
some form of regular expression is supported, Microsoft isn't
documenting the syntax needed to use it. (In other contexts,
I'm sure that Microsoft supports regular expressions, but they
don't seem to provide a syntax for it here.)
I bet in practice you use the project in just a handful of solutions,
and can figure out the groups with just basic string ops if regex is
really not accessible -- which may well be the case.

In practice, I have no idea what solution files are being used.
I deliver library components, which other groups merge into
their final product. I can't make an exhaustive list of all
solutions, because I don't even know all of my clients.
Yeah, that is plain pure PITA and shame on the MS folks. Actually
templates are the better-covered part -- but just have an error in a .h
file, and the message will not tell you how it was included or even
which .cpp was being compiled. (And it's braindead that /showIncludes is
an option that modifies the project rather than being part of the
environment. It can be worked around somewhat, but it's still a major
WTF, especially as it would need minimal implementation work.)

The problem with the .h file seems to be fixed in 2012. On the
other hand, VS parallelizes the builds, and error messages from
different projects are mixed in the output. For the moment, if
I can't find the error quickly, I'll copy-paste the entire
output pane into vim, run a Python script
(SortOutputByProject.py) over it, then search for the error. Once
I know which project isn't building correctly, I can build only
it, and get some usable error output.

[...]
You what? You can attach to any process, and if one crashes you're
offered JIT attach. You can load symbols anytime and anywhere, from
remote servers, collect them in your cache, or just ad hoc -- if you
have them, certainly. (Did you try right-clicking the gray lines in the
stack trace and using 'Load symbols'?)

This seems to be a regression in 2012; I do remember doing
something like what you describe in the past. But when I do
'step into' on a Python function, it still steps over.
Without symbols you're really limited to assembly, but it may still be
better than nothing.

It is. Even in g++, you can't get full symbolic debugging
without all of the symbols. But if you're familiar with the
compiler and the assembler, you can sort of match things up, and
see what's happening. (I'm not trying to debug Python. I'm
just trying to figure out where it's deciding that my input was
wrong.)
And IIRC you can force it to use a source file even if
it is not the same as during compile, at your own risk.

With 2012, you have to. If there is a pre-build step, the
debugger always decides that the library isn't up to date.
(This definitely wasn't a problem in earlier versions.)
I didn't have to mess with that part. IMO the configuration should work
if you import the props early enough, but the IDE would likely not
recognize them. (It looks for certain labels to decipher the stuff, and
if you stray from what it expects, it misreports some elements -- but
they still work fine in the build.)

It only seems to be an issue when you create new projects.
Except for VS completely rewriting your filter files whenever
you add or remove a file.

(There's another weird thing in 2012: my pre-build step will
write the output to a second file, then delete the original and
rename the second -- not really necessary in this case, but old
habits die hard. Rather often, when I do this, VS loses the
generated file. The name still appears in the navigation pane,
but clicking on it gets you nowhere, and if there's an error in
the generated code, clicking on it also does nothing. This
is definitely a regression; I never had any problems with this
in earlier versions.)
My observation is also that configurations are a PITA -- and if you try
to mix C# with C++ in a solution it gets out of hand. Fortunately I
could weed out everything and just have a Debug and Release from the
initial mess. It's especially "nice" how the solution creates a full
matrix of configs and targets.
Do I guess right that you tried to sidestep the mess by creating a copy
of the .sln, using one for 32 and the other for 64 bit? I'd probably try
something like that.

No, we have one .sln for both. It actually works pretty well,
but there were a few painful moments getting there.
If you use a naming scheme that fits the available string
functions you can make it much simpler. I'd use .EndsWith("64") or
.IndexOf to locate substrings. IMO that could save most of the edits in
a practical system.
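If property functions are indeed usable in conditions in your MSBuild
version, the idea would look something like this sketch (the property
name TargetBits is made up):

```xml
<!-- Derive settings from the solution's naming scheme -->
<PropertyGroup Condition="$(SolutionName.EndsWith('64'))">
  <TargetBits>64</TargetBits>
</PropertyGroup>
<PropertyGroup Condition="!$(SolutionName.EndsWith('64'))">
  <TargetBits>32</TargetBits>
</PropertyGroup>
```

Note that $(SolutionName) is only defined when the project is built
through a solution, so a fallback default may be needed.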

We do. The problem is that the conditionals only support ==, !=
and HasTrailingSlash for strings.
But even with the raw form,
compared to all the crap I face every day, I would consider that at most
a minor nuisance. Yes, you have to configure the stuff somewhere
-- so have one file that maps solution names to a set of attribute
values, and have the projects work with the latter. And include the
mapper up front.
Sure, in my practice there's no proliferation of .sln files; others may
not be so fortunate.

This seems to apply to the MSBuild framework, and not to the
project files; at least, it talks of tasks, which aren't present
in the .vcxproj files (at least not that I've seen).
 
N

Nick Keighley

Getting working code out of the door.  Actually, creating and
maintaining working code effectively, for a complete definition
of working code (i.e. maintainable, tested, documented...).

and what evidence do you have that this is easier with vim/bash/make
than VS? Just saying so isn't enough. As I say I've worked in both
environments I don't see the massive disparity you claim. I think
there's a little bit of unix bigotry.
You mean because you have to create a solution, with a project?

yep. And turn off a bunch of MS extensions, and explain that you want to
use ASCII, and I think a couple of other things. Out of the box VS isn't
a C compiler.
For anything more complex, the fact that you don't have to
write a makefile is a win for a beginner.

not just beginners
For production code, of course, the fact that you can't really
create arbitrary rules, like you can in a makefile, is
a problem.  As is sharing "projects" between different
solutions, which use different compiler options.  I've done it,
but it involves editing the project files by hand; at that
point, you're better off using makefiles, because the higher
level makefile can pass explicit information down to the project
file.  (With VC++, you have to create rules conditioned on the
solution name.)



If you limit yourself to the IDE, just about anything useful.
Try creating a project in which several different sources are
generated from other programs.  There's special mechanism for
the case where a single source and header are generated by
a single tool (e.g. lex and yacc), and you can have one (but
only one) pre-build step.  But the build system doesn't
understand any dependencies created by the pre-build, and if you
want two or more operations, you have to wrap them into some
sort of script.  (The fact that the build system decides what
needs recompiling *before* doing the pre-build is a serious
error, since the purpose of the pre-build is normally to
regenerate some files.)

ok. I think you have me! On the last VS project I worked on we avoided
such things (probably because it was hard). The only thing that did do
this seemed to rebuild every time, even when it wasn't necessary.
Bash syntax is a bit special, but the real problem is that the
individual tools aren't always coherent.  You need to learn
several different variations of regular expressions, for
example.  It's still an order of magnitude better than Perl,

really? What tools? sed and awk and such like?
 
N

Nick Keighley

I wonder where this comes from -- MS switched to MSBuild a few years
ago, and in it you can do about anything you could in a makefile. Yes,
those who have already mastered makefiles may be better off with them.
Others probably are not, as for the sensible stuff it's way simpler, and
it plays naturally with the IDE.

Certainly you CAN, and definitely should, share most rules -- that is
done by creating .props and .targets files, so the .vcxproj has only the
file lists. Certainly any really specific options can still be set
(either per project or per file).

So everything you write about is possible, all right. I don't say it is
trivial or nice, but the same applies to make and probably most other
build systems for big real-life projects.
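The shared-rules arrangement described above might look roughly like
this minimal sketch -- the imported file name and source files are
invented for illustration:

```xml
<!-- A .vcxproj reduced to (mostly) the file list; shared compiler
     options live in the imported .props file -->
<Project DefaultTargets="Build"
         xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <!-- Shared settings, maintained once for all projects -->
  <Import Project="..\common\shared-settings.props" />
  <ItemGroup>
    <ClCompile Include="foo.cpp" />
    <ClCompile Include="bar.cpp" />
  </ItemGroup>
  <!-- Stock C++ build rules -->
  <Import Project="$(VCTargetsPath)\Microsoft.Cpp.targets" />
</Project>
```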


Yeah, me too -- IMO with python around, why bother with the old
scripting stuff, when it provides all the power, is portable, and reads
like a real language instead of emulating the old modem output before NO
CARRIER. :->

perhaps I need to look at Python again. I chose Perl for its
integration of regexps. It looked like the language awk should have
been, to me.
 
