Why do people say 'extern' is error-prone?

pereges

Yet I see so much good code that makes extensive use of the extern
keyword. I myself need to make certain data visible throughout the
program. Is it advisable to use extern, or should I just pass the
variables around, return them, and so on?
 
pereges

Richard Heathfield said:
I don't recall seeing any. The extensive usage of the extern keyword rather
spoils them.

Then your best path is probably a re-design.
Try to:

(a) minimise the scope of each object (that is, try to make it visible from
as few places as possible);
(b) use the parameter-passing and return mechanisms for data-sharing, as
far as is practical;
(c) reduce (preferably to one or fewer) the number of objects you need to
give file scope.
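
A minimal sketch of what point (b) means in practice; all the names here
are invented for illustration, not taken from anyone's actual code:

#include <math.h>
#include <stdio.h>

/* Instead of a file-scope  double source[3];  that every function can
   see and modify, only the functions that need the data receive it: */
static double distance(const double a[3], const double b[3])
{
    double dx = b[0] - a[0];
    double dy = b[1] - a[1];
    double dz = b[2] - a[2];
    return sqrt(dx * dx + dy * dy + dz * dz);
}

int main(void)
{
    double source[3] = { 0.0, 0.0, 0.0 };
    double hit[3]    = { 1.0, 2.0, 2.0 };
    printf("%f\n", distance(source, hit));   /* prints 3.000000 */
    return 0;
}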


The way I see it - any function can see, read and write to it, causing
unexpected behaviour, and it makes one wonder where that variable is
defined, what it's used for and where, etc. That is why many people try
to avoid it. But in my case I have a valid excuse for using it. It's
not like I'm using it indiscriminately either. My project is numerical
computation based on ray tracing. I figured out a few things which
should probably have global scope to make things a little smoother -

1. Source and Receiver locations - After the user enters them,
unlikely to change for one complete computation. Needed everywhere for
ray computations.

2. Object Specification - After the particular object has been read
from the ASCII file, it will not change for one complete computation.
This information is also crucial because you are ray tracing against
the object itself.

3. Kd tree - You need it while traversing the rays. It doesn't look good
to pass the root pointer everywhere all the time.



Apart from these three, there is nothing in my program that needs to
have global scope. Even here it's not an absolute necessity, but it
makes things more organized and there is less to worry about.
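
For comparison, here is a rough sketch of the struct-passing alternative
for those three items; every type and field name below is invented, not
taken from the project being described:

struct mesh;       /* the object geometry read from the ASCII file */
struct kd_node;    /* a node of the kd-tree */

typedef struct scene {
    double source[3];               /* source location   */
    double receiver[3];             /* receiver location */
    const struct mesh    *object;   /* object being traced against */
    const struct kd_node *kd_root;  /* root of the kd-tree */
} scene;

/* Each ray routine then takes the scene it works on explicitly: */
int trace_ray(const scene *sc, const double origin[3], const double dir[3]);

Whether one extra parameter everywhere is tidier than three read-mostly
externs is exactly the trade-off being argued about in this thread.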
 
Bartc

Richard Heathfield said:
I don't recall seeing any. The extensive usage of the extern keyword rather
spoils them.

Then your best path is probably a re-design.

Is this the argument against using global variables?

I might be in trouble then. My last proper application used I think between
1000 and 2000 global variables, shared amongst all modules. And that doesn't
include file-scope variables in each module, perhaps another 1000.

I suppose many could have been collected in a giant struct which is then
passed everywhere, but it sounds silly to collect unrelated variables like
that.

And some could be accessed by functions instead, although that just
substitutes global functions for global variables.

What exactly is the problem with global variables?
 
Bartc

Eric said:
Bartc, I *love* your program! It's the perfect adjunct
for my super-duper workspace/gamespace/faceplace app, and the
combination will be so enormously popular that we two will
become Filthy Stinkin' Rich. My plan is to set my program up
as a framework that runs a couple dozen copies of your code,
all as part of one giant program. Is there anything about
your program that might cause us trouble?

I guess it might need tweaking. But I'd imagine a lot of software would
give the same trouble, globals or not.
Okay, that's whimsical -- but I lived through the aftermath
of just such a migration at a PPOE. The program was a document
editor that integrated text, drawings, pictures, charts, tables,
and so on all in one package (run-of-the-mill stuff nowadays,
but this was a couple decades ago). You launched the program,
it opened your document, you fiddled with it, and then you shut
the program down.

Well, that program of mine was some years ago, and I've improved since (I
hope) but still use plenty of globals and file-scope variables where
appropriate.

Looking at that code now, I could probably reduce the globals by half. But
there would still be a good case for leaving the rest in.

The application was a CAD-type product with a built-in language (compiler
and interpreter) for running much of itself and for add-ons. So it was quite
extensive, and it's difficult to imagine how I could eliminate most
globals -- working from its original code.

Starting anew, however, things would be different (a C interpreter core
running most of the code as a dynamic language; with that setup, sharing
problems become simpler).

But take this one example of a global variable (the code was not C):

byte winnt /* set to 1 when this is WinNT, 0 otherwise */

What was I supposed to do with that? A few functions out of thousands need
to know that value.
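
(Bartc's code was not C, but in C the usual way to keep such a flag out of
every translation unit is internal linkage plus an accessor; a rough sketch
with invented names:)

/* platform.c -- sketch only; the names are made up.  The flag has
   internal linkage, so code outside this file cannot write to it. */
static int running_on_winnt;            /* 0 until detection has run */

void platform_set_winnt(int is_winnt)   /* called once at startup */
{
    running_on_winnt = is_winnt;
}

int platform_is_winnt(void)
{
    return running_on_winnt;
}

As Bartc says above, this largely trades a global variable for a pair of
global functions, but at least writes to the flag are confined to one place.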
Then we decided to embed our document editor into a full-
fledged document management system, so you'd work on several
documents at the same time in different windows. Unfortunately,
the editor was festooned and beribboned with global variables
pertaining to "the" document being edited. Thousands of them,
just like your program ...

No, my program could in theory have coped with multiple documents
(drawings), although the user could only edit one. That's why I only had
2000 or so globals..
 
user923005

Bartc said:
[...]
But take this one example of a global variable (the code was not C):
byte winnt    /* set to 1 when this is WinNT, 0 otherwise */
What was I supposed to do with that? A few functions out of thousands need
to know that value.

     I have no patience with zealots who argue that all global
variables are ipso facto evil.  When someone does so, I delight
in offering

        #include <stdio.h>
        int main(void) {
            puts("Hello, world!");
            return 0;
        }

... and inviting him to eliminate the global variable, that is,
the global variable whose name isn't even mentioned!

prog > output.txt
But I guess when you look at the data eventually, the global variable
will rear its ugly head.
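
(The unmentioned global is of course stdout, which puts() writes to; a
minimal sketch of making that stream explicit and passing it in as a
parameter:)

#include <stdio.h>

/* The stream is now a parameter rather than the implicit global stdout. */
static void greet(FILE *out)
{
    fputs("Hello, world!\n", out);
}

int main(void)
{
    greet(stdout);              /* or any stream obtained from fopen() */
    return 0;
}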
     And yet, I feel there is reason to avoid global variables.
They grow like kudzu, they induce unexpected and unwanted
couplings between modules that could have been independent, they
make debugging harder ("I can see that foo() crashes because the
value of global variable bar is invalid, but how in God's green
earth did bar get clobbered?")  And there's always the chance --
as in my lengthy tale of woe upthread -- that your code will be
recycled into a situation where the Singleton is non-singular
and encounters a singularity ...

     I use 'em, but only when there are no witnesses.

After chasing down problems due to globals (it used to be worse, with
Fortran common blocks which were often treated like public unions)
it's easy to get sick of 'em.

They are not all evil. Only most of them.
The ones I hate are the totally unnecessary ones, spawned from lazy
thinking.
 
pereges

So far in my code, I haven't used a single static variable or a const
variable either (I prefer #defining constants and keeping them in one
place, in a common.h file). I just never felt the need so far. I don't
know if the C gurus consider this good practice.
 
Ian Collins

pereges said:
So far in my code, I haven't used a single static variable or a const
variable either (I prefer #defining constants and keeping them in one
place, in a common.h file). I just never felt the need so far. I don't
know if the C gurus consider this good practice.

#define constants are the scourge of the maintenance programmer. They
are often impossible to read in a debugger.
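
For integer constants there is a middle ground: an enumeration constant has
a type, obeys the usual scope rules, and normally shows up in the debug
information, so the debugger has a symbol to print. A minimal sketch
(MAX_DEPTH is an invented name):

#include <stdio.h>

enum { MAX_DEPTH = 32 };   /* a typed constant instead of #define MAX_DEPTH 32 */

int main(void)
{
    printf("%d\n", MAX_DEPTH);
    return 0;
}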
 
Bartc

Richard Heathfield said:
#define has gained a strangely poor reputation amongst some C programmers,
and one that I don't really understand.

I have a problem with #define, finding it a crude way of defining constants
which you don't want as variables.

And, in a language that frowns on globals, #defines have file scope, so I
can't write:

#define size 1200

int main(void)
{
#define size 300
}

Using const int size=1200 works, but it might generate an unnecessary
variable. And it's not that difficult to change these 'constants':

#include <stdio.h>

int main(void)
{
    const double pi = 3.142;
    double *p = (double *)&pi;  /* cast away the const */

    *p = 2.71828;               /* modifying a const object: undefined
                                   behaviour, but it often "works" */

    printf("Pi = %f\n", pi);
    return 0;
}

Something in-between is needed! Namely, a true immutable constant with a
proper type and normal scope rules.
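
For integer constants, enum already gets part of the way there: an
enumerator has type int, cannot be assigned to, and follows normal scope
rules, so the size example can be written as below. It does nothing for a
double like pi, though.

#include <stdio.h>

enum { size = 1200 };           /* file scope */

int main(void)
{
    enum { size = 300 };        /* block scope: hides the file-scope constant */
    printf("%d\n", size);       /* prints 300 */
    return 0;
}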
The principal objection seems to
be that, in a debugger, they are "impossible to read".
I just tried this in gdb. I don't use gdb a great deal, but I know /how/ to
use it, so I wrote the following program:

I don't use debuggers either. But I think you're saying that, to examine a
'variable' that the debugger doesn't know, one has to go back to the source
code, work backwards to the definition of that variable, and with luck one
may stumble across a simple #define with that value!

Or it could be an expression involving further #defines buried in a myriad
of include files somewhere.

So yes, I can see the problem.
 
Antoninus Twink

I just tried this in gdb. I don't use gdb a great deal, but I know /how/ to
use it, so I wrote the following program:

#include <stdio.h>

#define X 42

int main(void)
{
printf("%d\n", X);
return 0;
}

and loaded the program into gdb.

+++++++++++++++++++++++++++++++++++++
(gdb) run
Starting program: [...]./foo

Breakpoint 1, main () at foo.c:7
7 printf("%d\n", X);
(gdb) print X
No symbol "X" in current context.
+++++++++++++++++++++++++++++++++++++

Oh deary deary me. But wait!

+++++++++++++++++++++++++++++++++++++
(gdb) list
2
3 #define X 42
4
5 int main(void)
6 {
7 printf("%d\n", X);
8 return 0;
9 }
10
(gdb)
+++++++++++++++++++++++++++++++++++++

Also, of course, we can simply look in the original source code to see the
value. So I really don't see this as being a serious objection.

If you're using gcc as a compiler, it supports a -g3 debugging level,
which includes extra information, such as all the macro definitions
present in the program. Some debuggers (notably gdb) then support macro
expansion.

(gdb) r
Starting program: [...]/foo

Breakpoint 1, main () at foo.c:7
7 printf("%d\n", X);
(gdb) p X
$1 = 42
(gdb)
 
Flash Gordon

Richard Heathfield wrote, On 12/04/08 07:24:
#define has gained a strangely poor reputation amongst some C programmers,
and one that I don't really understand. The principal objection seems to
be that, in a debugger, they are "impossible to read".

The solution is to upgrade to a better system if that is possible.
Admittedly sometimes it is not.
I just tried this in gdb. I don't use gdb a great deal, but I know /how/ to
use it, so I wrote the following program:

Either you are running versions of gcc and gdb which are too old or you
don't know how to drive them well enough.
#include <stdio.h>

#define X 42

int main(void)
{
printf("%d\n", X);
return 0;
}

and loaded the program into gdb.

+++++++++++++++++++++++++++++++++++++
(gdb) run
Starting program: [...]./foo

Breakpoint 1, main () at foo.c:7
7 printf("%d\n", X);
(gdb) print X
No symbol "X" in current context.
+++++++++++++++++++++++++++++++++++++

Oh deary deary me. But wait!

markg@brenda:~$ gcc -g3 -ansi -pedantic -Wall -Wextra t.c
markg@brenda:~$ gdb ./a.out
GNU gdb 6.6-debian
Copyright (C) 2006 Free Software Foundation, Inc.
GDB is free software, covered by the GNU General Public License, and you are
welcome to change it and/or distribute copies of it under certain
conditions.
Type "show copying" to see the conditions.
There is absolutely no warranty for GDB. Type "show warranty" for details.
This GDB was configured as "i486-linux-gnu"...
Using host libthread_db library "/lib/tls/i686/cmov/libthread_db.so.1".
(gdb) b main
Breakpoint 1 at 0x8048385: file t.c, line 7.
(gdb) r
Starting program: /home/markg/a.out

Breakpoint 1, main () at t.c:7
7 printf("%d\n", X);
(gdb) p X
$1 = 42
(gdb)

Works for me.

It depends who you talk to, I think. (It also depends on whom you consider
to be gurus.)

I would use #define (or enum for an integer if I want scope) rather than
a const variable. However, opinions do differ.
 
Flash Gordon

Richard Heathfield wrote, On 12/04/08 13:13:
It's the former - my gdb is "too" old (well, it's old, anyway).

So now you have a reason to upgrade your tool chain :)
 
Flash Gordon

Richard Heathfield wrote, On 12/04/08 17:10:
I do?

Yes. Whether it is a sufficient reason is for you to decide, but it is a
reason.
Why would *I* want to look up #define values in gdb? That's what
editors are for, innit? (Or grep.)

If you are in gdb, why switch to another tool? It also means that you can
evaluate a sub-expression easily using copy/paste, even if it contains
macros.
I have never really understood this craze for plugging the latest bugs into
a production environment, where there is no good business case for doing
so. And as far as I'm concerned, my principal development machine *is* a
production environment, so I don't change *any* of its tools (and least of
all its OS) without a cracking good reason and a fair amount of testing.

You don't need to go to the latest versions to get this feature.

In any case, I did not say it was sufficient reason on its own, but it
is a reason.
 
Richard

Richard Heathfield said:
I do? Why would *I* want to look up #define values in gdb? That's what
editors are for, innit? (Or grep.)

One has to laugh. You are clearly totally ignorant of how to use a good
debugger. I hate to think of the time and money you waste doing it "your
way" using outdated tools and thinking.
I have never really understood this craze for plugging the latest bugs into
a production environment, where there is no good business case for doing so.

Latest bugs?

And as far as I'm concerned, my principal development machine *is* a
production environment, so I don't change *any* of its tools (and least of
all its OS) without a cracking good reason and a fair amount of testing.

Yes, and travelling at 25 mph on a train will cause your head to blow
off.

Sheesh.
 
