it would be a local variable I'd be undefining, probably not an
instance variable (although I'd like that option also)
In your code you used global variables, and they work differently
in Perl than you seem to think. If you use a global variable,
the Perl interpreter reserves an entry in the stash of that
variable's package. If you do "undef $a" you do the equivalent
of "$a = undef". This example shows that the entry exists even
without assigning a value to the variable $a:
perl -MData::Dumper -e 'sub d { print Dumper [ map { $_, ${"$_"} }\
grep { /^a$/ } keys %{*{main::}} ] }; d(); $a; d(); '
If you use @a as a global variable it's no different; in fact
the same stash entry is used, and that entry can hold different
things at the same time: scalars (which can be references or blessed
references, too), globs, code objects, arrays, and hashes. (Thanks
for reminding me how horrible Perl's internals are, ugh!)
If you call defined on @a, what you are actually doing is checking
whether the array @a contains any elements, because the stash entry
exists in any case. Using defined on aggregates is deprecated, see
perldoc -f defined:
: Use of "defined" on aggregates (hashes and arrays) is deprecated.
: It used to report whether memory for that aggregate has ever been
: allocated. This behavior may disappear in future versions of Perl.
: You should instead use a simple test for size:
:
: if (@an_array) { print "has array elements\n" }
: if (%a_hash) { print "has hash members\n" }
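One caveat worth flagging before carrying that "simple test for size"
idiom over to Ruby: unlike in Perl, an empty Array or Hash is still
truthy in Ruby, so you have to ask for emptiness explicitly. A small
sketch (not from the original post, just illustrating the difference):

```ruby
arr = []
hsh = {}

# Unlike Perl's "if (@an_array)", an empty container is truthy in Ruby:
p !!arr        # true
p arr.empty?   # true -- Ruby's equivalent of the size test
p hsh.empty?   # true
```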
Ruby's global variables work the same way as Perl's do. But if
you use local variables (starting with a lowercase letter),
the variable is not defined? until the interpreter sees an
assignment to it. BTW, it's not necessary for the assignment
to actually be executed:
ruby -e 'p defined?(a); false and a = nil ; p defined?(a)'
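Written out longhand, the one-liner above is doing this (a sketch;
the variable name a is arbitrary):

```ruby
# The parser, not execution, is what creates the local variable:
p defined?(a)   # nil -- the parser has not seen 'a' yet
if false
  a = 1         # never executed, but now 'a' is a known local
end
p defined?(a)   # "local-variable"
p a             # nil -- declared, but never assigned at run time
```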
If you use lexical variables in Perl, you have to use "my $a" to reserve
memory for them in the current scope. In Perl it's of course not possible
to have a lexical variable that is undefined in the Ruby way, that is,
undefined before the "my $a", because it would be interpreted by Perl as
a global variable, which would lead to an entry for it in the stash as
shown above.
only if the logic you use makes the code smaller or more efficient...
Premature optimization is the root of all evil...
[defined?]
unfortunately it doesn't do me any good, as I'll want to undef it
later.
So you can do "a = nil" or "$a = nil" and check whether a.nil? is
true. Or, perhaps better, just rely on the fact that nil is false in
Ruby. Using nil is equivalent to "undef $a" in Perl.
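A tiny sketch of that final suggestion (the variable name is arbitrary):

```ruby
a = 42
p defined?(a)   # "local-variable"

a = nil         # the closest Ruby analogue to Perl's "undef $a"
p a.nil?        # true
p(a ? "truthy" : "falsy")   # "falsy" -- nil is false in boolean context
```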