Non-constant constant strings


Ian Collins

Rick said:
I am trying Solaris Studio 12.3 right now. It uses Netbeans 7.0. Is there
a way to upgrade it to 7.3 or later? 7.0 has weird errors.

Not currently. There should be a long overdue update in the next few
months.
One is: if you launch less than fully maximized, and then later maximize,
the mouse does not track appropriately.

I've never seen this.
 

Kaz Kylheku

[QUOTE="Richard said:
It sounds like you style is better suited to interpreted languages than
to a compiled language such as C. I'm sure I'm not alone in considering
having to use a debugger as a failure of the development process...
What nonsense. A debugger is an integral part of any development
process.

You certainly know by now (having been around this newsgroup for a few
years or so) that it is an integral part of the CLC religion that real men
don't use debuggers. Debuggers are for sissies.

Not using a debugger is like building some electronic device and never using
a multimeter or oscilloscope.

Even $2 microcontrollers have debuggers nowadays. I have here a PIC
Microstick: a dev board the size of a stick of gum. It hooks up to your PC via
USB cable and you can single-step through C running on the PIC24 chip.
 

David Thompson

On Tue, 21 Jan 2014 11:09:03 -0500, James Kuyper wrote:
Much simpler would be the following:
[string-lit concatenation to get the strings in adjacent memory]
char read_write[] =
    "if (something[9999])\r\n\0"
    "{\r\n\0"
    " // Do something\r\n\0"
    "} else {\r\n\0"
    " // Do something else\r\n\0"
    "}";

// Count the strings
int strings = 0;
for(char *ptr = read_write; ptr < read_write + sizeof read_write; ptr++)
    if(*ptr = '\0')
        strings++;

s/=/==/ or just if(!*ptr)

// Set up an array of pointers to the strings.
char **rw_ptrs = malloc(strings * sizeof *rw_ptrs);
if(rw_ptrs)
{
    char *ptr = read_write;
    for(int str = 0; str < strings; str++)
    {
        rw_ptrs[str] = ptr;
        while(*ptr++); // move past end of string
    }
}

But it bothers me to have the two loops structured differently.
I would feel more comfortable doing:

int n = 0; char *ptr = big_array;
while( ptr < end_of_array ){
    ++n; ptr = strchr(ptr, '\0') + 1; }
/* or ++n; while(*ptr++){} */

rwptrs = malloc(n * sizeof *rwptrs);
if( !rwptrs ) /* error */

n = 0; ptr = big_array;
while( ptr < end_of_array ){
    rwptrs[n++] = ptr; ptr = strchr(ptr, '\0') + 1; }
/* similarly */

(Also, I prefer to handle error branches first
and then forget about them where reasonable;
for one it doesn't matter, but for 5 or 10 it
avoids the normal-case code, which I am most
interested in, being indented far too deeply.)
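
Putting the two variants together, here is a minimal self-contained sketch
of the two-pass approach (my own assembly of the fragments above, not from
either original post; it reuses the read_write array from the example and
handles the error branch first, as preferred):

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    /* Adjacent NUL-separated strings, as in the example above. */
    static char read_write[] =
        "if (something[9999])\r\n\0"
        "{\r\n\0"
        " // Do something\r\n\0"
        "} else {\r\n\0"
        " // Do something else\r\n\0"
        "}";
    char *const end_of_array = read_write + sizeof read_write;

    /* Pass 1: count the strings. */
    int n = 0;
    for (char *ptr = read_write; ptr < end_of_array;
         ptr = strchr(ptr, '\0') + 1)
        ++n;

    /* Error branch first, then forget about it. */
    char **rwptrs = malloc(n * sizeof *rwptrs);
    if (!rwptrs) {
        fprintf(stderr, "out of memory\n");
        return EXIT_FAILURE;
    }

    /* Pass 2: record the start of each string; same loop shape. */
    int i = 0;
    for (char *ptr = read_write; ptr < end_of_array;
         ptr = strchr(ptr, '\0') + 1)
        rwptrs[i++] = ptr;

    for (int s = 0; s < n; s++)
        printf("string %d: %s\n", s, rwptrs[s]);

    free(rwptrs);
    return 0;
}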
 

David Thompson

Note: mostly OT but the comparison can be of interest

Rick C. Hodgin <[email protected]> wrote:
I personally believe it's a silly requirement to [share string
literal values] to save a few bytes of space by default. I'd
rather have it always duplicated and then allow the developer
to provide a manually inserted command line switch which
specifically turns on that kind of checking, and that kind
of substituting.
As already explained, it's not a requirement. Because programs
aren't allowed to mutate string literals, an implementation *can*
share them or *can* make them distinct.
I believe Java requires String constants in the same class
(maybe only method) to have the same reference value. That is, in

if("string"=="string") ...

the if condition will be true. As far as I know, C doesn't
require that, but allows for it.
C yes. Java requires it across all methods and (probably*) all
classes. Specifically, the String class has a method to 'intern'
a string such that interning multiple strings with the same value
(.equals() is true) re-uses the same object, so they are reference-==.
String literals are automatically interned.

Somewhat similarly but further off topic, the boxed types
(like Integer for int and Boolean for boolean) are required to provide,
and the boxing conversions to return, the same (immutable)
object for multiple uses of the same boolean or byte value, and
(implementation-dependent) small values of other integer types.

* Normally there is only one instance of the Class for String.
I speculate a sufficiently sophisticated, or broken, classloader
could create multiple String Class'es, and then it's not clear what
is required; I expect you'd get one intern cache per String Class.

As already observed, C also allows (but doesn't require),
and some compilers do implement, sharing of one string literal
with a suffix of another (which works for null-terminated strings).
Java does not require sharing a literal with a substring of another,
even though the original (Sun) implementation of String made
substrings cheap, and I have never seen the Suncle compiler
do so. Apparently luckily so, since I just learned belatedly that
7u6 abruptly changed this; since then String.substring(int,int)
doesn't share the original string's char[]; instead it copies,
and the .offset and .length instance fields were dropped.
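
Whether a given C implementation actually takes either liberty is easy to
observe. A minimal sketch (my own, not from the posts above); any output
is conforming, and the result varies with compiler, linker, and options:

#include <stdio.h>

int main(void)
{
    const char *a = "hello world";
    const char *b = "hello world"; /* may or may not be the same object as a */
    const char *c = "world";       /* may or may not be placed at a + 6 */

    /* Equality comparison of these pointers is valid C;
       the outcome is simply unspecified by the language. */
    printf("identical literals shared: %s\n", a == b ? "yes" : "no");
    printf("suffix shared:             %s\n", c == a + 6 ? "yes" : "no");
    return 0;
}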
 

Kenny McCormack

You certainly know by now (having been around this newsgroup for a few
years or so) that it is an integral part of the CLC religion that real men
don't use debuggers. Debuggers are for sissies.

Not using a debugger is like building some electronic device and never using
a multimeter or oscilloscope.

Even $2 microcontrollers have debuggers nowadays. I have here a PIC
Microstick: a dev board the size of a stick of gum. It hooks up to your PC via
USB cable and you can single-step through C running on the PIC24 chip.

Sissy!

(Kiki and the others wouldn't need no steenkin' debugger!)
 

glen herrmannsfeldt

Kaz Kylheku said:
Not using a debugger is like building some electronic device
and never using a multimeter or oscilloscope.

Well, maybe like not using the oscilloscope. Pretty often I can
debug knowing the statement where it died. Java, for example,
normally tells you where a subscript bounds error occurred.
That might compare to the multimeter, which will give you
the AC or DC voltage, but not the details of the waveform
that the oscilloscope gives.
Even $2 microcontrollers have debuggers nowadays. I have here a PIC
Microstick: a dev board the size of a stick of gum.
It hooks up to your PC via USB cable and you can single-step
through C running on the PIC24 chip.

-- glen
 

glen herrmannsfeldt

(snip)
C yes. Java requires it across all methods and (probably*) all
classes. Specifically, the String class has a method to 'intern'
a string such that interning multiple strings with the same value
(.equals() is true) re-uses the same object, so they are reference-==.
String literals are automatically interned.

I knew it did within a class, but I had figured that each
class was compiled separately with its own strings. I never
tried to test it between classes.

(snip)
As already observed, C also allows (but doesn't require),
and some compilers do implement, sharing of one string literal
with a suffix of another (which works for null-terminated strings).
Java does not require sharing a literal with a substring of another,
even though the original (Sun) implementation of String made
substrings cheap, and I have never seen the Suncle compiler
do so.

Some years ago, I had a program processing lines with StringTokenizer,
and storing some of the words in a Hashtable. It started to run out
of memory faster than I thought it should. I finally found that it
was storing the words, generated by substring() inside
StringTokenizer, along with each whole line's char[] array.
Using new String() on each word before storing it in the hash
table fixed that one. It wasn't documented well by Sun.
Apparently luckily so, since I just learned belatedly that
7u6 abruptly changed this; since then String.substring(int,int)
doesn't share the original string's char[]; instead it copies,
and the .offset and .length instance fields were dropped.

-- glen
 

Ian Collins

Kaz said:
You certainly know by now (having been around this newsgroup for a few
years or so) that it is an integral part of the CLC religion that real men
don't use debuggers. Debuggers are for sissies.

Not using a debugger is like building some electronic device and never using
a multimeter or oscilloscope.

More like not using a crutch when you sprain your ankle. The trick is
to avoid the sprain in the first place :)
 

David Brown

Linus Torvalds does all of his development without a debugger. Some
people do this. I cannot. I can get all of my logic working perfectly
pretty much all of the time when just writing code top-down, but only
about 98% of my code is perfect in syntax. I make little mistakes.
Sometimes odd mistakes. I usually understand what should be coded,
but I simply type things wrong sometimes, and don't catch them when
reading the code -- and sometimes I don't catch it at all until the
faulty results are there and I have to go back through. I usually find
something I had no intention of typing, and wasn't part of my thought
processes, but was something that simply came out wrong when I was
typing. That doesn't happen as often, but probably once every two
weeks.

Your claims and your numbers just don't match up. On the one hand, you
say that edit-and-continue is the greatest thing since sliced bread and
that you practically live inside the debugger. On the other hand, you
claim your program logic is typically perfect first try, and you make
small syntax errors which you correct with a debugger (!) every two
weeks. And that this /one/ feature of the VS debugger /always/ leads to
a vast improvement in productivity for /all/ developers for /all/ types
of system.

And now you are going to single-handedly revolutionise the entire
software development world by inventing a language that breaks
well-known and well-grounded rules for maximising structure and
minimising problems (by making everything read/write - code, data, etc.,
and preferably global as well). Despite breaking the rules that are
particularly important in parallel systems, your language will
apparently be perfect for the hundred-core cpus that will dominate every
part of computing in the near future. It will be ideal for everything
from the smallest embedded system (once we have replaced these silly
flash devices with /real/ read-write memories for 50x productivity
gains) to the biggest machines - just as long as they are x86, ARM, or
an imaginary base-3 cpu. And it will run on Windows using OpenGL,
because that's the future of the world.

I've got to admire your optimism and your persistence. But don't give
up the day job.
 

David Brown

You certainly know by now (having been around this newsgroup for a few
years or so) that it is an integral part of the CLC religion that real men
don't use debuggers. Debuggers are for sissies.

Yeah I know.

What a load of unadulterated bullshit.

Especially with modern frameworks and APIs.

I would find it incredible, bordering on incompetent, if someone didn't
familiarise themselves with a large system by stepping through with a
debugger and examining the key data exchange points. Superb systems like
IntelliJ didn't get built because they were unnecessary. Not to mention
the setting of breakpoints/watchpoints in order to catch bugs. Using
printf is WRONG.

I can only assume the guys here have never worked on anything more than
petulant 2-liners to demonstrate why sizeof with brackets is wrong....

/Debugging/ is a vital part of development - we may strive to make it
unnecessary, but it is rare that code is bug-free when it is ready
for live testing. But /debuggers/ - of the sort discussed here, such as
the one in VS or gdb - are not always necessary. When designing an
embedded system, I prefer if possible to have a JTAG port available for
a debugger. But I've made many systems where I haven't used it, and
worked on many systems where there is no equivalent way to connect a
debugger. And there are plenty of systems where an uncontrolled stop
(such as via a debugger) causes a disaster (think of a motor controller,
for example). I still do debugging - flashing lights, extra UART port,
oscilloscope probes, etc., can all be part of the debugging process even
when you don't use a "debugger" as such.
 

James Kuyper

On Tue, 21 Jan 2014 11:09:03 -0500, James Kuyper wrote:
Much simpler would be the following:
[string-lit concatenation to get the strings in adjacent memory] ....
for(char *ptr = read_write; ptr < read_write + sizeof read_write; ptr++)
    if(*ptr = '\0')
        strings++;
s/=/==/ or just if(!*ptr)

I knew I should have tested it before posting. :-(

....
....
(Also, I prefer to handle error branches first
and then forget about them where reasonable;

I normally do the same. I should have mentioned that I was suppressing
error handling to simplify the code. Better yet, I should have inserted
an empty block with a comment indicating that it should be filled in
with appropriate error handling code.
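
For completeness, here is the counting and pointer-setup code with the
comparison corrected and such an empty error block added. A sketch only:
it assumes the read_write declaration from the quoted code and <stdlib.h>
for malloc.

// Count the strings; note == rather than = this time.
int strings = 0;
for (char *ptr = read_write; ptr < read_write + sizeof read_write; ptr++)
    if (*ptr == '\0')
        strings++;

// Set up an array of pointers to the strings.
char **rw_ptrs = malloc(strings * sizeof *rw_ptrs);
if (!rw_ptrs)
{
    /* insert appropriate error handling here */
}
else
{
    char *ptr = read_write;
    for (int str = 0; str < strings; str++)
    {
        rw_ptrs[str] = ptr;
        while (*ptr++)
            ; // move past end of string
    }
}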
 

Ben Bacarisse

Richard said:
How dare you! That's just not cricket. c.l.c "regs" don't acknowledge
errors in things they use and haven't seen and certainly don't use
debuggers.

You should have the courage to name the people you are being rude
about. You are one of the "regs" here, so is Kenny. The term is used
to avoid responsibility.
 

Ben Bacarisse

David Brown said:
/Debugging/ is a vital part of development - we may strive to make it
unnecessary, but it is rare that code is bug-free when it is ready
for live testing. But /debuggers/ - of the sort discussed here, such as
the one in VS or gdb - are not always necessary.

That's an important distinction to keep in mind. Another is that there
are lots of tools that can be called debuggers. If I were to be cast
away on a desert island, and could take only one debugging tool with me,
it would be valgrind.

<snip>
 

Rick C. Hodgin

I have found that this approach has pitfalls.
(a) You fix the bug in memory, but it doesn't get back into the source
code (or perhaps a slightly different fix ends up in the source code),
so you have to debug it again.

The way Microsoft's edit-and-continue operates is you modify the source
code, click "Apply Changes," and it updates the binary executable in
memory, resulting in the source code and binary executable being in sync.
(b) You fix the bug in memory, but you don't realize that fixing the bug
means that you cannot *GET* to the point of error after the fix,
because with the fix the program now goes wrong (or at least
"differently") *earlier*.
Ibid.

(c) For some changes you almost *HAVE* to start over. If, for example,
you fix incorrect variable initialization which resulted in an off-by-one
array index, you have to figure out what the state would have
been if the initialization had been correct, which can require
fixing a *LOT* of internal state of the program. There's a
good chance that your idea of what that state is is incorrect.

There are some powerful features available in the Microsoft Debugger,
allowing variables to be examined in a watch window and easily changed
at any point by clicking and typing. There are also raw memory windows
which can display memory as hexadecimal bytes, words, dwords, qwords,
or float or double values; this allows memory to be edited easily as well.

In some cases it is faster to start over than to continue editing. It
requires some user experience and some user wherewithal to determine. :)
(d) Sometimes you can "fix" the problem simply by moving the copy of
a function to the end, making any trivial change. Run it
straight as it compiles, and the bug is back again. A problem
like this is often associated with array overrun into data
associated with the function where you think the problem is.

I don't know about this one. The purpose of my design's edit and continue
is to allow anything to be changed or added at runtime. Add new variables,
constants, remove them, new code, functions, remove them, etc. It will
all be supported (James 4:15). :)

Best regards,
Rick C. Hodgin
 

David Brown

The way Microsoft's edit-and-continue operates is you modify the source
code, click "Apply Changes," and it updates the binary executable in
memory, resulting in the source code and binary executable being in sync.

Hopefully somewhere along the line there is also a "save" involved!
(I'm guessing this has moved on since the VB3 days, where you couldn't
save the changes until you finished the debugging session - great fun if
the program hangs and you "finish" it by killing the whole VB3.)

Of course, you mean your source code is now in sync with /some/ of your
object code. If your code is compiled with optimisations, inlining will
mess up this process.
There are some powerful features available in the Microsoft Debugger,
allowing for variables to be examined in a watch window, and easily changed
at any point by clicking and typing. There are also raw memory windows
which can be displayed in raw form, hexadecimal byte, word, dword, qword,
or float or double displays. This allows them also to be edited easily.

Wow, that's powerful! I wish some of the debuggers I use could view and
edit variables!!

In case you didn't spot the sarcasm, I have /never/ seen a debugger that
could not do that - and I have used perhaps thirty or forty different
debuggers over the last twenty odd years.
 

Rick C. Hodgin

Your claims and your numbers just don't match up. On the one hand, you
say that edit-and-continue is the greatest thing since sliced bread and
that you practically live inside the debugger. On the other hand, you
claim your program logic is typically perfect first try, and you make
small syntax errors which you correct with a debugger (!) every two
weeks.

The ones every couple of weeks are ones I don't catch easily during the
development process. It's because, for example, I meant in my head to
type the variable "foo" but instead I typed the variable "i" because I
was thinking ahead to my next for loop. That kind of error results in
something that I cannot easily determine the cause of because in my mind
I typed "foo" ... it's just that in my code I typed "i". So, as I'm
going back through my code step-by-step, line-by-line, I'm having to
figure out what is wrong.

I'm getting better at this over the years. I don't make as many mistakes
as I used to in all of my development. It's these odd occasional ones
that really throw me sometimes though. I once spent three days (back in
the early 1990s) debugging an application I had written, stepping through
every line of code, only to conclude that there was absolutely nothing
wrong with my algorithms. I then started back over and reproduced the
steps from the beginning I did to try to recreate the error, including
creating the actual initialization files. In the process, I remembered
that I had copied over some code from another system. The other system
had a piece of data that was 10 characters long, and this one I was
using needed it to be 7. I realized as I was going through those non-
programming setup portions that I had left it at 10 instead of 7, and
that was the cause of my error.

I laughed out loud, turned off my computer, and went on a three day drive
to ultimately touch the waters of Lake Michigan, some 12 hours away by my
route. :)
And that this /one/ feature of the VS debugger /always/ leads to
a vast improvement in productivity for /all/ developers for /all/ types
of system.

Yes. Those who are able to do development and testing of algorithms
in the edit-and-continue environment will see a marked improvement.
And now you are going to single-handedly revolutionise the entire
software development world by inventing a language that breaks
well-known and well-grounded rules for maximising structure and
minimising problems (by making everything read/write - code, data, etc.,
and preferably global as well). Despite breaking the rules that are
particularly important in parallel systems, your language will
apparently be perfect for the hundred-core cpus that will dominate every
part of computing in the near future. It will be ideal for everything
from the smallest embedded system (once we have replaced these silly
flash devices with /real/ read-write memories for 50x productivity
gains) to the biggest machines - just as long as they are x86, ARM, or
an imaginary base-3 cpu. And it will run on Windows using OpenGL,
because that's the future of the world.

My targets are: Windows, Linux, FreeBSD, and Android. I will never support
any Apple product. And I will at some point remove support for Windows and
later Linux and FreeBSD once I get my own operating systems up to snuff.
I've got to admire your optimism and your persistence. But don't give
up the day job.

Thank you. And thank you. It is something I offer unto the Lord. I am
giving back to Him the best I have been given by Him in terms of my
abilities and knowledge. I desire to have a product on this Earth which
is founded upon faith in Him, able to bring the Christian concept of "Love
thy neighbor as thyself" into a practical example, through code sharing,
through an organization that acknowledges Jesus Christ as the head of our
lives and the reasons why we move. I don't see that other places on this
Earth. If I did, I would probably contribute my time there.

I actually was nearly committed to completing the HURD kernel for the GNU
project, but came across some quotes by Richard Stallman related to heinous
sexual acts, things he believed in. I emailed him to find out about those
quotes, to validate that he was accurately quoted and really believed those
things. He did. As such, I disavowed myself from contributing to any of
the GNU projects or any part of the FSF because of Richard Stallman being
at the head, and that contrary spirit filtering down through the ranks.

The Liberty Software Foundation was created after that as a purposeful
alternative, one devoted to Jesus Christ. Our goals are nearly identical,
except that I/we cite the Lord as the reason why we do what we do, and
the example He first gave us (such as every fruit tree producing fruit
with its seed within itself), which can be something along the lines of
sharing the knowledge of reproduction of the thing with those who receive
the thing, which is open source software, and a conveyance of educational
mechanisms to allow the individual receiving the thing to be able to know
and understand the thing, so as to improve upon it using the unique and
special talents that God gave them, coupled to their unique and special
experience and opportunities, also that God gave them.

I desire to put my Lord and Savior first. He is the reason I am doing all
of this. And it is because of His sacrifice for me that I am able to
continue pressing on. It does not matter to me if people do not use what
I will offer. I know that if He finds value in it He will send those who
are His own. The rest will never see value in it, and that is okay because
my goals are not to change the world, but to serve the Lord. He will make
any end results of my offering come to fruition. It will not be through
my evangelizing about RDC, or Visual FreePro, or the VVM virtual machine.
I will explain what I'm doing so people are aware of the alternative purpose
in what and Who I pursue, and I will ask people to come, but I will not try
to convince them through robust argument as to why my way is better than
other ways. I will let the work speak for itself. And ultimately it will
be those who come to develop that will have done so on their own. It has
to be from within, not without, that people desire to participate. It is
the same with faith. I cannot from an outward push convince someone to
believe in Jesus Christ. It is God alone who changes people's hearts from
the inside that allows them to be drawn in the first place.

Best regards,
Rick C. Hodgin
 

Rick C. Hodgin

Hopefully somewhere along the line there is also a "save" involved too!
(I'm guessing this has moved on since the VB3 days, where you couldn't
save the changes until you finished the debugging session - great fun if
the program hangs and you "finish" it by killing the whole VB3.)

Microsoft's compilers automatically save unsaved changes when the apply
changes button is pressed. You cannot have unsaved changes once you apply
them; changes are only unsaved while you are still editing.
Of course, you mean your source code is now in sync with /some/ of your
object code. If your code is compiled with optimisations, inlining will
mess up this process.

In debug mode the code is, by default and in most common use, compiled
without any optimizations. You can include debugging information, including
edit-and-continue abilities, in release mode with optimizations. It becomes
somewhat weird, however, as the optimizer will remove and reuse things,
making source-level debugging an odd experience. It still works, though;
it just sometimes requires looking at the disassembly to see what other
variable a particular variable is pointing to.
Wow, that's powerful! I wish some of the debuggers I use could view and
edit variables!!

LOL! :)
In case you didn't spot the sarcasm, I have /never/ seen a debugger that
could not do that - and I have used perhaps thirty or forty different
debuggers over the last twenty odd years.

I caught it. There are integrated features of Microsoft's debugger which
change the editing experience. It's a native interface, rather than through
a text-based protocol. It has a different feel.

Best regards,
Rick C. Hodgin
 

Kaz Kylheku

That's an important distinction to keep in mind. Another is that there
are lots of tools that can be called debuggers. If I were to be cast
away on a desert island, and could take only one debugging tool with me,
it would be valgrind.

And then what would you do with all that free time ...

:)
 

BartC

Kaz Kylheku said:
Not using a debugger is like building some electronic device and never
using a multimeter or oscilloscope.

I remember designing and building computer circuits (microprocessor, memory,
logic etc) without using an oscilloscope, because I didn't have one. You
just use different approaches. (Later I did use analog non-storage scopes,
and you just developed techniques for using these with digital circuits. I
did have a multimeter, but you can appreciate that they are not very useful
for signals switching millions of times every second.)

And I've never used a debugger either. Again, bugs do still get fixed.
Usually the sort that a debugger wouldn't have been very useful for tracking
down anyway (because the symptom is billions of instructions away from the
cause.)
 
