The problem with size_t


Tech07

Nick Keighley said:
I'll probably regret this...

You've made a few comments like this about Jacob's proposed library
(and other things)

at least I am consistent.
but you seem to be lacking in detail.

You want details, I want a million dollars. :p
Jacob is
trying to implement container classes for C.

I think he has already done so (?).
Is he going about it the
wrong way?

Isn't "container library in/for C" an oxymoron?
What should he do?

He'll be/do just fine. What you "should do", I cannot (or decline to) help
you with.
Is it a mistake to try and add
containers to C?

She can get in those prissy pumps.... with a SHOEHORN! "we" used to make
"drawrings" of the teacher's feet trying to escape "her" shoes. "containers
in C" is like "bikini on a fat woman": Ewww!

What should he do instead?

He can decide what to do and doesn't need you holding his hand. (Not that
there is anything wrong with that. I'm really tough and have been called
"abrasive" (amongst other things)).
Is he unnecessarily
constraining himself by only considering standard C types?

'Twas my opinion obviously if you've ever read me.
What should
he do instead?

I have no problem with him or what he is doing.
Should he invent a special container_size_t type?

And wheel you around in your wheelchair giving up his life?!
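For readers wondering what the question is actually asking: a container-specific size type would decouple a library's element counts from `size_t`. A minimal sketch of the idea (the name `container_size_t` is taken from the question above; nothing here reflects Jacob's actual library):

```c
#include <stdint.h>

/* Hypothetical: a container size type decoupled from size_t.
   A fixed-width 64-bit unsigned type keeps container sizes and
   their maximum value identical across 32- and 64-bit builds,
   at the cost of wasted width on small systems. */
typedef uint64_t container_size_t;
#define CONTAINER_SIZE_MAX UINT64_MAX
```

The trade-off the thread is circling: `size_t` is guaranteed to hold the size of any object, so a separate type buys portability of the *interface* while potentially promising more range than the platform can deliver.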
 

Nick Keighley

....and I did. I attempted to engage Tech07 in a sensible discussion
about his opinions of Jacob's container library. As usual Tech07
prefers vague waffle to substantive discussion.

at least I am consistent.


You want details, I want a million dollars. :p

and it continues in a similar vein

<snip>
 

Stephen Sprunk

Joe said:
Theoretically I suppose. Are there any servers sold today with such
capability?

x86 servers have had 32-bit virtual/linear addresses but 36-bit physical
addresses for years; look up PAE.

x86-64 processors present a 32-bit virtual/linear address to 32-bit
tasks, but have physical addresses of up to 52 bits. (AMD and/or Intel
will need to define yet another page table level or two to reach a full
64 bits.) IIRC, current implementations don't go past 40 bits.

S
 

James Dow Allen

A terabyte should be enough for anyone

Off-topic: N.Y. Times published an article recently mentioning
petabyte-sized data (maybe Facebook's image collection or some such);
it defined "petabyte" as 1000 terabytes but didn't define terabyte.

It wasn't so very long ago that "megabyte" would need definition in
a general-audience newspaper article and even techies might not
have heard of "gigabyte"!

I remember my Dad coming home from work 43 years ago excited about
data processing's first terabyte-sized "random access" memory!
He said there were only two installations. It was only many years
later I learned there'd been a third installation ... in Langley,
Virg.

James
 

Richard Bos

James Dow Allen said:
Off-topic: N.Y. Times published an article recently mentioning
petabyte-sized data (maybe Facebook's image collection or some such);
it defined "petabyte" as 1000 terabytes but didn't define terabyte.

It wasn't so very long ago that "megabyte" would need definition in
a general-audience newspaper article and even techies might not
have heard of "gigabyte"!

When I hear that, I think Knuth should have gone further with his
Potrzebie system, or better yet, given it up as a futile job.

Richard
 
