size of a sizeof(pointer)


Mark McIntyre

Is there any living implementation that uses 16-bit chars? (I know of the
existence of a machine where a byte is 6 bits.)

Unicode springs to mind.

I suspect that quite a few DSPs do, though typically they're freestanding
implementations.

That aside, I'd be unsurprised to see future implementations using 16 bits
for chars.
 

Malcolm

Grumble said:
An implementation cannot have 16-bit chars and 24-bit ints?

How about 16-bit chars and 24-bit pointers?
Not allowed. chars and bytes (or, to be pedantic, unsigned chars and bytes)
are the same thing in C. An unfortunate hangover from the early days.
All types have to be a whole multiple of char in size.
 

Malcolm

CBFalconer said:
Nope. A pointer points. What information it needs to hold to do
that is up to the implementation. It could consist of a URL and
other information, just as a not too wild example. Another might
be "Malcolms house, under the bed beside the dirty socks, last
Tuesday". The amount of information needed is usually constrained
by limiting the things that the pointer is allowed to point to.
Clear now?
Don't patronise.
You and I both know that perverse implementations are allowed. Since
pointers have to be a fixed size, using a URL would be grossly
inefficient.
Since the OP needs to understand how pointers are represented in memory on a
typical system such as the one he will certainly be using, telling him that
32 bit pointers are needed to address 4GB gets across the message clearly.
Talk about URL pointers is liable to confuse.
You should neither know nor care, unless you are implementing the
system.
Well you very often need to break the bounds of ANSI C and go to a lower
level. An example would be if you have a custom memory scheme. How do you
know if a pointer comes from your arena or from elsewhere?
Another example would be using a debugger. Invalid pointers are often set to
some defined bit pattern. You need to know something about addressing to
detect these bad pointers.
Programming is practical. It doesn't make sense to hand someone a copy of
the standard and expect them to be able to write fully-conforming ANSI C.
You need to play with a real implementation on a real machine to have any
hope of understanding what is going on.
 

Leor Zolman

Exactly. "Equivalence" is the accepted term for what is going on, which is
confusing.

I've never heard the term before starting to read this newsgroup. I've
always called it "array/pointer duality"
-leor


Leor Zolman
BD Software
(e-mail address removed)
www.bdsoft.com -- On-Site Training in C/C++, Java, Perl & Unix
C++ users: Download BD Software's free STL Error Message
Decryptor at www.bdsoft.com/tools/stlfilt.html
 

Papadopoulos Giannis

Mark said:
On Mon, 09 Feb 2004 19:28:21 +0200, in comp.lang.c, Papadopoulos Giannis

That aside, I'd be unsurprised to see future implementations using 16 bits
for chars.

If we use 16-bit values as char, then the new C0x spec must define
something like "byte" (Java's char is Unicode, and it has a separate
8-bit byte type)...

There is of course wchar_t, so there is definitely no need for 16-bit
chars... Or so I think... Comments?

--
#include <stdio.h>
#define p(s) printf(#s" endian")
int main(void){int v=1;*(char*)&v?p(Little):p(Big);return 0;}

Giannis Papadopoulos
http://dop.users.uth.gr/
University of Thessaly
Computer & Communications Engineering dept.
 

Keith Thompson

Papadopoulos Giannis said:
If we use 16-bit values as char, then the new C0x spec must define
something like "byte" (Java's char is Unicode, and it has a separate
8-bit byte type)...

There is of course wchar_t, so there is definitely no need for 16-bit
chars... Or so I think... Comments?

I think C will always define a char as being one byte (sizeof(char)==1).
There's too much code that would break if that were changed. The
process that led to the 1989 ANSI standard was probably the last real
opportunity to change this.

I'd greatly prefer the concepts of "character" and "uniquely
addressable storage unit" to be separate, but it's too late to fix it.

It just might be possible to deprecate the use of the word "byte"
(which is part of the description of the language, not part of the
language itself) while continuing to guarantee that sizeof(char)==1,
but I doubt that even that will be done.
 

Mike Wahler

Malcolm said:
Don't patronise.
You and I both know that perverse implementations are allowed.

For suitable definitions of 'perverse'.
Since
pointers have to be a fixed size

C & V please.
then using a URL would be grossly
inefficient.
Since the OP needs to understand how pointers are represented in memory

That's platform/implementation dependent.
on a
typical system

Whose definition of 'typical'?
such as the one he will certainly be using,

Doesn't matter which one. The answers will be platform-specific,
not applicable to standard C.
telling him that
32 bit pointers are needed to address 4GB gets across the message clearly.

That's one of many possible ways to represent such an address space.
Talk about URL pointers is liable to confuse.

It's intended to clarify (and imo it did) that a pointer is
an *abstraction*, and as such, one need not (should not) be
concerned about its physical implementation.
Well you very often need to break the bounds of ANSI C and go to a lower
level.

In which case the discussion needs to depart from clc.
An example would be if you have a custom memory scheme. How do you
know if a pointer comes from your arena or from elsewhere?

Then one would need to ask/read about it where such things are discussed.
Not here.
Another example would be using a debugger. Invalid pointers are often set to
some defined bit pattern. You need to know something about addressing to
detect these bad pointers.

Then one would need to ask/read about it where debuggers are discussed. Not
here.
Programming is practical.

The subject of clc is not programming.
It doesn't make sense to hand someone a copy of
the standard and expect them to be able to write fully-conforming ANSI C.

That's why we have books, schools, instructors, etc.
You need to play with a real implementation on a real machine to have any
hope of understanding what is going on.

Not at the abstract level of ISO C. 'Way' back when, I got a decent
understanding of how COBOL worked, before I ever laid eyes on any
hardware. This was proven when I actually coded, compiled, and
successfully ran programs when we did get access to a computer.

-Mike
 

Richard Bos

Mike Wahler said:
The subject of clc is not programming.

Well, yes, it is. Where Malcolm goes wrong is in believing that locking
yourself into the Wintel platform is part of that practicality.

Richard
 

Kelsey Bjarnason

[snips]

C & V please.


That's platform/implementation dependent.

I've always favored SQL queries. Store all the values in a database and
the pointers are all just queries to retrieve them.
That's one of many possible ways to represent such an address space.

Anyone who ever used older DOS compilers will appreciate the clarity of
not assuming pointers make any sort of inherent sense. :)
 

Malcolm

Mike Wahler said:
C & V please.
Uggle *ptr = 0;

Uggle **uptr = malloc(sizeof(Uggle *));

*uptr = ptr;

*uptr now must be set to NULL. How is this achieved if an Uggle * is of
variable width?
Whose definition of 'typical'?
Natural language definition of "typical".
Doesn't matter which one. The answers will be platform-specific,
not applicable to standard C.
But standard C is deeply dependent on the types of architectures that exist
in the real world. That's why it has pointers, rather than the "advance"
commands that would be expected of Turing machines.
That's one of many possible ways to represent such an address
space.
Use of 32 bit pointers to address a 4GB memory space is not just one of many
possible ways to represent such a space. It's the most obvious, natural way
to do so.
It's intended to clarify (and imo it did) that a pointer is
an *abstraction*, and as such, one need not (should not) be
concerned about its physical implementation.
You need to understand the physical representation to understand how the
ANSI committee made their decisions. Or else why not say that a pointer is
held in a variable size memory entity?
In which case the dicussion needs to depart from clc.
No, because clc is not cl.ansic. The newsgroup precedes the ANSI standard,
which is proving itself to be an ephemeral chapter in the history of the
language. The C99 standard seems to have failed.
Then one would need to ask/read about it where such things are
discussed. Not here.
It's a perfectly on-topic question. I have implemented a mymalloc() using a
static arena; when a pointer is passed to myfree(), how can I verify that it
is from the arena? The ANSI answer is that you can't, but that's not good
enough.
[ debuggers ]
Then one would need to ask/read about it where debuggers are
discussed. Not here.
You need to understand the sorts of ways pointers are represented in memory
before you can understand debuggers, or indeed the (ANSI) %p format
specifier to the printf() family of functions. Perfectly on topic, but
nothing to do with ANSI.
The subject of clc is not programming.
It's C programming. Not ANSI C programming, but portable C programming; i.e.
compiler-specific questions are off-topic, but not, for example, "how does a
typical implementation provide malloc()".
That's why we have books, schools, intructors, etc.
And also comp.lang.c. Otherwise one could simply post the standard in answer
to every query.
Not at the abstract level of ISO C. 'Way' back when, I got a decent
understanding of how COBOL worked, before I ever laid eyes on
any hardware. This was proven when I actually coded, compiled,
and successfully ran programs when we did get access to a computer.
Well done but that's unusual, and an inefficient way of learning. Basically
you are using the tutor to dry run code, and he will do so several million
times slower than a processor.
Programming is a practical skill, which means that you need to understand
your implementation. Otherwise we could simply hand a copy of the standard
to every newbie and expect them to become proficient C programmers. It
doesn't work like that.

Basically engage brain before trying to obfuscate my explanations with
references to URL pointers and other such rubbish.
 

Malcolm

Richard Bos said:
Well, yes, it is. Where Malcolm goes wrong is in believing that
locking yourself into the Wintel platform is part of that practicality.
So you think that Wintel is the only platform that uses 32-bit pointers to
address a 4GB memory space?
 

Mark McIntyre

Uggle *ptr = 0;

Uggle **uptr = malloc(sizeof(Uggle *));

*uptr = ptr;

*uptr now must be set to NULL. How is this achieved if an Uggle * is of
variable width?

Mike meant that different types' pointers might be different widths. Thus
an Uggle** might be wider (or narrower) than an Uggle*, which might in turn
be wider (or narrower) than an int*.
 

Mike Wahler

In case you didn't know, that acronym means "Chapter & Verse".
I'm asking you to support your claim with a citation from
the standard.
Uggle *ptr = 0;

Uggle **uptr = malloc(sizeof(Uggle *));

*uptr = ptr;

*uptr now must be set to NULL. How is this achieved if an Uggle * is of
variable width?

Doesn't matter "how". It must simply 'work correctly'. That's
all the standard requires.

Please don't omit context. Restored:

Malcolm: on a typical system

Mike: Whose definition of 'typical'?
Natural language definition of "typical".

OK I suppose I have to spell it out. Whose definition of
'typical *system*'. In some contexts a 'typical system'
is a PC. In others, it's a cell phone. In the widest
(computer system) context, if 'typical' is the most
widely used, it's certainly not a PC, but more likely
some embedded system I've probably never heard of.
But standard C is deeply dependent on the types of architectures that exist
in the real world.

Not at all. The standard makes requirements that an implementation
must meet. If a platform cannot provide support sufficient for
such an implementation (either directly or via e.g. software emulation,
etc.), perhaps because it only has 6-bit bytes, then it's simply not
possible to create a conforming C implementation for it. Period. So you
have the 'dependency' issue exactly backwards.

That's why it has pointers,

I'd have to ask Mr. Ritchie for the 'real' answer, but imo
it has pointers because they allow one to do the useful things
they can do. They implement an abstraction: indirection.

rather than the "advance"
commands that would be expected of Turing machines.
Use of 32 bit pointers to address a 4GB memory space is not just one of many
possible ways to represent such a space. It's the most obvious, natural way
to do so.
You need to understand the physical representation to understand how the
ANSI committee made their decisions.

I need to understand neither physical representation, nor know (or care)
why the committee decided what they did, in order to successfully write
standard C. All I need is a conforming implementation, and access to
the rules (the standard). Of course textbooks written in a more 'prose'
like form are a huge help.
Or else why not say that a pointer is
held in a variable size memory entity?

Because either one would be acceptable with regard to the standard.
It's called flexibility, which I suspect the committee allowed for
when possible. For example, why do you suppose there's no hard
definition for the exact representation of '\n'?

NO, because clc is not cl.ansic.

For the zillionth time that I've stated this here, the name of a newsgroup
does *not* define its exact nature. It's only a general guideline.

The nature and guidelines of clc are stated in the 'welcome message',
which has by consensus of the regulars become the defining document.
The newsgroup precedes the ANSI standard,
Irrelevant.

which is proving itself to be an ephemeral chapter in the history of the
language. The C99 standard seems to have failed.

Your opinion. And you seem to have imposed some arbitrary
time limit for C99 to 'succeed'.
Then one would need to ask/read about it where such things are
discussed. Not here.
It's a perfectly on-topic question. I have implemented a mymalloc() using a
static arena; when a pointer is passed to myfree(), how can I verify that it
is from the arena? The ANSI answer is that you can't, but that's not good
enough.
Tough.
[ debuggers ]
Then one would need to ask/read about it where debuggers are
discussed. Not here.
You need to understand the sorts of ways pointers are represented in memory
before you can understand debuggers,

Debuggers are not topical here.
or indeed the (ANSI) %p format

All one need know is that it will print the value of a type 'void*'
object. The exact display format used is left up to the implementation.
specifier to the printf() family of functions. Perfectly on topic, but
nothing to do with ANSI.

%p (the ISO specification of it) is indeed topical. Its implementation
is not.
It's C programming.

It's the C programming *language* and how to *use* it.
Not ANSI C programming, but portable C programming; i.e.
compiler-specific questions are off-topic, but not, for example, "how does a
typical implementation provide malloc()".

That's an implementation specific issue. The language only
specifies 'malloc()'s *behavior*.
And also comp.lang.c. Otherwise one could simply post the standard in answer
to every query.

So here you are at comp.lang.c where so many experts graciously share
their knowledge and skill, gratis. So instead of desperately trying
to prove yourself "right", why not *listen* and learn? I did.
When I first came to clc, I considered myself, if not 'expert',
at least very knowledgeable about C. A couple of days here proved
me wrong. I did not allow my ego to obscure or deny this fact.
Well done but that's unusual,

I suppose one might call it "unusual". I found my instructor's
methods to be brilliant.
and an inefficient way of learning.

I suppose that depends upon what you mean by "efficient". Fast?
Fast just means fast, not necessarily "good".

I found it a very *effective* way to learn.
Basically
you are using the tutor to dry run code,

Actually the students all used one another to represent
system components, one of which was the CPU, who was
given a sequence of predefined instructions. Others
represented data objects, peripheral devices, etc.
We 'executed' a 'program' according to a strict
formal set of rules (analogous to a standard
language specification). But these rules did *not*
mandate implementation methods. E.g. the person
representing an 'accumulator' was only required
to 'reset', 'accumulate', and report a value.
It was not mandated *how* to do so. He was free
to rely on his memory, or he could write things
down, or use a handheld calculator, etc.
and he will do so several million
times slower than a processor.

Speed was not the objective. Learning was.
And after the students all having participated
in the 'execution' of a 'program' we all had a
much better appreciation for the true power
of a computer, and the discipline required to
effectively program one.
Programming is a practical skill,

Yes, and a programming language is only a small part of it.
This newsgroup provides only a small part of the knowledge
necessary. Other learning resources exist for the other
issues.

which means that you need to understand
your implementation.

Not to use C you don't.
Otherwise we could simply hand a copy of the standard
to every newbie and expect them to become proficient C programmers. It
doesn't work like that.

As I already said, that's why we have schools, books, instructors, etc.
Basically engage brain before trying to obfuscate my explanations

I have in no way tried to obfuscate anything you've 'explained'.
I've only debated your opinions.
with
references to URL pointers and other such rubbish.

I made no reference to a URL pointer.

-Mike
 

Malcolm

Mike Wahler said:
Please don't omit context. Restored:

Malcolm: on a typical system

Mike: Whose definition of 'typical'?
Well every system I know uses fixed-size pointers. There is one main
exception to the rule that the size of the pointer represents the size of
the address space, and that's quite an important one, old x86 compilers with
their segmented architecture.
I think we can call the x86 "non-typical" because the natural thing to do is
to have one pointer value equalling one address, and because virtually every
other system works that way.
that exist in the real world.

Not at all. The standard makes requirements that an implementation
must meet. If a platform cannot provide support sufficient for
such an implementation (either directly or via e.g. software emulation,
etc.), perhaps because it only has 6-bit bytes, then it's simply not
possible to create a conforming C implementation for it. Period. So
you have the 'dependency' issue exactly backwards.
C is not an abstract language for specifying the behaviour of Turing
machines, but one that is deeply dependent on the types of architectures
that exist. You can, incidentally, provide a conforming C implementation
for any Turing-compatible machine, even if it uses 6-bit bytes internally,
as long as you are prepared to accept gross inefficiency.
It is precisely because 6-bit-byte general-purpose processors are rare that
C doesn't easily support them.
I need to understand neither physical representation, nor know (or
care) why the committee decided what they did, in order to
successfully write standard C. All I need is a conforming
implementation, and access to the rules (the standard). Of course
textbooks written in a more 'prose' like form are a huge help.
This is nonsense. People are not machines. You can't learn French from a
dictionary and grammar, nor is it possible to learn C from the standard. And
over-literal explanations, such as "pointers can be URLs", obfuscate rather
than illuminate.
Irrelevant.
No, highly relevant. And ANSI has shot itself in the foot by proposing a
standard that has not been widely adopted, which means that now C will
probably spread into several dialects. The newsgroup precedes ANSI, and will
survive when ANSI is just a memory.
Your opinion. And you seem to have imposed some arbitrary
time limit for C99 to 'succeed'.
It's only five years, and obviously I cannot foretell the future, but it
seems likely that C99 will never be widely implemented. I think that what
will happen is that people will increasingly run C code through a C++
compiler to use useful C99 features such as single-line comments and inline
functions.
Tough for you, but you're being unnecessarily restrictive. How about
explaining how this can be done in C on some platforms, but not portably?

The details of a specific debugger are not topical; debuggers generally (for
instance we had a thread recently about whether or not they were time
wasters) are topical.
%p (the ISO specification of it) is indeed topical. Its implementation
is not.
Implementation of standard library functions is topical.
So here you are at comp.lang.c where so many experts graciously
share their knowledge and skill, gratis. So instead of desperately
trying to prove yourself "right", why not *listen* and learn? I did.
When I first came to clc, I considered myself, if not 'expert',
at least very knowledgeable about C. A couple of days here proved
me wrong. I did not allow my ego to obscure or deny this fact.
It doesn't take more than a couple of days to learn all the C you need to
know, unless you want to write a compiler, if you already know another
language. That is one of the great strengths of C.
To know the answer to exotica takes a bit longer, but you don't actually
need to know this to write successful C. How about learning from someone who
knows a great deal about programming, without claiming to be at the leading
edge?
Actually the students all used one another to represent
system components, one of which was the CPU, who was
given a sequence of predefined instructions. Others
represented data objects, peripheral devices, etc.
We 'executed' a 'program' according to a strict
formal set of rules (analogous to a standard
language specification). But these rules did *not*
mandate implementation methods. E.g. the person
representing an 'accumulator' was only required
to 'reset', 'accumulate', and report a value.
It was not mandated *how* to do so. He was free
to rely on his memory, or he could write things
down, or use a handheld calculator, etc.
If you don't have a computer then you can use these sorts of devices to
teach programming. It sounds highly creative and I wouldn't want to knock
your tutor. However, if you just hand someone a computer and let them play
with it, they can very quickly pick up programming if they have a natural
aptitude for it.
Yes, and a programming language is only a small part of it.
This newsgroup provides only a small part of the knowledge
necessary. Other learning resources exist for the other
issues.
Yes sure, knowing C is only a small part of knowing "how to program", which
is a bit like "knowing how to cook", there are a few basics everyone has to
learn, but you can be perfectly competent at meat and 2 veg without being a
cordon bleu chef.
Not to use C you don't.
Yes you do, because you make mistakes and funny things happen. Formally we
could just post a copy of the standard in response to every query; in
practice humans aren't built like that.
I made no reference to a URL pointer.
No, you've defended someone who corrected my statement (that typically a
pointer has enough bits to address the memory space of the computer) by
pointing out that the implementation could use a URL pointer. Formally he's
right of course, in the same way that it could use decimal ten-state memory
instead of binary.

In fact a non-perverse use of pointers would be to store the bounds of the
data item pointed to in every pointer. Then an attempt to address memory
illegally could be caught. To my knowledge not a single implementation
actually uses safe pointers. The reason of course is that C programmers
expect pointer dereferences to compile to single machine instructions -
something again not mentioned in the standard but highly relevant to anyone
who programs in C.
 

Michael Wojcik

In fact a non-perverse use of pointers would be to store the bounds of the
data item pointed to in every pointer. Then an attempt to address memory
illegally could be caught. To my knowledge not a single implementation
actually uses safe pointers.

Your knowledge is incomplete. At least three C implementations for the
AS/400 - EPM C, System C, and ILE C - use 16-byte / 128-bit pointers
(CHAR_BIT is 8) which are not simple addresses but descriptors, and
which include a reference to a memory space, an offset in that memory
space, and a validity flag which can only be set by a privileged-mode
instruction. Mucking about with a pointer's internals resets the
flag, rendering the pointer invalid.

All three implementations will immediately trap on invalid pointer
access.

I believe ILE C (the current one) is a fully conforming C94 hosted
implementation, and System C was a fully conforming C90 hosted
implementation. I suspect EPM C wasn't a conforming hosted
implementation, though it probably came fairly close, and may have
been a conforming freestanding implementation.
The reason of course is that C programmers
expect pointer dereferences to compile to single machine instructions -
something again not mentioned in the standard but highly relevant to anyone
who programs in C.

C programmers working on the AS/400 will find that expectation is
incorrect. In C on the AS/400, *nothing* compiles to machine
instructions, single or otherwise. It compiles to a pseudoassembly
language called "MI". And that's a good thing, for AS/400 software,
since it's one of the qualities that allowed IBM to completely change
the machine's architecture without breaking working programs. (That's
*binaries*, with no recompilation required, in many cases.)

On the AS/400, robustness trumps performance. That was the design
decision for the whole architecture, and C needed to fall in line.
One of the nice things about the C standard was that it could
accommodate that.

More C programmers should do some work on the AS/400. (For one thing,
it'd make them appreciate their other development environments all
the more, if they use IBM's awful Program Development Manager and
Source Entry Utility.) You can learn a lot about what a conforming
hosted implementation can do. And if you're using a real 5250
terminal, you can also learn those swell trigraph sequences (or the
EBCDIC code points for various C punctuation characters).


--
Michael Wojcik (e-mail address removed)

Pseudoscientific Nonsense Quote o' the Day:
From the scientific standpoint, until these energies are directly
sensed by the evolving perceptions of the individual, via the right
brain, inner-conscious, intuitive faculties, scientists will never
grasp the true workings of the universe's ubiquitous computer system.
-- Noel Huntley
 

Malcolm

Michael Wojcik said:
C programmers working on the AS/400 will find that expectation
[that pointer dereferences compile to single machine instructions ] is
incorrect. In C on the AS/400, *nothing* compiles to machine
instructions, single or otherwise. It compiles to a pseudoassembly
language called "MI".
This really is the exception that proves the point. A platform that
disallows native machine language programs cannot really be said to have a
compiler. Nor is C the ideal language for such an environment - you need
something which does memory management for you.
 

Chris Torek

Michael Wojcik said:
C programmers working on the AS/400 will find that expectation
[that pointer dereferences compile to single machine instructions ] is
incorrect. In C on the AS/400, *nothing* compiles to machine
instructions, single or otherwise. It compiles to a pseudoassembly
language called "MI".

This really is the exception that proves the point. A platform that
disallows native machine language programs cannot really be said to have a
compiler. Nor is C the ideal language for such an environment - you need
something which does memory management for you.

But if you believe that C on this machine is not "compiled", then
you must believe that *nothing* on the AS/400 is *ever* compiled --
not COBOL, not RPG, not Modula-2. Yet IBM will sell you "compilers"
for all of these, as well as for C and C++. There are even AS/400
assemblers that read "MI" source and produce "machine code":
<http://www-1.ibm.com/servers/eserver/iseries/whpapr/translator.html>.

Would you also claim that any machine on which the machine's "opcodes"
are interpreted by microcode has no compilers? If not, why do you
distinguish between OMI opcodes and microcoded-machine opcodes?
 

Keith Thompson

Malcolm said:
Michael Wojcik said:
C programmers working on the AS/400 will find that expectation
[that pointer dereferences compile to single machine instructions ] is
incorrect. In C on the AS/400, *nothing* compiles to machine
instructions, single or otherwise. It compiles to a pseudoassembly
language called "MI".
This really is the exception that proves the point. A platform that
disallows native machine language programs cannot really be said to have a
compiler. Nor is C the ideal language for such an environment - you need
something which does memory management for you.

Exceptions don't prove points, at least not in the sense you mean.

There are plenty of compilers that generate something other than
machine code. I'm not familiar with the AS/400, but I haven't seen
anything to suggest that C is a poor language for it.
 
