Minimum time slice in Linux/Unix

Discussion in 'C Programming' started by Wei Wang (weiwang1), Jun 30, 2003.

  1. Functions such as gettimeofday() give the time. How fine is the
    result? It is probably not as fine as 1 microsecond.
    A test program on Red Hat Linux 7.3 reports approximately
    3 microseconds; is that too good to be true? I know Windows systems
    are far coarser.


    Wei
     
    Wei Wang (weiwang1), Jun 30, 2003
    #1

  2. Jack Klein (Guest), replying to Wei Wang (weiwang1):

    Not in standard C, the topic of comp.lang.c, it doesn't, because there
    is no function named gettimeofday(). The only function in standard C
    that returns time is deliberately hidden by giving it the confusing
    name time().
    Not only is comp.lang.c not a good place for this question,
    gnu.gcc.help is not either, because gcc is basically a
    system-independent compiler that uses system-supplied libraries on
    most platforms, and does not control many characteristics of the
    underlying operating system.

    There is no one universal "linux/unix" time slice. If you have a
    Linux-specific question, you should ask in a Linux newsgroup. If you
    have a question about a specific version of UNIX, you should hunt up
    a newsgroup for that specific version, or perhaps ask in a general
    UNIX group.

    There is no one answer that applies to Linux and all UNIX or UNIX-like
    systems.

    --
    Jack Klein
    Home: http://JK-Technology.Com
    FAQs for
    comp.lang.c http://www.eskimo.com/~scs/C-faq/top.html
    comp.lang.c++ http://www.parashift.com/c++-faq-lite/
    alt.comp.lang.learn.c-c++ ftp://snurse-l.org/pub/acllc-c++/faq
     
    Jack Klein, Jul 1, 2003
    #2
