Problem with delay timer

lynology

My program takes a key-press value from the main routine and, based on
the key pressed, selects the command to be executed. The problem I
have is in creating a delay timer so that a message appears on my
screen for only one second. The condensed code is as follows:

#include <sys\timeb.h>
#define DELAY_1SEC 1000 // in millisec

void Scan_Menu_Keys(int key_press)
{
    switch (key_press) {
    case 1:
        Execute1stCommand();
        break;
    case 2:
        Start_Timer();
        while (!Timer_Expired(DELAY_1SEC))
            strcpy(CG_ScreenKeyboard.Screen, "Display Message");
        Clear_Screen();
        break;
    default:
        break;
    }
}

void Start_Timer()
{
    ftime(&start_time); // ftime is a function defined in <sys\timeb.h>
}

int Timer_Expired()
{
    ftime(&current_time);
    time_diff = (int) (1000.0*(current_time.time - start_time.time) +
                       (current_time.millitm - start_time.millitm));
    return (time_diff >= DELAY_1SEC);
}

The problem I ran into was that the timer would wait for one second
before displaying the message "Display Message". Effectively, that
meant the message never displayed, because by the time it got out of
the one-second delay loop, the screen had been cleared. I would love
to get suggestions on why this delay loop is not executing the command
within the loop, and any alternative methods of achieving the same
result.

Thanks in advance!

LYN
 

John Harrison

lynology said:
[snip]

int Timer_Expired()
{
    ftime(&current_time);
    time_diff = (int) (1000.0*(current_time.time - start_time.time) +
                       (current_time.millitm - start_time.millitm));
    return (time_diff >= DELAY_1SEC);
}

Think about the above function. What happens if (say)

current_time.time = 1000000
current_time.millitm = 10

start_time.time = 999999
start_time.millitm = 100

I can't answer that question because you are using non-standard C++ (ftime),
but your method of working out the difference between two times may be
incorrect (I think it depends on whether millitm is signed or unsigned).
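
To see why it might matter, here is a quick sketch with the field
types assumed (long seconds plus an unsigned short millitm), since I
can't check your headers:

#include <cstdio>

int main()
{
    // The example values above, with assumed types.
    long current_time_time = 1000000;
    unsigned short current_time_millitm = 10;
    long start_time_time = 999999;
    unsigned short start_time_millitm = 100;

    // If unsigned short promotes to int (the common case), the millitm
    // subtraction yields -90 and the result is the expected 910 ms.
    int diff = (int)(1000.0 * (current_time_time - start_time_time)
                     + (current_time_millitm - start_time_millitm));
    std::printf("%d\n", diff); // prints 910

    // But if millitm were a full-width unsigned type, the subtraction
    // would wrap instead of going negative, ruining the result.
    unsigned int wrapped = 10u - 100u;
    std::printf("%u\n", wrapped); // prints 4294967206 with 32-bit unsigned
    return 0;
}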

john
 

Ralph D. Ungermann

lynology said:
The
problem I have is in creating a delay timer so that a message appears
on my screen for only one second.

case 2: Start_Timer();
while (!Timer_Expired(DELAY_1SEC))
strcpy(CG_ScreenKeyboard.Screen, "Display Message");
Clear_Screen();

Timer_Expired() is called with one argument. Whether this compiles
depends on a previous declaration (which you `condensed out'). But
surely it won't link with your definition below.
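
One way to reconcile the call with the definition would be (a sketch,
keeping your globals start_time, current_time and time_diff as they
are in your post):

// Give the definition the parameter the call site passes.
int Timer_Expired(int delay_ms)
{
    ftime(&current_time);
    time_diff = (int) (1000.0*(current_time.time - start_time.time) +
                       (current_time.millitm - start_time.millitm));
    return (time_diff >= delay_ms);
}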
I would love to get
suggestions on why this delay loop is not executing the command
within the loop

Presuming that the while-condition is true for exactly one second,
there will be many, many strcpy() executions in this time (depending on
your CPU speed). But I can't see a single _output_ statement. Is there
some hidden magic in writing to CG_ScreenKeyboard.Screen?
and any alternative methods of achieving the same
result.

Please consult the documentation of CG_ScreenKeyboard. This is not a
standard C++ class, so I'm afraid nobody here can help you with it.
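
If assigning to CG_ScreenKeyboard.Screen really does update the
display, you only need to write the message once and then wait. A
minimal sketch of the waiting part using the standard clock() from
<ctime> (untested against your setup):

#include <ctime>

// Busy-wait for roughly the given number of milliseconds. Note: on
// some platforms clock() counts CPU time rather than wall-clock time,
// which a busy loop does consume, so this still works approximately;
// a platform sleep call would be kinder to the CPU if you have one.
void Wait_Milliseconds(long ms)
{
    std::clock_t start = std::clock();
    while ((std::clock() - start) * 1000L / CLOCKS_PER_SEC < ms)
        ;  // spin
}

// Then your case 2 could become something like:
//     strcpy(CG_ScreenKeyboard.Screen, "Display Message");
//     Wait_Milliseconds(DELAY_1SEC);
//     Clear_Screen();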


Ralph
 
