OK, so I got the signal through, but for some reason the process exits after it receives the signal.
If I add an endless loop (while(1)) before I even create the socket, then it works as expected. So... something in my socket code is quitting when the kill command is issued.
I don't see what it would be, though. Without the kill, the process sits there indefinitely, accepting connections and sending messages to the clients. Why would the kill (and the increment of the variable that follows) cause the socket to get out of its loop and let the process end?
The socket code is below...
[EDITED again]
use IO::Socket::INET;
use IO::Select;

$SIGNAL = 0;

sub sigHandler {
    #&logData("SIGNALED");
    $SIGNAL++;
}
$SIG{"USR1"} = \&sigHandler;

# Create a new listening socket on port 9999
my $PORT = 9999;
print("opening connection on port $PORT");
$lsn = new IO::Socket::INET(
    Listen    => 1,
    LocalPort => $PORT,
    Reuse     => 1,
    Proto     => 'tcp'
);
#or die ("Couldn't start server: $!");

# Create an IO::Select handler
$sel = new IO::Select($lsn);

# Close filehandles
close(STDIN);
close(STDOUT);

warn "Server ready. Waiting for connections . . . on \n";

# Enter the while loop, listening to the handles that are available.
# This SHOULD be an infinite loop... I don't see why it would eval to false when
# I send a signal to increment $SIGNAL by one.
while ( @read_ready = $sel->can_read ) {
    $MESSAGE = 0;
    $fh = $read_ready[0];

    # Accept a new connection
    if ( $fh == $lsn ) {
        $new = $lsn->accept;
        $sel->add($new);
        push( @data, fileno($new) . " has joined." );
        warn "Connection from " . $new->peerhost . ".\n";
    }
    # Handle an existing connection
    else {
        $input = <$fh>;
        chomp $input;
        warn "GOT INPUT '$input'\n";

        if ( $input eq "<policy-file-request/>" ) {
            $MESSAGE = qq~<?xml version="1.0"?>
<cross-domain-policy>
<allow-access-from domain="*" to-ports="*"/>
</cross-domain-policy>\0~;
            $SIGNAL++;
        }
        if ( $input eq '' ) {    # disconnection notification by client
            warn "Disconnection from " . $new->peerhost . ".\n";
            $sel->remove($fh);
            $fh->close;
        }
        if ( $input eq 'READY' ) {
            warn "CLIENT READY = 1\n";
            $CLIENT_READY = 1;
        }
    }

    # Write to the clients that are available
    foreach $fh ( @write_ready = $sel->can_write(0) ) {
        if ( $MESSAGE == 0 ) {
            # set message here based on criteria
            $MESSAGE = "UPDATE";
        }
        warn "outside send if\n";
        if ( $CLIENT_READY == 1 && $SIGNAL > 0 ) {
            warn("sending $MESSAGE to $fh\n");
            $CLIENT_READY = 0;
            $SIGNAL--;
            print $fh "$MESSAGE\0" or warn "can't send message to $fh";
        }
    }
}
warn "Server ended.\n";
If you install a signal handler in your Perl script, it does not have to end the script. (Do not exit the handler by calling die.) Note that you do not need to send SIGINT or SIGKILL; you can also send SIGUSR1.
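For illustration, here is a minimal, self-contained sketch of a handler that only sets a flag and lets the script keep running; the flag variable and the dummy loop are placeholders, not your code:

#!/usr/bin/perl
use strict;
use warnings;

my $got_usr1 = 0;

# The handler only records the signal; it must not call die or exit,
# otherwise the script would terminate when the signal arrives.
$SIG{USR1} = sub { $got_usr1++ };

print "PID $$ waiting for SIGUSR1...\n";
while (1) {
    sleep 1;    # sleep returns early when a signal arrives
    if ($got_usr1) {
        print "received SIGUSR1 ($got_usr1 so far), still running\n";
        $got_usr1 = 0;
    }
}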
Edit: The command line of your kill command has a comma after -USR1 that should not be there (it should be kill -USR1 4169).
Edit 2: The while (can_read) loop probably exits when it receives an empty array of file handles after being interrupted by the signal. You could prevent this by changing the condition to:
while ((@read_ready = $sel->can_read) || 0 < $sel->count ) {
and updating the loop body to handle an empty @read_ready array.
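For example, a sketch of how the loop could skip an iteration when the signal interrupts can_read; only the guard is new, and the comment stands in for your existing loop body:

# Keep looping as long as the listener (and any clients) are still registered,
# even if can_read returns an empty list because a signal interrupted it.
while ( (@read_ready = $sel->can_read) || 0 < $sel->count ) {
    next unless @read_ready;    # can_read was interrupted by the signal; try again
    $fh = $read_ready[0];
    # ... original accept / read / write handling ...
}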
I don't know if I'm misreading your question or if you're overlooking the obvious, but I wouldn't rely on signals for something like this. I'd have scriptA listening on a socket and have scriptB send it a message (rather than a signal). If scriptA receives the right message, it writes the relevant data out to all the clients connected to it.
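A minimal sketch of what scriptB could look like, assuming scriptA keeps listening on port 9999 as in the code above and is changed to treat a line such as "NOTIFY" as the trigger; the message name and host are made up for the example:

#!/usr/bin/perl
use strict;
use warnings;
use IO::Socket::INET;

# Connect to the server (scriptA) on the same port it listens on.
my $sock = IO::Socket::INET->new(
    PeerAddr => 'localhost',
    PeerPort => 9999,
    Proto    => 'tcp',
) or die "Can't connect to server: $!";

# Send a control message instead of a signal; the server would check for
# this string in its read loop and then push the update to its clients.
print $sock "NOTIFY\n";
close $sock;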