c++ - libircclient: Selective connection problem that is absolutely impossible to debug
I'm not the type to post questions; I usually search for why something doesn't work first. But this time, no matter what I did, I couldn't figure out what's wrong.

So here's the thing:

I'm programming an IRC bot, and I'm using libircclient, a small C library that handles IRC connections. It's working pretty great: it does the job and is kinda easy to use, but...
I'm connecting to 2 different servers, and I'm using a custom networking loop based on the select() function. On my personal computer, there's no problem with this loop, and it works great.

But (and here's the problem), on the remote server where the bot is hosted, it can connect to one server but not the other.

I tried to debug it as much as I could. I went through the sources of libircclient to see how it works, and I put printfs wherever I could to see where the problem comes from, but I still don't understand why it does this.

So here's the code for the server (the irc_session_t objects are encapsulated, but it's kinda easy to understand; feel free to ask for more information if you want to):
// Connect the first session
first.connect();

// Connect the osu! session
second.connect();

// Initialize the socket sets
fd_set sockets, out_sockets;

// Initialize the sockets count
int sockets_count;

// Initialize the timeout struct
struct timeval timeout;

// Set running to true
running = true;

// While the server is running (which means always)
while (running)
{
    // If the first session has disconnected
    if (!first.connected())
        // Reconnect it
        first.connect();

    // If the second session has disconnected
    if (!second.connected())
        // Reconnect it
        second.connect();

    // Reset the timeout values
    timeout.tv_sec = 1;
    timeout.tv_usec = 0;

    // Reset the sockets count
    sockets_count = 0;

    // Reset the sockets and out_sockets sets
    FD_ZERO(&sockets);
    FD_ZERO(&out_sockets);

    // Add the sessions' descriptors
    irc_add_select_descriptors(first.session(), &sockets, &out_sockets, &sockets_count);
    irc_add_select_descriptors(second.session(), &sockets, &out_sockets, &sockets_count);

    // Select something; negative means something went wrong
    int available = select(sockets_count + 1, &sockets, &out_sockets, NULL, &timeout);

    // Error
    if (available < 0)
        utils::throw_error("Server", "run", "Something went wrong when selecting a socket");

    // We have a socket
    if (available > 0)
    {
        // If something went wrong when processing the first session
        if (irc_process_select_descriptors(first.session(), &sockets, &out_sockets))
            // Error
            utils::throw_error("Server", "run", utils::string_format("Error with the first session: %s", first.get_error()));

        // If something went wrong when processing the second session
        if (irc_process_select_descriptors(second.session(), &sockets, &out_sockets))
            // Error
            utils::throw_error("Server", "run", utils::string_format("Error with the second session: %s", second.get_error()));
    }
}
The problem is in this line of the code:
irc_process_select_descriptors(second.session(), &sockets, &out_sockets)
It always returns an error the first time it's called, and only for one of the two servers. The weird thing is that on my Windows computer everything works perfectly, while on the Ubuntu server it doesn't, and I can't understand why.

I did some in-depth debugging, and saw that libircclient does this:
if (session->state == LIBIRC_STATE_CONNECTING && FD_ISSET(session->sock, out_set))
And that's where it goes wrong. The session state is correctly set to LIBIRC_STATE_CONNECTING, but the second condition, FD_ISSET(session->sock, out_set), returns false. It returns true for the first session, but never for the second one.

The 2 servers are irc.twitch.tv:6667 and irc.ppy.sh:6667. The server addresses are correctly set, and the server passwords are correct too, since everything works fine on my personal computer.
Sorry for the long post, and thanks in advance!
Alright, after hours of debugging, I found the problem.

So, when a session starts connecting, it enters the LIBIRC_STATE_CONNECTING state, and when irc_process_select_descriptors is called, libircclient checks this:
if (session->state == LIBIRC_STATE_CONNECTING && FD_ISSET(session->sock, out_set))
The problem is that select() alters the socket sets it receives, removing from them the descriptors that are not relevant (i.e., not ready yet).

So if the server hasn't sent anything by the time we call irc_process_select_descriptors, FD_ISSET returns 0, because select() decided that the socket was not relevant.
I fixed it by writing this:
if (session->state == LIBIRC_STATE_CONNECTING)
{
    if (!FD_ISSET(session->sock, out_set))
        return 0;
    ...
}
This makes the program wait until the server has sent something.
Sorry for not having checked that more carefully!