XMLSocket multiple send problem

  • Hanzo
  • Born
  • Posts: 1
  • Loc: Denver, CO
Post 3+ Months Ago

Hey, all. I'm new to the forums so go easy on me. :D

Well, I went ahead and built a C++ Winsock TCP server to respond to XML messages sent by Flash MX 2004. Although simple, it supports queuing incoming messages into a buffer, and it can receive a "datasize" message that prepares it for the next message, which contains the actual game-related data.
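To give a rough idea of the shape, the per-client bookkeeping looks something like this (a simplified sketch with illustrative names, not the actual code):

#include <queue>
#include <string>

struct Client {
    int id = 0;                    // id the server assigns on accept
    long expectedSize = -1;        // size announced by the last "datasize"
                                   // message; -1 means none is pending
    std::queue<std::string> inbox; // complete messages queued for processing
};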

Initial tests so far are quite successful. On the server, I can start up, shut down, accept TCP clients, assign each client an id so the server can direct specific messages to it, and accept "close" messages, which close the socket. Meanwhile, I have built an ActionScript 2.0 class in Flash specifically tailored to read these messages and send its own back to the server. I have a simple Flash app that, on startup, connects immediately to the server with no problem whatsoever. FWIW, the class extends XMLSocket.

Here is where my current issue lies: when I issue a disconnect message from my Flash app, the server only seems to read the datasize message and nothing more. The Flash app issues two messages to the server, one right after the other: a datasize message saying how big the next message is, and then the data message that says "I'm closing down my connection now." However, the server seems to refuse to accept that second message.

The messages themselves are very simple; an example:

<message name="datasize" type="-1" toId="0" fromId="1" />

Transmitting one message at a time works perfectly. I have stepped through the debugger (both in VC++ and in Flash) and confirmed that an entire message can be received, read, and parsed into a game packet, no problemo. The two messages are sent in Flash as simply as this:

myClient.send(datasizeXML);
myClient.send(gameXML);
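For reference, XMLSocket terminates every message it sends with a zero byte, so on the wire those two calls produce back-to-back frames like this (the second message body is just an illustration):

<message name="datasize" type="-1" toId="0" fromId="1" />\0<message ... />\0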

This may be the clue that helps you: if I step through the debugger in Flash, set a breakpoint on the first .send() call listed above, wait a second, and then step to the next line of code... *both* messages are received properly by the server, and the server gracefully disconnects the Flash app, as if there was never a problem at all.

Anybody have any ideas as to what Flash might be doing under the hood to make these two consecutive calls fail, yet succeed when the debugger pauses between them? I've tried putting an extensive delay between the two Flash send() calls, but it doesn't seem to do any good (quite possibly my delay code is no good... but should I even need one?).
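The only theory I've come up with so far: TCP is a byte stream with no message boundaries, so without the pause both sends might land in a single recv(), and a read loop that assumes one message per recv() would only ever see the first. A loop that splits on the zero-byte terminators instead would look roughly like this (a simplified sketch with made-up names, not my actual server code):

#include <winsock2.h>
#include <string>
#include <vector>

// Pull one recv()'s worth of bytes and split them on XMLSocket's '\0'
// terminators, so a single recv() carrying two messages still yields two.
// 'leftover' must persist per client between calls to hold partial messages.
std::vector<std::string> readMessages(SOCKET s, std::string& leftover)
{
    std::vector<std::string> messages;
    char buf[4096];
    int n = recv(s, buf, sizeof(buf), 0);
    if (n <= 0)
        return messages;      // connection closed or error; handled elsewhere
    leftover.append(buf, n);  // may already hold a partial message from before
    size_t nul;
    while ((nul = leftover.find('\0')) != std::string::npos) {
        messages.push_back(leftover.substr(0, nul)); // one complete XML message
        leftover.erase(0, nul + 1);                  // keep any trailing partial
    }
    return messages;
}

If that's what's happening, it would also explain why a delay on the Flash side is unreliable: nothing guarantees the two sends stay in separate packets.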

Any help would be greatly appreciated.

- Hanzo

  • nmunky
  • Born
  • Posts: 1

Post 3+ Months Ago

I don't know if this will help, but I had a similar problem with a VB6 Winsock server. It was caused by a buggy Winsock component that would throw errors with clients significantly slower than the server (e.g. Flash).

The solution was to make sure my copy of VStudio was patched to SP6.
