
I have a fairly generic C# socket server that uses the asynchronous methods of the Socket class - BeginAccept(), BeginReceive(), and so on. For the past 4 years this server has run fine at many customer sites on Windows Server 2003. Recently I installed it on a 64-bit Windows Server 2008 R2 server. Everything looks fine until the first client connects and the accept handler issues its BeginReceive() and BeginAccept() calls. When that happens, CPU usage spikes to 100% and stays there until I close the listening socket.

Not sure if it matters, but the server is running in a virtual machine.

I've done a lot of testing, but nothing seems to help. Using Process Explorer I can see that two threads start shortly after the BeginReceive()/BeginAccept() calls, and those are the threads consuming the processor. Unfortunately, I can't reproduce the problem on my 64-bit Windows 7 workstation.

I've also done a lot of research; so far all I've found are the following two KB articles, which suggest that Server 2008 R2 may have a problem in a TCP/IP component, but both are only available as hotfixes: KB2465772 and KB2477730. I'm reluctant to have my customers install them until I'm more certain they will actually fix the problem.

Has anyone else run into this? If so, what did you have to do to resolve it?

Here is the method I believe is causing this:

private void AcceptCallback(IAsyncResult result) {
   ConnectionInfo connection = new ConnectionInfo();

   try {
      // Finish accept.
      Socket listener = (Socket)result.AsyncState;
      connection.Socket = listener.EndAccept(result);
      connection.Request = new StringBuilder(256);

      // Start receive and a new accept.
      connection.Socket.BeginReceive(connection.Buffer, 0,
         connection.Buffer.Length, SocketFlags.None,
         new AsyncCallback(ReceiveCallback), connection);

      _serverSocket.BeginAccept(new AsyncCallback(AcceptCallback), listener);

      // CPU usage spikes at 100% shortly after this...
   }
   catch (ObjectDisposedException /*ode*/) {
      _log.Debug("[AcceptCallback] ObjectDisposedException");
   }
   catch (SocketException se) {
      connection.Socket.Close();
      _log.ErrorFormat("[AcceptCallback] Socket Exception ({0}): {1} {2}", connection.ClientAddress, se.ErrorCode, se.Message);
   }
   catch (Exception ex) {
      connection.Socket.Close();
      _log.ErrorFormat("[AcceptCallback] Exception {0}: {1}", connection.ClientAddress, ex.Message);
   }
}
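
For reference, the ConnectionInfo class and the ReceiveCallback used above are not shown in the question. A minimal sketch of what they might look like - the buffer size, encoding, and anything not directly implied by AcceptCallback are assumptions, not taken from the original code:

private class ConnectionInfo {
   // Hypothetical per-connection state; only Socket, Buffer, Request and
   // ClientAddress are implied by AcceptCallback above, the rest is assumed.
   public Socket Socket;
   public byte[] Buffer = new byte[1024];
   public StringBuilder Request;
   public string ClientAddress {
      get { return Socket != null ? Socket.RemoteEndPoint.ToString() : "unknown"; }
   }
}

private void ReceiveCallback(IAsyncResult result) {
   ConnectionInfo connection = (ConnectionInfo)result.AsyncState;
   try {
      int bytesRead = connection.Socket.EndReceive(result);
      if (bytesRead > 0) {
         // Append the received bytes and post the next receive.
         connection.Request.Append(Encoding.ASCII.GetString(connection.Buffer, 0, bytesRead));
         connection.Socket.BeginReceive(connection.Buffer, 0, connection.Buffer.Length,
            SocketFlags.None, new AsyncCallback(ReceiveCallback), connection);
      }
      else {
         // Zero bytes means the client closed its end of the connection.
         connection.Socket.Close();
      }
   }
   catch (ObjectDisposedException) {
      // The socket was closed while a receive was pending; nothing to do.
   }
   catch (SocketException) {
      connection.Socket.Close();
   }
}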

2 Answers


The problem was caused by calling BeginAccept() multiple times while setting up the listener socket. I don't know why it only shows up on the 64-bit server, but changing the code as shown below fixed it. Since AcceptCallback() above posts a new BeginAccept() for every connection it accepts, a single initial call is sufficient.

Original code:

public void SetupServerSocket() {
   IPEndPoint myEndPoint = new IPEndPoint(IPAddress.Any, _port);

   // Create the socket, bind it, and start listening.
   _serverSocket = new Socket(myEndPoint.Address.AddressFamily, SocketType.Stream, ProtocolType.Tcp);
   _serverSocket.Bind(myEndPoint);
   _serverSocket.Listen((int)SocketOptionName.MaxConnections);

   for (int i = 0; i < 10; i++) {
      _serverSocket.BeginAccept(new AsyncCallback(AcceptCallback), _serverSocket);
   }
}

Changed to:

public void SetupServerSocket() {
   IPEndPoint myEndPoint = new IPEndPoint(IPAddress.Any, _port);

   // Create the socket, bind it, and start listening.
   _serverSocket = new Socket(myEndPoint.Address.AddressFamily, SocketType.Stream, ProtocolType.Tcp);
   _serverSocket.Bind(myEndPoint);
   _serverSocket.Listen((int)SocketOptionName.MaxConnections);

   //for (int i = 0; i < 10; i++) {
      _serverSocket.BeginAccept(new AsyncCallback(AcceptCallback), _serverSocket);
   //}
}
answered 2011-07-14T04:04:33.167

I know the "Wire Performance" article where you found SetupServerSocket() - the original 10x for loop is to support 10 listen threads if you have rapid new client connections. You've changed it to one listener. Maybe that's the only possible solution if Win2k8r2 has such a bug. You might want to be sure you have robust connect retry code in your client.

Get Closer to the Wire with High-Performance Sockets in .NET http://msdn.microsoft.com/en-us/magazine/cc300760.aspx
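
On that client-side retry point, a minimal sketch of a connect-with-retry loop for a blocking client - the host, port, attempt count, and back-off delay here are illustrative placeholders, not taken from the article:

// Hypothetical client-side helper: connect with a simple linear back-off retry.
// Requires System.Net.Sockets and System.Threading.
private static Socket ConnectWithRetry(string host, int port, int maxAttempts) {
   for (int attempt = 1; attempt <= maxAttempts; attempt++) {
      Socket client = new Socket(AddressFamily.InterNetwork, SocketType.Stream, ProtocolType.Tcp);
      try {
         client.Connect(host, port);
         return client;                  // connected
      }
      catch (SocketException) {
         client.Close();
         if (attempt == maxAttempts)
            throw;                       // give up after the last attempt
         Thread.Sleep(1000 * attempt);   // back off before retrying
      }
   }
   return null;                          // not reached
}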

answered 2012-07-24T22:27:39.973