
I'm trying to send some string and image data from a Python script to an Objective-C application running on OS X.

I'm using GCDAsyncSocket to collect the transferred data and append it to an NSMutableData until the server disconnects. I then process that NSData and split it back into its original parts.

The transmitted data consists of the following (a small parsing sketch follows the list):

An ID string, padded out to 16 bytes.

An image number string, padded out to 16 bytes.

The raw image data.

A termination string, padded out to 16 bytes.
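
Assuming each padded field is exactly 16 bytes, splitting a fully assembled buffer would look roughly like the sketch below (illustration only, not the code in question; imageBuffer stands for the NSMutableData accumulated from the socket reads, and 48 is the two 16-byte headers plus the 16-byte trailer):

// Sketch: split the fully buffered payload into the four fields listed above.
NSUInteger total = [imageBuffer length];

NSData *idField  = [imageBuffer subdataWithRange:NSMakeRange(0, 16)];
NSData *numField = [imageBuffer subdataWithRange:NSMakeRange(16, 16)];
NSData *jpegData = [imageBuffer subdataWithRange:NSMakeRange(32, total - 48)]; // everything between the headers and the trailer
NSData *endField = [imageBuffer subdataWithRange:NSMakeRange(total - 16, 16)];

// The padded strings decode and trim like this:
NSString *idString = [[[NSString alloc] initWithData:idField encoding:NSUTF8StringEncoding]
                      stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceCharacterSet]];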

The problem is that I'm not receiving/getting the last chunk of data. I end up missing the end of the JPEG image, which leaves it corrupted (although most of it displays), and the termination string is missing.

Here is the code I'm using with GCDAsyncSocket to get the data and process it:

Socket connect:

- (void)socket:(GCDAsyncSocket *)sock didAcceptNewSocket:(GCDAsyncSocket *)newSocket
{
// This method is executed on the socketQueue (not the main thread)

@synchronized(connectedSockets)
{
    [connectedSockets addObject:newSocket];
}

NSString *host = [newSocket connectedHost];
UInt16 port = [newSocket connectedPort];

dispatch_async(dispatch_get_main_queue(), ^{
    @autoreleasepool {

        [self logInfo:FORMAT(@"Accepted client %@:%hu", host, port)];

    }
});

[newSocket readDataToData:[GCDAsyncSocket CRLFData] withTimeout:-1 tag:0];

}

Socket data received:

- (void)socket:(GCDAsyncSocket *)sock didReadData:(NSData *)data withTag:(long)tag
{
// This method is executed on the socketQueue (not the main thread)

dispatch_async(dispatch_get_main_queue(), ^{
    @autoreleasepool {

        NSLog(@"Thread Data Length is %lu", (unsigned long)[data length]);
        if (!imageBuffer){
            imageBuffer = [[NSMutableData alloc]init];
        }

        [imageBuffer appendData:[data subdataWithRange:NSMakeRange(0, [data length])]];
        NSLog(@"Total Data Length is %lu", (unsigned long)[imageBuffer length]);

    }
});

// Echo message back to client
[sock writeData:data withTimeout:-1 tag:ECHO_MSG];
    [sock readDataToData:[GCDAsyncSocket CRLFData] withTimeout:-1 tag:0];
}

Socket disconnect:

- (void)socketDidDisconnect:(GCDAsyncSocket *)sock withError:(NSError *)err
{
if (sock != listenSocket)
{
    dispatch_async(dispatch_get_main_queue(), ^{
        @autoreleasepool {

            [self logInfo:FORMAT(@"Client Disconnected")];
            NSData *cameraNumberData;
            NSData *imageNumberData;
            NSData *imageData;
            NSData *endCommandData;
            //if ([data length] > 40){
            cameraNumberData = [imageBuffer subdataWithRange:NSMakeRange(0, 16)];
            imageNumberData = [imageBuffer subdataWithRange:NSMakeRange(16, 16)];
            imageData = [imageBuffer subdataWithRange:NSMakeRange(32, [imageBuffer length]-34)];
            endCommandData = [imageBuffer subdataWithRange:NSMakeRange([imageBuffer length]-16, 16)];
            //}
            NSString *cameraNumberString = [[NSString alloc] initWithData:cameraNumberData encoding:NSUTF8StringEncoding];
            NSString *imageNumberString = [[NSString alloc] initWithData:imageNumberData encoding:NSUTF8StringEncoding];
            NSString *endCommandString = [[NSString alloc] initWithData:endCommandData encoding:NSUTF8StringEncoding];
            NSImage* image = [[NSImage alloc]initWithData:imageData];
            if (cameraNumberString)
            {
                NSLog(@"Image recieved from Camera no %@", cameraNumberString);
                [self logMessage:cameraNumberString];
            }
            else
            {
                [self logError:@"Error converting received data into UTF-8 String"];
            }

            if (imageNumberString)
            {
                NSLog(@"Image is number %@", imageNumberString);
                [self logMessage:imageNumberString];
            }
            else
            {
                [self logError:@"Error converting received data into UTF-8 String"];
            }

            if (image)
            {
                NSLog(@"We have an image");
                [self.imageView setImage:image];
            }
            else
            {
                [self logError:@"Error converting received data into image"];
            }

            if (endCommandString)
            {
                NSLog(@"Command String is %@", endCommandString);
                [self logMessage:endCommandString];
            }
            else
            {
                [self logError:@"No command string"];
            }

            //self.imageBuffer = nil;

        }
    });

        @synchronized(connectedSockets)
    {
        [connectedSockets removeObject:sock];
    }
}
}

I've used Wireshark and the data is being transmitted; it just isn't coming through GCDAsyncSocket.

So I'm obviously missing something. Socket programming and data encoding/decoding like this are relatively new to me, so I may well be being an idiot.

Help greatly appreciated!

Thanks

Gareth


1 Answer


OK, so I finally got this working. It involved modifying the transmission code in Python to send a completion string at the end of the data, and watching for that string. The big takeaway was that I needed to re-call the readDataToData: method every time the socket read some data; otherwise it would just sit there waiting, and so would the sending socket.
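
Stripped down to just that point, the pattern is roughly the following (a sketch only, not the full handler shown below; receiveBuffer stands in for whatever NSMutableData the incoming bytes are collected into):

- (void)socket:(GCDAsyncSocket *)sock didReadData:(NSData *)data withTag:(long)tag
{
    // Collect whatever arrived in this read.
    [receiveBuffer appendData:data];

    // Queue the next read, otherwise GCDAsyncSocket never delivers another
    // didReadData: callback and both ends just sit there.
    [sock readDataToData:[@"end" dataUsingEncoding:NSUTF8StringEncoding] withTimeout:-1 tag:tag];
}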

I also had to implement re-calling the second receive with a tag, so that I could store the received data in the correct NSMutableData object in an NSMutableArray; otherwise, after the first receive, I had no way of knowing which transmitting socket the data came from, since the ID is only at the start of the first message.
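
The image class used in the code below isn't shown anywhere in the answer; something along these lines is assumed (just a mutable buffer for the received bytes plus the camera number):

@interface image : NSObject
@property (nonatomic, strong) NSMutableData *imageData;
@property (nonatomic, copy) NSString *cameraNumber;
@end

@implementation image
- (instancetype)init
{
    self = [super init];
    if (self) {
        _imageData = [[NSMutableData alloc] init];
    }
    return self;
}
@end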

Here is the didReadData code:

- (void)socket:(GCDAsyncSocket *)sock didReadData:(NSData *)data withTag:(long)tag
{

dispatch_async(dispatch_get_main_queue(), ^{
    @autoreleasepool {

        NSInteger cameraNumberNumber = 0;
        NSString *cameraNumberString = [[NSString alloc]init];

        if (tag > 10){

            cameraNumberNumber = tag-11;
            DDLogVerbose(@"Second data loop, tag is %ld", tag);
        } else {

        NSData *cameraNumberData;
        //if ([data length] > 40){
        cameraNumberData = [data subdataWithRange:NSMakeRange(0, 16)];
        cameraNumberString = [[NSString alloc] initWithData:cameraNumberData encoding:NSUTF8StringEncoding];
        cameraNumberString = [cameraNumberString stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceAndNewlineCharacterSet]];
        cameraNumberNumber = [cameraNumberString intValue]-1;

        }

        if (cameraNumberNumber+1 <= self.images.count){

                if ([self.images objectAtIndex:cameraNumberNumber] == [NSNull null]){
                        image* cameraImage = [[image alloc]init];
                        [self.images replaceObjectAtIndex: cameraNumberNumber withObject:cameraImage];
                    }

                image* cameraImage = [self.images objectAtIndex:cameraNumberNumber];
                [cameraImage.imageData appendData:[data subdataWithRange:NSMakeRange(0, [data length])]];
                if ([cameraNumberString length] > 0){
                        cameraImage.cameraNumber = cameraNumberString;
                }

                if (!imageBuffer){
                        imageBuffer = [[NSMutableData alloc]init];
                    }


                [imageBuffer appendData:[data subdataWithRange:NSMakeRange(0, [data length])]];
                DDLogVerbose(@"Total Data Length is %lu", (unsigned long)[imageBuffer length]);
        } else {

            DDLogInfo(@"Wrong camera quantity!");
            NSAlert *testAlert = [NSAlert alertWithMessageText:@"Wrong camera quantity!"
                                                 defaultButton:@"Ok"
                                               alternateButton:nil
                                                   otherButton:nil
                                     informativeTextWithFormat:@"We have recieved more images than cameras, please set No.Cameras correctly!"];

            [testAlert beginSheetModalForWindow:[self window]
                                  modalDelegate:self
                                 didEndSelector:@selector(stop)
                                    contextInfo:nil];

        }

                [sock readDataToData:[@"end" dataUsingEncoding:NSUTF8StringEncoding] withTimeout:-1 tag:cameraNumberNumber + 11];

    }

});
}

Here is the socketDidDisconnect code. A lot of this won't make sense out of context, but it shows how I handle the received data.

- (void)socketDidDisconnect:(GCDAsyncSocket *)sock withError:(NSError *)err
{
if (sock != listenSocket)
{
    dispatch_async(dispatch_get_main_queue(), ^{
        @autoreleasepool {
            totalCamerasFetched = [NSNumber numberWithInt:1+[totalCamerasFetched intValue]];
            if ([totalCamerasFetched integerValue] >= [numberOfCameras integerValue]){

                for (image* cameraImage in self.images){

                        NSData *cameraNumberData;
                        NSData *imageNumberData;
                        NSData *imageData;
                        NSData *endCommandData;
                        NSInteger cameraNumberNumber = 0;
                        cameraNumberData = [cameraImage.imageData subdataWithRange:NSMakeRange(0, 16)];
                        imageNumberData = [cameraImage.imageData subdataWithRange:NSMakeRange(16, 16)];
                        imageData = [cameraImage.imageData subdataWithRange:NSMakeRange(32, [cameraImage.imageData length]-32)];
                        endCommandData = [cameraImage.imageData subdataWithRange:NSMakeRange([cameraImage.imageData length]-16, 16)];
                        NSString *cameraNumberString = [[NSString alloc] initWithData:cameraNumberData encoding:NSUTF8StringEncoding];
                        cameraNumberString = [cameraNumberString stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceAndNewlineCharacterSet]];
                        NSString *imageNumberString = [[NSString alloc] initWithData:imageNumberData encoding:NSUTF8StringEncoding];
                        imageNumberString = [imageNumberString stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceAndNewlineCharacterSet]];
                        NSString *endCommandString = [[NSString alloc] initWithData:endCommandData encoding:NSUTF8StringEncoding];
                        NSImage* image = [[NSImage alloc]initWithData:imageData];
                        cameraNumberNumber = [cameraNumberString intValue]-1;

                        if (cameraNumberString)
                            {
                                    DDLogInfo(@"Image recieved from Camera no %@", cameraNumberString);
                            }
                        else
                        {
                                    DDLogError(@"No Camera number in data");
                        }

                        if (imageNumberString)
                        {
                                    DDLogInfo(@"Image is number %@", imageNumberString);
                        }
                        else
                        {
                                    DDLogError(@"No Image number in data");
                        }

                        if (image)
                        {

                        DDLogVerbose(@"We have an image");


                        NSString* dataPath = [[NSString alloc]initWithFormat:@"%@/image%@/",self.exportLocation, imageNumberString];

                        if (![[NSFileManager defaultManager] fileExistsAtPath:dataPath]){

                                NSError* error;
                                [[NSFileManager defaultManager] createDirectoryAtPath:dataPath withIntermediateDirectories:NO attributes:nil error:&error];

                                if (error)
                                    {
                                            DDLogError(@"[%@] ERROR: attempting to write directory for images", [self class]);
                                            NSAssert( FALSE, @"Failed to create directory maybe out of disk space?");
                                        }
                            }

                        NSString* dataPathVideo = [[NSString alloc]initWithFormat:@"%@/video%@/",self.exportLocation, imageNumberString];

                        if (![[NSFileManager defaultManager] fileExistsAtPath:dataPathVideo]){

                                NSError* error;
                                [[NSFileManager defaultManager] createDirectoryAtPath:dataPathVideo withIntermediateDirectories:NO attributes:nil error:&error];

                                if (error)
                                {
                                    DDLogError(@"[%@] ERROR: attempting to write directory for images", [self class]);
                                    NSAssert( FALSE, @"Failed to create directory maybe out of disk space?");
                                }
                            }

                        NSString * exportLocationFull = [[NSString alloc]initWithFormat:@"%@/image%@/camera_%@.jpg",self.exportLocation, imageNumberString, cameraNumberString];
                            DDLogInfo(@"Full export URL = %@", exportLocationFull);
                        [imageData writeToFile:exportLocationFull atomically:YES];
                        self.currentSet = [NSNumber numberWithInt:[imageNumberString intValue]];

                        NSImage* imageToStore = [[NSImage alloc]initWithData:imageData];


                        [self.imagesToMakeVideo replaceObjectAtIndex: cameraNumberNumber withObject:imageToStore];


                        } else {
                            DDLogError(@"No image loacted in data");
                        }

                        if (endCommandString)
                        {
                            DDLogVerbose(@"Command String is %@", endCommandString);
                            //[self logMessage:endCommandString];
                        }
                        else
                        {
                            //[self logError:@"No command string"];
                        }

                        self.imageBuffer = nil;

                    }

                self.totalCamerasFetched = [NSNumber numberWithInt:0];
                [self loadandDisplayLatestImages];
                [self createVideowithImages:imagesToMakeVideo toLocation:[[NSString alloc]initWithFormat:@"%@/video%@/image_sequence_%@.mov",self.exportLocation, self.currentSet, self.currentSet]];
                processing = false;
            }//end of totalCamerasFetched check
        }
    });

    @synchronized(connectedSockets)
    {
        [connectedSockets removeObject:sock];
    }
}

}

And here is how I modified the Python code to add the extra "end" tag:

def send_media_to(self, ip, port, media_name, media_number, media_dir):
    camera_number = self.camera.current_mode['option'].number
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.connect((ip, port))
    try:
        sock.send(bytes(str(camera_number).ljust(16), 'utf-8'))
        sock.send(bytes(str(media_number).ljust(16), 'utf-8'))
        with open(media_dir + media_name, 'rb') as media:
            sock.sendall(media.read())
    finally:
        sock.send(bytes(str("end").ljust(16), 'utf-8'))
        sock.close()

Hopefully this helps anyone else who gets stuck in the same situation!

Answered 2014-04-14 at 09:24.