This is a bit hard to explain because the code snippet is part of a larger project. I will do my best to describe the problem.
I have two files:
FILE *f, *m;
f = fopen("/home/machine/decoder.txt", "a+");
m = fopen("/home/machine/offset.txt", "a+");
Inside a function I run the following code:
char *c;
int i = 0;
c = malloc(sizeof(SslDecoder));

// Pick a value from the "decoder" file and compare it to a variable in the function
while (fgets(c, sizeof(SslDecoder), f) != NULL) {
    // Print its value to the offset file
    fprintf(m, "%s\n", c);

    // Print the value of another variable to the offset file
    for (i = 0; i < 32; i++) {
        fprintf(m, "%02x", ssl->client_random.data[i]);
    }
    fprintf(m, "\n");

    // Compare the memory behind the two pointers
    int check = memcmp(c, ssl->client_random.data, 32);
    fprintf(m, "MEMCMP value: %d\n", check);
}
The values printed in offset.txt are as follows:
625b70a9659b2fe9ba76ea26d3cfb6126bae4a48b4997548b26d9a101e682bc3
625b70a9659b2fe9ba76ea26d3cfb6126bae4a48b4997548b26d9a101e682bc3
MEMCMP value: -44
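Note that identical-looking lines do not necessarily mean identical memory: if one buffer holds the 64 ASCII hex characters read from the file while the other holds the 32 raw bytes, "%s" and the "%02x" loop print the same text even though memcmp() sees different bytes. (The -44 would even be consistent with that: the first ASCII character '6' is 0x36 and 0x36 - 0x62 = -44, although the C standard only guarantees the sign of the result.) A minimal, self-contained sketch with made-up values, not the project's data, showing how this can happen:

#include <stdio.h>
#include <string.h>

int main(void) {
    const char *hex_text = "625b";              /* ASCII characters '6','2','5','b' */
    const unsigned char raw[2] = {0x62, 0x5b};  /* the raw bytes those digits encode */

    printf("%s\n", hex_text);                   /* prints: 625b */
    for (int i = 0; i < 2; i++)
        printf("%02x", raw[i]);                 /* also prints: 625b */
    printf("\n");

    /* The first bytes compared are '6' (0x36) and 0x62, so the result is non-zero */
    printf("memcmp: %d\n", memcmp(hex_text, raw, 2));
    return 0;
}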
client_random and ssl are defined as follows:
typedef struct _StringInfo {
    guchar *data;     /* Backing storage which may be larger than data_len */
    guint   data_len; /* Length of the meaningful part of data */
} StringInfo;

typedef struct _SslDecryptSession {
    StringInfo      server_random;
    StringInfo      client_random;
    StringInfo      master_secret;
    guchar          _client_data_for_iv[24];
    StringInfo      client_data_for_iv;
    gint            state;
    SslCipherSuite  cipher_suite;
    SslDecoder     *server;
    SslDecoder     *client;
    SslSession      session;
} SslDecryptSession;
I do not understand why the memcmp value is not zero. I suspect the data behind the two pointers is encoded differently, but in that case how can I compare the values? I do not know whether the data behind either pointer is hex text or raw/ASCII data.
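If that suspicion is right, i.e. the file line is hex text while client_random.data holds raw bytes, one possible approach is to decode the text into bytes before comparing. A minimal sketch under that assumption (hex_decode, line and random_data are placeholder names, not identifiers from the project; also note that fgets() keeps the trailing newline, so it has to be stripped):

#include <stdio.h>
#include <string.h>

/* Decode 2*n hex characters from 'hex' into 'out'. Returns 0 on success. */
static int hex_decode(const char *hex, unsigned char *out, size_t n) {
    for (size_t i = 0; i < n; i++) {
        unsigned int byte;
        if (sscanf(hex + 2 * i, "%2x", &byte) != 1)
            return -1;
        out[i] = (unsigned char)byte;
    }
    return 0;
}

int main(void) {
    /* Stand-ins for a line read with fgets() and for client_random.data */
    char line[] = "625b70a9\n";
    const unsigned char random_data[4] = {0x62, 0x5b, 0x70, 0xa9};
    unsigned char decoded[4];

    line[strcspn(line, "\r\n")] = '\0';   /* strip the newline kept by fgets() */
    if (hex_decode(line, decoded, sizeof decoded) == 0)
        printf("memcmp after decoding: %d\n",
               memcmp(decoded, random_data, sizeof decoded));   /* prints 0 */
    return 0;
}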