Why question only a small part of the algorithm? It may be a better idea to take a more holistic look at the whole thing. Your current algorithm relies on this rather dense and unnecessary code. If you can eliminate the dependency on reading the entire file into memory, your solution will be far more scalable. Perhaps by "a better way" you mean "faster", or "so I can process a 100GB file without it slowing to a crawl".
Consider a finite state machine that can read, process, and extract the required information from your file one byte at a time. You probably wouldn't need malloc at all. Without a specific description of your problem, we can't help you derive a finite state machine to solve it. However, one example that stands out might be finding the maximum integer in a 100GB file:
int current_num, max_num = INT_MIN;  /* needs <limits.h> */
while (fscanf(file, "%d", &current_num) == 1) {
    if (current_num > max_num) {
        max_num = current_num;
    }
}
This code clearly doesn't need malloc, let alone to read the entire file into memory; it only ever uses a constant amount of memory, regardless of the size of the file.
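If you want something closer to a literal byte-at-a-time state machine, here is a minimal sketch with two states, "outside a number" and "inside a number". The function name `max_int_in_stream` is mine, and the sketch deliberately ignores negative numbers and integer overflow to keep the state machine visible:

```c
#include <ctype.h>
#include <limits.h>
#include <stdio.h>

/* Two-state machine: accumulate digits while inside a number,
 * compare against the running maximum when a number ends.
 * Memory use is O(1) regardless of file size. */
int max_int_in_stream(FILE *file) {
    int max_num = INT_MIN;
    int current = 0;
    int in_number = 0;  /* state flag: are we inside a run of digits? */
    int c;
    while ((c = getc(file)) != EOF) {
        if (isdigit(c)) {
            current = current * 10 + (c - '0');  /* extend the current number */
            in_number = 1;
        } else if (in_number) {
            if (current > max_num) max_num = current;  /* number just ended */
            current = 0;
            in_number = 0;
        }
    }
    if (in_number && current > max_num) max_num = current;  /* file ends mid-number */
    return max_num;
}
```

The same pattern generalizes: add states for signs, decimal points, or whatever tokens your real problem needs, and the memory footprint stays fixed no matter how large the input is.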