
I am trying to create a zip of multiple files on the server and stream it to the client while it is being created. Initially I was using ArchiverJs, and it worked fine as long as I appended buffers to it, but it failed when I needed to add streams to it. Then, after some discussion on GitHub, I switched to Node zip-stream, which started working fine thanks to jntesteves. But when I deployed the code on GKE k8s, I started getting a Network Failed error for large files.

Here is my sample code:


const ZipStream = require("zip-stream");
const https = require("https");
const http = require("http");
const request = require("request");

/**
 * @summary Adding readable stream provided by https module into zipStreamer using entry method
 */
const handleEntryCB = ({ readableStream, zipStreamer, fileName, resolve }) => {
  readableStream.on("error", error => {
    console.error("Error while listening readableStream : ", error);
    resolve("done");
  });
  zipStreamer.entry(readableStream, { name: fileName }, error => {
    if (!error) {
      resolve("done");
    } else {
      console.error("Error while listening zipStream readableStream : ", error);
      resolve("done");
    }
  });
};

/**
 * @summary Handling downloading of files using native https, http and request modules
 */
const handleUrl = ({ elem, zipStreamer }) => {
  return new Promise((resolve, reject) => {
    let fileName = elem.fileName;
    const url = elem.url;
    //Used in most of the cases
    if (url.startsWith("https")) {
      https.get(url, readableStream => {
        handleEntryCB({ readableStream, zipStreamer, url, fileName, resolve, reject });
      });
    } else if (url.startsWith("http")) {
      http.get(url, readableStream => {
        handleEntryCB({ readableStream, zipStreamer, url, fileName, resolve, reject });
      });
    } else {
      const readableStream = request(url);
      handleEntryCB({ readableStream, zipStreamer, url, fileName, resolve, reject });
    }
  });
};

const downloadZipFile = async (data, resp) => {
  let { urls = [] } = data || {};
  if (!urls.length) {
    throw new Error("URLs are mandatory.");
  }
  //Output zip name
  const outputFileName = `Test items.zip`;
  console.log("Downloading using streams.");
  //Initialize zip-stream instance
  const zipStreamer = new ZipStream();
  //Set headers to response
  resp.writeHead(200, {
    "Content-Type": "application/zip",
    "Content-Disposition": `attachment; filename="${outputFileName}"`,
    "Access-Control-Allow-Origin": "*",
    "Access-Control-Allow-Methods": "GET, POST, OPTIONS"
  });
  //piping zipStreamer to the resp so that client starts getting response
  //as soon as first chunk is added to the zipStreamer
  zipStreamer.pipe(resp);
  for (const elem of urls) {
    await handleUrl({ elem, zipStreamer });
  }
  zipStreamer.finish();
};

app.post(restPrefix + "/downloadFIle", async (req, resp) => {
  try {
    const { data } = req.body || {};
    //await so that rejections are caught by the catch block below
    await downloadZipFile(data, resp);
  } catch (error) {
    console.error("[FileBundler] unknown error : ", error);
    if (resp.headersSent) {
      resp.end("Unknown error while archiving.");
    } else {
      resp.status(500).end("Unknown error while archiving.");
    }
  }
});

I tested this locally with 7-8 files of ~4.5 GB each and it worked fine; when I tried the same on Google k8s, I got the Network Failed error. After more research, I increased the server timeout on k8s to 3000 seconds, after which it started working, but I guess increasing the timeout is not a good solution. Am I missing anything at the code level, or can you suggest a good GKE deployment configuration for a server that serves large file downloads to many concurrent users? I have been stuck on this for over 1.5 months. Please help!
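
For reference, the backend timeout I changed through the console can also be set declaratively with a GKE `BackendConfig` attached to the Service; a sketch, assuming an Ingress-backed Service (the resource names here are placeholders, not from my actual deployment):

```yaml
apiVersion: cloud.google.com/v1
kind: BackendConfig
metadata:
  name: zip-download-backendconfig
spec:
  timeoutSec: 3000   # same value I set manually in the load balancer
---
apiVersion: v1
kind: Service
metadata:
  name: zip-download-service
  annotations:
    cloud.google.com/backend-config: '{"default": "zip-download-backendconfig"}'
spec:
  type: NodePort
  selector:
    app: zip-download
  ports:
    - port: 80
      targetPort: 3000
```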

Edit 1: I edited the timeout on the ingress, i.e. Network services -> Load balancing -> edit the timeout in the service. [Screenshot: backend service details]
