
Thanks in advance for any pointers or help.

Basically, I expected the async version to perform much better than the sync version, but the sync version performs the same or better.

What am I doing wrong, and what gives? I tried dropping Javalin in case something in the framework was causing the problem, and got similar results. I also tried this with plain Netty (too long to post the code) and saw similar results there too.

I wrote the following code (javalin-3.12.0 and jetty-9.4.31.v20200723):

import io.javalin.Javalin;
import org.eclipse.jetty.server.Server;
import org.eclipse.jetty.util.thread.QueuedThreadPool;
import java.io.IOException;
import java.util.concurrent.ScheduledThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class AsyncTest {
    // Oversized pool so scheduling the delayed callbacks is never the bottleneck
    static ScheduledThreadPoolExecutor scheduledThreadPoolExecutor = new ScheduledThreadPoolExecutor(5000);
    public static void main(String[] args) {
        var jav = Javalin.create();
        jav.config.server(() -> new Server(new QueuedThreadPool(5000, 500, 120_000)));
        Javalin app = jav.start(8080);

        // Async version: release the Jetty thread immediately, complete after 100 ms
        app.get("/async-delay", ctx -> {
            var async = ctx.req.startAsync();
            scheduledThreadPoolExecutor.schedule(() -> {
                try {
                    ctx.res.getOutputStream().println("ok");
                } catch (IOException e) {
                    e.printStackTrace();
                }
                async.complete();
            }, 100, TimeUnit.MILLISECONDS);
        });

        // Sync version: hold the Jetty thread for the full 100 ms
        app.get("/delay", ctx -> {
            Thread.sleep(100);
            ctx.result("ok");
        });

        // Baseline: no artificial delay
        app.get("/no-delay", ctx -> {
            ctx.result("ok");
        });
    }
}
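As a side note, Javalin 3 can also take a CompletableFuture as a result and complete the exchange asynchronously itself, instead of going through the raw Servlet startAsync() API. A minimal sketch of the delayed-future part (the route name /future-delay is made up for illustration; the handler would just be app.get("/future-delay", ctx -> ctx.result(delayedOk()));):

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.TimeUnit;

public class FutureDelay {
    // Completes with "ok" after 100 ms using the JDK's shared delayed
    // executor (CompletableFuture.delayedExecutor, Java 9+) -- no
    // dedicated 5000-thread ScheduledThreadPoolExecutor required.
    static CompletableFuture<String> delayedOk() {
        return CompletableFuture.supplyAsync(
                () -> "ok",
                CompletableFuture.delayedExecutor(100, TimeUnit.MILLISECONDS));
    }

    public static void main(String[] args) {
        System.out.println(delayedOk().join());
    }
}
```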

and got the following results:

➜  ~ wrk2 -t16 -c300 -d5s -R3000 http://localhost:8080/delay
Running 5s test @ http://localhost:8080/delay
  16 threads and 300 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   331.36ms  138.72ms 626.18ms   57.34%
    Req/Sec        nan       nan   0.00      0.00%
  10854 requests in 5.00s, 1.24MB read
  Socket errors: connect 53, read 0, write 0, timeout 106
Requests/sec:   2170.40
Transfer/sec:    254.34KB
➜  ~ wrk2 -t16 -c300 -d5s -R3000 http://localhost:8080/async-delay
Running 5s test @ http://localhost:8080/async-delay
  16 threads and 300 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   285.84ms  120.75ms 522.50ms   56.27%
    Req/Sec        nan       nan   0.00      0.00%
  11060 requests in 6.10s, 1.29MB read
  Socket errors: connect 53, read 0, write 0, timeout 124
Requests/sec:   1814.16
Transfer/sec:    216.14KB
➜  ~ wrk2 -t16 -c16 -d5s -R70000 http://localhost:8080/no-delay
Running 5s test @ http://localhost:8080/no-delay
  16 threads and 16 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     2.51ms    3.12ms  21.95ms   88.36%
    Req/Sec        nan       nan   0.00      0.00%
  349824 requests in 5.00s, 40.03MB read
Requests/sec:  69995.44
Transfer/sec:      8.01MB

2 Answers


Since Jetty 9+ has been 100% async from the ground up, this lack of difference makes sense. (In fact, in Jetty 9+ extra work is needed to pretend to be synchronous when blocking APIs such as InputStream.read() or OutputStream.write() are used.)

Also, your load-testing workload is not realistic.

  • You need more client machines in the test. No single software client can stress a Jetty server on its own; you will hit system resource limits long before you reach any kind of Jetty serving limit.
    • A ratio of at least 4 client machines to 1 server machine (we test at 8 to 1) is needed to generate enough load to stress Jetty.
  • You need many concurrent connections to the server (think 40,000+).
    • Or you want HTTP/2 in the picture (which also stresses server resources in its own unique ways).
  • You want to return large amounts of data (something that requires multiple network buffers to flush).
  • You also want to mix in some client connections that read slowly (on a sync server, these can impact the other, non-slow connections simply by consuming too many resources).
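The slow-reader point can be demonstrated in miniature. The sketch below (hypothetical, not from the original post) pairs a one-shot server with a client that drains a 64 KB payload at roughly 1 KB per 10 ms; once a payload exceeds the socket buffers, a blocking OutputStream.write() on a sync server would pin its thread for the whole drain, while an async server can park the exchange:

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.ServerSocket;
import java.net.Socket;
import java.util.concurrent.TimeUnit;

public class SlowReaderDemo {
    public static void main(String[] args) throws Exception {
        try (ServerSocket server = new ServerSocket(0)) { // ephemeral port
            // One-shot "server": accept one connection, push 64 KB, close.
            Thread writer = new Thread(() -> {
                try (Socket s = server.accept();
                     OutputStream out = s.getOutputStream()) {
                    out.write(new byte[64 * 1024]);
                } catch (IOException ignored) {
                }
            });
            writer.start();

            // Deliberately slow client: ~1 KB per 10 ms.
            long start = System.nanoTime();
            int total = 0;
            try (Socket s = new Socket("localhost", server.getLocalPort());
                 InputStream in = s.getInputStream()) {
                byte[] buf = new byte[1024];
                int n;
                while ((n = in.read(buf)) != -1) {
                    total += n;
                    TimeUnit.MILLISECONDS.sleep(10);
                }
            }
            writer.join();
            long elapsedMs = (System.nanoTime() - start) / 1_000_000;
            System.out.println("drained " + total + " bytes in " + elapsedMs + " ms");
        }
    }
}
```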
answered 2020-12-09T13:59:34.817

Yes, as Jaokim said, wrk is the bottleneck here. If I run several instances in parallel as suggested above, I get 4x the rps.

➜  ~ ➜  ~ wrk2 -t16 -c500 -d5s -R3000 http://localhost:8080/async-delay & ; wrk2 -t16 -c500 -d5s -R3000 http://localhost:8080/async-delay & ; wrk2 -t16 -c500 -d5s -R3000 http://localhost:8080/async-delay & ; wrk2 -t16 -c500 -d5s -R3000 http://localhost:8080/async-delay & ;
[1] 2779
[2] 2780
[3] 2781
[4] 2782
Running 5s test @ http://localhost:8080/async-delay
  16 threads and 500 connections
Running 5s test @ http://localhost:8080/async-delay
  16 threads and 500 connections
Running 5s test @ http://localhost:8080/async-delay
  16 threads and 500 connections
Running 5s test @ http://localhost:8080/async-delay
  16 threads and 500 connections
➜  ~   Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   104.09ms   13.99ms 337.66ms   97.43%
    Req/Sec        nan       nan   0.00      0.00%
  7066 requests in 5.06s, 841.85KB read
  Socket errors: connect 261, read 14, write 1, timeout 522
Requests/sec:   1395.35
Transfer/sec:    166.24KB
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   103.84ms   12.70ms 239.23ms   97.48%
    Req/Sec        nan       nan   0.00      0.00%
  7066 requests in 5.06s, 841.85KB read
  Socket errors: connect 261, read 9, write 2, timeout 522
Requests/sec:   1395.56
Transfer/sec:    166.27KB

[1]    2779 done       wrk2 -t16 -c500 -d5s -R3000 http://localhost:8080/async-delay
[2]    2780 done       wrk2 -t16 -c500 -d5s -R3000 http://localhost:8080/async-delay
➜  ~ ➜  ~   Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   103.62ms   12.48ms 243.58ms   97.67%
    Req/Sec        nan       nan   0.00      0.00%
  7064 requests in 6.16s, 841.61KB read
  Socket errors: connect 261, read 13, write 2, timeout 584
Requests/sec:   1147.51
Transfer/sec:    136.71KB
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   103.50ms   12.94ms 339.46ms   97.83%
    Req/Sec        nan       nan   0.00      0.00%
  7055 requests in 6.16s, 840.54KB read
  Socket errors: connect 261, read 6, write 2, timeout 646
Requests/sec:   1145.42
Transfer/sec:    136.47KB

[3]  - 2781 done       wrk2 -t16 -c500 -d5s -R3000 http://localhost:8080/async-delay
[4]  + 2782 done       wrk2 -t16 -c500 -d5s -R3000 http://localhost:8080/async-delay
➜  ~ ➜  ~
answered 2020-12-09T14:20:49.660